Why is finding the right keyboard impossible?

In a world where every product you could possibly think of wanting is made, and a huge number of orders of magnitude more products exist that shouldn’t be made at all, why is it impossible for me to find the keyboard I want?

When I started using PCs, the IBM PC and AT keyboards had function keys on the left-hand side, the keyswitches were loud and mechanical, and took a fair amount of force to depress. We cared a lot about keyboards in the days before graphical user interfaces were invented because they were the user interface.

I liked those keyboards. The intervening 25 years have brought a whole variety of input devices, but I’ve remained wedded to the Northgate Omnikey keyboard I bought back in 1988. I must have written millions of words on that 23-year-old keyboard and nothing has ever gone wrong. I disassemble it for a clean every now and again, but it’s had no other maintenance. It’s a marvellous piece of kit.

Incidentally, Northgate as a company is now, sadly, long gone, but you can still buy very similar keyboards from Creative Vision Technologies.

But I’m ready to move on. I’m ready to give up the loud, forceful, clicky keyboard for a quieter device with keys that are easier to press and which, after a heavy day’s typing, don’t make my hands feel like they’ve been through an assault course. But I’m not ready to give up the F keys down the left-hand side of the keyboard.

I use the keyboard a lot, even though I could use a mouse. It takes me a few milliseconds to hit Ctrl-S or Alt-F4 to save a file or close a program, compared to the seconds it takes to move my hand over to the mouse, schlepp the cursor to the right icon and click it. It’s what my hands are used to doing.

And you need only one hand to do it when the F keys are in the right place: on the left. Try Alt-F4 with the F keys along the top of the main keyboard layout and it feels really clunky and awkward – the result being that very few people ever use those keys.

So why can’t I find a quiet, easy-action keyboard with F keys on the left? Advice gratefully received….

Guy Kewney RIP

I want simply to express what a huge loss to the world the death early this morning (Thursday 8 April 2010) of Guy Kewney is. He died aged just 63.

He was the first technology journalist. He started in the mid-1970s as a result of which he got to know all the big names when they were still speaking to journalists. Alan Sugar. Bill Gates. Steve Jobs. And Douglas Adams for good measure.

Guy’s approach never wavered, and was born out of his fierce intelligence, a small smattering of humility, intense curiosity and deep loyalty to his readers. He’d ask the right question in a terribly polite way – and it got results. The IT exec would quaver and then blurt out what they didn’t want to say – or they’d give the game away by clamming up.

I worked with him for almost 15 years on PCW and PC Magazine and he was always the same. My job was, more often than not, to extract and edit his copy. His copy was incisive, insightful and idiosyncratic – and invariably, horribly late. But what you got was Guy’s voice, every time, no kow-towing to corporate or magazine style. It was sometimes infuriating – but he was right to do that, and readers loved him for it.

More entertaining was the Kewney Chaos Field (it acquired several names over the years), which resulted in perfectly good pieces of technology, often new, pre-released hardware or software, turning into door-stops as soon as they got within 10 feet of the man. How come? No-one ever figured out how he managed to break stuff that no-one else could.

As for his expenses: managing them drove one individual to leave the country… but more importantly, he was an inspiration to two generations of journalists and PR flacks over the decades of his working life.

I didn’t speak to him as much over the last years of his life as I did when I sat across from him in the PC Magazine office for almost ten years. But I’m glad I went to see him just a week before his death. He was weak physically but his brain was undimmed, and he was perfectly relaxed and accepting of what was about to happen. He knew he was soon to die of the cancer that started in his bowels then ate away the rest of him. But the rational man that he was took it in his stride.

I only hope I can leave this world as gracefully. Guy: you are missed.

His final blog is here and there’s a nice obit from Iain Thomson here.

The worst press release of 2010 – by a country mile

It’s an old story but it keeps on running. Companies employ PR companies to put themselves before the media. The main way they do that is through press releases.

So would you be happy if your PR company put out a release announcing an initiative but which omitted not one but three key facts?

  1. Who was launching it
  2. Why they were launching it
  3. Why anyone else would care

How could they get it so wrong?

Here it is, in all its glory, with only the PR company’s name stripped out to protect its blushes. Though, under enough pressure, I might publish that too….

The Common Assurance Metric (CAM) launched today is a global initiative that aims to produce objective quantifiable metrics, to assure Information Security maturity in cloud, third party service providers, as well as internally hosted systems. This collaborative initiative has received strong support from Public and Private sectors, industry associations, and global key industry stakeholders.

There is currently an urgent need for customers of cloud computing and third party IT services to be able to make an objective comparison between providers on the basis of their security features. As ENISA’s work on cloud computing, has shown, security is the number one concern for many businesses and governments. Existing mechanisms to measure security are often subjective and in many cases are bespoke solutions. This makes quantifiable measurement of security profiles difficult, and imposes the need to apply a bespoke approach, impacting in time, and of course cost. The CAM aims to bridge the divide between what is available, and what is required. By using existing standards that are often industry specific, the CAM will provide a singular approach of benefit to all organisations regardless of geography or industry.

[Quotes about how wonderful it is removed from here]

The project team anticipate delivery of the framework in late 2010 followed by a process towards global adoption for organisations wishing to obtain an objective measurement of security provided by cloud providers, as well as the level of security for systems hosted internally.

You’ll notice other issues (polite word) in there too. Who is ENISA, mentioned in the second para but never explained? Why is the first sentence only barely comprehensible — or even grammatical — on the first read-through? The second sentence in the second para doesn’t belong there, it should be at the top of that para. Since when does the phrase “impacting in time” qualify as English? And as for the last sentence/para, how many times did you have to read it to extract what the hapless writer was driving at?

Finally, why do people still feel the need to double-space between sentences? I gave up typewriters and started using a word processor almost 30 years ago, and haven’t felt the need to do that since then…

It makes you wonder.

iPad? Just say no

If the world needed an iPad, why hasn’t one been invented before? Oh look: it has. Called the Newton when Apple launched it in 1992 – there were a couple of others released about the same time but the Newton got the headlines – it died in 1998 as not enough people bought it.

Will the iPad be different? Do you care?

Amid the inevitable hoopla and swooning going on in Applista diasporas at media outlets such as the Guardian and the BBC, let’s be clear: the iPad is a blown-up iPhone. And already we hear calls for there to be a cut-down version of the iPad so that you can carry it in your pocket. Thought that’s what an iPhone was…

The iPad’s remit seems to be more limited than the Newton’s. There’s no handwriting recognition for a start but it is very shiny, has bright colours and maybe the battery life is long enough to make it useful enough to carry around all day. I await review samples for verification. There’s no talk of local connectivity to either Mac or Windows, no talk of open access to all the applications you want, no talk of opening up the OS so that others can develop extensions or applications.

And for all of Jobs’ sneering at netbooks, mine works for hours on a single charge and runs Ubuntu quite happily – though I suspect that Windows 7 might actually be easier to use in terms of getting everything working. But at least I have the choice.

As one blogger has already pointed out, this closed-world mentality could be the fatal flaw in the iPad’s shiny armour.

iPad? I don’t think so.

Guidance for upgrading to Windows 7 RTM

The purpose of this post is to help you start with a copy of the Windows 7 RTM code and upgrade an existing installation, especially if you installed the Windows 7 release candidate.

Note that this is not a detailed how-to – there are plenty of those on the Web already, and anyway it’s really not that difficult. Instead, this is a list of some of the gotchas that could put a spoke in the smooth ride that Microsoft promises but so often fails to deliver. Rule one: be patient…

1. Copy the installation files onto your hard disk. It’s faster than installing from a DVD and allows you to make a small tweak that you’ll need if you’re upgrading from the Windows 7 RC (release candidate) as I did. Even better, install from a second hard disk or a USB stick, as this will speed things further.

2. Get a proper copy of the OS and check the files are all valid. If you don’t do this, you could get a third of the way through the process only for the installation process to throw up an error because it couldn’t read a file. That’s annoying.

3. If you’re upgrading from the release candidate of Windows 7, then you need to make a small alteration to \Sources\cversion.ini, as an in-place upgrade, as opposed to a fresh installation, is not officially supported. However, it does work without problems if you do the following:
i) Open cversion.ini with Notepad. This is what you’ll see:
[HostBuild]
MinClient=7233.0
MinServer=7100.0

ii) Alter the first line so it reads:
MinClient=7000.0

iii) Save the file. That’s it.

4. Start the installation and load up your patience.

5. After a few prompts, the system will tell you that it doesn’t need your attention: effectively, it’s telling you to go away and come back in a while. If you’re upgrading an existing installation, that timescale is longer than you think. I left it running overnight and it took over six hours to copy across an installation of several hundred gigabytes. Don’t be tempted to reboot if it seems to be doing nothing at the start of the ‘Transferring settings’ section.

6. If you do reboot before the process completes, check the logs in the c:\$WINDOWS.~BT\Sources\Panther folder, especially setupact.log, to find out why it failed. There’s a workaround available here for an installation that gets stuck around 61%-62%. The system should then roll back to your original OS – it worked three times for me after rebooting at different points in the installation process, despite the dire warnings about doing so.

7. Why might you want to do this?
i) Your Win7 RC licence key runs out in June 2010
ii) The release candidate manifested a few glitches
iii) The release candidate certainly included some debug code, slowing things down

Each of these three reasons is a good one to get onto the RTM code asap – together they’re compelling. Unless of course you decide that Windows 7 is not for you and you want to go run Ubuntu…
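If you’re repeating step 3 on several machines, the cversion.ini tweak is easy to script. Here’s a minimal Python sketch – the function name and folder layout are my own, and it assumes the installation files were copied to a local folder as in step 1:

```python
# Patch \Sources\cversion.ini so Windows 7 setup accepts an in-place
# upgrade from the release candidate (build 7100), as described in step 3.
from pathlib import Path

def patch_cversion(sources_dir: str) -> str:
    """Lower MinClient to 7000.0 and return the rewritten file contents."""
    ini = Path(sources_dir) / "cversion.ini"
    lines = ini.read_text().splitlines()
    patched = [
        "MinClient=7000.0" if line.strip().startswith("MinClient=") else line
        for line in lines
    ]
    new_text = "\n".join(patched) + "\n"
    ini.write_text(new_text)
    return new_text

# Example: patch_cversion(r"C:\win7rtm\Sources")
```

Run it against the Sources folder of your copied installation files before starting setup; everything else in the file is left untouched.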

Time to end loyalty card schemes

Am I alone (distraction: how many rants start like this?) in thinking that few of the trappings of the modern world are as annoying and deeply insidious as the loyalty card? Every shop in the high street offers one, it seems, so it can’t be a bad thing, or they wouldn’t get away with it, would they?

“Do you have a loyalty card?” they twitter. I’ve just encountered the final straw – hence this posting.

On the face of it, what’s not to like? You give the organisation your name and address, they send you a card, and you get a percentage point or two off your shopping. In these hard times, many a mickle makes a muckle.

But they never tell you the whole story. They will never come out and say that, if you subscribe, the company will bombard you with offers that, based on your spending patterns, they think you will want. Well, maybe you will, maybe you won’t, but would you rather not make those choices at a time of your own choosing, under your own steam as it were, rather than being manipulated by some marketing droid or, worse, by some marketing algorithm deep in a data centre somewhere?

If those cards weren’t worth the cost of administering, then companies such as Tesco – feted in marketing circles as among the most successful deployers of such schemes – wouldn’t do it. The reason it’s worth it is not only because they get their hands on your spending patterns, which of course they do and which raises other issues – see below – but also because you spend more. Each marketing mailout increases demand for whatever is being pushed at the consumer.

So whatever discount you’re promised, you’re almost certain to have wiped it out by buying more stuff you wouldn’t have bought had the scheme not been in place. That’s more profit for Tesco.

What’s more, the cost of the scheme is offset by hiking prices, as demonstrated by the Morrisons chain of supermarkets, which cut prices when it abolished its loyalty card scheme back in 2004. And Asda told the Daily Telegraph it wouldn’t be implementing a scheme because: “It would have cost £60m to set up and £20m to £30m a year to maintain.”

But more fundamentally important is the loss of privacy that these cards entail. As the Telegraph feature referenced above reports, one campaigner likened having a loyalty card to walking around with a barcode stamped on your backside.

What I buy is my business, not that of a marketing programme. Every purchase adds more snippets of data about me to the public domain, waiting for some future organisation to hoover up and use in ways as yet unspecified.

Those who made this argument ten years ago were shouted down as paranoid. But today – with the growth of huge databases accessible worldwide, with companies amalgamating and sharing data, and with basic security measures, such as not walking around with databases on a device liable to theft or absent-mindedness, such as a laptop or USB memory stick, seemingly beyond both commercial organisations and the government – it behoves us all to hang onto those snippets.

Piled up in one place, a lot of snippets make a profile. Many a mickle makes a muckle.

What’s the prognosis for true high-speed mobile data?

The mobile industry confuses its customers and doesn’t deliver what it promises.

We all talk much about the latest technology, and how it will transform this, that or the other element of our personal and/or working lives.

I spent quite a bit of time yesterday talking about LTE – also known as 4G by some, but not everyone, in the mobile industry. It’s known as 4G because it succeeds 3G, today’s iteration of mobile broadband technology – even though, confusingly, some of today’s 3G technology, such as HSPA, which can give you as much as 21Mbits/sec, is known as 3.5G.

And LTE isn’t technically 4G, because it doesn’t quite meet the definition of 4G laid down by the global standards body, the ITU, according to one analyst I spoke to. So you’ll find LTE referred to as 4G or 3.5+G, as LTE-Advanced (which does meet the 4G spec), or as just plain LTE. WiMax, incidentally, is 4G according to the ITU. No wonder the mobile industry confuses its customers. There’s a pithy piece about LTE and 4G here.

But that’s all by the by in some ways. The important thing about LTE is that it promises 100Mbits/sec download and 50Mbits/sec upload speeds. If you know anything about the technology, you’ll know that in practice some 25 percent of that is likely to be eaten up by protocol and other overheads. You’ll also know that a further 25 percent is likely to be lost to distance losses, cell sharing and clogged-up backhaul networks.
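Taken at face value, those two losses compound. A quick back-of-envelope sketch – treating them as compounding stages is my assumption; the headline and loss figures are the ones above:

```python
# Rough effective-throughput estimate for LTE's headline rates, applying
# the 25% protocol overhead and a further 25% of real-world losses in turn.
def effective_rate(headline_mbits: float, losses=(0.25, 0.25)) -> float:
    rate = headline_mbits
    for loss in losses:
        rate *= 1.0 - loss  # each stage removes a fraction of what's left
    return rate

print(effective_rate(100.0))  # download: 100 -> 75 -> 56.25 Mbits/sec
print(effective_rate(50.0))   # upload: 50 -> 37.5 -> 28.125 Mbits/sec
```

In other words, roughly half the headline rate survives before you even consider how many users are sharing the cell.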

All this is due to arrive over the next ten years. Yes, ten years. Roll-outs are unlikely to start happening in the UK before 2012, more likely 2015.

Except that this is so much hogwash.

I was in the middle of London — yes, challenging conditions due to the concrete canyon effect, but the kind of area in which the mobile industry has to demonstrate its best technology. And the best mobile data rate I managed inside or out was a standard GSM-level 56kbits/sec. This is early 1990s technology.

So if, 20 years after its invention and 15 years after its introduction, that’s the best I can get in the middle of one of the world’s leading capital cities, I suspect it’ll be 2025 before I see LTE speeds.

You know what? I’m not sure how much I’ll care by then…

Computing is making progress, but so slowly…

I’m sitting in a hotel room on a press trip. I flew in on a flight that landed at 1000, and the first official engagement with the vendor (who remains for the moment anonymous) is tonight. I’ve had all day to hang around and do work stuff. Naturally, you’re never as efficient as you would be at the office, with all the stuff around you that you need. Not least, a nice cup of tea.

But here in the room, miles from anywhere, I’ve got hotel-provided wi-fi for free, a laptop whose battery life is measured in half-days – this dual-core machine with 4GB of memory and a 15.4-inch LED-backlit screen lasts for up to seven hours on one charge – and a phone, with no charger, that won’t last more than a day and a half. I thought I’d brought a cable but managed to forget it in the early-dawn rush to the airport. But all I need is a mini-USB to USB cable, and they’re near-ubiquitous: I’m reasonably confident of finding or borrowing one sometime in the next 24 hours.

Five years ago, the battery life of laptops was abysmal, and phone chargers were all proprietary. And if you’d asked for wi-fi anywhere but a city centre, you’d have been looked at as if you had horns growing out of your head.

Things are improving, if slowly…

Dust to dust…or is that CPUs?

A quick follow-up from a news story I wrote for eWeek yesterday entitled ‘Moore’s Law – Still Driving Down The IT Footprint’.

The story concerned research by Stanford University professor Dr Jonathan Koomey, who found that Moore’s Law was in action long before Gordon Moore coined his eponymous observation. Koomey reckons that Moore’s Law will result in huge growth in mobile devices with fixed computational needs, such as controllers.

Their requirements won’t grow, but smaller, more power-efficient processors will come towards them, to the point where battery life becomes a non-issue.

Then you’ll get ‘dust’, as this fascinating paper posits. Read and enjoy!

Democracy loses to Murdoch – again

Capitalism tends to create monopolies. Over time, we’ve all come to appreciate that monopolies are generally a bad thing (perhaps with the exception of a few areas such as utilities and railways) and should be curbed.

They accumulate too much power in one organisation’s hands, and, because of lack of competition, tend to be able to raise prices to any level they like as well as reducing product choice.

And the media is an industry where that’s particularly egregious because it tends to undermine the democratic process. Here’s a case in point.

According to Ofcom, the UK’s media and telecoms regulator, Rupert Murdoch’s satellite TV operation BSkyB has now reached a point where the regulator has published “a further consultation as part of its pay TV market investigation” as a result of its “concerns about the restricted distribution of premium sports and movies channels operated by BSkyB”.

Specifically, Ofcom is concerned about the “limited distribution of football and movies”, which has seen national games such as cricket and football disappearing from terrestrial TV, and instead commanding premium prices on top of already-expensive pay TV bundles. The regulator said that it “considers that Sky has market power in the wholesale supply of channels containing this attractive content, and that it is acting on an incentive to limit the distribution of these channels to rival TV platforms”. It won’t let its rivals have access to that content for a reasonable price.

Ofcom issued that statement on 26 June 2009. On 6 July, in a little-reported speech – note that Murdoch-owned newspapers dominate the UK market – the UK’s opposition leader David Cameron, who looks set to become UK Prime Minister in 2010, promised that Ofcom “as we know it will cease to exist…. Its remit will be restricted to its narrow technical and enforcement roles. It will no longer play a role in making policy.

“And the policy-making functions it has today will be transferred back fully to the Department for Culture, Media and Sport.”

Only one organisation will benefit from Cameron’s new policy: BSkyB.

In other words, the opposition leader, who is now being politically backed by Murdoch in his many media outlets, is already paying back the political capital that Murdoch has invested in him. That’s despite the Tories’ much-trumpeted belief in competition – which clearly does not apply when there’s Murdoch brown-nosing to be done.

The result will be even greater concentration of media power in the hands of one organisation, fewer outlets for not just movies and sport but news too, and – doubt it not – further politically motivated attacks on the UK media’s one big success story, the BBC.

And, incidentally, if you doubt that the BBC, despite its faults, is a success story, just ask any informed observer outside the UK if they would like to see a BBC-style setup replicated in their own country: none will demur.

If the product in question were rivets, perhaps this would be of little moment. But the product is information that’s required by the electorate.

I leave the logical conclusion to your conscience.

There’s more on this in the Guardian here.