Hard disks and flash storage will co-exist – for the moment

When it comes to personal storage, flash is now the default technology. It’s in your phone, tablet, camera, and increasingly in your laptop too. Is this about to change?

I’ve installed solid-state disks in my laptops for the last three or so years simply because it means they fire up very quickly and – more importantly – battery life is extended hugely. My Thinkpad now works happily for four or five hours while I’m using it quite intensively, where three hours used to be about the maximum.

The one downside is the price of the stuff. It remains stubbornly stuck at 10x or more the price per GB of spinning disks. When you’re using a laptop as I do, with most of my data in the cloud somewhere and only a working set kept on the machine, a low-end flash disk is big enough and therefore affordable: 120GB will store Windows and around 50GB of data and applications.

From a company’s point of view, the equation isn’t so different. Clearly, the volumes of data to be stored are bigger but despite the blandishments of those companies selling all-flash storage systems, many companies are not seeing the benefits. That’s according to one storage systems vendor which recently announced the results of an industry survey.

Caveat: industry surveys are almost always skewed because of sample size and/or the types of questions asked, so the results need to be taken with a pinch – maybe more – of salt.

Tegile Systems reckons that 99 percent of SME and enterprise users who are turning to solid-state storage will overpay. They’re buying more than they need, the survey finds – at least according to the press release, which wastes no time in mentioning, in its second paragraph, that the company penning the release just happens to have the solution. So shameless!

Despite that, I think Tegile is onto something. Companies are less sensitive to the price per GB than they are to the price/performance ratio, usually expressed in IOPS, which is where solid-state delivers in spades. It’s much quicker than spinning disks at returning information to the processor, and it’s cheaper to run in terms of its demands on power and cooling.

Where the over-payment bit comes in is this (from the release): “More than 60% of those surveyed reported that these applications need only between 1,000 and 100,000 IOPS. Paying for an array built to deliver 1,000,000 IOPS to service an application that only needs 100,000 IOPS makes no sense when a hybrid array can service the same workload for a fraction of the cost.”

In other words, replacing spinning disks with flash gives you more performance than you need, a claim justified by the assertion that only a small proportion of the data is being worked on at any one time. So, the logic goes, you store that hot data on flash for good performance while the rest lives on spinning disks, which are much cheaper to buy. The upshot: don’t replace all your disks with flash, just a small proportion, depending on the size of your working data set.
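
To put rough numbers on that logic, here’s a back-of-the-envelope sketch in Python. The per-GB prices, the total capacity and the ten per cent working-set figure are illustrative assumptions of mine, not anything from the survey or from Tegile’s price list.

```python
# Rough sizing sketch: all-flash vs hybrid media cost.
# All figures are illustrative assumptions, not vendor pricing.

def array_cost(capacity_gb, flash_fraction, flash_price_per_gb=1.0,
               disk_price_per_gb=0.10):
    """Estimate media cost for an array that keeps a fraction of its
    capacity on flash and the rest on spinning disk."""
    flash_gb = capacity_gb * flash_fraction
    disk_gb = capacity_gb - flash_gb
    return flash_gb * flash_price_per_gb + disk_gb * disk_price_per_gb

total_gb = 50_000          # 50TB of data to store (assumed)
working_set = 0.10         # assume ~10% of the data is 'hot' at any time

all_flash = array_cost(total_gb, flash_fraction=1.0)
hybrid = array_cost(total_gb, flash_fraction=working_set)

print(f"All-flash media cost : ${all_flash:,.0f}")
print(f"Hybrid media cost    : ${hybrid:,.0f}")
print(f"Hybrid saving        : {1 - hybrid / all_flash:.0%}")
```

Even with generous assumptions about how big the working set is, the hybrid arithmetic comes out well ahead on price, which is the whole basis of the pitch.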

It’s a so-called hybrid solution. And of course Tegile recommends you buy its tuned-up, all-in-one hybrid arrays, which save you the trouble of building your own.

Tegile is not alone in the field, with Pure Storage having recently launched in Europe. Pure uses ordinary consumer-grade flash, which should make it even cheaper, although price comparisons are invariably difficult due to the ‘how long is a piece of string?’ problem.

There are other vendors too but I’ll leave you to find out who they are.

From a consumer point of view though, where’s the beef? There’s a good chance you’re already using a hybrid system if you have a recent desktop or laptop, as a number of hard disk manufacturers have taken to front-ending their mechanisms with flash to make them feel more responsive.

Hard disks are not going away: their price per GB is falling just as quickly as flash’s, even though the two technologies’ characteristics are different. There will, though, come a time when flash capacities are big enough for ordinary use – just like my laptop – and everyone will get super-fast load times and longer battery life.

Assuming that laptops and desktops survive at all. But that’s another story for another time.

Technology highlights 2013

I’ve been shamefully neglecting this blog recently, yet a lot of interesting new technologies and ideas have come my way. So by way of making amends, here’s a quick round-up of the highlights.

Nivio
This is a company that delivers a virtual desktop service with a difference. Virtual desktops have been a persistent topic of conversation among IT managers for years, yet delivery has always been some way off. A bit like fusion energy, only not as explosive.

The problem is that, unless you’re serving desktops to people who do a single task all day – which describes call centre workers but not most people – users expect a certain level of performance and customisation from their desktops. If you’re going to take a desktop computer away from someone who uses it intensively as a tool, you’d better make sure that the replacement technology is just as interactive.

Desktops provided by terminal services have tended to be slow and a bit clunky – and there’s no denying that Nivio’s virtual desktop service, which I’ve tried, isn’t quite as snappy as having 3.4GHz of raw compute power under your fingertips.

On the other hand, there’s a load of upsides. From an IT perspective, you don’t need to provide the frankly huge amounts of bandwidth needed to service multiple desktops. Nor do you care what the end user accesses the service with – so if you’re letting people bring their own devices to work, this will run on anything with a browser. I’ve seen a Windows desktop running on an iPhone – scary…

And you don’t need to buy applications. The service provides them all for you from its standard set of over 40 applications – and if you need one the company doesn’t currently offer, they’ll supply it. Nivio also handles data migration, patching, and the back-end hardware.

All you need to do is hand over $35 per month per user.

Quantum
The company best known for its tape backup products has launched a new range of disk-based backup appliances.

The DXi6800 is, says Quantum’s Stéphane Estevez, three times more scalable than any other such device, allowing you to scale from 13TB to 156TB. Aimed at mid-sized as well as large enterprises, it includes an array of disks that you effectively switch on with the purchase of a new licence. Until then, they’re dormant, not spinning. “We are taking a risk of shipping more disks than the customer is paying for – but we know customer storage is always growing. You unlock the extra storage when you need it,” said Estevez.

It can handle up to 16TB/hour which is, reckons the company, four times faster than EMC’s DD670 – its main competitor – and all data is encrypted and protected by an electronic certificate so you can’t simply swap it into another Quantum appliance. And the management tools mean that you can manage multiple devices across datacentres.

Storage Fusion
If ever you wanted to know at a deep level how efficient your storage systems are, especially when it comes to virtual machine management, then Storage Fusion reckons it has the answers in the form of its storage analysis software, Storage Fusion Analyze.

I spoke to Peter White, Storage Fusion’s operations director, who reckoned that companies are wasting storage capacity by not over-provisioning enough, and by leaving old snapshots and storage allocated to servers that no longer exist.

“Larger enterprise environments have the most reclaimable storage because they’re uncontrolled,” White said, “while smaller systems are better controlled.”

Because the company’s software has analysed large volumes of storage, White was in a position to talk about trends in storage usage.

For example, most companies have 25% capacity headroom, he said. “Customers need that level of comfort zone. Partners and end users say that the reason is because the purchasing process to get disk from purchase order to installation can take weeks or even months, so there’s a buffer built in. Best practice is around that level but you could go higher.”

You also get what White called system losses, due to formatting inefficiencies and OS storage. “And generally processes are often broken when it comes to decommissioning – without processes, there’s an assumption of infinite supply which leads to infinite demand and a lot of wastage.”
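
To put rough numbers on those categories, here’s a toy model in Python. Only the 25 per cent headroom figure comes from White; the formatting-loss and orphaned-storage percentages, and the raw capacity, are my own illustrative guesses.

```python
# Toy model of where raw capacity goes in a typical estate.
# Headroom figure per White; the other percentages are assumptions.

raw_tb = 500.0

headroom = 0.25        # the 'comfort zone' buffer White describes
system_losses = 0.07   # assumed: formatting overhead and OS storage
orphaned = 0.05        # assumed: old snapshots, allocations to dead servers

in_use = raw_tb * (1 - headroom - system_losses - orphaned)
reclaimable = raw_tb * orphaned

print(f"Doing useful work : {in_use:.0f} TB of {raw_tb:.0f} TB")
print(f"Reclaimable       : {reclaimable:.0f} TB (decommissioning debris)")
print(f"Headroom + losses : {raw_tb * (headroom + system_losses):.0f} TB")
```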

The sister product, Storage Fusion Virtualize, “allows us to shine a torch into VMware environments,” White said. “It can see how VM storage is being used and consumed. It offers the same fast analysis, with no agents needed.”

Typical customers include not so much enterprises as systems integrators, service providers and consultants.

“We are complementary to main storage management tools such as those from NetApp and EMC,” White said. “Vendors take a global licence, and end users can buy via our partners – they can buy report packs to run it monthly or quarterly, for example.”

SolidFire
Another product aimed at service providers, SolidFire steps aside from the usual all-solid-state disk (SSD) pitch. Yes, solid state is very fast compared with spinning media, but the company’s claim is that it can deliver a guarantee not just of uptime but of performance.

If you’re a provider of storage services in the cloud, one of your main problems, said the company’s Jay Prassl, is the noisy neighbour, the one tenant in a multi-tenant environment who sucks up all the storage performance with a single database call. This leaves the rest of the provider’s customers suffering from a poor response, leading to trouble tickets and support calls, so adding to the provider’s costs.

The aim, said Prassl, is to help service providers offer guarantees to enterprises they currently cannot offer because the technology hasn’t – until now – allowed it. “The cloud provider’s goal is to compute all the customer’s workload but high-performance loads can’t be deployed in the cloud right now,” he said.

So the company has built SSD technology that, because of the way that data is distributed across multiple solid-state devices – I hesitate to call them disks because they’re not – offers predictable latency.

“Some companies manage this by keeping few people on a single box but it’s a huge problem when you have hundreds or thousands of tenants,” Prassl said. “So service providers can now write a service level agreement (SLA) around performance, and they couldn’t do that before.”

Key to this is the automated way that the system distributes the data around the company’s eponymous storage systems, according to Prassl. It then sets a level of IOPS that a particular volume can achieve, and the service provider can then offer a performance SLA around it. “What we do for every volume is dictate a minimum, maximum and a burst level of performance,” he said. “It’s not a bolt-on but an architecture at the core of our work.”
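
As a rough illustration of what a per-volume minimum/maximum/burst policy might look like, here’s a hypothetical sketch in Python. The class, method and IOPS figures are mine for illustration only; this is not SolidFire’s API.

```python
# Conceptual per-volume QoS along the lines Prassl describes:
# each volume gets a minimum, maximum and burst IOPS figure.

from dataclasses import dataclass

@dataclass
class VolumeQoS:
    min_iops: int    # guaranteed floor - the basis of the performance SLA
    max_iops: int    # steady-state ceiling
    burst_iops: int  # short-term ceiling, only while burst credits remain

    def allowed(self, requested_iops: int, burst_credits: int) -> int:
        """Return how many IOPS this volume may actually consume right now."""
        ceiling = self.burst_iops if burst_credits > 0 else self.max_iops
        return min(requested_iops, ceiling)

# A hypothetical noisy tenant asking for 50,000 IOPS gets capped,
# so the neighbours keep their guaranteed minimums.
tenant_db = VolumeQoS(min_iops=5_000, max_iops=15_000, burst_iops=30_000)
print(tenant_db.allowed(requested_iops=50_000, burst_credits=0))   # 15000
print(tenant_db.allowed(requested_iops=50_000, burst_credits=10))  # 30000
```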

Technology predictions for 2013

The approaching end of the year marks the season of predictions for and by the technology industry for the next year, or three years, or decade. These are now flowing in nicely, so I thought I’d share some of mine.

Shine to rub off Apple
I don’t believe that the lustre attaching to everything Apple does will save it from competitors who can now do pretty much everything it does, but without the smugness. Some of that lustre was deserved when it was just about the only company making smartphones anyone wanted, but that’s no longer true. And despite the success of the iPhone 5, I wonder if its incremental approach – a slightly bigger screen and some nice-to-have features – will be enough to satisfy in the medium term. With no dictatorial obsessive at the top of a company organised around, and for, that individual’s modus operandi, can Apple make awesome stuff again, but in a more collective way?

We shall see, but I’m not holding my breath.

Touch screens
Conventional wisdom says that touchscreens only work when they are horizontal or attached to a handheld device. It must be true: Steve Jobs said so. But have you tried using a touchscreen laptop? Probably not.

One reviewer has, though, and he makes a compelling case for them, suggesting that they don’t lead to gorilla arm, after all. I’m inclined to agree that a touchscreen laptop could become popular, as they share a style of interaction with users’ phones – and they’re just starting to appear. Could Apple’s refusal to make a touchscreen MacBook mean it’s caught wrong-footed on this one?

I predict that touchscreen laptops will become surprisingly popular.

Windows 8
Everyone’s got a bit of a downer on Windows 8. After all, it’s pretty much Windows 7 but with a touchscreen interface slapped on top. Doesn’t that limit its usefulness? And since enterprises are only now starting to upgrade from Windows XP to Windows 7 — and this might be the last refresh cycle that sees end users being issued with company PCs — doesn’t that spell the end for Windows 8?

I predict that it will be more successful than many think: not because it’s especially great – it certainly has flaws, especially when used with a mouse, which means learning how to use the interface all over again.

In large part, this is because the next version of Windows won’t be three or more years away, which has tended to be the release cycle of new versions. Instead, Microsoft is aiming for a series of smaller point releases, much as Apple does but hopefully without the annoying animal names, from which it’s impossible to tell whether you’ve got the latest version.

So Windows Blue – the alleged codename – is the next version. It will take into account lessons from users’ experiences with Windows 8, and acknowledge the growth in touchscreens by including multi-touch. And it will be out in 2013, probably the third quarter.

Bring your own device
The phenomenon whereby firms no longer provide employees with a computing device but instead allow them to bring their own, provided it fulfils certain security requirements, will blossom.

IT departments hate this bring your own device policy because it’s messy and inconvenient but they have no choice. They had no choice from the moment the CEO walked into the IT department some years ago with his shiny new iPhone – he was the first because he was the only one able to afford one at that point – and commanded them to connect it to the company network. They had to comply and, once that was done, the floodgates opened. The people have spoken.

So if you work for an employer, expect hot-desking and office downsizing to continue as the austerity resulting from the failed economic policies of some politicians continue to be pursued, in the teeth of evidence of their failure.

In the datacentre
Storage vendors will be snapped up by the deep-pocketed big boys – especially Dell and HP – as they seek to compensate for their mediocre financial performance by buying companies producing new technologies, such as solid-state disk caching and tiering.

Datacentres will get bigger as cloud providers amalgamate, and will more or less be forced to consider and adopt software-defined networking (SDN) to manage their increasingly complex systems. SDN promises to do that by virtualising the network, in the same way as the other major datacentre elements – storage and computing – have already been virtualised.

And of course, now that virtualisation is an entirely mainstream technology, we will see even bigger servers hosting more complex and mission-critical applications such as transactional databases, as the overhead imposed by virtualisation shrinks with each new generation of technology. What is likely to lag however is the wherewithal to manage those virtualised systems, so expect to see some failures as virtual servers go walkabout.

Security
Despite the efforts of technologists to secure systems – whether for individuals or organisations – security breaches will continue unabated. Convenience trumps security every time, experience teaches us. And this means that people will find increasingly ingenious ways around technology designed to stop them walking around with the company’s customer database on a USB stick in their pocket, or exposing the rest of the world to a nasty piece of malware because they refuse to update their operating system’s defences.

That is, of course, not news at all, sadly.

New developments in open source security

I just spent some time talking to Claudio Guarnieri, European security researcher for Rapid7, about some interesting new open source security developments. Guarnieri is responsible for Cuckoo Sandbox, a malware analysis system. His website reckons that “you can throw any suspicious file at it and in a matter of seconds Cuckoo will provide you back some detailed results outlining what such file did when executed inside an isolated environment.”

But he was also talking about a piece of USB threat-detection software which appears to be unique. Ghost USB Honeypot is a honeypot for malware which spreads via USB storage devices. The aim is to fool malware into infecting a fake device, at which point you can trap and/or analyse it.

It works by emulating a USB device so that, if a computer is infected by malware which propagates using USB flash drives, as so much of it does, the honeypot will trick the malware into infecting the emulated device, where it can be detected without compromising the host system. This kind of attack can be particularly difficult to detect because it can reach high-security machines that aren’t network-connected. Stuxnet was one such.

To anyone looking at it from user space or from higher levels in the kernel-mode storage architecture, the Ghost drive appears to be a real removable storage device, striving to behave exactly like disk.sys, the operating system’s disk class driver. The key to its operation is that malware should not be able to detect that it’s not a real USB device.

You can drive it from a GUI or from the command line, and the aim is for companies to be able to deploy the software on standard client machines without the user having to get involved.

In fact, ideally, according to Ghost’s developer, Bonn University student Sebastian Poeplau, the best way to get this to work successfully is to hide it from the user so they don’t try to write to it. That way, any write access can be assumed to be malware, and the data written is captured in an image file that can be copied off for later analysis. There’s a video of a recent presentation Poeplau gave about the project, its rationale and how it works, here.
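
For what it’s worth, the detection idea can be sketched in a few lines of Python – though the real Ghost is a Windows kernel-mode driver, not a user-space script. The image path and polling interval below are my own assumptions; the point is simply that, if nothing legitimate ever writes to the decoy, any change to its backing image is a red flag.

```python
# Conceptual sketch of the Ghost detection idea, not the real driver:
# watch the decoy's backing image and flag any unexpected write.

import hashlib
import time
from pathlib import Path

IMAGE = Path("ghost_disk.img")   # hypothetical backing image for the decoy
POLL_SECONDS = 5

def digest(path: Path) -> str:
    """Hash the whole backing image so any write changes the digest."""
    return hashlib.sha256(path.read_bytes()).hexdigest()

baseline = digest(IMAGE)
while True:
    time.sleep(POLL_SECONDS)
    current = digest(IMAGE)
    if current != baseline:
        print("Write detected on decoy device - possible USB malware")
        baseline = current
```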

Why is finding the right keyboard impossible?

In a world where every product you could possibly think of wanting is made, and orders of magnitude more products exist that shouldn’t be made at all, why is it impossible for me to find the keyboard I want?

When I started using PCs, the IBM PC and AT keyboards had function keys on the left-hand side, the keyswitches were loud and mechanical, and took a fair amount of force to depress. We cared a lot about keyboards in the days before graphical user interfaces were invented because they were the user interface.

I liked those keyboards. The intervening 25 years have brought a whole variety of input devices, but I’ve remained wedded to the Northgate Omnikey keyboard I bought back in 1988. I must have written millions of words on that 23-year-old keyboard and nothing has ever gone wrong. I can disassemble it to clean it every now and again, but it’s had no other maintenance. It’s a marvellous piece of kit.

Incidentally, Northgate as a company is now, sadly, long gone, but you can still buy very similar keyboards from Creative Vision Technologies.

But I’m ready to move on. I’m ready to give up the loud, forceful, clicky keyboard for a quieter device with keys that are easier to press and which, after a heavy day’s typing, don’t make my hands feel like they’ve been through an assault course. But I’m not ready to give up the F keys down the left-hand side of the keyboard.

I use the keyboard a lot, even though I could use a mouse. It takes me a few milliseconds to hit Ctrl-S or Alt-F4 to save a file or close a program, compared to the seconds it takes to move my hand over to the mouse, schlep the cursor to the right icon and click it. It’s what my hands are used to doing.

And you need only one hand to do it when the F keys are in the right place: on the left. Try Alt-F4 with the F keys along the top of the main keyboard layout and it feels really clunky and awkward — the result being that very few people ever use those keys.

So why can’t I find a quiet, easy-action keyboard with F keys on the left? Advice gratefully received….

Microsoft trashes its brand — with Apple the big winner

You have to wonder if Microsoft really knows what it’s doing. There’s a lot of hoo-hah around the Web about Windows 7, and how it’s going to fix Vista’s problems. Thing is, the signs are that it won’t — and that Apple will be the biggest winner.

That Microsoft understands it has a problem with Vista is obvious: the company took — what? — six years to drag Vista onto dealers’ shelves after the launch of Windows XP, following which Microsoft seemed to just squat on its haunches and watch the money roll in. In comparison, Windows 7, slated to launch later this year, follows hard on Vista’s heels, just over two years later.

Windows Vista’s done a lot of damage to Microsoft’s reputation and brand. The last outright dog was Windows ME, which answered a question no-one asked (a bit like a Porsche Cayenne – only prettier) but proved to have all sorts of technical problems associated with it (not at all like a Porsche Cayenne, apparently).

But when ME was launched, messing with PCs was still by and large a minority sport.

No longer. Everyone and his or her dog has at least one PC. My sister-in-law, who knows close to nothing about computers, has two in her family — and guess who gets the tech support questions — but let’s just leave that one there. The point is that the brand is now ubiquitous, and Microsoft messes with it at its peril.

So the damage to Microsoft is proportionately bigger when it messes up as it has done with Vista — it’s so bad even people who know nothing about Vista notice. They notice that some of their old software doesn’t run properly any more. They notice too that they keep getting asked stupid questions to which they don’t know and couldn’t possibly be expected to know the answer. So of course they click OK — in which case, users quite reasonably say, why does the computer bother them at all?

Is Windows 7 going to fix these issues? We’re told so, and I hope to be able to report on a copy of the release candidate in the not too distant future.

Just as important from Microsoft’s point of view is the enterprise market. A recent survey of over 1,100 IT managers, commissioned by KACE, a systems management company, found that “84 percent of IT staff polled do not have plans to upgrade existing Windows desktop and laptop systems to Windows 7 in the next year”.

Why aren’t IT managers following the Microsoft roadmap — assuming such a thing exists (Redmond used to flaunt one but hasn’t done so for years)? They cited software compatibility, cost of implementation, and the current economic environment as their main concerns.

The story told to me by the company’s Wynne White is that enterprises are sticking with XP for the time being. Some 89 per cent of the 500,000-plus desktops managed by KACE appliances use it, while just 1.89 per cent use Vista. For sure, deployments of new enterprise desktops are always slow — it’s the nature of the beast — but White reckoned that he could see at least five years’ life in XP yet.

And while they’re not going to Vista, they’re also being much more cautious with Win7. “People’s perception is positive but they’re being much more cautious in their approach,” he said.

“Eighty-four per cent are not going to adopt Windows 7 in the next 12 months, but more telling is that 72 per cent said they were more concerned about upgrading to Windows 7 than they were about staying with XP,” said White.

What all this suggests is that Windows has run out of steam. People no longer have any real reason to upgrade — if that’s the right word. It’s hard to avoid the conclusion that, for all intents and purposes, Windows XP is good enough: it’s easy to use, robust, stable, and reasonably secure (could do better, of course). Neither of its two successors offers all of that — and they’re just as expensive.

Linux looks to be a big desktop OS winner — at least in the enterprise. White reckoned that, in the 2007 version of this annual survey, 42 per cent said they’d switch to Linux, but two years later in 2009, half said they’d switch. And when asked if they either had switched or were in the process of switching, nine per cent said yes in 2007, 11 per cent in 2008 and 14 per cent this year.

But Linux isn’t the big beast Microsoft fears: it’s Apple. Between a third and a half of those IT managers said they were contemplating going Mac for their next tranche of desktops.

It’s hard to avoid the conclusion that Microsoft seems to be in the process of trashing its brand — and Apple looks to be the biggest picker-up of the pieces.

It’s just a major shame that Apple’s business model and contempt for its users is even less appetising than Microsoft’s…