Cloud transfers made easy


A while back, I wrote about the problem of consumer trust in the cloud – in particular, the problem of what happens when your cloud provider decides to change the T&Cs to your detriment, and how this can erode the trust that consumers, already alert to the technology industry’s much-publicised failures, are in danger of losing.

The issue that prompted this was the massive capacity reduction by Amazon for its cloud storage service – Cloud Drive – from unlimited to a maximum of 5GB. The original price was just £55 a year but Amazon’s new price for 15TB, for example, is £1,500.

So at this point, unless you’re happy to pay that amount, two solutions suggest themselves. The first is to invest in a pile of very large hard disks – twice as many as you need because, you know, backups – and then become your own storage manager. Some excellent NAS devices and software packages such as FreeNAS make this process much easier than it used to be, but you’ll still need to manage the systems and/or buy the supporting hardware, and pay the power bill.
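To put rough numbers on the do-it-yourself option, here’s a back-of-the-envelope sketch in Python. The drive price, power draw and electricity tariff are my own illustrative assumptions, not quotes, so treat the output as an order-of-magnitude guide only.

```python
import math

# Back-of-the-envelope comparison of self-hosted storage with the cloud price
# quoted above. Drive price, power draw and tariff are illustrative assumptions.
data_tb = 15                   # amount of data to store
drive_tb = 4                   # assumed capacity per drive
drive_price_gbp = 110          # assumed price per 4TB drive
nas_watts = 40                 # assumed average draw of a small NAS
tariff_gbp_per_kwh = 0.15      # assumed electricity tariff

drives_for_data = math.ceil(data_tb / drive_tb)
drives_total = drives_for_data * 2          # double up for the backup copy
hardware_gbp = drives_total * drive_price_gbp

annual_power_gbp = nas_watts / 1000 * 24 * 365 * tariff_gbp_per_kwh

print(f"Drives needed: {drives_total}, upfront hardware: £{hardware_gbp}")
print(f"Electricity per year: £{annual_power_gbp:.0f}")
print("Amazon's quoted price for 15TB: £1,500 per year")
```

On those assumptions the hardware pays for itself within the first year compared with the new cloud price, but that doesn’t include your time as the storage manager.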

The alternative is to retain some trust in the cloud – while remaining wary. But this is only half the solution; I’ll get back to that later.

This individual has found another cloud provider, Google G Suite, which offers unlimited storage and a whole heap of business services for a reasonable £6 per month. Google requires you to own your domain and to be hosting your own website, but if you can satisfy those requirements, you’re in. Other cloud providers have deals too, but this was the best I could find.

Cloud-to-cloud transfer
So the problem then is how to transfer a large volume of data to the new cloud service. One way is to re-upload it, but this is very long-winded: over a 20Mbps fibre-to-the-cabinet (FTTC) connection it will take months, it can clog up your connection if you have other uses for that bandwidth, and for anyone on a metered broadband connection it will be expensive too. And if you don’t run a dedicated server, you’ll need a machine left on for the duration.
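As a sanity check on the “months” claim, a quick back-of-the-envelope calculation helps. It assumes the full 20Mbps is usable upstream – generous for an FTTC line, where the upstream rate is usually lower – and that the upload runs continuously.

```python
# Rough estimate of re-uploading 15TB over a 20Mbps link (decimal units).
data_tb = 15
link_mbps = 20

data_bits = data_tb * 1e12 * 8            # terabytes -> bits
seconds = data_bits / (link_mbps * 1e6)   # bits / bits-per-second
days = seconds / 86400

print(f"About {days:.0f} days of continuous uploading")   # roughly 69 days
```

Even under those generous assumptions, that’s well over two months of saturating the line.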

Cloud-to-cloud transfer services exist to solve this problem, and after some research I found cloudHQ. For a reasonable fee – or for free if you blog about it (yes, that’s what I’m doing here) – cloudHQ will transfer data between a range of cloud services, including Google, Amazon (S3 and Cloud Drive), Gmail, Box, Basecamp, Office 365, Evernote and many more.

CloudHQ does more: it will also back up and sync in real time, forward emails, save them as PDFs, act as a repository for large attachments, and provide a range of other email- and scheduling-related services for Google and other cloud providers.

The basic service is free but limited to 20GB and a maximum file size of 150MB; the next tier up, Premium, costs £19.80 a month and offers pretty much everything the power user could want.

Hybrid clouds and backup
So is cloudHQ the solution to the problem of cloud-to-cloud transfers? Yes, but putting your data in the cloud still leaves you with a single copy without a backup (I said I’d get back to this). So either you need another cloud service, in which case cloudHQ will keep them in sync, or you create a hybrid solution, where the primary data lives under your direct control and management, but the off-site backup lives in the cloud.

This hybrid setup is the one that businesses are increasingly opting for, and for good reason. And frankly, since your irreplaceable personal data – think photos and the like – is at risk unless you keep at least two copies, preferably three, using both local and cloud storage makes huge sense.

How to stay safe on the Internet – trust no-one

Working close to the IT industry as I do, it’s hard to avoid the blizzard of announcements and general excitement around the growth of the Internet of Things allied to location-based services. This, we are told, will be a great new way to market your goods and services to consumers.

You can get your sales assistants to greet shoppers by name! You can tell them about bargains by text as they walk past your store! You might even ring them up! Exclamation marks added for general effect.

But here’s the thing. Most people don’t trust big corporations any more, according to the recently published 2013 IT Risk/Reward Barometer report. As this international study finds: “Across all markets surveyed, the vast majority of consumers worry that their information will be stolen (US: 90%, Mexico: 91%, India: 88%, UK: 86%).”

As a result, blizzard marketing of the kind that triangulation technologies now permit makes people feel uneasy at best and downright annoyed at worst. People ask themselves who has their data, how they got it, and what control they have over that data once it’s escaped into the ether.

From ISACA’s point of view, this is largely the fault of individuals who don’t control their passwords properly or otherwise secure their systems. It’s an auditing organisation, so that’s not an unusual position to adopt. But I think it goes further than that.

As the study also points out: “Institutional trust is a critical success factor in an increasingly connected world. […] Organisations have much work to do to increase consumer (and employee) trust in how personal information is used.”

In other words, companies need to work harder at winning your trust. Does that make you feel any better?

This is clearly not an issue that will ever be solved. For every ten organisations that are trustworthy and manage personal data responsibly – you do read that text-wall of privacy policy each time you log onto a new site, don’t you? – there will be one that doesn’t. Even if all companies were trustworthy, people would still make mistakes and hackers would win the security battle from time to time, resulting in compromised personal data.

The only rational policy for the rest of us to adopt is to trust none of them, and that is what this study shows most people tend to do.

The least you should do is use long, complex passwords and change them regularly, using a password safe (eg KeePass) so you don’t have to commit them to memory – or worse, to bits of paper.
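If you want to generate such passwords locally before dropping them into your password safe, a few lines of Python will do it. This is a minimal sketch; the length and character set are arbitrary choices of mine, not recommendations from any particular standard.

```python
import secrets
import string

def generate_password(length: int = 24) -> str:
    """Return a random password drawn from letters, digits and punctuation."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))

# Generate one password per account and store each in your password safe.
print(generate_password())
```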

FYI, the study was conducted by ISACA, which describes itself as “an independent, nonprofit, global association” that “engages in the development, adoption and use of globally accepted, industry-leading knowledge and practices for information systems.”

Whom do you trust?

Keeping your data secure is something you need to be constantly aware of. Apart from the army of people out there who actively seek your credit card and other financial and personal details, not to mention the breadcrumbs that accumulate to a substantial loaf of data on social media, it’s too easy to give the stuff away on your own.

It’s really all about trust. We’re not very good at choosing whom we trust, as we tend to trust people we know – or sometimes just the people around us. As an example, I present a little scenario I encountered yesterday on a train.

The train divides en route, so to get to your destination you need to be in the right portion of the train. The person opposite me sat for 45 minutes through seemingly endless announcements – from the guard, from the scrolling dot-matrix displays, and from the irritatingly frequent automated messages – all conveying the same information before, during and after the three or four stops leading up to the point where passengers had to choose which bit of the train to be in.

At the station where a decision had to be made, she leaned across and asked if she was in the right portion of the train for her destination.

Why? She would rather trust other passengers than the umpteen announcements. She’s not alone, as I’ve seen this happen countless times.

So it’s all about whom you trust. As passengers, we were trustworthy.

So presumably were the security researchers with clipboards standing at railway stations asking passengers for their company PC’s password in exchange for a cheap biro. They gathered plenty of passwords.

I recently left a USB phone charger in a hotel belonging to a major international chain. They said they would post it back if I sent them a scanned copy of my credit card to cover the postage. That they even made the offer suggests there must be plenty of people willing to take the gamble that their email won’t be read by someone who shouldn’t – not to mention what happens after the hotel has finished with the data. Can they be sure the email would be securely deleted?

I declined the offer and suggested that this major chain could afford the £7 it would cost to pop it in the post. Still waiting, but not with bated breath. I don’t trust them.

2012: the tech year in view (part 1)

As 2012 draws to a close, here’s a round-up of some of the more interesting news stories that came my way this year. This is part 1 of 2 – part 2 will be posted on Monday 31 December 2012.

Storage
Virsto, a company making software that boosts storage performance by sequentialising the random data streams from multiple virtual machines, launched Virsto for vSphere 2.0. According to the company, this adds features for virtual desktop infrastructures (VDI), and it can lower the cost of providing storage for each desktop by 50 percent. The technology can save money because you need less storage to deliver sufficient data throughput, says Virsto.

At the IPExpo show, I spoke with Overland, which has added a block-based product called SnapSAN to its portfolio. According to the company, the SnapSAN 3000 and 5000 offer primary storage using SSD for caching or auto-tiering. This “moves us towards the big enterprise market while remaining simple and cost-effective,” said a spokesman. Also, Overland’s new SnapServer DX series now includes dynamic RAID, which works somewhat like Drobo’s system in that you can install differently sized disks into the array and still use all the capacity.

Storage startup Tegile is one of many companies making storage arrays that mix spinning and solid-state disks to boost performance and, the company claims, to do so cost-effectively. Tegile says it reduces data aggressively, using de-duplication and compression, and so cuts the cost of the SSD overhead. Its main competitor is Nimble Storage.

Nimble itself launched a so-called ‘scale to fit’ architecture for its hybrid SSD/spinning-disk arrays this year, adding expansion shelves that allow capacity to be increased. It’s a unified approach, says the company, which means that adding storage doesn’t entail a lot of admin work moving data around.

Cloud computing
Red Hat launched OpenShift Enterprise, a cloud-based platform-as-a-service (PaaS). This is, says Red Hat, a solution for developers launching new projects, and it includes a development toolkit that allows you to fire up new VM instances quickly. Built on SELinux, the system lets you fire up a container and get middleware components such as JBoss, PHP and a wide variety of languages. The benefit, says the company, is that the system allows you to pool your development projects.

Red Hat also launched Enterprise Virtualization 3.1, a platform for hosting virtual servers with up to 160 logical CPUs and up to 2TB of memory per virtual machine. It adds command line tools for administrators, and features such as RESTful APIs, a new Python-based software development kit, and a bash shell. The open source system includes a GUI to allow you to manage hundreds of hosts with thousands of VMs, according to Red Hat.

HP spoke to me at IPExpo about a new CGI rendering system that it’s offering as a cloud-based service. According to HP’s Bristol labs director, it’s 100 percent automated and autonomic: a graphics designer uses a framework to send a CGI job to a service provider, which creates the film frames. The service estimates the number of servers required, sets them up and configures them automatically in just two minutes, then tears them down once the video frames have been delivered. The evidence that it works can apparently be seen in the animated film Madagascar, where, to make the lion’s mane move realistically, calculations were needed for 50,000 individual hairs.

For the future, HP Labs is looking at using big data and analytics for security purposes, and at providing an app store for analytics as a service.

Security
I also spoke with Rapid7, an open-source security company that offers a range of tools for companies large and small to control and manage the security of their digital assets. Its portfolio includes a vulnerability scanner, Nexpose; a penetration testing tool, Metasploit; and Mobilisafe, a tool that “discovers, identifies and eliminates risks to company data from mobile devices”, according to the company. Overall, Rapid7 aims to provide “solutions for comprehensive security assessments that enable smart decisions and the ability to act effectively” – a tall order in a crowded security market.

I caught up with Druva, a company that develops software to protect mobile devices such as smartphones, laptops and tablets. Given the explosive growth in the numbers of end-user owned devices in companies today, this company has found itself in the right place at the right time. New features added to its flagship product inSync include better usability and reporting, with the aim of giving IT admins a clearer idea of what users are doing with their devices on the company network.

Networking
Enterasys – once Cabletron for the oldies around here – launched a new wireless system, IdentiFi. The company calls it wireless with embedded intelligence offering wired-like performance but with added security. The system can identify issues of performance and identity, and user locations, the company says, and it integrates with Enterasys’ OneFabric network architecture that’s managed using a single database.

Management
The growth of virtualisation in datacentres has created a need to manage the virtual machines, and a number of companies focusing on this problem have sprung up. Among them is vKernel, whose vOPS Server product aims to be an easy-to-use tool for admins; experts should feel they have another pair of hands to help them do stuff, as one company spokesman put it. The company, now owned by Dell, claims the largest feature set for virtualisation management when you include its vKernel and vFoglight products, which provide analysis, advice and automation of common tasks.

Technology predictions for 2013

The approaching end of the year marks the season of predictions for and by the technology industry for the next year, or three years, or decade. These are now flowing in nicely, so I thought I’d share some of mine.

Shine to rub off Apple
I don’t believe that the lustre attaching to everything Apple does will save it from competitors who can now do pretty much everything it does, but without the smugness. Some of that lustre was deserved when it was the only company making smartphones, but this is no longer true. And despite the success of the iPhone 5, I wonder if its incremental approach – a slightly bigger screen and some nice-to-have features – will be enough to satisfy in the medium term. With no dictatorial obsessive at the top of a company organised around, and for, that individual’s modus operandi, can Apple make awesome stuff again, but in a more collective way?

We shall see, but I’m not holding my breath.

Touch screens
Conventional wisdom says that touchscreens only work when they are either horizontal or attached to a handheld device. It must be true: Steve Jobs said so. But have you tried using a touchscreen laptop? Probably not.

One reviewer has, though, and he makes a compelling case for them, suggesting that they don’t lead to gorilla arm after all. I’m inclined to agree that touchscreen laptops could become popular, as they share a style of interaction with users’ phones – and they’re just starting to appear. Could Apple’s refusal to make a touchscreen MacBook mean it’s caught wrong-footed on this one?

I predict that touchscreen laptops will become surprisingly popular.

Windows 8
Everyone’s got a bit of a downer on Windows 8. After all, it’s pretty much Windows 7 with a touchscreen interface slapped on top. Doesn’t that limit its usefulness? And since enterprises are only now starting to upgrade from Windows XP to Windows 7 – and this might be the last refresh cycle that sees end users being issued with company PCs – doesn’t that spell the end for Windows 8?

I predict that it will be more successful than many think – not because it’s especially great (it certainly has flaws, particularly when used with a mouse, which means learning how to use the interface all over again).

In large part, this is because the next version of Windows won’t be three or more years away, as has tended to be the gap between releases. Instead, Microsoft is aiming for a series of smaller point releases, much as Apple does – though hopefully without the annoying animal names, from which it’s impossible to tell whether you’ve got the latest version.

So Windows Blue – the alleged codename – is the next version. It will take into account lessons from users’ experiences with Windows 8, and will acknowledge the growth in touchscreens by including multi-touch. It will be out in 2013, probably in the third quarter.

Bring your own device
The phenomenon whereby firms no longer provide employees with a computing device but instead allow them to bring their own, provided it fulfils certain security requirements, will blossom.

IT departments hate this bring your own device policy because it’s messy and inconvenient but they have no choice. They had no choice from the moment the CEO walked into the IT department some years ago with his shiny new iPhone – he was the first because he was the only one able to afford one at that point – and commanded them to connect it to the company network. They had to comply and, once that was done, the floodgates opened. The people have spoken.

So if you work for an employer, expect hot-desking and office downsizing to continue as the austerity resulting from the failed economic policies of some politicians continues to be pursued, in the teeth of evidence of its failure.

In the datacentre
Storage vendors will be snapped up by the deep-pocketed big boys – especially Dell and HP – as they seek to compensate for their mediocre financial performance by buying companies producing new technologies, such as solid-state disk caching and tiering.

Datacentres will get bigger as cloud providers amalgamate, and will more or less be forced to consider and adopt software-defined networking (SDN) to manage their increasingly complex systems. SDN promises to do that by virtualising the network, in the same way as the other major datacentre elements – storage and computing – have already been virtualised.

And of course, now that virtualisation is an entirely mainstream technology, we will see even bigger servers hosting more complex and mission-critical applications such as transactional databases, as the overhead imposed by virtualisation shrinks with each new generation of technology. What is likely to lag however is the wherewithal to manage those virtualised systems, so expect to see some failures as virtual servers go walkabout.

Security
Despite the efforts of technologists to secure systems – whether for individuals or organisations – security breaches will continue unabated. Convenience trumps security every time, experience teaches us. And this means that people will find increasingly ingenious ways around technology designed to stop them walking around with the company’s customer database on a USB stick in their pocket, or exposing the rest of the world to a nasty piece of malware because they refuse to update their operating system’s defences.

That is, of course, not news at all, sadly.

Time to end loyalty card schemes

Am I alone (distraction: how many rants start like this?) in thinking that few of the trappings of the modern world are as annoying and deeply insidious as the loyalty card? Every shop in the high street offers one, it seems, so it can’t be a bad thing, or they wouldn’t get away with it, would they?

“Do you have a loyalty card?” they twitter. I’ve just encountered the final straw – hence this posting.

On the face of it, what’s not to like? You give the organisation your name and address, they send you a card, and you get a percentage point or two off your shopping. In these hard times, many a mickle makes a muckle.

But they never tell you the whole story. They will never come out and say that, if you subscribe, the company will bombard you with offers that, based on your spending patterns, they think you will want. Well, maybe you will, maybe you won’t, but would you rather not make those choices at a time of your own choosing, under your own steam as it were, rather than being manipulated by some marketing droid or, worse, by some marketing algorithm deep in a data centre somewhere?

If those cards weren’t worth administering, then companies such as Tesco – feted in marketing circles as among the most successful deployers of such schemes – wouldn’t do it. The reason it’s worth it is not just that they get their hands on your spending patterns, which of course they do and which raises other issues – see below – but also that you spend more. Each marketing mailout increases demand for whatever is being pushed at the consumer.

So whatever discount you’re promised, you’re almost certain to have wiped it out by buying more stuff you wouldn’t have bought had the scheme not been in place. That’s more profit for Tesco.
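The arithmetic is easy to illustrate. The figures below are invented purely to show how a small points rebate can be swallowed by a modest rise in spending; substitute your own numbers.

```python
# Illustrative only: made-up figures comparing the rebate with the extra spend.
weekly_spend = 100.0        # assumed baseline weekly shop (£)
rebate_rate = 0.01          # assumed 1% points rebate
extra_weekly_spend = 3.0    # assumed extra spend prompted by targeted offers

annual_rebate = weekly_spend * rebate_rate * 52
annual_extra_spend = extra_weekly_spend * 52

print(f"Annual rebate:      £{annual_rebate:.2f}")      # £52.00
print(f"Annual extra spend: £{annual_extra_spend:.2f}") # £156.00
```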

What’s more, the cost of such schemes is offset by hiking prices, as demonstrated by the Morrisons supermarket chain, which cut prices when it abolished its loyalty card scheme back in 2004. And Asda told the Daily Telegraph it wouldn’t be implementing a scheme because: “It would have cost £60m to set up and £20m to £30m a year to maintain.”

But more fundamentally important is the loss of privacy that these cards entail. As the Telegraph feature referenced above reports, one campaigner likened having a loyalty card to walking around with a barcode stamped on your backside.

What I buy is my business, not that of a marketing programme. The data my buying provides means that more snippets of data about me sit in the public domain, waiting for some future organisation to hoover up and use in ways as yet unspecified.

Those who made this argument ten years ago were shouted down as paranoid. But today – with the growth of huge, globally accessible databases, with companies amalgamating and sharing data, and with basic security measures, such as not walking around with databases on a laptop or USB memory stick liable to theft or absent-mindedness, seemingly beyond both commercial organisations and the government – it behoves us all to hang onto those snippets.

Piled up in one place, a lot of snippets make a profile. Many a mickle makes a muckle.

EU takes the Janus position

The UK government has on many occasions shown itself to be more interested in spying on its subjects than in fixing the recession, and this week is no different. But the European Union seems with one hand to be aiding and abetting such activity, and with the other, bashing the UK round the head for something similar.

This week brings news that the European Union has followed up its threat of legal action against the beleaguered New Labour administration by opening a case against the UK for allowing BT to test Phorm’s deep packet inspection and behavioural advertising system. Customers were unaware that their data was being examined by BT for commercial purposes.

As a result, the EU has told the UK that it must comply with the EU Directive on privacy and electronic communications, which is equivalent to a legally enforceable requirement, and which mandates that member states must “ensure confidentiality of communications and related data traffic data by prohibiting unlawful interception and surveillance” unless the users concerned have consented.

The legal case follows numerous letters, to which the UK government responded that it was satisfied the Phorm system met European data laws, before ignoring further requests for clarification.

On the other hand, the EU’s Data Retention Directive has provoked widespread condemnation. It compels member states to store users’ communication information for a full year, starting on 15 March 2009. This means that every email, phone call and text message sent or received will have to be recorded.

The thinking behind it is that: “retention of data has proved to be such a necessary and effective investigative tool for law enforcement in several Member States, and in particular concerning serious matters such as organised crime and terrorism, it is necessary to ensure that retained data are made available to law enforcement authorities.”

That’s despite the assertion in the directive that, in a democracy, “everyone has the right to respect for his private life and his correspondence”. It does however go on to say that the directive: “relates only to data generated or processed as a consequence of a communication or a communication service and does not relate to data that are the content of the information communicated.”

In other words, it’s the contact not the content that will be stored — although this does include your user ID, IP address, DSL line (where appropriate) and the date and time of contact.

Even so, it seems contradictory for the EU to be lambasting the UK for spying on Internet traffic while insisting that your phone bill be made available to local police forces.