Cloud transfers made easy


A while back, I wrote about the problem of consumer trust in the cloud – in particular, what happens when your cloud provider changes its T&Cs to your detriment, and how this erodes the trust of consumers who are already alert to the technology industry’s much-publicised failures.

The issue that prompted this was the massive capacity reduction by Amazon for its cloud storage service – Cloud Drive – from unlimited to a maximum of 5GB. The original price was just £55 a year but Amazon’s new price for 15TB, for example, is £1,500.

So at this point, unless you’re happy to pay that amount, two solutions suggest themselves. The first is to invest in a pile of very large hard disks – twice as many as you need because, you know, backups – and then become your own storage manager. Some excellent NAS devices and software packages such as FreeNAS make this process much easier than it used to be, but you’ll still need to manage the systems and/or buy the supporting hardware, and pay the power bill.

The alternative is to retain some trust in the cloud – while remaining wary. But this is only half the solution; I’ll get back to that later.

I have since found another cloud provider, Google G Suite, which offers unlimited storage and a whole heap of business services for a reasonable £6 per month. Google requires you to own your domain and to be hosting your own website but, if you can satisfy those requirements, you’re in. Other cloud providers have offers too, but this was the best deal I could find.

Cloud-to-cloud transfer
So the problem then is how to transfer a large volume of data to the new cloud service. One way is simply to re-upload it, but this is very long-winded: over a 20Mbps fibre-to-the-cabinet (FTTC) connection it will take months, it can clog up your connection if you have other uses for that bandwidth, and for anyone on a metered broadband connection it will be expensive too. And unless you run a dedicated server, you’ll need a machine left on for the duration.
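
To put a rough figure on that, here’s a back-of-the-envelope sketch in Python. The 15TB library and 20Mbps upstream rate are just the illustrative figures used above; real uploads will be slower still once protocol overhead and any rate-limiting are factored in.

```python
def upload_days(data_tb: float, uplink_mbps: float) -> float:
    """Idealised transfer time in days: data volume divided by raw link speed."""
    bits = data_tb * 1e12 * 8          # terabytes -> bits (decimal TB)
    seconds = bits / (uplink_mbps * 1e6)
    return seconds / 86_400            # seconds -> days

# 15TB over a 20Mbps upstream link: roughly 69 days of continuous uploading.
print(f"{upload_days(15, 20):.0f} days")
```

And that is the best case, with the connection saturated around the clock.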

Cloud-to-cloud transfer services exist to solve this problem – and after some research, I found cloudHQ. For a reasonable fee – or for free if you blog about it (yes, that’s what I’m doing here) – cloudHQ will transfer data between a range of cloud services, including Google, Amazon (S3 and Cloud Drive), Gmail, Box, Basecamp, Office 365, Evernote and many more.

CloudHQ does more: it will back up and sync in real time too, forward emails, save them as PDFs, act as a repository for large attachments, and provide a range of other email- and scheduling-related services for Google and other cloud providers.

The basic service is free but limited to 20GB and a maximum file size of 150MB; the next tier up, Premium, costs £19.80 a month and offers pretty much everything the power user could want.

Hybrid clouds and backup
So is cloudHQ the solution to the problem of cloud-to-cloud transfers? Yes, but putting your data in the cloud still leaves you with a single copy without a backup (I said I’d get back to this). So either you need another cloud service, in which case cloudHQ will keep them in sync, or you create a hybrid solution, where the primary data lives under your direct control and management, but the off-site backup lives in the cloud.

This hybrid setup is the one that businesses are increasingly opting for, and for good reason. And frankly, since your irreplaceable personal data – think photos and the like – is at risk unless you keep at least two copies, preferably three, using both local and cloud storage makes huge sense.
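
For the local half of that hybrid setup, dedicated tools (your NAS software, the provider’s own sync client, or a utility such as rclone) do the heavy lifting, but the idea is simple enough to sketch. The Python below is a minimal, illustrative mirror that copies new or changed files from a primary folder into a folder that a cloud client then pushes off-site; the paths and folder names are purely hypothetical.

```python
import shutil
from pathlib import Path

# Illustrative paths only - adjust to your own layout.
PRIMARY = Path.home() / "Photos"                      # primary copy, under your control
CLOUD_MIRROR = Path.home() / "CloudDrive" / "Photos"  # folder your cloud client syncs off-site

def mirror(src: Path, dst: Path) -> None:
    """Copy any file that is missing from, or newer than, the mirrored copy."""
    for item in src.rglob("*"):
        if item.is_dir():
            continue
        target = dst / item.relative_to(src)
        target.parent.mkdir(parents=True, exist_ok=True)
        if not target.exists() or item.stat().st_mtime > target.stat().st_mtime:
            shutil.copy2(item, target)   # copy2 preserves timestamps

if __name__ == "__main__":
    mirror(PRIMARY, CLOUD_MIRROR)
```

A real backup tool adds verification, deletion handling and versioning on top of this, which is why it’s worth using one rather than rolling your own – but the principle is the same.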

How Firefox just blew it

As a journalist, my Firefox browser – which I’ve been using since almost the day it arrived – is my primary research tool. It’s the place I call home. And it’s just been upgraded. It’s a big upgrade that for me will change the way it works, massively. I’m saying no.

Upgraded

The web is full of articles praising its developer, Mozilla, for updating it so it’s twice as fast. One article declares that “Mozilla’s mission is to keep the web open and competitive, and Firefox is how Mozilla works to endow the web with new technology like easier payments, virtual reality and fast WebAssembly-powered games.” This is endorsed by a Gartner analyst – Gartner being the biggest, and therefore the go-to, analyst house in the technology industry for those needing a quote.

If you’re waiting for a ‘but’, here it is. Frankly, I don’t care how much faster it is if it means that half the functionality I’m used to is stripped away. Because that’s what allowing my browser to upgrade to the latest, greatest version would mean.

Extensions

It’s all because Mozilla made the clever move to open up the browser very early on to third parties, who wrote extensions to add features and functionality. I loved that idea, embraced it wholeheartedly, and now run about 20 extensions.

The new Firefox – which despite its apocalyptic upgrade moves only from version 56.0.2 to 57.0 – will no longer run the extensions that, for me, have been the most useful.

Software developers love adding new stuff and making things look new using the latest software tools. Mozilla has been no slouch in this department. Fine for developers perhaps, but as a user, this constant change is a pain in the arse, as it means I need to re-learn each time how to use the software.

So Classic Theme Restorer (CTR) is particularly precious to me, as it enables Firefox to look and feel pretty much as it did when I first started using it.

CTR puts things – such as toolbars and menus – back where they were, so they work the way they have always worked and, for that matter, the way most of my software works. But after the upgrade, CTR cannot work, because the hooks the browser provided for it to do its stuff don’t exist in the new version.

Two other extensions are key from my point of view. One gives me tree-style tab navigation to the left of the browser window, not along the top where multiple tabs pretty soon get lost. And tab grouping, a feature that disappeared a few generations of browser ago but was replaced by a couple of extensions, means you can keep hundreds of tabs open, arranged neatly by topic or project. Who wouldn’t want this if they work in the browser all day?

Meanwhile, the developers of some other extensions have given up, due to the effort involved in completely re-writing their code, while others will no doubt get there in some form or other, eventually.

Messing with look and feel

This is a serious issue. Back in the day, one of the much-touted advantages of a graphical user interface was that all software worked the same, reducing training time: if you could use one piece of software, you could use them all. No more. Where did that idea go?

Mozilla clearly thinks performance – which can instead be boosted by adding a faster CPU – is paramount. Yes, it’s important but a browser is now a key tool, and removing huge chunks of functionality is poor decision-making.

I feel like my home is being dismantled around me. The walls have shifted so that the bedroom is now where the living room used to be, the front door is at the back, and I’ve no idea where the toilet is.

Some might argue that I should suck it up and move with the times. But I don’t use a browser to interact with the technology; I use it to capture information. Muscle memory does the job without having to think about the browser’s controls or their placement. If the tool gets in the way and forces me to think about how it works, it’s a failure.

So version 57 is not happening here. Not yet, anyway.

Is the cloud letting consumers down?

The promise of cloud services has, by and large, been fulfilled. From the early days right up to the present, the big issue has been security: is your data safe?

What this question is really asking is whether you can retrieve your data quickly in the event of a technological melt-down. You know the kind of thing: an asteroid hits your business premises, a flood or fire makes your office unusable for weeks or months, or some form of weird glitch or malware makes your data unavailable, and you need to restore a backup to fix it.

All these scenarios are now pretty much covered by the main cloud vendors so, from a business perspective, what’s not to like?

Enter the consumer

Consumers – all of us, in other words – are also users of cloud services. Whether your phone uploads photos to the manufacturer’s cloud service, or you push terabytes of multimedia data up to a big provider’s facility, the cloud is integrated into everything that digital natives do.

The problem here is that, when it comes to cloud services, you get what you pay for. Enterprises will pay what it takes to get the level of service they want, whether it’s virtual machines for development purposes that can be quick and easy to set up and tear down, or business-critical applications that need precise configuration and multiple levels of redundancy.

Consumers on the other hand are generally unable to pay enterprise-level cash but an increasing number have built large multimedia libraries and see the cloud as a great way of backing up their data. Cloud providers have responded to this demand in various ways but the most common is a bait-and-switch offer.

Amazon’s policy changes provide the latest and arguably the most egregious example. In March 2015, it launched, for just £55 a year, an unlimited storage service covering all data – not just photos, as Google and others were already offering. Clearly many people saw this as a massive bargain and, although figures are not publicly available, many took it up.

Amazon dumps the deal

But in May 2017, just over two years later, Amazon announced that the deal was going to be changed, and subscribers would have to pay on a per-TB basis instead. This was after many subscribers – according to user forums – had uploaded dozens of terabytes over a period of months at painfully slow, asymmetrical data rates.

Now they are offered, on a take-it-or-leave-it basis, an expensive cloud service – costing perhaps three or four times more, depending on data volumes – and are left with a whole bunch of data that will be difficult to migrate. On Reddit, many said they had given up on cloud providers and were instead investing in local storage.

This isn’t the first time such a move has been made by a cloud provider: bait the users in, then once they’re committed, switch the deal.

Can you trust the cloud?

While cloud providers are of course perfectly at liberty to change their terms and conditions according to commercial considerations, it’s hard to think of any other consumer service where such a major change to the T&Cs would be implemented, if only for fear of a user backlash – especially by one of the largest global providers.

The message that Amazon’s move transmits is that cloud providers cannot be trusted, and that a deal that looks almost too good to be true will almost certainly turn out to be just so, even when it’s offered by a very large service provider who users might imagine would be more stable and reliable. That the switch comes at a time when storage costs continue to plummet makes it all the more surprising.

In its defence, Amazon said it will honour existing subscriptions until they expire, and only start deleting data 180 days after expiry.

That said, IT companies need to grow up. They’re not startups any more. If they offer a service and users take them up on it in all good faith – as Amazon’s commercial managers might have expected – they should deal with the consequences in a way that doesn’t risk destroying faith and trust in cloud providers.

It’s not just consumers who are affected. It shouldn’t be forgotten that business people are also consumers and the cloud purchasing decisions they make are bound to be influenced to a degree by their personal experiences as well as by business needs, corporate policy and so on.

So from the perspective of many consumers, the answer to the question of whether you can trust the cloud looks pretty equivocal. The data might still be there but you can’t assume the service will continue along the same or similar lines as those you originally signed up to.

Can you trust the cloud? Sometimes.

Innergie mMini DC10 twin-USB charging car adapter


We all travel with at least two gadgets these days – or is it just me? What you too often don’t think about, though, is that each widget adds to the task of battery management. The Innergie 2A adapter’s twin USB charging ports will help.

The company sent me a sample to try and I found the design to be clean and tidy, and it all works as expected. It’s also quite compact, measuring 70mm long from tip to tail, and protruding from the car’s power socket by just 28mm. This means it won’t take up too much precious space, an issue especially if the power socket is mounted in the glovebox.

When activated, the front lights up a pleasing blue, and you can then charge your USB-equipped devices at up to its maximum 2A output. This means that if your device’s battery capacity is 2,000mAh, which is reasonably typical, it will take an hour (in theory) to recharge from empty.
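
If you want to sanity-check that sort of figure for your own devices, the arithmetic is trivial – capacity divided by charging current – as this little Python sketch shows. The 80% efficiency figure is just an illustrative assumption for real-world losses and charge tapering.

```python
def charge_time_hours(capacity_mah: float, current_ma: float, efficiency: float = 1.0) -> float:
    """Idealised charge time in hours: battery capacity divided by charging current."""
    return capacity_mah / (current_ma * efficiency)

# A typical 2,000mAh battery on the adapter's 2A (2,000mA) output: about an hour in theory.
print(charge_time_hours(2_000, 2_000))        # 1.0
# Assume ~80% efficiency for losses and tapering, and it's nearer an hour and a quarter.
print(charge_time_hours(2_000, 2_000, 0.8))   # 1.25
```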

Officially, it costs £19 (probably less on the street), and there’s more about it here.

Technology predictions for 2013

The approaching end of the year marks the season of predictions for and by the technology industry for the next year, or three years, or decade. These are now flowing in nicely, so I thought I’d share some of mine.

Shine to rub off Apple
I don’t believe that the lustre that attaches to everything Apple does will save it from the ability of its competitors to do pretty much everything it does, but without the smugness. Some of this was deserved when it was the only company making smartphones, but this is no longer true. And despite the success of the iPhone 5, I wonder if its incremental approach – a slightly bigger screen and some nice-to-have features – will be enough to satisfy in the medium term. With no dictatorial obsessive at the top of a company organised around, and for, that individual’s modus operandi, can Apple make awesome stuff again, but in a more collective way?

We shall see, but I’m not holding my breath.

Touch screens
Conventional wisdom says that touchscreens only work when they are either horizontal or attached to a handheld device. It must be true: Steve Jobs said so. But have you tried using a touchscreen laptop? Probably not.

One reviewer has, though, and he makes a compelling case for them, suggesting that they don’t lead to gorilla arm, after all. I’m inclined to agree that a touchscreen laptop could become popular, as they share a style of interaction with users’ phones – and they’re just starting to appear. Could Apple’s refusal to make a touchscreen MacBook mean it’s caught wrong-footed on this one?

I predict that touchscreen laptops will become surprisingly popular.

Windows 8
Everyone’s got a bit of a downer on Windows 8. After all, it’s pretty much Windows 7 but with a touchscreen interface slapped on top. Doesn’t that limit its usefulness? And since enterprises are only now starting to upgrade from Windows XP to Windows 7 – and this might be the last refresh cycle that sees end users being issued with company PCs – doesn’t that spell the end for Windows 8?

I predict that it will be more successful than many think: not because it’s especially great – it certainly has flaws, especially when used with a mouse, which means learning how to use the interface all over again.

In large part, this is because the next version of Windows won’t be three or more years away, which has tended to be the release cycle for new versions. Instead, Microsoft is aiming for a series of smaller point releases, much as Apple does, but hopefully without the annoying animal names from which it’s impossible to work out whether you’ve got the latest version.

So Windows Blue – the alleged codename – is the next version: it will take into account lessons from users’ experiences with Windows 8, and acknowledge the growth in touchscreens by including multi-touch. And it will be out in 2013, probably in the third quarter.

Bring your own device
The phenomenon whereby firms no longer provide employees with a computing device but instead allow them to bring their own, provided it fulfils certain security requirements, will blossom.

IT departments hate this bring your own device policy because it’s messy and inconvenient but they have no choice. They had no choice from the moment the CEO walked into the IT department some years ago with his shiny new iPhone – he was the first because he was the only one able to afford one at that point – and commanded them to connect it to the company network. They had to comply and, once that was done, the floodgates opened. The people have spoken.

So if you work for an employer, expect hot-desking and office downsizing to continue as the austerity resulting from the failed economic policies of some politicians continues to be pursued, in the teeth of evidence of its failure.

In the datacentre
Storage vendors will be snapped up by the deep-pocketed big boys – especially Dell and HP – as they seek to compensate for their mediocre financial performance by buying companies producing new technologies, such as solid-state disk caching and tiering.

Datacentres will get bigger as cloud providers amalgamate, and will more or less be forced to consider and adopt software-defined networking (SDN) to manage their increasingly complex systems. SDN promises to do that by virtualising the network, in the same way as the other major datacentre elements – storage and computing – have already been virtualised.

And of course, now that virtualisation is an entirely mainstream technology, we will see even bigger servers hosting more complex and mission-critical applications such as transactional databases, as the overhead imposed by virtualisation shrinks with each new generation of technology. What is likely to lag, however, is the wherewithal to manage those virtualised systems, so expect to see some failures as virtual servers go walkabout.

Security
Despite the efforts of technologists to secure systems – whether for individuals or organisations – security breaches will continue unabated. Convenience trumps security every time, experience teaches us. And this means that people will find increasingly ingenious ways around technology designed to stop them walking around with the company’s customer database on a USB stick in their pocket, or exposing the rest of the world to a nasty piece of malware because they refuse to update their operating system’s defences.

That is, of course, not news at all, sadly.

Happy birthday Simon the smartphone


Today, 23 November 2012, is the 20th anniversary of the launch of the first smartphone. The IBM Simon was a handheld cellular phone and PDA that ended up selling some 50,000 units. This was impressive as, at the time, publicly available cellular networks were a rarity.

In fact, at the London launch of the device, I remember wondering how many people would buy one given the high costs of both a subscription and the phone. In the USA, BellSouth Cellular initially offered the Simon for US$899 with a two-year service contract or US$1099 without a contract.

As well as a touch screen, the widget included an address book, calendar, appointment scheduler, calculator, world time clock, electronic note pad, handwritten annotations, and standard and predictive stylus input screen keyboards.

Measuring 203mm by 63.5mm by 38mm, it had a massive 35mm by 115mm monochrome touch screen and weighed a stonking 510g, but was only on the market for about six months. The UK never saw it commercially available.

So while it never really took off, this was largely down to timing: it was ahead of its time and it was soon overtaken by smaller, less well-featured devices that were more affordable.

But when you contemplate which shiny shiny is your next object of desire, think about the Simon, and remember, Apple didn’t invent the smartphone: IBM did.

AVM Fritz!Box 7390 review

I’ve just acquired a handful of home/small business networking products – the AVM Fritz!Box Fon WLAN 7390 router, AVM Fritz!WLAN Repeater, and AVM Fritz!Powerline 500E Set – and I’d like to share the experience.

AVM Fritz!Box Fon WLAN 7390

Like any broadband router, the 7390 connects your local area network (LAN) to an ADSL-enabled phone line – but there’s far more to it than that. It’s probably the fullest featured product of its kind.

At its heart, the 7390 runs Linux but you never need to know that unless you like tinkering. It provides a huge range of information about your DSL connection, not just speed but signal to noise ratio, error stats, and a graphical representation of the line’s carrier frequency spectrum.

If your line is noisy, you can adjust the sensitivity of the device to accommodate that, trading speed for stability. My phone line was very crackly for a few days, which at first resulted in the router disconnecting and retraining frequently. Using the 7390’s line settings, I was able to achieve a stable, albeit slower, connection until the line cleared.

As well as the more common ADSL/ADSL2+, it will also connect to a VDSL line – useful for small businesses that need high-speed uploads.

Telephony is one of the 7390’s fortes. It includes a DECT base station that allows any GAP-compatible cordless phone to connect to it, and you can also use it with a SIP service to call over the Internet, with full logging and call quality data available. The telephony module includes an answering machine, a phone book, alarms, call blocking and diversion, and a call logging screen.

WiFi support includes all modern 2.4GHz standards, plus the 5GHz 802.11a standard, all with a full set of security controls, as well as the ability to avoid channels being used by nearby WLANs. Its four LAN ports are now Gigabit Ethernet enabled – its predecessor supported only 100Mbps – so the 7390 is now useful as a full participant in a home or small business network. It also includes a USB port into which you can plug storage – a drive full of video and audio files, say – to be shared over the LAN via UPnP, effectively acting as a simple NAS.

Other features include an energy saving mode, a night service, and daily, weekly or monthly email reports.

Any downsides? It’s expensive at around £185 from Amazon, and some users have complained of poor English language support from the German parent company.

There’s a lot more the device can do but in summary, it’s a highly capable router and a whole lot more.

AVM Fritz!WLAN Repeater 300E

A simple-to-use device, this extends the WLAN, connecting either via the WLAN itself or over a wired network connection. Pairing is simple, using push buttons on both the router and repeater, and the link is fully encrypted so, unlike with some products, you don’t have to drop strong encryption to extend the WLAN. A good way to get a wireless connection in the workshop.

AVM Fritz!Powerline 500E Set

Fritz!Powerline adapters are an alternative to running network cables: instead, they use the mains wiring for networking. The main drawback compared to a standard Ethernet connection is speed: the maximum theoretical throughput is 500Mbps, but you’ll see much less than that in practice. A batch file-driven data transfer achieved 171Mbps over the Powerline network, compared with 392Mbps over Gigabit Ethernet.
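
If you’d like to run a similar rough-and-ready test on your own kit, the sketch below shows the idea: time a large file copy across the link and divide the data volume by the elapsed time. The paths are illustrative, and the result includes disk and protocol overhead, so treat it as an effective rate rather than a raw line speed.

```python
import shutil
import time
from pathlib import Path

# Illustrative paths: a large local file copied to a share on the far side of the link under test.
SOURCE = Path("testfile.bin")              # e.g. a few GB of data
DEST = Path("//nas/share/testfile.bin")    # destination reached over Powerline or Ethernet

size_bits = SOURCE.stat().st_size * 8
start = time.monotonic()
shutil.copyfile(SOURCE, DEST)
elapsed = time.monotonic() - start

print(f"Effective throughput: {size_bits / elapsed / 1e6:.0f} Mbps")
```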

The pair of devices in the box each sport an Ethernet port and a security button that enables 128-bit encryption. You must use this, or your data could be visible to everyone else connected to the same circuit – including your neighbours. You also get a pair of Ethernet cables, and the adapters are IEEE P1901 compliant, so they should be compatible with adapters from other vendors.

Summary

Both the Powerline devices and the repeater are useful for extending your network, or you could combine the two for a faster, more robust connection.