The other day a reader wrote in asking if cloud computing could help save his hard drive space, which made me realize that it’s time to talk about exactly what this moronic buzzword really means.
What is Cloud Computing?
According to the National Institute of Standards and Technology, the definition for “Cloud Computing” is this incomprehensible piece of nonsense clearly written to be as confusing as possible:
So what’s a definition for real people?
Cloud Computing = Web Applications
That’s all there is to it. If you’re using a web or internet-based application from a major provider like Google or Microsoft, you’re using cloud computing. Congrats!
Every web application that you’ve ever used, like Gmail, Google Calendar, Hotmail, Salesforce, Dropbox, and Google Docs, is based on “cloud computing”, because when you connect to one of these services, you’re really connecting to a massive pool of servers somewhere out there on the internet. The client doesn’t need to be a web browser, but that’s the direction everything is heading.
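If you’re curious what “a massive pool of servers” looks like from your end, here’s a tiny Python sketch you can run yourself. The hostname is just an example, and the results depend entirely on your network and the provider, but the idea is simple: a single friendly hostname usually maps to several addresses, and each of those is just the front door to a whole farm of machines behind load balancers.

```python
import socket

# Ask DNS where one of these web applications actually lives. A single
# hostname usually maps to several addresses, and each of those fronts a
# much larger pool of servers behind load balancers.
# "www.google.com" is only an example; results vary by network and moment.
hostname = "www.google.com"
name, aliases, addresses = socket.gethostbyname_ex(hostname)

print(f"{hostname} resolves to {len(addresses)} address(es):")
for ip in addresses:
    print("  ", ip)
```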
Think there’s more to it than that? Don’t believe me? Just listen to Larry Ellison, the CEO & co-founder of Oracle, talk about how moronic this term really is:
So Why Cloud Computing?
We’ve already established that it’s a pointless term that simply describes web applications, which have been around for a very long time. But to get businesses to start switching from self-hosted servers to web applications, the marketing types had to invent a new buzzword.
The reason why they used the word “cloud” in the buzzword is simple: in network diagrams, the internet is usually represented with a cloud in the middle of the drawing. Those marketing drones are inventive, aren’t they?
So basically the term itself is just a way for consultants and companies to sell more services in a shiny new package. Here’s a good illustration of how this works:
Comic by Geek and Poke
How Can Cloud Computing Help Me?
Since businesses everywhere are moving their applications to the web and rolling out new and interesting features you can access through your browser, you’ll soon be able to do virtually anything from any browser on any PC, and the line between the desktop and the internet will blur.
Now that Microsoft has finally released the beta of Internet Explorer 9, which supports new web standards like HTML5 and uses hardware acceleration to make the whole experience speedy, every browser will finally be on the same footing. When Microsoft said that IE9 was going to change the web, they weren’t kidding: they were the only ones holding the web back with their anemic IE7 and IE8 browsers, not to mention the ancient IE6. And now the nightmare is finally almost over.
It’ll get even more interesting when Chrome OS is finally released. It’s basically an entire operating system built around a web browser as the primary interface, with all of your applications running as web applications instead of local programs. Hopefully it will integrate web apps into the OS the way IE9’s pinned sites work with the Windows 7 taskbar.
How Is Cloud Computing Different for Businesses?
If you’re in the IT world, you’re probably scratching your head at this point and thinking that I’m oversimplifying the idea behind cloud computing, so let’s explain the real difference from the more technical side of things.
In the past, every company would run all of their applications on their own servers, hosted at their own location or data center. That obviously requires a lot of maintenance and money to keep everything running, upgraded, and secure.
Businesses can now move much of that computing to cloud services, which provide the same applications you would otherwise install on your own servers, except they’re hosted by the provider and accessible over the internet to any of its customers. Have you read about companies switching to Google Docs? That’s a perfect example of a company dropping its own locally hosted servers in favor of cloud computing.
Most of these services operate on a pay-for-resources basis, so your application is only charged for the CPU time and network bandwidth it actually uses. When your application is small and doesn’t have many users, you don’t pay much, but the benefit is that it can scale up to 10,000 users without any trouble (though you’ll be paying a lot more for the added usage).
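If you like seeing numbers, here’s a rough back-of-the-envelope sketch in Python of how pay-for-resources billing works out. The rates, usage figures, and the monthly_bill() helper are all completely made up for illustration; every provider has its own pricing.

```python
# A back-of-the-envelope sketch of pay-for-resources billing. The rates and
# resource numbers below are hypothetical; real providers price differently.

CPU_RATE_PER_HOUR = 0.10      # hypothetical dollars per CPU-hour
BANDWIDTH_RATE_PER_GB = 0.15  # hypothetical dollars per GB transferred

def monthly_bill(cpu_hours, gb_transferred):
    """Return the month's charge for the resources actually used."""
    return cpu_hours * CPU_RATE_PER_HOUR + gb_transferred * BANDWIDTH_RATE_PER_GB

# A tiny app with a handful of users barely costs anything...
print(f"Small app: ${monthly_bill(cpu_hours=20, gb_transferred=5):.2f}")

# ...and when it scales up to thousands of users, you simply pay for the
# extra CPU and bandwidth it now consumes, with no servers to buy.
print(f"Big app:   ${monthly_bill(cpu_hours=4000, gb_transferred=1500):.2f}")
```

The point isn’t the specific numbers; it’s that the bill follows usage, instead of you buying and maintaining enough servers up front to handle your busiest day.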
Still need more? Here’s a video that explains it with… little fluffy clouds.
Web Applications are the future. Cloud Computing is a stupid buzzword. Discuss.