By now most people are more than familiar with the concept of Cloud Computing, but what about the newer idea referred to as Fog Computing? Today’s Q&A post takes a look at what Fog Computing is and how it differs from Cloud Computing.

Today’s Question & Answer session comes to us courtesy of SuperUser—a subdivision of Stack Exchange, a community-driven grouping of Q&A web sites.

Image courtesy of The Paper Wall.

The Question

SuperUser reader user1306322 wants to know what fog computing is:

What is “Fog Computing” and how is it different from “Cloud Computing”?

Wikipedia has a few words about “Fog Computing” on its Edge Computing page. I suppose it could mean that processing is distributed unevenly across a set of devices, somehow different from concentrating all processing on a central data server (Cloud Computing) or on end-user devices (Edge Computing), but I am not sure.

So what exactly is “Fog Computing”?

The Answer

SuperUser contributor Dan D. has the first answer for us:

Fog Computing is a paradigm that extends Cloud computing and services to the edge of the network. Similar to Cloud, Fog provides data, compute, storage, and application services to end-users. The distinguishing Fog characteristics are its proximity to end-users, its dense geographical distribution, and its support for mobility. Services are hosted at the network edge or even end devices such as set-top-boxes or access points. By doing so, Fog reduces service latency, and improves QoS, resulting in superior user-experience. Fog Computing supports emerging Internet of Everything (IoE) applications that demand real-time/predictable latency (industrial automation, transportation, networks of sensors and actuators). Thanks to its wide geographical distribution the Fog paradigm is well positioned for real time big data and real time analytics. Fog supports densely distributed data collection points, hence adding a fourth axis to the often mentioned Big Data dimensions (volume, variety, and velocity).

Unlike traditional data centers, Fog devices are geographically distributed over heterogeneous platforms, spanning multiple management domains. Cisco is interested in innovative proposals that facilitate service mobility across platforms, and technologies that preserve end-user and content security and privacy across domains.

Fog provides unique advantages for services across several verticals such as IT, entertainment, advertising, personal computing etc. Cisco is specially interested in proposals that focus on Fog Computing scenarios related to Internet of Everything (IoE), Sensor Networks, Data Analytics and other data intensive services to demonstrate the advantages of such a new paradigm, to evaluate the trade-offs in both experimental and production deployments and to address potential research problems for those deployments.
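
To make the tiering in that description a little more concrete, here is a small, hypothetical sketch of the placement decision it implies; the latency figures and tier names are illustrative assumptions on our part, not numbers from Cisco:

```python
# Illustrative only: place a service at the nearest tier whose typical
# round-trip time fits its latency budget. The RTT values are assumptions.

FOG_RTT_MS = 10      # assumed round trip to a nearby fog node (router/gateway)
CLOUD_RTT_MS = 120   # assumed round trip to a distant cloud data center

def choose_host(latency_budget_ms: float) -> str:
    """Return the nearest tier that can still meet the latency budget."""
    if latency_budget_ms >= CLOUD_RTT_MS:
        return "central cloud"                      # delay-tolerant analytics
    if latency_budget_ms >= FOG_RTT_MS:
        return "fog node (router, gateway, set-top box)"
    return "end device"                             # hard real-time control

print(choose_host(500))   # central cloud
print(choose_host(50))    # fog node (router, gateway, set-top box)
print(choose_host(2))     # end device
```

The point the quote makes is exactly that middle branch: work that cannot tolerate a trip to a distant data center, but does not have to run on the end device itself, lands on the fog tier.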

Note: You can read the full articles/posts via the links we have included below for each section.

To go with what Dan D. has shared/quoted from Cisco, we have a bit more to add from a quick bit of research that we did:

The so-called IoT (Internet of Things) encompasses a range of Internet-capable devices that could be almost limitless: Thermometers, electric meters, brake assemblies, blood pressure gauges and almost anything else that can be monitored or measured. The one thing they have in common is that they’re spread out around the world.

There can be huge amounts of data coming out of these devices. For example, a jet engine may produce 10TB of data about its performance and condition in just 30 minutes, according to Cisco. It’s often a waste of time and bandwidth to ship all the data from IoT devices into a cloud and then transmit the cloud’s responses back out to the edge, said Guido Jouret, vice president and general manager of Cisco’s Internet of Things Business Unit. Instead, some of the cloud’s work should take place in the routers themselves, specifically industrial-strength Cisco routers built to work in the field, he said.

“This is all about location,” Jouret said. Using local instead of cloud computing has implications for performance, security and new ways of taking advantage of IoT, he said.
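
Cisco’s jet-engine figure is easy to sanity-check. A quick back-of-the-envelope calculation (assuming decimal terabytes; this is our arithmetic, not Cisco’s) shows why shipping the raw stream to a distant cloud is such a strain on bandwidth:

```python
# 10 TB of telemetry in 30 minutes, per the figure quoted above.
data_bytes = 10 * 10**12        # 10 terabytes (decimal)
window_seconds = 30 * 60        # 30 minutes

bytes_per_second = data_bytes / window_seconds
gigabits_per_second = bytes_per_second * 8 / 10**9

print(f"{bytes_per_second / 10**9:.1f} GB/s, about {gigabits_per_second:.0f} Gbit/s sustained")
# 5.6 GB/s, about 44 Gbit/s sustained
```

That is roughly 44 Gbit/s of sustained upstream traffic for a single engine, which is exactly the kind of load Jouret suggests handling in routers close to the source instead.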

Quoted from the definition/explanation at WhatIs.com:

Fog computing, also known as fogging, is a model in which data, processing and applications are concentrated in devices at the network edge rather than existing almost entirely in the cloud.

That concentration means that data can be processed locally in smart devices rather than being sent to the cloud for processing. Fog computing is one approach to dealing with the demands of the ever-increasing number of Internet-connected devices sometimes referred to as the Internet of Things (IoT).

In the IoT scenario, a thing is any natural or man-made object that can be assigned an IP address and provided with the ability to transfer data over a network. Some such things can create a lot of data. Cisco provides the example of a jet engine, which they say can create 10 terabytes (TB) of data about its performance and condition in a half-hour. Transmitting all that data to the cloud and transmitting response data back puts a great deal of demand on bandwidth, requires a considerable amount of time and can suffer from latency. In a fog computing environment, much of the processing would take place in a router, rather than having to be transmitted.
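
That last point boils down to a simple data-flow pattern: keep the raw readings on a device near their source and push only compact summaries or urgent alerts upstream. Here is a minimal, hypothetical Python sketch of that pattern; the sensor source, the upstream sender, and the alert threshold are placeholders of ours, not anything from the quoted sources:

```python
import statistics
import time

ALERT_THRESHOLD = 150.0   # hypothetical limit for the monitored value

def summarize(window):
    """Reduce a window of raw readings to a small summary record."""
    return {
        "timestamp": time.time(),
        "count": len(window),
        "mean": statistics.fmean(window),
        "max": max(window),
    }

def fog_node_loop(read_sensor, send_upstream, window_size=1000):
    """Keep raw data at the edge; only summaries and alerts leave the node."""
    window = []
    while True:
        reading = read_sensor()                 # raw reading stays local
        window.append(reading)

        if reading > ALERT_THRESHOLD:           # latency-sensitive path
            send_upstream({"alert": reading, "timestamp": time.time()})

        if len(window) >= window_size:          # bandwidth-friendly path
            send_upstream(summarize(window))
            window.clear()
```

Instead of every reading crossing the Internet, the cloud only ever sees one summary per thousand readings plus the occasional alert, which is the bandwidth and latency win both quotes are describing.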

As you can see, “Fog Computing” focuses on offloading part of the workload from regular cloud services by using localized resources in order to provide a quicker, smoother, and more streamlined experience for users. What are your thoughts on “Fog Computing”? Do you think it will become as popular and useful as Cloud Computing, or would you classify it as a “marketing fad” with no future?

Have something to add to the explanation? Sound off in the comments. Want to read more answers from other tech-savvy Stack Exchange users? Check out the full discussion thread here.