Today, The Wall Street Journal had an article about Fog Computing. They were lamenting the fact that data on edge devices, most predominantly mobile wireless devices, is capped, doesn't travel very fast, and is typically feeding or being fed by the 'cloud'. I'll also add that the wireless companies are sucking as much money as possible out of the data plans while limiting data ability. Are their networks that constrained? Perhaps we need new technology?
Twitter has references to Fog Computing going back about 7 days from the day of this writing.
So the idea, supposedly originated by Cisco, is to keep the data on the edge as much as possible. A blog article, Cisco IOx in Cisco Live 2014: Showcasing “fog computing” at work, introduces the concept under the umbrella of the 'Internet of Everything'.
As devices get more compact and more powerful, this could conceivably become true. Every device could become a map/reduce client processor. But map/reduce only gives each device slivers of information, without making the 'big picture' available.
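As a purely hypothetical sketch of that idea, here is what a map/reduce split across edge devices might look like: each device maps over only its own local sliver of data, and a coordinator reduces the partial results into a global answer. The device names and data below are invented for illustration.

```python
from functools import reduce

# Hypothetical edge devices, each holding only its own sliver of readings.
device_slivers = {
    "phone-1": [3, 1, 4],
    "phone-2": [1, 5, 9],
    "sensor-7": [2, 6],
}

def device_map(values):
    # Runs on each device: produce a local partial sum and count.
    return (sum(values), len(values))

def combine(a, b):
    # Runs at the coordinator: merge two partial results.
    return (a[0] + b[0], a[1] + b[1])

partials = [device_map(v) for v in device_slivers.values()]
total, count = reduce(combine, partials)

# The global average: a 'big picture' number no single device
# could compute from its sliver alone.
print(total / count)
```

Note that each device only ever sees its own sliver; the 'big picture' (here, the global average) exists only after the reduce step at the coordinator.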
We had mainframes (central), then we had personal computers (distributed), then we had cloud computing (central), now we have mobile devices (distributed), .....
It used to be 'Internet of Things'. At some time, it may be 'Internet of Universal'.
Following on with the subject, the Internet of Things and Fog Computing are referenced in a 'Disruptive Innovation' blog post called From Cloud to ‘Fog’ computing: Cisco looks to accelerate IoT innovation.
On the lighter side of things, there is a tweet by @oisin where we have:
- Fog Computing
- Mild Precipitation Computing
- Sleet Computing
- Thundery Downpour Computing
- Soft Mist Computing