Wow, I didn’t even know someone had sat around long enough to name compute models after weather conditions - can you imagine the brainstorming session. This is what resonated most with me:
Communication takes about 5x the power of computation in an embedded microcontroller, so by collecting raw data at the edge of your network and reducing it with filtering, anomaly detection, and pattern recognition, you can send only the essential data up to the gateway, router, or server - which in turn conserves battery power as well as bandwidth.
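A minimal sketch of that edge-filtering idea in Python - the reading format, mean, and threshold are my own illustrative assumptions, not anything from the article:

```python
def filter_anomalies(readings, mean, threshold):
    """Keep only readings that deviate from the expected mean
    by more than `threshold` -- everything else stays on-device."""
    return [r for r in readings if abs(r - mean) > threshold]

# Hypothetical sensor samples: only the outliers get uplinked.
samples = [20.1, 20.3, 19.9, 35.7, 20.2, 4.4]
to_send = filter_anomalies(samples, mean=20.0, threshold=5.0)
# to_send -> [35.7, 4.4], a fraction of the raw stream
```

The point being: the radio only ever sees the two anomalous samples, not all six.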
At my last company, we developed an embedded agent for cellular devices and networks that did precisely this! The objective was, with privacy at the forefront, to detect non-normative device, network, or service operation through local event capture, filtering, and reporting, married with cloud computing for post-mortem root-cause analysis. So for example, if the device drops a call, roll back 5 seconds, collect the signal characteristics leading up to the drop, the towers in proximity, etc., and upload to the cloud the minimum needed for big-data root-cause analysis. The agent was fully Turing-capable, meaning you could vend a set of triggers and filtering instructions to the device to tailor it to the problem (e.g., a new cellular network technology rollout, a new device rollout, etc.). We originally did this out of need because, circa 2005, we only had 1xRTT cellular flip phones. It was a boon with all the major device manufacturers and network operators. Ultimately sold it to AT&T.
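The roll-back capture described above can be sketched as a fixed-size ring buffer that is flushed on a trigger - the class name, the window size, and the RSSI samples here are illustrative assumptions on my part, not the actual agent's design:

```python
from collections import deque

class EventCapture:
    """Keeps a rolling window of recent signal samples; on a trigger
    (e.g. a dropped call), returns just that window for upload."""

    def __init__(self, window=3):
        # Old samples fall off the front automatically once full.
        self.buffer = deque(maxlen=window)

    def record(self, sample):
        self.buffer.append(sample)

    def on_trigger(self):
        # Flush only the samples leading up to the event.
        snapshot = list(self.buffer)
        self.buffer.clear()
        return snapshot

cap = EventCapture(window=3)
for rssi in [-70, -72, -75, -90, -105]:   # signal degrading toward a drop
    cap.record(rssi)
payload = cap.on_trigger()  # only the last 3 samples get uploaded
```

This is the same "send only the minimum needed" principle: the device holds the full stream locally, and the cloud only ever sees the window around the event.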
However, I must be old, because I used to describe it to customers as cloud computing where all the devices comprise the cloud. It makes me chuckle when the things that were done long ago out of ordinary need, circa 2005, are later tagged with a new word to make hyping easier. Almost like Xerox (for copying) and Google (for searching). But I digress.
Good read @melih - thanks for sharing the article!