IoT is no fad, stated an engineer from Intel who works on the substrate designs of the future. He didn’t want his name used, which is OK. I can vouch for him.
He emphatically describes the IoT and/or the IIoT as the “ubiquitous interconnectivity of everything.”
Do you remember working on a computer in the 1980s? We had gobs and gobs of power at our disposal, or so we thought. Moore’s Law was alive and well. And we had extended memory, or was that expanded memory, for applications that needed more room to move?
There is no stopping us now, we said. We can do anything with our Intel 486 processors, our 640 KB of base memory and an install of Quarterdeck memory manager. Lest we forget our 200 MB—not a misprint—hard drive. Was it modified frequency modulation (MFM) or run-length limited (RLL) encoding?
Bill Gates’ alleged 1981 prediction of 640 KB of RAM being enough memory for anyone was quickly blown out of the water.
Windows 2.1 ran on Intel’s 386 with a math co-processor. It wasn’t pretty, but it worked. The move to the 486, with its built-in math co-processor, helped a little, but the memory issue was a pain. Enter the Pentium processor. No need for Quarterdeck anymore.
The ’80s produced so many firsts that it was mind-boggling—flash memory, application-specific integrated circuits (ASICs), MATLAB, AutoCAD, oh, and the proverbial leader in word processing, WordPerfect.
Another oh moment—C++ took off, and a small company called Adobe started to own the publishing arena. Now horsepower for compiling and graphics became a need instead of a want.
However, in 1992 the world changed with the introduction of Microsoft Windows 3.1. Dennis Morin, then CEO of Wonderware, the SCADA/HMI company, worked with Microsoft during the Windows 3.1 development, so when it hit the streets, so did Wonderware. Wonderware developed NetDDE for Microsoft, and it would carry on into Windows for Workgroups 3.11, the networking version. Novell NetWare began to fade.
OS/2 was still around and for a time challenged Windows, but users flatly denied IBM its holy grail because OS/2 was too resource-hungry.
But we needed more hardware because software applications were making ever greater demands on it, and the hoops a user had to jump through to get things working well were significant. An ordinary Joe could not set up a network in a doctor’s office; you needed deep talent. "Just because it’s Windows doesn’t mean it’s easy" was the mantra.
So, in my book, we are at a new 1980s-like stage. We have cellular technology, small-form-factor processing, no memory-implementation issues, energy harvesting and the like. We have all the tools and technology we need, don’t we?
What does the technology hold for the future? Some think it will make things easy—well, relatively.
Will we be able to 3D-print a car, a body part, an internal organ? Probably. How about wearable tech that is actually embedded in us and uses our body’s DNA as the network? Yep. MIT is already doing research.
How about a shirt or jacket made of some nanomaterial that changes color and/or pattern based on that "something"? Wonder what Armani would think about that?
Our tech future is very bright. We will need 5G cellular technology for the speeds at which information will travel. Communication carriers will need to provide 10-Gigabit or even 100-Gigabit Ethernet to our homes. Internet bandwidth will need to be limitless.
My Intel guy says everything will be connected. Every store product will have RFID or something similar. Advertising will finance the complete package as Google injects itself into your every move. Holographic real-time ads will pop up in front of you in your wearable tech glasses when you pass by a store that you have shopped at before.
But, just as we did in the 1980s, we will need more speed, Scotty. We have the building blocks, albeit missing some pieces, but the pace at which IoT is percolating reminds me of the 1980s, when a huge innovation happened every six months.
Who will develop an IoT scripting language, since C# doesn’t cut it? It’s already here.
Where will industrial manufacturing be if we 3D-print cars? Will we be in the Jetsons era where a hamburger is printed before your very eyes in seconds?
Exciting, isn’t it? But, just like in the ’80s, we have applications far outstripping the means to implement them.
And yes, it is all a process, one that my Intel buddy says will happen very fast. Stay in school, kids.