IoT today - going from hype to reality
The Internet of Things has been a buzzword for at least five years now.
This is the second part of our Introduction to IoT for non-techies. In this article, we go through the history of IoT.
My name is Alex, and I work for a venture company builder that operates in the Internet of Things (IoT) sphere. When I started back in February 2016, I had a very vague understanding of what IoT was. I had read some articles about the billions of devices that would apparently be hitting the market in the coming decade, and how the Internet would be everywhere and in every device. I might have even had some experience with a connected environment after having visited my mad scientist dad in Sweden, who puts up new Raspberry Pis around the house for extremely arbitrary reasons almost every weekend. But if you had asked me to define what IoT was, you would have been met with a blank stare.
With this introduction, I want to offer some help to people like me - non-developers who could maybe throw together some HTML at gunpoint but still want to know more about IoT. You might work with it every day. You might have a project coming up for which you need some insight, or you might just have read about it in an article or seen a presentation and decided that you wanted to know more. Whatever the reason, this introduction is for you.
As the Internet of Things comprises a complex, vast, and diverse set of technologies, I won’t be able to cover everything in this introduction. But I will try to provide you with a good starting point from which you can continue to explore the topic in more detail, following up on areas of your personal interest.
The Internet of Things is a combination of various technologies and concepts that have been developed over time. However, instead of boring you with the details of every single technology or concept, I will focus on two specific technologies that perfectly illustrate how we got to where we are today.
SCADA (Supervisory Control and Data Acquisition) systems started to gain popularity in the 1960s. In simple terms, a SCADA system consists of multiple remote terminal units and one main terminal unit. These systems are used mainly in industry and by the military. The remote terminal units work as data collectors or sensors, sending information to the main terminal unit, which collects that information and displays it on an interface.
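To make the architecture concrete, here is a toy sketch of that idea in Python: several remote terminal units report readings to one main terminal unit, which keeps the latest value per unit and displays them. The unit names and sensor values are invented for illustration; real SCADA systems communicate over industrial protocols such as Modbus, not Python objects.

```python
# Toy model of a SCADA setup: remote terminal units (RTUs) collect
# readings; the main terminal unit (MTU) gathers and displays them.

class RemoteTerminalUnit:
    def __init__(self, name, read_sensor):
        self.name = name
        self.read_sensor = read_sensor  # function standing in for a physical sensor

    def report(self):
        return {"unit": self.name, "value": self.read_sensor()}

class MainTerminalUnit:
    def __init__(self):
        self.latest = {}  # most recent reading per remote unit

    def collect(self, rtus):
        for rtu in rtus:
            reading = rtu.report()
            self.latest[reading["unit"]] = reading["value"]

    def display(self):
        # What a supervisor would see on the interface
        return [f"{unit}: {value}" for unit, value in sorted(self.latest.items())]

# Two simulated sensors reporting to one main terminal.
rtus = [
    RemoteTerminalUnit("pump-pressure", lambda: 4.2),
    RemoteTerminalUnit("tank-temperature", lambda: 17.8),
]
mtu = MainTerminalUnit()
mtu.collect(rtus)
print(mtu.display())
```

Note that the decision making still happens at the human looking at the display - exactly the division of labor described above.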
Early systems would seem very crude to us, with supervisors logging hourly readings on paper. However, as SCADA systems became digital, they could scan and monitor data or status, notify users of changes, log data, and present it on digital displays. In the 70s, keyboards replaced the previously analogue controls, and screens that could update every five seconds became the norm. Since then, SCADA systems have continued to develop and are still widely used today.
SCADA systems are interesting in the IoT context as the concept in many ways resembles modern IoT systems: in both contexts, data from the real world is collected and processed. Also, SCADA systems can be intelligent, although rarely to the degree of an advanced IoT solution, as most SCADA systems worked (and still work) as data collectors that leave advanced decision making to humans. For the sake of this article, let’s argue that SCADA and all technologies used in developing SCADA systems can be seen as a precursor to today’s IoT, with the same core functionality and, in some cases, similar applications.
RFID (Radio Frequency Identification) is another technology that is used in many modern IoT solutions and that led to the development of IoT as we know it today. RFID technology, however, is by no means new. During the Second World War, the British army outfitted every British plane with a radio transmitter that would broadcast a signal when it received one from a radar station. Although very low-tech, this can be considered the first use of RFID technology, as it allowed the British to identify ‘something’ - in this case a friendly plane - by using radio technology.
After the war, scientists kept refining this technology. One of the first use cases was anti-theft systems with electronic article surveillance (EAS) tags, which used a 1-bit tag that could be turned on (not paid for) or off (paid for) by the shop clerk. By combining those tags with an RFID reader connected to an alarm, they had created a system that’s still in use today - and that many of us have probably experienced involuntarily when leaving a store with a still-active tag. RFID technology kept developing through the 70s, 80s, and into the 90s, when keycards (with incorporated RFID tags) became commonplace.
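The 1-bit anti-theft logic described above is simple enough to sketch in a few lines. This models only the logic - real EAS tags are deactivated via magnetic or radio-frequency fields, not function calls - and the item names are invented for illustration.

```python
# Sketch of a 1-bit electronic article surveillance (EAS) system:
# each tag stores one bit - active (not paid for) or inactive (paid for) -
# and the reader at the door alarms if any passing tag is still active.

def deactivate(tag):
    tag["active"] = False  # what the clerk does at checkout

def door_reader(tags):
    """Return True (alarm) if any tag passing the reader is still active."""
    return any(tag["active"] for tag in tags)

paid = {"item": "jacket", "active": True}
deactivate(paid)                     # checkout turns the bit off
unpaid = {"item": "scarf", "active": True}

print(door_reader([paid]))           # no alarm
print(door_reader([paid, unpaid]))   # alarm: the scarf's tag is still active
```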
Around this time, at the beginning of the 90s, global manufacturers of everyday products like soap and chocolate bars had a problem. Most companies had adopted barcodes in the preceding decades so that they could monitor, manage, and track their inventory. One key feature of barcodes is that they can give you an overview of product inventory and tell you when a product needs to be restocked. However, it soon became apparent that barcodes had one major problem. Namely, humans. It turns out that humans are prone to errors.
Imagine a cashier, let’s call her Sarah, sitting behind a till in your local supermarket. It’s a hot summer day, there’s no air conditioning, and Sarah hasn’t had a break for more than two hours now, so her concentration is slowly flagging. There’s also a queue of ten people at the till, making her feel stressed. At this very moment, a customer wants to buy three different chocolate bars - a Milky Way, a Bounty, and a Twix - that all cost the same amount, so Sarah decides to just scan the Milky Way three times instead of scanning the three items separately.
Situations like this happen all the time. And although that’s a perfect illustration of human behavior, what it ultimately does is create false data. So, how do we come up with a solution that is less susceptible to human error?
That was the problem a young brand manager by the name of Kevin Ashton, who was working for Procter &amp; Gamble, tried to solve back in 1999. In his job, he had observed that the barcode system didn’t work. He had also previously come across loyalty cards, another piece of RFID technology. So he came up with the idea of using the same RFID chips that were embedded in these loyalty cards to track everything in a store. He would then connect those chips to the internet so that different parts of the supply chain (retail stores and manufacturers) could ‘talk’ to each other. By using RFID, goods would be scanned automatically at different points (leaving the warehouse, entering a retail store, leaving a retail store), taking human error out of the process.
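The checkpoint idea can be sketched as a simple scan log: every time a tagged item passes a reader, the read is recorded automatically - no cashier involved - and anyone in the supply chain can reconstruct where an item has been. The tag ID and checkpoint names below are invented for illustration, and a real deployment would share the log via a networked service rather than a Python list.

```python
# Sketch of automatic RFID checkpoint tracking along a supply chain.

from datetime import datetime, timezone

scans = []  # shared log that warehouse and store systems could both read

def rfid_scan(tag_id, checkpoint):
    """Record that a tag was read at a checkpoint - no human input needed."""
    scans.append({
        "tag": tag_id,
        "checkpoint": checkpoint,
        "time": datetime.now(timezone.utc).isoformat(),
    })

def history(tag_id):
    """Reconstruct where a single item has been, in order."""
    return [s["checkpoint"] for s in scans if s["tag"] == tag_id]

# One chocolate bar's journey through three automatic readers.
rfid_scan("TAG-0001", "warehouse-exit")
rfid_scan("TAG-0001", "store-entry")
rfid_scan("TAG-0001", "store-exit")
print(history("TAG-0001"))
```

Because every record is produced by a reader rather than a tired cashier, the Milky-Way-scanned-three-times problem from the previous section simply cannot occur.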
However, he was not the first to come up with what we would today recognize as an IoT solution. Back in 1982, a group of students at Carnegie Mellon connected a Coke machine to the then fledgling internet so that they could see what drinks it offered and how cold they were. In 1993, a group of scientists in Cambridge got tired of going for a coffee in their university cafe, only to find out that there was none left. Their solution? A camera that would take three pictures per minute of the coffee pot and post these pictures on the internet. Unfortunately, those ideas never took off and never entered the public mainstream.
This changed with the help of Kevin Ashton (among others). He would become one of the early pioneers of IoT. He also contributed something else: the terminology. Before Ashton was able to start working on his idea, he needed buy-in from his managers. So he created an internal presentation but wasn’t happy with the title he had given it: “Smart Packaging.” Since he wanted to connect devices to the internet, and as this was the middle of the dot-com bubble, he felt that the word “internet” really needed to be in there. And what he wanted to connect to the internet were “things.” That’s how he coined “Internet of Things.”
Ashton’s presentation went quite well. Procter &amp; Gamble granted him a budget to start a research project at MIT, called AUTO-ID, together with his cofounders Sanjay Sarma, Sunny Siu, and David Brock. AUTO-ID laid much of the foundation for the standardization of RFID technology and put a lot of effort into reducing both the price and the size of tags. Perhaps even more importantly in this context, they started to spread the idea of IoT by presenting their work and their vision to the world.
In the years after AUTO-ID was created, IoT continued to develop. Without going into too much detail, I’ll directly introduce you to the two main technologies that drove that development. The first one, which we already touched upon, is RFID. In 2014, 4 billion RFID tags were manufactured and sold. In comparison, there were 1 billion tags in existence in 2005. As more businesses understand that their ability to track their products and assets is virtually limitless, that growth will continue.
The second technology that has driven the development of IoT is the smartphone. To some, saying that smartphones are IoT devices might be controversial. I would hazard a guess that most people think of smartphones as handheld computers, not IoT devices. However, every smartphone also incorporates sensors. There’s GPS, which geolocates your phone; an accelerometer, which senses how you hold your phone and flips the screen accordingly; and an ambient light sensor, which adjusts your screen’s brightness automatically depending on the light conditions around you. These are just a few of the sensors you will find in your phone.
With the increased use of smartphones, more and more IoT services for everyday consumers started to emerge. Let’s take Uber as an example. Most people probably wouldn’t consider Uber an IoT company. However, Uber is a great example of how the Internet of Things can change our world. To illustrate this point, let’s compare how Uber works with how I defined IoT in the previous part of this introduction.
Of course, smartphones and RFID tags were not the only technologies laying the groundwork for the development of IoT today. Other notable drivers for IoT include better internet and cellular connections, more efficient batteries, and cloud computing - just to name a few.
All in all, the technological advancements we have experienced since 1999 can best be summed up by the following:
In 2017, anyone with some knowledge of programming, networking protocols, and cloud services can create what Kevin Ashton was dreaming about in 1999 by using a Raspberry Pi, 100 RFID tags, an RFID reader, and a cloud service - and all of this will only cost you about 300 euros. That is an astounding technical development.
To be clear, stating that smartphones are IoT devices and that some popular apps that use sensors are IoT services is controversial. Many wouldn’t agree with me. However, I would argue that if you look at smartphones in this light, you get a much better understanding of how we can connect the digital world with the physical, and of the enormous potential that lies in doing that. If we overlook smartphone applications like Uber, we also disregard many valuable lessons from successful uses of IoT in smartphones that could be applied to a more traditional view of the field.
If you liked this article and want to know more about this topic, keep checking our blog for the next in my series focused on current applications of IoT.