The client-server model evolves, to powerful effect.
Since the dawn of computing, the balance of power has seesawed between the client and the server.
And with each swing of that seesaw, we see again and again that intelligence migrates toward the edge.
In the beginning, mainframes were the size of small buildings and only accessible through timeshared terminals. All the intelligence was in the mainframe, the terminal was only a conduit. Later, the intelligence moved from the mainframe to the personal computer, with the mainframe serving as backup to the client-side device.
“It is actually a nice push-pull of centralized computing, which is actually the model that we have had in computing for a long time,” said Raj Talluri, Qualcomm senior vice president and general manager, in an interview with ARC at Mobile World Congress 2016 in Barcelona. “You know, mainframes and distributed terminals, back to mainframes and the cloud and … it is a natural evolution of how compute works.”
The client-server model hasn’t really changed all that much through the years from a fundamental standpoint, even if the actual technology on each side has. Take smartphones and mobile technology. In the beginning, smartphones were low-powered devices capable of only so much because of limited hardware. The compute power needed to perform advanced application processing was not yet present. So smartphones and tablets performed most of their high-level computations in the cloud, using wireless technologies as a conduit between the two.
In 2011, then-NASA CTO and OpenStack founder Chris Kemp perfectly described to me the relationship between mobile devices and the cloud:
“When you are carrying around a tablet, you are carrying a gateway to the cloud,” Kemp said.
Nine years into the mobile revolution, the intelligence has again moved to the edge. Smartphones today are as powerful as laptops, with upwards of 4 GB of RAM on 64-bit chip architectures. The most robust smartphones even have 128 GB of internal flash memory. My laptop 10 years ago could only hope to be that capable.
The Internet of Things is starting to follow this same pattern.
The Computing Paradigm Moves Down To The Internet Of Things
The cloud continues to grow as the backbone of the Internet. But it is no longer necessary for the heavy lifting of application processing on the smartphone.
In many ways, the smartphone is now becoming the server in the client-server relationship. Look at smartwatches. These small, low-powered devices do not have the computational capability for more than the lightest application processing. So the smartphone becomes the server, performing the computation and pushing it down to the smartwatch through wireless technology.
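To make that role reversal concrete, here is a hypothetical sketch of the division of labor; the function names and payload format are invented for illustration, not drawn from any real smartwatch API:

```python
# Hypothetical sketch of phone-as-server: the watch packages a small
# request, and the paired phone does the arithmetic its chip can't.
def watch_request(points):
    # The watch only assembles the payload; no heavy work happens here.
    return {"op": "route_distance", "points": points}

def phone_compute(request):
    # The phone performs the computation and pushes the result back
    # down to the watch over the wireless link.
    if request["op"] == "route_distance":
        pts = request["points"]
        # Total distance along a 1-D route, summed segment by segment.
        return {"distance": sum(abs(b - a) for a, b in zip(pts, pts[1:]))}
    return {"error": "unknown op"}

result = phone_compute(watch_request([0, 3, 7, 12]))
print(result)  # {'distance': 12}
```

The shape is the same as the old mainframe-terminal split: the low-powered device is a conduit, and the computation lives one hop up the stack.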
Smartwatches with decent application and graphics processors are already being shipped, so the intelligence is again moving down the stack.
And that is just the beginning. The Internet of Things will follow the same model, to varying degrees. The intelligence is moving to the edge devices themselves, rather than the smartphone or the cloud. The things are going to be smart.
“But as soon as you get that you want that same experience on the edge and the latency of going to the cloud and back doesn’t quite work,” said Talluri.
So it is happening in exactly the same way that phones did. Because you start by being just a connected device but soon customers want the connected device to do more and you can’t do more by always going to the cloud and back. Of course, you still want it in the cloud, but there is a natural division of labor, if you will, between the cloud and the device which actually moves as time goes on. The more properties the device gets, the less you need to do on the cloud.
The Mobile Supernova
Mobile is everything.
That was the motto of Mobile World Congress 2016. On one hand, the slogan may have been a way for MWC to justify its continued relevance in an era where smartphone shipments have more or less peaked. On the other hand, most of the various subsectors of the Internet of Things will be based on the same supply chain that has served the smartphone industry.
The way I tend to describe it to people is the example of a supernova. A star is born, lives its life creating massive amounts of energy, then expands to massive proportions and blows up. The material ejected from that supernova is then used to create new stars, new planets and solar systems. Eventually, the cycle repeats. It is the cosmic cycle of life.
Computing goes through similar cycles. The mainframe was a star that exploded, using its technology to build personal computers. Personal computers went supernova and created the Internet. The combination of the Internet and personal computers—a binary star combination—blew up and created smartphones. Smartphones are now the big bang that spreads the material to create the Internet of Things.
Andreessen Horowitz analyst Benedict Evans stated this principle succinctly in a blog post this week wondering what comes after the mobile revolution:
Today, if you are innovating in sensors or cameras or radios or pretty much any other component, you are far more likely to target ARM and mobile. And over time, these economies of scale (amongst other things) mean that mobile will supplant the PC just as the PC supplanted everything before it. But again, the first step is that the new ecosystem gets scale from a new and much larger customer base, and only afterwards can the new ecosystem start supplanting the old one.
Qualcomm’s Talluri confirms the thesis.
“It is very interesting for us because the technologies that are needed are actually very similar technologies as in mobile,” said Talluri. “You want multiple forms of connectivity, you want application processors, you need low power and an ecosystem of people that know how to use that.”
From a computing perspective, we are now in our fourth or fifth supernova cycle, dating back to when Alan Turing was building the first computers to crack enemy code in World War II.
And we are getting ever better at measuring the progress of these computing cycles.
ARM—the chipset and IP architecture firm that designs most processors in smartphones—is the progenitor of the mobile revolution. If you look back 10 to 12 years ago, almost nothing was running on ARM chipset architecture, as x86 chips dominated laptops and PCs. If you were to go on the show floor of the Consumer Electronics Show in Las Vegas in 2005, you’d have seen lots of x86-based computers and only a couple of ARM-based products.
The show floor of CES 2016? An educated guess would be that more than 90% of the products were ARM-based. That includes the drones, cameras, intelligence in automobiles, televisions, streaming boxes, appliances, utilities and other assorted gadgets. The low-power, high-efficiency model of ARM has given birth to an entirely new ecosystem of intelligent devices that are not reliant on a server for computational functionality.
“That can be embedded in your automotive or entertainment systems. They can be controlling the lights in the back of your car,” said James Bruce, lead mobile strategist at ARM, in an interview with ARC at MWC. “But it is also very much now, what you’re seeing is a lot of companies seeing all these great CPUs in smartphones, great GPUs. It has got connectivity around it, the image processing. How can we actually take this into new markets?”
We can quantify just how much the mobile market is expanding into the Internet of Things, using ARM shipments as a proxy. Bruce said that 14.8 billion ARM chips were shipped in 2015, roughly an order of magnitude more than the approximately 1.4 billion smartphones shipped that year.
I think if you actually look, obviously the smartphone has become the face of personal computing in the world. I mean there are three billion smartphones in use today. However, if you actually look at ARM shipments, if you look at our partnerships, there were something like 14.8 billion chips with ARM CPUs in there. Really what you are seeing is this sort of pervasive computing everywhere.
Not every one of those 14.8 billion ARM chips is part of a sophisticated System-on-a-Chip (SoC) like a Qualcomm Snapdragon. Many are simple Cortex-M microcontrollers destined for simpler machines. But even the most mundane of chips gains incredible layers of intelligence as time goes on.
Where Is The Mobile Intelligence Going?
ARM is not in the game of setting markets. ARM creates architecture and IP and provides instruction sets; its partners—chipset manufacturers like Qualcomm, Samsung or Nvidia—then build out the functionality of that architecture.
“I think if you look at ARM, we don’t actually go out and try to define markets,” Bruce said. “What we do is actually define IP that will be suitable for a wide-range of markets.”
What ARM does do is look five years ahead to determine the end points its chips will need to support and make the adjustments and tweaks necessary to support those end points. For instance, the new Cortex-A32 chip ARM announced at Mobile World Congress is aimed directly at the embedded Internet of Things market.
“Embedded, shall we say, is a very loose term, but what it means is using our Cortex-M cores, and that can literally be anything from IoT to good old fashioned industrial equipment,” Bruce said. “That growth is coming as people are taking the power of the ARM ecosystem—great software and great development tools and the range of SoCs that are available—and bringing that intelligence into a wide range of products.”
Qualcomm definitely is in the business of setting markets. Talluri would not break down the demographics or volume of Qualcomm’s customers, but he did note that part of Qualcomm’s job is to work with its partners to create new use cases and markets using mobile technology.
“We like to think of it not so much as [filling in] hollow spots [of IoT deployment] but rather creating new markets,” said Talluri. “Because when we are able to bring this type of technology into these spaces, those things are able to do things that they could not do before.”
The way sales and innovation cycles work is that the Internet of Things technology we see deployed today is actually already a couple of years old. People buy silicon from the likes of Qualcomm and then work to layer connectivity and intelligent software on top of it. Most of what Talluri is working on today will be on the market in two to three years.
So what is coming in two to three years? Talluri says that Qualcomm is seeing an uptick in the type of connectivity that its customers are buying. Instead of just Wi-Fi capabilities, manufacturers are looking at Wi-Fi, Bluetooth, LTE and more together as essential to building the next round of connected gadgets. Once connectivity is taken care of, those same manufacturers are buying application processors to run even the simplest of gadgets.
“Now we are finding that people don’t just want to buy our connectivity, they want to buy our application processors because they want smarts there,” Talluri said.
Talluri gives the example of a home security camera that is smart enough to determine when and what it records, without needing the cloud or a tethered laptop or smartphone. In the past, the camera would just record and archive everything on a local server or in the cloud. Now, the camera can selectively record just the movement of a human being (instead of, say, a dog or raccoon) and analyze and store that video itself. With machine learning and facial recognition, the camera could determine whether the person is a resident of the house or a potential burglar.
All on the device.
“What is happening now is this whole push for the cloud versus the non-cloud. That is another thing that is happening,” Talluri said. “Now you push a lot more intelligence to the edge, to the camera.”
This is what it means to watch the intelligence of the Internet of Things move to the Things themselves. It is part of a familiar cycle that has been 70 years in the making. And the results, mixed with the evolution of the cloud and smartphones, create an incredibly potent blend of innovation that can be applied to almost any industry or human behavior.