My mind races when I think of all the similarities between the past rise of computing and the current rise of artificial intelligence. It is amazing to watch what seems to be the same story play out all over again with different characters. Everything is obviously not the same, but there are some striking similarities in key tenets. The biggest difference, I think, is that most people are aware it’s happening because they’ve seen how fast technology moved before with computing. At least I hope everyone is aware.
Enter stage left, IBM. Yep. Just like in the good ol’ 1960s. They had a lock on computing through their mainframe market. These giant, room-sized machines produced magical outputs that would one day turn our future into a dystopian sci-fi novel. Right? Well, it was hard to know, because very few people actually saw them, or used them, or understood them. But they sounded very impressive, and of course the computing technology certainly was. It was the mainframe that kicked off our destiny with computers. It was a glimpse into our future relationship with intelligent machines, but it wasn’t the mainframe that changed the world.
IBM has a new mainframe. They call it Watson, and it does AI. Have I ever seen one? Nope. Just on TV when it played Jeopardy. Is it big…probably. Expensive…you bet. Are lots of people allowed to program on it…nope. But wait…in all fairness, they do have the Bluemix application developer capability that exposes Watson skills. Sort of like a modern-day IBM 5150. The 1981 5150 was IBM’s attempt to enter the PC market after Apple had sold 6M Apple II’s since 1977. The 5150 was the best example of IBM “shrinking” their mainframe capabilities and putting them in the hands of real people since a couple of vaporware flops (not FLOPS) with the SCAMP. They sold about 100,000 units. Not bad, but not Apple. One thing it did succeed in doing was getting Microsoft its first big piece of market share. So…are we in 1982 with AI? You have Watson, which is getting beat up like crazy right now for allegedly being all sizzle and no steak. You have a growing number of companies diving into the AI space, similar in volume to how many computer makers were jumping into the PC market in the early eighties. Maybe 1982, but let’s unpack this some more.
Let’s talk about what made the PC so powerful and seminal in computing history. I think it was three things: they became accessible, relatable, and programmable. Wow. I just blacked out for a minute right there. That was genius.
Okay, so the PC became accessible. That means normal humans could get their hands on one. They could put it in their house without having to sell a kidney. Cool. It can be argued that the 1981 Sinclair ZX81 fit that mold. It was priced at $99 and sold 600,000 units. You could also argue, of course, that the 1977 Apple II was accessible. They sold 6M of them at a price of around $2,000. I’ll settle on accessible in the early 1980s.
Now let’s talk about what relatable means. Relatable means you can use the PC for things that you do on a daily basis. Things at work, home, or school. Things like writing papers, building spreadsheets, or playing a sick video game. PCs aren’t relatable when they are only used for arcane tasks. Likewise, AI isn’t relatable when it’s used for arcane tasks.
Finally, they became programmable. Not just customizable. Programmable. You could start to make them better and more powerful by creating tools for them. When humans could start making tools for computers to use (software) at scale, it changed everything. The more people who had access to computers, the more programmers emerged, and the more users there were for the tools those programmers built. It was, and still is, a powerful and virtuous cycle.
Let’s use those three elements to assess where AI is today. Maybe that will help us figure out which year we’re in as it compares to computing history.
We’ll start with accessibility. How many people have access to AI? Well…a whole bunch. I mean, it’s kind of everywhere. But more specifically, I think we can confidently call Siri and Alexa AI, right? I mean, at least they satisfy some Turing qualities, and deep in the bowels of their code are some neat neural nets for learning and some other cool machine intelligence stuff. Amazon has sold about 8M Echos. They’re not the cheapest thing in the world, but they’re not crazy expensive. I have one…but not two. So I’m going to say it’s at 1977 in terms of accessibility. It’s super important to note that besides these voice assistants, AI isn’t that accessible. It has a long way to go. Maybe the iPhone X will have an impact with its on-board GPU.
Next, let’s look at how relatable AI is. Like above, the Alexas and Siris of the world are super relatable. However, most of the AI companies out there are focused on things that aren’t so relatable. AI certainly hasn’t invaded our lives and impacted the routine things we do every day. Given the arcane nature of most AI solutions out there, with the exception of Alexa and Siri, I’m going to say we’re in 1977 with relatability too.
Finally, let’s think about programmability. There are some tools out there. TensorFlow, Tesseract, OpenCV, etc. are pretty available. It’s actually pretty straightforward to build a neural net. But man, those GPUs are freaking expensive. How are we supposed to have a million programmers building unsupervised learning when they can’t access the compute power? That’s a problem. That’s like a 1972 problem. Hurry, NVIDIA. We need GPUs in every computer, stat. I also think there need to be some new IDEs. We are in the early days of AI software engineering from a tools perspective. I think this will happen quickly, but we are still mid-80s from that perspective. Final ruling: Libraries = 1998, Hardware = 1977, IDE = 1983. Average that and we can put programmability in 1986.
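How straightforward is “straightforward”? Here’s a minimal sketch, in pure Python with no GPU or TensorFlow required: a single-neuron perceptron (about the simplest “neural net” there is) learning the OR gate with the classic perceptron update rule. It’s an illustration of the core idea, not a production recipe.

```python
# Minimal single-neuron "network" (a perceptron) trained on the OR gate.
# Pure Python: no GPU, no framework -- just the core learning-rule idea.

def step(x):
    """Step activation: fire (1) if the weighted sum is non-negative."""
    return 1 if x >= 0 else 0

def train_perceptron(samples, epochs=20, lr=0.1):
    """Perceptron learning rule: nudge weights toward each mistake's fix."""
    w0, w1, b = 0.0, 0.0, 0.0
    for _ in range(epochs):
        for (x0, x1), target in samples:
            pred = step(w0 * x0 + w1 * x1 + b)
            err = target - pred          # -1, 0, or +1
            w0 += lr * err * x0
            w1 += lr * err * x1
            b += lr * err
    return w0, w1, b

# OR gate truth table: ((inputs), expected output)
or_gate = [((0, 0), 0), ((0, 1), 1), ((1, 0), 1), ((1, 1), 1)]
w0, w1, b = train_perceptron(or_gate)
for (x0, x1), target in or_gate:
    assert step(w0 * x0 + w1 * x1 + b) == target  # learned the gate
```

Swap in a framework and a GPU and the shape of the loop stays the same; the hardware just lets you scale it up by a few billion parameters.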
Now we average all three categories together and see where we are: Accessible = 1977, Relatable = 1977, Programmable = 1986. Average that out…and we are in 1980.
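For anyone who wants to check my back-of-the-napkin math, the two averages above work out exactly:

```python
# The scoring from the text: average the programmability sub-scores,
# then average the three categories. Both divide evenly.
programmability = (1998 + 1977 + 1983) // 3        # libraries, hardware, IDEs
overall = (1977 + 1977 + programmability) // 3     # accessible, relatable, programmable
print(programmability, overall)  # 1986 1980
```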
Welcome to 1980. The Apple II just rocked the world of personal computing, and Apple is about to hibernate until 1998, when they come back from the dead with the iMac. IBM is about to retaliate with the IBM 5150 to little fanfare. Underdogs Commodore and NEC are about to eat IBM’s lunch for the first half of the decade, selling over 30 million machines. And this year, Tim Berners-Lee is about to build ENQUIRE, his first hypertext system. Get ready.
I wonder what this year in AI will look like, and how many similarities there will be.