The Disappearance Of AI

Kurt Cagle 2019

All things come to an end, especially economic cycles. People who have logged more than a couple of decades in information technology are especially attuned to this, because their jobs and interests tend to be very forward facing - a software developer or information manager who cannot read the future, at least in a general sense, usually won't last long in the field. As the markets work through the gyrations of this past December, with the Dow Jones Industrial Average now down 16% from the year's highs, the belief that the party would never end is giving way to the sense that maybe it's time to grab the car keys and bid the hosts adieu - and those of us in IT are battening down the hatches in a serious way.

Understanding the IT Business Cycle

I started writing these year-end predictions way back in 2003, at a time when "blogging" was still considered a novel thing and Google had just wrested the mantle of king of the search engines away from AltaVista. Fifteen years later, with my then three-year-old baby girl now heading to college and my red hair and beard now gone mostly white, the landscape has changed, most of the big players have changed (who knew Microsoft would eventually end up embracing Linux?), and the buzzwords are now almost a different language. Yet at the same time, the patterns that underlie tech remain very predictable.

Business cycles, most economists have noticed, follow an eight-to-ten-year pattern, usually with a bit of a wobble at the halfway point, and you can make a pretty compelling argument that there's a broader cycle at double that length, between eighteen and twenty years, in which economic crises oscillate between equity crashes (typically accompanied by commercial real estate disintegration) and mortgage (or residential real estate) collapses. This is not to say that a stock market crash won't happen in a residential downturn (cf. 2008), but rather that this time around, real estate values will largely be casualties of the collapse rather than its cause.

IT cycles seem to have about a three-and-a-half-year period - meaning that you can generally get in at least two and perhaps three IT "waves" during a given economic cycle. Let's call these tech epochs. The first tech epoch in this cycle was the rise of mobile from the ashes of the 2008 meltdown. In 2010, if you weren't in mobile phones or mobile apps, you weren't making money. By 2012, IT recruiters were throwing the "I has APP experience" resumes in the trashcan, because, well, no one was making money in the mobile space anymore. About this time, you began hearing about BIG DATA. It was, well, BIGGG! Java developers got a new lease on life as the rise of Hadoop suddenly made Java relevant again. In 2018, the two biggest remaining Hadoop players finally merged, because demand had collapsed and the market could no longer support an alpha-and-beta dominance play. This second epoch also saw the rise of Bitcoin and ICOs, with Bitcoin falling spectacularly from a high of $20,000 per bitcoin to about $3,900 today (amidst the complete collapse of most other ICOs).

This takes us to the most recent epoch: Data Science / AI / Blockchain. I'll make a prediction here - 2019 will be the year you first start seeing stories like "Who Killed the Data Scientist?". Big Data needed Big Data Scientists to analyze all that data, because the average businessperson just couldn't fit it cleanly into a spreadsheet. This led a lot of companies to spend a lot of money enticing data analysts with a smattering of R or Python to come over to the corporate side, where they sat for two years twiddling their thumbs, because there really wasn't much IN the data these companies were throwing at them.

Blockchain is a typical example of an ancillary technology - one that underlies another technology and then gets repurposed. Blockchain was the technology behind Bitcoin, and it was an attempt to address a fundamental problem: how do you assure uniqueness in a world where any data can be replicated? The idea at its core was to federate claims of ownership, so that any time a transaction occurred, an algorithm distributed the claim across multiple copies of the chain in such a way that a record would only be considered legitimate if the participants shared a common provenance chain. This made it possible to create a "verifiable" distributed virtual ledger, though at the potential cost of fairly complex mechanisms for generating completely unique keys.
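To make the provenance idea concrete, here is a minimal sketch - purely illustrative, not Bitcoin's actual protocol, with hypothetical participants - of a hash-chained ledger in which a record only remains legitimate while every block still links back to its predecessor:

```python
import hashlib
import json

def block_hash(payload: dict) -> str:
    """Hash a block's payload (previous hash plus transaction)."""
    return hashlib.sha256(json.dumps(payload, sort_keys=True).encode()).hexdigest()

def append_block(chain: list, transaction: dict) -> None:
    """Append a transaction, chaining it to the hash of the prior block."""
    prev = chain[-1]["hash"] if chain else "0" * 64
    payload = {"prev_hash": prev, "transaction": transaction}
    chain.append({**payload, "hash": block_hash(payload)})

def shares_provenance(chain: list) -> bool:
    """A record is only legitimate if every block links back to its predecessor."""
    for i, block in enumerate(chain):
        expected_prev = chain[i - 1]["hash"] if i else "0" * 64
        payload = {"prev_hash": block["prev_hash"], "transaction": block["transaction"]}
        if block["prev_hash"] != expected_prev or block["hash"] != block_hash(payload):
            return False
    return True

ledger = []
append_block(ledger, {"from": "alice", "to": "bob", "amount": 5})
append_block(ledger, {"from": "bob", "to": "carol", "amount": 2})
print(shares_provenance(ledger))          # True
ledger[0]["transaction"]["amount"] = 500  # tamper with history
print(shares_provenance(ledger))          # False - the provenance chain breaks
```

Tampering with any earlier transaction breaks the chain of hashes, which is what lets participants reject a forged history without trusting any single central record keeper.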

Unfortunately, while such a ledger could be seen as very secure, the cost of computing those keys has proven to be quite onerous - proof-of-work mining, as one example, now consumes more energy than the entire energy budget of several smaller countries. Moreover, the reality is that a simpler solution to this problem is simply to restrict access to the network where such transactions are made, much as the ATM network is (mostly) closed off from the Internet. Electronic ledgers have been around for decades (Microsoft Excel is essentially a glorified electronic ledger), and secure distributed ledgers within closed networks have been around since the 1970s. As a consequence, many of the pilot projects that both banks and startups ran, while not exactly failing, did not show enough benefit to justify the associated energy expenditure compared to other technologies. Put another way, what already existed was good enough for most practical use cases.

The final areas that have seen the most dramatic growth have been machine learning and deep learning, areas that fall into the general domain of artificial intelligence (AI). It should be pointed out that AI as a rubric or category almost inevitably gets trotted out towards the end of an economic cycle - these technologies are more forward-looking, riskier in terms of speculation, and often have a longer time frame before they are likely to bear results. The latest move towards self-driving cars is a good example of this - while a few such cars have actually made it to a very limited market, people within the intelligent car space acknowledge privately that it may be another decade before these efforts really see mass adoption. Similarly, voice recognition AIs - such as Siri or Alexa - are finally making their way into general production now, after having been considered "AI" in the last economic cycle. They are still not pervasive, nor without growing pains, but as a technology, they are considerably more mature than the early prototypes seen ten to twelve years ago.

The stock market in general is not the economy, but it usually correlates with economic activity six to eight months down the road. The tech-heavy Nasdaq in particular has been hit hard of late, likely because investors do not see anything on the horizon that promises to reward new investment, and because many companies in the AI, gaming, personal electronics, and social media spaces no longer look truly groundbreaking.

There is also a changing relationship that people have with both software and hardware shaping these expectations. Software, in general, has shifted from being a profit center to being a commodity, and any given service that can be delivered electronically most likely is being delivered electronically. Where software is profitable, it is increasingly likely to be models, templates, configurations, game extensions or media - books, audiobooks, movies, and so forth.

Hardware is similar. Cell phones - in reality, hand-held computers - continue to grow as the primary computing platform for everything from gaming to shopping to managing day-to-day activities. The expected revolution in wearables never really materialized, save in very specialized applications (Chinese police now use smart glasses with facial recognition to identify potential scofflaws). Robots, predicted at the beginning of the decade to be the next big thing, have indeed appeared, but most exist very much in the background. Indeed, perhaps the one truly novel hardware innovation of the last ten years has been the rise of drones, which can be thought of as cell phones with rotors - and even there, actual adoption has been surprisingly minimal, as people object to the obnoxiousness of noisy, invasive flying robots.

The Maturing of the IT Revolution

These are all signs of a maturing ecosystem. Advances have gone from being qualitative in nature - the introduction of desktop publishing, for instance, had a radical impact upon many industries - to largely quantitative and incremental, with a diminishing return on investment. Changes are becoming far more subtle: word processors that suggest whole phrases and sentences as you type, search that becomes uncannily accurate in helping you find what you're looking for, software for creating realistic worlds, plotting out (and often cowriting) stories, synthesizers for augmenting music creation.

This has interesting implications for the next decade. The Pareto principle states that, in many cases, roughly 80% of the effects come from 20% of the causes. This dictum has been adapted to many different scenarios - 20% of the customers buying 80% of the products being the most common. In the realm of both commercial software and hardware, it can be restated as: 80% of the need for a given set of services or products can be fulfilled by about 20% of the available options. A corollary is that a second tranche of options covers 80% of the remaining 20% of needs - an additional 16% of the total - leaving only about 4% unmet, and so on. Such a power law produces a long tail, but go too far out on that tail and the return no longer justifies the investment. This is roughly the situation we're approaching now.
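The arithmetic behind that corollary is just a geometric decay; here is a back-of-the-envelope sketch, assuming each successive tranche of options covers 80% of whatever need is still unmet:

```python
# Illustrative arithmetic only: assume each successive tranche of options
# covers 80% of whatever need is still unmet, per the figures above.
remaining_need = 1.0
for tier in range(1, 5):
    covered = remaining_need * 0.8
    remaining_need -= covered
    print(f"Tier {tier}: covers another {covered:.1%}, leaving {remaining_need:.1%} unmet")
# Tier 1: covers another 80.0%, leaving 20.0% unmet
# Tier 2: covers another 16.0%, leaving 4.0% unmet
# Tier 3: covers another 3.2%, leaving 0.8% unmet
# Tier 4: covers another 0.6%, leaving 0.2% unmet
```

Each additional tier of products chases a smaller and smaller slice of unmet need, which is exactly why the return on investment collapses out on the tail.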

You'll see this evolution in self-driving vehicles. My expectation is that most people will not be willing to trust a truly autonomous vehicle for at least a generation. Instead, self-driving will arrive in a far more subtle fashion. Your car will ask you what your destination is and will draw out the map. You'll back out of your driveway, though at some point an option to auto-back will appear on the dashboard, and when you press a button on the wheel, the car does it for you. You'll drive for a while through your neighborhood, and eventually an indicator will pop up saying the car can go into cruise mode if you wish; it then assumes the driving once you get onto the highway. You're still at the wheel, still nominally in control, but throughout the process of getting from point A to point B, the car will assist. A good AI will be able to determine when someone wishes to take control; a better AI will be able to evaluate when someone shouldn't be behind the wheel, and will notify the police or simply take over. In most cases, it will not be an issue.

This is the next level of AI: the judgment of an AI will improve, quietly and unobtrusively, both through the accumulation of ever larger amounts of learning material and through periodic upgrades in algorithms and back-end capacity. Such systems will go from acting on simple stimuli to making recommendations: "Have you thought about doing this instead, Kurt?" "I've found a new band I think you might like." "The last time you had the fried onions you had an upset stomach. Are you sure you want the onion rings?" These are not simple questions - they require a history to compare against, an awareness of what's going on (context), and the ability to distinguish between being helpful and being obnoxious. They also require processing speed and bandwidth on a pretty phenomenal level.

This points to the immediate future, a future which has analogs in the 1970s and early 2000s. It can be argued that we are reaching a bandwidth plateau. Put in simple terms, a bandwidth plateau occurs when the infrastructure needed to spur innovation - typically processing power or bandwidth - has not yet caught up with what the next round of innovation requires. Automated vehicles need a lot of processing power, most of it local, to make rapid decisions in a highly complex world. This is not a technological issue (at least not wholly) - it is a resource and infrastructure issue.

In the 1970s, tech went through a period called the AI Winter, in which interest in artificial intelligence dried up almost completely. This came about partly because of the realization that many of the predictions made about the future of AI required a deeper understanding of how intelligence itself actually worked, and partly because the infrastructure to support work at that level simply did not exist. The dot-com bust occurred in part because the bandwidth available was insufficient for what was envisioned at the time, and again the infrastructure was too primitive to accommodate those innovations. Many of the things predicted as being possible in 2000 took fifteen to eighteen years to reach fruition, just as many of the broad claims made about artificial intelligence in the 1970s would not reach fruition for another thirty years.

Each time, as reality caught up with us, the hype that had gone into "AI" turned into disillusionment.

Is this time different? Sort of. Natural language processing has matured to a level where it is largely possible to rapidly determine the intent of a human speaker. Digital signal processing chips are increasingly able to identify many classes of things in an image (and, increasingly, in video), and to use that information to create internal models representing those things. We're beginning to understand why neural networks actually work as well as they do, and that in turn has spurred other areas of investigation as we get into the realm of meta-cognition. These are all necessary precursors for limited general AIs, but they are not, by themselves, sufficient.

From Technology Change to Societal Change

So what to expect in the AI space in 2019? Consolidation, bankruptcies, and artificial intelligence moving off the front pages for a while will likely be part of it. Funding for artificial intelligence won't stop altogether - think AI Late Fall rather than a full-blown AI Winter, where the landscape will be dreary and grey but not completely iced over. This is not actually a bad thing. Such a shakeout will shift the prioritization of AI from fantastic moonshots to what is just sufficient to meet a more tightly constrained use case. It reduces the number of "AI researchers" who have little experience but were drawn in by high salaries, provides a stress test for viable technologies, and can slow the rush to get immature products out the door. A similar "breather" followed the 2000 tech recession, giving mobile technologies a chance to mature and resolve standards contention. By the time the market recovered, uptake on mobile was likely faster than it would have been had the tech "winter" not taken place.

Beyond that, there are several key areas to watch. Semantics and metadata interchange, especially in the arena of data catalogs, looks likely to be one of the next big pushes. This applies a maturing cognitive model to the often complex task of mapping between multiple distinct databases, making it possible to create enterprise-wide unified data spaces and to manage the flow of information as it moves through the data life cycle within organizations. Longer term, this will result in a radical transformation of businesses away from the command-and-control hierarchy that characterized most of the 20th century and towards a more event-driven model, in which changes of data state based upon various inputs can automate much of the day-to-day management of production and distribution.
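As a rough illustration of what such metadata interchange involves - using entirely hypothetical database and field names, not any particular catalog product's API - the core move is mapping source-specific fields onto a shared vocabulary so that records from different systems can be joined in one unified data space:

```python
# A toy illustration (hypothetical databases and fields, not any product's API):
# map source-specific fields onto a shared vocabulary so that records from
# different systems describe the same entity in the same terms.
SHARED_VOCABULARY = {
    "crm_db":   {"cust_id": "customer", "tel": "phone", "addr": "address"},
    "sales_db": {"customer_no": "customer", "phone_number": "phone"},
}

def to_canonical(source: str, record: dict) -> dict:
    """Rewrite a source-specific record into catalog-level, shared terms."""
    mapping = SHARED_VOCABULARY[source]
    return {mapping.get(field, field): value for field, value in record.items()}

crm_row = {"cust_id": "C-17", "tel": "555-0100"}
sales_row = {"customer_no": "C-17", "phone_number": "555-0100"}

# Both rows now use the same vocabulary, which is what lets an
# enterprise-wide "unified data space" join them.
print(to_canonical("crm_db", crm_row))      # {'customer': 'C-17', 'phone': '555-0100'}
print(to_canonical("sales_db", sales_row))  # {'customer': 'C-17', 'phone': '555-0100'}
```

Real data catalogs layer ontologies, provenance, and governance on top of this basic idea, but the mapping step is where the semantic work happens.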

Real-time entity recognition and model synthesis - in which a computer is able to identify things in an environment and model avatars around them - is becoming increasingly possible, building on driving-recognition systems. Smart contracts, which draw on much of the research coming from semantics and blockchain, will begin to replace traditional contracts. Such contracts can automatically track transactions over time, determine compliance with their own terms, and work with automated payment systems and micropayments to better facilitate sensor nets.
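A toy sketch of what such a smart contract might look like in code - illustrative only, with hypothetical terms and no actual blockchain platform behind it - where deliveries from a sensor are logged, micropayments accrue automatically, and compliance is checked against the contracted volume:

```python
# A toy sketch of a smart contract for sensor data (hypothetical terms,
# no actual blockchain platform): deliveries are logged, micropayments
# accrue automatically, and compliance is checked against the contract.
from dataclasses import dataclass

@dataclass
class SensorDataContract:
    price_per_reading: float   # micropayment owed per delivered reading
    min_readings_per_day: int  # contracted delivery volume
    readings_today: int = 0
    balance_owed: float = 0.0

    def record_reading(self) -> None:
        """Log one delivered sensor reading and accrue its micropayment."""
        self.readings_today += 1
        self.balance_owed += self.price_per_reading

    def compliant(self) -> bool:
        """Has the operator met the contracted delivery volume today?"""
        return self.readings_today >= self.min_readings_per_day

contract = SensorDataContract(price_per_reading=0.001, min_readings_per_day=100)
for _ in range(120):
    contract.record_reading()
print(contract.compliant(), round(contract.balance_owed, 3))  # True 0.12
```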

A similar future is in the works for drones, from tiny ones the size of bugs to autonomous trucks, trains, and of course unmanned aerial vehicles (UAVs), as they are known in the industry. Most drones today are semi-autonomous - they can be manually controlled - but by the 2020s most will have, at a minimum, avoidance behavior built in, and in many cases will be able to navigate between two points without human intervention, courtesy of sophisticated AIs. A down period will make it possible for municipalities and regulatory agencies to establish reasonable drone policies at a systematic level, rather than on a case-by-case basis.

IoT has also gone through a boom/bust cycle at this point, including a minimal commercialization phase in which it was discovered that most of the "obvious" markets for IoT had comparatively little traction (do we really need yet another "smart" central thermostat, intelligent toaster, or schedule-aware coffee pot?). The exceptions, such as the plethora of voice-activated AI assistants - Siri, Alexa, Google Assistant, and Cortana - are in many respects a necessary precursor technology, stabilizing the playing field (and the interfaces) for consolidating these devices. One or two of those (my bet's on Alexa at this stage) will likely become a market maker over the next couple of years, to the extent that IoT APIs will likely stabilize around Alexa's core API set. This, in turn, will see a rush towards "Alexa Aware" IoT devices, with the preponderance of them likely being in automobiles rather than the home.

I'm also wondering when the automotive industry will finally standardize on docking units. Switching out an in-car stereo or video player currently requires removing the front facing of the central console. By standardizing on a single swappable container, a person could replace one IoT component with another by entering an unlocking code, pulling out the old unit, pushing in the new one, and entering a new locking code to fix it in place. While this has nothing (much) to do with AI, industry standardization of components is a very large part of IoT adoption.

To wrap it up, AI in 2019 and beyond will begin to disappear from the front pages and instead simply become augmentative infrastructure - making it easier to locate things, easier to control things remotely, easier to analyze or create. The robot revolution is well underway now, but it's hard to see because such robots don't LOOK like robots - they look like smartphone cameras or drones or coffee pots or automobiles. Distributed intelligence is far from obvious - most of it is hidden away in server farms on the other side of a web services connection.

The implications of that are fairly obvious: even 2030 may not, in fact, look all that different from today on the surface. There will be more cars with no one behind the wheel, but not all that many more - though far more people may not actually be driving even while at the wheel. The smartphone will still be the dominant mobile computing device, but will have roughly fifteen times the current computing power. People will still be buying groceries in stores, but the stores will be smaller and emptier, and checkout lanes may have disappeared entirely. Many more people will get their groceries delivered, and the auto-pantry - essentially an external, electronically locked refrigerator - will be commonplace. TV as we know it will be dead, the cable companies ultimately absorbed into internet-only companies, though people will still go to the movies.

Wrap Up

The irony is that the most obvious changes will not be in technology, but in the societal adaptation to that technology. In many respects, we are closer to the end of the latest technological revolution than to the beginning, but the shockwaves from these changes are only now beginning to hit the structural foundations of our society. How do we define a source of authority when the only requirement for becoming one (e.g., publishing a website) is the ability to generate content, no matter how fact-bare or spurious? What do we do when technology reaches a stage where meaningful work - which usually comes down to work that generates enough money to pay the bills - becomes a privilege available only to the most highly trained? How do we educate our children when our current strategy of education - teaching facts in a structured manner - is replaced by a just-in-time process in which a person ramps up on information when they need it? What happens to those who cannot adapt fast enough? How do you create governing coalitions when everyone exists in their own thematic silo, unable to build consensus on even what the real problems are, let alone establish priorities?

None of these are problems that artificial intelligence, by itself, will be able to solve. Artificial intelligence can help us to model scenarios, to explore the potential consequences of taking or not taking certain actions, but the process of making the decisions to take those actions rests firmly within human hands. The next couple of decades will likely be contentious because of that.
