Artificial Intelligence and Machine Learning: Are We There Yet?

Barry Barlow, SVP & CTO, Vencore, Inc

“Let me put it this way, Mr. Amer. The 9000 series is the most reliable computer ever made. No 9000 computer has ever made a mistake or distorted information. We are all, by any practical definition of the words, foolproof and incapable of error.”

HAL, “2001: A Space Odyssey,” 1968.

As we approach the 50th anniversary of “2001: A Space Odyssey,” a movie that arguably foreshadowed concerns many face today amid the increased emphasis on and investment in artificial intelligence (AI), machine learning, drones and autonomous operations (e.g., driverless cars), one has to wonder whether society has made the leap and now agrees with and accepts HAL’s proclamation that computers are “foolproof and incapable of error.” Before you say “of course not,” let me ask if you’ve ever deviated from your typical morning commute because Waze said “we found a better route.” And if your answer to the original question is “certainly,” do you recall the stock market flash crashes caused by high-frequency trading (HFT)? Were you aware that stock exchanges can now execute trades in less than half a millionth of a second, more than a million times faster than the human mind can make a decision? Did you know that over 50 percent of all trading, at least in the US, is performed without human intervention by HFT algorithms? Do you realize that in the time it takes to read this sentence, you could be bankrupt?

In 2012, Knight Capital lost $10M per minute for 45 minutes while a team of trading experts tried to figure out what the HFT software was doing and shut it down. If experts in the business, actively monitoring the results of machine actions, can’t figure out what is going on, what does that say for the rest of us mere mortals? And what does it say, writ large, about all the other areas where AI and machine learning are being deployed?

Value to the enterprise comes when insights are put into action, and that typically requires human judgment

First, lest there be any doubt, I am strongly of the opinion that there are places where AI and machine learning make absolute sense. In a competitive era, AI and machine learning can give a business a powerful edge. At a time when many companies offer similar capabilities and comparable technologies, the correct use of machine learning and the associated predictive or anticipatory analytics can provide a rare point of differentiation. The capacity of an AI model to learn from responses at the individual level is what truly distinguishes it from other approaches, such as crowd-sourcing or population-based approaches. A study published in the Proceedings of the National Academy of Sciences, led by mathematician Jan Lorenz and sociologist Heiko Rauhut of Switzerland’s ETH Zurich, pointed out that, “Although groups are initially ‘wise,’ knowledge about estimates of others narrows the diversity of opinions to such an extent that it undermines collective wisdom. Even mild social influence can undermine the wisdom of crowd effect.” More data is not necessarily better data, and not all data is designed to answer all questions.

Second, having said that, while great progress has been made in this space over the past decade, in many (or most) applications AI is not yet a replacement for human judgment. The insights AI models provide are just that: insights, which have limited value in and of themselves. Value to the enterprise comes when insights are put into action, and that typically requires human judgment. AI analytics are not a replacement for enterprise strategy, corporate campaigns or strategic initiatives. They are intended to frame recommendations, drive decisions and provide an assessment of the success or failure of actions taken. Ultimately, it is humans who decide which questions must be answered, and humans who will look at the recommendations from any machine learning algorithm and make the decision to act or not. At least for now.

So where does that leave us? Are we there yet? Not quite. If we look back at history, from 1968 to 1970, roughly the same time “2001: A Space Odyssey” was released, Terry Winograd was hard at work at MIT on his seminal AI paper, “Procedures as a Representation for Data in a Computer Program for Understanding Natural Language” (MIT AI Technical Report 235, February 1971). In that research effort, Winograd developed an early natural language understanding computer program in which the user carried on a conversation with the computer, moving objects, naming collections and querying the state of a simplified “blocks world,” essentially a virtual box filled with different blocks. Think of it as a Siri-like interface to a virtual world (a toy sketch of the idea appears below), and it was a great example of what AI could become. However, as other researchers began to expand the universe and add in the more realistic examples of ambiguity all too common in the real world, optimism quickly faded. It was replaced by the reality that we live in a complex world where computer programs often fail, some more spectacularly than others, while human judgment and thought processes readily adapt and accommodate. But will computers ever catch up?
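To make the flavor of that blocks-world interaction concrete, here is a minimal, purely illustrative sketch in Python. It is not Winograd’s SHRDLU, which parsed free-form English and reasoned about its answers; the World class, its rigid commands and the block names are all hypothetical, meant only to show how a user-issued command can move blocks, name a collection and query the state of a tiny virtual world.

# Toy "blocks world": named blocks sitting on a table or on each other.
# Commands are deliberately rigid; SHRDLU, by contrast, handled free-form
# English. Every class and command name here is hypothetical.

class World:
    def __init__(self, blocks):
        # Every block starts on the table (nothing resting on it).
        self.on = {b: "table" for b in blocks}
        self.groups = {}  # user-defined named collections

    def _clear(self, block):
        # A block is clear if no other block rests on top of it.
        return all(support != block for support in self.on.values())

    def move(self, block, destination):
        if block not in self.on:
            return f"I don't know a block called {block}."
        if not self._clear(block):
            return f"{block} has something on top of it."
        if destination != "table" and not self._clear(destination):
            return f"{destination} is not clear."
        self.on[block] = destination
        return f"OK, {block} is now on {destination}."

    def name_group(self, name, blocks):
        # Let the user give a name to a collection of blocks.
        self.groups[name] = list(blocks)
        return f"OK, '{name}' means {', '.join(blocks)}."

    def describe(self, thing):
        # Answer a simple query about a block or a named collection.
        if thing in self.groups:
            return f"'{thing}' is the collection {', '.join(self.groups[thing])}."
        if thing in self.on:
            return f"{thing} is on {self.on[thing]}."
        return f"I don't know anything called {thing}."


if __name__ == "__main__":
    world = World(["red", "green", "blue"])
    print(world.move("red", "green"))      # OK, red is now on green.
    print(world.move("blue", "green"))     # green is not clear.
    print(world.name_group("tower", ["green", "red"]))
    print(world.describe("tower"))
    print(world.describe("red"))

Even this toy version hints at the problem Winograd’s successors ran into: every response above is hard-coded against an unambiguous command, whereas real language and real environments rarely cooperate so neatly.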

John von Neumann, the great mathematician and computer scientist, once commented, “You insist that there is something a machine cannot do. If you will tell me precisely what it is that a machine cannot do, then I can always make a machine which will do just that!” I suspect that we’ll continue down that path of teaching machines more and more about the human world, and one day we will arrive at our destination.
