14.2.5 Half a century and counting
At the beginning of the 21st century, after more than 40 years of intense effort by some of the most brilliant minds on the planet, researchers seemed further from achieving many of the basic goals of AI than they had in the late 1950s, when the field was just getting started. The year 2001 came and went, and machines with the capabilities of HAL remained firmly in the realm of science fiction. At the turn of the century, computers could be said to “see” only in the most specialized domains, and no one really “talked” to their computers.
With the arrival of the 21st century came the rise of the Internet and increasingly powerful computer systems. By the latter part of the first decade, much of human knowledge was available online – accessible to almost anyone, anywhere, with a smartphone and a broadband Internet connection. The Internet provided AI researchers, for the first time, with vast quantities of digital data, generated by hundreds of millions of people writing in dozens of human languages. The emergence of Big Data and statistical machine learning techniques enabled renewed progress in many areas of AI research, such as speech recognition, question answering systems, machine translation, and computer vision.