

5.13.19 – SSI – 


Artificial intelligence, machine learning … they’re not simply interchangeable. Here’s a brief look at key terms and differentiators.

Artificial Intelligence

A broad term that first appeared in published research in 1956. For years, most people’s understanding of artificial intelligence (AI) came from pop culture, in the form of robots. Examples of what many consider to be AI — Deep Blue beating a top chess player, Siri recognizing a song, Amazon suggesting a new book — are really examples of increasingly powerful computers running a series of algorithms, searching through huge databases, or doing a lot of calculations very quickly. Using the terms artificial intelligence or AI can be inaccurate and, more often than not, raises unrealistic expectations.

Machine Learning

An area of AI that uses data to help a computer improve performance without being explicitly programmed. Static programming provides a computer with a set of instructions that do not change over time. Machine learning, conversely, allows programmers to enable a computer to assess and alter its computational processes through training. Working primarily with data in the form of language, text, video, or image, machine learning uses statistical techniques to enable computer systems to solve problems, make decisions and predictions, or improve the efficiency of specific, narrowly defined tasks.
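The contrast between static programming and learning from data can be sketched in a few lines. This is a deliberately simplified illustration, not any product's actual method: instead of hard-coding a decision rule, the program derives one (a threshold) from labeled examples. All numbers and names here are invented for the sketch.

```python
# A minimal sketch of "learning" versus static programming:
# the decision threshold is computed from training data,
# not written into the program by hand.

def train_threshold(samples, labels):
    """Learn a boundary between two classes of 1-D measurements."""
    class_a = [x for x, y in zip(samples, labels) if y == 0]
    class_b = [x for x, y in zip(samples, labels) if y == 1]
    # Place the threshold midway between the two class averages.
    mean_a = sum(class_a) / len(class_a)
    mean_b = sum(class_b) / len(class_b)
    return (mean_a + mean_b) / 2

def predict(threshold, x):
    """Classify a new measurement against the learned threshold."""
    return 1 if x > threshold else 0

# Hypothetical training data: small readings are class 0, large are class 1.
samples = [1.0, 1.2, 0.8, 4.0, 4.2, 3.8]
labels = [0, 0, 0, 1, 1, 1]

t = train_threshold(samples, labels)
print(predict(t, 0.9))  # a small reading -> class 0
print(predict(t, 4.1))  # a large reading -> class 1
```

Feed the same code different training data and the threshold moves accordingly — the program's behavior changes through training rather than through reprogramming, which is the defining trait described above.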

Supervised Machine Learning

Here, computers are “trained” to properly classify inputs. This training occurs by providing the computer with structured datasets — data that has been organized or labeled in a predefined manner — that correlate thousands of possible inputs with corresponding labels that the computer understands. Once the computer has ingested and classified a new input, programmers must evaluate the “truthfulness” or accuracy of the output that the computer generates to help it improve. For example, if you feed the computer millions of images of roses and petunias with their associated labels, through supervised machine learning it will ultimately be able to differentiate between future images of roses and petunias at a tolerable rate.
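The rose/petunia example can be sketched with a toy nearest-centroid classifier. The feature values ("petal width" and "petal length") and labels below are invented for illustration; a real image classifier would work from pixel data, but the supervised pattern is the same: labeled examples in, a predictive model out.

```python
# A toy supervised-learning sketch: learn one centroid (average
# feature vector) per label, then classify new inputs by the
# nearest centroid. All data here is hypothetical.

from collections import defaultdict

def train(examples):
    """Compute a centroid per label from (features, label) pairs."""
    sums = defaultdict(lambda: [0.0, 0.0])
    counts = defaultdict(int)
    for (f0, f1), label in examples:
        sums[label][0] += f0
        sums[label][1] += f1
        counts[label] += 1
    return {lbl: (s[0] / counts[lbl], s[1] / counts[lbl])
            for lbl, s in sums.items()}

def classify(centroids, features):
    """Assign the label whose centroid is closest to the input."""
    def sq_dist(c):
        return (c[0] - features[0]) ** 2 + (c[1] - features[1]) ** 2
    return min(centroids, key=lambda lbl: sq_dist(centroids[lbl]))

# Labeled training set: (petal_width, petal_length) -> flower name.
training = [((0.2, 1.4), "rose"), ((0.3, 1.5), "rose"),
            ((1.1, 0.4), "petunia"), ((1.3, 0.5), "petunia")]

model = train(training)
print(classify(model, (0.25, 1.45)))  # -> rose
print(classify(model, (1.2, 0.45)))  # -> petunia
```

The evaluation step described above corresponds to checking these predictions against known answers and retraining when the error rate is too high.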

Unsupervised Machine Learning

Also called data mining, this tackles very narrow problems by analyzing unstructured data — data that has not been organized or labeled in advance — in order to find patterns. The computer is looking for discernible patterns in the data and searching for an unknown output or “ground truth.” One of the main focuses is anomaly detection; the computer identifies points in a dataset or stream that are outside the normal range without this range being predefined.
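A minimal anomaly-detection sketch shows the key idea: the "normal range" is inferred from the data itself rather than being predefined. This uses a simple standard-deviation rule on invented sensor readings; production systems use far more sophisticated statistics, but the principle is the same.

```python
# A minimal anomaly-detection sketch: flag values that fall
# outside a normal range derived from the data itself
# (here, more than k standard deviations from the mean).

import statistics

def find_anomalies(stream, k=2.0):
    """Return values lying more than k standard deviations from the mean."""
    mean = statistics.mean(stream)
    stdev = statistics.pstdev(stream)
    return [x for x in stream if abs(x - mean) > k * stdev]

# Hypothetical sensor readings with one obvious spike.
readings = [10.1, 9.9, 10.0, 10.2, 9.8, 42.0, 10.1]
print(find_anomalies(readings))  # the 42.0 spike stands out
```

No one told the program that readings near 10 are normal; it derived that from the stream, which is what distinguishes this from a hand-coded threshold check.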


Dr. Sean Lawlor is a data scientist at Genetec.