Expert Knowledge on Digitalization & Automation of Business Processes
Topic: AI and Machine Learning
We were an exhibitor at this year's Procurement Summit in Hamburg, a symposium for purchasing managers. I took the opportunity to attend some of the master classes myself, with titles such as "Artificial intelligence in procurement reporting," "Artificial intelligence in purchasing: How it works and what you get," and "How to optimize your purchasing processes with artificial intelligence." One thing came out clearly at the event: artificial intelligence has become a hot topic, not least for digital business processes.
But what exactly is this "artificial intelligence," or "AI"? That is precisely what I'd like to look at in today's article. In addition to exploring the basic concept of AI, I'll also address some other terms that have recently entered the arena and are badly in need of explanation: "machine learning," "neural networks," and "deep learning."
The concept of artificial intelligence has been around for decades, at least in theory. Since the emergence of the first computers, scientists have tried to breathe human-like intelligence into them. Up to a few years ago, however, this failed due to the computing capacity required: The earlier generations of computers did not have the necessary performance, so artificial intelligence remained a purely theoretical concept for a time—wishful thinking, so to speak.
Over the last ten to fifteen years, however, computer performance has improved dramatically. This has brought about a comeback of the technological concepts grouped under the collective term AI. Not only has artificial intelligence become technologically feasible, it can now also be applied profitably.
There are various definitions of artificial intelligence, each emphasizing a different aspect. The definitions do overlap, however, in that they all refer to the idea of machines, software, and robots possessing capabilities that were previously reserved for humans. These include skills such as perceiving and generalizing parameters, recognizing meanings and drawing conclusions, learning from experience and the past, and making decisions or predictions based on the information gathered.
Concrete areas of possible application include speech, text, and image recognition. Use cases, too, are manifold, ranging from language assistants like Alexa and Siri to autonomous driving, tumor recognition in medicine, and automatic classification in document management.
The term "artificial intelligence" covers a variety of technological approaches, the primary ones being machine learning, neural networks, and deep learning.
"Machine learning" is a term used to describe algorithms that, over time, increase the reliability of software solutions when making decisions or predicting results. The aspect of autonomy is important here: The software is not provided with a fixed catalog of decision criteria, but derives the criteria itself over time or from historical data. That is why systems using machine learning are also referred to as learning systems.
This process of learning requires a sufficiently large number of data sets. That is why machine learning applications could not deliver really good results until a certain amount of digital data had been amassed. Now, even in contexts of greater complexity, good results can be achieved. The keyword here is "big data."
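To make this a little more tangible, here is a minimal sketch in Python using scikit-learn. The toy data of past purchase orders and the idea of flagging orders for manual review are invented purely for illustration; the point is that the model derives its decision criteria from historical data itself rather than being handed a fixed catalog of rules.

```python
# Illustrative sketch (not from the article): a learning system derives its
# decision criteria from historical data instead of being given fixed rules.
# The data below is invented purely for demonstration.
from sklearn.tree import DecisionTreeClassifier

# Hypothetical past purchase orders: [order value in EUR, delivery time in days]
past_orders = [
    [500, 3], [12000, 21], [800, 5], [15000, 30],
    [300, 2], [9000, 18], [700, 4], [20000, 25],
]
# What happened in the past: 1 = order needed manual review, 0 = processed automatically
needed_review = [0, 1, 0, 1, 0, 1, 0, 1]

# The model infers the decision criteria (e.g. value thresholds) on its own
model = DecisionTreeClassifier(max_depth=2)
model.fit(past_orders, needed_review)

# Prediction for a new, unseen order
print(model.predict([[11000, 20]]))  # e.g. [1] -> flag for manual review
```

The more historical orders such a system has seen, the more reliable its predictions become, which is exactly the data requirement described above.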
Artificial neural networks and the way they learn are modelled on the human brain. The human brain is made up of billions of neurons. When information enters the brain, it is passed from one neuron to the next. Each neuron takes on a small part of the processing. When information enters and is transmitted, connections are created between the individual neurons. Through repetition, they solidify, and patterns emerge. This is the process of learning.
Artificial neural networks are based on this model. Rather than neurons in the brain, it is individual computer nodes that are interconnected. When this network is fed with information, communication paths likewise form and are consolidated between the individual nodes, and this is how learning takes place within an artificial neural network.
In order to simulate this kind of learning artificially, it is necessary to have a large number of computer nodes, combined with the corresponding computing power. This is the reason that neural networks were not really put into practical use until the recent (and dramatic) improvement of performance in computers.
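As a rough illustration of the idea, the following sketch builds a tiny artificial neural network in plain NumPy. The example task (the XOR pattern), the network size, and the learning rate are arbitrary choices made for demonstration; what matters is that the "connections" are numeric weights that are strengthened or weakened with every repetition.

```python
# Minimal sketch (not from the article) of an artificial neural network:
# "neurons" are nodes, the connections between them are numeric weights,
# and learning means repeatedly adjusting those weights.
import numpy as np

rng = np.random.default_rng(0)

# Toy inputs and targets (the XOR pattern), purely for illustration
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)

# Connection weights between the layers, initialized randomly
W1 = rng.normal(size=(2, 4))   # input layer  -> hidden layer
W2 = rng.normal(size=(4, 1))   # hidden layer -> output layer

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(5000):
    # Forward pass: information flows from node to node
    hidden = sigmoid(X @ W1)
    output = sigmoid(hidden @ W2)

    # Backward pass: errors adjust the connections; through repetition,
    # useful connections are strengthened and a pattern emerges
    error_out = (output - y) * output * (1 - output)
    error_hidden = (error_out @ W2.T) * hidden * (1 - hidden)
    W2 -= 0.5 * hidden.T @ error_out
    W1 -= 0.5 * X.T @ error_hidden

print(np.round(output, 2))  # gradually approaches [[0], [1], [1], [0]] as it learns
```

Real networks work on the same principle, just with vastly more nodes and connections, which is why the computing power mentioned above is so decisive.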
"Deep learning" or "deep neural networking" synthesizes the two approaches machine learning and neural networks. The term "deep learning" therefore refers to machine learning based on a neural network. Deep learning increases the intelligence of these systems to an even greater degree, requiring fewer human interventions and corrections and increasing the potential level of complexity in contexts to be captured. On the other hand, both these approaches require a higher number of data sets for learning and an increased amount of computing power. For this reason, deep learning is—at least for the time being—used for fewer applications than are machine learning or artificial neural networks.