
Harnessing Artificial Intelligence in Modern Supply Chains: A Comprehensive Analysis

By Sophie Xu (Research Lead), Rijul Mahajan, and Tinashe Chiduma

Since the 1800s, manufacturing has undergone three industrial revolutions, movements that earned the moniker not only by improving productivity and efficiency but by completely transforming how goods were produced and how work was done. Today, the industry is in the midst of its fourth, colloquially dubbed Industry 4.0. Over the past decade, many companies and industries have been hyper-focused on discussing, exploring, and testing the solutions the movement promises to businesses, a movement that can broadly be defined as the integration of intelligent digital technologies into manufacturing and industrial processes. These technologies center on interconnectivity, automation, real-time data, and other advanced capabilities that drive throughput, optimize processes, and streamline efficiencies while, theoretically, enabling smart manufacturing and the creation of intelligent factories through nine technological pillars: cloud computing, augmented reality (AR), the Industrial Internet of Things (IIoT), additive manufacturing/3D printing, autonomous robots, simulation/digital twins, horizontal and vertical system integration, cybersecurity, and big data and AI analytics.


Figure 1: A summary of each industrial revolution’s developments


Some of these pillars have seen success befitting the optimism around the movement. Autonomous robots, for instance, have been at the helm of warehouse optimization over the past decade, helping key industry players such as Amazon decrease long-term costs; provide labor and utilization stability; increase worker productivity; reduce error rates; reduce the frequency of inventory checks; optimize picking, sorting, and storing times; and increase access to difficult or dangerous locations in warehouses and distribution centers. Other pillars, however, have struggled to produce similar results. Additive manufacturing, for example, was heralded around 2015 as an innovation that would revolutionize manufacturing, with some articles even exploring the idea of the technology replacing large segments of traditional production lines entirely. That extreme optimism led to unrealistic expectations: while the technology has indeed been useful and can help businesses expedite the prototyping process, its promise to assist significantly in custom product production or 3D-printed parts has yet to prove itself, at least not in high-volume or complex production environments.


More recently, AI has seen a similar rise in popularity, and today it is nearly impossible to consume any form of media without hearing about the latest AI invention, technology, or discovery, or how AI is going to replace jobs throughout various industries. The buzz is backed by the numbers: according to Stanford University’s 2023 Artificial Intelligence Index Report, total corporate investment in AI, spanning mergers and acquisitions, minority-stake private investments, and public offerings, amounted to a whopping $934.2 billion from 2013 to 2022, growing from an estimated $14.6 billion in 2013 to a peak of $276.1 billion in 2021, a roughly 1,791% increase. And while a dip was observed in 2022, the release of OpenAI's generative AI tool ChatGPT in November of that year ensured the view of AI as the next big thing lost no traction at all, with many major players placing a heavy focus on the area and some estimating its market size will reach a staggering $407 billion by 2027, a far cry from the $86.9 billion revenue estimate for 2022. All of this raises the question of whether this ‘AI hype' and the sensational promises made in many articles will match or outpace what we will eventually see in reality. Thus, through this paper, we attempt to separate fiction from reality, giving an overview of the field of AI, its history in supply chain management, and what we realistically believe it can do going forward.


Figure 2: Global investment activity in AI, 2013–2022


History of AI

To tackle such a question, we must first ask how we got to this point, and where the roots of AI’s usage in supply chain, and in general practice as a whole, lie. While the idea of self-sufficient technology may seem like a relatively novel concept at first glance, the ideas of automata and of intelligence existing beyond the confines of the human form have captured the human imagination since Ancient Greece, through the tales of the bronze giant Talos, the artificial woman Pandora, and their creator god, Hephaestus. It would not be until several centuries later, however, that the notion of artificial intelligence escaped the realms of make-believe and science fiction and manifested itself in reality, during the 1950s. This coincided with the start of a period now referred to as the Pioneering Summer of AI.


Figure 3: One version of an AI historical timeline


Since then, the field has exhibited cyclical periods of intense growth, progress, and public interest (now termed AI Summers), followed by periods of decline and stagnation (known as AI Winters). A similar pattern emerges when looking at key landmarks in the field’s application to the supply chain industry. And while there is some consensus on these timelines, variation still exists between sources. Through an examination of multiple sources, we were able to construct a timeline of our own, highlighting a few periods that shaped much of the AI landscape’s development in the supply chain sphere today:


  1. The Pioneering Summer and Early AI Applications (1950s-1970s):  It was during this period that the term “Artificial Intelligence,” as we know it today, was officially coined by John McCarthy while hosting the famous Dartmouth Conference in 1956, marking the formal birth of AI as a field of study. McCarthy emphasized that while AI shares a kinship with the quest to harness computers to understand human intelligence, it is not necessarily tethered to methods that mimic biological intelligence; he proposed that mathematical functions could be used to replicate the notion of human intelligence within a computer. McCarthy also presented the ideas of “timesharing” and distributed computing, which would later play a key role in the early growth of the Internet and provide foundations for the concept of “cloud computing.” Despite the massive leaps of this period, the field was still in its infancy, and the term “supply chain management” would not be introduced to the public until 1982, by British logistician Keith Oliver. AI’s practical impact on supply chains during this period was therefore limited, and most of the measurable transformation it would bring came in subsequent decades as AI matured and found broader applications in the industry. That said, given developments at the time, it can be inferred that optimization techniques such as deterministic linear programming and multi-stage stochastic linear programming played an increasingly crucial role in inventory management, laying the groundwork for further AI implementation in the coming decades (see the first sketch after this list for a minimal example).

  2. Expert Systems (1980s-1990s):  The next major landmark came in the 1980s, spearheaded by the rise of expert systems, a branch of AI in which programs specialize in a single task, much like a human expert. These systems are designed to solve intricate problems with human-like decision-making capabilities in specified domains and under predefined rules (see the rule-based sketch after this list). One of the first notable applications of the technology was the Inventory Management Assistant (IMA), designed in 1986 to improve the replenishment of spare parts and reduce safety stock for the US Air Force; it reportedly improved the effectiveness of inventory management by 8–18% by reducing inventory errors. Going into the early 1990s, AI saw a broad resurgence of interest and became commercially available in many other supply chain management applications. Because these systems’ decision-making still paled in comparison to their human counterparts, however, their implementation remained relatively limited. That was quickly changing, as seen in other realms: IBM’s Deep Blue famously defeated Garry Kasparov, the then-reigning world chess champion, in 1997, showing that while the technology could not yet replicate human efficacy in several domains, it was rapidly evolving and catching up.

  3. Data-Driven AI and Machine Learning (2000s): Before the emergence of big data, AI was largely limited by the amount and quality of data available for training and testing machine learning models. The rise of big data during the 2000s changed this by providing access to massive amounts of data from a wide variety of sources, including social media, sensors, and other connected devices. Machine learning algorithms could now be trained on much larger datasets, enabling them to learn more complex patterns and make more accurate predictions. Coupled with increasing computing power and falling hardware costs by way of Moore’s Law, these developments drove widespread adoption of data-driven AI and enabled a range of new tools and techniques for diagnosing data, such as predictive analytics and data mining. These applications not only revolutionized many functions across the supply chain, including demand forecasting, inventory optimization, and risk management, but also gave rise to companies providing SaaS platforms that help businesses make data-driven decisions using AI and digital tools; notable examples include Coupa and Epicor. At the same time, advances in data storage and processing technologies made it possible to process and analyze large datasets quickly and efficiently, leading to new machine learning approaches such as deep learning (a branch of AI in which algorithms simulate neural networks inspired by the human brain and copy how biological neurons signal to each other). This gave algorithms the ability to learn from massive amounts of data and make more accurate predictions, a development that would set the foundation for future applications of the technology.

  4. Internet of Things (IoT) and Robotics and Automation (2010s):  This period brought a new era in supply chain management with the proliferation of the Internet of Things (a network of interconnected physical devices that collect data and exchange it with each other, or send it for storage and analysis via the internet) alongside the integration of AI-powered robotics and automation in warehouses and distribution centers. IoT enabled increased transparency and connectivity across the entire supply chain through real-time asset tracking and data sharing throughout the distribution chain, alongside a host of other benefits and potential use cases (a minimal telemetry sketch follows this list). This can be seen in real-world examples such as Maersk’s smart containers, Walmart’s RFID-enabled inventory tracking, and Coca-Cola’s connected vending machines. AI-powered robotics, meanwhile, allowed companies to leverage automation to reduce direct and indirect operating costs while increasing revenue potential: increasing efficiency and productivity; reducing error, re-work, and risk rates; improving safety for employees in high-risk work environments; performing mundane tasks so humans can focus collaboratively on more strategic efforts; and enhancing revenue by improving perfect-order fulfillment rates, delivery speed, and customer satisfaction.
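
To make the optimization lineage of period 1 concrete, here is a minimal sketch of a deterministic linear program for a toy two-period inventory problem, solved with SciPy. All demand figures, costs, and variable names are illustrative assumptions, not drawn from any historical system.

```python
# Minimal deterministic LP: choose order quantities q1, q2 for two periods
# to minimize ordering + holding cost while meeting known demand.
# All numbers are illustrative assumptions.
from scipy.optimize import linprog

order_cost = 4.0    # cost per unit ordered
hold_cost = 1.0     # cost per unit held in stock at the end of a period
demand = [80, 120]  # known (deterministic) demand per period

# Decision variables: x = [q1, q2, s1, s2] (orders and end-of-period stock)
c = [order_cost, order_cost, hold_cost, hold_cost]

# Inventory balance as equality constraints (starting from zero stock):
#   q1      - s1      = demand[0]
#   s1 + q2      - s2 = demand[1]
A_eq = [[1, 0, -1, 0],
        [0, 1, 1, -1]]
b_eq = demand

res = linprog(c, A_eq=A_eq, b_eq=b_eq, bounds=[(0, None)] * 4)
print(res.x)  # optimal order quantities and stock levels
```

Stochastic variants replace the fixed demand vector with scenario-weighted constraints, which is where the multi-stage formulations mentioned above come in.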
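
In the same spirit, here is the rule-based sketch referenced in period 2: a toy expert system echoing inventory assistants such as IMA. The rules and thresholds are hypothetical, meant only to show the if-then structure such systems rely on, not IMA's actual rule base.

```python
# Toy rule-based "expert system" for spare-part replenishment.
# Rules and thresholds are hypothetical illustrations of the if-then
# structure expert systems use.
def replenishment_advice(on_hand, safety_stock, lead_time_days, is_critical):
    if on_hand < safety_stock and is_critical:
        return "Expedite order immediately"
    if on_hand < safety_stock:
        return "Place standard replenishment order"
    if on_hand < 2 * safety_stock and lead_time_days > 30:
        return "Order early: long lead time"
    return "No action needed"

print(replenishment_advice(on_hand=12, safety_stock=20,
                           lead_time_days=45, is_critical=False))
```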
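
Finally, for period 4, a minimal example of the kind of telemetry record a connected asset, such as a smart container, might emit for real-time tracking. The field names and values are assumptions, and only Python's standard library is used.

```python
# Hypothetical telemetry payload a connected asset might publish for
# real-time tracking; field names and values are illustrative assumptions.
import json
from datetime import datetime, timezone

reading = {
    "asset_id": "CONTAINER-0042",
    "timestamp": datetime.now(timezone.utc).isoformat(),
    "location": {"lat": 51.9496, "lon": 4.1453},  # e.g., a port location
    "temperature_c": 4.2,
    "door_open": False,
}

payload = json.dumps(reading)
print(payload)  # in practice, published to an IoT broker or gateway
```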


AI in Supply Chain

This brings us to where we are today and the state of AI in the modern supply chain. Modern artificial intelligence is all about data, algorithms, and mathematical and statistical models, and systems are commonly classified by capability, functionality, and technology. By capability, AI ranges from narrow systems automating a single familiar task, such as facial recognition, to systems intended to handle more unfamiliar tasks. By functionality, systems may be purely reactive, storing no information, or may retain previous data to inform their responses. These tools fall under branches including machine learning, neural networks and deep learning, robotics, computer vision, fuzzy logic, expert systems, and natural language processing.


To elaborate: machine learning takes vast amounts of data to independently learn and improve through algorithms spanning supervised, unsupervised, and reinforcement learning, used for predictions and optimization. Deep learning is a subset of machine learning that uses neural networks mimicking the human brain, built from artificial neurons called nodes or perceptrons. Robotics covers machines specialized in automation, already integrated into manufacturing. Computer vision breaks visual data down to pixels to identify and track objects. Fuzzy logic reasons with degrees of truth rather than a strict true/false, and is commonly used to set limits in risky situations. Expert systems are specialized and use predefined rules to manage information and solve problems. Finally, natural language processing works with human language for translation, speech recognition, and text analysis; it is the branch ChatGPT falls under and the most visible branch in recent years.
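
As a minimal illustration of the "nodes or perceptrons" just mentioned, the sketch below implements a single artificial neuron with NumPy; the inputs, weights, and bias are arbitrary values chosen for demonstration.

```python
# A single artificial neuron (perceptron-style node): a weighted sum of
# inputs plus a bias, passed through an activation function.
import numpy as np

def neuron(x, w, b):
    z = np.dot(w, x) + b          # weighted sum of inputs
    return 1 / (1 + np.exp(-z))   # sigmoid activation squashes z into (0, 1)

x = np.array([0.5, 0.8, 0.2])     # example inputs (arbitrary)
w = np.array([0.4, -0.6, 0.9])    # weights, normally learned from data
b = 0.1                           # bias term

print(neuron(x, w, b))  # one node's output; deep nets stack many such layers
```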


Generative AI, popularized by ChatGPT, uses data from existing images, text, audio, and video to train on and create new content. IBM observed that professionals recognize the value of generative AI up to 70% faster than that of earlier traditional AI, especially its competitive advantages for supply chains. More importantly, generative AI can provide quick responses from large volumes of real-time data and what-if scenarios, allowing professionals to brainstorm and make decisions that improve risk management. According to EY, the tool has been used with clients to advise on and strengthen supply chains through projects such as information classification and categorization and strategy modification. EY cites a biotech company that uses what-if scenarios to improve demand forecasting for production planning, inventory optimization, distribution, and risk management. In another example, a large United States retailer used generative AI as a chatbot to assist in procurement negotiations; over 65% of suppliers preferred negotiating with the chatbot over a human, which can further build supplier relationships and assist in sourcing and contracting.
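
To give a sense of what a what-if workflow like those above could look like in practice, here is a minimal sketch using OpenAI's Python client. The model name, system prompt, and scenario are illustrative assumptions, not the actual setup used by EY or the retailer.

```python
# Minimal what-if scenario prompt against an LLM, using OpenAI's Python
# client. Model name and scenario text are illustrative assumptions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

scenario = (
    "Our sole packaging supplier in Region A faces a two-week port strike. "
    "Current inventory covers 10 days of demand. List three mitigation "
    "options with rough cost/risk trade-offs."
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # assumed model; substitute whatever is available
    messages=[
        {"role": "system", "content": "You are a supply chain risk analyst."},
        {"role": "user", "content": scenario},
    ],
)
print(response.choices[0].message.content)
```

In a real deployment, the scenario text would be assembled from live inventory and logistics data rather than written by hand, which is what makes the "large volumes of real-time data" point above matter.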


From a manufacturing standpoint, generative AI can assist in warehouse automation. According to DHL, almost 80% of warehouses are still operated manually, 15% use conveyors or pick-and-place solutions, and only 5% are fully automated. This is often a result of the high costs of implementing automation and changing infrastructure and machines, which is where generative AI can help centralize operations. The many theoretical examples are likely to be realized first through startups and their partnerships with large companies that have the capital, data, internal expertise, and other resources to run powerful computing systems and maintain cloud databases and servers, producing case studies to learn from and analyze.

Figure 5: Step-by-Step Manual Operations to Full Automation Process for Warehouses (source: Oracle NetSuite)


Some supply chain professionals have built their own tools on top of existing generative AI such as ChatGPT to address specific supply chain decisions. RiskGPT, created by Overhaul, takes historical theft data and patterns plus real-time locations and factors to recommend solutions such as rerouting and security involvement. Scoutbee, an AI application used by Unilever and Siemens, scrapes the Internet to find potential new or alternative suppliers, providing information on finances and scorecards. Koch Industries used an AI tool to look internally, analyzing stock-keeping units and existing supplier data to assess supplier network performance and automatically generate requests for quotes. Maersk and Walmart both use Pactum AI to automate negotiations, with Maersk deploying the tool in North America and 66 other countries.


Figure 6: Global heat map of AI startups impacting supply chain and logistics (source: StartUs Insights)


Like most technological applications, a few success stories can overshadow the errors of other tools. Tyson Foods, for example, tested an AI application that labeled a reliable supplier as high risk. Gartner, which surveyed 818 supply chain professionals from August to October 2023, predicts that 50% of supply chain organizations will invest in AI applications over the next few years, focusing more on productivity than on efficiency or cost savings.


Future of AI 

The Biden administration's investment in AI underscores a governmental acknowledgment of its strategic importance, signaling a robust support framework for its development and application across sectors, including supply chains. This investment is pivotal in realizing AI's capabilities in real-time analytics, demand forecasting, and risk mitigation—a triad essential for building resilient supply chain systems that can adapt to dynamic market conditions and unforeseen disruptions.


The government’s investment in AI also highlights its potential to drive revenue growth and profitability. The evolution of AI presents a transformative potential across various sectors, with one of the most significant impacts seen within the supply chain management sphere. Sensors, RFID tags, and IoT devices will usher in an era where inventory counts itself, and pallets autonomously report discrepancies, fundamentally changing the dynamics of supply chain management. Moreover, AI's role in demand forecasting has evolved from a theoretical advantage to a practical tool that companies are increasingly relying on. By leveraging vast datasets and sophisticated algorithms, AI can predict market demand with greater accuracy, thereby reducing waste and optimizing inventory levels. This capability not only improves operational efficiency but also contributes to more sustainable business practices by minimizing excess production.
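
As a heavily simplified illustration of the AI-driven demand forecasting described above, the sketch below fits a gradient-boosted regressor to synthetic weekly sales; the data, features, and model choice are all assumptions, standing in for the far richer datasets real deployments rely on.

```python
# Toy demand-forecasting sketch: predict weekly demand from simple
# calendar and lag features. Data and features are synthetic assumptions.
import numpy as np
from sklearn.ensemble import GradientBoostingRegressor

rng = np.random.default_rng(0)
weeks = np.arange(104)  # two years of weekly history
demand = 200 + 30 * np.sin(2 * np.pi * weeks / 52) + rng.normal(0, 10, 104)

# Features: week-of-year seasonality and last week's demand (lag-1).
# Drop the first row, since the lag wraps around there.
X = np.column_stack([weeks % 52, np.roll(demand, 1)])[1:]
y = demand[1:]

model = GradientBoostingRegressor().fit(X[:-12], y[:-12])  # hold out 12 weeks
preds = model.predict(X[-12:])
print(np.round(preds, 1))  # compare against the held-out y[-12:] to score
```

In practice, the feature set would include promotions, pricing, weather, and upstream supply signals, and the held-out weeks would be used to measure forecast error before trusting the model with inventory decisions.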


IBM’s report "Welcome to the cognitive supply chain" reveals an aggressive push towards utilizing AI to address end-to-end supply chain challenges, with financial outperformers demonstrating a higher inclination towards adopting AI technologies. These applications span across material quality, preventative maintenance, and risk management, highlighting machine learning's mainstream adoption in operational technologies.

Figure 7: Forecasted Generative AI in Supply Chain Market Size (source: Precedence Research)


It's crucial to acknowledge both the immediate impact and the long-term consequences of AI. Its transformative power goes beyond simple task automation, touching on complex data analysis and a broad array of professions. Insights from Berkeley Exec Ed suggest that AI's predictive capabilities could lead to the automation of a wide range of non-routine tasks, affecting up to 30% of tasks in nearly 60% of occupations. This doesn't mean we're heading towards a future without jobs, but it highlights a significant change in job roles and duties, stressing the growing need for flexibility in the job market.


As articulated by Berkeley Exec Ed in their recent article, the advent of the 'AI era' carries profound implications for our societies beyond technological advancements and business models. It prompts crucial questions about the evolving roles of humans as machines gain cognitive capabilities, particularly in leadership, decision-making, and strategy. Realizing this potential fully requires careful attention to ethical considerations, ongoing investment in AI research and development, and a commitment to inclusive education and training programs.


There is an urgent need to establish regulations on AI, and various countries are rushing to the task. With generative AI like ChatGPT growing quickly, the potential economic impact is significant, estimated between $2.6 trillion and $4.4 trillion annually. However, risks such as data bias, privacy violations, and misinformation highlight the need for regulation. Globally, there is no comprehensive AI or generative AI regulation yet, but jurisdictions such as Brazil, China, the EU, Singapore, South Korea, and the USA are leading with varied approaches. These regulations aim for AI to augment human decision-making without compromising autonomy, to be resilient against misuse, and to contribute positively to societal and environmental well-being. The shared vision among nations is to foster AI innovation that is accountable, secure, and beneficial for all, navigating the fine balance between advancing technology and safeguarding ethical standards and human values. Developing strong safety standards and ethical norms for AI is crucial to maximizing its societal benefits while reducing the risks associated with its misuse. By adopting these principles as standard practice, we can ensure that progress in artificial intelligence is both responsible and advantageous.

 
