I think it’s fair to say that Generative AI has pushed past the hype and is on a trajectory to touch nearly every aspect of our lives. Research published by Bloomberg projects that the generative AI market will explode to $1.3 trillion over the next 10 years. The term “Generative AI” appeared in more than 1,200 online news articles from January 2023 to January 2024 alone. Gartner identifies Generative AI as one of the top strategic technology trends of our time and predicts that it will become a general-purpose technology with an impact similar to that of the steam engine, electricity, and the internet.
Why is this technology so intriguing? In short, it’s the simplicity and naturalness it brings to our digital interactions.
Gen AI and Co-Pilots: Natural interaction and all insights in one place.
Generative AI solutions revolutionize the way we interact with technology, moving it from an interface-based to a natural interaction experience. You can ask questions and receive answers just as you would in a conversation with a friend or an expert. There's no need for specialized terminology or intricate question structures – just speak or type your query in a way that feels completely normal.
The second reason Generative AI solutions are so compelling is their ability to consolidate all your queries in one place, without the need to switch between different systems or apps. The AI will either answer from its own pretrained memory or sort and route your questions to external knowledge sources - all seamlessly in the background.
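To make the routing idea concrete, here is a deliberately simplified sketch. The keyword lists and source names are invented for illustration; a production Co-Pilot would use a language model or trained classifier for this decision rather than keyword matching.

```python
# Toy illustration of query routing: decide whether a question can be
# answered from the model's own pretrained knowledge or should be sent
# to an external knowledge source. Source names and keywords are
# hypothetical examples, not a real product's configuration.

EXTERNAL_SOURCES = {
    "parts_catalog": ["part", "fitment", "catalog", "oem"],
    "repair_orders": ["repair order", "labor time", "diagnos"],
}

def route(query: str) -> str:
    """Return the name of the knowledge source to query,
    or 'model' to answer from pretrained knowledge."""
    q = query.lower()
    for source, keywords in EXTERNAL_SOURCES.items():
        if any(k in q for k in keywords):
            return source
    return "model"

print(route("What is brake fade?"))        # general knowledge -> "model"
print(route("Fitment for F-150 rotors?"))  # needs catalog -> "parts_catalog"
```

The point is the architecture, not the matching logic: the user asks one question in one place, and the system decides behind the scenes where the answer should come from.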
A term that has emerged in this context is “Co-Pilot”, an AI system or solution that can create content or perform tasks based on natural language prompts. A Co-Pilot assists users with their work by automating or simplifying some aspects of it, while still letting users have control and agency over the final outcome.
Let's look at an example from the shop:
Addressing Challenges in Parts and Service: A Gen AI Perspective
The automotive parts and service landscape is more complex and more competitive than ever - and our conversations with industry leaders confirm again and again where the major pain points lie:
Inexperienced technicians and parts counter specialists present a significant challenge for the industry, requiring strategies for skill development and/or knowledge enhancement. In addition, a new generation of techs has entirely different preferences when it comes to using tools - they prefer to interact with minimal clicks, are mostly mobile-savvy, and ‘AI native’.
And, no surprise: it’s all about the parts - quickly identifying the correct part is still a major issue. Very specifically, part type identification relies heavily on PCdb/PIES - but these standards are not complete and not always comprehensive. New technologies or newly introduced part types are not captured quickly enough. Additionally, electronic catalogs maintain proprietary part terminology beyond PCdb/PIES. And lastly, PCdb/PIES is a very aftermarket-centric system that OEM part suppliers don’t fully adopt.
As in the example above, Generative AI solutions can help address these challenges very effectively. The automotive repair industry already sits on a wealth of data - repair orders, electronic parts catalog transactions, diagnostic sessions - just to name a few. AI-powered solutions such as Predii’s Smart Servicing Cloud® have been extracting insights from this data for years, powering different analytics solutions for various players in the industry. Now imagine an AI-based agent that is able to connect the dots across the parts and service ecosystem - and present back these findings in a natural, easy-to-consume manner - irrespective of the end-user’s level of experience.
Parts and Service Co-Pilots: Augmenting the repair process
While it is hard to report exact numbers, the industry agrees that significant revenue is being lost due to inefficient parts ordering. Traditional catalog systems are often clunky and hard to use without extensive knowledge of part number codes, kits, fitment specifics, and other quirks. Imagine a full brake replacement for a 2019 Ford F-150 - a repair that requires at least eight different components - and that’s assuming you go for the brake pads and rotors in a kit - all of which need to be typed in by part number or an exact component name match. That is after you selected your vehicle from a drop-down or punched in the VIN.
An AI-powered Parts Co-Pilot allows you to use the same ‘street language’ your techs are used to (try searching for ‘plenum gasket’ or 'boo switch' in a traditional parts catalog) and even skip the one-by-one search for multiple parts in a repair, e.g. a brake job - providing specific part numbers for everything from brake cleaner, calipers, drums, and shoes, to the brake pads & rotors kit and the required miscellaneous hardware. One query, and the parts counter becomes the hero in the shop in under 60 seconds.
The tricky part about Generative AI: How to avoid hallucinations and protect IP
While Generative AI has undoubtedly made its way into commercially deployed solutions - or will soon do so - there are naturally questions to be answered. Solutions need to be safe, reliable, and, obviously, make sense from an economic standpoint.
Let’s look at two of the most common concerns around Generative AI and large language models in general.
Data privacy and security: Generative AI and LLMs require large amounts of data to train and fine-tune their models. Organizations are - and should be - naturally cautious about exposing sensitive, confidential, and proprietary data to large language models, for the sake of protecting their intellectual property. If data is the new oil, you don’t want everyone fueling their trucks with yours.
It should go without saying that Generative AI providers need to ensure that data and privacy policies are transparent and in line with local and global legislation, and that IP restrictions and NDAs are in place. From Predii’s experience, the only way to realistically deploy solutions in a safe and responsible manner is to move the entire model and architecture into an isolated, customer-owned environment. This requires language models to be lightweight and the surrounding architecture to be enterprise-grade and production-ready.
Technical limitations and hallucinations: Generative AI and LLMs have known technical limitations and can make mistakes or produce nonsensical or inaccurate content - known as hallucinations - affecting the performance and reliability of the applications they power. Specifically in domain-expertise applications - such as medical devices or, in our case, complex repair procedures - a lack of accuracy and reliability can be a show-stopper.
Domain- or even task-specific fine-tuning of large language models on dedicated datasets is a good way of increasing the accuracy of responses - but it is expensive and almost never economically viable. This is why “Retrieval Augmented Generation” or “Ontology Augmented Generation” based solutions are becoming increasingly popular.
Hybrid solutions leverage the best of both worlds
An effective way of creating meaningful Co-Pilot solutions is to combine the capabilities of Generative AI with the support of labelled knowledge sources. In this scenario, the large language model acts as an agent that understands the intent behind a natural-language question, taps into vectorized knowledge sources - e.g. repair orders, parts catalog data, repair procedures - and then intelligently summarizes the relevant information and presents it back to the user.
The AI-powered solution becomes a platform that allows custom knowledge infusion, optimally prepares the relevant data to be consumed by the large language model (which, easy as it sounds, is not a straightforward task - repair orders are messy!), and then integrates with existing workflows.
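The retrieval step of such a hybrid pipeline can be sketched in a few lines. This is a minimal illustration only: the catalog snippets and part numbers are invented, and the bag-of-words “embedding” stands in for the learned embedding model a real system would use to vectorize its knowledge sources.

```python
from collections import Counter
import math

# Hypothetical knowledge base: snippets a Parts Co-Pilot might index
# (catalog entries, repair orders). All part numbers are made up.
KNOWLEDGE = [
    "2019 Ford F-150 front brake pads and rotors kit part KIT-4417",
    "2019 Ford F-150 front brake caliper left part CAL-2210",
    "brake cleaner aerosol 14 oz part CHM-0099",
    "2019 Ford F-150 cabin air filter part FLT-7731",
]

def embed(text: str) -> Counter:
    """Toy 'embedding': a bag-of-words term-frequency vector.
    A production system would use a learned embedding model."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(query: str, k: int = 2) -> list[str]:
    """Return the k knowledge snippets most similar to the query."""
    q = embed(query)
    ranked = sorted(KNOWLEDGE, key=lambda d: cosine(q, embed(d)), reverse=True)
    return ranked[:k]

def build_context(query: str) -> str:
    """Assemble the retrieved snippets that would be placed into the
    LLM's prompt; the model then summarizes them for the user."""
    return "Context:\n" + "\n".join(f"- {c}" for c in retrieve(query))

print(build_context("brake job 2019 F-150"))
```

The design point is that the language model never answers from thin air: it summarizes snippets that were actually retrieved from the shop’s own data, which is what keeps hallucinations in check.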
In our brake job scenario above, the AI did not invent a single piece of content; it acted as an agent - an assistant connecting the relevant pieces of information from the shop management system, repair procedures, and the parts catalog via API integration.
Generative AI is recognized as a top strategic technology trend and, if applied correctly, can act as an effective Co-Pilot to solve major challenges in the parts & service ecosystem, namely parts identification and process inefficiencies. Hybrid approaches that combine large language models with external knowledge infusion provide a balanced solution that leverages the benefits of Generative AI while addressing major concerns around reliability (“hallucinations”), IP, and data security.