Artificial Intelligence (AI) and Machine Learning (ML) In Real Life (IRL)

HAL from 2001: A Space Odyssey

Unless you’ve been completely disconnected from modern society for the last 30 years, you’ve been impacted by AI whether you realize it or not. Suggested friends on Facebook? The call you placed to make an appointment at your salon last week? There’s a decent chance you were talking to AI. That needless item you bought on Amazon at 1am after drinking an entire bottle of wine that was supposed to be “just one glass”? You found it, ordered it, and it’ll be delivered with AI involved through the entire pipeline. AI may seem like a new invention now that ChatGPT has burst onto the scene and started writing response texts to your significant other, but the foundational principles have been around for some time. Machine learning and the predictive analytics it powers have existed in some form or fashion for years…AI wrapping them into something that mimics human thinking and behavior is new. Or is it?

Machine Learning (ML) has been around longer than most realize

In my first role at General Motors in 2013, I worked as a data analyst, using Excel to create reports that I exported to PowerPoint. I automated data updates by integrating source exports with ingestion into Excel through cubes and pivot tables. Charts would then update automatically in Excel, and by linking them directly to PowerPoint, they would update automatically in the final presentation as well.

Most of the job was updating reports for executives, and to stand out as a grunt market analyst, I always made sure to include some key insights in my reports based on the trends I observed in the data. With my data and charts updating automatically, I began automating the analysis itself: identifying the key data points to analyze and establishing thresholds for those points so Excel would notify me of outliers or continuing trends. Over time, my Excel model would notify me of newly emerging market trends and could forecast market performance based on them with impressive accuracy.
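The spirit of that Excel model can be sketched in a few lines of Python. To be clear, this is a minimal illustration of the idea, not the original workbook: the data, window size, and z-score threshold are all made up.

```python
import statistics

def flag_outliers(values, window=6, z_threshold=2.0):
    """Flag indices that deviate sharply from the preceding rolling window."""
    flags = []
    for i in range(window, len(values)):
        recent = values[i - window:i]
        mean = statistics.mean(recent)
        stdev = statistics.stdev(recent)
        if stdev > 0 and abs(values[i] - mean) / stdev > z_threshold:
            flags.append(i)
    return flags

def forecast_next(values, window=6):
    """Naive linear-trend forecast from the last `window` points."""
    recent = values[-window:]
    slope = (recent[-1] - recent[0]) / (window - 1)
    return recent[-1] + slope

monthly_sales = [100, 102, 101, 103, 105, 104, 106, 140]  # made-up monthly data
print(flag_outliers(monthly_sales))  # the spike at index 7 gets flagged
print(forecast_next(monthly_sales))
```

The same logic maps directly onto Excel formulas (AVERAGE, STDEV, and conditional formatting for the notification), which is exactly why a spreadsheet can quietly become a forecasting pipeline.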

Without knowing it, I had created a very basic Machine Learning pipeline in an effort to make my job simpler. While my peers spent countless hours copying and pasting tables and charts, I was using my brain to deeply analyze the data and ensure actionable insights were being presented to leadership. I ascended from that role quickly. I could use ChatGPT today for fully AI-driven analysis, but the fundamentals of Machine Learning would persist.

Predictive Analysis for Forecasting and Marketing

Using ML for analysis and forecasting in product and portfolio planning, from developing business cases to making data-driven decisions for new product development, was and still is core to how I develop products. In 2021, I learned how these models could be used to create products themselves when I joined automotiveMastermind (aM).

Predictive marketing was all the rage: using ML with large customer datasets to identify in-market customers to proactively market to. aM offers a SaaS product to automotive retailers that analyzes their Customer Relationship Management (CRM) systems to identify customers who are most likely in market for a vehicle, assigning each a Behavior Prediction Score (BPS). This helps retailers focus their precious time on the customers most likely to transact, driving efficiency and business results. aM would even proactively send print and digital marketing to customers on behalf of the dealer, changing lead generation from pull to push. I loved the model and was excited to help take it to the next level.

After nearly a decade of working for automotive OEMs, I deeply understood the inefficiencies in marketing and incentive planning. In 2023, General Motors spent $2.9B (yes, BILLION) on advertising in the US alone. It also spent $2-5k per vehicle in incentives on average, making the combined advertising and incentive budget comparable to the value of a mid-cap company. Using the intelligence of aM, we imagined a product that enabled OEMs to identify the most in-market customers and, by ingesting additional customer data, could also identify price sensitivity and assign specific incentives to each prospect. EnterpriseEyeQ (EEQ) could effectively turn a shotgun blast into a laser beam for incentive spending, and upend the very large automotive marketing space.

When an automotive customer returns to market, they do so for various reasons. A lease renewal, replacing a worn-out vehicle, or needing a larger vehicle can all result in a new vehicle sale. When a customer chooses to return to a brand and buy a larger, and thus more expensive, vehicle, the OEM is ecstatic. Loyal customers are the best customers, and those who continue to spend more money are the best of the best. Using ML and predictive analysis, the challenge that came to me as a Product Manager on EEQ was to develop a model to predict migration patterns for vehicle upsizing.

This was a highly specific ask on a tight timeline, but I understood the critical importance of this feature to our OEM partners. We wouldn’t have time to ramp up developers on the data science and logic behind this new model, so we built a prototype for the MVP ourselves using my favorite tool…Excel. We ingested industry-wide migration pattern data into Excel and extrapolated it for each OEM partner’s vehicle lines. A prospect with a certain BPS, a certain price sensitivity, and a certain lifestage, driving a certain vehicle, could be matched to a prediction for their next vehicle. We used INDEX/MATCH against those thresholds to export vehicle matches to the predictive marketing system, effectively sending the right incentive to the right customer for the right (upsell) vehicle.
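The shape of that threshold-gated lookup can be sketched in Python. Everything below is illustrative: the segment names, lifestages, and threshold values are hypothetical stand-ins, not aM’s actual model or data.

```python
# Hypothetical sketch of an INDEX/MATCH-style upsell lookup.
# Segments, lifestages, and thresholds are made up for illustration.

MIGRATION_TABLE = {
    # (current_segment, lifestage) -> predicted next segment
    ("compact_suv", "growing_family"): "midsize_suv",
    ("midsize_suv", "growing_family"): "fullsize_suv",
    ("sedan", "empty_nester"): "compact_suv",
}

def predict_next_vehicle(bps, price_sensitivity, current_segment, lifestage,
                         bps_floor=70, sensitivity_ceiling=0.6):
    """Return an upsell prediction only for in-market, incentive-receptive prospects."""
    if bps < bps_floor or price_sensitivity > sensitivity_ceiling:
        return None  # below threshold: skip this prospect
    return MIGRATION_TABLE.get((current_segment, lifestage))

print(predict_next_vehicle(85, 0.4, "compact_suv", "growing_family"))
```

In Excel the table lives in a worksheet and the gate is a nested IF; in code it’s a dictionary and a guard clause. The structure of the prototype is identical either way.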

Once the MVP was in market, we used ML to assess the accuracy of our prediction model and improve it over time with a feedback loop based on actual performance. The ML pipeline was also used to move the predictive model out of Excel and into the cloud for scalability.
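At its simplest, a feedback loop like this just compares each cycle’s predictions against what actually happened. A minimal sketch, with made-up segment values:

```python
def prediction_accuracy(predicted, actual):
    """Fraction of predictions that matched the observed next-vehicle outcome."""
    hits = sum(1 for p, a in zip(predicted, actual) if p == a)
    return hits / len(predicted)

# Illustrative data: what the model predicted vs. what customers actually bought
predicted = ["midsize_suv", "fullsize_suv", "compact_suv", "midsize_suv"]
actual    = ["midsize_suv", "midsize_suv",  "compact_suv", "midsize_suv"]
print(prediction_accuracy(predicted, actual))  # 0.75
```

That single number, tracked over time, is what tells you whether each tweak to the thresholds or the migration table actually made the model better.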

Robots bringing AI and ML to the physical world

I’m not going to go on a doomsday tirade about robot overlords and the inevitability of a Terminator-style apocalypse (get a few beers in me on a Friday night and we’ll have a fun discussion). While robotics is bringing AI and ML to the physical world, there’s still a long way to go in the marriage of hardware and software before a simple, safe, and reliable robotic system in a large-scale form factor is available to the public. With more real-world pilot deployments and real-world testing, the day of autonomous vehicles and humanoid robots is fast approaching, but it’ll be a while before the car in your garage can fully drive itself or your robot Jeeves is shaking up dirty martinis at your next soiree.

Product development in the current robotics landscape is unique compared to other industries, even tech, which seems like it can invent just about anything. As the Product Lead at Forterra for the AutoDrive autonomous vehicle software stack, identifying and filling gaps between product capability and customer expectations became core to the job. Customers expect a robotic vehicle that can do everything a human-operated vehicle can do, because the business case for autonomy hinges on deploying autonomous vehicles without drivers in place of human-operated vehicles. The challenge is that humans can quickly adapt to operate in infinitely variable environments, while robots can’t.

Oversimplifying, robots (and robotic vehicles) are engineered to operate in a manner similar to how humans do:

  1. Receive a command
  2. Perceive the world 
  3. Interpret the perceived world (ML)
  4. Execute action based on perceived world (ML)
  5. Record experience to be referenced later (ML)
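The five steps above can be sketched as a control loop. This is a deliberate oversimplification: the class names, the stub perception model, and the stop-on-obstacle policy are all illustrative, and a real autonomy stack is vastly more involved.

```python
class StubPerception:
    """Stand-in for the ML perception model (steps 2-3)."""
    def interpret(self, frame):
        # Toy interpretation: any lidar return in the path counts as an obstacle
        return {"obstacle_ahead": frame.get("lidar_hits", 0) > 0}

class RobotLoop:
    def __init__(self, perception_model):
        self.model = perception_model
        self.experience_log = []  # step 5: recorded for later (re)training

    def step(self, command, sensor_frame):
        world = self.model.interpret(sensor_frame)                  # step 3
        action = self.plan(command, world)                          # step 4
        self.experience_log.append((sensor_frame, world, action))   # step 5
        return action

    def plan(self, command, world):
        # Toy policy: stop if anything is detected in the path
        return "stop" if world["obstacle_ahead"] else command

robot = RobotLoop(StubPerception())
print(robot.step("forward", {"lidar_hits": 0}))  # forward
print(robot.step("forward", {"lidar_hits": 3}))  # stop
```

Even in this toy version, the product levers show up clearly: the policy in `plan` governs safe operation, the perception model determines which environments the loop can handle, and the experience log is what feeds the ML pipeline later.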

The product development levers are few but critical:

  1. Safe operation
  2. Smooth stack performance
  3. New actions
  4. Expanding operational environments (Operational Design Domain, or ODD)

“Operate like a human” seems so simple on the surface, doesn’t it? But the devil really lives in the details, and that’s where smart product leadership weighs in. Having a deep understanding of technological limitations in both hardware and software, then working within those limitations to make the product better, is the essence of what a product manager does. Deploying machine learning in the perception pipeline, for example, enables the product to handle more scenarios and environments, but it’s the human training through data annotation and threshold setting that truly makes ML useful for improving the product.

There will always be a need for interaction between human and robot, beyond just a safety net to prevent a Terminator-style apocalypse. Robots supplementing humans by doing repetitive and oftentimes dangerous tasks is where their value really shines. How do robots know what those tasks are, when to do them, and how? The human-machine interface (HMI) is the product developer’s domain. As the main interface between a customer and their purchased robot, it’s the true differentiator between robotic products. A smart, logical, and simple HMI can really surprise and delight, especially when it deploys AI in a logical manner.

Controlling robots through AI and ML

Yard trucks are small tractor-type vehicles designed with one purpose in life: moving truck trailers around a warehouse, distribution facility, port, or yard. They do the same task over and over in limited-variability environments that humans can quickly get bored working in. Boredom leads to complacency, which can lead to accidents causing costly property damage or human injury. This is a prime opportunity for autonomous vehicles, and one I became intimately familiar with in my time at Forterra. I was involved in the productization of the yard business, with a particular focus on Remote Operations, including the various HMIs.

The goal of an autonomous yard truck is to supplement and ultimately replace human-operated vehicles. Customers interested in autonomous yard trucks expect them to operate like human-operated vehicles, particularly when it comes to commanding them. We designed HMIs that used the same software scaled for different form factors to simplify the entire command-and-control process and, when combined with the machine learning pipeline on the robots, could command multiple vehicles at the same time with ease.

The customer in this case is a yard manager, the individual responsible for identifying which trailers need to move at a facility, when, and where. Today, yard managers use a scheduling system (a Yard Management System, or YMS) with automated or manual communication with drivers on the yard to move trailers around. The human drivers then take care of identifying routes and completing moves. “Hey Carla, move trailer 10 to dock door 3 by 1pm.” Carla the driver understands, but a robot won’t. That’s where the potential for large language models (LLMs) and AI comes in. But even better than commanding a single vehicle by voice would be commanding an entire fleet with data.
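To make that concrete, here’s a toy illustration of the structured command a robot actually needs, produced here by a simple parser standing in for the LLM. The schema, field names, and regex are my own assumptions for illustration, not Forterra’s actual format.

```python
import re

# Illustrative pattern for commands like "move trailer 10 to dock door 3 by 1pm"
COMMAND_PATTERN = re.compile(
    r"move trailer (?P<trailer>\w+) to (?P<destination>[\w ]+?) by (?P<deadline>\S+)",
    re.IGNORECASE,
)

def parse_move_command(utterance):
    """Turn a natural-language request into a structured command, or None."""
    match = COMMAND_PATTERN.search(utterance)
    if not match:
        return None
    return {
        "action": "MOVE_TRAILER",
        "trailer_id": match.group("trailer"),
        "destination": match.group("destination").strip(),
        "deadline": match.group("deadline"),
    }

print(parse_move_command("Hey Carla, move trailer 10 to dock door 3 by 1pm"))
```

The value an LLM adds is handling all the phrasings this brittle regex would miss, while still emitting the same structured payload the robot (or a fleet API) can act on.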

By developing a series of command-and-control APIs, we were able to integrate vehicle commands with existing automated systems. This made the robotic vehicles effectively act like human-operated vehicles, leaving the HMI mostly responsible for monitoring robotic activity and notifying the manager when the robots need help. Machine learning was used to identify and flag when a command was off-track for various reasons, but more impressively, AI was used to coordinate moves across multiple vehicles throughout the day, which improved the overall move count by reducing congestion on the yard and orchestrating a delicate dance of trailer moves. It’s like a robotic symphony conducted by AI and powered by machine learning.
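An “off-track” check of the kind described can be sketched as a comparison of observed progress against an expected timeline. The field names and the 20% tolerance band below are illustrative assumptions, not the actual implementation.

```python
def check_move_progress(move, now):
    """Return an alert dict if a trailer move is behind schedule, else None."""
    elapsed = now - move["started_at"]
    expected_fraction = min(elapsed / move["estimated_duration"], 1.0)
    if move["progress"] + 0.2 < expected_fraction:  # 20% tolerance band (assumed)
        return {"move_id": move["id"], "status": "OFF_TRACK",
                "expected": expected_fraction, "actual": move["progress"]}
    return None  # on track: no notification needed

# Illustrative move: 7 minutes into a 10-minute move, only 30% complete
move = {"id": "M-42", "started_at": 0, "estimated_duration": 600, "progress": 0.3}
print(check_move_progress(move, now=420))
```

In production, this kind of check would feed the HMI’s notification stream, so the yard manager only looks at the fleet when a move genuinely needs a human.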

The future will always be AI

AI has been with us for a while in various ways, and while it’s experiencing a boom in advancement with LLMs and increased everyday applicability, there’s still, and always will be, room for improvement. While advancing rapidly, AI is only as good as the humans programming it, and we have a long way to go toward replicating our emotional intelligence and intuition. As the dust settles and LLMs lose their shiny-new-object luster, we’ll find the right, balanced ways of working where AI supplements our lives rather than replacing them.

Product development professionals will be spending more time on the human-machine hand-off, finding the right approach to deploying AI and ML in more products across all industries. Considering how to enhance the user experience through smart utilization of LLMs, or using ML to force-multiply robotic commands, should be on all of our roadmaps in some form or fashion. Even if it sounds new and maybe even scary, keep in mind that we’ve been doing this for a while, even if just by automating charts in PowerPoint.
