Automotive Fleet

The Story of 2025: Outsourcing Accountability

Technology has transformed the fleet world, and now a choice must be made: how do we discern when to use AI and when not to?

December 12, 2025

Accountability, critical thinking skills, and safety are at risk when fleets rely too heavily on AI. 

Image: Automotive Fleet



Let’s start with AI — and the increasingly concerning tendency to trust software that’s accountable to no one.

Think about your work as a fleet or safety professional. Your job is to weigh context: prevailing laws, company culture, internal policies, budgets, and the experience you've built over time through education, networking, and collaboration.


Now consider how technology has been steadily integrated to improve compliance with safety policies and fleet best practices. These efficiencies allow fewer experts to manage more vehicles and employees — and there’s nothing wrong with that. But there’s a limit to how much you should outsource to AI.

While it’s tempting to rely on AI agents to identify risks or inefficiencies, what you surrender in return is your critical thinking — and, more importantly, your organization’s accountability. Best practices in fleet and safety aren’t algorithms; they’re the result of people sharing experience, learning, and refining approaches over the years.

Those practices evolve through professionals who engage with associations such as NAFA, NETS, and AFLA, and through attending our own BBM events, such as the Fleet Forward Conference and Government Fleet Expo. As we embrace new AI-enabled tools, we must also stay grounded in real-world discussion and peer learning. That’s why in 2026, our Fleet Fast Podcast on Spotify will feature over 40 hours of sessions recorded at Fleet Forward Conference — so the human dialogue remains central.

Quick Pulse Check

As 2026 begins, budgets are set, and compromises loom. You have two choices:

Subscribe to software that uses AI to guide management decisions as it learns from your data, or


Hire an intern — invest in a person — and pass along your experience as you make decisions yourself.

You probably can’t afford to do both.

I’ll admit my bias: I love technology. I’ve helped organizations deploy data-driven systems that delivered remarkable safety and efficiency gains. But none of that relied on AI to make the decisions — it was skilled professionals interpreting data who produced the results. Even so, in 2026, I’d pick an intern over an AI upgrade.

So, here’s the real question: if you had to choose between an AI platform that could replace your critical-thinking value and a human you could mentor toward succession — both costing the same — which would you choose?

The Human Edge

Are you prepared to learn and retain knowledge — or only to ask prompts and accept whatever response comes back?


The less we practice retaining knowledge, the harder it becomes. You might feel efficient multitasking through prompts, but your software will always be the better multitasker. Each time you train AI by rewarding its responses, you’re also training your own replacement.

For now, your value lies in the knowledge you share — but within a year or two, if you don’t evolve, you may find yourself needing a completely new role.

In the short term, your company wins. In the long term, both you and your organization lose if mentorship and human judgment fade away.

As you weigh your choices — mentoring new people versus investing in more software — remember: You can’t technology your way into a safety culture. You can’t technology your way into efficiency growth. And you can’t technology your way into 2026.

It’s here, it’s real — and we’re in this together. 

