The challenging winter-weather testing is based in Michigan, including Mcity on the campus of the University of Michigan.
by Staff
January 14, 2016
Photo courtesy of Ford.
Ford is conducting autonomous vehicle tests in snow-covered environments — another step in the company’s plan to bring self-driving vehicles to millions of customers worldwide, the automaker said.
“It’s one thing for a car to drive itself in perfect weather,” said Jim McBride, Ford technical leader for autonomous vehicles. “It’s quite another to do so when the car’s sensors can’t see the road because it’s covered in snow. Weather isn’t perfect, and that’s why we’re testing autonomous vehicles in wintry conditions — for the roughly 70 percent of U.S. residents who live in snowy regions.”
Ford’s winter-weather testing takes place in Michigan, including at Mcity — a 32-acre, simulated urban environment at the University of Michigan.
Fully autonomous driving can’t rely on GPS, which is accurate only to several yards — not enough to localize or identify the position of the vehicle. And it’s key that an autonomous vehicle knows its precise location, not just within a city or on a road, but in its actual driving lane. A variation of a few inches makes a big difference.
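The gap between GPS and lane-level accuracy can be made concrete with a rough back-of-the-envelope check. The figures below are illustrative assumptions, not numbers from Ford: a typical U.S. highway lane is about 3.7 meters wide, and consumer-grade GPS error is often several meters.

```python
# Illustrative comparison of positioning error against lane width.
# All numbers here are assumed typical values, not figures from the article.

LANE_WIDTH_M = 3.7     # assumed typical U.S. highway lane width
GPS_ERROR_M = 3.0      # assumed consumer-grade GPS error ("several yards")
LIDAR_ERROR_M = 0.02   # centimeter-level localization


def can_hold_lane(position_error_m: float, lane_width_m: float = LANE_WIDTH_M) -> bool:
    """True if the worst-case position error keeps the car inside its own lane."""
    return position_error_m < lane_width_m / 2


print(can_hold_lane(GPS_ERROR_M))    # False: the error spans most of the lane
print(can_hold_lane(LIDAR_ERROR_M))  # True
```

With a few meters of error, the estimated position can land in an adjacent lane entirely, which is why the article frames centimeter-level localization as a prerequisite.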
LiDAR, on the other hand, is much more accurate than GPS, identifying the Fusion Hybrid’s lane location down to the centimeter, according to Ford. LiDAR emits short pulses of laser light, allowing the vehicle to build a real-time, high-definition 3D image of its surroundings.
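The ranging principle behind those laser pulses is time of flight: a pulse travels to a target and back at the speed of light, so measuring the round trip to nanosecond precision yields centimeter-scale distances. A minimal sketch of the arithmetic, with an illustrative pulse time that is not from the article:

```python
# Time-of-flight ranging, the principle behind LiDAR distance measurement:
# distance = (speed of light * round-trip time) / 2.

SPEED_OF_LIGHT_M_PER_S = 299_792_458.0


def range_from_pulse(round_trip_seconds: float) -> float:
    """Distance to a target from a laser pulse's round-trip travel time."""
    return SPEED_OF_LIGHT_M_PER_S * round_trip_seconds / 2.0


# An illustrative pulse returning after ~66.7 nanoseconds corresponds
# to a target roughly 10 meters away.
print(f"{range_from_pulse(66.7e-9):.2f} m")
```

A timing error of a single nanosecond shifts the computed range by only about 15 centimeters, which is why the technique can resolve lane position so precisely.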
In ideal weather, LiDAR is the most efficient means of gathering important information and metadata (underlying information about the data itself) from the surrounding environment. LiDAR can sense nearby objects and use cues to determine the best driving path. But on snow-covered roads or in high-density traffic, LiDAR and other sensors such as cameras can’t see the road. The same is true when a sensor lens is covered by snow, grime or debris.
Together, Ford and University of Michigan technologists began developing a solution that would allow an autonomous vehicle to see on a snow-covered road. To navigate snowy roads, Ford autonomous vehicles are equipped with high-resolution 3D maps. The maps are complete with information about the road and what’s above it, including road markings, signs, geography, landmarks and topography.
“Maps developed by other companies don’t always work in snow-covered landscapes,” said Ryan Eustice, associate professor at the University of Michigan College of Engineering. “The maps we created with Ford contain useful information about the 3D environment around the car, allowing the vehicle to localize even with a blanket of snow covering the ground.”
An autonomous vehicle creates the maps while driving the test environment in favorable weather, with technologies automatically annotating features such as traffic signs, trees and buildings. When the car can’t see the ground, the vehicle detects above-ground landmarks to pinpoint itself on the map, and then subsequently uses the map to drive successfully in inclement conditions.
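The idea described above can be sketched in miniature: given a prebuilt map of above-ground landmarks, the vehicle scores candidate positions by how well its current landmark measurements match the map, then adopts the best-scoring one. This is a simplified illustration under assumed data, not Ford’s actual localization software; the map, landmark names, and ranges are all hypothetical.

```python
# A minimal sketch of landmark-based localization: pick the candidate
# position that best explains the ranges measured to known landmarks.
# The map and observations below are hypothetical illustration data.

import math

# Hypothetical prebuilt map: landmark name -> (x, y) position in meters.
LANDMARK_MAP = {
    "sign_a": (10.0, 5.0),
    "tree_b": (-4.0, 12.0),
    "bldg_c": (20.0, -3.0),
}


def score_pose(pose, observations):
    """Sum of squared errors between observed and map-predicted landmark ranges."""
    px, py = pose
    error = 0.0
    for name, observed_range in observations.items():
        lx, ly = LANDMARK_MAP[name]
        predicted_range = math.hypot(lx - px, ly - py)
        error += (predicted_range - observed_range) ** 2
    return error


def localize(observations, candidate_poses):
    """Return the candidate position that best matches the observations."""
    return min(candidate_poses, key=lambda pose: score_pose(pose, observations))


# Simulate ranges measured from a true position of (2, 1) to each landmark.
obs = {name: math.hypot(x - 2.0, y - 1.0) for name, (x, y) in LANDMARK_MAP.items()}
candidates = [(0.0, 0.0), (2.0, 1.0), (5.0, 5.0)]
print(localize(obs, candidates))  # (2.0, 1.0)
```

Because the landmarks sit above the road surface, this kind of matching keeps working even when snow hides the lane markings, which is the behavior the article describes.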
“The vehicle’s normal safety systems, like electronic stability control and traction control, which often are used on slippery winter roads, work in unison with the autonomous driving software,” McBride explained. “We eventually want our autonomous vehicles to detect deteriorating conditions, decide whether it’s safe to keep driving, and if so, for how long.”
Winter driving still presents a host of challenges, but Ford considers this testing an important achievement on the road to autonomous driving. That road goes back roughly a decade to the first-generation autonomous vehicle from Ford — a LiDAR-equipped F-250 Super Duty.