During an industry event, company researchers highlight progress in developing camera-based systems that can recognize and understand a range of driving scenarios.
by Staff
October 14, 2015
Image of scene labeling courtesy of Daimler AG.
Daimler researchers are making major strides in developing camera-based systems that can automatically recognize and classify different categories of people, objects and driving situations.
Such “scene labeling” capabilities represent a breakthrough, aided in part by the UR:BAN initiative, shorthand for the Urban Space: User-friendly Assistance Systems and Network Management project. At the project’s closing event, Daimler presented results from five different test vehicles.
“The tremendous increase in computing power in recent years has brought closer the day when vehicles will be able to see their surroundings in the same way as humans and also correctly understand complex situations in city traffic,” said Professor Ralf Guido Herrtwich, head of driver assistance and chassis systems for group research and advanced development at Daimler AG.
Scene labeling transforms the camera “from a mere measuring system into an interpretive system, as multifunctional as the interplay between eye and brain,” Daimler explained in a press release.
One Daimler test vehicle demonstrated scene labeling, while another showcased imaging radar systems and their potential in urban environments. Radar sensors can now resolve and visualize not just dynamic objects but also static environments. Additionally, properties of radar waves allow the system to function in fog and inclement weather. Environment models can be generated using data from radar and camera sensors.
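The environment models described above can be pictured as a simple fusion of per-cell occupancy estimates from the two sensor types. The sketch below is purely illustrative, assuming a hypothetical grid layout and weighting; it is not Daimler's actual implementation.

```python
# Illustrative sketch only: merging radar and camera occupancy grids into one
# environment model, as the article describes. The grid representation and
# the weights are hypothetical assumptions, not the real system's design.

def fuse_grids(radar_grid, camera_grid, w_radar=0.6, w_camera=0.4):
    """Combine per-cell occupancy estimates from radar and camera.

    Radar is weighted higher here because, as the article notes, radar
    keeps working in fog and inclement weather.
    """
    return [[w_radar * r + w_camera * c
             for r, c in zip(r_row, c_row)]
            for r_row, c_row in zip(radar_grid, camera_grid)]

# One row, two cells: radar is fairly sure about the first cell,
# the camera less so.
fused = fuse_grids([[0.9, 0.1]], [[0.5, 0.0]])
print(fused)
```

In a real system the fusion would be far more sophisticated (tracking, uncertainty modeling), but a weighted combination conveys the basic idea of building one model from both sensor streams.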
The third test vehicle included a system for detecting, classifying and identifying the intent of pedestrians and cyclists. This system, for example, analyzed head posture, body position and curbside position to predict whether a pedestrian was likely to stay on the sidewalk or cross the road. “In dangerous situations, this allows an accident-preventing system response to be triggered up to one second earlier than with currently available systems,” Daimler said.
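The cues Daimler names for pedestrian intent (head posture, body position, curbside position) can be sketched as a toy rule-based estimate. All feature names and thresholds below are hypothetical illustrations, not the actual classifier.

```python
# Illustrative sketch only: a rule-based pedestrian-intent estimate using the
# cues named in the article (head posture, body position, curbside position).
# The features, weights, and thresholds are assumptions for illustration.

def crossing_intent(head_toward_road: bool,
                    body_facing_road: bool,
                    distance_to_curb_m: float) -> float:
    """Return a rough probability that the pedestrian will step into the road."""
    score = 0.0
    if head_toward_road:
        score += 0.4   # looking toward traffic often precedes crossing
    if body_facing_road:
        score += 0.3   # torso oriented toward the roadway
    if distance_to_curb_m < 0.5:
        score += 0.3   # standing right at the curb edge
    return min(score, 1.0)

# A pedestrian glancing at traffic, facing the road, at the curb edge:
print(crossing_intent(True, True, 0.3))  # → 1.0 (high crossing intent)
```

Predicting intent from posture rather than waiting for motion is what buys the system the extra second of reaction time the article mentions.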
Another highlight was the demonstration of how radar- and camera-based systems can make lane changing in city traffic safer and less stressful. Following a command from the driver, this system provides assisted lane changing in a speed range of 18-37 mph. At any time, the driver can override the system by intervening with the steering, accelerator or brakes.
The fifth test vehicle showed the potential for predicting driver behavior in relation to planned lane changes or changes of direction.
“With regard to an imminent change of lane, for example, glances over the shoulder are linked with driving parameters that have already been sensed,” Daimler explained. “A likely change of direction can be predicted from the interplay between steering movement, reduction of speed and map information. In the demonstration, the direction indicator was then automatically activated to inform other road users as early as possible.”
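The interplay Daimler describes (steering movement, speed reduction, and map information combining into a turn prediction that triggers the indicator) can be sketched as follows. The thresholds and interface are hypothetical assumptions, not the UR:BAN project's actual implementation.

```python
# Illustrative sketch only: fusing the cues the article names (steering
# movement, deceleration, map data) into an early direction-change prediction.
# Thresholds and signatures are hypothetical, for illustration.

def predict_turn(steering_angle_deg: float,
                 decel_mps2: float,
                 junction_ahead: bool) -> str:
    """Return 'left', 'right', or 'none' as an early turn prediction."""
    if junction_ahead and decel_mps2 > 1.0 and abs(steering_angle_deg) > 5.0:
        # Convention assumed here: positive steering angle = leftward.
        return "left" if steering_angle_deg > 0 else "right"
    return "none"

# An assistance layer could then activate the indicator as early as possible:
turn = predict_turn(steering_angle_deg=12.0, decel_mps2=1.8, junction_ahead=True)
if turn != "none":
    print(f"activate {turn} indicator")  # → "activate left indicator"
```

The point of the demonstration was timing: by inferring the maneuver from these signals before the driver signals it, other road users are informed earlier.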
To advance such technology, Daimler continues to conduct research with its partners.