Boaz Mizrachi, Co-Founder and CTO of Tactile Mobility. Boaz is a veteran technologist and entrepreneur with over three decades of experience in signal processing, algorithm research, and system design in the automotive and networking industries. He also brings hands-on leadership skills as the co-founder and Director of Engineering at Charlotte’s Web Networks, a world-leading developer and marketer of high-speed networking equipment (acquired by MRV Communications), and as System Design Group Manager at Zoran Microelectronics (acquired by CSR).
Tactile Mobility is a global leader in tactile data solutions, driving advancements in the mobility industry since 2012. With teams in the U.S., Germany, and Israel, the company specializes in combining signal processing, AI, big data, and embedded computing to enhance smart and autonomous vehicle systems. Its technology enables vehicles to “feel” the road in addition to “seeing” it, optimizing real-time driving decisions and creating accurate, crowd-sourced maps of road conditions. Through its VehicleDNA™ and SurfaceDNA™ solutions, Tactile Mobility serves automotive manufacturers, municipalities, fleet managers, and insurers, pioneering the integration of tactile sensing in modern mobility.
Can you tell us about your journey from co-founding Charlotte’s Web Networks to founding Tactile Mobility? What inspired you to move into the automotive tech space?
After co-founding Charlotte’s Web Networks, I transitioned into a role at Zoran Microelectronics, where I served as a systems architect and later a systems group manager, focusing on designing ASICs and boards for home entertainment systems, set-top boxes, and more. Then, a conversation with a friend sparked a new path.
He posed a thought-provoking question about how to optimize vehicle performance driving from point A to point B with minimal fuel consumption, taking into account factors like the weather, road conditions, and the vehicle’s capabilities. This led me to dive deep into the automotive space, founding Tactile Mobility to address these complexities. We started as an incubator-backed startup in Israel, ultimately growing into a company on a mission to give vehicles the ability to “feel” the road.
What were some of the initial challenges and breakthroughs you experienced when founding Tactile Mobility?
One of our major early challenges was generating real-time insights given the vehicle’s limited resources. Vehicles already had basic sensors, but they lacked insights into essential parameters like current vehicle weight, tire health, and surface grip. We tackled this by implementing new software in the vehicle’s existing engine control unit (ECU), which allowed us to generate these insights through “virtual sensors” that connected to the current vehicle setup and didn’t require additional hardware.
However, using the ECU to get the insights we needed presented as many problems as answers. An ECU is a low-cost, small computer with very limited memory. This meant our software originally had to fit within 100 KB, an unusual restriction in today’s software world, especially with the added complexity of trying to integrate machine learning and neural networks. Creating these compact virtual sensors that could fit in the ECU was a breakthrough that made us a pioneer in the field.
Tactile Mobility’s mission is ambitious—giving vehicles a “sense of touch.” Could you walk us through the vision behind this concept?
Our vision revolves around capturing and utilizing the data from vehicles’ onboard sensors to give them a sense of tactile awareness. This involves translating data from existing sensors to create “tactile pixels” that, much like visual pixels, can form a cohesive picture or “movie” of the vehicle’s tactile experience on the road. Imagine how a blind person senses their surroundings through touch – that is akin to how we want vehicles to feel the road, understanding its texture, grip, and potential hazards.
How do Tactile Mobility’s AI-powered vehicle sensors work to capture tactile data, and what are some of the unique insights they provide about both vehicles and roads?
Our software operates within the vehicle’s ECU, continuously capturing data from various hardware sensors like the wheel speed sensor, accelerometers, and the steering and brake systems. Ideally, there will also be tire sensors that can collect information about the road. This data is then processed to create real-time insights, or “virtual sensors,” that convey information about the vehicle’s load, grip, and even tire health.
For example, we can detect a slippery road or worn-out tires, which improves driver safety and vehicle performance. The system also enables adaptive functions, like adjusting the following distance in adaptive cruise control based on the current friction level, or warning drivers that they need to leave a larger gap to the car ahead.
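To make the friction-to-following-distance relationship concrete, here is a minimal sketch using the textbook friction-limited braking model (braking distance = v² / (2μg)). This is standard physics for illustration only, not Tactile Mobility’s actual control logic; the reaction time and friction values are assumptions.

```python
import math

G = 9.81  # gravitational acceleration, m/s^2

def min_following_gap(speed_mps: float, mu: float, reaction_time_s: float = 1.0) -> float:
    """Minimum following gap in meters: distance covered during the
    driver's (or system's) reaction time, plus friction-limited braking
    distance v^2 / (2 * mu * g)."""
    if mu <= 0:
        raise ValueError("friction coefficient must be positive")
    reaction = speed_mps * reaction_time_s
    braking = speed_mps ** 2 / (2 * mu * G)
    return reaction + braking

# At 100 km/h (~27.8 m/s): dry asphalt (mu ~ 0.8) vs. an icy road (mu ~ 0.2)
v = 100 / 3.6
dry = min_following_gap(v, 0.8)   # ~77 m
icy = min_following_gap(v, 0.2)   # ~224 m
```

Halving the friction coefficient doubles the braking term, which is why a real-time friction estimate, rather than a fixed worst-case constant, lets adaptive cruise control pick a gap that is both safe and practical.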
Tactile Mobility’s solutions enable vehicles to “feel” road conditions in real-time. Could you explain how this tactile feedback works and what role AI and cloud computing play in this process?
The system continuously gathers and processes data from the vehicle’s hardware sensors, applying AI and machine learning to convert this data into conclusions that can influence the vehicle’s operations. This feedback loop informs the vehicle in real-time about road conditions – like friction levels on varying surfaces – and transmits these insights to the cloud. With data from millions of vehicles, we generate comprehensive maps of road surfaces that indicate hazards like slippery areas or oil spills to create a safer and more informed driving experience.
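The cloud-side aggregation described above can be sketched as grouping per-vehicle friction reports by road segment and keeping a robust summary statistic per segment. This is a hypothetical illustration of the idea, not Tactile Mobility’s pipeline; the report schema, the minimum-report count, and the slippery threshold are all assumptions.

```python
from collections import defaultdict
from statistics import median

def build_friction_map(reports, min_reports=3, slippery_threshold=0.4):
    """Aggregate (segment_id, friction_estimate) reports from many
    vehicle passes into a per-segment map of road conditions."""
    by_segment = defaultdict(list)
    for segment_id, mu in reports:
        by_segment[segment_id].append(mu)

    friction_map = {}
    for segment_id, values in by_segment.items():
        if len(values) < min_reports:
            continue  # not enough independent evidence yet
        mu = median(values)  # median resists single-vehicle outliers
        friction_map[segment_id] = {"mu": mu, "slippery": mu < slippery_threshold}
    return friction_map
```

The median (rather than the mean) keeps one vehicle with worn tires or a bad estimate from mislabeling a whole segment, which matters when the map is built from uncontrolled, crowd-sourced passes.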
Could you describe how the VehicleDNA™ and SurfaceDNA™ technologies work and what sets them apart in the automotive industry?
VehicleDNA™ and SurfaceDNA™ represent two branches of our tactile “language.” SurfaceDNA™ focuses on the road surface, capturing attributes like friction, slope, and emerging hazards through tire sensors and other external sensors. VehicleDNA™, on the other hand, models the specific characteristics of each vehicle in real time – weight, tire condition, suspension status, and more – known in the industry as a “digital twin” of the chassis. Together, these technologies provide a clear understanding of the vehicle’s performance limits on any given road, enhancing safety and efficiency.
How does the onboard grip estimation technology work, and what impact has it had on autonomous driving and safety standards?
Grip estimation technology is crucial, especially for autonomous vehicles driving at high speeds. Traditional sensors can’t reliably gauge road grip, but our technology does. It assesses the friction coefficient between the vehicle and the road, which informs the vehicle’s limits in acceleration, braking, and cornering. This level of insight is essential for autonomous cars to meet existing safety standards, as it provides a real-time understanding of road conditions, even when they’re not visible, as is the case with black ice.
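One common textbook approach to grip estimation combines wheel slip with measured deceleration: utilized friction is |a| / g, and once slip passes the tire’s saturation point, utilized friction approximates the peak coefficient. The sketch below illustrates that idea only; the saturation threshold and the single-wheel, longitudinal-only model are simplifying assumptions, not a description of Tactile Mobility’s method.

```python
G = 9.81  # m/s^2

def slip_ratio(vehicle_speed: float, wheel_speed: float) -> float:
    """Longitudinal slip during braking: positive when the wheel
    rotates slower than the vehicle is traveling."""
    if vehicle_speed <= 0:
        return 0.0
    return (vehicle_speed - wheel_speed) / vehicle_speed

def utilized_friction(longitudinal_accel: float) -> float:
    """Fraction of available grip currently in use: mu_used = |a| / g."""
    return abs(longitudinal_accel) / G

def grip_limit_estimate(vehicle_speed, wheel_speed, longitudinal_accel,
                        slip_saturation=0.12):
    """If slip exceeds the assumed saturation point, the tire is at its
    limit, so utilized friction approximates the peak coefficient mu."""
    s = slip_ratio(vehicle_speed, wheel_speed)
    mu_used = utilized_friction(longitudinal_accel)
    return mu_used, s >= slip_saturation
```

In moderate braking the wheel barely slips and mu_used only bounds the friction from below; it is the hard-braking, high-slip events that reveal the actual peak, which is why fleet-scale data collection helps.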
Tactile Mobility is actively working with partner OEMs like Porsche, and with municipalities such as the City of Detroit. Can you share more about these collaborations and how they have helped expand Tactile Mobility’s impact?
While I can’t disclose specific details about our collaborations, I can say that working with original equipment manufacturers (OEMs) and city municipalities has been a long but rewarding process.
In general, OEMs can harness our data to generate critical insights into vehicle performance across different terrains and weather conditions, which can inform enhancements in safety features, drive assist technologies, and vehicle design. Municipalities, on the other hand, can use aggregated data to monitor road conditions and traffic patterns in real-time, identifying areas that require immediate maintenance or pose safety risks, such as slick roads or potholes.
What do you believe are the next major challenges and opportunities for the automotive industry in the realm of AI and tactile sensing?
The challenge of achieving accuracy in autonomous vehicles is likely the most difficult. People are generally more forgiving of human error because it’s part of driving; if a driver makes a mistake, they’re aware of the risks involved. However, with autonomous technology, society demands much higher standards. Even a failure rate that’s much lower than human error could be unacceptable if it means a software bug might lead to a fatal accident.
This expectation creates a major challenge: AI in autonomous vehicles must not only match human performance but far surpass it, achieving extremely high levels of reliability, especially in complex or rare driving situations. So we have to ensure that all of the sensors are accurate and transmit data within a timeframe that allows for a safe response window.
On top of that, cybersecurity is always a concern. Vehicles today are connected and increasingly integrated with cloud systems, making them potential targets for cyber threats. While the industry is progressing in its ability to combat threats, any breach could have severe consequences. Still, I believe that the industry is well-equipped to address this problem and to take measures to defend against new threats.
Privacy, too, is a hot topic, but it’s often misunderstood. We’ve seen a lot of stories in the news recently trying to claim that smart cars are spying on drivers and so on, but the reality is very different. In many ways, smart cars mirror the situation with smartphones. As consumers, we know our devices collect vast amounts of data about us, and this data is used to enhance our experience.
With vehicles, it’s similar. If we want to benefit from crowd-sourced driving information and the collective wisdom that can improve safety, individuals need to contribute data. However, Tactile Mobility and other companies are mindful of the need to handle this data responsibly, and we do put procedures in place to anonymize and protect personal information.
As for opportunities, we’re currently working on the development of new virtual sensors that can provide even deeper insights into vehicle performance and road conditions. These sensors, driven by both market needs and requests from OEMs, are tackling challenges like reducing costs and enhancing safety. As we innovate in this space, each new sensor brings vehicles one step closer to being more adaptable and safe in real-world conditions.
Another significant opportunity is in the aggregation of data across thousands, if not millions, of vehicles. Over the years, as Tactile Mobility and other companies gradually install their software in more vehicles, this data provides a wealth of insights that can be used to create advanced “tactile maps.” These maps aren’t just visual like today’s Google Maps; they can include data points on road friction, surface type, and even hazards like oil spills or black ice. This form of crowd-sourced mapping offers drivers real-time, hyper-localized insights into road conditions, creating safer roads for everyone and significantly enhancing navigation systems.
Moreover, there’s an untapped realm of possibilities in integrating tactile sensing data more fully with cloud computing. While smartphones offer extensive data about users, they can’t access vehicle-specific insights. The data gathered directly from the vehicle’s hardware – what we call the VehicleDNA™ – gives a lot more information.
By leveraging this vehicle-specific data in the cloud, smart cars will be able to deliver an unprecedented level of precision in sensing and responding to their surroundings. This can lead to smarter cities and road networks as vehicles communicate with infrastructure and each other to share real-time insights, ultimately enabling a more connected, efficient, and safer mobility ecosystem.
Finally, what are your long-term goals for Tactile Mobility, and where do you see the company in the next five to ten years?
Our aim is to continue embedding Tactile Mobility’s software in more OEMs globally, expanding our presence in vehicles connected to our cloud. We expect to continue offering some of the most precise and impactful insights in the automotive industry throughout the next decade.
Thank you for the great interview. Readers who wish to learn more should visit Tactile Mobility.