
    Increasing Autonomy and Safety through AI

    Courtesy: Renesas 

    With the growing use of autonomous vehicle capabilities, reliance on drivers and operators is shrinking, which means products with higher levels of autonomy must replicate the activities a driver or operator would otherwise perform. Vehicles and robots will have to use all the senses a human driver or operator does: not just vision and touch, but also sound and smell.

    According to SAE:

    • Level 2 Autonomy indicates that the driver must… monitor the environment.
    • Level 3 Autonomy indicates that the driver is not required to monitor the environment.
    • Level 4 Autonomy indicates that the vehicle is capable of… monitoring the environment on its own.

    Reality AI software from Renesas is already helping to monitor that environment today: not just the environment outside the product or vehicle for advanced driving or operational capabilities, but also the product itself, to facilitate predictive and preventive maintenance. With advanced signal processing techniques available to analyze the different signals generated from the environment, Reality AI tools can enable a higher level of autonomy and make the vehicle or product safer.

    For the various industries driving toward autonomy, Renesas offers the Seeing with Sound solution, which augments the current ADAS/AD sensor suite with sound as a sensing modality. When we drive, we rely not only on sight but also on hearing and even vibration to navigate our surroundings. We hear many sounds from the environment whose sources we may not be able to see immediately, or at all.

    SWS pairs the vehicle’s microphones or vibro-acoustic sensors with a Renesas MCU such as the RH850/U2A to listen for emergency vehicles, other vehicles on the road, pedestrians, and even cyclists.

    The Seeing with Sound (SWS) application addresses a problem reported recently in a New York Times article, in which a robot vehicle passed in front of an emergency vehicle with its sirens on, delaying its arrival at an emergency. The robot vehicle had a clear line of sight to the oncoming fire truck and still failed to recognize it, but it certainly would have heard the sirens had our SWS solution been implemented. The audible environment around that vehicle would have supplied the information the visual environment missed, delivering a more complete picture of the total environment.

    The Seeing with Sound solution from Renesas allows a vehicle to hear its surroundings and make decisions based on noises of interest, like the sound of emergency sirens. We sometimes overlook how much we rely on hearing when we drive. Even in current vehicles, distracted driving shifts focus away from the road, and with passive and active noise cancellation techniques now being deployed, it is harder for human drivers to perceive all the inputs from the environment. Including SWS in vehicles will make them safer and less reliant on alert drivers. Regulations such as Move Over laws for emergency vehicles and automatic emergency braking requirements are driving the implementation of technologies like SWS.
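    To make the idea concrete, here is a minimal, illustrative sketch of acoustic siren detection. SWS is a Renesas product whose internals are not described here; the sample rate, frequency band, window size, and detection rule below are all assumptions chosen for demonstration, not the SWS algorithm. The toy detector flags audio whose dominant frequency sits in a typical siren band and sweeps over time, the characteristic "wail."

    ```python
    # Toy siren detector -- illustrative only; band limits, frame size,
    # and thresholds are assumptions, not the Renesas SWS algorithm.
    import numpy as np

    FS = 16_000                     # sample rate in Hz (assumed)
    FRAME = 1024                    # analysis window length in samples
    SIREN_BAND = (500.0, 1800.0)    # approximate wail-siren range in Hz

    def dominant_freq(frame, fs=FS):
        """Return the peak frequency (Hz) of one Hann-windowed frame."""
        spec = np.abs(np.fft.rfft(frame * np.hanning(len(frame))))
        return np.fft.rfftfreq(len(frame), 1.0 / fs)[np.argmax(spec)]

    def looks_like_siren(audio, fs=FS):
        """Flag audio whose dominant frequency stays in the siren band
        and sweeps over time (the characteristic wail)."""
        peaks = np.array([dominant_freq(audio[i:i + FRAME], fs)
                          for i in range(0, len(audio) - FRAME, FRAME)])
        in_band = (peaks > SIREN_BAND[0]) & (peaks < SIREN_BAND[1])
        # A wail sweeps; a steady tone (engine hum, horn) does not.
        sweeping = np.ptp(peaks[in_band]) > 200 if in_band.any() else False
        return bool(in_band.mean() > 0.8 and sweeping)

    # Synthetic test signals: an FM "wail" sweeping 700-1300 Hz vs. a hum.
    t = np.arange(FS * 2) / FS                                  # 2 s of audio
    inst_f = 1000 + 300 * np.sin(2 * np.pi * 0.5 * t)           # wail profile
    wail = np.sin(2 * np.pi * np.cumsum(inst_f) / FS)
    hum = np.sin(2 * np.pi * 120 * t)                           # engine-like hum
    ```

    A production system would of course need robustness to road noise, Doppler shift, and multiple overlapping sources, which is where trained ML models earn their keep over hand-set thresholds like these.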

    As industries continue to trend toward autonomy, the use of AI for condition-based monitoring of vehicle or product operational status, and for predictive and preventive maintenance, will continue to rise. As drivers and operators, we typically monitor a product or vehicle’s health ourselves, scheduling required maintenance or moving the vehicle to a safe location to take a closer look. Reality AI Tools from Renesas can analyze data from sensors such as accelerometers, gyros, IMUs, and microphones, and even pressure and temperature sensors, to develop ML models that use frequency/magnitude or time-domain signal metrics to compute feature sets based on mathematical, statistical, and logarithmic formulations. These models generate results like State of Health and Remaining Useful Life metrics for applications ranging from filters to water or washer pumps to tire wear.
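    The kind of feature computation described above can be sketched as follows. This is a generic example of time- and frequency-domain features commonly used in vibration-based condition monitoring (RMS, kurtosis, spectral centroid, log band energies); the specific features and band edges are assumptions for illustration, not the Reality AI Tools feature set, which is derived automatically per application.

    ```python
    # Generic vibration feature extraction -- illustrative only; these
    # features and band edges are common choices, not the Renesas set.
    import numpy as np

    def feature_vector(signal, fs):
        """Return simple time- and frequency-domain features of a signal."""
        x = signal - signal.mean()
        rms = np.sqrt(np.mean(x ** 2))                      # overall level
        # Excess kurtosis: sensitive to impulsive faults (e.g. bearings).
        kurt = np.mean(x ** 4) / (np.mean(x ** 2) ** 2) - 3.0
        spec = np.abs(np.fft.rfft(x))
        freqs = np.fft.rfftfreq(len(x), 1.0 / fs)
        centroid = np.sum(freqs * spec) / np.sum(spec)      # spectral "center"
        # Log energy in a few fixed bands (Hz); edges are arbitrary here.
        bands = [(0, 100), (100, 500), (500, 2000)]
        log_e = [float(np.log1p(np.sum(spec[(freqs >= lo) & (freqs < hi)] ** 2)))
                 for lo, hi in bands]
        return {"rms": float(rms), "kurtosis": float(kurt),
                "centroid": float(centroid), "log_band_energy": log_e}

    # Example: one second of a 200 Hz vibration sampled at 8 kHz.
    fs = 8000
    t = np.arange(fs) / fs
    fv = feature_vector(np.sin(2 * np.pi * 200 * t), fs)
    ```

    Vectors like these, computed per window from healthy and degraded units, are what a downstream classifier or regressor consumes to produce State of Health or Remaining Useful Life estimates.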

    Consider all the different variables we monitor when we drive, even anomalies like small impacts, bumps, dings, and scratches, which can be felt as well as heard. Reality AI Tools software can analyze these different signals to classify what they may represent. What if a stone or rock impacts your windshield? After the initial shock, drivers are capable of determining what precautions should be taken. How will the car of the future know that the impact occurred? There are many different situations where the inclusion of sound or even touch or feel can improve autonomous capabilities. As industries strive for autonomy, how will autonomous products be able to adjust to the different conditions to keep drivers and passengers safe and comfortable?

    ELE Times Report
