
How Machine Learning  can give emotions to driverless cars


One of the most commendable achievements in technology, the ‘emotion recognition technique’, has transformed people’s consumption habits, especially in the automotive industry, thanks to autonomous vehicle technology.

By Manish Gupta

The ability to read emotions on people’s faces and smartly navigate one’s way through complex situations is envied and appreciated at the same time. As technological convergence takes control, becoming a vital lifeline for many people, computers have evolved and today have taken on the onus of recognizing images, words, sounds and even facial expressions. One of the most commendable achievements in technology, the ‘emotion recognition technique’, has transformed people’s consumption habits, especially in the automotive industry. Thanks to autonomous vehicle technology!

Autonomous vehicle technology, made possible through Machine Learning (ML) algorithms and techniques, has elevated the overall in-vehicle experience of a user to levels hitherto unheard of. The new driverless vehicles are equipped with speedometers, GPS and torque sensors that act as powerful aids in identifying driving-style patterns, and in the process have made driving a whole lot easier. These patterns cover a diverse range, from a cautious driver to one whose aggression and rage is reflected on the face, or one who speeds while switching lanes, among others. Additionally, enabled through embedded real-time systems that map data and analyze a driver’s personality, autonomous systems now successfully recognize emotional states such as happiness, sadness, anger, contempt, fear, disgust, neutrality and surprise, as reflected in facial expressions.
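By way of illustration, a minimal sketch of such an emotion classifier is shown below. It assumes a small convolutional network over 48x48 grayscale face crops (as in the public FER-style datasets); the layer sizes, input shape and eight-way label set are illustrative assumptions, not the configuration of any production in-cabin system.

import tensorflow as tf
from tensorflow.keras import layers, models

# The eight emotional states named above, one output per class.
EMOTIONS = ["happiness", "sadness", "anger", "contempt",
            "fear", "disgust", "neutral", "surprise"]

def build_emotion_model(input_shape=(48, 48, 1), num_classes=len(EMOTIONS)):
    """A small CNN that maps a face crop to a probability per emotion (illustrative only)."""
    model = models.Sequential([
        layers.Input(shape=input_shape),
        layers.Conv2D(32, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Conv2D(64, 3, activation="relu", padding="same"),
        layers.MaxPooling2D(),
        layers.Flatten(),
        layers.Dense(128, activation="relu"),
        layers.Dropout(0.3),
        layers.Dense(num_classes, activation="softmax"),
    ])
    model.compile(optimizer="adam",
                  loss="sparse_categorical_crossentropy",
                  metrics=["accuracy"])
    return model

model = build_emotion_model()
model.summary()  # would be trained on labelled face images before deployment in-cabin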

As smart cars become increasingly popular, tracking driver behavior behind the wheel becomes essential, so that early warning signals, such as drowsiness that could lead to a forward collision, are raised well before the danger point. The reasoning is the need to deploy sensors that detect dangers and pre-empt accidents, or else trigger corrective action. These sensor-based systems, known as Advanced Driver Assistance Systems (ADAS), are proficient in detecting road conditions and driver behavior, using external elements such as cameras and internal ones such as the driver’s facial expression and body language. Besides sensors, ADAS can read and analyze data about people’s or the machine’s status, such as social media and vehicle social networks, to pre-emptively suggest actions to the driver. This could also include data on traffic congestion.
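To make the early-warning idea concrete, here is a hypothetical drowsiness monitor of the kind an ADAS might run on the in-cabin camera feed. It assumes some upstream face-landmark model supplies a per-frame eye-openness score between 0 and 1; the thresholds and the 30-second window are illustrative assumptions, not values from any production system.

from collections import deque

class DrowsinessMonitor:
    """Raises a warning when the fraction of closed-eye frames in a window gets too high."""
    def __init__(self, fps=10, window_seconds=30,
                 closed_threshold=0.2, alarm_fraction=0.4):
        self.window = deque(maxlen=fps * window_seconds)
        self.closed_threshold = closed_threshold
        self.alarm_fraction = alarm_fraction

    def update(self, eye_openness):
        """Add one frame's eye-openness score; return True if a warning is due."""
        self.window.append(eye_openness < self.closed_threshold)
        if len(self.window) < self.window.maxlen:
            return False  # not enough history yet
        closed_fraction = sum(self.window) / len(self.window)
        return closed_fraction > self.alarm_fraction

monitor = DrowsinessMonitor()
if monitor.update(0.15):  # called once per camera frame in practice
    print("Early warning: driver appears drowsy")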

Driver assistance systems, while monitoring a driver’s behavior, can also predict the user’s next action. This learning can incorporate external context, such as time of day and the car’s location, to learn patterns in user behavior. There has been a lot of recent research into automated systems that anticipate dangerous maneuvers and alert drivers before they perform them, also giving the ADAS more time to prepare for the danger. For this purpose, the car is equipped with two cameras and computing devices to capture internal and external data. Deep learning techniques such as recurrent neural networks can use data from these cameras, along with sensor inputs like GPS and speed, to anticipate maneuvers 3.5 seconds before they occur, with over 80% F1-score in real time.
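A minimal sketch of such a recurrent model follows. It assumes a short window of fused features (head pose from the cabin camera, road features from the outside camera, GPS and speed) and a small set of maneuver classes; the feature sizes, window length and architecture are illustrative, not the published research system.

import tensorflow as tf
from tensorflow.keras import layers, models

# Assumed maneuver classes the model tries to anticipate a few seconds ahead.
MANEUVERS = ["left_lane_change", "right_lane_change",
             "left_turn", "right_turn", "go_straight"]

TIMESTEPS = 20   # e.g. 20 frames sampled over the last few seconds
FEATURES = 32    # fused cabin-camera, road-camera, GPS and speed features per frame

model = models.Sequential([
    layers.Input(shape=(TIMESTEPS, FEATURES)),
    layers.LSTM(64),                                      # recurrent layer over the window
    layers.Dense(len(MANEUVERS), activation="softmax"),   # probability per maneuver
])
model.compile(optimizer="adam", loss="sparse_categorical_crossentropy")

# An F1-style evaluation, as reported for the real system, would compare the predicted
# maneuver against what the driver actually did a few seconds later.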

For instance, if a driver looks away from the road for lengths of time that could adversely impact safety and wellbeing, the smart car agent system will immediately issue a warning signal, drawing the driver’s attention back to the road. In a scenario like this, the in-vehicle cameras track the driver’s head movements and direction of gaze and send the images to face-detection software. The software then combines this information with the data captured from the external surroundings (by the external cameras), and if the analysis predicts a possible threat to safety, the system transmits a warning signal. However, when an impending threat is more immediate in nature, the smart agent automatically applies the brakes to prevent or mitigate a collision.
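A hypothetical version of that fusion logic, reduced to a single decision rule, might look as follows; the two-second gaze limit and the 1.5-second time-to-collision cutoff are invented for illustration.

def assess(gaze_off_road_seconds, external_threat, time_to_collision_s=None):
    """Return the smart-car agent's action for the current frame (illustrative rule only)."""
    if external_threat and time_to_collision_s is not None and time_to_collision_s < 1.5:
        return "apply_brakes"   # imminent threat: the agent acts on its own
    if gaze_off_road_seconds > 2.0:
        return "warn_driver"    # prolonged distraction: pull attention back to the road
    return "no_action"

print(assess(gaze_off_road_seconds=3.2, external_threat=False))  # -> "warn_driver"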

The in-vehicle cameras are arranged to capture data that enables the ADAS to understand the gestures and emotional state of the driver at any point in time. A case in point to illustrate how this works: if the car suffers a part failure or degrading performance and the driver is frightened by the incident, his entire body language and facial expression will change; it is then that body-language monitoring and emotion recognition kick in to decide whether the driver can resume control of the car within a given time frame. If the reading indicates a weak response from the driver, the smart car agent switches to acceleration control, braking or another alternative safety mode.

ADAS are also paired with special sensors which, when embedded in the steering wheel of a vehicle, can detect drunken driving. Alcohol can be detected in the palm of the hand less than five minutes after initial ingestion. If the alcohol content detected is above a preset limit, the vehicle can be immobilized. There is more: research is further refining connected-car technologies. Ashutosh Saxena, an Assistant Professor of Computer Science at Cornell University, and his team are working on advanced sensors that monitor the driver’s grip on the steering wheel, along with pressure sensors that detect foot movements, going a step further in understanding a driver’s intentions.
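As a sketch, the immobilization rule itself is simple; the sensor reading, its units and the preset limit used here are illustrative assumptions rather than values from any real interlock.

# Assumed threshold for the steering-wheel alcohol sensor (illustrative units and value).
PRESET_LIMIT_MG_PER_L = 0.25

def may_start_vehicle(alcohol_reading_mg_per_l):
    """Return True if the vehicle may start, False if it should stay immobilized."""
    return alcohol_reading_mg_per_l <= PRESET_LIMIT_MG_PER_L

if not may_start_vehicle(0.31):
    print("Alcohol above preset limit: vehicle immobilized")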

Undoubtedly, driverless cars are altering the face of the automotive industry, while also delivering benefits for road safety, reducing emissions and congestion, enhancing users’ productivity and enabling social inclusion. A study by the Eno Center for Transportation, a non-profit group, states that if about 90% of cars on American roads were autonomous, the number of accidents would fall from six million a year to 1.3 million.

While safety is a key selling point for automated cars, vehicle management is the secondary selling point for drivers and fleet owners. A growing range of technologies offers vehicle performance and maintenance monitoring software, making predictive analysis of the vehicle’s health available. This analysis may later be distributed to OEM service channels and dealership service centers. Additionally, linear regression models are used to predict the next service date of the vehicle or its systems, depending on their current status.
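A minimal sketch of that linear-regression idea, using invented vehicle-health features and toy numbers purely for illustration, could look like this:

from datetime import date, timedelta
import numpy as np
from sklearn.linear_model import LinearRegression

# Columns (assumed): mileage since last service (km), brake-pad wear (%), oil quality index.
X_train = np.array([
    [4000, 20, 0.9],
    [9000, 45, 0.7],
    [15000, 70, 0.4],
    [21000, 90, 0.2],
])
y_train = np.array([180, 120, 60, 15])  # days until service was actually due (toy data)

model = LinearRegression().fit(X_train, y_train)

current_status = np.array([[12000, 55, 0.5]])          # the vehicle's current readings
days_left = float(model.predict(current_status)[0])
print("Predicted next service date:", date.today() + timedelta(days=round(days_left)))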

Automotive companies today are leading the way in bringing mobility solutions to their cars to meet drivers’ changing expectations, as well as to enhance safety. These automakers and OEMs see their cars as technology platforms, striking the right balance between using data to create intelligent and personal experiences, maintaining privacy and security, and offering more natural human-computer interfaces. Companies like Volvo Cars, Nissan, Toyota, Delphi, Ford and Harman are already working with Microsoft to deliver intelligent car technologies.

To conclude, the advent of driverless and automated-assist technologies undoubtedly adds a layer of safety and protection, allowing the driver to enjoy a unique driving experience. Having said that, there is still a need for an extra level of safety, encouraging ‘driver-watching’ or ‘taking the wheel occasionally’ approaches to further strengthen the safety layer.

The author is Senior Applied Researcher, Microsoft India R&D Private Limited
