Cars with feelings – a new travelling companion that understands you.

June 21, 2019

Many people spend a lot of time in a car, and for some it is even more than they spend at work or at home. A commute between home and the workplace can take as much as 2-5 hours a day, and salespeople virtually live on the road, driving long distances to meet customers.

So, how cool would it be if your car were a trusted friend, one that could recognize your emotions? It would be like Herbie, the sentient car from the Herbie film series, or Bumblebee, the heroic yellow car from the Transformers films. How awesome would it be if the car could even understand you? Well, this dream is actually coming true. Affectiva, a US start-up building on its work at the MIT Media Lab, has installed a new algorithm in its prototype car that uses Deep Learning methods and works with connected devices such as cameras, a computer, a microphone, and software pre-installed in the cabin.

This is how it works: the camera sensor captures video of the driver's face, and Affectiva Automotive AI analyzes the driver's mood from recorded movements such as raised eyebrows, blinking eyes, or a yawning mouth. These subtle movements are picked up by a face detector that captures both the emotions and the movements of the face before saving them. An RGB or Near-IR sensor analyzes the face at pixel level to classify facial expressions and emotions. As soon as the driver is at risk of behavior that might cause an accident, an alarm goes off to alert them immediately, so that they refocus on their driving and the chance of a road accident is reduced. Alternatively, if the driver is not in a good mood, a soothing music playlist is selected automatically to help them calm down. Meanwhile, a sound detector captures the driver's speech, which is analyzed after the first 1.2 seconds by the Deep Learning network to predict emotions or situations. The recorded data is not sent to the cloud but is processed entirely within the vehicle.
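To make the decision loop concrete, here is a minimal sketch of the sense-classify-act cycle described above. The feature names, thresholds, emotion labels, and actions are all illustrative assumptions for this article, not Affectiva's actual model, which classifies expressions from video frames with deep learning:

```python
# Illustrative in-cabin decision loop. All names and thresholds are
# assumptions for demonstration, not Affectiva's real system.

DROWSY_BLINK_RATE = 0.5  # blinks/second above which the driver is treated as drowsy (assumed)

def classify_state(features: dict) -> str:
    """Map simple facial-movement features to a coarse driver state."""
    if (features.get("eyes_closed_ratio", 0.0) > 0.4
            or features.get("blink_rate", 0.0) > DROWSY_BLINK_RATE):
        return "drowsy"
    if features.get("brow_furrow", 0.0) > 0.6:
        return "stressed"
    return "neutral"

def decide_action(state: str) -> str:
    """Pick a response; note that everything runs locally in the vehicle."""
    if state == "drowsy":
        return "sound_alarm"            # alert the driver immediately
    if state == "stressed":
        return "play_calming_playlist"  # soothing music to calm the driver
    return "no_action"

# Example: a frame where the driver's eyes are mostly closed
frame_features = {"eyes_closed_ratio": 0.55, "blink_rate": 0.2}
print(decide_action(classify_state(frame_features)))  # sound_alarm
```

The key design point the article highlights survives even in this toy version: the loop consumes only per-frame features and returns an action, so it can run entirely on in-vehicle hardware without sending video to the cloud.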

In the future, we may see other devices around us that can understand our emotions as well, like a mobile phone that knows when you are upset and suggests holding off on taking a call. The entertainment industry is already using something similar to test whether an audience is satisfied with an advertisement. Another example is the Teatreneu comedy club in Spain, which collaborated with the agency Cyranos McCann to develop a system called Pay Per Laugh. The theater charges no entrance fee; instead, a tablet installed in front of each audience member tracks the movements of their face, and every laugh or smile is charged at 0.30 Euro (about 12 Baht), up to a maximum of 24 Euro (about 980 Baht). Similar technology is also being used to help children with autism, who often find it hard to communicate their emotions, be understood better. This innovation is definitely something to keep an eye on as it moves into wider real-world use.
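The Pay Per Laugh pricing rule above reduces to simple arithmetic: a flat rate per detected laugh, capped per show. A minimal sketch, with illustrative names (the theatre's actual billing software is not public):

```python
# Hedged sketch of the Pay Per Laugh billing rule: 0.30 Euro per
# detected laugh, capped at 24 Euro. Names are illustrative only.

PRICE_PER_LAUGH_EUR = 0.30
MAX_CHARGE_EUR = 24.00

def ticket_charge(laugh_count: int) -> float:
    """Total charge for one audience member, capped at the maximum."""
    return round(min(laugh_count * PRICE_PER_LAUGH_EUR, MAX_CHARGE_EUR), 2)

print(ticket_charge(10))   # 3.0
print(ticket_charge(100))  # 24.0 (the cap is reached at 80 laughs)
```

Note the incentive the cap creates: past 80 laughs the show is effectively all-you-can-laugh, so a very funny performance costs no more than the old flat ticket would.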

Read more >> Autonomous Driving: A World Has Changed
