AI + ML

This app knows how you feel – from the look on your face

Imagine if you were able to determine the current mood of your audience

Rana el Kaliouby's TED Talk inspired me a lot and made me think further: imagine if you were able to determine the current mood of your audience. A relaxed and happy user is more open to additional content; you could, for example, show more product suggestions.

A tired and frustrated visitor, on the other hand, would get only a simplified version of the same page. All content and app screens could be tailored to the user's current emotional state. Personalisation at its best! We could cheer users up with images or copy. The possibilities are endless (though the device's camera must be switched on); a rough sketch of the idea follows below.
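To make this concrete, here is a minimal TypeScript sketch of the browser side of that flow. Only the camera access via `getUserMedia` is a standard Web API; `classifyEmotion` and the three UI hooks are hypothetical stand-ins for a real emotion-recognition SDK and the app's own layout code.

```typescript
// Minimal sketch of emotion-driven personalisation in a web app.
// `classifyEmotion` is a hypothetical stand-in for a real classifier
// (e.g. a vendor SDK or an on-device model); it is not a real API.

type Mood = 'happy' | 'relaxed' | 'tired' | 'frustrated' | 'neutral';

// Hypothetical: runs a face/emotion model on one video frame.
declare function classifyEmotion(frame: ImageBitmap): Promise<Mood>;

// Hypothetical UI hooks, to be implemented by the app itself.
declare function showProductSuggestions(): void;
declare function showSimplifiedLayout(): void;
declare function showDefaultLayout(): void;

async function tailorPageToMood(): Promise<void> {
  // Ask for camera access; the user may decline (see the fallback idea below).
  const stream = await navigator.mediaDevices.getUserMedia({ video: true });
  const track = stream.getVideoTracks()[0];

  // Grab a single frame from the live camera stream.
  const video = document.createElement('video');
  video.srcObject = stream;
  await video.play();
  const frame = await createImageBitmap(video);

  const mood = await classifyEmotion(frame);
  track.stop(); // release the camera as soon as we are done

  switch (mood) {
    case 'happy':
    case 'relaxed':
      showProductSuggestions(); // open, relaxed users get richer content
      break;
    case 'tired':
    case 'frustrated':
      showSimplifiedLayout();   // stressed users get only the essentials
      break;
    default:
      showDefaultLayout();
  }
}
```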

Update, 25 May 2021: Smart Eye has acquired Affectiva, the company Rana el Kaliouby founded: https://go.affectiva.com/smarteye-acquires-affectiva

What if the camera is not available, or the user has not granted access? I can think of additional signals for measuring emotion and mood: typing speed, spelling errors, movement of the phone, sound level, and so on.
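As a sketch of those camera-free signals, the snippet below tracks typing cadence via keyboard events and rough device shake via the `devicemotion` event, both standard Web APIs. The thresholds and the mapping to "frustration" are purely illustrative assumptions, not validated heuristics.

```typescript
// Sketch of camera-free mood proxies: typing cadence and device motion.
// All cutoffs here are illustrative guesses, not validated values.

const keyTimestamps: number[] = [];

document.addEventListener('keydown', () => {
  const now = performance.now();
  keyTimestamps.push(now);
  // Keep a sliding window of the last 30 seconds of keystrokes.
  while (keyTimestamps.length && now - keyTimestamps[0] > 30_000) {
    keyTimestamps.shift();
  }
});

// Keystrokes per second over the window; erratic bursts may hint at stress.
function typingSpeed(): number {
  if (keyTimestamps.length < 2) return 0;
  const span =
    (keyTimestamps[keyTimestamps.length - 1] - keyTimestamps[0]) / 1000;
  return span > 0 ? keyTimestamps.length / span : 0;
}

// Rough shake detector: count large acceleration spikes from the motion sensor.
let shakeCount = 0;
window.addEventListener('devicemotion', (e: DeviceMotionEvent) => {
  const a = e.acceleration;
  if (!a) return; // not all devices expose acceleration data
  const magnitude = Math.hypot(a.x ?? 0, a.y ?? 0, a.z ?? 0);
  if (magnitude > 15) shakeCount++; // ~15 m/s² is an arbitrary cutoff
});

// Combine the signals into a crude frustration guess.
function seemsFrustrated(): boolean {
  return typingSpeed() > 6 || shakeCount > 3; // illustrative thresholds
}
```

An app could poll `seemsFrustrated()` and fall back to the simplified layout from the earlier sketch whenever it returns true; in practice such heuristics would need real-world calibration per device and per user.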

On a side note: an Emotion Sensor use case