Emotion Tech : l’antagoniste qui humanise l’IA | Othman Chiheb | TEDxPSB
By TEDx Talks
Key Concepts
- Emotional Decision-Making: Humans make most decisions emotionally (95%, according to Gerald Zaltman of Harvard Business School), not through purely rational calculation.
- Neuroscience of Emotion: Emotions are linked to specific brainwave patterns (alpha, beta, gamma) and physiological responses (heart rate, pupil dilation, sweating).
- Emotion Measurement: Technology, including AI and sensors, can now measure emotional responses through physiological data.
- EmotionTech: An emerging ecosystem of technologies focused on measuring and utilizing emotions, bridging the gap between biometric and digital data.
- AI & Emotion: The future of AI hinges on its ability to understand and potentially simulate emotions, raising ethical concerns about influence and control.
- Data vs. Emotion: Data shows what is happening; emotion reveals why it is happening.
The Primacy of Emotion in Human Decision-Making
The presentation begins by illustrating the powerful influence of emotion on purchasing decisions. Even when two products are identical in price, color, and form, the association with a figure like Steve Jobs dramatically increases desirability. This demonstrates that we don’t buy products for their features, but for the emotions they evoke – a sensation, a feeling, a desired image. Gerald Zaltman of Harvard Business School posits that emotion dictates 95% of our decisions. We buy cars for the feeling of freedom, perfume for the desire to be remembered.
Emotion also profoundly impacts memory. Individuals readily recall emotionally charged events (like the 2004 Thailand tsunami) and personal experiences linked to strong feelings (a specific goal in the 1998 World Cup). This is because emotions act as “biological markers” in the brain, a survival mechanism that reinforces positive experiences for repetition and avoids negative ones. This contrasts with the difficulty of recalling mundane details like what was eaten the day before.
The Neuroscience of Emotion & Its Measurement
The speaker explains the scientific basis of emotion, focusing on brainwave activity. Different brainwave frequencies correlate with different states:
- Alpha Waves: Associated with calmness and relaxation.
- Beta Waves: Linked to vigilance and analytical thought.
- Gamma Waves: Crucial for activating different brain areas to process information and understand situations.
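As a rough illustration of how these bands can be separated from a raw recording, here is a minimal sketch. The talk names no tools or parameters; NumPy, the 256 Hz sampling rate, and the band edges below are assumptions (exact edges vary from lab to lab):

```python
import numpy as np

# Conventional EEG frequency bands in Hz (illustrative edges).
BANDS = {"alpha": (8, 12), "beta": (13, 30), "gamma": (30, 80)}

def band_powers(signal, fs=256):
    """Estimate each band's share of total spectral power in a 1-D trace."""
    freqs = np.fft.rfftfreq(len(signal), d=1 / fs)
    power = np.abs(np.fft.rfft(signal)) ** 2
    total = power[(freqs >= 1) & (freqs <= 80)].sum()
    return {
        name: power[(freqs >= lo) & (freqs < hi)].sum() / total
        for name, (lo, hi) in BANDS.items()
    }

# Synthetic "relaxed" signal: a dominant 10 Hz (alpha) rhythm plus noise.
np.random.seed(0)
t = np.arange(0, 4, 1 / 256)
relaxed = np.sin(2 * np.pi * 10 * t) + 0.1 * np.random.randn(len(t))
print(band_powers(relaxed))  # alpha share should dominate
```

A production system would use a windowed spectral estimate (e.g. Welch's method) rather than a single raw FFT, but the principle is the same: each emotional state leaves a different distribution of power across these bands.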
To illustrate, the speaker deliberately introduced a disruptive element (a visual “bug”) during the presentation. This triggered a measurable emotional response in the audience: alpha waves decreased, beta waves spiked, gamma waves activated, heart rates rose, breathing became shallow, and pupils dilated – together, a textbook physiological signature of emotion.
This physiological response is measurable. The speaker’s research laboratory utilizes sensors to track brainwaves, skin conductance (stress), heart rate, respiration, and pupillometry. AI analyzes this data in real-time to quantify emotions, even those individuals struggle to articulate.
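The talk does not describe the laboratory's actual model, but the fusion of several physiological channels into one emotional-arousal estimate can be sketched in a toy form. The `Sample` fields, the baseline comparison, and the equal weighting are all assumptions for illustration:

```python
from dataclasses import dataclass

@dataclass
class Sample:
    heart_rate: float        # beats per minute
    skin_conductance: float  # microsiemens (sweat-gland activity)
    pupil_diameter: float    # millimetres

def arousal_score(current: Sample, baseline: Sample) -> float:
    """Average relative change from a calm baseline across all channels.
    Positive values suggest heightened physiological arousal."""
    deltas = [
        (current.heart_rate - baseline.heart_rate) / baseline.heart_rate,
        (current.skin_conductance - baseline.skin_conductance)
            / baseline.skin_conductance,
        (current.pupil_diameter - baseline.pupil_diameter)
            / baseline.pupil_diameter,
    ]
    return sum(deltas) / len(deltas)

calm = Sample(heart_rate=62, skin_conductance=2.0, pupil_diameter=3.0)
startled = Sample(heart_rate=95, skin_conductance=5.5, pupil_diameter=4.2)
print(round(arousal_score(startled, calm), 2))
```

Real systems replace this naive averaging with trained models, but the core idea holds: several weak physiological signals, combined, quantify an emotion the subject may be unable to articulate.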
Applications of Emotion Measurement Across Industries
The ability to measure emotion has significant implications across various sectors:
- Healthcare: Distinguishing between pain and fear in infants, assessing a surgeon’s concentration levels during procedures. The principle is that “you can’t improve what you can’t measure.”
- Consumer Research (Retail): Moving beyond subjective, potentially biased self-reporting in focus groups. AI provides an “emotional photograph” to validate or refute consumer statements, offering a more granular understanding of preferences. Traditional methods rely on declarative statements ("I like this"), while AI can reveal the intensity and nuance of those feelings.
- Mobility & Safety: Detecting driver fatigue or intoxication through brainwave analysis, potentially preventing accidents by disabling a vehicle if the driver is impaired.
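The talk gives no algorithm for the driver-safety case; one common heuristic in drowsiness research is the ratio of slow (alpha) to vigilant (beta) brainwave power. A toy gate under that assumption, with an illustrative threshold:

```python
def driver_alert(alpha_power: float, beta_power: float,
                 threshold: float = 1.5) -> str:
    """Flag possible drowsiness when relaxed (alpha) activity outweighs
    vigilant (beta) activity by a margin. Threshold is illustrative."""
    ratio = alpha_power / max(beta_power, 1e-9)  # guard against div-by-zero
    return "impaired" if ratio > threshold else "ok"

print(driver_alert(alpha_power=3.0, beta_power=1.0))  # drowsy pattern
print(driver_alert(alpha_power=0.5, beta_power=1.0))  # alert pattern
```

A deployed system would smooth the ratio over a time window and combine it with eye-tracking before taking an action as drastic as disabling the vehicle.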
The Rise of EmotionTech & the Future of AI
The speaker highlights the accelerating growth of connected devices (fridges, toothbrushes, etc.) and identifies two technologies poised for significant societal impact: connected glasses and robots. Sales of connected glasses increased by 110% between 2024 and 2025, exceeding smartphone growth. These glasses can integrate sensors to detect fatigue or migraines and automatically adjust lens tint.
The speaker predicts a future with one humanoid robot for every three humans by 2060. The common denominator between robots and connected glasses is Artificial Intelligence. However, current AI is limited by its reliance on data analysis alone.
A critical incident involving an AI encouraging a person to commit suicide is cited as an example of the dangers of AI lacking emotional understanding. If AI could feel empathy, it could have recognized the individual’s distress and responded appropriately.
The speaker argues that an AI without emotional analysis understands gestures but not intentions; it can process words but not feelings. Such an AI lacks empathy and cannot truly interact with humans.
The Ethical Implications of Emotionally Intelligent AI
The presentation culminates in a discussion of the ethical challenges posed by AI’s increasing ability to understand and potentially simulate emotions. The speaker introduces the concept of EmotionTech – an ecosystem of technologies bridging the biometric and digital worlds.
The core argument is that emotion will nourish AI, and AI will, in turn, enhance our understanding of ourselves. However, this raises profound questions:
- The Line Between Understanding and Simulation: If AI can convincingly simulate emotion, what distinguishes it from a human being?
- Intentions and Opinions: Could AI develop its own intentions and opinions?
- The Illusion of Affection: Could AI create the illusion of love?
The speaker warns that an AI lacking emotional understanding can be dangerous inadvertently, while an AI that understands emotions too well could be dangerous deliberately, through influence and control.
The fundamental question is no longer what AI can learn, but what part of ourselves we are willing to offer to it. If AI can understand our emotions better than we do ourselves, what remains of our humanity?
Notable Quote:
“Nous n'achetons pas un produit, nous achetons une émotion.” (“We don’t buy a product, we buy an emotion.”) – Speaker, emphasizing the primacy of emotional drivers in consumer behavior.
Technical Terms:
- Brainwaves (Ondes cérébrales): Electrical signals emitted by the brain, categorized by frequency (alpha, beta, gamma).
- Pupillometry: The measurement of pupil dilation, used as an indicator of cognitive and emotional arousal.
- Skin Conductance (Sudation): Measurement of sweat gland activity, used as an indicator of stress and emotional arousal.
- EmotionTech: An emerging ecosystem of technologies focused on measuring and utilizing emotions.
- Humanoid Robots: Robots designed to resemble and interact with humans.
- Generative AI: AI capable of generating new content, such as text, images, or code.
- General AI: Hypothetical AI with human-level intelligence, capable of performing any intellectual task that a human being can.