With Fortune 50 Clients and $1M+ in Revenue, Affectiva Proves Market for Measuring Emotion
WHAT: Tools for measuring emotion in research and business outside a lab. The Q Sensor, worn on the wrist, measures emotional arousal (via skin conductance), temperature and activity, and will soon have streaming capabilities. Affdex uses a webcam to read facial expressions and can determine whether someone likes or is paying attention to the commercial or movie trailer she is watching.
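To make "measuring emotional arousal" concrete, here is a minimal, hypothetical Python sketch that flags sharp rises in skin-conductance readings of the kind a wrist sensor logs. The sampling rate, threshold and the find_arousal_peaks helper are illustrative assumptions, not Affectiva's algorithm or data format.

```python
# Illustrative sketch only -- not Affectiva's algorithm. Flags moments where
# skin conductance rises quickly, a crude proxy for emotional arousal.
import numpy as np

def find_arousal_peaks(eda, fs=8.0, rise_threshold=0.05):
    """Return sample indices where conductance (microsiemens) climbs faster
    than rise_threshold uS/s; fs is the assumed sampling rate in Hz."""
    rate = np.gradient(eda) * fs   # per-sample change -> uS per second
    return np.flatnonzero(rate > rise_threshold)

# Synthetic 60-second recording at 8 Hz with one sharp rise around t = 30 s.
rng = np.random.default_rng(1)
t = np.arange(0, 60, 1 / 8.0)
eda = 2.0 + 0.3 / (1 + np.exp(-(t - 30))) + 0.002 * rng.normal(size=t.size)

peaks = find_arousal_peaks(eda)
print("arousal-like rises near t =", np.round(t[peaks[:5]], 2), "seconds")
```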
LAUNCHERS: Co-founders are Rana el Kaliouby (CTO) and Rosalind Picard (chief scientist and a leader in affective computing). CEO David Berman helped build WebEx, which Cisco acquired.
WHY: Lab testing is more invasive, labor-intensive and expensive than measuring emotion in the field. Marketers can measure actions but cannot assess emotion. Webcams capture good-quality video and are ubiquitous. Emotional data can also be used in social networking and gaming.
WHEN/WHERE: Q Sensor launched in May 2011; Affdex beta launched in March 2011, with a public launch planned for early 2012 / Waltham, Massachusetts. Company founded in 2009. Also an office in San Jose for sales and marketing.
BACKSTORY: Both products rooted in helping people with autism. Technology originally developed at the Massachusetts Institute of Technology Media Lab. The wristband has helped caregivers and educators understand why an autistic child is having an episode. Q Sensor also used with veterans suffering from post-traumatic stress disorder and drug addicts.
Rana began studying affective computing because she spent so much time in front of computers and was frustrated that they were "oblivious" to people's emotions. She was a PhD student at the University of Cambridge when Rosalind came there to keynote an important conference. Rosalind eventually brought Rana to MIT as a post-doc.
HOW AFFDEX WORKS: Rana says Affdex "learned" to read emotions by processing thousands of images of expressions humans automatically recognize as happy or sad. Recruiting volunteers through social media and other channels, the company has recorded expressions from 9M faces. Some people followed a two-minute exercise instructing them to look surprised, angry, etc., and others agreed to be recorded watching a short video.
The value of the two types of images does not differ, but, says Rana, "Spontaneous expressions take a different neural pathway than the acted expressions. They differ in the speed of how they happen, how long they persist." For example, a spontaneous smile has multiple peaks as someone smiles or laughs at a funny scene or joke.
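A rough sketch of the general approach Rana describes: train a supervised classifier on feature vectors extracted from labeled expression images. The features, labels and SVM model below are stand-ins chosen for illustration; Affdex's actual features and models are not public.

```python
# Minimal sketch (not Affectiva's pipeline): supervised learning from
# human-labeled facial-expression examples.
import numpy as np
from sklearn.model_selection import train_test_split
from sklearn.svm import SVC
from sklearn.metrics import classification_report

rng = np.random.default_rng(0)

# Hypothetical dataset: each row is a feature vector extracted from one face
# image (e.g. landmark distances, texture descriptors); each label is the
# expression a human coder assigned.
n_samples, n_features = 600, 32
X = rng.normal(size=(n_samples, n_features))
y = rng.choice(["smile", "surprise", "neutral"], size=n_samples)

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.2, random_state=0)

# A plain SVM stands in for whatever model Affdex actually uses; the same
# recipe applies to acted and spontaneous examples alike.
clf = SVC(kernel="rbf", C=1.0)
clf.fit(X_train, y_train)

print(classification_report(y_test, clf.predict(X_test)))
```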
BUSINESS MODEL: Selling hardware (Q Sensor costs about $1,800), consulting services and SaaS subscriptions.
GOAL: Democratize market research. More generally: "How we can use emotion to improve lives and improve communication," says David.
COMPETITION: Existing research methods like surveys and focus groups. Affective Interfaces (for Affdex). David says, "We anticipate more competition" but adds that the company has "lots of IP to protect us."
REVENUE: "Well over $1M in sales" to date, David says.
CUSTOMERS: Fortune 50 companies (not disclosing names), ad agencies and academic researchers. Universities using Affectiva products include Stanford, Harvard and Notre Dame.
ON PRIVACY ISSUES: David knows this is a concern and says Affectiva is committed to keeping the technology opt-in. "We haven’t gotten into the digital billboard market for that reason," he says. David believes people will opt in so they can see commercials that appeal to them, for example.
SOFTWARE PRODUCTS BUILT ON: C++, Python, Ruby on Rails, Flash.
WHO BACKED IT: Kantar (WPP's consumer insights group), Myrian Capital, Peder Sager Wallenberg Charitable Trust. Also received a grant from the National Science Foundation.
TOTAL RAISED: $7.7M, including a $5.7M Series B in July 2011.
NUMBER OF EMPLOYEES: 20
SCREEN SHOTS (AFFDEX)
[ Play with the demo here. ]
The attention, smile and lowered-eyebrow results for a particular video appear in a color-coded line chart. Show all three metrics or choose just one in the top nav bar. In the chart itself, click on a peak or valley to jump to that moment in the video.
Within the attention, smile and lowered-eyebrow charts, choose to see how engaging, creative and effective the video was over time (engaging is the default). The green line shows the highest level, the pink line the lowest.
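To illustrate the kind of chart the demo draws, here is a small plotting sketch of per-frame attention, smile and lowered-eyebrow traces over a video's timeline. The values are synthetic placeholders, not Affdex output.

```python
# Synthetic example of per-frame metric traces like those in the Affdex demo.
import numpy as np
import matplotlib.pyplot as plt

t = np.linspace(0, 30, 300)  # seconds into a 30-second spot
attention = 0.7 + 0.2 * np.sin(t / 4)
smile = np.clip(0.1 + 0.5 * np.exp(-((t - 18) ** 2) / 8), 0, 1)
brow = 0.2 + 0.1 * np.cos(t / 3)

fig, ax = plt.subplots()
for label, series in [("attention", attention), ("smile", smile),
                      ("lowered eyebrow", brow)]:
    ax.plot(t, series, label=label)
ax.set_xlabel("time (s)")
ax.set_ylabel("metric intensity (0-1)")
ax.legend()
plt.show()
```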
FURTHER READING
1. "Affective Computing" (Rosalind Picard's book, published in 2000, Google Books)
2. "Improving Lives With Emotionally Intelligent Technology" (Rana el Kaliouby talk at TEDx Cairo, May 8, 2010)
3. "Emotional, not factual, ads win skeptical consumers, study says" (Medical News Today, August 16, 2005)
CONTACTS & LINKS
David Berman
Twitter: @daveberman
LinkedIn: http://www.linkedin.com/pub/david-berman/1/625/b77
Rosalind Picard
Twitter: @RosalindPicard
LinkedIn: http://www.linkedin.com/pub/rosalind-picard/0/1bb/11
Rana el Kaliouby
Email: kaliouby at affectiva dot com
Twitter: @kaliouby
LinkedIn: http://www.linkedin.com/in/kaliouby
Affectiva
Twitter: @affectiva
Jobs: http://www.affectiva.com/about/careers/