Can facial expressions really reveal emotions? Yes… but not the whole story. #Facial_Expression_Analysis (FEA) captures what people express, not everything they feel internally. Built on the Facial Action Coding System (#FACS), our FEA technology maps subtle facial muscle movements to observable expressions. However, not everyone expresses emotions the same way: culture, personality, and context all shape what actually shows up on the face. That's why strong research doesn't look at expressions in isolation; it establishes individual baselines to understand what's meaningful for each participant. 👉 The real value isn't just in the data; it's in knowing how to interpret it. Do you have a behavioral research or biometric question for our experts? Put it in the comments below and we'll do our best to answer it!
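The per-participant baseline idea above can be sketched in a few lines. This is an illustrative toy, not the iMotions API: the function name, the z-scoring approach, and all numbers are assumptions for the sake of the example. The point is simply that each participant's expression intensities are judged against their own neutral period rather than a fixed threshold.

```python
# Hypothetical sketch of per-participant baseline correction for
# expression-intensity scores. Names and data are illustrative only.

def baseline_correct(trial_scores, baseline_scores):
    """Express trial intensities relative to a participant's own
    neutral-baseline mean, so 'meaningful' is defined per person."""
    n = len(baseline_scores)
    mean = sum(baseline_scores) / n
    # Population standard deviation of the baseline period.
    var = sum((s - mean) ** 2 for s in baseline_scores) / n
    std = var ** 0.5 or 1.0  # guard against a perfectly flat baseline
    return [(s - mean) / std for s in trial_scores]

# One participant may smile broadly at rest; another barely at all.
neutral = [0.20, 0.25, 0.22, 0.21]    # resting "joy" scores
stimulus = [0.24, 0.60, 0.85, 0.30]   # scores while viewing an ad
z = baseline_correct(stimulus, neutral)
```

A raw score of 0.30 might be unremarkable for an expressive participant but a strong signal for a stoic one; baseline-relative scores make the two comparable.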
About us
iMotions is now a Smart Eye company. In October 2021, Smart Eye acquired iMotions. iMotions continues to be independently run, while the combined companies work to create the first powerhouse for analyzing emotional, cognitive, and behavioral data, delivering holistic human insights. The iMotions software helps unpack human behavior by combining data from multiple biosensors to uncover underlying emotions and responses. The platform enables research in market studies, academia, healthcare, UX, automotive, advertising, education, and more. iMotions is used worldwide by leading universities such as Harvard, Yale, MIT, and Stanford, as well as corporations such as GSK, P&G, Unilever, BMW, and Duracell. iMotions integrates best-in-class biosensors and synchronizes Eye Tracking, Facial Expression Analysis, EEG, EDA/GSR, EMG, ECG, and Surveys in one unified software platform. iMotions continuously develops its product to be the most comprehensive, easy-to-use, and scalable biometric research platform on the market. It helps clients conduct state-of-the-art human behavior research in Psychology, Neuroscience, Human Factors Engineering, Education, Health, Business, and Human-Computer Interaction. iMotions is a high-tech software development company founded in 2005, headquartered in Copenhagen, Denmark, with offices in the USA, Germany, and Singapore. Read more at https://imotions.com
- Website
- https://imotions.com
- Industry
- Software Development
- Company size
- 51-200 employees
- Headquarters
- Copenhagen, Capital Region
- Type
- Privately Held
- Specialties
- Eye tracking, Market research, Scientific Research, Surveys, University Research, Heatmap, usability eye tracking, website usability, eye tracking publications, research, neuromarketing, biometrics, software, ai, healthcare, eeg, facial expression analysis, gsr, eyetracking, eda, neuroscience, data, innovation, biosensors, testing, usertesting, experiment, ecg, technology, emg, VR, Area of interest, visual attention, study design, human behavior, UX, API, and product shelf testing
Products
iMotions Software
Laboratory Information Management Systems (LIMS)
Our platform offers simple setup, flexible integrations and easy study management for faster research and improved validation of human responses.
Locations
- Primary: Kristen Bernikows Gade 6, 4th fl., Copenhagen, Capital Region, DK
- 141 Tremont St, Boston, Massachusetts 02111, US
- 60 Paya Lebar Road - Paya Lebar Square, #06-01, Singapore 409051, SG
- 20 W Kinzie St, 17th fl., Chicago, Illinois 60654, US
Updates
Going to #MRMW APAC in Singapore? We will be in Singapore offering free 20-minute demos during the conference, where you can see how our multimodal research software combines tools like eye tracking, facial expression analysis, voice analysis, and more to uncover deeper human insights. Whether you're working in UX, advertising, product testing, or consumer insights, we'd love to show you what's possible. Jeff Zornig, Senior Account Director, and Jessica M. Wilson, Ph.D., Global Director, Product Specialists, will be there to answer all your questions! 👉 Book your free demo slot here: https://lnkd.in/eh_n3-94
🚀 New in iMotions: Move More. See More. Compare Smarter. Since our last update, we've been busy rolling out features designed to make your research faster, clearer, and, dare we say, much more enjoyable. Here's what's new:
🏃 Motion Capture Module: Bring behavior into motion. Our new module captures detailed movement data, adding an entirely new layer of insight to your studies. It's the perfect tool for sports science, human factors, ergonomics, and beyond.
🙂🙂🙂🙂 Multiface Module: One face is good. Multiple faces? Better. Track and analyze facial expressions from multiple participants simultaneously, ideal for group dynamics, media testing, and social research.
📊 Multi-Respondent View: Comparison just got an upgrade. Easily view participant data side by side and spot patterns faster, because insights shouldn't hide in separate tabs.
We've just released a new iMotions #R_Notebook for automated PERCLOS (PERcentage of Eye CLOSure) detection, making robust #drowsiness_monitoring from #eyetracking data easier than ever. PERCLOS is one of the most validated indicators of fatigue and reduced alertness, with strong applications across:
• Driver monitoring & road safety
• Workplace safety & fatigue risk management
• Sleep research
• Human factors & HCI
With this release, researchers can move from raw eye-tracking data to actionable drowsiness insights faster, more consistently, and at scale. So, if you're working with #attention, #fatigue, or real-world #performance, this is a metric you shouldn't ignore. Learn more about PERCLOS and other key drowsiness detection metrics, and how they work in iMotions: https://lnkd.in/eayWxVWq #HumanBehavior #EyeTracking #DrowsinessDetection #HumanFactors #HCI #SleepResearch #iMotions
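For readers new to the metric: PERCLOS is commonly defined (the "P80" convention) as the fraction of time the eye is at least 80% closed over a window. A minimal sketch follows; the function names, the 0.2 openness threshold, and the sample data are assumptions for illustration, not the implementation in the iMotions R Notebook.

```python
# Minimal PERCLOS sketch over eyelid-openness samples
# (0.0 = fully closed, 1.0 = fully open). Illustrative only.

def perclos(openness, threshold=0.2):
    """Fraction of samples where the eye is at least 80% closed."""
    closed = sum(1 for o in openness if o <= threshold)
    return closed / len(openness)

def perclos_windows(openness, window):
    """Non-overlapping sliding windows for continuous monitoring."""
    return [perclos(openness[i:i + window])
            for i in range(0, len(openness) - window + 1, window)]

alert  = [1.0, 0.9, 0.95, 0.1, 1.0]   # one brief blink
drowsy = [0.3, 0.1, 0.15, 0.1, 0.4]   # prolonged partial closure
assert perclos(alert) < perclos(drowsy)
```

Brief blinks barely move the score, while sustained eyelid droop drives it up, which is why PERCLOS tracks fatigue better than raw blink counts.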
What is iMotions Academy actually like? 🧠 🏆 It's not just a week of lectures. It's an interactive, hands-on experience where attendees design real studies, collect biometric data, analyze results, and present their findings... all in one week! Participants work in small groups on research projects, choosing their own topics and exploring questions ranging anywhere from ad testing to #EEG #SignalProcessing and #emotional #responses to #AI-generated content. Along the way, participants have full access to all iMotions modules, enabling them to work hands-on with #EyeTracking, #EEG, #GSR, #FacialExpressionAnalysis, #VoiceAnalysis, #ECG, and more, exploring new ways of approaching their research questions. The next iMotions Academy takes place in Boston, from June 1–5, 2026. Interested in taking your research further? Learn more about the upcoming iMotions Academy and register here: https://lnkd.in/e3_DPaWP
“Why isn’t pupillometry always recommended for research?” It’s a question that comes up often, especially since pupil size is linked to cognitive and emotional processes. In practice, pupillometry is more complex than it might seem. Lighting conditions, experimental design, and interpretation all play a big role in whether the data is meaningful... or misleading. In this week’s Ask Me Anything, we break down how pupillometry works, and when it can (and can’t) be used effectively in research. Got a question for us? Leave it in the comments 👇
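One concrete reason lighting dominates the pupillometry question: the pupillary light reflex can produce millimeter-scale changes that swamp the much smaller cognitively driven dilations. A common mitigation is subtractive baseline correction against a pre-stimulus period recorded under the same luminance. The sketch below is illustrative; the function name and all numbers are made up for the example and are not an iMotions method.

```python
# Illustrative subtractive baseline correction for pupil diameter.
# All values below are invented for the example.

def subtractive_baseline(pupil_mm, baseline_mm):
    """Report pupil change relative to each trial's own baseline."""
    base = sum(baseline_mm) / len(baseline_mm)
    return [d - base for d in pupil_mm]

# A ~0.2 mm cognitive dilation is visible against a luminance-matched
# baseline, and easily lost if screen brightness changes mid-trial.
baseline = [3.1, 3.0, 3.05]      # pre-stimulus samples, same luminance
trial = [3.1, 3.2, 3.3, 3.25]    # samples during a demanding task
delta = subtractive_baseline(trial, baseline)
```

Even with this correction, interpretation still requires controlled lighting and a design where stimuli do not differ in brightness, which is exactly why pupillometry is not recommended for every study.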
🚢 iMotions at #AISS 2026 We’re excited to see our partners at DST - Development Centre for Ship Technology and Transport Systems presenting a collaboration paper at the Autonomous Inland & Short Sea Shipping Conference (AISS) 2026 this week 🤝 The talk, “Opportunities for Human Factors Research in Inland Navigation,” will be presented by Matthias Tenzer and features joint work with Mohan Sai Krishna illuri, Dr. Markus Schönberger, @Stephan Schweig, Dr. Divya Seernani, and colleagues. 📍 #AISS2026 within XPONENTIAL Europe 🕒 March 24 at 16:30 Together, DST and iMotions have been exploring how human factors research can help deepen our understanding of the operator’s role in inland navigation - and what that may mean for the future of highly automated inland vessels. If you’re attending, don’t miss it! #AISS2026 #AutonomousShipping #MaritimeInnovation #SmartShipping #HumanFactors #HumanMachineInteraction #CognitiveLoad #UsabilityTesting
Last week, Nam Nguyen and Thomas Baker were in Michigan spending time with the VI-grade team at their Vehicle Dynamics Center Open House. 🏎️ From immersive simulators to full-spectrum vehicle development, the event brought together some seriously exciting work in motion. A highlight was seeing the new HyperDock cockpit in action and a focus on full-spectrum simulation. For us at iMotions, it was a great example of how collaboration adds real value. By integrating human behavior data into these simulation setups, teams can better understand not just system performance, but the driver experience behind it. Thanks to VI-grade and Multimatic for having us, and to everyone who took the time to connect! 🤝 #iMotions #simulation #humanbehavior #automotive #UX
GenAI ads can drive strong reactions, but not always the right ones. In our joint research with Kantar, we found that branding impact is often lower in AI-generated ads when visual execution isn’t aligned with brand assets or tone of voice. As Graham Page shares in the session, the difference comes down to how intentionally AI is used within the creative process. Watch the full session for data, case studies, and practical takeaways: https://lnkd.in/emzJuA6n
GenAI ads tend to show lower branding impact compared with non-AI-generated ads, according to our work with Affectiva - iMotions. Why? Because too many teams leave the visual execution to generic AI models that aren't aligned with the brand's tone of voice or its distinctive brand assets. The good news: intentionally putting your brand at the center of the AI process leads to stronger outcomes. Watch our on-demand webinar to uncover the full findings, case studies to guide your creative decisions, and practical tips for ensuring GenAI actually enhances your brand. 👉 Watch now or save it to view whenever you're ready: https://loom.ly/yTl22tU
Peter Hartzbech is at Advertising Research Foundation (ARF)'s Audience x Science 2026 in New York. Our CEO is busy connecting with researchers, marketers, and innovators who are shaping the next generation of insights. 🤓 At iMotions, we’re excited to be part of the conversation around: 🔸Attention and audience measurement 🔸AI in research 🔸Multimodal behavioral insights
Great to be on the ground in #NewYork for Advertising Research Foundation (ARF) Audience x Science 2026 at the beautiful venue at People Inc. I'm looking forward to hearing from and meeting some of the brightest minds in marketing science and audience research. The intersection of AI and human behavior is moving faster than ever. At iMotions and Affectiva, that's exactly the space we focus on: the data, the science, and the real-world applications. And great to see many amazing clients here! If you're here, let's connect. 👋 #ARF2026 #iMotions #AudiencexScience #Marketingscience #Consumerinsights #AI #Humanbehavior Affectiva Smart Eye