Today I measured the number of minutes I jogged, the glasses of water, tea, and coffee I drank, and the hours of sleep I’m operating on. I shared (and perhaps overshared) that data with my husband and a co-worker.

Creative Commons licensed photo by Flickr's ZeRo'Skill

But something both my husband and my co-worker might appreciate even more would be for me to gain a heightened awareness of my emotions.

Affectiva, a company with roots in the MIT Media Lab’s Affective Computing group, provides opt-in technologies to measure and communicate emotion, and they are a signal to watch in the Quantified Emotion landscape.

How do they measure emotion? So far, Affectiva’s tools include accelerometers for motion, skin conductance sensors for excitement level, temperature sensors (through their Q Sensor), and more recently facial expression recognition via simple webcams (using Affdex).

I first started tracking their work when their emotion measurement tools were aimed at helping families with children on the autism spectrum, and now they are expanding into the arena of market research.

Yesterday, they announced a National Science Foundation grant “to develop an online version of its technology that enables computers to recognize human expressions and deduce emotional and cognitive states.”

The business applications of this grant work are clear:

Affdex not only allows more accurate understanding of an important aspect of human communication — emotion — it helps democratize emotion research by making it accessible, user-friendly and affordable for large and small corporations. The goal is a technology service that truly transforms the way customers and businesses communicate about product experiences.

“The NSF grant is an important step toward helping us open up the science of emotion measurement and make it massively available,” said Affectiva co-founder Dr. Rana el Kaliouby, who led the invention of the facial expression technology as a researcher at the University of Cambridge and at the MIT Media Lab.

In addition to tracking these commercial applications of emotion measurement technologies, I am looking forward to hearing more about their other candidate application areas, including clinical research, tools for persons with disabilities, online gaming, and more.

____________________________________

Note:  Also posted on the IFTF blog, FutureNow

As my colleagues at the Institute for the Future have forecast, more and more people, particularly in North America, are beginning to track, quantify, and visualize data about themselves, from simple pedometers that track fitness to complex genetic tests that monitor chronic conditions and health probabilities. One of the clearest expressions of this movement is the Quantified Self community.

Some of the most exciting developments come when you consider a mashup of the Quantified Self with neuroscience tools like fMRI. Are we on our way to quantifying faith and its impact on our health? The connection between physical health and spiritual practices has long been documented, most recently in the popular PBS series, This Emotional Life.

Soon, we’ll be able to see the effects of meditation and faith community connections on our mental, emotional, and physical health in a way that we never have before. By 2020, quantifying faith will become increasingly possible, and this new capability has the potential to catalyze better health outcomes, as well as to drive growth and revitalization in faith communities.

Some questions immediately arise.  When you can quantify your faith…

  • How much more frequently will you perform your spiritual practices?
  • How likely are you to talk about your faith with others, especially if you were uncomfortable with evangelism before?
  • What products, services, or tools will you use to measure your faith?
  • How will you monitor the feedback loops between your faith, health, emotions, and relationships?
  • With whom will you share your quantified faith data? What is that data worth to you?