Emotions


Gallup and the New York Times have teamed up to find the happiest man in America (according to how his profile fits with the demographics of happiness).

Gallup said that the happiest person would be male, Asian-American, a religious Jew, self-employed, living in Hawaii, married with children, and earning a household income of at least $120,000.

Lo and behold, they found someone who fits this description!  In this Times article yesterday, they shared a profile of this gentleman:

Here’s a breakdown of how each of Mr. Wong’s attributes contributes to happiness, with links to some of our previous coverage on these correlations. But remember, as always, correlation is not necessarily causation.

RELIGION: On average, Jews have higher levels of well-being than their counterparts of every other major faith in America. Muslims have the lowest levels of well-being. In between, from happiest to least happy, are Mormons, atheists/agnostics, Roman Catholics, “other non-Christians” and then Protestants. For people of most religions, greater levels of religiosity (like frequent church or synagogue attendance) are associated with higher levels of happiness.

GENDER: Men, on average, report slightly higher levels of well-being, a score of 67 on a scale of 0 to 100, compared to women’s average score of 66.6. This modest gap is mostly because women score much lower on the physical health index, as measured by the presence of illnesses and various other physical ailments like neck pain and low energy.

RACE: Asians have by far the highest levels of well-being, followed by whites, Hispanics, blacks and then everybody who doesn’t fit into those defined categories. Asians beat out their non-Asian counterparts on five out of the six well-being sub-indexes: life evaluation, emotional health, physical health, healthy behaviors and basic access to things like food and shelter. The one category where whites beat them, but just barely, is work environment.

MARITAL STATUS: Married people have far and away the highest happiness levels. The biggest differential between married people and non-married people is in the work environment index. Across the entire index, married people are followed in happiness by singletons; people with domestic partners and people who have been widowed (these two categories have equal levels of well-being); those who are divorced; and finally, people who are married but separated.

CHILDREN: People who have children are slightly happier than people who don’t.

AGE: Seniors — those age 65 and older — rank as the happiest, followed by Americans under 30. The people in the middle — the ones with mortgages, teenagers, car loans and midlife crises — are perhaps understandably the ones who are more miserable.

INCOME: Income tracks very neatly with well-being. People earning under $12,000 annually lead the least happy lives, and the more money they make, generally speaking, the better off they are emotionally and physically. This probably makes sense when you consider what goes into the index, things like good health and access to basic needs like food and shelter. You can’t afford to lead “the good life” if you can’t afford much at all.

GEOGRAPHY: In 2010 Hawaii topped the well-being list, and West Virginia was at the bottom. If you want to zoom in further, you can see well-being rates by Congressional district. California’s 14th district, one of the highest-income districts in the country that also happens to include most of Silicon Valley, ranks at the top. Michigan’s 13th district, an area of high unemployment that covers parts of Detroit and the wealthier Grosse Pointe suburbs, is at the very bottom of the barrel.

EMPLOYMENT: Americans who own their own businesses were the happiest on average in 2010, followed by professionals. The least happy are transportation and manufacturing workers.

HEIGHT: Randy Newman was right: Taller people are generally happier.


Anyone who questions the linkage between robots and emotions clearly hasn’t seen the 1986 film Short Circuit.  Are you with me, fellow children of the ’80s?:

New Scientist’s Catherine de Lange wrote a piece this week that explores the linkage between robots, emotions, and what it means to be alive, recapping today’s robotic facial expressions and more.  A quote from her piece, Emotion 2.0:

“Robot” comes from the Czech word “robota” which means “work” or “forced labourer.” Indeed, in the early days, robots were seen as a way to make light work of tedious tasks. Who doesn’t want a robot that does the housework or makes the tea? But instead of creating a modern-day, indefatigable Jeeves, much robotics research today focuses on creating emotional machines. Robots started out conceptually as automaton-servants but are now helping us get to grips with what makes us human.

Jules, the posh robot from the University of Bristol, UK, is equipped with tiny motors under its skin, which means it can accurately mimic human facial expressions. Jules is a disembodied head though, and while its copycat technique is impressive, robots need to do more than copy us to be able to interact on an emotional level.

A step up the emotional adeptness scale, AIDA, the driving companion, uses its facial expressions to respond to the driver’s mood – looking sad if a seatbelt is undone, for instance, or detecting that you are tense as you drive and helping you relax.

How will robots play a role in the future of emotion?

Today Rudy Adler, co-founder of 1000Memories, visited us at the Institute for the Future to talk about his project, which connects to the future of death, the future of connecting, the future of memories, and the future of storytelling.

Here is an example of a memories tribute page, designed to remind us of a patchwork quilt.  It’s complete with stories, photos, videos, songs, and scanned artifacts of the loved one who has passed away, captured at various ages, and it even features handwritten notes:

Rudy is a smart, genuine guy (which is important for a site that promises to preserve your memories forever).

He opened his talk with a personal story about losing someone in his life, and his company’s call to action reflects big-picture thinking: “We need a new oral history.”

The resulting site is a well-designed and interesting signal (an early indicator of a future direction of change) around the future of connecting.

Here is a summary of the idea in 1 minute & 12 seconds:

1000Memories has gotten a lot of press, but my favorite description comes from a TechCrunch article that resonates with my own personal experience of the site:

Visitors are first presented with a big picture of the deceased, presumably that one image that best captured his soul and personality. From there it’s easy to navigate to your next step as a reader, and sign a guest book. You can also invite others to the page at that time.

But what makes each site really rich are the stories and pictures that loved ones add to the site. Some are silly. Others rip tears from your eyes. But it helps fill out the picture of a man, and it helps family and friends remember that man more richly.

We are indeed becoming People of the Screen, as our Technology Horizons Future of Video research suggests.

This experience of digital life after death is one of the more meaningful examples I’ve seen so far.  I’m intrigued to see where 1000Memories will go.  Who do you want to remember?

____________________________________

Note:  Also posted on the IFTF blog, FutureNow

Yesterday I wrote about technologies to measure emotion (mainly biometric & facial recognition). But what about new metrics for your emotions as they come across to others through the tone of your written communication?

Yesterday, Lymbix (yes, the limbic system pun is reportedly intended) extended the reach of a technology it describes as an emotional spellchecker of sorts.  Previously, Microsoft Outlook users could download ToneCheck, and now steps have been taken to extend it to Lotus as well.

Screenshot below for a preview, or you can see a demo here.
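To make the idea of an “emotional spellchecker” concrete, here is a minimal sketch of one way such a tool could score the tone of a draft message before you hit send. The word list, scores, and threshold are purely illustrative assumptions; this is not how ToneCheck or Lymbix actually work under the hood.

```python
# Toy "tone check": scan a draft message for emotionally loaded words and
# warn the writer before sending. Illustrative only -- the real ToneCheck /
# Lymbix models and lexicons are proprietary and not shown here.

# Hypothetical mini-lexicon mapping words to a tone score (-1 = harsh, +1 = warm).
TONE_LEXICON = {
    "thanks": 0.8, "appreciate": 0.9, "great": 0.7,
    "unfortunately": -0.4, "unacceptable": -0.9, "failure": -0.8,
    "asap": -0.3, "disappointed": -0.7,
}

def tone_score(message: str) -> float:
    """Average the tone scores of any lexicon words found in the message."""
    words = [w.strip(".,!?").lower() for w in message.split()]
    hits = [TONE_LEXICON[w] for w in words if w in TONE_LEXICON]
    return sum(hits) / len(hits) if hits else 0.0

def check_tone(message: str, threshold: float = -0.3) -> None:
    """Print a warning if the draft's overall tone falls below the threshold."""
    score = tone_score(message)
    if score < threshold:
        print(f"Warning: this draft reads as negative (tone {score:+.2f}). Revise before sending?")
    else:
        print(f"Tone looks okay ({score:+.2f}).")

check_tone("Unfortunately this is unacceptable and I am disappointed.")
check_tone("Thanks, I really appreciate the great work!")
```

A real product would obviously go well beyond word counting (context, negation, sarcasm), but even this toy version captures the basic interaction: score the draft, then nudge the writer before the message goes out.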

Today I measured the number of minutes I jogged, how many glasses of water, tea & coffee I drank, and the number of hours of sleep on which I am operating. I shared (and perhaps overshared) that data with my husband and a co-worker.

Creative Commons licensed photo by Flickr's ZeRo'Skill

But something that BOTH my husband and my co-worker might appreciate even more would be a heightened awareness of my emotions.

Affectiva, a company with roots in the MIT Media Lab‘s Affective Computing group, provides opt-in technologies to measure and communicate emotion, and it is a signal to watch in the Quantified Emotion landscape.

How do they measure emotion? So far, Affectiva‘s tools include accelerometers for motion, skin conductance sensors for excitement level, temperature sensors (through their Q Sensor), and, more recently, facial expression recognition via simple webcams (using Affdex).
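As a rough illustration of the sensor side of quantified emotion, here is a minimal sketch that flags moments when skin conductance rises well above a person’s recent baseline. This is an assumption for illustration only, not Affectiva’s actual Q Sensor processing, which is not public.

```python
# Illustrative sketch: turn a skin-conductance trace into an "excitement" flag
# by marking readings that sit far above a rolling baseline. Not Affectiva's
# actual algorithm -- just one simple, commonly described approach.

from statistics import mean, stdev

def flag_arousal(eda_microsiemens, window=10, z_threshold=2.0):
    """Return indices of readings that are unusually high versus the recent baseline."""
    flagged = []
    for i in range(window, len(eda_microsiemens)):
        baseline = eda_microsiemens[i - window:i]
        mu, sigma = mean(baseline), stdev(baseline)
        if sigma > 0 and (eda_microsiemens[i] - mu) / sigma > z_threshold:
            flagged.append(i)
    return flagged

# Simulated skin-conductance trace (microsiemens): a flat baseline, then a spike.
trace = [2.0, 2.1, 2.0, 2.2, 2.1, 2.0, 2.1, 2.2, 2.1, 2.0, 2.1, 5.5, 5.8, 2.2]
print(flag_arousal(trace))  # prints the indices of the spike readings
```

The interesting design question is less the math than the sharing: once a spike is detected, who gets to see it, and in what context? That is exactly where Affectiva’s opt-in framing matters.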

I first started tracking their work when their emotion measurement was focused on helping families with children on the autism spectrum, and now they are expanding into the arena of market research.

Yesterday, they announced a National Science Foundation grant “to develop an online version of its technology that enables computers to recognize human expressions and deduce emotional and cognitive states.”

The business applications of this grant work are clear:

Affdex not only allows more accurate understanding of an important aspect of human communication — emotion — it helps democratize emotion research by making it accessible, user-friendly and affordable for large and small corporations.  The goal is a technology service that truly transforms the way customers and businesses communicate about product experiences.

“The NSF grant is an important step toward helping us open up the science of emotion measurement and make it massively available,” said Affectiva co-founder Dr. Rana el Kaliouby, who led the invention of the facial expression technology as a researcher at the University of Cambridge and at the MIT Media Lab.

In addition to tracking these commercial applications of technologies that measure emotion, I am looking forward to hearing more about their other candidate application areas, including clinical research, tools for persons with disabilities, online gaming, and more.

____________________________________

Note:  Also posted on the IFTF blog, FutureNow