By CASSIE SLANA
Reporter
(WARRENSBURG, Mo., digitalBURG) — New facial recognition software that correlates a company’s profitability with the emotions of its CEO has been developed by one of UCM’s own.
James Cicon, assistant professor of accounting at UCM, produced the software. He said facial recognition software isn’t new, but the way he’s using it is.
“There’s software out there to determine if you’re happy, sad, angry or disgusted; that kind of (technology) is fairly readily available,” Cicon said. “What I’ve done is taken the software and given it an application by giving it use (with) something that’s practical. Of course, since I’m a finance professor, I looked at it in the context of market economics.”
By applying the software to images of Fortune 500 executives and correlating the results with their companies' profitability, the researchers were able to determine which emotions characterized the most successful long-term business leaders.
While many would assume the software itself was the heart of the project, Cicon said the majority of the credit belongs to the study design and methodology employed during development.
“We look at a CEO and if they’re happy or sad, angry or disgusted, and then we relate that to firm performance,” Cicon said. “We can see that in the future, depending how the CEO looks, the firm tends to perform in certain ways.”
Cicon said he and his collaborators found that qualities people would normally consider unpleasant produced some of the best results when applied to business.
“An angry or disgusted CEO during an interview tends to make changes to the company to improve profitability,” Cicon said. “These changes are in the long term; we look three to six months out and see improvements in profitability based on facial expression.”
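The relationship Cicon describes is, at bottom, a lagged correlation: an expression score measured at interview time compared against profitability several months later. A minimal sketch with invented numbers (not the study's data or method) shows the idea, using a hand-rolled Pearson correlation:

```python
# Invented example data: per-CEO "anger/disgust" score at interview time,
# and that firm's profitability change six months later (percentage points).
# These values are illustrative only, not from the published study.
anger_score   = [0.1, 0.8, 0.3, 0.9, 0.5, 0.2]
profit_change = [-1.0, 2.5, 0.2, 3.1, 1.0, -0.4]

def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length samples."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r = pearson(anger_score, profit_change)
print(f"lagged correlation: {r:.2f}")  # strongly positive for this toy data
```

A positive `r` on real data would mirror the study's finding that negative affect today predicts profitability gains three to six months out.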
Cicon said the study found that a happy CEO is a CEO that doesn’t work much.
“(A happy CEO will) be on the golf course more perhaps,” Cicon said. “An angry or disgusted CEO seems to be the CEO that goes back to the office to make changes and improve performance.”
Cicon said the software works by recognizing facial features and how different emotions are displayed.
“Based on these facial features and the distortions we have with (them, the software) can determine emotions just like a human being does,” Cicon said. “Our brains process the face in the same way, so we’re trying to emulate that.”
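The approach Cicon describes measures distortions of facial features and maps them to emotion categories. A hypothetical sketch of that mapping (not Cicon's actual code), classifying a made-up feature vector by nearest distance to assumed emotion prototypes:

```python
import math

# Hypothetical "distortion" features (e.g. brow lowering, lip-corner pull,
# nose wrinkle) — stand-ins for whatever measurements the real software extracts.
# Prototype values are invented for illustration.
EMOTION_PROTOTYPES = {
    "happy":     (0.1, 0.9, 0.0),
    "sad":       (0.3, 0.1, 0.1),
    "angry":     (0.9, 0.1, 0.3),
    "disgusted": (0.6, 0.2, 0.9),
}

def classify_emotion(features):
    """Return the emotion whose prototype is nearest (Euclidean) to the features."""
    return min(EMOTION_PROTOTYPES,
               key=lambda e: math.dist(features, EMOTION_PROTOTYPES[e]))

# A face with strong brow lowering and nose wrinkling reads as "disgusted."
print(classify_emotion((0.7, 0.2, 0.8)))
```

Real systems learn such mappings from labeled face data rather than fixed prototypes, but the principle is the same: quantify feature distortions, then find the closest known emotional pattern, much as a human brain does.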
After the findings were released, the study garnered attention from media outlets including The Wall Street Journal.
“It was fun (to be published in the WSJ); people all over the world are talking about our research,” Cicon said. “But it’s the journal publication that matters. You get the scientific vetting when you do the publication in a peer-reviewed journal. That’s far more important.”
The study is published in the Journal of Behavioral Finance.
Research for the study was a collaborative effort involving Stephen Ferris, dean of the College of Business at the University of Missouri; Ali Akansu, a professor at the New Jersey Institute of Technology; and Yanjia Sun, a PhD graduate from NJIT who used the work as part of his dissertation.
Cicon said he is confident in the success of future research that utilizes this technology, but information on upcoming studies remains confidential.