While computers outperform humans in most mathematical tasks and can do complex calculations that people never could, there's one area where machines haven't quite achieved humanlike smarts: emotional intelligence.

But now, a new computer program can recognize people's emotions based on how they type, paving the way for computers that could one day be smarter than humans — a concept called "the singularity."

In a new study, researchers asked a small group of people to type a block of sample text, and then analyzed the keystroke characteristics to see whether they could identify any of seven different emotional states: joy, fear, anger, sadness, disgust, shame or guilt.

The emotions that the program recognized with the greatest degree of accuracy were joy (87 percent of the time) and anger (81 percent of the time). 

"If we could build any system that is intelligent enough to interact with humans that involves emotions — that is, it can detect user emotions and change its behavior accordingly — then using machines could be more effective and friendly," the researchers, from the Islamic University of Technology in Bangladesh, wrote in the study.

The researchers noted that emotion-detecting systems could be used in applications like online teaching: An emotionally intelligent online system could change its look, teaching style or the contents of its lectures to better adapt to a particular student's emotional state, they said.

In the first part of the study, 25 people, ages 15 to 40, retyped two paragraphs from Lewis Carroll's famous novel "Alice's Adventures in Wonderland," and picked one of the emotions that they felt while they were typing: joy, fear, anger, sadness, disgust, shame, guilt, feeling neutral or tired. The last two options were added in case an individual did not identify with any of the original options.

In the second part of the study, the researchers used software that collected text samples from users, who were prompted every 30 minutes to enter their mental state, choosing from the following possibilities: joy, fear, anger, sadness, disgust, shame, guilt or none of the above.

In this part of the experiment, the text that the users typed did not come from a particular source assigned to them, but was collected during their regular computer use. The researchers used a special type of software that ran in the background to record all keys pressed by the users as well as the users' press and release times.

The researchers then extracted 19 keystroke attributes from the collected data. Some of the attributes included typing speed in 5-second intervals and the time elapsed between when a particular key was pressed and released.
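The two attributes named above can be sketched in a few lines of Python. This is a minimal, hypothetical illustration of dwell time (release time minus press time) and typing speed over 5-second intervals, not the researchers' actual code; the event format is assumed for the example.

```python
# Hypothetical sketch of two keystroke attributes from the study:
# dwell time and typing speed in 5-second intervals.
# `events` is an assumed format: (key, press_time, release_time) in seconds.

def dwell_times(events):
    """Time each key was held down: release minus press."""
    return [release - press for _key, press, release in events]

def typing_speed(events, window=5.0):
    """Number of keystrokes in each 5-second interval, by press time."""
    if not events:
        return []
    start = events[0][1]
    n_windows = int((events[-1][1] - start) // window) + 1
    counts = [0] * n_windows
    for _key, press, _release in events:
        counts[int((press - start) // window)] += 1
    return counts

# Example: three keystrokes, the last one after a pause
events = [("h", 0.00, 0.08), ("i", 0.30, 0.41), ("!", 6.00, 6.09)]
print(dwell_times(events))   # held durations in seconds
print(typing_speed(events))  # keystroke counts per 5-second window
```

A real system would feed attributes like these, computed over many users and sessions, into a classifier along with the other 17 features the researchers describe.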

To analyze the sample texts, the investigators used a standard database of words and sentences that were associated with the seven different emotional states.
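Matching text against such a database can be sketched simply: count how many words in a sample appear in a lexicon that maps words to emotional states. The tiny lexicon below is illustrative only, not the study's actual database.

```python
# A minimal sketch of scoring text against a word-emotion lexicon,
# in the spirit of the database the researchers used.
# LEXICON is a made-up example, not the study's real word list.

LEXICON = {
    "happy": "joy", "great": "joy",
    "afraid": "fear",
    "furious": "anger", "hate": "anger",
    "miserable": "sadness",
}

def score_emotions(text):
    """Count lexicon hits per emotional state in a text sample."""
    counts = {}
    for word in text.lower().split():
        emotion = LEXICON.get(word.strip(".,!?"))
        if emotion:
            counts[emotion] = counts.get(emotion, 0) + 1
    return counts

print(score_emotions("I am so happy, this is great!"))  # {'joy': 2}
```

In practice, lexicon scores like these would be combined with the keystroke attributes to predict which of the seven emotional states a user is in.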

The newly described emotion-detecting system "does not look like a breakthrough," Myounghoon Jeon, an assistant professor of applied cognitive science at Michigan Technological University who was not involved in the study, told Live Science. "But [the researchers'] effort to integrate the existing methods looks fair, positive and promising."

However, Jeon said the method of detecting emotions in text that was used in this study has some limitations. For instance, unlike speech-recognition technologies or devices that detect facial expressions, it relies on a task a person is instructed to perform. A person who is truly sad or angry may be unable or unwilling to type assigned text because of the emotions they are feeling.

Still, the new system could be a valuable tool for online counseling sessions, Jeon said. For example, in some cultures where online counseling is particularly popular, psychiatrists may be able to estimate a patient's internal state even without the person verbally articulating it.

The study was published online July 3 in the journal Behavior & Information Technology.

Copyright 2014 LiveScience, a TechMediaNetwork company. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.