The wearable tool uses an AI based on deep-learning systems and is expected, in the future, to serve as a virtual assistant.
Graduate student Tuka Alhanai and Ph.D. candidate Mohammad Ghassemi have described the technology as an AI system that uses specialized algorithms to analyze audio, text transcriptions and physiological signals to help determine a conversation's overall tone in real time.
The system runs on the Samsung Simband, a modular, research-centric wrist wearable that serves as Samsung's digital health platform. It can sense the emotions in a conversation and classify them as 'happy,' 'sad' or 'neutral,' a judgment driven by the device's sensors and two algorithms that together determine the overall tone of the conversation.
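The article does not publish the researchers' actual model, but the general idea of fusing audio, text and physiological features into a single tone label can be sketched as follows. Every feature name, threshold and weight here is invented purely for illustration and does not reflect the real MIT system:

```python
# Hypothetical sketch of multimodal tone classification: fuse simple
# features from audio, text transcription, and physiological sensors
# into one 'happy'/'sad'/'neutral' label. All values are illustrative.
from dataclasses import dataclass

@dataclass
class SegmentFeatures:
    mean_pitch_hz: float   # from the audio signal
    positive_words: int    # from the text transcription
    negative_words: int
    heart_rate_bpm: float  # from the wearable's physiological sensors

def classify_tone(f: SegmentFeatures) -> str:
    """Return 'happy', 'sad', or 'neutral' for one conversation segment."""
    score = 0
    # Higher vocal pitch loosely signals excitement; low pitch, flatness.
    score += 1 if f.mean_pitch_hz > 180 else (-1 if f.mean_pitch_hz < 120 else 0)
    # Net sentiment of the transcribed words.
    score += f.positive_words - f.negative_words
    # Elevated heart rate nudges the score toward arousal/positivity here.
    score += 1 if f.heart_rate_bpm > 90 else 0
    if score > 1:
        return "happy"
    if score < -1:
        return "sad"
    return "neutral"

# Example: an animated, upbeat segment of conversation.
print(classify_tone(SegmentFeatures(200.0, 3, 0, 95.0)))
```

A real system would replace these hand-set thresholds with trained models, but the fusion step, combining evidence from all three signal streams before deciding, is the part this sketch is meant to show.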
In the trials, 31 conversations were sampled, and the research team found the system could identify the tone of a story with about 83% accuracy. Unlike past studies, which relied on participants watching happy and sad videos, this one was tested on real-life conversations.
The research is still in its early stages, but MIT believes the technology will be developed further, incorporated into commercially available wearables such as the Apple Watch, and expanded with a broader set of algorithms.