Ohmni Developer Edition Case Study: Lehigh University

January 10, 2020

 

About a year ago, OhmniLabs announced a partnership with Lehigh University students to support academic research in intelligent machines. Over the course of the year, Lehigh students integrated a number of new artificial intelligence-based technologies into the Ohmni Developer Edition robotics platform. By adding off-the-shelf sensors, open-source deep learning modules and cloud-based NLP to the Ohmni Developer Edition, Lehigh students were able to deploy an AI-enabled robot and explore many real-world applications at low cost.

Navigation Subsystem

Lehigh developers first added a navigation module that enables the robot to navigate complex indoor environments. For this feature, Lehigh combined lidar and sonar sensors to create a robust information stream for real-time navigation. These cost-efficient sensors help the robot extract information from its surroundings. Mounted under the speaker, they enable the Ohmni robot to sense obstacles to both its left and right at a range of 5 to 10 meters.
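To make the sensor-fusion idea concrete, here is a minimal, hypothetical sketch of how lidar and sonar readings might be combined to pick a clear heading. The function names, distance thresholds and fusion rule are illustrative assumptions, not the actual Lehigh implementation.

```python
# Hypothetical sensor-fusion sketch: combine a 2D lidar scan with two
# side-facing sonar readings to choose a clear heading. All names and
# thresholds are assumptions for illustration only.

MAX_RANGE_M = 10.0     # the article cites a usable range of roughly 5-10 m
SAFE_DISTANCE_M = 0.6

def fuse_ranges(lidar_scan, sonar_left_m, sonar_right_m):
    """Clamp lidar ranges and tighten the left/right sectors with sonar."""
    fused = [min(r, MAX_RANGE_M) for r in lidar_scan]
    n = len(fused)
    # Assume index 0 points fully left and index n-1 points fully right.
    for i in range(n // 4):                      # left quarter of the scan
        fused[i] = min(fused[i], sonar_left_m)
    for i in range(3 * n // 4, n):               # right quarter of the scan
        fused[i] = min(fused[i], sonar_right_m)
    return fused

def pick_heading(fused, fov_deg=180.0):
    """Steer toward the direction with the most free space (negative = left)."""
    n = len(fused)
    best = max(range(n), key=lambda i: fused[i])
    if fused[best] < SAFE_DISTANCE_M:
        return None                              # no safe direction: stop
    return -fov_deg / 2 + best * fov_deg / (n - 1)

# Example: an obstacle on the right pushes the chosen heading to the left.
scan = [4.0] * 90 + [0.5] * 90
print(pick_heading(fuse_ranges(scan, sonar_left_m=3.0, sonar_right_m=0.4)))
```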

Vision Module

A vision module was also added to enable Ohmni to recognise up to 200 faces and to extract emotion features. An Android app integrates multiple functions, including face detection, image processing, facial recognition and emotion detection.

The goal of emotion detection is to recognise emotions on a person’s face and to categorise them into six classes: angry, disgust, happy, neutral, sad and surprise. Building upon the face detection technology Lehigh previously developed for Ohmni, the software takes pixels from Ohmni’s high-resolution camera and passes them through the emotion detection program. Once an emotion is detected, this information is processed so that Ohmni can generate a response appropriate to the user’s emotional state.
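As an illustration of such a pipeline, the sketch below detects a face with OpenCV and classifies the crop into the six emotion classes listed above. The model file, 48x48 grayscale input size and class ordering are assumptions made for demonstration, not details published by the Lehigh team.

```python
import cv2
import numpy as np
from tensorflow.keras.models import load_model

# Illustrative pipeline only: detect a face, then classify the face crop
# into six emotion classes. "emotion_cnn.h5" is a hypothetical model file,
# and the class order below is an assumption.
CLASSES = ["angry", "disgust", "happy", "neutral", "sad", "surprise"]

face_cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
emotion_model = load_model("emotion_cnn.h5")      # hypothetical trained model

def detect_emotions(frame_bgr):
    """Return (label, confidence) for each face found in a BGR frame."""
    gray = cv2.cvtColor(frame_bgr, cv2.COLOR_BGR2GRAY)
    faces = face_cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    results = []
    for (x, y, w, h) in faces:
        crop = cv2.resize(gray[y:y + h, x:x + w], (48, 48))
        crop = crop.astype("float32") / 255.0
        probs = emotion_model.predict(crop[np.newaxis, ..., np.newaxis])[0]
        results.append((CLASSES[int(np.argmax(probs))], float(probs.max())))
    return results

frame = cv2.imread("user.jpg")    # e.g. a frame grabbed from Ohmni's camera
print(detect_emotions(frame))
```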

Voice Module 

Students also developed a cloud-powered voice assistant that can start interactive conversations based on the user’s reactions. Using CMU Sphinx for speech recognition and Google Dialogflow as the conversation model, Lehigh students built a voice assistant that is connected to the emotion detection mechanism. When an emotion is detected, the voice assistant is triggered and the Ohmni robot can then conduct a basic conversation with the user.
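A rough sketch of that emotion-triggered loop, assuming the pocketsphinx Python package for speech recognition and the google-cloud-dialogflow client for the conversation model, might look like the following. The project ID, session handling and emotion trigger are placeholders, not taken from the Lehigh codebase.

```python
from pocketsphinx import LiveSpeech            # offline CMU Sphinx recognition
from google.cloud import dialogflow           # pip install google-cloud-dialogflow

PROJECT_ID = "my-dialogflow-project"           # hypothetical GCP project

def dialogflow_reply(text, session_id="ohmni-session"):
    """Send recognised speech to a Dialogflow agent and return its answer."""
    client = dialogflow.SessionsClient()
    session = client.session_path(PROJECT_ID, session_id)
    query_input = dialogflow.QueryInput(
        text=dialogflow.TextInput(text=text, language_code="en-US"))
    response = client.detect_intent(
        request={"session": session, "query_input": query_input})
    return response.query_result.fulfillment_text

def converse_once():
    """Listen for one utterance with CMU Sphinx, then answer via Dialogflow."""
    for phrase in LiveSpeech():                # yields hypotheses as the user speaks
        return dialogflow_reply(str(phrase))

# In the integrated system, something like this would run whenever the
# vision module reports an emotion (placeholder value shown here).
detected_emotion = "sad"                       # e.g. output of the emotion detector
if detected_emotion is not None:
    print(converse_once())
```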

The result is a robotic navigation system that uses multiple types of sensors to handle complex environments smoothly, paired with a voice assistant capable of conducting basic conversations and providing useful information.

Potential Use Cases for an AI-enabled Robot

How can an AI-enabled robot be used in the real world? The potential use cases span many industries including real estate, healthcare, and retail. Let’s take a closer look at how a company can use the Ohmni Developer Edition platform to create customised solutions for their customers. 

Robotics for Healthcare

The healthcare industry is one of the most compelling areas for robotics, given the rising cost of care and the growing number of ill and elderly patients. Care providers can use robotics in a number of ways, including patient engagement. As Lehigh students pointed out, emotion detection can be used to help children cope with social anxiety, and sensors can help patients with anxiety perceive people and their environment in order to reduce stress. Robots can also be used to stimulate interaction between patients and caregivers, or to provide psychological benefits to patients by improving their relaxation and motivation.

Remote clinical encounters are another great application of robotics. Telepresence robots allow physicians to attend to patients in several locations at the same time. Physicians can securely review charts, vitals and test results remotely while still communicating “face to face” in real time. In this way, more patients can be served. For instance, Tallaght Hospital in Ireland is using the Remote Presence Robot (RP7) to diagnose strokes quickly.

Building on these patient engagement examples, Robots4Good can help health service providers improve the quality of care and patient outcomes via OhmniLabs’ robotic telehealth solutions. Clinics can incorporate the Ohmni robot into their check-in process to shorten wait times and provide better service to patients, or use it to check in on patients remotely to discuss issues and improvements.

Robotics for Real Estate

OhmniLabs developers are working on special AI and robotics features for the real estate industry to improve the buying and leasing experience. These new, conceptual features include instant, life-like virtual tours of properties along with data collection based on client feedback. In the image below, for example, several team members are touring office space together and communicating live about the features and benefits of the property.

Other real estate-specific features OhmniLabs is creating for their Ohmni robot include:

  • Remote monitoring of luxury vacation homes
  • Virtual visits to office spaces, clean rooms and/or hazardous places
  • Monitoring construction progress and compliance 
  • Security services alongside human guards

Robotics for Retail

For retailers, robotics has huge potential that is only now being developed. Imagine if your customers could dial into a robot to shop at your stores and enjoy warm, human interaction with your sales representatives.

In Japan, ANA Holdings is piloting this service using the newme robot developed by OhmniLabs. Information can be overlaid on the user’s screen to provide details about the items and recommendations from experts and celebrities. High-end customers can even dial into a robot at a product’s place of origin (for wine, jewellery, fashion and so on) to meet the makers and hear the insiders’ stories.

Summary

With increasing demand for customisable robots, a growing number of companies and research labs are investing in designing intelligent robots. If you are interested in developing robotics applications quickly and inexpensively, purchase the Ohmni Developer Edition to get started today.