The robotic arm, built by Johnson, a bioengineering graduate student at Northeastern, is designed to produce tactile sign language in order to enable more independence for people who are both deaf and blind. Lard is one of the members of the deaf-blind community who is helping Johnson test the robot and giving her feedback on how it could be improved.
People who are deaf can communicate with their hearing friends and family through visually signed language, but for people who are both deaf and blind, language must be something they can touch. That means they often need an interpreter present with them in person for interactions with others who do not know American Sign Language, so they can feel the shapes the interpreter's hands are making.
The goal of developing a tactile sign language robot is to let someone who relies on American Sign Language as their primary language communicate independently, without depending on another person to interpret. Johnson sees the robot as potentially helpful at home, at a doctor's office, or in other settings where someone might want private communication or where an interpreter might not be readily available.
Johnson is still in the early stages of developing the robot, working on it as her thesis with Chiara Bellini, assistant professor of bioengineering at Northeastern, as her advisor. Right now, Johnson is focusing on the letters of the American Manual Alphabet, and training the robot to finger-spell some basic words.
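To give a sense of what fingerspelling means for a machine, here is a minimal sketch in Python. The letter names come from the American Manual Alphabet mentioned above, but the joint names and angle values are purely illustrative placeholders, not Johnson's actual design or calibration data:

```python
# Hypothetical finger-joint presets for a few letters of the
# American Manual Alphabet. The flexion angles (in degrees) are
# illustrative placeholders, not real calibration data.
HANDSHAPES = {
    "a": {"thumb": 10, "index": 90, "middle": 90, "ring": 90, "pinky": 90},
    "b": {"thumb": 80, "index": 0, "middle": 0, "ring": 0, "pinky": 0},
    "c": {"thumb": 45, "index": 45, "middle": 45, "ring": 45, "pinky": 45},
    "t": {"thumb": 30, "index": 85, "middle": 90, "ring": 90, "pinky": 90},
}

def fingerspell(word):
    """Convert a word into the sequence of handshape presets the
    robotic hand would step through, one letter at a time."""
    sequence = []
    for letter in word.lower():
        if letter not in HANDSHAPES:
            raise ValueError(f"No handshape preset for {letter!r}")
        sequence.append((letter, HANDSHAPES[letter]))
    return sequence

# A deaf-blind user would rest a hand on the robot and feel each
# shape in turn, much as they would with a human interpreter.
for letter, joints in fingerspell("cat"):
    print(letter, joints)
```

The point of the sketch is that fingerspelling is, at its core, a lookup from letters to stored hand configurations played back in order; the hard engineering is in making those configurations readable by touch.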
The ultimate goal is for the robot to be fluent in American Sign Language, so that the device can connect to text-based communication systems such as email, text messages, social media, or books and sign those messages to the user. Johnson would also like to make the robot customizable, since, as in any other language, there are unique signs, words, and phrases used in different regions, and some signs mean different things depending on the cultural context.
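One way that kind of customization could be structured (a sketch under assumptions, not the project's actual design) is a shared base vocabulary overridden by a regional or personal profile, so a user only redefines the signs that differ for them. The dictionary names and gesture identifiers below are hypothetical:

```python
# Base vocabulary mapping words to named gestures; the gesture names
# stand in for stored motion sequences the robot would play back.
BASE_SIGNS = {
    "hello": "gesture_hello_standard",
    "soda": "gesture_soda_standard",
}

# A regional or personal profile overrides only the signs that
# differ, the way regional vocabulary differs in any language.
REGIONAL_OVERRIDES = {
    "soda": "gesture_soda_regional",
}

def lookup_sign(word, overrides=None):
    """Prefer the user's regional variant; fall back to the base sign."""
    merged = {**BASE_SIGNS, **(overrides or {})}
    return merged.get(word)

print(lookup_sign("hello", REGIONAL_OVERRIDES))  # base sign
print(lookup_sign("soda", REGIONAL_OVERRIDES))   # regional variant
```

The design choice here is that customization lives in data rather than code: adding a regional sign means adding one dictionary entry, not retraining the whole system.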