MIT researchers demonstrate the ability to tell a Boston Dynamics robotic dog to fetch while wearing AttentivU mind-reading smart glasses

[Illustration: Two MIT researchers in a laboratory operate a robotic dog as it fetches an object, while a smartphone displays data from the AI/ML cloud pipeline assisting the process.]

MIT researchers have demonstrated control of a Boston Dynamics robotic dog using mind-reading smart glasses called AttentivU. Study participants wore the glasses, which have built-in electrodes that measure brain activity; by thinking their responses to preset queries, they were able to instruct the robotic dog to fetch items and move around. Unlike other mind-controlled robots that require implanted electrodes or bulky brain caps, the AttentivU glasses are easier to wear, cheaper, and more aesthetically pleasing. During the demonstration, data was streamed to an iPhone app that used AI/ML cloud computing to analyze the participants' responses, and the app correctly interpreted the commands approximately 84% of the time. This result illustrates how rapidly mind-reading interfaces are improving in both cost and accuracy. The article notes that readers can also explore robotic dog options of their own on Amazon. The author, David Chien, has a background in technology and is enthusiastic about emerging technologies that can save lives and create immersive virtual and augmented reality experiences.
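The pipeline described above, in which a brain-signal reading is classified into a response to a preset query and then mapped to a robot command, can be sketched in miniature. Everything below is hypothetical: the function names, the threshold-based "classifier," and the yes/no-to-command mapping are illustrative stand-ins, since the article does not disclose how the AttentivU app or its cloud models actually work.

```python
# Hypothetical sketch of the glasses -> app -> robot command flow.
# A real system would run a trained ML model on EEG/EOG features in the
# cloud; here a toy threshold stands in for that classifier.

# Assumed mapping from a thought "yes"/"no" response to a robot action.
COMMANDS = {"yes": "fetch", "no": "stay"}


def classify_window(samples):
    """Toy intent classifier: treat a window's mean amplitude above a
    threshold as a 'yes' response. Purely illustrative."""
    mean_amp = sum(samples) / len(samples)
    return "yes" if mean_amp > 0.5 else "no"


def intent_to_command(intent):
    """Map a classified response to a robot command string."""
    return COMMANDS.get(intent, "stay")


if __name__ == "__main__":
    window = [0.7, 0.8, 0.6, 0.9]  # simulated signal window
    intent = classify_window(window)
    print(intent_to_command(intent))  # a 'yes' response triggers a fetch
```

The point of the sketch is the separation of concerns the article implies: signal classification (here, `classify_window`) is independent of the command vocabulary (`COMMANDS`), so the same classifier could drive different robot behaviors.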

Full article
