21 April 2018

Electrically Stimulate Muscles for Hands-free AR Input

Researchers at the Human Computer Interaction Lab at the Hasso-Plattner-Institut in Potsdam, Germany, have proposed a novel solution to the problem of wearable haptics for augmented reality. Using a lightweight, mobile electrical muscle stimulation (EMS) device that delivers low-voltage impulses to the arm muscles, the idea is to let AR headset users stay hands-free while still experiencing force feedback when interacting with virtual objects, and feeling extra forces when touching physical objects in their environment. Using a HoloLens headset, the researchers demonstrate their solution in action: the setup consists of a backpack, a laptop running Unity, a battery-powered EMS machine, electrode pads, and visual markers for more reliable hand tracking. The researchers say their system adds physical forces while keeping the user's hands free to interact unencumbered. The EMS-based system actuates the user's wrist, biceps, triceps and shoulder muscles with low-voltage impulses to simulate a sort of 'virtual pressure'. This perceived pressure can be triggered when you interact with virtual objects such as buttons, and even with physical objects like real-world dials and levers, to create an extra sense of force on the user's arms.
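To make the interaction loop concrete, here is a minimal sketch of how a contact event in the Unity scene might be mapped to a stimulation pulse. The SerialEMS class, channel assignments and intensity ceiling are illustrative assumptions, not the published system:

```python
class SerialEMS:
    """Stand-in for a battery-powered EMS unit driven over a serial link."""

    def pulse(self, channel, intensity, duration_ms):
        # A real driver would write a command to the device here.
        print(f"EMS channel {channel}: {intensity:.0%} for {duration_ms} ms")

# One electrode channel per actuated muscle group (assumed mapping).
CHANNELS = {"wrist": 0, "biceps": 1, "triceps": 2, "shoulder": 3}

def on_virtual_touch(ems, muscle, contact_force):
    """Map a contact event from the AR scene to a short EMS pulse.

    contact_force is normalized to [0, 1]; intensity is scaled down so the
    stimulation stays within a low-voltage, per-user calibrated range.
    """
    intensity = min(max(contact_force, 0.0), 1.0) * 0.6  # safety ceiling
    ems.pulse(CHANNELS[muscle], intensity, duration_ms=150)

ems = SerialEMS()
on_virtual_touch(ems, "biceps", 0.8)  # e.g. pressing a virtual button
```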


There are some trade-offs with this sort of system, though, making it somewhat less practical for long-term use as currently configured. The two biggest drawbacks: it requires precise electrode placement and per-user calibration before each use, and it can cause muscle fatigue, which would make it less useful and probably less comfortable. But maybe a little muscle stimulation can go a long way. The paper discusses using EMS sparingly, leaning on the user's heightened expectation of plausibility in a physical (rather than virtual) environment. It's an interesting step that could prove effective as part of a multi-pronged approach to adding haptics to AR wearables, whose users will want to stay hands-free as they go about their daily lives. Actuator-based gloves and vests have been low-hanging fruit so far and are quickly becoming a standard go-to for VR haptics, but they still seem too much of a stretch for daily AR use. Force-feedback exoskeletons, which physically stop the user's movements, are much bulkier and even more of a stretch at the moment. There's no telling what the prevailing AR wearable will be in the future, but whatever it is, it will have to be both light and useful, two qualities EMS seems to nail fairly well out of the gate.

More information:

20 April 2018

Infinadeck Omnidirectional Treadmill

Infinadeck is the world's first commercially viable omnidirectional treadmill. This patented product allows users to walk naturally in any direction and, when paired with a VR headset, creates a personal holodeck experience that lets you take a stroll through vast, immersive new worlds.


The treads on the bottom move along both the x- and y-axes, giving you the sensation of moving around freely. You can move through a full 360 degrees while being tracked with the Vive Tracker. The sensors determine where you are in the virtual world and can reproduce your physical height within it.
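As a rough illustration of the control problem involved, a sketch along these lines (with assumed gains and a made-up tracker interface, not Infinadeck's actual controller) shows how tracker data could drive the deck to keep the walker centered:

```python
def belt_velocity(user_x, user_y, gain=1.5, max_speed=2.0):
    """Proportional controller: return (vx, vy) belt surface speeds in m/s.

    (user_x, user_y) is the user's offset from the deck center in meters,
    e.g. taken from a Vive Tracker pose. The deck surface moves opposite
    the offset to carry the walker back toward center, capped at max_speed.
    """
    vx, vy = -gain * user_x, -gain * user_y
    speed = (vx**2 + vy**2) ** 0.5
    if speed > max_speed:  # clamp to the deck's top speed
        vx, vy = vx * max_speed / speed, vy * max_speed / speed
    return vx, vy

# User has drifted 0.4 m forward and 0.1 m right of center:
print(belt_velocity(0.4, 0.1))  # deck carries them back toward center
```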

More information:

15 April 2018

ViMM TA4.3 Propositions

The main objective of this meeting, held in Berlin on 13-14 April 2018, was to agree on and synthesize the recommendations and proposals for the EU and the Digital Cultural Heritage (DCH) community that ViMM should take forward through its Manifesto, Roadmap and Action Plan.


I presented the TA4.3 propositions on presence, which is essential for engagement and for a cognitive connection to the content. Presence can be enhanced when the content is relevant and coherent in terms of social and cultural factors.

More information:

07 April 2018

Mind-Reading Headset with 90% Accuracy

A new mind-reading device means people can silently type on their computer using nothing but thoughts, and it's accurate 90 per cent of the time. Instead of communicating with smart devices by saying 'Ok Google' or 'Hey Siri', the headset silently interprets what users are thinking. When people think about verbalising something, the brain sends signals to the facial muscles, even if nothing is said aloud. The device has sensors at seven key points along the cheek, jaw and chin that pick up these signals; the system can recognise words and even talk back once it has processed them. Currently the 'AlterEgo' device, created by researchers at the MIT Media Lab, can recognise the digits 0 to 9 and has a vocabulary of around 100 words. The system consists of a wearable device and an associated computing system, directly linked to a program that can query Google. Electrodes in the device pick up the neuromuscular signals in the jaw and face that are triggered when users say words in their head. The signals are fed to a machine-learning system that has been trained to correlate particular signals with particular words. The device also includes a pair of bone-conduction headphones, which transmit vibrations through the bones of the face to the inner ear.


These headphones do not obstruct the ear canal, so users can still hear information without their conversations being interrupted. This silent-computing system means users can communicate with Google without being detected by anyone else. To begin with, the researchers identified which parts of the face produce the most reliable neuromuscular signals. They did this by asking people to subvocalise the same series of words four times, with an array of 16 electrodes at different facial locations each time. They found that signals from seven particular locations were consistently able to distinguish subvocalised words. Using this information, the MIT researchers created a prototype that wraps around the back of the neck like a telephone handset and touches the face at seven locations on either side of the mouth and along the jaw. They then collected data on a few computational tasks with limited vocabularies of around 20 words each: one involved arithmetic, the other reporting moves in a chess game. The prototype completed these tasks with 90 per cent accuracy. In one experiment, the researchers used the system to report an opponent's moves in a chess game, and the device silently suggested responses in return.
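The pipeline described above (seven electrode channels, features, a classifier over a small vocabulary) can be sketched as follows. The feature choice and the SVM classifier are illustrative assumptions with synthetic stand-in data; the actual AlterEgo model is a neural network trained on the researchers' own recordings:

```python
import numpy as np
from sklearn.svm import SVC

N_ELECTRODES = 7   # the seven reliable facial locations
WINDOW = 250       # samples per word-utterance window (assumed rate)

def featurize(window):
    """Summarize one (N_ELECTRODES, WINDOW) signal window per channel:
    mean absolute amplitude and RMS, a common baseline for EMG signals."""
    mav = np.abs(window).mean(axis=1)
    rms = np.sqrt((window**2).mean(axis=1))
    return np.concatenate([mav, rms])

# Synthetic training data standing in for recorded subvocalizations of a
# ~20-word vocabulary (e.g. digits plus arithmetic operators).
rng = np.random.default_rng(0)
vocab_size, examples_per_word = 20, 30
X = np.stack([
    featurize(rng.normal(word_id * 0.1, 1.0, (N_ELECTRODES, WINDOW)))
    for word_id in range(vocab_size) for _ in range(examples_per_word)
])
y = np.repeat(np.arange(vocab_size), examples_per_word)

clf = SVC(kernel="rbf").fit(X, y)  # train signal -> word-id classifier
test = featurize(rng.normal(0.5, 1.0, (N_ELECTRODES, WINDOW)))
print("predicted word id:", clf.predict(test[None, :])[0])
```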

More information:

03 April 2018

Animated 3D Sea Turtles

The Digital Life team at the University of Massachusetts Amherst, creators of an online catalog of high-resolution, full-color 3D models of living organisms, announced today that they have released two new online, full-color animated models of a loggerhead and a green sea turtle through a collaboration with sea turtle rescue and research institutions. The Digital Life team, together with volunteer 3D artists, created the animated sea turtles using software such as Capturing Reality and Blender and a process called photogrammetry, in which multiple still photos are integrated to create lifelike 3D meshes with photographic colors.


The models can be downloaded and 3D printed, for example for classroom use. Scientists can use them in a computer modeling environment to test models of sea turtle migration, or to evaluate different net designs that avoid trapping sea turtles. They can also be used in VR or game-like educational environments, and are available at no cost to educators, scientists, conservationists and others for creative or nonprofit use on the Digital Life website. The animators spent hundreds of hours animating the 3D turtle models in a format that can be used for VR, film or game applications, among others.
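As a small illustration of the last step in such a pipeline, a Blender (bpy) script along these lines could thin out a dense photogrammetry mesh and export it for game or VR engines. The file paths and decimation ratio are assumptions, and this is not the Digital Life team's actual toolchain:

```python
# Run inside Blender's Python environment, e.g. blender --background --python export.py
import bpy

# Import the raw photogrammetry mesh (OBJ with photographic texture).
bpy.ops.import_scene.obj(filepath="/models/loggerhead_raw.obj")
turtle = bpy.context.selected_objects[0]

# Photogrammetry meshes are far too dense for real-time use; decimate.
dec = turtle.modifiers.new(name="Decimate", type='DECIMATE')
dec.ratio = 0.05  # keep roughly 5% of the triangles

# Export for a game/VR engine, applying modifiers and baking animation.
bpy.ops.export_scene.fbx(
    filepath="/models/loggerhead_game.fbx",
    use_mesh_modifiers=True,
    bake_anim=True,
)
```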

More information: