16 August 2017

3D Printed Brain-Like Tissue

Scientists in Australia have created brain-like tissue in the lab using a 3D printer and a special bio-ink made from stem cells. The research takes us a step closer to making replacement brain tissue derived from a patient's own skin or blood cells to help treat conditions such as brain injury, Parkinson's disease, epilepsy and schizophrenia. The bio-ink is made of human induced pluripotent stem cells (iPSCs), which have the same power as embryonic stem cells to turn into any cell in the body, and could potentially form replacement body tissues and even whole organs. The team used 3D printing to make neurones involved in producing GABA and serotonin, as well as support cells called neuroglia, they reported in the journal Advanced Healthcare Materials. In the future, they plan to print neurones that produce dopamine.


To make the neurones, researchers used their bio-ink to print layers in a cross-hatched pattern, building a cube about 5 millimetres across. They then crosslinked the cube into a firm, jelly-like substance. Growth factors and nutrients were then fed into the holes of this spongy "scaffold", encouraging the stem cells to grow and turn into neurones and support cells that linked up to form tissue. Waste was also removed via the holes in the scaffold. One of the challenges of using iPSCs is that, like embryonic stem cells, they have the potential to develop into teratomas: disturbing-looking tumours that contain more than one tissue type (think toenails growing in brain tissue, or teeth growing in ovary tissue). While this is a first step towards 3D printing of whole organs, a whole functioning brain would be a much more complex task.
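
The cross-hatched layering can be sketched in code: alternating layers of parallel strands at 0 and 90 degrees leave pores through which nutrients diffuse in and waste diffuses out. The cube size comes from the article; the strand spacing and layer height below are illustrative assumptions, not the study's actual print parameters.

```python
SIZE = 5.0      # cube edge in mm (from the article)
SPACING = 1.0   # strand spacing in mm (assumed; sets the pore size)
LAYER_H = 0.5   # layer height in mm (assumed)

def layer_paths(z, angle_deg):
    """Start/end points of the parallel strands printed at height z."""
    n = int(SIZE / SPACING) + 1
    if angle_deg == 0:
        return [((0.0, i * SPACING, z), (SIZE, i * SPACING, z)) for i in range(n)]
    return [((i * SPACING, 0.0, z), (i * SPACING, SIZE, z)) for i in range(n)]

# Alternate strand direction layer by layer to form the cross-hatched lattice.
paths = [layer_paths(k * LAYER_H, 0 if k % 2 == 0 else 90)
         for k in range(int(SIZE / LAYER_H))]
```

Rotating the hatch direction between layers is what turns stacked strands into an open lattice rather than a solid block.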

More information:

13 August 2017

Brain Can Form New Memories While You Are Asleep

A sleeping brain can form fresh memories, according to a team of neuroscientists. The researchers played complex sounds to people while they were sleeping, and afterward the sleepers could recognize those sounds while awake. The idea that humans can learn while asleep, a concept sometimes called hypnopedia, has a long and odd history. In this study, the researchers focused on pattern learning. While a group of 20 subjects was sleeping, the neuroscientists played clips of white noise. Most of the audio was purely random, but patterns were occasionally embedded within the complex noise: a single clip of white noise, 200 milliseconds long, repeated five times in sequence. The subjects remembered the patterns. The lack of meaning worked in their favor: sleepers can neither focus on what they're hearing nor make explicit connections, the scientists said. This is why nocturnal language tapes don't quite work; the brain needs to register both sound and semantics.

 
But memorizing acoustic patterns like white noise happens automatically. Once the sleepers awoke, the scientists played back the white-noise recordings and asked the test subjects to identify patterns within the noise. Such repetitions are nearly impossible to pick out cold, unless you happened to remember them from a previous night's sleep. The test subjects detected the patterns far better than random chance would predict. What's more, the scientists discovered that memories of the white-noise patterns formed only during certain sleep stages. When the authors played the sounds during REM and light sleep, the test subjects could remember the pattern the next morning. During deeper non-REM sleep, playing the recording hampered recall: patterns presented during non-REM sleep led to worse performance, as if there were a negative form of learning. This marked the first time that researchers had evidence for the sleep stages involved in the formation of completely new memories.
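
The stimulus design described above is easy to sketch: a "pattern" trial tiles one fixed 200 ms white-noise clip five times, while a control trial is one second of fresh random noise. The 16 kHz sample rate below is an assumption for illustration; the article does not specify one.

```python
import numpy as np

rng = np.random.default_rng(0)
SR = 16_000                     # assumed sample rate in Hz (not stated in the article)
CLIP_LEN = int(0.2 * SR)        # 200 ms white-noise clip, as described

def repeated_pattern():
    """One fixed 200 ms clip tiled five times: the embedded 'pattern'."""
    clip = rng.standard_normal(CLIP_LEN)
    return np.tile(clip, 5)

def pure_noise():
    """Fresh random noise of the same 1 s duration: the control audio."""
    return rng.standard_normal(CLIP_LEN * 5)

pattern = repeated_pattern()
control = pure_noise()
```

To a listener both signals sound like hiss; only the exact repetition of samples in `pattern` distinguishes them, which is what makes detecting it a memory effect rather than a perceptual one.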

More information:

11 August 2017

Playing Action Video Games May Harm Brain

A new study reveals that habitual players of action games have less grey matter in their hippocampus, a major part of the brain. And the more depleted the hippocampus becomes, the more a person is at risk of developing brain illnesses and diseases ranging from depression to schizophrenia, PTSD and Alzheimer's disease. Shaped like a seahorse, hence its name, the hippocampus is the part of the brain that helps people to orient themselves (so-called spatial memory) and to remember past experience (episodic memory). However, there's another important part of the brain called the striatum that counterbalances the hippocampus. It has an area known as the caudate nucleus that acts as a kind of "autopilot" and "reward system" – getting us home from work, for example, and telling us when it's time to eat, drink, have sex and do other things that keep us alive and happy. The caudate nucleus also helps us form habits and remember how to do things like ride a bicycle. Gaming has been shown to stimulate the caudate nucleus more than the hippocampus; 85 per cent of players rely on that part of the brain to navigate their way through a game. The problem is, the more they use the caudate nucleus, the less they use the hippocampus, and as a result the hippocampus loses cells and atrophies, the new study shows. Specifically, patients with Parkinson's disease combined with dementia, as well as those with Alzheimer's disease, schizophrenia, depression or PTSD – all of whom have less grey matter in their hippocampus – should not be prescribed action video games as a treatment, according to the study.
 

To do their investigation, the researchers recruited close to 100 people (51 men, 46 women) at UdeM and got them to come in and play a variety of popular shooter games like Call of Duty, Killzone and Borderlands 2, as well as 3D games from the Super Mario series, for a total of 90 hours. To establish which participants were spatial learners (that is, those who favoured their hippocampus) versus response learners (those using the reward system), the team first had each of them run through a ‘4-on-8’ virtual maze on their computer. From a central hub, they had to navigate down four identical-looking paths to capture target objects, then, after the gates were removed, go down the four others. To remember which paths they'd already been down and not waste time looking for objects they'd already taken, spatial learners oriented themselves by the landmarks in the background: a rock, a mountain, two trees. Response learners didn't do that; they ignored the landmarks and concentrated instead on remembering a series of right and left turns in a sequence from their starting position. Once their learning strategy was established, participants then began playing the action and 3D-platform video games. The same amount of screen time on each produced very different effects on the brain. Ninety hours of playing action games led to hippocampal atrophy in response learners, while 90 hours of playing 3D games led to increased grey matter within the hippocampal memory system of all participants.
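
The logic of the two strategies can be sketched as a toy model: a spatial learner infers the novel arms from what the landmarks tell it about where it has been, while a response learner just replays a memorised turn sequence. The arm numbering, landmark encoding, and turn sequence below are all hypothetical simplifications of the real 3D task.

```python
PHASE1_OPEN = {0, 2, 4, 6}        # arms opened (and emptied) in phase 1, hypothetical numbering
ALL_ARMS = set(range(8))

def spatial_choice(landmarks):
    """Spatial learners use landmark cues to recall which arms were visited."""
    visited = {arm for arm, mark in landmarks.items() if mark is not None}
    return ALL_ARMS - visited     # head straight for the four novel arms

def response_choice(turn_sequence):
    """Response learners instead replay a memorised left/right turn sequence."""
    return set(turn_sequence)

# In this toy version, a landmark cue is associated only with arms seen in phase 1.
landmarks = {arm: f"landmark-{arm}" if arm in PHASE1_OPEN else None
             for arm in ALL_ARMS}
```

Both strategies can reach the same arms, which is the point of the probe: the task distinguishes *how* participants solve it, not whether they succeed, and only the spatial strategy exercises the hippocampus.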



More information:

04 August 2017

Watch People While They Are Watching Movies

Dolby Laboratories has been around since 1965, and for most people the company is synonymous with the white label you see at the end of movies telling you that the sound and video have been remastered in some way. Inside its headquarters in San Francisco, Dolby has over a hundred technical labs, and over the past five years some of the labs have been devoted to a lesser-known project: watching people while they’re watching movies. The company has been attaching biosensors to willing subjects and plopping them down on a couch to settle in for an entertainment session. Using EEG caps, heart rate monitors, galvanic skin response sensors, and thermal-imaging FLIR cameras, the scientists can observe the biophysical and emotional responses that humans experience via media.


They’re trying to figure out what kinds of videos and sounds make people’s hearts race, what makes their skin flush, and what makes them cognitively engaged, aroused, or maybe even bored. Dolby is using this information to better sell its own technology to its Hollywood content partners. The idea is that if it can prove that HDR, surround sound, or a certain color palette will elicit an emotional response, then creative content makers are more likely to want to use Dolby tools. This kind of affective computing has been around for decades, but industry experts say that in entertainment it’s becoming ever more common. Case in point: both Netflix and Hulu have used eye-tracking technology in recent years to get a sense of where people are looking within their app interfaces.

More information:

02 August 2017

Commercial BCI for VR

Neurable is offering a product: a brain-computer interface for virtual reality. The integrated platform is built for one purpose: enabling developers to create brain-controlled content for VR. The company foresees an ecosystem of control inputs that will combine to make AR/VR environments incredibly responsive and adaptive to user behavior.


The machine learning platform interprets brain activity in real time to afford virtual powers of telekinesis. The machine learning pipeline has been distilled into an SDK compatible with Unity®. Brain signals are acquired through the company's upgraded headband for the HTC Vive. The solution demonstrates how brain sensors and neurotechnology can be integrated with AR/VR devices.
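
Neurable has not published its pipeline, but real-time EEG control of this kind is typically built on classifying short signal epochs. The sketch below is a generic, hypothetical stand-in: it builds a matched-filter (class-mean difference) detector on synthetic EEG-like data with an added ERP-style bump; the channel count, epoch length, and bump shape are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(7)
N_CH, N_T = 8, 128                     # assumed channel count and samples per epoch

def epochs(n, target):
    """Synthetic EEG epochs; 'target' epochs carry an added ERP-like bump."""
    x = rng.standard_normal((n, N_CH, N_T))
    if target:
        x += np.exp(-0.5 * ((np.arange(N_T) - 60) / 10) ** 2)  # bump near sample 60
    return x.reshape(n, -1)            # flatten channels x time into feature vectors

X_t, X_n = epochs(100, True), epochs(100, False)
w = X_t[:50].mean(0) - X_n[:50].mean(0)           # matched filter from training half
thresh = (X_t[:50].mean(0) + X_n[:50].mean(0)) @ w / 2
acc_t = (X_t[50:] @ w > thresh).mean()            # held-out target epochs
acc_n = (X_n[50:] @ w <= thresh).mean()           # held-out non-target epochs
accuracy = (acc_t + acc_n) / 2
```

In a live system the same projection would run on each incoming epoch, turning "the user attended to this object" into a selection event that a Unity scene can react to.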

More information:

27 July 2017

Oculus Video Shows Advanced Hand Tracking Gloves

An Oculus video posted last week shows what look like the same gloves in action. It reveals a marker-based solution that uses external tracking cameras. Unfortunately, hands have about 25 degrees of freedom and plenty of self-occlusion. Right now, retroreflector-covered gloves and lots of cameras are needed to reach this level of tracking quality. Perhaps one day the company will be able to fit all of that into sensors like those used for the Rift and Touch, but for now it requires the elaborate rig constructed around Zuckerberg in the picture he teased.
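
The "about 25 degrees of freedom" figure comes from joint-by-joint bookkeeping in a kinematic hand model. The tally below uses one common convention; the per-joint numbers are a modelling choice, not a fact from the video, and different conventions land anywhere from roughly 21 to 27.

```python
# Per-joint degrees of freedom in one common kinematic hand model
# (joint abbreviations are standard; the DOF assignments are assumptions).
FINGER_DOF = {"MCP": 2, "PIP": 1, "DIP": 1}   # flex/extend, plus ab/adduction at MCP
THUMB_DOF = {"CMC": 2, "MCP": 2, "IP": 1}     # the thumb's extra mobility is at the CMC

articulated = 4 * sum(FINGER_DOF.values()) + sum(THUMB_DOF.values())
total = articulated + 6                       # plus the global 6-DOF wrist pose
```

Every one of those joint angles has to be recovered from marker positions each frame, which is why occluded markers, and hence many cameras, dominate the engineering problem.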


Oculus won’t be releasing any new hardware in 2017, and the Touch controllers, released only late last year, are still the primary form of input for its VR experiences. It’s more than likely that this is still an R&D project that isn’t even confirmed for a consumer release. We’d love to know whether these gloves have any kind of haptic feedback to react to actions like pressing buttons. Accurate finger tracking is another important step towards full VR immersion, but actually replicating the feel and resistance of surfaces and objects in VR is another challenge entirely.

More information: