Man turns into Cyborg

Dr Peter B Scott-Morgan has just turned from Peter 1.0 into Peter 2.0, to use his own term, and become the world’s first full cyborg. He’s real, and you can see his posts on Twitter. Dr Scott-Morgan is a scientist with a muscle-wasting disease that has now taken its toll on his body; in other words, he is terminally ill with motor neurone disease. Once the muscles in his body lose their power completely, only his brain will remain active, and he will interact with the world entirely electronically. His eyes will control objects around him, including his bed and wheelchair. A feeding tube, a colostomy bag and a catheter take care of bodily essentials as those functions are replaced with machinery. His larynx has already been removed and he now speaks electronically. “I’m not dying, I’m transforming,” says Dr Scott-Morgan.

Walking a dream

Go for a walk, but be elsewhere. In a Stanford University-Microsoft research project on human-computer interaction, a team of researchers explored a new way to take a walk, combining an actual walk with a virtual one to come up with an interesting experience. The system is called DreamWalker. Subjects walked around the Microsoft campus wearing VR headsets, inside which they experienced walking through crowded Manhattan. The real route was mapped and synchronised with the virtual walk. This meant not just fusing two GPS paths but also steering walkers around obstacles so no one would bump into anything or anyone, using what the researchers call inside-out tracking, helped by sensors. It’s not clear yet what use this type of experience can be put to, but it could certainly be explored for training and for a form of remote tourism where the actual experience isn’t possible.
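The synchronisation of the two routes can be pictured as arc-length matching: however far along the real path the walker is, the avatar is placed the same fraction of the way along the virtual path. Here is a minimal sketch in Python, with made-up waypoints and hypothetical function names; the actual DreamWalker system does far more, including redirecting walkers around obstacles.

```python
import math

def path_length(points):
    """Total length of a polyline given as (x, y) waypoints."""
    return sum(math.dist(a, b) for a, b in zip(points, points[1:]))

def position_at(points, frac):
    """Point a fraction `frac` (0..1) along the polyline, by arc length."""
    target = frac * path_length(points)
    walked = 0.0
    for a, b in zip(points, points[1:]):
        seg = math.dist(a, b)
        if walked + seg >= target:
            t = (target - walked) / seg if seg else 0.0
            return (a[0] + t * (b[0] - a[0]), a[1] + t * (b[1] - a[1]))
        walked += seg
    return points[-1]

# Made-up routes: a short campus loop and a longer virtual street path.
real_route = [(0, 0), (0, 100), (50, 100)]      # 150 m total
virtual_route = [(0, 0), (200, 0), (200, 100)]  # 300 m total

# The walker has covered 60 m of the real route, i.e. 40% of it,
# so the avatar is placed 40% along the virtual route.
frac = 60 / path_length(real_route)
avatar = position_at(virtual_route, frac)       # (120.0, 0.0)
```

Keeping the mapping proportional by distance, rather than matching raw GPS coordinates, is what lets a short campus path stand in for a longer city block.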

AI knows something we don’t

A report in New Scientist says artificial intelligence can predict a person’s chances of dying within the year even where doctors can’t. Stranger still, it’s not known exactly how the AI can tell. One could toss the idea aside, except that the insight comes from a study conducted on a very large scale. Healthcare company Geisinger and a group of researchers in Pennsylvania got an AI system to look at 1.7 million ECG results from 400,000 people. In one part of the study the AI saw just the ECGs; in the other it was allowed to see age and sex as well. The AI correctly predicted risk in patients that even doctors had missed, leading the researchers to believe it is seeing patterns they have not been able to interpret. The research was based on historical data, and the next step is to try to learn what the AI is able to pick up.
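The study design described above, comparing a model that sees only the ECG against one that also sees age and sex, can be illustrated with a toy sketch. Everything below is fabricated for illustration: the records are made up, and a simple nearest-centroid classifier stands in for the actual model, which is not described in this report.

```python
def centroid(rows):
    """Mean feature vector of a group of records."""
    n = len(rows)
    return [sum(r[i] for r in rows) / n for i in range(len(rows[0]))]

def predict(x, c0, c1):
    """Label 1 if x is closer (squared distance) to centroid c1, else 0."""
    d0 = sum((a - b) ** 2 for a, b in zip(x, c0))
    d1 = sum((a - b) ** 2 for a, b in zip(x, c1))
    return 1 if d1 < d0 else 0

# Hypothetical records: features [ecg_score, age, sex],
# label 1 = died within a year. All values invented.
data = [([0.2, 60, 0], 0), ([0.3, 61, 1], 0),
        ([0.8, 75, 0], 1), ([0.9, 80, 1], 1)]

def accuracy(feature_slice):
    """Train and score on the same toy data, using only some features."""
    X = [row[0][feature_slice] for row in data]
    y = [row[1] for row in data]
    c0 = centroid([x for x, label in zip(X, y) if label == 0])
    c1 = centroid([x for x, label in zip(X, y) if label == 1])
    return sum(predict(x, c0, c1) == label for x, label in zip(X, y)) / len(y)

ecg_only = accuracy(slice(0, 1))     # condition A: ECG features alone
ecg_age_sex = accuracy(slice(0, 3))  # condition B: ECG plus age and sex
```

Comparing the two conditions is what lets researchers ask whether demographic data adds anything beyond what the ECG signal already carries.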

Sound helps create a 3D display

Scientists at the University of Sussex have used an acoustic system to create a display that looks like a hologram but is truly 3D, can move, and can be interacted with. Known as a ‘volumetric’ display, it involves acoustic levitation and needs no screen. What it does need is a multimodal acoustic trap display (MATD): speaker-like equipment that emits high frequencies we can’t hear. These sound waves trap a tiny bead and make it float, and lights and colours are projected onto it. The bead flies about at high speed and can be made to trace out what looks like an image, one that can be viewed from all sides. The researchers reported their work in the journal Nature. Music or sounds can be added to the setup, and the display can interact with humans, opening up interesting possibilities for its use.
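A rough back-of-the-envelope sketch shows why the fast-moving bead reads as a solid image: if the eye integrates light over roughly 0.1 seconds, the bead must retrace the whole shape within that window. The numbers below are illustrative only, not taken from the Nature paper.

```python
import math

def circle_waypoints(radius_m, n_points):
    """Evenly spaced trap positions tracing a circle of the given radius."""
    return [(radius_m * math.cos(2 * math.pi * k / n_points),
             radius_m * math.sin(2 * math.pi * k / n_points))
            for k in range(n_points)]

points = circle_waypoints(0.01, 100)  # a 1 cm circle as 100 trap positions

# To appear solid, the bead must cover the full perimeter within one
# persistence-of-vision interval (~0.1 s, an assumed figure).
perimeter = 2 * math.pi * 0.01
speed = perimeter / 0.1               # required bead speed, ~0.63 m/s
```

Smaller shapes or longer persistence windows reduce the speed the acoustic trap must sustain, which is one reason the displayed images are small.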

Compiled by Mala Bhargava
