Category: Media Coverage

Successful spinal cord rehabilitation trial by UWIN affiliates Chet Moritz and Soshi Samejima featured on King 5 News

Transcutaneous Spinal Stimulation project with Chet Moritz
Image credit: Center for Neurotechnology

UWIN/Center for Neurotechnology (CNT) graduate fellow Soshi Samejima and UWIN faculty member (and CNT Co-Director) Chet Moritz were featured on King 5 News for their research, which resulted in a successful spinal cord rehabilitation trial. The article focuses on the study participant, Joe Beatty, who suffered a spinal injury that left him facing a “future life without the use of his limbs.” Over the course of the study, Joe regained some fine control of his limbs, going from having “a difficult time to feed himself, grabbing things, grasping utensils” to movement that is “improved where he can grab sandwiches, he can grab a remote, grab his cell phone,” and even walking with some assistance for up to eight minutes. With clear improvements in Joe’s movements, the initial trial has been a success, and the Center for Neurotechnology is looking to refine and expand this new method of rehabilitation for chronic spinal cord injuries.

Dr. Moritz and his team departed from the traditional invasive methods of spinal cord rehabilitation by applying transcutaneous electrical stimulation – that is, stimulation of spinal cord circuits through the skin. This noninvasive electrical stimulation is delivered at the same time that the patient performs movements, and it allows the patient to move better than they could without it. Repeated sessions even lead to long-term improvements, although the exact mechanism has not yet been established. Currently, Dr. Moritz and his team believe that by firing the stimulator at the same time that the patient practices movements, the patient can rewire the connections between neurons in the brain and the spinal cord, leading to long-term changes.
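To give a flavor of this proposed mechanism, the short Python sketch below is a minimal, hypothetical Hebbian-style pairing model – not the team’s actual model: a connection between the brain and a spinal circuit strengthens only when stimulation-evoked spinal activity coincides with the patient’s own attempt to move. All names and numbers here are illustrative assumptions.

def hebbian_update(weight, pre_active, post_active, learning_rate=0.05):
    # Hypothetical rule: strengthen the connection only when brain-side and
    # spinal-side activity coincide (Hebbian-style pairing). Illustration only.
    return weight + learning_rate if (pre_active and post_active) else weight

weight = 0.1  # starting corticospinal connection strength (arbitrary units)
for session in range(20):
    descending_drive = True   # patient attempts the movement (activity from the brain)
    spinal_excitation = True  # transcutaneous stimulation excites spinal circuits
    weight = hebbian_update(weight, descending_drive, spinal_excitation)

print(f"Connection strength after 20 paired sessions: {weight:.2f}")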

With the initial success of the stimulation trial, the study plans to expand to four other states, with the intent of designing individual units that patients can take home for convenient ongoing treatment. Learn more about this research on the Center for Neurotechnology website and in the study’s associated paper.

Soshi Samejima was awarded a UWIN graduate fellowship in 2017. Chet Moritz, in addition to being the CNT Co-Director and a member of the UWIN Executive Committee, is part of the team running the Laboratory for Amplifying Motion and Performance (AMP Lab). He was also part of the team awarded a $1 million prize for reaching the finals of the GlaxoSmithKline Bioelectronics Innovation Challenge.

World’s lightest wireless flying robot created by UWIN faculty member Sawyer Fuller’s team

Size comparison between RoboFly and a real fly

UWIN faculty member Sawyer Fuller and his team have created what is to date the world’s lightest wireless flying robot. Weighing in at 190 mg, “RoboFly” is only slightly larger than an actual fly. The team also includes Vikram Iyer, Johannes James, Shyam Gollakota, and Yogesh Chukewad. See the research paper here.

Currently, insect-sized flying machines need to be tethered in order to deliver the power required for flight (check out Fuller’s “RoboBee”). To circumvent this issue, RoboFly is powered by a laser beam using a photovoltaic cell. An on-board circuit boosts the seven volts generated by the cell to the 240 volts necessary to power the wings. The circuit also contains a microcontroller that controls the movement of the wings. “The microcontroller acts like a real fly’s brain telling wing muscles when to fire,” according to Vikram Iyer.
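As a rough, idealized worked example of that voltage step-up (a sketch under simplifying assumptions, not the team’s actual converter design), an ideal boost converter obeys V_out = V_in / (1 − D), where D is the switching duty cycle; stepping roughly 7 volts up to 240 volts therefore implies a duty cycle near 97%. The short Python sketch below just performs that arithmetic.

v_in = 7.0     # volts produced by the photovoltaic cell (per the article)
v_out = 240.0  # volts needed to power the wings (per the article)

# Ideal continuous-conduction boost converter: v_out = v_in / (1 - duty_cycle)
duty_cycle = 1.0 - v_in / v_out
print(f"Ideal boost-converter duty cycle: {duty_cycle:.3f} ({duty_cycle * 100:.1f}%)")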

RoboFly’s flexible circuit. The copper coil and black boxes to the right comprise the boost converter, and the microcontroller is the small square box in the top right.

In the future, autonomous roboinsects could be used to complete tasks such as surveying crop growth or detecting gas leaks. “I’d really like to make one that finds methane leaks,” says Fuller. “You could buy a suitcase full of them, open it up, and they would fly around your building looking for plumes of gas coming out of leaky pipes. If these robots can make it easy to find leaks, they will be much more likely to be patched up, which will reduce greenhouse emissions. This is inspired by real flies, which are really good at flying around looking for smelly things. So we think this is a good application for our RoboFly.”

The Robofly team. Front row: Vikram Iyer (left) and Johannes James; back row (from left): Yogesh Chukewad, Sawyer Fuller, and Shyam Gollakota.

At the moment, RoboFly is only capable of taking off and landing, since there is not yet a way for the laser beam to track the robot’s movement; but the team hopes to soon be able to steer the laser and allow the machine to hover and fly around. Shyam Gollakota says that future versions could use tiny batteries or harvest energy from radio frequency signals, so the power source could be tailored to specific tasks.

See a video below of the RoboFly in action!

RoboFly has received extensive publicity; see coverage by WIRED, The Economist, IEEE Spectrum, MIT Tech Review, TechCrunch, Discover Magazine, GeekWire, Popular Mechanics, Engadget, CNET, Digital Trends, Siliconrepublic, and SlashGear.

PNAS paper on sensory integration from UWIN postdoctoral fellow Eatai Roth

Hawkmoth, featured in the research of UWIN postdoctoral fellow Eatai Roth

UWIN postdoctoral fellow Eatai Roth, working in the lab of UWIN Co-Director Tom Daniel, recently published a paper in Proceedings of the National Academy of Sciences on how hawkmoths use multiple types of sensory information to govern flight behavior. The paper, entitled “Integration of parallel mechanosensory and visual pathways resolved through sensory conflict”, describes work investigating how moths combine sensory cues to follow the motion of wavering flowers while feeding.

While hovering in front of a flower, a feeding moth receives information about how the flower is moving from two sensory modalities: visual information from the eye and mechanosensory information from the proboscis in contact with the flower. By building a two-part artificial flower that allows independent manipulation of the visual and mechanosensory cues, Roth et al. disentangled the contribution of each sensory modality to the moth’s flower-following behavior. They found that the brain linearly sums information from the visual and mechanosensory domains to maintain this behavior. They further demonstrated that either sensory modality alone is sufficient for the behavior, and that this redundancy makes it robust to changes in the availability of sensory information.
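To make “linearly sums” concrete, the toy Python sketch below illustrates the idea with made-up first-order dynamics (these are not the paper’s fitted models): the predicted tracking response when both cues are present is simply the sum of the responses measured with the visual cue alone and the mechanosensory cue alone.

import numpy as np

freqs = np.array([0.5, 1.0, 2.0, 5.0, 10.0])  # flower-motion frequencies in Hz (illustrative)
s = 1j * 2 * np.pi * freqs                    # frequency variable

H_visual = 1.0 / (1.0 + 0.20 * s)             # hypothetical visual-pathway response
H_mech = 0.8 / (1.0 + 0.05 * s)               # hypothetical mechanosensory-pathway response
H_both = H_visual + H_mech                    # linear-summation prediction for both cues together

for f, h in zip(freqs, H_both):
    print(f"{f:5.1f} Hz: predicted gain {abs(h):.2f}, phase {np.degrees(np.angle(h)):6.1f} deg")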

This work provides a better understanding of how multiple sensory modalities are used in nature to govern complex behaviors, and connects with the mission of the Air Force Center of Excellence on Nature-Inspired Flight Technologies and Ideas (NIFTI).

This research was also featured in a UW Today article, “Tricking moths into revealing the computational underpinnings of sensory integration”.

Photo credit: Rob Felt, Georgia Tech

Multiple UWIN faculty featured in College of Arts and Sciences articles

Research by a number of UWIN faculty members was recently featured in articles in the College of Arts and Sciences' 'Perspectives' newsletter:

  • Bing Brunton, the Washington Research Foundation Innovation Assistant Professor in Neuroengineering, was featured in the article “The Brain, By the Numbers”. Her research uses computational and mathematical approaches to analyze large sets of neural data.
  • An experiment that allowed two people in different locations to communicate brain-to-brain, developed by Chantel Prat, Andrea Stocco, and Raj Rao, was featured in the article “Playing Mind Games, for Science”.
  • Jason Yeatman’s research on better understanding dyslexia by using MRI technology to investigate changes in brain structure and connectivity was described in the article “Decoding Dyslexia”.

UWIN Co-Director Adrienne Fairhall helps shine a light on gender bias in conference panels

Triggered by a series of conference listings with all-male or nearly all-male speaker lineups, Princeton neuroscientist Yael Niv roused a group of colleagues to action: UWIN Co-Director Adrienne Fairhall quickly launched a website, BiasWatchNeuro.com, which Niv and the group use to track and post conference speaker lists and compare them with “base rates” in the field. This effort was the subject of a recent article in the New York Times, “Female Scientists Turn to Data to Fight Lack of Representation on Panels”.

Another contributor, Anne Churchland, maintains a list of women scientists – initiated while she was a postdoc at UW – to help conference organizers identify women speakers in different subject areas. The list is at: https://anneslist.net/

Jason Yeatman’s research featured on UW homepage

Research by UWIN faculty member Jason Yeatman was recently featured on the UW Homepage and in an article by the College of Arts and Sciences, “Decoding Dyslexia”.  Work in the Yeatman lab aims to better understand dyslexia, using MRI technology to investigate changes in brain structure and connectivity as children acquire reading skills and participate in interventions for dyslexia.  Says Yeatman, “We’re trying to understand individual differences among children with dyslexia, working toward the idea of personalized intervention programs based on brain measurements.”

Read the full article here.
