UWIN PI, Amy Orsborn, an assistant professor in Electrical and Computer Engineering at the University of Washington, was awarded a “Changing the Face of STEM” grant by L’Oréal USA. The grant supports former L’Oréal “For Women in Science” fellows to inspire future generations of girls and women in STEM.
Dr. Orsborn will use her first CTFS grant to support her mentorship organization, Women In Neural Engineering (WINE). WINE aims to provide vital peer-to-peer mentorship and networking for women in neural engineering. The group’s initial efforts center on women at the faculty level, as this career stage represents a key bottleneck for inclusive STEM leadership. The CTFS grant will help WINE provide mentorship and outreach across the training pipeline.
More information on the award and the other ten awardees can be found in the L’Oréal press release.
The first UWIN seminar for the 2019-2020 school year features a pair of short talks by Anitha Pasupathy and Sam Burden. The seminar is on Wednesday, October 9, 2019 at 3:30pm in Husky Union Building (HUB) 337. Refreshments will be served prior to the talks.
“Mid-level cortical representation for object recognition”
Anitha Pasupathy, Professor, Department of Biological Structure, University of Washington
“Sensorimotor games: Human/machine collaborative learning and control”
Sam Burden, Assistant Professor, Department of Electrical & Computer Engineering, University of Washington
“Mid-level cortical representation for object recognition” (Anitha Pasupathy)
Recognizing myriad visual objects rapidly is a hallmark of the primate visual system. Traditional theories of object recognition have focused on how critical form features, e.g. the orientation of edges, may be extracted in early visual cortex and utilized to recognize objects. An alternative view argues that much of early and mid-level visual processing focuses on encoding surface characteristics, e.g. texture. Neurophysiological evidence from primate area V4 supports a third alternative—the joint, but independent, encoding of form and texture—that would be advantageous for segmenting objects from the background in natural scenes and for object recognition that is independent of surface texture. Future studies that leverage deep convolutional network models can advance our insights into how such a joint representation of form and surface properties might emerge in visual cortex.
“Sensorimotor games: Human/machine collaborative learning and control” (Sam Burden)
While interacting with a machine, humans will naturally formulate beliefs about the machine’s behavior, and these beliefs will affect the interaction. Since humans and machines have imperfect information about each other and their environment, a natural model for their interaction is a game. Such games have been investigated from the perspective of economic game theory, and some results on discrete decision-making have been translated to the neuromechanical setting, but there is little work on continuous sensorimotor games that arise when humans interact in a dynamic closed loop with machines. We study these games both theoretically and experimentally, deriving predictive models for steady-state (i.e. equilibrium) and transient (i.e. learning) behaviors of humans interacting with other agents (humans and machines).
Applications are now open for UWIN’s final round of WRF Innovation Undergraduate and Post-baccalaureate Fellowships in Neuroengineering.
Fellowships awarded in this cycle can be used for research during the Winter, Spring, and/or Summer 2020 quarters. Applications are due by Monday, November 4, 2019.
These fellowships provide up to $6000 to support undergraduate and post-baccalaureate researchers committed to working in UWIN faculty labs. More information about applying for these fellowships can be found in the links below:
The first part of our series highlighting UWIN fellow research starts with Ezgi Yucel, a third-year PhD student in Psychology. Ezgi is co-advised by Ione Fine and Ariel Rokem, and received a UWIN fellowship before arriving at the University of Washington. Her research combines visual perception, computational modeling, and improving the patient experience.
Retinal degenerative diseases, such as retinitis pigmentosa, can cause gradual vision loss due to the death of the rod and cone photoreceptor cells in the eye. Treatments for this vision loss include surgically placed electrical retinal implants, similar in spirit to cochlear implants. The implants electrically stimulate the remaining retinal cells to induce specific percepts: the visual impressions that a patient sees.
In 2013, the Argus II became the first FDA-approved retinal implant in the United States. The Argus II includes a microelectrode array placed on the retina at the back of the eye, a camera and transmitter mounted on eyeglasses, and a video processing unit. The camera captures visual information that the processing unit translates into a series of instructions transmitted to the microelectrode array, which then stimulates the remaining cells in the retina. While the resulting percepts do not reproduce the exact visual information recorded by the camera, they are a step toward restoring vision.
Optimizing a perceptual experience
Ezgi’s work revolves around refining the percepts that patients see. She does this by developing a clearer understanding of patients’ perceptual experiences and investigating how various factors affect them. Initially, the purpose of her work was to validate a model called pulse2percept, which was created to simulate the perceptual experience of retinal prosthesis patients. The model receives the series of pulses sent to the microelectrode array and predicts a possible percept seen by a patient. This original model was based on behavioral data from three patients with the older Argus I and one patient with an Argus II device. While the model applies to this small population, it needs to be validated against a larger one.
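The flavor of such a simulation can be sketched in a few lines of Python. This toy model, in which each stimulated electrode contributes a Gaussian spot of brightness to the predicted percept, is only an illustrative caricature: the real pulse2percept library models axonal pathways, temporal dynamics, and patient-specific parameters, and its actual API differs.

```python
import numpy as np

def predict_percept(electrodes, amplitudes, grid_size=64, spread=3.0):
    """Toy percept model: each active electrode produces a Gaussian
    spot of brightness on a simulated visual field (illustrative only)."""
    ys, xs = np.mgrid[0:grid_size, 0:grid_size]
    percept = np.zeros((grid_size, grid_size))
    for (ex, ey), amp in zip(electrodes, amplitudes):
        # Brightness falls off with distance from the electrode's location
        percept += amp * np.exp(-((xs - ex) ** 2 + (ys - ey) ** 2) / (2 * spread ** 2))
    return percept

# Two electrodes stimulated at different amplitudes
percept = predict_percept([(16, 32), (48, 32)], [1.0, 0.5])
print(percept.shape)  # (64, 64)
```

Validating a model like this against patient drawings amounts to comparing predicted brightness maps with what patients actually report seeing.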
Ezgi began this project by conducting behavioral experiments with blind patients using a drawing task. In this task, researchers stimulated specific electrodes, and the patients drew recreations of their percepts. Initial tests did not produce reproducible percepts across patients, showing variability in how the Argus II functioned in each patient. Researchers even had trouble eliciting two distinct percepts in multi-electrode stimulation trials, hinting at large between-patient variability. She continued meeting with patients and had more success on a later data collection trip, collecting percepts that were consistent across trials and even observing distinct shapes under multi-electrode stimulation. This gap between the first data collection trip and the second prompted an evolution in Ezgi’s research.
Considering the gap in perceptual experiences, Ezgi expanded her project to focus on possible causes of the perceptual differences. The first cause she is investigating is how physiological changes, such as retinal scarring, affect spatial vision. While there may be several possible reasons for perceptual differences, understanding how insertion surgery methods and existing retinal cell damage affect percepts may inform surgeons on how best to align the electrode array for optimal percepts. This research uses computational models designed to relate the location of scarring and cell abnormalities to the resulting percepts.
Investigation into fMRI data
Alongside her work to optimize percepts across the entire population of patients, Ezgi is also investigating the influence of system-wide low-frequency oscillations on the Blood Oxygenation Level Dependent (BOLD) signal. The BOLD signal arises because additional blood flows to areas where neuronal activity is occurring. Blood flow naturally has measurable oscillations throughout the body, which may bias the BOLD signal. Because the mapping from visual input to visual cortex is well characterized in humans, she is using visual field mapping to estimate the effects of these oscillations on measurements of neuronal activity.
Visual field maps are measured with functional Magnetic Resonance Imaging (fMRI), which shows how blood flow through the brain changes as neurons rest or fire. While the BOLD signal allows researchers to measure brain activity during various tasks, its natural noise remains a concern. Ezgi hopes that by characterizing this noise, researchers can clean BOLD measurements and obtain more precise data.
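A minimal sketch of the underlying idea: remove a slow systemic oscillation from a simulated BOLD time series with a crude moving-average high-pass filter. The frequencies, block design, and filter below are illustrative assumptions, not the actual analysis pipeline, which would use more principled physiological regressors.

```python
import numpy as np

def highpass(signal, window):
    """Crude high-pass filter: subtract a moving-average estimate of the
    slow drift (real fMRI pipelines use more principled methods)."""
    drift = np.convolve(signal, np.ones(window) / window, mode="same")
    return signal - drift

n_vols, tr = 200, 2.0                               # 200 volumes, TR = 2 s
t = np.arange(n_vols) * tr
slow = np.sin(2 * np.pi * 0.005 * t)                # slow systemic oscillation
task = (np.arange(n_vols) // 20 % 2).astype(float)  # 20-volume on/off blocks
bold = task + slow                                  # simulated noisy BOLD series

cleaned = highpass(bold, window=41)
corr_raw = np.corrcoef(bold, task)[0, 1]
corr_clean = np.corrcoef(cleaned, task)[0, 1]
print(round(corr_raw, 2), round(corr_clean, 2))
```

The filtered series correlates more strongly with the task design than the raw series does, which is the sense in which characterizing slow noise yields more precise estimates of neuronal activity.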
For the next few years, Ezgi will continue her work on the perceptual experience of retinal prosthesis patients, now focusing on scarring, and expand into the investigation of fMRI noise. She is looking forward to developing better simulations and hopes to improve the consistency of percepts created by devices such as the Argus II.
UWIN faculty member Jeff Riffell, alongside UWIN Co-director Adrienne Fairhall and others, published a paper in Current Biology: “Visual-Olfactory Integration in the Human Disease Vector Mosquito Aedes aegypti”. The paper focuses on how mosquitoes integrate various sensory cues to find and track their hosts.
As mosquitoes buzz around searching for their next target, they receive a multitude of signals from the environment, including scents, sights, and heat, which they must process in order to locate the most viable target. The Riffell lab worked to determine the interaction between two previously unconnected senses: sight and smell.
To track wing movements during flight, researchers placed mosquitoes in an enclosed space with an optical sensor. They then mimicked human breath and movement with triggered puffs of CO2-infused air and a moving bar. The mosquitoes responded to both the air puffs and the motion, but responded to the motion far more strongly after receiving a CO2 puff. The experiment was repeated with mosquitoes whose central nervous system cells glow when firing. The neural data showed that both the CO2 puff and the motion triggered cell activity, but stimulus order changed the scale of the response: only bar motion presented after a CO2 puff caused an increase in cell firing. As Dr. Riffell states, “Smell triggers vision, but vision does not trigger the sense of smell.”
Dr. Riffell hopes these findings will help scientists better understand how mosquitoes feed and develop new methods of bite prevention. Identifying how mosquitoes track their hosts may point the way to better prevention strategies.
Steve Brunton, a UWIN faculty member, received the prestigious Presidential Early Career Award for Scientists and Engineers (PECASE). The United States Government bestows the PECASE as its highest honor for scientists and engineers who “show exceptional promise for leadership”. The White House Office of Science and Technology Policy receives and reviews recommendations from the governmental agencies that support each scientist’s work. Nine agencies can offer recommendations, including the Department of Energy and the Department of Defense.
Dr. Brunton is a mechanical engineer whose research focuses on data-driven modeling and control of complex systems, such as turbulent fluid flows. He was nominated for his work on using machine learning to develop efficient models that accurately describe the complexities of fluid mechanics. These models can then be used, in part, to design better aircraft and more efficient energy systems.
Research by UWIN faculty member David Gire and emeritus UWIN post-baccalaureate fellow Dominic Sivitilli is featured on Science Friday as part of the Cephalopod Week spotlight! Science Friday is a weekly radio show distributed by WNYC and is available in podcast format for listening.
Gire and Sivitilli’s research focuses on the unique distribution of neurons over the body of the octopus. In a human, the brain houses most of the sensing and decision-making neurons, but in an octopus the eight arms contain two-thirds of these neurons. This distribution changes how the octopus processes decisions. In a classical arrangement of neural activity, a central location, usually a brain, collects external information and makes decisions for the entire animal. In an octopus, by contrast, each neuron-heavy section of an arm makes many minor decisions independently, and these may not agree with the decisions of the other arms. By tracking the movement of each arm section, researchers can draw insight from this neuron distribution.
Future research will combine three-dimensional tracking of arm movement with real-time neurological data from an implant in the octopus’s brain. This work will help determine how octopuses control their arms and to what extent the arms work independently. The researchers hope these observations will shed light on how a different neural architecture affects an animal’s sensory capabilities.
We are excited to announce that John Tuthill, a UWIN faculty member, has been selected to join the 2019 Pew Scholars Program in Biomedical Sciences. The Pew Charitable Trusts funds the scholars program, committing to “investing in scholars at the beginning stages of their careers”.
The Pew Scholars program chose John Tuthill alongside twenty-one other researchers from across the United States. The program hopes to “answer some of the most pressing questions surrounding human health and disease”. Dr. Tuthill’s current research focuses on how animals sense and adapt their body position and movement.
His research involves modifying the gait of fruit flies as they walk on tiny treadmills and learn to avoid obstacles. By manually changing when proprioceptor neurons, which sense limb position, activate, researchers can alter the flies’ gait, which in turn forces the flies’ fine motor control system to adapt. This adaptation allows the flies to retrain their motor control circuits to work even in non-ideal circumstances. Dr. Tuthill hopes this work on neural adaptation in flies will aid therapeutic efforts for nerve injury in humans.
UWIN graduate fellow Thomas Mohren and UWIN faculty Bing Brunton, Steve Brunton, and Tom Daniel published a paper in PNAS (Proceedings of the National Academy of Sciences of the United States of America) on how flying insects can detect changes in their flight patterns using only a few complex sensors. In order to navigate quickly in complex situations, insects require rapid feedback from the multitude of sensors found on their wings, antennae, and other body parts. The paper, titled “Neural-inspired sensors enable sparse, efficient classification of spatiotemporal data,” describes how insects use both the location of the sensors and the temporal history of the wing motions to sense body rotations.
The researchers used computational models to investigate how insect sensors help detect disturbances, and found a few vital pieces of the insects’ sensing and processing machinery. A temporal filter, which weights each sensor’s input according to the recent history of wing motion, together with a nonlinear transformation of the filtered signal at every sensor, proved crucial for detecting rotations. These two stages, combined with the precise layout of sensors across the wing, meant that only a few sensors were required. The researchers believe this principle of neural encoding with sparsely placed sensors holds promise for man-made systems, and they are now working to implement biologically inspired sensors on robotic platforms.
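The two encoding stages the paper identifies can be sketched as follows; the filter shape, parameter values, and simulated strain signal here are illustrative assumptions, not the study’s fitted values.

```python
import numpy as np

def neural_encode(strain, filt, gain=10.0, threshold=0.5):
    """Neural-inspired encoder: convolve the sensor's strain history with
    a temporal filter, then pass the result through a saturating sigmoid
    nonlinearity (parameter values are arbitrary illustrations)."""
    filtered = np.convolve(strain, filt, mode="valid")           # temporal filtering
    return 1.0 / (1.0 + np.exp(-gain * (filtered - threshold)))  # "firing rate" in (0, 1)

rng = np.random.default_rng(0)
t = np.linspace(0, 1, 50)
filt = np.exp(-t / 0.1) * np.sin(2 * np.pi * 10 * t)  # decaying oscillatory filter
strain = rng.standard_normal(500)                     # simulated strain at one wing sensor
rate = neural_encode(strain, filt)
print(rate.shape)  # (451,): one encoded sample per valid filter position
```

Because the filter responds to the history of wing motion and the sigmoid sharpens the output, even a handful of such encoders at well-chosen wing locations can carry enough information to classify body rotations.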
UWIN faculty member and Washington Research Foundation (WRF) Innovation Assistant Professor in Neuroengineering Azadeh Yazdan published a paper in eLife on how brain stimulation changes the activity of neurons and encourages a learning state. The paper, titled “Targeted cortical reorganization using optogenetics in non-human primates”, describes investigations into the large-scale connections between brain regions, testing whether the relationships between regions become stronger or weaker with varied stimulation.
When people perform everyday actions, connections form between the sensory and motor areas of the brain. As an action happens more often, those connections become stronger. Strengthening connections allows us to learn new skills, and may be key to relearning skills lost to brain injury. While many studies have addressed this idea in individual neurons, the importance of strengthened connections also extends to whole brain regions. Yazdan used a virus that embeds light-sensitive proteins into neurons to modify the neurons of macaque monkeys. This allows researchers to activate specific neurons in the brain, isolating desired regions for connectivity investigations. Using focused light, researchers activated small regions of tissue within the brain and measured the regions’ activity electrically, revealing how they responded. While much of the brain followed the expectation that co-activation strengthens connections between areas, smaller brain regions showed more variability, with some connections becoming weaker overall.
Using the understanding gained through these experiments, researchers can continue to refine therapies that use brain stimulation, such as those used for Parkinson’s disease. They hope to harness the brain’s natural learning processes to help patients recover from neurological illness and trauma.