News

How Flight Simulation Tech Can Help Turn Robots Into Surgeons

Robotics researchers from Berkeley's AUTOLab, led by IEOR and EECS professor Ken Goldberg, have built a heaving robotic platform – mimicking the motion of a breathing, heart-beating human patient – to help develop algorithms that robotic surgical assistants can use to guide their cutting.  This research is the subject of an article in Wired magazine titled "How Flight Simulation Tech Can Help Turn Robots Into Surgeons."  During surgery, when the chest heaves or blood pumps, the surgeon has to compensate for that movement.  The researchers collected data by watching a surgeon's movements and developed algorithms that could mimic the surgeon's strategy for cutting along a line, while the new robot, a kind of Stewart platform, reproduces the patient's movement.  Stewart platforms are normally hefty pneumatic devices that power things like immersive flight simulators, but for this study the researchers shrank the concept down to a 6-inch-wide device, opting for servo motors instead of pneumatic power.  The machine costs just $250.
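The Wired article does not include the team's control code, but the basic idea – driving a small servo-actuated Stewart platform with a breathing-plus-heartbeat motion profile – can be sketched in a few lines of Python.  Everything below (frequencies, amplitudes, the servo gain) is a hypothetical illustration, not the AUTOLab implementation:

```python
import math

# Hypothetical parameters chosen only to illustrate a patient-like motion
# profile: a slow "breathing" heave plus a faster, smaller "heartbeat" bump.
BREATH_HZ = 0.25   # roughly 15 breaths per minute
HEART_HZ = 1.2     # roughly 72 beats per minute
BREATH_MM = 8.0    # vertical excursion due to breathing (mm)
HEART_MM = 1.5     # smaller excursion due to the heartbeat (mm)

def platform_heave(t):
    """Desired vertical displacement (mm) of the platform at time t (seconds)."""
    return (BREATH_MM * math.sin(2 * math.pi * BREATH_HZ * t)
            + HEART_MM * math.sin(2 * math.pi * HEART_HZ * t))

def servo_commands(t, mm_per_degree=0.2):
    """Map the desired heave onto six identical servo angles (degrees).

    A real Stewart platform solves per-leg inverse kinematics for full
    six-degree-of-freedom motion; commanding all six legs equally gives
    pure up-and-down heave, which is enough for this sketch.
    """
    angle = platform_heave(t) / mm_per_degree
    return [angle] * 6
```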

Retraining the brain’s vision center to take action

Neuroscience researchers, including Prof. Jose Carmena, have demonstrated the astounding flexibility of the brain by training neurons that normally process input from the eyes to develop new skills, in this case, to control a computer-generated tone.  Carmena, the senior author of a paper about the development that appeared in the journal Neuron, explains that “to gain a reward, the rats learned to produce arbitrary patterns of neural activity unrelated to visual input in order to control a BMI, highlighting the power of neuroplasticity and the flexibility of the brain.”   “These findings suggest that the striatum has a broader role in shaping cortical activity based on ongoing experience and behavioral outcomes than previously acknowledged, and have wide implications for the neuroscience of thought and action and brain-machine interfaces,” said Carmena.
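The summary above does not describe the decoder the team used, but the general shape of an auditory brain-machine interface of this kind can be sketched: map the relative firing rates of two recorded groups of neurons onto the pitch of a feedback tone that the animal learns to steer.  The ensembles, gain, and frequencies below are hypothetical, purely for illustration:

```python
import numpy as np

BASE_HZ = 1000.0     # hypothetical baseline pitch of the feedback tone
HZ_PER_SPIKE = 50.0  # hypothetical gain: pitch change per spike/s of drive

def decode_tone(rates_up, rates_down):
    """Return the feedback tone frequency (Hz) for one decoding window.

    rates_up / rates_down: firing rates (spikes/s) of two ensembles of
    recorded neurons.  Raising one ensemble's activity relative to the
    other's raises the pitch, so the animal can learn, by trial and error,
    to drive the tone toward a rewarded target frequency.
    """
    drive = np.mean(rates_up) - np.mean(rates_down)
    return max(100.0, BASE_HZ + HZ_PER_SPIKE * drive)
```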

RISELab's AI research wins $10M NSF award

The RISELab, led by Prof. Ion Stoica, has received an Expeditions in Computing award from the National Science Foundation (NSF), providing $10 million in funding over five years to enable game-changing advances in real-time decision making technologies.  The award is presented to research teams pursuing large-scale, far-reaching and potentially transformative research in computer and information science and engineering.   RISELab’s award will be used to develop technology for an era in which AI systems will make decisions that will play an increasingly central role in people’s lives in areas such as healthcare, transportation and business.

Ling-Qi Yan helps to improve computer rendering of animal fur

CS graduate student Ling-Qi Yan (advisors: Ravi Ramamoorthi/Ren Ng) and researchers at U.C. San Diego are the subject of an article in TechXplore titled "Scientists improve computer rendering of animal fur."  Yan is part of a team that developed a method for dramatically improving the way computers simulate fur, and more specifically, the way light bounces within an animal's pelt.  The researchers are using a neural network to apply the properties of a concept called subsurface scattering to quickly approximate how light bounces around fur fibers.  The neural network only needs to be trained with one scene before it can apply subsurface scattering to all the different scenes with which it is presented. This results in simulations running 10 times faster than the current state of the art.  "We are converting the properties of subsurface scattering to fur fibers," said Yan. "There is no explicit physical or mathematical way to make this conversion. So we needed to use a neural network to connect these two different worlds."  The researchers recently presented their findings at the SIGGRAPH Asia conference in Thailand.
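The TechXplore article does not include the model itself, but the approach it describes – a small network that converts fur-fiber properties into the parameters of a much cheaper subsurface-scattering approximation – can be sketched as a tiny multilayer perceptron.  The layer sizes, inputs, and outputs below are made up for illustration; in the real system the weights would come from training on a single scene:

```python
import numpy as np

rng = np.random.default_rng(0)
W1, b1 = rng.normal(size=(16, 4)), np.zeros(16)  # placeholder weights;
W2, b2 = rng.normal(size=(3, 16)), np.zeros(3)   # trained values would go here

def fiber_to_scattering(fiber_params):
    """Map 4 hypothetical fur-fiber parameters (e.g. absorption, roughness,
    medulla size, azimuthal roughness) to 3 hypothetical parameters of a
    subsurface-scattering approximation."""
    hidden = np.maximum(0.0, W1 @ fiber_params + b1)  # ReLU hidden layer
    return W2 @ hidden + b2
```

Because the mapping is learned once and then reused across scenes, the renderer can replace the expensive simulation of light bouncing among individual fibers with the cheaper approximation, which is where the reported tenfold speedup comes from.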

Dan Wallach to testify about election security and voting machines in Texas

EECS alumnus Dan Wallach (B.S. '93) will testify before the Texas Senate Select Committee on Election Security at a hearing about recent election irregularities in Texas, a review of voting security protocols, and the responsibilities and duties of members of the Electoral College.  Specifically, the hearing will examine the use of electronic voting machines and paper ballots, voting fraud and disenfranchisement occurring inside nursing homes and assisted living facilities, outside interference and manipulation of elections, and the voting requirements of presidential electors.  Wallach is widely regarded as an expert on voting machine security.  He is currently a professor of computer science at Rice University and a scholar at Rice's Baker Institute for Public Policy.

Security for data analytics – gaining a grip on the two-edged sword

Prof. Dawn Song and graduate student Noah Johnson are taking a new approach that lets organizations enforce tight data security and privacy policies while still giving analysts the flexibility to run data analysis and machine learning.  Working with Uber, they tested their system using a dataset of 8 million queries written by the company’s data analysts. The system is currently being integrated into Uber’s internal data analytics platform.  With help from the Signatures Innovation Fellows program, they are advancing the system to provide the same level of security and flexibility for a broad range of data analysis and machine learning, whether needed in basic and medical research or business analytics.
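The announcement does not name the underlying mechanism, but a standard building block for answering analysts' aggregate queries under a strict privacy policy is differential privacy, for example via the Laplace mechanism.  The sketch below is a generic illustration of that technique, not the Song/Johnson system; the query and the epsilon value are hypothetical:

```python
import numpy as np

def private_count(true_count, epsilon, rng=None):
    """Answer a counting query with epsilon-differential privacy.

    A count has sensitivity 1 (adding or removing one person changes it by
    at most 1), so adding noise drawn from Laplace(0, 1/epsilon) hides any
    individual's contribution.  Smaller epsilon means stronger privacy and
    a noisier answer.
    """
    rng = rng or np.random.default_rng()
    return true_count + rng.laplace(loc=0.0, scale=1.0 / epsilon)

# Example: an analyst asks "how many trips started downtown yesterday?"
noisy_answer = private_count(true_count=1284, epsilon=0.5)
```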

Small robots with smart bodies can safely bump into obstacles

Prof. Ron Fearing's team has modified a palm-size robot with a soft, roach-like exoskeleton and six legs, called the Dynamic Autonomous Sprawled Hexapod (DASH), to use the momentum of a head-on crash to tip itself upward to climb a wall.  Kaushik Jayaram (Ph.D. Robotics/Biology '14, advisor: Robert Full) discovered how cockroaches use the energy from collisions to propel themselves up and over obstacles.  “Their bodies are doing the computing, not their brains or complex sensors,” explains Jayaram. DASH can now scurry up an incline, and if equipped with gecko toes – sticky pads that Full and Fearing have also investigated and adapted for robots – it may one day become as nimble as a cockroach. The work “shows that small robots can be built with simple, robust, smart bodies to safely bump into obstacles instead of using complex and expensive sensing and control systems," says Full.


Anca Dragan and Raluca Popa win Sloan Research Fellowships

Assistant Profs. Anca Dragan and Raluca Ada Popa have been awarded 2018 Alfred P. Sloan Research Fellowships.  They are among 126 early-career scholars who represent the most promising scientific researchers working today. Their achievements and potential place them among the next generation of scientific leaders in the U.S. and Canada. Winners receive $65,000, which may be spent over a two-year term on any expense supportive of their research.  Popa and Dragan were both selected in the Computer Science category.   Popa is a co-founder of the RISELab, where she is trying to develop a learning and analytics framework that can run on encrypted data.  Dragan runs the InterACT lab and is a PI for the Center for Human-Compatible AI.  Her goal is to enable robots to work with, around and in support of people, autonomously generating behavior in a way that formally accounts for their interactions with humans. “The Sloan Research Fellows represent the very best science has to offer,” said foundation president Adam Falk. “The brightest minds, tackling the hardest problems, and succeeding brilliantly – fellows are quite literally the future of 21st century science.”

Computer Vision to Protect Patients — and Budgets

Prof. Alexandre Bayen and PhD student Pulkit Agrawal developed a computer vision-based system to help memory care centers monitor patient falls and to reduce them where possible.  State regulations require an MRI of the head any time a patient suffers an unwitnessed fall, and about a fourth of all Alzheimer’s-related hospital visits are triggered by a fall. With five million Americans currently living with Alzheimer’s, the task of preventing, tracking and treating fall-related injuries has become daunting and costly, with more than $5 billion in annual costs to Medicare – and the number of people with Alzheimer’s is expected to double in the next 15 years.   A system capable of detecting falls by autonomously monitoring patients and sending therapists video clips could improve the monitoring process immensely.  “There are no effective drugs yet to treat Alzheimer’s,” Agrawal says. “Until we have them, we have to help patients where they are. Developing computer vision systems to detect falls and fall vulnerability seemed like a good way to improve healthcare for a growing patient population.”
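The article does not describe the team's model, but the overall pipeline – watch a video feed, track the patient, flag fall-like motion, and forward the surrounding clip to a therapist – can be illustrated with a simple heuristic layered on top of any per-frame person detector.  The class below, including its thresholds, is a hypothetical sketch, not the Berkeley system:

```python
from collections import deque

class FallDetector:
    """Flag a fall when the tracked person's image position drops sharply
    and then stays low.  center_y is the person's vertical position in
    pixels, with larger values meaning lower in the frame."""

    def __init__(self, fps=15, drop_px=120.0, still_sec=2.0):
        self.drop_px = drop_px                          # required downward jump (pixels)
        self.still_frames = int(still_sec * fps)        # how long the person must stay down
        self.history = deque(maxlen=self.still_frames + fps)

    def update(self, center_y):
        """Feed one frame's measurement; return True when a fall pattern completes."""
        self.history.append(center_y)
        if len(self.history) < self.history.maxlen:
            return False
        before = self.history[0]
        recent = list(self.history)[-self.still_frames:]
        dropped = min(recent) - before > self.drop_px              # moved sharply downward
        stayed_down = max(recent) - min(recent) < self.drop_px / 4  # and stayed there
        return dropped and stayed_down
```

In a deployment like the one described, a trigger from a detector of this kind would cause the surrounding few seconds of video to be clipped and sent to the care team for review.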

Constance Chang-Hasnain and David Tse elected members of the National Academy of Engineering

Prof. Constance Chang-Hasnain and Adjunct Prof. David Tse have been elected members of the National Academy of Engineering (NAE).   Election to the NAE is among the highest professional distinctions accorded to an engineer.  Academy membership honors those who have made outstanding contributions to "engineering research, practice, or education, including, where appropriate, significant contributions to the engineering literature" and to "the pioneering of new and developing fields of technology, making major advancements in traditional fields of engineering, or developing/implementing innovative approaches to engineering education."  Chang-Hasnain was elected "for contributions to wavelength tunable diode lasers and multiwavelength laser arrays."  Tse was elected "for contributions to wireless network information theory."    37 of the 2,293 current U.S. members are EECS faculty.