News

EECS faculty envision California's next-gen infrastructure

EE Profs. Claire Tomlin, Costas Spanos, and Connie Chang-Hasnain, and CS Prof. David Culler, are featured in a Berkeley Engineer article titled "Smart moves: California's next-gen infrastructure," which describes current UC Berkeley research projects that promise to transform the way we live.  “What’s enabling these infrastructure changes is our ability to compute faster, to share information faster and to provide that information to users very quickly,” says Tomlin.  She envisions “rail-to-drone” expressways: railroad rights-of-way converted into aerial corridors where closely spaced fleets of drones travel safely.  Spanos predicts self-monitoring buildings so smart they band together and form bargaining alliances, and Chang-Hasnain's team has been working on manufacturing highly efficient but low-cost solar cells by growing nanoscale “forests” of photovoltaic material on inexpensive silicon substrates.  The Berkeley Institute for Data Science, co-founded by Culler, is “equipping students not just to consume data but to produce insight” that will help guide the changes to come.

Amazing Salto-1P is jumping longer, faster and higher than ever

The bioinspired robot Salto-1P is featured in an IEEE Spectrum article titled "Salto-1P Is the Most Amazing Jumping Robot We've Ever Seen."  Born in Prof. Ronald Fearing's Biomimetic Millisystems Lab, Salto-1P is the most recent incarnation of the Saltatorial Locomotion on Terrain Obstacles (Salto) robot, which was pronounced the most vertically agile robot ever created last December.  Salto-1P jumps using a small motor and a system of linkages and gears.  Because it spends so little time in contact with the ground, it must do most of its control in the air, using a rotating inertial tail and two small thrusters to stabilize and reorient itself between jumps.

See Salto-1P in action as it bounces around and self-destructs.

Justine Sherry wins the 2016 ACM SIGCOMM Doctoral Dissertation Award

CS alumna Justine Sherry (M.S. '12/Ph.D. '16, advisor: Sylvia Ratnasamy) has won the ACM SIGCOMM Doctoral Dissertation Award for Outstanding PhD Thesis in Computer Networking and Data Communication.  Justine's thesis, "Middleboxes as a Cloud Service," brought the benefits of cloud computing to the networking domain.  Justine is now an assistant professor at the Carnegie Mellon School of Computer Science.

"Earable" 3D-printed ear-mounted sensors will monitor body's core temperature

EE Prof. Ali Javey and his research into "earables" are profiled in a Digital Trends article titled "3D-printed ear-mounted wearable will monitor your body's core temperature."  Earables represent a class of ‘structural electronics,’ in which sensors and electronics are embedded in the fabricated structure itself.  “Monitoring core body temperature in a continuous fashion could have important medical applications,” says Javey. “Examples can include monitoring patients with severe conditions, or infants.”  His group's goals include making the device smaller while expanding the range of sensors it implements.

Rikky Muller takes on the challenge of reverse-engineering the brain

EE Assistant Prof. Rikky Muller will give a presentation on the theme of Reverse-Engineering the Brain at this year's Global Grand Challenges Summit (GGCS).  Muller is co-founder of Cortera Neurotechnologies, a company that designs medical devices for the treatment of incurable neurological conditions. "We can use devices that record brain signals directly from the motor cortex of the brain and interpret those signals as movements," she says, "and we can use those signals to control robotic arms, for example, or mouse cursors on a screen."  The GGCS, sponsored by the National Academy of Engineering (NAE), was created to help find solutions to 14 major engineering challenges critical to the future well-being of humanity and the planet.  Muller will also receive an NAE Gilbreth Lectureship, established to recognize outstanding young engineers.

Lauren Barghout Joins Last Studio Standing as Chief Vision Scientist

Lauren Barghout, a visiting scholar at the Berkeley Initiative for Soft Computing (BISC), has been hired as Chief Vision Scientist of Last Studio Standing, the largest hand-drawn animation studio in the Western Hemisphere.  Barghout invented a Gestalt-based fuzzy inference system and labeling technique that is used commercially in image/video background removal, object recognition, and image labeling systems.  Her research has contributed to the understanding of context-dependent spatial vision, spatial masking, theoretical and computational psychophysics, and the application of fuzzy set theory to human and machine vision.  "Most animation relies solely on a direction and what's being created," she said. "This invention allows for a softer, more human-like understanding. It captures the flavor and nuance of a subject naturally -- and, again, it's softer. As such, it requires less manual cleanup."

Aviad Rubinstein helps show that game players won’t necessarily find a Nash equilibrium

CS graduate student Aviad Rubinstein (advisor: Christos Papadimitriou) is featured in a Quanta Magazine article titled "In Game Theory, No Clear Path to Equilibrium," which describes the results of his paper on game theory proving that no method of adapting strategies in response to previous games will converge efficiently to even an approximate Nash equilibrium for every possible game.  The paper, titled "Communication Complexity of Approximate Nash Equilibria," was co-authored by Yakov Babichenko and published last September.  Economists often use Nash equilibrium analyses to justify proposed economic reforms, but the new results suggest that economists can't assume game players will reach a Nash equilibrium unless they can justify what is special about the particular game in question.

Vern Paxson's cybersecurity startup Corelight raises $9.2M in Series A funding

Corelight, a cybersecurity startup co-founded by CS Prof. Vern Paxson, has raised $9.2 million in Series A funding from Accel Partners, with participation from Osage University Partners and Riverbed Technology co-founder (and former Berkeley CS professor) Dr. Steve McCanne.  Corelight provides powerful network visibility solutions for cybersecurity, built on the widely used open-source framework Bro, which Paxson developed while working at LBNL in 1995.  The Corelight Sensor, which enables wide-ranging real-time understanding of network traffic, is already being used by many of the world's most capable security operations, including Amazon and five other Fortune 100 companies.

Jose Carmena and Michel Maharbiz win 2017 McKnight Technological Innovations in Neuroscience Award

Professors Jose Carmena and Michel Maharbiz have won a 2017 McKnight Technological Innovations in Neuroscience Award for "Neural Dust: an ultrasonic, low power, extreme miniature technology for completely wireless and untethered neural recordings in the brain."  These annual awards come with a $200K prize and aim to advance the range of tools neuroscientists have to map, monitor, and model brain function.  The award will allow Carmena and Maharbiz to apply neural dust technology to the central nervous system, potentially enabling the brain to be trained or treated to restore normal function following injury or the onset of neuropsychological illness.

Compressed light field microscopy helps build a window into the brain

In a project funded by a $21.6M grant from DARPA, a light field microscope developed by EE Associate Prof. Laura Waller, MCB Assistant Prof. Hillel Adesnik, and their lab groups is being used to create a window into the brain through which researchers — and eventually physicians — can monitor and activate thousands of individual neurons using light.  The microscope is based on CS Assistant Prof. Ren Ng's revolutionary light field camera, which captures light through an array of lenses and computationally reconstructs images at any focus.  The microscope is the first tier of a two-tier device referred to as a cortical modem: it "reads" through the surface of the brain to visualize up to a million neurons, while the second tier "writes" by projecting light patterns onto these neurons using 3D holograms, stimulating them in a way that reflects normal brain activity.  The goal of the project is to read from a million individual neurons and simultaneously stimulate 1,000 of them with single-cell accuracy.  “By encoding perceptions into the human cortex," MCB Prof. Ehud Isacoff says, "you could allow the blind to see or the paralyzed to feel touch.”

Ten faculty are involved in this project, four of whom are from EECS: Laura Waller, Ren Ng, Jose Carmena, and Rikky Muller.  The project is led by Ehud Isacoff of the Helen Wills Neuroscience Institute.