News

5 questions for Randy Katz

EECS professor Randy Katz, UC Berkeley's new Vice Chancellor for Research, is interviewed in Cal Alumni's California Magazine about his approach to his new job. The article covers how to create a nurturing environment for innovative research, his predictions about future technologies, the integration of Big Data into new research, examples of some exciting projects, and the problem of funding.

Research breakthrough: StimDust is the smallest-volume, most efficient wireless nerve stimulator to date

A research team led by Assistant Prof. Rikky Muller and Prof. Michel Maharbiz has created StimDust (stimulating neural dust), the smallest-volume, most efficient wireless nerve stimulator to date. The innovation adds more sophisticated electronics to neural dust (tiny, wireless sensors first implanted by Maharbiz and Prof. Jose Carmena in 2016) without sacrificing the technology’s size or safety, greatly expanding its range of applications. Powered by ultrasound at an efficiency of 82% and with a volume of 6.5 cubic millimeters, StimDust can be used to monitor and treat disease in a real-time, patient-specific approach. “StimDust is the smallest deep-tissue stimulator that we are aware of that’s capable of stimulating almost all of the major therapeutic targets in the peripheral nervous system,” said Muller. “This device represents our vision of having tiny devices that can be implanted in minimally invasive ways to modulate or stimulate the peripheral nervous system, which has been shown to be efficacious in treating a number of diseases.” The research will be presented April 10 at the IEEE Custom Integrated Circuits Conference in San Diego.

A step forward in Stephen Derenzo's search for dark matter

Prof. Stephen Derenzo is quoted in an article for Australia’s Particle about a new material for a proposed detector of weakly interacting massive particles (WIMPs). Derenzo is the lead author of a study published March 20 in the Journal of Applied Physics about a crystal called gallium arsenide (GaAs), which features added concentrations, or “dopants,” of silicon and boron. This material possesses a scintillation property--it lights up in particle interactions that knock away electrons. According to Derenzo, who is a senior physicist in the Molecular Biophysics and Integrated Bioimaging Division at Berkeley Lab, the new ultrasensitive detector technology could scan for dark matter signals at energies thousands of times lower than those measurable by more conventional WIMP detectors. “It’s a privilege to be working on such an important problem in physics, but the celebration will have to wait until clear signals are seen,” he says. “It’s possible that dark matter particles are even lighter than what we can see with GaAs, and their discovery will have to wait for even more sensitive experiments.”

John Kubiatowicz and Group's (Circa 2000) Paper Named Most Influential at ASPLOS 2018

At the ASPLOS conference in late March, John Kubiatowicz and his group from 2000 were celebrated for their paper, "OceanStore: An Architecture for Global-Scale Persistent Storage." The paper was named the 2018 Most Influential Paper, and the authors receiving the award included David Bindel, Yan Chen, Steven Czerwinski, Patrick Eaton, Dennis Geels, Ramakrishna Gummadi, Sean Rhea, Hakim Weatherspoon, Chris Wells, and Ben Zhao, as well as Kubiatowicz, a long-time Berkeley CS faculty member. The paper was originally published in the Proceedings of the Ninth International Conference on Architectural Support for Programming Languages and Operating Systems (ASPLOS IX).

Dave Patterson wins ACM Turing Award with Stanford's Hennessy for Creation of RISC

The Association for Computing Machinery (ACM) announced today that the winners of the 2017 ACM Turing Award are UC Berkeley's David A. Patterson and Stanford University's John L. Hennessy for "pioneering a systematic, quantitative approach to the design and evaluation of computer architectures with enduring impact on the microprocessor industry. Hennessy and Patterson created a systematic and quantitative approach to designing faster, lower power, and reduced instruction set computer (RISC) microprocessors." (ACM) Since then, computer architects have used principles derived from their approach in a wide variety of projects in industry and academia. Today, 99% of the more than 16 billion microprocessors produced annually are RISC processors, found in nearly all smartphones, tablets, and the billions of embedded devices that comprise the Internet of Things (IoT).

EE and CE grad programs place #1 and #2 in 2019 US News rankings

The Berkeley Electrical Engineering and Computer Engineering programs have been ranked #1 and #2, respectively, by U.S. News & World Report, continuing our tradition as one of the best EECS graduate programs in the world. In the 2019 rankings, Electrical/Electronic/Communications Engineering moved up from #3 to #1, tying with MIT and Stanford. Computer Engineering held the #2 spot between MIT and UI Urbana-Champaign. The College of Engineering graduate programs held the #3 spot behind MIT and Stanford, with #1 rankings in Civil and Environmental/Environmental Health, a #2 ranking in Chemical Engineering, #3 rankings in Industrial/Manufacturing/Systems, Materials, and Mechanical, and a #4 ranking in Biomedical/Bioengineering.

Barbara Simons proves how easy it is to hack elections—and how it can be stopped

College of Engineering Distinguished Alumna Barbara Simons (Ph.D. '81) is the subject of an article in the Daily Kos titled "Computer scientist Barbara Simons proves how easy it is to hack elections—and how it can be stopped." Simons, who runs Verified Voting, has been a longtime advocate for bringing paper ballots back to all states and exposing the perils of electronic paperless ballots. Last summer, she ran an experiment at the DEF CON hacker conference in Las Vegas in which she secured four voting machines and had two teams of hackers successfully compromise them. “Anything that’s happening in here, you can be sure [it’s something] that those intent on undermining the integrity of our election systems have already done,” she said. Simons will be a keynote speaker at the WiCSE 40th Reunion on Saturday.

Michael Jordan named Plenary Lecturer at the 2018 International Congress of Mathematicians (ICM)

Prof. Michael Jordan has been named a Plenary Lecturer at the 2018 International Congress of Mathematicians (ICM), which will take place in Rio de Janeiro, Brazil, in August. ICM is considered the world’s premier forum for presenting and discussing new mathematical discoveries. Plenary speakers are invited from around the world to present one-hour lectures, which are held without other parallel activities--an honor that has been bestowed on only a small handful of computer scientists over the 121-year history of the ICM.

Ashokavardhanan, Jung, and McConnell named KPCB Engineering Fellows

Undergraduate students Ganeshkumar Ashokavardhanan (EECS + Business M.E.T.), Naomi Jung (CS BA), and Louie McConnell (EECS + Business M.E.T.) have been selected to participate in the 2018 KPCB Engineering Fellows Program, named one of the top 5 internship programs by Vault. Over the course of a summer, KPCB Engineering Fellows join portfolio companies, where they develop their technical skills and are each mentored by an executive within the company. The program offers students an opportunity to gain significant work experience at Silicon Valley startups, collaborating on unique and challenging technical problems.

AI training may leak secrets to canny thieves

A paper released on arXiv last week by a team of researchers including Prof. Dawn Song and Ph.D. student Nicholas Carlini (B.A. CS/Math '13) reveals just how vulnerable deep learning is to information leakage. The researchers labelled the problem “unintended memorization” and explained that it can be exploited if miscreants gain access to the model’s code and apply a variety of search algorithms. That's not an unrealistic scenario, considering that the code for many models is available online, and it means that text messages, location histories, emails, or medical data can be leaked. The team doesn't “really know why neural networks memorize these secrets right now,” Carlini says. “At least in part, it is a direct response to the fact that we train neural networks by repeatedly showing them the same training inputs over and over and asking them to remember these facts.” The best way to avoid the problem is to never feed secrets in as training data, but if that is unavoidable, developers will have to apply differentially private learning mechanisms to bolster security, Carlini concluded.
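
The differentially private learning Carlini mentions typically works by bounding each training example's influence on the model and adding calibrated noise to the updates. The sketch below is a minimal, generic illustration of that core idea (a DP-SGD-style step for a toy logistic regression in NumPy); it is not code from the paper, and the data, hyperparameters (`clip_norm`, `noise_multiplier`, `lr`), and function names are made up for illustration.

```python
# Minimal sketch of a DP-SGD-style update: per-example gradient clipping plus
# Gaussian noise. Illustrative only -- toy data and hypothetical hyperparameters.
import numpy as np

rng = np.random.default_rng(0)

# Toy binary-classification data standing in for sensitive training records.
X = rng.normal(size=(256, 10))
true_w = rng.normal(size=10)
y = (X @ true_w + 0.1 * rng.normal(size=256) > 0).astype(float)

def per_example_grads(w, Xb, yb):
    """Logistic-loss gradient for each example in the batch (not yet averaged)."""
    p = 1.0 / (1.0 + np.exp(-(Xb @ w)))   # predicted probabilities
    return (p - yb)[:, None] * Xb          # shape: (batch_size, dim)

def dp_sgd_step(w, Xb, yb, lr=0.1, clip_norm=1.0, noise_multiplier=1.1):
    g = per_example_grads(w, Xb, yb)
    # 1. Clip each example's gradient so no single record dominates the update.
    norms = np.linalg.norm(g, axis=1, keepdims=True)
    g = g / np.maximum(1.0, norms / clip_norm)
    # 2. Add Gaussian noise scaled to the clipping bound, then average.
    noise = rng.normal(scale=noise_multiplier * clip_norm, size=w.shape)
    noisy_mean = (g.sum(axis=0) + noise) / len(Xb)
    return w - lr * noisy_mean

w = np.zeros(10)
for epoch in range(50):
    for i in range(0, len(X), 32):
        w = dp_sgd_step(w, X[i:i + 32], y[i:i + 32])

accuracy = np.mean(((X @ w) > 0) == y)
print(f"training accuracy with DP-style updates: {accuracy:.2f}")
```

In practice, developers would likely reach for an existing implementation of differentially private training (libraries such as TensorFlow Privacy or Opacus) rather than hand-rolling the update, since those tools also account for the cumulative privacy budget over the course of training.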