News

Michael Chen awarded SPIE Optics and Photonics Education Scholarship

Grad student Michael Chen (advisor: Laura Waller) has been awarded a 2018 Optics and Photonics Education Scholarship by SPIE, the international society for optics and photonics, for his potential contributions to the field of optics, photonics, or a related field. Chen works in the Computational Imaging Lab, where he focuses on non-invasive, multi-dimensional phase imaging. “Nowadays, computation enables us to truly utilize full capacity of existing imaging system and extract new information from decade-old optical designs. By jointly designing the optical hardware and post processing software, we deliver simple yet powerful computational imaging techniques,” he said.

PerfFuzz wins ISSTA18 Distinguished Paper Award

"PerfFuzz: Automatically Generating Pathological Inputs," written by graduate students Caroline Lemieux and Rohan Padhye, and Profs. Koushik Sen and Dawn Song, will receive a Distinguished Paper Award from the ACM SIGSOFT International Symposium on Software Testing and Analysis (ISSTA) 2018 in Amsterdam in July.  PerfFuzz is a method to automatically generate inputs for software programs via feedback-directed mutational fuzzing.  These inputs exercise pathological behavior across program locations, without any domain knowledge.   The authors found that PerfFuzz outperforms prior work by generating inputs that exercise the most-hit program branch 5x to 69x times more, and result in 1.9x to 24.7x longer total execution paths.

Microsoft acquires Semantic Machines

Semantic Machines, an artificial intelligence startup co-founded by Prof. Dan Klein and staffed by a number of EECS alumni, has been acquired by Microsoft to help Cortana hold more natural dialog with users. The team has built a number of machine learning components that work together to support smarter conversational AI, moving beyond the more basic back-and-forth currently supported by the Google Assistant, Apple’s Siri, and Amazon’s Alexa.

In addition to Klein, the team includes Percy Liang (Ph.D. '11), David Hall (Ph.D. '12), Adam Pauls (Ph.D. '12), David Burkett (Ph.D. '12), Jason Wolfe (Ph.D. '11, advisor: Stuart Russell), Yuchen Zhang (Ph.D. '16), Taylor Berg-Kirkpatrick (B.A. '08/Ph.D. '15), Greg Durrett (Ph.D. '16), Alex Nisnevich (M.S. '14), current grad student Jacob Andreas, Charles Chen (B.A. CS/Math '11), Andrew Nguyen (B.A. CS/Linguistics '12), Chuck Wooters (Ph.D. Speech Recognition '93), and consultant Prof. Michael Jordan.

EECS M.Eng. project wins 2018 Fung Institute Award for the Most Innovative Project

Assistant Prof. Rikky Muller and her Master of Engineering (M.Eng.) students Jingbo Wu, Sherwin Lau, Paul Meyer-Rachner, Chen Fu, and Mary Lee Lawrence have received the 2018 Fung Institute Award for the Most Innovative Project. The award is given to the team that most effectively demonstrates the relevance of the problem they are trying to solve, the originality of their proposed solution, and the potential impact of their project. Their research project, "Neurodetect: On-Chip Biosignal Computation for Health Monitoring," was selected from over 100 capstone projects in the College of Engineering.

Luke Strgar thinks blockchain can be used to track gun sales in America

Graduating CS senior Luke Strgar thinks he might have a solution for the fraught issue of guns in America: use blockchain to track gun sales. Strgar believes blockchain offers a balance of security, anonymity, and scale that could please people on all sides of the gun-control debate. He spent two days in Washington, D.C. this month pitching the idea of a centralized, ultra-secure, online gun-sale database to legislative aides and think-tank analysts. Such a database could be monitored by everyone and could not be abused by the government. “The goal here is to find a solution that both parties can agree on,” Strgar said. “I am not interested in developing something for one side of the discussion, that people try to force down the throat of parties coming from the other side. One of the nice things about technology is that you can develop systems that work for people.”

Nick Carlini embeds hidden commands for Alexa and Siri in recordings of music and spoken text

CS graduate student Nicholas Carlini is featured in a New York Times article titled "Alexa and Siri Can Hear This Hidden Command. You Can’t." He and his advisor, David Wagner, have published a paper showing that they can embed audio instructions, undetectable by human beings, directly into recordings of music or spoken text. These hidden commands can secretly activate the artificial intelligence systems on smartphones and smart speakers, making them dial phone numbers or open websites. In the wrong hands, the technology could be used to unlock doors, wire money or buy stuff online — simply with music playing over the radio. “We want to demonstrate that it’s possible,” he said, “and then hope that other people will say, ‘O.K. this is possible, now let’s try and fix it.’ ” Carlini was among a group of researchers who showed in 2016 that they could hide commands in white noise played over loudspeakers and through YouTube videos to get smart devices to turn on airplane mode or open a website.

Jacque Garcia graduates a champion

Graduating CS senior Jacque Garcia, the president of Cal Boxing, is the focus of a Berkeley News article titled "Longtime fighter graduates as a champion." Garcia, who grew up in Compton and is known for her “mental toughness, determination, dedication and positive attitude,” won the 2018 132-pound National Collegiate Boxing Association (NCBA) championship belt, an Outstanding Boxer Award, and a Cal Boxing women's third-place team award. She was also both a Code2040 Fellow and a CircleCI software engineering intern in 2017, and in 2016 she worked at the Hybrid Ecologies Lab helping Ph.D. student Cesar Torres develop features of a 2.5D Computer Aided Design (CAD) tool that reduces the complexity of digital modeling by using grey-scale height maps. Garcia credits the student organization Code the Change for her decision to eventually major in Computer Science. “Graduation is going to be very emotional,” says Garcia. “I didn’t start thinking about college until I was in the eighth grade. I didn’t know if I was going to go to college, I didn’t know how I was going to pay for it. It’s going to be a surreal moment. I can’t believe it’s happening.”

Will Huang, Vedant Saran, and Alvin Wan are 2018 U.S. Imagine Cup Winners

Three EECS students are on the top two teams at the U.S. Imagine Cup Finals in San Francisco this week. In the three-day event, sponsored by Microsoft, competing teams from across the United States presented and demoed their tech projects to a panel of VIP judges. Will Huang (EECS M.Eng. program) and Vedant Saran (EECS senior) are on the first-place UC Berkeley Pengram team, which received a $10k prize plus a $1k Judges' Mixed Reality Award. The Pengram team built an AR/VR platform that allows engineers from around the world to be holographically ‘teleported’ into a workspace when needed. Alvin Wan (EECS senior) is on the UC Berkeley/Johns Hopkins Boomerang team, which placed second and received an $8k prize plus a $1k Judges' Data & IoT Award. The Boomerang team created a hybrid device and smartphone platform that monitors inhaler location for patients with asthma, notifying them of missing devices. The six winning teams will advance to the Imagine Cup World Finals this summer, where they will represent the United States for the chance to take home the trophy and win the $100,000 grand prize. The competition is designed to empower "the next generation of computer science students to team up and use their creativity, passion and knowledge of technology to create applications that shape how we live, work and play."

HäirIÖ: Human Hair as Interactive Material

CS Prof. Eric Paulos and his graduate students in the Hybrid Ecologies Lab, Sarah Sterman, Molly Nicholas, and Christine Dierk, have created a prototype of a wearable color- and shape-changing braid called HäirIÖ. The hair extension is built from a custom circuit, an Arduino Nano, an Adafruit Bluetooth board, shape memory alloy, and thermochromic pigments. The Bluetooth chip lets devices such as phones and laptops communicate with the hair, causing it to change shape and color, as well as respond when the hair is touched. Their paper, "Human Hair as Interactive Material," was presented at the ACM International Conference on Tangible, Embedded and Embodied Interaction (TEI) last week. They have posted a how-to guide and Instructables videos with comprehensive hardware, software, and electronics documentation, as well as information about the design process. "Hair is a unique and little-explored material for new wearable technologies," the guide says. "Its long history of cultural and individual expression make it a fruitful site for novel interactions."
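The team's guide covers the device firmware in detail; purely as an illustration of how a phone or laptop might drive such a Bluetooth-connected wearable, the sketch below sends a single command byte over Bluetooth Low Energy using the Python `bleak` library. The device address, the UART-style write characteristic, and the command encoding are assumptions made for this example, not HäirIÖ's actual protocol.

```python
# Hypothetical host-side sketch: tell a BLE-connected hair extension to run
# an actuation pattern. The address, characteristic UUID, and command byte
# are illustrative assumptions, not the HäirIÖ project's real interface.
import asyncio
from bleak import BleakClient

DEVICE_ADDRESS = "AA:BB:CC:DD:EE:FF"                   # placeholder peripheral address
WRITE_CHAR = "6e400002-b5a3-f393-e0a9-e50e24dcca9e"    # Nordic-UART-style write characteristic

async def send_pattern(pattern_id: int) -> None:
    """Connect to the braid and write one byte selecting an actuation pattern."""
    async with BleakClient(DEVICE_ADDRESS) as client:
        await client.write_gatt_char(WRITE_CHAR, bytes([pattern_id]))

if __name__ == "__main__":
    asyncio.run(send_pattern(0x01))   # e.g., hypothetical pattern 1 = curl + color change
```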

Michael Laskey talks DART in Robohub podcast

EECS graduate student Michael Laskey (advisor: Ken Goldberg) is interviewed by Audrow Nash for a Robohub podcast titled "DART: Noise injection for robust imitation learning." Laskey works in the AUTOLAB, where he develops new algorithms for deep learning of robust robot control policies and examines how to reliably apply recent deep learning advances to scalable robot learning in challenging, unstructured environments. In the podcast, he discusses how DART relates to previous imitation learning methods, how the approach has been used for folding bed sheets, and the importance of robotics leveraging theory from other disciplines.
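For context on the technique (a minimal sketch of noise-injection imitation learning in the spirit of DART, not the AUTOLAB implementation; the `env`, `supervisor`, and `policy` interfaces below are hypothetical):

```python
import numpy as np

def collect_noisy_demos(env, supervisor, noise_cov, horizon=100, episodes=10):
    """Roll out the supervisor while injecting Gaussian noise into the actions
    it executes, but store the noise-free supervisor action as the label, so
    the dataset covers the off-distribution states the noise pushes us into.
    `env` (reset/step) and `supervisor` are hypothetical interfaces."""
    states, labels = [], []
    for _ in range(episodes):
        s = env.reset()
        for _ in range(horizon):
            a_sup = supervisor(s)                       # supervisor's intended action
            noise = np.random.multivariate_normal(np.zeros(len(a_sup)), noise_cov)
            states.append(s)
            labels.append(a_sup)                        # learn the clean action
            s, done = env.step(a_sup + noise)           # execute the disturbed action
            if done:
                break
    return np.array(states), np.array(labels)

def fit_noise_cov(policy, states, labels, scale=1.0):
    """Estimate the injected-noise covariance from the learner's own errors on
    the demonstrations (a simplified stand-in for DART's noise optimization)."""
    errors = np.array([policy(s) - a for s, a in zip(states, labels)])
    return scale * np.cov(errors, rowvar=False)
```

The key point of this style of approach is that noise is injected only into the actions the supervisor executes, while the noise-free supervisor action is kept as the training label, so the learner observes recovery behavior from the kinds of states it is likely to drift into on its own.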