Explainable AI could reduce the impact of biased algorithms

CS Assistant Prof. Joseph Gonzalez is quoted in a VentureBeat article titled "Explainable AI could reduce the impact of biased algorithms." The article discusses the ways human bias can be introduced into machine learning-enabled systems and how the General Data Protection Regulation (GDPR) might help. Collecting data from the past is a common starting point for data science projects, but historical “data is often biased in ways that we don’t want to transfer to the future,” said Gonzalez. “It is an incredibly hard problem...but by getting very smart people thinking about this problem and trying to codify a better approach or at least state what the approach is, I think that will help make progress.”

Microsoft acquires Semantic Machines

Semantic Machines, an artificial intelligence startup co-founded by Prof. Dan Klein and staffed by a number of EECS alumni, has been acquired by Microsoft to help Cortana hold more natural dialogues with users. The team has built a number of machine learning components that work together to create a smarter AI, moving beyond the more basic back-and-forth currently supported by the Google Assistant, Apple’s Siri, and Amazon’s Alexa.

In addition to Klein, the team includes Percy Liang (Ph.D. '11), David Hall (Ph.D. '12), Adam Pauls (Ph.D. '12), David Burkett (Ph.D. '12), Jason Wolfe (Ph.D. '11 adviser: Stuart Russell), Yuchen Zhang (Ph.D. '16), Taylor Berg-Kirkpatrick (B.A. '08/Ph.D. '15), Greg Durrett (Ph.D. '16), Alex Nisnevich (M.S. '14), current grad student Jacob Andreas, Charles Chen (B.A. CS/Math '11), Andrew Nguyen (B.A. CS/Linguistics '12), Chuck Wooters (Ph.D. Speech Recognition '93), and consultant Prof. Michael Jordan.

150 Years of Innovation: John Whinnery: Fields and waves

EECS Prof. and alumnus John Whinnery (1916-2009, EE B.S. '37/Ph.D. '48) is the subject of a Berkeley Engineering article celebrating UC Berkeley's 150th year. Whinnery served as director of the Electronics Research Laboratory from 1952 to 1956, department chair from 1956 to 1959, and dean of the College of Engineering from 1959 to 1963. He was a distinguished innovator in the fields of electromagnetism and communication electronics and was recognized as one of the country’s top experts on the fundamentals of quantum electronics. He was awarded the IEEE Medal of Honor in 1985 and the National Medal of Science in 1992.

SiFive receives $50.6M in series C funding

SiFive, a fabless provider of customized semiconductors built on research by alumni Yunsup Lee (M.S. '11/Ph.D. '16) and Andrew Waterman (M.S. '11/Ph.D. '16) and Prof. Krste Asanović, received $50.6M in series C funding in April. Lee is Chief Technology Officer, Waterman is Chief Engineer, and Asanović is Chief Architect at SiFive. The funding round was co-led by Osage University Partners, Sutter Hill Ventures, Spark Capital, and Intel Capital. SiFive's semiconductors are built on RISC-V, an instruction set architecture (ISA) that acts as the interface between a computer's software and hardware. The series C round is being used to commercialize additional products based on RISC-V. The company has raised $64.1M in funding to date.

Nick Carlini embeds hidden commands for Alexa and Siri in recordings of music and spoken text

CS graduate student Nicholas Carlini is featured in a New York Times article titled "Alexa and Siri Can Hear This Hidden Command. You Can’t." He and his advisor, Prof. David Wagner, have published a paper showing that they can embed audio instructions, undetectable by human beings, directly into recordings of music or spoken text. These hidden commands can secretly activate the artificial intelligence systems on smartphones and smart speakers, making them dial phone numbers or open websites. In the wrong hands, the technology could be used to unlock doors, wire money, or buy things online, simply with music playing over the radio. “We want to demonstrate that it’s possible,” he said, “and then hope that other people will say, ‘O.K. this is possible, now let’s try and fix it.’ ” Carlini was among a group of researchers who showed in 2016 that they could hide commands in white noise played over loudspeakers and through YouTube videos to get smart devices to turn on airplane mode or open a website.
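The attack described in the article works by optimizing a tiny perturbation to a waveform so that a speech-recognition system hears a command that humans cannot. As a loose, self-contained sketch of that general idea only (a hypothetical linear "recognizer," not the authors' actual method or model), a minimum-norm perturbation can flip a toy model's decision while barely changing the signal:

```python
import numpy as np

# Toy illustration only: NOT Carlini and Wagner's attack, which optimizes a
# perturbation against a real neural speech recognizer. Here a linear model
# scores audio as w @ x, and we solve in closed form for the smallest
# perturbation delta that flips the sign of its decision.
rng = np.random.default_rng(0)
x = rng.standard_normal(16000)   # one second of synthetic "audio" at 16 kHz
w = rng.standard_normal(16000)   # the toy model's decision direction

score = w @ x
# target score: just past zero on the other side of the decision boundary
target = -score - np.copysign(1.0, score)
delta = (target - score) * w / (w @ w)   # minimum-norm perturbation
adv = x + delta                          # nearly indistinguishable from x

assert np.sign(w @ adv) != np.sign(w @ x)               # decision flips
assert np.linalg.norm(delta) < 0.2 * np.linalg.norm(x)  # perturbation is small
```

A real attack of this kind replaces the linear model with a neural recognizer and finds the perturbation by gradient descent under a constraint that keeps it inaudible to humans.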

Rikky Muller is building brain implants to change lives

Assistant Prof. Rikky Muller is featured in an Institution of Mechanical Engineers article titled "This machine can read your mind – engineers unlock secrets of the brain." The article explores some of the newest breakthroughs in brain-machine interfaces and some of the obstacles encountered by researchers. Muller, a co-founder of Cortera Neurotechnologies, discusses implant therapies like deep-brain stimulation (DBS) and assistive technologies like ‘intra-cortical recording’, where electrodes are inserted directly into patients’ neurons to allow them to control an external device. She and her colleagues are working on miniaturizing these technologies to make them wireless and less invasive. “Our vision is to create devices that are so small, safe and minimally invasive that they can be implanted in the patient for their lifetime,” she said.

Laura Waller on the appeal of working at the intersection of two fields

EE and CS Associate Prof. Laura Waller was interviewed by Computer Vision News in advance of her keynote address at the IEEE International Symposium on Biomedical Imaging (ISBI) in April. She describes the appeal of working at the intersection of two fields: the design of optical systems and computational algorithms. She also discusses breakthroughs in computational imaging and the relationship between industry and academia, and offers advice to conference attendees.

Scott Shenker wins 2017 ACM Paris Kanellakis Theory and Practice Award

Prof. Scott Shenker has been named the 2017 ACM Paris Kanellakis Theory and Practice Award recipient. The award honors specific theoretical accomplishments that have had a significant and demonstrable effect on the practice of computing. Shenker is honored for pioneering contributions to fair queueing in packet-switching networks, which had a major impact on modern practice in computer communication. His work was fundamental to helping the internet grow from a tool used by a small community of researchers to a staple of daily life used by billions. Previous winners of this award include EECS Chair Prof. James Demmel and Prof. Emeritus Robert Brayton.
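Fair queueing aims to give every traffic flow a fair share of a link's bandwidth no matter how aggressively other flows send. A minimal sketch of the general idea, using per-flow virtual finish times to approximate bit-by-bit round robin (an illustration of the concept, not the exact algorithm from the award-winning work):

```python
from collections import deque

class FairQueue:
    """Toy packet-level fair queueing: each packet gets a virtual finish
    time, and the scheduler always transmits the queued packet with the
    earliest finish time, approximating bit-by-bit round robin."""

    def __init__(self):
        self.queues = {}        # flow_id -> deque of (finish_time, packet)
        self.finish = {}        # flow_id -> last assigned finish time
        self.virtual_time = 0.0

    def enqueue(self, flow_id, packet, size):
        # A packet "starts" when the flow's previous packet finishes,
        # or now if the flow has been idle.
        start = max(self.virtual_time, self.finish.get(flow_id, 0.0))
        self.finish[flow_id] = start + size  # equal weights for all flows
        self.queues.setdefault(flow_id, deque()).append(
            (self.finish[flow_id], packet))

    def dequeue(self):
        # Serve the flow whose head packet has the earliest finish time.
        candidates = [(q[0][0], fid) for fid, q in self.queues.items() if q]
        if not candidates:
            return None
        _, fid = min(candidates)
        finish, packet = self.queues[fid].popleft()
        self.virtual_time = finish
        return packet

fq = FairQueue()
fq.enqueue('A', 'a1', 1000)   # flow A sends large packets
fq.enqueue('A', 'a2', 1000)
fq.enqueue('B', 'b1', 100)    # flow B sends small packets
fq.enqueue('B', 'b2', 100)
# B's small packets are served promptly instead of waiting behind A
assert [fq.dequeue() for _ in range(4)] == ['b1', 'b2', 'a1', 'a2']
```

Under this discipline a flow sending small packets is not starved by a flow queueing large ones, which is the kind of per-flow isolation fair queueing provides.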

Pieter Abbeel, Robert Full, and Ken Goldberg will speak at TechCrunch Sessions: Robotics 2018

Three EECS professors are featured speakers at the upcoming TechCrunch Sessions: Robotics on May 11 at Zellerbach Hall. The single-day event will focus on the crossroads of the latest AI and robotics technology and the startup ecosystem. Prof. Pieter Abbeel, who works in machine learning and robotics (and who co-founded Gradescope), will talk about "Teaching Robots New Tricks with AI." Prof. Robert Full, who has a joint appointment in the Department of Integrative Biology (and who founded CiBER), will talk about "What Robots Can Learn from Nature." Prof. Ken Goldberg, who holds appointments in IEOR, the School of Information, Art Practice, and the UCSF Dept. of Radiation Oncology, will talk about "Getting A Grip on Reality: Deep Learning and Robot Grasping." He is also the co-founder of the Center for New Media. Alumnus Paul Birkmeyer (Ph.D. '13), co-founder of Dishcraft Robotics, is also slated to speak.

Editing brain activity with holography

The research of Associate Prof. Laura Waller is highlighted in a Berkeley News article titled "Editing brain activity with holography."  Waller is co-author of a paper published in the journal Nature Neuroscience that describes a holographic brain modulator which can activate up to 50 neurons at once in a three-dimensional chunk of brain containing several thousand neurons, and repeat that up to 300 times a second with different sets of 50 neurons. The goal is to read neural activity constantly and decide, based on the activity, which sets of neurons to activate to simulate the pattern and rhythm of an actual brain response, so as to replace lost sensations after peripheral nerve damage, for example, or control a prosthetic limb. “The major advance is the ability to control neurons precisely in space and time,” said Waller's postdoc Nicolas Pégard, who is a first author of the paper.  “In other words, to shoot the very specific sets of neurons you want to activate and do it at the characteristic scale and the speed at which they normally work.”