Prospective: Visual Computing and Computer Graphics

Program Requirements

All EECS MEng students are expected to complete four (4) graduate-level technical courses within the EECS department, the Fung Institute's engineering leadership curriculum, and a capstone project hosted by the EECS department. You must select a project from the list below.

2020-2021 Capstone Design Experience

Autonomy (advisor Prof. Allen Yang)

Description - One of the fastest growing industries in the past five years is at the intersection of AI, Robotics, and Computer Vision. In particular, Silicon Valley has become a hotbed for research and commercialization of autonomous driving. Leading logistics and retail corporations have started delivering packages via unmanned aerial vehicles. Collaborative robot arms and legs are being tested in compliant co-working applications. In this Capstone Design Experience Proposal, we describe these related topics under the broad umbrella of Autonomy.

Due to the multidisciplinary nature of Autonomy and its emerging markets, we believe prospective MEng applicants will have broad interest in selecting the Autonomy curriculum and its capstone project. The FHL Vive Center is proposing a concerted effort by its center members to spearhead the creation of a new Capstone Design Experience at the Fung Institute.

Mandatory Courses:

Fall: EE221A. Linear System Theory. Lecturers: Shankar Sastry, Claire Tomlin

Spring: ME 131. Vehicle Dynamics and Control. Lecturer: Francesco Borrelli

Capstone Project: Berkeley Robot Open Autonomous Racing (ROAR) Competition. Advisor: Allen Yang

ROAR Website:

2020-2021 Capstone Projects*

For the Master of Engineering in Electrical Engineering and Computer Science (EECS) capstone projects, our department believes that students have a significantly better experience when projects are followed closely by an EECS professor throughout the academic year. To ensure this, we have asked the faculty in each area in which the Master of Engineering is offered to formulate one or more project ideas for incoming students to choose from.

Back to M.Eng. Admissions.

*We will continue to update this list as more proposals are received from our faculty for the upcoming year.


Project 1

Title - Oz Vision (advisor Prof. Ren Ng)

Description - The goal of this project is to prototype next-generation color display technology and probe the limits of human vision. The technology is based on a new concept of creating visual percepts by laser stimulation of individual retinal photoreceptors. In principle, the system has the potential to display new colors impossible in the real world, or enable a color blind person to differentiate red and green for the first time. This underlying research is in collaboration with the School of Optometry. The scope of the project can be adjusted based on the expertise of the project members. Helpful background includes some of: systems engineering, software engineering, embedded programming, computer graphics, computer vision, vision science, computational imaging, signal processing, machine learning, optical engineering, physical electronics, biomedical devices, and precision engineering.

Project 2 & 3

Title - Vision Correcting Display (advisor Prof. Brian Barsky)

Description - Vision problems such as near-sightedness and far-sightedness, among others, are due to optical aberrations in the human eye. These conditions are prevalent, and the number of people affected is growing rapidly. Correcting optical aberrations in the human eye is traditionally done optically with eyeglasses, contact lenses, or refractive surgery; these are sometimes inconvenient and not always available to everyone. Furthermore, higher-order aberrations are not correctable with eyeglasses. This research investigates a novel approach: a new computation-based, aberration-correcting light field display. By incorporating the viewer's own optical aberrations into the computation, content shown on the display is modified such that the viewer sees it in sharp focus without corrective eyewear. Our research involves the analysis of image formation models; through retinal light field projection, it is possible to compensate for the optical blurring of the target image by prefiltering with the inverse blur. As part of this project, we are building a light field display prototype that supports the desired inverse light field prefiltering. We are working towards extending this capability to correct higher-order aberrations. This latter aspect is particularly exciting since it would enable people who cannot see displays in sharp focus even with eyeglasses to do so with no corrective eyewear at all. This is a broad project that incorporates many different aspects; students from a variety of backgrounds are welcome, and each student is free to choose the part that interests them most.
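The inverse-blur prefiltering idea above can be sketched in the frequency domain, where convolution by the eye's blur becomes multiplication. The sketch below assumes, purely for illustration, a known Gaussian point-spread function and a Wiener-style regularized inverse; the project's actual light field pipeline is considerably more involved, and every function name and parameter here is hypothetical:

```python
import numpy as np

def gaussian_psf(shape, sigma):
    """Centered, normalized Gaussian point-spread function (stand-in for eye blur)."""
    ys, xs = np.indices(shape)
    cy, cx = (shape[0] - 1) / 2.0, (shape[1] - 1) / 2.0
    psf = np.exp(-((ys - cy) ** 2 + (xs - cx) ** 2) / (2 * sigma ** 2))
    return psf / psf.sum()

def wiener_prefilter(image, psf, k=0.01):
    """Prefilter an image so that blurring it afterwards approximates the original.

    A raw inverse 1/H explodes where the blur kills frequencies, so we use
    the regularized Wiener form H* / (|H|^2 + k) instead.
    """
    H = np.fft.fft2(np.fft.ifftshift(psf))   # blur as a frequency response
    F = np.fft.fft2(image)
    G = F * np.conj(H) / (np.abs(H) ** 2 + k)
    return np.real(np.fft.ifft2(G))

def blur(image, psf):
    """Simulate the eye's optical blur (circular convolution via FFT)."""
    H = np.fft.fft2(np.fft.ifftshift(psf))
    return np.real(np.fft.ifft2(np.fft.fft2(image) * H))

# Prefilter a target image, then apply the simulated eye blur: the
# "perceived" result should be closer to the target than a plainly
# blurred version of it.
target = np.random.rand(64, 64)
psf = gaussian_psf(target.shape, sigma=1.5)
perceived = blur(wiener_prefilter(target, psf), psf)
```

The regularization constant `k` trades sharpness for ringing: smaller values recover more high-frequency detail but amplify any mismatch between the assumed and true blur.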


Title - Assistive Technology for Navigation, Selection, Pointing, and Clicking in a Mouse-free Environment by Capturing Hand Movements (advisor Prof. Brian Barsky)

Description - This project concerns assistive technology that enables users with fine motor control difficulties to navigate, select, point, and click without physically manipulating a mouse. The idea is to use a camera to capture the user's hand movements. Many individuals do not have the ability to control the pointer easily by moving a physical mouse, and unfortunately there are no viable alternatives because the mouse has become an essential input device for all modern computers. Inability to control the mouse can be caused by impaired sensation arising from a wide variety of conditions and illnesses. This project aims to help those users by developing a computer-vision-based input system with a camera as its input device. The current project is developing a system comprising three modules: detection, tracking, and response. (1) The detection stage extracts the hand and recognizes its gesture, which can then be mapped to user controls; for example, a specific movement could correspond to a click of the mouse. (2) The hand is followed by a tracker, and its movement is passed through an anti-shake filter to produce more stable motion. (3) In the response stage, the granularity (e.g., dots per inch) of the cursor is adjusted according to the speed of the user's hand movement.
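The anti-shake filtering of module (2) and the speed-adaptive cursor granularity of module (3) could be prototyped along these lines. This is only a sketch of the idea, not the project's implementation: the class name, the exponential-moving-average filter, and all gains and thresholds are illustrative assumptions.

```python
class CursorController:
    """Smooths raw hand positions and adapts cursor gain to hand speed.

    Anti-shake: an exponential moving average over raw (x, y) samples.
    Response: slow hand movement gets a low gain (fine control),
    fast movement gets a high gain (quick traversal of the screen).
    """

    def __init__(self, alpha=0.3, slow_gain=0.5, fast_gain=2.0, speed_split=10.0):
        self.alpha = alpha              # EMA weight given to the newest sample
        self.slow_gain = slow_gain      # cursor px per smoothed hand px, slow regime
        self.fast_gain = fast_gain      # same, fast regime
        self.speed_split = speed_split  # px/frame threshold between regimes
        self.smoothed = None            # last smoothed hand position
        self.cursor = (0.0, 0.0)        # current cursor position

    def update(self, raw_x, raw_y):
        """Feed one raw hand position; returns the new cursor position."""
        if self.smoothed is None:       # first sample: initialize, don't move
            self.smoothed = (float(raw_x), float(raw_y))
            return self.cursor
        sx, sy = self.smoothed
        # Anti-shake: blend the new sample into the running average.
        nx = self.alpha * raw_x + (1 - self.alpha) * sx
        ny = self.alpha * raw_y + (1 - self.alpha) * sy
        dx, dy = nx - sx, ny - sy
        speed = (dx * dx + dy * dy) ** 0.5
        # Speed-adaptive granularity: pick the gain for this frame.
        gain = self.slow_gain if speed < self.speed_split else self.fast_gain
        cx, cy = self.cursor
        self.cursor = (cx + gain * dx, cy + gain * dy)
        self.smoothed = (nx, ny)
        return self.cursor

# A small jitter in the hand produces an even smaller, damped cursor move,
# while a large sweep is amplified.
ctrl = CursorController()
ctrl.update(0, 0)                 # initialize at the origin
slow_pos = ctrl.update(2, 0)      # tiny move: smoothed dx = 0.6, slow gain 0.5
fast_pos = CursorController().update(0, 0) or None
```

In a full system, the raw `(x, y)` samples would come from the detection and tracking modules each camera frame, and the gesture recognizer would drive clicks separately.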

Technical Courses

At least THREE of your four technical courses should be chosen from the list below. The remaining technical courses should be chosen from your own or another MEng area of concentration within the EECS Department.

Fall 2019 (Updated: 7/10/19)

Spring 2020 (Updated: 10/7/2019)

Note: The courses listed here are not guaranteed to be offered, and the course schedule may change without notice. Refer to the UC Berkeley Course Schedule for further enrollment information.