Computer Science

Graduate Study

For information on graduate admission, see Graduate Programs on page 26.

The following introductory information is based on 2019-20 program requirements for UCLA graduate degrees. Complete program requirements are available at Program Requirements for Graduate Degrees. Students are subject to the detailed degree requirements as published in program requirements for the year in which they enter the program.

The Department of Computer Science offers Master of Science (M.S.) and Doctor of Philosophy (Ph.D.) degrees in Computer Science and participates in a concurrent degree program (Computer Science M.S./Management M.B.A.) with the John E. Anderson Graduate School of Management.

Computer Science M.S.

Course Requirements

Course Requirement. A total of nine courses is required for the M.S. degree, including a minimum of five graduate courses. No specific courses are required, but a majority of both the total number of formal courses and the total number of graduate courses must consist of courses offered by the Computer Science Department.

Undergraduate Courses. No lower-division courses may be applied toward graduate degrees. In addition, the following upper-division courses are not applicable toward graduate degrees: Chemical Engineering 102A, 199, Civil and Environmental Engineering 108, 199, Computer Science M152A, 152B, 199, Electrical and Computer Engineering 100, 101A, 102, 110L, M116L, 199, Materials Science and Engineering 110, 120, 130, 131, 131L, 132, 141L, 150, 160, 161L, 199, Mechanical and Aerospace Engineering 102, 103, 105A, 105D, 199.

Breadth Requirement. M.S. degree students must satisfy the computer science breadth requirement by the end of the third term in graduate residence at UCLA. The requirement is satisfied by mastering the contents of five undergraduate courses or equivalent: Computer Science 180, two courses from 111, 118, and M151B, one course from 130, 131, or 132, and one course from 143, 161, or 174A. A UCLA undergraduate course taken by graduate students cannot be used to satisfy graduate degree requirements if students have already received a grade of B– or better for a course taken elsewhere that covers substantially the same material.

For the M.S. degree, students must also complete at least three terms of Computer Science 201 with grades of Satisfactory.

Competence in any or all courses in breadth requirements may be demonstrated in one of three ways:

  1. Satisfactory completion of the course at UCLA with a grade of B– or better
  2. Satisfactory completion of an equivalent course at another university with a grade of B– or better
  3. Satisfactory completion of a final examination in the course at UCLA

Comprehensive Examination Plan

In the comprehensive examination plan, at least five of the nine courses must be 200-series courses. The remaining four courses may be either 200-series or upper-division courses. No units of 500-series courses may be applied toward the comprehensive examination plan requirements.

Thesis Plan

In the thesis plan, seven of the nine courses must be formal courses, including at least four from the 200 series. The remaining two courses may be 598 courses involving work on the thesis.

The thesis is a report on the results of student investigation of a problem in the major field of study under the supervision of the thesis committee, which approves the subject and plan of the thesis and reads and approves the complete manuscript. While the problem may be one of only limited scope, the thesis must exhibit a satisfactory style, organization, and depth of understanding of the subject. Students should normally start to plan the thesis at least one year before the award of the M.S. degree is expected. There is no examination under the thesis plan.

Computer Science M.S./Management M.B.A.

The Department of Computer Science and the John E. Anderson Graduate School of Management offer a concurrent degree program that enables students to complete the requirements for the M.S. in Computer Science and the M.B.A. (Master of Business Administration) in three academic years. Students should request application materials from both the M.B.A. Admissions Office, John E. Anderson Graduate School of Management, and the Department of Computer Science.

Computer Science Ph.D.

Major Fields or Subdisciplines

Artificial intelligence; computational systems biology; computer networks; computer science theory; computer system architecture; graphics and vision; data science computing; and software systems.

Course Requirements

Normally, students take courses to acquire the knowledge needed to prepare for the written and oral examinations and for conducting Ph.D. research. The basic program of study for the Ph.D. degree is built around the major field requirement and two minor fields. The major field and at least one minor field must be in computer science.

The fundamental examination, also known as the written qualifying examination, is common to all Ph.D. candidates in the department.

To satisfy the major field requirement, students are expected to master a body of knowledge contained in five courses, as well as the current literature in the area of specialization. In particular, students are required to take a minimum of three graduate courses in the major field of Ph.D. research, selecting these courses in accordance with guidelines specific to the major field. Guidelines for course selection in each major field are available from the departmental Student Affairs Office. Grades of B– or better, with a grade-point average of at least 3.33 in all courses used to satisfy the major field requirement, are required. Students are required to satisfy the major field requirement within the first nine terms after enrolling in the graduate program.

Each minor field normally embraces a body of knowledge equivalent to two courses, at least one of which is a graduate course. Grades of B– or better, with a grade-point average of at least 3.33 in all courses included in the minor field, are required. By petition and administrative approval, a minor field may be satisfied by examination.

Breadth Requirement. Ph.D. degree students must satisfy the computer science breadth requirement by the end of the third term in graduate residence at UCLA. The requirement is satisfied by mastering the contents of five undergraduate courses or equivalent: Computer Science 180, two courses from 111, 118, and M151B, one course from 130, 131, or 132, and one course from 143, 161, or 174A. A UCLA undergraduate course taken by graduate students cannot be used to satisfy graduate degree requirements if students have already received a grade of B– or better for a course taken elsewhere that covers substantially the same material.

For the Ph.D. degree, students must also complete at least three terms of Computer Science 201 with grades of Satisfactory (in addition to the three terms of 201 that may have been completed for the M.S. degree).

Competence in any or all courses may be demonstrated in one of three ways:

  1. Satisfactory completion of the course at UCLA with a grade of B– or better
  2. Satisfactory completion of an equivalent course at another university with a grade of B– or better
  3. Satisfactory completion of a final examination in the course at UCLA

For requirements for the Graduate Certificate of Specialization, see Engineering Schoolwide Programs.

Written and Oral Qualifying Examinations

The written qualifying examination consists of a high-quality paper, solely authored by the student. The paper can be either a research paper containing an original contribution or a focused critical survey paper. The paper should demonstrate that the student understands and can integrate and communicate ideas clearly and concisely. It should be approximately 10 pages single-spaced, and the style should be suitable for submission to a first-rate technical conference or journal. The paper must represent work that the student did as a graduate student at UCLA. Any contributions that are not the student’s own, including those of the student’s adviser, must be explicitly acknowledged in detail. Prior to submission, the paper must be reviewed by the student’s adviser, whose signature on a cover page indicates that review. After submission, the paper must be reviewed and approved by at least two other members of the faculty. There are two deadlines each year for submission of papers.

After passing the preliminary examination and completing the coursework for the major and minor fields, the student should form a doctoral committee and prepare to take the University Oral Qualifying Examination. A doctoral committee consists of a minimum of four members. Three members, including the chair, must hold appointments in the department. The remaining member must be a UCLA faculty member in another department. The nature and content of the oral qualifying examination are at the discretion of the doctoral committee but ordinarily include a broad inquiry into the student’s preparation for research. The doctoral committee also reviews the prospectus of the dissertation at the oral qualifying examination.

Fields of Study

Artificial Intelligence

Artificial intelligence (AI) is the study of intelligent behavior. While other fields such as philosophy, psychology, neuroscience, and linguistics are also concerned with the study of intelligence, the distinguishing feature of AI is that it deals primarily with information processing models. Thus the central scientific question of artificial intelligence is how intelligent behavior can be reduced to information processing. Since even the simplest computer is a completely general information processing device, the test of whether some behavior can be explained by information processing mechanisms is whether a computer can be programmed to produce the same behavior. Just as human intelligence involves gathering sensory input and producing physical action in the world, in addition to purely mental activity, the computer for AI purposes is extended to include sense organs such as cameras and microphones, and output devices such as wheels, robotic arms, and speakers.

The predominant research paradigm in artificial intelligence is to select some behavior that seems to require intelligence on the part of humans, to theorize about how the behavior might be accounted for, and to implement the theory in a computer program to produce the same behavior. If successful, such an experiment lends support to the claim that the selected behavior is reducible to information processing terms, and may suggest the program’s architecture as a candidate explanation of the corresponding human process.

The UCLA Computer Science Department has active research in the following major subfields of artificial intelligence:

  1. Problem solving. Analysis of tasks, such as playing chess or proving theorems, that require reasoning about relatively long sequences of primitive actions, deductions, or inferences
  2. Knowledge representation and qualitative reasoning. Analysis of tasks such as common-sense reasoning and qualitative physics. Here the deductive chains are short, but the amount of knowledge that potentially may be brought to bear is very large
  3. Expert systems. Study of large amounts of specialized or highly technical knowledge that is often probabilistic in nature. Typical domains include medical diagnosis and engineering design
  4. Natural language processing. Symbolic, statistical, and artificial neural network approaches to text comprehension and generation
  5. Computer vision. Processing of images, as from a TV camera, to infer spatial properties of the objects in the scene (three-dimensional shape), their dynamics (motion), their photometry (material and light), and their identity (recognition)
  6. Robotics. Translation of a high-level command, such as picking up a particular object, into a sequence of low-level control signals that might move the joints of a robotic arm/hand combination to accomplish the task; often this involves using a computer vision system to locate objects and provide feedback
  7. Machine learning. Study of the means by which a computer can automatically improve its performance on a task by acquiring knowledge about the domain
  8. Parallel architecture. Design and programming of a machine with thousands or even millions of simple processing elements to produce intelligent behavior; the human brain is an example of such a machine

Computational Systems Biology

The computational systems biology (CSB) field can be selected as a major or minor field for the Ph.D. or as a specialization area for the M.S. degree in Computer Science.

Graduate studies and research in the CSB field are focused on computational modeling and analysis of biological systems and biological data.

Core coursework is concerned with the development of methods and tools for computational, algorithmic, and dynamic systems network modeling of biological systems at molecular, cellular, organ, whole organism, or population levels—and leveraging them in biosystem and bioinformatics applications. Methodological studies include bioinformatics and systems biology modeling, with focus on genomics, proteomics, metabolomics, and higher levels of biological/physiological organization, as well as multiscale approaches integrating the parts.

Typical research areas with a systems focus include molecular and cellular systems biology, organ systems physiology, medical, pharmacological, pharmacokinetic (PK), pharmacodynamic (PD), toxicokinetic (TK), physiologically based PK-PD (PBPK-PD) and TK (PBTK), and pharmacogenomic system studies; neuro-systems, imaging and remote sensing systems, robotics, learning and knowledge-based systems, visualization, and virtual clinical environments. Typical research areas with a bioinformatics focus include development of computational methods for analysis of high-throughput molecular data, including genomic sequences, gene expression data, protein-protein interaction, and genetic variation. These computational methods leverage techniques from both statistics and algorithms.

Computer Networks

The computer networks field involves the study of computer networks of different types, in different media (wired, wireless), and for different applications. Besides the study of network architectures and protocols, this field also emphasizes distributed algorithms, distributed systems, and the ability to evaluate system performance at various levels of granularity (but principally at the systems level). In order to understand and predict systems behavior, mathematical models are pursued that lead to the evaluation of system throughput, response time, utilization of devices, flow of jobs and messages, bottlenecks, speedup, power, etc. In addition, students are taught to design and implement computer networks using formal design methodologies subject to appropriate cost and objective functions. The tools required to carry out this design include probability theory, queueing theory, distributed systems theory, mathematical programming, control theory, operating systems design, simulation methods, measurement tools, and heuristic design procedures. The outcome of these studies provides the following:

  1. An appropriate model of the computer system under study
  2. An adequate (exact or approximate) analysis of the behavior of the model
  3. The validation of the model as compared to simulation and/or measurement of the system
  4. Interpretation of the analytical results in order to obtain behavioral patterns and key parameters of the system
  5. Design methodology
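
The mathematical models mentioned above can be made concrete with the simplest result from queueing theory, the M/M/1 queue, which relates arrival rate and service rate to utilization, mean queue length, and mean response time. The following sketch is purely illustrative (the function name and numbers are hypothetical, not part of the curriculum):

```python
# Minimal M/M/1 steady-state calculations -- a standard model of the kind
# used in the performance-evaluation work described above.

def mm1_metrics(arrival_rate: float, service_rate: float) -> dict:
    """Return classic M/M/1 steady-state metrics.

    arrival_rate (lambda) and service_rate (mu) are in jobs per second;
    the system is stable only when lambda < mu.
    """
    if arrival_rate >= service_rate:
        raise ValueError("unstable queue: arrival rate must be below service rate")
    rho = arrival_rate / service_rate                   # server utilization
    mean_jobs = rho / (1 - rho)                         # avg. jobs in system
    mean_response = 1 / (service_rate - arrival_rate)   # avg. time in system
    return {"utilization": rho,
            "mean_jobs": mean_jobs,
            "mean_response_s": mean_response}

# Example: packets arrive at 80/s at a link that can serve 100/s.
m = mm1_metrics(80.0, 100.0)
print(m)  # utilization 0.8, mean_jobs 4.0, mean_response_s 0.05
```

Note how response time grows without bound as utilization approaches 1 — the kind of bottleneck behavior these models are used to predict.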

Resource Allocation

A central problem in the design and evaluation of computer networks deals with the allocation of resources among competing demands (e.g., wireless channel bandwidth allocation to backlogged stations). In fact, resource allocation is a significant element in most of the technical (and nontechnical) problems we face today.

Most of our resource allocation problems arise from the unpredictability of the demand for the use of these resources, as well as from the fact that the resources are geographically distributed (as in computer networks). The computer networks field encounters such resource allocation problems in many forms and in many different computer system configurations. Our goal is to find allocation schemes that permit suitable concurrency in the use of devices (resources) so as to achieve efficiency and equitable allocation. A very popular approach in distributed systems is allocation on demand, as opposed to pre-scheduled allocation. On-demand allocation is found to be effective, since it takes advantage of statistical averaging effects. It comes in many forms in computer networks and is known by names such as asynchronous time division multiplexing, packet switching, frame relay, random access, and so forth.
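
The statistical averaging effect behind on-demand allocation can be illustrated with a small binomial calculation: if sources are bursty, a shared pool far smaller than one-channel-per-source almost never overloads. This is a hedged sketch with hypothetical numbers, not a description of any specific protocol:

```python
# Why on-demand (statistically multiplexed) allocation beats pre-scheduled
# allocation for bursty traffic: the chance that many independent sources
# are active simultaneously falls off sharply.

from math import comb

def p_overload(n_sources: int, p_active: float, capacity: int) -> float:
    """Probability that more than `capacity` of n independent bursty
    sources (each active with probability p_active) transmit at once."""
    return sum(comb(n_sources, k) * p_active**k * (1 - p_active)**(n_sources - k)
               for k in range(capacity + 1, n_sources + 1))

# 100 sources, each busy 10% of the time. Pre-scheduled allocation would
# reserve 100 channels; sharing only 20 on demand overloads very rarely.
print(p_overload(100, 0.1, 20))
```

The shared pool serves the average demand (here, about 10 active sources) with modest headroom, rather than the worst-case demand of all 100.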

Computer Science Theory

Computer science is in large measure concerned with information processing systems, their applications, and the corresponding problems of representation, transformation, and communication. The computer science fields are concerned with different aspects of such systems, and each has its own theoretical component with appropriate models for description and analysis, algorithms for solving the related problems, and mathematical tools. Thus in a certain sense computer science theory involves all of computer science and contributes to all of its disciplines.

The term theoretical computer science has come to be applied nationally and internationally to a certain body of knowledge emphasizing the interweaving themes of computability and algorithms, interpreted in the broadest sense. Under computability, one includes questions concerning which tasks can and cannot be performed by information systems of different types restricted in various ways, as well as the mathematical analysis of such systems, their computations, and the languages for communication with them. Under algorithms, one includes questions concerning (1) how a task can be performed efficiently under reasonable assumptions on available resources (e.g., time, storage, type of processor), (2) how efficiently a proposed system performs a task in terms of resources used, and (3) the limits on how efficiently a task can be performed. These questions are often addressed by first developing models of the relevant parts of an information processing system (e.g., the processors, their interconnections, their rules of operation, the means by which instructions are conveyed to the system, or the way the data is handled) or of the input/output behavior of the system as a whole. The properties of such models are studied both for their own interest and as tools for understanding the system and improving its performance or applications.
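
Question (2) above — measuring how efficiently a system performs a task in terms of resources used — can be shown in miniature by instrumenting two algorithms for the same task and counting a concrete resource, here comparisons. All function names and inputs are illustrative:

```python
# Two ways to perform the same task (membership search in a sorted list),
# compared by a concrete resource: the number of comparisons performed.

def linear_search(xs, target):
    """Scan left to right: O(n) comparisons in the worst case."""
    comparisons = 0
    for x in xs:
        comparisons += 1
        if x == target:
            return True, comparisons
    return False, comparisons

def binary_search(xs, target):
    """Halve a sorted list each step: O(log n) comparisons."""
    lo, hi, comparisons = 0, len(xs) - 1, 0
    while lo <= hi:
        mid = (lo + hi) // 2
        comparisons += 1
        if xs[mid] == target:
            return True, comparisons
        if xs[mid] < target:
            lo = mid + 1
        else:
            hi = mid - 1
    return False, comparisons

data = list(range(1024))          # sorted input of size n = 1024
print(linear_search(data, 1023))  # (True, 1024): worst case, n comparisons
print(binary_search(data, 1023))  # (True, 11): about log2(n) + 1
```

The gap between n and log n comparisons on the same input is exactly the kind of resource analysis the algorithms theme formalizes.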

Emphasis of Computer Science Theory

Computer science theory emphasizes the interweaving themes of computability and algorithms described above, together with the mathematical models and tools used to analyze them.

Computer System Architecture

Computer system architecture deals with the design, implementation, and evaluation of computer systems and their building blocks. It deals with general-purpose systems as well as embedded special-purpose systems. The field also encompasses the development of tools to enable system designers to describe, model, fabricate, and test highly complex computer systems, from single chips to computing clouds.

Computer systems are implemented as a combination of hardware and software. Hence, research in the field of computer architecture often involves both hardware and software issues. The requirements of application software and operating systems, together with the capabilities of compilers, play a critical role in determining the features implemented in hardware. At the same time, the computer architect must also take into account the capabilities and limitations of the underlying implementation technology as well as of the design tools.

The goal of research in computer architecture is to develop building blocks, system organizations, design techniques, and design tools that lead to improved performance and reliability as well as reduced power consumption and cost.

Corresponding to the richness and diversity of computer system architecture research at UCLA, a comprehensive set of courses is offered in the areas of advanced processor architecture, arithmetic processor systems, parallel and distributed architectures, fault-tolerant systems, reconfigurable systems, embedded systems, and computer-aided design of VLSI circuits and systems.

  1. Novel architectures encompass the study of computations that are performed in ways quite different from those used by conventional machines. Examples include various domain-specific architectures characterized by high computational rates, low power, and reconfigurable hardware, used in a wide range of computing devices from smart phones to data centers.
  2. The study of high-performance processing algorithms deals with algorithms for very high-performance numerical processing. Techniques such as redundant-digit representations of number systems, fast arithmetic, and the use of highly parallel arrays of processing elements are studied with the goal of providing the extremely high processing speeds required in a number of upcoming computer applications.
  3. The study of computational algorithms and structures deals with the relationship between computational algorithms and the physical structures that can be employed to carry them out. It includes the study of interconnection networks, and the way that algorithms can be formulated for efficient implementation where regularity of structure and simplicity of interconnections are required.
  4. Computer-aided design of VLSI circuits and systems is an active research area that develops techniques for the automated synthesis and analysis of large-scale systems. Topics include high-level and logic-level synthesis, technology mapping, physical design, interconnect modeling, and optimization of various VLSI technologies such as full-custom designs, standard cells, programmable logic devices (PLDs), multichip modules (MCMs), and systems-on-chip (SoCs), which are used in a wide range of applications from IoT devices to data centers.
  5. VLSI architectures and implementation is an area of current interest and collaboration between the Electrical and Computer Engineering and Computer Science departments that addresses the impact of large-scale integration on the issues of computer architecture. Application of these systems in medicine and healthcare, multimedia, and finance is being studied in collaboration with other schools on campus.

Data Science Computing

The data science computing field focuses on basic problems of modeling and managing data and knowledge, and their relation with other fundamental areas of computer science, such as operating systems and networking, programming languages, and human-computer interface design.

A data management system embodies a collection of data, devices in which the data are stored, and logic or programs used to manipulate that data. Information management is a generalization of data management in which the data being stored are permitted to be arbitrarily complex data structures, such as rules and trees. In addition, information management goes beyond simple data manipulation and query and includes inference mechanisms, explanation facilities, and support for distributed and web-based access.

The need for rapid, accurate information is pervasive in all aspects of modern life. Modern systems are based on the coordination and integration of multiple levels of data representation, from characteristics of storage devices to conceptual and abstract levels. As human enterprises have become more complex, involving more complicated decisions and trade-offs among decisions, the need for sophisticated information and data management has become essential.

Graphics and Vision

The graphics and vision field focuses on the synthesis and analysis of image and video data by computer. Graphics includes the topics of rendering, modeling, animation, visualization, and interactive techniques, among others, and it is broadly applicable in the entertainment industry (motion pictures and games) and elsewhere. Vision includes image/video representation and registration, feature extraction, three-dimensional reconstruction, object recognition, and image-based modeling, among others, with application to real-time vision/control for robots and autonomous vehicles, medical imaging, visual sensor networks and surveillance, and more. Several of the projects undertaken by our researchers in this field unify graphics and vision through mathematical modeling, wherein graphics is considered a models-to-images synthesis problem and vision the converse images-to-models analysis problem.

Software Systems

The programming languages and systems field is concerned with the study of theory and practice in the development of software systems. Well-engineered systems require appreciation of both principles and architectural trade-offs. Principles provide abstractions and rigor that lead to clean designs, while systems-level understanding is essential for effective design.

Principles here encompass the use of programming systems to achieve specified goals, the identification of useful programming abstractions or paradigms, the development of comprehensive models of software systems, and so forth. The thrust is to identify and clarify concepts that apply in many programming contexts.

Development of software systems requires an understanding of many methodological and architectural issues. The complex systems developed today rely on concepts and lessons that have been extracted from years of research on programming languages, operating systems, database systems, knowledge-based systems, real-time systems, and distributed and parallel systems.