2002-2003 Computer Science


UCLA
4732 Boelter Hall
Box 951596
Los Angeles, CA 90095-1596

(310) 825-3886
http://www.cs.ucla.edu/

Milos D. Ercegovac, Ph.D., Chair
D. Stott Parker, Jr., Ph.D., Vice Chair
David A. Rennels, Ph.D., Vice Chair

Professors
Rajive L. Bagrodia, Ph.D.
Alfonso F. Cardenas, Ph.D.
Wesley W. Chu, Ph.D.
Jason (Jinsheng) Cong, Ph.D.
Joseph J. DiStefano III, Ph.D.
Michael G. Dyer, Ph.D.
Milos D. Ercegovac, Ph.D.
Deborah L. Estrin, Ph.D.
Mario Gerla, Ph.D.
Sheila A. Greibach, Ph.D.
Richard E. Korf, Ph.D.
Richard R. Muntz, Ph.D.
D. Stott Parker, Jr., Ph.D.
Miodrag Potkonjak, Ph.D.
Majid Sarrafzadeh, Ph.D.
Carlo Zaniolo, Ph.D. (Norman E. Friedmann Professor of Knowledge Sciences)
Lixia Zhang, Ph.D.

Professors Emeriti
Algirdas A. Avizienis, Ph.D.
Bertram Bussell, Ph.D.
Jack W. Carlyle, Ph.D.
Gerald Estrin, Ph.D.
Thelma Estrin, Ph.D.
Leonard Kleinrock, Ph.D.
Allen Klinger, Ph.D.
Lawrence P. McNamee, Ph.D.
Michel A. Melkanoff, Ph.D.
Judea Pearl, Ph.D.
Jacques J. Vidal, Ph.D.

Associate Professors
Adnan Y. Darwiche, Ph.D.
Eliezer M. Gafni, Ph.D.
Elias Koutsoupias, Ph.D.
David A. Rennels, Ph.D.
Stefano Soatto, Ph.D.
Yuval Tamir, Ph.D.

Assistant Professors
Junghoo (John) Cho, Ph.D.
Petros Faloutsos, Ph.D.
Songwu Lu, Ph.D.
Glenn Reinman, Ph.D.

Senior Lecturer
Leon Levine, M.S., Emeritus

Adjunct Professors
Andrew B. Kahng, Ph.D.
Boris Kogan, Ph.D.
Gerald J. Popek, Ph.D.

Adjunct Associate Professors
Leon Alkalai, Ph.D.
Peter L. Reiher, Ph.D.

Scope and Objectives

Computer science is concerned with the design, modeling, analysis, and applications of computer-related systems. Its study at UCLA provides education at the undergraduate and graduate levels necessary to understand, design, implement, and use the software and hardware of digital computers and digital systems. The programs provide comprehensive and integrated studies of subjects in computer system architecture, computer networks, distributed computer systems, programming languages and systems, information and data management, artificial intelligence, computer science theory, and scientific computing.

The undergraduate and graduate studies and research projects in computer science are supported by significant computing resources. In addition to the departmental computing facility, there are nearly a dozen laboratories specializing in areas such as distributed systems, multimedia computer communications, VLSI systems, VLSI CAD, and artificial intelligence. Also, the Cognitive Systems Laboratory is engaged in studying computer systems which emulate or support human reasoning. The Biocybernetics Laboratory is devoted to multidisciplinary research involving the application of engineering and computer science methods to problems in biology and medicine.

The B.S. degree may be attained either through the Computer Science and Engineering major or through the Computer Science major described below.

In addition to the B.S. in Computer Science and Engineering and the B.S. in Computer Science, HSSEAS offers M.S. and Ph.D. degrees in Computer Science, as well as minor fields for graduate students seeking engineering degrees. In cooperation with the John E. Anderson Graduate School of Management, the Computer Science Department offers a concurrent degree program that enables students to obtain the M.S. in Computer Science and the M.B.A. (Master of Business Administration).

Computer Science and Engineering B.S.

The computer science and engineering curriculum at UCLA provides the education and training necessary to design, implement, test, and utilize the hardware and software of digital computers and digital systems. This curriculum has components spanning both the Computer Science and Electrical Engineering Departments. Within the curriculum, students study all aspects of computer systems, from electronic design through logic design; MSI, LSI, and VLSI concepts and device utilization; machine language design, implementation, and programming; operating system concepts; systems programming; networking fundamentals; higher-level language skills; and the application of these to systems. Students are prepared for employment in a wide spectrum of high-technology industries.

The computer science and engineering curriculum is accredited by the Computing Accreditation Commission of ABET, 111 Market Place, Suite 1050, Baltimore, MD 21202-4012, (410) 347-7700.

The Major

Course requirements are as follows (186 minimum units required):

  1. Four core courses: Computer Science 31, 32, 33, M51A (or Electrical Engineering M16)
     
  2. Computer Science 111, 118, 131, M151B (or Electrical Engineering M116C), 180, 181, Electrical Engineering 10, 102, 103, 110, 110L, 115A, 115AL, 115C, Statistics 110A; 6 laboratory units from Computer Science M152A (or Electrical Engineering M116L) and M152B (or Electrical Engineering M116D); one computer science/electrical engineering elective (excluding Electrical Engineering 100)
     
  3. Four upper division elective courses from the Computer Science Department. Course 199 may normally be taken only as a free elective; however, students may petition for exceptions in extraordinary situations
     
  4. Chemistry and Biochemistry 20A; Electrical Engineering 1, 2, Physics 1A, 1B, 4AL, 4BL; Mathematics 31A, 31B, 32A, 32B, 33A, 33B, 61
     
  5. HSSEAS general education (GE) requirements; see Curricular Requirements on page 22 for details. Computer Science and Engineering majors are also required to satisfy the ethics and professionalism requirement by completing one course from Engineering 95 or 194 or 195, which may be applied toward either the humanities or social sciences section of the GE requirements

Computer Science B.S.

The computer science curriculum is designed to accommodate students who want professional preparation in computer science but do not necessarily have a strong interest in computer systems hardware. The curriculum consists of components in computer science, a minor or technical support area, and a core of courses from the social sciences, life sciences, and humanities. Within the curriculum, students study subject matter in software engineering, principles of programming languages, data structures, computer architecture, theory of computation and formal languages, operating systems, distributed systems, computer modeling, computer networks, compiler construction, and artificial intelligence. Majors are prepared for employment in a wide range of industrial and business environments.

The computer science curriculum is accredited by the Computing Accreditation Commission of ABET, 111 Market Place, Suite 1050, Baltimore, MD 21202-4012, (410) 347-7700.

The Major

Course requirements are as follows (182 minimum units required):

  1. Four core courses: Computer Science 31, 32, 33, M51A (or Electrical Engineering M16)
     
  2. Computer Science 111, 112, 118, 131, 132, M151B (or Electrical Engineering M116C), 161, 180, 181, Statistics 110A; Computer Science 170A or Electrical Engineering 103; 6 laboratory units from Computer Science M152A (or Electrical Engineering M116L) and M152B (or Electrical Engineering M116D)
     
  3. Two elective upper division computer science courses
     
  4. A minor or technical support area composed of three upper division courses selected from one of the following areas: astronomy, atmospheric sciences, biology, chemical engineering, chemistry and biochemistry, civil and environmental engineering, Earth and space sciences, economics, electrical engineering, information studies, linguistics, management, materials science and engineering, mathematics, mechanical and aerospace engineering, molecular biology, physics
     
  5. Electrical Engineering 1, 2, Physics 1A, 1B, 4AL, 4BL; Mathematics 31A, 31B, 32A, 32B, 33A, 33B, 61
     
  6. HSSEAS general education (GE) requirements; see Curricular Requirements on page 22 for details. Computer Science majors must also select two additional humanities/social sciences courses and one additional life sciences course and are required to satisfy the ethics and professionalism requirement by completing one course from Engineering 95 or 194 or 195, which may be applied toward either the humanities or social sciences section of the GE requirements. Chemistry 20A may be substituted for one of the life sciences courses

Graduate Study

For information on graduate admission, see Graduate Programs, page 24.

The following introductory information is based on the 2002-03 edition of Program Requirements for UCLA Graduate Degrees. Complete annual editions of Program Requirements are available from the “Publications” link at http://www.gdnet.ucla.edu. Students are subject to the degree requirements as published in Program Requirements for the year in which they matriculate.

The Department of Computer Science offers Master of Science (M.S.) and Doctor of Philosophy (Ph.D.) degrees in Computer Science and participates in a concurrent degree program with the John E. Anderson Graduate School of Management.

Computer Science M.S.

Course Requirements

Course Requirement. A total of nine courses is required for the M.S. degree, including a minimum of five graduate courses. No specific courses are required, but a majority of both the total number of formal courses and the total number of graduate courses must consist of courses offered by the Computer Science Department.

Undergraduate Courses. No lower division courses may be applied toward graduate degrees. In addition, the following upper division courses are not applicable toward graduate degrees: Chemical Engineering M105A, 199; Civil and Environmental Engineering 106A, 108, 199; Computer Science M152A, M152B, M171L, 199; Electrical Engineering 100, 101, 102, 103, 110L, M116D, M116L, M171L, 199; Materials Science and Engineering 110, 120, 130, 131, 131L, 132, 150, 160, 161L, 190, 191L, 199; Mechanical and Aerospace Engineering 102, 103, M105A, 105D, 199.

Breadth Requirement. Candidates for the M.S. in Computer Science must satisfy the computer science breadth requirement by the end of the fourth quarter in graduate residence at UCLA. The requirement is satisfied by mastering the contents of six undergraduate courses in computer science chosen from the following two groups:

Group I: Four required courses or equivalent from Computer Science M51A, 143 or 180, M151B, 181.

Group II: Two required courses or equivalent from Computer Science 111, 112 or 118, 131 or 132, 161, 171 or 174.

In addition, for each degree students must complete three quarters of Computer Science 201 (at least one course each quarter) with grades of Satisfactory.

Competence in any or all courses may be demonstrated in one of three ways:

  1. Satisfactory completion of the course at UCLA with a grade of B- or better
     
  2. Satisfactory completion of an equivalent course at another university with a grade of B- or better
     
  3. Satisfactory completion of a final examination in the course at UCLA

Comprehensive Examination Plan. In the comprehensive examination plan, at least five of the nine courses must be 200-series courses. The remaining four courses may be either 200-series or upper division courses. No units of 500-series courses may be applied toward the comprehensive examination plan requirements.

Thesis Plan. In the thesis plan, seven of the nine courses must be formal courses, including at least four from the 200 series. The remaining two courses may be 598 courses involving work on the thesis.

Comprehensive Examination Plan

Consult the department.

Thesis Plan

The thesis is a report on the results of the investigation of a problem in the major field of study under the supervision of the thesis committee, which approves the subject and plan of the thesis and reads and approves the complete manuscript. While the problem may be one of narrow scope, the thesis must exhibit satisfactory style, organization, and depth of understanding of the subject. Students should normally start to plan the thesis at least one year before the award of the M.S. degree is expected. There is no examination under the thesis plan.

Computer Science M.S./Management M.B.A.

The Department of Computer Science and the John E. Anderson Graduate School of Management offer a concurrent degree program that enables students to complete the requirements for the M.S. in Computer Science and the M.B.A. (Master of Business Administration) in three academic years. Students should request application materials from both the M.B.A. Admissions Office, John E. Anderson Graduate School of Management, and the Department of Computer Science.

Computer Science Ph.D.

Major Fields or Subdisciplines

Artificial intelligence; computer networks; computer science theory; computer system architecture; information and data management; scientific computing (biomedical engineering systems and biocybernetics, physical systems); software systems and languages.

Course Requirements

Course Requirement. There is no formal course requirement for the Ph.D. degree, and students may theoretically take required Ph.D. program examinations without taking courses. However, students normally take courses to acquire the knowledge needed for the required written and oral preliminary examinations. The basic program of study for the Ph.D. degree is built around one major field and two minor fields; the major and at least one minor must be in computer science. The major field corresponds to a body of knowledge contained in six courses, at least four of which are graduate courses, plus the current literature in the area of specialization. A detailed syllabus for each major field can be obtained from the Student Affairs Office in the Computer Science Department. Each minor field normally embraces a body of knowledge equivalent to three courses, at least two of which are graduate courses. Grades of B- or better, with a grade-point average of at least 3.33 in all courses included in the minor field, are required. By petition and administrative approval, a minor field may be satisfied by examination.

Breadth Requirement. Candidates for the Ph.D. in Computer Science must satisfy the computer science breadth requirement by the end of the fourth quarter in graduate residence at UCLA. The requirement is satisfied by mastering the contents of six undergraduate courses in computer science chosen from the following two groups:

Group I: Four required courses or equivalent from Computer Science M51A, 143 or 180, M151B, 181.

Group II: Two required courses or equivalent from Computer Science 111, 112 or 118, 131 or 132, 161, 171 or 174.

In addition, for each degree students must complete three quarters of Computer Science 201 (at least one course each quarter) with grades of Satisfactory.

Competence in any or all courses may be demonstrated in one of three ways:

  1. Satisfactory completion of the course at UCLA with a grade of B- or better
     
  2. Satisfactory completion of an equivalent course at another university with a grade of B- or better
     
  3. Satisfactory completion of a final examination in the course at UCLA
     

For requirements for the Graduate Certificate of Specialization, see Engineering Schoolwide Programs.

Written and Oral Qualifying Examinations

After mastering the body of knowledge defined in the three fields and passing the breadth requirement, students take a written preliminary examination. When the examination is passed and all coursework is completed, students may be required to take an oral preliminary examination that encompasses the major and minor fields. The examination may be waived by the faculty on the recommendation of the major field committee for students deemed to be making strong progress toward the degree. Students may not take a preliminary examination more than twice.

After passing the preliminary examination, students should form a doctoral committee and prepare to take the University Oral Qualifying Examination. The nature and content of the examination are at the discretion of the doctoral committee but ordinarily include a broad inquiry into the student’s preparation for research. The doctoral committee also reviews the prospectus of the dissertation at the oral qualifying examination.

Note: Doctoral Committees. A doctoral committee consists of a minimum of four members. Three members, including the chair, are “inside” members and must hold appointments at UCLA in the student’s major department in HSSEAS. The “outside” member must be a UCLA faculty member outside the student’s major department.

Fields of Study

Artificial Intelligence

Artificial intelligence (AI) is the study of intelligent behavior. While other fields such as philosophy, psychology, neuroscience, and linguistics are also concerned with the study of intelligence, the distinguishing feature of AI is that it deals primarily with information processing models. Thus the central scientific question of artificial intelligence is how intelligent behavior can be reduced to information processing. Since even the simplest computer is a completely general information processing device, the test of whether some behavior can be explained by information processing mechanisms is whether a computer can be programmed to produce the same behavior. Just as human intelligence involves gathering sensory input and producing physical action in the world, in addition to purely mental activity, the computer for AI purposes is extended to include sense organs such as cameras and microphones, and output devices such as wheels, robotic arms, and speakers.

The predominant research paradigm in artificial intelligence is to select some behavior which seems to require intelligence on the part of humans, to theorize about how the behavior might be accounted for, and to implement the theory in a computer program to produce the same behavior. If successful, such an experiment lends support to the claim that the selected behavior is reducible to information processing terms, and may suggest the program’s architecture as a candidate explanation of the corresponding human process.

The UCLA Computer Science Department has active research in the following major subfields of artificial intelligence:

  1. Problem solving. Analysis of tasks, such as playing chess or proving theorems, that require reasoning about relatively long sequences of primitive actions, deductions, or inferences (see the search sketch following this list)
     
  2. Knowledge representation and qualitative reasoning. Analysis of tasks such as commonsense reasoning and qualitative physics. Here the deductive chains are short, but the amount of knowledge that potentially may be brought to bear is very large
     
  3. Expert systems. Study of large amounts of specialized or highly technical knowledge that is often probabilistic in nature. Typical domains include medical diagnosis and engineering design
     
  4. Natural language processing. Symbolic, statistical, and artificial neural network approaches to text comprehension and generation
     
  5. Computer vision. Processing of images, as from a TV camera, to infer spatial properties of the objects in the scene (three-dimensional shape), their dynamics (motion), their photometry (material and light), and their identity (recognition)
     
  6. Robotics. Translation of a high-level command, such as picking up a particular object, into a sequence of low-level control signals that might move the joints of a robotic arm/hand combination to accomplish the task; often this involves using a computer vision system to locate objects and provide feedback
     
  7. Machine learning. Study of the means by which a computer can automatically improve its performance on a task by acquiring knowledge about the domain
     
  8. Parallel architecture. Design and programming of a machine with thousands or even millions of simple processing elements to produce intelligent behavior; the human brain is an example of such a machine
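
A concrete sketch may help make the first subfield above less abstract. The following hypothetical Python fragment (referenced in item 1) runs an A*-style best-first search over a small invented graph; the graph, heuristic, and function names are illustrative assumptions, not part of any UCLA course or research system.

    # Minimal sketch of heuristic (best-first) search; illustrative only.
    import heapq

    def a_star(start, goal, neighbors, h):
        """neighbors(state) -> iterable of (next_state, step_cost); h estimates cost to goal."""
        frontier = [(h(start), 0, start, [start])]        # entries are (f = g + h, g, state, path)
        best_g = {start: 0}
        while frontier:
            f, g, state, path = heapq.heappop(frontier)
            if state == goal:
                return path, g
            for nxt, cost in neighbors(state):
                g2 = g + cost
                if g2 < best_g.get(nxt, float("inf")):    # found a cheaper route to nxt
                    best_g[nxt] = g2
                    heapq.heappush(frontier, (g2 + h(nxt), g2, nxt, path + [nxt]))
        return None, float("inf")

    # Toy problem: cheapest route on a four-node graph, with a zero heuristic
    # (which reduces A* to uniform-cost search).
    graph = {"A": [("B", 1), ("C", 4)], "B": [("C", 1), ("D", 5)], "C": [("D", 1)], "D": []}
    print(a_star("A", "D", lambda s: graph[s], lambda s: 0))   # (['A', 'B', 'C', 'D'], 3)

With an informative admissible heuristic in place of the zero heuristic, the same loop typically examines far fewer states while still returning a cheapest path, which is the central idea behind heuristic search for games and theorem proving.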

Computer Networks

The computer networks field involves the study of computer systems, computer communications, computer networks, local area networks, high-speed networks, distributed algorithms, and distributed systems, emphasizing the ability to evaluate system performance at all levels of activity (but principally from the systems viewpoint) and to identify the key parameters of global system behavior. Of interest are mathematical models that lend themselves to analysis and that can be used to predict the system throughput, response time, utilization of devices, flow of jobs and messages, bottlenecks, speedup, power, etc. In addition, computer networks are constructed using design methodologies subject to appropriate cost and objective functions. The field provides techniques for system performance evaluation and design.

The tools required to carry this out include probability theory, queueing theory, queueing networks, graph and network flow theory, mathematical programming, optimization theory, operating systems design, computer communication methods and protocols, simulation methods, measurement tools and methods, and heuristic design procedures. The outcome of these studies is to provide the following:

  1. An appropriate model of the computer system under study
     
  2. An adequate (exact or approximate) analysis of the behavior of this model
     
  3. The validation of the model as compared to simulation and/or measurement of the system
     
  4. Interpretation of the analytical results in order to obtain behavioral patterns and key parameters of the system
     
  5. A design methodology
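
As a small worked instance of the queueing-theoretic tools listed above, the classical M/M/1 model (Poisson arrivals at rate λ, exponential service at rate μ, with λ < μ) already yields the kinds of predictions sought in items 1 through 4:

    \[
    \rho = \frac{\lambda}{\mu}, \qquad
    N = \frac{\rho}{1-\rho}, \qquad
    T = \frac{N}{\lambda} = \frac{1}{\mu - \lambda}
    \qquad (\text{Little's law: } N = \lambda T),
    \]

where ρ is the server utilization, N the mean number of jobs in the system, and T the mean response time. As λ approaches μ the delay grows without bound, which is precisely the bottleneck behavior such models are used to expose.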

Resource Allocation

Many of the issues involved in the consideration of computer networks deal with the allocation of resources among competing demands. In fact resource allocation is a significant element in most of the technical (and nontechnical) problems we face today.

Most of our resource allocation problems arise from the unpredictability of the demand for the use of these resources, as well as from the fact that the resources are geographically distributed (as in computer networks). The computer networks field encounters such resource allocation problems in many forms and in many different computer system configurations. Our goal is to find allocation schemes that permit suitable concurrency in the use of devices (resources) so as to achieve efficiency and equitable allocation. The use of demand allocation is found to be effective, since it takes advantage of statistical averaging effects. We identify and exploit this averaging effect whenever possible in our system modeling, analysis, and design. This demand multiplexing (sharing of large systems) comes in many forms and is known by names such as asynchronous time division multiplexing, line switching, message switching, store and forward systems, packet switching, frame relay, call switching, and so forth.
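
A standard back-of-the-envelope comparison makes this averaging effect concrete (an idealized textbook calculation, not a description of any particular system studied here). Consider m separate M/M/1 queues, each offered traffic at rate λ and served at rate μ, versus one shared server of rate mμ offered the combined stream mλ:

    \[
    T_{\text{separate}} = \frac{1}{\mu - \lambda},
    \qquad
    T_{\text{shared}} = \frac{1}{m\mu - m\lambda} = \frac{1}{m}\, T_{\text{separate}},
    \]

so pooling the demand on one large resource cuts the mean response time by a factor of m at the same utilization, which is the essential argument for the multiplexed sharing schemes named above.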

Areas Included in the Computer Networks Field

Typical Systems Studied
• Computer networks
• Packet switching
• Multiprogramming systems
• Parallel processing systems
• High-performance computers
• Distributed processing systems
• Time-shared systems
• Satellite and ground radio packet switching systems
• Cellular radio systems
• Personal communication networks
• Local-area networks
• Communication processors
• High-speed networks
• Photonic networks
• Neural networks
• Communication protocols
• Network control procedures (centralized and distributed)
• Mobile networks

Tools Used
• Probability theory
• Queueing theory
• Queueing networks
• Graph theory
• Network flow theory
• Optimization theory
• Mathematical programming
• Heuristic design algorithms
• Simulation
• Distributed algorithms
• Measurement
• Information theory
• Control theory

Computer Science Theory

Computer science is in large measure concerned with information processing systems, their applications, and the corresponding problems of representation, transformation, and communication. The computer science fields are concerned with different aspects of such systems, and each has its own “theoretical component” with appropriate models for description and analysis, algorithms for solving the related problems, and mathematical tools. Thus in a certain sense “computer science theory” involves all of computer science and participates in all disciplines.

The term theoretical computer science has come to be applied nationally and internationally to a certain body of knowledge emphasizing the interweaving themes of computability and algorithms, interpreted in the broadest sense. Under computability, one includes questions concerning which tasks can and cannot be performed by information systems of different types restricted in various ways, as well as the mathematical analysis of such systems, their computations, and the languages for communication with them. Under algorithms, one includes questions concerning (1) how a task can be performed efficiently under reasonable assumptions on available resources (e.g., time, storage, type of processor), (2) how efficiently a proposed system performs a task in terms of resources used, and (3) the limits on how efficiently a task can be performed. These questions are often addressed by first developing models of the relevant parts of an information processing system (e.g., the processors, their interconnections, their rules of operation, the means by which instructions are conveyed to the system, or the way the data is handled) or of the input/output behavior of the system as a whole. The properties of such models are studied both for their own interest and as tools for understanding the system and improving its performance or applications.
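
A small, self-contained example of the automata models mentioned above: the following hypothetical Python fragment implements a three-state deterministic finite automaton that accepts exactly the binary strings whose value is divisible by 3 (the names and the test strings are illustrative assumptions, not course material).

    # State = value of the bits read so far, mod 3; reading bit b sends state s to (2*s + b) mod 3.
    TRANSITIONS = {(s, b): (2 * s + b) % 3 for s in range(3) for b in (0, 1)}

    def accepts(bits: str) -> bool:
        state = 0                                   # start state: remainder 0
        for ch in bits:
            state = TRANSITIONS[(state, int(ch))]
        return state == 0                           # accept iff remainder is 0

    print([w for w in ("0", "11", "101", "110", "111") if accepts(w)])
    # ['0', '11', '110'], i.e., the values 0, 3, and 6

Questions of the kind listed under computability and algorithms then arise naturally: which languages can be recognized by such finite-state machines at all, and at what cost in states, time, and storage.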

Emphasis of Computer Science Theory

• Design and analysis of algorithms
• Distributed and parallel algorithms
• Models for parallel and concurrent computation
• Online and randomized algorithms
• Computational complexity
• Automata and formal languages

Computer System Architecture

Computer system architecture deals with

  1. The study of the structure and behavior of computer systems
     
  2. The development of new algorithms and computing structures to be implemented in hardware, firmware, and software
     
  3. The development of tools to enable system designers to describe, model, fabricate, and test highly complex computer systems

Computer systems are among the most complex systems ever developed and as such are the subject of intensive study. The computer architect must be able to define the functions to be provided by a computing system and the way in which they are implemented. Due to their complexity, computer systems must be decomposed into subsystems. This decomposition is carried out at several levels until the desired system can be composed from well-understood reusable hardware and software elements. One way to categorize these subsystems is by processor, memory, data transmission and interconnection, control, input/output, and operating system elements. The subsystems must be precisely specified and their interactions modeled and thoroughly understood before a system can be fabricated.

Properties of a well-engineered system include ease and efficiency of programming and behavior that is predictable to a user. Moreover, a well-engineered system is one that satisfies cost, performance, and reliability constraints.

A comprehensive set of courses is offered in the areas of advanced computer architecture, arithmetic processor systems, fault-tolerant systems, memory systems, operating systems, data communications, VLSI-based architectures, computer-aided design of VLSI circuits and systems, distributed computing, and parallel processing. The courses are intended to prepare students for advanced engineering and continuing research. Advanced courses are also offered to introduce students to research areas being pursued by the faculty.

The computer architecture field at UCLA offers strong emphasis on systems issues of design, performance modeling, and algorithms. Some of the areas of current interest are described below:

  1. Fault-tolerant computing involves the design of systems that can continue operation in the presence of faults. This includes errors in specification, operator errors, software faults, and random failures of hardware components. Design techniques and modeling tools are being studied for several levels of system design, including specification, software fault-tolerance, and fault-tolerance techniques for VLSI.
     
  2. Novel architectures encompass the study of computations performed in ways quite different from those used by conventional machines. Examples include various domain-specific architectures characterized by high computational rates, low power, and reconfigurable hardware implementations.
     
  3. The study of high-performance processing algorithms deals with algorithms for very high-performance numerical processing. Techniques such as redundant-digit representations of number systems, fast arithmetic, and the use of highly parallel arrays of processing elements are studied with the goal of providing the extremely high processing speeds required in a number of emerging computer applications (see the carry-save sketch following this list).

  4. The study of computational algorithms and structures deals with the relationship between computational algorithms and the physical structures which can be employed to carry them out. It includes the study of interconnection networks, and the way that algorithms can be formulated for efficient implementation where regularity of structure and simplicity of interconnections are required.
     
  5. Computer-aided design of VLSI circuits is an active research area which develops techniques for the automated synthesis of large-scale systems. Topics include logic synthesis, physical design, testing, and yield enhancement for various VLSI technologies such as standard cells, gate arrays, field programmable gate arrays (FPGAs), and multichip modules (MCMs). Other areas of study include a structural theory of the large-scale global optimizations which arise in VLSI CAD.
     
  6. VLSI architectures and implementation is an area of current interest and collaboration between the Electrical Engineering and Computer Science Departments that addresses the impact of large-scale integration on the issues of computer architecture. In addition to detailed studies of these issues there is an active program in the design of MOS large-scale integrated circuits.
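
The following hypothetical Python fragment sketches the redundant-representation idea referred to in item 3: carry-save addition reduces three addends to two without propagating carries, so a long chain of additions needs only one conventional carry-propagating add at the end. The function names and sample values are illustrative assumptions, not part of any UCLA design or tool.

    # Carry-save addition for nonnegative integers; illustrative sketch only.
    def carry_save_add(x: int, y: int, z: int):
        """Reduce three addends to a (sum, carry) pair without propagating carries."""
        s = x ^ y ^ z                                  # per-bit sum digits
        c = ((x & y) | (x & z) | (y & z)) << 1         # per-bit carries, shifted into place
        return s, c                                    # invariant: s + c == x + y + z

    def add_many(values):
        """Sum a list by repeated carry-save reduction, then one ordinary addition."""
        s, c = 0, 0
        for v in values:
            s, c = carry_save_add(s, c, v)
        return s + c                                   # the only carry-propagating step

    print(add_many([13, 27, 99, 4]))                   # 143, same as sum([13, 27, 99, 4])

Because each reduction step is carry-free, its delay is independent of the word length, which is why such redundant forms appear in fast multipliers and other multi-operand arithmetic units.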

Information and Data Management

The information and data management field focuses on basic problems of modeling and managing data and knowledge, and their relation with other fundamental areas of computer science, such as operating systems and distributed processing, programming languages, and interface design.

A data management system embodies a collection of data, devices in which the data are stored, and logic or programs used to manipulate those data. Information management is a generalization of data management in which the “data” being stored are permitted to be complex data structures, such as rules and trees. In addition, information management goes beyond simple data manipulation and query, and includes inference mechanisms, explanation facilities, and support for distributed and web-based information.

The need for rapid, accurate information is pervasive in all aspects of modern life. Modern systems are based on the coordination and integration of multiple levels of data representation, from characteristics of storage devices to conceptual and abstract levels. As human enterprises become more complex, involving more complicated decisions and trade-offs among decisions, the need for sophisticated information and data management will become essential.

Scientific Computing

The scientific computing field currently covers one area of specialization: biomedical systems, which can be selected as a major or minor field for the Ph.D. in Computer Science.

Subject Matter and Course Offerings

Biomedical Systems. Emphasis is on modeling methodologies, algorithms, and quantitative methods for life sciences applications, both basic and applied. Research topics typically involve one or more of the following areas: (1) organismic, cellular, and mechanism-level mathematical modeling and simulation of cancer and disease processes, (2) advances in modeling methodology for life sciences research, including experiment design optimization, (3) development of software for modeling (model fitting and model selection) and kinetic analysis of biological systems, with emphasis on expert systems, graphical, and web interfaces, (4) model-based experimental research in physiology, pharmacology, biochemistry, genomics, bioinformatics, and related fields, developing the interface between (theoretical) modeling and laboratory experimentation.
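
As a minimal, hypothetical illustration of the model-fitting step mentioned in topic (3), the following Python fragment fits a one-compartment kinetic model y(t) = A exp(-k t) to invented measurements by linearizing it and applying ordinary least squares; the data, names, and use of NumPy are assumptions made for the sketch, not part of any laboratory software described here.

    import numpy as np

    t = np.array([0.0, 1.0, 2.0, 4.0, 8.0])      # sampling times (invented)
    y = np.array([10.0, 7.3, 5.5, 3.0, 0.9])     # measured concentrations (invented)

    # ln y = ln A - k t is linear in t, so an ordinary least-squares line fit suffices.
    slope, intercept = np.polyfit(t, np.log(y), 1)
    A_hat, k_hat = np.exp(intercept), -slope
    print(f"A = {A_hat:.2f}, elimination rate k = {k_hat:.2f} per unit time")

Real studies weigh the measurement errors, compare competing model structures, and quantify parameter uncertainty, which is where the experiment design and model selection methods described above come in.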

Software Systems and Languages

The programming languages and systems field is concerned with the study of theory and practice in the development of software systems. Well-engineered systems require appreciation of both principles and architectural trade-offs. Principles provide abstractions and rigor that lead to clean designs, while systems-level understanding is essential for effective design.

Principles here encompass the use of programming systems to achieve specified goals, the identification of useful programming abstractions or paradigms, the development of comprehensive models of software systems, and so forth. The thrust is to identify and clarify concepts that apply in many programming contexts.

Development of software systems requires an understanding of many methodological and architectural issues. The complex systems developed today rely on concepts and lessons that have been extracted from years of research on programming languages, operating systems, database systems, knowledge-based systems, real-time systems, and distributed and parallel systems. UCLA places particular emphasis on the area of parallel and distributed systems.

Facilities

Departmental laboratory facilities for instruction and research include:

Artificial Intelligence Laboratory. For investigating knowledge representation systems, pattern recognition, expert systems, intelligent user agents, planning, problem solving, heuristic search, and related areas

Biocybernetics Laboratory. For interdisciplinary experimental and computational problems in physiology, medicine, and pharmacology. Laboratory pedagogy involves development and exploitation of the synergistic and methodologic interface between modeling and laboratory experimentation, emphasizing integrated approaches for solving complex biosystem problems from sparse biodata. See http://biocyb.cs.ucla.edu/

Biomedical Engineering Laboratory. Established jointly by HSSEAS and the School of Medicine to support courses and research projects in health care information systems, covering issues in user requirement specifications, image data processing and retrieval, feature abstraction, simulation and analysis, visualization, and systems integration

Cognitive Systems Laboratory. For studying systems that emulate human cognition, especially learning, planning, and reasoning under uncertainty. Topics include causal reasoning, knowledge discovery, knowledge compilation, physical systems diagnosis, and automated explanation. See http://singapore.cs.ucla.edu/

Collaborative Design Laboratory. For investigating methods for effective computer support of small teams involved in design and research

Computer Communications Laboratory. For investigating local-area networks, packet-switching networks, and packet-radio networks

Concurrent Systems Laboratory. For investigating the design, implementation, and evaluation of computer systems that use state-of-the-art technology to achieve both high performance and high reliability. Research is often related to multiprocessors and multicomputers in the context of general-purpose as well as embedded systems. See http://www.cs.ucla.edu/csd/research/labs/csl/

Data Mining Laboratory. For extraction of patterns, anomalies, concepts, classification rules, and other forms of high-level relationships that are latent in large commercial and scientific databases. See http://dml.cs.ucla.edu/main.html

Digital Arithmetic and Reconfigurable Architecture Laboratory. For fast digital arithmetic (theory, algorithms, and design) and numerically intensive computing on reconfigurable hardware. Research includes floating-point arithmetic, online arithmetic, application-specific architectures, and design tools. See http://arith.cs.ucla.edu/

Distributed Simulation Laboratory. For research on operating system support and applications and utilization of special architectures such as the Intel Hypercube

Embedded and Reconfigurable System Design Laboratory. For studying reconfigurable cores in embedded systems that provide the required adaptability and reconfigurability, and the design and CAD aspects of low-power embedded systems. See http://er.cs.ucla.edu/

High-Performance Internet Laboratory. For investigating high-performance quality of service (QoS) techniques in the Internet, including QoS routing in Internet domains, QoS fault-tolerant multicast, TCP congestion control, and gigabit network measurements. See http://www.cs.ucla.edu/NRL/hpi/

Human/Computer Interface Laboratory. Use of cognitive science concepts to design more reliable user-friendly interfaces with computer and communication systems and the modeling and visualization of scientific data. See http://www.cs.ucla.edu/hcip/

Internet Research Laboratory. For exploring the forefront of current Internet architecture and protocol development, including fault tolerance in large-scale distributed systems such as the Internet routing infrastructure, Internet distance measurement, scalable IP multicast delivery in the absence of network multicast support, distributed Internet information discovery, and protocol design principles for large-scale self-organizing systems and their applications to sensor networking. See http://irl.cs.ucla.edu/

Knowledge-Based Multimedia Medical Distributed Database Systems Laboratory. For developing new methodologies to access multimedia (numeric, text, image/picture) data by content and feature rather than by artificial keys such as patient ID. See http://kmed-www.cs.ucla.edu/

Laboratory for Embedded Collaborative Systems. For research on the architectural challenges posed by massively distributed, large-scale, physically coupled, and usually untethered and small-form-factor computing systems. Through prototype implementations and simulation, such issues as data diffusion protocols, localization, time synchronization, low-power wireless communications, and self-configuration are explored. See http://lecs.cs.ucla.edu/

MAGIX: Modeling Animation and Graphics Laboratory. For research on computer graphics, physics-based animation, robotics, and biomechanics. See http://www.cs.ucla.edu/magix/

Multimedia Systems Laboratory. For research on all aspects of multimedia: physical and logical modeling of multimedia objects, real-time delivery of continuous multimedia, operating systems and networking issues in multimedia systems, and development of multimedia courseware. See http://www.mmsl.cs.ucla.edu/

Parallel Computing Laboratory. For research in scalable simulation, providing an efficient lightweight simulation language, as well as tools for large-scale parallel simulation on modern supercomputers. See http://pcl.cs.ucla.edu/

System Software Laboratory. For developing advanced operating systems, distributed systems, and middleware and conducting research in systems security

UCLA Vision Laboratory. For research involving processing of visual information to retrieve mathematical models of the environment in order for humans and machines to interact with it. Applications include image-based rendering for virtual reality, archaeology, CAD, guidance of autonomous vehicles, human/computer interaction, visualization, and recognition. See http://www.vision.cs.ucla.edu/

VLSI CAD Laboratory. For computer-aided design of VLSI circuits and systems. Areas include high-level and logic-level synthesis, technology mapping, physical design, interconnect modeling and optimization of various VLSI technologies such as full-custom designs, standard cells, programmable logic devices (PLDs), multichip modules (MCMs), systems-on-a-chip (SOCs), and systems-in-a-package (SIPs). See http://ballade.cs.ucla.edu/

Wireless Adaptive Mobility Laboratory. For investigating wireless local-area networks and the interaction between wireless network layers, middleware, and applications. Activities include protocol development, protocol analysis and simulation, and wireless testbed experiments. See http://www.cs.ucla.edu/NRL/wireless/

Some of these research laboratories also provide support for upper division and/or graduate courses.

Computing Resources

In summarizing the resources now available to conduct experimentally based research in the UCLA Computer Science Department, it is useful to identify the major components of the research environment: the departmental computing facility, other hardware and software systems, administrative structure, and technical support staff.

Hardware

Computing facilities range from large campus-operated supercomputers to a major local network of servers and workstations administered by the department or by the school network (HSSEASnet).

The departmental research network includes at least 30 Sun Sparcstation servers and shared workstations on the school's own Ethernet TCP/IP local network. A wide variety of peripheral equipment is also part of the facility, and many more research-group workstations share the network; the total number of machines exceeds 600, the majority running the UNIX operating system. The network consists of switched 10/100 Ethernet to the desktop with a gigabit backbone connection. The department LAN is connected to the campus ATM backbone. A Lucent wireless network is also available to faculty, staff, and graduate students.

Administrative Structure

The central facilities and wide-area networking are operated by the campuswide Academic Technology Services. Access to the departmental and HSSEASnet machines is controlled so as to maximize the usefulness of these computers for education and research, but no direct charges are involved.

Technical Support Staff

The support staff consists of hardware and software specialists. The hardware laboratory supports network connections and configures routers, switches, and network monitoring tools. The software group administers the department UNIX servers, providing storage space and backup for department users.

Faculty Areas of Thesis Guidance

Professors

Rajive L. Bagrodia, Ph.D. (U. Texas, 1987)
Distributed algorithms, concurrent programming languages, simulation, performance evaluation of distributed systems

Alfonso F. Cardenas, Ph.D. (UCLA, 1969)
Data management systems, programming languages and software systems, digital simulations, systems analysis, management information systems, computing economics and management

Wesley W. Chu, Ph.D. (Stanford, 1966)
Distributed computing, distributed database, memory management, computer communications, performance measurement and evaluation for distributed systems and multiaccess packet-switched systems

Jason (Jinsheng) Cong, Ph.D. (Illinois, 1990)
Computer-aided design of VLSI circuits, fault-tolerant design of VLSI systems, design and analysis of algorithms, computer architecture

*Joseph J. DiStefano III, Ph.D. (UCLA, 1966)
Biocybernetics, bioengineering, biosystems, modeling and simulation, modeling theory and methodology, artificial intelligence/knowledge-based systems, dynamic systems optimization, estimation, identification, and control

Michael G. Dyer, Ph.D. (Yale, 1982)
Artificial intelligence, natural language processing, cognitive modeling

Milos D. Ercegovac, Ph.D. (Illinois, 1975)
Computer systems architecture, digital computer arithmetic, logic design, functional languages and architectures, VLSI algorithms and structures

Deborah L. Estrin, Ph.D. (MIT, 1985)
Computer networks, network embedded systems, sensor networks

Mario Gerla, Ph.D. (UCLA, 1973)
Analysis, design, and control of computer communications networks and systems, computer network protocol evaluation, queueing networks, topological design and routing problems in large networks, design and evaluation of algorithms for distributed computation

Sheila A. Greibach, Ph.D. (Harvard, 1963)
Theoretical computer science, computational complexity, program schemes and semantics, formal languages, automata, computability

Richard E. Korf, Ph.D. (Carnegie Mellon, 1983)
Artificial intelligence

Richard R. Muntz, Ph.D. (Princeton, 1969)
Multimedia systems, database systems, data mining

D. Stott Parker, Jr., Ph.D. (Illinois, 1978)
Data mining, information modeling, scientific computing, bioinformatics, database and knowledge-based systems

Miodrag Potkonjak, Ph.D. (UC Berkeley, 1991)
Computer-aided analysis and synthesis of system level designs, behavioral synthesis, and interaction between high-performance application-specific computations and communications

Majid Sarrafzadeh, Ph.D. (Illinois, 1987)
Computer engineering, embedded systems, VLSI CAD, algorithms

Carlo Zaniolo, Ph.D. (UCLA, 1976)
Knowledge bases and deductive databases, parallel execution of PROLOG programs, formal software specifications, distributed systems, artificial intelligence, and computational biology

Lixia Zhang, Ph.D. (MIT, 1989)
Computer networks, data networking, network architectures and protocols

* Also Professor of Medicine

Professors Emeriti

Algirdas A. Avizienis, Ph.D. (Illinois, 1960)
Digital computer architecture and design, fault-tolerant computing, digital arithmetic

Bertram Bussell, Ph.D. (UCLA, 1962)
Computer systems architecture, interactive computer graphics

Jack W. Carlyle, Ph.D. (UC Berkeley, 1961)
Communication, computation theory and practice, algorithms and complexity, discrete system theory, developmental and probabilistic systems

Gerald Estrin, Ph.D. (Wisconsin, 1951)
Computer systems architecture, methodology and supporting tools for design of concurrent systems, automating design teamwork, restructurable architectures

Thelma Estrin, Ph.D. (Wisconsin, 1951)
Biomedical engineering, application of technology and computers to health care, computer methods in neuroscience, engineering education

Leonard Kleinrock, Ph.D. (MIT, 1963)
Computer networks, computer-communication systems, resource sharing and allocation, computer systems modeling, analysis, and design, queueing systems theory and applications, performance evaluation of congestion-prone systems, performance evaluation and design of distributed multiaccess packet-switching systems, wireless networks, mobile computing, and distributed and parallel processing systems

Allen Klinger, Ph.D. (UC Berkeley, 1966)
Pattern recognition, picture processing, biomedical applications, mathematical modeling

Lawrence P. McNamee, Ph.D. (Pittsburgh, 1964)
Computer graphics, discrete simulation, digital filtering, computer-aided design, LSI fabrication techniques, printed circuit board design

Michel A. Melkanoff, Ph.D. (UCLA, 1955)
Programming languages, data structures, database design, relational models, simulation systems, robotics, computer-aided design and manufacturing, numerical-controlled machinery

Judea Pearl, Ph.D. (Polytechnic Institute of Brooklyn, 1965)
Decision analysis, human information processing, artificial intelligence, pattern recognition, man/machine interface, mathematical analysis of systems, complexity of computations

*Jacques J. Vidal, Ph.D. (U. Paris, Sorbonne, 1961)
Information processing in biological systems, with emphasis on neuroscience, cybernetics, online laboratory computer systems, and pattern recognition; analog and hybrid systems/signal processing

*Member of Brain Research Institute

Associate Professors

Adnan Y. Darwiche, Ph.D. (Stanford, 1993)
Knowledge representation and reasoning under uncertainty with emphasis on Bayesian networks, structure-driven inference algorithms and knowledge-based compilation techniques. Unifying foundation for logical and probabilistic reasoning. Belief revision and causality. Applications to model-based diagnosis and reasoning about physical systems

Eliezer M. Gafni, Ph.D. (MIT, 1982)
Computer communication, networks, mathematical programming algorithms

Elias Koutsoupias, Ph.D. (UC San Diego, 1994)
Decision making under uncertainty, computational geometry, randomized algorithms, combinatorial optimization and computational complexity

David A. Rennels, Ph.D. (UCLA, 1973)
Digital computer architecture and design, fault-tolerant computing, digital arithmetic

Stefano Soatto, Ph.D. (Cal Tech, 1996)
Vision, control system theory, estimation theory, real-time sensory processing and machine interfaces

Yuval Tamir, Ph.D. (UC Berkeley, 1985)
Computer architecture, VLSI

Assistant Professors

Junghoo (John) Cho, Ph.D. (Stanford, 2002)
Databases, web technologies, information discovery and integration

Petros Faloutsos, Ph.D. (Toronto, 2001)
Computer graphics, computer animation

Songwu Lu, Ph.D. (Illinois, 1999)
Integrated-service support over heterogeneous networks (e.g., mobile computing environments, the Internet, and Activenet); networking and computing, wireless communications and networks, computer communication networks, dynamic game theory, dynamic systems, neural networks, and information economics

Glenn Reinman, Ph.D. (UC San Diego, 2001)
Computer architecture

Senior Lecturer

Leon Levine, M.S. (MIT, 1949), Emeritus
Computer methodology

Adjunct Professors

Andrew B. Kahng, Ph.D. (UC San Diego, 1989)
Complexity theory, parallel algorithms and architectures, VLSI layout, computational geometry

Boris Kogan, Ph.D. (Moscow, Russia, 1962)
Application of multiprocessor systems with massive parallelism to simulation of dynamic phenomena in excitable biological tissues

Gerald J. Popek, Ph.D. (Harvard, 1973)
Privacy and security in information systems, operating system software design, representation for design and evaluation of databases

Adjunct Associate Professors

Leon Alkalai, Ph.D. (UCLA, 1989)
Computer architecture

Peter L. Reiher, Ph.D. (UCLA, 1987)
Optimistic replication, optimistic concurrency control mechanisms for parallel and distributed systems, distributed systems