Systematic growth

Professor Andrew Chien

UChicago’s computer science department proliferates.

In 2011, the Computer Science Department hired Andrew Chien—previously Intel’s vice president of research and now a William Eckhardt Distinguished Service Professor. His appointment, which then-PSD dean Robert Fefferman called “game-changing,” marked the beginning of a transformative recruitment period.

Chien—who specializes in applications, system software, networking, and architecture—describes this development: “Building on its long history of distinction in theory, the Computer Science Department has undertaken a rapid expansion with the ambition of creating world-leading groups” in areas such as data-intensive computing, resilient computing, and quantum computing. These new areas “reflect the growing intellectual importance of computer science in the University and its dramatic and pervasive reach in the economy and society.”

Since the start of academic year 2014-15, the department has made nine new academic appointments, in both junior and senior positions. The majority work under the “broad umbrella of systems,” says associate professor and associate chair Anne Rogers, but their specific areas of study and backgrounds are diverse.

“The field is going to change, and very quickly,” says professor and department chair Todd Dupont. Computer science faculty members must be flexible, “people who have done different things.” Some of the recent hires come from industry, some from academia, and some have worked in both. The boundaries between the two are increasingly fluid, says Rogers, with advances in each driving the other.

Quantum leap

Working at the intersection of computer science and emerging technologies, Frederic Chong joins the University as the Seymour Goodman Professor of Computer Architecture. His work includes redesigning systems to leverage the strengths of new technologies while mitigating their weaknesses. “For example, new memory technologies may save energy,” says Chong, “but might require spreading data out or writing data slowly when possible.”

Of Chong’s many projects, the most revolutionary focuses on quantum computing, which may provide exponentially more powerful computation and help close the gap between the growth of big data and traditional improvements in computational efficiency. His aim is to help guide the development of basic technologies and algorithms by “looking ahead to how these pieces will scale to a large quantum computing machine ten to twenty years in the future.”
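
One reason that scaling is so hard: describing an n-qubit quantum state classically takes 2^n complex amplitudes, so the cost of simulating (and engineering) larger machines explodes. A small illustrative calculation, our example rather than anything from Chong’s work:

```python
# Illustrative only: the classical description of an n-qubit quantum
# state requires 2**n complex amplitudes, which is one reason that
# simulating and building large quantum machines is so demanding.

def statevector_size(n_qubits: int) -> int:
    """Number of complex amplitudes needed to describe n qubits."""
    return 2 ** n_qubits

for n in (1, 10, 50):
    print(f"{n} qubits -> {statevector_size(n):,} amplitudes")
```

At fifty qubits the state already has more than a quadrillion amplitudes, which is why researchers must reason about scale long before the hardware exists.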

Robust and resilient

The department’s most recent appointment, assistant professor Yanjing Li, arriving in September 2015, is currently a senior research scientist at Intel Labs. Her focus is robust system design, and she is broadly interested in computer systems and architecture. “I design systems in the most efficient way to ensure they function as the user expects,” says Li, “despite underlying disturbances like hardware failures, software and hardware bugs, environmental issues, or even malicious attacks.”

One specific project Li works on is called cross-layer resilience, which aims to address hardware failures that can occur at any time. These errors can be caused by manufacturing defects, cosmic rays or alpha particles hitting a chip, or simply aging of the chip. Traditionally, a one-time manufacturing test is used to screen out defects, and redundant resources are added to deal with errors in the field, but as systems increase in complexity, these methods become cost prohibitive. Li’s research aims to achieve the same goal without greatly increasing the cost. “There’s an increasing need to provide support while the system is running in the field,” and Li is currently researching techniques that span abstraction layers, functioning at circuit, architecture, and software levels.
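
Li’s cross-layer techniques are far more sophisticated than anything that fits here, but the basic idea of catching a transient fault in software can be sketched in a few lines. This toy example, entirely illustrative and not her method, runs a computation redundantly and compares the results:

```python
# A simplified, hypothetical sketch of software-level error detection
# (not Li's actual technique): execute a computation twice and compare
# the outcomes; a mismatch signals a possible transient hardware fault,
# such as a bit flip caused by a cosmic ray.

def run_with_redundancy(fn, *args):
    """Execute fn twice; a disagreement flags a possible transient error."""
    first = fn(*args)
    second = fn(*args)
    if first != second:
        raise RuntimeError("possible transient hardware error detected")
    return first

result = run_with_redundancy(sum, [1, 2, 3])
```

Real resilient designs avoid paying this full 2x cost by spreading detection and recovery across the circuit, architecture, and software layers, which is precisely the cross-layer question.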


As Li noted, eradicating software bugs is crucial to system robustness and reliability. Associate professor Shan Lu, previously at the University of Wisconsin, focuses on understanding, detecting, mitigating, and fixing defects through program analysis, software engineering, and system techniques. Her work involves obtaining bug report databases, most often from open source software; analyzing the reports, developer discussions, and patches; and searching for common patterns. When Lu discovers one, she can write a program to automatically identify that pattern. Then the defect can be repaired.
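
Real program analysis goes far deeper than text matching, but a toy sketch can convey the pattern-based idea. The bug pattern below, a lock acquired but never released, and the function involved are hypothetical illustrations, not drawn from Lu’s work:

```python
# A toy, hypothetical pattern detector (vastly simpler than real
# program analysis): flag a code snippet that acquires a lock but
# contains no matching release, a common defect in concurrent code.

def find_missing_unlock(source: str) -> bool:
    """Return True if the snippet acquires a lock without releasing it."""
    return "lock.acquire()" in source and "lock.release()" not in source

buggy = "lock.acquire()\ncounter += 1\n"                    # forgot to release
fixed = "lock.acquire()\ncounter += 1\nlock.release()\n"    # balanced pair
```

Once a pattern like this is characterized from real bug reports, a checker can scan millions of lines automatically, which is the point of generalizing from bug databases.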

Making observations that generalize patterns and automate repair is crucial for today’s software. “Thirty years ago, programs usually had a thousand lines of code,” says Lu. “It was still possible to read through every line and manually validate and correct.” Most software is now far too large and complicated to comb through manually and is often built from many different components. “Software is not built by just one person.”

Big data, big bases

Complex software is necessary to keep pace with the massive amounts of data produced, collected, and analyzed in an increasingly connected world. “Data is at the core of everything, and big data is about trying to make sense of large amounts of dirty and complex data,” says assistant professor Aaron Elmore, SM’09, who joined the department in June. Databases store and analyze this information, and his work focuses on making them adaptive, flexible, and self-managed.

Big data is inherently diverse, requiring multiple tools to manage it. As part of BigDAWG (Big Data Working Group)—a project run by the Intel Science and Technology Center for Big Data—Elmore works to build a system that can integrate such tools. “We’re trying to federate different, specialized databases into one unified access system,” allowing analysts to forgo the effort of finding the right tool for the right operation and spend that time instead on the knowledge the data analysis yields.
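
A rough sketch of the federation idea: a thin routing layer picks the specialized engine best suited to each kind of data, so the analyst sees a single interface. The engine names and the routing table below are illustrative placeholders, not BigDAWG’s actual architecture:

```python
# A hypothetical sketch of a polystore front end: each query is
# dispatched to the specialized engine suited to its data, while the
# analyst interacts with one unified access point. Names are invented.

BACKENDS = {
    "relational": "row-store engine",    # tables and joins
    "array": "array engine",             # scientific array data
    "stream": "streaming engine",        # real-time event data
}

def route_query(data_kind: str) -> str:
    """Choose the engine for a query based on the kind of data it touches."""
    try:
        return BACKENDS[data_kind]
    except KeyError:
        raise ValueError(f"no engine registered for {data_kind!r}")

engine = route_query("array")   # the analyst never picks the engine by hand
```

The hard research problems hide inside the sketch: translating queries between engines, moving data efficiently, and deciding where each operation runs fastest.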


Assistant professor Ravi Chugh, who held internships at Microsoft and Mozilla while earning his doctorate at the University of California, San Diego, brings to the department a specialization in programming languages, verification, program analysis, and compilers. His research involves developing techniques that will work with and improve modern languages, enhancing the computer programming experience.

Chugh is working to combine two different modes in which users interact with software. On one end of the spectrum are systems that let users write a program that generates a result; on the other are systems that provide a graphical user interface (GUI) with visual tools to produce a result. Each mode has distinct strengths and is better suited to certain tasks, but existing software systems typically provide only one. Chugh is building software interfaces that support both programmatic and GUI-based manipulation while addressing the challenge of keeping the program “synchronized” with changes to the result. He believes that hybrid interfaces will offer the “opportunity to find an audience beyond traditional programmers.”
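
The synchronization challenge can be sketched with a toy example, hypothetical and not Chugh’s system: a “program” of parameters renders a shape, and a direct GUI-style edit to the shape is pushed back into the parameters so the two representations never diverge:

```python
# A minimal, invented sketch of program/output synchronization: the
# program is a parameter set that renders a rectangle, and a direct
# (GUI-style) edit to the rectangle updates the program's parameters.

program = {"width": 100, "height": 50}          # the "code"

def render(prog):
    """Programmatic direction: parameters -> drawn rectangle."""
    return {"w": prog["width"], "h": prog["height"]}

def direct_edit(prog, shape_change):
    """GUI direction: an edit to the drawing flows back into the program."""
    if "w" in shape_change:
        prog["width"] = shape_change["w"]
    if "h" in shape_change:
        prog["height"] = shape_change["h"]
    return prog

shape = render(program)
program = direct_edit(program, {"w": 120})      # user drags a corner wider
assert render(program)["w"] == 120              # program stayed in sync
```

In real hybrid interfaces the hard part is inverting arbitrary code rather than a fixed parameter table, which is where the programming-languages research comes in.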

Next generation

Joining the department as a senior research associate and lecturer, Diana Franklin has an extensive background in computing education research, parallel programming and architecture, and ethnic and gender diversity in computer science. She’s particularly interested in how children learn computing concepts and will work with the Center for Elementary Mathematics and Science Education as Director of Computer Science Education.

Franklin currently researches “what challenges young students—fourth through sixth grade—face,” and develops new resources for both research and learning purposes. The DEPICT project—Developing Elementary (Learning) Progressions to Integrate Computational Thinking—studies how students learn concepts, including sequential execution, initialization, software design, loops, conditionals, and communication.

A stepwise approach

Having joined the department last fall as a computer science lecturer, Matthew Wachs—who specializes in operating systems—teaches programming courses for both majors and non-majors. For non-majors, he exposes students to computational thinking—the systematic, step-by-step approach computer scientists use to break down and solve problems. He also teaches them how to collect data and integrate and analyze information, with the goal of providing the tools to turn raw data into knowledge in their own disciplines.

The introductory sequence for majors offers the basics of programming, but from different perspectives. Students “learn how to express solutions to problems both in a more abstract, mathematical manner,” says Wachs, and in a “form that focuses on the detailed sequence of steps followed by a computer to perform calculations.” This approach helps students see the “big picture” while also mastering the details that underlie that broader understanding.
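
As one illustration of those two perspectives, here is the same problem, summing the integers 1 through n, expressed both as a closed-form formula and as an explicit sequence of steps. This is an example of ours, not material from Wachs’s courses:

```python
# The same problem from two perspectives: an abstract, mathematical
# closed form versus the detailed sequence of steps a machine follows.

def sum_abstract(n: int) -> int:
    """Mathematical view: the closed-form expression n(n+1)/2."""
    return n * (n + 1) // 2

def sum_stepwise(n: int) -> int:
    """Machine view: accumulate the total one step at a time."""
    total = 0
    for i in range(1, n + 1):
        total += i
    return total

assert sum_abstract(100) == sum_stepwise(100) == 5050
```

Both functions compute the same answer; seeing why the formula and the loop agree is exactly the kind of big-picture-plus-details understanding described above.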

Complex questions

Not all of the recent hires work in computer systems; the department also aims to expand its theoretical contingent. Assistant professor Andrew Drucker conducts research in classical and quantum complexity theory, seeking to understand the inherent limits of efficient computation in terms of resources such as running time and storage space.

Central to Drucker’s field is the P vs. NP question, which asks: are there computational problems for which finding solutions is more difficult than verifying solutions? For one class of problems called “NP-complete,” finding a solution is believed to take dramatically more time as the problem increases in size. For example, the number puzzle Sudoku becomes harder to solve as its grid grows larger. These types of problems seem not to be reliably solvable by any algorithm in a realistic timeframe—at least that’s the conjecture, formalized as the statement “P ≠ NP.” While proving this central conjecture appears out of reach, Drucker studies the power of various restricted classes of algorithms to solve NP-complete problems, aiming to improve our understanding of this enigmatic issue.
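
The asymmetry at the heart of P vs. NP can be made concrete with the NP-complete subset-sum problem: checking a proposed solution takes only a quick sum, while the obvious way to find one examines all 2^n subsets. A small illustrative sketch:

```python
# Subset sum, an NP-complete problem: given some numbers, is there a
# subset adding up to a target? Verifying a proposed subset is fast;
# the naive search for one tries every subset, exponential in the input.

from itertools import combinations

def verify(numbers, subset, target):
    """Fast check: is the proposed subset drawn from numbers and on target?"""
    return all(x in numbers for x in subset) and sum(subset) == target

def find_by_brute_force(numbers, target):
    """Exhaustive search: examines up to 2**len(numbers) subsets."""
    for r in range(len(numbers) + 1):
        for subset in combinations(numbers, r):
            if sum(subset) == target:
                return list(subset)
    return None

solution = find_by_brute_force([3, 9, 8, 4], 12)
```

Whether every such search problem secretly admits a fast finding algorithm is precisely what P vs. NP asks; the conjecture P ≠ NP says no.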

Resisting malicious attacks

The first computer security faculty member to join the department, assistant professor Ariel Feldman became interested in his specialty when he realized that computer security is at the heart of issues with “social, legal, and policy significance, such as privacy and voting.” Appointed in July 2014, Feldman combines computer security and distributed systems in research that involves making and breaking secure systems. Offensive research—breaking through security, conducted ethically and legally in a lab setting—uncovers new classes of vulnerabilities and ways to exploit them, informing the design of better defenses.

Feldman’s research presently focuses on protecting the security of “cloud-hosted” services and the privacy of their users by reducing the degree to which parties must trust each other. Currently, users of services like Google and Facebook have no control over those companies’ security measures or actions: the companies might be attacked, might secretly use personal data, or might disclose information with or without a warrant. Feldman is working to redesign these services to return control to the user.

Across divisions and beyond

The department has strong ties with statistics and mathematics, and some of these new faculty members will likely collaborate with other divisions—for instance, Feldman with the Harris School of Public Policy and the Law School, and Chong with the Institute for Molecular Engineering. The department also has “synergies” with the neighboring Toyota Technological Institute at Chicago. UChicago’s computer science department is outward looking, says Dupont, “probably more than most.”

With the addition of these new faculty, the department “is pursuing a bold agenda that fuses deep computer science research with challenging problems emerging from real societal-scale systems and applications,” says Chien, “to create a vibrant research community” that reaches across the University and city of Chicago.

Story by Maureen Searcy at the University of Chicago Magazine