HEIDELBERG, Germany—Every September, a critical mass of the world’s most decorated computer scientists and mathematicians gathers in the warm microclimate here. They discuss the states of their fields and mentor 200 undergraduate, graduate and postgraduate students from around the world selected in a highly competitive process.
“It feels like coming home,” said Vinton Cerf, Google’s vice president and chief internet evangelist, who is also known as one of the “fathers of the internet” for having developed, along with Robert Kahn, the internet’s architecture and protocol suite known as Transmission Control Protocol/Internet Protocol (TCP/IP). For this work, Cerf and Kahn won the Turing Award—the so-called Nobel of computing.
The young researchers who attended this year’s Heidelberg Laureate Forum—as the event is known—were able, for example, to chat over coffee with Yann LeCun (“godfather of artificial intelligence”), go for a walk with Whitfield Diffie (“father of public key cryptography”) or take a boat ride on the Neckar River with Shwetak Patel, a MacArthur Fellow whose groundbreaking work in human-computer interactions has improved the lives of millions. The forum is an intimate, invitation-only gathering modeled after its scientific partner, the Lindau Nobel Laureate Meetings held each July in Lindau, Germany.
Though the 28 laureates in attendance this year gave and listened to each other’s talks with optimistic titles such as “Computing for Social Good,” Inside Higher Ed took the opportunity to ask them questions about computer science’s challenges in higher education.
These luminaries are concerned about how to teach computer science today, given the breakneck pace of developments, faculty shortages and an unrealized need to integrate ethics into the curriculum. They also have misgivings about some ed-tech tools, about gaps in interdisciplinary dialogue and about improving-but-still-low participation rates among women—concerns made more urgent by the field’s role in developing tech products that change how people live.
Missing Seats at Important Tables
Researchers across academic departments use computer science tools to address an array of problems in health care, weather forecasting, ecommerce, transportation, finance, agriculture, energy systems, manufacturing, environmental monitoring, national security and more. But that does not mean that those researchers always consult with the computer scientists who supply the computing tools.
“We’re seen as a bunch of geeks who provide the raw materials for them but not necessarily as equal players,” said Cherri Pancake, past Association for Computing Machinery (ACM) president and professor emeritus at Oregon State University. “What we really need to bring to the table is not our software or our tools but our fundamentally different way of looking at problems and coming to solutions.”
Computer scientists have long warned that computing applications carry risks. Pre-eminent British scientist Stephen Hawking, for example, warned that artificial intelligence could end mankind. Last month, a paper published by Google and Oxford scientists concluded that a sufficiently advanced artificial agent could elicit “catastrophic consequences.”
“As we go forward [and] try to solve these really existential challenges for mankind, computer scientists need to step up and bring our fundamentally different ways of looking at the universe,” Pancake said.
Fast Pace of Developments Presents Teaching Challenges
Every hour in 2019, more than three artificial intelligence preprints were submitted to arXiv—an open-access repository of electronic scientific preprints. That was over 148 times the 1994 rate, according to a Journal of Informetrics study. On the AI subtopic of deep learning alone, more than one preprint was submitted every hour—a 1,064-fold increase from the 1994 rate.
“It’s kind of like in medical school when they talk about the ‘half-life of knowledge.’ The medical school dean tells graduates, ‘In five years, half of what we tell you will turn out to be false,’” said Alexei Efros, computer science professor at the University of California, Berkeley. “The half-life of knowledge in computer science is quite short. In machine learning, it is about three months.” Efros earned the ACM Prize in Computing for his groundbreaking data-driven approaches to computer graphics and computer vision.
That makes teaching computer science challenging, according to Efros, who noted that he had not had time to check arXiv during the month he had been traveling but had since discovered that “already five things changed while I was away.”
Questions About How to Teach Foresight and Ethics
Social media has brought people with shared interests together, which sounds good, except that it has also connected, for example, supporters of terror, extremism and hate in ways that some argue have undermined democracy. For this reason and others, some computer scientists seek to integrate ethics into their curricula. But how to do that is unclear.
“Do we do this by having a required ethics course?” said Barbara Liskov, MIT Institute Professor of computer science. “Or should every course have ethics in it?” Liskov is an early computer science pioneer who earned the Turing Award for contributions to the practical and theoretical foundations of programming languages and system design.
Training in ethics, however, is a necessary but not sufficient condition to avert technology’s unintended consequences, Liskov said. Students and practitioners of computer science also need to learn how to anticipate problems before they occur.
“We used to naïvely think, ‘Oh, isn’t it wonderful that we could have these groups [on social media] where you could talk to people who are like you?’ Now we know this doesn’t really work all that well,” Liskov said.
“The days of being naïvely technical, which we were for many years, are over,” Liskov said. “We need to open students’ minds so they think about the harm that can come from what they’re doing and so they ask, ‘What could I add that could act as a safeguard?’ It’s more than ethics. They need to think from a different perspective.”
Missed Opportunities to Include Experts From Other Fields
Computer scientists may be able to look back and identify patterns in their field’s development, but they cannot predict its trajectory looking forward.
“The No. 1 question that all first- or second-year graduate students ask me is, ‘What’s going to be hot in two years?’” Efros said. “The question presupposes determinism. It’s not predetermined. It’s like evolution.”
Without clear goalposts, computer scientists are engaged in an evolutionary process, often in response to substantial real-world needs. Technology enabled colleges to offer remote schooling during pandemic lockdowns, which was positive, except that it also amplified educational inequities experienced by low-income and underrepresented students.
The research community that enables the creation of technology, however, does not always consult with, or have access to, psychologists, anthropologists, sociologists, neuroscientists and other experts, according to Cerf. These individuals might help computer scientists understand how people may react to new tech environments and applications.
“All of us in the online world that provide products and services need to be more cognizant than we have been about the impact of these technologies on our social and economic lives,” Cerf said.
Departments Spread Too Thin
Students in a variety of disciplines beyond computer science need computing skills specialized to their subject areas. But that development is not without challenges for already-stretched-thin computer science departments.
“How do you choose between teaching computer science students and students in other disciplines that need and legitimately deserve some [computer science] education also?” said Eric Brewer, computer science professor emeritus at the University of California, Berkeley. “Do we have to choose? And if we’re going to choose, how do we choose?”
Berkeley, for example, offers three different courses in discrete math—each tailored to different disciplines. When those departments help teach the courses, everyone wins, Brewer said. “They know what they want and, more importantly, they are providing some actual manpower or womanpower to teach it.”
This solution also helps reduce tensions with departments that may envy computer science departments’ relatively high allocations in terms of faculty hires and other resources.
“You can say it’s driven by undergraduate demand, but it doesn’t make it any more desirable by the other departments,” Brewer said. “A joint model spreads that allocation out a bit better.”
Questionable Social Surveillance Via Ed-Tech Tools
Some ed-tech products undermine educational objectives, said Raj Reddy, professor of computer science and robotics at Carnegie Mellon University, who won the Turing Award for pioneering work in artificial intelligence and human-computer interaction.
“The biggest use of mining of data for student applications is plagiarism detection,” Reddy said. “In fact, we should promote copying. If you’re doing a great thing, I want to learn from you and copy it!” Reddy suggests that faculty members should spend less time with plagiarism software and more time determining whether students understand concepts, even in the event that their work is modeled after others’ work.
Shannon Vallor, chair in the ethics of data and artificial intelligence at the Edinburgh Futures Institute at the University of Edinburgh, also encourages faculty members and students to think critically about expanded social surveillance.
“As data-hungry models become the dominant trend in deep learning, what we see is that that incentivizes a certain kind of social phenomenon,” Vallor said. “It incentivizes investment in expanded systems of social surveillance and more intrusive forms of data extraction … We have to ask ourselves, ‘What does society look like at the end of that road?’”
Brain Drain to Private Sector
More than 7,500 students in Washington State—the home of Microsoft’s headquarters—applied for admission to the University of Washington’s computer science and technology programs this year. But without enough computer science faculty to meet the demand, UW admitted only 7 percent of those applicants—an acceptance rate on par with undergraduate acceptance rates at Brown and Yale.
Such high student demand coupled with significant computer science faculty shortages is playing out at colleges across the United States.
“We’re eating our own seed corn,” Cerf said. “Expertise doesn’t grow on trees. It grows in universities and schools of research. We need to keep those populated.”
“The salary structure is a killer,” said Jeffrey Ullman, computer science professor emeritus at Stanford and Turing Award recipient. “When you can earn three times as much by doing coding, why would you teach coding? It can’t be a good idea to keep to your standard salary scale and take what you can get.”
“Every department is trying to figure out how to teach more students with the same number of people,” Brewer said. “They don’t have enough grad students to [assist] all the classes, so they have undergrad [teaching assistants]. Then you have to figure out how to train undergrad TAs. We try to be inclusive and take as many students as logistically possible, but that is an ongoing challenge.”
Cerf, who has logged full-time stints in academe, government and the private sector, hopes the computer science community can enable more professionals to transition seamlessly in and out of academe over their careers.
“Maybe some of the tools that we’ve evolved during the pandemic will turn out to be useful because it makes it possible to teach remotely,” Cerf said.
Resistance to Important, Unconventional Ideas
When Ralph Merkle, an undergraduate at UC Berkeley in the 1970s, proposed a project to develop a cryptographic system, his professor dubbed his idea “muddled,” according to Martin Hellman, a professor emeritus of electrical engineering at Stanford. Merkle dropped the class and worked on the project on his own. When he later submitted a paper based on the results to the Communications of the Association for Computing Machinery, it was not accepted.
“One reviewer rejected it because ‘the paper is not in the mainstream of present cryptographic thinking,’” Hellman said. “Of course it wasn’t. It was groundbreaking.”
Merkle, working on his own, and Hellman and Diffie, working together, later developed public key cryptography—the technology that permits us, for example, to enter credit card numbers online with confidence. Hellman and Diffie won the Turing Award for this work, but Merkle’s contribution was not recognized.
“Ralph came up with half of public key cryptography—the privacy half—on his own, independently of us, and actually slightly prior to us,” Hellman said.
Like Merkle, Yann LeCun had trouble getting his ideas heard as a graduate student in the 1980s. At first, he told Inside Higher Ed, no faculty member would agree to work with him on research that was the embryo of neural networks—machine learning algorithms inspired by the brain’s structure and function. Eventually, he found a faculty member who told him, “I have no idea of what you are working on, but you seem smart enough.”
Today, LeCun is a professor of data science, computer science, neural science and electrical engineering at New York University and chief AI scientist at Meta. He won the Turing Award, along with Yoshua Bengio and Geoffrey Hinton, “for conceptual and engineering breakthroughs that have made deep neural networks a critical component of computing.” The three computer scientists are referred to as the “godfathers of AI.”
Groundbreaking ideas are sometimes only recognized as such after some time has passed. Before that, several of the computer science laureates suggested, they may be overlooked.
Problems With Underrepresentation
At least Merkle and LeCun identified paths to ensure their important ideas were heard, which raises the question of whose ideas are heard.
Women participate in computer science higher education at one of the lowest rates across all science and engineering fields, according to the National Science Foundation. Tim Cook, Apple’s chief executive, told the BBC this week that there were “no good excuses” for women’s underrepresentation in technology and that the sector “will not achieve nearly what it could achieve” without a more diverse workforce.
That said, women’s participation is increasing. The number of women earning computer science bachelor’s degrees doubled (from 7,580 to 16,000 students) between 1998 and 2018 (the most recent available data), as did the number of women earning doctoral degrees (from 140 to 430) during that same period; the number of women earning master’s degrees in computer science quadrupled (from 3,430 to 15,100), according to the NSF.
At the same time, the dearth of women Turing Award recipients cannot be explained by women’s underrepresentation in the field. Since 1966, 75 computer scientists have won the Turing Award, only three of whom have been women: Barbara Liskov, Frances Allen (who is deceased) and Shafi Goldwasser. That means women account for 4 percent of recipients of the prestigious award, even though they earned approximately 22 percent of Ph.D.s in 2018, down from the 1987 peak of 37 percent.
Recipients of million-dollar prizes like the Turing Award are often called on to meet with business leaders and advise politicians. They also are sought after to inspire and mentor young researchers, as many volunteered to do in Heidelberg. That’s a challenge when there are so few.
“It’s really sad,” Liskov said. When it comes to living Turing Award recipients, “there are only two of us.”