Why Many Computer Science Programs Are Stagnating

If you think about it, computer science (CS) has had, at best, a rocky relationship with education.

Let’s rewind for a minute. Born from the merger of algorithm theory, mathematical logic and the invention of the stored-program electronic computer in the 1940s, computer science wasn’t a standalone academic discipline in universities until at least 20 years later.

Initially, most people thought the study of computers was a purely technical job for select industries, like defense or aerospace, rather than a widespread academic science. The proliferation of computers in the 1990s pushed universities to create standard computer science departments to teach students the fundamentals of computing, like algorithms and computational thinking.

Fast forward to today: the average computer science department is still handing out a routine syllabus with lectures, books and lab assignments about the theory of writing programs. Sure, there have been a few updates here and there in the short history of CS, but, with the exception of elite or small programs, the educational structure always lags behind the sheer pace of advancement in the tech industry. Here’s why:

There’s No Feedback Loop Between Industry & Universities

When Ashu Desai, founder of Make School, was studying computer science as an undergrad at UCLA just a few years ago, he would routinely skip classes to work on building bluetooth accessories for iPhones in his dorm room.

“Nothing I learned at UCLA helped me build my startup,” Desai says. “I had reached out to various CS and EE professors for help, and while they were enthusiastic about my work, they were unable to help me with the project.”

One professor even suggested that, rather than working alongside experts in a lab, he work outside of the UCLA lab to avoid the risk of losing ownership of his product.

It’s ironic, really. Some of the most sophisticated technological breakthroughs have happened in university research labs, but the undergraduates down the hall are stuck learning the same concepts as peers who came 10 years before them.

Plus, while most CS professors are highly intelligent and deeply knowledgeable about computer science, they often lack industry know-how. That’s circumstantial more than anything: the academic career path doesn’t normally involve industry experience. There’s one exception to this rule. Elite universities have a major advantage because they have the resources to pay industry experts more than the average university can.

We ran a little experiment to test this hunch. After comparing 20 random faculty members at Carnegie Mellon University, a top computer science program, with 20 random faculty members at the lesser-known University of Houston, we found a pretty significant difference:

[Chart: comparison of 20 random CS faculty members at Carnegie Mellon vs. the University of Houston]

While this is by no means an exhaustive account, it’s a good anecdotal indication of the elite advantage. In the majority of computer science programs, there is a substantial gap between the curriculum and industry demands, trends and technologies.

We’d be remiss not to recognize the efforts of those who are actively working to bridge this gap. The Joint Task Force on Computing Curricula, run jointly by the Association for Computing Machinery (ACM) and the IEEE Computer Society, has historically been critical to shaping the CS curriculum as pioneers of the discipline.

And the group does try to involve some industry professionals in creating its curriculum recommendations. Unfortunately, it’s not always a rosy picture. For instance, one of the biggest, repeated concerns industry folks mentioned in 2013 was the lack of security and of parallel and distributed systems as core parts of student preparation for the real world.

“Indeed, feedback during the CS2008 review had also indicated the importance of these two areas, but the CS2008 steering committee had felt that creating new KAs was beyond their purview and deferred the development of those areas to the next full curricular report.” (p. 13)

The Joint Task Force’s attempt to update the CS syllabus is noble and commendable. But the very nature of higher education puts too many limits on what it can accomplish.

All of these findings raise a larger question: Without an effective feedback loop between industry and brick-and-mortar universities, how well are current syllabi preparing CS undergrads for the industry?

Brick & Mortar University Infrastructure isn’t Built to Support the Pace of Tech

If the university is the wise but sluggish grandparent, computer science is the restless two-year-old tot. Universities simply can never catch up to the rapid pace of software technology.

The reason is twofold. First, there’s Moore’s Law, Intel cofounder Gordon Moore’s famous 50-year-old observation turned prediction that the number of transistors on a chip, and with it computing power, doubles roughly every one to two years. Moore’s prediction has largely held so far, and technology in general is still evolving. Looking further down the line, the promise of quantum computing puts an entirely new wave of commercial innovation on the horizon that will impact the industry significantly. How can universities logistically keep up?
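Moore’s observation is ordinary compound growth, which makes it easy to see why a fixed curriculum falls behind. Here is a minimal sketch; the two-year doubling period and the `moores_law` helper are illustrative assumptions, not details from the article:

```python
# Moore's Law treated as simple compound growth. The 2-year doubling period
# is an assumption for illustration; Moore's own figure varied over the years.

def moores_law(initial_count: float, years: float, doubling_period: float = 2.0) -> float:
    """Project capacity after `years`, doubling every `doubling_period` years."""
    return initial_count * 2 ** (years / doubling_period)

# Over a typical 4-year degree, hardware capacity roughly quadruples,
# so a curriculum frozen at matriculation is already dated at graduation.
print(moores_law(1.0, 4))  # 4.0
```

The point of the sketch is only the shape of the curve: exponential hardware growth against a curriculum that updates, at best, every five years.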

To add new teachings, universities must subtract. The Joint Task Force’s recommendations were updated every decade until 2008, when the group decided to shorten the cycle to every five years. If you look at its CS2001 recommendation, it’s very clear that the curriculum is forced to focus on breadth:

“Over the last decade, computer science has expanded to such an extent that it is no longer possible simply to add new topics without taking others away…. It is important to recognize that this core does not constitute a complete undergraduate curriculum, but must be supplemented by additional courses that may vary by institution, degree program, or individual student.” (CC2001 Computer Science volume, Final Report, December 15, 2001)


That last line is crucial and holds true for any curriculum today. Since educators can’t just keep adding new technologies to their syllabi, universities with limited resources stick to the unchanging fundamentals as a requirement. It makes sense: universities are inherently more about theory than practical application. And, in theory, you should be able to pick up new technologies and tools if you have the fundamentals down. These are good points, but a problem arises when fundamental theories don’t translate seamlessly to industry. Yes, fundamentals are undoubtedly important, but the only way to truly grasp them is to apply them to real-world projects and scenarios.

“There were so many times I was scratching my head in college, to the point where I gave up after a while because I couldn’t visualize where to start,” says Anubhav Saggi, a software engineer who majored in computer science at UCLA. “If you can’t internalize the fundamentals [through real-world projects] in a way that makes sense to you, then you won’t be able to really understand or appreciate why current tech works the way it does.”

In other words, simply listening to a lecture on algorithms is okay, but actually carrying out real-world programming tasks using current technology is better. Oftentimes, it’s up to students to be proactive and do that themselves.

It’s Hard to Support the Demand of CS Majors As Well

How can universities focus on updating their curriculum if they don’t have enough professors to support the surge of CS majors?

[Chart: growth in CS major enrollment]

Source: TechCrunch

A report from the University of Washington found an astounding 300% increase in freshman CS majors there over the last four years. Again, some elite programs are fortunate enough to have supporters with deep pockets, like Harvard University, which recently announced an expansion thanks to Steve Ballmer’s generous donation of $60 million. No big deal.

The majority, though, suffer from this problem: they need more funding for tools and resources to make classes more hands-on and applicable to today’s technologies while supporting the spike in enrollment. Otherwise, students who learn by doing rather than listening, like Saggi and Desai, suffer the consequences of a reduced quality of education: larger classrooms, more lectures and fewer resources for hands-on CS learning. Or, worse:

“We’re turning away many students we’d love to have,” Ed Lazowska, the Bill & Melinda Gates Chair in Computer Science & Engineering at the UW, told GeekWire. “That’s the tragedy.”

But Learning Can Be Fun, Hands-On and Flexible

So, where does that leave students today? The most successful software engineers usually spend some time in the real world to get the hang of things. There’s a lot of StackOverflow-ing and general Google-ing involved. Without a structured way to visualize and apply teachings to current, evolving technologies, it’s largely self-teaching at the moment.

This is also one strong explanation for the recent rise of Massive Open Online Courses (MOOCs). Since traditional brick-and-mortar universities can’t support the spike, nonprofit platforms have stepped in; edX, founded by Harvard and MIT, has reached 1.25 million students. Both professors and students can look to online resources to test their classroom knowledge. For instance, Tom Murphy, a computer science teacher at Contra Costa College, says:

“I consider problem solving to be one of the most important skills to foster in computer science students, usually accomplished via challenging coding problems.”

Still, nothing can replace the hands-on experience of applying knowledge to real-world problems. Students who feel ill-prepared must be proactive about getting that experience, whether by getting an internship, contributing to open source or practicing real-world challenges online, like the security programming referenced above.

The evolving nature of computer science can’t be confined to brick-and-mortar lecture halls. But adopting technological tools that make hands-on training easier and supplement the evergreen fundamentals taught at universities is crucial to better preparing CS grads for the tech industry.


Lead image: washington.edu


37 Replies to “Why Many Computer Science Programs Are Stagnating”

  1. Traditional universities are probably never going to do computer science all that well because technology moves too fast, and expecting potential students to just be proactive is not enough to generate enough competent graduates. Something else is needed.

  2. You’ve left out a very key issue: because of the realities of universities, most professors have industry experience from 20 years ago. A university with marginally growing enrollment will have hired its CS professors in 1980, staffed up, and then probably kept all of them, because being a tenured professor is a pretty good gig.

    While I wouldn’t want to simply remove older professors, for a fast moving vocational degree like CS, one wonders how a tenure-for-life system could keep up with a degree that almost completely reinvents itself every 20 years.

  3. If a subject can’t generate research papers, there’s no role for academia in it. That’s why software engineering has been abandoned by universities. Likewise, programming and other practicum skills will remain tangential at best among unis and colleges.

    The solution is to look elsewhere to develop such skills. Like automotive mechanics and HVAC repair, most programming skills should be fostered as a trade-like skill, where breadth of competency (in multiple languages and tools) takes precedence over depth, and the language du jour can be given its proper priority: satisfying the ephemeral fancies of the employer of the month.

  4. Ech… I’ve been working with graduates, dropouts and people without any CS studies at all. While it’s true that universities teach outdated languages, those institutions also teach things like teamwork, debugging skills, googling and general searching skills, and some basic algorithms and implementations. Many skills you usually don’t notice you have until you see someone without them. And missing them is a big pain in the back-part. New technology can be learned quite rapidly. Basics like how to go about implementing business ideas need time and a lot of practice. Three or four years of at least part-time code-writing can be a big help. You already have some practice, you know how to fix your own bugs, etc.

  5. This is why I took a college program instead of university. Every professor has 10+ years in the field and they teach you more than just the technical skills. You learn how to deal with different types of people as well (QA, business, clients). I also got lucky in that our program had a 16-month co-op after 2 years. This gives you so much experience and lets you get a feel for what areas you want to focus on during your third year. Universities teach you the fancy math stuff. But colleges teach you how to actually build things.

  6. I think this article shows how bad the pedagogical approach of universities is at making connections between practice, theory, and implementation. Implementation is the part where college classes fall down. But industry is also to blame. The business climate is driven by fashion and novelty. All of the systems developed in the ’60s are still as valuable and useful today as they were then. The great achievements in industry are always degradations of some greater achievement in computer science research and development. Not because the computer science development was bad, but because it was not intuitive or easy to comprehend. Slowly the industry moves toward the core ideas that were developed in the ’40s, ’50s, ’60s, and ’70s.

    For example, Turing completeness proves that all of our modern languages are equivalent. But no one believes it. Partly, that is because computer science is counterintuitive and is not taught in a way that reveals its power. But mostly it’s because the developer confronted with an implementation problem does not go back to first principles, and instead just makes do with what he or she knows. JavaScript is the modern classic example. It’s a fussy language that is slowly fracturing in all kinds of directions. And why is that? Because the original developer of JavaScript, though intending to put Scheme in the browser (a really good design decision), was instead tasked with putting a “Java-like” language in the browser. Somebody in business making a computer science decision. And that is what happens in industry all the time.

    And then some other company will come along where the founders hew closer to good theory and develop a better design because of it. And if that business also does a great implementation of that design, they are a winner. And the rest of business confuses itself thinking it’s how things looked that mattered, not the underlying computer science and the product implementation. This is how Apple has always won, and when they stop caring about both things, why they have lost. It’s why Google rose to dominate search (good computer science, great implementation).

    So don’t blame colleges and universities for the failures of computer science pedagogy. Because often some administrator and the students are all saying: “you have to teach Java!” And that focus misses the point completely. It mistakes understanding for implementation.
