If you think about it, computer science (CS) has had, at best, a rocky relationship with education.
Let’s rewind for a minute. Computer science was born at the intersection of algorithm theory, mathematical logic and the invention of the stored-program electronic computer in the 1940s, but it wasn’t a standalone academic discipline in universities until at least 20 years later.
Initially, most people saw the study of computers as a purely technical job for select industries, like defense or aerospace, rather than a widespread academic science. The proliferation of computers in the 1990s pushed universities to build standard computer science departments to teach students the fundamentals of computing, like algorithms and computational thinking.
Fast forward to today: the average computer science department is still handing out a routine syllabus of lectures, books and lab assignments about the theory of writing programs. Sure, there have been a few updates here and there in the short history of CS, but, with the exception of elite or small CS programs, the educational structure consistently lags behind the sheer pace of advancement in the tech industry. Here’s why:
When Ashu Desai, founder of Make School, was studying computer science as an undergrad at UCLA just a few years ago, he would routinely skip classes to work on building Bluetooth accessories for iPhones in his dorm room.
“Nothing I learned at UCLA helped me build my startup,” Desai says. “I had reached out to various CS and EE professors for help, and while they were enthusiastic about my work, they were unable to help me with the project.”
One professor even suggested that, rather than working alongside experts in a lab, he work outside the UCLA lab to avoid the risk of losing ownership of his product.
It’s ironic, really. Some of the most sophisticated technological breakthroughs have happened in university research labs, yet the undergraduates down the hall are stuck learning the same concepts as peers who came 10 years before them.
Plus, while most CS professors are highly intellectual and deeply knowledgeable about computer science, they tend to lack industry know-how. It’s purely circumstantial: the academic career path doesn’t normally involve industry experience. There’s one exception to this rule. Elite universities have a major advantage because they have the resources to pay industry experts more than the average university can.
We did a little experiment to test this hunch. After comparing 20 random faculty members at Carnegie Mellon University, a top computer science program, with 20 random faculty members at the lesser-known University of Houston, we found a pretty significant difference.
While this is by no means an exhaustive account, it’s a good anecdotal indication of the elite advantage. At the majority of computer science programs, there is a substantial gap between what the university teaches and industry demands, trends and technologies.
We’d be remiss not to recognize the efforts of those who are actively working to bridge this gap. The Joint Task Force on Computing Curricula, convened by the Association for Computing Machinery (ACM) and the IEEE Computer Society, has historically been critical to shaping the CS curriculum as pioneers of the discipline.
And the group does try to involve industry professionals in creating its curriculum recommendations. Unfortunately, it’s not always a rosy picture. For instance, one of the biggest, repeated concerns industry folks raised in 2013 was the absence of security, and of parallel and distributed systems, as core parts of student preparation for the real world.
“Indeed, feedback during the CS2008 review had also indicated the importance of these two areas, but the CS2008 steering committee had felt that creating new KAs was beyond their purview and deferred the development of those areas to the next full curricular report.” (p. 13)
The Joint Task Force’s attempt to update the CS syllabus is noble and commendable. But the very nature of higher education puts too many limits on what it can accomplish.
All of these findings raise a larger question: Without an effective feedback loop between industry and brick-and-mortar universities, how well is the current curriculum preparing CS undergrads for the industry?
If the university is the wise but sluggish grandparent, computer science is the restless two-year-old tot. Universities simply can never catch up to the rapid pace of software technology.
The reason is twofold. First, there’s Moore’s Law, Intel cofounder Gordon Moore’s famous 50-year-old observation turned prediction that computing power roughly doubles every 18 to 24 months. Moore’s prediction has held up so far, and technology, in general, is still evolving. Looking further down the line, the potential of quantum computing puts an entirely new wave of commercial innovation on the horizon that will significantly impact the industry. How can universities logistically keep up?
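To put that compounding in perspective, here is a minimal back-of-the-envelope sketch in Python. The 18-month doubling period is our assumption for illustration, not a precise figure:

```python
# Back-of-the-envelope: compounding growth in computing capability,
# assuming capability doubles every 18 months (an illustrative
# reading of Moore's Law; the exact period is debated).

DOUBLING_PERIOD_YEARS = 1.5  # assumption: one doubling every 18 months

def growth_factor(years: float) -> float:
    """Multiplier on computing capability after `years` years."""
    return 2 ** (years / DOUBLING_PERIOD_YEARS)

print(f"Over a 4-year degree: ~{growth_factor(4):.0f}x")   # ~6x
print(f"Over a 10-year span:  ~{growth_factor(10):.0f}x")  # ~102x
```

Even under this rough assumption, the technology a freshman starts on is several times more capable by graduation, and a decade-old syllabus is two orders of magnitude behind.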
Second, to add new teachings, universities must subtract old ones. The Joint Task Force’s recommendations were updated every decade until 2008, when the group moved to a five-year cycle. If you look at its CS2001 recommendation, it’s very clear that the curriculum is forced to focus on breadth:
“Over the last decade, computer science has expanded to such an extent that it is no longer possible simply to add new topics without taking others away…. It is important to recognize that this core does not constitute a complete undergraduate curriculum, but must be supplemented by additional courses that may vary by institution, degree program, or individual student.” (CC2001 Computer Science volume, Final Report, December 15, 2001, p. 13)
That last line is crucial, and it stands true for any curriculum today. Since educators can’t just keep adding new technologies to their syllabi, universities with limited resources stick to the unchanging fundamentals as a requirement. It makes sense. Universities are inherently oriented toward theory rather than practical application. And, theoretically, you should be able to pick up new technologies and tools if you have the fundamentals down. These are good points, but the problem arises when fundamental theories don’t translate seamlessly to industry. Yes, fundamentals are undoubtedly important, but the only way to truly grasp them is to apply them to real-world projects and scenarios.
“There were so many times I was scratching my head in college, to the point where I gave up after a while because I couldn’t visualize where to start,” says Anubhav Saggi, a software engineer who majored in computer science at UCLA. “If you can’t internalize the fundamentals [through real-world projects] in a way that makes sense to you, then you won’t be able to really understand or appreciate why current tech works the way it does.”
In other words, simply listening to a lecture on algorithms is okay, but actually carrying out real-world programming tasks using the most current technology is better. Oftentimes, it’s up to students to be proactive and do it themselves.
How can universities focus on updating their curriculum if they don’t have enough professors to support the surge of CS majors?
A report from the University of Washington found an astounding 300 percent increase in freshman CS majors there over the last four years. Again, some elite programs are fortunate enough to have supporters with deep pockets, like Harvard University, which recently announced an expansion thanks to Steve Ballmer’s generous donation of $60 million. No big deal.
The larger majority suffers from this problem: they need more funding for tools and resources to make classes more hands-on and applicable to today’s technologies while supporting the spike in enrollment. Otherwise, students who learn by doing rather than by listening, like Saggi and Desai, suffer the consequences of a reduced quality of education: larger classrooms, more lectures and fewer resources for hands-on CS learning. Or worse.
So, where does that leave students today? The most successful software engineers usually spend some time in the real world to get the hang of things. There’s a lot of Stack Overflow-ing and general Googling involved. Without a structured way to visualize and apply teachings to current, evolving technologies, CS is largely a self-taught study at the moment.
This is also one strong explanation for the recent rise of massive open online courses (MOOCs). While traditional brick-and-mortar universities can’t support the spike, nonprofit platforms like edX, founded by Harvard and MIT, have reached 1.25 million students. Both professors and students can look to online resources to test their classroom knowledge. For instance, Tom Murphy, a computer science teacher at Contra Costa College, says:
“I consider problem solving to be one of the most important skills to foster in computer science students, usually accomplished via challenging coding problems.”
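For a flavor of what that looks like in practice, here is a hypothetical example of such a challenge, sketched in Python; the problem choice is ours, not Murphy’s: detect whether a linked list contains a cycle using constant extra memory.

```python
# A classic challenge: does a singly linked list contain a cycle?
# Constraint: O(1) extra space (Floyd's tortoise-and-hare algorithm).

class Node:
    def __init__(self, value):
        self.value = value
        self.next = None

def has_cycle(head: Node) -> bool:
    slow = fast = head
    while fast and fast.next:
        slow = slow.next           # tortoise moves one step
        fast = fast.next.next      # hare moves two steps
        if slow is fast:           # they can only meet inside a cycle
            return True
    return False

# Quick test: a -> b -> c -> b forms a cycle.
a, b, c = Node("a"), Node("b"), Node("c")
a.next, b.next, c.next = b, c, b
assert has_cycle(a)
c.next = None  # break the cycle
assert not has_cycle(a)
```

A lecture can explain why the two pointers must eventually meet inside a cycle; writing and testing the function yourself is what makes that argument stick.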
Still, nothing can replace the hands-on experience of applying knowledge to real-world problems. Students who feel ill-prepared must be proactive about getting that experience, whether by landing an internship, contributing to open source or practicing real-world challenges online, like the security programming referenced above.
The evolving nature of computer science can’t be confined to brick-and-mortar university lecture halls. But adopting technological tools that make hands-on training easier and that supplement the evergreen fundamentals taught at universities is crucial to better preparing CS grads for the tech industry.
Lead image: washington.edu