The Reputation of Universities and How It Affects Us

We've got a big problem with University education in our field. Some people leave knowing a lot and with enormous amounts of practical experience, while others leave knowing very little, or unable to apply their knowledge to real-world problems.

I think this has come to be accepted as the norm. "Internships are there to fill this gap", or "You'll get practical experience at your job". But if someone spends up to 4 years at an institution (or more, with a master's degree!) and the expectation is still that they'll learn everything on the job, we're probably doing something wrong. We might as well not go.

On the other hand, we've got "coding bootcamps", which have a questionable reputation even though their focus is to provide practical experience and help their students get jobs. Their reputation is questionable because many students finish them without knowing the fundamentals of Computer Science that a University would provide: algorithms, complexity, data structures, general design principles and patterns, and so on.

When hiring, what we're interested in is whether you can do the job, along with personal traits, culture fit, and the like. Often, doing the job implies learning, so being able to learn is necessary as well. The question is: does University achieve either of these things?

Can you do the job?

At University, I didn't learn about many of the things that I use daily at my job:

  • Python
  • JSON
  • CSS
  • Git
  • Scrum
  • JavaScript
  • Any JavaScript library or framework
  • Bootstrap
  • Design of applications and user interfaces
  • Deployment of applications and continuous integration
  • Using libraries, building from source, and general knowledge
  • Servers

I did learn some things though:

  • Java
  • General Object-Oriented Principles
  • Algorithms and data structures
  • Game programming
  • How to make class diagrams and Gantt charts (like really?)
  • Bash
  • Introduction to Functional Programming
  • HTML
  • Some Agile notions
  • ASP.NET
  • How to write ethics submissions
  • Working as a team (or, more often than not, how not to work as a team, which is even more important)

Some nice things in there, and a lot of things I wouldn't have learned if I hadn't gone to University, such as functional programming, ASP.NET, or writing ethics submissions. So University is clearly useful in some measure, but why does it not teach the useful stuff as well as the fundamental stuff?

Why are there no modules on Git, JavaScript, web design, creating APIs, deploying applications, deploying servers, and other things that a lot of people are going to use daily?

Are you able to learn?

Learning the fundamentals of Computer Science can be tough; there's some maths, some complex theory, design principles, and so on, so working through them certainly builds your ability to learn. But the practical stuff also requires a lot of focus and patience, especially in the beginning when you can't make anything work, and University does not help with that.

We need to learn about developing applications and making working software. We need to be able to read documentation and pick up frameworks or libraries. We need to learn more than just Java, as other programming languages become more and more popular in the industry. We need to develop a mentality that, while difficult to inculcate in many people, is essential for learning and adapting.

Being able to do the job is nice, but with this industry's ever-changing landscape, we will inevitably become irrelevant unless we can learn, and more importantly, unless we want to learn.

The reputation of Universities

Universities have reputations to maintain, often measured by metrics like the percentage of graduates in jobs within 6 months, the percentage of students who pass their degree, or the number of papers published in research journals. I don't think these are good metrics for software development. They encourage the following:

  • Help graduates find jobs (any job)
  • Make sure graduates pass
  • Do research, and employ researchers instead of software development professionals as lecturers

There is nothing there about quality. We have a huge shortage of people in this industry, so of course most graduates will find jobs. We also want everyone to pass, so we'll lower our standards until we hit our desired pass rate. They'll find jobs anyway.

When a graduate applies for a job knowing only a bit of Java and HTML, I feel sad. Or when someone leaves University "knowing" HTML but unable to add an SSH key to their GitHub account. Or when the password to MongoLab is stored in plain text in the Git repository (we'll just change that in the next commit!).
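
That last one has a well-known fix: keep secrets out of the repository entirely, for example in environment variables. Here is a minimal sketch in Python, assuming the pymongo package and a hypothetical MONGOLAB_URI environment variable:

    import os

    from pymongo import MongoClient  # assumes pymongo is installed

    # Read the connection string from the environment instead of hardcoding it.
    # MONGOLAB_URI is an illustrative name; the point is that the secret never
    # appears in a committed file.
    uri = os.environ.get("MONGOLAB_URI")
    if uri is None:
        raise RuntimeError("Set the MONGOLAB_URI environment variable first")

    client = MongoClient(uri)

Changing the password "in the next commit" doesn't help, because Git history keeps the old value forever.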

These should be red flags that a big part of University education in Computer Science is missing. Why are our students not interested in learning more and making better software?

But they'll learn these things on the job, so that's not a problem.