Why do an IT Degree?

5 August 2014

For training companies the busiest time of the year is from August to October, when software firms bring on board their yearly intake of graduates and placement students and need them prepared for a career in industry. It’s an honour and a privilege, and something we take very seriously.

Preparation for this always rekindles some old debating topics, such as:

  • Why don’t graduates come out of college ready for industry?
  • Is it worth doing a computer science degree prior to working in industry?
  • What benefit do ‘ivory towers’ courses bring to software development?
  • What place (if any) does maths have in an IT degree?
  • Are self-taught developers better than university-educated ones?
  • Surely there are better things you could do with the money?

Reasons Why College Doesn’t Matter

It seems to me that there is a growing number of reasons why a computer science degree is less important than it used to be:

Knowledge is freely available

If you want to learn to program today you are overwhelmed with options. At Instil we have just finished mentoring a group of teenagers as part of the Festival of Code and, along with many other local IT companies, we participate in weekend ‘coding dojos’. We run the BASH meetups, whilst other companies sponsor technology-specific user groups like Belfast Ruby.

Going online you can find free, high-quality training courses from providers like Coursera, plus a deluge of blogs, articles and tutorials. Last but not least, universities like Stanford have opened up their course materials to the general public.

So for any popular programming language there is an embarrassment of riches – unimaginable to students before the internet, who had to rely on photocopied notes and expensive textbooks.

Hardware is cheap

When I was learning to program, most parents could just about afford a Spectrum or Commodore for their kids to learn BASIC on (that’s what we told Dad we were doing, anyway…). If one of your parents was a university lecturer you got a BBC Micro (no Manic Miner for you), but no one would let their child near a Personal Computer (especially a 286 with a jaw-dropping 50MB hard drive).

These days you can easily replicate at home the same environment that most software developers use at work. We even see the trend reversing, with some attendees bringing their own laptops to courses because they are more powerful than the ones issued at work.

Development Tools are universally available

Almost all the tools a modern software developer uses are available for free, either as Open Source or through ‘community editions’. This extends all the way from compilers and IDEs to databases and application servers. It’s hard to overestimate the opportunities this opens up.

You can publicise your work

If you have educated yourself in a particular field to a high enough level you can contribute to Open Source projects, answer questions on Stack Overflow, and so on. Over time you can build up an online reputation that will make employers take notice.

A computer science degree is increasingly more cost than benefit

I have met a number of people doing great work within the university system, and would not wish to diminish their efforts. However, consider the following:

  • The cost of completing a degree is continually increasing, leaving graduates with a huge burden of debt
  • Universities are under pressure to produce more and more IT graduates, whilst the aptitude of the average student falls as the number of applicants increases
  • Students are paying increasing amounts for their degree and hence (unsurprisingly) demanding a guaranteed result
  • Academics are rewarded based on their research work rather than their lecturing

Given the above, it should hardly be surprising that (despite the best efforts of individuals) the quality of university courses will inevitably drop over time. In particular, programming seems to have been marginalised in many computer science degrees and, if your degree is some kind of combined course, it is often possible to omit programming entirely.

This is not to say that universities aren’t offering great and demanding courses in difficult topics, but that the rational consumer (sorry, student) is motivated (if not encouraged) to avoid them.

Reasons Why College Does Matter

There are a few reasons why college does matter:

You need a degree to get an interview

This is certainly true. However, these days to make the cut for a first interview you only need a degree with an IT component, or one of the conversion MScs that are available.

A university education will give you maths

The relevance of mathematics to computer science deserves a series of articles in its own right. Some very interesting posts have been written on both sides of the debate. I would sum this up as follows:

  • An understanding of the mathematics underlying software engineering is extremely advantageous, especially as you progress to the higher rungs of the technical ladder. However, it is not essential
  • Ability at maths does indicate ability at programming, but so does ability at linguistics or musical composition. Maths is not a prerequisite for programming, and the best preparation for programming is programming
  • Most mathematics degrees do not teach the areas of maths most relevant to computer science
  • Most computer science courses teach maths in such a way that it is either not understood or quickly forgotten

It’s interesting that academia offers no mathematics courses aimed at working software engineers. There are some self-study online courses in relevant areas of mathematics, but if maths were as important to software engineering as some people make it out to be, then self-taught developers would be storming the barricades of academia demanding advanced degrees in ‘Maths for Computer Science’.

Where Do We Go From Here?

The following options seem to be floating around:

Go back to apprenticeships

This argument goes that we should stop sending students off to do IT degrees and instead take them straight into industry, to be incrementally educated as they progress. Students would start with basic maintenance and admin tasks and then evolve over a period of years into well-rounded software developers. Cynics would say this is already being done.

Engage with academia to improve degrees

I personally am in this camp, but it does feel like swimming against the tide.

If there is one thing I would change about computer science degrees it is the amount of collaboration. When I was a psychology undergraduate we were formed into lab groups of 4-5 and every fortnight tried to reproduce a famous study (using another group as the test subjects). The first week was spent conducting the experiment itself, and the second analysing the results and writing up the report. Everyone played a part and gained experience in each role.

Imagine a university course where students work continuously in groups, building applications using industrial tools whilst simultaneously exploring alternative languages (e.g. Idris and Haskell) and emerging technologies (e.g. functional data-stores like Datomic). Instead of writing 10,000-word essays on the theory of Agile, they could see first-hand how lightweight principles make life easier. They could even be required to switch codebases every few months, which would certainly reinforce lessons about code quality, test suites and documentation :-)

I’m amused and exasperated in equal measure by academic friends who insist this wouldn’t be appropriate for a university education, as if collaborative lab work and repeatable results were in some way non-scientific. Given that a recent report found a lack of reproducibility in computer science papers, it might be that academia could benefit from some industrial rigour.

Move to a two degree system

This option seems to be the default for lawyers and doctors in the US. You do your first degree in whatever you like and then, having proved your academic credentials, you go to IT school for two years and emerge as a well-rounded software developer with an MSc to your name. Whilst this option appeals to me as a continuous learner, it appals me as a parent (who funds all this?) and won’t increase the rate at which developers enter the industry.

Radical Change

I think everyone in education should watch this TEDx talk by Seth Godin on the future of education and what it is ultimately for. At the end he makes several concrete suggestions on the education systems of the future:

  • Homework during the day, lectures at night
  • Open book, open note all the time
  • Access to anything, anywhere all the time
  • Focused education not batch processing
  • Teachers become coaches
  • Learning becomes lifelong with work starting earlier
  • The death of the famous college (i.e. don’t pay for the brand)

It should be easier to do this for Computer Science than for any other subject. Imagine if we did…

Your Opinion Matters

I’m fascinated to know what other people think about this, both young and old and from inside and outside academia. So please contribute below and/or on the Facebook site. If there is enough interest we might try to organise some kind of public debate under the BASH umbrella…

Article By

Garth Gilmour

Head of Learning