Computer Science degrees and relevance to the real world

I was reading this article, and it reminded me of when I first entered the workforce and wondered just how relevant my university degree really was to an actual software development job. I still think computer science courses could be a lot better, but I don’t agree with modeling them on real-life projects, for a few reasons:

– Students can learn this better from work experience (which looks great on the resume, too)

– It teaches a single way of doing things, which doesn’t encourage adaptability to different processes and methods

– It will probably teach them all to be programmers

Universities teach theories and concepts because they give a solid, adaptable grounding in a field. Information Technology is constantly and rapidly evolving, and universities are training people for IT jobs that don’t even exist yet. How do they do that? By teaching fundamentals rather than any one project’s particular process.

Computer science graduates don’t all end up as programmers. They can end up as testers (hi!), database administrators, business analysts, project managers, web designers, system administrators, or in heaps of other jobs that probably have yet to be invented.

Actually, at my university (the University of Queensland), we did do a team project subject with monthly deadlines, required documentation and version control. I don’t think it was all that useful, though, because it wasn’t real. In a student project, you hand in your work, it gets graded, and that’s the end. In a real project, you deliver your work to the customer, and if it’s not good enough, you have a responsibility to fix it and to explain why it wasn’t done right the first time.

So, what improvements would I like to see made to university courses? For starters, of course, I’d like to see more testing concepts introduced. In fact, a whole heap of useful non-programming subjects could be added, like requirements gathering and interface design.
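
To make “testing concepts” a little more concrete, here’s a minimal sketch of the kind of unit test that could sit alongside a typical first-year assignment. It’s in Python, and the leap-year function is just a hypothetical stand-in, not something from any actual course:

    import unittest

    def is_leap_year(year):
        # Hypothetical assignment code: Gregorian leap year rules.
        return year % 4 == 0 and (year % 100 != 0 or year % 400 == 0)

    class LeapYearTests(unittest.TestCase):
        # Boundary cases -- the kind of thinking a testing subject would teach.
        def test_divisible_by_four_is_leap(self):
            self.assertTrue(is_leap_year(2004))

        def test_century_is_not_leap(self):
            self.assertFalse(is_leap_year(1900))

        def test_four_hundredth_year_is_leap(self):
            self.assertTrue(is_leap_year(2000))

    if __name__ == "__main__":
        unittest.main()

The code itself hardly matters; the habit of asking “how would I know if this were wrong?” before handing work in is the part worth teaching.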

I hope that as the software industry evolves, universities will incorporate software testing theory into their curricula. Then maybe someday soon we’ll see computer science graduates who want a career in testing, rather than seeing it as the job you take when you can’t get a real programming job. :)

2 thoughts on “Computer Science degrees and relevance to the real world”

  1. I would say that it wasn’t particularly well-suited even to developers: the majority of real-world development work is not writing from scratch, or adding new features to a nicely prepared receptacle, which were almost the only contexts we covered. In the real world, we maintain and we refactor, a LOT.

    As an engineering student I did a different team project, so I also graduated without ever having used version control!

    I think the software team project could best be improved by putting a major emphasis on iterative requirements changes. I don’t know how else one can begin to truly understand maintainability (with testability being one of its subsets).

  2. You raise a good point about maintainability. Refactoring or improving upon someone else’s code (while not introducing new bugs) is an essential skill to have, and probably makes up most of what developers do in practice.

    I wonder what the best way to teach that would be, and whether teaching it at university would result in better programmers.
