
Developer core competencies – what should universities cover in Computer Science courses?

October 11, 2012

This tweet from @stuherbert got me thinking:

Despite the massive shortage of programmers to fill vacancies, UK comp sci grads are the most unemployed 6 months after graduating #ouch
There’s been much talk about teaching grass-roots ‘coding’, even at Primary School level. But it’s clear to many that our university courses are woefully inadequate, out of date, and of little use to employers generally. Usually a CS grad is useless until they’ve got at least a year’s experience under their belt. Actually, they can be worse than useless – some poor bugger has to live with their code.
I thought I’d note down as many general ‘computer science’-like core competencies as I can think of that modern businesses should look for as a grounding for any developer, regardless of whether they’re doing web, mobile or software development in a start-up, an enterprise or a bank. I’d love to make this more comprehensive, so please comment below and I’ll incorporate your ideas too.
  • Version control – svn, git, hg, perforce – including branching and merging
  • Key algorithms, eg binary search (sketched after this list), bubble and quick sort, lexical analysis, etc
  • Game & animation physics (elasticity, acceleration, gravity, trajectories, etc)
  • Maths – percentages, number systems, logic (and/or/xor), bitwise operations (example below), randomness, etc
  • Business analysis, requirements, UCD, user stories, MoSCoW
  • Project Management inc Agile, Kanban
  • HCI & UI design
  • HTML & CSS
  • Basic C++
  • ‘Scripting’ with bash, ruby, python
  • Code deployment and Continuous Integration
  • How search engines work
  • Finding solutions to problems (googling, GitHub, stack exchange)
  • Evaluation of code for bad smells & code review
  • Teamwork & collaboration
  • Security – network, XSS, buffer overflows, encryption, etc
  • Intros to: Java, Objective-C, PHP, Ruby, Python
  • Frameworks & MVC for the above inc separation of logic & display
  • Building reusable code (note: inc OOP, but wider than that)
  • Technical Debt, YAGNI, premature optimisation, over and under engineering, refactoring, KISS, DRY, not-invented-here-syndrome
  • How open source works; contributing to open source.
  • Operating systems – installing, using, maintaining: Windows, Linux (CentOS + Debian?), Mac
  • Filesystems – from coding to block device
  • Threading (on OS and also stateless http), race conditions (example below), stampedes etc
  • Debugging & optimising inc memory leaks, stack traces etc
  • BDD & TDD, automated testing, unit testing etc (example below)
  • Prototyping
  • Content Management Systems (inc WordPress & Drupal)
  • Realtime
  • Caching
  • RDBMS and NoSQL
  • Storage types – strings, floats, etc – and their problems (example below)
  • A bit of history won’t hurt…
  • Hardware? Or is that electronic engineering now…?
  • Regular Expressions (thanks @mrpmeth) – example below
  • API design, usage, good and bad (thanks @fooishbar)
  • Design Patterns (eg Dependency Injection)
  • Overview of software dev philosophies, eg OOP, aspect-oriented, event-driven, Extreme Programming, etc.
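
To make the ‘key algorithms’ item above concrete, here’s a minimal binary search sketch in Python (the language is my choice for illustration, not something the list prescribes):

```python
def binary_search(sorted_items, target):
    """Return the index of target in sorted_items, or -1 if it's absent.

    Assumes sorted_items is sorted in ascending order.
    """
    low, high = 0, len(sorted_items) - 1
    while low <= high:
        mid = (low + high) // 2           # midpoint of the current window
        if sorted_items[mid] == target:
            return mid
        elif sorted_items[mid] < target:
            low = mid + 1                 # discard the lower half
        else:
            high = mid - 1                # discard the upper half
    return -1

print(binary_search([2, 5, 9, 14, 20], 14))  # -> 3
print(binary_search([2, 5, 9, 14, 20], 7))   # -> -1
```

The code itself is trivial; the competency is being able to explain why it takes O(log n) comparisons where a linear scan takes O(n).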
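
Similarly, the logic and bitwise operations under ‘Maths’ fit in a few lines. A rough sketch, using made-up permission flags purely for illustration:

```python
# Hypothetical permission flags, one bit each
READ, WRITE, EXECUTE = 0b001, 0b010, 0b100

perms = READ | WRITE               # OR combines flags          -> 0b011
can_write = bool(perms & WRITE)    # AND tests a flag           -> True
perms ^= WRITE                     # XOR toggles a flag         -> 0b001
doubled = 3 << 1                   # left shift multiplies by 2 -> 6

print(bin(perms), can_write, doubled)
```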
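
Race conditions, from the threading item, are best seen by breaking something. This sketch deliberately splits a counter update into a read and a write so updates can get lost, then repeats it with a lock (the names and numbers are just illustrative):

```python
import threading
import time

counter = 0
lock = threading.Lock()

def unsafe_increment(times):
    global counter
    for _ in range(times):
        current = counter          # read
        time.sleep(0)              # yield, inviting another thread in between
        counter = current + 1      # write back a possibly stale value

def safe_increment(times):
    global counter
    for _ in range(times):
        with lock:                 # the whole read-modify-write is now atomic
            counter += 1

for worker in (unsafe_increment, safe_increment):
    counter = 0
    threads = [threading.Thread(target=worker, args=(10000,)) for _ in range(4)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()
    # unsafe_increment will almost certainly total less than 40000; safe_increment always hits 40000
    print(worker.__name__, counter)
```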
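
And unit testing in miniature, using Python’s built-in unittest on a made-up helper function (the function and its edge case are my invention, just to show the shape of a test):

```python
import unittest

def percentage(part, whole):
    """Return part as a percentage of whole; 0.0 when whole is 0."""
    if whole == 0:
        return 0.0
    return (part / whole) * 100

class PercentageTests(unittest.TestCase):
    def test_half_is_fifty_percent(self):
        self.assertAlmostEqual(percentage(1, 2), 50.0)

    def test_zero_whole_does_not_divide_by_zero(self):
        self.assertEqual(percentage(5, 0), 0.0)

if __name__ == "__main__":
    unittest.main()
```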
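
The ‘storage types and their problems’ item is mostly about traps like binary floating point. One concrete example (again in Python):

```python
from decimal import Decimal

print(0.1 + 0.2 == 0.3)     # False – binary floats can't represent 0.1 exactly
print(0.1 + 0.2)            # 0.30000000000000004

# For money, prefer a decimal type (or integer pence/cents) over floats
print(Decimal("0.10") + Decimal("0.20") == Decimal("0.30"))  # True
```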
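
Finally, regular expressions. A deliberately naive Python example – the pattern below is a toy, nowhere near a proper email validator:

```python
import re

# Toy pattern: something@something.tld – fine for a demo, not for production
email_pattern = re.compile(r"^[\w.+-]+@[\w-]+\.[\w.]+$")

for candidate in ["dev@example.com", "not-an-email"]:
    print(candidate, "->", bool(email_pattern.match(candidate)))
```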

What’s interesting is that so many of the competencies above either didn’t exist 10 years ago, or have only become mainstream as more than just ‘best practice’ in the last 3-5 years. That’s a big challenge for education, which just can’t move quickly enough. There must be a better way of keeping it relevant.

12 Comments
  1. October 11, 2012 1:29 pm

    QA should be added to the testing, as just passing unit tests does not mean it’s of decent quality!

  2. October 11, 2012 1:36 pm

    As a CS student myself, I couldn’t agree more. In the first two years of my course, I haven’t heard anything about half of that list.

    Which is why I’m now 3 months into a year out in the industry, before returning to university for my final year. I can absolutely say that I’ve learnt more in these three months than I have in the past two years. Sure, it’s nice learning the theory behind algorithms/data structures, but actually using them – and having a deadline to hit – is entirely different.

    I do think it’s worth mentioning that universities shouldn’t be expected to teach students everything on that list (or should they?). I mean, universities should teach students how to *learn*. In these three months, I’ve used C#, XSLT, XML, Umbraco, .Net — none of which I’d used before (and I hadn’t even heard of XSLT), but it only took me a week or so to grasp, because I knew *how* to pick up the skill (or at least try). With all the red tape surrounding “curriculum”, by the time Ruby gets to the top of the list, it’ll be yesterday’s technology.

    I definitely think more universities need to push industrial years to students, though, and more small companies need to offer them. The skills I’m picking up are a lot more valuable here than in a lecture hall – and that includes non-IT skills, like the client interaction and time management that are required in the “real world”.

    • October 11, 2012 2:53 pm

      Agree re industrial years. I didn’t do a year out, but I did 3 very long summers at Vodafone. It was a massively mixed bag. The first year I wrote one or two pointless C++ test harnesses and learned how to do The Times cryptic crossword from a contract developer. The second year I did ‘year 2000 co-ordination’ involving phoning people up and adding items to a spreadsheet…

      It was only in the final year, when I was paired up with a senior dev / middle manager who had spent more than 2 mins thinking about what I should do, that it was useful. In a matter of weeks I learned PL/SQL and built a bug/feature tracking system for Vodafone’s Operations team, entirely using Oracle 8i and Oracle Web Server. The tech itself was a dead-end, but I learned a huge amount (mostly that people don’t know what they want until they see it!) and will forever be grateful to the guy who was my mentor there. Previously I’d also done a summer at The Co-Op bank and ended up building them what was basically a CRM for their B2B unit. Amazingly it was still in use years later and I came back to Y2K test it.

      So we have to find a way for more people to have experiences like I did at Vodafone (in the 3rd year anyway) and at the Co-Op bank. It requires people within a business to provide real-life projects and real-life scenarios.

  3. Jenny Wong permalink
    October 11, 2012 1:52 pm

    I’m a recent graduate and my lecturer admitted that even though his notes were prepared that summer before class, they were already out of date.

    He told us: “If you want to stay up to date, here are the places you can read the latest news, here is where you can see what the professionals are saying, and most importantly, here are the local user groups where you can meet, listen to and speak to the local professionals”.

    Out of a class of 30 students, I was the only one to go to the recommended user groups.

    Another lecturer told us: “We are here to teach you to think, not to spoon-feed knowledge into you”.

    Technology changes too quickly for academic red tape. The best they can do (and did for me) is teach us to research and be our own teachers.

    A third lecturer, who was in charge of the sandwich year, tried to rally students to take a placement year. Yet from a group of 30 students only 4 of us took the sandwich year option, even though the university didn’t charge us for the placement year (some universities still charge for it).

    To those people, I am grateful. It highlights the fact that sometimes, no matter what a lecturer or any number of lecturers say, students will be students and against all advice do nothing. These students still expect to be ready for a graduate job as soon as they leave university life.

    It is as much a problem with the mindset of the students as what the lecturers teach.

    • October 11, 2012 2:47 pm

      Yep, out-of-date notes are a huge problem. I suppose I’d like to see the core principles taught. If they’re looking out of date or in flux (eg if you were teaching version control 3 years ago as Git was growing in popularity), have a ‘what’s new’ lecture at the end touching on new ideas.

      In the real world there’s the same problem: few people keep up to date or are self-starting, and most only learn as and when they absolutely need to, never realising there’s a better way to do something.

      • Jenny Wong permalink
        October 11, 2012 3:12 pm

        I went to Salford Uni and I also know the course that York Uni teaches.

        York Uni did a module on the theory of languages. It taught the principles which enabled their students to pick up any language, because all they then needed to learn was syntax. I know it has meant my partner picked up PHP a lot faster than I ever could, because all he needed to find out was the syntax for concatenation, etc.

        The way that Salford Uni got version control into the course was not to have a module on it, but to insist that a number of standard industry practices were written into each module taken. It doesn’t matter which type of version control is used, as long as the habits of using one are there.

        There were, as always, tools that they did not teach or build into modules, but they did tell students about them and mention that they were something we should learn about ourselves.

        One of the biggest moments for me was industry/businesses coming into the classroom and telling students what they expect from them when they graduate.

        It’s a problem when a lecturer only has 2 hours a week and 10 weeks to teach a module.

        Maybe what would help is for sandwich years to become a standard requirement of a degree, taken as either the 3rd or 4th year.

  4. October 11, 2012 2:05 pm

    The problem today is that the CS spectrum is too diversified to fit into one common CS course that benefits everybody equally. IT companies are no longer just creating web sites. They are into heavily specialised areas such as Big Data, RDBMS, caching, etc. So either we make it mandatory to have a specialised post-grad degree, or we diversify the undergrad course itself from one catch-all CS degree into different broad streams.

    Perhaps something like ‘CS – Web Development’, ‘CS – Mobile’ or ‘CS – Data Analytics’. They could share similar modules, but their final-year speciality modules could differ. This could help students target the group of companies to work for, and also help companies narrow their search based on the qualification stream.

    • October 11, 2012 2:45 pm

      My degree was “Computing and Information Systems”, which was basically CS plus database-oriented and analyst-type stuff, with less hardware/embedded stuff.

      The only useful stuff was underlying algorithms, but I think I learned more about that from this book: http://shop.oreilly.com/product/9781565923980.do

  5. October 11, 2012 2:22 pm

    I think Agile software development should be on that list. Technology changes at a rate far faster than most organisations can keep up with, so it’s important that the principles of Agile be taught. CS grads need to be aware of incremental development in anticipation of, and in response to, change.

  6. October 11, 2012 3:57 pm

    Missed that. I must wear my glasses.

Comments are closed.
