
12 Bonehead Misconceptions of Computer Science Professors

by Kavan Wolfe (published on Oct 19)

The poster child for what’s wrong with postsecondary education is the computer science program. Despite the enormous need for competent programmers, database administrators, systems administrators, IT specialists, and a host of other technical professionals, computer science programs seem to explicitly ignore the professional skills of which Western society has a growing deficiency, and proceed with material and teaching styles that are outdated, ineffective, useless, and just plain wrong. This is due to the absurd misconceptions held by computer science faculty members across many universities.

I have personally met computer science professors who believe each of the following things. I make no claims as to how widespread these beliefs are; you can judge that for yourself.

1. Java is a good first teaching language

I don’t know how many computer science programs start teaching programming using Java, but there are more than a few, and that’s too many. When you’re going over variables, loops, and conditionals, the object-oriented overhead of a language like Java is unnecessary and confusing. Inquisitive students can’t just memorize incantations (e.g., public static void main(String[] args)) without demanding to know what they mean and why they’re there.
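
To see what a beginner is actually up against, here is the smallest conventional Java program (a generic sketch, not from any particular course; the class name is mine). Every keyword in the signature line is a concept a first-week student has not been taught yet.

```java
// Minimal runnable Java program. Before a student can print one line,
// they meet: classes (code must live in one), visibility (public),
// static dispatch, return types (void), and arrays of strings (String[]).
public class Hello {
    public static void main(String[] args) {
        System.out.println("Hello, world"); // the only line that does anything
    }
}
```

Contrast a scripting language, where printing a line is the entire program.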

2. Machine language is “basic”

Comp Sci people seem to be terribly confused about what ‘basic’ means. When one learns to drive a car, starting the car, making a right turn, a left turn, parking, etc. is basic. Building a parallel gas-electric hybrid engine is not basic. Driving a car is more basic than building one because the latter requires significantly more expert knowledge than the former. In the same way, using a simple scripting language requires less depth of understanding than writing in machine language; therefore, computer science education should start with higher-level languages and proceed to lower-level ones, not vice versa.

3. You should write code on paper before you write it on a computer

Writing code by hand is stupid. It is entirely inconsistent with the interactive, iterative design process that comes naturally to hackers and painters alike. Professional software developers make extensive use of API documentation, reference guides, forum discussions, etc. to troubleshoot problems and make their code more efficient and effective. Writing code by hand tests your ability to write trivially simple software without making errors. Real programmers must be capable of building complex software and detecting their errors with a variety of automated tools. Teaching or testing coding using pencil and paper is inconsistent with both the natural mode of human action and the practical realities of software development.

4. Lectures are an effective method of teaching programming

Programming is like algebra. You can’t learn how to write code by watching someone write code on a blackboard or listening to elaborate explanations from professors. You can’t learn math from watching someone do math. You learn to do things by doing them.

5. Algorithm design is learned by reading existing algorithms

Designing algorithms is about finding innovative solutions to difficult problems. Algorithm design courses are about studying existing solutions to rather simple problems. Learning how a particular problem can be solved provides approximately zero insight into how to solve problems you’ve never encountered before.

6. You can just ‘pick up’ prolog in a week for a course

There’s this crazy belief among Comp. Sci. faculty that all languages are basically the same, so after learning the principles behind languages you can use whatever. This is bullshit. It’s like claiming that since someone studied Spanish grammar in grade school, they can speak Spanish fluently, in any of the Spanish, Mexican, or Colombian accents. The leap between structured and object-oriented programming is huge, and it pales in comparison to the leap between object-oriented languages and declarative languages.
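
To make the size of that last leap concrete, here is list membership written the imperative Java way (an illustrative sketch; the class and method names are mine). The standard Prolog member/2 definition appears in the comments for comparison: the Prolog version describes what membership *is*, while the Java version spells out how to search for it.

```java
// List membership, imperative style: explicit iteration and early return.
// In Prolog the same idea is two clauses that *describe* membership:
//   member(X, [X|_]).
//   member(X, [_|T]) :- member(X, T).
// No loop, no index, no return statement; the runtime does the searching.
public class Member {
    public static boolean member(int x, int[] xs) {
        for (int v : xs) {       // walk the array ourselves
            if (v == x) {
                return true;     // found it, stop early
            }
        }
        return false;            // exhausted the array without a match
    }
}
```

Knowing the Java version fluently tells you almost nothing about how to think in the Prolog one, which is the author’s point.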

7. Exams measure understanding of programming

Teams of professional programmers spend months and years building intricate software systems in response to poorly understood, ill-defined, and changing problems. To accomplish this, they employ API documentation, online tutorials and forum discussions, team problem-solving sessions, reference books, and an infinite number of phone-a-friend lifelines. Exams test your ability to write simple code to solve a trivial, well-defined, static problem, without consulting any references. One is about resourcefulness, the other about memory. Exams test the wrong thing.

8. GUIs are not an important aspect of learning to code

At the university where I did my undergrad, it was easy to finish a B.Sc. in computer science without ever building a graphical interface. While I agree that many software projects do not have graphical components (e.g., developer APIs), to marginalize GUIs as some kind of specialty endeavor is short-bus crazy!

9. Programming Requires Calculus

I have been told that development involving sophisticated work with graphics and animation involves calculus. Outside of this particular subfield, however, I haven’t seen much calculus in software development. Certainly I’ve seen a lot more GUI development than graphics.

10. Linux will rapidly overtake Windows among consumers

Comp. Sci. profs have been saying this for years. Hasn’t happened. And it’s not going to happen until Ubuntu and company take the dicking around out of computing the way Apple has.

11. LaTeX will overtake WYSIWYG text editors because LaTeX gives you more control

Yes, believe it or not, a computer science prof said this during one of my classes in undergrad. It goes directly to a deeper misunderstanding among Comp. Sci. academics that power and control are the primary factors driving adoption. They’re not. Simplicity and ease of use are far more important.

12. You can buy gates at RadioShack

The same idiot who thought LaTeX was the future also told his class to go buy gates (the things transistors are made of) at RadioShack and play with them to see how they work. Again, this evidences how completely out of touch some of these people are. Gates are microscopic. You can’t go buy them at an electronics store.

Update (25MAR2011): As so many helpful readers have pointed out, 1) gates are made of transistors, not the other way around, and 2) you can now buy gates from Radio Shack online. However, the prof in question told me to go buy gates at a physical Radio Shack store in 2001, and they had no such thing. I don’t know what I was thinking when I wrote “the things transistors are made of.”


I have long argued that society needs a professional certification for software developers and that universities need undergraduate programs dedicated to training people for these certifications. It’s worked for accounting, engineering and medicine. There’s no reason it can’t work for software development. One of the primary barriers to this sort of progress is the raging incompetence of academics in computer science, computer engineering, management information systems and related disciplines.

Have one or a few to add? Comment away.


Comments


  1. Jon Cage says:

    I tend to agree with Brendan. Java’s not a bad first language (although Python would be better, IMHO). It teaches you, at a reasonably high level, about object-oriented programming without so many of the opportunities to shoot yourself in the foot that C++ provides.

    @Izkata: Why on earth would you want to waste time rewriting code which other people have already written for you? Do you implement string-handling routines each time you write any software, or do you use the STL? If CS degrees teach you one thing, let it be that it’s okay to make use of other people’s hard work to help you solve really interesting problems. So why waste time on the little things?

  2. Aeiluindae says:

    Odds are, they are not available at most Radio Shack stores any more. Disappointing, because Radio Shacks are easier to find than dedicated electronic component retailers.

  3. jkuehn says:

    You can still purchase a FET at Radio Shack and make a logic gate. The OP didn’t know what he was talking about. Yes, there are cheaper places. No, the prof’s advice didn’t make a lot of sense, because configuring a logic gate at the transistor level relies on electronics experience that most CS guys don’t have.

  4. Allen says:

    Mostly a reasonable position. I take exception to 8, 9, and 12, though….

    8) GUIs not important to learning to code: they are not important, just like parsing is not important to learning to code, and algorithms are not important to coding. Knowing how to properly write a novel is not important to learning how to write. Algorithms are important tools; GUIs are important tools; neither of them is coding. Coding and design are separate but interrelated. I can design algorithms without knowing anything about coding. Coding is the process of converting the algorithms into executable form.

    9) Programming needs calculus: animations/graphics are not the only fields that need calculus. Advanced financial analysis, data mining, engineering, physics, advanced chemistry, and protein folding all require calculus at some level, some more than others.

    12) Gates at Radio Shack: yes, you can buy gates; those are the 7400-series devices that have four NAND gates (or other things, like DFFs) in a 14-pin package, though they are getting harder to find in many Radio Shacks as they cater less to the hobbyist. You can even buy experimenter boards and some simple “toy” learning kits. You cannot actually see the gates, but you can play with them and learn about them.

  5. elhombre says:

    Skip anything you like son, the real world will soon kick that 13 year old emo-fag attitude out of you. As smart as you think you are, it counts for nothing. The world is full of failed geniuses. I strongly suspect that companies turn you down because you’re an insufferable, immature little drama queen. Don’t feel bad, though, you will mature with time.

  6. Bugong says:

    While I do agree with most of what you wrote, I have certain quibbles with some of it.

    1. I agree, Java is difficult to learn at first, but its biggest advantage compared to many other languages is that it has a very, very well-documented API. If you need anything, you can just look it up there. I cannot say the same for C. Also, most of the things you need are already provided, like parsing libraries or linked-list libraries, which aren’t provided with C.

    3. While writing on paper may seem stupid, it does help to organize your thoughts. You don’t have to write perfectly correct code; it helps you see the logic flow. I write pseudocode on paper so I can see the flow. Then, when I’ve got it down, I’ll write the proper code. This is also useful when you’re in a team. If other people see the way your code works from the pseudocode, it will be easy for them to read your code and improve on it. That’s why there are whiteboards everywhere in software engineering companies. =D It reduces cost, because when you’ve sorted the kinks out on paper first, you don’t have to redo everything if something goes wrong. =D

    4. Lectures are needed. They teach you the programming concepts that you need to know. If you need actual programming lessons, internet tutorials are better. But again, they won’t teach you new algorithms.

    5. There aren’t really new algorithms. “New” algorithms are actually mashups of different existing algorithms. Unfortunately, most of them involve problems you’ve never seen before, so it is unavoidable that you will have to learn an algorithm which solves a problem you think is stupid.

    6. You don’t need to pick it up immediately. Short Prolog courses are there to get you started. Also, if two different programming languages follow more or less the same paradigm, it’s easy to switch between them, like C and Java.

    9. Amazingly enough, I’ve seen a lot of math in programming. It’s just not as obvious as you’d think.

    All in all, I think you’re mixing up computer science and programming. Big difference. Programming’s the tool. Computer science is the knowledge needed to use it.

  7. Bill says:

    You sir, are a god amongst insects. Not only did you pinpoint the bullshit in any given CS degree, but you just inflicted mass-butthurt on every “CS Graduate” that can’t come to terms that their education was a giant vat of ass butter.

    Protip: Your field isn’t a gift from the heavens. It’s not unique. Hell, it’s a bastardized combination of many different fields, dependent on what your specialty is. You make big boxes that store various objects that hold electrical charges shift currents and voltages around. Occasionally you make the big shiny box that people sit in front of flash different colors. Wooooo. 90% of the critics that posted probably can’t wrap their heads around a game loop.

    And I swear to god, the next person that says that MIPS is a viable road map for learning assembly is going to get shanked.

  8. Chris says:

    I would remove #12 if I were you. First, gates do not make up transistors. It’s the other way around! Second, logic gates are indeed available at some Radio Shacks. You probably should do some research next time before accusing others of being idiots on a subject you don’t understand very well yourself. Out of touch indeed.

  9. Jack White says:

    Good post! Point by point:

    1. I totally agree. While I think there has to be a certain amount of jumping in at the deep end when you start programming, Java makes things difficult because of the sheer amount of structural definition you have to do to get something basic to work. In order to really understand your first programme, you have to understand the basics of object orientation. The need for OO might seem reasonable to a more seasoned programmer, but your first steps should be understanding very basic things like the idea of an instruction.

    That needs to be coupled with sending people off in a direction that will lead them to their end goal. So for engineering students, C is good, because there is little structural definition, but it’s easy to expand on the basics to include low-level hardware work (pointers, bit fields, and working with memory). For a CS student, LISP is better, because it encourages the programmer to think algorithmically and in terms of data representation. For others, something less powerful but simpler, like Pascal or maybe Python.

    Nobody should touch the likes of Java or C++ as their first language – they really are bastard love-children of a variety of languages, and their impurities and idiosyncrasies make them conceptually hellish.

    2. I don’t agree so much with this. Learning about assembly and how that relates to machine code were the first steps I took in the world of programming. Although I still think there are better starting points, any programmer will benefit immensely from the ability to see how their high-level code is an abstraction and that many high-level ideas are grounded in how data and code are represented at a low level.

    3. It’s a fact that many programmes evolve rather than being planned. But it’s equally true that there is a multitude of big-name software that suffers immensely from this approach. The trouble with teaching programming by paper design is that the kind of projects undergrads are typically asked to produce could be hacked together by any half-competent coder in a matter of hours. In these cases, it’s usually easier to do the hacking. But if you try to build a cast-iron email client, or (God forbid) a content management system in this way, you are so far up shit creek that you can see it welling up from underground.

    4. Yeah, programming lectures are a heap of shit. Ha ha. Heap.

    5. Algorithm design is not taught by reading algorithms, but it can be taught by *studying* algorithms and trying to understand the thought process behind the design of someone else’s algorithm.

    6. I don’t get this attitude either. I went to Glasgow University to do CS 10 years ago and any number of people said languages were essentially all the same. What a load of bollocks! I think this comes as a result of the two most popular working languages being C and C++, and then adding a load of C-like languages into the mix. There are massive benefits in using different languages in different circumstances, and frankly, if you use them the way they are meant to be used, then even the difference between C and C++ can be a real headfuck. If you start involving yourself with Haskell, for example, then you actually need to unlearn a lot of what you picked up while learning C.

    7. I agree.

    8. Ehhhhh… I don’t really agree. Development of GUIs is still a tiny, tiny part of the world’s software development – it’s just that they’re always on display. Glasgow did quite a good first year course on visual design. I just think though that GUI design is for GUI designers and coding is for coders. The way you programme a GUI once it has been designed (by a designer) is unlikely to be terribly irksome. While they do need to be got right, and they’re important because they’re at the human/computer interface, they totally *are* a small speciality.

    9. Programming doesn’t necessarily require calculus, but a lot of it does. You should just regard algorithms for implementing calculus the same way you would any other algorithm – something you should know about for knowledge’s sake and that can be brought out of the toolbox if need be.

    10. I used to be totally into Linux. But since those days I’ve used a PC for more and more – newspaper design in particular, general admin work and typing up reports for an insurance broker. With those applications, you really, really appreciate what the likes of Microsoft and Apple have done for the microcomputer. Ease of use and good presentation rule the roost for most users – not power or customisability. While they have a way to go before getting it right, I believe that both Canonical (who coordinate Ubuntu) and Google have cottoned on to this. Canonical in particular now sees Apple as leading the way in user interface design and is making a concerted effort to advance Linux as a consumer operating system. I think the real power of Linux is that its (many, many) developers have the ability to place this layer of extreme usability *on top* of their geeky engineering OS, whereas Microsoft and Apple are institutionally unable to go in the reverse direction. If Ubuntu *does* become as usable as Windows, then it will start to go places, particularly in SOHO and SMEs.

    11. I sometimes use LaTeX. It’s good in a way, but rubbish in so many others. It is so much easier to accomplish layout as good as LaTeX *can* do, with something like InDesign or QuarkXpress.

    12. Like everyone above said, yes, gates are small, but there’s nothing to stop someone just putting two of them on a consumer chip and flogging that to people like us. I used some in my first year of my degree.

  10. Ira says:

    1) Java is not a good language to learn to code in; however, it’s a great language to learn OOD in. Much better than C++. But learn to program in BASIC first. Why? Because it’s really simple. Then learn structured programming in Pascal. Then learn object-oriented programming in Java. Then learn C++, because then people won’t think you’re an idiot.

    2) If you’re a CSE or want to write your own OS or compiler, you should learn assembly. Otherwise, don’t waste your time.

    3) Don’t write code on paper. But do learn to write pseudocode, and learn flowcharts, and most importantly, LEARN HOW TO WRITE A DESIGN DOCUMENT. Far too few programmers know how to write decent documentation, and it’s what separates projects that fail from projects that don’t. For that matter, learn how to read requirements, too.

    4) Lectures are effective at teaching anything, unless you can’t learn through them, in which case they are useless no matter what the subject is. However, a good computer course should be 50/50 lectures and computer work. Believe it or not, computer science is a science. If nothing else, learning logic helps.

    5) Who designs their own algorithms? OTOH, it’s nice to be able to discuss the merits of different sorting algorithms with a new hire without getting that blank look.

    6) Well, some people can. But seriously, some curricula need to realize that the number of programming languages you learn isn’t nearly as important as which programming languages they are. Ruby on Rails will get you hired. Snobol will get you laughed at.

    7) I always loved exams, but they don’t measure anything except that you can think under pressure. Hmmm…. not a bad life skill. Programming is usually not a high pressure job. Usually.

    8) GUIs can be important or not, depending on the job. However, a new Comp Sci grad should know everything about GUIs, so they can be as clueless about them as they are about everything else.

    9) Everything requires calculus. Everyone should know calculus because, well… look, you are a science major, and every other science major learns it, even the biology people who become veterinarians. If I had my druthers, everyone who earns a bachelor’s degree would have taken Calc 1. Good thing I’m not a dean.

    10) Don’t look now: Ubuntu is gathering some momentum. Still not going to happen, though, because Microsoft wields too much economic might for Linux to go anywhere.

    11) Yeah, that’s just funny. I wrote my thesis in LaTeX and hated every minute of it. There’s a reason it’s not used outside of textbooks. But the formula editor in Word sucks beans.

    12) I buy gates at Home Depot and hang them myself. Oh, you mean the things on computer chips? Um. No opinion.

    Conclusion: Anyone who thinks you go to college to learn how to program needs to actually realize what going to college really teaches you. It teaches you how to think, how to study, how to take notes, how to do all that crap that high school didn’t teach you. You learn how to program by sitting at your computer at home writing code, and by sitting in an office writing code, and by sitting on a beach writing code, and by sitting in a board room being told how crappy your code is by someone who’s been writing code for 20 years, and by standing at the water cooler with the same guy, and by reading his code, and by reading the code of the guy who got fired for downloading too much porn on the work machine, and by debugging code from a vendor who didn’t know what they were doing because they all got Bachelors degrees in Computer Science.

  11. Matt says:

    Misconceptions by this author:

    Software development is the only thing you can do with a computer science degree.

    This article should be titled “12 Reasons Why Computer Science Professors Don’t Prepare Students for Software Development.”

    You are just complaining because computer science professors didn’t teach you how to write code. Computer science is way broader than just writing programs. Just because your professors didn’t prepare you for software development where you work doesn’t mean that they aren’t good, or aren’t doing the right thing. Most of what you said might be true for your job, but it in no way describes computer science in general. The major is computer science, not software engineering.

  12. TP says:

    Wow, you stepped on a lot of toes!
    Nice job!
    I’ve seen people learn to code so many different ways it’s not funny. Teachers are always comfortable with a book and an approved method; it freaks them out to learn some kid hacker can out-code them. But coding is so easily learned by way of the internet that any interested, halfway intelligent person can learn it.

  13. Peter Lind says:

    First off, I am a computer science instructor, so I might be more than a little biased.

    You touch on some interesting points, but are not quite right in all of them. Let me take your “misconceptions” one at a time.

    1. Java – you are absolutely right, Java is a complicated first language. Python or Ruby would be better for learning about variables and program flow. However, Java is much better than C or C++, and it has an immediate “real-world” feeling, since the students can actually see it being used and requested by the industry.

    2. Machine language – when we say that it is “basic”, or low-level, we mean from the computer’s point of view, not from the student’s. Machine language is the basic “workhorse” in the computer, whereas high-level languages have to be compiled or interpreted. High-level languages are, however, easier for humans to learn …

    3. Write code on paper – I’ve seen other instructors make this mistake too. The idea is that you write your program, not your code, on paper before sitting in front of the machine. Working with paper allows you to design your program, to think about the algorithm without all the nitty-gritty of the language you use. Experienced programmers can do it in their heads, or in comments, but students need to use paper. Unfortunately, some instructors seem to think that you need to write compilable code on paper, and that is indeed a bonehead misconception.

    4. Lectures – you are absolutely right, you can only learn programming by programming.

    5. Algorithm design – hmm, well, it IS a good idea to learn to read existing algorithms, and to understand how they solve their problems. But it certainly isn’t for beginners.

    6. Picking up other languages – I kind of agree. While it is informative to show students that they can learn a new language using the same principles they have already learned once, it always ends in confusion. In the long term you will pick this up by yourself.

    7. Exams – yeah, they are problematic, but we need some way to measure your individual understanding of the area. I would love to see some changes there, but I haven’t the faintest idea as to how it could be done better.

    8. GUIs are tools of the API – like working with filesystems, XML, databases, encryption, sound, threads, networking, etc. It is not part of learning to program, and it requires learning a specific API more than learning basic theory, and I guess that is why it is often left to specialty courses. I would like to spend more time on GUI programming, but it requires that you first understand basic programming, algorithm design, program flow, multithreading, and event-driven programming.

    9. Calculus – you are right, it doesn’t. Programming doesn’t require any specific math skills; it only requires that you are logical and systematic – traits often seen in higher mathematics, but the supposed connection between high-school math and programming is a nuisance that I would like to get rid of.

    10 and 11 sound like personal opinions, like whether Mac or Windows is best, and whether everyone should use Open Office or some other system. It is an endless debate, and no one is correct.

    12. You are mistaken – or your professor has taught you poorly. Gates are built from transistors, and you can buy integrated circuits with four NAND, XOR, AND, or OR gates in them, and play around with them at home. Boolean logic and how it is implemented in electronic circuits is a very important aspect of computer science, but maybe not so much of programming.

  14. Kavan Wolfe says:

    Thanks to everyone who left a comment. To fully understand the perspective of this article, one needs awareness of the paradox at the heart of computer science. Each year, western society requires hundreds of thousands of new software developers, and only a few thousand new computer scientists. Unfortunately, there are very few software engineering programs, so most students who want to become developers take computer science. However, the people who control the computer science programs are largely computer scientists, not software developers. This results in a fundamental mismatch between what society needs and what these programs deliver.

  15. Java isn’t really a good first programming language to teach. I mean, why do we have to put that public static void main in there, anyway?

  16. Justin says:

    Programmer here. As in passionate, obsessed, want-to-make-new-software-do-amazing-stuff programmer, not the lately seen stereotype of “oh, I heard it’s well paid, I want to be a programmer.” In any case, there is nothing I would love more than to see some proper IT teachers in universities, especially in the programming courses.

    Just as a small insider note: in my university (which shall remain unnamed), our VB.NET programming teacher is such a bloody genius he wrote, in nice big font on the projector, Dim Integer as Integer and couldn’t figure out what the problem was for about half an hour.

    But when I wrote a string-to-7-bit-octet text conversion function (the one used to convert a string to SMS PDU format), I got a 60 in my report because, and I quote, “you did not use arrays in your code”. Then what the bloody hell do you consider a string, when I refer to it with brackets (or parentheses, in the case of VB)…. IDIOT!

  17. Ash says:

    I agree with everything except point 6; I found that after I learned C++ I could pick up nearly any language quite easily.

    I know a kid studying CS now, and he is having issues with binary. I told him that he doesn’t really need to know that; just get past it and forget it, since it is irrelevant to modern programmers.

  18. Kavan Wolfe says:


    C++ to Java and other object-oriented or scripting languages isn’t so bad, but try Prolog and LISP.

  19. Aphotic says:

    I bought some AND, OR, and NOT gate chips at RadioShack last semester.

  20. Body Shapers says:

    Python should be the first language taught; it’s very easy to learn.

    As much as I try to get rid of Windows, I always go back. I love Linux, but I don’t know if it will ever be good for the regular user.

  21. Jake says:

    “The same idiot who thought LaTeX was the future also told his class to go buy gates (the things transistors are made of) at RadioShack and play with them to see how they work. Again, this evidences how completely out of touch some of these people are. Gates are microscopic. You can’t go buy them at an electronics store.”

    What!? I’m sorry, transistors are made of gates? lol. This guy obviously never took a real hardware course in his life. If you can’t buy them at the store, then what in the hell is the TI 7400 series!?

    He makes a couple of decent points, but they’re shrouded in bad grammar and personal bias. This guy sounds like some kind of code monkey who would be better suited for a technical school and not a real university.
