This is an older post I’ve been hanging on to, and since I have forgotten why I was hanging on to it, I no longer see any reason to keep it private.
This stems from a post Mark Guzdial did last year (August 2009):
There was quite an interesting discussion around this. For the original comments, check Mark’s blog. The comments are all public. I’ll quote a few bits here along with some of my responses:
Patrick Widener: Katrin, Steve Yegge from Google wrote up a great blog post once illustrating that very point: some of their interview candidates would take a problem that practically screamed to be solved via awk/sed/perl, and try to write 250 lines of C++ on a whiteboard in 45 minutes because it was the only hammer in their toolbox.
I had the same experience in a 3rd-year data architecture course. I had my students build a client-side search utility, which required them to generate a data file of search data that would be sent to the client. It had to be small, and the first step was to crawl around their website to build this data file. I told them they could use any language they wanted to. My sample solution (I always solved their assignments myself before they got them) was about 20 lines of SED and shell script. Almost ALL of them wrote their solution in C++ – over 2000 lines of code.
They didn’t believe me when I told them it would take only a few hours to learn enough SED & shell script to solve the problem. They didn’t believe it was possible to learn just a bit of a language, because none of their professors had ever told them so. Instead they spent many more hours than that writing programs in a language that was not well suited to the problem at hand.
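For concreteness, here is a minimal sketch of the kind of sed-and-shell solution I mean (the file names, the sample pages, and the tab-separated output format are my own assumptions; the details of the original assignment aren’t given here). It walks a directory of HTML pages standing in for the crawled site and extracts each page’s title into a compact search-data file:

```shell
set -eu
site=$(mktemp -d)

# Sample pages standing in for the crawled website (hypothetical content).
cat > "$site/index.html" <<'EOF'
<html><head><title>Home</title></head><body>Welcome</body></html>
EOF
cat > "$site/about.html" <<'EOF'
<html><head><title>About Us</title></head><body>History</body></html>
EOF

# Build the data file: pull each page's <title> out with a one-line sed,
# and emit one "title<TAB>filename" record per page.
for page in "$site"/*.html; do
  title=$(sed -n 's/.*<title>\(.*\)<\/title>.*/\1/p' "$page")
  printf '%s\t%s\n' "$title" "$(basename "$page")"
done > "$site/search-data.tsv"

cat "$site/search-data.tsv"
# → About Us	about.html
# → Home	index.html
```

The whole thing is a loop, a sed substitution, and a printf – which is the point: for text-extraction problems like this, a few lines of shell go a very long way.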
Fred Martin: When I was an undergrad, the MIT AI Lab was filled with Symbolics Lisp machines. So of course undergrads had to learn Scheme, if they were going to be useful in the research labs.
A big part of the problem here is that none of us (or few of us) actually *use* these languages in our research work. E.g., my two big build-and-study projects are based on web apps, and we use the obvious language/tech choices (Java, JS, PHP, MySQL).
Fred Martin: Mark you’d probably be an exception to this with your work with Squeak. But how many of us — other than language researchers — actually use these languages in our work? Robot people don’t program in Lisp anymore, etc. I’ve been teaching our PL course (based on Sussman/Abelson SICP) — it’s still offered and still required. I added a project module to the last part of the course, and I think it really helps make the ideas useful to students.
Still — if we’re not dog-fooding the stuff in our labs, classes can only go so far…
With a few exceptions, most of the faculty I knew at my former institution didn’t actually *use* ANY language – they have grad students for that. I seriously doubt whether many of them can even write a working program to do something useful any more.
What we teach, especially in the 1st & 2nd year of a 4-year program, should not be determined by what we personally use, nor by what is currently popular in industry. It should be determined by whether or not it helps to build a good foundation of knowledge & understanding.
Re dogfooding: All faculty should be able to teach ANY 1st or 2nd year course (given some prep time). If they can’t, they shouldn’t be there.
Re Ian’s lament: the lack of interest in our own history is absolutely true. We had Mike Williams in our department – probably *the* world expert in computer history – and no-one paid him much attention or gave him the respect he had earned while he was faculty. He has the very bestest stories. When I was a student there, those stories helped to give me a sense of place and of connection to my profession. The lack of interest in our past may also explain why most people in CS are so Ferengi today.
Fred Martin: I think I wasn’t very clear — what I meant was, there used to be more fac teaching cool languages and also using them in their research labs (even if it was the grad students doing the coding). Now even the fac who love and teach languages probably aren’t using them unless they’re also language researchers. So students don’t get to see many instances of real-world projects using modern languages (I’m including faculty research labs as real-world, vs. pedagogical content). I think that makes our task even harder — if we’re not using these languages ourselves in our projects, it’s harder to show they really matter.
You’re right. So, as faculty, we SHOULD be writing code in these languages. As Bjarne Stroustrup said, “You can’t teach what you don’t practice.”
Part of the reason we used to use cool languages was because they were interesting, and learning how to use them broadened our minds. They mattered because they had something to teach us, not because they were being used in industry or research.
When I was doing my undergrad, most of the languages I learned were not useful for ‘real’ work because we didn’t have the horsepower. So, alongside Fortran, PL/1, Simula, APL, and JCL (useful at the time), I also learned Snobol, Lisp, Algol68, Ada, Pascal, Basic, and many other not-useful (at the time) languages.
I think we often lose touch with the notion that a university degree should be different from a professional certificate. Sometimes the reasons for teaching something have nothing to do with whether students will be able to use it directly in their future jobs. Students absolutely have the right to ask, “Why am I doing this?” and “What is it good for?”, and faculty absolutely should have answers to these questions, but the answers DON’T always have to relate directly to the jobs these people might do when they graduate.
Example: I’ve never written a single line of assembler for anything “useful”, but I learned a lot about how the machine works from learning to program in assembler.
That’s the thing – I don’t think you do have the ‘wrong’ idea (I didn’t think that when I was teaching CS either). I think most schools of computer science have the wrong idea. It’s the CS departments that keep saying, “No, we don’t do that.” “And, we don’t do that either. Nor that. Let’s have more math, more theory, more algorithm analysis, more silly software engineering….”
(Thing is: I know some *real* software engineers – they don’t do the stuff you learn in school.)
Most of CS @ school is now a waste of good talent – no wonder the good talent ends up getting other degrees (or none at all).