Hacker News

As someone who has TAed a wide variety of undergrad (mostly systems / OS) classes and mentored new-grad hires and interns, my observation is that universities are under tremendous pressure to prepare graduates with "go-to-market" skills, and to deprioritize fundamental classes and concepts.

To put it concretely: I'm just about a decade out from when I finished my BS degree, and my CS courses were (in order):

1. A SICP-based intro course using Scheme. I loved this course!

2. A data structures course using Java

3. A machine structures / light hardware design course (just enough to understand pipelines, caching, etc.) in C and assembly

4. Now pick whatever CS area you want to study

I mentored a new grad from my uni who just graduated, so roughly 10 years younger than me. The curriculum had changed to:

1. Same SICP-ish course, but using Python

2. Data structures was cut shorter to make room for ML

3. A C++ course to build some kind of "distributed system" (but with no discussion of fundamentals like how the ABI works, for example)

I had to explain very basic things (e.g. what the stack is, basic gdb usage) to this guy, who was otherwise very bright.

That's not to say "the kids these days are all dumb!" I have TAed classes where some of the students were far better hackers / coders than I am. But the funnel toward skills and technologies with Proper Nouns that can go on a resume is an unfortunate pressure being placed on universities these days.



