Lately, I was browsing the blog of a math professor whose class I took at a community college back when I was in high school. Having gone the path I did in life, I looked into his credentials and found that he completed a computer science degree sometime in the 1970s. He had a curmudgeonly and standoffish personality, and his IT skills were nonexistent back when I took his class.

It’s fascinating to see the perspectives on computing and how many of the things I learned in my undergraduate degree were already being taught as far back as the 1950s. It also seems like the computer science degree was more intertwined with its electrical engineering fraternal twin.

Although the title of this post is inherently provocative, I’m curious to hear from those of you who did computer science, electrical engineering, or similar technical degrees in decades past. Are there topics or subjects that have been phased out over the years and that you think leave younger programmers/engineers ill-equipped today? What common practices were you happy to see thrown in the dumpster and kicked away forever?

The community also seems like it was significantly smaller and more interconnected back then. Was nepotism as prevalent in the technology industry then as it is today?

This is just the start of a discussion; please feel free to share your thoughts!

  • HarriPotero@lemmy.world · 10 months ago

    It feels like many positions today don’t deal with anything you couldn’t learn in a 6-month boot camp aimed at a particular stack.

    I did my computer engineering degree in the early 2000s, and we still had a lot of those early-day concepts: everything from digital electronics to processor and compiler design. Lots of focus on formal methods for proving the correctness of software. Plenty of programming paradigms. None of my professors had a degree in CS. There was no CS when they were studying. They all had math degrees and a love for logic and automata theory.

    I can’t say that I’ve actively used it outside of academia, but I think it has set me up to be a lifelong quick learner of everything happening in this fast-paced field. Most roles might be working with high-level languages today, but those roles wouldn’t exist unless capable people built the compilers, drivers, and hardware.

    The field needs people who will comb through specifications instead of searching Stack Overflow to figure things out. (I guess asking ChatGPT or Copilot is the new Stack Overflow.)

    I have a guilty pleasure in old things. The Computer Chronicles has all of its episodes on YouTube, and its analysis of the news of the ’80s has held up remarkably well. I’ve also been reading Hollingdale’s Electronic Computers. Computers are still just the von Neumann architecture, no matter how many abstraction layers we build on top of it.
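
    For anyone who hasn’t run into the term: “von Neumann architecture” really just means one memory holding both program and data, with a processor that fetches, decodes, and executes instructions from that memory in a loop. Here’s a minimal sketch in Python of that idea, using a made-up toy instruction set (LOAD/ADD/STORE/HALT are purely illustrative, not any real machine’s):

        # Toy von Neumann machine: instructions and data live in the same memory,
        # and a fetch-decode-execute loop walks through it.

        memory = [
            ("LOAD", 8),     # 0: acc <- memory[8]
            ("ADD", 9),      # 1: acc <- acc + memory[9]
            ("STORE", 10),   # 2: memory[10] <- acc
            ("HALT", None),  # 3: stop
            None, None, None, None,
            2,               # 8: first operand
            3,               # 9: second operand
            0,               # 10: result goes here
        ]

        acc = 0  # accumulator register
        pc = 0   # program counter

        while True:
            op, arg = memory[pc]  # fetch and decode the next instruction
            pc += 1
            if op == "LOAD":
                acc = memory[arg]
            elif op == "ADD":
                acc += memory[arg]
            elif op == "STORE":
                memory[arg] = acc
            elif op == "HALT":
                break

        print(memory[10])  # prints 5

    Every abstraction layer on top, from compilers to interpreters to container runtimes, ultimately boils down to filling that shared memory and letting a loop like this chew through it.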