• queermunist she/her
    ↑ 19 · ↓ 4 · 1 year ago

    “But we refuse to train those students after school or allow them to go back to college for free.”

  • Beej Jorgensen@lemmy.sdf.org
    ↑ 14 · 1 year ago

    I was taught obsolete things in college in the early 90s. But FORTRAN wasn’t the useful part of the class–problem-solving and broader language exposure was.

    People focus on random technologies that are being used in class as being obsolete, but that’s not the point of college. You can learn technologies on your own, and if you have trouble with that, maybe practicing it in college is a good idea.

    Basically we’re going to drill on technology-agnostic fundamentals for 4 years, and use a wide variety of technologies and languages as vehicles for that so you get a good breadth of experience.

    • ribboo@lemm.ee
      ↑ 2 · 1 year ago

      So much this.

      People want more “real world usage” in college and school overall. Teach kids how to do taxes, teach engineers how to use X and Y software.

      Well, in 10 years there’s new software that does your taxes in another way, plenty of laws have changed, and there’s new stuff to consider. And the software those engineers were taught? It’s obsolete.

      That’s why the focus should be on getting people to a place where they can acquire the skills needed to do those things themselves.

      • Chao-c'@f.cz
        ↑ 0 · ↓ 3 · 1 year ago

        @agressivelyPassive @beejjorgensen ehm, not really. Comprehensions [for a long time I didn’t even know they were called that] are light years ahead of any abstraction provided by Fortran, and unfortunately also by C (maybe not so much C++, which is a dangerous and versatile beast).

        The core concepts of Python are two or three generations newer than those of Fortran.
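
        For illustration, a minimal sketch of the gap being described here: the same filter-and-map step written as a Python comprehension and as the explicit accumulator loop you would port from Fortran or C (the names and values are made up for the example).

            # Squares of the even numbers, two ways.
            data = [3, 4, 7, 10, 15]

            # Comprehension: declarative, no index or accumulator bookkeeping.
            squares = [x * x for x in data if x % 2 == 0]

            # Explicit loop in the Fortran/C style: manual accumulator.
            squares_loop = []
            for x in data:
                if x % 2 == 0:
                    squares_loop.append(x * x)

            assert squares == squares_loop == [16, 100]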

        • AggressivelyPassive@feddit.de
          ↑ 3 · ↓ 1 · 1 year ago

          Comprehensions are just shorthand, nothing more. You can unroll them into loops at a purely syntactic level.

          But the fact that you think like that shows me that you don’t actually understand the core concepts behind languages.

          In the end, every language compiles to assembly, and all the fancy features are just different ways of encoding it.

          • Chao-c'@f.cz
            ↑ 0 · ↓ 2 · edited · 1 year ago

            @agressivelyPassive ok, comprehensions are just syntax, but they still allow producing new arrays directly from iterable objects, without the need to store anything in temporary arrays, which counts as added abstraction.

            Basically I agree that there are concepts which are simply not available in certain runtime libraries/interpreters, like multithreading or lazy evaluation. So I more or less agree that syntax is not so important, and that we should categorize the underlying abstractions made accessible by the syntax (or whatever).

            But at the very least, the memory management abstractions of Python are very different from those of Fortran or C (ok, you can use many different libraries for that in C, but you will hardly get reference counting and automatic cleanup of unreferenced objects and so on, and this is not just a syntax issue… it is an automation issue…)
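
            A minimal sketch of the two points above, assuming CPython (reference counting is a CPython implementation detail, and sys.getrefcount counts its own argument as one extra reference):

                import sys

                # No temporary array is materialized: the generator expression
                # feeds sum() one value at a time.
                total = sum(x * x for x in range(1_000_000))

                # Reference counting and automatic cleanup: the object is freed
                # as soon as the last reference to it disappears.
                obj = [1, 2, 3]
                alias = obj
                print(sys.getrefcount(obj))  # counts obj, alias, and the call's own argument
                del alias
                print(sys.getrefcount(obj))  # one lower now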

            • AggressivelyPassive@feddit.de
              ↑ 3 · 1 year ago

              And none of that has anything to do with understanding core concepts. Threads are native CPU constructs; they don’t exist just because of a library. Memory management is nice, but it’s also not arcane knowledge that can only be learned by going to a university.

              A C dev learning dependency injection and a Java dev learning manual memory management will both have to learn something new, but for neither of them should it fundamentally change how they think about computers.

              Again, you seem not to understand what’s actually going on under the hood. There is not a single language concept that a regular dev in another language couldn’t understand. It’s all just “make compiler write assembly so computer go brrr”. That doesn’t mean it’s trivial to be proficient in a new language, but that was never the goal of any higher education. It’s called computer science, not advanced button pressing.

        • Beej Jorgensen@lemmy.sdf.org
          ↑ 1 · 1 year ago

          Comprehensions are light years behind whatever’s coming next. 🙂 And I don’t think of them as “core”. They’re practically syntactic sugar. If you can write a comprehension but can’t write a loop with a conditional in Python or FORTRAN, you’re missing the core.

          • Chao-c'@f.cz
            ↑ 1 · 1 year ago

            @beejjorgensen I spent so much time writing manual loops, even in Basic, sometimes in ASM (on the Z-80), later in C, but actually the abstraction level of manually written loops is relatively low.

            Ok, a core concept in Python is iterable objects. Iterable objects are a much, much more advanced abstraction than a manual loop - and yes, I have spent many, many years writing manual loops again and again, later adding some macros, but still - in Python it is not only fewer keystrokes; the iterable-object abstraction is something that was absent in Fortran and C (maybe not in C++, but C++ was mostly pain).

            The syntactic sugar poured on iterable objects is maybe not so important, but in environments without certain core concepts, no amount of syntactic sugar will fix that.

            Think of how it was in Basic: it had no pointers (unless you wished to peek and poke memory manually). C had pointers and pointer arithmetic, which was a powerful abstraction compared to Basic. In Basic you would need to manually call the peek() function to read a pointer… well, technically possible, but you would read one byte at a time, with no clue about the data type, etc. A C pointer is not just syntactic sugar over peek(); it is much more than that.

            And there are more and more such powerful abstractions which are simply absent in older languages. You can, e.g., call try ~ except (or catch, or whatever) syntactic sugar - well, maybe it is, but it is sugar-coated setjmp()/longjmp() from libc, not sugar-coated goto, as it may seem at first glance…
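
            As a sketch of the iterable-object abstraction described above (the class name is made up for the example): any object implementing the iterator protocol plugs into for-loops, comprehensions, sum() and so on, with no index bookkeeping by the caller.

                class Countdown:
                    """Iterable that yields start, start-1, ..., 1."""
                    def __init__(self, start):
                        self.start = start

                    def __iter__(self):
                        n = self.start
                        while n > 0:
                            yield n
                            n -= 1

                for value in Countdown(3):
                    print(value)                      # 3, 2, 1

                print(sum(Countdown(3)))              # 6
                print([v * v for v in Countdown(3)])  # [9, 4, 1]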

  • ch00f@lemmy.world
    ↑ 12 · ↓ 1 · edited · 1 year ago

    I remember my microcontroller course professor telling us that if we just wanted to learn how to program assembly for microcontrollers, we could just pick up a book and skip the class.

    Instead, he intended to teach us problem solving with microcontrollers.

    The class was based around the Intel 8085 architecture, and this was in 2010. When I left the class, I started trying to make things using 8085s and assembly. These chips were so old, they needed external memory and flash storage to operate.

    Anyway, I eventually learned about the larger microcontroller world: writing C, 32-bit processors, real-time debugging, etc.

    Understanding the fundamental goings on of assembly has been helpful, but it was only ever a building block.

    • AggressivelyPassive@feddit.de
      ↑ 1 · ↓ 7 · 1 year ago

      That’s exactly not what is meant here.

      If “learning 8085 assembly” only prepared you to program 8085 assembly and do exclusively that, you missed the entire point of higher education. Being able to generalize knowledge and apply it to other fields and specialisations is what is being taught. Not just following a tutorial.

    • SheeEttin@lemmy.world
      ↑ 3 · 1 year ago

      I thought my data structures class was useful. A few others were interesting. But other than that, no, Java development was not useful to anyone’s daily life.

      • sylver_dragon@lemmy.world
        ↑ 2 · ↓ 1 · 1 year ago

        no, Java development was not useful to anyone’s daily life.

        You’ve never worked with the US Federal Government. For every software problem the Government has, there is a Java application written to make your life a living hell trying to solve that problem. It’s also even odds that said application requires a version of Java which is about a decade old and mysteriously breaks with anything newer.

        • SheeEttin@lemmy.world
          ↑ 1 · 1 year ago

          I think that was supposed to be my daily life. Not sure what happened between brain and fingers there. Java development was probably useful to some of my classmates.

  • pixxelkick@lemmy.world
    ↑ 8 · 1 year ago

    The important skills haven’t changed in a while.

    Version control still works the same overall.

    The concepts of CI/CD are still just as important.

    Understanding A/A/A (Arrange, Act, Assert) for unit testing is still the same (see the sketch below).

    All the useful patterns are just as useful.

    All the same antipatterns are just as important to watch out for.

    Largely speaking, while languages may evolve, the core foundational principles of how to write Good Clean Code remain the same.
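
    As a minimal sketch of the A/A/A (Arrange, Act, Assert) structure mentioned above, with a made-up function under test:

        import unittest

        def apply_discount(price, rate):
            """Hypothetical function under test."""
            return round(price * (1 - rate), 2)

        class ApplyDiscountTest(unittest.TestCase):
            def test_ten_percent_discount(self):
                # Arrange: set up inputs and expectations.
                price, rate = 100.0, 0.10
                # Act: exercise the unit under test.
                result = apply_discount(price, rate)
                # Assert: verify the observable outcome.
                self.assertEqual(result, 90.0)

        if __name__ == "__main__":
            unittest.main()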

    • XGM@lemmy.world
      ↑ 1 · 1 year ago

      Funnily enough, I retired a dozen NetWare servers in the past year, with the last one just a month ago. To say they were old and outdated would be an understatement.

  • DarthYoshiBoy@kbin.social
    ↑ 6 · 1 year ago

    Funny story time, intentionally vague to shield identities:

    I have a friend who was hired to teach a course at a local University for their new CS degree that had a focus on video games some while ago. He was a bit of an expert in a particular portion of the material that they needed, and when they started putting out feelers to find someone to teach the subject matter, everyone locally in the industry gave him the highest praise and said he was the man for the job. The University met with him and eventually selected him to teach, which he did for 3 semesters. After 3 semesters, they dropped him because he didn’t himself have a college degree in what he was teaching (which was something he made very clear in the hiring process.)

    He went into making games straight out of high school; he was basically there on the ground floor, self-taught, acknowledged by everyone in the industry locally as a foremost expert in the field where they had him teaching, and they couldn’t keep him because they couldn’t have him teach without a degree in the field. Without his having a degree, their program couldn’t be accredited. So… they wanted him to have a degree in a subject he was an originator of, and without that degree they had to drop him.

    He makes financial software now because the games industry was/is brutal and he wanted to see his family now and then. I’ve always found it hilarious that a University had to let him go because otherwise the snake wasn’t eating its own tail and the ouroboros apparently can’t have that.

  • IzzyData
    ↑ 8 · ↓ 2 · edited · 1 year ago

    If some piece of knowledge or skill becomes obsolete less than 4 years from its inception, then it was not important in the first place.

  • Zima@kbin.social
    ↑ 2 · 1 year ago

    I think the article should focus on how everyone, or at least most people, at work do keep up with the times. At least when I studied, my teachers understood this issue and focused on providing a good theoretical foundation to build on; the particular technologies are just examples of what’s available at the time you are being educated, not the actual focus of the education.

  • Natal@lemmy.world
    ↑ 2 · 1 year ago

    This applies to many fields. I studied translation at university and, kudos to the head teacher, he kept saying we worked on current software for illustration, but the point was to learn transferable skills to apply to whatever tools are trendy once we hit the market. Turns out I now work at a firm running software more outdated than what my uni used. But I always agreed with the dude: we’ll have to adapt or die as businesses.

  • AutoTL;DR@lemmings.world [bot]
    ↑ 1 · 1 year ago

    This is the best summary I could come up with:


    In an essay, Hyams shared his top concerns around AI — one of which is how technologies like OpenAI’s ChatGPT will affect the job market.

    “With AI, it’s conceivable that students might now find themselves learning skills in college that are obsolete by the time they graduate,” Hyams wrote in the essay.

    “The higher the likelihood that a job can be done remotely, the greater its potential exposure is to GenAI-driven change,” the researchers wrote, referring to generative artificial intelligence.

    The CEO’s thoughts on AI come as labor experts and white-collar workers alike become increasingly worried that powerful tools like ChatGPT may one day replace jobs.

    After all, employees across industries have been using ChatGPT to develop code, write real estate listings, and generate lesson plans.

    For instance, Hyams said that Indeed’s AI technology, which recommends opportunities to its site visitors, helps people get hired “every three seconds.”


    The original article contains 463 words, the summary contains 148 words. Saved 68%. I’m a bot and I’m open source!