• kamen@lemmy.world · 1 year ago

    Yeah, cool, except that the first time you encounter these (probably in high school) you’d be in the minority if you somehow already knew programming.

    Edit: and if you somehow already know programming, chances are you’ve encountered some math in the process.

    • beefcat@lemmy.world · 1 year ago

      I learned basic programming skills around the time I was taking algebra in middle school. This was in the '00s.

      For me, code was a lot easier to understand, and going forward I wrote programs that implemented the concepts I was learning in math classes in order to comprehend them better (and make my homework easier). I demonstrated enough aptitude that I was allowed to take two years of AP Computer Science in high school despite lacking the math prerequisites.

      I know a lot of programmers who think they are “bad at math” but really, they struggle with mathematical notation. I think a big reason for this disconnect is that mathematical notation prioritizes density, while modern programming languages and styles prioritize readability.

      These different priorities make sense, since math historically needed to be fast to write in a limited amount of space. Mathematicians use a lot of Greek letters and single-letter variable names. The learning curve and cognitive load associated with this notation are high, but once you’ve mastered it you can express a complex idea on a single chalkboard.

      In programming, we don’t need to fit everything on a chalkboard. Modern IDEs make wrangling verbose identifiers trivial. The programming languages themselves use plain English words rather than arcane Greek letters. This results in code that, when well written, can often be somewhat understood even by lay people.
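
      The contrast described above can be sketched with a standard formula. Population standard deviation is usually written densely as σ = √((1/N) Σᵢ (xᵢ − μ)²); the same computation in readable code (a Python sketch, with names chosen for illustration) spells every symbol out as a word:

      ```python
      import math

      def standard_deviation(values):
          """The dense notation sigma = sqrt((1/N) * sum((x_i - mu)^2)),
          spelled out with plain-English names instead of Greek letters."""
          mean = sum(values) / len(values)                      # mu
          squared_deviations = [(x - mean) ** 2 for x in values]  # (x_i - mu)^2
          return math.sqrt(sum(squared_deviations) / len(values))

      print(standard_deviation([2, 4, 4, 4, 5, 5, 7, 9]))  # prints 2.0
      ```

      The code is several times longer than the formula, which is exactly the trade-off: the formula optimizes for chalkboard space, the code for a reader who has never seen σ or Σ.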