• WalnutLum · 5 months ago

    I’ve seen this said multiple times, but I’m not sure where the idea that model training is inherently non-deterministic is coming from. I’ve trained a few very tiny models deterministically before…

    • umami_wasabi · 5 months ago

      You sure you can train a model deterministically, down to each bit? Like, feeding the resulting weights into sha256sum will yield the same hash every time?

      • WalnutLum · 5 months ago

        Yes, of course. There's nothing gestalt about model training: with fixed inputs (same data, same seeds, same settings), you get fixed outputs.
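The claim above can be illustrated with a toy, pure-Python training loop (all names here are illustrative, not from any framework): two runs with the same seed produce bit-identical weights and therefore identical SHA-256 hashes. On GPUs, real frameworks typically also need deterministic kernels enabled (e.g. PyTorch's `torch.use_deterministic_algorithms(True)`) for this to hold.

```python
import hashlib
import random
import struct

def train(seed: int = 0, steps: int = 200) -> float:
    """Fit a 1-parameter model w to data drawn from y = 2x by gradient descent."""
    rng = random.Random(seed)  # seeded PRNG: identical data every run
    data = [(x, 2.0 * x) for x in (rng.random() for _ in range(32))]
    w, lr = 0.0, 0.1
    for _ in range(steps):
        # Gradient of mean squared error: mean over 2*(w*x - y)*x
        grad = sum(2 * (w * x - y) * x for x, y in data) / len(data)
        w -= lr * grad
    return w

def weight_hash(w: float) -> str:
    # Serialize the IEEE-754 bits of the weight, then hash them.
    return hashlib.sha256(struct.pack("<d", w)).hexdigest()

h1 = weight_hash(train(seed=42))
h2 = weight_hash(train(seed=42))
print(h1 == h2)  # identical seeds -> bit-identical weights -> equal hashes
```

Because every operation here is a deterministic float computation on deterministic data, the two hashes match bit for bit; the nondeterminism people usually hit in practice comes from unseeded RNGs, nondeterministic GPU kernel scheduling, or multi-threaded reduction order, not from training itself.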