As AI capabilities advance on the kinds of complex medical scenarios doctors face daily, the technology remains controversial in medical communities.

  • pezhore · 1 year ago

    Don’t forget the inherent biases that are introduced with AI training! Women especially have a history of having their symptoms dismissed out of hand; if the LLM training data bakes in those biases on top of already-bad diagnoses, women could be really screwed (the sketch after this thread illustrates the mechanism).

    • inspxtr@lemmy.world · 1 year ago

      Similarly for people of different races/countries … it’s not only that their conditions might vary and require more data; it’s also that some communities don’t visit or trust hospitals enough to even have their data collected for the training set. Or they can’t afford to visit.

      Sometimes, people from more vulnerable communities (e.g. LGBT) might prefer not to have such data collected in the first place, making the data even sparser.
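
Below is a minimal sketch of the failure mode the commenters describe, using entirely synthetic data; the group labels, sample sizes, and "dismissal rate" are illustrative assumptions, not figures from the thread. The point is mechanical: if one group is under-represented in the training set and its positive cases are more often recorded as negative, a model fit to those records will miss more genuinely sick patients from that group.

```python
# Toy illustration (synthetic data, assumed numbers) of training-data bias:
# group B is under-represented and its true positives are more often recorded
# as negative ("dismissed"), so a model trained on the records under-diagnoses
# that group.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import recall_score

rng = np.random.default_rng(0)

def make_group(n, group_flag, dismissal_rate):
    """Synthetic patients: a symptom score drives the true condition, but a
    fraction of true positives are recorded as negative (symptoms dismissed)."""
    symptom = rng.normal(size=n)
    truth = (symptom + rng.normal(scale=0.5, size=n) > 0).astype(int)
    recorded = truth.copy()
    dismissed = (truth == 1) & (rng.random(n) < dismissal_rate)
    recorded[dismissed] = 0
    features = np.column_stack([symptom, np.full(n, group_flag)])
    return features, truth, recorded

# Group A: well represented, labels mostly trusted.
# Group B: ten times less data and a much higher dismissal rate (assumptions).
Xa, truth_a, rec_a = make_group(5000, group_flag=0, dismissal_rate=0.05)
Xb, truth_b, rec_b = make_group(500, group_flag=1, dismissal_rate=0.35)

# Train on the *recorded* labels, which is all a real system would have.
model = LogisticRegression().fit(np.vstack([Xa, Xb]),
                                 np.concatenate([rec_a, rec_b]))

# Evaluate against the true condition, per group: group B's recall comes out
# lower, i.e. the model misses more genuinely sick patients from that group.
print("recall, group A:", recall_score(truth_a, model.predict(Xa)))
print("recall, group B:", recall_score(truth_b, model.predict(Xb)))
```

Because the model only ever sees the recorded labels, this kind of bias is invisible to overall accuracy numbers; it only shows up when performance is broken out per group, which is exactly why the data-collection gaps raised in the thread matter.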