The US Department of Defense has used machine learning algorithms to identify targets in more than 85 air strikes in Iraq and Syria this year.

The Pentagon has done this sort of thing since at least 2017 when it launched Project Maven, which sought suppliers capable of developing object recognition software for footage captured by drones. Google pulled out of the project when its own employees revolted against using AI for warfare, but other tech firms have been happy to help out.

  • Patapon Enjoyer@lemmy.world
    9 months ago

    I hope they taught those things the difference between a military base and a hospital or wedding this time

    • IninewCrow@lemmy.ca
      9 months ago

I always describe the birth and development of AI as being like a trailer-park couple who never finished grade school, are highly religious, believe in ghosts and fairies, and have a new baby.

We’re terrible parents who probably shouldn’t have children, yet we have one that is growing fast, and by the time it is fully mature it will be far more powerful and capable than we are … but it will have the morals and ethics that its parents taught it.

    • FaceDeer@kbin.social
      9 months ago

      AI vision systems are already better than humans at distinguishing between a gun and a camera or other gun-like-but-not-a-gun object, so I for one am cautiously optimistic about this sort of thing. People need to bear in mind that humans aren’t the greatest things to be putting in charge of targeting decisions either.