SafeRent is a machine-learning black box for landlords: it scores prospective tenants numerically and returns a yes/no recommendation on whether to rent to them.

In May 2022, Massachusetts housing voucher recipients and the Community Action Agency of Somerville sued the company, claiming SafeRent gave Black and Hispanic rental applicants with housing vouchers disproportionately lower scores.

The tenants had no visibility into how the algorithm scored them, and appeals were rejected on the grounds that the decision was simply what the computer said.
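For readers unfamiliar with these tools, here is a minimal, entirely hypothetical Python sketch (the feature names, weights, and threshold are invented for illustration and are not SafeRent’s actual model or API) of how a weighted score collapsed into a single yes/no leaves an applicant nothing concrete to appeal:

# Hypothetical illustration only -- not SafeRent's model or API.
def screen_applicant(features, weights, threshold=0.5):
    """Return an opaque score and a binary recommendation."""
    score = sum(weights.get(k, 0.0) * v for k, v in features.items())
    return score, ("accept" if score >= threshold else "deny")

# The landlord sees only the decision; the applicant cannot inspect which
# inputs (e.g. a housing-voucher flag) actually drove the score down.
_, decision = screen_applicant(
    {"credit": 0.6, "eviction_history": 0.0, "uses_voucher": 1.0},
    {"credit": 0.8, "eviction_history": -1.0, "uses_voucher": -0.4},
)
print(decision)  # "deny"

Even with full access to such a model, explaining a single denial takes work; without access, as the tenants describe, a denial is effectively unexplainable.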

  • drdiddlybadger@pawb.social · 1 month ago

    The landlords who used the service should also be held liable. You mean to tell me you get a report with a binary answer and you just trust it with no due diligence? If there is no penalty for blindly trusting an algorithm, they will just move on to the next tool they can use to be bigots.

  • cheese_greater@lemmy.world · 1 month ago (edited)

    If there are suicides linked to wronged applicants, they should be charged with at least “involuntary” manslaughter

        • otacon239@lemmy.world · 1 month ago

          The fact that I’ve never heard of the corporate death penalty until now, but they’re bringing back the actual death penalty says everything.

        • ChickenLadyLovesLife@lemmy.world · 1 month ago

          The penalty is usually a fine, which impacts stockholders by making the stock less valuable.

          Of course they can always compensate for this by firing a bunch of people.

      • IcyToes@sh.itjust.works · 1 month ago

        Well, stockholders don’t have executive capabilities. The CEO is responsible. The board could be held responsible too if they knew.

      • Squizzy@lemmy.world · 1 month ago

        In order to be a director of a business, you have to assume legal responsibility for the organisation. You need more than one director, and ignorance is not an excuse; there are expectations of awareness and involvement for anyone legally in a director role.

    • scratchee@feddit.uk · 1 month ago

      At least they were banned from using AI screening for 5 years.

      I’d hope breaking a court order would result in the kind of punishments they would actually fear.

  • AlecSadler@sh.itjust.works · 1 month ago

    Crappy-ass fine and simply a “cost of doing business” for them I bet. Damages have been done for which there is no undoing. Deplorable.

  • meyotch@slrpnk.net · 1 month ago

    SafeRent was a giant piece of shit before “AI”. I tried to rent a place 15 years ago that used them. The report returned several serious felonies committed over years by another person with an only vaguely similar name who lived in a state I had never even visited.

    The leasing office people admitted that the report was transparently bogus, but they still had orders to deny all housing to negative reports.

    My only recourse at the time was to lock my record so they couldn’t issue reports in my name at all. I now ask right up front which screening company a landlord uses, and they get a vigorous ‘nope’ if it’s SafeRent.

    Fsck SafeRent!

  • Fizz@lemmy.nz · 1 month ago

    Just do your job, you lazy cunts. Stop trying to get AI to do everything. Real estate agents should be checking this stuff; it’s part of the role.

    • nandeEbisu@lemmy.world · 1 month ago

      But AI is so useful for laundering racism, sexism, and IP theft with plausible deniability.

        • AngryCommieKender@lemmy.world · 1 month ago

          I’m not certain if we have letting agents in the US. I certainly have never used one, and I’ve rented in 30+ states. I’ve lived in 49/50, but that’s counting while living with my parents, so I’m not going through the effort to figure out the exact number.

          I would agree with your humorous throwaway, but that is actually the reality of the US unfortunately.

          • Fizz@lemmy.nz · 30 days ago

            And my comment is clearly about whichever job manages the task that’s being replaced by AI. What does it matter if you think the job title is wrong?