• ᗪᗩᗰᑎ · 3 years ago

    They’re hiding the function (rules) that will trigger a captcha response in the client if they get enough reports that it’s a spammer, after which the client will be unable to continue to send messages until the captcha is solved. That’s it. The reason you can’t check how they’re doing it is because the spammers would just read it as instructions on how to avoid getting caught.
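    To make the mechanism concrete, here is a deliberately naive Python sketch of a client-side report/captcha gate. Every name and threshold below is invented; Signal’s actual rules are unpublished by design, which is the whole point of the comment above.

```python
# Invented sketch: gate outgoing messages behind a captcha once enough
# spam reports accumulate. The real rules and thresholds are secret.

REPORT_THRESHOLD = 5  # invented value


class SpamGate:
    def __init__(self, threshold: int = REPORT_THRESHOLD):
        self.threshold = threshold
        self.reports = 0
        self.captcha_pending = False

    def record_report(self) -> None:
        """The server relays another spam report against this account."""
        self.reports += 1
        if self.reports >= self.threshold:
            # Client refuses to send until the captcha is solved.
            self.captcha_pending = True

    def can_send(self) -> bool:
        return not self.captcha_pending

    def solve_captcha(self) -> None:
        self.captcha_pending = False
        self.reports = 0


gate = SpamGate()
for _ in range(REPORT_THRESHOLD):
    gate.record_report()
assert not gate.can_send()  # gated after enough reports
gate.solve_captcha()
assert gate.can_send()      # sending resumes once the captcha is solved
```

    Publishing even this much (the threshold, the reset rule) would tell a spammer exactly how many reports they can absorb before rotating accounts, which is why the rules are kept out of the open client code.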

    Communication/messaging, everything, is still E2EE. Nobody is getting anything out of this. If the FBI asks them for user data, they will be unable to share anything. They don’t need to warn users because they don’t keep any data anyway - as can be seen from the multiple subpoenas they’ve fought to make public, which continue to show they have no useful info to provide.

    • DessalinesA · 3 years ago

      unable to share anything with them

      Except phone numbers, dates / times, contacts… pretty much everything except message content.

      • ᗪᗩᗰᑎ · 3 years ago

        This is incorrect.

        They store:

        • Your phone number.
        • The date you first registered.
        • The last day (not the time) a client pinged their servers.

        Signal’s access to your contacts lets the client (not them):

        determine whether the contacts in their address book are Signal users without revealing the contacts in their address book to the Signal service [0].

        They’ve been developing/improving contact discovery since at least 2014 [1], I’d wager they know a thing or two about how to do it in a secure and scalable way. If you disagree or have evidence that proves otherwise, I’d love to be enlightened. The code is open [2], anyone is free to test it and publish their findings.

        [0] https://signal.org/blog/private-contact-discovery/

        [1] https://signal.org/blog/contact-discovery/

        [2] https://github.com/signalapp/ContactDiscoveryService/
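        For illustration only, here is the naive hash-and-intersect version of contact discovery in Python. Note this is not what Signal ships: phone numbers have so little entropy that plain hashes are trivially brute-forced, which is exactly the weakness the SGX-based private contact discovery described in [0] was built to close. All numbers below are invented.

```python
import hashlib


def h(number: str) -> str:
    # Truncated SHA-256 of a phone number. With only ~10^10 plausible
    # numbers this is trivially brute-forceable - the very weakness
    # Signal's private contact discovery was designed to eliminate.
    return hashlib.sha256(number.encode()).hexdigest()[:16]


# Server side: hashes of registered users (numbers invented).
registered = {h(n) for n in ["+15551230001", "+15551230002"]}

# Client side: hash the local address book and intersect, so the
# client (not the service) learns which contacts are users.
address_book = ["+15551230002", "+15559999999"]
matches = [n for n in address_book if h(n) in registered]
assert matches == ["+15551230002"]
```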

          • ᗪᗩᗰᑎ · 3 years ago

            In security, you can’t assume that the server isn’t storing a piece of data just because the operator says it isn’t

            100% agree with you about being unable to confirm what the server is doing, but the fact of the matter is anyone you interact with - centralized server-client or decentralized peer-to-peer - can store some metadata.

            The FBI could force Moxie to hand it over, and may have already done so without us knowing

            Private contact discovery is engineered in a way that you would be unable to retrieve what is being processed even if you had access to Signal’s infrastructure or admin/root rights. If you don’t believe this is true, please point out where the weakness in their code is, it’s open for review and for anyone to point out its flaws.

            Lastly, the FBI cannot compel anyone - individuals or companies - to work on anything without compensation. That is considered forced labor, which is illegal in the United States, where Signal resides. The FBI attempted to force Apple to develop software to compromise the security of iOS, but dropped the case, likely because they knew they would fail - although they claim they found the software they needed elsewhere [0].

            So the FBI can ask Signal for assistance, but that’s it. Signal must comply with the law so they always provide the info they do have - which is the data I previously pointed out - but they do not have to build any such system that would compromise the security of their service as it would fall under forced labor; i.e. developing software against their will.

            [0] https://www.beencrypted.com/news/apple-vs-fbi-events-summary/

      • ᗪᗩᗰᑎ · 3 years ago

        A simple system like that is easy to implement. I don’t think anyone’s questioning that they could build the worst attempt at an anti-spam system, like the one you’re suggesting. The types of spam you see on modern systems need a bit more thought than “block if reported more than X times within some window”, because you could easily target people and disable them remotely by coordinating reports.

        So yeah, it’s not magic if you want a dumb system that may introduce other problems, but you really have to think about things sometimes if you want it to work well in the long run.
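        A toy example of why the naive rule fails (all names, weights, and thresholds invented): a flat report threshold lets a handful of coordinated throwaway accounts silence a victim, while even crude reputation weighting raises the bar considerably.

```python
from collections import defaultdict

GATE_THRESHOLD = 5.0  # invented value


def gated(reporters, weight) -> bool:
    """Gate the account when the weighted sum of reports crosses the threshold."""
    return sum(weight(r) for r in reporters) >= GATE_THRESHOLD


colluders = [f"fresh_account_{i}" for i in range(5)]

# Naive rule: every report counts equally, so five coordinated throwaway
# accounts are enough to silence an innocent target.
assert gated(colluders, lambda r: 1.0)

# Crude mitigation: weight reports by reporter reputation (weights invented),
# so freshly created accounts count far less toward the threshold.
reputation = defaultdict(lambda: 0.2)  # new accounts get low weight
reputation["longtime_user"] = 1.0
assert not gated(colluders, lambda r: reputation[r])
```

        Even this mitigation has its own failure modes (attackers can age accounts), which is the “long run” thinking the comment above is pointing at.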

          • ᗪᗩᗰᑎ · 3 years ago

            I never get any spam on my chats

            I’ve never crashed my car, should everyone get rid of their car’s seat belts?

            Your experience does not represent the world. I’ve experienced only two cases of spam on Signal, both within the last year; before that, I’d had zero spam in the many years I’ve been using it. So, while my anecdote is just as invalid as your single data point, there’s definitely a trend of increased spam as a service gains popularity, and it makes sense that they’re looking at enhanced methods to block spammers.

            I still don’t see why they want a super secure smart system to block with captcha

            You don’t understand why Signal, one of the most secure messaging platforms available, wants a super secure smart system to block spammers? I think you answered your own question.

            Telegram for example you can add your own bot to kick the bot users. If you get a direct message you can just block and report

            Telegram stores all your data and can view everything you do - unless you opt into their inferior E2EE chat solution known as “Secret Chats” - so it’s easier for them to moderate their services. When you report someone, Telegram moderators see your messages for review [0] and can limit an account’s capabilities. Signal can’t view your messages because everything is E2EE, nobody but the intended recipient can view your messages, they can’t review anything.

            As you can see, without even digging into it too much, I’ve already found one case where Signal faces challenges not present in Telegram. Things aren’t always as simple as they seem - especially for Signal, as they’ve worked their asses off to ensure they have as little data on their users as possible.

            [0] https://www.telegram.org/faq_spam#q-what-happened-to-my-account

              • ᗪᗩᗰᑎ · 3 years ago

                Briar is probably more secure and it’s not the only secure app to chat in this world, Signal isn’t the MOST SECURED one xD.

                A communication platform is only as good as its feature set, ease of use, and accessibility. I’m not going to ask my grandma to install Briar - hell, half my friends and family with iPhones can’t even install it, there’s no app for it. I would consider my PGP-signed/encrypted text files delivered via carrier pigeon even more secure than Briar, but who would I even talk to? Maybe Briar will be a great alternative in the future, but it has a lot of ground to cover. Also, Signal is fully E2EE - that’s what I want, that’s what I care about right now. I’m keeping an eye on Briar, but I’m not asking anyone to install it yet.

                Just block and done.

                You’re simplifying a problem in a domain you seem to have zero experience with. I will just leave it at that, as the examples in my previous reply didn’t seem to click.

                if FBI asks for a backdoor you are forced to make it BY LAW and you can’t even tell this to anyone BY LAW

                This is a lie.

                Forced labor in the US is illegal. The FBI cannot force you or an organization to work without compensation. As such, the FBI cannot compel software developers to work (modify their code to make it less secure) without breaking the law.

                The All Writs Act forces companies to assist in investigations by providing data they already have (which Signal gladly does [1]), but it does not grant the ability to force someone to work - which is what software development is, and is what would be required to backdoor their own systems.

                [0] https://www.beencrypted.com/news/apple-vs-fbi-events-summary/

                [1] Reminder that Signal only collects: 1) your phone number, 2) the date you signed up, and 3) the last day your client pinged their servers.