• iortega@lemmy.eus · 3 years ago

    Is this really a “proof”? I can’t say I’m a Signal hater, but I’m not a lover either. I’m just not sure that Signal itself explaining why Signal is privacy friendly is enough to consider their service and products privacy friendly. That might just be my opinion, though; I’m too used to companies providing equivalent arguments.

      • gmate8 (OP) · edited · 3 years ago

        But you know they can’t say no to the authorities. If they are told to start logging user activity from an account, they have to; otherwise they would become criminals themselves. And this was really an extreme case. They decline the requests they can. But there are situations where even Swiss law can’t protect you. If you are really concerned, you should try self-hosting.

      • RushKitty · 3 years ago

        Can’t agree with you more. When they ask for your phone number, it means no privacy.

      • pancake · 3 years ago

        I’m a little out of the loop. What’s going on with Telegram?

        • weex · 3 years ago

          My understanding is that the code in question was limited to anti-spam. There’s going to be some level of trust involved in using a centralized service, so I don’t see it as such a huge issue, even as someone who prefers decentralized and FLOSS software as much as possible.

        • ᗪᗩᗰᑎ · edited · 3 years ago

          They’re hiding the function (the rules) that triggers a captcha challenge in the client when an account gets enough spam reports; after that, the client can’t keep sending messages until the captcha is solved. That’s it. The reason you can’t inspect how they’re doing it is that spammers would just read it as instructions on how to avoid getting caught.
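
          To make the idea concrete, here’s a minimal sketch of a report-threshold captcha trigger of the kind described above. It’s purely illustrative: the names and the threshold are made up, since Signal’s actual rules are deliberately unpublished.

          ```python
          # Hypothetical sketch only; Signal's real anti-spam rules are not public.
          from collections import defaultdict

          REPORT_THRESHOLD = 5                # assumed value, not Signal's

          spam_reports = defaultdict(int)     # sender_id -> number of spam reports
          challenged = set()                  # senders currently required to solve a captcha

          def report_spam(sender_id: str) -> None:
              """Record a spam report; challenge the sender once the threshold is hit."""
              spam_reports[sender_id] += 1
              if spam_reports[sender_id] >= REPORT_THRESHOLD:
                  challenged.add(sender_id)

          def can_send(sender_id: str, captcha_solved: bool) -> bool:
              """Block further sends until a challenged sender solves the captcha."""
              if captcha_solved:
                  challenged.discard(sender_id)
                  spam_reports[sender_id] = 0
              return sender_id not in challenged
          ```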

          Communication, messaging, everything is still E2EE. Nobody is getting anything out of this. If the FBI asks them for user data, they will be unable to share anything with them. They don’t need to warn users because they don’t keep any data anyway, as can be seen from the multiple subpoenas they’ve fought to make public, which continue to turn up no useful info.

          • DessalinesA · 3 years ago

            unable to share anything with them

            Except phone numbers, dates / times, contacts… pretty much everything except message content.

            • ᗪᗩᗰᑎ · 3 years ago

              This is incorrect.

              They store:

              • Your phone number.
              • The date you first registered.
              • The last day (not the time) a client pinged their servers.

              Signal’s access to your contacts lets the client (not them):

              determine whether the contacts in their address book are Signal users without revealing the contacts in their address book to the Signal service [0].

              They’ve been developing and improving contact discovery since at least 2014 [1], so I’d wager they know a thing or two about how to do it in a secure and scalable way. If you disagree or have evidence that proves otherwise, I’d love to be enlightened. The code is open [2]; anyone is free to test it and publish their findings.

              [0] https://signal.org/blog/private-contact-discovery/

              [1] https://signal.org/blog/contact-discovery/

              [2] https://github.com/signalapp/ContactDiscoveryService/
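
              As a rough illustration of the flow those posts describe, here is a hypothetical, heavily simplified sketch. The real Contact Discovery Service runs inside an SGX enclave with remote attestation and oblivious memory access, none of which is modeled here; the names and numbers below are made up.

              ```python
              import hashlib

              def hash_number(phone: str) -> bytes:
                  # Hashing alone is NOT enough for privacy (the phone-number space is
                  # small enough to brute-force); Signal relies on the attested enclave,
                  # not on hashing, to keep the address book secret.
                  return hashlib.sha256(phone.encode()).digest()

              def enclave_intersect(client_hashes: set, registered: set) -> set:
                  """Conceptually runs inside the enclave: return the intersection
                  without persisting or exposing the client's input to the operator."""
                  return client_hashes & registered

              # Client side: learn which contacts are Signal users without the
              # operator ever seeing the plaintext address book.
              address_book = {"+15550000001", "+15550000002"}   # made-up numbers
              registered = {hash_number("+15550000001")}        # enclave's view of registered users
              matches = enclave_intersect({hash_number(n) for n in address_book}, registered)
              ```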

                • ᗪᗩᗰᑎ · 3 years ago

                  In security, you can’t assume that the server isn’t storing a piece of data just because the operator says it isn’t

                  100% agree with you about being unable to confirm what the server is doing, but the fact of the matter is that anyone you interact with - centralized server-client or decentralized peer-to-peer - can store some metadata.

                  The FBI could force Moxie to hand it over, and may have already done so without us knowing

                  Private contact discovery is engineered in a way that you would be unable to retrieve what is being processed even if you had access to Signal’s infrastructure or admin/root rights. If you don’t believe this is true, please point out where the weakness in their code is; it’s open for anyone to review and to call out its flaws.

                  Lastly, the FBI cannot compel anyone - individuals or companies - to work on anything without compensation. That is considered forced labor, which is highly illegal in the United States, where Signal resides. The FBI attempted to force Apple to develop software to compromise the security of iOS, but they dropped the case, likely because they knew they would fail, although they claim they found the software they needed elsewhere [0].

                  So the FBI can ask Signal for assistance, but that’s it. Signal must comply with the law, so they always provide the info they do have - which is the data I previously pointed out - but they do not have to build any system that would compromise the security of their service, as that would fall under forced labor, i.e. developing software against their will.

                  [0] https://www.beencrypted.com/news/apple-vs-fbi-events-summary/

            • ᗪᗩᗰᑎ · 3 years ago

              A simple system like that is easy to implement. I don’t think anyone’s questioning that they can build the worst attempt at an anti-spam system, like the one you’re suggesting. The kinds of spam you see on modern systems need a bit more thought than “block if reported more than X times within some window,” because you could easily target people and disable them remotely by coordinating attacks (see the sketch below).

              So yeah, it’s not magic if you’re happy with a dumb system that may introduce other problems, but you really have to think things through if you want it to work well in the long run.
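
              As a toy illustration of the coordination problem (hypothetical names and thresholds, not anything Signal actually uses):

              ```python
              NAIVE_THRESHOLD = 5   # assumed value

              def naive_should_challenge(reports: list) -> bool:
                  # "Block if reported more than X times": any coordinated group of X
                  # accounts can get an innocent target challenged.
                  return len(reports) >= NAIVE_THRESHOLD

              def stricter_should_challenge(reports: list) -> bool:
                  # One of many signals a real anti-abuse system might combine: only
                  # count reports from users the sender actually messaged first.
                  valid = [r for r in reports if r["sender_messaged_reporter_first"]]
                  return len(valid) >= NAIVE_THRESHOLD

              coordinated_attack = [{"sender_messaged_reporter_first": False}] * 10
              print(naive_should_challenge(coordinated_attack))     # True  -> innocent target locked behind a captcha
              print(stricter_should_challenge(coordinated_attack))  # False -> the coordinated attack is filtered out
              ```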

                • ᗪᗩᗰᑎ · 3 years ago

                  I never get any spam on my chats

                  I’ve never crashed my car; should everyone get rid of their car’s seat belts?

                  Your experience does not represent the world. I’ve only experienced two cases of spam on Signal, and both were within the last year; I’d had zero spam in all the years I’d been using Signal before that. So, while my anecdote is just as invalid as your single point of data, there’s definitely a trend of increased spam as a service gains popularity, and it makes sense that they’re looking at enhanced methods to block spammers.

                  I still don’t see why they want a super secure smart system to block with captcha

                  You don’t understand why Signal, one of the most secure messaging platforms available, wants a super secure smart system to block spammers? I think you answered your own question.

                  Telegram for example you can add your own bot to kick the bot users. If you get a direct message you can just block and report

                  Telegram stores all your data and can view everything you do - unless you opt into their inferior E2EE chat solution known as “Secret Chats” - so it’s easier for them to moderate their services. When you report someone, Telegram moderators see your messages for review [0] and can limit an account’s capabilities. Signal can’t do that: because everything is E2EE, nobody but the intended recipient can view your messages, so there is nothing for Signal to review.

                  As you can see, without even digging into it too much, I’ve already found one case where Signal faces challenges that aren’t present in Telegram. Things aren’t always as simple as they seem, especially not for Signal, as they’ve worked their asses off to ensure they hold as little data on their users as possible.

                  [0] https://www.telegram.org/faq_spam#q-what-happened-to-my-account

      • iortega@lemmy.eus · 3 years ago

        I think I did. I might not have understood it, though. However, is that response enough? Shouldn’t Signal have some kind of audit by the authorities to confirm that what they responded is true?