As evidence, the lawsuit cites unnamed “courageous whistleblowers” who allege that WhatsApp and Meta employees can request to view a user’s messages through a simple process, thus bypassing the app’s end-to-end encryption. “A worker need only send a ‘task’ (i.e., request via Meta’s internal system) to a Meta engineer with an explanation that they need access to WhatsApp messages for their job,” the lawsuit claims. “The Meta engineering team will then grant access – often without any scrutiny at all – and the worker’s workstation will then have a new window or widget available that can pull up any WhatsApp user’s messages based on the user’s User ID number, which is unique to a user but identical across all Meta products.”

“Once the Meta worker has this access, they can read users’ messages by opening the widget; no separate decryption step is required,” the 51-page complaint adds. “The WhatsApp messages appear in widgets commingled with widgets containing messages from unencrypted sources. Messages appear almost as soon as they are communicated – essentially, in real-time. Moreover, access is unlimited in temporal scope, with Meta workers able to access messages from the time users first activated their accounts, including those messages users believe they have deleted.” The lawsuit does not provide any technical details to back up the rather sensational claims.

  • socsa@piefed.social · 12 days ago · +62/-3

    It is end-to-end encrypted, but they can just pull the decrypted message from the app. This has been assumed for years, ever since they said they could parse messages for advertising purposes.

    • Hotzilla@sopuli.xyz · 12 days ago · +5

      Hasn’t it always been the case that they can decrypt the backups that you personally set up in WhatsApp? That way they aren’t legally lying to you when the app tells you “this chat is encrypted, even WhatsApp cannot read the messages”.

      • socsa@piefed.social · 11 days ago · +3

        Yes, any time you can store and recover encrypted cloud archives across devices without needing to transfer keys between devices, it implies that there is a key archive somewhere in the cloud. Even Signal struggles to make this both user-friendly and properly secure without compromising forward secrecy. I believe they still make you do an explicit local key transfer to populate a new device, even though they have cloud archives now. WhatsApp doesn’t do that. And the app also clearly leaks some amount of unencrypted data anyway, archives or not.
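
        A rough way to picture the difference (a minimal Python sketch using the cryptography package; this is not Signal’s or WhatsApp’s actual backup design, and the passphrase and file contents are made up): the archive stays private only if the backup key is derived from something the user alone holds.

```python
# Toy model of an encrypted cloud backup. If the key is derived client-side from a
# user-held secret, the provider stores ciphertext it cannot read; if the provider
# also keeps the key (or derives it from data it holds), it can open the archive.
import base64, os
from cryptography.hazmat.primitives.hashes import SHA256
from cryptography.hazmat.primitives.kdf.pbkdf2 import PBKDF2HMAC
from cryptography.fernet import Fernet

def backup_key_from_passphrase(passphrase: bytes, salt: bytes) -> bytes:
    """Derive the backup key on the client from a secret only the user knows."""
    kdf = PBKDF2HMAC(algorithm=SHA256(), length=32, salt=salt, iterations=600_000)
    return base64.urlsafe_b64encode(kdf.derive(passphrase))

salt = os.urandom(16)
key = backup_key_from_passphrase(b"correct horse battery staple", salt)
ciphertext = Fernet(key).encrypt(b"chat history ...")  # this is what gets uploaded

# The server only ever needs `ciphertext` and `salt`. The moment the provider also
# keeps `key` (or the passphrase) for convenient restores, the archive is readable to it.
```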

    • Pup Biru@aussie.zone · 12 days ago · +9

      it’s not even that: they just hold the keys, so they can simply decrypt your messages without your client’s intervention any time they like

    • FactualPerson@lemmy.world · 10 days ago · +1

      Surely they have access, otherwise how do they moderate and investigate account blocks, reports of spam, etc.? Accounts get suspended, then some automation reviews it, then it escalates to a human, who has to make a judgement based on some policy. How can they do that if they see nothing? (I’m asking, not condoning.)

  • arc99@lemmy.world · 12 days ago · +4

    I would not be surprised at all if they had a backdoor way to filch data, or the key with which to decrypt backed-up data.

  • BilboBargains@lemmy.world · 11 days ago · +16

    It would not be surprising if it were found to be true. It’s difficult to see how the current business model operates at a profit. Their long-term goal is the usual loss-leader model until a monopoly is achieved, and then they slug us with ads, sell all the data, hike the price, etc. Sickening to watch them cosy up to fascists. They are probably supplying any and all agencies with intelligence scraped from their user base. If Facebook were a person, they would be a psychopath.

  • Rioting Pacifist@lemmy.world · 12 days ago · +5/-25

    Is the same not true of any app that depends on centralized servers, e.g. including Signal?

    And also Google & Apple can backdoor any app on any mobile device.

    • Dekkia@this.doesnotcut.it · 12 days ago · +2

      You’ve already gotten a lot of responses about the first claim.

      But to answer the second one:

      Why would they mess with a specific app if they already control the OS? They could read everything they ever wanted from memory without anyone noticing.

        • Dekkia@this.doesnotcut.it · 12 days ago · +1

          And what if your target is using a different app for messaging?

          I agree that blindly going through memory isn’t the best solution. To catch everything, a keylogger as part of the input handler of the OS would probably be the way to go.

    • superglue@lemmy.dbzer0.com · 12 days ago · +48

      No. Signal encrypts every message on the device itself before sending it to Signal’s servers. You can even confirm this yourself by looking at their GitHub.

      WhatsApp claims they do this, but it’s impossible to confirm. It’s extremely likely that either they don’t encrypt at all or they have the decryption keys.
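
      For illustration, a toy sketch of that client-side encryption idea (Python with PyNaCl; this is not the actual Signal protocol - no prekeys, no double ratchet - just the bare “only the endpoints hold the keys” property):

```python
# Each endpoint generates its own key pair; the relay server never sees private keys.
from nacl.public import PrivateKey, Box

alice_sk, bob_sk = PrivateKey.generate(), PrivateKey.generate()

# Alice seals the message using her private key and Bob's public key.
ciphertext = Box(alice_sk, bob_sk.public_key).encrypt(b"meet at noon")  # what the server relays

# Bob opens it with his private key and Alice's public key.
assert Box(bob_sk, alice_sk.public_key).decrypt(ciphertext) == b"meet at noon"

# Anyone in the middle (including the relay operator) holds neither private key,
# so the ciphertext is all they can log or hand over.
```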

      • Rioting Pacifist@lemmy.world · 12 days ago (edited) · +3/-20
        1. In the method described, it doesn’t matter whether Signal encrypts the message before it leaves your phone; the plaintext is still in the app and gets sent to Meta while also being encrypted with Meta’s keys.

        2. It’s basically impossible to know this isn’t happening just by reading source code, because the code to load widgets doesn’t have to be anywhere near the messaging code; you’d have to read the entire Signal code base.

        3. There is no way to know that the code you read on GitHub is the code Google/Apple install on your phone.

        • EisFrei@lemmy.world · 12 days ago · +10
          1. Why would Meta have access to Signal’s memory?
          2. That’s why code audits have been done multiple times.
          3. Reproducible builds. Signal has had those since 2016.
          • furry toaster@lemmy.blahaj.zone · 12 days ago · +1/-2

            about the 3rd: is the final APK a user downloads from the Play Store reproducible? could Google add stuff to the APK before the user downloads it? do users ever bother checking whether the APK hash matches the one from the reproducible build?
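
            For what it’s worth, a rough sketch of how such a check could look (Python; the file names are placeholders, and in practice Signal’s reproducible-build guide compares APK contents with an apkdiff script rather than whole-file hashes, because signing metadata differs between builds):

```python
# Pull the APK the Play Store actually installed, e.g.:
#   adb shell pm path org.thoughtcrime.securesms   -> path of the installed APK
#   adb pull <that path> playstore.apk
# then compare its digest against an APK you built yourself from the published source.
import hashlib

def sha256(path: str) -> str:
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

print("installed :", sha256("playstore.apk"))
print("self-built:", sha256("self-built.apk"))
# Matching code contents (after accounting for signatures) is the point of reproducible builds.
```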

        • just_another_person@lemmy.world · 12 days ago (edited) · +22/-2

          🤣🤣🤣😂

          Bruv, before Signal launched they posted an entire whitepaper detailing their protocol, the working mechanisms of the system, and source code. So to reply to your 3 points:

          1. No, this is stupid and easily verified by watching network traffic from any device. Signal isn’t secretly sending plaintext messages anywhere.
           2. No, it’s not impossible to tell this at all; that’s what the source code is for. Not only have NUMEROUS security audits been done on Signal by everyone from academia to for-profit security researchers and governments, but you can also verify that what you’re running on your phone is built from the same source code as what is published publicly, because the fingerprint hashes for builds are also published. The fingerprint you get building it yourself from source should match what is publicly published.
           3. See my point above, but also: when two users exchange keys on Signal (or in any other cryptographic sense), those keys are continually verified; if they change, the session becomes invalid. Verifying these keys between two users is a feature of Signal, and moreover, the basics of the cryptography have been examined and proven during the independent audits of Signal. Go read any of the numerous papers dating back to 2016. (A toy sketch of the key-fingerprint idea is at the end of this comment.)

          If you don’t understand how any of this works, it’s just best not to comment.
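
          Toy sketch of that key-fingerprint idea (Python; not Signal’s actual safety-number algorithm, and the key values are placeholders), just to show why two users comparing a short string catches a swapped key:

```python
import hashlib

def fingerprint(my_identity_key: bytes, their_identity_key: bytes) -> str:
    # Sort so both sides compute the same value regardless of who is "me" and "them".
    digest = hashlib.sha256(b"".join(sorted((my_identity_key, their_identity_key)))).digest()
    groups = [str(int.from_bytes(digest[i:i + 2], "big") % 100000).zfill(5)
              for i in range(0, 12, 2)]
    return " ".join(groups)  # short and human-comparable, e.g. read aloud or shown as a QR code

alice_view = fingerprint(b"alice-identity-pubkey", b"bob-identity-pubkey")
bob_view = fingerprint(b"bob-identity-pubkey", b"alice-identity-pubkey")
# Matches only if both phones saw the same two keys; a middleman who substituted
# a key in transit would produce mismatching strings on the two phones.
assert alice_view == bob_view
```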

          • pressanykeynow@lemmy.world · 12 days ago · +1

            What if the malicious actor is not Signal but Google or the hardware manufacturer?

            Can we check that the encryption key generated by the device is not stored somewhere on the device? Same for the OS.

            Can we check that the app running in memory is the same that is available for reproducible build checks?

            Can we check that your and my apps at the moment are the same as the one security researchers tested?

            • just_another_person@lemmy.world · 11 days ago · +2

              The clients (apps) enforce key consistency for your own keys, the server identity, and the keys exchanged with the other party in a conversation. Constantly. There is no way to MITM that.

              The clients are open source, and audited regularly, and yes, builds are binary-reproducible and fingerprinted on release.

              That’s not to say someone can’t build a malicious copy that does dumb stuff and put it on your phone to replace the other copy, but the server would catch and reject it if its fingerprints don’t match the previously known good copy, or a public version.

              Now you’re just coming up with weird things to justify the paranoia. None of this has anything to do with Signal itself, which is as secure as it gets.

              • pressanykeynow@lemmy.world · 11 days ago · +1

                “None of this has anything to do with Signal itself, which is as secure as it gets.”

                Didn’t I say that at the start of my questions? What’s your point?

                “the server would catch and reject it if its fingerprints don’t match the previously known good copy, or a public version”

                If I understand you correctly, you mean that the Signal app checks itself and sends the result to the server, which can then deny access to it? Is that what Signal does, and what makes this fingerprint difficult to spoof?

                I don’t think you answered any of my questions though since they weren’t about Signal.

                “Now you’re just coming up with weird things to justify the paranoia”

                I’m just asking questions about security I don’t know answers to, I’m not stating that’s how things are.

          • Rioting Pacifist@lemmy.world · 12 days ago · +2/-17
            1. Why would any message be plaintext?

            2. Fair, you could have just said they have reproducible builds or linked to the docs: https://github.com/signalapp/Signal-Android/blob/main/reproducible-builds/README.md

            3. Again, you are missing the point of the attack.

            “If you don’t understand how any of this works, it’s just best not to comment.”

            Back at you: even if you are right that Signal is secure, the attack is not what you think it is.

            • just_another_person@lemmy.world · 12 days ago · +14/-3

              What in the world are you talking about here, bud? Your comments are making zero sense.

              Look, seriously, if my comment is being upvoted, it’s because I responded to yours, and people understand what I am saying in response.

              You, unfortunately, clearly do not understand what I’m saying because you do not grasp how any of this works.

              • Rioting Pacifist@lemmy.world · 12 days ago (edited) · +2/-17

                “seriously, if my comment is being upvoted, it’s because I responded to yours, and people understand what I am saying in response.”

                Lmao, sure buddy, pat yourself on the back because you got upvotes.

                You’re talking about E2E encryption as if it prevents side-channel, client-side attacks, but sure, morons will upvote because they also don’t understand real-world security.

                The only useful thing you’ve pointed out in your deluge of spam is that Signal builds are reproducible, which does protect against the attack described (as long as there isn’t a backdoor in the published code).

                • just_another_person@lemmy.world · 12 days ago · +8/-1

                  Do you know what side-channel attacks are? Because nothing you’ve brought up describes one at all, or how it applies to your original comments.

                • wonderingwanderer@sopuli.xyz · 12 days ago · +10/-1

                  “You’re talking about E2E encryption as if it prevents side-channel attacks”

                  That’s literally what E2E encryption does. In order to attack it from outside you would have to break the encryption itself, and modern encryption is so robust that it would require quantum computing to break, and that capability hasn’t been developed yet.

                  The only reason the other commenter’s words sound like spam to you is that you don’t understand them, which you plainly reveal when you say “(as long as there isn’t a backdoor in the published [audited] code)”.

    • hersh@literature.cafe · 12 days ago · +14

      For most: yes, there is a risk that the vendor has included a backdoor. There is also the risk that they are straight-up lying about how their service operates.

      For Signal in particular: You can verify that their claims are true because you can audit the source code.

      The Signal client is open-source, so any interested parties can verify that it is A) not sending the user’s private keys to any server, and B) not transmitting any messages that are not encrypted with those keys.

      Even if you choose to obtain Signal from the Google Play Store (which comes with its own set of problems), you can verify its integrity because Signal uses reproducible builds. That means it is possible for you to download the public source code, compile it yourself, and verify that the published binary is identical. See: https://github.com/signalapp/Signal-Android/tree/main/reproducible-builds

      You might not have the skills or patience to do that yourself, but Signal has undergone professional audits, and if anyone ever discovers a backdoor, it will be major news.

      You are more likely to be compromised at the OS level (e.g. screen recorders, key loggers, Microsoft Recall, etc.) than from Signal itself.

      • wischi@programming.dev · 12 days ago · +1/-4

        Signal could still (at least for a short period of time) read everything. Whisper Systems just has to push a Signal update that no longer encrypts. It would probably be noticed pretty soon, and no, not because of the source code. The source code is what they claim to use to build the applications, but they could easily apply patches before they build. You’d have to reverse engineer the compiled applications to see if there is code that isn’t in the published source.

        This kind of problem is typically way smaller in projects that actively encourage building the clients from source yourself - which Whisper Systems/Signal does not.

        • SlippiHUD@lemmy.world · 12 days ago · +4

          There are so many ways to check for that which don’t require decompiling the app.

          You can directly compare the downloaded binary with a locally compiled binary to see if they match.

          You can check the hash of the app. Changing some lines of code and still getting the same hash is so unlikely as to be effectively impossible.

          If for some reason Signal decided to do what you claim, it’d destroy their credibility, be caught almost immediately, and only work once before the whole project gets forked, and the same would be true of any alternative.

  • Treczoks@lemmy.world · 12 days ago · +16

    Why am I not surprised? Whether there is no end-to-end encryption, they have a copy of every key, they get the decrypted messages from the client, or they can ask the client to surrender the key - it does not matter.

    The point is that they never intended to give users a secure environment. That would make the three-letter agencies angry, and would bar them from rather interesting data on their users.

  • clav64@lemmy.world · 11 days ago · +9

    I would argue that the vast majority of users don’t use WhatsApp for privacy. In the UK at least, it’s just the app everyone has, and it works. I’ve actively tried to move friends over to Signal, with limited success, but honestly it can’t be escaped that encryption is not its killer feature.

    • PhoenixDog@lemmy.world · 11 days ago · +3

      Yup. I use WhatsApp to text my girlfriend, and my work uses it as a group chat for road conditions or just shit talking.

      If you’re using it for secure purposes, you’re part of the problem.

  • fodor@lemmy.zip · 12 days ago · +6

    It will be interesting to see if this goes anywhere. It looks like the claims are based on specific aspects of California law (put simply: wiretapping, privacy, and deceptive business practices). Do they have a strong case? I don’t know; it’s not worth my personal time to research state law on these issues.

    Is there enough to go to court? Certainly the lawyers think so, and I agree. If Meta is claiming E2EE (which it is) and then immediately undercutting that by re-transmitting large numbers of messages to itself (which is alleged), that sure feels deceptive to me, and it’s easy to think that a jury might agree.

  • matlag@sh.itjust.works · 12 days ago · +30/-1

    Proposed line of defense: “With all respect, M. Judge, with all the different times we fucked our users, lied to them, tricked them, experimented on them, ignored them, we already sold private discussions on Facebook in the past, our CEO and founder most famous quote is «They trust me, dumbfucks!», the list goes on and on: no one in their sane mind would genuinely believe we were not spying on Whatsapp! They try to play dumb, they could not possibly believe we were being fair and honest THIS time?!”

    • CeeBee_Eh@lemmy.world · 11 days ago · +6/-8

      “Any claims around E2EE are pointless, since it’s impossible to verify.”

      This is objectively false. Reverse engineering is a thing, as is packet inspection.

        • CeeBee_Eh@lemmy.world · 11 days ago · +2/-1

          It isn’t. Otherwise security research would never happen for proprietary software and services.

          • Dr. Moose@lemmy.world · 10 days ago (edited) · +1

            In the US, the CFAA is so draconian that in certain respects it can be very illegal to reverse engineer code covered by an explicit ToS, which WhatsApp makes you agree to through a click-wrap agreement (meaning an explicit “I agree” button press) upon installing the app. So Meta could easily sue you, with a very good chance of winning. I work in security and reverse engineer a lot of stuff, but only because my company has lawyers that will protect me (also, I’m not an American). Generally, Americans are super fucked here, and there are many stories of people being sued and even imprisoned for breaking ToS.

      • Sinthesis@lemmy.today · 11 days ago (edited) · +2

        Now you just need Meta to allow you on their networks to inspect packets and reverse engineer their servers because as far as I know, WhatsApp messages are not P2P.

        /edit I betcha $5 that the connection from client to server is TLS (HTTPS); good luck decrypting that to see what its payload is.

      • snowboardbumvt@lemmy.world · 11 days ago · +4

        Reverse engineering is theoretically possible, but often very difficult in practice.

        I’m not enough of an expert in cryptography to know for sure if packet inspection would allow you to tell if a ciphertext could be decrypted by a second “back door” key. My gut says it’s not possible, but I’d be happy to be proven wrong.

        • black0ut@pawb.social · 11 days ago (edited) · +4

          Hell, as far as I know, E2EE traffic would be indistinguishable from client-to-server encryption, where the server can read everything without the need for a secret “backdoor key”. You can see that the channel is encrypted, but you can’t know who has the other key.
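
          A tiny illustration of that point (Python with PyNaCl; toy keys, not WhatsApp’s protocol): the wire bytes look the same whether the decryption key sits with the other user or with the operator.

```python
from nacl.public import PrivateKey, SealedBox

recipient_key = PrivateKey.generate()  # held only by the other user (true E2EE)
server_key = PrivateKey.generate()     # held by the operator (server-readable)

e2ee_blob = SealedBox(recipient_key.public_key).encrypt(b"hello")
server_blob = SealedBox(server_key.public_key).encrypt(b"hello")

# Both are equally opaque ciphertext to a passive observer; only key custody differs,
# and key custody is exactly what packet inspection cannot show you.
print(e2ee_blob.hex())
print(server_blob.hex())
```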

          • herseycokguzelolacak@lemmy.ml · 10 days ago · +1

            The easiest way to break E2EE is to copy your private key to Meta’s servers. It’s very easy to implement, and close to impossible to detect.

  • Seefra 1@lemmy.zip · 11 days ago · +17/-4

    Only the tech-illiterate can expect privacy from a closed-source program; open source is a requirement for both privacy and security.

  • melfie@lemy.lol · 11 days ago · +3/-1

    Ending encryption is Meta’s end so they can spy on everyone and help governments do so as well, so they therefore have an end to end encryption. Oh, y’all thought the app had true E2EE such that even Meta with their surveillance capitalist business model couldn’t access your data? 🤣