• HarkMahlberg
    8 months ago

    I’m a mixed bag of agree and disagree on these points but I’m only going to point out that the “De-Googling” trend doesn’t really have anything to do with the right to be forgotten. It has more to do with enshittification - Google shutting down services, making their current services harder to use, charging money for what used to be free services, charging more money for already paid services, adding ads, etc etc. Basically people finding alternative software to Google because Google’s practices have become increasingly volatile and their services less and less reliable.

    • danhakimiOP
      8 months ago

      but I’m only going to point out that the “De-Googling” trend doesn’t really have anything to do with the right to be forgotten. It has more to do with enshittification - Google shutting down services, making their current services harder to use, charging money for what used to be free services, charging more money for already paid services, adding ads, etc etc. Basically people finding alternative software to Google because Google’s practices have become increasingly volatile and their services less and less reliable.

      Ohhhhh that de-Googling. Yeah, I’ve done a bit of that, disabled the Google app on my phone entirely since Firefox does its job better, but I’m on Android and doing all that setup every time I get a new phone is just a headache.

    • @atrielienz@lemmy.world
      8 months ago

      To hear people on the privacy subreddits and even the privacy Lemmy communities tell it, it’s absolutely about the data these companies are collecting. I’ll grant you it’s about what the companies are perceived to be doing with the data they collect (serving ads), but I don’t think I personally ever made the point that the OP did (that it was about the right to be forgotten).

      Either way, I think the OP may have missed my point. As technology evolves, people will find new ways to abuse it. There’s a level of privacy people should be able to expect, and our privacy laws don’t do enough as it is. The OP is really suggesting that we further violate everyone’s privacy in the name of protecting them, and they don’t want to hear that it’s a bad idea, or one where we’d have to put our trust in a company or companies to apply this monitoring.

      They also don’t seem to want to hear about the burnout rate of people tasked with moderating content and validating that that content is against TOS or breaks the law. Having humans trawl communities, or even just messaging-app text data, for CP and scams is bound to have a detrimental effect.

      • danhakimiOP
        8 months ago

        To hear people on the privacy subreddits and even the privacy Lemmy communities tell it, it’s absolutely about the data these companies are collecting.

        Sure. But I can’t blame them for collecting data that I literally decide to send them for no reason but my own; I can only blame them for using that data in a shitty way.

        If I post something on Instagram, I know that they’re collecting the photo I post — that’s how posting works, that’s not the issue. The issue comes if they try scanning people’s faces to invade their privacy, or build an advertising profile about me. Sending unencrypted chat messages is not that different.

        If I download WhatsApp, and I enable the contacts permission, and it uploads all of the contacts data on my phone, that’s super not okay, because I never wanted to give them that data in the first place — they just jacked it. (I disable the contacts permission for WhatsApp on my phone, but most users would never know that data gets uploaded to begin with.)

        • @atrielienz@lemmy.world
          8 months ago

          Users are responsible for the consent and permissions they give to companies. Absolving them of that responsibility doesn’t make sense ethically or legally. We can’t just say “they didn’t know because WhatsApp didn’t tell them.” That’s not really an accurate statement. They more than likely agreed to use the app on the understanding that they would receive free use and WhatsApp would receive that data. But they more than likely didn’t read the agreement before agreeing. That’s on them.

          • danhakimiOP
            8 months ago

            But not in the same way that it’s on them if they don’t know that when they post something to Facebook, Facebook has the post.

            One of these things is sheer vapid stupidity; the other is a failure of extreme vigilance in a modern nightmare society.

            • @atrielienz@lemmy.world
              8 months ago

              You can make a private group on Facebook where you can exchange messages without anyone who lacks system access being able to view them. That’s how CP rings hide what they’re doing. And Facebook allows it until someone reports it or the cops subpoena that data — which is basically the same as every other messaging platform or social media site that allows such functionality. So what was it you wanted them (Facebook, or WhatsApp, or Signal, or Telegram) to do? Delete the accounts of known terrorists whether they are or aren’t using the platform for terrorist activities? Because I don’t really understand what you’re advocating for. You appear to very much be advocating for people’s private messages to be scanned and possibly read by a human being if they trip the algorithm. So yes. You are advocating for an invasion of privacy.

              • danhakimiOP
                8 months ago

                You can make a private group on Facebook where you can exchange messages without anyone who lacks system access being able to view them. That’s how CP rings hide what they’re doing. And Facebook allows it until someone reports it or the cops subpoena that data.

                My point here was that people would be stupid to expect that their information is private from Facebook.

                I also have to imagine that you’re wrong. I’m sure they have proactive means to scan for CP and ban it whenever they become aware of it, and just don’t have the means to always ban CP groups immediately. Like, knowing your company hosts child porn and allowing it to stay up is a great way to end up in prison, and most corporate douchebags prefer to avoid prison if they can. Zuckerberg does not want to go to jail just so he can get a few more ad bucks from pedos.

                I feel very weird defending Facebook, they’re quite evil, but your conspiracy theory is silly.

                So what was it you wanted them (Facebook, or WhatsApp, or signal or telegram) to do?

                You seem like you’re focused on private groups, which I think are as problematic on Telegram as they are on Facebook, but you’re really neglecting the issue of the fully public broadcast channels Hamas is known for, and known for engaging in terrorist activity on — including the al-Qassam Brigades! Why can’t Telegram ban the fucking public channel for the al-Qassam Brigades? Doesn’t that just make the issue so obvious?

                Delete the accounts of known terrorists whether they are or aren’t using the platform for terrorist activities?

                That would be good, but also, if you know somebody is a terrorist, isn’t that at least enough cause to look through their non-encrypted chats to see if they really are or are not using the platform for terrorist activities?

                You appear to very much be advocating for people’s private messages to be scanned and possibly read by a human being if they trip the algorithm. So yes. You are advocating for an invasion of privacy.

                Meh. If your public messages and stories and large group messages (which are really not private in any meaningful sense) trip an algorithm with high confidence, then scanning your unencrypted kinda-private-ish messages after that doesn’t seem like a big problem, and human review after a high-confidence trip there doesn’t seem too bad either.