• Max-P@lemmy.max-p.me · 18 days ago

    They’d get sued whether they do it or not, really. If they don’t, they get sued by those who want privacy-invasive scanning. If they do, they’re going to get sued when they inevitably land someone in hot water because they took pictures of their naked child for the doctor.

    Protecting children is important, but it can’t come at the cost of violating everyone’s privacy and making you guilty until proven innocent.

    Meanwhile, children just keep getting shot at school and nobody wants to do anything about it, but oh no, we can’t do anything about that because muh gun rights.

    • 0x0@programming.dev · 18 days ago

      Makes me wonder if the lawsuit is legit, or if it’s some “But think of the children”™ institution using some rando as cover.

      because muh gun rights.

      I think it’s a bit more complicated. These are worth a watch at least once:
      Let’s talk about guns, gun control, school shooting, and “law abiding gun owners” (Part 1)
      Let’s talk about guns, gun control, school shooting, and “law abiding gun owners” (Part 2)
      Let’s talk about guns, gun control, school shooting, and “law abiding gun owners” (Part 3)

    • john89@lemmy.ca · 16 days ago

      If people really cared about protecting children, we could always raise taxes on the wealthy or cut military spending to fund new task forces to combat the production and spread of child pornography!

      Heck, the money spent on this lawsuit could be spent catching people producing CSAM instead.

    • rottingleaf@lemmy.world · 18 days ago

      Meanwhile, children just keep getting shot at school and nobody wants to do anything about it, but oh no, we can’t do anything about that because muh gun rights.

      Usually, children get abused at school and shoot some of the bullies in response. Bullying is the problem, not the fact that many autistic children have no non-radical way of responding to it. And they do have a right to revenge if no other mechanism delivers justice.

      It’s telling how, in all such cases, the bullying itself is seen as almost normal; only the response isn’t. If a kid is weird enough to shoot the bullies, then they must have been weird before, and then apparently it’s all OK.

      But I agree that this is more important than interference with people’s communications to somehow prevent bad people from communicating.

      Bad people generally try to get into privileged positions, by the way, or put in the effort needed to secure their activities. Most surveillance schemes let them do their thing without interference.

      I think gun rights are fine. No free human should be robbed of the right to carry arms. Especially after seeing the videos from that prison in Syria: four underground floors, people not remembering their own names, children born there… I think one can make some sacrifices to keep one of the failsafe mechanisms against that.

  • schizo@forum.uncomfortable.business · 18 days ago

    First: I’m not in any way intending to cast any negative light on the horrible shit the people suing went through.

    But it also kinda feels like a lawyer convinced a victim they could get paid if they sued Apple, because Apple has lots of money.

    If you really were serious about suing to force change, you’ve literally got:

    1. X, who has reinstated the accounts of people posting CSAM
    2. Google/Youtube, who take zero action on people posting both horrible videos AND comments on said videos routinely
    3. Instagram/Facebook, which have much the same problem as X with slow or limited action on reported content

    Apple, at least, will take immediate action if you report a user to them. So, uh, maybe they should reconsider their target if their intent really is to remove content, and spend some time on all the other giant corpos that are either actively doing the wrong thing, doing nothing, or sitting there going ‘well, akshully’ at reports.

    • Chozo@fedia.io · 18 days ago

      Google/Youtube, who take zero action on people posting both horrible videos AND comments on said videos routinely

      I used to share an office with YouTube’s content review team at a previous job and have chatted with a bunch of them, so I can give a little insight on this side. For what it’s worth, YT does take action on CSAM and other abusive materials. The problem is that it’s just a numbers game. Those types of reports are human-reviewed. And for obvious reasons, it’s not exactly easy to keep a department like that staffed (turns out you really can’t pay people enough to watch child abuse for 8 hours a day), so the content quickly outnumbers the reviewers. Different types of infractions will have different priority levels, and there’s pretty much always a consistent backlog of content to review.

      While this article talks about Facebook, specifically, it’s very similar to what I saw with YouTube’s team, as well: https://www.theverge.com/2019/2/25/18229714/cognizant-facebook-content-moderator-interviews-trauma-working-conditions-arizona

      • 0x0@programming.dev · 18 days ago

        you really can’t pay people enough to watch child abuse

        I wonder what the package was, besides the salary. And the hiring requirements.

        • Chozo@fedia.io · 18 days ago

          I don’t know all the details, but I know they had basically unlimited break time, as well as free therapy/counseling. The pay was also pretty decent, especially for a job that didn’t require physical labor or a specialized background.

          They did have a pretty strict vetting process, because apparently it was not at all uncommon for people to apply either because they were eager to see abusive content directly, or because they had an agenda and might try to improperly influence what content gets seen. Apparently they did social media deep dives that you had to consent to in order to apply.

      • schizo@forum.uncomfortable.business · 17 days ago

        For YouTube, I was talking specifically about how long it took them to act, and how little action they took, on the kids-doing-gymnastics videos, even when it became abundantly clear that the target market was pedophiles and that the parents who kept posting these videos were, at the very least, complicit, if not explicitly pimping their children out.

        (If you have not seen and/or read up on this, save yourself the misery and skip it: it’s gross.)

        It took them a VERY long time to take any meaningful action, even after it was clear that the intent of the videos, and the audience they were being shown to, had nothing to do with gymnastics, and the content stayed up for literal years.

        Like, I have done anti-CSAM work and have lots and lots of sympathy for it because it’s fucking awful, but if you’ve got videos of children, clothed or not, where the comment section is entirely creeps and perverts and you just kinda do nothing, I have shockingly limited sympathy.

        Seriously, the comment section should have been used by the FBI to launch raids, because I 100% guarantee you every single person involved has piles and piles of CSAM sitting around, and they were just ignored because the videos weren’t explicit CSAM.

        Just… gross, and poorly handled.

    • john89@lemmy.ca · 16 days ago

      But it also kinda feels like a lawyer convinced a victim they could get paid if they sued Apple, because Apple has lots of money.

      Yep. All the money being wasted on this lawsuit could be spent catching actual producers and distributors of child porn.

      Always follow the money. It shows what people’s true intentions are.

  • conciselyverbose@sh.itjust.works · 18 days ago

    I thought the way they intended to handle it was pretty reasonable, but the idea that there is an actual obligation to scan content is disgusting.

  • paraphrand@lemmy.world · 18 days ago

    “People like to joke about how we don’t listen to users/feedback. About how we just assert our vision and do things how we wish. Like our mouse. It drives people absolutely bonkers! But this time we listened to the pushback. And now they sue us?”

  • lurklurk@lemmy.world · 18 days ago

    Is iCloud a file sharing service or a social network in some way? If it isn’t, comparing Apple with such services makes no sense.

  • john89@lemmy.ca · 16 days ago

    Is this a free system, by the way?

    Is Apple essentially getting sued for not giving another company money?

  • Lutra@lemmy.world · 15 days ago

    I just read up on it, and I hadn’t realized this is not so much about stopping new images as about restitution for continued damages.

    The plaintiffs are “victims of the Misty Series and Jessica of the Jessica Series” ( be careful with your googling) https://www.casemine.com/judgement/us/5914e81dadd7b0493491c7d7

    Correct me if I’m wrong, but the plaintiffs’ logic is: “The existence of these files is damaging to us. Anyone ever found in possession of one of these files is required by law to pay damages. Any company that stores files for others must search every file for one of these 100 files, and report that file’s owner to the court.”

    I thought it was more about protecting the innocent, current and future, but it seems to be more about compensating the hurt.

    Am I missing something?

  • lepinkainen@lemmy.world · 17 days ago

    The irony is that Apple’s CSAM detection system was about as good as such a system could be made at the time, with multiple steps to protect people from accidental positives.

    But, as usual, I think I was the only one who actually read the paper and didn’t go “REEEE muh privacy!!!” after seeing the headline.

    • lurklurk@lemmy.world · 18 days ago

      You should have, though. This type of scanning is the thin end of the wedge to complete surveillance. If it’s added, next year it’s extended to cover terrorism. Then to look for missing people. Then “illegal content” in general.

      The reason most people seem to disagree with you in this case is that you’re wrong.

      • lepinkainen@lemmy.world · 17 days ago

        We could’ve burned that bridge when we got to it. If Apple had been allowed to implement on-device scanning, they could have done proper E2E “we don’t have the keys, officer, we can’t unlock it” encryption for iCloud.

        Instead, what we have now is what EVERY SINGLE other cloud provider does: they scan your shit in the cloud all the time, unless you specifically upload only locally encrypted content, which 99.9999% of people will never bother to do.

    • Xatolos@reddthat.com · 17 days ago

      I think I was the only one who actually read the paper and didn’t go “REEEE muh privacy!!!” after seeing the headline.

      Did you also read the difference between how Apple was trying to go about it and how literally everyone else goes about it?

      Apple wanted to scan your files on your own device, which is a huge privacy issue and a huge slippery slope (and a built-in backdoor).

      The rest of the industry scans files only after they’ve left your private device and are sitting on the companies’ own servers. So your on-device privacy is protected, and there’s no backdoor built in.

      Apple just had a fit and declared that if they can’t backdoor your device and scan your files on it, they won’t try anything at all, not even the basics. They could simply follow everyone else’s lead and scan iCloud files, but they refuse to do that. That was the difference.
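
      (For context, the cloud-side scanning the rest of the industry does boils down to hashing each file after upload and checking it against a database of known CSAM hashes. The sketch below is only an illustration of that idea: real systems like PhotoDNA use perceptual hashes rather than SHA-256, and the hash list and reporting hook here are made-up placeholders.)

```python
import hashlib

# Placeholder list of known-bad digests; real providers use vendor-supplied
# perceptual-hash databases (e.g. PhotoDNA), not SHA-256 of the exact bytes.
KNOWN_BAD_DIGESTS = {
    "0000000000000000000000000000000000000000000000000000000000000000",
}

def sha256_of_file(path: str) -> str:
    """Digest of the file exactly as it sits on the provider's servers."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            h.update(chunk)
    return h.hexdigest()

def scan_uploaded_file(path: str, owner: str) -> bool:
    """Runs server-side, only after the file has already left the user's device."""
    if sha256_of_file(path) in KNOWN_BAD_DIGESTS:
        file_report(owner, path)
        return True
    return False

def file_report(owner: str, path: str) -> None:
    # Stand-in for the real step: providers file a report with NCMEC / law enforcement.
    print(f"match in {path!r} uploaded by {owner!r}; filing report")
```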

      • meejle@lemmy.world · 17 days ago

        I’m amazed it’s taken so long… I think I’m on my third Android phone since they first announced it and I said “fuck no”.

      • lepinkainen@lemmy.world · 17 days ago

        There was no “huge privacy issue”.

        First of all: you could turn off the local scanning by turning off iCloud sync, and with sync enabled the images would have been uploaded to the cloud, where they could be scanned anyway. That’s it, nothing else; nobody at Apple would’ve touched a single super-private file on your device.

        The local scanning required MULTIPLE (where n>3, they didn’t say the exact number for obvious reasons) matches to known and human-verified CSAM. This database is the one that would’ve been loaded from iCloud if you had it turned on. This is the exact same database all cloud providers are using for legal reasons. Some have other algos on top - at least Microsoft had an is_penis algorithm that shut down a German dude’s whole Live account for his kid’s pics being on OneDrive.

        After the MULTIPLE matches (you can’t get flagged by “accidentally” having one image on your phone, nor would pics of your kids in the pool trigger anything), a human checker would have had enough data to decrypt just those images and see a “reduced resolution facsimile” (can’t remember the exact term) of the offending photos. This is where all the brainpower spent on creating false matches would’ve ended up: you would’ve had to craft multiple matches of known CP images that look enough like actual CP for the human to make an erroneous call multiple times before anything was triggered.

        If after that the human decided that yep, that’s some fucked up shit, the authorities would’ve been contacted.
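
        (To make that flow concrete, here’s a minimal sketch of the general technique: on-device hash matching, a threshold of several hits, then human review. The threshold value, the toy perceptual hash and the function names are all invented for illustration; the real design used NeuralHash plus encrypted “safety vouchers” that neither the device nor Apple could read below the threshold.)

```python
from dataclasses import dataclass, field

MATCH_THRESHOLD = 5  # invented number; Apple never published the real one

@dataclass
class Account:
    user_id: str
    vouchers: list = field(default_factory=list)  # matched images, opaque until threshold

def perceptual_hash(image_bytes: bytes) -> int:
    # Toy stand-in for a real perceptual hash (NeuralHash, PhotoDNA, pHash).
    # Real ones survive resizing and re-encoding; this one does not.
    return int.from_bytes(image_bytes[:8].ljust(8, b"\0"), "big")

def check_on_device(image_bytes: bytes, known_hashes: set, account: Account) -> None:
    """Runs locally, and only for images queued for iCloud upload."""
    if perceptual_hash(image_bytes) in known_hashes:
        # In the real design the match is sealed inside an encrypted voucher;
        # below the threshold, nobody can even tell that it matched.
        account.vouchers.append(image_bytes)

def maybe_escalate(account: Account) -> None:
    """Server side: nothing becomes reviewable until enough matches accumulate."""
    if len(account.vouchers) >= MATCH_THRESHOLD:
        # Only now would a human see low-resolution derivatives and decide
        # whether to notify the authorities.
        print(f"{account.user_id}: {len(account.vouchers)} matches, queueing for manual review")
```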

        Yes, a Bad Government could’ve forced Apple to add other stuff to the database. (They can do that right now for ALL major cloud storage providers, BTW.) But do you really think people wouldn’t have been watching for changes in the cloud-downloaded database and noticed anything suspicious immediately?

        Also, according to the paper, the probability of incorrectly flagging a given account was about 1 in 1 trillion, and this figure wasn’t disputed even by the most hardcore activists, btw.

        tl;dr If you already upload your stuff to the cloud (like iOS does automatically) the only thing that would’ve changed is that nobody would’ve had a legit reason to peep at your photos in the cloud “for the children”. But if you’ve got cloud upload off anyway, nothing would’ve changed. So I still don’t understand the fervour people had over this - the only reason I can think of is not understanding how it worked.

        • Xatolos@reddthat.com · 17 days ago

          So I still don’t understand the fervour people had over this - the only reason I can think of is not understanding how it worked.

          Or that it was a built-in backdoor running on your device.

          The difference is that what happens on your own device should be in your control. Once data leaves your device, it’s no longer in your control, and that’s where the entire issue was. It doesn’t matter whether I toggle a switch to allow upload or not; the fact that the scanning was happening on my device was the issue.

          • lepinkainen@lemmy.world · 17 days ago

            It’s not a very good back door if you have an explicit, easy-to-use switch to turn it off.

            And even without this feature on your device, they don’t need to use a “back door”. They’ll just go through your front door, which is wide open and can’t be closed because of “the children”.

            If you want to “own” your phone, there are manufacturers other than Apple that let you lock it down like Fort Knox (or whatever you deem secure).

        • BearOfaTime@lemm.ee · 17 days ago

          You don’t understand, or you refuse to acknowledge, that this is a back door into your device and that Apple is actively scanning your files, meaning your device is now compromised.

          Or are you shilling for anti-privacy?

          My device, my files. I don’t want your scanning.

          What’s so hard to grok about that unless you are anti-privacy?

          • lepinkainen@lemmy.world · 17 days ago

            The files WILL be scanned the second they leave your device for any major cloud.

            If they don’t leave your device, then turning off iCloud (and thus the “back door”) wouldn’t have had any impact on you.

            • Railcar8095@lemm.ee · 17 days ago

              The files WILL be scanned the second they leave your device for any major cloud.

              There are services with E2E encryption, and for those that don’t have it you can encrypt your files before uploading.
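
              (Encrypting locally before upload is straightforward in principle; here’s a minimal sketch using the third-party `cryptography` package’s Fernet recipe. The upload function and file names are placeholders, and key management, the actually hard part, is left out.)

```python
from cryptography.fernet import Fernet

def encrypt_then_upload(path: str, key: bytes, upload) -> None:
    """Encrypt a file locally so the provider only ever stores ciphertext."""
    f = Fernet(key)
    with open(path, "rb") as src:
        ciphertext = f.encrypt(src.read())
    upload(path + ".enc", ciphertext)

def fake_upload(name: str, blob: bytes) -> None:
    # Placeholder for whatever cloud client you actually use.
    print(f"uploading {name}: {len(blob)} bytes of ciphertext")

if __name__ == "__main__":
    # Create a throwaway file just so the example runs end to end.
    with open("example.txt", "wb") as f:
        f.write(b"anything you'd rather not have scanned")
    key = Fernet.generate_key()  # lose this key and the files are gone for good
    encrypt_then_upload("example.txt", key, fake_upload)
```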

              Realistically speaking, if this were implemented, anybody with CSAM would simply not use iPhones, and all the scanning would be done on everyone else.

              Then, once it’s implemented and with less fanfare, some authoritarian regimes (won’t name any, so as not to upset the tankies) could ask Apple to scan for other material too… and as it’s closed source, we wouldn’t even know that the models differ by country.

            • Lutra@lemmy.world · 15 days ago

              Just clearing up the argument.

              1. The files will be scanned
              2. They’ve been doing it for decades

              There’s a difference of principle here, exemplified by the answer to this question: “Do you expect that things you store somewhere are kept private?” Where “private” means no one looks at your things, and “no one” means not a single person or machine.

              This is the core argument. In the physical world, things stored somewhere are often still considered private (a safe deposit box, say). People take this expectation into the cloud. Apple, Google, Microsoft, Box, Dropbox and the rest only made their scanning publicly known after they were called out. They allowed their customers to assume their files were private.

              Second issue: does a mere machine looking at your files count as a violation of privacy? And what if we pinky promise to make the machine not really, really look at your files, only squint at them? For many, yes, that still counts. It’s the process that is problematic. There is a difference between living in a free society and living in one where citizens have to produce papers when asked, a substantial difference. Having your files unexamined and having them examined by an ‘innocuous’ machine are substantially different. The difference is privacy: in one case you have a right to privacy, in the other you don’t.


              an aside…

              In our small village, a team sweeps every house during the day while people are out at work. In the afternoon you are informed that the team found illegal paraphernalia in your house. You know you had none. What defense do you have?

    • Petter1@lemm.ee · 18 days ago

      😆 Yeah, especially after I learned that most cloud services (Amazon, Google, Dropbox) were already doing CSAM scans on their servers 🤭

      • lepinkainen@lemmy.world · 17 days ago

        Yep, it’s a legal “think of the children” requirement. They’ve been doing CSAM scanning for decades already and nobody cared.

        When Apple built a system that required MULTIPLE HUMAN-VERIFIED matches of actual CP before even a hint would be sent to the authorities, it was somehow the slippery slope to a surveillance state.

        The stupidest ones were the ones who went “a-ha! I can create a false match with this utter gibberish image!”. Yes, you can do that. Now you’ve inconvenienced a human checker for 3 seconds, after the threshold of locally matching images has been reached. Nobody would EVER have gotten swatted by your false matches.

        Can people say the same for Google stuff? People get accounts taken down by “AI” or “Machine learning” crap with zero recourse, and that’s not a surveillance state?

        • Petter1@lemm.ee · 17 days ago

          😅 Why do we get downvoted?

          I guess somebody doesn’t like reality 💁🏻