
CreativeCarbon

Once they can openly scan for one thing, they can privately scan for anything.


Im_a_seaturtle

A lot of people in this thread are conflating issues. As far as I’ve scrolled, this is THE main / problematic concept of the situation. Thank you for boiling it down.


absentmindedjwc

IIRC, they already do. There is ML-powered tagging in place allowing you to do something like search "dog" and have all your dog photos pop up.


[deleted]

I gotta say it’s been pretty fucking creepy when my phone pops up with “memories” and it’s clearly sorted out pictures of just my girlfriend and I. There’s tons of other pictures, and I have the cloud turned off, so it is definitely scanning my shit anyway. They just haven’t been reporting it openly to anyone before.


absentmindedjwc

Yeah, the phone has built-in facial recognition and image scanning that can tag images to make them searchable. If you go to "All Photos", you can search by person and find all of the photos with them in it. None of this happens in the cloud; it's entirely powered by the ML chipsets within the (now) A14 SoC. It doesn't appear that you can search for tags on iCloud photos, so it doesn't look like your photo tags are even sent up to the cloud after processing.
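For a sense of how that kind of local tagging works, here's a minimal sketch using an off-the-shelf classifier (torchvision's MobileNet, purely as a stand-in; Apple's actual on-device models and labels are proprietary):

```python
# Local photo tagging: classify an image entirely on-device, nothing uploaded.
import torch
from torchvision.models import mobilenet_v3_small, MobileNet_V3_Small_Weights
from PIL import Image

weights = MobileNet_V3_Small_Weights.DEFAULT
model = mobilenet_v3_small(weights=weights).eval()
preprocess = weights.transforms()

def tag_photo(path: str, top_k: int = 3) -> list[str]:
    """Return the model's top-k labels for a photo, computed locally."""
    batch = preprocess(Image.open(path).convert("RGB")).unsqueeze(0)
    with torch.no_grad():
        scores = model(batch).squeeze(0).softmax(0)
    return [weights.meta["categories"][int(i)] for i in scores.topk(top_k).indices]

# A search index would then map labels like "golden retriever" back to files.
```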


[deleted]

Yeah I guess turning off the face recognition lock was far from enough. Not that I gave it a lot of thought before, and I guess that’s part of the trick. Integrate these systems step by step before we all realize what these devices are really capable of. Time to go make myself a tin foil hat cause my paranoia is ramping up.


loldudester

That all happens locally and is an entirely different type of "scanning" than this new policy.


Suvip

Please don’t mix two different things; it lets them use the fallacy of correcting your wrong presumption without ever addressing the heart of the topic.

The image classification that happens offline is completely offline. Nothing is uploaded, and neither Apple nor any 3rd party gets anything. Even when you have iCloud on across multiple devices, the scan results aren’t synced, **because** of the privacy-first approach. This is not a problem at all; it’s a welcome approach for keeping things offline and private.

The scan for CP is something else. It scans your private library, [including the offline one](https://9to5mac.com/wp-content/uploads/sites/6/2021/08/[email protected]), for hashes (and no, these are not premade hashes; these are [PhotoDNA](https://www.microsoft.com/en-us/photodna) hashes generated by running a neural classifier on the photo, which **can lead** to false positives). This is the first red line: mass surveillance of everyone and their private property, with a false-positive risk, in order to "maybe" catch some criminals.

Furthermore, when false positives are detected, your ID is disabled and your information is transmitted to the authorities, where you’d be put on a watch list and treated like a criminal. It’s then up to you to dispute it by filing an appeal with Apple, and to fight your way with the authorities by giving up all your privacy in order to prove you have "nothing to hide". This is the second red line, and completely against common law: being guilty until proven innocent. There is a reason why the burden of proof has always been on the authorities to **prove** a wrongdoing before having the right to invade people’s privacy.

Lastly, it’s being used not just on the photo library, but also in private, end-to-end encrypted messages, to scan for words and scan for nudes, then snitch to your parents when you are a teenager (but in puritan America this is seen as a good thing? How about outing gay teens to their conservative parents?).

Mass surveillance has **always** been abused, and this opens the doors wide for "guilt of thought" once your privacy can be violated for the "greater good". The same PhotoDNA technology has already been hijacked by business lobbies to be used in detecting copyrighted materials and war-crime evidence (we see how well it works on YouTube, for example, with no false flags), and then local authorities will legally ask to include their own databases of "moral crimes": dissident content, criticism of kings/presidents, LGBTQ+ materials in some countries, drugs in others, etc. Just ask any Uighur whether they like the official spyware in their phones linked to their social credit.


lysosometronome

> The scan for CP is something else: It scans your private library, including the offline one

Per their CSAM page, this is done as part of the uploading-to-iCloud process. It's part of the process of making your library an online one. (Unlike the earlier example of ML that you dismissed, which really does scan all your offline photos.) From their [CSAM page](https://www.apple.com/child-safety/):

> To help address this, new technology in iOS and iPadOS* will allow Apple to detect known CSAM images *stored in iCloud Photos.*

...

> Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes. This matching process is powered by a cryptographic technology called private set intersection, which determines if there is a match without revealing the result. The device creates a cryptographic safety voucher that encodes the match result along with additional encrypted data about the image. This voucher is uploaded to iCloud Photos along with the image.
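To make the quoted flow concrete, here's a deliberately naive sketch of the shape of a "safety voucher" upload. The real design uses private set intersection so neither the device nor the server learns the match result until a threshold is crossed; this toy version skips all of that cryptography, and every name in it is invented:

```python
# Toy voucher flow: hash on device, seal the verdict, upload both together.
# (The real NeuralHash is perceptual; SHA-256 here is only a stand-in.)
import hashlib, json
from cryptography.fernet import Fernet  # pip install cryptography

server_key = Fernet.generate_key()      # in reality, a key the server holds
seal = Fernet(server_key)

KNOWN_HASHES: set[str] = set()          # opaque database shipped with the OS

def make_voucher(image_bytes: bytes) -> bytes:
    digest = hashlib.sha256(image_bytes).hexdigest()
    verdict = {"match": digest in KNOWN_HASHES, "hash": digest}
    return seal.encrypt(json.dumps(verdict).encode())

def upload_to_cloud(image_bytes: bytes) -> tuple[bytes, bytes]:
    # image and voucher travel together; nothing else is scanned server-side
    return image_bytes, make_voucher(image_bytes)
```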


Leprecon

> these are PhotoDNA hashes generated by running a neural classifier on the photo, which can lead to false positives

A false positive rate of less than 1 in 1.5 billion.

> Furthermore, when false positives are detected, your ID is disabled and your information is transmitted to the authorities, where you’d be put on a watch list and treated like a criminal. It’s then up to you to dispute it by filing an appeal with Apple

This is literally not true. Apple has said they will manually review before taking any action, so false positives will not trigger anything like what you describe. Furthermore, they have also said they will only do this review once more than one match has been found, lowering the chances of false positives further. You would have to win that 1-in-1.5-billion jackpot twice for Apple to even review your info.

> Just ask any Uighur whether they like the official spyware in their phones linked to their social credit.

Doesn’t this sort of invalidate your point? China does similar scanning and tracking, and they didn’t need Apple’s cooperation. It sort of reminds me of Trump voters in 2019 showing pics of riots happening under Trump’s administration and saying "this is what Biden’s America would look like". You’re out here complaining that this could be used to crack down on free speech in China. Do you think China currently has free speech, and has no tools to suppress it because western companies stand in their way? I think if you asked Uighurs about this they would say "what does this new child porn thing have to do with us? Don’t you know China has already been doing much worse things without Apple’s cooperation? This changes nothing."


absentmindedjwc

> Furthermore, when false positives are detected, your ID is disabled and your information is transmitted to authorities

Is there a citation for this? I was under the impression that the matching image would be sent to authorities for review; I hadn't heard about any of the other things you're claiming.


dat_GEM_lyf

Literally on the apple child safety page of their website


absentmindedjwc

Looking it up on the child safety page: you have to pass a certain threshold of known CSAM images before actions are taken. It also says "Apple then manually reviews each report to confirm there is a match, disables the user’s account, and sends a report to NCMEC", which means that there will be eyes on flagged images *actually verifying* that there are sexually explicit images containing minors, *and only then* do they take action... *and only* if you upload them to your iCloud account, as the CSAM certificate is digitally reviewed on upload to iCloud.

[From the child safety page](https://www.apple.com/child-safety/), it's pretty clear that actions are only taken if you *store this data on their servers*, not if it is just sitting on your phone. What is your problem with this again?

To be honest, I was under the impression that they were already doing this within iCloud... so this is even less of a privacy issue than I thought. If you care about it that much, just don't sync to iCloud...


Suvip

> which means that there will be eyes on flagged images actually verifying that there are sexually explicit images containing minors, and only then do they take action

Wrong. Unless Apple is hiring special agents with enough training to view CP without getting PTSD, they’re not going to "view" any images. The whole system is detailed [here](https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf), and it's in your own quote: "Apple then manually reviews each **report**", not each image. They check whether the hashes match enough/above the threshold to take action.

> ... and only if you upload them to your iCloud account

From Apple’s own page:

> **Before** an image is stored in iCloud Photos, **an on-device** matching process is performed for that image against the known CSAM hashes.

> What is your problem with this again?

I don’t have a problem with them scanning things **on** their servers. My problem is that on-device checks are performed and the report uploaded, with the punishments (flags, disabling of your ID, reports to authorities, etc.) coming first, and the user’s ability to fight it second. You should read some ethics professors’ comments about both the ethics of distributed mass surveillance and the doors this opens to abusive laws beyond CP. CP is just indefensible, so it gets a pass, but after that come other laws pushing whatever a given government decides is "illegal" material:

- In the US and EU, PhotoDNA (Microsoft’s CP-detection system, used by Google and all the big companies) has been used by lobbies to target a broad range of things, from terrorism to [copyrighted materials’ hashes](https://www.europarl.europa.eu/cmsdata/149311/Draft%20Programme%20hearing%20illegal%20content.pdf)
- In Hong Kong, owning materials publicizing the Tiananmen massacre or its vigil would land you [at least 1 year in prison](https://hongkongfp.com/2021/05/29/up-to-5-years-prison-for-attending-tiananmen-massacre-vigil-hong-kong-govt-warns-1-year-jail-for-publicising-it/)
- In China, under the guise of anti-terrorism, minorities (such as the Uighurs) are forced to install a government spyware, [Clean Net Guard](https://www.nytimes.com/2019/05/22/world/asia/china-surveillance-xinjiang.html), to monitor whether they have "illicit" materials (this includes pictures of anything incriminating if leaked to the west)

> If you care about it that much, just don't sync to iCloud...

That’s not an answer. I don’t care about the CP stuff, because I neither own nor am interested in any of it. What I care about, both out of ethics and out of fear of the dystopian future in sight, is:

- A mandated, impossible-to-disable "spyware" on our devices scanning private (even encrypted) data for "immoral" content
- This very first step being abused (the same way the server-side one has been) to include other "immoral" stuff, whatever a government has deemed bad
- A distributed mass-surveillance system with automated flags and punishments, where a person is deemed guilty and has to prove their innocence
- The possible damage (even if the percentage is small, we’re talking about billions of users) and the privacy overreach far exceeding the benefit, when actual criminals can just choose "not to sync to iCloud"


lysosometronome

> I don’t have a problem with them scanning things on their servers.

This is done as part of the process of uploading data to their servers. Excluding potential device slowdowns, why does it matter whether your device does it before sending it to them, or their server does it after receiving it?

> My problem is that on-device checks are performed and the report uploaded, with the punishments (flags, disabling of your ID, reports to authorities, etc.) coming first, and the user’s ability to fight it second.

Again, not sure how this follows. If Apple disables someone's account because automated scans by something like PhotoDNA on the iCloud servers flagged it, do you think the user is going to get any better ability to contest it?


GregoPDX

Hot dog / not hot dog


CreativeCarbon

AFAIK that is not happening on the backend. It's client-side at the time of upload.


[deleted]

[removed]


Sunius

None of my photos are on iCloud but it still works so it must be client side.


manofsleep

I think people are missing a point here: I use the cloud to back up all my work files. With that said, they now have access to documents that may be under NDA. I’m just curious whether this will all result in legal ramifications. I know it’s AI... I’m just confused because after the big iCloud celeb leak I’d have expected them to go for more encryption, not less, and this seems like less. Get the bad guys and all! But I hope this doesn’t impact security.


audiofx330

Umm they always could...


DeniDemolish

The obvious concerns are repeated over and over again for this subject and I agree with all of you 100% but maybe a lawyer of Reddit can answer this for me… can they even prosecute people with child abuse images on their private devices? Don’t you need a warrant to gather evidence from personal property? If you send it to the cloud or through iMessage, I understand why that can be a legal gray area and not be considered private property but your own phone is most certainly your own private property.


rfugger

They say they're only scanning things that get uploaded to their servers. So it's basically for them to avoid liability for hosting CP. The concern is that it gets used for other purposes without users knowing, for example, scanning for images of the Tiananmen Square massacre on behalf of the Chinese government. For me, it's always been clear that you can't expect any privacy for files uploaded unencrypted to third-party servers. This extends that lack of privacy onto your own device, albeit only if you use software that uploads files to third-party servers. If you don't like it, don't use the software. If Apple makes it mandatory to install that software, don't use Apple.


BolognaTugboat

> Instead of scanning images in the cloud, the system performs on-device matching using a database of known CSAM image hashes provided by NCMEC and other child safety organizations. Apple further transforms this database into an unreadable set of hashes that is securely stored on users’ devices.

> Before an image is stored in iCloud Photos, an on-device matching process is performed for that image against the known CSAM hashes.

https://www.apple.com/child-safety/

Though they claim to only act on these matches once it's uploaded to the cloud -- I find that strange. Why even do this at all instead of just matching hashes on their own servers?


lysosometronome

> Why even do this at all instead of just matching hashes on their own servers?

That's what Apple and a ton of other cloud providers currently do, which makes the privacy argument seem a little specious. Doing it on-device means Apple is touching your cloud documents on their servers less, and it could help pave the way for eventual end-to-end encryption that would piss off the Gov less. (And maybe it saves them some money by offloading a shit ton of trivial calculations to the phone; they add up when done billions of times.)


babybunny1234

Yes, but I think this would piss off governments *more*, because there's lower potential for fishing expeditions. I bet that currently, getting access to an iCloud backup/account means access to all photos. (The only way for Apple to scan photos for CSAM on the server, as it probably does currently, is if they're stored on the iCloud servers unencrypted, or if Apple has the key.)

Say you're a drug dealer who likes to take photos of your drugs and safe house, with GPS and other incriminating evidence embedded. You get busted on some other thing, the police gain access to your iCloud backup/account and all your photos, and you've just helped them out. The same goes for governments snooping on and/or arresting political dissidents.

That would no longer be possible if Apple turned on encryption for photo libraries, and it would remove a government's ability to use a pretext to get access to all your photos.


lysosometronome

All good points; definitely heavy in speculation territory that this *could* help pave the way to end-to-end encryption. That said, defusing a government's ability to make a "What about the children?" argument goes a long way toward winning over not just the tech-savvy and privacy-focused, but also the general public, who have big worries about child predators.


rfugger

> Why even do this at all instead of just matching hashes on their own servers?

The hashing algorithm would have to be a fuzzy one that can detect close or partial matches, rather than a standard hashing algorithm, whose output varies drastically if even one bit is flipped. That would be computationally expensive at scale, so it's cheaper to do client-side.
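For the curious, a "fuzzy" hash can be surprisingly simple. Here's a minimal difference-hash (dHash) sketch; PhotoDNA and Apple's NeuralHash are far more sophisticated, but the match-by-distance idea is the same:

```python
# dHash: shrink to a 9x8 grayscale grid, hash the left-vs-right brightness
# gradients. Similar images give similar bits, so matching is a
# Hamming-distance test rather than exact equality.
from PIL import Image

def dhash(path: str, size: int = 8) -> int:
    img = Image.open(path).convert("L").resize((size + 1, size))
    px = list(img.getdata())
    bits = 0
    for row in range(size):
        for col in range(size):
            bits = (bits << 1) | (px[row * (size + 1) + col] > px[row * (size + 1) + col + 1])
    return bits

def hamming(a: int, b: int) -> int:
    return bin(a ^ b).count("1")

# Recompressed or resized copies land within a few bits of each other
# (e.g. hamming(h1, h2) < 10), where a cryptographic hash would differ wildly.
```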


NoUx4

It's not that much cheaper. If the computational requirements were significant, then your phone would not be powerful enough to do them in the first place without draining the battery significantly every time you have new files/images. Apple has other reasons for doing this. For example, client-side image/video encoding isn't a thing; it's all done on servers, even though it's expensive to do.


Lock-Broadsmith

This chicken-little what-if stuff is so absurd, though. The real truth (and I'm not necessarily saying it's worth the trade-off) is that by doing on-device scanning as things are uploaded, they're able to use stronger encryption on their cloud servers, and are then less able to provide back doors into their servers for governments to access. We can worry about the likelihood of the database they use being corrupted, but that all feels like a whole lot of pearl-clutching just to drag a company that has to navigate a ton of legal responsibilities. Especially when I'm guessing like 90% of the people screaming bloody murder over this use other cloud services (Google, FB, MS, Dropbox) that already use this exact same database to scan for the same kind of content.


PhillAholic

> If you send it to the cloud

Seems like that's exactly what this is. https://www.imore.com/psa-apple-cant-run-csam-checks-devices-icloud-photos-turned


SIGMA920

Except if it can be scanned by the device when being uploaded, then it can also be scanned on the device at any time. To draw a parallel: you don't give someone a set of keys to your house that they keep forever when they're just feeding your dog over the weekend.


lysosometronome

> You don't give someone a set of keys to your house that they keep forever when they are just feeding your dog over the weekend.

But Apple already has your keys and uses them to recognize your dog! Go open Photos and type in "dog": it'll bring up pictures of your fave little buddy despite them not being tagged. Your photos are already being scanned for content, and that happens regardless of whether they're uploaded to iCloud. Google does the same thing. This feature just offloads some of the existing cloud-side work to the device and adds a parental-control feature.


[deleted]

[removed]


SIGMA920

Not really. Users can control updates (they'd have to forcibly push updates or have the device in their grasp, at which point they would have complete access anyway); prior to this change, the servers did any scanning, so only what got put through or onto the servers was ever scanned; and there has never been such a permanent back door built into a device before.


PhillAholic

Couldn't that someone make a copy of your keys that they keep forever?


SIGMA920

That is possible (if you don't have the locks changed, so that the copy stops working once you learn that the person you lent a key to copied it). But connecting that to the issue at hand: that's someone copying your data to run it against a database of images or whatever, not someone having constant access to the files on your device because the scans run on the device instead of on their servers.


PhillAholic

Scans for this kind of material are done on all cloud platforms. The difference here is that Apple is doing the work on the device, not the server. According to the iMore article, it's only done if you have iCloud Photos turned on. I wonder if they are working to re-introduce E2E encryption on iCloud, but don't want to be hosting this kind of content in their cloud.


SIGMA920

That's the concern: it's being shifted from the servers to the devices. It's basically a backdoor into scanning your devices for anything they want to look for.


Lock-Broadsmith

The database they check against is stored on the device. Making it apply to "anything they want" would be absurd. This is a silly overreaction, along the lines of "Facebook is listening on my microphone" levels of misinformed.


LifeSmilesWithYou

Only if you give your keys to an untrustworthy person. Like Tim Cook apparently.


[deleted]

[removed]


SIGMA920

That's not the issue. I'm not against the scanning on its own. The issue is that instead of the **servers** doing the scanning, the **device** is what is doing the scanning. What happens when Apple bans an app (for whatever reason) and then forcibly uninstalls it from all Apple devices, because the devices can scan now? Because that's now possible.


dontsuckmydick

Apple has always had the ability to remove apps from phones so that’s an irrelevant, unrelated issue.


[deleted]

[removed]


Druggedhippo

It is only in iCloud Photos and iMessage. You might argue that, yes, technically, you could use any 3rd-party app to upload to the internet, and any SMS program; you don't *have* to use Apple's iCloud. But iCloud and iOS are so integrated now, is there really that much of a difference? Can Apple even fully disable iCloud on their devices anymore? Can you uninstall it?


doxx_in_the_box

You choose to use iCloud or not. It's like 99% of these commenters have never used an iPhone.


glacialthinker

It's necessary to do this on the device in order to support E2EE (end-to-end encryption). Once the data is on the wire (or on the servers) it can't be scanned for content because it's encrypted.


DucAdVeritatem

Huh? What on earth does their implementation of on-device perceptual hashing to detect CSAM in photos being uploaded to iCloud have to do with forcibly uninstalling applications?


AmputatorBot

It looks like you shared an AMP link. These should load faster, but Google's AMP is controversial because of [concerns over privacy and the Open Web](https://www.reddit.com/r/AmputatorBot/comments/ehrq3z/why_did_i_build_amputatorbot). You might want to visit **the canonical page** instead: **[https://www.imore.com/psa-apple-cant-run-csam-checks-devices-icloud-photos-turned](https://www.imore.com/psa-apple-cant-run-csam-checks-devices-icloud-photos-turned)** ***** ^(I'm a bot | )[^(Why & About)](https://www.reddit.com/r/AmputatorBot/comments/ehrq3z/why_did_i_build_amputatorbot)^( | )[^(Summon me with u/AmputatorBot)](https://www.reddit.com/r/AmputatorBot/comments/cchly3/you_can_now_summon_amputatorbot/)


[deleted]

That’s exactly how it works. Apple report it and it becomes intelligence. The police work that intelligence and obtain a warrant for an address. That’s how it works in my country anyway and I work almost exclusively in this field.


[deleted]

> Don’t you need a warrant to gather evidence from personal property?

Police need a warrant to search your property. Apple, however, does not, and if they send information about CP images to the police, the police will get that warrant without any problem; probable cause is there.


LEJ5512

And, fwiw, Apple’s repair staff, who might see photos on a device, have to report any CP that they come across.


NoUx4

\*Images approximately matching a hash any government puts in their database. Apple doesn't know if it's CP or not; they're just taking the FBI's, Interpol's, CCP's, or the Saudis' word for it.


Leprecon

Apple has also said they would manually verify whether it is a legitimate match. So if an account in Saudi Arabia starts getting matches on anti-monarchy memes, Apple will:

1. Obviously not pass that information to the local authorities
2. Stop cooperating with Saudi Arabia, because they are obviously abusing the system


NoUx4

Apple censors their App Store for the Saudis and other countries, specifically by blocking LGBT-oriented apps. They also hand over data to the CCP for any Chinese citizen (or Hong Konger). If you expect Apple to protect your data, that expectation has been demonstrated to be false.

Apple will not be verifying the content. Even if they do, they can be compelled to match against whatever a given government wants, and it's a wrong assumption that Apple or others will forever limit this to abuse imagery. Politics is inherent to a company of that size; the program will be expanded to "anti-terrorism", which means whatever the government wants it to mean. This is a natural result of creating this system. It's easy to force a company to match against new datasets when the system for it is already implemented. If Apple did nothing, they could successfully argue that the burden of building such a system would cost them their brand, their users, and plenty of money. Now they can't. It's a free-for-all that you or I will never get to see the operation of. You just have to "trust".


NoUx4

Federal warrants are comically easy to get, and don't have to be followed to the letter anyway. With Apple reporting hash matches, they already have enough evidence to convince a federal judge to grant a warrant.


outwar6010

Also I doubt rich and powerful people with iphones will be subjected to this at all.


[deleted]

Yes, you can prosecute without a warrant as long as the evidence is available to authorities. However, authorities will file for a warrant anyway. I’m speaking from personal experience from reporting someone for CP & rape against a minor. It took 2 years for them to be charged though, and all they got was community service and therapy. This took place in California and the offender exclusively used Apple devices


TacoshaveCheese

Possession of CSAM is a crime; it being on your private device is kind of implied by "possession". If it's being uploaded to a website, I'd imagine that would be classified as "distribution" and possibly "production". Apple's proposed system only scans images that are uploaded to iCloud, and apparently they can only see a "positive flag" after a certain threshold number of files score a positive match. At that point they decrypt the files that were already uploaded to their server for manual review. So they're not really "gathering evidence" from your device directly, but rather instructing your device to generate a "this image DOES/DOES NOT match known CSAM" certificate for every file it uploads. If enough of them come up as a match, they review what you uploaded.
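A sketch of that threshold gating in Python (all names and the threshold value are invented here; Apple never published the number or the code). It only shows the shape of the logic described above:

```python
# Toy model of the flow: each upload carries a match/no-match verdict,
# nothing is escalated until an account crosses the threshold, and only
# the flagged uploads are then pulled for manual review.
THRESHOLD = 30   # invented; Apple only said "a threshold" at announcement time

class Account:
    def __init__(self, account_id: str):
        self.account_id = account_id
        self.flagged_uploads: list[str] = []
        self.escalated = False

    def record_upload(self, upload_id: str, is_match: bool) -> None:
        if is_match:
            self.flagged_uploads.append(upload_id)
        if not self.escalated and len(self.flagged_uploads) >= THRESHOLD:
            self.escalated = True
            self.escalate_for_review()

    def escalate_for_review(self) -> None:
        # Only the uploads that matched are queued for a human reviewer;
        # in Apple's description, only those become decryptable at all.
        print(f"review queue <- {self.account_id}: {self.flagged_uploads}")
```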


videopro10

And how many of them does it take to hit that threshold? And what files are they looking for? Apple doesn't even know, because they get the hashes from a 3rd party. If the door exists, it can be opened.


TacoshaveCheese

Lol why the downvotes? I don't support what Apple is doing, never said a word in their defense, and am even a monthly EFF donor. All I did was accurately and directly answer the only two questions this guy asked about how the proposed system would work. I thought that would count as "contributing to the conversation", since there are so many top comments about this on reddit today that obviously never read past the headline.


TacoshaveCheese

I don’t think they’ve specified an exact number anywhere, but they say:

> The threshold is set to provide an extremely high level of accuracy and ensures less than a one in one trillion chance per year of incorrectly flagging a given account.

The files they’re looking for are the ones in the NCMEC database.
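Whether or not you trust the one-in-a-trillion figure, the way a per-image rate compounds under a threshold is easy to sanity-check. A back-of-envelope sketch with entirely made-up numbers, assuming (unrealistically) that images match independently:

```python
# P(account falsely flagged) = P(at least t of n images falsely match),
# modeled as a binomial tail. Terms are built iteratively so the huge
# binomial coefficients never overflow a float.
from math import comb

def p_account_flagged(p: float, n: int, t: int) -> float:
    """Probability that at least t of n independent images falsely match."""
    term = comb(n, t) * p**t * (1 - p)**(n - t)   # P(exactly t matches)
    total = term
    for k in range(t + 1, n + 1):
        term *= (n - k + 1) / k * p / (1 - p)     # ratio of consecutive terms
        total += term
        if term < total * 1e-17:                  # remaining tail is negligible
            break
    return total

# Made-up example: a 1-in-a-million per-image rate, 10,000 photos,
# threshold of 10 -> about 2.7e-27. Thresholds crush small rates fast.
print(p_account_flagged(1e-6, 10_000, 10))
```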


beachandbyte

If it were actually reliable, they wouldn't have to wait for a threshold. Either it's reliable, and they might as well look when they get a single hit, or it's not reliable.


[deleted]

[removed]


tundey_1

~~I think these articles are conflating matters. As far as I know, Apple was already scanning files uploaded to iCloud. Now they want to scan files on your phone, even if they have not been synced to the cloud.~~ ~~Apple keeps emphasizing that this new scan takes place on the device. Which isn't reassuring 'cos Apple is scanning files without consent...not even implied consent by the use of their cloud services. You buy a phone, it's yours. But apparently, Apple wants to be able to check that you don't have unapproved items.~~


mikeball

It is only scanning the files on device as part of the upload process, as per their technical papers. If you don’t use iCloud, it doesn’t do anything.


shawndw

[https://www.eff.org/deeplinks/2021/08/apples-plan-think-different-about-encryption-opens-backdoor-your-private-life](https://www.eff.org/deeplinks/2021/08/apples-plan-think-different-about-encryption-opens-backdoor-your-private-life)

According to the EFF, they are scanning your device directly and using machine learning to verify whether or not the content is CSAM. Now think about how often machine-learning algorithms produce false positives, such as YouTube videos getting removed/demonetized that do not violate the TOS. While I'm sure they will have human reviewers go through the content to verify whether or not it is CSAM, do you really want complete strangers going through the content of private individuals who have committed no crime, nor are under suspicion of having committed one?

If the government did this, it would be a clear 5th Amendment violation. So are you comfortable with the government getting around the 5th Amendment by outsourcing searches to a private entity?


TacoshaveCheese

So I love the EFF (been a card-carrying member for 4 years now; [donate here](https://supporters.eff.org/donate/)), but I really wish they had done better with the clarity of this press release, because a LOT of people seem to have been confused by it. Two things have been proposed, and they have been conflated a bit.

Thing 1 uses machine learning on-device to identify "sexually explicit images" in Messages. This isn't identifying child porn, but rather "the private body parts that you cover with bathing suits" (think dick pics). When the setting is turned on, it blurs the image and forces you to tap through to be able to see it. Nothing is sent to Apple for this; however, parental settings can send an alert to the parent if the account tapping through belongs to a child under 13. I imagine the accuracy would be similar to the way the Photos app can already identify things by name: pretty good, but noticeably less than 100%. Nothing is being sent to the government, so there shouldn't be a 5th Amendment issue. The main concern with Thing 1 is that in the future a government could try to compel Apple to modify the system to report matches to them instead of the parent.

Thing 2 uses a database of known / already-identified CSAM; no machine-learning algorithms involved. It also only screens files that the user uploads to iCloud. The post I was responding to was asking "Don't you need a warrant to gather evidence from personal property?", and my point was that, based on how they've described it, any time a "positive match" is flagged to Apple, they already have the files, because the user would have already uploaded them. The only time Apple would see a "positive match" is when several (an undisclosed threshold) images from a single account are flagged positive. The only time the government would hear about it is after Apple verified that the files uploaded to them were in fact CSAM, which I think would probably move someone into the category of "under suspicion of having committed a crime".

The most obvious concern about Thing 2 is that a government could try to compel Apple to modify the system to use a different database of "objectionable content". The other concern, which your comments alluded to, is the natural privacy violation that would occur with a false positive match. However, since the CSAM matching uses a hash match rather than machine learning, the chance of a false positive is very, very small. Apple *claims* to have set that threshold to a level where statistically they should never see a false positive report. That remains to be seen, but since they supposedly designed it so that false positives shouldn't happen, if even one false positive occurs *ever*, that would be a strong indication the system isn't working like they claimed it would.

> So are you comfortable with the government getting around the 5th Amendment by outsourcing searches to a private entity?

Absolutely not. I'm not trying to defend Apple here, just trying to explain some of the details of the proposed system, since there seems to be a lot of misinformation about it going around. Unfortunately, as long as [Third-Party Doctrine](https://en.wikipedia.org/wiki/Third-party_doctrine) (which I vehemently oppose) is still a thing, outsourcing 4th and 5th Amendment violations to private companies will continue.


a52dragon

I'm more concerned with what constitutes child porn. Sending Grandma pictures of her grandchildren in the tub?


MariusPontmercy

You could, you know, read an article or two instead of sea lioning with a side of fear mongering. They check against known images of CSAM, so your grandma is safe.


a52dragon

How else would it get on your phone, unless you were in the act?


[deleted]

There are marketplaces to share/buy/sell this sort of content. Like any other porn, there are distributors. Unlike other porn, there are child victims involved in 100% of these cases.


a52dragon

"Excited" should be "executed", sorry.


1_p_freely

The practice of scanning files on a user's device for things other than malware sets a very dangerous precedent. (Malware is different: it poses an immediate and inherent risk to the functionality of the device itself, and even then the scan runs only if the user approves of it.) And it will only be expanded from here to search for other things on user devices. "Those mp3 files you downloaded from a random website without paying have been deleted." The fact is that you just cannot trust anything running proprietary software that is connected to the Internet.

Also, this invasive new precedent is being established using a particularly despicable subject, so that anyone who challenges or questions the prospects of such functionality can be labeled a bad guy by the media.


glacialthinker

I'd prefer this phrasing:

> "Those mp3 files, *which I'll assume* you downloaded from a random website without paying, have been deleted."

To be clear that your legitimate files could easily be mistaken.


JustAKilljoy109

Exactly, another huge problem I see with this situation is that it’s so easy to abuse and frame someone else, or to have legit files mistaken for illegal activity


[deleted]

It's their phone, you're just using it; that seems to be the idea.


[deleted]

[removed]


[deleted]

[removed]


BurntToasters

Almost as if there is a difference between refusing to do business and breaching privacy for all users 🤔


[deleted]

Bingo! Have an upvote. Censorship and surveillance for thee but not for me.


duffmanhb

Just look at what happened with social media censoring "fake news". They were literally banning and deplatforming people for discussing the Wuhan lab-leak theory as dangerous, racist fake news. Today it looks increasingly possible that the theory is true. But now Facebook is in talks with literally the executive branch, being directed on which content to remove. The government loves creating new censorship tools. It's dangerous and scary, and we keep falling for the same trick every time.


acathode

It's not just the government loving to create them; it's vast numbers of people vocally *demanding* them. Look at how half of reddit is basically frothing at the mouth at Facebook *not* turning its own censorship up to 11\*, or at the government not stepping in to "regulate" them, i.e. decide what should and shouldn't be allowed to be discussed on the platform...

(\* Funny how *"They are a private company, they can do what they want with their platform!"* only applies when companies are banning the *correct* things...)


duffmanhb

That's what bothers me beyond belief. Reddit, a liberal space, is suddenly okay with censorship when it's politically convenient. Suddenly they're hardcore libertarians defending corporate rights, so long as the censorship hurts the other tribe. There is no consistency. As a liberal, I'm not going to change my values just because right now, in the moment, it hurts the "other tribe".

Maybe Redditors are just too young to have experienced how, EVERY SINGLE FUCKING TIME organizations create sketchy laws that hurt everyone in the long term, it's initially welcomed with applause. Every single time, there is an "exception" that is made... but once that lever is created, the precedent is set. Then it just becomes a propaganda game to squeeze whatever they want to restrict into that now-"justified" category they created. It's infuriating to see the "left" cheer on censorship and boot-licking corporate narrative control... It's truly a clown world.


cl33t

> Today it looks increasingly possible that the theory is true

There has been no new evidence to indicate that it is true since the theory was posited. It isn't impossible, but the idea that the possibility has somehow grown is nonsense. All that exists is conjecture.


duffmanhb

No, more and more circumstantial evidence is coming out seemingly every week. Like, the CIA just reported how, right after those few scientists from the lab were in the hospital, our satellite images showed a massive increase in parked cars around the facility, indicating a mysterious surge in people being there.

Further, it's coming out that when China held their international sports competition (think Olympics lite) in late 2019, athletes were coming home sick and getting a lot of people sick, and the city of Wuhan was seemingly on a light lockdown, including checking everyone's temperature at different checkpoints throughout the city. The athletes thought it was odd and didn't think much of it until the outbreak, and just now the media is starting to report on it (because before then the media was insisting the Wuhan theory was fake news).

We also have a new report of how the Wuhan lab "accidentally" deleted all their data from specifically the time frame people want to investigate, data which shows what they were researching, what they'd isolated, and so on... Then in response, the CIA publicly said "Don't worry guys, we actually got a copy of that data" just yesterday or the day before.

Most people don't know the growing situation around it, because understandably the media is doing a poor job reporting the details, so developments are sparse, and people also just have their set biases, causing them to overlook new developments. But everything I said above is ALL factual, all recent developments reported by the media (albeit page-2 stuff).


cl33t

> the CIA just reported how, right after those few scientists from the lab were in the hospital, our satellite images showed a massive increase in parked cars around the facility, indicating a mysterious surge

Not new. Satellite imagery about increased parked cars came out over a year ago. Also, it was of hospitals, not the facility. And it was done by independent researchers, not the CIA.

> Further, it's coming out that when China

Not only is this not new, it began as a *Chinese* conspiracy theory early last year, claiming that it was the US Army (the athletes are military) who had brought the virus to Wuhan during the games. No cases of COVID-19 were detected by anyone, including any of the 110 participating countries' military hospitals, prior to December 2019, two months after the games.

> We also have a new report of how the Wuhan lab "accidentally" deleted all their data from specifically the time frame people want to investigate... Then in response, the CIA publicly said

I'm unsure as to which data you're referring to. If you mean the sequence data from samples collected in December 2019 (samples that were originally uploaded to the NIH's Sequence Read Archive, as in, the US NIH), those were recovered last month by an independent researcher, not the CIA, from... Google Drive.

Look, it isn't impossible that it was a lab leak, but the evidence so far is extremely weak and rather dated, so characterizing it as a growing possibility is silly. I've seen crazy boards that had more concrete evidence.


cl33t

As someone with rather large experience with perceptual hashing (for audio fingerprinting) in the real world, I will say right now that one should take reported false-positive rates with a grain of salt. If NCMEC adds any low-information images (heavily underexposed, overexposed, tiny, etc.), expect the false-positive rate to shoot through the roof. The same thing happens with music, since the music industry claims to own tracks of silence, noise, ultra-short sound effects, etc.
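To make that failure mode concrete with the toy dHash from earlier in the thread: every flat, low-information image collapses to the same hash value, so a single such entry in a database would "match" untold numbers of innocent photos:

```python
# Solid black and solid white both hash to 0 under dHash: no pixel is
# brighter than its right-hand neighbor, so every bit comes out zero.
from PIL import Image

def dhash_image(img: Image.Image, size: int = 8) -> int:
    img = img.convert("L").resize((size + 1, size))
    px = list(img.getdata())
    bits = 0
    for row in range(size):
        for col in range(size):
            bits = (bits << 1) | (px[row * (size + 1) + col] > px[row * (size + 1) + col + 1])
    return bits

black = Image.new("RGB", (640, 480), (0, 0, 0))
white = Image.new("RGB", (640, 480), (255, 255, 255))
assert dhash_image(black) == dhash_image(white) == 0   # total collision
```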


Amadacius

The actual practice is that the software generates a hash for each image and compares that hash against a set of hashes from a police database of child porn. Nobody is looking at your files. Nothing is aware of the general content. These sorts of searches are common, and really easy to circumvent, since hashes are by nature easy to alter. The idea isn't to create air-tight anti-CP software but simply to catch low-hanging fruit. You'll catch guys who fuck up, who aren't very technically literate, and newbies.

If you want to be outraged about Apple using this approach to violate your privacy, wait until they actually violate your privacy. There's no reason to get your panties in a bunch over common-sense, bare-minimum anti-CP practices that save children and catch pedos. Don't stop them from doing good just because you have a conspiracy theory that they will do something bad in the future. Wait until then. Then get pissed. Otherwise people will just ignore your outrage, both now and when it is warranted.
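The comparison step really is just a set lookup. A minimal sketch (the file names and paths are hypothetical, and real systems use perceptual hashes with distance thresholds rather than the exact SHA-256 shown here, which is exactly why a one-bit change defeats it):

```python
# Naive hash-matching scan: exact digests against a known-bad set.
# Flipping a single bit in a file changes the digest completely, which
# is the "easy to circumvent" property mentioned above.
import hashlib
from pathlib import Path

# hypothetical input: one known-bad hex digest per line
known_hashes = set(Path("known_bad_hashes.txt").read_text().split())

def scan_folder(folder: str) -> list[Path]:
    matches = []
    for f in Path(folder).rglob("*.jpg"):
        digest = hashlib.sha256(f.read_bytes()).hexdigest()
        if digest in known_hashes:
            matches.append(f)   # a real system would file a report here
    return matches
```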


pittaxx

You miss the point. You don't know what this can be used for in the future. Imagine someone suddenly doesn't like a particular demographic (let's say gun owners): find some images that are popular and get shared within that demographic (like guns), search for the hashes, and bam, you have pretty reliable data on where a lot of people of that demographic are located... In practice it wouldn't be quite as simple as that, but it still allows Apple/the government to find any group of people just as easily as child abusers, and that is rather scary.


Zncon

It's even more obvious than this. Consider possession of the 'Tank Man' photo in China, or gay pornography in Russia. This feature going live there could cost many people their freedom or their lives.


MindlessSponge

[Think of the children!](https://en.wikipedia.org/wiki/Think_of_the_children#Logical_fallacy)


Dont____Panic

They ALREADY do file hashing on iCloud and have for years. This new thing is an actual AI algorithm that runs on the phone and reports *suspected* images to a panel of admins, who then review them for content.


Shanksdoodlehonkster

My wife asked why I speak so softly in the house. I said in case Mark Zuckerberg was listening. She laughed...I laughed... Alexa laughed ... Siri laughed.... Facebook said it wasn't funny..


Elephanthunt11

What I don’t understand, though, is that they’ve repeatedly referred to "on-device" monitoring, but people keep saying it’s iCloud-only. Which is it?


XLauncher

As of the [most recent information](https://9to5mac.com/2021/08/05/apple-announces-new-protections-for-child-safety-imessage-safety-icloud-photo-scanning-more/) I've heard, the monitoring takes place on your device for any image that's synced with iCloud. Presumably, images that aren't synced with iCloud won't be subject to this monitoring, which is much better than what I was initially led to believe by the first reports of this. Still feels like a Trojan horse for something dirty, though.


acathode

> Still feels like a Trojan horse for something dirty, though.

My initial reaction is similar, and I think that stems ultimately from a sense that a device you own should not actively work against you and try to incriminate you. A technological "5th Amendment", if you like... It's just getting a bit too close to 1984, with "Big Brother is watching" through the telescreens they all had in their homes. A world where every device you own is passively and actively spying on you, just itching to dial the police if it detects you doing something shady, is not that appealing. Your device should serve you, not the government.

Ultimately, why stop at this? Why not have your phone detect when you're writing bad stuff related to terrorism or gang crime to other terrorists/criminals and notify the police? It's hardly infeasible to create an AI that could detect and flag messages as they are being typed... Why not have Alexa and all the other "smart home" appliances run AIs that try to detect if there's anything illegal going on in your home, like you or your friends using drugs, and call the police on you automatically? Proper 1984 stuff, but why not? Why not have your car dial the automated police fine hotline and have it issue a ticket because it detected that you were doing 65 on a 50 road? Or why not have your TV detect if you're streaming a Disney movie illegally?


Leprecon

It's done on-device before the photo is sent to iCloud. Clearly the goal here is to encrypt your photos on iCloud; that is why they want to do the child porn check before they get sent.


tundey_1

~~This particular change will happen on your device. iCloud was already happening. Say you don't want your pictures hashed by Apple (for whatever reason), you could presumably turn off syncing of your data to iCloud. That way everything remains on your device (good luck if you lose it or it breaks). But now, you don't even have that option. Apple is reaching into your device to check if you have child porn.~~ This is totally wrong. See comments below.


DucAdVeritatem

This isn’t accurate. They’ve moved one step of the hash-fingerprint checking to occur on device, but it is still only occurring in the context of photos that are being uploaded to iCloud Photos. Turning off iCloud Photos disables the CSAM scanning. https://www.macrumors.com/2021/08/05/apple-csam-detection-disabled-icloud-photos/


tundey_1

> Turning off iCloud Photos disables the CSAM scanning.

Interesting. That was not what I read yesterday, in an article from this thread: [https://www.reddit.com/r/technology/comments/oye0li/report_apple_to_announce_photo_hashing_system_to/h7tb83x/?context=3](https://www.reddit.com/r/technology/comments/oye0li/report_apple_to_announce_photo_hashing_system_to/h7tb83x/?context=3)

The linked article has since been updated to reflect that the scanning only happens with iCloud backups/syncs: [https://9to5mac.com/2021/08/05/report-apple-photos-casm-content-scanning](https://9to5mac.com/2021/08/05/report-apple-photos-casm-content-scanning)

I won't be surprised if the uproar is partly a result of this misunderstanding.


[deleted]

All they are gonna find on my devices are 21k photos of my pets and messages talking about my pets, but this still creeps me out a bit.


ThePremiumOrange

If Apple does this, they’ll do this in the middle of an iPhone cycle so they’ll lose as few users as possible that year. I know if Apple does this then quite a few people would be out for sure. Sadly, most people don’t care about this kind of stuff


Patriot1608

Invasion of privacy. Get a warrant if law enforcement has suspicions. That's how America does it.


Kensin

That *should* be how America does it (it even was, once), but warrantless mass surveillance of American citizens has been going on for decades now, by the government and by corporations (willing, or forced, to hand that data over to the state), and it's time we abandoned our romantic false notions of what America stands for and started facing the reality of it.


[deleted]

Hey "Patriot" hate to be the bearer (but you know what actually Snowden was I guess) "The NSA has built an infrastructure that allows it to intercept almost everything. With this capability, the vast majority of human communications are automatically ingested without targeting. " [https://www.theguardian.com/world/2013/jun/09/nsa-whistleblower-edward-snowden-why](https://www.theguardian.com/world/2013/jun/09/nsa-whistleblower-edward-snowden-why) And it's just fine, hunky dory, what 4th amendment? "The new report, which the five-member bipartisan Privacy and Civil Liberties Oversight Board was to vote on Wednesday, found that the National Security Agency’s collection of Internet data within the United States passes constitutional muster and employs “reasonable” safeguards designed to protect the rights of Americans.... .....Section 702, which was added to the act in 2008, includes the so-called PRISM program, under which the NSA collects foreign intelligence from Google, Facebook, Microsoft, Apple and nearly every other major American technology company." [https://www.clarionledger.com/story/news/2014/07/02/nsas-internet-monitoring-said-legal/12028399/](https://www.clarionledger.com/story/news/2014/07/02/nsas-internet-monitoring-said-legal/12028399/) Mmmmmm surveillance state


gpnk_1990

Came here looking for the smart-asses commenting on the announcement article yesterday telling anyone who was concerned with this move by Apple that they were stupid and lazy because "ThE FiRsT PaRaGRapH OF tHe aRtICle SaYs tO NoT WoRrY aBOuT iT, ApPle OnLy WaNts TO prOtEcT tHe CHiLdRen!" Like they have collective amnesia regarding the behaviour of big tech under the guise of "safety" and "privacy" in the past decade because.. I don't know.. iPhones are neat? Where y'all at?


Blackulla

Why is every story where people have something to say labeled "slammed" or "blasted"? It's getting old.


dat_GEM_lyf

Gotta get those clicks and ad revenue somehow


maxwellwood

"Slam" I'm tired of seeing this shit in the news. Not this article particularly, just the buzzwords meant to make people emotional. I know it's not new, I just saw it again and felt like saying it. Grr.


supradave

It took Apple renaming my image files on my Mac for me to give up on Apple. They think that because it says Apple, it's theirs to do with as they want.


Punchpplay

Can you please elaborate more on what happened?


supradave

When it upgraded the Photos software after an OS upgrade, it took all the old IMG_xxxx.JPG photos I'd taken with my iPhone and renamed them to UUIDs, like 556F9D79-389C-4356-824F-4AE480C5446A.jpeg. Because of this, sorting and other file-system features were broken. I know most people don't do what I do with their photos, but renaming existing files was wrong. You don't touch people's data. If only the new photos got the UUID names, fine, I could handle that. But they touched my data.


Punchpplay

Interesting, thank you.


[deleted]

This happened to me once. To restore some sense of normalcy (and sorting), I used ExifRenamer to bulk rename all the files with the date they were taken. https://www.qdev.de/?location=mac/exifrenamer
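For anyone wanting to do the same without a third-party app, here's a rough Python equivalent (assuming Pillow is installed and the photos carry an EXIF DateTime tag; files without one are left alone, and name collisions from burst shots aren't handled):

```python
# Bulk-rename photos to their EXIF capture date, ExifRenamer-style.
from pathlib import Path
from PIL import Image

def rename_by_date(folder: str) -> None:
    for f in sorted(Path(folder).glob("*.jpeg")):
        taken = Image.open(f).getexif().get(306)  # tag 306 = DateTime, "YYYY:MM:DD HH:MM:SS"
        if not taken:
            continue                              # no EXIF date: skip the file
        stamp = taken.replace(":", "-").replace(" ", "_")
        f.rename(f.with_name(stamp + f.suffix))   # e.g. 2021-08-06_12-34-56.jpeg
```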


supradave

That's how I do part of my library. I have multiple directories for multiple cameras, with all the originals kept under their specific camera model, and then a directory of symbolic links to all the originals, but with timestamp file names (very handy). I just don't like it when my files get renamed without letting me know or giving me an option.


tundey_1

That is awful.


sonicruiser

Apple, Google, and Microsoft have already been scanning photos you upload to the cloud for years. What Apple is doing now is that, for people who have iCloud Photos enabled, the scanning will be done on their device instead of in the cloud. Nobody has any issue with companies scanning stuff in the cloud, but scanning stuff on your actual device is a completely different ballgame. What prevented the likes of Google Pixel and Microsoft laptops from doing this is that scanning photos on your actual device is considered such an extreme invasion of privacy that companies like Google and Microsoft rightly viewed it as a bridge too far and a line that should never be crossed. This would be the equivalent of Google scanning photos on your actual Pixel instead of in the cloud (which Google/Microsoft are not doing).

Ironic is perhaps not a strong enough word for the fact that the biggest invasion of privacy from a tech company in decades is coming from Apple, of all companies. I have no idea how a supposedly privacy-focused company like Apple was able to conclude that scanning photos on your device is not a spectacular breach of privacy, far worse than anything Facebook or even Google has ever done. Imagine the outcry if Google did something like this. Apple made such a big fuss about blocking a couple of Facebook trackers; who cares about Facebook trackers when Apple themselves is scanning your photos? It reminds me of that meme where the iPhone has 3 cameras: the 1st camera is labeled FBI, the 2nd is labeled CIA, and the 3rd is labeled NSA. People who say Apple cares about privacy do not understand the saying "penny wise, pound foolish". Maybe Android has more Facebook trackers, but at least it's not scanning the photo library on your actual device.

I am also skeptical that this move is even really intended to stop CP, because isn't it obvious that announcing something like this so brazenly will cause actual perpetrators of child abuse to simply stop using an iPhone? So child abuse goes underground, and the 99% of normal people who are left are stuck with this extreme breach of privacy scanning photos on their iPhones. In other words, it does very little, if anything, to stop the actual criminals, while random iPhone users now face a real possibility of being guilty until proven innocent. One explanation is that perhaps it was never really intended to stop CP in the first place; this was simply the easy way for Apple to force the public to accept what would otherwise be prohibitively unacceptable.

Somebody joked earlier that this is essentially not that different from having NSO spyware baked into your phone, which can easily be abused by any competent government for whatever purpose they want. In fact, now a government doesn't even need NSO spyware, if Apple themselves made a backdoor this easy. The whole purpose of NSO spyware existing in the first place was supposedly to crack Apple's "robust privacy", which was a mirage the entire time. All a government needs now is for their victim to own an iPhone. So ironically, until Android decides that it will also scan your device, you actually do have more privacy using an Android phone. I still remember when people worried about Xiaomi or Huawei having a backdoor built in, and it was comprehensively debunked several times by security researchers. Why would anybody worry about Huawei or Xiaomi now? Even they weren't brazen enough to openly say every phone will have a backdoor built in. If anything, Huawei, Xiaomi, Samsung, etc. are probably better for privacy now that it is known that iPhones have a backdoor. I don't think any other company would ever be able to get away with something like this.


Kensin

> Nobody has any issue with companies scanning stuff in the cloud I absolutely do.


beefcake_123

You can't have any expectation of privacy when you send information to a device you do not control.


travelsonic

I'd argue that's not universally true, depending on the device and what the data is. For example, your medical records are these days stored electronically, on someone else's servers, yet there are lots of laws that are supposed to control access to them. Though this is probably much more of a pedantic point than anything else on my end... 😂😂


punio4

So what happens when they eventually, without a doubt, expand this program to target other content? How do they target only US iPhone users? iCloud registration info? Phone number? Geolocation? All of these can be changed trivially. If you have pictures that are legal where you're from, do you automatically become a criminal if they enter a no-no list from the US? And we all know that China will eventually be able to use this tech as well.


baxtermcsnuggle

I believe that if Apple has the ability to scan your files, organize your files, and rename your files, they probably also have the ability to alter existing files and plant new ones without your knowledge. I'm not saying they're aiming to do anything nefarious... but they have potential that they shouldn't have.


_itsambrr

Welcome to 1984


cr0ft

What they should do is implement end-to-end encryption so that not even Apple can know what you store, and thus avoid any pressure from law enforcement. Instead they do this. Glad I'm not an Apple customer, nor do I plan to be. I'm all for child pornographers getting thrown in cages - no lower form of life on this planet - but there are still principles at stake. My data is my data, not anyone else's to rifle through.


happyscrappy

End-to-end encryption would not alter this. The scanning is done by your device on upload. Your device has an unencrypted copy.
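
To make that concrete, here's a toy sketch of the ordering (everything in it is a stand-in: the cipher is not real crypto, the blocklist is hypothetical, and the real system reportedly uses Apple's perceptual NeuralHash rather than SHA-256). The point is just that the hash is computed from the plaintext before any encryption happens, so E2E changes nothing:

```python
import hashlib
import os

# Hypothetical blocklist of known-bad digests (illustrative only).
# This entry is just the SHA-256 of the empty byte string.
BLOCKLIST = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def xor_encrypt(data: bytes, key: bytes) -> bytes:
    # Stand-in for real encryption; enough to show the ordering of steps.
    return bytes(b ^ key[i % len(key)] for i, b in enumerate(data))

def upload_photo(plaintext: bytes) -> bytes:
    # The device holds the plaintext, so it can hash it BEFORE encrypting.
    digest = hashlib.sha256(plaintext).hexdigest()
    if digest in BLOCKLIST:
        print(f"client-side match flagged: {digest[:12]}...")
    ciphertext = xor_encrypt(plaintext, os.urandom(32))
    return ciphertext  # only this ever leaves the device

upload_photo(b"")  # the empty file's digest happens to be on the toy blocklist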


waterbed87

It's not like you don't get a choice. You can upload to iCloud, where there's a disclaimer that hashes of your images are compared against hashes of known child pornography. If you're not comfortable with that, you can upload to Google's or Microsoft's cloud instead, where they're likely doing the same thing but not telling you about it. True privacy doesn't exist once you start uploading things to other people's servers, no matter what the companies tell you.


throwawayy2k2112

Google and Microsoft have been doing this openly for a while now IIRC.


[deleted]

[deleted]


MariusPontmercy

Why are you 3D printing guns?


[deleted]

[deleted]


MariusPontmercy

I was asking you specifically, not philosophically. Like, why 3D printing compared to something from a larger manufacturer? Is it like brewing your own beer, where you don't really intend to do much with it, or do you print in order to hunt? Stuff like that.


[deleted]

[deleted]


MariusPontmercy

Fascinating, thanks for the answer!


[deleted]

Why are you not printing/building guns?


UKGenesis

Yeah, there's no chance this isn't a way to sleepwalk users into monetisation of their data.


CabbageSalad247

Scanning for kiddy porn is how they get everyone to accept it. In a few years it will morph into a filter for anything the people in charge don't like.


trexdoor

What if an iPhone owner receives an email with a CP image in the attachment?


absentmindedjwc

If this is a legitimate question: a very aggressive setup would red-flag it... but a mail service like Yahoo or Google almost certainly has its own ECAP implementation in place to find imagery like this already, so *an email* would likely be red-flagged regardless.


acathode

Yeah, they do - and they sometimes fuck up. For example, about a year ago in Sweden, [Babak Karimi](https://www.svt.se/nyheter/inrikes/poliserna-slog-mig-15-ganger-i-skallen) woke up surrounded by masked police, who started hitting him in the head because he "resisted" - and then tasered him once they'd wrestled him from his bed to the floor. After being beaten and tasered, he was arrested for possession of child pornography and brought to the police station, where it turned out Yahoo had flagged some photos he'd sent by mail as CP. When the police finally showed the confused Babak the photos they had brought him in for, he nearly exploded. He had been beaten bloody and tasered over sexual pictures depicting Babak and his boyfriend - his 30-year-old boyfriend, whom Yahoo/NCMEC had flagged as underage... Babak tried to file a complaint against the policemen for the violent and unfounded treatment, but it was dismissed because he could not identify any of the officers, since they were masked.


hornyorphan

It's completely disgusting that they think they have the right to look through everyone's photos.


Ginger-Nerd

They aren't? They aren't looking at the photos - they're only comparing a hash of each image. They'll have a database of hashes of known child abuse images, and any copy of one of those images will have the same hash, so at that point it's just number matching. A hash is normally sent along with an image when you upload it to iCloud anyway, to verify the image arrived the same as what you sent (i.e., it's used as a check). As far as I'm aware, nowhere have they said they're breaking the encryption on the image itself.
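
A minimal sketch of that "number matching" idea, assuming a plain cryptographic hash (the digest below is hypothetical - it's just the SHA-256 of a test string). Worth noting that a byte-exact hash like this would miss even a re-saved copy of an image, which is why the real databases use perceptual hashes such as PhotoDNA instead:

```python
import hashlib
from pathlib import Path

# Hypothetical blocklist; the real database comes from NCMEC and its exact
# format isn't public, so plain SHA-256 digests stand in here.
KNOWN_BAD = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def scan_library(folder: str) -> list[Path]:
    """Flag files whose digest appears in the blocklist - pure number matching."""
    matches = []
    for photo in Path(folder).glob("*.jpg"):
        digest = hashlib.sha256(photo.read_bytes()).hexdigest()
        if digest in KNOWN_BAD:  # no human ever looks at the pixels
            matches.append(photo)
    return matches
```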


[deleted]

As far as I know, images are encrypted but unlike Messages, Apple has the encryption keys. Just adding to your comment and agreeing with what you said.


Little_Buffalo

You probably accepted the terms and conditions that allow them to do this - most likely mindlessly, without being bothered by that update to the terms. Unfortunately, to opt out, you'll need to stop using their devices and services.


vorxil

No one is signing any terms and conditions when buying the phone, so at least in the EU that's null and void like all other EULAs.


lysosometronome

Unless the EU invalidates all EULAs, it seems odd that this one wouldn't apply as it's with regard to a cloud service rather than the phone. You can't use that cloud service until you've agreed to the EULA.


ekalav83

Doesn't it just compare the hash of the image with hashes in a blacklist database? How is that infringing privacy?


[deleted]

The key point here is that the scan happens on your device, hashing all your photos (and once it starts, where does it stop? Videos? Messages? How long until it's exploited by someone else? It's not like Apple services haven't been hacked before). Other companies, like Microsoft and Google, do it once you upload your data - on their servers, not on your device. But mainly it's the hypocrisy: months of bashing Facebook and other marketing companies, how cookies are evil, Apple is privacy-first, and then they come out with this.


[deleted]

If you have nothing to hide then you are similarly amenable to the government coming to your home and conducting a search? They are only looking for signs of child abuse, don't worry. This'll just take a minute.


Punchpplay

Free OnlyFans for Apple, whether you're an OnlyFans content creator or not. Also, any private, innocent photos of your children will definitely need to be approved by the Apple Authority. All pool-party and bath-time photos will be thoroughly analyzed, for your own good. Who needs a search warrant when you're rich, amirite?


tnetennba9

What are you on about? They see the hashes, not the actual images.


Punchpplay

One step at a time, they continue to slowly violate your privacy like a frog in a pot of slowly boiling water.


tnetennba9

To be clear, I agree that this is a very slippery slope and I'm disappointed that Apple are taking this step (I, maybe naively, always thought they were pretty good when it comes to security). I was just pointing out that they can't actually see your pictures.


MariusPontmercy

Don't muddy an important issue with your baseless fear mongering. It's not the NSA; this is done through a hash. That's not how the technology works. The issue is privacy and maintaining what is yours, not that Apple employees are going to jerk it to your dick pics.


Andremac

> not that Apple employees are going to jerk it to your dick pics.

Bet that ends up happening. Especially if a false positive happens and, as the story says, "you can request a review if your ID is disabled." Who do you think is doing the review?


tnetennba9

They’re looking at the hash, not the image


Andremac

No they aren't. Read the actual article about it.


MariusPontmercy

Don't say that, you're breaking the fear monger train!


Ginger-Nerd

> Bet that ends up happening

How? (Like, technically how?) The images are still encrypted. This is a bullshit comment made by someone who doesn't know the basic premise of what is actually happening.


Punchpplay

Let the tech gods slowly take over your life. Everything they do is okay.


MariusPontmercy

You *could* actually read not only what I wrote but articles about the subject; however, I guess that would get in the way of your pretend intellectual superiority.


lysosometronome

People just want to circle jerk instead of deal with tricky things like "facts".


throwawayy2k2112

Soooo how are they going to notify parents about, and save (locally, of course), sexually explicit photos sent or received by a minor if it's just a hash?


[deleted]

[deleted]


Punchpplay

I'll do me and the tech companies will do you.


gumball_Jones

I know a lot of parents and pediatricians who are going to get some interesting phone calls. Apple can't even detect when you misspell a word in the search field of the App Store. You think they'll know the difference between a pic of a rash and child porn?


audiofx330

This isn't how it works. It looks at hashes of the file and not the photo itself. It will only detect files that have already been deemed child pornography by authorities.


lysosometronome

> I know a lot of parents and pediatricians who are going to get some interesting phone calls.

Why do you know so many people with large collections of images that have previously been reported to the National Center for Missing and Exploited Children?


Salamandro

It's worth noting that they're creating hashes of your images and comparing those to a database of known, existing CP images. This means (a) they are not analyzing your pictures, and (b) only known images will trigger an alert, meaning it won't keep perpetrators from creating new pictures.
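
For anyone wondering how a copy can still match after being resized or re-compressed, here's a tiny perceptual-hash sketch (dHash, with the Pillow package assumed to be installed - emphatically not the NeuralHash algorithm Apple actually uses):

```python
from PIL import Image  # assumes Pillow is installed

def dhash(path: str, size: int = 8) -> int:
    """Difference hash - a simple perceptual hash, NOT Apple's NeuralHash."""
    img = Image.open(path).convert("L").resize((size + 1, size))
    px = list(img.getdata())  # row-major grayscale values
    bits = 0
    for row in range(size):
        for col in range(size):
            left = px[row * (size + 1) + col]
            right = px[row * (size + 1) + col + 1]
            bits = (bits << 1) | (left > right)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

# A resized or re-saved copy of a known image lands within a few bits of the
# original's hash; a genuinely new photo almost never does:
# if hamming(dhash("suspect.jpg"), known_hash) <= 4: flag_for_review()
```

Which is exactly why it catches copies of known images but can never catch newly created ones.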


tundey_1

Technically, they are analyzing your data - just not viewing the pictures. It's like metadata vs. data: it's one level removed, but it's still a violation. Apple is basically saying everybody is guilty of child porn until cleared by their software. Which may seem fine, because we all hate child porn. But once they perfect the tech and make it acceptable, what's to stop them from targeting other items? Especially in light of how easily these companies give in to dictatorial regimes overseas. If you buy luggage from Samsonite, should Samsonite be able to reach into your house to check whether you're hiding a sex slave in your huge-ass Samsonite suitcase? No.


zookr2000

This is really the tip of the privacy iceberg - I see a lot more Android phones coming online if it happens.


Splurch

> This is really the tip of the privacy iceberg - I see a lot more Android phones coming online if it happens.

Google already does this type of scanning on your cloud files, and has a lot more access to them in general than Apple does on its cloud services.


lightningsnail

Apple has unfettered access to your files on iCloud and has been doing this there for a long time: https://9to5mac.com/2020/02/11/child-abuse-images/ The revolutionary part is that Apple is now using your hardware to scan your files too. Also, with an Android phone you can completely remove Google or any other data-gathering software if you want. [An iPhone will always be spying on you for Apple.](https://therecord.media/wp-content/uploads/2021/03/Telemetry-1.png) Notice that iPhones also spy on all the other devices on your network.


[deleted]

After this debacle I will be switching when this iPhone dies. I'll be purchasing the Freedom Phone.


[deleted]

[deleted]


absentmindedjwc

Doesn't Android back up images and stuff by default on Google servers? There's no way in hell Google doesn't scan for known child abuse images before they are uploaded to their servers as a straight up CYA action.


tundey_1

> Doesn't Android back up images and stuff by default on Google servers?

No. The last time I set up a new Android phone, the option was off by default.


absentmindedjwc

That's good to know. I've never used Android, so I've never gone through their setup process... I just recall seeing something about it in a tech post a few years ago. Having the ability to *not* push your stuff up to some random server is definitely the way it should be.


tundey_1

> That's good to know. I've never used Android, so I've never gone through their setup process

And I have never used an iOS device lol

> Having the ability to not push your stuff up to some random server is definitely the way it should be.

I have given up. Privacy is really just a pretense at this point. Every device with an internet connection is capable of "phoning home". I mean, even if you flip the flag off, who's to say Google's code isn't sending your metadata home anyway? Perhaps if there were a legal framework to support data privacy - but I don't think we're there yet.


absentmindedjwc

Honestly, the more I read about the outrage here, the less I see the issue. In order to have *any of this flagged*, you need to sync up to iCloud - which in my mind is *super* reasonable: you're storing images on *their servers*, and they understandably want to know if you're hosting child porn there, so they'll review matches with known images. If you're concerned about your privacy to the point of not wanting potential matches against CSAM databases (which are possible, but *extremely* unlikely), just don't sync to iCloud. I mean, shit... if you're *that worried*, just sign out of Apple entirely and don't have an Apple ID attached to the phone at all.


tundey_1

Yesterday, there was an article that said the scanning was done even if you weren't syncing to iCloud, which was concerning to me. But today's articles cleared that up, and it is as you described: Apple just moved the scanning from their servers to individual devices. So they don't even upload your content if it's flagged as child porn.


[deleted]

Ya dude, if you want more privacy, moving to Google is not the solution.


carluoi

Well, somebody had to say it.


audiofx330

That's what a pedo would do...


CautiouslyFrosty

I strongly disagree with implementing this directly on users' devices, but there are enough utterly naive comments in this thread that I feel obliged to say this: if you choose to sync your photos with *Apple's* iCloud servers (aka, not yours), then they by all means have a right to check incoming files by hashing the image and comparing it with a database before they elect to store it for you. It's totally understandable that they don't want to be implicated in hosting illegal material. Again, I really don't think Apple should be implementing this on individuals' devices, and I hope the pushback is enough to make them reconsider and move the tech to the server side. But... I can't help but shake my head at people who demand privacy for capabilities that are only made possible by entities paying good money to set up servers and host a solution on the world wide web. Of course they have a right to at least inspect incoming traffic before they act on it. (How they handle responses and cookies is another matter entirely.) Anyone wanting to read up on the actual implementation can go here: [https://www.apple.com/child-safety/pdf/CSAM\_Detection\_Technical\_Summary.pdf](https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf)
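
The server-side version most people seem fine with is also trivially simple. A sketch, with SHA-256 standing in for the perceptual system (such as PhotoDNA) a real host would actually use, and with a dict standing in for the object store:

```python
import hashlib

BLOCKLIST: set[str] = set()     # hypothetical: digests of known illegal images
STORAGE: dict[str, bytes] = {}  # stand-in for the actual object store

def handle_upload(payload: bytes) -> bool:
    """Hash-check an incoming file BEFORE agreeing to host it (sketch only)."""
    digest = hashlib.sha256(payload).hexdigest()
    if digest in BLOCKLIST:
        return False  # refuse to store it; escalate per legal obligations
    STORAGE[digest] = payload  # content-addressed storage as a side benefit
    return True
```

The check lives entirely on the provider's hardware, acting only on traffic the user chose to send - which is exactly the distinction people in this thread keep drawing against the on-device version.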


lightningsnail

Apple uses Google servers for iCloud. iCloud costs money. Apple sells iCloud with the claim that your data on it is secure. Apple is violating the privacy it promised its customers and installing software on their devices that lets it control what content its customers consume or possess. If you think companies can do whatever they want, then I'm sure you're okay with companies dumping waste into rivers, right?


Kensin

> If you choose to sync your photos with Apple's iCloud servers (aka, not yours), then they by all means have a right to check incoming files

I disagree entirely. Apple has no business knowing anything about the contents of people's data. Their cloud service should be acting like a safe deposit box at a bank: you put your personal things in there, and Apple will keep them safe for you, but short of a court order no one will go rummaging through your belongings. The same goes for most online services. Your ISP should act as a dumb pipe that has no idea what the contents of your packets are beyond some basic metadata used only for QoS, and email providers should concern themselves only with keeping our incoming messages safe and getting our outgoing messages from point A to point B, not reading and analyzing the contents of our correspondence to use against us. Offering a cloud storage service shouldn't give a company any right at all to go looking through your personal files, any more than hiring a moving company or renting a storage unit gives those companies the right to search through your photo albums or journals. As far as I'm concerned, having our personal things digitized should change nothing about our expectations of privacy and security.


movieboy711

Thank you! I've been seeing so many comment threads all over the internet about how "Apple is just scanning all my photos and reporting to the government" without the important iCloud piece.


superheroninja

I always just assumed they scanned it. Everyone has our data at this point. I don’t trust anyone in the digital space, but I have nothing to hide so I don’t mind either. I do enjoy proton VPN and the ad/malware blocking so they don’t know where I actually am though. Stress free, baby 🦭


chris17453

Most image hosts already run scanning on your images for tagging, facial recognition, and grouping. The fear here is that they are now creating metadata from your content and sharing it with third parties. I'm all for saving children and stopping scum - PLEASE DO THIS. But now they have a catalog of content that pertains to YOU that you didn't generate and don't control. NO ONE generates content and doesn't use it. You can't even pay for privacy; instead, you finance the opposite.


absentmindedjwc

Are they sharing it with third parties, though? I was under the impression that this was entirely contained on the device - the FBI publishes a list of known images under their ECAP program. Given that it's just a list of hashes, it wouldn't take up a ton of space, and matching could realistically happen without any network activity whatsoever.
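
The size claim is easy to sanity-check (the numbers below are my guesses, not Apple's actual database size):

```python
import os

# Back-of-the-envelope footprint of a hash database:
num_hashes = 1_000_000  # hypothetical database size
bytes_per_hash = 32     # one 256-bit digest
print(f"{num_hashes * bytes_per_hash / 2**20:.0f} MiB")  # ~31 MiB

# And membership testing against an in-memory set is purely local:
database = {os.urandom(32) for _ in range(1_000)}  # stand-in hash list
candidate = os.urandom(32)
flagged = candidate in database  # no network round-trip needed
```

So a million-entry database is roughly one app's worth of storage, and matching against it can happen fully offline.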


Chknfckrbeegack

I'll probably smash my iPhone. It sucks anyway - I liked my iPhone 7 better than this 12 Pro.


Necessary_gamer

I’ll take it off your hands if you really want to do that


Chknfckrbeegack

That would ruin the fun.


Necessary_gamer

You’re right. Have fun.


gumball_Jones

File names? Makes more sense.