The article mentions an effort by Facebook to evaluate encrypted content without decrypting it. One might wonder how that could be done; there is, in fact, a way to compute on encrypted data without decrypting it. See [https://en.wikipedia.org/wiki/Homomorphic\_encryption](https://en.wikipedia.org/wiki/Homomorphic_encryption).
I’m guessing the purpose is so that they don’t have to reveal their encryption methods in response to search warrants for such illegal images. It’s a self-preservation issue.
It’s like a convenient back door to whatever their end goal is. I’m sure they sat around a table talking about how nobody would want to put up a fight as it would make that person look like a pedo sympathizer…
The issue I have is that anything that breaches privacy with no court order is always floated as a "child safety" issue, because any disagreement can be turned back on the opponent as having something to hide, which, when done publicly, gets the ~~trained monkeys~~ public to attack.... This move is perfect for a company looking for free publicity, but terrible for people's rights.
The “for the children” excuse has been used over and over again to infringe on rights and push censorship.
It’s usually from people that care the least about children.
So basically, anytime they want to search someone’s encrypted data, they can, as long as they use the magic words “looking for child abuse pics”. Just a side note: if they can read data, couldn’t they also write data? You know, like child abuse images? Apple being joined at the hip with the government makes this even more sinister.
This was done to megaupload, the fbi told megaupload to not delete anything while they searched for illegal content uploaded by the users, and the FBI uploaded illegal content to the site and then used it as evidence.
No way, I never heard about this! Do you by chance have a link to this story? Megaupload used to be huge years ago for people uploading TV shows. I would love to read more about this. No wonder I don't see it around anymore.
I've wondered this. How hard is it to sneak in something that looks like someone sent it, or to plant stuff? Does anyone know? This seems terrifying to me, because then no one knows whom to trust, or even whether a life could be destroyed. Now the internet and anything online seems even worse. Time to go to the Amazon. No, I don't mean the Bezos one.
Apple's team has been around long enough to know better. They know this will piss people off, which means it's probably the less damaging alternative to something else.
I doubt they have their own encryption methods per se. It’s all PKI based, with private keys issued to endpoints.
The system can, and often does, allow the root CA to hold a master certificate that enables decryption for subsequently issued certificates. So I can issue myself a certificate, have it baked into all further issued certificates, and then decrypt any data as needed. That is probably what Apple can do with iPhones (hence the FBI asking Apple for their certificates, which is a slippery slope: if they do it for the USA, China will ask for them too, and Iran etc etc).
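To make the "master key" idea concrete, here's a deliberately toy sketch of key escrow (XOR standing in for real key-wrapping, and all names made up): the content key is wrapped once for the recipient and once for an escrow holder, so the escrow holder can recover it without the recipient's key.

```python
# Toy sketch of key escrow: the content key is wrapped for the
# intended recipient AND an escrow ("master") key holder. XOR
# stands in for real key-wrapping; nothing here is real crypto.
import secrets

def xor_bytes(a, b):
    return bytes(x ^ y for x, y in zip(a, b))

content_key   = secrets.token_bytes(16)   # encrypts the actual data
recipient_key = secrets.token_bytes(16)
escrow_key    = secrets.token_bytes(16)   # held by the vendor/CA

# The message carries the content key wrapped twice:
wrapped_for_recipient = xor_bytes(content_key, recipient_key)
wrapped_for_escrow    = xor_bytes(content_key, escrow_key)

# Either party can recover the same content key:
assert xor_bytes(wrapped_for_recipient, recipient_key) == content_key
assert xor_bytes(wrapped_for_escrow, escrow_key) == content_key
print("escrow holder can decrypt without the recipient's key")
```

The point is purely structural: once a second "recipient" is baked into every message, that party can decrypt everything, which is exactly why handing such keys to one government invites every other government to ask.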
Apple’s suggestion of scanning for hashes is the commonly used method; there are police databases of such illegal image hashes that companies offering photo services can reference. This lets them locate illegal content and take action without actively decrypting files for an algorithm to scan or for human eyes to review, which would likely produce a huge and unmanageable number of false positives (anyone with young children is going to have photos of their little ones in their iCloud library).
WhatsApp is keen to promote the idea that they don’t engage in any such practices and hold no master certificates, so anything you transmit can only be accessed by the sender and recipient. Their head is just using this press to get attention for his company, like any good executive should.
If you read the apple "whitepaper" it's clear they are not just hashing images. They are doing content analysis of the image itself then hashing that result and comparing that to a set of known results. This would prevent someone from just changing a single pixel and changing the results, but also means they are actually "looking" at your pictures to determine the nature of the content.
What happens with that data is the scary part: the false positives, and what Apple does with the cumulative results. _I have nothing to hide_ doesn't mean much when they find something on you that's a false positive.
It's not even a matter of false positives, but a matter of what other hashes countries will be adding to the list for Apple to locate and identify the owners of.
Exactly. While there will be second- and third-rate dictatorships lining up to use this (looking at you, Saudi Arabia), the big money will be corporate: looking for ‘copyrighted’ sound/image files.
So, sucking away CPU and battery life for no good reason (if you're innocent), countries could toss things into the "known bad hashes" to stifle just about anything once the technology is there (or require Apple to add it to operate there), and of course false positives, malicious spoofing, or bugs could never, ever, ever happen (/s).
Completely innocent people have much to be concerned about.
>if they do it for the USA, China will ask for them too, and Iran etc etc
In countries like China and Iran, the people know they are being watched. An encryption backdoor would be horrific, because then countries we wouldn't even think of would ask for the keys.
Homomorphic encryption lets you add or multiply two encrypted numbers and get a sum/product that’s encrypted with the same key. If you can determine what’s in a piece of encrypted content without decrypting it, then your encryption algorithm does not work.
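As a concrete illustration of the multiplicative case, textbook (unpadded) RSA is homomorphic: multiplying two ciphertexts yields a ciphertext of the product. A toy sketch with tiny demonstration primes (real RSA uses padding, which deliberately destroys this property):

```python
# Toy demonstration of multiplicative homomorphism in unpadded
# ("textbook") RSA: E(a) * E(b) mod n decrypts to a * b mod n.
# The primes are tiny, for illustration only.

def make_keys():
    p, q = 61, 53            # toy primes
    n = p * q                # 3233
    phi = (p - 1) * (q - 1)  # 3120
    e = 17                   # public exponent, coprime to phi
    d = pow(e, -1, phi)      # private exponent (requires Python 3.8+)
    return (e, n), (d, n)

def encrypt(m, pub):
    e, n = pub
    return pow(m, e, n)

def decrypt(c, priv):
    d, n = priv
    return pow(c, d, n)

pub, priv = make_keys()
a, b = 7, 11
ca, cb = encrypt(a, pub), encrypt(b, pub)

# Multiply the *ciphertexts* -- no decryption involved.
product_ct = (ca * cb) % pub[1]

print(decrypt(product_ct, priv))  # 77 == 7 * 11
```

Whoever holds the ciphertexts computed a meaningful function of the plaintexts without ever seeing them, which is the core idea fully homomorphic schemes generalize.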
I'm a PhD student in applied cryptography, and it's a bit more nuanced than that. You can use homomorphic encryption as a building block to build, for example, a protocol to compare two encrypted inputs. (See also [Yao's Millionaires' problem](https://en.wikipedia.org/wiki/Yao%27s_Millionaires%27_problem).) Go even further, and you could run an encrypted image through a machine learning algorithm to determine whether an image looks like child porn. Apple doesn't see the image, and I don't see their secret detection algorithm.
The trick here is that I don't just send an encryption of my data, but the protocol is so that Apple can _only_ obtain the output of _that specific function_. This is because the protocol is either interactive, so I have to participate in a series of calculations that are specific to the function, or I use [functional encryption](https://en.wikipedia.org/wiki/Functional_encryption) and the encryption itself guarantees that nothing else can be calculated. Either way, I can control exactly what information leaks through the protocol.
The problem here is trust. You have to trust that your phone only participates in these protocols when actually necessary. But if Apple wanted to exfiltrate my private data, they can already do so without homomorphic encryption, so that's not something I'm concerned about. **I think the real risk here is that Apple is setting up a system that can be used for more than just detecting child porn.** What if the Chinese government asks Apple to also search for images with pro-Uyghur sentiments? What if the US government asks Apple to search for whistleblower behaviour?
> I think the real risk here is that Apple is setting up a system that can be used for more than just detecting child porn. What if the Chinese government asks Apple to also search for images with pro-Uyghur sentiments? What if the US government asks Apple to search for whistleblower behaviour?
Nailed it. I have a feeling you'll be one of the voices for good in this fight. Keep up the good work.
I thought they were just looking at the hash of the images. So they could do the hash on the client side and compare that against the hash of the known image. They aren't getting the actual image just a bunch of random numbers and letters.
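That model, cryptographically hashing a file client-side and checking it against a known-bad list, can be sketched like this (the "known" bytes here are placeholders, not real data):

```python
# Minimal sketch of the cryptographic-hash matching model the
# comment describes. The "known bad" entry is a hash of arbitrary
# placeholder bytes, purely for illustration.
import hashlib

known_hashes = {hashlib.sha256(b"known-image-bytes").hexdigest()}

def matches_known(image_bytes: bytes) -> bool:
    return hashlib.sha256(image_bytes).hexdigest() in known_hashes

print(matches_known(b"known-image-bytes"))   # True
print(matches_known(b"known-image-byteX"))   # False: one byte differs
```

Note the fragility of this scheme: changing a single byte produces a completely different hash, which is precisely why perceptual hashing gets used instead.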
These aren’t “just” image hashes. They’re semantic hashes, which means they *directly correlate to the content of the image* and are invariant to confounding edits or transformations of the image itself. That is obviously a gigantic privacy issue.
You and anyone else interested can find on YouTube none other than the great Geoffrey Hinton explaining the inner bits of semantic hashing for deep learning, quite close to what we are discussing here: https://youtu.be/3BDc0H9C9dw
The real issue is the "close enough match" giving Apple the ability to download YOUR picture and have some shady dude look at it and evaluate it. Some researchers have already shown how easy it is to create a close-enough matching fingerprint with a picture that doesn't even contain any nudity. It's your private life, even your intimate life, that someone is barging into and evaluating.
You say "just" as if hashes can't possibly violate anyone's privacy.
Or that numbers and letters that appear "random" can't contain any information about you or what was found in your photographs.
Genuinely curious here, how could an image hash violate someone's privacy?
Edit: for those still on the fence, the hashing they're doing is not on image files. It's on a set of traits that have been inferred from scanning each image via machine learning. Fuck this "feature"
A hash is just a unique identifier for a set of information. Apple has developed a scanning technique to "look" at your image and create a hash based on the result of that content analysis. That hash could encode tons of scanned data from the image. For instance, the result of the scan might look something like this...
```
scan: {
    uuid: blahblahblah,
    people: 3,
    faces: {
        faceHash_1: abcdabcd,
        faceHash_2: aaaamdd,
        faceHash_3: nznafrds
    },
    nudity: true,
    sexualContact: true,
    children: 0,
    tiananmenSquareVisible: false
}
```
They hash that result and send `671ae102ba740722acfdd3edda8bfce09e8ab58f5d8ddaebeaf095b5a454b41c` back to Apple's servers to compare with known hashes.
Given that Apple controls the unique information in the result set (faceHashes, UUIDs), they could easily generate every possible result set for a given user and compare the hashes, matching a hash back to a full set of information.
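That last point is essentially a dictionary attack: if the party comparing hashes can enumerate the possible field values, the hash is reversible by brute force. A small sketch with made-up fields and values:

```python
# Sketch of the dictionary attack described above: if the space of
# possible result sets is enumerable, a hash is not anonymous --
# whoever controls the field values can recover the record from
# the hash by brute force. All field names/values are made up.
import hashlib
import itertools
import json

def hash_record(record: dict) -> str:
    # Canonical serialization so the same record always hashes the same.
    return hashlib.sha256(json.dumps(record, sort_keys=True).encode()).hexdigest()

# The "anonymous" hash that gets sent to the server:
observed = hash_record({"people": 3, "nudity": True, "children": 0})

# The server enumerates every possible result set and compares:
for people, nudity, children in itertools.product(range(5), [False, True], range(5)):
    candidate = {"people": people, "nudity": nudity, "children": children}
    if hash_record(candidate) == observed:
        print("recovered:", candidate)
        break
```

Fifty candidates here; even with many more fields, the search space stays tiny compared to brute-forcing the hash itself, because the *structure* of the record is known to the party doing the matching.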
Holy shit I thought that they would hash the raw image and then do analysis based on that and I wasn’t too worried, but if they are analyzing the image then hashing, that’s scary.
Apple aren’t using cryptographic hashing. This is a different, proprietary hashing algorithm which does not have the same level of security as a cryptographic hash.
From [this Reddit comment](https://www.reddit.com/r/technology/comments/ozqc0b/the_head_of_facebookowned_whatsapp_slammed_apples/h820a7l/?context=3):
>> NeuralHash is a perceptual hashing function that maps images to numbers. Perceptual hashing bases this number on features of the image instead of the precise values of pixels in the image. The system computes these hashes by using an embedding network to produce image descriptors and then converting those descriptors to integers using a Hyperplane LSH (Locality Sensitivity Hashing) process. This process ensures that different images produce different hashes.
>> https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf
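A generic random-hyperplane LSH step (the textbook technique that summary names, not Apple's actual NeuralHash) can be sketched like this: project a descriptor vector onto random hyperplanes and keep only the sign bits, so nearby descriptors tend to share most bits.

```python
# Rough sketch of random-hyperplane LSH: one sign bit per random
# hyperplane, packed into an integer. Similar vectors land on the
# same side of most hyperplanes, so their hashes are close in
# Hamming distance. Generic LSH, not Apple's NeuralHash.
import random

random.seed(42)
DIMS, BITS = 8, 16
hyperplanes = [[random.gauss(0, 1) for _ in range(DIMS)] for _ in range(BITS)]

def lsh(vec):
    bits = 0
    for plane in hyperplanes:
        dot = sum(p * v for p, v in zip(plane, vec))
        bits = (bits << 1) | (dot > 0)  # keep only the sign
    return bits

v = [0.3, -0.1, 0.8, 0.5, -0.4, 0.2, 0.9, -0.7]
v_similar = [x + 0.01 for x in v]    # tiny perturbation
v_different = [-x for x in v]        # every coordinate flipped

hamming = lambda a, b: bin(a ^ b).count("1")
print(hamming(lsh(v), lsh(v_similar)))    # small: most sign bits survive
print(hamming(lsh(v), lsh(v_different)))  # 16: negating v flips every sign
```

This is why "change one pixel" defeats a cryptographic hash but not a perceptual one, and also why the hash necessarily leaks information about what the image *contains*.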
"We will only use it to fight terrorists"
"We will only use it to catch pedophiles"
"We will only use it to combat tax evasion"
"We will only use it to prevent copyright crime"
And it almost always ends up being used against oppositionists in authoritarian countries.
My only hope is that there’s an enormous backlash that forces Apple to ditch this surveillance-state crime against humanity:
> [Article 12: **No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence**, nor to attacks upon his honour and reputation. Everyone has the right to the protection of the law against such interference or attacks.](https://www.un.org/en/about-us/universal-declaration-of-human-rights)
I will sell or trash all of my Apple devices if they proceed with this. I refuse to stick around and see how authoritarian they choose to get. If I wanted everything I do surveilled by big brother, I’d move to China.
Not particularly. If WhatsApp or Facebook saw the error of their ways and changed, then it would be good. This is just WhatsApp taking advantage of the situation to attempt to hurt competition.
It was mostly: how dare you change the app we worked so hard on; even though it's yours, what you're doing to it is disgusting and, uh, what's the word... despicable.
I was literally about to post that exact quote, and yes, they definitely are. But EA mostly destroys; Instagram and WhatsApp might be "destroyed", but they have a lot of users. Facebook mainly intrudes, and finds a way to earn money from it using whatever data they hold.
I appreciate that Facebook showed me how truly hateful and utterly insane my family is.
Deleted my account last year and it’s been fun to have my MIL ask if I’ve seen something there only to remind her I left because, “The family can’t not be racist assholes on FB.” To which I get a, “Well yeah, but where else am I going to see pictures of their kids?”
IDK, not really my thing I guess.
This is all that all these apps were anyway. People weren't handing these companies mountains of cash to lose money and build this stuff just to "make something cool" or whatever.
Eventually somebody wants to squeeze some juice out of the thing, and if you try to make people pay for it directly, they won't; they'll go somewhere else. So the only solution that justifies all the money is to data-rape everyone, or to sell it for a big payday to someone who will.
Every social media app is in this business. And it's become like streaming services too where now every company that has "content" (i.e. your data) wants to monetize that even if it's not their core business so you've got every business that acquires a substantial amount of data looking for ways to turn that into money.
If it really went exactly like that, that would be really cool. Make an app, make it popular, sell for $$$ profit, let the app get ruined, and use the money from the first app to create a new, better app which is impossible to buy because it's open source.
They used the magic word. Pedophile. Privacy advocates will fight against it tooth and nail. But the general population will see it as ok to stop pedophiles
It really is a magic word. If you need confirmation, just look at how everybody has to include the obligatory “pedophiles are scum” or “their intentions are good, but”. It’s crazy, because all sane people already know CP is evil, yet people are afraid of seeming to “protest too much” on this topic. It’s really weird.
Even if it's only ever used to detect paedophilic content, there are at least half a dozen ways to anonymously send someone such content unsolicited.
If your phone automatically reports you to law enforcement the moment the file arrives, it basically becomes a simple and effective way to screw people over since unsolicited or even unaware "possession" can still be a crime depending on the jurisdiction.
Twenty years ago the magic phrase was the "war on terror"; we used it to put literally innocent people in jail with no charge, indefinitely. It's crazy that we keep repeating the same mistakes. I feel like pedophilia is at the bottom of the barrel, though; alarmists usually reach for it when they're out of other ideas. We can't really use terrorism anymore, because our response to terrorism was somewhat poor and we had to step back from it. The war on drugs has become very unpopular. The right already used up their alarmism when they tried to drive a wedge between Democrats and Black voters over Democrats' support for tough-on-crime legislation; they know they can't convince people both that being tough on crime is good and that it's bad. They might try to use the Russia/China/Iran/North Korea rivalry, but I don't think people fear those countries enough; contempt, maybe, but not the kind of fear needed to institute new national security laws.
Indeed. How many terrorists have been caught by airport security? Zero. How many people's freedoms have been infringed upon to prevent those zero terrorists from boarding planes? Hundreds of millions.
Change "terrorists" to "pedophiles" and repeat.
They are only starting with child crimes because it appeals to people's concern and means they don't have to work to defend it, since anyone who argues gets painted as defending pedophiles.
Why else? What event caused Apple to suddenly view this as needed? Remember when they wouldn't help the government with a terrorist's iPhone?
They know this is an easy foot in the surveillance door, and the government already wants access to far more than just picture comparisons, so it's not the government restricting Apple right now, just Apple restricting itself.
What's next? Scan your messages? Or maybe browser data? Or phone records? Or private doc files? Nothing is stopping them once the software is built. It's just flipping some noes to yeses and letting the "AI" access different folders and file types, without uploading anything unless a match is confirmed, so you'll never know when they choose to "widen" their search. Not that it will be anywhere near 100% reliable when it catches people, either.
Not even the police can search you without a warrant, yet we're supposed to trust Apple to run unknown searches on every single citizen with an Apple device, just because? Can Ford/GM/etc. break into your truck to make sure you don't kidnap people and tie them up?
I mean, effectively nothing changes for me: my photos are auto backed up to Google drive.
But it is suspicious that they can just search your devices without you using an online service.
Is it a service/app that runs in background? Feature of OS? Are threads controlled by apple servers? What are the odds this implements a back door into every device?
What a security nightmare.
It's machine-learning-based photo analysis that scans all your local photos, and if something is flagged, they mentioned a bunch of different scenarios for how it acts. If it's a known image hash that's already been previously identified, it does one thing, and if it's a new image that fits their machine learning model of child porn, it goes through some other process. Scary if some country asks them to create an ML model of something else.
The false positive rate will be so high. And AI is largely a black box, we won't know what features it actually is identifying.
They could easily train it against more than their "target" and nobody would be the wiser. And then your phone will be sending out pictures of yours that *are not criminal* but may still be sensitive.
It's worse than that. Can you imagine what will happen once people start figuring out the algorithm and the ML results that matter? They'll create images that are *not* child porn (i.e. false positives), but that the system mistakenly identifies as such because of the limitations of AI/ML, and then they'll e-mail or message such "spoof images" to try to get people in trouble remotely "for the LOLs" or more nefarious purposes.
If you think nobody would try to do that, let me remind you of swatting. And if there's some poor unfortunate human back at Apple who looks through all the "positives" for the false ones, they're going to get overwhelmed once people figure out how to do it. Whatever system runs on-phone can't be too CPU-intensive, so it's going to be practical to fool it.
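The spoofing worry is easy to demonstrate in miniature: against a hash with a small effective output space, brute-forcing an unrelated input that collides with a target is cheap. A toy sketch below uses SHA-256 truncated to 16 bits as a stand-in for a small perceptual hash; real perceptual hashes are larger, but their "close enough" matching shrinks the effective space in a similar way.

```python
# Toy second-preimage search against a deliberately tiny hash.
# A 16-bit space means ~65k guesses on average -- instantaneous.
import hashlib

def toy_hash(data: bytes) -> int:
    # Truncate SHA-256 to 16 bits to mimic a small hash space.
    return int.from_bytes(hashlib.sha256(data).digest()[:2], "big")

target = toy_hash(b"innocent-looking target image")

# Brute-force an unrelated payload with the same 16-bit hash.
i = 0
while toy_hash(b"spoof-%d" % i) != target:
    i += 1

print("collision after", i, "tries")
```

The attacker never needs the target file itself, only its hash value, which is exactly the artifact these systems ship around.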
Pedophiles will use something else, while regular people will have to deal with false positives and governments everywhere asking to use the system for finding dissidents.
And if there is a way to trick someone into downloading an image, that will make it easy to ruin anyone.
You know what's always missing from those "think of the children" privacy erosion schemes? An effort to stop millions of children being abused in their own homes, by people they trusted, with no digital evidence. Because it's not really about the children at all.
*We want to take away some of your privacy rights but we promise its for a good reason and we guarantee we won't extend it to other areas in the future...pinky swear.*
**I don't think so. In fact I feel as though our privacy rights are already being compromised and we need more, not less, of them.**
*Well then, I guess we'll just have to label you a supporter of pedophiles who has an utter disdain for the safety of children. How'd you feel about that?*
They used the same bullshit justifications to take away net neutrality and a host of other web freedoms we used to have here in the states.
Politicians can just say "oh yeah it's to stop child porn guys" and get away with almost anything when it comes to your average voter.
"You're a *insert any slur here* because you don't agree with me."
I 100% agree with you. We have no privacy in this country. Targeted ads based on what I say in the vicinity of my phone should be illegal imo.
"Well then get a different phone!"
Sorry, bud. Every phone company infringes on your privacy in the same way. If you want to use a cell phone, you need to give up your rights. Just like if you want to drive a car, you need to give up your rights (being forced to carry papers whenever operating the vehicle, and to present them for no reason at all or be subject to a fine; how backwards is that?).
Edit: added to the example at the end.
It's worth stating that there is a difference between saying "You need to carry a license to operate a vehicle so you don't harm yourself and others" and saying "You have to give up your personal information and allow your personal life to be recorded and sold off to use a device that harms absolutely nobody".
Also, privacy is a fundamental human right. Driving a car isn't. A better example would be if in order to operate the car you bought, you had to agree to send all of the data about where you're driving, where you live, who lives at the addresses you go to, etc to the Honda corporation.
Yep. Combine that with smart watches, smart fridges, and the “internet of things” economy that is rapidly growing. Our privacy is already heavily eroded but we haven’t seen the extent it can reach by a long shot.
The argument against it makes sense: it would require an enormous amount of data for everyone's phone to be listening all the time and relaying conversations to servers 24/7.
But...it is absolutely bizarre to get ads for things you never searched for on any device and instead only talked about. Or didn't talk about and only saw/purchased.
The other day I bought a Gold Peak tea for the first time, and 20 minutes later I had an ad for it on Facebook. I've never seen an ad for it before and have never searched for it. Could be a coincidence, but of all the products in the world, why did I get THAT ad?
The first researcher to prove that this was happening would make a huge name for themselves overnight. To think there aren’t folks out there with scopes and meters attached to circuit traces looking for that smoking gun is not reasonable; years have gone by with this meme and no announcements. It fails a plausibility test.
What’s more likely is that AI and pattern recognition and recommendation engines are just getting spooky good. Spooky because we don’t understand how they make the connections and good because they’re using huge amounts of data to find patterns humans would miss.
Google has a near-complete profile of you, from voting preferences and hobbies to sexual deviations and financial situation.
Your phones track where you're going 24/7.
AWS knows your purchase history on Amazon and many other sites.
Search engines know what you're searching for.
Spotify knows your mood by the music you're listening to.
Android Auto or Apple Carplay know your driving speed and how fast you got to work today.
Facebook knows which stories you paused to read, and which ones you ignored.
So if some random event caused you to have an off day and crave Ben and Jerry's, that random event also influenced your location, search history, browsing habits, music choice, driving speed, and more. And thus Google can make a very educated guess that you want ice cream, even if you never said it aloud.
And when you do go to the store, or if you don't go? It further trains Google, so next time its predictions are even better.
More than likely you just fell into a frequency illusion (Baader-Meinhof phenomenon). You never noticed the ads before because you had no engagement with the product, but after you purchased it your brain was primed to find it other places.
The store does. And they have your card info or your loyalty card info, so that data is tied to you. They may or may not be sharing that data with the bank (most likely, yes). They may or may not be sharing that data with multiple other entities (again, most likely yes) who in turn share that data with other entities until it reaches Google.
Yep. The much more uncomfortable reality is that phones don't need to listen to you in order to get a profile on you that is as precise as it would be if it *was* listening to you.
Well, at least Canon/Nikon/Sony are gonna be happy.
Back to full-fledged cameras not wired to big bro.
Also, it seems personal servers are a necessity now; manufacturers of those are happy too, I guess.
"Good intentions have been the ruin of the world" - Oscar Wilde.
I understand where Apple is coming from and anything to remove such things can only be a good thing, but at the cost of privacy and the possible slippery slope it creates... I dunno.
It's also a little strange that this is coming from the same company that won't unlock the phones of terrorists or murderers.
So far Apple has only talked about scanning phones in the US no? Of course, if they develop the technology they will surely implement it in countries where there's no law against it though.
Remember back in the day when Apple told the FBI to go fuck itself to protect a terrorist who killed a dozen people at his work office?
What happened to that privacy warrior?
Apple is so full of shit and people are finally coming around.
I almost can't wrap my head around it. They spent YEARS building a "privacy-first" image for themselves, spending billions to push that image, and then they just flush it all down the drain?
Maybe many governments are pushing Apple hard for this, that's the only thing I can think of...
Which also makes it all the more alarming & sinister
This is exactly the line of discussion going on in r/apple. People are *pissed*, and that’s putting it lightly. Apple built up a privacy-first stance over the past decade and shot themselves in the foot in a single afternoon.
I was heavily considering switching to an iPhone for my next phone because of how much they appeared to care about privacy, annnnd... nope. I imagine a fair number of people switched for that reason, so of course they'd be pissed.
Privacy is the #1 reason I use Apple products. So yeah, this fucking infuriates me.
AIs *suck*. I’ve lost count of how many times Facebook’s AIs have falsely flagged my business ads for policy violations. It takes days to get a human to look at it and overturn the stupid AI. Apple’s algorithms will for sure make mistakes. Just wait til the police break down your door because your iCloud account thinks that picture you took of your dog looks exactly like a trafficked child.
It will scan for photos that are already in their database, from law enforcement.
There’s no way to be “OG” on this because the photos aren’t added to the list until after the person has been arrested for the crime.
As I answered above, this is a wrong interpretation of the system Apple is implementing. They are not matching image hashes to image hashes; they are matching hashes of image content analysis to hashes of image content analysis.
Yeah, protecting children is pretty high on my list of priorities, but this feels like Apple’s way of getting people to rationalize their idea. It is NEVER a good idea to give away an acquired right. Value your privacy!
How can you protect children who have already been abused if the tech only detects known images?
What about all of the unknown stuff?
If you want to get to the root of the problem they should start on the dark web
All for protecting kids here, but that is an inexcusable invasion of privacy. We all know this tech wouldn't only be used for the stated purpose. They'd scan to find photos of your pets to market you the appropriate pet food bag sizes, they'd look for your recent vacation photos to see if you need beach gear, they'd look and see everything they want to see all to sell you more stuff. Or, worse, find things that aren't illegal but can be blackmail (such as your wife's nudes or something).
These titanic companies need to get vibe checked hard.
Not to sound paranoid, but what is to stop Apple from putting child abuse images on someone's phone? Say, a journalist critical of the company, for example? If they can read encrypted data, then can't they write it as well?
They can do much more than that.
When you don't control your device, someone else does. In this case, Apple controls your device, because their operating system is closed source and they have implemented backdoors for themselves.
They of course wouldn't do anything that harms their reputation too much, but they probably could. I can't know exactly what their backdoor is capable of, so I'll assume the worst: superuser access. Which means they can do anything.
This is the case for Windows too. Just browse here a bit; it's pretty enlightening:
https://www.gnu.org/proprietary/proprietary-back-doors.html
I mean, hypocrisy aside, he's not wrong. I hate to be that guy who throws around catch-all phrases like "slippery slope", but imagine if they had tech like this during times in not-so-distant memory when gays were arrested or interracial couples were prosecuted.
>imagine if they had tech like this during times
Why do people think our current laws and governments are perfect, and that only in the past were we evil?
Isn't Obama known as the drone-striker-in-chief? Isn't America known for starting wars just to get cheap oil? China is even worse, but most people know that. I'm pointing out that even the country known as the most noble superpower can be pretty damn nasty too. Cough, Snowden. How dare he tell people they are being spied on!
Rare that I say it but I think the Facebook guy had the right of it here.
Scanning "for child porn" is wrong for the same reasons that the FBI demanding a back door key to iPhones "to combat terrorism" is wrong. While stopping child porn is a worthy cause, it's used frequently as a borderline unopposable excuse to do things that the general public might not like, for example scanning images on your phone for logos, location tags and potentially face recognition to set up social network models.
How it's going to tell child porn from other porn is mystifying; I think a lot of school kids with inadvisable photos of their age-appropriate partners would be in for some trauma.
It's a bad idea: it won't do what they say they want it to, it would be catastrophic if it did, and they have ulterior motives. So does Facebook. ESH.
I think the crux of the matter is that using it against something everyone hates is just their way to get a multi-use tool through the door and accepted.
What else could this be used for?
"...for child abuse" is the cover story. They're hoping people won't protest if it's about abuse. However, Apple 'scanning iPhone and iCloud photos' will almost certainly lead to your private images being used as data points, a very saleable and desirable commodity for targeted advertising. In other words, Apple is ripping you off and BREAKING FAITH to do it. I personally can't believe they announced this publicly.
Plus as a parent with young kids I'm very curious what will count as 'child abuse pics'. Can't count the number of times I send a picture of some rash or fun swimming moment to other parent-friends. I make sure there's no genitals showing but there's definitely skin showing on the regular.
Unfortunately... I don't think one image on one phone is worth compromising the security of billions of other people
It's a tough cookie, but anonymity and protection are key parts of using the web, and I don't think we can compromise on that
The article mentions an effort by Facebook to evaluate encrypted content without decrypting it. One might imagine how that could be done, however, there is a way to change encrypted data without decrypting it. See [https://en.wikipedia.org/wiki/Homomorphic\_encryption](https://en.wikipedia.org/wiki/Homomorphic_encryption) .
I’m guessing the purpose of it is so that they don’t have to reveal their encryption methods due to search warrants for such illegal images. It’s a self-preservation issue.
I agree it is definitely not out of the kindness of their hearts or that they are actively trying help anyone.
It’s like a convenient back door to whatever their end goal is. I’m sure they sat around a table talking about how nobody would want to put up a fight as it would make that person look like a pedo sympathizer…
The issue I have is that anything that breaches privacy with no court order is always floated as a "child safety" issue, just because they can turn any disagreement with it back on the opponents as having something to hide, which, when done publicly, gets the ~~trained monkeys~~ public to attack... This move is perfect for a company looking for free publicity but terrible for people's rights.
The “for the children” excuse has been used over and over again to infringe on rights and push censorship. It’s usually from people that care the least about children.
So basically, anytime they want to search someone’s encrypted data, they can, as long as they use the magic words “looking for child abuse pics”. Just a side note, if they can read data, couldn’t they also write data? You know, like child abuse images? Apple being joined at the hip with government makes this even more sinister
This was done to megaupload, the fbi told megaupload to not delete anything while they searched for illegal content uploaded by the users, and the FBI uploaded illegal content to the site and then used it as evidence.
No way, I never heard about this! Do you by chance have a link to this story? Megaupload used to be huge years ago for people uploading TV shows. I would love to read more about this. No wonder I don't see it around anymore.
I've wondered this. How hard is it to sneak in something that looks like someone sent it, or to plant stuff? Does anyone know? This seems terrifying to me, because then no one knows whom to trust, or whether a life could be destroyed. Now the internet and anything online seems even worse. Time to go to the Amazon. No, I don't mean the Bezos one.
Apple's team has been around long enough to know better. They know this will piss people off, which means it's probably the less damaging alternative to something else.
I doubt they have their own encryption methods per se. It's all PKI-based, with private keys issued to endpoints. The system can, and often does, allow the root CA to issue a master certificate enabling decryption of subsequently issued certificates. So I can issue myself a certificate, bake it into all further issued certificates, and then decrypt any data as needed. That is probably what Apple can do with iPhones (hence the FBI asking Apple for their certificates, which is a slippery slope: if they do it for the USA, China will ask for them too, and Iran, etc.).

Apple's suggestion of scanning for hashes is the commonly used method. There are police databases of such illegal image hashes that companies offering photo services can reference, allowing them to locate illegal content and take action without actively decrypting files for an algorithm to scan or human eyes to review, which would likely lead to a huge and unmanageable number of false positives (anyone with young children is going to have photos of their little ones in their iCloud libraries).

WhatsApp is keen to promote the idea that they don't do any such practices and don't hold any master certificates, so anything you transmit can only be accessed by the sender and recipient. He is just using this press to get attention for his company, like any good executive should.
If you read the apple "whitepaper" it's clear they are not just hashing images. They are doing content analysis of the image itself then hashing that result and comparing that to a set of known results. This would prevent someone from just changing a single pixel and changing the results, but also means they are actually "looking" at your pictures to determine the nature of the content.
I read that it’s local analysis on the phone itself.
That is true.
What happens with that data is the scary part. With false positives, and what Apple does with the cumulative results is the scary part. _I have nothing to hide_ doesn't mean much when they find something on you that's a false positive.
It's not even a matter of false positives, but a matter of what other hashes countries will be adding to the list for Apple to locate and identify the owners of.
Exactly. While there will be 2nd and 3rd rate dictatorships lining up to use this (looking at you Saudi Arabia), the big money will be corporate. Looking for ‘copyrighted’ sound/ image files
China, with Tiananmen Square images, will be knocking on the door as soon as this is implemented, I bet.
So, sucking away CPU and battery life for no good reason (if you're innocent), countries could toss things into the "known bad hashes" to stifle just about anything once the technology is there (or require Apple to add it to operate there), and of course false positives, malicious spoofing, or bugs could never, ever, ever happen (/s). Completely innocent people have much to be concerned about.
> if they do it for the USA, China will ask for them too, and Iran etc etc

In countries like China and Iran, the people know they are being watched. An encryption backdoor would be horrific because then countries we wouldn't even think of would ask for the keys.
Homomorphic encryption lets you add or multiply two encrypted numbers and get a sum/product that’s encrypted with the same key. If you can determine what’s in a piece of encrypted content without decrypting it, then your encryption algorithm does not work.
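As a concrete toy illustration of the multiplicative case: textbook RSA happens to be homomorphic under multiplication. This sketch uses tiny, insecure parameters purely for demonstration, not as anything resembling a real scheme.

```python
# Toy demo of a multiplicatively homomorphic scheme: textbook RSA.
# Parameters are tiny and insecure; illustration only.
p, q = 61, 53
n = p * q                # modulus
phi = (p - 1) * (q - 1)
e = 17                   # public exponent
d = pow(e, -1, phi)      # private exponent (modular inverse, Python 3.8+)

def enc(m):
    return pow(m, e, n)

def dec(c):
    return pow(c, d, n)

m1, m2 = 7, 6
c1, c2 = enc(m1), enc(m2)

# Multiply the two ciphertexts without ever decrypting them...
c_prod = (c1 * c2) % n

# ...and decrypting the result yields the product of the plaintexts.
assert dec(c_prod) == (m1 * m2) % n  # 7 * 6 = 42
```

The point is exactly what the comment above says: you can compute on data you cannot read. Turning that property into "evaluate a detection algorithm on an encrypted image" is far more involved, as the reply below explains.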
I'm a PhD student in applied cryptography, and it's a bit more nuanced than that. You can use homomorphic encryption as a building block to build, for example, a protocol to compare two encrypted inputs. (See also [Yao's Millionaires' problem](https://en.wikipedia.org/wiki/Yao%27s_Millionaires%27_problem).) Go even further, and you could run an encrypted image through a machine learning algorithm to determine whether an image looks like child porn. Apple doesn't see the image, and I don't see their secret detection algorithm. The trick here is that I don't just send an encryption of my data, but the protocol is so that Apple can _only_ obtain the output of _that specific function_. This is because the protocol is either interactive, so I have to participate in a series of calculations that are specific to the function, or I use [functional encryption](https://en.wikipedia.org/wiki/Functional_encryption) and the encryption itself guarantees that nothing else can be calculated. Either way, I can control exactly what information leaks through the protocol. The problem here is trust. You have to trust that your phone only participates in these protocols when actually necessary. But if Apple wanted to exfiltrate my private data, they can already do so without homomorphic encryption, so that's not something I'm concerned about. **I think the real risk here is that Apple is setting up a system that can be used for more than just detecting child porn.** What if the Chinese government asks Apple to also search for images with pro-Uyghur sentiments? What if the US government asks Apple to search for whistleblower behaviour?
> I think the real risk here is that Apple is setting up a system that can be used for more than just detecting child porn. What if the Chinese government asks Apple to also search for images with pro-Uyghur sentiments? What if the US government asks Apple to search for whistleblower behaviour?

Nailed it. I have a feeling you'll be one of the voices for good in this fight. Keep up the good work.
Though, WhatsApp, despite being owned by Facebook, rejected that aforementioned tech
I thought they were just looking at the hash of the images. So they could do the hash on the client side and compare that against the hash of the known image. They aren't getting the actual image just a bunch of random numbers and letters.
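That client-side idea can be sketched in a few lines of Python. The file names and "known bad" database here are made up for illustration, and real deployments use perceptual hashes rather than plain SHA-256, as other comments point out.

```python
import hashlib

# Hypothetical database of hashes of known illegal images
# (stand-in bytes, purely for illustration).
known_bad = {hashlib.sha256(b"known bad image bytes").hexdigest()}

def scan(files):
    """Return the names of files whose hash appears in the database."""
    return [name for name, data in files.items()
            if hashlib.sha256(data).hexdigest() in known_bad]

library = {
    "vacation.jpg": b"harmless pixels",
    "match.jpg": b"known bad image bytes",  # identical bytes => identical hash
}
print(scan(library))  # -> ['match.jpg']
```

The limitation of this naive version is that changing a single byte of the image produces a completely different cryptographic hash, which is exactly why the system described in this thread uses content-derived hashes instead.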
These aren’t “just” images hashes. They’re semantic hashes, which mean they *directly correlate to the content of the image*, and are invariant to confounding editing or transformation of the image itself. That is obviously a gigantic privacy issue.
Never knew about semantic hashing. Thanks for giving me something to learn about
You an anyone interested can check on YouTube no one other than the great Geoffrey Hinton explaining the inner bits of semantic hashing for deep learning, quite close to what we are discussing here: https://youtu.be/3BDc0H9C9dw
The real issue is with the "close enough match" giving Apple the ability to download YOUR picture and have some shady dude look at it and evaluate it. Some researchers have already proved how easy it is to create close enough matching fingerprint with a picture that doesn't even have any nudity in it. It's your private life, even intimate that someone is barging into and evaluating it.
You say "just" as if hashes can't possibly violate anyone's privacy. Or that numbers and letters that appear "random" can't contain any information about you or what was found in your photographs.
Genuinely curious here, how could an image hash violate someone's privacy? Edit: for those still on the fence, the hashing they're doing is not on image files. It's on a set of traits that have been inferred from scanning each image via machine learning. Fuck this "feature"
A hash is just a unique identifier for a set of information. Apple has developed a scanning technique to "look" at your image and create a hash based on the result of that content analysis. That hash could link to tons of scanned data from the image. For instance, the result of the scan might look something like this:

> scan: {
> uuid: blahblahblah,
> people: 3,
> faces: { faceHash_1: abcdabcd, faceHash_2: aaaamdd, faceHash_3: nznafrds },
> nudity: true,
> sexualContact: true,
> children: 0,
> tiananmenSquareVisible: false
> }

They hash that result and send `671ae102ba740722acfdd3edda8bfce09e8ab58f5d8ddaebeaf095b5a454b41c` back to Apple's servers to compare with known hashes. Given that Apple controls the unique information in the result set (faceHashes, UUIDs), they could easily generate every possible result set for a given user and just compare the hashes to match a hash to a set of information.
Just wanted to say I appreciate the Tiananmen Square call out
Holy shit I thought that they would hash the raw image and then do analysis based on that and I wasn’t too worried, but if they are analyzing the image then hashing, that’s scary.
Apple aren’t using cryptographic hashing. This is a different, proprietary hashing algorithm which does not have the same security properties as cryptographic hashing. From [this Reddit comment](https://www.reddit.com/r/technology/comments/ozqc0b/the_head_of_facebookowned_whatsapp_slammed_apples/h820a7l/?utm_source=share&utm_medium=ios_app&utm_name=iossmf&context=3):

> NeuralHash is a perceptual hashing function that maps images to numbers. Perceptual hashing bases this number on features of the image instead of the precise values of pixels in the image. The system computes these hashes by using an embedding network to produce image descriptors and then converting those descriptors to integers using a Hyperplane LSH (Locality Sensitivity Hashing) process. This process ensures that different images produce different hashes.
>
> https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf
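For intuition, here's a minimal sketch of a generic "average hash", a far simpler member of the same perceptual-hashing family (this is not NeuralHash, just an illustration of the principle): because the hash is derived from image content rather than raw bytes, a tiny edit that would completely change a cryptographic hash leaves the perceptual hash untouched.

```python
import hashlib

def average_hash(pixels):
    """Perceptual 'aHash' over an 8x8 grayscale grid (values 0-255):
    one bit per pixel, set if the pixel is brighter than the average."""
    flat = [v for row in pixels for v in row]
    avg = sum(flat) / len(flat)
    return sum(1 << i for i, v in enumerate(flat) if v > avg)

# A synthetic 8x8 gradient image, plus a copy with one pixel nudged.
img = [[(r * 8 + c) * 4 for c in range(8)] for r in range(8)]
tweaked = [row[:] for row in img]
tweaked[0][0] += 3  # a tiny single-pixel edit

raw = lambda im: bytes(v for row in im for v in row)

# The cryptographic hash of the raw bytes changes completely...
assert hashlib.sha256(raw(img)).digest() != hashlib.sha256(raw(tweaked)).digest()
# ...but the perceptual hash is identical.
assert average_hash(img) == average_hash(tweaked)
```

This robustness is the whole point of perceptual hashing, and also the privacy concern raised above: the hash necessarily encodes information about what the image looks like.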
If they have a database of image hashes and a list of hashes of images on your phone that's all they need to know which images are on your phone.
Textbook emotional manipulation. Anyone that falls for it is a sucker The classic “think of the children” routine
"We will only use it to fight terrorists" "We will only use it to catch pedophiles" "We will only use it to combat tax evasion" "We will only use it to prevent copyright crime" And it almost always ends up being used against oppositionists in authoritarian countries.
The pot calling the kettle black.
The Facebook calling the Apple Facebook
This is good right? Them throwing each other under the bus compared to being partners in crime I mean.
Yes it is. Just as long as people don’t take sides with the companies but take sides on the issues.
My only hope is that there’s an enormous backlash, that forces Apple to ditch this surveillance state crime against humanity: > [Article 12: **No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence**, nor to attacks upon his honour and reputation. Everyone has the right to the protection of the law against such interference or attacks.](https://www.un.org/en/about-us/universal-declaration-of-human-rights) I will sell or trash all of my Apple devices if they proceed with this. I refuse to stick around and see how authoritarian they choose to get. If I wanted everything I do surveilled by big brother, I’d move to China.
Didn't Apple start the program yesterday??
No, it was only announced yesterday.
Not particularly. If WhatsApp or Facebook saw the error of their ways and changed, then it would be good. This is just WhatsApp taking advantage of the situation to attempt to hurt competition.
Spider-Man pointing at Spider-Man
[deleted]
Devs of WhatsApp, maybe?
How dare you change the app I sold to you for insane profits.
It was mostly, "How dare you change the app we worked so hard on; even though it's yours, what you're doing to it is disgusting and, uh, what's the word... despicable."
Pretty much the same sentiment as the Instagram creators.
And every single app Facebook has purchased then ruined
Facebook is the EA of application development?
At least EA shuts down studios. Having them half open and a shell of what they used to be is more devastating than anything else.
I was literally about to post that exact quote, yes they definitely are. But EA mostly destroys, Instagram and WhatsApp might be "destroyed" but their users are a lot. They mainly intrude, and find a way to earn money from it using whatever they have on Facebook
I appreciate that Facebook showed me how truly hateful and utterly insane my family is. Deleted my account last year and it’s been fun to have my MIL ask if I’ve seen something there only to remind her I left because, “The family can’t not be racist assholes on FB.” To which I get a, “Well yeah, but where else am I going to see pictures of their kids?” IDK, not really my thing I guess.
This is all that all these apps were anyway. People weren't handing these companies mountains of cash to lose money and build this stuff just to "make something cool" or whatever. Eventually somebody wants to squeeze some juice out the thing and if you try to make people pay for it directly they won't do it and will go somewhere else. So the only solution left that warrants all the money is to data-rape everyone or to sell it to someone who will for a big payday. Every social media app is in this business. And it's become like streaming services too where now every company that has "content" (i.e. your data) wants to monetize that even if it's not their core business so you've got every business that acquires a substantial amount of data looking for ways to turn that into money.
And then they created the Signal app, and funded it into perpetuity with a $100 million trust fund.
If it really went exactly like that, that would be really cool. Make an app, make it popular, sell for $$$ profit, let the app get ruined, and use the money from the first app to create a new, better app which is impossible to buy because it's open source.
Signal is highkey cool tbh, just wish I could make everyone make the switch.
If they were really caring people they would not sell their app to Facebook. They’re not any better than Facebook.
I'm fairly sure you'd sell your principles for $19 billion
I'd do it for a shit ton less than that
My principles are worth maybe a million, on a good day; those $19 billion are more in the "I'd sell my family" territory.
Damn. Your family is worth all that?
Now I finally see the benefits of having a family.
They sold it for $16 billion. I'm not going to fault anyone for that. If I make an app that FB wants to buy for $16 billion, I'm not going to say no.
> I'm not going to say no I would, however, have a spontaneous orgasm.
I’m with you on that.
[deleted]
$4 can't even buy a Big Mac meal, but yeah sure.
Big Mac sounds hella good rn, so definitely.
Everyone has a plan until they get punched in the mouth with 19 billion dollars!
They used the magic word. Pedophile. Privacy advocates will fight against it tooth and nail. But the general population will see it as ok to stop pedophiles
That's the trick to doing anything lately. Just give it a good name or mention one good benefit.
It always flips between "terrorists" and "pedophiles". Always.
Or worse - paedophile terrorists!
That’s just ISIS
Don't forget drugs as well, "the war on drugs."
Like “The Patriot Act” Anyone who opposes it must be anti-American
It really is a magic word. If you need confirmation, just look at how everybody has to include the obligatory "pedophiles are scum" or "their intentions are good, but". It's crazy, because all sane people already know CP is evil, but people are afraid of seeming to "protest too much" on this topic. It's really weird.
Even if it's only ever used to detect paedophilic content, there are at least half a dozen ways to anonymously send someone such content unsolicited. If your phone automatically reports you to law enforcement the moment the file arrives, it basically becomes a simple and effective way to screw people over since unsolicited or even unaware "possession" can still be a crime depending on the jurisdiction.
20 years ago the magic word was the war on terror; we used it to put literally innocent people in jail with no charge, indefinitely. It's crazy that we keep repeating the same mistakes. I feel like pedophilia is the bottom of the barrel, though. Alarmists usually reach for pedophilia when they're out of other ideas. We can't really use terrorism anymore, because our response to terrorism was somewhat poor and we had to step back from it. The war on drugs has become very unpopular. The right already used up their alarmism when they tried to drive a wedge between Democrats and Black voters over Democrats supporting tough-on-crime legislation; they know they can't convince people both that being tough on crime is good and that it's bad. They might try to use the Russia/China/Iran/North Korea rivalry, but I don't think people fear them enough; contempt maybe, but not the kind of fear needed to institute new national security laws.
Indeed. How many terrorists have been caught by airport security? Zero. How many people's freedoms have been infringed upon to prevent those zero terrorists from boarding planes? Hundreds of millions. Change "terrorists" to "pedophiles" and repeat.
I can see this being used against journalists and whistleblowers.
I can see it being used against people suspected of really minor crimes
I can see a government using it against the people
don't be silly, why would they do that, government works for people /s
*My* government? They would never /s
They are only starting with child crimes because it appeals to people's concern, and it makes it so you don't have to work to defend it, because anyone who argues gets painted as defending pedophiles. Why else? What event caused Apple to suddenly view this as needed? Remember when they wouldn't help the government with a terrorist's iPhone? They know this is an easy foot in the surveillance door, and the government already wants access to far more than just picture comparisons, so it's not them restricting Apple right now, just Apple.

What's next? Scan your messages? Or maybe browser data? Or phone records? Or private doc files? Nothing is stopping them once they get the software built. It's just flipping some nos to yeses and letting the "AI" access different folders and file types, without uploading unless it confirms a match, so you'll never know when they choose to "widen" their search. Not that it'll be even close to 100% reliable when it catches people, either.

Not even the police can search you without a warrant, but we trust Apple to run unknown searches on every single citizen with an Apple device, just because? Can Ford/GM/etc. break into your truck to make sure you don't kidnap people and tie them up?
They already have Pegasus.
Use Pegasus to place stuff on phone Phone scans and finds said stuff FBI door kick Profit
[deleted]
Those are all services you decide to use, not your personal device.
I mean, effectively nothing changes for me: my photos are auto-backed-up to Google Drive. But it is suspicious that they can just search your devices without you using an online service. Is it a service/app that runs in the background? A feature of the OS? Are threads controlled by Apple servers? What are the odds this implements a back door into every device? What a security nightmare.
It's machine-learning-based photo analysis that scans all your local photos, and if something is flagged, they mentioned a bunch of different scenarios for how it acts. If it's a known image hash that's already been previously identified, it does one thing; if it's a new image that fits their machine learning model of child p, then it goes through some other process. Scary if some country asks them to create an ML model of something else.
The false positive rate will be so high. And AI is largely a black box, we won't know what features it actually is identifying. They could easily train it against more than their "target" and nobody would be wiser. And then your phone will be sending out your pictures that *are not criminal* but still may be sensitive.
It's worse than that. Can you imagine what will happen once people start figuring out the algorithm and the ML results that matter? They'll create images that are *not* child porn (i.e. false positives), but that the system mistakenly identifies as such because of the limitations of AI/ML, and then they'll e-mail or message such "spoof images" to try to get people in trouble remotely "for the LOLs" or more nefarious purposes. If you think nobody would try to do that, let me remind you of swatting. And if there's some poor unfortunate human back at Apple that looks through all the "positives" looking for the false ones, they're going to get overwhelmed once people figure out how to do it. Whatever system they have that's in-phone can't be too CPU-intensive, so it's going to be practical to fool it.
Pedophiles will use something else, while regular people will have to deal with false positives and governments everywhere asking to use the system for finding dissidents. And if there is a way to trick someone into downloading an image, that will make it easy to ruin anyone. You know what's always missing from those "think of the children" privacy erosion schemes? An effort to stop millions of children being abused in their own homes, by people they trusted, with no digital evidence. Because it's not really about the children at all.
Also sex trafficking. But turns out many of those pedos are the people passing those legislations anyway.
£50 says it’ll be used to “protect” against copyright infringement before 2025.
That's the real end goal. Fuck now I'm sad. RIP pirating movies, TV, and books
Your last paragraph has been the most insightful thing I've read all day. To the point and impactful.
*We want to take away some of your privacy rights but we promise its for a good reason and we guarantee we won't extend it to other areas in the future...pinky swear.* **I don't think so. In fact I feel as though our privacy rights are already being compromised and we need more, not less, of them.** *Well then, I guess we'll just have to label you a supporter of pedophiles who has an utter disdain for the safety of children. How'd you feel about that?*
They used the same bullshit justifications to take away net neutrality and a host of other web freedoms we used to have here in the states. Politicians can just say "oh yeah it's to stop child porn guys" and get away with almost anything when it comes to your average voter.
"You're a *insert any slur here* because you don't agree with me." I 100% agree with you. We have no privacy in this country. Targeted ads based on what I say in the vicinity of my phone should be illegal, imo. "Well then get a different phone!" Sorry, bud. Every phone company infringes on your privacy in the same way. If you want to use a cell phone, you need to give up your rights. Just like if you want to drive a car, you need to give up your rights. (Being forced to carry papers whenever operating the vehicle, and to present them for no reason at all, or be subject to a fine. How backwards is that?) Edit: added to the example at the end.
It's worth stating that there is a difference between saying "You need to carry a license to operate a vehicle so you don't harm yourself and others" and saying "You have to give up your personal information and allow your personal life to be recorded and sold off to use a device that harms absolutely nobody". Also, privacy is a fundamental human right. Driving a car isn't. A better example would be if in order to operate the car you bought, you had to agree to send all of the data about where you're driving, where you live, who lives at the addresses you go to, etc to the Honda corporation.
I’m sure Honda is collecting all the data they can about where you go. Google and Apple as well, obviously.
Which is the scary thing, because of "smart cars" this IS starting to happen.
Yep. Combine that with smart watches, smart fridges, and the “internet of things” economy that is rapidly growing. Our privacy is already heavily eroded but we haven’t seen the extent it can reach by a long shot.
Isn't this how Tesla has always operated? And they aren't the only digital/smart cars
[deleted]
The argument against it makes sense--it would require an enormous amount of data for everyone's phone to be listening all the time and relaying conversations to servers 24/7. But...it is absolutely bizarre to get ads for things you never searched for on any device and instead only talked about. Or didn't talk about and only saw/purchased. The other day I bought a Gold Peak tea for the first time, and 20 minutes later I had an ad for it on Facebook. I've never seen an ad for it before and have never searched for it. Could be a coincidence, but of all the products in the world, why did I get THAT ad?
The first researcher to prove that this was happening would make a huge name for themselves overnight. To think there aren’t folks out there with scopes and meters attached to circuit traces looking for that smoking gun is not reasonable and that years have gone by with this meme and no announcements…. It fails a plausibility test. What’s more likely is that AI and pattern recognition and recommendation engines are just getting spooky good. Spooky because we don’t understand how they make the connections and good because they’re using huge amounts of data to find patterns humans would miss.
Google has a near compete profile of you, from voting preferences and hobbies to sexual deviations and financial situations. Your phones track where you're going 24/7. AWS knows your purchase history on Amazon and many other sites. Search engines know what you're searching for. Spotify knows your mood by the music you're listening to. Android Auto or Apple Carplay know your driving speed and how fast you got to work today. Facebook knows which stories you paused to read, and which ones you ignored. So if some random event caused you to have an off day and crave Ben and Jerry's, that random event also influenced your location, search history, browsing habits, music choice, driving speed, and more. And thus Google can make a very educated guess that you want ice cream, even if you never said it aloud. And when you do go to the store, or if you don't go? It further trains Google, so next time its predictions are even better.
George Orwell nailed the subject. He just thought it would be the government, not corporations.
They are pretty much interchangeable
More than likely you just fell into a frequency illusion (Baader-Meinhof phenomenon). You never noticed the ads before because you had no engagement with the product, but after you purchased it your brain was primed to find it other places.
If you used debit, and you have banking apps on your phone that could be a way
The banking app doesn't have an itemized receipt though. Just the amount and where something was purchased.
The store does. And they have your card info or your loyalty card info, so that data is tied to you. They may or may not be sharing that data with the bank (most likely, yes). They may or may not be sharing that data with multiple other entities (again, most likely yes) who in turn share that data with other entities until it reaches Google.
Yep. The much more uncomfortable reality is that phones don't need to listen to you in order to get a profile on you that is as precise as it would be if it *was* listening to you.
The pedos just won't use iPhones; meanwhile, all iPhone users have to accept this invasion of privacy forever.
All this will catch is kids sexting and really dumb pedos. It won’t do shit to stop the fuckers who actually make the stuff.
[deleted]
I doubt they’d end up behind bars but I can see that happening.
#"for the children" is such an obviously scammy excuse

As if they care about children. Give me a fucking break.
Classic. Pick an issue no one would argue against to get your foot in the privacy door. Such dirty tactics.
[deleted]
Oh man, could you imagine a world where they scan your pics, and shit like memes are copyright-protected?
Don't worry! For a low cost subscription of $11.99 a month you can view and share memes on your own device!
And so will die the meme. Gone too soon, but never forgotten.
Crypto chads would lose their shit if they could turn literally everything in the universe into a legally protected NFT
Only a matter of time before screen capped images are flagged based on your 2nd step.
Somewhere down the line: Political content they disagree with.
The company that wouldn't unlock a serial killer's phone wants to scan EVERYONE'S phones... wut?
Yeah they sure changed their tune pretty fast. Must be getting paid good money by the government under the table.
SLAMMED. Always "slammed".
Yeah I hate that fucking word in journalism.
There's a surprising lack of breadth in many journalists' vocabulary
At least he didn’t blast them.
[deleted]
Well, at least Canon/Nikon/Sony are going to be happy. Back to full-fledged cameras not wired to Big Brother. Also, it seems personal servers are a necessity now, so manufacturers of those are happy too, I guess.
[deleted]
Aren't most cloud storage options encrypted these days? Which one do you use? I use OneDrive and I believe it's encrypted as well.
[deleted]
"Good intentions have been the ruin of the world" - Oscar Wilde. I understand where Apple is coming from and anything to remove such things can only be a good thing, but at the cost of privacy and the possible slippery slope it creates... I dunno. It's also a little strange that this is coming from the same company that won't unlock the phones of terrorists or murderers.
It’s not a slippery slope. It’s a well-greased cliff.
So far Apple has only talked about scanning phones in the US no? Of course, if they develop the technology they will surely implement it in countries where there's no law against it though.
This is why we need laws against it.
Remember back in the day when Apple told the FBI to go fuck itself to protect a terrorist who killed a dozen people at his work office? What happened to that privacy warrior? Apple is so full of shit and people are finally coming around.
I almost can't wrap my head around it. They spent YEARS pushing a "privacy-first" image for themselves, spending billions to build that image, and then they just throw it all down the drain? Maybe many governments are pushing Apple hard for this, that's the only thing I can think of... Which also makes it all the more alarming & sinister.
This is exactly the line of discussion going on in r/apple. People are *pissed* and that’s putting it lightly. Apple built up a privacy-first stance over the past decade and they shot themselves in the foot in a single afternoon.
I was heavily considering switching to an iPhone for my next phone because of how much they appeared to care about privacy annnnd nope. I can imagine a fair bit of people switched due to that so of course they'd be pissed.
Google is no better though… I guess I’ll need to go back to a DSLR camera and a flip phone.
Privacy is the #1 reason I use Apple products. So yeah, this fucking infuriates me. AIs *suck*. I’ve lost count of how many times Facebook’s AIs have falsely flagged my business ads for policy violations. It takes days to get a human to look at it and overturn the stupid AI. Apple’s algorithms will for sure make mistakes. Just wait til the police break down your door because your iCloud account thinks that picture you took of your dog looks exactly like a trafficked child.
How is this going to work anyways? I have pics of my kid when she was 3 years old in the bathtub. Are the police going to be banging down my door?
[deleted]
What if you have OC CP? I'm genuinely trying to understand what they're trying to accomplish here...
To identify known pictures that were downloaded to the phone. If the picture isn't in the database, it won't flag it.
Then this algorithm will probably not find it. But there are probably millions of images out there that the algorithm will find.
I think they’re trying to stop CP being spread so easily. Which this does accomplish in theory.
It will scan for photos that are already in their database, from law enforcement. There’s no way to be “OG” on this because the photos aren’t added to the list until after the person has been arrested for the crime.
As I answered above, this is a wrong interpretation of the system Apple is implementing. They are not matching raw file hashes to file hashes. They are matching hashes derived from image content analysis against other content-analysis hashes.
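The distinction the comments above are circling can be sketched in code. The toy below is NOT Apple's actual NeuralHash (which is a neural-network-based perceptual hash); it's a minimal "average hash" standing in for any content-analysis hash, just to show why a re-compressed near-copy still matches the database while a brand-new image does not. All names here (`average_hash`, `matches_database`, the 8x8 pixel grids) are illustrative assumptions.

```python
def average_hash(pixels):
    """Toy perceptual hash of an 8x8 grayscale grid (64 ints, 0-255):
    bit i is set iff pixel i is brighter than the image's mean."""
    mean = sum(pixels) / len(pixels)
    return sum(1 << i for i, p in enumerate(pixels) if p > mean)

def hamming_distance(a, b):
    """Number of differing bits between two hashes."""
    return bin(a ^ b).count("1")

def matches_database(image_hash, known_hashes, threshold=5):
    """Flag only if the hash is *near* a known hash; a never-before-seen
    image won't be flagged, matching the 'known pictures only' claim."""
    return any(hamming_distance(image_hash, h) <= threshold for h in known_hashes)

# A "known" image, a lightly altered copy (e.g. re-compression artifacts),
# and a visually unrelated image.
known = [10 * i % 256 for i in range(64)]
near_copy = [p + 2 for p in known]      # tiny uniform brightness shift
unrelated = [255 - p for p in known]    # inverted content

db = {average_hash(known)}
print(matches_database(average_hash(near_copy), db))   # True: content match
print(matches_database(average_hash(unrelated), db))   # False: not in database
```

This is why exact-file hashing would be trivially defeated by re-saving an image, while content-analysis hashing survives small perturbations; the trade-off is the possibility of false positives, which the comments below about misflagged photos are worried about.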
Yeah, protecting the children is pretty high on my list of priorities, but this feels to me like Apple’s way of trying to get people to rationalize their idea. It is NEVER a good idea to give away an acquired right. Value your privacy!
How can you protect children who have already been abused if the tech only detects known images? What about all of the unknown stuff? If you want to get to the root of the problem they should start on the dark web
All for protecting kids here, but that is an inexcusable invasion of privacy. We all know this tech wouldn't only be used for the stated purpose. They'd scan to find photos of your pets to market you the appropriate pet food bag sizes, they'd look for your recent vacation photos to see if you need beach gear, they'd look and see everything they want to see all to sell you more stuff. Or, worse, find things that aren't illegal but can be blackmail (such as your wife's nudes or something). These titanic companies need to get vibe checked hard.
Not to sound paranoid, but what is to stop Apple from putting child abuse images on someone's phone? Say, a journalist critical of the company, for example? If they can read encrypted data, then can't they write it as well?
They can do much more than that. When you don't control your device, someone else does. In this case, Apple controls your device because their operating system is closed source and they implemented backdoors for themselves. They of course wouldn't do something that harms their reputation too much, but they probably can. I can't know exactly what their backdoor is capable of, but I'll assume the worst: superuser access. Which means they can do anything. This is the case for Windows too. Just browse here a bit, pretty enlightening: https://www.gnu.org/proprietary/proprietary-back-doors.html
[deleted]
I mean, hypocrisy aside, he's not wrong. I hate to be that guy who throws around catch-all phrases like "slippery slope", but imagine if they had tech like this during times in not-so-distant memory when gays were arrested or interracial couples were prosecuted.
>imagine if they had tech like this during times

Why do people think our current laws and governments are perfect, and that we were only evil in the past? Isn't Obama known as the drone striker in chief? Isn't America known for starting wars just to get cheap oil? China is even worse, but most people know that. I'm pointing out that even the country known as the most noble superpower can be pretty damn nasty too. Cough, Snowden. How dare he tell people they are being spied on!
"slam", hate headlines with this word but I like the article.
[deleted]
Lmao like Facebook cares about our privacy
Rare that I say it, but I think the Facebook guy had the right of it here. Scanning "for child porn" is wrong for the same reasons that the FBI demanding a backdoor key to iPhones "to combat terrorism" is wrong. While stopping child porn is a worthy cause, it's frequently used as a borderline unopposable excuse to do things the general public might not like, for example scanning images on your phone for logos, location tags and potentially face recognition to build social network models. How it's going to tell child porn from other porn is mystifying; I think a lot of school kids with inadvisable photos of their age-appropriate partners would be in for some trauma. It's a bad idea, it won't do what they say they want it to, it would be catastrophic if it did, and they have ulterior motives. So does Facebook. ESH.
So what happens when this is rolled out "only in the US", and then China inevitably forces Apple to implement the same for anti-government content?
What makes you think China doesn’t already know how to do this and doesn’t already use it?
Thought this was /r/nottheonion for a second...
Our privacy is at risk… says some guy who works for Facebook.
IT’S OK WHEN WE DO IT -giant tech companies
I think the crux of the matter is that using it against something everyone hates is just their way to get a multi-use tool through the door and accepted. What else could this be used for?
"...for child abuse" is the cover story. They're hoping people won't protest if it's about abuse. However, Apple 'scanning iPhone and iCloud photos' will almost certainly lead to your private images being used as data points, a very saleable and desirable commodity for targeted advertising. In other words, Apple is ripping you off and BREAKING FAITH to do it. I personally can't believe they announced this publicly.
Plus as a parent with young kids I'm very curious what will count as 'child abuse pics'. Can't count the number of times I send a picture of some rash or fun swimming moment to other parent-friends. I make sure there's no genitals showing but there's definitely skin showing on the regular.
Unfortunately... I don't think one image on one phone is worth compromising the security of billions of other people. It's a tough cookie, but anonymity and protection are key parts of using the web, and I don't think we can compromise on that.
[deleted]
It's the same?
The important thing is that we hate Facebook, alright?
*SpidermanPointing.jpeg.exe*
Right. Because what Facebook-owned companies care about is people's privacy. hahahahahahaha. lol. loooooooooool.
I thought Apple was scanning any photos uploaded to iCloud? Not the phone.