We need to have a talk about Apple's new invasion of privacy
-
2021-11-29 at 3:18 AM UTC
Originally posted by Toxoplasmosis Privacy is effectively dead, and now every device you own, from your monitor to your operating system to, eventually, your HMD, will be constantly monitored and all your actions will be cataloged.
The NSA has been doing this for years. They can pull up every web search, download, and keystroke you've made in the past 10 years; even if you used 20 different computers and moved across the country, they can easily track a person.
-
2021-11-29 at 3:19 AM UTC
Originally posted by Wariat It is only iCloud storage:
https://www.cnbc.com/2021/08/06/apples-privacy-reputation-is-at-risk-with-new-changes.html
It can't scan your entire iPad or screen, or at least nowhere did I read that reported, and it would be a huge violation not to let users know about it.
It does
Open up a picture on your iPad that has something written on it: you can highlight and copy the text just like you can with PDFs and web pages.
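That copy-text feature is Live Text doing OCR on the device. For what it's worth, the same capability is exposed to any third-party app through the public Vision framework, so text recognition clearly runs locally. Here's a rough Swift sketch using only standard Vision calls (nothing Apple-internal, and not necessarily Live Text's own pipeline):

```swift
import Vision
import CoreGraphics

// Minimal sketch: on-device text recognition with the public Vision framework.
// Live Text is Apple's own feature; this only shows that OCR runs locally.
func recognizeText(in image: CGImage) {
    let request = VNRecognizeTextRequest { req, error in
        guard let observations = req.results as? [VNRecognizedTextObservation] else { return }
        for observation in observations {
            // Take the top candidate string for each detected text region.
            if let candidate = observation.topCandidates(1).first {
                print(candidate.string)
            }
        }
    }
    request.recognitionLevel = .accurate   // runs entirely on-device

    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try? handler.perform([request])
}
```

None of that touches iCloud; it's separate from the CSAM hashing stuff, it just shows the phone already parses what's inside your images.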
-
2021-11-29 at 3:27 AM UTC
-
2021-11-29 at 3:42 AM UTC
Originally posted by the man who put it in my hood That's been a thing for many years
Yeah, the relatively new thing is Apple scanning images on your device and iCloud account for 'child pornography'.
https://niggasin.space/thread/68582
It's just a pretext. If they hash all images on a device, they know where an image was first seen. Say someone posts a photo online antagonising the police: the police can get the hash of the posted image, and Apple can then check that hash to see which phone it was first seen on. Assuming the image originated from an iPhone, they're not only able to get the phone's ID/IMEI, they can get the service number, which would lead to the owner, as well as the current location of the device and possibly the location where the photo was taken.
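To make the 'which phone saw it first' idea concrete, here's a toy sketch of that kind of lookup. To be clear, this isn't Apple's design (their announced system matches against a fixed CSAM hash list, not a general first-seen index), and every name and type below is made up purely to illustrate the scenario described above:

```swift
import Foundation

// Toy model only: a perceptual hash of each image is recorded the first time it is
// seen, keyed to the device it came from. Apple's real system uses NeuralHash and a
// blinded CSAM hash database, not a plain first-seen dictionary like this.
struct FirstSighting {
    let deviceID: String      // e.g. an IMEI or account identifier
    let timestamp: Date
}

var firstSeen: [String: FirstSighting] = [:]   // perceptual hash -> first sighting

func record(hash: String, deviceID: String) {
    // Only the first device to produce this hash is remembered.
    if firstSeen[hash] == nil {
        firstSeen[hash] = FirstSighting(deviceID: deviceID, timestamp: Date())
    }
}

func whoSawItFirst(hashOfPostedImage: String) -> FirstSighting? {
    // Later, anyone holding the hash of a publicly posted image can ask
    // which device it was first seen on.
    firstSeen[hashOfPostedImage]
}
```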
-
2021-11-29 at 3:45 AM UTC
It doesn't scan images on the device, only iCloud. Nowhere does any source claim it scans devices.
-
2021-11-29 at 3:49 AM UTC
-
2021-11-29 at 3:52 AM UTC
Originally posted by Wariat It doesn't scan images on the device, only iCloud. Nowhere does any source claim it scans devices.
https://archive.md/ys5Q5
***Original FT article is now paywalled
-
2021-11-29 at 4:14 AM UTC
"On American iPhones" ... stops reading.
-
2021-11-29 at 4:14 AM UTC
And it only says Apple intends or plans to do this, not that it has been implemented.
-
2021-11-29 at 4:19 AM UTC
That was from August; I think it's been activated already.
The scheme will initially roll out only in the US.
Soon they'll be going through your CP collection, even in Poland.
-
2021-11-29 at 4:22 AM UTC
Read the article again, retard. It only deals with cloud storage.
-
2021-11-29 at 4:22 AM UTC
Nowhere does your article say what you claim.
-
2021-11-29 at 4:23 AM UTC
Originally posted by aldra That was from August; I think it's been activated already.
Soon they'll be going through your CP collection, even in Poland.
No they won't, because you're too retarded to read your own source material, and no, they won't in Poland either, tard, nor do I have such a collection, tard.
-
2021-11-29 at 4:24 AM UTC
You're too dumb to realise they don't have the technology to scan every photo on actual hardware devices, or even hash them, outside of iCloud. Plus they don't go through photos, they just look for known hash values of known files.
-
2021-11-29 at 4:25 AM UTC
-
2021-11-29 at 4:25 AM UTC
You can opt not to connect your photos to your iCloud drive in Settings, moron.
-
2021-11-29 at 4:25 AM UTC
Originally posted by Wariat Read the article again, retard. It only deals with cloud storage.
What?
Apple intends to install software on American iPhones to scan for child abuse imagery, according to people briefed on its plans.
Apple’s neuralMatch algorithm will continuously scan photos that are stored on a US user’s iPhone and have also been uploaded to its iCloud back-up system. Users’ photos, converted into a string of numbers through a process known as “hashing”, will be compared with those on a database of known images of child sexual abuse.
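For what "compared with those on a database" looks like in practice: perceptual hashes are built so that near-duplicates (resized or recompressed copies) still match, unlike a cryptographic hash, which changes completely on any pixel edit. A rough Swift sketch of generic perceptual-hash matching follows; the hash values and distance threshold are invented, not anything Apple has published:

```swift
import Foundation

// Generic perceptual-hash comparison, for illustration only.
// Apple's technical summary describes exact matching on NeuralHash values against a
// blinded database rather than a distance check like this, but the point of any
// perceptual hash is the same: visually similar images get similar (or equal) hashes.
func hammingDistance(_ a: UInt64, _ b: UInt64) -> Int {
    (a ^ b).nonzeroBitCount        // number of differing bits
}

let knownHashes: Set<UInt64> = [0x9F3A_5C21_0D44_E7B8]   // stand-in database entry
let photoHash: UInt64 = 0x9F3A_5C21_0D44_E7B9            // stand-in hash of a user photo

// A cryptographic hash would miss a resized or recompressed copy entirely,
// so perceptual systems accept matches within some small bit distance.
let isMatch = knownHashes.contains { hammingDistance($0, photoHash) <= 4 }
print(isMatch)   // true: this hash differs from a known one by a single bit
```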
-
2021-11-29 at 4:34 AM UTC
https://www.apple.com/child-safety/pdf/CSAM_Detection_Technical_Summary.pdf
CSAM Detection will be included in an upcoming release of iOS 15 and iPadOS 15.
Judging by the whitepaper, it's a process that runs on the device to create and index hashes of stored images in a database. When you upload an image to iCloud, the hash is bound to it, likely so that the image can still be tracked to the original hash if it's been altered (i.e. resizing, filtering, etc.).
What is not clear (or maybe I just didn't see it) is whether Apple has direct access to the hash database on the device; I would assume the answer is yes.
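In the whitepaper's own terms (NeuralHash, "safety vouchers", a match threshold), the flow it describes is roughly: hash the image on the device, bind the result to the iCloud upload, and nothing becomes actionable until enough matches accumulate. Very rough Swift sketch of just that shape; every name here is a stand-in, and the real design wraps it in a blinded hash database, private set intersection and threshold secret sharing (so, per Apple, individual match results below the threshold can't be read by anyone), none of which is reproduced here:

```swift
import Foundation
import CryptoKit

// Hedged sketch of the flow the whitepaper describes, with all names invented here.
struct SafetyVoucher {
    let imageID: UUID
    let perceptualHash: Data   // stand-in for a NeuralHash value
}

func makeVoucher(for imageData: Data, imageID: UUID) -> SafetyVoucher {
    // Stand-in: a cryptographic digest instead of a perceptual hash.
    // (A real perceptual hash would survive resizing/recompression; SHA-256 would not.)
    let digest = SHA256.hash(data: imageData)
    return SafetyVoucher(imageID: imageID, perceptualHash: Data(digest))
}

// Server side (simplified): count matches against known hashes and only act
// once a threshold is crossed, mirroring the threshold described in the summary.
func shouldFlagAccount(vouchers: [SafetyVoucher],
                       knownHashes: Set<Data>,
                       threshold: Int) -> Bool {
    let matches = vouchers.filter { knownHashes.contains($0.perceptualHash) }.count
    return matches >= threshold
}
```

Whether Apple can read the on-device hash data directly is exactly the open question above; the summary's claim is that match information stays unreadable to everyone until the threshold is exceeded.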
-
2021-11-29 at 4:34 AM UTC
-
2021-11-29 at 4:37 AM UTCWell it was always kind of silly to keep your CP on your phone now wasn't it?