Apple has published a document on its website attempting to dispel fears surrounding its new child-safety measures. The company assured users that the system, which will scan photos and videos uploaded to iCloud for child sexual abuse material (CSAM), will not turn into a surveillance tool and will not be expanded at the request of any government.
Apple explained that "this technology is limited to detecting CSAM content stored in iCloud," and that any government requests to expand the function will be refused.
The company emphasized that the new system "works only with images provided by the National Center for Missing & Exploited Children (NCMEC) and other child-protection organizations" - that is, it compares not the images themselves, but their digital representations (hashes). In addition, Apple promised that the hash database will be identical on every iPhone and iPad, to prevent the "targeting" of individual users.
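The idea of matching on digital representations rather than on the images themselves can be illustrated with a simplified sketch. The snippet below uses an ordinary cryptographic hash (SHA-256) purely for illustration; Apple's actual system uses a perceptual hashing scheme designed to survive minor image edits, and the hash values and function names here are hypothetical.

```python
import hashlib

# Hypothetical on-device database of known prohibited-image hashes
# (illustrative values only; the real database comes from NCMEC and
# other child-protection organizations).
KNOWN_HASHES = {
    hashlib.sha256(b"example-prohibited-image").hexdigest(),
}

def matches_known_content(image_bytes: bytes) -> bool:
    """Return True if the image's digest appears in the known-hash set.

    Only the fixed-length fingerprint is compared; the image content
    itself is never inspected or transmitted.
    """
    digest = hashlib.sha256(image_bytes).hexdigest()
    return digest in KNOWN_HASHES

print(matches_known_content(b"example-prohibited-image"))  # True
print(matches_known_content(b"vacation-photo"))            # False
```

Because every device ships the same hash set, a match can only occur against content already catalogued by the child-protection organizations, which is the basis of Apple's "no targeting of individual users" claim.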
Apple announced the new child-protection initiatives last week. iOS and iPadOS will begin scanning photos and videos uploaded to iCloud for CSAM, and the iMessage app ("Messages") will show minors a warning when they receive, or attempt to send, sexually explicit images. The new features will launch first in the United States later this year.
Apple's content-scanning plan has drawn criticism. Cryptography and cybersecurity expert Matthew Green warned that if governments gain influence over the database of prohibited material, some of them may abuse it: instead of detecting illegal child content, the tool could be used to suppress political activity. WhatsApp head Will Cathcart holds the same view, calling Apple's new tool a "surveillance system."
At the same time, The Verge notes, Apple has already made concessions for the right to operate in a number of countries. In states where encrypted phone calls are prohibited, for example, the company sells iPhones without FaceTime. In China, Apple blocks any content objectionable to the government, does not encrypt user data (since Chinese law prohibits it), and removes applications from the local App Store at the authorities' request.