Apple is delaying its much-criticized plan to scan users’ iPhones for child sexual abuse material (CSAM).
“Last month we announced plans for features intended to help protect children from predators who use communication tools to recruit and exploit them, and limit the spread of Child Sexual Abuse Material. Based on feedback from customers, advocacy groups, researchers and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features,” Apple said in a statement.
The proposed software update would include an algorithm that scans images in a user’s iCloud account and compares them against a database of known CSAM. Any matches would then be flagged for a human reviewer and reported to the proper authorities.
The company originally planned to release the feature with iOS 15 this autumn but has now delayed it, giving no indication of a new release date.
Many have criticized the feature as a violation of privacy, and even Apple employees have expressed concern that the software could be misused by authorities.
Source: Business Insider