According to AppleInsider, after receiving widespread criticism, Apple recently announced that it will no longer launch its CSAM child-protection feature as planned.
Apple had planned to launch a new feature that would scan photos saved on users' phones, as well as images sent through iMessage and uploaded to iCloud, to identify child sexual abuse material (CSAM) and combat its spread. The feature was originally slated to ship with iOS 15 and debut in the United States. After the announcement, it drew criticism from a large number of non-profit organizations, customers, and security researchers. Apple said it would continue to gather feedback and postpone the feature's launch.