According to the Internet Archive, Apple's child safety website was quietly altered in recent days to remove all mentions of the CSAM detection system. As MacRumors pointed out, the site previously included a brief description of Apple's Child Sexual Abuse Material (CSAM) detection feature, along with links to technical documents on how it works. Those technical documents themselves remain online.

[Image: How Apple's Child Safety site looked on Dec. 10, via the Internet Archive]

[Image: How Apple's Child Safety site looks now]

The deletion suggested Apple may have scrapped its controversial plan to use iPhones to combat the spread of child sexual abuse imagery. But an Apple spokesperson tells PCMag that nothing has changed since September, when the company said it was hitting pause on the CSAM detection system to gather more feedback and implement improvements. In other words, Apple appears to be in no hurry to try to sell the CSAM detection system to a skeptical public.

The company created the system to catch child sexual abuse imagery stored in iCloud. Other companies do this by scanning their own servers for CSAM transmitted across user accounts. Apple, by contrast, devised an approach that uses the customer's own iPhone to flag any CSAM uploaded to iCloud. The proposal quickly proved controversial among privacy advocates and consumers, who worried the same system could pave the way for surveillance or incorrectly flag users' photos, even as Apple tried to explain why its approach is better for users' privacy than server-wide scanning.

Nevertheless, the harsh feedback led Apple to delay its plan to roll out the CSAM detection system, which was originally supposed to arrive with iOS 15. "Based on feedback from customers, advocacy groups, researchers, and others, we have decided to take additional time over the coming months to collect input and make improvements before releasing these critically important child safety features," the company said at the time.

That same statement was also prominently posted on Apple's Child Safety website. But it has now been removed, too.
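For readers curious about the technical dispute, here is a deliberately simplified Swift sketch of the on-device idea: fingerprint each photo before it is uploaded and compare it against a database of known-CSAM fingerprints, rather than letting the server scan the whole photo library after upload. Everything in this sketch is hypothetical and heavily simplified; Apple's published design used a perceptual NeuralHash, a blinded hash database, and private set intersection with threshold secret sharing, none of which this toy reproduces.

```swift
import Foundation
import CryptoKit

// Hypothetical on-device database of known-bad image fingerprints.
// In Apple's published design this ships as a *blinded* hash set the
// device cannot read directly; here it is a plain set for clarity.
let knownBadFingerprints: Set<String> = [
    "not-a-real-fingerprint" // placeholder entry for illustration
]

// Stand-in fingerprint: an exact SHA-256 hash of the image bytes.
// Apple's system instead used NeuralHash, a perceptual hash designed
// to also match resized or re-encoded copies of the same image.
func fingerprint(of photo: Data) -> String {
    SHA256.hash(data: photo)
        .map { String(format: "%02x", $0) }
        .joined()
}

// The on-device check that would run before an iCloud upload:
// only photos matching a known fingerprint get flagged.
func shouldFlag(beforeUploading photo: Data) -> Bool {
    knownBadFingerprints.contains(fingerprint(of: photo))
}
```

The privacy argument Apple made for this architecture was that matching happens on the device against fingerprints of already-known material, so, unlike server-side scanning, non-matching photos are never inspected by anyone.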