Google-owned YouTube said yesterday that multiple investigations had not detected any child sexual abuse material (CSAM) on its platform. The company has submitted its formal response to India's IT Ministry (MeitY), which last week served notices on YouTube and other social media intermediaries directing them to remove any CSAM from their platforms.
In a statement to IANS, a YouTube spokesperson said that based on “multiple thorough investigations, we did not detect CSAM on our platform, nor did we receive examples or evidence of CSAM on YouTube from regulators”.
The spokesperson added that no form of content that endangers minors is allowed on YouTube, news agency IANS reported from New Delhi.
“We will continue to heavily invest in the teams and technologies that detect, remove, and deter the spread of this content. We are committed to working with all collaborators in the industry-wide fight to stop the spread of CSAM,” said the company spokesperson.
According to the platform, the majority of videos featuring minors on YouTube do not violate its policies, but when it comes to kids, YouTube takes an “extra cautious approach towards our enforcement”.
The Ministry of Electronics and IT had issued notices to social media intermediaries X (formerly Twitter), YouTube and Telegram, warning them to remove any kind of CSAM from their platforms on the Indian Internet or face action.
“The rules under the IT Act lay down strict expectations from social media intermediaries that they should not allow criminal or harmful posts on their platforms. If they do not act swiftly, their safe harbor under section 79 of the IT Act will be withdrawn and consequences under the Indian law will follow,” said Union Minister of State for Electronics & IT, Rajeev Chandrasekhar.
The Information Technology (IT) Act, 2000, provides the legal framework for addressing pornographic content, including CSAM.
Sections 66E, 67, 67A, and 67B of the IT Act impose stringent penalties and fines for the online transmission of obscene or pornographic content.
According to YouTube, in India, “we surface a warning at the top of search results for specific search queries related to CSAM”.
This warning states that child sexual abuse imagery is illegal and links to the National Cyber Crime Reporting Portal.