Tech companies must put measures in place to protect users in Britain from child sexual abuse images and other illegal content from March 17, as enforcement of the country’s online safety regime ramps up.
British media regulator Ofcom said Meta’s Facebook, ByteDance’s TikTok, Alphabet’s YouTube and other companies must now implement measures such as better moderation, easier reporting and built-in safety tests to tackle criminal activity and make their platforms safer by design, according to a Reuters report.
Ofcom’s official statement, issued yesterday and accessed by Indianbroadcastingworld.com, read, “New duties under the UK’s Online Safety Act come into force today, meaning platforms in scope have to start implementing appropriate measures to remove illegal material quickly when they become aware of it, and to reduce the risk of ‘priority’ criminal content from appearing in the first place.
“Given the acute harm caused by the spread of online child sexual abuse material (CSAM), assessing providers’ compliance with their safety duties in this area has been identified as one of our early priorities for enforcement. Our evidence shows that file-sharing and file-storage services are particularly susceptible to being used for the sharing of image-based CSAM.”
The regulator said it has launched an enforcement programme to assess the safety measures being taken, or that will soon be taken, by file-sharing and file-storage providers to prevent offenders from disseminating CSAM on their services.
“We have written to a number of these services to put them on notice that we will shortly be sending them formal information requests regarding the measures they have in place, or will soon have in place, to tackle CSAM, and requiring them to submit their illegal harms risk assessments to us,” the Ofcom statement elaborated.
Turning to CSAM on file-sharing services, Ofcom reiterated that its evidence shows such services are particularly susceptible to being used for the sharing of image-based CSAM.
“Among the 40 safety measures set out in our illegal harms codes of practice, we recommend, for example, that certain services, including all file-sharing services at high risk of hosting CSAM, regardless of size, use automated moderation technology, including ‘perceptual hash-matching’, to assess whether content is CSAM and, if so, to swiftly take it down,” it said.
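Perceptual hash-matching, the technique named in Ofcom’s codes, works by deriving a compact fingerprint from an image that survives minor edits such as re-encoding or brightness changes, then comparing fingerprints against a database of hashes of known illegal material. The toy sketch below illustrates only the general idea using a simplified “difference hash”; production systems rely on vetted, purpose-built tools (such as Microsoft’s PhotoDNA) and curated hash databases, none of which are shown here.

```python
def dhash(pixels):
    """Compute a difference hash from a grid of grayscale values.

    Each bit records whether a pixel is brighter than its right
    neighbour, so uniform brightness shifts or mild re-encoding
    rarely flip many bits.
    """
    bits = []
    for row in pixels:
        for left, right in zip(row, row[1:]):
            bits.append(1 if left > right else 0)
    return bits

def hamming_distance(a, b):
    """Count differing bits between two hashes of equal length."""
    return sum(x != y for x, y in zip(a, b))

# A tiny 4x5 "image" and a uniformly brightened copy of it.
original = [[10, 20, 30, 40, 50],
            [90, 80, 70, 60, 50],
            [15, 25, 35, 45, 55],
            [95, 85, 75, 65, 55]]
brightened = [[p + 3 for p in row] for row in original]

h1, h2 = dhash(original), dhash(brightened)
# Uniform brightening preserves every left/right comparison,
# so the two hashes are identical.
print(hamming_distance(h1, h2))  # 0
```

In a matching pipeline, a small Hamming distance (below a tuned threshold) between an uploaded image’s hash and a known-CSAM hash would flag the upload for removal and review; exact equality is not required, which is what distinguishes perceptual hashing from cryptographic hashing.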
The regulator added that it has strong enforcement powers at its disposal, including the ability to issue fines of up to 10 percent of turnover or £18 million, whichever is higher, or to apply to a court to block a site in the UK in the most serious cases.
“Child sexual abuse is utterly sickening and file storage and sharing services are too often used to share this horrific material. Ofcom’s first priority is to make sure that sites and apps take the necessary steps to stop it being hosted or shared.
“Platforms must now act quickly to come into compliance with their legal duties, and our codes are designed to help them do that. But, make no mistake, any provider who fails to introduce the necessary protections can expect to face the full force of our enforcement action,” said Suzanne Cater, Enforcement Director at Ofcom.