The communications watchdog has said social media platforms should stop automatically suggesting children as "friends", as part of measures to combat online grooming.
Ofcom's first guidance for tech platforms sets out how they should comply with the Online Safety Act, including how to tackle illegal material online, particularly content relating to child abuse.
The regulator also published research suggesting that more than one in ten 11-18 year olds have been sent naked or semi-naked images.
In enforcing the Online Safety Act, Ofcom has published this initial code of practice covering issues such as fraud, grooming and child sexual abuse material (CSAM).

It is consulting tech platforms for their views on the proposals.
Much of the guidance focuses on preventing grooming. Large sites are expected to change their default settings so that children are not added to suggested friends lists - a feature that groomers can exploit.

They should also ensure that children's location information cannot be revealed in their profile or posts, and prevent them from receiving messages from people who are not in their contacts list.
Further measures are recommended depending on a platform's size, type and level of risk.
Ofcom will also require certain platforms to use a hash-matching system to detect child sexual abuse material.

This converts an image into a numerical "hash" and compares it against a database of hashes generated from known CSAM images. A match indicates that a known CSAM image has been found.
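The matching step described above can be sketched in a few lines. The hashes and image bytes below are placeholders, and real deployments use perceptual hashes (such as Microsoft's PhotoDNA), which survive resizing and re-encoding, rather than the plain cryptographic hash used here for illustration:

```python
import hashlib

# Hypothetical database of hashes of known illegal images.
# A cryptographic hash only matches byte-identical files, so this
# is a simplified stand-in for a real perceptual-hash database.
KNOWN_HASHES = {
    hashlib.sha256(b"example-known-image-bytes").hexdigest(),
}

def image_hash(image_bytes: bytes) -> str:
    """Turn an image's bytes into a numerical 'hash' (hex digest)."""
    return hashlib.sha256(image_bytes).hexdigest()

def is_known_match(image_bytes: bytes) -> bool:
    """Compare the image's hash against the database of known hashes."""
    return image_hash(image_bytes) in KNOWN_HASHES

print(is_known_match(b"example-known-image-bytes"))  # True: known image found
print(is_known_match(b"some-other-upload"))          # False: no match
```

The key design point is that only hashes are stored and compared, so the platform never needs to hold the original illegal images to detect re-uploads of them.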
Prof Alan Woodward, of the University of Surrey, said the technique is already widely used by search engines and social media platforms.

But he told the BBC he was concerned that Ofcom was simply validating existing practice, arguing that research to date had produced nothing better than what is already in use.
However, hash-matching will not apply to private or encrypted communications. Ofcom stresses that nothing in this guidance proposes weakening encryption.
The act contains powers that, if certain criteria are met, could require private messaging apps such as iMessage, WhatsApp and Signal to scan messages for CSAM - a widely disputed provision.

These apps use end-to-end encryption, meaning that not even the tech firm itself can read the contents of a message.
Some major apps have said they will not comply with requests to scan encrypted messages, arguing it would mean weakening the security of their systems worldwide and undermining the protection of all users, including children.
Ofcom says consultation on these powers will not begin until 2024, and it does not expect them to come into force until around 2025.
It remains unclear whether these powers can be enforced in a way that preserves the privacy of encrypted conversations.
Asked in a BBC interview whether Ofcom would ever use the powers, chief executive Dame Melanie Dawes said it was difficult to answer at present, as no technology currently exists that allows scanning in encrypted environments without decryption taking place.
She urged encrypted messaging companies to devise methods for tackling child abuse on their respective platforms.
The scale of Ofcom's task is significant. The initial guidance runs to more than 1,500 pages, and potentially more than 100,000 services, many based outside the UK, could fall within scope of regulation.
Government estimates indicate that 20,000 small businesses may have to comply.
Dame Melanie acknowledged that Ofcom had a "really big job" ahead, but said the regulator was "absolutely up for the task" and "really excited" to get started.
Ofcom also faces the challenge of managing the expectations of the public and campaigners. Whatever it announces, Dame Melanie said, some will criticise it for being too tough on tech firms and others for being too lenient.

A regulator cannot expect to be popular with everyone, she suggested, but it must strive to earn the respect of those it deals with.

"That is not our aspiration; however, it is our duty to ensure that what we demand is supported by reliable evidence," she said.
One expectation Ofcom is keen to dispel is that illegal or harmful content should be reported directly to it. Instead, its role is to ensure that tech firms have robust systems allowing users to report such content to the platforms themselves.

Dame Melanie said this was unlike TV complaints, where a viewer can complain to Ofcom and the regulator will investigate.