TechScape: What to expect from the online safety bill

You wouldn’t blame Ofcom for feeling intimidated. The world – or at least the part of it that wants to clean up the internet – is watching the online safety bill, and the UK communications regulator must enforce it. The joint committee’s hearings on the bill ended last week, and if you step back and look at what has emerged from those sessions since September, it is clear that Ofcom has work to do.

Sign up for our weekly technical newsletter, TechScape.

Quick introduction: the bill covers technology companies that allow users to post their own content or interact with each other. That means big fish such as Facebook, Twitter, Instagram, YouTube and Snapchat must comply, as must commercial porn sites such as OnlyFans. Search engines such as Google are also covered.

The bill imposes a duty of care on these companies to protect users from harmful content – or face substantial fines imposed by Ofcom. The duty of care is divided into three parts: preventing the proliferation of illegal content and activity, such as child sexual abuse material, terrorist material and hate crimes (eg racial abuse); ensuring that children are not exposed to harmful or inappropriate content; and, for big players such as Facebook, Twitter and YouTube (described as “category 1” services), ensuring adults are protected from legal but harmful content. This last category of content is to be defined by the culture secretary, after consultation with Ofcom, then scrutinised by parliament before being set out in secondary legislation.

Ofcom chief executive Dame Melanie Dawes has warned that the regulator risks being “overwhelmed” by complaints from social media users, and that it will have to face the “sheer legal weight” of big tech’s response to the bill once it becomes law, which is expected around the end of next year.

Culture secretary Nadine Dorries closed the hearings with an appearance in which she proposed a number of changes to the legislation. But even the earlier sessions had exposed the complexities and shortcomings of the bill. It needs to be simpler – but after Dorries’s appearance there is no doubt it is going to be bigger.

The committee will release its report on the bill by 10 December, and Dorries said she would consider its recommendations “very seriously indeed”. Here are some of the changes we can expect, or at least the issues the committee will address in its report, following the hearings.

A standing joint committee will oversee the law
Dorries said a standing committee of MPs and peers – modelled on the joint committee on human rights – will be set up to conduct “an ongoing review” of the landscape the law will regulate, and of the role of the secretary of state and Ofcom in implementing the bill. The body could also recommend when the secretary of state should deploy secondary powers under the bill, such as issuing guidance on how Ofcom should exercise its powers.

There will be criminal penalties for users and executives
Dorries is clearly gunning for tech executives, telling Facebook founder Mark Zuckerberg and communications chief Nick Clegg to stop worrying about the metaverse and focus on the real world. Addressing the entire tech industry, Dorries said: “Remove your harmful algorithms today and you will not be subject – named individuals – to criminal liability and prosecution.” Criminal liability for failing to deal with algorithms that steer users towards harmful content is, however, not currently in the bill. As it stands, the bill contains a deferred power, kicking in after about two years, to impose criminal penalties on executives who fail to respond to Ofcom’s requests for information accurately and in a timely manner. Dorries is now talking about imposing criminal liability within three to six months, and for a much broader offence: allowing their platforms to steer users towards harmful content. It remains unclear whether that would cover only illegal content, such as racist abuse, or extend to murkier areas such as legal but harmful material.

For users, three new criminal offences will be introduced: sending messages or posts that “convey a threat of serious harm”; publishing false information – “false communications” – intended to cause significant emotional, psychological or physical harm; and sending messages or posts intended to cause harm without reasonable excuse.

Online advertising: in the bill or not?
In his appearance before the committee, MoneySavingExpert founder Martin Lewis urged the government to include advertising in the bill as an area that should be regulated. “Scam adverts destroy people’s lives. People kill themselves after being scammed, and it should be in the bill.” Ofcom’s Dawes has suggested regulating ads alongside the Advertising Standards Authority, and the committee’s chair, Conservative MP Damian Collins, has pursued the issue of misleading political advertising. But Dorries said firmly last week that advertising, particularly fraudulent ads, would be too big an addition: “It needs its own bill.” Nonetheless, don’t be surprised if the committee pushes for it anyway, or at least makes strong recommendations for dealing with advertising either in the bill or in separate legislation.

Increased investigative powers for Ofcom
Information commissioner Elizabeth Denham, the UK’s data regulator, said during her appearance that, as the bill currently stands, Ofcom does not have sufficient powers to properly audit technology companies. During the sessions there was talk of giving the regulator access to examine platforms’ algorithms and to demand changes. Denham said she is able to “look under the bonnet” of tech companies under the age-appropriate design code, which requires websites and apps to consider “the best interests” of their child users. She said Ofcom’s powers under the bill need to be “bolstered with audit powers so the regulator can look under the bonnet”.

Currently, the bill requires companies to submit details of how their services might expose users to harmful content – and how they will combat that risk. These risk assessments will inform the platform codes of conduct that Ofcom enforces, but the feeling within the committee is that the regulator needs more clout. Dorries’s strong words on algorithms and criminal liability suggest she agrees.

Fight anonymous abuse
Former Manchester United and Leeds footballer Rio Ferdinand spoke scathingly about the failure to deal with anonymous abuse when he appeared in September. A total ban on anonymous social media accounts and posts isn’t coming, but expect some form of action: a recent opinion poll found that, among those who had experienced online abuse, 72% said it came from anonymous accounts.

Clean Up the Internet, a campaign group calling for more civility and respect online, called in its submission to the committee for action against anonymous trolls, and for the bill to require platforms to demonstrate to Ofcom that they have systems in place to deal with anonymity. It urged social media platforms to give users the ability to pre-emptively block any interaction with anonymous accounts, and to make users’ verification status clearly visible. The group also suggested that anonymous users register their identities with platforms, which would hold that account information – and could disclose it to law enforcement if necessary.

If you would like to read the full newsletter, please sign up to receive TechScape in your inbox every Wednesday.

