Guide to Online Safety Bills | Daily News Byte


Guide to Online Safety Bills

What the new internet safety laws mean for adults and children.

The Online Safety Bill is a new set of laws to protect children and adults online. It will make social media companies more accountable for the safety of their users on their platforms.

How the Online Safety Bill will protect children

The bill would make social media companies legally responsible for keeping children and young people safe online.

It will protect children by requiring social media platforms to:

  • Remove illegal content quickly or prevent it from appearing in the first place. This includes removing material that promotes self-harm
  • Prevent children from accessing harmful and age-inappropriate content
  • Enforce age limits and age-verification measures
  • Ensure that threats and risks to children are more transparent on the largest social media platforms, including by publishing risk assessments
  • Provide clear and accessible ways for parents and children to report problems online when they arise

How the Online Safety Bill will protect adults

The bill would protect adults in three ways through a ‘triple shield’.

Platforms will be required to:

  1. Remove all illegal content.

  2. Remove content prohibited by their own terms and conditions.

  3. Empower adult internet users with tools so they can tailor the type of content they see and avoid potentially harmful content if they don’t want to see it on their feeds. Children will be automatically prevented from viewing this content without changing any settings.

Types of content the bill covers

Illegal content

Some content that children and adults access online is already illegal. The bill would force social media platforms to remove all illegal content, preventing children and adults from viewing it.

The bill also introduces new offences, including, for the first time, making it illegal to share material that encourages self-harm. Platforms will need to remove this content.

This isn’t just about removing existing illegal content; it’s also about preventing it from appearing in the first place. Platforms will need to consider how they design their sites to reduce the likelihood of their being used for criminal activity.

Illegal content that platforms will be required to remove includes:

  • Child sexual abuse
  • Controlling or coercive behavior
  • Extreme sexual violence
  • Fraud
  • Hate crime
  • Inciting violence
  • Illegal immigration and people smuggling
  • Encouraging or facilitating suicide
  • Promoting self-harm
  • Revenge pornography
  • Selling illegal drugs or weapons
  • Sexual exploitation
  • Terrorism

Harmful material

Some content is not illegal but may be harmful or age-inappropriate for children. Platforms will need to prevent children from accessing it.

Harmful content that platforms will need to protect children from accessing includes:

  • Obscene content
  • Online abuse, cyberbullying or online harassment
  • Content that does not meet the criminal threshold but promotes or glorifies suicide, self-harm or eating disorders

Online safety laws will mean social media companies have to keep underage children off their platforms.

Social media companies set age limits on their platforms, and many of them say that children under 13 are not allowed, yet many younger children have accounts. The new laws will put a stop to this.

Various techniques can be used to check people’s ages online. These are known as age assurance or age verification technologies.

The new law means that social media companies will have to disclose what technology they are using, if any, and show that they are enforcing their age limits.

Adults will have more control over the content they see

The biggest platforms will need to offer tools to adult users so they can have more control over the type of content they see and who they connect with online. This includes giving them the option to filter out unverified users, which will help prevent anonymous trolls from contacting them.

The largest platforms will have to provide tools to help adult users reduce the likelihood of encountering certain types of content set out in the bill. Examples include content that does not meet criminal thresholds but promotes or encourages eating disorders or self-harm, or is racist, anti-Semitic or misogynistic.

The bill would already protect children from viewing this material.

Tools must be effective and easy to access and may include human moderation, blocking of content flagged by other Internet users, or sensitivity and warning screens.

This bill will deal with repeat offenders

These laws will require all social media companies to assess how their platforms could allow abusers to create anonymous profiles, and to take steps to ban repeat offenders, prevent them from creating new accounts, and limit what new or suspicious accounts can do.

How will the bill be implemented?

We are putting Ofcom in charge as the regulator to check that platforms protect their users.

Platforms will have to demonstrate that they have processes in place to meet the requirements set out by the bill. Ofcom will examine how effective those processes are at protecting internet users from harm.

Ofcom will have the power to take action against companies that do not comply with their new duties. Companies that fail to comply could be fined up to £18 million or 10 per cent of their annual global turnover, whichever is greater. Criminal action will be taken against senior managers who fail to comply with Ofcom’s information requests.

In the most extreme cases, with the court’s agreement, Ofcom will be able to stop payment providers, advertisers and internet service providers from working with a site, stop it from generating money or prevent it from being accessed from the UK.

How this UK law affects international companies

Ofcom will have the power to take appropriate action against all social media and tech companies, regardless of where they are based, if they are accessible to UK users.

Next steps for the bill

The new laws are currently going through Parliament and will come into effect once passed. The Department for Digital, Culture, Media and Sport is working with Ofcom to ensure the laws can be enforced as soon as they take effect.

As Ofcom’s powers come into force, the Online Safety Bill duties will be brought in through a phased approach. To address the most serious harms as soon as possible, Ofcom will initially focus on illegal content.

Find out more

See more information about online safety, including learning materials and tips for staying safe online.
