The regulator says they contain more than 40 practical measures tech firms must take, including:

- Adjusting algorithms to filter harmful content out of children's feeds
- Robust age checks for people accessing age-restricted content
- Taking quick action when harmful content is identified
- Making terms of service easy for children to understand
- Giving children the option to decline invitations to group chats which may include harmful content
- Providing support to children who come across harmful content
- A "named person accountable for children's safety"
- Management of risk to children reviewed annually by a senior body
Source: www.bbc.com