On 6 October 2021, Ofcom published guidance on measures Video Sharing Platform (“VSP”) providers can take to protect online users from harmful material and how to implement such measures effectively (the “Guidance”). The Guidance is intended to help VSP providers comply with their obligations under Part 4B of the Communications Act 2003 (the “CA”), which establishes a legal framework in the UK for VSPs (the “VSP Framework”). The VSP Framework is intended to afford VSP providers “flexibility in how they protect users”, reflecting “the diversity of the sector and the importance of allowing companies to innovate in the systems and processes they use to keep users safe.”

Further, on 7 December 2021, Ofcom published two sets of guidance to help VSP providers comply with their advertising-related obligations under the VSP Framework, covering (i) control of advertising and (ii) advertising harms and measures (together, the “Advertising Guidance”).


On 1 November 2020, prior to the UK leaving the European Union, Part 4B of the CA came into force, implementing the EU’s revised Audiovisual Media Services Directive (the “Directive”) and establishing a legal framework in the UK for VSPs. The VSP Framework was enacted in recognition of the changing landscape of content consumption and with a view to harmonising regulation for all audiovisual media services, including traditional linear services, video-on-demand and, for the first time, VSPs. Additionally, new legal requirements on advertising which stem from the Directive have also been incorporated into UK law under the CA.

On 10 March 2021, Ofcom published detailed guidance to assist service providers in assessing whether they fall within the jurisdiction of the VSP Framework. Service providers will fall within the VSP Framework if:

  • their service, as a whole or a ‘dissociable section’ of the service (for example a distinct part of an app or certain types of user accounts providing access to video) has the principal purpose of providing videos to the public (this casts the net very wide and could, for example, bring sites such as news publishers within the ambit of the VSP Framework);
  • their service is provided via an electronic communications network;
  • their service is provided on a commercial basis; and
  • the VSP provider has general control over the manner in which videos are organised on the service, but not over what videos are available to users.

From 6 April 2021, VSP providers were required to notify Ofcom that they fall within the scope of the VSP Framework. To date the likes of TikTok, Snapchat, Twitch and Triller (amongst others) have notified under the VSP Framework (a full list of VSP providers which have notified can be viewed here). Notably, some well-known platforms such as YouTube, Facebook and Twitter fall under Irish jurisdiction and therefore are not required to comply with the VSP Framework. From 1 April 2022, firms that have notified under the VSP Framework will be required to pay an annual fee to Ofcom.

Following the publication of the Advertising Guidance, a new Appendix has been included in the UK Code of Non-broadcast Advertising and Direct & Promotional Marketing (the “CAP Code”), which will apply to aspects of advertisements that are subject to the CA.

Harmful material

The VSP Framework requires VSP providers to take appropriate measures to protect:

  • the general public from “relevant harmful material”, which includes:
    • incitement of violence or hatred against particular groups; and
    • content which would be considered a criminal offence under laws relating to terrorism, child sexual abuse material, and racism and xenophobia; and
  • under-18s from “restricted material”, which includes:
    • material which has been, or would likely be, issued an 18 certificate;
    • material which has been deemed, or would likely be deemed, unsuitable for classification; and
    • other material which could impair the physical, mental or moral development of under-18s.

The requirement for VSP providers to have in place appropriate measures to protect users from such material applies to all video media appearing on VSPs, including VSP-controlled advertisements. Advertisements must also comply with “advertising-specific requirements”, which vary depending on whether or not the advertisement has been “marketed, sold or arranged” by the VSP provider (as set out below).

Ofcom Guidance

Ofcom consulted on draft guidance for VSP providers between 24 March 2021 and 2 June 2021. The Guidance published on 6 October 2021 is intended to assist VSP providers in complying with the statutory regime from a practical perspective.

Appropriate measures

Section 368Z1 of the CA sets out the general duty on VSP providers to take ‘such measures as are appropriate’ to protect users from harm. Schedule 15A of the CA sets out measures which ‘may’ be appropriate for VSP providers to take, which are summarised below along with the accompanying Ofcom Guidance in relation to each measure.

The measures set out in Schedule 15A of the CA and the accompanying Ofcom Guidance are not intended to be prescriptive. VSP providers will therefore not be required to put in place all of the measures, but should instead determine which measures are appropriate by reference to factors such as:

  • the size of the VSP (which can be determined by metrics such as reach of the platform, volume of content and the resources the platform has at its disposal) and nature of the VSP;
  • the harm that may be caused by the material in question;
  • the characteristics of the category of persons to be protected (for example, under-18s);
  • the rights and legitimate interests at stake (such as freedom of expression); and
  • any other measures which have been taken and what would be practicable and proportionate for that particular VSP.
  1. Measures relating to terms and conditions
  • One potential appropriate measure set out in the VSP Framework is the inclusion in VSP terms and conditions that (i) if a user uploads a video containing restricted material, the user must bring it to the attention of the VSP provider and (ii) users must not upload videos which contain relevant harmful material to the VSP.
  • The Guidance stipulates that the inclusion in terms and conditions of a restriction against uploading relevant harmful material is fundamental to the VSP Framework, and Ofcom considers it unlikely that effective protection of users can be achieved without this measure.
  • The Guidance encourages VSP providers to put in place terms and conditions which are easy for users to locate, understand and engage with. Importantly, terms and conditions should be effective and the Guidance states that a clear way of achieving this is through content moderation and appropriate sanctions for violations.
  • The Guidance expects the implementation of terms and conditions to be fair and transparent and to this end encourages VSP providers to set out clearly in their terms and conditions all material which is prohibited on their VSP and all potential sanctions.
  • The Guidance also highlights the expectation that terms and conditions should evolve over time; VSP providers should therefore review their terms and conditions regularly and make changes where necessary.
  2. Measures relating to the reporting, flagging or rating of content
  • The VSP Framework lists the establishment and operation of a mechanism enabling users to report or flag harmful material present on the VSP as an appropriate measure which VSP providers can put in place to achieve compliance. Any such mechanism should (i) be transparent and user-friendly and (ii) include a mechanism by which the VSP provider explains to the user flagging or reporting videos the effect of that user’s report.
  • Again, the Ofcom Guidance considers reporting and flagging mechanisms to be fundamental to the protection of users, and therefore considers it unlikely that effective protection would be afforded to users without such mechanisms.
  • The Guidance states that VSP providers should respond to reports or flags within an appropriate timeframe (taking into account the size, nature and risk profile of the VSP) and ensure that reporting and flagging mechanisms are effectively supported by internal escalation processes.
  3. Access control measures such as age assurance and parental controls
  • The VSP Framework lists both obtaining assurance as to the age of potential viewers and providing parental control systems in relation to restricted material as appropriate measures VSP providers may put in place.
  • In relation to age assurance, the Guidance states that this is a broad term which can come in several forms such as:
    • age verification (the strictest form of age assurance), where the user would provide evidence of their age such as a passport or driving licence; and
    • self-declaration (the softest form of age assurance), where the user states their age or birthdate but offers no supporting evidence.

The Guidance recommends conducting a risk assessment of the VSP and selecting an approach proportionate to the risk, in particular the risk of harm posed to under-18s by restricted material and the prevalence of such material.

  • The Guidance recommends that VSP providers which offer services to under-18s should “seriously consider” having some form of parental control feature which gives those responsible for under-18s a degree of control over what a child can see on the VSP.
  4. Complaints processes
  • The VSP Framework suggests VSP providers put in place a complaints process linked to specific measures as referred to above. However, the Guidance considers it best practice for VSP providers to implement a complaints process which covers all aspects of user safety.
  5. Media literacy tools and information
  • Finally, the VSP Framework and Guidance suggest that VSP providers put in place tools and information for individuals using the service, with the aim of improving their media literacy and empowering VSP users to protect themselves from harmful material.

The Advertising Guidance

In addition to the duties above, the Advertising Guidance has been published to assist VSP providers in complying with the advertising-related obligations contained in the VSP Framework. The Advertising Guidance reflects the distinction made in the CA between VSP-controlled advertising and non-VSP-controlled advertising and further clarifies what is meant by those terms.

VSP-controlled advertising: Ofcom considers advertising to be marketed, sold or arranged by a VSP provider when the VSP provider is involved in making advertising available on the platform, which may include: enabling advertisers to buy advertising on their platform (either directly or via a third party); and/or providing tools that enable advertisers to target or optimise the reach of their advert on the VSP. VSP providers will be directly responsible for ensuring compliance of VSP-controlled advertising with the relevant requirements. Ofcom has appointed the Advertising Standards Authority (the “ASA”) as a co-regulator of VSP-controlled advertising, allowing the ASA to take action on suspected breaches against VSP providers without the need to refer to Ofcom.

Non-VSP controlled advertising: Where advertising is not marketed, sold or arranged by a VSP provider, there will be a requirement on the VSP provider to take appropriate measures to ensure the relevant requirements in relation to the advertisement are met (but the VSP provider will not be directly responsible for meeting the relevant requirements). Ofcom will remain responsible for assessing the appropriateness of measures taken by VSP providers to protect users.

The Advertising Guidance provides that all adverts included on a VSP (both VSP-controlled and non-VSP-controlled):

  • should be recognisable as advertisements and must not convey subliminal messaging; and
  • should not display prohibited or restricted products such as cigarettes and prescription-only medicines.

In relation to non-VSP controlled advertisements, VSP providers will, in addition and as appropriate, be required to take the following measures to meet requirements relating to the transparency of advertising:

  • make available a tool for users who upload content to disclose the presence of advertising in that content; and
  • include and apply in the terms and conditions of the service a requirement that users who upload content featuring advertisements make use of that tool as applicable.


Alongside the Guidance, Ofcom also published a strategy paper outlining its proposed approach to enforcing the VSP Framework. In the strategy paper, Ofcom states that flexibility and proportionality will underpin its approach to enforcement. Importantly, it is keen to ensure that regulation does not encroach on freedom of expression.

Any actions Ofcom takes against VSPs under the VSP Framework will be in line with its existing enforcement guidelines and could include imposing financial penalties.


The Online Safety Bill (which we discuss in our blog post here) will repeal Part 4B of the CA and put in place a more comprehensive online safety framework, applicable to a wider range of online services and online harms, with Ofcom being given greater duties and powers. Compliance with the VSP Framework will provide VSP providers with an early opportunity to improve their approaches to online safety; however, VSP providers will no doubt be making any changes to systems and processes with the Online Safety Bill in mind.

Ofcom’s approach to regulation under the VSP Framework is likely to be indicative of the approach it will take when given its new powers under the Online Safety Bill, with both regimes being based on the systems and processes online service providers put in place to protect their users.

With the Online Safety Bill not likely to be passed until later this year, Ofcom’s chief executive has stated that the new Guidance for VSPs is an important step in better protecting people, and in particular children, from online harms. Some stakeholders, such as TikTok, have noted that their community guidelines go beyond what is required under the VSP Framework, but TikTok have asked that Ofcom “recognise the complexity and subjectivity in creating and implementing policies for content that may impair minors.”

Hayley Brady
Head of Media and Entertainment, London
+44 20 7466 2079

James Balfour
Senior Associate, London
+44 20 7466 7582

Rhianne Murray
Associate, London
+44 20 7466 2874