| Act of Parliament | |
|---|---|
| Long title | An Act to make provision for and in connection with the regulation by Ofcom of certain internet services; for and in connection with communications offences; and for connected purposes. |
| Citation | 2023 c. 50 |
| Introduced by | Michelle Donelan, Secretary of State for Science, Innovation and Technology (Commons); Lord Parkinson of Whitley Bay, Parliamentary Under-Secretary of State for Arts and Heritage (Lords) |
| Territorial extent | United Kingdom |
| Royal assent | 26 October 2023 |
| Commencement | On royal assent and by regulations |
| Status | Current legislation |
The text of the Online Safety Act 2023 as currently in force within the United Kingdom (including any amendments) is available from legislation.gov.uk.
The Online Safety Act 2023[1][2][3] (c. 50) is an Act of the Parliament of the United Kingdom that regulates online content. It received royal assent on 26 October 2023 and gives the relevant Secretary of State the power to designate, suppress, and record a wide range of online content deemed "illegal" or "harmful to children".[4][5]
The Act creates a new duty of care for online platforms, requiring them to take action against illegal content, and against legal content that could be "harmful" to children where children are likely to access it. Platforms that fail in this duty are liable to fines of up to £18 million or 10% of their annual turnover, whichever is higher. The Act also empowers Ofcom to block access to particular websites. At the same time, it obliges large social media platforms not to remove, and to preserve access to, journalistic or "democratically important" content, such as user comments on political parties and issues.
The Act also requires platforms, including end-to-end encrypted messengers, to scan for child sexual abuse material, despite warnings from experts that such scanning cannot be implemented without undermining users' privacy.[6] The government has said that it does not intend to enforce this provision until doing so becomes "technically feasible".[7] The Act further obliges technology platforms to introduce systems that allow users to filter out "harmful" content they do not want to see.[8][9]
The Act grants sweeping and controversial powers to the relevant Secretary of State, allowing them to intervene directly in Ofcom's operations, including the authority to dictate the content of its codes of practice. Critics argue that this centralisation of power compromises Ofcom's independence and opens the door to government control over online speech, and have condemned these powers, which can be exercised with minimal oversight and under vaguely defined emergency justifications, as authoritarian and dystopian. The legislation has drawn criticism both within the UK and overseas from politicians, academics, journalists and human rights organisations, who warn that it poses a threat to the right to privacy and to freedom of speech and expression.[10] Since the Act's age-assurance duties began to take effect in 2025, a number of platforms have required UK users to verify their age, for example through photo identification or facial age estimation, a development that has intensified concerns about privacy.