Ofcom publishes another mountain of expensive and suffocating censorship red tape
16th December 2024
See press release from ofcom.org.uk
Ofcom writes: Today we are publishing our first major policy Statement for the Online Safety regime. This decision on the Illegal Harms Codes and guidance marks a major milestone, with online providers now legally required to protect their users from illegal harm. Ofcom published proposals about the steps providers should take to address illegal harms on their services shortly after passage of the Online Safety Act in October 2023. Since then, we have been consulting carefully and widely, listening to industry, charities and campaigners, parents and children, as well as expert bodies and law enforcement agencies.

With today's publication, online providers must take action to start to comply with these new rules. The result will be a safer life online for people in the UK, especially children. Providers now have a duty to assess the risk of illegal harms on their services, with a deadline of 16 March 2025. Subject to the Codes completing the Parliamentary process, from 17 March 2025 providers will need to take the safety measures set out in the Codes, or use other effective measures, to protect users from illegal content and activity. We are ready to take enforcement action if providers do not act promptly to address the risks on their services.

Analysis to follow, but there are over 1000 pages to get through first!
And finding it in draft Australian censorship codes

27th October 2024

See article from theguardian.com
The Australian internet industry has produced draft censorship rules relating to age/ID verification, scheduled to come into force in 2025. One rule that has caught attention is that search engines will be required to age/ID verify users before links to porn or gambling sites can be provided.

The draft codes will apply to websites, social media, video games, search engines, gaming companies, app developers and internet service providers, among others. As is the case in most other countries, the authorities are refusing to specify exactly what age/ID verification mechanisms will be acceptable, leaving companies to take enormous commercial risks in guessing which mechanisms will pass muster. Examples of options include checking photo ID, facial age estimation, credit card checks, digital ID wallets or systems, or attestation by a parent or guardian.

The codes have been developed by the Australian Mobile Telecommunications Association (Amta), the Communications Alliance, the Consumer Electronics Suppliers Association (CESA), the Digital Industry Group Inc. (Digi), and the Interactive Games and Entertainment Association (IGEA). Dr Jennifer Duxbury, Digi's director for policy, regulatory affairs and research, told Guardian Australia that the group does not speak for the porn industry, adding: I can't predict what their reaction might be, whether they would withdraw from the market, or what's the likely outcome.
Ofcom announces a timetable for UK age verification censorship rules and implementation for porn websites

17th October 2024

See article from ofcom.org.uk
Ofcom writes: Parliament set us a deadline of April 2025 to finalise our codes and guidance on illegal harms and children's safety. We will finalise our illegal harms codes and guidance ahead of this deadline. Our expected timings for key milestones over the next year -- which could change -- are:

December 2024: Ofcom will publish first edition illegal harms codes and guidance. Platforms will have three months to complete illegal harms risk assessments.
January 2025: Ofcom will finalise children's access assessment guidance and guidance for pornography providers on age assurance. Platforms will have three months to assess whether their service is likely to be accessed by children.
February 2025: Ofcom will consult on best practice guidance on protecting women and girls online, earlier than previously planned.
March 2025: Platforms must complete their illegal harms risk assessments and implement appropriate safety measures.
April 2025: Platforms must complete children's access assessments. Ofcom will finalise children's safety codes and guidance. Companies will have three months to complete children's risk assessments.
Spring 2025: Ofcom will consult on additional measures for second edition codes and guidance.
July 2025: Platforms must complete children's risk assessments and make sure they implement appropriate safety measures.

We will review selected risk assessments to ensure they are suitable and sufficient, in line with our guidance, and seek improvements where we believe firms have not adequately mitigated the risks they face.

Ofcom has the power to take enforcement action against platforms that fail to comply with their new duties, including imposing significant fines where appropriate. In the most serious cases, Ofcom will be able to seek a court order to block access to a service in the UK, or limit its access to payment providers or advertisers. We are prepared to take strong action if tech firms fail to put in place the measures that will be most impactful in protecting users, especially children, from serious harms such as those relating to child sexual abuse, pornography and fraud.
Australian Government is quick to want to grab age verification data for its own uses

9th June 2024

Thanks to Trog. See article from thenightly.com.au
Another layer of privacy is being stripped from Australian internet users. At a time when users are being forced to hand over personal ID data in the name of age verification, it seems that governments will be quick to demand that internet companies hand over such data to them. It was announced that internet companies will now be forced to reveal the ages of active users, supposedly so that the Australian Government can get a grip on the impact these platforms are having on Australian kids.

Last week the Albanese Government announced sweeping reforms intended to boost transparency and accountability for digital platforms used by Australians, including popular social media, messaging and gaming services. Communications Minister Michelle Rowland said the government had amended the Basic Online Safety Expectations to better address new and emerging online safety issues and help hold the tech industry accountable. The new Determination will also require companies to provide, on request of the eSafety Commissioner, a report on the number of active end-users of services in Australia, broken down according to the number of users who are children or adults.

eSafety Commissioner Julie Inman Grant said that without information on users' ages, the Government was flying blind. Inman Grant said these strengthened powers meant her office would now be able to find out precisely how many children are on specific services. She said:

This needs to be a starting point of understanding how many under-aged users are on these platforms today, otherwise governments are flying blind. If we're serious about effectively managing the ages and stages at which a child can partake in social media, we need to move forward with all technology companies deploying effective age-assurance systems.