Melon Farmers Unrated

Internet Porn Censorship


2018: Oct-Dec


 

Offsite Article: The internet war on sex is here...


Link Here 8th December 2018
No sex please, we're beholden to our advertisers. By Violet Blue

See article from engadget.com

 

 

Censorship hub...

Uganda blocks 27 internet porn websites


Link Here 6th December 2018
Full story: Internet Censorship in Uganda...Banning VPNs and taxing social media
ISPs in Uganda have blocked 27 pornography websites after a directive was issued by the Uganda Communications Commission.

Pornhub, Xvideos, and Youporn were among the top 100 most visited websites.

The Daily Monitor reports that at least 25 of the 27 banned websites cannot be accessed on mobile phones. However, users of Virtual Private Networks can access the banned sites.

The chairperson of the Pornography Control Committee, Annette Kezaabu, told the Monitor that there has been a drop in the number of people accessing pornography since the prominent porn sites were blocked. She said:

We have a team that is compiling a list of other porn sites that will be blocked

We anticipate that some people will open up new sites but this is a continuous process.

 

 

Offsite Article: And who better to do the job?...


Link Here 5th December 2018
New Zealand film censor, with a keen eye on upcoming UK censorship, publishes a report on porn viewing by the young and inevitably finds that they want porn to be censored

See report [pdf] from classificationoffice.govt.nz

 

 

Offsite Article: Online porn filters will never work...


Link Here 26th November 2018
Full story: BBFC Internet Porn Censors...BBFC: Age Verification We Don't Trust
Beyond the massive technical challenge, filters are a lazy alternative to effective sex education. By Lux Alptraum

See article from theverge.com

 

 

Adult gaming website hacked...

Further demonstrating how dangerous it is for the government to demand that identity information is handed over before viewers can access the adult web


Link Here 21st November 2018
The website of an adult video game featuring sexualised animals has been hacked, with the information of nearly half a million subscribers stolen.

High Tail Hall is a customisable role-playing game, which features what the website describes as sexy furry characters, including buxom zebras and scantily clad lionesses.

The compromised information, including email addresses, names and order histories, resurfaced on a popular hacking forum a few months later. HTH Studio has acknowledged the breach and says that it has been fixed. The company added:

Both our internal security and web team security assures us that no financial data was compromised. The security of our users is the highest priority.

It further recommended that all users change their passwords. So although credit card data is safe, users are still at risk from identity fraud, outing and blackmail.

It is the latest in a long series of hacks aimed at adult sites and demonstrates the dangers for UK porn viewers when they are forced to supply identity information to be able to browse the adult web.

 

 

Offsite Article: How US Republicans Gave Up on Porn...


Link Here 20th November 2018
Once, the fight against pornography was the beating heart of the American culture war. Now porn is a ballooning industry with no real opponents. What happened? By Tim Alberta

See article from politico.com

 

 

Miserable Bangladesh...

High Court orders the censorship of all internet porn websites for 6 months


Link Here 19th November 2018
The Bangladesh High Court has ordered the country's government to block all pornography websites and publication of all obscene materials from the internet for the next six months.

The court also ordered the authorities concerned to explain in four weeks why pornography websites and publication of obscene materials should not be declared illegal.

The judges issued the orders in response to a writ petition filed by the Law and Life Foundation, which campaigns for internet censorship.

 

 

BBFC: Age verification we don't trust...

Analysis of BBFC's Post-Consultation Guidance by the Open Rights Group


Link Here 8th November 2018
Full story: BBFC Internet Porn Censors...BBFC: Age Verification We Don't Trust
Following the conclusion of their consultation period, the BBFC have issued new age verification guidance that has been laid before Parliament.

Summary

The new code has some important improvements, notably the introduction of a voluntary scheme for privacy, close to or based on a GDPR Code of Conduct. This is a good idea, but should not be put in place as a voluntary arrangement. Companies may not want the attention of a regulator, or may simply wish to apply lower or different standards, and ignore it. It is unclear why, if the government now recognises that privacy protections like this are needed, the government would also leave the requirements as voluntary.

We are also concerned that the voluntary scheme may not be up and running before the AV requirement is put in place. Given that 25 million UK adults are expected to sign up to these products within a few months of launch, this would be very unhelpful.

Parliament should now:

  • Ask the government why the privacy scheme is to be voluntary, if the risks of relying on general data protection law are now recognised;
  • Ask for assurance from BBFC that the voluntary scheme will cover all of the major operators; and
  • Ask for assurance from BBFC and DCMS that the voluntary privacy scheme will be up and running before obliging operators to put Age Verification measures in place.

The draft code can be found here.

Lack of Enforceability of Guidance

The Digital Economy Act does not allow the BBFC to judge age verification tools by any standard other than whether or not they sufficiently verify age. We asked that the BBFC persuade the DCMS that statutory requirements for privacy and security were required for age verification tools.

The BBFC have clearly acknowledged privacy and security concerns with age verification in their response. However, the BBFC indicate in their response that they have been working with the ICO and DCMS to create a voluntary certification scheme for age verification providers:

"This voluntary certification scheme will mean that age-verification providers may choose to be independently audited by a third party and then certified by the Age-verification Regulator. The third party's audit will include an assessment of an age-verification solution's compliance with strict privacy and data security requirements."

The lack of a requirement for additional and specific privacy regulation in the Digital Economy Act is the cause for this voluntary approach.

While the voluntary scheme described above is likely to be of some assistance in promoting better standards among age verification providers, the "strict privacy and data security requirements" which the voluntary scheme mentions are not a statutory requirement, leaving some consumers at greater risk than others.

Sensitive Personal Data

The data handled by age verification systems is sensitive personal data. Age verification services must directly identify users in order to accurately verify age. Users will be viewing pornographic content, and the data about what specific content a user views is highly personal and sensitive. This has potentially disastrous consequences for individuals and families if the data is lost, leaked, or stolen.

Following a hack affecting Ashley Madison -- a dating website for extramarital affairs -- a number of the site's users were driven to suicide as a result of the public exposure of their sexual activities and interests.

For the purposes of GDPR, data handled by age verification systems falls under the criteria for sensitive personal data, as it amounts to "data concerning a natural person's sex life or sexual orientation".

Scheduling Concerns

It is of critical importance that any accreditation scheme for age verification providers, or GDPR code of conduct if one is established, is in place and functional before enforcement of the age verification provisions in the Digital Economy Act commences. All of the major providers who are expected to dominate the age verification market should undergo their audit under the scheme before consumers will be expected to use the tool. This is especially true when considering the fact that MindGeek have indicated their expectation that 20-25 million UK adults will sign up to their tool within the first few months of operation. A voluntary accreditation scheme that begins enforcement after all these people have already signed up would be unhelpful.

Consumers should be empowered to make informed decisions about the age verification tools that they choose from the very first day of enforcement. No delays are acceptable if users are expected to rely upon the scheme to inform themselves about the safety of their data. If this cannot be achieved prior to the start of expected enforcement of the DE Act's provisions, then the planned date for enforcement should be moved back to allow for the accreditation to be completed.

Issues with Lack of Consumer Choice

It is of vital importance that consumers, if they must verify their age, are given a choice of age verification providers when visiting a site. This enables users to choose which provider they trust with their highly sensitive age verification data and prevents one actor from dominating the market and thereby promoting detrimental practices with data. The BBFC also acknowledge the importance of this in their guidance, noting in 3.8:

"Although not a requirement under section 14(1) the BBFC recommends that online commercial pornography services offer a choice of age-verification methods for the end-user".

This does not go far enough to acknowledge the potential issues that may arise in a fragmented market where pornographic sites are free to offer only a single tool if they desire.

Without a statutory requirement for sites to offer all appropriate and available tools for age verification and log in purposes, it is likely that a market will be established in which one or two tools dominate. Smaller sites will then be forced to adopt these dominant tools as well, to avoid friction with consumers who would otherwise be required to sign up to a new provider.

This kind of market for age verification tools will provide little room for a smaller provider with a greater commitment to privacy or security to survive and robs users of the ability to choose who they trust with their data.

We already called for it to be made a statutory requirement that pornographic sites must offer a choice of providers to consumers who must age verify; however, this suggestion has not been taken up.

We note that the BBFC has been working with the ICO and DCMS to produce a voluntary code of conduct. A potential alternative solution would be to ensure that a site is only considered compliant if it offers users a number of tools which have been accredited under the additional privacy and security requirements of the voluntary scheme.

GDPR Codes of Conduct

A GDPR "Code of Conduct" is a mechanism for providing guidelines to organisations who process data in particular ways, and allows them to demonstrate compliance with the requirements of the GDPR.

A code of conduct is voluntary, but compliance is continually monitored by an appropriate body who are accredited by a supervisory authority. In this case, the "accredited body" would likely be the BBFC, and the "supervisory authority" would be the ICO. The code of conduct allows for certifications, seals and marks which indicate clearly to consumers that a service or product complies with the code.

Codes of conduct are expected to provide more specific guidance on exactly how data may be processed or stored. In the case of age verification data, the code could contain stipulations on the following (a brief illustrative sketch follows the list):

  • Appropriate pseudonymisation of stored data;
  • Data and metadata retention periods;
  • Data minimisation recommendations;
  • Appropriate security measures for data storage;
  • Security breach notification procedures;
  • Re-use of data for other purposes.
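
As a purely illustrative sketch of how a few of the points above (pseudonymisation, data minimisation and retention periods) might look in practice, the fragment below stores only a salted hash of the user's identifier, a pass flag and a timestamp, and purges records after an assumed 24-hour retention window. The record layout, function names and retention period are all assumptions made for this example and are not drawn from the BBFC or ORG documents.

```python
import hashlib
import os
import time

RETENTION_SECONDS = 24 * 60 * 60  # assumed retention period, for illustration only


def pseudonymise(identity, salt):
    """Replace a direct identifier with a salted hash so stored records
    cannot be linked back to a person without the separately held salt."""
    return hashlib.sha256(salt + identity.encode("utf-8")).hexdigest()


def record_verification(identity, salt):
    """Keep only what is needed to show an age check happened:
    a pseudonym, a pass flag and a timestamp - no viewing data at all."""
    return {
        "subject": pseudonymise(identity, salt),
        "age_verified": True,
        "checked_at": time.time(),
    }


def purge_expired(records):
    """Data retention: drop any record older than the retention period."""
    cutoff = time.time() - RETENTION_SECONDS
    return [r for r in records if r["checked_at"] >= cutoff]


if __name__ == "__main__":
    salt = os.urandom(16)  # in practice the salt would be held separately from the records
    store = [record_verification("jane.doe@example.com", salt)]
    store = purge_expired(store)
    print(store)
```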

The BBFC's proposed "voluntary standard" regime appears to be similar to a GDPR code of conduct, though it remains to be seen how specific the stipulations in the BBFC's standard are. A code of conduct would also involve being entered into the ICO's public register of UK approved codes of conduct, and the EDPB's public register for all codes of conduct in the EU.

Similarly, GDPR Recital 99 notes that "relevant stakeholders, including data subjects" should be consulted during the drafting period of a code of conduct - a requirement which is not in place for the BBFC's voluntary scheme.

It is possible that the BBFC have opted to create this voluntary scheme for age verification providers rather than use a code of conduct, because they felt they may not meet the GDPR requirements to be considered as an appropriate body to monitor compliance. Compliance must be monitored by a body who has demonstrated:

  • Their expertise in relation to the subject-matter;
  • They have established procedures to assess the ability of data processors to apply the code of conduct;
  • They have the ability to deal with complaints about infringements; and
  • Their tasks do not amount to a conflict of interest.

Parties Involved in the Code of Conduct Process

As noted by GDPR Recital 99, a consultation should be a public process which involves stakeholders and data subjects, and their responses should be taken into account during the drafting period:

"When drawing up a code of conduct, or when amending or extending such a code, associations and other bodies representing categories of controllers or processors should consult relevant stakeholders, including data subjects where feasible , and have regard to submissions received and views expressed in response to such consultations."

The code of conduct must be approved by a relevant supervisory authority (in this case the ICO).

An accredited body (BBFC) that establishes a code of conduct and monitors compliance is able to establish their own structures and procedures under GDPR Article 41 to handle complaints regarding infringements of the code, or regarding the way it has been implemented. BBFC would be liable for failures to regulate the code properly under Article 41(4), [1] however DCMS appear to have accepted the principle that the government would need to protect BBFC from such liabilities. [2]

GDPR Codes of Conduct and Risk Management

Below is a list of the risks created by age verification which we identified during the consultation process. For each risk, we have considered whether a GDPR code of conduct (CoC) may help to mitigate its effects.

  • User identity may be correlated with viewed content. CoC mitigation: Partially. This risk can never be entirely mitigated if AV is to go ahead, but a CoC could contain very strict restrictions on what identifying data could be stored after a successful age verification.
  • Identity may be associated with an IP address, location or device. CoC mitigation: No. It would be very difficult for a CoC to mitigate this risk, as the only safe mitigation would be not to collect user identity information.
  • An age verification provider could track users across all the websites its tool is offered on. CoC mitigation: Yes. Strict rules could be put in place about what data an age verification provider may store, and what data it is forbidden from storing.
  • Users may be incentivised to consent to further processing of their data in exchange for rewards (content, discounts etc.). CoC mitigation: Yes. Age verification tools could be expressly forbidden from offering anything in exchange for user consent.
  • Leaked data creates major risks for identified individuals and cannot be revoked or adequately compensated for. CoC mitigation: Partially. A CoC can never fully mitigate this risk if any data is being collected, but it could contain strict prohibitions on storing certain information and specify retention periods after which data must be destroyed, which may mitigate the impact of a data breach.
  • Risks to the user of access via shared computers if viewing history is stored alongside age verification data. CoC mitigation: Yes. A CoC could specify that any accounts for pornographic websites which may track viewed content must be strictly separate and not in any visible way linked to a user's age verification account or data that confirms their identity.
  • Age verification systems are likely to trade off convenience for security (no 2FA, auto-login, etc.). CoC mitigation: Yes. A CoC could stipulate that login cookies that "remember" a returning user must only persist for a short time period, and should recommend or enforce two-factor authentication (see the sketch after this list).
  • The need to re-login to age verification services to access pornography in "private browsing" mode may lead people to avoid using this feature and generate much more data which is then stored. CoC mitigation: No. A CoC cannot fix this issue. Private browsing by nature will not store any login cookies or other objects and will require the user to re-authenticate with age verification providers every time they wish to view adult content.
  • Users may turn to alternative tools to avoid age verification, which carry their own security risks (especially "free" VPN services or peer-to-peer networks). CoC mitigation: No. Many UK adults, although over 18, will be uncomfortable with the need to submit identity documents to verify their age and will seek alternative means to access content. It is unlikely that many of these individuals will be persuaded by an accreditation under a GDPR code.
  • Age verification login details may be traded and shared among teenagers or younger children, which could lead to bullying or "outing" if such details are linked to viewed content. CoC mitigation: Yes. Strict rules could be put in place about what data an age verification provider may store, and what data it is forbidden from storing.
  • Child abusers could use their access to age verified content as an adult as leverage to create and exploit relationships with children and teenagers seeking access to such content (grooming). CoC mitigation: No. This risk will exist as long as age verification is providing a successful barrier to accessing such content for under-18s who wish to do so.
  • The sensitivity of content dealt with by age verification services means that users who fall victim to phishing scams or fraud have a lower propensity to report it to the relevant authorities. CoC mitigation: Partially. A CoC or education campaign may help consumers identify trustworthy services, but it cannot fix the core issue, which is that users are being socialised into it being "normal" to input their identity details into websites in exchange for pornography. Phishing scams resulting from age verification will appear and will be common, and the sensitivity of the content involved is a disincentive to reporting them.
  • The use of credit cards as an age verification mechanism creates an opportunity for fraudulent sites to engage in credit card theft. CoC mitigation: No. Phishing and fraud will be common. A code of conduct which lists compliant sites and tools externally on the ICO website may be useful, but a phishing site may simply pretend to be another (compliant) tool, or rely on the fact that users are unlikely to check with the ICO every time they wish to view pornographic content.
  • The rush to get age verification tools to market means they may take significant shortcuts when it comes to privacy and security. CoC mitigation: Yes. A CoC could assist in solving this issue if tools are given time to be assessed for compliance before the age verification regime commences.
  • A single age verification provider may come to dominate the market, leaving users little choice but to accept whatever terms the provider offers. CoC mitigation: Partially. Practically, a CoC could mitigate some of the effects of an age verification tool monopoly if the dominant tool is accredited under the Code. However, this relies on users being empowered to demand compliance with a CoC, and it is possible that users will instead be left with a "take it or leave it" situation where the dominant tool is not CoC accredited.
  • Allowing pornography "monopolies" such as MindGeek to operate age verification tools is a conflict of interest. CoC mitigation: Partially. As the BBFC note in their consultation response, it would not be reasonable to prohibit a pornographic content provider from running an age verification service, as it would prevent any site from running its own tool. However, under a CoC it is possible that a degree of separation could be enforced that requires age verification tools to adhere to strict rules about the use of data, which could mitigate the effects of a large pornographic content provider attempting to collect as much user data as possible for their own business purposes.
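
To make the cookie-persistence point above concrete, here is a minimal sketch of how an age verification front end could keep sessions non-persistent and short-lived by default, so that closing the browser effectively logs the user out. Flask is used purely as an example framework; the route, lifetime and settings are assumptions for illustration and do not describe any actual provider's implementation (two-factor authentication is omitted for brevity).

```python
from datetime import timedelta

from flask import Flask, session

app = Flask(__name__)
app.secret_key = "replace-with-a-real-secret"  # example only

app.config.update(
    SESSION_COOKIE_SECURE=True,        # cookie sent over HTTPS only
    SESSION_COOKIE_HTTPONLY=True,      # not readable from JavaScript
    SESSION_COOKIE_SAMESITE="Strict",  # not sent on cross-site requests
    PERMANENT_SESSION_LIFETIME=timedelta(minutes=15),  # assumed short lifetime if a user opts in
)


@app.route("/age-verified")
def age_verified():
    # session.permanent defaults to False, so the cookie is discarded when the
    # browser closes: "logged out by default" unless the user explicitly opts in.
    session["age_verified"] = True
    return "Age verification recorded for this browser session only."


if __name__ == "__main__":
    app.run()
```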
 

[1] "Infringements of the following provisions shall, in accordance with paragraph 2, be subject to administrative fines up to 10 000 000 EUR, or in the case of an undertaking, up to 2 % of the total worldwide annual turnover of the preceding financial year, whichever is higher: the obligations of the monitoring body pursuant to Article 41(4)."

[2] "contingent liability will provide indemnity to the British Board of Film Classification (BBFC) against legal proceedings brought against the BBFC in its role as the age verification regulator for online pornography."

 

 

Offsite Article: Millions of porn videos will not be blocked by UK online age checks...


Link Here 21st October 2018
Full story: BBFC Internet Porn Censors...BBFC: Age Verification We Don't Trust
The government makes changes such that image hosting sites that do not identify as porn sites will not need age verification for the porn images they carry

See article from theguardian.com

 

 

Not taking censorship lying down...

MoneySupermarket survey finds that 25% of customers will take action if their porn is blocked


Link Here 16th October 2018
Full story: BBFC Internet Porn Censors...BBFC: Age Verification We Don't Trust
In a survey more about net neutrality than porn censorship, MoneySupermarket noted:

We conducted a survey of over 2,000 Brits on this and it seems that if an ISP decided to block sites, it could result in increasing numbers of Brits switching - 64 per cent of Brits would be likely to switch ISP if they put blocks in place

In reality, this means millions could be considering a switch, with nearly six million having tried to access a site that was blocked in the last week - nearly one in 10 across the country.

It's an issue even more pertinent for those aged 18 to 34, with nearly half (45 per cent) having tried to access a site that was blocked at some point.

While ISPs might block sites for various reasons, a quarter of Brits said they would switch ISP if they were blocked from viewing adult sites - with those living with partners the most likely to do so!

Now switching ISPs isn't going to help much if the BBFC, the government appointed porn censor, has dictated that all ISPs block porn sites. But maybe these 25% of internet users will take up alternatives such as subscribing to a VPN service.

 

 

Offsite Article: It's politically incorrect to fantasise over Thai or black girls...


Link Here 16th October 2018
Race, porn, and education: will the UK's 2020 sex education update teach people to be PC about their choice of porn?

See article from opendemocracy.net

 

 

The new UK porn censor lays out its stall...

The BBFC launches a new website


Link Here 11th October 2018
Full story: BBFC Internet Porn Censors...BBFC: Age Verification We Don't Trust
There's loads of new information today about the upcoming internet porn censorship regime to be coordinated by the BBFC.

The BBFC has launched a new website, ageverificationregulator.com , perhaps to distance itself a bit from its film censorship work.

The BBFC has made a few changes to its approach since the rather ropey document published prior to the BBFC's public consultation. In general the BBFC seems a little more pragmatic about trying to get adult porn users to buy into the age verification way of thinking. The BBFC seems supportive of the anonymously bought porn access card from the local store, and has taken a strong stance against age verification providers who reprehensibly want to record people's porn browsing, claiming a need to provide an audit trail.

The BBFC has also decided to offer a service to certify age verification providers in the way that they protect people's data. This is again probably targeted at making adult porn users a bit more confident in handing over ID.

The BBFC's tone is a little more acknowledging of people's privacy concerns, but it is the government's law, implemented by the BBFC, that allows the recipients of the data to use it more or less how they like. Once you tick the 'take it or leave it' consent box allowing the AV provider 'to make your user experience better', then they can do what they like with your data (although GDPR does kindly let you later withdraw that consent and see what they have got on you).

Another theme that runs through the site is a rather ironic acceptance that, for all the devastation that will befall the UK porn industry, for all the lives ruined by people having their porn viewing outed, for all the lives ruined by fraud and identity theft, somehow the regime is only about stopping young children 'stumbling on porn'... because the older, more determined children will still know how to find it anyway.

So the BBFC has laid out its stall, and it's a little more conciliatory to porn users, but I for one will never hand over any ID data to anyone connected with servicing porn websites. I suspect that many others will feel the same. If you can't trust the biggest companies in the business with your data, what hope is there for anyone else?

There's no word yet on when all this will come into force, but the schedule seems to be 3 months after the BBFC scheme has been approved by Parliament. This approval seems scheduled to be debated in Parliament in early November, eg on 5th November there will be a House of Lords session:

Implementation by the British Board of Film Classification of age-verifications to prevent children accessing pornographic websites: Baroness Benjamin, Oral questions

So the earliest it could come into force is about mid February.

 

 

Preventing children and non-human operators from being able to access porn...

BBFC publishes its sometimes bizarre Guidance on Age-verification Arrangements


Link Here 11th October 2018
Full story: BBFC Internet Porn Censors...BBFC: Age Verification We Don't Trust

The BBFC has published its Age Verification Guidance document that will underpin the implementation of internet porn censorship in the UK.

Perhaps a key section is:

5. The criteria against which the BBFC will assess that an age-verification arrangement meets the requirement under section 14(1) to secure that pornographic material is not normally accessible by those under 18 are set out below:

a. an effective control mechanism at the point of registration or access to pornographic content by the end-user which verifies that the user is aged 18 or over at the point of registration or access

b. use of age-verification data that cannot be reasonably known by another person, without theft or fraudulent use of data or identification documents nor readily obtained or predicted by another person

c. a requirement that either a user age-verify each visit or access is restricted by controls, manual or electronic, such as, but not limited to, password or personal identification numbers. A consumer must be logged out by default unless they positively opt-in for their log in information to be remembered

d. the inclusion of measures which authenticate age-verification data and measures which are effective at preventing use by non-human operators including algorithms

It is fascinating that the BBFC feels bots need to be banned; perhaps they need to be 18 years old too before they can access porn. I am not sure if porn sites will appreciate Goggle-bot being banned from their sites. I love the idea that the word 'algorithms' has been elevated to some sort of living entity.

It all smacks of being written by people who don't know what they are talking about.

In a quick read I thought the following paragraph was important:

9. In the interests of data minimisation and data protection, the BBFC does not require that age-verification arrangements maintain data for the purposes of providing an audit trail in order to meet the requirements of the act.

It rather suggests that the BBFC pragmatically accept that convenience and buy-in from porn users is more important than making life dangerous for everybody, just in case a few teenagers get hold of an access code.

 

 

A significant number of responses raised concerns about the introduction of age-verification...

BBFC publishes its summary of the consultation responses


Link Here 11th October 2018
Full story: BBFC Internet Porn Censors...BBFC: Age Verification We Don't Trust

BBFC Executive Summary

The British Board of Film Classification was designated as the age-verification regulator under Part 3 of the Digital Economy Act on 21 February 2018. The BBFC launched its consultation on the draft Guidance on Age-verification Arrangements and draft Guidance on Ancillary Service Providers on 26 March 2018. The consultation was available on the BBFC's website and asked for comments on the technical aspects on how the BBFC intends to approach its role and functions as the age-verification regulator. The consultation ran for 4 weeks and closed on 23 April 2018, although late submissions were accepted until 8 May 2018.

There were a total of 624 responses to the consultation. The vast majority of those (584) were submitted by individuals, with 40 submitted by organisations. 623 responses were received via email, and one was received by post. Where express consent has been given for their publication, the BBFC has published responses in a separate document. Response summaries from key stakeholders are in part 4 of this document.

Responses from stakeholders such as children's charities, age-verification providers and internet service providers were broadly supportive of the BBFC's approach and age-verification standards. Some responses from these groups asked for clarification to some points. The BBFC has made a number of amendments to the guidance as a result. These are outlined in chapter 2 of this document. Responses to questions raised are covered in chapter 3 of this document.

A significant number of responses, particularly from individuals and campaign groups, raised concerns about the introduction of age-verification, and set out objections to the legislation and regulatory regime in principle. Issues included infringement of freedom of expression, censorship, problematic enforcement powers and an unmanageable scale of operation. The government's consultation on age-verification in 2016 addressed many of these issues of principle. More information about why age-verification has been introduced, and the considerations given to the regulatory framework and enforcement powers, can be found in the 2016 consultation response by the Department for Digital, Culture, Media and Sport.

 

 

Internet TV censorship extended...

New rules for audiovisual media services approved by the European Parliament


Link Here 3rd October 2018

New rules on audiovisual media services will apply to broadcasters, and also to video-on-demand and video-sharing platforms

MEPs voted on updated rules on audiovisual media services covering child protection, stricter rules on advertising, and a requirement for 30% European content in video-on-demand.

Following the final vote on this agreement, the revised legislation will apply to broadcasters, but also to video-on-demand and video-sharing platforms, such as Netflix, YouTube or Facebook, as well as to live streaming on video-sharing platforms.

The updated rules will ensure:

  • Enhanced protection of minors from violence, hatred, terrorism and harmful advertising

Audiovisual media services providers should have appropriate measures to combat content inciting violence, hatred and terrorism, while gratuitous violence and pornography will be subject to the strictest rules. Video-sharing platforms will now be responsible for reacting quickly when content is reported or flagged by users as harmful.

The legislation does not include any automatic filtering of uploaded content, but, at the request of the Parliament, platforms need to create a transparent, easy-to-use and effective mechanism to allow users to report or flag content.

The new law includes strict rules on advertising, product placement in children's TV programmes and content available on video-on-demand platforms. EP negotiators also secured a personal data protection mechanism for children, imposing measures to ensure that data collected by audiovisual media providers are not processed for commercial use, including for profiling and behaviourally targeted advertising.

  • Redefined limits of advertising

Under the new rules, advertising can take up a maximum of 20% of the daily broadcasting period between 6.00 and 18.00, giving the broadcaster the flexibility to adjust their advertising periods. A prime-time window between 18:00 and 0:00 was also set out, during which advertising will only be allowed to take up a maximum of 20% of broadcasting time.
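
By way of example, that corresponds to a maximum of 144 minutes of advertising across the 12-hour daytime window and a maximum of 72 minutes across the 6-hour prime-time window.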

  • 30% of European content on the video-on-demand platforms' catalogues

In order to support the cultural diversity of the European audiovisual sector, MEPs ensured that 30% of content in the video-on-demand platforms' catalogues should be European.
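
For example, a platform offering a catalogue of 1,000 titles in a member state would need at least 300 of them to be European works.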

Video-on-demand platforms are also asked to contribute to the development of European audiovisual productions, either by investing directly in content or by contributing to national funds. The level of contribution in each country should be proportional to their on-demand revenues in that country (member states where they are established or member states where they target the audience wholly or mostly).

The legislation also includes provisions regarding accessibility, integrity of a broadcaster's signal, strengthening regulatory authorities and promoting media competences.

Next steps

The deal still needs to be formally approved by the Council of EU ministers before the revised law can enter into force. Member States have 21 months after its entry into force to transpose the new rules into national legislation.

The text was adopted by 452 votes against 132, with 65 abstentions.

Article 6a

A new section has been added to the AVMS rules re censorship

  1. Member States shall take appropriate measures to ensure that audiovisual media services provided by media service providers under their jurisdiction which may impair the physical, mental or moral development of minors are only made available in such a way as to ensure that minors will not normally hear or see them. Such measures may include selecting the time of the broadcast, age verification tools or other technical measures. They shall be proportionate to the potential harm of the programme. The most harmful content, such as gratuitous violence and pornography, shall be subject to the strictest measures.
     

  2. Personal data of minors collected or otherwise generated by media service providers pursuant to paragraph 1 shall not be processed for commercial purposes, such as direct marketing, profiling and behaviourally targeted advertising.
     

  3. Member States shall ensure that media service providers provide sufficient information to viewers about content which may impair the physical, mental or moral development of minors. For this purpose, media service providers shall use a system describing the potentially harmful nature of the content of an audiovisual media service. For the implementation of this paragraph, Member States shall encourage the use of co-regulation as provided for in Article 4a(1).
     

  4. The Commission shall encourage media service providers to exchange best practices on co-regulatory codes of conduct. Member States and the Commission may foster self-regulation, for the purposes of this Article, through Union codes of conduct as referred to in Article 4a(2).

Article 4a suggests possible organisation of the censors assigned to the task, eg state censors, state controlled organisations eg Ofcom, or nominally state controlled co-regulators like the defunct ATVOD.

Article 4a(3) notes that censorial countries like the UK are free to add further censorship rules of their own:

Member States shall remain free to require media service providers under their jurisdiction to comply with more detailed or stricter rules in compliance with this Directive and Union law, including where their national independent regulatory authorities or bodies conclude that any code of conduct or parts thereof have proven not to be sufficiently effective. Member States shall report such rules to the Commission without undue delay.



