On the evening of February 13, negotiators from the European Parliament and the Council concluded the trilogue negotiations with a final text for the new EU Copyright Directive.
For two years we've debated different drafts and versions of the controversial Articles 11 and 13. Now, there is no more ambiguity: This law will fundamentally change the internet as we know it -- if it is adopted in the upcoming final vote. But we can still prevent that!
Commercial sites and apps where users can post material must make "best efforts" to preemptively buy licences for anything that users may possibly upload -- that is: all copyrighted content in the world. An impossible feat.
In addition, all but very few sites (those both tiny and very new) will need to do everything in their power to prevent anything from ever going online that may be an unauthorised copy of a work that a rightsholder has registered with the platform. They will have no choice but to deploy upload filters, which are by their nature both expensive and error-prone.
Should a court ever find their licensing or filtering efforts insufficient, sites will be directly liable for infringements as if they had committed them themselves. This massive threat will lead platforms to over-comply with these rules to stay on the safe side, further worsening the impact on our freedom of speech.
Reproducing more than "single words or very short extracts" of news stories will require a licence. That will likely cover many of the snippets commonly shown alongside links today in order to give you an idea of what they lead
to. We will have to wait and see how courts interpret what "very short" means in practice -- until then, hyperlinking (with snippets) will be mired in legal uncertainty.
No exceptions are made even for services run by individuals, small companies or non-profits, which probably includes any monetised blogs or websites.
The project to allow Europeans to conduct Text and Data Mining, crucial for modern research and the development of artificial intelligence, has been obstructed with too many caveats and requirements. Rightsholders can opt out of having their works datamined by anyone except research organisations.
Authors' rights: The Parliament's proposal that authors should have a right to proportionate remuneration has been severely watered down: Total buy-out contracts will continue to be the norm.
Minor improvements for access to cultural heritage: Libraries will be able to publish out-of-commerce works online, and museums will no longer be able to claim copyright on photographs of centuries-old paintings.
How we got here

Former digital Commissioner Oettinger proposed the law.
The history of this law is a shameful one.
From the very beginning, the purpose of Articles 11 and 13 was never to solve clearly-defined issues in copyright law with well-assessed measures, but to serve powerful special interests, with hardly any concern for the collateral damage caused.
In his conservative EPP group, the driving force behind this law, dissenters were marginalised. The work of their initially-appointed representative was thrown out after the conclusions she reached proved too sensible. Her replacement, Mr Voss, then voted so blindly in favour of any and all restrictive measures that he was caught by surprise by some of the nonsense he had gotten approved. His party, the German CDU/CSU, nonchalantly violated the coalition agreement it had signed (which rejected upload filters), paying no mind to its own minister for digital issues.
It took efforts equally herculean and sisyphean
across party lines to prevent the text from turning out even worse than it now is.
In the end, a closed-door horse trade between France and Germany was enough to outweigh the objections... so far.
What's important to note, though: it's not "the EU" in general that is to blame, but those who put special interests above fundamental rights and who currently hold considerable power. You can change that at the polls! The anti-EU far right is trying to seize this opportunity to promote their narrow-minded nationalist agenda -- when in fact, without the persistent support of the far-right ENF Group (dominated by the Rassemblement/Front National), the law could have been stopped in the crucial Legal Affairs Committee, and in general would not be as extreme as it is today.
We can still stop this law
Our best chance to stop the EU copyright law: The upcoming Parliament vote.
The Parliament and Council negotiators who agreed on the final text now return to their institutions seeking approval of the result. If it passes both votes unchanged, it becomes EU law, which member states are forced to implement into national law.
In both bodies, there is resistance.
The Parliament's process starts with the approval by the Legal Affairs Committee -- which is likely to be given on Monday, February 18.
Next, at a date to be announced, the EU member state governments will vote in the Council. The law can be stopped here either by 13 member state governments or by any number of governments who together represent 35% of the EU population (calculator). Last time, 8 countries representing 27% of the population were opposed. Either a large country like Germany or several small ones would need to change their minds: this is the less likely way to stop it.
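The blocking-minority arithmetic above can be sketched in a few lines. This is a rough illustration only: the population shares are approximate round numbers rather than official Eurostat figures, and `can_block` is a hypothetical helper, not anything from the Council's own calculator.

```python
# Sketch of the Council blocking-minority rule described above: a law can
# be stopped by 13 governments, or by any group of governments together
# representing at least 35% of the EU population.
# Population shares (percent) are illustrative approximations.

EU_POP_SHARE = {
    "Germany": 18.5, "Italy": 13.5, "Poland": 8.5, "Netherlands": 3.9,
    "Sweden": 2.3, "Slovakia": 1.2, "Finland": 1.2, "Luxembourg": 0.1,
    "Malta": 0.1,
}

def can_block(opposing, min_states=13, min_share=35.0):
    """True if the opposing governments form a blocking minority."""
    share = sum(EU_POP_SHARE[c] for c in opposing)
    return len(opposing) >= min_states or share >= min_share

# Eight smaller opponents fall short on both counts; adding a large
# country such as Germany tips the population threshold.
opposed = ["Italy", "Poland", "Netherlands", "Sweden",
           "Finland", "Luxembourg", "Malta", "Slovakia"]
print(can_block(opposed))                # False
print(can_block(opposed + ["Germany"]))  # True
```

This makes concrete why the article calls the Council route unlikely: without a very large member state switching sides, neither threshold is reached.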
Our best bet: The final vote in the plenary of the European Parliament, when all 751 MEPs, directly elected to represent the people, have a vote. This will take place either between March 25 and 28, on April 4, or between April 15 and 18. We've already demonstrated last July that a majority against a bad copyright proposal is achievable.
The plenary can vote to kill the bill -- or to make changes, like removing Articles 11 and 13. In the latter case, it's up to the Council to decide whether to accept these changes (the Directive then becomes law without these articles) or
to shelve the project until after the EU elections in May, which will reshuffle all the cards.
This is where you come in
The final Parliament vote will happen mere weeks before the EU elections. Most MEPs -- and certainly all parties -- are going to be seeking reelection. Articles 11 and 13 will be defeated if enough voters make these issues relevant to the elections.
(Here's how to vote in the EU elections -- change the language to one of your country's official ones for specific information.)
It is up to you to make clear to your representatives: Their vote on whether to break the internet with Articles 11 and 13 will make or break your vote in the EU elections. Be insistent -- but please always stay polite.
The Council of Europe is a wider organisation of European countries than the EU and is known best for being the grouping behind the European Court of Human Rights.
The council's Committee of Ministers has issued a statement criticising the algorithmic nature of social media. It calls on member countries to address its concerns. The Committee writes:
- draws attention to the growing threat to the right of human beings to form opinions and take decisions independently of automated systems, which emanates from advanced digital technologies. Attention should be paid particularly to their capacity to use personal and non-personal data to sort and micro-target people, to identify individual vulnerabilities and exploit accurate predictive knowledge, and to reconfigure social environments in order to meet specific goals and vested interests;
- encourages member States to assume their responsibility to address this threat by
a) ensuring that adequate priority attention is paid at senior level to this inter-disciplinary concern that often falls in between established mandates of relevant authorities;
b) considering the need for additional protective frameworks related to data that go beyond current notions of personal data protection and privacy and address the significant impacts of the targeted use of data on societies and on the exercise
of human rights more broadly;
c) initiating, within appropriate institutional frameworks, open-ended, informed and inclusive public debates with a view to providing guidance on where to draw the line between forms of permissible persuasion and unacceptable manipulation. The
latter may take the form of influence that is subliminal, exploits existing vulnerabilities or cognitive biases, and/or encroaches on the independence and authenticity of individual decision-making;
d) taking appropriate and proportionate measures to ensure that effective legal guarantees are in place against such forms of illegitimate interference; and
e) empowering users by promoting critical digital literacy skills and robustly enhancing public awareness of how many data are generated and processed by personal devices, networks, and platforms through algorithmic processes that are trained
for data exploitation. Specifically, public awareness should be enhanced of the fact that algorithmic tools are widely used for commercial purposes and, increasingly, for political reasons, as well as for ambitions of anti- or undemocratic
power gain, warfare, or to inflict direct harm;
Of course, if one strips away the jargon, the fundamental algorithm is simply to give people more of what they seem to have enjoyed reading. And of course the establishment's preferred algorithm is to give people what the state would like them to read.
It will be an offence to view terrorist material online just once -- and could incur a prison sentence of up to 15 years -- under a new UK law.
The Counter-Terrorism and Border Security Bill has just been granted Royal Assent, updating a previous Act and bringing new powers to law enforcement to tackle terrorism.
But a controversial inclusion was to update the offence of obtaining information likely to be useful to a person committing or preparing an act of terrorism so that it now covers viewing or streaming content online.
Originally, the proposal had been to make it an offence for someone to view material three or more times -- but the three strikes idea has been dropped from the final Act.
The government said that the existing laws didn't capture the nuance in changing methods for distribution and consumption of terrorist content -- and so added a new clause into the 2019 Act making it an offence to view (or otherwise access) any
terrorist material online. This means that, technically, anyone who clicked on a link to such material could be caught by the law.
A musician found guilty of broadcasting grossly offensive anti-Semitic songs has had her conviction upheld.
Alison Chabloz has written many politically incorrect, humorous and insulting songs, often targeted at Jews but also more generally against the PC establishment. The songs have been published on many internet platforms, including YouTube.
In May she was convicted of three charges relating to the songs and was given a suspended jail sentence by magistrates which she appealed against.
A judge at Southwark Crown Court has upheld her conviction, ruling the content was particularly repellent. In the songs, Chabloz suggested the Holocaust was a bunch of lies and referred to Auschwitz as a theme park.
Chabloz was convicted of two counts of sending an offensive, indecent or menacing message through a public communications network and a third charge relating to a song on YouTube.
She was sentenced to 20 weeks' imprisonment, suspended for two years and banned from social media for 12 months.
During the appeal, Adrian Davies, defending, told judge Christopher Hehir: It would be a very, very strong thing to say that a criminal penalty should be imposed on someone for singing in polemical terms about matters on which she feels so strongly.
The case started as a private prosecution by the Campaign Against Anti-Semitism before the Crown Prosecution Service took over. The group's chairman, Gideon Falter, said: This is the first conviction in the UK over Holocaust denial on social media.
Social media giants will face tough new laws to prevent the spread of knife crime, the Home Secretary threatened -- as he spoke of fears for his own children's safety.
Sajid Javid said it was time for a legal crackdown on social media images promoting gang culture, in the same way that child sex abuse images and terrorist propaganda have already been outlawed.
In a warning to online firms, he said:
My message to these companies is we are going to legislate and how far we go depends on what you decide to do now. At the moment we don't have the legislation for these types of [knife crime-related] content.
I have it for terrorist content and child sexual abuse images.
Google is among several firms which have been criticised for hosting content glamorising gang culture. Rappers using its YouTube video platform post so-called drill music videos to boast about the number of people they have stabbed or shot, using
street terms. The platform has taken down dozens of videos by drill artists, after warnings from the Metropolitan Police that they were raising the risk of violence.
The Cairncross Review into the future of the UK news industry has delivered its final report, with recommendations on how to safeguard the future sustainability of the UK press.
Online platforms should have a 'news quality obligation' to improve trust in news they host, overseen by a regulator
Government should explore direct funding for local news and new tax reliefs to support public interest journalism
A new Institute for Public Interest News should focus on the future of local and regional press and oversee a new innovation fund
The independent review , undertaken by Frances Cairncross, was tasked by the Prime Minister in 2018 with investigating the sustainability of the production and distribution of high-quality journalism. It comes as significant changes to technology
and consumer behaviour are posing problems for high-quality journalism, both in the UK and globally.
Cairncross was advised by a panel from the local and national press, digital and physical publishers and advertising. Her recommendations include measures to tackle the uneven balance of power between news publishers and the online platforms that
distribute their content, and to address the growing risks to the future provision of public-interest news.
It also concludes that intervention may be needed to improve people's ability to assess the quality of online news, and to measure their engagement with public interest news. The key recommendations are:
New codes of conduct to rebalance the relationship between publishers and online platforms;
The Competition and Markets Authority to investigate the online advertising market to ensure fair competition;
Online platforms' efforts to improve their users' news experience should be placed under regulatory supervision;
Ofcom should explore the market impact of BBC News, and whether it inappropriately steps into areas better served by commercial news providers;
The BBC should do more to help local publishers and think further about how its news provision can act as a complement to commercial news;
A new independent Institute should be created to ensure the future provision of public interest news;
A new Innovation Fund should be launched, aiming to improve the supply of public interest news;
New forms of tax reliefs to encourage payments for online news content and support local and investigative journalism;
Expanding financial support for local news by extending the BBC's Local Democracy Reporting Service;
Developing a media literacy strategy alongside Ofcom, industry and stakeholders.
The Government will now consider all of the recommendations in more detail. To inform this, the Culture Secretary will write immediately to the Competition and Markets Authority, Ofcom and the Chair of the Charity Commission to open discussions
about how best to take forward the recommendations which fall within their remits. The Government will respond fully to the report later this year.
DCMS Secretary of State Jeremy Wright said:
A healthy democracy needs high quality journalism to thrive and this report sets out the challenges to putting our news media on a stronger and more sustainable footing, in the face of changing technology and rising disinformation. There are
some things we can take action on immediately while others will need further careful consideration with stakeholders on the best way forward.
A Mediatique report, Overview of recent market dynamics in the UK press (April 2018), commissioned by DCMS as part of the Cairncross Review, found:
Print advertising revenues have dropped by more than two-thirds in the ten years to 2017;
Print circulation of national papers fell from 11.5 million daily copies in 2008 to 5.8 million in 2018 and for local papers from 63.4 million weekly in 2007 to 31.4 million weekly in 2017;
Sales of both national and local printed papers fell by roughly half between 2007 and 2017, and are still declining;
The number of full-time frontline journalists in the UK has dropped from an estimated 23,000 in 2007, to just 17,000 today, and the numbers are still declining.
A report, Online Advertising in the UK, by Plum Consulting, commissioned by DCMS as part of the Cairncross Review (and available as an annex to the Review), found:
UK internet advertising expenditure increased from £3.5 billion in 2008 to £11.5 billion in 2017, a compound annual growth rate of 14%.
Publishers rely on display advertising for their revenue online - which in the last decade has transformed into a complex, automated system known as programmatic advertising.
An estimated average of £0.62 of every £1 spent on programmatic advertising goes to the publisher - though this can range from £0.43 to £0.72.
Collectively, Facebook and Google were estimated to have accounted for over half (54%) of all UK online advertising revenues in 2017.
The major online platforms collect multiple first-party datasets from large numbers of logged-in users. Generally, they do not share this data with third parties, including publishers.
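The 14% compound annual growth rate cited in the Plum findings can be sanity-checked with one line of arithmetic, using only the figures quoted above:

```python
# Check the compound annual growth rate (CAGR) of UK internet ad spend:
# £3.5bn in 2008 growing to £11.5bn in 2017, i.e. over nine years.
start, end, years = 3.5, 11.5, 2017 - 2008
cagr = (end / start) ** (1 / years) - 1
print(f"CAGR: {cagr:.1%}")  # ~14%, matching the report's figure
```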
Dame Frances Cairncross is a former economic journalist, author and academic administrator. She is currently Chair of the Court of Heriot-Watt University and a Trustee at the Natural History Museum. Dame Frances was Rector of Exeter College,
Oxford University; a senior editor on The Economist; and principal economic columnist for the Guardian. In 2014 she was made a Dame of the British Empire for services to education. She is the author of a number of books, including "The Death
of Distance: How the Communications Revolution is Changing our Lives" and "Costing the Earth: The Challenge for Governments, the Opportunities for Business". Dame Frances is married to financial journalist Hamish McRae.
The BBC comments on some of the ideas not included in the report's recommendations
The report falls short of requiring Facebook, Google and other tech giants to pay for the news they distribute via their platforms. Cairncross told the BBC's media editor Amol Rajan that "draconian and risky" measures could result in firms such as Google withdrawing their news services altogether:
There are a number of ways we have suggested technology companies could behave differently and could be made to behave differently. But they are mostly ways that don't immediately involve legislation.
Frances Cairncross earned widespread respect as a journalist for her hard-headed and pragmatic approach to economics. That pragmatism is the very reason the government commissioned her to look at the future of high-quality news - and also the
reason many in local and regional media will be disappointed by her recommendations.
What is most notable about her review is what it doesn't do.
It doesn't suggest all social media should be regulated in the UK
It doesn't suggest social media companies pay for the privilege of using news content
It doesn't suggest social media companies be treated as publishers, with legal liability for all that appears on their platform
This is because the practicalities of doing these things are difficult, and experience shows that the likes of Google will simply pull out of markets that don't suit them.
Ultimately, as this report acknowledges, when it comes to news, convenience is king. The speed, versatility and zero cost of so much news now means that, even if it is of poor quality, a generation of consumers has fallen out of the habit of
paying for news. But quality costs. If quality news has a future, consumers will have to pay. That's the main lesson of this report.
2018 was a pivotal year for data protection. First the Cambridge Analytica scandal put a spotlight on Facebook's questionable privacy practices. Then the new Data Protection Act and the General Data Protection Regulation (GDPR) forced
businesses to better handle personal data.
As these events continue to develop, 2019 is shaping up to be a similarly consequential year for free speech online as new forms of digital censorship assert themselves in the UK and EU.
Of chief concern in the UK are several initiatives within the Government's grand plan to "make Britain the safest place in the world to be online", known as the Digital Charter. Its founding document proclaims "the same rights that
people have offline must be protected online." That sounds a lot like Open Rights Group's mission! What's not to like?
Well, just as surveillance programmes created in the name of national security proved detrimental to privacy rights, new Internet regulations targeting "harmful content" risk curtailing free expression.
The Digital Charter's remit is staggeringly broad. It addresses just about every conceivable evil on the Internet from bullying and hate speech to copyright infringement, child pornography and terrorist propaganda. With so many initiatives
developing simultaneously it can be easy to get lost.
To gain clarity, Open Rights Group published a report surveying the current state of digital censorship in the UK . The report is broken up into two main sections - formal censorship practices like copyright and pornography blocking, and informal
censorship practices including ISP filtering and counter terrorism activity. The report shows how authorities, while often engaging in important work, can be prone to mistakes and unaccountable takedowns that lack independent means of redress.
Over the coming weeks we'll post a series of excerpts from the report covering the following:
Formal censorship practices
Copyright blocking injunctions
BBFC pornography blocking
BBFC requests to "Ancillary Service Providers"
Informal censorship practices
Nominet domain suspensions
The Counter Terrorism Internet Referral Unit (CTIRU)
The Internet Watch Foundation (IWF)
ISP content filtering
The big picture
Take a step back from the many measures encompassed within the Digital Charter and a clear pattern emerges. When it comes to web blocking, the same rules do not apply online as offline. Many powers and practices the government employs to remove
online content would be deemed unacceptable and arbitrary if they were applied to offline publications.
Part II of our report is in the works and will focus on threats to free speech within yet another branch of the Digital Charter known as the Internet Safety Strategy.
The Russian State Duma is considering multiple bills of law that would further stifle free speech in Russia's already heavily restricted internet environment.
One targets expressions of wilful disregard towards the state. Another targets disinformation. All of them echo increasingly global concerns among governments about the political implications of disinformation -- and unbridled criticism -- on the internet. And all have been heavily criticized by Russian civil society groups, experts, users and even the government's own ministers. Yet these bills, promoting a possible further crackdown on free speech, still trudge on through the Duma.
The first bill, a sovereign internet initiative, which is yet to reach the floor of the lower chamber of Russia's bicameral parliament, seeks to establish state-regulated internet exchange points that would allow for increased monitoring and
control over internet traffic moving into and out of the country.
Under the anti-fake news bill, individuals, officials or organizations accused of spreading fake news disguised as genuine public announcements which are found to promote public disorder or other serious disturbances could be fined up to a million rubles (slightly above USD $15,000), unless they remove the violating content within a day. The bill also provides measures through which Roskomnadzor, Russia's media watchdog, will order ISPs to block websites hosting the offending content.
The bill passed its first reading in late January with flying colours, receiving 336 votes in its favor and only 44 against, thanks to the 2016 landslide which guaranteed the ruling United Russia party an absolute voting majority.
The anti-fake news bill will be reviewed again by the Duma in February, conditioned on the revision of some of its most contentious points. The bill pushed through by Putin's party was met with a rare response of significant opposition, even
among the normally acquiescent branches of Russia's highly centralized and executive-biased power structure. The attorney general's office, among others, criticized the bill's vague definitions as potentially damaging to citizens' civil rights.
The second bill, which came up for review alongside the fake news-busting proposal, is seen as being even more controversial. It seeks to punish vulgar expressions of wilful disregard towards the state, its symbols and organs of its power with
fines of up to 5,000 rubles (around USD $76) and detention for up to 15 days. The bill also passed in the first reading on the same day, despite vocal criticism from both government members (Deputy Communications Minister Alexey Volin said that
calmly accepting criticism was an obligation for state officials, adding that they weren't made of sugar) and opposition parties.
There is every reason to believe that the government and opposition are moving to a consensus on introducing a duty of care for social media companies to reduce harm and risk to their users. This may be backed by an Internet regulator, who
might decide what kind of mitigating actions are appropriate to address the risks to users on different platforms.
This idea originated from a series of papers by Will Perrin and Lorna Woods and has been mentioned most recently in a recent Science and Technology committee report and by NGOs including children's charity 5Rights.
A duty of care has some obvious merits: it could be based on objective risks, based on evidence, and ensure that mitigations are proportionate to those risks. It could take some of the politicisation out of the current debate.
However, it also has obvious problems. For a start, it focuses on risk rather than process. It moves attention away from the fact that interventions are regulating social media users just as much as platforms. It does not by itself tell us that free expression impacts will be considered, tracked or mitigated.
Furthermore, the lack of focus that a duty of care model gives to process means that platform decisions that have nothing to do with risky content are not necessarily based on better decisions, independent appeals and so on. Rather, as has
happened with German regulation, processes can remain unaffected when they are outside a duty of care.
In practice, a lot of content which is disturbing or offensive is already banned on online platforms. Much of this would not be in scope under a duty of care but it is precisely these kinds of material which users often complain about, when it is
either not removed when they want it gone, or is removed incorrectly. Any model of social media regulation needs to improve these issues, but a duty of care is unlikely to touch these problems.
There are very many questions about the kinds of risk, whether to individuals in general, vulnerable groups, or society at large; and about the evidence required to justify action. The truth is that a duty of care, if cast sensibly and narrowly, will not satisfy many of the people who are demanding action; equally, if the threshold to act is low, then it will quickly be seen to be a mechanism for wide-scale Internet censorship.
It is also a simple fact that many decisions that platforms make about legal content which is not risky are not the business of government to regulate. This includes decisions about what legal content is promoted and why. For this reason, we believe that a better approach might be to require independent self-regulation of major platforms across all of their content decisions. This requirement could be a legislative one, but the regulator would need to be independent of government and industry.
Independent self-regulation has not been truly tried. Instead, voluntary agreements have filled its place. We should be cautious about moving straight to government regulation of social media and social media users. The government refuses to regulate the press in this way because it doesn't wish to be seen to be controlling print media. It is pretty curious that neither the media nor the government are spelling out the risks of state regulation of the speech of millions of British citizens.
That we are in this place is of course largely the fault of the social media platforms themselves, who have failed to understand the need for, and value of, transparent and accountable systems to ensure they are acting properly. That, however, just demonstrates the problem: politically weak platforms who have created monopoly positions based on data silos are now being sliced and diced at the policy table for their wider errors. It's imperative that as these government proposals progress, we keep focus on the simple fact that it is end users whose speech will ultimately be regulated.
The number of people using the internet in Uganda has dropped by 26% since July 2018, when the country's social media tax was put into force. Prior to the tax's implementation, 47.4% of people in Uganda were using the internet. Three months after
the tax was put in place, that number had fallen to 35%.
ISPs charge an additional 200 Ugandan shillings (UGX) per day in social media tax on top of ISP access fees and standard sales tax. This is nominally 5.4 US cents, but is a significant portion of typical Ugandan incomes.
President Yoweri Museveni and several government officials said this was intended to curb online rumor-mongering and to generate more tax revenue.
The tax was the subject of large-scale public protests in July and August 2018. During one protest against the tax, key opposition leader, activist and musician Bobi Wine noted that the tax was enforced to oppress the young generation.
The government expected to collect about UGX 24 billion in revenue from the tax every quarter. But in the first quarter after the tax's implementation, they collected UGX 20 billion. In the second quarter, ending December 2018, they had collected
only UGX 16 billion.
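The revenue shortfall implied by those figures is easy to compute; a quick sketch using only the numbers quoted above:

```python
# Quarterly tax revenue vs. target, in billions of UGX (figures from the text).
expected = 24
collected = [20, 16]  # first and second quarters after the tax took effect
shortfalls = [(expected - c) / expected for c in collected]
for quarter, s in zip(("Q1", "Q2"), shortfalls):
    print(f"{quarter}: {s:.0%} below target")  # Q1: 17%, Q2: 33% below target
```

The widening gap (17% then 33% below target) mirrors the drop in internet use the article describes: the tax is shrinking the very base it was meant to raise revenue from.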
While some people have gone offline altogether, others are simply finding different and more affordable ways to connect. People are creating shared access points where one device pays the tax and tethers the rest as a WiFi hotspot, or relying on
workplace and public area WiFi networks to access the services.
Other Ugandans are using Virtual Private Network (VPN) applications to bypass the tax. In a statement for The Daily Monitor, the Uganda Revenue Authority's Ian Rumanyika argued that people could not use the VPNs forever, but that doesn't seem to be the case.
In addition to leaving Ugandans with less access to communication and diminished abilities to express themselves online, the tax has also affected economic and commercial sectors, where mobile money and online marketing are essential components of doing business.
The Council of the EU approves copyright law that will suffocate European businesses and livelihoods

Apart from sharing the name Donald, and securing a place in hell, both put American corporate interests above European livelihoods.
While Italy, Poland, the Netherlands, Sweden, Finland and Luxembourg maintained their opposition to the text and were newly joined by Malta and Slovakia, Germany's support for the "compromise" secretly negotiated with France over recent weeks has broken the previous deadlock.
This new Council position is actually more extreme than previous versions, requiring all platforms older than 3 years to automatically censor all their users' uploads, and putting unreasonable burdens even on the newest companies.
The German Conservative--Social Democrat government is now in blatant violation of its own coalition agreement, which rejects upload filters against copyright infringement as disproportionate. This breach of coalition promises will not go down well with many young voters just ahead of the European elections in May. Meanwhile, prominent members of both German governing parties have joined the protests against upload filters.
The deal in Council paves the way for a final round of negotiations with the Parliament over the course of next week, before the entire European Parliament and the Council vote on the final agreement. It is now up to you to contact your MEPs,
call their offices in their constituencies and visit as many of their election campaign events as you can! Ask them to reject a copyright deal that will violate your rights to share legal creations like parodies and reviews online, and
includes measures like the link tax that will limit your access to the news and drive small online newspapers out of business.
Right before the European elections, your voices cannot be ignored! Join the over 4.6 million signatories to
the largest European petition ever and tell your representatives: If you break the Internet and accept Article 13, we won't reelect you!
A new bill introduced late last month in the New York State legislature marks the latest attempt to impose a user tax on porn, or for that matter any sexually oriented media. The proposed bill would slap an extra $2 onto every porn download.
The charge would also apply to offline sexually oriented media, adding the two-buck fee to each magazine or DVD classified as sexually oriented. In fact, the language of New York Assembly Bill AO3417 is so broad that it apparently would apply not
only to porn, but even to R-rated movies and TV programs airing on pay cable networks such as HBO or Showtime.
That's because the law, as written by Assistant Assembly Speaker Felix W. Ortiz, defines sexually oriented as any media that features nude pictures or nude performances. And nude does not even mean completely nude under the bill's wording: bare breasts or buttocks are enough.
The language of the bill is also unclear on whether the $2 surcharge would apply to free porn downloads, such as on Pornhub and similar tube sites.
An attempt to block pornography and other obscene material on all personal devices in
South Dakota, then charge users a $20 access fee, was voted down Friday by state lawmakers.
House Bill 1154, written by out-of-state authors, raised serious concerns with lobbyists representing South Dakota retailers and telecommunication companies, who opposed the measure in a meeting of the House Judiciary Committee Friday morning.
Google has agreed to censor search results in Russia as dictated by the country's internet censor, which will allow Google to continue operations in Russia.
Google is one of the few search engines that does not adhere to an official list of banned websites that must not be included in search results. However, Google already deletes 70% of links from its search results to websites that the internet censor Roskomnadzor has banned.
In December of 2018, Roskomnadzor charged Google a fine of 500,000 rubles ($7,590) for refusing to subscribe to the banned list. The company did not challenge the agency's decision and chose to pay the fine. The Russian law that made the fine
possible does not allow Roskomnadzor to block sites that do not comply with its censorship demands, but that did not stop Roskomnadzor from threatening to block Google within Russian borders regardless.
Governments around the world are grappling with the threat of terrorism, but their efforts aimed at curbing the dissemination of terrorist content online all too often result in censorship. Over the past five years, we've seen a number of
governments--from the US Congress to that of France and now the European Commission (EC)--seek to implement measures that place an undue burden on technology companies to remove terrorist speech or face financial liability.
This is why EFF has joined forces with dozens of organizations to call on members of the European Parliament to oppose the EC's proposed regulation, which would require companies to take down terrorist content within one hour. We've added our voice to two letters -- one from Witness and another organized by the Center for Democracy and Technology -- asking that MEPs consider the serious consequences that passing this regulation could have on human rights defenders and on freedom of expression.
We share the concerns of dozens of allies that requiring the use of proactive measures such as use of the terrorism hash database (already voluntarily in use by a number of companies) will restrict expression and have a disproportionate impact on
marginalized groups. We know from years of experience that filters just don't work.
Furthermore, the proposed requirement that companies must respond to reports of terrorist speech within an hour is, to put it bluntly, absurd. As the letter organized by Witness states, this regulation essentially forces companies to bypass due process and make rapid, unaccountable decisions on expression through automated means, and it does not reflect the realities of how violent groups recruit and share information online.
We echo these and other calls from defenders of human rights and civil liberties for MEPs to reject proactive filtering obligations and to refrain from enacting laws that will have unintended consequences for freedom of expression.
Wrangling in Whitehall has held up plans to set up a social media censor dubbed Ofweb, The Mail on Sunday reveals.
The Government was due to publish a White Paper this winter on censorship of tech giants, but The Mail has learnt it is still far from ready. Culture Secretary Jeremy Wright said it would be published within a month, but a Cabinet source said that timeline was wholly unrealistic. Other senior Government sources went further and said the policy document is unlikely to surface before the spring.
Key details on how a new censor would work have yet to be decided while funding from the Treasury has not yet been secured. Another problem is that some Ministers believe the proposed clampdown is too draconian and are preparing to try to block
or water down the plan.
There are also concerns that technically difficult requirements would benefit the largest US companies, as smaller European companies and start-ups would not be able to afford the technology and development required.
The Mail on Sunday understands Jeremy Wright has postponed a visit to Facebook HQ in California to discuss the measures, as key details are still up in the air.
Update: The Conservatives don't have a monopoly on internet censorship...Labour agrees
Labour has called for a new entity capable of taking on the likes of Facebook and Google. Tom Watson, the shadow digital secretary, will on Wednesday say a regulator should also have responsibility for competition policy and be able to refer
cases to the Competition and Markets Authority.
According to Watson, any duty of care would only be effective with penalties that seriously affect companies' bottom lines. He has referred to regulators' ability to fine companies up to 4% of global turnover, or €20m, whichever is higher, for worst-case breaches of the EU-wide General Data Protection Regulation.
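The worst-case fine rule Watson cites can be sketched as a one-line maximum; the function name is illustrative, not from any legal text:

```python
# Minimal sketch of the GDPR worst-case fine ceiling cited above:
# up to 4% of global annual turnover or EUR 20 million, whichever is higher.
def max_gdpr_fine(global_turnover_eur: float) -> float:
    """Return the upper bound of a worst-case GDPR fine, in euros."""
    return max(0.04 * global_turnover_eur, 20_000_000)

# For a company turning over EUR 1 billion, the 4% rule dominates:
print(max_gdpr_fine(1_000_000_000))  # 40000000.0
```

For smaller companies the €20m floor dominates, which is exactly why Watson frames it as a penalty that bites regardless of company size.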
Contrary to some reports, Article 13 was not shelved solely because EU governments listened to the unprecedented public opposition and understood that upload filters are costly, error-prone and threaten fundamental rights.
Without doubt, the consistent public opposition contributed to 11 member state governments voting against the mandate, instead of just 6 last year, but ultimately the reform hinges on agreement between France and Germany, who due to their size can make or break blocking minorities. The deadlock is the direct result of their disagreement, which was not about whether to have upload filters at all; they just couldn't agree on exactly who should be forced to install those faulty filters.
The deadlock hinged on a disagreement between France and Germany
France's position: Article 13 is great and must apply to all platforms, regardless of size. They must demonstrate that they have done all they possibly could to prevent uploads of copyrighted material. In the case of small businesses, that may or may not mean using upload filters -- ultimately, a court would have to make that call. (This was previously the majority position among EU governments, supported by France, before Italy's newly elected government retracted its support for Article 13 altogether.)
Germany's position: Article 13 is great, but it should not apply to everyone. Companies with a turnover below €20 million per year should be excluded outright, so as not to harm European internet startups and SMEs. (This was closer to the European Parliament's current position, which calls for the exclusion of companies with a turnover below €10 million and fewer than 50 employees.)
What brought France and Germany together: Making Article 13 even worse
In the Franco-German deal, which leaked today, Article 13 does apply to all for-profit platforms. Upload filters must be installed by everyone except those services which fit all three of the following extremely narrow criteria:
Available to the public for less than 3 years
Annual turnover below €10 million
Fewer than 5 million unique monthly users
Countless apps and sites that do not meet all these criteria would need to install upload filters, burdening their users and operators, even when copyright infringement is not at all currently a problem for them. Some examples:
Discussion boards on for-profit sites, such as the Ars Technica or Heise.de forums (older than 3 years)
Patreon , a platform with the sole purpose of helping authors get paid (fails to meet any of the three criteria)
Niche social networks like GetReeled , a platform for anglers (well below 5 million users, but older than 3 years)
Small European competitors to larger US brands, like wykop, a Polish news-sharing platform similar to reddit (well below €10 million turnover, but possibly above 5 million users depending on the calculation method)
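The exemption logic above is a strict conjunction: failing any one criterion triggers the filter obligation. A minimal sketch, with illustrative names and the thresholds listed above:

```python
# Sketch of the three-part exemption test in the leaked Franco-German deal.
# A platform escapes the upload-filter obligation only if ALL three hold.
# Function and parameter names are illustrative, not from any legal text.
def exempt_from_filters(age_years: float, turnover_eur: float,
                        monthly_users: int) -> bool:
    return (age_years < 3                 # available to the public < 3 years
            and turnover_eur < 10_000_000  # annual turnover < EUR 10M
            and monthly_users < 5_000_000)  # < 5M unique monthly users

# A 20-year-old niche forum fails on age alone, however small it is:
print(exempt_from_filters(age_years=20, turnover_eur=100_000,
                          monthly_users=50_000))  # False
```

This is what makes the criteria so narrow: a tiny, old site is treated exactly like YouTube, while even a young, small platform loses the exemption the day it turns three.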
On top of that, even the smallest and newest platforms, which do meet all three criteria, must still demonstrate they have undertaken "best efforts" to obtain licenses from rightholders such as record labels, book publishers and stock photo databases for anything their users might possibly upload -- an impossible task. In practice, all sites and apps where users may upload material will likely be forced to accept any license a rightholder offers them, no matter how bad the terms, and no matter whether they actually want their copyrighted material to be available on the platform or not, to avoid the massive legal risk of coming into conflict with Article 13.
In summary: France's and Germany's compromise on Article 13 still calls for nearly everything we post or share online to require prior permission by "censorship machines", algorithms that are fundamentally unable to distinguish between copyright infringement and legal works such as parody and critique. It would change the web from a place where we can all freely express ourselves into one where big corporate rightholders are the gatekeepers of what can and can't be published. It would allow these rightholders to bully any for-profit site or app that includes an upload function. European innovation on the web would be discouraged by the new costs and legal risks for startups -- even if they only apply once platforms become successful, or turn 3 years old. Foreign sites and apps would be incentivised to simply geoblock all EU users to be on the safe side.
Now everything hinges on the European Parliament
With this roadblock out of the way, the trilogue negotiations to finish the new EU copyright law are back on. With no time to lose, there will be massive pressure to reach an overall agreement within the next few days and pass the law in March or April. The most likely next steps are a rubber-stamping of the new Council position cooked up by Germany and France on Friday, 8 February, and a final trilogue on Monday, 11 February.
MEPs, most of whom are fighting for re-election, will get one final say. Last September, a narrow majority for Article 13 could only be found in the Parliament after a small business exception was included that was much stronger than the foul
deal France and Germany are now proposing -- but I don't have high hopes that Parliament negotiator Axel Voss will insist on this point. Whether MEPs will reject this harmful version of Article 13 (like they initially did last July) or bow to the
pressure will depend on whether all of us make clear to them: If you break the internet and enact Article 13, we won't re-elect you.
Facebook has been caught out censoring a poster for a comedy show because its simplistic algorithms couldn't distinguish a jokey use of the word 'Brexit' from a political advert.
The social media site has taken drastic action to clamp down on political advertising in a bid to tackle a backlash over secret Russian interference. But it was accused of over-reacting after a comedian was told he couldn't promote his show Brexit Through The Gift Shop.
Comedian Matt Forde was told his stand-up show's title breached new rules on ads about politics or issues of national importance. Facebook told him: "There's no way around this other than not using the word Brexit."
The comedian told The Sun that it was incredible that Facebook allowed tech firms to harvest the data of millions without telling them but stopped him from advertising a comedy show. Forde added:
I'm flattered that they think I'm a greater threat to their users than the collapse of global democracy. Obviously what I forgot to do was offer Facebook the personal data of my friends and family.
The Grand Tour presenter Jeremy Clarkson has pushed back at claims of homophobia from gay singer Will Young by joking about enjoying lesbian porn.
LGBT+ campaigner and musician Will Young had hit out at Clarkson after a recent episode of the Amazon motoring show included a running gag alluding to a Jeep Wrangler being gay. The January 27 episode also saw Clarkson ask whether LGBT stands for "lesbian, bacon, transgender".
I'm afraid 3 heterosexual men SO uncomfortable with their sexuality that they reference in some lame way a Wrangler Jeep being a Gay mans car .... and then Hammond and May's quips to Clarkson wearing chaps, a pink shirt, he should get some moisturiser. It's f**king pathetic and actually homophobic.
Clarkson responded to Young also on Twitter:
...I will apologise to Will for causing him some upset and reassure him that I know I'm not homophobic as I very much enjoy watching lesbians on the internet.
Netflix has hundreds of TV shows and movies to choose from. That selection also includes many titles that contain extremely graphic sexual content, with no automatic barrier to accessing them.
Furthermore, Netflix has a flimsy ratings system at best, having only very recently added content ratings to the opening screens of each selection. The lack of the descriptive content warnings employed by cable television and other streaming services means that a movie rated "TV-MA" could have anything from a few swear words to gratuitous and explicit sex scenes.
Due to the nature of streaming, many of Netflix's titles do not have the industry-standard MPAA ratings such as PG or R. Instead, many programs fall under the umbrella of TV-MA, meaning "Mature Audiences". Again,
Netflix does not require any other content warnings to be included besides this vague description. Netflix currently has several films that fall under the category of TV-MA, and some are even rated NC-17. Many of the films rated TV-MA were
originally released as NC-17 or otherwise have extremely graphic sexual content.
The only way to block this content is to turn on parental controls, which require a PIN for either specific titles or ratings such as TV-MA.
If movie theaters don't allow those under 17 to see NC-17 (or even R) rated movies, then why is Netflix making these films available to anyone regardless of age?
Below is a list of films currently offered on Netflix that have either a NC-17 or TV-MA rating for graphic sexual content: