Melon Farmers Unrated

Internet News


2018: June


 

Terrifying censorship...

Labour MP tables bill amendment requiring social media companies to take down posts within 24 hours of an official complaint


Link Here 29th June 2018
Google, Facebook, YouTube and other sites would be required by law to take down extremist material within 24 hours of receiving an official complaint under an amendment put forward for inclusion in new counter-terror legislation.

The Labour MP Stephen Doughty's amendment echoes censorship laws that came into effect in Germany last year. However, the effect of the German law was to enable no-questions-asked censorship of anything the government doesn't like. Social media companies have no interest in challenging unfair censorship, and find that the easiest and cheapest way to comply is to err on the side of the government and take down anything asked, regardless of the merits of the case.

The counter-terrorism strategy unveiled by the home secretary, Sajid Javid, this month, said the Home Office would place a renewed emphasis on engagement with internet providers and work with the tech industry to seek more investment in technologies that automatically identify and remove terrorist content before it is accessible to all.

But Doughty, a member of the home affairs select committee, said his amendment was needed because the voluntary approach was failing. He said a wide variety of extremist content remained online despite repeated warnings.

If these companies can remove copyrighted video or music content from companies like Disney within a matter of hours, there is no excuse for them to be failing to do so for extremist material.

Doughty's amendment would also require tech companies to proactively check content for extremist material and take it down within six hours of it being identified.

The proactive check of content alludes to the censorship machines being introduced by the EU to scan uploads for copyrighted material. The extension to detect terrorist material, coupled with the erring-on-the-side-of-caution approach, would inevitably lead to the automatic censorship of any content that even uses the vocabulary of terrorism, regardless of whether it is news reporting, satire or criticism.
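The false-positive problem is easy to demonstrate. The sketch below is a deliberately naive, hypothetical keyword filter (the term list and function names are invented for illustration, not any company's actual system); it cannot tell a news report from a threat, which is exactly the failure mode described above:

```python
# Hypothetical, deliberately naive keyword-based "extremism" filter.
# The term list is invented for illustration only.
TERROR_TERMS = {"bomb", "attack", "jihad"}

def flags_content(text: str) -> bool:
    """Flag any text containing a term from the watchlist."""
    words = set(text.lower().split())
    return bool(words & TERROR_TERMS)

# A news headline trips the filter just like genuine incitement would:
print(flags_content("police foil bomb attack plot"))      # True (false positive)
print(flags_content("community bake sale raises funds"))  # False
```

Real classifiers are more sophisticated than this, but the underlying problem persists in any automated approach: vocabulary alone does not distinguish reporting or satire from incitement.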

 

 

Vague language and multiple layers of ambiguity...

Human rights organisations petition the courts to block the US internet censorship law named FOSTA


Link Here 29th June 2018
Full story: FOSTA US Internet Censorship Law...Wide ranging internet censorship law targeting sex workers

Two human rights organizations, a digital library, an activist for sex workers, and a certified massage therapist have filed a lawsuit asking a federal court to block enforcement of FOSTA, the new federal law that silences online speech by forcing speakers to self-censor and requiring platforms to censor their users. The plaintiffs are represented by the Electronic Frontier Foundation (EFF), Davis Wright Tremaine LLP, Walters Law Group, and Daphne Keller.

In Woodhull Freedom Foundation et al. v. United States, the plaintiffs argue that FOSTA is unconstitutional, muzzling online speech that protects and advocates for sex workers and forcing well-established, general interest community forums offline for fear of criminal charges and heavy civil liability for things their users might share.

FOSTA, or the Allow States and Victims to Fight Online Sex Trafficking Act, was passed by Congress in March. But instead of focusing on the perpetrators of sex trafficking, FOSTA goes after online speakers, imposing harsh penalties for any website that might facilitate prostitution or contribute to sex trafficking. The vague language and multiple layers of ambiguity are driving constitutionally protected speech off the Internet at a rapid pace.

For example, plaintiff the Woodhull Freedom Foundation works to support the health, safety, and protection of sex workers, among other things. Woodhull wanted to publish information on its website to help sex workers understand what FOSTA meant to them. But instead, worried about liability under FOSTA, Woodhull was forced to censor its own speech and the speech of others who wanted to contribute to their blog. Woodhull is also concerned about the impact of FOSTA on its upcoming annual summit, scheduled for next month.

FOSTA chills sexual speech and harms sex workers, said Ricci Levy, executive director of the Woodhull Freedom Foundation. It makes it harder for people to take care of and protect themselves, and, as an organization working to protect people's fundamental human rights, Woodhull is deeply concerned about the damaging impact that this law will have on all people.

FOSTA calls into serious question the legality of online speech that advocates for the decriminalization of sex work, or provides health and safety information to sex workers. Human Rights Watch (HRW), an international organization that is also a plaintiff, advocates globally for ways to protect sex workers from violence, health risks, and other human rights abuses. The group is concerned that its efforts to expose abuses against sex workers and decriminalize voluntary sex work could be seen as facilitating prostitution, or in some way assisting sex trafficking.

HRW relies heavily on individuals spreading its reporting and advocacy through social media, said Dinah Pokempner, HRW General Counsel. We are worried that social media platforms and websites may block the sharing of this information out of concern it could be seen as demonstrating a reckless disregard of sex trafficking activities under FOSTA. This law is the wrong approach to the scourge of sex trafficking.

But FOSTA doesn't just impede the work of sex educators and activists. It also led to the shutdown of Craigslist's Therapeutic Services section, which has imperiled the business of a licensed massage therapist who is another plaintiff in this case. The Internet Archive joined this lawsuit against FOSTA because the law might hinder its work of cataloging and storing 330 billion web pages from 1996 to the present.

Because of the critical issues at stake, the lawsuit filed today asks the court to declare that FOSTA is unconstitutional, and asks that the government be permanently enjoined from enforcing the law.

FOSTA is the most comprehensive censorship of Internet speech in America in the last 20 years, said EFF Civil Liberties Director David Greene. Despite good intentions, Congress wrote an awful and harmful law, and it must be struck down.

 

 

The Pied Piper of Hollywood...

TorrentFreak suggests that the disgraceful EU law to allow censorship machines to control the internet is just to help US Big Media get more money out of US Big Internet


Link Here 28th June 2018
Full story: Internet Censorship in EU...EU introduces swathes of internet censorship law
 
  
What is the mysterious hold that US Big Music has over Euro politicians?
 

Article 13, the proposed EU legislation that aims to restrict safe harbors for online platforms, was crafted to end the so-called "Value Gap" on YouTube.

Music piracy was traditionally viewed as an easy to identify problem, one that takes place on illegal sites or via largely uncontrollable peer-to-peer networks. In recent years, however, the lines have been blurred.

Sites like YouTube allow anyone to upload potentially infringing content which is then made available to the public. Under the safe harbor provisions of US and EU law, this remains legal -- provided YouTube takes content down when told to do so. It complies constantly but there's always more to do.

This means that in addition to being one of the greatest legal platforms ever created, YouTube is also a goldmine of unlicensed content, something unacceptable to the music industry.

They argue that the existence of this pirate material devalues the licensed content on the platform. As a result, YouTube maintains a favorable bargaining position with the labels and the best licensing deal in the industry.

The difference between YouTube's rates and those the industry would actually like is now known as the "Value Gap" and it's become one of the hottest topics in recent years.

In fact, it is so controversial that new copyright legislation, currently weaving its way through the corridors of power in the EU Parliament, is specifically designed to address it.

If passed, Article 13 will require platforms like YouTube to pre-filter uploads to detect potential infringement. Indeed, the legislation may as well have been named the YouTube Act, since it's the platform that provoked this entire debate and the whole Value Gap dispute.

With that in mind, it's of interest to consider the words of YouTube's global head of music Lyor Cohen this week. In an interview with Music Week, Cohen pledges that his company's new music service, YouTube Music, will not only match the rates the industry achieves from Apple Music and Spotify, but the company's ad-supported free tier viewers will soon be delivering more cash to the labels too. "Of course [rights holders are] going to get more money," he told Music Week.

If YouTube lives up to its pledge, a level playing field will not only be welcomed by the music industry but also YouTube competitors such as Spotify, who currently offer a free tier on less favorable terms.

While there's still plenty of room for YouTube to maneuver, peace breaking out with the labels may be coming a little too late for those deeply concerned about the implications of Article 13.

YouTube's business model and its reluctance to pay full market rate for music is what started the whole Article 13 movement in the first place and with the Legal Affairs Committee of the Parliament (JURI) adopting the proposals last week, time is running out to have them overturned.

Behind the scenes, however, the labels and their associates are going flat out to ensure that Article 13 passes, whether YouTube decides to "play fair" or not. Their language suggests that force is the best negotiating tactic with the distribution giant.

Yesterday, UK Music CEO Michael Dugher led a delegation to the EU Parliament in support of Article 13. He was joined by deputy Labour leader Tom Watson and representatives from the BPI, PRS, and Music Publishers Association, who urged MEPs to support the changes.

 

 

CALifornia Internet Police against HATE (CALIPHATE)...

California considers a bill to appoint a board of internet censors targeting social media


Link Here 28th June 2018
Full story: Internet Censorship in USA...Domain name seizures and SOPA
California is considering a bill that would require the state's attorney general to create a board of internet censors that would target social media.

The group would include at least one person from the Department of Justice, representatives from social media providers, civil liberties advocates, and First Amendment scholars, according to CBS13. They would theoretically study how fake stories spread through social media and then advise platforms on how to stop them.

The nonprofit Electronic Frontier Foundation is already taking a stand against the measure, noting that it violates the First Amendment and makes the government responsible for deciding if news is true or false.

 

 

A Fair achievement for the EFF...

After More Than a Decade of Litigation, the Dancing Baby Has Done His Part to Strengthen Fair Use for Everyone


Link Here 28th June 2018

Litigation can always take twists and turns, but when EFF filed a lawsuit against Universal Music Group in 2007 on behalf of Stephanie Lenz, few would have anticipated it would be ten years until the case was finally resolved. But today, at last, it is. Along the way, Lenz v. Universal contributed to strengthening fair use law, bringing nationwide attention to the issues of copyright and fair use in new digital movie-making and sharing technologies.

It all started when Lenz posted a YouTube video of her then-toddler-aged son dancing while Prince's song Let's Go Crazy played in the background, and Universal used copyright claims to get the link disabled. We brought the case hoping to get some clarity from the courts on a simple but important issue: can a rightsholder use the Digital Millennium Copyright Act to take down an obvious fair use, without consequence?

Congress designed the DMCA to give rightsholders, service providers, and users relatively precise rules of the road for policing online copyright infringement. The center of the scheme is the notice and takedown process. In exchange for substantial protection from liability for the actions of their users, service providers must promptly take offline content on their platforms that has been identified as infringing, and take several other prescribed steps. Copyright owners, for their part, are given an expedited, extra-judicial procedure for obtaining redress against alleged infringement, paired with explicit statutory guidance regarding the process for doing so, and provisions designed to deter and ameliorate abuse of that process.

Without Section 512, the risk of crippling liability for the acts of users would have prevented the emergence of most of the social media outlets we use today. Instead, the Internet has become the most revolutionary platform for the creation and dissemination of speech that the world has ever known.

But Congress also knew that Section 512's powerful incentives could also result in lawful material being censored from the Internet, without prior judicial scrutiny--much less advance notice to the person who posted the material--or an opportunity to contest the removal. To inhibit abuse, Congress made sure that the DMCA included a series of checks and balances, including Section 512(f), which gives users the ability to hold rightsholders accountable if they send a DMCA notice in bad faith.

In this case, Universal Music Group claimed to have a good faith belief that Ms. Lenz's video of her child dancing to a short segment of barely-audible music infringed copyright. Yet the undisputed facts showed Universal never considered whether Ms. Lenz's use was lawful under the fair use doctrine. If it had done so, it could not reasonably have concluded her use was infringing. On behalf of Stephanie Lenz, EFF argued that this was a misrepresentation in violation of Section 512(f).

In response, Universal argued that rightsholders have no obligation to consider fair use at all. The U.S. Court of Appeals for the Ninth Circuit rejected that argument, correctly holding that the DMCA requires a rightsholder to consider whether the uses she targets in a DMCA notice are actually lawful under the fair use doctrine. However, the court also held that a rightsholder's determination on that question passes muster as long as she subjectively believes it to be true. This leads to a virtually incoherent result: a rightsholder must consider fair use, but has no incentive to actually learn what such a consideration should entail. After all, if she doesn't know what the fair use factors are, she can't be held liable for not applying them thoughtfully.

We were disappointed in that part of the ruling, but it came with a big silver lining: the court also held that fair use is not simply a narrow defense to copyright infringement but an affirmative public right. For decades, rightsholders and scholars had debated the issue, with many preferring to construe fair use as narrowly as possible. Thanks to the Lenz decision, courts will be more likely to think of fair use, correctly, as a crucial vehicle for achieving the real purpose of copyright law: to promote the public interest in creativity and innovation. And rightsholders are on notice: they must at least consider fair use before sending a takedown notice.

Lenz and Universal filed petitions requesting that the Supreme Court review the Ninth Circuit's ruling. The Supreme Court denied both petitions. This meant that the case returned to the district court for trial on the question of whether Universal's takedown was a misrepresentation under the Ninth Circuit's subjective standard. Rather than go to trial, the parties have agreed to a settlement.

Lenz v. Universal helped make some great law on fair use and also played a role in leading to better takedown processes at Universal. EFF congratulates Stephanie Lenz for fighting the good fight, and we thank our co-counsel at Keker, Van Nest & Peters LLP and Kwun Bhansali Lazarus LLP for being our partners through this long journey.

 

 

Whining Winnie...

China blocks HBO and social media comments after John Oliver mocks Xi Jinping


Link Here 26th June 2018
Full story: Internet Censorship in China...All pervading Chinese internet censorship
An item mocking China, Xi Jinping and Trump on John Oliver's HBO show Last Week Tonight seems to have wound up China's censors.

HBO's website has been blocked in China and social media censors have been working hard to eliminate comments about the show.

According to the anti-censorship and monitoring group Greatfire.org, HBO's website was completely blocked within China as of Saturday, days after media reports emerged that Weibo had censored new posts mentioning Oliver or his HBO show Last Week Tonight.

In the show, Oliver made fun of the Chinese president's apparent sensitivity over comparisons of his figure with that of Winnie the Pooh. Images of the AA Milne character, used to mock Xi, have been censored in China. Oliver also took a serious tone in the show, criticising Xi for the removal of term limits from the Chinese constitution, the use of political re-education camps in the Muslim-majority region of Xinjiang, and a crackdown on civil society. Oliver noted the continued house arrest of Liu Xia, wife of Chinese dissident and Nobel laureate Liu Xiaobo, who died last year while serving an 11-year prison sentence.

 

 

A censored Netflix...

ASEAN VoD services unite to produce self censorship code


Link Here 25th June 2018
Video-on-demand streaming providers in Asean (Association of Southeast Asian Nations) countries, including ASTRO, dimsum, Fox+, HOOQ, iflix, Netflix, tonton, TVB and Walt Disney, have joined forces to launch a self-censorship Subscription Video-on-Demand Industry Content Code.

The censorship rules ensure that the content offered on these platforms is authentic, free from hate speech, pornography and other forms of inappropriate content.

Furthermore, the Code also aims to provide users with age-appropriate content advice.

Companies participating in the Code said in a statement:

We share a mutual objective of putting consumer well-being at the heart of our services. This Code demonstrates our commitment to making sure that the consumer is able to make content viewing choices that are right for them and their families.

They also welcome other video-on-demand services to work under their rules.

 

 

Offsite Article: How the internet is being mapped in real-time to stop censorship...


Link Here 25th June 2018
A combination of hardware and software is being used to track government censorship of the internet. NetBlocks is taking its censorship monitoring global. By Matt Reynolds

See article from wired.co.uk

 

 

Commented: The Drill Squad...

Police set up a 20-strong social media censor, initially targeting gang-related violence


Link Here 24th June 2018
Full story: Drill Music...Drill music videos banned by UK police

Social media censor announced to tackle gang-related online content

The Home Secretary Sajid Javid has announced £1.38 million to strengthen the police's response to violent and gang-related online content.

Funding from the government's £40 million Serious Violence Strategy will be used to create a 20-strong team of police staff and officers tasked with disrupting and removing overt and covert gang-related online content.

The social media censor will proactively flag illegal and harmful online content for social media companies to take down. Hosted by the Metropolitan Police, the new capability will also prevent violence on our streets by identifying gang-related messages generating the most risk and violence.

The move follows the Serious Violence Taskforce chaired by the Home Secretary urging social media companies to do more to take down these videos. The Home Secretary invited representatives from Facebook and Google to Monday's meeting to explain the preventative action they are already taking against gang material hosted on their platforms.

Home Secretary Sajid Javid said:

Street gangs are increasingly using social media as a platform to incite violence, taunt each other and promote crime.

This is a major concern and I want companies such as Facebook and Google to do more.

We are taking urgent action and the new social media hub will improve the police's ability to identify and remove this dangerous content.

Duncan Ball, Deputy Assistant Commissioner of the Metropolitan Police Service and National Policing lead for Gangs, said:

Police forces across the country are committed to doing everything we can to tackle violent crime and the impact that it has on our communities. Through this funding we can develop a team that is a centre of expertise and excellence that will target violent gangs and those plotting and encouraging violence online.

By working together with social media companies we will ensure that online material that glamourises murder, lures young people into a dangerous, violent life of crime, and encourages violence is quickly dealt with to cut off this outlet for gangs and criminals.

Looking to the future we aim to develop a world class capability that will tackle the type of dangerous social media activity that promotes or encourages serious violence.

It is already an offence to incite, assist, or encourage violence online and the Home Office is focused towards building on the relationships made with social media providers to identify where we can take action relevant to tackling serious violence.

Comment: Making music videos is not a criminal activity -- no matter what genre

24th June 2018. See  article from theconversation.com

West London music group 1011 has recently been banned from recording or performing music without police permission. On June 15, the Metropolitan police issued the group, which has been the subject of a two-year police investigation, with a Criminal Behaviour Order.

For the next three years, five members of the group -- which creates and performs a UK version of drill, a genre of hip-hop that emerged from Chicago -- must give 24 hours' notice of the release of any music video, and 48 hours' notice of any live performance. They are also banned from attending Notting Hill Carnival and wearing balaclavas.

This is a legally unprecedented move, but it is not without context. A recent Amnesty UK report on the Metropolitan Police Gangs Matrix -- a risk assessment tool that links individuals to gang related crime -- stated that:

The sharing of YouTube videos and other social media activity are used as potential criteria for adding names to the Matrix, with grime music videos featuring gang names or signs considered a particular possible indicator of likely gang affiliation.

Furthermore, recent research indicates that almost 90% of those on the Matrix are black or ethnic minority.

For young people who make music, video is a key way to share their work with a wider audience. Online platforms such as SBTV, LinkUp TV, GRM Daily and UK Grime are all popular sites. Often using street corners and housing estates as a location, these videos are a central component of the urban music scene. But the making of these music videos appears to feed into a continuing unease about youth crime and public safety.

Fifteen years ago, ministers were concerned about rap lyrics; in 2007 some MPs demanded to have videos banned after a shooting in Liverpool. UK drill music is only the focus of the most recent crackdown by the Metropolitan police, which has requested YouTube to remove any music videos with violent content.

The production and circulation of urban music videos has become a contested activity -- and performance in the public sphere is presented as a cause for concern. This is leading to the criminalisation of everyday pursuits. Young people from poor backgrounds are now becoming categorised as troublemakers through the mere act of making a music video.

See full article from theconversation.com

 

 

With pals like that, who needs enemies...

Paypal decides to censor the games company behind Active Shooter


Link Here 22nd June 2018
Full story: Paypal Censors...Paypal unilaterally decide to act as media censors
Acid Software, the developer of a shooting simulator recently removed from Steam, will now struggle to sell its products online thanks to censorship by PayPal.

The Active Shooter developer said this week that purchases of its highly controversial game were temporarily disabled while it tried to resolve issues with PayPal.

Paypal has confirmed it has banned the account saying:

PayPal has a longstanding, well-defined and consistently enforced Acceptable Use Policy, and regardless of the individual or organisation in question, we work to ensure that our services are not used to accept payments for activities that promote violence.

Acid Software spokesperson Ata Berdyev told the Associated Press the future of the game is now in doubt.

 

 

Offsite Article: Choked Out by the Digital Content Gatekeepers...


Link Here 21st June 2018
Troma Entertainment and many more Independent Filmmakers are being strangled by the likes of YouTube. An essay by Lloyd Kaufman

See article from troma.com

 

 

Censorship machines mass ready for the internet killing fields...

European Parliament committee passed vote to hand over censorship of the internet to US corporate giants


Link Here 20th June 2018
Full story: Internet Censorship in EU...EU introduces swathes of internet censorship law
The European Parliament's Committee on Legal Affairs (JURI) has officially approved Articles 11 and 13 of a Digital Single Market (DSM) copyright proposal, mandating censorship machines and a link tax.

Articles 11 and 13 of the Directive of the European Parliament and of the Council on Copyright in the Digital Single Market have been the subject of considerable campaigning from pro-copyleft groups including the Open Rights Group and Electronic Frontier Foundation of late.

Article 11, as per the final version of the proposal, discusses the implementation of a link tax - the requirement that any site citing third-party materials do so in a way that adheres to the exemptions and restrictions of a total of 28 separate copyright laws or pays for a licence to use and link to the material;

Article 13, meanwhile, requires any site which allows users to post text, sound, program code, still or moving images, or any other work which can be copyrighted to automatically scan all such uploads against a database of copyright works - a database which they will be required to pay to access.
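A minimal sketch of the kind of upload scanning Article 13 contemplates, assuming a simple database of exact fingerprints (real systems such as YouTube's Content ID use perceptual fingerprints that survive re-encoding; the works, claimants, and function below are placeholders invented for illustration):

```python
import hashlib

# Hypothetical database mapping fingerprints of claimed works to claimants.
CLAIMED_WORKS = {
    hashlib.sha256(b"example song master recording").hexdigest(): "Label A",
    hashlib.sha256(b"example film trailer").hexdigest(): "Studio B",
}

def filter_upload(data: bytes) -> str:
    """Block an upload if its fingerprint matches a claimed work."""
    digest = hashlib.sha256(data).hexdigest()
    # The match is decided automatically, before any human review.
    return "blocked" if digest in CLAIMED_WORKS else "allowed"

print(filter_upload(b"example song master recording"))  # blocked
print(filter_upload(b"home video of my cat"))           # allowed
```

Note that such a filter has no notion of quotation, parody, or other copyright exceptions: anything matching the database is blocked, whoever uploaded it and for whatever purpose.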

Neither Article 11 nor Article 13 will become official legislation until passed by the entire European Parliament in a plenary vote. There's no definite timetable for when such a vote might take place, but it would likely happen sometime between December of this year and the first half of 2019.

 

 

Unstoppable Illiberalism...

In two days, an EU committee will vote to crown Google and Facebook permanent lords of internet censorship


Link Here 19th June 2018
Full story: Internet Censorship in EU...EU introduces swathes of internet censorship law

On June 20, the EU's legislative committee will vote on the new Copyright directive, and decide whether it will include the controversial "Article 13" (automated censorship of anything an algorithm identifies as a copyright violation) and "Article 11" (no linking to news stories without paid permission from the site).

These proposals will make starting new internet companies effectively impossible -- Google, Facebook, Twitter, Apple, and the other US giants will be able to negotiate favourable rates and build out the infrastructure to comply with these proposals, but no one else will. The EU's regional tech success stories -- say Seznam.cz, a successful Czech search competitor to Google -- don't have $60-100,000,000 lying around to build out their filters, and lack the leverage to extract favorable linking licenses from news sites.

If Articles 11 and 13 pass, American companies will be in charge of Europe's conversations, deciding which photos and tweets and videos can be seen by the public, and who may speak.

The MEP Julia Reda has written up the state of play on the vote, and it's very bad. Both left- and right-wing parties have backed this proposal, including (incredibly) the French Front National, whose YouTube channel was just deleted by a copyright filter of the sort they're about to vote to universalise.

So far, the focus in the debate has been on the intended consequences of the proposals: the idea that a certain amount of free expression and competition must be sacrificed to enable rightsholders to force Google and Facebook to share their profits.

But the unintended -- and utterly foreseeable -- consequences are even more important. Article 11's link tax allows news sites to decide who gets to link to them, meaning that they can exclude their critics. With election cycles dominated by hoaxes and fake news, the right of a news publisher to decide who gets to criticise it is carte blanche to lie and spin.

Article 13's copyright filters are even more vulnerable to attack: the proposals contain no penalties for false claims of copyright ownership, but they do mandate that the filters must accept copyright claims in bulk, allowing rightsholders to upload millions of works at once in order to claim their copyright and prevent anyone from posting them.

That opens the doors to all kinds of attacks. The obvious one is that trolls might sow mischief by uploading millions of works they don't hold the copyright to, in order to prevent others from quoting them: the works of Shakespeare, say, or everything ever posted to Wikipedia, or my novels, or your family photos.

More insidious is the possibility of targeted strikes during crisis: stock-market manipulators could use bots to claim copyright over news about a company, suppressing its sharing on social media; political actors could suppress key articles during referendums or elections; corrupt governments could use arms-length trolls to falsely claim ownership of footage of human rights abuses.

It's asymmetric warfare: falsely claiming a copyright will be easy (because the rightsholders who want this system will not tolerate jumping through hoops to make their claims) and instant (because rightsholders won't tolerate delays when their new releases are being shared online at their moment of peak popularity). Removing a false claim of copyright will require that a human at an internet giant looks at it, sleuths out the truth of the ownership of the work, and adjusts the database -- for millions of works at once. Bots will be able to pollute the copyright databases much faster than humans could possibly clear it.
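The scale of that asymmetry can be sketched with back-of-envelope arithmetic. All the rates below are invented for illustration; the point is only the order of magnitude:

```python
# Assumed rates -- invented purely to illustrate the asymmetry.
bot_claims_per_day = 1_000_000      # bulk claims filed by automated tools
minutes_per_human_review = 10       # sleuthing the true ownership of one claim
reviewer_minutes_per_day = 8 * 60   # one reviewer's working day

reviews_per_reviewer_per_day = reviewer_minutes_per_day // minutes_per_human_review
reviewers_needed = bot_claims_per_day / reviews_per_reviewer_per_day

print(f"{reviews_per_reviewer_per_day} claims reviewed per person per day")
print(f"{reviewers_needed:,.0f} full-time reviewers needed to keep pace")
```

Under these assumptions, a single day's bot output would demand over twenty thousand full-time reviewers -- which is the sense in which bots can pollute the database far faster than humans can clear it.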

I spoke with Wired UK's KG Orphanides about this, and their excellent article on the proposal is the best explanation I've seen of the uses of these copyright filters to create unstoppable disinformation campaigns.

Doctorow highlighted the potential for unanticipated abuse of any automated copyright filtering system to make false copyright claims, engage in targeted harassment and even silence public discourse at sensitive times.

"Because the directive does not provide penalties for abuse -- and because rightsholders will not tolerate delays between claiming copyright over a work and suppressing its public display -- it will be trivial to claim copyright over key works at key moments or use bots to claim copyrights on whole corpuses.

The nature of automated systems, particularly if powerful rightsholders insist that they default to initially blocking potentially copyrighted material and then releasing it if a complaint is made, would make it easy for griefers to file copyright claims over, for example, relevant Wikipedia articles on the eve of a Greek debt-default referendum or, more generally, public domain content such as the entirety of Wikipedia or the complete works of Shakespeare.

"Making these claims will be MUCH easier than sorting them out -- bots can use cloud providers all over the world to file claims, while companies like Automattic (WordPress) or Twitter, or even projects like Wikipedia, would have to marshal vast armies to sort through the claims and remove the bad ones -- and if they get it wrong and remove a legit copyright claim, they face unbelievable copyright liability."

 

 

Don't MEPs watch the movies?...

Nascent censorship machines already rise up against the stupid politicians that support their Genesis


Link Here18th June 2018

Politicians, about to vote in favor of mandatory upload filtering in Europe, get channel deleted by YouTube's upload filtering.

French politicians of the former Front National are furious: their entire YouTube channel was just taken down by automatic filters at YouTube for alleged copyright violations. Perhaps this will cause them to reconsider next week's vote, which they have announced they will support: the bill that will make exactly this arbitrary, political, and unilateral upload filtering mandatory all across Europe.

The French party Front National, now renamed Rassemblement National (National Rally), which is one of the biggest parties in France, has had its YouTube channel disappeared on grounds of alleged copyright violations. In an interview with French broadcaster Europe 1, party leader Marine Le Pen calls the takedown arbitrary, political, and unilateral.

Europe is about to vote on new copyright law next week, next Wednesday or Thursday. So let's disregard here for a moment that this happened to a party normally described as far-right, and observe that if it can happen to one of France's biggest parties regardless of their policies, then it can happen to anyone -- for political reasons or any other reason.

The broadcast named TVLibertés is gone, with YouTube's notice reading: YouTube has blocked the broadcast of the newscast of Thursday, June 14 for copyright infringement.

Marine Le Pen was quoted as saying, This measure is completely false; we can easily assert a right of quotation [to illustrate why the material was well within the law to broadcast].

She's right. Automated upload filters do not take into account when you have a legal right to broadcast copyrighted material for one of the myriad of valid reasons. They will just assume that such reasons never exist; if nothing else, to make sure that the hosting platform steers clear of any liability. Political messages will be disappeared on mere allegations by a political opponent, just as might have happened here.

And yet, the Rassemblement National is going to vote in favor of exactly this mandatory upload filtering. The horror they just described on national TV as arbitrary, political, and unilateral.

It's hard to illustrate more clearly that Europe's politicians have absolutely no idea about the monster they're voting on next week.

The decisions to come will be unilateral, political, and arbitrary. Freedom of speech will be unilateral, political, and arbitrary. Just as Marine Le Pen says. Just as YouTube's Content ID filtering is today, as has just been illustrated.

The article mandating this unilateral, political, and arbitrary censorship is called Article 13 of the upcoming European Copyright bill, and it must be removed entirely. There is no fixing of automated censorship machines.

Privacy remains your own responsibility. So do your freedoms of speech, information, and expression.

 

 

The UN shames the EU...

The UN's free speech rapporteur condemns the EU's censorship machines that will violate human rights


Link Here17th June 2018
Full story: Internet Censorship in EU...EU introduces swathes of internet censorship law
David Kaye, the UN's Special Rapporteur on freedom of expression has now chimed in with a very thorough report, highlighting how Article 13 of the Directive -- the part about mandatory copyright filters -- would be a disaster for free speech and would violate the UN's Declaration on Human Rights, and in particular Article 19 which says:

Everyone has the right to freedom of opinion and expression; this right includes freedom to hold opinions without interference and to seek, receive and impart information and ideas through any media and regardless of frontiers.

As Kaye's report notes, the upload filters of Article 13 of the Copyright Directive would almost certainly violate this principle.

Article 13 of the proposed Directive appears likely to incentivize content-sharing providers to restrict at the point of upload user-generated content that is perfectly legitimate and lawful. Although the latest proposed versions of Article 13 do not explicitly refer to upload filters and other content recognition technologies, it couches the obligation to prevent the availability of copyright protected works in vague terms, such as demonstrating best efforts and taking effective and proportionate measures. Article 13(5) indicates that the assessment of effectiveness and proportionality will take into account factors such as the volume and type of works and the cost and availability of measures, but these still leave considerable leeway for interpretation.

The significant legal uncertainty such language creates does not only raise concern that it is inconsistent with the Article 19(3) requirement that restrictions on freedom of expression should be provided by law. Such uncertainty would also raise pressure on content sharing providers to err on the side of caution and implement intrusive content recognition technologies that monitor and filter user-generated content at the point of upload. I am concerned that the restriction of user-generated content before its publication subjects users to restrictions on freedom of expression without prior judicial review of the legality, necessity and proportionality of such restrictions. Exacerbating these concerns is the reality that content filtering technologies are not equipped to perform context-sensitive interpretations of the valid scope of limitations and exceptions to copyright, such as fair comment or reporting, teaching, criticism, satire and parody.

Kaye further notes that copyright is not the kind of thing that an algorithm can readily determine, and the fact-specific and context-specific nature of copyright requires much more than just throwing algorithms at the problem -- especially when a website may face legal liability for getting it wrong.

The designation of such mechanisms as the main avenue to address users' complaints effectively delegates content blocking decisions under copyright law to extrajudicial mechanisms, potentially in violation of minimum due process guarantees under international human rights law. The blocking of content -- particularly in the context of fair use and other fact-sensitive exceptions to copyright -- may raise complex legal questions that require adjudication by an independent and impartial judicial authority. Even in exceptional circumstances where expedited action is required, notice-and-notice regimes and expedited judicial process are available as less invasive means for protecting the aims of copyright law.

In the event that content blocking decisions are deemed invalid and reversed, the complaint and redress mechanism established by private entities effectively assumes the role of providing access to remedies for violations of human rights law. I am concerned that such delegation would violate the State's obligation to provide access to an effective remedy for violations of rights specified under the Covenant. Given that most of the content sharing providers covered under Article 13 are profit-motivated and act primarily in the interests of their shareholders, they lack the qualities of independence and impartiality required to adjudicate and administer remedies for human rights violations. Since they also have no incentive to designate the blocking as being on the basis of the proposed Directive or other relevant law, they may opt for the legally safer route of claiming that the upload was a terms of service violation -- this outcome may deprive users of even the remedy envisioned under Article 13(7). Finally, I wish to emphasize that unblocking, the most common remedy available for invalid content restrictions, may often fail to address financial and other harms associated with the blocking of time-sensitive content.

He goes on to point out that while large platforms may be able to deal with all of this, smaller ones are going to be in serious trouble:

I am concerned that the proposed Directive will impose undue restrictions on nonprofits and small private intermediaries. The definition of an online content sharing provider under Article 2(5) is based on ambiguous and highly subjective criteria such as the volume of copyright protected works it handles, and it does not provide a clear exemption for nonprofits. Since nonprofits and small content sharing providers may not have the financial resources to establish licensing agreements with media companies and other right holders, they may be subject to onerous and legally ambiguous obligations to monitor and restrict the availability of copyright protected works on their platforms. Although Article 13(5)'s criteria for effective and proportionate measures take into account the size of the provider concerned and the types of services it offers, it is unclear how these factors will be assessed, further compounding the legal uncertainty that nonprofits and small providers face. It would also prevent a diversity of nonprofit and small content-sharing providers from potentially reaching a larger size, and result in strengthening the monopoly of the currently established providers, which could be an impediment to the right to science and culture as framed in Article 15 of the ICESCR.

 

 

Offsite Article: How China censors the net: by making sure there's too much information...


Link Here 17th June 2018
Full story: Internet Censorship in China...All pervading Chinese internet censorship
A remarkable new book by Margaret Roberts reveals a detailed picture of networked authoritarianism in action. Roberts's book is a magisterial summary of what we have learned so far.

See article from theguardian.com

 

 

Offsite Article: Germany's Deletion Centre...


Link Here16th June 2018
Full story: Internet Censorship in Germany...Germany considers state internet filtering
A report about 1200 internet censors working in Germany to delete Nazi symbols and insults of migrants

See article from independent.co.uk

 

 

Downloaders downed...

More YouTube video and audio download sites closed down following legal pressure from the music industry


Link Here15th June 2018
Several video downloading and MP3 conversion tools have thrown in the towel this week, disabling all functionality following legal pressure. Pickvideo.net states that it received a cease and desist order, while Video-download.co and EasyLoad.co reference the lawsuit against YouTube-MP3 as the reason for their decision.

The music industry sees stream ripping as one of the largest piracy threats. The RIAA, IFPI, and BPI showed that they're serious about the issue when they filed legal action against YouTube-MP3, the largest stream ripping site at the time.

This case eventually resulted in a settlement where the site, once good for over a million daily visitors, agreed to shut down voluntarily last year.

YouTube-MP3's demise was a clear victory for the music groups, which swiftly identified their next targets, putting them under pressure, both in public and behind the scenes.

This week this appears to have taken its toll on several stream ripping sites, which allowed users to download videos from YouTube and other platforms, with the option to convert files to MP3s. The targets include Pickvideo.net, Video-download.co and Easyload.co, which all inform their users that they've thrown in the towel.

With several million visits per month, Pickvideo is the largest of the three. According to the site, they took the drastic measures following a cease-and-desist letter.

 

 

Who pays the blocker?...

UK Supreme Court rules that the cost of website blocking should not be borne by ISPs, and indirectly, internet users


Link Here14th June 2018
The UK Supreme Court has today ruled that trade mark holders are not able to compel ISPs to bear the cost of implementing orders to block websites selling counterfeit goods.

Open Rights Group acted as an intervener in this case. We argued that Internet service providers (ISPs), as innocent parties, should not bear the costs of website blocking, and that this was a long-standing principle of English law.

Jim Killock, Executive Director of Open Rights Group said:

This case is important because if ISPs paid the costs of blocking websites, the result would be an increasing number of blocks for relatively trivial reasons and the costs would be passed to customers.

While rights holders may want websites blocked, it needs to be economically rational to ask for this.

Solicitor in the case David Allen Green said:

I am delighted to have acted, through my firm Preiskel, successfully for the Open Rights Group in their intervention.

We intervened to say that those enforcing private rights on the internet should bear the costs of doing so, not others. This morning, the UK Supreme Court held unanimously that the rights holders should bear the costs.

The main party to the case was BT who opposed being forced to pay for costs incurred in blocking websites. Now rights-holders must reimburse ISPs for the costs of blocking rights-infringing material.

Supreme Court judge Lord Sumption, one of five on the panel, ruled:

There is no legal basis for requiring a party to shoulder the burden of remedying an injustice if he has no legal responsibility for the infringement and is not a volunteer but is acting under the compulsion of an order of the court.

It follows that in principle the rights-holders should indemnify the ISPs against their compliance costs. Section 97A of the Copyright, Designs and Patents Act 1988 allows rights-holders to go to court and get a blocking order -- the question in the current case is who stumps up for the costs of complying with that order?

Of course this now raises the question of who should pay for the mass porn website blocking that will be needed when the BBFC porn censorship regime starts its work.

 

 

The Rise of the Machines...

Vint Cerf, Tim Berners-Lee, and Dozens of Other Computing Experts Oppose Article 13 of the EU's new internet censorship law


Link Here13th June 2018
Full story: Internet Censorship in EU...EU introduces swathes of internet censorship law

As Europe's latest copyright proposal heads to a critical vote on June 20-21, more than 70 Internet and computing luminaries have spoken out against a dangerous provision, Article 13, that would require Internet platforms to automatically filter uploaded content. The group, which includes Internet pioneer Vint Cerf, the inventor of the World Wide Web Tim Berners-Lee, Wikipedia co-founder Jimmy Wales, co-founder of the Mozilla Project Mitchell Baker, Internet Archive founder Brewster Kahle, cryptography expert Bruce Schneier, and net neutrality expert Tim Wu, wrote in a joint letter that was released today:

By requiring Internet platforms to perform automatic filtering of all of the content that their users upload, Article 13 takes an unprecedented step towards the transformation of the Internet, from an open platform for sharing and innovation, into a tool for the automated surveillance and control of its users.

The prospects for the elimination of Article 13 have continued to worsen. Until late last month, there was hope that Member States (represented by the Council of the European Union) would find a compromise. Instead, their final negotiating mandate doubled down on it.

The last hope for defeating the proposal now lies with the European Parliament. On June 20-21 the Legal Affairs (JURI) Committee will vote on the proposal. If it votes against upload filtering, the fight can continue in the Parliament's subsequent negotiations with the Council and the European Commission. If not, then automatic filtering of all uploaded content may become a mandatory requirement for all user content platforms that serve European users. Although this will pose little impediment to the largest platforms such as YouTube, which already uses its Content ID system to filter content, the law will create an expensive barrier to entry for smaller platforms and startups, which may choose to establish or move their operations overseas in order to avoid the European law.

For those platforms that do establish upload filtering, users will find that their contributions -- including video, audio, text, and even source code -- will be monitored and potentially blocked if the automated system detects what it believes to be a copyright infringement. Inevitably, mistakes will happen. There is no way for an automated system to reliably determine when the use of a copyright work falls within a copyright limitation or exception under European law, such as quotation or parody.
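That context-blindness is structural: a filter sees only whether uploaded content matches a registered work, never why it is being used. A deliberately simplistic sketch makes the point (real systems use perceptual fingerprints rather than substring matching, but the limitation is the same); the "registered" work here is a public-domain line of Shakespeare, invented for illustration:

```python
# Toy upload filter: shows that content matching is blind to context.
# A brief quotation inside a review and a wholesale copy trigger the
# exact same block, because the filter only compares uploads against a
# rightsholder database -- it has no notion of "quotation" or "parody".

RIGHTSHOLDER_DB = {
    "to be or not to be that is the question",  # hypothetical registered work
}

def is_blocked(upload: str) -> bool:
    """Block if any registered work appears anywhere in the upload."""
    text = upload.lower()
    return any(work in text for work in RIGHTSHOLDER_DB)

full_copy = "to be or not to be that is the question"
quotation = ('My review: the line "to be or not to be that is the '
             'question" is the most misread line in Hamlet.')

# Both uploads are blocked; the filter cannot tell them apart.
assert is_blocked(full_copy) and is_blocked(quotation)
```

A human instantly sees that the second upload is lawful quotation; the matcher, by construction, cannot.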

Moreover, because these exceptions are not consistent across Europe, and because there is no broad fair use right as in the United States, many harmless uses of copyright works in memes, mashups, and remixes probably are technically infringing even if no reasonable copyright owner would object. If an automated system monitors and filters out these technical infringements, then the permissible scope of freedom of expression in Europe will be radically curtailed, even without the need for any substantive changes in copyright law.

The upload filtering proposal stems from a misunderstanding about the purpose of copyright . Copyright isn't designed to compensate creators for each and every use of their works. It is meant to incentivize creators as part of an effort to promote the public interest in innovation and expression. But that public interest isn't served unless there are limitations on copyright that allow new generations to build and comment on the previous contributions . Those limitations are both legal, like fair dealing, and practical, like the zone of tolerance for harmless uses. Automated upload filtering will undermine both.

The authors of today's letter write:

We support the consideration of measures that would improve the ability for creators to receive fair remuneration for the use of their works online. But we cannot support Article 13, which would mandate Internet platforms to embed an automated infrastructure for monitoring and censorship deep into their networks. For the sake of the Internet's future, we urge you to vote for the deletion of this proposal.

What began as a bad idea offered up to copyright lobbyists as a solution to an imaginary "value gap" has now become an outright crisis for the future of the Internet as we know it. Indeed, if those who created and sustain the operation of the Internet recognize the scale of this threat, we should all be sitting up and taking notice.

If you live in Europe or have European friends or family, now could be your last opportunity to avert the upload filter. Please take action by clicking the button below, which will take you to a campaign website where you can phone, email, or Tweet at your representatives, urging them to stop this threat to the global Internet before it's too late.

Take Action at saveyourinternet.eu

 

 

Banning anything that creates difficulties for the authorities...

Vietnam passes a new internet censorship law where almost all online posts will be illegal


Link Here 13th June 2018
Full story: Internet Censorship in Vietnam...New law requiring websites to delete content
The Committee to Protect Journalists condemned a new cybersecurity law passed today by Vietnam's National Assembly as a clear threat to press freedom and called on the Vietnamese government immediately to repeal it.

The legislation, which goes into effect January 1, 2019, gives broad powers to government authorities to surveil the internet, including the ability to force international technology companies with operations in the country to reveal their users' personal information and censor online information on demand, according to news reports.

The law's vague and broad provisions ban any online posts deemed as opposing the State of the Socialist Republic of Vietnam, or which [offend] the nation, the national flag, the national anthem, great people, leaders, notable people and national heroes, according to the reports. The same sources state that the law's Article 8 prohibits the use of the internet to distort history, deny revolutionary achievements or undermine national solidarity.

The law also prohibits disseminating online incorrect information which causes confusion among people, damages socio-economic activities [or] creates difficulties for authorities and those performing their duty, according to reports.

After January 1, 2019, companies will have 24 hours to remove content that the Information and Communications Ministry or the Public Security Ministry find to be in violation of the new law.

Shawn Crispin, CPJ's Southeast Asia representative said:

Vietnam's new cybersecurity law represents a grave danger to journalists and bloggers who work online and should be promptly repealed. We expect international technology companies to use their best efforts to uphold their stated commitment to a free and open internet and user privacy and to resist any attempts to undermine those commitments.

 

 

Updated: The latest betting in the Swiss online gambling referendum...

State Monopoly is 4/6, Total Censorship is 5/4, and Online Freedom is a non-runner


Link Here 12th June 2018
Swiss voters will decide on Sunday whether to back a new gambling law designed to restrict online gambling to a state monopoly or reject what opponents say amounts to internet censorship.

Recent polls indicate a clear majority plan to support the new law, which has already been passed by both houses of parliament and is now being put to a referendum.

The Swiss government says the Gambling Act updates legislation for the digital age. If approved by voters, the law would be among the strictest in Europe and would only allow casinos and gaming companies certified in Switzerland to operate, including on the internet. This would enable Swiss companies for the first time to offer online gambling, but would basically block foreign-based companies from the market.

Bern also wants all of the companies' proceeds to be taxed in Switzerland, with revenues helping fund anti-addiction measures, as well as social security and sports and culture programmes.

The new law represents a windfall for Switzerland's casinos, which had put huge amounts of money into campaigning.

Opponents have slammed Bern for employing methods worthy of an authoritarian state, with a measure that they claim is censorship of the internet.

Update: State Monopoly wins by a distance

12th June 2018. See  article from globaltimes.cn

Swiss voters have overwhelmingly approved blocking foreign-based betting sites in a referendum on a new gambling law designed to create a local monopoly.

72.9% of voters came out in favor of the new gambling law.

The law, which is set to take effect next year, will be among the strictest in Europe, allowing only casinos and gaming companies certified in Switzerland to operate in the country, including on the internet.

It will enable Swiss companies for the first time to offer online gambling, but will basically block foreign-based companies from the market.

 

 

Offsite Article: What's really behind the EU law that would ban memes. And how to stop it...


Link Here 11th June 2018
Full story: Copyright in the EU...Copyright law for Europe
Big corporate lobbies are demanding these new copyright laws, hoping to make additional profits and gain more control over the web. By MEP Julia Reda

See article from juliareda.eu

 

 

Corporate takeover of our internet...

The EU's Copyright Proposal is Extremely Bad News for Everyone, Even (Especially!) Wikipedia. By Cory Doctorow


Link Here10th June 2018
Full story: Copyright in the EU...Copyright law for Europe

The pending update to the EU Copyright Directive is coming up for a committee vote on June 20 or 21 and a parliamentary vote either in early July or late September. While the directive fixes some longstanding problems with EU rules, it creates much, much larger ones: problems so big that they threaten to wreck the Internet itself.

Under Article 13 of the proposal, sites that allow users to post text, sounds, code, still or moving images, or other copyrighted works for public consumption will have to filter all their users' submissions against a database of copyrighted works. Sites will have to pay to license the technology to match submissions to the database, and to identify near matches as well as exact ones. Sites will be required to have a process to allow rightsholders to update this list with more copyrighted works.

Even under the best of circumstances, this presents huge problems. Algorithms that do content-matching are frankly terrible at it. The Made-in-the-USA version of this is YouTube's Content ID system, which improperly flags legitimate works all the time, but still gets flak from entertainment companies for not doing more.

There are lots of legitimate reasons for Internet users to upload copyrighted works. You might upload a clip from a nightclub (or a protest, or a technical presentation) that includes some copyrighted music in the background. Or you might just be wearing a t-shirt with your favorite album cover in your Tinder profile. You might upload the cover of a book you're selling on an online auction site, or you might want to post a photo of your sitting room in the rental listing for your flat, including the posters on the wall and the picture on the TV.

Wikipedians have even more specialised reasons to upload material: pictures of celebrities, photos taken at newsworthy events, and so on.

But the bots that Article 13 mandates will not be perfect. In fact, by design, they will be wildly imperfect.

Article 13 punishes any site that fails to block copyright infringement, but it won't punish people who abuse the system. There are no penalties for falsely claiming copyright over someone else's work, which means that someone could upload all of Wikipedia to a filter system (for instance, one of the many sites that incorporate Wikipedia's content into their own databases) and then claim ownership over it on Twitter, Facebook and Wordpress, and everyone else would be prevented from quoting Wikipedia on any of those services until they sorted out the false claims. It will be a lot easier to make these false claims than it will be to figure out which of the hundreds of millions of copyright claims are real and which ones are pranks or hoaxes or censorship attempts.

Article 13 also leaves you out in the cold when your own work is censored thanks to a malfunctioning copyright bot. Your only option when you get censored is to raise an objection with the platform and hope they see it your way--but if they fail to give real consideration to your petition, you have to go to court to plead your case.

Article 13 gets Wikipedia coming and going: not only does it create opportunities for unscrupulous or incompetent people to block the sharing of Wikipedia's content beyond its bounds, it could also require Wikipedia to filter submissions to the encyclopedia and its surrounding projects, like Wikimedia Commons. The drafters of Article 13 have tried to carve Wikipedia out of the rule, but thanks to sloppy drafting, they have failed: the exemption is limited to "noncommercial activity". Every file on Wikipedia is licensed for commercial use.

Then there are the websites that Wikipedia relies on as references. The fragility and impermanence of links is already a serious problem for Wikipedia's crucial footnotes, but after Article 13 becomes law, any information hosted in the EU might disappear -- and links to US mirrors might become infringing -- at any moment thanks to an overzealous copyright bot. For these reasons and many more, the Wikimedia Foundation has taken a public position condemning Article 13.

Speaking of references: the problems with the new copyright proposal don't stop there. Under Article 11, each member state will get to create a new copyright in news. If it passes, in order to link to a news website, you will either have to do so in a way that satisfies the limitations and exceptions of all 28 laws, or you will have to get a license. This is fundamentally incompatible with any sort of wiki (obviously), much less Wikipedia.

It also means that the websites that Wikipedia relies on for its reference links may face licensing hurdles that would limit their ability to cite their own sources. In particular, news sites may seek to withhold linking licenses from critics who want to quote from them in order to analyze, correct and critique their articles, making it much harder for anyone else to figure out where the positions are in debates, especially years after the fact. This may not matter to people who only pay attention to news in the moment, but it's a blow to projects that seek to present and preserve long-term records of noteworthy controversies. And since every member state will get to make its own rules for quotation and linking, Wikipedia posts will have to satisfy a patchwork of contradictory rules, some of which are already so severe that they'd ban any items in a "Further Reading" list unless the article directly referenced or criticized them.

The controversial measures in the new directive have been tried before. For example, link taxes were tried in Spain and Germany and they failed, and publishers don't want them. Indeed, the only country to embrace this idea as workable is China, where mandatory copyright enforcement bots have become part of the national toolkit for controlling public discourse.

Articles 13 and 11 are poorly thought through, poorly drafted, unworkable--and dangerous. The collateral damage they will impose on every realm of public life can't be overstated. The Internet, after all, is inextricably bound up in the daily lives of hundreds of millions of Europeans and an entire constellation of sites and services will be adversely affected by Article 13. Europe can't afford to place education, employment, family life, creativity, entertainment, business, protest, politics, and a thousand other activities at the mercy of unaccountable algorithmic filters. If you're a European concerned about these proposals, here's a tool for contacting your MEP .

 

 

Offsite Article: Censorship in the Age of Large Cloud Providers...


Link Here9th June 2018
Full story: Internet Censorship in Russia...Russia and its repressive state control of media
An interesting and detailed account of the battle between Russia's internet censors and the Telegram messaging service. By Bruce Schneier

See article from lawfareblog.com

 

 

Changing liability law to force companies to censor swathes of the internet...

UK government publishes its intentions to significantly ramp up internet censorship


Link Here8th June 2018
Full story: Social Networking Censorship in the UK...Internet censorship set to solve Britain's broken society
Who is liable if a user posts copyrighted music to YouTube without authority? Is it the user or is it YouTube? The answer is of course that it is the user who would be held liable should copyright holders seek compensation. YouTube would be held responsible only if they were informed of the infringement and refused to take it down.

This is the practical compromise that lets the internet work.

So what would happen if the government changed the liability laws so that YouTube was held liable for unauthorised music as soon as it was posted? There may be millions of views before it is spotted. If YouTube were immediately liable, they may have to pay millions in court judgements against them.

There is a lot of blather about YouTube having magic Artificial Intelligence that can detect copyrighted music and block it before it is uploaded. But this is nonsense; music is copyrighted by default, even a piece that has never been published and is not held in any computer database.

YouTube does not have a database that contains all the licensing and authorisation details, nor who exactly is allowed to post copyrighted material. Even big companies lie, so how could YouTube really know what could be posted and what could not?

If the law were to be changed, and YouTube were held responsible for the copyright infringement of their posters, then the only possible outcome would be for YouTube to use its AI to detect any music at all and block all videos which contain music. The only music allowed to be published would be from the music companies themselves, and even then after providing YouTube with paperwork to prove that they had the necessary authorisation.

So when the government speaks of changes to liability law they are speaking of a massive step up in internet censorship as the likely outcome.

In fact the censorship power of such liability tweaks has been proven in the US. The recently passed FOSTA law changed liability law so that internet companies are now held liable for user posts facilitating sex trafficking. The law was sold as a 'tweak' just to take action against trafficking. But it resulted in the immediate and almost total internet censorship of all user postings facilitating adult consensual sex work, and a fair amount of personal small ads and dating services as well.

The rub was that sex traffickers do not in any way specify that their sex workers have been trafficked, their adverts are exactly the same as for adult consensual sex workers. With all the artificial intelligence in the world, there is no way that internet companies can distinguish between the two.

When they are told they are liable for sex trafficking adverts, then the only possible way to comply is to ban all adverts or services that feature anything to do with sex or personal hook ups. Which is of course exactly what happened.

So when UK politicians speak of internet liability changes and sex trafficking, they are talking about big time, large scale internet censorship.

And Theresa May said today via a government press release, as reported in the Daily Mail:

Web giants such as Facebook and Twitter must automatically remove vile abuse aimed at women, Theresa May will demand today.

The Prime Minister will urge companies to utilise the same technology used to take down terrorist propaganda to remove rape threats and harassment.

Speaking at the G7 summit in Quebec, Mrs May will call on firms to do more to tackle content promoting and depicting violence against women and girls, including illegal violent pornography.

She will also demand the automatic removal of adverts that are linked to people-trafficking.

May will argue they must ensure women can use the web without fear of online rape threats, harassment, cyberstalking, blackmail or vile comments.

She will say: We know that technology plays a crucial part in advancing gender equality and empowering women and girls, but these benefits are being undermined by vile forms of online violence, abuse and harassment.

What is illegal offline is illegal online and I am calling on world leaders to take serious action to deal with this, just like we are doing in the UK with our commitment to legislate on online harms such as cyber-stalking and harassment.

In a world that is being ripped apart by identitarian intolerance of everyone else, it seems particularly unfair that men should be expected to happily put up with the fear of online threats, harassment, cyberstalking, blackmail or vile comments. Surely laws should be written so that all people are treated totally equally.

Theresa May did not say too much about liability law directly in her press release, but it is laid out pretty clearly in the government document just published, titled Government response to the internet strategy green paper [pdf]:

Taking responsibility

Platform liability and illegal harms

Online platforms need to take responsibility for the content they host. They need to proactively tackle harmful behaviours and content. Progress has been made in removing illegal content, particularly terrorist material, but more needs to be done to reduce the amount of damaging content online, legal and illegal.

We are developing options for increasing the liability online platforms have for illegal content on their services. This includes examining how we can make existing frameworks and definitions work better, as well as what the liability regime should look like in the long-run.

Terms and Conditions

Platforms use their terms and conditions to set out key information about who can use the service, what content is acceptable and what action can be taken if users don't comply with the terms. We know that users frequently break these rules. In such circumstances, the platforms' terms state that they can take action, for example they can remove the offending content or stop providing services to the user. However, we do not see companies proactively doing this on a routine basis. Too often companies simply do not enforce their own terms and conditions.

Government wants companies to set out clear expectations of what is acceptable on their platforms in their terms, and then enforce these rules using sanctions when necessary. By doing so, companies will be helping users understand what is and isn't acceptable.

Clear Standards

We believe that it is right for Government to set out clear standards for social media platforms, and to hold them to account if they fail to live up to these. DCMS and Home Office will jointly work on the White Paper which will set out our proposals for forthcoming legislation. We will focus on proposals which will bring into force real protections for users that will cover both harmful and illegal content and behaviours. In parallel, we are currently assessing legislative options to modify the online liability regime in the UK, including both the smaller changes consistent with the EU's eCommerce directive, and the larger changes that may be possible when we leave the EU.

Worrying or what?

 

 

Offsite Article: The false alarm over fake news...


Link Here8th June 2018
Full story: Fake news in the UK...Government sets up fake news unit
The voter-blaming elites have really lost touch with truth and reality. By Mick Hume

See article from spiked-online.com

 

 

Offsite Video: What Koreans Think Of Porn Censorship?...


Link Here8th June 2018
And how they work around it with VPNs and peer to peer downloads See video from YouTube

 

 

Offsite Article: How Walled Gardens Like Facebook Are Cannibalizing Media Publishers...


Link Here 7th June 2018
A call for social media like software to be developed for small publishers outside of the control of large internet corporates

See article from forbes.com

 

 

Judgement day nears for the latest EU internet censorship law...

TorrentFreak explains the grave threat to internet users and European small businesses


Link Here 6th June 2018
Full story: Internet Censorship in EU...EU introduces swathes of internet censorship law

The EU's plans to modernize copyright law in Europe are moving ahead. With a crucial vote coming up later this month, protests from various opponents are on the rise as well. They warn that the proposed plans will result in Internet filters which threaten people's ability to freely share content online. According to Pirate Party MEP Julia Reda, these filters will hurt regular Internet users, but also creators and businesses.

In September 2016, the European Commission published its proposal for a modernized copyright law. Among other things, it proposed measures to require online services to do more to fight piracy.

Specifically, Article 13 of the proposed Copyright Directive will require online services to track down and delete pirated content, in collaboration with rightsholders.

The Commission stressed that the changes are needed to support copyright holders. However, many legal scholars, digital activists, politicians, and members of the public worry that they will violate the rights of regular Internet users.

Last month the EU Council finalized the latest version of the proposal. This means that the matter now goes to the Legal Affairs Committee of the Parliament (JURI), which must decide how to move ahead. This vote is expected to take place in two weeks.

Although the term filter is commonly used to describe Article 13, it is not directly mentioned in the text itself.

According to Pirate Party Member of the European Parliament (MEP) Julia Reda, the filter keyword is avoided in the proposal to prevent a possible violation of EU law and the Charter of Fundamental Rights. However, the outcome is essentially the same.

In short, the relevant text states that online services are liable for any uploaded content unless they take effective and proportionate action to prevent copyright infringements, identified by copyright holders. That also includes preventing these files from being reuploaded.

The latter implies some form of hash filtering and continuous monitoring of all user uploads. Several services, including Google Drive, Dropbox, and YouTube, already have these types of filters, but many others don't.
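To see why hash filtering blocks re-uploads but little else, consider a minimal sketch of the approach. This is an illustration only, not any real service's implementation: the blocklist, function names and digest values here are hypothetical, standing in for the fingerprint databases that rightsholders would supply.

```python
import hashlib

# Hypothetical blocklist of SHA-256 digests of files that rightsholders
# have already identified as infringing (illustrative value only:
# this is the digest of the bytes b"test").
BLOCKED_DIGESTS = {
    "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08",
}

def sha256_digest(data: bytes) -> str:
    """Return the hex SHA-256 digest of an uploaded file's bytes."""
    return hashlib.sha256(data).hexdigest()

def is_blocked(data: bytes) -> bool:
    """Reject an upload whose digest matches a previously identified file."""
    return sha256_digest(data) in BLOCKED_DIGESTS

# An exact re-upload of an identified file is caught...
assert is_blocked(b"test")
# ...but changing even one byte defeats a naive hash match, which is
# why services like YouTube rely on fuzzy content fingerprinting instead.
assert not is_blocked(b"test!")
```

The sketch also shows the limit of the technique: exact-hash matching cannot recognise re-encoded or edited copies, let alone judge fair use, which is why critics expect platforms to fall back on broader, error-prone fingerprint filters.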

A main point of critique is that the automated upload checks will lead to overblocking, as they are often ill-equipped to deal with issues such as fair use.

The proposal would require platforms to filter all uploads by their users for potential copyright infringements -- not just YouTube and Facebook, but also services like WordPress, TripAdvisor, or even Tinder. We know from experience that these algorithmic filters regularly make mistakes and lead to the mass deletion of legal uploads, Julia Reda tells TF.

Especially small independent creators frequently see their content taken down because others wrongfully claim copyright on their works. There are no safeguards in the proposal against such cases of copyfraud.

Besides affecting uploads of regular Internet users and smaller creators, many businesses will also be hit. They will have to make sure that they can detect and prevent infringing material from being shared on their systems.

This will give larger American Internet giants, who already have these filters in place, a competitive edge over smaller players and new startups, the Pirate Party MEP argues.

It will make those Internet giants even stronger, because they will be the only ones able to develop and sell the filtering technologies necessary to comply with the law. A true lose-lose situation for European Internet users, authors and businesses, Reda tells us.

Based on the considerable protests in recent days, the current proposal is still seen as a clear threat by many.

In fact, the Save Your Internet campaign, backed by prominent organizations such as Creative Commons, EFF, and Open Media, is ramping up again. They urge the European public to reach out to their Members of Parliament before it's too late.

Should Article 13 of the Copyright Directive proposal be adopted, it will impose widespread censorship of all the content you share online. The European Parliament is the only one that can step in and Save your Internet, they write.

The full Article 13 text includes some language to limit its scope. The nature and size of online services must be taken into account, for example. This means that a small and legitimate niche service with a few dozen users might not be directly liable if it operates without these anti-piracy measures.

Similarly, non-profit organizations will not be required to comply with the proposed legislation, although there are calls from some member states to change this.

In addition to Article 13, there is also considerable pushback from the public against Article 11, which is regularly referred to as the link tax .

At the moment, several organizations are planning a protest day next week, hoping to mobilize the public to speak out. A week later, following the JURI vote, it will be judgment day.

If they pass the Committee, the plans will progress towards the final vote on copyright reform next spring. This also means that they'll become much harder to stop or change. That has been done before, such as with ACTA, but achieving that type of momentum will be a tough challenge.

 

 

Circumventing freedom...

Russian parliament approves fines for using VPNs to circumvent state website blocking


Link Here6th June 2018
Full story: Internet Censorship in Russia...Russia and its repressive state control of media
Lawmakers in Russia's State Duma have adopted a final draft of legislation that imposes fines on violations of Russia's ban on Internet anonymizers that grant access to online content blocked by the state internet censor.

According to the bill, individuals who break the law will face fines of 5,000 rubles ($80), officials will face fines up to 50,000 rubles ($800), and legal entities could be fined up to 700,000 rubles ($11,230).

Internet search engines will also be required to connect to the Federal State Information System, which will list the websites banned in Russia. Failure to connect to this system can result in fines up to 300,000 rubles ($4,800).

Russia's law on VPN services and Internet anonymizers entered force on November 1, 2017. The Federal Security Agency and other law enforcement agencies are authorized to designate websites and online services that violate Russia's Internet censorship rules.

 

 

No order in court...

The Open Rights Group finds that nearly 40% of court order blocks are in error


Link Here5th June 2018
Full story: Internet Blocking File Sharing in UK...High court dictates website block

Open Rights Group today released figures that show that High Court injunctions are being improperly administrated by ISPs and rights holders.

A new tool added to its blocked.org.uk project examines over 1,000 domains blocked under the UK's 30 injunctions against over 150 services.

ORG found 37% of those domains are blocked in error, or without any legal basis. The majority of the domains blocked are parked domains, or are no longer used by infringing services. One Sci-Hub domain is blocked without an injunction, and a likely trademark-infringing site is also blocked without an injunction.

However, the full list of blocked domains, believed to number around 2,500, is not made public, so ORG are unable to check for all possible mistakes.

Jim Killock, Executive Director of Open Rights Group said:

It is not acceptable for a legal process to result in nearly 40% maladministration. These results show a great deal of carelessness.

We expect ISPs and rights holders to examine our results and remove the errors we have found as swiftly as possible.

We want ISPs to immediately release lists of previously blocked domains, so we can check blocks are being removed by everyone.

Rights holders must make public exactly what is being blocked, so we can ascertain how else these extremely wide legal powers are being applied.

ORG's conclusions are:

  • The administration process of adding and subtracting domains to be blocked is very poor

  • Keeping the lists secret makes it impossible to check errors

  • Getting mistakes corrected is opaque. The ISP pages suggest you go to court.

Examples:

  • Some are potentially subject to an injunction which has not been sought, for instance: http://www.couchtuner.es

  • One redirects to a personal blog: http://kat.kleisauke.nl

Full results and statistical breakdowns https://www.blocked.org.uk/legal-blocks/errors

Export full results https://www.blocked.org.uk/legal-blocks

For a list of UK injunctions, see:

The UK has 30 copyright and trademark injunctions, blocking over 150 websites.

https://wiki.451unavailable.org.uk/wiki/Main_Page

 

 

Sound judgement...

Spotify reverses censorship based on artist controversy


Link Here5th June 2018
Spotify recently shared a new policy around hate content and conduct. And while we believe our intentions were good, the language was too vague, we created confusion and concern, and didn't spend enough time getting input from our own team and key partners before sharing new guidelines.

It's important to note that our policy had two parts. The first was related to promotional decisions in the rare cases of the most extreme artist controversies. As some have pointed out, this language was vague and left too many elements open to interpretation. We created concern that an allegation might affect artists' chances of landing on a Spotify playlist and negatively impact their future. Some artists even worried that mistakes made in their youth would be used against them.

That's not what Spotify is about. We don't aim to play judge and jury. We aim to connect artists and fans -- and Spotify playlists are a big part of how we do that. Our playlist editors are deeply rooted in their respective cultures, and their decisions focus on what music will positively resonate with their listeners. That can vary greatly from culture to culture, and playlist to playlist. Across all genres, our role is not to regulate artists. Therefore, we are moving away from implementing a policy around artist conduct.

The second part of our policy addressed hate content. Spotify does not permit content whose principal purpose is to incite hatred or violence against people because of their race, religion, disability, gender identity, or sexual orientation. As we've done before, we will remove content that violates that standard. We're not talking about offensive, explicit, or vulgar content -- we're talking about hate speech.

We will continue to seek ways to impact the greater good and further the industry we all care so much about. We believe Spotify has an opportunity to help push the broader music community forward through conversation, collaboration and action. We're committed to working across the artist and advocacy communities to help achieve that.

 

 

Offsite Article: The Google MAFIA (Microsoft, Amazon, Facebook, Instagram and Apple)...


Link Here 5th June 2018
The power of corporate giants like Amazon and Facebook is unparalleled. A regulatory assault seems urgent, inevitable -- and impossible. By Rafael Behr

See article from theguardian.com

 

 

Offsite Article: Article 13 could destroy the internet as we know it...


Link Here4th June 2018
What is it, why is it controversial and what will it mean for memes? Critics of the proposed EU directive on copyright warn that it will censor internet users

See article from alphr.com

 

 

Offsite Article: Signing Away Our Lives on Facebook...


Link Here3rd June 2018
The social media giant collects huge quantities of data to target advertising--and that has implications for our lives, our society, and our democracy. By University of King's College

See article from thewalrus.ca

 

 

Offsite Article: The internet of snooping things...


Link Here2nd June 2018
Which? investigation reveals staggering level of smart home surveillance

See article from which.co.uk

 

 

Updated: Instaban...

The US FOSTA internet censorship law results in Instagram banning all posts on the #stripper hashtag


Link Here1st June 2018
Full story: FOSTA US Internet Censorship Law...Wide ranging internet cesnorship law targetting sex workers
Instagram has censored the hashtag #stripper and several related keywords that dancers use to find each other and organize online. Now, sex workers are taking to social media to spread the word, decry censorship, and suggest workarounds.

Currently, when you search Instagram for #stripper or #strippers, you are given a preview of just a couple top posts in the category. But if you click through to view the entire hashtag, the following message appears:

Recent posts from #strippers are currently hidden because the community has reported some content that may not meet Instagram's community guidelines.

The same thing was reportedly happening until very recently with a handful of related hashtags, including #yesastripper, #stripperstyle, and #stripperlife--but those appear to be back in action, demonstrating how quickly the sex work community has to adapt and change.

Instagram has yet to comment about the censorship, but it is surely because of the recent US internet censorship law FOSTA. This would make Instagram responsible should any posts to #stripper be used to facilitate sex trafficking. As Instagram is unable to vet all such postings for possible trafficking, the only practical option is to ban all posts about sex work.

Update: Unbanned

1st June 2018. See  article from avn.com

By Thursday morning, Instagram had apparently backed down, telling Jezebel that, the hashtag #stripper can again be used and seen by the community in the spirit in which they are intended.

Instagram sent a statement on Thursday effectively rescinding the ban:

The safety of our community is our number one priority and we spend a lot of time thinking about how we can create a safe and open environment for everyone, Instagram said in the statement. This includes constantly monitoring hashtag behavior by using a variety of different signals, including community member reports. Access to recent posts and following hashtags are sometimes restricted based on content being posted with those hashtags. The hashtag #stripper can again be used and seen by the community in the spirit in which they are intended.

 

 

Offsite Article: Three Massive Threats to Online Liberty in the UK...


Link Here 1st June 2018
Age verification, online safety and a British FOSTA

See article from sexandcensorship.org

