After More Than a Decade of Litigation, the Dancing Baby Has Done His Part to Strengthen Fair Use for Everyone

28th June 2018

See article from eff.org
Litigation can always take twists and turns, but when EFF filed a lawsuit against Universal Music Group in 2007 on behalf of Stephanie Lenz, few would have anticipated it would be more than a decade until the case was finally resolved. But today, at last, it is. Along the way, Lenz v. Universal contributed to strengthening
fair use law, bringing nationwide attention to the issues of copyright and fair use in new digital movie-making and sharing technologies. It all started when Lenz posted a YouTube
video of her then-toddler-aged son dancing while Prince's song Let's Go Crazy played in the background, and Universal used copyright claims to get the link
disabled. We brought the case hoping to get some clarity from the courts on a simple but important issue: can a rightsholder use the Digital Millennium Copyright Act
to take down an obvious fair use, without consequence? Congress designed the DMCA to give rightsholders, service providers, and users relatively precise
rules of the road for policing online copyright infringement. The center of the scheme is the notice and takedown process. In exchange for substantial protection from liability for the actions of their users, service providers must promptly take down content on their platforms that has been identified as infringing, and follow several other prescribed steps. Copyright owners, for their part, are given an expedited, extra-judicial procedure for obtaining redress against alleged infringement, paired
with explicit statutory guidance regarding the process for doing so, and provisions designed to deter and ameliorate abuse of that process. Without Section 512, the risk of crippling liability for the acts of users would have
prevented the emergence of most of the social media outlets we use today. Instead, the Internet has become the most revolutionary platform for the creation and dissemination of speech that the world has ever known. But Congress
also knew that Section 512's powerful incentives could result in lawful material being censored from the Internet, without prior judicial scrutiny--much less advance notice to the person who posted the material--or an opportunity to contest the
removal. To inhibit abuse, Congress made sure that the DMCA included a series of checks and balances, including Section 512(f), which gives users the ability to hold rightsholders accountable if they send a DMCA notice in bad faith.
In this case, Universal Music Group claimed to have a good faith belief that Ms. Lenz's video of her child dancing to a short segment of barely-audible music infringed copyright. Yet the undisputed facts showed Universal never
considered whether Ms. Lenz's use was lawful under the fair use doctrine. If it had done so, it could not reasonably have concluded her use was infringing. On behalf of Stephanie Lenz, EFF argued that this was a misrepresentation in violation of Section
512(f). In response, Universal argued that rightsholders have no obligation to consider fair use at all. The U.S. Court of Appeals for the Ninth Circuit
rejected that argument, correctly holding that the DMCA requires a rightsholder to consider whether the uses she targets in a DMCA notice are
actually lawful under the fair use doctrine. However, the court also held that a rightsholder's determination on that question passes muster as long as she subjectively believes it to be true. This leads to a virtually incoherent result: a rightsholder
must consider fair use, but has no incentive to actually learn what such a consideration should entail. After all, if she doesn't know what the fair use factors are, she can't be held liable for not applying them thoughtfully. We
were disappointed in that part of the ruling, but it came with a big silver lining: the court also held that fair use is not simply a narrow defense to copyright infringement but an affirmative public right. For decades, rightsholders and scholars had debated the
issue, with many preferring to construe fair use as narrowly as possible. Thanks to the Lenz decision, courts will be more likely to think of fair use, correctly, as a crucial vehicle for achieving the real purpose of copyright law: to promote the
public interest in creativity and innovation. And rightsholders are on notice: they must at least consider fair use before sending a takedown notice. Lenz and Universal filed
petitions requesting that the Supreme Court review the Ninth Circuit's ruling. The Supreme Court denied both
petitions. This meant that the case returned to the district court for trial on the question of whether Universal's takedown was a misrepresentation under the Ninth Circuit's subjective standard. Rather than go to trial, the parties have agreed to a
settlement. Lenz v. Universal helped make some great law on fair use and also played a role in leading to better takedown processes at Universal. EFF congratulates Stephanie Lenz for fighting the good fight, and we thank
our co-counsel at Keker, Van Nest & Peters LLP and Kwun Bhansali Lazarus LLP for being our partners through
this long journey.

European Parliament committee passed vote to hand over censorship of the internet to US corporate giants

20th June 2018

See article from bit-tech.net
The European Parliament's Committee on Legal Affairs (JURI) has officially approved Articles 11 and 13 of a Digital Single Market (DSM) copyright proposal, mandating censorship machines and a link tax. Articles 11 and 13 of the Directive of the
European Parliament and of the Council on Copyright in the Digital Single Market have been the subject of considerable campaigning from pro-copyleft groups including the Open Rights Group and Electronic Frontier Foundation of late. Article 11, as
per the final version of the proposal, discusses the implementation of a link tax - the requirement that any site citing third-party materials must either do so in a way that adheres to the exemptions and restrictions of 28 separate national copyright laws or pay for a licence to use and link to the material; Article 13, meanwhile, requires any site which allows users to post text, sound, program code, still or moving images, or any other work which can be copyrighted to automatically scan all such uploads against a database of copyright works - a database which they will be required to pay to access. Neither Article 11 nor Article 13 will become law until passed by the full European Parliament in a plenary vote. There's no definite timetable for when such a vote might take place, but it would likely happen sometime between December of this year and the first half of 2019.

In two days, an EU committee will vote to crown Google and Facebook permanent lords of internet censorship

19th June 2018

See article from boingboing.net CC by Cory Doctorow
On June 20, the EU's legislative committee will vote on the new Copyright directive, and decide whether it will include the controversial "Article 13" (automated
censorship of anything an algorithm identifies as a copyright violation) and "Article 11" (no linking to news stories without paid permission from the site). These proposals will make starting new internet companies
effectively impossible -- Google, Facebook, Twitter, Apple, and the other US giants will be able to negotiate favourable rates and build out the infrastructure to comply with these proposals, but no one else will. The EU's regional tech success stories
-- say Seznam.cz, a successful Czech search competitor to Google -- don't have $60-100 million lying around to build out their filters, and lack the leverage to extract favorable linking
licenses from news sites. If Articles 11 and 13 pass, American companies will be in charge of Europe's conversations, deciding which photos and tweets and videos can be seen by the public, and who may speak.
The MEP Julia Reda has written up the state of play on the vote, and it's very bad. Both left- and right-wing parties
have backed this proposal, including (incredibly) the French Front National, whose YouTube channel was just deleted by a copyright filter of the sort
they're about to vote to universalise.
So far, the focus in the debate has been on the intended consequences of the proposals: the idea that a certain amount of free expression and competition must be sacrificed to enable rightsholders to force Google and Facebook to
share their profits. But the unintended -- and utterly foreseeable -- consequences are even more important. Article 11's link tax allows news sites to decide who gets to link to them, meaning that they can exclude their critics.
With election cycles dominated by hoaxes and fake news, the right of a news publisher to decide who gets to criticise it is carte blanche to lie and spin. Article 13's copyright filters are even more vulnerable to attack: the
proposals contain no penalties for false claims of copyright ownership, but they do mandate that the filters must accept copyright claims in bulk, allowing rightsholders to upload millions of works at once in order to claim their copyright
and prevent anyone from posting them. That opens the doors to all kinds of attacks. The obvious one is that trolls might sow mischief by uploading millions of works they don't hold the copyright to, in order to prevent others from
quoting them: the works of Shakespeare, say, or everything ever posted to Wikipedia, or my novels, or your family photos. More insidious is the possibility of targeted strikes during crisis: stock-market manipulators could use
bots to claim copyright over news about a company, suppressing its sharing on social media; political actors could suppress key articles during referendums or elections; corrupt governments could use arms-length trolls to falsely claim ownership of
footage of human rights abuses. It's asymmetric warfare: falsely claiming a copyright will be easy (because the rightsholders who want this system will not tolerate jumping through hoops to make their claims) and instant (because
rightsholders won't tolerate delays when their new releases are being shared online at their moment of peak popularity). Removing a false claim of copyright will require that a human at an internet giant looks at it, sleuths out the truth of the
ownership of the work, and adjusts the database -- for millions of works at once. Bots will be able to pollute the copyright databases much faster than humans could possibly clear them. I spoke with Wired UK's KG Orphanides about
this, and their excellent article on the proposal is the best explanation I've seen of the uses of these copyright filters to create
unstoppable disinformation campaigns. Doctorow highlighted the potential for unanticipated abuse of any automated copyright filtering system to make false copyright claims, engage in targeted harassment and even
silence public discourse at sensitive times. "Because the directive does not provide penalties for abuse -- and because rightsholders will not tolerate delays between claiming copyright over a work and suppressing its public
display -- it will be trivial to claim copyright over key works at key moments or use bots to claim copyrights on whole corpuses. The nature of automated systems, particularly if powerful rightsholders insist that they default to
initially blocking potentially copyrighted material and then releasing it if a complaint is made, would make it easy for griefers to use copyright claims over, for example, relevant Wikipedia articles on the eve of a Greek debt-default referendum or,
more generally, public domain content such as the entirety of Wikipedia or the complete works of Shakespeare. "Making these claims will be MUCH easier than sorting them out -- bots can use cloud providers all over the world
to file claims, while companies like Automattic (WordPress) or Twitter, or even projects like Wikipedia, would have to marshal vast armies to sort through the claims and remove the bad ones -- and if they get it wrong and remove a legit copyright claim,
they face unbelievable copyright liability."
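
To put rough numbers on that asymmetry, here is a toy calculation in Python. Every figure is an invented assumption for illustration only, not data about any real platform or rightsholder.

```python
# Toy illustration of the claim/review asymmetry described above.
# All rates are invented assumptions, not measurements of any real system.

SECONDS_PER_DAY = 24 * 60 * 60

bot_claims_per_day = 1 * SECONDS_PER_DAY           # one automated claim per second
reviewers = 50                                      # hypothetical review team
claims_cleared_per_reviewer_per_day = 200           # generous manual throughput
cleared_per_day = reviewers * claims_cleared_per_reviewer_per_day

backlog_growth = bot_claims_per_day - cleared_per_day
print(f"claims filed per day:   {bot_claims_per_day:,}")   # 86,400
print(f"claims cleared per day: {cleared_per_day:,}")      # 10,000
print(f"daily backlog growth:   {backlog_growth:,}")       # 76,400
```

Even with a deliberately generous review capacity, the backlog in this toy model grows by tens of thousands of claims a day, which is the asymmetry the critics are pointing at.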

Nascent censorship machines already rise up against the stupid politicians that support their Genesis

18th June 2018

See article from privateinternetaccess.com CC by Rick Falkvinge
Politicians, about to vote in favor of mandatory upload filtering in Europe, get channel deleted by YouTube's upload filtering. French politicians of the former Front National are furious: their entire YouTube channel was just
taken down by automatic filters at YouTube for alleged copyright violations. Perhaps this will cause them to reconsider next week's vote, which they have announced they will support: the bill that will make exactly this arbitrary, political, and
unilateral upload filtering mandatory all across Europe. The French party Front National, now renamed Rassemblement National (national rally point), which is one of the biggest parties in France, has gotten its YouTube channel disappeared on grounds of alleged copyright violations. In an interview with French Europe1, their leader Marine Le Pen calls the takedown arbitrary, political, and unilateral. Europe is about to vote on new copyright law next
week. Next Wednesday or Thursday. So let's disregard here for a moment that this happened to a party normally described as far-right, and observe that if it can happen to one of France's biggest parties regardless of their policies, then it can happen to
anyone for political reasons -- or any other reason. The broadcast named TVLibertés is gone, with YouTube stating: YouTube has blocked the broadcast of the newscast of Thursday, June 14 for copyright infringement.
Marine Le Pen was quoted as saying, This measure is completely false; we can easily assert a right of quotation [to illustrate why the material was well within the law to broadcast]. She's right. Automated
upload filters do not take into account when you have a legal right to broadcast copyrighted material for one of the myriad of valid reasons. They will just assume that such reasons never exist; if nothing else, to make sure that the hosting
platform steers clear of any liability. Political messages will be disappeared on mere allegations by a political opponent, just as might have happened here. And yet, the Rassemblement National is going to vote in favor of exactly this mandatory upload filtering. The horror they just described on national TV as arbitrary, political, and unilateral. It's hard to illustrate more clearly that Europe's politicians have absolutely no idea about the monster they're
voting on next week. The decisions to come will be unilateral, political, and arbitrary. Freedom of speech will be unilateral, political, and arbitrary. Just as Marine Le Pen says. Just as YouTube's Content ID filtering is today,
as has just been illustrated. The article mandating this unilateral, political, and arbitrary censorship is called Article 13 of the upcoming European Copyright bill, and it must be removed entirely. There is no fixing of
automated censorship machines. Privacy remains your own responsibility. So do your freedoms of speech, information, and expression.

The UN's free speech rapporteur condemns the EU's censorship machines that will violate human rights

17th June 2018

See article from techdirt.com
David Kaye, the UN's Special Rapporteur on freedom of expression, has now chimed in with a very thorough report, highlighting how Article 13 of the Directive -- the part about mandatory copyright filters -- would be a disaster for free speech and would violate the UN's Declaration of Human Rights, and in particular Article 19, which says: Everyone has the right to freedom of opinion and expression; this right includes freedom to hold opinions without interference and to
seek, receive and impart information and ideas through any media regardless of frontiers.
As Kaye's report notes, the upload filters of Article 13 of the Copyright Directive would almost certainly violate this principle.
Article 13 of the proposed Directive appears likely to incentivize content-sharing providers to restrict at the point of upload user-generated content that is perfectly legitimate and lawful. Although the latest proposed
versions of Article 13 do not explicitly refer to upload filters and other content recognition technologies, it couches the obligation to prevent the availability of copyright protected works in vague terms, such as demonstrating best efforts and taking
effective and proportionate measures. Article 13(5) indicates that the assessment of effectiveness and proportionality will take into account factors such as the volume and type of works and the cost and availability of measures, but these still leave
considerable leeway for interpretation. The significant legal uncertainty such language creates does not only raise concern that it is inconsistent with the Article 19(3) requirement that restrictions on freedom of expression
should be provided by law. Such uncertainty would also raise pressure on content sharing providers to err on the side of caution and implement intrusive content recognition technologies that monitor and filter user-generated content at the point of
upload. I am concerned that the restriction of user-generated content before its publication subjects users to restrictions on freedom of expression without prior judicial review of the legality, necessity and proportionality of such restrictions.
Exacerbating these concerns is the reality that content filtering technologies are not equipped to perform context-sensitive interpretations of the valid scope of limitations and exceptions to copyright, such as fair comment or reporting, teaching,
criticism, satire and parody.
Kaye further notes that copyright is not the kind of thing that an algorithm can readily determine, and the fact-specific and context-specific nature of copyright requires much more than just throwing
algorithms at the problem -- especially when a website may face legal liability for getting it wrong. The designation of such mechanisms as the main avenue to address users' complaints effectively delegates content
blocking decisions under copyright law to extrajudicial mechanisms, potentially in violation of minimum due process guarantees under international human rights law. The blocking of content -- particularly in the context of fair use and other
fact-sensitive exceptions to copyright -- may raise complex legal questions that require adjudication by an independent and impartial judicial authority. Even in exceptional circumstances where expedited action is required, notice-and-notice regimes and
expedited judicial process are available as less invasive means for protecting the aims of copyright law. In the event that content blocking decisions are deemed invalid and reversed, the complaint and redress mechanism
established by private entities effectively assumes the role of providing access to remedies for violations of human rights law. I am concerned that such delegation would violate the State's obligation to provide access to an effective remedy for
violations of rights specified under the Covenant. Given that most of the content sharing providers covered under Article 13 are profit-motivated and act primarily in the interests of their shareholders, they lack the qualities of independence and
impartiality required to adjudicate and administer remedies for human rights violations. Since they also have no incentive to designate the blocking as being on the basis of the proposed Directive or other relevant law, they may opt for the legally safer
route of claiming that the upload was a terms of service violation -- this outcome may deprive users of even the remedy envisioned under Article 13(7). Finally, I wish to emphasize that unblocking, the most common remedy available for invalid content
restrictions, may often fail to address financial and other harms associated with the blocking of time-sensitive content.
He goes on to point out that while large platforms may be able to deal with all of this, smaller ones are going to be
in serious trouble: I am concerned that the proposed Directive will impose undue restrictions on nonprofits and small private intermediaries. The definition of an online content sharing provider under Article 2(5) is
based on ambiguous and highly subjective criteria such as the volume of copyright protected works it handles, and it does not provide a clear exemption for nonprofits. Since nonprofits and small content sharing providers may not have the financial
resources to establish licensing agreements with media companies and other right holders, they may be subject to onerous and legally ambiguous obligations to monitor and restrict the availability of copyright protected works on their platforms. Although
Article 13(5)'s criteria for effective and proportionate measures take into account the size of the provider concerned and the types of services it offers, it is unclear how these factors will be assessed, further compounding the legal uncertainty that
nonprofits and small providers face. It would also prevent a diversity of nonprofit and small content-sharing providers from potentially reaching a larger size, and result in strengthening the monopoly of the currently established providers, which could
be an impediment to the right to science and culture as framed in Article 15 of the ICESCR.

16th June 2018

A report about 1200 internet censors working in Germany to delete Nazi symbols and insults of migrants. See article from independent.co.uk

More YouTube video and audio download sites closed down following legal pressure from the music industry

15th June 2018

See article from torrentfreak.com
Several video downloading and MP3 conversion tools have thrown in the towel this week, disabling all functionality following legal pressure. Pickvideo.net states that it received a cease and desist order, while Video-download.co and EasyLoad.co reference
the lawsuit against YouTube-MP3 as the reason for their decision. The music industry sees stream ripping as one of the largest piracy threats. The RIAA, IFPI, and BPI showed that they're serious about the issue when they filed legal action against
YouTube-MP3, the largest stream ripping site at the time. This case eventually resulted in a settlement where the site, once good for over a million daily visitors, agreed to shut down voluntarily last year. YouTube-MP3's demise was a clear
victory for the music groups, which swiftly identified their next targets, putting them under pressure, both in public and behind the scenes. This week this appears to have taken its toll on several stream ripping sites, which allowed users to
download videos from YouTube and other platforms, with the option to convert files to MP3s. The targets include Pickvideo.net, Video-download.co and Easyload.co, which all inform their users that they've thrown in the towel. With several million visits per month, Pickvideo is the largest of the three. According to the site, it took this drastic measure following a cease-and-desist letter.

UK Supreme Court rules that the cost of website blocking should not be borne by ISPs, and indirectly, internet users

14th June 2018

See article from theregister.co.uk
See article from openrightsgroup.org
The UK Supreme Court has today ruled that trade mark holders are not able to compel ISPs to bear the cost of implementing orders to block websites selling counterfeit goods. Open Rights Group acted as an intervener in this case. We argued that Internet service providers (ISPs), as innocent parties, should not bear the costs of website blocking, and that this was a long-standing principle of English law. Jim Killock, Executive Director of Open Rights
Group said: This case is important because if ISPs paid the costs of blocking websites, the result would be an increasing number of blocks for relatively trivial reasons and the costs would be passed to customers.
While rights holders may want websites blocked, it needs to be economically rational to ask for this.
Solicitor in the case David Allen Green said: I am delighted to have acted,
through my firm Preiskel, successfully for the Open Rights Group in their intervention. We intervened to say that those enforcing private rights on the internet should bear the costs of doing so, not others. This morning, the UK
Supreme Court held unanimously that the rights holders should bear the costs.
The main party to the case was BT, which opposed being forced to pay for costs incurred in blocking websites. Now rights-holders must reimburse ISPs for the costs of blocking rights-infringing material. Supreme Court judge Lord Sumption, one of five on the panel, ruled: There is no legal basis for requiring a party to shoulder the burden of remedying an injustice if
he has no legal responsibility for the infringement and is not a volunteer but is acting under the compulsion of an order of the court. It follows that in principle the rights-holders should indemnify the ISPs against their
compliance costs. Section 97A of the Copyright, Designs and Patents Act 1988 allows rights-holders to go to court and get a blocking order -- the question in the current case is who stumps up for the costs of complying with that order?
Of course this raises the question of who should pay for the mass porn website blocking that will be needed when the BBFC porn censorship regime starts its work.

Vint Cerf, Tim Berners-Lee, and Dozens of Other Computing Experts Oppose Article 13 of the EU's new internet censorship law

13th June 2018

See article from eff.org
See joint letter that was released today [pdf]
As Europe's latest copyright proposal heads to a critical vote on June 20-21, more than 70 Internet and
computing luminaries have spoken out against a dangerous provision, Article 13, that would require Internet platforms to automatically filter uploaded content. The group, which includes Internet pioneer Vint Cerf, the inventor of the World Wide Web Tim
Berners-Lee, Wikipedia co-founder Jimmy Wales, co-founder of the Mozilla Project Mitchell Baker, Internet Archive founder Brewster Kahle, cryptography expert Bruce Schneier, and net neutrality expert Tim Wu, wrote in a joint letter that was released today: By requiring Internet platforms to perform automatic filtering of all of the
content that their users upload, Article 13 takes an unprecedented step towards the transformation of the Internet, from an open platform for sharing and innovation, into a tool for the automated surveillance and control of its users.
The prospects for the elimination of Article 13 have continued to worsen. Until late last month, there was the hope that Member States (represented by the Council of the European Union) would find a compromise. Instead, their
final negotiating mandate doubled down on it. The last hope for defeating the proposal now lies with the European Parliament. On June 20-21 the Legal Affairs (JURI) Committee will vote on the proposal. If it votes against upload
filtering, the fight can continue in the Parliament's subsequent negotiations with the Council and the European Commission. If not, then automatic filtering of all uploaded content may become a mandatory requirement for all user content platforms that
serve European users. Although this will pose little impediment to the largest platforms such as YouTube, which already uses its Content ID system to filter content, the law will create an expensive barrier to entry for smaller platforms and startups, which may choose to establish or move their operations overseas in order to avoid the European law.
For those platforms that do establish upload filtering, users will find that their contributions--including video, audio, text, and even source code--will be monitored and potentially blocked if the automated system detects what it believes to be a copyright infringement. Inevitably, mistakes will happen. There is no
way for an automated system to reliably determine when the use of a copyright work falls within a copyright limitation or exception under European law, such as quotation or parody. Moreover, because these exceptions are not
consistent across Europe, and because there is no broad fair use right as in the United States, many harmless uses of copyright works in memes, mashups, and remixes probably are technically infringing even if no reasonable copyright owner would
object. If an automated system monitors and filters out these technical infringements, then the permissible scope of freedom of expression in Europe will be radically curtailed, even without the need for any substantive changes in copyright law.
The upload filtering proposal stems from a misunderstanding about the purpose of copyright. Copyright isn't designed to compensate creators for each and every use of their works. It is meant to incentivize creators as part of an effort to promote the public interest in innovation and expression. But that public interest isn't served unless there are limitations on copyright that allow new generations to build and comment on the previous contributions. Those limitations are both legal, like fair dealing, and practical, like the zone of tolerance for harmless uses. Automated upload
filtering will undermine both. The authors of today's letter write: We support the consideration of measures that would improve the ability for creators to receive fair remuneration for the use
of their works online. But we cannot support Article 13, which would mandate Internet platforms to embed an automated infrastructure for monitoring and censorship deep into their networks. For the sake of the Internet's future, we urge you to vote for
the deletion of this proposal.
What began as a bad idea offered up to copyright lobbyists as a solution to an imaginary "value gap" has now become an outright crisis for the future of the Internet as we know it. Indeed, if
those who created and sustain the operation of the Internet recognize the scale of this threat, we should all be sitting up and taking notice. If you live in Europe or have European friends or family, now could be your last
opportunity to avert the upload filter. Please take action by clicking the button below, which will take you to a campaign website where you can phone, email, or Tweet at your representatives, urging them to stop this threat to the global Internet before
it's too late. Take Action at saveyourinternet.eu

11th June 2018

Big corporate lobbies are demanding these new copyright laws, hoping to make additional profits and gain more control over the web. By MEP Julia Reda. See article from juliareda.eu

The EU's Copyright Proposal is Extremely Bad News for Everyone, Even (Especially!) Wikipedia. By Cory Doctorow

10th June 2018

See article from eff.org CC by Cory Doctorow
The pending update to the EU Copyright Directive is coming up for a committee vote on June 20 or 21 and a parliamentary vote either in early July or late September. While the directive fixes some longstanding problems with EU rules, it creates much, much
larger ones: problems so big that they threaten to wreck the Internet itself. Under Article 13 of the
proposal, sites that allow users to post text, sounds, code, still or moving images, or other copyrighted works for public consumption will have to filter all their users' submissions against a database of copyrighted works. Sites will have to pay to license the technology to match submissions to the database, and to identify near matches as well as exact ones. Sites will be required to have a process to allow rightsholders to update this list with more copyrighted works.

Even under the best of circumstances, this presents huge problems. Algorithms that do content-matching are frankly terrible at it. The Made-in-the-USA version of this is YouTube's Content ID system, which improperly flags legitimate works all the time, but still gets flak from entertainment companies for not doing more.
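
To make the matching step concrete, here is a minimal, hypothetical sketch in Python of what "filtering submissions against a database of claimed works" amounts to. It is not Content ID or any real vendor's product; the shingle size, threshold and function names are invented assumptions.

```python
# Minimal sketch (not any real filter such as Content ID): a toy "near match"
# check that fingerprints an upload by hashing overlapping word shingles and
# measures how much it overlaps with fingerprints registered by claimants.
import hashlib

def fingerprints(text: str, shingle_size: int = 8) -> set[str]:
    """Hash every run of `shingle_size` consecutive words."""
    words = text.lower().split()
    return {
        hashlib.sha256(" ".join(words[i:i + shingle_size]).encode()).hexdigest()
        for i in range(max(len(words) - shingle_size + 1, 1))
    }

# Hypothetical claims database: work id -> fingerprints supplied by a claimant.
claims_db: dict[str, set[str]] = {}

def register_claim(work_id: str, work_text: str) -> None:
    claims_db[work_id] = fingerprints(work_text)

def check_upload(upload_text: str, threshold: float = 0.3):
    """Return (blocked?, best matching work id) based on fingerprint overlap."""
    upload_fp = fingerprints(upload_text)
    best_id, best_ratio = None, 0.0
    for work_id, fp in claims_db.items():
        ratio = len(upload_fp & fp) / max(len(fp), 1)
        if ratio > best_ratio:
            best_id, best_ratio = work_id, ratio
    return best_ratio >= threshold, best_id
```

Note that nothing in this loop knows whether the claimant actually owns the work, or whether the match is a quotation, parody or other lawful use -- which is exactly the gap the rest of the article describes.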
There are lots of legitimate reasons for Internet users to upload copyrighted works. You might upload a clip from a nightclub (or a protest, or a technical presentation) that includes some copyrighted music in the background. Or
you might just be wearing a t-shirt with your favorite album cover in your Tinder profile. You might upload the cover of a book you're selling on an online auction site, or you might want to post a photo of your sitting room in the rental listing for
your flat, including the posters on the wall and the picture on the TV. Wikipedians have even more specialised reasons to upload material: pictures of celebrities, photos taken at newsworthy events, and so on.
But the bots that Article 13 mandates will not be perfect. In fact, by design, they will be wildly imperfect. Article 13 punishes any site that fails to block copyright infringement, but it won't punish people
who abuse the system. There are no penalties for falsely claiming copyright over someone else's work, which means that someone could upload all of Wikipedia to a filter system (for instance, one of the many sites that incorporate Wikipedia's content into their own databases) and then claim ownership over it on Twitter, Facebook and Wordpress, and everyone else would be prevented from quoting Wikipedia on any of those services until they sorted out the false claims. It will be a lot easier to make these false claims than it will be to figure out which of the hundreds of millions of copyright claims are real and which ones are pranks or hoaxes or censorship attempts. Article 13 also leaves you out in the cold when your own
work is censored thanks to a malfunctioning copyright bot. Your only option when you get censored is to raise an objection with the platform and hope they see it your way--but if they fail to give real consideration to your petition, you have to go to
court to plead your case. Article 13 gets Wikipedia coming and going: not only does it create opportunities for unscrupulous or incompetent people to block the sharing of Wikipedia's content beyond its bounds, it could also
require Wikipedia to filter submissions to the encyclopedia and its surrounding projects, like Wikimedia Commons. The drafters of Article 13 have
tried to carve Wikipedia out of the rule, but thanks to sloppy drafting, they have failed: the exemption is limited
to "noncommercial activity". Every file on Wikipedia is licensed for commercial use. Then there's the websites that Wikipedia relies on as references. The fragility and impermanence of links is already a serious problem
for Wikipedia's crucial footnotes, but after Article 13 becomes law, any information hosted in the EU might disappear--and links to US mirrors might become infringing--at any moment thanks to an overzealous copyright bot. For these reasons and many more,
the Wikimedia Foundation has taken a public position condemning Article 13. Speaking of references: the
problems with the new copyright proposal don't stop there. Under Article 11, each member state will get to create a new copyright in news. If it passes, in order to link to a news website, you will either have to do so in a way that satisfies the
limitations and exceptions of all 28 laws, or you will have to get a license. This is fundamentally incompatible with any sort of wiki (obviously), much less Wikipedia. It also means that the websites that Wikipedia relies on for
its reference links may face licensing hurdles that would limit their ability to cite their own sources. In particular, news sites may seek to withhold linking licenses from critics who want to quote from them in order to analyze, correct and critique
their articles, making it much harder for anyone else to figure out where the positions are in debates, especially years after the fact. This may not matter to people who only pay attention to news in the moment, but it's a blow to projects that seek to
present and preserve long-term records of noteworthy controversies. And since every member state will get to make its own rules for quotation and linking, Wikipedia posts will have to satisfy a patchwork of contradictory rules, some of which are already
so severe that they'd ban any items in a "Further Reading" list unless the article directly referenced or criticized them. The controversial measures in the new directive have been tried before. For example, link taxes
were tried in Spain and Germany and they failed, and publishers don't want them. Indeed, the only country to embrace this idea as workable is China, where mandatory copyright enforcement bots have become part of the national toolkit for controlling
public discourse. Articles 13 and 11 are poorly thought through, poorly drafted, unworkable--and dangerous. The collateral damage they will impose on every realm of public life can't be overstated. The Internet, after all, is
inextricably bound up in the daily lives of hundreds of millions of Europeans and an entire constellation of sites
and services will be adversely affected by Article 13. Europe can't afford to place education, employment, family life, creativity, entertainment, business, protest, politics, and a thousand other activities at the mercy of unaccountable algorithmic
filters. If you're a European concerned about these proposals, here's a tool for contacting your MEP .

TorrentFreak explains the grave threat to internet users and European small businesses

6th June 2018

See article from torrentfreak.com cc
See also saveyourinternet.eu
The EU's plans to modernize copyright law in Europe are moving ahead. With a crucial vote coming up later this month, protests from various opponents are on the rise as well. They warn that the proposed plans will result in Internet filters which
threaten people's ability to freely share content online. According to Pirate Party MEP Julia Reda, these filters will hurt not only regular Internet users but also creators and businesses. In September 2016, the European Commission
published its proposal for a modernized copyright law. Among other things, it proposed measures to require online services to do more to fight piracy. Specifically, Article 13 of the proposed Copyright Directive will require
online services to track down and delete pirated content, in collaboration with rightsholders. The Commission stressed that the changes are needed to support copyright holders. However, many legal scholars, digital activists, politicians, and members of the public worry that they will violate the rights of regular Internet users. Last month the EU Council finalized the latest version of the proposal. This means that the matter now goes to the Legal
Affairs Committee of the Parliament (JURI), which must decide how to move ahead. This vote is expected to take place in two weeks. Although the term filter is commonly used to describe Article 13, it is not directly mentioned in
the text itself. According to Pirate Party Member of the European Parliament (MEP) Julia Reda, the filter keyword is avoided in the proposal to prevent a possible violation of EU law and the Charter of Fundamental Rights. However, the
outcome is essentially the same. In short, the relevant text states that online services are liable for any uploaded content unless they take effective and proportionate action to prevent copyright infringements, identified by
copyright holders. That also includes preventing these files from being reuploaded. The latter implies some form of hash filtering and continuous monitoring of all user uploads. Several companies, including Google Drive, Dropbox,
and YouTube already have these types of filters, but many others don't. A main point of critique is that the automated upload checks will lead to overblocking, as they are often ill-equipped to deal with issues such as fair use.
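
As a rough illustration of why a re-upload ("staydown") obligation implies hash filtering, here is a minimal hypothetical sketch in Python. It is not any service's actual code; the function names are invented.

```python
# Minimal sketch, not any real service's system: the simplest form of
# "notice and staydown" -- once a file is taken down, its hash goes on a
# blocklist and any byte-identical re-upload is rejected automatically.
import hashlib

takedown_hashes: set[str] = set()   # hashes of files already removed

def file_hash(data: bytes) -> str:
    return hashlib.sha256(data).hexdigest()

def register_takedown(data: bytes) -> None:
    """Called when a rightsholder's takedown is accepted for this file."""
    takedown_hashes.add(file_hash(data))

def allow_upload(data: bytes) -> bool:
    """Reject any upload whose hash matches previously removed content."""
    return file_hash(data) not in takedown_hashes
```

Exact hashes are trivially evaded by re-encoding a file, which is why real systems rely on fuzzy fingerprinting instead -- and neither approach can tell a licensed or fair-dealing upload from an infringing one, which is the overblocking concern raised above.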
The proposal would require platforms to filter all uploads by their users for potential copyright infringements -- not just YouTube and Facebook, but also services like WordPress, TripAdvisor, or even Tinder. We know from
experience that these algorithmic filters regularly make mistakes and lead to the mass deletion of legal uploads, Julia Reda tells TF. Especially small independent creators frequently see their content taken down because others
wrongfully claim copyright on their works. There are no safeguards in the proposal against such cases of copyfraud. Besides affecting uploads of regular Internet users and smaller creators, many businesses will also be hit. They
will have to make sure that they can detect and prevent infringing material from being shared on their systems. This will give larger American Internet giants, who already have these filters in place, a competitive edge over
smaller players and new startups, the Pirate Party MEP argues. It will make those Internet giants even stronger, because they will be the only ones able to develop and sell the filtering technologies necessary to comply with the
law. A true lose-lose situation for European Internet users, authors and businesses, Reda tells us. Based on the considerable protests in recent days, the current proposal is still seen as a clear threat by many.
In fact, the Save Your Internet campaign, backed by prominent organizations such as Creative Commons, EFF, and Open Media, is ramping up again. They urge the
European public to reach out to their Members of Parliament before it's too late. Should Article 13 of the Copyright Directive proposal be adopted, it will impose widespread censorship of all the content you share online. The
European Parliament is the only one that can step in and Save your Internet, they write. The full Article 13 text includes some language to limit its scope. The nature and size of online services must be taken into account, for
example. This means that a small and legitimate niche service with a few dozen users might not be directly liable if it operates without these anti-piracy measures. Similarly, non-profit organizations will not be required to
comply with the proposed legislation, although there are calls from some member states to change this. In addition to Article 13, there is also considerable pushback from the public against Article 11, which is regularly referred
to as the link tax. At the moment, several organizations are planning a protest day next week, hoping to mobilize the public to speak out. A week later, following the JURI vote, it will be judgment day. If the plans pass the Committee, they will progress towards the final vote on copyright reform next spring. This also means that they'll become much harder to stop or change. That has been done before, such as with ACTA, but achieving that type of momentum
will be a tough challenge.

The Open Rights Group finds that nearly 40% of court order blocks are in error

5th June 2018

See article from openrightsgroup.org
Open Rights Group today released figures that show that High Court injunctions are being improperly administered by ISPs and rights holders. A new tool added to its blocked.org.uk project examines over 1,000 domains blocked under the UK's 30 injunctions against over 150 services. ORG found that 37% of those domains are blocked in error, or without any legal basis. The majority of the domains blocked are parked domains, or domains no longer used by infringing services. One Sci-Hub domain is blocked without an injunction, and a likely trademark-infringing site is also blocked without an injunction. However, the full list of blocked domains is believed to be around 2,500 domains and is not made public, so ORG is unable to check for all possible mistakes. Jim Killock, Executive Director of Open
Rights Group said: It is not acceptable for a legal process to result in nearly 40% maladministration. These results show a great deal of carelessness. We expect ISPs and rights holders to
examine our results and remove the errors we have found as swiftly as possible. We want ISPs to immediately release lists of previously blocked domains, so we can check blocks are being removed by everyone.
Rights holders must make public exactly what is being blocked, so we can ascertain how else these extremely wide legal powers are being applied.
ORG's conclusions are:
The administration process of adding and subtracting domains to be blocked is very poor.
Keeping the lists secret makes it impossible to check errors.
Getting mistakes corrected is opaque: the ISP pages suggest you go to court.
Examples: Some are potentially subject to an injunction which has not been sought, for instance: http://www.couchtuner.es
One directs to a personal blog: http://kat.kleisauke.nl
Full results and statistical breakdowns: https://www.blocked.org.uk/legal-blocks/errors
Export full results: https://www.blocked.org.uk/legal-blocks
For a list of UK injunctions, see: The UK has 30 copyright and trademark injunctions, blocking over 150 websites. https://wiki.451unavailable.org.uk/wiki/Main_Page
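
For a sense of how this kind of block-checking can be automated, here is a hypothetical sketch in Python of probing a domain list from a single network. It is not ORG's blocked.org.uk implementation; the block-page heuristic and the placeholder domains are invented for illustration.

```python
# Hypothetical sketch of probing a domain list for court-order blocking.
# This is NOT the blocked.org.uk implementation; heuristics are invented.
import urllib.request
import urllib.error

def looks_blocked(domain: str, timeout: float = 10.0) -> str:
    """Fetch the domain from this network and apply a crude heuristic."""
    try:
        with urllib.request.urlopen(f"http://{domain}/", timeout=timeout) as resp:
            body = resp.read(4096).decode(errors="replace").lower()
            # Many UK ISPs serve a branded "blocked by court order" page.
            if "court order" in body or "blocked" in body:
                return "probably blocked (block page)"
            return "reachable"
    except urllib.error.HTTPError as err:
        return f"http error {err.code}"
    except Exception as err:          # DNS failure, reset, timeout...
        return f"unreachable ({err.__class__.__name__})"

if __name__ == "__main__":
    for domain in ["example.com", "example.org"]:   # placeholder list
        print(domain, "->", looks_blocked(domain))
```

A real checker would need to probe through each ISP's network and distinguish block pages from parked domains and ordinary outages, which is what makes verifying the secret block lists so laborious.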

4th June 2018

What is it, why is it controversial and what will it mean for memes? Critics of the proposed EU directive on copyright warn that it will censor internet users. See article from alphr.com

Sesame Street sues puppet movie over reference to their characters

31st May 2018

26th May 2018. See article from xbiz.com
See trailer from YouTube
Creators of Sesame Street are suing the production company behind The Happytime Murders, claiming the mainstream comedy that features ejaculating puppets and other sexual puppetry routines is appropriating its brand. Sesame Workshop, creator of the kids' show, alleges that the misuse of its brand is intended to confuse the public and infringes on its intellectual property rights. The company has initiated a lawsuit as a result of a trailer with explicit, profane, drug-using, misogynistic, violent, copulating and even ejaculating puppets, along with the tagline 'NO SESAME. ALL STREET'. The Happytime Murders, set for an August release, is a murder mystery revolving around puppets who exhibit raunchy behavior.
Update: Judge not impressed by Sesame Street claims 31st May 2018. See article from pagesix.com
Manhattan federal Judge Vernon Broderick has rejected a request by the Sesame Workshop for a temporary restraining order to halt ads for the upcoming comedy Happytime Murders, including a YouTube trailer with the tagline, No Sesame.
All Street. Broderick ruled that the STX film -- directed by Brian Henson, the son of the late Jim Henson, whose Muppets have been central characters in the children's mainstay since its inception in 1969 -- was geared toward an entirely
different audience than Sesame Street. He also found that the trailer's No Sesame. All Street tagline was intended to differentiate the raunchy adult film from the wholesome educational show featuring Big Bird and Oscar the Grouch. The judge added:
I find the use of the tagline to disclaim -- albeit in a short and pithy manner.

Music industry is quick to lobby for Hancock's safe internet plans to be hijacked for their benefit

24th May 2018

See article from torrentfreak.com
This week, Matt Hancock, Secretary of State for Digital, Culture, Media and Sport, announced the launch of a consultation on new legislative measures to clean up the Wild West elements of the Internet. In response, music group BPI says the government
should use the opportunity to tackle piracy with advanced site-blocking measures, repeat infringer policies, and new responsibilities for service providers. This week, the Government published its response to the Internet Safety Strategy green paper ,
stating unequivocally that more needs to be done to tackle online harm. As a result, the Government will now carry through with its threat to introduce new legislation, albeit with the assistance of technology companies, children's charities and other
stakeholders. While emphasis is being placed on hot-button topics such as cyberbullying and online child exploitation, the Government is clear that it wishes to tackle the full range of online harms. That has been greeted by UK music group BPI
with a request that the Government introduces new measures to tackle Internet piracy. In a statement issued this week, BPI chief executive Geoff Taylor welcomed the move towards legislative change and urged the Government to encompass the music
industry and beyond. He said: This is a vital opportunity to protect consumers and boost the UK's music and creative industries. The BPI has long pressed for internet intermediaries and online platforms to take
responsibility for the content that they promote to users. Government should now take the power in legislation to require online giants to take effective, proactive measures to clean illegal content from their sites and services.
This will keep fans away from dodgy sites full of harmful content and prevent criminals from undermining creative businesses that create UK jobs.
The BPI has published four initial requests, each of which provides food for thought.
The demand to establish a new fast-track process for blocking illegal sites is not entirely unexpected, particularly given the expense of launching applications for blocking injunctions at the High Court. The BPI has taken a large number of
actions against individual websites -- 63 injunctions are in place against sites that are wholly or mainly infringing and whose business is simply to profit from criminal activity, the BPI says. Those injunctions can be expanded fairly easily to
include new sites operating under similar banners or facilitating access to those already covered, but it's clear the BPI would like something more streamlined. Voluntary schemes, such as the one in place in Portugal , could be an option but it's unclear
how troublesome that could be for ISPs. New legislation could solve that dilemma, however. Another big thorn in the side for groups like the BPI are people and entities that post infringing content. The BPI is very good at taking these listings
down from sites and search engines in particular (more than 600 million requests to date) but it's a game of whac-a-mole the group would rather not engage in. With that in mind, the BPI would like the Government to impose new rules that would
compel online platforms to stop content from being re-posted after it's been taken down while removing the accounts of repeat infringers. Thirdly, the BPI would like the Government to introduce penalties for online operators who do not provide
transparent contact and ownership information. The music group isn't any more specific than that, but the suggestion is that operators of some sites have a tendency to hide in the shadows, something which frustrates enforcement activity. Finally,
and perhaps most interestingly, the BPI is calling on the Government to legislate for a new duty of care for online intermediaries and platforms. Specifically, the BPI wants effective action taken against businesses that use the Internet to encourage
consumers to access content illegally. While this could easily encompass pirate sites and services themselves, this proposal has the breadth to include a wide range of offenders, from people posting piracy-focused tutorials on monetized YouTube
channels to those selling fully-loaded Kodi devices on eBay or social media. Overall, the BPI clearly wants to place pressure on intermediaries to take action against piracy when they're in a position to do so, and particularly those who may not
have shown much enthusiasm towards industry collaboration in the past. Legislation in this Bill, to take powers to intervene with respect to operators that do not co-operate, would bring focus to the roundtable process and ensure that
intermediaries take their responsibilities seriously, the BPI says.

7th May 2018

YouTube has 'how to' videos for pretty much everything. See article from torrentfreak.com

147 European organisations oppose a disgraceful new EU copyright law that will give rise to machines to automatically censor people's internet posts

27th April 2018

See open letter from indexoncensorship.org
Directive on copyright in the Digital Single Market destined to become a nightmare

OPEN LETTER IN LIGHT OF THE 27 APRIL 2018 COREPER I MEETING

Your Excellency Ambassador,
cc. Deputy Ambassador,
We, the undersigned, are writing to you ahead of your COREPER discussion on the proposed Directive on copyright in the Digital Single Market. We are deeply concerned that the text proposed by the Bulgarian
Presidency in no way reflects a balanced compromise, whether on substance or from the perspective of the many legitimate concerns that have been raised. Instead, it represents a major threat to the freedoms of European citizens and businesses and
promises to severely harm Europe's openness, competitiveness, innovation, science, research and education. A broad spectrum of European stakeholders and experts, including academics, educators, NGOs representing human rights and
media freedom, software developers and startups have repeatedly warned about the damage that the proposals would cause. However, these have been largely dismissed in rushed discussions taking place without national experts being present. This rushed
process is all the more surprising when the European Parliament has already announced it would require more time (until June) to reach a position and is clearly adopting a more cautious approach. If no further thought is put in
the discussion, the result will be a huge gap between stated intentions and the damage that the text will actually achieve if the actual language on the table remains:
Article 13 (user uploads) creates a liability regime for a vast area of online platforms that negates the E-commerce Directive, against the stated will of many Member States, and without any proper assessment of its impact. It
creates a new notice and takedown regime that does not require a notice. It mandates the use of filtering technologies across the board. Article 11 (press publisher's right) only contemplates creating publisher rights despite
the many voices opposing it and highlighting its flaws, despite the opposition of many Member States and despite such Member States proposing several alternatives including a "presumption of transfer". Article 3
(text and data mining) cannot be limited in terms of scope of beneficiaries or purposes if the EU wants to be at the forefront of innovations such as artificial intelligence. It can also not become a voluntary provision if we want to leverage the wealth
of expertise of the EU's research community across borders. Articles 4 to 9 must create an environment that enables educators, researchers, students and cultural heritage professionals to embrace the digital environment and
be able to preserve, create and share knowledge and European culture. It must be clearly stated that the proposed exceptions in these Articles cannot be overridden by contractual terms or technological protection measures.

The interaction of these various articles has not even been the subject of a single discussion. The filters of Article 13 will cover the snippets of Article 11 whilst the limitations of Article 3 will be amplified by the rights created through Article 11, yet none of these aspects have even been assessed.
With so many legal uncertainties and collateral damages still present, this legislation is currently destined to become a nightmare when it will have to be transposed into national legislation and face the test of its legality in
terms of the Charter of Fundamental Rights and the Berne Convention. We hence strongly encourage you to adopt a decision-making process that is evidence-based, focussed on producing copyright rules that are fit for purpose and on
avoiding unintended, damaging side effects. Yours sincerely, The over 145 signatories of this open letter -- European and global organisations, as well as national organisations from 28 EU Member States,
represent human and digital rights, media freedom, publishers, journalists, libraries, scientific and research institutions, educational institutions including universities, creator representatives, consumers, software developers, start-ups, technology
businesses and Internet service providers. EUROPE 1. Access Info Europe. 2. Allied for Startups. 3. Association of European Research Libraries (LIBER). 4. Civil Liberties Union for Europe
(Liberties). 5. Copyright for Creativity (C4C). 6. Create Refresh Campaign. 7. DIGITALEUROPE. 8. EDiMA. 9. European Bureau of Library, Information and Documentation Associations (EBLIDA). 10. European Digital Learning
Network (DLEARN). 11. European Digital Rights (EDRi). 12. European Internet Services Providers Association (EuroISPA). 13. European Network for Copyright in Support of Education and Science (ENCES). 14. European University Association
(EUA). 15. Free Knowledge Advocacy Group EU 16. Lifelong Learning Platform. 17. Public Libraries 2020 (PL2020). 18. Science Europe. 19. South East Europe Media Organisation (SEEMO). 20. SPARC Europe.
AUSTRIA 21. Freischreiber Österreich. 22. Internet Service Providers Austria (ISPA Austria). BELGIUM 23. Net Users' Rights Protection Association (NURPA) BULGARIA
24. BESCO -- Bulgarian Startup Association. 25. BlueLink Foundation. 26. Bulgarian Association of Independent Artists and Animators (BAICAA). 27. Bulgarian Helsinki Committee. 28. Bulgarian Library and Information Association
(BLIA). 29. Creative Commons Bulgaria. 30. DIBLA. 31. Digital Republic. 32. Hamalogika. 33. Init Lab. 34. ISOC Bulgaria. 35. LawsBG. 36. Obshtestvo.bg. 37. Open Project Foundation. 38. PHOTO
Forum. 39. Wikimedians of Bulgaria. CROATIA 40. Code for Croatia CYPRUS 41. Startup Cyprus CZECH REPUBLIC 42. Alliance pro otevrene vzdelavani (Alliance
for Open Education) 43. Confederation of Industry of the Czech Republic. 44. Czech Fintech Association. 45. Ecumenical Academy. 46. EDUin. DENMARK 47. Danish Association of Independent
Internet Media (Prauda) ESTONIA 48. Wikimedia Eesti FINLAND 49. Creative Commons Finland. 50. Open Knowledge Finland. 51. Wikimedia Suomi. FRANCE 52.
Abilian. 53. Alliance Libre. 54. April. 55. Aquinetic. 56. Conseil National du Logiciel Libre (CNLL). 57. France Digitale. 58. l'ASIC. 59. Ploss Auvergne-Rhône-Alpes (PLOSS-RA). 60. Renaissance Numérique. 61.
Syntec Numérique. 62. Tech in France. 63. Wikimédia France. GERMANY 64. Arbeitsgemeinschaft der Medieneinrichtungen an Hochschulen e.V. (AMH). 65. Bundesverband Deutsche Startups. 66.
Deutscher Bibliotheksverband e.V. (dbv). 67. eco -- Association of the Internet Industry. 68. Factory Berlin. 69. Initiative gegen ein Leistungsschutzrecht (IGEL). 70. Jade Hochschule Wilhelmshaven/Oldenburg/Elsfleth. 71.
Karlsruhe Institute of Technology (KIT). 72. Landesbibliothekszentrum Rheinland-Pfalz. 73. Silicon Allee. 74. Staatsbibliothek Bamberg. 75. Ubermetrics Technologies. 76. Universitäts- und Landesbibliothek Sachsen-Anhalt
(Martin-Luther-University Halle-Wittenberg). 77. University Library of Kaiserslautern (Technische Universität Kaiserslautern). 78. Verein Deutscher Bibliothekarinnen und Bibliothekare e.V. (VDB). 79. ZB MED -- Information Centre for Life
Sciences. GREECE 80. Greek Free Open Source Software Society (GFOSS) HUNGARY 81. Hungarian Civil Liberties Union. 82. ICT Association of Hungary -- IVSZ. 83.
K-Monitor IRELAND 84. Technology Ireland ITALY 85. Hermes Center for Transparency and Digital Human Rights. 86. Istituto Italiano per la Privacy e la Valorizzazione dei
Dati. 87. Italian Coalition for Civil Liberties and Rights (CILD). 88. National Online Printing Association (ANSO). LATVIA 89. Startin.LV (Latvian Startup Association). 90. Wikimedians of Latvia
User Group. LITHUANIA 91. Aresi Labs. LUXEMBOURG 92. Frënn vun der Ënn. MALTA 93. Commonwealth Centre for
Connected Learning NETHERLANDS 94. Dutch Association of Public Libraries (VOB) 95. Kennisland. POLAND 96. Centrum Cyfrowe. 97. Coalition for Open Education
(KOED). 98. Creative Commons Polska. 99. Elektroniczna BIBlioteka (EBIB Association). 100. ePaństwo Foundation. 101. Fundacja Szkoła z Klasą (School with Class Foundation). 102. Modern Poland Foundation. 103. Ośrodek
Edukacji Informatycznej i Zastosowań Komputerów w Warszawie (OEIiZK). 104. Panoptykon Foundation. 105. Startup Poland. 106. ZIPSEE. PORTUGAL 107. Associação D3 -- Defesa dos Direitos Digitais
(D3). 108. Associação Ensino Livre. 109. Associação Nacional para o Software Livre (ANSOL). 110. Associação para a Promoção e Desenvolvimento da Sociedade da Informação (APDSI). ROMANIA 111.
ActiveWatch. 112. APADOR-CH (Romanian Helsinki Committee). 113. Association for Technology and Internet (ApTI) 114. Association of Producers and Dealers of IT&C equipment (APDETIC). 115. Center for Public Innovation. 116. Digital
Citizens Romania. 117. Kosson.ro Initiative. 118. Mediawise Society. 119. National Association of Public Librarians and Libraries in Romania (ANBPR). SLOVAKIA 120. Creative Commons
Slovakia. 121. Slovak Alliance for Innovation Economy (SAPIE). SLOVENIA 122. Digitas Institute. 123. Forum za digitalno družbo (Digital Society Forum). SPAIN 124. Asociación de Internautas. 125. Asociación Española de Startups (Spanish Startup Association)
126. MaadiX. 127. Sugus. 128. Xnet. SWEDEN 129. Wikimedia Sverige UK 130. Libraries and Archives Copyright Alliance (LACA). 131. Open Rights Group
(ORG). 132. techUK. GLOBAL 133. ARTICLE 19. 134. Association for Progressive Communications (APC). 135. Center for Democracy & Technology (CDT). 136. COMMUNIA Association. 137.
Computer and Communications Industry Association (CCIA). 138. Copy-Me. 139. Creative Commons. 140. Electronic Frontier Foundation (EFF). 141. Electronic Information for Libraries (EIFL). 142. Index on Censorship. 143.
International Partnership for Human Rights (IPHR). 144. Media and Learning Association (MEDEA). 145. Open Knowledge International (OKI). 146. OpenMedia. 147. Software Heritage
|
|
The EU's latest copyright proposal is so bad, it even outlaws Creative Commons licenses
|
|
|
|
12th April 2018
|
|
| See article from boingboing.net CC by Cory Doctorow |
The EU is mooting a new copyright regime for the largest market in the world, and the Commissioners who are drafting the new rules are completely captured by the entertainment industry, to the extent that they have ignored their own experts and produced
a farcical Big Content wishlist that includes the most extensive internet censorship regime the world has ever seen, perpetual monopolies for the biggest players, and a ban on European creators using Creative Commons licenses to share their works.
Under the new rules, anyone who allows the public to post material will have to maintain vast databases of copyrighted works
claimed by rightsholders, and any public communication that matches anything in these databases has to be blocked. These databases have been tried on much more modest scales -- Youtube's Content ID is a prominent example -- and they're a mess.
Because rightsholders are free to upload anything and claim ownership of it, Content ID is a font of garbagey, sloppy, fraudulent copyright abuse:
five different companies claim to own the rights to white noise ; Samsung claims to
own any drawing of its phones ; Nintendo claims it
owns gamers' animated mashups ; Sony
claims it owns stock footage it stole from a filmmaker whose work it had censored; the biggest music companies in the world
all claim to own the rights to "Silent Night"; a rogues' gallery of sleazy copyfraudsters
claim to own NASA's spacecraft landing footage -- all in all,
these systems benefit the large and the unethical at the expense of the small and nimble. That's just for starters.
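To make concrete why these abuses follow from the design, here is a minimal, purely illustrative Python sketch of the logic such an upload filter implements. Every name in it is hypothetical, and the naive hash matching stands in for the far more complex perceptual fingerprinting that real systems such as Content ID use; the point is that the platform only checks whether an upload matches a fingerprint someone has claimed, not whether the claimant owns the work or whether the use is lawful.

# Hypothetical sketch of an Article 13-style upload filter (illustrative only).
import hashlib

# Database of fingerprints for works claimed by rightsholders.
# Anyone can add entries; nothing here verifies that the claimant
# actually owns the work -- the core problem described above.
claimed_works = set()

def fingerprint(data: bytes) -> str:
    # Reduce content to a fingerprint (here, simply a SHA-256 hash).
    return hashlib.sha256(data).hexdigest()

def register_claim(work: bytes) -> None:
    # A rightsholder (or anyone posing as one) registers a claim.
    claimed_works.add(fingerprint(work))

def handle_upload(upload: bytes) -> str:
    # Block any upload that matches a claimed work; there is no
    # ownership check and no fair-use or quotation exception.
    if fingerprint(upload) in claimed_works:
        return "blocked"
    return "published"

# Example: a claim on "white noise" blocks every later upload of it,
# regardless of who actually owns it or whether the use is lawful.
register_claim(b"white noise sample")
print(handle_upload(b"white noise sample"))   # -> blocked
print(handle_upload(b"my own home video"))    # -> published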
Since these filter systems are incredibly expensive to create and operate, anyone who wants to get into business competing with the companies that grew large without having to create systems like these will have to source hundreds of
millions in capital before they can even enter the market. Youtube 2018 can easily afford Content ID; Youtube 2005 would have been bankrupted if they'd had to build it. And then there's the matter of banning Creative Commons
licenses. In order to bail out the largest newspapers in the EU, the Commission is proposing a Link Tax -- a fee that search engines and sites like Boing Boing will have to pay just for the right to link to news stories on the
web. This idea has been tried before in Spain and Germany and the newspapers who'd called for it quickly admitted it wasn't working and stopped using it. But the new, worse-than-ever Link Tax contains a new wrinkle: rightsholders
will not be able to waive the right to be compensated under the Link Tax. That means that European creators -- who've released hundreds of millions of works under Creative Commons licenses that allow for free sharing without fee or permission -- will no
longer be able to choose the terms of a Creative Commons license; the inalienable, unwaivable right to collect rent any time someone links to their creations will invalidate the core clause in these licenses. Europeans can write to
their MEPs and the European Commission using this joint Action Centre; please act before it's too late. The European Copyright Directive
was enacted in 2001 and is now woefully out of date. Thanks in large part to the work of Pirate Party MEP Julia Reda, many good ideas for updating European copyright law were put forward in a report of the European Parliament in July 2015. The European
Commission threw out most of these ideas, and instead released a legislative proposal in October 2016 that focused on giving new powers to publishers. That proposal was referred to several of the committees of the European Parliament, with the
Parliament's Legal Affairs (JURI) Committee taking the lead. As the final text must also be accepted by the Council of the European Union (which can be considered the second chamber of the EU's bicameral legislature), the Council
Presidency has recently been weighing in with its own "compromise" proposals (although this is something of a misnomer, as they do little to improve the Commission's original text, and in some respects make it worse). Not to be outdone, German
MEP (Member of the European Parliament) Axel Voss last month introduced a new set of his own proposals [PDF] for "compromise," which are somehow worse still. Since Voss leads the JURI committee, this is a big problem.
|
|
|
|
|
| 8th April 2018
|
|
|
YouTube-MP3 was the world's largest YouTube-ripping service, but last year it shut down following a lawsuit filed by the world's largest record labels. But what about companies that supply rip-it-yourself downloading tools? See
article from torrentfreak.com |
|
|