The Swedish government internet censor fines Google for not taking down links and for warning the targeted website about the censorship

17th March 2020

See article from sputniknews.com
The Swedish data protection censor, Datainspektionen has fined Google 75 million Swedish kronor (7 million euro) for failure to comply with the censorship instructions. According to the internet censor, which is affiliated with Sweden's Ministry of
Justice, Google violated the terms of the right-to-be-forgotten rule, an EU-mandated regulation introduced in 2014 allowing individuals to request the removal of potentially harmful private information from internet searches and directories.
Datainspektionen says an internal audit has shown that Google has failed to properly remove two search results which were ordered to be delisted back in 2017, making either too narrow an interpretation of what content needed to be removed, or failing
to remove a link to content without undue delay. The watchdog has also slapped Google with a cease-and-desist order for its practice of notifying website owners of a delisting request, claiming that this practice defeats the purpose of link
removal in the first place. Google has promised to appeal the fine, with a spokesperson for the company saying that it disagrees with this decision on principle.
The EU's highest court finds that the 'right to be forgotten' does not apply outside of the EU

25th September 2019

See article from bbc.com
See A view from America: Europe's 'right to be forgotten' shows legislators what not to do from washingtonpost.com
The EU's top court has ruled that Google does not have to apply the right to be forgotten globally. It means the firm only needs to remove links from its search results in the EU, and not elsewhere. The ruling stems from a dispute between Google and the French privacy censor CNIL. In 2015 CNIL ordered Google to globally remove search result listings to pages containing banned information about a person. The following year, Google introduced a geoblocking feature that prevents European users from being able to see delisted links. But it resisted censoring search results for people in other parts of the world. And Google challenged a 100,000 euro fine that CNIL had tried to impose. The right to be forgotten, officially known as the
right to erasure, gives EU citizens the power to request data about them be deleted. Members of the public can make a request to any organisation verbally or in writing and the recipient has one month to respond. They then have a range of considerations
to weigh up to decide whether they are compelled to comply or not. Google had argued that internet censorship rules should not be extended to external jurisdictions lest other countries do the same, eg China would very much like to demand that the
whole world forgets the Tiananmen Square massacre. The court also issued a related second ruling, which said that links do not automatically have to be removed just because they contain information about a person's sex life or a criminal
conviction. Instead, it ruled that such listings could be kept where strictly necessary for people's freedom of information rights to be preserved. However, it indicated a high threshold should be applied and that such results should fall down search
result listings over time. Notably, the ECJ ruling said that delistings must be accompanied by measures which effectively prevent or, at the very least, seriously discourage an internet user from being able to access the results from one of
Google's non-EU sites. It will be for the national court to ascertain whether the measures put in place by Google Inc meet those requirements.
European Court of Justice moves towards limiting censorship via the 'right to be forgotten' to the EU

13th January 2019

See article from clicklancashire.com
The French Internet censor CNIL some time ago insisted that censorship required under the 'right to be forgotten' should be applied worldwide rather than limited to the EU. Google appealed against the court order leading to the case being sent to the
European Court of Justice. Now opinions from the court's advocate general suggest that the court will determine that the right to be forgotten does not apply worldwide. The opinions are not final but the court often follows them when it hands down its ruling, which is expected later. CNIL wanted Google to remove links from Google.com instead of just removing links from European versions of the site, like Google.de and Google.fr. However the advocate general, Maciej Szpunar, warned that going further would be risky because the right to be forgotten always has to be balanced against other rights, including the legitimate public interest in accessing the information sought. Szpunar said that if worldwide de-referencing were allowed, European Union authorities would not be able to determine a right to receive information or balance it against other fundamental rights to data protection and to privacy. And of course, if France were allowed to censor information from the entire worldwide internet then why not China, Russia, Iran, and Saudi Arabia?
European Court of Justice hears case with France calling for its information bans under the 'right to be forgotten' to be implemented throughout the world

10th September 2018

See article from article19.org
ARTICLE 19 is leading a coalition of international human rights organisations, who will tell the European Court of Justice (CJEU) that the de-listing of websites under the right to be forgotten should be limited in order to protect global freedom
of expression. The hearing will take place on September 11 with a judgment expected in early 2019. The CJEU hearing in Google vs CNIL is taking place after France's highest administrative court asked for clarification in relation
to the 2014 ruling in Google Spain. This judgment allows European citizens to ask search engines like Google to remove links to inadequate, irrelevant or ... excessive content -- commonly known as the right to be forgotten (RTBF). While the content
itself remains online, it cannot be found through online searches of the individual's name. The CJEU has been asked to clarify whether a court or data regulator should require a search engine to de-list websites only in the
country where it has jurisdiction or across the entire world. France's data regulator, the Commission Nationale de l'Informatique et des Libertes (CNIL), has argued that if it upholds a complaint by a French citizen, search engines such as Google should be compelled to remove links not only from google.fr but from all Google domains. ARTICLE 19 and the coalition of intervening organisations have warned that forcing search engines to de-list information on
a global basis would be disproportionate. Executive Director of ARTICLE 19, Thomas Hughes said: This case could see the right to be forgotten threatening global free speech. European data regulators should not be
allowed to decide what Internet users around the world find when they use a search engine. The CJEU must limit the scope of the right to be forgotten in order to protect the right of Internet users around the world to access information online.
ARTICLE 19 argues that rights to privacy and rights to freedom of expression must be balanced when it comes to deciding whether websites should be de-listed. Hughes added: If
European regulators can tell Google to remove all references to a website, then it will be only a matter of time before countries like China, Russia and Saudi Arabia start to do the same. The CJEU should protect freedom of expression not set a global
precedent for censorship.
22nd April 2018

Should Google Decide Alone? By Ray Walsh. See article from bestvpn.com
It takes tens of thousands of pounds for the justice system to consider the nuances of censorship and the right to be forgotten, yet we hand over the task to Google, whose only duty is to maximise profits for shareholders.
15th April 2018

See article from bbc.com
A businessman fighting for the right to be forgotten has won a UK High Court action against Google. The unnamed businessman who won his case was convicted 10 years ago of conspiring to intercept communications. He spent six months in jail. He asked Google to delete online details of his conviction from Google Search but his request was turned down. The judge, Mr Justice Mark Warby, ruled in his favour on Friday. But he rejected a separate but similar claim made by another businessman who had committed a more serious crime. The other businessman, who lost his case, was convicted more than 10 years ago of conspiring to account falsely. He spent four years in jail. Google said it would accept the rulings, saying in a statement: We work hard to comply with the right to be forgotten, but we take great care not to remove search results that are in the public interest. We are pleased that the Court recognised our efforts in this area, and we will respect the judgements they have made in this case.
Explaining the decisions made on Friday, the judge said one of the men had continued to mislead the public while the other had shown remorse. But is Google really the right organisation to arbitrate on matters of justice, where it is required to examine the level of remorse shown by those requesting censorship?
Two UK court cases set to test Google's decisions over deleting factual data concerning convicted criminals from search results

21st January 2018

See article from bendbulletin.com
Google is set for its first appearance in a London court over the so-called right to be forgotten in two cases that will test the boundaries between personal privacy and public interest. Two anonymous people, who describe themselves in court filings as businessmen, want the search engine to take down links to information about their old convictions. One of the men had been found guilty of conspiracy to account falsely, and the other of conspiracy to intercept communications. Judge Matthew Nicklin said at a pre-trial hearing that those convictions are old and are now covered by an English law -- designed to rehabilitate offenders -- that says they can effectively be ignored. With a few exceptions, they don't have to be disclosed to potential employers. A Google spokeswoman said: We work hard to comply with the right to be forgotten, but we take great care not to remove search results that are clearly in the public interest and will defend the public's right to access lawful information.
The cases will start on February 27 and March 13.
Update: Convicted businessman wins right to be forgotten
21st December 2018. See article from theregister.co.uk
Google has settled a legal case brought against it by a convicted criminal who wanted the adtech company to delete embarrassing search results about his criminal past. The Court of Appeal in London was due to hear the anonymised businessman's appeal against an earlier High Court ruling which said he couldn't have the results deleted. Known only as NT1 thanks to a reporting restriction order, the man was found guilty of conspiracy to account falsely in the late 1990s.
France ludicrously claims the right to censor the world's internet and fines Google for not blocking Americans from viewing content censored in the EU

26th March 2016

See article from engadget.com
Europe's right to be forgotten is a nasty and arbitrary censorship power used to hide internet content such as past criminal history. Many think it tramples on the public's right to know, as quite a few examples have borne out. It seems that France and the EU think that such content should be censored worldwide, and have fined Google 100,000 euro for allowing non-EU internet viewers to see information censored in the EU. Since EU laws don't apply elsewhere, Google at first just deleted results requested under the right to be forgotten from its French domain. However, France pointed out that it would be easy to find the info on a different site and ordered the company to scrub results everywhere. In an attempted compromise, Google started omitting results worldwide as long as it determined, by geolocation, that the search was conducted from within France. But now EU internet censors have rejected that idea (as it would be easy to get around with a VPN) and fined Google, effectively for allowing Americans to see content censored in the EU. Google commented: We disagree with the [regulator's] assertion that it has the authority to control the content that people can access outside France.
In its ruling, France's CNIL censor says that geolocalizing search results does not give people effective, full protection of their right to be delisted ... accordingly, the CNIL restricted committee pronounced a 100,000 euro fine against Google. Google plans to appeal the ruling.
Google starts censoring google.com for countries affected by EU censorship demands

5th March 2016

See article from theregister.co.uk
See article from searchengineland.com
If you use Google in Europe, your search results will be censored under the EU's disgraceful 'right-to-be-forgotten'. Until now, if you used Google.com rather than, say, Google.de, you could still find results that had been arbitrarily removed based on how loud people shout. The censorship has been implemented as follows. Assume that someone in Germany files a Right To Be Forgotten request to have some listing censored for their name. If granted, the censorship will work like this for searches on that person's name (an illustrative code sketch of this logic follows the list):
- Listing censored for those in Germany, using ANY version of Google.
- Listing censored for those in the EU, using a European version of Google.
- Listing NOT censored for those outside Germany but within the EU, using non-European
versions of Google.
- Listing NOT censored for those outside the EU, using ANY version of Google.
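To make the four rules above concrete, here is a minimal Python sketch of the decision logic as described. The country codes, domain set and function name are assumptions made purely for illustration; this is not Google's actual implementation.

```python
# Illustrative sketch only: models the four delisting rules listed above.
# Country codes, the domain set and the function name are assumptions for
# this example, not Google's real code.

EU_COUNTRIES = {"DE", "FR", "IT", "ES", "NL", "SE", "IE", "PL"}  # abridged list
EUROPEAN_GOOGLE_DOMAINS = {"google.de", "google.fr", "google.it", "google.es", "google.co.uk"}

def delisted_result_visible(requester_country: str, searcher_country: str, domain: str) -> bool:
    """Return True if a delisted link would still appear for this searcher."""
    # Rule 1: hidden for anyone searching from the requester's own country,
    # whichever version of Google they use.
    if searcher_country == requester_country:
        return False
    # Rule 2: hidden elsewhere in the EU, but only on European versions of Google.
    if searcher_country in EU_COUNTRIES and domain in EUROPEAN_GOOGLE_DOMAINS:
        return False
    # Rules 3 and 4: visible for EU searchers on non-European versions,
    # and for anyone outside the EU on any version.
    return True

# A German delisting request, checked against each rule:
print(delisted_result_visible("DE", "DE", "google.com"))  # False (rule 1)
print(delisted_result_visible("DE", "FR", "google.fr"))   # False (rule 2)
print(delisted_result_visible("DE", "FR", "google.com"))  # True  (rule 3)
print(delisted_result_visible("DE", "US", "google.de"))   # True  (rule 4)
```

The only signal in this scheme is the searcher's geolocation, which is why, as the surrounding articles note, a VPN can route around it.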
Google's Peter Fleischer explained the reasons for the censorship: We're changing our approach as a result of specific discussions that we've had with EU data protection regulators in recent months.
We believe that this additional layer of delisting enables us to provide the enhanced protections that European regulators ask us for, while also upholding the rights of people in other countries to access lawfully published
information.
Google to censor all EU searches under the disgraceful 'right to be forgotten'

12th February 2016

See article from bbc.com
Google says it will remove links, censored under the right to be forgotten, from all versions of the search engine when viewed from countries where the censorship was invoked. Now, removed results will not appear on any version of Google, including google.com. Until now, search results removed under the right to be forgotten were only omitted from European versions of Google, such as google.co.uk or google.fr. EU internet censors previously asked the firm to do this. The French data protection authority had threatened the company with a fine if it did not remove the data from global sites, such as google.com, as well as European ones. This censorship will be applied whenever a European IP address is detected, but all users outside Europe will still see a set of unedited results. Hopefully European VPN users operating via non-European countries will also be unaffected by Google's geo-blocking. The BBC understands that the change will be in effect from mid-February.
19th December 2015

Here's a conspiracy theory for you: Are anti-European MEPs behind this shamefully bad censorship legislation so as to encourage us to vote for Brexit? See article from techdirt.com
ICO demands that Google censors information from google.com when accessed from the UK

25th November 2015

See article from publicaffairs.linx.net
The "right to be forgotten" applies to any search engine accessible in the UK, the Information Commissioner's Office has claimed. In a blog post earlier this month, ICO demanded: In August we
issued our first enforcement notice in this area , ordering Google to remove nine search
results brought up by entering an individual's name. Google has so far responded constructively, and the links are no longer visible on the European versions of their search engine. However we consider that they should go a step further, and make the
links no longer visible to anyone directly accessing any Google search services from within the UK (this would include someone sat a desk in Newcastle, but using google.com). This is a proper and proportionate reflection of what the EU Court of Justice
ruling means in practice, and so we've clarified the original enforcement notice , with the original text remaining the same but with a new section
added spelling out exactly what we expect of Google.
More disgraceful censorship legislation on the way from the EU

21st November 2015

See article from eff.org
Europe is very close to the finishing line of an extraordinary project: the adoption of the new General Data Protection Regulation (GDPR), a single, comprehensive replacement for the 28 different laws that implement Europe's existing 1995 Data Protection Directive. More than any other instrument, the original Directive has created a high global standard for personal data protection, and led many other countries to follow Europe's approach. Over the years, Europe has grown ever more committed to the idea of data protection as a core value. The Union's Charter of Fundamental Rights, legally binding on all the EU states since 2009, lists the "right to the protection of personal data" as a separate and equal right to privacy. The GDPR is intended to update and maintain that high standard of protection, while modernising and streamlining its enforcement. The battle over the details of the GDPR has so far mostly been a debate between advocates pushing to better defend data protection and companies and other interests that find consumer privacy laws a hindrance to their business models. Most of the compromises between these two groups have now already been struck. The result is a ticking time-bomb that will be bad for online speech, and bad for the future reputation of the GDPR and data protection in
general. The current draft of the GDPR doubles down on Google Spain, and raises new problems. (The draft currently under negotiation is not publicly available, but July 2015 versions of the provisions that we refer to can be found
in this comparative table of proposals and counter-proposals by
the European institutions. Article numbers referenced here, which will likely change in the final text, are to the proposal from the Council of the EU unless otherwise stated.) First, it requires an Internet intermediary (which is
not limited to a search engine, though the exact scope of the obligation remains vague) to respond to a request by a person for the removal of their personal information by immediately restricting the content, without notice to the user who uploaded that
content (Articles 4(3a), 17, 17a, and 19a). Compare this with DMCA takedown notices, which include a notification requirement, or even the current Right to Be Forgotten process, which gives search engines some time to consider the legitimacy of the
request. In the new GDPR regime, the default is to block. Then, after reviewing the (also vague) criteria that balance the privacy claim with other legitimate interests and public interest considerations such as freedom of
expression (Articles 6.1(f), 17a(3) and 17.3(a)), and possibly consulting with the user who uploaded the content if doubt remains, the intermediary either permanently erases the content (which, for search engines, means removing their link to it), or
reinstates it (Articles 17.1 and 17a(3)). If it does erase the information, it is not required to notify the uploading user of having done so, but is required to notify any downstream publishers or recipients of the same content (Articles 13 and 17.2),
and must apparently also disclose any information that it has about the uploading user to the person who requested its removal (Articles 14a(g) and 15(1)(g)). Think about that for a moment. You place a comment on a website which
mentions a few (truthful) facts about another person. Under the GDPR, that person can now demand the instant removal of your comment from the host of the website, while that host determines whether it might be okay to still publish it. If the host's
decision goes against you (and you won't always be notified, so good luck spotting the pre-emptive deletion in time to plead your case to Google or Facebook or your ISP), your comment will be erased. If that comment was syndicated, by RSS or some other
mechanism, your deleting host is now obliged to let anyone else know that they should also remove the content. Finally, according to the existing language, while the host is dissuaded from telling you about any of this procedure,
they are compelled to hand over personal information about you to the original complainant. So this part of the EU's data protection law would actually release personal information! What are the incentives for the intermediary to stand by the author and keep the material online? If the host fails to remove content that a data protection authority later determines it should have removed, it may become liable to astronomical penalties of €100 million or up to 5% of its global
turnover, whichever is higher (European Parliament proposal for Article 79). That means there is enormous pressure on the intermediary to take information down if there is even a remote possibility that the information has indeed
become "irrelevant", and that countervailing public interest considerations do not apply. It is not too late yet: proposed amendments to the GDPR are still being considered. We have written a
joint letter with ARTICLE 19 to European policymakers, drawing their
attention to the problem and explaining what needs to be done. We contend that the problems identified can be overcome by relatively simple amendments to the GDPR, which will help to secure European users' freedom of expression, without detracting from
the strong protection that the regime affords to their personal data. Without fixing the problem, the current draft risks sullying the entire GDPR project. Just like the DMCA takedown process, these GDPR removals won't just be
used for the limited purpose they were intended for. Instead, they will be abused to censor authors and invade the privacy of speakers. A GDPR without fixes will damage the reputation of data protection law as effectively as the DMCA permanently tarnished
the intent and purpose of copyright law.
France set to take action requiring Google to operate the 'right to be forgotten' across the world

22nd September 2015

See article from reason.com
The French internet censor has responded to a Google statement which explains why European internet censorship cannot be applied across the world. This summer, France's Commission Nationale de l'Informatique et des Libertes (CNIL) sent Google an order to delist links not merely from European Google searches but from search results around the world, too. Google responded: This is a troubling development that risks serious chilling effects on the web.
CNIL's president did not find this persuasive, rejecting Google's appeal of the order. In a statement released today, CNIL claimed that:
Once delisting is accepted by the search engine, it must be implemented on all extensions, because if this right was limited to some extensions, it could be easily circumvented: in order to find the delisted result, it
would be sufficient to search on another extension and this would equate stripping away the efficiency of this right.
CNIL pointed out that delisted info remains directly accessible on the source website or through a search using
other terms than an individual's name and: In addition, this right is not absolute: it has to be reconciled with the public's right to information, in particular when the data subject is a public person, under the
double supervision of the CNIL and of the court. Google must now comply with the formal notice or face CNIL's sanctions committee.
There's no further opportunity to appeal the decision at this stage under French law. But if Google refuses to comply, it could later appeal any sanctions levied by CNIL. Fines would likely start at around €300,000 but could increase to between 2% and 5% of Google's global operating costs. The search engine could then go to the Conseil d'Etat, the supreme court for administrative justice, to appeal the decision and fine.
BBC publishes a long list of websites censored from Google search under the EU's 'right to be forgotten'

27th June 2015

See list of censored news stories from bbc.co.uk
The BBC explains its commendable policy in a blog post: Since a European Court of Justice ruling last year, individuals have the right to request that search engines remove certain web pages from their search results. Those pages
usually contain personal information about individuals. Following the ruling, Google removed a large number of links from its search results, including some to BBC web pages, and continues to delist pages from BBC Online.
The BBC has decided to make clear to licence fee payers which pages have been removed from Google's search results by publishing this list of links. Each month, we'll republish this list with new removals added at the top.
We are doing this primarily as a contribution to public policy. We think it is important that those with an interest in the right to be forgotten can ascertain which articles have been affected by the ruling. We hope it will
contribute to the debate about this issue. We also think the integrity of the BBC's online archive is important and, although the pages concerned remain published on BBC Online, removal from Google searches makes parts of that archive harder to find.
The pages affected by delinking may disappear from Google searches, but they do still exist on BBC Online. David Jordan, the BBC's Director of Editorial Policy and Standards, has written a blog post which explains how we view that
archive as a matter of historic public record and, thus, something we alter only in exceptional circumstances. The BBC's rules on deleting content from BBC Online are strict; in general, unless content is specifically made available only for a
limited time, the assumption is that what we publish on BBC Online will become part of a permanently accessible archive. To do anything else risks reducing transparency and damaging trust. One caveat: when looking through this
list it is worth noting that we are not told who has requested the delisting, and we should not leap to conclusions as to who is responsible. The request may not have come from the obvious subject of a story. See
list of censored news stories
The EU's right to be forgotten has diminished free speech according to former UN free speech rapporteur

27th June 2015

See article from indexoncensorship.org
Freedom of expression is more in danger today than in 2008 because of the right to be forgotten, the United Nations' former free expression rapporteur Frank La Rue told an internet conference. At the event La Rue told Index on Censorship:
The emphasis on the 'right to be forgotten' in a way is a reduction of freedom of expression, which I think is a mistake. People get excited because they can correct the record on many things but the trend is towards
limiting people's access to information which I think is a bad trend in general.
La Rue, who was the UN's rapporteur between 2008 and 2014, addressed lawyers, academics and researchers at the Institute of Advanced Legal Studies in
London, in particular covering the May 2014 right to be forgotten ruling from the Court of Justice of the European Union, and its impact on free speech. On the ruling, La Rue said: I would want to know the past.
It is very relevant information. Everyone should be on the record and we have to question who is making these decisions anyway? The state is accountable to the people of a nation so should be accountable here. Not private
companies and especially not those with commercial interests.
Internet censor wants Google to implement its censorship demands worldwide, not just in France

15th June 2015

See article from bbc.co.uk
Google has 15 days to comply with a demand from France's internet censor to extend the right to be forgotten to all its search engines. Google has responded to European censorship under the right to be forgotten by removing the required information only from the version of the search engine specific to the censoring country. In particular, it leaves the links live in the global google.com version. French censor CNIL said Google could face sanctions if it did not comply within the time
limit. In response, Google said in a statement: We've been working hard to strike the right balance in implementing the European Court's ruling, co-operating closely with data protection authorities.
The ruling focused on services directed to European users, and that's the approach we are taking in complying with it.
Ambulance-chasing law firms exploit the EU's right to be forgotten

18th April 2015

See article from independent.co.uk
Ambulance-chasing law firms are exploiting the European Court's ruling on the right to be forgotten to drum up business, leading to a rise in the number of newspaper articles being deleted from Google search results. The companies, some of which have no legal background but say they specialise in reputation management, have sensed an easy opportunity to make money by offering to cleanse the internet of embarrassing references to their clients on a no-win no-fee basis, media lawyers said. The service can amount to little more than filling in Google's one-page form requesting that a particular link is removed from search results -- which can easily be completed for free by the client themselves. Last month alone The Independent was informed by Google that links to 13 news articles had been removed from its search results, marking a sudden rise on previous figures when only a handful had been hidden each month. Mark Stephens, a media law specialist at London firm Howard Kennedy, said: You've got ambulance-chasing lawyers who are, I think, trying to attract custom for cases which you don't need a lawyer for. People are being asked to pay for something when there's no good reasons
to do so -- you can do this online, for free, for nothing.
He added that the problem was not restricted to the UK, with media organisations across Europe feeling the chilling effects of the ruling as unscrupulous companies
realised that citing the ruling could be an easy way to make money.
8th February 2015

New York Times editorial is unimpressed by the EU wanting to impose its disgraceful 'right to be forgotten' censorship rules on the US. See article from nytimes.com
18th December 2014

That is a big step, one that even China, the master of internet censorship, has never taken. See article from huffingtonpost.com
14th December 2014

The company is under relentless attack by European authorities who won't stop until they do real damage. By Mike Elgan. See article from computerworld.com
EU internet censors publish their rules about the 'right to be forgotten'

4th December 2014

See article from searchengineland.com
See EU 'right to be forgotten' censorship rules [pdf] from ec.europa.eu
The EU has issued formal censorship rules surrounding the so-called Right to Be Forgotten (RTBF). The formal considerations that the EU data censors want considered in evaluating any RTBF request are:
- Does the search result relate to a natural person -- i.e. an individual? And does the search result come up against a search on the data subject's name?
- Does the data subject play a role in public life?
- Is the data subject a public figure?
- Is the data subject a minor?
- Is the data accurate?
- Is the data relevant and not excessive?
- Is the information sensitive within the meaning of Article 8 of the Directive 95/46/EC?
- Is the data up to date? Is the data being made available for longer than is necessary for the purpose of the processing?
- Is the data processing causing prejudice to the data subject?
- Does the data have a disproportionately negative privacy impact on the data subject?
- Does the search result link to information that puts the data subject at risk?
- In what context was the information published?
- Was the original content published in the context of journalistic purposes?
- Does the publisher of the data have a legal power, or a legal obligation, to make the personal data publicly available?
- Does the data relate to a criminal offence?
In most cases, it appears that more than one criterion will need to be taken into account in order to reach a decision to censor. In other words, no single criterion is, in itself, determinative. The document asserts that successful RTBF
requests should be applied globally and not just to specific country domain search results, as Google has been doing: [D]e-listing decisions must be implemented in a way that guarantees the effective and complete
protection of these rights and that EU law cannot be easily circumvented. In that sense, limiting de-listing to EU domains on the grounds that users tend to access search engines via their national domains cannot be considered a sufficient means to
satisfactorily guarantee the rights of data subjects according to the judgment. In practice, this means that in any case de-listing should also be effective on all relevant domains, including .com
But any such global de-listing
sets up a conflict of laws between nations that recognize RTBF and those that do not. Google had been notifying publishers that their links were being removed, causing some to republish those links for re-indexing. This has frustrated some European
censors who see this practice as undermining the RTBF. Accordingly, the EU says that publishers should not be notified of the removal of links: Search engine managers should not as a general practice inform the
webmasters of the pages affected by de-listing of the fact that some webpages cannot be acceded from the search engine in response to specific queries. Such a communication has no legal basis under EU data protection law.
The EU also
doesn't want Google to publish notices to users that links have been removed for similar reasons: It appears that some search engines have developed the practice of systematically informing the users of search engines
of the fact that some results to their queries have been de-listed in response to requests of an individual. If such information would only be visible in search results where hyperlinks were actually de-listed, this would strongly undermine the purpose
of the ruling. Such a practice can only be acceptable if the information is offered in such a way that users cannot in any case come to the conclusion that a specific individual has asked for the de-listing of results concerning him or her.
The guidelines state that beyond external search engines (e.g., Google) they may be extended to undefined intermediaries. However, they immediately go on to apparently contradict that notion: The right to de-listing should not apply to search engines with a restricted field of action, particularly in the case of search tools of websites of newspapers. Finally, the guidelines suggest that only EU citizens may be eligible in practice to make RTBF requests.
EU internet censors want to prevent Europeans from accessing censored links via google.com

27th November 2014

See article from bbc.co.uk
Google is under fresh pressure to expand censorship under the right to be forgotten to its international .com search engine. A panel of EU censors claimed the move was necessary to prevent the law from being circumvented. Google
currently de-lists results that appear in the European versions of its search engines, but not the international one. At present, visitors are diverted to localised editions of the US company's search tool - such as Google.co.uk and Google.fr - when they
initially try to visit the Google.com site. However, a link is provided at the bottom right-hand corner of the screen offering an option to switch to the international .com version. This link does not appear if the user attempted to go to a regional version in the first place. Even so, it means it is possible for people in Europe to easily opt out of the censored lists.
Man goes to court to force Google to track down links to delete

25th November 2014

24th November 2014. See article from bbc.co.uk
The case of a UK businessman who wants Google to stop malicious web postings about him appearing in its search results is set to begin. Daniel Hegglin says he has been wrongly called a murderer, a paedophile and a Ku Klux Klan sympathiser during a
malicious online campaign against him. He wants Google to block the anonymous posts from its search engine results. Google asked him to provide a list of web links to be removed, but High Court judges will rule if it should do more. He claims
there are more than 3,600 websites containing abusive and untrue material about him, and says listing all the posts for Google to remove would be expensive, time consuming, and ineffective. He says that although Google is not the originator of the
abusive campaign, its search engines have allowed the abuse to become more widespread. He is seeking a legal order to force Google to take steps to prevent the abusive posts being processed in searches in England and Wales.
Update: Settled 25th November 2014. See article from
wiggin.co.uk
The case was settled on the first day of trial. Daniel Hegglin's barrister said in a statement: The settlement includes significant efforts on Google's part to remove the abusive material from Google-hosted websites and from its search results. Mr Hegglin will now concentrate his energies on bringing the persons responsible for this campaign of harassment to justice.
And a statement for Google: Google provides search services to millions of people and cannot be responsible for policing internet content. It will, however, continue to apply its procedures that have been developed to assist with the removal of content which breaches local applicable laws.
Government minister Sajid Javid notes that the bad guys are getting reports of their criminal deeds censored under the EU's 'right to be forgotten'

14th November 2014

See article from telegraph.co.uk
Terrorists and criminals are being airbrushed from history as right-to-be-forgotten laws bring in censorship by the back door, the culture secretary has warned. Sajid Javid said convictions are being removed from the internet even by those who have gone on to commit further crime, with terrorists ordering Google to remove stories about their trials. He warned that thousands of requests were being received each day from those who prefer to keep their past a secret, thanks to unelected judges in Europe. He told an audience that the European court had introduced censorship through the back door by ordering internet search engines such as Google to offer a right to be forgotten to individuals who want links to information about them to be removed. Article 8 of the European Convention on Human Rights, he said, was being used as: Little more than an excuse for well-paid lawyers to hide the shady pasts of wealthy businessmen and the sexual indiscretions of sporting celebrities. The 'right to be forgotten' is censorship through the back door.
Pianist attempts to censor bad reviews via the 'right to be forgotten'

2nd November 2014

See article from dailymail.co.uk
Dejan Lazic, a concert pianist from Croatia, has demanded that a bad review of a 2010 concert he gave be removed from internet search results under the European right to be forgotten law. Lazic wrote to the Washington Post, which published the review by classical music writer Anne Midgette, to have the article removed from search results. He claimed that the review was: Defamatory, mean-spirited, opinionated, one-sided, offensive [and] simply irrelevant for the arts, despite the fact that the original piece is in many places complimentary. In the original article, Midgette said that his performance was lackluster given his huge talents, and prone to grandiloquence. Lazic also claimed that his request was nothing to do with censorship ...BUT... a response to the fact that newspaper reviews are too far from the truth.
BBC News to maintain a list of URLs censored by Google under the 'right to be forgotten'

22nd October 2014

See article from bbc.co.uk
The BBC is to publish a continually updated list of its articles censored from Google search under the disgraceful right to be forgotten rule. Editorial policy head David Jordan told a public meeting, hosted by Google, that the BBC felt some of
its articles had been wrongly hidden. He said greater care should be given to the public's right to remember. The BBC will begin - in the next few weeks - publishing the list of removed URLs it has been notified about by Google. Jordan said the BBC had so far been notified of 46 links to articles that had been removed. The list will not republish the story, or any identifying information. It will instead be a resource for those interested in the debate. Jordan criticised the lack of a formal appeal process after links have been taken down, noting one case where news of the trial involving members of the Real IRA was removed from search results.
Google reports on the scale of censorship under the right to be forgotten

12th October 2014

See article from bbc.co.uk
One in 10 requests for web links to be censored from search results under European right to be forgotten laws has come from the UK, Google has said. Google said it had received requests to remove 498,737 links from search results since May this year - including 63,616 pages following requests from the UK. It said 18,304 requests were made in the UK, the third highest in the EU. According to a transparency report released on its website, Google removed 35% - or 18,459 - of the links it was asked to censor. Google also provided examples of the sorts of requests it had received, along with the search engine's decision.
23rd September 2014

Oxford Mail republishes crime stories censored by Google under the EU's disgraceful 'right to be forgotten'. See article from oxfordmail.co.uk
23rd September 2014

Myth-busting: European Commission misrepresents right to be forgotten objections. See article from indexoncensorship.org
EU internet censors want to extend arbitrary censorship under the right to be forgotten

19th September 2014

See article from searchengineland.com
According to Reuters, European internet censors say they've agreed on a uniform set of EU-wide rules and criteria that will be used to evaluate appeals under the disgraceful Right to Be Forgotten (RTBF) law announced earlier this year by the
Luxembourg-based European Union Court of 'Justice'. Google has received in excess of 120,000 censorship requests since May. Many have been granted but many have not. Google is hardly in a position to research the merits of the case, so the decisions
are essentially arbitrary. Those whose censorship requests are turned down will be able to appeal the decision and that's where these censorship criteria will be applied. The specifics of the rules won't be finalized until November. However
Reuters suggests they will primarily take into account factors such as the public role of the person, whether the information relates to a crime and how old it is. There's still considerable ambiguity in some of these areas. Google has
adopted a practice of notifying publishers when RTBF links are removed. Apparently EU censors don't like this practice (probably because it puts political pressure on them amid cries of censorship or objections from the publishers). Google
currently only removes the subject links and material from the individual country Google site where the request was made (e.g., Google.fr, Google.de) but not from Google.com. Johannes Caspar, Germany's internet censor, reportedly believes that links removed under the RTBF should be expunged globally. He spewed: The effect of removing search results should be global. This is in the spirit of the court ruling and the only meaningful way to act in a global environment like the Internet.
Hopefully this won't occur as the US is a bit more keen on freedom than the PC extremists of the EU.
Google organises protests across Europe against the ludicrous and inept 'right to be forgotten'

8th September 2014

See article from stockwisedaily.com
Google is to fight back against the European Union's inane right to be forgotten ruling. Under the ruling from the European Union Court of Justice, Google must remove personal information from search results upon request without being in a position to ascertain that the request is justified. In order to oppose the ruling, Google is planning public hearings in seven different European cities starting in Madrid on September 9. Google is looking for a robust debate over the ruling and its implementation criteria, according to its top lawyer, David Drummond. Google is not the only party to criticise the ruling: Wikipedia founder Jimmy Wales has called it deeply immoral and even said that it will lead to an internet riddled with memory holes. Drummond and Eric Schmidt, Google's chairman, will highlight the implications of the ruling. Furthermore, the company will outline ideas for handling requests related to criminal convictions.
Wikimedia Foundation bosses speak of the censorship being enabled by the 'right to be forgotten'

9th August 2014

See article from dailymail.co.uk
The foundation which operates Wikipedia has criticised the right to be forgotten ruling, describing it as unforgivable censorship. Speaking at the announcement of the Wikimedia Foundation's first-ever transparency report in London, Wikipedia founder Jimmy Wales said the public had the right to remember: Wikipedia is founded on the belief that everyone, everywhere should be able to have access to the sum of all knowledge. However, this is only possible if people can contribute and participate in those projects without reservation. This means the right to create content, including controversial content, should be protected. People should feel secure that their
curiosity and contributions are not subject to unreasonable Government requests for their account histories. They should feel confident that the knowledge they are receiving is complete, truthful and uncensored.
The Foundation's chief executive Lila Tretikov called the ruling from the European Court of Justice a direct threat to our mission: Our Transparency Report explains how we fight and defend against that. We oppose censorship.
Recently, however, a new threat has emerged - the removal of links from search results following the recent judgment from the European Court of Justice regarding the right to be forgotten. This right to be forgotten is the idea
that people may demand to have truthful information about themselves selectively removed from the published public record or at least make it more difficult to find. This ruling, unfortunately, has compromised the public's right to information and
freedom of expression. Links, including those to Wikipedia itself may now be quietly, silently deleted with no transparency, no notice, no judicial review and no appeals process. Some search engines are giving proper notice and
some are not. We find this type of compelled censorship unacceptable. But we find the lack of disclosure unforgivable.
As part of the Foundation's bid for greater transparency, it has issued its first transparency report, detailing
the number of requests it has received from governments, individuals and organisations to disclose information about users or to change content on web pages. According to the report, the Foundation received 56 requests for user data in the last two
years. In 14% of those cases, information was produced. The report also revealed that 304 requests were made for content to be either altered or removed, with the Foundation confirming that none of those requests were granted. Geoff Brigham,
general counsel at the Wikimedia Foundation, said: The decision is going to have direct and critical repercussions for Wikipedia. Without safeguards, this decision hurts free information, and let me tell you why: the
decisions are made without any real proof, there's no judicial review, no public explanation, there's no appeals process. Yet the decision allows censorship of truthful information when one would expect such judicial safeguards.
If I may so say, in allowing this to happen, the European Court of Justice has basically abandoned its responsibility to protect the right to freedom of expression and access to truthful information. Two extremely important rights for democratic society.
In our opinion, we are on a path to secret, online sanitation of truthful information. No matter how well it may be intended, it is compromising human rights, the freedom of expression and access to information, and we cannot
forget that. So we have to expose it and we have to reject this kind of censorship.
5th August 2014

Remembered for claiming the right to be forgotten. See article from bbc.co.uk
House of Lords Committee find the right to be forgotten wrong in principle and unworkable

3rd August 2014

See article from theguardian.com
The right to be forgotten, the arbitrary removal of online material according to who shouts loudest, is wrong in principle and unworkable in practice, a parliamentary committee has said. The House of Lords home affairs, health and education EU sub-committee has condemned regulations being drawn up by the European commission and a recent landmark judgment by the European court of justice (ECJ). The committee points out that the EU's 1995 data protection directive on which the ECJ judgment relied was drafted three years before Google was founded. The committee's chair, Lady Prashar, said: It is crystal clear that neither the 1995 directive nor the [ECJ's] interpretation of it reflects
the incredible advancement in technology that we see today, over 20 years since the directive was drafted. We believe that the judgment of the court is unworkable for two main reasons. Firstly, it does not take into account the
effect the ruling will have on smaller search engines which, unlike Google, are unlikely to have the resources to process the thousands of removal requests they are likely to receive. Secondly, we also believe that it is wrong in
principle to leave search engines themselves the task of deciding whether to delete information or not, based on vague, ambiguous and unhelpful criteria, and we heard from witnesses how uncomfortable they are with the idea of a commercial company sitting
in judgement on issues like that. We think there is a very strong argument that, in the new regulation, search engines should not be classed as data controllers, and therefore not liable as 'owners' of the information they are
linking to. We also do not believe that individuals should have a right to have links to accurate and lawfully available information about them removed, simply because they do not like what is said.
2nd August 2014

The right to be forgotten is being well abused. It is being decided by an organisation that doesn't really want to be involved and that only gets to hear one side of the story. See article from dailymail.co.uk
EU internet censors get heavy with Google for informing websites that they have been censored

26th July 2014

See article from itproportal.com
The EU's Article 29 Censorship Working Party has criticised Google for telling publishers about removed right to be forgotten links, and it wants links removed worldwide, not just on European variants of Google. Representatives of Google, Yahoo
and Bing were called back to address issues about the way that Google was handling right to be forgotten censorship requests. It turned into a sort of public dressing down of Google for not censoring links 'properly'. Google was criticised for
the fact that it was only removing links from the EU sites, and links could still be found on the US and other Google search pages. The EU censors feel that any EU citizen who doesn't like a particular post has the right to have all links to that story
censored worldwide. Google was also called out because they were informing the sources of the stories that they were pulling the links (causing websites to republish new articles, which added more new links, and so on). Irish data protection
censor Billy Hawkes expressed concerns regarding Google warning sites about their links being removed. The more they do so, it means the media organisation republishes the information and so much for the right to be
forgotten. There is an issue there.
...Read the full article
Jimmy Wales criticises 'right to be forgotten' censorship. See article from theguardian.com
Internet search engines such as Google should not be left in charge of censoring history, the Wikipedia founder has said, after the US firm revealed it had approved half of more than 90,000 right to be forgotten requests. Jimmy Wales said it was dangerous to have companies decide what should and should not be allowed to appear on the internet.
Society of Editors calls on David Cameron to end censorship based on the right to be forgotten

24th July 2014

See article [pdf] from indexoncensorship.org

The Society of Editors, which has the backing of senior figures at the BBC, Sky News and ITN as well as major newspaper groups, has joined with Index on Censorship and the Media Lawyers Association to call on David Cameron and key EU data protection chiefs to resist censorship in the guise of the right to be forgotten. The Society of Editors has written to David Cameron:
Dear Prime Minister,
The issues about the so-called right to be forgotten raised by the recent European Court judgement involving Google, with its implications for other search engines and accessibility to other journalistic information give us serious cause for
concern. We appreciate that no general right to be forgotten exists, as Ministers and the Information Commissioner have confirmed. The Court ruling is only about restricting access to links generated by search engines in
response to name searches. But there is a vital principle at stake which we trust that the Information Commissioner - responsible for adjudicating both data protection and freedom of information in the UK - and the government will defend with vigour.
The judgement makes clear that Europeans now have the right to demand that certain online material is obscured in search results and its dissemination via search engines is stopped. For media organisations and journalists, this is
akin to being asked - on the basis of the subjective opinions of individuals, rather than any specific Court order - to remove items from an index in newspaper archives. This is something we would only do after careful consideration based on a sound
legal and factual basis and hope never to be asked to do more. We feel sure that neither the Information Commissioner nor the government would wish to see this happen but we seek assurances that any such moves will be firmly
resisted and will not be applied in any new data protection legislation coming out of Europe in the future. We are concerned that the European Court's judgment goes against Article 10 of the European Convention of Human Rights and
certainly the intentions of the UK Parliament when it introduced the Human Rights Act. With regard to data protection legislation, journalistic work has always received special consideration. We are glad to see that the Court's
ruling continues this, and does not require news publishers to remove articles when asked to do so by individuals. This principle must be strongly defended or even enhanced. But the Court's ruling is deeply problematic for journalism in general, as it
has the effect of limiting the accessibility and dissemination of journalistic work via search engines, where the media company wishes this to be done. This reduces the visibility of the vital work done by journalists to ensure accountability throughout
society, which in itself is contrary to the spirit behind Article 10. For this reason, we believe that there should be greater transparency about the actions of search engines to comply with the European Court's ruling.
Specifically, we believe there should be no restrictions on the ability of Google or other operators to inform the originator of material when links to that material are removed. Any restrictions would prevent publishers having the opportunity to make
their case on freedom of expression grounds thus making the process one-sided. The Society of Editors has more than 400 members in national, regional and local newspapers, magazines, broadcasting and digital media, journalism
education and media law. It campaigns for media freedom, self regulation, the public's right to know and the maintenance of standards in journalism. This letter has the full support of the Society's board of directors which includes senior editors from
Sky News and the BBC and key regional newspapers in England, Scotland, Wales and Northern Ireland. It also has the support of editors of major UK newspapers, including The Times, The Sunday Times, The Sun, The Guardian, The Independent, the Financial
Times, the Daily Express, the Daily Mirror, the Sunday Mirror, The Daily Telegraph, and Associated Newspapers as well as ITN. We would be grateful for your comments about this and your assurances that these principles will be
defended. |
|
|
|
|
| 10th July 2014
|
|
|
When government minister Simon Hughes tries to spin that it's not censorship See
article from dailymail.co.uk |
|
|
|
|
|
10th July 2014
|
|
|
For centuries, the wicked have dreamt of wiping their crimes from history. Now - thanks to an idiotic ruling by European judges and Google's connivance - they're doing it in their thousands See
article from dailymail.co.uk
|
|
|
|
|
| 6th
July 2014
|
|
|
The Register asks if recent examples of right to be forgotten censorship by Google were a ploy See
article from theregister.co.uk |
|
|
|
|
| 3rd July 2014
|
|
|
The Guardian reveals how its news archives have been censored by Google in the name of the right to be forgotten See
article from theguardian.com |
|
|
|
|
|
3rd July 2014
|
|
|
The BBC reveals how its news archives have been censored by Google in the name of the right to be forgotten See article from bbc.co.uk |
|
Google begins removing search links under the EU's right to be forgotten
|
|
|
|
27th June 2014
|
|
| See article from
theguardian.com |
Google has begun removing search links to content in Europe under the right to be forgotten ruling, which obliges it to exclude web pages with supposedly outdated or irrelevant information about individuals from web searches. Searches made
on Google's services in Europe using people's names include a section at the bottom with the phrase Some results may have been removed under data protection law in Europe , and a link to a page explaining the ruling by the European court of
justice (ECJ) in May 2014. However searches made on Google.com, the US-based service, do not include the same warning, because the ECJ ruling only applies within Europe. Google would not say how many people's search histories have been
censored, nor how many web pages have been affected. Comment: Goggle.eu.censored 28th June 2014. From Alan: Not mentioned in the Guardian report is the difficulty for UK surfers of finding
uncensored searches on the American site. If I'm in Italy, I can either search in Italian at google.it or, if I want to search in English and enter google.com, I get the American site. But in this country, typing the URL for google.com redirects to
google.co.uk. Looks like we Brits are particularly disadvantaged by the absurd decision of twattish Euro-judges. |
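For readers who want to see the country redirect described above for themselves, here is a minimal sketch, assuming Python 3 and outbound HTTPS access (neither of which the comment mentions), that issues a single request to www.google.com without following redirects and prints any Location header returned. The redirect behaviour was as reported in 2014 and may well have changed since.

import http.client

# Minimal redirect check: ask www.google.com for its front page without
# following redirects, then print the status and any Location header.
# Whether a country redirect (e.g. to google.co.uk) appears depends on
# where and when the request is made; this is only an illustration.
conn = http.client.HTTPSConnection("www.google.com", timeout=10)
conn.request("GET", "/", headers={"User-Agent": "redirect-check/0.1"})
resp = conn.getresponse()
print("Status:", resp.status, resp.reason)
print("Location:", resp.getheader("Location", "(no redirect)"))
conn.close()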
|
|
|
|
| 21st May 2014
|
|
|
The European search engine ruling weakens our democratic foundations and could lead to our history being rewritten. By Mark Stephens See
article from theguardian.com |
|
The European Court of Justice allows anyone to censor personal data that they do not like even if it is accurate and truthful
|
|
|
|
16th May 2014
|
|
| 13th May 2014. See
article from
publicaffairs.linx.net See article
from indexoncensorship.org |
Search engines are data controllers within the meaning of the Data Protection Directive, and responsible for complying with the data protection principles in respect of the processing they do of personal data, says Europe's highest court, the Court of
Justice of the European Union (CJEU). The CJEU upheld the right of a user to suppress search results on his name that pointed to newspaper articles about him. CJEU found that Google, as a search engine, processed personal data, by determining which
links would appear in response to a search on an individual's name, and is the data controller for that processing. This applies even when the data attached to the individual's name consists exclusively of material that has already been published, and
regardless of the fact that the search engine processes all data indiscriminately, without singling out the personal data. By finding Google to be a data controller in its own right, the CJEU was able to apply the full scope of the Data Protection
Directive to Google, and arrive at a decision that users can, in some circumstances, have a right to be forgotten , even in respect of data that was originally published lawfully. Finally, in response to the question whether the directive
enables the data subject to request that links to web pages be removed from such a list of results on the grounds that he wishes the information appearing on those pages relating to him personally to be forgotten after a certain time, the Court
holds that, if it is found, following a request by the data subject, that the inclusion of those links in the list is, at this point in time, incompatible with the directive, the links and information in the list of results must be erased. Index
on Censorship writes: The Court's decision is a retrograde move that misunderstands the role and responsibility of search engines and the wider internet. It should send chills down the spine of everyone in the European
Union who believes in the crucial importance of free expression and freedom of information.
Update: Censorship requests roll into Google 16th May 2014. See
article from bbc.co.uk
Google has received fresh takedown requests after a European court ruled that an individual could force it to remove irrelevant and outdated search results, the BBC has learned. An ex-politician seeking re-election has asked to have links
to an article about his behaviour in office removed. A man convicted of possessing child abuse images has requested links to pages about his conviction to be wiped. And a doctor wants negative reviews from patients removed from the results.
|
|
|
|
|
| 15th May 2014
|
|
|
Tuesday's ruling from the Court of Justice of the European Union (CJEU) said that internet search engine operators must remove links to articles found to be outdated or 'irrelevant' at the request of individuals. See
article from indexoncensorship.org |
|
The supposed 'right to be forgotten' doesn't look like it will trump the right to factual, legitimate information
|
|
|
|
26th June 2013
|
|
| See detailed analysis of the judgement
from ukhumanrightsblog.com
|
Google should not have to delete information from its search results when searches pull up old information that individuals claim is damaging to them. That is the early opinion of a special advisor to the European Union's
highest court, who has apparently sided with Google in a case involving a man in Spain who argued that Google searches on his name turn up information about an arrest years before, which he says should be cleaned up to protect him. An expert opinion requested
by the European Court of Justice, which is based in Luxembourg, recommended that Google not be forced to expunge all links to a 15-year-old legal notice published in a Spanish newspaper documenting a failure to pay back taxes. Instead, the
European Union's highest court was advised to strike down a Spanish regulator's demand that the search engine grant citizens a broad digital 'right to be forgotten,' including the ability to delete previous arrests and other negative publicity from
Google's online search results. A final decision in the case is expected before the end of this year.
|
|
A campaign against EU legislation supporting the corporate sale of personal data without consent
|
|
|
| 15th
May 2013
|
|
| See article from
openrightsgroup.org See campaign at nakedcitizens.eu
|
People from across Europe are sending postcards to their MEPs asking them to support new proposals protecting our privacy and giving us control over what happens to our data.
Join them right now -
click here to send your postcard! You can choose the message and how it looks and everything. Big business isn't standing by though. They are flooding
the normal democratic process with lobbying to get the plans watered down and strip us of our right to privacy. They want to keep on profiting from our most intimate data. Take Everything Everywhere,
reported this week to be selling the data of their 27 million mobile customers to the polling company Ipsos MORI. EE
customers' personal details could have been revealed to the police without their consent. EE say that the data has been anonymised, but it is often possible to re-identify people from anonymised data (a short sketch of how that can happen follows the list below). Phone companies like EE have
been pushing particularly hard against the new data protection plans. It's not hard to see why. They wouldn't be able to sell their customers' data without their consent. As they stand, the new regulations would help make sure we
control what happens to our data, not the big corporations making money from data about our personal lives. Here's what the new laws would mean for you.
You'd be able to decide who gets access to your data, what they can do with it and who they can give it to.
You could delete your data or move it wherever you like, whenever you like.
Your data would be protected whenever you could be identified. This includes so-called pseudonymous data that could still single you out despite being stripped of personal identifiers such as names and addresses.
Services that want to use your data would have to get your explicit consent beforehand so there'd be no more vague or easy-to-misunderstand 'agreements.'
There would be severe penalties when the rules were broken to help deter companies from misusing your data and infringing your privacy.
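As promised above, here is a minimal, entirely hypothetical sketch of how supposedly anonymised data can be re-identified: a dataset stripped of names is joined to another dataset on quasi-identifiers such as postcode, birth year and gender. Every name, value and field below is invented for illustration; this is not EE's data or any real dataset.

# Hypothetical linkage-attack sketch: none of this data is real.
anonymised_usage = [
    {"postcode": "SW1A 1AA", "birth_year": 1975, "gender": "F", "late_night_calls": 42},
    {"postcode": "M1 1AE", "birth_year": 1988, "gender": "M", "late_night_calls": 3},
]

public_register = [
    {"name": "Alice Example", "postcode": "SW1A 1AA", "birth_year": 1975, "gender": "F"},
    {"name": "Bob Example", "postcode": "M1 1AE", "birth_year": 1988, "gender": "M"},
]

def quasi_id(record):
    # A key built only from quasi-identifiers; no name or phone number needed.
    return (record["postcode"], record["birth_year"], record["gender"])

lookup = {quasi_id(r): r["name"] for r in public_register}

for row in anonymised_usage:
    name = lookup.get(quasi_id(row))
    if name:
        print(name, "re-identified:", row["late_night_calls"], "late-night calls")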
But all this is under threat. If the big corporations and their armies of lobbyists get their way, the new law won't have any teeth and companies will just keep on invading your privacy. Help stop their
full frontal assault on our personal data! Please send a postcard to your MEPs.
|
|
|
|
|
| 5th April 2013
|
|
|
Advocating a nightmare world where historical fact will be re-written to suit those who complain loudest and have the most money to demand their version of the truth in court See
article from guardian.co.uk |
|
Google opposes the right to be forgotten in the European Court of Justice
|
|
|
| 28th February 2013
|
|
| See article from
searchengineland.com
|
In a test case that could have significant implications for Google throughout Europe, the company faced off against the Spanish data protection authority in the European Court of Justice. From the Spanish government's point of view, its data
protection authority is pushing for the recently articulated right (of individuals) to be forgotten by having content or data about them removed from the search index upon request. From Google's perspective, if the court agrees with Spain, the
outcome would be tantamount to granting individuals the right to censor Google. The Spanish citizen, Mario Costeja, filed a complaint with the Spanish Data Protection Agency (AEPD) against Google and the newspaper La Vanguardia after discovering
that a Google search for his name produced results referring to the auction of real estate property seized from him for non-payment of social security contributions. The AEPD rejected Costeja's complaint against the newspaper on the grounds that
the publication of the information was legal and was protected by the right to information but, with extraordinary inconsistency, upheld his complaint against Google, ordering the search engine to eliminate about 100 links from all
future searches for Costeja's name. Google refused to accept the ruling and filed an appeal which has now reached court.
|
7th March 2012 | |
| Spain asks European Court to comment on the legality of demands of Google to de-list personal information
| See article from
reuters.com
|
Spain's highest court wants the European Court of Justice (ECJ) to decide if requests by Spanish citizens to have data deleted from Google's search engine are lawful. The Spanish court said it had asked the ECJ to clarify whether Google should
remove data from its search engine's index and news aggregator. Madrid's data protection authority has received over 100 requests from Spanish citizens to have their data removed from Google's search results. An example case is a plastic surgeon
who wants to get rid of archived references to a botched operation. The Spanish judges also asked the ECJ whether the complainants must take their grievances to California, where Google is based, or whether they can be addressed by Google
Spain. Google has maintained that it cannot lawfully remove any content for which it is merely the host and not the producer, a principle enshrined in EU law on eCommerce since 2000. Google told the Spanish prosecutor it needed more legal
justification for removing references to events in an individual's history.
|
16th February 2012 | |
| Lawyer warns that the 'right to be forgotten' will surely lead to internet censorship
|
See
article from
newstrackindia.com
|
A leading British lawyer has condemned new European regulations that force websites to delete data on users' request, saying such rules could transform search engines like Google into a censor-in-chief for the European Union, rather than a neutral
platform . According to the current European proposal from Justice Commissioner Viviane Reding, various websites will be forced to delete information shortly after consumers request it be removed. Prof Jeffrey Rosen, writing in the
Stanford Law Review, argued that the fear of fines will have a chilling effect, and that it will be hard to enforce across the Internet when information is widely disseminated: Although Reding depicted the new
right as a modest expansion of existing data privacy rights, in fact it represents the biggest threat to free speech on the Internet in the coming decade. Unless the right is defined more precisely when it is promulgated over the
next year or so, it could precipitate a dramatic clash between European and American conceptions of the proper balance between privacy and free speech, leading to a far less open Internet.
Prof Rosen warns that if the regulations are
implemented as currently proposed, it is hard to imagine that the Internet will remain as free and open as it is now.
|
30th January 2012 | |
| |
EU proposes a bag of worms that will only be untangled by incredibly expensive lawyers See article from
arstechnica.com |
12th August 2011 | |
| Google challenges the right to be forgotten in Spain
|
Based on article from
theregister.co.uk
|
The Spanish government has ordered Google to delete information about 90 individuals from its search engine indexes. Spain said it believes that the individuals have a right to be forgotten. The 90 are those who lodged complaints with the Spanish data
protection agency. But Google said preventing some data being accessed through search engines would have a profound chilling effect on free expression without protecting people's privacy. A court will determine whether the
individuals' details should be deleted. The issue arose when Spain published an official database online. The database contained personal information about Spanish citizens that is perhaps best not published on the internet. And indeed the Google
search engine indexed the site and made the information available in the usual search engine results. Viviane Reding, Vice-President of the European Commission and EU Justice Commissioner, seems sympathetic to the right to be forgotten. She said:
I do not approach this subject of the 'right to be forgotten' lightly. I know that there is a balance to be struck with freedom of expression. It may also call for some flexibility in the way this balance is
struck, but I cannot accept that individuals have no say over their data once it has been launched into cyberspace.
|
| |