Melon Farmers Unrated

Google Privacy


Google's many run-ins with privacy


 

Offsite Article: Free websites, advertising revenues and privacy...


Link Here 29th January 2020
Full story: Google Privacy...Google's many run-ins with privacy
If Chrome fixes privacy too fast it could break the web, Google exec debates advertising revenue vs privacy

See article from cnet.com

 

 

Offsite Article: Searching for better privacy...


Link Here 15th January 2020
Full story: Google Privacy...Google's many run-ins with privacy
Google to strangle user agent strings in its Chrome browser to hamper advertisers from profiling users via fingerprinting

See article from zdnet.com

 

 

Sensitive changes...

Google to withhold details from advertisers about where people are browsing on the internet


Link Here 17th November 2019
Full story: Google Privacy...Google's many run-ins with privacy
In what sounds like a profound change to the commercial profiling of people's website browsing history, Google has announced that it will withhold data from advertisers that categorises web pages.

In response to the misuse of medical-related browsing data, Google has announced that from February 2020 it will cease to inform advertisers about the content of the webpage where advertising space is up for auction. Presumably this is something along the lines of Google having an available advert slot on worldwidepharmacy.com but not telling the advertiser that John Doe is browsing an STD diagnosis page, though the advertiser will still be told the URL.
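To make the change concrete, here is a hedged sketch of a bid request before and after, written as TypeScript object literals. The field names are invented for illustration and are not Google's actual Authorized Buyers schema; the point is only which field disappears.

    // Hypothetical bid-request payloads; field names are invented for
    // illustration and are not the real Authorized Buyers schema.

    // Before the change: the buyer sees the page URL and Google's
    // contextual category for the page, alongside a per-user ad ID.
    const bidRequestBefore = {
      slotId: "div-gpt-ad-123",
      pageUrl: "https://worldwidepharmacy.com/std-diagnosis",
      contentCategories: ["/Health/Conditions/STDs"],
      adIdentifier: "user-a1b2c3d4",
    };

    // From February 2020: the categories are withheld, so a buyer can
    // no longer join the ad identifier to Google's content labels.
    const bidRequestAfter = {
      slotId: "div-gpt-ad-123",
      pageUrl: "https://worldwidepharmacy.com/std-diagnosis",
      adIdentifier: "user-a1b2c3d4",
    };

Note that the URL field survives, which is exactly why the change is weaker than it sounds: anyone willing to crawl and classify pages themselves can rebuild the withheld categories.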

Chetna Bindra, senior product manager for trust and privacy at Google, wrote:

While we already prohibit advertisers from using our services to build user profiles around sensitive categories, this change will help avoid the risk that any participant in our auctions is able to associate individual ad identifiers with Google's contextual content categories.

Google also plans to update its EU User Consent Policy audit program for publishers and advertisers, as well as its audits for the Authorized Buyers program, and to continue to engage with data protection authorities, including the Irish Data Protection Commission as it continues its investigation into data protection practices in the context of Authorized Buyers.

Although this sounds like very good news for people wishing to keep their sensitive data private, it may not be so good for advertisers, who will see costs rise, or for publishers, who will see incomes fall.

And of course Google itself will still know that John Doe has been browsing STD diagnosis pages. There could be other consequences too, such as advertisers sending out their own bots to categorise likely advertising slots.

 

 

Offsite Article: Subtly de-anonymising internet users...


Link Here 6th September 2019
Full story: Google Privacy...Google's many run-ins with privacy
Brave presents new technical evidence about personalised advertising, and has uncovered a mechanism by which Google appears to be circumventing its purported GDPR privacy protections

See article from brave.com

 

 

Don't Play in Google's Privacy Sandbox...

A detailed technical investigation of Google's advanced tools designed to profile internet users for advertising


Link Here 31st August 2019
Full story: Google Privacy...Google's many run-ins with privacy

Last week, Google announced a plan to build a more private web. The announcement post was, frankly, a mess. The company that tracks user behavior on over 2/3 of the web said that "privacy is paramount to us, in everything we do."

Google not only doubled down on its commitment to targeted advertising, but also made the laughable claim that blocking third-party cookies -- by far the most common tracking technology on the Web, and Google's tracking method of choice -- will hurt user privacy. By taking away the tools that make tracking easy, it contended, browser developers like Apple and Mozilla will force trackers to resort to opaque techniques like fingerprinting. Of course, lost in that argument is the fact that the makers of Safari and Firefox have shown serious commitments to shutting down fingerprinting, and both browsers have made real progress in that direction. Furthermore, a key part of the Privacy Sandbox proposals is Chrome's own (belated) plan to stop fingerprinting.

But hidden behind the false equivalencies and privacy gaslighting are a set of real technical proposals. Some are genuinely good ideas. Others could be unmitigated privacy disasters. This post will look at the specific proposals under Google's new Privacy Sandbox umbrella and talk about what they would mean for the future of the web.

The good: fewer CAPTCHAs, fighting fingerprints

Let's start with the proposals that might actually help users.

First up is the Trust API. This proposal is based on Privacy Pass, a privacy-preserving and frustration-reducing alternative to CAPTCHAs. Instead of having to fill out CAPTCHAs all over the web, with the Trust API, users will be able to fill out a CAPTCHA once and then use trust tokens to prove that they are human in the future. The tokens are anonymous and not linkable to one another, so they won't help Google (or anyone else) track users. Since Google is the single largest CAPTCHA provider in the world, its adoption of the Trust API could be a big win for users with disabilities, users of Tor, and anyone else who hates clicking on grainy pictures of storefronts.
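As a rough illustration of the Privacy Pass idea underneath the Trust API, here is a toy TypeScript sketch. The real protocol uses blinded cryptographic tokens so the issuer cannot link issuance to redemption; this sketch only shows the shape of the flow (issue many tokens after one CAPTCHA, spend each once), and the Issuer class and token format are invented.

    interface TrustToken { value: string }

    class Issuer {
      private redeemed = new Set<string>();

      // Called once, after the user solves a single CAPTCHA.
      issue(count: number): TrustToken[] {
        return Array.from({ length: count }, () => ({
          value: crypto.randomUUID(), // stand-in for a blind-signed token
        }));
      }

      // Each token proves "a human solved a CAPTCHA once" and can be
      // spent exactly once.
      redeem(token: TrustToken): boolean {
        if (this.redeemed.has(token.value)) return false; // double-spend
        this.redeemed.add(token.value);
        return true;
      }
    }

    const issuer = new Issuer();
    const wallet = issuer.issue(30);       // one CAPTCHA, many tokens
    console.log(issuer.redeem(wallet[0])); // true: accepted, no CAPTCHA
    console.log(issuer.redeem(wallet[0])); // false: tokens are single-use

In this toy version the issuer could link tokens back to the issuance because it sees their plain values; preventing exactly that linkage is what the blind-signing cryptography in Privacy Pass is for.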

Google's proposed privacy budget for fingerprinting is also exciting. Browser fingerprinting is the practice of gathering enough information about a specific browser instance to try to uniquely identify a user. Usually, this is accomplished by combining easily accessible information like the user agent string with data from powerful APIs like the HTML canvas. Since fingerprinting extracts identifying data from otherwise-useful APIs, it can be hard to stop without hamstringing legitimate web apps. As a workaround, Google proposes limiting the amount of data that websites can access through potentially sensitive APIs. Each website will have a budget, and if it goes over budget, the browser will cut off its access. Most websites won't have any use for things like the HTML canvas, so they should be unaffected. Sites that need access to powerful APIs, like video chat services and online games, will be able to ask the user for permission to go over budget. The devil will be in the details, but the privacy budget is a promising framework for combating browser fingerprinting.
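A minimal sketch of how such a budget might be enforced, assuming invented per-surface "bit costs" and a 20-bit cap (the proposal fixes neither):

    const SURFACE_COST = {
      userAgent: 2,        // cheap: widely shared value
      canvasReadback: 15,  // rich: per-device rendering quirks
      audioContext: 8,
      fontList: 10,
    };

    class PrivacyBudget {
      private spent = 0;
      constructor(private site: string, private limit = 20) {}

      // Charge the budget whenever the site touches a fingerprintable
      // surface; once over budget, the browser refuses further access.
      request(surface: keyof typeof SURFACE_COST): boolean {
        const cost = SURFACE_COST[surface];
        if (this.spent + cost > this.limit) {
          console.warn(`${this.site}: ${surface} blocked, budget spent`);
          return false;
        }
        this.spent += cost;
        return true;
      }
    }

    const page = new PrivacyBudget("news.example");
    page.request("userAgent");      // true:  2 of 20 bits used
    page.request("canvasReadback"); // true:  17 of 20 bits used
    page.request("fontList");       // false: 17 + 10 would exceed 20

The hard design question, which this sketch glosses over, is choosing the costs and the cap: set them too low and legitimate apps break, too high and a fingerprint still fits under the ceiling.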

Unfortunately, that's where the good stuff ends. The rest of Google's proposals range from mediocre to downright dangerous.

The bad: Conversion measurement

Perhaps the most fleshed-out proposal in the Sandbox is the conversion measurement API. This is trying to tackle a problem as old as online ads: how can you know whether the people clicking on an ad ultimately buy the product it advertised? Currently, third-party cookies do most of the heavy lifting. A third-party advertiser serves an ad on behalf of a marketer and sets a cookie. On its own site, the marketer includes a snippet of code which causes the user's browser to send the cookie set earlier back to the advertiser. The advertiser knows when the user sees an ad, and it knows when the same user later visits the marketer's site and makes a purchase. In this way, advertisers can attribute ad impressions to page views and purchases that occur days or weeks later.
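For comparison, the cookie-based loop being replaced fits in a few lines; domains and IDs here are made up:

    // 1. adnetwork.example serves an ad on publisher.example and sets
    //    a third-party cookie: uid=u-42.
    // 2. marketer.example's checkout page embeds a tracking pixel:
    const pixel = new Image();
    pixel.src = "https://adnetwork.example/convert?order=9913";
    // The browser attaches uid=u-42 to that request automatically, so
    // the network joins "u-42 saw the ad" with "u-42 bought order 9913".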

Without third-party cookies, that attribution gets a little more complicated. Even if an advertiser can observe traffic around the web, without a way to link ad impressions to page views, it won't know how effective its campaigns are. After Apple started cracking down on advertisers' use of cookies with Intelligent Tracking Prevention (ITP), it also proposed a privacy-preserving ad attribution solution. Now, Google is proposing something similar. Basically, advertisers will be able to mark up their ads with metadata, including a destination URL, a reporting URL, and a field for extra impression data -- likely a unique ID. Whenever a user sees an ad, the browser will store its metadata in a global ad table. Then, if the user visits the destination URL in the future, the browser will fire off a request to the reporting URL to report that the ad was converted.
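A minimal sketch of that flow as described above; the interface and function names are invented, not the proposed API surface:

    interface AdImpression {
      destinationUrl: string; // where the ad leads
      reportingUrl: string;   // where the conversion ping goes
      impressionData: bigint; // the metadata field discussed below
    }

    // The browser, not the tracker, keeps the table and sends reports.
    const adTable: AdImpression[] = [];

    function onAdViewed(ad: AdImpression): void {
      adTable.push(ad); // stored locally when the ad is rendered
    }

    function onNavigation(url: string): void {
      for (const ad of adTable) {
        if (url.startsWith(ad.destinationUrl)) {
          // Report the conversion, echoing the impression data.
          fetch(`${ad.reportingUrl}?data=${ad.impressionData}`);
        }
      }
    }

    onAdViewed({
      destinationUrl: "https://shoes.example/",
      reportingUrl: "https://ads.example/convert",
      impressionData: 123456789n,
    });
    onNavigation("https://shoes.example/checkout"); // fires the report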

In theory, this might not be so bad. The API should allow an advertiser to learn that someone saw its ad and then eventually landed on the page it was advertising; this can give raw numbers about the campaign's effectiveness without individually-identifying information.

The problem is the impression data. Apple's proposal allows marketers to store just 6 bits of information in a campaign ID, that is, a number between 1 and 64. This is enough to differentiate between ads for different products, or between campaigns using different media.

On the other hand, Google's ID field can contain 64 bits of information -- a number between 1 and 18 quintillion. This will allow advertisers to attach a unique ID to each and every ad impression they serve, and, potentially, to connect ad conversions with individual users. If a user interacts with multiple ads from the same advertiser around the web, these IDs can help the advertiser build a profile of the user's browsing habits.
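The arithmetic behind those two field widths:

    // 6 bits versus 64 bits, using BigInt to avoid overflow:
    const appleValues = 2n ** 6n;   // 64 distinct campaign labels
    const googleValues = 2n ** 64n; // 18,446,744,073,709,551,616 values
    // With 64 bits an advertiser can mint a fresh ID per impression
    // and later match each conversion to the exact user who saw the ad.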

The ugly: FLoC

Even worse is Google's proposal for Federated Learning of Cohorts (or FLoC). Behind the scenes, FLoC is based on Google's pretty neat federated learning technology. Basically, federated learning allows users to build their own, local machine learning models by sharing little bits of information at a time. This allows users to reap the benefits of machine learning without sharing all of their data at once. Federated learning systems can be configured to use secure multi-party computation and differential privacy in order to keep raw data verifiably private.
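For readers unfamiliar with the technique, a toy federated-averaging round looks something like this; the local "training" step is a placeholder, and real deployments layer secure aggregation and differential-privacy noise on top:

    type Weights = number[];

    // Placeholder "training": nudge each weight toward the local mean.
    // Real clients would run gradient steps on private data instead.
    function localUpdate(global: Weights, localData: number[]): Weights {
      const mean = localData.reduce((a, b) => a + b, 0) / localData.length;
      return global.map(w => w + 0.1 * (mean - w));
    }

    // The server only ever sees the averaged update, not the raw data.
    function federatedAverage(updates: Weights[]): Weights {
      return updates[0].map(
        (_, i) => updates.reduce((sum, u) => sum + u[i], 0) / updates.length
      );
    }

    let globalModel: Weights = [0, 0];
    const clientData = [[1, 2], [3, 4], [5, 6]]; // never pooled centrally
    globalModel = federatedAverage(
      clientData.map(d => localUpdate(globalModel, d))
    );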

The problem with FLoC isn't the process, it's the product. FLoC would use Chrome users' browsing history to do clustering. At a high level, it will study browsing patterns and generate groups of similar users, then assign each user to a group (called a flock). At the end of the process, each browser will receive a flock name which identifies it as a certain kind of web user. In Google's proposal, users would then share their flock name, as an HTTP header, with everyone they interact with on the web.
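A hedged sketch of what that would expose, using a hash-based stand-in for the unspecified clustering and an invented header name:

    // The browser derives a cohort label locally from history...
    function assignFlock(domains: string[]): string {
      // Stand-in for the unspecified clustering: hash the visited
      // domains into one of 4,096 cohorts.
      const key = [...new Set(domains)].sort().join(",");
      let h = 0;
      for (const ch of key) h = (h * 31 + ch.charCodeAt(0)) >>> 0;
      return `flock-${h % 4096}`;
    }

    // ...then shares it with every site it talks to, e.g. as a header:
    //   GET /article HTTP/1.1
    //   Sec-Flock: flock-2741   <-- invented header name
    assignFlock(["news.example", "shoes.example", "clinic.example"]);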

This is, in a word, bad for privacy. A flock name would essentially be a behavioral credit score: a tattoo on your digital forehead that gives a succinct summary of who you are, what you like, where you go, what you buy, and with whom you associate. The flock names will likely be inscrutable to users, but could reveal incredibly sensitive information to third parties. Trackers will be able to use that information however they want, including to augment their own behind-the-scenes profiles of users.

Google says that the browser can choose to leave sensitive data from browsing history out of the learning process. But, as the company itself acknowledges, different data is sensitive to different people; a one-size-fits-all approach to privacy will leave many users at risk. Additionally, many sites currently choose to respect their users' privacy by refraining from working with third-party trackers. FLoC would rob these websites of such a choice.

Furthermore, flock names will be more meaningful to those who are already capable of observing activity around the web. Companies with access to large tracking networks will be able to draw their own conclusions about the ways that users from a certain flock tend to behave. Discriminatory advertisers will be able to identify and filter out flocks which represent vulnerable populations. Predatory lenders will learn which flocks are most prone to financial hardship.

FLoC is the opposite of privacy-preserving technology. Today, trackers follow you around the web, skulking in the digital shadows in order to guess at what kind of person you might be. In Google's future, they will sit back, relax, and let your browser do the work for them.

The ugh: PIGIN

That brings us to PIGIN. While FLoC promises to match each user with a single, opaque group identifier, PIGIN would have each browser track a set of interest groups that it believes its user belongs to. Then, whenever the browser makes a request to an advertiser, it can send along a list of the user's interests to enable better targeting.

Google's proposal devotes a lot of space to discussing the privacy risks of PIGIN. However, the protections it discusses fall woefully short. The authors propose using cryptography to ensure that there are at least 1,000 people in an interest group before disclosing a user's membership in it, as well as limiting the maximum number of interests disclosed at a time to 5. This limitation doesn't hold up to much scrutiny: membership in 5 distinct groups, each of which contains just a few thousand people, will be more than enough to uniquely identify a huge portion of users on the web. Furthermore, malicious actors will be able to game the system in a number of ways, including to learn about users' membership in sensitive categories. While the proposal gives a passing mention to using differential privacy, it doesn't begin to describe how, specifically, that might alleviate the myriad privacy risks PIGIN raises.
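A quick back-of-the-envelope check of why that limit fails, assuming (hypothetically) 10,000 possible interest groups:

    // How many distinct 5-group combinations are there?
    function choose(n: number, k: number): number {
      let result = 1;
      for (let i = 1; i <= k; i++) result = (result * (n - k + i)) / i;
      return result;
    }
    console.log(choose(10_000, 5)); // ~8.3e17 combinations
    // Each group may hold 1,000+ people, but ~8.3e17 combinations vastly
    // outnumber web users, so a 5-interest set acts like a unique ID.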

Google touts PIGIN as a win for transparency and user control. This may be true to a limited extent. It would be nice to know what information advertisers use to target particular ads, and it would be useful to be able to opt out of specific interest groups one by one. But like FLoC, PIGIN does nothing to address the bad ways that online tracking currently works. Instead, it would provide trackers with a massive new stream of information they could use to build or augment their own user profiles. The ability to remove specific interests from your browser might be nice, but it won't do anything to prevent any company that has already collected that data from storing, sharing, or selling it. Furthermore, these features of PIGIN would likely become another option that most users don't touch. Defaults matter. While Apple and Mozilla work to make their browsers private out of the box, Google continues to invent new privacy-invasive practices for users to opt out of.

It's never about privacy

If the Privacy Sandbox won't actually help users, why is Google proposing all these changes?

Google can probably see which way the wind is blowing. Safari's Intelligent Tracking Prevention and Firefox's Enhanced Tracking Protection have severely curtailed third-party trackers' access to data. Meanwhile, users and lawmakers continue to demand stronger privacy protections from Big Tech. While Chrome still dominates the browser market, Google might suspect that the days of unlimited access to third-party cookies are numbered.

As a result, Google has apparently decided to defend its business model on two fronts. First, it's continuing to argue that third-party cookies are actually fine, and that companies like Apple and Mozilla who restrict trackers' access to user data will end up harming user privacy. This argument is absurd. But unfortunately, as long as Chrome remains the most popular browser in the world, Google will be able to single-handedly dictate whether cookies remain a viable option for tracking most users.

At the same time, Google seems to be hedging its bets. The Privacy Sandbox proposals for conversion measurement, FLoC, and PIGIN are each aimed at replacing one of the existing ways that third-party cookies are used for targeted ads. Google is brainstorming ways to continue serving targeted ads in a post-third-party-cookie world. If cookies go the way of the pop-up ad, Google's targeting business will continue as usual.

The Sandbox isn't about your privacy. It's about Google's bottom line. At the end of the day, Google is an advertising company that happens to make a browser.

 

 

Offsite Article: Google defends tracking cookies...


Link Here 27th August 2019
Full story: Google Privacy...Google's many run-ins with privacy
Banning tracking cookies jeopardizes the future of the vibrant Web. By Timothy B. Lee

See article from arstechnica.com

 

 

Offsite Article: Google's new reCAPTCHA has a dark side...


Link Here 28th June 2019
Full story: Google Privacy...Google's many run-ins with privacy
Analysing the way you navigate around websites, and hassling those it decides aren't doing it right

See article from fastcompany.com

 

 

Having to ask Google to find the way to opt out of personalised advertising...

Google fined 50 million euros for not obtaining clear consent when snooping on browsing history so as to personalise adverts


Link Here 22nd January 2019
Full story: Google Privacy...Google's many run-ins with privacy

Google has been fined 50 million euros by the French data censor CNIL for a breach of the EU's data protection rules.

CNIL said it had levied the record fine for lack of transparency, inadequate information and lack of valid consent regarding ads personalisation. It judged that people were not sufficiently informed about how Google collected data to personalise advertising and that Google had not obtained clear consent to process data because essential information was disseminated across several documents. The relevant information is accessible after several steps only, implying sometimes up to five or six actions, CNIL said.

In a statement, Google said it was studying the decision to determine its next steps.

The first complaint under the EU's new General Data Protection Regulation (GDPR) was filed on 25 May 2018, the day the legislation took effect. The complainant groups claimed Google did not have a valid legal basis to process user data for ad personalisation, as mandated by the GDPR.

Many internet companies rely on vague wording such as 'improving user experience' to gain consent for a wide range of data uses, but the GDPR provides that consent is 'specific' only if it is given distinctly for each purpose.

Perhaps this fine may help protect data gathered on UK porn users under the upcoming age verification requirements. Requiring consent for narrowly defined data usages may mean that action can be taken to prevent user identities and browsing histories from being sold on.

 

 

General Data Protection Rights abuse...

Google may continue to use facial recognition to tag pictures obtained from Google Photos without obtaining consent


Link Here 2nd January 2019
Full story: Google Privacy...Google's many run-ins with privacy
A US federal judge has thrown out a lawsuit alleging that Google's non-consensual use of facial recognition technology violated users' privacy rights, allowing the tech giant to continue to scan and store their biometric data.

The lawsuit, filed in 2016, alleged that Google violated Illinois state law by collecting users' biometric data without consent. The data was harvested from pictures stored on Google Photos.

The plaintiffs wanted more than $5 million in damages for hundreds of thousands of users affected, arguing that the unauthorized scanning of their faces was a violation of the Illinois Biometric Information Privacy Act, which completely outlaws the gathering of biometric information without consent.

Google countered, claiming that the plaintiffs were not entitled to any compensation, as they had not been harmed by the data collection. On Saturday, US District Judge Edmond E. Chang sided with the tech giant, ruling that the plaintiffs had not suffered any concrete harm, and dismissing the suit.

As well as allowing Google to continue the practice, the ruling could have implications for other cases pending against Facebook and Snapchat. Both companies are currently being sued for violating the Illinois act.

 

 

Offsite Article: Google sued for secretly tracking millions of UK iPhone users...


Link Here 23rd May 2018
Full story: Google Privacy...Google's many run-ins with privacy
Google accused of bypassing default browser Safari's privacy settings to collect a broad range of data and deliver targeted advertising.

See article from alphr.com




 
