British Government proposes Orwellian scheme to connect up people's health records with data snooped from their social media use to nag them about their health
People's medical records will be combined with social and smartphone surveillance to predict who will pick up bad habits and stop them getting ill, under radical government proposals.
Matt Hancock, the health secretary, is planning a system of
"predictive prevention", in which algorithms will trawl data on individuals and send targeted health nags to those flagged as prone to problems such as taking up smoking or becoming obese.
The creepy plans have already attracted
privacy concerns among doctors and campaigners, who say the project risks backfiring by scaring people or being seen to abuse public trust in the NHS's handling of sensitive information.
A representative for games developer EA has announced on an online forum that the mobile game The Sims: Freeplay will no longer be available in seven countries: China, Saudi Arabia, the United Arab Emirates, Oman, Kuwait, Qatar and Egypt.
A spokesperson said that in light of "regional standards" the game would no longer be updated. EA did not confirm the exact nature of these regional standards, prompting many fans to speculate that the ban was caused by the game's explicit LGBT
content. The EA spokesperson wrote:
We've always been proud that our in-game experiences embrace values as broad and diverse as our incredible Sims community. This has been important to us, as we know it is to you.
Users who have already downloaded the game will still be able to play it; however, the game will not be updated and may eventually become obsolete. Players will also be unable to make in-game purchases.
The popular EA life
simulation video game includes diverse elements such as same-sex weddings, gay adoptions and male pregnancies. The game lets players pick whether their sim has a feminine or masculine frame, and allows them to decide whether their sim stands to use
the toilet.
The use of 'mobile phone extraction' tools enables police forces to download all of the content and data from people's phones. This can apply to suspects, witnesses and even victims -- without their knowledge.
With no clear
policies or guidance on the use of this technology, individuals are unaware of their legal rights in terms of:
whether data is only taken when necessary and proportionate;
getting the police to delete this data when there is no legal reason to retain it, particularly if they are innocent of any crime;
ensuring data is held securely to prevent exposure of their personal data as a result of loss of records, misuse or security breach.
As the use of this technology is unregulated, we don't know how this data is used, how it is stored and secured, and if it's ever even deleted.
Privacy International is calling for:
the use of this intrusive technology to be properly regulated, with independent oversight so that abuse and misuse do not go undetected;
a proper warrantry regime to be implemented, so that the
technology cannot be used arbitrarily;
people to be informed of their rights if the police want to search their phone.
The US-based global tech giant Apple Inc. is set to hand over the operation of its iCloud data center in mainland China to a local corporation called Guizhou-Cloud Big Data (GCBD) by February 28, 2018. When this transition happens, the local company
will become responsible for handling the legal and financial relationship between Apple and China's iCloud users. After the transition takes place, Apple's role will be restricted to a US$1 billion investment in the construction of a
data center in Guiyang and to providing technical support to the center, in the interest of preserving data security.
GCBD was established in November 2014 with RMB 235 million [approximately US$37.5 million] in registered
capital. It is a state enterprise solely owned by the Guizhou Big Data Development and Management Bureau and is also supervised by the Guizhou Board of Supervisors of State-owned Enterprises.
What will happen to
Apple's Chinese customers once iCloud services are handed over to GCBD? In public statements, Apple has avoided acknowledging the political implications of the move:
This will allow us to continue to improve the
speed and reliability of iCloud in China and comply with Chinese regulations.
Apple Inc. has not explained the real issue, which is that a state-owned big data company controlled by the Chinese government will have
access to all the data of its iCloud service users in China. This will allow the capricious state apparatus to jump into the cloud and look into the data of Apple's Chinese users.
Over the next few weeks, iCloud users in China will
receive a notification from Apple seeking their acceptance of the new terms of service. These "iCloud (operated by GCBD) terms and conditions" have a newly added paragraph, which reads:
If you understand and
agree, Apple and GCBD have the right to access your data stored on its servers. This includes permission sharing, exchange, and disclosure of all user data (including content) according to the application of the law.
In other words, once the agreement is signed, GCBD -- a company solely owned by the state -- would get a key that can access all iCloud user data in China, legally.
Apple's double standard
Why would a company that built its reputation on data security surrender to the Chinese government so easily?
I still remember how in February 2016,
after the attack in San Bernardino, Apple CEO Tim Cook withstood pressure from the US Department of Justice to build an iPhone operating system that could circumvent security features and install it on the shooter's iPhone. Cook even issued an open letter
to defend the company's decision.
Apple's insistence on protecting user data won broad public support. At the same time, it was criticized by the Department of Justice, which retorted that the open letter "appears
to be based on its concern for its business model and public brand marketing strategy."
This comment has proven true today, because it is clear that the company is operating on a double standard in its Chinese business. We
could even say that it is bullying the good actor while being terrified by the bad one.
Apple Inc. and Tim Cook, who had once stood firm against the US government, have suddenly become soft in front of the Chinese government. Faced
with the unreasonable demand put forward by the Chinese authorities, Apple has not demonstrated any will to resist. On the contrary, it gives the impression that it will do whatever is needed to please the authorities.
Near
the end of 2017, Apple Inc. admitted it had removed 674 VPN apps from the Chinese App Store. These apps are often used by netizens to circumvent the Great Firewall (the blocking of overseas websites and content). Skype
also vanished from the Chinese App Store. And Apple's submission to the Chinese authorities' requests generated a feeling of
"betrayal" among Chinese users.
Some of my friends from mainland China have even decided to give up using Apple phones and shifted to other mainland Chinese brands. Their decision, price aside, is
mainly a reaction to Apple's decision to take down VPN apps from the Chinese App Store.
Some of these VPN apps can still be downloaded on phones running the Android system. This indicates that Apple is not
"forced" to comply. People suspect that it is proactively playing an "obedient" role.
The handover of China's iCloud service to GCBD is unquestionably an act of submission and kowtowing. Online, several people have
quipped: "the Chinese government is asking for 50 cents, Apple gives her a dollar."
Selling the iPhone in China
Apple says the handover is due to new regulations requiring that cloud servers be
operated by a local corporation. But this is unconvincing. China's Cybersecurity Law, which came into force on June 1, 2017, does demand that user information and data collected in mainland China be stored within the border. But it does not require
that the data center be operated by a local corporation.
In other words, even according to Article 37 of the Cybersecurity Law, Apple does not need to hand over the operation of iCloud services to a local corporation, to say
nothing of the fact that the operator is solely owned by the state. Though Apple may have to follow the "Chinese logic" or "unspoken rules", the decision looks more like a strategic act intended to insulate Apple from financial, legal
and moral responsibility to its Chinese users, as stated in the new customer terms and conditions on the handover of operation. It only wants to continue making a profit by selling iPhones in China.
Many people have encountered
similar difficulties when doing business in China -- they have to follow the authorities' demands. Some even think that it is inevitable and therefore reasonable. For example, Baidu's CEO Robin Li said in a recent interview with Time Magazine,
"That's our way of doing business here".
I can see where Apple is coming from. China is now the third largest market for the iPhone. Confronted with vicious competition from local brands, the iPhone's future growth in
China is under threat. And unlike in the US, if Apple does not submit to China and comply with the Cybersecurity Law, the Chinese authorities can use other regulations and laws, such as the draft Encryption Law of the People's Republic of
China and the draft Measures for Security Assessment of Cross-border Data Transfer, to force Apple to yield.
However, for the world's biggest corporation by market value, one with so many loyal fans, Apple's performance
in China is still disappointing. It has not even tried to resist. On the contrary, it has proactively assisted [Chinese authorities] in selling out its users' private data.
Assisting in the making of a 'Cloud Dictatorship'
This is perhaps the best result that China's party-state apparatus could hope for. In recent years, China has come to see big data as a strategic resource for its diplomacy and for maintaining domestic stability. Big data is as
important as military strength and ideological control. There is even a new political term "Data-in-Party-control" coming into use.
As an Apple fan, I lament the fact that Apple has become a key multinational
corporation offering its support to the Chinese Communist Party's engineering of a "Cloud Dictatorship". It serves as a very bad role model: now that Apple has kowtowed to the CCP, how long will other tech companies like Facebook, Google and
Amazon be able to resist the pressure?
Smartphones rule our lives. Having information at our fingertips is the height of convenience. They tell us all sorts of things, but the information we see and receive on our smartphones is just a fraction of the data they generate. By tracking and
monitoring our behaviour and activities, smartphones build a digital profile of shockingly intimate information about our personal lives.
These records aren’t just a log of our activities. The digital profiles they create are
traded between companies and used to make inferences and decisions that affect the
opportunities open to us and our lives. What’s more, this typically happens without our knowledge, consent or control.
New and sophisticated methods built into smartphones make it easy to track and monitor our behaviour. A vast
amount of information can be collected from our smartphones, both when being actively used and while running in the background. This information can include our location, internet search history, communications, social media activity, finances and
biometric data such as fingerprints or facial features. It can also include metadata – information about the data – such as the time and recipient of a text message.
Each type of data can reveal something about our interests and preferences, views, hobbies and social interactions. For example, a study conducted by MIT demonstrated how
email metadata can be used to map our lives, showing the changing dynamics of our professional and personal networks.
This data can be used to infer personal information including a person’s background, religion or beliefs, political views, sexual orientation and gender identity, social connections, or health. For example, it is possible to
deduce our specific health conditions simply by connecting the dots between a series of phone calls.
Different types of data can be consolidated and linked to build a comprehensive profile of us. Companies that buy and sell data –
data brokers – already do this. They collect and combine billions of data
elements about people to make inferences about them. These inferences may seem innocuous but can
reveal sensitive information such as ethnicity, income levels, educational attainment, marital status, and family composition.
A recent study found that
seven in ten smartphone apps share data with third-party tracking companies like Google Analytics.
Data from numerous apps can be linked within a smartphone to build this more detailed picture of us, even if permissions for individual apps are granted separately. Effectively, smartphones can be converted into surveillance devices.
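As a rough illustration of that linking mechanism (the field names and advertising identifier below are made up for the example, not taken from any specific tracking SDK), the sketch shows how event records reported by separate apps can be stitched into a single profile once they all carry the same device advertising ID:

    # Hypothetical sketch: joining events reported by different apps once they
    # share the same device advertising identifier (all values are illustrative).
    from collections import defaultdict

    events = [
        {"ad_id": "38400000-8cf0-11bd-b23e-10b96e40000d", "app": "fitness",  "event": "run_logged",   "place": "51.50,-0.12"},
        {"ad_id": "38400000-8cf0-11bd-b23e-10b96e40000d", "app": "news",     "event": "article_read", "topic": "payday loans"},
        {"ad_id": "38400000-8cf0-11bd-b23e-10b96e40000d", "app": "shopping", "event": "item_viewed",  "item": "trainers"},
    ]

    # Each app asked for its own permissions, but the shared identifier lets a
    # single third party assemble the separate streams into one profile.
    profiles = defaultdict(list)
    for e in events:
        profiles[e["ad_id"]].append({k: v for k, v in e.items() if k != "ad_id"})

    for ad_id, activity in profiles.items():
        print(ad_id, "->", activity)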
The result is the creation and amalgamation of digital footprints that provide in-depth knowledge about your life. The most obvious reason for companies collecting information about individuals is for profit, to deliver targeted
advertising and personalised services. Some targeted ads, while perhaps creepy, aren’t necessarily a problem, such as an ad for the new trainers you have been eyeing up.
But targeted advertising based on our smartphone data can have real impacts on livelihoods and well-being, beyond influencing purchasing habits. For example, people in financial difficulty might be
targeted for ads for payday loans. They might use these loans to pay for
unexpected expenses, such as medical bills, car maintenance or court fees, but could also
rely on them for recurring living costs such as rent and utility bills. People in
financially vulnerable situations can then become trapped in spiralling debt as they struggle
to repay loans due to the high cost of credit.
Targeted advertising can also enable companies to discriminate against people and deny them an equal chance of accessing basic human rights, such as housing and employment. Race is
not explicitly included in Facebook’s basic profile information, but a user’s “ethnic affinity” can be worked out based on pages they have liked or engaged with. Investigative journalists from ProPublica found that it is possible to exclude those who
match certain ethnic affinities from housing ads, and
certain age groups from job ads.
This is different to traditional advertising in print and broadcast
media, which although targeted is not exclusive. Anyone can still buy a copy of a newspaper, even if they are not the typical reader. Targeted online advertising can completely exclude some people from information without them ever knowing. This is a
particular problem because the internet, and social media especially, is now such a common source of information.
Social media data can also be used to
calculate creditworthiness, despite its dubious relevance. Indicators such as the level of
sophistication in a user’s language on social media, and their friends’ loan repayment histories can now be used for credit checks. This can have a direct impact on the fees and interest rates charged on loans, the ability to buy a house, and even
employment prospects.
There’s a similar risk with payment and shopping apps. In China, the government has announced plans
to combine data about personal expenditure with official records, such as tax returns and driving offences. This initiative, which is being led by both the government and companies, is
currently in the pilot stage. When fully operational, it will produce a
social credit score that rates an individual citizen’s trustworthiness. These ratings
can then be used to issue rewards or penalties, such as privileges in loan applications or limits on career progression.
These possibilities are not distant or hypothetical – they exist now. Smartphones are
effectively surveillance devices, and everyone who uses them is exposed to these risks. What’s more, it is
impossible to anticipate and detect the full range of ways smartphone data is collected and used, and to demonstrate the full scale of its impact. What we know could be just the beginning.
Gay dating apps have been pulled from the Google Play Store in Indonesia amid a government crackdown on the LGBT community.
China-based app Blued, which is the largest hook-up app for the LGBT community across Asia and rivals Grindr globally, was
pulled from the store as the government demanded Google censor a total of 73 LGBT-related applications. The government claimed that the apps were removed due to negative and pornographic content.
Communications ministry
spokescensor Noor Iza told AFP:
There was some negative content related to pornography inside the application. Probably one or some members of the application put the pornographic content inside.
I don't know [whether the ministry has sent a similar request to Apple]. They should since there are two operating systems.
Meanwhile lawmakers are trying to pass legislation which would outlaw LGBT behaviours on
television -- potentially censoring shows that include LGBT characters as well as news reports on the LGBT community.
It is technically legal to be gay in Indonesia, except in Aceh province, which implements extreme punishments under Shariah law.
Two significant Apple shareholders, hedge fund Jana Partners and the California State Teachers' Retirement System (CalSTRS), have just penned an open letter to Apple, urging the iPhone maker to take the lead on studying the impact smartphones have on kids
and offer parents improved software tools that would allow them to better manage their children's access to smartphone apps.
The two organizations want more iOS features that would give parents more granular control, consistent with a child's development.
Jana Partners' Barry Rosenstein and CalSTRS's director of corporate governance Anne Sheehan said in the open letter that they worked with experts to review studies that found links between the use of electronic devices and adverse effects on health, sleep, empathy,
and concentration.
The two organizations urge Apple to take more responsibility and reinvent parental controls on the iPhone and iPad. In their opinion, the current settings offer an all-or-nothing approach, which lets parents block or allow access to
certain features. They think Apple could take the lead in redefining parental controls in smart devices, to keep up with recent studies and to try to prevent various health issues that may be caused by smartphone addiction.
A large number of game apps are snooping on players, using the smartphone's microphone to listen to what is playing on the TV. The apps recognise TV audio and report back what is being watched to home base, supposedly to help in targeted advertising.
Software from Alphonso, a start-up that collects TV-viewing data for advertisers, is used in at least 1,000 games. The games do seek user consent to use the microphone, but users may not be fully aware of the consequences of leaving an open mike in their house or in their children's rooms.
Alphonso's software can detail what people watch by identifying audio signals in TV ads and shows, sometimes even matching that information with the places people visit and the movies they see. The information can then be used to target ads more
precisely and to try to analyze things like which ads prompted a person to go to a car dealership.
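The reports do not describe Alphonso's actual algorithm. As a rough sketch of how this kind of audio content recognition generally works (the function names, window sizes and matching rule below are illustrative assumptions, not Alphonso's implementation), an app can fingerprint short windows of microphone audio by their strongest frequency peaks and look those fingerprints up in a catalogue of known ads and shows:

    # Illustrative sketch of audio content recognition, not Alphonso's method:
    # fingerprint audio windows by their dominant spectral peaks and match them
    # against a catalogue of known TV ads/shows.
    import numpy as np

    SAMPLE_RATE = 16_000   # assumed microphone sample rate (Hz)
    WINDOW = 4096          # samples per analysis window
    TOP_PEAKS = 4          # strongest frequency bins kept per window

    def fingerprint(samples: np.ndarray) -> list:
        """Return one hashable set of peak frequency bins per window of audio."""
        prints = []
        for start in range(0, len(samples) - WINDOW, WINDOW):
            spectrum = np.abs(np.fft.rfft(samples[start:start + WINDOW]))
            peaks = tuple(sorted(np.argsort(spectrum)[-TOP_PEAKS:]))
            prints.append(peaks)
        return prints

    def best_match(mic_audio: np.ndarray, catalogue: dict):
        """Return the catalogue title sharing the most fingerprints with the mic audio."""
        mic_prints = set(fingerprint(mic_audio))
        best, best_hits = None, 0
        for title, prints in catalogue.items():
            hits = len(mic_prints & set(prints))
            if hits > best_hits:
                best, best_hits = title, hits
        return best  # e.g. "car_dealership_ad", or None if nothing overlaps

    # A production system would use time-paired peaks, noise-robust hashing and
    # a large server-side index; this only shows the overall shape of the idea.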
Alphonso claims that its software does not record human speech. The company also says that it did not approve of its software being used in apps meant
for children. But it was, as of earlier this month, integrated in more than a dozen games like Teeth Fixed and Zap Balloons from KLAP Edutainment in India, which describes itself as primarily focusing on offering educational games for kids and students.
The apps can record audio from the microphone while a game is being played or while it is still running in the background on the phone.
Comment: Alphonso knows what you watched last summer
Technology startup Alphonso has caused widespread concern by using smartphone microphones to monitor the TV and media habits of game and app users.
The New York Times has published a story about a company called Alphonso
that has developed a technology that uses smartphone microphones to identify TV shows and films being played in the background. Alphonso claims not to record any conversations, but simply to listen to and encode samples of media for matching against its database.
The company combines the collected data with identifiers and uses the data to target advertising, audience measurement and other purposes. The technology is embedded in over one thousand apps and games but the company refuses to disclose the exact list.
Alphonso argues that users have willingly given their consent to this form of spying on their media consumption and can opt out at any time, and that its behaviour is consistent with US laws and regulations.
Even if Alphonso were not breaking any laws here or in the US, there is a systemic problem with the growing intrusion of these types of technologies that monitor ambient sounds in private spaces without sufficient public debate. Apps
are sneaking this kind of surveillance in, using privacy mechanisms that clearly cannot cope. This is despite the apps displaying a widget asking for permission to use the microphone to detect TV content, which would be a "clear affirmative
action" for consent as required by law. Something is not working, and app platforms and regulators need to take action.
In addition to the unethical abuse of users' lack of initiative or ignorance - a bit like tobacco
companies - there could be some specific breaches of privacy. The developers are clearly following the letter of the law in the US, obtaining consent and providing an opt out, but in Europe they could face more trouble, particularly after May when the
General Data Protection Regulation (GDPR) comes into force.
One of the newer requirements on consent under GDPR will be to make it as easy to withdraw as it was to give in the first place. Alphonso has a web page with
information on how to opt out through the privacy settings of devices, and this information is copied into at least some of the apps' privacy policies, buried under tons of legalese. This may not be good enough. Besides, once that consent is revoked,
companies will need to erase any data obtained if there is no other legitimate justification to keep it. It is far from clear this is happening now, or will be in May.
There is also a need for complete clarity on who is collecting
the data and who is responsible for handling any consent and its revocation. At present the roles of app developers, Apple, Google and Alphonso are blurred.
We have been asked whether individuals can take legal action. We think
that under the current regime in the UK this may be difficult because the bar is quite high and the companies involved are covering the basic ground. GDPR will make it easier to launch consumer complaints and legal action. The new law will also
explicitly allow non-material damages, which are possible already in limited circumstances, including for revealing "political opinions, religion or philosophical beliefs". Alphonso is recording the equivalent of a reading list of
audiovisual media and might be able to generate such information.
Many of these games are aimed at children. Under GDPR, all processing of children's data is seen as entailing a risk and will need extra care. Whether children
are allowed to give consent or must get it from their parents/guardians will depend on their age. In all cases information aimed at children will need to be displayed in a language they can understand. Some of the Alphonso games we checked have an age
rating of 4+.
Consumer organisations have presented complaints in the past about similar issues in internet-connected toys, and we think that Alphonso and the developers involved should be investigated by the Information
Commissioner.