
Big data meets Big Brother as China moves to rate its citizens

The Chinese government plans to launch its Social Credit System in 2020. The aim? To judge the trustworthiness – or otherwise – of its 1.3 billion residents.

On June 14, 2014, the State Council of China published an ominous-sounding document called “Planning Outline for the Construction of a Social Credit System”. In the way of Chinese policy documents, it was a lengthy and rather dry affair, but it contained a radical idea. What if there was a national trust score that rated the kind of citizen you were?

Imagine a world where many of your daily activities were constantly monitored and evaluated: what you buy at the shops and online; where you are at any given time; who your friends are and how you interact with them; how many hours you spend watching content or playing video games; and what bills and taxes you pay (or not). It’s not hard to picture, because most of that already happens, thanks to all those data-collecting behemoths like Google, Facebook and Instagram or health-tracking apps such as Fitbit. But now imagine a system where all these behaviours are rated as either positive or negative and distilled into a single number, according to rules set by the government. That would create your Citizen Score and it would tell everyone whether or not you were trustworthy. Plus, your rating would be publicly ranked against that of the entire population and used to determine your eligibility for a mortgage or a job, where your children can go to school – or even just your chances of getting a date.

A futuristic vision of Big Brother out of control? No, it’s already getting underway in China, where the government is developing the Social Credit System (SCS) to rate the trustworthiness of its 1.3 billion citizens. The Chinese government is pitching the system as a desirable way to measure and enhance “trust” nationwide and to build a culture of “sincerity”. As the policy states, “It will forge a public opinion environment where keeping trust is glorious. It will strengthen sincerity in government affairs, commercial sincerity, social sincerity and the construction of judicial credibility.”

Others are less sanguine about its wider purpose. “It is very ambitious in both depth and scope, including scrutinising individual behaviour and what books people are reading. It’s Amazon’s consumer tracking with an Orwellian political twist,” is how Johan Lagerkvist, a Chinese internet specialist at the Swedish Institute of International Affairs, described the social credit system. Rogier Creemers, a post-doctoral scholar specialising in Chinese law and governance at the Van Vollenhoven Institute at Leiden University, who published a comprehensive translation of the plan, compared it to “Yelp reviews with the nanny state watching over your shoulder”.

For now, technically, participating in China’s Citizen Scores is voluntary. But by 2020 it will be mandatory. The behaviour of every single citizen and legal person (which includes every company or other entity) in China will be rated and ranked, whether they like it or not.

Prior to its national roll-out in 2020, the Chinese government is taking a watch-and-learn approach. In this marriage between communist oversight and capitalist can-do, the government has given a licence to eight private companies to come up with systems and algorithms for social credit scores. Predictably, data giants currently run two of the best-known projects.

The first is with China Rapid Finance, a partner of the social-network behemoth Tencent, the developer of the messaging app WeChat, which has more than 850 million active users. The other, Sesame Credit, is run by the Ant Financial Services Group (AFSG), an affiliate company of Alibaba. Ant Financial sells insurance products and provides loans to small- to medium-sized businesses. However, the real star of Ant is AliPay, its payments arm that people use not only to buy things online, but also for restaurants, taxis, school fees, cinema tickets and even to transfer money to each other.

Sesame Credit has also teamed up with other data-generating platforms, such as Didi Chuxing, the ride-hailing company that was Uber’s main competitor in China before it acquired the American company’s Chinese operations in 2016, and Baihe, the country’s largest online matchmaking service. It’s not hard to see how that all adds up to gargantuan amounts of big data that Sesame Credit can tap into to assess how people behave and rate them accordingly.

So just how are people rated? Individuals on Sesame Credit are measured by a score ranging between 350 and 950 points. Alibaba does not divulge the “complex algorithm” it uses to calculate the number, but it does reveal the five factors taken into account. The first is credit history. For example, does the citizen pay their electricity or phone bill on time? Next is fulfilment capacity, which it defines in its guidelines as “a user’s ability to fulfil his/her contract obligations”. The third factor is personal characteristics, verifying personal information such as someone’s mobile phone number and address. But the fourth category, behaviour and preference, is where it gets interesting.

Under this system, something as innocuous as a person’s shopping habits becomes a measure of character. Alibaba admits it judges people by the types of products they buy. “Someone who plays video games for ten hours a day, for example, would be considered an idle person,” says Li Yingyun, Sesame’s Technology Director. “Someone who frequently buys diapers would be considered as probably a parent, who on balance is more likely to have a sense of responsibility.” So the system not only investigates behaviour – it shapes it. It “nudges” citizens away from purchases and behaviours the government does not like.

Friends matter, too. The fifth category is interpersonal relationships. What does their choice of online friends and their interactions say about the person being assessed? Sharing what Sesame Credit refers to as “positive energy” online, nice messages about the government or how well the country’s economy is doing, will make your score go up.
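To make the mechanics concrete, here is a minimal sketch of how five sub-scores like these might be combined into a single number on the published 350-950 scale. Everything in it is assumption: Alibaba has never released its formula, so the factor names are simply the five categories described above and the weights and normalisation are invented purely for illustration.

```python
# Hypothetical sketch only: Alibaba does not publish Sesame Credit's algorithm.
# The factor names are the five categories described in the article; the weights
# and the normalisation below are invented purely for illustration.

FACTOR_WEIGHTS = {
    "credit_history": 0.35,               # e.g. bills paid on time
    "fulfilment_capacity": 0.20,          # ability to meet contract obligations
    "personal_characteristics": 0.15,     # verified phone number, address, etc.
    "behaviour_and_preference": 0.20,     # what the user buys and does online
    "interpersonal_relationships": 0.10,  # who their friends are and what they post
}

SCORE_MIN, SCORE_MAX = 350, 950


def citizen_score(factors: dict) -> int:
    """Map five sub-scores (each between 0.0 and 1.0) to a single 350-950 rating."""
    weighted = sum(FACTOR_WEIGHTS[name] * factors.get(name, 0.0)
                   for name in FACTOR_WEIGHTS)
    return round(SCORE_MIN + weighted * (SCORE_MAX - SCORE_MIN))


example = {
    "credit_history": 0.9,              # pays bills on time
    "fulfilment_capacity": 0.8,
    "personal_characteristics": 1.0,    # fully verified details
    "behaviour_and_preference": 0.6,    # "idle" purchases drag this down
    "interpersonal_relationships": 0.7,
}
print(citizen_score(example))  # 839 under these invented weights
```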

Alibaba is adamant that, currently, anything negative posted on social media does not affect scores (we don’t know if this is true or not because the algorithm is secret). But you can see how this might play out when the government’s own citizen score system officially launches in 2020. Even though there is no suggestion yet that any of the eight private companies involved in the ongoing pilot scheme will ultimately be responsible for running the government’s own system, it’s hard to believe that the government will not want to extract the maximum amount of data from the pilots for its SCS. If that happens, and continues as the new normal under the government’s own SCS, it will result in private platforms acting essentially as spy agencies for the government. They may have no choice.

Posting dissenting political opinions or links mentioning Tiananmen Square has never been wise in China, but now it could directly hurt a citizen’s rating. But here’s the real kicker: a person’s own score will also be affected by what their online friends say and do, beyond their own contact with them. If someone they are connected to online posts a negative comment, their own score will also be dragged down.
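How much of a friend’s behaviour would spill over into your own score is anyone’s guess; no formula has been published. The sketch below simply assumes, for illustration, that a fixed fraction of each friend’s penalty is passed on to the user.

```python
# Purely hypothetical: the article reports that a contact's negative posts will
# drag your own score down, but no formula has been published. This sketch
# assumes a fixed fraction of each friend's penalty spills over to the user.

SPILLOVER = 0.1  # invented share of a friend's penalty that hits your own score


def adjusted_score(own_score: float, friend_penalties: list) -> float:
    """Subtract a share of each online friend's penalty from the user's score."""
    return own_score - SPILLOVER * sum(friend_penalties)


print(adjusted_score(720, [30, 0, 50]))  # 712.0: friends' infractions cost 8 points
```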

So why have millions of people already signed up to what amounts to a trial run for a publicly endorsed government surveillance system? There may be darker, unstated reasons – fear of reprisals, for instance, for those who don’t put their hand up – but there is also a lure, in the form of rewards and “special privileges” for those citizens who prove themselves to be “trustworthy” on Sesame Credit.

If their score reaches 600, they can take out a Just Spend loan of up to 5,000 yuan (around £565) to use to shop online, as long as it’s on an Alibaba site. Reach 650 points and they may rent a car without leaving a deposit. They are also entitled to faster check-in at hotels and use of the VIP check-in at Beijing Capital International Airport. Those with more than 666 points can get a cash loan of up to 50,000 yuan (£5,700), obviously from Ant Financial Services. Get above 700 and they can apply for Singapore travel without supporting documents such as an employee letter. And at 750, they get a fast-tracked application to a coveted pan-European Schengen visa. “I think the best way to understand the system is as a sort of bastard love child of a loyalty scheme,” says Creemers.
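Those thresholds amount to a simple tiered perk ladder. The sketch below uses the score levels reported above, but the lookup itself is only an illustration of how such a ladder might be implemented, not Sesame Credit’s actual code.

```python
# Illustrative only: the score thresholds are those reported for Sesame Credit,
# but the lookup itself is just a sketch of how a tiered perk ladder might work.

PERK_TIERS = [
    (750, "fast-tracked application for a pan-European Schengen visa"),
    (700, "Singapore travel application without supporting documents"),
    (666, "cash loan of up to 50,000 yuan from Ant Financial"),
    (650, "deposit-free car rental and VIP airport check-in"),
    (600, "Just Spend loan of up to 5,000 yuan on Alibaba sites"),
]


def unlocked_perks(score: int) -> list:
    """Return every perk whose threshold the given score meets, highest first."""
    return [perk for threshold, perk in PERK_TIERS if score >= threshold]


print(unlocked_perks(680))
# ['cash loan of up to 50,000 yuan from Ant Financial',
#  'deposit-free car rental and VIP airport check-in',
#  'Just Spend loan of up to 5,000 yuan on Alibaba sites']
```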

Higher scores have already become a status symbol, with almost 100,000 people bragging about their scores on Weibo (the Chinese equivalent of Twitter) within months of launch. A citizen’s score can even affect their odds of getting a date, or a marriage partner, because the higher their Sesame rating, the more prominent their dating profile is on Baihe.

Sesame Credit already offers tips to help individuals improve their ranking, including warning about the downsides of friending someone who has a low score. This might lead to the rise of score advisers, who will share tips on how to gain points, or reputation consultants willing to offer expert advice on how to strategically improve a ranking or get off the trust-breaking blacklist.

Indeed, the government’s Social Credit System is basically a big data gamified version of the Communist Party’s surveillance methods; the disquieting dang’an. The regime kept a dossier on every individual that tracked political and personal transgressions. A citizen’s dang’an followed them for life, from schools to jobs. People started reporting on friends and even family members, raising suspicion and lowering social trust in China. The same thing will happen with digital dossiers. People will have an incentive to say to their friends and family, “Don’t post that. I don’t want you to hurt your score but I also don’t want you to hurt mine.”

We’re also bound to see the birth of reputation black markets selling under-the-counter ways to boost trustworthiness. In the same way that Facebook Likes and Twitter followers can be bought, individuals will pay to manipulate their score. What about keeping the system secure? Hackers (some even state-backed) could change or steal the digitally stored information.

“People with low ratings will have slower internet speeds; restricted access to restaurants and the removal of the right to travel”

The new system reflects a cunning paradigm shift. As we’ve noted, instead of trying to enforce stability or conformity with a big stick and a good dose of top-down fear, the government is attempting to make obedience feel like gaming. It is a method of social control dressed up in a points-reward system. It’s gamified obedience.

In a trendy neighbourhood in downtown Beijing, the BBC news service hit the streets in October 2015 to ask people about their Sesame Credit ratings. Most spoke about the upsides. But then, who would publicly criticise the system? Ding, your score might go down. Alarmingly, few people understood that a bad score could hurt them in the future. Even more concerning was how many people had no idea that they were being rated.

Currently, Sesame Credit does not directly penalise people for being “untrustworthy” – it’s more effective to lock people in with treats for good behaviour. But Hu Tao, Sesame Credit’s chief manager, warns people that the system is designed so that “untrustworthy people can’t rent a car, can’t borrow money or even can’t find a job”. She has even disclosed that Sesame Credit has approached China’s Education Bureau about sharing a list of its students who cheated on national examinations, in order to make them pay into the future for their dishonesty.

Penalties are set to change dramatically when the government system becomes mandatory in 2020. Indeed, on September 25, 2016, the State Council General Office updated its policy entitled “Warning and Punishment Mechanisms for Persons Subject to Enforcement for Trust-Breaking”. The overriding principle is simple: “If trust is broken in one place, restrictions are imposed everywhere,” the policy document states.

For instance, people with low ratings will have slower internet speeds; restricted access to restaurants, nightclubs or golf courses; and the removal of the right to travel freely abroad with, I quote, “restrictive control on consumption within holiday areas or travel businesses”. Scores will influence a person’s rental applications, their ability to get insurance or a loan and even social-security benefits. Citizens with low scores will not be hired by certain employers and will be forbidden from obtaining some jobs, including in the civil service, journalism and legal fields, where of course you must be deemed trustworthy. Low-rating citizens will also be restricted when it comes to enrolling themselves or their children in high-paying private schools. I am not fabricating this list of punishments. It’s the reality Chinese citizens will face. As the government document states, the social credit system will “allow the trustworthy to roam everywhere under heaven while making it hard for the discredited to take a single step”.

According to Luciano Floridi, a professor of philosophy and ethics of information at the University of Oxford and the director of research at the Oxford Internet Institute, there have been three critical “de-centering shifts” that have altered our self-understanding: Copernicus’s model of the Earth orbiting the Sun; Darwin’s theory of natural selection; and Freud’s claim that our daily actions are controlled by the unconscious mind.

Floridi believes we are now entering the fourth shift, as what we do online and offline merge into an onlife. He asserts that, as our society increasingly becomes an infosphere, a mixture of physical and virtual experiences, we are acquiring an onlife personality – different from who we innately are in the “real world” alone. We see this writ large on Facebook, where people present an edited or idealised portrait of their lives. Think about your Uber experiences. Are you just a little bit nicer to the driver because you know you will be rated? But Uber ratings are nothing compared to Peeple, an app launched in March 2016, which is like a Yelp for humans. It allows you to assign ratings and reviews to everyone you know – your spouse, neighbour, boss and even your ex. A profile displays a “Peeple Number”, a score based on all the feedback and recommendations you receive. Worryingly, once your name is in the Peeple system, it’s there for good. You can’t opt out.

Peeple has forbidden certain bad behaviours, including mentioning private health conditions, using profanity or being sexist (however you objectively assess that). But there are few rules on how people are graded or standards about transparency.

China’s trust system might be voluntary as yet, but it’s already having consequences. In February 2017, the country’s Supreme People’s Court announced that 6.15 million of its citizens had been banned from taking flights over the past four years for social misdeeds. The ban is being pointed to as a step toward blacklisting in the SCS. “We have signed a memorandum… [with over] 44 government departments in order to limit ‘discredited’ people on multiple levels,” says Meng Xiang, head of the executive department of the Supreme Court. Another 1.65 million blacklisted people cannot take trains.

Where these systems really descend into nightmarish territory is that the trust algorithms used are unfairly reductive. They don’t take into account context. For instance, one person might miss paying a bill or a fine because they were in hospital; another may simply be a freeloader. And therein lies the challenge facing all of us in the digital world, and not just the Chinese. If life-determining algorithms are here to stay, we need to figure out how they can embrace the nuances, inconsistencies and contradictions inherent in human beings and how they can reflect real life.

You could see China’s so-called trust plan as Orwell’s 1984 meets Pavlov’s dogs. Act like a good citizen, be rewarded and be made to think you’re having fun. It’s worth remembering, however, that personal scoring systems have been present in the west for decades.

More than 70 years ago, two men called Bill Fair and Earl Isaac invented credit scores. Today, companies use FICO scores to determine many financial decisions, including the interest rate on our mortgage or whether we should be given a loan.

The majority of Chinese people have never had credit scores, so they can’t get credit. “Many people don’t own houses, cars or credit cards in China, so that kind of information isn’t available to measure,” explains Wen Quan, an influential blogger who writes about technology and finance. “The central bank has the financial data from 800 million people, but only 320 million have a traditional credit history.” According to the Chinese Ministry of Commerce, the annual economic loss caused by lack of credit information is more than 600 billion yuan (£68bn).

China’s lack of a national credit system is why the government is adamant that Citizen Scores are long overdue and badly needed to fix what they refer to as a “trust deficit”. In a poorly regulated market, the sale of counterfeit and substandard products is a massive problem. According to the Organization for Economic Co-operation and Development (OECD), 63 per cent of all fake goods, from watches to handbags to baby food, originate from China. “The level of micro corruption is enormous,” Creemers says. “So if this particular scheme results in more effective oversight and accountability, it will likely be warmly welcomed.”

The government also argues that the system is a way to bring in those people left out of traditional credit systems, such as students and low-income households. Professor Wang Shuqin from the Office of Philosophy and Social Science at Capital Normal University in China recently won the bid to help the government develop the system that she refers to as “China’s Social Faithful System”. Without such a mechanism, doing business in China is risky, she stresses, as about half of the signed contracts are not kept. “Given the speed of the digital economy it’s crucial that people can quickly verify each other’s credit worthiness,” she says. “The behaviour of the majority is determined by their world of thoughts. A person who believes in socialist core values is behaving more decently.” She regards the “moral standards” the system assesses, as well as financial data, as a bonus.

Indeed, the State Council’s aim is to raise the “honest mentality and credit levels of the entire society” in order to improve “the overall competitiveness of the country”. Is it possible that the SCS is in fact a more desirably transparent approach to surveillance in a country that has a long history of watching its citizens? “As a Chinese person, knowing that everything I do online is being tracked, would I rather be aware of the details of what is being monitored and use this information to teach myself how to abide by the rules?” says Rasul Majid, a Chinese blogger based in Shanghai who writes about behavioural design and gaming psychology. “Or would I rather live in ignorance and hope/wish/dream that personal privacy still exists and that our ruling bodies respect us enough not to take advantage?” Put simply, Majid thinks the system gives him a tiny bit more control over his data.

When I tell westerners about the Social Credit System in China, their responses are fervent and visceral. Yet we already rate restaurants, movies, books and even doctors. Facebook, meanwhile, is now capable of identifying you in pictures without seeing your face; it only needs your clothes, hair and body type to tag you in an image with 83 per cent accuracy.

In 2015, the OECD published a study revealing that in the US there are at least 24.9 connected devices per 100 inhabitants. All kinds of companies scrutinise the “big data” emitted from these devices to understand our lives and desires, and to predict our actions in ways that we couldn’t even predict ourselves.

Governments around the world are already in the business of monitoring and rating. In the US, the National Security Agency (NSA) is not the only official digital eye following the movements of its citizens. In 2015, the US Transportation Security Administration proposed the idea of expanding the PreCheck background checks to include social-media records, location data and purchase history. The idea was scrapped after heavy criticism, but that doesn’t mean it’s dead. We already live in a world of predictive algorithms that determine if we are a threat, a risk, a good citizen and even if we are trustworthy. We’re getting closer to the Chinese system – the expansion of credit scoring into life scoring – even if we don’t know we are.

So are we heading for a future where we will all be branded online and data-mined? It’s certainly trending that way. Barring some kind of mass citizen revolt to wrench back privacy, we are entering an age where an individual’s actions will be judged by standards they can’t control and where that judgement can’t be erased. The consequences are not only troubling; they’re permanent. Forget the right to delete or to be forgotten, to be young and foolish.

While it might be too late to stop this new era, we do have choices and rights we can exert now. For one thing, we need to be able to rate the raters. In his book The Inevitable, Kevin Kelly describes a future where the watchers and the watched will transparently track each other. “Our central choice now is whether this surveillance is a secret, one-way panopticon – or a mutual, transparent kind of ‘coveillance’ that involves watching the watchers,” he writes.

Our trust should start with individuals within government (or whoever is controlling the system). We need trustworthy mechanisms to make sure ratings and data are used responsibly and with our permission. To trust the system, we need to reduce the unknowns. That means taking steps to reduce the opacity of the algorithms. The argument against mandatory disclosures is that if you know what happens under the hood, the system could become rigged or hacked. But if humans are being reduced to a rating that could significantly impact their lives, there must be transparency in how the scoring works.

In China, certain citizens, such as government officials, will likely be deemed above the system. What will be the public reaction when their unfavourable actions don’t affect their score? We could see a Panama Papers 3.0 for reputation fraud.

It is still too early to know how a culture of constant monitoring plus rating will turn out. What will happen when these systems, charting the social, moral and financial history of an entire population, come into full force? How much further will privacy and freedom of speech (long under siege in China) be eroded? Who will decide which way the system goes? These are questions we all need to consider, and soon. Today China, tomorrow a place near you. The real questions about the future of trust are not technological or economic; they are ethical.

If we are not vigilant, distributed trust could become networked shame. Life will become an endless popularity contest, with us all vying for the highest rating that only a few can attain.

This is an extract from Who Can You Trust? How Technology Brought Us Together and Why It Might Drive Us Apart (Penguin Portfolio) by Rachel Botsman, published on October 4. Since this piece was written, The People’s Bank of China delayed the licences to the eight companies conducting social credit pilots. The government’s plans to launch the Social Credit System in 2020 remain unchanged.