
China's Social Credit System


Altherion


Many countries have a credit score (e.g. FICO in the US) which is used to evaluate people when giving out loans to buy cars, houses, etc., as well as to set limits on credit cards. China has decided to take this idea further: they will of course use the traditional information typically involved in constructing credit scores, but they will also use a whole lot more, and the result will not be just for giving out loans, but for trustworthiness in general. Here are a couple of articles about the Social Credit Score (from Wired and from the Verge):

Quote

Starting in May, Chinese citizens who rank low on the country’s burgeoning “social credit” system will be in danger of being banned from buying plane or train tickets for up to a year, according to statements recently released by the country’s National Development and Reform Commission.

With the social credit system, the Chinese government rates citizens based on things like criminal behavior and financial misdeeds, but also on what they buy, say, and do. Those with low “scores” have to deal with penalties and restrictions.

Quote

So just how are people rated? Individuals on Sesame Credit are measured by a score ranging between 350 and 950 points. Alibaba does not divulge the "complex algorithm" it uses to calculate the number but they do reveal the five factors taken into account. The first is credit history. For example, does the citizen pay their electricity or phone bill on time? Next is fulfilment capacity, which it defines in its guidelines as "a user's ability to fulfil his/her contract obligations". The third factor is personal characteristics, verifying personal information such as someone's mobile phone number and address. But the fourth category, behaviour and preference, is where it gets interesting.

Under this system, something as innocuous as a person's shopping habits become a measure of character. Alibaba admits it judges people by the types of products they buy. "Someone who plays video games for ten hours a day, for example, would be considered an idle person," says Li Yingyun, Sesame's Technology Director. "Someone who frequently buys diapers would be considered as probably a parent, who on balance is more likely to have a sense of responsibility." So the system not only investigates behaviour - it shapes it. It "nudges" citizens away from purchases and behaviours the government does not like.

Friends matter, too. The fifth category is interpersonal relationships. What does their choice of online friends and their interactions say about the person being assessed? Sharing what Sesame Credit refers to as "positive energy" online, nice messages about the government or how well the country's economy is doing, will make your score go up.

This sounds like something out of a science fiction novel, but it looks like they have a nearly functional prototype of it. In fact, it's not really that hard to do anywhere assuming that one has full access to everything on every social media platform and all non-cash financial transactions (as China certainly does and, for example, American security agencies almost certainly do). Do you think this idea will catch on?
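Just to make the mechanics concrete, here's a toy sketch (in Python) of how a weighted score over those five categories could be mapped onto the published 350-950 range. The category weights, the blending formula, and the example numbers are pure guesses on my part; as the article notes, the real algorithm is not disclosed.

# Purely illustrative: none of these weights or categories come from Alibaba.
def sesame_style_score(factors, weights=None):
    """Blend per-category scores (each in 0..1) into a 350-950 number."""
    if weights is None:
        weights = {
            "credit_history": 0.35,              # bills and loans paid on time
            "fulfilment_capacity": 0.20,         # ability to meet contract obligations
            "personal_characteristics": 0.15,    # verified phone number, address, etc.
            "behaviour_and_preference": 0.20,    # what you buy, say, and do
            "interpersonal_relationships": 0.10, # who you associate with online
        }
    weighted = sum(weights[k] * factors.get(k, 0.0) for k in weights)
    return 350 + weighted * (950 - 350)  # map 0..1 onto the 350-950 range

# Example: a diligent bill-payer who spends a lot of time gaming.
print(sesame_style_score({
    "credit_history": 0.9,
    "fulfilment_capacity": 0.8,
    "personal_characteristics": 1.0,
    "behaviour_and_preference": 0.3,
    "interpersonal_relationships": 0.6,
}))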


6 minutes ago, Ser Scot A Ellison said:

"The Orville" did a similar episode.  It is a scary prospect.

When I first heard of it, my mind went to "Psycho-Pass", given that they also want to predict the likelihood of people committing a crime. Because that always worked so well in fiction... -.-


It is terrifying and Orwellian, and yet not that much different from our own credit scores, Cambridge Analytica psychographics and other estimated voting profiles, inferred online profiles/personas for targeted ads (especially by Facebook), holistic college admissions*, personalized insurance rate quotes, etc.  This is happening all around us already, but usually by private companies who are not accountable for accuracy/veracity.  China will have a single state entity develop the score and abuse it to nudge people toward govt policy.  We have a distributed mess of scores that try to exploit people for profit.  It really is just command economy vs. free market.

*This one is slightly different to the other data-driven examples, but it represents a way we are nudged/compelled into conformist patterns of behavior or else excluded from the best opportunities in our society.


18 hours ago, Altherion said:

 

This sounds like something out of a science fiction novel, but it looks like they have a nearly functional prototype of it. In fact, it's not really that hard to do anywhere assuming that one has full access to everything on every social media platform and all non-cash financial transactions (as China certainly does and, for example, American security agencies almost certainly do). Do you think this idea will catch on?

Science fiction-y, really? We have an unofficial version here already when it comes to politics.  There's definitely goodthink and badthink; the Chinese govt will just be more consistent in policing it.  And of course it's going to continue to catch on.  I mean, what's the point of having a multi-billion-dollar social media company if you aren't going to use it to reeducate deplorables?


1 hour ago, Iskaral Pust said:

It is terrifying and Orwellian, and yet not that much different from our own credit scores, Cambridge Analytica psychographics and other estimated voting profiles, inferred online profiles/personas for targeted ads (especially by Facebook), holistic college admissions*, personalized insurance rate quotes, etc.  This is happening all around us already, but usually by private companies who are not accountable for accuracy/veracity.  China will have a single state entity develop the score and abuse it to nudge people toward govt policy.  We have a distributed mess of scores that try to exploit people for profit.  It really is just command economy vs. free market.

*This one is slightly different to the other data-driven examples, but it represents a way we are nudged/compelled into conformist patterns of behavior or else excluded from the best opportunities in our society.

This is true, but in this case, the command version is much more effective than the free market one. For example, the Chinese system allows the government to make adjustments to the score based on the people a given individual associates with, which would be quite difficult for the credit agencies. Also, private entities in Western nations are legally prohibited from using (or even possessing) a great deal of data, and a great deal more is out of reach simply because it's hard to collect. Amazon has no access to most of your medical info, employers may not discriminate against parents or pregnant women, colleges don't know how much time most applicants spend playing video games, and so on. China's score will incorporate all of these and more.
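To illustrate the association point, here is a crude, entirely hypothetical sketch of how a score could be pulled toward the scores of a person's contacts; the 0.8/0.2 blend and the example numbers are made up.

# Hypothetical guilt-by-association adjustment: blend a person's own score with
# the average score of their contacts.
def adjusted_score(own_score, contact_scores, own_weight=0.8):
    """Pull a score toward the mean of the contacts' scores."""
    if not contact_scores:
        return own_score
    contact_mean = sum(contact_scores) / len(contact_scores)
    return own_weight * own_score + (1 - own_weight) * contact_mean

# A 700-point citizen whose contacts average around 500 drifts downward.
print(adjusted_score(700, [480, 510, 520]))  # roughly 661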

1 hour ago, mcbigski said:

Science fiction-y, really? We have an unofficial version here already when it comes to politics.  There's definitely goodthink and badthink; the Chinese govt will just be more consistent in policing it.  And of course it's going to continue to catch on.  I mean, what's the point of having a multi-billion-dollar social media company if you aren't going to use it to reeducate deplorables?

It's not that they don't try, but they mostly fail. Political correctness in the US can certainly have a significant impact on someone's life, but this only happens if that individual has said or done something that went viral, or is sufficiently prominent that people are paid to dig through their history of words and actions to see if anything can be made to go viral. This means that the effects are necessarily limited to a few hundred people per year (out of hundreds of millions). The result is that roughly half of the country metaphorically spits in the face of most of the media, the politically inclined sectors of academia and the rest of the politically correct (including those in government). The Chinese are not able to do that even now, and it will get even harder for them with the new system because information about everyone will be available to the government, and the latter lacks the shackles imposed by most Western systems.


2 hours ago, Iskaral Pust said:

It is terrifying and Orwellian, and yet not that much different from our own credit scores, Cambridge Analytica psychographics and other estimated voting profiles, inferred online profiles/personas for targeted ads (especially by Facebook), holistic college admissions*, personalized insurance rate quotes, etc.  This is happening all around us already, but usually by private companies who are not accountable for accuracy/veracity. 

Agreed. It sounds terrifying and Orwellian at first read. And then you realize we have similar "credit scores" that are not so different in Western countries. For instance, a bad credit history can cause you massive problems. Similarly, former criminals can find it incredibly hard to get back on their feet. And that's just the tip of the iceberg.

Furthermore, a few things to consider:
- The way it's presented in our media. Unless we know exactly how important the "fourth and fifth" categories really are and what deeds might lower your credit score, it's hard to be certain that it's worse than comparable elements in Western societies. Because yeah, being friends with some people is also bad around here, y'know? In a similar way, none of these articles makes it clear how easy or hard it is to be considered "untrustworthy." For instance, if it takes criminal activity to be penalized, then the exact same thing exists in Western countries.
- The Wired article actually makes it clear that most people in China don't have access to loans (among other things) in the first place and that this might actually help tons of people.
- Chinese society is huge. It's a scale I can't even begin to comprehend. And resources are finite. Their government has to find a way to rationalize the provision of some types of benefits. Unfair? No doubt. But it's just a different kind of inequality than the one we are accustomed to. Many Americans aren't shocked if the poorest members of their society don't have access to education or healthcare, so why is it so shocking to distribute such benefits based on "trustworthiness" instead of family or personal wealth? Quite frankly, in the long run, I don't see Chinese society as being that much more dystopian than the American one.

I don't support this kind of thing, of course, but I think it's the type of information that needs to be taken with a lot of critical distance. How would your government deal with 1.3 billion people? Where are our Western societies headed? We take our consumer society, with all its abundance, as well as many civil liberties, for granted. But abundance has many dark sides, and our civil liberties are not as vast as we tend to think they are. This news is scary for us today. Tomorrow, some of our children might see this as a decent alternative to a life in poverty here.


It appears that somebody took a look at this program and thought something like "Well, this feels vaguely Orwellian, but I think we could do much better. You know what would really help here? Omnipresent cameras with facial recognition software!"

Quote

Don’t even think about jaywalking in Jinan, the capital of Shandong province. Last year, traffic-management authorities there started using facial recognition to crack down. When a camera mounted above one of 50 of the city’s busiest intersections detects a jaywalker, it snaps several photos and records a video of the violation. The photos appear on an overhead screen so the offender can see that he or she has been busted, then are cross-checked with the images in a regional police database. Within 20 minutes, snippets of the perp’s ID number and home address are displayed on the crosswalk screen.

...

Across the country, other applications of the technology are proliferating. Many exist somewhere in the range between helpful and unsettling: A “smart boarding system” from the tech giant Baidu reduces airport check-in to a one-second face scan; at KFC China’s “smart restaurant” in Beijing, customers stand in front of a screen, have their face scanned (again, Baidu is part of the joint endeavor), and receive menu suggestions based on their age, sex, and facial expression (“crispy chicken hamburger,” roasted chicken wings, and a Coke for a 20-something male’s lunch; porridge and soy milk for a middle-aged woman’s breakfast). A female-only university dormitory has even employed facial recognition to keep nonresidents out.

...

All sorts of data will feed into this new program, but facial recognition (along with gait analysis and voice recognition, also enabled by rapid advances in machine learning and cloud computing) has the potential to one day give it something like omniscience. China’s government and commercial sectors make available to each other the endless streams of personal information they gather. Because companies have access to vast amounts of consumer data, industry experts predict that in the coming months Chinese facial-recognition software will become even more accurate. Western companies may be exploiting the same machine-learning technology, but nobody is rolling it out like the Chinese.

On the one hand, it's kind of awe-inspiring that they can do this, but on the other, the surveillance in 1984 was not intended as an ideal to reach for...


2 hours ago, Altherion said:

On the one hand, it's kind of awe-inspiring that they can do this, but on the other, the surveillance in 1984 was not intended as an ideal to reach for...



Well, not to us, but it was intended as a warning against the kind of totalitarian communism that rules China, so from their perspective it would have been.


Those in glass houses should think twice before throwing stones. The credit system in the U.S. is nothing to gloat over. We have a burdensome system that offers little protection from identity theft, and getting a mistake removed or resolved with the credit reporting agencies is far too burdensome on consumers. You're basically guilty until you can prove you're innocent under our system, and there is far too little assistance offered to anyone trying to clear their credit score after identity theft or reporting inaccuracies.


I fail to see how you could possibly have a remotely halfway decent and fair social credit system when literally half the country is stuck in what amounts to a medieval feudal system and a rich (corrupt) minority controls all the power.
It just looks like utter bollocks if you ask me.

How is some poor peasant farmer in the rural districts supposed to get a credit score on par with a rich city dweller, and as a result not open themselves up to all manner of nasty government reprisals?


So essentially it isolates certain types of transactions, like buying alcohol and vacation packages, and penalizes any individual's ability to purchase based on how low their social score is?  I could see an argument that restricting non-necessities that impact your quality of life but not your ability to survive is better than social control by shaming from your social network for whatever behaviour they disapprove of.  However, no committee or watchdogs should be involved in determining what is negative behaviour.  A deep learning algorithm could provide evidence that certain behaviours negatively impact society, and adjust the penalties and rewards with feedback from the whole system looking at every type of behaviour.  There is no reason why this would have to be considered Orwellian if it does it dispassionately, without bias according to stereotypes.  The starting case would have to be no penalties for any type of behaviour, and then adjustment by the system from positive and negative feedback on particular measures.
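For what it's worth, here is a toy sketch of that feedback loop: every behaviour starts with zero penalty, and each weight is nudged up or down as (hypothetical) evidence of harm or benefit comes in. The behaviour names, learning rate, and feedback values are all invented for illustration.

# Every behaviour starts unpenalized; weights only move in response to measured feedback.
penalties = {"alcohol_purchase": 0.0, "vacation_package": 0.0, "late_bill": 0.0}
LEARNING_RATE = 0.05

def update_penalty(behaviour, harm_signal):
    """harm_signal > 0 means measured social harm, < 0 means measured benefit."""
    penalties[behaviour] += LEARNING_RATE * harm_signal
    penalties[behaviour] = max(penalties[behaviour], 0.0)  # start at zero, never go negative

# Feed in made-up measurements as they arrive.
update_penalty("late_bill", 1.0)          # evidence of harm -> penalty grows to 0.05
update_penalty("alcohol_purchase", -0.5)  # evidence of benefit -> stays at 0.0
print(penalties)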


2 hours ago, SpaceChampion said:

So essentially it isolates certain types of transactions, like buying alcohol and vacation packages, and penalizes any individual's ability to purchase based on how low their social score is?  I could see an argument that restricting non-necessities that impact your quality of life but not your ability to survive is better than social control by shaming from your social network for whatever behaviour they disapprove of.  However, no committee or watchdogs should be involved in determining what is negative behaviour.  A deep learning algorithm could provide evidence that certain behaviours negatively impact society, and adjust the penalties and rewards with feedback from the whole system looking at every type of behaviour.  There is no reason why this would have to be considered Orwellian if it does it dispassionately, without bias according to stereotypes.  The starting case would have to be no penalties for any type of behaviour, and then adjustment by the system from positive and negative feedback on particular measures.

Human behaviour is not quantifiable in that way. Does drinking harm or benefit my quality of life? I'd say it benefits me: I drink a moderate amount, and it helps me relax and socialise. You might say the algorithm would only punish alcoholics, but alcoholism isn't just about quantity. I've heard alcoholics say they didn't realise they had a problem because they weren't drinking more than their peers; it was the effect it was having.

Society would be harmed by everyone being constantly stressed about being judged on all their actions.

No algorithm can ever be truly neutral, because it has to be designed by a person, using subjective ideas about what is good. Say I work loads of overtime and earn lots of money, and spend it sensibly, but the overwork makes me an incredibly stressed person who treats everyone around me badly? Not everything can be quantified by an algorithm.

Our society is not a machine. The whole thing makes me feel like the hippies had it right all along, just drop out.


Archived

This topic is now archived and is closed to further replies.
