PAULA MARKOW WRITES — Whether you are religious or not, you will be convinced by the end of this article that there truly is something greater than all of us: an all-knowing force. Except this force does not judge how we spend eternity, but rather how we spend our time here on Earth.
For the last several years, China has been preparing to launch the world's first social credit score (SCS) system by 2020 – a system that aims to rate citizens based on their behavior. According to the Wall Street Journal, "Blacklists will expose offenders and restrict them from certain activities," while those deemed well-behaved by the government's standards will have access to more opportunities and fewer restrictions.
Just what does "well-behaved" mean? An algorithm will take everything about an individual into consideration when scoring them, including but not limited to driving violations, screen time, and even willingness to make donations. Low scores could get citizens "blacklisted," cutting them off from basic services such as public transportation and shutting them out of better jobs and schools; high scores could open greater opportunities for highly rated citizens and their children. And these ratings will be publicly accessible.
Let's say, hypothetically, someone works as the lead engineer on a game's beta test and is required to stare at a computer for hours at a time. China's new algorithm could read that job-related screen time negatively: because of the long hours in front of a screen, this person could be rated "idle" and lose credit points.
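To see how context-free scoring can misfire, here is a minimal sketch in Python. Every factor, weight, and threshold in it is invented purely for illustration; the actual Chinese algorithm's inputs and logic are not public, and nothing below should be read as a description of it.

```python
# Hypothetical sketch of a context-free scoring rule.
# All factors, weights, and thresholds are invented for illustration only;
# the real system's inputs and logic are not public.

def naive_social_score(driving_violations: int,
                       daily_screen_hours: float,
                       donations_made: int) -> int:
    """Score a citizen from a baseline of 1000 using crude, context-free rules."""
    score = 1000.0
    score -= 50 * driving_violations              # every violation costs points
    if daily_screen_hours > 6:                    # long screen time read as "idle"
        score -= 10 * (daily_screen_hours - 6)
    score += 20 * donations_made                  # donations earn points back
    return int(score)

# The beta-test engineer from the example above: no violations, charitable,
# but ten hours a day at a screen because that is what the job requires.
print(naive_social_score(driving_violations=0,
                         daily_screen_hours=10,
                         donations_made=1))       # 980: docked purely for working
```

The point of the sketch is that a rule like `daily_screen_hours > 6` has no way of knowing whether those hours were spent gaming or earning a living, which is exactly the missing context that critics worry about.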
China sees this new credit rating system as an attempt to improve its citizens' behavior and to create a safer environment for all. Supporters see it as simply an extension of the scoring systems we are already familiar with (e.g., FICO credit scores). China is determined to carry the plan out and has already partnered with major private companies to weave together big data that will refine the social credit algorithm.
Yet, as with all algorithms, there is ample room for error when data is interpreted out of context. Anurag Lal, former Director of the U.S. National Broadband Task Force for the Federal Communications Commission under the Obama administration, said in an interview that "People do so many different things for so many different reasons, and if the context is not appreciated it can be misconstrued."
Will the algorithm ever become intuitive enough to properly account for these grey areas? Who knows. But Chinese history may help contextualize these radical new ideas, and the early indications are not good. The Chinese word for "credit," xinyong, names a moral concept signifying one's honesty and trustworthiness, a usage that dates back to the 4th century B.C. Now embedded in the credit system's vocabulary, xinyong points to how tightly culture and government are entwined: if a score is read as a measure of personal virtue rather than of financial risk, citizens could be judged for even one bad day, or worse, for a single imperfect action.
Will this monumental undertaking have lasting effects beyond the confines of China's legal and fiscal systems? This is where the real worry comes in. What if an impersonal rating system actually changes what it means to be human, flaws and all? What if it enforces a kind of social conformity, if not uniformity, leaving ever less room for individual free will in China's machine-like pursuit of perfection?
Would you want your life, and your individuality, to be summed up by a number?