“Nosedive” is an episode of the British science fiction series Black Mirror. It examines a fictional society where people can rate every interaction with each other from one to five stars. The resulting score shapes the social and economic status of the individual.
The story follows Lacie, a young woman obsessed with her rating. When her popular childhood friend chooses her as maid of honour for her wedding, Lacie sees an opportunity to boost her rating sharply and move into a more luxurious residence. Her obsession leads to several mishaps on the journey to the wedding that culminate in a rapid collapse of her rating.
Many critics noted the episode’s similarity to real-world social credit systems. Indeed, we already live in a world where human judgment is being replaced by numbers: algorithms calculate the value of a human being and create digitally transparent people.
Nudge Theory
Social credit systems draw on a behavioral science concept called Nudge Theory, which proposes positive reinforcement and indirect suggestions as a way to influence the behavior and decision-making of groups or individuals.
Nudging contrasts with other ways to achieve compliance, such as education, legislation or enforcement.
The use of “nudging” has been widely criticized for diminishing autonomy, threatening dignity, and violating liberties. Some call it psychologically manipulative.
China
The Chinese system uses a combination of mass surveillance and big data to score citizens. It is currently voluntary but will be made mandatory by 2020.
In the final stage, every citizen and company will be ranked whether they like it or not. The system will manage the punishments and rewards of citizens based on their economic and personal behavior. The exact scoring methodology is secret, but examples of infractions include bad driving, smoking in non-smoking areas, or posting fake news.
Punishments include:
- flight ban
- exclusion from schools
- slow internet connection
- public display of blacklisted individuals
By the end of 2018, 5.5 million high-speed rail trips and 17.5 million flights had been denied to prospective travelers who were on a Chinese social scoring blacklist.
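How such a blacklist mechanism might work can be sketched in a few lines of code. This is a purely illustrative toy model: the penalty values, the score threshold, and the category names below are all invented for the example, since the actual methodology, as noted above, is secret.

```python
# Toy model of a rule-based social score with a travel blacklist.
# All weights and thresholds are hypothetical, invented for illustration.

INFRACTION_PENALTIES = {
    "bad_driving": 50,
    "smoking_in_nonsmoking_area": 30,
    "posting_fake_news": 100,
}

BLACKLIST_THRESHOLD = 600  # hypothetical cut-off


def apply_infractions(score: int, infractions: list[str]) -> int:
    """Subtract a fixed penalty for each recorded infraction."""
    for infraction in infractions:
        score -= INFRACTION_PENALTIES.get(infraction, 0)
    return score


def travel_allowed(score: int) -> bool:
    """Citizens below the threshold are denied flights and rail trips."""
    return score >= BLACKLIST_THRESHOLD


score = apply_infractions(700, ["bad_driving", "posting_fake_news"])
print(score, travel_allowed(score))  # prints: 550 False
```

The point of the sketch is how mechanical the outcome is: a handful of logged behaviors flows through fixed rules, and a yes/no decision about a person’s freedom of movement falls out the other end.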
Western World
Germany has its own social scoring system: the near-universal credit rating known as Schufa. Schufa is a private company that assesses the creditworthiness of the majority of German citizens and of over 5 million companies in the country. Factors like living in a low-income neighborhood (geo-scoring) can lower the Schufa score. There is mounting criticism in Germany of what some view as a lack of transparency in how Schufa scores are established.
Schufa has also been criticized for being a privately controlled central database that provides personality profiles for individuals. Having a bad score in the credit rating system will prevent people from getting a loan or renting an apartment.
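The geo-scoring criticism can be made concrete with a small sketch. The base score, the neighborhood adjustments, and the loan cutoff below are all hypothetical; Schufa’s actual formula is not public.

```python
# Hypothetical illustration of geo-scoring: identical applicant data
# yields different outcomes purely because of the registered address.
# All numbers are invented for illustration.

NEIGHBORHOOD_ADJUSTMENT = {
    "high_income": 20,
    "average": 0,
    "low_income": -40,  # the criticized geo-scoring effect
}


def credit_score(base_score: int, neighborhood: str) -> int:
    """Adjust an applicant's base score by a neighborhood factor."""
    return base_score + NEIGHBORHOOD_ADJUSTMENT[neighborhood]


def approved(score: int, cutoff: int = 650) -> bool:
    """A bank- or landlord-style cutoff: below it, no loan, no apartment."""
    return score >= cutoff


same_applicant = 660
print(approved(credit_score(same_applicant, "high_income")))  # prints: True
print(approved(credit_score(same_applicant, "low_income")))   # prints: False
```

The same applicant clears the cutoff in one neighborhood and misses it in another, which is exactly why critics object to address-based factors in an opaque scoring system.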
Comparable credit systems are common in most Western democracies: the US counterpart to Schufa is FICO, and the UK has Experian.
An increasing number of societal “privileges” related to transportation, accommodation, communications, and the rates we pay for services (like insurance) are determined by algorithms that evaluate our behavior.
Conclusion
We are in a critical phase of technological development, and we need to discuss our values. Private companies will keep scoring people in areas like finance, criminality, rental housing, or mail-order business. The practice works like a charm, and some citizens even welcome the idea of “digitally transparent” people.
“If you have something that you don’t want anyone to know, maybe you shouldn’t be doing it in the first place.” - Eric Schmidt, Google
But it is only a matter of time until government institutions pull the information from these different databases together and compute a social score for each citizen. Arguments will be made that it improves the security of people and businesses.
Do we want to live in the kind of world the Black Mirror episode predicts? Our societies have to be made aware of what algorithms really can, and cannot, do. Education is the answer.