
Your Phone Already Has Social Credit. We Just Lie About It.

Photo by Avery Evans on Unsplash

Your credit score is social credit. Your LinkedIn endorsements are social credit. Your Uber passenger rating, Instagram engagement metrics, Amazon reviews, and Airbnb host status are all social credit systems that track you, score you, and reward you based on your behavior.

Social credit, in its original economic definition, means distributing industry profits to consumers to increase purchasing power. But the term has evolved far beyond economics. Today, it describes any kind of metric that tracks individual behavior, assigns scores based on that behavior, and uses those scores to determine access to services, opportunities, or social standing.

Sounds dystopian, doesn’t it? But guess what? Every time an algorithm evaluates your trustworthiness, reliability, or social value, whether for a loan, a job, a date, or a ride, you're participating in a social credit system. The scoring happens constantly, invisibly, and across dozens of platforms that weave into your daily life.

The only difference between your phone and China's social credit system is that China tells you what they're doing. We pretend our algorithmic reputation scores are just “user experience features.” At least Beijing admits they're gamifying human behavior.

When Americans think of the "Chinese social credit system," they likely picture Black Mirror episodes and Orwellian nightmares: citizens tracked for every jaywalking incident, points deducted for buying too much alcohol, facial recognition cameras monitoring social gatherings. The image is so powerful that Utah's House passed a law banning social credit systems, despite none existing in America.

Here's what's actually happening. As of 2024, there's still no nationwide social credit score in China. Most private scoring systems have been shut down, and local government pilots have largely ended. What exists is mainly a fragmented collection of regulatory compliance tools, mostly focused on financial behavior and business oversight. While well over 33 million businesses have been scored under corporate social credit systems, individual scoring remains limited to small pilot cities like Rongcheng. Even there, scoring systems have had "very limited impact" since they've never been elevated to provincial or national levels.

What actually gets tracked? Primarily court judgment defaults: people who refuse to pay fines or repay loans despite having the means to do so. The Supreme People's Court's blacklist is composed of citizens and companies that refuse to comply with court orders, typically orders to pay fines or repay loans. Some experimental programs in specific cities track broader social behavior, but these remain isolated experiments.

The gap between Western perception and Chinese reality is enormous, and it reveals something important: we're worried about a system that barely exists while ignoring the behavioral scoring systems we actually live with.

You already live in social credit.

Open your phone right now and count the apps that are scoring your behavior. Uber drivers rate you as a passenger. Instagram tracks your engagement patterns. Your bank analyzes your Venmo transactions and Afterpay usage. LinkedIn measures your professional networking activity. Amazon evaluates your purchasing behavior. Each platform maintains detailed behavioral profiles that determine your access to services, opportunities, and social connections.

We just don't call it social credit.

Your credit score doesn't just determine loan eligibility; it affects where you can live, which jobs you can get, and how much you pay for car insurance. And traditional credit scoring is expanding rapidly. Some specialized lenders scan social media profiles as part of alternative credit assessments, particularly for borrowers with limited credit histories. Payment apps and financial services increasingly track spending patterns and transaction behaviors to build comprehensive risk profiles. The European Central Bank has asked some institutions to monitor social media chatter for early warnings of bank runs, though this is more about systemic risk than individual account decisions.

Background check companies routinely analyze social media presence for character assessment. LinkedIn algorithmically manages your professional visibility based on engagement patterns, posting frequency, and network connections, producing rankings that recruiters increasingly rely on to filter candidates. Even dating has become a scoring system: apps use engagement rates and response patterns to determine who rises to the top of the queue and who gets buried.

What we have aren't unified social credit systems…yet. They're fragmented behavioral scoring networks that don't directly communicate. Your Uber rating doesn't affect your mortgage rate, and your LinkedIn engagement doesn't determine your insurance premiums. But the infrastructure to connect these systems is being built: the technical and cultural foundations that could eventually support comprehensive social credit. The question isn't whether we have Chinese-style social credit now (because we don't). The question is whether we're building toward it without acknowledging what we're creating.

Where China's limited experiments have been explicit about scoring criteria, Western systems hide their decision-making processes entirely. Even China's fragmented approach offers more visibility into how behavioral data gets used than our black box algorithms do.

You may argue there's a fundamental difference between corporate tracking and government surveillance. Corporations compete; you can switch services. Governments have monopoly power and can restrict fundamental freedoms.

This misses three key points: First, switching costs for major platforms are enormous. Try leaving Google's ecosystem or abandoning your LinkedIn network. Second, corporate social credit systems increasingly collaborate. Bad Uber ratings can affect other services; poor credit scores impact everything from insurance to employment. Third, Western governments already access this corporate data through legal channels and data purchases.

Social credit systems are spreading globally because they solve coordination problems. They reduce fraud, encourage cooperation, and create behavioral incentives at scale. The question isn't whether Western societies will adopt social credit (because we're building toward it). The question is whether we'll be transparent and accountable about it or continue pretending our algorithmic reputation scores are just neutral technology.

Current trends suggest both systems are evolving toward more comprehensive behavioral scoring. European digital identity initiatives are linking multiple service scores. US cities are experimenting with behavioral incentive programs. Corporate platforms increasingly share reputation data. Financial services integrate social media analysis into lending decisions.

If both countries evolve toward comprehensive behavioral scoring, and current trends suggest they will, which approach better serves individual agency? One that admits it's scoring you, or one that pretends algorithmic recommendations are just helpful suggestions?

When Uber can destroy your transportation access with a hidden algorithm, and when credit scores determine your housing options through opaque calculations, is that really more free than a system where you know at least some of the behaviors that affect your score?

So when China's explicit social credit approach inevitably influences Western platforms, when your apps start showing you the behavioral scores they've always been calculating, when the rules become visible instead of hidden, don't panic.

Because for the first time, you'll finally understand the game you've been playing all along. And knowing the rules means you can finally choose whether you want to play.