I recently got the internet in my apartment fixed, and my technician had an unusual request. I'd get an automated call after he left asking me how satisfied I was with the service, he explained, and he wanted me to rate him 9 out of 10. I asked why, and he said there was a glitch with the system that recorded any 10 rating as a 1, and it was important for him to keep his rating up.
Since then, a couple of people have told me that technicians working for the company have been making this exact request for at least two years. A representative for Spectrum, my internet provider, said they were worrying over nothing. The company had moved away from the 10-point rating system, he said, adding that customer feedback isn't "tied to individual technicians' compensation."
But even if the Spectrum glitch exists only in the lore of cable repairmen, the anxiety it's causing them is telling. Increasingly, workers are subject to the same kinds of automated decision-making systems that already shape how people read the news, apply for loans, or shop in stores. It only makes sense that they'd try to bend those systems to their advantage.
Attempting to manipulate mysterious automated systems is a common feature of modern life. Just ask any search-engine optimizer, Instagram influencer, or foreign intelligence agency. There are at least two separate academic papers titled "Folk Theories of Social Feeds," detailing how Facebook users divine what its algorithm wants and then try to use those theories to their advantage.
People with algorithms for bosses have particular incentive to push back. Last month, a local television station in Washington covered Uber drivers who conspire to turn off their apps simultaneously in order to trick the company's system into raising prices. The segment showed drivers standing in a parking lot while organizers called out when to shut their phones off and when to turn them back on. "When we find out what's the highest surge, that's when we say, 'Everybody on,' and everybody gets paid what we think we should be getting paid," explained one of the orchestrators.
Alex Rosenblat, the author of Uberland, told me that these acts of digital disobedience are essentially futile in the long run. Technology centralizes power and information in a way that overwhelms mere humans. "You might think you're manipulating the system," she says, but in reality "you're working really hard to keep up with a system that is constantly experimenting on you."
Compared to pricing algorithms, customer ratings of the type that worried my repairman should be fairly straightforward. Presumably it's just a matter of gathering data and calculating an average. But online ratings are a questionable way to judge people even if the data they're based on are pristine -- and they probably aren't. Academics have shown that customer ratings reflect racial biases. Complaints about a product or service can be interpreted as commentary about the person who provided it, rather than the service itself. And companies like Uber require drivers to maintain such high ratings that, in effect, any review that isn't maximally ecstatic is a request for punitive measures.
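The arithmetic behind that last point is easy to sketch. In this toy model (the 4.6 cutoff, the five-point scale, and the ratings are hypothetical illustrations, not Uber's actual figures), a plain average combined with a high required minimum means a single less-than-perfect review drags a worker toward the danger zone, and climbing back out takes a long run of perfect scores:

```python
# Toy model of a rating system with a high minimum average.
# The cutoff and all ratings below are hypothetical, for illustration only.

CUTOFF = 4.6  # hypothetical minimum average a worker must maintain

def average_rating(ratings):
    """Plain mean of the ratings received so far."""
    return sum(ratings) / len(ratings)

def perfect_reviews_to_recover(ratings, cutoff=CUTOFF, max_score=5):
    """Count how many perfect scores it takes to climb back to the cutoff."""
    ratings = list(ratings)
    needed = 0
    while average_rating(ratings) < cutoff:
        ratings.append(max_score)
        needed += 1
    return needed

# Nineteen perfect trips and one 4-star ("good, not great") review:
print(average_rating([5] * 19 + [4]))              # 4.95

# But a stretch of merely "good" reviews sinks the average below 4.6:
slump = [5] * 10 + [4] * 10
print(average_rating(slump))                       # 4.5
print(perfect_reviews_to_recover(slump))           # 5
```

Under these assumed numbers, ten 4-star reviews out of twenty put the average at 4.5, below the hypothetical cutoff, and it takes five consecutive perfect trips just to get back to 4.6. That's the sense in which any rating short of the maximum functions as a demerit.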
Drew Franklin, who has worked as a field technician for Verizon in Washington, D.C., since 2017, said the customer review system is a source of near-constant stress. His customers get a five-question phone survey after each visit, as well as a chance to leave a message elaborating on their experiences. Franklin, who also ran unsuccessfully for D.C.'s district council in 2016, has looked at his own reviews, and says the sentiment in the messages periodically conflicts with the numbered scores from the survey. If the survey scores are low, his boss is automatically alerted. "If you get a bad review and they look into it, maybe it's frivolous," Franklin says. "But your score is your score."