Employers are using your personal data to figure out the lowest salary you'll accept

Dow Jones | 04-01 20:30

By Genna Contino

A growing number of employers are using surveillance wages to negotiate your next paycheck

Algorithms are increasingly using personal data to determine the minimum pay a worker is willing to accept, consumer watchdogs say.

You've likely already felt the digital sting of "surveillance pricing." It might look like an airline advertising a specific fare bundle because a customer's loyalty-program data suggests they're likely to buy it, or a website charging more for infant formula because an algorithm sensed the desperation of a new parent.

We're living in a world where your purchase history, browsing speed and even your ZIP code increasingly dictate the cost of your life. And as companies get better at collecting and analyzing personal data, they aren't just gunning for the money coming out of your wallet - they're controlling how much goes into it, too.

Experts describe "surveillance wages" as a system in which wages are based not on an employee's performance or seniority, but on formulas that use their personal data, often collected without employees' knowledge.

Companies already try to get new hires to accept the lowest possible wage offer. But while that once meant sizing up a candidate's experience and credentials against the going market rate, it increasingly means feeding the candidate's personal data into an algorithm.

According to Nina DiSalvo, policy director at labor advocacy group Towards Justice, some systems use signals associated with financial vulnerability - including data on whether a prospective employee has taken out a payday loan or has a high credit-card balance - to infer the lowest pay a candidate might accept. Companies can also scrape candidates' public personal social-media pages, she said, to determine if they are more likely to join a union or could become pregnant. The data can be used to determine wage increases after an employee is hired, and the practice can veer into discrimination, experts say.

"If you're a company who's messing around with these types of practices on consumers, you're watching how well they work," said Lindsay Owens, executive director of Groundwork Collaborative, a progressive think tank. "Workers are consumers, too. If it works on consumers, it works on workers. It's the same psychology."

A first-of-its-kind audit of 500 labor-management artificial-intelligence companies by Veena Dubal, a law professor at the University of California, Irvine, and Wilneida Negrón, a tech strategist, found that employers in the healthcare, customer-service, logistics and retail industries are customers of vendors whose tools are designed to enable this practice. Published by the Washington Center for Equitable Growth, a progressive economic think tank, the August 2025 report identified major U.S. employers among these customers, including Intuit (INTU), Salesforce (CRM), Colgate-Palmolive (CL), Amwell (AMWL) and Healthcare Services Group (HCSG).

The report does not claim that all employers using these systems engage in algorithmic wage surveillance. Instead, it warns that the growing use of algorithmic tools to analyze workers' personal data can enable pay practices that prioritize cost-cutting over transparency or fairness.

Colgate-Palmolive's director of corporate communications, Thomas DiPiazza, said the company "does not use algorithmic wage-setting tools to make compensation decisions for our employees or to set new-hire salaries."

Intuit does "not engage in such practices," a spokesperson for that company told MarketWatch.

The other companies named in the report did not respond to MarketWatch's requests for comment.

Surveillance wages don't stop at the hiring stage - they follow workers onto the job, too.

The vendors that provide such services also offer tools built to set bonus or incentive compensation, according to the report. These tools track workers' productivity, customer interactions and real-time behavior - including, in some cases, audio and video surveillance on the job. Nearly 70% of companies with more than 500 employees were already using employee-monitoring systems, such as software that tracks computer activity, by 2022, according to a survey from the International Data Corporation.

"The data that they have about you may allow an algorithmic decision system to make assumptions about how much, how big of an incentive, they need to give to a particular worker to generate the behavioral response they seek," DiSalvo said.

'Judging our desperation rate'

One of the clearest examples of surveillance-driven wage setting appears in on-demand healthcare staffing. A report by the Roosevelt Institute, a liberal-leaning think tank, based on interviews with 29 gig nurses, found that the staffing platforms nurses use to sign up for shifts - including CareRev, Clipboard Health, ShiftKey and ShiftMed - routinely use algorithms to set pay for individual shifts.

ShiftKey denied engaging in surveillance wage setting when reached by MarketWatch for comment. "ShiftKey unequivocally does not use any data broker services or engage in any surveillance-wage setting," said Regan Parker, the company's chief legal and public affairs officer. Parker specifically disputed claims from the Roosevelt Institute report suggesting that its platform uses workers' debt levels to determine pay, stating that ShiftKey does not use credit-card or other debt data to set wages and could not speak to the practices of other platforms.

CareRev, Clipboard Health and ShiftMed did not respond to requests for comment.

Rather than offering a fixed wage, the platforms adjust pay based on what they know about each worker - including how often a nurse accepts shifts, how quickly they respond to postings and what pay they have accepted in the past, according to the Roosevelt Institute report. Nurses interviewed for the report said the practice often resulted in different pay for the same work, even within the same facility.
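To make that mechanism concrete, here is a deliberately simplified, hypothetical Python sketch of how a platform could individualize a per-shift offer from the behavioral signals the report describes - acceptance rate, response speed and previously accepted pay. The function name, weights and thresholds are all invented for illustration; no platform's actual formula is public.

```python
# Illustrative only: a toy model of how a per-shift offer *could* be
# individualized from behavioral signals. Every number here is a
# hypothetical assumption, not any platform's real formula.

def individualized_offer(base_rate: float,
                         acceptance_rate: float,
                         avg_response_minutes: float,
                         lowest_accepted_rate: float) -> float:
    """Return a per-shift offer anchored to what this worker has
    previously accepted, discounted further when their behavior
    suggests they are likely to take low offers anyway."""
    # Anchor just above the lowest rate this worker accepted before,
    # but never above the market base rate.
    offer = min(base_rate, lowest_accepted_rate * 1.02)
    # Workers who accept most shifts and respond fast look "eager",
    # so this toy model shaves the offer further.
    if acceptance_rate > 0.8 and avg_response_minutes < 5:
        offer *= 0.95
    return round(offer, 2)

# A worker who historically accepted $38/hr, takes 90% of offered
# shifts and responds within 2 minutes is offered well under the
# $45 market rate.
print(individualized_offer(45.0, 0.9, 2.0, 38.0))   # prints 36.82
```

The point of the sketch is the critics' objection in miniature: nothing in the calculation rewards skill or experience - the offer moves only on what the worker's behavior reveals.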

Critics argue the system rewards workers not for skill or experience, but for what their behavior reveals about their financial vulnerability. Such systems "may determine pay by what the firm knows about how much a nurse was willing to accept for a previous assignment," the report's authors wrote, locking workers into lower pay bands over time.

According to Rideshare Drivers United, the union that represents rideshare drivers, algorithmic wages have been shaping pay for that industry's workers for years. Ben Valdez, a Los Angeles-based rideshare driver, said that after Uber (UBER) and Lyft (LYFT) rolled out new pay algorithms several years ago, his earnings declined - even as post-pandemic demand rebounded. Comparing notes with other drivers, Valdez said he has seen different drivers offered different base fares for the same trip at the same time.

Valdez said drivers are initially shown a take-it-or-leave-it rate, which rises only after enough drivers reject it. How that starting rate is set is opaque. "Why one driver gets a different, higher base is unknown," he said.
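The escalation Valdez describes can be illustrated with a toy model: the offered fare rises in steps only after enough drivers have rejected it. The step size and rejection threshold below are invented for illustration; how any platform actually sets its starting rate or escalation is, as the article notes, opaque.

```python
# Illustrative only: a hypothetical sketch of "take-it-or-leave-it"
# fare escalation. The 10% step per 3 rejections is an assumption
# made up for this example, not any platform's disclosed behavior.

def offered_fare(starting_fare: float, rejections: int,
                 step: float = 0.10, threshold: int = 3) -> float:
    """Raise the offer by `step` for every `threshold` rejections."""
    bumps = rejections // threshold
    return round(starting_fare * (1 + step) ** bumps, 2)

print(offered_fare(12.00, 0))   # first drivers see the base: 12.0
print(offered_fare(12.00, 6))   # after 6 rejections, two bumps: 14.52
```

In a model like this, the open question drivers raise is the starting fare itself: two drivers shown the same trip at the same time can begin from different, unexplained bases.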

That uncertainty is by design, according to Zephyr Teachout, a Fordham University law professor. In a 2023 report, Teachout wrote that Uber "uses data-rich driver profiles to match the wage to the individual incentives of the driver and the needs of the platform," citing prior research by Dubal and reporting from The Markup.

Uber said in an email to MarketWatch that its up-front fares are based on time, distance and demand conditions, and that its algorithms do not use individual driver characteristics or past behavior to determine pay. Rideshare trade association Flex, which responded after MarketWatch reached out to Lyft for comment, said in a statement that data-driven technologies "help process real-time and historical data to help match workers with a delivery or ride that represents the most efficient use of their time, which, in turn, allows them to spend more time earning."

Worker advocates remain skeptical. "It's judging our desperation rate," said Nicole Moore, president of Rideshare Drivers United.

Some lawmakers are paying attention

Critics of surveillance wages argue the practice can lead to discrimination in the workplace by allowing employers to bypass traditional merit-based pay. Because these algorithms are designed to find the absolute minimum a person will accept based on their financial history and other factors, they can disproportionately target the most financially vulnerable workers.

This creates a cycle where a person's past economic distress or personal life choices are used to justify lower pay in the present, often without the employee ever knowing which data points were used against them.

"We know the concept of the glass ceiling. But at least in that concept, we've got some visibility through that glass ceiling. We have a sense of what that world looks like. We can break it if we do the right things and galvanize," said Joe Hudicka, the author of a book called "The AI Ecosystems Revolution." "This wage-surveillance ceiling - it's iron. It's concrete. It's something that's impermeable."

Legislators have been slower to address surveillance wages than surveillance pricing. New York state recently passed a rule requiring companies to disclose to consumers when prices are set with algorithms that use their personal data - but most laws around the country address prices, not paychecks.

Colorado is trying to go further. A bill introduced in the state House, titled the Prohibit Surveillance Data to Set Prices and Wages Act, would ban companies from using intimate personal data - such as payday-loan history, location data or Google (GOOG) search behavior - to algorithmically set what someone is paid. The bill carves out performance-based wages, meaning employers could still tie pay to measurable productivity.

Rep. Javier Mabrey, a Democrat sponsoring the bill, draws a sharp line between dynamic pricing - where costs shift based on broad market conditions - and what he argues these systems actually do. "What our bill is about is individualized price setting, which is distinct from dynamic pricing," he said. "It requires the company to pull some really personal data related to you, not supply and demand."

For surveillance pay specifically, the bill would prohibit companies from using workers' personal data - without their consent - to determine what they're paid. Uber and Lyft have denied using individual driver characteristics to set wages, yet Mabrey said both companies are lobbying against the bill. "What is the problem of codifying in law that you're not allowed to?" he said.

-Genna Contino

This content was created by MarketWatch, which is operated by Dow Jones & Co. MarketWatch is published independently from Dow Jones Newswires and The Wall Street Journal.

 

(END) Dow Jones Newswires

April 01, 2026 08:30 ET (12:30 GMT)

Copyright (c) 2026 Dow Jones & Company, Inc.
