Personalized Dynamic Pricing Is Here (It Just Requires a Disclaimer)
New York State passed a law requiring that companies disclose when they are using algorithmic pricing, and the law is already facing a legal challenge in the courts. Companies have long used “dynamic pricing”—adjusting prices based on certain conditions, such as demand—for ride-sharing apps, airline tickets, and concert tickets. Now, with the increasing amount of information that companies hold about consumers (or purchase from data brokers), the identity of the consumer can be one such “condition” in pricing. Delta Air Lines recently revealed plans to expand its use of “personalized algorithmic pricing,” in which companies use personal data to show a consumer a distinct price. The practice is not uncommon. Nor is it illegal.
To recap, “personalized algorithmic pricing” is dynamic pricing derived from or set by an algorithm that uses consumer data (i.e., data that identifies or could identify the person making a purchase or lease of goods or services).
New York State recently passed a law on “Personalized Pricing Transparency and Anti-Discrimination,” which took effect on July 8, 2025. It imposes disclosure requirements when companies use consumer data for algorithmic pricing and prohibits the use of certain consumer data to set such prices. If a company uses a consumer’s internet browsing history or shopping history to set prices, it must display the following disclaimer: “THIS PRICE WAS SET BY AN ALGORITHM USING YOUR PERSONAL DATA.” Location data is excluded from the law’s scope, so someone in Albany could be shown a different price than someone in Tribeca without any disclaimer being required.
Importantly, beyond the disclaimer requirement, the law does not prohibit personalized algorithmic pricing outright. “Protected class data”—such as national origin, disability, sex, sexual orientation, gender identity and expression, pregnancy outcomes, and reproductive healthcare—cannot be used to set prices if the use of that data would result in certain groups missing out on “accommodations, advantages, or privileges.” If there are no such missed benefits, companies remain free to use this “protected class data” to personalize algorithmic pricing.
The National Retail Federation (NRF) has sued to block the law, arguing that compelling companies to use specific, government-scripted language (as with the mandatory disclaimer) violates the First Amendment; it also argues that stigmatizing these pricing tools will ultimately cause consumers to pay more.