

Modern User Privacy
An overview for businesses considering biometrics

Nov 18, 2021 by Betsy Floyd

For companies using or considering biometrics, prioritizing privacy is essential. However, privacy is complicated in the United States: there is no comprehensive federal regulation to enforce rules and hold businesses accountable.

As a result, the public is left in the dark, concerned and distrustful because of the lack of transparency around what businesses do with their data, and because of plain evidence of data being taken or sold without consent. With so many factors at play, it’s not always straightforward for businesses to uphold user privacy the way they should.

The following is an exploration of the state of modern American consumer privacy, and some guiding privacy principles for companies to consider as they adopt biometric authentication.

Why preserving consumer privacy matters

It’s a fundamental human right.

On December 10, 1948, the United Nations General Assembly adopted the Universal Declaration of Human Rights, a comprehensive standard for all peoples and nations. Article 12 states:

“No one shall be subjected to arbitrary interference with his privacy, family, home or correspondence, nor to attacks upon his honour and reputation. Everyone has the right to the protection of the law against such interference or attacks.”

People are being manipulated for profit

Many of us have had the experience of researching a product, and then “magically” seeing this product and similar ones advertised to us on different web pages for days afterwards. Our behaviors are being tracked on the internet and through apps, and then we’re relentlessly hounded with ads.

There are consequences to data breaches

U.S. Senator Maria Cantwell led a hearing titled “Protecting Consumer Privacy” in September 2021. In her opening statement, she said: “According to a May 2021 report by the Identity Theft Resource Center, victims of identity theft are turned down for loans, unable to rent houses, they have their credit damaged, they are billed for medical services they never received, and they can't find unemployment benefits because their name was basically stolen.”

Personal data can be weaponized

Leaking or sharing people’s information can lead to discrimination or exclude certain demographics from opportunities. Facebook, for example, has been known to allow advertisers to exclude users of certain races or demographics from seeing housing ads.

When people apply for home loans online or through apps, Black and Latino borrowers end up paying higher interest rates than white borrowers. During the pandemic lockdown, people struggling with gambling addiction reported seeing more online gambling advertisements while browsing the web.

The power of consumer data

Geoffrey Fowler, a Washington Post technology journalist, investigated exactly how companies access his data, who they sell it to, and how this leads to the targeted ads he’s constantly bombarded with. He dug into how we’re tracked on the apps we use, how our data leaks through web browser extensions, and the sheer value user data possesses.

Fowler noted that “it used to be that Facebook was a lot more open about sharing your Facebook data and your friends' data with apps. They shut down a lot of that a couple of years ago. And that's the problem they got into with Cambridge Analytica as well, that they were allowing apps to collect data about your friends and then pass that along. Again, they'll say that's for privacy, but it's really because they realize that data is too valuable. They want to keep it for themselves so they can charge companies to market to you through them.”

Though Fowler’s investigation took place in 2019, his findings are only underscored by today’s data market and how much money is made selling our data (or no longer made). In the wake of Apple’s privacy feature that lets people opt out of app tracking, social media giants, e-commerce companies, and the adtech industry have seen a significant downturn in profit.

The consequences of businesses misusing consumer data

Arguably one of the most egregious privacy-related offenses a company can commit is touting a rigorous privacy policy and then doing otherwise behind closed doors. For example, Flo, a period and fertility tracking app, told users that it wouldn’t sell their data to third parties. That promise turned out to be a lie.

The FTC reached a settlement with Flo following a 2019 Wall Street Journal report that, according to TechCrunch, “found the fertility tracking app had informed Facebook of in-app activity — such as when a user was having their period or had informed it of an intention to get pregnant. It did not find any way for Flo users to prevent their health information from being sent to Facebook.”

How people feel about their data privacy

People’s understanding of, and attitudes toward, control over their data privacy fall along a spectrum, ranging from hopelessness and distrust to confidence and control.

The Pew Research Center conducted a study in 2019 on Americans’ perceptions of their control over data privacy, their trust in companies and government with their data, and how much they’re tracked.

Some of their relevant findings include:

  • Six in ten adults believe it’s impossible to go through daily life without getting tracked by companies or the government in some form or fashion.
  • 81% of adults believe that they have little to no control over how companies use their data.
  • 59% of adults feel that they have very little to no understanding of how companies use their data.
  • 79% of adults feel very concerned over how companies use their data.

These attitudes reflect a general malaise around data privacy and people’s inability to control it. Today, companies must not only step up their policies but also actively educate people about how their data is used and give them ways to exercise rights over it.

However, a 2019 Cisco report revealed a very different demographic with distinct attitudes and behaviors toward privacy. A subset (32%) of respondents were labeled “privacy actives”: adults who care about their privacy, are willing to take action to protect it, and already choose which companies they do business with based on those companies’ ability to protect their data.

This smaller demographic is different from the one described above. They’re competent, confident, and pushing businesses to upgrade their privacy conduct; otherwise, they’ll take their money elsewhere.

Where to go from here

While the United States lacks federal privacy regulation and effective enforcement, there are basic tenets of user privacy that businesses can take and design into their processes, products and customer relationships.

The following principles are condensed from the NYT’s exploration of privacy and the opening statement of Maureen K. Ohlhausen, former Acting Chair of the Federal Trade Commission, at the Senate hearing on “Protecting Consumer Privacy.”


Transparency

Lack of transparency is one of the most glaring failures people face today when handing their data over to companies. Ashkan Soltani, Executive Director of the California Privacy Protection Agency and former chief technologist at the Federal Trade Commission, says, “Most people believe they’re protected, until they’re not. Sadly, because this ecosystem is primarily hidden from view and not transparent, consumers aren’t able to see and understand the flow of information.”

Additionally, Ohlhausen stated in her opening statement that future federal privacy regulation “should provide consumers clarity and visibility into companies’ data collection, use, and sharing practices, as well as easily understandable choices regarding these practices, calibrated to the sensitivity of that data.” Any company looking to improve its user privacy policies can take this same principle into consideration.

Strong Consumer Rights

  • Data access and deletion - people should have the right to request what personal data companies store about them, and to request that companies delete that data.
  • Opt-in consent - companies should ask consumers before sharing or selling their data to third parties.
  • Data non-discrimination - companies shouldn’t raise prices for those who want to protect their privacy, or offer discounts in exchange for more data.
  • Data minimization - companies should collect the least amount of data needed from consumers.
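To make the rights above concrete, here is a minimal, hypothetical sketch in Python of what a data layer honoring them might look like. The class and field names are ours, purely illustrative, and not tied to any specific regulation or to Keyo's systems:

```python
from dataclasses import dataclass


# Data minimization: the record holds only the fields the service
# actually needs, and nothing else is ever collected.
@dataclass
class UserRecord:
    user_id: str
    email: str
    share_with_third_parties: bool = False  # opt-in consent, off by default


class PrivacyStore:
    """Illustrative store honoring access, deletion, and opt-in consent."""

    def __init__(self) -> None:
        self._records: dict[str, UserRecord] = {}

    def collect(self, user_id: str, email: str) -> None:
        # Only the minimal fields defined on UserRecord are stored.
        self._records[user_id] = UserRecord(user_id, email)

    def export(self, user_id: str) -> dict:
        # Right of access: the user can see everything held about them.
        rec = self._records[user_id]
        return {
            "user_id": rec.user_id,
            "email": rec.email,
            "share_with_third_parties": rec.share_with_third_parties,
        }

    def opt_in_to_sharing(self, user_id: str) -> None:
        # Opt-in consent: sharing is enabled only by explicit user action.
        self._records[user_id].share_with_third_parties = True

    def may_share(self, user_id: str) -> bool:
        # Any third-party sharing must check this flag first.
        return self._records[user_id].share_with_third_parties

    def delete(self, user_id: str) -> None:
        # Right to deletion: remove the record entirely on request.
        del self._records[user_id]
```

The key design choice is that sharing defaults to off and flips on only through an explicit user action, while access and deletion are first-class operations rather than afterthoughts.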

Final words

We hope the above has given you insight into the current privacy landscape and ideas to take to your next meeting. While Keyo can’t provide legal advice, we’re on the same journey to uphold user privacy in everything we do, and we’re always happy to share our resources on best practices with you.

Want to learn more?

Check out our other privacy-related posts.
