Apple’s move to a more private, consumer-driven data model with the announcement of its App Tracking Transparency (ATT) feature puts the consumer in the driver’s seat of data privacy, allowing them to opt in to or out of data sharing. The move is creating tension among businesses, especially Big Tech, which worry that giving up control of consumer data will stifle innovation. Nothing could be further from the truth. It’s not about who’s driving the car; it’s about establishing rules of the road.
It may be hard to believe, but the internet is still in a relatively nascent stage, similar to the mass production of cars in the early 1900s. When people were accustomed to horse-drawn carriages and walking as a primary means of transportation, hearing an engine rev and seeing a gas-powered contraption barrel down the road was as exciting as it was alarming. The same can be said of data privacy and the internet. Consumers are undeniably fascinated by its possibilities, but neither they nor businesses have fully grasped its potential, for better or worse.
It will take coordination among governments, Big Tech, and consumers to implement common bedrock principles that protect privacy and pave the way for safe data travel in the internet age. While fear is often associated with change, everyone can win if we learn to be creative with a shared set of privacy standards. Such standards should include:
Looking both ways: Understanding the ad ecosystem
More often than not, consumers incorrectly assume that their data stays within one particular app, when in fact it’s usually shared with a whole network of partners. The ad ecosystem is highly complex, with companies approaching it from every angle, leaving many people understandably perplexed. Education is needed before the industry can begin to build common rules of the road. Companies are responsible for communicating to their customers exactly how and when their data will be used, beyond just their immediate purposes. This can be challenging; just look at the Google Incognito lawsuit. While the private browser’s splash screen indicated that websites might still collect information about a user’s browsing activity, users expected, given the name, that their data would be kept confidential and that they would not be tracked. Keeping data safe is very different from keeping it private, yet people often conflate the two.
Information must be presented in a way that is easy for consumers to understand. Long words and small fonts can sometimes be the leaky tire that leads to a crash.
Red light/green light: Allowing for consumer consent
Once consumers understand how their data is being used, the second step is setting up the figurative traffic lights. Companies should allow consumers to opt in or out, just like Apple’s ATT feature. What remains to be seen is whether or not the opt-in feature is designed and presented in a way that’s understandable to consumers or if it will be just another button people click without a second thought.
One thing companies should consider when creating consent options is the use of dark patterns. A carefully crafted user interface can be designed to either enlighten or confuse a user; dark patterns do the latter. Examples include confusing language built on double negatives, such as “don’t not sell my personal information,” and layouts that make it appear the user must submit or share non-essential information to continue using a product or access a webpage. Companies should take what they’ve learned about dark patterns and user behavior to help consumers understand what they are consenting to. This will establish better relationships with their customers in the long run.
Stop signs: What happens when a consumer clicks disagree?
Change is always scary and can feel restrictive. Consumers are fearful about how their data is being used and their lack of control over where it’s sent. Businesses worry that less access to data will hurt their bottom line. For too long, companies have taken consumer data for granted, resting on the way things have always been done. But just because a consumer opts out doesn’t mean you’ve hit a stop sign. Just turn right!
User consent can provide a wake-up call to companies that still believe consumer privacy sits at the opposite end of the pendulum from innovation. Now is the time to rethink your digital ad strategy and re-evaluate how to connect with customers in a privacy-centric way, or risk losing them altogether.
Watch out for speed bumps: Staying compliant with data privacy laws
Once the rules of the road are established, it’s important to keep your eyes open for any unexpected obstacles that may arise. In the absence of federal legislation on consumer data privacy, many states have enacted their own laws, including California’s CPRA and Virginia’s VCDPA. More states are expected to pass their own bills this year, and companies must stay attentive to constantly changing regulations, especially companies that operate across many states and jurisdictions.
The data privacy landscape is constantly changing. It is important for companies of all sizes, and their lawyers, to stay up to date on new regulations, moves across big tech, and consumer demand.
Beth Magnuson, CIPP/US, CIPP/E, joined Practical Law from Oracle, where she was managing counsel, responsible for privacy and security matters. Her prior positions at Oracle (formerly Sun Microsystems) focused on trademark and copyright matters. Before that, she was special counsel with Faegre & Benson, general counsel of Pumpkin Masters, a seasonal products company, and an intellectual property associate at both Finnegan, Henderson, Farabow, Garrett & Dunner and Welsh & Katz (now Husch Blackwell).