
The Article 17 Series Part 3: Privacy for Mobile Developers

Posted Jun 9, 2020

In our series on privacy with tech company privacy attorney and expert Alan Chapell, we’ve taken a look at both the history and future of privacy. But what does this mean for developers and publishers? In this final feature, Chapell shares where this is all going, so app publishers can anticipate these changes and get ahead of future digital privacy laws as they make improvements and grow portfolios with new titles.

Digital privacy is the single most important issue in the mobile advertising industry today, and it will remain so over the next five years. Yet it’s the topic most of us know the least about. The Article 17 series looks at privacy considerations for mobile publishers and advertisers together with Alan Chapell, founding partner of a law firm focusing on privacy for tech companies, who provides his fact-based POV on the current state of privacy regulations.

Need to catch up? Read part 1 or part 2!

How should mobile app developers think about privacy?

AdColony: If I’m making a mobile game today, what sort of steps should I be taking to ensure that everything I’m doing is as privacy-compliant as possible?

Alan Chapell: You need to do the same thing I mentioned previously (in part 1 of this series) with respect to GDPR. If you’re creating an app, the first question is: What data are you collecting for the app? Then: How is that data going to be utilized? Whose data are you planning on capturing? You need to engage in the same basic data governance practices that are applicable to ad tech companies.

AdColony: Is there a certain level of common sense, or a perspective, that they should take?

Alan Chapell: Part of the process is understanding what is a “good reason” for collecting a certain type of data. You might think it’s a reason, but you have to also think from the user perspective. Would they think it’s a good reason? Is your reason for wanting the data related to the app functionality?

Here’s an example. A couple of years ago, before phones started offering a built-in flashlight, there was an app that enabled users to use their phone as a flashlight. That app also collected precise location data and sold it to other companies. That raises the question: Why does a flashlight app need to know your location? Is the collection and use of precise location something that a reasonable user would consider part of a flashlight app’s functionality? Did the app at least ask the user’s permission to use precise location? The FTC took the position that collecting this type of information without disclosing it anywhere was deceptive.

Keep in mind that this was several years ago, before industry standards (e.g., NAI, DAA, app stores) required user consent for the collection of precise location. But the larger point I’m making is that app developers (really, every company) need to ask themselves why they are collecting certain information, evaluate the privacy risks, and seek to minimize those risks. Situations like that will continue to draw a lot more scrutiny now.

It’s easy to look back – hindsight is always 20/20 – but you could say that was an egregious example of unreasonable data collection. The app developer in that case probably didn’t need precise location and could have thought more about it, particularly given how sensitive that type of data is. Had they engaged in a privacy-by-design approach, somebody in that process probably would have flagged it.

I’m not here to say what anybody should or shouldn’t collect, but the idea that you can just collect whatever you want without evaluating the consequences – I think those days are pretty much done.

AdColony: That raises a question: Does any game app need to know your location, other than something like Pokémon Go, where it’s integral to the experience? And is there a hard-and-fast rule as to what data you should collect?

Alan Chapell: I think you need to ask yourself: Why do we need this information? Is it driving value for the end user? Analyzing what you’re collecting and what the user gets in return, including an evaluation of the risks, is a good starting point.

And then, how comfortable are you disclosing to users that you’re collecting it? How clearly are you going to communicate that? There’s a fine line between putting something right in someone’s face and making it more discreet, especially if the data is sensitive. For instance, certain inferred medical conditions, like cancer – that’s the type of stuff that certainly requires explicit consent. 

AdColony: So you ask yourself: What is the value to the user of collecting the information, and how willing am I to disclose that fact? What else should you consider?

Alan Chapell: The next question is: Who am I passing this information to?

For example, some of the larger platforms like Facebook and Google may take the position that their services function better and are more helpful and usable to consumers because they have location turned on all the time. But I think there’s an argument to be made that they don’t need it all the time. I’m not here to judge their privacy practices, but I will say that when so much information is collected in so many different contexts, it becomes difficult for any user to truly manifest consent. This goes to the heart of one of the issues being debated and litigated in the EU under the GDPR.

Regarding the last question, though: anytime you contemplate passing data on to another party, that should trigger an immediate set of internal questions.

AdColony: So if a developer is “privacy mindful,” it sounds like they can avoid some of the messes that are created later on. But what about the small publisher, the solo or small-team game designers? Is the paperwork unavoidable? 

Alan Chapell: It’s true, the GDPR certainly has a pretty significant paperwork component. The number of agreements specifying exactly how data is being processed may come off as fairly cumbersome to a small team working out of their garage.

But it’s time for developers to start taking more ownership, I think. Over the last 10 or 20 years, publishers, including app publishers, have leaned on their ad tech partners and just said, “Hey, you guys handle this. I’ll make the app, you monetize it.” Now fast-forward to the current regime, where a publisher under the CCPA needs to worry about passing opt-out requests upstream to partners, or under the GDPR needs to obtain consent for tracking technologies and execute all of these agreements with ad tech partners. This trend has perhaps made one of the key things ad tech does for the publishing community less valuable, because there’s only so much you can take off publishers’ hands now.
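For context, the CCPA opt-out signal that publishers pass upstream to partners is typically encoded as the IAB US Privacy (USP) string, a four-character value such as “1YNN”. Here is a minimal, illustrative sketch of decoding that signal in Python – the helper name is ours, and it assumes the version-1 format:

```python
def parse_usp(usp: str) -> dict:
    """Decode an IAB US Privacy string, e.g. "1YNN".

    Character positions:
      [0] specification version (only "1" is handled here)
      [1] explicit notice / opportunity to opt out was given (Y/N/-)
      [2] user has opted out of the sale of personal information (Y/N/-)
      [3] transaction is covered by the IAB Limited Service
          Provider Agreement (Y/N/-)
    "-" means the field is not applicable (e.g., CCPA does not apply).
    """
    if len(usp) != 4 or usp[0] != "1":
        raise ValueError(f"unsupported US Privacy string: {usp!r}")
    flag = {"Y": True, "N": False, "-": None}
    return {
        "notice_given": flag[usp[1]],
        "opted_out_of_sale": flag[usp[2]],
        "lspa_covered": flag[usp[3]],
    }
```

A publisher receiving “1YYN” would know the user opted out of sale and must suppress passing that user’s data to downstream partners; “1---” signals that CCPA does not apply to the transaction.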

About Chapell & Associates
Alan Chapell serves as outside counsel and chief privacy officer to digital media companies. Since 2003, Chapell has advised well over two hundred different companies, from venture-backed startups to some of the largest media, technology, and telecommunications companies in the world. Chapell’s mission is to help clients navigate regulatory, public policy and other marketplace challenges to maximize the value of their products and services.

For more information, please reach out to achapell@chapellassociates.com or visit Alan’s LinkedIn profile here. Chapell is often asked to write for industry trades, and some of his writing may be found here.

Join the Conversation
Questions about mobile publishing privacy compliance for your app? Tweet us at @AdColony. For the latest AdColony mobile news and updates, follow @AdColony on Twitter, like us on Facebook, or connect on LinkedIn.

Jonathan
