On 20 July, the CNIL, the authority responsible for data protection in France, issued two formal warning notices for non-compliance with the GDPR. These were its first-ever notices for GDPR violations, and the startups Fidzup and Teemo were the chosen ones. Both companies had raised respectable amounts of venture capital, had hired data protection officers (read more), and had published ostentatious privacy statements on their websites (read more).

Who were the first to have such bad luck?

Let’s be honest, we all wanted someone to get caught and were like “Please let it be someone else!”. Perhaps it was the two companies’ policies, or the Cambridge Analytica scandal (read more), or the fact that both startups used a pay-as-you-use data charging scheme that brought them to the media’s attention. BuzzFeed wrote about them, and both companies were featured in a Yale Privacy Lab announcement, along with DoubleClick, Braze, Millennial Media, and others. In addition, in May, Apple removed all Teemo and Fidzup applications from the App Store.

It seems those guys thought they were the Chuck Norrises of data privacy. But then the CNIL came along and gave both Chucks three months to come to their senses.

What was actually wrong?

Both companies practiced geotargeting. They collected geolocation data from people who used mobile applications offered by Fidzup’s and Teemo’s B2B partners. Personal data processing started right after the partner’s app was downloaded. It worked like this: the app was downloaded –> data was collected –> data was processed –> pop-up ads appeared –> you signed up (perhaps), and finally –> you agreed that some of your data would be processed (perhaps).

  • What’s wrong with this scheme?
  • The thing is, the last two steps should actually come second and third.
  • Isn’t this a problem for the data controller (or “owner”) only if it fails to get user consent?
  • Well, not exactly. Fidzup and Teemo collected the data and determined the purpose of its processing; in this case, that means the data collected as a result of the app download. If you collect and process such data, you are a data controller (owner). So when it comes to collecting geolocation data, Teemo and Fidzup are the ones responsible for complying with the GDPR requirements.
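The ordering problem above can be made concrete in code. Below is a minimal, purely illustrative sketch (the function names, the in-memory store, and the `"geolocation"` purpose label are all hypothetical, not anything from the CNIL decisions): processing is gated on a consent record that must exist *before* collection, which is the order Fidzup and Teemo got backwards.

```python
from datetime import datetime, timezone


class ConsentRequiredError(Exception):
    """Raised when processing is attempted without a valid consent record."""


# Hypothetical in-memory consent store: user_id -> set of consented purposes.
consent_store: dict = {}


def record_consent(user_id: str, purposes: set) -> None:
    """Store consent *before* any processing starts (the GDPR-compliant order)."""
    consent_store.setdefault(user_id, set()).update(purposes)


def collect_geolocation(user_id: str, lat: float, lon: float) -> dict:
    """Collect a location point only if the user consented to that purpose first."""
    if "geolocation" not in consent_store.get(user_id, set()):
        raise ConsentRequiredError("no consent for geolocation processing")
    return {"user": user_id, "lat": lat, "lon": lon,
            "collected_at": datetime.now(timezone.utc).isoformat()}


# Correct order: consent first, collection second.
record_consent("user-1", {"geolocation"})
point = collect_geolocation("user-1", 48.8566, 2.3522)

# The collect-first-ask-later order fails immediately:
try:
    collect_geolocation("user-2", 48.8566, 2.3522)
except ConsentRequiredError:
    print("blocked: consent must come before collection")
```

In a real system the consent store would be persistent and auditable, but the gate itself is the point: no record, no processing.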

More information about Teemo and Fidzup

Teemo used an SDK (a software development kit) to process app users’ geolocation data (or rather, the geolocation data of people who downloaded its partners’ apps) and unique advertising IDs. Data processing started before the applications were even launched, and users were unaware of it. Partners had long since launched their products and acquired their own users when Teemo’s SDK began collecting user data after the first application update (without notifying the users, of course). Using data on customers’ points of interest, Teemo sent targeted ads to users.

Likewise, Fidzup used another SDK to process advertising IDs and technical information from devices (MAC addresses). It passed the data to its partners, who could use Fidbox and launch targeted ads when users approached selected sales outlets.

In both cases, there was no publicly available information telling users where such ads came from, and no data processing notices from the data controllers, i.e. Fidzup or Teemo.

Useful tips on consent requirements 

The GDPR requires that consent be freely given, specific, informed, and unambiguous in order to process a person’s data. Consent should be given for one or several specific processing purposes, and before any processing starts.

According to the CNIL, neither company complied with any of the GDPR’s consent requirements for data processing.

For starters, neither obtained any consent before processing began. And this is a crucial requirement!

In addition, it appears that consent was not freely given. It covered all types of processing at once and was bundled into the very act of signing up with the application (otherwise, the app would not work). On top of that, downloading the application was inextricably linked to the SDK.

User consent was not specific. Users could not select which particular types of processing they agreed or did not agree to. In fairness, hardly anybody does this right (some say it is rather expensive, while most say it looks awful on a website). However, such consent differentiation does make sense: some people do not want to receive ads or have their data included in CRM systems.

User consent was not informed, since users were given no information that third parties would use their data for targeted ads. When downloading the apps, people got no notice that their data would be processed.
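The “specific” requirement discussed above can be sketched in a few lines. This is an illustrative model only (the purpose names and the `build_consent` helper are made up for the example): each purpose is an independent opt-in with a default of “no”, so consent is never bundled into sign-up or granted all-or-nothing.

```python
# Hypothetical per-purpose consent record: each purpose is opted into
# separately, so consent is "specific" rather than all-or-nothing.
PURPOSES = ("ads_targeting", "crm_storage", "analytics", "geolocation")


def build_consent(**choices) -> dict:
    """Every purpose defaults to False; nothing is bundled into sign-up."""
    unknown = set(choices) - set(PURPOSES)
    if unknown:
        raise ValueError(f"unknown purposes: {unknown}")
    return {p: bool(choices.get(p, False)) for p in PURPOSES}


# A user who agreed to analytics only: ads and CRM storage stay off
# until they are opted into separately.
consent = build_consent(analytics=True)
```

The design choice worth noting is the default: an unmentioned purpose is treated as refused, which mirrors the GDPR’s stance that silence or pre-ticked boxes do not count as consent.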

That seems like enough violations, right? But the non-compliance actually went beyond user consent, at least in Teemo’s case.

Retention period: don’t keep it longer than necessary

Teemo retained data for periods unreasonable given its processing purposes. That’s what the CNIL concluded regarding geolocation data being kept for 13 months (so, how cool is that, dear advocates of 10 years?).

The CNIL ran a balancing test (lawyers do this when assessing a controller’s legitimate interests) and concluded that the right to privacy definitely prevailed. Why so?

That’s because using toolkits that can target a person for 13 months in a row is a serious intrusion into privacy. It also makes it possible to track people at scale. And what if intruders got hold of this information through a data breach?

Both startups should use this information to think things over. They have three months, which is enough to shoot Three Billboards Outside Ebbing, Missouri at least three times.

We started to think it over as well. So, here is the question:

What should those who don’t want to end up like Teemo and Fidzup do?

  1. Get the data flows within your organization sorted out. A simple questionnaire and an Excel spreadsheet will do. Here are the questions to ask first: from where, from whom, where to, what for, and for how long? You can hire a lawyer as well.
  2. Check what you rely on when collecting data (the data owner’s diplomatic immunity, or rather, user consent). If it is the latter, check what exactly that consent covers and what exactly you notify your users of. Perhaps a completely different legal basis for processing will suit you better, for example, your legitimate interests.
  3. Take care of your data retention periods. Ideally, work out a data retention policy. Do not retain any data longer than necessary. Maybe you do not need any personal user data at all.
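Step 3 above is easy to automate. Here is a minimal sketch of a retention policy enforced in code; the categories, the 90-day window, and the record shape are illustrative assumptions for the example, not the CNIL’s numbers or anyone’s real policy.

```python
from datetime import datetime, timedelta, timezone

# Hypothetical retention policy: maximum storage time per category of data.
# The durations are illustrative, not figures from the CNIL decisions.
RETENTION = {
    "geolocation": timedelta(days=90),
    "marketing":   timedelta(days=365),
}


def expired(category: str, collected_at: datetime, now=None) -> bool:
    """True if a record has outlived its retention period and must be erased."""
    now = now or datetime.now(timezone.utc)
    return now - collected_at > RETENTION[category]


def purge(records: list) -> list:
    """Keep only records still inside their retention window."""
    return [r for r in records
            if not expired(r["category"], r["collected_at"])]


now = datetime.now(timezone.utc)
records = [
    {"category": "geolocation", "collected_at": now - timedelta(days=400)},
    {"category": "geolocation", "collected_at": now - timedelta(days=10)},
]
records = purge(records)  # the 400-day-old location point is dropped
```

Running a purge like this on a schedule (a daily cron job, say) is what turns a retention policy from a paragraph in a document into something an auditor can verify.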

That will do for starters. At least, that will be enough to make sure Le Monde won’t be writing about you.

If you’re an app user who speaks French and you don’t want any companies with laissez-faire attitudes to the rules to locate you, these tips by CNIL can help you protect your privacy.

While the accused companies say that three months is too much and promise to fix things even earlier, let us once again go through some typical questions people usually ask their lawyers about the GDPR.

  • “You told us to do some technical stuff and drafted some company policies. We made the documents publicly available in our website footer and in a pop-up window. Do we even need any of this? No one has been fined yet.”
  • “You told us to limit the retention period. Why can’t we just retain some marketing data for like, 10 years, or like, forever? Who even checks stuff like that?”
  • “Didn’t they invent this whole GDPR thing just to nail Google, Facebook and other big names to the wall? That has nothing to do with us.”

Here are very short and easy-to-remember answers: (1) yes, you do; (2) no, a special authority checks; (3) no, not only for the big guys.


The Associated Press, together with Princeton University, has recently investigated Google. To find out more, you can check out the Guardian.

According to this investigation, Google collects geolocation data from users of Google products, even when users change their settings and do not agree to their data being collected.

What’s that smell? It smells like 20 million EUR, doesn’t it?