TikTok is under scrutiny in the US. In Europe, we should not follow the same path.

Rules and transparency must guide Europe's attitude toward TikTok and all other platforms, not political stances or latent commercial wars.

There is a considerable amount of mischief in how US lawmakers have been investigating TikTok: they seem more committed to proving their case than to seriously assessing whether the platform complies with US regulations.

Take one of the questions, reported recently by Forbes and Casey Newton, asked during a hearing held by the US Senate:

Senator Josh Hawley (Republican): “Would it surprise you to learn that Forbes magazine recently reported that at least 300 current TikTok or ByteDance employees were members of Chinese state media?”

TikTok Chief Operating Officer Vanessa Pappas said the company does not “look at the political affiliations of individuals.”

The rest of the hearing, which also involved the other major social and content platforms, touched on more objective topics: access by TikTok and ByteDance employees in China to the personal data and other metadata of US citizens, child protection, and the control of disinformation.

The fact remains, however, that mixing objective elements with geopolitical concerns can bring no good. In the US, TikTok must prove itself ‘purer’ than the American-made platforms. Casey Newton writes:

“After today’s hearing, though, it now seems clear that TikTok will be held to a higher standard than its rivals in this regard. When Facebook or Twitter are found to have mishandled user data, or accidentally employed a foreign agent, the worst they can expect are fines and some theatrical grandstanding in a congressional hearing. For ByteDance, though, the same crimes risk the equivalent of a death penalty — TikTok being banned altogether, as it was in India in 2020; or being forced to divest it, as former President Trump attempted while he was in office.” (The Verge, 15th September)

To be clear: here in Europe, asking about – or investigating – the political affiliation of a company’s employees is a crime. It must remain so, regardless of where the employees come from or where they live.

The European Union should not follow the same path.

In recent years, Europe has made an enormous effort to regulate the use and transmission of personal data (GDPR) and is in the process of drafting new legislation to cover crucial aspects of our democracies. Even if debated and criticised, the Media Freedom Act, the Digital Services Act, and the Digital Markets Act attempt to approach systematically and objectively aspects such as disinformation, freedom of information, market access and competition with large platforms, the responsibilities of the big players towards users, and those of gatekeepers towards smaller competitors, which are almost always European.

Putting all these initiatives and pieces of legislation together, we get a compass to be used with everyone equally, be they European, American or Chinese platforms. But the compass only works if we follow it and apply it consistently. Let us remember what did not happen after the Cambridge Analytica scandal: had clear rules been in place, Facebook would have had to be banned for good from the territory of the European Union.

So what are the elements of this compass?

1 Implementing the GDPR. Consistently.

The transfer of personal data belonging to European citizens to countries outside the EEA is permitted only if the EU considers the third country to provide an adequate level of protection. To date, this is not the case for the USA (except for the transfer of Passenger Name Record data by air carriers), nor, of course, for China, India, Brazil or Russia.

Transferring personal data to any of these countries requires that at least one of a set of stringent conditions be met, including:

  • the data subject has explicitly consented to the proposed transfer, after having been informed of the possible risks of such transfers due to the absence of an adequacy decision and appropriate safeguards;
  • the transfer is necessary for the performance of a contract between the data subject (the customer) and the data controller (the vendor);
  • the transfer is necessary for the conclusion or performance of a contract concluded in the interest of the data subject between the controller and another party;
  • the transfer is necessary for important reasons of public interest;
  • the transfer is necessary for the establishment, exercise or defence of legal claims;
  • the transfer is necessary to protect the vital interests of the data subject.

All these conditions must be explicit, searchable and public. So: either we ban once and for all the sending of European citizens’ data to non-secure countries, including the USA, or we make it crystal clear to users what can happen to their data.
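
To make the idea concrete, here is a minimal, purely illustrative Python sketch of what such a “transfer gate” could look like. The country list, derogation labels and names such as `ADEQUATE_COUNTRIES` and `authorise_transfer` are hypothetical, not taken from any real compliance library or legal text; the point is simply that every transfer decision is logged together with its legal basis, so it stays explicit and searchable.

```python
from dataclasses import dataclass
from datetime import datetime, timezone
from typing import Optional

# Hypothetical, abbreviated list of countries covered by an EU adequacy decision;
# a real system would load this from an authoritative, regularly updated source.
ADEQUATE_COUNTRIES = {"CH", "JP", "NZ", "KR", "IL", "CA"}

# Illustrative labels loosely modelled on the derogations listed above.
DEROGATIONS = {
    "explicit_consent",
    "contract_performance",
    "public_interest",
    "legal_claims",
    "vital_interests",
}


@dataclass
class TransferRequest:
    data_subject_id: str
    destination_country: str           # ISO 3166-1 alpha-2 code, e.g. "US"
    legal_basis: Optional[str] = None  # one of DEROGATIONS, or None


def authorise_transfer(request: TransferRequest, audit_log: list) -> bool:
    """Allow a transfer only under an adequacy decision or a documented derogation,
    and log every decision so the legal basis stays explicit and searchable."""
    if request.destination_country in ADEQUATE_COUNTRIES:
        basis, allowed = "adequacy_decision", True
    elif request.legal_basis in DEROGATIONS:
        basis, allowed = request.legal_basis, True
    else:
        basis, allowed = None, False

    audit_log.append({
        "subject": request.data_subject_id,
        "destination": request.destination_country,
        "legal_basis": basis,
        "allowed": allowed,
        "timestamp": datetime.now(timezone.utc).isoformat(),
    })
    return allowed


# Example: a transfer to the US with no derogation is refused; with explicit consent it passes.
log: list = []
assert authorise_transfer(TransferRequest("user-42", "US"), log) is False
assert authorise_transfer(TransferRequest("user-42", "US", "explicit_consent"), log) is True
```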

2 DSA: Platforms must take action against illegal content

The Digital Services Act (DSA) obliges digital service providers such as social media platforms or marketplaces to take action quickly against illegal online content (such as hate speech). The new obligations also include more traceability and controls to ensure that products and services are safe, for instance by using spot checks to determine whether illegal content resurfaces. Keywords here: quickly and traceability.
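
As a hedged illustration of what “quickly” and “traceability” could mean in engineering terms, here is a hypothetical notice-and-action sketch: timestamps record the response time, and a content fingerprint supports later spot checks for resurfacing. All names (`handle_notice`, `resurfaces`) are invented for the example and do not describe any platform’s real system.

```python
import hashlib
from datetime import datetime, timezone

# Hypothetical in-memory stores; a real platform would need durable, auditable storage.
TAKEDOWN_LOG: list[dict] = []
REMOVED_FINGERPRINTS: set[str] = set()


def fingerprint(content: str) -> str:
    """Stable hash of the content, used later to spot-check for resurfacing."""
    return hashlib.sha256(content.encode("utf-8")).hexdigest()


def handle_notice(content_id: str, content: str, reason: str) -> dict:
    """Record the notice, act on the content, and keep a traceable audit entry."""
    received_at = datetime.now(timezone.utc)
    # ... the platform-specific removal step would happen here ...
    actioned_at = datetime.now(timezone.utc)

    entry = {
        "content_id": content_id,
        "reason": reason,
        "received_at": received_at.isoformat(),
        "actioned_at": actioned_at.isoformat(),
        "response_seconds": (actioned_at - received_at).total_seconds(),
    }
    TAKEDOWN_LOG.append(entry)
    REMOVED_FINGERPRINTS.add(fingerprint(content))
    return entry


def resurfaces(content: str) -> bool:
    """Spot check: has previously removed content reappeared?"""
    return fingerprint(content) in REMOVED_FINGERPRINTS
```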

3 DSA: Limits on personalised advertising and a stop to dark patterns.

Platforms must be more transparent and accountable: they must provide clear information about content moderation and about how their algorithms recommend content, and they must allow users to challenge content moderation decisions.

The Digital Services Act would also ban misleading practices and certain types of targeted advertising, such as advertising to children and advertising based on sensitive data. It would also ban so-called dark patterns and other manipulative practices designed to steer users’ decisions.

The challenge here is to implement systematic controls and to support “watchdog” organisations.
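
As a toy sketch of how such advertising restrictions could be enforced at serving time (the category labels and the `targeting_allowed` check are hypothetical illustrations, not drawn from the DSA’s legal text):

```python
from dataclasses import dataclass, field

# Illustrative shorthand for categories treated as off-limits for ad targeting;
# these labels are hypothetical, not legal definitions.
SENSITIVE_CATEGORIES = {
    "health", "religion", "political_opinion", "sexual_orientation", "ethnicity",
}


@dataclass
class AdRequest:
    user_age: int
    targeting_categories: set = field(default_factory=set)


def targeting_allowed(ad: AdRequest) -> tuple[bool, str]:
    """Refuse personalised targeting aimed at minors or based on sensitive data."""
    if ad.user_age < 18:
        return False, "no targeted advertising to minors"
    if ad.targeting_categories & SENSITIVE_CATEGORIES:
        return False, "no targeting based on sensitive data"
    return True, "ok"


# Example: an ad targeted on health data is rejected.
print(targeting_allowed(AdRequest(user_age=34, targeting_categories={"health"})))
# -> (False, 'no targeting based on sensitive data')
```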

4 Stricter rules for the Giants. Read: GAFA, and now also ByteDance.

Under the DSA, big platforms and search engines that are used by more than 45 million people a month in Europe, and therefore pose a higher risk, will have to comply with stricter requirements enforced by the Commission.

They will have to mitigate systemic risks, including the distribution of illegal content and adverse effects on fundamental rights, electoral processes, gender-based violence and mental health.

They will also be subject to independent audits. Platforms must ensure that their users can refuse recommendations based on profiling, and they will have to give authorities and accredited researchers access to their data and algorithms.
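
A minimal sketch, assuming a hypothetical feed service, of what “users can refuse recommendations based on profiling” could mean in code: when the opt-out flag is set, the profiling score is simply never consulted and the feed falls back to a chronological ordering.

```python
from dataclasses import dataclass


@dataclass
class Item:
    item_id: str
    published_at: float        # Unix timestamp of publication
    relevance_for_user: float  # score from a (hypothetical) profiling model


def recommend(items: list[Item], profiling_opt_out: bool) -> list[Item]:
    """Use the profiling score only if the user has NOT refused profiling;
    otherwise fall back to a purely chronological, non-personalised feed."""
    if profiling_opt_out:
        return sorted(items, key=lambda i: i.published_at, reverse=True)
    return sorted(items, key=lambda i: i.relevance_for_user, reverse=True)
```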

Note: on audits and researcher access, this is exactly how TikTok is proceeding, showing far more commitment than the GAFA ever did before. It is giving independent researchers access to its algorithm and moderation systems, and it has entrusted Oracle with auditing its data centres. We can only hope that the EU’s DSA will come into force soon, so that we finally see something similar happening at Meta, Amazon and Google.

5 EU Digital Markets Act: Gatekeepers must ensure compatibility with third-party services.

The Digital Markets Act (DMA) imposes obligations on large online platforms that operate as so-called gatekeepers in the digital market.

These platforms decide on market access and are almost impossible for consumers to circumvent. To prevent unfair business practices, the players classified as gatekeepers would have to ensure the compatibility of their services with those of third parties.

6 No preferential treatment of your own products at the expense of competitors: Amazon and marketplaces, this is for you!

Business users will have the right to access their data on the gatekeeper’s platform, advertise their offers and conclude contracts with their clientele outside the gatekeeper’s ecosystem.

Gatekeepers would no longer be able to rank their own services or products on their platforms more favourably than those of third parties and thus favour their own company.
Furthermore, they would only be allowed to use users’ personal data for targeted advertising if users expressly agree to this.
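
As a toy illustration of the self-preferencing ban (the `Listing` structure and the ranking criteria are hypothetical, not anyone’s real marketplace logic), the key point is that the scoring function never looks at who the seller is:

```python
from dataclasses import dataclass


@dataclass
class Listing:
    product_id: str
    seller: str    # "gatekeeper" or the name of a third-party merchant
    price: float
    rating: float  # average customer rating, 0 to 5


def rank_listings(listings: list[Listing]) -> list[Listing]:
    """Rank purely on customer-facing criteria (rating first, then price).
    The seller field is deliberately never consulted, so the gatekeeper's
    own products receive no hidden boost."""
    return sorted(listings, key=lambda x: (-x.rating, x.price))
```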

Conclusion: the EU must implement good rules and enforce them. It is on the way. Now, it is time to accelerate, not to follow the American political frenzy.

No, it is not the political affiliation of TikTok’s employees that should concern us, nor the quality of the content it offers, which, it must be said, is cleverer than what we now see on Meta’s platforms.

We in Europe are trying – after undeniable delays – to draw a clear, objective and rigorous map of rules to follow. We need only fear our slowness in converting it into operational laws.

And we must use our visible hand as much as possible to create our ecosystem of players and platforms. But that is another matter, fit for another post.

In the meantime, if you want a well-crafted portrait of TikTok’s trajectory, I recommend this piece from The Economist:

If, on the other hand, you want to follow TikTok’s actions step by step, without celebratory or contrarian attitudes, but with the spirit of a researcher, I recommend this English-language newsletter, run by the German researcher Marcus Bösch of Hamburg University:

https://tiktoktiktoktiktok.substack.com/


Cover Photo by Eyestetix Studio on Unsplash