TikTok’s security boss makes his case. Carefully.

Roland Cloutier says the video-sharing company does not share user data with the Chinese government.
TikTok (Flickr / Solen Feyissa)

Roland Cloutier, the global chief security officer for ByteDance, says he still doesn’t understand why the U.S. government has labeled TikTok as a national security threat.

The video-sharing social media company, owned by Beijing-based ByteDance, filed a lawsuit in U.S. federal court Monday challenging a White House executive order that will effectively ban the app unless TikTok is sold by Nov. 12. Then, news broke Thursday that TikTok chief executive Kevin Mayer had resigned, three months after he was hired, amid reports that he’d been excluded from acquisition talks.

U.S. officials have said that commercial apps with roots in China, like TikTok, present a risk to national security by enabling the Chinese Communist Party to sweep up Americans’ personal and location data. Researchers, meanwhile, have suggested that TikTok collects much of the same information as other social media apps. The dispute resembles the 2018 debate over Kaspersky Lab, in which the U.S. discouraged use of the Russian security firm’s software over similar concerns about laws in the country where the company is based.

Without more evidence, it’s impossible to assess the accuracy of the specific allegations of wrongdoing.

The attention, though, is also a reminder of the U.S. intelligence community’s scrutiny of China-based telecommunications firms Huawei and ZTE.

While the CEO of Huawei, for instance, said last year that the firm would not provide data to the Chinese government, he neglected to mention how two pieces of national security legislation that apply to China-based companies would complicate that pledge. The laws, enacted in 2014 and 2017, compel companies to aid the government’s “intelligence work,” with the 2017 National Intelligence Law stipulating that “any organization or citizen shall support, assist and cooperate with state intelligence work in accordance with the law.”

Similarly, national security analysts have interpreted the broadly worded 2017 legislation as a move to co-opt corporations to help Beijing go on offense in cyberspace.

While TikTok remains in talks to sell its U.S. operations, reportedly with Microsoft and Oracle, the lawsuit filed Monday accuses the White House of violating its right to due process “by banning TikTok with no notice or opportunity to be heard.” Officials have cited only “the speculative possibility that the application could be manipulated by the Chinese government,” the suit states.

The legal filing echoes some of the same complaints Cloutier voiced during an Aug. 20 interview with CyberScoop, conducted prior to the lawsuit. He spoke on a range of topics, like scenarios in which people outside the U.S. might be able to access data on American TikTok users, but was adamant that the firm does not share data with the Chinese government. He also stressed that he’s unaware of the White House’s specific security concerns.

“When you find out, let me know,” he said.

CyberScoop: Can you walk me through how TikTok’s risk profile has changed since it became the subject of focus for the U.S. government?

Roland Cloutier: Obviously there’s been a lot of noise and untruths spread about sharing of information.

We simply don’t share data with governments, including the Chinese government. We have very specific processes for when law enforcement or government agencies ask us for things, and because we sit in the United States, those requests would have to go through the U.S. government. So we don’t share information. Our information is protected in both our U.S. and Singapore data centers through all of the controls you would expect organizations like ours to have, as well as the encryption that we put on protected data assets and objects in our applications. That’s managed by the U.S. team.

That aside, we start with the primary focus of ensuring that the platform is safe: ensuring that, as people come onto the platform, the information they’re accessing is safe from things like misinformation and disinformation, that our younger users are protected on the platform, and that we don’t have the type of material that’s not part of our community agreements. We’re not going to have pornographic material. We are not going to have violence and hate speech. We are not going to have those things in our community.

CS: In terms of the data transfers, you said the company is not sharing data with the Chinese government unless there’s a specific request. Is that correct? How does the law enforcement data request process work?

RC: Let’s be clear: We’ve never gotten a request from the Chinese government.

We publicly disclose, and we will continue to publicly disclose, responsibly in our transparency reports when law enforcement asks us for things.

Obviously, U.S. law enforcement is very active in the digital space, and we get warrants from them. We’ve gotten them from other countries. We’ve never gotten one from China because, quite simply, TikTok does not operate in China.

Neither TikTok data nor TikTok use occurs in China, so [the Chinese government] does not have jurisdiction over the platform. It’s pretty simple. The data doesn’t even exist in China, so there’s a whole bunch of ways to look at this, but the biggest fundamental truth is that the Chinese government doesn’t ask for it, because it doesn’t exist in China.

(Ed. note: TikTok does not operate in China, but ByteDance operates an app called Douyin that is a carbon copy of TikTok solely meant for Chinese users. When asked about this after the interview, ByteDance said: “TikTok and Douyin have separate markets, users, and content. TikTok is in close to 150 markets around the world, but it is not offered in China. The TikTok and Douyin apps are run entirely separately, on separate servers.”)

There’s a global process by which law enforcement has to do that. In the United States, it falls under the Fifth Amendment, and in other countries there are different things, but there has to be a legal process by which information is requested, provided and disclosed. For any country to ask for U.S. data, it has to go through the Department of Justice. For the U.S. to ask for data from another country, it has to go through that country of origin.

CS: How does it work with ByteDance owning TikTok and being based in China, though?

RC: It doesn’t matter. The information is the information. It’s within the jurisdiction or under the legal guidelines of the country of origin. It doesn’t change anything. If someone were to unlawfully attempt to take information — it’s like any type of data, intellectual property, trade secrets — we’re required to protect that at a level where that can’t happen. So if Russia is hacking the [Democratic National Committee], or if another nation-state is hacking a bank, they have a requirement to be able to protect the information so it’s protected. And we do that through a variety of industry best practices for access control, access management and encryption. That’s how we handle it.

CS: So if I understand this 100% correctly, because TikTok user data is stored in the United States, none of that is subject to Chinese law, right?

RC: Correct.

CS: How does it work in terms of TikTok sharing information? It says in the privacy policy TikTok shares user data with a “parent or subsidiary or another affiliate of the corporate group.” What does that mean?

RC: If we’re doing some sort of testing or analysis on a new function that another part of the organization may have developed, we may need to test certain parts of the dataset to ensure that a product works.

We may authorize individuals from one of the 48 countries that we operate in to do various testing. Remember, we have various engineers: an A.I. firm we bought out of the U.K., development organizations in Germany and Spain, and a large group in India, as well as China.

We have people all over the globe. The privacy policy is meant to say that, through the appropriate controls and restrictions we have on information, we may use portions of that information in our testing and validation of our technologies.

Quite simply, we have strict employee controls on any data. Protected data, which are the super sensitive things we consider to be [personally identifiable information], we go as far as to encrypt throughout their lifecycle on the platform.

CS: Is that encrypted user data shared with international affiliates under the terms of the privacy policy? I understand you’re doing testing and security and everything needs to be reliable, but is that encrypted user data folded into that in terms of being shared with parties outside the United States, whether it be China or elsewhere?

RC: It wouldn’t be transferred outside the United States, if that’s what you’re asking.

There may be some work that needs to be done by an engineer in a different location, and potentially we would allow them access into the U.S. data store to get that information. And we have, obviously, a bunch of limitation controls, oversight, monitoring controls, data exfiltration controls, and all of those things in place, as well.

The data always remains resident in the U.S. or in Singapore.

CS: So they can be somewhere else and access the U.S. database?

RC: No. Let’s not use broad terms like “the database.” They may be able to access a specific set of data if necessary to do that. It would most probably not be the entire database. It would most probably be a portion of that data. Again, they could access it based on requests that were made and reviewed by the U.S. executive team to allow that to happen, and it would be done under the guidance and controls of the U.S. security team.

CS: One of the things about TikTok that stands out is the number of users who are under 18. How do you protect that data compared to the way that you would have someone who is an adult?

RC: Think of it like we’re protecting the whole person. We want to protect the whole person, their information, and their experience on the platform. We have a set of tools, technologies and frameworks so that, if you’re under 17, they can be applied. And if you’re under 13, they will automatically be applied: things like you can’t [send a direct message], or you can’t look at certain types of content. There’s a host of things I can point you to in terms of what happens with our younger users.

From a data defense standpoint, their information is no more and no less important than anyone else’s that we protect. We protect every user on our site, from every position in society, in the same way. They get that same level of assurance.

CS: What happens next?

RC: It doesn’t matter, in my space, what the future holds from a business portfolio standpoint. What matters to me is that we still have almost a billion users on our platform, and we still have a very core mission in front of us, which is to ensure the safety of the community on our platform. Everything on my list to achieve today and every business initiative we are driving is still going forward. Whether that’s with company A, B or C, it doesn’t matter.

This interview has been condensed and lightly edited for clarity.

Written by Jeff Stone

Jeff Stone is the editor-in-chief of CyberScoop, with a special interest in cybercrime, disinformation and the U.S. justice system. He previously worked as an editor at the Wall Street Journal, and covered technology policy for sites including the Christian Science Monitor and the International Business Times.
