In Texas v. Johnson, the Supreme Court summarized the “bedrock principle” of the First Amendment: “that the government may not prohibit the expression of an idea simply because society finds the idea itself offensive or disagreeable.” But what if the idea is being pushed by an all-knowing algorithm . . . and the idea being pushed is NyQuil chicken?
TikTok has grown in popularity in recent years among a wide range of demographics. Its users often find themselves in niche categories such as BookTok, Cottagecore, and ThriftTok via one of the app’s most notable features: the algorithm.
Through the algorithm, TikTok provides users with a never-ending flow of videos curated to the user's interests. The algorithm's ability to know almost exactly what content a user may want to consume is both irresistible and uncanny. This early-stage machine omniscience has prompted scrutiny from local and federal government agencies.
Scrutiny on Data Security
Perhaps the most newsworthy issue is users' data security. ByteDance, TikTok's parent company, maintains its headquarters in Beijing and is incorporated in the Cayman Islands. Unsurprisingly, this ownership structure has raised some eyebrows. For example, the House Energy and Commerce Committee held a congressional hearing earlier this year and grilled TikTok's CEO over its data-security concerns.
With the increased scrutiny came increased regulation. President Joe Biden signed legislation banning the app from government devices. Other countries in the "Five Eyes" security alliance (the intelligence alliance composed of Australia, Canada, New Zealand, the United Kingdom, and the United States) have taken similar steps. Effective this new year, Montana will ban the app statewide, including for private individuals.
While many dismiss these actions as unnecessarily alarmist, the concerns have some validity. Last year, TikTok admitted to using app data to track down journalists' sources: ByteDance used the app's records to access journalists' IP addresses and determine whether they were in the same location as employees suspected of leaking information. Last month, users of ByteDance's video editing app CapCut sued the company alleging privacy violations. According to the complaint, CapCut harvests "unique identifying information, biometric data, geolocation, telephone numbers, and other private or confidential data, in violation of state and federal consumer protection laws."
One after another, countless government entities—countries, states, universities—began banning TikTok from government devices. Can your agency do the same?
My Device, My Choice
As a general rule, public agencies should have policies regulating employee use of agency-owned e-mail accounts, computers, and other devices. These policies should advise that employees have no expectation of privacy, or right of privacy, concerning their activities on agency-owned devices. Such a policy mitigates potential liability for an invasion-of-privacy claim.
Besides the privacy concern, employees may argue that banning a social media platform on agency devices restricts their freedom of expression. In the context of agency-owned devices, the focus of a First Amendment analysis is whether the agency, by supplying devices with access to social media platforms, has created a designated or limited "public forum."
The agency can avoid the risk of creating a designated or limited "public forum" by instituting an "Acceptable Use" policy. This policy should restrict use of agency-owned devices to work-related purposes only, with an accommodation for incidental personal use. It can even explicitly state that the devices do not create any type of public forum. This allows the agency to discipline employees for breaking the Acceptable Use policy, not for any protected speech. So, an employee who uses an agency-owned device to engage in speech on TikTok would not be disciplined for the protected speech itself; instead, the employee can be disciplined for violating the Acceptable Use policy.
Don’t Take It Personal
It is much harder to regulate an employee's use of their personal devices on their own personal time—including whether the employee chooses to install TikTok at all. The ongoing litigation over Montana's statewide ban of the app from government and personal devices illustrates these challenges.
Employees likely have a reasonable expectation of privacy in the contents of their personal devices. Under California Labor Code section 980, employers cannot even require employees or applicants to disclose their personal social media accounts.
Employers face another uphill battle on the free expression front. Employees may argue that any security concern can be mitigated by prohibiting employees from bringing their personal devices into the workplace. Outside the workplace, they can then argue, any restrictions on TikTok serve only to restrict the employee’s ability to engage in free expression.
Consequently, restricting TikTok on employees' personal devices is likely inadvisable. Still, the level of security concern varies with the policies an agency imposes on personal-device use. For example, an agency may allow employees to use their personal devices to access agency-owned email accounts, payroll services, or scheduling apps. An agency may also, instead of supplying a device, reimburse the employee for use of their existing personal device. These situations open up a multitude of issues to consider. If your agency decides that there is value in restricting TikTok on personal devices, that decision should be guided by legal counsel.
Should your employees be on TikTok?
The threshold choice of restricting TikTok in the workplace is largely a policy decision. Agencies should consider the potential security concerns and determine if employees’ use of TikTok might put the agency’s data at risk.
Restricting TikTok turns on several factors, such as whether the agency owns the employees' devices, the level of data-security risk relative to the employee's position and duties, and even the potential benefit of social media for marketing opportunities. Implementing these policy decisions requires a comprehensive analysis unique to each agency and should be undertaken with guidance from legal counsel familiar with the issues.