When Christina Kim joined TikTok, she was looking for a new form of entertainment. Like many people during quarantine, she scrolled through the platform aimlessly, liking funny dance videos and posting clips about being a mom and nurse practitioner during the pandemic. But after a while, she noticed a concerning trend: videos and comments filled with false information about Covid-19. “It was a huge eye opener,” she says. “I was so shocked to be exposed to this world of people—people who didn’t believe in science.”
On a whim in July, she posted a video from work with text that read, “Wearing a mask will NOT affect your oxygenation or cause ‘carbon dioxide poisoning.’” Over Valentino Khan & Wuki’s song “Better,” she puts on a surgical mask while hooked up to a pulse oximeter, which estimates the percentage of hemoglobin in your blood that is saturated with oxygen. The oximeter reads 98 percent (within the normal range). She then puts on a thicker surgical mask, then an N95 mask, and finally all three together. Her oxygen level never drops below 98 percent.
The video has since been viewed 1.7 million times, transforming Kim from a casual TikToker to one with over 50,000 followers. “I realized that maybe with this many followers, I had the opportunity to educate and dispel myths,” she says.
Kim is among a legion of scientists, medical professionals, and others who’ve taken to TikTok to combat Covid-19 misinformation. With US coronavirus cases now topping 6 million and deaths approaching 190,000, the wildly popular social media platform may offer a way to educate people while keeping them entertained and engaged.
Misinformation, or “fake news,” as it’s often referred to, isn’t new. Organizations from the Nazis to tabloids have spread inaccurate information to control a narrative or generate revenue. On social media, these rumors can travel faster and further. Think of the shooting of security officers at a federal courthouse in Oakland, California, in May. The incident took place amid protests over the police killing of George Floyd, leading some—including Vice President Mike Pence—to blame the shooting on Black Lives Matter protesters. In fact, the man charged with the shooting was linked to the Boogaloo right-wing extremist movement.
When social media emerged in the early 2000s, attempts at misinformation on blogs or MySpace often featured bad graphics, background music, or poor page layout, making them easier to detect, says Jen Golbeck, a professor in the College of Information Studies at the University of Maryland. But platforms like Facebook and Twitter standardize the look of their pages, making it more difficult for viewers to know when they are seeing false information. And groups like Russia’s Internet Research Agency have learned how to blend in by mastering meme culture, placing a greater burden on individuals to distinguish fact from fiction.
Misinformation often flourishes in the immediate aftermath of a crisis, says Kate Starbird, a professor in the College of Engineering at the University of Washington. That’s exacerbated by the prolonged Covid-19 pandemic. “It’s not just a couple of days after an earthquake or hurricane,” says Starbird. “This is uncertainty for months and months and months about what the disease is, what the best response is, how well do masks work. It’s pervasive.”
When Kim, who is a nurse practitioner at a large hospital in Boston, started scrolling through TikTok, Covid-19 misinformation seemed endless. “There is the idea that masks are evil, Covid is a hoax, the pandemic is real but not nearly as bad as the media is making it out to be, the conspiracy theory that it was created by scientists and that it was delivered during the election year to sabotage Trump,” she says. Given all the time and energy she and her colleagues have spent treating Covid-19 patients, she was particularly irked by one commenter on her mask video who claimed case numbers have been inflated.