The Dark Side of Social Media
By: Will Bernstein
TikTok, the internet’s latest craze, has taken the world by storm. The app follows many established social media conventions, such as a like system for popularizing user content and a comment system for user reactions. At first glance, it seems like a harmless tool for entertainment and communication. In reality, it poses a serious security threat to its users, and because not all shared content is verified, it also spreads false information. Given events that have created tension between the two nations, such as the ongoing trade war between the United States and China, how can Americans trust that ByteDance, the China-based company that owns TikTok, is not misusing the data of its 80 million monthly users in the United States? That data can be as benign as which videos a user has liked or as alarming as the rhythm with which a user types. Most concerning of all, the Chinese government has access to all of it through its system of state-controlled capitalism. As with all social media, many people use the app responsibly and without malicious intent; however, the questionable practices of ByteDance and the predatory intentions of hackers and child traffickers outweigh TikTok’s few benefits.
In its roughly four years outside the Chinese market, the app has already seen a great deal of controversy. ByteDance remains a secretive company, and it has yet to be investigated for credible claims that it harvests user data and suppresses content made by persecuted groups. Just this year, TikTok paid a $92 million settlement to 89 million users who alleged that ByteDance participated in “the theft of private and personally identifiable TikTok user data.” ByteDance needs to be transparent about its intentions, or at least allow independent investigators to inspect its practices and confirm that nothing malicious lies beneath the surface.
Some users may claim that TikTok spreads joy and has served as a coping mechanism during the difficult year humanity has just experienced. However, TikTok has yet to address its weak mechanisms for determining whether content is appropriate for a user’s age. The app asks for a date of birth before granting access, but the many young children who lie about their age to gain admission to the trendy platform are then exposed to sexually inappropriate content. The issue runs deeper than children duping the birthdate check, though. Because there is no system that proactively blocks sexually explicit content, even users who are old enough for the app may still encounter disturbing material. The mechanisms that do exist, chiefly the “Report” button, act only after the fact, which invites child traffickers to abuse the platform and put children in danger. The aforementioned children who cheated the birthdate check make this exploitation all the more dangerous.
Perhaps most disturbing, however, is that TikTok is built to keep users on the app as long as possible, nurturing the perfect environment for unhealthy addiction. Because most social media platforms generate profit through ads, they have an incentive to keep users engaged for prolonged periods. Snapchat, another platform popular among teens, uses daily streaks to keep users returning regularly, since many fear losing their streak. Extending a streak triggers a release of dopamine, a sense of reward that makes users want to come back. Even when users try to resist the urge, these platforms exploit brain chemistry to draw them back in. Sadly, TikTok is no different. TikTok deliberately caps video length to stimulate repeated dopamine responses. Only recently did the company extend the limit to 60 seconds, and even then a video is stitched together from four 15-second segments designed to trigger that response. Furthermore, TikTok’s recommendation algorithms serve the types of videos that will maximize each individual’s dopamine response based on previously enjoyed content. This ensures that people stay hooked on TikTok even when they are trying to get off the app. Perhaps these practices would be acceptable if ByteDance at least acknowledged them and warned users of the realistic effects they can have, but currently, that is not the case.
Although TikTok has done a great deal of good, and many users enjoy the creative freedom and user-generated content that have helped them cope, the app is ultimately dangerous. ByteDance needs to be more transparent about its practices and intentions toward not only Americans but the entire international TikTok community to ensure a safer experience. The app is a magnificent tool for expressing creativity, but in its current state and under its current management, it is too harmful to a user base made up largely of young people to be considered safe. Blocking sexually explicit content and complete transparency about the techniques ByteDance employs throughout the app are necessary before TikTok can finally be considered beneficial.
Links:
https://www.bbc.com/news/technology-53476117
https://www.gbnews.ch/tiktok-for-business-what-is-tiktok-anyway/
https://medium.com/dataseries/how-tiktok-is-addictive-1e53dec10867
https://www.oberlo.com/blog/tiktok-statistics
https://www.imf.org/external/pubs/ft/fandd/2015/06/basics.htm
[Image: TikTok’s logo]
[Image: ByteDance, the Chinese company that owns TikTok]