

Mpox (monkeypox) information on TikTok: analysis of quality and audience engagement
  1. Ao Shi1,2,
  2. Joe El Haddad1,2,
  3. Peng Cai3,
  4. Shijie Song4,
  5. Yuan Jia Wang5,
  6. Qi Liu6,
  7. Pengyang Li7
  1. 1 Faculty of Medicine, St George’s University of London, London, UK
  2. 2 School of Medicine, University of Nicosia, Nicosia, Cyprus
  3. 3 Department of Mathematical Sciences, Worcester Polytechnic Institute, Worcester, Massachusetts, USA
  4. 4 Business School, Hohai University, Nanjing, Jiangsu, China
  5. 5 Department of Molecular Biosciences, The University of Texas at Austin, Austin, Texas, USA
  6. 6 Wafic Said Molecular Cardiology Research Laboratory, Texas Heart Institute, Houston, Texas, USA
  7. 7 Division of Cardiology, Pauley Heart Center, Virginia Commonwealth University, Richmond, Virginia, USA
  1. Correspondence to Dr Pengyang Li; leelpy0109{at}


Background Online information on mpox (monkeypox) is not well studied. We have analysed the video content, information quality, and audience engagement of mpox-related videos on TikTok.

Methods Using a hashtag-based searching strategy, we identified 2462 mpox-related videos on TikTok from 1 January to 11 August 2022; 85 were included after exclusion criteria screening. Videos were evaluated for content on features and treatment of mpox. Video and information quality was assessed using the DISCERN instrument and the Journal of the American Medical Association (JAMA) criteria. We recorded video source, evaluation scores, and viewer engagement metrics. The Kruskal-Wallis test was used for statistical analysis and multiple linear regression for factor-association studies.

Results Of the 85 videos, two assessed all content topics, and videos highlighted 33% of all content items in clinical guidelines. The overall average score for the videos was 39.56 of 80 on the DISCERN instrument and 1.93 of 4 on the JAMA criteria. No video met all JAMA criteria. Subgroup analysis based on author identity showed that video scores varied by source (p<0.05 for all). Overall scores were higher for videos produced by doctors and science communicators than for those made by institutional users, nurses, and the general public. Multiple linear regression analysis showed that having people in the video (69.20, p=0.0001) and including information on treatment choices (1.15, p=0.045) were significant, independent determinants of audience engagement.

Conclusion Public-directed TikTok videos on mpox frequently provide incomplete, inaccurate information, highlighting the potential risks of using TikTok as a health information source.

  • health education and promotion
  • public health
  • Monkey Pox

Data availability statement

Data are available upon reasonable request.

This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited, appropriate credit is given, any changes made are indicated, and the use is non-commercial.



  • TikTok is one of the most active social media platforms used today. Because of its popularity and accessibility, it is an important public source of health information, especially in health emergencies such as the mpox (monkeypox) outbreak.


  • Health information about mpox on TikTok is of poor quality and lacks the necessary content to provide accurate and comprehensive clinical knowledge of the disease.


  • Our findings highlight the risks of using social media to obtain health information during public health emergencies and encourage the development of strategies for building an efficient health information sharing system.


Introduction

The emerging outbreak of mpox (monkeypox) virus raises global concern for this new public health challenge.1 Having recently spread in non-endemic areas, mpox disease presents with symptoms of fever, lymphadenopathy, and painful skin eruptions all over the body that last from 2 to 4 weeks.2 3 Human-to-human transmission has been reported.2 A state of emergency was declared in the USA4 to manage and contain the outbreak.

Because of its popularity and accessibility among communities, social media has been widely used to share health information.5 6 Moreover, increasing numbers of users turn to social media platforms for healthcare information and emotional support related to their health.7 In a previous social media study, more than 70% of respondents claimed that information acquired from social media affected their health status.8 Consequently, governments and healthcare agencies also use social media to communicate essential health information during health crises.9 Engagement analysis is an essential tool for evaluating publicly available information on different platforms because its insights may be used to acquire more views, likes, and comments on posted material.10

Reaching more than 1 billion users in 2022,11 TikTok is one of the major platforms people use to seek information and support related to healthcare and provides quick, widespread access to information on timely public health issues such as the mpox outbreak. TikTok is a short-form video mobile platform that allows users to easily create videos lasting between 15 s and 5 min. Users in the USA spend an average of 45.8 min on TikTok daily,11 making it one of the most active social media platforms. The quality of information in healthcare communication is critical; low-quality information may mislead patients in their health decisions and contribute to health crises.12 For example, according to a COVID-19 survey, 12% of respondents believed that injecting disinfectant could prevent or cure COVID-19.13 Moreover, the repetition of low-quality information can significantly distort cognition, even when an individual has knowledge of a topic; this scenario is known as the ‘illusory truth effect’.14 15 Health-related content on TikTok, an emerging source of health information, has shown high audience engagement.16 17 Videos mentioning infectious diseases such as COVID-19 and sexually transmitted diseases, as well as related topics such as vaccines and mental health, were viewed more than 100 billion times in 2022.18 During the COVID-19 pandemic, TikTok served as a platform for physicians and public health departments to promote knowledge of the disease and to identify and correct misinformation on COVID-19.16 19 Furthermore, COVID-19-related TikTok videos conveying alarm or concern or information on disease susceptibility and severity had higher user engagement.16 Therefore, evaluating the health-related content shared on this popular platform is both urgent and important.20

Previous studies have focused on the quality of information and video content addressing health matters like COVID-19 and Takotsubo syndrome and on the engagement modalities of these videos.21 22 However, information on mpox is lacking, despite the outbreak and the rapid access to healthcare content on TikTok. In the present study, we aim to examine the content and quality of information in TikTok videos that address the recent mpox outbreak as well as the engagement these videos receive based on their characteristics and content.


Methods

Search strategy

We conducted the online search for this study on 11 August 2022, by using a web crawler tool to capture all videos uploaded between 1 January 2022 and 11 August 2022. To retrieve TikTok videos specifically related to mpox, we used the following 12 hashtags: Monkeypox virus, Monkeypox, Monkey Pox, monkeypoxrash, monkeypoxoutbreak, monkeypoxpandemic, monkeypoxsymptoms, monkeypoxtreatment, monkeypoxtips, monkeypoxprevention, monkeypoxcase, and monkeypoxcases. The initial search returned 2462 videos (4, 1244, 49, 27, 13, 286, 496, 113, 4, 201, 4, and 21, respectively, for the above terms). All videos for each keyword were retrieved and screened. After manually checking each video, 85 were included for analysis. Videos were excluded if they were (1) not related to mpox or lacked educational information, (2) duplicate videos, (3) not in English, or (4) not original. The workflow chart is shown in figure 1.

Figure 1

Workflow for video search strategy.

Data extraction

Baseline characteristics of the videos and the author profiles were extracted on the same day as the video search. We collected the following information on video authors: the numbers of followers and accounts followed, the total numbers of posted videos and likes, and the account ID. In addition, we noted whether they had profile verification, a profile photo, and contact details. Based on the video profile page, the TikTok authors were divided into four categories: (1) doctors (MDs and PhDs) and science communicators, (2) institutions, (3) nurses and other healthcare workers, and (4) general public users.

For each video, we recorded the uniform resource locator, the date published, the numbers of likes, comments, and forwards, and the video length. Moreover, the content was evaluated for the use of human characters, background music, emojis, flash or animation, and subtitles.

Operationalisation of variables/video assessment

The content of selected videos was examined by using three validated tools for evaluating health information: the DISCERN instrument (Quality Criteria for Consumer Health Information),23–25 the Journal of the American Medical Association (JAMA) criteria,23 26 and six questions obtained from Goobie et al.27

The DISCERN instrument comprises 16 questions that evaluate healthcare information from three perspectives: reliability of the publication (questions 1 to 8), quality of information about treatment options (questions 9 to 15), and the overall score of the publication (question 16).23 24 26 Each question is scored from 1 point (poorest coverage) to 5 points (best coverage). A total score of 26 or below is very poor; 27 to 38 is poor; 39 to 50 is fair; 51 to 62 is good; and 63 or above is excellent.23 24
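As an illustrative aside only (not part of the study's analysis, which was run in R), the banding just described can be expressed as a small helper function; the name `discern_band` is our own:

```python
def discern_band(total_score):
    """Map a total DISCERN score (16 questions x 1-5 points,
    so 16-80 overall) to the quality band described above."""
    if total_score <= 26:
        return "very poor"
    if total_score <= 38:
        return "poor"
    if total_score <= 50:
        return "fair"
    if total_score <= 62:
        return "good"
    return "excellent"
```

For example, the study's overall average of 39.56 falls just inside the 'fair' band.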

The JAMA criteria use four questions to assess the reliability, plausibility, transparency, and usefulness of video information on the internet; a score of 0 indicates that an item is not included and a score of 1 indicates that it is included in the video. Each video can thus score between 0 (minimum) and 4 (maximum).23 24 26

Goobie et al analysed videos based on six categories of questions: disease definition, signs or symptoms, risk factors, diagnosis, management, and complications. Each aspect was scored as 0, 1, or 2 (not addressed, partially addressed, and sufficiently addressed, respectively).

Video coding

All video content was independently scored and coded by two authors (AS and JE). A training exercise was conducted before coding. In this exercise, 20 videos each were scored and coded independently by the two authors, and any resulting discrepancies were addressed and resolved to reach consistent standards. Microsoft Excel (2022) was used for data collection, processing, and analysis.

Statistical analysis

Descriptive statistics, including the mean, median, range, and standard deviation (SD), were calculated. The R statistics software (version 4.2.1, R Development Core Team) and Prism GraphPad (version 8.4.3) were used for data analysis.

Discrepancies between the two raters were permitted, and the average of their scores was used for the final analysis. To test the reliability of the ratings, the average-agreement intra-class correlation coefficient was calculated; it was 0.98 for the content rating, 0.99 for the DISCERN rating, and 0.99 for the JAMA benchmark rating. All coefficients exceeded the recommended value of 0.75, indicating that the ratings had good inter-rater reliability.28 The Kruskal-Wallis test was performed to identify differences between the extracted variables. A p value <0.05 was deemed statistically significant.
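The group comparisons above rely on the Kruskal-Wallis test, which the study ran in R. Purely as an illustration, a minimal Python sketch of the H statistic is shown below; it omits the tie correction factor for brevity:

```python
def kruskal_wallis_h(*groups):
    """Kruskal-Wallis H statistic over two or more groups.
    All observations are ranked jointly (ties receive the average
    rank of their run), then group rank sums are compared. Compare
    H against a chi-square distribution on k-1 degrees of freedom
    to obtain a p value. Tie correction is omitted for brevity."""
    pooled = sorted(v for g in groups for v in g)
    ranks = {}
    i = 0
    while i < len(pooled):
        j = i
        while j < len(pooled) and pooled[j] == pooled[i]:
            j += 1
        ranks[pooled[i]] = (i + 1 + j) / 2  # average rank of the tie run
        i = j
    n = len(pooled)
    s = sum(sum(ranks[v] for v in g) ** 2 / len(g) for g in groups)
    return 12 / (n * (n + 1)) * s - 3 * (n + 1)
```

For instance, two clearly separated groups such as [1, 2, 3] and [4, 5, 6] yield H = 27/7 ≈ 3.86.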

We used the number of likes as a proxy measure of audience engagement. Multiple linear regression was performed to evaluate the association between audience engagement and other factors such as video quality, video content, and author categories. Review by the institutional review board was not required because this study did not involve human subjects.
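The regression itself was performed in R; as a hedged, self-contained sketch of the underlying technique only (the design matrix below is a toy example, not the study's data), ordinary least squares can be computed from the normal equations (XᵀX)β = Xᵀy:

```python
def ols(X, y):
    """Ordinary least squares: solve the normal equations
    (X'X) beta = X'y by Gaussian elimination. X is a list of
    rows; include a leading 1.0 in each row for the intercept."""
    k = len(X[0])
    A = [[sum(row[i] * row[j] for row in X) for j in range(k)] for i in range(k)]
    b = [sum(row[i] * yi for row, yi in zip(X, y)) for i in range(k)]
    for col in range(k):  # forward elimination with partial pivoting
        piv = max(range(col, k), key=lambda r: abs(A[r][col]))
        A[col], A[piv] = A[piv], A[col]
        b[col], b[piv] = b[piv], b[col]
        for r in range(col + 1, k):
            f = A[r][col] / A[col][col]
            for c in range(col, k):
                A[r][c] -= f * A[col][c]
            b[r] -= f * b[col]
    beta = [0.0] * k
    for i in range(k - 1, -1, -1):  # back substitution
        beta[i] = (b[i] - sum(A[i][j] * beta[j] for j in range(i + 1, k))) / A[i][i]
    return beta

# Toy design matrix: intercept column plus two predictors.
X = [[1, 0, 0], [1, 1, 0], [1, 0, 1], [1, 1, 1], [1, 2, 1]]
y = [2, 5, 1, 4, 7]  # constructed so the exact fit is beta = [2, 3, -1]
```

On this noise-free toy data the fitted coefficients recover the generating values exactly, which is a quick sanity check for any least-squares implementation.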


Results

Descriptive characteristics of videos and authors

The average length of the evaluated videos was 78 s (range 6–243 s) (table 1). The mean numbers of likes, comments, and forwards per video were 11015, 211, and 693, respectively. Overall, most videos (90.59%, 77/85) featured people (table 2). Background music was used in 24.71% (21/85) of videos and flash or animation in 36.47% (31/85). Emojis and subtitles were used less frequently, at 10.59% (9/85) and 5.88% (5/85), respectively.

Table 1

Descriptive characteristics of mpox-related videos on TikTok

Table 2

Production characteristics of mpox-related videos on TikTok

The most common authors of videos were doctors and science communicators (43.53%, 37/85), followed by general public users (35.29%, 30/85), nurses and other healthcare workers (12.94%, 11/85), and institutional users (8.24%, 7/85) (figure 2). Table 3 shows the baseline characteristics of videos and authors categorised by their public profile.

Figure 2

The distribution of author groups who made the mpox-related videos.

Table 3

Characteristics of mpox-related videos on TikTok categorised by source

Video content assessment

To study the content of mpox-related videos further, we used the six-question set from the study by Goobie et al.27 The median content score of the analysed videos was 4 out of 12 (table 4), indicating that videos addressed a median of 33.33% of the content items highlighted in clinical practice guidelines. We found that 85.29% of all videos addressed at least one risk factor (figure 3). Content scores were generally higher in videos from doctors and science communicators and from institutional users than in other groups (tables 4 and 5).

Figure 3

Percentage of videos addressing each content domain from Goobie et al.27 Blue represents the percentage of videos that addressed all items in a single domain, orange represents the percentage of videos that addressed some items in a single domain, and black represents the percentage of videos that did not address the specified domain.

Table 4

Video content scores as assessed by Goobie et al 27

Table 5

Video content and quality scores by different instruments

Video and information quality assessment

The average score obtained by using the DISCERN instrument was 39.56 out of 80 (table 5), indicating that the overall quality of information in the videos was poor. Of the author groups, doctors and science communicators made videos with the highest average score (42.2), followed by institutional users (40.6), whereas the lowest scoring videos were made by general public users (36.6), indicating that the video quality of this group was the worst (table 5). Several of the questions addressing reliability, bias, and uncertainty of the videos (questions 1, 2, 3, 6, and 8) scored above 3 of 5 points. However, questions 9–14 scored around 1, indicating that the videos performed poorly in providing treatment-related information and in using up-to-date supporting references (figure 4).

Figure 4

DISCERN scores for all videos across different sources. The shade of each cell represents the DISCERN score, ranging from a low score of 1 (white) to a high score of 5 (dark blue). (A) Doctors and science communicators. (B) General public users. (C) Nurses and other healthcare workers. (D) Institutional users.

Information quality was also assessed by using the JAMA criteria. No video in our study met all the JAMA criteria, because all lacked disclosure statements. The average JAMA score was 1.93. Videos made by doctors and science communicators scored the highest (2.20) on information quality, followed by nurses and other healthcare workers (2.00), institutional users (1.71), and general public users (1.62) (table 5).

Overall, assessment scores based on the three instruments used in this study showed statistical differences among the author groups (table 5).

Analysis of audience engagement

Multiple linear regression analysis of all factors described in the above sections identified two significant, independent determinants of audience engagement (measured by likes): the presence of people in the video (69.20, p=0.0001) and the DISCERN score for quality of treatment-choice information (1.15, p=0.045). No other factor showed a significant association with engagement (table 6).

Table 6

Multiple linear regression analysis on audience engagement*


Discussion

The global outbreak of mpox has triggered an emerging trend of mpox-related videos created for sharing and educational purposes. To our knowledge, this study is the first to assess the quality, content, and audience engagement of videos pertaining to mpox on TikTok, one of the main social media platforms globally. We assessed video content by using six content categories related to mpox disease in clinical practice; only a third of these categories were reported in the videos, with most addressing mpox risk factors. Moreover, our assessment showed that the overall quality of video information was poor. In addition, we found that the presence of people in the video and the inclusion of information on the quality of treatment choices were independent determinants of audience engagement. Overall, the material on the recent mpox outbreak shared through TikTok videos was frequently unreliable and incomplete, hindering public health efforts to share accurate information on mpox.

Content inclusion analysis

In assessing video content, we found that most videos (85.29%) discussed the concept of risk factors for mpox disease. Jiang et al also reported that risk factors were a main concept addressed in discussing Takotsubo syndrome on TikTok.22 Understanding the risk factors of mpox disease is important in slowing down the spread of the disease. However, on average, videos addressed only one third of the content items highlighted in clinical practice guidelines. In their study on genitourinary cancers, Xue et al found an imbalance in the coverage of content categories shared on TikTok.29 Our finding of poor content inclusion in most videos may relate to the limited information available on the disease in the literature and to the novelty of mpox disease. Furthermore, content scores were generally higher in videos made by doctors and research scientists than in those made by other groups. Previous studies have shown the same trend.29 30 These findings highlight the important role of these professionals in sharing information on social media.

Quality analysis

Our study is the first to examine the quality of mpox-related videos on TikTok. The DISCERN and JAMA instruments indicated an overall poor quality for videos addressing mpox on TikTok. Our findings support those of Sledzinska et al, who found an overall poor quality of YouTube videos addressing meningioma treatment.31 The lack of scientific supervision of information quality on social media might explain these findings. In fact, there is no restriction on the content that is published and no discrimination on the quality and relevance of information in published videos.22 In studying the quality of videos by author type, we found that doctors and science communicators made the highest quality videos. Existing studies show that videos created by professionals are of better quality, whereas videos from general users are of the lowest quality.22 32–34 Specifically, videos from general users scored lowest on the use of current references and the inclusion of treatment-related information. In their study on genitourinary cancers, Xue et al also found deficient referencing and treatment-related information.29 Our quality-of-information results emphasise the need to develop guidance for health information videos on social media and to encourage more content from health professionals. Instructing video creators on the content they need to share and clarifying the standard of quality expected may help promote the production of higher quality videos in this setting.

Audience engagement analysis

Audience engagement was measured by the number of likes for each video. Using multiple linear regression analysis, we found that two factors—the presence of people, and the discussion of the quality of treatment choices—were significant and independent determinants of audience engagement. No other factor had a significant effect on audience engagement. The number of likes is often seen as a user filter and an indicator of popularity and can reflect video quality.22 35–37 Interestingly, the DISCERN instrument’s category on quality of treatment choices affected a video’s audience engagement in our study. This finding contrasts with those of previous studies, which showed no correlation between video quality and audience engagement.22 36 38 One explanation for this discrepancy might be the lack of essential information about mpox treatment on social media, which drives audience engagement toward the few videos addressing management modalities of mpox. Additionally, more than 90% of videos made by health professionals featured people, compared with 28.57% of videos made by general users. Health professional groups produced higher quality videos, thereby exposing the public to more reliable videos on mpox disease. Thus, consumers are offered higher quality content, possibly an important factor in improving the outcomes of mpox disease. However, in our overall results, video quality was poor, hindering the positive outcomes potentially generated by audience engagement.


This study shows that health information about mpox shared on TikTok is of poor quality and lacks the content needed to inform viewers appropriately about mpox disease. However, in some videos, select aspects of the disease were addressed in an unbiased and reliable way. In addition, audience engagement was higher with better quality videos. Nevertheless, overall, people seeking health information about mpox from TikTok were faced with low-quality videos that did not provide adequate information on all aspects of the disease. Hence, our findings have some practical implications.

Our study highlights the risks of relying on TikTok or other social media as a health information source. Social media has become indispensable, is constantly accessed by most people,11 and serves as a way both to share and to acquire information on all topics.22 39 40 Poor quality videos with biased content may lead to confusion and impair successful informed decision making. This exacerbates the ‘infodemic’ on social media,41 42 deterring efforts to prevent and manage disease outbreaks,41 notably the mpox outbreak.

The findings presented here indicate the need for the production of higher quality videos with richer content in accessible platforms such as TikTok. Because accessing social media for information is inevitable, creating more efficient ways to share health information is essential. Previous studies have shown that health professionals and governments have adopted social media tools to manage health crises.43 However, with the evolution of social media platforms and the ease of access in producing and sharing videos, it has become essential to adopt efficient strategies for improving the health information sharing system. Moreover, influencers and celebrities from different social media platforms can affect audience perceptions and behaviours about health issues. Collaboration with influencers can promote health communication messaging, increase disease awareness, and enhance audience engagement.44–46 From this perspective, during the COVID-19 pandemic, TikTok provided updated WHO resources and prompted influencers to share more accurate COVID-19-related information.47 Potential future strategies may involve including clear guidelines on creating health-information videos in social media apps or enhancing algorithms in the apps that favour reliable content.


Limitations

This study has limitations. First, although all efforts were made to provide adequate video quality assessments, there is no single well-validated way to assess the quality of videos and their content. Specifically, we did not assess how the limited video length on TikTok could affect the quality. Given the short video length and the nature of the TikTok platform, covering all perspectives of health information, including disclosures, in appropriately detailed fashion is challenging. Second, our video search was performed in a limited time period, and the high rate of adding and deleting videos on TikTok could have affected the results. Third, there may be limitations in using the DISCERN and JAMA instruments to evaluate video content because they were originally designed to evaluate website information.48 Finally, our study does not assess end user behavioural and psychological changes caused by the videos.


Conclusion

Our analysis of TikTok videos addressing the mpox outbreak showed a lack of the content necessary for conveying informed clinical knowledge of the disease. In addition, the overall quality of videos was low. We also found that the presence of people in the video or the discussion of the quality of treatment choices increased audience engagement. This study sheds light on the risks involved in using social media for health information and encourages the development of strategies for building an efficient health information sharing system.


Ethics statements

Patient consent for publication

Ethics approval

Not applicable.


Acknowledgements

The authors thank Rebecca Bartow, PhD, of the Department of Scientific Publications at the Texas Heart Institute for editorial contributions.



  • Handling editor Seye Abimbola

  • Contributors AS and LP conceptualised the project. AS and JEH collected data and drafted the original manuscript. AS, PC, and LP performed data analysis. YJW and QL revised the manuscript. LP is responsible for the overall content as the guarantor. All authors edited and approved the final manuscript.

  • Funding The authors have not declared a specific grant for this research from any funding agency in the public, commercial or not-for-profit sectors.

  • Competing interests None declared.

  • Patient and public involvement Patients and/or the public were not involved in the design, or conduct, or reporting, or dissemination plans of this research.

  • Provenance and peer review Not commissioned; externally peer reviewed.