Network meta-analysis (NMA) is a powerful tool for comparing the effectiveness of multiple interventions 1. It synthesizes direct and indirect evidence from a network of randomized controlled trials, providing a comprehensive picture of the relative merits of different treatments. However, missing data is a common challenge in NMAs, potentially leading to biased results and misleading conclusions if not handled appropriately. Transparency in reporting the methods used to address missing data is crucial for ensuring the credibility and reliability of NMA findings. This article provides best practices for transparent reporting, drawing on established guidelines and current research.
Importance of Transparency in Reporting Missing Data Handling
Missing outcome data can occur in NMAs for various reasons, such as participants withdrawing from a trial, incomplete outcome assessments, or inconsistencies in how the primary studies report their findings 2. The impact of missing data depends on the underlying missingness mechanism and on the method used to handle it. Ignoring missing data can lead to biased estimates and reduced statistical power 3. While statistical techniques can mitigate the impact of missing data 3, they rely on assumptions that may not always hold. The impact also varies with the quantity being estimated: in network data more generally, some structural measures are more robust to missing data than others 4 (for example, measures based on indegrees, because incoming ties are only partially missing).
Transparent reporting of missing data handling allows readers to:
- Assess the potential for bias: Understanding the extent and nature of missing data, along with the chosen handling method, helps readers evaluate the risk of bias in the NMA results.
- Judge the validity of assumptions: Different methods for handling missing data rely on different assumptions. Transparent reporting allows readers to assess the validity of these assumptions in the context of the specific NMA.
- Replicate the analysis: Clear and detailed reporting enables other researchers to replicate the analysis and verify the findings.
- Conduct sensitivity analyses: Transparency facilitates the exploration of the robustness of results to different missing data assumptions.
Systematic reviews generate different types of knowledge for different users, such as patients, healthcare providers, researchers, and policymakers 5. Transparent reporting serves these stakeholders by providing a clear and comprehensive account of how missing data were addressed in the NMA. It is also important to distinguish the different types of bias due to missing evidence, such as publication bias, where studies with statistically nonsignificant results are less likely to be published, and selective outcome reporting bias, where specific outcome results are omitted from a publication 2.
Reporting Guidelines for Network Meta-Analyses
Several reporting guidelines provide a framework for transparent reporting in NMAs. These guidelines offer valuable recommendations for ensuring clarity, completeness, and consistency in reporting, which are essential for facilitating the interpretation and assessment of NMA findings.
| Guideline | Focus | Key Recommendations for Missing Data |
| --- | --- | --- |
| PRISMA-NMA 6 | Reporting of systematic reviews incorporating network meta-analyses | Describe the methods for handling data and combining results (e.g., handling of multi-arm trials, selection of variance structure, prior distributions in Bayesian analyses, model fit) 8. |
| NICE Decision Support Unit (DSU) Technical Support Documents (TSDs) 8 | Conducting and reporting NMAs | Recommendations for handling missing data and presenting results of sensitivity analyses. |
| Cochrane Handbook for Systematic Reviews of Interventions 9 | Conducting systematic reviews, including NMAs | Discusses the assumptions underlying NMAs, methods for handling missing data, and the importance of assessing the risk of bias due to missing evidence. |
It is worth noting that an international group developed the extension of the PRISMA statement specifically for network meta-analyses 1 (the PRISMA-NMA checklist listed above). This initiative reflects ongoing efforts to improve reporting standards and provide more specific guidance for researchers conducting NMAs.
Best Practices for Reporting Missing Data Handling
Building on these guidelines, the following best practices promote transparency in reporting methods for handling missing data in NMAs:
1. Describe the Extent and Pattern of Missing Data
- Quantify missing data: Report the amount of missing data for each outcome and intervention arm, both overall and within individual studies.
- Characterize the pattern: Describe the pattern of missing data (e.g., monotone, non-monotone) and any potential correlations between missingness and other variables.
- Investigate reasons for missingness: Explore and report the potential reasons for missing data, such as participant dropout or study limitations. This information can help assess the plausibility of different missing data mechanisms.
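The per-arm and overall missingness figures recommended above can be tabulated directly from extracted counts. The following is a minimal sketch; the study names, arm labels, and counts are hypothetical.

```python
# Sketch: tabulating missing outcome data by study and arm.
# Study names and counts are hypothetical illustrations.
arms = [
    {"study": "Smith 2019", "arm": "treatment A", "randomized": 120, "analyzed": 101},
    {"study": "Smith 2019", "arm": "placebo",     "randomized": 118, "analyzed": 110},
    {"study": "Lee 2020",   "arm": "treatment B", "randomized": 200, "analyzed": 176},
    {"study": "Lee 2020",   "arm": "placebo",     "randomized": 198, "analyzed": 180},
]

for a in arms:
    a["missing"] = a["randomized"] - a["analyzed"]
    a["pct_missing"] = 100 * a["missing"] / a["randomized"]
    print(f'{a["study"]:<12} {a["arm"]:<12} {a["missing"]:>3} missing ({a["pct_missing"]:.1f}%)')

# Overall missingness across the network, reported alongside per-arm figures.
overall = 100 * sum(a["missing"] for a in arms) / sum(a["randomized"] for a in arms)
print(f"Overall: {overall:.1f}% of randomized participants missing the outcome")
```

Reporting both the per-arm and the overall figures lets readers see whether missingness is balanced across arms, which matters for the plausibility of the MAR assumption discussed below.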
2. Specify the Missing Data Mechanism
- Define the assumed mechanism: Clearly state the assumed missing data mechanism (e.g., missing completely at random, missing at random, missing not at random) 11.
- Justify the assumption: Provide a rationale for the chosen assumption based on the available evidence and clinical knowledge.
- Acknowledge limitations: Discuss any limitations or uncertainties associated with the assumed mechanism.
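To make the distinction between mechanisms concrete, the following sketch simulates a continuous outcome and shows how a complete-case mean behaves under each mechanism. The distributions and missingness probabilities are hypothetical choices for illustration only.

```python
# Sketch: illustrating MCAR, MAR, and MNAR with simulated data.
# All distributions and probabilities below are hypothetical.
import random
random.seed(1)

n = 10_000
# A covariate (e.g. baseline severity) and a correlated outcome, both mean 0.
baseline = [random.gauss(0, 1) for _ in range(n)]
outcome  = [b + random.gauss(0, 1) for b in baseline]

def complete_case_mean(values, miss_prob):
    """Mean among participants who are not randomly set to missing."""
    kept = [v for v, p in zip(values, miss_prob) if random.random() > p]
    return sum(kept) / len(kept)

mcar = complete_case_mean(outcome, [0.3] * n)                                 # independent of all data
mar  = complete_case_mean(outcome, [0.5 if b > 0 else 0.1 for b in baseline]) # depends on observed covariate
mnar = complete_case_mean(outcome, [0.5 if y > 0 else 0.1 for y in outcome])  # depends on the missing value itself

print(f"true mean 0.00 | MCAR {mcar:+.2f} | MAR {mar:+.2f} | MNAR {mnar:+.2f}")
```

The complete-case mean stays near the true value only under MCAR; under MAR the bias could be removed by conditioning on the observed covariate, while under MNAR it cannot be removed without additional, untestable assumptions.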
3. Detail the Handling Method
- Name the method: Clearly identify the specific method used to handle missing data (e.g., complete case analysis, single imputation, multiple imputation, model-based methods) 12.
- Provide rationale: Explain the reasons for selecting the chosen method, considering its strengths and limitations in the context of the NMA.
- Describe implementation: Provide sufficient detail about the implementation of the method, including any specific assumptions or parameters used (e.g., imputation model, prior distributions).
- Consider different perspectives: When handling missing data, it is important to consider different ways of regarding missing values. For example, in a meta-analysis of interventions for excessive drinkers, missing values could be regarded as failures (providing a lower bound on the success rate), or different scenarios, such as best-case and worst-case, could be explored 13.
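The scenario-based approach from the excessive-drinkers example can be sketched for a single two-arm trial with a binary outcome as follows; the counts and scenario labels are hypothetical illustrations.

```python
# Sketch: scenario-based handling of missing binary outcome data.
# All counts are hypothetical.
def rate(successes, analyzed, missing, assume_success):
    """Success rate after assigning one outcome to all missing participants."""
    imputed = missing if assume_success else 0
    return (successes + imputed) / (analyzed + missing)

# Hypothetical two-arm trial: (successes, analyzed, missing participants)
trt = (40, 80, 20)
ctl = (30, 80, 20)

scenarios = {
    "all missing fail":         (False, False),  # missing-as-failure in both arms
    "all missing succeed":      (True, True),
    "best case for treatment":  (True, False),
    "worst case for treatment": (False, True),
}
for name, (a_trt, a_ctl) in scenarios.items():
    rd = rate(*trt, a_trt) - rate(*ctl, a_ctl)
    print(f"{name:<26} risk difference {rd:+.2f}")
```

If the treatment effect keeps the same sign across all four scenarios, the conclusion is robust to the missing data; if the worst-case scenario reverses it, that fragility should be reported.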
4. Data Extraction and Reporting
Transparent reporting in NMAs requires careful consideration of the data extracted from included articles. This includes:
- Study identification: Report details such as the first author's name, location of corresponding authors, year of publication, and journal name 14.
- Study design and characteristics: Describe the number and design of studies included in the NMA, the study population, the interventions being compared, and the outcome measures 14.
- Outcomes: List and define all outcomes for which data were sought, ensuring that all results compatible with each outcome domain in each study are included 15.
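For reproducibility, it can help to fix an explicit extraction schema covering the fields listed above before data extraction begins. The record below is a minimal, hypothetical sketch; the field names are illustrative, not a prescribed standard.

```python
# Sketch: a minimal study-level extraction record. Field names and the
# example values are hypothetical illustrations, not a required schema.
from dataclasses import dataclass, field

@dataclass
class ExtractionRecord:
    first_author: str
    year: int
    journal: str
    country: str            # location of corresponding authors
    design: str
    population: str
    interventions: list[str]
    outcomes: list[str] = field(default_factory=list)

rec = ExtractionRecord(
    first_author="Smith", year=2019, journal="Example Journal", country="UK",
    design="parallel-group RCT", population="adults with condition X",
    interventions=["treatment A", "placebo"],
    outcomes=["response at 12 weeks", "dropout"],
)
print(rec.first_author, rec.year, len(rec.interventions))
```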
5. Conduct and Report Sensitivity Analyses
- Explore different assumptions: Conduct sensitivity analyses to assess the robustness of the NMA results to different missing data assumptions, for example by comparing results under MAR and MNAR scenarios 16. Because the MAR assumption cannot be tested from the observed data, it is important to assess how robust the results are to reasonable deviations from it 13.
- Vary imputation models: If using imputation, explore different imputation models or assumptions to assess their impact on the results. This could involve comparing different types of imputation models, such as predictive mean matching or logistic regression, or varying the assumptions about the relationship between missingness and other variables.
- Present sensitivity analysis results: Clearly present the results of the sensitivity analyses, highlighting any significant changes in the conclusions.
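One common way to probe departures from MAR for binary outcomes is to vary an informative missingness odds ratio (IMOR): the assumed odds of the event among missing participants relative to observed ones, with IMOR = 1 corresponding to MAR. The following is a minimal sketch for a single hypothetical two-arm comparison; the counts and IMOR grid are illustrative.

```python
# Sketch: sensitivity analysis over an informative missingness odds ratio
# (IMOR). IMOR = 1 reproduces MAR; other values probe MNAR scenarios.
# Trial counts are hypothetical.
import math

def adjusted_log_or(e1, n1, m1, e2, n2, m2, imor):
    """Log odds ratio after imputing expected events among missing participants."""
    def arm_rate(events, analyzed, missing):
        p_obs = events / analyzed
        odds_miss = imor * p_obs / (1 - p_obs)   # IMOR scales the observed odds
        p_miss = odds_miss / (1 + odds_miss)
        return (events + p_miss * missing) / (analyzed + missing)
    p1, p2 = arm_rate(e1, n1, m1), arm_rate(e2, n2, m2)
    return math.log(p1 / (1 - p1)) - math.log(p2 / (1 - p2))

for imor in (0.5, 1.0, 2.0):
    lor = adjusted_log_or(45, 90, 15, 30, 88, 18, imor)
    print(f"IMOR {imor:.1f}: log OR {lor:+.3f}")
```

Reporting the estimate across a plausible IMOR range (rather than a single value) shows readers directly how sensitive the conclusion is to the missingness assumption.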
6. Discuss Limitations and Implications
- Acknowledge limitations: Discuss the limitations of the chosen missing data handling method and the potential impact of missing data on the NMA results.
- Address uncertainty: Acknowledge any uncertainty surrounding the missing data mechanism and its potential influence on the findings.
- Interpret results cautiously: Interpret the NMA results in light of the potential for bias due to missing data.
Examples of Reporting
To illustrate how these best practices can be applied in practice, we provide the following examples of reporting missing data handling:
- Example 1: "Missing data were handled using multiple imputation under the missing at random assumption. We used a predictive mean matching imputation model with all available covariates. We conducted sensitivity analyses to assess the robustness of the results to different missing data assumptions, including missing not at random scenarios." 16
- Example 2: "We assumed that data were missing completely at random. We conducted a complete case analysis, excluding participants with any missing outcome data. We acknowledge that this approach may introduce bias if the missing data are not truly MCAR." 13
Conclusion
Transparency in reporting methods for handling missing data is essential for ensuring the credibility and reliability of network meta-analyses. By following the best practices outlined in this article, researchers can enhance the trustworthiness of their findings and contribute to a more informed interpretation of the evidence. Clear and comprehensive reporting allows readers to assess the potential for bias, judge the validity of assumptions, and understand the implications of missing data for the NMA conclusions.
Specifically, researchers should prioritize:
- Describing the extent and pattern of missing data: This includes quantifying the amount of missing data, characterizing the pattern of missingness, and investigating the reasons for missing data.
- Specifying the assumed missing data mechanism: Clearly state and justify the assumed mechanism (MCAR, MAR, or MNAR) and acknowledge any limitations.
- Detailing the handling method: Identify the specific method used, provide a rationale for its selection, and describe its implementation.
- Conducting and reporting sensitivity analyses: Explore the robustness of results to different missing data assumptions and imputation models.
By adhering to these best practices, researchers can contribute to a more transparent and trustworthy body of evidence derived from NMAs, ultimately leading to more informed decision-making in healthcare and other fields. It is important to emphasize that inadequate reporting of results may mislead clinical researchers 1, highlighting the crucial role of transparency in maintaining research integrity.
Works cited
1. Reporting of results from network meta-analyses: methodological ..., accessed on January 16, 2025, https://www.bmj.com/content/348/bmj.g1741
2. Tool to assess risk of bias due to missing evidence in network meta ..., accessed on January 16, 2025, https://www.medrxiv.org/content/10.1101/2021.05.02.21256160.full
3. The Impact of Missing Data on Statistical Analysis and How to Fix It ..., accessed on January 16, 2025, https://medium.com/@tarangds/the-impact-of-missing-data-on-statistical-analysis-and-how-to-fix-it-3498ad084bfe
4. www.cmu.edu, accessed on January 16, 2025, https://www.cmu.edu/joss/content/articles/volume10/huisman.pdf
5. The PRISMA 2020 statement: an updated guideline for reporting ..., accessed on January 16, 2025, https://www.bmj.com/content/372/bmj.n71
6. Appendix 2: Reporting checklist (PRISMA-NMA) - BMJ Open, accessed on January 16, 2025, https://bmjopen.bmj.com/content/bmjopen/9/10/e032773/DC2/embed/inline-supplementary-material-2.pdf?download=true
7. NMA — PRISMA statement, accessed on January 16, 2025, https://www.prisma-statement.org/nma
8. Appendix K: Network meta-analysis reporting standards | Tools and ..., accessed on January 16, 2025, https://www.nice.org.uk/process/pmg20/resources/developing-nice-guidelines-the-manual-appendices-2549710189/chapter/appendix-k-network-meta-analysis-reporting-standards
9. Chapter 11: Undertaking network meta-analyses - Cochrane Training, accessed on January 16, 2025, https://training.cochrane.org/handbook/current/chapter-11
10. Chapter 13: Assessing risk of bias due to missing evidence in a ..., accessed on January 16, 2025, https://training.cochrane.org/handbook/current/chapter-13
11. methods.cochrane.org, accessed on January 16, 2025, https://methods.cochrane.org/statistics/sites/methods.cochrane.org.statistics/files/uploads/SMG_training_course_cardiff/2010_SMG_training_cardiff_day1_session3_higgins.pdf
12. Handling Missing Data: Best Practices for 2024 - Editverse, accessed on January 16, 2025, https://editverse.com/handling-missing-data-best-practices-for-researchers-in-2024/
13. Dealing with missing outcome data in meta‐analysis - PMC, accessed on January 16, 2025, https://pmc.ncbi.nlm.nih.gov/articles/PMC7003862/
14. Evaluation of the Reporting Standard Guidelines of Network Meta ..., accessed on January 16, 2025, https://pmc.ncbi.nlm.nih.gov/articles/PMC9778181/
15. Checklist for reporting a systematic review (with or without a meta ..., accessed on January 16, 2025, https://www.goodreports.org/reporting-checklists/prisma/
16. The M-Value: A Simple Sensitivity Analysis for Bias Due to Missing ..., accessed on January 16, 2025, https://pmc.ncbi.nlm.nih.gov/articles/PMC10089074/