Flexible synthesis provides more relevant and current evidence for policy and research

Summarizing and synthesizing evidence through systematic review has transformed decision making. But synthesis faces three significant problems. First, most reviews do not cover the questions or geographical regions relevant to a specific decision (1). Second, repeated reviews of similar topics waste research resources (1). Third, reviews quickly become outdated (2). These problems undermine the use of evidence in decision making. They can be addressed by tools that offer flexible, precise evidence synthesis tailored to the user's context. Such tools can be extremely valuable for anyone seeking to base decisions on recent, trustworthy evidence.

Studies intended to support evidence-based decision-making are often irrelevant, too imprecise to address a specific problem, or quickly outdated, which limits their value to decision-makers. There is a better approach. Image credit: Shutterstock/Tupungato.

Significant efforts have been made to increase the reliability of scientific evidence ( ). Further progress is needed to make that evidence more useful to policymakers and practitioners ( ). Even when evidence on an issue of interest has been compiled, its relevance to a decision-maker may be limited if it is based on studies conducted in contexts that differ from the decision-maker's own ( ). The key concept is transferability: the extent to which an intervention's effectiveness will hold in another context ( ). Many factors can affect transferability, including (i) the settings, areas, or locations where interventions are implemented; (ii) the populations targeted; (iii) how and when interventions are delivered, and by whom; and (iv) how outcomes are measured. Different settings, target populations, or modes of delivery can produce substantially different outcomes.

Conversely, apparent differences in outcomes may disappear when studies use identical methodologies ( ). Differences in context matter, especially when the effects of interventions vary widely. Systematic reviews and meta-analyses can inform choices in these areas, but generalizing an intervention's effectiveness remains challenging (7). Additional analyses may clarify the issue; however, the questions authors find interesting (and therefore choose to analyze) may differ from the analyses decision-makers actually need.

Timely Evidence

Accessing the most current and relevant evidence suited to policymakers and practitioners requires a new method of evidence synthesis that builds on advances in systematic reviews. This method would let users define their situation by describing the setting, the target population, how the intervention is delivered, and how outcomes are measured (as described in SI Appendix, Table S1), and then explore the existing evidence. It should enable decision-makers to consider a range of issues and allow flexibility across contexts. Here we present an implementation of "dynamic meta-analysis," a new method that combines systematic review, meta-analysis, and an interactive web application. The key difference between dynamic and conventional meta-analysis is that the dynamic approach lets users select and weight evidence according to their own preferences and context, producing a customized analysis (see Fig. 1 for a summary). Existing online tools, by contrast, generally present evidence about interventions only abstractly or qualitatively.
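The core computation behind such a tool can be illustrated with a minimal sketch. The snippet below is hypothetical: the article does not specify the underlying statistical model, data structure, or any function names. It assumes a standard inverse-variance fixed-effect pooled mean, with each study's weight additionally scaled by a user-supplied relevance factor (for example, down-weighting studies from settings unlike the decision-maker's own), which is the "select and weight evidence" step described above.

```python
import math

# Hypothetical example data: each study carries an effect size, a variance,
# and a context tag (here, only the setting). A real tool would expose many
# more context dimensions (population, delivery, outcome measure).
studies = [
    {"effect": 0.40, "variance": 0.04, "setting": "urban"},
    {"effect": 0.10, "variance": 0.02, "setting": "rural"},
    {"effect": 0.35, "variance": 0.05, "setting": "urban"},
]

def dynamic_pool(studies, relevance):
    """Pool effect sizes with inverse-variance weights, each scaled by a
    user-defined relevance factor in [0, 1]. Returns the pooled effect and
    an approximate standard error (exact only when all relevances are 1)."""
    weights = [relevance(s) / s["variance"] for s in studies]
    total = sum(weights)
    if total == 0:
        raise ValueError("no relevant studies selected")
    pooled = sum(w * s["effect"] for w, s in zip(weights, studies)) / total
    se = math.sqrt(1.0 / total)
    return pooled, se

# A decision-maker in an urban setting keeps urban studies at full weight
# and down-weights rural evidence to 0.2.
pooled, se = dynamic_pool(
    studies, lambda s: 1.0 if s["setting"] == "urban" else 0.2
)
```

Re-running the same pooling with a different relevance function is what makes the analysis "dynamic": the evidence base stays fixed while each user's weighting, and hence the pooled estimate, changes with their context.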
