Medical Health Cluster

November 1, 2021

In Re-Analysis, Ivermectin Benefits Disappeared as Trial Quality Increased

Some authors of a widely reported meta-analysis of ivermectin studies that was flagged for including potentially fraudulent research have just posted a re-analysis on a preprint server — and ivermectin’s seemingly beneficial effects disappeared as trial quality went up.

For the re-analysis, Andrew Hill, PhD, of the University of Liverpool in England, and colleagues included 12 studies with 2,628 participants, and assessed them for bias. Overall, four studies had a low risk for bias, four studies had moderate risk, three studies were at high risk for bias, and one was potentially fraudulent.

Taken at face value, the overall meta-analysis found a 51% increase in survival with ivermectin (P=0.01), but excluding the potentially fraudulent trial, ivermectin’s benefit fell to 38% and was of borderline significance (P=0.05), they reported.

Taking out the studies with a high risk of bias led to a further drop — down to a nonsignificant 10% increase in survival (P=0.66), they noted. Further removing studies with a moderate risk of bias took the benefit down to 4% (P=0.9).
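To make the arithmetic behind that kind of sensitivity analysis concrete, the Python sketch below pools a mortality risk ratio across a set of trials using standard fixed-effect inverse-variance weighting, then re-pools after dropping trials flagged at higher risk of bias. All of the numbers (log risk ratios, standard errors, bias labels) are made-up placeholders for illustration only; they are not the trial data or methods from the re-analysis.

import numpy as np
from scipy import stats

# (log risk ratio, standard error, risk-of-bias label) -- hypothetical values
trials = [
    (np.log(0.30), 0.45, "potentially fraudulent"),
    (np.log(0.55), 0.50, "high"),
    (np.log(0.70), 0.40, "high"),
    (np.log(0.85), 0.35, "moderate"),
    (np.log(0.95), 0.30, "low"),
    (np.log(1.02), 0.28, "low"),
]

def pooled_effect(subset):
    """Fixed-effect inverse-variance pooling of log risk ratios."""
    log_rr = np.array([t[0] for t in subset])
    se = np.array([t[1] for t in subset])
    weights = 1.0 / se**2
    pooled = np.sum(weights * log_rr) / np.sum(weights)
    pooled_se = np.sqrt(1.0 / np.sum(weights))
    z = pooled / pooled_se
    p_value = 2 * (1 - stats.norm.cdf(abs(z)))  # two-sided p-value
    return np.exp(pooled), p_value

# Re-pool after excluding progressively more risk-of-bias categories
for excluded in [set(),
                 {"potentially fraudulent"},
                 {"potentially fraudulent", "high"},
                 {"potentially fraudulent", "high", "moderate"}]:
    kept = [t for t in trials if t[2] not in excluded]
    rr, p = pooled_effect(kept)
    print(f"excluding {sorted(excluded) or 'nothing'}: pooled RR = {rr:.2f}, p = {p:.2f}")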

“This has made me more wary about trusting results when you don’t have access to the raw data,” Hill told MedPage Today in an interview. “We took them on trust and that was a mistake.”

Hill and his co-authors on the re-analysis (one of whom worked on the initial meta-analysis and one of whom did not) published their findings on Research Square, the same preprint server that had hosted the study ultimately found to be fraudulent and retracted; that study had carried much of the benefit seen in the initial meta-analysis.

“I’ve been working in this field for 30 years and I have not seen anything like this,” Hill noted. “I’ve never seen people make data up. People dying before the study even started. Databases duplicated and cut and pasted.”

The retracted study by Elgazzar et al. reportedly included data showing that a third of the patients who died from COVID-19 were already dead when trial recruitment began, and that some appeared to have been hospitalized before the trial started, raising questions about the study's prospective, randomized design.

Hill said during the process of re-analysis, he also found a trial from Lebanon in which the same 11 patients had been “cut and pasted” repeatedly in the database.

“It was quite shocking, really,” he said.

It’s been a difficult road for Hill after the Elgazzar study was retracted. His meta-analysis that had included it got slapped with an “expression of concern” from publisher Open Forum Infectious Diseases, an Oxford journal.

Hill immediately stated that his team would re-run their analysis with the Elgazzar trial removed. Then the threats intensified, he wrote in The Guardian.

“Like others, I received death threats,” he told MedPage Today. “People want to believe that having a treatment allows them not to be vaccinated, but that’s simply not true.”

He felt he needed to run the re-analysis because “this is serious stuff,” he said. “Unfortunately, people were looking at studies of ivermectin, concluding that it worked, and unfortunately deciding not to get vaccinated. Some of them ended up infected, in the hospital, or some even died. That’s a very serious situation.”

For their re-analysis, Hill and colleagues conducted an in-depth evaluation of individual study quality, in addition to using the Cochrane Risk of Bias tool (RoB 2) and the CONSORT checklist.

During the individual trial evaluations, they looked at the effectiveness of the randomization process by comparing baseline characteristics across treatment arms, checked randomization dates to ensure participants entered in a similar time frame, and checked to see if recruitment into treatment arms was balanced.
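As a rough illustration of what such checks can look like when patient-level data are in hand, the Python sketch below compares treatment arms in a hypothetical table; the column names (arm, age, baseline_severity, randomization_date) and the 0.25 flag threshold on the standardized mean difference are assumptions made for this example, not details taken from the re-analysis.

import numpy as np
import pandas as pd

def standardized_mean_difference(df, arm_col, covariate):
    """Difference in arm means divided by the pooled standard deviation."""
    a, b = [g[covariate].dropna() for _, g in df.groupby(arm_col)][:2]
    pooled_sd = np.sqrt((a.var(ddof=1) + b.var(ddof=1)) / 2)
    return (a.mean() - b.mean()) / pooled_sd

def check_randomization(df, arm_col="arm", covariates=("age", "baseline_severity")):
    # Unbalanced recruitment into arms is a warning sign in a 1:1 randomized trial
    print("patients per arm:", df[arm_col].value_counts().to_dict())
    # Large baseline imbalances suggest the randomization did not work as described
    for cov in covariates:
        smd = standardized_mean_difference(df, arm_col, cov)
        flag = "  <-- check" if abs(smd) > 0.25 else ""
        print(f"standardized mean difference for {cov}: {smd:+.2f}{flag}")
    # Enrolment windows per arm, to spot arms recruited over very different periods
    if "randomization_date" in df.columns:
        print(df.groupby(arm_col)["randomization_date"].agg(["min", "max"]))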

They also analyzed patient-level data when they were available, to screen for things like duplicate participants and unexpected homogeneity or heterogeneity.
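A duplicate screen along those lines is straightforward once the raw rows are available. The Python sketch below groups patient-level records on a few baseline and outcome fields and counts exact repeats; the column names are hypothetical, but a block of identical rows, like the 11 copied patients Hill described, would surface immediately as a large group.

import pandas as pd

def screen_duplicates(df, key_cols=("age", "sex", "baseline_spo2", "outcome")):
    """Count rows that repeat exactly on the chosen columns; any group larger
    than one is a candidate duplicate, and very large groups are red flags."""
    counts = df.groupby(list(key_cols), dropna=False).size()
    return counts[counts > 1].sort_values(ascending=False)

def covariate_spread(df, covariate="baseline_spo2"):
    """Implausibly small spread in a covariate is one simple signal of the
    'unexpected homogeneity' mentioned above."""
    return df[covariate].std(ddof=1)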

Ultimately, their findings that the benefits of ivermectin diminished as trial quality increased suggested that “existing and widely used risk of bias assessment tools are not enough,” they noted in their paper.

“These tools provide a systematic framework for identifying potential key sources of bias in a trial’s internal methodology, but work on the fundamental assumption that a published study is reporting accurate and complete findings,” Hill and colleagues wrote. “They allow reviewers to make judgments on the assumption that basic standard procedure is followed, the data is real, and that no information is being intentionally hidden.”

The saga has made Hill something of an advocate for open data. He pointed to the two COVID-19 studies published in The Lancet and New England Journal of Medicine that were retracted because they relied on potentially fraudulent data from a company called Surgisphere.

Hill also noted that regulators like the FDA or the European Medicines Agency (EMA) “won’t just believe a study until they’re given the raw database and had the chance to check through it. Meta-analyses in journals don’t have that degree of scrutiny.”

Going forward, it’s “essential that access to patient-level databases is provided. If authors fail to provide this data, the study should be considered with a higher index of suspicion,” Hill and colleagues concluded.

https://www.medpagetoday.com/special-reports/exclusives/95333?xid=fb_o&trw=no&fbclid=IwAR3xCsFVGquCZd-nJ_gZGbx4qxUd8Rz8nVQ_YQlHQt8UuD9Mx9CKZwwQyG0


Credits: Comité científico Covid
