RWE: Relevance, Reporting, and AI
Join Rob Matheis and Richard White for a discussion ranging from the basics of what real-world evidence (RWE) is and how it differs from HEOR, to the big questions of how the global use of RWE has changed over time and how the rise of AI - both analytics and large language models - could change the RWE landscape. What changes should medical communications professionals be aware of in the reporting and usage of RWE data?
Richard White is the Chief Operating Officer of Oxford PharmaGenesis, a global, independent HealthScience communications consultancy. He has more than 20 years of experience in Medical Affairs, and founded the Oxford PharmaGenesis Value Demonstration and Patient Engagement Practices to apply evidence-based communication best practice to clinical, HEOR, RWE, patient-reported outcome (PRO) and patient preference studies. He has run training workshops in these areas for multiple pharmaceutical companies, and has also delivered invited presentations, panel discussions and roundtables at international conferences such as ISMPP, TIPPA, Patient Summit Europe, Patient Engagement & Experience Summit and the World Orphan Drug Congress.
Rob Matheis (00:00):
Hello, and welcome to InformED, a podcast series where you're going to hear experts share their thought-provoking insights and lessons in the field of medical communications. This series is brought to you by ISMPP and is generously sponsored by MedThink SciCom. I'm Rob Matheis, President and CEO of ISMPP.
Rob Matheis (00:13):
Today, I'm talking with Richard White, Chief Operating Officer at Oxford PharmaGenesis about RWE or real-world evidence; what it is, and how it's changing. Welcome, Richard.
Richard White (00:25):
Thanks, Rob.
Rob Matheis (00:26):
It's great to have you on with us. I know we've probably had this conversation a number of times at our various conferences, so it's great to finally get it committed to a podcast. Our listeners vary in terms of their knowledge in this area. I thought it might be good if we just start the conversation by getting a sense of some definitions.
Rob Matheis (00:39):
I hear all the time people use RWE and HEOR almost interchangeably. Are they the same thing?
Richard White (00:46):
It's a good question, and they're not. HEOR, just to go over the basic definitions, is health economics and outcomes research, and HEOR groups and the studies they run are designed to provide evidence for a typically payer audience - health technology assessment and reimbursement bodies are good examples.
Richard White (01:05):
And the idea is to show the value of a new healthcare intervention. So, they're answering two questions: one, what are the benefits, what's the impact on health outcomes; and two, what are the costs, particularly the economic consequences.
Richard White (01:17):
Now, to do that, you often find you need evidence about the disease area and the unmet need. So, you need to be looking at things like patient population, disease burden, healthcare resource utilization. And for that, you need real-world evidence because you don't get that sort of information typically from a randomized controlled trial.
Richard White (01:37):
So, RWE, real-world evidence - these are also called observational studies or non-interventional studies. Basically, they use routinely collected data, so they're not from somewhere where the researcher controls the management of the patient. They just observe outcomes in routine practice.
Richard White (01:58):
So, that's things like registries, claims data, electronic health records. And the reason I think that the two terms are often used interchangeably is because back in the day when you and I started looking at things like this, probably, most real-world evidence was done by HEOR groups to fill in gaps in the evidence base for their payer audiences.
Richard White (02:21):
Nowadays, real-world evidence is much more central to medical evidence and even becoming central to regulatory decision-making. So real-world evidence is a much bigger thing than just being part of HEOR now.
Rob Matheis (02:35):
Interesting. So, you mentioned RCT (randomized control trials), how do they differ? How do these types of studies that we're talking about here, HEOR, RWE differ from RCT?
Richard White (02:45):
I think that, apart from the methodology, if you're talking about it from the perspective of a medical publications professional, the big difference is that RWE studies - and HEOR studies more broadly, but we'll talk about RWE - are much less tightly regulated than randomized controlled trials.
Richard White (03:03):
And that means the information you receive as a publications professional is much less consistent. So, real-world studies are not covered by good clinical practice (GCP), and there's something called the International Council for Harmonisation (ICH) that states what needs to be done for randomized controlled trials. So, they have to have a protocol written to a particular template, a statistical analysis plan, and a particular study report.
Richard White (03:30):
None of that exists for real-world evidence. They also don't have the equivalent of what everyone's familiar with for randomized controlled trials, which is FDAAA (the FDA Amendments Act) and the FDA Final Rule. So, that means for real-world studies, there's no commitment to register them and there's no commitment to publish the results.
Richard White (03:51):
So, I suppose, coming back to your overall question, the key differences compared with randomized controlled trials, real-world evidence has almost been a bit like the wild west if you're looking at it from a publications perspective, there's far less rigor around what you have to do with those studies.
Rob Matheis (04:07):
Got it, got it. So, would you say that RWE or HEOR studies, are they inferior to RCTs or are they just a different breed?
Richard White (04:16):
I would say different. They provide a very different evidence base and that's particularly useful in a lot of areas. Typically, randomized controlled trials have rather narrow inclusion/exclusion criteria. So, you end up investigating a subpopulation of your overall indication.
Richard White (04:36):
It may lack very old patients, very young patients, patients with comorbidities. And actually, the information on how those patient groups respond to a particular intervention is really important, and that's what real-world evidence gives you.
Richard White (04:51):
It also gives you often - and this is why it's used a lot for payer and reimbursement decision-making - local evidence. You'll often find database studies or registries are done in a particular country or a particular region to give, for example, a payer group evidence of how a treatment works under the sort of treatment algorithms that they're used to in their own environment. So, they're very different and they're very complementary.
Rob Matheis (05:19):
Got it, got it. Now, you and I have both gone to the podium many times over the last many years and gotten on a soapbox about the importance of these types of studies, these types of analyses. Do you feel this is still relevant even after all these years?
Richard White (05:33):
Yeah, it really is. It's a good question. I look back when I knew I was going to do this podcast and the first presentation I gave on real-world evidence at an ISMPP annual meeting was in 2015. So, you can certainly say it's a slow burner. It's definitely still relevant now, and I think that's because real-world evidence is being used so much more. And yet, really, it's still not trusted by a lot of decision-makers externally.
Richard White (06:03):
Also, probably, there are still a lot of people who've grown up in clinical development within pharma companies who are accustomed to randomized controlled trials and the sort of statistical rigor you have there. And externally, there's definitely that issue of trust whenever it's a study done by industry.
Richard White (06:20):
Now, when I did that presentation back in whenever it was, one of the things I said is a key way to build trust is through transparency. And this is why I think it's really coming round to the medical publications professional because we know that publications are a huge part of transparency.
Richard White (06:35):
And I think what we are seeing now is that some of the things we've been talking about needing for real-world evidence for years that we've got with randomized controlled trials, things like registering studies, posting your protocols, committing to publish, decent reporting guidelines — they are actually now really coming to real-world evidence studies.
Richard White (06:58):
So, it's a time, I would say, for the medical publications professional to start helping with that.
Rob Matheis (07:04):
And that's a great segue into something I wanted to chat a little bit about, is to really drive into a bit what do our listeners, our medical publication, medical communication professionals need to be doing now so they can be prepared for today and for the future?
Richard White (07:19):
I would say to get acquainted with some of the tools and guidance that's coming out of a few different areas. But the main one, I'd say, is the Real-World Evidence Transparency Initiative. This is a collaboration of a few groups. The major academic groups involved are ISPOR and also ISPE, but the FDA is involved, and Duke-Margolis too.
Richard White (07:44):
And there are probably three things I really focus on that are relevant to a medical publication professional that's coming out of those groups. One is the reporting guidelines. They're putting out more reporting guidelines around RWE and database studies.
Richard White (08:02):
Now, anyone who's worked on an RWE publication knows that STROBE exists. There is a guideline there, but what's coming out of ISPOR and ISPE goes into a lot more detail on different types of study. And it's important to engage with that, to understand how to write the thing up.
Richard White (08:18):
But before that, there are some other things coming too. So, there is now a dedicated RWE study registry - a site to register RWE studies. We've been accustomed to clinicaltrials.gov for RCTs for years and years. But now, there's something there for real-world studies.
Richard White (08:37):
Uptake has been slow, I would say, but it is in the view of the FDA and the EMA, and they've suggested that in future they will actually start mandating that RWE studies are registered if they're going to be used in regulatory submissions. And probably, it won't be too long after that before the ICMJE says they have to be registered before they can be published in their journals.
Richard White (09:01):
So, I think these things are coming. So, the register is another one. And the third thing, which was published a few months ago, is something called the HARPER template. This is a template for real-world study protocols, and it's an important development because it's a harmonized template. There's been a mishmash of different real-world evidence protocol templates for years and years.
Richard White (09:21):
But HARPER's a real advance, particularly if you are on the receiving end of these sorts of documents, as medical publications professionals are. You've got something now that's going to be more consistent. It's going to be a more detailed summary of the study protocol. So, it'll really help actually write some of these things up.
Rob Matheis (09:40):
The HARPER template - is there a particular way that a publication professional should be using that? Can we dig into that a little bit more?
Richard White (09:47):
It's something where I think, if you are in a publications role on the consultancy side, I would be asking your clients: are you going to use HARPER, or are you using it already? Because it has only just come in. And having them build that into their SOPs is going to be really important so that you can use it downstream.
Richard White (10:07):
And if you're in a pharma role, actually with control over that, I would say ask your RWE teams, ask your HEOR teams if they're using it. Because if they're not, then they'll need to change processes in the near future. There'll be a bedding-in period because different groups have used different templates up until now.
Richard White (10:26):
But the really nice thing about HARPER - and it's in the paper that the Transparency Initiative published on this - is that it shows how the different elements of HARPER tie into the existing templates.
Richard White (10:41):
So, you can see, if you've used this template in your study, here's how it populates HARPER, and here are the additional bits you'll need to go and find out. So, it's a guide also to what gaps there might be if some other template's been used.
Rob Matheis (10:55):
That's a really good response because I think a number of our listeners probably aren't aware of the HARPER template or haven't really started using it. And I think it's helpful to think about different ways that they can start to implement it on both sides of the industry. So, that's terrific.
Rob Matheis (11:08):
Let me ask you, do you think in the future that some of these types of studies would be able to replace RCTs if they're done better or have better transparency, or there'll always be a place for both?
Richard White (11:19):
I think there will always be a place for them. It is a really interesting question because one of the initiatives that's part of the RWE transparency project is something called RCT DUPLICATE. What they're doing there is basically using real-world evidence to see if it can replicate randomized controlled trial results, and identifying some of the conditions where that might be possible.
Richard White (11:42):
The idea being that, not everywhere, but possibly in some indications and some circumstances in the future, real-world evidence could be used for registration. I think what's more likely is that rather than one replacing the other, you're going to see more studies that retain the core principle of randomization from the RCT but integrate some elements of real-world studies.
Richard White (12:04):
So, things like screening eligibility, or the use of real-world data in synthetic control arms, and so on. So, there'll probably be more hybrid studies. Again, though, from a medical publications perspective, it's really important to understand a bit more about that real-world methodology for when reports come through that show this hybrid methodology.
Rob Matheis (12:29):
It does sound like there's going to be a lot more opportunity for sophistication in our publication planning going forward. Better integration of these types of studies and interesting ways to put them together to tell the full medical story of a particular product, service or technology.
Rob Matheis (12:42):
Are there any secular trends happening right now? Anything that is happening that you see on the forefront or coming up soon in this space?
Richard White (12:53):
Probably the global utilization of real-world data is something that will change. If you're in the United States, you'll probably be aware that real-world studies have been around for a long time because there are claims databases. Their use has been very well established.
Richard White (13:13):
It's not so much the case globally, but that is changing, particularly in Europe. Europe, as you know, is lots of individual countries all doing things differently, and trying to bring them together to get some idea of a regional picture is a particular challenge. But there are some advances coming - something called DARWIN EU, which is a collaborative of data sources. And the idea being, you can run studies across a representative European data network.
Richard White (13:45):
The idea very much behind them is they're going to enable regulatory and reimbursement decision-making based on a European view, which is something that really hasn't been possible before. It's been possible to do it within individual countries, but not from a European perspective. So, that's, I think, quite a big advance and it's important to understand that there will be studies coming through potentially based on these sources.
Richard White (14:09):
Initially, they'll be used more on the decision maker side, but there is certainly scope in the future that they could be used potentially by pharmaceutical companies as well.
Rob Matheis (14:21):
Excellent, excellent. It's been a great dialogue so far. I feel compelled, Richard, to ask you one more question: RWE and AI, it's on everybody's mind. Is there a future there?
Richard White (14:31):
Definitely. In fact, there's a present. AI is already used in real-world evidence, and it's incredibly useful. Without getting into huge amounts of detail, just one example for the audience here: there's something called abstraction.
Richard White (14:48):
So, let's say you have a hundred thousand health records of patients with cancer, which is not uncommon if you're looking at big database studies. You want to identify which of those patients had metastatic cancer, but those records are unstructured data, so they're things like physician notes.
Richard White (15:05):
You can imagine that manually, it's incredibly labor-intensive for an abstracter to have to go through every single record individually. But what you can do and what is being done is you can train an AI algorithm on how to identify particular elements in records that are associated with metastatic cancer, and then you can automate a lot of the process.
Richard White (15:29):
So, it's already happening to an extent. I would say, if we're thinking more at the medical publications end, I wonder if real-world evidence - and actually HEOR more generally - may be a bit more resistant to artificial intelligence and automation than the randomized controlled trial.
Richard White (15:47):
Because I can imagine that, with all privacy obstacles overcome, you could put your CSR into something like GPT-4 and say, "Please, in the style of a medical writing professional, write me a 5,000-word article that follows the following template," suitable for whatever journal you want. And it might come out with something pretty well-developed.
Richard White (16:11):
If you work on real-world studies, you know, because of the issues we mentioned earlier, that you don't necessarily have a robust protocol, statistical analysis plan, or defined study report. A lot of the upfront work on these studies starts from having received a bunch of tables and maybe a couple of slide decks.
Richard White (16:30):
You work out with the author: what are we going to put into this paper? What do these data show? There's a lot of interpretation. Now, I'm not saying AI can't do that, but I can see that it might be a longer journey to that than just writing from a study report.
Rob Matheis (16:46):
I'm sure that's the case. Well, we could probably record an entire other podcast on RWE and AI, and maybe we will in the future. But for now, I want to thank you so much for the time today and for chatting with me. It's been a really interesting discussion and one that I'm confident will be very useful for our listeners. Thanks so much.
Richard White (17:04):
Thanks, Rob.
Rob Matheis (17:04):
Thanks for listening to InformED for Medical Communication Professionals. Please take a minute to subscribe to the show on your favorite podcast app, inform your colleagues, and rate our show highly if you like what you heard today.
Rob Matheis (17:15):
We hope you'll also join us at an upcoming ISMPP University webinar or even consider becoming a member of our association. Just go to ismpp.org to learn more. I'm Rob Matheis.