Mark Maietta, President

YPrime’s Expert Community is a forum for education, ideation, and collaboration. Our points of view, and those of our clients and partners, on best practices, lessons learned, and other advice are expressed through many formats. These include webinars, white papers, ebooks, and most recently, live events. Last week, YPrime hosted its first Expert Community event at the Science History Institute for an engaged audience of clinical operations, data systems, clinical outcomes, clinical supplies, data management, patient technologies, and eCOA/IRT implementation professionals.

eCOA and IRT experts shared their short-term pragmatic tactics and long-term vision for driving adoption, improving user experiences, enhancing data quality, and complying with regulatory requirements. The first of two blog posts summarizing this content-rich event focuses on the general morning and eCOA sessions. Part 2 will focus on general afternoon and IRT sessions.

Dawn Sorenson of Theravance Biopharma and Aubrey Llanes of YPrime led a comprehensive session on user acceptance testing (UAT) best practices. They asked the audience to consider the goals of UAT, offered advice on what makes for the best experience, and clarified sponsor and provider responsibilities.

Comments from the audience focused on the distinction between a finding and a change, good documentation practices for inspection preparation, and the need for user-friendly specification documents. Debate over UAT script-writing responsibility (sponsor vs. vendor) was especially lively.

Willie Muehlhausen and YPrime’s Donna Mongiello offered perspectives that combined the latest research in bring-your-own-device (BYOD) equivalence and acceptance with practical experience.

Willie’s key findings included:

  • Recent research suggests that logistics for BYOD models are harder than often assumed, and cost savings may be negligible except in large studies.
  • Since it’s impossible to conduct equivalence testing with every device and every device class, his research focused on establishing the equivalence of widgets rather than of assessment questions on devices. “If a patient population understands the concept of a numeric rating or visual analog scale, that should be enough,” Willie argued. His research concluded the widgets were equivalent, with no discernible differences in the way patients understood how to use visual analog scales, numeric rating scales, or verbal response scales, whether presented on paper, iOS, or Android formats.
  • Thus, the use of standard widgets, implemented in accordance with C-Path/ISPOR guidelines, meets regulatory expectations.
  • Expert screen review should take the place of cognitive debriefing.
  • The presentation concluded with a call for more industry standardization in eCOA implementation, with widget and instrument libraries as a starting point.

Donna’s key findings included:

  • Hybrid BYOD models, involving a mix of patient-owned and provisioned devices, are currently the only feasible approach that avoids disqualifying an otherwise clinically relevant participant.
  • Patient preferences for device use may have shifted. While many patients still prefer the convenience and ease of a personal device, others choose provisioned devices because of concerns about privacy and/or the data usage fees incurred as a condition of trial participation.
  • What’s more, site personnel may prefer to hand out provisioned devices to participants, especially if there are concerns about supporting patients on their own devices.
  • Site training, in the form of comprehensive upfront training followed by refresher training, is essential when BYOD models are used.

Ari Gnanasakthy of RTI Health Solutions presented recent research involving the use of PROs in oncology. Citing his own work (Gnanasakthy et al., 2016), he noted that oncology endpoints prior to 2016 focused mainly on treatment activity or economic evaluation, to the exclusion of patients’ physical functioning, quality of life, or treatment differences. The surge of novel oncology therapies over the last five years demands a fresh look at how PROs are used, he argued. Ari offered recommendations for integrating PROs into oncology studies and shared four common mistakes:

  • Treatment cycle lengths are now different, and so are safety issues. Traditional assessment schedules may not be appropriate for capturing the impact of treatment. As Ari put it, “Asking patients about what happened in cycle 1 at the beginning of cycle 2 is problematic, when cycle 1 treatment may have taken place three weeks ago.” Real-time administration and quick clinical action lead to better patient outcomes.
  • Conflicting value messages. One cited example stated that “the difference in the grade 3 or grade 4 adverse events between the study arms did not translate into clinically meaningful differences in the reported HRQoL,” even though the differences spanned deaths, dropouts, dose adjustments, and treatment breaks. The discrepancy could be attributed to sub-optimal measurement tools, sub-optimal assessment schedules, or sub-optimal analysis.
  • Measurement redundancy. Assessment of the same concept using multiple instruments at the same time impacts patient burden, study burden, data quality and analysis.
  • Unrealistic expectations about treatment or disease impact are common, and many assessments are not suitable for labelling.

Ari also emphasized the importance of including patient experience data in labelling, which may include route of administration preferences.

Colin Cleary followed with a discussion of the more practical issues related to incorporating the patient voice into oncology trials. He cited recommendations and case examples from Basch et al. (2012), “Recommendations for Incorporating Patient-Reported Outcomes Into Clinical Comparative Effectiveness Research in Adult Oncology,” Journal of Clinical Oncology 30(34):4249-4255, which illustrated the importance of investigative site training and patient support throughout the trial. Key points included:

  • Collect PRO data electronically whenever possible. It’s preferred by regulators and patients and is superior to paper.
  • Limit endpoints to only the most critical, collect data at only the necessary timepoints, and avoid using multiple instruments to capture the same data. Basch et al. recommend completion within 20 minutes at baseline and within 10 to 15 minutes at later timepoints.
  • Creative solutions for data collection are needed, especially for patients with rare and severely compromising diseases like cancer.
  • Back-up modes of data collection (that don’t involve paper) are necessary. These solutions are critical where missed site visits are anticipated due to patients being too ill to travel.
  • Account for missing data whenever possible. Documenting it supports submissions and enables more robust sensitivity analyses.

The conclusion? While oncology studies present unique challenges, there are many opportunities to include the patient’s voice and improve data collection.

(Gnanasakthy A, DeMuro C, Clark M, et al. Patient-reported outcomes labeling for products approved by the Office of Hematology and Oncology Products of the US Food and Drug Administration (2010-2014). J Clin Oncol. 2016;34:1928-34.)

A panel discussion dissected the eCOA product lifecycle into three distinct stages and offered best practices for each stage. Discussion centered on these questions:

  • Design
    • What drives complexity and how can unnecessary complications be reduced?
    • What factors in when working with more complex implementations? What can make it easier or harder to deliver?
    • Who needs to be at the table at the design stage? Which key stakeholders are most often absent and how do you get engagement by others earlier in the process?
    • What are essential best practices to consider?
  • Launch
    • What are the differences in how large vs. small sponsors manage eCOA studies? What are the differing expectations?
    • What are the key activities and considerations of eCOA launch?
    • How do you plan for site readiness?
    • What are the key shipping and logistics considerations?
    • What’s the best way to instill best practices and reinforce user knowledge?
  • Study maintenance
    • Given how often development timelines are compressed, what functionality is often still in development after launch?
    • What can go awry once the study is live, and how do you plan for these early on?
    • What recommendations do you have to help with data integrity planning and audit readiness?

Panelists Aubrey Llanes, Ed Potero of Biogen, and Michelle Hylan of Innov8tive Research Consulting covered a wide range of topics with anecdotes, examples, and perspectives through the differing lenses of providers, sponsors, and consultants.

Stay tuned for more updates from YPrime’s Expert Community experiences. In the meantime, we invite you to participate in the conversation. Talk to us about your eClinical challenges and experiences.