Driving clinical trial transformation with Data at the Speed of Light

By Sowmyanarayan Srinivasan, Ansh Srivastava, Prasun Basu, Vickye Jain, and Geoff Schmidt

Oct. 3, 2025 | Article | 7-minute read

Clinical trials have long been promised a technology-powered future of faster timelines, reduced patient burden and smarter decisions. But despite steady innovation, one bottleneck remains: access to high-quality, usable data.

 

The problem: With nearly 70% of trial data now generated outside traditional electronic data capture (EDC) systems, sponsors face a flood of data from genomics, imaging, ePRO, wearables and real-world evidence scattered across disconnected systems and incompatible formats. Without end-to-end, standards-first data flow, where metadata is defined consistently from the start and managed through systems like a metadata repository (MDR), even the most advanced tools won’t deliver the value leaders are targeting.

 

To enable Data at the Speed of Light, R&D organizations must instill a data-first mindset built around capturing, harmonizing and reusing data in real time. Done right, this shift would transform data from the exhaust of fragmented trial operations into the engine that drives decisions and enables submission-ready outputs that are correct on the first try.

Why digital data flow is finally within reach, and how sponsors can capitalize on the opportunity



For years, the vision of seamless, standards‑driven data flow in clinical trials has felt tantalizingly close but just out of reach. It’s been blocked by fragmented systems, manual handoffs and rigid processes designed for a bygone era. Today, that vision is no longer out of reach. A combination of forces has created the conditions to connect trial data from the source through submission.

 

What’s changed? Four major shifts:

 

1. Data is richer and more diverse. Beyond EDC, trials now capture eCOA, ePRO, imaging, genomics, wearables and real-world evidence. This has created a surge of multimodal data that can deliver deeper insights if connected and reused. With this abundance of data, companies must shift to a digital protocol and create data that is natively digital, structured and standards‑ready for reuse.

 

In response, sponsors must define upfront how each data stream will be captured, harmonized and analyzed, and then use agent‑enabled automation to ingest, structure and tag metadata at scale. This will allow the same data set to flow from trial design through monitoring to submission without redundant reentry or manual rework.
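As an illustration, here is a minimal sketch of that ingest-structure-tag step. The source names, tag vocabulary and TaggedRecord structure are hypothetical stand-ins, not any specific vendor's tooling:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

# Hypothetical controlled vocabulary: each data stream is tagged with a
# target SDTM domain and modality the moment it arrives.
STREAM_TAGS = {
    "epro": {"domain": "QS", "modality": "patient-reported"},
    "wearable": {"domain": "VS", "modality": "sensor"},
    "imaging": {"domain": "PR", "modality": "imaging"},
}

@dataclass
class TaggedRecord:
    subject_id: str
    source: str            # e.g., "epro", "wearable"
    payload: dict          # raw values from the capture system
    metadata: dict = field(default_factory=dict)

def ingest(record: dict, source: str) -> TaggedRecord:
    """Structure one raw record and attach standard metadata at ingestion time."""
    tags = STREAM_TAGS.get(source, {"domain": "UNKNOWN", "modality": "unknown"})
    return TaggedRecord(
        subject_id=record["subject_id"],
        source=source,
        payload=record,
        metadata={**tags, "ingested_at": datetime.now(timezone.utc).isoformat()},
    )

# Example: a wearable heart-rate reading tagged for the Vital Signs (VS) domain.
rec = ingest({"subject_id": "001-0042", "hr_bpm": 72}, source="wearable")
print(rec.metadata["domain"])  # VS
```

The point of the pattern is that downstream systems never have to guess where a record came from or where it belongs; the tags travel with the data from the moment of capture.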

 

2. Technology has matured. Agentic platforms now use AI reasoning, automation and orchestration to capture, structure and clean multimodal data in real time at the point of care. Tools like Microsoft’s Dragon Copilot can autogenerate structured clinical data entries, cutting manual effort and accelerating quality checks.

 

With this technological maturity, sponsors must make tech integral, not ornamental. When trial workflows are designed with end‑to‑end data flow in mind, technology can amplify efficiency at every stage. A digitally structured protocol built on the Unified Study Definitions Model (USDM) standard, for example, can trigger AI‑powered agents to autoconfigure data capture, map outputs to SDTM/ADaM and prepopulate submissions. With the right process foundation, capabilities such as real‑time validation and automated content generation can eliminate manual handoffs to create a continuous, intelligent workflow from design to submission.
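For instance, here is a sketch of autoconfiguration from a structured protocol. The JSON below is a deliberately simplified stand-in for a USDM-style protocol, not the real USDM schema, and the field names are illustrative:

```python
import json

# A simplified, hypothetical stand-in for a USDM-style structured protocol;
# the real Unified Study Definitions Model is far richer.
protocol_json = """
{
  "studyId": "ABC-123",
  "activities": [
    {"name": "Systolic Blood Pressure", "sdtmDomain": "VS", "timing": "Weekly"},
    {"name": "Hemoglobin A1c",          "sdtmDomain": "LB", "timing": "Baseline"}
  ]
}
"""

def build_capture_config(protocol: dict) -> list[dict]:
    """Derive data-capture form definitions directly from the structured
    protocol, so capture systems and SDTM mapping share one source of truth."""
    return [
        {
            "form": activity["name"],
            "schedule": activity["timing"],
            "sdtm_target": activity["sdtmDomain"],
        }
        for activity in protocol["activities"]
    ]

for form in build_capture_config(json.loads(protocol_json)):
    print(form)
# {'form': 'Systolic Blood Pressure', 'schedule': 'Weekly', 'sdtm_target': 'VS'}
# {'form': 'Hemoglobin A1c', 'schedule': 'Baseline', 'sdtm_target': 'LB'}
```

Because the capture configuration is derived rather than hand-built, a protocol amendment only needs to change the structured source; everything downstream can be regenerated.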

 

3. Regulatory pathways are evolving. Regulators are rapidly adopting faster, more flexible models for submission and review. The FDA’s Real-Time Oncology Review, for example, allows regulators to evaluate data incrementally as it becomes available, rather than waiting for a complete application. Meanwhile, authorities now recognize platforms such as Accumulus Synergy as an emerging way for sponsors to efficiently generate and submit AI-assisted dossiers.

 

When regulators open a door, sponsors should walk through it. Openness to modular evidence models and rolling submissions enables trial data to move continuously from source to regulator, reducing idle time and rework. Agent-led tools can convert live trial data into traceable, modular dossiers that maintain integrity while accelerating decision-making.

 

4. Ecosystem players are aligning around shared standards. Initiatives such as TransCelerate’s USDM, HL7 FHIR and the Decentralized Trials & Research Alliance (DTRA) are making interoperability and “bring your own tech” integration feasible across sites, partners and platforms.

 

To take advantage of this momentum, sponsors must build workflows that assume and embrace a standards-based ecosystem. That means designing trials to integrate seamlessly with site-preferred tools and third-party systems, not imposing rigid, sponsor-driven stacks. Flexible architectures, modular data models and shared data governance can turn this alignment into operational efficiency, unlocking faster collaboration, cleaner data and more scalable trial execution.
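As a small illustration of what standards-based integration looks like in practice, the sketch below pulls heart-rate Observations over HL7 FHIR R4. It points at the public HAPI FHIR test server; a real deployment would use a site's own endpoint, authentication and error handling:

```python
import requests

# Public HAPI FHIR R4 test server; substitute a site's own endpoint in practice.
FHIR_BASE = "https://hapi.fhir.org/baseR4"

def fetch_heart_rate(patient_id: str) -> list[dict]:
    """Search Observation resources by patient and LOINC code (8867-4 = heart rate)."""
    resp = requests.get(
        f"{FHIR_BASE}/Observation",
        params={"patient": patient_id, "code": "http://loinc.org|8867-4"},
        headers={"Accept": "application/fhir+json"},
        timeout=30,
    )
    resp.raise_for_status()
    bundle = resp.json()  # FHIR searches return a Bundle resource
    return [entry["resource"] for entry in bundle.get("entry", [])]

for obs in fetch_heart_rate("example"):
    value = obs.get("valueQuantity", {})
    print(obs.get("effectiveDateTime"), value.get("value"), value.get("unit"))
```

Because FHIR fixes the resource shapes and search semantics, the same client code works against any conformant site system, which is precisely what makes “bring your own tech” feasible.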

 

Today, for the first time, sponsors have both the tools and the mandate to create an intelligent, standards-first data backbone, one that connects the end-to-end clinical development continuum and enables data to flow seamlessly across every phase. Together, these shifts unlock the conditions for scaled automation, where agentic AI handles the complexity and clinical teams stay focused on what matters: insight, speed and outcomes.

How sponsors are already using generative AI to fix data flow gaps



These shifts aren’t hypothetical. Sponsors are already taking concrete steps to modernize how they design protocols, manage data and prepare submissions. With intelligent tools and modular data infrastructure, three areas offer the clearest near-term opportunities to close the gap between today’s fragmented workflows and tomorrow’s seamless, insight-driven trials.

Figure 1: How sponsors are modernizing clinical data flow



Bayer — Digitally enabled protocol design
What they did: Shifted to structured, machine-readable protocols using TransCelerate’s USDM standard.
What they gained: Streamlined digital authoring and seamless updates across systems and stakeholders, reducing rework.

Roche — Autonomous clinical data management and analysis
What they did: Built an open-source tool that automates CDASH-to-SDTM mapping, using 22 reusable algorithms and covering 80% of domains.
What they gained: Streamlined clinical data transformation and improved consistency across trials.

Novo Nordisk — Automated content generation
What they did: Deployed generative AI, using Anthropic’s Claude models on Amazon Bedrock, to accelerate clinical study report creation.
What they gained: Reduced report generation time by over 90%, enabling faster turnaround and increased operational efficiency.
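The Roche example centers on CDASH-to-SDTM mapping. As a minimal, hypothetical sketch of that pattern (not Roche's actual open-source tool), a single reusable rule set can pivot horizontally collected vitals into the vertical SDTM VS domain structure:

```python
import pandas as pd

# Hypothetical raw capture: one row per visit, vitals collected horizontally.
raw = pd.DataFrame({
    "SUBJID": ["0042", "0042"],
    "VSDAT":  ["2025-01-10", "2025-02-10"],
    "SYSBP":  [128, 131],
    "DIABP":  [82, 80],
})

# One reusable rule per collected field: target SDTM test code and unit.
RULES = {"SYSBP": ("SYSBP", "mmHg"), "DIABP": ("DIABP", "mmHg")}

def to_sdtm_vs(df: pd.DataFrame, study: str) -> pd.DataFrame:
    """Pivot horizontal capture into the vertical SDTM VS domain: one row
    per subject, test and time point, with standard variable names."""
    rows = []
    for _, r in df.iterrows():
        for col, (testcd, unit) in RULES.items():
            rows.append({
                "STUDYID": study,
                "USUBJID": f"{study}-{r['SUBJID']}",
                "VSTESTCD": testcd,
                "VSORRES": r[col],
                "VSORRESU": unit,
                "VSDTC": r["VSDAT"],
            })
    return pd.DataFrame(rows)

print(to_sdtm_vs(raw, study="ABC-123"))
```

Encoding each mapping as a rule rather than a one-off script is what makes the transformation reusable across trials, which is the gain the Roche row describes.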

Blueprint for a standards-first data flow pipeline to enable Data at the Speed of Light



To create data that is real-time, interoperable and analysis-ready, sponsors will need to build a standards-first data pipeline, where protocols are structured once and then flow seamlessly from design through execution and submission, all enabled by a centralized MDR.
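To make the “structured once, reused everywhere” idea concrete, here is a minimal sketch in which a single toy MDR entry (the structure and vocabulary are illustrative, not a real MDR schema) drives both capture-form configuration and submission conformance checks:

```python
# A toy metadata repository: each variable is defined once, with its label,
# SDTM domain and allowed values, and every downstream system reads from it.
MDR = {
    "VSORRESU": {
        "label": "Original Units",
        "domain": "VS",
        "codelist": ["mmHg", "beats/min", "C"],
    },
}

def configure_field(var: str) -> dict:
    """Capture systems read field labels and allowed values from the MDR."""
    meta = MDR[var]
    return {"field": var, "label": meta["label"], "choices": meta["codelist"]}

def conformance_check(var: str, value: str) -> bool:
    """Submission tooling validates against the same MDR definition,
    so capture and submission can never drift apart."""
    return value in MDR[var]["codelist"]

print(configure_field("VSORRESU"))
print(conformance_check("VSORRESU", "mmHg"))  # True
print(conformance_check("VSORRESU", "psi"))   # False
```

Because both functions read the same definition, a change to the codelist propagates to capture and validation in one step; that is the practical payoff of a centralized MDR.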

 

Figure 2 illustrates one potential setup for a standards-first, end-to-end clinical data flow.

Figure 2: End-to-end clinical data automation



What it will take to achieve Data at the Speed of Light



Reaching Data at the Speed of Light will require sponsors to upgrade foundational capabilities even as they invest in the next-generation infrastructure required for scale. We suggest a series of targeted assessments to identify bottlenecks, followed by focused sprints to modernize existing systems and build future-ready ones.

 

Here’s how to start:

 

1. Diagnose current maturity across people, processes and technology

 

Assess current capabilities in areas such as clinical data design, transformation and submission. Where are manual handoffs and friction points slowing things down? Evaluate integration across data capture, data standards, data repositories and data transformation to identify gaps in the data pipeline.

 

2. Create an agentic blueprint with the finished product in mind

 

Design a phased roadmap that introduces intelligent agents at critical junctures in the trial life cycle, beginning with digital protocol authoring and automated metadata propagation. Once in place, expand agentic capabilities to include AI-supported data review, reconciliation and transformation, as well as clinical study report generation.

 

3. Reimagine talent for a human + AI world

 

As agents take on routine tasks, human roles will need to evolve. Protocol designers, clinical research associates, data managers and regulatory leads must step into roles supervising automation, validating AI outputs and orchestrating cross-functional workflows.

 

Getting to Data at the Speed of Light isn’t about doing more. It’s about redesigning how clinical trials work, from start to finish and from the inside out. To explore how your organization can start building real-time, standards-first data flow, reach out to a ZS clinical data transformation expert.


