Digital Biomarkers and the New Language of Cancer Care with Dr. Sean Khozin, CEO Roundtable on Cancer
Dr. Sean Khozin — oncologist, former FDA leader, current head of the CEO Roundtable on Cancer and Project Data Sphere — talks with DeepScribe CEO Matthew Ko about how ambient AI in oncology can surface signals using voice as a digital biomarker.
Every oncologist knows the moment when a room gets quiet in a different way. The patient’s words make sense, but something in the cadence or energy prompts the physician to look closer. That voice signal lives almost entirely within the conversation, rarely if ever making it to the EHR.
In this episode of Beyond the Chart, Dr. Sean Khozin — oncologist, computer scientist, former FDA leader, and now CEO of the CEO Roundtable on Cancer and cancer data-sharing collaborative Project Data Sphere — makes the case for treating those “between the lines” signals as measurable data. With ambient AI capturing the visit from start to finish, Dr. Khozin and host Matthew Ko, DeepScribe founder and CEO, explore how voice features like affect, cadence, and dynamic range can sharpen clinical insight. The opportunity at hand is to transform these observations into validated, reliable measures that strengthen both care and research.
How Ambient AI Captures the Clinical Voice Signal
Clinicians already hear things that charts can’t capture. Patients show a shift not in what they’re saying but in how they’re saying it, changes that hint at shifts in mood, pain, or neurologic status. Dr. Khozin has seen it firsthand.
“I’ve picked up brain metastases on the phone just by talking to patients… they’ve sounded different and their affect has changed.”
Voice carries a clinical signal beyond words. Capturing it via ambient AI makes it observable and reproducible across visits and across clinicians.
“Beyond the semantics of what they’re saying,” adds Dr. Khozin, “there’s a lot of valuable information that an experienced physician, and now… a properly trained voice recognition model, can pick up.”
From Waveforms to Biomarkers
Dr. Khozin draws on his background in acoustics to expand on the value of the patient’s voice.
“If you look at the audio waveforms for a patient that has a flat affect… the waveforms would be almost compressed and AI can pick that up very easily.”
Armed with that information, clinicians can intervene sooner to address everything from changes in mood to increases in pain. Dr. Khozin, however, returns to the most important complement to any clinical data – the clinician’s know-how and experience. “Those signals are quite important…” he says, “but at the end of the day, it’s all based on your intuition.”
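To ground the idea, here is a minimal sketch, assuming an open-source audio stack (NumPy and librosa) rather than any production pipeline discussed in the episode, of how a “compressed” waveform might be quantified. The function name and the two features it computes, loudness dynamic range and pitch variability, are illustrative assumptions, not DeepScribe’s or the FDA’s method.

```python
# Illustrative sketch only: two simple proxies for "flat affect" in a voice
# recording -- a narrow loudness dynamic range and low pitch variability.
# Assumes numpy and librosa are installed; this is not a validated biomarker.
import numpy as np
import librosa

def affect_proxies(audio_path: str, sr: int = 16000) -> dict:
    """Return rough dynamic-range and pitch-variability features for a recording."""
    y, sr = librosa.load(audio_path, sr=sr, mono=True)

    # Frame-level loudness (RMS energy), converted to decibels.
    rms = librosa.feature.rms(y=y)[0]
    rms_db = librosa.amplitude_to_db(rms, ref=np.max)

    # Dynamic range: spread between quiet and loud frames.
    # A "compressed" waveform shows a narrower spread.
    loudness_range_db = float(np.percentile(rms_db, 95) - np.percentile(rms_db, 5))

    # Pitch (F0) contour; monotone speech shows less variability.
    f0, voiced_flag, _ = librosa.pyin(
        y, fmin=librosa.note_to_hz("C2"), fmax=librosa.note_to_hz("C6"), sr=sr
    )
    pitch_std_hz = float(np.nanstd(f0[voiced_flag])) if np.any(voiced_flag) else 0.0

    return {"loudness_range_db": loudness_range_db, "pitch_std_hz": pitch_std_hz}
```

In a real system, features like these would feed trained models and would still require the analytical and clinical validation Dr. Khozin describes later in the conversation before being treated as biomarkers.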

Why Ambient AI for Oncology Restores Provider Presence
The pressures of documentation have gradually eroded the potential for a strong therapeutic bond during the patient visit, turning encounters transactional. Ambient AI systems let clinicians look up from the screen and focus on their patients, restoring both attention and curiosity to the visit without losing the structure required to document care.
Dr. Khozin identifies today’s charting templates as necessary but reductionist; ambient AI capture expands the “context window” of the visit.
From Exploratory Signals to Regulated Biomarkers
For voice-derived signals to shape oncology, they must move from exploration to validation. Dr. Khozin notes the process will feel familiar to anyone who’s shepherded a biomarker through regulatory review.
“We need systems that are technically valid and clinically meaningful. The FDA already has pathways to approve digital biomarkers, including through software-as-a-medical-device frameworks.”
Pain is a telling example of the need for signal validation. The classic 0–10 faces scale is easy to use, but woefully limited. Ambient AI offers a chance to measure pain continuously through voice and facial patterns that align more closely with real experience.
“We can do a lot better. Pain is reflected in the voice and in facial features. We’ve already shown that the signals are quite strong.”
That combination of data integrity, clinical meaning, and human-centered validation will determine which new biomarkers in cancer care make it from lab to clinic.
Beyond the Chief Complaint: Structuring a Semantic Layer
Even before new vocal or facial endpoints emerge, oncology can learn more from the patient conversations currently taking place. Each cancer care visit contains an overlooked layer of context in the way patients describe their fatigue, what they fear about treatment, or the social pressures that may hinder adherence.
By structuring the semantic layer of the conversation, ambient AI for oncology can highlight social determinants of health, treatment preferences, or subtle changes in performance status. These details often disappear in templated notes but can meaningfully shape care plans and outcomes.
“Preferences are often not discussed early, especially in cancer care. But the semantic content of conversations can be very, very useful in optimizing treatment.”
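As a thought experiment, here is a minimal sketch of what a structured semantic layer could look like; the schema and field names are hypothetical illustrations, not DeepScribe’s data model or any published standard.

```python
# Hypothetical schema for a structured "semantic layer" extracted from a visit
# transcript. Field names are illustrative assumptions, not a real data model.
from dataclasses import dataclass, field
from typing import List, Optional

@dataclass
class SemanticLayer:
    """Signals pulled from the conversation, beyond the templated chief complaint."""
    symptom_descriptions: List[str] = field(default_factory=list)     # how fatigue or pain is described
    treatment_preferences: List[str] = field(default_factory=list)    # goals-of-care statements
    social_determinants: List[str] = field(default_factory=list)      # transportation, cost, caregiving
    performance_status_cues: List[str] = field(default_factory=list)  # "I stopped walking the dog"
    source_timestamps: Optional[List[float]] = None                   # offsets into the visit audio

# A downstream rules or model layer could, for example, flag adherence risk when
# social_determinants is non-empty and a new oral regimen is being planned.
```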
Project Data Sphere and the Pre-Competitive Blueprint
Dr. Khozin’s work at Project Data Sphere demonstrates how openness accelerates innovation. The initiative began as an open-access platform for oncology trial data and evolved into a pre-competitive hub connecting regulators, academia, and industry. In this environment, sometimes the best ideas come from an unexpected source.
“The first global challenge we launched… the winning team had nothing to do with healthcare. They were aerospace engineers. It showed how data liquidity brings new minds into the field.”
Those collaborations now power projects with the FDA and National Cancer Institute to automate tumor measurement and create standardized models for complex endpoints. For Dr. Khozin, the lesson is clear: progress in digital medicine depends on shared data and shared purpose.
Recap: Presence First, Signals in Service of Patients
Ultimately, Dr. Khozin believes the value of technology lies not in what it adds to medicine, but in what it can restore.
“Let’s automate the mechanics… so we can get back to how we used to practice — when it was just about the doctor and the patient.”
Ambient AI allows oncology teams to stay present while capturing every layer of meaning: the story, the science, the subtle cues. Voice-derived biomarkers and structured semantics don’t replace the clinician’s role; they extend it, helping care teams see and hear the whole person.
In part 2 of this discussion, Dr. Khozin and Matt dive deep into how these ideas translate into implementation, policy, and the economics of modern cancer care.
Listen to the Full Conversation
Watch part 1 to learn more about the layer of data that’s revealed in patients’ voices, and how ambient AI for oncology can bring it to the surface.
Realize the full potential of Healthcare AI with DeepScribe
Explore how DeepScribe’s customizable ambient AI platform can help you save time, improve patient care, and maximize revenue.

