Check out Nayeema Raza's podcast Smart Girl Dumb Questions: https://linktr.ee/smartgirldumbquestions
We’re at a turning point in public health. From billionaires chasing immortality to the growing influence of AI in medicine, the future of healthcare is being rewritten in real time. In this week’s episode of the Newcomer Podcast, journalist and filmmaker Nayeema Raza joins us to unpack the promises and pitfalls of health tech.
We dive into highlights from Deus Ex Medicina, our one-day, invite-only summit where 200 Silicon Valley founders and investors debated the future of AI and longevity. Together, we explore:
- Why America’s healthcare system leaves people needlessly suffering
- The hype (and hope) around GLP-1s and new treatments
- What RFK Jr.’s health movement means for research and policy
- How China is outpacing the U.S. in human trials
- The looming question: Is HIPAA already dead?
This is a conversation about power, innovation, and the very real consequences of technology reshaping our bodies and lives.
00:00:00
What is your outlook on what the Trump administration has done in
00:00:02
healthcare so far? What kind of grade would you
00:00:04
give RFK? I don't know if there's a
00:00:06
grade low enough. F minus minus. We're at a crazy moment in
00:00:11
health. We've got AI disrupting
00:00:14
traditional medicine. Doctors don't want to rely on
00:00:16
ChatGPT, but they're happy to rely on OpenEvidence.
00:00:19
RFK and the Trump administration, with their MAHA Make America
00:00:23
Healthy Again movement is transforming American health for
00:00:26
good and bad. I'm gonna let him go wild on
00:00:29
health. And then you've certainly seen
00:00:31
people like Bryan Johnson and other billionaires who are
00:00:34
interested in figuring out how they could try to live forever.
00:00:37
You will die. False.
00:00:41
We brought together some of the most serious people in Silicon
00:00:44
Valley and the startup world who are thinking about all those
00:00:46
changes in health at Newcomer's brand-new event Deus
00:00:50
Ex Medicina. I hosted the event with Nayeema
00:00:52
Raza and she's here to join me to talk about some of the major
00:00:56
themes. We'll play some of the
00:00:57
highlights for you and give our reactions and help you make
00:01:01
sense of the crazy stuff in health that's going on in Silicon
00:01:04
Valley right now. This podcast is supported by
00:01:07
Google. Hey folks, Steven Johnson here,
00:01:10
co-founder of NotebookLM. As an author, I've always been
00:01:13
obsessed with how software could help organize ideas and make
00:01:16
connections. So we built NotebookLM as an AI
00:01:20
first tool for anyone trying to make sense of complex
00:01:23
information. Upload your documents and
00:01:25
NotebookLM instantly becomes your personal expert, uncovering
00:01:29
insights and helping you brainstorm.
00:01:32
Try it at notebooklm.google.com.
00:01:35
Nayeema, you're not wearing your hat.
00:01:37
Oh my God, I have another one.
00:01:39
You know, I'm in Aspen, so I have, like... You're moving
00:01:41
On to the next thing. You're like, we just wrapped up
00:01:44
Deus Ex medicina. You're like, oh, I've got a new
00:01:46
hat, new things on my mind. I love the hat.
00:01:50
You know, we're always going. I'm wearing the hat.
00:01:52
You're not. That's sort of like the
00:01:54
dynamic overall. I think you're a little.
00:01:57
Too... look at this hair, you want to cover this with a hat?
00:01:59
All the time. Exactly.
00:02:01
Anyway, we have wrapped up Deus Ex Medicina, our inaugural AI
00:02:06
Health and Longevity Summit that we hosted together.
00:02:09
Obviously, you are the wonderful host of the Smart Girl Dumb
00:02:13
Questions. I got my plurals and singulars
00:02:16
right, Smart Girl Dumb Questions host.
00:02:18
And yeah, thrilled to be here with you.
00:02:20
What was your reaction to the event, and what
00:02:23
surprised you the most? What surprised me the most, you
00:02:26
know, I thought one of the things that was electric in the
00:02:28
room every time it came up was the Trump administration's
00:02:32
policies surrounding healthcare. And what I thought was... what did
00:02:38
you call it, like a brick or? Electric. I said it was
00:02:43
electric. Oh, OK.
00:02:44
Yeah, it was electric in the room because I think people were
00:02:47
just like, really curious to know what other people thought
00:02:49
about it. It was kind of like everybody
00:02:51
had a thought about it, but everyone wants to know what
00:02:52
someone else thinks before they talk about what they think.
00:02:54
Exactly why, sort of in our predictions episode, I said I
00:02:57
don't know if people are going to want to talk about it, I
00:02:59
think. Yeah, yeah.
00:03:00
But I think people love talking about it and loved hearing what
00:03:02
people said about it. So there was a real, I would say
00:03:06
split not within the room, but within many of the people that
00:03:09
we heard from on stage. People would split their opinion
00:03:12
between what they thought of vaccine policy.
00:03:15
No good, very bad, as Vinod Khosla put it, F minus minus
00:03:19
for RFK. Pretty much across the board.
00:03:21
People voiced concern about it. But then a lot of optimism
00:03:25
around, you know, the kind of kill the clipboard campaign that
00:03:28
you're seeing out of CMS, Medicare and Medicaid. A lot of
00:03:33
hope for the administration's willingness to work with the
00:03:35
private sector on revolutionizing some of the
00:03:38
technology and processes around healthcare in America.
00:03:42
And I kind of left thinking, OK, is this self-interested?
00:03:45
Is this what companies say? Because like, they need to be
00:03:47
doing business with government here?
00:03:49
But I really felt it was sincere.
00:03:51
You had people on stage like Tom Hale, CEO of Oura, saying, look,
00:03:55
I'm not a Republican, but showing some bullishness around
00:03:58
what the Trump administration has done or could do in policy.
00:04:02
Yeah, policy obviously loomed large.
00:04:05
I mean, the government and Trump are affecting, what, everything.
00:04:08
Everything, like TikTok, has grown in private.
00:04:10
But health, obviously more than anything, is a highly regulated
00:04:12
industry. Silicon Valley cares a lot about
00:04:15
health. Vinod giving them an F minus was
00:04:18
obviously like a moment. But I don't know, I'm more of
00:04:24
the view that everybody plays their role and if I'm the CEO,
00:04:30
it's like gotta go for whatever win with the government I can
00:04:33
get. And so you have to leave some
00:04:35
room for, like, what? It's like you're
00:04:38
parenting a small child. You know, as I'm preparing to do.
00:04:41
You hope for, you teach them.
00:04:43
You're aspirational. Like, your intentions
00:04:45
are so good. You're trying to do this thing,
00:04:48
you're trying to privatize the thing we all want.
00:04:50
Whenever I hear them say they're really gonna let businesses run,
00:04:53
it's sort of like leave us alone, let us do our thing.
00:04:56
But then when you actually push them on, what about the FDA?
00:04:59
It's either what about the FDA like it's bad or I, I haven't
00:05:03
read or looked at anything on that.
00:05:05
It's like, I don't think anything specific.
00:05:06
They're like pretending they sort of like have no clue what's
00:05:09
going on. Well, I like that.
00:05:11
I really like your metaphor in which
00:05:14
Donald Trump is a baby in diapers and Tom Hale's like the
00:05:16
big daddy being like you're doing great.
00:05:18
You keep crawling up here. But that's not, but I, but see,
00:05:22
I think that if that were the case, they would be wary because
00:05:25
I think we taped right
00:05:28
or maybe it was just around Trump kind of distancing himself
00:05:31
from RFK's vaccine policy, but saying that the White House
00:05:34
still stands behind him completely, whatever that means.
00:05:37
But you know, it was in the specifics for me.
00:05:39
Like when we had the conversation with Strand
00:05:42
Therapeutics' Jacob Becraft and Janice Chen from Mammoth
00:05:45
Biosciences and Janice told the story of how this
00:05:48
administration, which so hates mRNA vaccine technology.
00:05:54
I mean, two things that came out of that conversation. One, Jacob
00:05:57
Becraft clarified that what the administration had done in terms
00:06:02
of cutting $500 million in mRNA funding, which was widely
00:06:05
reported as such, was really around the vaccine funding.
00:06:08
There continues to be a lot of investment dollars when it comes
00:06:11
to cancer research, cancer-specific vaccines, etcetera.
00:06:14
It was in the other world of vaccines, flus, etcetera,
00:06:17
that they had cut this research, the respiratory stuff.
00:06:21
But then Janice made the funny point that, you know, RFK hates
00:06:25
mRNA, loves CRISPR, which relies on mRNA technologies.
00:06:29
And so there there is clearly a disconnect.
00:06:31
Hopefully he doesn't learn any more of the science.
00:06:33
Like if he figures it out, he's going.
00:06:35
To hate CRISPR. Personally, it's better
00:06:36
that he just, like, stays uninformed on this
00:06:39
connection. Yeah.
00:06:41
So I guess I'm less skeptical of their bullishness about policy
00:06:46
and more persuaded by it. But I'm curious, what surprised
00:06:48
you, Eric? Yeah, I mean, the pure, the
00:06:52
business person in me, the sort of student of the Silicon Valley
00:06:56
startups, I'm very interested in the rivalry between Abridge and
00:07:00
OpenEvidence. It's not, it's not a rivalry
00:07:02
yet, but these are two companies that are core to how do we bring
00:07:06
foundation models to doctors. Abridge: we're going to sell to
00:07:11
health systems. It's your
00:07:13
doctor talking to your note-taker app.
00:07:15
We're going to help you make sense of that.
00:07:16
And then, very importantly, we're going to help you tell
00:07:20
your insurance company why they owe you a lot of money.
00:07:22
You know your patients, you know a lot of coverage.
00:07:24
And basically we're going to sort of be your tool in
00:07:27
the fight with insurance companies and making sense of
00:07:30
those messy doctor notes. On the other hand, you have Open
00:07:34
Evidence, which is like, no, we're going to help doctors
00:07:36
actually diagnose patients, figure out what's going on.
00:07:39
And you know, it helps. The patients probably don't want to
00:07:42
hear that their doctor is going to ChatGPT to figure out what
00:07:46
to do. So no, we're going to
00:07:47
OpenEvidence, the doctor's tool, trained on
00:07:50
medical journal text. And they're basically having
00:07:53
doctors just choose to use their free product and then, you know,
00:07:58
getting pharmaceutical ads. The good old-fashioned business
00:08:01
model. Yeah.
00:08:02
Anyway, so I thought there are two super interesting companies
00:08:05
that eventually are going to compete with each other.
00:08:06
But do you have a prediction on who's going to win, Eric?
00:08:08
Are you now going to be, are you going to be like, oh, you know,
00:08:11
I think everyone's doing a good job.
00:08:13
Keep crawling, babies. Interesting.
00:08:16
Who do I think is going to win? I haven't actually...
00:08:17
I mean, without the numbers, it's always hard to tell.
00:08:20
It's a great like Silicon Valley tech question because it's like,
00:08:23
oh, Abridge is sticky. They're selling to health
00:08:26
systems. And so the real threat, of
00:08:29
course, is OpenAI coming for them both.
00:08:31
So I do think Abridge is building deals that might be
00:08:35
hard for OpenAI to dislodge. I think OpenEvidence is
00:08:40
providing, like, free advice, something that ChatGPT does
00:08:44
really well. So I guess I think OpenEvidence
00:08:47
faces more threat from OpenAI. But you know, people love a
00:08:53
Silicon Valley business that's like so good.
00:08:55
Everybody just wants to use it. It grows because people tell
00:08:58
each other about it. So I guess I'm betting a little
00:09:02
bit on Abridge if I had to pick, but they're really great
00:09:04
companies. And I thought that was a good
00:09:05
moment in your interview with the Abridge founder where she
00:09:08
was like, Oh yeah, we are basically going to compete with
00:09:10
them. We're going to go in that game
00:09:12
with OpenEvidence. I also liked the moment in my
00:09:15
conversation with the Hippocratic AI CEO, Munjal Shah,
00:09:18
as well as Daniel Cahn from Slingshot AI.
00:09:20
And they had kind of back and forth about whether or not
00:09:23
there'll be one foundational model that eats the rest or not.
00:09:26
And Daniel kind of pushed back. Munjal was saying, oh, you know,
00:09:32
someone unnamed, a.k.a. Sam, is pushing this idea.
00:09:35
Sam Altman, OpenAI. Sam Altman, OpenAI, is pushing
00:09:37
one idea of one big model, and he thinks it's going to be many.
00:09:40
And Daniel was saying, actually, Sam
00:09:42
doesn't necessarily believe that it's going to be the case, but
00:09:45
that is, yeah, the back end of all this.
00:09:47
Like where is this all going to be powered?
00:09:49
And who's going to own the models?
00:09:50
And whoever owns the models owns the
00:09:52
system. Or, you know, whoever attracts the doctors owns the
00:09:56
system? We'll see.
00:09:58
I mean, and drug discovery was very important, but I think
00:10:00
that'll come up in the clip. So I'll hold my drug
00:10:03
discovery takes for that. All right, let's play our first
00:10:05
clip and it's I think me in conversation with Jonathan
00:10:08
Swerdlin, who's the CEO of Function.
00:10:10
Do you see trends that are geographically distributed?
00:10:12
When you see the data, it upsets your stomach a little bit
00:10:14
because we see how many people are suffering needlessly with
00:10:18
things that are actually really solvable, that don't even
00:10:20
require medical intervention, but are things you solve in your
00:10:23
life. The level of cardiometabolic
00:10:24
disease in this country, even amongst people you would
00:10:27
consider healthy is so high. And the result of that is like
00:10:31
in 20 to 30 years, if they're in their 30s or 40s,
00:10:34
like there's a likelihood they're going to have a heart
00:10:36
attack and they might leave their families behind.
00:10:39
And these things are, I know what happens when you pull the
00:10:41
thread on some of these issues. It's a little nauseating to see
00:10:44
that at such a scale. And then it's also at the same
00:10:46
time encouraging, because we flipped the lights on
00:10:49
like people are in the dark. Wow, you guys just really chose
00:10:52
the least spicy moment of that whole interview.
00:10:55
Because I thought it was a very Silicon Valley moment.
00:10:57
I wanted to start there just because he really hit that chord
00:11:00
you expect from Silicon Valley. It's just we want to save the
00:11:03
world. I'm a true believer.
00:11:04
You gave him a hard time. And it was definitely amusing.
00:11:06
And people can go... we're posting the full clips on
00:11:09
YouTube so you can go listen to it and watch them sort of rough
00:11:13
him up a little bit. But I did think, what did you
00:11:15
think of his, the sincerity of his sort of, there's so many
00:11:18
people that need our help and you can explain the business a
00:11:21
little bit to people. What sincerity? Do you see what I
00:11:25
did there? They have no, I, I don't.
00:11:29
I think that look, he has to say that I was curious what they're
00:11:32
doing. Like my big my big question was
00:11:33
what are you doing with all this data?
00:11:34
Right, because Function Health is one of the fastest-growing
00:11:37
companies. They've had, I think, 450% year-on-year
00:11:40
growth. Like if you're on any social
00:11:42
media platform, you have seen the ads.
00:11:44
It's 500 bucks. Get all these blood panels that
00:11:46
you'll never have from your doctor.
00:11:47
Earlier in the day, Annie Lamont, investor at Oak HC/FT,
00:11:51
had made the point that, you know, 90% of what you see in
00:11:54
some of these blood panels, etcetera are actually covered by
00:11:57
insurance. And so there's some marketing.
00:11:58
Will every... you know, sure, rich people will pay for Function's
00:12:02
branded "we're going to help you figure everything out" test.
00:12:06
But as it reaches mass market, those people are just going to
00:12:08
go to health insurance companies if this is possible.
00:12:10
That's basically the trade-off with Function.
00:12:12
And Jonathan brought up, like, the geographical
00:12:15
distribution of their company. So I'm like, great, what
00:12:18
does that data look like? Are people healthier in parts of the
00:12:20
country or not? And I got this kind of soft answer back. So what
00:12:23
do I think of the sincerity of that? Look, I don't think you get
00:12:26
into the healthcare business because you're
00:12:27
trying to... Am I struggling to stay on
00:12:30
topic? Or no, no, no, you're I'm, I'm
00:12:33
saying you're struggling with sincerity itself.
00:12:36
I... No, because I think actually
00:12:38
like I, I mean, I believe that most people get into their
00:12:41
fields because they care about the thing.
00:12:42
So yes, I do believe that Jonathan is someone who cares
00:12:46
about the health of Americans. I think Mark Hyman is someone who
00:12:48
cares about that. They might have different
00:12:50
perspectives than you about that, but they have their own stories
00:12:52
that brought them to it. Yeah.
00:12:54
What I think is interesting about the business is you are
00:12:58
providing services that are then executed on by Quest Labs and
00:13:02
now, with their recent acquisition of Ezra, MRI scans
00:13:04
that are done by third-party vendors.
00:13:07
What you have value in is the data.
00:13:09
And are you going to move into the actual labs business?
00:13:12
And Jonathan didn't say no. He said not today, which to me
00:13:15
as a journalist says, OK, tomorrow, you know, choose
00:13:18
whatever length of time you choose.
00:13:20
The medical system is built to treat sick people and like the
00:13:26
longevity insight, you know, and I think there are a couple, but
00:13:29
a key one is, well, the medical system should be
00:13:32
doing more for healthy people. Say you're healthy:
00:13:36
Doctors would say don't get a bunch of tests and scare
00:13:39
yourself. Like just you're healthy, be
00:13:40
healthy, don't worry about it. Like, do this, you know: eat
00:13:43
healthy, exercise and like leave us alone.
00:13:45
And I think, like, companies like Function, where they're like, no,
00:13:48
get a scan. Like things could be wrong or
00:13:50
you could be better. Like, I just think businesses
00:13:52
coming at something with a different mentality and saying
00:13:56
we, you know, we believe in medicine, but we should like
00:13:58
look at these things differently.
00:13:59
I think there's opportunity there just to like take a new
00:14:01
lens to something where there's so much consensus in how
00:14:05
medicine is thought about.
00:14:07
So I think they have opportunity and just that they clearly have
00:14:09
a fresh lens for how, you know, most medical testing is getting
00:14:14
done. Yeah.
00:14:15
And also, I mean, that is a shift that I think is here to
00:14:18
stay: preventive. I mean, call it longevity, call
00:14:20
it whatever, but preventative health and people's own kind of
00:14:24
n-of-1 healthcare, like treating yourself as your own
00:14:28
baseline of what good health looks like and not shifting all
00:14:31
those costs onto the system or onto the back end.
00:14:33
Now I think what's important is like, can everyday clinicians
00:14:38
compete and will they upgrade and start providing a lot of
00:14:41
these indicators and tests? And as Annie said, many of them
00:14:44
would cover it, but you would have to ask, your insurer would
00:14:47
have to agree, etcetera. And that bureaucratic process is
00:14:50
really hard. So it's easier to, you know,
00:14:52
swipe and get the blood done.
00:14:56
I think we have a clip on GLP-1s, maybe we go to that
00:14:58
because I think that fits directly with this longevity
00:15:01
conversation we're having. Probably and this will probably
00:15:03
be our conversation with Noom CEO Saeju Jeong, and...
00:15:07
Yeah, all yours are the best, I guess.
00:15:08
You know, we're doing all Nayeema clips. No, no, I'm just
00:15:11
introducing the clip. No, it's good, play it. They're
00:15:14
not the best. I can make fun of myself.
00:15:17
Yes. Do you think pharmaceutical
00:15:18
companies are going to double down on GLP-1s and other kinds
00:15:21
of preventative care over the R&D that they're doing for
00:15:24
cancer treatments? Oh, there's definitely a lot of
00:15:27
rumors that they're going to get into explicit aging and
00:15:30
longevity and like have an aging division, which would be the
00:15:34
sickest thing ever. And I hope it happens.
00:15:36
Like an aging drug or longevity drug sounds so radical, but it's
00:15:39
just a preventative medicine for multiple diseases, right?
00:15:43
Ozempic is preventative medicine for the downstream negative
00:15:46
consequences of obesity. Statins are preventative
00:15:48
medicine for certain forms of age-related cardiovascular
00:15:51
disease. That's all a longevity drug is
00:15:53
too. And oh, by the way, statins are
00:15:54
the most successful drug class in history.
00:15:57
GLP-1s are probably going to beat them, right?
00:15:59
And so the minute people see that, and I'm sure people there
00:16:03
already do, they're incredibly smart.
00:16:05
Yeah, I think it's going to happen.
00:16:06
So this is Celine, who's the founder of Loyal. You
00:16:09
might think she's doing drug life extension for humans, but
00:16:12
in fact she's doing life extension for dogs, which is
00:16:14
such an interesting play, but she came up in human sciences.
00:16:17
What do you think of her prediction, Eric?
00:16:18
What I love about Celine is that like, she's sort of doing this
00:16:22
like Tesla strategy, right? You start with like the Roadster
00:16:26
and then you're like, from there, we'll save the planet.
00:16:28
Elon might have gotten lost on his save the planet with clean
00:16:32
energy. But Celine, it's like, OK, if we
00:16:34
can figure out how to save your dog, which is a lower regulatory
00:16:37
burden than everybody else, then we'll be able to do it with
00:16:40
humans and do much more. And so she's a true believer in
00:16:43
like longevity. How can we make healthy people
00:16:46
live higher quality lives? And I, I think she's right that
00:16:50
like pharmaceutical companies are going to wake up and say,
00:16:52
oh, there's clearly money in it. Silicon Valley's investing in
00:16:54
it. GLP-1s made a bazillion
00:16:56
dollars and we should take this more seriously.
00:16:59
Yeah, to quote the great artist Cuba Gooding Jr., it's
00:17:03
like, show me the money. This is the money.
00:17:05
These GLP-1s have fundamentally shifted how we think about
00:17:09
people's willingness to pay for healthcare.
00:17:11
This is an industry that you would rather not pay for.
00:17:14
Like you don't think twice about buying a new pair of sneakers,
00:17:18
but when it comes to something for your health that might cost
00:17:21
a fraction of that, you're like, no, because it should be covered
00:17:24
by my insurance. I already paid for my insurance
00:17:25
and the insurance should cover it.
00:17:26
So there's this whole psychological wrap up.
00:17:29
And what we're seeing with GLP-1s is what we've seen with,
00:17:32
say, Botox, where people are willing to pay for a medical
00:17:36
product out of pocket all of a sudden.
00:17:38
And no doubt pharmaceuticals want to get in that space.
00:17:42
And they also, I think she and Saeju from Noom made this point,
00:17:46
will want to increasingly go direct, like Celine has chosen to
00:17:49
own distribution end to end, from clinical trials to
00:17:52
putting, you know, shots in paws, I guess, is the
00:17:56
analogy in this case. And.
00:17:59
I think a subtle thing people learn with GLP-1s is, like, the
00:18:03
story of the drug matters so much.
00:18:05
You know, it's like they had been around, you know, had
00:18:08
helped people with diabetes. And obviously, you know, you
00:18:11
need it to be sort of primarily useful for obesity, which, you
00:18:16
know, allowed for this blow-up. Yeah, it's still
00:18:19
heavily regulated. You still need a BMI of 25 to
00:18:21
qualify for Ozempic. I think Noom has introduced
00:18:24
this microdosing of GLP-1s, which I'm very interested in.
00:18:28
I think this will move over time.
00:18:30
As someone who's not at that BMI,
00:18:31
I would like to. I mean, I've thought about oh,
00:18:34
should I be microdosing GLP-1s? Because all of a sudden, if
00:18:37
it's anti-inflammatory and really has longevity benefits
00:18:40
which are still being borne out and being tested and studied.
00:18:43
But if that is the case, then why would we reserve it only for
00:18:46
a portion of the population? All right, let's play the next
00:18:49
clip. And this is you, Eric, talking
00:18:51
to Dr. Hon Pak, who's the
00:18:53
Samsung. 40% of people in the US do not know they have sleep apnea.
00:18:57
There was a study that showed when you reach 60, a 1% decrement per
00:19:01
year in deep-stage sleep increases your dementia risk by
00:19:05
25 to 27%. And so our users use our
00:19:09
wearables a lot for tracking their sleep.
00:19:12
One of the top reasons for sleep issues has to do with sleep
00:19:16
apnea. So we have a software as a
00:19:18
medical device, we'll screen for you and say we think you have
00:19:22
sleep apnea. 40% of people in the US do not know they have sleep
00:19:25
apnea. So the watch actually uses PPG
00:19:28
signals. We validated this in a sleep lab
00:19:31
and so we got the FDA clearance based on the standard we met, and
00:19:35
essentially we're able to screen to a very good accurate degree
00:19:39
as to whether you have sleep apnea.
00:19:41
I have to say, after that moment, I was like, I think I
00:19:44
have sleep apnea. That's just most people's
00:19:47
reaction to that moment. What did you think about that?
00:19:50
I have two very contradictory takes.
00:19:52
I guess the, this is the one that sort of fits with what I've
00:19:55
been saying is just, you know, it, it's good to help people
00:19:59
sort of, you know, you're already wearing these devices
00:20:02
like get the data, figure out what's wrong with you, like find
00:20:05
some improvement. There is another part of me
00:20:08
that's like, man, it's what capitalism does best.
00:20:11
It's like, you don't realize you have a problem.
00:20:14
Let us sort of plant the seed that you have a problem that we
00:20:17
can then sort of sell you a solution to.
00:20:19
And I, I do think that's where the medical establishment gets
00:20:22
wary. It's like some of these things.
00:20:25
Are only really problems if you feel like you're suffering and
00:20:29
if they suggest you might be, now all of a sudden you might
00:20:32
have a need that you didn't have before.
00:20:34
So I think it just depends on the seriousness of an issue.
00:20:37
I mean, I do think sleep apnea is a real thing, but you can see
00:20:40
how, taken to an extreme, suggesting people should be
00:20:44
worried about this new thing or that could be stressful for
00:20:47
people. Yeah.
00:20:48
Sleep apnea is a very interesting use case for it
00:20:50
because it's the kind of thing that you might not know you have
00:20:53
it. And especially if like you're
00:20:54
asleep, you think like you're asleep.
00:20:56
And if you're not in a relationship, right?
00:20:58
And like, if you're in a relationship, you might know,
00:20:59
someone might say to you, hey, you're snoring or hey, you seem
00:21:02
to stop breathing in the middle of the night.
00:21:03
But if it's just you having bad sleep, you have no idea what
00:21:06
that is. And, and third kind of thing
00:21:08
that it triggers in me is these basic elements of health, which
00:21:12
is like health, you know, it's like, are you hydrating?
00:21:15
Are you eating good food? Are you moving your body?
00:21:17
Are you having good communal relationships?
00:21:19
Are you sleeping well? Kind of, those are your five real indicators of
00:21:22
health and or like drivers of health, most people would say.
00:21:25
And so, but this is one where... Yeah, I think it's interesting
00:21:28
the way they've been able to work with the FDA to get that
00:21:31
use case out there. And certainly will be a way that
00:21:36
we see a lot of these devices start to compete with each other
00:21:40
is I think in their ability to diagnose, slash, prescribe or at
00:21:44
least get proximate to that, give you readouts that are
00:21:47
doctor-friendly, and/or talk to you. Like with Oura's Tom Hale:
00:21:51
I was like, when is your ring going to talk to you?
00:21:53
And he's like, oh yeah, we've been talking to Munjal, the CEO of
00:21:56
Hippocratic, about that for years.
00:21:58
Like could your ring just call you?
00:22:00
I mean, to get like philosophical for a second,
00:22:03
yeah. I feel like there's a funny
00:22:05
realization with health where you're like, oh, I'm in charge
00:22:07
of it, You go to your doctor. But then at some point, like
00:22:11
your doctors like barely see you, they're like oblivious and
00:22:14
you realize like, oh, it's like it's just me.
00:22:17
Like no one is really watching. Your doctors definitely know less
00:22:20
than you. They have more expertise, but
00:22:21
they're like they're not dialed in.
00:22:23
And it's not that high stakes. Like if you have an issue, you
00:22:25
sort of need to conclude you have an issue or not.
00:22:28
And this is like a very like, I don't know, you're alone in the
00:22:31
universe sort of revelation that I think people have at some
00:22:35
point. And it's funny with technology
00:22:37
that I do think there is this real potential that it's like,
00:22:40
oh, we emit all this data and AI is getting very smart.
00:22:44
And like, someday soon there will be another force sort of
00:22:48
like watching out for you and saying, I'm worried you have
00:22:51
this thing. And like, will that change for
00:22:54
good and bad, the sense that there's a protective sort of
00:22:58
layer around us that we previously didn't have?
00:23:02
Yeah. And the data around it is really
00:23:04
interesting because all of a sudden you have that data and
00:23:07
then it becomes: what's signal, what's noise in that
00:23:09
data? And that's where you're really
00:23:11
dependent on the device and the authorizations for people
00:23:13
telling you. But like, you know, one of the
00:23:15
greatest innovations of the Oura Ring in the last six years has
00:23:18
been their ability to tell you where you are in your cycle.
00:23:21
And people are using that for sexual health, for
00:23:24
fertility tracking, for trying to get pregnant.
00:23:27
And you're taking something that was a really terrible experience
00:23:30
of like you spend your 20s trying not to get pregnant, then
00:23:32
you spend your 30s trying to get pregnant and peeing on sticks
00:23:35
and doing all kinds of things. Whereas now, I think
00:23:39
the studies show that it's much more accurate than calendar
00:23:41
tracking. It's probably...
00:23:42
I don't. I don't know if it's as accurate
00:23:44
as peeing on a stick, but. I mean, part of what you're
00:23:47
getting at, I think is this like it's a tool to inform you.
00:23:51
It's like it's your data, it's giving you advice, but you still
00:23:55
feel like the actor. It's not this sort of like
00:23:58
other, which is like, hey, like I've made this unilateral
00:24:01
decision for you. It's like it ultimately
00:24:03
reinforces the sense you're in control because you're getting
00:24:06
information that allows you to be a sort of more informed
00:24:10
decision maker. Though health is, you
00:24:12
know, one of these industries that you still have to rely on
00:24:15
potentially going through someone else or at least seeking
00:24:17
the advice of someone else, because the data is a bunch of
00:24:19
numbers, and that's where I think usability comes in.
00:24:22
Like a lot of what will drive the competitiveness or not of
00:24:26
these companies is how usable and glanceable they are. Glanceability is
00:24:29
a metric that Apple, for example, looks at a lot.
00:24:31
So when they create these PDF readouts, I know one of the
00:24:34
things that they look at is how quickly can a doctor digest that
00:24:36
information? Because a problem in the
00:24:38
clinical setting that you're having is people are coming in
00:24:41
with all this information and saying to their doctors, but I
00:24:43
have this massive MRI report and they're saying, why are you
00:24:46
doing this? Like you're in your 30s, you
00:24:48
seem super healthy. Why are you having a full body
00:24:50
MRI scan? Doctors talk a lot about the
00:24:53
cost on the medical system from people having too much data and
00:24:57
wealthy people in particular coming in with more data, taking
00:24:59
more time. That's taking away time for them
00:25:02
to see other patients. So the health system has to
00:25:04
think about how to absorb or compete with what we're seeing
00:25:07
here. One question I have for you is
00:25:09
we have talked about whether or not there'll be one bundle for
00:25:13
healthcare or not or if there'll be, you know, like what the
00:25:16
future of healthcare looks like. Will it be like streaming where
00:25:19
you have all these different bundles competing with each
00:25:21
other or will these stay disaggregated companies?
00:25:25
Did you have a prediction or more of a thought on that coming
00:25:27
out of it? No, it's beyond me.
00:25:28
It's just so hard with the insurance layer.
00:25:31
Like your vision is like sort of a Netflix type subscription
00:25:35
where it's like you have insurance and then you have your
00:25:38
like additional I'm willing to pay.
00:25:40
And the way you get people to do that is you sort of offer them a
00:25:43
bunch of stuff and they. Well, I'm just saying that
00:25:46
vertical integration, like Function Health buying Ezra.
00:25:48
Like all of a sudden it's like, oh, are you part of the, you
00:25:50
know, I'm making this up obviously or something.
00:25:52
Are you part of the Prenuvo Oura Noom bundle or are you part of
00:25:55
the Function Health Whoop whatever bundle?
00:25:59
It makes, I mean, I, you know, I used One Medical, like it's, you
00:26:01
know, a relatively small fee, but certainly I would be happy
00:26:06
to pay much more if it was like, OK, here's do all these things,
00:26:10
you know, lead you to it tied in with we're going to try and get
00:26:14
insurance to cover everything we can, but we've sort of thought
00:26:18
through what the supplemental stuff is that you should pay
00:26:20
for. Yeah.
00:26:21
I imagine in the future we're going to be paying for, like, there'll
00:26:24
still be insurance, which will largely be employment driven.
00:26:26
And then you'll have a set of subscriptions that you pay, but
00:26:29
instead of paying 10 different providers, you're going to pay 1
00:26:32
bundle price. That's going to be a discount on
00:26:34
like 4 things that you're getting.
00:26:35
Here's a question to Shiv Rao, the CEO of Abridge, the company
00:26:39
that I talked about that was selling to medical systems, giving
00:26:43
doctors transcription services. My mother is so worried that AI
00:26:48
is going to deny all my claims and the insurance companies are
00:26:52
going to just be so empowered to like find every way to weasel
00:26:55
out delivering care and that could get extended to Medicare
00:26:58
Advantage. How worried are you about
00:27:00
insurance companies weaponizing this technology?
00:27:02
My, my sense so far is that there is a ton of opportunity
00:27:06
for us to figure out the right way forward.
00:27:08
And when you think about risk adjustment, it's so difficult
00:27:12
for a clinician to capture the whole story.
00:27:14
And as technology can do that, that's a win-win for
00:27:17
everyone involved. I used to be on a clinical
00:27:19
documentation improvement team at a health system at UPMC in
00:27:22
Pennsylvania. And we used to do these lunch
00:27:23
and learns. So we'd go from department to
00:27:25
department and do pizza and PowerPoints and try to teach
00:27:27
doctors about risk adjustment, about what they call MEAT
00:27:30
criteria, about documenting that you discussed monitoring,
00:27:33
evaluating, assessing or treating against any given
00:27:35
problem. We try to teach them about
00:27:37
utilization management, IP versus OP status, or E&M criteria
00:27:41
or medical decision making. Did you document it completely?
00:27:44
All of this stuff went in one ear and out the other. Every single
00:27:46
clinician in those sessions had a thousand-yard stare and
00:27:50
just wanted to get back to clinic, see their patients and
00:27:52
then get home and see their families.
00:27:54
And So what happens when you can take these models to risk
00:27:56
adjustment school to revenue cycle school?
00:27:58
What happens when you take them to prior authorization school?
00:28:01
What happens when they're behind the scenes reasoning through it all, and
00:28:04
what happens when they go to medical school and they can help
00:28:05
the like the clinician feel like a superhero in front of their
00:28:09
patient. And so that's that's sort of the
00:28:11
high level journey that we're on.
00:28:13
Pizza and PowerPoints. I think very different things happen in
00:28:15
all of those settings. I mean, I imagine insurers are
00:28:18
going to use that AI to deny, deny, deny, and clinicians to
00:28:22
diagnose, diagnose, diagnose. But what do you think?
00:28:24
I mean, Shiv is, you know, on the side of doctors. They're
00:28:27
going to work with insurance companies some, but I
00:28:30
think, you know, doctors are their customers.
00:28:32
So they are trying to help doctors get more out of
00:28:34
insurance companies. So he's sort of the feel good
00:28:38
story. I don't think the insurance
00:28:39
companies have any incentive to shout about how they use AI.
00:28:43
And so we'll hear less about it, but certainly it's going to be
00:28:46
going on. But I take Shiv's point that
00:28:49
like insurance companies, their job is to figure out like, oh,
00:28:52
you didn't do the right thing. Like we're not going to pay for
00:28:54
that. That's not how it's supposed to
00:28:55
go. And so doctors, you know, get
00:28:57
into the business of being a doctor because
00:28:59
they want to help people, not because they want to file
00:29:01
claims. And so perhaps they have more
00:29:04
room to benefit from having this tool than insurance companies
00:29:09
do. But I definitely think an arms
00:29:11
race is very possible. And that's in a lot of domains.
00:29:14
The most dystopian case for AI, which is, I mean, sort of like
00:29:19
the law in America, you have your expensive lawyer, I get my
00:29:22
expensive lawyer and we fight it out.
00:29:24
Like you get your great AI, I get my great AI, and we fight it
00:29:26
out. Then it's like, are we any
00:29:28
better with all that spending? Yeah, it'll be a three-way arms
00:29:32
race because, you know, like my big excitement when chatbots
00:29:36
became mainstream in 2023 is like, I dreamed of a chatbot to
00:29:40
fight the United chatbot, the United Airlines chat.
00:29:44
Like, just like I will just put this thing to work.
00:29:46
So it's not my time. It's like you deal with my
00:29:48
agent. And I think over time it will be
00:29:50
bad. It'll be, you know,
00:29:52
the agentic doctor and the agentic insurer and the agentic
00:29:55
patient fighting it out about what's going to be the right
00:29:59
solution. So that arms race
00:30:01
reality is something that I think we're driving towards.
00:30:04
And Speaking of this great arms race, it's probably a good time
00:30:06
to play the conversation that we were talking about earlier
00:30:09
between Munjal Shah of Hippocratic AI and Daniel Kahn from
00:30:13
Slingshot AI about whether it'll be 1 foundational model that's
00:30:17
going to really power all of the healthcare companies in the
00:30:20
world or if it's going to be competing models out there.
00:30:23
Certain people who will remain unnamed have been running around
00:30:26
the world saying you know 1 ring will rule them all.
00:30:29
Yeah. And it's just not true.
00:30:31
We have tried it. We have tried.
00:30:32
There's only one model that we trained ourselves on, 6
00:30:35
calls of nurses talking to patients.
00:30:36
And you can't get there 'cause these models, even when they
00:30:39
have a million token context window, they can't pay attention
00:30:42
to all of it. And so you have to multiply
00:30:44
attention span by multiplying models.
00:30:46
It could be that closed source or open source models dominate.
00:30:49
It's extremely hard to predict. But wherever the world goes,
00:30:51
it is entirely possible that it's fine tuning.
00:30:54
It could be a whole new paradigm of fine tuning that emerges.
00:30:56
Who knows, we might be building on top of foundation models.
00:30:58
We might be able to be building separate models.
00:31:00
I think it's personally, I just think it's kind of pointless to
00:31:03
predict. I think machine learning
00:31:04
engineers on the ground, we don't really care.
00:31:07
So that's such an interesting take because now Daniel Kahn,
00:31:11
who's arguing that it doesn't really matter, is actually from
00:31:14
Slingshot AI, which has built its own foundational model for
00:31:17
behavioral health. It's a mental health provision
00:31:19
company that's probably really competing with
00:31:22
ChatGPT in one of its prime businesses right now.
00:31:24
Not what they wanted necessarily, but it is a big use case, therapy,
00:31:28
versus the Hippocratic AI model. It's really like 20 plus
00:31:33
individual models. I think they're largely based on
00:31:35
another outside foundational model, but it's like 26 models
00:31:40
because they need one mean one and 20-plus nice ones to deal with
00:31:43
the patient. So the patient is getting all
00:31:46
the information they need. But when the patient isn't
00:31:47
behaving, patient isn't complying, taking their
00:31:49
medicine, doing their walk, doing their PT, the mean model
00:31:52
can come in and say, hey, you really got to be doing that in
00:31:55
the world of kind of agentic voice nurses slash medical
00:31:59
assistants that they're in right now.
00:32:00
So they're just like taking points of view that I thought
00:32:02
was like, that's kind of interesting given what your
00:32:04
company is actually doing. What do you think?
00:32:08
I mean, you know, I feel like I spend my life debating this
00:32:11
question. You know, Cerebral Valley, our
00:32:13
AI summit. It's like, will OpenAI rule it
00:32:15
all or will there be lots of different models?
00:32:17
I think having a super smart model is very valuable in
00:32:21
getting you customers, associating you with the thing,
00:32:24
right? If Daniel's Slingshot can be
00:32:27
the best at therapy or one of the best, it'll be easier to
00:32:30
build a brand for therapy and get people doing it.
00:32:34
And you need to sort of be competitive.
00:32:36
But I think the moat more and more is like people download
00:32:40
ChatGPT, people download your app like you need to have users
00:32:44
engage with it. Teach them what am I getting out
00:32:47
of it? Teach them the rhythm of that
00:32:49
and that matters more. And if you're, you know, using
00:32:52
small models and paying ChatGPT, like I think to me the customer
00:32:58
relationship matters more than your proprietary model
00:33:03
intelligence. And I think that's how things
00:33:04
have been developing. Yeah.
00:33:06
So like you're saying, whatever's in the front of the
00:33:08
house matters more than what's in the back of the house?
00:33:10
Yeah, shuffle a bunch of models and like all this stuff is still
00:33:13
happening where like price is somewhat abstracted because
00:33:16
they've been able to raise a bunch of money.
00:33:18
And once price matters, you know it's nice to use open source
00:33:21
models because they're cheaper. 100% that is interesting.
00:33:24
I feel like I have less of a clear point of view on this in
00:33:29
some way. I'm with you that what what the
00:33:32
harder thing is, is the adoption.
00:33:34
But then I think you have one bad news story, especially like
00:33:38
in mental health, one bad news story about a diagnosis in the
00:33:41
medical world. It's like if your model is
00:33:43
feeding inaccurate information, that's that's really
00:33:47
challenging. And that's why so many of these
00:33:49
companies, including Hippocratic, including
00:33:51
Slingshot, are staying away from prescription and diagnostics
00:33:56
because that is the scary, scary world, you know, of, of
00:34:00
medicine. I have like older parents.
00:34:03
And so I have been in the healthcare system as like an
00:34:06
advocate for my parents for a long time.
00:34:08
And my sisters and I know so much about the healthcare
00:34:10
system. The doctors will often be like,
00:34:12
or the nurses will say, are you guys medical professionals?
00:34:15
I'm sure it's racial profiling also because we're brown.
00:34:17
But, and I'll always say to them, Oh no, I'm a malpractice
00:34:20
attorney. Oh my God.
00:34:22
And they literally look like they're going to pee.
00:34:24
And I'm like, I'm just. Kidding.
00:34:26
I don't know, I'm just kidding, but I know some, but I'm just
00:34:29
kidding. But you know, I always get, I
00:34:31
always think, I always think you catch more bees with honey.
00:34:35
But it's so funny. It's so funny to see people's
00:34:39
reaction to that because the only thing I think doctors hate
00:34:43
more than insurers are these like, right?
00:34:47
All right, let's play our last clip, the spiciest of the day.
00:34:51
I think we teased this at the very beginning of the episode.
00:34:53
What grade would Vinod Khosla, star investor, give Health
00:34:56
Secretary RFK Jr.? RFK? I don't know if there's a grade low
00:35:01
enough. F minus minus. He obviously cut a deal with
00:35:06
Trump during the election, which is a sad state of affairs where
00:35:10
our health portfolio and vaccine policies are auctioned off
00:35:14
during an election. But they are doing a lot of good
00:35:17
things. Doctor Oz has a very sensible
00:35:20
view of how to use AI. I think the FDA is very
00:35:24
interested. And you have a head of AI at the FDA.
00:35:28
All right, he had some nice things to say about Doctor Oz.
00:35:32
By the way, we didn't keep that clip running, but he was talking
00:35:34
about how Doctor Oz had read his, what was it, 2012 or 2016
00:35:39
paper or treatise on the future of healthcare, the do we need
00:35:44
doctors or do we need algorithms kind of take, and that Doctor Oz,
00:35:48
now head of Medicare and Medicaid, had responded to him
00:35:52
and engaged with him on this. So Vinod had very good things
00:35:54
to say about the administration overall, but no
00:35:57
love for RFK Jr., right? I think that's a consensus view.
00:36:01
The only non-consensus thing is whether you're willing to say it
00:36:04
or not. Anyways, this was so fun.
00:36:05
I, you know, I learned a lot. I think that we should
00:36:07
definitely do it again. We should go to Austin or we
00:36:10
should go to New York next. Weigh in on where we should
00:36:13
bring Deus Ex Medicina. And this was a lot of fun.
00:36:15
Plenty, plenty more to dig into. And yeah, great to co-host it
00:36:18
with you. Yeah.
00:36:20
Thank you. Yeah, it was great to co-host it
00:36:21
with you. I'm glad that we made this thing
00:36:23
happen. Great.
00:36:24
Sounds good. All right.
00:36:26
Thanks guys.
