On this episode of the Newcomer podcast, host Eric Newcomer is joined by co-host Nayeema Raza for conversations with some of the most influential voices in healthcare and venture capital. Bob Kocher, Partner at Venrock, and Annie Lamont, Founder and Managing Partner of Oak HC/FT, share their perspectives on business models in healthcare, the rise of AI applications, the promise and pitfalls of longevity drugs like GLP-1s, and the future of Medicare Advantage. Later, Vinod Khosla, Founder of Khosla Ventures, brings his trademark candor to a wide-ranging discussion about AI’s role in healthcare, regulatory challenges, global competition, and how startups can reimagine the system from the ground up.
00:00:00
Hey, welcome to the Newcomer podcast.
00:00:02
It's Eric Newcomer here.
00:00:03
I'm fresh off Deus Ex Medicina.
00:00:05
I think I'm literally wearing the same sweater I wore to Deus
00:00:08
Ex Medicina. I promised I'd change my
00:00:10
clothes, but I, you know, you get attached to a sweater and
00:00:14
then I wear it to death. Anyway, yeah, it was a great
00:00:16
event. This was our inaugural AI Health
00:00:19
and Longevity Summit. We brought together, you know,
00:00:22
some 200 people in San Francisco, heavy mix of
00:00:26
founders, investors, members of the media, other health insiders
00:00:30
to talk about 3 big themes rising in health right now.
00:00:34
One, obviously, foundation models and the rise of generative
00:00:37
artificial intelligence, which is powering companies like
00:00:41
Abridge, which is doing doctor note-taking, and OpenEvidence,
00:00:44
which is helping doctors diagnose patients.
00:00:47
Both companies had their co-founders speak at Deus Ex
00:00:50
Medicina. Second, we had the longevity
00:00:53
trend, with GLP-1s, you know, suddenly making everybody a
00:00:58
convert to helping healthy people.
00:01:00
There's a lot of money to be made there, a lot of value to be
00:01:02
created for patients. We had the co-founder of Noom,
00:01:06
Saeju, whose company is doing, I think, micro-
00:01:09
dosing GLP-1s. We had Celine, the co-founder and
00:01:14
CEO of Loyal, who is helping dogs live longer, and always a very
00:01:21
thoughtful speaker on longevity.
00:01:22
So we had that longevity theme. And then third, there's all the
00:01:25
insanity with MAHA, you know, Make America Healthy Again, RFK,
00:01:30
and the Trump administration's changes to health care.
00:01:33
So how is that sort of an inflection point for good and
00:01:36
bad? And there was definitely some
00:01:37
soul searching over whether, you know, the traditional
00:01:41
Republicans were helping Medicare Advantage as much as
00:01:43
some investors had thought. Democrats like to beat up on
00:01:46
Medicare Advantage, and now Trump hasn't come in and
00:01:49
saved the day either. And that's obviously been an
00:01:51
exciting area of investment. So those were three big themes.
00:01:54
Other themes, you know, took us by surprise.
00:01:56
China's ability to do drug trials much more effectively than the US
00:01:59
certainly stood out. And then I think the final,
00:02:03
final theme I'd flag before kicking this over would just be,
00:02:06
you know, continued hopes that patients will be buyers of
00:02:09
healthcare sort of directly. And you know, we had the CEO of
00:02:12
Function Health speak who's selling diagnostic tests to
00:02:15
consumers and we had some healthy debate over whether
00:02:20
consumers would prefer only to use their insurance company or
00:02:23
not. Anyway, those are my high level
00:02:25
takes. We picked two of our favorite
00:02:28
panels to share on here in the podcast feed.
00:02:31
First, my conversation with Bob Kocher at Venrock and Annie
00:02:36
Lamont at Oak HC/FT, two of the top health investors.
00:02:41
And then the second is me with Nayeema Raza, who co-hosted
00:02:46
Deus Ex Medicina. Yeah, we interviewed the ever-
00:02:48
spicy Vinod Khosla. And I wanted to share the investor
00:02:52
conversations because I do think they zoom out the farthest; you
00:02:55
get the greatest scope of what we really covered in the day.
00:02:58
We're going to post a lot, if not all of the other talks on
00:03:02
our YouTube channel. So go check out Newcomer Media's
00:03:05
YouTube channel. You can always go to the
00:03:08
Substack, newcomer.co, to get routed to our stuff.
00:03:11
So go watch the videos there. But I think this conversation
00:03:14
with Bob and Annie and then my conversation with Nayeema and
00:03:17
Vinod really helped sum up some of the big questions of the day.
00:03:21
So without further ado, give them a listen.
00:03:26
Investors that I really trust, both in terms of having a good
00:03:30
sense on policy and having good values and then being very
00:03:34
serious about like the businesses you invest in.
00:03:36
So, excited to talk about it all. And obviously this is still a
00:03:39
technology business at the end of the day.
00:03:41
So the changes in technology will power everything you're
00:03:43
doing. I just want to start off, you
00:03:45
know, we're talking about healthy businesses.
00:03:48
What are some startups, earlier-stage businesses, where
00:03:51
you think this is not, like, a long bet on the technology
00:03:55
coming together; you feel like they've got a great business
00:03:57
model? Like, where are areas right now
00:03:59
you're excited about, like, business model types in
00:04:02
startups? Annie, do you want to go first?
00:04:05
Sure. So look, it doesn't matter if
00:04:08
it's healthcare, fintech, technology, I mean, the best
00:04:11
business model out there is SaaS software.
00:04:14
It works over and over again, and...
00:04:17
Over here? It's dead.
00:04:19
It was software, and it's back, you know, I mean, I think that's
00:04:22
what's really interesting. So there are two parts to this question,
00:04:25
right? And the answer is that SaaS software anytime, anywhere, you
00:04:30
know, if I can find that. And we were both in Athena way back,
00:04:35
and that's a version that's like a 70% because it's SaaS-software-
00:04:39
enabled service, really. But SaaS, if you can get 80-90%
00:04:43
margins, you know, like, great. And I feel like we're back there
00:04:46
in terms of being able to sell software to providers and health
00:04:51
systems, where, you know, EHRs have effectively scooped up all of
00:04:55
those dollars, you know, for the last 20 years, and now they're open
00:04:58
for business. But I think the other part
00:05:01
is impact. And I think Bob and I are both
00:05:03
in this for impact. And so I think you know, like
00:05:07
you just had Prashant from Aikido, one of our companies and
00:05:12
that's impact, you know, like embedding, obviously, AI with
00:05:17
care delivery, you know, like, see the impact, you know,
00:05:19
love that. We just sold a company called
00:05:22
CareBridge. And you know, if you, if you
00:05:25
actually looked at that model, that was the dual eligibles LTSS
00:05:29
in the home where you were taking risk on them.
00:05:31
But the first part of the business was the wedge and that
00:05:34
was, they became the standard for evaluating what kind of
00:05:39
caregiver should be in your home and for how many hours.
00:05:43
And if you can believe it, there's literally no standard in
00:05:45
America for that. And so they created a standard
00:05:48
and algorithms around it to support it, and that's really
00:05:51
what health plans first bought. And then they took that wedge into
00:05:54
taking risk on all of those patients.
00:05:56
So you know, to me, impact matters.
00:05:59
Bob, you have businesses that are making you salivate right
00:06:01
now. Absolutely.
00:06:02
But first, thank you, Eric and Nayeema, for bringing us together.
00:06:05
This is super cool and I love that you taught us some Latin.
00:06:09
Yeah, I checked it. I actually did.
00:06:11
I have a, you know, my wonkiest friend from college who's like a
00:06:14
classics professor. I did call him up and made sure
00:06:17
that it was like somewhat coherent as a name in Latin.
00:06:20
So it was blessed. I'm glad I took linguistics.
00:06:24
It's a good thing. And to Annie's point, impact
00:06:29
matters. That's why we do this job.
00:06:31
We could be SaaS investors and that would suck.
00:06:35
Helping dual-eligible people live better is like what we
00:06:38
actually want to do. I think there are three things to a good
00:06:41
business model. And when it's happened for me,
00:06:44
the first one is: get paid upfront, a lot of money.
00:06:47
Most healthcare businesses that I'm involved with and see don't
00:06:50
get paid a lot upfront. They get paid later and not
00:06:53
enough. And then you're like hoping to
00:06:54
make it up on value later. And then cash.
00:06:57
Cash is king. And having it first is really
00:06:59
good. That's number one. Number two is software. Software has a lot better
00:07:03
margins and you can do a lot more when you have more margin
00:07:05
and then #3 is have very long contracts, just hard to do in
00:07:09
the US but the best business I'm involved with from a business
00:07:12
model perspective is 1 called me.
00:07:14
I knew you were going to say that.
00:07:15
I was like, yeah, yeah, I have a friend
00:07:16
who works there. Yeah, yeah, yeah.
00:07:17
It's a company we started in Korea.
00:07:19
You know, in Korea, people buy 20-year
00:07:22
insurance policies and they can't get out of them, and so
00:07:25
we're attached on day one, and in 20 years they need you.
00:07:28
And so you get paid for 19 years before they need the service.
00:07:30
And so that gets you all the cash up front, and then you do
00:07:33
the thing, and it's all software.
00:07:35
And so one thing we should do in the US to really improve
00:07:37
preventative care and make everybody's life easier is to
00:07:39
have multi year insurance models.
00:07:41
The idea that you can switch every year means that you
00:07:43
have no time for ROI. And so I hope that
00:07:46
something that comes out of this is a multi-year policy, because then
00:07:49
you can do a bunch of the stuff that we all want to do and have
00:07:52
an ROI period that works. Yeah, we couldn't have a sort of
00:07:55
AI oriented event without asking about the investability of like
00:07:59
foundation model startups. Like what is your view on, you
00:08:03
know, wrapper companies, companies that are competing
00:08:06
directly against OpenAI and Anthropic?
00:08:09
Like how much have you been enticed to invest in businesses
00:08:12
that are spending a lot of energy building their own models?
00:08:15
And how much have you been wary or not of businesses that rely
00:08:18
on OpenAI and others? It's a big question.
00:08:24
Big question. I'll make him go first.
00:08:25
You had to go first on the first one. I think competing
00:08:28
against the foundation models is crazy.
00:08:30
Don't do it, and I wouldn't do it.
00:08:32
All of our companies are using AI inside to make the product
00:08:35
better, the margins higher and the last panel talked about the
00:08:39
need to have multiple models and switching across them.
00:08:42
I think the multiple models work awesome and you can switch
00:08:44
across them and it seems like you can train them pretty
00:08:46
quickly to do almost everything you want to do.
00:08:49
And so I wouldn't want to be against that, right?
00:08:51
I think about AI making my ideas to serve patients just a lot
00:08:55
more scalable, a lot more profitable, a lot more
00:08:56
effective. I mean, you can listen to
00:08:59
elderly people who are lonely and make them feel better.
00:09:01
You can do every language now. You can tailor the care model or
00:09:05
the dietary recommendations to any kind of diet.
00:09:08
So it makes everything
00:09:11
Do you agree with that, or? I agree with that.
00:09:14
And I think the reality is it's a safer place to
00:09:18
play, in the sense that they're not going to do the last mile.
00:09:22
I mean, they might do it. If you think about drug
00:09:24
discovery, I actually think they're going to be working on
00:09:26
models that will effectively compete with those
00:09:31
that are working in drug discovery companies.
00:09:33
But I also think they will partner with pharma and with
00:09:36
biotech companies, but they will be closer to the
00:09:40
product creation, and probably in a way of royalties, unlike some of
00:09:45
the classic companies right now that are doing just drug
00:09:47
discovery. But I think in, you know,
00:09:50
healthcare service, it's just hard, you know. Like, I mean, if
00:09:53
you actually looked at the volumes at OpenAI, I mean,
00:09:56
the reality is it's the tools, right? Developer tools are
00:09:59
off the charts in usage. And then you, you know, you do
00:10:03
have the, you know, the ambient companies, the Abridges in
00:10:06
healthcare, that have, you know, real volumes, and then
00:10:09
sort of, you know, everything else falls off.
00:10:12
And drug discovery actually doesn't
00:10:14
create much traffic for the models. But so, you know, I think the
00:10:20
reality is that that last mile is really hard.
00:10:23
They're not going to want to do it, from a go-to-market motion,
00:10:26
from actually understanding the edge cases.
00:10:28
I mean, they can. They can.
00:10:30
And this is the risk. You're saying the reason OpenAI might
00:10:33
not fully embrace it is it's just like...
00:10:35
The guardrails. The guardrails here.
00:10:37
Sort of a, no, it's just like, yeah, how
00:10:39
quickly you can, like, scale these things.
00:10:40
Yeah, right. Like when you have
00:10:42
some coding revenue coming out of you, it's like, oh, this is great.
00:10:44
Like, let's do that. Yeah, let's do that.
00:10:46
What I want to ask the audience, honestly, just straightforward: AI
00:10:50
in health, overhyped or underhyped?
00:10:55
Just let's get the pulse of the room. Overhyped or
00:10:58
underhyped, AI in health? All right.
00:11:00
Who says overhyped?
00:11:04
Wow, we've got the most optimistic crowd
00:11:07
ever. Very non-jaded, not skeptical.
00:11:10
Underhyped? Yeah, it's like everyone is
00:11:15
still a true believer. No one has
00:11:16
fallen off yet. Yeah, yeah, it's a good,
00:11:19
early crowd. I think the reality is both things
00:11:22
can be true, right? I mean, I think the reality is
00:11:25
there are 800 companies that have been funded in the last three years,
00:11:28
you know, doing AI in healthcare.
00:11:30
And the reality is that some subset of those, just like Google
00:11:35
and Amazon, you know, came out of 2000 and the mania there.
00:11:38
I mean, this is a mania, and valuations are going to be, like,
00:11:41
way beyond, for every company, probably, what they should
00:11:44
be. Except for the companies
00:11:48
we bet in. A bunch of businesses, and then
00:11:49
some of them are... Exactly. It's the nature of the venture
00:11:51
business. I think that, I think it's such
00:11:53
an extraordinary moment. Like, I've never seen a moment in my
00:11:56
30 years in healthcare where we're actually going to make a
00:11:58
difference. We're actually going to, we are
00:12:00
actually going to lower costs. We are actually going to make
00:12:02
providers and clinicians lives better and patients lives
00:12:04
better. So I think it's worth it.
00:12:07
And I also think, you know, you're going to have the top 5%
00:12:11
of companies be amazing, you know, and then
00:12:14
you're going to have, probably, people will
00:12:16
probably get lucky in some, like, point solution companies
00:12:20
that, you know, get sold. You know, there'll probably be, I
00:12:22
don't know, a couple hundred companies sold, and then, you know, there are
00:12:25
going to be the other 600 of the 1,000 that have been created, you
00:12:28
know, walking dead. Moving
00:12:31
on to sort of a new topic. Some companies die in startups,
00:12:35
spoiler alert. Bob, the GLP-1s have just sort of, like,
00:12:40
changed the longevity conversation.
00:12:42
Like, you know, this, this idea like treating longevity means a
00:12:46
lot of things to different people.
00:12:47
It's obviously sort of an amorphous word.
00:12:49
I mean, some of it has been like, oh, let's think outside
00:12:51
the medical system. Some of it to people, I think is
00:12:54
like individualized care. Some of it's treating healthy
00:12:57
people. But like what to you has been
00:12:59
the lesson from GLP-1s? And, like, how has it changed the
00:13:02
startups you're looking at? I mean,
00:13:05
They're the greatest medicines I've ever seen, I think, except
00:13:07
for cancer drugs. As a doctor, there's so many
00:13:11
patients I've spoken to about, you know, how to be healthier or
00:13:14
lose weight, change their diets. And for some people, no matter
00:13:18
what they try, it doesn't work. And then you give them a GLP 1
00:13:21
and it works awesome and everything gets better.
00:13:23
Their cardiovascular risk, their cancer risk, their energy;
00:13:26
their diabetes goes away. And the prices for these drugs are
00:13:29
falling to the point that they're going to become widely
00:13:31
accessible to people, and orals are coming soon.
00:13:33
And so this is a revolution in health for people on
00:13:37
Earth. And I hope that they're more
00:13:39
accessible to more people faster.
00:13:40
At the end of the day, I think pairing it with lifestyle
00:13:43
changes and an Oura Ring and sleep and stuff makes them
00:13:45
better. So I hope people do that too and
00:13:47
don't just take them and then like, eat McDonald's.
00:13:50
But I think that we're at the beginning of a
00:13:53
lot of biological insights around how to live longer,
00:13:56
healthier, and better. But the key thing is years of
00:13:58
health, where you can do things, where you're alive and
00:14:01
well. And I'm very hopeful that we're
00:14:03
going to have a bunch of improvements in our joints and
00:14:06
in our weight and in our metabolic health and in our
00:14:08
cancer care to help us live a little longer.
00:14:10
And I think the next 10 years are going to be actually a kind
00:14:13
of like there'll be a wave of longevity type things that
00:14:15
really work. And I think about this from the
00:14:18
Medicare Advantage insurance side. Annie and I are together at
00:14:20
Devoted Health, and we think about, at Devoted, how do we
00:14:22
improve the lifespan and health of our patients and do it
00:14:25
cost-effectively. And GLP-1s are now becoming part of the
00:14:27
cost-effective approach to doing that.
00:14:29
Yeah, but it
00:14:31
goes back to, you're actually saying,
00:14:34
as an insurer, you need to own the life, you know, long enough to
00:14:37
actually pay for it. At Devoted,
00:14:39
we keep them for more than five years, and so we can do a lot of
00:14:41
things, investing in their health and well-being.
00:14:43
But you can't do that in a commercial plan,
00:14:45
where people switch every year. For sure. Yeah.
00:14:47
Have they changed your philosophy on investing, the success of
00:14:51
GLP-1s? No, not...
00:14:54
Not really. I mean, the reality is,
00:14:57
I think, I mean, we're in Noom, you know,
00:15:00
that's certainly part of their offering.
00:15:03
And I think it absolutely makes an impact on people's
00:15:06
health, and they make it more available through compounded
00:15:08
drugs. But I think the whole area of
00:15:11
longevity, other than, frankly, GLP-1s, is overhyped.
00:15:16
You know, I mean, you have
00:15:20
Function Health speaking today. And then I think the reality is
00:15:23
90% of the tests that they do can be reimbursed, you know,
00:15:26
if you go to your doctor, you know.
00:15:28
Oh, good. Well, yeah, well, I'm there.
00:15:31
But on GLP-1s, I think the big idea, and
00:15:34
there's data from a couple companies, Omada and Virta being
00:15:37
the two that I've seen the most data on, Virta we're investors in,
00:15:40
is that you can get people off of them and keep the weight off.
00:15:42
So what I love about GLP-1s is that anybody who takes them will
00:15:45
lose weight. And then that's the moment that
00:15:48
you say, let's change your diet because you're not hungry when
00:15:50
you're on a GLP-1. So I can then change your diet
00:15:51
around. I can teach you how to eat a
00:15:53
healthier, lower carbohydrate diet that makes you feel good
00:15:58
and then get you off of the medicine and keep the weight
00:16:00
off. He is keeping the weight off and
00:16:01
not bouncing back and not taking it forever because we're not
00:16:05
sure actually taking them forever at the doses of GLP, 1
00:16:08
weight loss doses is a good idea.
00:16:10
But for sure for a year to get you to where you should be, it's
00:16:13
a good idea. And if you keep the weight off,
00:16:14
then it's that's the best thing. And it's like you see a lot of
00:16:17
people doing that and then it's very cost effective.
00:16:20
You know, like, cutting-edge medicine I get, you
00:16:22
know, or, like, you know, Wegovy.
00:16:25
My wife did a BRCA cancer screening.
00:16:28
But those are still like medical establishment provided things.
00:16:33
Do you think there are things that like the Silicon Valley
00:16:35
connected type is doing in terms like buying their own tests that
00:16:39
you're like, yeah... Oh my God, when your rich friends
00:16:41
are calling you, and you're like, you're a doctor and you know
00:16:42
everybody. My rich friends call me.
00:16:44
I say, what the fuck are you doing?
00:16:46
You're like, go to a doctor. You're like, don't
00:16:47
ingest the stuff. Or is there anything you're like, oh,
00:16:49
that's a good one, you should do that?
00:16:52
Oh my God. All right, you're asking for
00:16:55
non-evidence-based things.
00:16:57
Literally a good idea to say in public, yeah.
00:17:00
A lot of people would say that metformin probably is a
00:17:03
good idea for most people, if it's impactful and it
00:17:05
makes you live longer. Maybe rapamycin is good for your
00:17:08
joints. I don't know, but there are many
00:17:10
people who think it is, maybe. And then we get down the path of
00:17:14
supplements, and, I don't know, people feel better on magnesium.
00:17:16
So that's probably good, but I don't know.
00:17:18
But you're wary.
00:17:19
I don't know. Lithium.
00:17:20
I mean, yeah, they put out this study on lithium this week,
00:17:24
actually, where it improves your memory and may reduce dementia-
00:17:27
inducing amyloid plaques. I'm like, OK, we can go to the
00:17:30
CVS and get that right here.
00:17:34
Moving on to sort of the policy conversation. I mean,
00:17:37
both of you, I think, were excited, you know, about a potential
00:17:40
upside of a Republican administration: that they
00:17:42
would embrace Medicare Advantage.
00:17:44
Have you seen that? What's your read specifically on
00:17:47
Medicare Advantage? And then I'll ask more broadly
00:17:49
what your reaction to sort of the Trump health policy has
00:17:52
been. All right, Annie, you're
00:17:54
up. You're
00:17:55
up. OK.
00:17:57
So yeah, you know, like, weirdly, both Trump
00:17:59
administrations have had a number of my CEOs actually
00:18:03
involved and they've been, I would say, much more
00:18:06
constructive than the Biden administration, unfortunately,
00:18:10
in healthcare. So I would say there are two
00:18:11
things. I mean, obviously Medicaid is
00:18:13
actually, you know, the one that people are most worried about
00:18:17
and then, obviously, the subsidization of the exchanges.
00:18:23
My biggest fear is actually that the exchanges go away because
00:18:25
you don't do the subsidies, you know.
00:18:28
Then, you know, the cost of being on the
00:18:31
exchange goes up so much that the healthy
00:18:33
people just get off it. They're not going to pay,
00:18:35
basically. Medicaid funding has been cut
00:18:37
because of work requirements, yes. Work requirements, you know.
00:18:42
And then if they don't have that money flowing in, well, and then...
00:18:44
And, but, but also, what about the
00:18:46
changes? Because there's still all the people under the ACA
00:18:49
who will be subsidized on the exchange. It's going to be about 2
00:18:51
million people lost from the subsidies' expiration.
00:18:53
The 22 million will stay. So there's enough healthy people
00:18:56
there. Why does...
00:18:57
I, you know, I can only talk from one state, but we're
00:18:59
down to like one insurer willing to be in that.
00:19:02
Well, that's the question:
00:19:04
is it exciting for insurers? Yeah.
00:19:05
Does it make sense for insurers? Because, you're going
00:19:07
to, I think, with the risk of that population, you have a less
00:19:10
healthy population. And then in those exchanges and
00:19:13
on the MA side, you know, I think it's interesting,
00:19:17
because I think, Republicans and Democrats, like, obviously,
00:19:20
that's one of my CEOs said: Republicans have now
00:19:23
become Democrats, or at least the base that's supporting them.
00:19:25
You know, the base that was supporting Democrats is now
00:19:27
supporting Republicans, and Republicans' old base is now supporting
00:19:29
Democrats. So what does that mean for our
00:19:31
policies? And I would say that
00:19:35
they're going to be friendlier, but we'll see
00:19:39
how friendly at the end of the day, because if you actually
00:19:41
looked at the legislature, they see that the citizens don't
00:19:44
like health plans.
00:19:47
They blame everything on the health plan.
00:19:49
So, therefore, MA... and you've got not-for-profit health
00:19:53
systems also blaming MA.
00:19:56
So you've got, like, two groups that are now dumping on MA,
00:19:59
and it'll be interesting to see where it all comes out.
00:20:01
I think, when Doctor Oz wakes up in the
00:20:02
morning, after he takes whatever supplements he takes,
00:20:06
he thinks to himself, how can I grow the MA
00:20:07
business for the world? And so I think that they're going to make
00:20:10
stars easier; more plans will be in four-plus-star plans. I think
00:20:14
they're going to, like,
00:20:16
in healthcare in general, put more money into the
00:20:18
system and want it to grow. I think they're going to
00:20:21
pay for food as medicine, because that's an RFK Jr. idea. And I
00:20:26
think that at the end of this, Medicare Advantage will be
00:20:28
quite a bit bigger than it is today. Yeah. So you're
00:20:31
Optimistic on that? Well, I just
00:20:33
wake up in the morning thinking that every Republican is wearing
00:20:35
a T-shirt that says "I heart MA," because they like the
00:20:37
privatization of Medicare, and they would like it
00:20:40
to be bigger. And so, while they're making it...
00:20:43
But Annie is saying that, like, the voter base doesn't square anymore,
00:20:46
that they would continue. I know, that's the
00:20:47
confusing part, because everything's diffusing. But I
00:20:49
think that, at the end of the day, like, Republicans love MA,
00:20:51
and that's just one of those things that is, like, a
00:20:54
dogmatic statement.
00:20:56
And I think they don't like the sort of burden of the
00:20:59
current stars program, all the complexity that we put in it
00:21:03
as a way to make it better. So I think they're going to let
00:21:05
it grow, with the asterisk being the healthcare cost
00:21:08
concerns and their coding practices, and they're
00:21:10
going to get a bunch of scrutiny. And the vaccine
00:21:12
stuff, like, what's your reaction to that?
00:21:14
Like how damaging or not do you think it will be?
00:21:19
I think it's... Well, I think it's actually very
00:21:22
damaging. But I do think, you know, for
00:21:25
example, we're actually in a vaccine administration company
00:21:28
that's doing very well. But last year when you had
00:21:33
states like Texas and Oklahoma, you had several, you know, I
00:21:39
would say towns that did not embrace vaccines.
00:21:42
And those towns ended up getting the measles.
00:21:45
And then more, you know, more children went to the hospital.
00:21:48
You had a couple kids die. Our vaccine administrations for
00:21:52
measles went up 50% last year. So I think that that's all it
00:21:56
takes. And I do think you mostly have
00:21:58
rational physicians. I think physicians are, you
00:22:02
know, saying that vaccines are good in general, you know, and
00:22:05
most of the list is there. Obviously. But our
00:22:08
pediatrician won't even recommend a non-vaccine-
00:22:11
providing doctor. You don't want to take vaccines? Like, good luck.
00:22:14
Vaccines are the safest intervention that exists
00:22:17
in healthcare on the Earth, period.
00:22:18
They work. You should get them, you should
00:22:20
recommend them. I give them to people, and I
00:22:23
think that most people actually believe that and we'll get them.
00:22:26
I think it's sad that we're making them harder to get.
00:22:29
I think it's sad that now they won't always be covered, so
00:22:32
you'll have to pay out of pocket. But my 84-year-old aunt just
00:22:35
sent me a picture from this morning from Safeway where she
00:22:38
was getting her COVID shot. I mean, she bought herself a
00:22:40
donut and it was a good day for her.
00:22:41
And so I'm happy about that. Great.
00:22:44
Bob and Annie, thank you very much.
00:22:51
Here's a conversation with venture capitalist Vinod Khosla.
00:22:54
I sat down with Nayeema Raza, my Deus Ex Medicina co-host, to
00:22:58
talk with Vinod about all the major themes in health.
00:23:03
You've been making predictions about healthcare
00:23:05
since, I think, longer than we've been alive.
00:23:09
You know, the last long term prediction I did was in 2016,
00:23:13
about 10 years ago, and I would say we are well ahead of
00:23:17
schedule. I predicted 8 three-year cycles
00:23:21
of innovation. So I had assumed 3 year cycles
00:23:27
of significant innovation. That's 25 years.
00:23:32
So I figured 2040, but I think things are happening much
00:23:38
faster than I ever imagined. One of the
00:23:40
things you've said is that innovation in AI is not going
00:23:43
to come from the giants, it's going to come from startups,
00:23:46
and you've held strong to that.
00:23:47
So when you look at it, who do
00:23:49
you see as the kind of Tesla? You're
00:23:52
famously, you know, a big fan of what Elon Musk has done with
00:23:55
the electric vehicle. Who do you see as the Tesla of
00:23:57
AI and then the Tesla of healthcare?
00:24:00
Well, I don't think the Tesla of AI has
00:24:03
emerged, OK, but it will emerge. So if you look at healthcare
00:24:10
companies today, and I speak to all of them, I speak to their
00:24:13
boards and executives, they are looking at things incrementally.
00:24:19
And so, yes, and I'm very glad they'll all introduce Abridge,
00:24:24
one of our companies, so great. But that's taking the current system
00:24:32
and incrementally adding AI to it and automating.
00:24:39
There was a board I was speaking to recently.
00:24:41
I said, clearly do that, and you can take costs out and improve
00:24:45
efficiency, in fact, just like Abridge does.
00:24:50
There's a company in the UK called Tortus, and they just
00:24:54
announced a major study with the NHS, with doctors adopting their
00:25:01
system. It is pretty stunning.
00:25:04
Doctors got 25% more actual face time with patients.
00:25:12
And so my tweet was: suddenly they've increased the doctor
00:25:16
supply in all of the UK by 25%. That's a pretty big deal.
00:25:22
It would have taken them two decades to do that any other
00:25:26
way. But that's not the key question.
00:25:31
The key question is: if you believe all healthcare
00:25:35
expertise, and it doesn't matter whether you're
00:25:37
talking about a primary care doc, a mental health therapist, a
00:25:41
psychiatrist, an oncologist, a gastroenterologist, a physical
00:25:46
therapist, a health coach, if all of it was free,
00:25:50
how would you design a healthcare system? That's a very
00:25:54
different system than today's system.
00:25:58
Venture capitalists have been throwing themselves against
00:26:01
those rocks for, you know, decades, sort of like, there are
00:26:05
constraints. Well, there were constraints in
00:26:08
automotive too, lots of regulation, lots of standards,
00:26:12
and what it takes is not rocks. What it takes is a great
00:26:20
entrepreneur, like Elon did.
00:26:23
And you don't look across and see that.
00:26:24
Yeah, you say it hasn't happened yet.
00:26:26
I haven't seen a person fundamentally trying to redesign
00:26:31
the system on the assumption that all expertise costs zero.
00:26:36
So you'd never ration access to an oncologist or a neurologist.
00:26:41
You'd provide it upfront, day one first conversation because
00:26:46
it's the same cost as a nurse. Do you think that
00:26:48
might happen outside of the United States? Because, I mean,
00:26:51
one of the things that we've been hearing from people today
00:26:53
is how the US is lagging behind on healthcare.
00:26:56
I think it might happen in the
00:26:59
developing world. Now, developing-world costs are
00:27:03
very low for expertise, so a doctor in India doesn't cost
00:27:08
very much, but it might happen there.
00:27:14
I do think, originally, when I first wrote my blog in 2016,
00:27:20
I thought the uninsured population in the US, which was
00:27:26
back then about 40 million people, a pretty good size, would
00:27:29
be the right place to start, because they had no other
00:27:33
alternative. But instead we took a lot of
00:27:36
those people away. I wanted to pose what to me
00:27:41
is like the existential question of this conference, right?
00:27:45
We've spoken before at the Cerebral Valley AI Summit, which
00:27:48
sort of takes as the premise that the smartest companies will
00:27:52
be the general purpose models. And obviously you are the first
00:27:56
venture investor in OpenAI, you're highly aligned with
00:28:00
OpenAI, but you also have health investments, obviously many of
00:28:02
them as well. Like, what's your heuristic?
00:28:06
How do you decide this is the domain of the juggernaut, of
00:28:11
OpenAI, of the general purpose model, versus, say, it's a
00:28:14
health-specific challenge? In a lot of areas. I like to
00:28:19
joke that every time OpenAI releases a model, half of the YC
00:28:24
batch goes out of business. That's just the standard pattern.
00:28:29
So if you're a startup interested in health: yes, they're
00:28:32
interested in health, and they're
00:28:34
interested in a lot of different
00:28:37
areas, and they should be, and all the major guys will do all these
00:28:41
things. But what has happened
00:28:43
traditionally is, because there's not one model provider and you
00:28:50
can easily move between model providers, the question will be: on
00:28:57
top of the model, will OpenAI or Anthropic or
00:29:02
Google have an advantage? So what value can they add for
00:29:07
the specific application? Do they really want to pull up a
00:29:13
patient's record out of Epic before giving advice?
00:29:19
Tricky question. I couldn't say never, but it's
00:29:23
not looking likely that they're going to write you a
00:29:26
prescription today. And I think there are many, many
00:29:31
things you can do that would be startup territory.
00:29:38
So I think there'll be great startups.
00:29:41
I don't care what OpenAI does. It doesn't compete with Sword
00:29:45
Health. Sword Health is doing physical
00:29:48
therapy at a level, and growing faster, than I would ever imagine
00:29:53
for any healthcare startup, with the exception of Abridge, which is
00:29:58
doing really, really well. Fortunate to be investors in
00:30:03
both, but there is going to be value add.
00:30:08
Physical therapy is one of those where an AI is watching you
00:30:11
and guiding you and handling billing.
00:30:15
And now, people like Amazon are taking care of the fact that
00:30:20
they'll bill your insurance. Will OpenAI do that? Will Google do that?
00:30:25
Possibly, yeah. We're going to ask you what
00:30:27
companies you like outside of your portfolio, Vinod, but I
00:30:30
do want to stay on this idea. Outside of AI, you're seeing obviously
00:30:34
massive tailwinds around GLP-1s.
00:30:37
How has that, and the willingness of consumers to pay out of
00:30:41
pocket, changed your
00:30:43
thesis on healthcare, if at all?
00:30:45
I think consumers have always been willing to pay if it's cost
00:30:50
effective, not the insurance price.
00:30:53
This is the only market where buying one of something is
00:30:57
cheaper than buying 100 of something.
00:31:00
That's the unfortunate part of the healthcare system.
00:31:05
GLP-1 is a good example. But you know, four or five years
00:31:11
ago, people said people will never pay out
00:31:15
of pocket for cardiac care. AliveCor has 300
00:31:19
subscribers, and retention is 90-some percent year to year.
00:31:25
The cardiac patients died. But other than that, incredible
00:31:31
retention, incredible engagement.
00:31:33
The typical patient is taking six ECGs a month.
00:31:39
You'd be lucky if you're in the best healthcare system and got
00:31:43
six ECGs in two years, right? That's per month.
00:31:48
Now they have the data, millions and
00:31:53
millions of ECGs per month, usually in context, and they can
00:31:58
do something like say, hey, and by the way, this
00:32:05
doesn't even take FDA approval, we're watching your ECGs,
00:32:09
you're taking them more than once a week.
00:32:13
Something's changing. Come on in,
00:32:18
talk to our cardiologist. So it's just an alert saying you
00:32:23
need to check in. It's incredibly valuable.
00:32:26
And the number of visits to emergency rooms goes through the
00:32:31
floor. So that's an example of
00:32:34
something the big model companies aren't going to do.
00:32:37
They aren't going to take that many ECGs and provide coaching
00:32:40
around that. So Sword is an example.
00:32:44
That's an example. There's plenty of examples like
00:32:48
that. When you think about, you know, your
00:32:55
own health, or, you know, recommending to your friends,
00:32:58
like, how much do you think all the things you're doing are
00:33:01
recommended by a doctor, or do you think... Everything
00:33:05
I do is recommended by ChatGPT, and then I check with the
00:33:09
doctors for safety. Is that OK? Since you're being
00:33:12
sincere... I am being sincere.
00:33:12
Your doctors are like, oh my God, why are you
00:33:14
bringing this to me? Or, I mean, I just ask them.
00:33:18
And if they disagree with ChatGPT, I ask another doctor.
00:33:23
Because, so here, let me give you a
00:33:28
recent study out of Stanford. Arnold Milstein, who's probably
00:33:35
one of the better-known professors in the country on
00:33:39
quality of medical care, did a multi-center study.
00:33:45
Human doctor performance for this was for complex disease
00:33:48
diagnosis at pretty fancy institutions like Stanford.
00:33:52
Human accuracy in complex disease diagnosis was 73%.
00:33:59
That means 27% of the patients with complex disease,
00:34:03
and this is not you-got-the-flu, got the wrong diagnosis or a
00:34:07
suboptimal diagnosis. The AI, which was out-of-the-box, it
00:34:14
wasn't really tuned for medical practice,
00:34:17
was 88%. And then they gave the AI to the
00:34:22
doctors. The doctors improved, but from
00:34:25
73% to 76% and they degraded the AI from 88 to 76, right?
00:34:31
And that is the reality of it.
00:34:34
Wow. You don't want to, like, have the doctor mess it up.
00:34:37
This is why, if a doctor disagrees with ChatGPT, you
00:34:43
should ask another doctor and another doctor and another
00:34:45
doctor.
00:34:46
What happened when they gave the GPT to the doctors?
00:34:52
They couldn't use it, you know. It's who you put in charge.
00:34:58
If the doctor's in charge, they will introduce their biases.
00:35:03
And one of the biggest biases a doctor has is this recency
00:35:07
bias: which patient did they see
00:35:09
recently, and what did they have? Do you think, though...
00:35:12
There's, by the way, plenty of studies to show that if the
00:35:15
New York Times mentions a disease, its diagnosis in the
00:35:19
country goes way up. It's this recency bias.
00:35:24
AI has some recency bias, surely, building in. Hopefully
00:35:28
not, but yeah, that's right. When you look at the future, one
00:35:32
of the things, when Munjal Shah was here earlier today talking about
00:35:34
Hippocratic AI, and when asked how many doctors will
00:35:38
there be in the world in 2040, he said there'll be admin
00:35:42
doctors as people; the clinicians will all be
00:35:44
artificial intelligence agents. You know, Tom Hale was here from
00:35:48
Oura, and I asked him if the ring would one day
00:35:52
connect to an agent, and he said that he and Munjal had been
00:35:54
talking about that. Is there going to be a funnel in
00:35:57
healthcare? How close are we?
00:35:59
Healthcare is a large spend. There are four major chunks to it.
00:36:06
One is doctors and expertise outside the hospital.
00:36:10
Another slice is drugs. Another slice is testing and MRI
00:36:15
and imaging and blood tests and X-rays. And then there is in-hospital care.
00:36:20
So roughly say each one is a trillion dollars.
00:36:25
The first portion, which is medical expertise: is it going to be
00:36:31
better in an AI? No question, if we let it get
00:36:35
there. The AMA is completely opposed to AI practicing
00:36:40
medicine or prescribing, because for a dollar they lose a $150
00:36:46
physician visit in the clinic. That's the reality, right?
00:36:52
I've talked to the president of the AMA, he's stepped down now,
00:36:55
half a dozen times, and they always avoid the topic.
00:37:01
But there's going to be easy ways.
00:37:03
I was just talking to Doctor Oz and I said, so how many people
00:37:09
familiar with Medicare Advantage?
00:37:11
A few hands. The odd thing about
00:37:16
Medicare Advantage: if you talk to companies like Humana that
00:37:21
are some of the largest in Medicare Advantage, they spend I
00:37:25
would say 80% of their effort on risk-scoring a patient up, not on
00:37:33
medical care. They provide almost no care.
00:37:37
And now, what I told Doctor Oz is Medicare Advantage should have
00:37:41
an AI score the patient. If a provider like Humana
00:37:47
disagrees with the scoring, they can appeal it to a human.
00:37:52
But then, if they're wrong, they pay for it.
00:37:56
It's simple. Did you get the read that they are going to do
00:38:01
more in Medicare Advantage or less, or what was your read?
00:38:03
I think there's definitely a lot
00:38:04
of interest in doing that kind of thing. But why would you let
00:38:09
the fox into the henhouse and let them say how complex
00:38:13
this patient is, so pay me more?
00:38:16
What is your outlook on what the Trump
00:38:18
administration has done on healthcare so far?
00:38:20
What kind of grade would you give,
00:38:22
A through F? Well, I think... RFK? Yeah, I know.
00:38:26
If there's a grade low enough. What comes
00:38:30
after F minus, I asked. F minus
00:38:33
minus. Do you think he's going to last?
00:38:39
I won't comment on that. I don't know the inside
00:38:41
politics. He obviously cut a deal with
00:38:44
Trump during the election, which is a sad state of affairs where
00:38:49
our health policy and vaccine policies are auctioned off
00:38:55
during an election. But they are doing a lot of good
00:39:00
things. For example, I would say Doctor
00:39:07
Oz has a very sensible view of how to use AI.
00:39:12
In fact, I first wrote a 100-page document on the transformation
00:39:17
of medicine in 2016, called 20-Percent Doctor Included; it's a 100-page
00:39:23
PDF. And I have an e-mail from Doctor
00:39:26
Oz in 2016 commenting on my 100 pages, 10 years ago.
00:39:33
So he's very interested in that kind of thing.
00:39:37
I think the FDA is very interested.
00:39:41
The new head of AI at the FDA, Shantanu, is really very
00:39:48
interested in more adoption of AI here. And the CDC, any
00:39:53
thoughts on what's happening at the CDC?
00:39:55
Yeah, I have no exposure to the CDC. I don't deal with that
00:40:00
side of it. Put some of this in the context
00:40:02
of the competition with China,
00:40:04
obviously that's been an important issue for you.
00:40:06
How do you scale? You know, we were hearing
00:40:08
earlier today, certainly in developing drugs,
00:40:11
China is in some ways becoming the dominant place.
00:40:14
All our
00:40:16
companies are trying to go to China; getting first-in-human is easy.
00:40:22
Drug manufacturing is easy. There's still decent regulation, but trials are easier.
00:40:26
The US needs to loosen up. It's not a big deal.
00:40:29
It can be developed in China and then we can pay for it.
00:40:31
What's your sense of how this plays out?
00:40:34
Well, I don't think we will loosen up here.
00:40:38
So that's unfortunate. Fast forward now, there's
00:40:42
another path, right? AI based drug design can be done
00:40:48
pretty differently. Not for all diseases, you know,
00:40:53
a vaccine is a vaccine for everybody, but there's a lot of
00:40:58
diseases. So for example, in sickle cell
00:41:01
anemia, gene therapy can be pretty well done.
00:41:06
Here we have a company that's treated its second patient.
00:41:10
It's public data, completely cured of sickle cell, that kind
00:41:16
of precision. Preventative medicine, that seems
00:41:18
to be taking off in the United States.
00:41:20
Do you think, like, fast forward, do you still think a time
00:41:24
horizon for prediction should be 40 years, or should it now be
00:41:26
like 10 years? You can't predict past five
00:41:29
years in technology. Application of
00:41:34
technology is a different metric, especially if it has a
00:41:37
regulatory constraint. That's why I think it's
00:41:40
completely up for grabs how aggressive or regressive
00:41:44
the administration will be. I'm hopeful this administration, in
00:41:48
its quest to have fewer regulations, will be much
00:41:53
more aggressive in adopting AI and new technologies in this
00:41:58
country. And what are three things that you
00:42:00
think American policymakers, regulators should be doing to set
00:42:03
America up to beat China?
00:42:05
Well, first, assume AI will be smarter than humans
00:42:09
in almost every single area. Two, don't
00:42:12
side with the AMA; side with the consumer,
00:42:15
right? Because the government is such a
00:42:20
spender, a large spender, in healthcare: have a process for doing this
00:42:20
safely. Because safety, no matter what you think and no
00:42:26
matter what RFK thinks safety is,
00:42:32
safety is critical in healthcare. But you can still do
00:42:36
safety with a lot of trials. I mean, look at medicine.
00:42:48
The fundamental premise of medicine is the Hippocratic
00:42:52
Oath, which every doctor takes, and it's mathematically wrong.
00:42:57
It's the dumbest rule to have. If you can save 10 lives but do
00:43:02
a little harm, they won't do it.
00:43:07
You want the Bayesian, not the Hippocratic. But
00:43:11
You know, taking a progressive, logical first principles view of
00:43:15
what is the right thing to do, that's the right thing.
00:43:20
Great. Well, we could talk all day.
00:43:21
Vinod Khosla, thank you so much.
00:43:23
Great. Thank you.
