WOYM: Nvidia GTC Conference, Fed Watch, Top Decile Venture

Guests:
Ram Ahluwalia & Justin Guilder
Date:
03/25/24

Thank you for listening to this episode!

Support our podcast by spreading the word to new listeners. We deeply appreciate your support!

Episode Description

"- Nvidia GTC Conference - Fed Watch - Top Decile Venture - Economist Cover - Factor Rotation"

Episode Transcript

Speaker 1 [00:00:00] All right. Hey, Justin. How are you? Welcome back. 

Speaker 2 [00:00:03] Ram, I'm doing well. Good to be back. Yeah, a week away on vacation. 

Speaker 1 [00:00:08] You missed a lot on AGI. Jensen said AGI in five years. I actually had that same prediction the day before, by the way. I was like, I've got my timetable. After GTC, which is their annual conference, last Monday, I recommend folks take a look: I've got a video, it's pinned to the channel. It's a 30-minute walkthrough of the highlights with my annotations on the video. So if you want the skinny on it, I highly recommend checking that out. Anyway, Justin and I thought what we would do is a walkthrough of the Lumida Ledger newsletter that goes out each weekend and just riff off some current events. Things are breaking so fast every day; so much happening. 

Speaker 2 [00:00:58] Yeah. No, I think it's a great idea. And I also was spending time thinking about artificial intelligence on my week off. I did listen to a Lex Fridman podcast, and the guest was not as bullish on AGI. I think it's an interesting one, so I'll share some of that as we go through this. 

Speaker 1 [00:01:21] Yeah, I want to hear that. I did listen to the Sam Altman interview with Lex. It was a waste of time. It literally was a waste of time; there was no value added from it whatsoever. So don't waste your time. There's not much of a reveal there. Disappointing. Well, let's go through it. Some highlights here. Quick plug: we had a great interview on top-decile venture with VC Dave Lambert. Check it out on the Lumida channel, the non-consensus VC, non-consensus investing episode. You'll find that podcast there, as well as another one, which is one of my favorites, with Dr. al-Masri. You know, he's into crypto, and it was a really interesting conversation: PhD, entrepreneur, operator, investor. But let's get into it: Nvidia. Again, the video that's pinned to my channel has the deep dive, so if you want to really understand what's going on there, take a look at that. But I'll give you some of the quick highlights. One is: if aliens came to Earth and they wanted evidence of the advancement of our civilization, we would show them the Blackwell Nvidia chip. It is the culmination of human innovation. There's nothing like this; it's up there with putting a man on the moon. 

Speaker 2 [00:02:39] I loved your analogy to the Milton Friedman pencil example, because I think that's a beautiful example of supply chains. 

Speaker 1 [00:02:49] And. 

Speaker 2 [00:02:50] Taking that to a modern piece of hardware, that's a smart comparison of how far we have come as a society to be able to produce these types of components. 

Speaker 1 [00:03:03] I don't know if people get the pencil reference. Justin and I do. But the basic idea is: you have a pencil, and no one actually knows how to build or manufacture a pencil on their own, because of how specialized the components are. You've got to rely on others and on these incredible supply chains. So the other takeaway is: bet on the CapEx receivers. That's the obvious thing. You've got governments, corporations, and startups spending money. So the thing we know for sure is that the recipients of that capital expenditure spend, we call them CapEx receivers, are the winners. And we saw that today. This morning, Bloomberg reported that CoreWeave has a $16 billion valuation. 

Speaker 2 [00:03:56] Yeah, it was really exciting to see that. Obviously we had some involvement in their last secondary sale, where they were valued at about $7 billion, which was great. If you could share a little bit more about that: it was really exciting for our clients and for Lumida, and then to see that six months later. Yes, exactly. 

Speaker 1 [00:04:21] Probably five months later, which matters a bit if you're looking at IRR. But yeah, look, it's 2.3x. People are going to hear a lot more about CoreWeave, because we expect they'll go public in the back end of this year. So CoreWeave is a cloud AI data center provider. They actually started as an Ethereum mining operation, and when proof of stake was introduced, Ethereum mining went away. They pivoted the business into AI compute. An incredible story. And so they are one of the key providers of GPU compute for Microsoft and OpenAI, to the tune of a billion-plus in revenue. Okay, so when you hear that Microsoft is powering OpenAI, it's actually CoreWeave powering Microsoft, which is powering OpenAI. Now, Microsoft has some other compute elsewhere too, but CoreWeave is ultra efficient in how they generate GPU compute. There are all these little things that come together that determine efficiency: how much output do you get per unit of cost? The cost of processing a token. We should actually share our investment thesis with the public. We'll have to redact sensitive, confidential materials, but we'll do a post: our CoreWeave thesis. Now, by the way, last year we were approached to invest in OpenAI, Anthropic, and all this other stuff, and those were all FOMO trades. The demand for those was insatiable, by the way. So, CoreWeave is great. We're excited about that. 
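
To make the cost-per-token efficiency point concrete, here is a minimal back-of-envelope sketch. The GPU hourly price, throughput, and utilization figures are illustrative assumptions, not CoreWeave numbers.

```python
# Rough sketch of GPU-hour cost per token served.
# All numbers below are illustrative assumptions, not CoreWeave figures.

gpu_hour_cost_usd = 4.00        # assumed hourly rental cost of one GPU
tokens_per_second = 2_000       # assumed sustained inference throughput per GPU
utilization = 0.60              # assumed share of the hour doing useful work

tokens_per_hour = tokens_per_second * 3600 * utilization
cost_per_million_tokens = gpu_hour_cost_usd / tokens_per_hour * 1_000_000

print(f"Tokens served per GPU-hour: {tokens_per_hour:,.0f}")
print(f"Cost per 1M tokens: ${cost_per_million_tokens:.3f}")
# Higher utilization or throughput (better networking, scheduling, power and
# cooling) lowers the cost per token, which is the efficiency lever described
# in the conversation above.
```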

Speaker 2 [00:06:07] Linger there for us, because I think separating out the idea of a good business from a good investment is important, and worth lingering on. 

Speaker 1 [00:06:18] Look, OpenAI is at an $86 billion valuation. Now, we know that competition is proliferating, and it's not clear what the value of these foundational models is, because you have open-source models like Meta's Llama 2, or models on Hugging Face, which you can now rent on AWS or, in the near future, run on your laptop. So what's the value of those models? But people need the compute. They need the picks and shovels. They need the infrastructure. And it's easier to go from a $7 billion valuation to a $16 billion valuation, like we did with CoreWeave, than it is to go from an $86 billion valuation to $1 trillion. I'm sure OpenAI will do fine, maybe, but from a risk-adjusted return perspective it's just less interesting. 
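
For readers who want the arithmetic behind that comparison, here is a minimal sketch. The valuations are the ones mentioned above; the five-month holding period and the naive annualization are rough assumptions, so treat the output as illustrative.

```python
# Quick sketch of the "easier to multiply from a smaller base" argument above.
# Valuations come from the conversation; the holding period is approximate.

def required_multiple(entry_valuation, target_valuation):
    """Multiple needed to get from entry valuation to target valuation."""
    return target_valuation / entry_valuation

coreweave = required_multiple(7e9, 16e9)      # ~2.3x mark-up, roughly five months
openai = required_multiple(86e9, 1e12)        # ~11.6x needed to reach $1 trillion

months_held = 5
annualized = coreweave ** (12 / months_held)  # naive annualization of the mark-up

print(f"CoreWeave mark-up:   {coreweave:.1f}x")
print(f"OpenAI to $1T needs: {openai:.1f}x")
print(f"Annualized pace of the CoreWeave mark-up: {annualized:.1f}x per year")
```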

Speaker 2 [00:07:12] Exactly. I mean, it's the same thesis as your recent interview on early-stage venture investing, right? A $2 to $3 million valuation is very different from a $20 to $30 million valuation. 

Speaker 1 [00:07:26] Exactly. Here's another deal we passed on. Let me tell you: Elon Musk's xAI. We have access to a lot of pretty cool deals, and most of the time we're saying no, no, no. xAI was a hot deal. I shared that deal with a friend, originally from China, and he said, so we're going to spin up an SPV and do this, right? I said, what are you talking about? There's no revenue. It's a multi-billion-dollar valuation. It's yet another LLM. And Grok, which powers Twitter, still sucks. No. He's like, you know, people in China love Elon Musk, this will sell itself. It's not for sale; it's not in our business model to do that. We're not a broker-dealer or fiduciary, etc. But that is the thing: the FOMO to get into anything that's affiliated with a celebrity tech person. Whereas if you just get in at a $2 to $3 million valuation, you de-risk so much just on the basis of price. You exit at $20 million, you're up something like ten x, even after dilution. What is there to be sorry about? 
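
A minimal sketch of that early-stage math: the $2 to $3 million entry range and the $20 million exit come from the conversation, while the exact entry point and the 30% dilution are assumptions for illustration only.

```python
# Illustrative early-stage math for the point above: entering at a low
# valuation de-risks on price alone. Dilution is an assumed figure.

entry_valuation = 2.5e6      # assumed entry point within the $2-3M range mentioned
exit_valuation = 20e6        # exit valuation mentioned in the conversation
dilution = 0.30              # assume later rounds dilute early holders by 30%

gross_multiple = exit_valuation / entry_valuation
net_multiple = gross_multiple * (1 - dilution)

print(f"Gross multiple:  {gross_multiple:.1f}x")   # 8.0x
print(f"Net of dilution: {net_multiple:.1f}x")     # ~5.6x, still a strong outcome
```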

Speaker 2 [00:08:34] It's hard to stay disciplined like that. 

Speaker 1 [00:08:36] Exactly. You know, so, CoreWeave. We're excited, and you're going to hear a lot more about that. There's a great story about how I first learned about CoreWeave. We had been thematically focused on AI on Twitter, so a friend reached out to me. I've known her for ten-plus years. She said, Ram, I know the CEO and CSO of CoreWeave; if you guys want to get in, let's have a chat. And then that weekend we did the diligence. We did it rapidly; it took like a week, one or two weeks really. But we moved fast because it was a fast-moving deal. 

Speaker 2 [00:09:10] What's really exciting is to see them hiring the former Google Cloud VP of Finance as their CFO, because that's the signal you want to see from a company preparing for a public offering. 

Speaker 1 [00:09:24] These guys went from three data centers to over a dozen in the last year. Think about that. Think about what a data center is. It's a big plant. It's got to have sustainable, efficient electrical power. It's got to be loaded with compute in the form of these servers, and they have to be strung together with optics and then other software. I mean, there's a lot. They went from three to over a dozen. It's phenomenal execution. So we're going to interview one of the leaders at CoreWeave soon. Stay tuned for that. But we're thrilled about that investment. 

Speaker 2 [00:10:04] While we're talking about AI, let me share a little bit about the podcast I listened to that I thought was interesting. I think it reinforces the thesis on picks and shovels. It was really talking about the predicted shortcomings of LLMs and essentially their lack of a real-world model. And it was talking about how, to make the leap to artificial general intelligence, you, at least in this person's prediction, need a model that is both trained differently and also understands the real world through vision. Yeah. You therefore need a model that can predict video and can understand and interpret video. And, you know, you think about the hardware requirements to process that massive load of data. Yeah, it only increases. So whether or not that approach is the winner, to me it forecasts the need for even greater investment in the hardware, right? 

Speaker 1 [00:11:14] So the hardware layer is delivering. We're going to get trillion-parameter-class models on Nvidia's hardware. They're delivering trillion parameters, going up from multi-billion to a trillion parameters. So then it comes down to the training data. And, you know, again, look at Google: one, they're multi-modal. Their base layer is multi-modal, as compared to, say, OpenAI, where they actually have different verticals: here's the video layer, here's the text layer. That's one. Two, they've got YouTube; they've got the video content. And they've got more semiconductor compute than all their competitors combined. Of course they're going to figure that out. And then the World Wide Web. By the way, one of the things in the Nvidia demo that Jensen shared is that they can move the entire content of the World Wide Web in one second using optics, and process it. So who's going to crack the code on that? I'm pretty sure Google's going to be relevant in that world. 

Speaker 2 [00:12:15] I would think so. I would think so. You know, it's very interesting, though, when you think about what the data needs are. And so this person was talking about the experience that a four-year-old has. 

Speaker 1 [00:12:34] And this is Yann... 

Speaker 2 [00:12:36] This is Yann LeCun. Exactly. 

Speaker 1 [00:12:38] Meta's head of. 

Speaker 2 [00:12:39] AI. You want to. 

Speaker 1 [00:12:40] Train your guests, right? 

Speaker 2 [00:12:41] Exactly. So he made a very simple point that was powerful to me, which is the amount of data that a four-year-old, I think a four- or five-year-old, has consumed through their eyes, just observing the world. And that amount of data is more than all the training data that an LLM today trains on in written format, right? Just in visual observation of how the world works. And it's greater by a significant order of magnitude. And he talked about how any 17-year-old can learn how to drive a car in 20 hours of practice, and we don't have an automated driving experience in the real world that works that way. So there's still a long way to go. 
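
As a rough illustration of the order-of-magnitude comparison attributed to Yann LeCun here, the sketch below uses assumed figures for optic-nerve bandwidth, waking hours, and LLM training-corpus size; none of these numbers come from the episode.

```python
# Order-of-magnitude sketch of the visual-data argument described above.
# The bandwidth, waking hours, and text-corpus figures are rough assumptions
# for illustration, not exact numbers from the podcast.

optic_nerve_bytes_per_sec = 20e6        # assumed ~20 MB/s of visual input
waking_hours_by_age_four = 16_000       # assumed waking hours over four years
visual_bytes = optic_nerve_bytes_per_sec * waking_hours_by_age_four * 3600

llm_training_tokens = 1e13              # assumed ~10 trillion training tokens
bytes_per_token = 2                     # rough average bytes per token of text
text_bytes = llm_training_tokens * bytes_per_token

print(f"Visual data by age four: ~{visual_bytes:.1e} bytes")
print(f"LLM text training data:  ~{text_bytes:.1e} bytes")
print(f"Ratio: ~{visual_bytes / text_bytes:.0f}x more visual data")
```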

Speaker 1 [00:13:42] I think we'll see AGI-like capability in four to five years, because, you know, the inputs have to calibrate. Now, Tesla relaunched their FSD; they had to rebuild their entire model, so all that legacy tech is gone. That's going to happen again and again, not just for Tesla but for others. But we've got the compute. We've got the large-parameter models. We just need more training data. That's actually the main constraint, plus multi-modal training. But we've got a few years to figure it out. So let's share the screen again here and go through some other highlights. This is from our Lumida Ledger, which publishes every weekend, usually Sunday at 11. Let's go through some quick highlights. Shifting gears a bit: on the macro side, we saw that the Fed did not cut rates. This was actually one of our calls in December. We said, hey, the market is expecting a pivot; it's not going to happen. And it didn't happen. Well, here's what's really interesting: this is actually very dovish. I don't think people appreciate how dovish this is. Let me show you. These are the updated projections from the Fed. What you see here is the December projection, and then you see their current projection. Okay, so for 2024, this is the outlook. They're saying, hey, in December we thought growth in real GDP would be 1.4%; now it's 2.1%. So in a quarter they've raised the growth outlook. Second, in December the inflation forecast was 2.4%; they've increased their inflation expectation to 2.6%. Higher growth, higher inflation. And what about rates? They expect the fed funds rate to be the same as their December projection. So the Fed seems to be willing to tolerate higher inflation. Is that bullish, short term or long term? Well, I don't want to get into a big thing here, but, you know. Yeah. 

Speaker 2 [00:15:58] I think the Fed's also coming around to the idea that it's harder than they expected to bring inflation back down into their target zone. 

Speaker 1 [00:16:10] Yeah, I think so. Look, they need to keep rates higher for longer. That's another theme we've been on; it was literally in my Twitter handle last year: higher for longer. So yeah, higher for longer on rates. It looks like the majority of the committee expects three rate cuts. Some expect two. I'm more in the two camp. Let's just see how the data comes in, though. The CPI print... I'll skip over the CPI. This is really funny: The Economist came out with a magazine cover about America's booming economy. These covers are always late, so we'll see what that means. S&P futures are down this morning. Thank you, Economist. 

Speaker 2 [00:16:54] So there's always a swing back. 

Speaker 1 [00:16:56] Do you have a copy of it? 

Speaker 2 [00:16:58] Right here on my desk actually. That's great. 

Speaker 1 [00:17:01] I don't know that I even read the articles in this thing anymore. The cover has some value; if there's something that pops, I might take a look, but it's just too backward-looking. 

Speaker 2 [00:17:15] Well, all written media has to be backward-looking, right? I mean, for it to be news, right? Otherwise it's an opinion. 

Speaker 1 [00:17:26] That's right. And they're not talking about China anymore. That's interesting. Here's another one, back to AI. So what's the best asset class to own in a world of AI abundance? Meaning, forecast the end state. The dream is that work is something you do because you enjoy working, for instance, right? There's abundance. What goes up in value if you've got robots manufacturing everything? You can have fun. 

Speaker 2 [00:18:01] Well, I think we're in a world of universal basic income there, right? We're in UBI world. I agree with the point that it's land. It's actually interesting, because I was in Manhattan Beach for vacation this past week in California. Talk about a place with scarce, incredibly valuable land. And, you know, the price per square foot is breathtaking when you think about it, compared to other locations, including, say, Manhattan itself, which Manhattan Beach just dwarfs. At one point I was surfing, and you get this incredible view when you're out in the ocean: you turn around, you look back, and every square foot is covered. I mean, houses are on top of one another, and then on top of one another again and again and again. And they're not making any more... 

Speaker 1 [00:19:03] Not make anymore. 

Speaker 2 [00:19:04] Islands. 

Speaker 1 [00:19:06] The key word you said there is location. So it's not all land, but land in general should do well. Land is one of those goods that has what's called, in economics, rivalrous consumption, meaning we both can't share in it. It's a positional good. There's only one Boardwalk; there's only one Park Place in Monopoly. Yes. Unlike, say, technology. Technology can be shared, right? Learning a quadratic equation can be shared instantly. That's non-rival. But real estate and land are not the same. So we were buyers of homebuilders last week. The question is, how do you buy land? You can buy land directly, but you pay up and you've got to pay property taxes. The homebuilders, first of all, own land. They own hundreds of millions or billions of dollars' worth of land, and they have stocks that can go up while you own the land, too. And they were cheap; they're a bit overbid right now, by the way. We were talking about homebuilders last week, and it was the best-performing theme last week, up 5% last week actually.