@simonw
Created May 5, 2023 18:22
{
"segments": [
{
"speaker": "A",
"start": "0:00:00.497812",
"stop": "0:02:48.977813",
"transcript": [
{
"start": "0:00:00.497812",
"text": " Yeah, this is a document which I first saw at three o'clock this morning, I think."
},
{
"start": "0:00:05.817813",
"text": " It claims to be leaked from Google."
},
{
"start": "0:00:08.297812",
"text": " There's good reasons to believe it is leaked from Google."
},
{
"start": "0:00:10.397813",
"text": " And to be honest, if it's not, it doesn't actually matter because the quality of the"
},
{
"start": "0:00:14.017812",
"text": " analysis I think stands alone."
},
{
"start": "0:00:16.097812",
"text": " If this was just a document by some anonymous person, I'd still think it was interesting"
},
{
"start": "0:00:21.177812",
"text": " and worth discussing."
},
{
"start": "0:00:22.497812",
"text": " And the title of the document is, We Have No Mote and Neither Does Open AI."
},
{
"start": "0:00:27.097812",
"text": " The argument it makes is that while Google and Open AI have been competing on training"
},
{
"start": "0:00:31.697812",
"text": " bigger and bigger language models, the open source community is already starting to outrun"
},
{
"start": "0:00:36.177813",
"text": " them given only a couple of months of really serious activity."
},
{
"start": "0:00:42.017812",
"text": " Facebook Llama was the thing that really kicked this off."
},
{
"start": "0:00:44.297812",
"text": " There were open source language models like Bloom before that and GPTJ."
},
{
"start": "0:00:48.537813",
"text": " They were very impressive."
},
{
"start": "0:00:49.857813",
"text": " Nobody was really thinking that they were chat GPT equivalent."
},
{
"start": "0:00:53.737813",
"text": " Facebook Llama came out in March, I think March 15th, and was the first one that really"
},
{
"start": "0:00:58.617813",
"text": " sort of showed signs of being as capable maybe as chat GPT."
},
{
"start": "0:01:05.737813",
"text": " I think all of these models, the analysis of them has tended to be a bit hyped."
},
{
"start": "0:01:10.457813",
"text": " I don't think any of them are even quite up to GPT 3.5 standards yet, but they're within"
},
{
"start": "0:01:15.617812",
"text": " spitting distance in some respects."
},
{
"start": "0:01:17.837812",
"text": " So anyway, Llama came out and then two weeks later, Stanford Alpaca came out, which was"
},
{
"start": "0:01:22.857812",
"text": " fine tuned on top of Llama and was a massive leap forward in terms of quality."
},
{
"start": "0:01:27.257812",
"text": " And then a week after that, Vicuna came out, which is to this date, the best model I've"
},
{
"start": "0:01:31.897813",
"text": " been able to run on my own hardware."
},
{
"start": "0:01:33.377812",
"text": " I've run it on my mobile phone now."
},
{
"start": "0:01:35.857812",
"text": " It's astonishing how little resources you need to run these things."
},
{
"start": "0:01:39.537812",
"text": " But anyway, the argument this paper makes, which I found very convincing, is it only"
},
{
"start": "0:01:44.337812",
"text": " took open source two months to get this far."
},
{
"start": "0:01:47.477812",
"text": " It's now every researcher in the world is kicking in on new things."
},
{
"start": "0:01:51.217812",
"text": " It feels like there are problems that Google has been trying to solve that the open source"
},
{
"start": "0:01:55.157812",
"text": " models are already addressing."
},
{
"start": "0:01:57.417812",
"text": " And really, how do you compete with that?"
},
{
"start": "0:01:59.377812",
"text": " Like with your closed ecosystem, how are you going to beat these open models with all of"
},
{
"start": "0:02:03.657812",
"text": " this innovation going on?"
},
{
"start": "0:02:05.097812",
"text": " But then the most interesting argument in there is it talks about the size of models"
},
{
"start": "0:02:08.237812",
"text": " and says that maybe large isn't a competitive advantage."
},
{
"start": "0:02:12.257812",
"text": " Maybe actually a smaller model with lots of different people fine tuning it and having"
},
{
"start": "0:02:16.497813",
"text": " these sort of these LoRa, L-O-R-A, stackable fine tuning innovations on top of it."
},
{
"start": "0:02:22.097813",
"text": " Maybe those can move faster."
},
{
"start": "0:02:23.537813",
"text": " And actually having to retrain your giant model every few months from scratch is way"
},
{
"start": "0:02:27.817813",
"text": " less useful than having small models that you can fine tune in a couple of hours on"
},
{
"start": "0:02:33.097813",
"text": " a laptop."
},
{
"start": "0:02:34.097813",
"text": " So it's fascinating."
},
{
"start": "0:02:35.577813",
"text": " Basically, if you haven't read this thing, you should read every word of it."
},
{
"start": "0:02:39.217813",
"text": " It's not very long."
},
{
"start": "0:02:40.397813",
"text": " It's beautifully written."
},
{
"start": "0:02:41.737813",
"text": " Like it's I mean, if you try and find the quotable lines in it, almost every line of"
},
{
"start": "0:02:45.577812",
"text": " it's quotable."
},
{
"start": "0:02:46.577812",
"text": " Yeah."
},
{
"start": "0:02:47.577812",
"text": " Yeah."
},
{
"start": "0:02:48.577812",
"text": " That's that."
}
]
},
{
"speaker": "B",
"start": "0:02:49.450312",
"stop": "0:03:23.588437",
"transcript": [
{
"start": "0:02:49.450312",
"text": " That's a wonderful summary, Simon. Yeah, there's so many angles we can take to this. I'll just"
},
{
"start": "0:02:54.090312",
"text": " observe one thing, which if you think about the open versus closed narrative, Imad Mostak,"
},
{
"start": "0:03:00.970313",
"text": " who is CEO of Stability, has always been that open will trail behind closed because"
},
{
"start": "0:03:06.090312",
"text": " closed alternatives can always take learnings and lessons from open source. And this is the first"
},
{
"start": "0:03:12.650312",
"text": " highly credible statement that is basically saying the exact opposite, that open source is moving"
},
{
"start": "0:03:16.490312",
"text": " than closed source. And they are scared. They seem to be scared, which is interesting. Travis."
}
]
},
{
"speaker": "C",
"start": "0:03:23.166562",
"stop": "0:04:41.483438",
"transcript": [
{
"start": "0:03:23.166562",
"text": " Yeah, a few things that I'll say. The only thing which can keep up with the pace of AI these days is open source."
},
{
"start": "0:03:32.166562",
"text": " I think we're seeing that unfold in real time before our eyes. And, you know, I think the other interesting angle of this is to some degree LLMs are, they don't really have switching costs, they are going to become commoditized."
},
{
"start": "0:03:47.166562",
"text": " At least that's what a lot of people kind of think. To what extent is it a rate in terms of pricing of these things?"
},
{
"start": "0:03:55.166562",
"text": " And, you know, they all kind of become roughly the same in terms of their underlying abilities. And open source is going to be actively pushing that forward."
},
{
"start": "0:04:04.166562",
"text": " And then this is kind of coming from, if it is to be believed, you know, the kind of Google or an insider type mentality around, you know, where is the actual competitive advantage?"
},
{
"start": "0:04:14.166562",
"text": " What should they be focusing on? How can they get back into the game when, you know, when currently the external view of Google is that they're kind of spinning their wheels and they have this code red."
},
{
"start": "0:04:26.166562",
"text": " And, you know, it's like they're playing catch up already. Like, you know, could they use the open source community and work with them, which is going to be really, really hard, you know, from a structural perspective, given Google's place in the ecosystem."
},
{
"start": "0:04:38.166562",
"text": " But a lot of jumping off points there."
}
]
},
{
"speaker": "D",
"start": "0:04:42.074063",
"stop": "0:05:51.919688",
"transcript": [
{
"start": "0:04:42.074063",
"text": " I was going to say, I think the post is really focused on how do we get the best model, but"
},
{
"start": "0:04:47.114063",
"text": " it's not focused on how do you build the best product around it."
},
{
"start": "0:04:50.154063",
"text": " A lot of these models are limited by how many GPUs you can get to run them."
},
{
"start": "0:04:54.434063",
"text": " And we've seen on traditional open source, everybody can use some of these projects like"
},
{
"start": "0:04:59.274063",
"text": " Kafka and like Elasti for free, but the reality is that not everybody can afford to run the"
},
{
"start": "0:05:04.154063",
"text": " infrastructure needed for it."
},
{
"start": "0:05:06.014063",
"text": " So I think the main takeaway that I have from this is a lot of the modes are probably around"
},
{
"start": "0:05:12.654063",
"text": " just getting the sand, so to speak, and having the GPUs to actually serve these models."
},
{
"start": "0:05:17.214063",
"text": " Because even if the best model is open source, like running it at a large scale for an end"
},
{
"start": "0:05:22.414062",
"text": " is not easy and like it's not super convenient to get a lot of the infrastructure."
},
{
"start": "0:05:28.354063",
"text": " And we've seen that model work in open source where you have the open source project and"
},
{
"start": "0:05:31.934063",
"text": " then you have an enterprise cloud hosted version for it."
},
{
"start": "0:05:35.414062",
"text": " I think that's going to look really different in open source models because just hosting"
},
{
"start": "0:05:39.734063",
"text": " a model doesn't have a lot of value."
},
{
"start": "0:05:41.914062",
"text": " So I'm curious to hear how people end up getting rewarded to do open source."
},
{
"start": "0:05:48.214063",
"text": " We figured that out in infrastructure, but we haven't figured that out in Adalense yet."
}
]
},
{
"speaker": "A",
"start": "0:05:53.168438",
"stop": "0:07:15.231563",
"transcript": [
{
"start": "0:05:53.168438",
"text": " I mean, one thing I'll say is that the models that you can run on your own"
},
{
"start": "0:05:56.868438",
"text": " devices are so far ahead of what I ever dreamed they would be at this point."
},
{
"start": "0:06:01.668438",
"text": " Like the Quna 13B, I think is the current best available"
},
{
"start": "0:06:06.368438",
"text": " open mode model that I've played with."
},
{
"start": "0:06:08.268438",
"text": " It's derived from Facebook llama, so you can't use it for commercial purposes yet."
},
{
"start": "0:06:12.428438",
"text": " But the point about the Quna 13B is it runs in the browser directly on web GPU."
},
{
"start": "0:06:18.008438",
"text": " There's this amazing web LLM project where you literally your browser"
},
{
"start": "0:06:21.508438",
"text": " download a two gigabyte file and it fires up a chat GPD style interface."
},
{
"start": "0:06:26.168438",
"text": " And it's quite good."
},
{
"start": "0:06:27.408438",
"text": " It can do rap battles between different animals and all of the kinds of fun stuff"
},
{
"start": "0:06:31.908438",
"text": " that you'd expect to be able to do with language model running entirely in Chrome"
},
{
"start": "0:06:35.448438",
"text": " Canary, it's shocking to me that that's even possible, but that kind of shows"
},
{
"start": "0:06:39.648438",
"text": " that once you get to inference, If you can shrink the model down and the techniques"
},
{
"start": "0:06:44.568438",
"text": " for shrinking these models, the first one was the the quantization, which the llama"
},
{
"start": "0:06:49.348438",
"text": " dot CPP project really sort of popularized Mac."
},
{
"start": "0:06:51.908438",
"text": " And, you know, by using four bits instead of 16 bit floating point numbers,"
},
{
"start": "0:06:55.508438",
"text": " you can shrink it down quite a lot."
},
{
"start": "0:06:57.648438",
"text": " And then there was a paper that came out days ago suggesting that you can prune"
},
{
"start": "0:07:01.248438",
"text": " the models and ditch half the model and maintain the same level of quality."
},
{
"start": "0:07:05.548438",
"text": " So with with things like that, with all of these tricks coming together,"
},
{
"start": "0:07:08.988438",
"text": " it's really astonishing how much you can get done on hardware that people"
},
{
"start": "0:07:13.608438",
"text": " actually have in their pockets even."
}
]
},
{
"speaker": "B",
"start": "0:07:15.231563",
"stop": "0:07:18.083438",
"transcript": [
{
"start": "0:07:15.231563",
"text": " Just for completion, I've been following all of your posts."
}
]
},
{
"speaker": "B",
"start": "0:07:20.378438",
"stop": "0:07:27.195938",
"transcript": [
{
"start": "0:07:20.378438",
"text": " Yes. I want to follow up. Simon, you said you're running a model on your phone."
},
{
"start": "0:07:24.538438",
"text": " Which model is it? And I don't think you've written it up."
}
]
},
{
"speaker": "A",
"start": "0:07:27.600938",
"stop": "0:07:44.172188",
"transcript": [
{
"start": "0:07:27.600938",
"text": " Yeah, that one's the Q-nut. I did. Did I write it up? I did. I've got a blog post about how it"
},
{
"start": "0:07:33.440938",
"text": " knows who I am, sort of, but it said that I invented a pattern for living called the"
},
{
"start": "0:07:38.560938",
"text": " Bear or Bonnie pattern, which I definitely didn't, but I love that my phone decided that I did."
}
]
},
{
"speaker": "B",
"start": "0:07:44.172188",
"stop": "0:08:40.281563",
"transcript": [
{
"start": "0:07:44.172188",
"text": " I will hunt for that because I'm not yet running Vykonio on my phone and I feel like I should"
},
{
"start": "0:07:49.452188",
"text": " and as like a very base thing. But I'll follow up two things right like one I'm very interested in"
},
{
"start": "0:07:54.812188",
"text": " and I won't let's talk about that a little bit more because this concept of stackable improvements"
},
{
"start": "0:07:58.972188",
"text": " to models I think is extremely interesting like I would love to npm install abilities onto"
},
{
"start": "0:08:04.332188",
"text": " my models right which is really awesome but the first thing I think is under discussed is"
},
{
"start": "0:08:09.612188",
"text": " I don't get the panic like honestly like Google has the most modes I was arguing maybe like three"
},
{
"start": "0:08:16.012188",
"text": " months ago on my blog like Google has the most mode out of a lot of people because hey we have"
},
{
"start": "0:08:20.412188",
"text": " your calendar hey we have your email hey we have your you know Google docs like isn't that a"
},
{
"start": "0:08:26.172188",
"text": " sufficient mode like why are these guys panicking so much I don't I still don't get it like sure"
},
{
"start": "0:08:30.732188",
"text": " you know open source is running ahead and like it's on device and whatever what have you but"
},
{
"start": "0:08:35.452188",
"text": " they have so much more mode like what are we talking about here there's many dimensions to"
},
{
"start": "0:08:39.772188",
"text": " compete on."
}
]
},
{
"speaker": "C",
"start": "0:08:42.002813",
"stop": "0:10:17.616563",
"transcript": [
{
"start": "0:08:42.002813",
"text": " Yeah, there's like one of the things that the author mentions in here is when you start to have the feeling of what we're trailing behind, then your brightest researchers, you know, jump ship and go to open AI or go to work at academia or whatever."
},
{
"start": "0:09:00.162812",
"text": " And like the talent drain at the level of the senior AI researchers that are pushing these things ahead within Google, I think is a serious, serious concern."
},
{
"start": "0:09:09.322813",
"text": " And my take on it's a good point, right? Like, like, like, like, Google has modes, they're not running out of money anytime soon, you know, I think they do see the level of the defensibility and the fact that they want to be on time."
},
{
"start": "0:09:23.162812",
"text": " And the leader around pretty much anything tech first, there's definitely have lost that feeling, right?"
},
{
"start": "0:09:31.322813",
"text": " And, you know, to what degree they can, they can with the open source community to get that back and help drive that, you know, all of the llama subset of models with with alpaca and vicuna, etc."
},
{
"start": "0:09:43.322813",
"text": " That all came from from meta, right? Like that? Yeah, like it's not licensed in an open way, we can build a company on top of it, but is now kind of driving this family of models."
},
{
"start": "0:09:51.322813",
"text": " Like there's a tree of models that that they're they're leading and where is Google in that in that playbook?"
},
{
"start": "0:09:56.322813",
"text": " Like for a long time, they were the one releasing those models being super open and and now it's just they seem to be trailing and there's there's people jumping ship and to what degree can they can they can they close off those wounds and and focus on on where where they have unique ability to gain momentum, I think is a core part of my takeaway from this."
}
]
},
{
"speaker": "D",
"start": "0:10:19.422188",
"stop": "0:11:33.554063",
"transcript": [
{
"start": "0:10:19.422188",
"text": " Yeah, and I think another big thing in the post is, oh, as"
},
{
"start": "0:10:22.782188",
"text": " long as you have high quality data, like you don't need that"
},
{
"start": "0:10:25.142188",
"text": " much data, you can just use that. The first party data loops"
},
{
"start": "0:10:28.742188",
"text": " are probably going to be the most important going forward if"
},
{
"start": "0:10:31.102188",
"text": " we do believe that this is true. So Databricks, we have Mike"
},
{
"start": "0:10:34.322188",
"text": " Conover from Databricks on the podcast, and they talked about"
},
{
"start": "0:10:37.622188",
"text": " how they came up with the training set for Dolly, which"
},
{
"start": "0:10:40.902188",
"text": " they basically had Databricks employees write down very good"
},
{
"start": "0:10:44.102188",
"text": " questions and very good answers for it. Not every company has"
},
{
"start": "0:10:48.302188",
"text": " the skill to do that. And I think, you know, products like"
},
{
"start": "0:10:51.142188",
"text": " Google, they have millions of people writing Google Docs,"
},
{
"start": "0:10:54.782188",
"text": " millions of people using Google Sheets, and millions of people"
},
{
"start": "0:10:57.742188",
"text": " writing stuff, creating content on YouTube. The question is, if"
},
{
"start": "0:11:02.262188",
"text": " you want to compete against these companies, maybe the"
},
{
"start": "0:11:04.742188",
"text": " model is not what you're gonna do it with, because the open"
},
{
"start": "0:11:07.822188",
"text": " source kind of commoditizes it. But how do you build even better"
},
{
"start": "0:11:11.582188",
"text": " data first party loops. And that's kind of the hardest thing"
},
{
"start": "0:11:15.182188",
"text": " for startups, right? Like, even if we open up the models to"
},
{
"start": "0:11:18.782188",
"text": " everybody, and everybody can just go on GitHub and or"
},
{
"start": "0:11:21.702188",
"text": " Hugging Face and get the weights to the best model, or get enough"
},
{
"start": "0:11:24.382188",
"text": " people to generate data for me, so that I can still make it"
},
{
"start": "0:11:27.222188",
"text": " good. That's, that's what I would be worried about. If I was"
},
{
"start": "0:11:30.542188",
"text": " a new company, how do I make that happen really quickly?"
}
]
},
{
"speaker": "A",
"start": "0:11:34.499063",
"stop": "0:12:40.193438",
"transcript": [
{
"start": "0:11:34.499063",
"text": " I'm not convinced that the data is that big a challenge."
},
{
"start": "0:11:37.899063",
"text": " So there's this project."
},
{
"start": "0:11:39.379063",
"text": " So the problem with Facebook Llama is that it's not available for commercial use."
},
{
"start": "0:11:43.459063",
"text": " So people are now trying to train a alternative to Llama that's entirely"
},
{
"start": "0:11:46.899063",
"text": " on openly licensed data and that the biggest project is this red pajama project."
},
{
"start": "0:11:51.779063",
"text": " They released their training data a few weeks ago and it was 2.7 terabytes."
},
{
"start": "0:11:56.579063",
"text": " Right."
},
{
"start": "0:11:56.819063",
"text": " So actually tiny, right?"
},
{
"start": "0:11:58.259063",
"text": " You can buy a laptop that you can fit 2.7 terabytes on."
},
{
"start": "0:12:01.459063",
"text": " But it was the same exact data that Facebook, it was the same thing that"
},
{
"start": "0:12:05.419062",
"text": " Facebook Llama had been trained on because for your base model, you're not really"
},
{
"start": "0:12:09.459063",
"text": " trying to teach it facts about the world."
},
{
"start": "0:12:10.979063",
"text": " You're just trying to teach it how English and other languages work, how they fit"
},
{
"start": "0:12:14.339063",
"text": " together."
},
{
"start": "0:12:14.939063",
"text": " And then the real magic is when you fine tune on top of that, that's what Alpaca"
},
{
"start": "0:12:18.619063",
"text": " did on top of Llama and so on."
},
{
"start": "0:12:19.979063",
"text": " And the fine tuning sets, it looks like tens of thousands of examples to kick one"
},
{
"start": "0:12:24.619063",
"text": " of these raw models into shape and tens of thousands of examples like Databricks"
},
{
"start": "0:12:28.499063",
"text": " spent a month and got the 2000 employees of their company to help kick in and it"
},
{
"start": "0:12:32.459063",
"text": " worked."
},
{
"start": "0:12:34.299062",
"text": " You've got the Open Assistant project and crowdsourcing this stuff now as well."
},
{
"start": "0:12:38.339063",
"text": " So it's achievable."
}
]
},
{
"speaker": "B",
"start": "0:12:40.547813",
"stop": "0:13:16.052813",
"transcript": [
{
"start": "0:12:40.547813",
"text": " sort of throw it. I agree. I think it's a fascinating point. Actually, so I've heard"
},
{
"start": "0:12:44.627813",
"text": " through the grapevine that Red Pajamas model trained on the data that they released is going"
},
{
"start": "0:12:49.507813",
"text": " to be releasing tomorrow. And it's a very exciting time because there's a couple more models that are"
},
{
"start": "0:12:55.907813",
"text": " coming down the pike, which independently reproduced. And so, yeah, everyone is"
},
{
"start": "0:12:59.747813",
"text": " challenging all these assumptions from first principles, which is fascinating."
},
{
"start": "0:13:05.187813",
"text": " I did want it to try to get a little bit more technical in terms of the specific points raised,"
},
{
"start": "0:13:10.307813",
"text": " this doc was just amazing. Can we talk about Laura? I'll open it up to Simon again if he's back."
}
]
},
{
"speaker": "A",
"start": "0:13:16.845938",
"stop": "0:13:21.992813",
"transcript": [
{
"start": "0:13:16.845938",
"text": " I'd rather someone else take on Laura. I know as much as I've read in that paper, but not much more than that."
}
]
},
{
"speaker": "B",
"start": "0:13:21.992813",
"stop": "0:13:40.437188",
"transcript": [
{
"start": "0:13:21.992813",
"text": " So I thought this was kind of like an optimization technique."
},
{
"start": "0:13:25.232813",
"text": " So LoRa stands for low rank adaptation."
},
{
"start": "0:13:28.052812",
"text": " But this is the first mention of LoRa as a form of stackable improvements where he, I"
},
{
"start": "0:13:33.592813",
"text": " forget what, let me just, let me just kind of Google this, but obviously anyone's more"
},
{
"start": "0:13:38.212813",
"text": " knowledgeable, please come on in."
}
]
},
{
"speaker": "D",
"start": "0:13:40.437188",
"stop": "0:14:08.939063",
"transcript": [
{
"start": "0:13:40.437188",
"text": " All I've learned is through chat GPT, I spent about 20 minutes on GPT-4 trying to figure"
},
{
"start": "0:13:45.917188",
"text": " out what it was."
},
{
"start": "0:13:46.917188",
"text": " I studied computer science, but this is not my area of expertise."
},
{
"start": "0:13:51.717188",
"text": " What I got from it is that basically instead of having to retrain the whole model, you"
},
{
"start": "0:13:55.397188",
"text": " can just pick one of the ranks and you take one of the weight matrices and make two smaller"
},
{
"start": "0:14:02.477188",
"text": " matrices from it and then just two to be retrained and retraining the whole model."
},
{
"start": "0:14:07.437188",
"text": " So it saves a lot of time."
}
]
},
{
"speaker": "B",
"start": "0:14:08.939063",
"stop": "0:14:23.198438",
"transcript": [
{
"start": "0:14:08.939063",
"text": " You freeze part of the thing and then you just train a smaller part like that"
},
{
"start": "0:14:12.179063",
"text": " Exactly is to be a area of a lot of food for research"
},
{
"start": "0:14:15.219063",
"text": " Yeah, I'm mini GPT for recently did something similar as well"
},
{
"start": "0:14:19.219063",
"text": " And then there's this there's a there's a sparse model people on today that"
}
]
},
{
"speaker": "A",
"start": "0:14:23.552813",
"stop": "0:14:55.952813",
"transcript": [
{
"start": "0:14:23.552813",
"text": " So I've seen a lot of LoRa stable, the stable diffusion community have been using LoRa a lot."
},
{
"start": "0:14:28.912813",
"text": " So that in that case, they had a, the thing I've seen is people releasing LoRa's that are like,"
},
{
"start": "0:14:33.472813",
"text": " you train a concept like a particular person's face or something you release. And the LoRa"
},
{
"start": "0:14:39.472813",
"text": " version of this ended up being megabytes of data, like, which is, it's, you know,"
},
{
"start": "0:14:43.552813",
"text": " it's small enough that you can just trade those around and you can effectively load multiple of"
},
{
"start": "0:14:47.152813",
"text": " those into the model. But what I haven't realized is that you can use the same trick on language"
},
{
"start": "0:14:51.712813",
"text": " models. That was one of the big new things for me in reading the leaked Google paper today."
}
]
},
{
"speaker": "D",
"start": "0:14:56.914688",
"stop": "0:15:46.577813",
"transcript": [
{
"start": "0:14:56.914688",
"text": " Yeah, and I think the point to make around owning the infrastructure, so what Chad GPD has told me is that when you're figuring out what rank you actually want to do this fine tuning at, you can either go too low and like the model doesn't actually learn it, or you can go too high and the model over fits those learnings."
},
{
"start": "0:15:14.154688",
"text": " So if you have a base model that everybody agrees on, then all the subsequent like LoRa work is done around the same rank, which gives you an advantage."
},
{
"start": "0:15:23.554688",
"text": " And the point they made in the stat since Llama has been the base for a lot of this LoRa work, like they own the mindshare of the community."
},
{
"start": "0:15:32.194687",
"text": " So everything that they're building is compatible with their architecture."
},
{
"start": "0:15:35.714687",
"text": " But if Google open sources their own model, you know, the rank that they chose for LoRa on Llama might not work on the Google model."
},
{
"start": "0:15:43.794688",
"text": " So all of the existing work is not portable."
}
]
},
{
"speaker": "A",
"start": "0:15:46.577813",
"stop": "0:16:20.834063",
"transcript": [
{
"start": "0:15:46.577813",
"text": " The impression I got is that one of the challenges with LoRa is that you train all these LoRAs"
},
{
"start": "0:15:51.057813",
"text": " on top of your model, but then if you retrain that base model, those LoRs become invalid."
},
{
"start": "0:15:55.377813",
"text": " Right?"
},
{
"start": "0:15:56.377813",
"text": " They're essentially, they're built for an exact model version."
},
{
"start": "0:15:58.417813",
"text": " So this means that being the big company with all of the GPUs that can afford to retrain"
},
{
"start": "0:16:02.737813",
"text": " a model every three months, that's suddenly not nearly as valuable as it used to be."
},
{
"start": "0:16:06.857813",
"text": " Because now maybe there's an open source model that's five years old at this point and has"
},
{
"start": "0:16:10.977813",
"text": " like multiple, multiple stacks of LoRs trained all over the world on top of it, which can"
},
{
"start": "0:16:15.177813",
"text": " outperform your brand new model just because there's been so much more iteration on that"
},
{
"start": "0:16:19.697813",
"text": " base."
}
]
},
{
"speaker": "B",
"start": "0:16:20.192813",
"stop": "0:16:44.003438",
"transcript": [
{
"start": "0:16:20.192813",
"text": " I think it's fascinating. Jim Fan from NVIDIA was recently making this argument for transformers."
},
{
"start": "0:16:25.712813",
"text": " Like even if we do come up with a better architecture than transformers, the sheer hundreds and"
},
{
"start": "0:16:31.552813",
"text": " millions of dollars that have been invested on top of transformers make it actually there is some"
},
{
"start": "0:16:36.752813",
"text": " switching costs and it's not exactly obvious that better architecture equals that. We should all switch"
},
{
"start": "0:16:42.672813",
"text": " immediately tomorrow."
}
]
},
{
"speaker": "A",
"start": "0:16:43.935938",
"stop": "0:17:05.147813",
"transcript": [
{
"start": "0:16:43.935938",
"text": " It's kind of like the difficulty of launching a new programming language today,"
},
{
"start": "0:16:48.415938",
"text": " is that Python and JavaScript have a million packages, so no matter how good your new language"
},
{
"start": "0:16:53.135938",
"text": " is, if it can't tap into those existing package libraries, it's not going to be useful for them."
},
{
"start": "0:16:57.135938",
"text": " Which is why Mojo is so clever, because they did build on top of Python, so they get all of that"
},
{
"start": "0:17:01.295938",
"text": " existing infrastructure, all of that existing code working already."
}
]
},
{
"speaker": "B",
"start": "0:17:05.147813",
"stop": "0:17:10.935938",
"transcript": [
{
"start": "0:17:05.147813",
"text": " I mean, what we got you since you're, you know, co-creator of Jenga and all that, do"
},
{
"start": "0:17:08.747813",
"text": " we want to take a diversion into Mojo?"
}
]
},
{
"speaker": "C",
"start": "0:17:10.547812",
"stop": "0:18:45.959063",
"transcript": [
{
"start": "0:17:10.547812",
"text": " I'd be happy to jump in and get someone's take on Mojo."
},
{
"start": "0:17:14.707813",
"text": " One small point on Laura is, you know, I just think,"
},
{
"start": "0:17:20.227813",
"text": " if you think about at a high level,"
},
{
"start": "0:17:22.067812",
"text": " what the major downsides are of these large language models,"
},
{
"start": "0:17:27.307812",
"text": " it's the fact that they, well,"
},
{
"start": "0:17:29.987813",
"text": " they're difficult to train, right?"
},
{
"start": "0:17:32.307812",
"text": " They tend to hallucinate and they have a static,"
},
{
"start": "0:17:36.747813",
"text": " like they were trained at a certain date, right?"
},
{
"start": "0:17:38.787812",
"text": " And with Laura, I think it makes it a lot more amenable"
},
{
"start": "0:17:42.707813",
"text": " to training new updates on top of that,"
},
{
"start": "0:17:46.107812",
"text": " that like base model on the fly,"
},
{
"start": "0:17:48.027812",
"text": " where you can incorporate new data"
},
{
"start": "0:17:50.667812",
"text": " in a way that is an interesting"
},
{
"start": "0:17:54.347812",
"text": " and potentially more optimal alternative"
},
{
"start": "0:17:56.787812",
"text": " than in the kind of in context generation, you know,"
},
{
"start": "0:17:59.787812",
"text": " because most of like who have perplexity.ai"
},
{
"start": "0:18:02.107812",
"text": " or any of these approaches currently,"
},
{
"start": "0:18:04.207813",
"text": " it's like all based off of doing real time searches"
},
{
"start": "0:18:07.747813",
"text": " and injecting as much into the local context window"
},
{
"start": "0:18:10.787812",
"text": " as possible so that you try to ground your language model,"
},
{
"start": "0:18:16.227813",
"text": " both in terms of the information that has access to,"
},
{
"start": "0:18:19.387812",
"text": " that helps to reduce hallucinations."
},
{
"start": "0:18:21.587812",
"text": " It can't produce it, but you know, helps to reduce it."
},
{
"start": "0:18:23.667812",
"text": " And then also gives it access to up-to-date information"
},
{
"start": "0:18:26.147812",
"text": " that wasn't around for that massive, like pre-training stuff."
},
{
"start": "0:18:29.347812",
"text": " And, you know, I think Laura in mind"
},
{
"start": "0:18:31.907812",
"text": " really makes it more amenable to having constantly shifted"
},
{
"start": "0:18:37.707813",
"text": " lifting lightweight pre-training on top of it"
},
{
"start": "0:18:39.747813",
"text": " that scales better than normal fine tune, fine tuning."
},
{
"start": "0:18:43.787812",
"text": " Yeah, that was just kind of my one takeaway there."
}
]
},
{
"speaker": "A",
"start": "0:18:45.959063",
"stop": "0:19:43.452188",
"transcript": [
{
"start": "0:18:45.959063",
"text": " I mean, for me, I've never been, I want to run models on my own heart. I don't actually care"
},
{
"start": "0:18:51.239063",
"text": " about their factual content. Like I don't need a model that's been, that's trained on the most"
},
{
"start": "0:18:55.719063",
"text": " up-to-date things. What I need is a model that can do the Bing and Bards trick, right? That can"
},
{
"start": "0:18:59.879063",
"text": " tell when it needs to run a search and then go and run a search to get extra information and,"
},
{
"start": "0:19:03.959063",
"text": " and bring that context in. And similarly, I want it to be able to operate tools where it can access"
},
{
"start": "0:19:08.199063",
"text": " my email or look at my notes or all of those kinds of things. And I don't think you need a very"
},
{
"start": "0:19:12.599063",
"text": " powerful model for that. Like that's one of the things where I feel like, yeah, the Quna running"
},
{
"start": "0:19:16.839063",
"text": " on my, on my laptop is probably powerful enough to drive a sort of personal research assistant,"
},
{
"start": "0:19:22.119063",
"text": " which can look things up for me and it can summarize things to my notes and it can do all"
},
{
"start": "0:19:25.639063",
"text": " of that. And I don't care, but it doesn't know about the Ukraine war because the Ukraine war had"
},
{
"start": "0:19:29.319062",
"text": " a training cutoff. That doesn't matter if it's got those additional capabilities, which are quite"
},
{
"start": "0:19:34.039062",
"text": " easy to build. You know, the reason everyone's going crazy building agents and tools right now"
},
{
"start": "0:19:38.679063",
"text": " is that it's a few lines of Python code and there's sort of a couple of paragraphs of prompt"
},
{
"start": "0:19:42.599063",
"text": " to get it to work."
}
]
},
{
"speaker": "C",
"start": "0:19:44.920313",
"stop": "0:21:02.039062",
"transcript": [
{
"start": "0:19:44.920313",
"text": " Well, let's maybe dig in on that a little bit."
},
{
"start": "0:19:47.440313",
"text": " And this also is very related to Mojo,"
},
{
"start": "0:19:50.080313",
"text": " because I do think there are use cases and domains where"
},
{
"start": "0:19:53.480313",
"text": " having the hyper-optimized version of these models"
},
{
"start": "0:19:57.880313",
"text": " running on device is very relevant, where you can't"
},
{
"start": "0:20:00.440313",
"text": " necessarily make API calls out on the fly"
},
{
"start": "0:20:03.320313",
"text": " and do context-augmented generation."
},
{
"start": "0:20:06.840313",
"text": " And I was talking with a researcher at Lockheed Martin"
},
{
"start": "0:20:11.960313",
"text": " yesterday, literally about the version of this"
},
{
"start": "0:20:15.680313",
"text": " of language models running on fighter jets, right?"
},
{
"start": "0:20:18.800313",
"text": " And you talk about the amount of engineering precision"
},
{
"start": "0:20:22.300313",
"text": " and optimization that has to go into those type of models."
},
{
"start": "0:20:25.480313",
"text": " And the fact that you spend so much money,"
},
{
"start": "0:20:27.980313",
"text": " like training a super distilled version"
},
{
"start": "0:20:30.640313",
"text": " where milliseconds matter,"
},
{
"start": "0:20:32.200313",
"text": " it's a life or death situation there."
},
{
"start": "0:20:34.240313",
"text": " And you couldn't even remotely have a use case there"
},
{
"start": "0:20:37.340313",
"text": " where you could call out and have API calls or something."
},
{
"start": "0:20:40.600313",
"text": " So I do think there's keeping in mind the use cases"
},
{
"start": "0:20:45.560313",
"text": " where there'll be use cases that I'm more excited about"
},
{
"start": "0:20:48.240313",
"text": " at the application level, where, yeah,"
},
{
"start": "0:20:50.160313",
"text": " I want to just have it be super flexible"
},
{
"start": "0:20:52.980313",
"text": " and be able to call out to APIs"
},
{
"start": "0:20:54.320313",
"text": " and have this agentic type thing."
},
{
"start": "0:20:56.560313",
"text": " And then there's also industries and use cases"
},
{
"start": "0:20:59.080313",
"text": " where you really need everything baked into the model."
},
{
"start": "0:21:01.520313",
"text": " Yep, agreed."
}
]
},
{
"speaker": "B",
"start": "0:21:02.039062",
"stop": "0:21:15.336563",
"transcript": [
{
"start": "0:21:02.039062",
"text": " My favorite piece of take on this is I think GPC4 as a reasoning engine, which I think"
},
{
"start": "0:21:06.879062",
"text": " came from Nathan at every.to, which I think, yeah, I see the 100 score over there."
},
{
"start": "0:21:12.719063",
"text": " Simon, do you have a few seconds on Mojo?"
}
]
},
{
"speaker": "A",
"start": "0:21:15.927188",
"stop": "0:23:06.036563",
"transcript": [
{
"start": "0:21:15.927188",
"text": " Sure. So Mojo is a brand new programming language used just announced a few days ago."
},
{
"start": "0:21:20.407188",
"text": " It's not actually available yet. I think there's an online demo, but"
},
{
"start": "0:21:23.927188",
"text": " presuming it becomes an open source language we can use."
},
{
"start": "0:21:26.727188",
"text": " It's got really some very interesting characteristics. It's a superset of Python."
},
{
"start": "0:21:31.767188",
"text": " So anything written in Python will just work, but it adds additional features on top that let you"
},
{
"start": "0:21:37.287188",
"text": " basically do very highly optimized code written in Python syntax with the compiles down."
},
{
"start": "0:21:43.767188",
"text": " The main thing that's exciting about it is the pedigree that it comes from."
},
{
"start": "0:21:47.207188",
"text": " It's a team led by Chris Latner and they built LLVM and Clang, and then he designed Swift at Apple."
},
{
"start": "0:21:54.007188",
"text": " So he's got like three for three on an extraordinarily impactful"
},
{
"start": "0:21:58.647188",
"text": " high performance computing product. And he put together this team and they've basically,"
},
{
"start": "0:22:02.647188",
"text": " they're trying to go after the problem of how do you build a language which you can do really"
},
{
"start": "0:22:07.847188",
"text": " high performance optimized work in, but where you don't have to do everything again from scratch."
},
{
"start": "0:22:12.087188",
"text": " That's where building on top of Python is so clever."
},
{
"start": "0:22:14.567188",
"text": " So I wasn't like, if this thing came along, I didn't really pay attention to it until"
},
{
"start": "0:22:18.727188",
"text": " Jeremy Howard, who built fast AI, put up a very detailed blog post about why he was excited about"
},
{
"start": "0:22:24.807188",
"text": " Mojo, which included a, there's a video demo in there, which everyone should watch because"
},
{
"start": "0:22:30.007188",
"text": " in that video, he takes matrix multiplication implemented in Python."
},
{
"start": "0:22:34.007188",
"text": " And then he uses the Mojo extras to 2000 X the performance of that matrix multiplication."
},
{
"start": "0:22:40.487188",
"text": " Like he adds a few static types functions, sort of struct instead of the class."
},
{
"start": "0:22:44.087188",
"text": " And it gets 2000 times the performance out of it, which is phenomenal,"
},
{
"start": "0:22:48.087188",
"text": " like absolutely extraordinary. So yeah, and that got me really excited."
},
{
"start": "0:22:52.647188",
"text": " Like the idea that we can still use Python and all of this stuff we've got in Python,"
},
{
"start": "0:22:56.647188",
"text": " but we can just very slightly tweak some things and get literally like a thousand"
},
{
"start": "0:23:02.007188",
"text": " times upwards performance out of the things that matter. That's really exciting."
}
]
},
{
"speaker": "B",
"start": "0:23:07.403438",
"stop": "0:23:38.909063",
"transcript": [
{
"start": "0:23:07.403438",
"text": " Yeah. I'm curious, like how come this wasn't thought of before?"
},
{
"start": "0:23:11.163438",
"text": " It's not like the concept of a language superset hasn't, you know,"
},
{
"start": "0:23:15.723438",
"text": " hasn't has, it has, it's completely new, but all, as far as I know,"
},
{
"start": "0:23:19.503438",
"text": " all the previous Python interpreter approaches,"
},
{
"start": "0:23:22.143438",
"text": " like the alternate runtime approaches are, you know, like they, they, they were"
},
{
"start": "0:23:25.703438",
"text": " more, they're more sort of conforming to standard Python,"
},
{
"start": "0:23:29.643438",
"text": " but never really tried this additional approach of augmenting the language."
},
{
"start": "0:23:33.043438",
"text": " I wonder if you have any insights there on, on like, why,"
},
{
"start": "0:23:36.443438",
"text": " like why is this a breakthrough?"
}
]
},
{
"speaker": "A",
"start": "0:23:38.909063",
"stop": "0:25:11.772188",
"transcript": [
{
"start": "0:23:38.909063",
"text": " Yeah, that's a really interesting question. So Jeremy Howard's piece talks about this thing"
},
{
"start": "0:23:43.469063",
"text": " called MLIR, which I hadn't heard of before, but this was another Chris Latner project. He built"
},
{
"start": "0:23:49.469063",
"text": " LLVM as a low-level virtual machine that you could build compilers on top of. And then MLIR"
},
{
"start": "0:23:56.429063",
"text": " was this one that he initially kicked off at Google. And I think it's part of TensorFlow"
},
{
"start": "0:24:00.509063",
"text": " and things like that, but it was very much optimized for multiple cores and GPU access"
},
{
"start": "0:24:05.549063",
"text": " and all of that kind of thing. And so my reading of Jeremy Howard's article is that they basically"
},
{
"start": "0:24:10.989063",
"text": " built Mojo on top of MLIR. So they had a huge, huge starting point where they knew this technology"
},
{
"start": "0:24:19.389063",
"text": " better than anyone else. And because they had this very, very robust high-performance basis that they"
},
{
"start": "0:24:25.069063",
"text": " could build things on, I think maybe they're just the first people to try and build a high-level"
},
{
"start": "0:24:31.389063",
"text": " language with MLIR, with some extra things. So it feels like they're basically taking a whole bunch"
},
{
"start": "0:24:36.909063",
"text": " of ideas people have been sort of experimenting with over the last decade and bundled them all"
},
{
"start": "0:24:41.789063",
"text": " together with exactly the right team, the right level of expertise. And it looks like they've got"
},
{
"start": "0:24:45.549063",
"text": " the thing to work. But yeah, I mean, I'm very intrigued to see, especially once this is actually"
},
{
"start": "0:24:50.909063",
"text": " available and we can start using it, Jeremy Howard is someone I respect very deeply. And he's hyping"
},
{
"start": "0:24:57.629063",
"text": " this thing like crazy, right? His headline is, and he's not the kind of person who hypes things"
},
{
"start": "0:25:02.029063",
"text": " if they're not worth hyping. He said, Mojo may be the biggest programming language advanced in"
},
{
"start": "0:25:05.949063",
"text": " decades. And from anyone else, I'd kind of ignore that headline, but from him, it really means"
},
{
"start": "0:25:11.069063",
"text": " something."
}
]
},
{
"speaker": "B",
"start": "0:25:11.772188",
"stop": "0:25:49.082813",
"transcript": [
{
"start": "0:25:11.772188",
"text": " Yes, because he doesn't hype things up randomly."
},
{
"start": "0:25:14.412188",
"text": " Yeah, and he's a noted skeptic of Julia, which is also another data science hot topic."
},
{
"start": "0:25:20.572188",
"text": " But from the TypeScript and web development worlds, there has been a dialect of TypeScript"
},
{
"start": "0:25:24.972188",
"text": " that was specifically optimized to compile to WebAssembly, which I thought was promising"
},
{
"start": "0:25:30.212188",
"text": " and then eventually never really took off."
},
{
"start": "0:25:33.592188",
"text": " But I like this approach because I think more frameworks should essentially be languages"
},
{
"start": "0:25:39.052188",
"text": " and recognize that they are language supersets and maybe work on compilers that work on them."
},
{
"start": "0:25:43.972188",
"text": " And that is the, by the way, that's the direction that React is going right now."
},
{
"start": "0:25:47.652188",
"text": " So fun times."
}
]
},
{
"speaker": "A",
"start": "0:25:49.082813",
"stop": "0:25:56.102813",
"transcript": [
{
"start": "0:25:49.082813",
"text": " Ha! TypeScript's an interesting comparison actually because TypeScript is effectively"
},
{
"start": "0:25:52.922813",
"text": " a superset of JavaScript, right? It is, but there's..."
}
]
},
{
"speaker": "B",
"start": "0:25:54.449063",
"stop": "0:25:58.465313",
"transcript": [
{
"start": "0:25:54.449063",
"text": " It is, but there's not a corporate angle."
},
{
"start": "0:25:56.649063",
"text": " It's purely its types, right?"
}
]
},
{
"speaker": "A",
"start": "0:25:58.313438",
"stop": "0:26:04.590938",
"transcript": [
{
"start": "0:25:58.313438",
"text": " Gotcha. Right, so I guess Mojo is the superset of Python, but the emphasis is absolutely on tapping into the performance stuff."
}
]
},
{
"speaker": "B",
"start": "0:26:05.232188",
"stop": "0:26:07.004063",
"transcript": [
{
"start": "0:26:05.232188",
"text": " Right."
}
]
},
{
"speaker": "C",
"start": "0:26:07.004063",
"stop": "0:26:59.046563",
"transcript": [
{
"start": "0:26:07.004063",
"text": " care about. Yeah, the one thing I found is very similar to the early days of TypeScript."
},
{
"start": "0:26:12.764063",
"text": " There was the most important thing was that it's incrementally adoptable, you know, because"
},
{
"start": "0:26:17.964063",
"text": " people had JavaScript code bases and they wanted to incrementally like add the main value"
},
{
"start": "0:26:22.924063",
"text": " prop for TypeScript was reliability and the static typing. And with Mojo, we see it being"
},
{
"start": "0:26:28.684063",
"text": " basically anyone who's a target, a large enterprise user of Mojo or even researchers, like they're"
},
{
"start": "0:26:34.044063",
"text": " all going to be coming from a hardcore background in Python and have large existing libraries. And"
},
{
"start": "0:26:40.604063",
"text": " the question will be for what use cases will Mojo be like a really good fit for that incremental"
},
{
"start": "0:26:47.164063",
"text": " adoption where you can still tap into your massive like Python existing infrastructure,"
},
{
"start": "0:26:53.244063",
"text": " workflows, data tooling, etc. And, you know, what is what does that path to adoption look like?"
}
]
},
{
"speaker": "B",
"start": "0:27:00.042188",
"stop": "0:27:35.547188",
"transcript": [
{
"start": "0:27:00.042188",
"text": " We don't know because it's a wait-listed language, which people were complaining about."
},
{
"start": "0:27:04.962188",
"text": " The Mojo creators were saying something about they had to scale up their servers and I'm"
},
{
"start": "0:27:08.282188",
"text": " like, what language requires a central server?"
},
{
"start": "0:27:10.322188",
"text": " So it's a little bit sus, a little bit like there's a cloud product already in place and"
},
{
"start": "0:27:15.042188",
"text": " they're waiting for it, but we'll see."
},
{
"start": "0:27:17.442188",
"text": " We'll see."
},
{
"start": "0:27:18.442188",
"text": " Mojo is actually promising."
},
{
"start": "0:27:19.442188",
"text": " And I actually want more programming language innovation this way."
},
{
"start": "0:27:22.962188",
"text": " I was complaining years ago that programming language innovation is all about stronger"
},
{
"start": "0:27:26.522188",
"text": " types, all about more functional, more strong types everywhere."
},
{
"start": "0:27:30.022188",
"text": " And this is the first one is actually much more practical, which I really enjoy."
},
{
"start": "0:27:33.602188",
"text": " This is why I wrote about self-provisioning runtimes."
}
]
},
{
"speaker": "D",
"start": "0:27:36.627188",
"stop": "0:28:20.755313",
"transcript": [
{
"start": "0:27:36.627188",
"text": " And I mean, this is kind of related to the post, right?"
},
{
"start": "0:27:40.067188",
"text": " Like if you stop all of a sudden,"
},
{
"start": "0:27:42.747188",
"text": " we're like the models are all the same"
},
{
"start": "0:27:44.387188",
"text": " and we can improve them."
},
{
"start": "0:27:45.227188",
"text": " Like where can we get the improvements?"
},
{
"start": "0:27:47.067188",
"text": " It's like better runtimes, better languages,"
},
{
"start": "0:27:49.627188",
"text": " better tooling, better data collection."
},
{
"start": "0:27:51.787188",
"text": " Yeah."
},
{
"start": "0:27:53.227188",
"text": " If I were a founder today,"
},
{
"start": "0:27:54.867188",
"text": " I wouldn't worry as much about the model maybe,"
},
{
"start": "0:27:57.507188",
"text": " but I would say, okay,"
},
{
"start": "0:27:58.627188",
"text": " what can I build into my product?"
},
{
"start": "0:28:00.027188",
"text": " And like, or what can I do at the engineering level"
},
{
"start": "0:28:02.627188",
"text": " that maybe it's not model optimization"
},
{
"start": "0:28:04.587188",
"text": " because everybody's working on it."
},
{
"start": "0:28:07.347188",
"text": " Like you said, it's like,"
},
{
"start": "0:28:08.187188",
"text": " why haven't people thought of this before?"
},
{
"start": "0:28:09.747188",
"text": " It's like, it's definitely super hard,"
},
{
"start": "0:28:12.267188",
"text": " but I'm sure that if you're like Google"
},
{
"start": "0:28:14.227188",
"text": " or you're like OpenAI or you're like Databricks,"
},
{
"start": "0:28:16.667188",
"text": " you got smart enough people"
},
{
"start": "0:28:17.827188",
"text": " that can think about these problems."
},
{
"start": "0:28:19.067188",
"text": " So hopefully we see more of this."
}
]
},
{
"speaker": "B",
"start": "0:28:21.430313",
"stop": "0:28:43.502813",
"transcript": [
{
"start": "0:28:21.430313",
"text": " You need a planner."
},
{
"start": "0:28:22.870313",
"text": " Okay, I promise to keep this relatively tight."
},
{
"start": "0:28:25.430313",
"text": " I know Simon on a beautiful day."
},
{
"start": "0:28:27.070313",
"text": " It is a very nice day in California."
},
{
"start": "0:28:28.630313",
"text": " I wanted to go through a few more points"
},
{
"start": "0:28:30.430313",
"text": " that you have pulled out Simon"
},
{
"start": "0:28:32.230313",
"text": " and just give you the opportunity to rant and riff"
},
{
"start": "0:28:35.230313",
"text": " and what have you."
},
{
"start": "0:28:37.510313",
"text": " Are there any other points from going back"
},
{
"start": "0:28:39.230313",
"text": " to the sort of Google OpenAI mode documents"
},
{
"start": "0:28:41.790313",
"text": " that you felt like we should dive in on?"
}
]
},
{
"speaker": "A",
"start": "0:28:44.380313",
"stop": "0:29:03.449063",
"transcript": [
{
"start": "0:28:44.380313",
"text": " I mean, the really interesting stuff there is the strategy component, right?"
},
{
"start": "0:28:47.580313",
"text": " This idea that Facebook accidentally stumbled into leading this because they put out this model that everyone else is innovating on top of."
},
{
"start": "0:28:55.880313",
"text": " And there's a very open question for me as to would Facebook re-license llama to allow for commercial usage?"
}
]
},
{
"speaker": "B",
"start": "0:29:02.537813",
"stop": "0:29:05.119688",
"transcript": [
{
"start": "0:29:02.537813",
"text": " Is there some rumor? Is that today?"
}
]
},
{
"speaker": "A",
"start": "0:29:06.705938",
"stop": "0:29:08.764688",
"transcript": [
{
"start": "0:29:06.705938",
"text": " Is there a rumor about that? That would be interesting."
}
]
},
{
"speaker": "B",
"start": "0:29:08.714063",
"stop": "0:29:13.506563",
"transcript": [
{
"start": "0:29:08.714063",
"text": " Yeah, I saw something about Zuck saying that he would release the llama weights officially."
}
]
},
{
"speaker": "A",
"start": "0:29:13.506563",
"stop": "0:29:18.130313",
"transcript": [
{
"start": "0:29:13.506563",
"text": " Oh my goodness, no, that I admit, that is, that's huge!"
}
]
},
{
"speaker": "B",
"start": "0:29:17.168438",
"stop": "0:29:19.919063",
"transcript": [
{
"start": "0:29:17.168438",
"text": " Let me confirm the tweet. Let me find the tweet and then..."
},
{
"start": "0:29:19.568438",
"text": " Okay."
}
]
},
{
"speaker": "A",
"start": "0:29:19.514063",
"stop": "0:29:57.904688",
"transcript": [
{
"start": "0:29:19.514063",
"text": " Okay, because essentially I met somebody from Facebook machine learning research a couple"
},
{
"start": "0:29:23.674063",
"text": " of weeks ago and I pressed them on this and they said basically they don't think it'll"
},
{
"start": "0:29:28.434063",
"text": " ever happen because if it happens and then somebody does horrible fascist stuff with"
},
{
"start": "0:29:32.234063",
"text": " this model, all of the headlines will be Murg releases a monster into the world."
},
{
"start": "0:29:37.034063",
"text": " So a couple of weeks ago his feeling was that it's just too risky for them to allow it to"
},
{
"start": "0:29:43.074063",
"text": " be used like that."
},
{
"start": "0:29:44.174063",
"text": " But you know, a couple of weeks is a couple of months in the AI world."
},
{
"start": "0:29:47.254063",
"text": " So yeah, it feels to me like strategically Facebook should be jumping right on this because"
},
{
"start": "0:29:52.594063",
"text": " this puts them at the very lead of open source innovation around this stuff."
}
]
},
{
"speaker": "B",
"start": "0:29:58.309688",
"stop": "0:30:11.050313",
"transcript": [
{
"start": "0:29:58.309688",
"text": " So I've pinned the tweet talking about Zuck and Zuck saying that meta will open up mama."
},
{
"start": "0:30:02.949688",
"text": " It's from the founder of Obsidian, which gives it a slight bit more credibility,"
},
{
"start": "0:30:06.389688",
"text": " but it is the only tweet that I can find about it. So completely unsourced."
}
]
},
{
"speaker": "B",
"start": "0:30:13.851563",
"stop": "0:30:41.577188",
"transcript": [
{
"start": "0:30:13.851563",
"text": " We shall see. I mean, I have friends within Meta, I should just go ask them."
},
{
"start": "0:30:18.091563",
"text": " But yeah, I mean, one interesting angle on the memo actually is that, and they were linking to"
},
{
"start": "0:30:23.611563",
"text": " this in a doc, which is apparently like Facebook got a bunch of people to do, because they never"
},
{
"start": "0:30:29.371563",
"text": " released it for commercial use, but a lot of people went ahead anyway and optimized and"
},
{
"start": "0:30:33.451563",
"text": " built extensions and stuff. They got a bunch of free work out of open source,"
},
{
"start": "0:30:38.411563",
"text": " which is an interesting strategy. This, okay. I don't know if I have to like."
}
]
},
{
"speaker": "A",
"start": "0:30:41.239688",
"stop": "0:30:56.055938",
"transcript": [
{
"start": "0:30:41.239688",
"text": " Oh, I've got an exciting piece of news. I've just heard from somebody with contacts at Google that"
},
{
"start": "0:30:46.759688",
"text": " they've heard people in Google confirm the leak, that that document was indeed a legit Google"
},
{
"start": "0:30:50.679688",
"text": " document, which I don't find surprising at all, but I'm now up to a 10 out of 10 on whether that's real."
}
]
},
{
"speaker": "B",
"start": "0:30:57.456563",
"stop": "0:31:21.807188",
"transcript": [
{
"start": "0:30:57.456563",
"text": " Excellent. Excellent. Yeah, it is fascinating. Yeah, I mean, the strategy is really interesting."
},
{
"start": "0:31:03.456563",
"text": " I think Google has been definitely sleeping on monetizing. You know, I heard someone call when"
},
{
"start": "0:31:10.096563",
"text": " Google Brain and DeFi merged that it was like goodbye to the Xerox PARC of our era."
},
{
"start": "0:31:15.536562",
"text": " And it definitely feels like Google X and Google Brain were definitely Xerox PARCs of our era."
},
{
"start": "0:31:20.496562",
"text": " I guess we all benefit from that."
}
]
},
{
"speaker": "A",
"start": "0:31:21.756563",
"stop": "0:32:06.492188",
"transcript": [
{
"start": "0:31:21.756563",
"text": " So one thing I'll say about the Google side of things, like there was a question earlier,"
},
{
"start": "0:31:26.316563",
"text": " why are Google so worried about this stuff? And I think it's just all about the money,"
},
{
"start": "0:31:30.236563",
"text": " you know, the engine of money at Google is Google search and Google search ads,"
},
{
"start": "0:31:34.396563",
"text": " and who uses ChatGPT on a daily basis like me will have noticed that their usage of Google"
},
{
"start": "0:31:39.196563",
"text": " has dropped like a stone, because there are many, many questions that ChatGPT,"
},
{
"start": "0:31:44.716563",
"text": " which shows you no ads at all, is a better source of information for than Google now."
},
{
"start": "0:31:49.756563",
"text": " And so yeah, I'm not, it doesn't surprise me that Google would see this as an existential threat,"
},
{
"start": "0:31:53.676563",
"text": " because whether or not they can, you know, barred, it's actually, it's not great, but it exists,"
},
{
"start": "0:31:59.196563",
"text": " but it hasn't yet either. And if I've got a chat bot that's not showing me ads and chat bot that"
},
{
"start": "0:32:03.996563",
"text": " is showing me ads, I'm going to pick the one that's not showing me ads."
}
]
},
{
"speaker": "B",
"start": "0:32:07.977188",
"stop": "0:32:13.073438",
"transcript": [
{
"start": "0:32:07.977188",
"text": " Yeah, yeah, I agree."
},
{
"start": "0:32:09.777188",
"text": " I did see a prototype of Bing with ads, Bing chat with ads."
}
]
},
{
"speaker": "A",
"start": "0:32:13.562813",
"stop": "0:32:15.250313",
"transcript": [
{
"start": "0:32:13.562813",
"text": " I haven't seen the prototype yet, no."
}
]
},
{
"speaker": "B",
"start": "0:32:14.204063",
"stop": "0:32:24.447188",
"transcript": [
{
"start": "0:32:14.204063",
"text": " Yeah, anyway, it will come obviously and then we will choose, we'll go out of our way to"
},
{
"start": "0:32:20.484063",
"text": " avoid ads just like we always do."
},
{
"start": "0:32:22.124063",
"text": " We'll need ad blockers in chat."
},
{
"start": "0:32:23.964063",
"text": " Excellent."
}
]
},
{
"speaker": "A",
"start": "0:32:24.430313",
"stop": "0:34:07.165313",
"transcript": [
{
"start": "0:32:24.430313",
"text": " So I feel like on the safety side, there are basically two areas of safety that I sort of"
},
{
"start": "0:32:29.870313",
"text": " split it into. There's the science fiction scenarios, the AI breaking out and killing"
},
{
"start": "0:32:34.830313",
"text": " all humans and creating viruses and all of that kind of thing, the sort of the terminator stuff."
},
{
"start": "0:32:38.990313",
"text": " And then there's the people doing bad things with AI. And that's latter one is the one that I think"
},
{
"start": "0:32:45.310313",
"text": " is much more interesting. And that, you know, because you could like things like romance scams,"
},
{
"start": "0:32:50.190313",
"text": " romance scams already take billions of dollars from vulnerable people every year. Those are very"
},
{
"start": "0:32:55.070313",
"text": " easy to automate using existing tools. I'm pretty sure the Quna 13B running on my laptop could spin"
},
{
"start": "0:33:00.910313",
"text": " up a pretty decent romance scam if I was evil and wanted to use it for that. So that's the kind of"
},
{
"start": "0:33:06.190313",
"text": " thing where I get really nervous about it. Like the fact that these models are out there and"
},
{
"start": "0:33:10.270313",
"text": " bad people can use these bad things. Most importantly, at scale, like romance scamming,"
},
{
"start": "0:33:15.710313",
"text": " you don't need a language model to pull up one romance scam. But if you want to pull up a thousand"
},
{
"start": "0:33:19.390313",
"text": " at once, the language model might be the thing that helps you scale to that point."
},
{
"start": "0:33:23.790313",
"text": " And yeah, in terms of the science fiction stuff and also a model on my laptop that can guess what"
},
{
"start": "0:33:28.510313",
"text": " comes next in a sentence, I'm not worried that that's going to break out of my laptop and destroy"
},
{
"start": "0:33:32.190313",
"text": " the world. I get slightly nervous about the huge number of people who are trying to build the AGIs"
},
{
"start": "0:33:38.590313",
"text": " on top of this models, the baby AGI stuff and so forth. But I don't think they're going to get"
},
{
"start": "0:33:42.590313",
"text": " anywhere. I feel like if you actually wanted a model that was a threat to humanity, a language"
},
{
"start": "0:33:47.550313",
"text": " model would be a tiny corner of what that thing was actually built on top of. You'd need goal"
},
{
"start": "0:33:52.350313",
"text": " setting and all sorts of other bits and pieces. So yeah, for the moment, the science fiction stuff"
},
{
"start": "0:33:56.510313",
"text": " doesn't really interest me. Although it is a little bit alarming seeing more and more of the"
},
{
"start": "0:34:01.230313",
"text": " very senior figures in this industry sort of tip the hat and say, you know, we're getting a little"
},
{
"start": "0:34:05.630313",
"text": " bit nervous about this stuff now."
}
]
},
{
"speaker": "B",
"start": "0:34:08.802188",
"stop": "0:34:19.298438",
"transcript": [
{
"start": "0:34:08.802188",
"text": " Yeah, so that would be Jeff Henson and Yosha Benjio."
},
{
"start": "0:34:11.302188",
"text": " And I saw this meme this morning that Yann LeCun was like happily saying this is fine."
},
{
"start": "0:34:17.082188",
"text": " Being the third Fury Code winner."
}
]
},
{
"speaker": "A",
"start": "0:34:20.378438",
"stop": "0:34:36.274688",
"transcript": [
{
"start": "0:34:20.378438",
"text": " But you'll see a lot of the people who've been talking about AI safety for the longest"
},
{
"start": "0:34:24.218438",
"text": " are getting really angry about the science fiction scenarios because they're like,"
},
{
"start": "0:34:27.138438",
"text": " no, the thing that we need to be talking about is the harm that you can cause with these"
},
{
"start": "0:34:30.658438",
"text": " models right now today, which is actually happening."
},
{
"start": "0:34:33.418438",
"text": " And the science fiction stuff kind of ends up distracting from that."
}
]
},
{
"speaker": "B",
"start": "0:34:36.240937",
"stop": "0:35:16.909688",
"transcript": [
{
"start": "0:34:36.240937",
"text": " I love it. Okay, so Eliezer, I don't know how to pronounce his name, Eliezer has a list of ways"
},
{
"start": "0:34:43.600937",
"text": " that AI will kill us post. And I think Simon, you could write a list of ways that AI will harm us,"
},
{
"start": "0:34:48.720937",
"text": " but not kill us, right? Like the non-science fiction actual harm ways. I think I haven't seen"
},
{
"start": "0:34:54.720937",
"text": " an actual list of like, hey, romance scams, spam. I don't know what else, but that could be very"
},
{
"start": "0:35:00.560937",
"text": " interesting as a practical, like here are the situations we need to guard against because they"
},
{
"start": "0:35:05.760937",
"text": " are more real today, and that we need to think about and warn about. Obviously you've been a big"
},
{
"start": "0:35:10.400937",
"text": " advocate of prompt injection awareness, even though you can't really solve it. And I"
},
{
"start": "0:35:15.040937",
"text": " would throw a scenario with you, but yeah."
}
]
},
{
"speaker": "A",
"start": "0:35:17.264063",
"stop": "0:35:34.459687",
"transcript": [
{
"start": "0:35:17.264063",
"text": " Yeah, prompt injection is a whole other side of this, which is, I mean, if you want a risk"
},
{
"start": "0:35:21.024063",
"text": " from AI, the risk right now is that everyone is building systems that attackers"
},
{
"start": "0:35:25.664063",
"text": " can trivially subvert into stealing all of their private data, unlocking their house,"
},
{
"start": "0:35:30.144063",
"text": " all of that kind of thing. So that's another very real risk that we have today."
}
]
},
{
"speaker": "B",
"start": "0:35:35.033438",
"stop": "0:35:48.904688",
"transcript": [
{
"start": "0:35:35.033438",
"text": " Yeah, I think in all our personal bios, we should edit in prompt"
},
{
"start": "0:35:38.433438",
"text": " injections already. Like on my website, I want to add in a"
},
{
"start": "0:35:41.513438",
"text": " personal prompt injection so that if I get scraped, like I"
},
{
"start": "0:35:44.793438",
"text": " will know if someone's like reading from a script, right,"
},
{
"start": "0:35:47.153438",
"text": " that that is generated by the AI bot."
}
]
},
{
"speaker": "A",
"start": "0:35:49.073438",
"stop": "0:35:59.080313",
"transcript": [
{
"start": "0:35:49.073438",
"text": " I've seen people do that on LinkedIn already and they get they get recruiter emails saying,"
},
{
"start": "0:35:53.233438",
"text": " hey, I didn't read your bio properly and I'm just an AI script, but would you like a job?"
},
{
"start": "0:35:57.953438",
"text": " Yeah, it's fascinating."
}
]
},
{
"speaker": "B",
"start": "0:36:00.413438",
"stop": "0:36:39.293438",
"transcript": [
{
"start": "0:36:00.413438",
"text": " Okay, all right. So trying to stay roughly on topic. I think"
},
{
"start": "0:36:05.213438",
"text": " this memo is, you know, it's a peek under the curtain of"
},
{
"start": "0:36:07.613438",
"text": " the internal panic within Google, I think it is very, very"
},
{
"start": "0:36:11.413438",
"text": " valid. I'm not so sure they should care so much about small"
},
{
"start": "0:36:14.613438",
"text": " models, or like on device models. But the other stuff is"
},
{
"start": "0:36:17.973438",
"text": " interesting. There was a comment at the end that you had,"
},
{
"start": "0:36:21.653438",
"text": " Simon, about, as for OpenAI, OpenAI themselves, it"
},
{
"start": "0:36:25.133438",
"text": " doesn't matter, right. So this is a Google document talking"
},
{
"start": "0:36:27.573438",
"text": " about Google's position in the market and what Google should be"
},
{
"start": "0:36:29.893438",
"text": " doing. But they had a comment here about OpenAI. They also"
},
{
"start": "0:36:32.353438",
"text": " say OpenAI has no moat, which is an interesting and brave"
},
{
"start": "0:36:35.613438",
"text": " comment, given that OpenAI, you know, is the leader in a lot"
},
{
"start": "0:36:38.493438",
"text": " of these innovations."
}
]
},
{
"speaker": "A",
"start": "0:36:39.917812",
"stop": "0:37:33.192187",
"transcript": [
{
"start": "0:36:39.917812",
"text": " Well, one thing I will say is that I think we might have identified who within Google"
},
{
"start": "0:36:44.437812",
"text": " wrote this document."
},
{
"start": "0:36:45.437812",
"text": " Now there's a version of it floating around with a name and I looked them up on LinkedIn."
},
{
"start": "0:36:49.957812",
"text": " They're heavily involved in the AI corner of Google."
},
{
"start": "0:36:53.377812",
"text": " So my guess is that at Google, and I've worked for companies where I'll put out a"
},
{
"start": "0:36:57.037812",
"text": " memo, I'll write up a Google doc and I'll email it around and it's nowhere near the"
},
{
"start": "0:37:00.917812",
"text": " official position of the company or the executive team."
},
{
"start": "0:37:04.377812",
"text": " It's somebody's opinion."
},
{
"start": "0:37:05.837812",
"text": " I think it's more likely that this particular document is somebody who works for Google"
},
{
"start": "0:37:10.277812",
"text": " and has an opinion and distributed it internally and then it got leaked."
},
{
"start": "0:37:13.277812",
"text": " I don't know if it necessarily represents Google's institutional thinking about this."
},
{
"start": "0:37:18.637812",
"text": " I think it probably should."
},
{
"start": "0:37:19.797812",
"text": " Again, this is such a well-written document."
},
{
"start": "0:37:22.497812",
"text": " It's so well argued that if I was an executive at Google and I read that, I would be thinking"
},
{
"start": "0:37:27.317812",
"text": " pretty hard about it."
},
{
"start": "0:37:28.317812",
"text": " But yeah, I don't think we should see it as the official, secret, internal position of"
},
{
"start": "0:37:32.557812",
"text": " the company."
}
]
},
{
"speaker": "B",
"start": "0:37:33.462188",
"stop": "0:37:37.562813",
"transcript": [
{
"start": "0:37:33.462188",
"text": " Yeah, first of all, I might promote that person because he's clearly more definitely"
}
]
},
{
"speaker": "A",
"start": "0:37:36.803437",
"stop": "0:37:43.469063",
"transcript": [
{
"start": "0:37:36.803437",
"text": " Oh definitely! He's really... I would hire this person on the strength of that document."
}
]
},
{
"speaker": "B",
"start": "0:37:42.844687",
"stop": "0:37:54.505313",
"transcript": [
{
"start": "0:37:42.844687",
"text": " But second, this is more about OpenAI."
},
{
"start": "0:37:44.764688",
"text": " Like, I'm not interested in Google's official statements about OpenAI."
},
{
"start": "0:37:47.884687",
"text": " But I was just interested in, like, his assertion that OpenAI doesn't have a moat."
},
{
"start": "0:37:51.284688",
"text": " That's a bold statement."
},
{
"start": "0:37:52.764688",
"text": " I don't know."
},
{
"start": "0:37:53.764688",
"text": " It's got the best people."
}
]
},
{
"speaker": "C",
"start": "0:37:55.568438",
"stop": "0:39:10.442813",
"transcript": [
{
"start": "0:37:55.568438",
"text": " Well, I would say two things here. One, it's really interesting, just at a meta point, that they even approached it this way of having this public leak."
},
{
"start": "0:38:05.048438",
"text": " It kind of talks a little bit to the fact that they felt that doing it internally wasn't going to get anywhere, or maybe this speaks to some of the middle management type stuff within Google."
},
{
"start": "0:38:18.328438",
"text": " And then to the point about OpenAI not having a moat, I think for large language models, it will be over time a race to the bottom, just because the switching costs are so low compared with traditional cloud and SaaS."
},
{
"start": "0:38:33.288437",
"text": " And yeah, there will be differences in quality, but over time, if you look at the limit of these things, I think Sam Altman has been quoted a few times saying that the price of, marginal price of intelligence will go to zero."
},
{
"start": "0:38:47.528438",
"text": " And the marginal price of energy powering that intelligence will also head toward zero over time. And in that world, if you're providing large language models, they become commoditized, like, yeah, what is your moat at that point?"
},
{
"start": "0:38:59.528438",
"text": " I don't know, I think they're extremely well positioned as a team and as a company for leading this space. I'm not that worried about that. But it is something from a strategic point of view to keep in mind about large language models becoming a commodity."
}
]
},
{
"speaker": "A",
"start": "0:39:11.134687",
"stop": "0:40:12.120937",
"transcript": [
{
"start": "0:39:11.134687",
"text": " So it's quite short, so I think it's worth just reading the entire section. It says,"
},
{
"start": "0:39:15.214687",
"text": " epilogue, what about OpenAI? All of this talk of open source can feel unfair given OpenAI's current"
},
{
"start": "0:39:20.414688",
"text": " closed policy. Why do we have to share if they won't? That's talking about Google sharing."
},
{
"start": "0:39:24.494688",
"text": " But the fact of the matter is, we are already sharing everything with them in the form of the"
},
{
"start": "0:39:28.814687",
"text": " steady flow of poached senior researchers. Until we stem that tide, secrecy is a moot point."
},
{
"start": "0:39:33.774687",
"text": " I love that. That's so salty. And in the end, OpenAI doesn't matter. They are making the same"
},
{
"start": "0:39:39.694687",
"text": " mistakes that we are in their posture relative to open source, and their ability to maintain an edge"
},
{
"start": "0:39:43.934688",
"text": " is necessarily in question. Open source alternatives can and will eventually eclipse"
},
{
"start": "0:39:48.334687",
"text": " them unless they change their stance. In this respect, at least, we can make the first move."
},
{
"start": "0:39:52.574687",
"text": " So the argument this paper is making is that Google should go like meta and just lean right"
},
{
"start": "0:39:58.334687",
"text": " into open sourcing it and engaging with the wider open source community much more deeply,"
},
{
"start": "0:40:02.654687",
"text": " which OpenAI have very much signaled they are not willing to do. But yeah,"
},
{
"start": "0:40:07.054688",
"text": " it's... read the whole thing. The whole thing is full of little snippets like that. It's just super"
},
{
"start": "0:40:11.694687",
"text": " fun."
}
]
},
{
"speaker": "B",
"start": "0:40:12.593438",
"stop": "0:40:21.840938",
"transcript": [
{
"start": "0:40:12.593438",
"text": " Yes. Yes. Read the whole thing. I also appreciated the timeline, because it set a lot of really great context for people who are out of the loop. So yeah. Yeah."
}
]
},
{
"speaker": "D",
"start": "0:40:20.946563",
"stop": "0:40:28.658438",
"transcript": [
{
"start": "0:40:20.946563",
"text": " Yeah."
},
{
"start": "0:40:21.946563",
"text": " The final conspiracy theory is that this got leaked right before Sundar and Satya and Sam"
},
{
"start": "0:40:26.866563",
"text": " all went to the White House this morning."
}
]
},
{
"speaker": "B",
"start": "0:40:29.451563",
"stop": "0:40:33.113438",
"transcript": [
{
"start": "0:40:29.451563",
"text": " Yeah, I haven't caught up."
},
{
"start": "0:40:31.051563",
"text": " The White House."
}
]
},
{
"speaker": "D",
"start": "0:40:34.277813",
"stop": "0:40:41.635313",
"transcript": [
{
"start": "0:40:34.277813",
"text": " I just saw the photos of them going into the White House."
},
{
"start": "0:40:38.917813",
"text": " I haven't seen any post-meeting updates."
}
]
},
{
"speaker": "B",
"start": "0:40:41.736563",
"stop": "0:40:45.195938",
"transcript": [
{
"start": "0:40:41.736563",
"text": " I think it's a big win for Anthropic to be at that table."
}
]
},
{
"speaker": "D",
"start": "0:40:44.554688",
"stop": "0:40:49.752188",
"transcript": [
{
"start": "0:40:44.554688",
"text": " Oh yeah, for sure. And Cohere is not there. I was like, hmm, interesting. Well, anyway."
}
]
},
{
"speaker": "B",
"start": "0:40:48.351562",
"stop": "0:41:03.623438",
"transcript": [
{
"start": "0:40:48.351562",
"text": " Well, anyway, yeah, they need, they need some help. Okay. Well, I promised to keep this relatively"
},
{
"start": "0:40:53.871562",
"text": " tight. The spaces do tend to have a tendency of dragging on, but before we go, anything that you"
},
{
"start": "0:40:58.751563",
"text": " all want to plug, anything that you're working on currently, maybe go around to Simon. Are you"
},
{
"start": "0:41:02.591562",
"text": " still working on Datasette?"
}
]
},
{
"speaker": "A",
"start": "0:41:04.247812",
"stop": "0:42:01.622812",
"transcript": [
{
"start": "0:41:04.247812",
"text": " I am, I am. I'm having a bit of it. So Datasette's my open source project that I've been working on."
},
{
"start": "0:41:08.967812",
"text": " It's about helping people analyze and publish data. I'm having an existential crisis about it at"
},
{
"start": "0:41:13.767812",
"text": " the moment because I've got access to the ChatGPT code interpreter mode and you can upload a SQLite"
},
{
"start": "0:41:19.527812",
"text": " database to that and it will do all of the things that are on my roadmap for the next 12 months."
},
{
"start": "0:41:25.127812",
"text": " So that's frustrating. So basically my interest in Datasette and"
},
{
"start": "0:41:29.687812",
"text": " AI are rapidly crossing over. I'm thinking a lot harder about the AI features that I need to build on"
},
{
"start": "0:41:34.727812",
"text": " top of Datasette to make sure it stays relevant in a world where ChatGPT can do most of the stuff that it"
},
{
"start": "0:41:39.287812",
"text": " does already. But yeah, I think I'll plug my blog, simonwillison.net. I am now updating it daily with"
},
{
"start": "0:41:44.807812",
"text": " stuff because AI moves so quickly and I have a Substack newsletter, which is effectively my blog,"
},
{
"start": "0:41:50.727812",
"text": " but in email form sent out a couple of times a week, which please subscribe to that or RSS feed"
},
{
"start": "0:41:55.927812",
"text": " on my blog or whatever because I'm trying to keep track of all sorts of things and I'm publishing"
},
{
"start": "0:42:00.727812",
"text": " a lot at the moment."
}
]
},
{
"speaker": "B",
"start": "0:42:02.500313",
"stop": "0:42:14.329688",
"transcript": [
{
"start": "0:42:02.500313",
"text": " Yes, you are. And we love you very much for it because you are a very good reporter and"
},
{
"start": "0:42:07.220313",
"text": " technical deep diver into things. It's all the things. Thank you, Simon. Travis,"
},
{
"start": "0:42:10.980313",
"text": " are you ready to announce? I guess you've announced it somewhere."
},
{
"start": "0:42:13.780313",
"text": " Yeah."
}
]
},
{
"speaker": "C",
"start": "0:42:13.739062",
"stop": "0:42:47.944688",
"transcript": [
{
"start": "0:42:13.739062",
"text": " Yeah, yeah. So I just founded a company. I'm working on a framework for building reliable"
},
{
"start": "0:42:19.499063",
"text": " agents that aren't toys and focused on more constrained use cases. I look at AGI and these"
},
{
"start": "0:42:27.179062",
"text": " AutoGPT-type projects as jumping all the way to self-driving. And we want to start with something"
},
{
"start": "0:42:34.459062",
"text": " narrower and really focus on reliable primitives to start. And that'll be an open source TypeScript"
},
{
"start": "0:42:40.139062",
"text": " project. I'll be releasing the first version of that soon. And that's it. Follow me on here for"
},
{
"start": "0:42:45.819062",
"text": " this type of stuff. I tweet everything AI."
}
]
},
{
"speaker": "B",
"start": "0:42:49.497188",
"stop": "0:42:51.792188",
"transcript": [
{
"start": "0:42:49.497188",
"text": " Plug your ChatGPT bot while you still can."
}
]
},
{
"speaker": "C",
"start": "0:42:51.302813",
"stop": "0:43:21.492188",
"transcript": [
{
"start": "0:42:51.302813",
"text": " Oh yeah, the ChatGPT Twitter bot is at about 125,000 followers now."
},
{
"start": "0:42:55.302813",
"text": " It's still running. I'm not sure if it's..."
},
{
"start": "0:42:57.142813",
"text": " Burn your credits."
},
{
"start": "0:42:57.942812",
"text": " Yeah."
},
{
"start": "0:42:59.262813",
"text": " Can you say how much you spent, actually?"
},
{
"start": "0:43:01.182813",
"text": " No, no. Well, I think probably in total like a thousand bucks or something,"
},
{
"start": "0:43:04.622813",
"text": " but it's sponsored by OpenAI, so I haven't actually spent any real money."
},
{
"start": "0:43:08.502812",
"text": " What? That's awesome."
},
{
"start": "0:43:10.702813",
"text": " Yeah, yeah."
},
{
"start": "0:43:11.782813",
"text": " Well, once I changed..."
},
{
"start": "0:43:13.142813",
"text": " Originally, the logo was the ChatGPT logo, and it was the green one,"
},
{
"start": "0:43:16.422812",
"text": " and then they hit me up and asked me to change it,"
},
{
"start": "0:43:18.582813",
"text": " so now it's a purple logo, and they're cool with that."
}
]
},
{
"speaker": "B",
"start": "0:42:58.845938",
"stop": "0:42:59.993438",
"transcript": [
{
"start": "0:42:58.845938",
"text": " Yeah."
}
]
},
{
"speaker": "B",
"start": "0:43:22.116563",
"stop": "0:43:45.066563",
"transcript": [
{
"start": "0:43:22.116563",
"text": " Yeah, yeah."
},
{
"start": "0:43:22.936563",
"text": " Apparently they're sending takedown notices to people with GPT stuff now."
},
{
"start": "0:43:26.316563",
"text": " So it's yeah, it's a little bit of a gray area."
},
{
"start": "0:43:28.376563",
"text": " I want to write more on moats."
},
{
"start": "0:43:29.836563",
"text": " I've been actually collecting and meaning to write a piece on moats."
},
{
"start": "0:43:32.296563",
"text": " And today I saw the memo."
},
{
"start": "0:43:33.396563",
"text": " I was like, Oh, okay."
},
{
"start": "0:43:34.296563",
"text": " Like I guess today's the day we talk about moats."
},
{
"start": "0:43:36.376563",
"text": " So thank you all."
},
{
"start": "0:43:37.696563",
"text": " Thanks."
},
{
"start": "0:43:38.016563",
"text": " Thanks, Simon."
},
{
"start": "0:43:38.436563",
"text": " Thanks Travis for jumping on and thanks to all the audience for"
},
{
"start": "0:43:41.776563",
"text": " engaging on this with us."
},
{
"start": "0:43:42.776563",
"text": " We'll continue to engage on Twitter, but thanks to everyone."
}
]
},
{
"speaker": "A",
"start": "0:43:45.319688",
"stop": "0:43:47.395312",
"transcript": [
{
"start": "0:43:45.319688",
"text": " Thanks everyone. Bye."
}
]
},
{
"speaker": "B",
"start": "0:43:45.758438",
"stop": "0:43:48.728437",
"transcript": [
{
"start": "0:43:45.758438",
"text": " Thanks, everyone."
},
{
"start": "0:43:46.758438",
"text": " Bye."
},
{
"start": "0:43:47.758438",
"text": " Bye."
},
{
"start": "0:43:48.758438",
"text": " All right."
},
{
"start": "0:43:49.758438",
"text": " Thanks, everyone."
}
]
}
],
"speakers": {
"count": 4,
"labels": [
"A",
"B",
"C",
"D"
],
"embeddings": {
"A": [
-10.288838451405312,
15.88706106331665,
-8.580840458188,
14.813819809654166,
19.889977360829814,
10.391620582381693,
4.9425874515307235,
-9.191938463184568,
7.361467844383451,
23.57769397035655,
14.423380546606264,
10.530243806777493,
1.044411207159375,
-2.116610813620759,
1.942548754151623,
-11.802397890504272,
8.25484183782135,
12.882854126799401,
-13.540736160553356,
3.6021088784272295,
19.67143385361008,
18.87568798350052,
2.975980005526383,
14.035715271449012,
-28.73295407218947,
24.918418361228845,
10.459250235309204,
22.443226003261756,
12.993583469895558,
1.9322321139893213,
-22.680322839300118,
-22.040394225386933,
24.77246365793759,
-3.9713545881751333,
5.680434498750228,
-12.505799824115082,
12.738636799755373,
1.3592274036842384,
6.752684724747225,
-16.60220530913082,
1.3970030093314274,
14.367670001281011,
-1.7873102896528075,
-20.907257795895614,
-8.79586337071969,
14.356660028934707,
-0.6371896036324047,
15.735320771005862,
12.77282665877433,
6.8232357525931935,
13.742926360449813,
-6.454421533406564,
-6.963018483319227,
-26.202304260583507,
-5.846243292148713,
-12.24539188130404,
-8.002746242377537,
8.979945236565692,
4.692456718577656,
7.777864306159909,
-17.15816430278337,
1.8076091312012443,
-14.488423308291074,
-4.458542342480671,
-22.873981501315793,
0.7393481992702517,
-5.046347638082576,
7.631792601382571,
-16.667402598439597,
-13.454458793918121,
-36.34691278612803,
-36.07809763703318,
22.661818083280526,
-9.567153724533181,
2.0137483881017757,
-7.731890591875916,
6.879548163969062,
24.834536243577503,
25.51243735512807,
16.298877184094476,
-17.15916824568708,
3.596069247104063,
1.4029492527853498,
-3.79562925485458,
15.99369212365081,
1.4456182682502365,
-20.658095230838093,
-0.5055367704776544,
12.103935789270697,
-25.272579376696655,
-20.111120362405753,
-14.523381831110566,
2.8370683574901214,
14.456430099361265,
-9.27917314908423,
33.611017969629124,
-22.090599850781025,
-26.167079332299412,
-25.378746798040257,
-6.715270035796696,
1.193164530533573,
-32.13001774405942,
-17.612518335796064,
38.68609195762861,
17.50813406244907,
-5.089837542060571,
-18.560680406079406,
38.46660741690605,
-7.4598259883627875,
9.573908736620936,
-20.376338961229305,
36.38163120500625,
-1.4772497880686488,
17.906251623025458,
11.376297910714388,
-9.219121630436607,
-7.805178891765767,
24.211999726119554,
4.982757242257884,
-26.07397666020644,
-9.796572739535588,
-1.2158296129681052,
-8.541766701310518,
40.84907286720617,
-5.729974766692427,
0.6387239162429487,
10.012536772050051,
14.439372286040683,
-1.0940560325319575,
-18.98970444453141,
-15.151205156595328,
10.013287523721663,
9.705505958501442,
23.202286525910335,
-6.816629119714471,
-0.7028487691404446,
-30.42976219734798,
-1.9990333677953227,
-2.1096404748340505,
-22.89271162283082,
-8.766325909171774,
-25.740626066583136,
4.832609582102917,
-23.27927531337454,
-6.706252997463,
15.29375128690911,
9.614477913442348,
0.8231015909344904,
-32.486998533563956,
-2.542546990435637,
15.194753583823701,
7.245943634482544,
0.7086727647184734,
35.67810806868568,
-13.972239949155622,
-11.88139296622969,
3.3311549267964438,
3.0036217888395877,
20.246312799157632,
6.060682978747608,
-30.720879056564872,
-39.8823890294584,
-51.316082172923615,
-2.426894599682465,
7.334661673075826,
-9.243390996948749,
-3.1130716509905474,
4.350667710112045,
18.31710727158047,
-3.3307286036174952,
19.16330017501189,
13.454345715541711,
-31.670942593129382,
-18.081453663710917,
-15.99951158577737,
13.07270670214013,
24.11289693690866,
2.9039737765219003,
11.278167386594717,
6.249637785290058,
-51.56305917766359,
-1.1602940204616135,
-5.725735661741863,
42.07050971083698,
14.83362628178974,
5.6947647130286825,
21.342572462880828,
10.930675347171176,
24.619380834682417,
7.819526035412554,
-13.323200765752329,
-19.88741771685743
],
"B": [
-13.65046364336454,
-4.5991541053992115,
17.264492021601306,
-32.31121875397755,
-9.118279280095212,
7.601837179716315,
-6.1589207822724426,
9.754626765596605,
9.132816959615257,
7.687384625557053,
16.11453856764913,
-10.230073845373012,
9.121278289623588,
12.406248062256276,
-32.54331050065248,
-1.4035487305878975,
10.434523283452142,
-17.992329063626514,
12.523657071140304,
23.226003899910086,
2.022200308401586,
-6.357677318223235,
-14.102913768940338,
-8.994454322529796,
-12.137755657306464,
16.72344161826698,
-14.6763797591297,
1.5671696650372748,
14.519326830752686,
-0.5778896966575613,
-1.8018803441750046,
-0.5004556549937111,
6.189655474730482,
7.908125959375312,
13.30347878139895,
8.107354979369342,
-3.516177244469967,
-8.936352782233355,
-5.841016453863116,
7.155149580722851,
0.024324212199195082,
6.225954885551776,
-6.920288085142322,
-21.550914456125682,
4.221839441224465,
21.697494918015458,
5.623173051382406,
-0.8966059031478222,
2.6129532685553967,
11.685373478726921,
5.546475466605839,
13.34241676824059,
-16.052775204412995,
-12.75992176165132,
-1.0984111624404806,
-7.860691769012427,
-22.6893677484114,
6.193488036791997,
-4.159992053709625,
-17.046464421616488,
27.512264871003556,
-5.895350258508252,
17.626341867040153,
-7.627853779823436,
11.025469296576906,
-6.7075854980737395,
15.953167081708125,
-16.979243349129717,
-7.181376461895065,
-11.094149851051062,
-4.66155468175166,
-2.945401945405605,
-2.810844684190769,
-21.98461494203488,
-2.2958719191776367,
-12.691465318318635,
19.880693056662825,
-35.26950735007961,
1.8488349021637733,
-0.008530025880056087,
14.208190219455131,
5.9356126058437075,
5.431532757363992,
-15.973123410844364,
19.881848505241518,
5.449663181238619,
5.330084683665135,
15.93824975478811,
-17.159270633373502,
-20.877080219201407,
-7.437357688552783,
28.022699596974782,
3.6384017172174343,
32.29235516475058,
-10.168066668209715,
17.328369511592587,
13.964845017870987,
7.5860025362216446,
-11.46964445397412,
0.46408968917148385,
-11.323546503298225,
14.970644906664418,
-11.615166502813164,
-11.200900928390828,
5.42081518576301,
4.972758392963612,
5.143625516468541,
7.07550690963233,
20.96508285459396,
14.639968992243455,
-7.118179116171591,
8.49104386465901,
-46.52404798974041,
-20.5413946351675,
-25.94948625795659,
17.635937573761712,
-8.165735065226201,
-4.072632510909018,
4.209440581939083,
-8.388339466450285,
24.173543922574215,
14.702887481842358,
5.0880303722920734,
13.698754208767244,
-29.61255360069898,
-14.589900187058868,
12.656302903483672,
-6.4823719674930995,
-16.61427299676398,
-8.785064788176202,
13.44745609268334,
14.610234373794896,
25.060538081314903,
-13.087635156996114,
28.43410015584272,
-2.543651139597381,
9.69526209921382,
-13.181641865083117,
4.212742612402004,
0.2585689645389488,
16.539862236550213,
0.13962686601616586,
-3.0117726891191934,
-26.329406511814444,
14.832920012393194,
-16.542041017913668,
39.10817534494832,
-24.826352895597747,
2.8917596989671517,
4.463800963855325,
-2.91195989121595,
0.25292188694452405,
8.431401796581135,
16.724308373567975,
11.602875237017003,
3.1777346259245665,
-5.186385544024186,
8.61681647876333,
6.850976338361861,
13.643321788154138,
12.006476970724425,
11.369314380462388,
-19.447143681906404,
12.52318664589586,
-21.17060979707275,
20.437097428532805,
-15.17136944147046,
-20.438915573966533,
-15.262894373512399,
-17.78160404052494,
3.397840059394882,
-0.6711622729516777,
0.026678617498783194,
-2.622006104263883,
-23.23129780804389,
7.980030399297797,
-2.4517566798023522,
-16.309093274667685,
2.6470931064515804,
3.6082765111870785,
33.320000212094996,
-28.76993599971624,
5.739248139641572,
4.825234142487524,
-31.884681997023147,
-6.482215202699041,
-18.1453341174037,
-10.465768375524343,
4.374811584920288,
-27.797385335773384,
-2.3090553713907016,
-17.272059185105107
],
"C": [
-29.543788388844323,
12.454226355971954,
-5.90999493017694,
11.926123813682834,
-13.063834534888104,
52.9087017756161,
-1.7242363543551884,
28.03595982692626,
-12.799057881321108,
11.047680028465475,
7.70870588809482,
20.27071810406828,
6.806317355084926,
0.9827892905754956,
37.99898240937303,
-16.844264158613107,
24.828146590557253,
-23.390366667211385,
26.366168189588237,
-27.740911938496772,
-17.83411676997512,
2.0269771809576627,
6.182037549623788,
1.2004079435752586,
16.88405811109027,
-22.57022107406941,
11.704472294048495,
-3.178414012661109,
32.47041794772838,
-35.12880802996638,
-3.2930623224420397,
-4.228080890612229,
9.224927033760295,
-7.016237728725337,
12.105917689256847,
15.3280079471889,
-20.801108254288337,
-17.71211723279759,
7.542226306565731,
19.71034335478252,
-19.743048081959845,
10.500224875471252,
9.8803121686212,
7.426993111900936,
-1.7745472950610037,
15.704592212542241,
-9.766834521579822,
-32.77658994657051,
20.275035389166565,
14.150007159936408,
8.65965043078228,
17.689235568313837,
-10.518271413333633,
4.884926935740094,
9.136620908687775,
-12.586649560873246,
-33.329556775935174,
-31.063070420529407,
-13.27924160119434,
16.018940086001592,
40.21354582951822,
1.5938364827664153,
-20.724200162337578,
33.59847337533833,
-7.754191470790909,
-4.683292339196137,
15.718547421146727,
42.58289932836234,
-16.77455364494681,
-0.8942481895773023,
-23.408754788626965,
-9.309898695384232,
8.280141897748326,
-39.35173418979771,
12.665613879891684,
13.858452467132713,
11.915885780672758,
-6.363020909995367,
16.813205191960133,
23.552987836153303,
9.333383757778327,
-26.946201591302227,
29.14273448492386,
-2.769905367859991,
20.33968739471425,
0.467585184716231,
-8.556113923264556,
21.837195180671056,
45.93139521701973,
31.084288896972243,
-11.402797090548562,
16.95075807247309,
16.724021168577817,
-2.866392911997753,
6.860968241595538,
-7.375628511082159,
-22.632638408812728,
3.8868808425617534,
14.288420421365345,
9.615820208987916,
-0.8942707096561625,
5.891903722473723,
-13.686425199506035,
-3.442404946752271,
-4.245648842791165,
3.6990153085035384,
-2.031602794797906,
6.190166654365413,
18.197870574505057,
40.80748336678309,
5.414275477083182,
11.88315704998624,
-20.973613660585208,
-31.675864306309364,
33.14739124092251,
7.739099923003193,
4.994049082730155,
7.8489985331917635,
4.50425284196341,
-12.849148249652476,
-27.65528692334693,
21.411558166763047,
0.72454502430249,
-3.7774330693077105,
-10.06859705868541,
-46.83003745805349,
16.40582885410612,
3.17691028921168,
2.2364770384057207,
-2.5734680537422236,
-41.846332180578976,
19.704608493153557,
15.45444391711428,
12.2152803593272,
14.551320058020206,
-7.999158551698555,
7.501410515190495,
-19.8117179477024,
9.811360588079323,
-5.171009516232456,
-16.222483682152188,
-3.0617549624828575,
12.798059864222559,
14.406149514792508,
31.994588938393687,
17.20655203174808,
13.98235502648301,
15.172658698135786,
-12.594192007050806,
-18.11130071242461,
16.282972236252316,
25.50444484549774,
-2.900678497519155,
-8.898418746543246,
17.155772618357314,
-16.601375729602168,
-19.15427143197449,
17.7999093098274,
-34.26459316859972,
16.1945359583206,
-19.58050364924895,
19.637516231265817,
-18.819963594080335,
-19.271592252281295,
26.201960282083643,
-51.89654981853157,
5.663643094683251,
31.453949760324907,
-19.021500306944084,
-17.44336043334428,
29.263416562385643,
24.78687443599006,
4.505235698477059,
-10.509557033808816,
-8.470379087373935,
4.764156666807378,
13.150316643925429,
13.176217542809104,
9.952204691464077,
-17.872728296356158,
21.69365128982508,
-0.6050225224625525,
-38.21349093771928,
28.825743734672105,
-9.603294648966088,
-0.33890132791471667,
19.529819161847037,
12.918097594843367,
32.187282619886844,
-16.353653372641627,
-18.373236964602707,
16.271598330705075
],
"D": [
23.44160132351759,
9.055487144904143,
27.580357070844453,
31.280299895518535,
20.5430307506441,
19.80399333672212,
-47.808399956505575,
-10.769995152950287,
-2.086020929130583,
21.859143893326724,
-2.1813579129038363,
23.359770096502864,
-32.711649098106335,
-19.478250752496827,
-2.0020313733886623,
8.096756086626032,
-3.4483456954460694,
-39.42309625084336,
-1.9828456741598275,
25.00276671202333,
5.120436223389514,
6.964414339378342,
9.377023476385721,
0.2113014289287028,
-23.74706691422978,
40.5075257898451,
-5.594238547859965,
5.841190422641802,
30.15984202102498,
-55.03157343520775,
2.5660451766908974,
-25.334880357390052,
-11.965459845053989,
8.721525513390834,
18.053209663511396,
2.0740389100250765,
16.77387964631523,
8.558634238120437,
-14.853144554639453,
-5.177669849985086,
-10.492898219273433,
-7.313651110767177,
25.344666769509917,
6.438560447177371,
-7.2720904871779455,
-15.586615424836054,
12.001615807380494,
-19.927492919252128,
-4.9057511822698086,
-13.054654050249237,
-14.766716881906865,
-8.135160435393855,
-4.605111085482545,
-8.704130691929242,
17.828844103995745,
9.92926490560241,
-8.441421793329084,
-17.26573883540727,
5.4430656827868775,
29.859867136220675,
7.803674410273497,
34.07481253684104,
-16.044604660221584,
-2.2911026025996417,
-5.08719113415426,
-15.02945079221516,
-12.278926469754797,
-20.73143153335597,
-2.4065736087106235,
-1.0994846974877086,
-5.449594560557523,
-4.966740047945096,
0.7545796163245901,
-7.969439379594012,
18.23809266144091,
-15.56917934765754,
-2.9716630408389344,
-19.730370044439763,
-30.662672190515845,
-2.0100615453142843,
-9.083740551544874,
-18.95994766809986,
12.637500241868668,
-11.616056506563952,
11.365903451417884,
-12.875501805329108,
-2.4920025712785288,
-1.6975723099097744,
21.473343597271956,
12.493624205256367,
-20.37274860070498,
11.210647186063028,
-13.221419240473896,
16.760946545582097,
11.061347985757632,
8.151942830734157,
7.823558310885821,
-6.003459393512458,
-2.3149232429049507,
-38.392691846366404,
5.583926093856896,
16.547557460202896,
39.03679731192889,
11.608806940701765,
-28.74070551857218,
12.68846093894293,
-5.118520333314197,
-7.754119946864677,
15.551182256868898,
-12.206553441036057,
9.04421659632007,
-6.535094860991514,
-9.929382617509848,
25.28033308651265,
5.084063406870071,
-20.834460972682447,
14.465020803310951,
-9.519060950222853,
-5.0037898153804985,
-22.21100018797694,
-9.187111136058832,
2.727604891869936,
9.973738375732356,
31.238055892758542,
3.5470455011587827,
-0.06056085667732331,
-16.93471211476906,
-5.334661615521622,
-24.025865050821423,
-3.133184504535821,
4.0839206503603505,
17.19220897260013,
13.102172894403338,
8.075390434010071,
18.003628600019592,
9.785935884022766,
-3.6570179144419765,
-32.02960028626897,
3.2543356946023465,
-10.892785675135684,
35.66511597074904,
20.0751046900478,
1.7971664014700297,
-22.290279637183154,
22.511222387588507,
42.12984887544099,
29.799946560516013,
-8.458382879459375,
-31.35619200672115,
2.9370037574983865,
27.238495733286882,
5.210155347756505,
-28.9665569429462,
45.17340074036572,
11.684824290744087,
4.952525878425789,
-29.61061380575369,
3.9026330297158376,
-34.35985581966134,
2.8025489871050357,
-4.771420031114742,
6.159773921577243,
19.825792318814106,
2.7438359038216307,
41.52622431892532,
0.11530101031216013,
-12.250316390675401,
-19.523140506466497,
-35.63811719095385,
12.435228903215748,
35.46951292871355,
4.5285761293258755,
8.457948269607785,
-11.715852177894867,
-2.4455996420179904,
-51.35880346985551,
0.22515207938514314,
19.905954793565446,
-7.292286093808241,
8.358342489244434,
-14.200365151387748,
-27.372639728995324,
-3.877035542181483,
-3.287269562394736,
4.562799141222091,
3.8496004047065235,
11.052349250763655,
6.887343369730589,
7.744352468954365,
-12.38616776760869,
-15.365954854049477,
1.6133632925884411
]
}
}
}