Episode 903

May 19th, 2025 · #AI #OpenSource #Editors

Fork Yeah! Microsoft open sourcing Copilot

Erich Gamma and Kai Maetzel from the VS Code team announce open sourcing AI features from GitHub Copilot into core VS Code.

Topic 0 00:00

Transcript

Wes Bos

Welcome to Syntax. Today, we've got a really exciting one. Big announcement about VS Code and GitHub Copilot.

Wes Bos

We've got Erich Gamma and Kai Maetzel on, who've been working on VS Code since early on, for many, many years, and they're gonna talk to us all about it. So welcome, Kai. Welcome, Erich. Thanks so much for coming on. Thanks for having us. Yeah. Real quick, let's do a quick introduction before we hit the announcement.

Wes Bos

Just to give people some context of who you are within the VS Code project.

Topic 1 00:33

Erich started the VS Code project 15 years ago

Guest 1

So I joined Microsoft fifteen years ago. I started the VS Code project, and I was leading it for quite a while.

Guest 1

And more recently, you know, I stepped down and became more an IC, an individual contributor, and Kai is now running the show.

Guest 2

And I joined the VS Code project

Wes Bos

almost ten years ago now. Man. Ten years. Wow. Yeah. Is VS Code really fifteen years old? What was the initial,

Guest 1

like, creation? Like, why did you start that project in the early days? So, we noticed there was this new kind of developer that we fondly referred to as the web developers, and we wanted to also give them tools so that we'd be relevant to them. Right? We had a big tool, great Visual Studio, all integrated, one of the best, but we wanted to disrupt this a little bit and really attract the web developers that don't like the all-integrated one. They like command lines. They like to build their own tools, use many languages, whatever. That was the creation moment. And from that, it went on.

Wes Bos

Oh, well, it worked. You were just telling me a story about how, like, I'm a web developer, I'm probably the target audience. I moved over from Sublime Text many years ago. Are you able to tell that story really quickly? Because I thought that was kinda funny. Yeah. No. You were one of the Sublime people, eventually.

Guest 1

Right? And we saw that. I guess you wrote a book, had a theme about it, and we really talked about how we could convince you to switch to VS Code, right? And we saw you and we said, yeah, if we were able to convince Wes Bos, boom. Nice. Okay. That's a good milestone. Right? And then, I guess, you switched, and you gave feedback. And so we somehow accomplished that. I still miss a book from you on VS Code, though. Oh, yeah. Yeah. I've been considering it. I,

Topic 2 02:34

Wes recently switched off Sublime key mappings after using them for 7 years

Wes Bos

maybe at some point. There's quite a bit of little tricks and whatnot that I have. I actually recently got off the Sublime Text key mapping, which I was on for, like, seven years, and now I feel like a baby again. I can't remember all of them.

Scott Tolinski

I remember when VS Code first came out, it did seem like it caught fire immediately, because, yeah, we had Atom. We had Sublime. We had all these editors. But it does feel like a substantial portion of the community moved right over to VS Code. From your perspectives, was that how it felt internally as well?

Guest 1

We sometimes say it's an overnight success ten years in the making.

Topic 3 03:11

VS Code's growth described as overnight success after 10 years of work

Guest 1

Right? So we worked on it quite a bit before. When we announced it, we had already worked on it for five years, in a monthly rhythm, shipping every month.

Guest 1

And we then gradually grew. Right? It wasn't a big bang.

Guest 1

Right? So Mhmm.

Guest 1

And it was always this growth approach. Right? We had continual growth, and we are now at about 40,000,000.

Guest 1

Forty million. Yeah. 40,000,000,

Wes Bos

like, for, like, users? Four zero. Four zero.

Guest 1

Yeah.

Guest 2

Oh, yeah. That's nuts. The growth rate, right, when you compare it to today's growth rates for some other tools, right, then maybe it was not that big. But for the time, it was quite impressive, right, how we actually grew over time. And I remember the first million we hit, right, that was a big party. And then the second million we hit, that was a big party.

Guest 2

And then, you know, the third million, so the parties got a little bit smaller.

Guest 2

And it was like, okay, we doubled it again, and we doubled it again. And then it was like, oh, yeah, okay, now we hit 40,000,000. That's... Yeah.

Guest 1

Amazing. I remember the 1,000,000 cake. For 1,000,000, we had a cake, but I don't remember the other cakes.

Scott Tolinski

And if you want to see all of the errors in your application, you'll want to check out Sentry at sentry.io/syntax.

Scott Tolinski

You don't want a production application out there that, well, you have no visibility into, in case something is blowing up and you might not even know it. So head on over to sentry.io/syntax.

Scott Tolinski

Again, we've been using this tool for a long time, and it totally rules. Alright.

Wes Bos

One more thing real quick. May 29, Denver, Colorado. We're doing a Syntax meetup. You gotta be there if you're anywhere within a horse ride of Denver, Colorado. We got the folks from Vercel, the folks from Mux, folks from Sentry. We're all coming out, of course. We're gonna have Scott there. CJ is gonna be there. I'm gonna be there. It's gonna be a really great time. If you wanna grab some tickets, they're free, but you gotta get tickets. Go to syntax.fm/meetup.

Wes Bos

VS Code, absolutely huge. In the last, probably, what, four years or so, Copilot, GitHub Copilot, absolutely huge, sort of changing how we write code, and it's been a massive one. So the big announcement, lay it on us. What is it today? Well, it's about open sourcing.

Topic 4 05:42

Announcing open sourcing AI features from Copilot into VS Code

Guest 1

VS Code has been open source for, what, ten years.

Guest 1

And we now want to open source the AI features from Copilot into VS Code. So it's about open source, which we talked about with VS Code's success before.

Guest 1

Yeah. And open source is one of the reasons I think we got all this developer love. Right? We try hard to be a good open source project. We try to be transparent, make PR submission easy, listen to feedback, and so on. And, yeah, open sourcing is very important, dear to our heart. It's the way we develop, and that's why this announcement is so exciting to me. Incredible. That's really exciting. So, yeah, what does that mean? Like, explain to us how Copilot and VS Code work right now, because, like, is it a plug-in, or is it a closed part of VS Code? Kai, let me explain how it is, and then you explain the whole backstory, because you initiated the whole thing and you can tell it very well. Okay. Right? So today, it's an open core pattern we are using.

Guest 1

Right? VS Code is open source, of course. It has some AI-enabling features in there, which are open source, but it's open core. The Copilot extension itself is closed source. And there was a good reason why we started like that, and why there is an evolution.

Guest 2

And that's where I jump in. Yeah. Yes. Exactly.

Guest 2

So, yeah, you know, when you think about the story as such, it actually started with GitHub Copilot being just completions.

Guest 2

Right? Yeah. And at the time, we had really close working relationships with the folks at GitHub Next who did this in the beginning. Right? Because it was about what the UI should look like. And this is when ghost text, in a way, came in, ghost text completions. Before, we really just had suggest widgets. And there, in a way, a lot of the smarts were actually just in the extension.

Guest 2

It was a fine-tuned 3.5 model.

Guest 2

Right? So there was a lot of craft in the prompt creation and all of these kinds of things. And Copilot, when I say Copilot in this case, I mean the completions, they were really a runaway success.

Guest 2

Right? You saw a lot of folks seeing this and saying, oh, this is great. This is magical. There were still some who were kinda upset that it was not correct, right, that it sometimes proposed something that was not quite spot on. But overall, I think it was a runaway success.

Guest 2

And then GPT-4. Right? So the ChatGPT announcement and GPT-4 came. And I think at this moment, everyone really, even the last person, started thinking about what all the other things are that you can actually do with AI. And what became relatively obvious was that the UI pieces of AI need to be really within the flow of what you actually do. So it really starts permeating all of this. But at the same point in time, it was still extremely hard to actually get reasonably functioning AI functionality. There was still a lot of work that you had to put into the prompts, the preprocessing of the data in order to create the prompt, and then the post-processing. The models were really bad at answering in the format that you wanted them to answer in. So there was a lot of code that you actually had to write in order to make that happen. Many different kinds of prompts for all the different kinds of use cases and so on. So that was, in a way, the technical side. On the business side, I think you also have to think about how no one really knew how to make a business out of this early on. It was like, oh, this is insanely expensive to run.

Guest 2

Right? So what do we do? We don't wanna give anything away, because no one really knows where the market goes and so on. So in the end, we ended up, as Erich said, in this kind of open core approach, which is that a whole bunch of the UI elements actually started more and more moving into core. We had to enable some of them through, for example, APIs. So we created more APIs and so on, and then parts were in the Copilot extension. Then think about what really happened over the last few years. I mean, you just go, let's say, to OpenAI and the OpenAI Cookbook. It literally tells you, oh, here are the best ways of how you write your prompt in order to get, you know, reliable answers.

Guest 2

Anthropic, the same story. Right? Google just had this 68-page PDF, right, for how to prompt and so on. Then you have things that OpenAI open sourced, right, where you can literally follow along with how the whole prompt is created and all of these kinds of things. I think the landscape really massively shifted over the last year, maybe two years, right, in how mature it is, how good the models actually became, even working with small little prompts. Prompting overall became much simpler. So if you take this all together, right, along with a now better understanding of the business models: what you really have to have is on the servers. That is pretty much your secret sauce. And what you have on the clients, in a way, became less and less of, how should I say, the specific pieces that really differentiate you. And from that point of view, right, we think it's really the right time to put all of this in open source. You want people to be able to go and say, oh, VS Code, I cloned it. Now I have cloned it. Now I built it. Now I launch it. Now I have Code OSS. Right? This is the open source version that you run. And with that, you can pretty much type something in the chat, right, and follow along. You can debug all the way how that actually makes it to the server, right, and what the server actually responds. And then you can see how these things come together. You can optimize it. You can find problems.

Guest 2

You can fix issues. You can create PRs if you want. So I think that's really the story. I think, overall, the same reasoning applies to open sourcing this that applied to VS Code in the first place. Right? Which is, when you achieve a certain maturity of a market, right, then it makes sense to actually have a solution that is really open for all developers, right, where people can see what is happening behind the scenes, can participate, can have discussions with you about the features that you actually want to have, and so on. And then not just rely on us to actually do something about it; there's a discussion, but they can also come and contribute. Right? So we strongly believe that this moment is here now. That's why we made that happen. Interesting.

Topic 5 12:15

Want developers to clone, build, and launch open source VS Code with AI features

Guest 1

Maybe I'd like to add one thing I see also. For instance, we have competition, and what they really show us, and what always made me jealous: when you are an extension of VS Code, you are really a little bit bolted on. Right? With this move, we wanna have the same thing. Right? It's an AI editor. The AI is not just bolted on. It's really a core part of what you do.

Guest 1

Right? It's there. It's not that you have to install whatever. It should just be this experience. And maybe also one thing: the whole prompt thing that Kai mentioned is in the extension right now. The whole prompting, how we do it and talk to the model, the prompt construction, that's in the extension.

Guest 1

And we also wanna make this open source, get more eyes on it, right, get contributions on it, and, yeah, it has security benefits and so on.

Guest 1

And that's a good thing. It's a very good thing for developers, isn't it? I like it. Because, yeah, for us as developers, it's a little bit stressful to straddle between open source and closed source. Mhmm. Right? We do both. And we notice we talk differently about open source versus closed source. Open source, we love to talk about, and we get love. Closed source is different, right, because it's not as friendly. People are more demanding because, I don't know, it's just different. And for us as a developer team, it's always stressful to wear both hats. So for us, it's a relief to have this be consistently open source, for the build process and for our thinking process.

Wes Bos

Yeah. So part of Copilot, and probably a big part of it, is figuring out what parts to send to the model. Correct me if I'm wrong, but I was always curious about, like, and we've had the folks from Windsurf on and whatnot, and we always ask them: what do you send? Like, are you looking at my clipboard? Are you looking at the open tabs? How do you crawl for that perfect context, and how does all that work? So with Copilot being open source, we'll be able to see exactly what is being sent to the model, and how, and then also improve upon it. Is that one of the big ideas here?

Guest 2

Certainly one of them. Right? Yeah. It changed a little bit. In the beginning, for example, you were absolutely right: the models didn't reasonably support tool calling.

Guest 2

Right? And from that point of view, you had to pretty much pack up everything you needed upfront and send it to the model. But then there was also the needle-in-the-haystack problem: when you send too much, right, the model kinda got confused, and these kinds of things. We still do this for completions, right, where we pretty much pack all of the context upfront and then let the model complete that. And there's still a lot of thinking going into what needs to be sent along. Like, for example, open tabs; we compute some additional kind of context depending on whether the language server actually supports that, you know, what types, for example, should be sent along in a stripped-out way and so on. But then there are also other parts where that doesn't play so much into it anymore, like agentic flows. In agentic flows, you use something that we sometimes internally refer to as breadcrumb prompts, where you pretty much say: there is a problem in the Problems panel, there's output in the terminal. And so you really just say these things, and then you give the model tools in order to retrieve this information when it thinks that it actually needs that information.

Guest 2

Oh, okay. And that is a much simpler approach, actually.

Guest 2

Right? And sometimes you have to help a little along with the right prompting approach, to say what kinds of aspects to prioritize in its considerations and so on. But overall, right, tool calling made that much simpler.
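To make the breadcrumb idea concrete, here is a minimal sketch of what such a prompt and its tool descriptors could look like. All names here (get_problems, get_terminal_output, the prompt wording) are hypothetical illustrations, not the actual Copilot implementation.

```ts
// Hypothetical sketch of a "breadcrumb" prompt with tools: the prompt only
// mentions that context exists, and the model pulls it in via tool calls
// when it decides it needs it. Names and wording are illustrative.
interface ToolDescriptor {
  name: string;
  description: string;
  parameters: object; // JSON schema for the tool's arguments
}

const tools: ToolDescriptor[] = [
  {
    name: 'get_problems', // hypothetical tool name
    description: 'Returns current diagnostics from the Problems panel.',
    parameters: { type: 'object', properties: {} },
  },
  {
    name: 'get_terminal_output', // hypothetical tool name
    description: 'Returns recent output of the active terminal.',
    parameters: { type: 'object', properties: {} },
  },
];

// Breadcrumbs: state that the context exists, not the content itself.
const systemPrompt = [
  'You are a coding agent working in the user workspace.',
  'There are diagnostics in the Problems panel.', // breadcrumb
  'There is fresh output in the terminal.', // breadcrumb
  // Prioritization hint, as described above:
  'If diagnostics look relevant, call get_problems before reading terminal output.',
].join('\n');

// An agent loop would send systemPrompt and tools to the model, execute the
// tool calls it returns, and feed the results back in as new messages.
```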

Guest 1

Interesting. And when you say prioritization, Kai, we also have infrastructure for that. Right? We use kind of TSX to describe prompts as trees, not just string concatenation.

Guest 1

And this gives us flexibility, right, in that the nodes in a tree have a priority, and we can add more or less depending on the context window.
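This priority-tree idea maps to VS Code's published prompt-tsx approach. A minimal sketch, assuming the element and prop names of the @vscode/prompt-tsx package (PromptElement, SystemMessage, UserMessage, priority); treat the details as illustrative rather than as Copilot's actual prompts.

```tsx
// Sketch in the spirit of @vscode/prompt-tsx: prompts are trees of elements,
// each node carries a priority, and lower-priority nodes are pruned first
// when the context window fills up.
import {
  BasePromptElementProps,
  PromptElement,
  SystemMessage,
  UserMessage,
} from '@vscode/prompt-tsx';

interface ChatPromptProps extends BasePromptElementProps {
  userQuery: string;
  fileContext: string;
}

class ChatPrompt extends PromptElement<ChatPromptProps> {
  render() {
    return (
      <>
        {/* Highest priority: kept while anything else remains. */}
        <SystemMessage priority={100}>
          You are a careful coding assistant.
        </SystemMessage>
        {/* The user's question matters more than surrounding context... */}
        <UserMessage priority={90}>{this.props.userQuery}</UserMessage>
        {/* ...so the file excerpt is the first thing to be dropped. */}
        <UserMessage priority={40}>
          Relevant file contents: {this.props.fileContext}
        </UserMessage>
      </>
    );
  }
}
```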

Guest 2

Yeah, this is how we render prompts. Right? What I was saying is more like, in the prompt itself, you tell the model what tools to look at first. For example: check, you know, if there is some problem, and if a problem is found, do this first. Don't look, for example, at the output of the terminal, or something along those lines. So you give it some guidance on how to proceed through the context. What Erich is referring to is, well, of course, right, we still have limited context windows.

Guest 2

Right? And there's a real trade-off here. You have context windows that are either technically constrained by the model, or you actually want to constrain them so that you get a certain throughput. So the rule of thumb clearly is: with the smallest possible prompt that you can send, you want to get the best possible outcome. And so if you pump the context window full and you get a certain outcome, the real question is how small you can make the prompt and still get the same outcome, assuming the outcome actually is good. And that is pretty much where our whole prompt rendering story comes in. What are the parts of the prompt that you can drop at which points in time? Like, once you hit the context window, once you hit a certain size for a certain portion. Think about history. Right? You don't wanna just pump history into the context window until the context window is full. So what's a natural way of capping this? At which point did the conversation flip, for example, to be about something else, so that this part is not relevant anymore? So these kinds of things.
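A hedged sketch of that history-capping idea: keep the newest turns within a fixed token budget and cut at a whole-turn boundary. The 4-characters-per-token estimate is a stand-in, not the tokenizer Copilot actually uses.

```ts
// Cap conversation history to a token budget, newest-first, so the prompt
// never fills the context window with stale turns.
interface Turn {
  role: 'user' | 'assistant';
  text: string;
}

// Rough stand-in tokenizer: ~4 characters per token.
const approxTokens = (s: string): number => Math.ceil(s.length / 4);

function capHistory(history: Turn[], budgetTokens: number): Turn[] {
  const kept: Turn[] = [];
  let used = 0;
  // Walk backwards: the newest turns are most likely still relevant.
  for (let i = history.length - 1; i >= 0; i--) {
    const cost = approxTokens(history[i].text);
    if (used + cost > budgetTokens) break; // cut at a whole-turn boundary
    kept.unshift(history[i]);
    used += cost;
  }
  return kept;
}
```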

Wes Bos

And does that include logic for different types of programming languages, or is that left up to the model? Like, does the Copilot extension have logic for how to treat TypeScript and JavaScript versus, like, Java?

Guest 2

So, actually, the prompts are not too specific.

Guest 2

So there's a history here.

Guest 2

Okay. And the history is: in the early days, absolutely.

Guest 2

Right? We had an approach that we called, for example, cookbooks.

Guest 2

Right? Where we were saying, oh, different languages and different language tools have, for example, different, like, different linters have different errors and different error messages and so on. And then we created these kinds of cookbooks saying, oh, if you're in a Python file and you use a particular linter and you see this particular kind of error, here are ways you can fix it and, you know, what it really means. And that was quite language-specific. Sometimes you also had to give hints about syntax and so on. And finally, that's not necessary anymore.

Guest 2

Right? Just as I said before, on the maturation curve, these kinds of things are way less necessary these days. What we still do, clearly language-specific, is what we talked about with completions context. Right? Because that is, I mean, clearly language-specific. What if a language supports, let's say, interfaces? Then you need to write some code in order to find the right interfaces that you wanna include

Guest 1

and so on. Right? I think what is also kind of semantic, but not language-specific, is when it comes to workspace knowledge. So the whole thing of how you find the relevant snippet to add to a prompt; that has a very sophisticated indexing infrastructure that is based on tree-sitter, you know, and AST know-how of ranges and so on, and then computing embeddings.

Guest 1

But that's

Scott Tolinski

that's done at the level of grammars. Right? It's not very easy. Mhmm. Yeah. This is all just so exciting to me, but I'm often wondering about, and I know we're getting into some different things here, but I was more curious about the implications of what this would mean for paid Copilot features within VS Code.

Guest 2

So now first of all, Copilot today has a free option.

Guest 2

Right? So you can just sign up and you're on free. There are some limits associated with it, right, as to how many premium requests you get to premium models. There is a base model that we have in Copilot, right, where you can continue; there are no limits on those calls and so on. With free, you can also bring your own key. So, yeah, you put in whatever API key you already have, and you pay for whatever you actually use.

Guest 2

So is there a relationship between paying for really advanced Copilot functionality and open source? No. Not really. Right? Because the point is still that someone has to pay for the compute, right, to run the LLMs and all of this. And that is really unrelated to the open source question. Open source means that you can follow along with everything that happens on the client. And you can see what we send to the servers, right, and what you pay for in the end, assuming that you pay and you're not on free, is pretty much the compute on the back end. Okay. And, like, for enterprises, some additional services like indemnification

Guest 1

and these kinds of things. Right? And the key thing to keep in mind, right, is bring your own key. This is the infrastructure where you can bring your API key, and then we go to your model directly, without going through the GitHub back-end service, right, to get the completion.

Guest 1

And that, I think, is a nice complement

Wes Bos

that we already have for this story. Right? Yeah. And is there any, like, secret sauce that you put on the server side of Copilot? And if so, because I was chatting with Burke on our livestream a couple weeks ago, and he's like, yeah, bring your own key: most of that you can run straight to the OpenAI model or Anthropic model or Google model, whatever. Mhmm. But he also said there is, like, a fine-tuned model. Is that true, for some parts of Copilot?

Guest 2

Yeah.

Guest 2

There's not just one. There are several. Think about it this way. Yes, you can bring your own key. But there are additional services that the GitHub Copilot service gives us. Right? And Erich made a reference to one of them: finding relevant source code snippets. So what happens, right, and when I say GitHub, I mean the team on the GitHub side, is that GitHub is actually indexing all of those repositories, all the open source repositories.

Guest 2

You can opt in for your private repositories to be indexed and so on. But what indexing really means, and I'm pretty sure you've heard this before, is that the code pretty much gets chunked, and then embeddings are computed for each of those chunks. Mhmm. Right? And then when we look for relevant code, for example, what happens is we take the query that the user types in, we compute the embeddings for that query, and then we pretty much go and find embeddings that are relatively close to it in the repository.
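A minimal sketch of that chunk-and-embed retrieval flow. The embed function is a placeholder for whichever embedding service is used; the cosine-similarity ranking is the standard technique being described here, not GitHub's actual code.

```ts
// Chunked code with precomputed embeddings, queried by similarity.
interface Chunk {
  file: string;
  text: string;
  embedding: number[];
}

// Assumed embedding service; stands in for whatever the back end provides.
declare function embed(text: string): Promise<number[]>;

function cosine(a: number[], b: number[]): number {
  let dot = 0, na = 0, nb = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    na += a[i] * a[i];
    nb += b[i] * b[i];
  }
  return dot / (Math.sqrt(na) * Math.sqrt(nb));
}

// Embed the user's query, then return the k chunks whose embeddings are
// closest to it; those become the prompt's workspace context.
async function findRelevant(query: string, index: Chunk[], k = 5): Promise<Chunk[]> {
  const q = await embed(query);
  return [...index]
    .sort((x, y) => cosine(q, y.embedding) - cosine(q, x.embedding))
    .slice(0, k);
}
```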

Guest 2

Right? And that, for example, is functionality that GitHub provides us. Is that secret sauce? There are some clever things being done, right, but I wouldn't call it secret sauce; it's clearly about, you know, the scalability of all of this. Think about it: I don't know what the latest numbers are, but something like 500,000,000 repositories on GitHub and so on. So there's a massive scaling infrastructure behind that. Yeah. Then there are some other use cases where, for example, new models are really good at editing files. Right? So, Mhmm, Sonnet has replace string.

Guest 2

Right? So you can search for a certain string, find it, and then replace it with something else. And the model is very reliable at giving you this. The new 4.1 models from OpenAI give you something similar; it's called apply patch.

Guest 2

Right? So they really give you this kind of patch format. And they're also very good at doing this. But there are a lot of other cases where models are actually not good at doing something like this.

Guest 2

But then there is another technique where you just go and tell the model: hey, pretty much give me a rewrite of the file, and for all of the code that is the same, like, stuff that has not changed, just say "here is existing code". Right? And then you have the original file plus that kind of rewritten file with the elided pieces.

Guest 2

You give this to a model to now fiddle it all together so that a reasonable outcome comes out. Right? Mhmm. And that, for example, is a custom model that does that. Okay. And we do that many times a day. So that needs to be cheap to run and so on. And when you want to run it cheap, right, then you need custom models: the smallest possible model that does a really good job at it. That's one example. And then the completions models are clearly custom.
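A naive sketch of the rewrite-with-elisions merge being described: the model returns the file with unchanged regions replaced by a marker, and a merge step stitches the rewrite back together with the original. Copilot reportedly uses a custom model for this step; the marker text and line-matching strategy below are illustrative only.

```ts
// Marker the model emits in place of unchanged code; the exact text is an
// assumption for this sketch.
const MARKER = '// ...existing code...';

function applyElidedRewrite(original: string, rewrite: string): string {
  const origLines = original.split('\n');
  const rwLines = rewrite.split('\n');
  const out: string[] = [];
  let cursor = 0; // next unconsumed line of the original

  for (let i = 0; i < rwLines.length; i++) {
    const line = rwLines[i];
    if (line.trim() !== MARKER) {
      // Concrete line from the model: emit it, and if it matches the
      // original at the cursor, consume that original line too.
      if (origLines[cursor] === line) cursor++;
      out.push(line);
      continue;
    }
    // Marker: copy original lines until the rewrite's next concrete line
    // reappears in the original (or to the end for a trailing marker).
    const anchor = rwLines[i + 1];
    let end = origLines.length;
    if (anchor !== undefined) {
      const found = origLines.indexOf(anchor, cursor);
      if (found !== -1) end = found;
    }
    out.push(...origLines.slice(cursor, end));
    cursor = end;
  }
  return out.join('\n');
}
```

Real implementations need fuzzier anchoring than exact line equality, which is exactly why a small custom model can beat string matching here.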

Guest 2

Right? Then next edit suggestion.

Guest 2

Right? That is the tap tap tap kind of functionality.

Guest 2

Right? At some particular point, you made a change; here are all the subsequent changes that you should make in order to make it consistent again. That is a custom model. Okay. I think this is what you will see more and more. What really happens is you can take one big model, and the big model is really smart at doing things, but it's also really expensive at doing things, and it's slow at doing things. And what you want is to make it fast for the user, and for yourself, you know, you wanna put as little load as possible on your service infrastructure.

Guest 2

And so you then pretty much distill and distill and distill, right, in multiple steps, until you have the smallest possible model for the given use case. Okay. And it will vary over time how many of those custom models you have.

Guest 2

Right? Could you achieve the same by just, as I said before, talking to a big model? Yep. But then, again,

Wes Bos

quite expensive and slow. Yeah. Yeah. You're running that, and I often think about that. I'm coding for ten hours a day, and it's running every, I don't know how often, but, like, every keystroke. Right? Yeah. Right. And I often think, like, how are they making money on this? It's gotta be expensive to run this. Yeah.

Guest 1

I wanna make one point, since you talk about all these back-end models and services, right, for the open source announcement. The back-end services, they will not be open sourced. Right? That's right. Okay. And, of course, that's good to know. But how we render the diffs that get streamed in from the model, how we do inline chat, how we do kind of the agent UI and so on, all that is open sourced.

Scott Tolinski

Right? Okay. Yeah. I do wonder about the UI bit of it, because we've seen kind of the rise of the VS Code fork and experimenting with different UIs for these types of things. How much do those design choices modify your thinking in terms of what the VS Code interface should be and how it should relate to working with AI? Or do you have your own experiments? I mean, I'm sure you do. But is everything that comes out in VS Code informed by your own experiences, or is there influence coming in from others in the space working on these types of UIs?

Guest 2

Well, clearly it's informed by others as well. Right? I mean, even today, issues for Copilot, for example, are open. There's the VS Code Copilot release repository where, you know, you can look at issues. People make feature proposals there. And in VS Code core, right, many issues actually land directly in the open source VS Code repo.

Guest 2

Like, people saying, oh, I do this, and you have this button here, but I'd really like to have this button over here, because there are two others who put the button over there. Yeah. So from that point of view, clearly, the influence also comes from others.

Guest 2

But I wouldn't say this is new; this is just how tool development always happens. You take good inspiration from others. You would be stupid not to do so. But at the same point in time, you need good scrutiny about what you actually do, Mhmm, and what you don't do, because you need to maintain your identity.

Guest 2

You need to think about the users that are already there, the UI patterns they already understand.

Guest 2

Then also a big difference for us is, like, our extensibility story.

Guest 2

So, for example, we always need to think about something like: oh, if we put a button here, who else wants to put a button here? And then in the end, you have 15 buttons, and, right, that doesn't work. So how do we make the UI scalable, keep it still lean and clean, and not lose our own design language, while still picking up good ideas that other folks had? So I think in AI, exactly the same rules apply that have always applied to

Guest 1

tool development. Right? So, actually, that's a concern that I have. We see these forks, and I sometimes see they break with the spirit of VS Code. They add a blue button for AI everywhere, right, and promise you whatever.

Guest 1

Of course, you can do that, but we have a platform mindset. Right? We wanna be good for many others. You cannot be monopolistic and say everything is like that. And the other thing I sometimes miss when I look at what those folks are doing is kind of the consistency we have in VS Code. And that's one of the concerns I have with the forks. Maybe an example. Cursor, they changed the UI in a way totally unrelated to AI: they have the activity bar on top, right, which I think is a great UI idea. Why not? In the meantime, we have added this option as well. I have a hard time understanding why they didn't contribute it back. Mhmm. Because it's an OSS thing. So that I don't understand. And in particular, when we did it, what do you notice? The implementation they give to the user is not complete. Right? So we have badges where you see how many changes you have for the Git view, and they don't do that. So that kind of hurts me. Right? This loss of the consistency we have established and that we care about when we do a feature.

Guest 1

Because sometimes the forks, they optimize where they wanna go and forget about the platform.

Guest 1

And that's, to me, the whole long term view we have. We wanna be built to last for a long time and not for just one wave.

Wes Bos

Makes sense.

Wes Bos

And I wanna talk a little bit about the business decision. I know you guys are engineers and whatnot, but, if you're comfortable talking about that: what does open sourcing GitHub Copilot mean for Cursor and Windsurf? Like, they forked the editor, so are you saying, have the rest as well? You know? What happens there? What do you mean by "have the rest"? Sorry. Like, so they fork VS Code and then add their own AI stuff on top of it. And now you're saying, well, here's our AI stuff as well.

Wes Bos

You can have that as well. Like, anybody can have that. Anybody can build upon this as well. Is there, like, a strategic business reason that Microsoft is doing this?

Guest 1

Maybe one thing is what we heard from some organizations: they really don't like closed source IDEs.

Guest 1

So by making the other pieces open source, I think we address that need. Right? That's more for these bigger organizations.

Guest 2

Mhmm. And, of course, then they have a choice between closed source and open source. Yeah. You put the disclaimer upfront, right, that we're engineers on the project. So a pure business person might actually answer that somewhat differently.

Guest 2

Yeah. Right? But when I think about this, the question is really: could they take whatever we have? The whole thing. Could someone take the whole thing, put a new badge on it, say this is now ours, and market it under their name? Sure. But we also know how the open source community has reacted for many years when someone tried to do this with VS Code proper. There was always immediately this question: why would you do this? Why would you not bring the changes back to VS Code? Why would you not work in favor of the overall developer ecosystem? Right? I think the competition, at least that's my point of view, will move more and more to the services that you actually provide around this, and really what's running on the back ends and all of these kinds of things. Right now, for example, and I'm purely speculating here, of course, we all have access to the best frontier models. Right? So OpenAI, Anthropic, Google, all of them give the models out. But you could also imagine a world where that is not the case, where people actually try to create very, very specific models and keep certain kinds of models just to themselves and their own AI tools and so on. So I think, again, as in many other cases, what really matters is on the servers.

Topic 6 36:01

Real competition happens on servers with models, scaling and infrastructure

Guest 2

How that is done. Right? How you scale, how you provide, in the end, lasting value to your customers.

Guest 2

Right? And what happens on the client, those are the things you want to trust. You wanna see what is happening there. If you're a big enterprise, you wanna see and pretty much audit those things, right, and make sure everything is good. Like, we have discussions with enterprises. There's a conversation: so what do you put in your prompt? What really is this? And like you, Wes, right, you asked: what's the context you put in there? And people are like, whoa, we're really suspicious about what you put in here, and all of this. And I'm like, you know, have a look now. Cool. You look. It's all there. You can do it. You can see it. So I cannot really tell you what it does to their business; it's also their business, not mine. Yeah. But I can think about what I just mentioned. I think it moves to the server.

Guest 2

That's where real competition happens.

Guest 2

Mhmm. Right? And what we do on the clients, again, also informed our decision. I mentioned that earlier. Like, the prompt crafting, all of these kinds of things, they are more understood now.

Guest 1

Right? Mhmm. We also know there is some convergence, right, on the UI. Totally. That's right. As Kai said, the secret sauce is on the server, all these custom models. On the UI side, there is more convergence, right, in chat, how you do that, how you show the diff that gets streamed in.

Guest 1

And actually, I just wanna say, when you talk to these forks about why they forked, one of the reasons they give is that the VS Code API wasn't sufficient to enable the AI innovation they wanted. Right? Yeah. And that's correct. We don't allow you to do kind of big UI changes using our API. That's totally intentional, right, because we care about stability and performance.

Guest 1

For instance, you could not do a completion UI using our extension API. That's just not how it is. And this has, of course, a little bit of tension. You wanna change things deep in the core, but, of course, we don't allow that, because you want stability in the evolution.

Guest 1

But by now, I think things have converged and evolved.

Guest 1

So I find it a missed opportunity that there was no collaboration, yeah, on that. Because we're a small team, the other forks are small teams, and I think both have talented people, so maybe we could have built something even better for the whole thing, because the whole value is in the back end.

Scott Tolinski

Yeah. In that regard, was it a surprise when these forks started popping up, or was it something you all had anticipated?

Guest 1

That was a surprise. Right? And, of course, you know, a fork for an open source steward like Kai or me is not a pleasant experience. Mhmm. And, of course, you wanna understand why. One reason is the API thing. Yeah. But I must say, even though the API is not really made for deep UI innovations, what we see is the community innovating a lot on top of our existing APIs.

Guest 1

Right? There are some nice AI editors, extensions also. It always blows me away how much creativity our community has to innovate on top of our UI. But it's totally true, deep innovation below it is challenging. That's also one reason. The other thing, of course, is what always made me jealous about them: they don't have AI bolted on. They have it built in. Right? And this is what a fork allowed them to do, and which we didn't have, because we had structured our code so that all the AI stuff is in this closed source extension. And for me and for the developer team, it's a big relief that we no longer have to do this kind of interaction. We can also go to a really AI-first mindset.

Wes Bos

And that means that, like, other extensions can now use the AI features as part of their feature set, you know? Say you're making an extension totally unrelated to Copilot, for something like a database tool.

Wes Bos

Are there APIs, or will there be APIs, that, let's say, that database tool add-on could use, AI APIs?

Guest 2

Yeah.

Guest 2

There are actually a couple of things here. The first one is we actually made that available before.

Guest 2

Right? So, for example, in the VS Code API, there is a language model namespace.

Guest 2

So you can create, for example, chat participants, and those can actually participate in a normal chat conversation. Right? Mhmm. The tools, for example, that models can call: those tools are already public APIs. So extensions can contribute tools. MCP servers can contribute tools. There is a way you can, as an extension, just read all of the tools that are available. You can craft your prompt and send it to the model through our APIs. That is stuff that already exists today. Okay. And there's also a good ecosystem around this. When you look at Cline, for example. Right? They use some of that API. In their dropdown, they can say, yeah, use the models that you get through Copilot. But they also do some other custom pieces. But will there be more APIs going forward? Probably.
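For reference, a minimal sketch of what this looks like with today's public VS Code extension API: a chat participant that picks a Copilot-provided model and streams its answer back into the chat UI. The participant id and the prompt wording are placeholders.

```ts
import * as vscode from 'vscode';

export function activate(context: vscode.ExtensionContext) {
  // A chat participant, e.g. invoked as @dbtool in the chat view.
  const participant = vscode.chat.createChatParticipant(
    'dbtool.assistant', // hypothetical participant id
    async (request, _chatContext, stream, token) => {
      // Ask for a model contributed through Copilot.
      const [model] = await vscode.lm.selectChatModels({ vendor: 'copilot' });
      if (!model) {
        stream.markdown('No language model available.');
        return;
      }
      const messages = [
        vscode.LanguageModelChatMessage.User(
          'You help users inspect and design database schemas.'
        ),
        vscode.LanguageModelChatMessage.User(request.prompt),
      ];
      const response = await model.sendRequest(messages, {}, token);
      // Stream the model's answer fragments into the chat response.
      for await (const fragment of response.text) {
        stream.markdown(fragment);
      }
    }
  );
  context.subscriptions.push(participant);
}
```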

Guest 2

Do we know right now what they will look like? No. Not quite yet.

Guest 1

Mhmm. But the mindset, right, is still like in VS Code. In VS Code, you can contribute by writing an extension, which is just the extension API.

Guest 1

You can contribute by making a pull request if you find an issue or a problem, or wanna improve something in the AI, say, a prompt. So this is kind of the same thing. With this open sourcing, we give the community two things. Right? Source code access: if they have a problem with their extension, they can go to the source and look deeper.

Guest 1

That's one thing.

Guest 1

And we give them the ability to make pull requests. That's the same way we think about it. And we already have extension APIs, as Kai said, and a very important one, right, is how you can contribute new tools to the agentic flow. That's already there, and it will continue to be there. But what we see from VS Code is that people love to have source access when they do extension authoring. Well, at least me. I do that.

Wes Bos

Mhmm. Totally. I agree. I'm excited about it because, I don't know, a couple years ago there was a bug in Copilot where it was adding, like, an extra curly bracket at the end of all the tab completions, and I was losing my mind over it. And I was like, if we just had access to this thing... I couldn't even fathom how it was working, or what was going on, or how edits get applied. That's all kind of a black box to me. So I'm very curious now to actually dive into the code and see how that stuff works.

Guest 1

I think we should say, right, we announced that we will make it open source, and there is work ahead of us. Because it's different. When we open sourced VS Code, we were totally closed source, and then, when we were ready, we threw the switch.

Guest 1

Right now, what happens is we already have an open source core, and now we have to tune it, refactor it, to make it open to these AI features.

Guest 1

This means the community will see us make changes in the core, and they will ask questions. Right? What's going on? You would be surprised, you know? They watch every commit. And, of course, we can't do fake commit messages, you know, saying we improved naming when we really did kind of an improvement elsewhere. Right? So we have work ahead of us to do. So today we have an announcement. We'll do it, and we're excited about it. But we have work ahead of us.

Guest 1

And I don't know, Kai, whether you wanna, since you drive the whole thing, whether you wanna say timeline things. For me, we have big problems, right, that we have to solve, and I'm not a fan of timelines. But, Kai, if you want.

Wes Bos

Software engineers are famously great at giving timelines, so let's have it. Yeah. No.

Guest 2

No. Let me put it this way. There is a road ahead of us. There are a couple of things to consider when you open source something. Right? So there's a legal aspect to this. Mhmm. That you have the right copyrights everywhere, that, you know, you make sure that this is all properly done, that you don't have references to internal issues and these kinds of things in your source code. All these kinds of things that are kinda obvious. Then there are other considerations about code quality. Developers, there's no secret here: if no one's watching, maybe your code is not exactly the same quality as if the world is watching. So there are people who might be concerned about this. But we'll make a point here saying code quality is not an aspect that we worry about at this moment, right, when we make this open source. But there are other, bigger things. Like, what do we do with the internal issues we already have? What do we do with, like, for example, we invented this whole mechanism of what we call stest. Think about it as unit tests that can deal with the stochastic behavior of models. So you don't run the test just once. You run it several times. You cache responses from the LLM. And so there's

Guest 1

very interesting stuff we did there. That's cool. But now there is a lot of it. So, as a person who works on unit testing, right, the big problem is, when you have anything with an LLM in a test, the test becomes stochastic. And the worst thing in unit testing is flaky tests. You hate flaky tests. But as soon as you have an LLM in there, your test is, by definition, flaky.

Guest 1

Right? And that's what we try to address with what Kai explained there. It's a simulation test, right, which is pretty cool technology, but it will also require work to package it up, make it available, and think about that. Mhmm. And make available then

Guest 2

the actual test cases.

Guest 2

Right? So how do we get good test cases? That is usually from internal kinds of flights, right, where we can actually look at the concrete prompt that was being sent, with the concrete source snippets and so on, and say, oh, in these particular cases it failed. Then we add it, for example, to our stest suite.

Guest 2

Right? Now we've got to clean this all out. We cannot make these things available publicly as they are. So there's work to be done. We'll start pretty much today, right, with the announcement, and that work is starting.

Topic 7 47:17

Won't throw open sourcing over the fence, will provide transparency in every step

Guest 2

Right? And then we'll make sure, step by step by step, right, that we go... I think our main point here is: we make the announcement, but we don't throw it over the fence.

Guest 2

Right? That's not how we work. We will provide transparency in every single step that we take here, so you know exactly what to expect. And, you know, all of our iteration plans are public.

Guest 2

Right? They are even pinned in the issues, right, for every single month.

Guest 2

In the GitHub repo.

Guest 2

There will be entries in there you can follow along with. You can ask questions. That's how we operate, so we will make sure that there's a fully transparent process.

Guest 2

Will you see everything right now? No. Right? I just mentioned why; there are things that we need to do first. Right?

Wes Bos

That's cool. That stest stuff is really neat. I know that there are some other, like, generic AI tests that help you answer the question of: is this getting worse? You know? Because you always hear that from people, like, oh, the AI is not as good anymore, they turned some knobs down, and you often wonder, is that just my perception of it? Am I expecting more, or is it actually doing that? And, like, the nondeterministic output of AI is kinda hard to test. So that's really interesting to see that you have some sort of testing

Guest 1

against the AI output. Yeah. Life is no longer red-green. Right? Life is percentages, and you also have a baseline. But as you said, right, it's very important that you compare against the baseline, because you don't have red-green anymore. And that's what the stest infrastructure tries to do, and it uses, of course, cloud infrastructure for doing that, which is another to-do item for how we make that available.
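A hedged sketch of what such a stochastic, baseline-compared test could look like; runScenario, the acceptance check, and the thresholds are all illustrative, not the actual stest implementation.

```ts
// Run the same LLM-backed scenario several times and assert on the pass
// *rate* against a recorded baseline, instead of a single red/green result.
declare function runScenario(prompt: string): Promise<string>; // calls the LLM (responses may be cached)

async function passRate(
  prompt: string,
  accept: (output: string) => boolean,
  runs = 10
): Promise<number> {
  let passes = 0;
  for (let i = 0; i < runs; i++) {
    if (accept(await runScenario(prompt))) passes++;
  }
  return passes / runs;
}

// Compare against a previously measured baseline instead of expecting 100%.
async function checkNoRegression(): Promise<void> {
  const baseline = 0.8; // recorded pass rate for this scenario
  const tolerance = 0.1; // allowance for run-to-run noise
  const rate = await passRate(
    'Add a null check to the function below ...',
    (out) => out.includes('if ('), // crude acceptance criterion
  );
  if (rate < baseline - tolerance) {
    throw new Error(`Regression: pass rate ${rate} is below baseline ${baseline}`);
  }
}
```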

Wes Bos

Mhmm. I do have to ask, just switching things, because you're in this AI space, obviously: do you have any opinions on what the future of software engineering looks like with AI? Are we out of a job? Or, like, what are your personal opinions and thoughts there?

Guest 2

Oh, if only we knew.

Guest 2

No.

Guest 2

I had this conversation just the other day with someone, and I made a statement there that holds independent of what it will look like concretely.

Guest 2

I think you will not have just people who are at one particular level of abstraction. Right? Really deep, just doing, I don't know, optimization for one particular aspect. Maybe a few of them will survive. Today you can make a good living as a developer by being quite specialized, and I don't know if this will remain true. What I believe is that a developer, or what a developer will be in the future, or creator, maybe we should just call them creators, will be going across many different levels of abstraction at, pretty much, the speed of light. Right? So you think about the product idea.

Guest 2

You scaffold something out. You run it, it fails a little bit. So you go all the way down, right, just to fix a tiny little issue. Then you go up again and think about your customer, and think about changes and think about design. You go down again. And this flexibility, right, the thinking about your users, I think developers will become a little bit more entrepreneurial in their whole way of how they think about this, how they have to approach a problem. And that's where I would guess this really goes.

Guest 2

Right? Where you live on all of those levels of abstraction very, very quickly. And today, you could make a comfortable living kind of just sitting here and never going down, or just being a PM and sitting up here and never really, you know, going down. I think that will go away. Interesting. The most powerful people will be those that can think at all of these levels of abstraction, pretty much fluently

Guest 1

going up and down. Yeah. You're not the first person that has said that to me. What's definitely true for me is there will be more creators. Given all that, I think there will be more creators, but I also like to extrapolate by looking back and looking forward. It's incremental. First we got ghost text, then we got chat, then we got snippets, then multiple file edits, then we got agents. It's an evolution. Right? It's not that all of a sudden this is changing, I think. That's why I'm not too nervous, because we can follow along.

Scott Tolinski

Mhmm. Yeah. It is interesting. I mean, we've noticed, Wes and I are both kind of tinkerers and creators, and it's enabled me to get into, man, CAD modeling and all kinds of stuff. Wes was hacking his Roomba with it. Yeah. C++ microcontrollers

Wes Bos

for the first time in my life. And then, like, on the flip side as well, you get a little higher into the 3D space and trying to figure out what customers want, like you said. So you're not the first person that I've talked to who said that: you go both deeper and higher.

Wes Bos

Alright. We got five minutes left. Is there anything we didn't touch upon that you would like to get across? What's your favorite model to use? I'm curious.

Guest 2

Oh, that's a loaded question. Yes. No. It's not loaded.

Guest 2

It's relatively straightforward. Right? Because, and that is clearly a point-in-time statement, it's always a point-in-time statement with all of those models. Right? Yep. Right now, Sonnet is just nicer to use, because it's more predictable in the output that it generates.

Guest 2

It stays more on track, particularly in agentic flows, more on track, you know, on a particular task. And it also creates code that is nicer.

Guest 2

And what I mean by this is: I look at the code that Sonnet proposes, and I like it better than the code that some other models propose. I personally didn't have that much exposure yet to Gemini 2.5 Pro. I hear really, really good things about it, and the few experiments I personally ran looked kinda good.

Guest 2

Right? But, yeah, then there are more aspects to it. Like, for example, speed. Do I like a model that is not as great but is much, much faster? Yeah. It depends on the delta, but I like fast. So it really is quite a fluid field. And Mhmm. And what I really like is the competition.

Guest 2

Right? What I like to see is: a new model is here, and everyone is like, new Sonnet! And OpenAI comes back and says, here's our new model, try this one, this is awesome. Yeah. So that kinda goes back and forth. Right? So

Guest 1

Yeah. Everything gets better, faster, and cheaper, which is good for us. And you get more choices. Right? Bring your own key. You can select a model and switch it. Things become multimodal. That's what I really think is such a great accomplishment, basically. Yeah.

Guest 2

I mean, because we're talking open source, right? What is really nice, and when I talked about the models just now, I really just took the closed ones. But what is really beautiful, and I think will be way more important going forward, is pretty much open source, or open-weight, models.

Guest 2

Right? And you've seen what DeepSeek did and what kind of stir that created in the community and so on. And now think about it: you have VS Code as an open AI editor, and you have open-weight models.

Guest 2

Now, that's really a combination that's quite powerful. Right? Yeah. Well,

Scott Tolinski

man, this is just so exciting overall, and thank you both for coming on and sharing all this with us. You know, I think people are gonna be very excited about this news in general.

Wes Bos

Mhmm. Just as you mentioned earlier about VS Code. Man, I honestly think, if I were to look at my entire career, VS Code and TypeScript are probably the two most important pieces of software in my career, which is absolutely massive. So thank you for putting it out in the world. I think it's made software, and the world, a much better place in general. I know that people in tech tend to say that, but, man, it's been huge.

Guest 2

Yeah. And, I mean, we started with this, right? Thank you for helping us make VS Code a success. I remember the one feature that we just built for you.

Guest 2

I saw the call.

Guest 2

What was it? Oh, that... I forgot where it came from. Maybe from a tweet or something. Erich might remember. It was like, oh, you know, when I present in my courses and talks, people cannot see my cursor, so I need a little bit of a wider cursor. So we made an option to make the cursor wider, and that was really exciting for you.

Wes Bos

I love it. It was just for you. I appreciate that. I think I was hacking it with CSS at the time, because, like, my students needed to see where I was with the cursor. I just made a huge, big, blocky one and threw it in there. So, yeah, appreciate that. He gave you a setting instead. Man, we were tracking you on Twitter. Right? We were following you. And as he said, once you had something you didn't like, we jumped on it. We wanted to make you a happy customer. Right? Yeah. Alright. Well, while I have you, what else you got? Yeah. What else do I want, on this type of thing?

Guest 1

Now you're a convert. Right? So, yeah. Yes. Yeah. Now you don't have to worry about me.

Wes Bos

That's great.

Wes Bos

Great. Well, thank you so much for coming on. We appreciate all your time. Really excited about this, and hope to talk to you again sometime. Thank you for having us. Thank you.

Guest 1

Bye bye.
