BigCheese AI Podcast Show Notes
Episode Title: Developer Productivity and the Role of AI in Enhancing It
Co-hosts: Sean Hise, Jacob Wise, Brandon Corbin
Special Guest: Chris Vanoy
00:00:03 Welcome to the Big Cheese Podcast. My name is Sean Hise, I'll be your host today. 00:00:08 As always, I'm here with Brandon Corbin. Hello, everybody. 00:00:12 Resident AI fanboy. And Jacob. Hey, guys. 00:00:16 Jacob, the CTO. And Chris Vanoy. How are y'all? 00:00:20 Our special guest today. We're going to be talking about developer productivity today, 00:00:25 which is probably something that is near and dear to all of our hearts as developers. 00:00:29 I believe all four of us here on the podcast today are very experienced developers. So 00:00:32 if you're looking to understand a little bit more about developer productivity and 00:00:41 AI, tune in. Before we get into the podcast, I would like you to smash that like and subscribe 00:00:47 button. Please, we need more subscribers. Somebody unsubscribed yesterday and ruined my day. 00:00:50 It took me about an hour to figure out who they were. 00:00:53 We're sending you a care package. 00:00:59 We'll be knocking on your door later. And that's the moment at which I realized people get 00:01:02 push notifications when they're subscribers. 00:01:13 So we do have... actually, it was kind of a slow news cycle after Apple and WWDC, right? 00:01:23 And then, as it happens, it seems like this curve is just accelerating at an increasing rate. 00:01:29 And we talked about Anthropic on the pod last week because of the 00:01:39 paper that they released. And I had not been as tuned in to that company as some. 00:01:44 Before we get into the news, give our viewers a little bit about the company, 00:01:46 what you know about Anthropic and Claude. 00:01:51 Okay. So I'm going to let you go, because they just released Claude 3.5 Sonnet, 00:01:55 blowing people's minds, very impressive model. You did some research on it, 00:01:59 but then yesterday I went and signed up for a full paid Pro membership.
00:02:03 That's funny, because I also signed up yesterday. 00:02:10 And, you know, they just came out with 3.5 Sonnet, which is out, and the benchmarks on it are great. 00:02:15 It's better at coding than before. 00:02:19 Yeah, it's outperforming even their larger model, Claude 3 Opus, 00:02:23 their ultra tier or whatever. You know, and I played around with it. The Projects feature is really cool. 00:02:28 It's basically a way to build a context window about a specific topic or project that you're 00:02:34 working on. You can upload documents. So I just saw the doctor today, and they told me I needed 00:02:40 to do physical therapy. So I went and found a couple of PDF documents on physical 00:02:44 therapy for that specific injury, uploaded them, and said, here's my issue, here's what I'd like. 00:02:50 Give me a rehab plan for the next week. And it was pretty good. I cross-referenced it 00:02:55 against the documents that I had uploaded. But it's this whole idea of being able to store 00:03:00 context and documents and then go back to that and have an ongoing conversation, because next 00:03:05 week I can say, okay, here's what worked, here's what didn't work. And, you know, ChatGPT is 00:03:09 scratching the surface with their memory. But I think this is just kind of the next progression: 00:03:15 it's got staying power. You can actually go back to it and continue that 00:03:20 conversation. So one thing: I saw James, who we had on... was that last week that 00:03:29 he was on? Yes. So James ended up posting a video on LinkedIn where he 00:03:34 was going through using the Artifacts feature of Claude, doing some modifications and having it update 00:03:39 and all this. So I'm like, that's why I'm signing up, because I want that.
Well, Artifacts is different 00:03:44 from Projects, but it's awesome. Okay. Because an artifact is like... think of it as a component 00:03:49 in your code base, right? That is something you can update. Okay. But a project is like the collection 00:03:52 of all those things. Sure. Right. So you start a project, but a project could have multiple 00:03:59 artifacts. Yeah. How the fuck do I get Claude to generate the artifacts? You can 00:04:04 upload the files, and then those are your artifacts, and then it can make modifications to them. But I 00:04:09 just generate... like, I had this thing where I asked both ChatGPT and Claude: 00:04:17 I want you to write an NPM package for a new thing. We'll call it pressJS. And it's 00:04:24 basically a JavaScript, a Node JavaScript wrapper around the WordPress plugin API. Okay. They both did a 00:04:30 great job. But what I thought was going to happen is that as Claude went through 00:04:35 and generated the code, those would automatically become artifacts, and I would then 00:04:38 be like, oh no, I need to change this, and it would go and just update that 00:04:43 artifact and not have to generate the entire code again. I couldn't figure it out for the life of me. 00:04:51 So here's my take: from a UX perspective, OpenAI is a hundred percent better than Claude. 00:04:56 Claude, in my humble opinion: great model, maybe, but the user experience is dog shit. 00:05:05 Interesting. Interesting. See, I liked it better, but... okay, full caveat, I've used it 00:05:10 really full-time for one day. So you have way more experience with it. Right. Yeah. Well, 00:05:16 you're trying to code. Yeah. You're trying to do something that required a very specific large 00:05:20 context.
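The "pressJS" package the hosts ask Claude and ChatGPT for is hypothetical, but a rough sketch of what such a wrapper's surface might look like can help. The class and method names here are invented for illustration; the `/wp-json/wp/v2/*` routes are WordPress's standard public REST API endpoints.

```typescript
// Minimal sketch of a hypothetical "pressJS" client for the WordPress
// REST API. PressClient and its methods are invented names; only the
// /wp-json/wp/v2/* routes come from WordPress's documented REST API.
class PressClient {
  constructor(private baseUrl: string) {}

  // Build the full URL for a core resource, e.g. "posts" or "pages",
  // optionally for a single item by numeric ID.
  endpoint(resource: string, id?: number): string {
    const base = `${this.baseUrl.replace(/\/$/, "")}/wp-json/wp/v2/${resource}`;
    return id === undefined ? base : `${base}/${id}`;
  }

  // Fetch recent posts (a network call, shown here only for shape).
  async getPosts(perPage = 10): Promise<unknown> {
    const res = await fetch(`${this.endpoint("posts")}?per_page=${perPage}`);
    if (!res.ok) throw new Error(`WordPress API error: ${res.status}`);
    return res.json();
  }
}
```

A call like `new PressClient("https://example.com").getPosts(5)` would hit `https://example.com/wp-json/wp/v2/posts?per_page=5`; a real package would add auth and the plugin-specific routes the hosts had in mind.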
But even when it generates the code, right, and you've got this code block 00:05:25 now, and it's got copy and it's got a couple of things. That is an artifact, guys. Just let me 00:05:32 click that button. Okay. I used it yesterday for a fun little side project. I 00:05:36 toy with Raspberry Pis every once in a while. And my dogs, you know, go in and out of the house, 00:05:41 and I want to know when they're outside. My neighbors have dogs too, 00:05:44 and they're allowed to be outside at the same time. They bark at the fence together. So I was like, 00:05:49 how would I do this, using maybe RFID chips or whatever? So it gave me a 00:05:56 whole game plan, but then it also generated a diagram of the workflow, the user flow, of what 00:06:00 that could look like. Right. I was like, oh, that's pretty cool. Yeah. So that's one of the 00:06:06 things that I like about where they're going. And it's good to see, in general, market 00:06:11 competition. Right. So, like, I don't know. If I was an investor, or if I was the CEO of this 00:06:19 company... I don't know. Is there enough room for multiple players 00:06:25 in Silicon Valley to be the website you go to to talk to AI? Right. I mean, there's 00:06:30 that. I mean, go ahead. Well, you're already starting to see a little bit of differentiation: 00:06:33 some of the models are really good at coding. Like you mentioned, Claude was good at 00:06:38 that, user interface aside. Right. And then you've got others that are good for press 00:06:43 releases or good for interviews, right? You end up with these specializations, where you jump from 00:06:46 one to the other to find the one that's actually good at it. Totally.
I think if you get one that 00:06:50 really concentrates on code, and this might tie into what we talk about a little bit later, 00:06:55 but like, as you were talking about Projects: uploading, say, a product requirements document, 00:07:00 getting that workflow out, and then taking each one of those pieces and turning it into 00:07:04 code... that whole chain could get really interesting if somebody really wanted to double 00:07:08 down on a specialty for one of these. Right. Yeah. I'm really good at this. Yeah. Yeah. 00:07:13 Well, I like that. And one of the things I saw on Twitter this week is that 00:07:19 somebody uploaded a research paper and Claude was able to generate an interactive 00:07:24 experience from that. And I thought that was really cool. It's like, hey, it's not just you 00:07:29 asking it questions in a chat-based interface; it was able to create a unique, 00:07:33 interactive, almost like a little micro site. Right. Right. And I think that's kind of 00:07:39 cool. Yeah. Like, as these AI companies figure out the deliverables of AI, right? A flow chart 00:07:45 would be really good. An interactive, you know, search, kind of click-around interface where things 00:07:51 are categorized and just digestible, right? Because a lot of research papers are, like, right, 00:07:55 nobody wants to read that PDF. It just doesn't feel right. I mean, you have to... 00:08:01 Not everybody. Not everybody needs it. Oh, God. It's like the abstract at the top. 00:08:05 If that were visualized into some key points, that would be way better. I just want to 00:08:09 generally understand what they're talking about. It's like when you go buy a 00:08:12 novel. Like I always ask, is this one of those novels where I need to read the whole thing, or can I read 00:08:15 literally the first chapter, where they actually wrote the book, and the rest was just filler?
00:08:22 But I do like the concept of the deliverable not always being the answer, meaning the image that 00:08:28 you need or the code that you need. Maybe it's more of the flow chart, or the... Or it's taking 00:08:34 the requirements and building out, you know, suggestions. It's an augmented solution. It's not 00:08:41 the solution. Yeah. Yeah. I will say that in terms of the code, it does seem that 00:08:49 Claude 3.5 Sonnet's code is better than 4o's code. But I will say that in terms of creative 00:08:58 writing, 4o spanks Claude. Claude's very terse, very kind of to the point, where 4o 00:09:02 just wants to talk your fucking ear off. 4o, every time you ask a 00:09:07 question, is like, okay, here's your answer. Oh, by the way, here's all the code that we've ever 00:09:13 generated before, right? And it just goes crazy. So I will say I haven't been 00:09:19 impressed with the creative writing side of Claude versus 4o, but the code side's been 00:09:26 really impressive. Speaking of ChatGPT: OpenAI's been relatively quiet since 4o. 00:09:32 Since the announcement. So one of the coolest things that has been demoed in the last six 00:09:37 months is their voice feature, right? Yeah. And the zero-latency piece. Yeah. Even though 00:09:42 we know it's really just a better skeleton loading state. Right. Right. Well, 00:09:46 that's an interesting question, let me ponder that... and then all of a sudden it catches up. 00:09:51 Right. Right. Right. But they announced that they're delaying the launch of their voice feature. 00:09:58 And the way that they indicated it... I think this thing is completely 00:10:03 unregulated and off the chains. Probably is. Yeah. And I think, because of the way that you're 00:10:09 interacting with it, the conversations that their trainers and 00:10:15 their testers were having got pretty, pretty morbid. Yeah.
Pretty quick. Right. And it was all 00:10:21 sexy talk during the whole thing. She's like, oh, breathy, oh, that's an interesting idea. 00:10:30 I hope you die. There's also an aspect of demo-ready versus production, right? Like, it 00:10:34 might have been a little massaged, a little practiced, you know. What you're seeing may not 00:10:39 reflect reality. Totally. Well, kind of. Here's where I'm going with it: 00:10:44 there's a big difference between text responses and a personified response, 00:10:50 meaning it's a human voice and you're talking to it. There's, like, zero limit to the 00:10:54 accessibility of that, right? A three-year-old could have that conversation if they get 00:11:02 mom and dad's app. It's a terrifying thought. And so I think that 00:11:07 either, like you said, it's just not production-ready, or this scared the crap out of somebody, 00:11:10 even at a company that literally just fired their entire safety staff. 00:11:15 Right. Like, is there even somebody there to be scared? Yeah. I mean, that's what I found 00:11:19 shocking: this company literally threw the baby out with the bathwater. They're going for 00:11:24 profit. They don't care about your personal safety anymore. Right. It wasn't about, 00:11:29 oh, we're going to build an AI system that, you know, has a moral compass or whatever. 00:11:33 Right. And now they're the ones pulling the reins back. So I'm going to call bullshit 00:11:38 on this. I think it's just probably too buggy. Yeah. But at the end of the day, 00:11:46 in six months or less, we're going to be having conversations with AI avatars that 00:11:52 have zero latency and look like somebody. Right. Right. Or maybe even you. I don't know 00:11:57 if you... like in video games, you have the mimic concept, where you duplicate yourself. Yeah.
00:12:02 I mean, we're going to be talking to ourselves. Yeah. Don't we already do that, though? Yeah, 00:12:08 we do. It's just me. Yeah. I know. Good point. Right. We do it all day long. But now we're 00:12:13 going to have this... yeah, we're going to see it in email, 00:12:17 or, I mean, in video games, right? Like, NPCs are going to become smart. I mean, 00:12:22 it's going to be a wild ride incorporating intelligence, 00:12:25 quote-unquote "intelligence." So don't let all the purists get upset that these things 00:12:30 aren't intelligent or reasoning. Because as far as I'm concerned, as long as it looks like it's 00:12:35 intelligent, and I can't tell the difference, then it's intelligent, right? Maybe not 00:12:39 if we get down to the nitty gritty, but yeah, we're going to see this thing 00:12:43 incorporated all the time. The one thing that I'm most excited for, versus video games... 00:12:50 I'm not a gamer... is media. And they cover this in a Black Mirror episode, 00:12:57 where basically the equivalent of a Netflix... which was kind of funny, because it was so meta, 00:13:02 it was Netflix, but it wasn't Netflix, inside a Netflix show... they were basically producing 00:13:07 this show in real time with AI, right? And we're going to get there. So right now 00:13:12 my wife is obsessed. Obsessed. And she's going to watch this and she's going to be like, I 00:13:17 can't believe you told everybody. With this trial that's going on in Massachusetts. 00:13:21 Karen, I believe her name is. They charged her with killing her husband, 00:13:26 but it looks like it's complete corruption. This guy was a cop. It looks like he was potentially 00:13:31 killed by other cops, and they're blaming her for it. It's insane. It's absolutely insane. The 00:13:36 judge is suspect. I mean, all of this is just nuts.
So she has, for like the last three weeks, 00:13:42 been watching this thing in real time and just can't stop. Every time I go upstairs 00:13:45 from my office, she's sitting there in the kitchen doing whatever, 00:13:51 and that thing's running the trial. And of course, Netflix apparently has been in there 00:13:56 recording this, trying to capture it. So you know there's going to be a show coming out. 00:14:04 But imagine if they could literally be taking that footage and generating the show 00:14:10 an hour, or two hours, three hours after the event happens, versus waiting for six months, 00:14:15 right? I think we're going to eventually get to a point where... and oh, the Johnny Depp 00:14:19 case was another one that she got really obsessed with. Then Netflix came out 00:14:23 with the show like five months later, but it was leaning a little bit more towards 00:14:30 Amber Heard's case versus Johnny Depp's. Emily was 100% on team Depp, right? That was her thing. 00:14:35 Every single woman was right there. They all were. They all were. But now imagine being like, 00:14:41 hey, what bent do you want on this show? I want it bent more towards my favorite, 00:14:46 which is Depp. Or, I want it bent more towards Heard. What was that Netflix show where you could 00:14:51 pick your own path? But it didn't work on the Apple TV. That one? Yeah. Yeah. 00:14:56 Well, it was just announced today: NBC has the Olympics coming up. Yeah. And they made an AI 00:15:00 version of Al Michaels that will do, like, your own personalized highlight package. 00:15:06 Really? Yeah, just announced it today. Al Michaels approved it, said, 00:15:11 yeah, go do it. So a generated voice of Al Michaels custom-narrating whatever you care 00:15:15 about. Right.
That's actually what I'm waiting for. 00:15:25 Right. That's actually such a good example of a use case for AI. So I forget which 00:15:32 announcer it was... I think it was Clark Kellogg. When they 00:15:39 shoot those video game voiceovers, they have to announce every single variation of every single 00:15:45 thing that could happen. Like, third and 32. What is this guy doing? You know, 00:15:50 all that stuff. But think about not only the data that's coming in, right? So they're 00:15:56 getting all this high-quality, person-specific data. Then think about adding 00:16:02 the ability for that data to be not binary. Like, it's not just here's what it can say. Yeah. 00:16:07 Yeah. And with Al Michaels, you've got like 50 years of high-quality audio of the Olympics. 00:16:12 It's a really good data set to build that sort of model. Yeah. So I was playing around with 00:16:16 Claude, and there was a tutorial I watched about making a video game. And part of the video 00:16:22 game was a boxing glove hitting a ball, right? Well, part of it was he went and 00:16:28 generated the boxing glove image off of DALL·E or something. And he used ElevenLabs or 00:16:31 the other one to do some sound-effect kind of stuff and the voiceover kind of stuff. But like, 00:16:38 you could start to stitch together some really interesting complete projects based off 00:16:43 of AI-generated things that might derive from, you know, the voices that you captured. 00:16:47 It's pretty cool. Yeah, I was just dumbfounded. Did you hear me playing 00:16:53 with the sound effects yesterday? Oh, yeah. Just like a thunderstorm in a big city. I'm 00:16:59 just like, that's cool. Speaking of sound: our friends over at 00:17:08 Suno... yeah, got sued. Got sued.
So Suno is a very popular AI platform for generating music. 00:17:12 Yeah, I'm still trying to figure out if they were the ones that generated that... 00:17:18 how do you spell it... that first song that went hyper, hyper viral. I mean, it's one of those things where 00:17:23 you could probably spend 30 minutes a day messing with that thing and having a lot of fun, and you 00:17:28 just don't have time to. Yeah, that site's a lot of fun. It's a lot of fun. 00:17:33 Like, I've done birthday songs for folks. I'll go in, I'll be like, here's who they are, 00:17:35 here's what they're into, just generate a song, and send it to them on their birthday. 00:17:40 Oh, that's right. It's good. You can generate intro and 00:17:44 outro music for your podcast. You can do all this. It's basically going to generate, 00:17:52 you know, a minute-long song based on whatever you want. So the RIAA, the Recording 00:17:57 Industry Association of America, whatever they are, the people that basically sued everybody back in 00:18:02 the Napster days, and go after restaurants to make sure that they have licenses to play songs. 00:18:08 Exactly. Which is a weird, weird thing. The music police. So they're suing them, saying that 00:18:13 they've trained their models on a bunch of copyrighted material and that they can go out and 00:18:19 generate songs that are very close to a Buddy Holly song or whatever. Yeah. 00:18:25 I think we're going to get to a point here, though, where we're going to need almost entirely 00:18:32 new laws about copyright. Well, I agree. And it comes back to my favorite quote from 00:18:48 Ernest Goes to Camp: who can own a tree? Oh, shit, dude. Who can own a tree? 00:18:53 He gets deep in our heads. He gets deep. Thank you, Ernest. Right. Like, who can own a musical 00:18:57 note? Right. Can you own the combination of chords that generates appeal to a human?
00:19:02 Right. No one. Yeah. Right. Like, Taylor Swift doesn't own the G chord. I know you play it in 00:19:08 every fucking song. But the fact that it's trained 00:19:14 on other people's music means nothing to me. Yeah. Right. Like, I listen to music. I learn music. 00:19:19 You can listen to a Taylor Swift song, and you can go and write a new song that's kind of 00:19:23 inspired by it. That's what they all do. Does that mean you're stealing? 00:19:27 That's what their industry's been doing for years. And there is a line, right? Like, there's 00:19:32 been existing copyright stuff that's gone over this. I can't remember a particular example 00:19:38 now, but there was an old VH1 thing about Vanilla Ice, that song that sounds 00:19:42 exactly like Under Pressure. I remember a quote from it. I quote it to my kids all the time. 00:19:47 It's like, no, no, ours goes dun, dun, dun, dun. Yeah, like, very slightly different. Yeah. 00:19:52 And you get those sorts of lawsuits. It's like, where's the line between inspired by and 00:19:58 direct rip-off? Totally. So, Diddy used a song from Sting, Every Breath You Take or 00:20:04 something like that, right? And it ended up being that he had to pay some crazy amount every year, 00:20:10 like 150 grand. Sting was in the music video with Puff. Was he? Yeah. Just more like, hey, 00:20:15 thank you for the money. Yeah, basically. You know, the guy's made five million bucks. 00:20:21 But the thing that I think becomes problematic for this entire thing is that right now I could 00:20:27 go write a blog post that describes Taylor Swift's music. She starts here, she does this, she 00:20:32 sings this, this is the style, these are the chords she uses. All the information about it, 00:20:39 without ever once playing her music.
And I could train an AI on that and have it generate a song 00:20:47 that probably sounds close to Taylor Swift. Is that copyright infringement? Right? No, I don't think 00:20:51 it is. It's just being inspired by her style and then mimicking the style. 00:20:57 So, is that not really what we're doing when we're training these? We're taking the understanding, 00:21:01 we're taking the bits, we're taking the process, we're taking the flow, and then we're generating 00:21:05 something that's based on that. To me, that doesn't seem like a copyright thing. 00:21:10 I think it's copyright if you go, hey, here's Taylor Swift's song, 00:21:14 copy it. Yeah. Like, make a song that's exactly like it, 00:21:20 right? Or, make a new song that sounds like Taylor Swift singing it, and then I go out and say, 00:21:25 this is Taylor Swift's new song. That's problematic, because again, I'm tagging Taylor Swift to this. 00:21:29 I don't know. I mean, it seems like a sticky space we're in. The other piece of unsettled 00:21:34 law is not the output of these things, but more what's going into the training data, right? 00:21:37 So you're taking copyrighted material, and this is true across all of them, right? 00:21:41 Taking copyrighted material and using that to train your model. So you're using that material. 00:21:44 You're right. You're right. That law isn't even fully settled. 00:21:51 Well, and that's a good point. So, all things considered, the concepts are not new, right? 00:21:57 Music has existed, and notes, blah, blah, blah. But if you're taking data that you don't own 00:22:01 and using it in the training of the model, I do think you have an argument then. 00:22:05 And there's no difference with the image models, 00:22:10 right? The same exact thing. Again, I've got a sequence of pixels, 00:22:14 and the sequence of pixels that I see is pretty much standard.
So, when I generate a new one, 00:22:17 I'm going to take the sequence of pixels and try to figure out what the next one, 00:22:22 or what the proper one, is. I don't know. Nevertheless, 00:22:26 I think ultimately, for generative AI, we're going to have to have a whole new set of laws 00:22:30 set up. Yeah, like existing fair use doctrine. 00:22:33 Right. It doesn't work for hardly any of this. It's a completely different use case. 00:22:37 Exactly. I do have a question. I don't know if we have any music buffs in here, but... 00:22:41 I know someone has a band that sings. Do you make money in music these days? 00:22:47 You're in a band. You answer that for us. We don't make money. Isn't it your image, though? 00:22:50 Literally, yes. I think it's your image. You don't make money on your music. 00:22:55 But you make it on the same exact thing it's actually always been. 00:22:59 Really? Right. Which is, it's your... It's yours. I feel like, back in the day, CDs... 00:23:02 CDs absolutely did. Like, we had a sweet spot from, like... 00:23:09 They just cranked it, dude, from like '80 to '95, '96. Then it went out the window with MP3. 00:23:14 Exactly. Metallica's making millions of dollars off their CDs, and all of a sudden 00:23:18 Napster comes up, and they're like, wait a minute. And now we've got to go back to it. 00:23:23 "Buy my record" was the call to action. Yeah. And now it's "come to my concert." 00:23:26 Yeah. Yeah. It is. Yeah. Because that's the thing you can't replicate and can't... 00:23:32 Totally. No. But it wasn't that originally. If we really think about what music 00:23:37 really was back then: it's always been a performance. Exactly. You go and you enjoy the 00:23:41 thing, you maybe buy some of their shit, and then you buy some of their stuff later on. I mean, 00:23:46 that's what it's always been about.
It hasn't been just capitalizing on the format of how I 00:23:51 deliver my music. It was much more about how do I experience it in a holistic way. 00:23:59 So, if you're a famous musician, deal with it. The whole Metallica thing, when that all came out 00:24:07 with Napster, and they were suing fans for pirating their music... man, that was a weird time. 00:24:13 Oh, I got a cease and desist from Eminem one time. Really? Yeah. Well, my roommate... 00:24:19 we shared internet, used Comcast or whatever, and they must have downloaded an Eminem song 00:24:24 that was tracked or whatever. And I opened up this letter, because I was the one paying for the internet, 00:24:33 and it was like, Eminem says, stop it. I've had many 00:24:41 "stop doing this" letters from Comcast specifically. Really? Oh, yeah, yeah. I'm not saying I'm a pirate. 00:24:50 But I pirated a lot of stuff back in the day, and I had Windows Media Center, a personal video 00:24:57 recorder. The greatest software Microsoft has ever written, their personal PVR software. 00:25:02 It was this 10-foot UI, I mean, it looked like TiVo. I mean, 00:25:07 it was nuts. So anyway, back then, sure enough, I would constantly be getting them. Finally I 00:25:11 said, okay, I'm setting up a VPN, and now I'm not getting them anymore. But then all of a 00:25:16 sudden, I got one for downloading this game, and I'm like, what the hell is this? So 00:25:21 I go up to my son. I'm like, hey, we just got a notification for pirating this game. He's like, 00:25:27 what? He's like, I literally just finished downloading it. And I'm like, listen, if you're 00:25:32 going to do this, you need to talk to me about it. I'm going to give you my ProtonVPN 00:25:39 account, and I don't care if you do it. I just installed the VPN. 00:25:44 It's not like I reformed my sinner ways. No, no, no, no, no.
The modern birds and the 00:25:49 bees talk. It was, try to do it safely. It was such a good thing to do. It was funny. So, 00:25:54 by the way, if you're pirating stuff using BitTorrent, they can see your IP address. Everybody can. 00:25:59 They literally just sit there looking at that one specific file, look at all the IP addresses 00:26:03 that are downloading it, take all of those, and blast it out to 00:26:09 whatever network providers own those IP addresses, and you're going to get dinged. So just stop 00:26:13 doing it. Use a VPN. Yeah. Run a VPN. Case closed. That's it. Yep. Too slow. 00:26:21 So, ProtonVPN has actually been really quick. Is it? Yeah. This episode has been brought to you by 00:26:27 ProtonVPN. By the way, what's funny is I'm getting more and more 00:26:33 people hitting us up to promote their products, where they've got affiliate stuff, and, you know, 00:26:38 there's maybe money to be made there. But most... We will make money on this podcast. Eventually. At 00:26:44 some point. Before we start printing all this money with our podcast, let's talk about my 00:26:54 favorite topic, probably: developer productivity. So, I think that AI is a piece of this, but, 00:27:01 you know, we have four developers sitting here, and I think we can all agree 00:27:08 on one thing, which is: one developer versus another developer, not always the same output. 00:27:14 No. Yeah. And it's often not even, like, talent level or anything. It is just, like, 00:27:20 sometimes it's unlocking the right thing in the person's head, right? That suddenly, like, 00:27:24 it just flows. Right. Most productivity for developers happens when they're in flow state. 00:27:28 Yep. When they know where they're going to go and how to get there, and they just fly, right?
00:27:31 And that's where you end up losing an entire afternoon, because you've been cranking on something, 00:27:37 right? And one person might have lower starting friction, so they get into flow 00:27:40 state easier, and then they're more productive than everybody else. Whereas someone else might 00:27:44 take longer to get into that, but then once they do, they go down that road, or they have 00:27:49 better knowledge for... like, you just happen to get them into the right zone from there. 00:27:55 Sorry. That went off the... No, that's really good. So, flow state is basically 00:28:04 something that most people, I feel like, need to find a way to achieve in their profession, if they can. 00:28:07 I would argue that AI, because we were talking about this before the show, 00:28:12 AI can help you... it can hurt you, too, but it can help you get into flow state. Like, 00:28:17 the other day, I had an issue, and I didn't know how to get started. 00:28:21 So, normally what I would do is I'd sit there and I'd think about it really hard for as long as it 00:28:28 took. You just popped in there. No, I'd go Google 50 things, and eventually I'd figure out the 00:28:33 right thing to Google, and I would eventually get there. But with AI, you know, chatbots, I can 00:28:39 describe generally what the problem is, and I can get a good mental model of where I'm trying to go, 00:28:44 then I can get back to coding and get right back into that flow state. So it can help there. 00:28:50 It's also hurt me at times, when I've stepped out too soon, when I wasn't ready to ask the question, 00:28:53 and it's like, I really just needed to keep going, because I was this close, you know. 00:28:58 In my case, I found that it hurts me a lot.
Again, you're talking to somebody that turned off 00:29:04 code completion a long time ago, because it would be wrong just often enough that it would 00:29:08 piss me off, knock me out of flow state, and I'd be like, just go away. So, I turned it all off, 00:29:16 because oftentimes a lot of these tools are wrong just often enough 00:29:19 that they can trigger that and knock you out of it. And it's like, wait, that was weird. 00:29:23 That wasn't what I actually expected it to be, and then you'll go further. You'll end up 00:29:30 breaking out of that state where you can be super productive. For the starting friction block, 00:29:35 that part is interesting. Like I mentioned earlier, your product person gives you a set 00:29:39 of requirements, and you're breaking those requirements down into the individual tasks you 00:29:45 have to go do. For some people, that comes naturally, and they just go. There's lots of other folks who 00:29:49 need to break that down ahead of time. And if you can throw that into a Claude or some other AI, 00:29:53 and it's like, hey, what this actually means is you need to do this, this, this, this, this, 00:29:57 and this, right? That helps grease the wheels and get that starting friction out of the way. 00:30:03 Right. I think that one thing that we've been doing with developer productivity is 00:30:10 trying to completely abstract developers away from having to do any sort of reporting or 00:30:15 accountability on their own. It's just done for them. Thank you. Yeah, seriously. 00:30:23 So one thing that I do is I go through and I create a report from Jira of all the tickets 00:30:27 that have been worked on and all the activity in the stream. And I get that, but I get 00:30:31 everything. I get all the comments, I get all the descriptions, I get the task name, 00:30:35 and I throw that into ChatGPT and I say, hey, what happened last week?
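The Jira-export-to-chatbot workflow described here is mostly copy-paste, but the prompt-assembly step can be sketched in code. This is a hypothetical shape — the `Ticket` interface below is an assumption for illustration, not Jira's actual export format:

```typescript
// Sketch: flatten a week of ticket activity into a single prompt for a chatbot.
// The Ticket shape below is an assumption, not Jira's real export schema.
interface Ticket {
  key: string;        // e.g. "BC-42"
  summary: string;
  status: string;
  comments: string[];
}

function buildWeeklyPrompt(tickets: Ticket[]): string {
  const lines = tickets.map(
    (t) => `${t.key} [${t.status}] ${t.summary}\n  ${t.comments.join("\n  ")}`
  );
  // One big blob of context plus the question -- no RAG, no embeddings.
  return `Here is last week's ticket activity:\n${lines.join("\n")}\n\nWhat happened last week?`;
}
```

From there, the returned string is just pasted (or POSTed) into whatever chat model you're using — exactly the "copy huge amounts of text, ask your question" approach discussed later in the episode.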
00:30:42 I've noticed it helps incentivize actually using those tools and writing better 00:30:46 comments and all that, because before it's like, oh, comment your code, make sure you're doing it, 00:30:49 but half the time we don't even look back at it. And sure, six months from now, 00:30:54 you might be thanking yourself for documenting it well, but you may not either. You may not ever 00:31:01 look at that piece again. I think that creating real-time content is the best way to, you know, 00:31:08 put your stuff somewhere you can leverage a little AI solution later, right? Did you do this 00:31:13 with code or are you just copying and pasting? Oh, no, I just created a really nice, 00:31:18 basically, web view of it. So when I copy and paste it into the chat, it knows what I'm doing. 00:31:23 You just grab it? Completely manual. Yeah. And I think that that's actually the advice: 00:31:29 figure out a way, if you can, to just toss some stuff in there. Exactly. Just toss it in 00:31:34 there. Then you can figure out how to automate it once it's done. I did that with, I don't 00:31:39 know if we talked about it, but one of my clients, I needed to get them excited about AI. So I just 00:31:43 went to the Better Business Bureau and I started grabbing all the complaints 00:31:50 and the reviews. And basically, literally, I'm just like, select, just copy it, 00:31:55 drop it into our AI tool. And I just ask it, hey, summarize this. And they're just like, 00:32:00 oh my God. I'm like, yeah, that's literally it. You just copy huge amounts of text, throw it in there, 00:32:05 ask your question. You don't need RAG. You don't need any of this shit. Just copy, paste, 00:32:12 ask your question. Yeah.
And on the other side of it, you also have code generation, 00:32:20 which is actually proving to be one of the most interesting aspects of ChatGPT, 00:32:24 Claude, the... here we go, Brandon kicks something. I'm starting to break things. 00:32:34 How are you guys using AI to generate code right now? I'll go first. So I'm all in, right? Like, 00:32:38 at night, usually, what happens is I go down and I'm like, all right, 00:32:41 I'm just going to get on my iPad, I'm not going to code anything. And then I'm like, oh, I've got 00:32:49 an idea. So last night, you know, I needed to have this WordPress wrapper in JavaScript. 00:32:53 And I've done this before, I did it for our Big Cheese, like a lot of it's automated, so I've 00:32:56 already written some of it. But I'm kind of curious to see what would happen if I 00:33:01 asked ChatGPT and Claude to basically go and do this. So I'm like, I want you to build me an npm 00:33:08 package, TypeScript-based, that's basically just a Node wrapper around the WordPress API. 00:33:13 And then just go, and sure enough, right? Both of them. ChatGPT: here's what your file 00:33:18 structure is going to be. And then it just goes, here's what your tsconfig is, here's 00:33:22 what your package.json is, here's the main file, here's all of this. And it just 00:33:27 goes through and it generates it. Claude did the same exact thing. And then you 00:33:31 literally just copy and paste and you build it out. And you're done. I mean, you're done. 00:33:37 And it worked. And it's just like, this is actually really incredible. Again, nothing novel, 00:33:42 right? All we're doing is taking this PHP-based API that has all the documentation and 00:33:46 making it accessible through JavaScript.
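For a sense of what the generated wrapper might look like, here is a minimal hand-written sketch: a tiny typed client over the public `wp/v2` REST routes. The class name and method set are illustrative, not what ChatGPT or Claude actually produced on the show:

```typescript
// Minimal sketch of a Node/TypeScript wrapper around the WordPress REST API.
// Targets the public /wp-json/wp/v2/ routes; error handling is deliberately thin.
class WordPressClient {
  constructor(private readonly baseUrl: string) {}

  // Build a wp/v2 endpoint URL, e.g. endpoint("posts", 7) -> .../wp-json/wp/v2/posts/7
  endpoint(resource: string, id?: number): string {
    const root = `${this.baseUrl.replace(/\/+$/, "")}/wp-json/wp/v2/${resource}`;
    return id === undefined ? root : `${root}/${id}`;
  }

  async getPosts(): Promise<unknown[]> {
    const res = await fetch(this.endpoint("posts"));
    if (!res.ok) throw new Error(`WordPress API error: ${res.status}`);
    return (await res.json()) as unknown[];
  }
}
```

A real package would add typed models for posts, pages, and media, plus authentication — but this is exactly the kind of well-documented, commodity surface an LLM can fill in quickly.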
But the reality is that I would have to spend 00:33:51 three days basically writing this by hand, testing it, validating all of it. 00:33:57 In literally 30 minutes, I had all of it done for me and it was ready to go, and I could just 00:34:04 basically copy and paste and build it out and I'm rocking. That, again, is kind of my 00:34:08 thing: I don't think developers get replaced here. I think developers who don't use 00:34:14 AI get replaced by developers who do use AI, because all of a sudden that one feature 00:34:17 that I had to build, I was able to do in 30 minutes versus three days. 00:34:23 But it sounds like, from what I'm hearing with you and how you're doing it, 00:34:29 you're typically going to AI especially when you are doing something with an API that 00:34:35 you don't typically interact with. Versus, in my case, now I'm just writing 00:34:38 JavaScript and front-end stuff today. You know what I mean? 00:34:40 Yeah, like your own custom stuff. 00:34:44 It's the stuff that you know tangentially, but you aren't necessarily an 00:34:50 expert in. Yeah, it's very much like: I know exactly what I need to build, and I know the LLM 00:34:54 probably has a good model of what I need to build already. So go and just build it out for me, 00:34:58 so I don't need to spend the time. Let me focus on the novel things, the things that it doesn't 00:35:02 have an idea of, the things that we haven't figured out yet. That's where I want to spend my time, 00:35:06 not this bullshit of just dealing with things that already exist, that we already know about. 00:35:13 It's using AI to do the commodity code. How I think a lot about the current state of AI is 00:35:18 that it replaces commodity resources, right? 100%, like the bottom end of the market. 00:35:21 That task you just described, you would have put on Fiverr, 00:35:24 farmed it out to somebody. Exactly.
Low-value, like, here, go do this thing. 00:35:28 Yep, yep, yep. And it's like, me, you would have just been like, man, I don't want to do this, 00:35:33 and then just stopped completely. That's a lot of busy work. I know how to do it. 00:35:37 That seems hard. It seems like I've got to, like, read stuff. 00:35:43 Yeah. So those are the things that I find extremely valuable right now, that I can just 00:35:50 offload a lot of that work. But at the end of the day, it's still code that I can review, 00:35:54 that I can validate, that I can be like, all right, this is it, this is going into my repository. 00:36:01 You want to know, so to your point of commodity versus novel, I was using a library that is 00:36:05 fairly new, the documentation isn't that great yet. And I copied some of their code and I was 00:36:09 asking questions because I wasn't getting this event to fire, right? I think it didn't answer 00:36:14 my question because it just hasn't been in existence long enough and it didn't know. But 00:36:19 it did tell me, like, three errors that their code had. And I was like, oh yeah, they aren't 00:36:24 awaiting that. It didn't matter, because I was just firing off a thing and technically I 00:36:30 didn't have to await, but you probably should still await. And it's probably good 00:36:36 for, like, if I were trying to learn a new language: paste the code in and give me 00:36:41 the code translation. Yeah, like, this code in JavaScript, what does it look like in Ruby, 00:36:46 something like that? And we're talking about developer productivity, not developer 00:36:50 replacement or developer training. Yeah, right. We're talking about productivity, and productivity 00:36:54 means that you have to already be in that profession. You don't just go start working on cars, right? 00:37:00 And fixing them. Yeah.
One of my clients had a junior developer at one point that they hired, 00:37:05 and he started to try and do all his code through AI, ChatGPT and stuff. And that didn't work. 00:37:09 No. No. It didn't work. That might be one of the dumbest things you could do. 00:37:14 Yeah. Yeah. If anybody is going to be using ChatGPT to generate code, it needs to be a senior 00:37:20 guy, right? Because a senior guy is going to be like, ah, you idiot. So, Jacob and I 00:37:25 have Crafted. We have 10 developers that work for us. 00:37:34 And Jacob and I adopted AI as a test about a year and a half ago. And then about eight months 00:37:42 later, we let two of our senior developers start utilizing AI as a test, because with the 00:37:47 junior developers, we thought that their growth would actually be, 00:37:52 yeah, like it would be stunted. The writing of code is the easy part, right? The understanding 00:37:56 of code, and translating the code into what this thing is actually supposed to be doing — does that 00:38:01 match what the user told me, or what the product person told me? That's the hard shit, right? Yeah. 00:38:09 And you know what else is hard? A customer calls: everything's busted. 00:38:20 Right. And now you go to everything's at alert level 10. And AI isn't going to fix that, right? 00:38:26 You have to have so much different expertise and 00:38:31 understanding. There are so many different skills that come into that moment 00:38:36 that prepare you for that. Even if AI could fix it, which it might be able to help you with, 00:38:41 you still need to ask the appropriate questions. We had a DNS issue yesterday, right? And the 00:38:46 reason I knew how to solve that is because I have experienced that issue before, right?
00:38:51 And I knew, you know... So, like, senior developers have the experience to leverage, to ask the right 00:38:56 questions. Eventually AI might be able to supplement, or give me a little bit of extra, like, 00:39:00 fill in the blanks of, oh yeah, I've never seen that, maybe we'll try that too. But if I'm just 00:39:05 sitting here as a new developer and I've never had this issue before and a customer site went 00:39:09 down, I'm just going to be looking at a blank screen with a cursor, like, oh shit. 00:39:15 In a previous life, I was a coding bootcamp instructor. So I'd take adults and, you know, 00:39:19 day one is, like, teaching you what a variable is, right? And the thing I told them all the time 00:39:23 is that the difference between a junior developer and a senior developer is how many bugs you've written 00:39:28 and how many bugs you've fixed, right? That comes from understanding code, fixing code, 00:39:33 just experience over time: how many mistakes have you made, and how quickly can you 00:39:36 recover from those mistakes. I think one of the best developer 00:39:44 tests that I have is, when there's a production bug, how fast do they fix it? 00:39:48 Yeah. And I think that there's two things that come into that: understanding of the code base 00:39:53 they're working in, and previous experience. Yeah, and staying calm under pressure, which I think 00:39:58 those two things lend to. Like, I used to get really flustered when something would break, and 00:40:06 now it's like, dude, I've seen so much shit break. How many times, though, in those situations, 00:40:13 do you actually look back and go, that's actually insane that that was even fixed? 00:40:20 How did I get to the point where I literally found that one file and that one little character, 00:40:26 and then the whole system just comes back, right? It's just a thin red 00:40:32 line that holds all this stuff together, and it has to come...
It has to come from 00:40:36 capable, you know, people that are maintaining these systems. 00:40:39 Oftentimes I've had the opposite experience of, how did this ever work? 00:40:48 I mean, in the world of... So, for product managers, I've got a confession to make: 00:40:50 this whole thing's a fucking house of cards. 00:40:57 It literally is. Yeah. I mean, like, we have polyfill.io, they got hacked by China this 00:41:02 week. Did you see that? Well, I didn't see that polyfill was hacked by China. That's crazy. 00:41:09 So polyfill.io was a third-party library that you would include to pull in a bunch 00:41:14 of JavaScript that made your website work on older browsers. Well, the domain didn't get renewed, 00:41:20 and someone from China bought the domain. And so anyone that had that script on their website 00:41:23 got redirected to a scam. Which, again, is almost everybody. 00:41:29 And we had left-pad, the npm package, a year ago. 00:41:33 Yeah, left-pad. And that was just, like, 00:41:37 really, it was like four lines, padding a string on the left. 00:41:42 So you have all these packages that are maintained by people that you don't know, 00:41:45 that are in a public registry, that hold up all of our software. 00:41:50 Okay. And we're supposed to make this shit work. Yeah. 00:41:54 You know, a junior developer walks in and goes npm install, you know, and it just installs the 00:41:59 latest package. You deploy it to production, you think everything's great. Literally banking 00:42:05 software. There was some npm package the other day that I chuckled at, where, like, the developer 00:42:08 who maintained it, it was an underlying one, I forget which one it was, but an underlying package 00:42:13 that a lot of packages used. And he was just fed up. Someone wrote him, like, a threatening 00:42:16 comment about, you need to fix this shit. And he's like, I quit.
00:42:21 Just dropped it, and half the fucking internet goes dead. 00:42:26 So, I do have a question about that. Who the hell wants to maintain open source software? 00:42:32 Nobody. If you've ever... so I did it, right? For eight years, I ran an open source 00:42:37 project. Of course you did. Right. And I'll tell you what, dude, it is the most 00:42:42 painful experience ever. No one wants to give you money for shit. 00:42:47 Everybody wants to complain about absolutely everything. And then you get these pull requests 00:42:55 from assholes who have no idea what they're doing. And you're just like, oh, no. Like, what the hell 00:43:01 is this? And so it's very easy to see how you could do just traditional human social 00:43:06 engineering to basically fool these guys. Somebody could come to me — there have been multiple 00:43:09 times that this has happened — where somebody starts engaging with the 00:43:14 package guy who's managing their open source thing, giving them feedback and 00:43:18 ideas, and just talking to them for, like, a year. And then all of a sudden it's like, hey, I just 00:43:22 submitted a PR to go fix this one bug. And he's like, well, I've been talking with this guy for 00:43:28 a year now, I trust him, go ahead and do it. No, man. And then he sneaks in his little, 00:43:32 you know, the very shit we're talking about. Now, that brings us to the point of, what happened? Oh, yeah, we're talking about how 00:43:37 AI could solve some of these issues. Like, I'm sure Cloudflare does 00:43:42 the obfuscation stuff, and I'm sure there are people at GitHub working on it, 00:43:44 but they should be able to scan your code. 00:43:50 Exactly. And flag that PR where all of a sudden you've just added some external call to a, 00:43:55 you know, a home server in China or wherever. That would be huge.
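An AI reviewer would presumably be doing something fuzzier than this, but even the deterministic version of the check described here — flagging added diff lines that call out to hosts you don't recognize — is a short function. The `+` prefix for additions is standard unified-diff convention; the allowlist and function name are assumptions for illustration:

```typescript
// Sketch: flag newly added diff lines that reference external hosts outside
// an allowlist -- the kind of check an AI review bot could layer heuristics on.
function flagSuspiciousAdditions(diff: string, allowedHosts: string[]): string[] {
  const urlPattern = /https?:\/\/([^\s/"'`)]+)/g;
  const flagged: string[] = [];
  for (const line of diff.split("\n")) {
    // Only look at added lines; skip the "+++ b/file" header.
    if (!line.startsWith("+") || line.startsWith("+++")) continue;
    for (const m of line.matchAll(urlPattern)) {
      if (!allowedHosts.includes(m[1])) flagged.push(line.trim());
    }
  }
  return flagged;
}
```

A bot like this would run on every PR and leave a comment when an addition phones home to an unknown host — cheap to run, and it would have caught the exact "external call to a home server" scenario described above.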
00:44:01 And in code review generally. That's a good idea. Because, like, humans are not particularly good 00:44:07 at spotting, like, syntax errors or logic errors, nor are we 00:44:10 particularly good at not rubber-stamping PRs. A hundred percent, that's really what I'm saying. Yeah. 00:44:21 Great. But I wonder why we aren't seeing more of that at GitHub, 00:44:27 considering that Microsoft owns GitHub, right? And Microsoft now owns OpenAI, if we're 00:44:33 being honest. Why are we not seeing that yet? You'd think that's a simple one, 00:44:38 that they would be like, yeah, let's just do this on github.com. Let's just make it so 00:44:43 any PRs that come in get analyzed by AI. I mean, they've got Dependabot and security 00:44:47 bots. And I don't know where those bots come from. Yeah. I just ignore my Dependabot 00:44:51 emails every week. Also, if you're Microsoft, how much revenue does GitHub bring 00:44:56 in, right? How much resources are you devoting to that? The data side of things? The data side. 00:45:02 Yeah. But you don't need to do code review to get that data, right? No, you're right. They've 00:45:06 got bigger fish to fry. That's true. Yeah. A bigger opportunity, though, 00:45:11 because it's needed. I mean, it literally is how we can protect our internet: 00:45:17 by having a third eye basically sitting here watching when these people are trying to do this 00:45:22 stuff. And we would have been able to mitigate a lot of the problems that we've had. Well, I would 00:45:29 say two things. Now we're talking about security, but, you know, understand that your 00:45:36 software is a collection of code that you didn't write. And then, don't hard-code your fucking API 00:45:44 keys. Because that's what Rabbit did. Oh, so Rabbit, Mr. Rabbit got hacked.
It's like 00:45:49 that scene in Snatch, where, with the dog and the rabbit, they bet on whether the rabbit 00:45:57 gets fucked. And the rabbit does not. So wait, so Rabbit, the hardware device? Yeah. So some 00:46:02 people have figured out, well, really what it is, is it's actually just running Android, 00:46:06 and it's got a piece of software. The Rabbit software is literally just an app that's running 00:46:11 on Android. So people have been able to extract it, and then they found all of the keys. 00:46:16 They're all hard-coded API keys. ElevenLabs. 00:46:22 And then another one. And by the way, when they published the article, 00:46:27 Rabbit did not revoke any of the API keys. They were all still active. They're like, no, 00:46:32 our researchers went in and looked and it's fine. Yeah, literally, that's how they responded. 00:46:37 And then the security researchers that hacked them were like, we can see everything that 00:46:42 your users have ever done on this platform with this API key. We can use this API key, 00:46:47 we can get all the history, everything that they've ever done, we can expose you. Yeah. And, 00:46:52 you know, I think that comes to the bad side of AI, which is people chasing 00:46:58 this thing that is not going to work. We talked about the product dilution: 00:47:04 500-plus image generation apps, 500-plus content generation apps, everybody wants to 00:47:09 develop this or that. But they're not taking the mindset of, I want to create 00:47:15 value and create a company. It's cash grabbing. And the guy who created the product was also a 00:47:20 former crypto bro pushing some shitcoin. That's true, right? Yeah. Yeah. 00:47:27 Just so new listeners get it: you are the product, people. 00:47:31 Yeah. You're the product.
You've been had bad, dude. Because, 00:47:37 like, there are a lot of cool concepts that that team brought, but it was all bullshit. The 00:47:42 large action model, right? Not a thing. Doesn't exist. Sorry. There we go again. 00:47:47 Yeah. And I mean, you're not going to replace the... people aren't 00:47:52 going to stop buying iPhones. No. Dumb. But I do think that when it comes 00:47:57 to products and AI, you have to understand that a lot of people are in this just for 00:48:01 the cash. Oh, 100%. Yeah, totally. There's a lot of folks that are looking at AI as the what 00:48:06 instead of the how. Yeah. Right. It's like, we have to go do AI because our venture fund 00:48:10 folks want us to do AI, so they can get more money, right? Yeah. Instead of flipping it around 00:48:15 and asking, what is the thing we want to do? What is the what? Right. And then we'll figure out 00:48:20 AI as the how to go make that. Yeah. And that's why I think, if I'm a developer, 00:48:25 if I'm in tech, if I'm really in any sort of productivity industry: 00:48:31 there are a lot of AI tools out there right now that can just 00:48:35 augment your productivity. Yeah. Help you be better. Worry about your privacy, 00:48:40 worry about your security, understand what your company's AI policies 00:48:45 are, right? If they have them. Well, yeah, that comes to my story. 00:48:52 I have that story. But you should be using, or at least trying to use, AI to help you 00:48:58 get the mundane stuff done. Because the mundane stuff is what you get the thumbs up from 00:49:03 your boss for. Yeah. Thank you for that report. Thank you for that really well-crafted email. 00:49:09 Did you watch that little snippet video with Opus? Not Opus.
The one you shared on 00:49:13 the... yeah, Claude Sonnet 3.5. I need to figure out, I want to figure out how to do that. So, 00:49:18 talking about mundane stuff: they had a program, just a little Python 00:49:22 script that took an image and then cropped it into a circle and removed the background, 00:49:29 made it a PNG or something. There was a bug. So they used Claude to fix the bug, but also write a test. 00:49:34 And they gave it access within a sandbox environment to be able to read and write files. So 00:49:41 they literally, from the terminal, were like, hey, write me a test script for this function file. 00:49:46 I'm like, okay, that looks like that would be fun. That was awesome. Yeah, I want to figure that 00:49:50 out. I couldn't figure out how they actually pulled it off, though, because it was 00:49:54 all running locally. So absolutely, now, once you shared that, I'm like, I've got to figure 00:49:58 out how to do this. So here's where I was going to turn this on its 00:50:05 head. Yeah, so we had a situation in the last two weeks. I don't know if you've had AI 00:50:10 join your meetings. Who has had a Zoom meeting where they joined? Yeah, yeah. 00:50:16 I have a huge problem with that. I have a huge problem with that. Yeah, I do not like it. 00:50:21 I think it's completely intrusive. Yeah, I say shit in meetings that I would never say if it 00:50:28 was recorded, for a reason. But it's coming. It's coming to one of my clients now, 00:50:33 where the boss is joining a meeting that they're not invited to with their AI. 00:50:42 That's going to happen. Yeah. Yeah. Absolutely. So the promise is, oh, well, 00:50:47 we can record the meeting and we can do notes, and later on, if you want to review what happened, 00:50:53 cool. But the reality is, well, now the boss can come and watch this meeting, and then 00:50:59 you're just never not being watched.
Yeah. So, again, with one of my clients, 00:51:04 the idea is basically being able to take the entire data estate and turn it into an AI- 00:51:11 addressable system, where, with the right access, a CEO could hypothetically 00:51:17 come in and say, hey, show me every transcript that talks poorly about the CEO. Yeah. Correct. 00:51:21 And here's the problem that I'm seeing: no one's talking in these meetings anymore. 00:51:27 Yeah. Well, if you're in sales organizations, you've already lived in this world for a 00:51:31 long time. Like, Gong is in every sales call you've ever done. What's that? Oh, 00:51:35 it just listens in on all the sales calls, and later on they'll get 00:51:38 transcripts of it, they'll run analysis on it. So we're just being held accountable like other 00:51:44 people have already been. Like every sales team. 00:51:50 Right out of college, I did this really shitty job selling DirecTV or, 00:51:56 yeah, DirecTV or Dish Network. Anyway, I was pretty good at it. It was inbound sales. 00:52:01 But anyway, I'm sure they told me they were recording me, I had just forgotten, 00:52:06 and my boss came up to me. She was like, hey, really good job this week. You're saying the word 00:52:14 'yep' a lot, though. I was like, damn, they're listening to me? It's like, you know, 00:52:21 how many department calls go on, right? Zoom calls, Teams calls, right now, where 00:52:27 department A is just shit-talking another department. Oh, totally. Yeah. Yeah. Now, every single call... 00:52:32 So now that I've got my one app that I've built, and I keep trying to do it natively, 00:52:36 and I keep running into walls, but nevertheless, every single call that I'm on now, I'm recording. 00:52:40 It's all local, though. That's my big thing. And I tell everybody, I'm like, hey, listen, 00:52:46 I'm recording the call.
Just know that it's all local. I'm running it through Command R+ as my 00:52:53 model. But it's invaluable. Like, when it doesn't record — I've had a couple times 00:52:59 that it didn't record — I'm just like, oh my God, I don't know what I'm going to do. The value... 00:53:07 So, the future of work is surveilled and self-documenting. Yeah. 00:53:12 It's probably one of the biggest changes that's going to happen to how we work, in my opinion. 00:53:18 I think we've got to figure out how to live in that world. 00:53:25 I think that being careful with what you say is now important, which takes a little bit of the 00:53:31 creativity away. Yeah, not me. I don't give a fuck. That's a good point. I don't give a fuck either. 00:53:35 It's super frustrating, because I can think of a lot of developer meetings I've been a part of 00:53:40 where we've ideated and had really good conversations around possible solutions, and we 00:53:44 didn't take good notes, because we're dumbass developers. It would have been really nice to be 00:53:49 like, hey, what was that little thing I said last week with candy? You know, literally every day 00:53:54 I'll go and I'll be like, oh, what were my takeaways from that meeting? Yeah. All right. That's why I've got 00:53:58 new practices, like doing that, essentially: as soon as a call ends, I send a recap. 00:54:04 Yeah, totally. But you're pulling from your memory. My memory? Like, I'm over here thinking of 00:54:08 cartoons that I watch. If I don't do it as soon as the meeting's over, it's gone. Totally. 00:54:13 So I was having one with Prompt Privacy, and Sabrina and I were talking, and there was 00:54:16 all this great information. And I'm like, I'm glad I'm recording this, because I couldn't 00:54:20 remember all of it. And then sure enough, something went wrong and it didn't record. And I'm like, 00:54:24 oh my God. So I just tried to capture all of it that I could.
00:54:32 It's too valuable not to exist, and it will change things. And for people like me, my 00:54:38 style is very off the cuff and instinctual, and so it hurts a guy like me, right? Because 00:54:43 they're going to catch it. They're going to get canceled. I'm going to get canceled. I'm getting 00:54:47 canceled. Can you do, like, a "we'll take this offline," and after that call ends, just the 00:54:52 two of you get on a call? That would require you to have the forethought to say "we should take this 00:54:57 offline." I was actually told on this call, by the way, that it only requires one person's 00:55:03 consent to be recorded in the state of Indiana. That's correct. Oh, is it, in Indiana? We're not 00:55:08 counting a baby. Yeah, that's funny. I used to know the list of states back when I worked in 00:55:16 journalism, but I can't remember them. But also, isn't a lot of this a huge material financial 00:55:20 and security risk for these companies? Very much. When are the lawyers going to come in and 00:55:23 be like, give me all the transcripts? No, I know, absolutely. They like transcripts. 00:55:28 Once some of the lawyers at one of my bigger companies really stepped back and started 00:55:34 thinking about it — wait a minute, we have Zoom recording and transcribing every single one 00:55:39 of our instances where we're having a conversation — it's problematic. Well, yeah, I'm 00:55:44 sure some engineer at Boeing at some point has said on a recording, we shouldn't be doing the 00:55:50 next one. We don't need doors. Yeah, doors are not... As soon as you create the first-ever 00:55:58 no-AI Zoom meeting software... Let's release it next week. So I think 00:56:04 it needs to have the AI in it, but it has to be private. That's the key: the people who 00:56:09 are going to do things privately will ultimately, I think, be able to win in kind of the long run.
00:56:13 Would you just have, like, a read-once-and-then-burn, Burn After Reading situation, 00:56:17 which is a pretty good movie if you've seen it? Potentially, yeah, it could just be 00:56:28 something like that. Send us your, send us your AI-enabled Zoom alternative 00:56:33 software ideas. Yeah, do it. Also, this is making me laugh, because everything's coming full circle, 00:56:38 right? It's like online shopping: everything went online, and then they're like, oh, what if we had a 00:56:48 place where you could actually try the things? Netflix is building out a retail model. 00:56:54 Well, it's also like a Best Buy. They're probably just going to sell Roku TVs. Are they going to 00:56:59 do commercials too, or some shit? It's just like, they just reinvented cable. Why are they doing the 00:57:04 Amazon thing where it's like, these are the shirts that are on the show right now? Yeah, 00:57:11 you're watching Amazon: buy now. Shane Gillis. If you've actually watched Shane Gillis's stand-up, hilarious, 00:57:16 by the way, if you're not easily offended. I'm sitting there, I'm watching him, I'm like, 00:57:21 dude, I like that guy's shoes. And Emily and I are watching it, and literally, like, 10 minutes later, 00:57:25 she's like, all right, they'll be here tomorrow. She somehow or another found the exact shoes 00:57:29 that he was wearing. And it was these guys. And she bought them for me because I was watching 00:57:33 Shane Gillis and I liked it. There's a website for furniture, I think, where you can, like, 00:57:37 copy a link into it. Or maybe it's not just for furniture. But anyway, you 00:57:41 append the link at the end of the website and it'll send you... Oh, that's pretty cool. 00:57:47 ...a cheaper lookalike thing. Like, a West Elm $500 chair? Well, there's one on Amazon for $50. 00:57:55 That's cool.
Now, for the Android sycophants that we have on our thing: 00:57:59 Android now has a thing, supposedly — I guess it's a Google thing — where you can circle 00:58:03 a thing that you see in a video or whatever, and then it'll try to go and find 00:58:06 those products. All 12 Android users are having a great time with that. 00:58:15 Can't wait for Apple to come out with that feature. This has been another Big Cheese Podcast. 00:58:19 Tell our fans how they can get in touch with you. 00:58:24 Yeah, I'm Chris Vanoy. I'm available at chrisvanoy.com or getaxiomatic.com. I'm working on 00:58:29 training engineering managers how to actually manage people, who, it turns out, are a lot 00:58:33 harder than computers a lot of the time. And if you're anything like me, you're gonna screw it up for 00:58:38 about five years before you figure it out. So yeah, thanks for having me on. I really appreciate it. 00:58:40 No, that's awesome. Awesome. All right, we'll see you guys next week.