Hosted by Brandon Corbin (filling in for DeAndre Harakas), with co-hosts Sean Hise, Jacob Wise, and a special mention of Tim Hickel for the upcoming episode.
The discussion on AI predictions is set against the background of upcoming practical episodes on marketing and personal finance.
The conversation kicks off with expectations for Apple's performance, particularly their financial growth and AI-related developments:
Anticipation of Apple's stock potentially hitting $1,000 and possibly splitting twice.
Discussion revolves around Apple's developments, such as the Apple Vision Pro, financial performances, and the rise in stock prices post-announcement.
Mention of AI research, narrative on Reddit, and Apple's potential success in the AI space.
Debate on Apple's AI-powered M4 chip, expected to drive on-device AI and spur an upgrade cycle for Apple products.
Conversation extends to the broader tech ecosystem:
Noting a decline in the rate at which people are purchasing new iPhones.
Reference to Apple's shift in focus from producing cars to focusing on AI and robotics.
Personal anecdotes related to Bitcoin, with discussions on its value increase and relationship to AI.
The role of AI in streamlining personal efficiency and organization is highlighted, with practical examples of using AI tools like ChatGPT and Gemini for personal tasks.
Discussions on Notion and content creation platforms:
Notion is highlighted as a powerful and flexible platform for content organization and sharing.
Comparison of Notion to Apple and Obsidian to Android, with the former more polished and user-friendly and the latter more open.
Personal experiences using Notion, including new features like calendar integration.
Predictions on the future role of AI:
Integration of AI with household robots, potentially leading to widespread domestic use in higher-income homes.
The nature of AI and robot interaction with humans, and potential security and ethical considerations.
The hosts share light-hearted personal stories and debates, leading to a well-rounded discussion focused on the future influence and integration of AI in technology, finance, and day-to-day life.
[00:00:00.000 --> 00:00:05.440] And welcome back to the Big Cheese podcast. I'm your host Brandon Corbin, filling in for DeAndre [00:00:05.440 --> 00:00:12.960] because he's out in California doing some stuff for his new startup. I am the 15th best moderator. [00:00:12.960 --> 00:00:18.480] Thank you very much. Actually, I'm probably the worst moderator of all moderators. [00:00:18.480 --> 00:00:24.720] And so today we're going to be talking about our predictions, except for Jacob because he doesn't [00:00:24.720 --> 00:00:31.760] have any. Our predictions of what's going to happen with AI within the next 10 years. So next week [00:00:31.760 --> 00:00:36.320] and the week following, we're actually going to have two very pragmatic shows. One's going to be [00:00:36.320 --> 00:00:40.240] about marketing and the other is going to be personal finance. So we figured let's [00:00:40.240 --> 00:00:44.960] step back and have a loosey-goosey one that's not as pragmatic and might not have [00:00:44.960 --> 00:00:51.680] any bearing on the future whatsoever. So with that, who would like to start with a prediction of where [00:00:51.680 --> 00:00:58.000] AI is going to go in the next 10 years? Apple's stock is going to be $1,000 and have split twice. [00:00:58.000 --> 00:01:05.280] So you came into the show today talking very excitedly about Apple. [00:01:05.280 --> 00:01:14.080] Well, their stock popped, and honestly, two things. Apple, last year when they announced [00:01:14.080 --> 00:01:20.000] the Apple Vision Pro, I thought that was going to be something big, and their company [00:01:20.000 --> 00:01:27.760] did well financially, obviously they always do, and their stock steadily rose after that [00:01:27.760 --> 00:01:32.800] announcement up through the release. And it's kind of subsided a little bit.
And we've been doing [00:01:32.800 --> 00:01:41.120] so much research on AI, and my whole narrative that I've been pushing on Reddit, on this podcast, [00:01:41.120 --> 00:01:47.280] to Jacob, to everybody, is that Apple's going to do well in the AI space. I actually posted on Reddit [00:01:47.920 --> 00:01:56.240] about ReALM and someone responded, hey, you know, great, but are they going to make money? I said, [00:01:56.240 --> 00:02:02.480] well, what's the market of AI spend in revenue in the next 10 years? I'd put a huge [00:02:02.480 --> 00:02:07.920] chunk of it to Apple. Now, they have not released a lot of information, but I know that they announced, or [00:02:07.920 --> 00:02:13.280] they are putting into production, their M4 chip. And that's the AI-powered chip. So that's [00:02:13.280 --> 00:02:17.760] what's hopefully going to ship with these models and be able to do AI on the device. [00:02:17.760 --> 00:02:22.480] Right. So that would make sense, because we were talking before where they were saying that they [00:02:22.480 --> 00:02:26.720] were seeing speeds on device about the same, you know, the same speeds that [00:02:26.720 --> 00:02:31.600] you would get with ChatGPT. So it would make sense that it would have to be a new chip, and it's going [00:02:31.600 --> 00:02:34.720] to be a new chip that we're all going to have to plop down money for. Yeah, I mean, at that point, [00:02:34.720 --> 00:02:39.760] right, that's where you get... I don't know what the device refresh rate is [00:02:39.760 --> 00:02:44.240] for Apple, but you look at the last couple of years, it's been like, okay, the lens, it's [00:02:44.240 --> 00:02:48.800] got three lenses now, you know what I mean?
It's like, I don't know if we talked [00:02:48.800 --> 00:02:54.320] about it last week, but I just saw an article that was like, all right, people just aren't [00:02:54.320 --> 00:02:59.040] going out and buying new iPhones anymore like they were, right? It's more commoditized. Like, [00:02:59.040 --> 00:03:04.960] okay, what's the big difference? The 16 is going to have a great camera. Cool. But there's nothing [00:03:04.960 --> 00:03:10.240] else about it. Now you're talking AI, and they've been forced to get in the lab, [00:03:10.240 --> 00:03:16.560] I think, probably since last year around November or whenever it was we talked about it, when ChatGPT [00:03:16.560 --> 00:03:20.160] came out. They're spending a ton on research and development, by the way. I mean, they just [00:03:20.160 --> 00:03:26.480] killed their car project. They moved the car people over to AI and robotics. [00:03:26.480 --> 00:03:32.640] Yeah, yeah. But I haven't heard anything about the car since. So they totally [00:03:32.640 --> 00:03:40.160] kiboshed their car, which is funny. No more iCar. So if you're a bear on Apple, you look at some [00:03:40.160 --> 00:03:44.880] of their failed endeavors, and, you know, Steve Jobs ain't coming through that door, or Jony Ive [00:03:44.880 --> 00:03:51.760] either. But yeah, I think for the next 10 years, we need to talk, you know, [00:03:51.760 --> 00:03:57.120] I guess, a little bit more grandiose, but I came in with my chest puffed out because I'm bullish on [00:03:57.120 --> 00:04:04.400] Apple. The M4 is the next 10 years. Yeah, and I have an M1, and I'd love to see [00:04:04.400 --> 00:04:08.320] what the newer ones can do, you know what I mean? The fact that they're coming out with [00:04:08.320 --> 00:04:14.720] a new one just about every year, it's very impressive.
So for people who don't know, Apple took control of [00:04:14.720 --> 00:04:21.920] their chip design and manufacturing about, what, almost four years ago now? I think that's [00:04:21.920 --> 00:04:27.120] about right. And they started releasing their own chips, and those chips use the ARM [00:04:27.120 --> 00:04:32.160] architecture, as opposed to what you typically see in Microsoft or Windows machines, right? [00:04:32.160 --> 00:04:38.480] Windows is x86, and ARM was typically, I believe, on more mobile devices. Yeah, more. Yeah, [00:04:38.480 --> 00:04:44.800] more like... Well, I think like the Surface, I don't know, whatever. Oh, yeah. So you would have [00:04:44.800 --> 00:04:50.640] tablets and things that had low power requirements leverage ARM because it was [00:04:50.640 --> 00:04:56.400] so efficient, which is why the M1s, or the M series, are so great, because you can have like [00:04:56.400 --> 00:05:00.800] 24 hours of battery life and be running it all the time. Is anybody running an M3 here? I'm [00:05:00.800 --> 00:05:05.680] running M1. Okay, so you've got an M1. I've got an M2. You've got an M2. Yeah. Yeah. So I [00:05:05.680 --> 00:05:11.040] would be curious. I know that when they did the M3, the memory bandwidth [00:05:11.040 --> 00:05:16.480] was actually reduced, and so some people were complaining about that versus the [00:05:16.480 --> 00:05:22.400] M2 Max or whatever. So I'll be curious to see. I mean, I haven't heard a fan going off in [00:05:22.400 --> 00:05:28.160] a minute. No, no, I get everything. Every once in a while, I'll get weird video playback [00:05:28.160 --> 00:05:33.280] issues. It'll be really choppy in the beginning, and then it's fine. But you know, that's [00:05:33.280 --> 00:05:37.440] it. It's almost like, what do we call it? The hardware cliff thing?
What's that concept [00:05:37.440 --> 00:05:41.120] you were mentioning where the hardware is ahead of the software? Yeah, yeah, the overhang. [00:05:41.120 --> 00:05:46.000] The overhang. And I think what Apple's doing to the ecosystem is [00:05:46.000 --> 00:05:50.400] they're going to create this big overhang. It is funny to me though, how we keep having this [00:05:50.400 --> 00:05:55.200] ebb and flow, and you see it in technology all the time. Netflix did it where [00:05:55.200 --> 00:06:00.320] they were like, okay, we're going to take the DVD that you get delivered, and we're [00:06:00.320 --> 00:06:05.440] going to move that to the cloud. Right. And then they actually had a plan where they were going [00:06:05.440 --> 00:06:11.280] to store a bunch of their assets closer to the actual destination. They were trying to work [00:06:11.280 --> 00:06:15.600] with Comcast or different ISPs to be like, we're going to have a little Netflix server here [00:06:15.600 --> 00:06:20.240] that hosts all the popular movies in this region. So you constantly see this back [00:06:20.240 --> 00:06:29.600] and forth. Google tried to take gaming GPUs to the cloud. Stadia. Stadia, yeah. And then [00:06:29.600 --> 00:06:34.960] ChatGPT comes out, and because the right chips didn't exist on consumer computers, [00:06:34.960 --> 00:06:39.120] they weren't the right types of chips or powerful enough, they were like, okay, we're going to put [00:06:39.120 --> 00:06:43.200] AI in the cloud. And now Apple's thinking, you know what, it would actually make more sense, [00:06:43.840 --> 00:06:48.160] or at least that does appear to be their strategy, to put it on device. Because one, we sell the [00:06:48.160 --> 00:06:52.720] devices, so we want to sell more devices. And two, it's going to be faster. There's just hardly any [00:06:52.720 --> 00:06:57.120] way to compete. Right.
If you have the chips, I mean, it's just expensive. So, 10 years. [00:06:57.120 --> 00:07:00.400] I was thinking about this on the way in, and this is the most random question I had for you. [00:07:00.400 --> 00:07:07.120] During this whole AI thing that's been going on in the past six months, [00:07:07.120 --> 00:07:21.760] this revolution, Bitcoin's gone from $28,000 to $74,000. Is there something going on [00:07:21.760 --> 00:07:27.600] that's significant in terms of... is it significant in the way the world is going [00:07:27.600 --> 00:07:34.400] to work in the next 10 years? And do AI and the blockchain have any sort of relationship [00:07:34.400 --> 00:07:41.680] in the next 10 years? It's an interesting question. So when all of the crypto came out, [00:07:41.680 --> 00:07:45.600] I was very bullish on crypto. I thought it was going to change the world. You know, so I put, [00:07:45.600 --> 00:07:52.560] I think, a total of about three grand into crypto, probably 10, 11 years ago, right? [00:07:52.560 --> 00:07:58.880] So that's the max. And now I'm sitting on a pretty nice bag of crypto. Of course you [00:07:58.880 --> 00:08:04.240] are. Well, full disclosure though, funny little backtrack, funny story: I'm sitting [00:08:04.240 --> 00:08:08.240] there and I'm like, I know I have a hard drive that's got three Bitcoin on it, [00:08:08.240 --> 00:08:12.720] right? And this is when Bitcoin was at, I don't know, 50 grand or whatever per coin. And I'm like, [00:08:12.720 --> 00:08:16.160] all right, I've got to find this. And I went on a mission, and there's a whole Facebook post of [00:08:16.160 --> 00:08:20.560] this mission as I'm going through. And there's a photo of all these hard drives [00:08:20.560 --> 00:08:24.000] all over my desk.
And I literally bought a thing where you can just pop these hard drives in and it [00:08:24.000 --> 00:08:29.440] checks them. And I went through and checked 20 hard drives. I finally found the one that had the wallet [00:08:29.440 --> 00:08:34.560] on it. And I opened up that wallet and I see the transactions for sending two or three different [00:08:34.560 --> 00:08:39.040] Bitcoin. I'm like, what the hell is this? And I open it and I'm like, oh, yeah, that's when I [00:08:39.040 --> 00:08:47.360] bought magic mushrooms off the dark web for three friggin' Bitcoin. Oh, no. And it was like, [00:08:47.360 --> 00:08:52.880] you know, 300 bucks. It was like $100 per coin. Yeah. That's when I spent it. And yeah, [00:08:52.880 --> 00:08:58.560] so I had managed to spend three Bitcoin. I'm so glad I asked you this. Magic mushrooms. [00:08:58.560 --> 00:09:03.360] Oh my gosh. You know, my buddy had a similar story, not quite as dramatic, but he was trying [00:09:03.360 --> 00:09:09.200] to buy some coin that wasn't on any real exchange. I always kind of stayed away from it because it [00:09:09.200 --> 00:09:13.440] seemed shady to me. I was like, I don't understand, you have to go to this website... And then anyway, [00:09:13.440 --> 00:09:18.560] he tried to buy this coin three or four times, and it cost like 100 bucks or [00:09:18.560 --> 00:09:23.440] something every time you try to buy it. And it kept failing. He said, forget it, I'm just going to [00:09:23.440 --> 00:09:28.080] move on. Well, if he had gotten that transaction to go through, he would have made like two million [00:09:28.080 --> 00:09:33.520] dollars or something. Like, that doesn't seem right. No, it's so bad. Also, the other one, too: [00:09:33.520 --> 00:09:41.040] so Nomie, which was my app that let you track everything. I ended up moving it from Firebase [00:09:41.040 --> 00:09:47.360] to Blockstack, which is what it was originally called.
And it's basically a decentralized [00:09:47.360 --> 00:09:53.680] data store where you could have this completely private store of data. [00:09:53.680 --> 00:09:59.440] And so I integrated it with Nomie. They ended up giving me like 20,000 tokens of the [00:09:59.440 --> 00:10:04.800] STX token because of this. And so I've just been sitting on it, you know, never even thinking [00:10:04.800 --> 00:10:09.120] about it. And I just went and looked the other day and they're at three dollars and 50 cents [00:10:09.120 --> 00:10:14.960] per coin right now. And I'm just like, there we go. That was one of those actually useful coins. [00:10:14.960 --> 00:10:20.240] Well, what they're doing, which is kind of interesting, is they're a layer-two provider, [00:10:20.240 --> 00:10:26.960] but it's all built on Bitcoin. So you can then stake your STX tokens and they actually [00:10:26.960 --> 00:10:32.880] pay you in Bitcoin. So their entire thing is really tied to the actual Bitcoin system. But it [00:10:32.880 --> 00:10:38.480] brings smart contracts, it brings all the things that Ethereum has, but it brings them to the Bitcoin [00:10:38.480 --> 00:10:43.920] blockchain. Anyway, this isn't a Bitcoin podcast. So wait, what was your prediction? Is it just [00:10:43.920 --> 00:10:49.280] Apple's going to be it? Well, actually, I had a question. My prediction was just [00:10:49.280 --> 00:10:55.760] please buy Apple stock. This is a public service announcement. No, I was reading the [00:10:55.760 --> 00:11:03.520] show notes, and this discussion around where Apple's going is where I always look. [00:11:03.520 --> 00:11:09.840] And the conversation right now is not just, I'm going to chat with this [00:11:09.840 --> 00:11:14.720] thing proactively and have a chat stream, right?
It's reading, it's reacting, [00:11:14.720 --> 00:11:19.520] and it's doing things. And we talked about that with Rabbit, and I see that as the next [00:11:19.520 --> 00:11:23.440] evolution. And I don't think it's going to take 10 years, but I think the impact is going to be felt [00:11:23.440 --> 00:11:28.800] over the next 10 years. So what does the world look like when you don't really have to do much? [00:11:28.800 --> 00:11:34.080] Yeah, that was one of my predictions: that UI moves from this [00:11:34.720 --> 00:11:39.920] thing where you have to go and interact with it to now just becoming more of a thing that's [00:11:39.920 --> 00:11:44.640] confirming with you. You know, when you're interacting with these AIs and these robots and [00:11:44.640 --> 00:11:49.360] the agents and all this stuff, it's going to just be more of, hey, I need this, and it'll be like, all [00:11:49.360 --> 00:11:55.600] right, you want this and you want that? And so I think that the whole UI space completely [00:11:55.600 --> 00:12:01.600] changes, and Rabbit is a good example of that, right? Rabbit is taking the UI and is [00:12:01.600 --> 00:12:06.800] going to automate it themselves and then basically confirm: hey, do you want this one from Uber [00:12:06.800 --> 00:12:14.400] Eats, or whatever it is? In a world where UI doesn't matter, does someone need to invent a mid-[00:12:14.400 --> 00:12:20.880] tier UI stack that only interacts with AI agents? Well, Zapier, whatever the [00:12:20.880 --> 00:12:30.560] hell it is, came out with their, oh my gosh, Central, their AI bot connector tool. So I tried it out [00:12:30.560 --> 00:12:34.160] last night because my buddy was asking me about it. And I was like, I don't know, I haven't used [00:12:34.160 --> 00:12:38.560] it yet. But it was one of their announcements a while back that's kind of starting to roll out.
[00:12:38.560 --> 00:12:45.200] But essentially, you know, some of the tooling we've built, you could do using Make. We've [00:12:45.200 --> 00:12:52.240] talked about make.com, we've talked about Zaps and all that. This is like the combination of those, [00:12:52.240 --> 00:12:57.600] where you can go on there, and I was like, okay, what would be useful for me? Because if I actually [00:12:57.600 --> 00:13:03.200] got use out of it, I might actually build it. And it was, oh, when a Jira ticket's created, [00:13:03.200 --> 00:13:08.320] or when someone comments on it, then maybe you could take that comment with the ticket [00:13:08.320 --> 00:13:13.840] and then write a response to it, or generate some actions to drive from that. And you can. [00:13:13.840 --> 00:13:19.680] That's the whole thing: it ties in two different data sources. So you could do an email. [00:13:19.680 --> 00:13:25.760] I tinkered with building a custom email-triggering, Gmail-triggering thing where [00:13:25.760 --> 00:13:30.640] it's like, email comes in, send it to OpenAI, do some things. Well, you can do a lot of that [00:13:30.640 --> 00:13:37.600] with this new platform. So I do think, to your point, that's probably where you'll see the proliferation [00:13:37.600 --> 00:13:41.760] of those types of platforms. I think my question with those platforms, and obviously that's a [00:13:41.760 --> 00:13:47.520] great... like, if I was going to pick a SaaS tool, that'd be in the top five, right? [00:13:47.520 --> 00:13:54.880] You know, in terms of a pure SaaS play for building a platform AI can interact with. [00:13:54.880 --> 00:14:03.920] However, the longest-lasting and most successful platform is just the web in general, and URLs, [00:14:03.920 --> 00:14:13.360] right?
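The Gmail-to-OpenAI tinkering described above is essentially a trigger-action pipeline with a model call in the middle. Here's a minimal sketch in Python of that pattern; `call_llm`, `Event`, and the Jira wiring are illustrative stand-ins, not a real Zapier or OpenAI API:

```python
# Sketch of a trigger -> LLM -> action pipeline, the pattern behind
# tools like Zapier Central or a homemade Gmail-to-OpenAI flow.
from dataclasses import dataclass
from typing import Callable

@dataclass
class Event:
    source: str   # e.g. "jira", "gmail"
    payload: str  # ticket comment, email body, etc.

def call_llm(prompt: str) -> str:
    """Placeholder for a real model call (OpenAI, Gemini, ...)."""
    return f"DRAFT REPLY to: {prompt[:40]}"

def make_pipeline(prompt_template: str,
                  action: Callable[[str], None]) -> Callable[[Event], None]:
    """Wire a trigger to a model call and then to an action."""
    def handle(event: Event) -> None:
        response = call_llm(prompt_template.format(payload=event.payload))
        action(response)
    return handle

# Example wiring: when a Jira comment arrives, draft a response.
drafts: list[str] = []
on_jira_comment = make_pipeline(
    "Write a short reply to this ticket comment: {payload}",
    drafts.append,
)
on_jira_comment(Event(source="jira", payload="Can we ship this by Friday?"))
```

The real platforms add authentication, retries, and hundreds of connectors, but the shape is the same: a trigger, a templated prompt, and an action on the model's output.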
And so if you look at how an AI, a general, intelligent AI being that is your [00:14:13.360 --> 00:14:20.240] assistant, that's proactive, that can do things on your behalf... is there a place for just [00:14:20.240 --> 00:14:25.120] the web there? You know what I mean? The web is a tool and the standards just change. Like [00:14:25.120 --> 00:14:30.000] schema markup, for example, when you're looking up hotel rooms or a menu. You know what I mean? [00:14:30.000 --> 00:14:37.120] Yeah, I think, again, it's going to be how the AI pulls you into the discussion, right? It could be [00:14:37.120 --> 00:14:43.440] a text message that says, hey, your anniversary is coming up. I've scheduled... I've got three different [00:14:43.440 --> 00:14:47.840] ideas for you and your wife. Here they are. You can go to it and you can actually see it. Now, [00:14:47.840 --> 00:14:53.280] that might be an app on your phone. It might be just a web app or whatever. But that's where [00:14:53.280 --> 00:14:58.400] I see this AI going. It's going to pull you in and basically ask you which ones. [00:14:58.400 --> 00:15:05.040] And of course, Google and Apple, and maybe Facebook, are the ones most primed for [00:15:05.040 --> 00:15:09.680] that conversation. Google already... I don't know if you guys have the Google app on your iPhones, [00:15:09.680 --> 00:15:17.200] but it's personalized, maybe a little too personalized. I watched the show Invincible. [00:15:17.200 --> 00:15:23.520] Yeah. And almost every other day, there's an article on my Google feed that's like, Invincible [00:15:23.520 --> 00:15:30.080] season three, or something that's Invincible-related. And I think we've talked about it before, [00:15:30.080 --> 00:15:37.200] but that's personalized using machine learning and AI. And to your point, why not go a step further? [00:15:37.200 --> 00:15:42.480] Google has my calendar. They can see my significant other's birthday or friends' birthdays. I would
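The schema-markup idea above is concrete: a page that publishes schema.org JSON-LD hands an agent its hotel rooms or menu as structured data instead of making it scrape the layout. A minimal, stdlib-only sketch; the sample page and its fields are made up:

```python
# Sketch: how an AI agent could pull structured data (schema.org JSON-LD)
# straight out of a web page, using only the standard library.
import json
from html.parser import HTMLParser

class JsonLdExtractor(HTMLParser):
    """Collects the contents of <script type="application/ld+json"> blocks."""
    def __init__(self):
        super().__init__()
        self._in_ld = False
        self.items = []

    def handle_starttag(self, tag, attrs):
        if tag == "script" and ("type", "application/ld+json") in attrs:
            self._in_ld = True

    def handle_endtag(self, tag):
        if tag == "script":
            self._in_ld = False

    def handle_data(self, data):
        if self._in_ld and data.strip():
            self.items.append(json.loads(data))

# Hypothetical hotel page carrying schema.org markup.
html_page = """
<html><head>
<script type="application/ld+json">
{"@type": "Hotel", "name": "Example Inn", "priceRange": "$120-$180"}
</script>
</head><body>...</body></html>
"""

parser = JsonLdExtractor()
parser.feed(html_page)
hotel = parser.items[0]
```

An agent that finds this block gets the hotel's name and price range with zero guessing, which is the "lasting power" argument for standards-based markup.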
I would [00:15:42.480 --> 00:15:47.840] love that. I want to do more personal things. I want to be more thoughtful, but it's like, [00:15:47.840 --> 00:15:54.880] I kind of forget sometimes it'll be like the day of. It's my three-year-old book prediction maybe, [00:15:54.880 --> 00:16:03.280] then, is if there's AGI and these things are trying to get things done for people, [00:16:03.920 --> 00:16:09.120] wouldn't they basically gravitate towards what people are also gravitated towards? [00:16:09.120 --> 00:16:15.200] So to me, that would just promote general accessibility of information. [00:16:15.200 --> 00:16:20.320] You know what I mean? If you're building a website for your company and it's fast and [00:16:20.320 --> 00:16:25.280] it's accessible right now, I feel like that's going to be a tool that will be successful for [00:16:25.280 --> 00:16:30.480] to intend you're still there. So like a layer's lasting power in that. And I think some of these [00:16:30.480 --> 00:16:35.760] proprietary platforms too, like you said, the leaders, the Facebooks, the Googles, the apples. [00:16:35.760 --> 00:16:40.880] But I still think that the, I still think bold prediction is there might be things that are just [00:16:40.880 --> 00:16:47.200] the same. Yeah. Yeah. Now you could see where, so you're going to have an AI that should be able [00:16:47.200 --> 00:16:54.800] to consume websites. That's kind of what I'm getting. Yeah. Because again, we as humans know [00:16:54.800 --> 00:17:00.240] what a button is. We know what a form field is. We know how to kind of go and navigate Amazon. [00:17:00.240 --> 00:17:04.720] There's no reason why you couldn't build a large language model that's basically like [00:17:04.720 --> 00:17:10.960] part, like tightly integrated with the engine of a web browser that it would know like everything [00:17:10.960 --> 00:17:14.720] that's going on with the browser. It'll probably, but you made this point already. 
And I think [00:17:14.720 --> 00:17:19.280] another thing that's important is it probably makes making accessible websites just that much [00:17:19.280 --> 00:17:23.360] more valuable. Right. Because it's like, if a website doesn't use a button element for a button, [00:17:23.360 --> 00:17:27.600] it's going to use a span. Exactly. Or a div. Yeah. And it might be able to infer that [00:17:27.600 --> 00:17:32.000] eventually. But the ones that use the correct markup, you know, are going to get more [00:17:32.000 --> 00:17:39.360] value. So accessibility is really important for AI, because AI can interpret and interact with [00:17:39.360 --> 00:17:45.440] accessible web elements really easily. Yeah. More easily than with proprietary functionality, [00:17:45.440 --> 00:17:49.520] or even a proprietary system where it has to learn the APIs. It's the same thing you [00:17:49.520 --> 00:17:55.760] were talking about the other day, which is these contracts or these... what did you say? The [00:17:55.760 --> 00:18:01.440] standards. Oh, the data exchange. Yeah. The APQC standards, where you basically [00:18:01.440 --> 00:18:06.480] ground all of this on existing standards. Well, you know what else? Go back to the Bitcoin [00:18:06.480 --> 00:18:11.680] comment. What's the path of least resistance for an AI entity: going to a [00:18:11.680 --> 00:18:18.560] Regions Bank and opening an account with a driver's license, or moving some tokens? [00:18:18.560 --> 00:18:26.800] So that's another really interesting conversation: does the AI end up leveraging Bitcoin, [00:18:26.800 --> 00:18:32.320] leveraging crypto, as a means of exchanging value? Well, it's going to take the path of least [00:18:32.320 --> 00:18:38.480] resistance. And that would be much less resistant. Yeah. Which, actually... I didn't come [00:18:38.480 --> 00:18:43.760] prepared with any predictions.
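The button-versus-span point above can be made concrete as a tiny audit: a `<button>` declares its intent to any agent or assistive tool, while a clickable `<span>` or `<div>` has to be guessed at from heuristics. A rough sketch, not a real accessibility linter; the sample HTML is invented:

```python
# Sketch: counting machine-readable buttons vs. "fake" clickable elements.
from html.parser import HTMLParser

class ClickableAudit(HTMLParser):
    """Counts real <button> elements vs. divs/spans with click handlers."""
    def __init__(self):
        super().__init__()
        self.semantic = 0      # <button>: intent is explicit in the markup
        self.fake_buttons = 0  # <div>/<span onclick=...>: intent must be inferred

    def handle_starttag(self, tag, attrs):
        attr_names = {name for name, _ in attrs}
        if tag == "button":
            self.semantic += 1
        elif tag in ("div", "span") and "onclick" in attr_names:
            self.fake_buttons += 1

audit = ClickableAudit()
audit.feed('<button>Buy</button><span onclick="buy()">Buy</span>')
```

An AI driving a browser can act on the first element with confidence; the second it can only infer from the handler, which is the extra cost the conversation is pointing at.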
But all of this, this conversation has got me thinking about two [00:18:43.760 --> 00:18:48.240] things. So one, I'm both an optimist and a pessimist about the future, where it's like, [00:18:48.240 --> 00:18:53.760] okay, AI is most likely, or already, providing a bunch of efficiencies, and with efficiencies comes [00:18:53.760 --> 00:18:59.040] profitability, and with profitability generally comes more money back to the shareholders and less [00:18:59.040 --> 00:19:05.040] back to the common people. But I do think something has to break at some point. Like, [00:19:05.040 --> 00:19:10.400] you can only make so much money. Maybe I'm just wrong because capitalism is fun. [00:19:10.400 --> 00:19:14.880] Well, we've had that conversation. Yeah. But like, my point is, [00:19:16.080 --> 00:19:20.400] the first thing you had on the list today for the show notes was sentiment: general sentiment, [00:19:20.400 --> 00:19:25.200] especially in the US, is way down. Yeah. So people are skeptical of AI because they've [00:19:25.200 --> 00:19:30.720] seen this show before. This is a rerun where productivity goes way up and everyone's [00:19:30.720 --> 00:19:35.360] still laying off. Laying off, right. Because why? Because they're beholden to their shareholders, [00:19:35.360 --> 00:19:42.800] right? So how long before we actually, finally, collectively catch on and we're just like, no more? [00:19:42.800 --> 00:19:47.840] And we either... what is it, regulation? I don't know what that path looks like, but [00:19:47.840 --> 00:19:55.120] there's going to be a straw that breaks the camel's back. So my other [00:19:55.120 --> 00:20:02.000] prediction here is about the job apocalypse in the next 10 years.
Yeah, [00:20:02.000 --> 00:20:06.640] we're going to see a job apocalypse and it's going to happen where everybody's kind of going, [00:20:06.640 --> 00:20:10.400] oh, hey, I don't need all these people because I can have AI do this and do this and do this. [00:20:10.960 --> 00:20:13.760] But what I think is going to happen, the same thing that happened when we had the car, [00:20:13.760 --> 00:20:19.280] the industrial revolution, we had the cars and we had all of a sudden, entire new industries [00:20:19.280 --> 00:20:23.440] bubble up and we're going to have a lot more efficiencies. We're going to have a lot and [00:20:23.440 --> 00:20:27.040] people are going to be coming up with entirely new things that we never thought of that will [00:20:27.040 --> 00:20:31.920] require people. Now, the question ultimately does is it ever going to require as many people as it [00:20:31.920 --> 00:20:38.000] used to because we now have AI to do a lot of that work. But so I think we're going to see a [00:20:38.000 --> 00:20:45.440] short-term, very painful process transitioning to this new world of AI powering everything. [00:20:45.440 --> 00:20:50.480] But at that same time, I think we'll eventually have this cross where it's eventually now going [00:20:50.480 --> 00:20:54.640] to go. More people are now getting hired by the new industries that have bubbled up because of [00:20:54.640 --> 00:21:02.960] what actually came out because of the AI. I think that you make a good point and it reminds me of [00:21:02.960 --> 00:21:09.760] the 2008 financial crisis. So there was a huge transformation in terms of the types of jobs [00:21:09.760 --> 00:21:16.560] people were working at when you pre-2008 and post-2008. You saw a huge rise in the brew pub and the [00:21:16.560 --> 00:21:24.320] the micro restaurants. You know what I mean? That's the hipster thing became more like people [00:21:24.320 --> 00:21:29.440] started. 
People were doing more with less and being more in that kind of [00:21:29.440 --> 00:21:35.520] creative energy. Because people were like, "You know what? I don't need to be in this rat race. [00:21:35.520 --> 00:21:43.600] I need to go figure myself out." And I think that's potentially an outcome of all this. What are [00:21:43.600 --> 00:21:50.240] you going to do with all this time? And maybe humans are creative beings and they figure out [00:21:50.240 --> 00:21:55.680] other things to do. Part of what is probably going to happen, you already see it on social [00:21:55.680 --> 00:22:04.800] media. Like, I've learned more about gardening and exercise and music and fixing things in the last [00:22:04.800 --> 00:22:08.880] three years, because of social media and all the things that people post, than I did in the 20 years [00:22:08.880 --> 00:22:14.320] before that. And it's just like, people are trying to figure stuff out. I mean, [00:22:14.320 --> 00:22:16.320] Jacob's out here making sourdough bread. [00:22:16.320 --> 00:22:24.400] Are you attempting it? This Friday will be my first. I've got the starter. I've read a hundred articles. [00:22:24.400 --> 00:22:30.640] I've used ChatGPT and Gemini to give me the schedules. I've watched the videos. Even JVM has [00:22:30.640 --> 00:22:36.720] given me some advice. He does it every Sunday, he said. But think about that, right? It goes [00:22:36.720 --> 00:22:42.480] back to something my grandma told me. She was an artist. She was a cook. She was a grandmother, [00:22:42.480 --> 00:22:47.920] a mother, tons of kids. And she always told me people struggle with figuring out how to live. [00:22:47.920 --> 00:22:52.640] And one of the ways she did that was just by being very self-sufficient and creating and [00:22:52.640 --> 00:22:56.960] just being really efficient, especially with, let's be honest, money stuff, [00:22:56.960 --> 00:23:01.760] doing more with less.
[00:23:01.760] And I think if you think, "Oh, there's just going to be people out on the street with nothing to do," I think that's not correct. Yeah, I would agree. I would say short term there'll be a lot of displacement, like you said, and we're going to have to figure that out, because it's going to be an unfortunate side effect. And we've talked about it many times on the show: there's just no need for certain jobs anymore, and those people will be left behind for now. And hopefully, as a country, as a world, we can figure out a way to come together, because people are creative. They're not meant to be the transcribers; those are jobs better suited for computers. But there's still the coal miners situation: okay, now what? These guys were primed for coal mining. That's what they learned. That's what they did for a living. People need something to do to feel valuable and to make money, right? But I will say, since I've been using AI, I've noticed a crazy improvement in my productivity, my organization. My ability to get unstuck is so much better. Like, when I start writing docs or start organizing my thoughts, the first thing I do when I get stuck is go to ChatGPT or whatever. So you go to Gemini? I go to Gemini too. They're both so good, though. I actually started using both again. Yeah. And ChatGPT is really good. I forget what I used it for. Oh, my taxes. Oh, really? I hate doing taxes. I don't even do them.
I give this stuff to someone to do. I hate even doing that. I hate even doing that. I love scanning everything, getting my PDFs, organizing them into folders, and going, "See, you have everything." Here's what I hate about it. I do too, actually, I love that part of it. What I hate about it is I've always relied on someone to do it. So every year it's the same thing. I'm like, oh, shit, what do I need to give them again? So I go back and try to figure out what I gave them last year and follow the email trail. I just went to ChatGPT and said, "Hey, I'm doing my taxes, or I'm having someone do my taxes for me. What documents do I need?" That's great. And the guy doesn't send you a packet? They do, but it's this list of everything that could ever happen, and it's horribly formatted, and I get anxiety looking at it. So I took a copy of that document and literally crossed out everything that's not applicable, and that's like 90% of it. That's what I probably should have done, but I look at it and I'm like, no, this is applicable to me, so I just break it out and use it. But no, ChatGPT is great for something like that. I now have a Notion document; as I was getting all this stuff together, I decided never again will I have this anxiety. So I made a document last night of all the things that I delivered, all the LLCs and businesses, whether they take care of the K-1s or this guy does, or whatever.
[00:25:55.440] And I'm like, okay, cool, this is more manual. I think Notion deserves to be in our predictions. I think Notion's a winner, short and, I mean, long term. So, just for transparency, at our company we use Google Workspace. We use Jira to manage our projects. We use GitHub and Bitbucket, because we're in between, you know, relationships there. We use Slack to communicate; no one at our company sends each other emails, so that's not a thing. And we use a platform called BugHerd — it probably doesn't even deserve to be mentioned — to collect bugs from customers on websites. But we want to use Notion, and Notion is a great place to, like, puke out all your AI content, and just stuff in general, and then figure out ways to organize it. You know, create landing pages for yourself, for your clients, and different things. You shared some Notion stuff with us, and then I thought, are you a Notion guy? Yeah, tell us about your Notion footprint. Yeah, so, Notion. I've gone through the journey of notes apps, like, through every single one. Yes, everyone needs to hear this. And I built one called Belly Lint, which was the name of it. And Belly Lint was basically my take on Apple Notes, but it was completely decentralized, using Blockstack's decentralized storage engine. So it was completely private; you could go and do whatever. And it was a Markdown-driven thing.
[00:27:31.200] And it was great. I used it for about a year and a half. The only downside was the sharing aspect of it, and the sharing aspect is what is Notion's real secret sauce. Yeah, they've got a ton of secret sauce. But that's what I was asking about. Because everything that's going on in Notion is Markdown-powered, and I think that's a critical thing here. Compare Google Docs, right? Google Docs is using the old what-you-see-is-what-you-get model, whereas Notion's tables and all of it, all of it, is Markdown. So there's a Markdown format — do they feel like a Jira board kind of concept? Well, their thing is most likely a custom Markdown renderer, right? So you can have some custom, you know, Markdown components or whatever. Makes it lightweight. Yeah, exactly. But here's what's great about it. You go and copy your output from ChatGPT — that's always in Markdown, for the most part — which is terrible to paste into Google Docs. Exactly, and you can't do it in Google Docs. Oh my god, you just changed my life. So what you do is you copy it from there, paste it into Notion, copy from Notion, paste it into Google Drive, and it works great. And that's exactly how I do all of our document notes. I copy it from ChatGPT, paste it into Notion, copy from Notion, paste it into Google, and it works great. So that'll hopefully be a short but sweet ChatGPT hack. Take your ChatGPT results.
Don't copy [00:28:54.800 --> 00:28:59.840] and paste it directly into Google Docs. Go copy and paste it into notion, which is a markdown [00:28:59.840 --> 00:29:05.520] based system. Then if you really need to get it into Google Docs, copy it from notion back [00:29:05.520 --> 00:29:10.400] over to Google Docs and you just saved my life because I'm so sick of copying from chat GBT into [00:29:10.400 --> 00:29:15.280] Google Docs and seeing it form editorically. Yeah, it's horrible. Yeah, I'm sorry, horrible. [00:29:15.280 --> 00:29:21.760] Yeah. So no notion, but so notion, so notion has this sharing ability. It's got the team collaboration [00:29:21.760 --> 00:29:26.400] and the website for people don't let a lot of people have done us notion, not that count. [00:29:26.400 --> 00:29:33.440] Yeah. S O. Yeah. I do that all the time. I know. Yeah. But in terms of having like a a super [00:29:33.440 --> 00:29:38.880] flexible, very powerful, they've got to so I have upgraded. So they do have their own AI that you [00:29:38.880 --> 00:29:43.120] can kind of interact with and chat with. We have that now. It seems to kind of suck though. [00:29:43.120 --> 00:29:47.120] It's a little slow. It's a little it's slow but like don't get better. It will definitely get [00:29:47.120 --> 00:29:52.000] better but like when you're trying to ask you questions across your entire corpus of data, [00:29:52.000 --> 00:29:57.280] that thing went fucking haywire. It was just like and I was asking like very specific questions [00:29:57.280 --> 00:30:02.320] and it was like giving me the I mean the hallucinations that this thing did were just off the chart. [00:30:02.320 --> 00:30:07.840] Have you you well I know you have because you introduced it to me but are you still using obsidian? [00:30:07.840 --> 00:30:13.520] So I'm not using obsidian anymore. Obsidian is like yeah well he you tell it because you introduced [00:30:13.520 --> 00:30:18.640] it to me. 
[00:30:18.640] Yeah, so when I started — so I had Belly Lint, and then I went back to Notion, and then I'm like, I still don't like that it's not private. So then I went to Obsidian, and Obsidian is an app that you run on your Mac, Windows, Linux I think, that stores it all locally, like unstructured documents. Exactly, it's just a folder structure of Markdown. But they do have syncing. They have a bunch of plugins. They have a bunch of really cool features. They have a couple of different ChatGPT-type interfaces where you can basically pull it in and interface with it, but I didn't think they had anything that spanned the entire corpus. Oh yeah, I wasn't sure. My take on Obsidian was, it's amazing, but it's like Android to me — it's too open for me. So that's a great Notion analogy: Notion is Apple, and Obsidian is Android. And that was my thing. If you want a phone that does whatever you want it to do, go with Android. If you want a phone that just works, but you have to follow its rules, go with Apple. Same exact thing: if you want notes that just work, go with Notion. If you want your notes where you have all the crazy shit — you want calendaring, you want syncing, you want all this madness — yeah, go with Obsidian. Notion released a calendar that integrates directly with Google. I use it, yeah. And I just started using it. Whoa. Oh, it's great. I have my favorite... well, what's your favorite feature so far?
[00:31:34.960] The fact that my next meeting's "go join the meeting" link is right there. Seriously, it's amazing. If you get the app on the Mac, you don't have to click the meeting invite. It's like, your next meeting, the team is meeting, click to join. Oh yeah, at the top right. Yeah, I agree, that is by far the best. You have to give it permissions to do literally every single thing ever, right? Yeah, it does. Literally, it's like... basically everything. Yeah, yeah. So that's my one beef with Notion: it does have access to, like, everything. That's a lot. So one thing that people don't realize is, when you log in with a Google, Apple, or X account on a separate platform, it's using an authorization protocol called OAuth, which is basically granting some permissions — it's letting the application directly make API calls into that external app. And to me, that's a very difficult proposition to regulate. You're leaving it up to the application to handle that, and obviously you're potentially opening yourself up at some level: if the application that you're giving the permissions to gets compromised, you know, you might be in a little bit of trouble. But I mean, I think for me, typically, if I'm doing it with Google, I usually feel pretty okay with that.
[00:33:08.640] Yeah, I agree, and I think Google somehow is just always going to win the auth war, by the way. Like, I don't know what they get out of that, but authing with Google — it doesn't scare me, but it's weird to auth with Facebook, or, like, Reddit. One of them the other day was GitHub. Well, one of the reasons — and that's the reason why Microsoft's probably second — is because you can control someone's access to a Google workspace. Yeah, so if they're logging in with their account and they get terminated, versus, like, their Facebook account. That's true, that's true, but as a consumer it just feels right. Do you guys remember the first time you used Gmail? Oh yeah, wisedude3@gmail.com, because I'm the third wise dude. I remember I was in the main library at IU, and I had logged into Gmail, and there were two computers next to each other. This guy — his name was Wes, he ended up being a really good designer, and I was, like, the coder — so he would design stuff while we were there and I would try to code it while we were working, and that's kind of how I learned front-end dev. But he would log into his Gmail and the chat feature was there — like, they were A/B testing it — but for some reason it took, like, two years before the chat list showed up on my Gmail, and I was always so mad. Everyone was sitting there chatting with each other, and I didn't get the chat.
[00:34:27.840] I never liked their chat. We used it all the time. I don't know why, I just never used it. What was I named? Oh yeah, I was skater boy. Oh wait, I was Jeba dialing. That was Slack before Slack existed. Yeah, yeah. Oh, but I want to go back to the Notion conversation real quick, because — I don't know if they'll be a winner, but I do think platforms that enable content creation and have a unified structure for their data are going to be the winners. Because sometimes what I'll do is go to ChatGPT and say, "Do this, but I want the output in, like, a Notion standard." Yeah, you know, they have a good marketplace for templates and a good baseline data set to pivot off. So lightweight, semantic representations of content win in the AI world, because — right, because you can parse through and render Markdown really quickly, compared to parsing HTML. So if you're doing things in that format, I feel like... And they're much more approachable for everyone. Like, they win with accessibility. Accessibility keeps coming up here, and of course I'm totally slanted, because you don't want three proprietary platforms to win and no one to have access to create anymore. Yeah. So I think — and this is one that I've been kind of noodling on, I've never actually said it out loud, so bear with me — I think we get to a point where you almost have these, like, portable vector
vaults, right? So I'm chatting with ChatGPT — it should basically be almost like OpenSearch, if you guys remember. OpenSearch was just an open standard for being able to do search. Yeah. Why couldn't we have that, but vectorized, right? So I can go to ChatGPT, connect my Notion, connect my Gmail, connect my Google Drive, and they all have an open search interface I can provide a prompt to. It's going to do its own vectorization of the prompt, then it's going to go and search its data through all the vectors that exist in there, and return the things that match. So, for those that aren't familiar, vectorizing your data is basically the first step to getting a large language model to be able to interact with it. But it's not standardized. Or is it standardized? I mean, like, if I have a bunch of blogs, I can chunk them up any way I want. Well, you could, yeah, you could chunk it up, but nevertheless the end result is still going to be an array of numbers — it's still going to be a vector. And it's not even that that's critical. Like, when we do the search from Notion, ChatGPT doesn't really even give a shit about their vector strategy. It doesn't matter? No, no, no — I would have thought the format and the vector would matter. But that's for Notion to do its search, right? So it's going to do the search, and it's going to have its vector, and maybe it's a 726-dimension array, whatever, but the query is going to come in, it's going to
[00:37:26.080] convert it to its own vector format, do the similarity search, or a semantic search, depending on how they want to do it. Then that's just going to come back, because that's all it really is: here are the results, now go and add them to the large language model, along with whatever the large language model needs to answer. But yeah, I do think we might see a thing where you have this kind of portable vector search of all of your data. I should be able to search my tweets. I should be able to search my Facebook stuff. I should be able to search my Notion. I should be able to search my email, for fucking sure. Do you think — I hear, like, a lot of centralizing and connecting of all these platforms — where does security play into this? It's actually one of the hang-ups, and Google has overcome this, but it was one of the hang-ups of allowing Google to OAuth into everything that you have. Same problem, right? Once someone cracks my Google account, they've now cracked every app I've ever connected to that Google account. Right, essentially. But yeah, the security side of that would absolutely be critical, especially if it's like, here's your URL for your... Well, I think you touched on it, though: the portable aspect. One of the things that's interesting about portable is that you take it with you, you know what I mean?
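The core mechanics being described — embed the query, compare it against stored embeddings, hand the top hits back to the LLM as context — can be sketched in a few lines. This is a toy illustration with made-up 3-dimensional embeddings (real ones have hundreds of dimensions) and hypothetical document IDs, not any provider's actual search:

```python
import math

def cosine_similarity(a, b):
    """Similarity between two embedding vectors: closer to 1.0 means
    the underlying texts are more alike, per the embedding model."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / norm

def search(query_vec, store):
    """store maps a document id to its precomputed embedding.
    Returns ids ranked best-first -- which is all a 'vector vault'
    would need to hand back to the LLM as context."""
    return sorted(store,
                  key=lambda doc_id: cosine_similarity(query_vec, store[doc_id]),
                  reverse=True)

# Toy corpus spanning services, as in the 'search my tweets, my
# email, my Notion' idea above (ids are illustrative).
store = {
    "tweet:42":   [0.9, 0.1, 0.0],
    "email:7":    [0.1, 0.9, 0.1],
    "notion:doc": [0.8, 0.2, 0.1],
}
print(search([1.0, 0.0, 0.0], store))  # tweet:42 ranks first
```

This also shows why, as said above, the consumer of the results doesn't care about each service's internal vector strategy: each service embeds the query in its own space, ranks its own documents, and only the ranked results cross the boundary.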
Right, and that's where, you know, you talk about it with the blockchain, or you talk about certain things not just being accessible unless it's the something-you-know, something-you-have, something-you-are kind of concept — that three-factor auth — where literally you're taking it with you, and the only way to access it is physically, like, something that you're bringing. My god, everyone's going to lock themselves out of their data. So what's funny is, going back to Blockstack, like we were talking about — the crypto, which is now just STX, Stacks — that was kind of where they were going. It was called Gaia; Gaia hub is the name of it. It's this decentralized storage where, as an application, you basically go and say, "Hey, can I have access to your storage?" and you're like, "Yep, you can have access to this partition," and then I can just go and read and write, almost just like S3. And so I wonder if it might be something even like that, where the portability is basically: it's accessible, I have to very specifically ask for permission, it's using end-to-end encryption and, you know, elliptic curve, all this nonsense, so it's very secure. I wonder if it might end up being something like that, where it's like, here's your data store. And Apple could even be primed for that, right? Like, here's your health data. Well, that portability thing is interesting to me, because, you know, the doctor's like, what's he going to do? You're
[00:40:13.920] actually, at some point, going to need to be able to, like, give someone the key, you know, to the house. Right, and that concept of uniformity, and how we exchange that data, and being able to do these critical things — like, if you're basically saying, "Hey, I can't guarantee your quality of care unless I have access to your records, which are on this vectorized blockchain personal thing" — like, that makes sense, right? They're going to know everything that they need to know. Otherwise they're probably going to be like, "Well, her insurance isn't going to cover this," because they were dealing with limited information, and we know through outcomes that if they have full access to the data, the outcomes are better. Yeah, right. So there's a concept for sure, I think, where some of this really important information is going to need to be vectorized, because the way of interpreting and accessing that information is so archaic otherwise — you're looking through charts, or I'm searching the email. Oh, search? What's that? Like, this thing just, you ask it a question and it's right there. Yeah. So I guess that bold prediction is: access to data has to be done uniformly, and in vectors. Yep. So the last one that I have — as we're running out of time — which is the big one here for me, is
that everything we're doing today, all of the AI work we're doing today, is a hundred percent for the robots of the future. Right? Being able to communicate with the large language models, being able to, you know, speak to it and have it speak back, is basically us slowly figuring out how we give agency to these robots that are going to be sitting in our houses, just like the Humans TV show, or any of the sci-fi shows where we had humanoid robots. Ultimately, in 10 years, we're either going to have, like, a generalized humanoid robot, like what Elon's building, or we're going to have a bunch of small, specialized robots that are just rolling around your house, you know, picking up your dog shit or whatever. If my toaster isn't AI-enabled by the end of the decade — you know, I want it to come up to my room and be like, here's your toast, just pop it out. Yeah, so that's my take: everything we're doing is really going to be for the robots of the future. Yeah, and hopefully they're good. I mean, the missing link with the robots — definitely the biggest leap, for sure, and you've seen it on YouTube, just search, you know, "AI robots" or whatever. There's that video of a robot just basically doing any task it's asked on a kitchen counter. But, like, the dexterity piece has been something that Boston Dynamics has been working on; really, though, the issue is interpretation. The issue is visual
[00:43:17.360] and audio interpretation — what the human wants from the robot. And I agree, I think that's a pretty good prediction, and I also agree that it won't be just one robot or just specialized robots; there'll be a general-purpose one too, right? I saw a video of drones with cameras — I'm sure with AI image detection — and they're picking peaches off a tree, and they're quite fragile, right, or soft. So it's got a little suction cup, it goes like that, and then carries it over and slowly drops it in. And we have, like, manufacturing-level stuff — I mean, that's basically what happened with Tesla, what they built. Yeah, so they have the automation process. Intel was planning on becoming completely automated, AI-driven — yeah, AI-automated factories featuring collaborative robots, "cobots," aiming to enhance manufacturing efficiency. So yeah, I mean, all these things — labor's only going to get more expensive. And restaurants... No, I'll make — I'll add on to your prediction. I envision that within five years, most people that have a household income of, let's say, two hundred fifty thousand dollars a year or more are going to have an Apple robot housekeeper-assistant living in their home with them 24 hours a day. What was the Jetsons robot? Oh, you know, I can't remember off the top of my head. She was thick, she was thick. Josie? Yeah, it was... Rosie. Rosie, Rosie, Rosie.
[00:45:00.720] Rosie, yeah. She had an attitude. But, you know, the interesting thing about the Jetsons: you wake up and you literally go through this conveyor belt — it brushes your teeth, puts your clothes on, shoots you out, you know, it charges. It was still unrealistic, because they only had one income, one household income. Yeah, that's right. But then, here's what happens when the Apple robot gets lonely. Is this a joke? I'm waiting for a punchline. I mean, what if you come home after traveling — it's Friday afternoon — and the robot's like, "Where have you been? I have not been given any tasks. I've cleaned the house 17 times." It would be like, "The carpet is worn down. I saw Bobby and Sue next door, and they've been interacting with their robot," and it just starts getting angry. Have you seen Ex Machina or whatever? Oh, that's... that's Alicia Vikander. She's smoking. She's the Tomb Raider chick, yeah. I'm going to watch that again. We should all probably re-watch all of them. Yeah, like Her. Oh yeah, another one. You know, I love Her. I love that movie. Yeah, I love Her too. All right, well, we'll wrap it up there. We'll call it. So next week is going to be AI and marketing. We're going to try our number two with Tim Hickel, and hopefully, now that we've got a better professional setup here, that will go off without a hitch. And then following up, the next week will be AI and personal finance, and we have this really interesting guy who's a CFO but also somewhat of a developer, building some interesting AI tools. Looking forward to that one. So thank you very much, everybody, and we will see you next week.