Hosts: Sean Hise, Jacob Wise, and Brandon Corbin Special Guest: Dipen Mehta
Topics Covered:
Relevant Quotes:
Links: INVST (invst.com), Super Amplify (superamplify.com)
[00:00:00.000 --> 00:00:05.680] Welcome to the Big Cheese Podcast. My name is Sean, I'll be your host today. I'm here, as usual, with Brandon, [00:00:05.680 --> 00:00:09.680] hello, Jacob, and we have a special guest today, Dipen Mehta. [00:00:09.680 --> 00:00:15.200] And today we're going to be talking about a smattering of topics, it sounds like, [00:00:15.200 --> 00:00:21.680] including personal finance and AI. So we've been talking about, I don't know... this is the last episode [00:00:21.680 --> 00:00:26.640] of season one. We've done 26 episodes, and we've covered everything from [00:00:27.600 --> 00:00:34.720] predictions about AI to foundational models in AI, and we've also done some industry-specific topics as well. So we did marketing, [00:00:34.720 --> 00:00:38.560] marketing, [00:00:38.560 --> 00:00:44.880] marketing three times, and then we got one episode out of it. No, we did sales, healthcare, healthcare, [00:00:44.880 --> 00:00:52.480] we did big data, enterprise data, which was another good episode. And so we're really excited to talk with you today. Nice. [00:00:52.480 --> 00:00:55.280] So, but first off, I think we [00:00:55.520 --> 00:00:57.200] just want to [00:00:57.200 --> 00:01:01.360] get a quick introduction. Tell us kind of who you are, what you do, [00:01:01.360 --> 00:01:04.720] what... some of your background, and then kind of how you've gotten into some of the AI stuff. [00:01:04.720 --> 00:01:11.200] Yeah, for sure.
So my background is, I started off as a floor trader at the Chicago Board Options Exchange. [00:01:11.200 --> 00:01:15.760] I went to Boston University undergrad, studied computer science, [00:01:15.760 --> 00:01:21.280] and wanted to make a living doing trading, and what started with just working at a company [00:01:21.760 --> 00:01:25.840] turned into a 20-year career where I built a trading firm, [00:01:25.840 --> 00:01:29.920] and I was basically one of the first to trade options electronically. [00:01:29.920 --> 00:01:35.920] And we used a lot of technology. We traded overnight, we traded all 24 hours. [00:01:35.920 --> 00:01:39.200] It was a very competitive business. So I kind of, [00:01:39.200 --> 00:01:46.640] after a certain time period, I sold the business in 2015, and then I got into the financial advisory space, [00:01:48.080 --> 00:01:53.600] initially as a consultant, and then I was like, I want to get involved and help scale a company. [00:01:53.600 --> 00:01:56.480] And so I got into that space, but I've always done AI. [00:01:56.480 --> 00:02:00.320] Even when we were trading, we were data mining, finding opportunities, [00:02:00.320 --> 00:02:06.160] finding edges, and then being able to, like, implement those edges into trading algorithms [00:02:06.160 --> 00:02:10.720] to make money. And so it's such a fascinating field for me right now. [00:02:10.720 --> 00:02:14.800] So when you say you were on the floor, what year would this have been, roughly? [00:02:14.800 --> 00:02:21.680] So I started on the floor in 1995. Okay, so this is when it was literally a floor. Yeah, people are screaming at each other, [00:02:21.680 --> 00:02:23.680] throwing paper around, [00:02:23.680 --> 00:02:27.360] yelling into phones. How was a trade executed back then?
[00:02:27.360 --> 00:02:30.000] So what you did was, um, so [00:02:30.000 --> 00:02:33.680] there were, like, brokers all around the pit. Orders would come in through the phone. [00:02:33.680 --> 00:02:37.840] Um, they would write it down, and they would say, like, hey, what's the market for this? [00:02:37.840 --> 00:02:44.000] And then me and, like, hundreds of other guys are in the pit yelling out markets, like three-eighths, five-eighths, [00:02:44.080 --> 00:02:48.000] you know, three-eighths, five-eighths, and then they're like, okay, I'll sell it at three-eighths, [00:02:48.000 --> 00:02:49.760] you know, I'll buy 500. [00:02:49.760 --> 00:02:56.320] You know, and you're taking this down, and you're writing it on your card, and you're submitting the ticket to the trade reporter, [00:02:56.320 --> 00:02:58.400] and that's how the trade got executed. [00:02:58.400 --> 00:03:04.240] And you're taking the trade and giving it to your clerk, and the clerk is putting it into your trading system, [00:03:04.240 --> 00:03:08.880] and now you have an updated position that you can, like, analyze and see where you're at. [00:03:08.880 --> 00:03:12.640] Does that experience exist at all anymore? [00:03:12.640 --> 00:03:15.920] To a limited extent. So, like, there are still pits. [00:03:15.920 --> 00:03:22.080] Um, I traded S&P 500 index options primarily, and so that pit still exists, [00:03:22.080 --> 00:03:28.000] but it only handles, like, the really, really large flow. Like, most everything else is just [00:03:28.000 --> 00:03:34.000] electronically driven. Um, you know, it's all computers matching computers. [00:03:34.000 --> 00:03:37.440] Um, so it's very, you know, non-human. [00:03:38.400 --> 00:03:43.280] Right, right. So one of the things, and I was just thinking about it as I was driving up here, [00:03:43.280 --> 00:03:46.640] is that I remember, back in the day, forex. Forex?
[00:03:46.640 --> 00:03:53.440] Yeah, and everybody was talking about how this is how you go and you automate and make money, and blah, blah, blah. [00:03:53.440 --> 00:03:56.800] Has that changed with AI? [00:03:56.800 --> 00:04:01.920] And if it has... well, first, could you give us a little education on what that even was? [00:04:01.920 --> 00:04:05.680] I mean, again, I'm flying blind here. I'm an idiot when it comes to it. [00:04:05.680 --> 00:04:12.000] But I'd be curious to see how that's changed now with the advent of AI. Yeah, so I think, as far as, um, [00:04:12.000 --> 00:04:17.040] companies and how they approach it, if you're a trading company now, how you approach, [00:04:17.040 --> 00:04:21.920] um, you know, using AI in the trading space, right? [00:04:21.920 --> 00:04:27.440] Um, what a lot of companies are doing is they're creating their own, like, brains, right? [00:04:27.440 --> 00:04:30.720] So, like, you know, you think about, you know, Ray Dalio, right? [00:04:31.120 --> 00:04:36.640] Right. Yeah, he's built this thing with every rule that he's ever thought about, [00:04:36.640 --> 00:04:39.680] and it's, you know, it's constantly getting [00:04:39.680 --> 00:04:47.680] updated with the experience that they have. And so thinking about that, and using that in a way that's [00:04:47.680 --> 00:04:53.920] really, like, black and white, right? It either works or it doesn't, right? So, um, a lot of those things [00:04:53.920 --> 00:04:58.240] are built off of that concept. Okay, like, how do we, you know...
[00:04:58.240 --> 00:05:06.000] How do we build a neural net that could, you know, predict or model the markets, and then be able to take that prediction [00:05:06.000 --> 00:05:12.320] and see if it's a reality? So each prediction will have a probability that you've kind of [00:05:12.320 --> 00:05:19.920] speculated on or measured, and then you're making bets that have, you know, higher probabilities, [00:05:19.920 --> 00:05:23.440] and shorting bets that have lower probabilities. [00:05:23.920 --> 00:05:30.000] And then, ultimately, like, having a diversified exposure. And that's kind of, I think, what some of those systems are doing. [00:05:30.000 --> 00:05:32.960] Okay. And they're trying to make it so it's more available to [00:05:32.960 --> 00:05:35.680] the general public, you know, right? [00:05:35.680 --> 00:05:40.640] So it really seems the whole forex thing was using machine learning as a model. [00:05:40.640 --> 00:05:45.440] It wasn't large language models, which obviously blew up when everybody became kind of self-aware of AI. [00:05:45.440 --> 00:05:53.600] Are there new techniques that people are leveraging that are showing better results than the traditional machine learning methods? [00:05:53.600 --> 00:05:59.520] I do, yeah. So I think with semantic search, right? So semantic search is a generative AI [00:05:59.520 --> 00:06:06.240] concept that's come out, you know, in the last two, three years.
That's a very powerful search... [00:06:06.240 --> 00:06:14.400] powerful search capability where you're going to be able to dissect a lot of data in a lot shorter period of time [00:06:14.400 --> 00:06:16.160] than we used to back in the day, [00:06:16.160 --> 00:06:20.240] right? Because it took you the whole night to go through the entire market [00:06:20.720 --> 00:06:25.920] data that you had, whereas you can kind of do it a lot faster because you've got everything [00:06:25.920 --> 00:06:30.640] cataloged already in your sort of vector database, right? [00:06:30.640 --> 00:06:37.280] So once you've got all that stuff, you can just, at your fingertips, pull that out, and you can make [00:06:37.280 --> 00:06:41.520] higher-quality decisions in a shorter period of time. [00:06:41.520 --> 00:06:41.920] Okay. [00:06:41.920 --> 00:06:44.400] So I think that's going to change the game, [00:06:44.400 --> 00:06:50.400] and I think a lot of companies are starting to adopt that in how they're making trading decisions. [00:06:50.400 --> 00:06:56.480] Right. Are any companies literally using, like, an LLM-type interface where they're like, give me some trading [00:06:56.480 --> 00:07:03.200] scenarios, you know, or give me some advice based off this data, and actually interacting, like, in an LLM style? [00:07:03.200 --> 00:07:08.800] So the concept with the trading markets is, our proprietary trading [00:07:08.800 --> 00:07:16.400] is completely different than what we look at when we think about, you know, personal finance, or even the, you know, [00:07:17.040 --> 00:07:20.720] generative AI space, because a lot of those guys, they don't want to tell you, right? [00:07:20.720 --> 00:07:24.400] They're secretive. They're doing things.
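The vector-database idea Dipen describes, cataloging everything up front so a question becomes a fast similarity lookup instead of an overnight scan, can be sketched in a few lines. This is a toy illustration, not any firm's actual stack: the example notes are invented, and the bag-of-words `embed` function is a stand-in for a real embedding model.

```python
from collections import Counter
import math

# Toy corpus standing in for cataloged market notes (hypothetical examples).
notes = [
    "SPX index options implied volatility spiked after the Fed meeting",
    "crude oil futures rallied on supply cuts",
    "tech stocks sold off as bond yields rose",
]

def embed(text: str) -> Counter:
    """Stand-in embedding: a bag-of-words count vector.
    A real system would use a learned embedding model here."""
    return Counter(text.lower().split())

def cosine(a: Counter, b: Counter) -> float:
    """Cosine similarity between two sparse count vectors."""
    dot = sum(a[w] * b[w] for w in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

# "Vector database": embed every note once, up front.
index = [(note, embed(note)) for note in notes]

def search(query: str, k: int = 1) -> list[str]:
    """Return the k notes most similar to the query."""
    q = embed(query)
    ranked = sorted(index, key=lambda pair: cosine(q, pair[1]), reverse=True)
    return [note for note, _ in ranked[:k]]

print(search("what happened to index options volatility"))
```

Swapping `embed` for a learned embedding model and the list for a real vector store changes the quality of the matches, not the shape of the workflow: embed once, then answer each question with a similarity lookup.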
So, like, you're not going to know [00:07:24.400 --> 00:07:28.480] what these different companies are doing, the ones that are super successful, [00:07:28.480 --> 00:07:32.320] unless they are trying to recruit capital, right? [00:07:32.320 --> 00:07:37.280] So if they're trying to recruit capital, then you can have a good insight into what they're actually doing, [00:07:37.280 --> 00:07:42.880] but most of them will be pretty secretive about it. But I imagine, you know, if I was [00:07:43.520 --> 00:07:46.400] running a trading firm right now, I would be 100% [00:07:46.400 --> 00:07:51.200] building teams that are using it and taking advantage of it, because [00:07:51.200 --> 00:07:56.160] it's a game changer, just like the internet was, just like the personal computer was. [00:07:56.160 --> 00:08:04.080] It should change the way that they're doing business, and in a lot of cases, it's like your library in a [00:08:04.080 --> 00:08:09.840] natural language, right? So you're basically asking questions like, hey, have you ever seen this happen before? [00:08:10.160 --> 00:08:13.920] You know, what's the likelihood of, you know, this stock and this stock, [00:08:13.920 --> 00:08:16.800] you know, crossing levels, or something like that? [00:08:16.800 --> 00:08:22.080] So, have you... well, to parlay, just because we have a little bit more... [00:08:22.080 --> 00:08:28.000] you and I have a little bit more backstory. So you're the CIO? CTO? [00:08:28.000 --> 00:08:34.800] So my role... so I'm the COO of INVST, and we're a financial advisor here in Indianapolis. [00:08:35.200 --> 00:08:38.720] Um, we have about $1.5 billion in assets under management, [00:08:38.720 --> 00:08:44.800] and we're actually completely different than any financial advisor you'll ever meet. We focus on [00:08:44.800 --> 00:08:51.200] transforming people's lives in a really unique way.
We don't care about how much money they have. [00:08:51.200 --> 00:08:54.800] What we're looking at is, like, what's your mindset? [00:08:54.800 --> 00:08:59.520] Like, how do you feel about money? Are you in a scarcity mode? [00:08:59.520 --> 00:09:03.520] Are you in an abundance mode? Right? So we're working with people that have abundance [00:09:04.160 --> 00:09:10.800] mindsets, and then we're helping them transform their financial lives, where we can make an impact on a million people. [00:09:10.800 --> 00:09:15.200] So we want to help a million people live the life they want, and it's such a powerful, [00:09:15.200 --> 00:09:16.800] uh, [00:09:16.800 --> 00:09:19.040] sort of concept that we've been able to really, [00:09:19.040 --> 00:09:25.360] um, you know, build the community around it and grow our assets tremendously over the last... [00:09:25.360 --> 00:09:33.120] And so the website's I-N-V-S-T dot com. That's right. Okay, and you guys really get involved in, like, [00:09:33.680 --> 00:09:34.720] not just [00:09:34.720 --> 00:09:40.400] investing people's money, but really looking at, like, their entire life cycle, all the different things that they've got going on [00:09:40.400 --> 00:09:47.040] in their lives, and how their financial aspects... also, like, talk to that a little bit. Sure. Yeah, so we have a really [00:09:47.040 --> 00:09:53.840] interesting process, right? So we start with the why. So we try to understand, what is the why [00:09:53.840 --> 00:09:59.920] for your money, right? What do you, you know, what do you want to get to? What...
What is your sort of, [00:10:00.880 --> 00:10:04.720] you know, uh, sort of framework in how you think about money? [00:10:04.720 --> 00:10:11.200] So we establish that. So we go through, like, your mission, your personal statement, your core values as a family, [00:10:11.200 --> 00:10:15.680] and we really establish that why. We spend a lot of time on it, and we get that in place. [00:10:15.680 --> 00:10:20.000] So that's what we use to measure the financial decisions that they make [00:10:20.000 --> 00:10:26.720] through their life, right? So, you know, if you don't know how to measure and weigh those decisions, [00:10:27.040 --> 00:10:31.920] you're going to make bad decisions more than good decisions. So if we have that framework in place, [00:10:31.920 --> 00:10:37.200] things go a lot better, and we've seen that in our clients, the transformations that they've had. [00:10:37.200 --> 00:10:40.800] So we start with that, and then we look at their overall [00:10:40.800 --> 00:10:45.760] sort of legacy, right? Like, what kind of protections do you have? Do you have an estate plan? [00:10:45.760 --> 00:10:51.440] Is it set up for you to be able to, you know, leave a legacy for your family? [00:10:51.440 --> 00:10:55.200] Are you protected, from all kinds of insurance purposes? [00:10:55.520 --> 00:10:57.520] You know, do you have all the right, [00:10:57.520 --> 00:11:04.960] you know, sort of things in place so you can leave that legacy without any issues, and the government doesn't take it all, right, [00:11:04.960 --> 00:11:12.000] you know, with probate and all that? And then, once we have that locked up, we look at their cash flow, right? We look at, say, hey, [00:11:12.000 --> 00:11:16.640] you know, are you a world-class saver? Are you putting money away?
[00:11:16.640 --> 00:11:21.360] Consistently. So, a lot of... we work primarily with business owners. [00:11:21.680 --> 00:11:26.640] So when you work with business owners... business owners, what do they do? They put it all on the line for their business. [00:11:26.640 --> 00:11:31.520] Like, every day they're busting their butts. They're working long hours. [00:11:31.520 --> 00:11:36.240] You know, they wake up thinking about the business, they go to sleep thinking about the business, right? [00:11:36.240 --> 00:11:42.240] So what we do is we help them with their businesses, where we implement EOS for them. [00:11:42.240 --> 00:11:45.280] So EOS is the Entrepreneurial Operating System. [00:11:46.080 --> 00:11:52.080] We have an INVST for Business program where we, you know, get a valuation for their business, determine whether or not [00:11:52.080 --> 00:11:58.640] they could improve that valuation, and then we implement different programs, like EOS; [00:11:58.640 --> 00:12:03.760] you know, The 10 Disciplines, which is a course for executives to really [00:12:03.760 --> 00:12:10.240] balance their personal and executive life so they can live a happier life; and then Entrepreneurial Leap, [00:12:10.320 --> 00:12:16.320] which is a program for new businesses and new business owners, where they can get into [00:12:16.320 --> 00:12:22.320] sort of the scalability of their business by learning how to build the right infrastructure, [00:12:22.320 --> 00:12:26.960] how to think about the right strategy, and some of those things. So we have those programs in place [00:12:26.960 --> 00:12:29.520] for our business owners, so they can [00:12:29.520 --> 00:12:34.400] get out of their business and think about their business from a strategic perspective. [00:12:34.400 --> 00:12:39.440] And then they can become world-class savers, which in turn will create long-term wealth for them. [00:12:39.920 --> 00:12:44.960] And then, finally,
we work on their investments, right? So, what are we going to do with their investments? [00:12:44.960 --> 00:12:46.560] We're going to leverage them. [00:12:46.560 --> 00:12:52.400] We're going to try to get them in a way where they're going to create velocity with their money. Like, think of a bank, [00:12:52.400 --> 00:12:57.040] right? A bank is going to take your money, and then they're going to try to, [00:12:57.040 --> 00:13:02.320] uh, you know, get money from the Federal Reserve, and then they're going to try to multiply that [00:13:02.320 --> 00:13:08.800] investment in multiple ways, right? So that's kind of what we're going to be doing with their investments, trying to multiply that. [00:13:09.440 --> 00:13:15.840] Nice, that's awesome. And then you also have your little side hustle. Yes, we've got Super Amplify. [00:13:15.840 --> 00:13:20.080] And that's superamplify.com. That's right. Right. Yeah, and this is an [00:13:20.080 --> 00:13:25.040] app that basically is... well, go ahead, you can talk us through it. [00:13:25.040 --> 00:13:32.320] Super Amplify is sort of an application that I've been sort of playing around with. So, with the advent of generative AI, [00:13:32.320 --> 00:13:38.800] I've been just sort of playing around with what would be useful at work for people to be successful, [00:13:39.280 --> 00:13:42.880] uh, at their jobs, right? How can you transform your business, [00:13:42.880 --> 00:13:45.280] uh, where you can get 10x the results [00:13:45.280 --> 00:13:51.440] from where you're at right now? Right. So Super Amplify is a platform that can help you [00:13:51.440 --> 00:13:58.400] do a lot of things to improve your workflows, to be able to use all the large language models that are available, [00:13:58.400 --> 00:14:03.360] in a safe and secure way, right? So we have the ability to create something called a collection. [00:14:03.360 --> 00:14:07.600] You can put in your data.
It's not ever shared with the model, [00:14:07.680 --> 00:14:10.880] so it's only available to you, and it's, you know, [00:14:10.880 --> 00:14:15.920] kept in a way that the models only get what they need to give you the right answer. [00:14:15.920 --> 00:14:20.640] So it creates a lot of sort of security for you, and then there are also privacy [00:14:20.640 --> 00:14:26.000] protections that are in place, so you're not giving away your intellectual property to the model. So [00:14:26.000 --> 00:14:30.400] those are some of the things I think are really important when we talk about, [00:14:30.400 --> 00:14:37.120] you know, generative AI: that the only way these models get better and smarter is by [00:14:37.440 --> 00:14:38.880] consuming data, right? [00:14:38.880 --> 00:14:42.960] And so, like, when you think about the new models that are out, like the Meta model, or the, [00:14:42.960 --> 00:14:46.320] you know, Google model, or even the OpenAI models, [00:14:46.320 --> 00:14:51.120] they've all gotten there because people have done a lot of things on their platforms, right? [00:14:51.120 --> 00:14:54.240] So Meta is mining all your Facebook data, [00:14:54.240 --> 00:14:56.720] Google's mining all your Google searches, right? [00:14:56.720 --> 00:14:59.680] So that's how these models have gotten smarter. [00:14:59.680 --> 00:15:05.520] And that's a challenge, because, like, your intellectual property is worth a lot to you, [00:15:06.000 --> 00:15:08.000] and the minute you start [00:15:08.000 --> 00:15:10.160] letting them consume it, [00:15:10.160 --> 00:15:13.520] it devalues who you are and what you've built, right?
[00:15:13.520 --> 00:15:18.160] So that's kind of the purpose of it, so companies can use it safely. [00:15:18.160 --> 00:15:23.360] I want to just switch gears a little bit, because I don't think we've had a guest on here that actually has [00:15:23.360 --> 00:15:28.720] a lot of experience with different models, like Brandon does, and like we do, in terms of actually [00:15:28.720 --> 00:15:34.480] analyzing and evaluating them. So I just have kind of a question for you guys. Like, I'll start with you. [00:15:35.440 --> 00:15:37.120] Are... [00:15:37.120 --> 00:15:43.440] first of all, what's your favorite model, and what's your take on, like, the evolution of these models? [00:15:43.440 --> 00:15:46.000] We'll just start with that. Yeah, I... [00:15:46.000 --> 00:15:52.160] so I have a different favorite for different things. So it's kind of cool. So, like, I... [00:15:52.160 --> 00:15:53.920] I love... [00:15:53.920 --> 00:15:58.160] I love asking different models the same question, right? [00:15:58.160 --> 00:16:00.640] And I love seeing the [00:16:01.040 --> 00:16:04.960] powerful ways that they answer it. So I've asked questions like, you know, [00:16:04.960 --> 00:16:10.880] hey, I want to take a brain-to-AI interface and build it, can you help me do that? Some models will say, [00:16:10.880 --> 00:16:16.720] um, I can't help you, that's very controversial, and then other models will be like, here, start with this, [00:16:16.720 --> 00:16:23.600] you know, then do this. So there are a lot of, like, interesting answers that you're going to get, and then you can also, like, almost [00:16:23.600 --> 00:16:27.920] convince them to give you an answer, right? If you work them over... and I work them over. [00:16:28.320 --> 00:16:34.720] But I like a lot the new Llama model.
That's come out... the Llama 3 [00:16:34.720 --> 00:16:42.000] 70-billion-parameter model, and that's Facebook's model, for people who don't know. And that is, uh, open... [00:16:42.000 --> 00:16:47.280] it's open source with an asterisk, like everything with Facebook. [00:16:47.280 --> 00:16:49.840] Yeah, like, if you have more than 700 million users, [00:16:49.840 --> 00:16:54.800] then you have to pay for a license to use it. So, like, Netflix or Amazon, [00:16:55.200 --> 00:16:58.800] they wouldn't be able to use it without getting the license, but pretty much everybody under that. [00:16:58.800 --> 00:17:05.040] So we saw today the news that Apple is coming out with their, what they're calling, an ELM. [00:17:05.040 --> 00:17:08.240] Yep, and it's going to be open source, and it's on-device. [00:17:08.240 --> 00:17:13.440] So what's the reasoning why a company like a Facebook and, uh, um, [00:17:13.440 --> 00:17:16.080] an Apple would open source their models, [00:17:16.080 --> 00:17:21.360] versus, you know, some of the other companies that are not doing so? And is everything going to be open source? [00:17:23.280 --> 00:17:28.880] Go ahead. Yeah, it's an interesting question of why they're doing it. Um, and I think it's almost a... [00:17:28.880 --> 00:17:35.360] I think it's a reactive response to the closed-source models of OpenAI, [00:17:35.360 --> 00:17:38.480] that it's almost a way for them to go,
you know what? Fine. [00:17:38.480 --> 00:17:43.520] We're just going to rug-pull you, and we're going to have this one be completely open source and available to everybody, [00:17:43.520 --> 00:17:47.760] to make the competition that much better. But that's the only thing that I can kind of come up with. [00:17:47.760 --> 00:17:50.000] This is now our last episode of the season, [00:17:50.000 --> 00:17:52.800] and I think we've been talking about this, trying to answer this question, [00:17:53.120 --> 00:17:54.400] which is: [00:17:54.400 --> 00:17:58.000] is the model a commodity, and does OpenAI really have a moat? [00:17:58.000 --> 00:18:01.840] Right. And I think that, as, like you said, the rug... [00:18:01.840 --> 00:18:07.040] and the proliferation of all these models... in six months, [00:18:07.040 --> 00:18:12.320] is it even going to matter, or is it just about how people are implementing the tools around them? [00:18:12.320 --> 00:18:16.000] Yeah, and I think that they're all so similar, like you were talking about. [00:18:16.000 --> 00:18:19.600] And usability. Like Gemini... I'm a huge Gemini fan, [00:18:20.080 --> 00:18:26.640] and I was watching a video on usability, and it comes out... it may not be as good on, like, content as ChatGPT-4 [00:18:26.640 --> 00:18:32.880] or Llama 3, but it formats better, so it's easier to read. And I'm like, oh my gosh, [00:18:32.880 --> 00:18:38.880] I have issues with ADHD, probably, and that's why I like that better, because it's easier for me to read, right? [00:18:38.880 --> 00:18:42.400] Yeah, but to your point, like, they all have pretty similar benchmarks. [00:18:42.400 --> 00:18:48.160] And what's their moat? What's OpenAI's moat? If GPT-5... okay, if it's going to be that much better, [00:18:48.320 --> 00:18:50.720] then how much time is going to pass before
[00:18:50.720 --> 00:18:56.480] Llama 4 is going to come out, and it's going to be even better, you know? So, yeah, I think it's an interesting... [00:18:56.480 --> 00:19:01.040] it's commoditized, for sure, but I think there's a lot [00:19:01.040 --> 00:19:07.520] of magic that goes on, and we've talked about this multiple times, where, when you're prompting ChatGPT, [00:19:07.520 --> 00:19:11.520] you're not just going straight to their GPT-4 Turbo, right? Yeah. [00:19:11.520 --> 00:19:16.960] There's a lot of fuckery that's going on behind the scenes before that prompt ever even lands there. [00:19:17.280 --> 00:19:21.600] Yeah, you don't necessarily get that when you're using Llama and you're just raw-dogging it. [00:19:21.600 --> 00:19:28.160] Did you see my post about Gemini 1.5? So, I forget... it's an MoE architecture, where it's, like, [00:19:28.160 --> 00:19:36.160] a mixture of experts. Anyway, they basically segregated the model into little subset models, and when you [00:19:36.160 --> 00:19:39.840] ask the question... so OpenAI does this to a certain extent, [00:19:39.840 --> 00:19:42.000] but they're still passing it to the main LLM. [00:19:42.320 --> 00:19:47.360] Well, what Google's strategy is going to be is they're going to have a bunch of LLMs, and it's going to say, hey, [00:19:47.360 --> 00:19:51.120] here's my question, which one is going to be best suited to answer this? Or maybe multiple, [00:19:51.120 --> 00:19:56.720] you were talking about. Yeah, so you give it to... and I kind of kicked around this with Big Cheese originally, [00:19:56.720 --> 00:20:02.480] which is having, like, your master... your master that you can ask a question, and then it goes and figures out which [00:20:02.480 --> 00:20:08.240] agents should actually be used for this kind of call. That seems like a direction that we would absolutely go. [00:20:08.800 --> 00:20:12.080] So it's not necessarily the model.
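The "master that figures out which agent should take the call" idea being kicked around here can be sketched as a trivial router. Everything in this sketch is hypothetical: the expert names, keyword sets, and scoring are made-up stand-ins, and in a real system the router would itself be an LLM or a learned classifier dispatching to separate expert models.

```python
# Toy "master" router: scores each expert by keyword overlap with the
# question and dispatches to the best match. All names are hypothetical.
EXPERTS = {
    "markets": {"stock", "options", "trading", "volatility"},
    "tax":     {"deduction", "irs", "filing", "capital"},
    "estate":  {"will", "trust", "probate", "legacy"},
}

def route(question: str) -> str:
    """Pick the expert whose keyword set best overlaps the question."""
    words = set(question.lower().split())
    scores = {name: len(words & kws) for name, kws in EXPERTS.items()}
    return max(scores, key=scores.get)

def answer(question: str) -> str:
    """Dispatch the question to the routed expert.
    In a real system this would call the chosen expert model."""
    expert = route(question)
    return f"[{expert} expert] handling: {question}"

print(answer("should I put my house in a trust to avoid probate"))
```

The same shape scales up: replace keyword overlap with a classifier (or a small LLM prompted to pick an expert), and replace the f-string with a call to the selected model, possibly fanning out to "maybe multiple" experts and merging their answers.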
It's the implementation. [00:20:12.080 --> 00:20:15.520] But I think there's another thing that I kind of wanted to get everybody's take on. [00:20:15.520 --> 00:20:21.680] There was a Medium article that I read this morning... actually, I could only read the first couple sentences. [00:20:21.680 --> 00:20:26.640] I know, I just wanted to give you a moment on that. [00:20:26.640 --> 00:20:28.720] But the [00:20:28.720 --> 00:20:33.360] article is "Stop Building Chat Interfaces," and I think we got a lot of that, where it's like, [00:20:33.440 --> 00:20:38.720] don't just go build a chatbot. But actually, as time has gone on, we've evolved that conversation, [00:20:38.720 --> 00:20:42.400] where, like, it's not just about chat... it literally is about chat. [00:20:42.400 --> 00:20:48.000] The chatbot is not commoditized. People are building [00:20:48.000 --> 00:20:55.440] entire UI paradigms around this new way of interfacing with computing. And, like, when we talk about ChatGPT, [00:20:55.440 --> 00:20:58.640] and we talk about Gemini, we're talking about literally a new interface, [00:20:58.960 --> 00:21:03.440] where it's not point-and-click and find where the filter is and do this thing and click this button. [00:21:03.440 --> 00:21:06.320] It's a truly new UI/UX paradigm. [00:21:06.320 --> 00:21:11.040] And I think a lot of people... and you talked about it this week, which is, uh, you guys were talking about the, uh... [00:21:11.040 --> 00:21:15.200] what's the framework that, uh, you recommended, Dipen? The... you just... [00:21:15.200 --> 00:21:19.520] Open WebUI. And you're talking about entire UI [00:21:19.520 --> 00:21:23.440] systems and UX systems that are built around this different interface. [00:21:23.600 --> 00:21:29.120] And so, I mean, I think that, when you're talking about models and interacting with all of them, [00:21:29.120 --> 00:21:36.880] it's like, there's a reason:
this is maybe a better or more nuanced way for people to interface with data. [00:21:36.880 --> 00:21:43.280] Yeah, I agree, and I think the reason it's a better way is because it's so open. Like, we get a lot of customers that ask us, [00:21:43.280 --> 00:21:47.120] hey, we've got a bunch of data, we want... and we've got customers where, maybe, [00:21:47.120 --> 00:21:50.400] um, in the traditional sense, you would say, like, oh, here's the UI, [00:21:50.400 --> 00:21:54.000] we have a bunch of dropdowns, and they can filter the things, and that's pretty cool, [00:21:54.000 --> 00:21:56.240] but it's limiting what they can actually do with that data. [00:21:56.240 --> 00:22:00.720] If a customer knows what questions to ask, and, like, wants to ask more nuanced questions, [00:22:00.720 --> 00:22:03.280] well, there's no way to do that with a fixed UI. [00:22:03.280 --> 00:22:07.840] So the chatbot is that. And you were talking about it in the sense of, like, financial advising, or, [00:22:07.840 --> 00:22:13.760] um, looking at that data. That company is going to get the data and [00:22:13.760 --> 00:22:18.480] add value, and then you can use a large language model to ask good questions against that data. [00:22:18.560 --> 00:22:24.080] And people who know the right questions to ask are going to be able to get a lot of information a lot quicker. [00:22:24.080 --> 00:22:28.880] Instead of all night, now we're 10 minutes in, and we've got the same results. [00:22:28.880 --> 00:22:31.360] Isn't a chatbot really... [00:22:31.360 --> 00:22:34.720] or the chat interface... isn't it just the first [00:22:34.720 --> 00:22:38.560] design pattern for representing what you and I are doing right now? [00:22:38.560 --> 00:22:42.880] Right, which is just two human beings having a conversation. [00:22:42.880 --> 00:22:47.680] We're just chatting, and then I stop, and then you start. I might interrupt, and you start. [00:22:47.760 --> 00:22:50.960] You know, we go back and forth.
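The dropdown-versus-chat contrast drawn above can be made concrete with a toy dataset. The client rows and the tiny phrase-matching "parser" below are invented for illustration; in a real product, the step that turns a free-form question into a predicate or query is exactly where the large language model would sit.

```python
# Hypothetical client data that a fixed UI might expose via dropdowns.
clients = [
    {"name": "A", "age": 61, "savings_rate": 0.25},
    {"name": "B", "age": 34, "savings_rate": 0.05},
    {"name": "C", "age": 58, "savings_rate": 0.30},
]

# Fixed-UI style: one pre-built filter per dropdown; anything else
# the user wants to ask requires a new widget.
def filter_by_min_age(rows, min_age):
    return [r for r in rows if r["age"] >= min_age]

# Chat style: a stand-in for an LLM turning a free-form question into
# an arbitrary, composable predicate. Here it is a trivial phrase matcher.
def ask(rows, question):
    q = question.lower()
    pred = lambda r: True
    if "near retirement" in q:
        pred = lambda r: r["age"] >= 55
    if "world-class saver" in q:
        prev = pred
        pred = lambda r, prev=prev: prev(r) and r["savings_rate"] >= 0.20
    return [r["name"] for r in rows if pred(r)]

print(ask(clients, "which clients near retirement are world-class savers?"))
```

The point of the sketch is the shape, not the parser: the dropdown path supports only the filters someone thought to build, while the question path composes conditions the builder never anticipated, which is what makes the chat interface feel "so open."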
I mean, that's really what we built here. [00:22:50.960 --> 00:22:55.920] And that all transfers. Again, going back to my other prediction, all of this is really, [00:22:55.920 --> 00:22:58.800] all of this is going to go into the brains of the robots. [00:22:58.800 --> 00:23:04.320] You know, maybe just real quick, your take on it. [00:23:04.320 --> 00:23:06.000] I think, [00:23:06.000 --> 00:23:08.000] I think of the chatbot as the [00:23:08.000 --> 00:23:14.000] first interface. When you think about what the interface to the World Wide Web was, right, [00:23:14.240 --> 00:23:16.480] you think about it, the interface was the browser. [00:23:16.480 --> 00:23:23.360] I think what we've got going on is we've got all of these big tech companies [00:23:23.360 --> 00:23:25.840] competing to be the best browser. [00:23:25.840 --> 00:23:30.560] So, like, what Facebook, Meta, is trying to tell you is, [00:23:30.560 --> 00:23:35.840] hey, you can run your own browser on your own computer, or you can run it on your own cloud. [00:23:35.840 --> 00:23:43.040] And what OpenAI is saying is, here's my browser, you can use it, you'll get everything. And you've got a real battle [00:23:43.440 --> 00:23:46.000] with Google, because Google owns the browser, [00:23:46.000 --> 00:23:48.720] and now they're getting competitive, like, you know, [00:23:48.720 --> 00:23:54.400] you've got Perplexity and all these other companies coming in, and so the browser's under attack. [00:23:54.400 --> 00:23:58.720] So what you see is, like, who's going to end up having that browser? [00:23:58.720 --> 00:24:03.280] Like now we just use Google Chrome or, you know, Safari. [00:24:03.280 --> 00:24:06.800] Well, and we talked about this. Go ahead. Okay.
So what's kind of interesting, [00:24:06.800 --> 00:24:11.760] what you were just saying reminds me a lot of the search engine wars that happened in the early '90s. [00:24:12.000 --> 00:24:15.920] Right, so the search engine wars, and then what some people started doing is they started building [00:24:15.920 --> 00:24:21.840] kind of aggregators of the search engines, where you would basically have one text box, and it would be like, here's Google, [00:24:21.840 --> 00:24:27.200] here's Yahoo, here's AltaVista, here's this, here's this. I wonder if we're just repeating the same thing. [00:24:27.200 --> 00:24:30.560] I think people [00:24:30.560 --> 00:24:37.600] don't know what's gonna be the best, right? So I think you've got a situation where, like, I talk to a lot of people [00:24:38.000 --> 00:24:43.760] in a lot of different fields, you know, whether it's sort of the social sciences or even, and [00:24:43.760 --> 00:24:48.400] they're all trying things out, and like, hey, I like this better, [00:24:48.400 --> 00:24:52.240] I like this, but, you know, I like the way this answers my questions. [00:24:52.240 --> 00:24:57.920] And so I think we're just trying to feel it out. And that's why all these companies [00:24:57.920 --> 00:25:03.680] want that, because once you win that war, you become Google, [00:25:03.920 --> 00:25:08.960] right? You know, you become what Google is in the advertising space, because you have everybody on your platform. [00:25:08.960 --> 00:25:12.080] The only thing, though, I'd argue against that is that [00:25:12.080 --> 00:25:18.560] Google was able to do it because they had scale and they had a lot of money to be able to keep that moat. [00:25:18.560 --> 00:25:23.760] But man, we're just seeing these things coming out left and right, and that does make it a little bit harder of a stance [00:25:23.760 --> 00:25:26.080] just to be able to be like, all right,
we're the dominant one. [00:25:26.080 --> 00:25:30.720] You know, well, the other point I've made in the last two or three podcasts is, okay, [00:25:30.720 --> 00:25:32.720] so if the browser dies, [00:25:32.720 --> 00:25:35.120] you know, that's the thing. [00:25:35.120 --> 00:25:39.440] Like, in the old days, the search engine was an interface through a browser, [00:25:39.440 --> 00:25:42.800] right, and the data that you got from the search engine was in the browser. [00:25:42.800 --> 00:25:46.400] Now we go to this weird thing where you're going to an LLM and getting answers, [00:25:46.400 --> 00:25:50.960] but the LLM is trained on data that's massively from the browser. [00:25:50.960 --> 00:25:56.080] Eliminate the supply and the data pipelines, and [00:25:56.480 --> 00:26:01.760] how are you going to continue to train the models, and where's that data going to come from? [00:26:01.760 --> 00:26:07.600] Yeah, it seems like it would have to then come from proprietary platforms [00:26:07.600 --> 00:26:11.600] that it's served up from, or synthetic data, right? [00:26:11.600 --> 00:26:15.360] So I can now go and basically ask ChatGPT, [00:26:15.360 --> 00:26:18.640] here's a bunch of photos, write me labels for them. [00:26:18.640 --> 00:26:24.720] I can then take that, and I can now go train a new model on that synthetic data that came out of ChatGPT. [00:26:24.800 --> 00:26:28.560] Well, now, again, does that hurt our results? Does that hurt our ranking over time? [00:26:28.560 --> 00:26:31.200] I just saw a lot of bad stories about that. [00:26:31.200 --> 00:26:35.200] Yeah, well, yeah, it's like it's eating itself. It's, [00:26:35.200 --> 00:26:39.600] it eats its own children.
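The synthetic-data loop described here, where a big model writes the labels and a new model trains on them, can be sketched very roughly like this. Everything in it is a stand-in: `pretend_labeler` plays the part of a vision model like ChatGPT (no real API call), and the "student" is a toy centroid model, not any particular training method.

```python
# Sketch of the synthetic-data loop: model-written labels, then training.
def pretend_labeler(photo):
    # Stand-in for a large vision model writing a label for each photo.
    return "cat" if photo["whiskers"] else "dog"

def train_centroids(examples):
    # Toy "student" model: average the feature per synthetic label.
    sums, counts = {}, {}
    for x, label in examples:
        sums[label] = sums.get(label, 0) + x["whiskers"]
        counts[label] = counts.get(label, 0) + 1
    return {label: sums[label] / counts[label] for label in sums}

photos = [{"whiskers": 1}, {"whiskers": 0}, {"whiskers": 1}]
synthetic = [(p, pretend_labeler(p)) for p in photos]  # model-written labels
student = train_centroids(synthetic)                   # train on them
print(student)  # → {'cat': 1.0, 'dog': 0.0}
```

The worry raised in the conversation is what happens when this loop is repeated on its own outputs: any errors in the synthetic labels compound, which is the "eating its own children" degradation (often called model collapse) the bad stories were about.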
Yeah, but I saw, you get those crazy AI... [00:26:39.600 --> 00:26:48.400] Well, on the podcast we have this theory that basically says that web accessibility is still a thing and still will perpetuate, [00:26:48.400 --> 00:26:52.320] because it's the only system that has [00:26:52.880 --> 00:26:59.120] basically worked and provided free information to the entire world, now including all LLMs. And, you know, [00:26:59.120 --> 00:27:04.000] it's just interesting to always ask people. I don't know if you have any thoughts on that, but, um, yeah. [00:27:04.000 --> 00:27:10.400] No, I think it's really interesting. I think there is a movement where, like, have you heard of the, um, [00:27:10.400 --> 00:27:14.480] what's that one device that you can get, the little Rabbit? The Rabbit, [00:27:14.480 --> 00:27:21.440] yeah. Like, there's so many different concepts that are coming out where you're actually moving away from, [00:27:21.920 --> 00:27:25.920] you know, logging into something and doing it. So I feel like [00:27:25.920 --> 00:27:28.640] we were so used to a certain [00:27:28.640 --> 00:27:34.800] dynamic of how to do things that there are going to be these major shifts. [00:27:34.800 --> 00:27:40.240] Like you said, the chatbot is one, like agents, you know. What's after agents? [00:27:40.240 --> 00:27:44.560] It's like workers, right? Like, automated things that are autonomous. [00:27:44.560 --> 00:27:50.640] And for us as humans interfacing with this type of technology, [00:27:51.040 --> 00:27:53.680] we're going to try to replicate what we do [00:27:53.680 --> 00:27:57.440] manually. Everything that we do manually is going to have some [00:27:57.440 --> 00:28:02.160] purpose with this, and so the challenge is, how do we [00:28:02.160 --> 00:28:06.080] balance that with, you know, what are we going to do now? [00:28:06.080 --> 00:28:06.720] Right?
[00:28:06.720 --> 00:28:11.040] So there's going to be a big shift in what we end up doing, all of us. [00:28:11.040 --> 00:28:17.280] So we're going to have to plan it out, like, I used to do all of this every day, now I don't have to do that. [00:28:17.280 --> 00:28:20.240] Let me do what I love. And what is that going to look like? [00:28:20.240 --> 00:28:22.000] I'll share a quick story. [00:28:22.000 --> 00:28:24.000] So, I don't know if you guys know, Tesla [00:28:24.000 --> 00:28:27.120] did their earnings, and their stock actually went up. [00:28:27.120 --> 00:28:31.120] Yeah, but their sales went down, or the revenue went down, because they have a lot of competition. [00:28:31.120 --> 00:28:37.440] But what they're doing, their strategy, why their stock went up, is because they're going to start cutting their prices and coming out with [00:28:37.440 --> 00:28:41.600] the $24,000 car. Yeah, they're going to cut the market, because, in theory, they have more [00:28:41.600 --> 00:28:45.280] mature manufacturing. Did you listen to the call? [00:28:46.080 --> 00:28:48.080] No, but anyways. [00:28:48.080 --> 00:28:52.240] Talk about feeling useless. So, on all their existing customers, [00:28:52.240 --> 00:28:54.480] they dropped a 30-day free trial of Full Self-Driving. [00:28:54.480 --> 00:28:58.960] So I just woke up two days ago and my car had Full Self-Driving. [00:28:58.960 --> 00:29:00.000] Wait, did it work? [00:29:00.000 --> 00:29:01.200] Full Self-Driving. [00:29:01.200 --> 00:29:01.680] Did it work?
[00:29:01.680 --> 00:29:03.200] So I, [00:29:03.200 --> 00:29:08.080] so I did the Autosteer for a while, which is still available on any Tesla model, [00:29:08.080 --> 00:29:13.360] but I have Full Self-Driving right now, and it's driven me to work, to get my kids, [00:29:14.000 --> 00:29:18.080] completely. All you have to do is hit the steering wheel every once in a while. [00:29:18.080 --> 00:29:23.200] But, like, it's changing lanes, it's handling, like, pretty tough traffic situations. [00:29:23.200 --> 00:29:28.800] And I mean, you do not feel any more useless in your life [00:29:28.800 --> 00:29:31.600] than sitting there just with your hands crossed, [00:29:31.600 --> 00:29:37.440] going down the road, looking around. Everybody's like, hi, I'm watching the road here, I'm doing great, [00:29:37.440 --> 00:29:40.080] and you're just sitting there like, what do I do with my hands? [00:29:41.680 --> 00:29:44.320] Hands on the wheel, yeah. But no, I mean, [00:29:44.320 --> 00:29:51.760] at first it felt so weird, but two days later, I can see why people, you get so used to it. [00:29:51.760 --> 00:29:57.440] All you have to do is just program yourself to, like, nudge the steering wheel every, like, 30 seconds. [00:29:57.440 --> 00:29:59.760] And I'm telling you, it [00:29:59.760 --> 00:30:06.160] gets a little confused sometimes, but I'm saying 99% of the time, you don't have to do shit. [00:30:06.160 --> 00:30:11.120] That is the dream as far as I'm concerned, right? Like, I want to sit in the back of my damn car, [00:30:11.200 --> 00:30:14.480] working on my computer, while that thing drives me there.
So that's killer. [00:30:14.480 --> 00:30:17.520] And it just got me thinking, like, [00:30:17.520 --> 00:30:20.080] first of all, [00:30:20.080 --> 00:30:25.120] I mean, that's truly unbelievable, and I'm talking about traffic situations, [00:30:25.120 --> 00:30:27.360] like, I'm talking freeways and stuff. [00:30:27.360 --> 00:30:30.640] So you're talking about a major thing that I do an hour a day [00:30:30.640 --> 00:30:35.120] just literally got lifted off my hands. How much does that cost after the trial? Twelve? [00:30:35.120 --> 00:30:39.040] No, they dropped it by $4,000, so I think it's $6,000. [00:30:39.680 --> 00:30:43.680] It's fixed? Yeah, it's a fixed fee. There's a monthly fee, you can pay for it every month. [00:30:43.680 --> 00:30:46.800] It's fixed until Elon changes it. See, that's what, okay, so, [00:30:46.800 --> 00:30:52.880] I mean, it's fixed, like, one time, it's a one-time thing. So Tesla prices it, but it's just software. [00:30:52.880 --> 00:30:56.320] They could, it's just a button. Yeah. They said, so, [00:30:56.320 --> 00:31:01.680] Elon in his call said they're not a car company. And the whole knock on Tesla was, [00:31:01.680 --> 00:31:05.360] um, one, their manufacturing, even if it's more mature now, [00:31:05.360 --> 00:31:09.520] was never going to catch up, or it's going to be a while before it catches up to Toyota and Ford. [00:31:09.520 --> 00:31:14.560] But they make all their money off of parts, right? Like, they sell the cars for basically [00:31:14.560 --> 00:31:21.760] at cost or below cost. And then, I got a door fixed a couple weeks ago because my dumbass hit a fence post, but anyway, [00:31:21.760 --> 00:31:23.040] cost, [00:31:23.040 --> 00:31:28.000] I think it cost like two grand.
They've got to have insurance, but I'm like, my god. [00:31:28.000 --> 00:31:29.760] Anyway, so they're making a lot of money on that. [00:31:29.760 --> 00:31:35.440] Elon's play, Tesla's play, is software updates and ongoing costs there, because the car, [00:31:35.440 --> 00:31:39.040] I mean, the door still has a fixed cost for Ford or Honda or whatever. [00:31:39.600 --> 00:31:41.600] Tesla, I mean, once they build that software, [00:31:41.600 --> 00:31:44.880] you're supposed to be able to, you know, maintain it for pretty cheap. [00:31:44.880 --> 00:31:47.360] Well, one thing that you see, though, is, like, [00:31:47.360 --> 00:31:52.960] once you drop that Full Self-Driving update, on the screen, it changes the way the screen looks, [00:31:52.960 --> 00:31:57.200] and you see what the computer is interpreting in real time, [00:31:57.200 --> 00:32:02.320] how it's observing its surroundings, and it has way more information. [00:32:02.320 --> 00:32:06.560] Like, it gives you an overhead view of all the things that are going on around you, [00:32:06.880 --> 00:32:12.320] and it has a better view of what's going on around it than any human ever could, and it is [00:32:12.320 --> 00:32:20.080] simply just navigating.
Yeah, and you're just like, wow, that is totally analogous to AI right now, [00:32:20.080 --> 00:32:23.600] where it's on the precipice of literally being able to do [00:32:23.600 --> 00:32:27.280] almost every basic thing that we have to do [00:32:27.280 --> 00:32:30.480] better than we can do it. And the self-driving can even be an asshole. [00:32:30.480 --> 00:32:35.920] Did you see the video where it was in traffic and it just, like, straight up went out around and cut? [00:32:36.720 --> 00:32:38.720] I was like, yeah. [00:32:38.720 --> 00:32:44.240] You can set, like, what's your asshole level. [00:32:44.240 --> 00:32:47.920] The whole time you're like... [00:32:47.920 --> 00:33:01.120] Yeah. Also, I'm reading Elon's book, or not his book, Walter Isaacson's biography of him, and [00:33:01.120 --> 00:33:03.920] it's interesting, like, he mentioned that [00:33:04.560 --> 00:33:08.160] he's always been trying to get to the car without a steering wheel. [00:33:08.160 --> 00:33:10.560] He's been trying to push that so hard, [00:33:10.560 --> 00:33:15.600] you know, and every meeting, like, can we do it without a steering wheel, right, and the regulators say no. [00:33:15.600 --> 00:33:18.880] But then he's also trying to [00:33:18.880 --> 00:33:24.880] get everybody to buy into the concept that, you know, hey, how can we [00:33:24.880 --> 00:33:28.480] all make this a cheaper thing, right? [00:33:28.480 --> 00:33:32.800] How can we bring this down to a level where everybody can afford it? [00:33:32.800 --> 00:33:35.360] People are going to look back at Elon Musk 10 years from now, and anyway, [00:33:35.360 --> 00:33:38.720] say that guy's the reason why no one drives a car anymore. [00:33:38.720 --> 00:33:43.760] You just get in and it goes. And in a world where everybody gets in and it goes, no, there's no car accidents. [00:33:43.760 --> 00:33:47.280] Yeah, there's no drunk driving accidents.
Yeah, no one dies. [00:33:47.280 --> 00:33:52.880] Until somebody hacks the whole thing and literally drives us all off. [00:33:52.880 --> 00:33:56.480] Because that's the other thing that you sit there and think about when you have nothing else to do, [00:33:56.480 --> 00:33:59.440] you're like, what if someone just hacked this right now? [00:33:59.760 --> 00:34:04.320] Russia comes in and takes over the network. Yeah, so I just finished 3 Body Problem, if you haven't seen it. [00:34:04.320 --> 00:34:11.680] Yeah, there's an episode where cars get hijacked and they try to run into that guy. Yeah, anyway, no spoilers on that. [00:34:11.680 --> 00:34:14.080] But, uh, is it subtitled? [00:34:14.080 --> 00:34:20.720] Uh, no, it's actually just in English. Yeah. It's by the guys who made Game of Thrones. Yeah. Also, the books are really good. [00:34:20.720 --> 00:34:28.080] You lost me at Game of Thrones. No? Too many characters. Yeah. Yeah. Well, I do Audible, so I'm like a, [00:34:28.560 --> 00:34:30.560] like a child, [00:34:30.560 --> 00:34:40.240] it's read to me, in my head. [00:34:40.240 --> 00:34:45.120] That's true. That's true. I do like to give the disclaimer, though, because that's what people will say to me. I'll be like, wow. [00:34:45.120 --> 00:34:48.160] The stories get in there. Everyone does it. [00:34:48.160 --> 00:34:54.880] All right, sorry. Go on now. It's a good show, though.
I mean, but yeah, the car gets hijacked, and of course that... [00:34:55.200 --> 00:35:02.560] Yeah, I always wonder, like, well, you know, are we going towards a protocol with Ford and, like, because they have their Super Drive or whatever the hell it's called, [00:35:02.560 --> 00:35:08.000] yeah, you'd still have to watch out for pedestrians and that kind of stuff, [00:35:08.000 --> 00:35:11.520] but if every car is talking to each other, then that makes accident avoidance... [00:35:11.520 --> 00:35:17.600] Yeah, you'd also have the cars going way faster, and they'd have way better traffic management. [00:35:17.600 --> 00:35:21.360] Well, you know the whole, I forget what the phenomenon is called, but, like, [00:35:21.680 --> 00:35:26.240] I mean, in traffic, when someone stops, the person behind them stops a little bit earlier. [00:35:26.240 --> 00:35:32.400] Yeah, and, like, you've ever been on a highway and there's traffic, and then you get to a point where it starts going, and you're like, [00:35:32.400 --> 00:35:34.720] what were we doing? [00:35:34.720 --> 00:35:40.160] That literally happened to me this morning. Yeah, and that's just because some jackass cut someone off a mile up. And, [00:35:40.160 --> 00:35:43.840] anyway, yeah, right? So, back to the show here. [00:35:43.840 --> 00:35:49.840] I am interested to hear about [00:35:50.000 --> 00:35:55.680] the personal finance piece, because I know we talked about more of the institutional and trading. But, like, I mean, [00:35:55.680 --> 00:35:57.680] we all see, you know, [00:35:57.680 --> 00:36:02.240] different things coming out, whether it's the Rocket Moneys or the different NerdWallets or the apps that are helping. [00:36:02.240 --> 00:36:09.680] I mean, people have more access to data, aggregated across different platforms, that tells the story of what's going on with their personal [00:36:09.680 --> 00:36:16.800] financial information.
What is the AI landscape there, and, like, what's the cool stuff? [00:36:16.880 --> 00:36:19.920] What's happening, and what are some of the things to consider? [00:36:19.920 --> 00:36:24.480] Yeah, so from what I've seen, you know, there's a lot of sort of, [00:36:24.480 --> 00:36:26.240] you know, [00:36:26.240 --> 00:36:28.240] use cases within [00:36:28.240 --> 00:36:30.400] financial companies. [00:36:30.400 --> 00:36:32.720] For example, there's a use case around, [00:36:32.720 --> 00:36:37.040] you know, customer service, right? So I've seen a lot of voice-activated, [00:36:37.040 --> 00:36:44.240] voice-trained AI that does calls that you typically would expect [00:36:45.200 --> 00:36:51.680] a human to do, right? So I see a lot of use cases where you're trying to take the [00:36:51.680 --> 00:36:54.320] time-consuming [00:36:54.320 --> 00:36:58.800] piece, where a lot of humans have to interact with other humans, where it's just [00:36:58.800 --> 00:37:01.520] asking those same questions repeatedly. [00:37:01.520 --> 00:37:05.200] So I'm seeing a lot of work that companies are doing [00:37:05.200 --> 00:37:11.600] that's helping them, you know, solve the customer service. So this is a robocall, kind of? Yeah, [00:37:11.600 --> 00:37:18.160] like a robocall, but the robocalls are very human-sounding. Yeah, you know, they take information. [00:37:18.160 --> 00:37:25.360] And everyone's heard, seen that Apple commercial, where they try to sell you the goggles, right? [00:37:25.360 --> 00:37:31.120] And, you know, they keep asking you the questions, and all the right things to make you end up, [00:37:31.120 --> 00:37:35.520] you know, paying monthly for it instead of buying the glasses, right? [00:37:35.520 --> 00:37:37.520] So it's that kind of thing. That's [00:37:38.000 --> 00:37:42.720] sort of the first level.
The second level I see is the data, [00:37:42.720 --> 00:37:47.360] um, data use cases, right? So you can take people's data, [00:37:47.360 --> 00:37:54.640] their personal data and their experiences with your company, and you're able to create curated [00:37:54.640 --> 00:37:59.920] solutions for them that you may not have done in a mass way, [00:37:59.920 --> 00:38:02.640] you know, when you would have done it, like, [00:38:03.280 --> 00:38:08.160] one by one, personally. So if an advisor is serving a client, um, you know, they'd have to basically [00:38:08.160 --> 00:38:14.160] be able to do that same work with hundreds of clients, which takes a lot of man-hours. [00:38:14.160 --> 00:38:19.520] Whereas now you can use that data and create those sort of deliverables without [00:38:19.520 --> 00:38:25.280] using, you know, that human, right? So are you guys currently using that, like, generative AI? [00:38:25.280 --> 00:38:33.120] We're thinking about it, right? So we're trying to apply it to our user journey, [00:38:33.440 --> 00:38:35.360] so, making onboarding easier, right? [00:38:35.360 --> 00:38:42.240] How can we make onboarding sort of less painful? Because as a financial advisor, you need to gather everyone's information [00:38:42.240 --> 00:38:50.720] for everything. So, like, how can onboarding be easier? How can, uh, regular, you know, sort of discipline within your [00:38:50.720 --> 00:38:55.600] financial management become easier, where you're getting the right alerts, you're getting the right [00:38:55.600 --> 00:39:00.480] sort of information, so you can stay on track with where you're supposed to be going? [00:39:00.960 --> 00:39:07.760] Um, and in terms of, like, you know, those personal finance apps that you're mentioning, I think we're also seeing, [00:39:07.760 --> 00:39:15.680] um, you know, sort of higher-level chatbots, right?
So there's these things where they're not only just answering your questions, [00:39:15.680 --> 00:39:22.160] they're proactively asking you questions, right? So when you go into some of these new systems, like, they actually [00:39:22.160 --> 00:39:26.960] already know everything about you, and they're asking questions about [00:39:27.920 --> 00:39:31.840] what you should be asking yourself but didn't think about. [00:39:31.840 --> 00:39:36.480] And so I think that's kind of the direction where a lot of these financial apps are going. [00:39:36.480 --> 00:39:40.160] And I think what you're gonna see, you're gonna see a lot of disruption [00:39:40.160 --> 00:39:43.280] in the space, right? And whether it's the [00:39:43.280 --> 00:39:49.760] insurance salesman's space or the financial advisor's space or, you know, the accountant space, [00:39:49.760 --> 00:39:52.080] there's gonna be a lot of disruption, because [00:39:52.080 --> 00:39:55.440] a lot of the things that are on the back side, [00:39:56.080 --> 00:40:01.920] you know, the operations side of things, can be made more efficient using some of these technologies. [00:40:01.920 --> 00:40:04.400] Yeah. And so you're gonna see a lot better [00:40:04.400 --> 00:40:07.360] services, you're gonna [00:40:07.360 --> 00:40:12.880] provide a better service to your clients, you know, because you're gonna be able to utilize some of these technologies. [00:40:12.880 --> 00:40:15.280] Um, I don't see a full automation [00:40:15.280 --> 00:40:18.080] in the next few years. I think that's [00:40:18.080 --> 00:40:22.320] because, well, for a lot of these types of [00:40:23.440 --> 00:40:31.680] companies, the personal interaction is important. For people to know that there's another person I can call or talk to or relate to about [00:40:31.680 --> 00:40:39.760] my finances, um, is an important thing.
I don't see that going away. Wait, but, like, what about, so, [00:40:39.760 --> 00:40:43.600] um, I have a personal finance question, [00:40:43.600 --> 00:40:46.800] and maybe I need to get educated before I go and talk to an expert. [00:40:46.800 --> 00:40:51.760] Like, do you see that as an opportunity, where once I get to you or to one of your employees, [00:40:52.080 --> 00:40:57.760] I now have a better idea of what questions to ask, and then I can give you my specific situation and you can advise? [00:40:57.760 --> 00:40:59.920] Yeah, I think that's an opportunity. [00:40:59.920 --> 00:41:07.440] It's such a regulated space, too. So, like, when you have a chatbot giving a piece of advice, [00:41:07.440 --> 00:41:09.440] that could be [00:41:09.440 --> 00:41:13.360] very dangerous to your firm, right? So unless you really have that nailed down. [00:41:13.360 --> 00:41:17.600] So, um, you know, one of the hats that I wear is, I'm a chief compliance officer as well, [00:41:17.600 --> 00:41:21.280] so one of the things that I'm always starting to think about is, like, you know, [00:41:21.280 --> 00:41:24.320] hey, am I gonna be able to defend this? [00:41:24.320 --> 00:41:26.800] So that's gonna be critical, [00:41:26.800 --> 00:41:30.800] um, because of the hallucinations and, yeah, things that could happen. [00:41:30.800 --> 00:41:34.640] Um, you really need to nail it down, and I know there's, [00:41:34.640 --> 00:41:37.600] uh, companies that are starting out that are [00:41:37.600 --> 00:41:42.960] completely going with this, and they're gonna face a lot of regulatory hurdles. [00:41:42.960 --> 00:41:49.200] They don't know what they're getting into. It's a lot of these, like, you know, Silicon Valley, you know, [00:41:49.680 --> 00:41:53.360] um, right-out-of-college, like, hey, I'm gonna say it's fintech, that means it's... [00:41:53.360 --> 00:41:58.080] Yeah, it's fine.
Yeah, it's gonna be no different than the mental health people that are building these AIs to do [00:41:58.080 --> 00:42:00.960] therapy, right? It's just like, [00:42:00.960 --> 00:42:03.600] y'all are going out... I mean, but Uber did it, right? [00:42:03.600 --> 00:42:07.760] So Uber came out and was like, we're gonna do rideshare. They're like, you can't. He's like, fuck it, [00:42:07.760 --> 00:42:11.520] we're gonna do it anyway, I don't care what you say, we'll just keep paying the fines, right? [00:42:11.520 --> 00:42:18.720] What are your thoughts on basically the first phase of that fintech stuff, like with the Robinhoods and the Plaids and the [00:42:19.120 --> 00:42:24.080] Stripes, and, like, kind of the disruption of the literal [00:42:24.080 --> 00:42:30.480] ability for people to get and transfer money and to trade? Like, what has that layer [00:42:30.480 --> 00:42:38.880] done to lay the foundation for these new tools? Like, has that mega-disrupted the financial advice industry? [00:42:38.880 --> 00:42:42.400] Yeah, I think what we've got is, you've got, um, [00:42:42.400 --> 00:42:44.800] the ability to build [00:42:44.800 --> 00:42:46.800] apps faster, because you've got [00:42:47.360 --> 00:42:49.360] these API layers, [00:42:49.360 --> 00:42:54.720] um, like Plaid, you know, some of these other companies that have built sort of [00:42:54.720 --> 00:43:00.880] the ability for you to build an app, so anybody can build one, anybody can move money. Yeah, it's so easy. [00:43:00.880 --> 00:43:07.920] Like, y'all, you just need to apply the integration, they've got to be... What is the, like, minimum requirement for you to [00:43:07.920 --> 00:43:14.960] put Plaid in your app? Dude, I think it's literally, like, you just sign up and you insert the JavaScript function.
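For a sense of what "just sign up and insert the JavaScript function" roughly involves: your server asks Plaid for a link token, and Plaid's drop-in browser widget (Plaid Link) uses it to walk the user through connecting their bank. The sketch below only builds the request body for Plaid's `/link/token/create` endpoint and sends nothing; the credentials, app name, and product list are placeholders, and field names are per my reading of Plaid's docs, not anything stated in the episode.

```python
import json

def link_token_request(client_id, secret, user_id):
    # Body for POST https://sandbox.plaid.com/link/token/create
    # (placeholder credentials; nothing is actually sent here).
    return {
        "client_id": client_id,
        "secret": secret,
        "client_name": "Example App",
        "user": {"client_user_id": user_id},
        "products": ["transactions"],
        "country_codes": ["US"],
        "language": "en",
    }

body = link_token_request("CLIENT_ID", "SECRET", "user-123")
print(json.dumps(body, indent=2))
# The link_token Plaid returns then goes to the page's JavaScript,
# roughly Plaid.create({token: ...}), which is the drop-in widget step.
```

That thin server-plus-widget surface is why even a small app can offer bank connections that once required a bespoke integration.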
Yeah, literally. [00:43:15.840 --> 00:43:20.480] Yeah, I have an account with a small regional bank in Kentucky, [00:43:20.480 --> 00:43:24.400] and they are able to connect to my Wave account, [00:43:24.400 --> 00:43:26.720] and I was like, because they're using Plaid, right? [00:43:26.720 --> 00:43:32.880] Yeah, and I'm like, there's no way this company should have been able to afford that integration, and Plaid has enabled that. So, [00:43:32.880 --> 00:43:37.360] one thing that scares me about personal finance in the future is, [00:43:37.360 --> 00:43:43.760] and it's because, you know, I'm a user of that, and I don't have, like, a super high protection layer, [00:43:43.760 --> 00:43:46.480] like a financial advisor and some, like, stiff [00:43:46.480 --> 00:43:53.440] thing. Which you need, to get you in... Exactly, exactly. But, like, the, [00:43:53.440 --> 00:43:57.440] when you put money into it, [00:43:57.440 --> 00:44:04.080] it essentially becomes vapor, you know? It's not like there's a mattress that this cash is sitting in, [00:44:04.080 --> 00:44:06.400] right? And so, like, [00:44:06.400 --> 00:44:07.040] it... [00:44:07.040 --> 00:44:07.920] is... [00:44:07.920 --> 00:44:14.320] Is it too centralized at some point? Or, like, can people trust [00:44:14.320 --> 00:44:16.000] the [00:44:16.000 --> 00:44:23.120] financial markets in a world where things move so quickly and decisions are made so fast and money moves so quickly? [00:44:23.120 --> 00:44:31.040] Like, you know, there's no waiting periods, it's instant. You know, instant Venmo transfer, instant Robinhood, instant everything, 5%, 10%, you know. [00:44:31.040 --> 00:44:35.840] Like, are we one crisis away from the shit [00:44:35.840 --> 00:44:40.800] just hitting the fan, or are there safeguards in place to keep it all together?
[00:44:40.800 --> 00:44:43.520] I think there are safeguards in place. [00:44:43.520 --> 00:44:45.280] The, [00:44:45.280 --> 00:44:48.800] there's always risk, you know, obviously technology, [00:44:48.800 --> 00:44:54.000] sort of cybersecurity issues, where, you know, there's mass sort of [00:44:54.000 --> 00:44:57.600] exposure, right? But at the same time, [00:44:57.600 --> 00:45:04.640] I feel like there's enough decentralized players. Like, for us at Invest, we're never holding anybody's money, [00:45:04.960 --> 00:45:10.240] so we use a custodian, Charles Schwab. So Charles Schwab is our custodian, [00:45:10.240 --> 00:45:14.160] we have other custodians that we have on the platform as well, and they're [00:45:14.160 --> 00:45:17.440] holding the money. So at that point, [00:45:17.440 --> 00:45:22.400] we've got, you've got a layer of protection. So what you see is that [00:45:22.400 --> 00:45:26.320] people are wanting to put those layers of protection in place [00:45:26.320 --> 00:45:30.880] when they handle money. Like, I wouldn't want to handle your money, I wouldn't want to touch your money, [00:45:30.880 --> 00:45:36.720] right? I would never want the impression that I'm not handling it correctly. [00:45:36.720 --> 00:45:42.640] So it's still just essentially the same risk that we had in 2008, or any other financial market risk. [00:45:42.640 --> 00:45:49.680] It works like, it's like all those banks doing cash sweeps to, like, established banks. [00:45:49.680 --> 00:45:53.040] So it's no different than that. That's right.
But what about the risk of, like, all right [00:45:53.040 --> 00:45:57.040] Robinhood. I'm a huge proponent of it, retail investors [00:45:57.040 --> 00:46:01.440] It gave a lot of people who weren't investing access to investing, right? [00:46:01.440 --> 00:46:04.160] But they don't have the tools that [00:46:04.160 --> 00:46:09.360] These bigger players have, and they're never going to be able to make decisions as informed or as quick [00:46:09.360 --> 00:46:12.720] So is it like, those retail investors [00:46:12.720 --> 00:46:15.520] Just have to go get the ETFs and call it a day? [00:46:15.520 --> 00:46:21.200] Or is there any world where they can even compete in robo trading and all that? Like, what are your thoughts? [00:46:21.200 --> 00:46:23.760] It's an interesting question because [00:46:23.760 --> 00:46:29.920] One of the things that we've learned is that 90% of money managers can't beat the S&P 500 [00:46:29.920 --> 00:46:34.720] Yeah, so that's something that's really hard for people to realize. Like [00:46:34.720 --> 00:46:39.520] There's pretty much complete information out there [00:46:39.520 --> 00:46:46.480] Like, people know what's out there. The information is baked into the price, right? Like, I mean, yeah [00:46:46.480 --> 00:46:51.040] Yeah, yeah, everything's baked into the price. Right, so what you have is this situation where [00:46:51.440 --> 00:46:54.560] Um, you know, like a person that is sort of [00:46:54.560 --> 00:46:58.480] Learning, and they could be making some decisions [00:46:58.480 --> 00:47:01.760] That are probably super suspect, but they could have like really great results [00:47:01.760 --> 00:47:07.040] Yeah, because it's not how everybody's thinking. So actually, that's not really a crutch [00:47:07.040 --> 00:47:13.200] You know, a lot of it is your willingness to learn and your willingness to be able
to, like, [00:47:13.200 --> 00:47:19.600] Take whatever outcomes you have and process them in a way that can help you the next time you make those decisions [00:47:19.760 --> 00:47:25.040] Yeah, and so it's really, like, I've seen it where, you know, when I hired traders [00:47:25.040 --> 00:47:29.040] I've seen guys that have a tremendous amount of experience that [00:47:29.040 --> 00:47:31.680] Didn't do anything good [00:47:31.680 --> 00:47:34.080] Whereas, like, somebody who didn't have much experience at all [00:47:34.080 --> 00:47:39.280] Was able to process the data and their experience and have tremendous success [00:47:39.280 --> 00:47:44.800] Yeah, that reminds me. I was playing poker last weekend, and I got a bad beat and ended up naked [00:47:44.800 --> 00:47:46.240] It was weird [00:47:46.240 --> 00:47:48.240] I don't know why we were playing strip [00:47:48.800 --> 00:47:50.240] The old pokes home [00:47:50.240 --> 00:47:53.360] Yeah, so we threw down our cards, and I'm like, dude [00:47:53.360 --> 00:47:59.040] Why did you keep betting? You know, he got something on the river, of course, and I was like, why are you still in that hand? [00:47:59.040 --> 00:48:01.520] And he was like, I don't know. He doesn't know how to play poker [00:48:01.520 --> 00:48:05.760] So he was just like, I don't know, I thought I was gonna bluff you out. I was raising you [00:48:05.760 --> 00:48:09.360] You weren't. Anyway, I'm still just a little bit mad about that [00:48:09.360 --> 00:48:13.440] I was willing to be mad, so [00:48:14.240 --> 00:48:20.800] I've had those experiences too. Yeah, like, you do everything right and you don't make money. You lose money [00:48:20.800 --> 00:48:25.040] Yeah, it's frustrating.
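That "do everything right and still lose" dynamic is just variance on a positive-edge bet. A minimal sketch of the idea, where all the numbers (the win rate, the flat one-unit stake, the starting bankroll) are illustrative assumptions rather than anything from the conversation:

```python
import random

def simulate_bettor(p_win=0.8, payout=1.0, stake=1.0, bankroll=100.0,
                    n_bets=1000, seed=7):
    """Repeatedly take a bet with win probability p_win.

    Each bet risks `stake`; a win pays `payout` times the stake.
    Stops early if the bankroll (the "war chest") can no longer
    cover the stake. Returns the final bankroll.
    """
    rng = random.Random(seed)
    for _ in range(n_bets):
        if bankroll < stake:
            break  # busted before the edge could play out
        if rng.random() < p_win:
            bankroll += stake * payout
        else:
            bankroll -= stake
    return bankroll

# Win 80 out of every 100 on average: individual bets still lose,
# but over many bets the chest trends well above the starting 100.
print(simulate_bettor())
```

The point of the sketch is the one made on the show: the edge only pays off if the betting size is small relative to the war chest, so that a run of bad beats can't wipe you out before the probabilities assert themselves.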
Yeah, but it's part of the process [00:48:25.040 --> 00:48:29.760] You know, you're betting. You make that bet a hundred times, you win 80 of them. Yeah [00:48:29.760 --> 00:48:33.440] You were talking about this earlier. It's all probabilities, right? So like [00:48:33.440 --> 00:48:36.160] They're probabilities for a reason. It's not a sure thing [00:48:36.160 --> 00:48:40.400] You're just putting more money into the things that have better probability, and then you're shorting things that have less [00:48:40.880 --> 00:48:45.600] And in the long run, if you have a big enough chest, you know, war chest, you're gonna win [00:48:45.600 --> 00:48:50.480] You're gonna come out ahead. But if you don't, then you can either win big or lose it all, kind of thing, right? [00:48:50.480 --> 00:48:54.000] Yeah, it's your betting size and your betting strategy. Yeah, yeah [00:48:54.000 --> 00:48:57.520] So, um, in your space [00:48:57.520 --> 00:49:04.640] Let's say five to ten years from now, which jobs do you see that could potentially be outsourced to an AI? [00:49:05.680 --> 00:49:11.280] Yeah, so that's a real controversial question. Oh, totally. I do lead people right now. Yeah, yeah, yeah [00:49:11.280 --> 00:49:13.920] I'm just saying this because I'm not [00:49:13.920 --> 00:49:20.400] No, I get it. It's a very sticky question, but it's where we're at. We need to start thinking about [00:49:20.400 --> 00:49:23.200] Okay, these types of jobs very well could be replaced [00:49:23.200 --> 00:49:26.720] And if you're in that job, you need to start thinking about what you can do [00:49:26.720 --> 00:49:29.840] Well, maybe instead of answering that directly, maybe give some advice to [00:49:30.080 --> 00:49:35.600] The folks that are in the industry about what they can do to insulate themselves.
Yeah, so I feel like [00:49:35.600 --> 00:49:41.040] I think, you know, we think about things abundantly, right? Yeah. So when you think of abundance [00:49:41.040 --> 00:49:44.560] You think of things like, okay, food shortage [00:49:44.560 --> 00:49:47.920] Right, like people have been worried about food shortages for hundreds of years [00:49:47.920 --> 00:49:53.440] And what do you have now? You have more food than you've ever had in history, right? [00:49:53.440 --> 00:49:59.280] Same thing with clothing, or, you know, we've reduced famine and hunger and [00:50:00.000 --> 00:50:04.960] Low income. Things like that are actually improving, right? And people don't talk about that [00:50:04.960 --> 00:50:07.200] People talk about all the negative stuff, right? [00:50:07.200 --> 00:50:15.280] So when we think about AI, I want to highlight the positive things that it does for people. And actually, if you are [00:50:15.280 --> 00:50:19.920] Worried about that, it's an opportunity for you to upskill [00:50:19.920 --> 00:50:27.040] You know, you upskill. You see, like, OpenAI just hired an engineer for $900,000 a year [00:50:27.360 --> 00:50:30.960] You're like, okay, we're not replacing coders right now [00:50:30.960 --> 00:50:36.000] All right. Yeah, so if you're a good coder, if you're good at what you do [00:50:36.000 --> 00:50:41.680] You're gonna be valuable. So if we move that maybe two, three years down the road [00:50:41.680 --> 00:50:48.560] You're still gonna be super valuable.
You're still gonna be able to, you know, understand the technology, play with the technology [00:50:48.560 --> 00:50:51.920] But that's it, you have to be comfortable with the technology [00:50:51.920 --> 00:50:57.280] And I think that maybe is the key here. Those who are like, "I'm not going to do anything with it" [00:50:57.360 --> 00:51:02.160] It's like, well, then you're gonna die. Yes, you know. You need to be comfortable with it, and then you can [00:51:02.160 --> 00:51:08.320] Parlay that into all sorts of different stuff. Yeah, there's definitely a chasm [00:51:08.320 --> 00:51:15.120] That's widening, which is: the folks that are leveraging and using [00:51:16.960 --> 00:51:23.280] AI and other new technologies, and getting good at doing stuff in different ways, because [00:51:23.680 --> 00:51:30.960] With the advent of AI and the advent of new technologies, things change. Things are easier, things are better, things are different [00:51:30.960 --> 00:51:36.640] And, you know, it's like the difference between using DOS and using macOS, right? [00:51:36.640 --> 00:51:38.960] Right, like, you know, it's just, yeah [00:51:38.960 --> 00:51:44.640] I wonder, when the word processor came out, if editors were like, "I refuse to use it," and they're still editing by hand [00:51:44.640 --> 00:51:51.040] I mean, guys, I finally learned how to share a Notion page with my clients publicly, so [00:51:52.000 --> 00:51:56.880] I feel like that is the best feature of Notion. That's the only reason I use Notion [00:51:56.880 --> 00:51:58.880] I actually 100% agree. Like [00:51:58.880 --> 00:52:04.640] We were talking about Notion last week. I don't know, have you used it?
Yeah, okay. And it's essentially [00:52:04.640 --> 00:52:11.040] My favorite feature by far is that I can make a document and then I can give someone a public URL [00:52:11.040 --> 00:52:13.280] Instead of a Google Doc or whatever. Yeah, so [00:52:13.280 --> 00:52:18.000] It's the only reason that I use Notion over Obsidian, and we talked about that a little bit last time [00:52:18.000 --> 00:52:22.240] Yeah, Obsidian. Because we have, you know, other team members, that was my other hang-up with it [00:52:22.240 --> 00:52:26.720] But yeah, and the syncing was kind of weird. The syncing kind of sucked [00:52:26.720 --> 00:52:31.520] But yeah, I mean, it's almost like you're building a little website. Yeah, my entire [00:52:31.520 --> 00:52:35.600] I had an open source app, and I built the entire documentation [00:52:35.600 --> 00:52:41.040] In Notion, and just linked it, and then I mapped docs.nomie.app [00:52:41.040 --> 00:52:47.760] To my Notion URL, and that was it. And, oh, another thing that's killer about Notion is you can actually [00:52:48.640 --> 00:52:53.600] Copy from Figma. Right-click on your layout in Figma, copy the URL [00:52:53.600 --> 00:52:59.360] Paste that into Notion, and it's going to give you a rendered version of your thing [00:52:59.360 --> 00:53:02.320] And if you go update it in Figma, it's automatically gonna get updated for you [00:53:02.320 --> 00:53:04.960] So it does that with [00:53:04.960 --> 00:53:07.520] Any Google Doc. Oh, does it? [00:53:07.520 --> 00:53:09.840] It also [00:53:09.840 --> 00:53:11.840] So there's a [00:53:11.840 --> 00:53:17.680] There's a feature in Notion where essentially you paste in a URL and it will attempt to create a [00:53:17.680 --> 00:53:23.760] Live view of it.
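Links can carry data as well as point at it: anything you already know about a user can be packed into a URL's query string and read back on the receiving end. A minimal sketch using Python's standard library (the base URL and field names here are invented for illustration):

```python
from urllib.parse import urlencode, urlparse, parse_qs

def build_prefill_link(base_url: str, known_fields: dict) -> str:
    """Encode fields we already know into the link itself, so a
    form behind it never has to ask for them again."""
    return f"{base_url}?{urlencode(known_fields)}"

def read_prefill(url: str) -> dict:
    """Recover those fields on the receiving end."""
    query = parse_qs(urlparse(url).query)
    return {key: values[0] for key, values in query.items()}

link = build_prefill_link("https://example.com/onboarding",
                          {"name": "Ada Lovelace", "plan": "pro"})
# The receiving page consumes the data straight from the link:
assert read_prefill(link) == {"name": "Ada Lovelace", "plan": "pro"}
```

`urlencode` handles the escaping (spaces, ampersands, and so on), which is what makes a link like this safe to paste, share, or unfurl anywhere a plain URL works.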
Yeah, yeah, that is ideal, and it just completely abstracts it [00:53:23.760 --> 00:53:28.480] That's the best thing about the internet, the ability to utilize a link [00:53:28.480 --> 00:53:33.840] If you can unfurl a link into something useful, you've created a great tool [00:53:33.840 --> 00:53:38.960] We talked about this at one of my clients today, where it was like, we've got to collect all this information from the user [00:53:38.960 --> 00:53:44.640] And I was like, what do we already know? Oh, well, we know this, we know this, we know this [00:53:44.800 --> 00:53:50.240] And we also know this. I said, that's literally the first three pages of this form you were going to have them fill out [00:53:50.240 --> 00:53:55.440] It's just something you could have encoded into the URL. You actually only need this [00:53:55.440 --> 00:54:00.480] It's a one-pager, go have a great day, and the rest just gets consumed from the link [00:54:00.480 --> 00:54:02.320] So [00:54:02.320 --> 00:54:07.200] Yeah, I truly appreciate technologies like that. That's why I think the web still wins [00:54:07.200 --> 00:54:11.200] Yeah, it's all about the protocol. We've talked about protocols a lot on this [00:54:11.600 --> 00:54:14.720] Show, and, uh, yeah, if something is [00:54:14.720 --> 00:54:23.920] Replicable like that, it's great. Yeah, I agree. Absolutely. Well, today's been great. I really appreciate your time [00:54:23.920 --> 00:54:30.560] I appreciate the advice and the information. If you haven't, check out Invest. It's iNVST.com [00:54:30.560 --> 00:54:38.480] And Super Amplify. I am Sean. We've been with Dipen Mehta, Brandon, and Jacob today [00:54:38.480 --> 00:54:41.280] This is the Big Cheese Podcast. We'll see you next season. Thank you [00:54:41.280 --> 00:54:42.120] - Thank you.