Podcast Recording

Charlie Evert

00:00-58:06

Transcription
Two individuals are discussing a new technology called Rabbit R1. They discuss the hype surrounding new technologies and the potential usefulness of Rabbit R1. They also talk about their experiences with previous technologies and their initial impressions of the iPhone when it was released. They then describe the features of Rabbit R1, including its design, the Rabbit OS operating system, and the use of a large action model for seamless interaction with various websites and applications. They give an example of ordering a pizza using Rabbit R1.

Am I recording? I am recording. Are you? Yep, I'm recording. Wonderful. Okay, so I'm going to do what's called a clap-in, where we just go 3, 2, 1, and we'll both clap. What this does is it adds a marker to the audio that I can then use to line it up after I get your copy. Sound cool? Sounds good. Alright, ready? 3, 2, 1. Strike it. 3, 2, 1. And one more. 3, 2, 1. Fantastic. Can you hear me? Because I didn't hear those claps at all on your end. Yeah, yeah. I guess I've got to step those up, huh? No, no, no. Your microphone's good. I'm just hoping that your microphone picked it up. But it's all good. I'll be able to figure it out. Awesome. Alright. So generally what I'll do is I'll start off and I'll be like, Hey, I'm Chris Cantlie. And then you'll say your name and then we'll just kind of get into it if that's cool. Yeah, absolutely. Alright, awesome. Give me a second. I'm going to pour another drink. Alright, ready? Yep. Hey, this is Chris Cantlie. And this is Charlie Evert. And Charlie's going to be a guest host today as we talk about a new technology that's come out. And Charlie, I know that you've looked at this technology and seen kind of the press around it, the hype around it. First off, how do you respond to hype? Because there's new technologies about like every day. I think there's well-deserved hype and then there's things that could be hype. 
I think when it comes to things like Rabbit R1, it's somewhere in between where it could be very useful. It could be the next step in a lot of things like Amazon Alexa, Siri, and all of that. But it could also be a dud. I think with things like ChatGPT that came out, there was a lot of hype around it. But clearly it is very useful in a lot of ways. Where were you when the iPhone dropped? Do you remember that? I have no idea. I think I was in middle school, actually. Oh God, I feel old. Well, Charlie, first off, before we get started too far, go ahead and introduce yourself as it pertains to Deloitte and the business. Absolutely. Well, I am working as an AI and data consultant within the government and public services sector of Deloitte. I started recently in October and have been working on a lot of fun projects, mainly centered around generative AI. Yeah. And for those that don't know, I'm a solution architect working with the digital innovation team and particularly working on Sidekick. So this is kind of our lives at the moment, right? We're knee-deep into it and there's constantly news springing up on a daily basis. But I've seen a lot of technologies come and go. I've kind of come to the point now where I ignore a lot of the stuff that springs up or tries to grab attention because, let's face it, I've invested in a ton of technologies in the past. I've gotten excited about new technologies. I've been disappointed quite frequently. And then some of the ones that were exciting that I invested in ended up fizzling, like the service just became non-existent. I spent my money and the application, you know, the lights in the house turned off, right? So when it came to the iPhone, dude, I'm sitting there and I'm watching TV. I think I was in high school. Might have been even later. No, it was later than that, man. I was well into the workforce. So I'm watching and there's this presentation of, wouldn't it be cool if we had a telephone? 
You know, your music and, bless, what was the other thing, like text messaging or internet? What if we had these three things? And I'm thinking to myself, well, we've got that, right? We've got BlackBerry and you could surf the internet on BlackBerry and have the little Pearl and, like, I was like, you could listen to music on a number of devices. Like, I didn't get it. I really didn't see the vision at the time. And when I look back now, what I didn't realize is that it wasn't like those three things. It was the combination of that and an experience and just this wealth of applications and support from online to kind of make it the utility that it is today. But at the time, I missed it. I was like, this is a flash in the pan. It's going to come and go. It's embarrassing to say it now, to say that about the iPhone. It's just going to come and go. It's just an Apple product. So, I was browsing TikTok and this advertisement came up and it was like an advertisement for this little orange device and it had that Apple flair to the advertisement. It would accordion out. There'd be multiple devices and it would fan and it would pan and you'd have lens effects and lens flares and blurs and whatnot, and it had music that was very Apple-esque. If you've ever seen a recent Apple iPhone advertisement, it's very much like that. And then it said, releasing in 10 hours, you'll catch the drop for this in 10 hours. I was like, oh shit, that's this morning. So I went and I watched it and it was this 30-minute YouTube release. I think it was maybe a day or two before CES and it was exactly like a modern iPhone presentation. The one dude, giant screen, going on about what's wrong in the industry, how iPhones now have resorted to app hoarding in your hand that you've got to constantly drill in. Every app has its own very detailed interface. It's complex. It's small. It's become too cumbersome; that was the argument it was driving toward. 
And then he introduces Rabbit R1, which is kind of their solution to the problem. Now, I saw this video, this 30-minute drop for the product and at the end of it, I didn't know what to think because there was a part of me that was kind of like, this is probably just a flash in the pan. Somebody's got some really good marketing. But another part of me, after it was said and done, felt like this is that next generation. And I don't mean next technology generation, because it's kind of that too, and we'll get into it in a minute. But I literally meant appealing to that next generation of kids, of teens that are going to take this on. This could be that next iPhone. Not exactly, but it'll make more sense here in a moment, so I'm just going to stop there. What was your impression after you ended up watching that video, before we get into all the details? When I watched it, it really made a lot of things come to mind that I didn't necessarily think were pain points with smartphones. You go into a smartphone, and for me, I have quite a few credit cards. I love my points across different categories. And I need to log in and pay my bill on each individual account on a monthly basis. I could set up recurring payments, but sometimes it's good to just check in. And there's 13 of these that I need to open up and manually pay. So imagine having a system where instead of needing to go onto an app to view these things, you can simply say what are my credit card balances, pay my bills, and knock it all out at once. So I think from an automation standpoint, we're going from you don't have to do everything within an app, you can do multiple things at once. That is quite useful. Let's describe what this thing kind of does. So right off the bat here, what we're talking about is this little orange square. It almost looks like one of those 90s handheld televisions. I mean, don't get me wrong, a lot cooler. 
But it has a flip camera so that you can take pictures of you, take pictures of what's in front of you. It's got a scrolling wheel, it's got a button on the side, we'll get into that. It's got a 2.88-inch touch screen. And this little device, kind of like a little walkie talkie, you press the button and you just talk. And the whole point behind this device is that where iPhones currently, and of course historically, or even general mobile devices historically, have been a series of apps, a store of apps, an allotment of apps that compete for your attention, that doesn't exist in this interface. It's just a little digital rabbit. The applications are on the back end. They're not vying for your attention. It's just that when you want to use it, you just press the button and make a request. And the operating system, which is coincidentally called Rabbit OS, is built on what's called a LAM. Now this is different than an LLM, a large language model, where you type or you say something and it comes back with text. This is built on something new. This is the first time I'd actually heard of this in this way, which is a large action model. You have an AI model that's built on top of how you use websites. Potentially how millions of websites are used. Trained on the user experience and the user interface of a huge number of websites. Which means it understands buttons. It understands form fields. It understands progression through pages in order to order something. And now all you need to do if you want to order a pizza is you say, I want a pepperoni and cheese pizza. And it might come back and say, there are three locations near you. And I'll say, well I just want to use my favorite. And it'll come back and say, well you've used Marco's recently, do you want to use Marco's to order the pizza? Yeah. Maybe it asks an additional qualifying question, but then it places the order. It does it. 
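The pizza flow just described, a spoken request parsed into an intent, mapped to a learned sequence of UI steps, and executed only after a confirming button press, can be sketched as a toy pipeline. Everything here (the intent dictionary, the hard-coded step list, the class names) is an illustrative assumption; Rabbit has not published the internals of its large action model.

```python
from dataclasses import dataclass

@dataclass
class UIAction:
    """One learned step against a website's interface."""
    kind: str      # "click", "fill", or "submit"
    target: str    # the button or form field the model learned to use
    value: str = ""

def plan_actions(intent: dict) -> list[UIAction]:
    """Map a parsed intent to the UI steps a delivery site would need.

    A hypothetical hard-coded routine standing in for what a trained
    action model would produce.
    """
    if intent["task"] != "order_pizza":
        raise ValueError("unknown task")
    return [
        UIAction("fill", "search_box", intent["item"]),
        UIAction("click", "restaurant:" + intent["restaurant"]),
        UIAction("click", "add_to_cart"),
        UIAction("submit", "checkout"),
    ]

def confirm_and_run(actions: list[UIAction], approve: bool):
    """Nothing executes until the user presses the confirm button."""
    if not approve:
        return "cancelled"
    return [f"{a.kind}:{a.target}" for a in actions]

intent = {"task": "order_pizza",
          "item": "pepperoni and cheese pizza",
          "restaurant": "Marco's"}
steps = plan_actions(intent)
print(confirm_and_run(steps, approve=True))
```

The point of the sketch is the separation of concerns the presentation emphasized: the model plans the clicks and form fills, but the final submit stays behind a single user confirmation.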
And it's trained on a number of companies' websites and applications and interactions. It's a large action model. So rather than it being trained per se on language, which it does understand language, it's trained on taking that language and converting it into internet action, website action, UI action in the background. So rather than having icons up front and then tapping and going and doing the actions yourself, those are already pre-trained. All you've got to say is, I want a pizza. And DoorDash is what it interacts with in the background. And it has a history of what you've ordered in the past. You could say, give me my favorite item. And it will say, well your favorite item is pepperoni and cheese pizza. You'd be like, yes, I want that. And then you just hit submit. One button on the screen once you've confirmed that. Was that your takeaway? Yeah. I think there's quite a bit of convenience with using something like this. And honestly, to that point of convenience, I'm wondering what you think some of the unintended consequences of using an operating system like this would be. If you had a system where you could order a pizza and you say, order me a pizza, is it always going to go to Domino's? Is it always going to go to Pizza Hut? How is that going to work? I think you could specify. And I think a part of that is what it's already pre-trained on. And the operating system has the LAM built into the device. So it's not like right now where you have to call out to an API. It's built into the device. So you don't have to have a service. You're not paying for a service on the back end. It's part of the operating system. It's baked in. So you might think to yourself, how do you get new models and new websites? Well, you go to the website. You obviously have an account that's free with them. And then you log in and they have a secure method for handling that authentication. 
And then it sets up this environment where it acts as a proxy to you going and using a given website. And you logging in to that website. And it doesn't store any of that. But it does store the interaction with the site once you've gone beyond the login. And it just takes, I imagine on the back end, just takes the authentication token so that it can represent you in your app, in the device that you have, in your Rabbit device. But it's a bit like a macro recorder, where you just go in and proxy to the website you want. So let's say DoorDash online. And then I can go and place an order. Once I've done that, that, let's call it a macro, kind of like a macro, gets sent to my device into the LAM. And now it's programmed in. I can say, hey, use DoorDash to order me a pizza from Pizza Hut. Or Marco's. Or wherever. Or I could just say, send me a pizza. I don't care from where. And then it'll just do it for me. Whichever I've used in the past, whichever one I've got credentials to get into. And that's what's really powerful about this. Absolutely. To that point of you can pre-program everything. Where my head goes with this, where I see some risks being presented, is if you were to integrate this with, say, banking, if you had it integrated with your Vanguard account, and somebody in passing walks by and says, sell everything and buy GameStop. You know? There you go. Here's the thing, though. It's not like Siri. It's not like Amazon, where I go, hey, whatever. You have to press the button in order for it to hear you. And like a walkie-talkie, you press the button, and you talk, and then you let go. So somebody can't just walk along and trigger or activate your device and gain that kind of control over you. My hope is that there'll be maybe some degree of voice recognition as well, or some other biometric sensor that maybe figures out my face when I'm pressing the button just to confirm that it's me that's pressing the button. 
The concern there is, what if you left your device somewhere, and the only thing between a bad actor and your bank account is him pressing the button? That could be a problem. So I have to imagine that there are some additional security protocols that they're going to put in place. Although they didn't dive super deep into that aspect, they're going to have some unique ways of dealing with risk and privacy, because you'd have to. If I'm going to go to their site, and I'm going to let them look over my shoulder while surfing a site to record how to operate a website and to be able to access that using my credentials in some way, they're going to need a real airtight method of dealing with that. Because you don't want somebody taking control of your interaction, hijacking your account in some way, doing a man-in-the-middle and capturing your Wi-Fi connection to whatever when ordering a pizza. Do not want that. Absolutely. The golden egg here is this. About a year ago, I went to a US AI conference in Atlanta. What was described at the time was the golden egg. That is the ability to make a phone call, to talk to a chatbot, an AI agent, in such a fluid way that I could order airplane tickets to wherever. That's a complicated process when you think about it, because if I just call and say, I need to fly to Florida. Why? It needs to be intelligent enough to say, okay, well, what city in Florida do you want to go to? And then, okay, I want to go to Daytona Beach. And then for it to say, well, what airport will you be leaving from, and what airport are you going to? And for me to be able to specify the city and it to come back and recommend the main airports that are nearby. To then ask questions about the kind of seating I want, the pricing that I'm looking for, the specific seat I might want on the plane, the time of day that I want to leave and return. Just ordering an airline ticket really is a lot of qualifying questions. Until now, the chat systems just weren't good. 
You had to talk to a live agent. With a large action model and the ability of it to see the process you go through on a website for ordering airline tickets and finding a flight, that it can now turn that into a conversation and perform those actions for you. That's the golden egg. That's where we are. That's the next generation of convenience. It's not just about chatting. It's about chatting with interaction and with a result. We've just not had that until now. I think that to me is what's gnawing at the back of my head. What were your thoughts when you saw that interaction where he demoed a number of these sorts of things? I think the main one that did stick out was that trip planning. The amount of complexity that goes into that because it's not just book me a hotel. It's not just book me an airline ticket. It's find the full thing along with schedules, along with rideshare. Scheduling a trip to Paris for five days. That's quite convenient and useful. You don't really see those sorts of jumps very often because I was literally just thinking about planning a trip to Philly. Now I have to book a hotel, find a dog walker, all sorts of logistics around that. If I could just speak into my smart device and have that handled, that's a convenience worth paying for in my mind. It's kind of wild because there's a lot of details that go into it. One of the demonstrations they showed was I want to plan a trip for four days. Provide me an itinerary of interesting things to do on those days. It's like a list. It was like, tell me if you want me to order tickets for you to any of these events. He looked at it and said, well that's kind of a packed itinerary. Let's lighten that itinerary up. Then boom, it lightened it up. It reduced the number. You could say, I want tickets for that date for that thing. It would order it for you and it would be done. This idea of having a true assistant that can do things on your behalf. I mean, real complex things. 
It feels like that next generation step. Go ahead. I think what I'm trying to really figure out is, obviously you can see it from the consumer standpoint how this would be useful. You're walking around with a smartphone, but what about from the business and even the government side of things? Are you seeing any use cases that might be most interesting for businesses and governments? For me, I'm looking at surveillance, really. A device like this would pretty much revolutionize how surveillance could work. Just the next generation of security camera in a lot of ways, where it can interact with folks. Extrapolate that out to drones. Have you thought of any sort of applications like that? I'm thinking more of that executive level. I'm thinking of that assistant in a pocket. That executive that's on the go that needs those tickets. Needs those things done. Needs these certain people notified of certain events. One of the things I thought was interesting was that you could take a picture of a spreadsheet and then you could say, okay, I want you to add a column that averages the numbers in this row out and email that to me. Then, boom, it would send you that spreadsheet from the picture, the spreadsheet in your email, and the numbers would be averaged for you. Then, you could reply to that email and say, add another column adding these numbers and then send it. Then it would respond back with a new spreadsheet with the additional column and the numbers added as you requested. It's this back-end assistant, this feeling that you can use natural language to ask plainly for something that's relatively complex or has detail to it that you don't want to be pestered with. That's just interrupting your day. It's sucking up your time. To be able to just say it and do it. I don't care where you ordered the pizza from. Order me a pizza. I'm in D.C. Find the most popular restaurant that's a pizza restaurant and order me a large pepperoni and cheese pizza. 
Come back and say, done. Press the button. This is how much it costs. You press the button. Done. Or, I need an Uber. Sending an Uber on the way. This is how much it'll cost. Press the button. Done. It knows where you are. The Uber is on its way. That, to me, at a high level, is mind-blowing. When we talk about the tools we have today, like I've got to go to Uber. I've got to open the app. I've got to put in my location. I've got to see me on the map. I've got to make adjustments. I see the multiple cars. I've got to choose the car that I want. That can be reduced down into a handful of sentences. Also, what I like is that it knew when to ask questions. If I ask for an Uber, it'll come back and ask, how many people are traveling with you? Just one. Okay. I'm going to send you a deluxe that has plenty of room in the trunk for your luggage and has plenty of room for one person. Press the button. Confirm. Or not. Or change it up. Like I could say, I want something bigger. Right? It's that level of access and ease and the fact that I don't have to open an app or find an app and then plow through all of the details. I could push those details off to an assistant. This device as an assistant to say, just send me a cheese and pepperoni pizza from the best pizza place near me. Done. Here's the cost. Press to confirm. I love that. I think that's fabulous. The time savings are just going to be immense from technologies like this. I know, obviously, this is more of an extension of LLM agents gaining in popularity. Large action models could be the next step in those. But you look at a lot of tasks that have been done by humans in the past, like data entry, writing documents, that sort of thing. These devices that come out are going to be immensely convenient. But it also gets down to a philosophical question at a point. When these tasks are so easily done by machines, where is the workforce going to shift? 
Where are these next opportunities going to open up? I think in circumstances where creativity, true creativity, is a luxury or a necessary resource. Those jobs, I believe, are safe. Right? Generative AI is fantastic at generating stuff. Making stuff up. Even if you provide the factual information, it'll make stuff up from that factual information. It can be mostly factual. But at the end of the day, when risk and reputation rely on a person getting real human eyes on it, with real prolonged context and deep business knowledge, you can't replace that yet. We can't replace that yet. But with this device, I see a whole generation of teenagers coming out and going, that's the new device. It's the next easiest thing. It's the next most convenient thing. The iPhone made lots of things convenient. I don't know if you remember, have you ever used a physical map to drive to get to a place? Be honest. I've used it in the military. That's about it. I don't think that counts too much. That counts. You can use a map. If I handed my son or my daughter a map, or anybody else around their age a map, and said, okay, I need you to get to this address. The address isn't on the map. There are ways to find where the address would be. I don't know that they could do that. They don't have that experience. They don't have the experience of remembering locations and sites and certain waypoints, if you will, to get a bearing and keep a feel for where they are. They just have their device. Look, there I am. I'm the dot on the map. This is that next step beyond that. No longer needing to use the whole visual clunkiness. I can just ask for something and it's provided for me. All I had to do was do it once on a website. Or somebody else has done it on a website and it's already programmed into the large action model. Not inheriting somebody else's access, but being familiar with the website and how to order stuff. Whatever that website might be. I think that's the next convenience. 
I've told my kids, you're going to be able to tell that generation of kids, I was alive before AI. I remember what it was like before generative AI and these large language models were really useful. They just don't get it yet. They just look at me and go, okay dad. Which takes me to that next point. I really wasn't sure if this device was like a flash in the pan, or I had that doubt in my mind as I do for a lot of technologies, but there was just something triggered in my mind. I got my son. I told him, I just saw something, some technology, and I don't know if this is garbage or if this is something that's going to take off. If it takes off, it's going to take off with him and his generation. They're the ones who are going to embrace this and propel it forward. We watched it for 30 minutes. I'm trying not to lead the witness. Afterwards, I was like, what do you think? He's like, that's really cool. I said, could you see yourself using that and maybe even not using your phone? He's like, yeah, I really could. That's really interesting. He was still wrapping his mind around it, but he immediately recognized how easy it was to interact with that device as it was demonstrated. When you saw it, what was your feeling on the hype here? I think when there's a demo and it shows the next big thing and there's quite great production value, I always take it with a grain of salt. There was Theranos a few years ago, where they came out with a one-drop blood test and it turned out to be completely false. In terms of whether other people have used this besides the man that was presenting and his team, I haven't heard of any reviews. I think it's more, we're going to see where it goes. The pre-order is only 200 bucks or so. I think it was 199 last time I checked. It could be the next big thing. It could be a flop, but I think it's interesting nonetheless and it shows where capabilities could be in the future. I think there was something that I just saw today. 
I don't know if it was today or yesterday, but Neuralink just started human trials. You see stuff like this with large action models and agents, and you think of Neuralink, and I think stuff could get interesting real quickly. Think about it: if you had the ability to have all the knowledge on Earth within your head at any given moment in time, and be able to interact with any website within your head without even needing to speak to some device or press a button on it. Imagine the productivity gains, but imagine how dystopian it would be. But at the same time, this is something that may actually be possible in the near future. This is way off topic here. We're going to veer off the road for a moment. We're going to do heavy trails through the cornfield of tinfoil hat ideas. I don't know. It feels like a good 15 years ago they were doing experimentation with brain frequency, brain wave transmission, wherein you had somebody in Japan and they had sensors over a part of their head, and then you had somebody US side, and they had that wired through the internet to transmit signals, so that when the person in Japan would raise their arm, the person in the US raised their arm. Now, was it a full takeover? No, it was not. The person still had control to an extent, could struggle with that control, override that control. But here's sort of a convergence. You've got a device connecting. It is sensing brain waves so that when you look at something, it could potentially, like, it'll mouse over, it'll click, it knows what you want to do. That means that it's picking up on signals and it's taking those signals and turning them into action. So now we're recording signals. Now all you need for an LLM is to have a ton of patterns that relate to something. Right? It's like images. 
You break that image down into almost garbled garbage and then you work it in reverse to the actual picture, to make sure that you have this sort of diffusion, if you will, and then it brings it back based on certain keywords. That's how they can kind of mesh images, by taking these sort of garbled versions of images with similar patterns and then working it up to a final picture. But if you do that, for example, with brain waves connected to patterns, and you're recording that and you're also recording other patterns, at what point can you then feed back, or at what point can AI have a reverse presence in your life? Can you kind of see what I'm getting at there? It could potentially, I don't want to say control you, but influence you. Yeah. You know, maybe I would just say, hey, I want to feel better, and then, you know, it triggers certain neurons, certain patterns that trigger a memory that makes me feel better. Or I want to remember something and it triggers that memory. It finds that pattern, that space. You know, it's kind of a weird realm there. And again, it is kind of tinfoily, but I think when we're talking about AI and we're talking about learning, they've already done image patterning, where you can take what somebody saw and you can ask them to remember it, and the AI picks up on the brain pattern at the back of your head, where your visual cortex is, and then it can create the image. Not great, mind you, but it does create a likeness of the image as a picture. Based on it having been trained on patterns, brainwave patterns, based on your interaction with things. That's pretty crazy. And we're only talking about like a year ago, maybe less, for that kind of research. Yep. So, yeah, Neuralink, we're going to go back onto the road now, but it's frightening for a few reasons. It's frightening for that reason, because it definitely opens the potential for influence feedback. 
And then the other part of that is there was reporting that a lot of chimpanzees died due to, I don't know if it's rejection of the connection or what, but supposedly they had a lot of test subjects die, not people, but animals. Oh, man. When I heard that, I was kind of like, no, I'm not putting that in my head. That's quite the software bug there. A little bit. Now, granted, it could have been news looking for their moment. It could have been a group of folks or a journalist that kind of is bent on that not happening and having a motivation. Who knows? That may not necessarily be the case, but I've seen enough articles questioning the fact that they've lost a lot of test animals due to not being successful at implanting. I kind of wonder if the technology isn't being pushed too fast. It could potentially be dangerous. I don't know. We'll have to see how all of that plays out the next year or so. Right. Back on the road. Just hype. That's what it comes down to. Is this device hype? Half of me is like, don't get caught on this. It's probably just hype. But frankly, every week since AI dropped in December of the year before, I was like, eh, it's just hype. Then I see what it can do and I'm like, oh my God. This stuff is meeting the hype. I can do it myself. It's working. It's doing what I want it to do. Now I'm in this place where I'm like, here's this device with a new large action model doing some pretty unbelievable stuff. Stuff that we couldn't even imagine a year and a half ago. So I asked my son, do you get this? Is this your generation's iPhone? Normally he would just roll his eyes at me. He's looking for every way to say, Dad, you're crazy. You're nuts. It's garbage. That's dumb. He would absolutely love to see me defeated today. If I got even a glimmer of hope for something like this, he would just roll his eyes and go, nope, that's dumb, and he would walk away. But this time he was kind of like, you know, I think you need to get that. Wow. That's what I said. Shit, maybe it is. 
Maybe it is his generation's iPhone. And then you ordered two. Say again? And then you ordered two. I wish. No, I didn't. I'm still kind of waiting. I'm still like holding out and waiting for somebody nearby to get it in their hands and let me play with it. That's how stingy I am. Would the Deloitte well-being subsidy take care of it? I wonder. I still have that to blow through. It might. It might. Some R&D work there, you know. So it dropped. CES, there was a lot of feedback from CES this year, particularly on this device. They had nearly 5 million views on YouTube in one day. For a company I'd never heard of, and I thought I'd heard of all the companies, there was no pre-announcement, right? There was no lead-up. I'd never heard of this. There was no rumor at all about this. It was just a drop. And they got nearly 5 million views on YouTube in a day. And come to find out that they had over 30 million dollars of investment behind it before they even dropped it. Their series A round investment brought it to like 30 million dollars before they dropped this advertisement. BusinessWire reported that they had sold their run of 10,000 units in less than a day. Then they had another round of units, 50,000 units, and they had made 36 million dollars after their first rounds of pre-sale orders. Which is pretty impressive. I mean, you can't compare it to like iPhone numbers, but iPhone's already established. It's in the industry. But this is still kind of a big deal. A lot of people are investing in this. A lot of people see the value. So I'm kind of hoping that it's worth the hype. Yeah, it brings a Sam Altman quote to mind for me. AI will probably lead to the end of civilization, but in the meantime, there will be great companies. I hate that kind of negative view, because every time a grand new technology comes out, somebody's got to beat the war drum or say it's the end of the world or something. Right? I don't think it's the end of humanity. 
I think it may be the end of humanity as we know it. Like before the iPhone, we were, I believe, more social. I think I spent more time on the phone before mobile phones came out, and a lot less time actually talking to people on my phone now. It feels like I'm closer to more people, but I'm not as close to the few people that matter. Whereas before, I may have contacted fewer people, but you know, I kept those plans. Right? Those phone calls mattered. Getting together in person to hang out at the local Shoney's was a regular thing because that was the only way to keep up on what was going on in people's lives. Now it's so easy to see what people are doing. You hardly have to say hi. You just press the like button. So I think it might be a change, but I don't think it's the end. Might even swing back a little bit, you know, now that you don't have to spend time looking at things on your phone and you just speak into your phone, now you have time to hang out with people. We'll see how it goes, but I think it could be total hype. It could be something substantial. I think more likely, whether it is something substantial or not, is that it will influence where other folks invest and where they develop things. I think that the next generation of whatever this Rabbit R1 is, be it from Apple, Amazon, or whoever, could be very, very powerful. I think so. Again, the biggest part of this is the action aspect. The fact that with generative AI and with this large action model, you can not only talk to it, but it can take action on your behalf, and it can come back with questions about what actions to take. You really have this interactivity, where the ability to have an assistant is no longer a luxury that only C-suite people enjoy. It's something that everyone could have. An added efficiency to life. You just ask for it, and it knows how to do that for you. It will handle it immediately. It's like having a butler, almost.
At $200, if it operates like it says it does, it's just going to be an amazing luxury that most people just don't understand because they've never had it before. Now we just need Iron Man suits, and we have our own Jarvis. Yeah, isn't that the truth? It really is like that. I guess it was 10 or 15 years ago, but Jarvis was like a fantasy. It was like that technology, like the 3D technology where he just moves his finger and it builds an armored suit behind him. This ability to interact with an artificial intelligence was the Star Trek thing, right? Where the little flip device would just open and they could talk and instantly be communicating with someone. The little communicators that they had, that was the future, maybe not even reality. Maybe a fantasy, but it became reality because of shows like that. They showed in shows like that how it could be used and how it could advance interaction, and people latched onto that. For us, our technology fantasies, Jarvis, who would have thought that that would even be possible, really possible, like it's demonstrated in the movie. And for $200, there you go. Jarvis. Everybody gets a Jarvis for $200. Like a real, functional, can do it for you, will talk to you, will talk with you Jarvis for $200. That's the future. Every kid is going to know a world with Jarvis directly in their life helping them. And that blows my mind. Absolutely. Jarvis, coming up, as you have Pepper behind you. Fair. Okay, I think we've beat this down a bit. Is there anything else that you wanted to touch on? You haven't done a lot of talking; I've just had a lot to spout. Yeah, absolutely. I think we've covered most of it. I think it's mainly hype at this point. It's just an interesting thought experiment. We have Jarvis in our pockets. Do you have any questions that you would find interesting to explore? Sorry, my dog just got a little excited there. I'm sorry, say that again.
My dog had gotten a little excited there. Is there anything that you hope it can do that hasn't been advertised or hasn't been demonstrated? That's a good question. I just feel like it should integrate with computers. Integrate with my Mac, let's say. Have it do things on my Mac and not just be a phone. Integrate with a 3D printer, let's say. Maybe that's just fantastical thinking, but if you had this little connected device that you could plug into anything and it could do anything that you wanted it to do, just imagine the future of work. Today, writing code, you can write code in a language that you've never learned just by using GPT-4. Imagine you have this large action model that you can command to not only write code but interact directly with other operating systems. That would be an interesting integration. I don't think it's very likely, but you could plug it into your Mac and it would do whatever you want with your Mac. Interesting. If it can control websites and access those websites on our behalf, could it be connected to a computer and act on our behalf like a macro? Could it operate our computers? That's an interesting thought. Given what they've demoed, the ability to take a picture of a spreadsheet, tell it to do something with that spreadsheet, tell it to email you, and have it actually convert it to a spreadsheet, that means there's something in the background that can do that kind of functionality. Maybe a future enhancement would be: connect to my computer and create that spreadsheet on my computer. Don't just email it to me. Do it on my computer. Go beyond websites. Go to applications. Go into Excel and create me a to-do list and then send it to somebody. Perform the actions for me. Go through my emails. I want you to pluck out the emails that are specifically addressed to me. Not necessarily because I'm at-mentioned or in the to line, but because the text is actually addressed to me doing something or me needing to comment on something.
Yeah, I think that might be that next level: that interaction. Something that Simon and I had chatted about was that this is all well and good, but if I can walk into my home and I can't tell it to turn the lights on like I can with Alexa or some other home automation system, that would be a real lost opportunity. To be able to press a button and say, turn the lights on in my living room, and it just happens. Right now, I think the software for home automation is clunky. The ability to organize a home automation setup is still a little weird. The different devices can be incompatible; the industry has moved in the direction of compatibility, but it still feels clunky. To be able to just press a button and say, turn the living room lights on, and it'll do it. It doesn't matter where I am. It'll just do it. Or to be able to say, I want the lights on at 7 o'clock and off at 9 o'clock, and I want you to turn them on and off gradually as the sun rises and sets so that my internal lighting matches the external. Something along those lines would be fabulous. Something like that would be fantastic. To be able to just vocally say that after I've connected it to all of these devices and interfaces. Find the way to do the thing. Yeah, man. I'll leave it with one more story. My son was born 10 weeks early. He was born the night of my birthday, November 22nd. We had a telephone service, and I had just tried this online internet telephone service. It was internet driven. It had a special little box that your home phone connected to; it didn't connect to a cell phone. But it was routed through the next town over, because for some dumb reason they didn't have the service for the town that I was in. I thought, no big deal. Well, my wife starts screaming. She hits the floor. She's bleeding. I go to the phone. I dial 911. And I get a busy signal. Hang up the phone. The phone starts ringing. I pick it up. Hang up the phone.
The phone starts ringing. My phone's trying to call me. I pick it up. And finally, I do something to get through to someone at 911 in the next town over. And they have to route the call to my town so that the fire department, which couldn't have been more than three or four hundred yards from our house, gets the call and comes to address the situation and take my wife off to the hospital. I think of that incredibly frightening situation where I literally thought my wife was going to die and I might lose my son. And the idea that I could just press a button and say, send the police and the fire department, my wife is having a child too early, get over here ASAP. And for it to know what to contact, what to send, what directions to send them, and to just do it and confirm that it did. It's life saving. It's a game changer. Really. And in that kind of circumstance, those are the powerful little stories that drive the technology: the kind of interaction where a person could just press the button and say, just help me, I'm in trouble, and it knows what to do, who to contact, and tells them where you are. Without saying anything more than that. Because in moments where you're panicked, in an emergency, the brain just wants to deal with the emergency; it doesn't want to deal with the conversation. You know, I'd forget my own address in that kind of emergency. Anyways, that's the kind of effect I'd like to see this kind of device have. That it makes it so easy that even in an emergency, you just say a handful of words and get the response that you need. Yeah. That's what I'm hoping for. But again, it might just be hype. I don't know. Maybe I should pony up and spend that $200 and find out. I've spent more for less. See what the return policy says, you know. You never know. Exactly. Alright, well I think that covers it. This is Chris Cantlie. This is Charlie Everett. And thank you for listening.
We'll catch you on the next episode. And stop recording.
