EP 184: On-Device AI: What it is and do we need it? What no one’s talking about

Resources

Join the discussion: Ask Jordan your questions about on-device AI

Upcoming Episodes: Check out the upcoming Everyday AI Livestream lineup

Connect with Jordan Wilson: LinkedIn Profile


On-Device AI: What it is and do we need it? What no one's talking about. 

From the potential benefits of increased productivity and seamless integration with our devices to concerns about privacy, resource consumption, and the lack of a "kill switch," we explore the untold side of on-device AI that no one's talking about. Join us as we uncover the hidden truths of this technological trend, discuss the impact on companies like Microsoft and Apple, and anticipate the arrival of Llama 2 on-device and its intriguing offline integration. Stay with us to be in the know about generative AI and its far-reaching implications in our everyday lives.

The Upsides and Downsides of On-Device AI

The episode carefully outlines both the advantages and drawbacks of on-device AI, shedding light on the multifaceted nature of this technology. One of the central themes is the potential benefits that on-device AI can offer, such as enhanced productivity, seamless integration with various applications, and meaningful interactions with devices. The convenience and empowerment of individuals, particularly those with accessibility needs, are highlighted as significant upsides.

Conversely, the episode also addresses the downsides of on-device AI, ranging from the resource-heavy nature of AI systems to concerns about battery life and performance on personal devices. The potential for intrusive or unwanted AI features, cost implications, and the pressing need for a "kill switch" to ensure control and safety are also brought to the forefront of the discussion.

The Big Picture: AI Trends and Implications

A key takeaway from the episode is the importance of understanding the broader AI trends and their implications for businesses. The episode emphasizes the influence of on-device AI on market dynamics and the anticipated impact on companies such as Microsoft, NVIDIA, Qualcomm, Meta, and Apple. Moreover, the upcoming release of new AI-powered hardware devices and the integration of generative AI into everyday interactions with devices further underline the significance of on-device AI in the broader context of AI trends.

The Potential Business Implications

For business owners and decision-makers, the discussion around on-device AI holds valuable insights. The exploration of on-device AI's impact on market adoption rates, the integration of AI into hardware, and the potential for generative AI to enhance everyday interactions with devices is directly relevant to businesses across industries.

Moreover, the podcast's emphasis on Apple's response to on-device AI trends and speculations about rumored projects involving generative AI presents an opportunity for businesses to gauge the trajectory of AI technology and its potential application in their own products or services.

Joining the Conversation and Staying Informed

The episode's call to action for audience engagement aligns with the broader need for businesses to actively participate in the ongoing discourse around AI technologies. By staying informed about the advancements in on-device AI and exploring potential use cases within their own operations, businesses can proactively position themselves to leverage the opportunities presented by this evolving landscape.

Furthermore, the podcast's encouragement to join the Everyday AI community and engage in discussions on generative AI and its impact on companies and careers provides businesses with a platform to gain deeper insights and foster connections within the AI ecosystem.

Conclusion: Embracing the Potential of On-Device AI

As the conversation around on-device AI continues to evolve, businesses are presented with a spectrum of opportunities and considerations. The insights shared in the podcast episode shed light on the nuanced nature of on-device AI, encouraging businesses to carefully evaluate its implications and explore the potential avenues for leveraging this technology in their operations.

By delving into the upsides, downsides, broader AI trends, and potential business implications of on-device AI, this article seeks to prompt discussions and inspire proactive engagement within the business community. As advancements in AI continue to reshape the business landscape, embracing the potential of on-device AI and its impact on operations can be a transformative step toward staying ahead.

Topics Covered in This Episode

1. Concerns about implications of on-device AI
2. Anticipation around Llama 2's on-device release
3. Speculations on Apple's generative AI project
4. Benefits and drawbacks of on-device AI 
5. Integration of large language models with smart assistants
6. New AI-powered hardware devices
7. Challenges and drawbacks of on-device AI
8. Purpose of podcast and recent AI news


Podcast Transcript

Jordan Wilson [00:00:17]:
Do we need AI on every single device we own? Do we need an AI in our fridge or on our laptop or on our smartphone? It's something, especially after all of the recent CES announcements, where it looks like AI is coming to every single device. And I'm not just talking about software. I'm talking about the hardware. Alright. So we're gonna be talking about that and a lot more today on Everyday AI. Thanks for joining us. My name is Jordan Wilson, and Everyday AI is for you.

Jordan Wilson [00:00:53]:
It's I mean, that's that's why we're building it. We're building it for you so you can better understand what's going on in the world of generative AI and not just keep up with what's going on, because that's hard. That's difficult. I try every day, and I feel like I'm always falling behind. But not just how we can all keep up, but how we can actually get ahead. Right? Because if you're listening to this show, first of all, congratulations. I'm not saying that because you get to listen to me. Right? That's not what this is about.

Jordan Wilson [00:01:20]:
I'm saying it because you are still an early adopter. Right? And as AI becomes more and more intertwined in our lives, you are listening, so you are keeping up. You are getting ahead. Alright. So if you're joining us on the podcast, thank you as always. Make sure to check your show notes. I leave my email in there.

Jordan Wilson [00:01:36]:
Drop me a note. Let me know what you wanna see more or less of on the show. If you're joining us on the livestream, thank you as always. Let me know what your questions are about on-device AI. This is something, to tell you the truth, I'm always learning about, but I've noticed this trend. So thanks to our, you know, livestream audience. Hey. It's good to see Mike 4G back in the house.

Jordan Wilson [00:01:57]:
What's going on, Mike? Thanks for joining us. Brian tuning in from Minnesota. Hey. Is it snowing where everyone else is? Maybe, maybe not in Dallas where, you know, Josh is. Maybe so. But, man, me and Cecilia from Chicago here, we're actually getting hit with snow for, like, the first time in forever. Chrissy, thank you for joining us. So our livestream audience, let me know.

Jordan Wilson [00:02:17]:
What are your questions about on-device AI? I know it's kinda confusing. I'm even confused about it sometimes, so we're gonna dive into that today, and a lot more. But before we do, if you haven't already, why the heck not? Please go to youreverydayai.com. Yes. Sign up for the daily newsletter. I was told yesterday was one of our best newsletters ever. So make sure to go read that. Check it out.

Jordan Wilson [00:02:38]:
It was all about, you know, how to build and monetize GPTs. But on our website, we have more than 180 back episodes of podcasts, you know, more than 180 newsletters where we take deep dives into generative AI. We have different learning tracks on our website, so you can go and click. Maybe you're in sales. You can see every single sales podcast we've ever had. So it is literally a free generative AI university, so you gotta go check it out. But let's talk about what's going on in the world of AI news for today. There's a lot. Here's a good one.

Jordan Wilson [00:03:12]:
Any Swifties out there, be careful, because there's a new Taylor Swift that's confusing fans. It's obviously an AI version. So an unauthorized ad for a cookware company featuring an AI-generated Taylor Swift offering a giveaway was recently removed from social media. So if you saw Taylor Swift giving away free cookware, not really a thing, so you're gonna have to shake that one off. This is not the first instance. Yeah. That was a joke. That wasn't AI written.

Jordan Wilson [00:03:37]:
That was just me ad-libbing. But this isn't the first instance of AI-generated ads using celebrity images and voices without permission, with cases you've seen involving Tom Hanks, Scarlett Johansson, etcetera. Also, this has led to proposed legislation, at least here in the US, called the No Fakes Act, aimed at protecting individuals' rights to control their image and voice. Y'all, we're gonna be talking about deepfakes a lot. If you don't like it, sorry. Because at least here in the US, we have the election cycle coming up. I've been saying it since literally the very first episode of the show 9 months ago that deepfakes are going to ruin us in the 2024 election, at least here in the US.

Jordan Wilson [00:04:18]:
Alright. Next piece of AI news. The enterprise version of ChatGPT is picking up some steam. So OpenAI just kind of announced, or information was released about, some usage for their ChatGPT Enterprise version. So they've said that they have 260 paying companies with over 150,000 registered users. So this is big companies only, right, because ChatGPT just released, not even 48 hours ago, the ChatGPT Team plan, which is really geared for companies 2 to 149 people deep in terms of size. And that one is, you know, pretty affordable at $30 a month. I haven't even seen the enterprise, you know, pricing per customer.

Jordan Wilson [00:05:02]:
I've seen so many different things. But, you know, OpenAI coming out and saying they have 260, I think that's a pretty big deal. I think that means that, you know, more companies are at least experimenting with this enterprise version of ChatGPT. Alright. The last piece of news for today is one of the biggest names in generative AI has teamed up with the tech giants. So Microsoft and Typeface have announced an integration that will see the startup's AI technology embedded into Microsoft's Dynamics 365 Customer Insights platform to simplify the campaign-building process for marketers. So, yeah, if you are an enterprise marketer, this is huge news for you.

Jordan Wilson [00:05:38]:
So this integration will allow marketers to create content aligned with the company's branding and style from a blank canvas thanks to this Typeface integration. So if you haven't heard of Typeface, it is an enterprise-grade gen AI product that helps large companies use generative AI to create more personalized content for work. Lot going on in the AI news today. Hey. Josh is excited. Josh is gonna see Taylor Swift soon. Alright. Hey.

Jordan Wilson [00:06:07]:
Thank you. Tara laughed at my, Jordanism there. Hey, Bao. Thanks for joining the show. Hey. It's good good to see you back, Bao. Alright. But but I wanna talk now about this this trend, if you will.

Jordan Wilson [00:06:19]:
And I don't even know if it's a trend per se, or is this just the direction that generative AI is heading. I'd love to get your questions and your thoughts, your comments, your insights from our live audience. And, also, if you're listening on the podcast, let me know as well. So let's talk just a little bit today. And, yes, sorry, you're just stuck with me today. I don't have a brilliant guest as I most days normally do here on the Everyday AI Show, so you're gonna have to listen to my rambling. So here's here's what on-device AI is, and I wanna talk about if we actually need it. Alright.

Jordan Wilson [00:06:54]:
So I've noticed a huge trend over the last couple of months of trying to bring all kinds of generative AI to a device. Alright. So let me first kind of hit rewind and say what that means. Right? So right now, all of the generative AI platforms that most of us use on a day-to-day basis, whether you're talking about ChatGPT or you're talking about Bard or Midjourney or Runway or whatever it is, you know, whatever generative AI that you use, it's essentially in the cloud. Right? Like, we go on a company's website or we go on a Discord server, and we access, you know, generative AI that way. So the big picture, right, and this is where the trend is heading, is bringing these models, specifically large language models, or small language models, whatever you wanna call them, to an actual device, right, and not needing to connect to an external service or even connect to the Internet to use AI, to use generative AI, to use large language models. So we've seen this in a lot of recent announcements.

Jordan Wilson [00:08:02]:
And and, hey, FYI, I'll tell you now. If you're hoping to get into the technical aspect of this, you know, and talk about, like, Mistral and, you know, Llama 2, and all these, you know, small models that people are hacking and forking and putting on devices, that's not this. We're talking about the bigger picture here. Right? This trend. And we're seeing it because you've seen, you know, even in the last week or 2, big announcements, you know, Microsoft, NVIDIA, you know, big announcements that they just had at CES, you know, the Consumer Electronics Show, talking about releasing AI chips. Right? So this is where everything is heading. Even another one.

Jordan Wilson [00:08:40]:
Qualcomm and Meta, you know, talked about a huge partnership, and this should be releasing soon, I believe. They're bringing Llama 2. So Llama 2, you know, Meta's large language model, is a small yet very powerful model, so they're bringing that to PC chips, to the Qualcomm chips. You know, so that means that on your local computers, you know, without needing to connect to the Internet, you will have Llama 2. You will have a large language model running locally on your machine. So what does this mean? Right. And this is also actually, before I dive into that rhetorical question, we have to talk about Apple too.
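[Editor's note: whether a model like Llama 2 can run locally mostly comes down to memory. Here's a hedged back-of-the-envelope sketch; the 20% overhead factor and the quantization levels are illustrative assumptions, not official figures from Meta or Qualcomm.]

```python
def model_memory_gb(n_params: float, bits_per_weight: int, overhead: float = 1.2) -> float:
    """Rough RAM needed to hold a model's weights on-device.

    `overhead` is an assumed ~20% cushion for activations and the KV cache.
    """
    return n_params * bits_per_weight / 8 / 1e9 * overhead

# Llama 2's 7-billion-parameter variant at common quantization levels:
for bits in (16, 8, 4):
    print(f"{bits}-bit weights: ~{model_memory_gb(7e9, bits):.1f} GB")
```

The point of the sketch: at full 16-bit precision a 7B model wants on the order of 15+ GB, which is why the on-device story leans so heavily on quantized (8-bit or 4-bit) weights that fit in a phone's or laptop's RAM.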

Jordan Wilson [00:09:18]:
Right? Because if I'm being honest, I think the short-term success, right, so if we're looking at short term as, you know, 6 to 18 months, I think the short-term success and in-market adoption rate really comes down to what Apple is going to do. Right? Apple's never first to the party. We already see the people who are first to the party here. You know, it's companies like Microsoft, you know, bringing AI to PCs. It's companies like, you know, Meta bringing their model. It's companies like Qualcomm, NVIDIA. Right? They're first to the party, but it's usually what and how Apple does it to see if this is going to be an industry trend.

Jordan Wilson [00:09:58]:
Right? Example, Apple wasn't the first media player. Right? But the iPod made that a trend. Apple wasn't the first, or the iPhone wasn't the first, touchscreen phone, but that's what made it the norm. Right? So I think we can't overlook this. Even though we haven't heard a lot, we've heard rumors. Right? We've heard rumors that Apple is reportedly spending more than $1,000,000 a day building their next generative AI product, whatever that is, whether it's software based, which is one thing, or whether it's hardware based. Even though technically, you know, if it's in the software, it's technically hardware based, right, because then it can be offline. So we really don't know, I think, or won't truly know the future of on-device AI until we see what is this next Apple announcement that they've been working on.

Jordan Wilson [00:10:50]:
Is it gonna be cloud based? Are we gonna need to be connected to the Internet, or is whatever Apple is working on in the, you know, large language model, gen AI space, is it going to live offline on our phone? Right? Is it going to live offline on our computer? Personally, I would love that. Right? Because what offline or on-device AI does is it brings this concept of personal AI to life. Whether you like it or hate it, right, offline AI is the marriage of your personal data and productivity, whatever productivity looks like to you, whether it's being more social personally with your phone or whether it's accomplishing more on your PC. But that is essentially what on-device AI is really aimed toward, but it's gonna be interesting to see what Apple actually does. You know, is whatever they're working on, this, you know, rumored Ajax, is one of the names, or some people call it Apple GPT, is this gonna be cloud based, or is it going to live offline on our phones? I would guess the latter, but I have no clue. You know? All there is is speculation and reports out there. Yeah. Like what Val is saying here.

Jordan Wilson [00:12:06]:
And, again, love hearing from the livestream audience. Let me know what other questions or thoughts you have. You know, Val says not being dependent on the Internet would be a game changer. Absolutely. You know, I think that is the big push to bring on-device AI. And I actually have some thoughts, because we're gonna go over the pros and the cons of this here in a minute. But I have some thoughts, Val, on what that actually means. You know, and I love what May Britt is saying here: do you see AI coming to Alexa or Google Home to improve the responses further? May Britt says hers is so sassy.

Jordan Wilson [00:12:37]:
Yeah. Absolutely. Right. So this has been reported on for many months, you know, that Alexa, Google, and Apple are working on large language model integrations with their smart assistants, and we've already started to see that slowly roll out in different form factors. Alright. But let's get back and kind of talk again about how Apple is impacting this trend. Right? Because I think that as we jump into, you know, not talking examples, but talking about how this is gonna play out, I think it's gonna be largely dictated by what we see from Apple and when. Right? That's I think that's what brings so much new technology to the masses.

Jordan Wilson [00:13:17]:
You know? It's Apple using them first. You know? You always hear, you know, Android phone users, you know, whenever Apple releases something big and it becomes popularized, it's, you know, Android users are like, oh, yeah, we've had that for, you know, 2 years or whatever. But it's you have to be keeping your eyes on what Apple is doing in this space. Alright. So now we understand. There's a trend. All the big companies are working on on-device AI, making, you know, generative AI or large language models live, you know, on an actual chip, to live offline, to bring this personal AI to all of us, to marry our data in a fast way, because that's ultimately what it's all about, not having to connect to the Internet and waiting.

Jordan Wilson [00:13:59]:
You know? It removes that latency, or the wait time, you know, bringing personal AI to your device. It brings it from, you know, maybe a second or 15 seconds, depending on how complex the task is that you're working on in a cloud-based system, to maybe instant. You know, maybe we're talking milliseconds. Right? And that is the big picture here. Right? Because not everyone needs a large language model like GPT-4 with its 1.8 trillion parameters. Right? You can't really get the full version of that right now on a device. I know people are, you know, trying to fork it and, you know, make versions of GPT-4 run locally, and there's success on that, but that's not even what I'm talking about, because I don't even think, for the most part, the average everyday person needs a model that big on their hardware. So I do think, right.
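[Editor's note: to put the latency point in rough numbers, here is a hypothetical sketch of time-to-first-token for a hosted model versus an on-device one. The millisecond figures are assumptions chosen for illustration, not measurements of any real service.]

```python
def time_to_first_token_ms(network_rtt_ms: float, prefill_ms: float) -> float:
    """Delay before the first token of a response appears: the network
    round trip plus the model's prompt-processing (prefill) time."""
    return network_rtt_ms + prefill_ms

# Illustrative numbers only: a cloud model pays the round trip on every
# request, while an on-device model pays none.
cloud = time_to_first_token_ms(network_rtt_ms=80.0, prefill_ms=400.0)
local = time_to_first_token_ms(network_rtt_ms=0.0, prefill_ms=150.0)
print(f"cloud: {cloud:.0f} ms, on-device: {local:.0f} ms")  # cloud: 480 ms, on-device: 150 ms
```

Even under these made-up numbers the shape of the argument holds: dropping the network hop, plus a smaller model with a faster prefill, is what moves responses from "noticeable wait" toward "feels instant."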

Jordan Wilson [00:14:49]:
Even though I don't know, I do think this is where it's heading, even with Apple. I think we're gonna see an on-device and fully capable generative AI that works offline. So why? Why do we need this? Do we need this? And what are the downsides? Alright. So let's talk about the why. You know, we already hit on some of these things, but the offline access and the speed are huge, you know, but the personal AI, that's what this is ultimately about, making every aspect of our devices smarter and faster. So if you think of, as an example, think of in your mind maybe your best use case of ChatGPT. Maybe it's very specific, but you found something that you can use a large language model for in one very specific use case, and it's extremely efficient at it. It saves you time.

Jordan Wilson [00:15:44]:
You're extremely pleased with the outcome. Right? So think of that feeling that you have. Right? But think of applying that to your everyday interactions with your device, with your phone, with your computers. Right? Because that's, I think, the bigger picture. Right? Right now, our relationship, I think, with large language models or generative AI for the most part is a when-we-need-it kind of thing. Right? Like, we have to go and seek that out. You know, you can't just reap the benefits right now from generative AI at all times. But think of this.

Jordan Wilson [00:16:25]:
The average human checks their phone hundreds of times a day. So think if you had on-device AI, a large language model, on that phone. Think of how much more productive maybe your life could be or your work could be or your responses could be, or how much time you could save, right, when you can have a large language model being able to work and understand what's going on in every single program you're using. So whether you're on your phone and you're reading a message or you're typing an email out or you're scrolling social media or you're reading an ebook, right, to be able to have a large language model understand all that and work seamlessly between those apps, that's the future. Right? And that is how I think large language models and generative AI and on-device AI eventually become seamlessly integrated into our lives to the point where it helps us so much that we don't even know it. Then it's also maybe, is it hard to live without it? Right? So let's quickly go over some of the downsides and the upsides.

Jordan Wilson [00:17:32]:
You know? I promise y'all this isn't gonna be one of those episodes where I accidentally talk and rant for 45 minutes. We're gonna keep this to a somewhat succinct episode. Alright. Yeah. Okay. I like this comment from Tara here. You know, Bicentennial Man meets Big Hero 6, having on-device AI. Yeah.

Jordan Wilson [00:17:50]:
I love that. Right. Having a helpful assistant that's with you everywhere you go. And here's the other thing, and this isn't what I'm talking about, but I'm really just talking, at least in the context of this conversation, about our devices, our current devices. Right? I'm not even necessarily talking about new devices, you know, the Humane pin, you know, the pin that you can wear around on you, and it's, you know, always seeing and hearing and understanding what's going on, and you can talk to it. Right? Or the new Rabbit. Right? I think it's the Rabbit R1. That device was just announced at CES.

Jordan Wilson [00:18:28]:
It's essentially a new piece of hardware. And, you know, a lot of people are like, oh, why can't this just be an app on your phone? But the Rabbit R1 is a completely new piece of hardware. We talked about it hours after it was released, but it's a new piece of hardware. Same thing. It has a camera. It's kind of like an assistant. You know, it has a large action model built into it. Right? So we'll see if that whole terminology picks up steam and sticks around.

Jordan Wilson [00:18:53]:
Then you have other devices. Right? These hardware devices, you know, the Meta Ray-Ban glasses. Right? That's not what we're talking about in this show. I'm literally just talking about the current existing devices that we all use on a day-to-day basis, mainly our cell phones and our laptops, right, or your desktop computer. So now let's talk about some of the downsides and the upsides, because there's some things in here that I don't think many people are talking about or thinking of. You know, I'm sure maybe the extremely smart, brilliant people that are building this on a day-to-day basis are thinking of these things, but, hey, Everyday AI is for everyday people.

Jordan Wilson [00:19:32]:
So I don't know if we're thinking about some of these downsides and upsides when it comes to on-device AI. Alright. Let's first talk yeah. And, hey, side note. I agree with Brian here saying the Rabbit R1 looks cool, but the last thing you wanna do is carry around another device. I agree with that. But, I mean, who knows? Maybe it'll be so awesome you won't care. So, yeah, we're just talking about the devices we already have on us, on-device AI.

Jordan Wilson [00:19:59]:
So first, a downside of on-device AI. And for this argument, let's just stick to phones. Let's stick to phones here. K. Because there's 2 different things. Downside of on-device AI. Some of these things are obvious, some of them aren't. But local AI, offline AI, on-device AI is so resource heavy.

Jordan Wilson [00:20:28]:
It's so resource heavy. Right. Think of the compute, the tens of thousands of GPUs needed to power something like ChatGPT. Right? And think of just the sheer amount of power and resources and compute that is now going to have to be on our personal devices. Right? Obviously, you know, the chip companies have known this for many years, and they're coming out with smaller, more powerful, just shocking advancements in chip technology and these new AI-powered chips, but still, it's resource heavy. I can only imagine that we're going to see, you know, some of these first smartphones that have, you know, this offline large language model or this AI built in, I think some of them are gonna be problematic. You know? How much testing can they actually do with these before they get these to market? There's always gonna be a rush. You know? Google's already, you know, announced that some of the next, you know, Samsung phones with the Google Gemini Nano model, you know, they're gonna have this.

Jordan Wilson [00:21:43]:
Right? So it's a rush. It's a rush to market. Everyone wants to be first. They wanna sell more. They wanna, you know, be the first one with an offline large language model. You know, we're seeing that already. It's resource heavy. What does that mean for your device, your personal devices? Well, it can kill your battery life and can potentially kill your performance.

Jordan Wilson [00:22:03]:
Right? Think of, like, when you're, you know, maybe on your laptop. You know, I was just doing this last night. I was working super late last night, you know, putting together some materials, following up with all the amazing people out there who took our free PPP, Prime Prompt Polish, course. Side note, if you want access, just send me PPP. But, you know, my computer was hot. It gets hot. Right? I have a fairly expensive MacBook Pro, and, you know, when you're using it, especially when it's not plugged in, it gets hot.

Jordan Wilson [00:22:34]:
I mean, luckily, in full disclosure, it was also kinda keeping me warm, because it's kinda cold in Chicago right now. But performance when it comes to on-device AI is huge. Alright. Another downside is: is it gonna be useful? Right? Because with on-device AI, you have to find the balance between it being small enough yet powerful enough. Right? And I think that's where, especially with some of these early iterations, and, again, I think that's why Apple usually waits, some of them are either gonna be too powerful and too resource sucking, or some of them are just gonna be not powerful enough and not useful enough. And then it's like, alright.

Jordan Wilson [00:23:19]:
There's millions of people, maybe unnecessarily, you know, either you had to pay extra, right, because that's the other thing. On-device AI phones are gonna be more expensive. Right? They're gonna cost us more. And I'm guessing it's gonna get to the point where we don't necessarily have an option. Right? Like, if you want a new iPhone, you can't opt out of the, you know, 7 cameras that it has on the back. Right? Or you could maybe get one with 5. You know? It's not actually 7. I think it's, like, 4.

Jordan Wilson [00:23:48]:
Right? But you might not be able to opt out of these things when this hardware comes to your device, this AI hardware. So you're probably gonna be paying more. It's gonna be more resource heavy. Battery life, I'm sure, is going to suffer. It's probably gonna go down from where things are now, because we might be at, you know, kind of peak battery performance level. Alright. And then there's one other downside, and I'm gonna take a sip of this. You know, Tara's saying, habit and life changing.

Jordan Wilson [00:24:20]:
Absolutely. I'm gonna take a sip of my water before I get to this downside because I have to throw out a little bit of a precursor here. So another thing that I think no one is talking about when it comes to the downsides of on-device AI is what is called the Skynet factor. Right? Is there gonna be a kill switch? You know? No one wants to talk about that. If we have on-device AI, right, whether it's in your computer chip or it's running, you know, you might not be able to opt out of it eventually. Right? So think of this. Think of sometimes how, you know, Siri or Alexa, you ask them a short question and they get it wrong, and then they keep on talking for, like, another 15 to 20 seconds. Right? And it's kind of annoying.

Jordan Wilson [00:25:18]:
Think of, like, what that would look like if this is happening all the time on your device. Will there be rogue AIs running wild? Again, because it's not like you're going to a website or opening an app or signing up to a service and putting a prompt in. Right? This AI, very soon, I'm guessing, is going to be running locally on just about every device. There's no kill switch. Again, I'm not saying, right, I'm not one of those people, oh, my phone's gonna, you know, take over my life and take over the world, but what's to stop it? You know? Will there be a kill switch if it is hardware based, if it's in the software? Right? If I'm not opting into it when I use it, that's something we have to think about. I hope companies are keeping that in mind.

Jordan Wilson [00:26:02]:
But, you know, what happens if every single iPhone in 3 years has AI built into the hardware? What happens if I don't want it? Will there be an opt out? I hope so, but I don't know. That's a downside that we have to think about. Yeah. And Sean with a great comment here, which we're not even getting into, you know, when we're talking things like Neuralink, right, with, you know, the AI in your brain. That's, yeah, that's definitely the next step. But here's the thing.

Jordan Wilson [00:26:31]:
On-device AI is already here. It's already announced. The chips are in production. We're about to see it en masse pretty soon. Maybe the first big push is Google's Gemini Nano that's coming out to some of these new Android and Google smartphones. We're gonna be seeing it soon.

Jordan Wilson [00:26:49]:
Or I think it's coming to the new Samsung smartphones. Sorry. Alright. So now let's talk about some of the upsides. I don't wanna end this episode on a downer like that. Let's talk about some of the upsides of personal AI. Alright.

Jordan Wilson [00:27:06]:
One example is, I think, the potential for your life to just be much easier, much more productive. Like I said a couple of minutes ago, picture in your mind that one task you do in a system like ChatGPT where you're just blown away at the amount of time it saves you, at how impactful having it for that one specific task is. Imagine, if this all goes well, if it goes swimmingly, having that on your device. Let's say it's on your phone, everywhere you go. Your on-device AI knows you. It knows where you are. It knows what you like. It knows what you're doing across multiple apps, because it lives on your device. Yes, part of that is creepy.

Jordan Wilson [00:27:59]:
And part of that is extremely empowering. I think it's gonna bring a lot of positive momentum and good things to people who have accessibility problems or setbacks. I think it's huge, and potentially very powerful. Those are the upsides of having a personal AI, of having on-device AI running locally. Running locally is the thing that makes this possible. You lose that wait time or lag time you normally have because you're chatting with a server, like with ChatGPT, where there's compute power, downtimes, and uptimes. It's not like that when it's on your device. It's instant.

Jordan Wilson [00:28:46]:
It knows everything. That's number one. Think of those recommended replies. If you've ever used iMessage and someone texts you, "Where are you?", a suggestion pops up right there at the bottom, and you can click one button to send your current location. Right? Think of that across your entire cell phone or your entire laptop, where it just knows everything about you and brings ease and flexibility to your fingertips. Alright. The speed and the always-on connection are huge.

Jordan Wilson [00:29:21]:
Alright? Taro loves AI for accessibility, democratizing opportunities for us all. Same thing. So the last upside, as we wrap up today's show, and if you do have a question, please get it in now. The last upside that I don't really think people are talking about is: will on-device AI eventually take away the environmental black eye that generative AI has? I've been trying to read more about this and to learn more. So, hey, if you know someone, have them reach out to me. We'll bring them on the show.

Jordan Wilson [00:30:07]:
But one thing about generative AI as it is now, running in the cloud, is the environmental impact. Right? It's huge. It's enormous. It's something people aren't really talking about. A lot of us don't understand it, but the compute power needed just for me to use ChatGPT every day is crazy. I'll put the exact amount in the newsletter, but we talked about it many months ago on the show: for every 50 prompts you run, a certain number of gallons of water gets used.

Jordan Wilson [00:31:04]:
Right? Because the thing with all of these cloud data centers, and again, I'm not the most highly technical person, so I might not be using the right word, is that it's not just the compute power taking a toll on the environment. You also have to keep those data centers and those racks of chips cool. That's what all this water is being used for: they get hot, and you have to pump water through to keep these centers cool. The environmental impact of generative AI is huge. The carbon footprint on this thing is enormous. So, again, I don't know this, but here's my assumption: I think on-device AI can help with that.

Jordan Wilson [00:31:55]:
Right? Let's just say, as an example, I send 200 prompts a day to ChatGPT. It's probably a lot more. And let's say there are tens of millions of people who use it as much as me, tens of millions of people who do 200 prompts a day inside ChatGPT. That obviously has a cost, not just a compute cost but an environmental cost as well. So now let's say in a year or two, with on-device AI, instead of prompting ChatGPT 200 times a day, I'm only prompting ChatGPT 20 times a day.

Jordan Wilson [00:32:36]:
I reduce my cloud-based prompting by 90% because I have a local, on-device AI. Yes, I still have to power my phone every day, but presumably that has a much smaller environmental impact than always having to prompt out to the cloud, to Midjourney or Runway or these generative AI tools that so many of us use literally all the time. Companies have said how much they're even losing right now when people use generative AI, because of not just the actual compute cost but the environmental cost as well. Something has to change. Right? And I think on-device AI might be that first step. So that's a little sneak upside for you as we wrap up here, one I think a lot of people aren't talking about. Alright.
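The back-of-the-envelope math in that example can be sketched in a few lines of Python. The numbers (200 prompts a day, 90% moved on-device) come from the episode; the function name and structure are just an illustration, not a measured model of energy or water use:

```python
def cloud_prompt_reduction(daily_prompts: int, local_share: float) -> tuple[int, int]:
    """Split a daily prompt count into prompts handled on-device
    versus prompts that still hit a cloud service.

    local_share is the fraction (0.0 to 1.0) of prompts an
    on-device model can absorb. Illustrative only.
    """
    local = round(daily_prompts * local_share)
    cloud = daily_prompts - local
    return local, cloud

# The episode's example: 200 prompts/day, 90% handled locally.
local, cloud = cloud_prompt_reduction(200, 0.90)
print(local, cloud)  # 180 on-device, 20 still going to the cloud
```

Scaled to the tens of millions of heavy users the episode imagines, each percentage point moved to `local_share` removes millions of cloud round trips per day, which is the intuition behind the environmental argument.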

Jordan Wilson [00:33:31]:
I hope this was a fun one. I had a good time learning a little bit more about on-device AI. Whether you're listening on the livestream or on the podcast, let me know what else you want to hear about. One thing we do here at Everyday AI, aside from reading so much news it hurts, is trying out so many generative AI programs that I literally lose track. I can't tell you the number of times I'll see a generative AI program on social media, Google it, and Google a tutorial...

Jordan Wilson [00:34:03]:
...and the first result that comes up is me from six months ago. That's how many generative AI tools I use. But one thing we try to do here at Everyday AI is to look at the trends. Right? To not just look at what happened, but to try to better understand what is happening, to have an open conversation about it, and to build a community of people who want to learn about this to grow their companies and their careers. So thank you for tuning in. As always, there's a ton more, so go to youreverydayai.com and sign up for the free daily newsletter.

Jordan Wilson [00:34:39]:
A lot of these statistics I was trying to rattle off the top of my head, so I'm gonna make sure we get all of those nailed down specifically for you and put them in the newsletter. So go sign up at youreverydayai.com. Shoot me a message. Shoot me a DM. We always put my contact information in the podcast notes, and if you're listening on the livestream, you have it. Let's continue to grow together, to learn generative AI, to grow our companies, and to grow our careers.

Jordan Wilson [00:35:05]:
Thanks for tuning in to Everyday AI. We'll see you back for another one soon. Thanks, y'all.
