Ep 291: Apple's AI Announcements: The good, the bad and what no one's talking about

Episode Categories:

Critical Analysis of Apple’s Latest AI Advancements

Apple is renowned for its groundbreaking technology, but has it truly hit the mark with its recent artificial intelligence announcements? The new developments have drawn a mix of skepticism and criticism, notably for their reliance on external innovations rather than organic, in-house initiatives.

Amidst Innovation: Stagnant or Revolutionary?

While Apple prepares to introduce improvements to Siri, its stock price may bear the brunt of a perceived lack of innovation. The company has added new features like natural language processing, on-screen awareness, and the ability to type to Siri, all expected to enhance device-user interaction. Despite these improvements, the company's strategy has been termed a "major failure" due to its excessive reliance on other companies.


The Role of Siri: Changing or Remaining Constant?

Apple intends to give Siri full personal context, with a significant emphasis on privacy and data security, which can be appealing to users. But critics argue that the potential lack of substantial change in Siri's intelligence in the near future, coupled with underwhelming natural language processing, undermines Apple's tagline, "AI for the rest of us."


Apple Intelligence: A Sigh of Relief or An Additional Problem?

Apple announced "Apple Intelligence" at WWDC, a promising new addition for iPhones and MacBooks. However, heavy questions loom around the rollout timeline for the Intelligence features and the extent to which Siri's enhanced capabilities are tethered to OpenAI technology. Apple's apparent prioritization of personal productivity over business applications could suggest a conscious withdrawal from competing with Microsoft in the business applications sector.


ChatGPT and Siri: A Marriage Made in Heaven or Compromise?

The integration of ChatGPT, specifically GPT-4o, with Siri brings seamless access to ChatGPT features on Apple platforms. However, the lack of conversation around this development and the underspecified details of the new GPT-4o model's functionality are concerning. Have "cutesy" AI capabilities overshadowed fundamental enterprise requirements?


Upgraded Models and Features: A Turnaround or Another Dead End?

Apple aims to please with a series of sophisticated features: semantic media search, improved grammar tools, summarized notifications, prioritized information, and compatibility limited to higher-end models. But the confusion surrounding the different tiers of AI models raises serious questions about the depth of Apple's AI intelligence and integration.


Conclusion: A Promising Future or A Downfall?

Despite substantial efforts, Apple's recent AI announcements received a mixed bag of reactions. While some appreciated the approach toward enhancing user experiences, many questioned the absence of groundbreaking innovation. With the tech giant's future depending on how it executes this strategy, observers and users alike are keenly watching Apple's progress on its journey toward true AI dominance.

Topics Covered in This Episode

1. Apple Intelligence and Its Offerings
2. New and Improved Siri
3. ChatGPT Integration with Apple's AI
4. Criticisms and Disappointments with Apple's AI
5. Predictions and Speculations with Apple AI


Podcast Transcript

Jordan Wilson [00:00:16]:
Apple just announced what it calls Apple Intelligence. So for the last 2 years, Apple's kind of been sitting on the sidelines, and everyone's been waiting. What is Apple going to do in AI? Are they going to change how we work, or is it just gonna be a big nothing burger? So we're gonna be talking about that today and more on Everyday AI, going over Apple's AI announcements: the good, the bad, and what absolutely no one is talking about. I'm excited for today's episode. What's going on, y'all? My name is Jordan Wilson. I'm the host, and Everyday AI is for you. It is your daily livestream, podcast, and free daily newsletter, helping you make sense of everything that's going on in the world of AI. So, if you didn't know, yeah, Apple finally, in probably one of the most highly anticipated tech announcements, I don't know, over the last decade or so.

Jordan Wilson [00:01:08]:
Apple finally took the wraps off Apple Intelligence, its new AI offering for iPhone, for MacBooks, and a lot more. So extremely excited to talk about that in today's episode. So if you're joining us from the podcast, thank you. As always, make sure to check out your show notes for more on today's show and also a link. And go to youreverydayai.com. Sign up for the free daily newsletter. We will be recapping today's show, what's going on in AI news today, and a whole lot more. So let's start with that.

Jordan Wilson [00:01:38]:
What is going on in AI news today? So we're gonna go over the top stories. Number 1, Meta wants to train its AI model on European data. So Meta, obviously the parent company to Facebook, Instagram, WhatsApp, and probably some other social media apps I don't use, wants to use data from European users to train its AI language models. So this has raised concern about data privacy, with a Vienna based group urging privacy watchdogs to intervene. Meta has stated that it will not use private messages or data from users under 18 and has given the option for users to opt out. So Meta believes that training on European data is important for accurate and culturally aware AI models. This updated privacy policy will go into effect on June 26th, and training for the next model will begin soon thereafter. Speaking of data and training, well, Microsoft Recall is changing how it functions.

Jordan Wilson [00:02:37]:
So Microsoft, their new AI powered Recall feature, which they announced at their Microsoft Build conference, and it automagically just literally kinda records everything you do on your computer, and you can talk to it. But it's obviously faced criticism for privacy risks. So in response, Microsoft made some changes such as making it opt in instead of opt out and adding some additional encryption. Some commend the swift action, while others expressed disappointment. Microsoft plans to conduct further testing and review to address privacy and security concerns. Last but not least, on that security and privacy concerns issue, Elon Musk says he'll ban Apple devices. So, billionaire Elon Musk went on his social media platform and X'd? I think, what do we call it, tweeted? He put out an X post and warned of what he said are potential security risks from integrating OpenAI's artificial intelligence software with Apple devices, following a presentation by Apple announcing their plans to incorporate OpenAI's ChatGPT chatbot into the Siri digital assistant. So Musk threatened to ban Apple devices from his companies if OpenAI is integrated at the operating system level.

Jordan Wilson [00:03:56]:
That's... I don't know. I find it funny. So, obviously, Elon Musk has his X Grok AI chatbot that I don't know if anyone uses. And it seems like he doesn't want other people to be using OpenAI and ChatGPT. So, sour grapes, I guess, pretty early on. Right? It's like when you see something that's better than what you created, you're like, I don't want anyone using that. Alright. But, go log on to your... log on.

Jordan Wilson [00:04:23]:
Do people still log on? Go to youreverydayai.com. Sign up for the free daily newsletter for more on those news stories and a lot more, but let's get into today's topic, y'all. I'm excited for this one. So let's go over Apple's AI announcements at WWDC, their Worldwide Developer Conference, that just kicked off yesterday, but it's gonna be happening all week. So we should be seeing some more news, but the big news was just the opening keynote saying, hey, Apple unveiled their Apple Intelligence. So we're gonna be going over the good, the bad, and what literally no one is talking about. So let's start super high level.

Jordan Wilson [00:05:01]:
So Apple Intelligence is what Apple is calling this. Right? Only Apple would try to actually rebrand AI and artificial intelligence and just say Apple Intelligence, which is funny because they said it, like, 60 times. I believe the official count was somewhere around 60. They said Apple Intelligence, but they very rarely said artificial intelligence, which I found interesting. They did say personal intelligence a lot, so more on that later. And their slogan, I guess, was AI for the rest of us, which I have some thoughts on, which we'll get to later. But let's first start with our take on all of this. It is Hot Take Tuesday.

Jordan Wilson [00:05:43]:
Every single Tuesday, we come with a hot take. Yeah. If you're new here, Mondays, we go over the AI news that matters. Tuesday, we do a hot take Tuesday. And Wednesday through Friday, we usually bring on guest experts, in interviews so you can learn from them. So our hot take Tuesday today, our hot take on Apple and their long awaited AI announcements, meh. That's kind of it. But I think Apple created a seemingly basic AI model that won't be that great and will probably actually have to wait a long time to really use much of it or any of it.

Jordan Wilson [00:06:14]:
So more on that here in a minute. But let's go over first an overview of what's announced, and then we're gonna go over really what I think are the 8 things that you need to know. And we're also gonna go over the good, bad, and what no one's talking about. So here's probably, like, the 8 bullet point overview. So if you only got a couple minutes, I'm not gonna get mad if you leave after this. So here's the 8 things. Number 1, new AI model capabilities. Yeah.

Jordan Wilson [00:06:38]:
Apple actually released a new model and didn't even really mention it by name in their keynotes. Number 2, they have an AI image model and image generation. Number 3, they have actions that can happen with this Apple Intelligence. Number 4, an improved Siri, maybe. Number 5, personal context. 6, ChatGPT integration. Finally, it's been released. Right? 7, semantic media.

Jordan Wilson [00:07:06]:
And 8, some AI functions in their main apps. Alright. So let's go over 2 important pieces you need to know first. So it's kind of a big release timeline. So, you know, Apple generally has their WWDC conference in June, and then they say, oh, you know, September 15th or September 20th, we're gonna be rolling all these things out. We got a little bit more of a bigger timeline. So some of the things that Apple said, well, this is coming in fall. So whether that means September or December, I guess, is up to Apple.

Jordan Wilson [00:07:36]:
And then other things such as, you know, some improvements in Siri, they said, over the course of the next year. So, yeah, are we gonna be seeing these anytime soon? Not sure. However, Apple, I think, has a much better track record than, like, Google as an example for actually shipping what they say they're going to ship. So if they say over the course of the next year, yeah, it might be a week or two before their next WWDC announcement, but I do assume that Apple will actually ship these things. And, yeah, fall might end up being December, not September, but I do expect... Apple has a much better track record than most of the other big tech companies on actually shipping what they say. So, yeah, we might still be waiting for it. Also, there's a good chance you aren't even gonna be able to use it. So, you know, as an example, I like to put this out there.

Jordan Wilson [00:08:26]:
I use everything Apple, everything Mac. I'm a little bit of a fanboy, but I'm looking at all the Microsoft stuff, and I'm like, okay, that's good. I like that. Hey, someone from Microsoft, send me a couple laptops or something. Right? Anyways, you might not even be able to use all this stuff because you need an iPhone 15 Pro or higher to use most of this Apple Intelligence on your iPhone. Presumably, the new iPhones that will be announced will also have compatibility, but, you know, I'm even recording this, you know, livestream video here on, I don't know, an iPhone 12 or something like that.

Jordan Wilson [00:09:02]:
I don't know what number it is. But, yeah, I'm sure the majority of people aren't gonna have access to this. So, you know, Apple is obviously hoping everyone sprints out of their door and goes to buy, you know, iPhone 15 Pros or higher, or Pro Jumbos or Pro XLs, I don't actually know what they're called. Or on your computer, so coming to MacBook as well. I think that's a little friendlier in terms of minimum requirements to ride this ride. So you do need a Mac with an M-series chip or higher.

Jordan Wilson [00:09:32]:
So, you know, the Intel Macs, you know, those won't work. But if you have an M1, M2, M3, etcetera chip, Apple Intelligence should work there. Alright. Let's start with the important stuff here, y'all. Number 1 is they have a new AI model. Right? So it was interesting because Apple was kind of vague, and maybe that was intentionally so, but they were a little vague on what Apple Intelligence even is. Is it a model in and of itself? Not really. It seems like it's just this umbrella term for a bunch of stuff that's presumably going on under the hood, but no one was talking about this.

Jordan Wilson [00:10:14]:
They didn't mention this by name, you know, during their presentations on the first day of WWDC. But, yeah, there's an actual new model that they released that they just put on their machine learning blog by name, which is funny because at least, you know, at 5 AM this morning, literally not a single other human being in the world had talked about it. Alright? So at least, you know, you're getting some breaking news here. Maybe by the time you're listening to this, someone else has talked about it. But, essentially, some of the stuff that's going on under the hood is new models called JAX. So, you know, original reporting a year ago was it was gonna be called Ajax. So it is the JAX and XLA that is kind of some of the background technology for this new model. Right? It's people are like, okay.

Jordan Wilson [00:11:05]:
So is the GPT model running this entire thing? No. Apple has a base model running the majority of this Apple Intelligence, JAX and XLA, and it is a 3 billion parameter on-device model. Okay? So that's edge AI. So so much of, you know, what's going on locally, and we'll get to that here in a couple of minutes. But it is happening locally with this 3 billion parameter on-device model, which is technically small. Right? So if you compare it to, you know, GPT-4 Turbo, which I believe is 1.8 trillion parameters. Right? So a 3 billion parameter model is technically kinda small. Right? If we were saying 5 years ago, you might say, oh, that's big.

Jordan Wilson [00:11:47]:
But, you know, compared to state of the art models, a 3 billion parameter model is pretty small, but it is just for the limited scope of what this model is presumably actually performing under this Apple Intelligence umbrella. The number 2 thing you need to know about is the AI image model and generation. So, yes, part of this is a diffusion model that can create images in a variety of forms. Personally, I don't know if I would ever use this, and I should have put this precursor out there. Right? Myself and our team, we've been using the GPT technology since late 2020. So, you know, coming up on almost 4 years now, and we've used hundreds, literally hundreds of AI tools and, you know, a handful of great AI image generators like, you know, DALL-E and Midjourney, Stable Diffusion, etcetera. Right? When we look at this new Apple AI image model, I don't think I would ever use it. Apple is playing it safe as Apple always does.

Jordan Wilson [00:12:49]:
You know, I don't think you're gonna run into any deepfake, you know, issues here because you can't actually create realistic photos. Kind of the options, I believe, were illustration, sketch, etcetera. There were 3. But it's more of a cartoony thing. So, you know, the example here that I'm sharing on the screen is, you know, kind of showing, like, oh, your mom is a superhero or something like that, but a cartoon one. So we'll get more into these features here in a little bit, but I don't know. To me, personally, I'm not really gonna be using that, but big piece. So here's what you need to know about the AI image model and generation, so a diffusion model.

Jordan Wilson [00:13:29]:
They have a new app called Image Playground where you can create these, apparently unlimited or until your phone runs out of space. They also have the Genmoji. Hey. Everyone's losing their mind about AI powered emojis. I don't know. To me, don't care. Don't care at all. But, yeah, you can use AI to make a custom emoji if you care about that.

Jordan Wilson [00:13:50]:
Also, they have an Image Wand app that turns your sketch into an image. I'd say if you're a student, that could be pretty cool. Right? It's one of the examples here that they have. You know, you can sketch something in your notes, you know, circle it if you're using, like, an iPad app, and then it'll turn it into an image. They also have a cleanup tool for photos, which we've seen in Google Photos for a long time. You know, you can circle someone if someone photobombed you and, you know, it erases them. Natural language search inside of photos and videos, which we're gonna get to, and memory movies. So we're gonna touch on those 2 things here in a couple of minutes.

Jordan Wilson [00:14:30]:
Next, which I'd say is probably one of the more... I don't know if impactful is the right word necessarily, but probably a more beneficial feature here: actions. Right?

Jordan Wilson [00:15:26]:
That's essentially what's coming between Apple Intelligence and this new, quote, unquote, smarter Siri: we are going to see actions. So let's talk a little bit about what that means. So we're gonna have cross-app and in-app actions, and whether you're talking to the new and improved Siri or just using this Apple Intelligence in general, it is going to be able to perform very simple actions across Apple's built-in apps. Also, for 3rd party developers, presumably this will be rolling out soon via what they're calling app intents.

Jordan Wilson [00:16:21]:
So not like camping tents, like intent, but intents. So app intents allows third party developers to provide similar functionality, to presumably allow Siri and this Apple Intelligence to perform a simple task on users' behalf. So that part's pretty cool. Right? Also, you know, it's kinda like the shortcuts. Right? So Apple has had these shortcuts, but you kind of had to program them manually before. You know, it's like, oh, when I start a workout on my watch, launch the activity, you know, app on my iPhone or something like that. Right? So you could program these shortcuts before, but now you can just use natural language. And texting Siri, that's new as well.
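For the developer-minded, the app intents idea can be sketched conceptually: an app registers small named actions, and an assistant resolves natural language into an intent name plus parameters. To be clear, Apple's actual API is the Swift App Intents framework; everything below (names, shapes, the registry) is invented for illustration only.

```python
# Conceptual sketch of "app intents": an app registers small named actions,
# and an assistant resolves natural language into an intent name + parameters.
# All names here are hypothetical -- Apple's real API is Swift's App Intents.
from dataclasses import dataclass
from typing import Callable, Dict

@dataclass
class AppIntent:
    name: str                       # e.g. "create_event"
    handler: Callable[..., str]     # what the app does when invoked

class IntentRegistry:
    def __init__(self) -> None:
        self._intents: Dict[str, AppIntent] = {}

    def register(self, intent: AppIntent) -> None:
        self._intents[intent.name] = intent

    def perform(self, name: str, **params) -> str:
        # In the real system, Siri would pick the intent and fill its
        # parameters from conversation context; here we call it directly.
        return self._intents[name].handler(**params)

registry = IntentRegistry()
registry.register(AppIntent(
    "create_event",
    lambda title, when: f"Created '{title}' at {when}",
))

print(registry.perform("create_event", title="Dinner", when="Saturday 7pm"))
# prints: Created 'Dinner' at Saturday 7pm
```

The point of the shape is that the app only declares what it can do; deciding *when* to do it, and with what parameters, is the assistant's job.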

Jordan Wilson [00:17:04]:
And, essentially, it can complete small actions across its apps. Right? So if you are texting someone about dinner plans for next weekend, then you can just tell Siri, create that as a calendar event. Right? Then it'll do it for you. Yay. And it'll use all the information and all the context from that conversation. Alright. Number 4, improved Siri. Yay.

Jordan Wilson [00:17:28]:
Are we finally gonna have a smart assistant? I got some takes for this at the end of the episode. We're gonna play an example. But the example here that, you know, Apple has on their website, it says, you know, Siri, set an alarm for... oh, wait, no, set a timer for 10 minutes. Actually, make that 5. Right? So just more, you know, natural language processing. You know, before Siri, Alexa, and just about all smart assistants, they've been very not smart for 13 years.

Jordan Wilson [00:17:58]:
You know, maybe it was novel at first, but, you know, once large language models came out, you couldn't really use them. Right? So from 2022 to, you know, today, if you try to use Siri or Alexa or, you know, I guess Google Assistant is improving as well, it's like permanent facepalm emoji. It's terrible. All of these smart assistants are terrible. So now, presumably, you know, Siri's gotta get a little smarter by using this new base large language model, and might tap into some future GPT-4o technologies, which we'll talk about here in a minute. So here's what you need to know about this apparently new and improved Siri. Yeah.

Jordan Wilson [00:18:40]:
I got some takes on that here in a minute. But better context, more natural language. So, yeah, if you make a mistake, on-screen awareness, which I think that's pretty cool. Right? So, yeah, if you are, let's say, reading a, I don't know, an email, then you can ask Siri to look up something from what's on your screen. Right? Look up bullet point 2. Yeah. Right? So, yeah, I mean, pretty nice. I don't know.

Jordan Wilson [00:19:07]:
For me personally, I'm not always talking a lot to my computer or my phone. Maybe in the future, we will be more, but at least having that kind of on-screen awareness is pretty good. Like we talked about, a new feature is being able to type to Siri. For me, not that great, but I can see how that would actually be useful for a lot of people. Right? If you're, you know, at work or in class or, you know, in a crowded place, maybe you just don't wanna seem like a weirdo, you know, talking into your phone all the time, but you can just type to Siri. Also, like we talked about, Siri actions, being able to do these actions cross-app and in-app, as well as app intents for third party developers, so Siri will be able to complete in-app actions for you. Alright. Number 5, personal context. I think this is the big one.

Jordan Wilson [00:19:58]:
Essentially, all your data. Right? Yeah. And can you opt out of this? Apple says you can opt out of this, and they have what's called Private Cloud Compute. So we're gonna get to that, you know, here in a couple of minutes. But more or less, I think this is an area that Apple has been the leader in. Right? Privacy, which is huge. Right? So I think Apple made a big play away from business.

Jordan Wilson [00:20:25]:
Right? They weren't really talking a whole lot about this is gonna help you do better at your job, you know, grow your career. It wasn't a lot of that. This is more, it seemed like an AI app for your personal life or, you know, a new AI phone for your personal life. That's what it seemed like to me, which, you know, a lot of people are thinking about privacy. Right? So let's say you're texting with your spouse maybe about a medical condition and, you know, I don't know. Do you want Siri to put that information into a calendar event? Do you want, you know, Siri or Apple to just always be reading all that information? For me, I don't care. For me, take that data. Right? I forget things hundreds of times a day.

Jordan Wilson [00:21:03]:
Hundreds. So for me, personally, I'm looking forward to this. Right? I think for my personal life, a lot of this Apple Intelligence is going to be great. You know? I don't care. Take all my data. Give me better retargeting. Right? Like, I know that's not what Apple's doing, but I'm the same way with anything. Google, take my data.

Jordan Wilson [00:21:21]:
YouTube, take my data. I got nothing to hide. Make my life easier. Right? So Apple is kind of going down that same route with this personal context, but with a huge lean on privacy, obviously, as Apple always has that heavy lean, saying that your data is in this private cloud AI, saying your data is never stored, used only for your requests, with verifiable privacy. Alright? So, essentially, what that means is it just knows everything. Right? So you could be texting, and it's going to know what's in your calendar. It's going to know your events.

Jordan Wilson [00:21:56]:
You can, you know, talk to Siri, and presumably it's gonna know your life. I'm actually extremely excited about that part. You know, especially how I can apply that to business to, you know, gain back little wins. Right? I feel personally that's what Apple's going for here instead of huge business productivity, you know, a 10 hour task that now you can do in 3 minutes. Now they're saying, you know, it seems like, you know, dozens of tasks that would take you 5 minutes are now going to take you 2 minutes. Or, you know, a task that would take you 45 seconds is now gonna take you 4 or 5 seconds. So I'll take that. Not huge productivity gains.

Jordan Wilson [00:22:35]:
That's why I think the market reception... you know, Apple stock went down yesterday after the announcement, because it's like, ah, is this gonna change how we do business? Right? Because when you follow big companies, you know, Microsoft, NVIDIA, Amazon, right, all the other big companies, even Meta. Right? They're changing how we do business. And, you know, I think a lot of people were expecting something similar out of Apple. Right? Is this gonna impact how we work? And Apple really just didn't pay much attention to that, if I'm being honest. But this personal context, I think, is pretty huge. But like I said, Apple is marketing it as a security first focus and private cloud with opt out.

Jordan Wilson [00:24:01]:
Alright. The big one here, ChatGPT integration. So a, quote, unquote, direct integration with GPT-4o, OpenAI's newest model.

Jordan Wilson [00:24:32]:
So, obviously, there's been a lot of reporting and a lot of confusion on what Apple is actually doing. So, you know, we heard almost a year ago that Apple was spending millions of dollars each day trying to build its own internal large language model, and they obviously released some of the findings of their models via research papers. So, you know, at that point, everyone's like, oh, okay. Apple is gonna be going with its own model. But then you heard all these reports back and forth. OpenAI. Oh, nope. Now it's Google Gemini. Now it's Anthropic.

Jordan Wilson [00:25:01]:
Oh, back to OpenAI. So, yes, there is an official, quote, unquote, direct integration with ChatGPT. So we'll talk about that more at the end when we go over the good, the bad, and what no one's talking about. But, ChatGPT integration where they're saying it's seamless. Right? So kind of having that ability to use all of these ChatGPT features pretty much wherever you are. So, you know, if you have a Mac and if you've been using their desktop app, it's similar to that. So it's bringing the power of this new GPT-4o model to your desktop, not all the features, though. Right? Because there's a lot in this new model that OpenAI has still not even announced yet.

Jordan Wilson [00:25:42]:
So, presumably, once those features... once OpenAI releases them to everyone, so all they've really released is the base model. Right, but OpenAI also talked about a lot of other things, kind of this live omni, you know, where, you know, the desktop app can see everything in real time, you know, with one click of the button, kind of this way more natural, you know, voice, you know, being able to have a live conversation back and forth. So, you know, will, you know, Mac, you know, iPhone, Siri be getting that next updated version of GPT-4o? Potentially, and we'll get to that here in a second. But the direct integration, it's like this. Right? Because, yes, this is confusing, and all the original reporting kind of ended up being true, because there was also reporting that said Apple is gonna use both on-device AI, or edge AI, as well as, you know, kind of cloud compute as well. So that turns out to be the truth. Right? So it first is going to use internal models to answer what it can. Right? So we talked about that JAX, you know, internal model, the 3 billion parameters.

Jordan Wilson [00:26:49]:
So it's going to use what it can for that first. And then when it can't, it's going to send to a private cloud AI, which they did not release a ton of details on. I don't know if that's still the JAX model or if that's the GPT model. So there's multiple tiers, which is a little confusing, but we'll get to that later. But, essentially, when you need something more powerful than just what your personal data is, or if you want to kind of parlay your personal data with the real world or the outside world, I believe, is something that was referred to, well, then you're gonna have to use this direct GPT-4o integration. So it's free, which is great. You know, they didn't really reveal a lot of details. Right? So, like, as an example, if you are having ongoing conversations with the GPT-4o model, is it gonna remember those things? Right? If you're talking to Siri, and it needs to use the ChatGPT 4o direct integration for something, is it going to remember that in context, or is it gonna completely forget it as soon as you kind of close out? Right? Because presumably, the good thing with the edge AI is it's gonna have this working memory.
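The tiered flow described here, on-device first, then Apple's private cloud, then an opt-in ChatGPT handoff, can be sketched as a simple routing function. This is one reading of the announcement, not Apple's actual logic; the function and tier names are made up for illustration.

```python
# Hypothetical sketch of the tiered routing described above.
# Tier names are labels for illustration, not Apple's identifiers.
def route_request(fits_on_device: bool,
                  needs_world_knowledge: bool,
                  user_approves_chatgpt: bool) -> str:
    if fits_on_device:
        # Personal-context tasks stay on the ~3B-parameter edge model
        return "on_device_model"
    if needs_world_knowledge and user_approves_chatgpt:
        # World-knowledge requests can be handed to GPT-4o, per request
        return "chatgpt_gpt4o"
    # Everything else escalates to Apple's server-side tier
    return "private_cloud_compute"

print(route_request(True, False, False))    # on_device_model
print(route_request(False, True, True))     # chatgpt_gpt4o
print(route_request(False, False, False))   # private_cloud_compute
```

The open question the episode raises maps directly onto this sketch: whatever crosses the `chatgpt_gpt4o` boundary may not flow back into the on-device working memory.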

Jordan Wilson [00:28:06]:
Right? It's going to continually be able to remember things that are in all, you know, your mail, your messages, your documents that maybe you store, you know, on Apple's apps. But outside of that, is it gonna remember things that it uses ChatGPT for? I don't know. Alright. Number 7. We have semantic media. So we kind of talked about this; it kind of rolls over into the, you know, diffusion AI images. But, you know, the example that Apple has on their website is essentially, you know, think: you have probably thousands of photos and videos, you know, on your iPhone or your Mac computer, and then just being able to search semantically.

Jordan Wilson [00:28:51]:
So the example they gave, you know, presumably someone's taking a photo of their child here, and then they say Katie with stickers on her face. And then it returns all the photos of Katie. So at some point, you've told it this is Katie or you've identified her, and then it can go through all your photos and say, alright, here's the photos of Katie with stickers on her face. Sorry. Excuse me. Got some allergies kicking in here. But then, semantic media.

Jordan Wilson [00:29:18]:
Here's what this actually means. Well, it's gonna understand what's actually going on in your photos and videos. Right? Which is great. Again, we've seen parts of this. Apple has had pieces of this, but kind of this, you know, Apple Intelligence is presumably bringing it all together, connecting it a little better. And then also it can create custom memory movies based on your request. Nothing new here. Apple has had this.

Jordan Wilson [00:29:38]:
Google announced this as well. But, you know, before they were just kind of auto generated. You know, it might say, you know, 2 years ago on this day, if you took a bunch of pictures 2 years ago and it would make you a little, you know, photo montage. So now, you can kind of request and say, oh, you know, make me a montage of, you know, that trip from Mexico from 2 years ago or something or, you know, all of the times I, you know, was playing with a puppy outside. Right? Like, whatever. So you can just kind of do that with a request. And then last but not least, which I think is probably one of the better features here, is AI functions inside of Apple Apps. So the example here, which I think is great, is summarizing notifications.
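The "Katie with stickers on her face" example boils down to scoring every photo against the query. A real system compares learned embeddings of images and text in a shared vector space; the toy sketch below fakes that with hand-written tags and simple overlap, and all of the data is invented.

```python
# Toy sketch of semantic photo search. Real systems embed images and text
# into a shared vector space; hand-written tags stand in for that here.
photos = [
    {"file": "img1.jpg", "tags": {"katie", "stickers", "face"}},
    {"file": "img2.jpg", "tags": {"katie", "beach"}},
    {"file": "img3.jpg", "tags": {"dog", "park"}},
]

def search(query_terms: set, min_overlap: int = 2) -> list:
    # Rank photos by how many query terms their tags share,
    # keeping only those that match enough of the query.
    scored = [(len(p["tags"] & query_terms), p["file"]) for p in photos]
    return [f for score, f in sorted(scored, reverse=True) if score >= min_overlap]

print(search({"katie", "stickers"}))
# only img1.jpg matches both terms
```

Swapping the tag-overlap score for a cosine similarity between embedding vectors gives you the grown-up version of the same idea.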

Jordan Wilson [00:30:19]:
So let's say you're someone who's in a meeting for an hour. You know, my phone is literally on Do Not Disturb almost every single day. I don't get any notifications because I hate them, but a lot of people do. Right? You're in a meeting. You look at your phone, and you have 10 notifications. And then you have to waste time individually going in and seeing what all those notifications are actually about. So what Apple Intelligence is presumably going to do, it's just gonna kinda tell you. It's gonna read your texts, you know, and it's gonna say, hey. Here's what this person texted you about.

Jordan Wilson [00:30:46]:
Here's what you need to know. You know, your Slack messages, etcetera. It's going to give you a summarization, presumably either using the JAX model or GPT, and then it's just gonna pop that up onto your home screen. Right? So I think those are some of the best features inside of these Apple apps. So having your notifications summarized. Right? So if you open Apple Mail, you'll have the ability, instead of the normal preview text, you know, it'll show you a two line summary. That's good.

Jordan Wilson [00:31:14]:
Right? You don't gotta read through everything, and it's also going to use some AI to kind of prioritize things for you as well. So, yeah, if a doctor appointment gets changed, or if a meeting gets rescheduled, that might be more important than, you know, the group chat from the poker buddies. Right? You know, no disrespect to the poker group chat, but if a meeting gets rescheduled from 5 o'clock to 3 o'clock and it's 2 o'clock, Apple is presumably gonna know and put that at the top of your notifications so you know about that right away. Alright. So that's a recap of what's new. Now let's go over the good, the bad, and the things that no one is talking about. Alright.

Jordan Wilson [00:32:00]:
So here's the good. In typical Apple fashion, they made things shiny and sexy. Nice UI/UX. Right? Even just what I just talked about, using AI to summarize things, to prioritize your notifications, kind of these app actions, it's all really good. Right? Is it anything new? Absolutely not. You know? I should have started the show with this, but we got really nothing new or extremely exciting from Apple. Right? It's all stuff that we've had. You know, maybe it was in 2 different apps.

Jordan Wilson [00:32:30]:
Maybe it was in 20 different apps. So now I guess the nice part is you bundle it all up into Apple's nice user interface. Right? The summarization and the GPT-4o writing tools, being able to access those anywhere. Again, we've had that. You know, if you've used Chrome extensions like I have, you've already had that for years. Years. Right? And so, okay. Cool.

Jordan Wilson [00:32:52]:
Now you get that inside of Apple's apps. Fine. Not bad. Right? And I do think you're gonna get some baked in AI features that are gonna replace some other popular apps. As an example, you know, part of that, if you're writing an email inside Apple's Mail program, you can click to make it longer, make it more informal, improve your grammar. Right? So it's like, I don't know. If I'm Grammarly, I'm not super stoked at this Apple announcement. Right? So I think that Apple is actually gonna kill off the need for people to use some apps.

Jordan Wilson [00:33:23]:
Right? Like, some non AI things: they have a Passwords app now. Another one is the ability to record calls. Apple says it does notify the other party, but then it can transcribe those calls. Right? And then, you know, voice notes, it can transcribe those. So, yeah, all these AI capabilities that many of us have been using for many months or multiple years are now gonna be just baked in features. So, you know, I guess for many people, right, this is gonna be their first introduction to artificial intelligence. Right? Because, yeah, a lot of people just don't use AI, or they use it in a very limited capacity. And then obviously, the direct ChatGPT integration.

Jordan Wilson [00:34:08]:
Alright. Here's the bad, y'all. Here's the bad. There's model confusion here. Right? Apple didn't even really announce ChatGPT or OpenAI until the very end. And if I'm being honest, it's a little confusing. Essentially, there's, like, 3 and a half or 4 different tiers of models.

Jordan Wilson [00:34:31]:
Right? So they have kind of their own model, this Apple AI JAX/XLA model with 3,000,000,000 parameters, that is on device edge AI. So, when that's happening, nothing is getting sent to a cloud, right, or a third party service. Then Apple says, hey. For some things, we are gonna use private cloud AI. And it's like, okay. Well, why? I guess, what's the need for that tier? Again, a little confusing. Apple didn't do a good job of even saying, oh, we're using our own model for this, ChatGPT for that.

Jordan Wilson [00:35:06]:
Right? So they have, number 1, on device AI, the JAX 3,000,000,000 parameter model. Number 2, a private cloud compute. Okay. When's that being used and why? Then number 3, you have this direct and free, right, which is great, free integration to GPT-4o, but only when you tell it to. So I'm like, okay. But then they also said in the future, you're also gonna be able to use your paid version. So you can bring in paid features of GPT-4o. Right? Because, yeah, a lot of these new features that haven't been released that we've kind of referenced.

Jordan Wilson [00:35:43]:
Right? This Live Omni, or people call it Her. Right? That's only available on the paid plan. Some of this more, you know, natural language, these voices that they previewed that we don't have, that presumably Siri is going to get, because Siri's voice and also Siri's intelligence didn't seem to get much better if I'm being honest, but maybe it will with these paid features. So, again, confusing. It's almost like there's 3 and a half, 4 different tiers of AI models that are being used depending on the scenario, which I guess if most of it is happening under the hood, it's not a bad idea. But here's the thing. It's not all happening under the hood. And I think this is a bad thing.

Jordan Wilson [00:36:23]:
Right? Because so much, I think, of what people want to use AI for, or what we've been using it for previously, it's to interact with things on the Internet. Right? It's to be productive at work and in your job. And, I mean, some of those things you'll be able to do on device and you won't have to, quote, unquote, use ChatGPT, but ChatGPT is a more powerful model. Right? If I'm being honest, I think Apple missed the boat here. They should have just figured out a way, you know, similarly to how Google Gemini has Gemma, which is a lightweight version of the Gemini model. I think, obviously, Apple would have been much better off if they could have partnered with OpenAI to just create a lightweight version of GPT-4 that ran locally. Right? So you're essentially working with multiple models that, you know, technically speak multiple languages. So you're not gonna be getting consistency, I think, when you're kind of working with these different AI assistants.

Jordan Wilson [00:37:21]:
And worse than that, you know, some people said, okay. So is Apple Intelligence really just a wrapper? Right? Is Apple just, you know, you take off its hood and it's just ChatGPT underneath? Well, yeah. Kind of. But is it really a direct integration? Because Apple showed this, you know, because essentially, if you ask Siri or if you ask this Apple Intelligence for something, and number 1, if it doesn't know, which I'm guessing is gonna be a lot of things, or if it's too powerful, if it requires more compute, whatever, it essentially gives you a message, literally a pop up, that says, do you want to use ChatGPT to do that? And then you click cancel or use ChatGPT. So, I mean, is this a direct integration? It's like you're just launching an app in the background. So is Apple Intelligence really super intelligent? Did we really get AI, or did we just get kind of like a seamless shortcut to launch ChatGPT in the background? I don't know. Seems like a fail to me.

Jordan Wilson [00:38:25]:
That's me. So, yeah, is Apple Intelligence just a wrapper? And, also, Siri. Let's talk about Siri, y'all. Yeah. I got some thoughts on Siri here. So is this advanced Siri, and other Apple Intelligence features, coming later this year? Right? So later this year. Okay. Does that mean 350 days from today? Does that mean by December 31, 2024? Let's let's just go ahead.

Jordan Wilson [00:38:51]:
Let's go ahead. You know, I actually have a bone to pick here. Right? If I was Apple, I literally wouldn't have done this. Alright. Let's go ahead and listen, speaking of Siri. And is Siri smarter now? Or is it only going to get smarter if you have the paid version of GPT-4o? Right? Because Apple said that you can integrate the paid version later. Right? So let me talk about this. Let me play this little 30-second clip from their presentation yesterday.

Jordan Wilson [00:39:23]:
Someone kind of going over here, talking about, Siri. So let's go ahead and listen, and then let's break down the example.

Person [00:39:31]:
And you can speak to Siri more naturally, thanks to richer language understanding capabilities. Even if I stumble over my words, Siri understands what I'm getting at. What does the weather look like for tomorrow at Muir Beach? Oh, wait. I meant Muir Woods.

AI [00:39:47]:
The forecast is calling for clear skies in the morning near Muir Woods National Monument.


Jordan Wilson [00:39:55]:
Okay. Did anyone did you notice what didn't happen there? So okay. Here was the query for Siri. Right? What does the weather look like for tomorrow at Muir Beach? Right? Guess what, y'all. When you ask alright. I'm gonna go on a small rant here. It's Hot Take Tuesday. This is a fail. What's the one thing you wanna know when you ask for the weather? The temperature.

Jordan Wilson [00:40:32]:
Guess what Siri didn't say? The temperature. I don't know. If I'm Apple, I'm being very selective about the most polished version of, you know, Siri that you showcase. That to me, if I'm doing that right now, I'm like, come on, Siri or Alexa or whatever. Is that a smarter Siri? I don't know. So I guess we'll find out, but I don't know. My hunch is, when they said, oh, some Siri improvements are gonna be coming later this year, I think the only improvement in Siri initially is going to be that it's going to have access, with this new internal Apple model, to all of your data. So you can say, oh, you know, catch me up on this text message or who emailed me.

Jordan Wilson [00:41:24]:
Right? So that might work better. But everything else that you're using Siri for, I don't know. It might still stink for the next year. You know? They kind of hinted at, hey. More Siri improvements are coming over the next year. So maybe they are actually literally just waiting on OpenAI's GPT-4o, these new voice and natural language features, and it'll integrate directly with that, but only if you have a paid ChatGPT account. So are we actually going to be getting a smarter Siri? And if so, when? And if so, is it going to require a paid ChatGPT account? I don't know.

Jordan Wilson [00:42:04]:
That demo to me was a ginormous failure. I would not have shown that. If I was in charge of Siri and I'm seeing this at the keynote, I'd be like, why did we do this? If you ask about the weather, yeah, you probably wanna know. Is it gonna rain? Is it gonna snow? And what's the temperature? How are you not gonna say the temperature? Right? Is it 95 or is it 45? That's not a useful, smart assistant. Be smarter. Understand. Right? Yeah. If there's natural language processing and, you know, natural conversations, if I ask someone who's, you know if my wife has the weather up on her phone and I say, hey.

Jordan Wilson [00:42:44]:
What's the weather look like? She's gonna say, oh, yeah. It's raining and, you know, it's cold, 50 degrees, or she's gonna say it is sunny. It's 90. Right? Come on. This is simple stuff. Simple stuff, Apple. So, other things, there was a report today out of MacRumors that said a lot of these Apple Intelligence features are actually going to have a wait list during the initial limited preview. So, yeah.

Jordan Wilson [00:43:14]:
So presumably, there's gonna be, you know, Apple always does beta versions. So even when the beta comes out, if you're a dork like me and you wanna play around with some of these things, they might be waitlisted. For the general release, will they be waitlisted? I don't know. But there's a possibility that they could be. Right? One thing you gotta give credit to Apple for, usually, they have a great record of shipping things on time. They ship what they say, right, but at least kind of this reporting today is saying that there's going to be a wait list for some of these Apple Intelligence features. Again, that might just be during the beta preview. Not sure if that's gonna be during the general release.

Jordan Wilson [00:43:51]:
Alright. Things that no one is talking about, y'all. Alright. Their main thing. Right? So if you go to the Apple Intelligence webpage, it says AI for the rest of us. What the freak does that even mean? AI for the rest of us? Is Apple an indie brand now? Like, are they against the man? What does that mean, Apple? I'm sorry. That's dumb. AI for the rest of us? Come on.

Jordan Wilson [00:44:23]:
You have billions of devices. The rest of us. Hey, guess what, Apple? Guess what? Knock knock. The rest of us have all had access to AI for 2 to 4 years, and we've been using it everywhere. Right? Literally everywhere. AI for the rest of us. Come on now. No amount of marketing.

Jordan Wilson [00:44:43]:
You can pull over my eyes. And that is your main tagline, Apple? AI for the rest of us? Come on now. That's silly. That's silly. Who came up with that? Alright. Here's the other thing that no one's talking about. This whole Siri thing doesn't really seem smarter, and it seems like it's not really gonna get smarter for at least a year. And they didn't say: is it only gonna get smarter with this improved GPT-4o that, you know, the rest of us are waiting on from OpenAI? Is it actually a new model, or is it still the old Siri that we've been using that kinda stinks, and it just has access to our personal data? Right? That's better than nothing. But are we actually going to get a truly smart assistant? And if we are, is it only reliant on OpenAI's technology? And if it is, do we have to have a paid account to use it? Right? You know, it's something that Apple traditionally doesn't do, require a paid subscription.

Jordan Wilson [00:45:40]:
But they did very distinctly say that some of the paid features of ChatGPT, you will be able to integrate with. Alright. Well, are we only gonna get a smarter Siri if we have a $20 a month ChatGPT Plus account? Maybe. Alright. Another thing that no one's talking about, this model confusion. Right? Like I talked about, at 5 AM this morning, you know, I was reading the blog post on Apple's machine learning blog to learn more about this Apple AI, and I googled it because I'm like, let me see what people are talking about, and literally no one is talking about it. At 5 AM this morning, there was not a single mention.

Jordan Wilson [00:46:16]:
There was not a single mention of this model anywhere on the Internet, not on Twitter, not in a blog post. Nowhere. Nowhere except there. No one's talking about this. This Apple Intelligence, it's this 3,000,000,000 parameter model. Yeah. Apple actually released their own model. They just didn't really say it.

Jordan Wilson [00:46:34]:
They just said, oh, it's called Apple Intelligence. Right? But people think that's just marketing and software, but, technically, it includes a model that they created. Right? And, yeah, people aren't talking about it or even mentioning it by name. The other thing, is this a real ChatGPT integration? I don't know. Doesn't seem like it. Seems like you're just running ChatGPT in the background. Doesn't seem like it's baked in. Right? It's like, hey, Siri, you know, do something for me.

Jordan Wilson [00:47:03]:
And Siri goes, oh, I can't. Do you want ChatGPT to do it instead? Is that an integration? Right? Or is that just an underperforming model that's not very great, that just says, I can't do anything. Go have ChatGPT do it. Is Apple just a wrapper now? Are they just a wrapper company? Right? Great UI, UX, and all your data, but really, to make great use of it, you just gotta use ChatGPT anyways. Right? And other things, it just seems like so many useless features. I don't wanna create a cartoon version of someone and send it to them in a text message. I don't wanna create an AI emoji. Why is Apple so gung ho on, like, do they really think the majority of their customers want this? Right? Yeah.

Jordan Wilson [00:47:48]:
It's probably fun for kids or for, you know, grandmas or something like that, but I don't know. For people 20 to 60, do you really wanna create an AI emoji? Do you really wanna create, you know, a customized birthday image for someone? I don't know. It's weird. What's the use case? I don't know. It seems like Apple is really relying heavily on people liking cutesy AI stuff. Y'all, we've had enough of that. We're tired. We've had cutesy AI stuff.

Jordan Wilson [00:48:18]:
Apple, hey. While you were sleeping the past 2 years, we've had all this crap. It's gone. We don't care. We don't pay attention to it anymore. People want AI to improve their companies, to help them save time in their business so they can spend more time in the real world. And Apple's just like, ah, AI emojis. Have fun.

Jordan Wilson [00:48:37]:
Cartoon versions of your coworkers. Send them to them. No. That's tone deaf. Right? Apple, I think, missed a huge opportunity. It seems like they pretty much stayed away from business. They stayed away from enterprise. They stayed away from, you know, work productivity.

Jordan Wilson [00:48:55]:
Right? I mean, they kind of touched on it a little bit, but for the most part, this WWDC, this Apple Intelligence presentation was just kind of based on your personal life, which is great. Don't get me wrong. There's a lot of time to be won there. But like I said, you know, OpenAI, Gemini, Amazon, Microsoft, they're saying, hey. Here's how you can take a business task that takes 20 hours and do it in 2 minutes. That's great. Apple's saying, hey. This used to take you 10 seconds. Now it's gonna take you 2 seconds.

Jordan Wilson [00:49:33]:
It's like, okay. That's fine. Here's stickers you can annoy your friends with. I don't care. I don't care. I don't know if it's just me, but, yeah, I don't care. Just a huge lack of business focus, and I'm just wondering if Apple finally realized internally, after reportedly spending millions of dollars a day to make their own model, and then they just ended up using GPT-4 anyways. Maybe they just came to an internal realization.

Jordan Wilson [00:49:59]:
They saw everything that Microsoft has been doing. Microsoft's been crushing it. They've been killing the game like they're hunting safari, right, to borrow a rap line there. Like, maybe Apple just said, we can't compete with Microsoft. We're not going down this business route. We're going for personal productivity. Maybe that's what they're saying. Maybe that's what they're doing here. I don't know.

Jordan Wilson [00:50:22]:
Huge fail. Apple is the most cash-happy company in the world. They have the most cash. They're Scrooge McDuck sitting on piles of cash, and they couldn't get this figured out. They had to rely on someone else. It's a huge loss. Just a huge, huge loss, Apple. And instead, they're like, oh, you know, Apple Intelligence is gonna help you create calendar events faster and summarize your emails.

Jordan Wilson [00:50:51]:
Yeah. We've had this. That is not the future of technology. Apple used to tell us and show us and give us the future. Instead, Apple said, here's what you've had for 2 to 5 years, just with some nice marketing, just with a nice user interface, with your data. It's like, okay. That's fine. But this is a $3,000,000,000,000 company.

Jordan Wilson [00:51:14]:
This is a company that is supposed to be leading the world in how we interact with technology in the future, and instead, here's the you know, does anyone remember Now That's What I Call Music? Right? When it would, you know, recap the best hits, you know, or like Kidz Bop. Right? That's what this feels like from Apple. It's like, hey. Here's all the things you've been enjoying over the last 2 years, but wrapped up in a nice little package on one CD. Right? Fail. Huge, huge, huge failure on Apple's part. And I think, eventually, the stock price is going to reflect that. Right? Apple is relying on people wanting this.

Jordan Wilson [00:51:56]:
Right? Wanting this Apple Intelligence. But I think people are gonna realize, number 1, well, it just seems like it's kinda ChatGPT. Number 2, not everyone's gonna rush out. Apple is, I think, their strategy here, because it's only gonna work on iPhone 15 Pro and above, so their thought is, oh, well, we're gonna have hundreds of millions of people, you know, waiting in line to go upgrade their phones. It's like, I don't know. Maybe. But is that your strategy? Right? That's a temporary solution here.

Jordan Wilson [00:52:28]:
I'm personally not a stock analyst, but, number 1, I did stay at a Holiday Inn Express once. And number 2, I did tell you a year ago that NVIDIA was probably the most important company in the world, and no one listened. I think Apple is gonna tank. Not tank, but right? They've always been company number 1 or company number 2. I wouldn't be surprised at this time next year if Apple is the 4th biggest company and someone else has passed them. Right? Whether it's Amazon, whether it's Meta, you know, I don't know. I don't think Apple is gonna be a top 2 company at this point next year.

Jordan Wilson [00:53:05]:
They're relying that everyone's just gonna go buy new hardware to take advantage of all this. But, yo, Apple, I don't know if you've been literally sleeping in a lab for the last two and a half years trying to catch up to Microsoft, to Google, to everyone else. We've had this. We're tired of this, actually. Right? You are giving us Kidz Bop. You are giving us Now That's What I Call Music 32. Don't care.

Jordan Wilson [00:53:27]:
Don't care. Am I gonna use it? Yeah. I'm gonna use it. Right? Yeah. You know, not to be hypocritical. I'm just being honest, but is this the Apple that we're used to, that changes how we use technology, that changes, you know, how we connect with the world? Nope. It's not. Alright.

Jordan Wilson [00:53:48]:
That's it, y'all. That is our Hot Take Tuesday. Apple's AI announcements: the good, the bad, and what no one's talking about. Hope you enjoyed this one. If you're listening on the podcast, appreciate your support. Make sure you check out the show notes. There's a link there. You don't even gotta text Siri.

Jordan Wilson [00:54:04]:
Right? Just check out the show notes. Go to our website, youreverydayai.com. Sign up for the free daily newsletter. We're gonna be recapping AI news, fresh finds from across the web, the top AI software. We do this every single day, Monday through Friday, bringing this to you live, unscripted, unedited. So if this was helpful, please leave us a review on Spotify, Apple, wherever you're listening. Share this with your friends, your coworkers. Tell someone who needs to know.

Jordan Wilson [00:54:30]:
Thank you for joining us. Hope to see you back tomorrow and every day for more everyday AI. Thanks, y'all.
