Ep 198: Midjourney V6 – What’s new and how to produce powerful ad creatives

Resources

Join the discussion: Ask Jordan and Rory questions on Midjourney

Upcoming Episodes: Check out the upcoming Everyday AI Livestream lineup

Connect with Jordan Wilson: LinkedIn Profile


Understanding the Power of AI in Producing Creative Advertisements

In the marketing and advertising landscape, there has been a notable shift toward utilizing Artificial Intelligence (AI) tools in the creative process. The applications of AI in artwork and visual creation have evolved significantly, offering unprecedented opportunities for businesses and creatives.

Enhancing Artistic Quality with AI

Initially, art generated by AI was easily identifiable due to its computer-created quality. However, as the technology has advanced, AI-generated art has become almost indistinguishable from images created by humans. Tools such as DALL-E and Midjourney have made it possible to generate polished, high-quality images.

In the commercial realm, this presents a unique opportunity. By selling products adorned with AI-generated visuals, businesses can create additional streams of revenue.

Integrating AI into Commercial Campaigns

Adopting AI in advertising strategies offers businesses an efficient tool to increase productivity and campaign effectiveness. However, it's essential to use AI responsibly and transparently. This includes understanding and respecting copyright laws, accurately attributing the origins of AI content, and using AI-generated imagery in a manner that aligns with industry standards and regulations.

Exploring New Frontiers in AI-Generated Imagery

Experimentation is key to unlocking the full potential of AI tools. AI can generate mood boards that effectively present new product lines and campaign concepts, giving brands a fast, low-cost way to test approaches and explore new ideas.

AI tools such as Midjourney offer promising advancements in creating bespoke visuals for businesses. Brands that use these tools alongside traditional methods can enhance their creative output without abandoning proven workflows.

Adapting to the Rapid Evolution of AI Tools

As with all technology, AI tools continue to evolve at a rapid pace. Staying abreast of these developments and adjusting business operations to leverage the benefits of AI is paramount. This constant learning process is akin to what businesses had to undergo during the advent of smartphones and other cutting-edge technologies.

Successful adoption of AI tools in business requires continual adaptation to this technological evolution. The sooner organizations integrate and adapt to AI, the more potential they have to lead in their respective industries.

Capitalizing on AI for Ad Creative and Commercial Use

AI offers a unique advantage for businesses looking to streamline and diversify their marketing campaigns. The notable improvements in AI's capabilities contribute to producing more engaging and visually captivating ads.

AI's ability to visualize a brand's vision and execute it with precision helps avoid generic visuals. Positioned as a supplement to traditional methods rather than a replacement, AI has received a positive response from many clients eager to leverage the technology’s potential.

Moreover, AI can be highly beneficial for cost optimization. AI-driven ad creative often yields results comparable to, if not better than, traditional agency work, at a fraction of the cost.

Visual Differentiation and the Future of Ad Creative

Creative advertisement doesn't stop at appealing designs. In an increasingly competitive market, companies need to inject creativity into their marketing materials to stand out and engage audiences. AI tools can streamline the production process, enabling a higher volume of visually differentiated marketing materials, with motion playing an increasingly important role.

The potential of AI in advertising is immense. It's time to embrace this technology and harness its power to unlock new frontiers in ad creative. Understanding AI tools, experimenting with features, and tailoring them to specific purposes can put businesses at the top of their game in the age of digital marketing.



Topics Covered in This Episode

1. Understanding AI Art
2. Application of AI Art in Business
3. Responsible Use of AI-Generated Imagery
4. Creating AI-Generated Ad Creatives
5. The Future of AI in Ad Creative


Podcast Transcript

Jordan Wilson [00:00:18]:
AI has changed the creative advertising game, and that's not hyperbole. That is actually happening live before our very eyes. Speaking of live, hey, thanks for tuning in. This is live. This is Everyday AI, your daily livestream, podcast, and free daily newsletter helping everyday people like you and me learn and leverage generative AI to grow our companies and to grow our careers. So if you are tuning in and maybe you wanna know a little bit more about Midjourney, about AI art, about AI-powered videos, about how to maybe grow your business with generative AI, especially on the creative side, you're in literally the best place in the world right now, because we're gonna be bringing on, here in a couple of minutes, someone who I think is one of the best out there in the game, bar none, and he's going to be walking us all through how to use Midjourney for more powerful and more effective ad creative. This is gonna be a good one, y'all.

Jordan Wilson [00:01:22]:
But before we get into that, make sure, if you haven't already, to go to youreverydayai.com. Sign up for our free daily newsletter. This is gonna be one you're gonna need to read to believe. We're gonna be sharing a lot of examples. This is a very visual show. So, hey, I'll throw out the warning right now. If you are listening on the podcast, first of all, thanks for your support. As always, thanks for making Everyday AI a top 10 tech podcast.

Jordan Wilson [00:01:45]:
We appreciate that. But you might wanna check the show notes, and we always leave a link back so you can watch the live show. You gotta see these visuals. I know words, but I don't have the words to describe the level of creativity that we're gonna be showing for you today live on the screen. So, before we get into this, make sure, as I always talk about, to sign up for the daily newsletter. We're gonna be recapping the news, the show, and more, but let's quickly go over what's going on in the world of AI news. Alright. So will ChatGPT be used to keep us safe from biological threats? Maybe.

Jordan Wilson [00:02:17]:
OpenAI just released a study on the potential risk of AI-enabled biological threat creation. So the study found that the current generation of large language models does not pose a significant risk, but OpenAI did caution that future models could become more dangerous. OpenAI's study found that GPT-4 only provides a slight improvement in accuracy for biological threat creation, and it often produces misleading results. Interesting. Hey, speaking of threats, government officials are demanding AI companies disclose safety results of their models. So the Biden administration has introduced new regulations for AI developers requiring them to disclose their safety testing to the Department of Commerce in order to ensure transparency and safety in the development and deployment of AI technologies. So companies must provide their safety tests to the Department of Commerce.

Jordan Wilson [00:03:15]:
So the US government is prioritizing AI as both an economic and national security issue and is exploring international regulation and hiring more experts in federal agencies. Tongue-tied this morning. Alright. And then last but not least, a new open source model has leaked, and it's pretty impressive. So there's a recent discovery of a seemingly new open source large language model on Hugging Face, referred to as miqu-1-70b. Alright? And its potential really was to rival or even exceed the current top performing model, which we've all heard of, this little model called OpenAI's GPT-4. So the CEO of Mistral confirmed the leak on Twitter slash X slash whatever we're calling it, saying it was actually based off of an older model of Mistral's. So if you follow the large language model game, and, you know, everyone's kind of been chasing OpenAI's GPT-4 for a long time, this is some news you're probably gonna wanna check out a little bit more of in the newsletter.

Jordan Wilson [00:04:16]:
Pretty intriguing. Alright? But you probably didn't tune in, if I'm being honest, you probably didn't tune in to hear the AI news. Maybe you did. However, you are probably here to get a masterclass in Midjourney. You know, we've talked about Midjourney a handful of times on the Everyday AI show, and we've had some guests before in the past, and I had to actually bring one of our former guests back for an encore performance, because Midjourney has changed so much in the past couple of months. A big jump from version 5.2 to now version 6. There's so much to uncover.

Jordan Wilson [00:04:52]:
I didn't wanna bore you with me rambling on, so we're gonna bring on an actual expert to help guide us through. So let's do that right about now. So please help me welcome to the show. There we go. We got him there. Rory Flynn, who is the founder of Systematic AI. Rory, thank you so much for joining us.

Rory Flynn [00:05:10]:
Thanks, man. I'm glad to be back. I'm glad to do this again.

Jordan Wilson [00:05:13]:
Oh, so good. So just, like, let's get the basics out of the way. Like, what is Systematic AI, and, like, what do you do as a Midjourney master?

Rory Flynn [00:05:23]:
Totally. Yeah, man. So Systematic AI essentially is a, you know, I call it an operational AI company. Now, again, there's a lot of tools out there, but how do you use them? How do you actually put them into your business and make them work for you? So, typically, we're just looking for, you know, businesses that have holes in their operations and their creative processes, and we're installing tools and processes to get them from point A to point B quicker, easier, with more explosive creative. So, simple as that. You know, we're doing training. We're doing consulting.

Rory Flynn [00:05:48]:
We're doing, you know, a number of different things there. But it's been pretty interesting with these tools and how they develop and the pace at which they develop. So, you know, there's been a lot of opportunity to inject this stuff.

Jordan Wilson [00:05:59]:
Yeah. And just, you know, for those people listening that aren't super familiar, right? Because I think when we talk about generative AI, I think the de facto thing that everyone thinks about is something like an OpenAI, right? The ChatGPT, or maybe they think of, you know, Google Bard. I don't think everyone necessarily even thinks about AI art, right, unless you're maybe in a creative field, obviously. Give us, like, the zoomed out version of, like, what the heck is even AI art? What is Midjourney, and why should we all be paying attention to it, even if we're not maybe a creative?

Rory Flynn [00:06:33]:
Yeah, man. So I think when you look at it, right, like, Midjourney, essentially, at its most basic form: you put in some text, you get an image. Right? Pretty simple. But, again, typically, you know, as marketers and people that have worked in a creative field, or just basically in any industry now, right, if you didn't have photography elements, you didn't have different brand assets, things like that, you just had to go to stock photography sites, and they all kinda suck. I'm gonna be honest.

Rory Flynn [00:06:55]:
They were like, you know, you just, you spend more time on there looking for something that fits your brand than actually, like, producing the work. So if you think about all the creative needs that you have in a business on a day-to-day basis, maybe you have some, maybe you don't, but there's social media. There's blogs. There's ads. There's presentations. There's storyboarding. There's this.

Rory Flynn [00:07:12]:
There's that. Right? So all of them require some visual piece. And how do we just get from A to B quicker? Because I think a lot of people are looking for just a finished product from everything that comes out of AI. Right? Like, I wanna push a button. I want everything to be done. Now granted, probably we'll get there. But at this point, we can also look at different pieces in our business operations as to where we get from point A to point B quicker by using the tools, instead of wasting time on ancillary tasks. So that's kinda how I look at it overall.

Jordan Wilson [00:07:42]:
And who should, you know, who should be using AI art? Right? Is it only people who are in, you know, the creative fields, photographers? Should noncreatives be using, you know, tools like Midjourney? Like, maybe what are some of the use cases when we talk about, you know, just the wide range of people that maybe could, or maybe if you could share some examples of, you know, people that maybe we wouldn't think, oh, they could use Midjourney for this use case. Maybe let's talk through those kind of, across different fields or different professions.

Rory Flynn [00:08:16]:
Totally. I mean, like, I think personally, designers are the most apt to use the tools. Like, you already know the technology. You already know how to make things visually stunning. You also know the terminology. Right? So think about prompting the same way as ChatGPT, right? You'd have the right tokens to get the right answers. Same thing with Midjourney. You know, if you know what a high contrast photo is or, you know, different opacity levels or things of that nature.

Rory Flynn [00:08:39]:
Right? Like, you've already been working in it. You can describe it, and that's what's gonna get you the good output. But I think, you know, in everyday life, right, you think about, okay, I'm gonna post on LinkedIn today. Do I just put a text post up, or do I put a crazy visual along with the text to stop the scroll? Like, something very simple like that. It doesn't have to be this whole, like, crazy production. It can be something very simple. I need an image for my newsletter that I'm sending out. Right? Maybe just to tie it in.

Rory Flynn [00:09:04]:
I'm talking about, you know, Burger King's new XYZ or whatever the hell they're doing. Right? Maybe I just need, you know, like, a cool image of a hamburger in, like, a 1950s environment, just to, like, make the newsletter pop a little bit more. So it doesn't have to be, like I said. You don't have to create ad campaigns. You don't have to create, you know, Hollywood movies. You can just do little things with it. It's just finding those little holes. Same way you probably look at ChatGPT.

Rory Flynn [00:09:29]:
Right? I have a question. Can ChatGPT answer it? I have a task. Can ChatGPT do it? I just think about it the same way in a creative sense. I know I need an image. Can Midjourney do it? Most likely, yes. So that's kind of how I think about it.

Jordan Wilson [00:09:42]:
You know, I wanna talk through my personal experience and get your take on this, Rory. So, you know, we've been using, you know, DALL-E, I don't know, 2, I think, originally, very early on, and we were using Midjourney, I think, in the fours or before. Right? But very early on, I'd say Midjourney, or AI art in general, was very AI art. Right? Like, you couldn't really, honestly, use it for anything, because the outputs, you know, at the time, looked very, you know, computer generated, or, you know, obviously very, you know, deformed, disfigured if you were trying to generate images of people or animals. But it's not really like that anymore, especially when we talk about the big jumps from version 5 to now we're in Midjourney V6. Can you talk just a little bit about the industry at large, and how, now, from a quality standpoint, like, where are we at?

Rory Flynn [00:10:42]:
We're pretty close with Midjourney, I mean, to being indistinguishable. I mean, I've produced things that people would never know were AI, and that's because, you know, there's a detail and a craft to it. I think a lot of the stigma around AI art is that it's kinda generic. But if you have a creative vision, you can execute it. It comes down to how you prompt. So I think, you know, again, there's certain tools, like DALL-E, that are really good for people that just need some basic stuff. It listens really well.

Rory Flynn [00:11:09]:
It produces really quality images. Midjourney is like taking that creative juice to the next level. Right? And there's other tools beyond Midjourney, Stable Diffusion, things of that nature, that can go even further than that. So it really just depends on what you need. Right? And I think, you know, the jump from Midjourney even, like you said, 3, which was kind of like, it was cool, it was novel, it was, like, nightmare fuel stuff you'd get, but, you know, to now, in less than, let's call it, like, 18 months, is insane, because I can produce things that are polished and quality ready. And then even if I needed another additional tool to make it look even more real, tools like Magnific, I won't go too far into there, but it's a really good tool. Or maybe something that has, like, an AI polished looking face, because sometimes you'll prompt that.

Rory Flynn [00:11:51]:
It looks like a robot. Right? Like, you get a face, it looks like a robot. Like, super smooth skin. Everything is perfect. You know, a tool like Magnific upscales it and then adds detail. So skin wrinkles, little hairs, freckles, you know, bags under the eyes. You're like, okay.

Rory Flynn [00:12:06]:
Now it looks like a human, not a robot. So, yeah, we're moving pretty quick, man, and, you know, this is just the level of quality from each jump in version, plus how quick it happens. You know, this development cycle, it's not like, you know, your new iPhone every year, and it's like, oh, the camera upgraded. It's like every 2 weeks, we get something that's like that level of, you know, that level of development. So I don't know how to extrapolate where it goes from here and how quickly it moves, but it's gonna move fast.

Jordan Wilson [00:12:35]:
Oh, gosh. Yeah. So fast that I would agree with Jay, joining us here, when he says fasten your seat belts. And I will tell our live audience right now, please get your questions in, because we're about to show some examples on screen, and you're probably, like, your mind's gonna be like, wait. What? Like, the whole point of Everyday AI is for us all to learn together from some of the smartest people out there. So real quick, we're gonna take one question here, and then we're gonna get into some examples, because I can't wait for our live audience to see this. But, great question here from Richard saying, other than replacing stock imagery with AI images from tools like Midjourney, anyone making money or use of it? So yeah.

Jordan Wilson [00:13:15]:
Because I think sometimes, Rory, I think people like me and you, we maybe live in an AI bubble. Right? And we just assume, like, oh, everyone's using, you know, Midjourney and DALL-E, but I think the vast majority of people probably don't. So maybe just talk a little bit here to Richard's question about, like, hey, are people really using this out in the wild, and are people making, you know, money and a career off of creating AI images?

Rory Flynn [00:13:38]:
Yeah. I mean, if you're purely just trying to make money off AI images and not, like, utilize it within your actual business process, I mean, there's a million use cases. I can go spin up a print-on-demand shop and sell t-shirts today in 20 minutes. Right? I have enough imagery there to go create a t-shirt shop like that. Do that for any sort of product, you know, on Printify. Right? Like, I'm not going that route, because it's just, like, again, that just bloats the market, and I don't think it's really, like, where the actual use cases are. I think it's really looking inside of businesses and seeing how to get from point A to point B quicker.

Rory Flynn [00:14:10]:
Right? So I think about how many presentations people have to do. Right? Like a presentation: instead of doing a boring PowerPoint with a white background and black text, you know, you inject a little bit more creativity in there, you have more engagement with your audience. If you're doing presentations for, you know, businesses, if you're doing different, or, sorry, proposals, things of that nature, giving a little life to it can be really helpful. I mean, like I said, I work primarily on the digital marketing, like, ecommerce side of things. That's where I've been for a lot of my career.

Rory Flynn [00:14:38]:
But think about ads, social posts, emails, blogs, everything that comes out from a marketing standpoint. How can we just visually differentiate ourselves? That's always what we're trying to do from a brand perspective anyway. So utilizing tools like this, and then even inside your internal processes. Right? Like, I think about getting, like, if you wanted to get to a photo shoot, right, you're a brand, you know, think about that entire process of what it takes to get to a finished product, maybe of a soda, and you wanna, you know, have a whole set of brand photography. I mean, you could essentially do it with AI, but, also, if you wanna do it with a photographer, right, gotta hire a photographer, number one. Then when you get to the photographer piece of it, you gotta direct the photo shoot.

Rory Flynn [00:15:17]:
After that, you know, or you gotta hire models, you gotta do all that kind of stuff. From then, you gotta get the revisions done, then you gotta get all the images polished. It takes a long time. But, also, you don't know how to direct a photographer. So if you give them a set of images and you're like, I wanna create it like this, this is how our brand looks, this is how we want everything to be positioned.

Rory Flynn [00:15:35]:
That process is just gonna be quicker. There's no, there's no thought process; essentially, you're outsourcing decision making, in a sense. Right? So it can be a lot easier internally to get things done with tools like this. So, long answer, but, I mean, I can go down that rabbit hole for hours.

Jordan Wilson [00:15:52]:
Oh, gosh. Yeah. Me and you both. Yeah. Good thing we tell people, yeah, like, oh, you know, Everyday AI is a 20-minute show, because, yeah, otherwise, me and Rory might actually talk for a couple hours. But enough of the chitchat, Rory.

Jordan Wilson [00:16:05]:
I, like, I need everyone to see the power. So especially when we talk about ad creative. Right? This first example that I'm gonna show is amazing. So I'm gonna have, Rory, I'm gonna have you, again, do our best to describe what we're seeing on screen here as I bring this up. So let's go ahead and take a look here. So before I hit play, because, yes, we're gonna talk about this, but there's this whole, you know, photo-to-video scene. Just quickly describe kind of this ad creative that we're gonna be seeing here.

Rory Flynn [00:16:40]:
Yeah. So this is what we've been testing lately. Now we've done a lot of ad creative with Midjourney and static, like, Meta ads. Right? You think about the Meta ads game. What it is is not necessarily as much of a quality game as it is a volume game, because you need to run a number of different ads to actually get data, to then pick winners, and then iterate off of them. So at the same time, you know, obviously, having good brand assets is gonna be essential, but supplementing and getting more volume, so that you have more ability to test, so that you can make better decisions and then keep, you know, progressing from there. That's what this is all about. So, typically, we've done a lot of this with static, and now we're just starting to test motion. Because even subtle motion, even if it looks fake.

Rory Flynn [00:17:20]:
It might stop your scroll. Again, we're talking about social media marketing here and, you know, paid marketing. So this isn't, like, on a... this isn't gonna be on Times Square, you know, on a video board. This is gonna be something that you're gonna see in your feed, you know, in your stories on Instagram, things like that. So what we did here is, like, we had static assets, and this is just a mock-up, so everybody knows this is just a total mock. We're testing different things here. Again, just to give a little subtle motion, because, again, if that stops you in your tracks and you say, what is that? That's a win, right, instead of people just scrolling right by it. So, again, sometimes the variable that people are looking at is different.

Rory Flynn [00:17:54]:
It's not necessarily, like, is this a Super Bowl ad? Is this a million-dollar campaign? Or did this take me 5 minutes to get data? Like, we're on the 5-minutes-to-get-data side of it. Right? So that's kind of how we look at this stuff.

Jordan Wilson [00:18:08]:
So good. Alright. So let's go ahead. So, again, we have a mock-up campaign made from Midjourney images, and we'll talk about the process and the other tools, but let's just go ahead and play this, and then we'll kind of talk through it, this little 7-second clip here.

Rory Flynn [00:18:26]:
And yeah. So what we're seeing here, too, is just, basically, we're adding subtle motion to the images. We're not creating, like, full-on Michael Bay action scenes. It's just a little bit of motion, a little rotation, a little bit of zoom in, a little bit of, you know, tilt in the camera, just to get you, again, from a visual perspective, to stop you in your tracks and go, wait, what? You know, that's all we care about. So that's the idea.

Jordan Wilson [00:18:49]:
And the quality here is insane. So for our podcast listeners, we have 3 different photos that were created in Midjourney, all of which look part of this, you know, kind of fictional Marc Jacobs campaign, but, you know, similar style and similar lighting. It starts with a photo of a model, you know, with a nice white blazer, leaning in. So, again, it looks like we have motion. It goes then to a, like, very high-end looking purse. The details are exquisite. I mean, we have shadows. We have lighting. We have, you know, rule of thirds going on.

Jordan Wilson [00:19:24]:
So it looks like, you know, it's almost on a dolly. Right? Like, not DALL-E, like the image generator, but like a dolly that you would use in professional filming. And then the last shot that we have is a watch. Right? And this watch, again, highly detailed. We're getting motion. I mean, when you look at this, it looks like it took maybe 10 to 15 humans, you know, a couple of days, and probably a 6 or 7 figure budget, easily, to produce what in theory could be a little 7-second, you know, clip here. So, Rory, walk us through just real quick what this process looks like, because a lot of people, when they think Midjourney, when they think AI images, they don't ultimately think what we have here, which, as you said, is an example of a scroll-stopping social media visual. Right? This is a short video with movement.

Jordan Wilson [00:20:16]:
So quickly talk everyone through the process of how this even works, to get from idea in Midjourney to something like this.

Rory Flynn [00:20:25]:
Totally. So, I mean, you know, this wasn't a 7 figure budget. This was me in my sweatpants on my couch in 30 minutes doing this. Right? So, you know, from the process standpoint, I was literally playing around with some level of photography. Right? I wanted to have some sort of studio photography to see what, you know, what we could do with it, how close we could get it to studio photography. And, again, I'm thinking about, like, Marc Jacobs and their kind of bold look, direct flash photography, things like that. So from here, it was really just like, oh, I can get a good look. Can I animate it too? So, basically, in Midjourney, what I'm doing is I'm utilizing a very specific prompt set, and how I'm doing it is I'm being very specific in terms of the look, the feel, kind of the tone, the colors, so there's a cohesive look, and then I can just apply different subjects or objects to it. So instead of it being a woman in a white suit, it's a, you know, handbag, or it's a watch.

Rory Flynn [00:21:16]:
So once I have those image sets down, then, typically, I would go to Runway. I mean, normally, I just stop at static. Right? I'd go build some templates. I'd create, you know, the ad. What we're doing now, again, like, testing subtle motion, I would take these images from Midjourney into Runway, give it a little bit of motion to see what it looks like, then, you know, utilizing different upscalers like Magnific for photos or Topaz Labs for video, and then building templates in Figma, putting it all together, and watching it come to life. Now, again, this is not perfect. So anyone in here trying to nitpick, you know, that one of the clock hands is a little bit, you know, a little bit not straight, like, I get it, but this is a concept.

Rory Flynn [00:21:57]:
Right? This is what can be done. These tools are not fully developed yet, like Runway. Very good. Give it 5 more months. I mean, it's probably gonna be indistinguishable. So, I mean, there's other video platforms too.

Jordan Wilson [00:22:09]:
So, people don't, yeah, like, Rory, people literally, I don't think, understand how powerful this is. Like, I remember telling people, you know, a year and a half ago, like, hey, look at this, like, AI image. Look at this. Look at that. And I'm telling people, like, more than a year ago, this is how we are going to be seeing ad campaigns in the future, 100% AI, like, 100% AI generated. Is that, like, just quickly, because I wanna get into more examples.

Jordan Wilson [00:22:36]:
Is that where we're headed? Like, are we gonna get there? Is that the future of, you know, ad creative, with things like you just showed here, using, you know, 3 or 4 AI tools, in your sweatpants, and in a matter of hours getting this level of quality? Is that where we're headed?

Rory Flynn [00:22:51]:
I think so. I think you're starting to see it. I think you're probably gonna, you know, if I had one prediction, probably see a Super Bowl commercial this year with some level of AI integration into it. Kanye West just released a trailer for his new video. I'm not, technically, a real Kanye West fan, but I saw it, and I was like, that's Runway. 100%. I knew it right away. So people are starting to utilize it, and whether it looks totally polished like this, or can be used in a more artistic way, you're gonna see it, because it's too cool not to.

Jordan Wilson [00:23:20]:
Right?

Rory Flynn [00:23:20]:
You know what I mean? So

Jordan Wilson [00:23:22]:
Exactly. And now I miss old Kanye. Okay. So let's talk a little bit about Midjourney V6, and let's talk a little bit about what's new, because, you know, Rory, one thing that you said when we were walking through that Marc Jacobs kind of mock ad, and, again, if you're listening on the podcast, you gotta go and watch this one so we can see the visuals, but, you know, you talked about the power of prompting. You talked about the importance of having that specificity and that consistency in your prompting language from, you know, shot to shot. So now we're, on the screen here, you know, talking about prompt coherence, because that's something that's really changed in the new Midjourney V6, and there's kind of, you know, the difference of a similar prompt, with a woman kind of smoking a cigarette, you know, in front of some plants, with some great shadows and great smoke.

Jordan Wilson [00:24:18]:
But talk a little bit, you know, about how some of these updates in Midjourney V6, specifically, you know, handle our written words and our prompts a little bit better.

Rory Flynn [00:24:29]:
Yeah. So there was always kind of a gripe with Midjourney about its coherence. Like, it wouldn't listen to everything in your prompt. So with Midjourney V6, the model became more literal. Right? So literal meaning that, like, it's more literal now. Like, it'll only do what you tell it to do. It won't go and fill in as many blanks as, you know, it used to. So what I've done here is really test the limits of the coherence, meaning, like, let's get super, super, super granular, and almost write a prompt like a screenplay.

Rory Flynn [00:24:56]:
Think about telling it a story. Like, with this prompt specifically, I went as deep as saying, you know, again, who's the person, what they're doing, what direction they're facing, you know, what color shirt they're wearing, what, you know, headband they're wearing, what color lipstick they have. You don't have to go this overboard, but if you want supreme control, especially if you're doing something, you know, more branded or more of a feel like that, you can get it. So, I mean, even, you know, if you look at this entire prompt, it's probably 60, 70 words. We're breaking it out very specifically and telling it exactly what we want, like, what the position is. Even, you know, is the background blurred? What is, you know, what is happening with the smoke, the lights interacting with the smoke? Right? If you say these things, it's gonna happen. So, really, it's just getting your vision nailed down, and that's where the art is, and the prompt is, like, saying, this is how I want it to look.

Rory Flynn [00:25:44]:
How do I describe it? So copywriters, too, are probably gonna have a really good opportunity to use these tools, because you can describe things. You know? That's really what it comes down to.

Jordan Wilson [00:25:53]:
And talk a little bit real quick, because I think it's actually, even though we're talking semantics of, you know, prompt coherence, it's actually, I think, extremely important, because, you know, one of the, I think, one of the biggest hurdles to clear, or obstacles, when it comes to integrating generative AI, is the bad information out there, or it's the people who are looking for shortcuts. And I think, especially with earlier versions of Midjourney, you know, people would try very long, detailed prompts like this, and the prompt handling wasn't that great. Right? But now, you know, like what Rory's talking about, I'm not gonna read this whole prompt here, because it would take a while, but I'm gonna just spit out, like, a couple of the first sentences. So it says: satisfied Brazilian woman laying in bed with a cigarette and exhaling; subject position: facing right; photo type: medium shot, editorial; subject focus: sharp focus, midground; wearing: muted blue oversized t-shirt, headband, red lipstick; environment: cozy favela bedroom, houseplants. And, right, that's, like, a third of the prompt. But earlier versions of Midjourney, or even current versions of DALL-E, don't do very well with this very long, very detailed prompt handling, and people do test. Right? And they say, oh, you know, this version got, you know, 70%.

Jordan Wilson [00:27:12]:
But what we see here on the screen, Rory, is you have all of those key terms, right, that you used, and it looks like you have about 20 or so kind of key terms here. Midjourney V6 hit each and every... right? There's a green check mark. It hits every single one. Right?

Rory Flynn [00:27:30]:
Yep. And that's the thing is, like, the new structure that I've been playing with is an old version 4 structure, where it was really like you had to tell version 4 exactly what you wanted it to do. So, you know, in version 5.2, you could just write, like, a few terms, and it would kind of create the whole image for you. Right? But you had less control over that. Midjourney took more liberty. Now I'm going back to the older structures, where it's like, okay, who is the subject? What are they doing? What are they wearing? What position are they facing? You don't have to do all this. This is just, I want something very specific. Mhmm.

Rory Flynn [00:28:00]:
You know, what's going on in the background? Again, being very specific, like, almost a screenplay. Like, a screenplay writer would set this up for a director. You know? So that's kind of how I'm looking at prompting, and it's getting great results, because, again, it just knows what to do. So even, you know, think about it like ChatGPT prompting too. What we've done a lot of the time now is we're limiting those tokens. Right? Like, just limiting the tokens, getting rid of the fluff words, getting right to the meat, and it'll read that in a very specific way and then produce the way you want it to.
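
For anyone who wants to reproduce that screenplay-style structure, here is a minimal sketch (not from the episode) of how the labeled building blocks Rory describes could be assembled into a single Midjourney prompt string. The field names, example values, and parameter flags are illustrative assumptions, not his exact workflow.

```python
def build_prompt(blocks: dict, params: str = "--ar 4:5 --v 6.0") -> str:
    """Join labeled prompt blocks into one comma-separated Midjourney prompt."""
    body = ", ".join(f"{label}: {value}" for label, value in blocks.items())
    return f"{body} {params}"

# Blocks mirror the granular structure read out on the show: who, position,
# photo type, focus, wardrobe, environment. Values here are illustrative.
prompt = build_prompt({
    "subject": "satisfied Brazilian woman lying in bed with a cigarette, exhaling",
    "subject position": "facing right",
    "photo type": "medium shot, editorial",
    "subject focus": "sharp focus, midground",
    "wearing": "muted blue oversized t-shirt, headband, red lipstick",
    "environment": "cozy favela bedroom, houseplants, soft window light",
})

print(prompt)  # paste the output into Midjourney's /imagine box
```

Keeping the blocks in a fixed order makes it easy to swap one field at a time, which is the control Rory is describing.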

Jordan Wilson [00:28:32]:
Oh, gosh. I could just talk about this one forever, but there's more things we have to get to here. So, you know, especially when we talk about, you know, creativity, right, and ad creative, this example is fantastic. For our podcast audience, this is a mock-up for, you know, Uber Eats, and it looks like we have 4 very distinct, different, you know, campaigns for different types of food, and each one has about, you know, 4 or 5 different ads within this ad set. So I'm gonna zoom in a little bit here. Maybe, Rory, if you could just walk us through, well, we'll zoom out first, and then we'll zoom in.

Jordan Wilson [00:29:09]:
But walk us through the process and maybe how people can put something like this together in Midjourney, and why they may want to.

Rory Flynn [00:29:17]:
Yeah. Well, this is, you know, we've done this for a few brands. Right? Like, I can't name them specifically, because, again, there seems to be a stigma around AI, and people not wanting it to be known that, you know, they're utilizing AI. So this is a mock-up, essentially, of how we've been utilizing it, especially for ad creative, because ad creative, again, like I said, it comes down to a volume game, and you need as much as possible. So with things like this, you know, there's really an importance in being very meticulous, but also, like, really looking at this and saying, how can we get more localized in terms of our ad sets? So maybe a larger company, you're running ads in multiple countries. Sometimes, you know, you wanna talk to an audience in South America or Asia, and all your models are American. Right? Like, maybe it doesn't fit the same way as having someone that's hyperlocalized, or even in the language, too, with ChatGPT.

Rory Flynn [00:30:05]:
The difference between someone that talks, you know, New York vernacular and Charleston vernacular, totally different. So if you can change that up and be a little bit more tone-specific and localized, your ads are gonna hit better. So, again, things like this, this is where, you know, we can be, again, like, even with food. Right? Like, maybe a ham or a chicken sandwich doesn't hit the same way in South America as it does in the US. Maybe we're utilizing something different down there, you know, empanadas, things like that, whatever it might be. I'm just throwing out examples off the top of my head. So getting more specific and having the ability to be versatile, but still staying on brand, that's where this stuff can be, you know, very, very impactful.
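
To make the localization idea concrete, here is a minimal sketch (not from the episode) that crosses locally relevant subjects with one fixed brand-style block, so every market gets an on-brand prompt with its own food. The market names, dishes, and style wording are illustrative assumptions.

```python
# One shared style block keeps the whole ad set on brand.
BRAND_STYLE = (
    "studio food photography, bold direct flash, high contrast, "
    "centered composition, vivid brand-green backdrop"
)

# Locally relevant dishes per market; swap in whatever actually fits the region.
regional_dishes = {
    "US": ["smash burger with fries", "fried chicken sandwich"],
    "Brazil": ["beef empanadas on a wooden board", "pao de queijo basket"],
    "Japan": ["katsu curry bowl", "salmon nigiri set"],
}

def localized_prompts() -> list:
    """Return one prompt per (market, dish) pair, all sharing the same style block."""
    return [
        f"{dish}, food delivery ad for the {market} market, {BRAND_STYLE} --ar 1:1 --v 6.0"
        for market, dishes in regional_dishes.items()
        for dish in dishes
    ]

for p in localized_prompts():
    print(p)
```

The point is the volume game Rory mentions: one style block, many localized variants to test.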

Jordan Wilson [00:30:42]:
Speaking about staying on brand, great comment here from, who people say is the other Bash Brother of Midjourney, so shout out to Drew. Yeah. Former guest. I think Josh from the comments here maybe called you guys that. So Drew is asking a very important question here. So, talking about consistent styling.

Jordan Wilson [00:31:03]:
Right? Because when we're seeing what we're seeing on the screen here, you know, I know it's, you know, kind of like the same image, but that's important as well, and we're gonna get to that. But maybe let's jump straight into this and maybe talk about consistent styling. So, in this example, you know, there's, you know, 6 different photos, and they kind of have this dark vibe, dark feel, you know, on a beach, and, you know, someone surfing, kind of, it looks like, in the Arctic. But this is a great example of having a consistent style or a consistent tone. So maybe if you could, Rory, walk us through kind of this photo set and get to Drew's point about, like, how do you even get this consistent styling in Midjourney?

Rory Flynn [00:31:41]:
Oh, what's up, Drew? So, yeah, this is good. And think about this, again, even this process, this is something you can utilize for storyboarding, you know, with an agency, storyboarding campaigns. This doesn't have to be a finished product. But, again, consistent styling, I think about it in terms of Midjourney prompting. You wanna break an image down into visual building blocks. So you have a subject. You have an action. You have lighting. You have composition.

Rory Flynn [00:32:02]:
You have a color scheme. You have different elements. Right? Like, those are all little visual building blocks. Now, typically, like, the composition and the color scheme and, you know, a number of different, like, details and modifiers, that's how you're gonna wanna stay consistent. So especially something like this, we're going with, like, a dark atmosphere. We wanna have a specific color scheme. We wanna have different elements of, you know, I wanna have black mountains and black sand.

Rory Flynn [00:32:25]:
Like, I'm putting all that in the prompt. Now, again, when you wanna consistently style it, sometimes all you have to do is then change the subject. So then it goes from, you know, high angle shot of a beach, to close-up of a surfer with frost on his mustache. But then at the back end of the prompt, it's still, you know, black sand, black ocean, dark atmosphere, you know, high contrast, things like that, muted color tones. So when you're able to just find that prompt structure that works, and those building blocks, then you can just iterate at will, because then the consistent theme is gonna come through. And that's how you can really start to build these worlds or ideas around one, you know, certain vision, if that makes sense.
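
As a minimal sketch of that consistent-styling approach (not from the episode), the snippet below varies only the subject while the composition, color, and atmosphere blocks stay fixed at the back end of every prompt. All wording and parameter flags are illustrative assumptions.

```python
# Style blocks that never change across the set, so every image reads as one world.
STYLE_BLOCKS = (
    "black sand beach, black ocean, dark moody atmosphere, "
    "muted color tones, high contrast, cinematic lighting"
)

# Only the subject block changes from image to image.
subjects = [
    "high angle shot of an empty beach at dawn",
    "close-up of a surfer with frost on his mustache",
    "wide shot of a lone figure walking toward black mountains",
]

campaign_prompts = [f"{subject}, {STYLE_BLOCKS} --ar 16:9 --v 6.0" for subject in subjects]

for prompt in campaign_prompts:
    print(prompt)
```

Because the style suffix is identical in every prompt, the set comes out cohesive even though each image has a different subject.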

Jordan Wilson [00:33:05]:
And so, you know, a question here, which I think maybe we should have addressed sooner, but it's probably something that so many people are thinking, is the question here from Sajid asking, can we use Runway and Midjourney for commercial purposes? Because if you're watching on the livestream, like me, even though I see this AI image generation stuff every day, I'm like, holy cow, this is crazy. But maybe if you're listening on the podcast and this is maybe new to you, it's something you're thinking, like, oh, yeah, can you be using these amazing tools for commercial purposes? I know it's kind of a gray area, Rory, but what's your, you know, what's your response to using this for commercial purposes?

Rory Flynn [00:33:49]:
I've done it. I mean, we utilize it, but we do it smart. You know, we do it in a way where you have to be cognizant of all the different pieces that are going around it. Would I use this for a Super Bowl ad? Probably not. Am I using this for a, you know, giant billboard ad? You know, probably not. On, you know, Meta ads and social media marketing right now? Definitely. Right? So, things you gotta be aware of, number one, is that we don't know what these tools were trained on. We don't know what DALL-E was trained on.

Rory Flynn [00:34:12]:
We don't know what Midjourney was trained on. We don't know where that data came from. So, utilizing, a lot of the times, reverse image search to check, to make sure, if I pulled up one of these images before I put it out commercially, that it doesn't come specifically or derivatively from some other work. Right? That's one thing. And, also, just from an ethical standpoint, I don't like to use artist names in my prompts, ever. Like, any sort of artist names. Right? It's just, again, you don't need to do that.

Rory Flynn [00:34:37]:
And I think a lot of people do that, in the style of, commercially. Right? Like, do it for fun, you know? I don't know if that still crosses boundaries. I just don't do it. But if you wanna go look up someone, like, you've seen a lot of this stuff on Wes Anderson, like, a style of Wes Anderson. Right? If you wanna know how Wes Anderson composes his imagery, you can go look it up on ChatGPT. You can ask what kind of compositions, color schemes, lighting, effects, and you can add that to your prompt. Then you can build out those visual building blocks, like I said, without having to go and be so, you know, blatant about stealing kind of the style. Now, again, it opens up a really long conversation, but things like that, yes.

Rory Flynn [00:35:14]:
And then also just being very cognizant of what the copyright laws are within the country that you're in, like the US. Everything that's produced fully by AI can't be copyrighted. So, essentially, it is your own personal stock photography that anyone else can use. Same thing in these other countries now that are, you know, utilizing different tactics. So I'm not versed in every single country. I just know that's what the US says. So always something to be cognizant of is, yes, we are using this. But, again, where did it come from? What was it trained on? How is it generating? So that you can be protected and be ethical in what you're doing.

Jordan Wilson [00:35:49]:
Yeah. And I will say this without getting us too far, because, like, we need to see some more of these creatives, because it doesn't make sense how good they are. However, like, I do always caution, like, I always caution people, read terms of service. Even personally, Midjourney, they had a big update. I talked about it on the show about, you know, a month or so ago. Not a huge fan of Midjourney's new updated terms of service. But, like Rory said, use this smart. Use this responsibly, especially if you are using it for commercial campaigns. Use your best judgment.

Jordan Wilson [00:36:22]:
It's important also, if you are working with an end client, that you use transparency. Right? Like, that you disclose when and if and how you may be using AI imagery. So I had to get that out of the way. But speaking of commercial campaigns, this one is silky. So let's talk about this, Rory. Maybe just describe a little bit what we have going on here, and kind of the process, and how this can ultimately be used for ad creative.

Rory Flynn [00:36:48]:
Yeah. So I came from the product licensing world. Essentially, what that is is you're putting two disparate brands together and creating a new product line. So that's kind of how I thought about doing this. This is a mood board. Right? But, like, to me, producing this for a presentation to pitch this, say, maybe I was representing Jeep, and I wanted to pitch this to Patagonia, this would have been months to get this done, or actually look this good. I did this with one prompt.

Rory Flynn [00:37:10]:
Right? So we're collaborating, taking the two brands, putting them together. Now we have, like, a whole mood board, essentially, of a look and feel. This could also be for a campaign. This is just, again, like, think about these processes: if you're working in the design field, it's just, how do we get from point A to point B, and how do we make it smooth, and how do we tell the story? Because I think, at the end of the day, if you can't tell the story, then what's the point? So that's kinda where the ideation came from for this, and that's where a lot of the stuff I do comes from.

Jordan Wilson [00:37:35]:
So good. So good. Alright. Let's look at one more, because, I know, you know, I was telling Rory before the show, oh, we'll totally be 30 minutes. We can't. My gosh. This stuff's too good. So this is a fun one.

Jordan Wilson [00:37:45]:
Right? And obviously going back to our other points. So this is, you know, Nike shoes and very, you know, futuristic, you know, bright neons. But talk a little bit about this one, and, obviously, this isn't one where you would, you know, commercialize this, Rory. But talk about just some of the levels of creativity, of how you got to this kind of Nike NASA collaboration mood board here, and also maybe some advice for others on what you learned, you know, from even creating a mood board like this, and how they can use, you know, some of your findings or learnings.

Rory Flynn [00:38:23]:
Yeah, man. So something like this, this was initially when version 6 came out. What I really wanted to do is just push it to the limit. A mood board, when you think about it, for an art generator, is really hard, because it's a million different elements. There's a lot of different textures, styles, colors, things going on. So, basically, that's always my, like, test: how far has the new version gotten? And this one, I specifically just love Nike branding. I mean, you know, who doesn't? Maybe I'm in the minority there, but whatever.

Rory Flynn [00:38:49]:
I always like to see, like, Nike and NASA, because it's two disparate brands. Right? Like, space and earth. Right? It's two totally different pieces. How can we blend them together and make it cohesive? So here, you know, you see a lot of different textures, a lot of different colors. You have the technical theme of NASA. You have the, like, street vibe, you know, like, the streetwear vibe of Nike, kinda blending together in a very cool and unique way. So I like to test these different things to see how good Midjourney gets, you know, basically each release.

Rory Flynn [00:39:19]:
And this one was just really fun. I mean, like, I just love using those two brands together. If you ever wanna go try that, anyone, Nike and NASA, it comes out great basically every single time. So that's kind of the thought process behind just having a little fun with the tools too. You know?

Jordan Wilson [00:39:37]:
So good. So good. Alright. Here, we're gonna do this before we wrap, because there's so many good questions, and that's what I love about the show, is smart people can answer questions. Let's go rapid fire here, Rory. We'll see if we can get through a couple of them quickly. So Alar is asking: I'm focusing heavily on creating short ads for brands.

Jordan Wilson [00:39:55]:
What's your thought on this approach?

Rory Flynn [00:39:57]:
I think it's useful, right, depending on what your source images are and how you're getting to that end result. I mean, tools like Runway can also utilize, you know, existing product photography. It doesn't have to be just generated images. Like, you can utilize your static images and animate them. So I think it's a great idea, especially if they're open to testing things like this, because why not? You're gonna have a lot more video tools coming out other than Runway, you know, over the next 6 months, you know, Stable Diffusion video tools, things like that. It's gonna get way easier to do this.

Jordan Wilson [00:40:24]:
Yeah. Huge. And, yeah, we talked about that earlier, just the process of, yes, Midjourney images are great, but then being able to bring them into another program like Runway and add, you know, visuals or add movement, that's where it becomes super powerful. Alright. Next. Rapid fire. Here we go.

Jordan Wilson [00:40:39]:
Richard asking, don't AI tools, text to pixel, just lead to everything becoming just a bit generic?

Rory Flynn [00:40:45]:
I think that depends on who's prompting it. You know? Like, I don't know if I've seen another Nike image that looks like that. I mean, you can go try to find it and search on Midjourney, but I'm not saying I'm the best or anything. It's just, my vision's different. Right? Like, I think that's what it comes down to. So, you know, if you can get different with your vision... if you just say the same thing every time, like, girl on a beach, like, you're gonna get things that look generic. Or you're gonna say, girl on a beach, you know, with the sun half setting, reflecting purple light against the water, blah blah blah.

Rory Flynn [00:41:12]:
But it's gonna look totally different than just girl on a beach. So I think it all comes down to what the vision is and executing on it.

Jordan Wilson [00:41:19]:
Perfect answer. Alright. Anthony asking, I wonder what the reception everyone is getting from clients to start integrating this into their projects. What's it been like for you so far?

Rory Flynn [00:41:27]:
I think it's been great, because I'm using it as a supplement, not the entirety. Right? So, like, again, it's a supplement. I like to tell everyone, you know, it's not just like, let's scrap our agencies, let's scrap our photographers, let's scrap everything. It's a supplement. We can test it.

Rory Flynn [00:41:43]:
I think every good business is positioning. Right? Like, we've tested some more AI-looking stuff with a little bit more focus on specific areas. And then, like, it starts to shift the perception, because that stuff hits on social. It's like, okay, let's maybe make our ads look like that. So, organic to paid. So it's interesting how it's developing, but, I mean, again, like, it's always a client-by-client basis. Everyone thinks about it differently.

Jordan Wilson [00:42:15]:
Cool. Tara asking, how would you compare this to Adobe's AI tools, their Firefly?

Rory Flynn [00:42:22]:
I like Adobe. I use Photoshop generative fill for a lot of different things, depending on what... sometimes I pair the two together. You know, I think there's a really good thing for anyone that's in design or needs vector images now in Adobe Illustrator. I don't know if many people talk about it that much, but there's a text-to-vector option. So this is something I wanna drop here, because it's super helpful if you're looking for icons or little typography, things like that. Like, text to icon or text to... sorry, I just totally lost my train of thought there. Text to vector is, like, massive, because that used to be a pain in the butt on AI, or on Adobe Illustrator, you know, creating vectors.

Rory Flynn [00:43:03]:
So, yeah, just think about that one too. Really helpful.

Jordan Wilson [00:43:06]:
So, so good. Alright. Here, we got two or three more. So Alar saying, you mentioned that you don't use artist names. What about brand names, like you did with the Nike shoes?

Rory Flynn [00:43:14]:
Yep. I'm not putting that out for commercial use. Right? Like, that's just basically showing some examples of what can be done. Now maybe that's wrong. You know, again, I typically tend to utilize that for my own personal use. I'm not gonna go and put that out commercially. So, yeah, maybe I need to think about that one a little bit more, but, you know, consistently, I've just been doing that for fun, to test the tools and to see kind of what can be produced from it.

Jordan Wilson [00:43:36]:
Yeah. And, obviously, you're not alone in that one. Alright. Mauricio asking, if you were to offer creative ads with AI, how does that pricing differ from a real marketing agency? Great question.

Rory Flynn [00:43:46]:
Again, I think it comes down to how much you want and what the output is. Because if the output is the same, like, you know, if, basically, we're creating the same amount of ad sets with humans and with AI, like, I don't see the difference there from a pricing standpoint, especially if the performance is gonna be able to be judged a lot quicker, and you're gonna get more data and access, you know, to optimizing those ads. So I think, you know, a lot of people think that AI is gonna drive the price down. Personally, I don't think so, because it's gonna help performance if we have more data and are able to optimize quicker. So the value of that is just as important as the creative itself. Right? Mhmm. So I think, again, it's gonna take a little bit of positioning, if you're experienced with the tools and experienced with, you know, the outputs, to be able to position that to potential clients. But at the same time, if you're producing results at a much quicker rate and at a much higher clip, again, your value is still that, not just the creative.

Jordan Wilson [00:44:42]:
Alright. And then our last of our rapid fire questions, from Alar again: what's your thoughts on postproduction, like adding some unique elements, when, oh, at that point, it's not fully made by AI? Yeah. What's your thoughts on that?

Rory Flynn [00:44:53]:
100%. 100%. Like, you know, I do a lot of this stuff in Figma. You saw the Marc Jacobs ads. You know, I do the Uber Eats ads in Figma. I create templates in there, drop the images in. You know, even the after effects in terms of, like, building different pieces into the ad sets, like, if you're a graphic designer. Right? Like, you don't need to use a full image.

Rory Flynn [00:45:11]:
Like, I need to have all this text in there. Can Midjourney generate all of it? No. Just generate the image and then add the text yourself. So, you know, this, again, can be done a lot quicker than slow roasting, you know, a pot roast for 5 hours. You know? This is like the Instant Pot version, or the air fryer version; we do it in, like, 30 minutes. Right? So, like, that's kinda how I look at it.

Jordan Wilson [00:45:40]:
So good. Alright. We made it through rapid fire. I'm taking up so much of your time, but I have to ask you, Rory, as we wrap everything up here. Right? Because the premise of this is Midjourney, you know, kind of the combination. Right? So if I were to summarize everything, I think that the AI image and the AI creative space has changed so much in the past 6 months, with the giant leap in quality and control going to Midjourney V6. And I think that Midjourney right now is so far ahead of all the other AI image generators, and then combining that with the new controls and capabilities. We didn't even get into, you know, multi-brush and Runway.

Jordan Wilson [00:46:20]:
We'll have to do that on another show, but you combine these huge improvements in quality and control of Midjourney V6 with being able to get consistent images, and then being able to create movement with these AI images by using an AI video tool like Runway. In my opinion, this changes the way that ad creative is done. So, Rory, with all this in mind, maybe what's your either piece of advice or, you know, your biggest takeaway for people when it comes to AI's influence on ad creative in our society in 2024? What's that takeaway?

Rory Flynn [00:46:56]:
I think the biggest piece of advice I can give is learn the tools now, while they're developing. Because I think about this quick analogy: if you gave, you know, your grandparents an iPhone right now, and they'd never learned it, never been exposed to it, it'd look like hieroglyphics to them. Right? But us, we've been with every iteration of the iPhone, so we can just pick it up and know how it works. That's how I think about using the tools and learning the tools, because they're only gonna develop. And the more expansive they get, the harder it is gonna be to just pick it up and use it. Right? So that, to me... and I think about looking at the future of marketing.

Rory Flynn [00:47:27]:
Right? Like, is everything that we're doing right now from a creative standpoint the end-all be-all? Like, is it always gonna be email? Is it always gonna be websites? Like, I think about the Vision Pro coming out for Apple. That's gonna create immersive environments. Right? You're gonna be able to have an immersive environment from a brand marketing perspective. Right? What does that look like? You know, are you gonna be able to do that with conventional tools quickly enough to get to market, to compete? Like, this is why I don't sleep at night. So, you know, this is why I don't sleep, because I'm just, like, I see Vision Pro, and I'm like, what? Like, what does that do to everything? So, you know, without me going on another diatribe for 40 minutes here, you know, I'm kind of at that.

Jordan Wilson [00:48:05]:
Oh, man. Well, hey, if you do wanna hear, like, 40 more minutes' worth of content, that's what our newsletter is. So, Rory, thank you so much for joining the Everyday AI Show and walking us through what's new in Midjourney and how we can use that with our ad creative in AI. We so much appreciate your time and your insights.

Rory Flynn [00:48:26]:
Yeah, man. It's great, great being back. I appreciate you having me.

Jordan Wilson [00:48:29]:
Oh, always. And, hey, there's so much that Rory just showed us. This is literally, we just got a university 101, 201, 301 course on AI art. But if you missed it, yeah, there was a lot going on, visuals, prompts. We're gonna be sharing it all. Don't worry. You know, Rory has just a wealth of information. We're gonna be sharing more about his company, some of the, you know, courses and, you know, guides that he has. Make sure to check out the newsletter.

Jordan Wilson [00:48:56]:
It's all gonna be there. Go to youreverydayai.com. Sign up for that free daily newsletter to check that out. Also, make sure to join us tomorrow as we talk about maximizing the effectiveness of AI in health care with the president of the American Medical Association. Alright. We'll see you back tomorrow and every day for more Everyday AI. Thanks, y'all.

Gain Extra Insights With Our Newsletter

Sign up for our newsletter to get more in-depth content on AI