Episode 5

Exploring the Ethics and Impact of AI in Content Creation: A Conversation with Deevo Tindall

In this episode of Digital Coffee: Marketing Brew, host Brett Deister delves into a thought-provoking conversation with guest Deevo Tindall about the impact of AI on content creation. From the use of film cameras to the differentiation of content creators, they explore the ethical considerations, potential job displacement, and the need for balance in utilizing AI tools. Join us as we dive into this intriguing discussion on the responsible and ethical use of AI in the ever-evolving world of marketing and content creation.

3 Fun Facts about Deevo:

1. Deevo Tindall used to be a coffee drinker but now primarily consumes tea.

2. Deevo Tindall operates businesses related to photography, brand messaging, and content creation in Charlotte, North Carolina.


3. The hosts prefer traditional book reading over audiobooks, citing a deeper understanding and personal attachment to physical books.




Timestamps:


00:38 - Are you a coffee or tea drinker?


01:44 - Can you summarize your expertise to our listeners?


03:02 - What are your thoughts on AI gaining more awareness?


05:09 - How is AI going to change content creation?


07:15 - Are we going to see fewer people knowing how to edit content?


09:13 - Is generative AI going to be a problem?


14:25 - Are we going to see fewer humans creating content?


16:44 - Will AI control the news?


20:05 - Will there be three groups when it comes to how much people use AI?


23:37 - Like Twitter, should there be badges for content creators?


25:44 - Talking about the ethics of using only AI for content creation


27:41 - The three different groups of content creators and how they use AI


30:39 - How should content creators respond to AI?


32:25 - Does there need to be a federation for content creators?


33:51 - Fun question: What AI would you like for someone to create for you?


38:39 - Where can people find you online?


39:01 - Final thoughts



Review the Podcast: https://pod.link/10557726


Contact Us!

If you want to get involved, leave us a comment!

Visit us and give us a ‘like’ on our Facebook page!

Follow us on Twitter.

Follow us on Instagram.

Join our Discord!

Email at bdeister@digitalcafe.media

Transcript
And welcome to a new episode of Digital Coffee: Marketing Brew, a once-a-month podcast about PR and marketing. But this time we're going to talk about content creation and AI, the thing that every marketer and everybody on the internet is talking about: ChatGPT and all the other fun stuff that goes with it. There's a lot of stuff out there, but with me is Deevo, and he is a content creator and also a fellow podcaster. I did interview him on a previous podcast called PR 360. It's just good to have him back on my show. So welcome to the show, Deevo.

I'm glad to be back, Brett. Thanks for having me back under a new umbrella.

That's right. And the first question, as for all my guests: are you a coffee or a tea drinker?

I am an adaptogenic. I guess that would be more of a tea than a coffee. I gave up coffee full-time. I've been off of it now for a year and a half. All caffeine, actually.

Nice. How's it going?

Oh, I think it's marvelous. I don't know that I would ever go back. I do miss coffee, because I was a coffee connoisseur snob. I travel around the world a lot, and one of my things was bringing back the local coffee whenever, wherever. So I do miss that aspect of it. But what I'm doing now has made me much healthier, and I feel pretty good about it.

Nice. I don't know, we say be both. That's my thing. But whatever works for everybody is always good with me too.

Congratulations on your new gig, getting this thing up, going solo.

Yeah, it's a long time coming. I also miss interviewing people. That was another thing. I was like, "I miss doing this."

Well, welcome to entrepreneurial life. You can do it. Your role is your way from here on out now.

That's true. But I gave a brief introduction to your expertise. Can you give our listeners a little bit more about what you do?

What do I do? That's funny, because I was just talking to my social media manager last night. We're going to come up with a new social media strategy because I'm doing so many things. I don't mean that vainly; I'm all over the place. I own a couple of different businesses, podcasts as you referenced, traveling, working on some other new ventures on the side that I'm not going to release too much about, but I have a couple of products that I'm getting ready to launch. I'm really excited about that. But primarily I own Fusion Photography, which is a traditional photography studio based in Charlotte, North Carolina. I also launched a new brand in 2018 called Fusion Creative, which focuses exclusively on brand messaging and helping small businesses and entrepreneurs be very clear and succinct on their brand messaging, which is the critical piece for a small business owner, for any business really. It's like, how do you associate a product with a brand? It's through their brand messaging. We help them clarify that, then create content around that, and then manage all of their digital channels, which is websites, social media, advertising, anything that they would use content for, branding strategy, to help them get their message to a broader audience.

Gotcha. That's what we're going to be diving into: content creation and AI. But how's it going? What are your thoughts on AI getting more awareness in this space, and your feelings on content creation and AI?

Well, it's a rapidly growing field that is evolving by the nanosecond, literally. I think that it has the potential to revolutionize many aspects of our lives, from healthcare to transportation to improving efficiencies to increasing our productivity, enhancing decision-making capabilities even. But I do think, as with any emerging technology, there are some risks and some challenges, including some ethical concerns that we probably have to pay attention to, around biases and privacy and job displacement. There are some catastrophic consequences: if this isn't managed properly by the users and by the developers, it could become a widespread issue. We've all seen the movies around AI, Terminator, et cetera. It is an intriguing idea to consider that there is very much... I don't know if you saw recently, there was an interview on 60 Minutes with the female AI bot. Did you see this?

No, I didn't actually see that.

Oh, check it out. 60 Minutes just did a full episode of their Sunday night show interviewing an AI female robot. It's pretty insane, actually, to be honest with you. It looks creepy. It was a pretty, pretty cool episode. How can it be used conscientiously and morally without disrupting too much the status quo of who we are, the human element? And so I do have some opinions about it, but I think I'll wait until the next question.

So how is this going to change? Because, I mean, you've used it. I've used it a little bit, just doing show notes and helping me do time codes and things that I'd rather not do but need to actually do at the same time. And then also, I've seen some AI stuff in Premiere chopping up interviews, and chopping them up perfectly, where you almost don't have to edit anymore. So how do you think this will change content creation?

Well, I think that there are a lot of benefits from it, from the standpoint of the tedious or complex tasks that just take up a lot of time, the minutiae of our day-to-day businesses. Podsqueeze is one that I've played around with for my podcasts, and it generates, with about 75 to 85% accuracy, and that's an anecdotal number, the show notes, the timestamps, et cetera, and it generates a bunch of other information for me. And my relationship with that is to take a look at it and then go through it. It's almost like doing a soft edit, you know, back in the days when we had to write essays in high school and college, right? We'd write the first draft and we'd give it to somebody to look over. I don't know if you had to go through that, but for me, I see it as more of a soft edit, a sort of preview of what I would ultimately post on my own. But I still have to go through and do the manual checks and just make sure that things are okay, that it's not disrupting a lot of the data, that it's not changing the scope or the intention of a phrase, et cetera. But it's allowed me to really kind of automate some of the tedium that I do in my day-to-day business and, consequently, improve some of my efficiency and my productivity. I wouldn't necessarily say my decision-making capabilities, because I don't really use it in that sort of algorithm, but I do see a benefit from it in terms of being able to streamline a lot of the minutiae that goes on on a day-to-day basis in my office.

So are we going to see fewer people knowing how to actually edit photos, edit videos, and edit audio, because we have tools like Descript where you can just look at the words and cut out the words, and then you don't actually have to go through the manual work of looking at the audio waves and figuring out how to cut different things and make the video better? Are we going to see more of that just because AI is becoming this "hey, you can do everything faster with AI" type of thing?

Well, I suspect it's probably the nature of the beast. I mean, with any new tool that's brought in, or any new curriculum or any new insights or any new technology, there's always going to be sort of a manual degradation of some sort, right? You know, I was talking to my partner's son last night, and apparently I hold my pen the same way he does, because he called it out at the restaurant. He's like, "Hey, you hold your pen the same way I do." And I just thought it was an interesting observation. And so, off the cuff, I said, do you know how to do cursive? He didn't even know what cursive writing was. So interestingly enough, like with anything, there's going to be a degradation of some sort of skill sets. But with the loss of anything, there's typically a gain on the other side. We are definitely going to lose some of the critical thinking attributes, but I don't believe that's an umbrella statement across everyone. I think, just like cursive writing for him (because I know how to do cursive writing, and my kids know how to do cursive writing because I made sure that they knew how, for a whole bunch of different reasons I won't go into), I would suspect that a large percentage of the population will suffer from over-utilization of AI. And the compensation for that is that they'll pick up new skill sets that they hadn't had before, but they're going to lose some of the stuff that they've been using up until this point. So yeah, I believe there is going to be a loss in some way, shape or form. It's inevitable. It's almost unavoidable.

I mean, you can go into the ethical side of it. I've seen Photoshop now has generative AI where it kind of completes the photo, but it could complete the photo in a way where you make somebody look bad, or videos. I mean, deepfakes have been around a little bit longer, and they've gotten a little too good for my liking, where you could put someone's face on somebody else and say, look, you did something bad. So how do we balance the ethical side of it, where it could be funny, but we're not making people who didn't do something illegal look like they actually did something illegal, or anything like that?

Yeah. I'm not sure how you can manage it holistically. I think there are going to have to be some discretionary decisions that are made on a personal level. At some point, there probably might have to be some standards put in place. Like, I saw that Hollywood is now mandating by the year 2024 that a certain number of all movies have to meet certain criteria of ethical standards around the types of people that are shown in the movies, and inclusivity around that. So there might have to be some standards put in place around AI, if they haven't taken over by then. But I think, more than anything else, that's going to have to become a discretionary decision right now on a user-by-user basis. But I mean, let's not be fooled. Inauthentic photography and video has been around for a long time. I mean, none of the actors or actresses you see in movies actually look like that. I don't know if you've ever seen any of them in person, but they're made up, their faces are made over, and their bodies are sometimes body doubles. So this sort of, I don't know if you want to call it fakery, but this sort of inauthentic viewpoint of who we are, these cosmetically enhanced versions of us, has been around for a while now, whether you do AI or filters or changing what you actually look like on social media. And I can tell you, because I can speak to this in person, I work with a lot of influencers, and the images that people put on their social media are not what they look like in reality. There are derivations of that. So it's been around for a while. I do think that at some point it will get out of hand, and somebody or something is going to have to put some sort of a standard in place, but right now it's going to have to be on an individual basis, because no one's even taking a look at that in any way, shape or form.

Hmm. I mean, that is true. I think I had a friend that was doing professional movies, and they would sculpt and do different things. I think a woman was pregnant and they took out the way she looked, so she didn't look like she was pregnant anymore, type of a thing. And that was basically making her not what she was at the moment, but making it look like she was fit and everything, and not actually six, seven months pregnant. So.

I actually had to put policies in place around my production process, because early on I was getting some sort of ridiculous requests that were bastardizing the actual image and what it was, sort of similar to what you just said: cosmetically enhancing me, making me, you know, 20 pounds lighter, removing things, changing this. And I sort of had to put a line in the sand and say, these are not the things that I'm going to do with my images. Morally, for me, I have a moral obligation to create what I see. And that doesn't mean that I'm not going to make tweaks to it. You know, if you've got pimples, if you've got things that need to be removed, I'll clean up some wrinkles. I'll do small, minor defects, but I'm never going to manipulate a photo for you. And I tell this to my clients, it's in my contract: there will be no photo or video manipulation. I'm going to make you look really good, but that's going to be on me on the outside, in how I take the photograph and the lighting that I use and the composition that I use and the angles that I use. I'm not going to go in after the fact and completely change you from what you look like today, so you're a completely different person. So it has been around for a while, and I've gotten some crazy requests to do some absolutely crazy things. I had someone call me the other day. We did a huge group photo for them, and about 20 or so of the hundred people didn't have their arms crossed in the photo. And she wanted to know if I could make everybody have their arms crossed. And I was like, I'm just not going to do that. I'm just not going to go in and do that. So anyway, yeah. I guess that answered the question in a roundabout way.

That's a lot of hours just trying to get everybody's arms crossed, because you have to make it look like their arms are crossed. And that's a pain in the butt just in general, just doing it manually, because you have to make it look right; you have to sell it. And if it's not sold, people are like, that's fake.

Well, you're absolutely spot on with that. But what's crazy to me is the fact that the question was asked, because it's become normal to change the images. It's become normal to add on filters. It's been normalized to put photos in a situational context so that other people can see them, and they aren't actually real. They're fabrications of the original context. And so those sorts of questions are not something I would have gotten 10 years ago, and now I get them today. It's like, can you put me here so that I'm sitting on a beach as opposed to sitting right here? And I was like, is that really what you want me to do? Does that really matter in the scheme of things? So yeah.

Yeah. Yeah.

And so, I mean, are we going to see fewer humans actually doing that stuff and more AI as AI is getting more prevalent? I mean, I've seen AI create commercials on Twitter that look very weird, but still look pretty lifelike, to the point that, once AI figures out how to actually do a good commercial, that could eventually happen. So are we going to start seeing more AI-generated and less human-generated content?

I believe so. Yes, I do believe so. I think it's already happening in a lot of ways. I mean, we've already seen some of the video production that AI is putting out. I think there's still going to have to be a human. Let me rephrase this. I think you're going to have to make a moral decision as a producer, as a content creator, whether you're in Hollywood or you're in a small business like I am: at what level are you going to allow this to take over and pervade your life and become the holistic process of everything? And I've made a decision, me personally, that I don't have a problem using it for the minutiae, but I do have a problem with it being the final product in every sense of the word. And so while I use it to create some show notes perhaps, or I might use ChatGPT to have a conversation, if you will, on some ideas around what my strategy could look like for my social media next month, being able to use it to come up with ideation and take that from there and then expound upon it, that's the decision I've made. But in answer to your question, I definitely think that there is going to be a large majority of the population, because that's just the nature of humanity, trying to make things more efficient, more optimal, and to not be involved in the process. We're always looking for the quick fix. What can we get done right now? So that's part of the problem. This is a much deeper, esoteric conversation, but that's part of the problem of why we're in the situation that we are: because people don't enjoy the process. People don't get involved in the process. They just want to immediately get to the end. I just immediately want that prize on the other side of this, and nobody wants to get into the middle of the muck and the dirt and the filth and get dirty and grimy, with blood and sweat and tears, anymore, because we can get to the other side very quickly. So yeah, I definitely agree with that statement.

Yeah, I mean, I'm more in agreement with you. I will use AI to help me offset things that I may not want to do, but that will help me automate my workflow. Or if I need to touch up some video, maybe save some video that's degraded, I'll use AI to help uplift it, because that's an easier process for it to do than for me to spend too many hours trying to uplift the video. But are we going to get to the point where it's just... we may. I mean, I saw a thing on the news a couple of months ago where one of the Middle Eastern countries was doing an AI news reporter. So it wasn't even a real human anymore. It was all AI-generated, and it was delivering the news. Are we going to get to that weird spot where maybe even the newscasters aren't actually real anymore, and they're just not even paying newscasters or humans to actually do that? They'll just have automated AI representations of humans, and they'll deliver the news that way so they can control the news even more.

Yeah, I don't know. To be honest with you, I haven't watched the news in 15 years, so I couldn't really tell you. I suspect there's going to be a segment of the population that does go down that rabbit hole. I think there's going to be a dissection of society, and I don't know if it's going to necessarily be age-based, but I definitely think there's going to be a group, a population, that AI is going to polarize for and against. As with anything, I prefer balance. I think that there is good and bad on both sides of the equation for this. I work with a hospital, one of my big clients here, and they use AI to help doctors diagnose diseases, because it's much faster. They develop personalized treatment plans using AI, and even, because AI can do this in a nanosecond, predict some of the patient outcomes based upon some variables. There are a lot of positive benefits that AI can help with. And again, in the name of progress, humans have always done this. Let's just go back 60 years to automobiles. Look at what we're driving today. We've got electric cars, and we have got cars that can levitate. Now I've seen there are cars that can fly that they're testing. We didn't have those technologies. There were people back then that were probably saying, "Well, we can't go that way. There's no way we can do this. It's going to cause this problem. It's going to cause that problem. We're going to have too much traffic," whatever it is. But for whatever reason, humans are innovative and they always figure out a way to make things work. We're very efficient people. We're very productive people. So I suspect that there's going to be a polarization of this: the naysayers, because they stay away from it, I'm never going to touch it, and then the people who embrace it wholeheartedly. And like anything, you can use it for corruption and evil, or you can use it for good, like the hospital is, for good and productivity and helping people. So I suspect, in the case of your newscasters, there is probably going to be an AI-only, just like ESPN is just sports only. There might be an AI-only news channel that just does AI. And if people want to watch it, they want to watch it. But I still go back to the original statement. As with anything, overuse, overabundance: if you have a 100% proclivity to only use one thing, you're going to be out of balance. And that's going to be where the damage and the dangers are going to start to creep in.

So it's interesting to talk about the groups, because I even see kind of groups forming: the AI enthusiasts; the middle ones, the hybrids, the ones that embrace it but have a skepticism towards it; and then the ones that fully reject it and are like, this is a degradation of humankind, or whatever, the ones that embrace Terminator 2 and all those "AI is going to kill us all" types of things. So are we going to see that with content creators too, in groups? Like the ones that say, I use AI for everything, pay me to use AI to do this, and then they don't have to do anything except maybe look good on camera, and maybe not even do that anymore. We're going to see the hybrid content creators, like, I'm going to do this, but I'm still going to be part of the process, because I feel like learning the process is good. And we're going to have the manual-only ones. They may even go back to film if they want to. I mean, I don't think you can buy too many film cameras anymore. But are we going to see more of that, the groups kind of dividing themselves like that, where different types of content creators want to do different things, and then we'll kind of see how the audience responds?

Yeah, that's a great question. Well, we already have that. Especially in the photography and cinema world, we already have people who don't have the skills, don't have the knowledge, haven't put in the time to become a professional master photographer. Let's just use that as an example. Becoming a master photographer requires years of experience. You have to do an apprenticeship. You have to take a copious amount of tests and different things. And you need to be able to pass specific standards that are put in place by organizations like the PPA, for example. But that's not a law. That's not a protocol that's required. So anybody can go over to Amazon, Best Buy, wherever you want, buy a really expensive camera, and call themselves a photographer. And they're going out and they're selling that to people right now. And while they may get some good photos, they're not going to take photos like a master photographer would, one who takes all the variables of photography into consideration, which is composition and lighting and angles and context, and being able to tell the story of an image by just being part of the process. And so there's always going to be a need for that sort of a person. I don't know that AI will ever be sentient enough to sit with you through a photo session and be able to tell your story organically, just using AI synthetic technologies. There's always going to be a need, in my opinion, for people that are still going to have to go the traditional route. But those polarizations already exist today. There are people out there today that are selling their photography, quote unquote, skills to an audience that isn't necessarily as, well, I don't know what the correct word is to say this without being undiplomatic, but you know, some people just don't really give a shit. Like, I just want a quick photo. Let me get it done. But there are other people, like my types of clients, who are not looking for the run-and-gun experience. They're looking for something else that is curated, that is thoughtful, that has critical elements of decision-making in it, that tells a story about their product. So there already are two camps. And I only suspect that those camps will probably continue down their own channels, their own sort of divestitures. But for people like myself, and I'm putting myself in that camp because I am a master photographer, there's always going to be an audience that wants our services, at least given the current paradigm that we operate in. Does that make sense?

Yeah, that does. I mean, my almost follow-up question is: like Twitter did with their own badges, should there be a badge for people saying, look, I actually use very little AI, and then a badge for people saying I use all AI, type of thing? Because the normal person is not going to know how much you use AI or not. I mean, I'll always be up front and say I use it to a certain extent, but I'm still in the process. I still do the video editing. I still go through everything and make sure it sounds good and everything. But there are going to be some people who would be like, look, I just let it do itself. I kind of sort of check it, but not really. I just kind of say, update, I did it, so it's done. It's almost like there needs to be some type of, I guess, certification, in a way, I guess is the best way of saying it, to say, I'm a content creator that knows how to video edit, instead of, I'm a content creator that just uses AI to do it for me.

Yeah, I mean, it's a great question. It's a good idea. I don't know that we'll see that anytime soon. I think the question we should probably reframe a little bit is, you know, will people be able to tell the difference? And if they can't tell the difference, does it matter? So it goes back, from my perspective, to a moral decision from the creator's perspective: how much am I willing to share with you about what I'm doing, and how much am I willing to actually use that service? And I think, from my perspective, because I don't use AI for any sort of content creation right now, you know, with a camera or anything like that, I do manually edit all my images, I do manually edit all of my videography, all those things. But probably advertising your services and letting people know sort of what level of curation you're taking would probably be the better way to approach that, just so that the end buyer, the consumer, sort of knows, especially if they're discerning enough that they can tell the difference, this is what I'm paying for. So they sort of know up front what they're getting out of the dollar they're spending. Does that make sense?

Yeah, it's almost like, for me, it would be asking yourself, you know, what are you saying? Okay, what if I did it all with AI, didn't tell the customer, and then they found out that I wasn't even part of the process? Would they actually be upset with me? Would they be like, I want my money back, or some type of legal action, if there was ever actually any loss or anything negative towards you? Would they be upset with it?

Yeah, I think I would probably want to understand the implications of that a little bit better. Is AI recreating something that's already been created, through plagiarism? So for me, that would go back to that moral situation again, that conscientious decision: am I recreating? Am I creating something using AI that's my original, native work, or am I bastardizing and creating something else that someone already did? And you know, from a standpoint of content creation, it's not like everything isn't recreated anyway. People copy my photos all the time. I see versions of what I've already done from local photographers, but I've done the same thing. I'll find a really cool photograph on Pinterest and I'll say, hey, I'm going to sort of recreate this in my own unique way. So I guess, for me, it would just have to be: what are the implications of the recreation? Is the plagiarism causing a moral and conscious decision? Like, if I'm going to write a book, for example, using AI, am I going to recreate, word for word, Dostoevsky, or Jodi Picoult, or any of these books back here? Or Mike Michalowicz, who I'm reading right now, Profit First, you know, how to make more money with your money. Am I going to write my own book that's basically written by AI and is literally a plagiarism of that? Now that obviously is a moral decision, and there are going to be some problems and fallout from that. But from a content creation perspective, creating photos, creating video, I guess I would just need to understand the implications and how far it actually goes.

Yeah, I mean, we could put it into different groups. There's inspiration, which is what you would do, and there's plagiarism. Inspiration is like, I like that, I want to see if I can do something but give it a twist. I'm inspired by what someone else is doing to try it myself, where I'm still making it a little different, but it's still some type of inspiration. And then there's a complete copy, where it's like, well, I can't tell the difference between this one and this one, because it's a complete copy.

Absolutely. But it's kind of interesting. We could take this conversation in a bunch of different directions. You know, there's not much original work anymore, period, in anything. You go on Instagram, and you have all these gurus talking about spirituality and holistic living, and you have dieticians talking about how to do this, and you've got people talking about how to do that. And most of these people are just recreating something that they've already learned themselves, and now they're just putting their own personal spin on it. So again, it goes back to that moral compass: what's the implication of my creation? I would need to understand that a little bit better. Am I taking something word for word and writing a book with AI? Am I taking an image and literally recreating it? Am I saying, hey, take a look at this photo, scan it in, and I want you to recreate this exact same image, and I'm going to give that to my client and then pass it off as my original work? You know, that's obviously a moral and conscientious issue; that's immoral. And so again, I would need to understand the implications of what that recreation looks like. Because, truthfully speaking, there is not a lot of original work. Almost everyone is just repeating, rephrasing, regurgitating something that they learned from someone else. It's just the nature of humans and history in and of itself, right? Everything's in a cycle. So media, television, cartoons, everything is just a recreation of something. My daughters are wearing Nike Air Flights and Air Jordans again, like the ones I wore when I was in middle school and high school. They're literally the exact same shoes. They just recreated them and added a couple extra hundred dollars to them.

Yeah, I mean, one of the books in the Bible, Ecclesiastes, basically said there's nothing new under the sun. And I've always been like, that's basically pretty true. I mean, we think we have something new, and then you go back to history, and it's like, it's actually not very new. So I'm going to try to go with that. Yeah.

Dude, first of all, you're the first person in like five years that has dropped some Ecclesiastes on a podcast. So well done.

I know my Bible to a certain extent, and I knew that one. It's also one of my favorite books, actually, because it's more philosophical in a way, and it doesn't really give you answers. It just kind of goes, this is life. This is what happens. Deal with it.

Now, there's a conversation we could have about the Bible, because I think it's all just a metaphor, is sort of what it sounds like I hear you saying. And I'm actually very much into the Bible as well, so it's interesting. I'm reading the Book of Enoch right now, just so I can sort of understand it from a different context, because, you know, the Book of Enoch was removed from the canonical Bible that we now have in written print for everyone. But anyway, we're getting off topic. That's a great conversation we should have one day.

Yeah. And then, I mean, back to content creators: how should we respond to AI? I mean, we've kind of talked about it. It seems like a lot of us are going to respond by using it, a lot of them, I should say, not everyone, because there are always different groups. But for the people that have already done content creation for a while, it feels like we're going to use it, but we're going to use it wisely, or in ways that help us with our workflow, but not fully trust it or use it to the point of us stepping back and not doing anything.

Yeah, yeah. I think there needs to be a broader discussion on the role of humans in ensuring that the ethical and responsible use of AI is implemented. And that's on a user-by-user basis, and probably on a group-by-group basis. I think there needs to be some human oversight in all of this, because, as we've discussed throughout the conversation, if there isn't, it's not only going to be training us to not use our own critical processes and our own critical thinking, but if AI, given the gaps that it's already closed in just the few short years that it's been in the mainstream, were to continue in that capacity unfettered, unchecked, who knows where it could go. So I do think that there need to be some standards put in place, absolutely, on both a personal and a global level. And I don't know what those look like; that's way outside my pay grade. But I would suspect that we each, as individuals, need to sort of understand what this game looks like that we're playing, understand the implications of it, and then come up with our own set of morals and parameters on how we're going to use it safely and responsibly.

Hmm. It almost needs to be like a loose content creation federation, basically saying, this is how we're going to use AI. And, I mean, there could be different types of groups, but I'm saying, this is how we use AI, this is how we tell our customers. Because I feel like if we don't message or market it the right way, the customer is going to be pissed off and be like, I could have just done ChatGPT without you and not paid all this money, because you did exactly what I could have done myself.

Yeah, but they didn't do it themselves. So there are always going to be those people. Like, you know, I don't mow my own yard or clean up my yard every week anymore. I used to do that all my life. I've taken care of my yard, but I'm at a point now where I have bigger and better things to do on my own personal priority list, right? There are always going to be people who are going to do that themselves. It's always going to be the DIY world. But there are always going to be the people that have moved beyond that space and are going to want someone to do it for them. So from an ethical standpoint, you know, however the job gets done, as long as there's no harm being done, and there's not something being recreated illegally or plagiarized or whatever, you know, I think that people just... I feel like a broken record. I think it's just going to have to be done on a personal, individual basis, so that people are able to monitor what they're doing without completely giving away the keys to the kingdom on everything.

So fun question for you.

:

00:33:52,958 --> 00:33:54,875

What AI would you like to be created to

:

00:33:54,875 --> 00:33:56,708

help you with your workflow even more?

:

00:33:59,166 --> 00:34:00,916

Austin, is there one on parenting?

:

00:34:02,458 --> 00:34:04,125

I have two teenage daughters, man.

:

00:34:04,208 --> 00:34:06,166

I need some parenting AI right now.

:

00:34:06,708 --> 00:34:08,041

I have two daughters who are

:

00:34:08,125 --> 00:34:09,750

completely different from each other,

:

00:34:10,375 --> 00:34:11,000

and one of them just

:

00:34:11,000 --> 00:34:12,083

toes the line and does

:

00:34:12,083 --> 00:34:13,375

their own thing and

:

00:34:13,375 --> 00:34:15,291

never really gets in trouble,

:

00:34:15,708 --> 00:34:16,416

and the other one pushes

:

00:34:16,416 --> 00:34:17,208

the bubble on everything.

:

00:34:17,375 --> 00:34:18,791

I'm not saying it's a bad thing because

:

00:34:19,250 --> 00:34:20,625

the people who push

:

00:34:20,625 --> 00:34:21,333

the bubble are the people

:

00:34:21,333 --> 00:34:23,083

who are the change makers in the world.

:

00:34:23,750 --> 00:34:24,500

But dealing with that

:

00:34:24,500 --> 00:34:25,750

as a parent is sometimes,

:

00:34:25,958 --> 00:34:26,833

I don't know if you have kids,

:

00:34:26,833 --> 00:34:27,916

but dealing with change

:

00:34:27,916 --> 00:34:29,250

makers as your own children,

:

00:34:29,625 --> 00:34:30,875

it can be a monumental task.

:

00:34:31,208 --> 00:34:32,416

So give me some AI parenting.

:

00:34:33,958 --> 00:34:34,958

Outside of the joke world,

:

00:34:35,250 --> 00:34:37,541

I think, really the truthfully,

:

00:34:37,541 --> 00:34:39,083

the post-production of things,

:

00:34:39,291 --> 00:34:40,916

I think, especially with my podcast,

:

00:34:41,416 --> 00:34:43,250

I find there's a lot of

:

00:34:43,250 --> 00:34:46,541

tedium duplicitous things that go on,

:

00:34:46,541 --> 00:34:48,083

from creating show

:

00:34:48,083 --> 00:34:49,666

notes to time-stamping it,

:

00:34:50,250 --> 00:34:52,208

to putting all of the different segments

:

00:34:52,291 --> 00:34:54,041

together to creating shorts from it.

:

00:34:54,375 --> 00:34:55,458

Being able to take,

:

00:34:56,208 --> 00:34:57,833

it'd be nice to be able to

:

00:34:58,750 --> 00:35:00,666

produce my own content like I do,

:

00:35:01,166 --> 00:35:02,791

and then edit that content

:

00:35:02,791 --> 00:35:04,625

down like I do manually right now.

:

00:35:05,000 --> 00:35:05,750

But it'd be really cool

:

00:35:05,750 --> 00:35:08,208

that once I got the cream,

:

00:35:08,208 --> 00:35:08,958

if you will, the cream

:

00:35:08,958 --> 00:35:10,000

of the crop of the show,

:

00:35:10,416 --> 00:35:11,375

that then I could dump

:

00:35:11,375 --> 00:35:13,083

it into a program that's

:

00:35:13,083 --> 00:35:14,583

only going to take the cream of the crop

:

00:35:14,583 --> 00:35:16,125

and then redistribute it,

:

00:35:16,125 --> 00:35:17,375

make the shorts, do the

:

00:35:17,375 --> 00:35:18,333

show notes, whatever it is,

:

00:35:18,333 --> 00:35:18,916

because then I would

:

00:35:18,916 --> 00:35:20,250

know that it's only editing

:

00:35:20,250 --> 00:35:21,166

the stuff that I've

:

00:35:21,166 --> 00:35:22,708

already created myself.

:

00:35:22,708 --> 00:35:23,291

Does that make sense?

:

00:35:23,958 --> 00:35:25,333

So that would be a nice tool for me.

:

00:35:26,666 --> 00:35:27,958

It's always the post-production,

:

00:35:27,958 --> 00:35:30,500

that's always the longest part and the

:

00:35:30,500 --> 00:35:32,208

most tedious part about everything,

:

00:35:32,208 --> 00:35:33,416

because you always have to edit the show

:

00:35:33,416 --> 00:35:35,041

to a certain extent.

:

00:35:35,041 --> 00:35:36,708

I mean, it depends on the person.

:

00:35:36,875 --> 00:35:37,833

If you wanted to always

:

00:35:37,833 --> 00:35:39,208

get rid of the ums and uhs,

:

00:35:39,208 --> 00:35:41,708

I'm more of a, if it's

:

00:35:41,708 --> 00:35:44,000

okay, if it's not obsessive,

:

00:35:45,083 --> 00:35:45,791

let's just say that.

:

00:35:46,083 --> 00:35:47,000

If it's not saying every

:

00:35:47,000 --> 00:35:48,250

other word, ums and uhs,

:

00:35:48,250 --> 00:35:50,833

I leave it in, but if it's like too much,

:

00:35:50,833 --> 00:35:51,583

I'm like, well, I have to

:

00:35:51,583 --> 00:35:52,583

take most of this out now.

:

00:35:52,666 --> 00:35:53,708

So I get what you're

:

00:35:53,708 --> 00:35:55,041

saying because it, it can,

:

00:35:55,416 --> 00:35:57,791

a lot of it can become tedious.

:

00:35:59,333 --> 00:36:01,041

Yeah. You know, I do, uh, as we're

:

00:36:01,041 --> 00:36:02,083

talking about content creation,

:

00:36:02,083 --> 00:36:04,291

so I just finished a shoot yesterday, um,

:

00:36:04,291 --> 00:36:05,541

with one of my clients and there's like

:

00:36:05,541 --> 00:36:07,958

2,500 to 3,500 images in there.

:

00:36:08,625 --> 00:36:11,125

So, um, I know that there are AI tools

:

00:36:11,125 --> 00:36:12,708

like Adobe already has one out there

:

00:36:12,708 --> 00:36:14,583

where they'll go through and cull all of

:

00:36:14,583 --> 00:36:15,458

your images for you.

:

00:36:15,750 --> 00:36:17,250

And I've played with it, but it doesn't

:

00:36:17,250 --> 00:36:18,833

do the job that I need done.

:

00:36:19,041 --> 00:36:20,375

There's a lot of error in it.

:

00:36:20,750 --> 00:36:23,291

Um, a lot of omission of, of key images

:

00:36:23,291 --> 00:36:23,958

that should have been there,

:

00:36:23,958 --> 00:36:25,291

but it's supposed to sort of look for

:

00:36:25,291 --> 00:36:26,958

duplicates and closed

:

00:36:26,958 --> 00:36:28,250

eyes and bad lighting

:

00:36:28,250 --> 00:36:29,458

and things that just don't really fit.

:

00:36:29,708 --> 00:36:32,041

But, um, that would be a nice tool to be

:

00:36:32,041 --> 00:36:32,875

able to have, to be honest with you,

:

00:36:32,875 --> 00:36:33,916

because going through a

:

00:36:33,916 --> 00:36:35,791

first pass of images for me,

:

00:36:36,166 --> 00:36:38,458

getting that 2,500 down to, you know, a

:

00:36:38,458 --> 00:36:39,208

couple of hundred images

:

00:36:39,208 --> 00:36:40,875

that I'm actually going to eventually use

:

00:36:40,875 --> 00:36:42,208

for something would be a nice tool.

:

00:36:42,375 --> 00:36:43,666

And I don't, I haven't seen the

:

00:36:43,666 --> 00:36:44,583

technology there yet.

:

00:36:44,791 --> 00:36:46,458

And if it exists, I don't know about it,

:

00:36:46,458 --> 00:36:48,458

but that would be another usage for me.

:

00:36:50,458 --> 00:36:51,750

I got a really cool book for you, if

:

00:36:51,750 --> 00:36:52,916

you'd like to read on AI.

:

00:36:52,916 --> 00:36:54,833

Because I started down this, um,

:

00:36:54,833 --> 00:36:55,666

the timing of your

:

00:36:55,666 --> 00:36:56,875

request to be on the show

:

00:36:56,875 --> 00:36:58,750

is interesting, and the topic too,

:

00:36:58,750 --> 00:37:00,541

because I've been exploring this now

:

00:37:00,541 --> 00:37:02,500

for about six months and I've read some

:

00:37:02,500 --> 00:37:03,708

really cool books on it.

:

00:37:04,041 --> 00:37:05,333

Um, the one I really liked the most was

:

00:37:05,333 --> 00:37:07,083

called The Singularity is Near,

:

00:37:07,750 --> 00:37:09,375

when, um, it's sort of like

:

00:37:09,375 --> 00:37:11,083

when humans transcend biology.

:

00:37:11,083 --> 00:37:11,875

And I can't remember the name of the

:

00:37:11,875 --> 00:37:13,333

author, but I believe

:

00:37:13,333 --> 00:37:14,541

his last name was Kurzweil.

:

00:37:14,541 --> 00:37:15,958

I think Ray Kurzweil was his name.

:

00:37:16,458 --> 00:37:18,708

A really good book, but it explores the

:

00:37:18,708 --> 00:37:20,458

potential implications of

:

00:37:22,041 --> 00:37:24,208

exponential growth of this AI technology

:

00:37:24,208 --> 00:37:27,250

and, and the possibility of like

:

00:37:28,041 --> 00:37:30,375

technological singularity and, and, you

:

00:37:30,375 --> 00:37:31,291

know, where machines

:

00:37:31,291 --> 00:37:32,666

become smarter than humans.

:

00:37:32,666 --> 00:37:33,666

So it's a really good book.

:

00:37:34,208 --> 00:37:35,958

Um, if, if anyone's looking to explore,

:

00:37:35,958 --> 00:37:37,166

if you're looking for a good read on

:

00:37:37,333 --> 00:37:38,708

the impact of AI, this was

:

00:37:38,708 --> 00:37:40,041

a phenomenal read for me.

:

00:37:40,041 --> 00:37:41,833

And it's really sort of helped me shape

:

00:37:41,833 --> 00:37:43,166

some of the beliefs

:

00:37:43,166 --> 00:37:44,125

that I have around this,

:

00:37:44,166 --> 00:37:45,875

which is, you know, like there's a moral

:

00:37:45,875 --> 00:37:46,791

compass that has to be

:

00:37:46,791 --> 00:37:48,083

employed across the board on

:

00:37:48,083 --> 00:37:50,125

this on an individual level.

:

00:37:50,625 --> 00:37:52,458

Yeah. Yeah. Sounds

:

00:37:52,458 --> 00:37:53,666

like an interesting read.

:

00:37:54,833 --> 00:37:56,000

I'll put it on my list,

:

00:37:56,291 --> 00:37:57,750

a very long list of things.

:

00:37:58,250 --> 00:38:02,166

You need an AI bot to read it for you.

:

00:38:02,916 --> 00:38:05,291

Yeah. But then I just don't learn it

:

00:38:05,291 --> 00:38:06,333

because the AI bot learns

:

00:38:06,333 --> 00:38:07,166

it and then I'm just like,

:

00:38:07,166 --> 00:38:08,250

well, I didn't learn anything.

:

00:38:08,833 --> 00:38:11,833

Yeah. I'm an old school book reader. As

:

00:38:11,833 --> 00:38:12,666

you can see behind me, I,

:

00:38:12,708 --> 00:38:15,333

I don't even listen to, um, Audible.

:

00:38:15,333 --> 00:38:16,750

Like, I just like to read a book.

:

00:38:17,208 --> 00:38:19,458

I don't either. I, I do my own book

:

00:38:19,458 --> 00:38:21,791

reading too, because I know

:

00:38:21,791 --> 00:38:22,750

you can get through books a lot

:

00:38:22,750 --> 00:38:24,458

faster with it, but I don't feel like you

:

00:38:24,458 --> 00:38:25,791

will learn it as well as if

:

00:38:25,791 --> 00:38:27,000

you actually read it yourself.

:

00:38:28,166 --> 00:38:29,791

Yeah. Absolutely. 100% agree. And I like

:

00:38:29,791 --> 00:38:30,583

to, I like to make notes

:

00:38:30,583 --> 00:38:31,625

in my books and stuff and

:

00:38:31,625 --> 00:38:33,083

go back and reference them and research

:

00:38:33,083 --> 00:38:34,125

stuff. So yeah, I

:

00:38:34,125 --> 00:38:34,958

can't do that with Audible.

:

00:38:35,875 --> 00:38:37,166

Well, I guess you probably could, but

:

00:38:37,166 --> 00:38:38,208

it's not the same thing.

:

00:38:38,708 --> 00:38:39,083

Yeah.

:

00:38:40,083 --> 00:38:42,166

Agreed. And so where can

:

00:38:42,166 --> 00:38:43,125

people find you online?

:

00:38:45,333 --> 00:38:48,500

Instagram is probably my biggest area of

:

00:38:48,500 --> 00:38:49,208

playing in the digital

:

00:38:49,208 --> 00:38:50,250

sandbox. It's fusionphotog,

:

00:38:50,250 --> 00:38:52,708

photog being short for photography. I'm also

:

00:38:52,708 --> 00:38:55,208

on LinkedIn. My websites are Fusion

:

00:38:55,208 --> 00:38:58,833

Creative Branding and Fusion Photography

:

00:38:58,833 --> 00:39:00,291

Studio. Those are the two websites.

:

00:39:01,666 --> 00:39:02,250

All right. Any final thoughts for listeners?

:

00:39:03,250 --> 00:39:09,250

Um, thanks for having me on the show. I

:

00:39:09,250 --> 00:39:11,333

think AI has the potential to

:

00:39:11,333 --> 00:39:12,375

revolutionize the way that

:

00:39:12,375 --> 00:39:16,083

we work. I really do. Um, but as I think

:

00:39:16,083 --> 00:39:17,583

uh, Master Yoda said,

:

00:39:17,583 --> 00:39:19,166

with great power comes

:

00:39:19,166 --> 00:39:22,500

great responsibility. So, you know,

:

00:39:22,500 --> 00:39:23,791

the onus is on us as

:

00:39:23,791 --> 00:39:24,333

individuals to live our lives

:

00:39:24,375 --> 00:39:30,416

critically and profoundly and to do the

:

00:39:30,416 --> 00:39:33,125

things that are responsible and

:

00:39:33,125 --> 00:39:34,875

accountable to be good

:

00:39:34,875 --> 00:39:37,166

humans. And AI falls into that bucket.

:

00:39:37,458 --> 00:39:38,791

And if we're going to take

:

00:39:38,791 --> 00:39:41,375

a tool and bastardize it and

:

00:39:41,375 --> 00:39:44,208

make it, um, make it a tool for

:

00:39:44,208 --> 00:39:47,375

pernicious output, if you will, then

:

00:39:47,375 --> 00:39:48,833

you're not, you're not

:

00:39:48,833 --> 00:39:50,833

following in the line of good karma.

:

00:39:51,083 --> 00:39:51,708

And so, um,

:

00:39:52,041 --> 00:39:54,000

not to get weird and holistic and woo on

:

00:39:54,000 --> 00:39:55,833

you, but like anything, find

:

00:39:55,833 --> 00:39:57,958

a balance and, and use it for

:

00:39:57,958 --> 00:39:59,583

the things that are not going to disrupt

:

00:39:59,583 --> 00:40:00,916

the status quo too much, or

:

00:40:00,916 --> 00:40:03,250

disrupt the moral status quo of

:

00:40:03,250 --> 00:40:05,875

things, and be able to

:

00:40:05,875 --> 00:40:06,958

have some moral obligation

:

00:40:06,958 --> 00:40:08,416

around how you use it and

:

00:40:08,541 --> 00:40:09,750

drawing a line in the sand on things

:

00:40:09,750 --> 00:40:10,833

you're not going to do with

:

00:40:10,833 --> 00:40:12,875

AI. And I, I feel like it could

:

00:40:13,166 --> 00:40:14,833

be a really productive tool and really

:

00:40:14,833 --> 00:40:15,875

help you out in a lot of different ways.

:

00:40:15,916 --> 00:40:16,125

Yeah.

:

00:40:18,333 --> 00:40:20,541

Agree. But thanks, Deevo, for joining

:

00:40:20,541 --> 00:40:22,208

Digital Coffee: Marketing

:

00:40:22,208 --> 00:40:23,500

Brew, and sharing your knowledge on

:

00:40:23,500 --> 00:40:25,166

content creation and just AI.

:

00:40:25,916 --> 00:40:27,375

My pleasure. Thanks for having me, man.

:

00:40:28,416 --> 00:40:30,416

And thank you for joining. As always, be

:

00:40:30,416 --> 00:40:32,541

subscribed to the podcast on

:

00:40:32,541 --> 00:40:33,375

all your favorite podcast apps.

:

00:40:33,625 --> 00:40:37,500

And join me next month

:

00:40:37,708 --> 00:40:38,791

as we talk a little bit more about

:

00:40:38,791 --> 00:40:39,541

the PR and

:

00:40:39,541 --> 00:40:41,541

marketing industry. All right,

:

00:40:41,583 --> 00:40:43,583

guys, stay safe, get to understand how you can

:

00:40:43,583 --> 00:40:44,666

use AI to help your

:

00:40:44,666 --> 00:40:47,708

workflow. And see you next month. Later.

About the Podcast

Digital Coffee: Marketing Brew
Get your dose of marketing with your favorite coffee brew


About your host


Brett Deister