The Marketing AI Show—the podcast that helps businesses grow smarter by making artificial intelligence approachable and actionable—has officially dropped!
You can now listen to the first four episodes of the podcast on your favorite podcast app. Keep reading for more on what to expect in episode four.
Episode 4: David Meerman Scott, Marketing Strategist
No one knows more about using the new real-time tools and strategies to spread ideas, influence minds and build business than David Meerman Scott. He’s a sales and marketing strategist who has spoken on all seven continents and in more than 40 countries to audiences of the most respected firms, organizations and associations. David is the author of ten books—three are international bestsellers—and is best known for “The New Rules of Marketing & PR,” now in its 7th edition, which has been translated into 29 languages and is a modern business classic with more than 400,000 copies sold so far.
In this episode, David Meerman Scott digs into the dark side of social media and considers the question: Is Facebook evil? He also shares his thoughts on how AI-powered algorithms are affecting the way we consume news, and what it all means to brands and marketers. Tune in to hear the conversation.
[Video] Watch the Full Interview
Read the Full Transcript
Disclaimer: This transcription was written by AI, thanks to Descript.
Paul Roetzer: Welcome to The Marketing AI Show. I'm joined today by international bestselling author and world-renowned keynote speaker David Meerman Scott. Welcome, David.
[00:00:13] David Meerman Scott: Hey, Paul. So good to be here. Thanks very much, as always, man.
[00:00:17] Paul Roetzer: The last time I think we did something publicly together would have been on the keynote stage, the final closing talk at the Marketing AI Conference in 2019.
[00:00:25] David Meerman Scott: Exactly. And that was, I mean, my gosh, it seems like it was years ago, because it was pre-pandemic.
[00:00:32] Paul Roetzer: Last month seems like a thousand years ago right now. I feel like we've all lived a lifetime in the last eight or nine months, but man, I miss those days. I know you had the new book on standout virtual events, so I know you've kind of pivoted and made the best of it with virtual events, but I know we're all missing the live events and seeing other humans.
[00:00:51] David Meerman Scott: Yeah, we are. I mean, as you know, because you kindly took a look at the book I wrote called Fanocracy, it's about human connections, [00:01:00] and physical human connections are super important. It's actually hardwired in our brains; it's neuroscience that we want to be part of a tribe of like-minded people.
[00:01:10] So we're all hurting. We're hurting in so many ways because of this virus, but there is light at the end of the tunnel. My daughter, Reiko, who is my co-author, graduated from med school in April and is now working the emergency department in a COVID intensive care unit. And as a doctor, she got her shot yesterday. So the vaccines are rolling out. And I think the fact that she got her shot on the winter solstice, and every day from now it gets lighter, is symbolic in some way. I believe we are now coming out of the darkness. I'll take it.
[00:01:56] Paul Roetzer: I will take anything we can get. Tie in the great conjunction in there [00:02:00] and, like, I'm good. I'll take all the positive thinking. No, I agree. And I think, you know, as we start looking out ahead, I feel like next fall, maybe in the summer if we're lucky, but certainly next fall, I think it's realistic that we can be back together again in some form. Maybe smaller events, even in the summer. Like, I'm starting to look out and say, can we get 50 to a hundred people together?
[00:02:25] Right. You know, different. So I'm with ya. And I think any chance we get to start having those in-person, even if they're not the thing we're used to, just some level of in-person connection is so critical. Oh, absolutely. And it's a great book too. And I know that's not really our topic today, but it is incredible.
[00:02:43] Yeah, it does have a connection to what we're doing, because, I mean, David and I could talk about topics all day long related to marketing. So if you haven't read The New Rules of Marketing and PR, that was the international bestseller. I think 2007 was when the first edition came out?
[00:02:59] David Meerman Scott: Yeah. And I [00:03:00] wrote it in '05 or '06, and it's now in the seventh edition, remarkably. And being an author, you would recognize these numbers, but it's done 400,000 copies in English and it's in 29 other languages, from Albanian to Vietnamese. So it amazes even myself to be able to say those numbers. It's kind of amazing.
[00:03:23] Paul Roetzer: Yeah. And if you're not a business author or haven't worked in that industry, you don't know what success looks like. Again, I'm going on my third book, but with the first one you don't know. And I remember asking the publisher (we had the same publisher, Wiley, you know, for at least the first couple), like, what is success in the business publishing world?
[00:03:39] And I don't remember what number they gave me, but I think I had heard at some point that the average business book sells 500 copies, and it's good to do a thousand of them.
[00:03:48] David Meerman Scott: The average business book sells like 500. It's generally a success if it sells 5,000. And The New Rules of Marketing and PR has done 400,000. So it's kind of [00:04:00] amusing.
[00:04:00] Paul Roetzer: So that's probably a good jumping-off point here. Your seventh edition, if I'm not mistaken, is the first time that you had a dedicated chapter or section about artificial intelligence.
[00:04:12] David Meerman Scott: Entire chapter. Basically anchored by my friend Paul. But yeah, it was the first time that I really dug into AI as an important aspect for marketers to understand.
[00:04:25] I'm really glad I did. And the catalyst of that was actually the IMPACT event in 2018. You delivered a keynote, and I was mesmerized by what you were sharing with the audience, because I thought, you know, I'm a bit of a smarty pants about marketing.
[00:04:45] You know, I'm up on my high horse thinking I know everything. And you just started talking about AI, a topic I didn't know anything about, except for a little bit of practical stuff, just watching Netflix and wondering how the algorithm [00:05:00] works. But you were talking about it in such a way that I thought, I have to get
[00:05:04] educated on this stuff. It is the new thing. Paul's on top of it. I'm not on top of it. I have to figure it out. And you kindly helped me along, educated me. And then, yeah, it became a full chapter in The New Rules of Marketing and PR. It's actually in the subtitle now. AI is in the subtitle. So thank you for that.
[00:05:23] Thank you for your leadership in this area because it's really truly been remarkable to see how you've grabbed at this topic and really taken a leadership role in it.
[00:05:32] Paul Roetzer: I appreciate that. I remember after that talk, you came up to me and said something about hope in your eyes. And to me, that was like, Oh man, like this is David.
[00:05:40] Like, this is the guy I kind of idolized in the industry, looked up to, read your book at the start in 2007. So yeah, it meant a lot to me to know that it helped you. And to me, it kind of validated what we were trying to do, which was get other really smart people in the industry thinking about this and talking about it.
[00:05:58] And that kind of leads [00:06:00] into the topic today. Recently you published an article called "2021 Digital Marketing Prediction: Backlash Against Social Media Algorithms." And so your understanding of AI over the last couple of years has helped you start looking at things differently, I would say.
[00:06:16] David Meerman Scott: It has, very much so. And sort of lurking in the back of my mind has been, I really need to understand this. And I didn't feel that I truly understood it for a long time, more than a year. And, you know, I've been watching what you and your team have been putting out with the Institute, subscribing to the blog posts and so on.
[00:06:39] And really trying to understand it. And I'm not saying I fully understand it, but I understand it enough to kind of draw some conclusions. And that blog post you reference was probably my most popular blog post in a couple of years. Two or three. Yeah. By every measure: the number of [00:07:00] people who've shared it, the number of people who've commented on it, the number of people who've reached out to me and wanted to chat about it a little bit.
[00:07:08] So, yeah, super, super interesting. And I kind of had a draft of that for a couple of months before I finally published it. I was a little bit nervous to push the button. This is a tricky topic. Should I push the button? Screw it. I pushed the button.
[00:07:22] Paul Roetzer: So if you haven't read it, we'll include it in the show notes. We'll put a link in there, but I want to read you the lede to the article and then have you kind of react to that. I'll give you an abbreviated version: "Social networks like those developed by Facebook and Google have tremendous power to allow people around the world to connect and share. However, the increasing reliance of social networking companies on algorithms, and these are AI-powered algorithms, not human-powered algorithms, to determine what we see in our feeds has become a tremendous problem." What is it that you're seeing that kind of got your attention and said, hold on a second, this isn't [00:08:00] going the way it's supposed to?
[00:08:01] David Meerman Scott: The first thing I noticed, and this is more than a year ago, was how on Netflix, with all of the TV shows and movies that they were showing me, I could see every single one was either based on what I had seen before or something that they were pushing because it was new. I was never shown anything that was serendipitous, that was interesting, that was unique, you know. And you can see behind me my rock and roll hall of fame. I mean, I'm a huge fan of live music. And when I first subscribed to Netflix, I watched a lot of rock documentaries. And freaking Netflix's algorithm shows me every rock documentary there is, and they've even gone beyond rock and roll to other kinds of music documentaries.
[00:08:55] And I don't watch all of them. I don't want to see all of them. I'm not interested in having my [00:09:00] feed cluttered with them. And then I got into a couple of British sort of programs, like The Crown, and they're now sending me thousands of those kinds of things as well.
[00:09:12] And I thought to myself, wow, I'm kind of understanding, from a practical standpoint, what's going on with an AI algorithm. I get it. I'm understanding what they're doing and why. And then I started thinking especially about Facebook. The same kinds of algorithms exist, of course, with Twitter and LinkedIn and others, and Instagram, of course, which is owned by Facebook, but I was really seeing it with Facebook.
[00:09:42] And I drew a conclusion a couple of months ago that Facebook's AI algorithm is evil. And that was before I watched The Social Dilemma. Then I saw The Social Dilemma, and [00:10:00] rather than opening my eyes, it just confirmed what I was already thinking, that the algorithm is evil.
[00:10:10] Let me back up just a second and say, I'm going to turn 60 next year. And I wanted to make sure that I wasn't just being a curmudgeon about it. Right? You know, I wanted to make sure that I wasn't doing what people were doing 20 years ago, when I saw the marketing revolution happening around online,
[00:10:30] and I talked about the curmudgeons who were all about advertising. So I was like, oh my God, have I turned into the curmudgeon? I'm saying that AI algorithms are evil. Oh my gosh, am I what I was talking about 20 years ago? So I really wanted to make sure that that wasn't what was going on,
[00:10:45] that I wasn't just saying, everything that I know is good and everything that's new is bad. I really do think the Facebook algorithm is evil, because what it does is polarize people. It puts you into a group, and [00:11:00] then whatever you click, it gives you more of it, and more of it, and more of it. And there are so many people out there who then begin to believe the conspiracy theories that Facebook shows them, and begin to focus on
[00:11:15] the political sort of discussions and the political viewpoints that they initially clicked. They just see more of it, and more of it, and more of it. And many of those people, I don't want to say they're not bright enough, because they probably are bright enough, but they're not educated enough to know that Facebook is just showing them what they've already clicked on,
[00:11:38] as opposed to showing them an overall news feed. I spent the first 15 years of my career in the financial news business. I worked for companies like Dow Jones. I was a product manager for news services. I understand news deeply. And when a human is [00:12:00] creating a newsfeed, you know, they try to figure out how they can create a newsfeed that's got multiple different sources and different ways of looking at the news, and they try to be as fair as they can. But when an AI algorithm takes over that news feed and provides that information, it's, I believe, so out of control that Zuckerberg and his gang don't even know what it is doing anymore. I think it's actually gone that far. It's like, oh my God, the computer is taking over. And you know way more about this than I do, Paul, and I'm just trying not to be too curmudgeonly about it, but I really do think that it is one of the worst things that has ever happened to humanity.
[00:12:47] Paul Roetzer: Yep. So the way I look at it, and I'm kind of like you, I try to give these companies the benefit of the doubt, and the reality is that a lot of the innovation in AI that's happening, that we're experiencing [00:13:00] today and that we'll use as marketers, is coming from these companies, whether it's Facebook or Google or Amazon, or, I mean, Apple to a degree, though Apple treats it differently and they don't have a social network. But what they're doing, and the way AI works, is it's trained to achieve an outcome, a goal. And so if we think about how Facebook's business model works, they make money by you staying on Facebook longer, and they sell advertising on top of that. And I think you used the line, if you're not paying for the product, you are the product.
[00:13:31] I think you referenced that.
[00:13:33] David Meerman Scott: Yeah. And I can't remember who said that first, but I did reference that one.
[00:13:37] Paul Roetzer: Yeah. And that is the case here. Your data is what they're selling. And to your point about conspiracies and the other things, this isn't new to social media; it's human behavior, and we've seen it through news.
[00:13:49] It's why you have sensationalized news channels. They know that if they can wrap you up in this mythical idea or these conspiracies, which they know aren't true, [00:14:00] they're going to send you down this path, because you're going to keep clicking. And whether it's on YouTube or Facebook or wherever, you're going to all of a sudden see the next one, and the next one.
[00:14:07] And there's a really good chance the AI will learn to keep moving you further along the spectrum, to get you further from what you probably thought was true, because you're going to keep looking. And so your attention is what they're able to then sell. Yup. That's right. The AI learns ways to do that
[00:14:27] that no one's coding anymore. They're not telling it to serve this ad.
[00:14:32] David Meerman Scott: Exactly. Yeah. And so I truly believe that when you have 2 billion people, I don't know what their latest number is, and I want to stick with Facebook because I think that's the worst offender, if you have 2 billion people, the vast majority of which get most of their news through that newsfeed, and they're being fed this diet, [00:15:00] that is a terrible thing for humanity. And, you know, I think 40-something states have sued Facebook, arguing they're a monopoly. I don't believe that's the right lawsuit. I think the lawsuit should be around the way the AI algorithms work. I think that's a way bigger problem than the monopolistic behavior from those networks.
[00:15:24] Paul Roetzer: And that requires a totally different understanding from the government.
[00:15:28] David Meerman Scott: And there are probably not that many people who have dug into it enough to even understand what these things mean.
[00:15:33] Paul Roetzer: When they're sitting in front of Congress and getting asked these questions, it's just laughable. I mean, obviously they have no understanding of the business model, much less the AI.
[00:15:42] David Meerman Scott: Yeah. I think it was Grassley, one of them, who said, well, how do you make your money? And Zuckerberg says, advertising, sir. These people don't understand how AI algorithms work. They don't even know how these companies make money.
[00:15:59] Paul Roetzer: You cited that Facebook is far and away where people get their news: 52% of US adults get their news from Facebook, which is terrifying.
[00:16:09] David Meerman Scott: It's a remarkable, remarkable number of people who are seeing their news manipulated in a way that drives polarization and conspiracy theories. It's truly awful. Yeah.
[00:16:28] Paul Roetzer: Have you tried to manipulate the algorithm? We actually had this conversation at my agency a couple of weeks ago, where you realize what the algorithm is doing. And again, I don't know if the average person would do this, but our team obviously understands it.
[00:16:43] So we'll go into Instagram, for example, your feed. And one of the girls on my team was saying she'll purposely like specific people she follows, because they'll disappear from her feed. So she'll go find the last 10 things they did and just like them, because she knows that will then surface them back to the top of her [00:17:00] feed and adjust the way the algorithm presents them.
[00:17:03] David Meerman Scott: Yeah. Yeah. I have screwed around with that a little bit. And I very, very much see it, you know. I'll go in, and if I'm retweeting a few people, all of a sudden they're appearing in my Twitter feed when I go in there. So I see that a lot. But my gosh, there are probably way less than 1% of the people who are on Facebook who understand that.
[00:17:29] That's the problem. And so when you see these people who are interviewed about, you know, why they believe in QAnon, they don't actually say it in these terms, but they believe in QAnon because that's all the Facebook algorithm is showing them. And then they believe that the whole world has figured this out, and they just need to make sure that they get out there and tell people about it.
[00:18:00] And that's really dangerous, I think.
Paul Roetzer: So I think we could probably agree the government likely isn't stepping in and fixing the algorithms that are powering Facebook. Earlier in 2020, there was some pushback on advertising; some major brands paused their advertising with Facebook to try to prove a point.
[00:18:22] But I remember in the early days of that, you saw a lot of people saying, yeah, great, what are you going to do, pause your ads for 30 days? Is this really going to change any behavior? Right. So outside of someone stepping in and changing the way Facebook works, because what Facebook claims is that they are neutral.
[00:18:40] It is open, it is free for anyone to share. Well, that's not true.
[00:18:44] David Meerman Scott: You know what, Paul, it's easy for them to say that. And it's actually true to say that they're neutral. Look, Zuck and his team can say, we aren't manipulating this, it just is what it is. That's [00:19:00] true. But the algorithm that they built, the AI algorithm they built, has run amok.
[00:19:05] Paul Roetzer: Correct.
[00:19:06] David Meerman Scott: And so it's absolutely true, they're not manipulating it. I agree with that, actually.
[00:19:13] Paul Roetzer: So what is the answer, though? As brands, as we look ahead, what are we supposed to do?
[00:19:19] David Meerman Scott: I believe that the answer of how we're going to deal with Facebook really comes down to Facebook employees. And I've noticed very recently that there's been a backlash from many of their employees, even publicly, around what the company is doing. And I don't believe that the employees who work at Facebook are evil. I know many of them; they're good people. I'm not so sure about the management team, but the employees aren't evil, and the employees can see what their AI algorithms are doing.
[00:20:00] And I think the employees have to figure out what that means for them. And I think it means one of three things, and every single Facebook employee needs to make a decision. Number one, you become an agent of change, and you speak out about what's going on and perhaps get fired. Yes. Number two, you just suck it up and say, all right, this is the company I work for,
[00:20:28] and I'm going to go to hell for it. Or number three, you quit. And those are the three choices. There are no other choices.
[00:20:38] Paul Roetzer: So do you think, I mean, this is what often happens, and I don't want to get political, but you can look at some examples where, yes, there are people who work for organizations who believe they can create greater change by being on the inside.
[00:20:57] They see what is going [00:21:00] wrong, but they do have a voice. And while their voice may not be listened to right now, they at least have a seat at the table to say, we should think about this a different way. Yes. And so there are people I know who work at Facebook who are in that camp, who are incredible people, who do believe it can do good, but do see the dark side.
[00:21:24] David Meerman Scott: You know, Paul, it can do good. I mean, I'm connected with friends I haven't seen in person in 30 years. Facebook has brought humanity together in a really fabulous way, but it has gone so far beyond those initial things to become so evil that now, rather than being a force for good, it's a force for bad.
[00:21:51] Paul Roetzer: Right. Now, I've always looked at AI development in this way. There was a great book called The Pentagon's Brain that I read years ago, and it was about [00:22:00] DARPA. It went back decades, to the 1950s, and the origins of AI within the government and trying to use it for defense.
[00:22:08] It's phenomenal. But in it there's the moral of the story, around things like the atomic bomb that came later, of just because you can doesn't mean you should. And so I think with AI, many times there are incredible things that can be built. There are ways to take leaps forward in terms of the way people consume information, the way we affect people's behavior.
[00:22:31] But in Facebook's case, when you look back 15 years after, you know, it became available on college campuses in 2005, and you say, if you had to make that call now, knowing all the good that came from it, all these people you've connected with, all the things you share, especially now when we just want to know what's going on in people's lives, but you also know all the negatives that can come from it.
[00:22:56] Do you still create Facebook?
David Meerman Scott: You know, I believe that if Facebook hadn't been built, something else would have been built that would have become Facebook. You know, when The New Rules of Marketing and PR came out in 2007, MySpace had more users than Facebook. It was something like 25 million users on MySpace.
[00:23:19] And Facebook was only for students at that time. Very soon after, they opened up to non-students. So this world would have been created; even if it wasn't Zuckerberg and Facebook's world, it still would have been created. And I think we need to really just take a good, hard look at this.
[00:23:39] And so I think that Facebook employees are part of the potential solution, but I think all of us are, too: get educated around what's going on, speak out when we can. And, you know, I don't advertise on Facebook, so I don't. I was actually going to ask you about that. This is an internal struggle.
[00:24:01] Paul Roetzer: It is so hard to talk about this topic without sharing some really personal feelings on this stuff. All right, I'll try and figure out how to word it. So, did you watch The Great Hack on Netflix, the story of Cambridge Analytica and how they weaponized data? I watched The Social Dilemma as well.
[00:24:18] Okay. So The Great Hack, if you haven't watched it, tells the story of how Cambridge Analytica in 2016 weaponized data and conducted psychological warfare on American citizens to manipulate the election. That's the simplest way to say it. Yep. Indisputable. They did this. I already had some feelings, but after that I was repulsed by what I had seen, and I thought I understood.
[00:24:41] Yeah, I thought I understood what was happening, and it just opened my eyes completely. So as someone who owns an agency that uses Facebook as a key marketing channel, I have now, for four years, struggled with the fact that we're choosing to continue to use a product [00:25:00] and a platform and a company that I am convinced
[00:25:04] chooses to allow these evil things to happen. Yes. Now, the reason we haven't made a change, and there are probably other people who have struggled with this, I don't think it's unique to me, is that there are evil people in all of these companies. So if you're going to say, farewell, we're not going to work with Facebook,
[00:25:22] it's like, there are 49,000 employees out of the 50,000 who are probably insanely good people, and it helps small businesses. So there's this part of it that does do good. I can't step back as an agency owner and say, forget it, we're just done advertising on Facebook.
[00:25:39] And I don't know that it would change anything if we did, if it wasn't part of a bigger movement. And I'm not even proposing that that's what people should do. I'm just more thinking out loud: I struggle with this. This is real. I understand what they're doing. I understand the negative effect it has on people and society.
[00:25:57] And I don't know where the stopping point is, and I don't know [00:26:00] what the end game is. And it's one of those things that you and I would be sitting around at a conference, probably talking about over a drink, and then I'd go back to my daily life and I'd just put it away for six months and not think about it.
[00:26:14] David Meerman Scott: Exactly. I mean, in my case, I don't spend money on advertising on Facebook or Instagram, and I won't ever. I mean, not that my couple hundred bucks a month would make any difference anyway. And I don't talk about Facebook any longer in my presentations. You don't tout it as, here's what you should do if you're marketing. In the next version of The New Rules of Marketing and PR, I will probably start to say that Facebook is evil.
[00:26:43] And the one thing that I thought about, that I could do, like you, that I haven't done, is I could publicly say I'm going off Facebook and I deleted my account. Yep. I haven't done that, Paul. I haven't done that. And again, is it going to make a difference? [00:27:00] Do you have less ability to effect change by not being there?
[00:27:05] Paul Roetzer: It almost goes back to that. You're like...
[00:27:07] David Meerman Scott: I feel like I want to understand the platform and how it works, and for that reason I need to be on it. I don't feel as if I can just, you know, wash my hands of it and be done with it and make a one-time statement that says DMS is off Facebook.
[00:27:25] That would get a mild social media ripple and then go away. I don't know that that's as important a way to share my feelings as it is to try to understand deeply what's going on and talk about it like you and I are. Yeah. Yeah. I was just going back to, you ended that post with five things that marketers could do.
[00:27:46] Paul Roetzer: So if we're assuming this backlash is coming, if we're assuming this acceptance that it is evil, and I mean, I don't know that you'd get a ton of debate from people, it's more that it's become so utilitarian within their lives that it's like, yeah, it is, but [00:28:00] so are a lot of these companies. So I'm just looking at your takeaways, and I'm kind of leaning toward this whole idea of, educate yourself on how the algorithms work and speak out.
[00:28:08] And I do think it just needs to be discussed, because I know so many of my family members watched The Social Dilemma and they were just creeped out. They still don't understand how the AI works, and they don't understand that it's only doing a small piece of what it's capable of doing. So what we really need is more
[00:28:25] marketing professionals who can connect the dots and look at the bigger picture, that it's really about manipulating behavior. Because what we do as marketers is try to get people to take actions. You try to understand needs and desires and fears. But with AI, you can manipulate those things in whole new ways.
[00:28:42] And so, you know, I think that we've started talking about the Marketing AI Conference, and you were there, and our theme there was more intelligent, more human. What we really need is our marketers talking about the human side. Absolutely right. Absolutely right. And you know, it was really interesting, because as you were just talking here a second ago,
[00:28:57] David Meerman Scott: I remembered something. I'm going to get the quote [00:29:00] wrong, but in that blog post we were talking about, that I wrote, my favorite quote out of The Social Dilemma was an add-on to the riff that if you're not paying for a product, you are the product. Is this the person, the guy with the dreadlocks?
[00:29:15] I forget. Yeah. Yeah. Super interesting guy. He said, no, that's not the product. And I'm going to paraphrase, I won't get the quote precisely right, but he said the subtle change in your behavior when you're using a social network, that's the product. Yeah. I highlighted the gradual, slight, imperceptible change in your own behavior and perception as the product.
[00:29:42] Paul Roetzer: I agree. As soon as I read that in your piece, I was like, yeah, that's it. And it's really creepy to say that you go in using Facebook believing one thing, and then, based on the stuff that you click, you finish months later thinking something else. I stopped using it. [00:30:00] Because I was just like, where does this go?
[00:30:02] Like, 12 months from now, are we back here being like, yeah, nothing changed? Yeah. I mean, I think more people are going to do what you and I are talking about. And I think people in our industry have probably been doing that for a couple of years, because, again, we've seen it coming. I just stopped using it.
[00:30:16] Especially during the election cycle, I could not go into Facebook. Because, again, we could go down a whole other path. But I still use Instagram to a degree, to look at pictures of space and stuff. But yeah, I'm kind of with you. I'm there because I need to understand the platform, because it does help businesses grow.
[00:30:37] David Meerman Scott: I really do. I'm going to spend 2000-and, what the hell year are we coming up to? 2021. I'm going to spend 2021 talking about this topic. And I don't believe that I can talk about it intelligently unless I'm on Facebook, just kind of getting a sense of what they're doing and how they're [00:31:00] doing it.
[00:31:00] But I would say, to people who maybe aren't seeing it as much on Facebook, the first thing that opened my eyes was Netflix. It was with Netflix that the light bulb went poof: oh my gosh, I'm finally getting how these algorithms work, because all it was showing me was rock documentaries. Now, granted, here they are.
[00:31:23] Right, I'm a rock and roll guy, but that's not all I want to watch. Spotify would be another one of those things. Yeah. Do you use Spotify? I don't. That would be the same thing. It functions the same. It's a recommender algorithm. It learns what you listen to, but to your point, it's like, yeah, great, I listened to three hip hop songs and, you know, two country songs.
[00:31:45] Paul Roetzer: Now the algorithm starts to just show me a bunch more hip hop. So it's like, oh, that's good. And all of a sudden, five days later, it thinks all I like is hip hop. When in reality, I'm going to go listen to classical music, then I'm going to go listen to monster bands from the eighties. It's like, I'm not just two categories.
[00:31:58] David Meerman Scott: That's exactly right. That's exactly, [00:32:00] exactly right. And that's what educated me. That was the thing that really made me figure it out. And then I was able to take a look at Facebook with new eyes. Yep. And really, truly see what it was doing to me. And that's when I backed off and said, I don't go into the general Facebook feed anymore.
[00:32:19] Well, I shouldn't say anymore; I very rarely do. And when I do, it's just to see what it wants to show me. I go in, and there are a couple of groups I go to that aren't available anywhere else, but I really do want to make sure that I see what's happening there and be able to explore, when I want to, and see, oh my gosh, I clicked on that ad
[00:32:40] and now they're trying to sell me this. And, you know, I commented on that blog post, and all of a sudden they know that I looked at that blog post over on Facebook. It's like, you know, there's that subtle, I don't want to get the quote wrong, that subtle, gradual change. Yeah.
[00:32:58] Paul Roetzer: Well, as you continue your research into this, we'll definitely come back to it. I'd love to talk about it in another 12 months and see where you've gone and what you're doing. Thank you.
[00:33:07] David Meerman Scott: Thank you again. I mentioned it at the top, but thank you for your leadership in this area, because, you know, me in a tiny way, you in a big way, we need to make sure that marketers know what's going on here. And there is a choice. You don't have to advertise with Facebook. You don't have to use Facebook. But you certainly need to understand Facebook, because I believe very, very strongly, and it sounds like you are in the same camp, that it's a dangerous thing for society and that the algorithm is a dangerous thing.
[00:33:41] Paul Roetzer: I don't want my kids on it, let's just put it that way. I hope it's all gone by the time they're teenagers. All right. So we're going to end with rapid fire, as we always do.
[00:33:52] So we've got a couple of quick ones here, David. Voice assistant you use the most: Alexa, Google Assistant, Siri, or you don't use them?
[00:34:00] David Meerman Scott: I don't use them. No, not at all.
[00:34:04] Paul Roetzer: More valuable in 10 years, a liberal arts degree or a computer science degree?
[00:34:25] David Meerman Scott: Coming from Kenyon College in Ohio, just a couple of hundred miles south of where you are right now, Paul, which is a fabulous liberal arts college,
[00:34:35] yeah, I'm absolutely convinced that learning how to think first is really important, and a liberal arts degree is critical for that. So I'm in the camp of liberal arts. Great. I'm a big fan also. All right.
[00:34:49] Paul Roetzer: Net effect over the next decade: do you feel more jobs will be eliminated by AI or more jobs created by AI, or is it not going to make a meaningful impact one way or the other?
[00:34:59] David Meerman Scott: I think every time there's a new technology, more jobs are created. It's just different kinds of jobs, you know? Are there still going to be people, for example, writing financial news articles based on corporate earnings releases? No, those jobs are going away, but there will still be jobs that are created. I'm interested to hear what you think about that.
[00:35:24] Paul Roetzer: I'm in the more-created camp. I just don't think we can see them yet. I mean, it's always hard to see around the corner at what's going to come from it, but every time I give a talk or go meet with a corporation, I leave thinking, well, I just
[00:35:35] came up with five more jobs that are going to exist that don't right now. I think that's exactly right. And the other thing that's interesting, tying that back to the liberal arts question, is if you learn how to think, then you're going to be prepared for those new jobs, because you can figure out how to apply your brain power to something that didn't even exist a couple of months ago.
Great. All right. Last one: social media site that [00:36:00] you use the most?
[00:36:00] David Meerman Scott: It's not Facebook, as we just spent half an hour talking about. I'm a Twitter guy, Paul. I love Twitter. And yes, they have an algorithm too, but it's not quite as heavy-handed.
[00:36:19] It's not designed to do the same thing. Yeah. And I'm on it every day. I think it's a very, very important place for every person on the planet to be able to see what's going on with the people that matter to them, whether that's politicians or journalists or authors or others. You know, many times they'll communicate right to you.
[00:36:41] It's super cool. So that's my thing. I'm with you a hundred percent. Lists in Twitter are like my go-to. I have a news list, a science list, and an AI list. And what I've done, to your point in that article about human curation, is when a topic becomes important to me, I go find the authorities on it, and then I see who they retweet and I add [00:37:00] them to my lists. So I build them up, like, when something happens, like, you know, a new government stimulus program,
[00:37:05] okay, I'm going to go to my feed and see what those people who I trust are saying about it, rather than me trying to guess or go to Facebook. Exactly right. Yep. Yep. That's super cool. And, you know, people who dismiss it, I believe, just don't really understand it. Right. Well, I would pay for Twitter.
[00:37:21] Like, if they said it's $9.99 a month, I'm in, done. Oh, it's premium? What's the premium option? I'll pay more. Yeah. Yeah. What do I get for the 250 bucks a year? I'm in. Yeah, I'll take it. All right. Well, this has been great. Do you have any final thoughts for our audience in terms of AI, social media? You know, I was thinking this was a little bit of a downer in particular, so give me something good.
[00:37:43] David Meerman Scott: We started with this awful pandemic and how we're not getting on airplanes and traveling and meeting people anymore, and how Facebook is ruining the world. And despite all that, Paul, I'm actually super [00:38:00] enthusiastic and optimistic about the world, and, you know, how wonderful this place is that we get to inhabit, and how it's super cool to interact with people like you on a daily basis.
[00:38:12] And so I'm super bullish about 2021, even though there are these things that we talked about today that are bad. The vast majority of things happening out there are super positive and good. Good. And I'm with you, and I think an awareness of what's happening is the first step.
[00:38:30] And so, yes, it's evil. Yes, it can affect us. But the more people who are willing to talk about it and understand it, the more we can collectively figure out a better way forward. And I agree. I mean, it's not the most fun topic to talk about, but it's got to be talked about, and I'm glad you've taken the platform that you have and are moving that conversation forward.
[00:38:51] Paul Roetzer: So what's the best place for people to find you? Well, we know you're on Twitter.
[00:38:55] David Meerman Scott: We just talked about Twitter: @dmscott. And my full name, David Meerman Scott, the only one on the planet. I started using my middle name professionally 20 years ago for SEO purposes, so you enter my name and you find me.
[00:39:11] Paul Roetzer: Excellent. All right, David, always a pleasure. Thanks, Paul. Always great to talk to you. All right. This has been The Marketing AI Show. We'll look forward to talking with you again next time.
Sandie Young
Sandie Young was formerly the Director of Marketing at Ready North. She started at the agency during the summer of 2012, with experience in magazine journalism and a passion for content marketing. Sandie is a graduate of Ohio University, with a Bachelor of Science from the E.W. Scripps School of Journalism.