Social media is at a pivotal moment, and this episode dives deep into the question of whether it can be saved or reimagined. Jesse Hirsh leads a dynamic discussion with guests David Mason, Jeanette Herrle, Sherida Ryan, and Greg Majster, exploring the polarizing nature of platforms like Twitter and TikTok, and the implications of their algorithms on public discourse. The conversation highlights the need for transparency, accountability, and the potential for social media to foster genuine connections rather than division. The panelists challenge the existing structures and propose innovative ideas like cooperative systems and educational frameworks to encourage more meaningful participation. As they navigate the complexities of social media's role in today's society, they collectively envision a future where the medium serves the many, not the few, fostering a healthier digital environment for all.
Jesse Hirsh invites a panel of thought leaders, including David Mason, Jeanette Herrle, Sherida Ryan, and Greg Majster (Stro1), to engage in a dynamic conversation about the future of social media in an era marked by political upheaval and societal changes. The discussion kicks off with an evaluation of the current state of social media platforms, particularly in light of recent events such as the inauguration of a controversial president and the implications of unchecked online discourse. The panelists express concerns over the role of social media in facilitating polarization and the spread of misinformation, questioning whether these platforms can be reformed or if they are fundamentally flawed. David Mason prompts a reflection on the original intentions of social media as spaces for connection and community, contrasting that with the present reality where algorithms prioritize sensationalism over meaningful engagement.
Amidst these reflections, the conversation also touches on themes of anonymity, safety, and the potential for social media to empower users rather than undermine individual agency. Jeanette highlights the historical parallels between the rise of print media and the current digital landscape, suggesting that just as the printing press democratized information, social media has the potential to do the same, if appropriately harnessed. The panelists explore the idea of creating a more equitable social media framework that encourages diverse voices and fosters constructive dialogue, rather than amplifying extremist views. They consider how user participation could be incentivized through innovative structural changes, such as promoting transparency and accountability among platforms.
Ultimately, the dialogue leads to a consensus that social media must evolve into a space that prioritizes community over chaos, empowering users while also establishing necessary guidelines for responsible interaction. The discussion concludes with a call to action for listeners to rethink their relationship with social media, advocating for a collective effort to reclaim these platforms as tools for good, promoting solidarity and understanding in a fragmented digital world.
Takeaways:
- Social media platforms currently reward outrage and extremism; transparency about their rules and algorithms is a precondition for anything better.
- The panel draws a parallel between today's platforms and the early explosion of print: the same tools can be both propaganda machines and instruments of liberation.
- Most users are lurkers; rethinking how participation is incentivized, without forcing it, is central to healthier platforms.
- Ideas floated include aggregate or group identities, controllable anonymity, moderating before posting, and small cooperative systems instead of monoliths.
- Lasting change depends on education that fosters critical thinking and doing, and on being social offline as well as online.
Transcript:
Jesse Hirsch 00:00:04
Hi, I'm Jesse Hirsch.
Jesse Hirsch 00:00:06
Welcome to Metaviews, recorded live in front of an automated audience.
Jesse Hirsch 00:00:16
Today we've gathered the OG Metaviews crew to kind of talk about whether social media can be saved.
Jesse Hirsch 00:00:24
Now, that may be a bit of a misnomer title.
Jesse Hirsch 00:00:27
That's kind, kind of what I'm hoping to get out of it.
Jesse Hirsch 00:00:30
But often, as we do with what I'm now calling the OG Meta crew, we tend, tend to take a bit of a random, spontaneous direction.
Jesse Hirsch 00:00:40
So who knows where we're going to end up other than social media kind of being the frame?
Jesse Hirsch 00:00:46
Because as we meet today on this very cold, dark day in January where we should be honoring the great Martin Luther King Jr.
Jesse Hirsch 00:00:57
We are instead lamenting the inauguration of Donald Trump becoming the first fascist president of the United States.
Jesse Hirsch 00:01:07
And social media has been playing a role.
Jesse Hirsch 00:01:10
And it's been interesting to see almost all the leaders of social media platforms not only donate money to the inauguration, but basically say, hey, look boss, whatever you want, you want no moderation, you got it.
Jesse Hirsch 00:01:24
No moderation.
Jesse Hirsch 00:01:26
And maybe even TikTok may be subjecting itself to American control.
Jesse Hirsch 00:01:31
So it seems like a historic moment when things are changing, things are up in the air, when it's an opportunity to discuss what should be, what could be, what is, and maybe even what has been.
Jesse Hirsch 00:01:44
So this salon was partly instigated, catalyzed by David Mason, a long standing Metaviews member.
Jesse Hirsch 00:01:52
So, David, why don't you kind of set us off with the angle you wanted us to take and in particular why you think social media happens to be potentially in a pivoting moment?
David Mason 00:02:07
Yeah, sure.
David Mason 00:02:08
So, I mean, it's good to start by, you know, trying to frame what is, what we mean by social media.
David Mason 00:02:14
And I think to a large degree it's like the, you know, the Twitter or Twitter and, and all the different sites where, you know, the idea is you just kind of shout stuff into the void and, and you connect with people and you have a good time was kind of the idea.
David Mason 00:02:33
And you know, I, I'm not a huge social media user myself.
David Mason 00:02:38
I find it pretty overwhelming, but it can be useful.
David Mason 00:02:41
And if you have a little, you know, if you might have a.
David Mason 00:02:44
Find yourself in a decent little conversation.
David Mason 00:02:47
I really liked the idea of Twitter in the first place when it was this hyperlocal idea.
David Mason 00:02:52
They actually embedded everyone's latitude and longitude in every single tweet.
David Mason 00:02:58
And it was really more of a like a commons idea at that point, because anyone could access that information, create their own views of it, and so on.
David Mason 00:03:09
But, you know, here we are now and it's all been bugged and mined and surveilled.
David Mason 00:03:19
It polarizes people.
David Mason 00:03:22
It's really almost designed for warping and distorting conversation.
David Mason 00:03:29
So what I wonder is what should it look like?
David Mason 00:03:33
I mean, maybe it wasn't even a good idea in the first place because it's just so easily.
David Mason 00:03:39
I mean, maybe it's just not a good idea to put yourself out there, you know, but, you know, if it is a good idea, what do we want it to look like?
David Mason 00:03:47
You know, in order for people to be able to present their facts and perspectives and represent themselves and also, you know, safely share tactics.
Jesse Hirsch 00:03:57
And I certainly, to your point about how much should we be sharing is something I wish was more of a public conversation in this dawn of the fascist era, because this is all evidence that can be used against us.
Jesse Hirsch 00:04:11
And I've been kind of wondering whether I should be deleting past histories or conversely being proud of it.
Jesse Hirsch 00:04:17
Right.
Jesse Hirsch 00:04:17
And trying to be a voice that's an alternative to this.
Jesse Hirsch 00:04:21
And of course the other side, David, in terms of you kind of raising the question, what should it look like?
Jesse Hirsch 00:04:27
There is this kind of open source initiative either within the kind of AT protocol which the Fediverse is really championing and Bluesky also being not exactly part of the Fediverse, but nonetheless embracing kind of open source protocols.
Jesse Hirsch 00:04:44
There is movement certainly on the developer level, if not also on the user level, with people leaving X or leaving Twitter.
Jesse Hirsch 00:04:52
Jeanette, I kind of want to throw to you only because I know that you've been going through your own kind of assessment, similar to how David's thinking, should we have social media?
Jesse Hirsch 00:05:03
What role does social media play?
Jesse Hirsch 00:05:05
And I know, Jeanette, you're going through your own kind of re evaluation of where you kind of spend your time in social media and which platform should get your attention or value.
Jesse Hirsch 00:05:16
So where do you see us in this current spot?
Jesse Hirsch 00:05:19
And to David's point, what should this stuff be looking like?
Jesse Hirsch 00:05:25
What's your ideal kind of social media configuration?
Jeanette 00:05:30
I mean, I honestly don't know.
Jeanette 00:05:32
I do feel like it's one of those things where when you see it, you'll know that that's what it is.
Jeanette 00:05:39
Two things sort of have been on my mind and what David said kind of reminded me.
Jeanette 00:05:44
One is that I've seen elements of this moment before.
Jeanette 00:05:50
Part of what I studied when I was doing work in history was the massive explosion of print.
Jeanette 00:05:57
When printing, as a way of making your voice heard, became accessible to just regular Folks, starting in this, you know, if we're talking about the UK, for instance, starting in the 17th century.
Jeanette 00:06:13
And of course, the natural consequence of that was the learned elites lamented that now every idiot had a megaphone, essentially, and that the, the public sphere was being flooded with all kinds of garbage.
Jeanette 00:06:31
I think this happens every time.
Jeanette 00:06:32
It's a double edged sword.
Jeanette 00:06:34
It's the other thing that has been really bothering me lately is I have a mutual on Twitter, which I'm still on.
Jeanette 00:06:41
I'm kind of still hanging on there.
Jeanette 00:06:45
And he, this is a really super smart guy who I respect a lot, but he keeps banging on, he keeps reposting the same tweet, which is saying, in effect, oh, you dummies.
Jeanette 00:06:57
You think that things like RedNote and TikTok are liberatory and subversive, but they are just, you know, the most masterful propaganda control mechanism there is.
Jeanette 00:07:10
And every time I read that, I think, dude, they're both.
Jeanette 00:07:13
How.
Jeanette 00:07:14
Why is that so hard to accept that they can be both?
Jeanette 00:07:16
I, I think there's a, you know, in lefty circles, there's this truism about that I think started with Audre Lorde.
Jeanette 00:07:23
You know, the master's tools can never be used to disassemble the master's house.
Jeanette 00:07:27
But my experience of human beings is they hack everything, they subvert everything.
Jeanette 00:07:34
It's engineers who think tools will be used for the purpose that they are designed for and nothing else.
Jeanette 00:07:40
People are so perversely contrary.
Jeanette 00:07:43
And you see this especially with children, they will do things you couldn't even imagine with a tool.
Jeanette 00:07:50
And I don't think social media is any different in that respect.
Jesse Hirsch 00:07:54
It certainly suggests that there are applications of social media that we haven't gotten into yet, that there's ways in which we could bend the platform, bend the medium, so that it has perhaps more resilient characteristics outside of the propaganda, outside of the kind of contagion of stupidity.
Jesse Hirsch 00:08:14
Now, Stroh, I want to bring you in to the conversation.
Jesse Hirsch 00:08:17
Although, David, you just raised your hand, so if you want to do a hot pursuit, please jump in.
David Mason 00:08:22
Oh, yeah, I just wanted to.
David Mason 00:08:24
I mean, I'm sorry.
David Mason 00:08:25
One thing that really has been on my mind for a long time is that, you know, the, the, the idea that this media could and should be symmetrical, and in particular, we, the individual, should be using the media to critique the system.
David Mason 00:08:47
But what has happened instead is that it's asymmetrical and the system is basically being used to undermine the individual.
David Mason 00:08:56
I have a lot of ideas about what it should look like.
David Mason 00:08:59
And that's exactly how it looks to the people behind the dashboards at Facebook and Twitter and even Bluesky and unfortunately Mastodon as well.
David Mason 00:09:13
So, yeah, sorry, I just wanted to add that one point.
Jesse Hirsch 00:09:16
Well, and I wouldn't mind at some point coming back to the idea of transparent analytics, of having an actually transparent social media system in which we can not only see the activity of who's posting and what they're posting, but have that administrator's view, have a sense of the meta view.
Jesse Hirsch 00:09:33
But Stroh, I kind of wanted to bring you in both to put you on the spot as someone who's a creator, right.
Jesse Hirsch 00:09:40
Someone who, you know, in using social media, is doing so from the perspective not just of the consumer, but of someone producing content.
Jesse Hirsch 00:09:48
But I'm also curious, from an athletic perspective, are there rules to the playing field of social media that we don't have, that we could have?
Jesse Hirsch 00:09:59
And I say this because, you know, a lot of what makes sports interesting are the constraints are the rules that are placed on the game.
Jesse Hirsch 00:10:07
And social media doesn't really have rules.
Jesse Hirsch 00:10:11
Maybe there should be some.
Stroh 00:10:13
Right?
Stroh 00:10:14
So, you know, it is kind of, it's kind of an anarchist way of communicating, isn't it?
Stroh 00:10:22
And it's like if we had, if we had rules like transparency, for example, then maybe we would be more accountable.
Stroh 00:10:29
It is very much like what Jeanette suggested, like the early days of print.
Stroh 00:10:35
Right.
Stroh 00:10:36
Because this is such a new technology and we haven't caught up, our educational system hasn't caught up really, our moral and ethical standards, laws even haven't caught up to where we are, giving everybody the bullhorn.
Stroh 00:10:51
Right.
Stroh 00:10:52
So, yeah, it would help.
Stroh 00:10:53
Of course it would help.
Stroh 00:10:55
But how do you do that?
Jesse Hirsch 00:10:56
Right.
Stroh 00:10:56
It goes back to education, obviously, like, because, speaking as a parent even, this is how our kids socialize, like, truly, it's, you know, this is where they learn their, their behavior from in so many ways.
Stroh 00:11:11
That's why our world is becoming so much more violent.
Stroh 00:11:14
Well, you know, and that's arguable, you know, but sexualized.
Stroh 00:11:19
Why there's so much more mental health issues when it comes to, like, how people see themselves, not just girls, but boys too now, right.
Stroh 00:11:28
To, to a much greater extent.
Stroh 00:11:31
So of course, if, if we could find a way to, you know, have these things.
Stroh 00:11:36
But it comes back to regulation.
Stroh 00:11:37
How do we, you know, how do we regulate something that is literally in everybody's hand?
Stroh 00:11:43
Do we need to control it?
Stroh 00:11:45
Right.
Stroh 00:11:45
And I, I think long term that's going to have to happen when too Many people are.
Stroh 00:11:50
Start.
Stroh 00:11:51
Start getting hurt or not because of what, like just even your post this morning, I was kind of skimming through it, talking about the, you know, how they're.
Stroh 00:12:01
Whoever.
Stroh 00:12:02
There's what.
Stroh 00:12:03
I forget the term you use, but basically it's you're screwed or you're getting screwed kind of a thing.
Stroh 00:12:10
Right.
Stroh 00:12:10
So maybe you can tell me what that word was.
Stroh 00:12:14
I don't know if you remember.
Jesse Hirsch 00:12:15
Well, it was like, sucks to be you.
Jesse Hirsch 00:12:16
Was.
Jesse Hirsch 00:12:16
Was.
Stroh 00:12:17
Yes.
Stroh 00:12:17
Sucks to be you.
Stroh 00:12:18
That's what it.
Stroh 00:12:18
That's right.
Stroh 00:12:19
Sucks to be you.
Stroh 00:12:20
Well, how do we make sure that, you know, we have things in place and that that is the government, the government's job to govern that so we don't cause harm with it.
Stroh 00:12:31
Right.
Stroh 00:12:31
To kind of regulate it.
Stroh 00:12:33
And maybe that's a long way to say, yes, regulation would be good, because sports are great because you, you have a code that you can play by.
Stroh 00:12:41
Right.
Stroh 00:12:42
And if there's no code, then it's just the biggest bully wins.
Jesse Hirsch 00:12:45
Well, and I think the other sort of point you made right at the outset was transparency, because the thing about sports is, you know, the rules.
Jesse Hirsch 00:12:53
Right.
Jesse Hirsch 00:12:53
Like.
Jesse Hirsch 00:12:54
Like both teams understand the rules.
Jesse Hirsch 00:12:56
Both teams are playing by the same rules.
Jesse Hirsch 00:12:58
That doesn't happen in social media.
Jesse Hirsch 00:13:00
Right.
Jesse Hirsch 00:13:01
If, if there are rules, it's hidden.
Jesse Hirsch 00:13:02
Go ahead.
Stroh 00:13:03
And what's more is that there's somebody ref refereeing the game.
Jesse Hirsch 00:13:07
Yes.
Stroh 00:13:07
A neutral party that needs to actually, like, try to implement that.
Stroh 00:13:13
Right.
Jesse Hirsch 00:13:13
Although there is, you know, a lot of speculation online that the pervasive gambling in sports has made referees corruptible.
Jesse Hirsch 00:13:22
Right.
Jesse Hirsch 00:13:23
And it is interesting to see them doubt that.
Stroh 00:13:26
But that's not new.
Jesse Hirsch 00:13:28
No, no, but it's, it's maybe louder.
Jesse Hirsch 00:13:32
And I think to your point about rules, you know, people didn't like that TikTok was being banned or subject to rules, so they ran to RedNote, which has more rules.
Jesse Hirsch 00:13:43
Like, the terms of service of RedNote are really quite onerous and include things like you can't bully, you can't be mean.
Jesse Hirsch 00:13:51
And people like that, even though I don't think they're going to stay, they're coming back to TikTok.
Jesse Hirsch 00:13:56
But to your point, I think there is a desire for them to be not so much control, but fairness.
Jesse Hirsch 00:14:03
And yet Jeanette's point of humans resist control, humans find ways to evade control, also has a dynamic here, I think that is interesting.
Jesse Hirsch 00:14:14
So, Sharita, if I could bring you into the conversation, I'm curious where you fall into all of this again as a social Media user.
Jesse Hirsch 00:14:24
But keeping in mind, David's kind of a central challenge, which is what should social media be like?
Jesse Hirsch 00:14:31
What should it look like?
Jesse Hirsch 00:14:32
How should it work?
Jesse Hirsch 00:14:33
Or maybe should it not exist at all?
Jesse Hirsch 00:14:36
Which is certainly something I think that we should entertain on the table.
Sharita 00:14:42
The first thing, really, that I've been thinking about all the way through this is that social media, its foundation is social.
Sharita 00:14:55
Well, some people think its foundation is social network theory.
Sharita 00:15:02
Certainly you can analyze it by social network theory, but there are some theories about this.
Sharita 00:15:10
And generally speaking, people who are participating are following those theories.
Sharita 00:15:21
So, for instance, one theory in terms of social networks is that birds of a feather stick together.
Sharita 00:15:34
And when that happens, when you are generally, you know, really focusing on somebody who has the same bias as you, you begin to form social networks.
Sharita 00:15:51
And then that relates in a way to who you trust.
Sharita 00:15:57
You trust the people in your pack.
Sharita 00:16:00
You don't trust the people out there.
Sharita 00:16:03
So that dynamic's happening.
Sharita 00:16:06
On top of that, if you were to put to.
Sharita 00:16:11
To think about what Jeanette is saying and that there's always outriders, there's always out there, people in the, you know, whatever sphere that are not playing by anybody's rules, including social network theory, then the idea of, all right, so there already are some rules in terms of social networks.
Sharita 00:16:36
I'll go back to social network theory, but then you have algorithms being put on top of that.
Sharita 00:16:46
So another set of rules is going on, and that set of rules is dependent on the bias of whoever programmed the algorithm.
Sharita 00:16:58
So what you're getting here is a cacophony of sound.
Jesse Hirsch 00:17:03
Although let me push back there, because the word rhythm can be found in algorithm and that, you're right, I think at the default level, it's a cacophony of sound.
Jesse Hirsch 00:17:17
But what was interesting in these quote, unquote, final days of TikTok, because of course, they weren't, but they created a mania.
Jesse Hirsch 00:17:25
They created a sense that everyone experienced it.
Jesse Hirsch 00:17:29
TikTok was giving them a rhythm.
Jesse Hirsch 00:17:32
It was giving them a rhythm that, yes, in some cases, they're literally dancing to, but in other cases, it speaks to why they're participating on the platform.
Jesse Hirsch 00:17:44
Because I was also thinking about, to your point, the cacophony and the birds of a feather.
Jesse Hirsch 00:17:50
What I really liked about Twitter during the pandemic was the people who, like, who are birds of a feather.
Jesse Hirsch 00:17:58
For me, around Covid, right, Who were Covid aware, who were researching about COVID who were saying, yeah, you gotta wear a mask.
Jesse Hirsch 00:18:05
Yeah, you gotta protect yourself.
Jesse Hirsch 00:18:07
And there was a part of me that didn't want to leave Twitter because I didn't want to leave those people, but now I found them on Threads.
Jesse Hirsch 00:18:16
So Threads has quickly figured out that I'm into Covid people, and I keep going back to Threads, even though it's actually a craptacular social media platform.
Jesse Hirsch 00:18:28
But it's because it's giving me that birds of a feather.
Jesse Hirsch 00:18:31
However, the rhythm of the Threads algorithm is terrible.
Jesse Hirsch 00:18:37
Like, absolutely terrible.
Jesse Hirsch 00:18:39
There's no rhythm at all.
Jesse Hirsch 00:18:41
And so I'm going there for this single note.
Jesse Hirsch 00:18:44
But to your point, there's a lot of cacophony.
Jesse Hirsch 00:18:46
So I'm positing that the social media I want to go back to.
Jesse Hirsch 00:18:51
David's question is one that has a lot of rhythm, that the algorithm has a rhyme to it, that the algorithm has a flow to it, that it generates a kind of music that keeps me kind of engaged.
Jesse Hirsch 00:19:06
And that's been rare for me as a social media user.
Jesse Hirsch 00:19:10
But TikTok has done it on, on some occasions.
Jesse Hirsch 00:19:13
Stroh, please jump into the conversation.
Stroh 00:19:15
Yeah, you know, what's, what's coming to my mind when you're talking about birds of a feather and just the rules around how these things, you know, our social rules are built into our nature or whatever is actually silos.
Stroh 00:19:32
I think the danger of, the danger of social media isn't that we're birds, but.
Stroh 00:19:38
But that it's easier.
Stroh 00:19:39
The algorithm creates silos.
Stroh 00:19:41
And it's not a natural discourse.
Stroh 00:19:43
Right.
Stroh 00:19:44
Like, we're not like birds of a feather have.
Stroh 00:19:48
Or birds in general.
Stroh 00:19:50
Right.
Stroh 00:19:50
They have a purpose in nature, whereas it, they're, you know, it's.
Stroh 00:19:55
I, I think that the real issue is that the algorithm, the rhythm itself creates silos and structures that are in.
Stroh 00:20:04
Impenetrable in a way, and that's why we've been.
Stroh 00:20:07
We've become so polarized and that's, that's why the world's so.
Stroh 00:20:10
Because.
Stroh 00:20:11
Right.
Stroh 00:20:12
Like, we don't have discourse anymore.
Stroh 00:20:13
Like, we even.
Stroh 00:20:15
This is a silo in a way.
Stroh 00:20:17
Because who would be here with, like, with a really vehemently opposing position to what.
Jesse Hirsch 00:20:25
You know, although you raise, you.
Jesse Hirsch 00:20:28
You raise a key point that I want to reinforce because not only are we a silo intellectually, right.
Jesse Hirsch 00:20:34
In terms of, you know, the five of us kind of have.
Jesse Hirsch 00:20:37
While we are different, we have a certain bias in common, which is a desire to talk about this stuff.
Jesse Hirsch 00:20:42
Right.
Jesse Hirsch 00:20:43
Desire to hang out and problem solve, maybe.
Jesse Hirsch 00:20:45
Sure.
Jesse Hirsch 00:20:46
Sure.
Jesse Hirsch 00:20:46
We are, but we are.
Jesse Hirsch 00:20:48
And this is interesting in terms of my relationship with all four of you, we are still a minority within the minority because of the current Metaviews readership.
Jesse Hirsch 00:21:00
Most are lurkers, right?
Jesse Hirsch 00:21:02
Most are people who read it a lot.
Jesse Hirsch 00:21:05
Often like each.
Jesse Hirsch 00:21:06
Each issue, they'll read it five times, but they won't click on the like button.
Jesse Hirsch 00:21:11
They won't post a comment, and they certainly won't post a salon.
Jesse Hirsch 00:21:15
Because the other thing to your point, Stroh, about bias is the bias of participation.
Jesse Hirsch 00:21:20
Not everyone who uses social media participates.
Jesse Hirsch 00:21:24
Right?
Jesse Hirsch 00:21:24
We only see the people participating.
Jesse Hirsch 00:21:26
We see the people posting, but we don't see the lurkers.
Jesse Hirsch 00:21:31
And I think we underestimate both their power and their influence.
Jesse Hirsch 00:21:36
But to bring it back to our focus, how could we redesign social media to incentivize the lurkers to participate more?
Jesse Hirsch 00:21:44
Because I'm convinced that social media depends upon participation the way democracy depends upon participation.
Jesse Hirsch 00:21:52
So, David, you just raised your hand.
Jesse Hirsch 00:21:53
Please jump in.
David Mason 00:21:58
Yeah, I'm not sure about participation.
David Mason 00:22:00
I kind of feel representation is more important and just what you're most recently talking about.
David Mason 00:22:06
So I'm not going to.
David Mason 00:22:07
I really didn't sleep well last night, so I'm not going to have the most precise terminology here or even good recall.
David Mason 00:22:12
But, but you know, it's become dangerous for a number of reasons.
David Mason 00:22:16
And one is that, you know, the social media company or organization is, you know, has a special access to what you're saying and you know, all their partners and some of their partners are, can be very, very nefarious, have access to what you're saying.
David Mason 00:22:33
But also anyone pretty much can sit there and scrape contents, and you'll see, like, one of the first things that happened when Bluesky opened up is it became a refuge for
David Mason 00:22:44
left-leaning millennials, and people on Twitter started scraping their profiles and posting them on
David Mason 00:22:52
On Twitter.
David Mason 00:22:53
Right.
David Mason 00:22:55
So that's why, I mean, I think, I think my, probably my most concrete way just put it out there what I think it might maybe should look like.
David Mason 00:23:06
And this is going to sound terrible in a way, but in a few ways.
David Mason 00:23:10
But it's kind of one system I've heard described, which is the way that Google proposes to do advertising.
David Mason 00:23:18
And I'm probably getting a little bit wrong.
David Mason 00:23:20
And it was panned of course, because a company like Google will get hyper-criticized and they should.
David Mason 00:23:27
But my read of it was called FLoC and basically it would create a bundle of identifiers of who people are.
David Mason 00:23:35
And I think that's how you should serve social media, as a bundle of identifiers.
David Mason 00:23:40
And so when somebody posts something, you should say what Bundle of identifiers posted this.
David Mason 00:23:45
And you should be able to choose what level of disclosure you have over your actual identity.
David Mason 00:23:50
And there's a really important idea that's starting to emerge now called personhood, where you can actually have, like, an authority, say, this is a unique person.
David Mason 00:24:01
And that's a really interesting idea to layer on top of that.
David Mason 00:24:03
But for now, I think just in terms of.
David Mason 00:24:06
Of safety and reasonableness, I think that, you know, we should acknowledge the lurker factor.
David Mason 00:24:13
We should allow those people to become a new kind of participant.
David Mason 00:24:17
Pass to like a dashboard.
David Mason 00:24:19
They click, but maybe.
David Mason 00:24:20
And even past a, you know, reaction or like.
David Mason 00:24:24
Or whatever.
David Mason 00:24:25
But yeah, that level.
David Mason 00:24:27
That level, I think, is next.
David Mason 00:24:29
And that level becomes very necessary when it's so easy to make, you know, any quantity of fake persons that are incredibly realistic as well.
Jesse Hirsch 00:24:38
I think that's a brilliant idea.
Jesse Hirsch 00:24:39
And it kind of suggests that the current problem with social media is it's too individualistic that, you know, if we really want to socialize, the social media part, we have to allow teams, right?
Jesse Hirsch 00:24:51
We have to allow people to use social media as a group, to use social media as part of a larger team.
Jesse Hirsch 00:24:57
Of course, the most radical application of that would be the black bloc, right?
Jesse Hirsch 00:25:03
The social media black bloc.
Jesse Hirsch 00:25:04
You got 100 people all gathered together causing shit, ripping down the power structures, trolling at a level never seen before.
Jesse Hirsch 00:25:14
But similarly, the privacy benefits of pooling your data and having that data aggregated amongst a group of people that could even be strangers in the Tor, the onion routing concept.
Jesse Hirsch 00:25:27
So I think that's a fantastic idea.
Jesse Hirsch 00:25:30
And certainly it could be as easy as sharing a password and sharing a vpn, but it looks like you want to follow up, David, please.
David Mason 00:25:39
Well, I guess another area that I maybe don't agree with some perspectives is that, well, there are ways that people who want to cooperate on whatever degree of.
David Mason 00:25:53
Of benign or, you know, kind of their own techniques of interacting with the world can do so anonymously.
David Mason 00:26:04
I personally think that it should be difficult but possible to discover who people really are.
David Mason 00:26:10
I think that's possibly an important component of the network design.
David Mason 00:26:15
Yeah, like multiple court orders, basically, and rigorous process.
Jesse Hirsch 00:26:20
Well, the Chinese government at one point introduced what I thought was actually a laudable policy, which was controllable anonymity, which is the idea that you would be anonymous to companies, you'd be anonymous to the Internet, but the government ultimately would know who you are.
Jesse Hirsch 00:26:36
And granted, that depends upon your trust in the government, and it doesn't necessarily have to be the government.
Jesse Hirsch 00:26:42
It could be a trusted third party, the way that we do have trusted intermediaries that work that way.
Jesse Hirsch 00:26:49
Jeanette, let me try to bring you into the conversation here both in terms of the multiple threads that we've brought out.
Jesse Hirsch 00:26:56
Whether you want to comment on David's notion here of the aggregate or group identity, both on the data side as well as the expression side, but also to go back to that core question that we're trying to hit on.
Jesse Hirsch 00:27:10
What could we do to make social media both a better user experience?
Jesse Hirsch 00:27:15
But I would also elevate that to, you know, better for democracy.
Jesse Hirsch 00:27:19
Right.
Jesse Hirsch 00:27:19
If what we're mourning today is the rise of fascism, what are the changes that we could bring to socialized media that ensures there was more people, power involved rather than, you know, megaphones for billionaires?
Jeanette 00:27:35
Well, you know, one thing, and this really has nothing to do with the.
Jeanette 00:27:40
The issues around privacy, which are completely legitimate.
Jeanette 00:27:44
But when David was speaking, I was reflecting on my time teaching in a classroom and thinking about how any teacher is going to encounter this issue of, you know, there's lots of students who are very eager to participate, almost too eager sometimes, and their voices dominate the room.
Jeanette 00:28:05
And if you only paid attention to what was said in the class, you would have a very biased opinion of what that classroom was about, because you'll have another group, probably the majority, who will only speak occasionally or when directly called upon.
Jeanette 00:28:23
And then, of course, you have a group who do not want to speak at all for whatever reasons.
Jeanette 00:28:29
And the challenge is, how do you include those kids or adults, depending on what group you're teaching?
Jeanette 00:28:37
How do you draw them into the class without forcing them?
Jeanette 00:28:42
Right.
Jeanette 00:28:42
Without making them uncomfortable?
Jeanette 00:28:44
Because I personally made the transition over my life from deeply introverted to extroverted.
Jeanette 00:28:50
And I hated the forced participation.
Jesse Hirsch 00:28:53
You still do?
Jeanette 00:28:55
Yeah, actually, I do still hate that.
Jeanette 00:28:57
I don't like being forced on anything.
Jeanette 00:28:58
But it was.
Jeanette 00:28:59
I.
Jeanette 00:29:00
I was just afraid to make a fool of myself and I didn't want to speak for that reason.
Jeanette 00:29:05
I.
Jeanette 00:29:05
I'm not saying that's the only reason people don't want to speak.
Jeanette 00:29:08
That there.
Jeanette 00:29:09
I'm sure there's as many reasons as there are people, but how do we create a structure where, you know, those students feel comfortable having a voice?
Jeanette 00:29:20
And just to throw something off the top of my head, you know, one thing I tried was just smaller forums.
Jeanette 00:29:27
Sometimes that makes the difference.
Jeanette 00:29:30
Someone may not want to speak up in a whole of class setting, but they are comfortable speaking in a group of, let's say, three people, and that we're in, you know, as part of a group.
Jeanette 00:29:42
Right.
Jeanette 00:29:42
To speak to the aggregate identity.
Jesse Hirsch 00:29:45
I mean, one of the signs I felt of rising fascism within the United States social media ecosystem was the average individual's fear of being canceled, which I always thought was absurd because, like, you have nothing that could be canceled, right?
Jesse Hirsch 00:30:04
You don't have a TV show, you don't have a famous brand.
Jesse Hirsch 00:30:07
But I would constantly, and I still do, would hear these people talk about, you know, I don't want to get canceled, but.
Jesse Hirsch 00:30:13
Or I don't want to say anything because then I'll get canceled.
Jesse Hirsch 00:30:16
Like, you're not in a position to be canceled.
Jesse Hirsch 00:30:18
But in the US the notion of being canceled was, to your point, the fear of being a fool.
Jesse Hirsch 00:30:25
Right.
Jesse Hirsch 00:30:25
It was the consequence for exercising speech on social media.
Jesse Hirsch 00:30:29
It was the consequence of speaking differently on social media.
Jesse Hirsch 00:30:35
So, Sharita, I kind of want you to follow up on what Jeanette was saying as well as David, in terms of responding to my desire to see more participation.
Jesse Hirsch 00:30:46
And I think they're both sort of saying maybe we should be defending people's right not to participate or create alternate avenues for them to participate.
Jesse Hirsch 00:30:55
But I'm curious for you to add to that.
Jesse Hirsch 00:30:57
The angle of these are dangerous times.
Jesse Hirsch 00:31:01
Right.
Jesse Hirsch 00:31:01
Participation, while necessary to push back against the fascism, can also make one vulnerable in terms of the political climate that we find ourselves in.
Sharita 00:31:16
In terms of the political climate that we find ourselves in, I think we're all vulnerable.
Sharita 00:31:22
And I think that we have to.
Sharita 00:31:28
We have to fight back by small heroics.
Sharita 00:31:31
And small heroics are basically speaking what you are speaking who you are.
Sharita 00:31:43
I'm thinking to myself, the chances of somebody finding me really speaking my mind are not that high.
Jesse Hirsch 00:31:53
Hold on, why do you say that?
Jesse Hirsch 00:31:56
Elaborate.
Sharita 00:32:00
I have a small view.
Sharita 00:32:02
I have a small focus.
Sharita 00:32:04
I have a small footprint online.
Sharita 00:32:10
So I feel free in many ways to speak because I think that I'm the needle in the haystack.
Sharita 00:32:19
Now, I could argue it the other way as well.
Jesse Hirsch 00:32:23
Yeah, please do.
Jesse Hirsch 00:32:23
Because the needle in the haystack doesn't apply anymore.
Sharita 00:32:27
Well, I don't.
Sharita 00:32:30
I don't think it applies anymore in terms of if somebody really wants to find you, they will.
Sharita 00:32:40
And I've done that.
Sharita 00:32:42
I've seen that time and time again.
Jesse Hirsch 00:32:45
But.
Jesse Hirsch 00:32:45
But there's a flip side to what David was describing in terms of Google at one point having this FLoC initiative where they wanted to put people into groups.
Sharita 00:32:53
Yes.
Jesse Hirsch 00:32:54
There are no needles in the haystack because algorithms can measure all the needles and account for all the Needles.
Jesse Hirsch 00:33:00
And you're correct that they'll probably go for other people before they go for you.
Sharita 00:33:06
Yes.
Jesse Hirsch 00:33:06
But we should not assume that this is 1930s Germany.
Jesse Hirsch 00:33:11
Even though the historical parallels are there, There is a greater efficiency at play in our contemporary moment.
Jesse Hirsch 00:33:19
And that's where I think David's earlier evocation of the analytics dashboards.
Jesse Hirsch 00:33:25
Right.
Jesse Hirsch 00:33:25
That the people who manage these ecosystems have a view that we do not possess, and that if we as users had that view, everything would be different.
Jesse Hirsch 00:33:37
And that doesn't mean good or better or worse, but people already behave differently based on their desire to go viral or their desire to get more attention.
Jesse Hirsch 00:33:49
And if they could see the logic at work, if they could see the system at work.
Jesse Hirsch 00:33:53
And that's where I say, you know, I think the feeling that a lot of people have, and I was on a podcast this morning where the guy was wishing that the incoming executive would be assassinated.
Jesse Hirsch 00:34:06
And, you know, I was like, man, that's a dangerous thing to be doing.
Jesse Hirsch 00:34:10
Like, they have the ability to inventory all of that.
Jesse Hirsch 00:34:13
They don't necessarily have the resources to get to you.
Jesse Hirsch 00:34:16
But next time you want to cross the border, next time you want to encounter it, they could just do a quick search, and bam, there you are.
Jesse Hirsch 00:34:24
That doesn't concern you.
Jesse Hirsch 00:34:25
That doesn't worry you at all.
Sharita 00:34:27
It doesn't worry me in terms of the kinds of heroics that I would participate in.
Sharita 00:34:35
Right.
Sharita 00:34:38
Yes, they can find you, like what you were just saying.
Sharita 00:34:43
But I think in terms of my drilling down, my heroics are not necessarily going to be painted on a screen, but they are there.
Sharita 00:34:56
And if you, you know, if you really wanted to find everybody who didn't agree with your thinking or who was thinking that you're a fascist, blah, blah, blah, yes, you could do that, but it may be a great waste of your time.
Jesse Hirsch 00:35:12
I hope you're right.
Jesse Hirsch 00:35:13
What worries me is the amount of people who are getting in line.
Jesse Hirsch 00:35:17
Right.
Jesse Hirsch 00:35:18
And where normally I would feel that there is safety in numbers, I worry about how many people are going to be in those numbers.
Jesse Hirsch 00:35:26
Stroh, let me throw to you only because you, I think, like most of us here on the Call, have been openly radical your entire life in terms of communicating what's in your heart and sharing what's in your mind.
Jesse Hirsch 00:35:40
Do you feel any chill in the air in terms of human expression?
Stroh 00:35:44
Yeah.
Stroh 00:35:45
You know what this just made me think was that all the social media is actually a way of controlling and understanding all of us, because the AI, there is no needles in the haystack.
Stroh 00:36:00
Everything is right in front of them.
Stroh 00:36:02
And if they have the power, then they have to control it, which they do, because they control it.
Stroh 00:36:09
Right.
Stroh 00:36:09
They know exactly what's going on.
Stroh 00:36:11
So I think if we're talking about the future of social media, I think it goes back to being social and getting offline, because anything there is toeing the line for the powers that be.
Stroh 00:36:25
Right.
Stroh 00:36:26
So it is really about human interaction and doing right by, you know, like in.
Stroh 00:36:31
It comes back to accountability, integrity of.
Stroh 00:36:34
Of your own person.
Stroh 00:36:35
Right.
Stroh 00:36:36
So to be radical in the social media environment of today and to change social media, we have to.
Stroh 00:36:42
We have to start being social again, you know, breaking bread.
Jesse Hirsch 00:36:46
Yeah, yeah, yeah.
Jesse Hirsch 00:36:47
And.
Jesse Hirsch 00:36:47
And not just on the breaking bread side, but also in terms of graffiti, in terms of zines, in terms of media that you experience in person.
Jesse Hirsch 00:36:56
Right.
Jesse Hirsch 00:36:56
You have to be in person and part of the community to experience that kind of media.
Stroh 00:37:00
Yeah.
Stroh 00:37:01
You know, and the chill for me, with social media and how.
Stroh 00:37:03
How prevalent it all is and how is that we're.
Stroh 00:37:07
We're becoming more and more illiterate as a species.
Stroh 00:37:10
Right.
Stroh 00:37:11
Like, we're actually thinking critically less because of how we're put into these groups, whether, like, I think it's happening anyways, Right.
Stroh 00:37:19
Like, it's happened.
Stroh 00:37:20
We're.
Stroh 00:37:21
It's.
Stroh 00:37:21
It's.
Stroh 00:37:22
It's like the discourse isn't there.
Stroh 00:37:24
It's me versus it's polarized.
Jesse Hirsch 00:37:26
Right.
Stroh 00:37:26
So there's definitely a chill.
Stroh 00:37:28
And it's not a rise of fascism, it's a reemergence of it.
Stroh 00:37:32
Right.
Stroh 00:37:32
Because we've let so much of the free thought go wayside for whatever interests, you know, for the.
Stroh 00:37:43
Screw you guys.
Jesse Hirsch 00:37:45
Well, and also, I think there's a dynamic here in terms of the TikTok in particular, where before this moment, right.
Jesse Hirsch 00:37:53
When TikTok as it existed last week, had a lot of left wing content, had a lot of left wing voices, had a lot of critical voices who, you know, were not just critical of the United States.
Jesse Hirsch 00:38:05
We're critical of China, we're critical of Israel, we're critical of everything.
Jesse Hirsch 00:38:10
And will be interesting to see if that remains because TikTok, certainly in terms of Western social media or social media available in the west, had the most political diversity.
Jesse Hirsch 00:38:22
It certainly had fascists, it certainly had conservatives and liberals and social democrats and.
Jesse Hirsch 00:38:27
And communists and anarchists.
Jesse Hirsch 00:38:29
But that's what made it interesting was it had that diversity.
Jesse Hirsch 00:38:32
I don't see that diversity elsewhere, which is why I think, Stroh, we may end up going back to face to face Going back to using postal mail and graffiti and, you know, grassroots communication, but within the spirit of what we're describing.
Jesse Hirsch 00:38:48
I'm curious, David or Jeanette or Sharita or Stroh, if any of you could think of what could change in the social media we do have, that would make it more diverse.
Jesse Hirsch 00:38:59
Right?
Jesse Hirsch 00:38:59
I mean, researchers have talked about the byproduct of the engagement algorithm, that it rewards extremists.
Jesse Hirsch 00:39:06
Right.
Jesse Hirsch 00:39:07
Can we think of a different way of configuring the algorithm that is more fair or more equitable in terms of how it distributes attention?
Jesse Hirsch 00:39:17
And I say this because while this may not be the direction things are headed, I think there's a lot of interest in entertaining these types of alternatives.
Stroh 00:39:26
I.
Stroh 00:39:26
I think we got to take the algorithm out of the equation if we're going to change that, because the algorithm has a bias.
Stroh 00:39:31
Right.
Stroh 00:39:32
And that's the.
Stroh 00:39:33
That's the real issue there.
Stroh 00:39:37
You know, it's.
Stroh 00:39:39
I think, in an odd stroke of maybe doing the right thing, whether he means it or not.
Stroh 00:39:45
Zuckerberg, by not having the algorithm take control of the.
Stroh 00:39:53
Whatever content is there by policing it, that might be actually a good way, because it opens it up to everybody.
Jesse Hirsch 00:39:58
Except he's.
Jesse Hirsch 00:39:59
He's got to be lying.
Jesse Hirsch 00:40:00
Like, there's no way.
Jesse Hirsch 00:40:01
There's no way he'd allow that.
Jesse Hirsch 00:40:03
Sorry, Jeanette, go ahead.
Jeanette 00:40:04
I.
Jeanette 00:40:05
You know, just to respond to what Stroh said, I always.
Jeanette 00:40:09
Whatever bias comes up and.
Jeanette 00:40:11
Jesse.
Jeanette 00:40:11
Sorry, I'm taking your line here, please.
Jeanette 00:40:13
I have to point out there's good bias.
Jeanette 00:40:16
Nobody ever says no to good bias.
Jeanette 00:40:18
Like, clearly, you've identified the negative bias, the way that algorithms currently are essentially dark patterns.
Jeanette 00:40:25
You know, the thing I keep thinking about is food.
Jeanette 00:40:28
The way that modern processed food in, in particular, North America has been engineered over.
Jeanette 00:40:36
Engineered to be as addictive as possible.
Jeanette 00:40:39
Right?
Jeanette 00:40:40
That's why you eat a whole bag of chips before you even know it.
Jeanette 00:40:45
That's.
Jeanette 00:40:46
To me, there's no difference between that and the algorithms you see on these social media platforms.
Jeanette 00:40:53
So what's the solution to that?
Jeanette 00:40:54
Well, you have to present people with an alternative that is just as satisfying, but is not, you know, the.
Jeanette 00:41:02
The playing on every atavistic drive you have to consume salty, fatty, crunchy food.
Jeanette 00:41:12
And I think that, you know, and.
Jeanette 00:41:15
And maybe that's eating, you know, a really good heirloom tomato or something like that, which is a taste experience most people don't have anymore.
Jeanette 00:41:23
Go ahead.
Jesse Hirsch 00:41:24
Although you're.
Jesse Hirsch 00:41:24
You're making a good point, which is we have to have the rewards.
Jesse Hirsch 00:41:28
But we don't want the rewards to be junk food, right?
Jesse Hirsch 00:41:31
We want the rewards to be healthy.
Jesse Hirsch 00:41:33
We want the rewards.
Jesse Hirsch 00:41:34
You know, for example, I've always been motivated by video games, right?
Jesse Hirsch 00:41:39
Or when I first played on a BBS, it was like a Dungeons and Dragons game where you could level up.
Jesse Hirsch 00:41:44
So if there's something that there's rewards, if there's something's a level up, I'll do it.
Jesse Hirsch 00:41:49
Right?
Jesse Hirsch 00:41:49
And the question is, what kind of leveling up, what kind of rewards could we envision that are positive when it comes to community development, that are positive when it comes to cognitive development, like learning, for example.
Jesse Hirsch 00:42:02
And could that be embedded into a social media platform?
Jesse Hirsch 00:42:07
And I'm floating this not because I agree with it, because I think it could be kind of draconian, but what if, for example, before you're allowed to post, you have to share or retweet five other people's posts?
Jesse Hirsch 00:42:20
So before you can add your own voice, you have to elevate other people's voices, right?
Jesse Hirsch 00:42:26
As a way of kind of paying it forward before you then earn the privilege to participate yourself.
Jesse Hirsch 00:42:33
Potentially easily manipulated.
Jesse Hirsch 00:42:35
But I'm curious what you guys think of those kinds of social contracts being embedded into the logic of these sorts of platforms.
David Mason 00:42:45
Yeah, that has existed as an approach.
David Mason 00:42:48
I can't recall offhand.
David Mason 00:42:50
It might be not a popular system anymore, but there I have seen systems before where you have to basically moderate before you can actually participate, as both as a kind of a captcha and as a way to make sure that you're, you know, serious about participating.
David Mason 00:43:03
You've spent some time learning about the community.
David Mason 00:43:06
I think it's a totally reasonable idea.
David Mason 00:43:09
I don't know if people would stand for it, but.
Jesse Hirsch 00:43:12
Yeah, go ahead, Stroh or sri.
Sharita 00:43:15
Yeah, so go ahead, Stroh.
Stroh 00:43:18
So, you know, what I was just thinking is because a social media identity is an identity no different than the name you're born with or whatever.
Stroh 00:43:28
We all.
Stroh 00:43:29
We all get to be born with it.
Stroh 00:43:31
And it's part of our educational system.
Stroh 00:43:33
So you don't post, but you have to graduate to have access and input.
Stroh 00:43:38
Right?
Stroh 00:43:38
So you have to do a certain amount of learning or have a certain amount of level to be able to contribute to the social, to our conversation.
Stroh 00:43:50
Our conversation.
Stroh 00:43:51
You have to have a certain understanding so that we're not giving a soapbox to people that may not have a greater understanding.
Stroh 00:43:58
Right?
Stroh 00:43:58
So you actually go through school.
Stroh 00:44:00
You know, if you.
Stroh 00:44:01
If you have your high school, well, then you get to use this kind of.
Stroh 00:44:04
And the reward is you get more money, that's guaranteed income because you've achieved and you keep leveling up that way through education.
Sharita 00:44:12
Sharita, I think there, there needs to be multiple interventions.
Sharita 00:44:20
However, one intervention might be that you encourage a form of cyber literacy where you learn how to game the system.
Sharita 00:44:33
You do that, Jess.
Sharita 00:44:35
Self taught.
Jesse Hirsch 00:44:36
No, but it's the people, it's the hacker ethic.
Jesse Hirsch 00:44:39
So you sort of embed this, you know, and, and this was the earlier point about how humans are constantly evading attempts to be controlled and are exactly.
Jesse Hirsch 00:44:49
Constantly defying the logic and the rules.
Jesse Hirsch 00:44:52
So what if you built a system that rewarded that, that encouraged that, that elevated that, but also you don't build.
Sharita 00:44:58
The system, you build the human.
Jesse Hirsch 00:45:01
You want to elaborate on that?
Sharita 00:45:04
I don't mean this as a great truth.
Sharita 00:45:07
All I'm really responding to is that you teach or you model, show people how to game the system.
Sharita 00:45:18
You don't.
Sharita 00:45:20
The system doesn't reward that.
Jesse Hirsch 00:45:23
Right, right.
Sharita 00:45:24
You reward it, or whoever does with the individual.
Jesse Hirsch 00:45:29
And then the individual, their incentive there is.
Sharita 00:45:35
To basically do what they want.
Jesse Hirsch 00:45:38
Right.
Sharita 00:45:39
And not so much be influenced by those you know around you.
Sharita 00:45:44
Now, mind you, that's only going to work for a certain percentage of the population, most of us, in whatever area we want to think about, are sheep.
Jesse Hirsch 00:45:57
Well, but.
Jesse Hirsch 00:45:57
And this evokes a whole other dynamic which I'm going to put to aside because I have a separate question, Jeanette, I want to throw to you.
Jesse Hirsch 00:46:05
But this kind of evokes the inherent elitism of social media that on the one hand, Jeanette was kind of citing, you know, the.
Jesse Hirsch 00:46:13
With every media cycle, the elite feels threatened by all the new people who are able to use the media, but those new people essentially become part of the elite because you still don't get to a point of universality.
Jesse Hirsch 00:46:27
We don't have universality when it comes to social media.
Jesse Hirsch 00:46:30
And if anything, social media via influencers have created a new elite.
Jesse Hirsch 00:46:35
But I thought David indirectly, in talking about whether people would go through these measures, evoked the idea of frictionless design.
Jesse Hirsch 00:46:44
Like Facebook conquered the world because they made everything as easy, easy as possible.
Jesse Hirsch 00:46:50
So that in fact all the idiots are on Facebook because Facebook makes it easy for them to communicate, to participate.
Jesse Hirsch 00:46:57
So Jeanette, as someone who refuses to be forced to do anything, and in fact, while you are perfectly willing to do very hard and complicated things that you choose, if someone else asks you to do something that's very hard and difficult, chances are you're not going to do it.
Jesse Hirsch 00:47:15
So I'm curious, both from your own experience as a social media user, but also your perspective as an educator.
Jesse Hirsch 00:47:23
Can you help us kind of close the circle here?
Jesse Hirsch 00:47:25
Can we incentivize greater behavior, yet at the same time, keep it simple?
Jesse Hirsch 00:47:30
Can we do, as Sharita said, foster hacking of the system, but still have a system that can handle being hacked?
Jeanette 00:47:39
I mean, sure.
Jeanette 00:47:42
The one thing that came to mind, I think Stroh was the first one to invoke the education system, and then Sharita kind of picked that up.
Jeanette 00:47:50
If we're sheep, it's because the education system is entirely calculated to produce that.
Jeanette 00:47:56
It's, it is essentially a compliance machine.
Jeanette 00:48:00
So I think what Sharita is outlining is actually extremely powerful, but it's the opposite of the current structure we have.
Jesse Hirsch 00:48:08
Right.
Jeanette 00:48:09
And because it really is not about fitting in, it's about figuring out what you are and that what I am is very different from what everybody else is.
Jeanette 00:48:22
And maybe my goal should be the best that I can be.
Jeanette 00:48:29
If we could find a social media structure that supported that, I think we'd have something truly empowering.
Jeanette 00:48:39
But you know, what I see in the current thing, to go back to stuff we've already talked, we've already touched on, is generally an environment that is.
Jeanette 00:48:51
Twitter, to me, is the most extreme version of this.
Jeanette 00:48:53
Now, incredibly hostile.
Jeanette 00:48:56
Failure to conform brings down an army of bots.
Jeanette 00:49:01
Right.
Jeanette 00:49:03
Not just individuals who've decided to punish you, but, but in fact, you know, things that aren't even people.
Jeanette 00:49:11
And so people are.
Jeanette 00:49:13
And because outrage is the currency of that particular platform, that's what everybody is motivated to do.
Jeanette 00:49:23
So what would a social media platform look like that, you know, was the opposite of that?
Jeanette 00:49:29
I don't know.
Jeanette 00:49:30
I mean, I really, I can't say.
Jesse Hirsch 00:49:33
Although you did answer it in the sense that its prerequisite is an entirely different educational system that, rather than fostering compliance, fosters critical thinking.
Jeanette 00:49:46
Yeah.
Jeanette 00:49:47
And beyond critical thinking, I would say critical doing.
Jeanette 00:49:54
That's not.
Jeanette 00:49:55
I'm sorry, that's not really a no.
Jesse Hirsch 00:49:57
But taking action rather than just getting lost in theory, which unfortunately, the North American left is completely ensconced in theory, and, and hence why there's not a lot of action happening.
Jesse Hirsch 00:50:09
David, you've got your hand up, please.
David Mason 00:50:11
Oh, yeah, yeah.
David Mason 00:50:12
So I, I mostly was, was kind of reacting to one idea that I was hearing.
David Mason 00:50:17
Maybe that's mishearing, but I, I, I don't.
David Mason 00:50:19
I guess I'm a little bit uncomfortable with the idea that it should Be kind of like a.
David Mason 00:50:26
Even.
David Mason 00:50:27
Maybe even critical thinking is.
David Mason 00:50:30
Is, you know, critical thinking is an important skill.
David Mason 00:50:33
But above all, I think somebody who, you know, I think basically, unless you're talking about like a.
David Mason 00:50:42
An institution, then if you're talking about something that is a response, it needs tone.
David Mason 00:50:48
Like a tone and kind of the tools so that anyone can represent, you know, that they're.
David Mason 00:50:55
If they're like a coal miner and they're unhappy and they're not speaking your.
David Mason 00:51:00
Your lefty language, that should be fine.
David Mason 00:51:02
You know, it should be a way to have you not come to blows, basically.
David Mason 00:51:07
And, you know, so.
David Mason 00:51:11
So in that case, that's, That's.
David Mason 00:51:13
That is why.
David Mason 00:51:14
I mean, one.
David Mason 00:51:15
One thing that I've thought about for quite a long time is that where the web came from is a research network.
David Mason 00:51:20
And we should feel like we're on a research network where we're all experimenting and we make mistakes and we're putting forward propositions and theses and people are.
David Mason 00:51:30
And providing evidence and, you know, so I guess, you know, having that kind of we're scientists tone and we're collecting evidence tone without it having to look a certain way or to be according to certain theories might be a good approach.
Jesse Hirsch 00:51:47
Well, and the thread there, I think, between what all of us are saying is that it should have a pedagogic configuration that at its core level, it should be learning.
Jesse Hirsch 00:51:57
Go ahead, Jeanette.
Jeanette 00:51:58
I just want to quickly respond to what David said, because what he's describing is what the scientific ethos is supposed to be.
Jeanette 00:52:08
Of course, in reality, this is not how scientists behave, as Twitter demonstrates.
Jeanette 00:52:14
Yeah, well, and just the world of scientific publishing, for example.
Jeanette 00:52:19
I mean, and this critique of science has, you know, that's been around for quite a while.
Jeanette 00:52:25
But what is.
Jeanette 00:52:26
What strikes me is that you see that behavior the minute you attach the rewards of power to certain kinds of behavior.
Jeanette 00:52:35
That's why people don't behave in a disinterested, supportive, you know, let us mutually explore the nature of reality kind of way.
Jeanette 00:52:44
As soon as there are rewards of power attached to particular behaviors, that's where they go.
Jesse Hirsch 00:52:50
Well, and maybe that's, you know, the foreground.
Jesse Hirsch 00:52:53
Right, The.
Jesse Hirsch 00:52:54
The title of the fictitious social media platform we're imagining is this is the place where power is held in the hands of the many.
Jesse Hirsch 00:53:03
And then the subtitle is we don't come to blows.
Jesse Hirsch 00:53:07
Right.
Jesse Hirsch 00:53:08
And that speaks to the culture, again, my desire for there to be a rhythm to the algorithm.
Jesse Hirsch 00:53:13
Right.
Jesse Hirsch 00:53:14
For there to be a kind of rhyme to the way in which these platforms play out.
Jesse Hirsch 00:53:20
Any closing thoughts or final words on kind of where we see the future of social media, or more aptly, where we would like to see the future of social media?
Jesse Hirsch 00:53:33
Stroh?
Stroh 00:53:34
Yeah, you know, I would love for it to become what we want it to be, which is something that teaches us how to be better and understand each other better.
Stroh 00:53:45
And I think it is right, because if the medium is the message, then we see all the data, we see everything that's happening, and the people that are actually inclined to do something better for humanity that have that in mind will take that from it.
Stroh 00:54:01
So I think it's inevitable.
Stroh 00:54:03
But to keep it natural and not controlled by a few, to keep it truly democratic, is, you know, ideally where we'd like to see it go, right.
Jeanette 00:54:13
On out of the hands of the oligarchs and into the hands of the people.
Stroh 00:54:18
Power to the people.
Jesse Hirsch 00:54:20
Very cool.
Jesse Hirsch 00:54:21
Sharita, any final words?
Sharita 00:54:24
I in, in my mind, I keep going back to some of the groups that I've participated with, you know, in online and to me, that's social media as well, and that if you got involved a long time ago, taught you how to behave in social media, before you had Twitter, before, you know, all that other stuff.
Sharita 00:54:56
So I would be interested not in big monoliths, but in small cooperative types of systems.
Jesse Hirsch 00:55:06
Well, and ultimately that's, I suppose, one of the promises of the Fediverse, but currently it would require far more widespread media literacy and technological literacy to actually happen.
Jesse Hirsch 00:55:19
David, any final words or closing thoughts?
David Mason 00:55:23
No, no, I appreciated it.
David Mason 00:55:25
You know, in some ways, you know, I think structure is an important part of it, purpose and structure is an important part of it, and not being abductible by oligarchs and other.
David Mason 00:55:38
Other forces.
David Mason 00:55:39
But yeah, it's.
David Mason 00:55:40
And as well, absolutely, like breakouts and all these different kinds of things.
David Mason 00:55:44
But yeah, it's a hard problem, that's for sure.
Jesse Hirsch 00:55:47
Well, and I think for me, kind of, I feel that all of us are of a certain vintage that we remember when media wasn't social.
Jesse Hirsch 00:55:56
Right.
Jesse Hirsch 00:55:57
When media was broadcast, when media was one way.
Jesse Hirsch 00:56:01
And maybe our frustration is because the socialization of media hasn't been sufficient.
Jesse Hirsch 00:56:08
Right?
Jesse Hirsch 00:56:08
We need the socialization of ownership, we need the socialization of control, we need the socialization of identity, all as ways to make our media more valuable, more useful to the rest of us.
Jesse Hirsch 00:56:23
So thank you very much to our esteemed council of wise intellectuals, AKA the OG Metaviews.
Jesse Hirsch 00:56:35
Comes out relatively frequently at the moment.
Jesse Hirsch 00:56:37
Not sure we're gonna keep this pace up for too much longer, but by all means, find us on your social media platforms, especially the social media platform that we're expecting you to invent that we will open accounts on as soon as it is available, and offers an alternative to the dumpster fire that we currently find ourselves in.
Jesse Hirsch 00:56:58
So with that said, we'll see you soon.
Jesse Hirsch 00:57:01
Thank.
";}