IMAGINE IF: YEAGER AND ROSS - COMMUNITIES IN THE AGE OF CONTROL
A conversation with Shawn Yeager and Derek Ross at the Imagine If conference in Nashville, TN. We discuss the current state of digital communications and identity with a specific focus on Nostr.
Date: September 20, 2025
Shawn on Nostr: https://primal.net/shawn
Derek on Nostr: https://primal.net/derekross
Bitcoin Park on Nostr: https://primal.net/park
Imagine If: https://bitcoinpark.com/imagineif
(00:40) Future of digital comms, identity, social
(01:41) Diagnosing the problems: incentives, KYC, and broken trust
(03:05) Censorship, shadow bans, and owning your social graph
(05:00) AI, deepfakes, and can we trust what we see?
(07:24) Algorithmic control vs user choice
(10:10) Introducing Nostr: open protocol, healthier engagement
(11:54) Digital health, doomscrolling, and parenting challenges
(15:21) Youth safety on open protocols: responsibilities and tools
(18:22) Give parents the tools: OS-level controls and UX
(19:35) Getting started with Nostr: keys, Primal, and UX spectrum
(21:17) Vibe-coding apps: Soapbox/Shakespeare on Nostr + Bitcoin
(22:39) Permissionless payments and AI-built sites
Video: https://primal.net/e/nevent1qqs0v7rgjh55wygwuc8pmqvk0qz6qts30uaut2c8lp4dgh9usw2cdpgnznwd9
more info on the show: https://citadeldispatch.com
learn more about me: https://odell.xyz
NOTE
Transcription provided by Podhome.fm
1
00:00:40.879 --> 00:00:43.220
A round of applause for Matt O'Dell,
2
00:00:43.795 --> 00:00:46.454
Derek Ross, and Shawn Yeager.
3
00:00:49.234 --> 00:00:50.295
Okay. Yo.
4
00:00:52.835 --> 00:00:54.055
How's it going, guys?
5
00:00:57.530 --> 00:00:59.470
We'll be talking today about
6
00:01:01.930 --> 00:01:03.550
the future of digital comms,
7
00:01:04.570 --> 00:01:05.070
identity,
8
00:01:06.090 --> 00:01:06.590
social,
9
00:01:08.170 --> 00:01:09.230
and open communities.
10
00:01:12.275 --> 00:01:13.335
And my good friends,
11
00:01:13.635 --> 00:01:15.335
Derek and Shawn, here with us.
12
00:01:16.195 --> 00:01:18.775
I think an interesting place to start is
13
00:01:20.115 --> 00:01:21.415
diagnosing the problem.
14
00:01:21.955 --> 00:01:22.455
We've
15
00:01:23.715 --> 00:01:25.015
found ourselves in
16
00:01:25.635 --> 00:01:27.255
an increasingly digital world,
17
00:01:28.409 --> 00:01:28.909
And,
18
00:01:30.970 --> 00:01:35.869
the status quo has kinda just been built to, like, one foot in front of the other without really kinda, like, any
19
00:01:36.409 --> 00:01:39.229
real planning, and and and now we're here.
20
00:01:39.530 --> 00:01:39.930
So,
21
00:01:40.330 --> 00:01:42.909
let's start off with Shawn. When you think about
22
00:01:43.725 --> 00:01:56.110
the current state of digital communities and identity and social, where where do you diagnose the problems existing?
23
00:01:57.590 --> 00:02:00.090
I think, as with most everything,
24
00:02:00.950 --> 00:02:06.009
as an admitted Bitcoiner, it starts with broken money. And with broken money come broken incentives,
25
00:02:07.829 --> 00:02:10.250
and from that flow, business models
26
00:02:10.905 --> 00:02:13.325
that turn, as we all know, us into the product.
27
00:02:14.025 --> 00:02:14.525
And,
28
00:02:16.185 --> 00:02:17.485
there has been an increasing,
29
00:02:17.945 --> 00:02:20.285
I think, drive to try to milk more
30
00:02:20.585 --> 00:02:22.125
out of the consumer user,
31
00:02:22.790 --> 00:02:23.690
and then the advertisers
32
00:02:24.070 --> 00:02:24.970
and the businesses.
33
00:02:27.030 --> 00:02:28.410
Cory Doctorow has
34
00:02:28.870 --> 00:02:30.170
a colorful phrase.
35
00:02:30.630 --> 00:02:32.170
Maybe I won't utter it here,
36
00:02:33.190 --> 00:02:35.050
but to describe how these
37
00:02:35.510 --> 00:02:36.010
cycles
38
00:02:36.310 --> 00:02:38.895
roll out. And so where we find ourselves is
39
00:02:39.275 --> 00:02:40.735
not only are we
40
00:02:41.195 --> 00:02:41.695
not,
41
00:02:42.395 --> 00:02:43.135
the user,
42
00:02:43.595 --> 00:02:45.775
we're the product, but I think increasingly
43
00:02:46.475 --> 00:02:47.455
we are seen
44
00:02:47.915 --> 00:02:48.735
as something
45
00:02:49.115 --> 00:02:50.415
to be packaged.
46
00:02:51.560 --> 00:02:57.260
And we see creeping KYC, we see everything happening in The UK with the Online Safety Act,
47
00:02:57.800 --> 00:02:58.300
and,
48
00:02:59.160 --> 00:03:01.180
yeah. We're not in a good place right now.
49
00:03:02.360 --> 00:03:04.780
Very well said. Derek, how do you think about it?
50
00:03:05.655 --> 00:03:09.355
Well, I I think that over the past few years, we've
51
00:03:09.974 --> 00:03:13.034
all come to a place where we know somebody
52
00:03:13.415 --> 00:03:16.314
or we interacted online with somebody, followed somebody
53
00:03:16.855 --> 00:03:18.075
that has been
54
00:03:18.694 --> 00:03:19.915
censored or
55
00:03:20.640 --> 00:03:21.140
has
56
00:03:21.599 --> 00:03:22.900
been shadow banned,
57
00:03:23.439 --> 00:03:25.939
something along those lines. It's becoming
58
00:03:26.560 --> 00:03:27.379
more apparent
59
00:03:27.920 --> 00:03:28.579
and it's
60
00:03:28.879 --> 00:03:35.620
accelerating. It's kind of odd to see it accelerating. Like Shawn just said, we're seeing that happen
61
00:03:36.405 --> 00:03:36.905
across
62
00:03:37.284 --> 00:03:40.105
the European Union, across, you know, in The UK.
63
00:03:40.724 --> 00:03:43.704
We're starting to see this actually even happen recently
64
00:03:44.004 --> 00:03:46.504
here in The United States where
65
00:03:47.204 --> 00:03:49.545
people can have their whole entire livelihood,
66
00:03:50.350 --> 00:03:51.090
their business
67
00:03:51.550 --> 00:03:55.570
taken away because they built their business on somebody else's foundation
68
00:03:56.590 --> 00:03:58.450
and they don't own that content.
69
00:03:59.230 --> 00:04:00.530
They don't own that
70
00:04:00.990 --> 00:04:03.650
their followers. They don't own their entire social
71
00:04:04.110 --> 00:04:04.610
graph
72
00:04:05.015 --> 00:04:07.115
and it's disappearing overnight.
73
00:04:07.735 --> 00:04:12.395
Years and years of hard work can be taken away from you, and you can't do anything about it because
74
00:04:12.855 --> 00:04:14.315
you built your entire
75
00:04:14.775 --> 00:04:17.035
digital life on somebody else's foundation.
76
00:04:17.639 --> 00:04:18.139
And
77
00:04:19.160 --> 00:04:22.380
it's becoming very apparent that there needs to be a better way.
78
00:04:24.600 --> 00:04:25.419
Yeah. I think,
79
00:04:27.400 --> 00:04:31.340
there's a there's a couple of issues that compound on top of each other
80
00:04:33.825 --> 00:04:35.125
that result in
81
00:04:36.145 --> 00:04:37.285
the current trajectory
82
00:04:37.745 --> 00:04:39.925
that we're that we're going down in terms
83
00:04:41.105 --> 00:04:43.125
of big tech and digital platforms.
84
00:04:44.065 --> 00:04:49.410
So, I mean, you guys honed in on censorship and control,
85
00:04:50.110 --> 00:04:52.690
which I think is one that people talk about a lot.
86
00:04:53.230 --> 00:04:54.610
So, Shawn, you've been
87
00:04:55.470 --> 00:04:55.970
exploring,
88
00:04:56.350 --> 00:04:58.290
like, kind of this intersection between,
89
00:04:59.070 --> 00:05:01.330
you know, AI and Bitcoin. And
90
00:05:02.555 --> 00:05:10.015
the other piece here that is really interesting to me is is, like, this idea of deep fakes and verifiability. How do you think
91
00:05:10.395 --> 00:05:11.295
about that
92
00:05:12.315 --> 00:05:13.535
in the current paradigm?
93
00:05:15.270 --> 00:05:19.370
I think, I mean, and and just a a brief bit of background,
94
00:05:19.750 --> 00:05:27.210
hopefully not a shameless shill, is the point of Trust Revolution is to pursue two questions. One is, how do we as developed nations find ourselves
95
00:05:27.955 --> 00:05:30.615
in low trust societies in that we
96
00:05:31.075 --> 00:05:44.430
I think most of us can agree, Pew Research and others would certainly back this up. We don't trust the government. We don't trust the media. We don't trust healthcare. We don't trust education. We don't trust each other. We don't trust across party lines. That's not a black pill. I think it's just observably true. The second more hopeful question is,
97
00:05:45.130 --> 00:05:50.270
how and where can we reclaim the trust that we have given, or that has been demanded of us and then broken?
98
00:05:50.970 --> 00:05:52.190
And how can we
99
00:05:52.490 --> 00:06:00.025
build trust where we believe it should be? So that's all to say, can we trust our eyes? To your question, you know, can we trust the media that we see and we consume?
100
00:06:00.965 --> 00:06:03.305
I think what's hopeful about that is the ability
101
00:06:03.845 --> 00:06:04.345
to
102
00:06:04.884 --> 00:06:05.384
utilize
103
00:06:06.405 --> 00:06:10.560
public private key cryptography to sign, authenticate, attribute
104
00:06:11.100 --> 00:06:14.320
media. I think we're quite a ways away from that being
105
00:06:14.700 --> 00:06:18.080
large scale. I think once again the incentives are not necessarily aligned
106
00:06:18.540 --> 00:06:22.080
for that to be widely adopted, but I think the tools are there.
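A minimal sketch of the signing idea Shawn describes here, assuming the nostr-tools v2 API (generateSecretKey, getPublicKey, finalizeEvent, verifyEvent) and Node's built-in crypto; the media bytes, event kind, and tag name are illustrative placeholders rather than an established standard:

```typescript
import { generateSecretKey, getPublicKey, finalizeEvent, verifyEvent } from "nostr-tools/pure";
import { createHash } from "node:crypto";

// The creator holds the private key; the public key is what everyone else
// uses to check authorship.
const sk = generateSecretKey();
const pk = getPublicKey(sk);

// Rather than signing a large file directly, sign its SHA-256 hash.
const mediaBytes = Buffer.from("raw image or video bytes go here");
const mediaHash = createHash("sha256").update(mediaBytes).digest("hex");

// Wrap the hash in a Nostr event and sign it with the private key.
const attestation = finalizeEvent(
  {
    kind: 1, // plain note, used here only for illustration
    created_at: Math.floor(Date.now() / 1000),
    tags: [["x", mediaHash]], // hypothetical tag carrying the media hash
    content: "I published this media",
  },
  sk
);

// Anyone holding only the public event can check who signed it and that it
// was not altered after signing.
console.log("signed by expected key:", attestation.pubkey === pk);
console.log("signature valid:", verifyEvent(attestation));
```

The same key pair is what a Nostr client already uses for every post, so attributing media is just a special case of signing events.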
107
00:06:22.595 --> 00:06:26.535
And the big question in my mind, to echo yours, is at what point
108
00:06:26.835 --> 00:06:28.295
do we reach this inflection
109
00:06:28.915 --> 00:06:30.295
where there is so much
110
00:06:30.755 --> 00:06:33.975
questioning and confusion about is what I'm seeing real,
111
00:06:34.995 --> 00:06:38.960
that there's a broader adoption of of the tools that we do have, like Nostr,
112
00:06:39.980 --> 00:06:43.360
and these public private key pairs to address that challenge.
113
00:06:43.660 --> 00:06:45.520
But, I mean, are we
114
00:06:46.220 --> 00:06:47.840
aren't we kind of already there?
115
00:06:48.460 --> 00:06:53.835
In what way? In terms of, I think most people, like, when you open your phone, you're like, ah, is that real? Oh.
116
00:06:54.375 --> 00:07:01.755
Yes. Like, we're very close, if not already across the chasm, right? Yeah. Which, I mean, and I'll just say one quick thing there is I think
117
00:07:02.550 --> 00:07:05.290
much as in sort of prior waves of technology,
118
00:07:06.070 --> 00:07:09.930
there has been the need to create a certain literacy and a certain,
119
00:07:11.030 --> 00:07:13.930
ability to scrutinize. I hope that it it incentivizes
120
00:07:14.310 --> 00:07:15.690
and and motivates people
121
00:07:16.215 --> 00:07:17.915
to become more thoughtful,
122
00:07:18.375 --> 00:07:18.875
about
123
00:07:19.335 --> 00:07:23.355
what they consume and and and what they question or or trust.
124
00:07:24.455 --> 00:07:28.395
I think expanding on what you consume is a unique problem
125
00:07:29.080 --> 00:07:29.820
in itself
126
00:07:30.200 --> 00:07:30.700
because
127
00:07:31.480 --> 00:07:38.300
what content I want to consume versus what content I'm forced to consume is very different. Yes. Because we are
128
00:07:38.680 --> 00:07:40.540
slaves to the algorithms,
129
00:07:41.400 --> 00:07:42.780
and what these platforms
130
00:07:43.875 --> 00:07:46.935
want us to see. We don't really have control over
131
00:07:47.635 --> 00:07:50.535
the content. We don't have the control over our attention
132
00:07:51.075 --> 00:07:54.615
and that that's part of the problem too. So if you didn't want to see
133
00:07:55.075 --> 00:07:59.700
certain types of content, it's really hard to not see it using these existing legacy
134
00:08:00.160 --> 00:08:06.020
social platforms. You're being spoon fed. Yeah. Yeah. So I mean, from, like, a productive point of view,
135
00:08:06.880 --> 00:08:10.420
how do you how do you mitigate that? How do you actually solve that problem? I mean,
136
00:08:10.915 --> 00:08:13.975
that's Well, easier said than done. Yeah. It's easier said than done, but
137
00:08:14.275 --> 00:08:18.695
we need tools for users that allow them to choose their own algorithm,
138
00:08:19.075 --> 00:08:24.935
to choose the type of content they want to see, to choose and curate their their social feeds.
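As a rough illustration of "choose your own algorithm," here is what a client-side feed request can look like on Nostr, assuming the nostr-tools v2 SimplePool API; the relay URLs, pubkeys, and hashtag are placeholders:

```typescript
import { SimplePool } from "nostr-tools/pool";

const pool = new SimplePool();

// Relays you chose to read from (placeholders; swap in your own).
const relays = ["wss://relay.damus.io", "wss://relay.primal.net"];

// Your own follow list, as hex pubkeys (placeholders).
const followedAuthors = ["<hex pubkey 1>", "<hex pubkey 2>"];

// The filter IS the feed: only notes from people you follow, plus anything
// tagged #catpics, and nothing a platform decided to inject.
const sub = pool.subscribeMany(
  relays,
  [
    { kinds: [1], authors: followedAuthors, limit: 50 },
    { kinds: [1], "#t": ["catpics"], limit: 50 },
  ],
  {
    onevent(event) {
      console.log(event.pubkey, event.content);
    },
    oneose() {
      sub.close(); // stop once the stored events have been delivered
    },
  }
);
```

Because the filter lives in the client, swapping feeds is just swapping filters; no permission from a platform or relay operator is needed.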
139
00:08:25.440 --> 00:08:27.860
Just because Elon and Mark Zuckerberg
140
00:08:28.160 --> 00:08:44.785
say that this is the content that you need to see doesn't mean that I want to see it, doesn't mean that you want to see it, but I don't have a choice if I use Instagram or Facebook or X, Twitter. Like I have to see that algorithm content. I don't have a choice of choosing,
141
00:08:45.965 --> 00:09:04.590
you know, cat pics as my feed if I want to, you know, if I want a feed of cats or whatever it is. Sure, I could browse a hashtag or something like that, but that's not a good, you know, that's not a good choice. We need more user tools. We need more user choice, and there are options out there that give users full control
142
00:09:05.210 --> 00:09:10.735
over what they want to consume, full control over their attention. Because that's what these platforms are are monetizing.
143
00:09:11.115 --> 00:09:12.654
They're monetizing our attention.
144
00:09:13.194 --> 00:09:23.134
Right? Like, we need a way to take that back. It's our it's, you know, what my eyes see. It's my attention. I should be able to designate what gets my attention.
145
00:09:24.040 --> 00:09:30.860
And do you think the the friction point with that because I do think that's the path forward. The friction point with that is it requires
146
00:09:32.040 --> 00:09:35.740
a level of personal responsibility from the actual user. Yeah.
147
00:09:36.264 --> 00:09:42.345
Sure. How do we handle that friction? I mean, if There's some people that just wanna scroll. Right? They they don't want the
148
00:09:42.985 --> 00:09:44.605
they don't have time to
149
00:09:45.385 --> 00:09:50.605
build and curate their own feed, and and that's fine. For that, you have a choice.
150
00:09:50.930 --> 00:09:55.750
But the fact that you don't have a choice is the problem. If you want the spoon fed content, great.
151
00:09:56.449 --> 00:10:15.705
If you don't want the spoon fed content, you want to be your own algorithm, be in control, you should have that choice and a wide variety of choices. The choices should be open and transparent, and you should be able to decide which path you want to go. And I would say it's also experiential in the sense that if you're not on Nostr, if you haven't tried Nostr,
152
00:10:16.885 --> 00:10:18.920
What is Nostr? We haven't talked about that yet.
153
00:10:19.380 --> 00:10:20.199
What is Nostr?
154
00:10:21.300 --> 00:10:31.480
Well, so like Bitcoin, and I'll let Matt talk to this. It is an open protocol. No one controls it. No one owns it. And therefore, it is there to be built upon. And the reason I mentioned it is
155
00:10:32.964 --> 00:10:38.745
I think most of traditional social media and communications channels, one to many, they are not only monetizing
156
00:10:39.685 --> 00:10:42.505
our attention, increasingly they're monetizing our outrage.
157
00:10:43.925 --> 00:10:48.024
And I think as people that I've observed experience an alternative,
158
00:10:49.340 --> 00:10:53.520
Mastodon, others that are out there, I think we all agree that Nostr is the way to go.
159
00:10:54.620 --> 00:10:56.160
Once you remove the outrage,
160
00:10:56.780 --> 00:10:58.960
it is experiential that I feel
161
00:10:59.660 --> 00:11:00.160
better,
162
00:11:01.154 --> 00:11:09.815
possibly at least not worse, as I have engaged with others on Nostr versus X versus Facebook versus others. And so,
163
00:11:10.195 --> 00:11:11.334
that is all to say.
164
00:11:11.714 --> 00:11:12.295
I think
165
00:11:12.834 --> 00:11:15.175
part of the key is just giving people a
166
00:11:15.540 --> 00:11:20.600
sense of what that's like. And I and I think they can begin, each of us, to sort of rewire
167
00:11:21.300 --> 00:11:23.000
those receptors, those dopamine,
168
00:11:23.459 --> 00:11:27.399
you know, hits that we're accustomed to getting. But it will take some time.
169
00:11:27.855 --> 00:11:31.315
I mean, you're drilling down on basically this concept of
170
00:11:32.735 --> 00:11:33.875
healthy usage
171
00:11:34.495 --> 00:11:35.795
of of technology.
172
00:11:36.255 --> 00:11:36.654
Yes.
173
00:11:38.175 --> 00:11:39.714
Which I would say as a society,
174
00:11:41.240 --> 00:11:44.300
we're probably very deep into unhealthy usage
175
00:11:45.000 --> 00:11:50.139
of of the tools. And, I mean, I see this firsthand with my own life. I see this across
176
00:11:50.839 --> 00:12:07.435
all different aspects of society right now. We have a term for that nowadays. It's called doomscrolling. It became so apparent that we have a term for it. AI psychosis. Yeah. Doomscrolling. Everyone does it. A lot of people do it. They know they're doing it. They continue to do it.
177
00:12:07.975 --> 00:12:09.515
But one part on this
178
00:12:11.180 --> 00:12:11.680
one
179
00:12:12.220 --> 00:12:16.160
one aspect of this idea of of digital health and healthy usage
180
00:12:16.540 --> 00:12:19.839
that I think is incredibly key for our society going forward
181
00:12:20.220 --> 00:12:23.014
is all three of us are parents. It's specifically
182
00:12:23.315 --> 00:12:27.975
I mean, I think adults use it in very unhealthy ways, but but the question is, like, how
183
00:12:28.595 --> 00:12:29.574
does that affect
184
00:12:30.115 --> 00:12:31.095
childhood development?
185
00:12:32.435 --> 00:12:32.935
And
186
00:12:33.714 --> 00:12:37.574
for something like Nostr, that's an open protocol that's not controlled by anybody.
187
00:12:39.210 --> 00:12:43.150
How do you think I mean, we'll start with Shawn again. How do you think about
188
00:12:45.770 --> 00:12:52.350
handling that issue? Like, how how does how does society handle that going forward with kids growing up with
189
00:12:53.024 --> 00:12:55.365
basically just a fire hose of information?
190
00:12:56.865 --> 00:12:58.064
Well, I am
191
00:12:58.545 --> 00:13:04.324
here's my little guy right there, my almost four-year-old. So, I'm a dad to a young boy.
192
00:13:04.704 --> 00:13:05.204
And
193
00:13:05.665 --> 00:13:11.160
so I have a bit of time, but I'll just sort of maybe, share an anecdote, which is that
194
00:13:11.620 --> 00:13:13.240
we, full credit to my wife,
195
00:13:14.180 --> 00:13:16.839
had given, closure of your hair's life, had given,
196
00:13:17.860 --> 00:13:23.080
maybe an hour to two per morning of screen time so that, you know, she at home could have some
197
00:13:23.595 --> 00:13:24.975
some space to do some things.
198
00:13:25.435 --> 00:13:37.214
It is remarkable, the change, and this will be obvious to those of you who've done it, but it was remarkable to me that in saying no and and ending that and having zero screen time,
199
00:13:37.529 --> 00:13:48.029
the change in our son was incredible. And I personally don't know of any better reference point in my life than to have observed that firsthand. So I can only imagine,
200
00:13:49.449 --> 00:13:58.025
what a young child given a device in their hand, and that's not a judgment for anyone who chooses to do that, but I just can't imagine the damage
201
00:13:58.965 --> 00:14:00.025
that that will do.
202
00:14:00.405 --> 00:14:03.065
So I feel very passionate about,
203
00:14:04.860 --> 00:14:08.640
our collective and individual, most of all, responsibility within our families
204
00:14:09.180 --> 00:14:14.960
to find better ways. So, I mean, we're seeing like, right now, we're seeing a lot of conversation about
205
00:14:16.620 --> 00:14:17.120
disenfranchised
206
00:14:17.660 --> 00:14:18.160
youth
207
00:14:19.665 --> 00:14:24.165
getting radicalized on Internet communities. It's become a very sensitive conversation.
208
00:14:26.865 --> 00:14:30.565
Some of the, quote, unquote, solutions that have been proposed
209
00:14:31.105 --> 00:14:31.605
involve
210
00:14:32.530 --> 00:14:37.510
restricting speech, restricting access More KYC. Adding digital ID,
211
00:14:38.290 --> 00:14:39.430
adding age restrictions.
212
00:14:41.730 --> 00:14:44.950
I mean, we just saw Bluesky, I think, in two states just added
213
00:14:45.330 --> 00:14:46.070
age restrictions,
214
00:14:46.770 --> 00:14:47.750
to their app.
215
00:14:48.415 --> 00:14:50.115
Derek, how do you
216
00:14:51.375 --> 00:14:55.235
what is what is the most productive path forward? Because I I think
217
00:14:56.255 --> 00:15:00.035
the key here is that that is actually a problem. Like, I I do think
218
00:15:00.930 --> 00:15:04.870
disenfranchised youth are getting radicalized on niche Internet communities.
219
00:15:05.970 --> 00:15:08.630
But when you're building out something like Nostr, an open protocol
220
00:15:10.370 --> 00:15:17.845
where you inherently can't age-restrict on a top-down level, how do you, like, what is the most productive path? How do we actually
221
00:15:18.785 --> 00:15:20.405
solve that in a healthy way?
222
00:15:21.105 --> 00:15:24.245
That's a very good question and it's probably a very hard question.
223
00:15:25.265 --> 00:15:28.485
I I think I'll say part of it goes back to what
224
00:15:29.290 --> 00:15:32.430
Shawn was alluding to is that, you know, ultimately
225
00:15:33.530 --> 00:15:34.030
parents
226
00:15:35.050 --> 00:15:35.790
should parent.
227
00:15:36.730 --> 00:15:37.790
If kids
228
00:15:38.890 --> 00:15:45.515
are having issues online, getting radicalized over certain content, and you don't want that to happen to your kid, then you need to restrict access
229
00:15:46.935 --> 00:15:48.235
to certain applications.
230
00:15:48.855 --> 00:15:53.915
Now that doesn't mean completely take away because we know that kids today are very social
231
00:15:54.660 --> 00:15:59.240
and online, so you can still give them apps. But so the second part of this is
232
00:16:00.420 --> 00:16:05.640
we just need more user controls and we need more apps across the
233
00:16:06.100 --> 00:16:07.960
Nostr ecosystem that
234
00:16:08.274 --> 00:16:11.334
maybe do focus on restricting or filtering
235
00:16:11.955 --> 00:16:14.055
that type of content. So maybe you have,
236
00:16:14.515 --> 00:16:20.695
because Nostr is widely open and you can do anything you want, maybe somebody builds a Nostr application
237
00:16:21.075 --> 00:16:21.575
that
238
00:16:22.580 --> 00:16:25.400
is more suitable for the youth, maybe
239
00:16:25.860 --> 00:16:29.560
restrict certain types of content. It's only bound to certain
240
00:16:30.740 --> 00:16:38.120
content filtered relays, and you can't use anything else but that. Now the argument is, well, the kid can take
241
00:16:38.634 --> 00:16:42.894
the profile, the nsec, and just use another app. But if you're the parent,
242
00:16:43.514 --> 00:16:45.454
you do parenting and you lock down
243
00:16:45.995 --> 00:16:53.295
access to certain applications, you only give them access to the parent approved app. I mean, they're your kids. You should be able to say
244
00:16:53.780 --> 00:16:58.200
what apps they use. And a personal example of that is I didn't let my kids use TikTok
245
00:16:58.580 --> 00:17:00.040
for a very long time.
246
00:17:00.500 --> 00:17:07.560
And my kids are now 14 and 16 years old. They now use TikTok, but they wanted to use it years ago when their
247
00:17:08.005 --> 00:17:15.945
friends were all using it, you know, at 10, 12 years old. And I said, no, you're not using that app, I'm sorry. And they complained a lot
248
00:17:16.325 --> 00:17:27.559
and I was a parent and said, well, I'm sorry, you're not using it. And I used my parental rights to restrict my kids access to something I didn't want them on. Now they're older.
249
00:17:28.019 --> 00:17:30.200
Sure. I I let them do it.
250
00:17:30.580 --> 00:17:33.960
And the same would go for any Nostr app. I would restrict access
251
00:17:34.375 --> 00:17:45.275
and block if I wanted to, the access to do that because we have the tools to do that. But then, as I said on the other side, we do need a Nostr client to step up and build
252
00:17:46.215 --> 00:17:47.674
a kid filtered, kid
253
00:17:48.000 --> 00:17:51.860
safe environment? Well, and I think just quickly the thing that's so powerful about this in my
254
00:17:52.720 --> 00:17:54.420
my strong promotion of Nostr,
255
00:17:54.880 --> 00:17:56.260
or whatever may come after
256
00:17:56.560 --> 00:17:57.060
is
257
00:17:57.520 --> 00:18:01.380
the ability for individuals, for parents in this particular case
258
00:18:01.774 --> 00:18:04.835
to be given the tools to make the choice.
259
00:18:05.695 --> 00:18:14.995
Yeah. I think that's the core. It should not come from X. It should not come from the government. It should come from the individuals closest to and most invested in that little human's health.
260
00:18:15.360 --> 00:18:21.620
And I think Nostr is a prime example of what an open protocol does with regard to giving us that power.
261
00:18:21.920 --> 00:18:24.980
Yeah. I think you you give you give parents tools
262
00:18:25.360 --> 00:18:27.380
so that they can parent better.
263
00:18:27.825 --> 00:18:32.485
Absolutely. And have them take responsibility. And it's bigger than Nostr. Right? Because, like Absolutely.
264
00:18:33.424 --> 00:18:36.164
I mean, it's kind of bewildering that you don't, like
265
00:18:36.865 --> 00:18:41.445
that Apple doesn't have built into the iPhone or whatever, like, really granular
266
00:18:42.540 --> 00:18:51.040
controls for parents to choose how their kids are interacting with these things. I think you bring it down almost to the OS level. Right? Yeah. Like,
267
00:18:51.580 --> 00:18:58.215
because I'm a tech nerd, I know how to go in on my router and block access to my kids' devices Right. To certain websites.
268
00:18:58.515 --> 00:19:03.735
It's I'll say it's easy, but is it easy for everybody? Probably not. So we need easier
269
00:19:04.035 --> 00:19:06.054
tools for everybody to use. Yeah. I agree.
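A rough sketch of the kid-safe client idea Derek raises above: hard-code the client to a parent-approved allowlist of moderated, content-filtered relays and refuse everything else. The relay URLs here are hypothetical placeholders; Nostr relays speak plain WebSockets, so no extra library is assumed:

```typescript
// Parent-approved, content-filtered relays (hypothetical URLs).
const ALLOWED_RELAYS: ReadonlySet<string> = new Set([
  "wss://kids.example-moderated-relay.net",
  "wss://school.example-moderated-relay.net",
]);

// Every connection attempt goes through this gate; the child cannot add
// relays from inside the app (changing the list would sit behind a
// parent-held PIN, not shown here).
function openRelayConnection(url: string): WebSocket {
  if (!ALLOWED_RELAYS.has(url)) {
    throw new Error(`${url} is not a parent-approved relay`);
  }
  return new WebSocket(url); // Nostr relays are reached over plain WebSockets
}
```

As Derek notes, the key (nsec) itself stays portable; the restriction lives in which app, and therefore which relays, the parent allows on the device.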
270
00:19:06.915 --> 00:19:10.275
I mean, guys, this has been a great conversation. We've been a little bit more abstract,
271
00:19:12.700 --> 00:19:18.960
just to bring it all back together and make it a little bit more actionable. To people here that have never used Nostr
272
00:19:19.980 --> 00:19:28.240
and maybe wanna play around and test it out, I think, you know, the best way to learn is to, you know, just get your hands dirty and actually use the tools.
273
00:19:28.995 --> 00:19:32.695
I mean, Shawn, what would be your recommendation to someone who's
274
00:19:33.475 --> 00:19:40.535
interested in in seeing what's being built out there? Yeah. I'll take just a a brief moment of further abstraction and just say, I think what's so powerful about
275
00:19:41.190 --> 00:19:44.570
Nostr and some of the technology that underlies it is
276
00:19:45.830 --> 00:19:47.049
I'll steal someone else's,
277
00:19:47.669 --> 00:19:51.049
analogy, or metaphor, is if you were a medieval king
278
00:19:51.350 --> 00:19:58.634
and you needed to issue a directive throughout the kingdom, to your military or to someone else, as you would probably recall, you would have a signet ring.
279
00:19:59.014 --> 00:20:07.754
That signet ring would be heated, pressed into wax, and it creates a seal. That letter is then delivered to Matt, the general. And my signet ring is my private key.
280
00:20:08.090 --> 00:20:12.030
It is difficult to mimic, difficult to forge, presumably hard to steal.
281
00:20:12.410 --> 00:20:12.910
That's
282
00:20:13.690 --> 00:20:14.750
my piece of
283
00:20:15.450 --> 00:20:16.590
property that allows
284
00:20:16.970 --> 00:20:21.630
me to sign. The seal is the public key. And so that is all to say,
285
00:20:22.675 --> 00:20:25.895
in in these ways that have been created and recreated throughout time,
286
00:20:26.195 --> 00:20:32.295
Nostr gives you that ownership. Now, with that comes great responsibility. You own that key. You have that signet ring.
287
00:20:32.675 --> 00:20:35.575
And so, from that understanding that you can own
288
00:20:36.110 --> 00:20:39.090
your identity, you can own the ability to attribute,
289
00:20:40.510 --> 00:20:41.330
your creation
290
00:20:41.790 --> 00:20:42.290
or
291
00:20:42.750 --> 00:20:43.250
publishing
292
00:20:43.550 --> 00:20:44.130
of content,
293
00:20:45.710 --> 00:20:49.010
it can be quite simple. So I think Primal is brilliant.
294
00:20:49.765 --> 00:20:53.045
I'll disclaim for Matt: Ten31, investor in Primal.
295
00:20:53.765 --> 00:20:56.325
Fantastic application. So primal.net,
296
00:20:56.325 --> 00:20:59.705
I think it's a great way to get started. I think it's one of the best consumer UXs.
297
00:21:00.165 --> 00:21:05.919
There are many others depending on where you are on the spectrum, from, I just want it to work, Apple-esque style
298
00:21:06.299 --> 00:21:06.799
to,
299
00:21:07.100 --> 00:21:11.340
you know, like us, we're nerds and wanna dig in. But I would say, in short, primal.net,
300
00:21:11.340 --> 00:21:12.080
take a look.
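For anyone who wants to see what the "signet ring" looks like in practice before opening Primal, here is a minimal key-generation sketch, assuming the nostr-tools v2 API (generateSecretKey, getPublicKey, and the nip19 encoders); check the library docs for exact signatures:

```typescript
import { generateSecretKey, getPublicKey } from "nostr-tools/pure";
import * as nip19 from "nostr-tools/nip19";

const sk = generateSecretKey(); // Uint8Array: the "signet ring", never share it
const pk = getPublicKey(sk);    // hex string: the seal, safe to publish

// The bech32 encodings you will see in clients like Primal.
console.log("nsec (private, back it up):", nip19.nsecEncode(sk));
console.log("npub (public, give it out):", nip19.npubEncode(pk));
```

Clients like Primal generate this for you; the point is only that the identity is a key pair you hold, not an account a company holds for you.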
301
00:21:12.620 --> 00:21:13.440
Great recommendation.
302
00:21:15.500 --> 00:21:20.159
I think he handled that really well. Yeah. So while we have a little bit more time, just real quick,
303
00:21:21.865 --> 00:21:22.685
vibe coding,
304
00:21:23.065 --> 00:21:23.565
Nostr,
305
00:21:23.945 --> 00:21:24.845
AI, Bitcoin,
306
00:21:25.305 --> 00:21:29.005
that's where your focus is right now. Yes. Why is that powerful?
307
00:21:29.465 --> 00:21:29.965
Because
308
00:21:30.265 --> 00:21:36.650
so Soapbox is building tools that allow people that are creators or have their own community
309
00:21:37.270 --> 00:21:38.970
to build an application.
310
00:21:39.350 --> 00:21:42.809
You can vibe code it. You can build your own app
311
00:21:43.110 --> 00:21:46.169
for your own community and because it's built on Nostr,
312
00:21:46.765 --> 00:21:50.785
you can own all that content. So instead of using Discord
313
00:21:51.245 --> 00:21:53.745
or Twitter or whatever for your community,
314
00:21:54.125 --> 00:21:55.745
you could use Shakespeare
315
00:21:56.445 --> 00:22:03.940
to build your own community app customized how you've always wanted it to be and you own it. You own all the source code. You own all the data.
316
00:22:04.240 --> 00:22:24.235
It's decentralized. You can do whatever you want with it and nobody can take that away from you. Whereas if your Discord server gets taken down because you're a streamer or a musician or an artist or something, well, you're screwed. You can't do anything. But if you use Soapbox tools and you build with Shakespeare, you can own every piece of the puzzle. Yeah. And the key there is you don't need
317
00:22:24.695 --> 00:22:41.780
closed API access. You don't need to, yeah, verify. You don't need to ask permission. You just do it. Yeah. You have the social graph. You have the identity layer. You have the comms protocol all in Nostr, which is basically like an open API for the world for that. Yeah. And then on the payment side, you have Bitcoin
318
00:22:42.424 --> 00:22:53.005
so that you don't have to, you know, get a Stripe API or something like that to integrate. No permission required. Just go do it. Yeah. You wanna build a website that accepts Bitcoin payments for your
319
00:22:53.465 --> 00:22:56.525
product that you're selling or for your personal website or something.
320
00:22:56.870 --> 00:23:10.250
You don't need to know any code. You don't need to be a developer to know how to do it. You just have a conversation with AI and you say, build me this website that does this thing, A, B, C, D, and a few minutes later, boom, it's done. And it's yours and you can do whatever you want with it.
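One concrete flavor of the "no Stripe API" point: a generated site can link straight to a Bitcoin payment by building a BIP21 URI from your own address. The address and amount below are placeholders, and a real shop might instead use Lightning or per-order addresses:

```typescript
// Build a BIP21 payment URI that any Bitcoin wallet can open or scan.
function paymentLink(address: string, amountBtc: number, label: string): string {
  return `bitcoin:${address}?amount=${amountBtc}&label=${encodeURIComponent(label)}`;
}

// Example: drop this into an <a href> or render it as a QR code on the site.
console.log(paymentLink("bc1qexampleaddressxxxxxxxxxxxxxxxxxxxxx", 0.0005, "sticker pack"));
```

No payment processor has to approve the integration, which is the point being made here.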
321
00:23:10.654 --> 00:23:14.515
Love it. Can we have a huge round of applause for Derek and Shawn? Thank you guys. Thank you.