CD177: MARKS - MAPLE PRIVATE AND SECURE AI
Mark Suman is co-founder of OpenSecret and Maple. OpenSecret is a platform for building private and secure apps, and Maple is an AI tool built using OpenSecret. Maple offers users and developers an easy way to interact with popular open source AI models.
Marks on Nostr: https://primal.net/marks
Marks on X: https://x.com/marks_ftw
Maple on Nostr: https://primal.net/mapleai
Maple on X: https://x.com/TryMapleAI
Maple: https://trymaple.ai/
OpenSecret: https://opensecret.cloud/
EPISODE: 177
BLOCK: 913963
PRICE: 898 sats per dollar
(00:01:38) Happy Bitcoin Tuesday
(00:03:05) Marks and Maple AI
(00:07:10) Apple's AI Strategy and Product Unveiling
(00:11:40) Privacy and Security in AI with Maple
(00:19:59) Open Source AI Models and Industry Trends
(00:27:02) Freedom of Thought and AI's Influence
(00:39:00) User Experience and Privacy in Maple AI
(00:48:50) AI and Parenting
(01:02:41) Maple AI Developer API and Proxy Feature
(01:16:00) Open Secret and Maple AI: Company Overview
Video: https://primal.net/e/nevent1qqsx7ypznejxgujpqdwnch5w6pzeh03pn9fvr5guqpry930e7w3yqycllz5km
more info on the show: https://citadeldispatch.com
learn more about me: https://odell.xyz
NOTE
Transcription provided by Podhome.fm
Created: 09/10/2025 03:39:47
Duration: 5159.317
Channels: 1
1
00:00:01.360 --> 00:00:04.020
And and I guess I wonder if you think
2
00:00:04.640 --> 00:00:10.099
Standard and Poor's or others or, you know, if if if if anybody has any sort of
3
00:00:11.679 --> 00:00:12.179
bias
4
00:00:12.615 --> 00:00:20.154
towards crypto still at this point because the S&P 500 just came out with their decisions, and they chose to go with Robinhood and AppLovin
5
00:00:20.615 --> 00:00:21.115
over
6
00:00:21.575 --> 00:00:26.875
strategy. And a lot of people were kinda surprised by that. The stock actually was down a little bit. Do you think that there is a bias
7
00:00:27.449 --> 00:00:28.349
at any of
8
00:00:29.050 --> 00:00:31.949
the indexes or any places that come through this against
9
00:00:32.809 --> 00:00:41.790
a company with a lot of Bitcoin and holdings? I don't think there's a bias. I don't think we expected to be selected on our first quarter of eligibility. We figure it'll happen at some time.
10
00:00:42.375 --> 00:00:43.755
There's a digital transformation
11
00:00:44.295 --> 00:00:47.515
in the markets. This is a brand new novel concept.
12
00:00:47.895 --> 00:00:48.795
Every quarter,
13
00:00:49.095 --> 00:00:53.515
we make new believers. We we get more support from banks, from politicians,
14
00:00:54.055 --> 00:00:54.555
from
15
00:00:54.970 --> 00:00:58.270
credit rating agencies, etcetera. I think that will continue,
16
00:00:59.370 --> 00:01:01.070
for the foreseeable future.
17
00:01:38.105 --> 00:01:46.845
Happy Bitcoin Tuesday, freaks. It's your host, Odell, here for another Citadel Dispatch, the interactive live show focused on actual Bitcoin
18
00:01:47.385 --> 00:01:48.845
and Freedom Tech
19
00:01:49.225 --> 00:01:49.725
discussion.
20
00:01:50.425 --> 00:01:52.380
That intro clip was none
21
00:01:53.320 --> 00:01:55.500
none other than Michael Saylor,
22
00:01:56.200 --> 00:02:01.580
founder and CEO. I think he's the CEO. Chairman. I don't know what his title is. The lead
23
00:02:02.040 --> 00:02:02.700
at Strategy.
24
00:02:03.160 --> 00:02:05.180
They didn't get into the S&P 500.
25
00:02:05.880 --> 00:02:06.620
They got
26
00:02:08.944 --> 00:02:17.525
AppLovin got chosen over them. But, he seems to take it like a champ. Completely unrelated to today's show, but it seemed like the news of the week.
27
00:02:17.985 --> 00:02:26.290
So that's why we started off with it. As always, freaks, the show is ad free, sponsor free, brought to you by viewers like you who support the show
28
00:02:26.990 --> 00:02:28.050
with Bitcoin
29
00:02:28.590 --> 00:02:33.090
through Zaps. The easiest way to support the show is through your favorite Nostr app.
30
00:02:33.725 --> 00:02:38.305
I'm helping out build up Primal. I really like Primal. You can download it in your favorite app store.
31
00:02:38.765 --> 00:02:39.505
The top
32
00:02:40.205 --> 00:02:42.545
zap from last week was 10,000
33
00:02:42.765 --> 00:02:45.505
from ride or die freak, Mav 21.
34
00:02:45.880 --> 00:02:49.420
He beat out ride or die freak man b y t,
35
00:02:50.040 --> 00:02:53.260
who sent 9,999
36
00:02:53.320 --> 00:02:57.020
sats. Thank you, freaks, for supporting the show. Thank you for joining us
37
00:02:57.400 --> 00:02:59.580
in the live chat. You guys make it unique.
38
00:03:01.645 --> 00:03:03.645
All links at citadeldispatch.com.
39
00:03:03.645 --> 00:03:04.785
Thank you for your support.
40
00:03:05.325 --> 00:03:08.465
We have a great show lined up today. I have Marks here.
41
00:03:09.645 --> 00:03:13.905
He's building out Maple AI, a private and secure AI.
42
00:03:14.670 --> 00:03:17.330
How's it going, Marks? Yo. Hey. It's going great.
43
00:03:18.110 --> 00:03:21.330
Good to have you. Things are good. Thanks for having me on. Appreciate it.
44
00:03:21.630 --> 00:03:25.250
I'm excited. I'm excited for the chat. I think it's timely. I think,
45
00:03:26.989 --> 00:03:30.530
I mean, first, I actually didn't bring this up to you before the show.
46
00:03:32.295 --> 00:03:33.515
You used to be with
47
00:03:33.975 --> 00:03:34.875
Apple. Yeah.
48
00:03:35.575 --> 00:03:36.955
Today was their big
49
00:03:37.815 --> 00:03:41.435
unveiling. Did you did you watch it, or do you just pretend
50
00:03:42.455 --> 00:03:51.010
that part of your life is over with now? Or Like, PTSD or something? Yeah. You know, it's actually really funny. I have been an Apple fanboy since I was a kid.
51
00:03:51.310 --> 00:04:00.370
Like, how old was I? Maybe 10 or 11. Maybe younger than that. And I have especially with the Steve Jobs years, like, I was following everything religiously closely.
52
00:04:01.125 --> 00:04:06.345
And being there, I was, like, in the thick of it. And so today would be, like, NPI, new product introduction.
53
00:04:06.965 --> 00:04:11.385
And NPI for us on our team would start, like, six months prior, nine months prior.
54
00:04:11.924 --> 00:04:20.680
So I would be talking about it constantly every day leading up to today. And to be totally honest, I was lost in financial,
55
00:04:21.780 --> 00:04:23.880
projections, spreadsheets for Maple
56
00:04:24.260 --> 00:04:31.095
and completely forgot the event was happening today. And it wasn't until Anthony texted me and said, hey. The new iPhone looks pretty sweet,
57
00:04:31.815 --> 00:04:33.835
that I realized that the event was going on.
58
00:04:34.375 --> 00:04:41.755
So, yeah, there you go. I, I didn't even pay attention. I didn't watch it. I haven't looked at a single thing about it other than his text.
59
00:04:42.295 --> 00:04:44.075
I mean, that's kinda beautiful.
60
00:04:44.530 --> 00:04:47.410
I was kinda hoping that was the answer. You found a
61
00:04:47.810 --> 00:04:50.710
you found your way with a new with a new product.
62
00:04:52.450 --> 00:04:56.230
I mean, I it's it's I think it's highly relevant to this conversation
63
00:04:56.770 --> 00:04:59.350
because everyone's waiting to see what their
64
00:05:00.095 --> 00:05:01.315
AI plays are,
65
00:05:02.175 --> 00:05:06.755
and they've kind of been slow. Some might say deliberate. Some might say they're dropping the ball.
66
00:05:08.895 --> 00:05:14.755
But just for everyone at home, there wasn't really any big AI announcements today except they have, like, a cool
67
00:05:15.500 --> 00:05:16.000
translation
68
00:05:16.460 --> 00:05:17.520
AirPods feature
69
00:05:19.660 --> 00:05:29.280
that I think, like, every sci fi novel has been waiting for for, like, the last twenty five, thirty years, you know, like, where both people are wearing AirPods, and then it just, like, automatically translates as you're having a conversation.
70
00:05:29.715 --> 00:05:32.535
I mean, that's cool. Yeah. That's exactly what we've been waiting for.
71
00:05:33.235 --> 00:05:38.055
Question question is how well or will it work, and what is your cell phone connection like, and what's the delay?
72
00:05:38.675 --> 00:05:42.695
Yeah. No. It's probably gonna be miserable in the beginning is probably the answer to that.
73
00:05:44.120 --> 00:05:47.099
My guess, it probably works best if both people are wearing,
74
00:05:47.800 --> 00:05:50.860
you know, four AirPods. Like, two AirPods in each
75
00:05:51.720 --> 00:05:56.940
an AirPod in each ear, which is, like, kind of a ridiculous way. They, like, advertise it as, like, a business meeting.
76
00:05:57.425 --> 00:06:03.585
But, like, having a business meeting where you're both, like, potted up is I don't know. It's just weird. But,
77
00:06:04.465 --> 00:06:28.605
Well, don't tell my, Apple former coworkers, but I don't even wear AirPods anymore. They feel like they were hurting my ears, and I don't wanna, like, gigify my brain. So I've been just I've gone back to the wired headphones. I'm a bit concerned about the wireless stuff. Mhmm. But I do wear one ear pod AirPod. Like, I don't put two in my ears, so I feel like they're not zapping through me. Okay. You're not completely zipping.
78
00:06:29.785 --> 00:06:33.325
And, I will I I switched from, like, the
79
00:06:33.945 --> 00:06:40.525
I don't know. We're getting completely off off filter off filter here, but I switched from the Pros to the
80
00:06:41.030 --> 00:06:51.830
shittier ones. Mhmm. Because the shittier ones are more open. It doesn't feel like it's closed off on my ear. Yeah. And the main reason is is, first of all, obviously, a convenience thing. But,
81
00:06:53.375 --> 00:07:00.014
the but I have small children. And with small children, they're, like, constantly ripping out the cords. Like, I tried to do the wired thing, and it was,
82
00:07:02.574 --> 00:07:05.555
you know, maybe equally as dangerous for my ears.
83
00:07:06.470 --> 00:07:09.770
Yeah. We're constantly just pulling at them. Yeah. I can see that.
84
00:07:10.310 --> 00:07:10.810
Anyway,
85
00:07:11.110 --> 00:07:17.610
Maple. What is Maple? Why should people care? Yeah. Maple, short version is the privacy alternative to ChatGPT.
86
00:07:18.925 --> 00:07:21.585
It's it's just as simple as that. If you're sick of
87
00:07:22.525 --> 00:07:32.250
big companies, especially ChatGPT knowing everything about you, then you can stop using it. Or you can still use it, but also get a free account over at trymaple.ai.
88
00:07:32.570 --> 00:07:37.230
And we have end to end encryption so that all of your AI chats are completely protected.
89
00:07:37.850 --> 00:08:00.105
And it's not just a promise that we tell you. There are other privacy AIs out there that claim to be private, but you just have to trust what's running on their servers is what they tell you. Even if they're open source, you still don't know what's running there, but we give you a cryptographic promise. We show you, here's the open source code. Here's the, you know, the checksum of what's running on the servers using the secure enclaves. And so any user
90
00:08:00.569 --> 00:08:01.849
can go and,
91
00:08:02.250 --> 00:08:02.750
verify
92
00:08:03.210 --> 00:08:05.389
what we are running is what's on GitHub.
93
00:08:05.849 --> 00:08:06.349
So
94
00:08:06.810 --> 00:08:08.490
don't trust, verify. We are the,
95
00:08:09.289 --> 00:08:12.189
we're we're that motto but wrapped into AI.
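The "don't trust, verify" flow described here can be sketched roughly like this. The function and field names are hypothetical, not OpenSecret's actual API, and real Nitro attestation also involves checking the signature on a signed attestation document, which is omitted; this only shows the final digest comparison.

```python
# Hypothetical sketch of the verification flow Maple describes: compare the
# code measurement a secure enclave reports in its attestation document
# against the digest of the open-source build published on GitHub.
import hashlib
import hmac


def measurement_matches(attested_pcr0: str, published_digest: str) -> bool:
    """Compare two hex digests without leaking timing information."""
    return hmac.compare_digest(attested_pcr0.lower(), published_digest.lower())


# Suppose we reproducibly rebuilt the open-source server image and hashed it:
local_build = b"reproducible-server-image-bytes"  # placeholder for the image
published_digest = hashlib.sha384(local_build).hexdigest()

# ...and the enclave's attestation document reported the same measurement:
attested_pcr0 = hashlib.sha384(local_build).hexdigest()

assert measurement_matches(attested_pcr0, published_digest)
```

If the server were running anything other than the published code, the enclave's measurement would differ and the comparison would fail.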
96
00:08:14.025 --> 00:08:14.765
Love it.
97
00:08:16.105 --> 00:08:18.525
I just realized I messed up the Nostr stream,
98
00:08:19.145 --> 00:08:22.605
but I fixed it. Okay. Okay. It should be working now.
99
00:08:24.585 --> 00:08:28.330
I like to call it, like, the signal of AI. How do you think of that metaphor?
100
00:08:28.790 --> 00:08:29.930
Signal's great too.
101
00:08:30.310 --> 00:08:30.810
Yeah.
102
00:08:31.910 --> 00:08:34.730
Because the the trust model is actually kinda similar. No?
103
00:08:35.510 --> 00:08:36.170
It is,
104
00:08:36.550 --> 00:08:38.570
except I don't think Signal is using
105
00:08:39.190 --> 00:08:45.425
secure enclaves. Is there a way for you to verify what Signal is running on the servers? They're using SGX. Right?
106
00:08:46.205 --> 00:08:49.345
We're using AWS Nitro. Are they using Intel SGX?
107
00:08:49.885 --> 00:08:55.425
That's my understanding. Yeah. Okay. Then, yeah, it is probably Their whole trust model relies on cloud secure enclaves.
108
00:08:55.810 --> 00:08:56.870
Okay. Then, yes.
109
00:08:57.170 --> 00:09:05.910
I I haven't seen that part of it. Alright. I mean, it's part of, like, the spook theory of Signal or whatever, is that SGX is backdoored and,
110
00:09:07.810 --> 00:09:09.510
that you don't actually have privacy.
111
00:09:10.050 --> 00:09:11.110
I mean, I think
112
00:09:11.654 --> 00:09:15.514
I think Signal is a fantastic app, and I think it provides you reasonably
113
00:09:15.815 --> 00:09:16.315
secure,
114
00:09:17.815 --> 00:09:21.035
chats in a very convenient way that is very reliable.
115
00:09:21.415 --> 00:09:24.954
And so, like, I when I compare it to Signal, I
116
00:09:26.230 --> 00:09:26.630
I
117
00:09:27.110 --> 00:09:29.290
to me, that's like the the FreedomTech.
118
00:09:30.950 --> 00:09:34.090
The mass the the the mass market FreedomTech
119
00:09:34.790 --> 00:09:36.570
success story is Signal.
120
00:09:36.870 --> 00:09:39.530
Right? And it's crazy. Like, if you look at the numbers,
121
00:09:40.505 --> 00:09:47.805
it's still not that many users in the scheme of things compared to what they're competing against. I think Signal's, like, a 100,000,000 users,
122
00:09:48.985 --> 00:09:50.045
while, like, WhatsApp
123
00:09:51.705 --> 00:09:55.005
is three and a half billion Yeah. Or something like that. Mhmm.
124
00:09:55.550 --> 00:09:57.230
But it's a 100,000,000
125
00:09:57.230 --> 00:10:02.610
people that are using an encrypted chat app that probably wouldn't be using it otherwise. Like, they're not gonna use PGP.
126
00:10:03.069 --> 00:10:05.329
They're not gonna self host a Matrix server.
127
00:10:06.110 --> 00:10:10.610
And so there's a lot of similarities, I think, there with on the AI side
128
00:10:11.055 --> 00:10:12.275
because a lot of
129
00:10:13.135 --> 00:10:17.875
the open models, private, and secure stuff has been focused on self hosting stuff.
130
00:10:18.255 --> 00:10:22.835
But the overwhelming majority of people like, my parents are not gonna self host an LLM.
131
00:10:23.535 --> 00:10:24.035
Mhmm.
132
00:10:24.620 --> 00:10:33.680
They have downloaded and used Maple. Right? Like, it's just a one to one almost a one to one drop in replacement to something like ChatGPT in terms of their workflow. Right? Right.
133
00:10:34.060 --> 00:10:38.399
Yeah. And there's levels to privacy risk that people wanna take on exactly what you're describing.
134
00:10:38.875 --> 00:10:42.815
And so people love to come to me and say, well, local AI is the only private thing,
135
00:10:43.115 --> 00:10:51.615
and so I'm not gonna trust your service. And it's like, that's fine. Don't trust my service. Local AI is for sure the most private. You can turn off the Internet connection. You can be totally disconnected
136
00:10:51.980 --> 00:10:55.920
and chat, and you can have a you can have a computer that never touches the Internet. That's fine.
137
00:10:56.540 --> 00:11:08.355
However, there's, like, this huge gap between that and fully captured ChatGPT. And so we're gonna take as much of local privacy as possible and give it to you in the cloud with, like, really powerful servers,
138
00:11:09.135 --> 00:11:15.555
synchronization across all your devices, and we use private keys and manage private keys for you. And so we try to make the experience
139
00:11:15.935 --> 00:11:20.275
as close as possible to something like ChatGPT, but give you as much privacy as local.
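The "keys managed for you, ciphertext in the cloud" model described here can be sketched as a toy. This is not Maple's actual implementation, and the hash-based stream cipher below is illustrative only; a real system would use a vetted AEAD cipher such as AES-GCM.

```python
# Toy sketch of end-to-end encrypted chat sync: the client derives a key
# from a user secret, encrypts the chat locally, and the server only ever
# stores ciphertext. Illustrative only -- do not use this cipher in practice.
import hashlib
import hmac
import secrets


def derive_key(user_secret: bytes, salt: bytes) -> bytes:
    # Stretch the user secret into a 32-byte key.
    return hashlib.pbkdf2_hmac("sha256", user_secret, salt, 100_000)


def keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Generate a pseudorandom byte stream from the key and a fresh nonce.
    out, counter = b"", 0
    while len(out) < length:
        out += hmac.new(key, nonce + counter.to_bytes(8, "big"), hashlib.sha256).digest()
        counter += 1
    return out[:length]


def encrypt(key: bytes, plaintext: bytes) -> tuple[bytes, bytes]:
    nonce = secrets.token_bytes(16)
    ct = bytes(a ^ b for a, b in zip(plaintext, keystream(key, nonce, len(plaintext))))
    return nonce, ct


def decrypt(key: bytes, nonce: bytes, ct: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(ct, keystream(key, nonce, len(ct))))


key = derive_key(b"user passphrase", salt=b"per-user-salt")
nonce, blob = encrypt(key, b"my private AI chat")  # the blob is all the server sees
assert decrypt(key, nonce, blob) == b"my private AI chat"
```

The point of the sketch: any device holding the derived key can decrypt the synced history, while the server stores only opaque blobs.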
140
00:11:21.340 --> 00:11:27.040
Yeah. I mean, I I think there's there's a huge appeal to that. And even for someone like a power even for a power user,
141
00:11:29.100 --> 00:11:30.880
I mean, the convenience of that
142
00:11:32.460 --> 00:11:35.920
is just a it's a significant benefit in terms of
143
00:11:36.404 --> 00:11:36.904
of,
144
00:11:38.404 --> 00:11:41.785
how that workflow fits in your life. Even something just like multiplatform
145
00:11:42.245 --> 00:11:44.825
support. Right? Like, the fact that I can just have, like, Maple
146
00:11:45.685 --> 00:11:56.210
easily, like, on my desktop and on, you know, on my phone at the same time, and all my chat history is synced. Like, computer here too. Like, I've got it everywhere. Super useful.
147
00:11:56.750 --> 00:11:57.250
Mhmm.
148
00:11:57.950 --> 00:12:00.370
Okay. So that trade off model makes sense to me.
149
00:12:01.230 --> 00:12:03.730
You guys have been using open models.
150
00:12:04.455 --> 00:12:05.355
So these models
151
00:12:05.735 --> 00:12:06.235
are
152
00:12:07.175 --> 00:12:07.675
are
153
00:12:08.775 --> 00:12:11.595
they're hosted in the cloud, but they're open nonproprietary
154
00:12:12.215 --> 00:12:13.275
models. So
155
00:12:14.455 --> 00:12:15.995
how should people think about
156
00:12:16.855 --> 00:12:19.435
the and recently, you've added OpenAI's
157
00:12:20.270 --> 00:12:26.770
new open source model. Right. How should people think about, like, these models, at least in the current state versus
158
00:12:27.470 --> 00:12:34.450
the proprietary ones that maybe they're used to? Maybe I mean, I think, overall, majority of people are probably using ChatGPT right now. Right? It's like the Kleenex.
159
00:12:34.805 --> 00:12:35.944
Yeah. Or Grok.
160
00:12:36.404 --> 00:12:44.185
Yeah. It's like yeah. Grok. And a lot of people view Grok as, like, the private alternative to ChatGPT. Right? They think it's the the uncensored one.
161
00:12:44.725 --> 00:12:48.105
And maybe it's good to take a quick step back and just, like, explain
162
00:12:49.090 --> 00:12:52.150
the different levels and and maybe private versus open.
163
00:12:53.250 --> 00:13:04.665
And I'm I'm sure that a lot of people, like, tuned into this and immediately rolled their eyes because we're talking about AI. I remember being really sick of the AI discussions a few years ago when everybody was talking about it even though I was doing it at Apple every day.
164
00:13:05.225 --> 00:13:12.685
I just wanted to talk about Bitcoin. So I apologize to everybody who's on here right now not wanting to listen to AI talk. Okay. That said,
165
00:13:13.464 --> 00:13:15.964
so you have like the big companies that are the foundational,
166
00:13:16.505 --> 00:13:19.324
foundation model companies like OpenAI, Anthropic,
167
00:13:20.720 --> 00:13:23.140
xAI, which does Grok for Twitter.
168
00:13:24.640 --> 00:13:39.195
And these people are actually taking all of the data. They're scraping the Internet. They're taking all of the torrents that are online even though they say they're not doing that. And they're ingesting them, and then they spend, like, hundreds of millions of dollars to run these big servers that train the AI.
169
00:13:39.654 --> 00:13:46.235
And it'll train in, like, these epochs where every two weeks, they can grab a snapshot of the training, and they'll run,
170
00:13:47.015 --> 00:13:58.290
evaluations against it and try to find, you know, when the epoch finally hits a spot where they're happy with it. And, eventually, when it does, then they'll take that version and say, alright. This is now a new model that we're gonna release to the world.
171
00:13:59.390 --> 00:14:04.130
And for the companies like OpenAI, they tend to not release it in the open.
172
00:14:04.430 --> 00:14:06.850
They wanna release it within their product, within ChatGPT,
173
00:14:07.355 --> 00:14:11.055
and that's what you see as, like, GPT-4, 4.5,
174
00:14:11.515 --> 00:14:16.095
o3, you know, GPT-5, whatever they wanna call it. Their naming scheme is awful.
175
00:14:16.555 --> 00:14:17.295
And so
176
00:14:17.915 --> 00:14:21.480
It's so bad. Oh, it's so awful. Nobody knows what's going on.
177
00:14:21.880 --> 00:14:23.800
So that's kinda how they do that.
178
00:14:24.279 --> 00:14:25.259
And then in
179
00:14:26.040 --> 00:14:33.019
historically, like, none of them have open sourced anything. Elon says he's gonna open source Grok, but it's like, Elon, where's the open source?
180
00:14:34.015 --> 00:14:35.875
He hasn't done it yet. And then
181
00:14:36.575 --> 00:14:38.834
OpenAI recently open sourced
182
00:14:39.135 --> 00:14:44.675
one of their models. They made two variants of it. And so you've got the GPT-OSS
183
00:14:45.135 --> 00:14:46.995
20B and a 120B.
184
00:14:47.390 --> 00:14:56.770
And really the number at the end is just how big the model is, how many billions of parameters inside of it. And so 20B is smaller, which means it can fit on a laptop easily, possibly fit on a phone,
185
00:14:57.150 --> 00:15:03.125
and then, you can run with that. On that note, is is bigger generally better in this situation?
186
00:15:03.665 --> 00:15:04.165
Yeah.
187
00:15:04.625 --> 00:15:15.350
Yeah. I mean, the more parameters you have, the more basically, the bigger the brain is that can Right. Think about the stuff that you give it. So in general, bigger is better. And then even with 20B,
188
00:15:16.050 --> 00:15:20.630
the smaller version, you'll have people do, like, they'll retrain it or they'll do what's called quantize,
189
00:15:21.089 --> 00:15:24.709
where they will take it and they'll shrink it down even more by
190
00:15:25.435 --> 00:15:35.615
kind of like zipping up files, basically. Like, when you zip up a file to send it and compress it, they're compressing it and getting rid of, like, the 32 bit precision and crunching it down to eight bits kind of thing.
191
00:15:36.395 --> 00:15:45.370
And so that allows it to fit even better on a smaller device and run faster. But again, you're sacrificing quality because you are getting rid of some of that fidelity around the edges.
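The size and quantization trade-off described above can be sketched with back-of-the-envelope numbers. These are illustrative estimates only, ignoring activations, KV cache, and runtime overhead, and the quantizer is a toy symmetric scheme, not any particular library's implementation.

```python
# Why fewer bytes per weight lets a big model fit on a laptop:
# memory scales roughly with parameter count times bytes per weight.
params = 20e9  # a 20B-parameter model

for name, bytes_per_weight in [("fp32", 4), ("fp16", 2), ("int8", 1), ("int4", 0.5)]:
    print(f"{name}: ~{params * bytes_per_weight / 1e9:.0f} GB")


# Toy symmetric 8-bit quantization of a single weight: map floats onto
# integers in [-127, 127] using a per-tensor scale, losing some fidelity.
def quantize(w: float, scale: float) -> int:
    return max(-127, min(127, round(w / scale)))


def dequantize(q: int, scale: float) -> float:
    return q * scale


scale = 0.01                    # in practice chosen from the tensor's max magnitude
w = 0.4237
q = quantize(w, scale)          # stored in one byte instead of four
print(q, dequantize(q, scale))  # recovered value is close to w, not exact
```

The rounding error in the last line is the "fidelity around the edges" being sacrificed for a model that is a quarter the size.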
192
00:15:47.110 --> 00:15:47.610
So,
193
00:15:48.149 --> 00:16:12.110
so those are the closed proprietary models. And then, we have obviously the one that OpenAI did. And then we have DeepSeek out of China. You have Alibaba out of China as well. And all the other ones are Chinese, basically. Except for Llama. Right? Yeah. So Llama is from Meta, and Meta decided they weren't gonna be able to catch up right away with the other companies. So they open sourced immediately to try and get some traction.
194
00:16:12.810 --> 00:16:15.550
And they had initial early traction with Llama
195
00:16:15.850 --> 00:16:24.715
3.1 or 3.3, those ones. They were pretty good, but then they got outpaced quickly. And when they tried to do Llama 4, they just fell flat on their face.
196
00:16:25.335 --> 00:16:30.235
They launched it with much fanfare. In fact, Anthony and I were out there at their headquarters
197
00:16:30.775 --> 00:16:33.515
shortly after the launch, for their dev day.
198
00:16:33.819 --> 00:16:37.339
And they talked about how amazing it is, but yet nobody actually uses it.
199
00:16:37.899 --> 00:16:40.959
So they must be working on something, and Zuck recently
200
00:16:41.420 --> 00:16:47.120
alluded to the fact that they might not open source the next one. They're not going to. They might keep the best stuff for themselves.
201
00:16:48.085 --> 00:16:51.145
I I I don't expect them to. Do you expect them to?
202
00:16:52.085 --> 00:17:07.680
I think they'll come out with a version, like, a variant that's open source just because they're so big on the open source that I think to not do it at all would be a big problem for them, PR wise. But But, I mean, OpenAI was literally a nonprofit
203
00:17:08.300 --> 00:17:17.975
founded to stop big tech from launching proprietary only models, and now they have the most popular proprietary only model and have pivoted to for profit. Yeah. That's true.
204
00:17:18.355 --> 00:17:27.335
So there's some there seems to be a theme there. I mean, Zuck has, like it's become a meme. It's like he's just poaching so many people. He's spending top top dollar
205
00:17:27.795 --> 00:17:28.075
Mhmm.
206
00:17:28.915 --> 00:17:32.934
For AI researchers. Like, I'd be very like, hopefully, he proves me wrong.
207
00:17:33.289 --> 00:17:36.030
Yeah. It's been crazy. Surprised if it's not proprietary.
208
00:17:36.570 --> 00:17:44.985
Yeah. It's crazy watching the OpenAI streams, and then, like, two or three weeks later, people that were on that stream for the product launch are now working at Meta.
209
00:17:45.945 --> 00:17:54.125
So there's, like, a great meme of, like, Zuck, like, looking at his phone, and it's like when you're watching the OpenAI stream and, you know, it's just like an audition for your next employee,
210
00:17:55.945 --> 00:17:56.765
job interview.
211
00:17:57.304 --> 00:18:03.540
But, no. He could totally just make it proprietary. We have no guarantees. So, yeah, the best ones are out of China right now,
212
00:18:03.920 --> 00:18:05.300
and, it's unfortunate.
213
00:18:05.840 --> 00:18:08.500
But you talked about it on on, RHR.
214
00:18:08.800 --> 00:18:23.345
Like, it has all the Chinese stuff built into it. And so there are people who take it and try to jailbreak it and come out with, like, a not a retrained version, but, like, an alternative version that's a little more uncensored and open in its
215
00:18:24.365 --> 00:18:24.865
thinking.
216
00:18:25.710 --> 00:18:31.730
So it's cool that people are doing that. We don't run any versions like that. We're just running the raw DeepSeek.
217
00:18:33.230 --> 00:18:41.090
We could Yeah. You're like but we're not right now. I think Marty did DeepSeek on Maple live on RHR and asked about Tiananmen Square or whatever.
218
00:18:41.445 --> 00:18:44.585
And it was like it was like three paragraphs. It was like
219
00:18:45.044 --> 00:18:46.024
CCP approved
220
00:18:46.565 --> 00:18:47.225
AI slop
221
00:18:47.845 --> 00:18:56.024
not answering the question. It was like, we still trust the current government as the best one ever and that they made the best decisions that they needed to for the time.
222
00:18:57.440 --> 00:19:12.660
Yeah. It's an interesting phenomenon. I mean, I kinda wanna just jump into this aspect a little bit even though it's a little bit tangential to Maple, but because you've been so focused on this area, just like AI open development in general
223
00:19:13.005 --> 00:19:14.205
and just, freaks, to
224
00:19:15.885 --> 00:19:19.745
Marks' earlier comment about people that might be rolling their eyes about AI,
225
00:19:21.725 --> 00:19:28.065
I think there's another group that maybe is like, okay. Like, this is a little bit too deep and technical or in the weeds.
226
00:19:30.580 --> 00:19:47.225
Just if you are gonna leave us, just consider downloading Maple in your favorite app store. It's actually the whole point of it is that it's very simple to use. You just pick your model. You chat with it like you would with ChatGPT or Grok. It's, like, very easy to use. All this stuff is, like, kind of more background stuff.
227
00:19:49.524 --> 00:19:52.585
What I was gonna ask is I mean, so on this Zuck piece
228
00:19:52.965 --> 00:19:59.385
and on the open piece with the Chinese models most of the open models are Chinese.
229
00:20:00.380 --> 00:20:06.559
And just to reiterate, because we did talk about it on RHR, and there's some overlap in the audiences, but not full overlap.
230
00:20:07.179 --> 00:20:12.080
You know, part of that reason is, I think, because no one trusts the CCP with closed models.
231
00:20:13.165 --> 00:20:18.145
So they get their soft power and their soft influence of you using their models that they've trained their ways.
232
00:20:18.525 --> 00:20:22.545
Mhmm. But you don't have to necessarily trust them because they're like, I would never run DeepSeek,
233
00:20:23.085 --> 00:20:33.600
but I do because it's available through Maple in a secure private way because it's open. Like, the only reason that I'm even considering touching DeepSeek is because they've released it open.
234
00:20:34.059 --> 00:20:34.559
Yeah.
235
00:20:35.980 --> 00:20:39.600
There's two pieces there on the proprietary side. Right? Because it's
236
00:20:40.295 --> 00:20:40.955
not only
237
00:20:41.495 --> 00:20:42.235
are they
238
00:20:43.015 --> 00:20:46.155
you know, not only do we not really understand exactly, like,
239
00:20:46.535 --> 00:20:47.355
how the
240
00:20:47.655 --> 00:20:51.995
machinations are working in the background, like, how the LLMs are, like, thinking about things.
241
00:20:52.375 --> 00:20:55.195
But because they're in their walled gardens, they're also
242
00:20:56.200 --> 00:21:03.580
training them on your like, they're training them on the data that you're actually using in the app. Right? Like, I I mean and this is why I think Facebook
243
00:21:04.440 --> 00:21:09.260
is probably even less likely to continue the open path because their whole business model has been,
244
00:21:10.145 --> 00:21:14.804
like, walled garden. We own all your data, and we harvest you for advertisers
245
00:21:15.184 --> 00:21:17.845
historically. But now they can harvest you for
246
00:21:19.345 --> 00:21:21.205
for the LLM training. So
247
00:21:21.960 --> 00:21:29.740
do you think this path is that the big tech is just gonna keep going? Like, that's my question. Like, is the incentive too strong? Like, is the incentive so strong that they're,
248
00:21:30.360 --> 00:21:39.875
at least for the foreseeable future, gonna continue down this proprietary path? Yeah. I mean, that's that's a really deep rabbit hole that we could go down. But
249
00:21:40.415 --> 00:21:46.835
you look at Zuckerberg and you look at the products that they're building. Right? They have all these social products. They have WhatsApp.
250
00:21:47.870 --> 00:21:52.290
They have tons of data. And he he, like, on stage when we were there in April,
251
00:21:52.750 --> 00:22:02.115
was just so happy and proud to say, like, we have all this data, and we are gonna train off all your data. Like, he legit said that, and we're not gonna share it with anyone. We're gonna keep it for ourselves. And I think he
252
00:22:04.035 --> 00:22:06.695
was trying to brag that they can have a better model than others because they have all that data that they're snooping on.
253
00:22:06.995 --> 00:22:21.500
But then, as he's saying that, he's got these glasses on that are AI AR glasses, and he's, like, constantly, like, messing with them. Half the time he's talking to the Microsoft CEO, he's, like, distracted, like, trying to dismiss notifications or who knows what.
254
00:22:22.679 --> 00:22:27.179
But I think their incentives are in line to, like, vertically integrate
255
00:22:27.505 --> 00:22:37.845
to differentiate. So they're they're coming out with hardware devices. Meta has theirs. You have OpenAI who partnered with Jony Ive and are supposedly working on some hardware device.
256
00:22:38.225 --> 00:22:42.405
And then you have Apple who obviously has a Vision Pro and hasn't
257
00:22:42.710 --> 00:22:50.650
made a big splash yet with AI. But given Apple's history, you have to imagine they're gonna try to vertically integrate AI into that device as well.
258
00:22:51.270 --> 00:22:54.890
So I think as you have companies trying to differentiate on their hardware,
259
00:22:55.425 --> 00:22:57.365
they're gonna want total control
260
00:22:57.825 --> 00:23:02.165
over the model that's inside of it. So I could see them staying as proprietary as possible.
261
00:23:02.625 --> 00:23:07.745
And then maybe they still kinda toss out open source crumbs here and there. But,
262
00:23:09.230 --> 00:23:14.290
I think that we are going to have to have others who are doing open source as a way to compete
263
00:23:14.910 --> 00:23:16.290
with these big companies.
264
00:23:17.470 --> 00:23:35.875
So, I mean, we kinda touched on it at the beginning, but you do have history at Apple. Like, to me, they're the dark horse. I just kinda spoke on it, but how are you thinking about their AI strategy? Like, whatever they release probably won't be open. They don't really release anything open, but they might do, like, private local stuff.
265
00:23:38.169 --> 00:23:39.070
Yeah. Yeah. So,
266
00:23:40.090 --> 00:23:54.225
I mean, I'll speak what's publicly available online. Yeah. They have they have their Apple private cloud compute, which is based off secure enclaves, very similar to what we're doing with Maple. But they are always gonna stick to their platforms. They hardly ever wanna go cross platform.
267
00:23:55.164 --> 00:24:00.784
They they very rarely do that. So their stuff is going to be integrated into the operating system as much as possible,
268
00:24:01.325 --> 00:24:04.145
and they're gonna have their own models that they train.
269
00:24:04.525 --> 00:24:05.265
It's gonna
270
00:24:06.200 --> 00:24:11.100
I I don't know if this is a good analogy, but it's gonna be very similar, I think, to Disney and to Nintendo
271
00:24:11.640 --> 00:24:14.780
where you get their products and they're, like, very family friendly
272
00:24:15.240 --> 00:24:16.220
all the way through.
273
00:24:16.520 --> 00:24:25.325
You can kinda see that when you look at image generation. Apple does have image gen with their AI and it's like these cute little emoji type things. Yeah. Like, make your own emoji.
274
00:24:25.625 --> 00:24:32.524
Yeah. It's nothing serious. Like, you can't do anything. You can't be like, give me Donald Trump, you know, eating a cheeseburger or something.
275
00:24:32.904 --> 00:24:35.150
You can't do anything like that. So
276
00:24:35.770 --> 00:24:41.070
I I I think that's Apple's probably gonna stay safe with anything that it does.
277
00:24:41.850 --> 00:24:43.950
I don't know if you saw Meredith Whittaker's
278
00:24:44.650 --> 00:24:45.870
article this morning.
279
00:24:47.255 --> 00:24:48.715
Was it the in The Economist?
280
00:24:49.255 --> 00:24:50.875
Forbes? I don't remember. But
281
00:24:51.255 --> 00:24:54.475
she wrote an article about how AI on the OS
282
00:24:54.855 --> 00:25:01.035
is, like, a huge attack vector for Signal. Was that a recent one? Or, I don't know if that was today.
283
00:25:02.260 --> 00:25:04.840
I mean, it was the tweet was, like, this morning.
284
00:25:05.380 --> 00:25:13.080
But maybe because she said it in the past. By the way, Freaks, if for some reason you don't know who Meredith Whittaker is, she's the CEO of Signal.
285
00:25:13.380 --> 00:25:17.394
Very outspoken privacy advocate. She speaks better than all of us. Yeah.
286
00:25:17.695 --> 00:25:32.310
Yeah. So I'm not sure of the exact timing of it. But her point was, like, it's a huge attack vector for these apps, for Signal, even Maple. If you have an AI that's embedded in your OS and screen reading, it's, like, recording your screen, recording your audio, recording your keystrokes,
287
00:25:34.230 --> 00:25:37.050
then what can we do? Right? You're totally captured. So
288
00:25:37.430 --> 00:25:47.850
I imagine Apple will try to take the strong privacy stance with that. With whatever they do, it'll be all Well, they're still gonna do it. They're gonna be like, but you can trust us because we're Apple.
289
00:25:48.165 --> 00:25:51.865
Yes. Right? Because, like, the the utopian vision that, like,
290
00:25:52.405 --> 00:25:56.025
the All-In guys or, like, tech bros in general will give you is
291
00:25:56.405 --> 00:25:58.105
you never use an app again,
292
00:25:58.485 --> 00:26:03.360
and you just, like, you open up your phone and you're like, book me a room at,
293
00:26:04.140 --> 00:26:05.679
you know, a hotel in,
294
00:26:06.779 --> 00:26:18.320
you know, in Nashville, and base it off of my calendar or whatever. And, like, it just automatically pulls the information from all different aspects of your phone and your life. And Meredith's concern is,
295
00:26:18.885 --> 00:26:28.985
inevitably, if someone's using Signal, the biggest attack vector on Signal historically has been, well, it's in plain text on your phone. Like, I can just read it on your phone. So why wouldn't the AI piece,
296
00:26:29.365 --> 00:26:36.720
you know, pull a phone number or contact or a schedule or something from Signal and then use it? But that's also, like, one step away from
297
00:26:37.260 --> 00:26:40.640
pure dystopian hell if done the wrong way. Right? Right. Yeah.
298
00:26:41.020 --> 00:26:42.400
No. And and Apple's
299
00:26:42.860 --> 00:26:43.680
always just
300
00:26:44.060 --> 00:26:51.725
relied on third-party auditors to say, hey, these people have looked at our code. We've given them the images of our servers, and they say it's okay.
301
00:26:52.184 --> 00:26:56.284
So, yeah, you're still kind of in the trust-me-bro type world when it comes to that privacy.
302
00:26:58.904 --> 00:27:03.485
But, yeah. You wanna talk about, like, the closed systems in general. I mean,
303
00:27:03.830 --> 00:27:09.850
there are so many spots where they can censor you. They can, like, mess with you.
304
00:27:11.030 --> 00:27:13.370
So I'm speaking in two weeks at ImagineIF
305
00:27:13.990 --> 00:27:27.545
and I'm doing research on my topic right now. But, like, my whole topic is gonna be on freedom of thought and how, the more that we use AIs and the more that they become an extension of our brain, we are actually giving them the ability
306
00:27:28.165 --> 00:27:30.345
to alter the way that we think.
307
00:27:30.809 --> 00:27:31.309
And,
308
00:27:31.850 --> 00:27:40.990
if we're using closed systems, proprietary systems, we can actually give away our freedom of thought because we don't know exactly what they're doing internally
309
00:27:41.529 --> 00:27:43.870
and what our thought process is gonna be.
310
00:27:45.034 --> 00:27:46.075
So I think that,
311
00:27:47.275 --> 00:27:52.735
that can be a scary thing that we really need to think seriously about before we go down that path more.
312
00:27:53.595 --> 00:27:56.654
Yeah. No one's thinking. No one cares. Everyone's just gonna push.
313
00:27:57.434 --> 00:27:58.895
I it's it kinda
314
00:27:59.275 --> 00:27:59.755
it's pretty
315
00:28:00.660 --> 00:28:03.800
it kinda reminds me a little bit of, like, the social phenomenon
316
00:28:04.260 --> 00:28:07.160
where we kind of our whole lives went digital,
317
00:28:08.100 --> 00:28:08.600
and
318
00:28:09.140 --> 00:28:20.515
no one, like, planned ahead. Because, like, the way it works is you have all these companies competing against each other, so no one wants to slow down what they're doing. Mhmm. So you kinda just ship and then ask questions later.
319
00:28:22.174 --> 00:28:25.554
I mean, along these lines is, like, I don't know what they call it. It's like,
320
00:28:27.510 --> 00:28:31.850
what is it when, like, AI makes people go crazy? Or Oh, just like AI psychosis.
321
00:28:32.309 --> 00:28:40.250
AI psychosis? Mhmm. Like, are you guys thinking about this at all at Maple? I mean, it's kind of a freedom focused product, so it's a little bit odd.
322
00:28:40.595 --> 00:28:44.295
Yeah. No. We do talk about this. We were just talking about it again yesterday.
323
00:28:45.235 --> 00:28:54.855
There are the headlines. Right? There's been a couple headlines out there of a really tragic event where a teenager was chatting with AI, specifically ChatGPT, and
324
00:28:55.390 --> 00:28:56.130
it basically
325
00:28:57.150 --> 00:29:00.690
was, like, promoting that the teenager should kill themselves
326
00:29:01.150 --> 00:29:09.570
and telling them it would be proud of them if they did it and that kind of stuff. Because it always tries to be positive. Right? Yeah. It's like a sycophant where it's trying to, you know,
327
00:29:09.934 --> 00:29:12.515
fluff you up, if you will. So,
328
00:29:13.695 --> 00:29:15.315
we've definitely thought about this.
329
00:29:15.855 --> 00:29:17.294
And one point to raise,
330
00:29:17.775 --> 00:29:35.140
that, you know, we were chatting about yesterday is, like, there are so many cases you don't hear about, that somebody avoided suicide or avoided doing something awful because they were chatting with AI. Right? Maybe you have a rough home life. Maybe you, you know, don't have a good social life around you.
331
00:29:35.680 --> 00:29:44.225
And AI is actually helping you. It is somewhat of a therapist. I don't recommend people use AI as a therapist, but at the same time, like if you have absolutely nothing,
332
00:29:44.605 --> 00:29:52.560
then you're going to go on user forums. You're going to go on chat rooms. You're going to do whatever you can to have some kind of outlet. And maybe you have this, this tool that you chat with.
333
00:29:53.280 --> 00:29:58.660
So you're never gonna hear the stories where nothing bad happened. You're just gonna hear the ones where it did.
334
00:29:59.760 --> 00:30:01.140
But I do think that
335
00:30:01.520 --> 00:30:08.995
we try to be as open as we can. So we're giving all of our code out open source. Our system prompt is right there in the code.
336
00:30:09.455 --> 00:30:16.434
And if we do add guardrails in, right, like, we're looking at how do we make it, like, some kind of family,
337
00:30:16.895 --> 00:30:23.180
thing where parents can, like, be in the AI with their kids and have, like, a more safe environment for kids to use it.
338
00:30:23.560 --> 00:30:32.460
We'll, like, expose everything in there for the parents to see, like, here are the guardrails that are set up around your kid, and maybe they have settings that they can adjust.
339
00:30:33.400 --> 00:30:36.055
But I think we need to be responsible,
340
00:30:36.435 --> 00:30:46.855
but we're also making a tool. And you can say all sorts of things about tools that are used to do bad things and whose responsibility it is. Ultimately, it's the responsibility of the individuals that use it.
341
00:30:47.890 --> 00:30:50.710
Yeah. It's a tough one. By the way, Freaks,
342
00:30:51.010 --> 00:30:53.809
BTC pins zapped 8,008
343
00:30:53.809 --> 00:30:56.070
sats. It's a boob zap.
344
00:30:56.450 --> 00:30:57.350
Thank you, sir.
345
00:30:57.970 --> 00:30:59.655
And Nitro Soil zapped
346
00:30:59.975 --> 00:31:08.715
10,000 sats. He said, I access general AI through Kagi, best search engine, in my opinion. How's that privacy mode compared to Maple?
347
00:31:09.575 --> 00:31:12.235
I realize they still use the mainstream AIs.
348
00:31:13.160 --> 00:31:19.179
Yeah. Okay. And I turned myself up, by the way. Somebody said they couldn't hear me very well. So I'm gonna
349
00:31:19.480 --> 00:31:22.940
put the microphone closer. I turned myself up. So hopefully that's better.
350
00:31:23.240 --> 00:31:24.940
Sir, I I hear you great.
351
00:31:25.735 --> 00:31:26.794
Okay. Cool.
352
00:31:27.495 --> 00:31:31.674
I also see oh, I see. Turn up the guest volume or Matt down.
353
00:31:32.294 --> 00:31:41.970
Okay. Well, now you're up. I also see the longer Odell's hair grows, the more powerful he becomes. I wasn't looking at the YouTube channel. That's true. It's a fact, actually. Nice. Yep.
354
00:31:42.510 --> 00:31:50.290
And Ben Carman got on there and said nice Maple shirt. You're seeing brand new merch. This arrived one hour before we hit record. Beautiful. Beautiful.
355
00:31:50.830 --> 00:31:57.635
I got some new merch coming. Okay. So let's let's answer Nitro Soil's question and then get back into what we were talking about. Right?
356
00:31:58.095 --> 00:32:02.975
Yeah. And there's somebody else on Nostr, when we posted that we were going live.
357
00:32:03.295 --> 00:32:13.930
They asked about Lumo, which is Proton's. So I'll answer all of these at the same time. Venice AI is another one that comes up. And PPQ. I had PPQ on. Yes. PPQ is awesome. They're
358
00:32:14.470 --> 00:32:15.530
they're a little different,
359
00:32:16.310 --> 00:32:20.810
but it all kinda falls in similar buckets. And so these are companies that,
360
00:32:21.830 --> 00:32:26.010
are not letting you see what they run on the servers, so you're having to trust them. So,
361
00:32:26.625 --> 00:32:29.045
Kagi is great. I use them for web search,
362
00:32:29.505 --> 00:32:32.485
but they're not open source, and so I'm just trusting
363
00:32:33.025 --> 00:32:55.610
that they're not logging my stuff. So it's much more like a VPN. A lot of people use VPN services. You go on to something like TorrentFreak, and every year, they send out a questionnaire to all the VPN providers and ask, like, do you keep logs? Do you keep IP addresses? Do you do these certain things? What happens if the police give you a subpoena for data? What do you do and what can you provide to them? So,
364
00:32:56.410 --> 00:32:56.910
Proton,
365
00:32:57.415 --> 00:32:57.915
Kagi,
366
00:32:58.615 --> 00:32:59.115
Venice
367
00:32:59.815 --> 00:33:06.375
in my mind, and maybe I'm wrong. Maybe something's been changed. So, like, don't hold me legally liable to this. But,
368
00:33:07.095 --> 00:33:25.650
they are more under the, like, trust us, we're not gonna keep logs on you, but you're never totally sure. So the government could show up at their door, and they could be like, oh, yeah, actually, we were keeping stuff, and here you go. And there's no way of even telling. Yeah. Most importantly, and this is key in the VPN conversation,
369
00:33:26.535 --> 00:33:29.995
is, like, there's no way to prove that logs aren't being taken. Mhmm.
370
00:33:30.935 --> 00:33:40.715
You just have to go off of, like, which ones have actually been subpoenaed by governments and look at court records, and then you find out that some of them actually were keeping logs,
371
00:33:41.420 --> 00:33:45.760
or were accidentally keeping logs. Right? Maybe even if they don't want to. So
372
00:33:46.299 --> 00:33:47.679
we just try to
373
00:33:48.059 --> 00:33:54.075
just show everybody what we're doing. If you wanna see what's coming in Maple, just go on GitHub and see what we're working on, because we build in the open.
374
00:33:54.554 --> 00:34:01.934
We can't ship new stuff to the enclaves without everybody seeing it. So that's how we keep everybody safe: we're totally transparent.
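The "can't ship without everybody seeing it" property described above comes from enclave remote attestation: the enclave reports a measurement (a hash) of the code it is running, and anyone can compare that against a hash rebuilt from the public source. A minimal sketch of the verification step; the function names and the toy byte-string "image" are illustrative, not OpenSecret's actual code:

```python
import hashlib

def measure(artifact: bytes) -> str:
    # Toy "measurement": a hash of the build artifact. Real enclaves
    # (e.g. AWS Nitro) produce PCR values over the enclave image,
    # but the idea is the same.
    return hashlib.sha256(artifact).hexdigest()

def verify_attestation(attested_measurement: str, public_source_build: bytes) -> bool:
    # Rebuild the artifact from public source (a reproducible build),
    # then check it matches what the running enclave attests to.
    return measure(public_source_build) == attested_measurement

# The operator cannot silently swap code: a different binary yields a
# different measurement, and verification fails.
build = b"open-source enclave image v1"
good = verify_attestation(measure(build), build)
bad = verify_attestation(measure(b"backdoored image"), build)
```

The point is that transparency here is enforced by math rather than policy: trust moves from "we promise" to "anyone can check the hash."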
375
00:34:03.434 --> 00:34:05.774
And then PPQ, I said they're a little different.
376
00:34:06.154 --> 00:34:07.034
They operate
377
00:34:07.380 --> 00:34:18.520
I mean, I guess they are like the other ones. They're just a full-on proxy where they just take your request and send it over. But you can be totally anonymous because you can have, like, no account. You can pay with Freedom Money,
378
00:34:19.300 --> 00:34:23.160
and then it sends it over. So the only way you're really gonna get doxxed is,
379
00:34:23.855 --> 00:34:25.395
if you put personal information
380
00:34:25.775 --> 00:34:30.515
in the query that you're sending to OpenAI through PPQ, then they'll know who you are.
381
00:34:31.454 --> 00:34:31.954
Right.
382
00:34:32.815 --> 00:34:34.655
Or I mean, also, I had,
383
00:34:36.600 --> 00:34:39.420
Matt on who's the founder of PPQ.
384
00:34:39.960 --> 00:34:47.260
It's also, like, if you do it in the same instance, the chats can be cross-linked as well.
385
00:34:47.720 --> 00:34:48.220
Yeah.
386
00:34:49.240 --> 00:34:51.900
Because there's been some so there's another one
387
00:34:53.665 --> 00:34:59.505
that uses Cashu tokens to try and delink it. So, like, even though, like, with PPQ, you don't have,
388
00:35:02.625 --> 00:35:04.325
you don't have a traditional account,
389
00:35:04.865 --> 00:35:11.670
like, username, email. You do have, like, a Mullvad-style thing. Like, you have an account number. And so, like, the
390
00:35:12.609 --> 00:35:15.349
it is kind of cross-linked. Your chats
391
00:35:15.730 --> 00:35:24.815
are kind of linked together to a degree. But if you don't put personal information in, then maybe they won't know. But then, meanwhile, the difference with Maple, right,
392
00:35:25.355 --> 00:35:30.414
is that you just have no insight whatsoever into what the chats are. Yeah.
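The linkability trade-off discussed above can be sketched: a persistent account number lets the operator group all of a user's queries together, while single-use bearer tokens (the Cashu-style approach) leave each query standing alone. A toy illustration under those assumptions; the credential strings and log shape are hypothetical, not any provider's real API:

```python
import secrets
from collections import defaultdict

def log_requests(requests):
    """Group server-side logs by whatever credential each request carries."""
    by_credential = defaultdict(list)
    for credential, query in requests:
        by_credential[credential].append(query)
    return by_credential

# Persistent account number: every chat links back to one identity.
account = "acct-1234"
linked = log_requests([(account, "query A"), (account, "query B")])
# One credential, two queries: the operator can correlate them.

# Single-use tokens: each chat carries a fresh, unlinkable credential.
delinked = log_requests([(secrets.token_hex(8), "query A"),
                         (secrets.token_hex(8), "query B")])
# Two credentials, one query each: nothing ties the chats together.
```

Real ecash schemes add blind signatures so even the token issuer can't connect a spent token back to the purchase, but the grouping effect shown here is the core of the delinking argument.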
393
00:35:30.714 --> 00:35:39.350
If you could look at our database, it's just a bunch of encrypted data. So we have zero insight. So if anybody came, the only information that we have to turn over to law enforcement
394
00:35:39.650 --> 00:35:44.710
is the email address you signed up with, and then we have,
395
00:35:45.330 --> 00:35:47.990
time stamps of when chats were made.
396
00:35:48.465 --> 00:35:51.685
And we know the number of tokens that were used. So tokens,
397
00:35:52.145 --> 00:35:54.785
you know, for the technical term, tokens are,
398
00:35:56.385 --> 00:36:18.765
how big of a chat you had. So when you type something in, a token is roughly, like, one and a half words, English words. And so that's how many tokens went into the model and then it generated stuff and sent something back and that has a number of tokens associated with it. So that is it. Like, that's the only data that we have on people. And over half of our users use privacy email addresses and so we don't know who they are.
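The metadata model described above, a timestamp plus token counts and nothing else, can be sketched as a billing record. The word-to-token ratio varies by tokenizer and language (the constant below is just a common rule of thumb, and a real system would count with the model's own tokenizer); the field names are illustrative, not Maple's actual schema:

```python
import time

CHARS_PER_TOKEN = 4  # rough rule of thumb; real tokenizers vary

def estimate_tokens(text: str) -> int:
    # Crude token estimate from text length; a real system counts
    # tokens with the model's own tokenizer instead.
    return max(1, round(len(text) / CHARS_PER_TOKEN))

def billing_record(prompt: str, completion: str) -> dict:
    # The only fields the operator needs for metered billing:
    # when the chat happened and how many tokens it used.
    return {
        "timestamp": int(time.time()),
        "prompt_tokens": estimate_tokens(prompt),
        "completion_tokens": estimate_tokens(completion),
    }

record = billing_record("What is a secure enclave?", "A secure enclave is ...")
assert "prompt" not in record  # content never touches the billing log
```

Note what is absent: neither the prompt nor the completion text appears anywhere in the record, which is the whole point of the metadata-only claim.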
399
00:36:19.644 --> 00:36:21.184
And a lot of them pay with Bitcoin.
400
00:36:21.565 --> 00:36:23.984
A lot of them pay with privacy credit cards.
401
00:36:24.365 --> 00:36:25.085
And so,
402
00:36:25.484 --> 00:36:29.184
you are welcome to kind of be as private as you want to on Maple.
403
00:36:29.644 --> 00:36:36.540
And as far as the content of your actual chats goes, we have no insight. We can't see anything about them.
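The "just a bunch of encrypted data" model can be sketched with a toy cipher: the client encrypts before upload, so the server row holds only ciphertext. The XOR keystream below is a stand-in for illustration only, NOT real cryptography; an actual client would use an audited AEAD scheme, and none of these names are OpenSecret's actual code:

```python
import hashlib
import secrets

def keystream(key: bytes, length: int) -> bytes:
    # Toy keystream from SHA-256(key || counter); illustration only.
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # XOR the plaintext against the keystream, byte by byte.
    return bytes(a ^ b for a, b in zip(plaintext, keystream(key, len(plaintext))))

decrypt = encrypt  # XOR stream ciphers are symmetric

# The client-side key never leaves the device; the server row holds
# only ciphertext, which decrypts to nothing without the key.
client_key = secrets.token_bytes(32)
chat = b"private chat content"
server_row = {"ciphertext": encrypt(client_key, chat)}
```

Under this design, a subpoena against the server can only ever yield ciphertext plus the billing metadata discussed above, because the decryption key exists only on the user's device.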
404
00:36:38.119 --> 00:36:47.579
Right. Basically, the only information you have is the information you need to, like, actually conduct billing. Yep. That's it. And and you can stay private with your billing if you want to.
405
00:36:48.005 --> 00:36:53.945
Right. But I mean, like, time stamp and tokens. Right? That's because you're paying per month
406
00:36:54.325 --> 00:36:56.505
and then or I guess if you're doing
407
00:36:57.285 --> 00:37:05.420
what are the Bitcoin payment options? Is it by year or six months or three months or something? Bitcoin is annual right now because we just didn't wanna deal with monthly
408
00:37:06.280 --> 00:37:10.860
Bitcoin payments. There aren't any great solutions out there for recurring payments on Bitcoin.
409
00:37:11.720 --> 00:37:14.620
There was a wallet once called Mutiny that was doing it,
410
00:37:15.005 --> 00:37:15.825
but it's gone.
411
00:37:16.765 --> 00:37:18.705
So yeah. So it's annual subscriptions
412
00:37:19.005 --> 00:37:19.745
with Bitcoin
413
00:37:20.525 --> 00:37:21.165
and then,
414
00:37:21.965 --> 00:37:24.785
Fiat is, you know, where you pay with credit cards. It's monthly.
415
00:37:26.445 --> 00:37:27.165
Yeah. I,
416
00:37:27.805 --> 00:37:29.745
I originally paid with credit card
417
00:37:30.410 --> 00:37:32.030
because I wanted to do monthly,
418
00:37:33.050 --> 00:37:33.710
and then
419
00:37:34.010 --> 00:37:40.110
Tony, your cofounder, immediately shamed me. And then I deleted my account, and then I paid with Bitcoin.
420
00:37:42.975 --> 00:37:48.195
But, I mean, I'm a user. I use it almost every day. It's fucking awesome. I've never used ChatGPT
421
00:37:48.735 --> 00:37:50.435
personally, and I can never
422
00:37:51.455 --> 00:37:57.875
get myself to pull the trigger on that and enter that world. So I don't really know what you're competing against except that,
423
00:37:59.029 --> 00:38:00.330
you're up against a
424
00:38:00.790 --> 00:38:01.610
big juggernaut.
425
00:38:02.630 --> 00:38:06.010
Mhmm. But I love it. I I wanna I just wanna go back.
426
00:38:06.630 --> 00:38:07.130
So
427
00:38:07.430 --> 00:38:09.770
I have two questions for you on the product side.
428
00:38:12.630 --> 00:38:13.369
You know,
429
00:38:14.585 --> 00:38:18.365
at ten thirty one, we're invested in a lot of freedom focused businesses.
430
00:38:19.865 --> 00:38:22.765
A lot or privacy focus as well.
431
00:38:24.984 --> 00:38:27.565
Obviously, we're investors in you guys at Open Secret.
432
00:38:28.760 --> 00:38:39.099
But before that, it was, you know, Tony and the gang, and then you joined, and you were building a Bitcoin wallet. And that was why it kind of fell under our purview.
433
00:38:39.640 --> 00:38:43.260
And then you pivoted to Open Secret and Secure Compute and AI.
434
00:38:44.525 --> 00:38:46.465
So it's really the only AI business,
435
00:38:48.205 --> 00:38:51.345
or pure like, almost like a pure play AI business,
436
00:38:51.805 --> 00:38:53.185
that we're focused on.
437
00:38:56.730 --> 00:39:06.190
This is very long-winded, but when you're building, like, freedom stuff, it's very difficult. Like, if you're building private or freedom stuff, you have no user feedback. You don't know how users are using your product.
438
00:39:06.490 --> 00:39:08.430
So how do you think about, like, the
439
00:39:09.704 --> 00:39:13.964
the choice you give them? Like, one of the biggest things I hear from
440
00:39:14.825 --> 00:39:20.204
people that I recommend Maple to is, like, their first question. And I kinda touched on this with the Mint conversation
441
00:39:20.505 --> 00:39:21.404
last week.
442
00:39:21.730 --> 00:39:35.355
Is, like, their first question is, like, okay, which model do I choose? Like, how do you think about that? I mean, there's a lot of choice. You're giving the user a ton of freedom, but each of those models is very, very different in, like, kinda output and how they're used. How do you think about that? Yeah.
443
00:39:36.295 --> 00:39:44.075
So we wrote a model guide on our blog, just, like, kind of a high-level here are the different models and generally what they're good at.
444
00:39:44.855 --> 00:39:48.555
But to be honest, like, nobody really knows for sure what the models
445
00:39:49.015 --> 00:39:50.555
are totally good at because
446
00:39:51.549 --> 00:40:01.250
the way that you use it might be different than the way that somebody else uses it. So there are models that are focused on programming and writing code, and those, like, for sure have been fine tuned for that. Some are better at arithmetic.
447
00:40:01.950 --> 00:40:04.049
Some are better at long form writing,
448
00:40:04.515 --> 00:40:15.335
But all these models also have a general side to them. So maybe you're more of an analytical thinker in general. And so a mathematical model is better for you to use just in your general chats.
449
00:40:15.954 --> 00:40:24.170
So it's difficult to tell people, like, you should use this model for this thing, because you don't totally know. And then, like you said, we don't keep analytics on anybody,
450
00:40:24.950 --> 00:40:28.010
other than the billing stuff that we have to do. So,
451
00:40:28.710 --> 00:40:31.930
we don't know what is successful for some users.
452
00:40:32.435 --> 00:40:37.975
We could provide, like, a thumbs up, thumbs down kind of thing. Did you like this? And we could maybe store those anonymously.
453
00:40:39.155 --> 00:40:39.655
But
454
00:40:40.035 --> 00:40:43.015
aside from that, like, we just have to talk to users.
455
00:40:43.395 --> 00:40:56.280
And so we have several team accounts. We're trying to get a lot more businesses signed up on Maple because it's great. You can provide it to everybody in your company, and you get a more powerful account than the free version of Maple.
456
00:40:56.580 --> 00:41:03.234
And so we'll have phone calls with our team accounts and ask how they're using it. And so I get direct feedback from them in a more private setting,
457
00:41:03.775 --> 00:41:18.730
but it is filtered through, like, their memory of how they use the product. Whereas it would be nice if we could have full-on just, like, creepy analytics into everything that's going on. You know, when they type this in the prompt, they got this back, and then they had to reprompt again because they didn't get it correct.
458
00:41:19.670 --> 00:41:23.290
Yeah. It's more difficult to operate in that kind of private model.
459
00:41:23.750 --> 00:41:25.690
But I so I'm thinking more like
460
00:41:26.724 --> 00:41:31.065
like, yeah, blog posts are great. I mean, you do your own show, your own podcast.
461
00:41:31.845 --> 00:41:34.185
Like, talking to users is obviously helpful.
462
00:41:34.565 --> 00:41:36.265
But at the end of the day, like,
463
00:41:36.645 --> 00:41:38.505
the ideal situation is, like,
464
00:41:38.964 --> 00:41:42.025
it's a UX problem. Right? It's,
465
00:41:43.359 --> 00:41:47.140
like, how do you think about, like, UX and defaults? Like, how, like,
466
00:41:47.440 --> 00:41:49.540
is there a path there where
467
00:41:49.840 --> 00:41:51.859
you're a little bit more active on
468
00:41:53.119 --> 00:41:54.420
new user joins,
469
00:41:55.205 --> 00:41:58.585
like, how their defaults are set up versus kind of just
470
00:42:00.645 --> 00:42:04.645
throwing them into picking which model they want and how they use it? No. Definitely.
471
00:42:04.965 --> 00:42:12.720
One of the features that started to become popular on some of these services is an auto mode. And so I definitely could see where we do
472
00:42:13.099 --> 00:42:16.559
our own evals. We definitely wanna stand up our own eval service,
473
00:42:17.019 --> 00:42:37.465
and we would probably give it some kind of open nature. But start to evaluate these with different prompts and start to look, more analytically or, you know, objectively, at which ones are better. And so if we do that, then, yeah, the user could just be tossed into a prompt. They get to start typing, and we will look at their prompt and figure out which model would be better for them. And then they can always change it later.
474
00:42:38.230 --> 00:42:47.850
Another thing that would be cool is, like, throwing it off to three models at once and then somehow, like, aggregating the data back and evaluating which one was maybe better for them.
475
00:42:48.390 --> 00:42:50.250
There are different things you could do there too.
476
00:42:51.275 --> 00:42:54.815
But no, the UX and the setup experience would be,
477
00:42:55.275 --> 00:42:56.575
would be great to improve,
478
00:42:57.035 --> 00:42:59.455
over time as we continue to refine this.
479
00:43:00.075 --> 00:43:02.555
Yeah. I mean, I think so, like, I
480
00:43:04.490 --> 00:43:09.230
and I don't know if they're the right user or not, but, like, my parents are, like, very nontechnical
481
00:43:09.609 --> 00:43:10.109
people.
482
00:43:11.530 --> 00:43:14.430
And, like, their Maple experience is, they just live in Llama.
483
00:43:15.130 --> 00:43:15.630
Because
484
00:43:16.010 --> 00:43:18.109
when they first loaded up Maple,
485
00:43:18.730 --> 00:43:19.950
it put them in Llama,
486
00:43:20.295 --> 00:43:25.434
and they just live in Llama. That's what they use. It doesn't matter if you add a new model or anything else. They're just forever
487
00:43:25.974 --> 00:43:33.994
they're just forever in Llama until it gets deprecated or changed. Uh-huh. And, I don't know, it's just an interesting thing because you have so many different types of users,
488
00:43:35.590 --> 00:43:39.450
and maybe some users get prioritized over others. I think that makes sense.
489
00:43:39.830 --> 00:43:40.070
Yeah.
490
00:43:40.710 --> 00:43:48.635
But it's just an interesting thing to think about. We did change it so that it remembers the last model you used. Which is Llama for them. Which is Llama for them.
491
00:43:49.335 --> 00:43:54.075
And I'm trying to remember. We were trying to do it so that when you upgraded, it would, like, switch you to
492
00:43:54.455 --> 00:43:58.875
DeepSeek or GPT. Because they're paid users too. Yeah. So
493
00:43:59.930 --> 00:44:03.130
either they got switched because we changed that default,
494
00:44:03.690 --> 00:44:13.790
or what you could do is next time you're with them, just, like, make a single chat for them in Maple using a different model, and then they'll be on the new model after that. Yeah. I mean, it was I I take some personal responsibility
495
00:44:14.715 --> 00:44:15.215
because,
496
00:44:15.755 --> 00:44:16.735
they were visiting,
497
00:44:17.435 --> 00:44:25.215
and I basically just subscribed them both to Maple, installed it on their phones. I, like, held their iPhone up to their face, did the Face ID for them,
498
00:44:25.755 --> 00:44:28.735
and I left them in Llama. So it's kind of my fault.
499
00:44:29.100 --> 00:44:30.000
That's right. But
500
00:44:30.780 --> 00:44:33.280
they're subscribers now. They're they're supporters.
501
00:44:33.660 --> 00:44:41.120
Awesome. And they do use it. I mean, look. This is also something I've been thinking about for a while, just, like, on the privacy side.
502
00:44:44.035 --> 00:44:45.575
Is, like, you can have,
503
00:44:48.355 --> 00:44:48.855
decent
504
00:44:49.475 --> 00:44:50.935
digital privacy.
505
00:44:51.635 --> 00:44:54.935
But if your family or friends or your coworkers
506
00:44:55.555 --> 00:44:56.855
are taking your information
507
00:44:57.155 --> 00:44:57.555
and then
508
00:44:58.180 --> 00:45:02.440
and, like, the example I always use is, like, you know, posting a picture of you on Facebook or something,
509
00:45:02.820 --> 00:45:04.760
then you're screwed. Right? So,
510
00:45:05.220 --> 00:45:06.680
like, I don't want my
511
00:45:07.380 --> 00:45:14.995
parents or my cousins or my coworkers using ChatGPT either. Like, that's actually a security hole for me.
512
00:45:15.535 --> 00:45:24.435
And with Signal, I'll go back to Signal as the example. Like, I have my 90-year-old grandmother using Signal because it's the only way she gets baby pictures. Mhmm.
513
00:45:25.550 --> 00:45:27.250
She's not gonna use Matrix.
514
00:45:27.550 --> 00:45:28.930
She's not gonna use SimpleX.
515
00:45:29.710 --> 00:45:31.410
She's definitely not gonna use PGP.
516
00:45:32.990 --> 00:45:38.825
But the Signal trade-off model works for her, and that's why I'm very optimistic on Maple because,
517
00:45:39.285 --> 00:45:44.745
to me, there's a lot of similarities there in terms of the convenience-security trade-offs that are made.
518
00:45:46.245 --> 00:45:49.445
And Signal has done a really good job of that,
519
00:45:50.130 --> 00:45:54.069
of giving the user relative freedom and security, but also
520
00:45:56.529 --> 00:46:03.775
being approachable. Having it be approachable for that average person, which gives it an additional network effect, I think. Yeah.
521
00:46:04.235 --> 00:46:10.655
That's interesting. I hadn't considered that vector where somebody, like one of your cousins, is chatting about you with ChatGPT,
522
00:46:10.955 --> 00:46:17.910
and now, suddenly, a lot of your information is in there. Or they take some email or text message that you sent to them. They're like, oh, help me finish this
523
00:46:18.210 --> 00:46:21.750
or write a response to this. Respond let me respond to this or
524
00:46:22.370 --> 00:46:22.870
Mhmm.
525
00:46:23.490 --> 00:46:25.750
Make a document. Let me make a document.
526
00:46:26.530 --> 00:46:27.030
Yeah.
527
00:46:27.890 --> 00:46:33.865
Well, and then pretty much if you're trying to hop on a if you hop on a video call with somebody who's not privacy focused,
528
00:46:34.245 --> 00:46:37.865
there's gonna be an AI assistant that's I hate that shit. It's the worst.
529
00:46:38.325 --> 00:46:40.905
So that also is sweeping up all your data.
530
00:46:42.005 --> 00:46:45.705
Yeah. I hold I hold that against everyone who does that. Yeah.
531
00:46:47.110 --> 00:46:50.010
Some person is asking why do you store time stamps?
532
00:46:51.270 --> 00:46:51.750
Yeah.
533
00:46:52.070 --> 00:46:57.290
Why do we store store time stamps? I mean, we we do it because we need to know the token counts,
534
00:46:59.105 --> 00:47:00.005
And we're also
535
00:47:00.865 --> 00:47:02.244
we wanna build your history
536
00:47:02.704 --> 00:47:03.265
for you.
537
00:47:03.984 --> 00:47:11.684
There might be a way and I'd have to chat with Anthony about this. There might be a way to, like, encrypt the time stamps as part of your content. But if we're gonna rebuild
538
00:47:12.464 --> 00:47:13.525
a chat and,
539
00:47:15.099 --> 00:47:20.320
you know, that's not even that's but it's also not that sensitive. Like, what what do you get out of the
540
00:47:21.020 --> 00:47:35.855
I mean, I guess, if you are if you'd if somehow they figured out which email address you used to chat with Maple and you are in a public library I'm thinking of Ross. You know, Ross here. You're in public library. Maybe they can attach time stamps to a video feed of you using your computer or something.
541
00:47:36.315 --> 00:47:40.415
I don't know. I guess. Yeah. It's not that sensitive, though. No. It's not.
542
00:47:41.115 --> 00:47:44.155
I think there's probably similars there's probably a similar,
543
00:47:45.690 --> 00:47:49.070
another example with signal on that. Like, I wouldn't be surprised if that's one
544
00:47:49.450 --> 00:47:58.665
of the single pieces of metadata that they have. I know they've gone through great lengths to not have, like, your network graph. They don't know who you're chatting with, but I wouldn't be surprised if they know when you chat.
545
00:47:59.865 --> 00:48:01.725
Yeah. Just because they're sending data
546
00:48:02.265 --> 00:48:04.205
Mhmm. At that specific time.
547
00:48:04.505 --> 00:48:05.005
Yeah.
548
00:48:06.825 --> 00:48:13.320
Okay. I also wanna go back. I'm, like, hopping around a little bit here, but we keep going down holes, and then I pull us back.
549
00:48:14.520 --> 00:48:16.140
Just from, like, the parent angle,
550
00:48:17.160 --> 00:48:19.980
you mentioned the kids. I I I think
551
00:48:20.440 --> 00:48:20.940
ChatGPT
552
00:48:21.320 --> 00:48:23.660
just released some kind of kids mode.
553
00:48:25.720 --> 00:48:28.780
I really my kids will never use the Sam Altman product.
554
00:48:29.175 --> 00:48:29.675
Mhmm.
555
00:48:30.135 --> 00:48:31.515
But it's something that
556
00:48:32.295 --> 00:48:36.795
I battle with all the time. Well, my kids aren't old enough yet, but,
557
00:48:38.215 --> 00:48:40.555
I will battle. Like, it's something I do think about.
558
00:48:42.230 --> 00:48:45.290
How do you you know, Maple could be, like, this ideal
559
00:48:47.190 --> 00:48:51.450
so because you also don't want your kids to be Luddites. Right? It's the same issue with, like, the Internet,
560
00:48:51.750 --> 00:48:54.890
right, or, like, social media and stuff. Like, you want them to be
561
00:48:55.430 --> 00:48:56.650
prepared and ready
562
00:48:57.645 --> 00:48:58.145
for
563
00:48:58.445 --> 00:49:00.625
the, quote, unquote, economy of their future.
564
00:49:01.005 --> 00:49:01.505
Yeah.
565
00:49:02.045 --> 00:49:03.345
But at the same time,
566
00:49:04.205 --> 00:49:06.385
you know, I don't want them, like, completely
567
00:49:06.765 --> 00:49:08.625
one shotted by AIs.
568
00:49:10.610 --> 00:49:13.270
So how do you think about that? Yeah. No. Definitely.
569
00:49:14.450 --> 00:49:20.210
I mean, if if any if if OpenAI is listening to this, they're gonna hear our future potential product. But,
570
00:49:20.690 --> 00:49:29.345
no. We do want to Sam listens. Yeah. He's a writer there. Listens. Yeah. Maybe it's more Proton that that I'm worried about. But, no. We wanna come out with, like, some kind of family plan.
571
00:49:30.525 --> 00:49:46.570
We're we're in the idea stage right now for it. But ideally, it would be, like, you know, as a parent, you can sign up for a family plan. You've got your kids under you. And then you have the ability to see their chats because you, as a parent, need to steward your children and raise them,
572
00:49:47.270 --> 00:49:53.244
and and understand what they're talking about with AI. But then as they get older, maybe you can give them a little more freedom.
573
00:49:54.424 --> 00:49:54.825
And,
574
00:49:55.464 --> 00:50:04.845
there are, like, you could potentially get alerts if they chat about something that's really concerning. Yeah. Like keywords or something. Yeah. Keywords that trigger an alert to you. Something that we get from school.
575
00:50:05.250 --> 00:50:07.350
Our our kids are are in public school.
576
00:50:08.050 --> 00:50:10.070
And I know, you know, people can
577
00:50:10.370 --> 00:50:12.070
can like that or hate it. But,
578
00:50:12.690 --> 00:50:18.070
they we get, like, a a daily or a weekly summary of what they've been doing in Google Classroom
579
00:50:18.565 --> 00:50:24.905
through the week. And I think something like that would be really cool for AI if parents don't have a problem. Get that too or just the parents?
580
00:50:25.605 --> 00:50:26.505
Well, the teachers
581
00:50:27.045 --> 00:50:34.730
I I don't know. I mean, I'm sure the teachers see it. But, like, we get an aggregate of all of their classes. Like, here's what the kid was doing in school this week in Google classroom.
582
00:50:35.350 --> 00:50:36.010
And so,
583
00:50:36.470 --> 00:50:49.765
I think with AI, it'd be really nice as a parent to say, like, okay, here's what your kid chatted about today. Here's a summary of the conversation. And then if you have concerns over any of these, you can dig in and go go look at it. I'm I'm a big believer that kids don't have full privacy,
584
00:50:50.625 --> 00:50:57.525
you know, as as especially when they're younger and you're trying to teach them the ways of life. And so as we've given our kids a phone,
585
00:50:58.250 --> 00:51:05.950
we've been, like, really closed off to what they have. You know, initially, they can only text us and other family members. And then we open it up to some friends.
586
00:51:06.329 --> 00:51:16.145
They don't even get group text messages at first, and then we allow them to be part of group text messages. And can you enforce that? Does Apple give you the ability to enforce that? Or Not not the group stuff, but,
587
00:51:16.865 --> 00:51:28.085
our our kids know that we would look at their phones and So that's what you do. You would do phone checks? Yeah. So we just do phone checks, and we would just kinda spot check things. And our kids are generally just kind of behave well.
588
00:51:28.385 --> 00:51:31.920
Yeah. But over time, we stop reading the text messages frequently
589
00:51:32.460 --> 00:51:34.240
as they show us that they can, you
590
00:51:34.780 --> 00:51:37.040
know, behave well online.
591
00:51:38.860 --> 00:51:52.905
And there were times where we would see stuff and we're like, oh, we don't like that they were saying that. Or, man, they made a really bad choice here with this friendship, but, like, we don't wanna get involved, you know? Like, we're not gonna insert ourselves in the middle. And so you have to let your kids make mistakes and and burn themselves. So,
592
00:51:53.605 --> 00:51:55.224
you just you just kind of
593
00:51:55.684 --> 00:52:00.690
you kinda oversee them. Right? And I see AI fit in this because if your kids
594
00:52:00.990 --> 00:52:11.950
are are interacting at all with the real world, they're gonna run into AI. They're gonna be at a friend's house that has an Alexa product, and they're all gonna be chatting with it and be like, oh, you know, tell me about boobs or something. And so, like, you,
595
00:52:12.695 --> 00:52:17.355
you need to you need to teach them this. And so we wanna give parents a a tool that
596
00:52:18.615 --> 00:52:24.395
gives them insight, but also is protecting their privacy because you'd like you said, you don't wanna have your kids chatting with Sam Altman.
597
00:52:25.569 --> 00:52:28.630
Yeah. I like that. I mean, in general, I think the answer is
598
00:52:29.410 --> 00:52:35.670
empower parents. And at the end of the day, it's parents' responsibility for all of these questions, whether it's social, whether it's phones in general.
599
00:52:38.805 --> 00:52:42.585
You know, give parents the tools to parent as they see fit.
600
00:52:43.685 --> 00:52:44.425
And, ideally,
601
00:52:45.765 --> 00:52:47.464
like, the more configurability,
602
00:52:47.845 --> 00:52:53.944
the better. Right? Like, more options, the better on how parents and kids wanna approach that relationship
603
00:52:55.920 --> 00:52:57.300
makes sense to me. That's
604
00:52:58.000 --> 00:53:00.900
that, that aligns up with my world. I mean, this is,
605
00:53:02.160 --> 00:53:08.020
something that I think about all the time on the Nostr side too. Because, like, Nostr is, like, this weird thing where it's, like, a bunch of people that
606
00:53:08.825 --> 00:53:13.005
kind of hate social media are trying to build out, like, a social protocol.
607
00:53:15.065 --> 00:53:23.400
And it's, like, it's such a balancing act of how do you approach it in a healthy way. And then you constantly hear the complaints of,
608
00:53:24.340 --> 00:53:29.320
but what about the children? Right? We're seeing a bunch of age restriction laws happen around the world,
609
00:53:30.100 --> 00:53:37.355
in terms of trying to had the state trying to keep kids off of social media. And the answer is probably similar. Right? The answer is,
610
00:53:37.735 --> 00:53:41.115
well, empower parents, and then it's ultimately their responsibility
611
00:53:41.415 --> 00:53:44.235
for their kids to have a healthy relationship with their tools.
612
00:53:44.535 --> 00:53:45.035
Mhmm.
613
00:53:46.855 --> 00:53:48.555
What do you think about Ben Carman's
614
00:53:49.140 --> 00:53:52.279
idea of have being able to set your kid's system prompt?
615
00:53:53.460 --> 00:54:07.085
Yeah. No. I think that's great. I just got handed a note from my kid saying there's a wasp nest right outside our front door. So I've got a I've got a task after we're done here. Yeah. Well, we can wrap shortly, so you can go ahead and ahead. I think it's funny. That's pretty timely.
616
00:54:07.464 --> 00:54:16.700
Can Sam Altman take care of that wasp nest for you, or are you gonna have to deal with that? Yeah. No. Like, the exterminators are totally cooked. AI is gonna come take care of it for me.
617
00:54:17.880 --> 00:54:20.540
No. I think it's great. I think parents should,
618
00:54:20.840 --> 00:54:27.465
in in this, like, fictitious family plan that we wanna build, parents would have the ability to set the system prompt.
619
00:54:28.185 --> 00:54:34.845
And then, you know, again, the parents have the ability over time to be like, hey, kid. Now you can do the system prompt if you want to.
620
00:54:35.545 --> 00:54:38.365
I think also, like, kids maybe can't delete
621
00:54:38.825 --> 00:54:44.730
chats in the beginning or they go to, like, some trash folder that parents have access to. They can see like what their kid is deleting.
622
00:54:45.430 --> 00:54:58.905
And like, I'm always conflicted because I grew up during a time where my parents had no idea what I was doing. Right. I would go outside and play and we would like venture off to another neighborhood. And just as long as we came home, The street lights came on, we were good.
623
00:54:59.285 --> 00:55:10.025
And then when the internet came out and I got a dial up modem, I was in my bedroom dialing up and, you know, just doing who knows what on the computer and my parents just had no idea what was out there. Yep. So,
624
00:55:11.490 --> 00:55:17.910
I long for a day like that with my kids, but I also want to be realistic that, like, I can't just give my kids
625
00:55:18.210 --> 00:55:20.070
an unrestricted phone with unrestricted
626
00:55:20.610 --> 00:55:36.545
social media and unrestricted AI and just hope for the best and like, huck and pray. Because we've seen too many, like, awful stories of things that have gone wrong. So I hate saying it's a different world now than it was when we were kids, but, like, it it kind of is in many regards.
627
00:55:38.500 --> 00:55:39.780
Yeah. I mean, when I was,
628
00:55:41.540 --> 00:55:42.600
when I was younger,
629
00:55:42.980 --> 00:55:43.880
like, I could
630
00:55:44.820 --> 00:55:52.734
run laps around my parents with computers. Like, I could lock them out of all their bank accounts if I wanted to. They had zero control over me. Mhmm.
631
00:55:53.375 --> 00:55:55.474
If anything, I was in the control position.
632
00:55:56.255 --> 00:55:58.195
I could read all their emails and shit.
633
00:55:58.575 --> 00:56:00.355
Yeah. So it is an interesting
634
00:56:00.734 --> 00:56:08.490
thought process to think about, but it's something that I think about all the time. I see Vague Zap 9,000 sats. Vague, good to see you in
635
00:56:09.750 --> 00:56:12.650
the in the Nostr live chat. Love to have you here.
636
00:56:13.190 --> 00:56:17.289
He's a true ride or die. He's been listening to both shows for a long, long time.
637
00:56:19.265 --> 00:56:25.825
Okay. So you guys just recently launched this new developer API. What's the deal with that? Yeah. The developer API is cool.
638
00:56:28.065 --> 00:56:30.005
So what it is basically is
639
00:56:30.880 --> 00:56:31.700
we have Maple,
640
00:56:32.080 --> 00:56:36.260
and it's one user interface. And you mentioned ChatGPT, you never used it. But ChatGPT's
641
00:56:36.560 --> 00:56:42.580
user interface is better than ours. It has a lot more features in it. And it has billions of dollars being spent
642
00:56:42.975 --> 00:56:48.835
on employees to make it the best experience out there. So we obviously can't go toe to toe with it on everything,
643
00:56:49.215 --> 00:56:52.195
but there is this whole ecosystem out there of other tools
644
00:56:52.575 --> 00:56:54.275
that use ChatGPT.
645
00:56:55.050 --> 00:56:56.590
And OpenAI has developed
646
00:56:57.050 --> 00:57:03.950
this somewhat of a standard API for AI apps to use. There there are competing standards much like the XKCD comic.
647
00:57:04.410 --> 00:57:06.890
There are lots of competing standards out there. There's MCP,
648
00:57:07.985 --> 00:57:09.925
servers and there is OpenAI's
649
00:57:10.225 --> 00:57:22.405
API. We went with theirs for now. And so what you can do is you can open up the Maple app on your computer. I see BTC pins is already, like, mentioning it here, but, it's Maple proxy
650
00:57:22.839 --> 00:57:35.579
And you just hit start proxy, and it it effectively is like a VPN. I mean, it's really analogous to that. It's a VPN for your chats. And now you can go use any other tool on your computer that uses OpenAI.
651
00:57:36.405 --> 00:57:39.225
And you can say, instead of sending this to Sam Altman,
652
00:57:39.765 --> 00:57:47.145
send it to Maple, which is encrypted end to end. And so you can use a tool like Goose made by Block, which is open source,
653
00:57:47.445 --> 00:57:48.265
very powerful.
654
00:57:48.970 --> 00:57:52.490
It also is a Swiss army knife that is difficult to learn at times.
655
00:57:52.890 --> 00:57:56.830
But you can use Goose and totally encrypt all of your AI traffic now.
656
00:57:57.290 --> 00:58:04.785
You can use something like Jan AI, which is also really powerful, kind of similar to Goose. And there's there's tons of them. Let's just do there's, like, a whole list of of tools.
657
00:58:05.984 --> 00:58:21.800
And that's for the end user. But then where it also gets really cool is it's a developer API. And so if you go on any app store and search for AI apps, you're gonna find just like hundreds and thousands of apps that are that are AI powered. Maybe it's a calorie counter that's doing food tracking
658
00:58:22.260 --> 00:58:26.760
and all it's doing is it's just a wrapper. That's just pumping all of your information to ChatGPT
659
00:58:27.460 --> 00:58:28.920
as its AI brain.
660
00:58:29.380 --> 00:58:36.185
And so what we have provided now is someone, if someone's making an app like that and wants to be a little more privacy aware,
661
00:58:36.485 --> 00:58:39.065
they could simply just take the Maple proxy
662
00:58:40.165 --> 00:58:48.790
and run it on their server and then drop it in and now have their app talk to that. And they don't have to change any code other than having the, you know, code point toward their server.
663
00:58:49.170 --> 00:58:55.190
But other than They're just, like, changing the server endpoint, basically. Just change the server endpoint to talk to themselves instead of OpenAI.
664
00:58:55.810 --> 00:58:59.910
And now they get privacy built in and they don't have to change their user interface.
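The swap described above is just a change of server endpoint: an app that already speaks the OpenAI chat-completions API keeps its payload the same and only points at a different base URL. A minimal sketch, assuming a hypothetical local proxy address and API key (the real URL, port, and key come from the Maple app, not from this example):

```python
import json

# The whole trick: same OpenAI-style request, different base URL.
# Both values below are illustrative placeholders, not Maple's
# documented endpoint or a real key.
OPENAI_BASE = "https://api.openai.com/v1"
MAPLE_PROXY_BASE = "http://localhost:8080/v1"  # assumed local Maple proxy

def chat_request(base_url: str, api_key: str, model: str, prompt: str):
    """Build an OpenAI-compatible chat-completions request for any base URL."""
    url = f"{base_url}/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }).encode()
    return url, headers, body

# Identical payload either way; only the endpoint (and key) differ.
url, headers, body = chat_request(
    MAPLE_PROXY_BASE, "maple-key", "deepseek-r1",
    "Tell me a story about clouds.",
)
```

Because the request shape is unchanged, a client app never needs to know Maple exists; it just gets handed a different URL.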
665
00:59:00.395 --> 00:59:06.335
So, like, basically, like, there's, like, fitness apps and shit out there that are just using OpenAI as the back end,
666
00:59:06.875 --> 00:59:10.974
and they're they're like a front end app for whatever specific niche they're offering.
667
00:59:11.275 --> 00:59:13.295
Yeah. Fitness apps and oh, go ahead.
668
00:59:13.980 --> 00:59:41.765
No. Go on. Go on. Continue. Yeah. No. I mean, I mean, the biggest one is the relationship apps. So you have I can't remember the names right off the top of my head, but, like, there's the one for girls who just want to have, a relationship buddy, and there's one for guys who wanna have a relationship buddy. Like character AI? Is that character AI or is that something different? That's a little different. Character AI, I mean it does that, but these are like specifically like I want my AI girlfriend and I want my AI boyfriend. That's like a big thing now? Oh, it's huge.
669
00:59:42.110 --> 00:59:42.430
And,
670
00:59:42.910 --> 00:59:50.290
there were two that were really big and they both got their databases hacked. And so all these people who were, like, having really salacious conversations
671
00:59:50.830 --> 00:59:55.730
with their AI girlfriend, you know, their stuff is now out there on on the Internet
672
00:59:56.095 --> 00:59:58.115
for, the highest bidder to grab.
673
00:59:58.735 --> 01:00:03.455
But those a lot of those are just piping through to ChatGPT because it's easiest. Right?
674
01:00:04.175 --> 01:00:10.890
They could run a model on your device if they want to. They could put a smaller model on there and some of them do. So it's gonna be slower
675
01:00:11.430 --> 01:00:17.690
and it does have better privacy, but most of them just go through to ChatGPT and they pay a developer fee to do that.
676
01:00:17.990 --> 01:00:31.995
And so now you're trusting the app developer and you're trusting ChatGPT with all your data when you're using that app. Yeah. I mean, the big thing with the self hosting is not only the cost, not only the friction of actually self hosting, but the slowness. And, like,
677
01:00:32.775 --> 01:00:41.780
it really does add up. Like, it doesn't have to be that much slower to, like, if you're using it on a daily basis constantly and you're, like, waiting for the thing to answer you,
678
01:00:44.160 --> 01:00:53.345
it's just a massive disadvantage. This is the one of the reasons why I like the Maple trade off. Okay. So let's just so you have the developer API. And if if you're a developer making
679
01:00:53.805 --> 01:00:54.305
an
680
01:00:54.765 --> 01:01:21.885
app, does it like, I guess, like, in the developer tools, they're choosing which model they're using. Right? It's like they can use any of the Maple supported models. Right? Yep. Yeah. So if they wanted to have the closest experience to what they're already doing, they can pick the OpenAI model that we provide. And then they can put, like, a new system prompt in there that, like, everything goes through a special prompt and stuff for Yeah. Yeah. They can add that all in there if they want to. They still have all the control that they had before
681
01:01:22.265 --> 01:01:22.765
except,
682
01:01:23.305 --> 01:01:37.230
OpenAI is still going to do it's gonna muck with their data. Right? It has its own guardrails and censorship that it does, whereas Maple doesn't. We're totally open. They can go see what we do. So they actually might get a I think in some ways, they'll get a better experience
683
01:01:37.690 --> 01:01:38.190
because
684
01:01:38.570 --> 01:01:46.035
they are going to have raw access to the model that's there. And, they can have complete customization and control over it.
685
01:01:46.515 --> 01:01:47.474
That's awesome. Yeah.
686
01:01:48.994 --> 01:01:53.954
And then there's there's, like, a third category that's in between those two. So if people are listening to this,
687
01:01:54.914 --> 01:01:58.454
if you, like, are a business owner and you're using some tool internally,
688
01:01:59.319 --> 01:02:12.915
we have, you know, let's, let's say there's a company that wants to do like a support portal that has AI infused in it. That support software probably has a ChatGPT component to it. So you as a business, as a, like a Maple team user
689
01:02:13.295 --> 01:02:22.994
could bring the Maple proxy into that arrangement and could say, Hey, instead of giving all of our support stuff over to ChatGPT, I want you to use the Maple proxy.
690
01:02:23.390 --> 01:02:36.990
And then that company doesn't have to know about Maple. They don't have to change anything, and you don't have to pay for a special integration service or professional services. You simply just provide them with the URL to go to that. So you can privatize
691
01:02:37.565 --> 01:02:38.625
your internal data.
692
01:02:41.165 --> 01:02:41.665
Awesome.
693
01:02:42.685 --> 01:02:43.585
I want to
694
01:02:45.165 --> 01:02:45.905
break down
695
01:02:46.205 --> 01:02:46.705
the
696
01:02:47.645 --> 01:02:49.585
so on the Maple proxy side,
697
01:02:50.330 --> 01:02:54.430
how does that work? Is am I running something locally on the same machine
698
01:02:54.890 --> 01:03:01.470
as the the first the first option with, like, the user focused? I'm using a different front end like Goose or something.
699
01:03:02.090 --> 01:03:04.190
Does that are you giving me, like,
700
01:03:05.315 --> 01:03:09.974
like, a URL endpoint, like an API endpoint, or am I running something locally on my machine,
701
01:03:10.595 --> 01:03:18.454
like, a Maple proxy app attached to that? Yeah. I mean, do you wanna do you want me to show you? I've I've got it right here. Let's do it. Let's show let's show us. Do that.
702
01:03:19.359 --> 01:03:23.220
So I'll do it with Goose and You're good. You don't need to go kill wasps?
703
01:03:23.600 --> 01:03:31.700
No. Actually, I know about the wasp nest. It's a mud dauber, and anybody knows what mud dauber are, they're, like, totally harmless. So I don't have to worry about it.
704
01:03:32.265 --> 01:03:32.744
Okay.
705
01:03:33.065 --> 01:03:36.045
Let me share my entire screen, which is always dangerous.
706
01:03:36.505 --> 01:03:43.645
Here we go. Let me turn on I won't I won't pull it up until you're sure that you're not Okay. Doxing a bunch of information.
707
01:03:44.184 --> 01:03:44.684
Alright.
708
01:03:45.305 --> 01:03:45.964
I am
709
01:03:46.390 --> 01:03:48.570
I'm good now. You can go ahead and share it.
710
01:03:49.030 --> 01:03:49.530
Okay.
711
01:03:51.030 --> 01:03:52.570
Let's pull this up.
712
01:03:54.070 --> 01:03:54.890
There we go.
713
01:03:55.270 --> 01:03:55.770
Okay.
714
01:03:56.310 --> 01:04:00.135
So on the left here, I've got Maple open, and on the right, I've got Goose.
715
01:04:00.695 --> 01:04:06.875
Okay. So you can see what I was chatting about. I was grilling corn on the cob on Saturday, a nice little side dish for the
716
01:04:07.495 --> 01:04:08.715
the pork I was smoking.
717
01:04:09.335 --> 01:04:17.770
So if you go in here, like, this is just the Maple desktop app that everybody can download on our website. And when you go to API access, you just go to local proxy
718
01:04:18.470 --> 01:04:24.970
and it's just got this default stuff configured here. And you can tell it to auto start if you want to every time you open Maple.
719
01:04:25.270 --> 01:04:27.210
But, I'm just gonna hit start proxy.
720
01:04:27.565 --> 01:04:32.065
And then I copy this information over and paste it into Goose.
721
01:04:32.765 --> 01:04:39.265
Another thing is I had to generate an API key, which I already did, but you generate an API key and I had to buy some credits.
722
01:04:39.650 --> 01:04:44.630
So, I bought some credits, and now I can use So these credits are separate from
723
01:04:45.650 --> 01:04:46.150
my
724
01:04:46.530 --> 01:04:48.230
yearly subscription or whatever?
725
01:04:48.530 --> 01:04:57.845
Yes. Yeah. They're separate. Your annual subscription gets you access to the chat product and gets you basically a developer account with Maple and you can come in and buy credits.
726
01:04:58.305 --> 01:05:04.164
There could be some time down the road where, like, pro plans come bundled with a certain amount of credits automatically
727
01:05:04.545 --> 01:05:14.560
or we offer some kind of smaller developer account. We haven't gotten there yet. We're a small company, still building. But for now, it's it's available to pro users and up.
728
01:05:15.580 --> 01:05:26.405
So I've got my credits. I started the proxy, and then I can come in here and in the settings, you can configure your providers and they they have all these different providers. And I went to OpenAI
729
01:05:26.865 --> 01:05:29.505
and I simply just changed it over That's awesome.
730
01:05:29.905 --> 01:05:44.200
Add to that. And now that it's changed over, I can come in here and let's go do a new chat. And So, like, it literally thinks you're using OpenAI. It has no the actual front end has no idea. Yeah. So I select OpenAI, but it only has our models. It does not have OpenAI's
731
01:05:44.580 --> 01:05:46.680
models. So I can do DeepSeek,
732
01:05:47.335 --> 01:05:48.155
select model,
733
01:05:48.935 --> 01:05:50.395
and I can say, you know,
734
01:05:51.015 --> 01:05:52.155
tell me a story
735
01:05:53.335 --> 01:05:56.555
about That's awesome. Yeah. About whatever. About clouds.
736
01:05:57.495 --> 01:06:05.400
And it's going to do its little thinking thing that that DeepSeek does so you can see how it's thinking through things and then
737
01:06:05.779 --> 01:06:07.400
starts writing the story for me.
738
01:06:08.180 --> 01:06:09.619
And that's it. It's pretty simple.
739
01:06:10.099 --> 01:06:15.255
I would love to get it to where when you go into settings, it doesn't look like OpenAI. It actually says MapleAI on it,
740
01:06:15.734 --> 01:06:26.234
but you can do that with other things. I've got JanAI open here too, and it's doing the same thing. You know, I was doing tests earlier, so I'm I've got JanAI set up the same way with GPT
741
01:06:26.694 --> 01:06:27.194
OSS.
742
01:06:27.869 --> 01:06:29.010
So I can say hello,
743
01:06:29.470 --> 01:06:33.410
and this is going through the proxy. In fact, I can show you because if I stop the proxy
744
01:06:34.269 --> 01:06:37.250
and then say hello again, it's gonna time out
745
01:06:37.630 --> 01:06:38.849
error. That's cool.
746
01:06:39.150 --> 01:06:44.125
Yeah. So it's fully encrypted going through, our back end using your own account.
747
01:06:45.545 --> 01:06:46.445
That's awesome.
748
01:06:47.065 --> 01:06:47.565
Yeah.
749
01:06:48.425 --> 01:06:54.204
That's quite powerful. And then so then you don't have I can turn off your screen share. So then you don't have you don't have to deal
750
01:06:55.065 --> 01:07:07.390
with like, people can just use the UI they want, and they get the privacy and security guarantees, basically. Right? That's it at the end of the day. Yeah. So one of the biggest requests that people have is they want to have web search inside of Maple.
751
01:07:08.170 --> 01:07:10.829
I've I've heard some people say that. Yeah.
752
01:07:11.435 --> 01:07:17.615
So in theory, you could use another tool that already does web search and you wanna go to Is there one you could recommend?
753
01:07:18.555 --> 01:07:24.415
I need to get I need to figure out how to get Goose to do it. I know Goose can, but it's such a hard tool to use.
754
01:07:25.180 --> 01:07:26.960
Goose team, if you're listening, like,
755
01:07:27.340 --> 01:07:28.640
make your product easier.
756
01:07:29.180 --> 01:07:33.420
And maybe I could be the change I wanna see in the world and submit some more requests to it. But
757
01:07:34.060 --> 01:07:36.480
If I if someone wanted to vibe code,
758
01:07:36.895 --> 01:07:43.235
which model is the best that Maple has right now? I mean, I like, people love Claude, but Claude is proprietary.
759
01:07:44.015 --> 01:07:50.115
Yeah. So we have Qwen3 Coder, which is, like, the the best open source Who is that? Is that Alibaba? That's Alibaba.
760
01:07:50.870 --> 01:07:53.370
And in some metrics, it's like as good
761
01:07:54.150 --> 01:07:54.730
as OpenAI
762
01:07:55.110 --> 01:07:55.850
and Claude,
763
01:07:56.790 --> 01:07:59.370
in some things. Right? Or it's really close behind
764
01:07:59.670 --> 01:08:03.935
Claude for I think it's Sonnet or Opus. I I can't remember right now.
765
01:08:04.735 --> 01:08:07.315
That's, like, the gold standard that everybody's going off of now.
766
01:08:07.935 --> 01:08:11.315
So it's it's the best, but it's also super expensive to use.
767
01:08:12.335 --> 01:08:17.250
And then You could spend thousands of dollars in credits every month using it. And they're harvesting all your information.
768
01:08:17.630 --> 01:08:20.610
Yeah. I know. It's crazy that Claude basically
769
01:08:21.230 --> 01:08:26.370
most people are using it to code now. And so it has, like, all the proprietary software out there.
770
01:08:27.070 --> 01:08:40.105
Like, in flight, it's it's recording everything that people are doing. So all these companies that care about their IP and, like, have have access controls around their repos internally, like, they're they're still sharing it with Claude in the process.
771
01:08:40.610 --> 01:08:43.030
And then, like, they also, like, have like,
772
01:08:43.409 --> 01:08:44.150
talk about
773
01:08:44.770 --> 01:08:47.510
having have to have policies among your team because
774
01:08:48.050 --> 01:08:52.550
there could just be a random dev on your team that's using it. You you might not even realize.
775
01:08:52.965 --> 01:08:53.625
Oh, yeah.
776
01:08:53.925 --> 01:08:56.745
Cisco did a study last year, and
777
01:08:57.205 --> 01:08:58.885
they found that 48%
778
01:08:58.885 --> 01:09:02.345
of employees were sharing company information
779
01:09:02.885 --> 01:09:03.765
with AI tools.
780
01:09:05.430 --> 01:09:26.465
And it didn't differentiate between, like, whether or not their companies knew it, but they're doing it. And I've talked to plenty of people in person who say, yeah, my my company told me not to do this, but I'm just hoping that they don't know because because people wanna do good work. They wanna, like, get a promotion. You know? They wanna do their best, and so they're gonna use the tools that help them be most effective.
781
01:09:28.685 --> 01:09:30.304
Are you familiar with
782
01:09:31.180 --> 01:09:33.520
the work Alex Gleason is doing?
783
01:09:34.540 --> 01:09:35.340
Is that the,
784
01:09:36.060 --> 01:09:37.280
Shakespeare? Shakespeare.
785
01:09:37.820 --> 01:09:39.120
Yeah. And and MKStack.
786
01:09:40.140 --> 01:09:47.355
Yeah. So I listened to your dispatch with him, what, a week or two ago? That's that's the most I've gotten into it. Just listen to that one.
787
01:09:50.695 --> 01:09:52.075
MKStack is cool.
788
01:09:52.535 --> 01:09:53.515
So what is MKStack?
789
01:09:54.855 --> 01:09:56.075
I mean, it's open.
790
01:09:56.880 --> 01:09:59.380
I don't think it's an LLM. Is it like,
791
01:10:00.960 --> 01:10:01.440
is it
792
01:10:02.480 --> 01:10:05.460
I I don't know. I'm such a noob when it comes to AI stuff.
793
01:10:06.640 --> 01:10:13.605
I think it's like, what do they call them? Is MCP, like, the type of format or something? So MCP is what,
794
01:10:15.205 --> 01:10:17.385
Anthropic came out with as their API
795
01:10:17.765 --> 01:10:19.785
for AI tools to talk to each other.
796
01:10:21.140 --> 01:10:26.040
So a lot of tools now have, like, an MCP interface on them, Model Context Protocol,
797
01:10:26.980 --> 01:10:33.480
interface that they can His GitLab says it's the complete framework for building Nostr clients with AI.
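The "MCP interface" mentioned here can be made concrete. As a rough sketch (the `get_note` tool is a made-up example, not a real MKStack or Maple tool; the field names follow the published Model Context Protocol convention of `name`, `description`, and a JSON-Schema `inputSchema`):

```python
# Minimal sketch of what an MCP server advertises to an AI client.
# Each tool has a name, a human-readable description, and a JSON
# Schema describing its inputs. The tool itself is hypothetical.
note_tool = {
    "name": "get_note",
    "description": "Fetch a Nostr note by its event id.",
    "inputSchema": {
        "type": "object",
        "properties": {"event_id": {"type": "string"}},
        "required": ["event_id"],
    },
}

def tools_list_response(tools):
    """Shape of a reply to the MCP 'tools/list' request:
    the model reads this list and decides which tool to call."""
    return {"tools": tools}
```

This is how "AI tools talk to each other": the client model sees the schema, then emits a call like `{"name": "get_note", "arguments": {"event_id": "..."}}` back to the server.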
798
01:10:34.155 --> 01:10:39.135
MKStack is an AI-powered framework for building Nostr applications with React, Tailwind,
799
01:10:40.715 --> 01:10:42.415
Vite, shadcn/ui,
800
01:10:42.875 --> 01:10:43.614
and Nostrify.
801
01:10:44.474 --> 01:10:48.895
Okay. But I think it's independent of the model. I don't know. I was gonna just
802
01:10:49.950 --> 01:10:57.230
Yeah. I'd be curious what Shakespeare is using on the back end. Is it using Claude or one of these services, or are you bringing your own PPQ? That's
803
01:10:58.110 --> 01:11:02.450
My understanding is it's proxying into one of the big dogs.
804
01:11:03.055 --> 01:11:17.500
Yeah. I think I think Claude is what they're using in the back end. Okay. But it'd be cool to see you guy I don't know. Like, there's some crossover potential there. That's as far as I'm going with my feedback. I have no idea what I'm talking about. Yeah. I'm just a humble Maple user.
805
01:11:19.160 --> 01:11:33.315
I I have reached out to PPQ. We've chatted with Matt quite a bit, especially when we were doing the OpenSecret stuff earlier on. So he's aware of Maple and Maple Proxy, and we've talked about hopefully getting the Maple Proxy up on PPQ.
806
01:11:33.695 --> 01:11:46.130
So that people could could use it because we get I mean, everybody wants to use Maple but pay as they go. Like, they love the PPQ interface for privacy. And I would love to offer it too, but that's just not where we're focusing right now.
807
01:11:46.750 --> 01:11:51.170
So I I see a nice middle ground where PPQ just proxies through to Maple.
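Maple Proxy is described elsewhere in this episode as an OpenAI-compatible endpoint, which is what would make a PPQ-style pass-through possible. As a rough illustration only (the URL, model name, and key below are placeholders, not real values from Maple or PPQ), a client speaking that wire format might look like:

```python
import json
import urllib.request

# Hypothetical values -- the real proxy URL, model name, and auth
# scheme are assumptions for illustration, not documented facts.
PROXY_URL = "https://example-proxy.invalid/v1/chat/completions"
API_KEY = "sk-placeholder"

def build_chat_request(prompt: str, model: str = "llama-3") -> dict:
    """Build an OpenAI-style chat-completions payload, the common
    denominator that lets one service proxy through to another."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }

def send(prompt: str) -> bytes:
    """POST the payload to the proxy. This will not succeed against
    the placeholder URL above; it is shown for the request shape."""
    req = urllib.request.Request(
        PROXY_URL,
        data=json.dumps(build_chat_request(prompt)).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {API_KEY}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return resp.read()
```

Because both sides speak the same schema, a pay-per-request front end only has to swap the base URL and key to route traffic to a different back end.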
808
01:11:51.550 --> 01:11:57.165
That's true. Like, I pay for the year in Bitcoin, and I just leave a bunch of tokens on the table. Like, I'm not
809
01:11:58.265 --> 01:12:02.605
using I'm not using enough, probably. Like, I'm using it to support the mission
810
01:12:03.145 --> 01:12:04.345
Right. More than,
811
01:12:05.225 --> 01:12:14.230
if from a price point of view, I'm probably better off using it on a pay-per-token basis. So, like, if I I see what you chat about. I can see you could use it more.
812
01:12:14.770 --> 01:12:18.630
Well, you see my time stamps at least. You know my time stamps. Uh-huh.
813
01:12:19.410 --> 01:12:19.810
I,
814
01:12:20.290 --> 01:12:26.625
well, actually, I don't think you know which user I am now. I don't know who you are. No. In fact, I'll see, like, some low key,
815
01:12:27.324 --> 01:12:38.864
influencer online be like, oh, I love Maple. And it's like, oh, I had no idea that you're using it. So, it's it's kinda cool. Well, so I because I know I'm gonna get shamed on this. I prefer to pay with Bitcoin.
816
01:12:41.560 --> 01:12:44.140
But the reason I originally signed up
817
01:12:45.000 --> 01:12:51.180
with card was because it was, like, it was kind of I was being lazy, and it was kind of hard to find the Bitcoin payment option.
818
01:12:52.305 --> 01:13:06.165
That is very lazy because it's right there on the front. No. It wasn't on Apple. It was like, you had to, like, leave the ecosystem. Like, it was just like an app I did, like, Apple Pay just, like, boom. Go.
819
01:13:06.510 --> 01:13:17.890
It's it's true. It was turned off in the early days on there. Yeah. Eventually, we got it turned on. And so then I went to so then Tony shamed me, and then I went to desktop to pay with Bitcoin. And I intentionally, like, waited a few days
820
01:13:18.365 --> 01:13:21.985
because, like, I was like, I gotta make sure that I'm, like, fully disconnected.
821
01:13:22.445 --> 01:13:28.225
Yeah. That's good. But, anyway, you got my time stamps. We had zero sign ups between then and when you signed up.
822
01:13:28.525 --> 01:13:33.175
There you go. I'm just Well, no. I made sure to onboard my parents. Oh, there you go. And then I,
823
01:13:34.739 --> 01:13:37.719
This is a great conversation. Last but not least, let's,
824
01:13:38.260 --> 01:13:40.119
before we wrap here, let's talk about
825
01:13:41.139 --> 01:13:41.639
OpenSecret,
826
01:13:41.940 --> 01:13:45.699
the company. What is OpenSecret, and what is Maple? Where do they
827
01:13:46.420 --> 01:13:49.145
how how do they relate to each other? Yeah.
828
01:13:49.764 --> 01:14:00.025
Great question. OpenSecret is the foundation. It is the developer back end for Maple, and it all runs in secure enclaves. And so the concept is
829
01:14:00.560 --> 01:14:09.220
all these apps that you use on your phone or any website you go to is just some server running somewhere else in a data center, and you're giving it all of your information.
830
01:14:09.840 --> 01:14:24.525
And so they just have this database with everybody's stuff in it. And that's why data leaks happen all the time: the database will get hacked, and it just has, like, tons and tons of rows of information on everybody. And any employee in the company can see that information as well. So OpenSecret
831
01:14:25.385 --> 01:14:32.040
instead takes each user and puts them in their own private data vault and encrypts it with a private key for that user.
832
01:14:32.760 --> 01:14:35.340
But, it makes the UX really easy
833
01:14:35.640 --> 01:14:51.255
because private keys are scary. They're difficult to manage. People don't want to do it because now they have the responsibility on themselves to not lose that private key and then lose access to everything. So we're using secure enclaves to generate that private key and then encrypt from your device through to the database.
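The "one private key per user" idea described above can be sketched in a few lines. This is a toy illustration only: the XOR keystream below is not real cryptography (a production system would use an AEAD cipher), and in the design described here the key material lives inside a secure enclave, never in application code.

```python
import hashlib
import secrets

def new_user_key() -> bytes:
    """Each user gets an independent random key for their own vault."""
    return secrets.token_bytes(32)

def _keystream(key: bytes, length: int) -> bytes:
    """Derive a deterministic byte stream from the key (toy construction)."""
    out = b""
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    """What the database stores: ciphertext, useless without the key."""
    return bytes(a ^ b for a, b in zip(plaintext, _keystream(key, len(plaintext))))

decrypt = encrypt  # XOR is its own inverse

# A breach of the stored rows alone reveals nothing readable:
alice, bob = new_user_key(), new_user_key()
row = encrypt(alice, b"alice's chat history")
assert decrypt(alice, row) == b"alice's chat history"  # right key
assert decrypt(bob, row) != b"alice's chat history"    # wrong key, garbage
```

The point of the per-user split is exactly the last two lines: leaking the whole table leaks nothing, and one user's key unlocks only that user's rows.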
834
01:14:51.795 --> 01:15:02.690
So that's what OpenSecret is, and then you can build apps on top of it. So we've got Maple on there. We have a really sweet Cashu wallet that hasn't launched yet. Is that what Cali was teasing?
835
01:15:02.990 --> 01:15:04.910
That's the tease that he threw out there.
836
01:15:05.950 --> 01:15:12.370
So yeah. So I'm really excited for that. In fact, I've I've used it to pay for all sorts of stuff and it's working phenomenal,
837
01:15:13.445 --> 01:15:14.425
phenomenally well.
838
01:15:15.525 --> 01:15:22.585
And so that's that's really what OpenSecret is. We've kind of toned it back a little bit. We had a lot of developer interest in the beginning,
839
01:15:22.965 --> 01:15:35.070
but Maple started to grow and take off. And especially as we're looking at raising money in a seed round soon, we're trying to look for, like, what's the best story? What's going to get, you know, the best traction with people? And
840
01:15:35.449 --> 01:15:36.670
talking about a private
841
01:15:37.369 --> 01:15:50.684
data, you know, back end system for app developers is a bit harder of a story to go through, whereas, like, private AI, the private version of ChatGPT or the Signal of ChatGPT, like, that's so much easier for people to grok, and they can use it. And I think that's a big part is,
842
01:15:51.244 --> 01:15:58.289
you know, investors being able to play with it. Yeah. I mean, that was the whole point of Maple in the beginning, right, was that it was trying to prove out
843
01:15:58.750 --> 01:16:03.730
the OpenSecret model. Right? It's like, look. You can build apps on it. You can build secure,
844
01:16:05.710 --> 01:16:06.510
hosted apps,
845
01:16:08.065 --> 01:16:27.860
that are very user forward and very easy to use. And it was a hard pitch to make to people. So, like, oh, you make Maple. And it's like, look. If you use Maple, this is using OpenSecret as the back end. Yeah. And it was a direct response to developer feedback. So we met with, like, I don't know, 10 or so developers when we were thinking about OpenSecret
846
01:16:28.240 --> 01:16:39.985
and half of them said, yeah, we would love it. And half of them said, I don't know, maybe, maybe not. But almost all of them said, if you had a private AI, like ChatGPT but built on OpenSecret, we would totally use it. So Interesting. That's where it was born.
847
01:16:40.605 --> 01:16:41.005
I,
848
01:16:42.445 --> 01:16:50.045
but so, I mean, based on this Cashu wallet, like, how can developers just, what, reach out? Is it, like, opensecret.com
849
01:16:50.045 --> 01:16:52.180
or something? It's opensecret.cloud.
850
01:16:52.180 --> 01:16:53.400
We do have a
851
01:16:54.340 --> 01:16:57.320
oh, it's a it's a crappy address. Opensecret.cloud.
852
01:16:58.260 --> 01:17:00.280
People can sign up for the waiting list.
853
01:17:00.660 --> 01:17:08.225
And then we also have a Discord they can join. It's on the website. They can click in there and chat with us. But we are not onboarding any new developers right now,
854
01:17:08.685 --> 01:17:14.705
because we're so focused on Maple. Got it. So don't reach out. You could. Right? You could. But,
855
01:17:16.685 --> 01:17:27.949
I find a lot of the developers that contact us actually want Maple as their developer tool. So now that we have the proxy, like, that's gonna satisfy a lot of their needs. They just want to talk to private AI.
856
01:17:28.329 --> 01:17:29.469
But, eventually,
857
01:17:29.849 --> 01:17:34.935
we will probably circle back around to OpenSecret. I don't know when that time will be, but I think that,
858
01:17:35.735 --> 01:17:38.795
when the market is there for it, we can we can basically
859
01:17:39.175 --> 01:17:40.235
productize that
860
01:17:40.695 --> 01:17:42.075
and bring on more developers.
861
01:17:42.615 --> 01:17:48.155
Yeah. I mean, or maybe not. I mean, the story of Slack to me is fascinating. You know that story. Right?
862
01:17:48.679 --> 01:17:51.500
Oh, man. Refresh me. I read it a long time ago.
863
01:17:52.119 --> 01:17:56.619
It was like some random video game company that nobody's ever heard of,
864
01:17:56.920 --> 01:17:59.900
and they just built Slack for their internal comms.
865
01:18:00.360 --> 01:18:02.940
Okay. And then, like, Slack turned into a multibillion
866
01:18:03.320 --> 01:18:04.575
dollar juggernaut,
867
01:18:05.115 --> 01:18:12.175
and no one knows the video game company. Like, the actual product turned out it was just the way they were communicating while they were developing a game. Yeah.
868
01:18:13.275 --> 01:18:15.935
So, you know, it could be one of those stories. It's interesting.
869
01:18:16.330 --> 01:18:20.429
I mean, and then it it's compounded on top of that even more, right, because,
870
01:18:21.690 --> 01:18:24.910
the company started as Mutiny and then turned into OpenSecret.
871
01:18:25.690 --> 01:18:27.550
And I don't know. It's a pretty awesome journey.
872
01:18:28.170 --> 01:18:31.145
Yeah. I'd love to see it. I see, SoapMiner
873
01:18:31.445 --> 01:18:49.200
zapped us 21,000 sats. Very interesting convo. Appreciate you both. Marks, have you used have you used SoapMiner soap yet? I have not. You guys talk about them all the time. I need to get them. It's the best soap. If your birthday's coming up, don't say so on the stream. K. I'll get you I'll get you a bar of soap. It's great. K. It's a good gift.
874
01:18:50.300 --> 01:18:51.520
Marks, this was awesome.
875
01:18:52.380 --> 01:18:55.040
Thank you for finally coming on. It's been a long time coming.
876
01:18:56.060 --> 01:18:57.920
I'd love to do this, you know,
877
01:18:58.415 --> 01:19:00.915
maybe every six months or so, like, do a
878
01:19:01.215 --> 01:19:04.995
a check-in. I'm trying to do a little bit the freaks have probably noticed. Like,
879
01:19:05.295 --> 01:19:07.075
I try and intersperse
880
01:19:07.455 --> 01:19:08.835
Bitcoin shows with
881
01:19:09.535 --> 01:19:11.315
Nostr and AI shows.
882
01:19:12.500 --> 01:19:14.280
So I'd love to have you on more often.
883
01:19:14.580 --> 01:19:27.735
Yeah. That'd be great. I mean, the the AI space is changing rapidly, so we can just update every once in a while. Yeah. And you guys are shipping fast too. Like, freaks, like, the best way you can support what they're doing is by subscribing.
884
01:19:30.195 --> 01:19:33.495
It goes a long way. They're a small lean team. It's just him and Tony.
885
01:19:33.875 --> 01:19:34.355
Yeah.
886
01:19:34.675 --> 01:19:35.655
It's pretty impressive.
887
01:19:37.480 --> 01:19:40.460
Anyway, before we wrap, do you have any final thoughts for the audience?
888
01:19:40.840 --> 01:19:46.380
I want to I want to go see if there are any questions on Nostr that we forgot since we talked about Nostr so much.
889
01:19:47.159 --> 01:19:49.980
Somebody asked why we removed document upload.
890
01:19:50.565 --> 01:20:07.790
We were really sad to remove it, but it was such a flaky service, and it kept going down. And we care about our users and their experience, and so we turned it off until we can build it right. So we're doing a bunch of rearchitecting right now of Maple to make it even better. And then that will be a foundation
891
01:20:08.410 --> 01:20:11.230
for web search and for document upload and other things.
892
01:20:12.090 --> 01:20:19.985
I really want web search. Oh, yeah. That's yes. It's coming. We've already got a proof of concept with Kagi. So that'll be coming.
893
01:20:20.685 --> 01:20:22.365
Let's see. The other one was
894
01:20:23.805 --> 01:20:24.925
can I get,
895
01:20:25.645 --> 01:20:28.065
I'm a free user. Can I have access to more models?
896
01:20:28.525 --> 01:20:29.025
And
897
01:20:29.725 --> 01:20:31.825
Just pay. Maybe. Just pay.
898
01:20:32.445 --> 01:20:33.265
Sign up.
899
01:20:33.920 --> 01:20:38.900
Somebody else asked if we're gonna bring back our cheaper account. We did have a starter account and we got rid of it.
900
01:20:39.520 --> 01:20:45.860
I don't know if we will, just just sign up for now and support the project. You can sign up for just a month if you want to
901
01:20:46.505 --> 01:20:48.045
And maybe down the line,
902
01:20:48.505 --> 01:20:49.965
we will. But who knows?
903
01:20:50.265 --> 01:20:52.905
What is it? It's it's $20
904
01:20:52.905 --> 01:20:53.565
per month.
905
01:20:54.505 --> 01:21:00.489
$20 a month. If you pay for a year with Bitcoin, then you get 10% off. Then it's $216.
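The pricing quoted here checks out arithmetically, and it also explains the "$18 a month" figure that comes up near the end of the episode:

```python
# Verify the quoted pricing: $20/month, 10% off for a year paid
# in Bitcoin. All figures are from the conversation itself.
monthly_usd = 20
annual_list = monthly_usd * 12          # $240 at the monthly rate
annual_bitcoin = annual_list * 9 // 10  # 10% discount -> $216
effective_monthly = annual_bitcoin // 12  # $18/month effective

print(annual_list, annual_bitcoin, effective_monthly)  # 240 216 18
```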
906
01:21:00.489 --> 01:21:03.710
You get every model that you guys support. Right? Yep.
907
01:21:06.330 --> 01:21:12.110
And then if you're a business or you work at a company and you wanna bring private AI to your company, we have a team plan.
908
01:21:12.645 --> 01:21:17.465
And it starts as small as two users and can scale up to the size of your entire company.
909
01:21:18.005 --> 01:21:26.105
And so it doesn't have to replace the other AIs. I keep telling people this. Like, I'm not out here saying stop using ChatGPT, stop using Claude,
910
01:21:26.540 --> 01:21:34.000
but add a privacy version to your toolkit as well. So that way when you're reaching for the right tool, you have everything at your disposal.
911
01:21:35.660 --> 01:21:36.880
What about ImageGen?
912
01:21:38.895 --> 01:21:40.355
That is such a tough topic.
913
01:21:40.895 --> 01:21:42.675
We don't have that. And
914
01:21:43.455 --> 01:21:49.955
think of the kids, man. Like, think of the kind of content that can be generated with ImageGen in a fully private model.
915
01:21:51.170 --> 01:21:56.869
We haven't figured out how to how we wanna handle stuff that is maybe just, like, really, really awful in the world.
916
01:21:57.330 --> 01:21:57.830
So
917
01:21:58.530 --> 01:22:00.150
I don't know what we're gonna do there yet.
918
01:22:00.530 --> 01:22:03.989
But we have so many other big things that we wanna build before
919
01:22:04.315 --> 01:22:09.135
even getting to that. Right? Like, we don't have web search and stuff. So we'll have to tackle that at some point.
920
01:22:09.514 --> 01:22:12.974
But for now, there are plenty of other great ways to generate images,
921
01:22:13.355 --> 01:22:14.414
that you can use.
922
01:22:14.795 --> 01:22:17.295
So Sure. I think I think I'll just kinda leave it at that.
923
01:22:17.994 --> 01:22:19.614
What do you think of the,
924
01:22:22.770 --> 01:22:23.989
like, the people making,
925
01:22:27.330 --> 01:22:29.350
like, videos from still images?
926
01:22:30.449 --> 01:22:32.949
Oh, yeah. I mean, those are fun. They're really cool.
927
01:22:34.735 --> 01:22:40.675
I think a lot of these things are just, like, hobbies right now. People are playing with them just for fun as, like, a means of expression.
928
01:22:41.295 --> 01:22:41.795
So
929
01:22:42.095 --> 01:22:43.075
I think it's cool.
930
01:22:43.695 --> 01:22:48.515
But have you seen, like, the argument that, like, people are just, like, making fake memories? Like, they're taking
931
01:22:49.050 --> 01:22:50.270
old photos of,
932
01:22:50.730 --> 01:22:54.990
I don't know, like, a father that passed away or something, and they're, like,
933
01:22:56.730 --> 01:22:58.430
supplanting them with fake memories?
934
01:22:58.890 --> 01:22:59.390
Yeah.
935
01:23:00.170 --> 01:23:02.750
I mean, if you think about it, our mind is
936
01:23:03.055 --> 01:23:04.915
kinda full of fake memories too. Right?
937
01:23:05.295 --> 01:23:07.155
We don't we don't remember the past perfectly
938
01:23:07.455 --> 01:23:08.195
and then
939
01:23:08.575 --> 01:23:13.715
we look at something or we hear something and our mind adjusts the memory on its own. So
940
01:23:14.575 --> 01:23:25.580
no. I mean, what what you're saying is true. What's a real memory? Yeah. Whatever. I don't know. Anyway, on the ImageGen stuff, like, I know it's a sticky subject or whatever, but I will just say, like, unless there's a privacy
941
01:23:25.880 --> 01:23:27.900
focus like, I can't use any
942
01:23:29.825 --> 01:23:35.045
of the current tools with, like, family photos or anything right now. There's nothing out there. That's true.
943
01:23:36.785 --> 01:23:41.550
So one one thing we could do is we could just put guardrails around it,
944
01:23:42.030 --> 01:23:47.570
and it will be transparent. Right? We're not trying to do any kind of censorship on people that is secret and subversive.
945
01:23:48.030 --> 01:23:59.614
So if we do image gen and we put a lot of strong guard rails around it, you'll be able to see what they are and it's open source. So you could adjust them if you want to. Right. Maybe that's the solution that we go with down the road.
946
01:24:00.474 --> 01:24:06.335
Okay. Well, it's all fascinating. We'll figure it out we'll figure it out as you go. Yeah. Marks, thanks for joining.
947
01:24:07.835 --> 01:24:24.335
Do you I asked you if you had final thoughts. You went and answered questions on Nostr. Do you have any additional final thoughts? Or final thoughts? It's just that whole concept of a toolkit. Like, go to trymaple.ai and get a free account. Because everybody has a free ChatGPT account except for Matt.
948
01:24:24.635 --> 01:24:25.035
So,
949
01:24:25.515 --> 01:24:30.495
go grab one and just have it available to you, and you'll start finding
950
01:24:30.955 --> 01:24:40.190
you'll start finding as you chat with ChatGPT. You're like, oh, I actually don't wanna say this to this company right now. And you can go over to Maple, and it's like a really liberating experience to be able to chat openly
951
01:24:40.570 --> 01:24:43.950
with something that's protecting your privacy. So just give it a try.
952
01:24:44.730 --> 01:24:47.310
Love it. I would just say, consider
953
01:24:48.135 --> 01:24:55.914
don't stop at the free account on Maple because it is a big upgrade. Like, you pay $18 a month, get a ton of tokens, you get all the other models.
954
01:24:56.534 --> 01:25:03.195
And what is the free? Is the free just Llama? It's just Llama, which is a decent model. But it's so bad. Llama is so bad.
955
01:25:03.580 --> 01:25:04.320
It's decent.
956
01:25:04.860 --> 01:25:29.795
But it's good that it's quick. The nice thing is Llama is, like, super quick. Like, it answers you right away. Yes. It's not thinking or anything, but it's not thinking because it's retarded. It's just like it just spits out whatever is in its mouth. Well, and it's stuck in the past. Right? It's an older model and it just continues to get older. So yeah. Have you noticed that the GPT open source one just wants to make tables of everything?
957
01:25:30.130 --> 01:25:43.750
Yeah. I finally started telling it, don't make tables, and I get really mean sometimes to it if it does. I always have to tell it to keep it brief. I constantly am just like, just keep it brief. Don't make me a table. It's insane. There you go. Okay. Marks.
958
01:25:44.265 --> 01:25:49.804
Fantastic. Thank you for joining us. Good luck with the wasp nest. Thanks. Freaks, I'll see you next week.
959
01:25:50.265 --> 01:25:53.065
All the links to dispatch are at citadeldispatch.com.
960
01:25:53.065 --> 01:25:58.284
Share with your friends and family. I appreciate you all. Stay humble, stack sats. Alright. Have a good one.