I’m pretty sure I’ve lived on less than this in inflation adjusted dollars.
Oh I don’t think it’s a problem for Discord, but when it comes to software projects specifically I find the reliance on Discord frustrating because of its non-public orientation. If I’m having an issue I’d far rather search for a solution on a public wiki, bug report system, or forum than sign up for one more Discord server.
Far more than they should, tbh. Too many little game mods will have a Discord for questions and reporting issues rather than using their GitHub or a forum.
I think you’re giving the guy too much credit. Sometimes things are as they seem. He just didn’t like the moderation scheme on Twitter, made a gesture at buying it, fumbled a little bit and overbid, then after having been forced to acquire it tried to turn it into something closer to what he wanted it to be.
Masnick’s post is well put, but also a disturbing reminder of how much power nation-states can exert over the Internet.
Because they’re also rich. Laws are for the poors.
XP.
Windows was getting to be too much trouble to 🏴‍☠️, Vista didn’t look that great, I couldn’t afford to upgrade my hardware to accommodate the bloat, and desktop Linux was a lot more mature and ready to go out of the box.
There are probably safeguards in place to prevent the creation of CSAM, just like there are for other illegal and offensive things, but determined people work around them.
The AI has to be trained on something first. It has to somehow know what a naked minor looks like. And to do that, well… You need to feed it CSAM.
First of all, not every image of a naked child is CSAM. This has actually been kind of a problem with automated CSAM detection systems triggering false positives on non-sexual images, and getting innocent people into trouble.
But also, AI systems can blend multiple elements together. They don’t need CSAM training material to create CSAM, just the individual elements crafted into a prompt sufficient to create the image while avoiding any safeguards.
It would not need to be trained on CP. It would just need to know what human bodies can look like and what sex is.
AIs usually try not to allow certain content to be produced, but it seems people are always finding ways to work around those safeguards.
I feel like he sounds like a psychopath even now.
I think it’s rash to judge the tone of his writing like that. It can be a struggle to identify and admit one’s flaws, and it’s certainly a struggle for most people of the modern era to write elegantly with only a pen and a few sheets of paper.
A person like what? There’s no connecting thread between morality, emotional maturity, and programming skills.
Actually agree, generally.
Bitcoin is basically a pyramid scheme, but it’s not the only blockchain in town.
Anonymous peer-to-peer financial exchanges can actually be good.
Cooperative ledgers can be good.
Public ledgers can be good.
It’s not a brilliant new idea, it’s a good old one. Jitneys are back baby!
Isn’t the old bit about organized crime how they always have a second set of books? After all, they do want to be able to track their finances.
Or at this point, the one whose tracking is easiest and safest to avoid or circumvent.
Also, mind that these new cars will soon be used cars with the same bullshit.
Yeah. No one ever gave me AdSense dollars for nearly busting my fucking head.
I hate Twitter, but I’m getting to the point where I want it to get better, because if Bluesky gets many more members we’re just gonna have Twitter again.
One thing I liked about the Muskification of Twitter was the scattering.