More than 200 Substack authors asked the platform to explain why it’s “platforming and monetizing Nazis,” and now they have an answer straight from co-founder Hamish McKenzie:

I just want to make it clear that we don’t like Nazis either—we wish no-one held those views. But some people do hold those and other extreme views. Given that, we don’t think that censorship (including through demonetizing publications) makes the problem go away—in fact, it makes it worse.

While McKenzie offers no evidence to back these ideas, this tracks with the company’s previous hands-off stance on moderation. In April, Substack CEO Chris Best appeared on the Decoder podcast and refused to answer moderation questions. “We’re not going to get into specific ‘would you or won’t you’ content moderation questions” over the issue of overt racism being published on the platform, Best said. McKenzie later followed up with a statement similar to today’s, saying “we don’t like or condone bigotry in any form.”


I actually prefer this type of hands-off approach. I find it offensive that people would refuse to let me see things because they deem them too “bad” for me to deal with. I find it insulting that anyone would stop me from reading about how to make meth, or from reading Mein Kampf. I’m 40 years old and it’s pretty fucking difficult to offend me, and the idea that I’m going to be driven to commit crimes just by reading is offensive.

I don’t need protecting from speech/information. I’m perfectly capable and confident in my own views to deal with bullshit of all types.

If you’re incapable of dealing with it, then don’t fucking read it.

Fact is, the more you clamp down on stuff like this, the more you drive people into the shadows. 4chan and the dark web become havens of ‘victimhood’ where they can spout their bullshit and create terrorists. When you prohibit information or speech, you give it power.

In high school it was common for everyone to hunt for the Anarchist Cookbook or the Jolly Roger’s Cookbook. I imagine there are kids now who see it as a challenge to get hold of them, and of terrorist manuals, not because they want to blow shit up, but because they’re taboo!

Same with drugs: don’t pick and eat that mushroom, don’t burn that plant. Anyone with 0.1% of curiosity will ask “why?” and do it because they want to know why it’s prohibited.

Porn is another example. The more you lock it down the more people will thirst for it.

Open it all up to the bright light of day. Show it up for all its naked stupidity.


That’s not really how this works. Do you also think advertising and marketing don’t work?


In what way are advertising and marketing the same as Mein Kampf?


Pinching the bridge of my nose here. Ok, let’s assume you’re asking in good faith. Nazi blog posts are marketing for Nazi beliefs. They’re posting because they have ideas that they want you to have, too. What do you think marketing is?

When you see an ad, you don’t typically run right out and buy the product. But now you’re more aware of whatever they’re advertising. Maybe that’s a new car. Maybe it’s Pepsi. Maybe it’s “you should recycle.” And maybe, when it’s a literal Nazi post, it’s “the Jews are the problem.” Some people will bounce right off the ad. Some will immediately click through, read the related links, and so on. And many people who read it will sort of remember it, and now have context for the next post they see. The more ads they see for Nazi beliefs (or anything, really), the more likely they are to be persuaded.

If you saw posts every day that promoted Nazism as a solution for the world’s problems, it would have an effect on you. Look how effective Fox News has been at propagating right-wing beliefs.


In a lot of languages, “advertising” and “propaganda” are literally the same word. The only difference is whether the goal is commercial or political.


Exposure


Fascinated that you think it would somehow be harder for you to go out and find Nazis if Substack weren’t hosting and paying them. It will always be easy to find and read Nazi content. The reason Substack matters is that the platform helps THEM find YOU, or a suggestible journalist, or a suggestible politician, etc. You are not the protagonist here.


Substack bans pornography but allows Nazis.


Agreed. I actually came back to this topic specifically to make this exact point, which, for all the time I’d spent on it, I feel like I hadn’t yet said.

People are adults, generally speaking. It’s weird to say that you can’t have a newsletter with a literal swastika on it because people will be able to read it but unable to realize that what it’s saying is dangerously violent. Apparently we have to have someone “in charge” of making sure only the good stuff gets published, and keeping away the bad stuff, so people won’t be influenced by it. This is a weird viewpoint, and one the founding fathers were not at all in agreement with.

Personally, I do think there’s a place for organized opposition to slick internet propaganda that pulls people down the right-wing rabbit hole, because that’s a huge problem right now. I don’t actually know what that opposition looks like. I can definitely see a place for banning certain behaviors (bot accounts, funded troll operations, disguising the source of a message) that people might class as “free speech,” or for adding counterbalancing speech in kind to misleading messages (Twitter’s “community notes” are actually a pretty good way of combating them, for example). But simply knee-jerking that we have to find the people who are wrong and ban them, because if we let people say wrong stuff then other people will read it and become wrong, is a very childish way to look at people who consume media on the internet.


This article is not about government censorship. It is about a private entity actively deciding to allow Nazi content on its platform. Hand-wringing about the founding fathers belongs in some other thread, where the topic is the government prohibiting content from being published.


I’m aware of how the first amendment applies, yes. I agree with the spirit of it in addition to the letter, though. You’re free to delete the one sentence where I talked about the founding fathers and respond to the rest of my message, which doesn’t reference them or government censorship in any way.

(Edit: Actually, I wasn’t super explicit about it, but in the whole final paragraph I was thinking partly of government regulation to combat misinformation. That is, in part, what I meant by “organized opposition.” So I spent time in my message referring to what the government should do to limit harmful internet content, and no time at all talking about what it shouldn’t do. I did throw in a passing reference to the founding fathers, in reference to the spirit that I think should inform private companies that act as non-governmental gatekeepers of content.)
