cross-posted from: https://lemmy.ml/post/18299168

Back in the day the best way to find cool sites when you were on a cool site was to click next in the webring. In this age of ailing search engines and confidently incorrect AI, it is time for the webring to make a comeback.

This person has shared their code to get started: Webring
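The core mechanic of a webring is tiny: each member site links to the next and previous members in a shared list, wrapping around at the ends. Here's a minimal sketch (the member URLs are made-up placeholders, not real ring members):

```python
# A minimal webring sketch: given a shared list of member sites,
# compute the "next" and "previous" links for any member.
# The member URLs below are hypothetical placeholders.

MEMBERS = [
    "https://alice.example",
    "https://bob.example",
    "https://carol.example",
]

def neighbors(url: str) -> tuple[str, str]:
    """Return (previous, next) sites in the ring, wrapping around."""
    i = MEMBERS.index(url)
    prev_site = MEMBERS[(i - 1) % len(MEMBERS)]  # wraps to the last member
    next_site = MEMBERS[(i + 1) % len(MEMBERS)]  # wraps to the first member
    return prev_site, next_site

print(neighbors("https://carol.example"))
```

In practice each site just embeds its two precomputed links in static HTML, so no scripting is needed on the member pages at all.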

23 points

Eh, webrings were pretty lame even when they existed. There are plenty of ways to find new stuff these days. I hear they even have sites where anyone can post links and vote on which ones are good.

1 point

Might be an interesting addition to have an aggregator aggregator: something that would count how often a particular website is linked, and in what categories.

Then you can filter by how many times that web page has gotten an upvote or downvote.

If you filtered out social media and, say, the top 100 web pages, what would be left, and how popular would it be?
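As a toy sketch of that counting-and-filtering idea (the link sample and the excluded domains here are made up for illustration):

```python
from collections import Counter
from urllib.parse import urlparse

# Hypothetical sample of links collected from several aggregators.
links = [
    "https://bigsocial.example/post/1",
    "https://smallblog.example/essay",
    "https://smallblog.example/notes",
    "https://bigsocial.example/post/2",
    "https://tinyzine.example/issue-3",
]

# Domains to exclude, e.g. social media or the top-100 sites.
EXCLUDED = {"bigsocial.example"}

# Count how often each remaining domain is linked.
counts = Counter(
    urlparse(link).netloc
    for link in links
    if urlparse(link).netloc not in EXCLUDED
)
print(counts.most_common())
```

A real version would also need per-category tallies and vote counts from each aggregator's API, but the filtering logic stays this simple.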

5 points

I have a vision of starting a <noscript> community.

Basically building a set of tools to help people host content with just plain HTML and CSS, using static personal hosting and organically sharing links like the pioneer days of the web.

I think that the shift to client-side scripting (tracking pixels, algorithmic content, infinite scrolling, targeted advertising, etc.) is how we ended up with the monoculture we see today.

Just disable JavaScript in your browser and 99% of those things go away, and we can support people building personal homepages again.

5 points

I built personal webpages in the 1990s, and I still do now. I included JavaScript then, and still do now, to make calculations, show interactive graphics, and present quantitative stuff about climate change: see for example this model.
I get your concept, that more websites should be written and hosted by individuals rather than big tech, but JavaScript is not the essence of the problem. JS just calculates stuff client-side for efficiency. In theory, big tech could still serve up personalised algorithm-driven feeds and targeted advertising with server-side page generation (like PHP) and a few cookies; it would waste more bandwidth, but that's no stress to them. Whereas disabling client-side calculations would kill what I do, since I can't, as an individual, afford to host big calculations on cloud servers (which is also technically harder).

2 points

Yeah, this isn’t supposed to be a silver bullet; it’s more about democratizing the internet.

I think the key principles are:

  • Low barrier to entry
  • Focus on users owning their own content
  • Privacy is more important than advanced functionality

I.e. if you want to start a blog, it should be easy to own and host it yourself rather than surrendering your content to Twitter or Facebook. Make it accessible to others who also want to surf the web without being targeted and tracked.

1 point

Are you familiar with Neocities (the GeoCities revival thing)? It’s not anti-scripting, but it may scratch your itch.

1 point

They were never that cool. In fact, people only joined them because they wanted more traffic.

Sorry, people stopped using them for a reason imho.


Opensource

!opensource@programming.dev
