Yeah, it's kinda telling about those who treat everything after shooting film as some sort of binary process.
I feel a lot of them would benefit more from just processing all their photos through some basic profile that ends with running it through a film simulator.
Agreed. The place I want to see blog feeds is my RSS reader, not constantly dumped onto social media, unless someone is explicitly bringing up a post to talk about it. If it's just announcing that a post exists, it feels spammy.
Finding external conversations about posts is obviously a bit more complicated, but if I actually care to see whether discussion ever happened, I do so by searching for the URL. That's something linking via permashortlinks impairs.
For someone doing this in 2026, some food for thought:
Use the Arachne wall generator for much clearer legends (dynamic line width)
Most printers can do twice the Z-axis resolution of the blogger's post, or the same layer height with a 0.4mm nozzle (though legends would suffer)
Very minor fuzzy skin enabled on the outer walls can replace layer lines with more of a textured look
ZAA (Z anti-aliasing) post-processing scripts can eliminate most of the need for angling at 45 degrees
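For concreteness, a rough sketch of those settings in PrusaSlicer-style config keys (key names from memory and worth double-checking against your slicer; other slicers expose the same toggles under different names):

```ini
# Arachne perimeter generator: variable line width keeps thin legend strokes crisp
perimeter_generator = arachne
# Subtle fuzzy skin on external walls only, so it textures rather than distorts
fuzzy_skin = external
fuzzy_skin_thickness = 0.1
fuzzy_skin_point_dist = 0.4
# Finer Z resolution; 0.1mm is comfortable for most 0.4mm-nozzle printers
layer_height = 0.1
```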
Agree strongly. An expired cert is better than no cert.
Also, I would argue maintenance is only as complicated as you make it for yourself. Countless people keep patched, secure, HTTPS web servers running with minimal effort. If it's somehow a lot of effort, introspect a bit on why you are making so much work for yourself.
Might be a bit of each of us touching different ends of the elephant. To be clear, I am talking about long timespans. Let's Encrypt hasn't even existed for a full decade yet. During that time it's dropped support entirely for the original ACME protocol. During that time its root certs have expired at least twice (those are just the ones I remember causing issues in older software). And that's ignoring the churn in ACME/ACMEv2 clients, OS/distro-specific cert store issues, and browser CA issues. Saying that there's no trouble with HTTPS must come from experience on short timescales (i.e., a few years).
HTTP/3 already doesn't allow anything but CA-issued TLS. It won't be too long before browsers no longer let you click through CA TLS warnings.
If people want things to be on the web for long time periods, those things should be served over both HTTP and HTTPS.
There is some kind of middle ground here. My first HTML file still renders like it did on Mosaic. The HTTP server I used back then still works today, 35 years later, without maintenance. I do agree that HTTPS is a simple solution, but there is too much cargo cult around it. Honestly, I do not see the need to keep maintaining everything you publish if you follow sane practices.
EDIT: I have 15-year-old things at work that do not compile; you have to maintain those for sure, and the biggest problem is cryptography. I am not sure that such unstable tech should ever be part of the application.
Unless I'm misunderstanding your point, your HTTP server from 35 years ago is still working today without any maintenance?
Does that mean no security patching and no updates for bugfixes? Or does "no maintenance" mean something else I'm missing?
I find it difficult to discuss these topics when comments like these pretend that you can leave your system exposed on the internet for years without any maintenance.
If we're talking applications that don't actively listen on the internet that's fine, and I would agree that we should have complete software that just works.
But a webserver, unless it's for personal/home use, is on the internet, and I don't see how it could work for 35 years without any update/change.
Static HTML webservers don't really need constant security patching or bugfixes the way dynamic, complex stuff does. They literally can just live forever. The sites themselves are just files, not applications.
That's no use when your automated registrar stops working in 3 years because it went out of business or changed protocols. Let's Encrypt has been an outlier.
The only OS that doesn't, as far as I'm aware, is Windows. And what image editors still have problems? Affinity has supported it for several years, and GIMP, Lightroom/Photoshop, and Photopea all handle it; everywhere I test, WebP works fine.
Most social media sites take WebP these days with no issue; as far as I'm aware, it's mostly older, often PHP-based sites that struggle. And when it cuts bandwidth by a sizeable amount, there are network effects that tend to push some level of adoption of more modern image formats.
I'm not saying it's the solution I would implement, but Caddy's L4 module does let you do this: essentially using TLS as a tunnel, with openssl in the proxy command to terminate it client-side.
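To illustrate the client side of that setup (hostnames here are hypothetical, and this assumes the remote end, e.g. Caddy's layer4 app, wraps SSH in TLS on port 443):

```
# ~/.ssh/config
Host tunneled-box
    # openssl s_client acts as the TLS tunnel, terminating it on the client
    ProxyCommand openssl s_client -quiet -connect tunneled-box.example.com:443 -servername tunneled-box.example.com
```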
> The Arch Wiki it's a joke as it gets obsolete with every major upgrade
I find this claim hilarious because, well, Arch doesn't have the concept of major upgrades. And their documentation is some of the fastest to cover relevant changes with regard to users' needs in software.
Can you find some obsolete bits here or there? Sure, but they're almost always visually flagged as out-of-date and disputed until consensus is certain, rather than immediately yeeted.
Love TT-RSS to bits and pieces; I don't know how people use RSS readers that have zero filtering. Being able to invoke custom plugins to act on certain hits is just the cherry on top for me.
That said, I use a healthy dose of custom CSS for it on the computer, and access it via NetNewsWire on mobile. Can't speak for any of the official apps, but at least you can get TT-RSS to speak just about every flavour of RSS API.
Concerningly similar story, except it was a leg injury that led to a downturn in my posture. Love my Embody to bits and pieces; the first week with it was rough, but it's been bliss ever since.