This will soon be happening with our parents' Social Security checks, our friends' cancer treatment plans, our international flight logistics, our ISPs' routing configurations, ...
Search should be a public service, open and transparent, funded by tax revenue, and maintained for the public good. It is too important a service these days to leave to profiteers (who have repeatedly demonstrated that they are not responsible or responsive stewards of the public good).
This is great. If you think the phenomenon of human-like text generation evinces human-like intelligence, then this should be taken as evidence that the systems likely have dementia. https://en.wikipedia.org/wiki/Montreal_Cognitive_Assessment
Imagine if I asked you to draw a clock pixel by pixel, or operate one via HTML, or create a JPEG with pencil and paper, and have it come out accurate... I suspect your hand-coded work would be off by an order of magnitude by comparison.
I think the big thing (potentially, for me) is the ability to postpone conflict resolution during a rebase. That can be quite painful in regular old git; git-mediate makes it noticeably less so in my particular situation and workflow.
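For anyone who hasn't used it, the flow is roughly this (a sketch from memory, so the exact commands and flags may differ; check the git-mediate README):

    # one-time setup: git-mediate relies on the diff3 conflict style,
    # which keeps the base version inside the conflict markers
    git config --global merge.conflictstyle diff3

    git rebase main           # a conflict appears
    # edit the file so two of the three sections (base/ours/theirs)
    # become identical, i.e. apply one side's change to the others
    git mediate               # IIRC: detects resolved conflicts and stages them
    git rebase --continue

And because the diff3 markers carry all three versions, you can (as I understand it) even stage the still-conflicted file, keep going, and come back to resolve it later -- that's the "postpone" part.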
We'll see once better non-CLI UX appears. I'm low-key excited about what could be possible in this space.
I am excited too! It is probably too much to hope for, but I am nonetheless hoping that magit gets a jj backend before I have enough motivation (or need) to learn a new tool to do the same old stuff :D
A key -- perhaps THE key -- remark here, IMO, is the following:
> I do want to make things, and many times I don't want to know something, but I want to use it
This confesses the desire to make, to use, and to make use of, without ANY substantive understanding.
Of course this seems attractive for some reasons, but it is a wrong, degenerative way to be in the world. Thinking and being belong together. Knowing and using are two dimensions of the same activity.
The way of these tools is a making without understanding, a using without learning, a way of being that is thoughtless.
There's nothing preventing us from thoughtful, rigorous, enriching use of generative ML, except that the systems we live and work in don't want us to be thoughtful and enriched and rigorous. They want us pliant and reactive and automated and sloppy.
>Of course this seems attractive for some reasons, but it is a wrong, degenerative way to be in the world.
I share your sense that there's something psychologically vivid and valuable in that passage, but it's part of an implicit bargain that's uncontroversial in other domains - I don't have to be an electrician to want a working light switch. I don't personally inspect elevators or planes or, in many cases, food. It's the basic bargain of modernity.
I suppose, to your point, the important distinction here is that I wouldn't call myself an electrician if my relationship to the subject matter doesn't extend beyond the desire to flip a switch.
I'd argue that you understand what a light switch does well enough to use it effectively for its purpose.
When we move from merely making use of something to using it to make things, that is when we should have a deeper understanding, I think.
Does that sound right?
> the important distinction here is that I wouldn't call myself an electrician if my relationship to the subject matter doesn't extend beyond the desire to flip a switch.
> Large Language Models represent a fundamentally degenerative technology because they systemically devalue the very processes that underpin human progress: original thought, rigorous inquiry, and shared trust. On an individual level, they encourage cognitive offloading, substituting the difficult work of critical thinking and creative synthesis with effortless, probabilistic text generation. This fosters an atrophy of intellectual skills, making society more dependent on automated systems and less capable of genuinely emancipated thought. This intellectual dependency, in turn, threatens long-term technological advancement by trapping us in a recursive loop of recycling and rephrasing existing knowledge, rather than fostering the groundbreaking, first-principles discoveries that drive true progress. Ultimately, this technology is dangerous for society because it erodes the foundation of a shared reality by enabling the mass production of sophisticated misinformation, corroding social trust, and concentrating immense power over information into the hands of a few unaccountable entities.