
Pulling weeds by hand works for a lot of weeds and is the most environmentally friendly solution where possible. It's what I've done, for the most part.

I will say that for some weed species it can be ineffective or counterproductive, unfortunately, and for those a chemical (or other) solution may be in order.

Weeds can also be a sign of a potential problem, such as poor drainage, a leak, etc.

Nutsedge is an example of that. As I recall, pulling it out results in it sending more shoots up if you don't get the nut (which can be feet underground).

At that point, you have to continuously pull weeds on a daily (or multiple-times-daily) basis so that the plant uses up more energy regrowing than it generates.

It likes water, so if it's there, it might be because there's standing water from rain.

I dug up a raised flower bed to get rid of it once. Nuts were absolutely everywhere because of poor drainage. I had to go down about 2 feet, I think, to get them all. I replaced the bottom layers of impermeable clay soil with something that drained, along with a drain pipe or two.

Now the sedge is gone, the risk of foundation damage from being too wet is gone, and no chemicals were required.


Can't do that in cracks in a sidewalk, between pavers, on a wall, etc. where plant growth can damage them.


Weed whacker and edger? You'll have them out anyway.


Some weeds are quite unpleasant, such as sticker burrs. I'd rather not have a dog and children covered in those.

Some weeds can be damaging to property, trees, sidewalks, etc. or are poisonous.

It's not always about being annoyed by dandelions in an otherwise overly fussed-over, sterile lawn environment.


Even then, spraying cancer-causing chemicals into the land is beyond stupid. Killing yourself and the humans around your land to save a bit of work; one can't be more antisocial.


The problem with that "decipherment", from what I've been told by others who are far more educated than I am, is that it does the equivalent of deciphering Anglo-Saxon runic texts by using modern slang like "yo" in order for it to work out.

As a non-linguist, non-Sanskrit speaker I can't evaluate those claims, but considering that this script declines as the Indus Valley Civilization fades away, coinciding with the arrival of Indo-European speakers who would be more likely to speak the ancestor language of Sanskrit, I'd be highly skeptical of these claims.

If the script is a full writing system, and I were forced to guess what a future decipherment might find, it wouldn't surprise me to see that the language is related to the Dravidian languages.

Hopefully more examples of the writing will be found so that we may one day know for sure.


I learned Sanskrit as a kid and I’m familiar with Dravidian languages as well. They’ve heavily influenced and assimilated each other’s features. Although there are no attestations of Dravidian languages before the 5th century BCE, we never know what future discoveries might tell us about their connection to the IVC, if any.

If we can decipher letters from burnt, rolled-up scrolls, I’m sure we’ll eventually figure out what the IVC’s writings meant.


There have been attempts to recreate (vulgar) Latin from the modern-day Romance languages, as well as to use older forms of these languages to reconstruct what's known as Proto-Romance.

My recollection is that the complexity went the other way; Latin was more complex than the reconstructed languages, especially if the reconstruction didn't include Romanian, because the modern Romance languages became simpler over time in similar ways.

It's clear that the result is useful for understanding features of the ancestral language, but it's not perfect, and never will be.

On the other hand, comparative linguistics came long before genetics, and it is this field that first noticed a connection between the Indo-European languages.

Archaeological and especially genetic evidence now show the peoples of this language family (mostly) have shared (though distant and diluted) ancestry, so the field was broadly correct in noticing a connection.


>...farming those risks out to institutions seems to be the current way most societies have decided to mitigate those risks

Unfortunately, those institutions (be they governments, insurance companies, UL Labs, banks, venture capitalists, etc.) also need to be vetted.

Even when staffed with impeccably well credentialed and otherwise highly capable people, their conclusions may be drawn using a different risk framework than your own.

The risk that they mitigate may even be the risk that you won't vote for them, give them money, etc.

There is also the risk of having too little risk, which is no less a catastrophe than too much. The balloon may not pop, but it may never be filled.


I don’t think anyone reasonable is advocating believing institutions on blind faith (possibly with the exception of religious institutions). They need to be transparent and also strive to reflect the values (risk and otherwise) of their constituents.


On the other hand, avoiding peanut exposure can cause an increase in allergies, so there's a feedback loop at play.

The children who now have allergies, but wouldn't with past exposure levels, are more inconvenienced than the kid who can't eat a peanut butter sandwich at school.

Attempting to make life better for all has unexpected twists and turns.


The question is whether children who are already in school are likely to develop an allergy due to lack of exposure at that point in their lives. If the answer is no, the only downside is parents having to make a different sandwich for lunch.

I'm allergic to peanuts and none of my schools banned peanuts. When I was in elementary school, another kid chased me around with a peanut butter sandwich for a couple of minutes before the teachers were able to stop them. I'm not saying this will happen to all kids, but the options for bullying get kinda scary when an everyday food can kill you. Kids at that age may not understand how serious allergies can be, so there's also the more mundane risk of a child with an allergy trying somebody else's food without knowing it contains allergens.

Later on in middle school there was a single peanut-free table, where kids were eating peanut butter sandwiches for lunch every day before I arrived. Fortunately they were willing to bring something else once I told them I actually needed the peanut-free table to be peanut-free.


It's a bunch of small things: command line flags, whether a command line tool is even present at all, compiler built-ins / differences in behavior, headers possibly being in different places, compiler support for various language standards and more.

Even to this day, it's not uncommon to find libraries that won't compile with one of GCC, Clang, etc., or even with the same compiler on Linux vs. MacOS.
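
To make the flavor concrete, here's a minimal C sketch (my own, not from any particular library) of the guards this forces. The predefined macros are real, and the malloc header split is a genuine glibc-vs-macOS example:

    /* Minimal sketch of per-compiler / per-OS guards. */
    #include <stdio.h>

    #if defined(__APPLE__)
    #  include <malloc/malloc.h>   /* macOS keeps malloc introspection here */
    #else
    #  include <malloc.h>          /* glibc's traditional location */
    #endif

    #if defined(__GNUC__) || defined(__clang__)
    #  define LIKELY(x) __builtin_expect(!!(x), 1)  /* compiler built-in */
    #else
    #  define LIKELY(x) (x)        /* fallback for other compilers */
    #endif

    int main(void) {
        if (LIKELY(1))
            puts("portable enough, for now");
        return 0;
    }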

It was even worse in ye olde times before package managers, I'm assuming.

EDIT: I forgot to mention that System V and BSD are two of the major families.

Both influenced Unix-like OSes far and wide, such as SysV style init scripts in certain Linux distros, MacOS being derived partly from BSD, Solaris being a continuation of SysV IIRC, and more.

There was a rough standardization in where certain things could be found, command line flags, etc.


From an Apollo Domain OS Design document[0]:

"In order to provide maximum portability of software from Domain/ as to other Berkeley or System V UNIX systems, Apollo provides two complete and separate UNIX environments, rather than a hybrid of the two. Any workstation can have one or both UNIX environments installed, and users can select which environment to use on a per-process basis.

Two key mechanisms support this facility. First, every program can have a stamp applied that says what UNIX environment it should run in. The default value for this stamp is the environment in which it was compiled. When the program is loaded, the system sets an internal run-time switch to either berkeley or att, depending on the value of the stamp. Some of the UNIX system calls use this run-time switch to resolve conflicts when the same system call has different semantics in the two environments.

The other mechanism is a modification of the pathname resolution process, such that pathname text contains environment variable expansions.

[...]

When UNIX software is installed on a node, the standard trees (/bin, /usr) are installed under directories called bsd4.3 and sys5.3. The names /bin and /usr are actually symbolic links dependent on the value of an environment variable named SYSTYPE. That is, /bin is a symbolic link to /$(SYSTYPE)/bin. When the program loader loads a stamped program, it sets the value of SYSTYPE to either bsd4.3 or sys5.3, according to the value of the program stamp. Therefore, a program that refers to a name in one of the standard directories will access the correct version for its environment."

[0] https://bitsavers.org/pdf/apollo/014962-A00_Domain_OS_Design...
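
For the curious, here's a hypothetical C sketch of just the $(SYSTYPE) expansion step described above. Apollo's actual resolver lived inside the OS, so this is only an illustration, using the two stamp values from the quote:

    /* Hypothetical sketch: expand "$(SYSTYPE)" in a symlink target.
       Not Apollo's code; just illustrates the mechanism described above. */
    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    static void expand_systype(const char *target, char *out, size_t n) {
        const char *tok = "$(SYSTYPE)";
        const char *systype = getenv("SYSTYPE");
        const char *hit = strstr(target, tok);

        if (!hit || !systype) {            /* nothing to expand */
            snprintf(out, n, "%s", target);
            return;
        }
        snprintf(out, n, "%.*s%s%s",       /* prefix + value + suffix */
                 (int)(hit - target), target, systype, hit + strlen(tok));
    }

    int main(void) {
        char path[256];

        setenv("SYSTYPE", "bsd4.3", 1);    /* program stamped "berkeley" */
        expand_systype("/$(SYSTYPE)/bin", path, sizeof path);
        printf("/bin -> %s\n", path);      /* /bsd4.3/bin */

        setenv("SYSTYPE", "sys5.3", 1);    /* program stamped "att" */
        expand_systype("/$(SYSTYPE)/bin", path, sizeof path);
        printf("/bin -> %s\n", path);      /* /sys5.3/bin */
        return 0;
    }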


MacOS is only licensed for use on Apple-branded hardware, as I understand it. Even running it in a VM could be problematic if the host isn't running MacOS.


So your issue isn't the openness in terms of being limited in what you can do on it, and more that you want it to be bloated with drivers for millions of various pieces of hardware like Windows, got it.


> and more that you want it to be bloated with drivers for millions of various pieces of hardware like Windows, got it.

MacOS is bloated anyways; they might as well use that bloat for something important like backwards compatibility, not zombie code left over from the PowerPC era. That's just an objective failure on Apple's part; they break software support more often than Microsoft and even Linux at this point. A professional OS really has no excuse to break someone's software and leave it broken. Even Microsoft gets that.

So... yeah, you know what? I do want it to be bloated with drivers, because whatever they're stuffing it with right now clearly isn't working. I don't trust Apple to write or maintain a long-lived successor, I demand third-party alternatives I can maintain myself. Give me more options for writing and delivering software, or else I am going to continue ignoring MacOS as a build target for the foreseeable future.


True. However, I can migrate (and have, multiple times) from machine to machine without needing to reinstall everything.

My work MacBook setup was pulled from an original Air from something like 2015 to a 2017 Pro, and currently to my 2019 Pro.

So I’ve got apps on my Mac that were installed damn near 10 years ago.

Ditto my home 2015 Pro, which was later migrated to an M1 Air. Hell, I’ve still got some 32-bit Steam games that somehow run on my Air (at least Steam tells me they’re 32-bit).

We could play this game ad infinitum, each finding a level of supposed “openness”, but the basic facts are that neither Windows nor MacOS is truly open.

If you want open, then Linux is always going to be in the answer somewhere. Not MS Windows. And not Apple MacOS.


I'm guessing "the edge" is doing inference work in the browser, etc. as opposed to somewhere in the backend of the web app.

Maybe your local machine can run, I don't know, a model to make suggestions as you're editing a Google Doc, which frees up the Big Machine in the Sky to do other things.

As this becomes more technically feasible, it reduces the effective cost of inference for a new service provider, since you, the client, are now running their code.

The Jevons paradox might kick in, causing more and more LLM use in cases that were too expensive before.

