Hacker News | dzink's comments

The author attributes the meaning of a gift, for the giver and hopefully the receiver, to the time the giver spent on it. They argue that spending less time for the same utility lowers the meaning.

I see something very different:

1. The government post shared as an example of an efficiency-to-utility increase has glaring errors: “journey-level developers”. You will never achieve any improvement on government code bases if the people leading the effort can’t pay attention to the most basic and clearly broadcast elements of the job. AI used by junior developers will only compound the massive complexity of government systems, to the point where they are not fixable by seniors and not usable by humans.

2. Time spent doing something with care, meaningful or not, trains a person in attention to detail, which is absolutely critical to getting things right. People who lazily lean on generating more without attention to detail don’t see the real point of the work: it’s not to cram more stuff into less space (physical or mental) faster. It’s to make the existing space better and bigger by needing less in it. The rest is just mental and physical complexity overload. We are about to drown in that overload like hoarders next to a dumpster.

3. If you have ever lived in a small home, you may have noticed that the joy of getting things (usually derived from dopamine-seeking behaviors like shopping or making, or shopping for ingredients so you can make things, or getting toys for your kids, or other people getting toys for your kids) will quickly overload any space in your living quarters. Your house becomes unbearably small, and it becomes impossible to find things in piles or drawers filled with other things if nobody ever organizes them. We have all become dopamine addicts, or the world has turned us into such, and there are few if any humans willing and able to organize that world. So most people today will be paralyzed by a million choices that were never organized or pruned down by the owner or their predecessors. The overwhelming urge will be to escape the dread of organizing into more dopamine-generating behaviors. We need more of the “moms” who clean up our rooms after we’ve had fun with all the Legos we can now generate. Or we will all be living in a dumpster before too long.


"Journey-level" would just be 'senior', no? I'm not familiar with the terminology, but it seems akin to journeyman (i.e. a competent executor of a skillset). A junior would be an apprentice in that universe, I imagine.

In today’s world, autocracy is not possible without tech. Tech firms enable it with everything from surveillance, to mass propaganda for anyone willing to pay, to acting as private data collectors for law enforcement, gathering data that law enforcement would not legally be allowed to collect on its own.

The even bigger approach now is a denial-of-service attack on the judicial system, coming from AI-enabled infractions that are too new for the system. To deny enforcement in an area that helps autocrats, an AI tool can enable anyone to create problems in that area automatically and easily, and now the pipeline for getting justice in that area becomes clogged. It doesn’t even need to be intentional - it can be automatic.

The zero-sum game of tech firms competing for the same fluid investors in the stock market means control of the government becomes guaranteed - because if a single firm is singled out, the money flows out of its equities and into its competitors’. It’s the blood supply to vital organs. The CEOs have a fiduciary duty to their shareholders above all else. They cannot do anything that would damage the stock price without being sued into oblivion, and that would attack their personal assets as well.


Think back to Maya history, when the rulers kept astronomical knowledge secret to pretend they were gods with control over celestial objects. If expensive education and publishing access gives someone power, and free education and publishing becomes a threat to their authority, that’s not a good testament to how they used their educational advantage while they had it.

AI may be destroying truth by creating collective schizophrenia and backing different people’s delusions, by being trained to create rage-bait for clicks, or by introducing vulnerabilities into critical software and hardware infrastructure. But if institutions feel threatened, their best bet is to become higher levels of abstraction, or to dig deeper into where they are truly, deeply needed - providing transparency and research into all the angles, weaknesses, and abuses of AI models, then surfacing how to make education more reliably scalable when AI is used.


Nevada protects founders from shareholder lawsuits. So someone who defrauds or intends to defraud shareholders is more likely to prefer Nevada. To be fair, a lot of things can turn into a shareholder lawsuit in Delaware.


How do you do objective research without a data pipeline? Social media companies can use user privacy as an excuse not to share the feeds that influence users. The first step to fixing the wrongs is transparency, but there are no incentives for big tech to enable that.


The election is a state at a moment in time. Before the election, to affect that state, propaganda machines target whatever is lower on the Maslow hierarchy of needs for the voters: pesky tariffs are a tiny issue compared to that boogeyman who wants to target your children. Someone’s freedom is a pesky thing compared to that immigrant boogeyman going after your retirement savings. Once people have voted because they are scared for their life savings and their children, the elected can do whatever they want and target whomever they want with impunity for several years. Especially if they start building their own militia and threatening the judiciary.

This authoritarian model has proven very successful for anyone Putin and his apparatus have installed anywhere. Now it may be franchised even further.


>that boogeyman

It's depressingly funny, too, when you ask these people if they've ever been directly affected by said boogeyman: they'll say no, but they know someone who has. Meanwhile, you can ask them about healthcare, local government, and other matters that affect their daily life, and they'll swear the trans-immigrant boogeyman du jour has far more effect on their lives.


The article says the change is implemented by telling brands they don’t need an Amazon barcode if they have a product barcode, while resellers still need an Amazon barcode. What happens if resellers decide to just not add the Amazon barcode and pass themselves off as brands?


In an ideal world, the reseller’s shipment would be rejected.


“Annoying” is a thing of the past. Look at the evolution of ads and content placement. With social media advertising engineered to trigger massive anxiety and societal schizophrenia on some topics, imagine what can be done with personalized AI (especially if the buyers are well-funded politicians, or state-backed malicious actors vying for territory, real estate, or natural resources where you live - the highest-margin opportunities).

At first, in retail, you had billboards and shelf space. The lower the quality of your product’s ingredients (for example, syrup bottled with soda water), the higher your margin was, and the more you could afford to buy out shelf space in retail chains and keep any higher-quality competition out. Then you would use some of the extra profits to buy national ads, and you’d become a top holding for the biggest investors. That was the low-tech flywheel.

In the search engine world, the billboards were the margin-eating auction-based ad prices, and the shelf space became SEO on increasingly diluted and dis-informative content filling out the shelf-space side. In video advertising, rage-bait and conspiracy theories try to eat up the time available for top users.

AI advertising, if done right, can be useful, but the industry that pays for it intentionally asks for obtrusive and attention-hogging, not useful. The goal is always to push people to generate demand, not to sit there until they need something. Thus the repetition, psychological experiments, and emotional warfare (surfacing or creating perceived deficiencies, then selling the cure). Now, if you understand that the parties funding AI expansion are not Procter & Gamble-level commercial entities but state and sovereign investors, you can forecast what the main use cases may be and how those will be approached. Especially if natural resources are becoming more profitable than consumer demand.


Those look like the monitors used in the F1 movie, which is strange, considering it was an Apple production and they maybe should have used Apple monitors for product placement. I guess that is a testament to Kuycon from Apple.


You should look at pictures of Apple's Pro Display XDR. The Kuycon monitor is an obvious rip-off of that in terms of styling, especially the ventilation on the back.


If one person thinks this way, many more do. This is typical in large organizations, especially government institutions, because the expense of running entire teams at massive cost for no reason is borne not by the team but by someone with a much larger budget, who has more money than care, or completely wrong incentives (“the more people I manage, the more important I am” type of orgs). This is organizational gangrene described from the inside, and partly how and why it happens. If you are leading an organization and reading this, figure out how to measure and prevent it.


Humans think this way. This isn't a cultural thing, it's human nature. We like positive people and dislike negative people. Ignoring the fact that political capital is a thing won't make it go away.


The goal is not to ignore human nature, but to build better tools for orgs to get feedback and act on it before it corrodes them from the inside. Government is the biggest of them all: fix this, and maybe you can create a government that works for you, instead of blowing taxpayer dollars like a leaky bucket. Humans in an organization are like cells or organs in a body. Every country, team, and organization iterates on a proper nervous system for that body.


Are there any good examples of governments that work really well? I don't think so.

I think this just shows that we'll all be better off when we can make AI smart enough so we can put it in charge of everything.


imo it's a cultural thing specific to organizations that are raking in money, as many tech companies are. The less actual competitive pressure there is, the more everyone is pressured to just shut up and take their cut. Whether the cut is more or less than it could be matters less than just not rocking the boat.

Whereas if real existential need is on the line then people are incentivized to give a shit about the outcome more.

Tech is so rich in general that the norm is to just shut up and enjoy your upper-middle-class existence instead of caring about the details. After all, if this company blows up, there's another one on the way that will take most of you.

Not that this excludes the same behavior in industries that are less lucrative. There's cultural inertia to contend with, plus loads of other effects. But I have noticed that this attitude seems to spontaneously arise whenever a place is sufficiently cushy.

Also, this take doesn't (on its own) recommend one strategy or the other. Maybe it makes the most sense to go along with things, or to fight them for personal reasons uncorrelated with the economic ones. But it's good, I think, to recognize that the impulse is somewhat biased by the risk-reward calculation of a rich workplace. Basically, it is essentially coupled to a sort of privilege.

