Hacker News: okanat's comments

The average developer sucks. The distribution is also unbalanced: it is bulkier on the low-skill side.

Great UIs are written by above-average or even exceptional developers. That kind of work draws on real-life reasoning and on years of unique human experience interacting with the world. You need true general intelligence for that.


That is the point I was making, but I suppose that may not have been clear. Thanks for expanding.

Cross-platform GUI libraries suck. Ever used a GTK app under Windows? It looks terrible, renders terribly, and doesn't support HiDPI. Qt Widgets still has weird bugs: when you connect or disconnect displays, it sometimes rerenders UIs at twice the size. None of those kinds of bugs exist for apps written in Microsoft's UI frameworks or in browsers.

The problem with cross-platform UI is that it is antithetical to the very reason an OS-native UI exists. Cross-platform toolkits try to unify the UX, while native UIs try to differentiate it. Native UI wants unique, incompatible behavior.

So the cross-platform UI frameworks that try to use the actual OS components always end up with terrible visual bugs, because they unify things that don't want to be unified. Worse, many "cross-platform" UI frameworks simply mimic their developer's favorite OS. I have seen way too many Android apps built with "cross-platform" frameworks that draw iOS UI elements.

The best way to do cross-platform applications with a GUI (I specifically avoid saying cross-platform UI) is to define yet another platform on top of a very basic common layer. This is what the Web did. All a browser asks from an OS is a rectangle (a graphics buffer) and the fonts to draw a webpage. Nothing else. The entire drawing functionality and behavior is redefined from scratch. This is the advantage of the Web, and this is why Electron works so well for applications deployed on multiple OSes.


> Ever used a GTK app under Windows?

I have created and used them. They didn't look terrible on Windows.

>What a browser asks from an OS is a rectangle (a graphics buffer) and the fonts to draw a webpage. Nothing else. Entire drawing functionality and the behavior is redefined from scratch. This is the advantage of Web..

I think that is exactly what GTK (and maybe even Qt) does too.

I think it is just that there is not much funding going to those projects. The Web, on the other hand, is an ad-delivery platform, so the sellers really want your browser to work and look good...


There's loads of funding. But the ones funding Qt and GTK aren't parties interested in things like cohesion or design standards. They just need a way to deliver their product to the user faster than maintaining 2-3 separate OS-native apps. Wanting that shipping velocity by its nature sacrifices those elements.

The remnants of the dotcom era definitely helped shape the web in a more design-conscious way, by comparison. Those standards are created and pushed a few layers above the one in which cross-platform UIs operate.


Or, hear me out, we can put these long I-beams on the ground and string some cables above them. Then tie 50 trucks to each other, and they can get electricity from anything you can make electricity out of.

Well we already have a lot of those, at least in North America (best freight rail system in the world), and it might make sense to build even more tracks in some areas. But rail will never be practical for time-sensitive cargo. It just takes too long to assemble a train and move cars through switching yards. We're always going to need a lot of trucks no matter what.

Nearly zero of the freight rail network in north America is electrified: https://en.wikipedia.org/wiki/Railroad_electrification_in_th...

Sort of. The locomotives are diesel electric series hybrids. Which means you can make one that can travel anywhere that isn't electrified but add a pantograph to it for minimal additional cost and then stop burning diesel anywhere that is, and electrify the lines piecemeal.

Add a battery car and you only have to electrify a minority of the lines to be off diesel a majority of the time.

Extra points for electrifying steep grades.


Any sufficiently advanced method of ground transportation contains an ad hoc, informally-specified, bug-ridden, slow implementation of half of a rail network.

Truly, rail fans are the transportation equivalent of vegans in food or CrossFit in exercise. I've spent many an hour on the Isle of Sodor and appreciate how useful those engines are in so many contexts. Yet still, there are buses that move alongside Percy and pick up stranded passengers, and the Fat Controller (aka Sir Topham Hatt) still has a sedan. It's a multi-modal world out there, and the tractor trailer still has a place in it.

It is not economies of scale but the entry cost, which increases for each new player entering the same market. Real-world markets are guarded, price-fixing oligopolies.

The most important thing a startup is expected to do is not to get profitable quickly but to suffocate all possibility of competition. Dysfunction is not a bug; it is a feature of our economic system.


Non-system programmers like to trivialize the choices of system programmers yet again. .NET is a GC platform running on a virtual machine. Bytecode compatibility and absolute performance are not that big a deal on such platforms. You cannot (and shouldn't) run .NET on deeply embedded systems and bare metal, where you want to strip out as much of the standard library as possible and want as little magic in it as possible. In a language built on a big hosted-system assumption, this forces the runtime to be split and forces developers to define big API boundaries.

The selling point of languages like Rust and C++ is that you can use the same compiler to write both bare-metal unhosted code (no-std, for bootloaders, microcontrollers and kernels) and hosted code (which uses std structures). As a system programmer who crosses the edge between the two environments, I would like to share as much code as possible, so a big standard library with a hosted-system assumption is a huge issue. In those cases you want the language to work 99% the same and to let you use the same structs and libraries. Sometimes you also want to write no-std code in hosted environments, for things like linkers.
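To sketch what that sharing looks like in practice (the function and names here are made up for illustration): code written against `core` alone compiles unchanged for a hosted binary and for a `#![no_std]` firmware crate; only the output sink differs.

```rust
use core::fmt::Write;

// Hypothetical shared code: it depends only on `core`, so the exact same
// function can live in a crate used by both firmware and host-side tools.
fn dump_reg<W: Write>(out: &mut W, addr: u32, val: u32) -> core::fmt::Result {
    write!(out, "reg {:#010x} = {:#010x}", addr, val)
}

fn main() {
    // Hosted side: String implements core::fmt::Write.
    // On bare metal you'd pass a fixed-size buffer wrapper instead.
    let mut s = String::new();
    dump_reg(&mut s, 0x4000_0000, 0xdead_beef).unwrap();
    println!("{}", s); // reg 0x40000000 = 0xdeadbeef
}
```

The point is that nothing in `dump_reg` assumes an OS; the hosted-only parts stay confined to the caller.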

Rust isn't even at C's level of maturity in this regard. Rust's std/core is too big for really memory-limited microcontrollers (<64 KB of space) and requires nasty hacks with weak ABI symbols to keep things sane.

Having the huge baggage of std both causes issues like this for users and increases the maintenance burden on the maintainers. Rust really wants to break its APIs as little as possible, and a small standard library is a great way to achieve that. C++ suffered a lot from this, and it hampered its adoption in C codebases.


Some of these non-system programmers are ex-system programmers, coding since the mid-80s, who fondly remember the days when C and C++ compilers had rich frameworks that would compete in features with what .NET and Java later came to be.

Unfortunately, too many modern system programmers never lived through that era, and are completely off on how nice the whole development experience could be.


I think those two things are orthogonal. I'm not against somebody bundling up nice Rust libraries and providing a pre-installed package or nice GUIs around them (like Borland used to do and Qt still kind of does), or an OS providing a nice set of libraries.

However, the standard library of a systems language has a special relationship with the compiler. This is the case for C and C++, where the compiler and the standard library also have a special relationship with the platform: GNU or musl with Linux, MSVC with Windows. It makes changing APIs or modernizing infrastructure almost impossible without creating an entirely new OS and porting all the compilers and standard libraries to it. Moreover, the newer C++ standards actually force you to define such a relationship (with std::initializer_list and the threading machinery). It is basically impossible to make an OS-agnostic C++ compiler that doesn't leak its own and the platform's internals to the user.

Luckily, Rust mostly abstracts over the platform-compiler boundary and its standard library, so the platform dependencies are implementation details. Unlike in C and C++, one can write Rust without caring about how the underlying OS does ABI. However, the Rust compiler and Rust's std do have a special relationship. `Box` can only be defined as part of the Rust standard library that's compiled together with the Rust compiler itself. That special relationship is one of the blockers for -Zbuild-std and a std-aware Cargo, which prevents size-optimizing std for embedded systems. Without that magic (i.e. compiling the compiler itself, or worse, bootstrapping it) you cannot independently create a `Box`.

I want this kind of library to contain as little as possible, precisely because it is so tempting to define these kinds of relationships and rely on magic. Modern C++ has too much of such magic. Rust is mostly on the correct path with its std, core, alloc etc. separation. These kinds of boundaries make it possible to share as much code as possible with many libraries without finding hacky ways around std (which you have to do with C++).
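To make the std/core/alloc split concrete, here is a minimal sketch: on a no-std target with a heap you opt into just the allocation layer via `extern crate alloc` instead of pulling in all of std. (Shown here compiled in a hosted environment for brevity; a real `#![no_std]` build would additionally need a global allocator and a panic handler.)

```rust
// The alloc crate sits between core and std: heap types, no OS assumptions.
extern crate alloc;

use alloc::boxed::Box;
use alloc::vec::Vec;

fn main() {
    // Box and Vec come from `alloc`, not `std`; a no-std kernel or firmware
    // image with a global allocator can use exactly these same paths.
    let boxed: Box<u32> = Box::new(7);
    let v: Vec<u32> = alloc::vec![*boxed; 3];
    assert_eq!(v, [7, 7, 7]);
    println!("{:?}", v); // [7, 7, 7]
}
```

This layering is what lets the same data-structure code serve both sides of the hosted/unhosted boundary.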

This doesn't mean that I wouldn't appreciate more actual functionality libraries maintained by Rust Foundation-funded people, part of the project, or at least easily installable. However, those libraries should be effortlessly exchangeable. I think the current Cargo ecosystem mostly achieves this. I would, though, appreciate a more curated Cargo repository containing only a limited set of really well-maintained packages (similar to Maven's repos in the Java world).


Why didn't those artifacts/relics survive into the modern era?

There's also something about the early 00s that made software developers in Java land go crazy: they decided to over-engineer software for no real benefit and came up with overly complex architectures that don't address the core issues, but rather imagined issues that turned out not to matter much in practice.


They did survive: Qt, VCL, FireMonkey, POCO. But the dark energy of the Electron force is too mighty.

Also, in the 2010s we had the rise of scripting languages, so we have a whole generation that never used compiled languages and is now re-discovering systems programming via Rust, Zig and co.

A history lesson: before OOP, there was the Yourdon Structured Method, and plenty of C enterprise architects jumped on it.

The GoF book used Smalltalk and C++, predating Java by a couple of years.

The Booch Method used C++, and predates Java by a decade.

Ah, and there was that whole operating system written in an OOP C dialect, drivers included: NeXTSTEP, which also survives to this day, with more consumer deployments than the Year of the Linux Desktop.


... and then figuring out where the hardware company cheaped out and created a whole unfixable mess (extra fun when you ship your first 10k batch and things start failing after the vendor made a "simple revision"). Then finding a workaround.

Formal instructions paired with tables are almost as rigid as code. By the way, normal engineering disciplines have a lot of strict math and formulas. Neither electrical nor mechanical engineering runs purely on instructions.

Most of it is owned by big telcos, the former national monopolies: Deutsche Telekom from Germany, NTT from Japan, AT&T, Level 3 and Lucent from the US, Vodafone from the UK, plus some private lines for Big Tech. There are lots of privately owned companies for connecting all sorts of big and small companies' infrastructure (cables and routers) together at Internet Exchange Points all over the world. Some of them are again owned by big telcos, some are private independent companies, some are government-owned, or any combination of those.

> We can build such a society. I am not sure why you think this is never possible.

Where does such informed political and economic interest and power exist? With whom do we construct such a society? Do they have the power and will to fight for it?

Normies live with normie standards, and with increasing social media exposure come ever more emotional, animal-like, manipulated worldviews. They are either ignorant or ambivalent.

Will tech people gather on a piece of land and declare independence? Most of my tech-worker colleagues are also quite pro-social-media, and they heavily use it to boost their apparent social status. We cannot even trust our own kind.

Similar examples of new technology being used to motivate and mobilize the masses have always ended with devastating wars and genocides. Previously, the speed at which information propagated gave statespeople like FDR an advantage in putting an end to rising racism/Nazism/violent tendencies (not everywhere, of course; left to its own devices, new technology is almost perfect for constructing dictatorships). Now everybody has equal access to misinformation.


This doesn't solve the issue that globalism caused. Europe doesn't make DRAM, nor does it have the know-how to quickly bring factories online; they usually take 10+ years.

We are tied to the American economy, and if AI companies start driving prices up, not only DRAM but basically everything will become more expensive.


America doesn't manufacture DRAM either; this is all South Korea and Taiwan.

??? Micron has DRAM megafabs in both Idaho and New York state.

https://en.wikipedia.org/wiki/Micron_Technology


They don't "have them", they're building them.

https://www.micron.com/us-expansion/id

> Micron has already achieved key construction milestones on its first Idaho fab with DRAM output scheduled to begin in 2027.

https://investors.micron.com/news-releases/news-release-deta...

> Production is expected to start in 2030 with the fabs ramping throughout the decade.

Until they start outputting DRAM in any meaningful quantity, they're not relevant.


> They don't "have them", they're building them.

According to Wikipedia, Micron Fab 6 in Virginia started production in 1997 and is still operating.


> "in any meaningful quantity"

Building a factory is one thing; they can have 50 of them built, but that doesn't mean much if all 50 together amount to like 0.1% of the company's output.

Once those factories scale up to 1-2%, then we can start considering that they've actually built a domestic supply, but that's a whole different goal than simply building the factories. Building factories is trivial. Making them output something is also "trivial". Scaling that up to a meaningful amount is a whole different, much harder goal to accomplish.


Are you sure? I'd think producing any RAM chips at all is much harder than scaling up a working production process.

Yes, I'm pretty confident you don't know anything about manufacturing at scale.

Say you wave a magic wand and build 15 new state-of-the-art factories tomorrow. Who's gonna run them? Does any location in the US have enough qualified workers who can simply take over and produce RAM in them from day 1 with no major fuckups?

No, you need a ton of time to teach thousands of people how to run those 15 factories. To even begin to teach people, you need to have 1 factory up and running. That 1 factory is at first going to be run by some of their existing workforce, temporarily migrated from South Asia. Only then can they start to teach the local populace how to run those factories on their own.

This is why it's much cheaper to simply build an additional one in South Asia than it is to build more than one in a whole new location. South Asia already has a bunch of workers who know what they're doing, because they've been doing it for a long time. Build a new factory, promote some of your existing workforce up the chain, fill the lowest positions with fresh graduates who are gonna be equally good every year, and you're good to go. It's nowhere near that simple in a brand-new location, where even the most optimistic scenario would take longer than a decade to produce a meaningful amount of output.


Wow, so it's almost like the workers are a key piece in producing any value at all.

Not to mention, given recent US immigration enforcement actions at various manufacturing plants, you can't even safely bring in overseas workers to train your domestic workforce...

It looks like there's still a big difference between how the US and the EU are responding to the chip supply wars. The US is actually building its own manufacturing capabilities domestically while the EU is apparently doing nothing, which is unfortunate.

There is https://www.ferroelectric-memory.com/memory-chip-factory/

There is also https://www.vishay.com/ which expanded several sites in .de, without much fuss or begging for subsidies. That is neither RAM, HBM, NAND, nor NOR, but it is nonetheless much-needed stuff for all the electrified cyber.


Infineon is _opening_ its fab in Dresden this year, which was supported by around 1bn euros from the EU equivalent of the CHIPS Act. They started building this fab in 2023, while TSMC, who started building its fab in the US right after covid, just delayed the opening to 2027.

The fab that Infineon is building is vastly smaller in scale, and their tech isn't really relevant to this discussion. For instance, it doesn't produce CPU/GPU microchips or DRAM. Also only 300mm wafer technology, which isn't competitive for anything except for some narrow industrial use-cases. Glad to see the EU is doing it, but it's a completely different thing.

Pretty much everyone is on 300 mm wafers for everything now, and has been for a while. Are you perhaps reading this as 300 nm process (which would usually be called 0.3 micron)?

But in the context of what we are talking about it's still true that nobody in the EU is making cutting edge CPU/GPU/DRAM and there are no plans to do so either (including that Infineon fab).


> Currently, 100% of leading-edge DRAM production occurs overseas, primarily in East Asia.[0]

They make DRAM for cars, not computers, in the USA. They've promised they'll bring manufacturing onshore soon, which effectively means they'll wait until Trump forgets about it.

0: https://www.nist.gov/news-events/news/2025/06/president-trum...


That's not how it works; the DRAM substrate (the actual chip that contains the memory cells) is shared. Only the packaging differs.

Are those plants still functional after CHIPS act was axed? I thought they mainly produce in Asia now.

Well first of all, the CHIPS Act was not "axed", it is federal law passed by an overwhelming bipartisan majority of the House and Senate. It would take a complete reversal of congress to repeal it and it's still very popular among both parties.

Where do you get your information from?


> Well first of all, the CHIPS Act was not "axed", it is federal law passed by an overwhelming bipartisan majority of the House and Senate. It would take a complete reversal of congress to repeal it and it's still very popular among both parties.

DOGE cut basically all staff from the CHIPS Program Office, congress passed the money but Trump is choosing to turn it into a slush-fund the admin spends on industrial policy (such as buying a stake in Intel). Wolfspeed went into bankruptcy in part because the admin delayed CHIPS funding agreed by the previous admin [1] (it's unclear whether they received the grant now that they have left it).

[1] https://www.ft.com/content/4aac09f9-19df-401a-9ab3-ef14a47bb...


Once again, where do you get your information from? The only thing that doesn't exist anymore is DOGE itself.

https://wtop.com/government/2025/11/doge-quietly-disbands-8-...


Does eliminating DOGE magically put Humpty Dumpty back together again?

You hire someone to rewire a few outlets in your kitchen. They burn down half your house. You fire the crappy electrician.

What's the state of your house now?


If DOGE disbanded but its actions haven't been undone, it doesn't matter that DOGE doesn't exist anymore.

Just like executing a serial killer brings back all the victims, right? No harm done!

> after CHIPS act was axed

This is news to a lot of Americans! The 2022 CHIPS and Science Act is codified federal law. I think a lot of states (Arizona, Idaho, New York) would be very interested to learn that the funding for the infrastructure that they are already building has somehow gone poof.


Intel is now partially government-owned (10%), and they got rid of some of the milestones. The current administration has been extremely poor at communicating changes, as well as constantly yanking funding (or threatening to) for projects; the chances of funding going poof are higher than ever.

American companies are driving the global economy insane. Currently the American political administration sides with the AI companies, since it gives the impression that the economy is doing well. If things start to go sideways, the US government can put pressure on its local companies like Micron to supply other sectors.

Europe doesn't have local manufacturers, so it cannot exert control over them to keep its internal/strategic market sane. All European hardware manufacturers have to put up with, and compete at, irrationally inflated prices.


And China with CXMT.

Europe has stopped making DRAM relatively recently (Qimonda).

This should not have been allowed to happen.


Guess who bought Qimonda's patents?

Who?

Micron

and facilities went to China's Inspur.

Isn’t there also basically 0 American DRAM?

Micron Technology, Inc. is an American semiconductor company that manufactures computer memory

They don't produce them within the US. They're building some factories to do so in the future, but as of now their output is 0.

However, the US government has, or can have, control over Micron's production. They are headquartered in the US. They have the intellectual property and the know-how to erect a vertically integrated supply chain. Europe doesn't have this strategic investment.

Micron/Crucial has bailed on the consumer market. Enterprise only now, FWIW.

America doesn't really produce RAM either.

The newfound desire to move away from American cloud providers isn’t related to pricing, it’s about the perception of growing instability within the American government, the perception of deteriorating freedom of speech, and the perception of an increasingly non-neutral business environment.

E.g., if I’m running a business in the US and I don’t kiss Trump’s ring (and pay bribes), if he becomes dictator for life in 2028, all bets are off for my business.

Both the EU and USA import the majority of their computer equipment, and the USA is placing heavy and unpredictable tariffs on those goods. It’s hard to argue that a business should bet that data centers will be cheaper in the US than in the EU if Trump is the last democratically elected president.

The most stable places to do business in 2026 are probably the EU and China.

