keyringlight's comments | Hacker News

I think the iPod being so common has helped it remain useful 20 years on. A lot of them are sold cheaply as people clean out old drawers, spares/replacements and upgrades are readily available, how to work on them is common knowledge, tools for working on consumer electronics are common, and Rockbox is available for most of the range. MP3 players were a commodity good with a huge range of models from practically any company capable of making one, but most didn't have that market around them. My iPod has a 128GB SD card instead of a HDD and a battery that lasts at least 5.5 days of playback instead of whatever the original 650 mAh allowed.

I'd wonder how much that scales up, though, for the benefit of the companies that are each investing hundreds of billions and hoping to see a net return. How many developers like you (presumably fewer of you, since each is more productive), or enterprises you work for paying fees (along with slimming down legacy costs paid to someone else), does it take to get up into the 12-digit range?

No idea, and not my problem. I'm surprised I've been downvoted so much in these comments. I'm not saying OpenAI et al. are good companies, a good financial scenario, or a good investment.

The technology is amazingly powerful. Full stop.

The constraint that drives cost is technical: semiconductor prices. Semiconductors are manufactured commodities, and those costs will drop over time. The Sun workstation I bought for $40k in 1999 would get smoked by a $40 Raspberry Pi.

Even if everyone put their pencils down and stopped working on this stuff, you’d get a lot of value from the open source(-ish) models available today.

Worst case scenario, LLMs are like Excel. Little computer programs will be available to anyone to do what they need done. Excel alone changed the world in many ways.


Assuming you're already running a PC with a desktop OS, you can use virtualization to get your toes wet and try Linux without diving in entirely with a real install. On Windows, VirtualBox is free for non-commercial use and pretty simple to set up.
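If you'd rather script the setup than click through the GUI, VBoxManage can do the whole thing from a PowerShell prompt. A rough sketch, where the "LinuxTest" name, the 25GB disk, and the ubuntu.iso path are all placeholders to swap for your own:

    # create and register the VM, then give it RAM and CPUs
    VBoxManage createvm --name "LinuxTest" --ostype Ubuntu_64 --register
    VBoxManage modifyvm "LinuxTest" --memory 4096 --cpus 2
    # create a 25GB virtual disk, attach it and the installer ISO
    VBoxManage createmedium disk --filename LinuxTest.vdi --size 25600
    VBoxManage storagectl "LinuxTest" --name "SATA" --add sata
    VBoxManage storageattach "LinuxTest" --storagectl "SATA" --port 0 --device 0 --type hdd --medium LinuxTest.vdi
    VBoxManage storageattach "LinuxTest" --storagectl "SATA" --port 1 --device 0 --type dvddrive --medium ubuntu.iso
    # boot the VM into the installer
    VBoxManage startvm "LinuxTest"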

If you've got a spare drive, then install onto that, leaving your existing install alone. Or, if you have spare space on your existing drive, you can shrink a partition (back up important data first) and set up a multi-boot.
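On the Windows side the shrink can be done from Disk Management, or scripted with diskpart. A rough sketch, saved as shrink.txt and run with "diskpart /s shrink.txt" from an elevated prompt; the volume number and the ~40GB figure are examples, so check the list output against your own layout first:

    rem pick the Windows volume (2 here is only an example)
    list volume
    select volume 2
    rem shrink by 40960 MB (~40GB) to leave room for Linux partitions
    shrink desired=40960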


WSL2 is available to get one's toes wet as well, without needing to go through the install process of any Linux flavor, or needlessly involve Oracle.

Sure, it doesn't have a full desktop environment, but one can run GUI apps and easily pass files between the two systems.
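A minimal sketch from an admin PowerShell on a recent Windows 10/11, assuming the default Ubuntu distro (GUI apps work out of the box via WSLg on Windows 11):

    wsl --install -d Ubuntu    # one-time setup, then reboot
    wsl                        # drop into the Linux shell
    ls /mnt/c/Users            # Windows drives are mounted under /mnt
    explorer.exe .             # open the current Linux dir in Explorer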


My impression for the past decade or so is that Radeon for PC is AMD's way of keeping their GPU division's engine running between contracts for consoles or compute products. At the very least it's a welcome byproduct that provides a competent iGPU and a test bed for future 'main' products. It's been a long while since AMD has shown a future vision for PC GPUs, or led with a feature instead of following what others do.

> My impression for the past decade or so is that Radeon for PC is AMD's way of keeping their GPU division's engine running

During this time AMD was focused on CPUs. They've already said that they'll focus more on GPUs now (since CPUs are way ahead and AI is a thing), so this should change things.


It's interesting to see parallel development of certain features in the early 3D era and how they were used. The original Prey by 3D Realms was shown in demos in 1997/98 with portal tech, including a rotating portal in a level prop, so it ties into the game fiction instead of being only functional for stitching map areas together. There are likely more examples from that period, when licensing an engine was less common.

The way I see live tiles is that MS abandoned the widgets that had existed since Vista (although those were later removed for security reasons) and came up with a new thing to start all over with, and didn't backport it, so the only way you'd get them was on the (less popular) new version of the OS. Also, they were tied into the start screen/menu; you couldn't drop one on your desktop.

There's also the issue of the distance a mouse cursor has to travel to select something. I think the general issue is imposing one interface for every mode of input instead of offering options: either select an appropriate interface depending on how the start menu was invoked (even if it's just scaling it down to a confined space), or let people pick the default however it's invoked. Yes, that's more work, but when we're talking about the largest corporations on the planet I struggle to believe they can't afford it.

IIRC, Win8 was the last Windows to have thick graphical window borders, and that was after they got rid of the textured Aero look from Vista/7, so at that point you at least had something graphical to grab onto that (mostly?) matched where the cursor was. From Win10 onwards they shrunk the border down to one pixel, with a zone around it where you can click off the window but still affect it.

In the back of my mind I think part of this was a move to make scaling on high-resolution monitors (i.e. 4K+) work better, as a graphical border of fixed pixel width shrinks proportionally, compared to a border that is as thin as it can be. For a while I've felt it's a missed opportunity on high-res displays not to use more detailed art for window chrome, as pixel-wide elements only get smaller and harder to distinguish, such as the minimize/maximize/close icons, which remain pixel-wide line art even at big scaling.


Isn't disabling hibernate "powercfg /hibernate off"? I'm not aware of a simple and obvious UI switch to do the same.
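For reference, from an elevated PowerShell prompt (/a just lists the sleep states the machine supports):

    powercfg /a              # list available sleep states
    powercfg /hibernate off  # disable hibernation, deletes hiberfil.sys
    powercfg /hibernate on   # turn it back on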


I'm pretty sure hibernate has defaulted to off for quite a while and has to be turned on if desired (at least, that was the case for the last several machines I've bought new).

The UI switch is not particularly obvious, at Control Panel → Hardware and Sound → Power Options → System Settings.


One caveat: you might want to account for text that the writer deleted staying deleted/hidden. I'm not always the best at proofreading before submitting, but I'll often cut tangents I'm prone to rambling onto. Accidental pastes from other sources that are meant to stay private would be another issue if the history tracker grabs everything that goes into the text box.

