Hacker News | gruturo's comments

> Samsung has no vision.

I entirely agree with you, and profoundly dislike them, but it's clearly working for them if their financials don't lie. While most other manufacturers bleed money, Samsung had healthy profits on smartphones last time I checked. It still puzzles me that anyone would buy them at all, but I've long accepted that I'm not a representative sample.

So given that, I don't see why they would bother coming up with a vision after all this time.


I managed to purge myself of Apple as of a couple of years back by getting an S24 Ultra.

Main things that stand out over Apple:

- Much higher-resolution camera with pretty incredible zoom, though overall picture quality is a much closer comparison.

- S Pen, mostly used for its remote-control capability; shame they dropped that for the S25...

- Samsung DeX. I use my phone as my laptop daily; I've also used it as a dumb terminal for remote gaming while travelling, which works exceptionally well.

- Access to alternative browsers, ad blocking, alternate stores, side loading apps etc

While Google is no angel, Apple actively works against open systems and control of your own devices. I'm glad to be out of that ecosystem.


Samsung seems to be targeting a sweet spot: "Costs less than Apple, superficially looks like an iPhone, product lineup includes smaller form factors, good enough."

It doesn't work for me, but that's because I courageously use my headphone jack.


Predictable and extremely low costs for less critical stuff. My 2 main ones are respectively around 4 and 8 EUR per _year_.

I use them to run WireGuard to evade geoblocks when I'm travelling, and to host a few redundant monitoring scripts that alert me to reachability issues with the more critical stuff I care about; they also serve as contingency access channels to my home (and Home Assistant) if my primary channels are down.

I get no support, no updates, it's all on me - which is fine, since it allows me to stay current and not lose hands-on practice with skills I need for my job anyway (and which are my passion anyway). I don't even get an entire IPv4 address - I get... 1/3000th of one? (21 ports; the rest are forwarded to other customers.) Suits me fine.
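As an illustration of the kind of redundant monitoring script mentioned above, here is a minimal sketch using only the standard library (the hosts and ports are placeholders, and a real version would send an alert rather than print):

```python
import socket

def is_reachable(host, port, timeout=5.0):
    """Return True if a TCP connection to host:port succeeds within timeout."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Hypothetical endpoints to watch; in practice these would be the
# "more critical stuff" the script keeps an eye on.
TARGETS = [("example.com", 443), ("example.org", 80)]

if __name__ == "__main__":
    for host, port in TARGETS:
        status = "up" if is_reachable(host, port, timeout=3.0) else "DOWN"
        print(f"{host}:{port} {status}")
    # A real script would alert here (mail, push notification, etc.).
```

Run from cron on each VPS, a handful of lines like this is enough for the redundancy described.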


And it's always that price, apart from bandwidth overage on some but not all providers.

It also fits into a handful of bytes or kilobytes what would take half a gigabyte to communicate as video - sometimes making the difference if you have limited bandwidth or a cap on monthly traffic.

It's also ridiculously easy to cache (download a book in 9 seconds, board a transoceanic flight - no problem)

It also doesn't require the right sound and lighting conditions the way seeing and understanding a video does (either those conditions or good noise-cancelling headphones - and now you're unaware of your surroundings).

It's also the only viable option on insanely low power devices which get months of battery life per charge.

It's also something you can read at an incredibly speedy pace if you are good at it and practice - though occasionally a decent audio/video player will be of use with this.

It's also something you can fall asleep while consuming, and tomorrow you won't have much trouble finding exactly where you left off.

I could continue...


It's also the only medium where semantic reasoning and indexing at scale makes financial sense. I can run RAG over millions of text rows in Postgres for pennies, but the compute costs to process and embed video content are still prohibitive if you care about margins.
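To illustrate why text is so cheap to index semantically, here is a from-scratch toy: plain cosine similarity over made-up vectors, standing in for what an extension like pgvector would do over real model embeddings (the documents and numbers below are entirely illustrative):

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Tiny made-up "embeddings"; a real system would get these from a model
# and store them in a vector column in Postgres.
docs = {
    "cats": [0.9, 0.1, 0.0],
    "dogs": [0.5, 0.5, 0.1],
    "tax law": [0.0, 0.1, 0.9],
}
query = [0.85, 0.15, 0.05]
best = max(docs, key=lambda k: cosine(query, docs[k]))
```

The point is that retrieval over text rows reduces to arithmetic like this, which is why the compute bill stays in pennies; doing the same over video first requires an expensive decode-and-embed pass per clip.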

True. But there are 2 ways to read this:

1: Yes, let's stick with ICE cars and die of preventable illnesses because EVs are only a massive improvement, rather than absolute perfection

2: Hey let's take this massive improvement and enjoy enormously cleaner air

I meet way too many people from group 1 unfortunately.


That's exactly what I wrote: "it's a step up".

Awesome article Ken, I feel spoiled! It's always nice to see your posts hit HN!

Out of curiosity: Is there anything you feel they could have done better in hindsight? Useless instructions, or inefficient ones, or "missing" ones? Either down at the transistor level, or in high-level design/philosophy (the segment/offset mechanism creating 20-bit addresses out of two 16-bit registers, with thousands of overlaps, sure comes to mind - if not a flat model, though that's asking too much of a 1979 design and its transistor limitations, I guess)?
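To make the overlap concrete: the 8086 forms a 20-bit physical address as segment*16 + offset, so many segment:offset pairs name the same byte. A quick illustrative sketch:

```python
def phys(segment, offset):
    """8086 real-mode address translation to a 20-bit physical address."""
    return ((segment << 4) + offset) & 0xFFFFF

# Two different segment:offset pairs hitting the same physical byte:
assert phys(0x1234, 0x0005) == 0x12345
assert phys(0x1000, 0x2345) == 0x12345

# Addresses past 0xFFFFF wrap around to the bottom of memory
# (the behaviour later papered over by the A20 gate):
assert phys(0xFFFF, 0x0010) == 0x00000
```

For a typical physical address, roughly 4096 different segment values can reach it with some offset, hence the "thousands of overlaps".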

Thanks!


That's an interesting question. Keep in mind that the 8086 was built as a stopgap processor to sell until Intel's iAPX 432 "micro-mainframe" processor was completed. Moreover, the 8086 was designed to be assembly-language compatible with the 8080 (through translation software) so it could take advantage of existing software. It was also designed to be compatible with the 8080's 16-bit addressing while supporting more memory.

Given those constraints, the design of the 8086 makes sense. In hindsight, though, considering that the x86 architecture has lasted for decades, there are a lot of things that could have been done differently. For example, the instruction encoding is a mess and didn't have an easy path for extending the instruction set. Trapping on invalid instructions would have been a good idea. The BCD instructions are not useful nowadays. Treating a register as two overlapping 8-bit registers (AL, AH) makes register renaming difficult in an out-of-order execution system. A flat address space would have been much nicer than segmented memory, as you mention. The concept of I/O operations vs memory operations was inherited from the Datapoint 2200; memory-mapped I/O would have been better. Overall, a more RISC-like architecture would have been good.

I can't really fault the 8086 designers for their decisions, since they made sense at the time. But if you could go back in a time machine, one could certainly give them a lot of advice!


As someone who did assembly coding on the 8086/286/386 in the 90s, the xH and xL registers were quite useful for writing efficient code. Maybe 64-bit mode should have gotten rid of them completely, though, rather than only when a REX prefix is present.

AAA/AAS/DAA/DAS were used quite a lot by COBOL compilers. These days ASCII and BCD processing doesn't use them, but writing efficient routines without them takes very fast data paths (the microcode sequencer in the 8086 was pretty slow), large ALUs, and very fast multipliers (to divide by constant powers of 10).
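For anyone unfamiliar with what DAA did: its effect after an ADD on packed BCD can be sketched in a few lines. This is a simplified model of the documented single-byte behaviour, with the flags reduced to a decimal carry:

```python
def bcd_add(a, b):
    """Add two packed-BCD bytes, mimicking x86 ADD followed by DAA."""
    raw = a + b
    if (a & 0x0F) + (b & 0x0F) > 9:  # low digit overflowed (AF case): add 6
        raw += 0x06
    if raw > 0x99:                   # high digit overflowed (CF case): add 0x60
        raw += 0x60
    return raw & 0xFF, raw > 0xFF    # (result byte, decimal carry out)

assert bcd_add(0x19, 0x28) == (0x47, False)  # 19 + 28 = 47
assert bcd_add(0x99, 0x01) == (0x00, True)   # 99 + 1 = 100, carry out
```

One instruction doing this per byte is exactly the kind of thing a COBOL compiler's decimal arithmetic leaned on.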

I/O ports have always been weird though. :)


> I can't really fault the 8086 designers for their decisions, since they made sense at the time. But if you could go back in a time machine, one could certainly give them a lot of advice!

Thanks for capturing my feeling very precisely! I was indeed wondering what they could have done better with approximately the same number of transistors and the benefit of a time traveler :) And yes, the constraints you mention (8080 compatibility, etc.) do limit their leeway, so maybe we'd have to point the time machine a few years earlier and influence the 8080 first.


What's that military adage? Something along the lines of 'generals always prepare to fight the last war'?

There's also the needs of the moment. Wasn't the 8086 a 'drop in' replacement for the 8080, and also (offhand recollection) limited by the number of pins on some of its package options? This was still an era when it was common for even multiple series of computers from a single vendor to have incompatible architectures that required, at the very least, recompiling software, if not whole new programs.


This is nothing new. The Army has been doing this forever. A certain General Failure was reading my C: drive all the way back in the 80s.

I'll show myself out..


You know you can get a lightning-to-C adapter for very little, right? Here you go, under $2 each: https://www.amazon.com/Lightning-Adapter-Charging-Transfer-C... (probably under $1 each if you have the patience to look for them in other sites)

And a lot of chargers don't have a cable built in; they just have a USB-A or -C port, so it's just a matter of replacing the cable. But again, if you'd rather not do even that, you're welcome to keep using your old cable with a USB-C adapter.


Oh cool it's not just me doing exactly this.

Sticking to pure Zigbee devices with zigbee2mqtt and slae.sh's excellent USB coordinator. A couple of weeks ago I bought a bunch of spare IKEA Zigbee devices before they go out of stock. Around 2030 I'll take a look at whether Thread/Matter is anywhere near mature and has settled.


Are the IKEA Zigbee devices going to stop being sold? Massive shame if so; they are extremely reliable and easy to use.


IKEA's whole smart home ecosystem is presently being overhauled from Zigbee to Thread/Matter, with a product availability gap in the meantime.

https://www.ikea.com/global/en/newsroom/retail/the-new-smart...


Oooh, thank you for sharing! The new product lineup looks interesting, but I echo other concerns here about Thread maybe eventually requiring internet access.


What gap though? Our local IKEA has plenty of lights, smart plugs, etc. available still.


I bought some spare pieces (remotes, bulbs) just in case.


Personally, I find their contact sensors (the tall-ish thin ones) to be quite unreliable. I live in a modest home with plenty of Zigbee devices as repeaters nearby, and the contact sensors often stop reporting at random. I'll pop one off the door, click re-pair on my coordinator, then hit the reset switch on the sensor; it's back online.

I like them because they can use rechargeable AAA batteries, but if I still have to touch them every few weeks to re-pair, I'd rather switch to a different brand that is more reliable, even if it uses less ideal battery formats.

That said, the newish Inspelning plugs in the EU market are fantastic. They report reliably, can handle larger loads, and cost about €10. For that price, it's hard to complain that they are a bit larger than other options.


Side question, but where would one learn how to do this? Any guides, Reddit? The home automation market seems such a mess every time I check it out.


Easiest way is to put HAOS (Home Assistant OS) on a Raspberry Pi, Home Assistant Green, or some NUC:

https://www.home-assistant.io/installation/

Then get a coordinator recommended for zigbee2mqtt:

https://www.zigbee2mqtt.io/guide/adapters/

Then install and start the following add-ons in Home Assistant:

- Mosquitto (MQTT broker)

- zigbee2mqtt: https://www.zigbee2mqtt.io/guide/installation/03_ha_addon.ht...

And that's pretty much it; you can add devices through the zigbee2mqtt add-on page. They will also become available as entities in the rest of Home Assistant, and you can make graphs, dashboards, actions, etc.

You can also install and run zigbee2mqtt and Mosquitto directly on a Linux machine, but HAOS gives you more of an integrated solution with dashboards, graphs, backups, cloud access, etc.
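For reference, the zigbee2mqtt add-on only needs a few core settings; the values below are illustrative (your serial port in particular depends on which coordinator stick you bought):

```yaml
# Illustrative zigbee2mqtt configuration fragment, not a drop-in file
mqtt:
  server: mqtt://core-mosquitto:1883  # the Mosquitto add-on's hostname inside HAOS
serial:
  port: /dev/ttyUSB0                  # path to your Zigbee coordinator stick
permit_join: false                    # flip to true temporarily while pairing devices
```

Keeping permit_join off except while pairing is the usual hygiene so random neighbours' devices don't join your network.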


Feed that comment into an AI (Claude suggested). Let it know what you have, and just work out a numbered-list roadmap. Love AIs for that!


> The law got SO convoluted over 9 years of interpretation by the European courts that its now impossible to be 100% compliant

It absolutely isn't. I set up a blog for a friend where she shows her art and publishes an itinerary/schedule of appearances. It doesn't collect ANY info from visitors, and therefore requires no cookie banner at all. Simple as that.

HTTP logs are retained for 7 days for security analysis and then wiped. No analytics are available, although my understanding is that a self-hosted Matomo instance set to anonymize the last 2 bytes of every IP it ingests would still be considered exempt from a banner.


> HTTP logs are retained for 7 days

There you go. The moment you save any information that can help identify someone, for any period, you are within the scope of the law. God forbid you keep the IPs for any reason.

> for security analysis

The law doesn't give a zit about what you do it for. If you retain any personal info or set any cookie, you have to tell the user about it and give them options.

> Matomo instance

Hahaha - Matomo itself is non-compliant with the law. Its developers think that anonymizing info, or collecting bits and pieces of functional info and setting a cookie for that purpose, allows you not to show a banner. That's wrong. It doesn't matter what you collect info or set a cookie for - the moment you set a cookie, you have to show a cookie banner and state exactly what you are collecting and what you are using it for. Even for functional cookies.

The only way you can be compliant with this law is by setting an Apache header or something to delete all cookies the moment they are set, so that you won't leave any cookie behind. Even in that case, you may be liable, because you are holding that information even for a few milliseconds (yeah, as a techie you think that's not important, but the law doesn't work that way). Your best chance is a server that does not set any cookie or collect any info in any way. Good luck preventing spam, fraud, and DDoS with such a setup.
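For what it's worth, the "delete all cookies at the server" approach described above can be sketched with Apache's mod_headers; this strips cookies in both directions (whether it satisfies a lawyer is another matter):

```apache
# Requires mod_headers. Drop any cookie the client sends to the application...
RequestHeader unset Cookie
# ...and strip any Set-Cookie the application tries to emit to the client.
Header always unset Set-Cookie
```

The "always" condition matters: without it, Set-Cookie headers on error responses can slip through.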


An outcome I'm entirely fine with. Those industries are _not_ divinely entitled to fabulous wealth by violating one's privacy. I won't shed a tear if they don't survive once they are blocked from spying.

