This has been a commonplace feature on SoCs for a decade or two now. The comments seem to be taking this headline as out-of-the-ordinary news, phrased as if OnePlus invented it. Even cheapo devices often use an eFuse as anti-rollback. We do it at my work whenever root exploits are found that let you run unsigned code. If we don't blow an eFuse, those security updates can just be undone: any random enemy with hardware access could plug in a USB cable, flash the older exploitable signed firmware, steal your personal data, install a trojan, etc. I get the appeal of ROMs/jailbreaking/piracy, but it relies on running obsolete exploitable firmware. It's not like they're forcing anyone to install the security patch who doesn't want it. This is normal.
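For readers who haven't run into it, here's a minimal sketch of how eFuse-based anti-rollback typically works in a bootloader. All the names are illustrative, not any vendor's actual API:

```c
#include <stdbool.h>
#include <stdint.h>

/* Hypothetical SoC primitives; every vendor names these differently. */
extern uint32_t efuse_read_rollback_count(void); /* number of fuses blown so far */
extern void     efuse_blow_rollback_fuse(void);  /* one-way: physically irreversible */
extern void     abort_boot(const char *reason);

/* Each signed firmware image carries a rollback index in its (signed) header. */
bool check_anti_rollback(uint32_t image_rollback_index)
{
    uint32_t fused_minimum = efuse_read_rollback_count();

    /* Older than the fused floor: the signature may be perfectly valid,
       but the image is refused anyway. */
    if (image_rollback_index < fused_minimum) {
        abort_boot("anti-rollback: image predates fused minimum version");
        return false;
    }

    /* Newer image: ratchet the floor upward. This is the irreversible step.
       Once a fuse is blown, every older (still validly signed) image is
       rejected forever. */
    while (fused_minimum < image_rollback_index) {
        efuse_blow_rollback_fuse();
        fused_minimum++;
    }
    return true;
}
```

The ratchet only ever moves up, which is the whole point and, for the people in this thread, the whole problem.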
It ain't normal to me. If I bought a phone, I should be able to decide that I want to run different software on it.
Let's say OP takes a very different turn with their software that I am not comfortable with - say, reporting my usage data to a different country. I should be able to say "fuck that upgrade, I'm going to run the software that was on my phone when I originally bought it"
This change blocks that action, and from my understanding, if I try it, it bricks my phone.
The whole point of this is so that when someone steals your phone, they can't install an older vulnerable version of the firmware that can be used to reset it to factory settings, which makes it far more valuable for resale.
Phone thieves aren't checking which phone brand I have before they nick my phone. Your scenario is not improved by making OnePlus phones impossible to use once they're stolen.
> It reduces the expected value of stealing a phone, which reduces the demand for stolen phones.
It's not at all obvious that this is what happens. To begin with, do you regard the average phone thief as someone who even knows what expected value is?
They want drugs so they steal phones until they get enough money to buy drugs. If half the phones can't be resold then they need to steal twice as many phones to get enough money to buy drugs; does that make phone thefts go down or up?
On top of that, the premise is ridiculous. You don't need to lock the bootloader or prevent people from installing third-party software to prevent stolen phones from being used. Just establish a registry of the IMEIs of stolen phones so that carriers can consult it and refuse to provide service to stolen phones.
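To make the privacy properties concrete, here's a sketch of the carrier-side check, with made-up names. As I understand it, the GSMA already operates a shared stolen-IMEI database along roughly these lines, though participation varies by country:

```c
#include <stdbool.h>
#include <stddef.h>
#include <string.h>

struct registry; /* opaque handle to the shared stolen-IMEI denylist */

/* Assumed lookup primitive: answers only "has this IMEI been reported
   stolen?" and nothing else. */
extern bool registry_contains(const struct registry *r, const char *imei);

/* Run at network attach. The carrier already sees the IMEI; the registry
   adds only a single yes/no bit to what they know. */
bool allow_network_attach(const struct registry *r, const char *imei)
{
    if (imei == NULL || strlen(imei) != 15)
        return false;                    /* malformed identifier: reject */
    return !registry_contains(r, imei);  /* reported stolen: refuse service */
}
```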
It's entirely unrelated to whether or not you can install a custom ROM and is merely being used as an excuse because "prevent theft somehow" sounds vaguely like a legitimate reason when the actual reason of "prevent competition" does not.
> It's not at all obvious that this is what happens.
This is what we've empirically seen as Apple went from having devices which could trivially be reflashed and resold without much impediment to a situation where most iPhones are locked and their hardware parts are cryptographically tied together.
There is a lot of "how to lie with statistics" going on with correlations like that. To begin with, property crime rates have been declining year over year in general, so "it was lower the year after X" is the expected result whether or not X actually did any good. This is especially true when a year -- like the one in question -- follows an epidemic of thefts, and subsequent years see large declines simply from reversion to the mean.
Then clickbait headline authors do their favorite thing and find a table of numbers, sort by size and choose the biggest one. 50% in London! That's probably not an outlier, right? But down to 25% by the time they get to city number 3, and no other cities are listed.
Likewise, when there are a lot of thefts then everyone tries a lot of solutions, and then some subset of them do something (or just reversion to the mean again) and everybody wants to claim it was their thing that solved it.
But if it was their thing, and their thing is still in place, then the theft rate shouldn't be going back up again, right? Yet it is:
> It's not at all obvious that this is what happens. To begin with, do you regard the average phone thief as someone who even knows what expected value is?
They know if their fence went from offering them $20/phone to offering $5/phone, it's not worth their time to steal phones any more.
> Just establish a registry for the IMEI of stolen phones so that carriers can consult the registry and refuse to provide service to stolen phones.
This seems like something that the average HNer is going to get equally riled up about as a surveillance and user freedom issue.
> They know if their fence went from offering them $20/phone to offering $5/phone, it's not worth their time to steal phones any more.
Except that phones are worth significantly more than both of those numbers or nobody would be stealing them to begin with, and they have a value floor (what they're worth disassembled for parts) which is above what many people would be willing to steal in order to get. And then we're back to: if you need X amount of money to buy drugs, and the number of phones you have to steal to get X amount of money doubles, how many phones are they going to steal now?
> This seems like something that the average HNer is going to get equally riled up about as a surveillance and user freedom issue.
The only thing on the list is stolen phones. The phone carrier consulting the list would have your IMEI regardless. The only information anyone would get from the list is that the owner of a phone with a particular IMEI has reported it as stolen.
The main thing you need to do is have a good way to prevent someone from reporting someone else's phone as stolen, and "make that a crime, and require people filing a theft report to show valid ID so they can be prosecuted if they're committing that crime" is probably a pretty good way to do that.
Thieves don't always get the news right away, but when you work hard to steal a bunch of phones and can't sell them for anything, you don't get your fix and you find something else to steal and sell.
Regulations have made it pretty hard to sell catalytic converters. There are still thefts because some thieves are really out of the loop, but I think it's been reduced by a lot. Still a few people who want to fill up their stolen trailer with cats before they go to the scrap yard, though.
A strong lock system that prevents stolen phones from being used is better than a global IMEI denylist, because phones that can't connect to a cell network but are otherwise usable still have value, some networks won't participate in a global list, and some phones can have their IMEI changed if you can run arbitrary software on them (which is maybe a bigger issue, but the steal phone -> wipe -> change IMEI -> resell path is still stopped if you can't wipe the stolen phone).
> Thieves don't always get the news right away, but when you work hard to steal a bunch of phones and can't sell them for anything, you don't get your fix and you find something else to steal and sell.
Thieves figure that out pretty quick, and they still seem to be stealing plenty of phones.
> Regulations have made it pretty hard to sell catalytic converters
This is the equivalent of having a list of stolen phones.
> A strong lock system that prevents stolen phones from being used is better than a global IMEI denylist because phones that can't be connected to a cell network but are otherwise usable still have value
It's pretty likely that this value is lower than, or approximately the same as, the value of the phone as individual parts.
> some networks won't participate in a global list
Thieves want to sell phones in rich countries where people can afford to buy them. Get the rich countries to use the list and nobody is going to be stealing iPhones so they can pay $10 to ship them to sell in Somalia for $5. For that matter it's going to make a huge dent even if yours is the only country using the list, because most thieves are not going to use an international fence.
> some phones can have their IMEI changed if you can run arbitrary software on them
So the manufacturers who want to do something like this should prevent that rather than preventing people from running arbitrary software in general.
It seems like you're trying too hard to defend the premise. Having a list of stolen IMEIs would be significantly effective. "What about this marginal edge case?" is like saying: preventing thieves from selling stolen catalytic converters would be significantly effective, but they could hypothetically ship them to Somalia and sell them there, so we need OEMs to lock down everyone's cars instead.
That seems more like an excuse to lock down everyone's devices than an actual concern about the marginal edge case which itself could be addressed in various ways without doing something with such high costs to competition. Assuming the edge case was even significant, which it probably isn't.
I find it hard to believe that OnePlus is spending engineering and business resources, upsetting a portion of their own userbase, and creating more e-waste because they want to reduce the global demand for stolen phones. They only have like 3% of the total market; they can't realistically move that needle.
I don't understand what business incentives they would have to make "reduce global demand for stolen phones" a goal they want to invest in.
It'd be ideal if the phone manufacturer had a way to delegate trust and say "you take the risk, you deal with the consequences" - unlocking the bootloader used to be this. Now we're moving to platforms treating any unlocked device as uniformly untrusted, because of all of the security problems your untrusted device can cause if they allow it inside their trust boundary.
We can't have nice things because bad people abused it :(.
Realistically, we're moving to a model where you'll have to have a locked down iPhone or Android device to act as a trusted device to access anything that needs security (like banking), and then a second device if you want to play.
The really evil part is that things that don't need security (like, say, reading a website without a login - just establishing a TLS session) might go away for untrusted devices as well.
> We can't have nice things because bad people abused it :(.
You've fallen for their propaganda. It's a bit off topic from the OnePlus headline, but as far as bootloaders go, we can't have nice things because the vendors and app developers want control over end users. The Android security model is explicit that the user, vendor, and app developer are each party to the process and can veto anything. That's fundamentally incompatible with my worldview and I explicitly think it should be legislated out of existence.
The user is the only legitimate party to what happens on a privately owned device. App developers are to be viewed as potential adversaries that might attempt to take advantage of you. To the extent that you are forced to trust the vendor they have the equivalent of a fiduciary duty to you - they are ethically bound to see your best interests carried out to the best of their ability.
> That's fundamentally incompatible with my worldview and I explicitly think it should be legislated out of existence.
The model that makes sense to me personally is that private companies should be legislated to be absolutely clear about what they are selling you. If a company wants to make a locked down device, that should be their right. If you don't want to buy it, that's your absolute right too.
As a consumer, you should be given the information you need to make the choices that are aligned with your values.
If a company says "I'm selling you a device you can root", and people buy the device because it has that advertised, they should be on the hook to uphold that promise. The nasty thing in this thread is the potential rug pull by OnePlus, especially as they have kind of marketed themselves as the alternative to companies that lock their devices down.
I don't entirely agree, but neither would I be dead set against such an arrangement. Consider that (for example) while private banks are free not to do business with you, at least in civilized countries there is a government-associated bank that will always do business with anyone. Mobile devices occupy a similar space; there would always need to be a vendor offering user-controllable devices. And we would also need legal protections against app authors, given that (for example) banking apps are currently picking and choosing which device configurations they will run on.
I think it would be far simpler and more effective to outlaw vendor controlled devices. Note that wouldn't prevent the existence of some sort of opt-in key escrow service where users voluntarily turn over control of the root of trust to a third party (possibly the vendor themselves).
You can already basically do this on Google Pixel devices today. Flash a custom ROM, relock the bootloader, and disable bootloader unlocking in settings. Control of the device is then held by whoever controls the keys at the root of the flashed ROM, with the caveat that if you can log in to the phone you can re-enable bootloader unlocking.
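A rough sketch of the verification logic that makes this work, under the assumption that the bootloader accepts a user-enrolled key the way Pixels do (the real implementation is Android Verified Boot; these names are invented):

```c
#include <stdbool.h>
#include <stddef.h>

typedef struct {
    const unsigned char *bytes;
    size_t               len;
} pubkey;

/* Assumed bootloader primitives. On Pixels the user's key lives in a
   dedicated slot, flashed over fastboot before relocking. */
extern bool   custom_key_enrolled(void);
extern pubkey read_custom_key(void);
extern pubkey builtin_oem_key(void);
extern bool   image_signature_valid(const pubkey *trusted_root);

bool verified_boot_allows_image(void)
{
    /* The trusted root is either the OEM's baked-in key or, if the owner
       enrolled one, the owner's key. "Locked" means only images signed by
       the trusted root will boot; with a custom key, that root is you. */
    pubkey trusted = custom_key_enrolled() ? read_custom_key()
                                           : builtin_oem_key();
    return image_signature_valid(&trusted);
}
```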
How is that supposed to fix anything if I don't trust the hypervisor?
It's funny, GP framed it as "work" vs "play" but for me it's "untrusted software that spies on me that I'm forced to use" vs "software stack that I mostly trust (except the firmware) but BigCorp doesn't approve of".
Well I don't entirely, but in that case there's even less of a choice and also (it seems to me) less risk. The OEM software stack on the phone is expected to phone home. On the other hand there is a strong expectation that a CPU or southbridge or whatever other chip will not do that on its own. Not only would it be much more technically complex to pull off, it should also be easy to confirm once suspected by going around and auditing other identical hardware.
As you progress down the stack from userspace to OS to firmware to hardware there is progressively less opportunity to interact directly with the network in a non-surreptitious manner, more expectation of isolation, and it becomes increasingly difficult to hide something after the fact. On the extreme end a hardware backdoor is permanently built into the chip as a sort of physical artifact. It's literally impossible to cover it up after the fact. That's incredibly high risk for the manufacturer.
The above is why the Intel ME and AMD PSP solutions are so nefarious. They normalize the expectation that the hardware vendor maintains unauditable, network capable, remotely patchable black box software that sits at the bottom of the stack at the root of trust. It's literally something out of a dystopian sci-fi flick.
> any random enemy with hardware access could plug in a USB cable, flash the older exploitable signed firmware, steal your personal data, install a trojan, etc
A lot of my phones stopped receiving firmware updates long ago; the manufacturer just simply stopped providing them. The only way to safely use them is to install custom firmware that still addresses the problems, and this eFuse thing can be used to prevent custom firmware.
This eFuse is part of the plot to prevent users from running open source firmware, it's just that. The "user safety" jargon cannot confuse people anymore, after all that people (at least the smart few) have learned over the years.
On most devices, anti-rollback means "older firmware won't boot" or "you lose secure features." Here it seems to mean "try it and you permanently brick the device," with no warning in the updater and no public statement explaining the change.
I don't know about most devices, but for all the ones I've messed with, eFuse anti-rollback always "bricked" them if you rolled back. It was a natural consequence of the firmware essentially being a binary with a USB flashing mode, plus a bootloader to continue into the operating system. If the firmware can't load at all due to a failing eFuse check, then you can't load into flashing mode. The same thing would happen if you wrote garbage to the bootloader partition. That's enough for customers and journalists to call it "permanently bricked". There might be some SoC recovery mode that lets you load a newer bootloader into RAM, but it would need some software tooling from the SoC manufacturer, and at that point few customers will figure it out.
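In other words, the brick falls out of the ordering of the boot flow: the flashing mode lives inside the same firmware image that the check refuses to run. A sketch, with invented names, assuming an anti-rollback check like the one sketched earlier in the thread:

```c
#include <stdbool.h>

enum boot_reason { BOOT_NORMAL, BOOT_FLASH };

extern bool             anti_rollback_ok(void);      /* the eFuse check */
extern enum boot_reason read_boot_reason(void);
extern void             enter_usb_flashing_mode(void);
extern void             continue_to_os(void);
extern void             hang_forever(void);

void firmware_entry(void)
{
    /* The eFuse check runs before anything else, including the code path
       that implements USB flashing mode. */
    if (!anti_rollback_ok()) {
        hang_forever(); /* no flashing mode, no recovery: "permanently bricked" */
    }

    /* Only a firmware image that passed the check ever offers to reflash. */
    if (read_boot_reason() == BOOT_FLASH)
        enter_usb_flashing_mode();

    continue_to_os();
}
```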
Sounds like that should be an option in "Developer Options" that defaults to true, and can only be disabled after re-authentication / enterprise IT authorization. I don't see anything lost for the user if it were done this way.
Why don't they work the same way PCs do with UEFI and Secure Boot, where users decide which certificates go in as trusted roots, so they can install their own OS? I'm surprised there haven't been any anti-trust suits over this by competitor ROM makers.
Once they have hardware access, who cares? They either access my data or throw it in a lake. Either way the phone is gone, and I'd better have had a good data backup and a level of encryption I'm comfortable with.
This not only makes it impossible to install your own ROMs, but permanently bricks the phone if you try. That is not a choice my hardware provider should ever get to make.
It's just another nail in the coffin of general computing, one more defeat of what phones could have been, and one more piece of personal control that consumers will be all too happy to give up because of convenience.