Conceivably, a state actor could use this bug to eavesdrop on an espionage target, no?
Well, let's try to conceive it. Our state-level actor is now in possession of an exploit that lets them eavesdrop on a target when they text-dictate or activate Siri, while wearing particular Apple headphones. After getting the target to install a specific malicious app from the App Store. And to run it. And to give it Bluetooth permission. And to make sure to restart it whenever they reboot their phone or the phone kills it for any reason. The value of this as state-level surveillance malware feels a lot closer to $0 than $7,000 to me, but I'm happy to hear a different conception of how this might work.
You're not wrong from a technical perspective, but typically the purchaser would be a broker that re-sells these types of exploits to a state-level actor, or even to another broker. Said brokers are interested in acquiring exploits that check certain boxes for their gov buyers, and anything that checks the iOS box is always going to be a hot commodity.
Remember, at the end of the day the sale is to a government, and governments have deep pockets and less common sense.