The end of the post very specifically and carefully does not describe a framework for rolling out files to exploit these vulnerabilities; the files, as described, do nothing and serve purely aesthetic purposes. It's easy to read that as a wink that they are exploiting the vulnerabilities they found while maintaining plausible deniability, but it's equally possible it's the other way around: they aren't rolling out exploits but want people reading the blog post to believe that they are. Or they want to lay out the framework so that others can follow through, without actually doing so themselves. As written, it's essentially unverifiable, obviously on purpose.
The optimal thing for them to do would be to build the framework and ship partially corrupted JPEGs that don't actually do anything nasty to Cellebrite. Cellebrite can verify that the machinery is there (so it's not a totally idle threat), but no one can prove that Signal has actually done anything illegal. Cellebrite then wastes a bunch of time gathering and analyzing the files without actually learning anything from them. They also get more incentive to start finding and fixing their software's vulnerabilities, which throws off their work schedules.
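To make that concrete, here's a minimal sketch (in Python) of what such a decoy could look like: take an ordinary JPEG, leave its headers intact so it still looks like a normal image to a cheap signature check, and scramble part of the entropy-coded scan data so any parser that digs deeper hits malformed input. Nothing here exploits anything; it just produces an inert file that wastes analysis time. The filenames are hypothetical.

    def corrupt_jpeg_scan(src_path: str, dst_path: str) -> None:
        data = bytearray(open(src_path, "rb").read())

        # Find the Start Of Scan marker (FF DA); everything before it
        # is header metadata we leave untouched so the file still
        # looks like a normal JPEG.
        sos = data.find(b"\xff\xda")
        if sos == -1:
            raise ValueError("no SOS marker found; not a baseline JPEG")

        # Skip the SOS segment itself (2-byte marker + length field).
        seg_len = int.from_bytes(data[sos + 2:sos + 4], "big")
        scan_start = sos + 2 + seg_len

        # XOR a stretch of the compressed scan data, stopping short of
        # the trailing EOI marker. Decoders will either render garbage
        # or bail out with a parse error; either way the file is inert.
        for i in range(scan_start, min(scan_start + 4096, len(data) - 2)):
            data[i] ^= 0x5A

        open(dst_path, "wb").write(bytes(data))

    corrupt_jpeg_scan("ordinary.jpg", "decoy.jpg")

The point is that the result is indistinguishable, from the outside, from a file that actually does something: anyone scraping it has to spend real analysis effort to find out it's a dud.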
And Signal can develop a few land mines to deploy at any time, and just... hold on to them for a rainy day.
Even if Signal put the files out there and explicitly owned up to it, I struggle to see how it could be even remotely illegal. It's not Signal's fault that some other company's faulty product falls apart when it ingests bad data from their app.
Agreed. Obviously, I'm not a lawyer, so who knows. But it seems ridiculous that you could break into someone else's device, extract all of their files, and then come after them legally because a file you effectively stole didn't parse properly on your computer.
At some point if someone breaks through multiple levels of advanced security to, say, steal your gun and then shoot themselves in the face with it, whose fault is that really...