I am not a fan of this book. I wrote a review of it as an HN comment. I dislike this book so much that, for the first time in my history on HN, my comment overflowed the length limit for HN comments. So, here it is:
This is fantastic, thanks for taking the time to write that out. (And my condolences to the author... I know very well how hard it is to have your work so thoroughly slammed but in this case it's important.)
One minor nit; I believe there is a typo in the review which caused parsing problems on my end:
The contact forward secrecy provides a design is...
I was planning a new view, which is just the user-submitted text, and that will be live within the day. (Similar to how on imgur.com you can link to the image directly, or the image with adverts.)
But I've been loath to add extra metadata so far because I see it as a barrier. I'm open to persuasion, though.
Do you have any books that you can recommend as an alternative for learning cryptography (for people with some exposure/programming experience but not experts in crypto)?
> The randomness requirement for DSA is such a huge problem for DSA that it forms the centerpiece of Daniel Bernstein's critique of the NIST standards process. This is the bug that broke the PS3.
I've been reading about this recently, but I'm confused. Wasn't the PS3 issue that the key was static? That's not the opposite of randomized, because there could be a deterministic, yet message-specific scheme for generating the secret key. Or is that known to be unacceptable?
> there is concern that the NIST curves are backdoored and should be disfavored and replaced with Curve25519 and curves of similar construction.
I've heard the opposite, that the curves are not the problem but the choice of basepoint. E.g. [1] states: "...the vulnerabilities in Dual-EC have precisely nothing to do with the specifics of the NIST standard elliptic curves themselves. The 'back door' in Dual-EC comes exclusively from the relationship between P and Q -- the latter of which is published only in the Dual-EC specification."
Is it still a matter of dispute whether the curves themselves are suspect? Or is the referenced blog post outdated?
No, the PS3 problem was that the per-message DSA nonce ("k") was repeated (and thus not random). I'm not sure about the rest of your paragraph; if you're referring to deterministic alternatives to standard DSA, yes, they exist. Again, see Bernstein.
Dual_EC isn't a curve standard, it's a random number generator standard. The NIST P- curves are thought to possibly be backdoored. Dual_EC is all but certainly backdoored.
So you're saying that there is something inherently insecure about the constants defining the curves themselves. I've read some of Bernstein's stuff on 25519, and everything there is about selling curve25519 as making the engineering challenges easier by avoiding common pitfalls of the Weierstrass normal form.
And I can't find anyone anywhere giving evidence of mathematical insecurity of any NIST standard curves. Evidence might come in the form of a (significantly more than state of the art)-subexponential but still superpolynomial time algorithm. Dual_EC has something even better: a proof of insecurity.
And I quote, "So what's the problem? Answer: If you implement the NIST curves, chances are you're doing it wrong"
This is just reinforcing my point. There is nothing known to be mathematically wrong with the standard curves. Bernstein just warns against all the (admittedly many) pitfalls in implementations, and that the Weierstrass normal form makes it easier to run into problems than the normal form he proposes. This is the only reason he says NIST doesn't guarantee security, and of course they don't guarantee against engineering errors.
But that's extremely different from saying NSA planted backdoored curves intentionally. The only thing in Bernstein's analysis that I could possibly construe as suggesting malicious behavior is that the NIST curves are outdated (the suggestion being that they are intentionally left outdated).
Second, if NSA backdoored a curve standard, they probably did it in a fashion that only allows them privileges. Google [NOBUS NSA]. Dual_EC is a NOBUS backdoor, unless you can efficiently solve the ECDLP, in which case the backdoor doesn't matter anyways.
Finally, even if you stipulate for argument that a curve was backdoored in such a way that a researcher might find the backdoor, who's to say that curve researchers care that much about Bitcoin?
Wow, thanks for your comprehensive review! I only got a chance to skim the first couple of chapters, and initially thought it was just a set of examples of how to use Go's crypto library. Seems like the whole thing should be taken with a grain of salt.
Thanks. But why does crypto have to be equivalent to "heart surgery"? If I were to describe the state of the art of Internet security, I wouldn't apply that term.
"It's nuanced and detail oriented" would be my answer. I'm not a cryptographer, but in my dealings and exploration I've found lots of issues that just aren't obvious, e.g. timing attacks and length extension attacks. There are also many things that are obvious, but require care to do, e.g. single-use nonces and authentication.
A mistake in crypto can invalidate your entire system, not just make it unreliable or crash, and those mistakes don't have to be something obvious, there are many insidious little things that can happen as well. That's my take on it.
That said, I believe that software engineers should learn basic crypto and fiddle around with their own ideas, _with the understanding that there is always someone smarter than them_ in order to understand some of the problems they'll be facing.
I know Kyle personally. He is super smart, incredibly humble, and willing to teach. And I always admire anyone taking the initiative to try and produce something useful.
In response to Jabbles, this is from his (work in progress) book. Page 7.
// Implement the standard padding scheme for block ciphers. This
// scheme uses 0x80 as the first non-NULL padding byte, and 0x00 to
// pad out the data to a multiple of the block length as required.
// If the message is a multiple of the block size, add a full block
// of padding. Note that the message is copied, and the original
// isn't touched.
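For reference, the scheme that comment describes (ISO/IEC 7816-4-style padding) could be sketched like this. This is my own code illustrating the comment, not the book's implementation:

```go
package main

import (
	"bytes"
	"fmt"
)

// pad appends 0x80, then 0x00 bytes, up to a multiple of blockSize.
// A message already on a block boundary gets a full extra block, so
// unpadding is always unambiguous. The input slice is not modified.
func pad(msg []byte, blockSize int) []byte {
	padLen := blockSize - len(msg)%blockSize // always in 1..blockSize
	out := make([]byte, len(msg), len(msg)+padLen)
	copy(out, msg)
	out = append(out, 0x80)
	return append(out, bytes.Repeat([]byte{0x00}, padLen-1)...)
}

func main() {
	p := pad([]byte("YELLOW SUBMARINE"), 16) // 16 bytes -> full extra block
	fmt.Println(len(p), p[16] == 0x80)       // 32 true
}
```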
As anyone who has written technical material knows, drafts tend to be error-prone, and you rely heavily on others to help catch your mistakes. Technical editors are the normal tool used by the traditional publishing world. As this is a Leanpub book, I feel that the responsibility generally falls on the community: those who have paid for the book and want to see it succeed.
I personally have purchased this book as well as reported errors to Kyle.
I'd encourage anyone with questions or comments to reach out to Kyle. He is quick to respond.
I can't comment on the author's cryptographic knowledge, but his Go seems non-idiomatic, and in some cases plain weird.
For example, this function stores the length of a slice in a variable, creates a new slice of the same length, copies all the bytes from the old to the new, then checks that the length of the new one is the same as the old (stored variable) - since it cannot have changed this check never fails. Hardly inspires confidence.
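Reconstructing the pattern described (this is a paraphrase, not the author's exact code), the final check is dead code, since make and copy already guarantee the lengths match:

```go
package main

import "fmt"

// dupVerbose is roughly the pattern described above: the length
// check can never fail, because make and copy guarantee it.
func dupVerbose(in []byte) []byte {
	n := len(in)
	out := make([]byte, n)
	copy(out, in)
	if len(out) != n { // dead code: always false
		panic("unreachable")
	}
	return out
}

// dup is the idiomatic one-liner: append onto a nil slice.
// (Go 1.20+ also has bytes.Clone for this.)
func dup(in []byte) []byte {
	return append([]byte(nil), in...)
}

func main() {
	fmt.Println(string(dup([]byte("abc")))) // abc
}
```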
It happens that in this library, with padding errors delivered as error return values, checking a MAC and then obliviously decrypting won't leak the broken pad values, and so doesn't fatally compromise the security of the system --- presuming your code (a) uses the author's idiosyncratic 80h+00h padding scheme, or (b) uses PKCS7 padding like every other system and carefully checks that the pad bytes don't run it off the end of a slice.
But in virtually every other language in which people implement cryptography, the pattern the author uses snatches defeat from the jaws of victory by creating the opportunity for padding exceptions even after the code has verified for itself that the message it's decrypting can't possibly be valid.
Obviously, the bigger problem is that you're more than halfway through the book before you find out that the example from Chapter 2 is totally hosed.
You're right, and it's by-the-by; I missed that decrypt() had moved on to CTR in that code.
If it had been CBC with ISO7816 or PKCS#5 padding then I do think this would be attackable. The unpadding errors from decrypt trump the MAC error thanks to the "err == nil && !match" expression.
Fortunately, it's not giving a distinctive error, and always runs both the MAC and the decryption so timing won't distinguish, either. The big problem with this pattern is how disastrous it is in every other language (coupled with the lack of any rationale for decrypting on MAC failure).
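For contrast, the pattern being argued for here --- encrypt-then-MAC, verify the tag first, return one generic error for every failure, and never run the cipher (let alone unpadding) on a forged message --- might look like this sketch. This is my own construction, not the book's; CTR mode as in the code under discussion:

```go
package main

import (
	"crypto/aes"
	"crypto/cipher"
	"crypto/hmac"
	"crypto/sha256"
	"errors"
	"fmt"
)

var errOpen = errors.New("decryption failed") // one generic error for all failures

// seal produces iv || CTR-ciphertext || HMAC-SHA256 tag over the rest.
func seal(encKey, macKey, iv, msg []byte) []byte {
	block, _ := aes.NewCipher(encKey) // sketch: error handling elided
	body := make([]byte, len(iv)+len(msg))
	copy(body, iv)
	cipher.NewCTR(block, iv).XORKeyStream(body[len(iv):], msg)
	m := hmac.New(sha256.New, macKey)
	m.Write(body)
	return m.Sum(body)
}

// open verifies the tag before touching the ciphertext, so no
// padding or decryption code ever runs on a forged message.
func open(encKey, macKey, blob []byte) ([]byte, error) {
	if len(blob) < aes.BlockSize+sha256.Size {
		return nil, errOpen
	}
	body, tag := blob[:len(blob)-sha256.Size], blob[len(blob)-sha256.Size:]
	m := hmac.New(sha256.New, macKey)
	m.Write(body)
	if !hmac.Equal(m.Sum(nil), tag) {
		return nil, errOpen // stop here: nothing downstream sees forgeries
	}
	block, err := aes.NewCipher(encKey)
	if err != nil {
		return nil, errOpen
	}
	pt := make([]byte, len(body)-aes.BlockSize)
	cipher.NewCTR(block, body[:aes.BlockSize]).XORKeyStream(pt, body[aes.BlockSize:])
	return pt, nil
}

func main() {
	encKey := make([]byte, 16)        // demo-only all-zero key
	macKey := []byte("separate MAC key")
	iv := make([]byte, aes.BlockSize) // demo-only fixed IV; must be random per message
	blob := seal(encKey, macKey, iv, []byte("attack at dawn"))
	pt, err := open(encKey, macKey, blob)
	fmt.Println(string(pt), err) // attack at dawn <nil>
	blob[len(blob)-1] ^= 1       // tamper with the tag
	_, err = open(encKey, macKey, blob)
	fmt.Println(err) // decryption failed
}
```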
Books like this scare the hell out of me. People can't even seem to use tried-and-true crypto libraries without screwing something up most of the time. Now you want them to write their own? This sounds like an absolute disaster waiting to happen.
Same difference. Golang provides cryptographic primitives but few fully-implemented constructions; for instance, if you use Golang's CBC, you have to implement your own padding.
Any Golang program that implements crypto will also need to implement an incomplete and poorly-specified version of NaCl.
(That's true of virtually every other programming language as well.)
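Since the thread keeps coming back to it, here is what "implement your own padding" for Go's CBC typically means with PKCS#7, as a sketch with the bounds checks mentioned above (function names are mine):

```go
package main

import (
	"bytes"
	"errors"
	"fmt"
)

// pkcs7Pad: every padding byte equals the padding length (1..blockSize).
func pkcs7Pad(msg []byte, blockSize int) []byte {
	n := blockSize - len(msg)%blockSize
	return append(append([]byte(nil), msg...), bytes.Repeat([]byte{byte(n)}, n)...)
}

// pkcs7Unpad does the bounds checking the thread warns about: the
// claimed pad length must not run off the slice, and every pad byte
// must match. (Not constant-time; run it only after the MAC check.)
func pkcs7Unpad(msg []byte, blockSize int) ([]byte, error) {
	if len(msg) == 0 || len(msg)%blockSize != 0 {
		return nil, errors.New("bad padding")
	}
	n := int(msg[len(msg)-1])
	if n == 0 || n > blockSize || n > len(msg) {
		return nil, errors.New("bad padding")
	}
	for _, b := range msg[len(msg)-n:] {
		if int(b) != n {
			return nil, errors.New("bad padding")
		}
	}
	return msg[:len(msg)-n], nil
}

func main() {
	p := pkcs7Pad([]byte("hello"), 8)
	fmt.Println(p) // [104 101 108 108 111 3 3 3]
	msg, err := pkcs7Unpad(p, 8)
	fmt.Println(string(msg), err) // hello <nil>
}
```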
I agree to some extent, but I wanted to make clear that this book is not about implementing AES or SHA algorithms. It's about using them (the author actually makes that very clear, the algorithms are treated as black boxes).
Yes you can still use them wrong, and you're not much more advanced. But the original comment seemed to suggest it was about implementing the low-level algorithms.
There's an important distinction here that most people miss, namely the difference between a 'cryptographer' and a 'cryptography engineer'. A cryptographer builds schemes and protocols and proves their security. (Phil Rogaway is a cryptographer. Bellare, Boneh, Rivest, etc.) A cryptography engineer writes code that implements schemes developed by cryptographers. To be a good cryptography engineer you really need to learn how to break systems. In contrast, some of the best and most important cryptographers never even touch a keyboard except to typeset research papers.
I don't understand - are you contesting that this distinction exists at all? Here are some counterexamples: Shafi Goldwasser, Oded Goldreich, Matt Franklin, Craig Gentry, Chris Peikert.
Breaking crypto systems isn't the same as breaking "into" systems.
The history of cryptography can basically be described as a series of assumptions which turn out to be invalid. This is either because they were never valid to begin with (and it just took time for problems to be discovered and shaken out), or because the facts on the ground change, making them invalid.
I'm not a cryptographer. I also don't really call myself a hacker anymore (because the loaded assumptions around that word make it useless as a descriptor; you wind up having to explain it in so much detail that it's just easier to start out with a different word), but I do know some cryptographers. They have all described the process in a similar way, which is that you start by learning about the pitfalls of everything that's come before you.
Much like if you were learning to build bridges, you'd spend time learning about past bridges that didn't hold up.
So you start with the oldest crypto systems, and learn why those fell out of fashion. One nice benefit of this approach is that it makes it fairly apparent how brittle these constructions are. I don't think I've ever met a cryptographer who isn't suitably hesitant about designing cryptographic systems. Also, a good portion of the time spent as a cryptographer (maybe most of it? I'd love to hear a dissenting view from an actual cryptographer) is in breaking cryptographic systems (initially other people's, and then later, your own).
Cryptography isn't like web frameworks in the sense that everyone is making their own. New crypto systems (at least ones that come from cryptographers) don't spring up out of the ether every week.
One problem that seems to come up in cryptography is that cryptographers themselves "seem" to be mostly only concerned with the primitives. They leave implementation as "an exercise to the reader". This is a problem because I'd say that the overwhelming majority of actual security problems that stem from cryptography aren't problems with the primitives. They're problems with the construction necessary to do anything useful with those primitives. That's why when people heed advice like "Use AES" they're probably screwed.
You're pretty much on point. When designing a scheme, whether a low-level cipher or a higher-level protocol, only a small fraction of the time is spent on the initial design itself. That is not to say that a scheme is designed and promptly finalized; analysis results feed back into the design process, of course. But the analysis is generally by far the most time-consuming task.
Additionally, sane people do not use building blocks that they don't trust. If you need a block cipher you use AES (which is partially backed by security proofs) instead of designing your own (unless you have very specific requirements). This saves immense time in both design and analysis. Same with modes and constructions (Feistel structures, sponges, HMAC, etc.). Proofs play a big role here, although it is important to understand what they actually say, instead of blindly assuming they mean that whatever you use them for is secure.
You are right that implementation is generally left as an afterthought. This is slowly changing, and primitives are increasingly being designed for implementation and side-channel friendliness.
The intent of the book, as stated by the author, is:
    It assumes that you aren't looking for cryptographic theories, but practical ways to use cryptography in your projects.
If you want to build a cryptographic library for other developers to consume, it's imperative that you understand the theory behind crypto principles. The author is telling you that this isn't the book for that.
http://pastebin.com/raw.php?i=NXgU30xK
(as a Github gist:
https://gist.githubusercontent.com/anonymous/3cc34251e501c2c...)