@emboss
Created September 4, 2012 18:46
Secure Installation or the notion of a "Trusted Path"

Secure installation of OpenSSL FIPS module.

While looking into the "FIPS mode" of OpenSSL recently, I found this. What puzzled me was the footnote about "secure installation" and the details from section 6.6 of the OpenSSL FIPS User Guide. To count as a valid installation that fulfills all of the requirements, users must verify the integrity of the OpenSSL FIPS sources with an independently acquired, FIPS 140-2-validated cryptographic module. The programmer in us immediately shouts "Infinite recursion!", and what at first seems like an overly academic troll on behalf of the CMVP turns out to be a delicate issue: not a novel one, but one with far-reaching consequences for the status of any "validated" piece of software in general. Or, as the OpenSSL FIPS User Guide puts it:

Note this last point is downright mind-boggling: it amounts to an assertion that essentially all installations of validated software modules are illegitimate.
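
For concreteness, the check that section 6.6 mandates is an HMAC-SHA-1 over the distribution tarball, computed under the fixed key "etaonrishdlcupfm" published in the User Guide itself and compared against the digest published for the release. Here is a minimal Ruby sketch (the tarball name is illustrative); running it with a stock Ruby/OpenSSL is, of course, exactly the recursion in question:

```ruby
require "openssl"

# HMAC key published in the OpenSSL FIPS User Guide for tarball verification.
FIPS_HMAC_KEY = "etaonrishdlcupfm"

# Compute the HMAC-SHA-1 of the downloaded source tarball ...
tarball = File.binread("openssl-fips-2.0.tar.gz")
digest  = OpenSSL::HMAC.hexdigest(OpenSSL::Digest.new("SHA1"), FIPS_HMAC_KEY, tarball)

# ... and compare it against the digest published for that release.
# To satisfy the CMVP, this computation itself would have to be performed
# by an already-validated module; hence the infinite recursion.
puts digest
```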

Secure installation of Gems

This kept me thinking about the situation we have with Ruby (or any other software system that offers downloadable packages performing security-relevant tasks). The current state of the art is to acquire your gems via https download. This is certainly an improvement over having no guarantees whatsoever with plain http downloads, but I am a huge advocate of signing gems with a digital signature. Apart from some technical benefits, this would give us one big advantage over https. By default, any uploaded gem would be signed with a key in RubyGems' possession, but gem authors could opt in to sign their gems with individual keys/certificates acquired on their own. The default case provides guarantees similar to those of an https download, but the situation for individually signed gems would change dramatically: a user who has successfully validated such a gem has the guarantee that the gem has not been changed since it was signed by the author (or, more correctly, by the holder of the certificate/key). That is much better than "the gem has not been changed since RubyGems signed it", which is all the assurance an https download (or a RubyGems-made signature) can offer.
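
RubyGems actually ships the machinery for the opt-in case already. As a minimal sketch (gem name, file paths, and email are illustrative): the author generates a self-signed certificate with `gem cert --build author@example.com` and references the resulting key pair in the gemspec, so that `gem build` signs the package:

```ruby
Gem::Specification.new do |spec|
  spec.name    = "example"
  spec.version = "1.0.0"
  spec.summary = "An illustrative signed gem"
  spec.authors = ["Author"]
  spec.files   = ["lib/example.rb"]

  # Opt-in signing: if a signing key and cert chain are given,
  # `gem build` signs the gem's contents with them.
  spec.signing_key = File.expand_path("~/.gem/gem-private_key.pem")
  spec.cert_chain  = ["certs/author.pem"]
end
```

A user who has imported the author's certificate with `gem cert --add certs/author.pem` can then enforce verification at install time via `gem install example -P HighSecurity`, which rejects unsigned gems and signatures from untrusted certificates.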

The FIPS troll about "secure installation" now challenges this whole picture, and rightfully so, if we set the impracticality of such strictness aside. For both cases, signed gems and https-downloaded gems, the CMVP reasoning goes like this: "I don't care whether you validated that gem signature or downloaded it over https, because I don't know whether your software for doing so is working correctly."

Chicken or Egg?

But then, how are we supposed to get that initial frickin' "trusted module" in the first place?! Clearly, because of the recursive nature of the problem, it can't be done using existing software. For lack of a clear response from the CMVP (I bet they themselves have no clear answer yet), OpenSSL started shipping the FIPS code on CDs. In my experience that's done a lot: whenever trust issues come up, good old snail mail is used, in combination with a CD that contains the software (you know, a read-only storage medium). Of course this is not a satisfying solution to the problem. If I know that company X ordered their CD (because I might be the one who mailed that order), I can still swap the package for one with my own very special CD inside. Another practice, often used with smart cards, is face-to-face handover. In our case, this would mean that anybody who wants to acquire a "trusted module" would have to pick up their copy in person. While this is probably fine for tamper-proof hardware, what have we gained in the case of software? The client now has a physical read-only medium with the source code on it, but still no guarantee that the clerk at the desk handed over the right thing. The vendor is now assured of having followed the guidelines, but still has no guarantee about what happens on the client's computer. What if the client's gcc is infected? What about the client's OS, what about existing malware? What about the hardware?

Conclusion

The whole secure-installation chicken-and-egg problem is quite interesting, but with generic hard- and software it seems intractable at this point. Even with initiatives such as Trusted Computing, current validation procedures are mostly incredibly annoying chores that buy very little additional assurance of anything. We would need some form of "live proof" of correctness; trying to validate each and every link in the chain seems infeasible due to the recursive nature of the problem. It just feels too much like "yo dawg, I heard you like validation, so I put validation in your validation so you can validate while you validate". The cynical among us might point out that the mere existence of this problem renders the whole security thing moot, but that can't be the answer either. I think that, for lack of better alternatives, the discussion should be about where to draw the line of practicality. Requiring folks to send CDs all over the place seems to be below that line.

I'd be very happy to learn about initiatives that aim at solving this dilemma. Trusted Computing is the one I know about, but I never really liked it because it seems impractical in the light of upgrades, updates etc.

@jfirebaugh

current validation procedures are mostly incredibly annoying chores that buy very little additional assurance of anything

In the case of FIPS, what it buys (at incredibly high cost for both the vendor and the buyer) is the assurance that one is not buying security snake oil: "Joe's Own Encryption Algorithm -- now with 8096 bit keys!" There's very little assurance that the FIPS-approved algorithms were competently implemented or that the trusted module was securely received and properly installed and configured. I doubt if many organizations purchasing FIPS-validated software even care about adhering to the User Guide; my impression is that they mostly use validation as a bozo filter during the purchasing process.

In practice, FIPS validation for competently implemented (i.e. non-snake oil) software makes it less secure: since the costs of validation are so high, vendors have a considerable incentive to let vulnerable versions ride rather than go through the hassle of revalidating a fixed version.

@emboss (Author) commented Sep 4, 2012

@jfirebaugh I fully agree. FIPS validation is more or less a marketing thing, a badge acquired at extremely high cost. And in practice it causes a lot of problems, since critical security updates would break the "validation seal". From my own experience, certifications like FIPS validation are pretty much the entry ticket to certain businesses. Again, I'm totally with you there: it doesn't matter at all what the software actually does, because only a handful of people would know anyway; it's just the "branding" that counts. I guess it helps the people who have to make the decisions to keep their record clean in case things go wrong. It's better to be able to argue "but we only used FIPS-validated software components" than "but in my opinion algorithm X was way more secure". It's sad but understandable. I know of examples where banking applications weren't allowed to use JavaScript at all because the word "script" alone was considered evil. Implementing the whole thing as an applet using the Java plug-in was considered safe, though. Recent events have shown us how safe :)
