Certificate compromises?

Hi,

Any "news"/"developments" regarding how robust "signed binaries" (et al.) have proven to be IN PRACTICE? I.e., any known exploits? Any "social engineering" exploits?

And, what are the relative merits of signing against a *single* keyset distributed in the target vs. allowing certifying authorities ("Bob will vouch for me...")?

Thx,

--don

Reply to
Don Y

formatting link
is the most famous example.

A lot of your questions seem off-topic for comp.arch.embedded. You might get better answers if you take a moment to find the appropriate newsgroup.

Reply to
Paul Rubin

This doesn't indicate how pervasive a problem it is (or is not). If, OTOH, it were commonplace for portions of the CA to be "forged" (for want of a better social-engineering term), then the entire concept would merit closer scrutiny.

It also doesn't address how aware *users* are of these issues (do folks ignore the warnings re: unsigned binaries and "install anyway"?). *I* routinely download drivers from known manufacturer web sites that have *not* been signed -- yet I am 99.99999327% sure they are "genuine" and, thus, "install anyway". [There have been a couple of scandals in recent memory where "golden masters" were compromised]

Ah, *your* embedded devices used MASKED ROM and "return to factory" for firmware updates? Or, do you just not consider firmware *delivery* integrity to be important for your products?

Can you point to some *other* common use of CA technology that will have a readership versed in the other issues that pertain to embedded systems? E.g., devices that *don't* always have displays, keyboards, interactive users *and* live network connections?

As to my other recent posts:

- Decompiling "MS" Drivers: I suspect I will find a greater number of souls who have *any* experience with drivers OF ANY SORT here than in an "MS-specific" newsgroup. I imagine 80+% of everyone reading, here, has some experience with a driver -- even if only an ad hoc defined one! Would you suggest I find an MS-specific group with 0.01% in that same capacity?

- "Stable" time references: What group would better address the design of devices that might rely on a timebase of suitable accuracy/precision? Any theoretical discussion has to be grounded in *practice*. It's one thing to specify a "laboratory reference" -- quite another for a reference that might see use on a battlefield, in a restaurant or in a mine.

- "RJ45" crimp connector flavors: Ah, my mistake. No one here uses network technology or designs network interfaces into their products. "Embedded" must mean "has no connection to the outside world".

- "IVR" et al. in the 21st century: Again, I guess all those devices that *don't* talk to the outside world are very selective about just *how* they "don't talk to it". And, how they ignore the souls that populate it! :>

- Managing "capabilities" for security: Ah, yes. We're not concerned with firmware delivery integrity so we obviously aren't concerned with the integrity of the firmware operating within the devices we build! Trust everyone!

- "scrubbing" below the FTL: As we're all using MASKED ROM, any FLASH-related questions would obviously be inappropriate, here. A "PC"-oriented forum would clearly miss the issues involved, as none of those folks will have any experience writing an FTL -- let alone considering how to "drill through it" for this functionality!

I value the diversity of experiences that are in evidence in the readership, here. I expect folks to have knowledge from application domains that I will likely NEVER encounter. It would be foolish not to take advantage of that knowledge pool!

I try to be very diligent about prefacing any subject that is likely to be considered "off topic" with an explicit indication of that:

  OT: Inhibiting persistent changes to a workstation
  OT? PSP development
  OT? Alternative business models
  OT: Disk "imaging" sw

And, when I am keenly aware that the subject matter will *not* fit with the expertise of the folks here, I clearly indicate I am looking for suggestions for alternative venues to explore:

  ISO: UI/UX issues forum
  ISO: privacy forums

(lack of responses confirms my suspicions regarding "local expertise")

If you can't see the applicability of my questions to the subject matter of this newsgroup, please feel free not to bother yourself opening or replying to my "misguided" questions! It costs you nothing NOT to look! :>

OTOH, you may discover there are folks working on things *in* this domain that go beyond your (apparent) level of experience. You may even get an idea for a technology that *you* could eventually apply! Or, an appreciation for why certain technologies are *not* commonly embraced -- saving you the cost of a "misguided adventure".

Reply to
Don Y

This is potentially a very wide topic - are you talking about binaries to run on an embedded system or to run on a PC? Are you talking about some standard signature system, or are inventing/picking your own? Robustness, and known exploits, are totally dependent on the system chosen.

A strong system - such as running the binary through SHA-512 for a 512-bit hash, then signing it using a 2048-bit gpg key - will be unbreakable even for the NSA. But it might be a bit overkill if you are talking about binaries for an 8K microcontroller...
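The hash-then-sign flow described above can be sketched as follows. Only the digest step is shown; the actual signing would be done offline with a tool such as gpg, so this is a minimal illustration rather than a complete scheme:

```python
import hashlib

def firmware_digest(image: bytes) -> bytes:
    # Hash-then-sign: the signer signs this fixed-size 64-byte digest,
    # not the (possibly large) firmware image itself.
    return hashlib.sha512(image).digest()

image = b"\x7fELF...pretend firmware contents..."
digest = firmware_digest(image)
assert len(digest) == 64   # 512 bits
# In the real flow, something like `gpg --detach-sign` would now sign this.
```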

Any time there are humans involved, there are social engineering exploits. The only way to fully avoid them, is to keep people out of the loop.

Single keysets are more secure in that you only have /one/ set of keys to keep safe - but if they are compromised, everything is compromised. Certifying authorities are more scalable, as you can distribute the work.

Reply to
David Brown

This might not be directly related, but Phil Koopman's blog might give you some help in deciding what you need to do.

The bits you need are 3 or 4 items down I think.

--
******************************************************************** 
Paul E. Bennett IEng MIET..... 
Reply to
Paul E Bennett

Posted in c.a.e so assume an embedded system -- though probably larger/more complex than "most". But, here, I imagine very little is *typically* done to secure an executable: verify a checksum and "jump to main" with fingers crossed!

OTOH, I suspect most folks only have experience with signed objects on "workstations" -- Sun, "PC"s, etc. I *suspect* the vulnerabilities are of the same *types* -- though the workstation world probably would experience more of them (based solely on number of identical units sold!)

Think "much bigger" systems.

Exactly. AFAICT, this is the only *realistic* threat to such an approach. Folks with deep pockets can always de-encapsulate die and microprobe for specifics (taking care not to knock too many electrons off those gates! :> )

Exactly. Hence the appeal.

It also inherently allows for "third party" developers without requiring every "app" to go through the same funnel.

OTOH, removing this funnel also removes the gatekeeper -- any authorized signer can produce anything he/she wants!

Reply to
Don Y

Thanks. I'm reasonably sure I know how to do what I want. But, puzzled to see so many devices that I would *think* would be "secured" in their implementations routinely "rooted" (game consoles, cell phones, etc. -- things where the business model is predicated on controlling what runs where!).

Hard to believe with the monies involved that these folks are incompetent in their implementations (I haven't examined all the exploits in detail). So, I wonder if the *source* is something like a "leaked secret" or an "observable secret".

Or, just a shitty implementation.

What "secures" the code in the implantable AED from hackery? (scarey thought!) Is it simply *wishful thinking*??

Reply to
Don Y

Hah, Don, I guess wherever there is code and communication things rely on wishful thinking.

How sure is sure? Let us take one of my MPC5200B DPS machines for example. Since I have written every bit of code running on the MPC5200B I should think it would suffice if I simply do not run any servers posing a potential vulnerability. I know my code has no back doors etc. Nothing running which I don't know exactly. Perfectly safe, one should hope.

Or is it. How do I know what I get on the BGA chip (which is actually a board). The 5200 eats around 1W of power, one could hide a radio inside if one wanted - connecting to the debug interface or wherever.

What other than wishful thinking could I possibly rely on? If it is a radio I might be able to catch it, OK. But if it is some piece of code inserting itself into ethernet frames etc.... when would I go to the trouble of looking for it once I have things up and running?

Mind you, I don't know there is something like that; in fact my wishful thinking says there is not -- but the bottom line is I just don't know.

Dimiter

------------------------------------------------------ Dimiter Popoff, TGI


Reply to
dp

Yes, that's the most common - checking that the CRC matches makes it very unlikely that accidental errors in the binaries go undetected, but doesn't protect against a knowledgeable person making a fake binary.
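The distinction can be made concrete with a small sketch (illustrative byte strings only): a CRC catches bit-rot, but since no secret is involved, an attacker simply recomputes the CRC over his fake binary and it passes the same check.

```python
import zlib

def verify(image: bytes, crc: int) -> bool:
    # Typical embedded check: recompute and compare. Detects accidents only.
    return zlib.crc32(image) == crc

image = b"firmware v1.0"
stored_crc = zlib.crc32(image)

assert verify(image, stored_crc)                  # good image passes
assert not verify(b"firmwarf v1.0", stored_crc)   # bit-rot is caught

# But there is no secret: an attacker ships his own matching CRC.
fake = b"firmware v1.0 + backdoor"
assert verify(fake, zlib.crc32(fake))             # forged image also "passes"
```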

Of course, many embedded systems are also protected by "security by obscurity" - lack of documentation, specifications and manuals means that only the developer knows how to generate the binary files and attached checksums, making it hard to hack!

Yes, that's right. And people do sometimes have to make PC software to go along with their embedded software - for many c.a.e. developers, the only "signing" they need is for Windows software/drivers to go along with their embedded systems.

If you are talking about a 32-bit microcontroller with a reasonable lump of flash and ram, then it should be perfectly capable of running these tests - it's then merely a matter of how fast you need it to run. If it is an embedded Linux system, then the software is already available - if not, you can certainly get the source code (but the GPL license may be an issue) and the algorithms are freely available. RSA is easy to implement, especially if speed is not too critical.
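Since the post above notes that RSA is easy to implement, here is a textbook-RSA signing sketch with toy parameters (p=61, q=53, chosen only so the numbers stay readable). Real deployments need 2048-bit keys and padding such as PSS; this only illustrates the sign-the-hash math:

```python
# Textbook RSA with toy parameters -- educational only.
p, q = 61, 53
n, phi = p * q, (p - 1) * (q - 1)
e = 17
d = pow(e, -1, phi)              # modular inverse (Python 3.8+)

def sign(h: int) -> int:
    # "Sign" the hash value by raising it to the private exponent.
    return pow(h % n, d, n)

def verify(h: int, sig: int) -> bool:
    # Anyone holding the public (n, e) can check the signature.
    return pow(sig, e, n) == h % n

h = 0xDEADBEEF                   # stands in for a real message digest
s = sign(h)
assert verify(h, s)
assert not verify(h + 1, s)      # any change to the hash breaks it
```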

There are also simpler systems - 3DES is quite popular for encryption, and can be used as the basis for both encryption and signing. (But of course you need to be extra careful of the key when using symmetric ciphers.)
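The symmetric-key approach mentioned above can be sketched with a MAC. Python's stdlib has no 3DES, so this sketch substitutes HMAC-SHA256 as the symmetric authenticator; the key name is hypothetical. The point it shows is the caveat in the parenthesis: whoever holds the shared key can both generate and verify tags.

```python
import hmac, hashlib

KEY = b"shared-secret-provisioned-at-factory"   # hypothetical shared key

def tag(image: bytes) -> bytes:
    # Symmetric "signature": same key creates and checks the tag.
    return hmac.new(KEY, image, hashlib.sha256).digest()

def check(image: bytes, t: bytes) -> bool:
    # compare_digest avoids timing side channels on the comparison.
    return hmac.compare_digest(tag(image), t)

image = b"firmware v1.0"
t = tag(image)
assert check(image, t)
assert not check(image + b"!", t)   # tampering is detected
```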

There are also cryptography chips that handle things like hashes, signing and encryption/decryption. Atmel has a range of these, as do several other manufacturers.

That sounds more like decryption to me - digital signing does not encrypt the binary itself, only the hash. So with signed binaries, the bad guy doesn't have to peel off the top of the chip - he can just look at the update file directly. The signing is so that if he changes the file, he can't trick other systems into running it.

Of course, it is not uncommon to encrypt the files too.

But you make a good point about any security system - there will /always/ be ways around it for those with enough money or enough determination. The aim is to make it more expensive to crack than it is to buy the systems legitimately. But don't make it /too/ hard - you want to stop before rubber-hose cryptanalysis becomes the cracking method of choice.

Yes - it's a compromise, and something you'll have to decide.

Reply to
David Brown

The only certainty is if you had created the *entire* system yourself, down to the individual chips you use, never ever have it connected to the net and operate it inside a Faraday cage.

If you don't do it all yourself you have to trust someone to have provided you with a product without the sneakery.

--
******************************************************************** 
Paul E. Bennett IEng MIET..... 
Reply to
Paul E Bennett

This is how things are nowadays, yep. Unless we have a silicon factory at our disposal we have to trust someone else who does. I am not so sure about the Faraday cage, I'd like to think that having everything under control should make it possible to produce a device secure at the level of its current level of technical development. But this may also be just wishful thinking, of course, and I have not really given it that much thought :-).

Dimiter

------------------------------------------------------ Dimiter Popoff, TGI


Reply to
dp

I'm not trying to drag "third party libraries" into the issue. Rather, I am concerned with a couple of "common cases":

- someone modifies a legitimate binary (i.e., something that *you* wrote) at some point in the distribution chain (including AFTER he has received it for his own use) to behave in a way that you hadn't intended or that, potentially, can harm you or your reputation (e.g., a competitor buys your product, hacks it and "demonstrates it" as having buggy performance -- requiring you to replicate the same tests publicly to dispute these claims, etc.)

- someone modifies a legitimate binary *midstream* and allows it to be distributed to others AS IF you had written it (containing back doors, etc.)

- someone writes an application that your device *intends* to support but that makes your product look bad (e.g., this is part of the argument Sony, Apple, etc. use to justify having only "approved apps" run on their devices)

Indeed. Thompson describes a scenario ("Reflections on Trusting Trust") where a system can propagate a back door unbeknownst to developers (e.g., hack the compiler!)

Of course! E.g., it's relatively easy to use a DNS tunnel to move information in/out of a firewalled system -- under the eyes of system administrators! (very low data rates, typically, but "enough" for many uses!) The same can be said of most other "universal" protocols -- data can be encoded in all sorts of ways that neither a human nor a "typical" firewall/IDS can detect yet require very little coding effort!
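The DNS-tunnel encoding is easy to sketch. The domain below is hypothetical, and this shows only the data-to-label mapping (no actual network traffic): the payload is base32-coded into innocuous-looking subdomain labels, which an IDS sees as ordinary lookups.

```python
import base64

DOMAIN = "exfil.example.com"     # hypothetical attacker-controlled zone

def to_labels(data: bytes, chunk: int = 30) -> list:
    # Base32 keeps labels within DNS's allowed alphabet (<= 63 chars/label).
    enc = base64.b32encode(data).decode().rstrip("=").lower()
    return [f"{enc[i:i+chunk]}.{DOMAIN}" for i in range(0, len(enc), chunk)]

def from_labels(names: list) -> bytes:
    # Receiver (the authoritative server) reassembles and decodes.
    enc = "".join(n[: -len(DOMAIN) - 1] for n in names).upper()
    enc += "=" * (-len(enc) % 8)          # restore base32 padding
    return base64.b32decode(enc)

secret = b"combination: 9-14-27"
names = to_labels(secret)
assert from_labels(names) == secret
```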

Reply to
Don Y

An acquaintance of mine is involved in making a mass-produced embedded device with crypto inside. They use two Chinese contract manufacturers who they assume are going to try to rip off the crypto keys. They therefore split up the manufacturing and key-loading operations such that the two companies would have to collude in order to compromise the device on any scale. It's conceivable that could happen, but they see it as less likely than a single company acting by itself if it could.

Reply to
Paul Rubin

Agreed. In the days of "predefined disk types" (PC), I would routinely patch system ROMs, tweak the checksum and end up with my own "custom" disk type (before "type 47", etc.).

I patched the test procedure for a large military product many years ago (solely as an expedient -- so I could bypass certain time-consuming tests that I already *knew* passed) and it was just a fluke that the individual present for the official sign-off procedure noticed my "Go for coffee" note on the screen (oops!)

I.e., many systems *trust* the code that runs on them -- if it runs, it must be INTENDED for this system!

Yes, but look at how many consumer products are "hacked", regularly. I.e., tools are far more capable. And, developers have largely settled into a "rut" -- doing things with less variety than in years past (often because resources are more plentiful).

I wasn't addressing the potential need for a signed PC binary but, rather, suggesting that this is the *only* place most people will encounter a "signed binary required" situation (at least, the only place where this will be obviously the case!).

E.g., if the firmware in your Widget2000 fails to "upgrade", do you know that it is a result of a failed signature? A bad flash burn? etc. Your only practical remedy is "try again".

*If* I implement a gatekeeping function that doesn't *allow* unsigned objects onto the system (i.e., no removable media that can be swapped out after-the-fact), I think I only need to verify signature at "install time" and thus eliminate the need for run time checks.

If you have a "secure" loader, then you can rely on the signature exclusively -- without having to encrypt/decrypt the object itself. I.e., he can't change the "payload" because the signature would no longer be correct. And, he can't compute a new signature because he doesn't know *how*. So, he would have to examine the "secure" loader to gain the insight needed to forge a signed object.
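The install-time gatekeeper idea can be sketched like this (the key, store, and function names are hypothetical): the signature is checked exactly once, when the object is admitted to internal storage, so nothing unsigned can ever reach the run-time path and no per-run check is needed. HMAC stands in here for whatever signature scheme the loader actually uses.

```python
import hmac, hashlib

DEVICE_KEY = b"per-device-secret"    # hypothetical, provisioned at factory
installed = {}                       # stands in for internal flash

def install(name: str, image: bytes, sig: bytes) -> bool:
    # Gatekeeper: verify at install time; reject forgeries outright.
    expected = hmac.new(DEVICE_KEY, image, hashlib.sha256).digest()
    if not hmac.compare_digest(expected, sig):
        return False                 # unsigned/forged object never lands
    installed[name] = image
    return True

def run(name: str) -> bytes:
    # No run-time check: only verified objects can be in this store.
    return installed[name]

good_sig = hmac.new(DEVICE_KEY, b"app", hashlib.sha256).digest()
assert install("app", b"app", good_sig)
assert not install("evil", b"evil", b"\x00" * 32)
assert run("app") == b"app"
```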

Addressing the perceived "value" of cracking (for benevolent *or* nefarious purposes) is not trivial. Motivations vary from "pride" and "challenge" to financial gain. It's often not possible to evaluate this issue in all but obvious cases (e.g., a device that acts as "electronic money")

The problem is the trust it entails. It's sort of like NDA's... don't ask someone who is willing to divulge his most personal and business secrets to sign an NDA cuz he'll treat your "secret" similarly!

Every entity you drag into the trust circle increases the attack surface. I can make sure *no one* sees my secrets... but, not once I've given a copy to *you*! (regardless of how much I *trust* you)

[witness all the security breaches that we hear about -- and ponder how many we *don't*!]
Reply to
Don Y

Well I am past that particular example, I have written the compiler :) . I don't think I can be hacked at software level.

Oh of course, once you have malicious code running on a processor things are lost (not that I have much experience with that; I have never had malicious code on a DPS machine that I know of -- I am too small an operation to be targeted, I suppose).

But your thread made me think (again and again - sort of vaguely, as I have more pressing issues at hand... :-) about what we get with today's processors. Maybe someone who is a lot more of an RF person than I am (this won't take that much) could explain how feasible it would be to, say, use the clock PLL to modulate some data to be sent out. Or whatever - I suppose the receiver part will be harder to do, though (do we have an RF guy to speculate a little on that?). Given the mess ethernet frames are - and that I am likely not the only one to just connect a PHY to the MII or whatever and see if the packets make it both ways - I think their preambles etc. might be a nice place to transport a few hidden bits (just musing; I'd have to work to prove whether this is possible or not).

Dimiter

------------------------------------------------------ Dimiter Popoff, TGI


Reply to
dp

I'm not sure it's "just" an issue of malicious code. Rather, "any code that you don't *intend* to be running on that device".

E.g., the example I gave of hacking the test routine for that military project (see my response to David, upthread) -- my "hack" wasn't malicious nor did it alter the *functional* behavior of the test program. Instead, it just made it very clear to folks watching the progress messages (i.e., typically, *me*) that the current test was going to take a REALLY LONG TIME!

Yet, the device couldn't formally be "sold off" with that obviously "corrupted" program acting as the arbiter of "pass/fail" -- regardless of how benign my change had been!

The fact that I was *able* to make such a change indicates that there were no mechanisms in place to *prevent* it. Or, indicate that I had "tampered" with the binary.

I was recently given an article indicating how the audio output of laptops could be used to "transmit" information to another device "within earshot" -- undetected by human hearing.

The possibilities for covert channels are many!

But, even if the software doesn't try to "leak" information from the device/user, what if you just don't want it to *execute* without your blessing? E.g., Sony doesn't want games running on their hardware without getting a "cut of the action" (I suppose). Even if it is just a means of accurately monitoring sales/demand for particular titles! (to further refine their own marketing efforts)

Or, use the time *between* messages to encode data. Or, stagger messages to alternate IP addresses: X.Y.Z.0 vs. X.Y.Z.1 to encode a binary value. Or, try to resolve different symbolic addresses (value1.mydomain.com vs. value2.mydomain.com vs. value3.mydomain.com). Or...
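The "alternate destination address" trick above is trivially small to sketch (addresses illustrative, no real traffic generated): a bit is carried entirely by *which* of two innocuous destinations each message goes to, so the payloads themselves can be completely benign.

```python
# Covert channel in destination choice: 10.0.0.0 carries a 0 bit,
# 10.0.0.1 carries a 1 bit. The packet contents never matter.
def encode(bits: list) -> list:
    return [f"10.0.0.{b}" for b in bits]

def decode(destinations: list) -> list:
    # The observer recovers the bit from the last octet of each address.
    return [int(d.rsplit(".", 1)[1]) for d in destinations]

msg = [1, 0, 1, 1, 0]
assert decode(encode(msg)) == msg
```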

Reply to
Don Y

Unless you house it entirely in a Faraday cage, the system will radiate energy to the outside world. With a highly directional antenna, the emissions from your system could be captured (from the leads, from the display screen, etc). There are certain regular-frequency events in there, with a mush between them, which would give the eavesdropper something to lock onto. Then it comes down to how good their filtering and decoding efforts are.

--
******************************************************************** 
Paul E. Bennett IEng MIET..... 
Reply to
Paul E Bennett

Classic on military hardened multiuser systems, where there should be zero possibility of sending untracked information from one user to another:

- the tx user creates and deletes large files, with a time/space modulation defined by the info to be transmitted
- the rx user looks at the free disk space and demodulates the info
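The free-disk-space channel described above can be simulated in a few lines (purely in memory; no real filesystem is touched): the transmitter modulates "free space" by allocating or deleting a large file, and the receiver threshold-detects it.

```python
# Simulated free-space covert channel between two "users".
DISK_SIZE = 1000
files = {}                        # stands in for the shared filesystem

def tx_bit(bit: int) -> None:
    if bit:
        files["pad"] = 600        # allocate a large file: free space drops
    else:
        files.pop("pad", None)    # delete it: free space recovers

def rx_bit() -> int:
    free = DISK_SIZE - sum(files.values())
    return 1 if free < 500 else 0  # threshold-detect the modulation

msg = [1, 0, 1, 1, 0]
received = []
for b in msg:
    tx_bit(b)
    received.append(rx_bit())
assert received == msg            # slow, but the bits get through
```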

OK, it is a slow transmission path, but I've watched a system salesman/engineer damn near have a heart attack when he saw a message being typed by one user appear on another user's screen.

Reply to
Tom Gardner

aka TEMPEST.

Similar schemes are used to monitor *power* consumed by "secure" MCU's and deduce likely instruction sequences.

Note, you don't necessarily need to be 100% *sure* of an individual observation. All you hope to do is improve likelihood of "knowing" and chain these likelihoods together.

Reply to
Don Y

Almost all steganographic techniques are low bandwidth. But, if you've got lots of *time* (e.g., 24/7/365) then you can transfer lots of data! :>

OTOH, there is a lot of redundancy in most "data" -- or, "unnecessary detail" (i.e., "nice to have but not essential").

And, if the two parties have already conspired to decide the *nature* of the data to be exchanged, you can rule out a lot of otherwise-possible ambiguity.

E.g., 9-14-27 is probably *not* the date of birth of the Czar's oldest living relative but, rather, the combination to the wall safe in the master bedroom of the Imperial Residence.

And, even partial data can have immense value! E.g., "9" in the above example gives a would-be thief a big head start... (How much would you pay to *know* the first of the set of winning lotto numbers before the drawing?)

Reply to
Don Y
