Is there a process for secure firmware install/upgrade for device made offshore?

Hi. Recently more and more companies want to add security (authentication and/or encryption) to their devices' firmware install/update process. Typically this is done by storing a secret encryption key in the bootloader or elsewhere in internal MCU flash. This works if the bootloader is installed in a secure facility by trusted people. But what happens when manufacturing is outsourced/offshored? I do not want to send my precious key to China. So I wonder whether it is possible to design an algorithm or process for secure firmware installation and updates while the initial firmware is installed by a factory in China. Typically my devices have JTAG, some other port (UART, etc.) and often wireless (WiFi or Bluetooth). Note: moving all newly manufactured devices to a secure location and reflashing via JTAG would be too expensive.

This problem seems to be very common now, so there must be some common solutions. Are there? If a pure software solution is not possible, are there hardware-assisted solutions? I guess if a chip included a hardcoded, inaccessible private key and an asymmetric decryption module, that would solve this problem, wouldn't it? Are there such chips?
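To make the authentication half of this concrete, here is a minimal host-side sketch in Python (using the `cryptography` package; the image contents and key handling are placeholders, not a real provisioning flow). The point is that the device's bootloader would only need the *public* key, while the private signing key never leaves a trusted machine:

```python
# Minimal sketch of a signed-firmware scheme (illustration only, not a
# complete update protocol).  The device holds only the PUBLIC key.
from cryptography.hazmat.primitives.asymmetric import ed25519

# Done once, in-house: generate the signing key pair.
signing_key = ed25519.Ed25519PrivateKey.generate()
public_key = signing_key.public_key()       # this value goes into the bootloader

# Done for every release: sign the firmware image.
firmware = b"\x00" * 1024                   # stand-in for the real image
signature = signing_key.sign(firmware)      # 64-byte Ed25519 signature

# Done on the device (simulated here on the host): verify before flashing/booting.
# Raises cryptography.exceptions.InvalidSignature if the image was tampered with.
public_key.verify(signature, firmware)
```

Note this only gives authenticity; it does nothing for confidentiality or cloning, which is part of what the replies below argue about.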

Thank you

Reply to
jhnlmn

What are you trying to guard/protect against? "Factory" being able to create unlicensed updates for your product?

Are you expecting to distribute unencrypted (though *signed*) updates to your users?

How complex is the device (i.e., you'll be giving your manufacturer the binary image of the code to install in the virgin devices)?

Reflashing via JTAG (at a "secure location") isn't the *only* option. (Hint: how will your end users install updates?)

All you need (to allow you to NOT trust your manufacturing source) is a "secret" that isn't available to them, or your users.

Reply to
Don Y

  1. Reverse engineering
  2. Hacking

In recent years there has been too much scary news about baby monitors and teddy bears, USB dongles and hard drives, network routers and switches being hacked and used to spy on their users, steal their data or infect computers. Many of these hacks became possible because the original firmware was reverse engineered and/or hacked firmware was installed. Companies that developed those insecure devices got very bad publicity. People are afraid to use devices. We need to find some best practices and follow them.

> Are you expecting to distribute unencrypted (though *signed*) updates to your users?

Requirements vary. Recently many of my clients began demanding encryption (at least those who understand the threat). And I advise them to use both authentication and encryption as much as microcontroller capabilities allow. Luckily, more and more microcontrollers now include hardware encryption modules. The question is where to hide the secret key to use for encryption/decryption.

> How complex is the device (i.e., you'll be giving your manufacturer the binary image of the code to install in the virgin devices)?

Again it varies, but recently I see more and more devices that only include a bootloader and some basic firmware and then perform a firmware upgrade upon first connection. Of course, this firmware upgrade can be done by anybody - end user, manufacturer or hacker.

secret" that isn't available to them, or your users.

Yes, this was my original question: how do you store a secret on a device that is manufactured in China?

Reply to
jhnlmn

Don, I appreciate your pessimism and despair. Still, imagine that more and more devices are made just like I said: on a small budget, with almost everything outsourced except (hopefully) engineering. So we must find some ways to retain control, otherwise everything will be hacked (whatever is not hacked already) and we will not be able to make or use any electronics. I really wish a solution will be found. Anybody?

Thank you

Reply to
jhnlmn

It's not pessimism; it's a realistic attitude given the capabilities available to even "small players", nowadays. "Kids" de-encapsulate chips in college labs and microprobe die, etc.

Imagine you have a device that lets you "hide" a key in it. Who's going to push the buttons, copy the files, or actually *hide* the key? You're outsourcing EVERYTHING!

Do you trust the supplier to do the "secret key encoding" -- but NOT trust him to actually *see* the key he is encoding into the devices?

Do you find a second firm to do the keying -- and hope these two firms never "compare notes"?

What if you add a component containing the key to the design AFTER it has been manufactured? Who "adds the component"? A third firm?

Do you buy custom silicon with the key already hidden inside? Then hope that secret is never leaked/revealed/sold by a disgruntled employee, etc. (i.e., have BETTER security over your IP than the likes of the NSA, Sony, many multinational banks, etc.)?

And, WHEN it is disclosed/discovered (by some technique that you've not yet heard about), do you get NEW silicon with a different key? And, at the same time, allow clients to be able to PICK the microcontroller core that they want embedded in that silicon?

Do you suddenly stop making products when someone "cracks" your system? And, not resume until you've come up with another "uncrackable" approach?

And, of course, you no doubt expect all of this to be inexpensive as these "startups" don't have the re$ource$ for that, either?

You either have to trust *some* supplier OR be willing to take on some of the trusted activities yourself. How does your client know that *YOU* can be trusted? Maybe you hid a backdoor in THEIR product?!

Invest your time in making better products -- so no one wants the last generation product. Design your products with security in mind from the start, not bolted on as an afterthought. Be an Engineer, not a Rent-a-Cop.

Find people that you trust -- if you need them to be trustworthy -- in much the same way that you hire people to be *competent* (at some skill) when THAT is your need!

Reply to
Don Y

Add a Dallas chip with a unique id in every product?

Reply to
Bill Davy

On 25.06.2017 at 00:27, snipped-for-privacy@gmail.com wrote:

The answer to that is as simple as it is depressing: you don't, because you can't.

As the saying goes: three can keep a secret if two of them are dead.

And installing the secret yourself is not a solution either, because then you will effectively no longer be manufacturing in China. You'll be buying unfinished product from China. If you can afford the extra shipping and delay, plus the local work and taxes, good for you. But if you could, why were you going the "Made in China" route to begin with? And what makes you so sure your local employees can be trusted with the secret so much more readily than the Chinese?

And that's before we consider how you expect your overseas contractors to meaningfully _test_ freshly produced devices which have to refuse to run any but authenticated software, while they don't have the secret key to do the authentication.

Any firmware upgrade mechanism is an attack vector first, and a useful feature second.

Reply to
Hans-Bernhard Bröker

MCU manufacturers will typically install firmware for you. If you are using a chip with "secure" key/firmware support, have the chip manufacturer install the key/firmware and forward the parts for assembly.

That's a common solution... Won't help if the chip's "security" is weak...

Hope that helps, Best Regards, Dave

Reply to
Dave Nadler

Who installed this initial networking code on your device? The factory? The end user? Or some other trusted person? Do you just send a JTAG programmer to the end user and ask him to install the FW himself?

I am not treating it as an afterthought. But all I am hearing here is that it cannot be done.

Reply to
jhnlmn

On 2017-06-24 snipped-for-privacy@gmail.com wrote in comp.arch.embedded:

There are chips designed for this purpose. I did some searching for a product a while ago and found some chips that claim secure key storage or other secure services. I did not use them (yet), but here are some examples of product families:

formatting link
formatting link
formatting link

Reply to
Stef

Max seems to provide complete MCUs with a Secure Boot Loader with Public Key Authentication. But all the details are under NDA, so I cannot tell whether it makes sense. Has anybody studied those? Atmel and Fujitsu appear to only provide peripheral I2C/SPI chips with Crypto-Authentication. But I cannot imagine how this can work if the FW on the main MPU is not trusted. Am I wrong?
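For what it's worth, those peripheral crypto-authentication chips typically work by challenge-response: the host sends a random challenge and the chip answers with a MAC computed over it using a secret that never leaves the chip. A rough sketch of the math only (HMAC-SHA256 assumed here as a stand-in; the real chip protocols differ, and the key names are hypothetical):

```python
# Sketch of the challenge-response idea behind I2C/SPI crypto-auth chips.
# Only the math is shown, not the actual bus protocol of any specific part.
import hmac, hashlib, os

shared_secret = os.urandom(32)   # provisioned into the auth chip at the factory

def auth_chip_response(challenge: bytes) -> bytes:
    """What the secure chip computes internally; the secret never leaves it."""
    return hmac.new(shared_secret, challenge, hashlib.sha256).digest()

def verifier_check(challenge: bytes, response: bytes) -> bool:
    """What a verifier holding the same (or a derived) secret checks."""
    expected = hmac.new(shared_secret, challenge, hashlib.sha256).digest()
    return hmac.compare_digest(expected, response)

challenge = os.urandom(32)
assert verifier_check(challenge, auth_chip_response(challenge))
```

That proves a genuine auth chip is present on the board, but by itself it does not make the firmware running on the main MPU trustworthy, which seems to be exactly the doubt raised above.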

I guess there is no real difference whether we (or some other trusted party) reprogram MCUs before or after assembly. Is there?

Also, there was some disagreement about what kind of keys to pre-program: symmetric or private, a single key for all devices or a unique key per device. Any opinion on this?
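On the single-key-versus-per-device-key question, one common pattern is to derive a unique key for each unit from a master secret plus the device UID, so that extracting one unit's key does not expose the whole batch. A minimal sketch, assuming HKDF-SHA256 from the `cryptography` package and hypothetical names; the master secret would stay on the provisioning/signing server:

```python
# Sketch of per-device key diversification from a master secret.
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.kdf.hkdf import HKDF
import os

master_secret = os.urandom(32)                  # kept in-house, never shipped

def device_key(device_uid: bytes) -> bytes:
    """Derive a 128-bit per-device key; recomputable server-side from the UID."""
    return HKDF(
        algorithm=hashes.SHA256(),
        length=16,
        salt=None,
        info=b"fw-update-key:" + device_uid,    # binds the key to this device
    ).derive(master_secret)

key_a = device_key(bytes.fromhex("0123456789abcdef"))
key_b = device_key(bytes.fromhex("fedcba9876543210"))
assert key_a != key_b                           # each unit gets its own key
```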

Reply to
jhnlmn

formatting link

Reply to
Boudewijn Dijkstra

Would it not be easy to build a hardware gadget that logs the programming bit-stream at the target side of this programmer?

Reply to
Dave Nadler

The details about this "Flasher Secure" are very scarce; the Flasher manual does not mention it at all. This seems to be the most detailed explanation:

formatting link

It appears that they do nothing to prevent JTAG sniffing. And they cannot, theoretically, do anything without some support from the chip maker (decryption on the chip itself).

What is interesting is this "Flasher SECURE reads the UID from the device". How? Which chips are supported? Is it a standard feature?

Reply to
jhnlmn

One FPGA vendor that I have been using, Microsemi, actually sends encrypted programming files to the chip, and there is decryption hardware built into the chip, so no unencrypted data to sniff.

The chips come with a generic Microsemi key built in, and the tool by default encrypts to that key. You can also load your own private key into the chip and encrypt the data to it, so only units into which you have put your key can use that data, and you can lock the device so that you need to know the private key to even connect with a programmer/debugger.

There is also a unique serial number in each device, and you can include in the encrypted bit stream instructions that only a given device is to accept it. They also sell a special programmer that you can make your contract manufacturer use: it accepts a file that has been encrypted along with instructions for how many copies may be made and, doing some accounting over the Internet, allows them to make exactly that many copies. It reads the chip's serial number, verifies permission over the Internet, and then encrypts for THAT particular device and programs it. (The programmer uses a secure processor, so any decrypted bit patterns exist only inside the chip and are not accessible without breaking the encryption.)
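A rough software analogue of that "encrypt for THAT particular device" step, assuming AES-GCM with the device UID bound in as associated data (names, UID value and key handling are hypothetical; this ignores how the per-device key gets into the chip):

```python
# Sketch: the image is encrypted with a per-device key and the device UID is
# bound in as associated data, so a blob captured for one unit is useless on
# another and the plaintext never appears on the programming lines.
from cryptography.hazmat.primitives.ciphers.aead import AESGCM
import os

device_uid = bytes.fromhex("0123456789abcdef")        # read from the target chip
per_device_key = AESGCM.generate_key(bit_length=128)  # e.g. diversified per unit
firmware = b"\x00" * 1024                              # stand-in for the real image

nonce = os.urandom(12)
blob = AESGCM(per_device_key).encrypt(nonce, firmware, device_uid)

# On-chip (or inside a secure programmer): decryption fails unless both the
# key and the UID match the ones the blob was prepared for.
plain = AESGCM(per_device_key).decrypt(nonce, blob, device_uid)
assert plain == firmware
```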

Reply to
Richard Damon

On Thu, 29 Jun 2017 19:44:41 +0200, Dave Nadler wrote:

Sure. But suppose that the image contains a cryptographic signature of the program code *and* the device's unique ID. And the program checks the signature before booting the application.
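A minimal sketch of that idea, with Ed25519 chosen purely as an example and the verify call standing in for what the bootloader would do after reading its own UID register:

```python
# Sketch: the signature covers the code *and* the device's unique ID, so the
# same signed image will not verify on a different unit.  Names are hypothetical.
from cryptography.hazmat.primitives.asymmetric import ed25519

signing_key = ed25519.Ed25519PrivateKey.generate()
firmware = b"\x00" * 1024                       # stand-in application image
device_uid = bytes.fromhex("0123456789abcdef")  # UID of the one intended unit

signature = signing_key.sign(firmware + device_uid)

# Bootloader side: reads its own UID and verifies before booting the application.
signing_key.public_key().verify(signature, firmware + device_uid)
```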

Reply to
Boudewijn Dijkstra

Yeah, it's quite a new product, documentation is lagging a bit.

Just sniffing may not instantly make cloning possible. See my reply to Dave Nadler.

I've heard that they're talking to ST and possibly others about secure JTAG/SWD.

Lots of newish 32-bit devices have a UID register. Otherwise, you can usually generate a good-enough fingerprint from the SRAM reset state.

Reply to
Boudewijn Dijkstra

No, remember: until a secure program is installed we cannot trust any pre-existing program in flash. JTAG must wipe the entire flash and RAM before installing a secure program.

This seems to be the only solution. I guess it will take several more years until most chip makers adopt this practice. And then there will still be a risk that they do not hide the secret key on the die well enough. If the secret key is discovered, the entire batch of devices with the same key will be compromised.

Reply to
jhnlmn

I meant that the program would check its own signature. It can still be reverse engineered and patched to skip the checks, but this can be made arbitrarily difficult.

Reply to
Boudewijn Dijkstra

> It can still be reverse engineered and patched to skip the checks, but this can be made arbitrarily difficult.

Please explain. As far as I can see, if a program can be both disassembled and patched (which unencrypted JTAG allows), then anything is possible (and trivial), including skipping the check entirely or replacing the signature.

Reply to
jhnlmn
