Soft-Core Processor - obsolescence mitigation a reality?

Hi,

My company has been investigating the use of soft-core processors (eg. Microblaze, NIOS II) to reduce the risk of having to rewrite firmware due to microcontroller/IC/peripheral obsolescence.

In our investigation an interesting point was raised, and I wanted to ask you what you thought about it...

***BACKGROUND*** Our products typically have a production life-cycle of 10-15+ years. Typically there is at least one MCU and an FPGA (for I/O expansion, high-speed processing and control). Each has a USB or Ethernet interface. The cost of rewriting firmware too frequently to target a whole new MCU/architecture is prohibitive. On the surface, an FPGA with a soft-core processor seems perfect: it rationalizes the component count and potentially reduces the risk of 'firmware' obsolescence, since the processor can be migrated to newer silicon if the FPGA itself becomes obsolete.

***ISSUE*** If we were to go down the soft-core processor path, we'd be targeting devices like the Spartan 3S500E. If we implemented the USB or Ethernet MAC in the FPGA to mitigate the obsolescence risk of an external USB/Ethernet IC, we'd still be relying on an external PHY IC.

So the question that was raised is: are PHY ICs going to go out of fashion quickly anyway, since many MCUs now have USB/Ethernet peripherals built in? Does the same apply to external USB/Ethernet MAC/PHY ICs?

Or is it a safe approach to implement a soft-core processor with an external USB/Ethernet IC (MAC+PHY) and hope that it doesn't go obsolete too quickly? Or hope that the interface is standard and that there will be alternatives?

Your opinions are much appreciated. Thanks. PretzelX.


Obsolescence is a hard thing to dodge.

On top of your other concerns, I'd add making sure that the soft core IP you choose isn't bound tightly to the FPGA architecture -- otherwise you may find yourself with a soft core processor that'll only work on obsolete hardware.

As far as the USB PHY interface -- is your concern _any_ obsolescence, or obsolescence that drives a big engineering effort? Depending on your speed and other requirements you may be able to modularize your interface such that you can use a microprocessor with built-in USB and minimal software to interface to the FPGA, so that while the micro may go obsolete and drive a board change, the engineering time to implement a new interface will be minimal.
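Tim's modularization idea can be made concrete: if the micro with built-in USB is just a "dumb bridge" shuttling fixed-size frames between the host and the FPGA, then replacing the micro never touches application logic. Below is a minimal, purely illustrative C sketch of such a frame format - the names, the 4-byte layout, and the op codes are assumptions for the sake of example, not from any actual product:

```c
/* Hypothetical minimal bridge protocol: the USB micro shuttles
 * fixed-size frames between the host and the FPGA, so swapping
 * the micro later only means re-pointing the byte pipe. */
#include <stdint.h>

typedef struct {
    uint8_t  op;      /* 0 = read, 1 = write (illustrative) */
    uint8_t  addr;    /* FPGA register address              */
    uint16_t data;    /* payload word (ignored on reads)    */
} bridge_frame_t;

/* Serialize a frame into a 4-byte buffer (little-endian data word). */
static inline void bridge_encode(const bridge_frame_t *f, uint8_t out[4])
{
    out[0] = f->op;
    out[1] = f->addr;
    out[2] = (uint8_t)(f->data & 0xFF);
    out[3] = (uint8_t)(f->data >> 8);
}

/* Parse a 4-byte buffer back into a frame. */
static inline void bridge_decode(const uint8_t in[4], bridge_frame_t *f)
{
    f->op   = in[0];
    f->addr = in[1];
    f->data = (uint16_t)(in[2] | ((uint16_t)in[3] << 8));
}
```

The point is that the engineering surface exposed to obsolescence is the frame format, not the USB stack of whichever micro happens to be current.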

--

Tim Wescott
Wescott Design Services
http://www.wescottdesign.com

Do you need to implement control loops in software?
"Applied Control Theory for Embedded Systems" gives you just what it says.
See details at http://www.wescottdesign.com/actfes/actfes.html

... snip ...

I think you are looking in the wrong direction. Simply implement your system in a well-defined standard language, such as ISO C or Ada. Define a few specific functions that are not implementable in those languages. Put the two together. When you have to change processors, just rewrite those non-standard functions. Don't forget to specify them thoroughly.

Using gcc you can have either C or Ada available on many systems.
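To make that split concrete, here's a minimal host-compilable C sketch: a small, fully specified porting API that is the only thing rewritten per target, with all application code staying in strict ISO C. Every name here is illustrative, not from any real HAL:

```c
/* Sketch of the suggested split: application code uses only ISO C
 * plus a tiny, thoroughly specified porting API. Only the port_*
 * functions are rewritten when the processor changes. */
#include <stdint.h>
#include <stdio.h>

/* --- port.h: the few target-specific functions ------------------- */
void     port_init(void);                  /* clocks, pins, vectors  */
uint32_t port_ticks_ms(void);              /* monotonic milliseconds */
void     port_uart_putc(char c);           /* console/debug output   */

/* --- portable application code: ISO C + the API above only ------- */
static void log_str(const char *s)
{
    while (*s)
        port_uart_putc(*s++);
}

/* --- host stub implementation, swapped out on real hardware ------ */
static uint32_t fake_ms;
void     port_init(void)        { fake_ms = 0; }
uint32_t port_ticks_ms(void)    { return fake_ms++; }  /* stub clock */
void     port_uart_putc(char c) { putchar(c); }
```

A useful side effect of keeping the port layer this thin is that the application builds and runs on a desktop gcc for testing, exactly as Chuck notes.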

--
 [mail]: Chuck F (cbfalconer at maineline dot net) 

If the design is heavily dependent on micro peripherals, such as timers, interrupts, I/O ports etc., then this isn't really an option.

IMHO, I'd go the soft-core processor (unlikely there's much vendor silicon-dependencies there) and a generic interface (whether internal to the FPGA or at the FPGA pins) to PC connectivity. What you plug into this interface will depend on current PC connectivity trends - serial, USB etc...

Wow, this must be some cool product that transcends decades of PC evolution! ;)

Regards,

--
Mark McDougall, Engineer
Virtual Logic Pty Ltd, 
21-25 King St, Rockdale, 2216
Ph: +612-9599-3255 Fax: +612-9599-3266


Hi Tim,

What you say is true, and it's that very fact - that obsolescence is inevitable - that is driving this new approach to try to reduce the effects of it.

I can see what you're saying about not using a soft-core MCU that's bound to a vendor, but the highly integrated development tools, IP and support bring a level of comfort and confidence (whether this is just perceived rather than real, I'm not sure!).

As you put it, the obsolescence we're trying to address is not that of the silicon itself, which is out of our control and inevitable; rather, we want to minimize the effect of that obsolescence and the resulting engineering effort.

We are definitely pursuing other techniques like implementing a HAL and sticking to 'standard' language features only, as well as developing standard interfaces to things like USB. Using an RTOS seems to take care of a lot of this to a degree, too.

So do you think that using a USB MAC/PHY IC (with a standard interface) and a soft-core is a reasonable approach? It seems more economical too - no need to fork out for vendor-supplied USB MAC IP.

Thanks, PretzelX.


Hi Chuck,

That's a great suggestion.

Actually, we are currently trying to do what you suggested. We're trying to avoid fancy/obscure MCU features and are abstracting hardware-specific functionality into its own layer (a HAL). As mentioned in the previous reply, using an RTOS seems to provide this abstraction to a degree.

Thanks Chuck, PretzelX.



Hi Mark,

Thanks for your suggestions. I'm personally leaning towards the soft-core processor approach too.

And what you say is very true: by structuring the design around the comms interface (USB/Ethernet/whatever is to come in the future), we future-proof the firmware to a degree. As long as the interface to the communications peripheral IC is standardized and abstracted out in the firmware, it's quite safe.
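One hedged sketch of what "standardized and abstracted out" could look like in C: application code talks to an abstract transport through an ops table bound at board bring-up, and each physical interface (USB IC, Ethernet MAC+PHY, plain UART) supplies its own table. The loopback backend below stands in for a real driver; all names are illustrative:

```c
/* Illustrative transport abstraction: swapping the comms IC means
 * writing one new ops table, not touching application code. */
#include <stddef.h>
#include <string.h>

typedef struct {
    int (*send)(const void *buf, size_t len);
    int (*recv)(void *buf, size_t len);
} transport_ops_t;

static const transport_ops_t *active;   /* selected at bring-up */

void transport_bind(const transport_ops_t *ops) { active = ops; }
int  transport_send(const void *b, size_t n)    { return active->send(b, n); }
int  transport_recv(void *b, size_t n)          { return active->recv(b, n); }

/* Example backend: a loopback "driver" standing in for a real PHY. */
static char   loop_buf[64];
static size_t loop_len;

static int loop_send(const void *b, size_t n)
{
    if (n > sizeof loop_buf)
        return -1;                      /* frame too large */
    memcpy(loop_buf, b, n);
    loop_len = n;
    return (int)n;
}

static int loop_recv(void *b, size_t n)
{
    if (n > loop_len)
        n = loop_len;                   /* clamp to what's pending */
    memcpy(b, loop_buf, n);
    return (int)n;
}

const transport_ops_t loopback_ops = { loop_send, loop_recv };
```

A loopback table like this also doubles as a way to unit-test the protocol layer on a PC before any board exists.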

Another reason I'm leaning towards the soft-core approach is that there aren't any MCUs that meet the I/O requirements of the projects in mind. Going down the discrete MCU path would mean that I'd need an FPGA anyway!

But there is always the risk of the FPGA vendor pulling the plug on the soft-core. Hence all the other techniques to mitigate obsolescence of the code (HAL etc.).

Thanks Mark, PretzelX.


For anything longer than 5 years it is VERY difficult to maintain supply and support of products.

You may have to consider archiving the complete development system (and a backup) along with EVERY piece of software, with off site copies of hardware and software.

Even archiving all the documents and files, let alone the manufacturing tools, can be a nightmare: version changes at manufacturers, different OS support, suppliers going bankrupt, other parts going obsolete.

I know of several devices used in avionics where the special custom chips have been respun several times and recertified as the manufacturing processes became obsolete (4-inch to 6-inch to 8-inch wafer sizes). The main problem was a specific EEPROM device specified in one project, where a last-time buy covering 15 years' worth of production was done - but as a last-time buy of WAFERS! These devices are packaged and tested in annual batches.

It is not uncommon for products in a lot of industries to be around for 20 years! Just consider how many people still have working AVO meters or other test equipment.

Guy Macon has an interesting article on his website on archiving documents for several years.

--
Paul Carpenter          | paul@pcserviceselectronics.co.uk
    PC Services

... snip ...


I would say especially in avionics, where the testing and certifying process is so elaborate that great efforts are worthwhile to avoid repetition. I know of one firm that, at least 8 years ago, had a product based on the 80186 chip. They had bought a large number of packaged chips when they went out of production, and also a herd of wafers, specifically to avoid a future software recertification problem. The hardware production recertification would be easier and cheaper.

Note how rarely we hear about air crashes due to computer bugs.

--
 [mail]: Chuck F (cbfalconer at maineline dot net) 

There are free, open-source soft-core processors out there that would negate the prospect of the vendor "pulling the plug". Choose a legacy core and you have a choice of free, open-source compilers/assemblers as well - obsolescence won't be an issue (although certification of the core/tools may be, in some applications, I suppose).

Choosing an open soft-core with open tools and implementing an abstraction layer for your PC/external interface is the best way IMHO to future-proof the design.

Regards,

--
Mark McDougall, Engineer
Virtual Logic Pty Ltd, 
21-25 King St, Rockdale, 2216
Ph: +612-9599-3255 Fax: +612-9599-3266
