Release/Revision standard notation?

Good for you. An old employer of mine that I still occasionally work for treats mechanical design work as real engineering, electrical design work as slightly mysterious real engineering, and software design work as strange, wacky black magic that can only be done under the light of a lunar eclipse with the aid of a freshly dead chicken.

Hence, when a mechanical guy has a preliminary design review everyone shows up with the expectation that they'll understand what's said, when an electrical guy has a design review everyone shows up with the expectation that they'll understand what's said, and when a software guy has a review only software people show up -- if you can drag them away from coding long enough to do so.

Drives me up the wall.

Good for you.

The process that seems to work when you have "big" embedded software is to use the VCS and the platoon (_not_ army) of developers to generate release candidates, release the binary image through the same part numbering system that you speak of, and keep the VCS up to date (through discipline, again) in the background to ensure reproducible code.

Part of this is done by never, ever, letting a developer submit a binary image for release -- you go out and hire at least one person who knows how to start a build, but doesn't want to play with the code. Their job is to build and test software, and to kick the developer in the shins if it doesn't build or test out. If they can't check code out of the VCS into a virgin directory, build it and have it work, then the software is broken -- with no allowance for the developer insisting on special gyrations to make that particular build work.
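
To make the "virgin directory" check concrete, it can be as simple as a script the build person runs: a fresh checkout into an empty directory, then the stock build command, with nothing inherited from any developer's working copy. The sketch below is only an illustration -- the choice of Subversion, the repository URL, and the "make all" build command are assumptions, not anything prescribed above.

#!/usr/bin/env python3
# Sketch of a "virgin directory" build check: check the code out of the
# VCS into a fresh directory and build it there, with no developer-specific
# setup allowed. Repository URL and build command are hypothetical.
import subprocess
import sys
import tempfile

REPO_URL = "svn://example.com/firmware/trunk"   # hypothetical repository
BUILD_CMD = ["make", "all"]                     # hypothetical build command

def main() -> int:
    with tempfile.TemporaryDirectory() as workdir:
        # Fresh checkout: nothing comes from anyone's working copy.
        subprocess.run(["svn", "checkout", REPO_URL, workdir], check=True)
        # If this fails, the software is broken -- no special gyrations.
        return subprocess.run(BUILD_CMD, cwd=workdir).returncode

if __name__ == "__main__":
    sys.exit(main())

If it exits non-zero, the build person goes and kicks shins; nothing gets patched up by hand to push a release out.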

--

Tim Wescott
Wescott Design Services
http://www.wescottdesign.com

Do you need to implement control loops in software?
"Applied Control Theory for Embedded Systems" gives you just what it says.
See details at http://www.wescottdesign.com/actfes/actfes.html
Reply to
Tim Wescott

These days I suspect the best approach to "archiving" a system would be to create a virtual machine with all the tools installed.

Although as Joerg/Tim point out, there's still a problem if any of it is dongled.

Where I work, our hardware releases are just "vX.YZ", where changes in X refer to new physical PCBs, changes in Y usually refer to some "significant" hacks on the board (blue-wires, or drilling out holes or something for mechanical pieces), and changes in Z usually refer to component value changes.

For software, we don't have any real standard, although I've generally been a proponent of just sticking a timestamp (full date & time) on the release -- which can be easily automated with the build tools. This completely eliminates the very common problem where someone finds a bug, some programmer fixes it (or at least attempts to), provides a tester with a new binary... but the binary still claims to be the exact same version number as the buggy release. :-(
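
For what it's worth, one minimal way to automate that (purely a sketch, not necessarily how it is done here) is to have the build run a small script that writes the current date and time into a generated header, which then gets compiled into the image. The file and macro names below (version.h, FW_BUILD_STAMP) are made up for illustration.

#!/usr/bin/env python3
# Sketch: write the build date/time into a generated header so every
# binary identifies exactly which build it is. Names are hypothetical.
from datetime import datetime

stamp = datetime.now().strftime("%Y-%m-%d %H:%M:%S")
with open("version.h", "w") as f:
    f.write("/* Auto-generated at build time -- do not edit or check in. */\n")
    f.write('#define FW_BUILD_STAMP "%s"\n' % stamp)
print("version.h:", stamp)

The firmware then reports FW_BUILD_STAMP wherever it reports its version, so a rebuilt binary can never claim to be the same build as the one it replaces.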

Reply to
Joel Koltner

...

Migrating code away from tools that are no longer manufactured/sold means that the simpler and more cost-effective solution for some markets is actually to archive complete computer systems, since changing the tools means revalidation/certification -- even if you could get new versions of the tools to work with newer computers/operating systems/etc.

Especially if you have a last-time buy on components, even at wafer level, because you are making parts for aircraft. Classic examples would be:

    Boeing 747
    C130 (Hercules in the UK)
    Nimrod (originally the Comet airframe, circa late 1950s)

Having recently looked at some testing equipment that has to work for at least 10 years, I am aware of the issues, and of keeping multiple paper/CD/network copies of documents, etc.

--
Paul Carpenter          | snipped-for-privacy@pcserviceselectronics.co.uk
    PC Services
 Timing Diagram Font
  GNU H8 - compiler & Renesas H8/H8S/H8 Tiny
 For those web sites you hate

Reply to
Paul Carpenter

In article, snipped-for-privacy@pcserviceselectronics.co.uk says...

No one specified code, though that is a bigger problem. How much bigger depends on the tools (another vote for VHDL vs. schematic, IMO).

If you're in a lifetime-buy situation the tools don't matter; however, documentation still matters.

Why single them out?

Why is test equipment a problem?

Reply to
krw

In my first job, all released documents were paper, even computer source-code listings, which would be impractical to retype and unlikely to be redone without error. PCB artwork source documents were hand-taped vellum.

In my last job, all released documents were electronic files archived on the local network, backed up on the corporate network across the country, and periodically archived on "tape" at a local disaster backup business. Most documents produced were part of the natural design flow (schematic diagrams, part lists (with some massaging), PCB artwork "Gerber" files, FPGA source code, etc). The most difficult and time-consuming documents to produce were Source Control Documents, a way of assigning our part number to someone else's product, which usually consisted of a cover sheet followed by pages copied from a vendor data sheet or catalog.

Reply to
Richard Henry

That's not a problem, it's a feature!

John

Reply to
John Larkin

... and pretty much all the clients I work for. Else I would most likely decline to consult for them. Without a formal release process you'd be only a few footsteps away from a major lawsuit should something happen. Because plaintiff's counsel will dig that out.

[...]
--
Regards, Joerg

http://www.analogconsultants.com/

"gmail" domain blocked because of excessive spam.
Use another domain or send PM.
Reply to
Joerg

Aha. I see!

--

John Devereux
Reply to
John Devereux

I often use two+ character revisions for etch/circuit revisions:

A    original etch and circuit match
A1   etch level A modified to circuit rev A1

B    etch level B roll-up, matching circuit rev B

To facilitate this, the ident layer shows the etch rev as a letter, with a white block where the circuit rev is added as part of the manufacturing/testing process.

I make sure to keep a record of which revisions are shipped.

--
Paul Carpenter          | paul@pcserviceselectronics.co.uk
    PC Services
 Timing Diagram Font
  GNU H8 - compiler & Renesas H8/H8S/H8 Tiny
 For those web sites you hate
Reply to
Paul Carpenter

That's not bad at all. Full MIL standards require an ECO to change something, so your "A1" becomes "assy 12345-1A ECO 1412 ECO 1618", with labels stuck on the poor thing somehow.

We are rarely forced to revise a drawing while keeping it coherent with the other rev B drawings, but when we are, we re-issue it as rev B1. This is generally to correct a drawing error.

John

Reply to
John Larkin

VCS is OK with me (one of my guys uses one) as long as each formal release is exported and released as a self-contained package of sources, scripts, and tools, sufficient to do a full build without the VCS.
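
One way to mechanize that kind of export -- again only a sketch, with made-up paths, a Subversion tag, and a zip archive standing in for whatever is actually used -- is to pull a clean copy of the sources out of the VCS and archive it alongside the tools needed to rebuild it:

#!/usr/bin/env python3
# Sketch: package a formal release as a self-contained archive of sources
# plus build tools, usable without the VCS. All paths/URLs are hypothetical.
import shutil
import subprocess
import tempfile
from pathlib import Path

REPO_URL = "svn://example.com/firmware/tags/rel-2.03"  # hypothetical release tag
TOOLS_DIR = Path("tools/gcc-4.1")                      # hypothetical tool snapshot
RELEASE_NAME = "firmware-rel-2.03"                     # archive name, no extension

with tempfile.TemporaryDirectory() as tmp:
    staging = Path(tmp)
    # "svn export" gives clean files with no VCS metadata in them.
    subprocess.run(["svn", "export", REPO_URL, str(staging / "src")], check=True)
    # Copy in the exact tool snapshot used for this build.
    shutil.copytree(TOOLS_DIR, staging / "tools")
    shutil.make_archive(RELEASE_NAME, "zip", root_dir=staging)

print(RELEASE_NAME + ".zip written")

The archive is then what gets released and part-numbered, and a full rebuild needs nothing but its contents.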

Personally, I can control my revs and versions without one. Sometimes we pass a file-ownership token between guys. It's a piece of wood with TOKEN painted on.

John

Reply to
John Larkin

In article, snipped-for-privacy@highNOTlandTHIStechnologyPART.com says...

Do you test the entire process on every OS release and hardware upgrade?

That's the minimum that should be done. I'll have to see what we're doing along those lines. At least there is a manual path back from "disaster".

Reply to
krw

No. Whenever the tools change, we dump another set into the M:\TOOLS folder, where the M: drive is our library server. That's probably enough to untangle the mess if we really need to go back and rebuild an ancient software product.

All my assembly stuff uses batch files and programs that run under DOS or the Windows command line. That includes a few homebrew things that we have the source code for. So far, I've been able to rebuild 10-12 year old stuff with no problems.

I won't let any of my guys use tools that we can't archive and be reasonably sure we can run a decade or two into the future. If they use a VCS, each formal release must be a set of clean files, all on its own, independent of the VCS. And the next rev must start from those released files, not from what the VCS thought the last formal release was. KISS.

John

Reply to
John Larkin

I think that rules out most Microsoft compilers? :-)

On Windows Vista, nothing prior to Visual Studio 2005 is officially supported... although it turns out that Visual Studio 6 from 1998 will 99% work with very minor workarounds. Of course, those workarounds are not found on Microsoft's web site :-( ...but Google knows where they are.

That's an entirely reasonable process that I doubt anyone would object to. I believe in "management by interfaces" -- use whatever tools you feel like, but when it comes time to release, we have a well-defined "interface" that says what your code has to do... or, in your case, how the code needs to be built.

---Joel

Reply to
Joel Koltner

[...]

I have not needed to delve into the networking yet - everything seems to work fine as it is. I did create a shared folder link so that my Linux home directory appears as a drive letter in VirtualBox.

I was surprised how good the hardware support is. It creates some kind of "virtual" USB controller, so you can use any USB device with the manufacturer's drivers.

I also use the "headless" mode where you can connect using a remote desktop ("rdesktop") session. So I can be working in the office, then move to the "lab" machine and carry on using that same session.

--

John Devereux
Reply to
John Devereux

A networking setup like mine is particularly useful if the host is Windows. Linux has vastly more networking tools, so my VirtualBox setup lets me run these in a virtual Linux box (the NAT networking can't pass anything other than UDP and TCP/IP packets - no pings, arpings, or other protocols). It is also useful if you want to access the virtual machine directly on the network from other machines.

I've found it can be a little unstable at times, but apart from that it is *very* useful to be able to use USB devices in the virtual machines. It's particularly handy if the drivers for the devices are not too great, or have conflicting versions, or if you want to test installation routines. Just make a snapshot of the virtual machine before they are connected, then try it out. If it doesn't work, you can quickly roll back.

Reply to
David Brown
[...]

Also, I don't feel the need to run AV software on the Windows VM, or keep it continuously updated. Although it is *theoretically* still needed, I don't do much browsing or read email from Windows (and when I do, it is with Firefox). That in itself speeds things up enormously.

[...]
--

John Devereux
Reply to
John Devereux
