Moving from 8051 to AVR

Disagree; the most significant driver is time to market. If cost were the main driver, then people would take the time to do optimized custom chips. Many don't, even when they have the volume to motivate developing a chip. They select from available circuits on cost, but then cost is second in priority, not first.

If you can save $0.5M earlier, then you can surely put some money in the bank for a support contract... You have less chance of saving money if you are limited to the architectures supported by your assembler. Assembler = less competition. Changing to a new C compiler can be very quick and low cost (gcc?)

I do understand that people who refuse to use high-level languages might be less than fascinated by RISC processors...

--
Best Regards,
Ulf Samuelsson
This is intended to be my personal opinion which may,
or may not be shared by my employer Atmel Nordic AB
Reply to
Ulf Samuelsson

Shudder. If that is the future direction, (and you are probably right) then the AVR will fade, and all the interns will use ARMs...

I'm lost. How do the extra opcode features of the C51, like direct memory access and Boolean operators, make code LESS readable?

?! - I think you mean it made the core smaller, and simpler.

There is no reason, on today's silicon, for on-chip RAM to be a bottleneck of any kind. FLASH is today's bottleneck, and the AVRs are limited by that.

-jg

Reply to
Jim Granville

It certainly can be!

Jon

Reply to
Jonathan Kirwan

First off, I want to point out that you are quoting someone I quoted (Ian) and responding to that quote, here. It's not my quote, just in case anyone missed that detail. Now to my comment to your position:

That's important. But C isn't necessarily an always-winner here and that isn't the most significant driver in all cases, despite your assurances otherwise. There are external factors that are imposed as constraints to what may be offered -- and these include cost, power consumption, size, e-radiation, heating, etc. It is my firm conclusion that in at least a few cases I know of, our competitor using C has, in fact, caused their products to become uncompetitive by comparison. I won't belabor that issue -- as it would require far too much transfer of information here for me to convince you -- suffice it that I assure you it is the case.

But I will agree that time to market is an important factor in almost every case. I just don't agree with it necessarily being the "most significant driver" in each and every case. It is here that you reach too far with your statement.

In any case, in many projects I've worked on choosing assembly or C has little impact on the time to market. I happen to be very competent in assembly programming on a wide variety of processors. I also have used C since 1978, so I figure I'm reasonably competent in its use, as well. In my domain of projects, this particular choice usually has a small impact on time. And since that isn't the only factor I am asked to weigh by clients, the choice of which remains a decision that isn't always taken in favor of C.

But perhaps you are addressing yourself to cases where the choice between C and assembly is actually visualized in your mind as one between "C, where the programmer(s) have good experience in using the language" and "assembly, where the programmer(s) are incompetent at applying it to entire projects." In that case, of course time to market would be horribly impacted. But then, that's a matter of skill sets. I could reverse the symmetry with the opposite result. (Also, as often happens, the concept of mixed assembly and C seems lost in all this.)

You mean, actually, if time weren't important at all. But that is arguing from an extreme. Of course, time is important. No one is arguing differently. So I hope no one finds this argument from an extreme as persuasive about the realities of using assembly or C or whatever else.

The point is that time to market, while important, isn't always taken in the extreme view you appear to be pushing here. The evidence you offer, that of "optimized custom chips," doesn't apply as it assumes that time isn't a driver at all. But that is nothing more than a strawman argument, as no one has tried to argue that time doesn't matter.

If so, that could be true. However, I don't know how that guarantees anything. Businesses go out of business, people die, new ownership of compiler products may change their policies including how they interpret old contracts or whether or not those old contracts are even binding on a new corporate owner, etc.

I don't find that quip of yours encouraging, frankly. And in my experience, when there are other options available (such as the GNU toolset or using assembly tools, as two examples of what I mean), then some serious consideration and balancing of priorities is worthwhile.

I'm not saying that commercial products are wrong to choose. But the point here is that many commercial vendors work far too hard to protect themselves from some perceived or real harm and that these protections can act to injure the very people who are honestly their truer customers. Dongles and weird copy protection schemes cause undue excess effort to take place in re-invigorating an old project for new modifications, even in the better cases where the vendor still exists and is willing to try and help out. In other cases, where the vendor doesn't even exist at all and there is no one available to provide a dongle or software key anymore, it's a terrible thing. One that could have been avoided, if there were other options at the time.

I don't imagine things nearly as simple, black and white, as you seem to, Ulf. My actual experience with these exact problems goes back decades, as I am still to this very day supporting products using the Lattice C compiler, old Intel tools, special customized software to handle the unique OBJ formats of both, etc., all for cross-development to an 8088 processor that is in a custom configuration totally unlike a PC. There is no Lattice, anymore. They are gone. And this is only one of many such examples. I also have had to deal with cases where the old tools are no longer even _known_ about by the current support staff and there is almost nothing they can even do, despite wanting to help. And there are other different circumstances.

Some foresight about these possibilities can help make a better choice about which approach to take and how to hedge against future problems. How that plays out, of course, depends on the circumstances and the vendors and a lot of other factors.

But nothing quite so simple as you seem to imagine.

I actually LOVED the MIPS R2000 in assembly!! I also love the ADSP-21xx, which allows me to "pack multiple instructions" in certain cases. I loved the HP-21MX processor, which allowed me to do microcoding of instructions. I loved the Alpha!! Don't get me started....

Jon

Reply to
Jonathan Kirwan

Depending on the DVD player, it may or may not use a 32-bit processor anyway, or it may be using dedicated chippery to do all the conversions.

Printers fall into two camps: real printers and spawn of the devil!

Real printers have the processing inside the printer itself, more often than not a 32-bit processor with an optional network interface. It is highly likely that, to support things like PostScript and networking, they use high-level languages and an OS.

Spawn-of-the-devil printers use the host to do all the processing (plus other pop-up and web-access applications), using a high-level language, normally on the host.

Also with printers there is a lot of reuse as the differences between model A, A1, A2, B3, quirkyname 3000 and sillyname 5000, are mainly branding, styling and minor changes in performance.

Keyboards are very much minimal code and power optimisations. Depending on the complexity (and volume) of the alarm system, the vast majority are probably written in assembly on the cheapest and smallest possible micro.

--
Paul Carpenter          | paul@pcserviceselectronics.co.uk
    PC Services
              GNU H8 & mailing list info
             For those web sites you hate
Reply to
Paul Carpenter

Most DVD players do the decode (hard) stuff in licensed hardware. There is usually a 4- or 8-bitter for the user interface.

I think there is a third camp, because you assume I am talking about printers attached to PCs. There's a whole bunch of other printers in ticket machines, label printers and the like, many coded in assembler (I know - I developed a bunch of them).

I know the original PC keyboard used an 8051, and I have a strong suspicion that most still do.

Ian

Reply to
Ian Bell

OK, even if I buy that, my argument still stands.

No, you take it off the unit price and blow the competition away, because if you don't, they will.

What are you talking about? When you develop in high volume you pick your micro, and its manufacturer GIVES you the assembler, ICE, and all the other development tools.

Assembler = greater choice of microcontrollers

Ian

Reply to
Ian Bell

Yup, 1981 IIRC. It was originally the Dragon 16, but a few weeks before production started, Clive Sinclair brought out the Spectrum (16K), so we had to be better than that. We jumped through some hoops to do that upgrade. We even did a run using some Siemens 16K RAM chips which came in two versions with opposite-polarity chip enables. We literally piggybacked them in pairs. Those were the days.

Ian

Reply to
Ian Bell

If I had a clue what that meant I would respond to it.

And I make a point of never employing them. I teach electronic engineers to code in assembler then they move on to C. Only that way can you get programmers who really appreciate how to get the best out of a small microcontroller.

You have got to be joking. I have never yet seen a single computer science grad worth employing.

They certainly cannot. They haven't got a clue.

Ian

Reply to
Ian Bell

I agree completely. I think I have said elsewhere in this thread that when you reach an ARM then C is the way to go. I simply object to Ulf's blanket statement that C is always best.

Ian

Reply to
Ian Bell

I cobbled together a bitmapped graphics display for the UK101 out of, IIRC, 16 x 2114 1K x 4 static RAMs. I piggybacked *all 16* on top of each other (except for the chip selects). It worked great, except you would get some "sparkling" pixels when the RAMs all eventually overheated :)

I seem to recall writing a simple screen editor and some graphics routines in 6502 *machine code*. None of your bloated assembler for me! :)

--

John Devereux
Reply to
John Devereux

I've been following your discussion and I cannot comment since I am still an EE student.

I can assure you, though, that Ian is right about the DVD players. I had a Norcent DP-500 DVD player, and my sister got a DVD stuck in there a few years back. So I opened it up to get the DVD out and, to my surprise, I saw a huge single-IC design. I was very confused and fairly impressed at the same time!

So I did look up a hack for it to be region-free, and it gave version numbers of the different firmwares and, lo and behold, I found something like:

"8052 - v1.02.03"

It did also have a DSP on the chip, which I guess was used for the DVD/MP3/VCD decoding.

So I came to the conclusion that the 8052 was probably the one responsible for getting controls from the remote and driving the DSP and the other unit that was reading the discs. I was curious as to what speed it was operating at, but the single-IC design probably has some sort of PLL or clock divider, because I remember seeing a standard crystal.

I found that to be very interesting indeed.

Thought it would be a nice story in addition to backing up Ian's claim.

Reply to
Isaac Bosompem

While designs like that exist, they're not the norm any more. The decode is still done in hardware, but the core attached to it is usually 32-bit (sometimes ARM, sometimes MIPS, sometimes proprietary). This is driven by the desire for more advanced user interfaces.

Reply to
larwe

Exactly. If you are using assembly and your competitor is using C, you will lose out every time on time to market. Saving a few pennies while losing market share is not a winning situation.

High volume is a relative term. It could mean 20,000 units a year or 20 million units a year. Today, I would guess a large percentage of high-volume units are written in C. For instance, I highly doubt the iPod is written in assembly.

Reply to
diggerdo

I would understand that being the case as well. That DVD player I was talking about was bought back in 2002 or 2003; I am sure much has changed since then. I do have an online friend in China who was doing some work on a MIPS-based set-top box.

Reply to
Isaac Bosompem

And putting out the wrong product fast is no better. Point is, there are many constraints operating on real products. Time to market is important, of course. But so is providing the right tool.

The iPod isn't a good example of the embedded systems I've had access to, nor is it an 8-bit system. Strawman arguments appear to abound in this religious desire to "prove" that C is "always best." But if you happen to be fortunate enough to have a product selling at that price point and volume and customer-service level, more power to you.

Jon

Reply to
Jonathan Kirwan

but not among the members of the ISO C panels

--
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\
\/\/\/\/\ Chris Hills  Staffs  England     /\/\/\/\/
/\/\/ chris@phaedsys.org      www.phaedsys.org \/\/\
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/
Reply to
Chris Hills

In article , Ulf Samuelsson writes

God help us!!! Those are the sort of IT programmers that should never be let loose on embedded systems (of any size).

Most embedded systems have some global variables because of the hardware.

You try comparing

Whilst we are on about precision, I assume you mean ISO C? Which C do you mean: ISO C90 or C99? (Or do you really mean ANSI C89?)

Most compilers still track C90

Reply to
Chris Hills

Direct memory access means that you are using up decoding space that should be used for registers. It is not an "extra" opcode feature; it is a waste of decoding space.

Lack of registers means that local variables cannot be optimized into registers. This means you are forced to use global variables, i.e. you are making the code harder to read and thus less maintainable.

If you have more SRAM than you need, which is often the case in 0.18 µm processes, then the use of Boolean operators makes code less portable/readable. Writing a Boolean variable is more expensive than writing a byte variable, since you have to do a read-modify-write, or else go to a much more expensive SRAM implementation.

Reply to
Ulf Samuelsson

I am not talking about 1-2kB programs here. If you have a significant amount of code written in assembler, you are locked into that architecture. If someone comes out with a new chip at half the price and you have 500kB of assembler code, you cannot easily change that. With C, I know people who have ported such code in a week.
Reply to
Ulf Samuelsson
