Moving from 8051 to AVR

Yes. I've seen exactly that happen, in fact.

hehe. In the case I had cited re: 8088, there are some custom tools to convert certain fields so that one tool's OBJ format was compatible with another tool's. It's life in embedded work.

Jon

Reply to
Jonathan Kirwan

Using malloc in a system with 128 bytes of RAM.

const char[]="fred"; with no regard paid to whether this is in code space or data space (on CPUs where it matters).

All kinds of assumptions about brownout and glitch recovery.

Failure to implement hysteresis on battery level monitoring code.

Either never enabling watchdog, or kicking it in timer ISR without making any attempt to check if the system is in a functional state.

etc etc.
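The hysteresis point above is easy to sketch. This is a minimal, hypothetical example (the threshold values and names are mine, not from any real part): a battery monitor with a single trip point will chatter when the reading hovers near it, so you use two thresholds.

```c
/* Hysteresis on a battery-level monitor: a minimal sketch.
   Threshold values are hypothetical, not from any datasheet. */
#include <stdint.h>

#define LOW_BATT_ON   3000  /* mV: assert low-battery below this  */
#define LOW_BATT_OFF  3200  /* mV: deassert only above this       */

static uint8_t low_batt = 0;

uint8_t update_battery_flag(uint16_t mv)
{
    /* Two thresholds keep the flag from chattering when the
       reading wanders around a single trip point. */
    if (mv < LOW_BATT_ON)
        low_batt = 1;
    else if (mv > LOW_BATT_OFF)
        low_batt = 0;
    return low_batt;
}
```

A reading of 3100 mV leaves the flag wherever it already was; only crossing clear of the 3000/3200 band changes it.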

Reply to
larwe

Indeed, you said "a person educated at university on a PC". Care to be a bit more specific about what that means?

Ian

Reply to
Ian Bell

Well a) 500K of Z80 was not a smart thing to do in the first place and b) porting it to an AVR was an even less smart thing to do.

Ian

Reply to
Ian Bell

But why would they 'require' something as powerful as an ARM?

Ian

Reply to
Ian Bell

I agree. I think the ARM will become the 8051 of the 21st century - then Ulf will have his way and we will all be C programmers.

Ian

Reply to
Ian Bell

Perhaps I'm reading too much there. However, I would have imagined that instead of just differing and taking an opposing position, you might have expanded upon things showing that there is "texture" to the question, instead of putting it so abruptly.

So I gather you agree that neither cost nor time to market is always "most significant," but that things depend. Good.

I often get cost targets and features and noise specs and delivery times desired and a whole host of factors, some starting out supposedly more important than others. We debate them, collect more information, debate some more, and think some more, mentally work through different priority arrangements and choices. Over some modest time, much more clarity arrives and we close towards an acceptable plan.

I just don't get things that look so black and white, plain vanilla, as your comment appears to suggest. So perhaps this is something else I don't well understand from your writing.

That could be. Either that is my failure to read, or yours to explain. Or both. Or neither.

If that is what you are concerned over, I'll say what I've said elsewhere, again. In my experience, projects are a great deal more than just the "software coding time." Most of the IP is in a host of other factors, and a great deal of work surrounds the mere coding work. For example, in one case I have IP going back almost 20 years (more than that, counting other folks' efforts, but I'm limiting myself to the span of this processor architecture within the product area -- ADSP-21xx) and spanning literally many dozens of instruments -- most still operating well. There are new research efforts ongoing, cost reductions, shrinks, etc. Porting the assembly code is the least of my worries. And it wouldn't be a serious problem should we choose "something else" and port the assembly. Not in comparison with the entire project and the dozens of people pushing different fronts.

So "easy" is in the eye of the beholder. And frankly, with my skills in assembly coding, I can port over to most any processor quite quickly if I stayed in assembly. I might not, as this is almost 20 years later on and things have definitely changed since then. But that is another decision I'll take in the context of many other things.

No idea what you mean to suggest, here.

Jon

Reply to
Jonathan Kirwan

:) - wow, that's some spin...

again, it comes back to make the die smaller, but at the classic RISC cost, of larger code....

Well, clearly we have different coding styles; I have no problems putting local variables in registers, or passing params in registers.... I can also switch register banks...

As to 'more registers': in the AVR, remember your solution to the lack of boolean space was to 'use some registers', plus the upper 6 are lost under the pointers, and the lower 16 have different opcode access. (In the AVR, not all registers are equal, due to a lack of opcode space.)

Then your advice was to allocate more registers for interrupts, as the 'fix' for the absence of bank switching... Problem is, they are a very finite resource.

The C51 has BOTH register, BIT and DATA memory opcodes...

Which AVRs are 0.18u ?

Wow, I've never seen the 'less readable' claim before.

In the C51 I can write :

IF FlagOne OR FlagTwo THEN

Cost is 2 bits of cheap, data memory space, handled automatically by the linker. To me that is _both_ portable and readable.

In the AVR, if you have multiple modules with register-mapped booleans, there is no linker to allocate them; that is up to the user. Not very portable.... Solution: most users would take the easy path and just chew more RAM (and cycles), and Atomic - what's that ?

but you just said SRAM area was cheap, so this logic is circular... [and does not actually answer the original question ]

The point is the C51 silicon does the atomic RMW for you, whilst in the AVR the software has to fetch/mask/replace - so the 'simpler' SRAM has a significant cost elsewhere: in code and cycles.
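The fetch/mask/replace point can be sketched in C. This is an illustrative example (flag names are mine): booleans packed into one SRAM byte make every update a read-modify-write, which has to be guarded against interrupts in software, whereas the C51's bit opcodes (SETB/CLR on bit-addressable memory) do the same job in one atomic instruction.

```c
/* Packed booleans in SRAM on an AVR-class part: every update is
   a load/OR/store sequence, so it needs an atomicity guard if an
   ISR touches the same byte. Names here are illustrative. */
#include <stdint.h>

static volatile uint8_t flags;          /* eight packed booleans */
#define FLAG_ONE  (1u << 0)
#define FLAG_TWO  (1u << 1)

void set_flag(uint8_t mask)
{
    /* On real AVR hardware this would be wrapped as:
         uint8_t s = SREG; cli();
         flags |= mask;
         SREG = s;
       because "flags |= mask" compiles to load, OR, store --
       an ISR writing the same byte in between can be lost.
       The C51 sets a bit-addressable flag with one SETB. */
    flags |= mask;
}

int any_flag_set(void)
{
    /* The "IF FlagOne OR FlagTwo THEN" test from the post above,
       done with an explicit mask instead of bit opcodes. */
    return (flags & (FLAG_ONE | FLAG_TWO)) != 0;
}
```

The mask-and-test in `any_flag_set` is the extra code and cycles being referred to; on a C51 the same test is a direct jump-on-bit.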

I'll stick with the core that is multisourced, and was designed as a microcontroller, and on the projects that need RISC, I'll use ARM or maybe one of the FPGA SoftCPUs .... Funny how more competition seems to sharpen those ARM prices...

-jg

Reply to
Jim Granville

Most of the early ones did; these days they use a lot of chips/chipsets that are also used in set-top boxes and elsewhere, because these 32-bit processors with onboard decode and video hardware are cheap and plentiful for that model. The next model may use the next new chips/chipset for that model batch.

Real printers that did not need much processing, but I see more and more of these 'low' end printers being used to do more and more complex things, and some are already migrating to 32-bit processors. Mainly because people want to print colour logos, photos, and fonts like on their PC, and to network the devices or even have USB versions. A lot of this is quicker and cheaper in development costs (time, materials, labour) with 32 bit.

The original PC keyboard used something like the 8048 (maybe the 8042); all the latest boards are basic clones of that chip.

--
Paul Carpenter          | paul@pcserviceselectronics.co.uk
    PC Services
Reply to
Paul Carpenter

That would not have been a recent meeting at Ericsson, though, would it? :) [Seems AVR has been VERY costly to Ericsson - recent news ..]

-jg

Reply to
Jim Granville

I like the AVR, too, Ulf. I already pointed you to a post I made in 1999 here, where I lauded it over and above my other experiences in writing assembly code.

But this changes nothing in my comments today and yesterday. Both are simultaneously true. If you cannot see why, then you do not follow what I'm saying very well.

How could I argue with the way you have now defined the question?

If a C compiler is a requirement for your project and you meet your overall application better using some well-selected C compiler for CPU x than when using a similarly well-selected C compiler for CPU y, then you should probably use the one that meets the needs better. That is, of course, if there aren't other reasons that compete for attention in the project and would argue otherwise.

But I have to say that I don't recall making that kind of comparison in 30+ years. I can't recall everything else being so equal that the deciding issue was whether or not this particular well-selected C compiler for CPU x was better than that well-selected C compiler for CPU y, suggesting that I choose CPU x over CPU y on the basis of that narrow focus. In my experience, the decisions are taken for a variety of other reasons long before I start worrying about things like that. Stuff like I've already mentioned: peripheral features, availability, sourcing options, cost, power, packaging, future projections, and so on.

So I don't find your focus very useful, in practice.

Really? I thought you defined it as "better code size." But now you conflate it with this? I think you don't even know yet how you define "c friendly," except in that religious (and useless) meaning I already mentioned -- as "everything good." So now, if development time is better, then that is "c friendly," too. What else would you like to include, Ulf?

There is no objective meaning here, from how I read you.

Indeed they do care about development time.

My experience is that there really _are_ bad problems with certain C compilers that can crop up and eat your time away. But none of my experience says that the truly bad issues have anything to do with the processor, as an issue inherent to it (though I may grant an exception or two, it's not the usual case for 8 bit CPUs and above, today.) The time drain I've experienced in using C compilers has had to do with poor design or implementation by the vendor that I had to track down. And I can't say that this has been more or less, by CPU family.

But on the subject of C compilers, it's been my experience that it is the tool vendor I've had to worry about, not the CPU itself.

I guess my main point, though, is that CPU manufacturers like to push their product on some canard, like "my CPU is more C-friendly than theirs!" But it's (1) more misdirection than fact, and (2) isn't an important focus, anyway. What counts are the larger fundamentals about the CPU family -- its peripheral features, its availability, sourcing options, cost, power, packaging, business model relationships to your own, and whether or not the actual tools together with all the rest can serve your application well. So I've no idea why a narrow focus on "c friendly" is meaningful to anyone much except for those marketing their CPUs and those writing compilers for them. At least, you haven't managed to convince me about it.

Jon

Reply to
Jonathan Kirwan

I guess Ulf meant 500K of source code....

Meindert

Reply to
Meindert Sprang

Was that a serious suggestion ?

This is an old, maintenance product - who in their right mind would throw a new tool chain at that ?

[ Oh I know, one of your fresh new graduates... :) ]

-jg

Reply to
Jim Granville

I'm curious. How so?

Jon

Reply to
Jonathan Kirwan

I was stunned for a moment trying to imagine someone seriously saying "why not switch to another C compiler?" in a case like this. But I did get a chuckle.

Yikes! ;)

Jon

Reply to
Jonathan Kirwan

Try

formatting link

but I especially liked the desperate spin elsewhere, as Atmel tried to avoid 'win the battle, but lose the customer'. "We still value Ericsson as a customer.." stuff

Will Ericsson's top management even look at Atmel now, if there is _any_ alternative ?

-jg

Reply to
Jim Granville

Not a lot of detailed information there. Now I'd like to see the arbitration documents. But I'm betting they are sealed up very tightly. ;) It reads as though Ericsson was made privy to trade secrets, voluntarily by Atmel, through some association with Atmel and that they violated reasonable expectations by Atmel about those information exchanges.

But if that is so, it seems a little surprising to me that the limitations (those outside of the monetary award, I mean) were simply that Ericsson cannot embed an AVR that isn't fabricated by Atmel's FABs and bought through legitimate channels from Atmel. The decision suggests to me that the trade secrets involve the AVR CPU design. But Ericsson isn't in the business of competing at selling CPUs and, frankly, beyond that value I don't consider an AVR having IP worth taking these kinds of risks in damage awards and limitations. I imagine there are so many other options available to a company like Ericsson, these days.

Thanks for the article.

Jon

Reply to
Jonathan Kirwan

Pretty much my reaction.

I get the impression both Atmel ( and Ericsson!) were surprised by the amount awarded.

formatting link

You could say that makes the AVR one of the most expensive small-core IPs in history.... :)

The AVR is not that much ahead of something like the Mico8, which Lattice have open sourced, and there are plenty of better cores about...

Yes, and I suspect they will be avoiding single sourced, ( or not open ) cores now .... :)

This has to have harmed Atmel's AVR business, long term. Suing one's customers, right thru to open court, is not a good look, and not quickly forgotten. Normally, you bring them to heel, and sort the license fees.

-jg

Reply to
Jim Granville

In article , Ulf Samuelsson writes

You are going to pick up embedded or electronics engineers. You are NOT going to use IT or computer science programmers.

I would much prefer to teach someone C than take a PC programmer (VC++, C++/CLI, .NET, VB) and have to un-train a lot of poor programming and hacking techniques before training them in proper SW engineering in C for embedded systems.

You only have to look at the questions students ask on here as it is.

Then we agree, but that has nothing to do with PC programming. I don't see how the 51 means you have more globals than on any other MCU.

--
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\
\/\/\/\/\ Chris Hills  Staffs  England     /\/\/\/\/
Reply to
Chris Hills

AFAIK compiler writing is not taught in university courses. At least not in the UK.

There are probably around 40 compiler companies in the world (a guess); there are more than 10 I know of off the top of my head.

--
\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\/\
\/\/\/\/\ Chris Hills  Staffs  England     /\/\/\/\/
Reply to
Chris Hills
