PIC processors and ADCs

On a sunny day (Tue, 14 Apr 2015 14:37:53 +0200) it happened David Brown wrote in :

The requirements for a 'microcontroller' go up and up. If I look in my house, in the living room, in the factory, then:
Linux in the Samsung 3D TV (I have the source)
Linux in the Linksys wireless access point
Linux on 2 laptops
Linux on 3 PCs
Linux on 4 Raspis
Linux on my Humax cable modem
Linux on an IP cam, I think, not sure.
My LG robot vacuum cleaner runs Linux too I think (thing sucks BTW, do not buy that).

So, many if not most things where you would expect an 'embedded' board actually simply run Linux. In such a case, using a Raspberry (suggestion to Samsung) would make the TV more user-friendly, I'd think. I mean, I had to write and publish the software for that TV to actually play multi-language transport stream files.

Well I had a look, and I do not see the point. Looks to me like somebody started to make their own language, and then made a lot of mistakes, changed some in the next version etc. Why not learn C? Python may SEEM simpler to a beginner but it really is a mess.

I dunno, people talk about whats-that-thing Arduino; it is using software written in that horrible C++ language. BTW, C++ (as well as Java) is a crime against humanity. Sure, people sell add-ons for that 'duino, it is business, but I would not want one, and AVR has no place.

Same for C++ compiler vendors: it is crap, will always be crap, and look at the damage it has done to humanity. Is not that MS OS written in C++? At least I remember Visual C++ coding, what a joke: coding in little windows in all sorts of colors, like a kid coloring cartoons. And bloat no end.

So, anyways, it seems Linux has taken over completely in the embedded world, and multicore Raspis are the present and future, cool. Now all we have to do is make little interface boards for whatever it is we need that fit a Raspi GPIO, and we can fly to other planets. Well, it will need fixed FLASH, not that SD card and its connector, to live through launch. Could even make the F35 fly ;-) Well, after changing its design to a Chinese one.

So, finally an embedded standard! And a good learning tool, pick your language.

Maybe I should buy one, but I'm a bit busy studying lately; need to do some exams.

Reply to
Jan Panteltje

On a sunny day (Tue, 14 Apr 2015 05:19:34 -0700 (PDT)) it happened Lasse Langwadt Christensen wrote in :

Does it have HDMI, Ethernet, USB, memory? Does it run Linux and gcc? Does it need special cables to program? Can you ssh to it?

I have not used a debugger since 1986 or 1987, last century. You do not need a debugger in a high-level language like asm or C. Mm, not even in binary.

Reply to
Jan Panteltje

On a sunny day (Tue, 14 Apr 2015 15:43:35 +0200) it happened David Brown wrote in :

My PIC 18F14K22 runs at 64 MHz, 4 x PLL from internal osc.
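[For reference, a minimal C sketch of that clock setup (not Jan's code), assuming XC8 register and bit names for the PIC18F14K22 and that the FOSC configuration bits already select the internal oscillator block:

    #include <xc.h>

    /* 64 MHz from the internal oscillator: HFINTOSC at 16 MHz, 4x PLL.
     * Instruction rate is Fosc/4 = 16 MIPS.
     */
    static void clock_init_64mhz(void)
    {
        OSCCON = 0x70;              /* IDLEN=0, IRCF=111 (16 MHz HFINTOSC), SCS=00 */
        OSCTUNEbits.PLLEN = 1;      /* enable the 4x PLL -> 64 MHz                 */
        while (!OSCCONbits.HFIOFS)  /* wait until the HF oscillator is stable      */
            ;
    }
]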

Reply to
Jan Panteltje

On Tuesday, 14 April 2015 at 16:13:51 UTC+2, Jan Panteltje wrote:

it's a microcontroller, not a PC

you can program it with a standard USB cable, or bootload via UART/USB/CAN etc. You can compile code for it with gcc; some of them have USB.

you don't need it but it makes things easier, just like driving with your eyes open is easier than keeping them closed and just bumping against the curb ...

-Lasse

Reply to
Lasse Langwadt Christensen

There are plenty of embedded Linux systems, but to my knowledge none that run on a microcontroller. There is no hard and fast definition here, but there are common places to draw lines. To be called a "microcontroller", the system has to be a bit more integrated - a good rule of thumb is that all the program flash and ram are embedded in the same chip. In most cases, the device runs a single program - any OS is linked directly with the application.

A discussion of the advantages and disadvantages of Python is getting quite off topic, but as a general point it is a much higher level language than C. And given that it is currently one of the most popular programming languages, I would not classify it as just "someone started to make their own language" (although that is how it started, just as for almost all programming languages).

If you don't like it, that's fine - but if you know nothing about the language, don't presume to tell others to stay away from it.

I think you are straying into personal opinion, rather than subjective fact :-)

No, MS's OSes are not written in C++ (though parts of them are).

And lots of stuff written for Linux is in C++, even though the kernel and most of the historic code is in C. You may be shocked to learn that there is a lot of Python code in a typical Linux distribution - as well as Perl, Tcl, C, C++, Java, Objective-C, OCAML, Lua, Bash, C#, and a few other languages that I've forgotten to list.

People write good software and crap software regardless of the language.

That's the IDE, not the language.

No, it has not. It is very popular for bigger embedded systems, and its use is spreading, but it has certainly not "taken over" - nor will it ever do so. And for microcontrollers and small embedded systems, it is non-existent.

When programming for embedded Linux, I mostly use Python. That's one of the reasons for using embedded Linux in the first place.

Reply to
David Brown

On a sunny day (Tue, 14 Apr 2015 16:23:24 +0200) it happened David Brown wrote in :

That explains it!

Reply to
Jan Panteltje

On a sunny day (Tue, 14 Apr 2015 07:22:53 -0700 (PDT)) it happened Lasse Langwadt Christensen wrote in :

Once there were chips, then there were processors, then there was PIC, then there was Raspberry, now there is quad-core Raspberry. Why bother with less, and lesser tools?

You cannot compare that way, I'd say; those who need the debugger are the ones not looking when driving, who then every time need to get out of the car to assess the damage from hitting the curb (and many other things) ;-)

I mean, in C, watch your pointers and types. In asm, well, I dunno what you guys do wrong; I simply never have those problems. Write modular code, test register values if I need to see them via a 'print' statement over serial, specify some debug mode if need be to print all those vars. But as I never have those problems, it is like bad driving: I cannot help you with that either ;-)

In asm I have some routines to print registers in hex or decimal (32 bit on PIC). All ye need.
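[A minimal C sketch of that debug-by-print style, assuming only a hypothetical uart_putc() routine that transmits one byte over the serial port; the asm routines Jan mentions are not shown here:

    #include <stdint.h>

    extern void uart_putc(char c);   /* hypothetical: send one byte on the UART */

    /* Print a 32-bit value as 8 hex digits, most significant nibble first,
     * followed by CR/LF - enough to inspect a register without a debugger.
     */
    void debug_print_hex32(uint32_t value)
    {
        static const char digits[] = "0123456789ABCDEF";
        int shift;

        for (shift = 28; shift >= 0; shift -= 4)
            uart_putc(digits[(value >> shift) & 0xFu]);

        uart_putc('\r');
        uart_putc('\n');
    }
]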

Reply to
Jan Panteltje

On Tuesday, 14 April 2015 at 16:45:07 UTC+2, Jan Panteltje wrote:

It's about using the right tool for the job; there are plenty of things that a small MCU will do that a Raspberry running Linux will suck at.

and you get all of that and more for free with a debugger, so why waste time on it?

-Lasse

Reply to
Lasse Langwadt Christensen

On a sunny day (Tue, 14 Apr 2015 11:19:11 -0700 (PDT)) it happened Lasse Langwadt Christensen wrote in :

Sure, that is why I use PICs, but this is about Eva-Lotion.

Right, I would not waste my time on a D-bugger either :-)

Reply to
Jan Panteltje

The Arduino things are pretty useful if you just need a micro on a breakout board. Program them with GCC and ignore the other stuff if you like. ATMega whatever.

I don't really see much reason to pick one architecture over another in 8-bitters, but I've already been bitten a few times by premature retirement of processors (not PICs, so far, but **others**).

There's even a 'new' 8-bit family (EFM8) from Silabs. 8051-based so Keil works on it.

Reply to
Spehro Pefhany

On Tuesday, 14 April 2015 at 20:29:33 UTC+2, Jan Panteltje wrote:

So? With those boards you can get a much more powerful 32-bit MCU with better tools for less than the cost of a PIC and a piece of veroboard.

because starting a terminal and writing code to print stuff is so much easier than starting another program that does exactly the same out of the box

-Lasse

Reply to
Lasse Langwadt Christensen

He prefers his favourite flint axe to all other tools.

If your only tool is a hammer then every problem looks like a nail.

He does have a partly valid point though in that too many developers spend way too much time stepping aimlessly through code in debuggers. Used properly debuggers and simulators can save a lot of time.

Post mortem debuggers are even more impressive - I used to reckon on finding a root cause of failure almost every time a live system failed.

--
Regards, 
Martin Brown
Reply to
Martin Brown

On a sunny day (Tue, 14 Apr 2015 14:33:43 -0700 (PDT)) it happened Lasse Langwadt Christensen wrote in :

Well, and that learning curve.

Sometimes I wonder if the art of programming is lost.

These days you have an app for everything, I suppose. Why not get one to do the coding for you? Does not matter what language it uses internally, as long as it works and beeps when ready, like a microwave...

Reply to
Jan Panteltje

On a sunny day (Tue, 14 Apr 2015 15:38:20 -0400) it happened Spehro Pefhany wrote in :

Cool, I once wrote a simple 8051 assembler, should be on my site.

Also I have an 8052AH BASIC computa that I built; it can also program xx32 EPROMs. Now that was in the eighties, I think.

Although that computah still works, a simple 18F is much more powerful: peripherals, ADC..., power consumption.

In the eighties the Intel assembler was something like 1 k$.

Things have changed. Don't look back. Wonder what's next...

Reply to
Jan Panteltje

On a sunny day (Tue, 14 Apr 2015 22:40:24 +0100) it happened Martin Brown wrote in :

Nope, I have worked with more tools and systems than you can imagine.

Exactly, and if you write good code, the extremely rare event where you get stuck is likely a processor or peripheral issue, and can be pinpointed in no time with some test routines.

Does not ripping apart dead people smell? Wait till the bugs have done their work and use the skeleton for biology lessons.

;-)

Reply to
Jan Panteltje

Once there were feet, then there were horses, then there were cars, then there were trains, then there were planes.

Why would anyone bother to walk to the postbox when they can fly in a jumbo jet?

Reply to
David Brown

Note to the partially illiterate - a PIC18F is not the same as a PIC16F, which was the family in question. The AVR example I gave has a cpu core that runs real code something like 10 to 20 times as fast as the PIC example chip (which itself is bigger and more powerful than the one the OP was discussing).

The PIC18 core has a number of improvements over the PIC16 core, not just a higher clock speed (in particular, it has linear memory addressing rather than bank switching all the time). At 64 MHz it might be getting close to the speed of an AVR at 32 MHz for some code, though not if multiplies or other serious arithmetic were involved.

It's still a crap core, just a little less crap than lower PIC families.

Reply to
David Brown

On a sunny day (Wed, 15 Apr 2015 10:30:30 +0200) it happened David Brown wrote in :

Nope, the subject line reads "PIC processors and ADCs". You are the illiterate one.

It so happens the PIC18 has a hardware multiply.

Although bank switching has its problems, it never prevented real programmers from writing functional code. The AVR is touted far too much. That architecture fell sort of between boat and wall.

Some (I think it was you) say ARM is the future.

I do not agree; I bought a book on ARM already in 1984 or whenever it was, and decided that integrated hardware (as we see in the xx86) was a better idea. I mean higher-level instructions. Hey, even the Z80, which I used a lot, was superior.

Of course, these days things grow towards each other: the cellphone market and Intel wanting into it. Low-power requirements, the quest for lower power.

Your own cores in an FPGA, whatnot; now we are full circle: design your own processor, your own instruction set, and integrate peripheral hardware in the same chip. But... it all depends on the application. But AVR? Well, I think there is an AVR core on opencores.org.


So if you are a diehard AVR type, then there you go. Ever programmed an FPGA?

Ever programmed ANYTHING apart from that 'duino?

Reply to
Jan Panteltje

On a sunny day (Wed, 15 Apr 2015 10:23:23 +0200) it happened David Brown wrote in :

Y knot?

I dunno, I do not teach.

Well, when asking that sort of question...

Reply to
Jan Panteltje

Despite the subject line, it was the PIC16F that was under discussion and used for comparison.

(But I apologise for the completely unnecessary "illiterate" comment.)

That helps.

The AVR has been an extraordinarily successful architecture, and has brought 8-bit processing to a new level. It certainly has its flaws and limitations, but it brought modern cpu principles into the 8-bit world. Putting aside issues of peripherals, package types, long-term availability, etc., as a cpu core there is nothing else in the 8-bit world that beats the AVR for ease of use or programming.

Most people, including me, say that.

Again, I think there are a number of flaws (or "scope for improvement") in the ARM architecture. It is not my favourite 32-bit ISA - I think the 68K (including Coldfire) is nicer in many ways. But the combination of a solid, scalable, efficient architecture combined with market forces has made ARM /the/ standard. Whatever you might think of it, Cortex M is without doubt the future for microcontrollers. In some ways it is a shame - I would prefer to see more competition (such as from MIPS small cores). But this is the way of the world.

The x86 chips did not have much integrated hardware at that time. The ARM did not appear (outside prototypes) until 1987. It was a processor - no integrated hardware. And it was so much faster than the x86 chips that you could emulate an 80286 in software at a solid speed (about half the speed of the real 80286 hardware, IIRC). I only did a little ARM assembly coding at the time - I didn't have any books (though I sneak-read a few pages in a bookshop), so I based my programming on extending a few existing programs.

The Z80 had higher-level instructions than the ARM2 - but the Z80 was a CISC chip, while ARM is RISC. The trend towards RISC was starting at that time in mainstream processors, and has continued - RISC is dominant in all except legacy markets (where compatibility keeps old architectures around despite their inefficiencies). High-level instructions in assembly have long since been replaced by high-level features of programming languages.

Intel (with x86) is non-existent in the cellphone market. They have some design wins in the tablet market, where low processor power is less relevant (since the whole device uses more power, dominated by the screens). But it is nice that the competition from ARM has led Intel to put work into power efficiency.

That is on its way out. Soft processors are used for some small or extremely specialised tasks in FPGAs, but hard macros (now always ARM cores) are the key cpu units in most FPGA designs.

Yes, though I haven't done many FPGA designs.

I haven't used an Arduino. But the cores I have programmed include 6502, Z80A, ARM, HPC, COP8, PIC12, PIC16, M68K, ColdFire, PPC, MIPS, x86, NIOS, XMOS, MSP430, TMS320, 6800, 56F DSPs, Blackfin, AVR, AVR32, and a few others that I have forgotten. Some of these I have only used briefly or for a single project. In every case, I have done at least a little assembly, but the majority of the work has been with C.
Reply to
David Brown
