I am new to PIC programming. I have done two small projects so far, and I am interested in buying a PIC programmer, but I am not sure which one to get. Based on what I have read on the internet, I think the PICkit2 or PICkit3 are good programmers for beginners. Which one do you guys recommend?
Thanks,
"Musab" schreef in bericht news:2Lqdncsrn66JGDHRRVn snipped-for-privacy@giganews.com...
I guess for a beginner the biggest difference is the price tag. But as the PICkit3 is newer, I would nevertheless advise that one. It covers more devices and it may last longer.
I have a PICkit3. It's the only store-bought programmer I have tried, so I have nothing to compare it to, but I am very pleased with it. In addition to programming the chips, it is also a basic debugger, which is incredibly useful sometimes.
But if you have some experience and the extra cash, then the ICD3 would be the second choice. Why? Debugging speed. It's faster than the PICkit3, which is helpful during those stressful times.
I have the PICkit2 and am very happy with it. It also has a debug mode that works very well with PIC chips that have on-chip debug support, like the 16F886/887 I'm using at the moment.
If you're using the 16Fxxx family and below, the PICkit2 is fine; I have no idea about the more complex chips.
Another thing to ask yourself is: am I going to program other devices in the future? If you have the money to spend, the Galep5 from conitec.de is an excellent programmer that can program almost any device.
--
Failure does not prove something is impossible, failure simply
indicates you are not using the right tools...
On a sunny day (Fri, 08 Oct 2010 16:43:47 GMT) it happened snipped-for-privacy@puntnl.niks (Nico Coesel) wrote in :
My personal view is that if you want to program a PIC you should start by building your own programmer and reading the programming specification.
formatting link
At least then you know what you are doing. It is cheaper too, and you learn more, faster; the result is more reliable, better quality, and more flexible. Of course one should learn a programming language too, starting with asm.
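To give an idea of what the programming specification asks of a home-built programmer, here is a minimal sketch in C of clocking one ICSP command and a 14-bit word into a midrange PIC. This is an illustration, not anyone's actual programmer: the helpers icsp_clk(), icsp_dat() and delay_us() are hypothetical stand-ins for whatever port access your hardware uses, and the 0x02 command code ("Load Data for Program Memory") should be checked against your own device's programming spec.

#include <stdint.h>

void icsp_clk(int level);   /* drive the ICSPCLK pin (hypothetical helper) */
void icsp_dat(int level);   /* drive the ICSPDAT pin (hypothetical helper) */
void delay_us(int us);      /* short busy-wait (hypothetical helper) */

/* Clock out 'nbits' bits, LSB first; the target samples ICSPDAT
   around the falling edge of ICSPCLK. */
static void icsp_send(uint16_t bits, int nbits)
{
    int i;
    for (i = 0; i < nbits; i++) {
        icsp_dat(bits & 1);
        icsp_clk(1);
        delay_us(1);
        icsp_clk(0);
        delay_us(1);
        bits >>= 1;
    }
}

/* "Load Data for Program Memory": a 6-bit command, then a 16-bit
   frame holding the 14-bit word between a start and a stop bit. */
static void icsp_load_program_word(uint16_t word)
{
    icsp_send(0x02, 6);                              /* command, per midrange spec */
    icsp_send((uint16_t)((word & 0x3FFF) << 1), 16); /* 0, 14 data bits, 0 */
}

On real hardware you would first raise MCLR to the programming voltage to enter program mode; the spec's timing diagrams give the exact order and delays.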
And do not use that MPLAB bloat... I will never understand what it adds. Here is a device that you can program in 20 seconds, as many times as you like, and if you wear one out a replacement is only a few $, so why wrestle with MPLAB's bloated simulation on a MS PC? Use Linux and gpasm.
The world will die of simulations; even NASA's manned space travel has been replaced by simulations, and everything you find on the web is 'simulation'. We need a reality check. WW3 would help.
On a sunny day (Sat, 09 Oct 2010 15:43:46 +1100) it happened Grant wrote in :
Ones that are running win[!]do[w]s are members of the sect of the God Of Bloat. True devotion to the G O B means sacrificing CPU cycles to keep the G O B happy. The G O B in return will then inspire you to upgrade to ever more powerful hardware. Soon you will need a 5 core 3.4 GHz plus GPU hardware accelerator to edit an email.
I have not used any of that stuff ever. That includes debug.
LOL
I think one uses 'debug' or ICE when one cannot imagine what is happening in the silicon, or in one's own soft? This points to a lack of analytical capabilities, a lack of imagination, or an absence of functioning neuron groups.
You are right, usually you don't need a debugger for a PIC program, but when programming on a PC, in a 100,000 line program, sometimes I really cannot imagine what is happening in my own software :-) and then a debugger is really useful.
--
Frank Buss, http://www.frank-buss.de
piano and more: http://www.youtube.com/user/frankbuss
On a sunny day (Sat, 09 Oct 2010 01:01:11 -0600) it happened hamilton wrote in :
building
Yea, I dunno, I did all repairs on my car :-) Your anal-log description is incomplete and does not apply. This micro stuff is about programming and hardware, and this is a design group. So if you want to play with a micro-thingy you *should* know programming and flip[s]-and-flop[s]. I designed my own micro with TTL one day, because I could not get budget approval for a microcomputer development system (one of the very early ones), but I had an unlimited budget for digital design, so I designed my own processor. It worked, it was a bit RISC, way ahead of its time. Wire wrap too.
The point is perhaps that today's kiddies want to start top down. That is why NASA cannot go to Mars with a manned mission. Von Braun did it bottom up, knew the works, and just went for the moon. These days they only have simulations at NASA, and even then their rockets are unstable, because they fired all the people with real hands-on experience in the seventies of the last century. Top down does not work! Sure, you can teach a kid to drive a car. But the first time something does not work, and 'wtf is a sparkplug?' comes up, they will need a rescue team to get them back home. Well, along those lines anyway.
Today we see the same thing all over again at NASA: firing hundreds of engineers who finally learned how to work in a precise way.
Politics and science often seem to become each other's enemy. Politics wants to use science, and together they can accomplish great things, but then things die because of infighting. And another empire goes down.
2000+ may become the age of the simulations, as opposed to the age of Aquarius (or was it aquarium), well. In the end everything becomes a simulation, I think there was that movie... and nobody has a clue about reality, just brains hooked up to a mainframe. Nobody to maintain it though; a cosmic ray comes, a chip bit flips, humanity gone. This is why I mentioned WW3, it would be the wake-up call. Imagine no supermarket to get food from, millions dying in cities without food, back to the land (living from it), to hell with simulations. RIIIIIIIIIING
On a sunny day (Sat, 9 Oct 2010 14:12:05 +0200) it happened Frank Buss wrote in :
the silicon, or in one's own soft?
NewsFleX, the newsreader I am writing this with, which I wrote (in C), is about that size. You can start it with the '-d' flag, and it will then use printf() to print all function calls and the arguments to those functions, plus whatever other variables are used. So printf() is my debug; I rely on gcc. If I write in asm then I usually add some routines that can print numbers in ASCII and use those for debug. Those routines have been debugged over time; for a PIC, for example, it is just copy and paste to include them, and they take up very little space. In the case of a PIC all I need is one I/O pin for serial debug output.
Example:

int show_posting_body(long posting_id)
{
    int c;
    struct posting *pa, *lookup_posting();
    char temp[READSIZE];
    FILE *load_filefd;
    struct stat *statptr;
    char *space, *spaceptr;
    char *ptr;
    char *expanded_space;
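Roughly, that '-d' trace style boils down to something like the following minimal sketch. This is not NewsFleX's actual code; the dflag variable and the stub function are made up here for illustration.

#include <stdio.h>
#include <string.h>

static int dflag;   /* set when the (hypothetical) -d flag is given */

/* Print the current function name and its arguments when tracing is on. */
#define DEBUG_ENTRY(...) \
    do { if (dflag) { fprintf(stderr, "%s: ", __func__); \
                      fprintf(stderr, __VA_ARGS__); \
                      fputc('\n', stderr); } } while (0)

/* Illustrative stub only, not the real NewsFleX function. */
int show_posting_body(long posting_id)
{
    DEBUG_ENTRY("posting_id=%ld", posting_id);
    /* real work would happen here */
    return 0;
}

int main(int argc, char **argv)
{
    dflag = (argc > 1 && strcmp(argv[1], "-d") == 0);
    return show_posting_body(42L);
}

Run it with '-d' and every traced call is reported on stderr; without the flag it stays silent.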
Ok, this is not a debugger, but it is some kind of (rather verbose) debugging support. Your earlier posting sounded like you could imagine anything without any debugging support.
--
Frank Buss, http://www.frank-buss.de
piano and more: http://www.youtube.com/user/frankbuss
On a sunny day (Sat, 9 Oct 2010 15:38:27 +0200) it happened Frank Buss wrote in :
function calls
A 'debugger' is a tool; in the old days I used one. Then I read a paper from Uni Twente (Netherlands) that argued that in high level languages a 'debugger' is not needed, one should simply print the values. Now there remains the question 'what is a high level language?'. As I started programming with zeros and ones (DIP switches, machine code, even hard-wired programs), for me assembler is already a high level language, and C most certainly is :-) I guess for the new kiddies on the block maybe it is from Python upwards, or they simply say and type 'I want ....' where '....' stands for whatever. So it is all a bit relative. Of course we need to be able to verify that whatever we created actually does what we intended, or anything at all, and does not cause WW3, etc. I am not sure if my writings qualify. hehe LOL Reality is the feedback, I think. Now we can start a thread about cause and effect...
I wouldn't put it this way, but in essence you are right. I think my next PC will run Linux with XP in a virtual machine for programs that are not available for Linux. About 2 years ago I upgraded my PC to XP because I put a dual core CPU in it, but I am starting to regret not re-installing Win2k.
the silicon, or in one's own soft?
It is very handy to get a grasp of how a CPU works. Way back in school I was taught Z80 assembly using an assembler with a simulator (not that I needed it; I already knew all about the Z80 by then).
I use a debugger when I want to verify that a piece of code does exactly what I expect it to do. Mostly buffer management/caching stuff, in which faults may accumulate before trashing the place. Or when a piece of software written by someone else crashes for no good reason at all.
--
Failure does not prove something is impossible, failure simply
indicates you are not using the right tools...
They haven't fixed the PICkit3 yet? Glad I didn't go buy one. The PICkit2 is okay for starting out, even better if you get the 28-pin demo board, because the CPU on that has ICE support, which saves you needing one of the ICD products.