Picaxe: The Crappiest, Best MCU out there!

The Picaxe is an MCU that hardly deserves the name, as it runs a BASIC interpreter and is slow.

But what the BASIC interpreter does is make usually-complex needs simple, like reading an ADC. It's a "90% of needs are covered by 10% of the code structures" type of thing.

It reminds me of the invention of the automatic transmission (or Green Mountain coffee). Automatic transmissions were called names like "slush buckets" when they first came out. They were slow and got crappy gas mileage. But look what happened.

I have always wondered why there isn't a natural language interface for coding and even design of circuits.

"I want to measure the room temperature with a thermistor and transfer results to PC."

Then it creates the circuit and the code for you.

Yes, it's disgusting to the purist (and me too, a bit). But recall the slush buckets of the past...

Reply to
haiticare2011

Graphical drag & drop programming for PICs.

Reply to
DTJ

Because even the simplest of things are too complicated, with too many variations and circumstances.

Consider the mere blinking of an LED, and all the discussion that took place (with no definitive resolution) among people who've been doing this since childhood...

Cheers, James Arthur

Reply to
dagmargoodboat

They became somewhat better, and everybody stopped learning how to actually drive, so they're stuck with them. ;)

Cheers

Phil "cars should do as they're damn well told" Hobbs

--
Dr Philip C D Hobbs 
Principal Consultant 
ElectroOptical Innovations LLC 
Optics, Electro-optics, Photonics, Analog Electronics 

160 North State Road #203 
Briarcliff Manor NY 10510 

hobbs at electrooptical dot net 
http://electrooptical.net
Reply to
Phil Hobbs

Have you ever looked at Forth? Forth is a language based on words for both data and action (nouns and verbs) as well as modifiers (adverbs and adjectives). It is not uncommon for a program to read a bit like natural language, although the verb tends to come at the end (more like Latin than English). For example:

: temperature_to_PC ( -- ) measure_thermistor convert_to_temperature send_temperature_to_PC ;

Of course this isn't anything any other language can't do, but it is so much simpler in Forth.

The next step is to specify each of the above operations and code in a similar manner. Then you can interactively test each defined word from the bottom up.
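For instance, here is a minimal sketch in standard Forth of how the lower-level words might be defined. The adc@ word and the conversion constants are made-up stand-ins (the real ones depend on your hardware); adc@ is stubbed with a constant so the example runs under any ANS Forth such as gforth:

: adc@ ( -- n ) 512 ;                \ stub: pretend the ADC returned 512
: measure_thermistor ( -- n ) adc@ ;
: convert_to_temperature ( n -- n' ) 4 / 103 - ;  \ toy linear scaling
: send_temperature_to_PC ( n -- ) . cr ;          \ just print to the console

At the prompt you can test each word as you go: type adc@ . and see 512, then compile the temperature_to_PC definition above and run it to see 25.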

You could think of this as defining your own application specific language and teaching it to the computer. :)

--

Rick
Reply to
rickman

Hmmm, maybe I disagree with you, but I'm not collapsing that wave function yet. :) I dunno, I think you could do it, but that's just a hunch. I mean, isn't there any forward progress in the state of knowledge despite the endless discussions you refer to? And don't forget - in pattern recognition, if you can "get on the beach", then you can ask further questions as you go along. "No, make it more like that. And add that." Etc.

Reply to
haiticare2011

I've definitely heard of it, and it has (had?) proponents. In a sense it may resemble LISP, a language tailored to AI. Or Prolog. None have caught on, though. I recall AutoCAD was written in LISP, which is weird, since AutoCAD is not exactly AI.

Reply to
haiticare2011

Forth isn't a lot like LISP, I don't think. It's a stack-based language that only understands integers, and the intelligibility of the programs depends critically on the coder's ability to come up with short descriptive names for the 'words' (which are like functions).
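A classic one-liner illustrates the style (standard Forth, a generic textbook example rather than anything from this thread): arguments come from the stack and the result goes back onto it, with no named variables anywhere.

: square ( n -- n*n ) dup * ;   \ duplicate the top of the stack, multiply
5 square .                      \ prints 25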

Back in the day, it was useful with micros because it tends to be very space-efficient--even short blocks of code get used over and over.

I had a Forth compiler for the first Macintosh, back in 1984, but I never used it for anything very complicated. It's a lot like HP calculator programming, which I'd done a lot of, but in the end I stuck with HP Basic.

Cheers

Phil Hobbs

Reply to
Phil Hobbs

I was looking at it psychologically - ISTM the LISP-Forth-Prolog approach is to try to change your way of thinking so you can program and solve problems more effectively. I remember Feynman liked that sort of thing - bra + ket = bracket was a quantum construct. Then you can get into the famous Sapir-Whorf hypothesis (famous among cognitive psychologists, at least). It has to do with where thought comes from - language or raw consciousness? IMHO I'd like to keep consciousness as pristine as possible...

Reply to
haiticare2011

Okay, so having got that off your chest, what do you think are the similarities between LISP and Forth?

Cheers

Phil Hobbs

Reply to
Phil Hobbs

I spent a total of 2 weeks with Lisp in school and have not seen it since. Somewhere in the second week I had the epiphany where I "got" Lisp and could program in it. Forth is not really much like Lisp other than the fact that you need to change your ideas of programming to use it most effectively. Once you do that it becomes a very useful tool.

Forth can be great for embedded work because it lends itself to small, modular code, which promotes reuse and ends up producing smaller code than typical C programs. It is also interactive and capable of running on all but the smallest processors. By that I mean the compiler can reside on the target, allowing you to interactively compile and test code from the command line directly on the target. There aren't many Forth simulators, lol.
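As a sketch of what that looks like at the target's prompt (led-on and led-off are hypothetical board-specific words, stubbed here with prints so the example runs anywhere):

: led-on ( -- ) ." LED on  " ;    \ stand-in for a real port write
: led-off ( -- ) ." LED off" cr ;
: blink ( n -- ) 0 ?do led-on 100 ms led-off loop ;
3 blink    \ typed live: compiled and executed immediately, no flash cycle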

--

Rick
Reply to
rickman

Have you seen these?

formatting link

Reply to
bitrex

To try to answer you as directly as possible: the similarities lie in the idea that a different language will confer an "intelligence benefit" and maybe efficiency. So I would disagree with the earlier poster's statement that Forth makes things "so much simpler." It doesn't, because you have to learn a new language which is different from 99.9% of the code out there. And as well...

Just the idea that you can solve new problems by changing the language. All these specialized languages have not panned out. I don't know why they have not panned out, and there could be many explanations. One is that they are ineffective in a philosophical sense. If you talk different, you don't get smarter. Another explanation is that simple procedural languages like C have created the mind-set of millions of programmers. If you stand up and shout, "All right, everyone stand on your heads!" compliance is not guaranteed.

Finally, it's possible that procedural languages better represent how people think. I don't know about you, but I don't think in terms of lists or LIFO stacks. And if I were to change my way of thinking, there would have to be a big payoff, an enormous advantage. Our language structure is procedural, or cause-and-effect.

Another angle on this is the question, "Where does intelligence originate?" It's remarkable that Asian culture and philosophy look at the mind in some fundamentally different ways. Here are several Asian ways of seeing it:

-You don't control your mind and you can't stop the thinking.

-People get trapped in their thinking (attachment); fluidity is stopped.

-Consciousness is separate from the mind.

In any case, here is a quote from Nvidia's web site from when they produced their 192-core supercomputer board:

"CUDA is an entire computing platform for C/C++/Fortran on the GPU. When we started creating CUDA, we had a lot of choices for what we could build. The key thing customers said was they didn't want to have to learn a whole new language or API. Some of them were hiring gaming developers because they knew GPUs were fast but didn't know how to get to them. Providing a solution that was easy, that you could learn in one session and see it outperform your CPU code, was critical."

-- Ian Buck, General Manager, NVIDIA

I hope I have addressed the question you raised.

Reply to
haiticare2011

What about them?

--

Rick
Reply to
rickman

On Sat, 19 Jul 2014 12:30:51 -0400, rickman gave us:

MAJOR snip. Sheesh, guys.

The Cell CPU was better. Shame Sony, IBM, and Toshiba did not continue to develop it past the first iterations.

Must be that HD DVD debacle muddying things up. Or maybe it was the "Duel Threading" I saw referenced.

I thought that was pretty funny when I saw that remark.

Reply to
DecadentLinuxUserNumeroUno

In what way was it "better"? What was it better for?

--

Rick
Reply to
rickman

Thanks Rick. The GreenArrays chip is programmed in PolyForth? I'd hate to program a neural network in Forth, but maybe I'm just being too conservative. It's an interesting piece of hardware. Thanks.

Reply to
haiticare2011

You can use PolyForth, but that is not a good way to use the GA144. PolyForth is intended for running a supervisory program (on a few cores), as you might on an ARM, while the rest of the GA144 cores run specific assembly code to implement hardware-like functions or act as coprocessors.

To be honest, there really and truly is no "intent" when it comes to the GA144. Someone once commented that Chuck Moore is not so much of an inventor as he is a thinker. He toys with ideas and sometimes builds them, not entirely unlike Leonardo da Vinci. Often he builds things because they seem like they could be useful without having concrete ideas for how they might be useful. Ask Green Arrays about the target market for the chip and they will talk vaguely about the Internet of Things. Notice they don't talk much about sales because basically there aren't any to speak of.

--

Rick
Reply to
rickman

Yes - about sales, I suspected so. But Nvidia supports their work on 192-core boards by catering to gamers. I'm in the process of setting one up - slowwwwlllly. With the Nvidia board, the GPU-oriented cores are attractive for running neural networks as well as taking input from camera images. I'm trying to see how much is demonstrable, and how much is vaporware.

Reply to
haiticare2011

It would be really nice if you quit using Google Groups. The double spacing really messes up the posts.

I don't really know much about GPUs. I would think the big limitation would be memory bandwidth. I assume they have a sizable on-chip cache and depend on the algorithm handing a lot of data between the ALUs rather than moving it on and off the chip so much.

I thought they were up to 1024 "cores" on a chip? Or is that a different architecture?

--

Rick
Reply to
rickman
