New soft processor core paper publisher?

Quitter! If the syntax (or near total lack thereof) bothers you, then you must have a very thin skin.

LOL! I guess I was documenting something this morning rather than writing code...

--

Rick
Reply to
rickman

Eric, I'm curious what books these were that you found so offensive?

I'm also baffled about your comment about "programs that emit Forth." Although PostScript has many features in common with Forth, it is quite different, both in terms of command set and programming model.

Modern Forths (e.g. since the release of ANS Forth 94) feature a variety of implementation strategies, ranging from fairly conventional compilers that generate optimized machine code to more traditional threaded code models.

Cheers, Elizabeth

--
================================================== 
Elizabeth D. Rather   (US & Canada)   800-55-FORTH 
Reply to
Elizabeth D. Rather

Ha ha! And I see what you did there.

The books ("Starting Forth", "Thinking Forth", "Forth Programmer's Handbook") weren't themselves offensive, but they revealed Forth to be much lamer than I expected for all the stick-it-to-the-man ethos surrounding it. I was totally stoked for a stack-based language that would solve all my problems, but all I got was some books gathering dust.

You're looking for Tom Gardner, he's down the hall near the elevators using a little stamp at the bottom of his cane to make little chicken footprints on the floor..

Reply to
Eric Wallin

Elizabeth D. Rather wrote:
> I'm also baffled about your comment about "programs that emit
> Forth." Although PostScript has many features in common with
> Forth, it is quite different, both in terms of command set
> and programming model.

It was my comment, based on experience of writing a few Forth and PostScript programs back in the mid 80s.

I know there are differences between Forth and PostScript, but the similarities are /more/ significant. Forth is like PostScript in the same way that Delphi is like Pascal, C# is like Java, Scheme is like Lisp, Objective-C is like Smalltalk (and unlike C++) etc.

And, more importantly, Forth/PostScript is unlike C, is unlike Prolog, is unlike Lisp, is unlike Smalltalk, is unlike Pascal.

I was beguiled by Forth and to some extent remain so: I'd /love/ to find a good justification to use it again, and might do so, just for the hell of it.

I'm sure Forth has moved on since the 80s, but it cannot escape its ethos any more than C, Prolog, Lisp, Smalltalk can. Or rather, if it did change then it wouldn't be Forth anymore.

I'm sure that's true, but it misses the point. Interpreted Java is the same as jitted Java is the same as Hotspotted Java.

Ditto Forth; if it wasn't then it simply wouldn't be Forth!

I suppose I ought to change "nobody writes Forth" to "almost nobody writes Forth".

I would, however, be interested to know whether Forth has a defined memory model, since that's a _necessary_ _precondition_ to be able to write portable multithreaded code that runs on multicore processors. Hell's teeth, even C has finally decided that it needs to define a memory model, only 20 years after those With Clue knew it was required. (I don't think there are any implementations, though!)

Reply to
Tom Gardner

Shouldn't that be "almost nobody Forth writes" ?

I wonder if Yoda programs in Forth...

(I too would like an excuse to work with Forth again.)

Reply to
David Brown

:)

Still use my HP calculators in preference to algebraic and half-algebraic calculators!

Reply to
Tom Gardner

Same here! When manually calculating, give me HP (or similar) or give me death. But Forth kind of sucks (and it pains me deeply to have discovered that). The failure (as I see it) of Forth is the awkward target programming model (or virtual machine as the kids say these days - virtual anything is sexy).

There's a pretty big gulf between manual data & operation entry on something like an HP (or any hand-held calculator) and programming with Forth (or any language). A single stack very much facilitates the former but places handcuffs on the latter. These are two fairly different activities that IMO don't exactly have boatloads of overlap, however much we might want or personally need for it to be so.

Reply to
Eric Wallin

> ... weren't themselves offensive, but they revealed Forth to be much lamer than I expected for all the stick-it-to-the-man ethos surrounding it. I was totally stoked for a stack-based language that would solve all my problems, but all I got was some books gathering dust.

> ... little stamp at the bottom of his cane to make little chicken footprints on the floor...

I just saw that episode a few weeks ago. The chicken contest was pretty cool actually.

I actually identify with House. Not that I am as smart as he is, but I have a bad hip (waiting for Obamacare to kick in so I can get a new one) and have a natural tendency to tick off people unless I work to rein it in. :]

--

Rick
Reply to
rickman

If you ever need to control hardware, you will want it. The interactivity is great. Actually, when doing things in real time the interactivity sort of goes away (or at least the utility of it) since you can't type fast enough to control a robot or intercept a serial port running at 38.4 kbps. But you can very easily test small portions of your code in ways that are tricky in C or the other languages you mention. Then those ideas can be applied to any app... even if not real time.
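That sort of interactive testing is quick to sketch at the console; for instance (a made-up example word, not from any post here):

```forth
\ define a small word, then exercise it immediately at the prompt
: scale ( n -- n' )  3 * 7 + ;
5 scale .    \ prints 22
-2 scale .   \ prints 1
```

No test harness, no build step: define the word, feed it a few values, and read the stack.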

Can *any* of us ever escape our ethos? I know I've been trying for many a year and it is still right here by my side, aren't you Ethos? Actually, that would be a good name for a couple of dogs, Ethos and Pathos, Chesapeake Bay retrievers.

So now I've been demoted to "almost" nobody? What do I have to do to work my way *up* to nobody?

Uh, can *someone* who knows something answer that one?

Do you have a lot of call to program multithreaded code for multicore processors? What sort of apps are you coding?

--

Rick
Reply to
rickman

I would say that was "nobody almost Forth writes". Wouldn't it be [noun [adjective] [noun [adjective]]] verb?

What do you do instead?

--

Rick
Reply to
rickman

> ... death. But Forth kind of sucks (and it pains me deeply to have discovered that). The failure (as I see it) of Forth is the awkward target programming model (or virtual machine as the kids say these days - virtual anything is sexy).

> ... something like an HP (or any hand held calculator) and programming with Forth (or any language). A single stack very much facilitates the former but places handcuffs on the latter. These are two fairly different activities that IMO don't exactly have boatloads of overlap, however much we might want or personally need for it to be so.

I won't argue with any of that, or I might...

So why are you here exactly? I'm not saying you shouldn't be here or that you shouldn't be saying what you are saying. But given how you feel about Forth, I'm just curious why you want to have the conversation you are having? Are you exploring your inner curmudgeon?

--

Rick
Reply to
rickman

> ... The interactivity is great.

The first time I wished I had Forth was when manually testing prototype hardware with a z80-class processor c1983. Forth would have been ideal, and significantly better than the rubbish I threw together.

That experience has shaped my views of "domain specific languages" vs libraries to this day.

> ... (or at least the utility of it) since you can't type fast enough to control a robot or intercept a serial port running at 38.4 kbps.

> ... that are tricky in C or the other languages you mention. Then those ideas can be applied to any app... even if not real time.

Just so. It seems we are in violent agreement.

One also has to consider the tools that other people are familiar with; changing away from them requires very significant advantages, and Forth just doesn't have those.

> ... my way *up* to nobody?

Why are you assuming "demoted"? Is a screw above or below a nail in some imagined hierarchy?

> ... processors? What sort of apps are you coding?

Even cellphones have dual- and quad-core processors nowadays.

And, of course, there are outliers like the Parallax Propeller.

Multicore will become the norm in the near future; the only constraint in embedded systems will be memory bandwidth.

Reply to
Tom Gardner

> ... shouldn't be saying what you are saying. But given how you feel about Forth, I'm just curious why you want to have the conversation you are having?

(a) because it is fun (b) to gain a personal understanding of the advantages of nails and screws, and when each should/shouldn't be used

Reply to
Tom Gardner

I'm not sure what that implies, but I'll go with it.

Not *too* violent. I don't have guns or anything like that.

I'm not sure what tools you mean. Sometimes I miss using a debugger where I can step through my code. Actually I bet Win32Forth has that somewhere, but the docs are not so great and I am likely missing some 95% of what is in it. Still, debuggers have their limits, notably they don't do a great job of getting you in the vicinity of the bug so you can step through the code to find it.

My willingness to give up debuggers and "go with the Forth" was largely prompted by a statement by Jeff Fox. I don't recall the context exactly but I was talking about the difficulties of tracking down stack underflows. My context was thinking like I was coding in C where I would write a bunch of code and then start in on the bugs often adding new ones in the process. Jeff pointed out that in Forth, if you have a stack mismatch it shows that *you can't count*. Even though (or perhaps because) it was so simple, that really struck me. If I can't write a Forth word that balances the stack effects, it means I can't count to three or four. Counting to three or four is a lot easier than using a debugger!
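As a sketch of what "counting" means here (the word and its stack comment are hypothetical, not taken from the original discussion):

```forth
\ the stack comment promises: three items consumed, one produced
: sum3 ( a b c -- sum )  + + ;
1 2 3 sum3 .   \ prints 6

\ checking the word is just arithmetic: each + consumes two items
\ and produces one, so two +'s turn three items into one - which
\ matches the comment. A body with only one + would leave two
\ items and the mismatch is obvious before anything is run.
```

That balance check is the "counting to three or four" in question: the comment is the contract, and tallying the body against it catches stack bugs without any debugger.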

Context is everything. I have no idea what the context of screw v nail is about.

So which are you writing code for? Or is this just a theoretical discussion?

As to multicore being the norm... well, the last toaster I bought does seem to have a micro in it, *really*. It's a four slice unit, each half has three little buttons for bagel, frozen and reheat, one button for cancel and a darkness knob. You have up to five seconds after pushing down the handle to make your selections. I'll bet you anything this has a GA4 in it! I can't imagine doing this without a multicore processor... oh, wait, that is programmed in Forth isn't it... what is that memory model again?

--

Rick
Reply to
rickman

What happened to the third dog, Logos?

A funny omission, given that this group is primarily logic related (only secondarily philosophy related).

Chris

Reply to
chrisabele

I thought about that, but I was not sure. When I say "work with Forth again", I have only "played" with Forth, not "worked" with it, and it was a couple of decades ago.

I do mostly small-systems embedded programming, which is mostly in C. It used to include a lot more assembly, but that's quite rare now (though it is not uncommon to have to make little snippets in assembly, or to study compiler-generated assembly), and perhaps in the future it will include more C++ (especially with C++11 features). I also do desktop and server programming, mostly in Python, and I have done a bit of FPGA work (but not for a number of years).

I don't think of Forth as being a suitable choice of language for the kind of systems I work with - but I do think it would be fun to work with the kind of systems for which Forth is the best choice. However, I suspect that is unlikely to happen in practice. (Many years ago, my company looked at a potential project for which Atmel's Marc-4 processors were a possibility, but that's the nearest I've come to Forth at work.)

I just think it's fun to work with different types of language - it gives you a better understanding of programming in general, and new ideas of different ways to handle tasks.

Reply to
David Brown

Hey, it's not like this is *real* forth. But looking at how some Forth code works for things like assemblers and my own projects, the data is dealt with first starting with some sort of a noun type piece of data (like a register) which may be modified by an adjective (perhaps an addressing mode) followed by others, then the final verb to complete the action (operation).
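In a typical Forth-style assembler that ordering looks something like the following (the register names, `#` immediate marker, and comma-suffixed mnemonics are illustrative, loosely in the style of classic 8086 Forth assemblers, not from any specific tool mentioned here):

```forth
\ operands (nouns) come first, an optional addressing-mode word
\ (adjective) may modify them, and the opcode (verb) completes
\ and assembles the instruction:
BX DX MOV,       \ two register nouns, then the verb
5 # AX ADD,      \ literal noun, # immediate adjective, register noun, verb
```

The trailing-comma convention marks the words that actually emit machine code, which is why the verb naturally lands last.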

Similar to myself, but with the opposite emphasis. I mostly do hardware and FPGA work with embedded programming which has been rare for some years.

I think Python is the language a customer recommended to me. He said that some languages are good for this or good for that, but Python incorporates a lot of the various features that makes it good for most things. They write code running under Linux on IP chassis. I think they use Python a lot.

So why can't you consider Forth for processors that aren't stack based?

I'm beyond "playing" in this stuff and I don't mean "playing" in a derogatory way, I mean I just want to get my work done. I'm all but retired and although some of my projects are not truly profit motivated, I want to get them done with a minimum of fuss. I look at the tools used to code in C on embedded systems and it scares me off really, especially the open source ones that require you to learn so much before you can become productive or even get the "hello world" program to work. That's why I haven't done anything with the rPi or the Beagle Boards.

--

Rick
Reply to
rickman

Just venting at the industry, and procrastinating a bit (putting off the final verification of the processor). Apparently OT is my favorite subject as it seems I'm always busy derailing my own (and others') threads. That, and y'all have very interesting takes on these and various and sundry other things.

Backing up a bit, it strikes me as a bit crazy to make a language based on the concept of a weird target processor. I mean, I get the portability thing, but at what cost? If my experience as a casual user (not programmer) of Java on my PC is any indication (data point of one, the plural of anecdote isn't data, etc.), the virtual stack-based processor paradigm has failed, as the constant updates, security issues, etc. pretty much forced me to uninstall it. And I would think that a language targeting a processor model that is radically different than the physically underlying one would be terribly inefficient unless the compiler can do handstands while juggling spinning plates on fire - even if it is, god knows what it spits out. Canonical stack processors and their languages (Forth, Java, Postscript) at this point seem to be hanging by a legacy thread (even if every PC runs one peripherally at one time or another).

I suspect that multiple independent equal bandwidth threads (as I strongly suspect the Propeller has, and my processor definitely has) is such a natural construct - it fully utilizes the HW pipeline by eliminating all hazards, bubbles, stalls, branch prediction, etc. and uses the interstage registering for data and control value storage - that it will come more into common usage as compilers better adapt to multi-cores and threads. Then again, the industry never met a billion transistor bizarro world processor it didn't absolutely love, so what do I know?

I find it exceedingly odd that the PC industry is still using x86 _anything_ at this point. Apple showed us you can just dump your processor and switch horses in midstream pretty much whenever you feel like it (68k => PowerPC => x86) and not torch your product line / lose your customer base. I suppose having Intel and MS go belly up overnight is beyond the pale and at the root of why we can't have nice things. I remember buying my first 286, imagining all the wonderful projects it would enable, and then finding out what complete dogs the processor and OS were - it was quite disillusioning for the big boys to sell me a lump of shit like that (and for a lot more than 3 farthings).

Reply to
Eric Wallin

Why is it that every "serious" processor is a ball of cruft with multiple bags on the side, some much worse than others, but none even a mother could love? Is it because overly complex compilers have made HW developers disillusioned / complacent?

I honestly don't get how we got here from there. If it's anything, engineering is an exercise in complexity management, but processor design is somehow flying under the radar.

Reply to
Eric Wallin

> ... on the side, some much worse than others, but none even a mother could love? Is it because overly complex compilers have made HW developers disillusioned / complacent?

1) an adherence to the von Neumann concepts of what a processor should be
2) processing models that worked well given previous technology limits, but which don't scale to modern technology limits (e.g. processor/memory speed ratio, number of IC pins, cache coherence)
3) backwards compatibility (*the* dominant commercial consideration)

> ... engineering is an exercise in complexity management, but processor design is somehow flying under the radar.

1) look at The Mill on comp.arch, for one beguiling possibility
2) message passing between processors/threads executing on multiprocessor machines with non-coherent memories
Reply to
Tom Gardner
