self hosting on the Pi3

On Sat, 7 Mar 2020 20:51:39 -0000 (UTC), Martin Gregorie declaimed the following:
EBCDIC had the advantage of mapping practically 1:1 onto Hollerith card codes. EBCDIC and ASCII were released within a year of each other, and since IBM already had equipment using "BCDIC" (no "extended") it was likely faster for them to develop and release EBCDIC than to wait for the ASCII standard and only then develop systems using it.
--
	Wulfraed                 Dennis Lee Bieber         AF6VN 
	wlfraed@ix.netcom.com    http://wlfraed.microdiversity.freeddns.org/
Reply to
Dennis Lee Bieber
On a sunny day (Sat, 07 Mar 2020 13:25:02 +0100) it happened Axel Berger wrote in :
Depends... on how much you know I guess...
That is correct; indeed, learning a programming language in '2 weeks' is possible, at least for me. I speak Dutch, English, German, French, and a bit of Portuguese. The first 4 were hammered into us at school; I started French in kindergarten. I am a Dutch native.
It is a bit the same with C (as an example). You can, if you are into that, read Brian Kernighan and Dennis Ritchie's 'The C Programming Language' maybe in a day? Write the examples, learn how to use gcc (or any other C compiler). But then there are the methods: we can talk for example about linked lists, hashing, etc. I learned a lot from Dr Dobbs magazine as far as programming in C goes, over the years that is.
Just another language? Ever tried x86 asm? My boss was really into that, would sit next to me and hammer away. One day I started using, what was it, 8051 series micros in projects: just another asm, another processor structure. He then decided everybody should learn C. That is where I learned C (from another programmer), a few lessons of an hour or so, but that is only where it starts. I learned a lot about Z80 asm programming from some other book, and those techniques I always use in other projects.
I see people sweating with ICEs and debuggers; I don't use any of that. I use the techniques I learned from that book, was it by Prof Alan Miller? '8080/Z80 Assembly Language: Techniques for Improved Programming' by Alan R. Miller.
So if you think you know programming after those 2 weeks... the basics (not BASIC) may be simple, but that is where the really interesting part starts.
hashing:
formatting link
One of my first C programs, the NewsFleX Usenet newsreader I wrote and have been using for discussions like this ( -;) ) since I could not find a 'Free Agent' for Linux:
formatting link
Look for hashing in the code!
But OK, BASIC can be learned in a flash, and you can do A LOT with it. And then you still will have to KNOW what you talk about in that language; being able to speak a language does NOT make you a great novel writer. There is a lot of babble going on in bars etc...
And open source your code, so the genius can help others.
hehe
Reply to
Jan Panteltje
On a sunny day (8 Mar 2020 01:22:49 GMT) it happened Robert Riches wrote in :
When I worked for the TV network it was always 'The show must go on'. Camera breakdown, recording breakdown (my department, so to speak) with a studio full of artists that can only be booked on that same day once, or on the air... Selling black, or 'Sorry people, no show today', is out of the question.
My first boss ended up in the madhouse. Pressure; I always thought it was from my declarations... ;-) The last I heard from him was a card from the madhouse; we got a new boss. But anyway, in all those years the show always went on in my case. Fault finding in seconds, or finding some alternative solution, was natural to me. You do have to know the stuff down to the transistor, so to speak. Work in shifts... Only once did I wake up at night and thought I heard the intercom: Brussels, Eurovision... you jump. Did head control room too, all those phone lines... events all over the country, video sync...
I think with what we see now with the coronavirus, it must be hot at the controls.
Wonder if the networks will go down.
Working on some music:
First verse, every body normal clothing Every body do the loco-rona Do the loco-rona, it is a real nice dance Do the loco-rona this is your chance! Every body do the loco-rona with me!
Second verse, mouth caps on Every body do the loco-rona Do the loco-rona, it is a real nice dance Do the loco-rona this is your chance! every body do the loco-rona with me!
Third verse, gas masks on Every body do the loco-rona Do the loco-rona, it is a real nice dance Do the loco-rona this is your chance! every body do the loco-rona with me!
Fourth verse, space helmets on Every body do the loco-rona Do the loco-rona it is a real nice dance Do the loco-rona this is your chance! every body do the loco-rona with me!
Fifth verse, complete space suits on Every body do the loco-rona Do the loco-rona, it is a real nice dance Do the loco-rona this is your chance! every body do the loco-rona with me!
Last verse, dancers in space-suits tumbling over one by one Every body do the loco-rona Do the loco-rona, it is a real nice dance Do the loco-rona this is your chance! every body do the loco-rona with me!
Music like this
formatting link

If society collapses... Our last hope is with Elon Musk, an escape to mars.
This is the news: Marsbase 1 first astronut tested positive for loco-rona virus. LOL
Reply to
Jan Panteltje
On a sunny day (Sat, 7 Mar 2020 16:06:39 -0000 (UTC)) it happened Martin Gregorie wrote in :
These days it is, AFAIK, all done in digital memory; at least that is how I do it on the PC, and you can do it too: mplayer -fps 2 video.avi
Yes! Nice! Seen similar setups at IBM.
Thanks, sometimes it is nice to see the old days..
Reply to
Jan Panteltje
I learnt C from reading K&R - easy-peasy, but then I was already a programmer and analyst.
My first language was Algol 60, followed by PLAN (ICL 1900 mainframe assembler) when I started work, then COBOL, Algol 68 (a much underrated language), BASIC, PL/9 and then C.
If you're a new programmer and have just learned C or Java, I can thoroughly recommend getting hold of "The Practice of Programming" by Kernighan and Pike.
Its aim is to teach new programmers how to write well-structured, readable programs that are easy to debug and/or extend. These are skills that very few language-specific books or programming courses teach but are essential for a professional programmer or anybody planning to write and publish free software.
Though it's written around the Algol, Pascal, C, Java family of languages, the ideas in it are applicable to almost any programming language.
--
Martin    | martin at 
Gregorie  | gregorie dot org
Reply to
Martin Gregorie
Hollerith card
Yes, I know about its relationship to cards.
What I dislike about it is the way the A-Z block is broken by groups of punctuation and other characters which can play merry hell with sorting as well as putting digits after letters. I knew 6-bit ISO code (used by ICL 1900s) and ASCII before I met EBCDIC, so I found its collation sequence rather nasty - and still do.
--
Martin    | martin at 
Gregorie  | gregorie dot org
Reply to
Martin Gregorie
You need to learn 8086 assembler then. There is instruction set support for EBCDIC/ASCII conversion.
Reply to
mm0fmf
Nah, just need to implement a suitable Java Comparable class any time I get sufficiently pissed off with the way TreeMap insists on using a default collation sequence that puts digits behind alphabetic characters. Or do something similar in C.
I've written code using other assemblers: PLAN, 6800, 6809 and 68000, but as I haven't used any of them in the last 20 years, I've yet to find a reason for learning any others. OK, I might make an exception for PIC assembler if I can't do what I need with PICAXE BASIC.
--
Martin    | martin at 
Gregorie  | gregorie dot org
Reply to
Martin Gregorie
Java: another dodo language, like COBOL.
XLAT was added to the original 8086 instruction set by Intel at IBM's request, specifically to speed up conversions to and from EBCDIC. Load DS:BX with the table address and the element index into AL, and XLAT replaces AL with the byte at DS:BX + AL.
I wrote some 8086 code that specifically used it in maybe 1985, ISTR. Never used it again and was surprised to see it has made its way into the 64-bit instruction set. The last 80x86 assembler I wrote was in 1996, for an 80186EB embedded thing. Luckily I've only needed ARM (assorted sizes) and MIPS since then in the job that pays the bills, and some 8-bit PIC for hobbyjobbies.
Reply to
mm0fmf
8" disks weren't that much better -- but at least there was the IBM 3740 format to fall back on for compatibility. In my first ever job (not quite as long ago as yours) we had a Z80 CP/M machine with two internal 5.25" floppies and an external 8" drive. That 8" drive supported a DS/DD format with a massive 1.3MB capacity.
That machine was lovely to develop on as it had one of the nicest keyboards to type on that I've ever encountered ... but its main purpose was as a disk format conversion box -- it could handle dozens of formats on both the 5.25" and the 8" drives.
By the time the IBM PC appeared, most CP/M-80 machines I was seeing wrote either 10 512-byte or 5 1024-byte sectors per track on a 5.25" floppy -- making 400kB of data in all. They mostly used WD controller chips, which allowed that. The floppy controller in the PC writes too much lead-in data on each track to achieve that, so the best it can do is 9 sectors of 512 bytes -- giving 360kB (DOS originally wrote only 8 sectors to keep the addressing simpler, giving 320kB in all).
IBM really screwed up there!
--
Cheers, 
 Daniel.
Reply to
Daniel James
I learned Algol68 first ... and almost every language I've looked at since has been a bit of a disappointment (except, maybe Ada and C++).
Do you know of Algol68 Genie, the interpreter that runs on Windows and Linux? Nostalgia is very much what it used to be ...
--
Cheers, 
 Daniel.
Reply to
Daniel James
Quoting from Ted Nelson's _Computer Lib_:
ASCII and ye shall receive. -- the computer industry
ASCII not, what your machine can do for you. -- IBM
--
/~\  Charlie Gibbs                  |  Microsoft is a dictatorship. 
\ /        |  Apple is a cult. 
Reply to
Charlie Gibbs
I guess the IBM folks missed the 360's TR instruction.
I was heavily into assembly language back in the mainframe days, and had fun writing 8080 assembly code on my CP/M box. However, 8086/8088 assembly language just got too kludgy for me, even before you got into all the segment register headaches. Fortunately, C came along, and I gratefully made the switch(). :-)
--
/~\  Charlie Gibbs                  |  Microsoft is a dictatorship. 
\ /        |  Apple is a cult. 
Reply to
Charlie Gibbs
I've heard of it, but not used it.
All my A68 experience, what there was of it, was with Algol68R on a 1904S running George 3 when I was the sysadmin for a while at British Steel's Battersea Labs. I'd installed A68 there and played with it a bit before letting the researchers loose with it. It was installed at their request.
Then, quite suddenly, the George 3 job accounting system went titsup with a classic 1900 crash: the program reduced its memory size to 64 words and crashed trying to execute an instruction that was now outside the 64-word address space. Since at this point you had nothing except the accumulators, PC and CC plus the next 54 words to work with, it wasn't easy to debug a program that did that.
Long story short: the accounting program had tried to handle a very long-running job which had created a very big log. Its first action was to increase memory to hold the entire log before starting to process it. This time the log size forced it to hit the process memory limit (32K words for a program running in 15AM mode), at which point the address pointer wrapped round and continued writing the log from address zero, which stomped on its registers, overwrote the PC and crashed.
Technical point: in all 1900s below the 1906 the 8 accumulators, PC and CC were the first 10 words of a program's memory. The only hardware registers were datum and limit for the running program, which made swapping programs in and out of memory or moving them around extremely easy because the entire program was always a single code block, but it did mean that an address overflow could wrap round and continue writing to memory from word zero. The PC address was always relative to the program's datum address.
Anyway, I worked out why it had crashed: see above. So I wrote its replacement in A68R, mainly because I could; IIRC we didn't have a COBOL compiler, and doing it in PLAN would have taken a lot longer. C hadn't yet been invented and I don't speak FORTRAN. My program processed logs line by line, so it bypassed the original problem.
A month or three after that I left and took 10 months off, driving to India and back. When I visited on my return my program was still up and running happily. I know it hadn't been touched because my replacement had deleted the source(!) soon after I'd left.
--
Martin    | martin at 
Gregorie  | gregorie dot org
Reply to
Martin Gregorie
I've done just that in Java, for a fairly complex data structure. I designed it first by drawing an Entity Relationship Diagram. You can see from this approach that I've done my time designing and tuning databases.
I mapped the ERD into memory by defining a Class for every entity and a TreeMap object for every prime key. This was very easy to do and has worked as I wanted it to do from the off.
Then, as the program got written, it was easy to add the various access methods as methods in the classes they manipulate.
--
Martin    | martin at 
Gregorie  | gregorie dot org
Reply to
Martin Gregorie
Not even slightly.
COBOL is gone and not much missed, but Java is vying with C as the most used language at present.
I wonder if that's specifically to deal with sorting digits before/after letters. Somebody must want that or it wouldn't persist - I wonder if it's some age-old library convention.
If so, that would explain its position in the EBCDIC collation sequence.
--
Martin    | martin at 
Gregorie  | gregorie dot org
Reply to
Martin Gregorie
I don't think so. Huge amount of legacy code is still in use.
--
To ban Christmas, simply give turkeys the vote.
Reply to
The Natural Philosopher
On a sunny day (Sun, 08 Mar 2020 17:11:57 -0000) it happened Daniel James wrote in :
Not sure you should blame the IBM PC floppy controller, as I used it in the CP/M clone I wrote; I'm talking about the 8272 chip. I used the Kaypro II format with its 40 tracks x 10 sectors, 400kB. Description of that system here:
formatting link

Circuit diagram of the floppy controller board I designed:
formatting link
IC1 is the 8272, circuit is dated 13-7-1984
Wrote the drivers too of course.
No, maybe it was Microsoft?
My Z80 system was faster than the IBM PC when I later added a RAM disk and on power-up loaded a whole 400 kB floppy into that RAM disk, so no more seek times, and read times next to zero.
formatting link

Reply to
Jan Panteltje
On a sunny day (Sun, 8 Mar 2020 12:52:59 -0000 (UTC)) it happened Martin Gregorie wrote in :
Indeed!
As I come from a hardware background, I was solving problems using logic circuits long before computers were available to me. I think this is a plus, being familiar with hardware and logic. That HS100 slow-motion huge-disk machine was not controlled by some microprocessor but by simple logic gates, resulting in boards full of not-so-simple logic circuits. Same for the AVR1 video recorders: quite nice and very fast. 'Turing' is not always the best solution to problems. And we used analog computing circuits...
I remember the night I was first trying BASIC on a computer and was touched that if I did POKE ADDRESS 123 and then read it back with PEEK ADDRESS, the result was also 123; not only that, POKE and PEEK at ADDRESS+1 did not affect what was at ADDRESS. For me it was a step from analog computing to wanting to do more with digital. Like going from a slide rule to a real calculator: on the slide rule, 2 * 2 was always about 4, plus or minus something. Finally I could get exact answers!
These days I think, with all the hype about 'quantum computers', those guys completely miss the point; noise takes them back -- in fact they _have_ an old analog computer. Many a scientific paper is published that has as its last line "This will (with more funding, duh) bring the quantum computer much closer". My suggestion to those guys: come up with something that can factor a large number.
You see those noise limits also in, for example, multilevel FLASH memory: the charge (or say voltage) in a single cell is divided into, say, 4 levels for 2 bits, and noise looks you in the face as to how far you can go with the number of levels. Add a bit of radiation and it is all over; do not take it on a Mars trip. Elon reading this???
Anyways, programming is fun, but for heavens sake please know the hardware too.
Reply to
Jan Panteltje
On a sunny day (Mon, 09 Mar 2020 06:59:02 GMT) it happened Jan Panteltje wrote in :
PS, could have been an IBM BIOS limitation?
Reply to
Jan Panteltje
