We would not be here were it not for DTSS

John Robertson wrote in news:EpCdnRI0ivb6VqXDnZ2dnUU7-U snipped-for-privacy@giganews.com:

No. What he is is illiterate. The topic clearly states things, and I never said BASIC was this great advance. Slotard likes to poke folks and he got it wrong yet again. He presumed I was referring to BASIC because that is what the video talks about and is titled after. He proved he did not watch it and only looked at the title.

In the video, the more robust languages were mentioned - yet another proof that the chump did not even watch the video.

Putting that aside, it actually was advanced inasmuch as it was an interpreted language as opposed to a compiled language. That made it hugely more cross-platform compatible.

Try moving a Burroughs mainframe MRP application to the PC. We were unable to even do it back in those 286/386 days when we tried.

Still not the point. Overkill Bill will never get the point. His functioning brain matter would fit on the end of a molecular probe. And that was before his senility started kicking in.

Reply to
DecadentLinuxUserNumeroUno

Funny that he doesn't get the entire point of basic & how it changed the world.

Reply to
tabbypurr

…hat you could do faster with a more appropriate language doesn't make a good choice.

…advance. Teaching lots of undergraduates to program was a good idea, but that's it.

BASIC was more aimed at undergraduates than kids.

I'm not sure that getting kids into programming really early is such a good idea.

There's a great deal of sloppy program writing around, and it might pay off better - in the long term - to get people into it when they are old enough to understand that care and attention to detail matter, rather than letting them learn early on how to quickly crank out something that more or less works.

Hobbyists still learn how to use the 741 and the 555, and miss the fact that anything you do with these chips you can do better with other devices.

You can claim anything you like, and when it comes to climate change you do just that. This is an unmoderated group. Fairness doesn't come into it - nor any real comprehension of what you are claiming.

--
Bill Sloman, Sydney
Reply to
Bill Sloman

snipped-for-privacy@gmail.com wrote in news: snipped-for-privacy@googlegroups.com:

He doesn't get it. It was not a trivial advancement, but even if it was, it is still an advancement.

Just think where we would be had Edison not invented the "Edison Screw Base" light bulb socket standard.

Large or small, the pressed or spun metal, cheap bulb sockets worked for many types of AC- or DC-powered "light bulb" style light sources. They were very easy to integrate with the glass bulbs too. Often it all happened on the same line: the glass would get sealed via torch and cooled, then the socket would get added and the little center contact soldered on - all automated, even the testing.

Other socket types experience a loosening of the connection. The screw ensured use in any orientation.

Been around a very long time too.

I'd bet that any school with a computer science course for junior-high-aged children even today touches on BASIC, if it doesn't have course paths that require its use as a lesson. Likely character-based games as well, before moving on to graphically capable machines.

Reply to
DecadentLinuxUserNumeroUno

The entire point of Basic was that it would run on tiny computers, and it didn't change the world.

The world-changing stuff was done on more powerful computers, with rather better programming languages. NT clearly didn't come into contact with any of that.

--
Bill Sloman, Sydney
Reply to
Bill Sloman

I was there when it was happening. BASIC didn't come into it.

The development of small, cheap and moderately powerful computers was what made the difference, and all sorts of people used them in all sorts of different ways.

Not on this subject. The first computer I used was a mainframe - an IBM 7040/44.

I even got to operate it from time to time, though its main job was batch processing loads of little jobs for all the students (and a lot of the staff) at the University of Melbourne. The second computer I used - at much the same time - was a DEC PDP-8, which wasn't used to run batch programs. I used it to record the outputs of my chemical kinetics experiments, which got punched out on paper tape, converted to Hollerith cards, and run through the IBM 7040/44 (usually at around 2.00 am) for serious data extraction - non-linear multi-parameter curve fitting.

The PDP-8 did have a BASIC compiler, but it really wasn't worth using.

Once I'd got my Ph.D. I got to work with advanced computers doing more exotic jobs.

One of the electron beam testers I was involved with - much later - was used to debug the first Motorola 68000 processor chips, and the Motorola guy who used it claimed that the machine shortened the debugging phase by about three months.

Teaching undergraduates to program in BASIC might have been worth doing, but it wasn't exactly a crucial engine of change.

--
Bill Sloman, Sydney
Reply to
Bill Sloman

Bill Sloman wrote in news: snipped-for-privacy@googlegroups.com:

Bill Sloman is an abject idiot and has little grasp of computer science, much less its history, even though he was around as it evolved and even got to watch someone use one a couple times.

Now that they are ubiquitous, he does have one, but all he uses it for is posting retarded comments on Usenet.

Reply to
DecadentLinuxUserNumeroUno

Bill Sloman wrote in news:3544c5a5-d108-4696- snipped-for-privacy@googlegroups.com:

The move from batch processing to time sharing most certainly did involve BASIC. Sorry, punk, but you were not at Dartmouth.

Reply to
DecadentLinuxUserNumeroUno

DLUNU doesn't seem to have much of a clue about computer history. He won't know who Alan Turing was, or what he might have had to do with Donald Watts Davies, with whom I collaborated - very briefly - around 1981.

Not exactly true. It runs LTspice, and I do use it to simulate circuits from time to time. And it runs LibreOffice, which I use to edit the NSW IEEE newsletter a couple of times a year. It has a Linux partition, as have its predecessors back to about 1999 when I installed the gEDA circuit design and layout package. These days I've got KiCad, which also runs under Windows, but I've not done much with it.

--
Bill Sloman, Sydney
Reply to
Bill Sloman

In my first industrial job, back in 1970, we had a time-shared terminal in the lab that was supported by a Control Data computer on the other side of town. I used it to write (and run) FORTRAN programs. During the training course, one of the instructors was unnecessarily rude about the code written by one of my colleagues, and I pointed out that it would compile to exactly the same machine code as my version - which it did. Fun.

BASIC was made available on those kinds of systems, but they did compile proper languages too.

--
Bill Sloman, Sydney
Reply to
Bill Sloman

Was that a BASIC compiler or a BASIC interpreter ?

Implementing time sharing is easier with an interpreter, since you can keep the interpreter resident all the time and just swap out the limited number of variables to mass storage at the end of a time quantum. The next time the application gets a quantum, just load the variables, resume reading the source code from disk, and continue execution from where it was.
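A minimal sketch of why that works, in Python (purely illustrative - not DTSS code, and the program and opcodes are made up): if the interpreter's entire run-time state for a user is just a program counter plus a variable table, then "swapping out" at the end of a quantum means saving only those two things.

```python
# Toy line-numbered program: 10 LET I = 0, 20 LET I = I + 1,
# 30 IF I < 5 GOTO 20, 40 END.
PROGRAM = {
    10: ("LET", "I", 0),
    20: ("LET", "I", lambda v: v["I"] + 1),
    30: ("IF_LT_GOTO", "I", 5, 20),
    40: ("END",),
}

def run_quantum(state, max_steps):
    """Execute up to max_steps statements, then hand back the saved state."""
    pc, variables = state
    lines = sorted(PROGRAM)
    for _ in range(max_steps):
        if pc is None:          # program has ended
            break
        op = PROGRAM[pc]
        nxt = next((l for l in lines if l > pc), None)
        if op[0] == "LET":
            _, name, value = op
            variables[name] = value(variables) if callable(value) else value
            pc = nxt
        elif op[0] == "IF_LT_GOTO":
            _, name, limit, target = op
            pc = target if variables[name] < limit else nxt
        elif op[0] == "END":
            pc = None
    # This tuple is all that would need to be swapped to mass storage.
    return (pc, variables)

# "Scheduler": give this user three statements per quantum until done.
state = (10, {})
while state[0] is not None:
    state = run_quantum(state, 3)
print(state[1]["I"])  # 5
```

The interpreter itself stays resident and is shared by every user; only the small `(pc, variables)` tuple is per-user, which is the property the post describes.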

BASIC was not the first attempt to give computing to the masses.

FORTRAN (FORmula TRANslation) was a 1950s attempt.

COBOL was a similar attempt. According to Grace Hopper (the mother of COBOL), the aim was that accountants could do their own programming. For instance, the syntax allowed for a lot of filler words, so that statements looked like normal English sentences.

In reality, accountants started doing their own programming with spreadsheets. It should also be noted that the formulas written into spreadsheet cells do much of the programming that was previously done with separate BASIC programs.

Reply to
upsidedown

Bill Sloman wrote in news: snipped-for-privacy@googlegroups.com:

Autodesk bought EAGLE and it is free now. It is a Windows app now. Better than both of the remaining Linux EDA apps. It started out open, though.

Reply to
DecadentLinuxUserNumeroUno

Bill Sloman wrote in news: snipped-for-privacy@googlegroups.com:

It was my first major, dipshit.

I knew about Turing *before* the excellent movie about him.

Again, you are an abject idiot.

Shame your bullshit isn't brief. It stinks up the place.

Reply to
DecadentLinuxUserNumeroUno

Bill Sloman wrote in news: snipped-for-privacy@googlegroups.com:

DTSS was 1964. D'oh!

Unnecessarily rude... you mean like you are in this group so often? Or some other version...

You are as bad as Trump when it comes to admitting that you made yet another improper assessment and attack.

*I* touted DTSS.

The *video* is titled referencing BASIC.

YOU made a stupid presumption, and even claimed I said something else based on your presumption. And you still have yet to watch the video. Your brain is broken.

Reply to
DecadentLinuxUserNumeroUno

snipped-for-privacy@downunder.com wrote in news: snipped-for-privacy@4ax.com:

BASIC is typically a line-by-line interpreted language. There were "compilers" made for it on *some* platforms.
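A toy illustration (hypothetical, in Python - not any real BASIC implementation) of what "line by line interpreted" means: the text of each numbered line is parsed as it is reached, rather than the whole program being translated to machine code once up front, as a compiler would do.

```python
# A three-line made-up program in BASIC-ish source form.
SOURCE = [
    "10 LET X = 2",
    "20 LET X = X * X",
    "30 PRINT X",
]

def interpret(source):
    env = {}  # variable table
    for line in source:
        # Re-parse the raw text of every line as it executes.
        number, keyword, rest = line.split(maxsplit=2)
        if keyword == "LET":
            name, expr = (s.strip() for s in rest.split("=", 1))
            # eval() stands in here for a real expression parser.
            env[name] = eval(expr, {}, env)
        elif keyword == "PRINT":
            print(env[rest.strip()])
    return env

interpret(SOURCE)  # prints 4
```

Re-parsing the source text on every pass is what makes interpretation slow relative to compiled code, but it also makes the program trivially portable to any machine that runs the interpreter - the trade-off discussed upthread.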

Reply to
DecadentLinuxUserNumeroUno

snipped-for-privacy@downunder.com wrote in news: snipped-for-privacy@4ax.com:

Wrong. FORTRAN was for the original building-sized computers, not "the masses". The "masses" it "was for" were college computer science and engineering students, and staff engineers at the companies using the computers that were in the field. There were no computers in the hands of "the masses" back then.

Reply to
DecadentLinuxUserNumeroUno

This misses the point. The original computer programmers programmed in machine language. Assembly languages made the process easier by letting you write easily remembered words which could be directly translated to the appropriate binary string.

FORTRAN was the first higher-level language which was compiled to create the appropriate sequences of binary strings.

The physical size of the computer didn't come into it. The fact that there weren't many of them initially limited the number of people who needed to write computer programs, but when integrated circuits made it a lot cheaper to build computers, the demand went up, and places like Dartmouth started churning out programmers. Their existence reflected the fact that semiconductor technology had changed the world. They were an effect, not a cause.

If your computer history major didn't teach you that, you were being sold a line of goods, probably by somebody who put a high value on their own contribution.

--
Bill Sloman, Sydney
Reply to
Bill Sloman

Bill Sloman wrote in news:3ec9c3c4-2f49-4603- snipped-for-privacy@googlegroups.com:

It was just a reference to the timeline ya dumbfuck. Those were the only machines around then. You are thick.

Reply to
DecadentLinuxUserNumeroUno

Bill Sloman wrote in news: snipped-for-privacy@googlegroups.com:

You are still lost in your CONSTANT thinking that I missed something or lack some knowledge you have. Fuck you, Billy. you retarded f*****ad.

Reply to
DecadentLinuxUserNumeroUno

It enabled lots of kids to get into programming, resulting in lots of programmers, resulting in lots of software development later. What kinda dunce can't see that I don't know, but there ya go.

You'd use some other base. Swan's wasn't too practical. BC (bayonet cap) is, but I didn't know when it dates from. Now I do - the 1870s.

We don't have that issue with BC.

Computing today without BASIC would be significantly less advanced.

NT

Reply to
tabbypurr
