Spectre / Meltdown

I first coded in FORTRAN. All there was then was ALGOL, FORTRAN and COBOL.

Then came B, BCPL and C... and then after that, trash languages designed to let monkeys think they could code.

--
"Strange as it seems, no amount of learning can cure stupidity, and  
higher education positively fortifies it." 

    - Stephen Vizinczey
Reply to
The Natural Philosopher

COBOL wasn't actively evil (apart from the ALTER statement - and that had vanished by the mid-80s). It was just far too verbose:

ADD A TO B GIVING C ON SIZE ERROR PERFORM OVERFLOW-TRAP.

where the Java equivalent would be something like:

try { c = Math.addExact(a, b); } catch (ArithmeticException e) { overflowTrap(e); }

There's also a lot of typing involved because of the size of variable names and the number of them you have to define:

- fields in structures have names that aren't linked in any way to the structure's name.

- if a program reads a value from an input file, writes it to a database file and prints it in a report or displays it on screen, you'll end up defining the variable three times, and the input and display forms will be slightly different because they define the external format.

That said, COBOL did at least always let you catch ON SIZE ERROR overflow exceptions, which is more than could be said for other contemporary HLLs such as FORTRAN. Specifying formats for input and output fields was also always pretty straightforward and intuitive.

However, it had some terrible features, such as the ALTER verb, which allowed a running COBOL program to modify itself, making debugging hellish.

Early COBOLs would allow you to call assembler subroutines but the language did not allow external subroutines to be written in COBOL. I don't remember seeing this feature in any implementation before 1978.

Early COBOLs also required a program to be a single source file, though there was a COPY statement that could be used to pull in record definitions or sections of code. Fortunately, the compilers could handle huge source files: a program of less than 200 lines usually did nothing useful. Programs were generally in the 1000-4000 line range, though I have seen a few in the 10,000 line range.

There was a built-in Y2K gotcha too: The only way you could get hold of the date was with a statement like

ACCEPT DATE-TODAY FROM SYSTEM-DATE.

where the DATE-TODAY variable *had* to be declared in working storage as:

01 DATE-TODAY PIC 9(6).

and the ACCEPT statement would fill it with a date formatted as 'yymmdd'. This requirement was defined in the CODASYL Report (i.e. was part of the language standard) and AFAIK it was not changed until after Y2K had been and gone.

--
Martin    | martin at 
Gregorie  | gregorie dot org
Reply to
Martin Gregorie

Hello Richard,

Can you explain to us how we can find out which versions can attack which Pi versions? I have many Pis: 2 x 1B, 1B+, 2B, 3B, 3B+, 4B 4GB, 4B 8GB, Zero W, BeagleBoard xM, Acorn RiscPC, Archimedes A440, BBC/Master/Compact/Electron/Atom. Some are running 24/7, others only occasionally. None of them use WiFi or BT, only RJ45 cable. Some run RISC OS, some Raspbian Linux. No electronic banking, no credit card, no smartphone (only a 2G phone and a 3G inet router occasionally on the move), no Google account, no FaceBook, no Twitter, no Cloud storage. I have switched off RFID payment on my bank PIN cards, and they are permanently stored in an RFID-free sleeve. So I want to minimise all the risks as much as possible. Thanks for informing us.

Henri.

Reply to
Henri Derksen

Hello David,

I saw the light for the first time in 1956 as the son of an inland skipper in NL, and after the school for the hard of hearing I went to work in administration offices from 1978. Since 1979 I gained knowledge of using computers for administration tasks, on mainframes and minis, and since 1983 also at home. I did many studies in ICT, i.e. programming BASIC and communication etc. In the private study for COBOL I did not pass the examination ;-(. I think that was a kind of "sign" to stop that route ;-). Since 1984 I started using the better Acorn BBC computer and almost all its successors intensively. You surely knew that, right?

Unluckily I had a hearing crisis 4 times, the 3rd one - a sudden deafness in 1993 - being the worst, 50 dBA loss in only one night ;-( Reason: I worked too hard with too much stress as a computer system engineer, as I was deaf on the left side and hard of hearing on the right side from birth. I phoned very intensively with all the computer users in the company I worked for. When someone had a problem, I was there within 5 minutes. And I did it all alone. As I could not phone anymore I had to stop working in ICT, and did only administrative jobs like payroll controlling etc. After the last (4th) hearing crisis in 1997 (heavy tinnitus, hyperacusis, misophonia and recruitment) I do not work professionally any more, but only do volunteer work for the hard of hearing (induction loop testing) and about laws for inland skippers of the bigger recreational craft in NL.

When I left the firm, my work was done by 3 people by 1999, and the users were crying far more than in the period before, when I helped them alone. Reason: they wanted Windows instead of DOS with the simple menus I made. And you know what that means ;-(. Of course Windows was not needed for the functionality, but it cost a lot more in resources, money and workmanship, and gave many problems too. They did not see the Bill Gates methods, which I saw from the beginning. I still hate Windows, and mostly use RISC OS and Linux.

The last weeks I have been viewing many YouTube videos on the Pi 4B about UK narrow boats, canals, rivers, locks etc. Very interesting. And you know I am still a very active member of the Dutch Big Ben Club for Acorn and RISC OS computers, for 36 years now. Pity we have to calm down our meetings because of Covid-19 ;-(.

Greetings from Henri, to your wife too. She knows me from the British computer shows I visited several times.

Reply to
Henri Derksen

Here's a list showing when the first compiler for each language was released:

FORTRAN    1957
Algol 60   1960
COBOL      1960
BCPL       1967
C          1972

... which is interesting. I'd always thought that BCPL preceded Algol 60 because it's a relatively primitive language. I didn't realise that ALGOL 60 and COBOL both had working compilers released in the same year.

FWIW, Grace Hopper's FLOW-MATIC, a recognisable ancestor of COBOL, had its first complete compiler out in 1959, and her MATH-MATIC, which looks more similar to FLOW-MATIC than to FORTRAN (which was IBM proprietary), was released in 1957, the same year as FORTRAN's first compiler.

It's kind of interesting, too, that FLOW-MATIC, MATH-MATIC and BASIC (released 1964) all used what are effectively line numbers as labels, and all share the problem of renumbering lines when you need to add more code between existing lines.

AFAIK only JOSS and its successors JEAN (on ICL kit) and FOCAL (on DEC kit) solved that problem: these languages use real numbers for code lines:

1.1 to 1.9
1.2 set a=0
1.3 to 2.0
1.9 do something else

and treat all lines with the same whole number as a subroutine:

2.55 do part 3

I wrote a bit of JEAN back in the day and preferred it to BASIC even though its syntax prevents conditionals from having an else branch:

1.01 type "Hiya" if b=0
1.02 type "boo hiss" if b=1
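
Purely as an illustration - my own toy model in Java, nothing from JOSS or JEAN themselves, with invented names - the trick can be modelled as statements keyed by a real number in a sorted map: a late addition slots between existing lines without renumbering, and 'do part 1' just runs every line whose whole-number part is 1:

import java.util.TreeMap;

// Toy model of JOSS/JEAN-style numbering: statements are keyed by a real
// number, so new code can always be inserted between existing lines, and
// all lines sharing the same whole-number part form a callable "part".
public class PartDemo {
    private static final TreeMap<Double, String> program = new TreeMap<>();

    static void doPart(int part) {
        // Run every line n.xx belonging to the requested part, in order.
        program.subMap((double) part, true, part + 1.0, false)
               .forEach((line, stmt) -> System.out.println(line + " " + stmt));
    }

    public static void main(String[] args) {
        program.put(1.1, "type \"Hiya\"");
        program.put(1.9, "do something else");
        program.put(1.15, "set a=0");   // added later; slots in with no renumbering
        doPart(1);                      // prints lines 1.1, 1.15, 1.9 in order
    }
}
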
--
Martin    | martin at 
Gregorie  | gregorie dot org
Reply to
Martin Gregorie

On Mon, 31 Aug 2020 16:57:22 -0000 (UTC), Martin Gregorie declaimed the following:

I'm pretty certain the COBOL-74 standard provided for external modules to be linked -- my Xerox Sigma-6 manual is in storage so I can't confirm -- and it has been 45 years since I took that class.

formatting link
""" COBOL-74 added subprograms, giving programmers the ability to control the data each part of the program could access. """

Strangely, while I'm sure my course introduced linking separately compiled COBOL modules, we were NOT introduced to "COPY" statements.

Though that is a minor problem -- since it returns the current date, it at least was easy to detect century wrap-around and prefix it with the correct 19 or 20.
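
A minimal sketch of that fix, in Java rather than COBOL (my own, and the pivot year of 70 is just an assumption): take the 6-digit yymmdd value the working-storage field would hold and decide which century to prefix:

import java.time.LocalDate;

// Sketch of pivot-year "windowing": expand the two-digit year from a yymmdd
// current date by assuming 00-69 means 20xx and 70-99 means 19xx.
public class CenturyWindow {
    static int expandYear(int yy, int pivot) {
        return (yy < pivot) ? 2000 + yy : 1900 + yy;
    }

    public static void main(String[] args) {
        String yymmdd = "200831";       // what a PIC 9(6) field would have held
        int yy = Integer.parseInt(yymmdd.substring(0, 2));
        int mm = Integer.parseInt(yymmdd.substring(2, 4));
        int dd = Integer.parseInt(yymmdd.substring(4, 6));
        LocalDate date = LocalDate.of(expandYear(yy, 70), mm, dd);
        System.out.println(date);       // 2020-08-31
    }
}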

Not so simple is the case of data records that could span a range where "windowing" was not feasible.

--
	Wulfraed                 Dennis Lee Bieber         AF6VN 
	wlfraed@ix.netcom.com    http://wlfraed.microdiversity.freeddns.org/
Reply to
Dennis Lee Bieber

It was in the late '60s when Datamation published Goldilocks and the Three Bears in compilable COBOL.

--

-TV
Reply to
Tauno Voipio

I agree with Mayayana, it is consumer psychology and behavior that fill the pockets of the Bezoses of this world. But it is also the advantage the Bezoses had in being a few steps ahead in the technology and taking advantage of the consumer. The internet has turned into a swamp full of shit and it is getting worse.

Reply to
Deloptes

I wrote COBOL on ICL mainframes and the 2903 office computer from about 1969 until 1977, on ICL 2966 / VME/B systems 1978-1984, and a bit of deviant Tandem NonStop S-COBOL in the '90s.

You're right about statically linked subroutines, though before around 1974 I remember calling statically linked PLAN assembler subroutines, but IIRC COBOL subroutines were not supported and there was certainly no support for using a variable to hold the subroutine name, even if the subroutines were all statically linked. I don't think that was in the CODASYL standard either, but I could be wrong about that because all my COBOL up to then was written on ICL 1900s.

I don't recall any ability to call COBOL subroutines before 1973. In 1976/77 on the 2903 (which ran unmodified 1900 machine code and compilers) I needed the ability to select and run a COBOL subroutine depending on the screen image an accounting system was required to display, i.e. I could statically link a collection of subroutines, all using the same interface, with the main program, but the subroutine name had to be a variable. At that time the 1900 compiler would only let the subroutine name be a constant, so I had to write the small (100 instructions or so) main loop in PLAN3 assembler, and everything else (file access, screen-specific code) was written as COBOL subroutines.

By the time 1979 rolled round I was still writing COBOL, but this time on an ICL 2966 running under VME/B. That COBOL dialect did support subroutine calls with the subroutine name in a variable, so I was able to write a similar call structure in 100% COBOL - just as well, since there was no official assembler and the S3 system programming language was not available to application authors.
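
For what it's worth, a rough modern analogue of that call structure - a fixed set of 'subroutines' sharing one interface, selected at run time by a name held in a variable - might look like this in Java; the screen names and handler bodies here are invented for illustration:

import java.util.Map;

// Every "subroutine" shares the same interface, just as the statically
// linked COBOL screen handlers did.
interface ScreenHandler {
    void handle();
}

public class ScreenDispatcher {
    // The fixed set of handlers, keyed by the name held in a variable at run time.
    private static final Map<String, ScreenHandler> HANDLERS = Map.of(
        "CUSTOMER-ENQUIRY", () -> System.out.println("show customer enquiry screen"),
        "INVOICE-ENTRY",    () -> System.out.println("show invoice entry screen"),
        "LEDGER-POSTING",   () -> System.out.println("show ledger posting screen")
    );

    static void dispatch(String screenName) {
        ScreenHandler h = HANDLERS.get(screenName);
        if (h == null) {
            throw new IllegalArgumentException("no handler for screen " + screenName);
        }
        h.handle();                      // the equivalent of CALL <variable>
    }

    public static void main(String[] args) {
        dispatch("INVOICE-ENTRY");       // screen name chosen at run time
    }
}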

IIRC the various mainframe dialects differed: an ICL or Burroughs COBOL program was unlikely to compile on IBM iron without changes, and vice versa.

Depends on the application: what you suggest works fine for the date on a report or at the top of a screen, but becomes a problem in other cases - I remember seeing a payroll back in the late '60s (in the days of 6-digit dates) that *HAD* to handle birth dates in the 1890s. Not altogether easy on a system whose base date was 1/1/1900!
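
To make the awkwardness concrete, here is a small sketch (modern Java, nothing to do with that actual payroll): if every date is held as a day count from a 1/1/1900 base, anything earlier than the base comes out negative, or is simply unrepresentable if the field cannot go negative:

import java.time.LocalDate;
import java.time.temporal.ChronoUnit;

// With an epoch of 1 Jan 1900, a birth date in the 1890s can only be held
// as a negative day count - or not at all if the field is unsigned.
public class BaseDateDemo {
    private static final LocalDate BASE = LocalDate.of(1900, 1, 1);

    static long daysSinceBase(LocalDate d) {
        return ChronoUnit.DAYS.between(BASE, d);
    }

    public static void main(String[] args) {
        System.out.println(daysSinceBase(LocalDate.of(1965, 6, 1)));   // positive: fine
        System.out.println(daysSinceBase(LocalDate.of(1895, 6, 1)));   // negative: trouble
    }
}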

My real argument with the 6-digit date is that its universal use in COBOL meant that analysts and designers tended to forget that the century is sometimes important - as it was when, say, producing a statement for transactions made during December 1999 but processed and printed in January 2000.

In other cases it was no problem at all: one of the more 'interesting' systems I designed had, amongst other things, to deal with dates from 55 BC onward and with varying degrees of accuracy ranging from '14th century' through 'Flourished 1750-1780' and 'FY91/92' to 31/18/2020.

Indeed, and these were almost certainly the hardest to track down and fix.

--
Martin    | martin at 
Gregorie  | gregorie dot org
Reply to
Martin Gregorie

That's because M$ tried to add proprietary extensions to it - in violation of their licensing agreement with Sun - and they got caught at it.

Remember those science fiction stories where you could do stuff like that? Back then it worked because it was assumed that smart houses would be independent entities (with suitable access controls), rather than slaves of the Cloud.

--
/~\  Charlie Gibbs                  |  Microsoft is a dictatorship. 
\ /        |  Apple is a cult. 
 X   I'm really at ac.dekanfrus     |  Linux is anarchy. 
/ \  if you read it the right way.  |  Pick your poison.
Reply to
Charlie Gibbs

Find out the CPU core in your Pi (look it up in Wikipedia or something), then use the table in the URL above.

--
https://www.greenend.org.uk/rjk/
Reply to
Richard Kettlewell

That's ~57 bytes.

...and that is ~72... counting tabs and carriage returns.

Definitely Java is more verbose.

In Java.

--
“when things get difficult you just have to lie”
Reply to
The Natural Philosopher

Then M$ stopped using Java... and a few months passed... and... behold! .NET arrived and had everything the JVM had, but now under their control. And VB and C#! Everything new and invented by M$! Not stolen! :)

Reply to
Nikolaj Lazic

"Charlie Gibbs" wrote

| > Java was being phased out.
|
| That's because M$ tried to add proprietary extensions
| to it - in violation of their licensing agreement with
| Sun - and they got caught at it.
|

I don't understand why so many people want to cast Microsoft as the only villain. Yes, MS tried to come up with their own version of Java. But that's not why Java was phased out. It was phased out because it was bloated and unsafe and didn't belong in webpages. Just like ActiveX. And because it really wasn't cross-platform. And because... since when are there multiple platforms? Back then over 90% of people were on Windows. The Mac people were mostly artists.

As Nikolaj pointed out, MS then came up with .Net. I once asked a project manager friend why .Net was so successful. Her response was one word: tools. MS have always been very good with both tools and docs. They make things people can use.

As it turned out, .Net was also idiotic online, as was Silverlight. Both systems are also failures for desktop software. So they've failed in those uses. Instead they're used on server backends.

Actually, .Net was another case where Microsoft were too much ahead of their time. They came out with it in 2000 as a web services system:

formatting link

The trouble was, no one had heard of web services and they served no purpose on desktops with 56k modems. Gates was talking about getting sports scores, Wall St updates, or making dental appts. It was ridiculous. He wanted Windows everywhere but he wasn't making sense.

So MS aimed at desktop software. They told everyone the days of native code were over and that everyone should use their new, safer, easier system. But it was like Java in being bloated molasses. No purpose on desktops. So it joined Java in corporate usage. Meanwhile, MS have slowly been closing off access because they got the idea from Lord Jobs that they could screw over both developers and customers by locking down the system and charging rental fees. Why sell cars when you can rent taxis? Even better, you can extort the taxi drivers for 30%.

They tried the Longhorn mess in 2005. Windows on top of a superfluous .Net wrapper. By their own description it was too bloated for existing hardware. So they had to give that up. Instead they focused on using security and stability as excuses to lock both customers and developers out of the system.

Microsoft's latest scam is Metro, AKA RT. Trinket apps. You can write them in .Net, for what it's worth. But they're actually little more than HTAs from what I can tell. Because you can also write them with javascript. And most access to the system is cut off. And once again, MS has come up with a brilliant idea for vacuuming wallets but neglected to consider that people would need a reason to buy in. There isn't even a device for Metro apps. They don't have a phone. Tablets? Maybe. Anyone dumb enough to overpay for an RT Surface could be a potential customer. But a tablet really isn't good for very much in the way of apps where a phone wouldn't be better.

Long story short, I don't see how you can blame Java's failure on MS. It turned out to not really be cross-platform and it's bloated. And it's had security problems. And who wants to use that gigantic object model? Cross platform safety was a good idea. It just didn't work.

| > If you get attacked online it will almost certainly be
| > because you enabled script and/or enabled remote access
| > functions so you could call your thermostat to tell it
| > you're on your way home.
|
| Remember those science fiction stories where you could do
| stuff like that? Back then it worked because it was assumed
| that smart houses would be independent entities (with suitable
| access controls), rather than slaves of the Cloud.
|

It's my impression that people are doing it. People get calls from their surveillance cameras to say someone has broken into their house, for instance. Soon, the more feebleminded will probably be calling their fridge to see if they need milk. And they'll be bragging about it.

There was an interesting story some years ago about 2 men who were rich, flying a private plane to a hunting cabin one of them owned. The owner used his iPhone to call his thermostat, so the cabin would be warm when they arrived. It turned out that a squirrel had built a nest in the furnace exhaust pipe and there were no working CO alarms. When the two men arrived they were dead before they had time to notice something was wrong.

I've also seen stories about hacked e-front door locks. People are eating up the IoT, no matter how dumb. I suppose it is all cloud-linked, but isn't that really a privacy issue rather than a functionality issue?

Reply to
Mayayana

On Mon, 31 Aug 2020 22:39:59 -0000 (UTC), Martin Gregorie declaimed the following:

My response was solely about the COBOL verb to obtain the "current" date - not about having to modify data files to handle multiple centuries due to, as in your example, dates of birth.

If you want something nasty... I once worked on a system where an identifier had to go from 6 bytes to 9 bytes -- but we were not (at the time) permitted to change the data record size! (Too much code was based upon the record size AND file total size.) We had to encode the identifiers using the RAD-50 character set. It wasn't until we'd gone through two file resizes (from something like 3000 records, to 5000, and then to 10000 records -- the logic was using binary search to find records by a different identifier field) that I managed to persuade the powers-that-be to convert the file to ISAM. That removed the fixed-size file, since binary search was no longer needed, and let us revert the RAD-50 identifier to 9-byte ASCII.
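
For anyone who hasn't met RAD-50: it is a 40-character alphabet, so three characters fit in one 16-bit word (40^3 = 64000 < 65536), which is how nine characters squeeze into six bytes. A rough sketch in Java - the identifier and class names are invented, and the alphabet shown is the usual PDP-11 ordering:

public class Rad50 {
    // The usual PDP-11 RAD-50 alphabet: space, A-Z, $, ., unused, 0-9 (40 symbols).
    private static final String ALPHABET = " ABCDEFGHIJKLMNOPQRSTUVWXYZ$.%0123456789";

    // Pack exactly three characters into one 16-bit word (max 39*1600 + 39*40 + 39 = 63999).
    static int packWord(String s) {
        int w = 0;
        for (int i = 0; i < 3; i++) {
            int v = ALPHABET.indexOf(Character.toUpperCase(s.charAt(i)));
            if (v < 0) throw new IllegalArgumentException("not a RAD-50 character: " + s.charAt(i));
            w = w * 40 + v;
        }
        return w;
    }

    // Unpack one 16-bit word back into three characters.
    static String unpackWord(int w) {
        char[] c = new char[3];
        for (int i = 2; i >= 0; i--) {
            c[i] = ALPHABET.charAt(w % 40);
            w /= 40;
        }
        return new String(c);
    }

    public static void main(String[] args) {
        String id = "ABC123XYZ";                     // hypothetical 9-character identifier
        int[] words = new int[3];                    // three words = 6 bytes instead of 9
        for (int i = 0; i < 3; i++) {
            words[i] = packWord(id.substring(i * 3, i * 3 + 3));
        }
        StringBuilder back = new StringBuilder();
        for (int w : words) back.append(unpackWord(w));
        System.out.println(back);                    // round-trips to ABC123XYZ
    }
}
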
--
	Wulfraed                 Dennis Lee Bieber         AF6VN 
	wlfraed@ix.netcom.com    http://wlfraed.microdiversity.freeddns.org/
Reply to
Dennis Lee Bieber

No, but it's why M$ lost interest in it. They wanted something they could manipulate to maximize lock-in, and Sun had taken measures to ensure Java wasn't it.

We're not. We're just always on the lookout for opportunities to do a little Microsoft-bashing.

And hackers will break into the surveillance cameras to figure out when it's safe to break into the house.

But most of all, the owners of the centralized sites that co-ordinate it all will be able to compile a dossier on all users' movements, with Alexa filling in the details.

(Excuse me, my tinfoil hat is pinching. Let me adjust it...)

Of course they will. Ancient Chinese emperors grew their fingernails so long that their hands were useless, and in Victorian times the upper classes wore clothing so complex and restrictive that they could hardly do anything for themselves. This was a point of pride, for it demonstrated that these people were sufficiently powerful to have others do things for them.

Another entry for the Darwin Awards.

Perhaps, but isn't privacy important too? Or is everyone really happy that Orwell's telescreens have been deployed? Given the rise of authoritarianism everywhere in the world (especially in the U.S.), I consider it cause for alarm.

The point is that it doesn't have to be cloud-linked (FSVO "cloud"). This was not the vision of the creators of the Internet; they saw it as a peer-to-peer thing, not a return to the centralized model of the '60s.

--
/~\  Charlie Gibbs                  |  Microsoft is a dictatorship. 
\ /        |  Apple is a cult. 
 X   I'm really at ac.dekanfrus     |  Linux is anarchy. 
/ \  if you read it the right way.  |  Pick your poison.
Reply to
Charlie Gibbs

In this case that's because they were: you do NOT grab somebody else's product (at that time it belonged to Sun and no part of it was open source), hack it about to suit yourself without consulting anybody, least of all the originators and standardisation people, and then try to flog it to all and sundry. That's very little different from the way M$ bought what became MSDOS from its originators so they had something to flog to IBM, though they did at least buy it from its authors. I forget what, if anything, they paid Sun for Java.

You're right that it didn't belong in webpages, but not about much else. Java was essentially a clean-sheet attempt to build something better than C++, and from the outset it was designed for the 'write once, run anywhere' paradigm, which is why Java runs in the Java Virtual Machine - the language and compiled code are the same everywhere, with all hardware-specific and OS-specific stuff kept where it belongs: in the JVM.

Java is a lot more secure and crash-resistant simply because the compiler is designed to trap as many coding errors as possible before any compiled code is emitted. In addition, insecure things like null-terminated strings and commonly misused things (untyped pointers, malloc and friends) are simply not exposed to coders, and all objects are strongly typed, which gets rid of another heap of security issues. No preprocessor either.
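
A tiny illustration of the error classes that get closed off (my own example, not anything from Martin's post): the commented-out lines will not even compile, and the out-of-range index is caught at run time instead of silently corrupting memory:

public class SafetyDemo {
    public static void main(String[] args) {
        int[] buf = new int[4];

        // String s = buf;   // rejected at compile time: int[] is not a String
        // buf++;            // rejected at compile time: no pointer arithmetic on references

        try {
            buf[10] = 42;    // every index is bounds-checked at run time
        } catch (ArrayIndexOutOfBoundsException e) {
            System.out.println("caught overrun instead of corrupting memory: " + e.getMessage());
        }

        String name = "Martin";   // a length-carrying object, not a NUL-terminated buffer
        System.out.println(name.length());
    }
}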

A lot of the more recent languages, e.g. Rust, have used this approach to designing programming languages and writing compilers, so obviously the people behind Java did something right.

--
Martin    | martin at 
Gregorie  | gregorie dot org
Reply to
Martin Gregorie
[Snip]

I've many happy memories of travelling to the Netherlands for Big Ben shows with The ARM Club.

She says hi :)

---druck

Reply to
druck

Not much, if anything. Sun borrowed a page from Philips and the Compact Cassette (remember them?) that was introduced in 1963: make the licence very easy to obtain, but make one of its terms strict compliance with the specification. This made the cassette a standard which many manufacturers adopted, and it quickly became ubiquitous. Java was on the same path when Microsoft violated the licensing agreement. Sun sued, and the judge gave Microsoft 90 days to either comply with the agreement or pull Windows 98 off the market. Microsoft had no choice but to back down - but it's little wonder that they suddenly lost interest in Java.

--
/~\  Charlie Gibbs                  |  Microsoft is a dictatorship. 
\ /        |  Apple is a cult. 
 X   I'm really at ac.dekanfrus     |  Linux is anarchy. 
/ \  if you read it the right way.  |  Pick your poison.
Reply to
Charlie Gibbs

"Charlie Gibbs" wrote

| > It's my impression that people are doing it. People
| > get calls from their surveillance cameras to say someone
| > has broken into their house, for instance.
|
| And hackers will break into the surveillance cameras
| to figure out when it's safe to break into the house.
|

Indeed. It's already happened:

formatting link

| > I've also seen stories about hacked e-front door locks.
| > People are eating up the IoT, no matter how dumb. I
| > suppose it is all cloud-linked, but isn't that really
| > a privacy issue rather than a functionality issue?
|
| Perhaps, but isn't privacy important too?

To me it is. But seemingly not to most people. TVs are spying. Cars are spying. People don't know, don't see it happen, and are not aware of the ramifications. To my mind this started with Hotmail claiming co-ownership of email and Gmail claiming the right to read and hold your email. ("Don't worry. It's just for targeting ads.") People don't understand they're giving up their rights.

Reply to
Mayayana
