Is it a lost cause?

Indeed. These days, once you have a keyboard, screen and mouse, the CPU is almost nothing cost-wise.

And Linux is nominally free.

--
"Socialist governments traditionally do make a financial mess. They  
always run out of other people's money. It's quite a characteristic of them" 

Margaret Thatcher
Reply to
The Natural Philosopher

And core memory is better than Williams tube memory:

formatting link

--

numerist at aquaporin4 dot com
Reply to
Charles Richmond

In 1980, the Burroughs B4955 would run ten MICR reader-sorters, each at 2500 dpm (documents per minute). The flow path between the read head and the pocket-select station was very short; if the host didn't select the pocket within 10 milliseconds, the check would miss the select station and processing would stop with a "too late to pocket select".

When the host was running the full ten sorters, it had about 2 milliseconds to process the MICR line and select the destination pocket for the document - stoppages were discouraged, as one could cost a bank hundreds of thousands in float.
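
As a quick sanity check on that 2 ms figure, here's a back-of-the-envelope sketch in Python, using only the numbers quoted above (ten sorters at 2500 dpm):

    # Ten sorters, each reading 2500 documents per minute.
    SORTERS = 10
    DPM_PER_SORTER = 2500

    docs_per_second = SORTERS * DPM_PER_SORTER / 60   # ~417 documents/second
    budget_ms = 1000 / docs_per_second                # ~2.4 ms per document

    print(f"aggregate rate: {docs_per_second:.0f} docs/sec")
    print(f"host budget per document: {budget_ms:.1f} ms")

That agrees with the "about 2 milliseconds" above.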

Reply to
Scott Lurndal

Kinda, almost, maybe. It's not the hardware but application compatibility with what's happening today and in the near future. And maintenance/upgrades.

We keep reinventing things. First came "smart terminals" to offload SOME of the mundane work from the CPU. Then X-terminals for graphics. Then "thin clients" for splitting the work between the display processor and the server. That's now being reinvented as cloud services, Software-as-a-Service, Platform-as-a-Service.

Only 10 years ago, NJIT had a few banks of Sun SunRay "thin client" terminals and lotsa Windows PCs. I wanted to use the SunRay terminals but the browser could not handle many of the 'rich media' sites I needed to use. It was not the hardware, just a lack of interest in installing and maintaining the required support environment.

Huh? I'm using a BeagleBone Black: faster CPU, more memory, and it's still too weak :-( In particular, the web browser keeps bombing. With no way to add RAM, it's just too limited to run as a "thin client" with today's expectations of a full web experience.

Back in the 90s, many brokerages:

a) used X-terminals so if anything broke, the broker just used another terminal. Nothing was local. All processing was delegated to the "back end" servers.

b) used Sun workstations. Enough CPU, RAM and disk space to run some analytics locally, and correlate feeds as each user/broker required. And wonderfully networked.

Now with the "mobile workforce" and BYOD (bring your own device), many offices only provide a monitor, keyboard, mouse and network link.

-- Jeffrey Jonas jeffj@panix(dot)com

Reply to
Jeff Jonas

Geez, that brings back images of Ronald Reagan introducing ERMA: the General Electric computer that sorted checks.

That's a poor design for the sorter. I visited a mail sorting facility with 50, or perhaps 100, output bins. The last one was the reject bin, for things that missed their bin or didn't get selected in time.

But jams and belt/roller failures were to be expected, thus the full-time operator/repair person.

-- Jeffrey Jonas jeffj@panix(dot)com

Reply to
Jeff Jonas

So there is a "one-kind" list of features which is the base of CPU design.

Ah! I'd forgotten about tests. In the old days, we had the Navy tests which kept manufacturers "honest". Are these tests available to everyone with network access?

/BAH

Reply to
jmfbahciv

TTYs. Those not hard-wired had couplers which phoned in through a PDP-8/I.

/BAH

Reply to
jmfbahciv

[emoticon wiping coffee off TTY screen]

You remind me of Cutler but he was able to work on more than embedded apps.

I disagree.

/BAH

Reply to
jmfbahciv

KEWL!!!! :-))))

/BAH

Reply to
jmfbahciv

Ah, I misread your question. We didn't have control using software tools like you do. Our procedures were our source control. Do you want me to describe them?

However, we always wrote the appropriate specs and had countless design sessions before we did the projects. There was no randomness in our development procedures and processes, contrary to what the OP wants to believe.

/BAH

Reply to
jmfbahciv

Nah, I think he's a kid but I haven't gotten to the point of handing him a Kleenex. He has potential for learning if he can get that microchip off his shoulder.

/BAH

Reply to
jmfbahciv

Thanks :-). All that work was done because there was unlimited access to the net and unlimited discussion on it. What happens when that stops?

/BAH

Reply to
jmfbahciv

You need to reexamine your assumption.

/BAH

Reply to
jmfbahciv

I understand the advantages. I like readable code so I can quickly do an overview scan of the program before delving into its guts. It's what I was used to.

/BAH

Reply to
jmfbahciv

JOBMAX was 50, IIRC. Pink scheduling would start to happen when about 30 users were hammering the system. This was 4S72 and not the Level D file system (a prequel to TOPS-10).

We bought their BASIC and used it for the first training course where kids learned how to use the computer more than how to program. Before that, FORTRAN II on the 1620 was used to train kids how to use the gear and get exposure to the fundamentals of programming.

/BAH

Reply to
jmfbahciv

Lighten up.

--
The biggest threat to humanity comes from socialism, which has utterly  
diverted our attention away from what really matters to our existential  
survival, to indulging in navel gazing and faux moral investigations  
into what the world ought to be, whilst we fail utterly to deal with  
what it actually is.
Reply to
The Natural Philosopher

Well, it was the software that would decide to stop, not the sorter. The document would hit the reject pocket, and if the site wanted to, they could continue processing instead of stopping on a too-late-to-pocket-select event.

However, it very seldom happened, since the application code that handled the interrupt ran in control state for efficiency.
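
For the curious, here's a minimal sketch of that late-select policy in Python; the names, the configuration flag, and the structure are my own invention for illustration, not the actual Burroughs control-state code:

    # Hypothetical sketch: a missed deadline sends the document to the
    # reject pocket, and a site-configurable option decides whether the
    # run keeps going or stops on too-late-to-pocket-select.
    REJECT_POCKET = 0
    CONTINUE_ON_LATE_SELECT = True   # site-configurable, per the post

    def route_document(micr_line, pick_pocket, deadline_ms, elapsed_ms):
        """Route one document; return (pocket, keep_running)."""
        if elapsed_ms > deadline_ms:
            # Too late to pocket select: fall through to the reject pocket.
            return REJECT_POCKET, CONTINUE_ON_LATE_SELECT
        return pick_pocket(micr_line), True

    # Example: document arrives 12 ms after the read head with a 10 ms budget.
    pocket, keep_running = route_document("0042", lambda m: 7, 10, 12)
    # -> pocket 0 (reject), keep_running True (this site chose to continue)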

You'd be surprised, I think, at the reliability of the hardware. Particularly given that the documents were often of variable size and had been folded, spindled and/or mutilated.

The Fed site in MSP had something like 50 pockets - rumour at the time had it that the operators had roller skates to empty the pockets before they were full (to avoid a stoppage) while feeding the input hopper.

Reply to
Scott Lurndal

In the 90s, I was semi-facetiously saying I would take a $500 milspec part and aggressively cost-reduce it by 2-3 orders of magnitude while making it more secure ... including at a panel discussion in a standing-room-only ballroom at this event:
formatting link

The objective was to be much more secure than the card association payment chip (which had a whole lot of vulnerabilities), at less than 1/10th the cost, and significantly faster (the association chip still ran something like 10 secs). Early in the century, the card association was characterized as having spent billions of dollars to prove that chips were less secure than magstripe ... old trip report to cartes2002 (gone 404 but still lives on at the wayback machine)
formatting link

Near the bottom of the trip report ... it was possible to create a "YES CARD" chip as easily as magstripe (and less secure). Past posts:

formatting link
In the 90s, I was approached by the transit industry and asked if I could also make the chip do a transaction within transit-turnstile timing limits (1/10th of a second) ... without increasing the cost or reducing integrity, while still being usable for payment transactions (so a person could use the same card for transit and payment, including both point-of-sale and e-commerce/internet). A prototype was demoed at the 1999 annual worldwide BAI retail banking show in Miami
formatting link

The issue for the transit industry is that they are subsidized with public money and under lots of pressure to get out of the (heavily subsidized) payment transaction business ... and move to industry-standard payments (but that would require the banking industry to enormously increase performance and enormously reduce cost per transaction).

The problem for the banking industry was that the enormous increase in integrity eliminated a lot of fraud, and with it the justification for something like 90% of the interchange fee on electronic payment transactions (with US banks at the time making an average of 40-60% of their bottom line from payment/interchange fees, a 90% cut in the fees would be a big hit to the bottom line).

Note, 1999 was also GLBA ... which is now better known for having added the repeal of Glass-Steagall

formatting link
enabling "too big to fail" ("too big to prosecute" and "too big to jail")
formatting link

However, the original rhetoric on the floor of Congress was that the purpose of GLBA was that if you already had a banking charter you got to keep it, but if you didn't already have one, you couldn't get one ... i.e., protect the banking industry oligarchy and keep new competition with new (more efficient) technology out of banking.

There was a patent portfolio, all assigned, including claims covering being able to transition from institution-centric cards to a "person-centric" chipcard (aka a person being able to use a single chipcard for all authentication requirements). Originally the claims were going to be packaged as over 100 patents; then some executive directed that they be packaged as 9 patents (to reduce the filing cost); then the patent office came back and said to repackage them as at least two dozen (the humongous patents' filing fee wasn't even covering the cost of reading all the claims)
formatting link

Other topic drift ... in 1999 I was asked to help prevent the coming economic mess ... this is a long-winded post about a conference for me sponsored by Atalla (ATM cash machine technology; it had been bought by Tandem, which had been bought by Compaq)

formatting link

Securitizing mortgages had been used during the S&L crisis to obfuscate fraudulent mortgages (the poster child was office buildings in Dallas/Ft. Worth that turned out to be empty lots), and I was asked to improve the integrity of mortgage supporting documents as a countermeasure. They then found that they could pay the rating agencies for triple-A ratings (when both the sellers and the rating agencies knew they weren't worth triple-A, from Oct 2008 congressional hearing testimony).

With triple-A ratings, they could sell securitized mortgages as fast as they could be made (without having to worry about borrower qualification or loan quality) ... including to large funds restricted to dealing only in "safe investments" (like large public & private pension funds, resulting in a large hit to those funds), largely responsible for being able to do over $27T during 2001-2008. It also meant that they could start doing no-down, no-documentation, liar loans (and with no documentation, there was no longer an issue of supporting-document integrity)

formatting link

--
virtualization experience starting Jan1968, online at home since Mar1970
Reply to
Anne & Lynn Wheeler

I once worked on an HP 2000 minicomputer system that was running BASIC time-sharing for 32 users. I wrote some programs for it, including a compiler that accepted a "structured" BASIC language with block structure, local variables, long variable names, etc., and output a plain HP 2000 BASIC program that would run on that system. Another user wrote a user-friendly editor to edit the program text.
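
To give a flavor of one problem such a translator has to solve, assuming HP 2000 BASIC's letter-plus-optional-digit variable names: long names from the structured source have to be mapped down to legal short names. A toy sketch of that mapping in Python (a reconstruction for illustration, not Rob's actual code):

    # Toy name-mangler: map long variable names onto HP 2000 BASIC's
    # legal names (A..Z, then A0..Z9 -- 286 names in all).
    import itertools
    import string

    _legal = itertools.chain(
        string.ascii_uppercase,
        (c + d for c in string.ascii_uppercase for d in string.digits))
    _table = {}

    def mangle(long_name):
        if long_name not in _table:
            _table[long_name] = next(_legal)  # StopIteration past 286 names
        return _table[long_name]

    print(mangle("total_interest"))  # -> A
    print(mangle("principal"))       # -> B
    print(mangle("total_interest"))  # -> A (stable mapping)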

Googling for it, I found that someone still collects programs for it, but unfortunately I discarded my box full of paper tapes last year when cleaning out the cellar.

Reply to
Rob

re:

formatting link
Is it a lost cause?

For other trivia ... I just posted some more about the chip in a thread over in comp.arch ... about giving a talk at a session in the trusted computing track at the Intel Developer's Forum

formatting link
Intel spyware chip?

--
virtualization experience starting Jan1968, online at home since Mar1970
Reply to
Anne & Lynn Wheeler
