The value of a CS education

It's a broadening of the base of the pyramid as more and more tools are developed that require less and less education to use them. Some folks (Steve Ballmer at Microsoft, for example) see this broadening base as an opportunity: make it easier for people with less training to do more, and make money off of that. Others lament the fact that 8th graders are writing in .NET. The bottom line is that there's no stopping the trend. Folks want to solve problems, and the fact is that more people can now solve their own problems using tools they barely apprehend. It widens the door and lets one-off problems get solved where they would otherwise require high-powered types who couldn't be afforded for a one-off task.

In a way, the same thing happened in electronics. You've got designers who use Excel's Solver without really understanding some of the deeper issues they probably should.

Not sure what to do about it. It's PhD types at first. Which is great while it lasts. But then they beat down the paths and clear the underbrush on the frontiers. Which makes room for others to come in and set up camp. Eventually, it matures enough that you've got vagrants living on park benches and everyone complains about the neighborhood going to hell. Oh, well.

So we live in .NET hell, now.

Jon

Reply to
Jon Kirwan

I'm not at all against putting power in people's hands--they ought to be able to do what they want. I just don't necessarily want to use what they produce.

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs
Principal Consultant
ElectroOptical Innovations LLC
Optics, Electro-optics, Photonics, Analog Electronics

160 North State Road #203
Briarcliff Manor NY 10510
845-480-2058

hobbs at electrooptical dot net
http://electrooptical.net
Reply to
Phil Hobbs

In PowerBasic, yes. But I wouldn't do that. Pointers make lots of trouble.

Certainly.

Eight. Why would anybody want 35 dimensions?

Signed and unsigned.

Sure. And 80-bit floats. And currency.

Numeric data storage requirements and ranges:

Data Type           Size                         Decimal Range                    Binary Range
------------------  ---------------------------  -------------------------------  ----------------
Integer             16 bits (2 bytes), signed    -32,768 to 32,767                -2^15 to 2^15-1
Long-integer        32 bits (4 bytes), signed    -2,147,483,648 to 2,147,483,647  -2^31 to 2^31-1
Quad-integer        64 bits (8 bytes), signed    -9.22*10^18 to +9.22*10^18       -2^63 to 2^63-1
Byte                8 bits (1 byte), unsigned    0 to 255                         0 to 2^8-1
Word                16 bits (2 bytes), unsigned  0 to 65,535                      0 to 2^16-1
Double-word         32 bits (4 bytes), unsigned  0 to 4,294,967,295               0 to 2^32-1
Single-precision    32 bits (4 bytes)            8.43*10^-37 to 3.40*10^38
Double-precision    64 bits (8 bytes)            4.19*10^-307 to 1.79*10^308
Extended-precision  80 bits (10 bytes)           3.4*10^-4932 to 1.2*10^4932
Currency            64 bits (8 bytes)            -9.22*10^14 to +9.22*10^14
Extended-currency   64 bits (8 bytes)            -9.22*10^16 to +9.22*10^16
Variant             128 bits (16 bytes)          {data-dependent}                 {data-dependent}
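
For comparison, the integer rows map directly onto C99's exact-width types, and the floating-point rows onto float, double, and long double. A quick C sketch that prints the analogous ranges (assuming an x86 target, where long double is the 80-bit extended format; elsewhere it varies):

#include <stdio.h>
#include <stdint.h>
#include <float.h>

int main(void)
{
    /* C99 exact-width analogues of the integer rows above. */
    printf("int16_t : %d to %d\n", INT16_MIN, INT16_MAX);
    printf("int32_t : %d to %d\n", INT32_MIN, INT32_MAX);
    printf("int64_t : %lld to %lld\n",
           (long long)INT64_MIN, (long long)INT64_MAX);
    printf("uint8_t : 0 to %u\n", (unsigned)UINT8_MAX);
    printf("uint16_t: 0 to %u\n", (unsigned)UINT16_MAX);
    printf("uint32_t: 0 to %u\n", (unsigned)UINT32_MAX);

    /* float/double match the single/double rows; long double is
       the 80-bit extended format on x86, but that is
       platform-dependent. */
    printf("float      : %g to %g\n", FLT_MIN, FLT_MAX);
    printf("double     : %g to %g\n", DBL_MIN, DBL_MAX);
    printf("long double: %Lg to %Lg\n", LDBL_MIN, LDBL_MAX);
    return 0;
}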

Plus real string variables of various types, and lots of neat string functions.

I don't run Dartmouth Basic. I don't think many people do any more.

Hell no. What a mess. PRINT USING is a zillion times better.

Sure. Full set of booleans and radix things.

Sure. Ascii, binary, records, random, sequential.

Anything, including Unicode. To a serial port or Ethernet. TCP/UDP functions are native to PowerBasic. I can open an Ethernet port, shoot out some text, and close it in three lines.
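
For contrast, roughly the same open/send/close sequence in C with BSD sockets runs about twenty lines; the peer address and port here are made-up placeholders:

#include <stdio.h>
#include <string.h>
#include <unistd.h>
#include <netinet/in.h>
#include <arpa/inet.h>
#include <sys/socket.h>

int main(void)
{
    struct sockaddr_in peer = {0};      /* hypothetical peer */
    peer.sin_family = AF_INET;
    peer.sin_port   = htons(5025);      /* placeholder port */
    inet_pton(AF_INET, "192.168.1.50", &peer.sin_addr);

    int s = socket(AF_INET, SOCK_STREAM, 0);
    if (s < 0 || connect(s, (struct sockaddr *)&peer, sizeof peer) < 0) {
        perror("connect");
        return 1;
    }

    const char *msg = "hello\r\n";
    write(s, msg, strlen(msg));         /* shoot out some text */
    close(s);                           /* and close it */
    return 0;
}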

How much RAM do you have?

Can't imagine.

On my PC, I have written *useful* FOR loops that run at 30 MHz.

We had an indexed-summing problem, sort of a histogram, a while back. The app was in C, under Linux. One of my guys coded it "efficiently" in C, using pointers. Just for fun, I did it in PowerBasic under Windows, using subscripts. Mine ran 7x faster. He played with the code and compiler optimizations for a day or so and got within 30% of my Basic.
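
Neither version of the code appears in the thread, but an indexed-summing loop of the kind described looks something like this in C, in both subscript and pointer style (names and sizes are made up):

#include <stddef.h>

/* Subscript style: for each input, add its value into the bin
   selected by its index. */
void sum_subscript(double hist[256], const unsigned char *idx,
                   const double *val, size_t n)
{
    for (size_t i = 0; i < n; i++)
        hist[idx[i]] += val[i];
}

/* Pointer style: identical work, written with walking pointers. */
void sum_pointer(double hist[256], const unsigned char *idx,
                 const double *val, size_t n)
{
    for (const double *end = val + n; val < end; )
        hist[*idx++] += *val++;
}

With optimization turned on, a modern compiler usually emits much the same machine code for both forms, which fits with the gap closing to 30% once the build settings were tuned.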

John

Reply to
John Larkin

Well, that's a point.

Jon

Reply to
Jon Kirwan

Yes, but the trend is towards encapsulated power supplies so you can get the regulatory issues out of the way. That means putting the power switch *on* the wall wart...

[this just isn't practical in most applications] *Or*, designing the wall wart so that it can be remotely activated (using microwatts? of standby power while off)

The trickier problem is handling devices that need *some* power while idle -- yet putting lots of pressure on the designs to constrain their power consumption severely. (i.e., this tends to discourage COTS wall warts in a ubiquitous role)

Reply to
Don Y

Pointers are only dangerous in the hands of people who don't know how to use them. Would you disallow any electrical potential great enough to overcome skin resistance because of the possibility of someone getting a shock?

Not in BASIC. 26 functions having the names A through Z (accompanied by an equally restricted set of variable names)

Why not? Why just eight? (or, the *two* that BASIC supports)

Not in BASIC. You only have "floating point".

Again, not in BASIC.

PowerBasic isn't BASIC. You lamented BASIC's bad reputation. I'm telling you *why* BASIC has a bad reputation. It's over 40 years old!

If you're abandoning the fundamentals of BASIC and allowing *any* sort of extensions to morph "it" into whatever language you want, then why restrict yourself to PowerBASIC?

How many different processor families are supported? How big is the runtime library? How many OSes have been written in it? How big is the talent pool that I can draw on to fill an opening today? Will it be around 30-40 years (!) after its creation?

Why? Because *you* say it is?

Gee, in Limbo, I can implement a *secure* (encrypted) connection to another arbitrary process on another node in even less code!

BASIC was limited to "100 constants" in a program (101 in the above example)

Sure, and I could read bytes off a 9-track tape on a 25 MHz machine in < 7 us. And I could guarantee that while writing the code, *before* compiling it. Can you guesstimate the time a particular line of your code takes to execute? Or how many resources lie behind it?

How many orders of magnitude faster than that (25MHz) is your PC?

Great! Invest heavily in the product! I'm sure it will revolutionize computing! I guess I've just missed all the publicity from the multitudes using it for successful product development :-/

(I'm not holding my breath! I suspect even Perl or Python will have a bigger practical user base)

Reply to
Don Y

Oh, come on, Don, that's like arguing that Fortran 2008 has the same limitations as the original Fortran of 1957. You can't actually buy a compiler or interpreter for Dartmouth BASIC on any modern hardware that I know of. Do you know of one?

Cheers

Phil Hobbs

Reply to
Phil Hobbs

I agree with the above.

However the remaining s**te is snipped.

Reply to
John S

I was addressing John's lament re: the bad reputation that BASIC had. I.e., why it is "ridiculed", etc. Why language design went in a different direction. The sorts of features that were *added* to languages (to address capabilities *missing* from other languages of that generation).

If you *forget* the history of BASIC and look at, e.g., PowerBASIC as a competitor to other *contemporary* languages (even "ancient C"), I don't see it offering much to make it stand out in the crowd.

It's interesting to note how languages have fallen out of favor. This clearly has some cause: either features of the language that proved to be cumbersome or ineffective, features that were outright *absent*, or features that limited the language's applicability to other domains (how often do you see COBOL applied to scientific applications? Where has PL/1 gone? SNOBOL? Algol? Pascal?).

You have to wonder why things like C have such a long, established, *stable* history. If it has so many "problems", then why hasn't it been replaced by a modern equivalent? What is it about the language that gives it so much "staying power"? Alternatively, what is it about the orphans that has given them so *little*?

Reply to
Don Y

40! Imagine anyone being stupid enough to use a 40-year-old language!

How old do you think C is?

Why allow C++?

John

Reply to
John Larkin

Probably the best programming language in history, measured in terms of actual outcomes, is Cobol.

John

Reply to
John Larkin

I like Delphi myself; it's a pumped-up version of Pascal with the horsepower of C/C++, and it still can generate native apps, even though the current devs of the product seem to have made a mess of it with the Unicoded VCL.

Jamie

Reply to
Jamie

Define "outcomes".

Reply to
krw

Successful, bug-free applications.

Trillions of dollars processed.

John

Reply to
John Larkin

How do you divide by zero?

With that definition, obviously, since it's used almost exclusively for financial transactions.

Reply to
krw

One of the nice things about having a range of choices is that you can find one to suit you. Ain't capitalism wonderful?

COBOL came along after FORTRAN, and added nothing that a numerical analyst would want. It was designed by a subcommittee of SHARE, a very influential IBM users' group that still exists. (My brother did some COBOL programming in school, about 1970, but hated it so much that his distaste still lingers with me.)

SNOBOL was highly specialized for text processing, and only existed as an interpreted language, IIRC. (I never used it.)

PL/1 was a very good general-purpose language that failed to catch on because of repeated schedule slips during the introduction of IBM System/360--the OS folks got in over their heads, especially by promising to introduce an OS with time sharing before the performance tradeoffs of time sharing were adequately understood, and there weren't enough programmers available to get PL/1 to market on time. It was a pity.

Algol lives on--C, Ada, and Pascal are all ALGOL derivatives.

I haven't programmed in BASIC since the late 1980s, when I had an early BASIC compiler for DOS. In the mid-80s, when I was a grad student, I did rather like the HP Basic dialect for the HP 9816, because it did graphics relatively well, and especially because it made GPIB instrument control very nearly effortless. In the things I care about, it was streets ahead of any subsequent effort that I know of, including LabView.

I think that C had a stable history because the original implementers stayed involved. The Dartmouth folks weren't involved in the early microcomputer development of BASIC, so different vendors extended it in different ways. There was a brief attempt to recover from this situation, but instead of a real standards effort, it consisted of yet another variant, "Pure BASIC", which the vendors tried to set off against what they called "street BASIC", by which they apparently meant Microsoft GW BASIC and QuickBASIC. I was sort of surprised to discover that there's still a product called 'Pure Basic', but there is.

So the 'basic' reason is that the Dartmouth folks dropped the ball, whereas the Bell Labs folks didn't.

Of course the fact that C was the compiler that shipped with UNIX didn't hurt, nor did its ability to combine low level bit banging with higher level constructs.

Because my software mostly gets used by me, I use languages that I like, primarily two: C++ for compute-intensive stuff and REXX for scripting and text handling. I use C for embedded gizmos, because it's adequate, and because I've never had the time or the need to learn any of the microcomputer assembly languages.

Cheers

Phil Hobbs

Reply to
Phil Hobbs

If by "outcomes" you mean masses of people throwing up, I agree. ;)

Cheers

Phil Hobbs

Reply to
Phil Hobbs

...and then the commies come along and force C on everyone...

I remembered it as a compiled language, but apparently not. According to Wikipedia, a compiled version (SPITBOL) did exist.

PL/I (not PL/1, BTW) is a huge language. The compilers were too complicated for the time. I don't believe a full implementation was ever written for anything other than OS/360 (and offspring). Great language though.

PL/I is in that list, too. Ada has as many roots in PL/I as it does in Algol.

The last time I used BASIC was on the Tek SPS. That must have been around '82.

I just can't stand C syntax, so I don't program much anymore. What I do is in assembler (or VHDL, a child of Ada, if you count that).

Reply to
krw

Well, GCC cross-compilers exist for most micros, so you can program them in FORTRAN if you prefer. ;)

Cheers

Phil Hobbs

Reply to
Phil Hobbs
