student project

thinking about how good BASIC is.

--
That's not BASIC's fault, it's the practitioner's.
Reply to
John Fields

--- We were having some problems with our shower's drain, so we called a plumber to come fix it, which he did in about half an hour.

When I asked him how much to write the check for, he said: "Three hundred dollars."

Somewhat taken aback, I said: "That's six hundred dollars an hour; my doctor doesn't make that kind of money!"

He replied: "Neither did I when I was a doctor."

--- JF

Reply to
John Fields

So you consider debugging to be "tests"? When you fail a test, and you have to locate and fix the defect, do you not think of that as debugging?

Testing is a rotten way to find bugs.

I don't blame C, I blame the culture around C. Very few C programmers transcend it. I know a few good C programmers and many, many bad ones. Given a better language and especially some better education and standards, some of the bad ones might write decent code.

C was introduced about 1972. It was based on B, which was based on BCPL, which was based on CPL. CPL was introduced in the early 1960's.

So C is, fundamentally, a 50-year-old language that was designed in the days of paper tape and 18-bit computers with 16K of core memory. It shows. The mass of programmers are resistant to things like Pascal and Ada, because they require too much discipline.

Procedural programming is about the worst thing we do in electronic system design. The culture of programming is a serious threat to national security. Most big programming projects fail. It's a scandal.

John

Reply to
John Larkin

Doctors are mostly very social. I don't have a lot of personal experience with architects, but the couple I know are also very social. Still, a really good architect, or surgeon, could get away with being weird.

John

Reply to
John Larkin

Larkin _is_ the primary programmer in his company :-) ...Jim Thompson

--
| James E.Thompson, CTO                            |    mens     |
| Analog Innovations, Inc.                         |     et      |
| Analog/Mixed-Signal ASIC's and Discrete Systems  |    manus    |
| Phoenix, Arizona 85048    Skype: Contacts Only   |             |
| Voice:(480)460-2350  Fax: Available upon request |  Brass Rat  |
| E-mail Icon at formatting link                   |    1962     |

I'll have you know that I have never once referred to anyone here as being a member of the ignorant, hateful, ugly, mooching class.

I have always been kind, referring to them by their own chosen name... Democrats O:-)

Reply to
Jim Thompson

OTOH, to write the tests first you have to have some sort of basic specification of what the code is supposed to do, and create test data to establish that it works on the boundary conditions. There is nothing wrong with regression testing to make sure an improvement is just that.
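
For example, here is a minimal sketch of the idea in C++ (the one-line spec and the sat_add16 routine are invented for illustration): the boundary test data falls straight out of the specification, and rerunning the same tests after every change is the regression check.

// Spec (hypothetical): sat_add16() adds two 16-bit signed values and
// saturates at the limits instead of wrapping around.
#include <cassert>
#include <cstdint>

int16_t sat_add16(int16_t a, int16_t b)
{
    int32_t sum = int32_t(a) + int32_t(b);  // widen so the true sum fits
    if (sum > INT16_MAX) return INT16_MAX;  // clamp positive overflow
    if (sum < INT16_MIN) return INT16_MIN;  // clamp negative overflow
    return int16_t(sum);
}

int main()
{
    // Tests written first, from the spec's boundary conditions:
    assert(sat_add16(0, 0) == 0);
    assert(sat_add16(INT16_MAX, 1) == INT16_MAX);  // must saturate, not wrap
    assert(sat_add16(INT16_MIN, -1) == INT16_MIN);
    assert(sat_add16(100, -250) == -150);          // ordinary interior case
    return 0;  // rerun after every "improvement": that's the regression test
}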

To be able to inspect code you also need to know what it is meant to do!

I don't like C particularly, but I think you are wrong about the root cause. I have seen bad code in Algol, Modula, Basic and any other language you care to mention. Some of the modern website Javascript OO stuff is appalling.

I think industry made a wrong turn by going for weakly typed C languages rather than the robust, defensive, and more easily statically tested strongly typed modules of the Algol/Ada/Modula/Oberon family. But at the time it was understandable: computers were an expensive luxury, and CPU time and core space were hard fought for.

Imagine if every electronic connector no matter what its function or power rating had 32 pins and would fit in every other socket. That is a rough analogy of the problem facing software interfacing without strong typing. The compiler never stands a chance of spotting common human errors at compile time.

Things got decidedly worse when Windows came along and everything to do with the GUI was passed as a pointer to a god-alone-knows-what object.
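
To illustrate with a hedged little C++ sketch (all the names here are invented): once the parameter is a void*, every plug fits every socket, and the compiler can no longer referee.

#include <cstdio>

struct Gui   { int width; };
struct Motor { double rpm; };

// The 32-pin connector: everything arrives as an untyped pointer.
void on_message(int msg, void* param)
{
    if (msg == 1) {
        Gui* g = static_cast<Gui*>(param);  // we can only *hope* it's a Gui
        std::printf("width = %d\n", g->width);
    }
}

int main()
{
    Motor m = { 3000.0 };
    on_message(1, &m);  // wrong object, but it fits the socket anyway:
                        // compiles cleanly, misbehaves at runtime
    return 0;
}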

And also Algol dates back to 1958.

Incidentally losing BCPL's tagged $) was a mistake in the C spec.

I don't think that was it at all. Several major computer science groups had active research into Wirth-family languages - it was industry and commercial programming that went for C in the end, not academia.

Then academia was pressurised to supply semi-trained C coders.

Big programming projects typically fail because there is a fluid (or no) written specification at the outset, plus S&M-induced feature creep. Couple that with the technical side's wish to try out the latest and greatest new magic silver bullet and you have a recipe for disaster.

Regards, Martin Brown

Reply to
Martin Brown

I write the manual first. Then I design the hardware. Then I write the code. I consider anything past writing the code (what you call "testing") to be debugging. My "final test" is done by our test department, when they build the production test sets and write the production test/calibration code for the test stands. They had to do that anyhow, but if they turn up a problem, or request some improvement to facilitate test or calibration, I consider that additional work to be debug time too, because I should have thought of it when I was coding.

Different culture.

Yup, disaster. One day, and I hope soon, some university or company will introduce a new programming discipline that actually works and scales and can be taught to reasonably bright kids. It will be a revolution.

It's shocking that no CS department has done this. We had a houseguest, the dean of CS at a big-name university, and I tried to talk to her about this. I got, literally, contempt that I would bother her about anything this trivial. "We don't program," quoth she, not quite suppressing a sneer as she drank my wine.

John

Reply to
John Larkin

I agree with you, but I'd also point out that probably the majority of programming jobs aren't "larger projects" in the first place. Go take a look at, e.g., Larkin's products; say, this one:

formatting link
I think he did all of its programming himself, and looking at the feature set, it strikes me as entirely reasonable for one guy to do all that. Yet if John's box there had been designed by the likes of, say, HP, they probably would have had a half-dozen guys assigned to programming it, at which point I'd have to concur with your viewpoint that you tend to need a lot more analysis and testing tools to achieve the same level of quality... but of course a place like HP can readily afford the enterprise-edition tools too.

My point here is that I think one reason software projects have difficulty is that management forces them into the "larger project" mold *without* supplying appropriate resources... even when, in many cases, the project could have been done by one talented guy and been both well-documented and bug-free, based on the sort of straightforward (and cheap) methods that John is advocating (e.g., very deliberately read through your own darned code to check for bugs!)

Note also that the really talented guys will think nothing of taking complete responsibility for a software project, whereas for the lesser programmers there's a certain "safety in numbers" in that they can help each other out... as well as not look that bad if everyone in the group tends to write mediocre code! (Although setting the "acceptable level of code quality" is a function of management -- but far too many managers don't do this effectively...)

---Joel

Reply to
Joel Koltner

The final truth is always in the code; that is what decides the plot. Comments can only help to explain.

My point being that if you need to comment every line, you are either writing lines that are far too complicated, or redundantly reiterating the same obvious operation in a different language.

-Lasse

Reply to
langwadt

Disagree:

  1. Comments tell what is going on, and why. There's no point in explaining the instruction set.

  2. The very act of writing clear, correct comments improves the code, resulting in a considerable improvement in productivity and far fewer bugs. Commenting is like an instant code review; it makes the programmer slow down and think about what's going on.

The fastest way to produce good code is to slow down and think.

John

a random snippet:

; HERE WE HANDLE THE FAIRLY CONFUSING PORT D BIST MUX MESS.
;
; RAM VARIABLE 'BMUX' CONTROLS THE BIST MUX AND THE CMRR SHORT SSR (TPU15)
; BMUX LOOKS LIKE...
;
; 15 14 13 12 11 10 09 08 07 06 05 04 03 02 01 00
; SH                   BL        L  L  L  H  H  H
;
; WHERE SH ASKS FOR THE SHORTING SSR (TPU15) TO BE ON
;       BL ASKS FOR THE CAL BUS TO BE GROUNDED ON ALTERNATE IRQ'S
;       L BITS DRIVE THE CAL- TEST MUX
;       H BITS DRIVE THE CAL+ TEST MUX

; PORT D LOOKS LIKE THE LOW BYTE OF BMUX, BUT BITS 7 AND 6 DRIVE THE CAL BUS
; LATCHING RELAY. THAT RELAY IS CONTROLLED BY 'MPAT'.

; FIRST, CHECK FOR BLINKING MODE. WE BLINK THE CAL BUS ACCORDING TO THE LSB OF
; 'IRQ', WHICH ALLOWS A TEST PROGRAM TO SYNCHRONOUSLY DETECT THE CMRR SIGNAL.

TVDIP:  MOVE.W  BMUX.W, D1           ; READ BIST MUX CONTROL WORD
        BTST.L  #8, D1               ; IS BLINK BIT SET?
        BEQ.S   TUBBY                ; NO
        BTST.B  #0, IRQ+1.W          ; USE IRQ COUNTER LSB AS ALTERNATOR GADGET
        BEQ.S   TUBBY                ; OFF? DO NORMAL OPS
        ORI.W   #{HI*7}+{LO*7}, D1   ; ON? SET BOTH MUX SELECTIONS TO 7 = GROUND.

; MANAGE THE CAL BUS SHORTING SSR, VIA TPU15

TUBBY:  MOVE.L  #CBOPEN, D0          ; = TPU CODE FOR OPEN SSR
        TST.W   D1                   ; CHECK 'SH' (SIGN) BIT
        BPL.S   TUMMY                ; IF IT'S SET, CHANGE TPU
        MOVE.L  #CBZERO, D0          ; COMMAND

TUMMY:  MOVE.L  D0, HSRR0.W          ; FIRE TPU TO DRIVE SSR

; NOW SEE IF ANY LATCHING RELAY DRIVE IS REQUIRED

        MOVE.W  MPAT.W, D0           ; GET MODE WORD:
        ANDI.W  #B0, D0              ; B0 SET WILL GATE CAL BUS OUT D9
        CMP.W   MOLD.W, D0           ; COMPARE TO OLD STATE
        BEQ.S   TVNOC                ; SKIP IF NO CHANGE

        MOVE.W  D0, MOLD.W           ; UPDATE "OLD" STATE
        BNE.S   TVMON                ; NONZERO? THAT SETS RELAY ON

TVMOF:  ORI.W   #B7, D1              ; RELAY OFF == CAL BUS TO INTERNAL BIST
        BRA.S   TVNOC                ; BMX6 LOW PUTS + ON COIL PIN 1

TVMON:  ORI.W   #B7+B6, D1           ; RELAY ON == CAL BUS TO D9
                                     ; BMX6 HIGH PUTS + ON COIL PIN 8

TVNOC:  MOVE.B  D1, PORTD.W          ; LOAD PORT WITH BIST MUX + RELAY BITS

; DISPATCH TO ONE OF SEVERAL TASKS, EACH OF WHICH IS RUN AT ABOUT 6 HZ...

TVTASK: ADDQ.W  #1, IRQ.W            ; COUNT OFF ONE SYSTEM TICK
        MOVE.W  IRQ.W, VIRQ.W        ; WRITE IRQ COUNTER TO VME HERE! ALLOWS CMRR TRIM TRICKS.
        MOVE.W  IRQ.W, D0            ; NAB 200 HZ COUNTER
        MOVE.W  D0, D1               ; + COPY

......

Reply to
John Larkin

I've tried cleaning up (using gimp) some of Jan's drawings. The problem is that the contrast due to uneven illumination is greater than the contrast between pencil and paper. Waste of time.

--
"For a successful technology, reality must take precedence 
over public relations, for nature cannot be fooled."
                                       (Richard Feynman)
Reply to
Fred Abse

I did about half of those products, and expect to do less in the future. But yes, it's fairly easy for one person to manage 6000 lines of code. It's not so easy for someone else, or even the author, to pick it up three years later and understand it.

But as more people get involved, things often turn to chaos. If the army can get people to work together, and McDonalds can do it, why can't software development managers?

John

Reply to
John Larkin

John Larkin expounded in news: snipped-for-privacy@4ax.com:

Even good programmers make mistakes. The problem is that some bugs are extremely difficult to find. That one byte that was stepped on, or the one or two memory leaks, can be very difficult to diagnose. Prevention is the most cost-effective cure.

Ada provides the tools to do that job better, which is why it is used in life 'n death applications. For example, with a little compiler switch, you can enable _all_ array access checks. If you overstep your bounds, it "bails" (raises an exception).

In C, the only mechanism is the assert macro, which must be _manually_ coded. In C++ you can program it into the object's own accessors, but again this misses the mark. The compiler is the best tool for this because it doesn't miss anything. No checks are omitted.

Additionally, in Ada you know the bounds of the array passed into subprograms. In C/C++, you get a pointer and an agreed-upon length, or you trust the programmer-supplied argument(s) to tell you the length. This is ripe for abuse/misuse.
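
A small sketch of that situation (hypothetical function; the behaviour is the same in C and C++): the callee's only knowledge of the bounds is the length argument it is handed, and the only guard is one you remember to write yourself.

#include <cassert>
#include <cstdio>

double average(const double* data, unsigned len)
{
    assert(data != 0 && len > 0);       // the manual check: easy to forget
    double sum = 0.0;
    for (unsigned i = 0; i < len; ++i)  // trusts len completely; pass the
        sum += data[i];                 // wrong value and this reads garbage
    return sum / len;
}

int main()
{
    double v[4] = { 1.0, 2.0, 3.0, 4.0 };
    std::printf("%f\n", average(v, 4));  // fine: prints 2.5
    // average(v, 8);  // also compiles without complaint -- and overruns v
    return 0;
}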

Anyway, this has been heavily "documented" in alt.lang.ada, which is where that discussion should go. ;-)

I've programmed in B on a Honeywell Level 66.

In the words of Marlon Brando (Apocalypse Now):

"The Horror.. the Horror!".

Pascal was ok for things in the "small". Ada (borrowing somewhat from Pascal) developed the "package" solution to allow development of "huge" systems. There's more, but that stands out.

Ada tends to force "design" and planning. If you don't plan ahead, your "packages" won't have the right visibility between packages and other components etc. I've heard Ada practitioners say that they've seen some large projects in Ada (poorly designed) that never successfully compiled (visibility conflicts). It's probably best to leave them that way.

Programmers don't like compilers telling them they can't do things. OTOH, C/C++ allows you to shoot your own foot. It's quite happy to oblige.

Students get to me: they froth at the mouth and bounce around in the starting gate just getting something to compile. They can't wait to run it. It doesn't seem to matter whether the program is correct or not. They want to work that out in debugging sessions. That approach is soooo wrong.

I don't think "procedural programming" is to blame. Not everything is suitable for OO application. Like anything else, it is a matter of applying the right tools for the job. Design first- implementation follows.

Warren

Reply to
Warren

That was kinda the point I was trying to make: the instruction set should not be commented; the reasons and thinking behind the code should be.

.....

The first comment says it all: complex and messy, so of course it should have lots of comments. But a lot of code is nothing like that.

Though I would say that a lot of those test-and-branch sequences could be grouped under a single comment. Longer variable/constant/label names would help too.

-Lasse

Reply to
langwadt

Martin Brown expounded in news:1WwFo.29301$ snipped-for-privacy@newsfe20.iad: ..

Not to mention that almost every integer becomes a WORD or DWORD in parameters.

In Ada you can differentiate based upon its use/purpose:

procedure T is
   type Amps is new Float;
   type Volts is new Float;
   type Watts is new Float;

   Current : Amps;
   Voltage : Volts;
   Power   : Watts;
begin
   Voltage := 12.0;
   Current := 0.001;
   Power   := Watts(Voltage) * Watts(Current);
end T;

If I try to mix types in an expression, the compiler treats that as an error. This is why the Power assignment explicitly coerces Voltage and Current into Watts, to perform that power calculation.

Disclaimer: this was not meant to be the best Ada code for this example-- merely for illustrative purposes.

In C/C++ you can use typedefs etc., but the compiler only cares about the "implementation" of that type. If they all turn out to be "floats" (for example), then the compiler permits you to freely mix values in expressions without warning.

C would glibly permit:

Voltage = Power + Current; /* Bad, bad, bad... */
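
Here is that example made compilable (a sketch; a C++ compiler accepts it just as happily as a C one would):

#include <cstdio>

typedef float Amps;   // aliases, not new types: the compiler
typedef float Volts;  // sees three names for plain float
typedef float Watts;

int main()
{
    Volts voltage = 12.0f;
    Amps  current = 0.001f;
    Watts power   = voltage * current;  // fine, and means something
    voltage = power + current;          // nonsense, but compiles silently
    std::printf("%f\n", voltage);
    return 0;
}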

Warren

Reply to
Warren

Probably the best programming language, in terms of outcome, is COBOL.

John

Reply to
John Larkin

It's nowhere near this dire; utilities for checking array access, heap corruption, etc. in C programs have been around for quite a while. E.g.,

formatting link
... there are numerous such programs available, some commercial, some free.

So just make it a policy that you don't allow "raw" pointers in the code; C++ has provided "safe" pointers and arrays for a few decades now.
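
A minimal sketch of that policy in practice (assuming nothing beyond the standard library): std::vector's at() checks every access, so an overrun raises an exception instead of silently scribbling on memory.

#include <iostream>
#include <stdexcept>
#include <vector>

int main()
{
    std::vector<int> v(4, 0);  // four elements, all zero
    try {
        v.at(7) = 42;          // out of bounds: throws, no silent corruption
    } catch (const std::out_of_range& e) {
        std::cerr << "caught: " << e.what() << '\n';
    }
    return 0;
}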

Arguably C/C++ *provides* more ways to shoot yourself in the foot than something like Ada does, but I really don't think that's a problem with the language itself.

Granted, if we're talking jetliners or nuclear power plants, it's a compelling argument that removing all the guns from the room is a reasonable tradeoff in an attempt to ensure safety.

That's for sure!

---Joel

Reply to
Joel Koltner

Makes sense. It wouldn't become an Algorithm until it starts looping in a predictable, on-time manner... something you can tap your toes to...

mike

Reply to
m II

Get the PBCC (Console Compiler) version. It makes great Windows programs with the simplicity and style of old DOS programs (locate, print, input, like that.) The PB/Windows version tangles you up in all the Windows GUI crap. The old DOS version was nice, but doesn't play very well under Windows, although most of my old programs still work.

PBCC is amazing. If you want to do Ethernet, it just takes a few lines. The graphics stuff is great. And it has all the "modern" programming constructs you'd ever want.

' EPORT : OPEN TCP/IP PORT AS #8

EPORT:

TCP CLOSE #8                          ' NUKE ANY OPEN ETHERNET COMM
CLOSE #1                              ' OR SERIAL PORT

CPORT% = 0

LOCATE 18, 1

PRINT " Default IP address = "; IP1$
PRINT
PRINT " Enter IP address, D for Default, P for previous : ";

LINE INPUT IP$
IF UCASE$(IP$) = "D" THEN IP$ = IP1$
IF UCASE$(IP$) = "P" THEN IP$ = IPO$

IF IP$ = "" THEN GOTO TOP

ERRCLEAR

IPO$ = IP$                            ' REMEMBER IP ADD AS "OLD"

TCP OPEN PORT 2000 AT IP$ AS #8 TIMEOUT 50
CPORT% = 8

IF ERR THEN
   LOCATE 22, 10: PRINT "TCP OPEN ERROR"
   SLEEP 2000
   CPORT% = 0
END IF

TCP RECV #8, 2000, B$                 ' FLUSH THE TCP BUFFER!

GOTO TOP

I could have done the open inside a TRY...CATCH construct, even slicker.

John

Reply to
John Larkin

Jan Panteltje expounded in news:ic49u1$isn$ snipped-for-privacy@news.albasani.net:

..

Hang out in comp.lang.ada for a while. ;-)

It's still easy to get burnt with small changes. Apollo 13 comes immediately to mind. The problem with any change, is the range of assumptions that go with it.

If you've been forced to use a particular tool (asm), then no other tool is going to help. If you have "choice", then there might be "hope". Ada can make calls to modules in other languages. Unix is mainly a C world, so I do this regularly.

This is normal practice in any toolchain. But even libraries can be provoked to evil if used in a "novel way".

Ha!! GNAT (for example) is just an Ada front end to the gcc compiler. It routinely generates code just as good as C's, and sometimes better. On the whole, the result is comparable, depending upon the task at hand (each language can showcase different strengths).

I just did an Ada project where I used GNAT to build a MIDI controller on an ATmega168. I regularly compiled into less than 4 KB, and it used 1 KB of SRAM. If you can do it in C, then there's no reason Ada can't also do the same within +/-10%.

I do understand, however, that the choice of tools for the PIC is not as happy as on the open-source front.

Thankfully, I don't need to perform this level of verification. But I know that logic engines (Prolog, IIRC) are used to process Ada code for correctness, testing the logic for faults or undefined areas. SPARK is a subset of Ada; I think some of that information is added to the source code in the form of comments to assist in the proofs.

It can be done-- ask in the comp.lang.ada forum if you want to know more. I don't know enough about this practice to give you much. I do know they use a package named "ASIS", to allow Ada programs to inspect Ada source code. This is combined (I think) with Prolog to prove correctness using logic.

This is necessary in flight or air traffic control systems. Hopefully this is also the kind of thing that happens in medical control systems.

But everyone makes mistakes. Ada tries to help prevent them and catch them before you "test". Some things can only be caught at runtime, so there are "exceptions" for that as well.

Ada is not a silver bullet. It's not designed to find hardware faults.

Assumptions are killers. I understand that those using SPARK also use a rigorous set of code reviews, with multiple eyeballs. This helps to catch those faulty assumptions.

Warren

Reply to
Warren
