Fundamental C question about "if" statements

So I'm not imagining things. :-(

Farnell are currently charging 2.46 GBP at qty 1, and 1.79 GBP each at qty 10.

I did think about it being a low volume part, but on other parts with a PDIP packaging option the extra cost of the PDIP option appears to be reasonably constant over time.

Simon.

--
Simon Clubley, clubley@remove_me.eisner.decus.org-Earth.UFP 
Microsoft: Bringing you 1980s technology to a 21st century world
Reply to
Simon Clubley

Fair enough - it sounds like you make your decisions here in a balanced way. Some people insist on using only DIP parts, and therefore limit their choices drastically and unnecessarily.

Reply to
David Brown

Oh come on, it is time we stopped commenting on wanting to use a DIP processor. I have not used one since the 6809 days, i.e. since the '80s. There is no sensible reason to use one in any design.

If we must go archaic let us go valves :D.

Dimiter

------------------------------------------------------ Dimiter Popoff, TGI

formatting link

------------------------------------------------------

formatting link

Reply to
Dimiter_Popoff

That depends on your purpose. I like the ability to wiggle a fried LPC1114 out of my students' boards and insert a fresh one for a few $. And to have them put an LPC810 in a solderless breadboard and connect all the wires themselves.

Wouter

Reply to
Wouter van Ooijen

There's a large population of programmers out there who believe it's been done -- they call the language "Ada".

I've never programmed in Ada, but I've debugged it. We had a multi-million dollar contract stall because our box (programmed in C) and the prime's box (programmed in Ada) could not talk to each other. We went back and forth on it for months, until finally the prime decided that they needed to fly half a dozen people all the way across the continent to smack us stupid Oregonians until we saw the light and fixed our bug.

I ended up getting sucked into it at the last minute (in retrospect I think it was because the project manager on our side knew that, when handled right, I can act like a demented terrier who's just coming down with second stage rabies -- and mange). So there I am, in a meeting room with half a dozen software engineers from two companies, plus half a dozen sales and project management types to lend weight and credence to the whole "this is serious business" air of the thing.

Two things became apparent to me, in the order that I present them here.

One, these weren't just Ada programmers, they were (as many Ada programmers seem to be), devout members of the Ada cult. Their simple rule for localizing bugs in a project that contained Ada code and C code was that the bug had to be in the C code -- end of story. On top of that, they had a complete and completely contemptuous ignorance of C -- trying to actually show them code was roughly equivalent to a devout pagan trying to explain what Aphrodite was really about to Jimmy Swaggart in his pre-caught days.

Two, the bug had to be in the Ada code. It came about because things had devolved into a very genteel argument, with the Disciples of Ada adamantly insisting that all problems are C problems, with their management team backing them up and making veiled threats to our management team, and our management team fighting a courageous rearguard action.

While this all was going on I sort of disengaged and started reading their code, line by line. Now, the problem was that a message was getting bit-reversed. I won't get into all the details, but as I read I saw the following lines:

this_message[0..31]; that_message[1..31]; trouble_message[31..0]

So right in the middle of the discussion, which had been growing ever more heated in an ever more quiet and genteel way, I blurted "Hey! I think I found it!"

You may imagine the sorts of looks I got.

I explained the whole 0..31 vs. 31..0 business, and the Ada people _would not listen_ -- because Ada code is automatically bug-free, right? Particularly when you've got C code in the vicinity acting like fresh, extra-sticky fly paper.

I tried explaining again -- have you ever tried to hold an intelligent conversation with a rock? A pissed-off rock? I got nowhere.

So I had to resort to intellectual violence: "hey, I know that I'm just a dumb-ass C programmer, and moreover that I'm Oregon born and bred. So could you Really Smart Ada people 'splain this here feature of your ever- so-wonderful language to me, in short words?"

Then I went over it: "so this line says zero to thirty-one. And THIS line says 1 to thirty-one." Then, resisting the urge to drool a bit: "and THIS HERE line says thirty-one to 0 and DAMN but stupid little old me just can't unnerstand what it all MEANS!"

At which point their chief Ada programmer and High Priestess of Blessed Code actually LOOKED at her code, slammed her printout on the table, and stomped out of the room.

So, anyway -- Ada, because it's always bug free, and infinitely better than C in every possible way.

--
Tim Wescott 
Wescott Design Services 
Reply to
Tim Wescott

Hi Wouter

There are also any number of SMT-to-DIP adapters that can be used; cost is ~$1.

--

John Devereux
Reply to
John Devereux

I was about to write something like that; good thing I delayed. You looked up some examples, which I would not have done :-).

I remember back in the HC11 days (20+ years ago) there was a PLCC-52 socket which soldered in as through-hole on a 2.54 mm grid; I think I used it once for a one-off thingie (that may have been some 15 years ago though).

Dimiter

Reply to
Dimiter_Popoff

I know, I sell such PCBs myself :)

But the right comparison (at least for my case of students using solderless breadboards) is

1) DIP chip

2) SMD chip + PCB + pins strips + solder it together

I still strongly prefer 1), especially due to the last part (soldering it together).

Wouter

Reply to
Wouter van Ooijen

I feel called to comment on this, as I'm almost in that group of programmers -- the difference is that I would not call it the "perfect" replacement, only the "best so far".

[snip]

That attitude is of course wrong and deplorable, even if there is much statistical support (as sometimes has been discussed in this newsgroup) for the opinion that C is more error-prone than Ada, other things being equal (which they seldom are).

I wonder if the attitude you encountered, on this occasion, was only partly Ada-vs-C, and partly also prime-vs-subcontractor?

[snip]

It is not central to your tale, but the reversed order of the low and high bounds in 31..0 would not directly bit-reverse a message. The range 31..0 is a null range in Ada, so if this code is from a declaration, it would declare a null (zero-length) array; if it is from an expression that takes a slice of an array, the slice would be a null array (but in both cases Ada would use round parentheses "()", not "[]"). Of course, if the Ada coder's intent was to write 0..31, to get 32 elements into the message, the actual result from 31..0 would be wrong in some other way. Or perhaps the whole of the processing was more complex and this null array/slice really did lead to bit-reversal for some complicated reason.

This part of your posting illustrates why I often find it difficult to discuss Ada-vs-C with programmers who have used only C. Programmers are proud of their knowledge and programs; if I suggest to a C user that I believe there is a better language than C, they often seem to feel that I'm calling them stupid, for having chosen a poorer language. Do you have a suggestion for how I could avoid that misunderstanding?

I'm not sure what you want to express here. Your tale is a cautionary one, of hubris and lack of humility among some Ada programmers, but in the end it does not say anything new about Ada or C. I don't know any Ada user who claims your closing sentence as fact, even if most think that Ada is better than C in many ways. The sentence seems like an attempt to use obviously false, ironical hyperbole to belittle or dismiss the real differences between Ada and C.

I do find your tale both interesting and comical, in a sad way, but not all Ada programmers are like that, and even if some are, it does not affect the technical comparison between Ada and C.

--
Niklas Holsti 
Tidorum Ltd 
Reply to
Niklas Holsti

/My/ language won't allow bugs like that - that would be a compile-time error!

The lesson from your story is, of course, that one can write bugs in any language. Ada has features that can help write correct code, and it reduces the risk of some errors that are not uncommon in C - but a good Ada programmer and a good C programmer, sticking to a good coding standard, will produce similar quality code. (C++ has many features that can help write good code, and many features that can help write appallingly bad code - but when you know what features to use, it can give solid results.)

There is one good, clear justification for claiming that Ada leads to fewer bugs and higher quality code - there are no amateur Ada programmers. People learn and use Ada because they care about code quality and correctness. Sure, there can be corporate cultural or personal reasons for not doing a good job of it, but you can be confident that they will at least understand the importance of quality coding. In the C world, there are countless people (coders and managers) who just do not see the point of quality - they are happy with something that seems to work okay during a quick test. So there is nothing wrong with good C (or C++) development - it's the low-end that brings down the average.

Reply to
David Brown

...

The tale sounded to me like a combination of the 'Ada people' perceiving C as unsafe (a perception the Ada programmers and management bods could easily have picked up, given the press C sometimes gets), positions and tempers having become entrenched, and, as was pointed out, no one looking at the detail of the code.

It seems quite odd that there was no mention of a dump of the data as it flowed between the two domains, given that they knew that bits were being reversed.

But then again I have worked at places where someone gets an idea of what's wrong and goes rushing off to fix it only to find out later that that was not the problem at all. Once people get an idea in their heads, IMO even engineers don't always look at the details to try to prove their theories.

For example, a network which was reported to be running slowly had a switch or bridge installed to break up the traffic but it didn't help. Yet just 15 minutes of looking showed that the network was being flooded with broadcast storms; the switch/bridge made no difference whatsoever.

In another case a server admin set up a server but then got told that users could not write to their home directories. He apparently fixed it quickly but when we came to look at it some time later it turned out that he had given everyone read/write access to every directory so what should have been private, er, wasn't. :-(

James

Reply to
James Harris

It was somehow defining a bitfield, or an order of bits in a word.

They made some very disparaging comments about C that made their attitude quite clear. Some of it may have been prime vs. sub, but there was a lot of Ada-centric arrogance in the room.

Latterly, on USENET, I have run into, and even drawn, quite a few comments along the lines of "well, if you'd programmed that in Ada you wouldn't have that bug" -- which might be true: maybe you wouldn't have any bugs, or maybe you'd have other bugs, instead.

I suspect that the most vocal proponents of Ada as being not just technically better, but the be-all, end-all are not representative of the entire Ada community -- but they certainly seem to be the loudest, and hence, like right-wing "Christian" nuts vs. quiet and devout churchgoers, end up representing the entire community in the eyes of the public.

When Ada came out it was touted as being almost magically self-correcting and naturally platform independent. It was the platform independence, in fact, which was a large part of the reason that the DoD started insisting on it. Then, about a decade later, it came out that nearly all of the installed Ada code base was, in fact, highly dependent on the hardware for which it was written, and that -- astonishingly enough -- at times it seemed that the software designers had gone out of their way to make the code that they were writing only work on the hardware that their company was selling.

It was a prime example of my argument with the more simple-minded members of the "Ada is better" camp -- Ada may make it marginally _easier_ to write better code. However, if you want good, robust, portable code you have to make it happen. There is no magic language that'll make it happen -- only language-independent diligence and hard work.

--
Tim Wescott 
Wescott Design Services 
Reply to
Tim Wescott

My favorite quote about C vs. C++ is that C gives you lots of rope. C++ gives you lots of rope, and, in a few places in the STL, some pre-tied nooses.

I think you're largely right. I see very few newbie/hobbyist types seeking to learn how to program PICs using Ada, but lots that just want to learn how to code in C.

I wonder if there's a way to test good responsible C or C++ code quality vs. Ada code quality.

--
Tim Wescott 
Wescott Design Services 
Reply to
Tim Wescott

So how on Earth was there no objective mechanism for showing where the message was being reversed? Like Wireshark or an HP BERT or just an oscilloscope? Logic analyzer? Packet dumper?

POR QUE?!?

Ah, the Festus Hagin gambit. Always works. LBJ made quite the career out of it...

Wow. I have never encountered any programmer who refused to believe they had a bug. Ever. This is called Epistemic Closure, and it's not a trait we like to find in engineers.

Of course.

--
Les Cargill
Reply to
Les Cargill

It was for a MIL-STD-1553 bus, for which bus analyzers are hard to get. There may have been one used, to what effect I don't know -- that was before I was pulled into the project so that I could bark and snarl at prime contractors.

If I remember correctly we did have a card in a computer, which had code that emulated the prime's end of things -- but it was written in C, so of course it was suspect too.

--
Tim Wescott 
Wescott Design Services 
Reply to
Tim Wescott

Oh. I get really impatient with cruft like that now.

It's Manchester biphase, so nothing too fancy really. And somewhere on a circuit board, it's gonna be CMOS or TTL levels...

of course, the test setup is on the test budget, so it's not like you can just... I dunno, test your code. And God forbid you tack some bluewires onnit...

Indeed. But ... so many airline tickets, hotel bills...

Oy.

Of course. *ALL* code is suspect. All hardware, too.

--
Les Cargill
Reply to
Les Cargill

A paper was published on the subject 20 years ago (not the freshest data):

formatting link

How engineering students fare with the two languages does not map directly to real-world projects, but the difference is quite extreme:

formatting link

Greetings,

Jacob

--
"Any, sufficiently complicated, experiment is indistinguishable from magic."
Reply to
Jacob Sparre Andersen

The weird thing about C++ is how it combines features for good, safe, strongly typed programming with some incredibly incoherent nonsense to achieve apparently simple things. Just for a laugh, I was trying to comprehend the "safe bool idiom" recently - I believe I understand it roughly, but it is pretty mind-numbing stuff. (C++11 allowed conversion operators to be "explicit", making the whole thing redundant - C++11 really was a big step forward.)

Here are a couple of "shoot yourself in the foot" jokes:

Ada

If you are dumb enough to actually use this language, the United States Department of Defense will kidnap you, stand you up in front of a firing squad, and tell the soldiers, "Shoot at his feet." After correctly packaging your foot, you attempt to concurrently load the gun, pull the trigger, scream, and shoot yourself in the foot. When you try, however, you discover that your foot is of the wrong type. You scour all 156e54 pages of the manuals, looking for references to foot, leg, or toe; then you get hopelessly confused and give up. You sneak in when the boss isn't around and finally write the damn thing in C. You turn in 7,689 pages of source code to the review committee, knowing they'll never look at it, and when the program needs maintenance, you quit.

C

You shoot yourself in the foot. You shoot yourself in the foot and then nobody else can figure out what you did.

C++

You accidentally create a dozen instances of yourself and shoot them all in the foot. Providing emergency medical assistance is impossible since you can't tell which are bitwise copies and which are just pointing at others and saying, "That's me, over there."

Assembly

You try to shoot yourself in the foot only to discover that you must first invent the gun, the bullet, the trigger, and your foot. You crash the OS and overwrite the root disk. The system administrator arrives and shoots you in the foot. After a moment of contemplation, the system administrator shoots himself in the foot and then hops around the room rapidly shooting at everyone in sight. By the time you've written the gun, you are dead, and don't have to worry about shooting your feet. Alternatively, you shoot and miss, but don't notice. Using only 7 bytes of code, you blow off your entire leg in only 2 CPU clock ticks.

Python

You shoot yourself in the foot and then brag for hours about how much more elegantly you did it than if you had been using C or (God forbid) Perl. You create a gun module, a gun class, a foot module, and a foot class. After realizing you can't point the gun at the foot, you pass a reference to the gun to a foot object. After the foot is blown up, the gun object remains alive for eternity, ready to shoot all future feet that may happen to appear.

There have been various attempts at such studies. The problem is, to do it properly involves an enormous amount of money - you've got to find two teams of programmers with roughly equal proficiency and experience in one of the languages, and set them off doing independent high-quality implementations of the same task. At a minimum, you'd want perhaps 4 people on each team, and several months of work. Then you'd want separate judges to review the code quality afterwards.

Reply to
David Brown

See

formatting link

I'm particularly fond of the "const correctness" section, since it relates to the endless unresolved "casting away constness" discussions from the early 90s.

Reply to
Tom Gardner

There must still be a significant market for them however.

I can walk into the Farnell trade counter and buy these things over the counter from stock. Surely PDIP MCUs (and other devices) would not be held as in-stock inventory unless there was still a market for them.

Simon.

--
Simon Clubley, clubley@remove_me.eisner.decus.org-Earth.UFP 
Microsoft: Bringing you 1980s technology to a 21st century world
Reply to
Simon Clubley
