Larkin, Power BASIC cannot be THAT good:

:-)

formatting link

A software company with a sense of humor.

Reply to
JosephKK


Can't speak for Jan; of course I use only premium horror-resistant eyes and have an auto-changer system as well, 20 seconds downtime max. And I never look at that kind of thing other than seated decently in a chair in a familiar room.

Reply to
JosephKK


My PPoE is in its third year of chaos from having done that. Three to five more years before all the old CODASYL databases are migrated.

Reply to
JosephKK

useless in a browser.

There have been something like two or three separate adders in the address unit since the 8088/8086 days. The bloody lot of them operate in a clock or two to compute the effective address (EA).
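For reference, the sum those adders produce is just the usual x86 addressing-mode arithmetic; roughly, in C (a statement of the addressing modes, not of any particular microarchitecture):

/* EA = base + index*scale + displacement (the scaled index is 386 and
   later; on the 8086 it is just base + index + disp).  The segment base
   is then added to the EA to form the linear address. */
static unsigned long effective_address(unsigned long base,   /* base register  */
                                       unsigned long index,  /* index register */
                                       unsigned int  scale,  /* 1, 2, 4 or 8   */
                                       long          disp)   /* displacement   */
{
    return base + index * scale + disp;
}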

Reply to
JosephKK

I don't need a recap; I still maintain a couple of packages which print by feeding PostScript to lpr.

No-one uses RPN because it's user-friendly (it isn't), but because it's simple to implement. It's no surprise that RPN never took off outside of contexts where minimalism was a priority (calculators, printers, OpenBOOT, embedded systems generally).

RPN doesn't need a parser, only a tokeniser, as the format is simply a linear sequence of tokens. While [...] (array) and <<...>> (dictionary) may look like nesting constructs, they aren't. "[" and "<<" just push a mark, and "]" and ">>" are operators which pull everything from the stack back to the first mark. PostScript will happily accept "mark 1 2 3 ]" just as readily as "[1 2 3]".
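That "tokeniser only" point is easy to show in code. Here's a minimal sketch in C -- a toy four-function RPN evaluator, nothing to do with PostScript itself, but the structure is the same: split on whitespace, push numbers, apply operators. There is no grammar and nothing recursive.

#include <stdio.h>
#include <stdlib.h>
#include <string.h>

#define STACK_MAX 64

int main(void)
{
    const char *input = "3 4 + 2 *";        /* (3 + 4) * 2 */
    double stack[STACK_MAX];
    int sp = 0;
    char buf[256];

    strncpy(buf, input, sizeof buf - 1);
    buf[sizeof buf - 1] = '\0';

    /* The whole "front end": chop the input into whitespace-separated tokens. */
    for (char *tok = strtok(buf, " \t"); tok; tok = strtok(NULL, " \t")) {
        if (strchr("+-*/", tok[0]) && tok[1] == '\0') {
            if (sp < 2) { fprintf(stderr, "stack underflow\n"); return 1; }
            double b = stack[--sp], a = stack[--sp];
            switch (tok[0]) {
            case '+': stack[sp++] = a + b; break;
            case '-': stack[sp++] = a - b; break;
            case '*': stack[sp++] = a * b; break;
            case '/': stack[sp++] = a / b; break;
            }
        } else {
            if (sp >= STACK_MAX) { fprintf(stderr, "stack overflow\n"); return 1; }
            stack[sp++] = strtod(tok, NULL);    /* anything else is a number */
        }
    }
    printf("%g\n", sp ? stack[sp - 1] : 0.0);   /* prints 14 */
    return 0;
}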

Reply to
Nobody

Selective memory. There is good expensive software too. It tends to exist in niche markets where high development costs have to be spread across a small number of units.

Big company management often pay only lip service to software quality - it is all about maximising their bonuses which usually means minimum time to market and maximum immediate sales. Turnover in the sales staff at software development companies means they are seldom around when the impossible promises they made to the customer have to be delivered.

Management know full well when the first version ships that it is riddled with bugs. XL2007 is a good recent example. You can always issue updates that ameliorate the problems later. But a *new* version is it. Once customers have it, you can persuade other people to upgrade by making the new file format incompatible with previous versions.

I think the cumulative updates to XL2007 (original boxed version) now top 1GB. After all everyone that matters has broadband these days.

The big difference now is that nothing physically has to ship.

Regards, Martin Brown

Reply to
Martin Brown

They have, but the young new programmers don't often get the chance to practice what they have been taught when they go into industry.

That has been known for decades; the IBM analysis of formal reviews as a method of early detection of specification errors highlighted this back in 1981. A popularised version appears in "The Mythical Man-Month". Formal reviews saved them a factor of 3 in bug-fixing costs and reduced the defects found after shipping by a factor of 4. It still isn't standard practice except in the best places.

In the UK the problem with every large government computer project stems from the fact that the managers are innumerate civil servants. The new ID card thing should be hilarious. One database to bind them all...

The real difficulty is in persuading customers that they do not want a quick hack to give them something ASAP. Most times that is exactly what they ask for even when they don't yet know what they want it to do. It is the job of the salesman to get the order in ASAP so guess what happens...

Add in a few of the political factors, like the guys who can see that if the computer system works right they will be out of a job, and you can see how the specifications end up corrupt at the outset.

If you tried to build hardware with the sort of "specifications" some major software projects start out with, you would have a rat's nest of components miles high by the time it was delivered.

So far so good. The problem is that the variation in the position of these lines on the graph for different individuals is more than an order of magnitude. That is, the worst programmers have a defect curve that is more than 10x higher than that of the best practitioners. And there are not enough good or excellent programmers. You cannot change the availability of highly intelligent and trained staff, so you have to make the tools better.

BTW don't blame the programmers for everything - a lot of the problems in the modern software industry stem from late injected feature creep.

The newest tools like static testing and complexity analysis are all about detecting as many common human errors at compile time as possible. It is telling that the toolset for this is *only* available in the most expensive corporate version of MS development tools.

When they should be in the one sold to students!!! A problem with student projects is that they are small enough that any half decent candidate can rattle them off with or without the right approach.

If you want a safe, well-behaved curve, then something like one of Wirth's languages, Pascal or Modula-2, is about as good as it gets (for that generation of language). Java is pretty good for safety, if a bit slow, and Perl has its uses if you like powerful, dense, cryptic code. APL is even terser and has seen active service in surprising places.

Domain specific languages or second generation languages augmented by well tested libraries can be way more productive.

Mathematica is one example.

Regards, Martin Brown

Reply to
Martin Brown

Can VBA call DLLs? C can be used to write DLLs.

What sort of things are you doing in VBA that you want to do in C?

It's fairly easy to do stuff to PostgreSQL in C; C is one of the many languages available for writing stored procedures. Bad C can shoot your database in the foot.
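For what it's worth, VBA can call ordinary DLL exports through a Declare statement. A minimal sketch of the C side, assuming a 32-bit MSVC build (the file and function names here are made up):

/* addmul.c -- build as a DLL, e.g. with MSVC:  cl /LD addmul.c
   VBA expects the stdcall calling convention.  MSVC decorates __stdcall
   exports (this one becomes _add_longs@8), so either export it through a
   .def file or use an Alias clause in the VBA Declare, as below. */
__declspec(dllexport) long __stdcall add_longs(long a, long b)
{
    return a + b;
}

/* Matching VBA declaration (32-bit Office; 64-bit VBA needs PtrSafe):

   Declare Function add_longs Lib "addmul.dll" _
       Alias "_add_longs@8" (ByVal a As Long, ByVal b As Long) As Long
*/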

Reply to
Jasen Betts

Scan rate is about 16 kHz, so about 1000 clocks per line; if you use the USART or SPI for output you could probably get VGA resolution. :)
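Spelling the arithmetic out (the 16 MHz CPU clock is an assumption; it's what ~1000 clocks at a ~16 kHz line rate implies):

#include <stdio.h>

int main(void)
{
    const double f_cpu  = 16e6;   /* CPU clock in Hz (assumed)  */
    const double f_line = 16e3;   /* horizontal scan rate in Hz */

    printf("%.0f clocks per scan line\n", f_cpu / f_line);   /* ~1000 */
    return 0;
}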

Reply to
Jasen Betts

There's another interesting graph: X axis is how hard (or expensive) it is to change a product in the field, and Y axis is bug density.

The curve dives down. Serious hardware and software designs that are hard to update (like hard ASICs, auto engine control computers, the stuff inside flat-screen TVs, or military gear) are debugged pretty hard before being shipped. Medium things (FPGAs, products that are easily upgraded with a flash stick) are in between. Stuff that gets weekly updates over the web is often horrible.

I see this in my own work: hardware designs are checked exhaustively, and we go out for a batch of expensive multilayer boards (we don't prototype!) and they are usually right. Assembly programming, I'm pretty careful because the assemble-install-test loop is tedious. On-screen programming is pretty much type and ignite and see what happens.

Yes, and that makes people care less about bugs. And when a product is so complex that each bug fix spins off more bugs...

John

Reply to
John Larkin


Not sure if VBA can call DLLs. I'm still a beginner at this.

We do completeness checks on the database. Simplified, we compare what the contractor does with what we expect for the day (week, month, semi-annual period). Findings are dumped to Excel spreadsheets. Any holes mean missing data, and we send the contractor a nastygram asking them to fix it (or dock their pay).

Michael

Reply to
Michael


Funny. I would use the 86400, and put the 24*60*60 in a comment.

Then again, I don't need to write high-performance code.

Michael

Reply to
Michael


Ok, try this on for size.

Software contractor was tasked with writing completeness checks for the data, but somehow never got around to it over the years. Low priority, I guess. In exasperation, my boss asks me to look into it. I'd taken a few MS Access classes, but my degree is in Chemical Engineering. Eh, I'll give it a shot. A couple of days later, we have working MS Access/VBA code that compares the contractor's work with what we want for the day. (This is for process data: pressures, flowrates, water levels, etc. About 80 locations per day.)

Boss asks me to do the same for the chemistry data. A few days later, sure, we've got it. A bit more complex (chemistry data is stored on a LIMS system, and there's about a 1 month delay between the sample collection time and the time the lab uploads the data), but it works. Better than nothing.

Software contractor likes it, wants my code so they can make it available online. Sure, fine, whatever. Here you go.

Software contractor gives me a box: Visual Studio 2005. Oh, but our IT department doesn't want anyone doing programming except for them. That includes the engineers. Oh, and no workers have Admin rights. Including engineers. So the box goes to waste.

We have a meeting with the IT supervisors. They say, sure, no problem, they can do anything we want. Only, they do work in ColdFusion, not HTML/PHP/SQL Server. Oh, and take a number, get in line - maybe a month from now they'll get to what we want done. Oh, and write out the work spec in 5 pages or less. Dude. I can do it myself in minutes, with direct access to the database (which I have). Or send an email to our software contractor, who usually gets it done.

D'oh.

Michael

Reply to
Michael

On a sunny day (Wed, 10 Jun 2009 20:59:18 +0100) it happened Nobody wrote in :

Sorry, late reply, busy here... Anyways, I am not sure the quest for ever higher resolution graphics and ever more realistic rendering makes much sense. I will give an example from my experience.

I had this MS-DOS game called vrs-slingshot, actually a collection of games in 3D. It uses LCD shutter glasses, and needs a special connection to the parallel printer port to drive those glasses. I loved to play that game, but the DOS emulation of Win98 no longer played it, and I no longer had any PC with DOS. The graphics were, by modern standards, horrible. But it did not matter. It has some jets flying, and if you played it the 3D was so realistic that you would fall off the chair trying to shoot that other plane. Your monitor all of a sudden had the depth of an aquarium... Hard to explain. I have tried several times to get that game working, even in virtualisers, but no way.

So: low resolution graphics and speed, the rest is for the imagination to fill in. You see the Sony PS3 flopped, sort of, in spite of the fact that it has the most powerful graphics around. Who needs it for Pac-Man? ;-)

It is the same thing with HDTV, not really what we want. Same thing with a book versus a movie: I did read 'The Lord of the Rings', had my own imagination of what that world looked like, and the movie sucked, in spite of all the cool effects. It sucked for me because it did not match my imagination. TV in low resolution leaves things for the imagination; this is how the Hollywood wild-west cities are built, just the fronts -- it just looks real, as the imagination fills in what is not there. If you become too realistic it becomes boring; too many special effects and it breaks, as did the latest Star Wars...

Others may disagree, but the whole of multimedia is based on this imagination if you ask me; get too realistic, or leave no space for the imagination, and it becomes a dead end.

Reply to
Jan Panteltje

I feel the same way about old games. Doom is the greatest. Who needs accelerated video when you've got 320 x 200 x 256?

Tim

--
Deep Friar: a very philosophical monk.
Website: http://webpages.charter.net/dawill/tmoranwms
Reply to
Tim Williams

Perl is the last language you should be using if you want robustness. It actively encourages writing code which works 99% of the time; getting that last 1% requires a tenfold increase in effort.

Any task which involves reading structured text should begin with a robust parser which parses everything, not just the outermost layer. The rest of the program operates on data structured as lists, tuples, structures, objects, dictionaries, etc.

Manipulating data as structured text using string manipulation functions is the road to hell. I'm convinced this approach has created more security flaws than buffer overflows ever have.
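To make that concrete, here's a minimal sketch in C. The "key = integer" line format and all the names are invented for illustration; the point is only that the parser yields a complete record or rejects the line outright, and nothing downstream ever touches the raw text again.

#include <ctype.h>
#include <stdio.h>
#include <stdlib.h>

struct setting {
    char key[64];
    long value;
};

/* Returns 1 and fills *out if the whole line matches "key = integer",
   returns 0 otherwise.  No partial results, no in-place string surgery. */
static int parse_setting(const char *line, struct setting *out)
{
    const char *p = line;
    size_t k = 0;

    while (isspace((unsigned char)*p)) p++;
    if (!isalpha((unsigned char)*p)) return 0;
    while (isalnum((unsigned char)*p) || *p == '_') {
        if (k + 1 >= sizeof out->key) return 0;
        out->key[k++] = *p++;
    }
    out->key[k] = '\0';

    while (isspace((unsigned char)*p)) p++;
    if (*p++ != '=') return 0;
    while (isspace((unsigned char)*p)) p++;

    char *end;
    out->value = strtol(p, &end, 10);
    if (end == p) return 0;                   /* no digits at all */

    while (isspace((unsigned char)*end)) end++;
    return *end == '\0';                      /* trailing junk => reject */
}

int main(void)
{
    struct setting s;
    const char *good = "timeout = 30";
    const char *bad  = "timeout = 30 seconds, more or less";

    printf("%s -> %d\n", good, parse_setting(good, &s));   /* 1 */
    printf("%s -> %d\n", bad,  parse_setting(bad,  &s));   /* 0 */
    return 0;
}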

Sure, you *can* write correct code in Perl; it just makes doing a half-arsed job so much easier than doing it right.

Reply to
Nobody

I hope not! I define the constant, then use the label. That way, if it changes, you don't have to track down every instance to change the hard numbers.
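For the record, the two styles side by side in C (the name SECONDS_PER_DAY and the function are just an illustration):

#define SECONDS_PER_DAY (24 * 60 * 60)   /* define it once, use the label */

unsigned long uptime_days(unsigned long uptime_seconds)
{
    return uptime_seconds / SECONDS_PER_DAY;
    /* versus the literal-plus-comment style from upthread:
       return uptime_seconds / 86400;       24*60*60  */
}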

Cheers! Rich

Reply to
Rich Grise

All of which reinforces the view that the software industry really does know what it's doing. If reliability actually matters, you get reliability. If it doesn't, you don't.

The extent to which customers might want reliability only matters insofar as it affects their purchasing decisions.

That's mostly just Microsoft. When Linux vendors issue bugfix releases, they usually are just bugfixes, not version upgrades.

When I maintained Linux servers, I only bothered using a test server for upgrades. Bug fixes went straight onto the production servers, and I never encountered a regression as a result. For security fixes, leaving an unpatched server facing the internet for another hour was a bigger risk than the patch itself causing problems.

Reply to
Nobody

On a sunny day (Thu, 11 Jun 2009 11:56:44 -0500) it happened "Tim Williams" wrote in :

256 fps IS fast. I had MS flight simulator for win 98. I would fly from Europe to South America with it, in real time that takes hours... So I put it at the max speed (32x or so). I then found out that if you pushed it a bit more, it would flip a variable or something and the plane would never be able to come down again :-) You could just let go of the controls and it would stay up there forever.

I know they also did things for the Navy, America's enemies need not fear ;-) But Airbus? Do they make their own?

Reply to
Jan Panteltje

Maybe not, but there are a few boundaries where a quantitative improvement results in a qualitative improvement.

In 3D, it helps if you can manage ~24fps to obtain fluid animation. If it's too slow, you lose a lot of immersion.

Similarly, there's a world of difference between flat-shaded or Gouraud-shaded polygons and texture-mapped polygons.

This was one of Doom's main strengths: it was the first texture-mapped 3D game with fluid animation. Quake aimed to do the same thing for true 3D, without having to lock the vertical axis.

Everything since then has really just been bigger numbers (mostly just higher polygon count).

But there's a general principle here which goes beyond Quake. There are plenty of practical applications where you're dealing with continuous (analogue) data and you need results which are accurate to a given tolerance. Being able to use an approximation may result in an order-of-magnitude difference in the number of CPU cycles required for the task. Often, the accuracy/performance curve can have sharp kinks or even steps, the locations of which are determined by fairly low-level architectural details.

Reply to
Nobody
