I don't know about them, but to my knowledge ffmpeg and mplayer use a lot of asm.
As far as I know, there isn't much automatic vectorization in C compilers either, so any non-trivial use of SSE and the like needs hand-coded assembly.
OTOH, I think there was a fairly recent "survey" on slashdot about assembly. The result was that few slashdot readers use it, but then I read that as "slashdot-reading perl and php coders don't use assembly". Kind of a no-brainer...
On a sunny day (Wed, 10 Jun 2009 21:58:36 +0300) it happened Anssi Saari wrote in :
Yes, those do. And I use ffmpeg and mplayer :-) Those are just building blocks. I use libmpeg3 too....
Well, it makes your code non-portable for a start. And how much is the real gain in speed? With ever more powerful processors and 25 or 30 fps TV systems it may no longer matter, except for encoding. And for encoding, H.264 is the bottleneck ATM, and for that they are now starting to use the Nvidia API.
It can be fun programming in asm to get max speed; I have done video on a PIC in asm, for example. But if you have enough processing power, why bother?
I'm with John on spreadsheets (though not on BASIC)--they're famous for generating reasonable-looking wrong answers, and (unlike programming languages) their testing facilities are zilch. Sort of like the old Royal-McBee drum memory computers, where every instruction contained the address of the next instruction--every line had a GOTO.
Brr. Give me Mathcad or Octave for numbers, and little scripts for other sorts of data. Spreadsheets--blech.
Niche software is generally both low quality and expensive; the alternative would often be reasonable quality and unaffordable.
True, but that's not created on a commercial basis.
And a lot of the worst software is also free.
Free software is liberated from the need to turn a profit, so there's no product sabotage to preserve sales of the enhanced version, or chasing meaningless bullet points at the expense of useful functionality.
OTOH, it's also liberated from the need to cater to what users actually want (as opposed to e.g. what the developer thinks they ought to want).
On a sunny day (Wed, 10 Jun 2009 20:44:08 +0100) it happened Nobody wrote in :
Perhaps the most important thing about much open source software is that it is written by somebody actually *USING* it, created to do some function the author needed (at least in my case). So it will likely do that :-)
That is very different from a CEO saying: "Let's write an xxx package, the competition is making millions with it." And the programmer never even uses it... so it will suck.
Whether the compiler can make use of it may depend upon your algorithm.
E.g. one of Quake's key optimisations was that it performs only one perspective division for every 16 pixels. This is much faster than performing a division per pixel, but visually almost as good; you won't see the difference unless you're looking for it.
Why 16 pixels? Because that's how many pixels it can render using the integer ALU while the FPU is (concurrently) performing the division.
You could write in C and still get the optimisation (with a sufficiently good compiler), but you would need a reasonable understanding of the x86 in order to structure the code in such a way as to enable the optimisation.
The compiler won't perform approximations for you, and you can't test every possible approximation for performance.
You are making the common argument that better software has to cost more. What I see is that some people write good code and some don't.
We use PADS for pcb layout; it has no known-to-me bugs, it's fast, and it never crashes. LT Spice is great. Agilent's Appcad ditto. Irfanview and Crimson are great. Firefox and Thunderbird too. Most Microsoft and Adobe products are slow and buggy. Why is Hyperterminal still messed up? Adobe Reader is a train wreck.
Free software is also usually written by one or a few people (beasts like Linux excepted; but even Linux is better than Windows.)
Video? In ASM? Hah, just the other day I tried getting my breadboarded Z80 to produce NTSC composite. Just bit-banging out of a spare data bit or two.
No good: one instruction takes about an inch of scan, there's no time even to read the next line, and the horizontal sync is terribly unreliable due to the variable interrupt latency. With lots of prodding I made a few stable pictures, nothing interactive. Definitely needs support hardware. An AVR at 16MHz would probably do nicely though.
Tim
--
Deep Friar: a very philosophical monk.
Website: http://webpages.charter.net/dawill/tmoranwms
On a sunny day (Wed, 10 Jun 2009 16:45:32 -0500) it happened "Tim Williams" wrote in :
Hi Tim, well, the 'video' PIC runs on its internal oscillator at about 4 MHz... What it does is remove those annoying pulses that some DVD players insert; the Dutch word is 'beeldverbeteraar', in English 'picture improver'. These are the complete project files, including the asm: ftp://panteltje.com/pub/mvp-0.2.zip And this is what the thing looks like: ftp://panteltje.com/mpv-0.2-pcb.jpg
What it does is: it first finds the V sync pulse, then samples the video black level during an empty line using an HC4053, then waits until just before the point where those nasty pulses occur, and replaces that area with the sampled black level. Simple.
You may think it is illegal to remove those M * cr * v *s * on pulses, as normally those are only inserted to prevent copying, but my DVD player, a Mustek,
*always* added those pulses, even on DVDs I created myself, so I had to design this to be able to make VHS tapes of my own DVDs.
Maybe this is helpful to somebody who has the same crap DVD player (I put that one out with the garbage some time ago). There was more wrong with that player.
AIUI, our part number and ECO tracking software company has been bought out by Oracle, so some day we'll have to migrate over. It's a mess now, so I can only imagine what an adventure of Titanic proportions that'll be.
In a normal interpreted language, each statement follows the one ahead of it, unless there's an explicit control structure. (Or a named label, but we're assuming reasonable practice here--use GOTOs to escape from a deeply nested loop to a nearby point, say, but not otherwise.)
Any spreadsheet cell, on the other hand, can depend on any other spreadsheet cell, without warning and without any way of finding that out in general other than an exhaustive examination of each cell.
_Your_ spreadsheets, I'm quite sure, are sensibly structured and reasonably comprehensible in outline. But that is far from the general case, even for spreadsheets that have real money depending on them. A nontrivial spreadsheet that has had more than a couple of people working on it is a disaster to debug--far worse than spaghetti code.
Yep, tables and graphs are about it.
Well, they're widely accepted, all right, but they still stink.
Once upon a time, IBM estimated that the computer market amounted to around 20 computers in the entire world. Nowadays, everyone has one, and the software industry is so large that there simply aren't enough "good" programmers to go around. In this environment, some degree of mediocrity is inevitable.
That's because Microsoft and Adobe are trying to corner the market, which means that their products have to do absolutely everything that any similar product might possibly want to do, otherwise they leave enough of a market for a competitor to survive. This pretty much guarantees bloatware.
Free software (or even less ambitious commercial software) will tend to stick to what it's good at; or at least not bother trying to do what it's particularly unsuited for.