Can anybody give information or link about currently available most efficient grammar based compression algorithm?
Thanks Nasif
By "efficient," do you mean compression ratio, or computation time, or memory consumption, or ...?
And how adaptive does it have to be to the incoming data?
-- Many thanks, Don Lancaster voice phone: (928)428-4073 Synergetics 3860 West First Street Box 809 Thatcher, AZ 85552 rss: http://www.tinaja.com/whtnu.xml email: don@tinaja.com Please visit my GURU's LAIR web site at http://www.tinaja.com
I'm too busy trying to download recovery blocks for some pornography using yEnc, PAR and RAR files. This is my current mission in a search for European girlies in plastic clothing sitting on each other's faces.
This may not sound like an answer to your problem, but the last time I worked with a couple of Indians and Pakistanis they were in my face, at each other's throats, behind each other's backs.
Last time I checked, Bangladesh took serious flooding on Tuesday.
That sort of makes me think that having a grammar algorithm is not the main problem, unless you need to take the piss out of someone later by getting it wrong and taking a bung from your local political person.
So, there you go.
Yenc is meant to use the transport medium efficiently.
Par gives you data error recovery so the message will get through, unless you don't have enough par files.
Rar is... I'm not sure about that one, but us p*rn downloaders use it because p*rn uploaders use it.
The first two are free stuff with explanations available and, if you ask nicely they will tell you to RTFM.
Rar is free to use if you ignore the warning when you use it but I am sure they are Swedish and will tell you how to do it anyway.
Anyhow, perhaps you can leave out all the e's and punctuation marks.
So, there you go.
Problem Solved.
DNA
What's wrong with Huffman trees?
Neighbors keep picking all of the grits off them.
It is not clear to me whether you want an algorithm to compress grammar, or want to use a grammar-based algorithm to compress something else.
Somebody forgot to water its roots?
AFAIK, grammar based compression mainly makes sense for compressing material which can be described by a simple grammar. This applies to programming languages for instance. It's based on compressing the output of a parser for that language.
It is not intended as a general purpose compression algorithm. In many cases, you can only restore the *meaning* of the original, not the structure.
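For a flavour of what grammar-based compression does, here is a toy Re-Pair-style sketch (purely illustrative, not GTAC or any production algorithm): it repeatedly replaces the most frequent adjacent pair of symbols with a fresh nonterminal, so repetitive input collapses into a short sequence plus a small grammar.

```python
def repair_compress(s):
    """Toy Re-Pair-style grammar compression: replace the most frequent
    adjacent symbol pair with a new nonterminal until no pair repeats.
    (Naive pair counting overcounts overlaps like 'aaa'; fine for a demo.)"""
    seq, rules, fresh = list(s), {}, 0
    while True:
        counts = {}
        for pair in zip(seq, seq[1:]):
            counts[pair] = counts.get(pair, 0) + 1
        if not counts:
            break
        pair, freq = max(counts.items(), key=lambda kv: kv[1])
        if freq < 2:
            break
        nt = f"R{fresh}"          # fresh nonterminal name
        fresh += 1
        rules[nt] = pair
        out, i = [], 0            # rewrite the sequence left to right
        while i < len(seq):
            if i + 1 < len(seq) and (seq[i], seq[i + 1]) == pair:
                out.append(nt)
                i += 2
            else:
                out.append(seq[i])
                i += 1
        seq = out
    return seq, rules

def expand(sym, rules):
    """Decompress one symbol by recursively expanding the grammar rules."""
    if sym in rules:
        return "".join(expand(x, rules) for x in rules[sym])
    return sym
```

On "abababab" this produces a two-symbol sequence with rules R0 -> ab and R1 -> R0 R0; real implementations (Sequitur, Re-Pair) use far more careful counting and data structures, but the idea is the same.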
Most efficient depends on what you want to do and what your input grammar is. Example: a historic computer, the Sinclair ZX Spectrum, took twice as long to store a '1' on tape as a '0'. An efficient compression algorithm for that computer was one which output mostly '0's.
Kind regards,
Iwo
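The ZX Spectrum point above can be made concrete: when a '1' takes twice as long to record as a '0', the quantity to minimise is total tape time, not bit count (the unit costs below are made up for illustration):

```python
def tape_cost(bits, cost0=1, cost1=2):
    """Tape time for a bit string where writing a '1' takes twice as
    long as a '0' (illustrative unit costs, not real ZX Spectrum timings)."""
    return sum(cost1 if b == "1" else cost0 for b in bits)
```

Two encodings of the same length can thus have very different tape times, so a cost-aware coder would hand zero-heavy codewords to the frequent symbols, even at the price of slightly longer codewords.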
Sorry for my vague question. Here, efficient means compression ratio. To state my problem clearly: "What is the most efficient grammar-based DNA compression algorithm?" I think GTAC is one, done as part of a PhD thesis at the University of Waterloo. Has any modification been made to GTAC?
Thanks Nasif
No, living organisms still use DNA Version 1.0, which uses C (cytosine), A (adenine), T (thymine) and G (guanine) as bases. GTAC, if you will.
You may however be interested in RNA, which uses U (uracil) instead of T.
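Since the alphabet is just those four bases, a useful baseline for the DNA question is plain 2-bits-per-base packing: a 4:1 ratio over 8-bit ASCII, which any grammar-based DNA compressor such as GTAC has to beat to be worthwhile. A minimal sketch:

```python
def pack_dna(seq):
    """Pack a DNA string into 2 bits per base; the last byte is
    zero-padded on the right when the length isn't a multiple of 4."""
    code = {"A": 0, "C": 1, "G": 2, "T": 3}
    out, acc, nbits = bytearray(), 0, 0
    for base in seq:
        acc = (acc << 2) | code[base]
        nbits += 2
        if nbits == 8:            # flush a full byte
            out.append(acc)
            acc = nbits = 0
    if nbits:                     # flush the padded remainder
        out.append(acc << (8 - nbits))
    return bytes(out)
```

For example, "ACGT" packs into the single byte 0b00011011; a real DNA compressor would also record the sequence length so the padding can be undone.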
As far as compression... dunno about algorithms, but I've been pretty happy with 7zip
Michael
It depends on what you want to compress.
The guys in comp.compression go on-and-on-and-on about little else.
Bye. Jasen