Help comparing two FFT approaches

No. They would be coherent on every dataset with an edge discontinuity. Random noise will average away but systematic defects caused by the implicit periodicity of the FFT can easily bite you in the backside.

There are also several conventions on how the tiling periodicity is done. I view the traditional tiling as translational.

true data : ... ? ? ? a b c d ? ? ? ...

DFT input data : ... 0 0 0 a b c d 0 0 0 ...

FFT implicit : ... a b c d a b c d a b c d ... (translation)

FFT implicit': ... a b c d d c b a a b c d ... (mirror)

or

FFT implicit": ... a b c d c b a b c d ... (mirror)

Jodrell Bank used one of the latter two for MERLIN. I forget which. It is closely related to the cosine transform that underpins JPEG.

It affects how artefacts from the edge discontinuity influence things.
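To make the three implicit extensions concrete, here is a small pure-Python sketch that just builds the tilings from a 4-sample block. The variable names are mine, chosen for illustration; real transform libraries never materialise these extended arrays.

```python
block = ["a", "b", "c", "d"]

# Translational tiling: ... a b c d a b c d ...
translation = block * 3

# Mirror with the endpoints repeated (one period is a b c d d c b a):
mirror_repeat = (block + block[::-1]) * 2

# Mirror sharing the endpoints (one period is a b c d c b):
period = block + block[-2:0:-1]
mirror_shared = period * 2

print("".join(translation))    # abcdabcdabcd
print("".join(mirror_repeat))  # abcddcbaabcddcba
print("".join(mirror_shared))  # abcdcbabcdcb
```

The two mirrored forms correspond to the even symmetries underlying the DCT-II and DCT-I respectively.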

Although some of what he wrote wrt the Fast Hartley Transform was a little misleading for non-practitioners, it may actually help the OP, depending on whether he has a real-valued time series to transform.

formatting link

I'd also vote for Bracewell's book as an introduction to FFTs & DFTs.

There are many more recent books on implementing FFTs and FFTW is now the one to beat - their classic paper is also online:

formatting link

For some of the intricate practicalities, the optimal gridding functions for using an FFT to compute a DFT are in one of the VLA research papers by Schwab (~1980). In summary the definition is here:

formatting link

The paper memo scan is still online although it isn't an easy read.

formatting link

It deals with the problem of gridding non-uniformly sampled Fourier data where the intention is to compute an ideal DFT using an FFT to do it. It is also applicable to controlling aliasing elsewhere.
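The basic gridding idea can be sketched in a few lines: each scattered sample is smeared onto nearby uniform grid cells with a small convolution kernel. Note that Schwab's optimal kernels are prolate spheroidal wave functions; the truncated Gaussian below is only a simple stand-in, and the function and parameter names are mine.

```python
import math

def grid_samples(us, vals, n, width=3, sigma=0.5):
    """Convolutional gridding: accumulate non-uniform samples (u, value)
    onto a uniform grid of n cells using a truncated Gaussian kernel.
    (A stand-in for Schwab's prolate spheroidal gridding functions.)"""
    grid = [0.0] * n
    for u, v in zip(us, vals):
        centre = int(round(u))
        for k in range(centre - width, centre + width + 1):
            if 0 <= k < n:
                grid[k] += v * math.exp(-((k - u) ** 2) / (2 * sigma ** 2))
    return grid

# two scattered samples gridded onto 16 uniform cells
g = grid_samples([3.3, 10.7], [1.0, 2.0], 16)
```

After the FFT, the image would be divided by the transform of the kernel to undo the smearing; the kernel's transform also controls how strongly aliases from outside the field are suppressed.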

It is important to remember that a multiplication in the time domain is a convolution in frequency and vice-versa. Choosing how you prepare your data for transformation is important if you care about the results.
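That duality is easy to check numerically. Here is a self-contained sketch using a naive pure-Python DFT (no FFT library assumed): the DFT of a circular convolution equals the pointwise product of the DFTs.

```python
import cmath

def dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

def circ_conv(a, b):
    """Circular convolution, the kind the DFT implicitly performs."""
    n = len(a)
    return [sum(a[m] * b[(k - m) % n] for m in range(n)) for k in range(n)]

x = [1.0, 2.0, 0.0, -1.0]
w = [0.5, 0.25, 0.0, 0.25]   # a small smoothing window

# convolution in one domain == pointwise multiplication in the other
lhs = dft(circ_conv(x, w))
rhs = [a * b for a, b in zip(dft(x), dft(w))]
assert all(abs(l - r) < 1e-9 for l, r in zip(lhs, rhs))
```

Windowing (multiplying the data by a taper) is therefore a convolution of the spectrum with the window's transform, which is exactly why the choice of data preparation matters.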

--
Regards, 
Martin Brown
Reply to
Martin Brown


Great, Bracewell is the name I'm looking for. Any wisdom on the various editions? It looks like I can get the first for ~$15, the second for ~$45 (less if I order from France.) And the current third edition is ~$100 (in paperback, yuck.)

No Kindle edition offered :^) OT.. Kindle is the only type of books my daughter reads these days. (Well at least she is reading!) But I complain to her that when I was young and a new book came out I'd have to wait a year for the paperback version that I could afford. Now when she finishes one it's click-click-click and the next in the series is downloaded, she doesn't even have to hike herself to the library or book store... kids these days.

George H.

Reply to
George Herold

Thanks for that Martin, the mirroring formats are interesting. But then do you have to double your data set, so it takes longer to do the FFT? So one period is now abcddcba (or abcdcb) rather than abcd. I admit to having no intuition for FFTs. (Bracewell may help in that regard.)

George H.

Reply to
George Herold

The second edition is sort of the classic. I have the second and third, but it's the second that I find myself looking up stuff in. The third has all the Hartley stuff, but that's a niche interest.

Cheers

Phil Hobbs

--
Dr Philip C D Hobbs 
Principal Consultant 
ElectroOptical Innovations LLC 
Optics, Electro-optics, Photonics, Analog Electronics 

160 North State Road #203 
Briarcliff Manor NY 10510 

hobbs at electrooptical dot net 
http://electrooptical.net
Reply to
Phil Hobbs

Heck no! The mirror symmetry is coded into the transform implementation in a devious way to minimise non-trivial complex multiplications.

Here is a paper on the 8 point DCT that illustrates the principles (not an easy read and unchecked but it looks about right at a glance).

formatting link

The crucial point is you no longer have sharp edge discontinuities, but you pay for it with a different set of artefacts. There are now tricks to compute a true DFT incredibly accurately for a subset range of the FFT by putting a guard band on (and discarding) the ends of the array.
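The point that the mirroring costs nothing extra can be seen from the standard identity relating the DCT-II to a double-length DFT of the mirrored data; fast implementations fold that symmetry into the butterflies rather than actually doubling the array. A pure-Python check of the equivalence (naive DFT, illustrative only):

```python
import cmath, math

def dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

x = [1.0, 3.0, -2.0, 0.5]
n = len(x)

# Direct DCT-II: X_k = sum_t x_t * cos(pi*(t+1/2)*k/n)
dct = [sum(x[t] * math.cos(math.pi * (t + 0.5) * k / n) for t in range(n))
       for k in range(n)]

# Same values via a length-2n DFT of the mirrored sequence a b c d d c b a,
# up to a half-sample phase twiddle:
y = x + x[::-1]
Y = dft(y)
via_fft = [(0.5 * cmath.exp(-1j * cmath.pi * k / (2 * n)) * Y[k]).real
           for k in range(n)]

assert all(abs(a - b) < 1e-9 for a, b in zip(dct, via_fft))
```

A fast DCT exploits this structure directly on the n real samples, so the transform length (and the work) never doubles.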

Most of this got discovered when the lowest-noise receivers came along and the dynamic range of synthetic aperture radar and radio astronomy was pushed to the limits by self-calibration; systematic data-processing faults then started to show up above the noise floor.

--
Regards, 
Martin Brown
Reply to
Martin Brown

I only reference the term 'energy' because therein lie the limits of reception.

Signals are fast enough to reside INSIDE the smallest length; actually AT LEAST 5 cycles appear within the shortest packet length. Do not expect anything lower in frequency. No signals are faster than 89% of the sampling Nyquist rate. Phase AND magnitude of the received signal actually contain the information being sought! So, MUST preserve both.

Reply to
RobertMacy

Each signal is measured for amplitude AND phase.

Reply to
RobertMacy

But 'energy' has a very well defined meaning in spectrum analysis.

If you have a known target carrier frequency then synchronous detection is always superior; the only question is by how much. There might be some advantage in oversampling a little bit more. It would be very neat if you could choose your transform length so that an exact number of cycles of the main target frequency component(s) fit into it.

The longer the time t of the chunk of data you FFT at once, the better your frequency discrimination df - only you can decide how much is enough.
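The benefit of fitting an exact number of cycles into the transform length is easy to demonstrate: a bin-aligned tone puts all its energy into one bin, while a tone between bins leaks across the whole spectrum. A small pure-Python check (naive DFT, magnitudes only; parameters chosen just for illustration):

```python
import cmath, math

def dft_mag(x):
    n = len(x)
    return [abs(sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n)
                    for t in range(n))) for k in range(n)]

n = 64
aligned = [math.sin(2 * math.pi * 4 * t / n) for t in range(n)]      # exactly 4 cycles
unaligned = [math.sin(2 * math.pi * 4.5 * t / n) for t in range(n)]  # 4.5 cycles

m_a = dft_mag(aligned)
m_u = dft_mag(unaligned)

# Total magnitude landing outside bin 4 (and its mirror image):
leak_aligned = sum(m_a[k] for k in range(n) if k not in (4, n - 4))
leak_unaligned = sum(m_u[k] for k in range(n) if k not in (4, n - 4))
assert leak_aligned < 1e-6        # all energy in the target bins
assert leak_unaligned > 1.0       # substantial spectral leakage
```

This is the same edge-discontinuity effect as before: 4.5 cycles do not tile periodically, so the implicit periodic extension has a jump at the wrap-around.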

--
Regards, 
Martin Brown
Reply to
Martin Brown

Martin, thank you for the exact URL's to those papers!

Sadly, the way I go through groups often accidentally deletes postings I WANT to read! Luckily, there is a way to regain a deleted posting, but until this morning I didn't find much of this thread [especially the valuable postings with the URLs, too!]

Reply to
RobertMacy

As in my other response: a longer FFT gives you better frequency resolution. This means that you will get information for many more frequencies. But for those frequencies that appear in both the long and the short transform, the result is exactly the same:

Here, for simplicity, compare a 4-point FFT with 2x 2-point FFTs in Octave/Matlab:

x = rand(1,4); x1 = x(1:2); x2 = x(3:4);

fft(x1)+fft(x2) ->
   2.83254              % f = 0
   0.11009              % f = fs/2

fft(x) ->
   2.83254 + 0.00000i   % f = 0
  -0.17807 - 0.24707i   % f = fs/4
   0.11009 + 0.00000i   % f = fs/2
  -0.17807 + 0.24707i   % f = -fs/4

I can't see how you could lose information for high (or any) frequencies. Perhaps you were averaging FFTs of non-consecutive samples of the signal? This could make some periodicities disappear.
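The agreement above is no coincidence: summing the two half-length transforms reproduces exactly the even-numbered bins of the full transform (the first decimation-in-frequency step), while the odd bins are the extra frequencies the longer FFT resolves. A pure-Python check, using a naive DFT so no toolbox is assumed:

```python
import cmath, random

def dft(x):
    n = len(x)
    return [sum(x[t] * cmath.exp(-2j * cmath.pi * k * t / n) for t in range(n))
            for k in range(n)]

random.seed(1)
x = [random.random() for _ in range(8)]
half = len(x) // 2
x1, x2 = x[:half], x[half:]

summed = [a + b for a, b in zip(dft(x1), dft(x2))]
full = dft(x)

# summed[k] equals the even-numbered bin full[2k] of the long transform
assert all(abs(summed[k] - full[2 * k]) < 1e-9 for k in range(half))
```

So nothing is lost at high frequencies by going longer; the long transform is a strict superset of the information in the summed short ones.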

Pere

Reply to
o pere o

My comparison was a bit longer, but came to the same conclusion.

Never thought of having any gaps in the sampling.

Don't think it's happening but don't KNOW it's not happening. Thanks for the heads up on that potential flaw.

Reply to
RobertMacy
