No. They would be coherent on every dataset with an edge discontinuity. Random noise will average away, but systematic defects caused by the implicit periodicity of the FFT can easily bite you in the backside.
There are also several conventions for how the tiling periodicity is done. I view the traditional tiling as translational:
true data     : ... ? ? ? a b c d ? ? ? ...
DFT input data: ... 0 0 0 a b c d 0 0 0 ...
FFT implicit  : ... a b c d a b c d a b c d ... (translation)
FFT implicit' : ... a b c d d c b a a b c d ... (mirror, end samples repeated)
or
FFT implicit" : ... a b c d c b a b c d ...     (mirror, end samples not repeated)
Jodrell Bank used one of the latter two for MERLIN; I forget which. It is closely related to the cosine transform that underpins JPEG.
The choice affects how artefacts from the edge discontinuity propagate into the result.
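A quick numpy sketch (with a hypothetical 4-sample block standing in for a b c d) that makes the difference visible: the translational tiling carries a jump at every block boundary unless d happens to join smoothly back onto a, while both mirrored extensions are continuous at the edges, which is exactly why they connect to the cosine transform.

import numpy as np

block = np.array([1.0, 2.0, 4.0, 8.0])   # stand-in for a b c d

# Translational tiling: what a plain FFT implicitly assumes.
translation = np.tile(block, 3)

# Mirror with the end samples repeated (... a b c d d c b a ...),
# the symmetry behind the DCT-II.
mirror_repeat = np.tile(np.concatenate([block, block[::-1]]), 2)

# Mirror without repeating the end samples (... a b c d c b a ...),
# the symmetry behind the DCT-I.
mirror_norepeat = np.tile(np.concatenate([block, block[-2:0:-1]]), 2)

for name, seq in [("translation", translation),
                  ("mirror, ends repeated", mirror_repeat),
                  ("mirror, ends not repeated", mirror_norepeat)]:
    # The largest sample-to-sample jump is a crude measure of the
    # edge discontinuity the transform has to represent.
    print(f"{name:26s} max jump = {np.max(np.abs(np.diff(seq))):.0f}")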
Although some of what he wrote with regard to the Fast Hartley Transform was a little misleading for non-practitioners, it may actually help the OP, depending on whether he has a real-valued time series to transform.
I'd also vote for Bracewell's book as an introduction to FFTs & DFTs.
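For what it's worth, if the series is real-valued the discrete Hartley transform is trivial to obtain from an ordinary FFT, so no special machinery is needed to experiment with it. A minimal sketch, assuming numpy:

import numpy as np

x = np.random.default_rng(0).standard_normal(16)   # real-valued time series

# The DHT uses the cas kernel, cas(t) = cos(t) + sin(t), so for real x it
# is just the real part of the DFT minus the imaginary part.
X = np.fft.fft(x)
hartley = X.real - X.imag

# Direct evaluation of the definition, as a cross-check.
n = np.arange(len(x))
angles = 2 * np.pi * np.outer(n, n) / len(x)
hartley_direct = (np.cos(angles) + np.sin(angles)) @ x

print(np.allclose(hartley, hartley_direct))   # True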
There are many more recent books on implementing FFTs, and FFTW is now the implementation to beat - their classic paper is also online:
For some of the more intricate practicalities, the optimal gridding functions for using an FFT to compute a DFT are in one of the VLA research papers by Schwab (~1980). In summary, the definition is here:
A scan of the memo is still online, although it isn't an easy read.
It deals with the problem of gridding non-uniformly sampled Fourier data when the intention is to compute an ideal DFT using an FFT. It is also applicable to controlling aliasing elsewhere.
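Very roughly, the idea is: smear each non-uniform sample onto a regular grid with a small convolution kernel, FFT the grid, then divide the result by the kernel's transform to undo the smearing. The sketch below is only meant to show that mechanic; it uses a plain Gaussian kernel and made-up parameters rather than Schwab's optimal prolate spheroidal functions.

import numpy as np

def grid_and_transform(u, vis, n_grid=256, width=3, sigma=0.8):
    """Grid 1-D non-uniformly sampled Fourier data (u in cycles, |u| < 0.5)
    onto a regular grid with a Gaussian kernel, FFT, and grid-correct.
    A toy stand-in for Schwab-style optimal gridding."""
    du = 1.0 / n_grid                                 # grid cell size in u
    grid = np.zeros(n_grid, dtype=complex)
    for ui, vi in zip(u, vis):
        centre = int(round(ui / du))
        for k in range(-width, width + 1):            # smear onto nearby cells
            dist = ui / du - centre - k               # offset in cells
            grid[(centre + k) % n_grid] += vi * np.exp(-0.5 * (dist / sigma) ** 2)
    image = np.fft.fftshift(np.fft.ifft(grid))        # DC cell is index 0, shift afterwards
    # Grid correction: divide out the image-plane transform of the kernel
    # (a Gaussian's transform is another Gaussian).
    x = np.arange(n_grid) - n_grid // 2
    taper = np.exp(-2.0 * (np.pi * sigma * du * x) ** 2)
    return image / taper

# Hypothetical usage: visibilities of a point source at x = 20.
rng = np.random.default_rng(1)
u = rng.uniform(-0.4, 0.4, 500)                       # non-uniform u coverage
vis = np.exp(-2j * np.pi * u * 20)
img = grid_and_transform(u, vis)
print(int(np.argmax(np.abs(img))) - 128)              # peak should land near +20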
It is important to remember that a multiplication in the time domain is a convolution in the frequency domain, and vice versa. Choosing how you prepare your data for transformation matters if you care about the results.
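For instance, tapering the data with a window before transforming (a multiplication in time) convolves the spectrum with the window's transform: a wider main lobe is traded for far lower leakage from the edge discontinuity. A small sketch, assuming numpy and a tone that does not sit on a bin:

import numpy as np

n = 1024
t = np.arange(n)
x = np.sin(2 * np.pi * 10.37 * t / n)           # tone that doesn't land on an FFT bin

rect = np.abs(np.fft.rfft(x))                   # implicit rectangular window
hann = np.abs(np.fft.rfft(x * np.hanning(n)))   # Hann taper applied first

# Leakage well away from the tone: the far-out bin is much lower
# for the windowed transform.
print("rectangular, bin 300:", 20 * np.log10(rect[300] / rect.max()), "dB")
print("Hann,        bin 300:", 20 * np.log10(hann[300] / hann.max()), "dB")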