In the SI, we explore the dependence of these results on methodological choices in network construction, including the measure of functional connectivity (partial correlation, wavelet coherence, and the wavelet correlation used in the main manuscript), the strength of edges (strongest versus weakest [45, 62]), and the time series (wavelet details versus wavelet coefficients). We observe that the effect of wavelet length is more salient (i) when using wavelet correlation than when using wavelet coherence or partial correlation, and (ii) when using the strongest 30% or weakest 10% of connections than when using the weakest 30% or 1% of connections. Results are consistent across the use of both wavelet details and wavelet coefficients. Based on prior work, we speculate that the networks constructed from the 1% weakest connections display significant spatial localization and the networks constructed from the 30% weakest connections display significant random structure, together overshadowing the potential effects of wavelet length on group differences.
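As a concrete illustration of the edge-strength choice discussed above, the following sketch keeps only the strongest 30% of edges of a correlation matrix built from regional time series. This is not code from the paper; the function and variable names are illustrative, and "strongest" is taken here by signed correlation value.

```python
import numpy as np

def threshold_strongest(corr, frac=0.30):
    """Zero all but the strongest `frac` of off-diagonal edges, returning a
    weighted, undirected adjacency matrix (illustrative sketch)."""
    c = corr.copy()
    np.fill_diagonal(c, 0.0)
    # Rank edges once over the upper triangle (the matrix is symmetric).
    upper = c[np.triu_indices_from(c, k=1)]
    cutoff = np.quantile(upper, 1.0 - frac)  # weight at the keep/drop boundary
    adj = np.where(c >= cutoff, c, 0.0)
    np.fill_diagonal(adj, 0.0)
    return adj

# Toy example: 10 "regions", 200 time points of synthetic data.
rng = np.random.default_rng(1)
ts = rng.standard_normal((200, 10))
corr = np.corrcoef(ts, rowvar=False)
adj = threshold_strongest(corr, frac=0.30)
```

The weakest-edge variants mentioned in the text would swap the quantile direction (keep edges with `c <= np.quantile(upper, frac)`).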
Longer Wavelets and Vanishing Moments

As mentioned earlier, wavelet methods offer significant advantages over bandpass filtering in terms of signal preservation and denoising. These capabilities are supported by the fact that wavelets have vanishing moments: a wavelet with p vanishing moments can reproduce, via its scaling function, polynomials of degree up to p - 1. Daubechies wavelets of length 2p have p vanishing moments, while Coiflets of length 6p have 2p. For denoising purposes, it has been suggested that the number of vanishing moments should be greater than 2H + 1, where H is the Hurst exponent. Preliminary evidence suggests that the Hurst exponent for fMRI noise lies below 1, suggesting that one might wish to use wavelets with at least 4 vanishing moments. In the Daubechies family, this corresponds to a wavelet of length 8, a length that our results also support as demonstrating particularly high reliability, decreased sensitivity to artifact, and decreased variability. In the context of other clinical or task data, however, these choices might be quite different. While these heuristics suggest a minimal number of vanishing moments, it is not as simple to define a maximal number of vanishing moments that should be considered. Evidence suggests that very large numbers of vanishing moments can lead to computational artifacts in the decomposed signal; however, the point at which these artifacts occur is difficult to predict for different data types. Our work therefore offers a numerical approach to identifying wavelet lengths that maximize a statistic of interest, such as the classification accuracy in a diagnostic test.
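The vanishing-moment property can be checked numerically. The sketch below, which assumes the standard quadrature-mirror construction (it is not code from the paper), shows that the length-4 Daubechies (db2) highpass filter annihilates constant and linear sequences, i.e. it has p = 2 vanishing moments:

```python
import numpy as np

# Length-4 Daubechies (db2) scaling filter, exact closed form.
s3 = np.sqrt(3.0)
h = np.array([1 + s3, 3 + s3, 3 - s3, 1 - s3]) / (4 * np.sqrt(2.0))

# Quadrature-mirror construction of the highpass (wavelet) filter.
g = np.array([(-1) ** k * h[len(h) - 1 - k] for k in range(len(h))])

# p vanishing moments mean the highpass filter annihilates polynomials of
# degree < p; for db2, p = 2, so moments 0 and 1 vanish and moment 2 does not.
k = np.arange(len(g))
moments = [float(np.sum(g * k ** m)) for m in range(3)]
```

The same check applied to a longer Daubechies filter (length 2p) would show the first p moments vanishing.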
We have exercised these methods on functional networks constructed by applying the AAL atlas to resting-state fMRI data, which represent common choices in functional network analysis in both health and disease. It will be interesting in future work to assess the utility of these methods with other parcellation schemes and in task-based data.
Abstract: Noise strongly affects the apparent features of observed time series, so noise reduction is a necessary and significant task in many practical applications. Traditional de-noising methods often cannot meet practical needs due to their inherent shortcomings. In this paper, a set of key but difficult wavelet de-noising problems is first discussed. Then, information entropy theory is applied to the wavelet de-noising process: the principle of maximum entropy (POME) is used to describe the random character of the noise, and wavelet energy entropy is used to describe the complexity of the main series in the original data, yielding a new entropy-based wavelet de-noising method. Analysis of several different synthetic series and of typical observed time series verifies the performance of the new method. A comprehensive discussion of the results indicates that, compared with traditional wavelet de-noising methods, the proposed method is more effective and more broadly applicable. Furthermore, because information entropy is used to characterize the clearly different properties of the noise and of the main series before de-noising, the analysis process has a more reliable physical basis, and the results are more reasonable and globally optimal. The method is also simple and easy to implement, making it applicable and useful in the applied sciences and in practical engineering work.

Keywords: time series analysis; de-noising; information entropy; wavelet transform; uncertainty
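For context, the traditional baseline that such entropy-based methods are compared against is threshold de-noising of wavelet detail coefficients. The following minimal sketch (one-level Haar transform with the Donoho-Johnstone universal soft threshold; all names illustrative, not the authors' method) shows the idea:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1024
t = np.linspace(0.0, 1.0, n, endpoint=False)
clean = np.sin(2 * np.pi * 4 * t)              # slowly varying "main series"
noisy = clean + 0.3 * rng.standard_normal(n)   # additive white noise

# One-level Haar analysis: pairwise averages (approximation) and
# pairwise differences (detail), both orthonormally scaled.
a = (noisy[0::2] + noisy[1::2]) / np.sqrt(2.0)
d = (noisy[0::2] - noisy[1::2]) / np.sqrt(2.0)

# Universal threshold: sigma estimated from the detail coefficients' MAD.
sigma = np.median(np.abs(d)) / 0.6745
thr = sigma * np.sqrt(2.0 * np.log(n))
d_soft = np.sign(d) * np.maximum(np.abs(d) - thr, 0.0)  # soft thresholding

# Synthesis: invert the Haar transform with the shrunken details.
den = np.empty(n)
den[0::2] = (a + d_soft) / np.sqrt(2.0)
den[1::2] = (a - d_soft) / np.sqrt(2.0)

rmse_noisy = float(np.sqrt(np.mean((noisy - clean) ** 2)))
rmse_den = float(np.sqrt(np.mean((den - clean) ** 2)))
```

Because the detail coefficients here carry mostly noise, shrinking them toward zero reduces the error relative to the clean signal; the entropy-based method of the abstract replaces this fixed threshold rule with criteria derived from POME and wavelet energy entropy.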
Power spectra of the artificial sine waves: (a) original time-averaged wavelet power spectrum, (b) rectified time-averaged wavelet power spectrum, (c) Fourier power spectrum; (d), (e) Same as (a) and (b), respectively, except that the wavelet power spectra are shown in base 2 logarithm.
This paper addresses a bias problem in the estimation of wavelet power spectra for atmospheric and oceanic datasets. For a time series composed of sine waves with the same amplitude at different frequencies, the conventionally adopted wavelet method does not produce a spectrum with identical peaks, in contrast to a Fourier analysis. The wavelet power in the conventional definition, the transform coefficient squared (to within a constant factor), is equivalent to the integration of energy (in physical space) over the influence period (time scale) the series spans. A physically consistent definition of energy for the wavelet power spectrum should therefore be the transform coefficient squared divided by the associated scale. This adjusted wavelet power spectrum yields a substantial improvement in the spectral estimate, allowing spectral peaks to be compared across scales. The improvement is validated with an artificial time series and a real coastal sea level record. Also examined is the previous example of the wavelet analysis of the Niño-3 SST data.
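The adjustment described above, dividing the squared transform coefficient by its scale, can be sketched numerically. The following is a rough, simplified illustration (a discretized Morlet CWT in the Torrence and Compo normalization, with crude edge handling; it is not the paper's code): two equal-amplitude sines yield raw time-averaged powers that grow with scale, while the scale-divided powers are comparable.

```python
import numpy as np

def morlet(eta, w0=6.0):
    # Morlet wavelet in the Torrence & Compo convention.
    return np.pi ** -0.25 * np.exp(1j * w0 * eta) * np.exp(-eta ** 2 / 2.0)

def avg_wavelet_power(x, dt, scale, w0=6.0):
    # Time-averaged |W|^2 at one scale, normalization sqrt(dt/scale).
    half = int(4 * scale / dt)
    m = np.arange(-half, half + 1)
    kernel = morlet(m * dt / scale, w0) * np.sqrt(dt / scale)
    w = np.convolve(x, kernel, mode="same")
    n = len(x)
    return float(np.mean(np.abs(w[n // 4: 3 * n // 4]) ** 2))  # skip edges

dt, n, w0 = 1.0, 2048, 6.0
t = np.arange(n) * dt
periods = (16.0, 64.0)  # two sines of equal amplitude
x = sum(np.sin(2 * np.pi * t / p) for p in periods)

# Scale corresponding to a given Fourier period (Morlet, w0 = 6).
factor = (w0 + np.sqrt(2.0 + w0 ** 2)) / (4.0 * np.pi)
scales = [p * factor for p in periods]

raw = [avg_wavelet_power(x, dt, s, w0) for s in scales]
corrected = [pw / s for pw, s in zip(raw, scales)]  # divide |W|^2 by scale
```

In this setup the raw power at the longer period is roughly the scale ratio (about 4) times the raw power at the shorter period, while the corrected powers are nearly equal, which is the point of the proposed adjustment.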
Wavelet analysis may be advantageous over classical Fourier analysis in that it unfolds a time series not only in frequency but also in time, which is especially useful when the signal is nonstationary. Because of this property, wavelet analysis has been widely applied across disciplines since its introduction in the early 1980s. See Daubechies (1992), Chui (1992), Meyer (1992), Strang and Nguyen (1997), Percival and Walden (2000), and references therein, for a historical account. Applications in the atmospheric and oceanic sciences have also been documented for over a decade (e.g., Gamage and Blumen 1993; Liu 1994; Gu and Philander 1995; Willemsen 1995; Liu and Miller 1996; Wang and Wang 1996; Percival and Mofjeld 1997; Haus et al. 1999; Haus and Graber 2000); several useful reviews on this topic can be found in Meyers et al. (1993), Lau and Weng (1995), Torrence and Compo (1998, hereafter TC98), Domingues et al. (2005), and Labat (2005), to name a few. Among these, TC98 is probably the most celebrated, with over 800 citations so far. A practical step-by-step guide is given, and software packages in the FORTRAN, IDL, and MATLAB languages are provided for free download from their Web site.
The rest of this paper is arranged as follows. The bias problem is further illustrated with a real ocean time series in section 2. In section 3, theoretical derivations shed light on what underlies a power spectral analysis; a physically consistent definition of energy, and hence an alternative formalism for the power spectrum, is proposed, which allows a solution of the problem. The improvement of the biased spectrum under the new formalism is validated in sections 4, 5, and 6 with idealized and real-world examples. A summary and discussion are provided in section 7.
Time series of hourly coastal sea level at St. Petersburg, Florida, are from the National Oceanic and Atmospheric Administration/National Ocean Service (NOAA/NOS) ( -ops.nos.noaa.gov) from January 1993 through December 2005. After quality control, the sea level record is de-tided by removing the four major tidal constituents (M2, S2, K1, and O1) using the T_Tide Harmonic Analysis Toolbox of Pawlowicz et al. (2002). It is then 48-h low-pass filtered, 12-h subsampled, and adjusted for the inverse barometer effect, so that the resulting time series contains mainly subtidal sea level variations (Fig. 1, top). The air pressure data are from two NOAA/National Data Buoy Center (NDBC) stations, 42036 and VENF1 (Venice, Florida), and from University of South Florida surface buoys on the West Florida Shelf. The locations of the sea level and meteorological stations can be found in Fig. 1 of Liu and Weisberg (2005a).
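The low-pass-and-subsample step of this processing chain can be sketched as follows. This is only a schematic stand-in: the harmonic de-tiding with T_Tide is not reproduced, a simple 48-h boxcar substitutes for whatever low-pass filter the authors used, and the synthetic "tide" and "weather band" signals are invented for illustration.

```python
import numpy as np

hours = np.arange(24 * 365)                       # one year of hourly samples
slow = np.sin(2 * np.pi * hours / (24 * 10))      # 10-day "weather band" signal
tide = 0.5 * np.sin(2 * np.pi * hours / 12.42)    # residual semidiurnal energy
sea_level = slow + tide

# 48-h boxcar low-pass: strongly attenuates the semidiurnal band while
# passing the multi-day variability nearly unchanged.
win = np.ones(48) / 48.0
lowpassed = np.convolve(sea_level, win, mode="same")

# 12-h subsampling of the filtered record, as described in the text.
subsampled = lowpassed[::12]
```

After filtering, the sample-to-sample variability (dominated by the tide in the raw record) drops by an order of magnitude, leaving mainly the subtidal signal.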