Multi-trend signal, wavelets vs. moving average

The NOAA surface temperature data has multiple trends. I am surprised this fact has not been mentioned in the literature.

This affects forecasting and related signal analysis.

To give an intuitive review of multi-trend signals, I wrote this little demo to clarify some concepts:

Multi-Trend Signal

As the demo shows, a moving average with a larger window size can indeed over-smooth the signal to the point of losing its shape and characteristics.
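
Here is a minimal sketch of the comparison (not the demo itself), assuming NumPy and PyWavelets are installed; the synthetic multi-trend signal, the window size and the wavelet are only illustrative:

```python
# Compare a wide moving average with a wavelet multi-resolution decomposition
# on a synthetic multi-trend signal (slow trend + decadal cycle + annual cycle + noise).
import numpy as np
import pywt

t = np.arange(720)                              # e.g. 60 years of monthly samples
signal = (0.002 * t                             # slow warming trend
          + 0.3 * np.sin(2 * np.pi * t / 120)   # decadal fluctuation
          + 0.2 * np.sin(2 * np.pi * t / 12)    # annual cycle
          + 0.1 * np.random.randn(t.size))      # noise

# Wide moving average: smooths the noise, but also flattens the decadal structure.
window = 121
smooth = np.convolve(signal, np.ones(window) / window, mode='same')

# Wavelet decomposition keeps the slow trends in the coarse approximation while
# the detail coefficients hold the faster sub-bands separately.
coeffs = pywt.wavedec(signal, 'db4', level=5)
approx_only = [coeffs[0]] + [np.zeros_like(c) for c in coeffs[1:]]
trend = pywt.waverec(approx_only, 'db4')[:signal.size]
```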

Dara

Comments

  • 1.
    edited July 2014

    I'll check out your demo!

    > The NOAA surface temperature data has multiple trends. I am surprised this fact has not been mentioned in the literature.

    I'm surprised you think it hasn't been mentioned. The lack of temperature increase between 1950 and 1970 is famous and endlessly discussed. Ray Pierrehumbert mentioned it in his talk [Successful predictions in climate science](http://johncarlosbaez.wordpress.com/2013/02/05/successful-predictions-of-climate-science/), a plenary talk at the 2012 American Geophysical Union annual meeting.

    I summarized this part of his talk:

    > Finally, there are, of course, some things that scientists didn’t predict. The most important of these is probably the multi-decadal fluctuations in the warming signal. If you calculate the radiative effect of all greenhouse gases, and the delay due to ocean heating, you still can’t reproduce the flat period in the temperature trend that was observed in 1950–1970. While this wasn’t predicted, we ought to be able to explain it after the fact. Currently, there are two competing explanations. The first is that the ocean heat uptake itself has decadal fluctuations, although models don’t show this. However, if climate sensitivity is at the low end of the likely range (say 2 °C per doubling of CO2), it’s possible we’re seeing a decadal fluctuation around a warming signal. The other explanation is that aerosols took some of the warming away from GHGs. This explanation requires a higher value for climate sensitivity (say around 3 °C), but with a significant fraction of the warming counteracted by an aerosol cooling effect. If this explanation is correct, it’s a much more frightening world, because it implies much greater warming as CO2 levels continue to increase. The truth is probably somewhere between these two. (See [Armour and Roe](http://www.agu.org/pubs/crossref/2011/2010GL045850.shtml), 2011 for a discussion.)

  • 2.

    Thanx John, I did not see this.

    Also, I did not see any formal signal-processing treatment of multi-trend data or of how to handle it.

    But it is noteworthy that this requires very different thinking and forecast algorithms.

    D

  • 3.

    Yes, extracting 'cycles' - periodic phenomena - from data that exhibits complex trends is an interesting challenge. I'm not an expert on that! Someone must have thought about it.

  • 4.

    I could, code-wise, extract the sub-band periods (and their amplitudes), but how would one integrate them back together to explain the phenomenon? For example, in quantum mechanics you would call this the superposition of waves, a well-understood concept: there are two waves, you add them together, and then there is interference.

    But in the atmospheric sciences, what construct is there to explain the sub-band decompositions?
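
    Concretely, here is a minimal sketch of the decompose-and-recombine step, assuming PyWavelets; the random input is just a stand-in for a temperature series, and the wavelet and level count are illustrative:

    ```python
    # Split a series into wavelet sub-bands and add them back ("superposition").
    import numpy as np
    import pywt

    x = np.random.randn(512)  # stand-in for a temperature anomaly series

    # One coarse approximation band plus several detail bands.
    coeffs = pywt.wavedec(x, 'db4', level=4)

    # Reconstruct each sub-band on its own by zeroing all the other coefficients.
    subbands = []
    for i in range(len(coeffs)):
        kept = [c if j == i else np.zeros_like(c) for j, c in enumerate(coeffs)]
        subbands.append(pywt.waverec(kept, 'db4')[:x.size])

    # Because the transform is linear, the sub-bands sum back to the original signal.
    reconstruction = np.sum(subbands, axis=0)
    print(np.allclose(reconstruction, x))  # True up to floating-point error
    ```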

    D

  • 5.
    edited July 2014

    Blind source separation on temperature histories from multiple locations seems appropriate. I have already tried running NMF and ICA on the surface temperature data, but the extracted sources did not look [very interesting](http://azimuth.mathforge.org/discussion/1366/r-programming-language/?Focus=11286#Comment_11286) -- at least to me. They were dominated by high-frequency fluctuations. I have seen references to "slow feature analysis", which sounds like it may fix this, but I never looked into it in depth. Maybe this is a good time :).

    + http://www.scholarpedia.org/article/Slow_feature_analysis
    + http://dl.acm.org/citation.cfm?id=1176442

    I have also just come across something called forecastable component analysis:

    + http://cran.r-project.org/web/packages/ForeCA/index.html
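
    A minimal sketch of that kind of decomposition with scikit-learn; the matrix `X` here is a random stand-in for a (months x stations) anomaly matrix, and the component count is arbitrary:

    ```python
    # Blind source separation with ICA and NMF on a (times x locations) matrix.
    import numpy as np
    from sklearn.decomposition import FastICA, NMF

    rng = np.random.RandomState(0)
    X = rng.randn(600, 40)               # stand-in for 600 months x 40 stations

    # ICA: extract statistically independent source signals.
    ica = FastICA(n_components=5, random_state=0)
    sources_ica = ica.fit_transform(X)   # shape (600, 5)

    # NMF needs non-negative input, so shift the anomalies first.
    nmf = NMF(n_components=5, random_state=0, max_iter=500)
    sources_nmf = nmf.fit_transform(X - X.min())
    ```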

  • 6.

    Hello Daniel

    My wavelet decompositions, as opposed to scikit-learn's decomposition.NMF, did immediately show interesting patterns in the lower-refinement decompositions; please see:

    [ENS2010](http://mathematica.lossofgenerality.com/2014/06/19/enso-2010/)

    You need CDF to view it, and it is 20 MB, so it takes a couple of seconds to run properly.

    The very last refinements {0,0,0,0,0} and {0,0,0,0,0,0} show dynamic patterns.

    Dara

  • 7.

    Hello Daniel

    Would you like to place your work in IPython notebooks, with access to the full scikit-learn libraries? This would be an alternative to Mathematica CDF.

  • 8.

    Hi Dara.

    I can create the notebooks, but I do not have a publicly available server to serve them from. I could add them to the Azimuth GitHub and just let people download and run them, but that kind of loses the advantage of notebooks over raw Python code.

  • 9.

    Daniel, the server runs the IPython notebook live, on Python code plus the scikit-learn libraries (mostly non-Python).

    So this is not a hosting situation; this is actually keeping live Python code running on a server. Think of IPython as a client attached to a server.

  • 10.

    I meant to say that I do not have a publicly visible machine on which to run an IPython server for people to access. Normally I run an IPython server on my laptop, but nobody else has access to that.

  • 11.
    edited July 2014

    Daniel, I got your point. I suggest that we use IPython for several reasons:

    1. Another format for documents, so we are not entirely Mathematica CDF based; given that IPython is open source, that is a plus.
    2. Python code in an IPython notebook could call my parallelized Mathematica and C algorithms as if they were Python calls, without actually needing to learn the gory parallelization crap.
    3. John and his educator colleagues could then craft entire sets of live interactive documents with actual live data from NOAA and NASA ... as teaching materials (I would still like to do that in CDF as well).
    4. R code could be used in IPython (a minimal sketch follows below):

    [RMAGIC](http://nbviewer.ipython.org/github/ipython/ipython/blob/3607712653c66d63e0d7f13f073bde8c0f209ba8/docs/examples/notebooks/rmagic_extension.ipynb)
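
    A minimal sketch of what that looks like in notebook cells, assuming rpy2 is installed (in older IPython releases the extension was loaded with `%load_ext rmagic`); the little linear fit is only illustrative:

    ```python
    # Cell 1 -- load the R integration.
    %load_ext rpy2.ipython

    # Cell 2 -- some Python data to hand to R.
    import numpy as np
    x = np.linspace(0, 1, 100) + 0.1 * np.random.randn(100)

    # Cell 3 -- an R cell: -i pushes Python variables in, -o pulls R results back out.
    %%R -i x -o coefs
    coefs <- coef(lm(x ~ seq_along(x)))
    ```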

    Dara

  • 12.

    Dara, that sounds like a promising approach.

  • 13.

    Hello David

    We will make a couple of IPython documents in the coming weeks, and then John and everyone here can evaluate their worth.

    In these things I am not theoretical; we try it, see if it works, and then proceed.

    D

  • 14.

    Great, I'm really looking forward to trying the IPython documents.

    Keep up the good work everybody, we're really moving now.

  • 15.
    edited July 2014

    Dara, I have turned my code into notebooks. What should I do with them?

  • 16.

    Could you put them someplace in the GitHub for this discussion group while I am setting up an account for you?

    D

  • 17.

    Daniel, I have a tmp account for you; I need to email you the password and login, and I am not sure how we should conduct that here.

    Dara

  • 18.
    edited July 2014

    Hello Daniel

    My server guys might not respond before the weekend, so this is your tmp login URL:

    http://ipaic.lossofgenerality.com:8080/login

    You only need a password. You should have all the scikit-learn libraries; if not, or if there are any problems, I will fix it for you.

    As for the password, you need to email me directly so I can issue it to you: dara@lossofg...

    You have huge storage and possibly access to MySQL, but for the next few days see if you can run some of your scripts on the default configuration.

    D

  • 19.
    edited August 2014

    Dara -- or anyone else -- I'm interested to know your perspective on the role that IPython notebooks can play in tutorial software for scientific and mathematical concepts.

    In the [Azimuth Code Project] we've developed some interactive web models that demonstrate concepts such as stochastic resonance. (Though the "we" doesn't include me, since it was before my time here.) The platform that was settled on was JavaScript plus the JSXGraph library. The aim is to have everything run directly in the browser.

    Here is a draft of a blog article that explains the anatomy of one of these programs: Blog - The stochastic resonance program (part 2).

    I'm wondering to what extent IPython notebooks could contribute to the platform for such interactive software. Although the JSXGraph library does relieve us of lots of low-level programming, it's still JavaScript; all else being equal, I'm much more at home with the Python language and platform.

    But is all else equal? Can we attain the same level of web-based interactivity with IPython notebook technology as we can with JavaScript?

  • 20.

    Hello David

    > I'm interested to know your perspective on the role that IPython notebooks can play in tutorial software for scientific and mathematical concepts.

    I think it is a good OPEN SOURCE platform; it is incomplete, but it has much hope going for it. We used it for a year in connection with scikit-learn, and it was great for tutorial purposes, especially for professionals.

    However, it lacks a multi-user setup, so you need to play some admin games to have multiple logins; but for Azimuth, per discussion with John, the notebooks will be open to the world.

    The developers need to share some number of logins; each makes their own workspace area, and all is good.

    As far as your concerns about web interactivity go, the IPython notebook is one of the best platforms, again considering OPEN SOURCE options. You might consider others like Sage, but IPython is a better choice for Azimuth.
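
    For browser-side interactivity of that kind, the closest notebook analogue is the widget machinery; here is a minimal sketch, assuming the `interact` helper (IPython.html.widgets in IPython 2.x, the ipywidgets package later). The sine-wave demo is only illustrative, not one of the Azimuth models:

    ```python
    # A slider-driven plot inside a notebook: dragging the sliders re-runs plot_wave.
    %matplotlib inline
    import numpy as np
    import matplotlib.pyplot as plt
    from ipywidgets import interact   # in IPython 2.x: from IPython.html.widgets import interact

    def plot_wave(frequency=1.0, amplitude=1.0):
        t = np.linspace(0, 2 * np.pi, 500)
        plt.plot(t, amplitude * np.sin(frequency * t))
        plt.ylim(-3, 3)
        plt.show()

    # Tuple ranges become float sliders in the notebook.
    interact(plot_wave, frequency=(0.5, 10.0), amplitude=(0.1, 3.0))
    ```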

    More importantly, we could easily allow IPython notebooks to access code and data from other applications; e.g., from an IPython notebook you could run a Mathematica or C script. That being said, such scripts are already parallelized, so the IPython notebook does not need any parallelization of its own.
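
    What I mean by calling out to such scripts, as a minimal sketch; the script names and the `math -script` kernel invocation are placeholders for whatever is actually installed on the server:

    ```python
    # Call external programs from a notebook cell and capture their output.
    import subprocess

    # Run a Mathematica script (placeholder name) with the command-line kernel.
    wl_output = subprocess.check_output(["math", "-script", "compute.m"])

    # Run an already-parallelized compiled C program (placeholder name and argument).
    c_output = subprocess.check_output(["./parallel_solver", "--input", "enso.csv"])

    print(wl_output.decode())
    print(c_output.decode())
    ```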

    Dara

  • 21.

    I really like IPython notebooks for exploratory data analysis, but opening a notebook to the world amounts to giving the whole world shell access to your machine, since you can execute arbitrary python code through the notebook interface.
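
    At a minimum, a public server like that should require a password and be served over SSL; here is a sketch of the relevant IPython notebook server settings (the certificate paths and the password hash are placeholders). Note that this only restricts who can log in, not what a logged-in user can execute:

    ```python
    # Sketch of settings for ipython_notebook_config.py in the server's profile.
    # Generate the password hash with: from IPython.lib import passwd; passwd()
    c = get_config()

    c.NotebookApp.ip = '*'                           # listen on all interfaces
    c.NotebookApp.open_browser = False
    c.NotebookApp.port = 8080
    c.NotebookApp.password = u'sha1:...'             # hashed, never plain text
    c.NotebookApp.certfile = u'/path/to/mycert.pem'  # serve over HTTPS
    c.NotebookApp.keyfile = u'/path/to/mykey.key'
    ```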

  • 22.

    > but opening a notebook to the world amounts to giving the whole world shell access to your machine, since you can execute arbitrary python code through the notebook interface.

    Let me talk to my server guru; I understand the concern.

    I will get back to you on this shortly.

    Dara

  • 23.

    I know very little about IPython notebooks, but I happened to see a discussion about them on [phylobabble](http://phylobabble.org/), a forum for phylogenetics (my area). Here's the discussion: http://phylobabble.org/t/ipython-notebook-for-phylogenetics/354.
