Recent Changes

Tuesday, May 24

  1. page Results_SLP_Amelia edited ... Empirical Orthogonal Functions (EOF) homework.--Background reading: Hsieh book chapter 2Empiri…
    ...
    Empirical Orthogonal Functions (EOF) homework.--Background reading: Hsieh book chapter 2
    Empirical Orthogonal Functions (EOF) homework.--1. First, a question about the sense of EOF's:
    You have some data(x,t) with space-time structure: 144 space bins (in this case, just longitude), by 240 time bins (months).
    You want to decompose it into a set of orthogonal terms that add together to give the total.
    Since they are orthogonal, each term represents some variance: cross terms disappear when you average the square of the sum.
    If you keep enough terms you will get back all the variance (and more importantly, you can reconstruct the data in all its detail).
    In the case of EOF (also known as Principal Components (PC)) analysis, you express your data as:
    \mathrm{EOF}_1(\vec{x})\,\mathrm{PC}_1(t) + \mathrm{EOF}_2(\vec{x})\,\mathrm{PC}_2(t) + \mathrm{EOF}_3(\vec{x})\,\mathrm{PC}_3(t) + \ldots = \mathrm{mydata}(x,t)

    How many values (numbers) are in your input data array? 144 x 240 (lon vs t)
    How many values (numbers) are needed to build each term on the left? 144 for EOFs, 240 for PCs
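    To make the counting concrete, here is a minimal MATLAB sketch (the array name and the random numbers are stand-ins; princomp removes the column mean internally, so think of the input as anomalies):
    data = randn(240,144);                 % stand-in for data(x,t): 240 times x 144 longitudes
    [EOFs, PCs, lambda] = princomp(data);  % EOFs: 144 x 144 (space), PCs: 240 x 144 (time)
    term1 = PCs(:,1) * EOFs(:,1)';         % one term of the sum: built from 144 + 240 numbers
    recon = PCs * EOFs';                   % all terms together give back the full 240 x 144 anomaly field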
    ...
    {m4.jpg} {m5.jpg}
    {m6.jpg} {m7.jpg}
    Empirical Orthogonal Functions (EOF) homework.--Extra credit/ teach us something new:
    Do an EOF analysis of your second field. Display and interpret. Compare and contrast with your first field.
    Remove the mean, or don't, or remove a different mean. How are the results affected? Explain the sense of the results.
    Standardize the time series at each longitude: this gives eigenvectors and eigenvalues of the correlation matrix rather than the covariance matrix (recall HW3 where you plotted slices of these).
    Try doing the computation with x and t transposed. Now the "coefficients" or "eigenvectors" are in time (240) and the "scores" are in space (144).
    There is a part of the total spacetime variance that EOF's can't reach if you remove the TIME mean, but then use SPACE as the statistical dimension over which you sum to compute covariances. (Or, for that matter, if you remove the SPACE mean to define anomalies but then perform a TIME covariance analysis). What is that unreachable part of the spacetime variance? (Just look at the difference between the input data and the reconstruction and you will see what I am getting at.)
    Do a "Combined EOF" analysis of a vector that combines the two fields (each field must be standardized, since the units are different).
    You just make a (240 x 288) array where the 288 values at each time are the 144 standardized field1 values followed by the 144 field2 values.
    Run princomp() in the usual way.
    Unpack the results at plotting time: the first 144 values are your field1, the others your field2. Rescale with physical units for a better plot.
    CEOFs maximize the variance of the combined data, so they indicate related variations between the two fields (a minimal code sketch follows the example links below).
    examples: [[file/view/CEOF.sst.slp.ps|CEOF.sst.slp.ps]] [[file/view/CEOF.sst.precip.ps|CEOF.sst.precip.ps]] from code [[file/view/HW5_CEOF_BEM.pro|HW5_CEOF_BEM.pro]]
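    A minimal MATLAB sketch of the combined-EOF recipe above (field1 and field2 are placeholder names for two 240 x 144 anomaly arrays):
    z1 = field1 ./ repmat(std(field1), 240, 1);   % standardize field 1 at each longitude
    z2 = field2 ./ repmat(std(field2), 240, 1);   % standardize field 2 at each longitude
    combined = [z1, z2];                          % 240 x 288: both fields side by side at each time
    [CEOFs, CPCs, latent] = princomp(combined);   % CEOFs: 288 x 288, CPCs: 240 x 288
    ceof1_field1 = CEOFs(1:144, 1);               % unpack mode 1: first 144 values belong to field 1
    ceof1_field2 = CEOFs(145:288, 1);             % remaining 144 values belong to field 2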
    Read rotatefactors() documentation (Matlab) and learn and teach us all about "rotated EOFs".
    Background: you first truncate to the first few EOF/PC pairs, then relax one or more of the orthogonality conditions.
    (background of background: EOFs are orthogonal in space, AND PCs are orthogonal in time. Heavy constraint! Either one would allow us to still speak of variance as being cleanly partitioned among modes. Rotated modes may or might or could (?) thus be more physical, since a purely mathematical constraint has been relaxed).
    The double orthogonality condition:
    \int \mathrm{EOF}_i(\vec{x})\,\mathrm{EOF}_j(\vec{x})\,d\vec{x} = 0 \quad \text{and} \quad \int \mathrm{PC}_i(t)\,\mathrm{PC}_j(t)\,dt = 0 \qquad (i \neq j)
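    A minimal, hedged sketch of the rotation step described above, assuming ssta is a 240 x 144 anomaly array (rotatefactors uses varimax by default; other scalings of the loadings are possible):
    [EOFs, PCs, lambda] = princomp(ssta);
    k = 4;                              % truncate to the first few EOF/PC pairs
    Lk = EOFs(:, 1:k);                  % 144 x k matrix of leading EOF patterns
    [Lrot, T] = rotatefactors(Lk);      % varimax-rotated patterns and the rotation matrix T
    PCrot = PCs(:, 1:k) * T;            % rotate the PCs with the same orthogonal matrix
    % PCrot * Lrot' reproduces the same truncated reconstruction as PCs(:,1:k) * Lk',
    % but the rotated modes are generally no longer ordered by variance explained.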
    Empirical Orthogonal Functions (EOF) homework.-IDL results for OLR field
    [[file/view/HW5_EOF_BEM.ps|HW5_EOF_BEM.ps]]
    [[file/view/HW5_EOF_BEM.pro|IDL code HW5_EOF_BEM.pro]]
    {HW5_EOF_OLRmodes_BEM.png}
    {HW5_EOF_OLRreconstructions_BEM.png}
    Empirical Orthogonal Functions (EOF) homework.-IDL results for OLR field-Showing that an EOF is an eigenvector of the covariance matrix
    (so that the results here are related back to those of Question 6.1 of HW3).
    Recall that an eigenvector of a matrix is a vector that, when multiplied by the matrix, returns the same vector (times a constant, the eigenvalue):
    \mathbf{M}\,\vec{x} = \lambda\,\vec{x}
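    A minimal MATLAB sketch of this check (ssta, COEFF, and latent are placeholder names for the anomaly array and the princomp outputs):
    C = cov(ssta);                   % 144 x 144 covariance matrix
    lhs = C * COEFF(:,1);            % covariance matrix times the first EOF
    rhs = latent(1) * COEFF(:,1);    % first eigenvalue times the same EOF
    max(abs(lhs - rhs))              % should be ~0: EOF1 is an eigenvector of C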
    {EOF1_is_eigenvector_check_olrBEM.png}
    {EOF2_is_eigenvector_check_olrBEM.png}

    (view changes)
    4:18 pm

Sunday, May 15

  1. page Results_sst_Chen edited ... explained = latent/sum(latent(:)) .*100; explained(1:10) ... way to compute EOF and PC; …
    ...
    explained = latent/sum(latent(:)) .*100;
    explained(1:10)
    ...
    way to compute EOF and PC; here is
    X = [2 6 1 5 2;
    9 4 0 5 4];
    (view changes)
    9:02 pm
  2. page Results_sst_Chen edited ... {chen_EOF1.jpg} {chen_EOF2.jpg} {SSTAOr.jpg} {SSTARe.jpg} {Truncation1.jpg} {Truncation2…
    ...
    {chen_EOF1.jpg} {chen_EOF2.jpg}
    {SSTAOr.jpg} {SSTARe.jpg} {Truncation1.jpg} {Truncation2.jpg}
    The following is the main part of my code for computing EOF and PC:
    %% EOF=COEFF; PC=SCORE
    sst = nc_varget('data.nc','sst');              % 240 x 144 (time x longitude)
    sstm = mean(sst);                              % time mean at each longitude (1 x 144)
    sstm = repmat(sstm,240,1);                     % expand to the full 240 x 144 array
    ssta = sst-sstm;                               % anomalies
    [COEFF,SCORE,latent,tsquare] = princomp(ssta); % COEFF=EOF (144 x 144), SCORE=PC (240 x 144), latent=eigenvalues
    recon_ssta = SCORE*transpose(COEFF);           % reconstruct the anomalies from all modes
    explained = latent/sum(latent(:)) .*100;       % percent of variance explained by each mode
    explained(1:10)
    There is another way to compute EOF and PC; here is a simple demonstration:
    X = [2 6 1 5 2;
    9 4 0 5 4];
    X(1,:) = X(1,:)-mean(X(1,:)); X(2,:)=X(2,:)-mean(X(2,:));
    % co-variance matrix
    C=X*X'/5;
    [EOF,E] = eig(C); % EOF: eigenvectors; E:eigenvalues
    PC = EOF'*X; % project the data onto the eigenvectors (note the transpose)
    % reverse the order
    E = fliplr(flipud(E));
    lambda = diag(E); % retain eigenvalues only; these play the same role as 'latent' in the foregoing code (princomp normalizes by n-1 rather than n).
    EOF = fliplr(EOF);
    PC = flipud(PC);
    %% check
    EOF*EOF' % = I
    PC*PC'/5 % diagonal = lambda (the PCs are uncorrelated)
    EOF*PC % = X

    (view changes)
    9:00 pm

Friday, May 13

  1. page Result_OLR_Yang edited ... {http://mpo581-hw3-eofs.wikispaces.com/space/math/3a322530c7d160da73e6a98417605451.gif} How …
    ...
    {http://mpo581-hw3-eofs.wikispaces.com/space/math/3a322530c7d160da73e6a98417605451.gif}
    How many values (numbers) are in your input data array? 144*240
    ...
    left? 144*240 No! 144 + 240, much smaller.
    If 5
    ...
    set? (144-5)*144*240 No, 5*(144+240) versus 144x240.
    Empirical Orthogonal Functions (EOF) homework.--2. Read in your field1 (let's call it x again). Use the same data from HW3 data source here.
    Perform and display an EOF analysis of your first field.
    (view changes)
    9:09 am

Monday, May 9

  1. page Results_precip_Jie edited ExtraCredits 1. First, a question about the sense of EOF's: You have some data(x,t) with spac…

    ExtraCredits
    1. First, a question about the sense of EOF's:
    You have some data(x,t) with space-time structure: 144 space bins (in this case, just longitude), by 240 time bins (months).
    ...
    {4.jpg}
    {5.jpg}
    Extra credit/ teach us something new:
    Do an EOF analysis of your second field. Display and interpret. Compare and contrast with your first field.
    The first several EOF modes of SST explain much more variability than those of precip. This indicates that there is more noise in the precip data. For the first mode, both represent ENSO, but the strongest signal appears farther east for SST.
    (view changes)
    11:19 am
  2. page Results_precip_Jie edited ExtraCredits 1. First, a question about the sense of EOF's: You have some data(x,t) with spac…

    ExtraCredits
    1. First, a question about the sense of EOF's:
    You have some data(x,t) with space-time structure: 144 space bins (in this case, just longitude), by 240 time bins (months).
    (view changes)
    11:16 am

Sunday, May 8

  1. page Results_precip_Jie edited ... {4.jpg} {5.jpg} Extra credit/ teach us something new: Do an EOF analysis of your second …
    ...
    {4.jpg}
    {5.jpg}
    Extra credit/ teach us something new:
    Do an EOF analysis of your second field. Display and interpret. Compare and contrast with your first field.
    The first several EOF modes of SST explain much more variability than those of precip. This indicates that there is more noise in the precip data. For the first mode, both represent ENSO, but the strongest signal appears farther east for SST.
    {b1.jpg}
    {b2.jpg}
    {b3.jpg} {b4.jpg}
    {b5.jpg}
    Remove the mean, or don't, or remove a different mean. How are the results affected? Explain the sense of the results.
    It doesn’t seem to affect the results.
    {c1.jpg}
    Standardize the time series at each longitude: this gives eigenvectors and eigenvalues of the correlation matrix rather than the covariance matrix (recall HW3 where you plotted slices of these).
    This change slightly reduces the variability explained by the first several EOF modes. For the first mode, the maximum signal is slightly shifted eastward.
    {d1.jpg}
    {d5.jpg}
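    A minimal MATLAB sketch of this standardization step (ppta is a placeholder name for the 240 x 144 anomaly array):
    ppta_std = ppta ./ repmat(std(ppta), 240, 1);  % unit variance at each longitude
    [EOFs, PCs, lambda] = princomp(ppta_std);      % now eigenvectors/eigenvalues of the correlation matrix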
    Try doing the computation with x and t transposed. Now the "coefficients" or "eigenvectors" are in time (240) and the "scores" are in space (144).
    There is a part of the total spacetime variance that EOF's can't reach if you remove the TIME mean, but then use SPACE as the statistical dimension over which you sum to compute covariances. (Or, for that matter, if you remove the SPACE mean to define anomalies but then perform a TIME covariance analysis). What is that unreachable part of the spacetime variance? (Just look at the difference between the input data and the reconstruction and you will see what I am getting at.)
    EOF with x and t transposed:
    {e1.jpg}
    {e2.jpg}
    {e5.jpg}
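    A minimal MATLAB sketch of the transposed computation (ppta is again a placeholder for the 240 x 144 anomaly array):
    [EOFt, PCt, lambdat] = princomp(ppta');  % transpose: the 144 longitudes are now the "observations"
    % EOFt is 240 x 240 (patterns in time), PCt is 144 x 240 (scores in space),
    % and princomp now removes the mean over longitude rather than over time.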
    EOF with spatial mean removed.
    {f1.jpg}
    {f5.jpg}

    (view changes)
    6:09 pm
  2. file f5.jpg uploaded
    6:09 pm
  3. file f1.jpg uploaded
    6:08 pm
