Calculate pixel values at increasing radius

The ndvi_classes.m function can be used to calculate the number and percentage of pixels, at increasing radius from a lake, that meet given criteria. The function was originally developed to count the number of pixels falling in specific NDVI classes around lakes, but it may be useful for other proxies and situations. The function uses georeferenced TIFF images as input. For details please refer to Aleman et al. (2013) or to the included help. The image used in the example can be downloaded here.
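The underlying computation — counting, for each radius, the pixels around a center point that satisfy the criteria — can be sketched independently of MATLAB. Here is a minimal, hypothetical NumPy illustration of the same idea (the function name and inputs are made up for this sketch, not part of ndvi_classes.m):

```python
import numpy as np

def pixels_in_classes(img, center, criteria, r_max):
    """Count pixels within radius r of center satisfying all criteria, for r = 1..r_max."""
    rows, cols = np.indices(img.shape)
    dist = np.hypot(rows - center[0], cols - center[1])  # distance of each pixel to center
    counts, fractions = [], []
    for r in range(1, r_max + 1):
        disk = img[dist <= r]                # pixels inside the circle of radius r
        disk = disk[~np.isnan(disk)]         # drop missing values
        hit = np.ones(disk.shape, dtype=bool)
        for crit in criteria:                # criteria given as callables, e.g. lambda v: v > 2
            hit &= crit(disk)
        counts.append(int(hit.sum()))
        fractions.append(hit.sum() / disk.size if disk.size else 0.0)
    return counts, fractions
```

Evaluated on a classified image with criteria such as `lambda v: v > 2` and `lambda v: v < 4`, this reproduces the count/percentage pair that ndvi_classes.m returns for each radius.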

The function:

```
function [classe,classe_per]=ndvi_classes(ImageName,critere1,critere2)

% INPUT:  ImageName: name of the GeoTIFF image used as input
%         critere1 (and critere2): criteria for class evaluation
%
% OUTPUT: classe: vector of NDVI classes at increasing radius from
%                 the lake center (number of pixels falling in the class)
%         classe_per: same as classe but as a proportion of pixels
%
% EXAMPLE: using the given image 'classif_GBL_94.tif'
%{
[classe,classe_per]=ndvi_classes('classif_GBL_94.tif','>2','<4');
% Plot classe_per as a function of distance in km
% (1 pixel = 0.0301 km, input 200 pixels when asked)
plot((1:200)*0.0301,classe_per,'k-')
ylabel('% pixels >2 and <4')
xlabel('Distance from lake center (km)')
%}
% Blarquez Olivier 2012
% blarquez@gmail.com

[Imge,dat]=geotiffread(ImageName); % read image and spatial referencing info
imagesc(double(Imge))

select=input('Select point interactively? Y/N: ','s');
if strcmpi(select,'Y')
    title('Please double click on the lake center:')
    p=impoint(gca,[]);
    p=wait(p);
    p=round(p);
elseif strcmpi(select,'N')
    valstring=input('Enter latitude and longitude: ','s');
    valparts=regexp(valstring,'[ ,]','split');
    coords=str2double(valparts);
    % closest pixel to the given coordinates
    [~,p(2)]=min(abs(linspace(dat.Latlim(1),dat.Latlim(2),dat.RasterSize(1))-...
        coords(1)));
    [~,p(1)]=min(abs(linspace(dat.Lonlim(1),dat.Lonlim(2),dat.RasterSize(2))-...
        coords(2)));
end

rmax=input('Number of pixels (radii) to analyze: ');
classe=nan(rmax,1); classe_per=nan(rmax,1);

warning('off','all')
imshow(Imge)
%imagesc(double(Imge))

for r=1:rmax
    % circle of radius r pixels around the selected center
    hEllipse=imellipse(gca,[p(1)-r p(2)-r r*2 r*2]);
    pixList=double(Imge(createMask(hEllipse)));
    delete(hEllipse)
    pixList(isnan(pixList),:)=[];
    classe(r,1)=length(pixList(eval(['pixList',critere1]) & ...
        eval(['pixList',critere2]),:));
    classe_per(r,1)=classe(r,1)./length(pixList(:,1));
end
close all
warning('on','all')
end
```

Paleofire reconstruction based on an ensemble-member strategy

Matlab codes and data associated with the manuscript: Blarquez O., M. P. Girardin, B. Leys, A. A. Ali, J. C. Aleman, Y. Bergeron and C. Carcaillet. 2013. Paleofire reconstruction based on an ensemble-member strategy applied to sedimentary charcoal. Geophysical Research Letters 40: 2667–2672, are available here.

Preprocessed data for Lac à la Pessière and Lago Perso used in the paper are also available:
PESSIERE.mat.zip
PERSO.mat.zip

Replicated rarefied richness

This example presents the methods developed for the paper Blarquez et al. (2013). We will use the macroremain record of Lac du Loup, a small subalpine lake with continuous macroremain counts for the last 11 750 years. More details on this site can be found in Blarquez et al. (2010).

```
dataMacro=read.csv("http://blarquez.com/public/code/dataMacro.csv")
# Macroremain counts
paramMacro=read.csv("http://blarquez.com/public/code/paramMacro.csv")
# Associated depths, ages and sample volumes (file name assumed; paramMacro is used below)
dataMacro[is.na(dataMacro)]=0
# Set missing values to zero
```

We now calculate the median resolution of the record and the interpolated ages at which the influx will be calculated:

```
resMed=median(diff(paramMacro[,3]))
Ages=seq(-50, 11750, resMed)
```

We reconstruct the influx matrix using the pretreatment function, which is the R implementation of the CharAnalysis CharPretreatment.m function originally developed by P. Higuera and available at https://sites.google.com/site/charanalysis. The function requires as input a matrix with the columns CmTop, CmBot, AgeTop, AgeBot, Volume (params), and a series from which to calculate accumulation rates (serie).
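For intuition about the arithmetic inside such a pretreatment step: influx is the concentration (pieces cm⁻³) multiplied by the sediment deposition rate (cm yr⁻¹). A toy Python sketch of just that calculation (the real pretreatment function also interpolates the result to the constant time step yrInterp):

```python
def influx(counts, cm_top, cm_bot, age_top, age_bot, volume):
    """Influx (pieces cm^-2 yr^-1) per sample: concentration times sedimentation rate."""
    out = []
    for c, ct, cb, at, ab, v in zip(counts, cm_top, cm_bot, age_top, age_bot, volume):
        concentration = c / v               # pieces cm^-3
        sed_rate = (cb - ct) / (ab - at)    # cm yr^-1
        out.append(concentration * sed_rate)
    return out
```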

```
source("http://blarquez.com/public/code/pretreatment.R")
infMacro=matrix(ncol=length(dataMacro[1,]),nrow=length(Ages))
for (i in 1:length(dataMacro[1,])){
  infMacro[,i]=c(pretreatment(params=paramMacro,serie=dataMacro[,i],
                 yrInterp=resMed,first=-50,last=11750,
                 plotit=F,Int=F)$accI)
}
```

Calculate the macroremain sum of each sample and the minimum influx sum:

```
S=rowSums(infMacro)
nMin=min(na.omit(S[1:(length(S)-1)]))
```

We exclude the last sample (its influx cannot be calculated because the accumulation rate is unknown) and the samples with NA values (because vol==0):

```
del=which(S==0 | is.na(S))
infMacro=infMacro[-del,]
Ages=Ages[-del]
S=S[-del]
```

We calculate a proportion matrix, which is used to replicate the rarefaction procedure:

```
S_=matrix(ncol=ncol(infMacro),nrow=nrow(infMacro))
for (i in 1:length(infMacro[1,])){
  S_[,i]=c(S)
}
propMat=infMacro/S_
```

Replicate the rarefaction with minimum influx n = 1, 2, …, 500, using the rarefy function from the vegan package:

```
library(vegan)
rare_=matrix(nrow=nrow(infMacro),ncol=500)
for (n in 1:500){
  infN=ceiling(S_*n/nMin*propMat)
  rare_[,n]=c(rarefy(infN,sample=n))
}
```
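For reference, the rarefied richness that rarefy returns is Hurlbert's (1971) expected number of taxa E[T_n] in a random subsample of n individuals. A minimal Python rendering of the formula (an illustration, not vegan's implementation):

```python
from math import comb

def rarefied_richness(counts, n):
    """Expected number of taxa in a random subsample of n individuals (Hurlbert 1971)."""
    N = sum(counts)
    # For each taxon with Ni individuals, the probability that it appears
    # at least once in the subsample is 1 - C(N - Ni, n) / C(N, n).
    return sum(1 - comb(N - Ni, n) / comb(N, n) for Ni in counts if Ni > 0)
```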

Finally, we calculate the 90% confidence intervals and plot the rarefied richness using a 500-year locally weighted scatterplot smoother:

```
CI_=t(apply(rare_, 1, quantile, probs=c(0.05, 0.5, 0.95), na.rm=TRUE))
span=500/resMed/length(Ages)
plot(Ages,lowess(CI_[,2], f=span)$y,type="l",ylab="E[Tn]",
     ylim=c(min(CI_),max(CI_)),xlab="Age",main="Rarefied Richness",
     xlim=c(12000,-100))
lines(Ages,lowess(CI_[,1], f=span)$y,type="l",lty=2)
lines(Ages,lowess(CI_[,3], f=span)$y,type="l",lty=2)
```

References:
Blarquez O., Finsinger W. and Carcaillet C. 2013. Assessing paleo-biodiversity using low proxy influx. PLoS ONE 8(4): e65852.
Blarquez O., Carcaillet C., Mourier B., Bremond L. and Radakovitch O. 2010. Trees in the subalpine belt since 11 700 cal. BP: origin, expansion and alteration of the modern forest. The Holocene 20: 139-146. DOI: 10.1177/0959683609348857

Superposed Epoch Analysis (SEA)

The SEA.m function performs Superposed Epoch Analysis (SEA) with confidence intervals estimated using a block-bootstrap procedure. The method is loosely inspired by Adams et al. (2003); for an example of the use of SEA in palaeoecology see Blarquez and Carcaillet (2010). For details please refer to the included help.

Example:

Create a random time series with noise, define some events, and plot them along with the time series:

```
t=1:500;
d=2.5*sin(2*pi*t/100)+1.5*sin(2*pi*t/41)+1*sin(2*pi*t/21);
T=(d+2*randn(size(d))).';
figure()
plot(t,T,'g')
ev=[50,93,131,175,214,257,297,337,381,428,470].';
hold on
plot(ev,repmat(4,length(ev),1),'v')
```

Then we perform a SEA using a time window of 20 years before and 20 years after each event. The 'nbboot' argument defines the number of circular block-bootstrap replicates; with 'b' set to 0 the block length is calculated automatically, following the Adams et al. (2003) procedure.

```
[SEA_means,SEA_prctiles]=SEA(T,ev,20,20,'prctilesIN',[5,95],'nbboot',9999,'b',0)
```
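To make the procedure concrete, here is a hypothetical Python sketch of the two ingredients: compositing the series in a window around each event, and a circular block bootstrap to obtain percentile bands. This illustrates the approach only; it is not a port of SEA.m:

```python
import numpy as np

def sea_composite(series, events, before, after):
    """Mean response across windows [event-before, event+after]; lag 0 at index `before`."""
    windows = [series[e - before: e + after + 1] for e in events
               if e - before >= 0 and e + after < len(series)]
    return np.mean(windows, axis=0)

def circular_block_bootstrap(series, b, rng):
    """One bootstrap replicate built from blocks of length b, wrapping around the series."""
    n = len(series)
    doubled = np.concatenate([series, series])            # allows blocks to wrap
    starts = rng.integers(0, n, size=int(np.ceil(n / b)))
    return np.concatenate([doubled[s:s + b] for s in starts])[:n]

def sea(series, events, before, after, nboot=999, b=10, prctiles=(5, 95), seed=0):
    """Composite mean plus bootstrap percentile bands under the null of no event link."""
    rng = np.random.default_rng(seed)
    boot = np.array([sea_composite(circular_block_bootstrap(series, b, rng),
                                   events, before, after)
                     for _ in range(nboot)])
    return sea_composite(series, events, before, after), np.percentile(boot, prctiles, axis=0)
```

A composite mean that escapes the bootstrap band near lag 0 is the signature of a significant event response.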

The proxy response to the randomly generated events appears significant at the 95% confidence level around event occurrence (at lags close to zero).

Paleofire frequencies

The KDffreq.m function calculates palaeofire frequency using a Gaussian kernel and computes bootstrapped confidence intervals around the fire frequency. It requires the dates of fire events as input. Alongside this function, the analyst may use the cvKD.m function to estimate the optimal kernel bandwidth by maximum log-likelihood. Here is an example:

Generate some fire dates for the last 8000 years:

```
x=randi(8000,40,1);
plot(x,ones(size(x)),'+','Color','red')
set(gca,'xdir','reverse')
xlabel('Time (cal yrs. BP)')
```

Use the cvKD.m function to calculate the optimal bandwidth for the kernel:

```
cvKD(x)
ans = 724.3434  % the value varies with the random fire dates
```

Then we estimate the long term fire frequency using the Gaussian kernel density procedure:

```
[ffreq]=KDffreq(x,'up',0,'lo',8000,'bandwidth',700,'nbboot',100,'alpha',0.05);
```
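Conceptually, the kernel estimate of fire frequency at time t is a Gaussian-weighted count of fire dates per year. A minimal Python sketch of that estimate (an illustration of the technique, not the KDffreq.m code, and without the bootstrap confidence intervals):

```python
import math

def kernel_fire_freq(fire_dates, t, bandwidth):
    """Gaussian kernel estimate of fire frequency (fires per year) at time t."""
    norm = bandwidth * math.sqrt(2 * math.pi)
    return sum(math.exp(-0.5 * ((t - d) / bandwidth) ** 2) for d in fire_dates) / norm
```

Evaluated on a grid of ages this yields the long-term frequency curve; integrating the curve over time recovers the total number of fires.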