Statistical inference on curves
27-04-2017
Salón de Graos, Faculty of Mathematics



16:30 - 17:30
Juan Carlos Pardo Fernández (Universidade de Vigo)
Robust testing for superiority between two regression curves

17:30 - 18:30
Ingrid Van Keilegom (Université Catholique de Louvain)
Estimation in measurement error problems under minimal conditions on the distribution of the signal and the noise

18:30 - 19:30
Marc Hallin (Université libre de Bruxelles)
Dynamic Principal Components and Optimal Dimension Reduction in Functional Time Series


Juan Carlos Pardo Fernández - "Robust testing for superiority between two regression curves".

Abstract: In this talk we will focus on the problem of testing the null hypothesis that the regression functions of two populations are equal against one-sided alternatives, under a general nonparametric homoscedastic regression model. To protect against atypical observations, the test statistic is based on the residuals obtained by using a robust estimate of the regression function under the null hypothesis. The asymptotic distribution of the test statistic is studied under the null hypothesis and under root-n local alternatives. A Monte Carlo study is performed to compare the finite sample behaviour of the proposed tests with the classical one obtained using local averages. A sensitivity analysis is carried out on a real data set.

This is joint work with Graciela Boente (Universidad de Buenos Aires).
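For readers who want a concrete feel for the structure of such tests, the following Python sketch simulates two homoscedastic samples, fits a single robustified smoother to the pooled data under the null hypothesis of equal curves, and calibrates a simple one-sided comparison of the two groups' residuals by permutation. It is only an illustration of the general recipe (fit under the null, compare residuals), not the test statistic or the robust estimator proposed in the talk; the lowess smoother, the residual-mean contrast and the permutation calibration are choices made here purely for the sake of a runnable example.

```python
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

rng = np.random.default_rng(0)

# Simulated example: two homoscedastic samples; under H1 the second curve lies above the first.
n1, n2 = 100, 100
x1, x2 = rng.uniform(0, 1, n1), rng.uniform(0, 1, n2)
y1 = np.sin(2 * np.pi * x1) + 0.3 * rng.standard_normal(n1)
y2 = np.sin(2 * np.pi * x2) + 0.2 + 0.3 * rng.standard_normal(n2)

# Pool the data and fit a single robustified smoother under H0: m1 = m2.
x = np.concatenate([x1, x2])
y = np.concatenate([y1, y2])
labels = np.repeat([0, 1], [n1, n2])
fit = lowess(y, x, frac=0.3, it=3, return_sorted=False)  # it=3 adds robustifying iterations
resid = y - fit

def stat(lab):
    # Large positive values suggest the second regression curve lies above the first.
    return resid[lab == 1].mean() - resid[lab == 0].mean()

obs = stat(labels)

# Permutation null distribution for a one-sided p-value.
B = 2000
perm = np.array([stat(rng.permutation(labels)) for _ in range(B)])
pval = (np.sum(perm >= obs) + 1) / (B + 1)
print(f"observed statistic = {obs:.3f}, one-sided permutation p-value = {pval:.3f}")
```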

 

Ingrid Van Keilegom - "Estimation in measurement error problems under minimal conditions on the distribution of the signal and the noise".

Abstract: In this presentation I will discuss two research problems related to identifiability and estimation in measurement error models. We would like to make as few assumptions as possible on the error and on the variable that is subject to measurement error; in particular, we do not want to impose the strong assumption, common in the literature, that the variance of the error is known.
For the first project, let $X$ denote an unobservable continuous variable with compact support (e.g. a covariate in a regression model), and let $W$ be its mismeasured version. We assume the classical additive model $W = X + U$, where the error $U$ is independent of $X$ and is normally distributed with mean zero and {\it unknown variance} $\sigma^2$. No further assumptions are made regarding $X$ and $U$. Under this model we would like to identify and estimate the variance $\sigma^2$. We do so by approximating the density of $X$ by a linear combination of Beta densities and letting the number of densities grow to infinity with the sample size. In this way we can show that the problem is identifiable, and we develop an estimation procedure, establish asymptotic theory and present a detailed simulation study.
For the second project, we are interested in the estimation of the boundary (or frontier) $c$ of a variable $X$ with support, say, $[c,\infty)$. The variable $X$ is again observed with noise, i.e. we observe $W = X + U$, where $U$ and $X$ are again assumed to be independent; this time the distribution of $U$ is {\it completely unknown}, apart from the fact that its density is symmetric. We show that the boundary $c$ can be identified under these conditions and propose an estimation procedure, which we illustrate in a simulation study.
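The setup of the first project can be mimicked in a small simulation. The hedged Python sketch below generates $W = X + U$ with $X$ supported on $[0,1]$ and normal $U$ of unknown standard deviation, approximates the density of $X$ by a mixture of Beta$(j, m-j+1)$ densities (a Bernstein-type basis, used here purely for illustration), and maximizes the resulting likelihood jointly over the mixture weights and $\sigma$. The basis size, the optimizer and all tuning constants are arbitrary choices of this sketch, not the estimation procedure developed in the talk.

```python
import numpy as np
from scipy import stats, optimize

rng = np.random.default_rng(1)

# Simulate W = X + U with X supported on [0, 1] and U ~ N(0, sigma^2), sigma unknown.
n, true_sigma = 1000, 0.1
X = rng.beta(2, 5, size=n)
W = X + true_sigma * rng.standard_normal(n)

# Approximate the density of X by a mixture of Beta(j, m - j + 1) densities (illustrative choice).
m = 8
grid = np.linspace(0.0, 1.0, 201)          # integration grid on the support of X
dx = grid[1] - grid[0]
basis = np.array([stats.beta.pdf(grid, j, m - j + 1) for j in range(1, m + 1)])

def neg_loglik(theta):
    # theta = (unconstrained weights, log sigma); a softmax keeps the weights on the simplex.
    w = np.exp(theta[:m] - theta[:m].max())
    w /= w.sum()
    sigma = np.exp(theta[m])
    f_x = w @ basis                         # approximate density of X on the grid
    # Density of W: numerical convolution of f_X with the N(0, sigma^2) density.
    kern = stats.norm.pdf(W[:, None] - grid[None, :], scale=sigma)
    f_w = (kern * f_x[None, :]).sum(axis=1) * dx
    return -np.sum(np.log(np.maximum(f_w, 1e-300)))

theta0 = np.concatenate([np.zeros(m), [np.log(0.2)]])
res = optimize.minimize(neg_loglik, theta0, method="Nelder-Mead",
                        options={"maxiter": 4000, "xatol": 1e-4, "fatol": 1e-4})
print(f"estimated sigma = {np.exp(res.x[m]):.3f} (true value {true_sigma})")
```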

 

Marc Hallin - "Dynamic Principal Components and Optimal Dimension Reduction in Functional Time Series".

Abstract: Dimension reduction techniques are central to the analysis of high-dimensional and functional observations, and time series data are no exception. This talk addresses the problem of optimal dimension reduction for functional time series. Such series arise frequently, e.g. when a continuous-time process is segmented into smaller natural units, such as days, so that each observation represents one intraday curve. We argue that functional principal component analysis (FPCA), a key technique in the field, does not provide an adequate dimension reduction in a time series context: FPCA is a static procedure which ignores the essential serial dependence features of the data and does not enjoy, in the time series context, the Karhunen-Loève optimality that justifies its success in the presence of independent observations. Inspired by Brillinger's theory of dynamic principal components, we propose a dynamic version of FPCA, based on a frequency domain approach, and show that it provides the expected optimal dimension reduction. By means of a simulation study and an empirical illustration, we show the considerable improvement our method entails when compared with the usual (static) FPCA procedure.


Based on joint work with Siegfried Hörmann and Lukasz Kidzinski: Dynamic functional principal components. Journal of the Royal Statistical Society, Series B, 77 (2015), 319-348.
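To convey the flavour of the frequency-domain construction, the Python sketch below applies Brillinger-style dynamic principal components to a low-dimensional multivariate series, used here as a finite-dimensional stand-in for the functional setting: estimate the spectral density matrix on a frequency grid, take its leading eigenvector at each frequency, invert the Fourier transform to obtain a two-sided filter, and apply that filter to the series. All tuning choices (lag window, bandwidth, truncation of the filter) and the sign/phase and boundary conventions are simplifications of this sketch, not the procedure of the paper.

```python
import numpy as np

rng = np.random.default_rng(2)

# A small multivariate series driven by one latent AR(1) factor loaded at lags 0 and 1,
# used as a finite-dimensional stand-in for a functional time series.
T, p = 1000, 6
f = np.zeros(T)
for t in range(1, T):
    f[t] = 0.8 * f[t - 1] + rng.standard_normal()
a, b = rng.standard_normal(p), rng.standard_normal(p)
X = np.outer(f, a) + np.outer(np.roll(f, 1), b) + 0.5 * rng.standard_normal((T, p))
X -= X.mean(axis=0)

# Spectral density matrices on a frequency grid (Bartlett-weighted lag-window estimator).
q = 20
freqs = np.linspace(-np.pi, np.pi, 101)
def autocov(h):
    return X[h:].T @ X[:T - h] / T if h >= 0 else autocov(-h).T

spec = np.zeros((len(freqs), p, p), dtype=complex)
for h in range(-q, q + 1):
    w = 1 - abs(h) / (q + 1)
    spec += w * np.exp(-1j * freqs * h)[:, None, None] * autocov(h)[None, :, :] / (2 * np.pi)

# Leading eigenvector of the (Hermitian) spectral density at each frequency.
vecs = np.zeros((len(freqs), p), dtype=complex)
for k in range(len(freqs)):
    _, V = np.linalg.eigh(spec[k])
    v = V[:, -1]
    vecs[k] = v * np.exp(-1j * np.angle(v[np.argmax(np.abs(v))]))  # fix the arbitrary phase

# Filter coefficients: inverse Fourier transform of the frequency-domain eigenvectors.
L = 10
dtheta = freqs[1] - freqs[0]
phi = np.array([(vecs * np.exp(1j * l * freqs)[:, None]).sum(axis=0) * dtheta / (2 * np.pi)
                for l in range(-L, L + 1)])

# First dynamic principal component scores: a two-sided linear filter applied to the series.
scores = np.zeros(T, dtype=complex)
for idx, l in enumerate(range(-L, L + 1)):
    scores += np.roll(X, -l, axis=0) @ np.conj(phi[idx])  # crude boundary handling via wrap-around
print("correlation of the first dynamic PC with the latent factor:",
      round(abs(np.corrcoef(scores.real, f)[0, 1]), 3))
```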