CONSISTENCY OF RESTRICTED MAXIMUM LIKELIHOOD ESTIMATORS OF PRINCIPAL COMPONENTS

BY DEBASHIS PAUL AND JIE PENG

University of California, Davis

Abstract:

In this paper we consider two closely related problems: estimation of the eigenvalues and eigenfunctions of the covariance kernel of functional data based on (possibly) irregular measurements, and estimation of the eigenvalues and eigenvectors of the covariance matrix of high-dimensional Gaussian vectors. In [A geometric approach to maximum likelihood estimation of covariance kernel from sparse irregular longitudinal data (2007)], a restricted maximum likelihood (REML) approach was developed to address the first problem. In this paper, we establish consistency and derive the rate of convergence of the REML estimator in the functional data setting, under appropriate smoothness conditions. Moreover, we prove that when the number of measurements per sample curve is bounded, the REML estimators of the eigenfunctions attain a near-optimal rate of convergence under squared-error loss. In the case of Gaussian vectors, asymptotic consistency and an efficient score representation of the estimators are obtained under the assumption that the effective dimension grows at a rate slower than the sample size. These results are derived through an explicit use of the intrinsic geometry of the parameter space, which is non-Euclidean. Moreover, the results derived in this paper suggest an asymptotic equivalence between inference for functional data with dense measurements and inference for high-dimensional Gaussian vectors.

Keywords: Functional data analysis, principal component analysis, high-dimensional data, Stiefel manifold, intrinsic geometry, consistency.
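
For orientation, the display below sketches a standard reduced-rank measurement model for sparsely and irregularly observed functional data, the kind of setting to which the REML approach discussed in the abstract applies. The notation (Y_{ij}, T_{ij}, lambda_k, psi_k, sigma, r, m_i) is ours and is intended only as an illustration of this type of model, not as a verbatim statement of the paper's assumptions.

\[
Y_{ij} \;=\; X_i(T_{ij}) + \sigma\,\varepsilon_{ij},
\qquad
X_i(t) \;=\; \mu(t) + \sum_{k=1}^{r} \sqrt{\lambda_k}\,\xi_{ik}\,\psi_k(t),
\qquad j = 1,\dots,m_i,\ \ i = 1,\dots,n,
\]

where \(\psi_1,\dots,\psi_r\) are orthonormal eigenfunctions of the covariance kernel, \(\lambda_1 \ge \cdots \ge \lambda_r > 0\) are the corresponding eigenvalues, and \(\xi_{ik}\), \(\varepsilon_{ij}\) are independent standard Gaussian variables. In a model of this form, REML estimation of \((\lambda_k, \psi_k)\) amounts to maximizing the Gaussian likelihood of the observation vectors \((Y_{i1},\dots,Y_{im_i})\) over the eigenvalues and over an orthonormality-constrained coefficient matrix, i.e. over a Stiefel manifold, which is the non-Euclidean parameter space whose intrinsic geometry is referred to in the abstract.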