|Title||[Academic Seminar] [Department Seminar] Department Seminar Announcement, Thursday, December 5|
|Content||[Department Seminar] Department Seminar Announcement, Thursday, December 5
▪Title : Fixed Support Positive Definite Modification of Covariance Matrix by Minimizing Matrix Norm
▪Speaker : 조성훈 (Department of Statistics, Seoul National University)
▪Date/Time : Thursday, December 5, 2019, 17:00 – 19:00
▪Venue : Room 210, Building 25
In this talk, we propose a simple modification that makes a regularized covariance matrix positive definite (PD) while preserving both its support and its convergence rate in matrix norm. Our proposal extends the fixed support positive definite (FSPD) modification of Choi et al. (2019) from the spectral and Frobenius norms to a general matrix norm. We consider a convex combination of the initial estimator (the regularized covariance matrix without PDness) and a diagonal matrix of a given form, and minimize the distance between the initial estimator and the convex combination. We derive a closed-form expression for the modification. Like the original FSPD, the proposed modification is optimization-free and generic, in the sense that it can be applied to any non-PD matrix, including precision and correlation matrices. Most importantly, the matrix-norm error of the covariance matrix estimator allows us to bound the error of any multivariate procedure that relies on it. An example is the minimum variance portfolio (MVP) optimization problem, where the error of the optimal weight is bounded by the matrix-norm error of the plug-in covariance matrix estimator. We illustrate the MVP results with daily returns of S&P 500 data from Jan. 1978 to Dec. 2014.
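The linear-shrinkage construction described in the abstract can be sketched in a few lines. The version below follows the original spectral-norm FSPD recipe of Choi et al. (2019) — mix the initial estimator with a diagonal target μI until the smallest eigenvalue reaches a floor ε — rather than the talk's matrix-norm-minimizing variant, whose closed form is not given here; the function name and the default choice of μ are illustrative assumptions.

```python
import numpy as np

def fspd_modify(S, eps=1e-2, mu=None):
    """FSPD-style modification (spectral-norm variant, Choi et al. 2019):
    return (1 - a) * S + a * mu * I, which is PD with smallest eigenvalue
    >= eps and keeps the support (zero pattern) of S, since the convex
    combination only rescales off-diagonal entries."""
    lam_min = np.linalg.eigvalsh(S)[0]   # smallest eigenvalue
    if lam_min >= eps:                   # already sufficiently PD
        return S.copy()
    if mu is None:
        # Diagonal target level; an assumed (illustrative) choice.
        mu = max(eps, float(np.mean(np.diag(S))))
    # Closed-form mixing weight: smallest a in (0, 1] that lifts
    # the minimum eigenvalue of the combination up to eps.
    a = (eps - lam_min) / (mu - lam_min)
    return (1 - a) * S + a * mu * np.eye(S.shape[0])
```

For example, S = [[1, 2], [2, 1]] has eigenvalues −1 and 3, so it is not PD; the modification returns a matrix with the same zero pattern whose smallest eigenvalue equals ε, which is exactly the optimization-free, generic behavior the abstract highlights.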
Attachment: 세미나 안내_191205_조성훈.hwp [15KB]