
Title: [Academic Seminar] [Department Seminar] Department Seminar on Friday, September 7, 11:00 AM
Posted: 2018-09-04 17:29:44
Content: [Department Seminar] Department Seminar on Friday, September 7, 11:00 AM

▪ Title: Recent Works on Bayesian Networks
▪ Speaker: Gunwoong Park (박건웅), Department of Statistics, University of Seoul
▪ Date/Time: Friday, September 7, 2018, 11:00 AM – 12:00 PM
▪ Venue: Building 25, Room 405



Abstracts
Learning Large-Scale Poisson DAG Models based on OverDispersion Scoring

In this paper, we address the question of identifiability and learning algorithms for large-scale Poisson Directed Acyclic Graphical (DAG) models. We define general Poisson DAG models as models where each node is a Poisson random variable with rate parameter depending on the values of its parents in the underlying DAG. First, we prove that Poisson DAG models are identifiable from observational data, and present a polynomial-time algorithm that learns the Poisson DAG model under suitable regularity conditions. The main idea behind our algorithm is based on overdispersion, in that variables that are conditionally Poisson are overdispersed relative to variables that are marginally Poisson. Our algorithm exploits overdispersion along with methods for learning sparse Poisson undirected graphical models for faster computation. We provide both theoretical guarantees and simulation results for small and large-scale DAGs.
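The identifiability argument rests on a simple mean-variance comparison: a root node is marginally Poisson, so its variance roughly equals its mean, while a node whose Poisson rate depends on a parent is marginally overdispersed. The Python sketch below is not the speaker's implementation; the two-node graph, the rate function exp(0.3·X1), and all constants are illustrative assumptions. It shows how an overdispersion score can single out a root node, the first step of an ordering-based learning procedure.

# Overdispersion check on a toy two-node Poisson DAG X1 -> X2.
# A marginally Poisson node has variance ~ mean (score ~ 0); a conditionally
# Poisson node is overdispersed (score > 0), so the smallest score flags a root.
import numpy as np

rng = np.random.default_rng(0)
n = 50_000

x1 = rng.poisson(lam=2.0, size=n)           # root: marginally Poisson
x2 = rng.poisson(lam=np.exp(0.3 * x1))      # child: Poisson only conditionally on X1

def overdispersion_score(x):
    """Sample variance minus sample mean; close to zero for a Poisson variable."""
    return x.var(ddof=1) - x.mean()

scores = {"X1": overdispersion_score(x1), "X2": overdispersion_score(x2)}
print(scores)                                # X1 near 0, X2 clearly positive
print("estimated root:", min(scores, key=scores.get))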

Learning Quadratic Variance Function (QVF) DAG Models via OverDispersion Scoring (ODS)

Learning DAG or Bayesian network models is an important problem in multivariate causal inference. However, a number of challenges arise in learning large-scale DAG models, including model identifiability and computational complexity, since the space of directed graphs is huge. In this paper, we address these issues in a number of steps for a broad class of DAG models where the noise or variance is signal-dependent. Firstly, we introduce a new class of identifiable DAG models, where each node has a distribution in which the variance is a quadratic function of the mean (QVF DAG models). Our QVF DAG models include many interesting classes of distributions such as Poisson, Binomial, Geometric, Exponential, Gamma, and many other distributions in which the noise variance depends on the mean. We prove that this class of QVF DAG models is identifiable, and introduce a new algorithm, the OverDispersion Scoring (ODS) algorithm, for learning large-scale QVF DAG models. Our algorithm is based on first learning the moralized or undirected graphical model representation of the DAG to reduce the DAG search space, and then exploiting the quadratic variance property to learn the ordering. We show through theoretical results and simulations that our algorithm is statistically consistent in the high-dimensional p > n setting, provided that the degree of the moralized graph is bounded, and performs well compared to state-of-the-art DAG-learning algorithms. We also demonstrate, through a real-data example involving multivariate count data, that our ODS algorithm is well-suited to estimating DAG models for count data in comparison to other methods used for discrete data.
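As a concrete illustration of the quadratic variance property that ODS exploits, the short Python sketch below (not taken from the paper; the particular distributions and parameters are illustrative assumptions) checks empirically that for several QVF families the variance equals ν1·μ + ν2·μ² for family-specific constants ν1, ν2. It is this relationship, applied conditionally on candidate parent sets taken from the moralized graph, that drives the ordering step.

# Empirical check of the quadratic variance function (QVF) property
# Var(X) = nu1 * E[X] + nu2 * E[X]^2 for a few QVF families.
import numpy as np

rng = np.random.default_rng(1)
n = 200_000

samples = {
    # family: (draws, theoretical (nu1, nu2))
    "Poisson(3)":        (rng.poisson(3.0, n),        (1.0, 0.0)),
    "Geometric(0.3)":    (rng.geometric(0.3, n) - 1,  (1.0, 1.0)),   # failures before first success
    "Binomial(10, 0.4)": (rng.binomial(10, 0.4, n),   (1.0, -0.1)),  # nu2 = -1/N
}

for name, (x, (nu1, nu2)) in samples.items():
    mu, var = x.mean(), x.var(ddof=1)
    print(f"{name:17s} empirical var = {var:6.3f}   "
          f"QVF prediction nu1*mu + nu2*mu^2 = {nu1 * mu + nu2 * mu ** 2:6.3f}")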
 
High-Dimensional Poisson DAG Model Learning Using ℓ1-Regularized Regression

In this paper, we develop a new approach for learning high-dimensional Poisson directed acyclic graphical (DAG) models from observational data without strong assumptions such as faithfulness and strong sparsity. A key component of our method is to decouple the ordering estimation and parent search, so that both problems can be efficiently addressed using ℓ1-regularized regression and the mean-variance relationship. We show that a sample size of n = Ω(d^(7/3) log^7 p) suffices for our polynomial-time MRS algorithm to recover the true graph, where p is the number of nodes and d is the maximum indegree. We verify through simulations that our algorithm is statistically consistent in the high-dimensional p > n setting, and performs well compared to the state-of-the-art ODS, GES, and MMHC algorithms.
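To make the decoupling concrete, the Python sketch below shows the parent-search half of such a procedure: given an estimated causal ordering, each node is regressed on its predecessors with an ℓ1 penalty, and the variables with non-zero coefficients are kept as parents. This is not the MRS implementation; for brevity it substitutes a Gaussian lasso on log1p-transformed counts (scikit-learn's Lasso) for the ℓ1-regularized Poisson regression used in the paper, and the toy graph, the assumed ordering, the penalty level, and the coefficient threshold are all illustrative assumptions.

# l1-regularized parent search on a toy Poisson DAG X1 -> X3 <- X2,
# assuming the ordering (X1, X2, X3) has already been estimated.
import numpy as np
from sklearn.linear_model import Lasso

rng = np.random.default_rng(2)
n = 5_000

x1 = rng.poisson(2.0, n)
x2 = rng.poisson(2.0, n)
x3 = rng.poisson(np.exp(0.4 * x1 + 0.4 * x2))
X = np.column_stack([x1, x2, x3]).astype(float)

ordering = [0, 1, 2]                 # assumed output of the ordering-estimation step
parents = {ordering[0]: []}
for k, j in enumerate(ordering[1:], start=1):
    preds = ordering[:k]             # candidate parents: all predecessors in the ordering
    fit = Lasso(alpha=0.05).fit(X[:, preds], np.log1p(X[:, j]))
    parents[j] = [p for p, b in zip(preds, fit.coef_) if abs(b) > 0.01]

print(parents)                       # expected: {0: [], 1: [], 2: [0, 1]}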
 
Attachment: 세미나 안내_180907_박건웅.hwp [16.5KB]