Chiong, Khai Xiang
Permanent URI for this collection: https://hdl.handle.net/10735.1/6573
Khai Xiang Chiong is an Assistant Professor of Marketing. His research interests include:
- Quantitative marketing
- Empirical industrial organization
- Econometrics
- Machine learning
Recent Submissions
Item: Random Projection Estimation of Discrete-Choice Models with Large Choice Sets (INFORMS, 2018-04-06)
Authors: Chiong, Khai Xiang (ORCID: 0000-0002-6713-8907); Shum, Matthew

We introduce random projection, an important dimension-reduction tool from machine learning, for the estimation of aggregate discrete-choice models with high-dimensional choice sets. First, the high-dimensional data are projected into a lower-dimensional Euclidean space using random projections. Second, estimation proceeds using cyclical monotonicity moment inequalities implied by the multinomial choice model; the estimation procedure is semiparametric and does not require explicit distributional assumptions on the random utility errors. Our procedure is justified via the Johnson-Lindenstrauss lemma: pairwise distances between data points are preserved under random projection. The estimator works well in simulations and in an application to a supermarket scanner data set.

Item: Estimation of Graphical Models Using the L₁,₂ Norm (John Wiley and Sons, 2018-09-14)
Authors: Chiong, Khai Xiang; Moon, H. R.

Gaussian graphical models have recently been used in economics to obtain networks of dependence among agents. A widely used estimator is the graphical least absolute shrinkage and selection operator (GLASSO), which amounts to maximum likelihood estimation regularized using the L₁ matrix norm on the precision matrix Ω. The L₁ norm is a LASSO penalty that controls for sparsity, i.e., the number of zeros in Ω. We propose a new estimator called structured GLASSO (SGLASSO) that uses the L₁,₂ mixed norm. The L₁,₂ penalty controls for the structure of the sparsity in Ω.
We show that when the network size is fixed, SGLASSO is asymptotically equivalent to an infeasible GLASSO problem that prioritizes sparsity recovery for high-degree nodes. Monte Carlo simulations show that SGLASSO outperforms GLASSO both in estimating the overall precision matrix and in estimating the structure of the graphical model. In an empirical illustration using a classic firms' investment data set, we obtain a network of firms' dependence that exhibits a core–periphery structure, with General Motors, General Electric, and US Steel forming the core group of firms.
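The dimension-reduction idea behind the random-projection paper above can be illustrated in a few lines: project high-dimensional data with a Gaussian random matrix and check that pairwise distances are approximately preserved, as the Johnson-Lindenstrauss lemma guarantees. This is a minimal NumPy sketch with made-up dimensions (`n`, `d`, `k` are illustrative choices, not values from the paper), not the authors' estimation procedure itself.

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative sizes: n data points in d dimensions, projected down to k.
n, d, k = 50, 1000, 200

X = rng.normal(size=(n, d))

# Gaussian random projection matrix with i.i.d. N(0, 1/k) entries -- one
# standard construction satisfying the Johnson-Lindenstrauss lemma.
R = rng.normal(scale=1.0 / np.sqrt(k), size=(d, k))
Z = X @ R  # projected data in R^k

def pairwise_dists(A):
    # Euclidean distances between all rows of A.
    sq = np.sum(A ** 2, axis=1)
    return np.sqrt(np.maximum(sq[:, None] + sq[None, :] - 2 * A @ A.T, 0.0))

D_orig = pairwise_dists(X)
D_proj = pairwise_dists(Z)

# Off-diagonal distance ratios should concentrate near 1,
# i.e., random projection approximately preserves geometry.
mask = ~np.eye(n, dtype=bool)
ratios = D_proj[mask] / D_orig[mask]
print(ratios.min(), ratios.max())
```

For these sizes the ratios typically fall within a modest band around 1; shrinking `k` widens the band, which is the trade-off between compression and distortion that the estimator exploits.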