Inaugural Wenlan Lecture Series on Economic Measurement and Applied Econometrics: Associate Researcher Chen Ya (Hefei University of Technology), "DEA with shrinkage techniques for dimension reduction in 'big data' contexts"

Published by Chen Danni | 2019-10-12

Topic | DEA with shrinkage techniques for dimension reduction in 'big data' contexts

Time | Oct 25, 16:00–16:30

Venue | Meeting Room 809, Wenhuan Building (文澴楼)


Speaker

Dr. Chen Ya is an Associate Researcher at the School of Economics, Hefei University of Technology. Dr. Chen's main research areas are efficiency and productivity analysis, nonparametric evaluation theory and methods, environmental economics, and efficiency evaluation in healthcare and banking, with papers published in internationally renowned journals including European Journal of Operational Research, Annals of Operations Research, Omega, Energy Policy, Information Systems and Operational Research, and Journal of Cleaner Production.


Research Interests

Efficiency and productivity analysis, nonparametric evaluation theory and methods, environmental economics, and efficiency evaluation in healthcare and banking


Abstract

In data envelopment analysis (DEA), the curse of dimensionality may jeopardize the accuracy, or even the relevance, of results when the dimension of inputs and outputs is relatively large, even for relatively large samples. Recently, an approach based on the least absolute shrinkage and selection operator (LASSO) for variable selection was combined with sign-constrained convex nonparametric least squares (SCNLS), yielding LASSO-SCNLS, as a way to circumvent the curse of dimensionality. In this paper, we revisit this interesting approach by considering various data generating processes. We also explore a more advanced version of LASSO, the so-called Elastic Net approach, adapting it to DEA. Our Monte Carlo simulations suggest conclusions different from those in the previous literature: we find that none of the considered approaches clearly dominates the others. As the number of observations increases, the difference in performance across the approaches is not significant. In other words, it would be fairer to conclude that in many cases, when adapted to DEA, principal component analysis (PCA) does as well as, or even better than, LASSO. On the other hand, we also find evidence that LASSO-type approaches could be more useful for addressing "wide" big data contexts, reducing very large dimensions into sparser DEA models.
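
For a concrete picture of the pipeline the abstract describes, the minimal Python sketch below pairs an ordinary cross-validated LASSO regression (a simplified stand-in for the LASSO-SCNLS step; the paper's estimator is based on sign-constrained convex nonparametric least squares) with a standard input-oriented, constant-returns-to-scale (CCR) DEA model. The data generating process, the log-log LASSO, and all variable names are illustrative assumptions, not the paper's specification.

```python
# Minimal sketch: LASSO-based input selection followed by input-oriented
# CCR DEA. A plain LassoCV regression stands in for the paper's
# LASSO-SCNLS step; the data generating process is purely illustrative.
import numpy as np
from scipy.optimize import linprog
from sklearn.linear_model import LassoCV

rng = np.random.default_rng(0)

# Synthetic data: n DMUs, p candidate inputs, only the first 3 relevant.
n, p = 100, 10
X = rng.uniform(1.0, 10.0, size=(n, p))
y = (X[:, 0] ** 0.4 * X[:, 1] ** 0.3 * X[:, 2] ** 0.2
     * np.exp(-np.abs(rng.normal(0.0, 0.1, n))))  # output net of inefficiency

# Step 1: LASSO on logs (Cobb-Douglas-style) shrinks irrelevant inputs to 0.
lasso = LassoCV(cv=5).fit(np.log(X), np.log(y))
selected = np.flatnonzero(lasso.coef_ != 0)
if selected.size == 0:  # fall back to all inputs if LASSO drops everything
    selected = np.arange(p)

# Step 2: input-oriented CCR efficiency of each DMU, one LP per DMU.
def dea_ccr_input(X, Y):
    n_dmu = X.shape[0]
    scores = np.empty(n_dmu)
    for o in range(n_dmu):
        # Variables: [theta, lambda_1, ..., lambda_n]; minimize theta.
        c = np.r_[1.0, np.zeros(n_dmu)]
        # Inputs:  sum_j lambda_j x_ij - theta * x_io <= 0
        A_in = np.hstack([-X[o][:, None], X.T])
        # Outputs: -sum_j lambda_j y_rj <= -y_ro
        A_out = np.hstack([np.zeros((Y.shape[1], 1)), -Y.T])
        res = linprog(c,
                      A_ub=np.vstack([A_in, A_out]),
                      b_ub=np.r_[np.zeros(X.shape[1]), -Y[o]],
                      bounds=[(0, None)] * (n_dmu + 1))
        scores[o] = res.x[0]
    return scores

scores = dea_ccr_input(X[:, selected], y.reshape(n, 1))
print("inputs kept by LASSO:", selected)
print("mean CCR efficiency:", round(scores.mean(), 3))
```

The log-log regression mimics a Cobb-Douglas technology, so inputs whose LASSO coefficients shrink exactly to zero are dropped before the DEA linear programs are solved; this sparsification ahead of the efficiency estimation is the dimension-reduction idea the talk examines.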