Jing Lei: Winners with Confidence: Discrete Argmin Inference with an Application to Model Selection

American Statistical Association (ASA), Section on Statistical Learning and Data Science (SLDS) September webinar: Winners with Confidence: Discrete Argmin Inference with an Application to Model Selection

Recorded: September 24, 2024

Presenter: Jing Lei is Professor of Statistics & Data Science at Carnegie Mellon University. He received his Bachelor of Science degree from the School of Mathematical Sciences at Peking University in China and obtained his PhD in statistics from UC Berkeley in 2010, before joining Carnegie Mellon in 2011. Jing's research focuses on providing rigorous insights into popular algorithms in practical contexts. He has done pioneering and foundational work on predictive inference, including conformal prediction and cross-validation. He has developed advanced theory and methods for high-dimensional matrix data, including sparse PCA and network data, with successful applications in single-cell multi-omics data analysis. He is also among the first researchers to study differential privacy in a statistical context. He is a fellow of the Institute of Mathematical Statistics (IMS) and the American Statistical Association (ASA). Jing received an NSF CAREER Award and the Gottfried E. Noether Young Researcher Award in 2016. In 2022, he was a recipient of the Leo Breiman Junior Award. In 2024, he was awarded the IMS Medallion Lectureship.

Abstract: We study the problem of finding the index of the minimum value of a vector from noisy observations. This problem is relevant in population/policy comparison, discrete maximum likelihood, and model selection. By integrating concepts and tools from cross-validation and differential privacy, we develop a test statistic that is asymptotically normal even in high-dimensional settings and allows for arbitrarily many ties in the population mean vector. The key technical ingredient is a central limit theorem for globally dependent data characterized by stability. We also propose a practical method for selecting the tuning parameter that adapts to the signal landscape.

For more information about ASA SLDS or to join, visit https://community.amstat.org/slds/home
https://www.amstat.org/
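To make the problem concrete, the sketch below illustrates the discrete argmin inference setup: given noisy observations of several candidates, construct a confidence set of indices that plausibly attain the minimum population mean. This is a minimal, conservative baseline using pairwise comparisons with a Bonferroni correction, not the test statistic developed in the talk; the function name and example data are illustrative assumptions.

```python
import numpy as np
from scipy import stats

def argmin_confidence_set(X, alpha=0.05):
    """Return the indices of columns plausibly attaining the minimum mean.

    X : (n, p) array of noisy observations, one column per candidate.
    Candidate j is kept unless some other column k has a significantly
    smaller mean, judged by paired differences with a Bonferroni
    correction. This is a conservative baseline for illustration only,
    not the cross-validation/differential-privacy-based statistic from
    the webinar.
    """
    n, p = X.shape
    keep = []
    for j in range(p):
        rejected = False
        for k in range(p):
            if k == j:
                continue
            d = X[:, j] - X[:, k]                      # paired differences
            t = np.sqrt(n) * d.mean() / (d.std(ddof=1) + 1e-12)
            pval = 1 - stats.norm.cdf(t)               # H0: mu_j <= mu_k
            if pval < alpha / (p - 1):                 # Bonferroni over p-1 tests
                rejected = True
                break
        if not rejected:
            keep.append(j)
    return keep

# Toy example (hypothetical data): column 0 has the smallest true mean.
rng = np.random.default_rng(0)
mu = np.array([0.0, 0.05, 0.5, 1.0])
X = rng.normal(mu, 1.0, size=(500, 4))
print(argmin_confidence_set(X))  # typically contains 0, often 1 as well (near tie)
```

In this baseline, near ties keep several indices in the confidence set, which hints at why handling arbitrarily many ties and choosing a tuning parameter adaptively, as in the talk, is the harder part of the problem.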
