Shaogao Lv


School of Statistics and Data Science, Nanjing Audit University

Name: Shaogao Lv

Highest degree: Ph.D.

Title: Professor

Research areas: Statistical machine learning and data mining

Courses taught: Machine Learning, Data Mining, and others

Office: Room 202, Chongzhen Building

Email: lvsg716@nau.edu.cn

Mailing address: No. 86 Yushan West Road, Pukou District, Nanjing

Postal code: 211815

 

Education

2002.09 - 2006.06  School of Mathematical Sciences, Henan Normal University, B.S. in Mathematics and Applied Mathematics

2006.09 - 2011.06  Joint Ph.D. program, University of Science and Technology of China & City University of Hong Kong, Ph.D. in Applied Mathematics

2014.08 - 2015.08  Academic exchange, The Institute of Statistical Mathematics (ISM), Japan

 

Work Experience

2011.09 - 2018.04  School of Statistics, Southwestern University of Finance and Economics, Associate Professor (doctoral supervisor)

2018.04 - present  School of Statistics and Mathematics, Nanjing Audit University, Professor and doctoral supervisor

 

Research Projects (Principal Investigator)

1. National Natural Science Foundation of China (NSFC) Young Scientists Fund: Nonparametric and semiparametric quantile regression models in the high-dimensional data setting (No. 11301421, 2014-2016)

2. NSFC Tianyuan Mathematical Fund: Theoretical study of multiple-kernel learning algorithms based on convex regularization (No. 11226111, 2012-2013)

3. University-level Special Fund for Central Universities (major theory track): Algorithm design for large-scale data (2015-2017)

4. University-level Special Fund for Central Universities (interdisciplinary innovation track): Model-free variable selection methods that scale to large-scale computation (2014-2015)

5. NSFC General Program: Distributed estimation and inference for semiparametric statistical models (No. 11871277, 2019-2022)

 

Research Interests

Statistical machine learning and data mining, with particular focus on the theoretical foundations and algorithm design of distributed learning, deep learning, and reinforcement learning.

 

Selected Recent Publications

Machine learning:

1.Shaogao Lv, Xin He* and Junhui Wang. (2023). Minimax kernel-based estimation for partially linear functional models. Journal of Machine Learning Research. Conditional Acceptance with Minor Revision.

2.Xingcai Zhou, Le Chang, Pengfei Xu and Shaogao Lv*. (2023). Communication-efficient and Byzantine-robust distributed learning with statistical guarantee. Pattern Recognition. 137, 109312.

3.Shaogao Lv* and Heng Lian. (2022). Debiased distributed learning for sparse partial linear models in high dimensions. Journal of Machine Learning Research, 23:1-32.

4.Shaogao Lv, Linsen Wei, Jiankun Liu and Yong Liu*. (2021). Improved learning rates of a functional Lasso-type SVM with sparse multi-kernel representation. 35th Conference on Neural Information Processing Systems (NeurIPS 2021), Sydney, Australia.

5.Shaogao Lv, Linsen Wei, Zenglin Xu*, Qian Zhang. (2021). Improved inference for imputation-based semi-supervised learning under misspecified setting. IEEE Transactions on Neural Networks and Learning Systems. DOI: 10.1109/TNNLS.2021.3077312.

6.Xingcai Zhou and Shaogao Lv*. (2021). Robust wavelet-based estimation for varying coefficient dynamic models under long-dependent structures. Analysis and Applications, 19, 1033-1057.

7.Yifan Xia, Yongchao Hou and Shaogao Lv*. (2021). Learning rates for partially linear support vector machine in high dimensions. Analysis and Applications, 19, 167-182.

8.Guo Niu, Zhengming Ma* and Shaogao Lv. (2017). Ensemble multiple-kernel based manifold regularization. Neural Processing Letters, 45, 539-552.

9.Lei Yang, Shaogao Lv and Junhui Wang. (2016). Model-free variable selection in reproducing kernel Hilbert space. Journal of Machine Learning Research, 17, 1-24.

10.Yunlong Feng, Shaogao Lv*, Hanyuan Han, Johan A.K. Suykens. (2016). Kernelized elastic net regularization: generalization bounds and sparse recovery. Neural Computation, 28, 525-562.

11.Shaogao Lv*. (2015). Refined generalization bounds of gradient learning over reproducing kernel Hilbert spaces. Neural Computation, 27, 1294-1320.

12.Shaogao Lv* and Fanyin Zhou. (2015). Optimal learning rates of L^p-type multiple kernel learning under general conditions. Information Sciences, 255-268.

Statistics:

1.Yibo Deng, Xin He, and Shaogao Lv*. (2023). Efficient learning nonparametric directed acyclic graph with statistical guarantee. Statistica Sinica. Minor revision.


2.Xin He, Junhui Wang and Shaogao Lv. (2021). Efficient kernel-based variable selection with sparsistency. Statistica Sinica. 31, 2123-2151.

3.Xin He, Shaogao Lv* and Junhui Wang. (2020). Variable selection for classification with derivative-induced regularization. Statistica Sinica, 30, 2075-2103.

4.Shaogao Lv, Zengyan Fan, Heng Lian, Taiji Suzuki, Kenji Fukumizu. (2020). A reproducing kernel Hilbert space approach to high dimensional partially varying coefficient model. Computational Statistics and Data Analysis, 152, 107039.

5.Heng Lian, Kefeng Zhao and Shaogao Lv. (2019). Projected spline estimation of the nonparametric function in high-dimensional partially linear models for massive data. Annals of Statistics, 47(5), 2922-2949.


6.Shaogao Lv, Huazhen Lin*, Heng Lian and Jian Huang. (2018). Oracle inequalities for sparse additive quantile regression in reproducing kernel Hilbert space. Annals of Statistics, 46, 781-813.

7.Xin He, Junhui Wang and Shaogao Lv*. (2018). Gradient-induced model-free variable selection with composite quantile regression. Statistica Sinica, 28, 1521-1538.

8.Shaogao Lv, Huazhen Lin*, Fanyin Zhou and Jian Huang. (2018). Oracle inequalities for high-dimensional additive Cox model. Scandinavian Journal of Statistics. DOI: 10.1111/sjos.12327.

9.Shaogao Lv, Huazhen Lin*, Heng Lian and Jian Huang. (2018). On sign consistency of Lasso for high-dimensional Cox model. Journal of Multivariate Analysis, 167, 79-96.

10.Huazhen Lin*, Lixian Pan, Shaogao Lv and Wenyang Zhang. (2018). Semiparametric efficient estimate of the generalized additive model with unknown link and variance. Journal of Econometrics, 202, 230-244.

11.Shaogao Lv, Xin He and Junhui Wang*. (2017). A unified penalized method for sparse additive quantile models: a RKHS approach. Annals of the Institute of Statistical Mathematics, 69, 897-923.