Dominic Richards
Verified email at spc.ox.ac.uk - Homepage
Title | Cited by | Year
Asymptotics of ridge(less) regression under general source condition
D Richards, J Mourtada, L Rosasco
International Conference on Artificial Intelligence and Statistics, 3889-3897, 2021
Cited by 77 · 2021
Graph-Dependent Implicit Regularisation for Distributed Stochastic Subgradient Descent
D Richards, P Rebeschini
Journal of Machine Learning Research 21 (34), 1-44, 2020
Cited by 24 · 2020
Decentralised learning with distributed gradient descent and random features
D Richards, P Rebeschini, L Rosasco
Proceedings of Machine Learning Research, 2020
Cited by 23* · 2020
Optimal Statistical Rates for Decentralised Non-Parametric Regression with Linear Speed-Up
D Richards, P Rebeschini
Advances in Neural Information Processing Systems 32 (NeurIPS 2019), 2019
Cited by 18 · 2019
Stability & Generalisation of Gradient Descent for Shallow Neural Networks without the Neural Tangent Kernel
D Richards, I Kuzborskij
Advances in Neural Information Processing Systems 34, 2021
Cited by 16 · 2021
Distributed Machine Learning with Sparse Heterogeneous Data
D Richards, S Negahban, P Rebeschini
Advances in Neural Information Processing Systems 34, 2021
Cited by 10* · 2021
Learning with Gradient Descent and Weakly Convex Losses
D Richards, M Rabbat
International Conference on Artificial Intelligence and Statistics, 1990-1998, 2021
Cited by 10 · 2021
Comparing Classes of Estimators: When does Gradient Descent Beat Ridge Regression in Linear Models?
D Richards, E Dobriban, P Rebeschini
arXiv preprint arXiv:2108.11872, 2021
Cited by 1 · 2021
Articles 1–8