---
title: Second Order Path Variationals in Non-Stationary Online Learning
abstract: We consider the problem of universal dynamic regret minimization under
  exp-concave and smooth losses. We show that appropriately designed Strongly Adaptive
  algorithms achieve a dynamic regret of $\tilde O(d^2 n^{1/5} [\mathcal{TV}_1(w_{1:n})]^{2/5}
  \vee d^2)$, where $n$ is the time horizon and $\mathcal{TV}_1(w_{1:n})$ is a path
  variational based on second order differences of the comparator sequence. Such
  a path variational naturally encodes comparator sequences that are piece-wise
  linear – a powerful family that tracks a variety of non-stationarity patterns
  in practice (Kim et al., 2009). The aforementioned dynamic regret is shown to be
  optimal modulo dimension dependencies and poly-logarithmic factors of $n$. To the
  best of our knowledge, this path variational has not been studied in the
  non-stochastic online learning literature before. Our proof techniques rely on
  analysing the KKT conditions of the offline oracle and require several non-trivial
  generalizations of the ideas in Baby and Wang (2021), where the latter work only
  implies an $\tilde{O}(n^{1/3})$ regret for the current problem.
section: Regular Papers
layout: inproceedings
series: Proceedings of Machine Learning Research
publisher: PMLR
issn: 2640-3498
id: baby23a
month: 0
tex_title: Second Order Path Variationals in Non-Stationary Online Learning
firstpage: 9024
lastpage: 9075
page: 9024-9075
order: 9024
cycles: false
bibtex_author: Baby, Dheeraj and Wang, Yu-Xiang
author:
- given: Dheeraj
  family: Baby
- given: Yu-Xiang
  family: Wang
date: 2023-04-11
address:
container-title: Proceedings of The 26th International Conference on Artificial Intelligence
  and Statistics
volume: '206'
genre: inproceedings
issued:
  date-parts:
  - 2023
  - 4
  - 11
pdf:
extras:
---
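
For context, here is a hedged sketch of the path variational named in the abstract. The paper's exact normalization is not reproduced in this file, so the scaling below is an assumption: a second-order path variational measures the discrete second differences of the comparator sequence,

% assumed form; the paper may normalize this quantity differently
\[
  \mathcal{TV}_1(w_{1:n}) \;=\; \sum_{t=2}^{n-1} \bigl\| w_{t-1} - 2\,w_t + w_{t+1} \bigr\|_1 ,
\]

which vanishes exactly when $w_1, \dots, w_n$ lie on a single line; a small value therefore certifies a comparator sequence that is close to piece-wise linear, matching the family highlighted in the abstract.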