Global Trend Radar
arXiv cs.LG (Machine Learning) INT ai 2026-05-08 13:00

Structural Learning Theory: A Metric-Topology Factorization Approach


Analysis Results

Category
Education
Importance
59
Trend Score
18
Summary
arXiv:2602.07974v2 Announce Type: replace
Abstract: Learning in structured, multi-context, or non-stationary environments involves two orthogonal difficulties. The first is \emph{metric}: once the correct context is known, how hard is prediction within it? This is the domain of Statistical Learning Theory (SLT). The second is \emph{structural}: how many local contexts are required, and how can they be discovered from data? This paper develops \emph{Structural Learning Theory} (StrLT) for the structural axis. We introduce \emph{width}, the minimum number of jointly contractive and low-risk cells needed to cover a learning problem. Width is incomparable with VC dimension: either can diverge while the other remains bounded. We show that width induces a \emph{phase transition}: if the allocated number of cells \(K\) …
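The metric/structural distinction in the abstract can be made concrete with a toy sketch (entirely my own construction, not the paper's algorithm): a piecewise-linear target where no single linear predictor is low-risk, but covering the input space with two cells makes each piece trivially learnable. The minimum such cell count plays the role of the paper's "width" in this illustration.

```python
import numpy as np

# Toy illustration of the metric vs. structural axes (assumed setup,
# not taken from the paper): a two-context regression problem.
rng = np.random.default_rng(0)
x = rng.uniform(0.0, 2.0, 400)
y = np.where(x < 1.0, x, 2.0 - x)  # context 1: y = x; context 2: y = 2 - x

def linear_mse(xs, ys):
    """Closed-form least-squares linear fit; return its mean squared error."""
    A = np.column_stack([xs, np.ones_like(xs)])
    coef, *_ = np.linalg.lstsq(A, ys, rcond=None)
    return float(np.mean((A @ coef - ys) ** 2))

# One global cell: the best linear fit is roughly constant, so risk stays high.
global_mse = linear_mse(x, y)

# Two cells, one per context: each piece is exactly linear, so risk vanishes.
left, right = x < 1.0, x >= 1.0
local_mse = max(linear_mse(x[left], y[left]),
                linear_mse(x[right], y[right]))

print(global_mse, local_mse)
```

Within each cell the problem is easy in the classical SLT sense (a one-dimensional linear fit); the remaining difficulty is purely structural: knowing that two cells are needed and where their boundary lies.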