Creator |
Language |
Publisher |
Date of Issue |
Journal Title |
Volume |
Publication Type |
Access Rights |
Related DOI |
Related URI |
Related Information |
Abstract |
In the PAC-learning model, the Vapnik-Chervonenkis (VC) dimension plays the key role in estimating the polynomial-sample learnability of a class of binary functions. For a class of multi-valued functions, the notion has been generalized in various ways. This paper investigates the complexity of computing some of these generalized VC-dimensions: the $VC^*$-dimension, the $\Psi_*$-dimension, and the $\Psi_G$-dimension. For each dimension, we consider the decision problem that, given a matrix representing a class $F$ of functions and an integer $K$, asks whether the dimension of $F$ is greater than $K$. We prove that the $VC^*$-dimension problem is polynomial-time reducible to the satisfiability problem of length $J$ with $O(\log^2 J)$ variables, which includes the original VC-dimension problem as a special case. We also show that the $\Psi_G$-dimension problem is still reducible to the satisfiability problem of length $J$ with $O(\log^2 J)$ variables, while the $\Psi_*$-dimension problem becomes NP-complete.
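To make the decision problem concrete, here is a small brute-force sketch (not the paper's method) of the ordinary VC-dimension computation it generalizes. The representation is an assumption for illustration: the class $F$ is a 0/1 matrix whose rows are functions and whose columns are domain points, and a column set $S$ is shattered when the rows restricted to $S$ realize all $2^{|S|}$ patterns.

```python
from itertools import combinations

def vc_dimension(matrix):
    """Brute-force VC dimension of a binary function class.

    `matrix` lists the functions in F as rows; entry [i][j] is the
    0/1 value of function i at domain point j (one column per point).
    Returns the largest d such that some d columns are shattered,
    i.e. the rows restricted to those columns show all 2^d patterns.
    """
    if not matrix:
        return 0
    n_cols = len(matrix[0])
    best = 0
    for d in range(1, n_cols + 1):
        found = False
        for cols in combinations(range(n_cols), d):
            patterns = {tuple(row[c] for c in cols) for row in matrix}
            if len(patterns) == 2 ** d:
                found = True
                break
        if found:
            best = d
        else:
            # Every subset of a shattered set is shattered, so if no
            # d-set is shattered, no larger set can be: stop early.
            break
    return best

# Example: all 4 patterns appear on columns {0, 1}; column 2 is constant.
F = [
    [0, 0, 0],
    [0, 1, 0],
    [1, 0, 0],
    [1, 1, 0],
]
print(vc_dimension(F))  # -> 2
```

This exhaustive search is exponential in the matrix width, which is exactly why the abstract's complexity question is interesting: the reductions to satisfiability with $O(\log^2 J)$ variables place the $VC^*$- and $\Psi_G$-dimension problems well below the apparent NP-hardness that the $\Psi_*$-dimension problem actually exhibits.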