A PAC teaching model, under helpful distributions, is proposed which brings the classical ideas of teaching models into the PAC setting: a polynomial-sized teaching set is associated with each target concept; the criterion of success is PAC identification; an additional parameter, namely the inverse of the minimum probability assigned to any example in the teaching set, is associated with each distribution; and the running time of the learning algorithm takes this new parameter into account. An Occam's razor theorem and its converse are proved. Some classical classes of Boolean functions, such as decision lists and DNF and CNF formulas, are proved learnable in this model. Comparisons with other teaching models are made: learnability in the Goldman and Mathias model implies PAC learnability under helpful distributions; note that decision lists and DNF formulas are not known to be learnable in the Goldman and Mathias model. A new simple PAC model, where “simple” refers to Kolmogorov complexity, is introduced, and we show that most learnability results obtained within previously defined simple PAC models can be readily derived from more general results in our model.
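As a rough sketch of the running-time requirement described above (not quoted from the paper; the notation T(f) for the teaching set of a target concept f, D for the helpful distribution, and size(f) for the size of f is introduced here only for illustration), the additional parameter can be written in LaTeX as:

\[
  \sigma_D(f) \;=\; \frac{1}{\min_{x \in T(f)} D(x)},
  \qquad
  \mbox{running time} \;=\; \mathrm{poly}\!\left(1/\varepsilon,\; 1/\delta,\; \mathrm{size}(f),\; \sigma_D(f)\right),
\]

where, in the sense suggested by the abstract, D is helpful for f when every example of the teaching set T(f) receives nonzero probability, so that the parameter is finite.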
Keywords: PAC learning, teaching model, Kolmogorov complexity
@article{ITA_2001__35_2_129_0,
     author = {Denis, Fran\c{c}ois and Gilleron, R\'emi},
     title = {PAC learning under helpful distributions},
     journal = {RAIRO - Theoretical Informatics and Applications - Informatique Th\'eorique et Applications},
     pages = {129--148},
     publisher = {EDP-Sciences},
     volume = {35},
     number = {2},
     year = {2001},
     mrnumber = {1862459},
     zbl = {0992.68118},
     language = {en},
     url = {http://archive.numdam.org/item/ITA_2001__35_2_129_0/}
}
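For readers importing this record, a minimal LaTeX usage sketch follows; it assumes the entry above has been saved in a hypothetical file named refs.bib and that the document is compiled with BibTeX.

\documentclass{article}
\begin{document}
% Cite the record by its Numdam key, assuming it is stored in refs.bib.
Denis and Gilleron~\cite{ITA_2001__35_2_129_0} study PAC learning under helpful distributions.
\bibliographystyle{plain} % any standard bibliography style works here
\bibliography{refs}       % refs.bib is an assumed file name
\end{document}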
TY - JOUR
AU - Denis, François
AU - Gilleron, Rémi
TI - PAC learning under helpful distributions
JO - RAIRO - Theoretical Informatics and Applications - Informatique Théorique et Applications
PY - 2001
SP - 129
EP - 148
VL - 35
IS - 2
PB - EDP-Sciences
UR - http://archive.numdam.org/item/ITA_2001__35_2_129_0/
LA - en
ID - ITA_2001__35_2_129_0
ER -
%0 Journal Article
%A Denis, François
%A Gilleron, Rémi
%T PAC learning under helpful distributions
%J RAIRO - Theoretical Informatics and Applications - Informatique Théorique et Applications
%D 2001
%P 129-148
%V 35
%N 2
%I EDP-Sciences
%U http://archive.numdam.org/item/ITA_2001__35_2_129_0/
%G en
%F ITA_2001__35_2_129_0
Denis, François; Gilleron, Rémi. PAC learning under helpful distributions. RAIRO - Theoretical Informatics and Applications - Informatique Théorique et Applications, Volume 35 (2001) no. 2, pp. 129-148. http://archive.numdam.org/item/ITA_2001__35_2_129_0/
[1] D. Angluin, Learning Regular Sets from Queries and Counterexamples. Inform. and Comput. 75 (1987) 87-106.
[2] D. Angluin, Queries and Concept Learning. Machine Learning 2 (1988) 319-342.
[3] G.M. Benedek and A. Itai, Nonuniform Learnability, in ICALP (1988) 82-92.
[4] A. Blumer, A. Ehrenfeucht, D. Haussler and M.K. Warmuth, Occam's Razor. Inform. Process. Lett. 24 (1987) 377-380.
[5] R. Board and L. Pitt, On the Necessity of Occam Algorithms. Theoret. Comput. Sci. 100 (1992) 157-184.
[6] N.H. Bshouty, Exact Learning Boolean Functions via the Monotone Theory. Inform. and Comput. 123 (1995) 146-153.
[7] J. Castro and J.L. Balcázar, Simple PAC learning of simple decision lists, in ALT 95, 6th International Workshop on Algorithmic Learning Theory. Springer, Lecture Notes in Comput. Sci. 997 (1995) 239-250.
[8] J. Castro and D. Guijarro, PACS, simple-PAC and query learning. Inform. Process. Lett. 73 (2000) 11-16.
[9] F. Denis, Learning regular languages from simple positive examples. Machine Learning (to appear); Technical Report LIFL 321 - 1998, http://www.lifl.fr/denis
[10] F. Denis, C. D'Halluin and R. Gilleron, PAC Learning with Simple Examples, in 13th Annual Symposium on Theoretical Aspects of Computer Science. Springer-Verlag, Lecture Notes in Comput. Sci. 1046 (1996) 231-242.
[11] F. Denis and R. Gilleron, PAC learning under helpful distributions, in Proc. of the 8th International Workshop on Algorithmic Learning Theory (ALT-97), edited by M. Li and A. Maruoka. Springer-Verlag, Berlin, Lecture Notes in Comput. Sci. 1316 (1997) 132-145.
[12] E.M. Gold, Complexity of Automaton Identification from Given Data. Inform. and Control 37 (1978) 302-320.
[13] S.A. Goldman and M.J. Kearns, On the Complexity of Teaching. J. Comput. System Sci. 50 (1995) 20-31.
[14] S.A. Goldman and H.D. Mathias, Teaching a Smarter Learner. J. Comput. System Sci. 52 (1996) 255-267.
[15] T. Hancock, T. Jiang, M. Li and J. Tromp, Lower Bounds on Learning Decision Lists and Trees. Inform. and Comput. 126 (1996) 114-122.
[16] D. Haussler, M. Kearns, N. Littlestone and M.K. Warmuth, Equivalence of Models for Polynomial Learnability. Inform. and Comput. 95 (1991) 129-161.
[17] C. de la Higuera, Characteristic Sets for Polynomial Grammatical Inference. Machine Learning 27 (1997) 125-137.
[18] M. Kearns, M. Li, L. Pitt and L. Valiant, Recent Results on Boolean Concept Learning, in Proc. of the Fourth International Workshop on Machine Learning (1987) 337-352.
[19] M.J. Kearns and U.V. Vazirani, An Introduction to Computational Learning Theory. MIT Press (1994).
[20] M. Li and P.M.B. Vitányi, Learning simple concepts under simple distributions. SIAM J. Comput. 20 (1991) 911-935.
[21] M. Li and P.M.B. Vitányi, An introduction to Kolmogorov complexity and its applications, 2nd Edition. Springer-Verlag (1997).
[22] H.D. Mathias, DNF: If You Can't Learn 'em, Teach 'em: An Interactive Model of Teaching, in Proc. of the 8th Annual Conference on Computational Learning Theory (COLT'95). ACM Press, New York (1995) 222-229.
[23] B.K. Natarajan, Machine Learning: A Theoretical Approach. Morgan Kaufmann, San Mateo, CA (1991).
[24] B.K. Natarajan, On Learning Boolean Functions, in Proc. of the 19th Annual ACM Symposium on Theory of Computing. ACM Press (1987) 296-304.
[25] J. Oncina and P. García, Inferring regular languages in polynomial update time, in Pattern Recognition and Image Analysis (1992) 49-61.
[26] R. Parekh and V. Honavar, On the Relationships between Models of Learning in Helpful Environments, in Proc. Fifth International Conference on Grammatical Inference (2000).
[27] R. Parekh and V. Honavar, Learning DFA from simple examples, in Proc. of the 8th International Workshop on Algorithmic Learning Theory (ALT-97), edited by M. Li and A. Maruoka. Springer, Berlin, Lecture Notes in Artificial Intelligence 1316 (1997) 116-131.
[28] R. Parekh and V. Honavar, Simple DFA are polynomially probably exactly learnable from simple examples, in Proc. 16th International Conf. on Machine Learning (1999) 298-306.
[29] R.L. Rivest, Learning Decision Lists. Machine Learning 2 (1987) 229-246.
[30] K. Romanik, Approximate Testing and Learnability, in Proc. of the 5th Annual ACM Workshop on Computational Learning Theory, edited by D. Haussler. ACM Press, Pittsburgh, PA (1992) 327-332.
[31] S. Salzberg, A. Delcher, D. Heath and S. Kasif, Learning with a Helpful Teacher, in Proc. of the 12th International Joint Conference on Artificial Intelligence, edited by J. Mylopoulos and R. Reiter. Morgan Kaufmann, Sydney, Australia (1991) 705-711.
[32] R.E. Schapire, The Strength of Weak Learnability. Machine Learning 5 (1990) 197-227.
[33] A. Shinohara and S. Miyano, Teachability in Computational Learning. New Generation Computing 8 (1991).
[34] L.G. Valiant, A Theory of the Learnable. Commun. ACM 27 (1984) 1134-1142.