RAIRO-Theor. Inf. Appl.
Volume 33, Number 1, January-February 1999
Pages 1-19
Published online 15 August 2002
  1. D. Angluin, Identifying languages from stochastic examples. Internal Report YALEU/DCS/RR-614 (1988).
  2. M. Anthony and N. Biggs, Computational learning theory. Cambridge University Press, Cambridge (1992).
  3. R.C. Carrasco and J. Oncina, Learning stochastic regular grammars by means of a state merging method, in Grammatical Inference and Applications, R.C. Carrasco and J. Oncina, Eds., Springer-Verlag, Berlin, Lecture Notes in Artificial Intelligence 862 (1994).
  4. M.A. Castaño, F. Casacuberta and E. Vidal, Simulation of stochastic regular grammars through simple recurrent networks, in New Trends in Neural Computation, J. Mira, J. Cabestany and A. Prieto, Eds., Springer-Verlag, Lecture Notes in Computer Science 686 (1993) 210-215.
  5. T.M. Cover and J.A. Thomas, Elements of information theory. John Wiley and Sons, New York (1991).
  6. W. Feller, An introduction to probability theory and its applications. John Wiley and Sons, New York (1950).
  7. K.S. Fu, Syntactic pattern recognition and applications. Prentice Hall, Englewood Cliffs, N.J. (1982).
  8. C.L. Giles, C.B. Miller, D. Chen, H.H. Chen, G.Z. Sun and Y.C. Lee, Learning and extracting finite state automata with second order recurrent neural networks. Neural Computation 4 (1992) 393-405.
  9. E.M. Gold, Language identification in the limit. Inform. and Control 10 (1967) 447-474.
  10. W. Hoeffding, Probability inequalities for sums of bounded random variables. Amer. Statist. Association J. 58 (1963) 13-30.
  11. J.E. Hopcroft and J.D. Ullman, Introduction to automata theory, languages and computation. Addison-Wesley, Reading, Massachusetts (1979).
  12. K. Lang, Random DFA's can be approximately learned from sparse uniform examples, in Proc. of the 5th Annual ACM Workshop on Computational Learning Theory (1992).
  13. F.J. Maryanski and T.L. Booth, Inference of finite-state probabilistic grammars. IEEE Trans. Comput. C-26 (1977) 521-536.
  14. J. Oncina and P. García, Inferring regular languages in polynomial time, in Pattern Recognition and Image Analysis, N. Pérez de la Blanca, A. Sanfeliu and E. Vidal, Eds., World Scientific (1992).
  15. J.B. Pollack, The induction of dynamical recognizers. Machine Learning 7 (1991) 227-252.
  16. A.S. Reber, Implicit learning of artificial grammars. J. Verbal Learning and Verbal Behavior 6 (1967) 855-863.
  17. D. Ron, Y. Singer and N. Tishby, On the learnability and usage of acyclic probabilistic finite automata, in Proc. of the 8th Annual Conference on Computational Learning Theory (COLT'95), ACM Press, New York (1995) 31-40.
  18. A.W. Smith and D. Zipser, Learning sequential structure with the real-time recurrent learning algorithm. Internat. J. Neural Systems 1 (1989) 125-131.
  19. A. Stolcke and S. Omohundro, Hidden Markov model induction by Bayesian model merging, in Advances in Neural Information Processing Systems 5, C.L. Giles, S.J. Hanson and J.D. Cowan, Eds., Morgan Kaufmann, Menlo Park, California (1993).
  20. A. van der Mude and A. Walker, On the inference of stochastic regular grammars. Inform. and Control 38 (1978) 310-329.
  21. R.L. Watrous and G.M. Kuhn, Induction of finite-state languages using second-order recurrent networks. Neural Computation 4 (1992) 406-414.