ANN Tools for Number Series

Any mathematical pattern can be the generation principle of a number series. In contrast to most application fields of artificial neural networks (ANNs), a successful solution requires not only an approximation of the underlying function but also the correct prediction of the exact next number. We propose a dynamic learning approach and evaluate our method empirically on number series from the Online Encyclopedia of Integer Sequences. Finally, we investigate research questions about the performance of ANNs, their structural properties, and the architecture required to deal successfully with number series.

Solving number series poses a challenging problem for humans and artificial intelligence systems. The task is to correctly predict the next number in a given series, in accordance with a pattern inherent to that series. We propose a novel method based on Artificial Neural Networks with a dynamic learning approach to solve number series problems. Our method is evaluated in our own experiment and on over 50 000 number series from the Online Encyclopedia of Integer Sequences (OEIS) database.
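The task can be made concrete with a small sketch: a feed-forward network is trained on sliding windows of the given series and then asked for the next element, which is rounded because only the exact integer counts as a solution. The window size, hidden-layer size, learning rate, and scaling below are illustrative assumptions, not the published configuration.

    import numpy as np

    def solve_series(series, n_inputs=3, n_hidden=5, lr=0.05, epochs=3000):
        """Predict the next integer of a number series with a small MLP.

        Sliding windows of n_inputs consecutive values are the inputs,
        the following value is the target.  Values are scaled to keep the
        network in a well-behaved range; the prediction is rounded
        because the task demands the exact next number.
        """
        s = np.asarray(series, dtype=float)
        scale = max(1.0, float(np.abs(s).max()))
        s = s / scale

        # Build (window, next value) training pairs from the series itself.
        X = np.array([s[i:i + n_inputs] for i in range(len(s) - n_inputs)])
        y = s[n_inputs:].reshape(-1, 1)

        rng = np.random.default_rng(0)
        W1 = rng.normal(0.0, 0.5, (n_inputs, n_hidden))
        b1 = np.zeros(n_hidden)
        W2 = rng.normal(0.0, 0.5, (n_hidden, 1))
        b2 = np.zeros(1)

        for _ in range(epochs):
            h = np.tanh(X @ W1 + b1)        # hidden layer
            out = h @ W2 + b2               # linear output
            err = out - y                   # gradient of the squared error
            dh = (err @ W2.T) * (1.0 - h ** 2)
            W2 -= lr * (h.T @ err) / len(X)
            b2 -= lr * err.mean(axis=0)
            W1 -= lr * (X.T @ dh) / len(X)
            b1 -= lr * dh.mean(axis=0)

        last = s[-n_inputs:]                           # most recent window
        pred = (np.tanh(last @ W1 + b1) @ W2 + b2)[0]
        return int(round(pred * scale))

    # e.g. solve_series([2, 4, 6, 8, 10, 12]) aims to predict 14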

Dynamic Learning Approach for Solving Number Series

Positioning. The article, published as a book chapter in the Proceedings of the German Conference on Artificial Intelligence 2011, proposes to use Artificial Neural Networks to solve number series problems.

Research Question. Is it possible to develop a cognitive system able to solve number series problems from intelligence tests or the more than 50 000 problems in the Online Encyclopedia of Integer Sequences?

Method. Dynamic training method for Artificial Neural Networks

Results. Using a dynamic learning approach, the system can solve 26 951 of 57 524 number series.

Ragni, M., & Klein, A. (2011). Predicting numbers: An AI approach to solving number series. In S. Edelkamp & J. Bach (Eds.), KI-2011. Springer LNAI, Heidelberg.
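One plausible reading of the dynamic learning idea (an assumption for illustration, not necessarily the authors' exact scheme) is to hold out the last known element, grow the training budget until the network reproduces it exactly, and only then trust the extrapolation to the unknown next element. The sketch reuses the solve_series helper from above.

    def solve_dynamically(series, max_rounds=10, **net_kwargs):
        """Hold out the last known element and increase the training budget
        until the network reproduces it exactly; only then extrapolate to
        the truly unknown next element.  Returns None if no budget works.
        """
        known, target = list(series[:-1]), series[-1]
        for r in range(1, max_rounds + 1):
            if solve_series(known, epochs=1000 * r, **net_kwargs) == target:
                return solve_series(list(series), epochs=1000 * r, **net_kwargs)
        return None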

Solving Number Series and Artificial Neural Networks

Positioning. The article, published at the International Joint Conference on Computational Intelligence (IJCI 2011), investigates whether classical intelligence test problems can be solved and which parameters work best.

Research Question. Which architectural and formal properties influence whether an artificial neural network successfully solves number series?

Method. Formal analysis of artificial neural networks

Results. Systematically testing the parameters (number of input nodes, number of hidden nodes, and learning rate) shows that the structure of the Artificial Neural Network can determine whether a number series is solved: 2-4 input nodes and about 5-6 hidden nodes provide the best framework for solving number series. By allowing approximations (deviations of ±5), the results can be improved to about 39 000 solved number series of the Online Encyclopedia.

Ragni, M., & Klein, A. (2011b). Solving Number Series – Architectural Properties of Successful Artificial Neural Networks. In K. Madani, J. Kacprzyk, & J. Filipe (Eds.), NCTA 2011 – Proceedings of the International Conference on Neural Computation Theory and Applications (pp. 224–229). SciTePress.
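The reported parameter study can be pictured as a simple grid sweep over window size, hidden-layer size, and learning rate. The sketch below reuses the solve_series helper from above; the candidate values and the ±5 tolerance option follow the summary, while the scoring details are assumptions.

    from itertools import product

    def sweep(series_list, tolerance=0):
        """Count solved series per (input nodes, hidden nodes, learning rate).

        A series counts as solved if the prediction for its held-out last
        element is exact, or within +/- tolerance when approximations
        (e.g. deviations of 5) are accepted.
        """
        results = {}
        for n_in, n_hid, lr in product((2, 3, 4), (4, 5, 6, 7), (0.01, 0.05)):
            solved = 0
            for s in series_list:
                guess = solve_series(s[:-1], n_inputs=n_in, n_hidden=n_hid, lr=lr)
                if abs(guess - s[-1]) <= tolerance:
                    solved += 1
            results[(n_in, n_hid, lr)] = solved
        best = max(results, key=results.get)
        return best, results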

Solving a Complex Cognition Task

Positioning. The article, published in the journal Cognitive Systems Research, investigates an important complex cognition task: the dynamic stocks and flows task.

Method. Formal analysis; artificial neural networks; cognitive and computational modeling.

Results. The dynamic stocks and flows task – an exploration problem – is first formally generalized to a stocks and flows task with an arbitrary number of tanks that have to be controlled. As the task requires learning and high adaptability, the article proposes a computational approach based on artificial neural networks combined with heuristics. The system is evaluated on all problems from the DSF challenge and yields satisfactory results.

Ragni, M., Steffenhagen, F., & Klein, A. (2011). Generalized dynamic stock and flow systems: An AI approach. Cognitive Systems Research, 12(3-4), 309-320.
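A minimal sketch of the generalized setting, assuming vector-valued tank levels and a simple compensation heuristic; the published system combines such heuristics with a neural-network predictor of the environmental flow, which is omitted here, and the names and parameters are illustrative.

    import numpy as np

    def simulate_dsf(env_net_flows, goal=4.0, start=2.0):
        """Run a generalized dynamic stocks-and-flows episode.

        env_net_flows: array of shape (steps, n_tanks) with the environmental
        net inflow per tank and step, unknown to the controller in advance.
        The heuristic closes the gap to the goal level and compensates the
        last observed environmental flow; a learned predictor of that flow
        would replace last_env.
        """
        env = np.asarray(env_net_flows, dtype=float)
        steps, n_tanks = env.shape
        level = np.full(n_tanks, start)
        last_env = np.zeros(n_tanks)
        trace = []
        for t in range(steps):
            user_flow = (goal - level) - last_env   # control decision
            level = level + user_flow + env[t]      # tank update
            last_env = env[t]                       # observed afterwards
            trace.append(level.copy())
        return np.array(trace)

    # one tank, three steps: levels = simulate_dsf([[1.0], [2.0], [0.5]])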

Preferences in Relational Reasoning

Positioning. The article, published at the International Joint Conference on Computational Intelligence in 2012, proposes to use Artificial Neural Networks to predict preferred mental models in spatial reasoning based on the preferences of the point algebra.

Research Question. Can human preferences in reasoning be reproduced from their preferences in the point algebra?

Method. Cognitive Modeling with Artificial Neural Networks; empirical analysis

Results. The article shows that an ANN trained on the preferred conclusions of the point algebra is able to reproduce the preferred conclusions: 130 of 169 (76.9 %) for the Interval Algebra and 56 of 64 (87.5 %) for the Cardinal Direction Calculus could be correctly reproduced. The problem of selecting an adequate neural network architecture for a given task has recently moved more and more into the research focus; an analysis shows that 6 hidden nodes provide the most adequate architecture for this approach.

Ragni, M., & Klein, A. (2012). Deductive Reasoning – Using Artificial Neural Networks to Simulate Preferential Reasoning. In A. C. Rosa, A. D. Correia, K. Madani, J. Filipe, & J. Kacprzyk (Eds.), International Joint Conference on Computational Intelligence (pp. 635–638). SciTePress.
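The modelling idea can be sketched as a small classifier with the reported 6 hidden nodes: two one-hot encoded point-algebra relations go in, a softmax over candidate conclusions comes out, and the network is fit to (premise pair, preferred conclusion) examples. The encoding, sample data, and hyperparameters below are illustrative assumptions, not the empirical preference data used in the article.

    import numpy as np

    # Point-algebra relations between two points: before (<), equal (=), after (>).
    RELS = ['<', '=', '>']
    ONE_HOT = {r: np.eye(3)[i] for i, r in enumerate(RELS)}

    def train_preference_net(examples, n_hidden=6, lr=0.1, epochs=5000, seed=0):
        """Fit a small network mapping two premise relations to the preferred
        conclusion.  'examples' is a list of ((rel_ab, rel_bc), preferred_ac)
        tuples; the sample data below is made up for illustration.
        """
        X = np.array([np.concatenate([ONE_HOT[a], ONE_HOT[b]]) for (a, b), _ in examples])
        Y = np.array([ONE_HOT[c] for _, c in examples])
        rng = np.random.default_rng(seed)
        W1 = rng.normal(0.0, 0.5, (6, n_hidden)); b1 = np.zeros(n_hidden)
        W2 = rng.normal(0.0, 0.5, (n_hidden, 3)); b2 = np.zeros(3)
        for _ in range(epochs):
            h = np.tanh(X @ W1 + b1)
            logits = h @ W2 + b2
            p = np.exp(logits - logits.max(axis=1, keepdims=True))
            p /= p.sum(axis=1, keepdims=True)
            err = (p - Y) / len(X)                  # softmax cross-entropy gradient
            dh = (err @ W2.T) * (1.0 - h ** 2)
            W2 -= lr * (h.T @ err); b2 -= lr * err.sum(axis=0)
            W1 -= lr * (X.T @ dh); b1 -= lr * dh.sum(axis=0)

        def predict(rel_ab, rel_bc):
            h = np.tanh(np.concatenate([ONE_HOT[rel_ab], ONE_HOT[rel_bc]]) @ W1 + b1)
            return RELS[int(np.argmax(h @ W2 + b2))]
        return predict

    # Hypothetical training data: unambiguous compositions plus one assumed preference.
    # predict = train_preference_net([(('<', '<'), '<'), (('>', '>'), '>'), (('<', '>'), '<')])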