Stelian BRAD, Emilia BRAD


Many algorithms used nowadays in artificial intelligence need big data to train and test models, so as to ensure high accuracy and generality and to avoid the so-called underfitting problem. However, not all practical applications have sufficient data to train and test such models. This is a major challenge for the adoption of traditional machine learning or deep learning algorithms in areas where processes do not lend themselves to collecting big data. There are also cases where no big data is needed, but small data is of interest instead. In such cases, novel artificial intelligence algorithms are required to design models that can provide customized solutions based on small datasets. This paper highlights how TRIZ can be used to formulate inventive strategies for handling these two categories of problems.
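The underfitting problem mentioned above can be illustrated with a minimal sketch (using a small synthetic dataset assumed here for illustration, not data from the paper): a model that is too simple for the underlying process shows high error even on its own training set.

```python
# Hypothetical illustration of underfitting: fitting a constant model
# to data generated by a quadratic process yields high training error,
# while a model matching the generating process fits exactly.

def mse(predictions, targets):
    """Mean squared error over paired predictions and targets."""
    return sum((p - t) ** 2 for p, t in zip(predictions, targets)) / len(targets)

# Small synthetic dataset following y = x^2 (an assumption for illustration).
xs = [0, 1, 2, 3, 4]
ys = [x ** 2 for x in xs]

# Underfit model: always predicts the mean of y, ignoring x entirely.
mean_y = sum(ys) / len(ys)
underfit_error = mse([mean_y] * len(ys), ys)

# Adequate model: matches the quadratic generating process.
adequate_error = mse([x ** 2 for x in xs], ys)

print(underfit_error)   # high training error despite "fitting" the data
print(adequate_error)   # zero error: the model capacity matches the process
```

High error on the training data itself, as in the constant model above, is the signature of underfitting; with small datasets the converse risk, overfitting, also grows, which is why the standard big-data recipes break down in the small-data settings the paper targets.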





