On the choice of basic regression functions and machine learning

Authors

  • Sergey M. Ermakov, St Petersburg State University, 7–9, Universitetskaya nab., St Petersburg, 199034, Russian Federation
  • Svetlana N. Leora, St Petersburg State University of Economics, 30–32, nab. kanala Griboedova, St Petersburg, 191023, Russian Federation

DOI:

https://doi.org/10.21638/spbu01.2022.102

Abstract

As is known, regression analysis is widely used in machine learning problems, since it allows one to establish relationships between observed data and to store information compactly. Most often, the regression function is described by a linear combination of selected functions f_j(X), j = 1, …, m, X ∈ D ⊂ R^s. If the observed data contain a random error, then the regression function reconstructed from them contains both a random error and a systematic error that depends on the chosen functions f_j. The article points out the possibility of an optimal choice of the functions f_j, in the sense of a given functional metric, when it is known that the true dependence satisfies some functional equation. In some cases (regular grids, s ≤ 2), similar results can be obtained by methods of random process analysis. The numerical examples given in the article illustrate the broader possibilities of this approach to constructing the regression function.
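To make the setting concrete, below is a minimal sketch (not taken from the article) of least-squares fitting with a chosen basis f_j. The true dependence exp(-2x), the two candidate bases, the grid and the noise level are illustrative assumptions; the authors' optimal-choice procedure based on a functional equation for the true dependence is not reproduced here.

    # Minimal sketch: fit y ~ sum_j c_j * f_j(x) by least squares for two
    # different basis sets and compare the systematic (approximation) error.
    # All concrete choices (true function, bases, grid, noise) are assumptions.
    import numpy as np

    rng = np.random.default_rng(0)

    # Observation design: a regular grid on D = [0, 1] (s = 1) with additive noise.
    x = np.linspace(0.0, 1.0, 50)
    y_true = np.exp(-2.0 * x)          # assumed true dependence
    y_obs = y_true + rng.normal(scale=0.05, size=x.size)

    def fit(basis, x, y):
        """Least-squares coefficients and fitted values for y ~ sum_j c_j * f_j(x)."""
        F = np.column_stack([f(x) for f in basis])   # design matrix
        coef, *_ = np.linalg.lstsq(F, y, rcond=None)
        return coef, F @ coef

    # Two candidate bases of equal size m = 3.
    poly_basis = [lambda t: np.ones_like(t), lambda t: t, lambda t: t**2]
    exp_basis = [lambda t: np.ones_like(t), lambda t: np.exp(-t), lambda t: np.exp(-2.0 * t)]

    for name, basis in [("polynomial", poly_basis), ("exponential", exp_basis)]:
        _, y_hat = fit(basis, x, y_obs)
        # Systematic error: deviation of the noiseless fit from the true dependence.
        sys_err = np.max(np.abs(fit(basis, x, y_true)[1] - y_true))
        print(f"{name:12s} max |residual| = {np.max(np.abs(y_hat - y_obs)):.4f}, "
              f"systematic error = {sys_err:.4f}")

With m fixed, the polynomial basis leaves a noticeably larger systematic error than the exponential basis, which contains the assumed true dependence; this dependence of the error on the chosen basis is the effect the article addresses.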

Keywords:

regression analysis, approximation, basis functions, operator method, machine learning


References


1. Draper N., Smith H. Prikladnoi regressionnyi analiz. 3rd ed. Kiev, Dialectica Publ. (2016). (In Russian) [Eng. transl.: Draper N., Smith H. Applied Regression Analysis. 3rd ed. New York, Wiley (1998)].

2. Gokhberg I.Ts., Kreyn M.G. Vvedenie v teoriiu lineinykh nesamosopriazhennykh operatorov. Moscow, Nauka Publ. (1965). (In Russian) [Eng. transl.: Gokhberg I.Ts., Kreyn M.G. Introduction to the theory of linear non-self-adjoint operators in a Hilbert space. In Ser.: Translations of Mathematical Monographs, vol. 18, AMS (1969)].

3. Donskoy V. I. Machine Learning and Learnability: Comparative Survey. Intellectual Archive, no. 933, 1–19 (2012). (In Russian)

4. Usevich K.D. Decomposition of functions in the 2D-extension of SSA and related systems of partial differential equations. Vestnik of Saint Petersburg University. Applied Mathematics. Computer Science. Control Processes, iss. 3, 151–160 (2009). (In Russian)

5. Ermakov S.M., Kotova L.Yu. On the choice of basic functions in regression analysis. In: Collection of works of the Department of Statistical Modeling of St Petersburg State University, 3–43 (1999). (In Russian)

6. Samarskiy A.A. The theory of difference schemes. Moscow, Nauka Publ. (1989). (In Russian)

7. Golyandina N. E., Usevich K.D. 2D-SSA Method for analysis of two-dimensional fields. In: Proceedings of the VII International Conference “System Identification and Control Problems” SICPRO’08, Moscow, 1657–1727 (2008). (In Russian)

Published

2022-04-10

How to Cite

Ermakov, S. M., & Leora, S. N. (2022). On the choice of basic regression functions and machine learning. Vestnik of Saint Petersburg University. Mathematics. Mechanics. Astronomy, 9(1), 11–22. https://doi.org/10.21638/spbu01.2022.102

Section

Mathematics
