Fundamentals of scaling theory in economics
Tutorial. – M.: Publishing House "SCIENTIFIC LIBRARY", 2021. – 272 p.
Methods of scaling theory are studied within the disciplines "Econometrics", "Modeling of Economic and Financial Processes", "System Analysis in Economics", and others.
The scaling methodology is part of such modern scientific areas as DATA SCIENCE, BIG DATA, DATA MINING, NeuroNet, Measurement Science, and other areas of artificial intelligence and modern measurement theory.
However, the modern theory of scaling, including methods for scaling and measuring non-quantitative indicators under uncertainty, has not yet been covered in these disciplines, even though such scaling and evaluation conditions are typical for complex economic systems when obtaining estimates and making management decisions in economic problems. This tutorial fills that gap. To implement scaling under conditions of uncertainty, a new type of measuring scale is proposed, called scales with dynamic constraints. The tutorial discusses the theoretical foundations and practical methods for constructing such scales.
Practical methods of scaling and examples of building scales are given for various economic applications in industry, finance, the banking sector, the social sphere, and territorial development management, along with new approaches to assessing indicators under uncertainty.
The tutorial is intended for the preparation of bachelor's and master's students; it will also be useful to scientists, graduate students, and specialists in the fields of measurement and intelligent information processing.
Bayesian intellectual technologies in the tasks of modeling the distribution law under uncertainty
monograph / S. V. Prokopchina. – M.: Publishing House "SCIENTIFIC LIBRARY", 2020. – 292 p.
(The publication was carried out with the financial support of the Russian Foundation for Basic Research under the project No. 20-17-00007)
The monograph is devoted to the methods and means of Bayesian mathematical statistics, in particular the approximation of the probability density of random variables and random processes by distributions of the Pearson system, which covers the main types of unimodal distributions used in scientific and practical problems. A unique section is devoted to determining distribution laws under conditions of significant uncertainty: processing small samples; inaccurate, incomplete, or fuzzy information; and numerous heterogeneous information flows. The methodological basis is the regularizing Bayesian approach, oriented toward processing information (both data and knowledge) under uncertainty. The monograph provides a methodology and analytical dependences for determining the metrological characteristics of the resulting solutions (accuracy, reliability, and others), and all algorithms are provided with full metrological support of their solutions. All analytical conclusions and methods are illustrated by examples of solving various applied problems; tables of calculated data and block diagrams of algorithms are of practical use. The book is intended for researchers, teachers, students, and postgraduates, as well as specialists in the field of analytical data processing.
Modeling the laws of distributions of random variables and processes in problems of econometrics
textbook / S. V. Prokopchina. – M.: Publishing House "SCIENTIFIC LIBRARY", 2019. – 260 p.
The manual discusses theoretical and practical issues of modeling the laws of distribution and estimation of the numerical characteristics of random variables and processes. The main attention is paid to the determination of the analytical form of the distribution law. As a system of approximating distributions, a system of Pearson curves was chosen that allows one to obtain analytical expressions for all known types of unimodal distributions. A unique aspect of the study guide is the inclusion of methods and means of determining the analytical form of the distribution law for small samples under conditions of significant uncertainty. The regularizing Bayesian approach is considered as the main methodology. Examples of determining the analytical form of distribution laws for econometric problems and other applications are given. The manual is intended for theoretical and practical training in the disciplines “Modeling of economic and financial processes”, “Econometrics”, “Business Informatics”.
Modeling of socio-economic systems under uncertainty
study guide and workshop, 2nd edition / Prokopchina S. V., Shcherbakov G. A., Efimov Yu. V.; edited by G. A. Shcherbakov. – M.: Publishing House "SCIENTIFIC LIBRARY", 2019. – 508 p.
Modern economic systems are unique, complex objects. They function and develop under the influence of various environmental factors and, in a way specific to each system, change in time and space. The dynamics of their behavior are difficult to predict, and the data needed to describe them are, as a rule, insufficient: incomplete, inaccurate, and heterogeneous, which together creates a situation of uncertainty. At the same time, practically useful models of economic systems should adequately reflect the properties and relationships of the modeled objects. For this reason, their effective modeling requires methods oriented toward working under uncertainty. Possible solutions to these problems are the subject of this tutorial. It is intended not only for readers interested in the theory of modeling economic systems; it is also an engaging workshop containing examples of the application of modern economic and mathematical methods to analytical and prognostic problems in various fields of the real economy. The main attention is paid to the regularizing Bayesian approach (RBA), which has been successfully applied since the 1980s to a wide range of technical and socio-economic problems. The workshop presents both the theoretical foundations of the RBA and examples of its application to modeling economic systems under uncertainty. The manual is equipped with questions for self-study and practical tasks and can be useful to scientists, teachers, graduate students, and students of higher educational institutions, as well as specialists in system modeling and the management of complex national economic facilities and complexes.
Methods and tools for modeling the distribution law under uncertainty
monograph / S. V. Prokopchina. – M.: Publishing House "SCIENTIFIC LIBRARY", 2018. – 252 p.
The monograph puts forward and defends the following main points. The approximation of the probability density of random variables or random processes should be carried out with a given accuracy and reliability, in accordance with a priori and a posteriori information about the type of the density under study. The approximation algorithm developed on the basis of the Bayesian decision rule meets these requirements. To ensure the specified accuracy and reliability of the approximation when the process is organized by means of electronic computers and hybrid computing complexes, the values of their main technical characteristics (sample size, the width of the differential histogram corridor, and the converter bit depth) should be selected from the required approximation accuracy, using the dependences obtained and the recommendations given in the monograph.
Soft Measurements and Computing. Volume V
monograph / edited by Doctor of Technical Sciences, Prof. S. V. Prokopchina. – M.: Publishing House "SCIENTIFIC LIBRARY", 2019. – 616 p.
The volume considers methods and means of intellectual information processing, models and their applications, information technologies in the design and manufacture of devices and control systems, multidimensional estimation of complex objects, a comprehensive assessment of the effectiveness of regional systems under uncertainty based on the regularizing Bayesian approach, and distributed and modular computing methods. The monograph is intended for researchers, graduate students, students, and other specialists working in the fields of soft computing and measurement and in the creation of methods and means of artificial intelligence.
Soft Measurements and Computing. Volume IV
monograph / edited by Doctor of Technical Sciences, Prof. S. V. Prokopchina. – M.: Publishing House "SCIENTIFIC LIBRARY", 2018. – 342 p.
Socio-economic ecosystems are considered in the context of dual space-time analysis, economic cybernetics, and soft measurements; system economics is examined in search of a unified platform for economic management, the organization of management, and the development of economic theory. The innovative development of Russia is analyzed: philosophical analysis, socio-humanitarian technologies for assembling entities in self-developing polysubject environments, and criteria for evaluating innovations in electronic culture. Particular attention is paid to energy management metrics and to models and methodologies of sustainable enterprise development. The monograph is intended for researchers, graduate students, students, and other specialists working in the fields of soft computing and measurement and in the creation of methods and means of artificial intelligence.
Soft Measurements and Computing. Volume III
monograph / edited by Doctor of Technical Sciences, Prof. S. V. Prokopchina. – M.: Publishing House "SCIENTIFIC LIBRARY", 2017. – 300 p.
The volume considers Bayesian networks, related models and their applications, information technologies in the design and manufacture of devices and control systems (applications of soft computing in industry), multidimensional object evaluation, and a comprehensive assessment of the effectiveness of regional projects under uncertainty, as well as the application of the regularizing Bayesian approach to assessing and improving the sustainability of enterprises. The monograph is intended for researchers, graduate students, students, and other specialists working in the fields of soft computing and measurement and in the creation of methods and means of artificial intelligence.
Soft Measurements and Computing. Volume II
monograph / edited by S. V. Prokopchina. – M.: Publishing House "SCIENTIFIC LIBRARY", 2017. – 416 p.
The application of fuzzy sets and soft computing in economics and finance is considered, including a fuzzy-logical system of balanced indicators, the assessment of economic risk, and the net present value and internal rate of return of cash flows with fuzzy payments. In addition, the volume analyzes soft rationing technologies at industrial enterprises under uncertainty, the interaction of cycles of various durations in the process of economic development, problems of economic management under polycyclicity, and soft measurements and soft computing in modeling the state of complex objects on the basis of expert knowledge. The monograph is intended for researchers, graduate students, students, and other specialists working in the fields of soft computing and measurement and in the creation of methods and means of artificial intelligence.
Soft Measurements and Computing. Volume I
monograph / edited by Prof. S. V. Prokopchina. – M.: Publishing House "SCIENTIFIC LIBRARY", 2017. – 490 p.
The volume considers soft approaches to measuring and managing complex systems, the regularizing Bayesian approach, and a theoretical path from fuzzy sets to soft estimates and synergetic artificial intelligence. Methods for constructing granular logical values and structures are analyzed, as well as the modeling of opinions and estimates of intelligent agents, from four-valued systems of modalities to non-classical measures and fuzzy distributions. The theory of entropy potentials is considered: its current state, development prospects, and practical applications. Further topics include the methodology of cognitive visualization of multidimensional data, fuzzy methods of self-organization of information systems, methods for increasing the efficiency of genetic algorithms, a general concept and approach to building distributed self-organizing information systems, and Bayesian networks, related models, and their applications. The monograph is intended for researchers, graduate students, students, and other specialists working in the fields of soft computing and measurement and in the creation of methods and means of artificial intelligence.