Theory, its essence, structure and functions. Structure and functions of scientific theory

A scientific theory is a logically interconnected system of concepts and statements about the properties, relationships and laws of a certain set of idealized objects (point, number, material point, inertia, black body, ideal gas, actual infinity, socio-economic formation, consciousness, etc.). The purpose of a scientific theory is to introduce such basic ideal objects and statements about their properties and relationships (laws, principles) that from them one can then purely logically (i.e., mentally) derive (construct) the largest possible number of consequences which, under a suitable empirical interpretation, would correspond as adequately as possible to the observed data about some real domain of objects (natural, social, experimentally created, mental, etc.). The main structural elements of any scientific theory are: 1) initial objects and concepts; 2) derived objects and concepts (the connection between the derived and the initial concepts of the theory is established by defining the former, ultimately, only through the latter); 3) initial statements (axioms); 4) derived statements (theorems, lemmas), whose connection with the axioms is established by certain rules of inference; 5) metatheoretical foundations (picture of the world, ideals and norms of scientific research, general scientific principles, etc.). The first scientific theory in the history of knowledge was Euclidean geometry, built by ancient mathematicians over roughly three hundred years (VII-IV centuries BC) and culminating in a brilliant generalization in Euclid's "Elements". (See theory, science, idealization.)


SCIENTIFIC THEORY

the most developed form of organization of scientific knowledge, giving a holistic idea of the patterns and essential connections of the studied area of reality. Examples of scientific theories are the classical mechanics of I. Newton, the corpuscular and wave theories of light, Charles Darwin's theory of biological evolution, J. C. Maxwell's electromagnetic theory, the special theory of relativity, the chromosome theory of heredity, etc.

Science includes descriptions of facts and experimental data, hypotheses and laws, classification schemes, etc., but only a scientific theory combines all the material of science into holistic, surveyable knowledge about the world. It is clear that in order to construct a scientific theory, a certain amount of material about the objects and phenomena being studied must first be accumulated, so theories appear at a fairly mature stage in the development of a scientific discipline. For thousands of years humanity was familiar with electrical phenomena, yet the first scientific theories of electricity appeared only in the middle of the 18th century. At first, as a rule, descriptive theories are created, which provide only a systematic description and classification of the objects under study. For a long time, theories in biology, including Jean-Baptiste Lamarck's and Darwin's theories of evolution, were descriptive: they described and classified plant and animal species and their origins; D. Mendeleev's table of chemical elements was a systematic description and classification of the elements. And this is quite natural. When starting to study a certain area of phenomena, scientists must first describe these phenomena, single out their characteristics, and classify them into groups. Only after this does deeper research become possible, aimed at identifying causal relationships and discovering laws.

The highest form of development of science is considered to be an explanatory theory, which provides not only a description, but also an explanation of the phenomena being studied. Every scientific discipline strives to build precisely such theories. Sometimes the presence of such theories is seen as an essential sign of the maturity of science: a discipline can be considered truly scientific only when explanatory theories appear in it.

An explanatory theory has a hypothetico-deductive structure. The basis of a scientific theory is a set of initial concepts (quantities) and fundamental principles (postulates, laws) that involve only the initial concepts. It is this basis that fixes the angle from which reality is viewed and sets the area that the theory covers. The initial concepts and principles express the main, most fundamental connections and relationships of the area being studied, which determine all its other phenomena. Thus, the basis of classical mechanics is formed by the concepts of a material point, force and velocity and the three laws of dynamics; Maxwell's electrodynamics is based on his equations, which connect the basic quantities of this theory by certain relationships; the special theory of relativity is based on A. Einstein's equations, etc.
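As a minimal illustration (standard textbook formulations, assumed here rather than quoted from the source), the second law of dynamics connects the initial concepts of force, mass and acceleration, and one of Maxwell's equations connects the electric field with the charge density that produces it:

\[ \vec{F} = m\vec{a}, \qquad \nabla \cdot \vec{E} = \frac{\rho}{\varepsilon_0}. \]

The consequences discussed below are then obtained by purely logical and mathematical derivation from fundamental relations of this kind.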

Since the time of Euclid, the deductive-axiomatic construction of knowledge has been considered exemplary. Explanatory theories follow this pattern. However, if Euclid and many scientists after him believed that the starting points of a theoretical system were self-evident truths, then modern scientists understand that such truths are not easy to find, and the postulates of their theories serve as nothing more than assumptions about the underlying causes of phenomena. The history of science has provided quite a lot of evidence of our misconceptions, therefore the fundamental principles of the explanatory theory are considered as hypotheses, the truth of which still needs to be proven. Less fundamental laws of the field under study are deductively derived from the principles of the theory. That is why the explanatory theory is called “hypothetico-deductive”.

The initial concepts and principles of a scientific theory relate directly not to real things and events but to certain abstract objects which together form the idealized object of the theory. In classical mechanics this is a system of material points; in molecular-kinetic theory it is a set of chaotically colliding molecules, closed in a certain volume and represented as absolutely elastic balls, etc. These objects do not exist in reality by themselves; they are mental, imaginary objects. Nevertheless, the idealized object of a theory bears a certain relationship to real things and phenomena: it reflects properties of real things that have been abstracted or idealized. Such are an absolutely rigid or absolutely black body, an ideal mirror, an ideal gas, etc. By replacing real things with idealized objects, scientists abstract from secondary, insignificant properties and connections of the real world and single out in pure form what seems to them most important. The idealized object of a theory is much simpler than real objects, but it is precisely this that allows it to be given an exact mathematical description. When an astronomer studies the motion of planets around the Sun, he sets aside the fact that the planets are entire worlds with a rich chemical composition, an atmosphere, a core, etc., and treats them simply as material points characterized only by mass, distance from the Sun and momentum; but it is precisely thanks to this simplification that he is able to describe their motion in strict mathematical equations.

The idealized object of a scientific theory serves for the theoretical interpretation of its initial concepts and principles. The concepts and statements of a scientific theory have only the meaning that the idealized object gives them. This explains why they cannot be directly correlated with real things and processes.

The initial basis of a scientific theory also includes a certain logic - a set of inference rules and a mathematical apparatus. Of course, in most cases the usual classical two-valued logic is used as the logic of a scientific theory, but in some theories, for example in quantum mechanics, three-valued or probabilistic logic is sometimes used. Scientific theories also differ in the mathematical means they employ. Thus, the basis of a hypothetico-deductive theory includes a set of initial concepts and principles, an idealized object that serves for their theoretical interpretation, and a logico-mathematical apparatus. From this basis all other statements of the theory - laws of a lesser degree of generality - are obtained deductively. It is clear that these statements also speak of the idealized object.

The question of whether a scientific theory includes empirical data, the results of observations and experiments, and facts still remains open. According to some researchers, facts discovered thanks to a theory and explained by it should be included in the theory. According to others, facts and experimental data lie outside the theory, and the connection between theory and facts is effected by special rules of empirical interpretation. With the help of such rules the statements of the theory are translated into empirical language, which allows them to be verified by empirical research methods.

The main functions of a scientific theory include description, explanation and prediction. A scientific theory gives a description of a certain area of phenomena, certain objects, certain aspects of reality. Because of this, a scientific theory may turn out to be true or false, i.e. it may describe reality adequately or in a distorted way. A scientific theory must explain known facts, pointing out the essential connections that underlie them. Finally, a scientific theory predicts new, not yet known facts: phenomena, effects, properties of objects, etc. The discovery of facts predicted by a theory serves as confirmation of its fruitfulness and truth. A discrepancy between theory and facts, or the discovery of internal contradictions in a theory, gives an impetus to changing it - to refining its idealized object, to revising, clarifying or changing its individual provisions and auxiliary hypotheses, etc. In some cases these discrepancies lead scientists to abandon the theory and replace it with a new one. Literature: Nikiforov A.L. Philosophy of Science: History and Methodology. Moscow, 1998; Stepin V.S. Theoretical Knowledge. Moscow, 2000. A.L. Nikiforov


The term "theory" is used quite widely. Thus, mental activity in general is sometimes called theory. Often what is called a theory is actually a hypothesis. For example, Oparin's theory of the origin of life and other theories on this subject are hypotheses rather than theories in the proper sense of the word. Often a theory means a concept, a set of views or opinions of an individual, or a point of view on a certain issue; examples include Lysenko's theory, the "theory of violence," "racial theory," etc.

In the philosophy of science, a theory is a system of objective knowledge. The scientific definition of a theory is as follows: a theory is a qualitatively distinctive form of scientific knowledge, existing as a system of logically interconnected propositions that reflect the essential, i.e. natural, general and necessary, internal connections of a particular subject area.

From the point of view of scientific methodology, theory should be understood as true knowledge presented in the form of a system. What is theory as a system of knowledge?

Like any system, a theory is characterized by a certain composition, i.e. a set of elements that determine its conceptual content, and by a structure, i.e. a set of relationships and connections between its elements. The composition, or content, of a theory includes basic and special concepts, principles and laws, ideas, language, mathematical apparatus, and logical means. They constitute the epistemological structure of the theory.

All these elements of the content of a theory are arranged neither in an arbitrary order nor in a purely external way (as in a dictionary); they form a coherent system of connections in which concepts and statements are linked by the laws of logic, so that from one sentence, using the laws and rules of logic, other sentences can be deduced. This is the logical structure of the theory. It follows not from the subject area but from logical laws.

In accordance with the logical structure, three types of theories are distinguished: 1) axiomatic, 2) genetic, 3) hypothetico-deductive.

An axiomatic theory is constructed as follows: the initial propositions are accepted without proof, and all the others are deductively derived from them.

A genetic theory arises from the need to substantiate the initial propositions; therefore it indicates ways of obtaining these propositions, which, as a rule, are seen in induction.

A hypothetico-deductive theory is built from a hypothetically advanced general proposition, from which all other propositions are deductively derived.

Let us dwell in more detail on the epistemological structure of the theory.

The main and most important, as well as the initial element of the epistemological structure of the theory, is the principle that organically connects other elements of the theory into a single whole, into a coherent system.

A principle (from the Latin principium, beginning, basis) is understood in the theory of knowledge as the fundamental starting point of any concept, that which underlies a given body of knowledge.

In a scientific theory, the principle constitutes its fundamental basis, around which all its concepts, judgments, laws, etc. are synthesized, revealing, justifying and developing this principle. Thus, the theory of materialist dialectics is based on the principle of development. All its laws and categories serve to disclose the essence of development and its manifestation in all areas of reality, at different levels, under different conditions. Hence, as long as there is no synthesizing principle, there is no theory.

This position is well illustrated by the history of the formation of classical mechanics. Galileo managed to formulate a number of laws belonging to classical mechanics, including the law of inertia. However, he failed to create a logically coherent, unified theory. What existed was only a simple sum of disparate propositions, not united by a single synthesizing principle. The formation of the theory of classical mechanics was completed later by I. Newton, who took the law of inertia as fundamental and united around it all the concepts, laws and other principles of mechanics (dynamics, statics, kinematics, Kepler's laws, etc.).

When a contradiction arose between classical mechanics and the data obtained from the study of electromagnetic phenomena by Maxwell, Lorentz and Hertz, Einstein took up the problem. He wrote: "Gradually I began to despair of the possibility of getting to the bottom of the true laws through constructive generalizations of well-known facts. The longer and more desperately I tried, the more I came to the conclusion that only the discovery of a general formal principle could lead us to reliable results." Einstein was able to discover this principle only after ten years of reflection. It is the principle of relativity.

From the examples it is clear that the principle is not given ready-made at the beginning of the formation of the theory. This is preceded by a long process of studying the phenomena of the corresponding area of ​​reality covered by the theory being created. The formation of a theory essentially occurs after the principle has been found.

Typically, when creating a theory, a number of principles differing in degree of generality are used. They must be compatible with one another and satisfy two conditions: first, they must not stand in formal-logical contradiction with each other; second, a principle of a lesser degree of generality must concretize a principle of a greater degree of generality. The latter, as a rule, are philosophical positions. Such principles include the principle of development, the principle of interconnection, and the principle of the unity of the world. Philosophical principles play a very important guiding, methodological role in the creation of any scientific theory.

The value of a principle is determined by the degree of its development and truth. It is clear that a scientific theory cannot be built on the basis of false, unscientific or anti-scientific principles. Theologians also create their theories, but on the basis of false principles, and therefore their theories are not scientific.

In its synthesizing role, the principle resembles the idea discussed above. These concepts are quite close in their meaning and content, but still not identical. The idea is put forward before the hypothesis as abstract theoretical knowledge of the essence of the object of study in the most general approximation. The principle is already concrete theoretical knowledge that underlies a certain body of knowledge, thanks to which a knowledge system arises.

Laws occupy an important place in the epistemological structure of a theory. A law is a reflection of the essential, stable, recurring and necessary connections between the phenomena studied by the theory. A theory, as a rule, includes several laws of varying degrees of generality. The core of the theory is one or several laws that are relatively independent of each other and have equal standing. They are the most general and cannot be derived from the other laws of the theory.

The second group of laws of a theory consists of those that can be deduced from the first group but retain relative independence from one another in their action. The third group includes laws that can be derived from the second group, and so on, until consequences characterizing a specific phenomenon are obtained. Consequences make it possible to discover new properties and aspects of these phenomena, as well as previously unknown phenomena. Thus, Mendeleev predicted a number of elements purely theoretically, thanks to consequences of the periodic law.

The principle of the theory and the laws that reveal it, located at the top step of the hierarchical ladder discussed above, form the core of a scientific theory, its main essence.

The problem of recognizing the objective nature of laws is key in the methodology of science. Materialism recognizes the objective nature of the laws of science; objective idealism considers laws to be an expression of the world mind embodied in nature and society. This, in particular, is Hegel's position. More generally, we can say that objective idealism understands laws as a certain metaphysical, i.e. supernatural, essence standing beyond phenomena.

Subjective idealism, represented by G. Berkeley, did not recognize the existence of any general concepts, much less of objective laws. A more sophisticated position is taken by the neopositivists. For them, the mark of a law is the repeatability or regularity of phenomena detected in systematic observations. Thus, R. Carnap believes that "the laws of science are nothing more than statements that express these regularities as accurately as possible. If a certain regularity is observed at all times and in all places without exception, then it appears in the form of a universal law."

If regularities are established by comparing observations, then, Carnap believes, we obtain empirical laws. They do not have the certainty of logical and mathematical laws, but they do tell us something about the structure of the world. The laws of logic and mathematics tell us nothing about what would distinguish the actual world from some other possible world. Carnap argues that empirical laws are laws that can be confirmed directly by empirical observations.

Unlike them, theoretical laws deal with non-observable quantities. They are laws about objects such as molecules, atoms, electrons, protons, electromagnetic fields and other unobservable entities that cannot be measured in a simple, direct way. Theoretical laws are more general than empirical ones, but they are not formed by generalizing empirical laws. According to neopositivism, theoretical laws are formed by the subject of knowledge, the scientist. They are confirmed indirectly, through the empirical laws derived from the theory that includes these theoretical laws.

Thus, we can draw conclusions:

1) neopositivism does not consider the law to be a reflection of essence, but only a fixation of repetition;

2) empirical laws do not go beyond sensory experience and do not reach an abstract level;

3) theoretical laws are subjective in nature and are the result of the constructive activity of the scientist.

While neopositivism in its interpretation recognizes the existence of empirical laws, the earlier form of positivism - empirio-criticism, or Machism - regards a law simply as a description of events. Mach argued that science should ask not "why?" but "how?". Carnap explains this position by saying that earlier philosophers believed that merely describing how the world works was insufficient: they wanted a more complete understanding of essence by finding metaphysical causes behind phenomena, causes unattainable by the scientific method. To this the physicists who supported Machism replied: "Don't ask us 'why?' There is no answer other than that given by empirical laws." The empirio-criticists believed that the question "why?" touches on metaphysical matters, which are not the province of science. In this formulation, science was denied the right to penetrate into the essence of things. This means that positivism and neopositivism take the position of agnosticism.

Concepts are also an epistemological element of a theory. A concept is a form of thinking and a form of expressing scientific knowledge in which the most general, essential properties of objects and phenomena of reality, their most important connections and relationships, are recorded. Scientific concepts accumulate, as it were, all our knowledge about the essential properties of objects and phenomena; the most important connections and patterns are reflected and consolidated in them. We can say that all the basic scientific data constituting the content of a theory are concentrated in scientific concepts and expressed in the corresponding laws.

Concepts as forms of thinking are of the following types: concepts of everyday language, special scientific concepts, general scientific concepts, and philosophical concepts and categories, which are distinguished by the greatest degree of generality. The last three - special scientific, general scientific and philosophical - are not only forms of thinking but also forms of the theoretical level of knowledge as part of scientific theory.

Scientific picture of the world

The scientific picture of the world (NCM) can be defined as a concept expressing the evolution of everyday, scientific and philosophical ideas about nature, society, man and his knowledge, depending on the concrete historical methods and forms of cognitive activity and of social practice as a whole. The NCM develops as a comprehension of the images of the world that underlie human life, culture and practice; like any cognitive image, it simplifies, schematizes and interprets reality, while at the same time singling out essential, basic connections from the infinite variety of relationships.

The difficulties of analyzing the NCM as a value-laden, ideological form of knowledge are largely connected with the fact that it exists in science mainly implicitly, in texts and subtexts, in various unsystematized statements by scientists about the premises of a theory, and special methodological effort is needed to identify it. The NCM became the subject of special reflection in philosophical and scientific research in the second half of the 20th century; it is not always recognized as having the right to be an independent unit of knowledge and is sometimes treated as a metaphor, a kind of auxiliary illustrative image, etc. A substantive logico-epistemological analysis reveals that all three terms included in the concept of NCM - "world", "picture", "scientific" - are highly polysemantic and carry a significant philosophical and ideological load. In modern literature it is recognized that, although the term "world" is entirely legitimate, its correct use requires clarification and an awareness that the concept of "world" does not exist outside the framework of particular philosophical and scientific ideas and conceptions; as these change, the semantic meaning and methodological role of this concept also change. "World" is an evolving concept that captures the evolution of scientific and philosophical ideas about nature, society and knowledge, changing its scope and content depending on the concrete historical methods and forms of scientific activity and of social practice as a whole.

Another component of the concept of NCM is “picture”. It was this often literally understood term that for a long time kept ideas about NCM at an intuitive level, gave this concept a metaphorical meaning, and emphasized its sensory-visual nature. Obviously, the term “picture” is a tribute to early ideas about the synthesis of knowledge as a visual colorful picture of nature, into which each science adds colors and details.

In the 20th century M. Heidegger, reflecting on the picture of the world, posed questions to himself: "...why, when interpreting a certain historical era, do we ask about the picture of the world? Does each era of history have its own picture of the world, and in such a way that each time it is concerned with constructing its own picture of the world? Or is it only the modern European way of representing that asks the question of the picture of the world? What is a picture of the world? Apparently a picture of the world. But what is called the world here? What does the picture mean? The world appears here as a designation of existence as a whole. This name is not limited to the cosmos, to nature. History also belongs to the world. And yet even nature, history, and both together in their latent and aggressive interpenetration do not exhaust the world. This word also means the ground of the world, regardless of how its relationship to the world is conceived" (Heidegger M. The Age of the World Picture // Idem. Time and Being: Articles and Speeches. Moscow, 1993. P. 49).

For Heidegger, the "world" acts "as a designation of existence as a whole"; it is not limited to the cosmos and nature; history also belongs to the world. The picture of the world is not something copied; it is that at which man aims as something "set before himself"; it is not a picture of the world but "the world understood in the sense of such a picture". It is not the picture that turns from a medieval into a modern European one; rather, the very fact that the world becomes a picture at all marks the modern age, in which beings come to be understood as what is represented. In composing such a picture for himself, man brings himself onto the stage. This means that the transformation of the world into a picture is the same process as the transformation of man into a subject, a thinking-representing being possessing a "new freedom" and deciding independently what may be considered reliable and true. The more aggressively the subject behaves, the more irresistibly the science of the world turns into a science of man, into anthropology; hence only where the world becomes a picture does "humanism rise for the first time", and existence as a whole comes to be interpreted and assessed by man, which has come to be denoted by the word "worldview".

In modern knowledge, other terms are increasingly used instead of "picture": model, integral image, ontological scheme, picture of reality. These concepts, along with ideas about nature, its causality and laws, space and time, increasingly include ideas about man, his activity, his cognition and the social organization of his environment. This fact reflects two significant trends in the development of the NCM as a form of knowledge. First, the ways of synthesizing and integrating scientific knowledge are changing: a transition is taking place from the NCM as an image, a model, a visual picture to the NCM as a special, complexly structured logical form of scientific knowledge representing the world in its integrity. The first modification of the concept, "pictorialness", is found mainly in everyday consciousness and at the early stages of the development of science; the second, "model-likeness" and "integrality", in more developed and especially in modern science. Second, in historically changing NCMs the "function of visualization" has been performed not only by images and models but also by rather abstract constructions. It is known that Descartes' picture of the world had already lost its colors and become monochromatic, and as a result of Newton's work it became a drawing, a graph, a diagram of quantitative relationships between phenomena unambiguously reflecting reality, which was, in principle, a huge step forward. What happens is not a loss of visualizability but a change in its very nature and in the objects that perform this function; in particular, objects possessing operational clarity acquire the status of visual ones, since they have come to denote a certain fixed development of the conceptual apparatus, relations of principles, and methodological stereotypes.

Today the NCM is understood as one of the foundations of scientific research: a picture of the reality under study, presented in a special form of systematization of knowledge, which makes it possible to identify and interpret the subject of a science, its facts and theoretical schemes, new research problems and ways of solving them. It is through the NCM that fundamental ideas and principles are transferred from one science to another; it plays an increasingly important role not so much as a model or image of the world but as a synthesizing logical form of knowledge, which is more a theoretical conception than a picture of the world in the literal sense of the word. Thus the most studied, physical picture of the world characterizes the subject of physical research through the following ideas: about fundamental physical objects, about the typology of the objects studied in physics, about the general features of the interaction of objects (causality and the regularities of physical processes), and about the spatio-temporal characteristics of the physical world. A change in these ideas, caused by changes in practice and knowledge, leads to a restructuring and replacement of physical NCMs. Three historical types are known: the mechanical, the electrodynamic and the quantum-relativistic pictures of the world. The construction of the latter is not yet complete. When special pictures are included in the content of the general scientific picture of the world, this happens on the basis of philosophical ideas and principles and in close connection with the foundations of the theories of these sciences and the empirical layer of knowledge. It is important to note that one of the procedures for substantiating theoretical schemes is to correlate them with the picture of the world, whereby they are objectified, as is the interpretation of the equations expressing theoretical laws. The construction of a theory, in turn, refines the picture of the world. On the whole, the NCM performs several theoretical and methodological functions: it combines knowledge into a single whole, objectifies scientific knowledge and incorporates it into culture, and, finally, methodologically determines the paths and directions of the research process.

The experiment is carried out in order to test theoretical predictions. A theory is an internally consistent system of knowledge about a part of reality (the subject of the theory). The elements of the theory logically depend on each other. Its content is derived according to certain rules from a certain initial set of judgments and concepts - the basis of the theory.

There are many forms of non-empirical (theoretical) knowledge: laws, classifications and typologies, models, schemes, hypotheses, etc. Theory acts as the highest form of scientific knowledge. Each theory includes the following main components: 1) initial empirical basis (facts, empirical patterns); 2) basis - a set of primary conditional assumptions (axioms, postulates, hypotheses) that describe the idealized object of the theory; 3) logic of the theory - a set of rules of logical inference that are acceptable within the framework of the theory; 4) a set of statements derived in theory that constitute basic theoretical knowledge.

The components of theoretical knowledge have different origins. The empirical basis of the theory is obtained by interpreting experimental and observational data. The rules of logical inference are not definable within the framework of a given theory - they are derivatives of the metatheory. Postulates and assumptions are a consequence of rational processing of the products of intuition, not reducible to empirical foundations. Rather, postulates serve to explain the empirical basis of a theory.

The idealized object of the theory is a sign-symbolic model of a part of reality. The laws formed in theory actually describe not reality, but an idealized object.

According to the method of construction, axiomatic and hypothetico-deductive theories are distinguished. The former are built on a system of axioms that are necessary and sufficient and unprovable within the theory; the latter on assumptions that have an empirical, inductive basis. Theories are also distinguished as qualitative (constructed without a mathematical apparatus), formalized, and formal. Qualitative theories in psychology include A. Maslow's concept of motivation, L. Festinger's theory of cognitive dissonance, J. Gibson's ecological concept of perception, etc. Formalized theories, which use a mathematical apparatus in their structure, include D. Homans's theory of cognitive balance, J. Piaget's theory of intelligence, K. Lewin's theory of motivation, and J. Kelly's theory of personal constructs. A formal theory (there are few of them in psychology) is, for example, the stochastic theory of the Rasch test (IRT, item response theory), widely used in scaling the results of psychological and educational testing. V. A. Lefebvre's "model of a subject with free will" can (with certain reservations) be classified as a highly formalized theory.
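For orientation only, the Rasch model mentioned above is usually written (a standard formulation, assumed here rather than quoted from the source) as the probability that person i solves item j:

\[ P(X_{ij} = 1) = \frac{e^{\theta_i - b_j}}{1 + e^{\theta_i - b_j}}, \]

where \(\theta_i\) is the latent ability of the person and \(b_j\) the difficulty of the item. It is this fully formalized character that is meant when such a theory is called formal.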

A distinction is made between the empirical basis and the predictive power of a theory. A theory is created not only to describe the reality that served as the basis for its construction: the value of a theory lies in what phenomena of reality it can predict and to what extent this forecast will be accurate. Ad hoc theories (for a given case) are considered the weakest, allowing us to understand only those phenomena and patterns for which they were developed.

Followers of critical rationalism believe that experimental results that contradict the predictions of a theory should lead scientists to abandon it. However, in practice, empirical data that does not correspond to theoretical predictions can prompt theorists to improve the theory - to create “extensions”. A theory, like a ship, needs “survivability,” therefore, for every counterexample, for every experimental refutation, it must respond by changing its structure, bringing it into line with the facts.

As a rule, at any given time there are not one but two or more theories that explain experimental results equally successfully (within the limits of experimental error). For example, in psychophysics the threshold theory and the theory of sensory continuity coexist on equal terms. In personality psychology, several factor models of personality compete and have empirical support (G. Eysenck's model, R. Cattell's model, the "Big Five" model, etc.). In the psychology of memory, a similar status is held by the unified memory model and the conception based on the separation of sensory, short-term and long-term memory, etc.

The famous methodologist P. Feyerabend puts forward the "principle of tenacity": do not abandon an old theory; ignore even facts that clearly contradict it. His second principle is that of methodological anarchism: "Science is an essentially anarchist enterprise: theoretical anarchism is more humane and progressive than its law-and-order alternatives... This is proven both by the analysis of concrete historical events and by abstract analysis of the relationship between idea and action. The only principle that does not impede progress is called 'anything goes'... For example, we can use hypotheses that contradict well-supported theories or well-founded experimental results. One can develop science by acting counterinductively" [Feyerabend P., 1986].

Theory, as the highest form of organization of scientific knowledge, is understood as a holistic, schematically structured idea of the universal and necessary laws of a certain area of reality (the object of the theory), existing in the form of a system of logically interconnected and deducible propositions.

The basis of an established theory is a mutually consistent network of abstract objects that determines the specific character of the theory; it is called the fundamental theoretical scheme, and particular schemes are associated with it. On the basis of these schemes and the corresponding mathematical apparatus, the researcher can obtain new characteristics of reality without always turning directly to empirical research.

The following main elements of the theory structure are identified:

1) Initial foundations - fundamental concepts, principles, laws, equations, axioms, etc.

2) An idealized object is an abstract model of the essential properties and connections of the objects being studied (for example, “absolutely black body”, “ideal gas”, etc.).

3) The logic of the theory is a set of certain rules and methods of proof aimed at clarifying the structure and changing knowledge.

4) Philosophical attitudes, sociocultural and value factors.

5) A set of laws and statements derived as consequences from the fundamentals of the theory in accordance with specific principles.

For example, in physical theories two main parts can be distinguished: formal calculus (mathematical equations, logical symbols, rules, etc.) and meaningful interpretation (categories, laws, principles). The unity of the substantive and formal aspects of the theory is one of the sources of its improvement and development.

A. Einstein noted that “the theory has two goals:

1. To cover, if possible, all phenomena in their interrelation (completeness).

2. To achieve this by taking as a basis as few logically independent concepts and arbitrarily established relations between them (basic laws and axioms) as possible. I will call this goal 'logical uniqueness'."

Types of theories

The variety of forms of idealization and, accordingly, types of idealized objects corresponds to the variety of types (types) of theories that can be classified on different grounds (criteria). Depending on this, theories can be distinguished:

mathematical and empirical,

deductive and inductive,

fundamental and applied,

formal and substantive,

"open" and "closed"

explanatory and descriptive (phenomenological),

physical, chemical, sociological, psychological, etc.

1. Modern (post-non-classical) science is characterized by the growing mathematization of its theories (especially in natural science) and by an increasing level of their abstraction and complexity. The importance of computational mathematics (which has become an independent branch of mathematics) and of mathematical modeling has risen sharply, since the answer to a given problem often has to be given in numerical form.

Most mathematical theories rely on set theory as their foundation. But in recent years researchers have increasingly been turning to the relatively recently emerged algebraic theory of categories, regarding it as a new foundation for all of mathematics.

Many mathematical theories arise through the combination, the synthesis, of several basic, or generative, structures. The needs of science (including mathematics itself) have recently led to the emergence of a number of new mathematical disciplines: graph theory, game theory, information theory, discrete mathematics, optimal control theory, etc.

The theories of the experimental (empirical) sciences - physics, chemistry, biology, sociology, history - can be divided, according to the depth of their penetration into the essence of the phenomena studied, into two large classes: phenomenological and non-phenomenological.

Phenomenological theories (also called descriptive or empirical) describe the experimentally observed properties and magnitudes of objects and processes but do not delve deeply into their internal mechanisms (for example, geometrical optics, thermodynamics, many pedagogical, psychological and sociological theories, etc.). Such theories solve, first of all, the task of ordering and performing a primary generalization of the facts relating to them. They are formulated in ordinary natural languages using the special terminology of the relevant field of knowledge and are predominantly qualitative in character.

With the development of scientific knowledge, theories of the phenomenological type give way to non-phenomenological ones (also called explanatory). Here, along with observable empirical facts, concepts and quantities, very complex, unobservable and often highly abstract concepts are introduced.

One of the important criteria by which theories can be classified is the accuracy of their predictions. On this criterion two large classes of theories can be distinguished. The first includes theories in which prediction is reliable (for example, many theories of classical mechanics, classical physics and chemistry). In theories of the second class prediction is probabilistic, being determined by the combined action of a large number of random factors. Such stochastic (from the Greek word for conjecture) theories are found in modern physics and biology, as well as in the social sciences and humanities, owing to the specificity and complexity of the very object of their research.

A. Einstein distinguished two main types of theories in physics - constructive and fundamental:

Most physical theories are constructive, i.e. their task is to construct a picture of complex phenomena on the basis of some relatively simple assumptions (such as, for example, the kinetic theory of gases).

The basis of fundamental theories is not hypothetical propositions but empirically discovered general properties of phenomena, principles from which mathematically formulated criteria of universal applicability follow (an example is the theory of relativity).

W. Heisenberg believed that a scientific theory should be consistent (in the formal-logical sense) and should possess simplicity, beauty, compactness, a definite (always limited) domain of application, integrity and "final completeness". But the strongest argument in favor of the correctness of a theory is its "multiple experimental confirmation".

The theories of the social sciences and humanities have a specific structure. Thus, in modern sociology, since the work of the eminent American sociologist Robert Merton (i.e., since the middle of the 20th century), it has been customary to distinguish three levels of the substantive study of social phenomena and, accordingly, three types of theories:

· general sociological theory ("general sociology"),

· particular ("middle-range") sociological theories - special theories (sociology of gender, age, ethnicity, family, the city, education, etc.),

· sectoral theories (sociology of labor, politics, culture, organization, management, etc.)

Ontologically, all sociological theories are divided into three main types:

1) theories of social dynamics (or theories of social evolution, development);

2) theories of social action;

3) theories of social interaction.

A theory (regardless of its type) has the following main features:

1. A theory is not a collection of individual reliable scientific propositions but their totality, an integral, organically developing system. Knowledge is united into a theory primarily by the subject matter of research itself, by its laws.

2. Not every set of propositions about the subject being studied is a theory. To turn into a theory, knowledge must reach a certain degree of maturity in its development: namely, it must not only describe a certain set of facts but also explain them, i.e. reveal the causes and regularities of the phenomena.

3. For a theory, justification and proof of the provisions included in it are mandatory: if there is no justification, there is no theory.

4. Theoretical knowledge should strive to explain the widest possible range of phenomena, to continuously deepen knowledge about them.

5. The character of a theory is determined by the degree of validity of its defining principle, which reflects the fundamental regularity of the given subject.

6. The structure of scientific theories is meaningfully “determined by the systemic organization of idealized (abstract) objects (theoretical constructs). Statements of theoretical language are directly formulated in relation to theoretical constructs and only indirectly, thanks to their relationship to extralinguistic reality, describe this reality.”

7. Theory is not only ready-made, established knowledge, but also the process of obtaining it, therefore it is not a “bare result”, but must be considered together with its emergence and development.

The main functions of the theory include the following:

1. Synthetic function - combining individual reliable knowledge into a single, holistic system.

2. Explanatory function - identifying causal and other dependencies, the variety of connections of a given phenomenon, its essential characteristics, the laws of its origin and development, etc.

3. Methodological function - on the basis of the theory, various methods, ways and techniques of research activity are formulated.

4. Predictive - the function of foresight. Based on theoretical ideas about the “present” state of known phenomena, conclusions are drawn about the existence of previously unknown facts, objects or their properties, connections between phenomena, etc. Prediction about the future state of phenomena (as opposed to those that exist but have not yet been identified) is called scientific foresight.

5. Practical function. The ultimate purpose of any theory is to be translated into practice, to be a “guide to action” for changing reality. Therefore, it is quite fair to say that there is nothing more practical than a good theory.

How to choose a good one from many competing theories?

K. Popper introduced the "criterion of relative acceptability." The best theory is the one that:

a) communicates the greatest amount of information, i.e. has deeper content;

b) is logically more strict;

c) has greater explanatory and predictive power;

d) can be checked more strictly by comparing the predicted facts with observations.

In psychology, generally the same forms of scientific knowledge are used as in the other sciences: concepts, judgments, inferences, problems, hypotheses, theories. Each of them is a relatively independent way in which a subject reflects an object, a way of fixing knowledge that has developed in the course of the development of universal human spiritual activity.

Among all forms of knowledge, the methodology of science recognizes theory as the highest, most perfect and most complex. Indeed, whereas concepts or inferences, problems or hypotheses are often formulated in a single sentence, an interconnected, ordered system of statements is needed to express a theory. Entire volumes are often written to present and substantiate theories: for example, Newton substantiated the theory of universal gravitation in the voluminous work "Mathematical Principles of Natural Philosophy" (1687), which took him more than 20 years to write; S. Freud set out the theory of psychoanalysis not in one but in many works, and over the last 40 years of his life he constantly made changes and refinements to it, trying to adapt it to changing social conditions, assimilate new facts from the field of psychotherapy, and respond to the criticism of opponents.

However, this does not mean that theories are super-complex and therefore beyond the understanding of the "man in the street". First, any theory can be presented in a concise, somewhat schematized version, with the secondary and insignificant removed and the supporting arguments and facts bracketed out. Second, ordinary people (i.e. those who are not professional scientists) master many theories, together with the logic implicit in them, at school, and therefore in adulthood they often build their own theories by generalizing and analyzing everyday experience; these differ from scientific theories in their degree of complexity, in their lack of mathematization and formalization, in their insufficient validity, and in their lesser systematicity and logical coherence, in particular their insensitivity to contradictions. Thus, a scientific theory is, as it were, a refined and more complicated version of everyday theories.

Theories act as methodological units, a kind of "cells" of scientific knowledge: they embody all levels of scientific knowledge together with the methodological procedures for obtaining and substantiating knowledge. A scientific theory includes and combines all other forms of scientific knowledge: its main "construction material" is concepts, which are connected with one another by judgments, from which inferences are drawn according to the rules of logic; any theory is based on one or several hypotheses (ideas) that answer a significant problem (or set of problems). If a particular science consisted of only one theory, it would nevertheless possess all the basic properties of science. For example, for many centuries geometry was identified with the theory of Euclid and was at the same time considered an "exemplary" science in the sense of accuracy and rigor. In a word, theory is science in miniature. Therefore, if we understand how a theory is structured and what functions it performs, we will comprehend the internal structure and "working mechanisms" of scientific knowledge as a whole.

In the methodology of science, the term “theory” (from the Greek theoria - consideration, research) is understood in two main senses: broad and narrow. In a broad sense, a theory is a complex of views (ideas, concepts) aimed at interpreting a phenomenon (or a group of similar phenomena). In this sense, almost every person has his own theories, many of which relate to the field of everyday psychology. With their help, a person can organize his ideas about goodness, justice, gender relations, love, the meaning of life, posthumous existence, etc. In a narrow, special meaning, theory is understood as the highest form of organization of scientific knowledge, giving a holistic idea of ​​the patterns and essential connections of a certain area of ​​reality. A scientific theory is characterized by systemic harmony, the logical dependence of some of its elements on others, and the deducibility of its content according to certain logical and methodological rules from a certain set of statements and concepts that form the initial basis of the theory.

In the process of the development of knowledge, the emergence of theories is preceded by a stage of accumulation, generalization and classification of experimental data. For example, before the emergence of the theory of universal gravitation, a great deal of information had already been collected both in astronomy (from individual astronomical observations to Kepler's laws, which are empirical generalizations of the observed motion of the planets) and in mechanics (of greatest importance for Newton were Galileo's experiments on the free fall of bodies); in biology, the evolutionary theories of Lamarck and Darwin were preceded by extensive classifications of organisms. The emergence of a theory resembles an insight, in which an array of information is suddenly and clearly organized in the theorist's head thanks to a heuristic idea that has suddenly emerged. However, this is not entirely so: an innovative hypothesis is one thing, its justification and development quite another. Only after the completion of the second process can we speak of the emergence of a theory. Moreover, as the history of science shows, the development of a theory, associated with its modification, refinement and extrapolation to new areas, can last tens and even hundreds of years.
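To take one concrete illustration (the standard formulation, assumed here rather than quoted from the source), Kepler's third law is precisely such an empirical generalization: for any two planets the squares of the orbital periods are related as the cubes of the semi-major axes of their orbits,

\[ \frac{T_1^2}{T_2^2} = \frac{a_1^3}{a_2^3}, \]

a regularity distilled from observational data well before the theory of universal gravitation explained it by deriving it as a consequence.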

There are several positions on the question of the structure of theories. Let's highlight the most influential of them.

According to V.S. Shvyrev, scientific theory includes the following main components:

1) the initial empirical basis, which includes the many facts recorded in the given field of knowledge, obtained in experiments and requiring theoretical explanation;

2) the initial theoretical basis - a set of primary assumptions, postulates, axioms and general laws that collectively describe the idealized object of the theory;

3) the logic of the theory - a set of rules of logical inference and proof admissible within the framework of the theory;

4) the set of statements derived within the theory, together with their proofs, constituting the main body of theoretical knowledge.

The central role in the formation of a theory, according to Shvyrev, is played by the underlying idealized object - a theoretical model of the essential connections of reality, presented with the help of certain hypothetical assumptions and idealizations. In classical mechanics, such an object is a system of material points; in molecular kinetic theory, it is a set of chaotically colliding molecules closed in a certain volume, represented as absolutely elastic material points.

It is not difficult to demonstrate the presence of these components in developed subject-centered psychological theories of personality. In psychoanalysis, the role of the empirical basis is played by psychoanalytic facts (data from clinical observations, descriptions of dreams, erroneous actions, etc.); the theoretical basis consists of the postulates of metapsychology and of clinical theory; the logic used can be characterized as "dialectical" or as the logic of "natural language"; and the "multidimensional" model of the psyche (topological, energetic, economic) acts as the idealized object. From this it is clear that psychoanalytic theory is more complex than any physical theory, since it includes more basic theoretical postulates, operates with several idealized models at once, and uses "subtler" logical means. Coordinating these components and eliminating the contradictions between them is an important epistemological task that is still far from being solved.

A different approach to explicating the structure of theory is proposed by M.S. Burgin and V.I. Kuznetsov, who distinguish four subsystems in it: the logical-linguistic (language and logical means), the model-representational (models and images describing the object), the pragmatic-procedural (methods of cognizing and transforming the object) and the problem-heuristic (a description of the essence of problems and of ways to solve them). The identification of these subsystems, as the authors emphasize, has certain ontological grounds. "The logical-linguistic subsystem corresponds to the existing orderliness of the real world or some part of it, to the presence of certain regularities. The pragmatic-procedural subsystem expresses the dynamic character of the real world and the cognizing subject's interaction with it. The problem-heuristic subsystem appears owing to the complexity of the cognized reality, which leads to the emergence of various contradictions and problems and the need to solve them. Finally, the model-representational subsystem primarily reflects the unity of thinking and being in relation to the process of scientific cognition."

The comparison of theory with an organism made by these researchers deserves attention. Like a living being, theories are born, develop, reach maturity, and then age and often die, as happened with the theories of caloric and ether in the 19th century. As in a living organism, the subsystems of a theory are closely interconnected and interact in a coordinated way.

The question of the structure of scientific knowledge is addressed somewhat differently by V.S. Stepin. Proceeding from the view that the methodological unit of the analysis of knowledge should be not the theory but the scientific discipline, he identifies three levels in the structure of the latter: the empirical, the theoretical and the level of the foundations of science, each of which has a complex organization.

The empirical level includes, first, direct observations and experiments, the result of which is observational data; second, the cognitive procedures by which the transition from observational data to empirical dependencies and facts is carried out. Observational data are recorded in observation protocols, which indicate who observed, the time of the observation, and describe the instruments, if any were used. If, for example, a sociological survey was conducted, the observation protocol is the questionnaire with the respondent's answers. For a psychologist these are also questionnaires, drawings (for example, in projective drawing tests), tape recordings of conversations, etc. The transition from observational data to empirical dependencies (generalizations) and scientific facts involves eliminating from the observations the subjective aspects they contain (connected with possible observer errors, random interference distorting the course of the phenomena under study, and instrument errors) in order to obtain reliable intersubjective knowledge about the phenomena. Such a transition involves the rational processing of observational data, a search for their stable, invariant content, and the comparison of multiple observations with one another. For example, a historian establishing the chronology of past events always strives to identify and compare a multitude of independent pieces of historical evidence, which serve for him as observational data. The invariant content identified in the observations is then interpreted with the help of known theoretical knowledge. Thus empirical facts, which constitute the bulk of the corresponding level of scientific knowledge, are constituted as a result of the interpretation of observational data in the light of a particular theory.

The theoretical level is likewise formed by two sublevels. The first consists of particular theoretical models and laws, which act as theories covering a fairly limited range of phenomena. The second consists of developed scientific theories that include particular theoretical laws as consequences derived from the fundamental laws of the theory. Examples of knowledge of the first sublevel are the theoretical models and laws characterizing particular types of mechanical motion: the model and law of the oscillation of a pendulum (Huygens's laws), of the motion of the planets around the Sun (Kepler's laws), of the free fall of bodies (Galileo's laws), etc. In Newtonian mechanics, which serves as a typical example of a developed theory, these particular laws are, on the one hand, generalized and, on the other hand, derived as consequences.

A kind of elementary cell for organizing theoretical knowledge at each of its sublevels is the two-layer structure consisting of a theoretical model and a law formulated with respect to it. The model is built from abstract objects (such as a material point, a reference frame, a perfectly rigid surface, an elastic force, etc.), which stand in strictly defined connections and relations to one another. The laws express the relations between these objects: for example, the law of universal gravitation expresses the relation between the masses of bodies, understood as material points, the distance between them and the force of attraction: F = Gm₁m₂/r².
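
To make the abstract relation concrete, here is a minimal numerical sketch (our illustration, not part of the cited account) of the law of universal gravitation applied to the idealized model of two material points; the masses and distance used below are purely illustrative.

    # Minimal sketch: F = G*m1*m2 / r^2 for the idealized model of two material points.
    G = 6.674e-11  # gravitational constant, N*m^2/kg^2

    def gravitational_force(m1: float, m2: float, r: float) -> float:
        """Force of attraction (N) between two material points of masses m1, m2 (kg) at distance r (m)."""
        if r <= 0:
            raise ValueError("distance must be positive")
        return G * m1 * m2 / r ** 2

    # Illustrative values: two 1000-kg bodies 10 m apart.
    print(gravitational_force(1000.0, 1000.0, 10.0))  # about 6.7e-7 N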

The explanation and prediction of experimental facts by theories are connected, first, with the derivation from them of consequences comparable with the results of experience and, second, with the empirical interpretation of the theoretical models, achieved by establishing a correspondence between them and the real objects they reflect. Thus, not only are facts interpreted in the light of theory, but the elements of the theory (models and laws) are in turn interpreted so as to be subject to experimental verification.

The level of the foundations of science is the most fundamental in the structure of scientific knowledge. Until the mid-20th century, however, it was not singled out: methodologists and scientists simply did not notice it. Yet it is precisely this level that "acts as a system-forming block that determines the strategy of scientific research and the systematization of acquired knowledge, and ensures its inclusion in the culture of the corresponding era." According to V.S. Stepin, at least three main components of the foundations of scientific activity can be distinguished: the ideals and norms of research, the scientific picture of the world, and the philosophical foundations of science.

In paragraph 2 of Chapter 1 we already considered the first two components of this level, so we will focus on the third. According to V.S. Stepin, the philosophical foundations are the ideas and principles that substantiate the ontological postulates of science, as well as its ideals and norms. For example, Faraday's justification of the material status of electric and magnetic fields appealed to the metaphysical principle of the unity of matter and force. The philosophical foundations also ensure the "docking" of scientific knowledge, its ideals and norms, and the scientific picture of the world with the dominant worldview of a particular historical era, with the categories of its culture.

The formation of the philosophical foundations proceeds by selecting ideas developed in philosophical analysis and adapting them to the needs of a specific area of scientific knowledge. In their structure V.S. Stepin identifies two subsystems: the ontological, represented by a grid of categories that serve as a matrix for understanding and cognizing the objects under study (for example, the categories "thing", "property", "relation", "process", "state", "causality", "necessity", "chance", "space", "time", etc.), and the epistemological, expressed by categorial schemes that characterize cognitive procedures and their results (understandings of truth, method, knowledge, explanation, proof, theory, fact).

While noting the soundness and heuristic value of the positions outlined above on the structure of scientific theory in particular and of scientific knowledge in general, let us try to identify their weaknesses and set out our own view of the problem. The first question that naturally arises is whether the empirical level of science should be included in the content of a theory: according to Shvyrev, the empirical level is included in the theory; according to Stepin, it is not (it belongs instead to the scientific discipline); Burgin and Kuznetsov implicitly include the empirical level in the pragmatic-procedural subsystem. Indeed, on the one hand, a theory is very closely interconnected with facts; it is created to describe and explain them, so eliminating facts from the theory clearly impoverishes it. On the other hand, facts are able to "live a life of their own" independent of any specific theory, for example, to "migrate" from one theory to another. The latter circumstance seems to us the more significant: a theory describes and explains facts, it is superimposed on them, and therefore they should be placed outside the limits of the theory. This is also supported by the established division of the levels of scientific knowledge into the theoretical and the empirical (fact-fixing).

Stepin's point of view therefore seems to us the most justified, but it too requires adjustments concerning the understanding of the structure and role of the philosophical foundations of science. First, they cannot be regarded as standing on the same level as the ideals and norms and the scientific picture of the world, precisely because of their fundamental nature and primacy, as the author himself notes. Second, they are not reducible to the ontological and the epistemological but also include value (axiological) and practical (praxeological) dimensions. In general, their structure is homologous to the structure of philosophical knowledge, which includes not only ontology and epistemology but also ethics, aesthetics, social philosophy and philosophical anthropology. Third, the interpretation of the genesis of the philosophical foundations as a "flow" of ideas from philosophy into science seems to us too narrow: one cannot underestimate the role of the scientist's personal life experience, in which philosophical views, although formed to a large extent spontaneously, are most deeply rooted thanks to their "emotional, value-semantic charge" and their direct connection with what has been seen and lived through.

Thus, a theory is the highest form of scientific knowledge: a systematically organized and logically connected, multi-level set of abstract objects of varying degrees of generality (philosophical ideas and principles, fundamental and particular models and laws), built from concepts, judgments and images.

Further specification of ideas about the nature of scientific theories is associated with the identification of their functions and types.

The question of the functions of a theory is, in essence, the question of its purpose, of its role both in science and in culture as a whole. Compiling an exhaustive list of functions is rather difficult. First, theories do not play the same roles in different sciences: mathematical knowledge, which deals with a world of "frozen", self-identical ideal entities, is one thing; humanitarian knowledge, oriented toward understanding constantly changing, fluid human existence in an equally unstable world, is quite another. This substantive difference accounts for the insignificance (often the complete absence) of the predictive function in the theories of mathematics and, on the contrary, its importance for the sciences that study man and society. Second, scientific knowledge itself is constantly changing, and with it ideas about the role of scientific theories are transformed: on the whole, as science develops, more and more new functions are assigned to theories. We will therefore note only the most important, basic functions of scientific theory.

1. Reflective. The idealized object of a theory is a kind of simplified, schematized copy of real objects, so the theory reflects reality, but not in its entirety, only in its most essential moments. First of all, a theory reflects the basic properties of objects, the most important connections and relations between them, and the patterns of their existence, functioning and development. Since the idealized object is a model of the real object, this function can also be called the modeling (model-representative) function. In our view, one can speak of three types of models (idealized objects): structural, reflecting the structure and composition of the object (its subsystems, elements and their relations); functional, describing its functioning in time (i.e., the regularly recurring processes of one and the same quality); and evolutionary, reconstructing the course, stages, causes, factors and tendencies of the object's development. Psychology uses many such models: of the psyche, consciousness, personality, communication, the small social group, the family, creativity, memory, attention, etc.

2. Descriptive. This function derives from the reflective one, acts as its particular analogue and is expressed in the theory's recording of the properties and qualities of objects and of the connections and relations between them. Description is apparently the oldest and simplest function of science; any theory therefore always describes something, but not every description is scientific. The main thing in a scientific description is accuracy, rigor and unambiguity. The most important means of description is language, both natural and scientific, the latter created precisely to increase accuracy and rigor in recording the properties and qualities of objects. In the same way, a psychologist begins the examination of a client by searching for and recording significant facts. It is therefore difficult to imagine that, for example, Freud could have built psychoanalytic theory without relying on his own and others' prior clinical experience, in which case histories were abundantly described, with detailed indications of their etiology, symptoms, stages of development and methods of treatment.

3. Explanatory. This function, too, derives from the reflective one. Explanation already presupposes a search for law-like connections and a clarification of the causes of the appearance and course of particular phenomena. In other words, to explain means, first, to bring a single phenomenon under a general law (for example, the single case of a brick falling to the ground can be brought under the general law of gravitation, which shows us why the brick flew downward, rather than upward or remaining suspended in the air, and with precisely this speed or acceleration) and, second, to find the cause that gave rise to the phenomenon (in our example, the cause of the brick's fall is the force of gravity, the Earth's gravitational field). Not only the scientist but any person cannot do without searching for law-like connections, without finding out the causes of events and taking into account the influence of various factors on what happens to them and around them.

4. Prognostic. This function stems from the explanatory one: knowing the laws of the world, we can extrapolate them to future events and accordingly predict their course. For example, I can reliably assume (and with one hundred percent probability!) that the brick I have thrown out of the window will fall to the ground. The basis for such a forecast is, on the one hand, everyday experience and, on the other, the theory of universal gravitation; drawing on the latter makes the forecast more precise. In the modern sciences dealing with complex, self-organizing, "human-sized" objects, absolutely accurate forecasts are rare: the point here is not only the complexity of the objects under study, which have many independent parameters, but also the very dynamics of self-organization processes, in which a chance, weak influence at a bifurcation point can radically change the direction of the system's development. In psychology, too, the vast majority of forecasts are probabilistic-statistical in nature, since, as a rule, they cannot take into account the numerous random factors at work in social life.
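
A brief illustration of such an extrapolation (a sketch of our own, with an assumed drop height): the law of free fall, neglecting air resistance, lets us predict when the brick mentioned above will reach the ground.

    import math

    g = 9.81  # acceleration of free fall near the Earth's surface, m/s^2

    def fall_time(height_m: float) -> float:
        """Time (s) for a body dropped from rest to fall the given height: t = sqrt(2h/g)."""
        return math.sqrt(2 * height_m / g)

    print(fall_time(20.0))  # from an assumed height of 20 m the brick lands after roughly 2 s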

5. Restrictive (prohibitive). This function is rooted in the principle of falsifiability, according to which a theory must not be omnivorous, capable of explaining any phenomenon in its subject area, above all previously unknown ones; on the contrary, a "good" theory must prohibit certain events (for example, the theory of universal gravitation prohibits the upward flight of a brick thrown from a window; the theory of relativity limits the maximum speed of transmission of material interactions to the speed of light; modern genetics prohibits the inheritance of acquired traits). In psychology (especially in such branches as personality psychology and social psychology) we should apparently speak not so much of categorical prohibitions as of the improbability of certain events. For example, it follows from E. Fromm's conception of love that a person who does not love himself cannot truly love another. This is a prohibition, of course, but not an absolute one. It is also very unlikely that a child who has missed the sensitive period for language acquisition (for example, because of social isolation) will be able to master language fully in adulthood; the psychology of creativity recognizes the low probability that a complete amateur could make an important scientific discovery in a fundamental area of science. And it is almost impossible to imagine that a child with an objectively confirmed diagnosis of imbecility or idiocy could become an outstanding scientist.

6. Systematizing. This function is determined by man's striving to order the world, as well as by the properties of our thinking, which spontaneously strives for order. Theories act as an important means of systematizing and condensing information simply by virtue of their inherent organization, the logical relation (derivability) of some of their elements to others. The simplest form of systematization is classification. In biology, for example, classifications of plant and animal species necessarily preceded evolutionary theories: only on the basis of the extensive empirical material of the former was it possible to advance the latter. In psychology, perhaps the best-known classifications concern personality typology: Freud, Jung, Fromm, Eysenck, Leonhard and others made significant contributions to this area of science. Other examples are the identification of types of pathopsychological disorders, forms of love and of psychological influence, and types of intelligence, memory, attention, abilities and other mental functions.

7. Heuristic. This function emphasizes the role of theory as "the most powerful means of solving fundamental problems of understanding reality." In other words, a theory not only answers questions but also poses new problems and opens up new areas of research, which it then tries to explore in the course of its development. Often the questions posed by one theory are solved by another. For example, Newton, having discovered the gravitational force, could not answer the question of the nature of gravity; that problem was solved only by Einstein in the general theory of relativity. In psychology, the most heuristic theory apparently still remains psychoanalysis. On this subject Hjelle and Ziegler write: "Although research concerning Freud's psychodynamic theory cannot prove his concepts beyond doubt (since the verifiability of the theory is low), he has inspired many scientists by showing them in which direction research can be carried out to improve our knowledge about behavior. Literally thousands of studies have been prompted by Freud's theoretical claims." With respect to the heuristic function, the vagueness and incompleteness of a theory are more advantages than disadvantages. Such is Maslow's theory of personality, which is more a collection of delightful guesses and assumptions than a clearly defined structure. Largely because of its incompleteness, coupled with the boldness of the hypotheses put forward, it "served as a stimulus for the study of self-esteem, peak experience and self-actualization, ... influenced not only researchers in the field of personology, but also in the field of education, management and health care."

8. Practical. This function is epitomized by the famous aphorism of the 19th-century German physicist Gustav Robert Kirchhoff: "There is nothing more practical than a good theory." Indeed, we build theories not only to satisfy curiosity but, above all, in order to understand the world around us. In a clear, ordered world we not only feel safer but can also act successfully. Theories thus serve as a means of solving personal and social problems and increase the effectiveness of our activity. In the post-non-classical era, the practical significance of scientific knowledge comes to the fore, which is not surprising, since modern humanity faces global problems that most scientists believe can be overcome only through the development of science. Today the theories of psychology claim not only to solve the problems of individuals and small groups but also strive to contribute to the optimization of public life in general. According to Hjelle and Ziegler, psychology has an important contribution to make to solving the problems associated with poverty, racial and sexual discrimination, alienation, suicide, divorce, child abuse, drug and alcohol addiction, crime, etc.

Types of theories are distinguished on the basis of their structure, which is determined, in turn, by the methods of constructing theoretical knowledge. There are three main, "classical" types of theories: axiomatic (deductive), inductive and hypothetico-deductive. Each has its own "construction base" in the form of one of three corresponding methods.

Axiomatic theories, established in science since antiquity, embody the precision and rigor of scientific knowledge. Today they are most common in mathematics (formalized arithmetic, axiomatic set theory), formal logic (propositional logic, predicate logic) and some branches of physics (mechanics, thermodynamics, electrodynamics). The classic example of such a theory is Euclid's geometry, which for many centuries was considered a model of scientific rigor. An ordinary axiomatic theory comprises three components: axioms (postulates), theorems (derived knowledge) and rules of inference (proofs).

Axioms (from the Greek axioma, "honored, accepted position") are propositions accepted as true (as a rule, because of their self-evidence) which together constitute the axiomatics, the fundamental basis of a particular theory. Their introduction relies on previously formulated basic concepts (definitions of terms). For example, before formulating his main postulates, Euclid gives definitions of "point", "straight line", "plane", etc. Following Euclid (though the creation of the axiomatic method is attributed not to him but to Pythagoras), many tried to build knowledge on the basis of axioms: not only mathematicians but also philosophers (B. Spinoza), sociologists (G. Vico) and biologists (J. Woodger). The view of axioms as eternal and unshakable principles of knowledge was seriously shaken by the discovery of non-Euclidean geometries; in 1931 K. Gödel proved that even mathematical theories as basic as formal arithmetic cannot be completely constructed as axiomatic formal theories (the incompleteness theorem). Today it is clear that the acceptance of axioms is conditioned by the specific experience of an era; as that experience expands, even the most seemingly unshakable truths may turn out to be erroneous.

From the axioms, by means of certain rules, the remaining propositions of the theory (theorems) are derived (deduced), and these make up the main body of the axiomatic theory. The rules are studied by logic, the science of the forms of correct thinking. In most cases they are laws of classical logic: the law of identity ("every entity coincides with itself"), the law of non-contradiction ("no proposition can be both true and false"), the law of the excluded middle ("every proposition is either true or false; there is no third option") and the law of sufficient reason ("every proposition advanced must be properly justified"). Scientists often apply these rules half-consciously, and sometimes entirely unconsciously. As noted above, researchers frequently make logical mistakes, relying more on their own intuition than on the laws of thought and preferring the "softer" logic of common sense. Since the beginning of the 20th century, non-classical logics have been developing (modal, many-valued, paraconsistent, probabilistic, etc.) that depart from the classical laws in an attempt to grasp the dialectics of life, its fluidity and contradictoriness, which do not submit to classical logic.
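
The first three classical laws can be checked mechanically by enumerating the possible truth values of a proposition; the following small sketch (our illustration, not part of the cited account) does this for the laws of non-contradiction and of the excluded middle.

    # Enumerate the truth values of a proposition p and check two classical laws.
    for p in (True, False):
        non_contradiction = not (p and not p)   # p is never both true and false
        excluded_middle = p or not p            # p is either true or false
        print(p, non_contradiction, excluded_middle)  # both laws hold in every case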

If axiomatic theories are characteristic of mathematical and formal-logical knowledge, hypothetico-deductive theories are specific to the natural sciences. G. Galileo, who also laid the foundations of experimental natural science, is considered the creator of the hypothetico-deductive method. After Galileo this method was used (though for the most part implicitly) by many physicists, from Newton to Einstein, and until recently it was therefore considered fundamental to natural science.

The essence of the method is to put forward bold assumptions (hypotheses) whose truth value is uncertain. Consequences are then deduced from the hypotheses until statements are reached that can be compared with experience. If empirical testing confirms their adequacy, it is legitimate to conclude (owing to the logical interconnection of the statements) that the initial hypotheses are correct. A hypothetico-deductive theory is thus a system of hypotheses of varying degrees of generality: at the very top stand the most abstract hypotheses, and at the lowest level the most specific ones, which are subject to direct experimental verification. It should be noted that such a system is always incomplete and can therefore be expanded with additional hypotheses and models.
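
As a schematic sketch of this cycle (with invented numbers and a deliberately simple hypothesis of our own), suppose the general hypothesis states that a quantity y grows linearly with x; the testable consequences are the predicted values of y, which are compared with observations within a tolerance.

    def prediction(x: float, slope: float = 2.0) -> float:
        """Consequence deduced from the hypothesis y = slope * x."""
        return slope * x

    # Hypothetical observational data: pairs (x, observed y).
    observations = [(1.0, 2.1), (2.0, 3.9), (3.0, 6.2)]
    tolerance = 0.5

    consistent = all(abs(prediction(x) - y) <= tolerance for x, y in observations)
    print("hypotheses retained" if consistent else "hypotheses refuted")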

The more novel consequences verifiable by subsequent experience can be derived from a theory, the greater the authority it enjoys in science. In 1922 the Russian astronomer A. Friedman derived from Einstein's theory of relativity equations demonstrating the non-stationarity of the Universe, and in 1929 the American astronomer E. Hubble discovered the "red shift" in the spectra of distant galaxies, confirming the correctness both of the theory of relativity and of Friedman's equations. In 1946 G. Gamow, an American physicist of Russian origin, deduced from his theory of the hot Universe the necessity of the presence in space of isotropic microwave radiation with a temperature of about 3 K, and in 1965 this radiation, called relict radiation, was discovered by the astrophysicists A. Penzias and R. Wilson. It is quite natural that both the theory of relativity and the conception of the hot Universe have entered the "hard core" of the modern scientific picture of the world.

Inductive theories in their pure form are apparently absent from science, since they do not provide logically grounded, apodictic knowledge. It is therefore more correct to speak of the inductive method, which is likewise characteristic, above all, of natural science, since it allows one to move from experimental facts first to empirical and then to theoretical generalizations. In other words, if deductive theories are built "from the top down" (from axioms and hypotheses to facts, from the abstract to the concrete), inductive theories are built "from the bottom up" (from individual phenomena to universal conclusions).

F. Bacon is usually recognized as the founder of inductive methodology, although the definition of induction was given by Aristotle, and the Epicureans considered it the only authoritative method of proving the laws of nature. It is interesting that, perhaps under the influence of the authority of Bacon, Newton, who in fact relied mainly on the hypothetico-deductive methodology, declared himself a supporter of the inductive method. A prominent defender of the inductive methodology was our compatriot V.I. Vernadsky, who believed that it is on the basis of empirical generalizations that scientific knowledge should be built: until at least one fact is discovered that contradicts a previously obtained empirical generalization (law), the latter should be considered true.

Inductive inference usually begins with the analysis and comparison of observational or experimental data. If at the same time something common and similar is seen in them (for example, the regular repetition of a property) in the absence of exceptions (conflicting information), then the data are generalized in the form of a universal proposition (empirical law).

A distinction is drawn between complete (perfect) induction, in which the generalization refers to a finite and fully surveyable domain of facts, and incomplete induction, in which it refers to an infinite or not fully surveyable domain. The second form of induction is the more important for scientific knowledge, since it is the one that yields an increase in new knowledge and allows us to move toward law-like connections. Incomplete induction, however, is not a logically valid inference, since no law of logic corresponds to the transition from the particular to the general. Incomplete induction is therefore probabilistic in nature: there is always a chance that new facts will appear that contradict those observed earlier.
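
The difference between the two forms can be shown with a toy sketch (hypothetical data of our own): in complete induction every member of a finite domain is inspected, whereas in incomplete induction the generalization rests only on the cases observed so far and remains open to refutation.

    finite_domain = [2, 4, 6, 8]  # a finite, fully surveyable collection of "facts"
    print(all(n % 2 == 0 for n in finite_domain))    # complete induction: established for this whole domain

    observed_sample = [3, 5, 7, 11, 13]  # only a sample of an open-ended domain
    print(all(n % 2 == 1 for n in observed_sample))  # incomplete induction: may yet be overturned (e.g. by 2)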

The "trouble" with induction is that a single refuting fact makes the empirical generalization as a whole untenable. The same cannot be said of theoretically grounded statements, which may be considered adequate even in the face of many contradicting facts. To "strengthen" the significance of inductive generalizations, scientists therefore strive to substantiate them not only with facts but also with logical arguments, for example by deriving empirical laws as consequences of theoretical premises or by finding the cause that accounts for the presence of similar characteristics in the objects. Nevertheless, inductive hypotheses and theories are on the whole descriptive and merely ascertaining in character and have less explanatory potential than deductive ones. In time, however, inductive generalizations often receive theoretical support, and descriptive theories are transformed into explanatory ones.

The basic models of theories considered above function primarily as ideal-typical constructions. In the actual scientific practice of natural science, when constructing theories, scientists as a rule use both inductive and hypothetico-deductive methodology simultaneously (and often intuitively): the movement from facts to theory is combined with the reverse transition from theory to verifiable consequences. More specifically, the mechanism of constructing, justifying and testing a theory can be represented by the following scheme: observational data → facts → empirical generalization → universal hypothesis → particular hypotheses → testable consequences → setting up an experiment or organizing an observation → interpretation of the experimental results → conclusion about the soundness (or unsoundness) of the hypotheses → advancement of new hypotheses. The transition from one stage to the next is far from trivial; it requires intuition and a certain amount of ingenuity. At each stage the scientist also reflects on the results obtained, seeking to understand their meaning and their compliance with the standards of rationality and to eliminate possible errors.

Of course, not every hypothesis verified by experience is subsequently transformed into a theory. In order to form a theory around itself, a hypothesis (or several hypotheses) must not only be adequate and new, but also have a powerful heuristic potential and relate to a wide range of phenomena.

The development of psychological knowledge generally follows a similar scenario. Take, for example, the theory of personality (more precisely, the psychotherapeutic conception that forms one of its parts) of C.R. Rogers, which is recognized throughout the world and meets to a fairly high degree the criteria of heuristic power, experimental testability and functional significance. Before proceeding to build a theory, Rogers received a psychological education and acquired rich and varied experience of working with people: first helping difficult children, then teaching at universities, counseling adults and conducting scientific research. At the same time he studied psychological theory in depth and mastered methods of psychological, psychiatric and social assistance. As a result of analyzing and generalizing his experience, Rogers came to understand the futility of "intellectual approaches" and of psychoanalytic and behaviorist therapy, and realized that "change occurs through experience in relationships." Rogers was also dissatisfied with the incompatibility of Freudian views with "the scientific, purely objective, statistical approach to science."

Rogers bases his own psychotherapeutic conception on a "basic hypothesis": "if I can create a certain type of relationship with another person, he will discover within himself the capacity to use this relationship for his own development, which will bring about change and the development of his personality." The advancement of this assumption apparently rests not only on the author's therapeutic and life experience but also owes its appearance to Rogers' philosophical ideas and to his intuitive conviction of its correctness. Particular consequences follow from the main hypothesis, for example the proposition about the three "necessary and sufficient conditions" of successful therapy: non-judgmental acceptance, congruence (sincerity) and empathic understanding. The derivation of the particular hypotheses in this case cannot be considered purely logical or formal; on the contrary, it is substantive and creative in character and is associated, once again, with the generalization and analysis of the experience of relationships with people. As for the main hypothesis, it fully meets the above-mentioned requirements of heuristic power and fundamentality, and it can therefore serve as the "ideological center" for building a developed theory. The heuristic nature of the main hypothesis showed itself, in particular, in the fact that it directed many researchers toward studying the quality of the relationship between counselor and client. Its fundamentality is connected with the possibility of extrapolating it to any relationships between people, not just psychotherapeutic ones, which Rogers himself did.

The hypotheses put forward formed the theoretical basis of client-centered therapy, which then became the subject of objective, rigorous, measurement-based empirical study. Rogers not only formulated a number of testable consequences, thanks above all to the operationalization of his basic concepts, but also defined a program and methods for their verification. The implementation of this program convincingly demonstrated the effectiveness of client-centered therapy.

It follows from Rogers' theory that the success of therapy depends not so much on the counselor's knowledge, experience and theoretical position as on the quality of the relationship. This assumption can be tested if we can operationalize the concept of "relationship quality", which comprises "sincerity", "empathy", "goodwill" and "love" for the client. For this purpose one of Rogers' co-workers, using scaling and ranking procedures, developed the Attitude List questionnaire for clients. For example, goodwill was measured with statements of different rank: from "He likes me" and "He is interested in me" (high and medium levels of goodwill) to "He is indifferent to me" and "He disapproves of me" (zero and negative levels of goodwill, respectively). The client rated these statements on a scale from "very true" to "not at all true." The survey revealed a high positive correlation between the counselor's empathy, sincerity and goodwill, on the one hand, and the success of therapy, on the other. A number of other studies showed that the success of therapy does not depend on the counselor's theoretical position. In particular, a comparison of psychoanalytic, Adlerian and client-centered psychotherapy showed that success depends precisely on the quality of the relationship between the participants in the therapeutic process and not on the theoretical framework within which it is conducted. Thus the particular hypotheses, and consequently Rogers' main hypothesis, received experimental confirmation.
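
The statistical core of such studies can be sketched as follows (the numbers are invented for illustration and are not Rogers' data): clients' questionnaire scores for relationship quality are correlated with an index of therapeutic success.

    import numpy as np

    relationship_quality = np.array([4.5, 3.0, 4.8, 2.5, 3.9, 4.2])   # questionnaire scores (hypothetical)
    therapy_success = np.array([0.80, 0.55, 0.90, 0.40, 0.70, 0.75])  # outcome index (hypothetical)

    r = np.corrcoef(relationship_quality, therapy_success)[0, 1]
    print(f"Pearson correlation: {r:.2f}")  # a high positive value for this toy sample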

The example of Rogers' conception of human relationships shows that the development of a theory is cyclical and spiral-like: therapeutic and life experience → its generalization and analysis → advancement of universal and particular hypotheses → derivation of testable consequences → their testing → refinement of the hypotheses → modification, on the basis of the refined knowledge, of therapeutic experience. Such a cycle may be repeated many times: some hypotheses remain unchanged, others are refined and modified, still others are discarded, and yet others are generated for the first time. In this "circulation" the theory develops, is refined and enriched, assimilates new experience and puts forward counterarguments against the criticism of competing conceptions.

Most other psychological theories function and develop according to the same scenario, so it is legitimate to conclude that the "average psychological theory" combines the features of both hypothetico-deductive and inductive theories. Are there "pure" inductive and hypothetico-deductive theories in psychology? In our view it is more correct to speak of a given conception gravitating toward the pole of induction or of deduction. Most conceptions of personality development, for example, are predominantly inductive in nature (in particular, Freud's doctrine of psychosexual stages, E. Erikson's theory of psychosocial development and J. Piaget's theory of the stages of intellectual development), since they, first, rest on a generalization of observations and experiments and, second, are predominantly descriptive, characterized by "poverty" and weakness of explanatory principles (Piaget's theory, for example, cannot explain, except by reference to observational data, why there should be exactly four stages of intellectual development rather than three or five, why some children develop faster than others, or why the order of the stages is precisely this one). With regard to other theories it is often impossible to say exactly which type they are closer to, since the advancement of universal hypotheses in most cases rests equally on the researcher's experience and intuition, as a result of which many propositions of such theories combine the qualities of empirical generalizations and of universal hypotheses-conjectures.

But why are there so many theories in psychology, and what accounts for their diversity, given that we live in the same world and have similar life experience: we are born, learn language and etiquette, go to school, fall in love, fall ill and suffer, hope and dream? Why do theorists interpret this experience differently, each emphasizing something of their own, attending to some of its aspects and losing sight of others, and accordingly advancing different hypotheses and building theories quite different in content from one another? In our view, the key to answering these questions lies in studying the philosophical foundations of psychological theories, to which we now turn.