Value of Knowledge Reference
In the period very roughly from the beginnings of modern physics (1905) up to Alan Turing's description of the Turing machine in 1936, one of the focal points of dispute in the theory of knowledge was the foundations of mathematics.
The main players in this struggle are:
Gottlob Frege: the founder of Logicism, the position that the whole of mathematics can be reduced to a set of relations derived one from another solely by means of logic, without reference to specifically mathematical concepts such as number. Wittgenstein attempted to carry Frege's concepts of mathematics over to the natural language, with predictably inane results. Frege was also the inspiration for Rudolf Carnap and the various schools of Logical Positivism which continued to wrestle with the problems generated by the new physics. Frege took no part in the struggle after 1903, and died in bitterness and isolation in 1925, having failed to complete a system based on his concept without the appearance of contradictions or logical flaws. His project was later continued by Bertrand Russell and Alfred North Whitehead.
David Hilbert: the founder of Formalism, the position that mathematics consists solely in the generation of combinations of symbols according to arbitrary rules and the application of logic. His first important work in 1899 was to produce a definitive set of axioms for Euclidean geometry without any appeal to spatial references or intuition. In 1905 (and again from 1918) Hilbert attempted to lay a firm foundation for mathematics by proving consistency - that is, that finite steps of reasoning in logic could not lead to a contradiction. But in 1931, Kurt Gödel showed this goal to be unattainable: propositions may be formulated that are undecidable; thus, it cannot be known with certainty that mathematical axioms do not lead to contradictions.
Luitzen Brouwer: the founder of Intuitionism, which views the nature of mathematics as mental constructions governed by self-evident laws. Brouwer is considered the founder of Topology. In his doctoral thesis of 1907, On the Foundations of Mathematics, Brouwer attacked the logical foundations of mathematics, and in 1908, in On the Untrustworthiness of the Logical Principles, he rejected the use in mathematical proofs of the principle of the excluded middle, which asserts that every mathematical statement is either true or false, no other possibility being allowed. In 1918 he published a set theory, the following year a theory of measure, and by 1923 a theory of functions, all developed without using the principle of the excluded middle. Brouwer was the first to build a mathematical theory using a logic other than that normally accepted, a method of research since applied to quantum mechanics and more widely.
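Brouwer's objection can be made concrete with a classic non-constructive argument - a standard textbook illustration, not one of Brouwer's own examples - which establishes existence via the excluded middle without ever exhibiting a witness:

```latex
\textbf{Claim.} There exist irrational numbers $a, b$ such that $a^b$ is rational.

\textbf{Proof (non-constructive).} By the excluded middle, $\sqrt{2}^{\sqrt{2}}$
is either rational or irrational. If it is rational, take $a = b = \sqrt{2}$.
If it is irrational, take $a = \sqrt{2}^{\sqrt{2}}$ and $b = \sqrt{2}$; then
$a^b = \bigl(\sqrt{2}^{\sqrt{2}}\bigr)^{\sqrt{2}} = \sqrt{2}^{\,2} = 2. \qquad \square$
```

The proof never tells us which of the two cases actually holds, and so supplies no particular pair (a, b); for the Intuitionist, such an argument fails to establish existence at all.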
Kurt Gödel: in 1931, author of the epoch-making Gödel's theorem, which states that within any consistent formal system rich enough to express arithmetic there are propositions that cannot be proved or disproved on the basis of the axioms of that system, and that, therefore, it cannot be known with certainty that the basic axioms of arithmetic will not give rise to contradictions. The proof was aimed specifically at Russell and Whitehead's Principia Mathematica - an attempt to complete Frege's project. This paper ended nearly a century of attempts to establish axioms that would provide a rigorous basis for all mathematics. Gödel was an avowed Kantian and expressed support for Husserl's Phenomenology.
Alan Turing: founder of computer science and of research in artificial intelligence. Motivated by Gödel's work to seek an algorithmic method of determining whether any given proposition was undecidable, with the ultimate goal of eliminating undecidable propositions from mathematics, he proved instead, in 1936, that no such universal method of determination can exist and, hence, that mathematics will always contain undecidable propositions. To illustrate this, Turing posited a simple device that possessed the minimal properties of a modern computing system: a finite program, a large data-storage capacity, and a step-by-step mode of mathematical operation - the "Turing machine". Using Hilbert's own methods, Turing and Gödel put to rest the hopes of David Hilbert & Co. that all mathematical propositions could be expressed as a set of axioms and derived theorems.
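The three minimal properties named above - a finite program, unbounded storage, and step-by-step operation - can be sketched in a few lines of Python. The machine below, its transition table, and the helper name `run_turing_machine` are illustrative inventions for this sketch, not drawn from Turing's 1936 paper; this example machine simply increments a binary number.

```python
def run_turing_machine(program, tape, state="start", blank="_", max_steps=10_000):
    """Run a Turing machine: `program` is the finite program, a dict mapping
    (state, symbol) -> (new_symbol, move, new_state); `tape` is the storage,
    a dict position -> symbol, blank everywhere else. Halts on state 'halt'."""
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = tape.get(head, blank)                  # read the current cell
        new_symbol, move, state = program[(state, symbol)]
        tape[head] = new_symbol                         # write
        head += 1 if move == "R" else -1                # move one step
    return tape

# Illustrative program: walk to the rightmost digit, then add 1 in binary.
program = {
    ("start", "0"): ("0", "R", "start"),
    ("start", "1"): ("1", "R", "start"),
    ("start", "_"): ("_", "L", "carry"),   # past the end: turn back, carry 1
    ("carry", "1"): ("0", "L", "carry"),   # 1 + carry -> 0, propagate carry
    ("carry", "0"): ("1", "L", "halt"),    # 0 + carry -> 1, done
    ("carry", "_"): ("1", "L", "halt"),    # overflow: write a new leading 1
}

tape = {i: c for i, c in enumerate("1011")}             # binary 11
result = run_turing_machine(program, tape)
print("".join(result.get(i, "_") for i in range(-1, 5)).strip("_"))  # prints 1100
```

However simple, the device is universal in Turing's sense: any effective procedure can be encoded as such a transition table, which is what gives his negative result its force.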
Turing championed the theory that computers could be constructed that would be capable of human thought, and his writings on this subject show considerable affinity with behaviourist psychology.
The following extended quote in which Gödel summarises his position is worth considering:
"... it turns out that in the systematic establishment of the axioms of mathematics, new axioms, which do not follow by formal logic from those previously established, again and again become evident. It is not at all excluded by the negative results mentioned earlier that nevertheless every clearly posed mathematical yes-or-no question is solvable in this way. For it is just this becoming evident of more and more new axioms on the basis of the meaning of the primitive notions that a machine cannot imitate.
"I would like to point out that this intuitive grasping of ever newer axioms that are logically independent from the earlier ones, which is necessary for the solvability of all problems even within a very limited domain, agrees in principle with the Kantian conception of mathematics. The relevant utterances by Kant are, it is true, incorrect if taken literally, since Kant asserts that in the derivation of geometrical theorems we always need new geometrical intuitions, and that therefore a purely logical derivation from a finite number of axioms is impossible. That is demonstrably false. However, if in this proposition we replace the term "geometrical" - by "mathematical" or "set-theoretical", then it becomes a demonstrably true proposition. I believe it to be a general feature of many of Kant's assertions that literally understood they are false but in a broader sense contain deep truths. In particular, the whole phenomenological method, as I sketched it above, goes back in its central idea to Kant, and what Husserl did was merely that he first formulated it more precisely, made it fully conscious and actually carried it out for particular domains. Indeed, just from the terminology used by Husserl, one sees how positively he himself values his relation to Kant.
"I believe that precisely because in the last analysis the Kantian philosophy rests on the idea of phenomenology, albeit in a not entirely clear way, and has just thereby introduced into our thought something completely new, and indeed characteristic of every genuine philosophy - it is precisely on that, I believe, that the enormous influence which Kant has exercised over the entire subsequent development of philosophy rests. Indeed, there is hardly any later direction that is not somehow related to Kant's ideas. On the other hand, however, just because of the lack of clarity and the literal incorrectness of many of Kant's formulations, quite divergent directions have developed out of Kant's thought - none of which, however, really did justice to the core of Kant's thought. This requirement seems to me to be met for the first time by phenomenology, which, entirely as intended by Kant, avoids both the death-defying leaps of idealism into a new metaphysics as well as the positivistic rejection of all metaphysics. But now, if the misunderstood Kant has already led to so much that is interesting in philosophy, and also indirectly in science, how much more can we expect it from Kant understood correctly?" [The modern development of the foundations of mathematics in the light of philosophy, Gödel 1961]
Gödel has done a great service here in drawing the very precise and formal development of the foundations of mathematics back to the fundamental questions which drove classical epistemology. The real question is not the building of ever more elaborate logical edifices, but understanding the nature and source of these "more and more new axioms on the basis of the meaning of the primitive notions".
With the more or less decisive defeat of the Formalist and Logicist schools, and Turing's reduction of the problems to questions of programming, controversy in the foundations of mathematics died down after World War Two. Turing's work introduced new concepts of complexity in language which have provided the basis for Noam Chomsky's Kantian structural psychology and the foundations of complexity theory. Gödel's theorem indicates that the behaviour of even purely formal systems cannot be completely described by formal logic, and this is at the root of the inherent complexity, unpredictability and richness of the world of Nature and society.
None of this controversy bore on the issue of how it comes about that mathematics finds application in the sciences. The attempts to reduce mathematics to logic failed, so it must be accepted that mathematics is a science which studies an aspect of Nature, viz., Quantity; it is not just rules for manipulating symbols. Nevertheless, the "Third Positivism", which climbed out of the ashes of the positivism of Mach & Co., took inspiration from the Logicist school and remains an important trend to this day. The way in which mathematics found application in the New Physics was central to the development of positivistic philosophy in the period from 1905 up to recent times.
The problem of value is also the problem of quantity. In understanding the problem of the validity of knowledge, the concept of quantity has an important place. In the physical sciences, this word is usually used in a narrow sense closely related to the concept of number. In resolving the problem of dualism in Western philosophy, Hegel gave to Quantity a broader, more "philosophical" definition:
"Quality is, in the first place, the character identical with being: so identical that a thing ceases to be what it is, if it loses its quality. Quantity, on the contrary, is the character external to being, and does not affect the being at all. [§ 85n] ... Quantity is pure Being, where the mode or character is no longer taken as one with the being itself, but explicitly put as superseded or indifferent." [§ 99, Shorter Logic]
All cognition thus begins with a qualitative-quantitative dialectic [which Hegel called "Measure"], and there can be no separation between quantity and quality, nor any valid separation between "exact sciences" and "inexact sciences" according to the place of measurement in a science. Value is simply the quantitative side of human labour, inseparable from the qualitative side. No conception is possible without a concept of "Measure" determining at what point a thing becomes no longer itself but something else.
Moritz Schlick: In 1926, Schlick gathered around him a group of philosophers known as the Vienna Circle, which included Rudolf Carnap, Otto Neurath and the mathematicians and scientists Kurt Gödel, Philipp Frank and Hans Hahn. Influenced by Ernst Mach and Ludwig Boltzmann, the Circle also drew on the work of Bertrand Russell and Ludwig Wittgenstein. The Circle aimed to apply modern symbolic logic to further develop the views associated with Ernst Mach, and developed what has become known as Logical Positivism or Logical Empiricism. The Vienna Circle was characterised by hostility to what they called "metaphysics", by faith in the techniques of modern symbolic logic, and by the belief that the future of philosophy lay in becoming the handmaiden of natural science.
Rudolf Carnap: studied at Jena from 1910 to 1914, where he attended Frege's lectures; he joined Schlick's circle in 1926 and collaborated with a group of Positivist Empiricists in Berlin led by Hans Reichenbach. Carnap was not concerned with the problem of how people arrive at an understanding of the world, which he relegated to psychology, but sought to develop a logical grounding for empirical knowledge. Carnap's approach was to view the natural language as expressing empirical experience. By substituting more exact logical expressions for the words and phrases of the natural language, with symbols indicating immediate sense-data, Carnap aimed to show that all empirical statements are fully translatable into statements about immediate experiences which are subject to logical analysis. Later his methods moved more towards operational rather than empirical reduction. Sentences not subject to such reduction, and therefore not subject to logical analysis, were declared to be meaningless. All sentences concerning observable physical objects are translatable into the vocabulary of physics, and thus Carnap hoped to establish a method of testing the consistency of physical theories.
To avoid the Nazi threat, in 1936 Carnap moved to the US and joined discussions with Bertrand Russell (Logical Positivism), Alfred Tarski and the Pragmatists Willard Quine (Constructivism) and Charles Morris (Semiology).
Carnap and other Logical Empiricists held that the statements of logic and mathematics, unlike those of empirical science, may be established a priori. Some, including Quine, argued that the attempt to delimit a class of statements that are true a priori should be abandoned as misguided. From 1945, Carnap turned his efforts to the problems of inductive reasoning and of rational belief and decision, seeking to construct a formal system of inductive logic centred on probabilistic implication.
However, the fact is that this whole school, which based itself on the Logicist project initiated by Gottlob Frege, was transcended by Gödel's theorem in 1931. Formal logic is a finite instrument, subsumed within mathematics. It is a wonderful thing about mathematics that a mistaken view can be shown to be so, so decisively and irrevocably.
Edmund Husserl began as a mathematician, moved to psychology to find a solution to the problems raised by the foundations of mathematics, and then to an introspective transcendental system. As a connecting thread between the psychologists and the physicists, between those who focussed on objective knowledge and those who focussed on the soul, he is very important. He is also a link with the classical German tradition. More later.
Ludwig Wittgenstein is also a figure who cuts across the social and mathematical disciplines. He also crossed the Anglo-American / Continental divide. Personally I feel that his whole project was misconceived, but as a key to understanding the crisis of the inter-war years, he is important. More later.
The decisive struggle which characterised the ideological landscape of the post-World War Two world was the struggle over epistemology among the founders of modern quantum physics: Einstein, Bohr, Born and Heisenberg - and I would have to include Percy Bridgman, whose status as a physicist is one grade below the foregoing, but who should be credited with the most consistent and materialist formulation of the Pragmatist principle, developed partly on the basis of the critique of quantum and relativity physics.
The issues relating to the principle of invariance have been dealt with separately in the concluding part of the article "Perception under the Microscope". The relativism which had sought support from Einstein's discovery in this respect went on to seek new bases of support in the new quantum physics.
In general, all these epistemological problems arise exclusively from the intrusion of human practice into phenomena which are totally foreign to the sense organs, and consequently foreign to the entire logic and structure of our intuition, which is intimately connected with sensuous representation.
Einstein held to the position that "the essentially statistical character of contemporary quantum theory is solely to be ascribed to the fact that this [theory] operates with an incomplete description of physical systems", and anticipated further developments of quantum physics which would uncover a causal substratum to the probability field; all the other leaders in the field held, and continue to hold, that the probability field described by Schrödinger's wave equation constitutes a complete description, and that consequently, at the level of quantum behaviour, the "law of sufficient cause" fails to hold - things 'just happen'.
Bohr was the father of the modern, quantum theory of the structure of the atom and founder of the Copenhagen School - a centre for discussion of the philosophical aspects of modern physics. Bohr's most noted contribution to the philosophical problems associated with the interpretation of quantum theory was his Principle of Complementarity, which "implies the impossibility of any sharp separation between the behaviour of atomic objects and the interaction with the measuring instruments which serve to define the conditions under which the phenomena appear." As a result, "evidence obtained under different experimental conditions cannot be comprehended within a single picture, but must be regarded as complementary in the sense that only the totality of the phenomena exhausts the possible information about the objects." This interpretation of the meaning of quantum physics gradually came to be accepted by the majority of physicists. Such a conception makes it impossible to conceive of the properties of quantum objects (such as momentum or frequency) independently of the specific interactions which manifest or determine those properties. Such a situation radically challenges intuitive conceptions of objectivity in which properties such as momentum, position and frequency adhere to objects in themselves. Bohr's conception was central to the overcoming of subjectivist interpretations of this aspect of quantum physics, and in his later years Bohr developed the conception further, applying it more widely.
A student of Max Born and Niels Bohr, it was Heisenberg who first represented the properties of a quantum object as matrices, that is, as arrays of numbers, and determined the laws of interaction between such objects using matrix algebra. Heisenberg applied Einstein's operational approach to the solution of the problem of relativity to the determination of the properties of quantum objects, one of the outcomes of which is the famous Heisenberg Indeterminacy (or Uncertainty) Principle. The form he derived appeared in a paper that tried to show how matrix mechanics could be interpreted in terms of the intuitively familiar concepts of classical physics. If q is the position coordinate of an electron and p its momentum, and if q, and independently p, have been measured for many electrons in the particular state, then, Heisenberg proved, Δq · Δp > h, where Δq is the standard deviation of measurements of q, Δp is the standard deviation of measurements of p, and h is Planck's constant (6.626 × 10⁻²⁷ erg-seconds). Indeterminacy principles are characteristic of quantum physics; they state the theoretical limitations imposed upon any pair of non-commuting matrix variables: in such cases, the determination of one affects the determination of the other. Heisenberg took the principle to indicate the non-intuitive properties of quantum, as distinct from classical, systems.
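The non-commutativity at issue here can be illustrated with a minimal pure-Python sketch. The two matrices below are toy examples chosen only to show that the order of matrix multiplication matters; Heisenberg's actual q and p are infinite matrices, not 2x2 ones.

```python
def matmul(a, b):
    """Multiply two square matrices given as lists of rows."""
    n = len(a)
    return [[sum(a[i][k] * b[k][j] for k in range(n)) for j in range(n)]
            for i in range(n)]

# Two illustrative 2x2 matrices.
A = [[0, 1],
     [0, 0]]
B = [[0, 0],
     [1, 0]]

print(matmul(A, B))  # prints [[1, 0], [0, 0]]
print(matmul(B, A))  # prints [[0, 0], [0, 1]] -- AB != BA: A and B do not commute
```

For ordinary numbers the two products would always agree; for matrix variables they need not, and it is exactly this failure of commutation between the matrices representing paired quantum properties that the indeterminacy principle quantifies.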
Although he came early, and indirectly, under the influence of Ernst Mach, Heisenberg, in his philosophical writings about quantum mechanics, vigorously opposed the Logical Positivism developed by the philosophers of science of the Vienna Circle. According to Heisenberg, what was revealed by active observation was not an absolute datum, but a theory-laden datum; i.e., one relativised by theory and contextualised by observational situations. He took classical mechanics and electromagnetics, which articulated the objective motions of bodies in space-time, to be permanently valid, though not applicable to quantum mechanical systems; and he took causality to apply in general not to individual quantum mechanical systems but to mathematical representations alone, since particle behaviour could be predicted only on the basis of probability.
The struggle to develop the mathematical instruments needed to describe quantum behaviour, and to resolve the epistemological problems which were generated by this work, raged throughout the inter-war period; by the early years after World War Two, a settled interpretation of quantum physics was available which avoided subjectivist misconceptions. In the meantime, however, a substantial and significant area of physical theory had been created in which intuitive conceptions of the objectivity of the physical properties of objects existing independently of observation had become untenable.
Einstein's outstanding point of dissension is often exaggerated. The jury is still out on whether the statistical interpretation is final and the principle of sufficient cause not universally valid, or whether, on the other hand, a new, non-statistical interpretation of the field will come about, deriving the statistical manifestation from the indeterminacy of underlying causal interactions. The fact remains that probability is an objective phenomenon manifested in all complex processes, irrespective of whether it is found to be "irreducible" in microphysics.
To a great extent the crisis of quantum physics simply continued and deepened the crisis created by Einstein's Theory of Invariance (or Relativity), but the peculiar difficulty brought out by Quantum Physics is "wave-particle" duality. Mathematics provided a means to consistently describe quantum interactions, but any attempt to render the equations of quantum physics into the natural language referring to the objects of ordinary sensuous representation leads to contradictions.
I have frequently used the word "determination" above in a context where it is common to use the word "measurement". "Measurement" carries the implication that the value of a property inheres in the object, and the act of measurement brings this value to consciousness. Interpretation of quantum interactions in this way inevitably leads to subjectivism and interpretations which manifest formal contradictions. A quantum property, which is representable mathematically by a matrix, provides a substratum which allows of reification - that is, it may be deemed to adhere to a quantum object without leading to contradictions and inconsistency. However, such matrix properties defy imagination. An interaction which leads to an event in the "macrosphere", such as a flash on a phosphorescent plate indicating the impact of an electron, determines the position of the electron; subjectively expressed, the observer measures the position of the electron by using the phosphorescent plate. Determination is an objective process which goes on independently of the consciousness of an observer who organises experimental apparatus with the purpose of making a measurement. Determination always involves an interaction in which phenomena representable by quantum-mathematical entities give rise to phenomena representable by the mathematics of classical physics and familiar to intuitive understanding. In other words, we have here the same problem of subjectivist interpretation that arises in connection with the operational definition of classical properties of objects moving at speeds comparable to the speed of light.
All these epistemological problems arise due to the intrusion of human practice in phenomena beyond the domain of sense perception, now combined with the capacity of mathematics to effectively describe this practice - mathematics which has itself gone beyond the domain of primitive intuitive notions. In other words, these problems arise in a world in which the products of human industry and science transcend human natural-sensuous experience.
Human senses can no longer be understood as natural attributes nor can human reason be understood as something either innate or arising from natural-human interaction with Nature. Both must be conceived as social products, including measuring instruments and mathematics, both of which are the products of human labour.
With these achievements, what was worked out in general by Hegel and Marx has been worked out in detail insofar as it relates to the practical-natural activity of people. What now remains is to understand the nature of the social relations which underlie the production of the concepts by means of which people understand Nature. This is not to say that science came to an end with the resolution of the epistemological problems of Relativity and Quantum physics. Far from it. But the comprehension of these scientific revolutions in epistemological terms dealt with the problems of knowledge; what remains is the endless task of the progress of natural science itself, which cannot be furthered without the resolution of the problems of social development.
Natural science finds itself faced with the task of tackling the problems of the social and historical development of science.