
Chapter 16 Non-Demonstrative Reasoning

My Philosophical Development, by Bertrand Russell
I returned to England in June 1944 after three weeks on the Atlantic. Trinity College had given me a five-year lectureship, and I chose "Non-Demonstrative Reasoning" as the title of my annual course. Before this, I had become more and more aware of the narrow scope of the deductive reasoning used in logic and pure mathematics. I perceived that the reasoning used in common sense and in science is of a different kind: its nature is such that, even when the premises are true and the reasoning correct, the conclusion is only probable. During the first six months after my return from America I lived at Trinity College. Although Germany was launching V-1 and V-2 rockets, I enjoyed a sense of tranquillity. I began to study probability and the kinds of reasoning that yield only probable conclusions. I found the subject at first somewhat bewildering, since so many issues were tangled together that each thread had to be separated from the others. The positive results of this work appear in Human Knowledge, but in that book I never mention the various perplexities and tentative hypotheses through which I arrived at my final conclusions. I now think this was a mistake, because it makes the conclusions seem more hasty and less secure than they really are.

I found the subject of non-demonstrative reasoning much larger and more interesting than I had supposed. I found that most discussions of it are too narrowly confined to the study of induction. I came to the conclusion that inductive arguments, unless confined within the limits of common sense, lead more often to false conclusions than to true ones. The limits imposed by common sense are easy to feel but not easy to formulate. In the end I concluded that, although scientific reasoning requires unprovable, extra-logical principles, induction is not one of them. Induction has its place, but it cannot serve as a premise. I shall return to this subject shortly.

Another conclusion I was driven to was that, if we knew only what can be experienced or verified, not only science but a great deal of knowledge that nobody doubts would be impossible. I think experience has been overvalued in the past, and that the philosophy of empiricism must therefore be considerably restricted. I was at first somewhat perplexed by the size and number of the questions involved. Since non-demonstrative inferences by their nature confer only probability on their conclusions, I thought it prudent to begin with a study of probability, especially as there already existed on this subject a body of positive knowledge which seemed like a raft floating on an ocean of uncertainty. For several months I studied the calculus of probability and its applications. There are two kinds of probability: one is statistical, the other has to do with degrees of doubtfulness. Some theorists think they can manage with only the one, some with only the other. The mathematical calculus, as usually interpreted, deals with statistical probability. There are fifty-two cards in a pack, so, if you draw a card at random, the chance that it is the seven of diamonds is one in fifty-two. It is generally believed (though not conclusively proved) that if you draw a great many cards at random, the seven of diamonds will appear in about one draw out of every fifty-two. The study of probability arose historically from the interest of certain noblemen in games of chance.

They hired mathematicians to work out ways of gambling that would be profitable rather than ruinous. The mathematicians produced a number of interesting works, but they do not seem to have made their employers any richer. There is a theory that all probability is statistical; it is called the "frequency" theory. What, for example, is the probability that a person chosen at random from the population of Great Britain has the surname "Smith"? You find out how many people there are in Britain and how many of them are named Smith, and you then define the probability of choosing a person surnamed "Smith" as the ratio of the number of Smiths to the total population. This is a perfectly precise mathematical concept which has nothing to do with uncertainty. Uncertainty comes in only when you apply the concept.
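To make the frequency idea concrete, here is a minimal sketch in Python; the population figures are invented purely for illustration, and the card simulation merely checks empirically that the seven of diamonds turns up in roughly one draw out of fifty-two.

```python
import random

# Frequency-theory probability: simply a ratio of counts.
# These population figures are hypothetical, chosen only for illustration.
population = 60_000_000
smiths = 500_000
p_smith = smiths / population        # probability that a random person is a "Smith"
print(f"P(surname is Smith) = {p_smith:.4f}")

# Statistical probability for the pack of cards: over many random draws,
# the seven of diamonds should appear in roughly 1 draw out of 52.
deck = [(rank, suit) for rank in range(1, 14)
        for suit in ("clubs", "diamonds", "hearts", "spades")]
trials = 520_000
hits = sum(1 for _ in range(trials) if random.choice(deck) == (7, "diamonds"))
print(f"observed frequency = {hits / trials:.5f}, expected about {1 / 52:.5f}")
```

Nothing in these calculations involves uncertainty; uncertainty enters only when such ratios are applied to particular cases, as in the example that follows.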

Uncertainty comes in, for example, when you see a stranger on the other side of the street and bet a hundred to one that his name is not "Smith". But so long as you do not apply the probability calculus to empirical material, it remains a perfectly pure branch of mathematics, with all the precision and certainty of mathematics. There is, however, another and quite different doctrine, adopted by Keynes in his Treatise on Probability. He holds that there can be a relation between two propositions by which the one makes the other more or less probable. He holds that this relation is indefinable and capable of varying degrees, the extremes being reached when one proposition makes the other certainly true and when it makes it certainly false. He did not believe that all probabilities can be measured numerically, or reduced, even in theory, to frequencies.
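Keynes did not hold that every such relation can be measured numerically, but a toy Bayesian calculation, with invented numbers, at least illustrates what it is for one proposition to make another more or less probable, with entailment as the extreme case.

```python
# A toy illustration (invented numbers) of one proposition making another
# more or less probable.  Keynes did not claim that all such relations are
# numerically measurable; this is only a sketch of the idea.

def posterior(prior, likelihood_if_true, likelihood_if_false):
    """P(h | e) by Bayes' rule, given P(h), P(e | h) and P(e | not-h)."""
    numerator = prior * likelihood_if_true
    return numerator / (numerator + (1 - prior) * likelihood_if_false)

p_h = 0.30                              # initial credibility of hypothesis h
print(posterior(p_h, 0.90, 0.20))       # evidence e makes h more probable
print(posterior(p_h, 0.05, 0.60))       # evidence e' makes h less probable
print(posterior(p_h, 1.00, 0.00))       # extreme case: e guarantees h, probability 1
```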

I came to the conclusion that, wherever probability is definite, the frequency theory applies; but there is another conception, misleadingly called by the same name, to which something more like Keynes's theory applies. This other conception I call "degree of credibility" or "degree of doubtfulness". It is obvious that we are much more certain about some things than about others, and that our uncertainty is often not statistical. It is true that something statistical can sometimes be found where it is not obvious at first sight. I read a book about the Saxon invasion of England which led me to believe that Hengist was real, while Horsa was probably legendary. It might be possible to set the evidence about Horsa beside the evidence about other historical figures and see in what proportion of cases such evidence has proved right or wrong. But this procedure, though sometimes possible, is by no means always so, and degree of doubtfulness therefore remains an essential concept in the study of what passes for knowledge.

In the problems I was concerned with, I found the concept of doubtfulness far more important than that of mathematical probability. In the inferences I studied, not only are the premises, even if true, insufficient to make the conclusions certain; what is much more important, the premises themselves are uncertain. This led me to the conclusion that the mathematical aspects of probability have less to do with the problems of scientific inference than one might suppose. I next set about collecting instances of inferences which we all feel to be sound, but which can be justified only by principles lying outside logic. In collecting such instances I admitted everything that only a philosopher defending a theory would doubt. Broadly speaking, I do not reject common sense except where it conflicts with strong scientific arguments. Here is a very simple example: suppose, on a sunny day, you are out walking; your shadow goes with you; if you wave your arms, your shadow waves its arms; if you jump, your shadow jumps; you therefore do not hesitate to call it your shadow and to feel sure that it is causally connected with your body. Yet, although no sane person would doubt this reasoning, it cannot be logically demonstrated. It is not logically impossible that there should be a dark patch going through movements not unlike yours, but existing independently of you. I collected as many instances as I could think of in which non-demonstrative inferences seem beyond question, and I found, by analysing them, that if we are not mistaken in accepting them, certain principles other than those of logic must be true. The principles derive their evidence from the instances, and not vice versa. I think there are several such principles, but I came to the conclusion that induction is not one of them.

I found that some people, for lack of analysis, are willing to admit one class of non-demonstrative inferences because they have a subjective bias in favour of certain kinds of knowledge, while rejecting another class because of an opposite bias. It seems to me that, in any particular inference which appears beyond question, we should seek out the principle on which it depends and admit other inferences which rest on the same principle. I found that almost all philosophers have been mistaken as to what can and cannot be inferred from experience alone. I divided the problem of empirical knowledge into three stages: (1) knowledge about myself; (2) knowledge about the minds of others, which includes testimony; and (3) knowledge about the physical world. Beginning with knowledge about myself, I found that solipsism, as commonly expounded, includes much that is incompatible with the caution which inspires it. I do not remember anything that happened to me before I was two years old, but I do not think it sensible to say that I came into existence at the age of two. I am convinced that many things happened to me later which I do not remember. Even what I do remember may never have happened.

I sometimes have dreams in which I remember things that are wholly imaginary. I once dreamed that I was in terror of the police because, in the dream, I "remembered" that Whitehead and I had murdered Lloyd George a month earlier. It follows that my remembering something is not, by itself, enough to prove that it really happened. The solipsist, therefore, if he is to attain the logical security he seeks, must confine himself to what I call "solipsism of the moment". He must not only say, "I do not know whether the physical world exists, or whether there are minds other than my own"; he must go a step further and say, "I do not know whether I have a past or a future, for these are just as doubtful as the existence of other people or of the physical world." No solipsist has ever been as thorough as this, and every solipsist is therefore inconsistent, since the inferences he admits about his own past have no better warrant than inferences about other people and things.

A great deal of our firmly held knowledge rests on testimony, and testimony presupposes the existence of minds other than our own. To common sense the existence of other minds seems beyond doubt, and I see no reason to differ from common sense on this point. But there is no doubt that it is my own experiences which compel me to believe in other minds, and there is no doubt that, from a purely logical standpoint, I could have had these experiences even if there were no other minds. The belief in other minds rests partly on analogy, but partly on another ground which has a wider application. If you compare two copies of the same book and find that they agree word for word, you cannot resist the conclusion that they have a common cause, and you can trace that common cause back through the processes of printing and publishing to the author. You cannot believe that the author's body went through the motions of writing the book while he had no thoughts at all. These grounds for admitting the existence of other minds are not demonstrative in the logical sense.

In dreams you may have experiences which, while you are asleep, are just as convincing, yet which you reject on waking. Such facts justify a certain degree of doubtfulness, but usually only a slight one. In most cases they leave you justified in accepting testimony, provided there is no contrary evidence.

I come next to purely physical events. Take, for example, our reasons for believing in sound waves. If there is a loud explosion somewhere, the times at which different people hear it depend on their distances from the spot. We cannot believe that these people hear the noise at different moments unless something has been happening in the intervening space. A series of occurrences confined to the places where there are ears, with nothing connecting them elsewhere, strikes us as too disjointed to be credible. A simpler example is the persistence of objects. We cannot believe that Mount Everest ceases to exist when nobody is looking at it, or that our house vanishes the moment we leave it. There is no reason to believe such absurdities. What compels us to reject them is essentially the same as what compels us to believe that things happened to us which we have now forgotten.

Not only science, but a great deal of common sense, is concerned not with particular events but with general laws. Our knowledge of general laws, however, when it is empirical, is inferred, validly or invalidly, from our knowledge of particular occurrences. "Dogs bark" is a general law, but it could not be known unless particular dogs had been heard barking at particular times. I found that our knowledge of such particular occurrences raises problems which some philosophers, notably the logical positivists, have not sufficiently considered. But these problems do not belong to non-demonstrative inference, for an inference of the kind in question can be justified only if it rests on a general law. If you hear a bark and infer that there is a dog, you are using a general law. Most of the laws that science seeks are causal. This brought me to the questions: what exactly do we mean by causation, and what evidence is there that it occurs? Philosophers used to suppose that a causal law could be expressed in the form "A causes B", meaning that whenever an event of a certain kind A occurs, an event of another kind B follows. Many held that a causal sequence involves not merely invariability but some further character which may be called "necessity". Many empiricists denied this, holding that nothing is involved except invariable sequence. But neither view would have been insisted on by philosophers who knew more science. The causal laws that science actually finds are not statements of invariable sequence of events; they express, rather, tendencies. In classical dynamics they take the form of differential equations, which give accelerations, not actual occurrences. In modern physics causation has become statistical: it does not say what a particular event will be, but states that, out of many events, different outcomes will each occur in a certain proportion of cases. Causation, therefore, is no longer what it was in the books of the old-fashioned philosophers. Nevertheless it still retains an important place.
Consider, for example, what is meant by a "thing" which persists for a time more or less unchanged. Such a "thing" must consist of a series of sets of events, each set constituting what may be called a momentary state of the "thing". The states of the "thing" at different times are often, though not always, connected by laws which can be stated without mentioning other "things". If this were not so, scientific knowledge could never have begun: unless we can know something without knowing everything, it is obvious that we can never know anything. This applies not only to particular events but also to the laws connecting events. In physics, atoms and molecules persist for a certain time, and without such persistence the concept of motion would lose its meaning. A human body persists for a certain time, although the atoms and molecules composing it change. A photon travelling from a star to a human eye persists throughout its journey, and without this persistence we could not say what is meant by seeing a star. But all such persistence is only usual, not invariable, and the causal laws with which science begins state only what usually happens, approximately. Whether anything more exact will be known in the future, we cannot now tell.

What we can say, I think, is roughly this: given any event, there is usually, at a neighbouring time and in some neighbouring place, an event very like it; and it is usually possible to find some law which roughly determines in what respect such an event differs from the given event. Some such principle is needed to account for the approximate constancy of many "things", and also to account for the difference between a perception of A and a perception of B, for example when A and B are two stars which we both now see.

When a series of events has the property that, from any one of them, something can be inferred about neighbouring members of the series, I call the series a "causal line". It is the existence of such causal lines that makes the concept of a "thing" serviceable to common sense and the concept of matter serviceable to physics. And it is because causal lines are approximate, not eternal and not universal, that modern physics finds the concept of "things" unsatisfactory.
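The earlier remark about classical dynamics can be made concrete with a minimal sketch: a differential-equation law (here a harmonic oscillator, chosen only as an example) fixes accelerations, and particular events emerge only when the law is integrated from given initial conditions.

```python
# A minimal sketch of the point about classical dynamics: the causal law
# (a harmonic oscillator, used purely for illustration) fixes the
# acceleration at each instant, not the actual events; particular events
# follow only once initial conditions are supplied and the law integrated.

def simulate(x0, v0, k=1.0, m=1.0, dt=0.001, steps=5000):
    x, v = x0, v0
    for _ in range(steps):
        a = -(k / m) * x      # the "law": acceleration as a function of state
        v += a * dt           # positions and velocities, the actual events,
        x += v * dt           # emerge by integrating the law step by step
    return x

# The same law yields different histories for different initial conditions.
print(simulate(x0=1.0, v0=0.0))
print(simulate(x0=0.0, v0=2.0))
```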
There is another concept which I found very useful in non-demonstrative inference, the concept of "structure". If you see red in one direction and blue in another, it is reasonable to suppose that something happening in the one direction differs from what is happening in the other. Thus, although we must admit that the external causes of our colour sensations are not themselves coloured as our percepts are, we may hold that, when we see a pattern of colours, there is a corresponding pattern in the causes of our colour sensations. The concept of a space-time structure which remains constant, or approximately constant, throughout a series of causally connected events is important and useful. To take a very simple example: suppose A reads a passage aloud from a book and B writes down what A reads, and what B writes down agrees with what A saw in the book. It is impossible to deny a causal connection among four sets of events: (1) the words printed in the book, (2) the sounds made by A in reading aloud, (3) the sounds heard by B, and (4) the words written down by B. The same applies to the relation between a gramophone record and the music it produces. Another example is broadcasting, in which sound is turned into electromagnetic waves and the electromagnetic waves back into sound. Unless the electromagnetic waves which mediate between the sound spoken and the sound heard had a space-time structure very similar to that of the spoken and heard words, the close similarity between what is spoken and what is heard would be impossible. Nature offers innumerable examples of complex structures transmitted, with changes of intrinsic quality, from one set of events to another causally connected with it; the change between sound and electromagnetic waves in broadcasting is such a change of intrinsic quality. Indeed, all seeing and hearing involves this transmission of structure without intrinsic quality.

Those unaccustomed to modern logic find it hard to suppose that we can know the space-time structure of events without knowing the qualities that compose the structure. This is part of a larger department of knowledge. Unless we are to fall into absurd paradoxes, we must admit that we can know propositions of the form "all A is B" or "some A is B" without being able to give any instance of an A; for example, "every number that I have never thought of, and never shall think of, is greater than a thousand". The proposition cannot be denied, yet I should contradict myself if I tried to give an instance. The same holds of the space-time structure of the purely physical world, where there is no reason to suppose that the qualities composing the structure bear any intrinsic resemblance to the qualities we know in sense experience.
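As a toy illustration of structure surviving changes of intrinsic quality, the following sketch, invented purely for illustration, passes a message through two quite different media; the individual items change entirely, but the pattern of repetitions among positions, which is all the relational structure amounts to here, is preserved.

```python
# A contrived illustration of structure carried through changes of intrinsic
# quality: a message becomes numbers, then "tones"; the items change, but the
# pattern of repetitions among positions survives unchanged.

def pattern(seq):
    """The relational skeleton: which positions carry the same item."""
    first_seen = {}
    return tuple(first_seen.setdefault(x, len(first_seen)) for x in seq)

message = "a dog barks"
codes   = [ord(c) for c in message]               # intrinsic quality 1: integers
tones   = [440.0 * 2 ** (c / 12) for c in codes]  # intrinsic quality 2: frequencies

assert pattern(message) == pattern(codes) == pattern(tones)
print("structure preserved:", pattern(message))
```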
The general principles needed to validate scientific inference cannot be proved in the ordinary sense. They are distilled by analysis from particular instances which seem overwhelmingly obvious, such as the case of A's reading aloud and B's written record mentioned above. There is a gradual development from what may be called "animal expectation" up to the most refined laws of quantum physics. The whole process starts with experiencing A and expecting B. An animal smells a certain smell and expects that what it smells will be good to eat. If its expectations were usually wrong, it would not survive. Evolution and adaptation to the environment make expectations more often right than wrong, although the expectations are not logically demonstrable. As an argument against Cartesian scepticism this is of course worthless; but if we start from scepticism I do not think we can ever get anywhere. What appears to be knowledge, and what there is no positive reason to reject, must broadly speaking be admitted, and we must take it as our starting point. Hypothetical scepticism is useful in logical dissection: it enables us to see what can be established without this or that premise, just as we can inquire how much of geometry survives without the axiom of parallels. But that is its only use.

Before explaining the real epistemological role of the indemonstrable premises of non-demonstrative inference, something more must be said about induction. I said above that induction is not among the premises of non-demonstrative inference. This is not because induction is never used, but because, in its purely formal shape, it is not an indemonstrable premise. In his Treatise on Probability, Keynes made a very able investigation of the question whether induction can be derived from the mathematical theory of probability. The question he examined was this: given a number of instances of As that are Bs and no contrary instance, under what circumstances does the probability of the generalization "all A is B" approach certainty as a limit when the number of favourable instances is increased indefinitely? He concluded that two conditions must be satisfied. The first, and more important, is that the generalization "all A is B" must have a finite probability, on the basis of the rest of our knowledge, before any instances of As that are Bs are known. The second is that, if the generalization is false, the probability of our having observed only favourable instances should tend to zero as a limit when the number of observed instances is sufficiently increased. Keynes held that this second condition is satisfied if there is some probability P, short of certainty, such that, on the assumption that the generalization is false and that n-1 As have been found to be Bs, the chance that the nth A will be a B is always less than P when n is sufficiently large. The second of the two conditions is less important than the first, and causes much less difficulty; a small numerical sketch of how the two work together is given below. Let us concentrate on the first condition.
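The force of Keynes's two conditions can be shown numerically (the prior and the bound P below are invented for illustration): given a finite antecedent probability, and given the bound on favourable instances when the generalization is false, Bayes's rule drives the probability of the generalization toward certainty as favourable instances accumulate, while a zero antecedent probability leaves it at zero forever.

```python
# A toy numerical sketch of Keynes's two conditions, with invented numbers.
# Give the generalization "all A is B" a finite prior probability, and suppose
# that, if it is false, each further favourable instance has probability at
# most P < 1.  Bayes' rule then drives the probability of the generalization
# toward certainty as favourable instances accumulate.

prior = 0.01      # finite antecedent probability (condition 1, assumed here)
P = 0.9           # bound on a favourable instance if the generalization is false

for n in (1, 10, 50, 100, 500):
    # P(n favourable instances | generalization true)  = 1
    # P(n favourable instances | generalization false) <= P**n
    posterior = prior / (prior + (1 - prior) * P ** n)
    print(f"after {n:3d} favourable instances: at least {posterior:.6f}")

# Without condition 1 (prior = 0) the posterior stays at 0 however many
# favourable instances are observed, which is why the first condition matters.
```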
But how are we to know that a generalization has a finite probability in its favour before we have examined any of the evidence for or against it? If we knew this, Keynes's argument would give the generalization a high degree of probability once it was found to have many favourable instances and none unfavourable. The postulates which I arrived at by analysing instances of non-demonstrative inference serve to confer this finite antecedent probability on certain generalizations and not on others. It should be noted that, for this purpose, the postulates need not be certain; a finite probability is all that is required. In this respect they differ sharply from the kind of a priori principles sought by idealist philosophers, whose advocates claimed for them a greater certainty than belongs to most empirical knowledge.

I ended with five postulates. The exact way in which they are formulated is not important, and I think it quite likely that their number could be reduced and their statement made more precise. While I am not sure that all of them are necessary, I do believe that they are sufficient. It should be noticed that they assert only probabilities, not certainties, and claim to confer no more than that finite antecedent probability which Keynes requires to make induction valid. I have already said something preliminary about these postulates; I will now restate them rather more exactly and fully.

The first I call the "postulate of quasi-permanence". In a certain sense, I think, it can replace Newton's first law of motion. The considerable success with which common sense employs the concepts of "person" and "thing" depends upon it, and so did the long period during which science and philosophy were able to operate with the concept of "substance". It may be stated as follows: given any event A, it very frequently happens that, at any neighbouring time, there is at some neighbouring place an event very similar to A. Common sense regards this very similar event as part of the history of the person or thing to which A belonged.

The second is the postulate of separable causal lines. This is perhaps the most important of the five. It allows us to draw, from partial knowledge, inferences which are partial and probable. We believe that everything in the universe has, or may have, some influence on everything else; and since we do not know everything about the universe, we cannot say with certainty how anything will behave, but we can say so roughly and with probability. Without this we could acquire no knowledge and discover no scientific laws. The postulate is as follows: it is frequently possible to form a series of events such that, from one or two members of the series, something can be inferred about all the other members. The most obvious examples are such things as sound waves and light waves; it is the persistence of such waves that enables hearing and sight to tell us about what is happening at a distance as well as near at hand.
The third is the postulate of spatio-temporal continuity, which denies action at a distance. It asserts that, when there is a causal connection between two events which are not contiguous, there must be intermediate links in the causal chain. For example, if A hears what B says, we suppose that something has happened in the space between A and B. I am not prepared to say, however, that this postulate cannot be reduced to a tautology, since physical space-time is wholly inferential and the ordering of events in space-time itself depends upon causation.

The fourth I call the "structural postulate". It is extremely important and powerful. It concerns such cases as that in which many people hear the same lecture, or watch the same performance in a theatre, or, to take a larger example, in which everybody sees the stars in the sky. It may be stated as follows: when a number of structurally similar complex events are grouped about a centre in regions not widely separated, it is usually the case that they belong to causal lines having their origin in an event of the same structure at the centre. Here it is the space-time structure that matters; I first emphasized this point in The Analysis of Matter. Space-time structure explains how one complex event can be causally connected with another complex event to which it bears no qualitative resemblance whatever: the two need resemble each other only in the abstract properties of their space-time structure. Obviously, the electromagnetic waves used in broadcasting produce sensations in the hearer, but apart from structure there is no resemblance between the waves and the sensations. It is just because structure is what matters that theoretical physics can rest content with formulae about unperceived events which need not resemble the events of our experience in any respect except structure.

Finally there is the postulate of analogy, whose chief function is to justify the belief in other minds. It may be stated as follows: given two classes of events A and B, and given that, whenever both A and B can be observed, there is reason to believe that A causes B, then, if in a given case A is observed but there is no way of observing whether B occurs, it is probable that B occurs; and similarly, if B is observed but the presence or absence of A cannot be observed, it is probable that A has occurred.

I repeat that the warrant for these postulates is that they are implicit in all the inferences we regard as valid and that, although they cannot themselves be formally proved, the whole system of science and everyday knowledge out of which they have been distilled is, within limits, capable of a kind of self-confirmation. I do not accept the coherence theory of truth, but there is a coherence theory of probability which is important and which I believe to be valid. Suppose you have two facts and a causal principle connecting them: the probability of all three together may be greater than the probability of any one of them taken alone; and the greater the number and complexity of the interconnected facts and principles, the greater the increase of probability derived from their mutual coherence.
It must be remembered that, unless some principle is introduced, a collection of supposed facts can be neither coherent nor incoherent, since without principles going beyond logic no two facts can either imply or contradict one another. I believe that the five principles above, or something like them, can serve as the basis of the kind of coherence which produces the increase of probability we have been discussing. Something vaguely called "causality" or "the uniformity of nature" appears in many discussions of scientific method; the purpose of these postulates is to replace such general and vague principles by something more precise and more effective. I do not feel great confidence in the postulates exactly as stated above, but I am convinced that something of this kind is indispensable if we are to justify the non-demonstrative inferences which, in fact, nobody doubts.

Ever since I began to write Principia Mathematica I have followed a certain method, of which at first I was not very conscious but which has gradually become more explicit in my thinking. The method is to build a bridge between the world of sense and the world of science, both of which I accept as, in the main, not open to dispute. It is like boring a tunnel through an alpine mountain: the work must proceed from both ends in the hope of meeting in the middle.

Let us begin with the analysis of scientific knowledge. All scientific knowledge makes use of artificially manufactured entities whose purpose is to be easily manipulated by some method of calculation. The more advanced the science, the more fully this is true, and of all the empirical sciences it is most fully true of physics. In an advanced science such as physics, the philosopher's first task is to exhibit the science as a deductive system which starts from a small number of principles, from which the rest follows logically, and from a small number of entities, real or hypothetical, in terms of which everything the science deals with can, at least in theory, be interpreted. If this work is done properly, the principles and entities left at the end of the analysis can be taken as standing for the whole science, and the philosopher need not trouble himself with the rest of the complicated knowledge the science contains. But no empirical science aims merely at being a self-contained fairy tale; it aims at containing propositions which apply to the real world, and it is because of their relation to the real world that its propositions are believed. Even the most abstract parts of science, such as the general theory of relativity, are accepted because of observed facts. The philosopher cannot, therefore, avoid studying the relation between observed facts and scientific abstractions. This is a difficult task.
One of the reasons for the difficulty is that our starting point is common sense, and common sense is already infected with theory, though of a crude and early kind. What we think we observe is more than what we in fact observe, the excess being added by the metaphysics and science of common sense. I do not say that we should wholly reject the metaphysics and science of common sense, but only that they too are part of what we have to examine. They belong neither to the extreme of science expressed in formulae nor to the extreme of pure observation.

I have been much criticized for applying the methods of mathematical logic to the interpretation of physics, but on this point I am quite unrepentant. It was Whitehead who first showed me what is possible in this field. Mathematical physics works with a space composed of points, a time composed of instants, and a matter composed of point-particles. No modern mathematical physicist supposes that there are such things in nature. But given a jumble of material lacking the smooth properties that mathematicians like, it is possible to construct out of it structures which contain that material and which also possess properties convenient for the mathematician. It is because this is possible that mathematical physics is not a mere game; and it is mathematical logic that shows how such structures are to be constructed. For this reason mathematical logic is an essential tool in building the bridge between sense and science of which I spoke above.

The Cartesian method of doubt appealed to me greatly when I was young, and it may perhaps still serve as a tool in the work of logical analysis, but I no longer feel that it possesses any fundamental validity. Universal scepticism cannot be refuted, but neither can it be accepted. I have come, in the end, to accept the facts of sense and the broad truths of science as things which the philosopher must take as data, since, though their truth is not quite certain, it has a higher degree of probability than anything attainable in philosophical speculation.

In passing from crude fact to science we need forms of inference other than those of deductive logic. It has traditionally been supposed that induction could serve this purpose, but this is a mistake, for it can be shown that the conclusions of inductive inferences from true premises are more often false than true. The principles of inference required for the passage from sense to science are to be attained only by analysis, and the analysis to be undertaken is that of the kinds of inference which nobody in fact doubts. For example: if you see your cat on the hearth-rug at one moment and at the door a little later, you conclude that it has passed over a series of intermediate positions, although you did not see it do so. If the work of analysing scientific inference is done properly, the concrete inferences it validates will be found to be (a) such as no one sincerely doubts, and (b) such as are indispensable if, on the basis of the facts of sense, we are to believe in anything beyond those facts.

The results of such work are to be counted as science rather than philosophy; that is to say, the reasons for accepting them are the ordinary reasons applicable in scientific work, not remote reasons derived from some metaphysical doctrine. Above all, the kind of certainty demanded by rash philosophers is unattainable, though they have often deluded themselves into thinking that they had attained it.
Press "Left Key ←" to return to the previous chapter; Press "Right Key →" to enter the next chapter; Press "Space Bar" to scroll down.
Chapters
Chapters
Setting
Setting
Add
Return
Book