
Chapter 31: The Edge of Chaos

Complexity, by M. Mitchell Waldrop · 10,404 words · 2018-03-20
"I have a machinist's streak in me," Langton said. "I always want to play with something, put it together, see it work. Once I actually put something together, any doubts go away. I could see artificial life starting here." He was very clear about what came next: now that he had gotten self-reproduction working in his cellular-automaton world, he had to go further and require these models to perform certain tasks before self-replicating, such as finding sufficient energy, or a supply of suitable component parts. He had to build many such models, so that they could compete with each other for resources. He had to give them the ability to move around and sense one another. He had to allow for every possibility of change, for errors in reproduction. "All of these are problems that need to be solved. But it's all good now. I know I can embed the mechanics of evolution in the von Neumann universe."

With this self-reproducing cellular automaton in hand, Langton returned to campus for another round of efforts to win support for an interdisciplinary Ph.D. He would point to the constantly unfolding structures on the screen and tell people, "This is what I want to study." Still no success. The feedback he got was even more lukewarm than before. "At that stage there was so much to explain to people," he says. "People in the anthropology department didn't understand computation and loops, let alone cellular automata. 'Is this any different from a video trick?' they would ask. And the people in the computer science department knew nothing about cellular automata and had no interest in biology. 'What does self-reproduction have to do with computer science?' they asked. So when you tried to paint the whole picture for someone, you ended up looking like a real blithering idiot."

"But I knew I wasn't crazy," he said. "I felt very sane, saner than anyone else. In fact, that's what worried me. I'm sure crazy people feel that way too." Having made no progress, it was time to find another way. Langton wrote to his former philosophy mentor, Wesley Salmon, now at the University of Pittsburgh, asking, "What should I do?" In reply, Salmon passed along his wife's advice: "Go ask Burks." Burks? "I thought he was dead. Most of the people of his generation are dead," Langton said. Burks, however, was alive and well at the University of Michigan. Moreover, when Langton began corresponding with him, Burks gave him great support, even arranging for him to compete for financial aid as a teaching assistant and research assistant. Apply, he wrote.

Langton applied immediately. By then he had learned that Michigan's computer and communication sciences department was well known in exactly the field he was pursuing. "For them, information processing was a discipline that cut across all disciplines; any kind of information processing was worth studying. That's the idea I applied there for," Langton said. Soon afterward he received a letter from the department head, Professor Gideon Frieder: "Sorry, your background is inappropriate." His application was rejected. Langton was furious. He wrote back a seven-page letter whose gist was: what the hell are you doing!? "That's the whole philosophy and purpose you claim to live and breathe, and it's exactly what I'm after. And you're saying no to me?"

A few weeks later, Frieder sent Langton a reply, to the effect of "Welcome to the department." He later told Langton, "I just like having people around who dare to say 'no' to the dean." In fact, as Langton later learned, it was more complicated than that. Burks and Holland had never even seen his original application. For various bureaucratic and financial reasons, the broad Department of Computer and Communication Sciences, which had taken three decades to build, was about to be merged into the Department of Electrical Engineering. The electrical engineers took a much more hard-nosed view of what counted as a research topic, and that outlook had led Frieder and others to downplay research like "adaptive computation." Burks and Holland were fighting a rearguard battle.

But whether Langton was lucky or not, he didn't know it at the time. He was just happy to be accepted. "I couldn't lose this opportunity, especially when I already knew I was doing the right thing." Elvira was also willing to give it a try. True, to do so she would have to give up her job at the University of Arizona and move away from her mother's family in Arizona. But now that she was pregnant with their first child, she thought it would be nice to be covered by Langton's student health insurance. Besides, while they both loved the Southwest, it would be fun to see Michigan's dark clouds for a change. So in the fall of 1982 they set off north.

Intellectually at least, Langton learned a great deal at the University of Michigan. As a teaching assistant for Burks's computer history course, he drew on material about early computing that Burks had lived through firsthand, and helped Burks collect and exhibit some of the earliest hardware from the ENIAC machine. He met John Holland, and for Holland's integrated circuits class designed and developed chips that could execute Holland's classifier system extremely quickly. But most of the time Langton studied like crazy. Formal language theory, computational complexity theory, data structures, compilers: he systematically worked through subjects he had previously only dabbled in. He was eager to learn, and Burks, Holland, and the other professors were very demanding. During Langton's time at Michigan, in one doctoral qualifying examination they failed nearly all the candidates, refusing to advance them to doctoral candidacy (though the losers got another chance). "They asked you questions outside the coursework, and you had to answer intelligently. I really liked that way of learning. Just passing an exam is very different from actually mastering the material."

But in the realm of academic politics, things did not go so well. At the end of 1984, when Langton had finished his courses, earned his master's degree, passed the doctoral qualifying examination, and was about to begin his dissertation, he discovered painfully that the university would not let him do dissertation research on the evolution of artificial life in the von Neumann universe. The rearguard battle waged by Burks and Holland had ended in failure: in 1984 the former Department of Computer and Communication Sciences was merged into the Department of Electrical Engineering. In the new environment, dominated by an electrical engineering culture, the Burks-Holland-style "natural systems" curriculum was phased out. (This situation was, and still is, one of the few things that made Holland genuinely angry. He had originally been one of the strongest supporters of the merger, believing the natural-systems perspective would be preserved; now he felt it had been swallowed. Indeed, the situation gave Holland all the more motivation to take part in the Santa Fe activities.) But even in defeat, Burks and Holland both encouraged Langton to pursue doctoral research that was less biological and more grounded in computer science. From a practical standpoint, Langton concedes, they had a point. "By then I had gained enough insight to understand that the von Neumann universe was an extremely difficult system to build and set running. So I started looking for a research topic that could be completed in a year or two, not something that would take decades."

Instead of building a whole von Neumann universe, he thought, why not just study a little of its "physics"? Why not investigate why some cellular automaton rule tables let you build meaningful structures while others do not? That would at least be a step in his direction. Such research might satisfy both the hard-and-fast standards of computer science and the demands of engineering. And it might have some interesting connections to real physics. Indeed, the connection between cellular automata and physics later became a hot topic: in 1984 Stephen Wolfram, a prodigy in physics then at Caltech, pointed out that cellular automata not only contain rich and varied mathematical structure, but also bear deep similarities to nonlinear dynamics.

What Langton found particularly appealing was Wolfram's claim that all cellular automaton rules could be sorted into four universal classes. Wolfram's Class I includes what might be called doomsday rules: no matter what pattern of living and dead cells you start with, everything dies within a step or two. The grid on the computer screen collapses to a single color. In dynamical-systems terms, such a rule has a single "point attractor": the system behaves like a marble rolling around a cereal bowl. No matter where on the rim the marble starts, it quickly rolls down to the single point at the bottom of the bowl, and there it stays, dead.

Wolfram's Class II rules were slightly livelier, but only slightly. Under these rules, an initially random pattern of living and dead cells quickly coalesces into a set of motionless clumps, with perhaps a few other clumps oscillating periodically. These automata still give an impression of stagnation and deadlock. In dynamical-systems terms, these rules form a set of periodic attractors: the bottom of the bowl now has several dips, and the marble settles into one of them, or circles it forever.

Wolfram's Class III rules went to the other extreme: they were too lively. These rules generated so much activity that the screen seemed to boil. Nothing was stable, nothing predictable; structures formed only to break apart again. In dynamical-systems terms, these rules correspond to "strange" attractors—the state usually called chaos. They are like a marble shaken so fast and violently around the bowl that it can never settle down.

Finally there were Wolfram's Class IV rules: those rare rules that never settle into any single state. They produce neither frozen clumps nor total chaos, but coherent structures—structures that propagate, grow, split, and recombine in wonderfully complex ways. They essentially never settle down. The most famous example of a Class IV rule is the "Game of Life." In dynamical-systems terms, they are... and there lay the problem. Nothing in conventional dynamical-systems theory seemed to correspond to Class IV. These rules, Wolfram speculated, might be a behavior unique to cellular automata. But the truth was that no one knew exactly what they were, or why one rule produced Class IV behavior and another did not. The only way to find out which class a particular rule belonged to was to try it and see what behavior it produced.
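Class IV behavior is easiest to see in the Game of Life itself. The following is a minimal Python sketch (not from the book; the function name is illustrative) of one Life generation on an unbounded grid. It shows the glider, the canonical coherent structure, returning to its own shape shifted one cell diagonally after four steps:

```python
from collections import Counter

def life_step(live):
    """One generation of Conway's Game of Life on an unbounded grid.
    `live` is a set of (x, y) coordinates of live cells."""
    neighbor_counts = Counter(
        (x + dx, y + dy)
        for (x, y) in live
        for dx in (-1, 0, 1)
        for dy in (-1, 0, 1)
        if (dx, dy) != (0, 0))
    # A cell is alive next generation if it has exactly 3 live neighbors,
    # or 2 live neighbors and is already alive.
    return {cell for cell, n in neighbor_counts.items()
            if n == 3 or (n == 2 and cell in live)}

# The glider: a pattern that reproduces itself one cell down-right
# every four generations, forever.
glider = {(1, 0), (2, 1), (0, 2), (1, 2), (2, 2)}
state = glider
for _ in range(4):
    state = life_step(state)
print(state == {(x + 1, y + 1) for (x, y) in glider})  # True
```

Rules of Classes I–III cannot sustain such a structure: it would either freeze or dissolve into noise.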
For Langton, this situation not only made him curious; it revived the "because it isn't there" feeling he had once had about anthropology. These rules seemed to be the foundation of the von Neumann universe he imagined, capturing many of the important features of life's spontaneous emergence and self-reproduction. So he decided to devote himself to the question: how are Wolfram's classes related to each other? What determines which class a particular rule belongs to?

He immediately had an idea. He happened to be reading books on dynamical systems and chaos theory at the time, and he knew that in many real nonlinear systems, the equations of motion contain a parameter that acts like a knob controlling how chaotic the system is. If the system is a dripping faucet, for example, the parameter is the flow rate of the water. If the system is a population of rabbits, the parameter is the ratio between the rabbits' birth rate and the death rate due to overcrowding. In general, small parameter values lead to stable behavior—drops falling at a uniform rate, a constant herd size, and so on—much like the stagnant behavior of Wolfram's Classes I and II. But as the parameter grows, the system's behavior becomes more and more complex—drops of varying sizes, a fluctuating herd—until it dissolves into total chaos. By then, the system's behavior is Wolfram's Class III.
Langton wasn't quite sure how Class IV fit into this picture. But the similarity between nonlinear systems and Wolfram's classes was too great to ignore. If he could find some way to attach an analogous parameter to cellular automaton rules, then Wolfram's classes might begin to make sense. Of course, he could not attach a parameter to the rules arbitrarily; whatever it was, the parameter had to be derived from the rules themselves. Maybe he could measure how reactive each rule was—say, how often it caused the central cell to change its state. There would be a lot of candidates to test.

So Langton started programming his computer to test every half-formed candidate parameter. (One of the first things he had done at Michigan was to port his cellular automaton program from the Apple II to the powerful, high-speed Apollo workstations.) The work made no progress—until one day, when he was experimenting with one of the simplest parameters of all, which he called by the Greek letter lambda (λ): simply the probability that any given cell would "survive" into the next generation. Thus, if a rule had a λ value of exactly 0.0, nothing would survive the first step, and the rule clearly belonged to Class I. If a rule's λ value was 0.5, the grid would seethe with activity, with on average half the cells alive and half dead; one could guess that such a rule belonged to chaotic Class III. The question was whether λ would reveal anything interesting between those two values. (Beyond 0.5, the roles of "alive" and "dead" simply reverse, and things presumably become simple again as λ approaches 1.0, back to Class I—like watching the same behavior in photographic negative.)
To test the parameter, Langton wrote a small program that told the Apollo machine to generate a rule with a specified value of λ automatically, then run the cellular automaton on the screen to show the rule's effect. He said: "The first time I ran the program, I used a λ value of 0.5, thinking I was setting it to a completely arbitrary state. But suddenly I started getting Class IV rules—one after another! I thought, 'God, this is unbelievably good!' So I went through the program and found a bug that was setting λ to a different value—which happened to be the critical value for this class of automata." After Langton fixed the bug, he began systematically probing different values of λ. At very low values near 0.0, he found nothing but lifeless, frozen Class I rules. When he increased λ a little, he found periodic Class II rules; as he pushed λ higher still, he found that the Class II rules took longer and longer to settle down. If he jumped straight to λ = 0.5, he found, as expected, completely chaotic Class III rules. But in between—tightly clustered around a magic "critical" value of λ of about 0.273—he found all the Class IV rules. Yes, the "Game of Life" was among them. He was dumbfounded. Somehow this simple λ parameter put Wolfram's classes in exactly the order he had hoped to find. He had discovered where Class IV lives: right at the transition:

Ⅰ & Ⅱ → "Ⅳ" → Ⅲ

The sequence also pointed to a provocative progression in dynamical systems:

order → "complexity" → chaos

where by "complexity" he meant the endlessly surprising dynamical behavior of certain Class IV automaton rules. "It immediately reminded me of a phase transition," he says.
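The idea of generating a rule with a specified λ can be illustrated with a tiny sketch. This is not Langton's Apollo program; the one-dimensional, two-state, radius-1 setup and the function names are illustrative assumptions. Here λ is simply the fraction of rule-table entries that map a neighborhood to the "live" state:

```python
import random

def random_rule(lam, n_entries=8, seed=None):
    """Build a rule table whose entries are 'live' with probability lam.
    For a 1-D, two-state, radius-1 CA there are 2^3 = 8 neighborhoods."""
    rng = random.Random(seed)
    return [1 if rng.random() < lam else 0 for _ in range(n_entries)]

def measured_lambda(rule):
    """Fraction of rule-table entries mapping to the live state."""
    return sum(rule) / len(rule)

def step(cells, rule):
    """One synchronous update on a ring of cells.
    The neighborhood (left, center, right) indexes the rule table."""
    n = len(cells)
    return [rule[4 * cells[(i - 1) % n] + 2 * cells[i] + cells[(i + 1) % n]]
            for i in range(n)]

rule = random_rule(0.273, seed=1)   # near Langton's "critical" value
print(measured_lambda(rule))
```

Sweeping `lam` from 0.0 toward 0.5 and watching the resulting automata is the spirit of Langton's experiment, though his actual survey used larger state sets and neighborhoods.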
If you think of the parameter λ as something like temperature, then the Class I and II rules at low λ are like a solid such as ice, whose water molecules are locked firmly into a crystal lattice. The Class III rules at higher λ correspond to a gas like water vapor, whose molecules fly about and collide with one another in complete chaos. And the Class IV rules in between—what do they correspond to? Liquid? "I didn't know much about phase transitions, but I dug into the molecular structure of liquids," Langton says. At first the idea seemed promising: he found that the molecules of a liquid typically tumble over one another, combining, clustering, and breaking apart again billions of times every second, much like the "Game of Life." "The idea that something like the Game of Life could be going on forever at the molecular level in a glass of water seemed convincing to me." Langton liked the concept. But the more he thought about it, the more he realized it wasn't quite right. Class IV rules characteristically produce "extended transients," such as the gliders of the Game of Life—structures that can survive and propagate for arbitrarily long times. Ordinary liquids show no such molecular-level behavior; liquids, like gases, are in a state of total molecular chaos. Indeed, Langton learned that by raising the temperature and pressure past a certain point, you can turn water vapor into water continuously, without passing through a phase transition at all. Fundamentally, gas and liquid are just two manifestations of a single fluid state of matter. So the distinction was not fundamental, and the resemblance between a liquid and the Game of Life was only superficial. Langton went back to his physics textbooks and kept reading. "I finally found the fundamental difference between first-order and second-order phase transitions."
First-order phase transitions are the familiar kind: abrupt and unmistakable. Heat an ice cube to 32 degrees Fahrenheit, for example, and it turns into water. The molecules are essentially forced to choose between order and chaos. Below the transition temperature, the molecules oscillate slowly enough to maintain crystalline order (ice); above it, they vibrate so violently that molecular bonds break faster than they can form, and the molecules are forced into chaos (water).

Second-order phase transitions, Langton learned, are less common in everyday experience (at least at the temperatures and pressures humans are used to). They are gentler, largely because the molecules of the system do not have to make an either-or choice: they combine chaos and order. Above the transition temperature, most of the molecules tumble over one another in complete chaos—the fluid phase. Yet among the tumbling molecules are countless tiny, ordered, lattice-like islands, constantly dissolving and recrystallizing at their edges. The islands are neither very large nor very long-lived, even on the molecular scale, so the system remains mostly chaotic. But as the temperature drops, the largest islands grow very large and relatively long-lived. The balance between chaos and order begins to shift. If the temperature instead rises well above the transition point, the effect reverses: the material goes from a fluid sea dotted with ordered islands to an ordered continent dotted with fluid lakes. But if the temperature sits exactly at the transition point, the balance is perfect: ordered structure and chaotic fluid exist in exactly equal measure, order and chaos intertwined in a dance of tiny arms and filaments, in a state of complexity and perpetual change. The largest ordered structures extend arbitrarily far in space and persist arbitrarily long in time. Nothing ever really settles down.

When Langton saw this, he was stunned. "That was the crucial connection! This is exactly like Wolfram's Class IV." Everything was there: the prolific, glider-like "extended transients," the restless dynamics, the perpetually surprising complexity of structures that grow, split, and recombine—all of it virtually defined by a second-order phase transition. So Langton now had a third analogy:

Cellular automaton classes: Ⅰ & Ⅱ → "Ⅳ" → Ⅲ
Dynamical systems: order → "complexity" → chaos
Matter: solid → "phase transition" → fluid

The question was whether there was anything more to it than analogy. Langton went back to work, adapting all the statistical tests that physicists apply to phase transitions so that they could be applied to the von Neumann universe. When he graphed the results against λ, the plots looked as if they had been copied straight out of a textbook; a physicist would glance at them and say, "second-order phase transition." Langton didn't know why his λ parameter worked so well, or why it behaved so much like temperature. (Indeed, no one really understands this even now.) But the fact could not be denied: the second-order phase transition was real, not just an analogy.

Langton tried out names for this phase transition as the mood struck him: "transition to chaos," "frontier of chaos," "onset of chaos." But the name that really captured the feeling for him was "the edge of chaos." He explained: "The name reminds me of a feeling I had when I was learning to scuba dive. Most of the time we dove very close to shore, where the water was crystal clear and you could see sixty feet down. But one day our instructor took us out to the edge of the continental shelf, where that crystal-clear sixty feet gave way to an eighty-degree slope sliding down into unfathomably deep water.
I believe the slope dropped some two thousand feet from top to bottom. It made me realize that the dives we had made before, adventurous and daring as they seemed at the time, were really no more than frolicking by the seashore. Compared with the 'ocean,' the continental shelf is nothing but a puddle."

"Life emerged in the ocean, and you live at its edge, delighting in the endless nourishment that flows from it. That's why the phrase 'edge of chaos' strikes me with such a similar feeling: because I believe life also originated at the edge of chaos. We live on that edge, delighting in the nourishment that matter provides..."

This is, of course, a very poetic statement. But for Langton the belief was far more than poetry. In fact, the more he thought about it, the more he felt there was a very deep connection between phase transitions and computation, and between computation and life itself.

The connection traces directly back to the "Game of Life." When the game was invented in 1970, Langton said, the first things people noticed were structures that could propagate, such as gliders, which could carry signals from one end of the von Neumann universe to the other. Indeed, you can think of a stream of gliders as a string of binary digits: "glider present" = 1; "glider absent" = 0. As people kept playing, they discovered structures that could store such information, or emit new signals. In fact, it soon became clear that Game of Life structures could be used to build a complete computer, with data storage, information processing, and all. Such a "Game of Life" computer would have nothing to do with the computer the game runs on.
Whatever that machine might be—a PDP-9, an Apple II, or an Apollo workstation—it is nothing more than an engine that keeps the cellular automaton running. No, the Game of Life computer would exist entirely within the von Neumann universe, like Langton's self-reproducing pattern. True, it would be a primitive, inefficient computer. But in principle it exists, and it would be a universal computer, powerful enough to compute anything that can be computed.

That is a rather surprising result, Langton said, especially when you consider that only a relatively small fraction of cellular automaton rules permit it. You cannot build such a universal computer in a cellular automaton governed by Class I or Class II rules, because the structures they produce are too sluggish: you can store data in such a universe, but you cannot move information around. Nor can you build a computer in a chaotic Class III automaton: there, signals are quickly drowned out, and stored structures quickly fragment. Indeed, Langton said, the only rules that allow you to build a universal computer lie in Class IV, like the Game of Life. These are the only rules that provide both enough stability to store information and enough fluidity to carry signals over arbitrary distances—and stability plus fluidity is exactly what a computer requires. And these, of course, are also the rules that sit at the phase transition on the edge of chaos.

Phase transitions, complexity, and computation, Langton realized, were all bound up together here—or at least, in the von Neumann universe. But Langton believed the same relation holds in the real world, from social systems to economies to living cells. Because as soon as you start talking about computation, you are talking about the very essence of life.
"Life depends to an incredible degree on information processing," he said. "Living things store information, map sensory data, and perform complex transformations on that information to generate action. The British biologist Richard Dawkins gives a very good example. Pick up a rock and throw it into the air, and it falls in a beautiful parabola: it obeys the laws of physics, and can only make a simple response to the forces acting on it. But toss a small bird into the air, and it never behaves like the rock: it flies off somewhere into the bushes. The same external forces act on the bird, of course. But the bird processes an enormous amount of the information it receives, and that is what sends it into the bushes. Even simple cells do the same: their behavior differs from that of inanimate matter. They do not simply respond to external forces. So for living things the interesting question is: at what point does matter that simply responds to physical forces give rise to dynamical systems governed by information processing?"

To answer that question, Langton said, "I put on my phase-transition glasses and looked at the phenomenology of computation. There are a lot of parallels." For example, the first thing you learn in a computation theory class is to distinguish "halting" programs—programs that return an answer within some finite time after receiving their input—from programs that run forever. That distinction, Langton said, is like the distinction between the behavior of matter below and above a phase transition. In this sense, matter is always "computing" how to arrange itself at the molecular level: if it is cold, the answer comes quickly—freeze solid into a crystal. If it is very hot, it can never settle on an answer at all, and can only exist as a fluid.
This distinction, he said, also parallels the difference between Class I and Class II automata, which eventually freeze into fixed configurations, and the chaotic Class III automata, which boil forever. A program that just prints "Hello, world!" on the screen and then exits corresponds to a Class I automaton with a λ value near 0.0: it halts almost immediately. Conversely, a program with a fatal bug that prints an endless, never-repeating stream of gibberish corresponds to a Class III automaton with a λ value near 0.5, where chaos is at its most severe.

Next, move away from the two extremes, toward the phase transition. In the physical world you find transients that linger longer and longer: as the temperature approaches the transition point, it takes the molecules longer and longer to make up their minds. Likewise, in the von Neumann universe, as λ rises from 0 you find cellular automata that churn violently for a while before settling down, with how long they churn depending on their initial state. These correspond to what computer scientists call polynomial-time algorithms: programs that do a lot of computation before they halt, but whose computation is relatively fast and efficient. (Polynomial-time algorithms handle workaday problems like sorting a list.) Push λ still closer to the phase transition and you find cellular automata that churn violently for a very long time indeed. These correspond to algorithms that take more than polynomial time—a kind of never-ending computation that is effectively useless. (An extreme example would be a chess program that tries to look ahead through every possible line of play.)
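The lengthening of transients can be sketched in code. This is a toy illustration, not Langton's actual statistics: it runs a one-dimensional, two-state, radius-1 cellular automaton on a small ring until the global configuration first repeats, and reports how many steps preceded the cycle (the transient length):

```python
def transient_length(rule, cells, max_steps=200):
    """Steps until the configuration first repeats (i.e., a cycle is
    reached). `rule` is an 8-entry table indexed by 4*left+2*center+right;
    `cells` is a list of 0/1 states on a ring."""
    seen = {tuple(cells): 0}
    for t in range(1, max_steps + 1):
        n = len(cells)
        cells = [rule[4 * cells[(i - 1) % n] + 2 * cells[i]
                      + cells[(i + 1) % n]]
                 for i in range(n)]
        key = tuple(cells)
        if key in seen:
            return seen[key]   # transient ends where the cycle begins
        seen[key] = t
    return max_steps           # did not settle within the budget

# "Doomsday" rule (Class I): every neighborhood maps to 0.
doom = [0] * 8
# Rule 110's table (a famously complex elementary CA), same indexing:
rule110 = [0, 1, 1, 1, 0, 1, 1, 0]

start = [0, 1, 1, 0, 1, 0, 0, 1, 1, 1, 0, 0]
print(transient_length(doom, start))     # dies out almost at once
print(transient_length(rule110, start))  # typically churns for longer
```

On small rings every automaton eventually cycles; the point of the sketch is only that livelier rules take longer to get there, which is the behavior Langton tracked as λ approached the transition.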
And what if you are exactly at the phase transition? In the physical world, you cannot know in advance whether a given molecule will end up in an ordered region or a fluid one, because order and chaos are intimately intertwined at the molecular level. Likewise, a Class IV rule may or may not settle into a frozen pattern. But whatever pattern results, the phase transition at the edge of chaos corresponds to what computer scientists call "undecidable" algorithms. On some inputs, these algorithms may halt quickly—like starting the Game of Life with a pattern already known to be stable. On other inputs, they may run forever. The point is that you cannot always tell in advance what will happen—not even in principle. In fact, Langton said, there is a theorem that spells out this effect: the undecidability theorem, proved in the 1930s by the British logician Alan Turing. The theorem says, in essence, that no matter how smart you think you are, there will always be algorithms whose behavior outruns your ability to predict it in advance. The only way to find out what they will produce is to run them.

And these, of course, are exactly the algorithms you would want to use to simulate life and intelligence. So it is no surprise that the Game of Life and other Class IV cellular automata seem so lifelike. They exist in the only dynamical regime where complexity, computation, and life itself are possible: the edge of chaos.
Langton now had four carefully worked-out analogies—

Cellular automaton classes: Ⅰ & Ⅱ → "Ⅳ" → Ⅲ
Dynamical systems: order → "complexity" → chaos
Matter: solid → "phase transition" → fluid
Computation: halting → "undecidable" → nonhalting

—and a fifth, still conjectural:

Life: too static → "life/intelligence" → too noisy

But what did it all mean? Langton's judgment: "solid" and "fluid" are not just two fundamental phases of matter, as in ice versus water; they are two fundamental classes of dynamical behavior in general, applying even to such thoroughly abstract, nonlinear realms as the space of cellular automaton rules or the space of algorithms. He further realized that the existence of these two fundamental classes of dynamical behavior implies the existence of a third fundamental class: "phase transition" behavior at the edge of chaos, where you encounter complex computation and, quite possibly, life itself.

Does that mean one might someday write down general physical laws of phase transitions, laws that would cover both the freezing and melting of water and the mystery of the origin of life? Maybe. Perhaps life first emerged from the primordial soup four billion years ago through just such a phase transition. Langton didn't know. But he could not resist imagining that life is indeed perpetually trying to balance itself on the edge of chaos: always in danger of falling into too much order on one side, always threatened by too much disorder on the other. Perhaps, he thought, that is what evolution is: simply the process by which life learns to control its own parameters, so that it gets better and better at balancing on the edge.

Who knows? Working it all out would take a lifetime. In 1986, Langton finally persuaded the engineering college to accept his ideas about phase transitions in computation, dynamical systems, and cellular automata as his doctoral dissertation topic. But he still had much work to do before the basic framework was solid enough to satisfy his dissertation committee.