Fragments of Cognitive Psychology. Babushkin A.P.

5. Read and translate the text:
Neural Representation of Information
From an information-processing point of view, the most important components of
the nervous system are the neurons. A neuron is a cell which accumulates and transmits
electrical activity. The human brain itself contains roughly 100 billion neurons, each of
which may have roughly the processing capability of a modest-sized computer. A
considerable fraction of the 100 billion neurons are active simultaneously and do much
of their information processing through interactions with one another. Imagine the
information-processing power in 100 billion interacting computers! Neurons interact by
driving up the activation level of other neurons (excitation) or by driving down their
activation level (inhibition). All neural information processing takes place in terms of
these excitatory and inhibitory effects; they are what underlies human cognition.
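
To make the excitation/inhibition idea concrete, here is a small illustrative Python sketch (ours, not part of the original text): a toy "neuron" sums its weighted inputs, with positive weights standing for excitatory connections and negative weights for inhibitory ones. All names and numbers are hypothetical.

    # A toy "neuron": positive weights excite, negative weights inhibit.
    def activation(inputs, weights):
        """Sum weighted inputs; clip at zero so activation cannot go negative."""
        total = sum(x * w for x, w in zip(inputs, weights))
        return max(0.0, total)

    inputs = [1.0, 1.0, 1.0]      # activity of three upstream neurons
    weights = [0.8, 0.5, -0.9]    # two excitatory connections, one inhibitory

    print(round(activation(inputs, weights), 2))  # 0.4: net excitation minus inhibition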
It is an interesting question just how these neurons represent information. There is
evidence that individual neurons respond to specific features of a stimulus. For instance,
there are neurons in the monkey brain that appear to respond maximally to faces.
However, it is not possible that we have single neurons encoding all the concepts and
shades of meaning we possess.
If a single neuron cannot represent the complexity of our cognition, how is it
represented? How can the activity of neurons represent our concept of baseball; how
can they result in our solution of an algebra problem; how can they result in our feeling
of frustration? Similar questions can be asked of computer systems, which have been
shown to be capable of answering questions about baseball, solving algebra problems,
and displaying frustration. Where in the millions of off-and-on bits in a computer
does the concept of baseball lie? How does a change in a bit result in the solution of an
algebra problem or in a feeling of frustration? The answer in every case is that these
questions fail to see the forest for the trees. The concepts of baseball, problem solution,
and emotion occur in large patterns of bit changes. Similarly, we can be sure that human
cognition is achieved through large patterns of neural activity.
We do not really know how the brain encodes cognition in neural patterns, but the
evidence is strong that it does. There are computational arguments that this is the only
way to achieve cognitive function. There is also a fair amount of evidence suggesting
that human knowledge is not localized in any single neuron, but is distributed over
many neurons in large patterns of activation. Damage to a small number of neurons in
the brain generally does not result in the loss of specific memories. On the other hand,
massive damage to larger areas of the brain will result in the loss of a large set of
memories.
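
A toy sketch can show why a distributed code behaves this way. In the illustrative Python fragment below (all names and numbers are our own assumptions, not from the text), each concept is stored as a pattern over 100 units; silencing a few units barely disturbs the pattern, so the concept can still be recognized.

    import random

    random.seed(0)
    N_UNITS = 100

    # Each concept is stored as a +1/-1 pattern spread over all 100 units.
    patterns = {name: [random.choice([-1, 1]) for _ in range(N_UNITS)]
                for name in ["baseball", "algebra", "frustration"]}

    def recall(activity):
        """Return the stored concept whose pattern best matches the activity."""
        return max(patterns,
                   key=lambda name: sum(a * p
                                        for a, p in zip(activity, patterns[name])))

    damaged = list(patterns["baseball"])
    for unit in random.sample(range(N_UNITS), 10):  # silence 10 of the 100 units
        damaged[unit] = 0

    print(recall(damaged))  # still "baseball": no single unit carries the concept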
It is informative to consider how the computer stores information. Consider a
simple case: the spelling of words. Most computers have codes by which individual
patterns of binary values (1’s and 0’s) represent different letters. Similarly, information
in the brain can be represented in terms of patterns of neural activity. At the same time
the brain codes information redundantly so that even if certain cells are missing, it can
still recover the stored information.
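
As a concrete illustration (ours, not the textbook's), the Python sketch below encodes letters as standard ASCII bit patterns and adds a toy repetition code to mimic redundancy: each bit is stored in three "cells", so a majority vote recovers the letter even when one cell goes wrong.

    def encode(letter):
        """ASCII bits for the letter, each bit stored in three redundant 'cells'."""
        bits = format(ord(letter), "08b")            # e.g. 'A' -> '01000001'
        return [b for b in bits for _ in range(3)]   # triple each bit

    def decode(cells):
        """Majority vote within each triple of cells, then read the byte back."""
        triples = [cells[i:i + 3] for i in range(0, len(cells), 3)]
        bits = "".join(max(set(t), key=t.count) for t in triples)
        return chr(int(bits, 2))

    cells = encode("A")
    cells[5] = "0" if cells[5] == "1" else "1"  # one 'cell' goes wrong
    print(decode(cells))                         # still prints 'A'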