AI 2023. Meet ChatGPT.

 
Valeriy Yastremskiy #:

Google is evil)))) Then it's a different kind of information: the words sound the same, but what you mean is entropy, a measure of ordering, which someone for some reason rather unsuccessfully named information. What does that information have to do with what Peter is talking about? It doesn't correlate with a measure of orderliness at all. Besides, I just discussed this with my son - he graduated about 5 years ago and knows the subject - and he also wasn't aware that the term information is used among the physical characteristics of gases or other substances. As for strength of materials, he's on his own))))) I wasn't taught that either, although entropy was drilled into my head properly enough)))))

Misinterpretation of a term begins with cutting off its connection with its etymology. The reason is probably the inclusiveness of some concepts. Over time, they go beyond their original boundaries and are periodically reinterpreted by scholars. Inclusiveness grows, the level of abstraction increases, and with it the confusion in understanding the concept also grows. Information is just such a case.

Because the forms of information are diverse and it surrounds us everywhere, it is difficult for us to generalise all its phenomena within the boundaries of a single definition.

In the information age, we are confused about what information is. This is not surprising. There are too many information-related entities mixed up in our heads: matter, waves, carriers, bits, entropy, neural activity, and so on.

Among all the things that information is associated with, we need to isolate what produces it. The "generator" of information. Before, we talked about carriers; now it is logical to talk about the source.

Is information produced by matter? And how does information participate in physical processes? And how can it be measured? We must admit that science does not include information as a constituent of matter.

For matter, a disc with information and a disc without it are the same thing. For a person who has recorded photos of his favourite dog on it, it is information.

On this basis, a physical carrier of information carries nothing... And at the same time, it does, if a person interacts with it. A paradox? No. Everything fits the concept of information as a product of interpretation, which is neural activity.
 
Valeriy Yastremskiy #:

Nah, then you don't get it at all)))))) I didn't find entropy in the 8th-grade computer science course))))) OK, in physics it turns out the other way round. The measure of disorder is the number of different states - that is entropy. 0 is a calm state: the elements of the system are all the same. 1 is a turbulent state: there are too many variants of states to account for them. And it is not that there are infinitely many - it is simply that the states of the elements change chaotically relative to each other.

And so yes, at entropy 1 we have zero information about the exact state of matter and its elements. We have no way to calculate or estimate the states of the elements inside.
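(A minimal Python sketch of the 0-to-1 scale described above; the model and all names in it are illustrative assumptions, not something taken from this thread. Normalised Shannon entropy is 0 when every element of the system is in the same state and about 1 when every element is in its own state, i.e. when we can say nothing about any particular element.)

import math
from collections import Counter

def normalized_entropy(states):
    counts = Counter(states)
    n = len(states)
    if len(counts) < 2:
        return 0.0                        # one shared state: complete order
    h = sum((c / n) * math.log2(n / c) for c in counts.values())
    return h / math.log2(len(counts))     # scale so the maximal mix gives 1.0

calm = ["a"] * 100                          # every element in the same state
turbulent = [f"s{i}" for i in range(100)]   # every element in its own state
print(normalized_entropy(calm))        # 0.0 - full order
print(normalized_entropy(turbulent))   # ~1.0 - nothing can be said about any single element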

We assess the presence or absence of information between systems or within one. What exactly the connections are - that is the third question.
 
Maxim Dmitrievsky #:
The presence or absence of information between or within systems is assessed. What exactly are the connections - this is the 3rd question.

Well, this information has nothing to do with Peter's))))) It is about the number of states of the elements of a system and the possibility of accounting for them. Like, each element has properties, and these properties are information. And there is an assumption in this science that as the number of states increases, the possibility of accounting for them decreases. Often it decreases not linearly, but in steps.
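(One possible reading of the "in steps" remark, sketched in Python; the interpretation and the numbers are my own assumption, not the poster's. The number of bits needed to label each state of an element is ceil(log2 N), and it grows stepwise rather than smoothly as the number of states N grows.)

import math

for n_states in (2, 3, 4, 5, 8, 9, 512, 513, 1024, 1025):
    bits = math.ceil(math.log2(n_states))
    print(f"{n_states} states -> {bits} bit(s) to label each state")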

Peter's information is the reflection of the world by nerve cells)))))) Good definition))))))

 
Maxim Dmitrievsky #:
You're so fucking stupid. The greater the entropy, the greater the loss of information. That's it, I'm not communicating with you :) people are too lazy to even read.

all processes tend to increase entropy (the "disorder" increases, the number of possible states increases). an increase in the amount of information on a medium leads to an increase in entropy (it looks like a paradox, but it is true).

take two identical blank discs with uniform magnetisation; one disc is recorded, the uniformity of its magnetic field decreases, and the entropy of the magnetic field of the recorded disc increases.
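(A rough Python sketch of this disc argument; the magnetisation model is invented purely for illustration. A field with the same value in every cell has zero Shannon entropy, while a written pattern of domains has more.)

import math
from collections import Counter

def shannon_entropy(values):
    # entropy in bits per cell of the value distribution
    counts = Counter(values)
    n = len(values)
    return sum((c / n) * math.log2(n / c) for c in counts.values())

blank_disc = [0] * 1024                       # uniform magnetisation everywhere
recorded_disc = [i % 2 for i in range(1024)]  # alternating domains after writing
print(shannon_entropy(blank_disc))     # 0.0 bits per cell
print(shannon_entropy(recorded_disc))  # 1.0 bit per cell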

an easier to use and understand concept in relation to information is isotropy, i.e. uniformity of properties. an increase in information on a medium inevitably leads to a decrease in isotropy (uniformity).

examples:

1. an inscription was applied to a surface with paint, with prior removal of a certain mass of material from the surface - the isotropy of the chemical composition decreased

2. an inscription was punched into the surface, or grooves were cut into a vinyl record - the structural isotropy of the object's material decreased

3. information was recorded on a magnetic disc - the isotropy of the magnetic field decreased.

the same is true with electromagnetic waves - waves carrying more information are less isotropic than "empty" uniform waves, which are extremely rare in nature. therefore, if waves uniform in frequency and amplitude are detected, it is assumed that they are of artificial origin.

if there is no key to how the information should be read, it looks like chaotically distributed properties of the object. this is very important, because everything that is not understood is perceived as chaos, randomness.
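(An illustrative Python sketch of that last point, under assumptions of my own: plain zlib compression stands in for an encoding the reader has no key to. The same content, once encoded, has byte statistics much closer to random noise than the readable original.)

import math
import zlib
from collections import Counter

def bits_per_byte(data):
    # Shannon entropy of the byte-value distribution
    counts = Counter(data)
    n = len(data)
    return sum((c / n) * math.log2(n / c) for c in counts.values())

text = ("information looks like chaos to a reader who has no key " * 200).encode()
packed = zlib.compress(text, 9)   # the same content, encoded

print(bits_per_byte(text))    # well below 8: the visible structure of ordinary text
print(bits_per_byte(packed))  # much higher: without the key it reads as near-random noise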

 
Peter Konow #:
simplest structure. Information, however, is complex. Matter cannot make clever interpretations of its data.

Well, the example of nucleic acid replication in cell nuclei doesn't seem tricky to you, does it? Big deal - it's useless nonsense for humans, they don't interpret genetic information (oops, they already do interpret it!).

Peter Konow #:
Misinterpretation of a term begins by cutting off the connection to its etymology. The reason is probably the inclusiveness of some concepts. Over time, they go beyond their original boundaries and are periodically reinterpreted by scholars. Inclusiveness grows, the level of abstraction increases, and with it the confusion in understanding the concept also grows. Information is just such a case.

Well, matter was not previously discussed as being a form of information on some carrier medium. But now it is. And what is the use of the old etymological roots of various words, which are probably not needed at all?

 
Valeriy Yastremskiy #:

Well, this information has nothing to do with Peter's))))) It is about the number of states of the elements of a system and the possibility of accounting for them. Like, each element has properties, and these properties are information. And there is an assumption in this science that as the number of states increases, the possibility of accounting for them decreases. Often it decreases not linearly, but in steps.

Peter's information is the reflection of the world by nerve cells)))))) Apt definition))))))

It is exactly the same process - the formation of ordered connections between neurons.
Is it really so difficult?
 
Andrey Dik #:

all processes tend to increase entropy (the "disorder" increases, the number of possible states increases). increasing the amount of information on a medium leads to an increase in entropy (it looks like a paradox, but it is true).

take two identical blank discs with uniform magnetisation; one disc is recorded, the uniformity of its magnetic field decreases, and the entropy of the magnetic field of the recorded disc increases.

an easier to use and understand concept in relation to information is isotropy, i.e. uniformity of properties. an increase of information on a medium inevitably leads to a decrease in isotropy (uniformity).

Examples:

1. an inscription was applied to a surface with paint, with preliminary removal of a certain mass of material from the surface - the isotropy of the chemical composition decreased

2. an inscription was punched into the surface, or grooves were cut into a vinyl record - the structural isotropy of the object's material was reduced

3. information was recorded on a magnetic disc - the isotropy of the magnetic field decreased.

the same is true for electromagnetic waves - waves carrying more information are less isotropic than "empty" uniform waves, which are extremely rare in nature. therefore, if waves uniform in frequency and amplitude are detected, it is assumed that they are of artificial origin.

If there is no key to how the information should be read, it looks like chaotically distributed properties of the object. This is very important, because everything that is not understood is perceived as chaos, randomness.

Well, we are comparing not the discs themselves, but the tracks on them. It would be more correct to compare random notches with non-random notches. The random ones will have the highest entropy. And we'll hear brown noise (or whatever it's called).
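(A rough Python sketch of this comparison, with stand-ins of my own choosing: os.urandom plays the role of random notches and a repeated phrase plays the role of non-random ones. The two sequences have equal length, but the random one is essentially incompressible - maximal entropy - while the structured one collapses to a tiny description.)

import os
import zlib

random_notches = os.urandom(10_000)                      # stand-in for random marks
meaningful_notches = (b"encoded track " * 715)[:10_000]  # stand-in for structured marks

print(len(zlib.compress(random_notches, 9)))      # stays around 10 000 bytes: incompressible
print(len(zlib.compress(meaningful_notches, 9)))  # shrinks to a tiny fraction of that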
 
Maxim Dmitrievsky #:
Well, we're not comparing the discs themselves, but the tracks on them. It would be more accurate to compare random notches to non-random notches.

When tracks appear on discs, entropy increases, isotropy decreases.

random and non-random notches have an equal volume of raw information and equal entropy and isotropy, but the non-random notches are encoded, and unpacking them will produce more information (entropy will increase, isotropy will decrease).

entropy reduction is possible only with energy expenditure. so packing (information compression) reduces entropy at the cost of energy expenditure.

example. a star explodes and a neutron star appears, which is very uniform in structure - a paradox? - no, because energy was expended and the greater part of the star's mass was dissipated in space, which, summed over the star's total initial mass, means that matter has come to a higher entropy.

Maxim, don't be stupid))))

 
Maxim Dmitrievsky #:
It's exactly the same process - the formation of ordered connections between neurons.
Is it really that hard?

No, it's not, but how do you measure this process in a person with Down syndrome, or in a schizophrenic; how do you assess their psychological state? And the main question is: why? What's the point of inventing something and assuming that it works everywhere? In computer science, yes, it works, but Peter doesn't want it to)))).

 
Andrey Dik #:

appearance of tracks on discs - entropy increased, isotropy decreased.

random and non-random notches have an equal volume of raw information and equal entropy and isotropy, but the non-random notches are encoded, and unpacking will produce more information (entropy will increase, isotropy will decrease).

entropy reduction is possible only with energy expenditure. therefore packing (information compression) reduces entropy at the cost of energy expenditure.

example. a star explodes and a neutron star appears, which is very uniform in structure - a paradox? - no, because energy was expended and the greater part of the star's mass was dissipated in space, which, summed over the star's total initial mass, means that matter has reached a higher entropy.

Maxim, don't be stupid))))

You have to compare the same system when the random notches have changed into non-random ones