AI 2023. Meet ChatGPT.

Valeriy Yastremskiy #:

No, it's not difficult, but how do you measure this process in a person with Down syndrome or a schizophrenic, how do you assess the psychological state through this process? And the big question is: why? Is the point to invent something and assume that it works everywhere? In computer science, yes, it works, but Peter doesn't want it to)))).

We were discussing information.
 
Maxim Dmitrievsky #:
We should compare the same system when the random samples have been replaced with non-random ones.

OK, let's compare. I think I already did, let's do it again).

Encoded information may have less entropy than raw, random information, but encoding requires energy (so with otherwise equal conditions, the energy inputs are already unequal)! Moreover, once unpacked, the information will again have more entropy. So more information means more entropy and less anisotropy.

That's right.

In general, all other things being equal (mass and energy), random data has more entropy than non-random data.
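A quick illustration of that claim (my own sketch, not from the thread; the data, sizes and the helper byte_entropy are made up for the example): byte for byte, random data sits near the maximum Shannon entropy of 8 bits per byte, patterned data sits much lower, and compressing ("encoding") the patterned data shrinks it while pushing its per-byte entropy back up.

import os
import zlib
from collections import Counter
from math import log2

def byte_entropy(data: bytes) -> float:
    # Shannon entropy in bits per byte: H = -sum(p_i * log2(p_i))
    n = len(data)
    return -sum((c / n) * log2(c / n) for c in Counter(data).values())

random_data = os.urandom(100_000)         # incompressible noise
patterned_data = b"0123456789" * 10_000   # non-random data of the same length

print("random   :", round(byte_entropy(random_data), 3), "bits/byte")    # ~8.0
print("patterned:", round(byte_entropy(patterned_data), 3), "bits/byte") # ~3.32 (= log2(10))

# Encoding (compression) barely helps the random data but collapses the patterned
# data, and the compressed bytes themselves look almost random, i.e. have higher
# per-byte entropy than the original pattern.
print("zip(random)   :", len(zlib.compress(random_data)), "bytes")
print("zip(patterned):", len(zlib.compress(patterned_data)), "bytes")
print("zip(patterned) entropy:", round(byte_entropy(zlib.compress(patterned_data)), 3), "bits/byte")

The part the snippet cannot show is the energy point made in the post: the compression step itself costs work, so the comparison is only "all other things being equal" if that cost is counted.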

 
Andrey Dik #:

OK, let's compare. I think I already did, let's do it again).

Encoded information may have less entropy than raw, random information, but encoding requires energy (so with otherwise equal conditions, the energy inputs are already unequal)! Moreover, once unpacked, the information will again have more entropy. So more information means more entropy and less anisotropy.

That's right.

And unpacking encoded information doesn't release energy, by any chance? 😄

 
Ilya Filatov #:

Doesn't unpacking coded information release energy, by any chance? 😄

Yeah, I thought about that too))) But I guess not. You won't be able to mine energy by unpacking zip archives))))

By the way, "perpetual" motors also consume energy (wave energy for example), so perpetual motors are not motors, but energy converters.

 
Ilya Filatov #:

1. Actually, it's the other way round. If we calculate how much of our usual information is needed to encode a system such as an atom, and then count how many such objects make up the various material forms of our everyday world, we get prohibitive amounts of information. Besides, it is in them that our usual information, our thoughts and understanding of things, is also indirectly encoded. Thus, material information > human (computer) interpretation.

2. Unstructured data? That is how those structures store the very "human" information you are talking about. Besides, if you don't know how the cells of your body work, that doesn't turn your body into a bag of unstructured broth of whatever matter happens to be there. Everything in it is very orderly and works without your understanding or consent.

3. Well, the example of nucleic acid replication in cell nuclei doesn't seem tricky to you, does it? Big deal, it's useless nonsense for a human being, who himself does not interpret genetic information (oh wait, he already does interpret it!).

4. Well, yes, earlier we did not think of matter as a form of information in some carrier medium. But now we do. And what use are the old etymological roots of various words, which are probably unnecessary anyway?

Point by point:

1. Who, other than a human, can describe the structure of an atom? Matter itself? No. Hence, there cannot be more information about the atom than a human can produce. There is nowhere else for it to come from.

Man extracts data from matter and interprets it. This is how information - a structured, multi-layered interpretation of the data obtained by interacting with matter - arises.

To get information about a phenomenon or an object of the external world, one has to carry out mental work: analysis of perceptual data, logical conclusions (extrapolation, deduction...), storage on media, etc. Information is the result of our work; it is not handed to us on a "platter" by matter. Otherwise, we would not need to think.

2. The work of the cells of the body is biomechanics. The blind play of physical forces. The interaction of molecules. The fact that matter is capable of the highest levels of organisation through evolution does not mean that the process is anything other than simple biochemistry.

The data we receive from matter does not come structured. It is we who, consciously or not, structure it in the context of our own structure. That is why different kinds of organisms hear and see the same thing differently. Each has a different perception and a different final information product, although the data broadcast by matter does not change depending on the kinds of eyes and ears a being has. It all comes down to personal experience of interaction and to the interpretation of the unstructured data received.

3. You speak of the complex workings of the mind, yet you attribute its products to blind biochemical processes. Cell nuclei do not tell us anything about themselves; it is WE who describe them, based on experiments and research. You point me to products of the human mind and claim they belong to cells and genes, not humans. Allegedly, nucleic acids themselves "tell" scientists how they work. That's right, they do: megabytes of unstructured data that scientists extract with great difficulty, and that is only the first stage of contact with them, because interpretation comes next.


4. Correct me if I am wrong: from your point of view, Matter is a FORM of Information, but Information is NOT Matter?

Let's apply logic:

If matter is a form of information, and information is NOT matter, what is information?

You point out what matter is and what information is NOT, leaving out the question of what information IS. So, what is it?

Whereas I strictly separate information and matter, you have merged matter and information into one. In your concept the notion of "information" "floats" with its roots cut off, because its etymology has been crossed out, and since you explain matter through information, matter also "floats" on top of information, left without a definition.

It is difficult for me to explain what matter is, but I do not confuse it with information, which, in my opinion, is much easier to explain. Thus, I have a definition of at least one of the two concepts, while you, it turns out, have none.




 
Maxim Dmitrievsky #:
We discussed the information.

Yes, but information as defined in informatics, and its formulas, do not fit the process by which nerve cells reflect the world. That is a different kind of information. More complex.

Why is it not a good use of the term to assess the number of states and the degree to which they are taken into account? How do entropy and, for example, the degree of informedness about the system's properties not fit? It's just that "information about the system" is not a great term. In computer science it's fine.
Valeriy Yastremskiy #:

Yes, but information as defined in informatics, and its formulas, do not fit the process by which nerve cells reflect the world. That is a different kind of information. More complex.

Why is it not a good use of the term to assess the number of states and the degree to which they are taken into account? How do entropy and, for example, the degree of informedness about the system's properties not fit? It's just that "information about the system" is not a great term. In computer science it's fine.

Well, you can tell this to neural networkers (closer to nerve cells).

All the arguments are passed over because of a total lack of knowledge of the subject.

Now it's going to start again: "it's not the same thing", blah blah...

Andrey Dik #:

OK, let's compare. I think I already did, let's do it again).

Encoded information may have less entropy than raw, random information, but encoding requires energy (so with otherwise equal conditions, the energy inputs are already unequal)! Moreover, once unpacked, the information will again have more entropy. So more information means more entropy and less anisotropy.

That's right.

In general, all other things being equal (mass and energy), random data has more entropy than non-random data.

We are talking about information and how to evaluate it; coding is a separate matter.

 
Maxim Dmitrievsky #:

Well, you tell that to the neural networkers.

Well, there are a lot of simplifications of real processes there. So I don't think they would agree that they can model the process by which a real subject reflects the world, or how it feels. At the level of neural networkers, yes, informatics works, but unfortunately not at the level of medicine.

Valeriy Yastremskiy #:

Well, there are a lot of simplifications of real processes there. So I don't think they would agree that they can model the process by which a real subject reflects the world, or how it feels. At the level of neural networkers, yes, informatics works, but unfortunately not at the level of medicine.

Guess what, it's started. Or rather, it's continued.

The essence of the argument is this: information and the way it is expressed are universal for all cases.