No, it's not difficult, but how do you measure this process over its duration, its "schizophrenia"? How do you assess the psychological state of this process? And the big question is: why? What's the point of inventing something and assuming it works everywhere? In computer science, yes, it works, but Peter doesn't want it to)))).
We should compare the same system when the random marks have been replaced with non-random ones.
OK, let's compare. I think I already did, let's do it again).
Encoded information may have less entropy than raw, random information, but encoding requires energy (so under "equal conditions" the energy inputs are already unequal)! Moreover, unpacked information will again have more entropy. So more information means more entropy and less anisotropy.
That's right.
In general, all other things being equal (mass and energy), random data has more entropy than non-random data.
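As a rough illustration of the last two posts, here is a minimal sketch in Python (the helper names, data sizes, and the choice of zlib are my own illustrative assumptions, not anything from the thread) that estimates first-order byte entropy for random data, for patterned data, and for the patterned data after encoding:

```python
# Minimal sketch: first-order (byte-frequency) Shannon entropy of
# random vs. non-random data, and of the same data after encoding.
# Names and sizes are illustrative only.
import math
import os
import zlib
from collections import Counter

def byte_entropy(data: bytes) -> float:
    """Empirical Shannon entropy in bits per byte: H = -sum(p * log2(p))."""
    n = len(data)
    return -sum((c / n) * math.log2(c / n) for c in Counter(data).values())

def report(name: str, data: bytes) -> None:
    h = byte_entropy(data)
    print(f"{name}: {len(data)} bytes, {h:.2f} bits/byte, ~{h * len(data):,.0f} bits total")

random_data = os.urandom(1 << 16)      # 64 KiB of random bytes
regular_data = b"abcd" * (1 << 14)     # 64 KiB of a repeating pattern
encoded = zlib.compress(regular_data)  # the encoding step itself costs CPU work

report("random ", random_data)   # ~8.00 bits/byte, near the maximum
report("regular", regular_data)  # 2.00 bits/byte, far below it
report("encoded", encoded)       # high bits/byte, but far fewer bits in total
```

Under this crude estimate the random bytes sit near the 8 bits/byte maximum, the patterned data well below it, and the encoded form carries far fewer bits in total, paid for by the CPU work spent compressing.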
And unpacking encoded information doesn't release energy, by any chance? 😄
Yeah, I thought about that too))) But I guess not. Mining energy by unpacking zips isn't going to work))))
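A tongue-in-cheek sketch of that point (the payload and sizes are purely illustrative): unpacking spends CPU time, i.e. it costs energy rather than releasing any:

```python
# Toy sketch: "unpacking a zip" consumes CPU work; it does not mine energy.
import time
import zlib

payload = zlib.compress(b"free energy? " * 1_000_000)  # ~13 MB uncompressed

t0 = time.perf_counter()
raw = zlib.decompress(payload)
elapsed = time.perf_counter() - t0

print(f"unpacked {len(raw)} bytes in {elapsed:.4f} s of CPU work")
```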
By the way, "perpetual" motion machines also consume energy (wave energy, for example), so perpetual motion machines are not engines but energy converters.
1. Actually, it's the other way round. If we calculate how much of our ordinary information is needed to encode a system such as an atom, and then calculate how many such objects make up the various material forms of our everyday world, we get prohibitive amounts of information. Besides, it is in them that our ordinary information, our thoughts and our understanding of things, is also indirectly encoded. Thus, material information > human (computer) interpretation.
2. Unstructured data? That is how those structures store the very "human" information you're talking about. Furthermore, not knowing how the cells in your body work doesn't turn your body into a bag of unstructured broth of whatever matter happens to be there. Everything in it is highly ordered and works without your understanding or consent.
3. Well, the example of nucleic acid replication in cell nuclei doesn't seem tricky to you, does it? Big deal, it's useless nonsense to a human being, since he doesn't interpret genetic information himself (oh wait, he already does!).
4. Well, yes, earlier we didn't reason about matter as a form of information in some carrier medium. But now we do. So what use are the old etymological roots of various words, which are probably unnecessary anyway?
We were discussing information.
Yes, but information as defined in informatics, with its formulas, does not fit the process by which nerve cells reflect the world. It's a different kind of information. More complex.
Not a good use of the term for assessing the number of states and the degree to which they are accounted for? How do entropy and, say, the degree of information about a system's properties not fit? It's just that "information about the system" is not a good term. In computer science it's fine.
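For reference, the informatics measure being defended in that exchange is Shannon entropy over a system's states:

```latex
% Shannon entropy of a source X whose states i occur with probability p_i:
% the expected information, in bits, gained from observing one state.
H(X) = -\sum_{i} p_i \log_2 p_i
```

A uniformly random source maximizes H(X); any structure or predictability lowers it, which is the sense in which random data carries more entropy than non-random data.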
All the arguments are brushed aside out of a total lack of knowledge of the subject.
Now it's going to start again: "that's not the same thing", blah blah...
Taken as a whole, all other things being equal (mass and energy), random data has more entropy than non-random data.
We are taking information and its evaluation; coding is a separate matter.
Well, you tell that to the neural networkers.
Well, there are a lot of simplifications of real processes in there. So I don't think they would agree that they can model the process by which a real subject reflects the world, and how it feels. At the level of neural networks, yes, informatics works, but unfortunately not at the level of medicine.
Guess what, it's started. Or rather, it's continued.
The essence of the argument is this: information, and the way it is expressed, is universal across all cases.