Information exists independently of the subject's knowledge of that information. Information can be transmitted or received through various means, and may or may not be available to someone who can use it for his or her own purposes. For example, if someone does not know that the Earth is round, it will not change the fact that the Earth is round. Facts and knowledge exist outside of our awareness of them, but having knowledge can enhance our use of that information.
Facts are objective events or phenomena that occur in the world, and they exist outside of our awareness.
Knowledge, on the other hand, is information or beliefs that exist in a person's mind. It may be based on facts, but it may also be distorted or incomplete. For example, we may not be aware of the existence of a certain country or language, but this does not change the fact that they actually exist.
Thus, we can say that facts and knowledge exist outside of our awareness of them, and our awareness is only a way of accessing those facts and knowledge. (ChatGPT)
Info, as a property of the object, is a notion that ordinary people miss entirely: for them, info is information about the object, as understood by someone, not the object itself.
Well, we have already reached the point where you put an equals sign between data and matter...
For you, information is a philosophical concept; that is your right. For me, information is a practical concept used in information technology. For me it is quite normal that information can be stored, processed, compressed, and unpacked. Most likely you mean something else by the word "information", but then that is off-topic for this forum.
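A minimal sketch of that practical view (my own illustration, not from the thread), using Python's standard zlib module: information here is simply bytes that can be stored, compressed, and unpacked without loss:

```python
import zlib

# The "practical" view: information is bytes that can be
# stored, processed, compressed and unpacked.
message = "information is a practical concept " * 10
data = message.encode("utf-8")          # store as bytes

compressed = zlib.compress(data)        # compress
restored = zlib.decompress(compressed)  # unpack

print(len(data), "->", len(compressed), "bytes")
print(restored.decode("utf-8") == message)  # True: nothing lost
```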
The problem is that there are several inconsistent definitions of information in different fields of knowledge.
I tried to take the most fundamental one (from physics) and dilute it with all the others.
That's why my data are material objects (including the electrons in a flash drive), while information is rather a process of entropy reduction in the receiver.
From there, information in the sense of thinking is also derived: it is the physical structuring of neuron connections in the brain, a reduction of uncertainty.
Informatics is trickier, because everything there is already digitised. Fine, there the data may be symbols rather than matter. But they still reflect quite physical changes in conductors, so there is no contradiction.
Nowhere in this picture is there information as such, neither in storage nor in transmission. Information is a process of entropy reduction.
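A toy sketch of that claim in Shannon's terms (my own example; the probabilities are made up): the receiver's uncertainty about a source is the entropy H = -Σ p·log2(p), and the information carried by a message is the amount by which it reduces that entropy:

```python
import math

def shannon_entropy(probs):
    """H = -sum(p * log2 p) over outcomes with p > 0, in bits."""
    return -sum(p * math.log2(p) for p in probs if p > 0)

# Receiver's prior: four equally likely outcomes -> 2 bits of uncertainty.
prior = [0.25, 0.25, 0.25, 0.25]

# After a message rules out two outcomes -> 1 bit of uncertainty left.
posterior = [0.5, 0.5, 0.0, 0.0]

info_gained = shannon_entropy(prior) - shannon_entropy(posterior)
print(info_gained)  # 1.0 bit: information as the reduction of entropy
```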
I think I deserve an Ig Nobel Prize.
Info, as a property of the object, is a notion that ordinary people miss entirely: for them, info is information about the object, as understood by someone, not the object itself.
Nowhere in this picture is there information as such, neither in storage nor in transmission. Information is a process of entropy reduction.
I think I deserve an Ig Nobel Prize.
Well, to me that is just sophistry, actually. Of course, information about the object and its components is 80 per cent of it, and entropy 20 per cent, as something quite invented though, of course, a logical equation. But that is hard for ordinary physicists) and does not affect their discoveries in any way. As a point of view it does exist in computer science, though, and within computer science it is already postulated))))
In physics, entropy is the number of states, not the number of unaccounted-for states. When the temperature of a gas increases, the number of states increases and entropy increases, but this does not mean that today we cannot account for all the states of all the molecules in one cubic centimetre of helium when the temperature rises by one degree, or in 0.001 cc; what difference does it make, the rule must work everywhere).
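For reference, the standard relation in statistical physics is S = k_B ln Ω: entropy grows with the logarithm of the number of microstates, not with the count itself. A toy sketch (my own, with made-up microstate counts) shows why a huge growth in Ω gives only a modest growth in S:

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(n_states):
    """S = k_B * ln(Omega): entropy from a microstate count."""
    return K_B * math.log(n_states)

# Toy (made-up) microstate counts before and after heating.
omega_cold = 1e20
omega_hot = 1e24   # 10,000x more accessible microstates

print(boltzmann_entropy(omega_cold))  # ~6.36e-22 J/K
print(boltzmann_entropy(omega_hot))   # ~7.63e-22 J/K: logarithmic growth only
```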
The object has no such property.
I don't know what you mean.
Data source:
MINISTRY OF EDUCATION AND SCIENCE OF THE RUSSIAN FEDERATION
Federal State Autonomous Educational Institution of Higher Education "Southern Federal University"
Academy of Engineering and Technology
Yu.A. Kravchenko, E.V. Kuliev, V.V. Markov
INFORMATION AND SOFTWARE TECHNOLOGIES
Part 1: INFORMATION TECHNOLOGIES
Textbook
Rostov-on-Don / Taganrog: Southern Federal University Press, 2017
Well, to me that is just sophistry, actually. Of course, information about the object and its components is 80 per cent of it, and entropy 20 per cent, as something quite invented though, of course, a logical equation. But that is hard for ordinary physicists) and does not affect their discoveries in any way. As a point of view it does exist in computer science, though, and within computer science it is already postulated))))
In physics, entropy is the number of states, not the number of unaccounted-for states. When the temperature of a gas increases, the number of states increases and entropy increases, but this does not mean that today we cannot account for all the states of all the molecules in one cubic centimetre of helium when the temperature rises by one degree, or in 0.001 cc; what difference does it make, the rule must work everywhere).
...
The diagram in your post is almost exactly what I said earlier (although I had not seen it before), but it misses the work of converting data into information. Also, the additional division into information and knowledge seems unnecessary. Information, knowledge, representations are essentially the same thing: different levels of interpretation of data.
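A minimal sketch of that "work of converting data into information" (my own illustration; the record format and field names are made up): the same bytes are only data until an interpretation step is applied to them:

```python
# Raw data: bytes are just physical/symbolic states, not information yet.
raw = b"2017;Kravchenko;Taganrog"

def interpret(record: bytes) -> dict:
    """The "work" of conversion: an interpretation rule applied to the data.
    Turns a semicolon-separated record into named fields (made-up format)."""
    year, author, city = record.decode("utf-8").split(";")
    return {"year": int(year), "author": author, "city": city}

info = interpret(raw)
print(info["year"])  # 2017: meaning produced by interpretation, not stored in the bytes
```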
It cannot exist in nature, because nature is matter.
I don't know what you mean.