What is entropy?

What does entropy mean?

Entropy is a notion that comes from a Greek word that can be translated as "turning" or "transformation" (used here in a figurative sense).

In the 19th century, Clausius coined the term in the field of physics to refer to a measure of the disorder observable in the molecules of a gas. Since then, the concept has been used with various meanings in multiple sciences, such as physics, chemistry, computer science, mathematics and linguistics.
Some definitions are:

Entropy can be the thermodynamic physical quantity that measures the unusable part of the energy contained in a system, that is, the part of the energy that cannot be used to produce work.
Entropy is also understood as a measure of the disorder of a system; in this sense, it is associated with its degree of homogeneity.
The entropy of formation of a chemical compound is established from the entropy of each of its constituent elements. The higher the entropy of formation, the more favorable the compound's formation will be.
In information theory, entropy is the measure of the uncertainty that exists before a set of messages, of which only one will be received. It is a measure of the information needed to reduce or eliminate that uncertainty.
Another way to understand entropy is as the average amount of information carried by the transmitted symbols. Words like "the" or "that" are the most frequent symbols in a text, yet they are the ones that provide the least information. A message carries relevant information, and entropy reaches its maximum, when all symbols are equally probable, as the sketch below illustrates.
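To illustrate that last point, here is a minimal Python sketch (not part of the original article) that computes the Shannon entropy of the symbols in a string, H = -Σ p·log2(p); a string whose symbols are all equally probable reaches the maximum entropy for its alphabet size, while a skewed distribution carries less information per symbol.

```python
from collections import Counter
from math import log2

def shannon_entropy(text: str) -> float:
    """Average information per symbol, in bits: H = -sum(p * log2(p))."""
    counts = Counter(text)
    total = len(text)
    return -sum((c / total) * log2(c / total) for c in counts.values())

# Four symbols, all equally probable: entropy hits its maximum, log2(4) = 2 bits.
print(shannon_entropy("abcdabcdabcdabcd"))  # 2.0
# Skewed distribution (12 a's, one each of b, c, d): about 1.04 bits per symbol.
print(shannon_entropy("aaaaaaaaaaaabcd"))
```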
Entropy in the field of linguistics
The way information is organized and conveyed in discourse is one of the most relevant research topics in linguistics, and thanks to entropy a deeper analysis of communication can be performed.
In the case of written communication, the problem is easy to analyze (the basic units, the letters, are well defined): if you want to fully understand the message, you can decode it accurately and grasp both what is said literally and what is meant figuratively. In oral language, however, things change a bit and some complications appear.
It is not easy to determine the fundamental elements of the code in oral discourse; words sound different depending on who says them and, likewise, they can have different meanings. It is therefore not enough to classify them into vowel and consonant phonemes, because this would not reveal how the information is organized; for example, if the vowel phonemes are suppressed, it is not possible to understand the message.

According to a study carried out at the University of Wisconsin-Madison, a good way to isolate and understand the oral code is through the spectral decomposition of sound signals. This technique attempts to capture how the cochlea filters and analyzes what reaches it; the cochlea is the part of the ear that transforms sounds into electrical signals and sends them to the brain.
To carry out this experiment, a measure known as "cochlea-scaled spectral entropy" (CSE) was used; it establishes connections between a signal and the one that precedes it, assessing how well a signal can be predicted from the previous one.
The results showed that the more similar two signals are, the easier it is to predict the second, which means the information gained from the second is almost nil. Likewise, the more they differ from each other, the greater the information provided by the second signal, so removing it has considerable consequences for the understanding of the speech.
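The CSE measure itself belongs to that study; purely as a hypothetical illustration of the underlying idea, the sketch below computes the spectral entropy of consecutive slices of a signal by treating each slice's normalized power spectrum as a probability distribution. All names, signals and parameters here are illustrative assumptions, not the study's actual method.

```python
import numpy as np

def spectral_entropy(segment: np.ndarray) -> float:
    """Entropy (in bits) of a slice's normalized power spectrum."""
    power = np.abs(np.fft.rfft(segment)) ** 2
    p = power / power.sum()
    p = p[p > 0]                           # skip empty frequency bins
    return float(-(p * np.log2(p)).sum())

rate = 16_000                              # assumed sample rate (Hz)
t = np.arange(0, 0.08, 1 / rate)           # 80 ms of signal
tone = np.sin(2 * np.pi * 440 * t)         # steady tone: highly predictable
noise = np.random.default_rng(0).standard_normal(t.size)  # unpredictable

for name, signal in [("tone", tone), ("noise", noise)]:
    slices = np.split(signal, 4)           # four consecutive 20 ms slices
    print(name, [round(spectral_entropy(s), 2) for s in slices])
# The tone's slices yield low, nearly identical entropies (each new slice
# adds little information); the noise slices yield much higher values.
```

Roughly speaking, a steady, redundant signal behaves like the highly similar pairs in the study (little new information per slice), while a noisy, changing signal behaves like the dissimilar pairs that carry the most information.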
