
What is entropy? The concept of entropy and standard entropy

Entropy is a word many have heard, but few truly understand. And we must admit that it is genuinely difficult to grasp the full essence of this phenomenon. However, that should not frighten us. Much of what surrounds us can, in fact, be explained only superficially. And this is not about the perception or knowledge of any particular individual; it is about the totality of scientific knowledge that humankind possesses.

Serious gaps exist not only in our knowledge on galactic scales, for example in questions about black holes and wormholes, but also in what surrounds us constantly. For example, there are still disputes about the physical nature of light. And who can fully explain the concept of time? There are many such questions, but in this article we will discuss entropy. For many years scientists have grappled with the concept of entropy; chemistry and physics go hand in hand in the study of this mysterious phenomenon. We will try to find out what has become known by now.

Introduction of the concept in the scientific community

The notion of entropy was first introduced to the scientific community by the eminent German physicist and mathematician Rudolf Julius Emanuel Clausius. Put simply, the scientist decided to find out where energy goes. In what sense? For illustration, let us not turn to his numerous experiments and complicated conclusions, but take an example more familiar from everyday life.

You are surely aware that when you charge, say, a mobile phone, the amount of energy accumulated in the battery is less than the energy actually drawn from the mains. There are certain losses, and in everyday life we are used to them. But the fact is that similar losses occur in any closed system, and for physicists and mathematicians this poses a serious problem. Rudolf Clausius studied this question.

As a result, he arrived at a most curious conclusion. If we once again strip away the complex terminology, it comes down to this: entropy is the difference between an ideal process and a real one.

Imagine that you own a store and have taken 100 kilograms of grapefruit for sale at a price of 10 tugriks per kilogram. Adding a markup of 2 tugriks per kilo, you will take in 1,200 tugriks from the sale, hand over the agreed 1,000 tugriks to the supplier, and keep a profit of 200 tugriks.

That was a description of the ideal process. But any trader knows that by the time all the grapefruits are sold, about 15 percent of them will have shriveled, and another 20 percent will rot and simply be written off. That is the real process.

So, the concept of entropy that Rudolf Clausius introduced into science is defined through a relation: the increase in a system's entropy equals the heat the system receives divided by its absolute temperature. In essence, it quantifies the wasted (lost) energy.
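Clausius's relation can be sketched in a few lines of code. This is a minimal illustration, assuming a reversible process in which a fixed amount of heat Q (in joules) is absorbed at a constant absolute temperature T (in kelvins):

```python
def entropy_change(heat_joules, temperature_kelvin):
    """Clausius's definition for a reversible process: dS = Q / T."""
    if temperature_kelvin <= 0:
        raise ValueError("absolute temperature must be positive")
    return heat_joules / temperature_kelvin

# 1000 J of heat absorbed reversibly at 300 K gives about 3.33 J/K:
print(entropy_change(1000.0, 300.0))
```

Note that the same amount of heat delivered at a lower temperature produces a larger entropy change, which is why temperature sits in the denominator.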

The measure of chaos

Still, one can assert with some confidence that entropy is a measure of chaos. If we take as a model of a closed system the room of an ordinary schoolboy, then a school uniform left lying about already represents some entropy, though its value in this situation is small. But if, in addition, toys are scattered about, popcorn is brought from the kitchen (naturally, with some of it dropped) and all the textbooks are left in a jumble on the desk, then the entropy of the system (in this particular case, the room) rises sharply.

Complex Matter

The entropy of matter is a very complex thing to describe, and many scientists over the last century contributed to studying its mechanism. The concept of entropy is used not only in mathematics and physics; it also holds a deserved place in chemistry, and some enthusiasts even use it to explain psychological processes in human relationships. Let us trace the difference in the formulations of three physicists. Each reveals entropy from a different angle, and together they will help us form a more complete picture.

Clausius's statement

Heat cannot spontaneously pass from a body with a lower temperature to a body with a higher temperature.

It is not difficult to verify this postulate. With cold hands you will never warm, say, a freezing little puppy, no matter how much you want to help him. You have to tuck him into your bosom, where the temperature is higher than his own at that moment.
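The direction of heat flow can also be checked arithmetically. In this sketch (an idealized model assuming both bodies are large enough that their temperatures stay roughly constant), the total entropy change of the pair is the heat gained by the cold body over its temperature, minus the heat lost by the hot body over its temperature. Nature only permits the direction in which this total is positive:

```python
def total_entropy_change(q, t_source, t_sink):
    """Entropy change when heat q flows from a body at t_source to one at t_sink.

    The source loses entropy (-q / t_source), the sink gains it (+q / t_sink).
    """
    return q / t_sink - q / t_source

# 100 J flowing from a 350 K body to a 280 K body: total entropy rises, allowed.
print(total_entropy_change(100.0, 350.0, 280.0))
# The reverse flow would make total entropy fall, so it cannot happen on its own.
print(total_entropy_change(100.0, 280.0, 350.0))
```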

Thomson's statement

No process is possible whose sole result is the performance of work at the expense of heat taken from a single body.

Put very simply, this means it is physically impossible to build a perpetual motion machine of the second kind. The entropy of a closed system will not allow it.

Boltzmann's statement

Entropy cannot decrease in closed systems, that is, in systems that receive no external supply of energy.

This formulation shook the faith of many devotees of the theory of evolution and made them think seriously about the existence of a reasonable Creator of the universe. Why?

Because, by default, entropy in a closed system always increases, which means chaos grows. It can be reduced only by an external supply of energy. We observe this law every day: if you do not take care of a garden, a house, a car and so on, they will simply fall into disrepair.
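Boltzmann's own formula connects this growth of chaos to counting: S = k ln W, where W is the number of microscopic arrangements consistent with what we see. A minimal sketch, using the schoolboy's room as a stand-in (the microstate counts here are invented purely for illustration):

```python
import math

K_B = 1.380649e-23  # Boltzmann constant, J/K

def boltzmann_entropy(microstates):
    """S = k * ln(W): the more ways a state can be arranged, the higher its entropy."""
    return K_B * math.log(microstates)

# A perfectly tidy room has only one arrangement, so its entropy is zero:
print(boltzmann_entropy(1))
# A messy room, realizable in vastly more ways, always has higher entropy:
print(boltzmann_entropy(10**6) > boltzmann_entropy(10**3))
```

Since there are always far more messy arrangements than tidy ones, random shuffling drives the system toward mess, which is exactly the statement that entropy does not decrease on its own.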

On the mega-scale, our universe is also a closed system. And some scientists have concluded that our very existence must testify that this external energy comes from somewhere. Therefore, today no one is surprised that some astrophysicists believe in God.

The arrow of time

Another very ingenious illustration of entropy is the arrow of time: entropy shows in which direction a process will physically move.

Indeed, on learning of the gardener's dismissal, you will hardly expect the grounds he was responsible for to become neater and better groomed. Quite the opposite: if you do not hire another employee, after some time even the most beautiful garden will fall into desolation.

Entropy in Chemistry

In the discipline of chemistry, entropy is an important indicator. In some cases its value determines the course of chemical reactions.

Who has not seen scenes from feature films in which the characters carry containers of nitroglycerin with extreme care, afraid of provoking an explosion with one careless sharp movement? That was a visual illustration of how entropy operates in a chemical substance: if its value reached a critical level, a reaction would begin, resulting in an explosion.

Order of disorder

It is most often argued that entropy is the tendency toward chaos. In its original sense, the word "entropy" means transformation or turning. We have already said that it characterizes an action. Very interesting in this context is the entropy of a gas. Let us try to imagine how it works.

Take a closed system consisting of two connected tanks, each containing gas. Until the containers were tightly connected to each other, the pressure in them was different. Imagine what happened at the molecular level when they were joined.

The crowd of molecules that was under greater pressure immediately rushed toward its brethren, which had previously lived in relative freedom, thereby raising the pressure there. This can be compared to water splashing in a bathtub: surging to one side, it immediately rushes back to the other. So do our molecules. In our system, ideally isolated from external influence, they will jostle until an impeccable balance is established throughout the whole volume. And when there is exactly as much space around each molecule as around its neighbor, everything will calm down. This will be the maximum entropy: the turns and transformations will cease.
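The jostling described above can be imitated with a toy simulation. This is a deliberately simplified model (a variant of the classic two-urn picture, not a real gas dynamics calculation): at each step one molecule, picked at random, hops to the other tank. Starting far from balance, the counts drift toward an even split and then merely fluctuate around it:

```python
import random

def equalize(left, right, steps, seed=0):
    """Toy model of two connected tanks: each step, a random molecule hops across."""
    rng = random.Random(seed)
    total = left + right
    for _ in range(steps):
        # The picked molecule sits in the left tank with probability left / total,
        # so the fuller tank is more likely to lose a molecule than to gain one.
        if rng.random() < left / total:
            left -= 1
            right += 1
        else:
            left += 1
            right -= 1
    return left, right

# Start far from balance: 900 molecules against 100.
# After many hops the counts hover near (500, 500), the maximum-entropy split.
print(equalize(900, 100, 20000))
```

The drift toward 50/50 is not enforced by any rule; it simply reflects that the fuller tank is more likely to lose the next molecule, which is the statistical heart of entropy growth.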

Standard entropy

Scientists do not give up their attempts to order and classify even disorder. Since the value of entropy depends on a set of accompanying conditions, the concept of "standard entropy" was introduced. The values of these standards are collected in special tables, so that calculations can be performed easily for a variety of applied problems.

By default, standard entropy values are taken at a pressure of one atmosphere and a temperature of 25 degrees Celsius. As the temperature rises, this indicator increases as well.
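Such tables are typically used to find the entropy change of a reaction: add up the standard entropies of the products, weighted by their coefficients, and subtract the same sum for the reactants. A small sketch, using rounded textbook values for the formation of liquid water (the exact figures vary slightly between tables):

```python
# Rounded textbook standard molar entropies at 298 K, in J/(mol K).
S_STANDARD = {"H2(g)": 130.7, "O2(g)": 205.2, "H2O(l)": 70.0}

def reaction_entropy(products, reactants):
    """dS = sum(coef * S) over products minus the same sum over reactants."""
    def side_sum(side):
        return sum(coef * S_STANDARD[species] for species, coef in side.items())
    return side_sum(products) - side_sum(reactants)

# H2(g) + 1/2 O2(g) -> H2O(l): one and a half moles of gas vanish into a liquid,
# so the entropy change is strongly negative (about -163 J/(mol K)).
print(reaction_entropy({"H2O(l)": 1}, {"H2(g)": 1, "O2(g)": 0.5}))
```

The large negative result makes intuitive sense: gases are far more disordered than liquids, so locking gas molecules into liquid water destroys a great deal of disorder.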

Codes and ciphers

There is also information entropy, designed to help in encoding messages. With respect to information, entropy is a measure of the unpredictability of information. In very simple terms, it describes how easy it will be to crack an intercepted cipher.

How does it work? At first glance, it seems that without at least some basic data one cannot understand a coded message. But that is not so. Here probability comes into play.

Imagine a page with an encrypted message. You know that Russian was used, but the symbols are completely unfamiliar. Where to begin? Think: what is the probability that the letter "ъ" will appear on this page? And the chance of stumbling upon the letter "о"? You see the idea. The symbols that occur most often (and least often, which is also an important indicator) are counted and compared with the known frequency peculiarities of the language in which the message was composed.
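The quantity behind this trick is Shannon's entropy: the more unevenly the symbols are distributed, the lower the entropy and the easier the frequency attack. A minimal sketch of the formula H = -sum(p * log2(p)) over the symbol probabilities:

```python
from collections import Counter
import math

def shannon_entropy(text):
    """Shannon entropy of a string, in bits per symbol: H = -sum p * log2(p)."""
    counts = Counter(text)
    total = len(text)
    return -sum((n / total) * math.log2(n / total) for n in counts.values())

# A message dominated by one symbol is highly predictable (low entropy):
print(shannon_entropy("aaaaaaab"))
# Eight equally frequent symbols give the maximum, exactly 3 bits per symbol:
print(shannon_entropy("abcdefgh"))
```

A simple substitution cipher does not change these frequencies at all, which is precisely why counting symbols, as described above, is enough to begin breaking it.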

In addition, there are frequent, and in some languages invariable, letter combinations. This knowledge is also used for deciphering. Incidentally, this is the method used by the famous Sherlock Holmes in the story "The Adventure of the Dancing Men". Codes were cracked in the same way on the eve of the Second World War.

Information entropy, in turn, is designed to increase the reliability of encoding. Thanks to the derived formulas, mathematicians can analyze and improve the options offered by cipher designers.

Connection with dark matter

There are a great many theories still awaiting confirmation. One of them links the phenomenon of entropy with the relatively recently discovered dark matter, suggesting that the lost energy is simply transformed into this dark substance. Astronomers admit that in our universe only about 4 percent falls to the matter known to us, while the remaining 96 percent is unstudied at present: it is dark.

It received this name because it does not interact with electromagnetic radiation and does not emit it, unlike all objects in the universe known before that time. Therefore, at this stage in the development of science, studying dark matter and its properties is barely possible.
