
A meaningful approach to measuring information: everything you need to know

Before you start measuring information, let's introduce a definition and understand what we are dealing with.

Definition

Information is any messages or data, in whatever form they are presented and regardless of their content. By this definition, even complete nonsense written on a scrap of paper can be considered information. This broad definition comes from Russian federal law.

International standards offer the following definitions:

  • knowledge about subjects, facts, ideas, and opinions that people exchange in a specific context;
  • knowledge about facts, events, things, and concepts that has a particular meaning in a specific context.

Data is the materialized form in which information is represented, although in some texts the two concepts are used as synonyms.

Methods of measurement

The concept of information is defined in different ways, and it is measured in different ways as well. The following basic approaches to measuring information can be distinguished:

  1. The alphabetical approach.
  2. The probabilistic approach.
  3. The meaningful approach.

Each of these approaches corresponds to a different definition of information and has a different author. The probabilistic approach, developed by C. Shannon, and the alphabetical approach, associated in textbooks with A. N. Kolmogorov, do not take the subject of information transfer into account: they measure the quantity of information regardless of how important it is for the sender and the receiver. The meaningful approach, by contrast, ties the amount of information to the new knowledge it gives the receiving party. But let's look at everything in order.

Probabilistic Approach

As already mentioned, the approaches to measuring the amount of information differ considerably. The probabilistic approach was developed by Shannon in 1948. Its idea is that the amount of information depends on the number of possible events and their probabilities. In this approach, the amount of information received is calculated by the formula

I = -(p1·log2 p1 + p2·log2 p2 + … + pN·log2 pN),

where I is the required quantity, N is the number of events, and pi is the probability of the i-th event.
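Shannon's formula can be sketched in a few lines of Python. This is a minimal illustration, not part of the original article; the function name `shannon_information` is our own choice.

```python
import math

def shannon_information(probabilities):
    """I = -sum(p_i * log2(p_i)): amount of information in bits
    for N events with the given probabilities."""
    # Terms with p = 0 contribute nothing, so skip them to avoid log2(0).
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# A fair coin: two equiprobable outcomes carry exactly 1 bit.
print(shannon_information([0.5, 0.5]))       # 1.0

# Four equiprobable outcomes carry 2 bits.
print(shannon_information([0.25] * 4))       # 2.0
```

Note that for equiprobable events (all pi = 1/N) the formula reduces to log2 N, which is exactly the formula used by the meaningful approach below.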

Alphabet

This is an entirely self-sufficient method of calculating the amount of information. It does not take into account what exactly is written in the message and does not connect the amount of information with the content. To calculate the amount of information, we need to know the power (cardinality) of the alphabet and the length of the text. In principle, the power of an alphabet is unlimited; computers, however, commonly use an alphabet with a power of 256 characters. Thus we can calculate how much information one symbol of computer-printed text carries: since 256 = 2^8, one character carries 8 bits of data.

1 bit is the minimum, indivisible amount of information. According to Shannon, this is an amount of data that reduces the uncertainty of knowledge by half.

8 bits = 1 byte.

1024 bytes = 1 kilobyte.

1024 kilobytes = 1 megabyte.
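The unit conversions above can be expressed directly in code. A minimal sketch using the binary (1024-based) units listed in the article:

```python
# Binary unit conversions, as given in the article:
# 8 bits = 1 byte, 1024 bytes = 1 KB, 1024 KB = 1 MB.
BITS_PER_BYTE = 8
BYTES_PER_KB = 1024
KB_PER_MB = 1024

def megabytes_to_bits(mb):
    """Convert megabytes to bits using the 1024-based units above."""
    return mb * KB_PER_MB * BYTES_PER_KB * BITS_PER_BYTE

print(megabytes_to_bits(1))  # 8388608 bits in one megabyte
```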

Meaningful approach

As you can see, the approaches to measuring information are very different. There is one more way to measure its quantity, one that evaluates not only the quantity but also the quality. The meaningful approach takes the usefulness of the data into account: the amount of information contained in a message is determined by the amount of new knowledge the recipient gains from it.

Expressed mathematically, an amount of information equal to 1 bit should reduce the uncertainty of a person's knowledge by half. Thus, we use the following formula to determine the amount of information:

X = log2 H, where X is the amount of information received and H is the number of equiprobable outcomes. Let's solve a problem as an example.

Suppose we have a triangular pyramid (a tetrahedron) with four faces. When we toss it up, it can land on any one of its four faces, so H = 4 (the number of equiprobable outcomes). The chance that it balances on an edge and remains standing is even smaller than the chance of a tossed coin landing on its edge, so we can ignore that outcome.

Solution: X = log2 H = log2 4 = 2.

As you can see, the result is 2. But what does this number mean? As already mentioned, the minimum indivisible unit of information is the bit, so after the toss we have received 2 bits of information.
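The worked problem above can be checked with a short Python sketch. The function name is our own; it simply evaluates the article's formula X = log2 H.

```python
import math

def information_from_outcomes(H):
    """X = log2(H): information gained (in bits) when one
    of H equiprobable outcomes occurs."""
    return math.log2(H)

print(information_from_outcomes(4))  # 2.0 bits for the four-faced pyramid
print(information_from_outcomes(2))  # 1.0 bit for a coin toss
```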

All of these approaches to measuring information use logarithms in their calculations. To simplify the arithmetic, you can use a calculator or a table of logarithms.

Practice

Where can the knowledge gained in this article, especially about the meaningful approach to measuring information, be of use? Without a doubt, in a computer science exam. The topic also helps you navigate computer technology better, particularly where internal and external memory is concerned. Outside of science, this knowledge has little direct practical value: no employer will ask you to calculate the amount of information in a printed document or a written program. The exception is programming, where you may need to choose the size of the memory allocated to a variable.


Copyright © 2018 en.delachieve.com. Theme powered by WordPress.