The phenomenon of information

Information (from Lat. informatio - clarification, presentation, awareness) is data about something, regardless of the form of its presentation.

Currently there is no single definition of information as a scientific term. From the point of view of different areas of knowledge, this concept is described by its own specific set of features. For example, the concept of "information" is basic in informatics, and it is impossible to define it through other, "simpler" concepts (just as in geometry it is impossible to express the content of the basic concepts "point", "ray", and "plane" through simpler concepts). The content of the basic concepts of any science should be explained by examples or revealed by comparing them with the content of other concepts. In the case of the concept of "information", the problem of definition is even more complicated, since it is a general scientific concept. It is used in many sciences (informatics, cybernetics, biology, physics, etc.), and in each science the concept of "information" is associated with a different system of concepts.

History of the concept

The word "information" comes from the Latin informatio, which in translation means information, clarification, familiarization. The concept of information was already considered by ancient philosophers.

Before the start of the industrial revolution, defining the essence of information remained the prerogative mainly of philosophers. In the 20th century, these questions were taken up by cybernetics and informatics.

Classification of information

Information can be divided into types according to various criteria:

by method of perception:

by form of representation:

by purpose:

by value:

  • Relevant - information that is valuable at the present moment.
  • Reliable - information obtained without distortion.
  • Clear - information expressed in a language understandable to those for whom it is intended.
  • Complete - information sufficient for making a correct decision or for understanding.
  • Useful - the usefulness of information is determined by the subject who received it, depending on the range of possibilities for its use.

by truth:

The meaning of the term in various fields of knowledge

Philosophy

The traditionalism of the subjective constantly dominated early philosophical definitions of information as a category, concept, and property of the material world. Information exists independently of our consciousness and can be reflected in our perception only as a result of interaction: reflection, reading, reception in the form of a signal or stimulus. Information is immaterial, as are all properties of matter. Information stands in the same row as matter, space, time, systemness, function, etc., which are the fundamental concepts of a formalized reflection of objective reality in its distribution and variability, diversity and manifestation. Information is a property of matter and reflects its properties (state or ability to interact) and quantity (measure) through interaction.

From a material point of view, information is the order in which the objects of the material world follow one another. For example, the order of letters on a sheet of paper written according to certain rules is written information. The order of coloured dots placed on a sheet of paper according to certain rules is graphic information. The order of musical notes is musical information. The order of genes in DNA is hereditary information. The order of bits in a computer is computer information, and so on. To carry out an information exchange, necessary and sufficient conditions must be present.

The necessary conditions:

  1. The presence of at least two different objects of the material or non-material world.
  2. The presence of a common property of the objects that allows them to be identified as carriers of information.
  3. The presence of a specific property of the objects that allows them to be distinguished from one another.
  4. The presence of a property of space that allows the order of the objects to be determined. For example, the arrangement of written information on paper is a specific property of paper that allows letters to be placed from left to right and from top to bottom.

There is one sufficient condition:

The presence of a subject capable of recognizing information. This may be a human being and human society, animal societies, robots, etc.

Various objects (letters, symbols, pictures, sounds, words, sentences, notes, etc.), taken once each, form the basis (base) of information. An informational message is built by selecting copies of objects from the base and arranging these objects in space in a definite order. The length of an informational message is defined as the number of copies of base objects and is always expressed as an integer. One must distinguish between the length of an informational message, which is always measured by an integer, and the amount of knowledge contained in the informational message, which is measured in an unknown unit.

From a mathematical point of view, information is a sequence of integers written as a vector. The numbers are the numbers of objects in the information base. The vector is called an invariant of the information, since it does not depend on the physical nature of the base objects. The same informational message can be expressed in letters, words, sentences, files, pictures, notes, songs, video clips, or any combination of all of the above. Whatever we use to express the information, only the base changes, not the invariant.
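
As a rough illustration of this view, the Python sketch below (the bases of letters and punctuation-like symbols are invented purely for the example) builds the index vector of a message over one base and then re-expresses the same invariant over a different base:

    # A base of information: each object appears in it exactly once.
    base_letters = ["h", "e", "l", "o", " ", "w", "r", "d"]

    message = "hello world"

    # The invariant of the message: the sequence of base-object numbers.
    invariant = [base_letters.index(ch) for ch in message]
    print(invariant)  # [0, 1, 2, 2, 3, 4, 5, 3, 6, 2, 7]

    # The same invariant expressed over a different base of physical objects.
    base_symbols = ["#", "@", "%", "&", "_", "+", "!", "?"]
    re_expressed = "".join(base_symbols[i] for i in invariant)
    print(re_expressed)  # the message over the new base; the invariant is unchanged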

In computer science

The subject studied by the science of informatics (computer science) is data: methods of its creation, storage, processing, and transmission. The information itself recorded in the data, its meaningful content, is of interest to the users of information systems, who are specialists in various sciences and fields of activity: a physician is interested in medical information, a geologist in geological information, an entrepreneur in commercial information, etc. (a computer science specialist, in turn, is interested in information about working with data).

Systemology

Working with information always involves transformations and always confirms its material nature:

  • recording - the formation of the structure of matter and the modulation of flows through the interaction of a tool with a carrier;
  • storage - the stability of the structure (quasi-statics) and of the modulation (quasi-dynamics);
  • reading (studying) - the interaction of a probe (tool, converter, detector) with a substrate or a flow of matter.

Systemology considers information through its relations with other foundations: I = S / F, where I is information; S is the systemic nature of the universe; F is functional connection; M is matter; v is the sign of the great unification (systemness, unity of foundations); R is space; T is time.

In physics

The objects of the material world are in a state of continuous change, which is characterized by the exchange of energy between an object and its environment. A change in the state of one object always leads to a change in the state of some other object in the environment. This phenomenon, regardless of how, which states, and which objects change, can be regarded as the transmission of a signal from one object to another. The change in the state of an object when a signal is transmitted to it is called signal registration.

A signal, or a sequence of signals, forms a message that can be perceived by a recipient in one form or another and in one way or another. Information in physics is a term that qualitatively generalizes the concepts of "signal" and "message". If signals and messages can be quantified, then we can say that signals and messages are units of measurement of the amount of information.

The same message (signal) is interpreted in its own way by different systems. For example, one long and two short sounds (in symbolic notation, "- . .") is, in Morse-code terminology, the letter D (or d), while in the terminology of an Award BIOS it signals a video card malfunction.
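
A minimal Python sketch of this idea (the "code tables" below contain only this single signal and are purely illustrative): the same sequence of sounds is looked up in two different systems and yields two different meanings.

    # One long and two short sounds, as an abstract signal.
    signal = ("long", "short", "short")

    morse = {("long", "short", "short"): "letter D"}
    award_bios = {("long", "short", "short"): "video card malfunction"}

    for name, table in (("Morse code", morse), ("Award BIOS", award_bios)):
        print(name, "->", table.get(signal, "unknown signal"))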

In mathematics

In mathematics, information theory (the mathematical theory of communication) is a branch of applied mathematics that defines the concept of information and its properties and establishes limiting relations for data transmission systems. The main branches of information theory are source coding (compression coding) and channel (noise-resistant) coding. Mathematics is more than a scientific discipline: it creates a single language for all of science.

The subjects of mathematical study are abstract objects: numbers, functions, vectors, sets, and others. Most of them are introduced axiomatically, that is, without any connection with other concepts and without any definition.

Information is not among the subjects studied by mathematics. Nevertheless, the word "information" is used in mathematical terms: self-information and mutual information, which belong to the abstract (mathematical) part of information theory. In mathematical theory, however, the concept of "information" is connected exclusively with abstract objects - random variables - whereas in modern information theory this concept is treated much more broadly, as a property of material objects.

The connection between these two identically named terms is undeniable. It was the mathematical apparatus of random variables that was used by Claude Shannon, the author of information theory. By the term "information" he himself meant something fundamental (irreducible). Shannon's theory intuitively assumes that information has content; information reduces overall uncertainty and information entropy, and the amount of information can be measured. However, he warned researchers against mechanically transferring concepts from his theory to other areas of science.

"The search for ways to apply information theory in other areas of science does not come down to a trivial transfer of terms from one area of science to another. This search is carried out through a long process of advancing new hypotheses and testing them experimentally." - C. Shannon

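As a small illustration of these quantities, here is a Python sketch that computes the self-information of a single outcome and the Shannon entropy of a discrete distribution estimated from a message; the example message is arbitrary.

    import math
    from collections import Counter

    def self_information(p):
        """Self-information of an outcome with probability p, in bits."""
        return -math.log2(p)

    def entropy(probabilities):
        """Shannon entropy H = -sum(p * log2(p)), in bits per symbol."""
        return -sum(p * math.log2(p) for p in probabilities if p > 0)

    message = "snow white and the seven dwarfs"
    counts = Counter(message)
    total = sum(counts.values())
    probs = [c / total for c in counts.values()]

    print(f"entropy per symbol: {entropy(probs):.3f} bits")
    print(f"self-information of a fair coin flip: {self_information(0.5)} bits")
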
In jurisprudence

A legal definition of the concept of "information" is given in Federal Law No. 149-FZ of 27 July 2006 "On Information, Information Technologies and Information Protection" (Article 2): "information means data (messages, facts), regardless of the form of their presentation".

Federal Law No. 149-FZ defines and enshrines the rights to information protection and the information security of citizens and organizations in computers and information systems, as well as issues of the information security of citizens, organizations, society, and the state.

In control theory

In control theory (cybernetics), whose subject of research is the basic laws of control, that is, the development of control systems, information refers to the messages received by a system from the outside world in the process of adaptive control (adaptation, self-preservation of the system).

The founder of cybernetics, Norbert Wiener, spoke about information as follows:

"Information is not matter and not energy; information is information." But the main definition of information, which he gave in several of his books, is the following: information is a designation of the content we obtain from the outside world in the process of our adaptation to it and of the adaptation of our senses to it.

- N. Wiener, "Cybernetics, or Control and Communication in the Animal and the Machine"; "Cybernetics and Society"

This thought of Wiener points directly to the objectivity of information, that is, to its existence in nature independently of human consciousness (perception).

Modern cybernetics defines objective information as an objective property of material objects and phenomena to generate a diversity of states which, through interactions, are transmitted from one object (process) to another and are imprinted in its structure.

A material system is considered in cybernetics as a set of objects that can themselves be in different states, but the state of each of them is determined by the states of the other objects of the system. In nature, the set of states of a system is information; the states themselves are the primary code, or source code. Thus, every material system is a source of information.

Cybernetics defines subjective (semantic) information as the meaning or content of a message. (See ibid.) Information is a characteristic of an object.

Disinformation

Disinformation (also disinforming) is one of the ways of manipulating information: misleading someone by providing incomplete information, or complete but no longer needed information, or complete information that does not relate to the matter at hand, by distorting the context, or by distorting part of the information.

The purpose of such influence is always the same: the opponent must act as the manipulator needs. The act of the target against whom the disinformation is directed may consist in taking the decision the manipulator needs or in refusing to take a decision unfavourable to the manipulator. In any case, the final goal is the action that will be taken.

Information is a very complex entity. No one has yet managed to agree on a single definition of it, yet we understand perfectly well what it is. This is probably one of the few cases in science where a concept is used without a precise definition. Interestingly, information is the subject studied by computer science; perhaps that is why there is no precise definition. Still, let us try to give our own interpretation of this term for a clearer understanding.

The concept of information

It is impossible to analyse the types and properties of this phenomenon without at least roughly orienting ourselves in the term. So what is information? It is a complex of phenomena that have received a certain reflection in our psyche and that can be used in our further activity. Information can be used in a huge number of spheres of our life. Moreover, it is used, and new ways of applying information keep appearing. It is hard to name a sphere of human life in which information could not be used.

Why do we need it?

Indeed, what is the point of our obtaining information? It allows us to act and survive in this world. The need to survive manifests itself everywhere a human being goes. Let us look at where we show ourselves and where there is a need to survive or to progress (this is the second task of information):

  1. Basic needs.
  2. Safety.
  3. Communication.
  4. Self-development.
  5. Training.
  6. Education.

Obviously, this is only a small part of the possible spheres of human life in which information can be used. We can obtain it in the most varied ways. First we will talk about where information comes from, and then move on to the different classifications characterizing this phenomenon.

Methods for obtaining information

And now we move directly to the topic "Main types of information". Let us begin considering this phenomenon with a description of possible sources; it turns out there is a truly incredible number of them. In fact, any object, if we can think about and perceive it, can be a carrier of information of varying degrees of importance to us. Here are a few small examples:

  1. The Internet.
  2. Books.
  3. Television.
  4. Stove.
  5. Another man.
  6. Apple.

And much more. Perhaps some of these items surprised you. For example, how can an apple be a source of information? But if you think about it, it really can. The same goes for the stove.

Possible classifications

And now let us consider this concept from a scientific point of view. There are three types of classification; only two of them will be considered in this article, but let us briefly list them in this subsection. What attempts are there to systematize the entire information flow?

  1. By the form of perception.
  2. By public significance.

This classification is so simple that it is taught to children in the second grade in computer science lessons. And now let us turn directly to the topic of this article.

Classification of information on the method of perception

This is how a person absorbs and processes information. This classification has been familiar to us since childhood, since it was taught in several school subjects at once. The following types of information are distinguished by the method of perception:

  1. Visual. This is the data we perceive with our eyes. An example of such information could be the same stove or apple: we assess their appearance. Based on which indicator lights are lit on the stove, we can find out whether it is working, and based on this data decide whether it needs attention. We need this knowledge to survive. Interesting, isn't it?
  2. Auditory. This is the kind of information perceived by our ears. Its examples are very simple: the noise of a car, people's voices, the ringing of bells. Everything we hear is information in auditory form. Its value for human survival is also colossal: it is auditory information that allows us to assess the part of the environment that cannot be seen, touched, or tasted.
  3. Tactile. This type of information is directly related to our skin. The function of this type of perception is also very important for survival: a person can, for example, feel the temperature of an object he touches, and its texture. This has saved us more than once; anyone who has tested the temperature of a stove burner with a hand will agree.
  4. Olfactory. This is the sphere of smells. It is smells that help us notice spoiled food or judge what to expect when entering a room. Smell is a very important characteristic that helps us not only avoid poisoning but also assess the nutritional value of food.
  5. Gustatory. This type of information works in a pair with the olfactory one in evaluating food. According to the logic of our body, everything is simple: if the food is tasty, it can be eaten.

Everything is very simple: these are just the five senses we were told about at school, and here we have recalled them again. Understanding which types of information by the method of perception exist makes it possible to use this knowledge in computer science to ensure the most effective interaction between a person and a computer. And this can significantly improve the quality of our life in the future.

We have already understood what sources of information there are. The types of information by the method of perception have also been filed away in our heads. Now it is time to examine another classification - by the form of presentation. What is it? Essentially, it is a scientific picture of the sources from which information can be obtained - a systematization of what we described earlier. So, what are the types of information by the form of presentation?

  1. Text. This is the same book or a website on the Internet.
  2. Numeric. Numbers can sometimes say much more than words.
  3. Graphic. And a picture? If it is beautiful and holds many mysteries, why can't a simple image on a wall be useful to a person?
  4. Musical. However you look at it, everyone loves good music. Yes, tastes differ, but in any case music is beautiful.
  5. Combined. For example, a music video.

These are the types of information by the form of presentation that we know.

Conclusions

While reading this article you have learned a lot: the types of information by the method of perception and by the form of presentation. Is it a rewarding task to divide the entire information flow into varieties? You decide for yourself; every person should have their own opinion. One thing is known for certain: without information our life would be impossible, and that is obvious to everyone.

Question number 1.

The concept of "information". The word "information" comes from the Latin word informatio, which in translation means information, clarification, familiarization. The concept of "information" is basic in informatics; it cannot be defined through other, "simpler" concepts.

Properties of information.

1. Attribute properties are those properties without which information does not exist.

2. Pragmatic properties are those properties that characterize the usefulness of information for the user, the consumer, and for practice. They are manifested in the process of using information.

3. Dynamic properties are those properties that characterize how information changes over time.

Question # 2.

Classification of information is an integral part of management information support, without which it is impossible to carry out management activities effectively and promptly. Categories of TESI classifiers (classifiers of technical, economic, and social information) and their status (international, all-Russian).

Forms of signal transmission

The channel separation (multiplexing) methods used can be classified as linear or nonlinear (combinational).

In most channel separation schemes, each message source is allocated a special signal called a channel signal. The channel signals, modulated by the messages, are combined, resulting in a group signal. If the combining operation is linear, the resulting signal is called a linear group signal.
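
A toy Python sketch of a linear group signal (the sampling rate, duration, and tone frequencies are arbitrary choices for the example): each channel signal is represented by a sampled tone, and the group signal is their sample-by-sample sum, i.e. a linear combination.

    import math

    SAMPLE_RATE = 8000   # Hz, assumed for this sketch
    DURATION = 0.01      # seconds

    def channel_signal(freq_hz):
        """A toy channel signal: a pure tone sampled at SAMPLE_RATE."""
        n = int(SAMPLE_RATE * DURATION)
        return [math.sin(2 * math.pi * freq_hz * t / SAMPLE_RATE) for t in range(n)]

    # Three channel signals standing in for three independent message sources.
    channels = [channel_signal(f) for f in (400, 1000, 2500)]

    # Linear combination of the channel signals -> linear group signal.
    group_signal = [sum(samples) for samples in zip(*channels)]
    print(len(group_signal), "samples in the group signal")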

To unify multichannel communication systems, the voice-frequency (VF) channel is taken as the main, or standard, channel; it provides message transmission in an effectively transmitted frequency band of 300...3400 Hz, corresponding to the main spectrum of the telephone signal.

Multichannel systems are formed by combining VF channels into groups, usually in multiples of 12 channels. In turn, "secondary multiplexing" of VF channels with telegraph channels and data channels is often used.

Classification of information. Forms of information transmission.

Information can be divided into types according to different criteria:

1. By truth: true and false.

2. By the method of perception: visual - perceived by the organs of sight; auditory - perceived by the organs of hearing; tactile - perceived by tactile receptors; olfactory - perceived by olfactory receptors; gustatory - perceived by taste receptors.

3. By the form of presentation:

Text - transmitted in the form of symbols intended to denote the lexemes of a language.

Numeric - in the form of digits and signs denoting mathematical operations.

Graphic - in the form of images, objects, graphs.

Sound - oral, or in the form of a recording and transmission of language lexemes by auditory means.

4. By purpose:

Mass - contains trivial information and operates with a set of concepts understandable to most of society. Special - contains a specific set of concepts; when it is used, information is transmitted that may not be understood by the bulk of society. Secret - transmitted to a narrow circle of people and through closed (protected) channels.

Personal (private) - a set of information about a person that determines the social position and the types of social interactions within the population.

5. By value:

Relevant - information that is valuable at the present moment.

Reliable - information obtained without distortion.

Clear - information expressed in a language understandable to those for whom it is intended.

Complete - information sufficient for making a correct decision or for understanding.

Useful - the usefulness of information is determined by the subject who received it, depending on the range of possibilities for its use.

Information transfer

The transmission of semantic information is the process of its spatial transfer from a source to a recipient (addressee). Information processes must be used to transmit information over long distances.

To present information, various sign systems are used - sets of pre-agreed semantic symbols: objects, pictures, written or printed words of a natural language.

Semantic information about an object, phenomenon, or process presented with their help is called a message. Obviously, to transmit a message the information must be transferred to some mobile carrier. Carriers can be moved in space using vehicles. This method ensures complete accuracy of information transfer, since the addressee receives the original message, but it requires considerable time for transmission. Since the middle of the 19th century, methods of transmitting information have spread that use a naturally propagating carrier of information - electromagnetic oscillations (electrical oscillations, radio waves, light). Implementing these methods requires: the preliminary transfer of the information contained in the message onto the carrier - encoding; the transmission of the signal thus obtained to the addressee over a special communication channel; and the reverse transformation of the signal code into the message code - decoding. The devices that implement the data transfer process form communication systems. Depending on the method of presenting information, communication systems can be divided into sign-based (telegraph, telefax), sound (telephone), video, and combined systems (television). The most developed communication system of our time is the Internet.
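
The chain "message -> encoding -> communication channel -> decoding -> message" can be sketched in a few lines of Python; the two-bit code table below is invented purely for illustration.

    # A pre-agreed sign system: each message symbol maps to a fixed-length code word.
    CODE = {"A": "00", "B": "01", "C": "10", "D": "11"}
    DECODE = {v: k for k, v in CODE.items()}

    def encode(message):
        """Transfer the message onto the carrier: turn it into a bit string."""
        return "".join(CODE[ch] for ch in message)

    def decode(signal):
        """Reverse transformation of the received signal back into the message."""
        return "".join(DECODE[signal[i:i + 2]] for i in range(0, len(signal), 2))

    signal = encode("BAD")
    print(signal)          # 010011
    print(decode(signal))  # BAD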

Question number 3.

Information resources are, in a broad sense, a set of data organized for the effective acquisition of reliable information.

These are books, articles, patents, dissertations, research and development documentation, technical translations, data on advanced production experience, etc.

Information resources (unlike all other kinds of resources - labour, energy, mineral, etc.) grow faster the more they are consumed.

Resources are available stocks and means that can be used when necessary. At present, scientists and practitioners class information resources among the important strategic resources on which the development of the economy, science, education, culture, etc. depends. The first attempts to define information resources were made in the 1990s, when the so-called "resource approach" to the study of information took shape. Information resources are understood in a narrow and a broad sense: in the narrow sense, only network information resources accessible through computer communication means; in the broad sense, any information fixed on traditional or electronic carriers and suitable for preservation and distribution.

Information resources can be of different kinds - mass media, libraries, the Internet. The following information resources can be successfully sold via the Internet:

News feeds (on-line news). For example, a feed of financial and political news is vital for traders, who make decisions about sales and purchases on stock exchanges;

Subscriptions to electronic copies of periodicals. Some newspapers and magazines produce full electronic copies and provide access to them;

Access to electronic archives and databases containing information on a wide variety of issues;

Analytical reports and research;

Own analytical materials and forecasts.

By access category, information resources can be open (publicly available) or restricted. In turn, documented information with restricted access is divided into state secrets and confidential information.

Classification of information systems:

In a broad sense, an information system is a set of technical, software, and organizational support, together with personnel, intended to provide the right people with the right information in a timely manner ("an information system is a complex that includes computing and communication equipment, software, linguistic means and information resources, as well as system personnel, and that supports a dynamic information model of some part of the real world in order to satisfy the information needs of users").

In a narrow sense, an information system is only a subset of the components of an IS in the broad sense, comprising databases, a database management system (DBMS), and specialized application programs. An IS in the narrow sense is regarded as a software and hardware system intended to automate the purposeful activity of end users, providing, in accordance with the processing logic built into it, the ability to obtain, modify, and store information.

The task of an information system is to satisfy specific information needs within a specific subject area.

4. Because of the unconditional priority of the binary number system for the internal representation of information in a computer, character encoding is based on assigning to each character a certain group of binary digits. For encoding and decoding, uniform codes must be used, i.e. binary groups of equal length.

Let us solve the simplest problem: having a uniform code made of groups of n binary digits, how many different code combinations can be formed? The answer is obvious: K = 2^n. So, for n = 6, K = 64 - clearly not enough; for n = 7, K = 128 - quite sufficient.

However, for encoding several (at least two) natural alphabets (plus all punctuation and other marks), even this is not enough. The minimally sufficient value of n in this case is 8; with 256 combinations of binary symbols, the task can be fully solved. Since 8 binary digits make up 1 byte, one speaks of "byte" coding systems.
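
The arithmetic above is easy to reproduce; the following Python sketch computes the smallest uniform code length n for a given alphabet size K, using the relation K <= 2^n.

    import math

    def code_length(alphabet_size):
        """Smallest n such that 2**n code combinations cover the alphabet."""
        return math.ceil(math.log2(alphabet_size))

    for k in (64, 128, 200, 256):
        n = code_length(k)
        print(f"{k} characters need {n}-bit codes ({2 ** n} combinations)")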

In a communication channel, a message composed of the symbols (letters) of one alphabet can be transformed into a message made of the symbols (letters) of another alphabet. The rule describing the one-to-one correspondence between the letters of the alphabets is called a code. The procedure of converting a message is called transcoding. Such a conversion can be performed at the moment the message enters the communication channel from the source (encoding) and at the moment the message is received by the recipient (decoding).

Question number 5.

A numeral system (notation) is a symbolic method of writing numbers, the representation of numbers by means of written signs.

A numeral system:

§ gives representations of a set of numbers (integers and/or reals);

§ gives each number a unique representation (or at least a standard representation);

§ reflects the algebraic and arithmetic structure of numbers.

§ The most common positional systems today are the decimal, octal, and hexadecimal systems. Each positional system has a certain alphabet of digits and a base.

Number systems

A numeral system is a way of representing and depicting numbers using a strictly limited set of characters, each of which has a definite quantitative value. Numbers in numeral systems are represented by some set of signs - digits - and their quantity depends on the system used.
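
As a quick illustration of positional systems, Python's built-in conversions can show the same number written in binary, octal, and hexadecimal form (the number 202 is chosen arbitrarily):

    n = 202
    print(bin(n), oct(n), hex(n))   # 0b11001010 0o312 0xca

    # Parsing the same value back from strings in the corresponding bases:
    print(int("11001010", 2), int("312", 8), int("ca", 16))   # 202 202 202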

Rules of binary arithmetic: the subtraction operation in binary code is replaced by the operation of addition with a negative number - addition of two positive numbers, of a positive and a negative number, of a negative and a positive number, or of two negative numbers. In general, the addition operation, together with the shift operation, is fundamental, because subtraction, multiplication, and division of binary numbers all reduce to them. Division of binary numbers is performed as in the ordinary decimal number system. At the first step one checks whether the divisor can be subtracted from the dividend (the result must not be negative); if it can, a one is written in the corresponding digit of the quotient, otherwise a zero, and the divisor is shifted one digit to the right relative to the dividend. Then the next digit of the dividend is brought down and the check is repeated. The sign of the result is obtained by addition, as in multiplication.
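
A small Python sketch of the rule "subtraction is replaced by addition with a negative number": the subtrahend is taken in two's-complement form and simply added, with the result truncated to the assumed word length (8 bits here, chosen only for the example).

    BITS = 8  # assumed word length for this sketch

    def twos_complement(x, bits=BITS):
        """Bit pattern representing -x in two's-complement form."""
        return ((~x) + 1) & ((1 << bits) - 1)

    def subtract(a, b, bits=BITS):
        """Compute a - b as a + (-b), keeping only the lowest `bits` bits."""
        return (a + twos_complement(b, bits)) & ((1 << bits) - 1)

    a, b = 0b1101, 0b0110            # 13 - 6
    print(bin(subtract(a, b)))       # 0b111 == 7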

Indicators by computer generation.
Generations: first - 1951-1954; second - 1958-1960; third - 1965-1966; fourth: A - 1976-1979, B - 1985-?; fifth - ?
Processor element base: electronic (vacuum) tubes; transistors; integrated circuits (ICs); large-scale ICs (LSI); very-large-scale ICs (VLSI); VLSI + optoelectronics + cryoelectronics.
RAM element base: cathode-ray tubes; ferrite cores; ferrite cores; LSI; VLSI; VLSI.
Maximum RAM capacity, bytes: 10^2; 10^1; 10^4; 10^5; 10^7; 10^8 (?).
Maximum processor speed, op/s: 10^4; 10^6; 10^7; 10^8; 10^9 + multiprocessing; 10^12 + multiprocessing.
Programming languages: machine code; + assembler; + high-level procedural languages (HLLs); + new procedural HLLs; + non-procedural HLLs; + new non-procedural HLLs.
User interaction: control panel and punched cards; punched cards and punched tape; alphanumeric terminal; monochrome graphic display, keyboard; colour graphic display, keyboard, mouse, etc.; voice communication devices.

In 1642 Pascal invented an eight-digit adding mechanism. In 1820 the Frenchman Charles de Colmar created an arithmometer capable of performing multiplication and division. All the main ideas underlying the operation of computers were set out in 1833 by the English mathematician Charles Babbage. He developed the design of a machine for performing scientific and technical calculations, in which the main devices of the modern computer were anticipated, as well as its tasks. Control was to be carried out by a program. For input and output he proposed using punched cards - sheets of thick paper with information applied by means of holes. In 1888 the American engineer Herman Hollerith constructed the first electromechanical counting machine. This machine, called a tabulator, could read and sort statistical records encoded on punched cards.

In February 1944, at one of the enterprises of IBM, in cooperation with scientists of Harvard University, the Mark 1 was created at the order of the United States Navy. It was a monster weighing about 35 tons. The Mark 1 used mechanical elements to represent numbers and electromechanical ones to control the operation of the machine. The numbers were stored in registers consisting of ten-toothed counting wheels. Each register contained 24 wheels, 23 of which were used to represent the number (i.e. the Mark 1 could handle numbers up to 23 digits long) and one to represent its sign. A register had a mechanism for carrying tens and was therefore used not only to store numbers: a number held in one register could be transferred to another register and added to (or subtracted from) the number held there. In total, the Mark 1 had 72 registers and, in addition, an extra memory of 60 registers formed by mechanical switches. Constants - numbers that did not change during the calculation - were entered into this additional memory manually. Classification of computers

Supercomputers - the most powerful computing systems existing in the corresponding historical period.

Mainframes (large computers) - more accessible than supercomputers.

Minicomputers - used for process control, or in time-sharing mode as the control machine of a small local network.

Microcomputers - among them one distinguishes multi-user machines, equipped with many remote terminals and operating in time-sharing mode, and embedded ones, which can control a machine tool, a subsystem of a car, or another device (including for military purposes), being a small part of it.

Workstation - a term used in several, sometimes inconsistent, senses. Thus, a workstation may be a powerful microcomputer oriented towards specialized work at a high professional level, which cannot be classed as a personal computer if only because of its very high cost.

8) Safety and rules for the operation of PC devices.

1. Persons at least 18 years old who have passed a medical examination, undergone special training and workplace instruction on labour protection, and who have studied the operating manual and the safe methods and techniques of work are allowed to work independently on a PC.

Personnel allowed to adjust and operate PCs must:

· receive instruction on labour protection;

· familiarize themselves with the general rules of operation and the labour safety instructions in the manual;

· become acquainted with the warning notices on the covers, walls, and panels of units and devices;

· become acquainted with the rules for operating electrical equipment.

2. The PC must be connected to a single-phase network with a nominal voltage of 220 (120) V, a frequency of 50 (60) Hz, and a grounded neutral. The grounding contacts of the sockets must be reliably connected to the protective grounding circuit of the room. The room must be fitted with an automatic circuit breaker or a general switch for cutting off power in an emergency.

3. It is forbidden to repair the PC (its units) yourself if this is not part of your duties.

4. When operating a PC, the following requirements and rules must be observed:

· do not connect or disconnect connectors and power cables while the mains voltage is applied;

· do not leave a switched-on PC unattended;

· do not leave the PC switched on during a thunderstorm;

· on finishing work, disconnect the PC from the mains;

· devices must be located at a distance of at least 1 m from heating appliances; workplaces must be at least 1.5 m apart;

· devices must not be exposed to direct sunlight;

· continuous work entering data on a PC should not exceed 4 hours in an 8-hour working day; after every hour of work a break of 5-10 minutes must be taken, and after 2 hours a break of 15 minutes; the room in which the computer equipment is located must be equipped with carbon-dioxide fire extinguishers.

9. A complete set of the software required to organize, say, the automated workstation (AWS) of a design engineer or a scientist (physicist, chemist, biologist, etc.) can exceed in cost (sometimes several times over) the cost of a computer of the corresponding class.

The main kinds of software:

Operating systems are a set of programs providing:

resource management, i.e. the coordinated operation of all computer hardware;

process management, i.e. the execution of programs and their interaction with computer devices and with data;

the user interface, i.e. the user's dialogue with the computer, the execution of certain simple commands - information processing operations.

Programming systems;

Instrumental software, integrated packages;

Application programs.

10. Application programs are designed to support the use of computer equipment in various spheres of human activity. Application developers spend great effort on improving and upgrading popular systems, releasing new versions that support the old ones, maintain continuity, and include a basic minimum (standard) of capabilities.

One possible classification of the software tools that make up application software is shown in Fig. 2.11. Like almost any classification, the one shown in the figure is not the only one possible, and it does not even include all types of application programs. Nevertheless, using such a classification is useful for forming a general idea of application software.

Fig. 2.11. Classification of application software

12. The operating systems of personal computers bear a deep imprint of the file system concept underlying the UNIX operating system. In UNIX, the I/O subsystem unifies the method of access both to files and to peripheral devices. A file here is understood as a data set on a disk, a terminal, or any other device. Thus the file system is a data management system.

The file systems of operating systems create for users a virtual representation of external storage devices, allowing them to work not at the low level but at the high level of data sets and structures. The file system hides from programmers the picture of the actual placement of information in external memory and also provides standard responses to errors. When working with files, the user is given tools for creating new files and operations for reading and writing information.

NTFS is the standard file system for the Microsoft Windows NT family of operating systems. NTFS replaced the FAT file system used in MS-DOS and Microsoft Windows. NTFS supports a metadata system and uses specialized data structures to store file information in order to improve performance, reliability, and efficiency of disk space use. NTFS stores file information in the Master File Table (MFT). NTFS has built-in facilities for differentiating data access for different users. Several versions of NTFS exist: v1.2 is used in Windows NT 3.51 and Windows NT 4.0, v3.0 comes with Windows 2000, and v3.1 with Windows XP.

FAT is a classic file system architecture that is now used mainly for flash drives and memory cards. In the recent past it was used on floppy disks, hard disks, and other media. The size of a single file is limited to 4 GB. It was developed by Bill Gates and Marc McDonald in 1976-1977 and was used as the main file system in the operating systems of the DOS and Windows families. There are three versions of FAT - FAT12, FAT16, and FAT32. They differ in the bit width of the entries in the disk structure, that is, in the number of bits allotted for storing the cluster number. FAT12 is used mainly for floppy disks, FAT16 for small-capacity disks, and FAT32 primarily for flash drives.

Question number 11.

1. An interface is a means of communication between a user and a personal computer, between a user and application programs, and between programs themselves. The interface serves for the convenience of managing the computer's software. Interfaces can be single-tasking and multi-tasking, single-user and multi-user. Interfaces differ from one another in the convenience of managing software, that is, in the way programs are launched. There are universal interfaces allowing all ways of launching programs, such as Windows 3.1 and Windows 95. Example: Windows 95 has all launch methods, including launching programs through the Start button menu.

2. Types of interfaces.

2.1. Command (text) interface.

To control the computer, a command is written (entered from the keyboard) on the command line, for example the name of a program's command file or service words specially reserved by the operating system. The command can be edited if necessary. Then the Enter key is pressed to execute the command. This type of interface is the main one in all varieties of the MS-DOS operating system, such as MS-DOS 6.22. As an additional facility, this type of interface is present in all kinds of software shells (Norton Commander, DOS Navigator, etc.) and in Windows 3.1 and Windows 95/98. The command interface is inconvenient, since one has to remember the names of many commands, and an error of even one character in writing them is unacceptable. It is used rarely, in sessions of direct work with the operating system or in case of failures when other methods are impossible.

2.2. Graphic full screen interface.

As a rule, there is a menu system with prompts at the top of the screen. The menus are often drop-down. To control the computer, the screen cursor or mouse cursor, after a search through the directory tree, is positioned on a program's command file (*.exe, *.com, *.bat), and to launch the program the Enter key is pressed or the right mouse button is clicked. Different files may be highlighted in different colours or have different icons. Directories (folders) are distinguished from files by size or icon.

This interface is the main one for all kinds of software shells. Examples: Norton Commander and Norton-like shells (DOS Navigator, Windows Commander, Disk Commander). Windows 3.1 (File Manager) and Windows 95/98 (My Computer and Explorer) also provide such an interface. This interface is very convenient, especially when working with files, since it ensures a high speed of operations. It allows the user to create a custom menu and to launch applications by file extension, which increases the speed of working with programs.

2.3. Graphic multi-color pictographic interface.

It is a desktop on which pictograms (icons of programs) lie. All operations are performed, as a rule, with the mouse. To control the computer, the mouse cursor is moved to an icon and the program is launched by clicking the left mouse button on it. This is the most convenient and promising interface, especially when working with programs. Examples: the Apple Macintosh computer interface, Windows 3.1, Windows 95/98, OS/2.

Let us give a brief description and designation of some of the most common types of interface between communication equipment and terminal equipment (DCE/DTE).

V.24 is the equivalent of RS-232, the COM port, an asynchronous port (though it can also work in synchronous mode). It is a low-speed port, although motherboards have recently appeared that can operate at speeds of up to 230,400 bit/s. Its characteristics are limited by the presence of only one "ground" wire and by the high levels of logical one and zero: -3 V and +3 V respectively. The standard connector is a DB-25 or DB-9 plug on the terminal device (DTE, the computer) and a DB-25 socket on the communication device (DCE, the modem).

V.35 was originally developed as a standard for high-speed modems, but only the high-speed DCE/DTE interface was taken over from this standard. It has low levels of logical one and zero and uses differential lines.

Question number 13.

13). Files, attributes. Forming file names ...

A file is a named area on an information carrier. A file name can be from 1 to 255 characters long. Punctuation marks (except the hyphen) should not be used in the name. After a period, an extension can be given that indicates the file format (.doc, .exe, etc.) and how the file is launched. The following file characteristics are called attributes: read-only, hidden, system, archive. Files can be text, binary, or graphic. Names for data carriers are assigned, as a rule, by the storage (recording) devices.

In informatics, the following definition is used: a file is a named sequence of bytes.

Working with files is implemented by means of operating systems.

The following entities also have names like files and are processed in a similar way:

§ data areas (not necessarily on disk);

§ devices (both physical, such as ports, and virtual);

§ data streams (named pipes);

§ network resources, sockets;

§ Objects of the operating system.

Files of the first type arose historically first and are the most widespread, so the word "file" often refers to the data area corresponding to the name.

Attributes

In some file systems, such as NTFS, attributes are provided (usually a binary "yes"/"no" value encoded by one bit). In many modern operating systems attributes have practically no effect on access to files; for that purpose, some operating and file systems provide access rights.

Attribute (translation) - meaning - file systems - operating systems
Read only (read-only) - writing to the file is forbidden - DOS, OS/2, Windows
System (system) - a file critical for the operating system - FAT32, FAT12, FAT16, NTFS, HPFS, VFAT - DOS, OS/2, Windows
Hidden (hidden) - the file is hidden from display until the opposite is explicitly requested - FAT32, FAT12, FAT16, NTFS, HPFS, VFAT - DOS, OS/2, Windows
Archive (archive, requiring archiving) - the file has been changed since the last backup or has not been copied by backup programs - FAT32, FAT12, FAT16, NTFS, HPFS, VFAT - DOS, OS/2, Windows
SUID (set user ID) - the program is executed on behalf of the file's owner - ext2 - Unix-like
SGID (set group ID) - the program is executed on behalf of the group (for directories: any file created in a directory with SGID set receives the specified owner group) - ext2 - Unix-like
Sticky bit (sticky bit) - originally told the kernel not to unload the finished program from memory immediately, but only after some time, so as to avoid constantly loading the most frequently used programs from disk; nowadays it is used differently in different OSes - ext2 - Unix-like
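
A hedged Python sketch of inspecting some of these attributes with the standard os and stat modules: the Windows-style attribute word (st_file_attributes) is available only on Windows, while the SUID/SGID/sticky bits come from the Unix mode word; the file inspected is simply the script itself.

    import os
    import stat

    def describe(path):
        """Print a few classic attributes / permission bits of a file, where available."""
        st = os.stat(path)

        # Windows-style attributes (st_file_attributes exists only on Windows).
        win_attrs = getattr(st, "st_file_attributes", None)
        if win_attrs is not None:
            print("read-only:", bool(win_attrs & stat.FILE_ATTRIBUTE_READONLY))
            print("hidden:   ", bool(win_attrs & stat.FILE_ATTRIBUTE_HIDDEN))
            print("archive:  ", bool(win_attrs & stat.FILE_ATTRIBUTE_ARCHIVE))

        # Unix-style mode bits (SUID, SGID, sticky bit).
        mode = st.st_mode
        print("SUID:  ", bool(mode & stat.S_ISUID))
        print("SGID:  ", bool(mode & stat.S_ISGID))
        print("sticky:", bool(mode & stat.S_ISVTX))

    describe(__file__)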

Forming file names.

A file is a named part of a hard disk or a flexible floppy disk. A file is also a logical device, a potential source or receiver of information. The length of each file is limited only by the capacity of the computer's external memory device.
Long file names
The maximum length of a file name can be 255 characters, including spaces. Names may contain spaces, Cyrillic letters, and other characters that were prohibited in DOS; however, the following characters cannot be used: \ / : * ? " < > |
The total length of the path together with the file name should not exceed 260 characters (drive name - 2 characters, root directory separator "\" - 1 character, file name - at least 1 character, separating period - 1 character: 5 + 255 = 260).
When a file is created, it is given two names - a long one and a short one (according to the DOS rules, in the 8.3 format). The short name is formed by the following rules:
1) Spaces and characters prohibited in DOS are removed from the long name. For the 8-character part of the name, the first 6 remaining characters are taken, to which "~" and the file's sequence number (among files with the same initial characters) are appended: XXXXXX~1.
2) For the 3-character extension, the first three characters after the last period in the long name are used.
For example:
Long name - Short name
Microsoft Windows 95.bmp - MICROS~1.BMP
Microsoft Office.tmp - MICROS~2.TMP
Coursework Ivanova I.I..doc - COURSE~1.DOC
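
The shortening rules described above can be sketched roughly in Python; this is only an approximation of the real algorithm (the set of characters stripped out and the handling of name collisions are simplified, and the sequence number is passed in by hand).

    import re

    def short_name(long_name, seq=1):
        """Rough sketch of forming a DOS 8.3 short name from a long name."""
        base, dot, ext = long_name.rpartition(".")
        if not dot:                      # no extension present
            base, ext = long_name, ""
        # 1. Remove spaces and (some) characters not allowed in DOS names.
        cleaned = re.sub(r'[ .+,;=\[\]]', "", base).upper()
        # Take the first 6 remaining characters, append '~' and the sequence number.
        short_base = cleaned[:6] + "~" + str(seq)
        # 2. The extension keeps the first three characters after the last period.
        return short_base + ("." + ext[:3].upper() if ext else "")

    print(short_name("Microsoft Windows 95.bmp", 1))   # MICROS~1.BMP
    print(short_name("Microsoft Office.tmp", 2))       # MICROS~2.TMP
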
The universal Unicode encoding assigns 2 bytes to each character. Windows uses this encoding to store long file names, so a long name may require up to about 500 bytes (255 characters at maximum length). In DOS, in the FAT system, file information (name, size, date and time of creation) is stored in a 32-byte directory entry. In Windows, file information (short name, size, date and time of creation) is stored in an ordinary directory entry, while the long name and the date of last access are stored in directory entries adjacent to the main one and marked in a special way. Thus one file occupies 2 or more directory entries (21 in the case of maximum length: 1 normal (DOS) entry, the others for the long name). Consequences:
1) the size of the directory grows, as do the access time and the probability of fragmentation;
2) the root directory of a floppy disk contains 224 entries, so it can hold only about 10 files with names of maximum length. If all the entries are filled, a message is issued about a lack of memory or a lack of free disk space (even if there is free space on the disk). Therefore, files should be placed in folders and not stored in the root directory (except for service files).

Types of files

Different operating and/or file systems may implement different types of files; moreover, the implementation of a given type may vary.

§ An "ordinary file" is a file that allows reading, writing, and moving within the file.

§ A catalog (directory) is a file containing records of the files included in it. Directories may contain records of other directories, forming a tree structure.

§ A hard link (often called simply a "hardlink") - in the general case, one and the same area of information may have several names. Such names are called hard links (hardlinks). After a hard link is created, it is impossible to say which is the "real" file and which is the hardlink, since the names have equal status. The data area itself exists as long as at least one of its names exists. Hard links are possible only within one physical medium.

Electronic media include media for single or multiple (usually digital) recording by electrical means: CD-ROMs, DVD-ROMs, semiconductor media (flash memory, etc.), floppy disks.

They have a significant advantage over paper media (sheets, newspapers, magazines) in capacity and unit cost. For storing and providing operational (rather than long-term archival) information they have an overwhelming advantage, and there are also considerable possibilities for delivering information in a format convenient for the consumer (formatting, sorting). A disadvantage is the small screen size (or the considerable weight) and the fragility of reading devices, as well as the dependence on power sources.

Currently electronic media are actively displacing paper ones in all spheres of life, which leads to considerable savings of wood. Their drawback is that a reading device matching each type and format of medium is required.

Storage devices


A medium, together with the mechanism for writing information to it and reading information from it (a read/write device), is called a data storage device (also an information accumulator, if it provides for adding incoming information to what is already stored). Such devices can be based on a wide variety of physical recording principles.

In some cases (to guarantee readability, when the medium is rare, etc.), the information carrier is delivered to the consumer together with a storage device for reading it.

Root directory

The directory that directly or indirectly includes all other directories and files of the file system is called the root directory. It is denoted by the symbol "/" (slash).

The path to the file.

In order to find a file in a hierarchical file structure, the path to the file must be specified. The path to a file consists of the logical drive name and the sequence of nested directory names separated by "\", the last of which contains the desired file.

For example, the path to a file in such a directory tree might be written as C:\DOC\LETTERS\letter.doc.

Considering the issues of determining the concept of "information"

1.1. Definition of information
1.2. Quantitative measure of information (- what is the magnitude or amount of information; - Shannon's formula; - bits and bytes; - expert methods of evaluating information and the formation of new information measures)
1.3. Classification of information (- by the method of encoding; - by the sphere of occurrence; - by the method of transmission and perception; - by public purpose)
1.4. Properties of information (- attribute properties of information; - pragmatic properties of information; - dynamic properties of information)
2. What is informatics
2.1. Definition of informatics
2.2. The main components (- theoretical computer science; - semiotics; - cybernetics; - analog and digital information processing)
2.3. Some definitions.

Introduction

The problem of teaching computer science at the initial stage, both in the upper grades of secondary schools and in the first years of higher education, gives rise to numerous disputes. Until recently, one of the main tasks was considered to be acquaintance with computer hardware and the ability to program in one of the simplest languages (as a rule, the "school algorithmic language", BASIC, or Pascal). Such an orientation produced a bias towards programming: the learner came to associate the word "informatics" with the word "programming". In this methodological manual an attempt is made to reveal the concepts of computer science and information with a view to their use by specialists in the humanities. Learners should be able to operate with information of any kind: linguistic, visual, musical. The manual will help them begin to acquire skills in processing and systematizing information and in orienting themselves in information networks.



1.1. Definition of information


The concept of "information" is widely used in the everyday life of a modern person, so everyone has an intuitive idea of what it is. But when science begins to use well-known concepts, it refines them, adapts them to its purposes, and limits the use of the term to the strict framework of its application in a specific scientific field. Thus physics defined the concept of force, and the physical term "force" is not at all what is meant when people speak of willpower or the power of the mind. At the same time, by studying a phenomenon, science expands a person's idea of it. Therefore, for example, for a physicist the concept of force, even limited to its strict physical meaning, is much richer and more informative than for someone not versed in physics. In the same way the concept of information, having become the subject of study of many sciences, is refined and enriched in each of them. The concept of information is one of the principal concepts of modern science and therefore cannot be strictly defined through simpler concepts. One can only point to various aspects of this concept, explain it, and illustrate its meaning. Human activity is connected with the processing and use of materials, energy, and information; accordingly, scientific and technical disciplines have developed that reflect the issues of materials science, power engineering, and computer science. The importance of information in the life of society is growing rapidly, methods of working with information are changing, and the sphere of application of new information technologies is expanding. The complexity of the phenomenon of information, its many facets, the breadth of its sphere of application and its rapid development are reflected in the constant appearance of new interpretations of the concepts of informatics and information. Therefore there are many definitions of the concept of information, from the most general and philosophical - "information is a reflection of the real world" - to the narrow and practical - "information is all the data that is the object of storage, transmission, and transformation".


Let us also give, for comparison, some other definitions and characteristics:


  1. Information is the content of a message or signal; data considered in the process of its transmission or perception that allows knowledge about the object of interest to be expanded.

  2. Information is one of the fundamental essences of the world around us (Acad. Pospelov).

  3. Information is, originally, the data transmitted by some people to other people orally, in writing, or in some other way (Great Soviet Encyclopedia).

  4. Information is a reflected variety, that is, violation of monotony.

  5. Information is one of the main universal properties of matter.

By information one should understand not the objects and processes themselves, but their reflection or mapping in the form of numbers, formulas, descriptions, drawings, symbols and images. Information itself can be assigned to the realm of abstract categories, like mathematical formulas, but working with it is always connected with the use of some material and with the expenditure of energy. Information is stored in the rock paintings of ancient people on stone, in the texts of books on paper, in paintings on canvas, in musical recordings on magnetic tape, in the data of a computer's RAM, in the hereditary DNA code of every living cell, and in a person's memory in his brain. For its recording, storage, processing and distribution, materials are needed (stone, paper, canvas, magnetic tape, electronic data carriers and so on), as well as energy, for example to drive printing presses, to create an artificial climate for storing masterpieces of visual art, to power the electronic circuits of a calculator, or to maintain the operation of transmitters at radio and television stations. Successes in the modern development of information technologies are primarily connected with the creation of new materials underlying the electronic components of computers and communication lines.


1.2. Quantitative measure of information


What is the magnitude, or amount, of information?


We try to characterize each object or phenomenon, in order to compare it with similar ones, by its magnitude. This is not always easy or unambiguous. Even the magnitudes of physical objects can be estimated in different ways: by volume, weight, mass, the number of constituent elements, or cost. That is why even a simple question such as "What is bigger, a kilogram weight or a child's balloon?" can be answered in different ways. The more complex and many-sided a phenomenon is, and the more characteristics it has, the harder it is to find a definition of its magnitude that satisfies everyone who deals with it.

Similarly, the amount of information can be measured in different ways: in numbers of books, pages, characters, metres of film, tons of archival materials or kilobytes of computer RAM, and it can also be evaluated by a person's emotional perception, by the benefit gained from possessing the information, or by the cost of processing and systematizing it. Try to judge where there is more information: in Einstein's formula E = mc², which underlies the physics of the hydrogen bomb, in Aivazovsky's painting "The Ninth Wave", or in the daily television programme "News". Apparently the simplest way to estimate the amount of information is by how much space is needed to store it, having chosen some single method of representing and storing information. With the development of computers, such a single method became the coding of information using the digits 1 and 0. By coding we here mean rewriting information from one method of representation into another. The number of binary positions, each holding only the digit 1 or 0, required to record a message directly is one of the criteria of the amount of information and is called the amount of information in bits. To record one character (a letter, a digit, a space between words, a punctuation mark) a computer most often uses 8 binary positions, and this is called a byte. Thus a phrase such as "Snow White and the Seven Dwarfs", consisting of 21 letters (without the quotation marks) and two spaces between words, will occupy 23 bytes, or 184 bits, of computer memory. It is also possible to record information not directly but in compressed form, i.e. to code it with a smaller number of bits. This is done by special processing that analyses the frequency of appearance, the position and the number of characters in the message. In practice a person also compresses a message on the basis of its meaning: for example, the long message "Thousand nine hundred and ninety-six", 37 bytes, can be compressed to the four characters "1996". For the first time as a scientific concept, information began to be used in library science and the theory of journalism. Later a science arose concerned with the optimal coding of messages and the transmission of information over technical communication channels.
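As an illustration of the byte and bit counting just described, here is a minimal Python sketch. It assumes one character = 1 byte = 8 bits; the exact counts depend on the wording and the encoding, so the printed numbers are illustrative rather than a restatement of the figures in the text.

```python
# Minimal sketch: measuring a message by the space needed to store it,
# assuming a single-byte encoding (1 character = 1 byte = 8 bits).

def message_size(text: str) -> tuple[int, int]:
    """Return (bytes, bits) for a message at 8 bits per character."""
    n_bytes = len(text)          # every letter, digit, space and punctuation mark counts
    return n_bytes, n_bytes * 8

long_form = "Thousand nine hundred and ninety-six"   # wording assumed for illustration
short_form = "1996"                                   # the same meaning, compressed by sense

print(message_size(long_form))    # many bytes -- the exact number depends on the wording
print(message_size(short_form))   # (4, 32): far fewer bits for the same meaning
```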


Shannon's formula


Claude Elwood Shannon proposed the theory of information in 1948, giving a probabilistic, statistical definition of the concept of the amount of information. In Shannon's theory each signal is assigned a probability of its appearance. The less probable the appearance of a signal, the more information it carries for the consumer. Shannon proposed the following formula for measuring the amount of information:



I = −∑ pᵢ log₂ pᵢ,   i = 1, …, N



where I is the amount of information; pᵢ is the probability of the appearance of the i-th signal;


N is the number of possible signals.


The formula shows the dependence of the amount of information on the number of events and on the probabilities of these events. The information is zero if only one event is possible. As the number of events increases, the information increases. I = 1 defines the unit of information, called the "bit". The bit is the basic unit of information.
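A minimal Python sketch of this formula, assuming the probabilities of all N possible signals are given:

```python
import math

def shannon_information(probabilities):
    """Average amount of information (entropy) in bits:
    I = -sum(p_i * log2(p_i)) over all possible signals."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

# Only one possible event (p = 1): the information is zero.
print(shannon_information([1.0]))        # 0.0
# Two equally likely events: exactly 1 bit.
print(shannon_information([0.5, 0.5]))   # 1.0
# Eight equally likely events: 3 bits.
print(shannon_information([1/8] * 8))    # 3.0
```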


Bit and Byte


In technology two states are possible, which are coded as follows: the digit one, "1", means "yes", "on", "current flows"; the digit zero, "0", means "no", "off", "no current". The digits 1 and 0 are the symbols of the simplest sign system, the binary number system. Each sign or symbol of the binary system carries one bit of information. Of particular importance for measuring the volume of symbolic information is a special unit, the byte: 1 byte = 8 bits, which corresponds to eight binary digits. Why exactly 8? It simply happened so historically. The amount of information is also measured in units derived from the byte: kilobytes (KB), megabytes (MB) and gigabytes (GB); only the prefixes "K", "M" and "G" do not mean, as in physics, "kilo", "mega" and "giga", although they are often called that. In physics "kilo" means 1000, while in computer science "K" means 1024, because this number is more natural for computing machines: their arithmetic is based on the number 2, just as human arithmetic is based on the number 10. Therefore the numbers 10, 100, 1000 and so on are convenient for people, while the numbers 2, 4, 8, 16 and, finally, 1024, obtained by multiplying two by itself ten times, are "convenient" for a computer.


1 KB = 1024 bytes = 8192 bits


1 MB = 1024 KB = 2²⁰ bytes = 2²³ bits


1 GB = 1024 MB = 2²⁰ KB = 2³⁰ bytes = 2³³ bits.
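The same conversions can be checked with a few lines of Python, assuming the "1 K = 1024" convention stated above:

```python
# Unit conversions exactly as listed above: "K", "M", "G" mean powers of 1024, not 1000.
BYTE = 8                 # bits
KB = 1024 * BYTE         # 8192 bits
MB = 1024 * KB           # 2**23 bits
GB = 1024 * MB           # 2**33 bits

print(KB, MB, GB)                          # 8192 8388608 8589934592
print(2**20 * 8 == MB, 2**30 * 8 == GB)    # True True
```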


The amount of information introduced in this way does not coincide with the everyday notion of the amount of information as the importance of the information received, but it is successfully used in computing technology and communications.


Expert methods of assessing information and the formation of new measures of information


Since information has a variety of characteristics whose practical importance differs in different applications of informatics, there cannot be a single measure of the amount of information convenient in all cases. For example, one measure of the amount of information could be the complexity of computing it with some universal algorithm. It should be expected that the further penetration of informatics into those areas of human activity where it is still weak, including art, will lead to the development of new scientific definitions of the amount of information. Thus the perception of a work of art that we like brings a feeling of being filled with new, previously unknown information. It is not for nothing that the effect produced on a person by a great musical work, by an artist's canvas, or sometimes simply by contemplating nature - picturesque mountains, a deep sky - is described by the word "revelation". Characteristics of the amount of information describing its aesthetic and artistic value may therefore appear. Until simple, mathematically expressed definitions of the amount of this or that property of information are created, so-called expert assessments, i.e. the conclusions of specialists in a given field, are used to evaluate its magnitude. They give their estimates on the basis of personal, often very subjective, experience. Professional communication between experts and creative discussion of the subject of analysis lead to the development of more or less generally accepted evaluation criteria, which may ultimately become the basis for creating a formal measure, as unambiguous as the international standard metre. Examples of the formation of future measures of information, in its various manifestations, can be the following expert assessments and other indicators already in use:



points given by the judges of a competition for the artistic merit of a performance, for example in figure skating;

film reviews in the press with scores reflecting the reviewer's degree of interest in the film;

the prices of paintings;

evaluation of a scientist's work by the number of published articles;

evaluation of a scientist's work by the number of references to it in the works of other scientists (the citation index);

the popularity indices of musical works and their performers published in the press;

student grades given by college teachers.



In addition to measuring the amount of memory in bits and bytes, technology also uses other units that characterize work with information:



the number of operations per second, which characterizes the speed at which a computing machine processes information;

the number of bytes or bits per second, which characterizes the speed of information transfer;

the number of characters per second, which characterizes the speed of reading, of typing texts, or of a printing device.



1.3. Classification of information


Information can be conditionally divided into various types on the basis of this or that property or characteristic, for example by the method of coding, the sphere of origin, the method of transmission and perception, the public purpose, and so on.


According to the encoding method


By the method of coding the signal, information can be divided into analog and digital. In an analog signal, information about the value of the original parameter being reported is expressed as the value of another parameter, which is the physical basis of the signal, its physical carrier. For example, the angles of the clock hands are the basis of an analog display of time. The height of the mercury column in a thermometer is the parameter that gives analog information about the temperature: the longer the column, the higher the temperature. To represent information, an analog signal uses all intermediate values of the parameter from the minimum to the maximum, i.e. a theoretically infinite number of them. A digital signal uses as the physical basis for recording and transmitting information only a minimum number of such values, most often just two. For example, the recording of information in a computer is based on two states of the physical carrier of the signal, electrical voltage: one state - voltage present, conventionally denoted by one (1); the other - no voltage, conventionally denoted by zero (0). Therefore, to transmit information about the value of the original parameter, the data have to be represented as a combination of zeros and ones, i.e. in digital form. Interestingly, at one time computing machines based on ternary arithmetic were developed and used, since three states of electrical voltage arise naturally: 1) the voltage is negative, 2) the voltage is zero, 3) the voltage is positive. Scientific works describing such machines and the advantages of ternary arithmetic still appear, but for now the manufacturers of binary machines have won the competitive struggle. Will it always be so? Let us give some examples of household digital devices. An electronic clock with a digital display gives digital information about the time. A calculator performs calculations with digital data. A mechanical lock with a digital code can also be called a primitive digital device.
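To illustrate the difference between binary and ternary (and decimal) recording of the same value, here is a small Python sketch; the number 1996 is an arbitrary example, and the function only illustrates positional notation, not how any particular machine stores numbers.

```python
def to_base(n: int, base: int) -> str:
    """Write a non-negative integer using the digits of the given base."""
    if n == 0:
        return "0"
    digits = []
    while n > 0:
        digits.append(str(n % base))
        n //= base
    return "".join(reversed(digits))

print(to_base(1996, 2))    # '11111001100' -- binary: two signal states
print(to_base(1996, 3))    # '2202221'     -- ternary: three signal states (-, 0, +)
print(to_base(1996, 10))   # '1996'        -- decimal: convenient for people
```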


By the sphere of origin


By the sphere of origin, information can be classified as follows. Information arising in inanimate nature is called elementary, in the world of animals and plants biological, and in human society social. In nature, animate and inanimate, information is carried by colour, light, shadow, sounds and smells. As a result of the combination of colour, light and shadow, of sounds and smells, aesthetic information arises. Along with natural aesthetic information, another variety of information arose as the result of people's creative activity - works of art. Besides aesthetic information, semantic information is created in human society as the result of knowledge of the laws of nature, society and thinking. The division of information into aesthetic and semantic is obviously very conditional; one simply has to understand that in one piece of information the semantic part may prevail and in another the aesthetic.


By the method of transmission and perception


By the method of transmission and perception it is customary to classify information as follows. Information transmitted in the form of visible images and symbols is called visual; transmitted by sounds, auditory; by sensations, tactile; by smells, olfactory. Information perceived by office equipment and computers is called machine-oriented information. The quantity of machine-oriented information is constantly increasing owing to the continuously growing use of new information technologies in various spheres of human life.


By public purpose


By public purpose, information can be divided into mass, special and personal. Mass information is divided in turn into socio-political, everyday and popular-science. Special information is divided into industrial, technical, managerial and scientific. Technical information has the following gradations:

machine-tool,

machine-building,

instrument-making, ...

Scientific information is divided into biological, mathematical, physical, ...


1.4. Properties of information


Information has the following groups of properties:

attributive;

pragmatic;

dynamic.

Attributive properties are those without which information does not exist. Pragmatic properties characterize the degree of usefulness of information for the user, the consumer and for practice. Dynamic properties characterize how information changes over time.


Attributive properties of information


Inseparability of information from its physical carrier, and the linguistic nature of information


The most important attributive properties of information are its inseparability from a physical carrier and its linguistic nature. One of the most important directions of informatics as a science is the study of the features of various carriers and languages of information and the development of new, more advanced and up-to-date ones. It should be noted that although information is inseparable from a physical carrier and has a linguistic nature, it is not rigidly tied either to a specific language or to a specific carrier.


Discreteness


The next attributive property of information to which attention should be paid is the property of discreteness. The data and knowledge contained in information are discrete, i.e. they characterize separate factual data, regularities and properties of the objects under study, which are distributed in the form of various messages consisting of lines, composite colours, letters, digits, symbols and signs.


Continuity


Information has the property of merging with what has already been recorded and accumulated earlier, thereby promoting progressive development and accumulation. This is confirmed by another attributive property of information, continuity.


Pragmatic properties of information


Meaning and novelty


Pragmatic properties of information are manifested in the process of using information. First of all, this category includes the presence of meaning and of novelty of information, which characterizes the movement of information in social communications and singles out that part of it which is new for the consumer.


Utility


Useful information is information that reduces the uncertainty of knowledge about an object. Disinformation is regarded as useful information with a negative value. The term "usefulness of information" is also used to describe the influence that incoming information has on a person's internal state, mood, well-being and, finally, health. In this sense useful, or positive, information is that which is received gladly and contributes to an improvement in well-being, while negative information has a depressing effect on a person's psyche and well-being and can lead to a deterioration of health, for example to a heart attack.


Value


The next pragmatic property of information is its value. It should be noted that the value of information differs for different consumers and users.


Cumulativeness


The property of cumulativeness characterizes the accumulation and storage of information.


Dynamic properties of information


Dynamic properties of information, as follows from the name itself, characterize the development of information in time.


Growth of information


First of all it is necessary to note the property of growth of information. The movement of information in information communications and its constant distribution and growth determine the property of multiple dissemination, or repeatability. Although information depends on a specific language and a specific carrier, it is not rigidly tied to any particular language or any particular carrier. Thanks to this, information can be received and used by several consumers at once. This is the property of multiple use and a manifestation of the dispersion of information across various sources.


Aging


Among the dynamic properties one must also note the property of the aging of information.


2. What is informatics


2.1. Definition of informatics


Not very long ago, informatics was understood as the scientific discipline studying the structure and general properties of scientific information, as well as the regularities of all processes of scientific communication - from the informal exchange of scientific information in the direct oral and written communication of scientists and specialists to the formal processes of exchange through scientific literature. This understanding was close to terms such as "library science" and "book science"; the concept of "informatics" sometimes served as a synonym for the term "documentation science". The rapid development of computing technology changed the concept of "informatics", giving it a meaning much more oriented towards computing machinery. Therefore different interpretations of this term still exist. In America the term "computer science", the science of computers, is used as an analogue of the European understanding of informatics. Close to the concept of informatics is the term "systems engineering", for which dictionaries also give the translation "computer science". Informatics is the science that studies all aspects of obtaining, storing, transforming, transmitting and using information.


2.2. Main components


The components of this science are theoretical informatics, semiotics and cybernetics. In practice, informatics is realized in programming and computing technology.


Theoretical informatics


Theoretical informatics is the foundation on which informatics as a whole is built. This discipline is concerned with constructing models and with the discrete sets that describe these models. An integral part of theoretical informatics is logic. Logic is the set of rules to which the process of thinking is subject. Mathematical logic studies the logical connections and relations underlying deductive (logical) inference.


Semiotics


Semiotics studies sign systems. The signs that make them up may have the most diverse nature; what matters is only the three components connected by conventional relations that can be distinguished in any sign system: syntactics (the plane of expression), semantics (the plane of meaning) and pragmatics (the plane of use). Semiotics makes it possible to establish analogies in the functioning of various systems of both natural and artificial origin. Its results are used in computational linguistics, artificial intelligence, psychology and other sciences.


Cybernetics


Cybernetics arose in the late 1940s, when N. Wiener put forward the idea that the rules of control in living, non-living and artificial systems have many common features. The relevance of N. Wiener's conclusions was supported by the appearance of the first computers. Today cybernetics can be considered as the direction of informatics concerned with the creation and use of automated control systems of varying degrees of complexity.


Analog and digital information processing


Informatics, as the science of processing information, is realized in the analog and digital processing of information. Analog processing of information includes direct actions with colour, light, form, line and so on. Looking at the world through rose-coloured glasses (literally) is analog processing of visual information. Analog computing devices are possible; they were widely used earlier in engineering and automation. The simplest example of such a device is the slide rule. It was once taught in schools for performing multiplication and division and was always at hand for any engineer; now it has been replaced by digital devices - calculators. By digital processing of information one usually understands operations on information by means of digital computation.

Currently, traditional analog methods of recording sound and television information are being replaced by digital methods, although the latter are not yet universal. At the same time we increasingly use digital devices to control traditional "analog" devices: for example, the signals sent from the remote control of a television set or video recorder are digital, and the scales in shops that display the weight and the price of a purchase are also digital. Natural ways of representing and processing information in nature are analog. An animal's footprint is an analog signal about the animal. A cry is an analog way of conveying an inner state: the louder, the stronger the feeling. Physical processes perform analog processing of signals in the sense organs: focusing an image on the retina of the eye, spectral analysis of sounds in the cochlea.

Analog signal-processing systems are faster than digital ones, but they perform narrow functions and are poorly adaptable to new operations. That is why digital computers have developed so rapidly: they are universal and can process not only numerical but any other information - text, graphics, sound. Digital computers can receive information from analog sources with the help of special devices, analog-to-digital converters, and after processing, the information can be translated back into analog form by digital-to-analog converters. That is why modern digital computers can speak, synthesize music, draw, and control a machine tool or a mechanism. Analog information-processing systems, though less noticeable, also continue to develop, and some analog devices have not yet found, and apparently in the near future will not find, a worthy digital replacement; such a device, for example, is a camera lens. It is likely that the future of technology lies with so-called analog-digital devices, which use the advantages of both. The sense organs, the nervous system and thinking are also built by nature on both an analog and a digital basis.

When designing a man-machine system it is important to take into account the peculiarities of human perception of a particular kind of information. When reading texts, for example, a person perceives 16 bits per second while holding 160 bits in attention at the same time. A convenient layout of instruments in an aircraft cockpit or on the control panel of a complex system greatly facilitates a person's work, increases the depth of his awareness of the current state of the controlled object, and affects the speed and effectiveness of the decisions taken.
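As a rough illustration of what an analog-to-digital and a digital-to-analog converter do, here is a toy Python sketch; the 3-bit resolution, the voltage range and the sine-wave input are arbitrary assumptions for the example, not a description of any real device.

```python
import math

def adc(signal, n_bits=3, v_min=-1.0, v_max=1.0):
    """Toy analog-to-digital converter: map each continuous sample
    onto one of 2**n_bits equally spaced levels."""
    step = (v_max - v_min) / (2 ** n_bits - 1)
    return [round((s - v_min) / step) for s in signal]

def dac(codes, n_bits=3, v_min=-1.0, v_max=1.0):
    """Toy digital-to-analog converter: turn level numbers back into voltages."""
    step = (v_max - v_min) / (2 ** n_bits - 1)
    return [v_min + c * step for c in codes]

analog = [math.sin(2 * math.pi * t / 16) for t in range(16)]  # "continuous" samples
digital = adc(analog)       # small integers: only 8 possible values remain
restored = dac(digital)     # an approximation of the original, with quantization error
print(digital)
print([round(x, 2) for x in restored])
```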


2.3. Some definitions.


Science is the social sphere of creating and using information as knowledge of the objective world by man.


Art is social activity in creating and using sources of information that affect, first of all, the feelings and, secondly, the consciousness.


Creativity is the production of new information by a person. Pedagogy is the organization of the information process connected with the maximum assimilation of information.


Training is the transfer of information for the acquisition of knowledge and skills.

Literature

1. Informatics. An Encyclopedic Dictionary for Beginners / Ed. D.A. Pospelov. - M.: Pedagogika-Press, 1994.


2. Ya.L. Shraiberg, M.V. Balov. A Reference Guide to the Basics of Informatics and Computing Technology. - M.: Finance and Statistics, 1995.


3. Informatics and Culture. A collection of scientific papers. - Novosibirsk: Nauka, Siberian Branch, 1990.


4. D.I. Blyumenau. Information and Information Service. - Leningrad: Nauka, 1989.


5. Information Technologies: Questions of Development and Application. - Kiev: Naukova Dumka, 1988.


6. The Concept of Informatization of Education // Informatics and Education. - 1990. - No. 1.


7. A Terminological Dictionary on the Basics of Informatics and Computing Technology / A.P. Ershov et al.; Ed. A.P. Ershov, N.M. Shansky. - M.: Prosveshchenie, 1991. - 159 p.


8. Zavarykin V.M. et al. Fundamentals of Informatics and Computing Technology: a textbook for students of pedagogical institutes in physics and mathematics specialities. - M.: Prosveshchenie, 1989. - 207 p.


9. Encyclopedia of Cybernetics. - Main editorial office of the Ukrainian Soviet Encyclopedia, Kiev, 1974.


Goncharenko Elena Alexandrovna
Znamensky Vasily Serafimovich


NGO CBD
Nalchik College of Design
Nalchik, 1996.


Information is data about something.

The concept and types of information; its transmission, processing, search and storage

Information: a definition

Information is any intelligence received, transmitted and stored by various sources. It is the whole set of data about the world around us and about all sorts of processes occurring in it that can be perceived by living organisms, electronic machines and other information systems.

Information is significant data about something, where the form of their presentation is also information; that is, it has a formatting function in accordance with its own nature.

Information is everything by which our knowledge and assumptions can be supplemented.

Information is data about something, regardless of the form of its presentation.

Information is the mental product of any psychophysical organism, produced by it when using any means called a means of information.

Information is data perceived by a human and (or) by special devices as a reflection of the facts of the material or spiritual world in the process of communication.

Information is data organized in a way that makes sense to the person dealing with them.

Information is the meaning a person attaches to data on the basis of the known conventions used for their representation.

Information is intelligence, explanation, presentation.

Information is any data or intelligence that interests anyone.

Information is data about the objects and phenomena of the environment, their parameters, properties and states, which information systems (living organisms, control machines, etc.) perceive in the process of life and work.

The same informational message (an article in a newspaper, an announcement, a letter, a telegram, a certificate, a story, a drawing, a radio broadcast, etc.) may contain different amounts of information for different people, depending on their prior knowledge, on their level of understanding of the message and on their interest in it.

When people speak of automated work with information by means of technical devices, they are interested not in the content of the message but in how many characters the message contains.


In relation to computer data processing, data is understood as a certain sequence of symbolic designations (letters, digits, coded graphic images and sounds, etc.) carrying a semantic load and presented in a form understandable to the computer. Each new character in such a sequence increases the information volume of the message.


The concept of information

In modern science two types of information are considered:

Objective (primary) information is the property of material objects and phenomena (processes) to generate a diversity of states, which through interactions (fundamental interactions) are transmitted to other objects and imprinted in their structure.

Subjective (semantic, meaningful, secondary) information is the semantic content of objective information about the objects and processes of the material world, formed by human consciousness with the help of semantic images (words, images and sensations) and recorded on some material carrier.

In the everyday sense, information is data about the surrounding world and the processes occurring in it, perceived by a person or by a special device.

There is still no single definition of information as a scientific term; each field of knowledge describes this concept through its own specific set of features. According to the concept of K. Shannon, information is removed uncertainty, i.e. data that must remove, to one degree or another, the uncertainty existing in the acquirer before they were received, and broaden his understanding of the object with useful data.

From the point of view of Gregory Bateson, the elementary unit of information is "a difference that makes a difference", an effective difference for some larger perceiving system. Differences that are not perceived he calls "potential", and those perceived "effective": "Information consists of differences that are not indifferent"; "any perception of information is necessarily the receiving of data about a difference." From the point of view of informatics, information possesses a number of fundamental properties: novelty, relevance, reliability, objectivity, completeness, value, etc. The analysis of information is, first of all, the concern of the science of logic. The word "information" comes from the Latin word informatio, which in translation means presentation, clarification, acquaintance. The concept of information was already considered by the ancient philosophers.


Before the industrial revolution, defining the essence of information remained mainly the prerogative of philosophers. Later, questions of the theory of information came to be considered by cybernetics, a science that was new at the time.

Sometimes, in order to comprehend the essence of a concept, it is useful to analyse the meaning of the word by which this concept is designated. Clarifying the inner form of the word and studying the history of its use can shed unexpected light on its meaning, eclipsed by the usual "technological" use of the word and by modern connotations.

The word "information" entered the Russian language in the Petrine era. It was first recorded in the "Spiritual Regulation" of 1721 in the meaning "idea, notion of something". (In European languages it became established earlier, around the 14th century.)


Proceeding from this etymology, information can be considered to be any significant change of form or, in other words, any materially recorded traces formed by the interaction of objects or forces and amenable to comprehension. Information is thus a transformed form of energy. The carrier of information is a sign, and its mode of existence is interpretation: identifying the meaning of a sign or a sequence of signs.

Through a sign one may reconstruct either the event that caused its appearance (in the case of "natural" and involuntary signs such as traces, evidence, and so on) or a message (in the case of conventional signs inherent in the sphere of language). It is the second kind of signs that makes up the body of human culture, which, according to one definition, is "the totality of non-hereditarily transmitted information".


Messages may contain information about facts or an interpretation of facts (from Lat. interpretatio, interpretation, translation).

A living creature receives information with the help of the sense organs, as well as through reflection or intuition. The exchange of information between subjects is communication (from Lat. communicatio, message, transmission, derived in turn from Lat. communico, to make common, to report, to talk, to connect).

From a practical point of view, information is always presented in the form of a message. An informational message is associated with a source of the message, a recipient and a communication channel.

Returning to the Latin etymology of the word "information", let us try to answer the question of what exactly is given form here.

It is obvious that, first, it is some meaning which, being initially formless and unexpressed, exists only potentially and must be "built up" in order to become perceivable and transmittable.

Secondly, it is the human mind, which is brought up to think structurally and clearly. Thirdly, it is society, which, precisely because its members share these meanings and use them jointly, acquires unity and functionality.


Information as expressed, reasonable meaning is knowledge that can be stored, transmitted and serve as the basis for generating other knowledge. The forms of preservation of knowledge (historical memory) are diverse: from myths, chronicles and pyramids to libraries, museums and computer databases.

Information is data about the surrounding world and the processes occurring in it, perceived by living organisms, control machines and other information systems.

The word "information" Latin. For a long life, its importance has undergone evolution, it expanding, then extremely narrowing its borders. At first, under the word "information" implied: "View", "concept", then "information", "transfer of messages".

In recent years scientists have decided that the usual (generally accepted) meaning of the word "information" is too elastic and vague, and have given it a narrower meaning: "a measure of the certainty in a message".


The theory of information was brought into being by the needs of practice. Its appearance is associated with Claude Shannon's work "A Mathematical Theory of Communication", published in 1948. The foundations of the theory of information rest on results obtained by many scientists. By the second half of the 20th century the globe was buzzing with transmitted information running along telephone and telegraph cables and radio channels. Later, electronic computing machines appeared - processors of information. At that time the main task of the theory of information was, first of all, to increase the efficiency of communication systems. The difficulty in designing and operating means, systems and channels of communication is that it is not enough for the designer and engineer to solve the problem from the physical and energy standpoints: from these points of view the system may be most perfect and economical. It is also important, when creating transmission systems, to pay attention to how much information will pass through them. After all, information can be measured quantitatively, calculated. And such calculations proceed in the most usual way: one abstracts from the meaning of the message, just as one abandons concreteness in the arithmetic operations familiar to us all (passing from the addition of two apples and three apples to the addition of numbers in general: 2 + 3).

The scientists stated that they had "completely ignored the human assessment of information". To a sequential series of 100 letters, for example, they assign a certain amount of information without paying attention to whether that information makes sense and whether, in turn, that sense has any practical meaning. The quantitative approach is the most developed branch of the theory of information. According to this definition, a set of 100 letters - a phrase of 100 letters from a newspaper, from a play by Shakespeare or from Einstein's theorem - contains exactly the same amount of information.

Such a definition of the amount of information is extremely useful and practical. It corresponds exactly to the task of the communications engineer, who must transmit all the information contained in a submitted telegram regardless of the value of this information for the addressee. The communication channel is soulless. For the transmitting system one thing is important: to transmit the required amount of information in a certain time. How, then, is the amount of information in a particular message calculated?


The assessment of the amount of information is based on the laws of probability theory; more precisely, it is determined through the probabilities of events. This is understandable: a message has value, it carries information, only when we learn from it about the outcome of an event that has a random character, when it is to some degree unexpected. After all, a message about what is already known contains no information. That is, if someone calls you on the telephone and says, "In the daytime it is light, and at night it is dark," such a message will surprise you only by the absurdity of stating the obvious and well known, not by the news it contains. Another matter is, for example, the result of a horse race. Who will come in first? The outcome is difficult to predict. The more random outcomes the event of interest to us has, the more valuable the message about its result, the more information. A message about an event that has only two equally possible outcomes contains one unit of information, called a bit. The choice of the unit of information is not accidental: it is connected with the most common binary method of coding information during transmission and processing. Let us try, at least in the most simplified form, to present the general principle of the quantitative evaluation of information, which is the cornerstone of the whole theory of information.

We already know that the amount of information depends on the probabilities of particular outcomes of an event. If an event, as scientists say, has two equally probable outcomes, this means that the probability of each outcome is 1/2. Such is the probability of getting "heads" or "tails" when tossing a coin. If an event has three equally probable outcomes, the probability of each is 1/3. Notice that the sum of the probabilities of all outcomes is always equal to one, because one of all the possible outcomes will certainly occur. An event, as you understand, can also have outcomes that are not equally probable. Thus, in a football match between a strong and a weak team, the probability of the strong team winning is high, for example 4/5; the probability of a draw is much smaller, say 3/20; and the probability of the strong team losing is quite small.
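The information carried by a message about a single outcome of probability p can be expressed as −log₂ p bits. Here is a small Python sketch using the probabilities mentioned above; the value 1/20 for the upset is implied by the other two figures, since the three probabilities must sum to one.

```python
import math

def self_information(p: float) -> float:
    """Information (in bits) carried by a message about an outcome of probability p."""
    return -math.log2(p)

print(self_information(1/2))    # 1.0 bit   -- coin toss, two equally likely outcomes
print(self_information(1/3))    # ~1.585    -- one of three equally likely outcomes
print(self_information(4/5))    # ~0.32     -- the expected win of the strong team
print(self_information(1/20))   # ~4.32     -- the improbable, "surprising" outcome
```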

It turns out that the amount of information is a measure of the reduction of the uncertainty of some situation. Different amounts of information are transmitted over communication channels, and the amount of information passing through a channel cannot exceed its capacity, which is determined by how much information passes through it per unit of time. One of the heroes of Jules Verne's novel "The Mysterious Island", the journalist Gideon Spilett, dictated a chapter from the Bible over the telephone so that his competitors could not use the telephone line. In this case the channel was fully loaded, while the amount of information transmitted was equal to zero, because information already known to the subscriber was being transmitted. The channel was working idly, passing a strictly defined number of pulses without loading them with anything. Meanwhile, the more information each of a given number of pulses carries, the more fully the channel capacity is used. Therefore it is necessary to code information sensibly, to find an economical, sparing language for transmitting messages.

The information "sieves" the most thorough way. In telegraph, frequent letters, combinations of letters, even whole phrases depict a shorter set of zeros and units, and those that are less common - longer. In the case when it reduces the length of the code word for frequent characters and increases for rarely encountered, they are talking about effective encoding of information. But in practice, it often happens that the code that occurred as a result of the most careful "sifting" code, the code is convenient and economical, can distort the message due to interference, which is always, unfortunately, there are in channels of communication: sound distortion in the phone, atmospheric interference in, distortion or dimming images in television, errors when transferred to telegraph. These interference, or, as experts are called, noises, fell on the information. And this is the most incredible and, naturally, unpleasant surprises.

Therefore, to increase the reliability of transmitting and processing information, one has to introduce extra symbols, a kind of protection against distortion. These extra symbols do not carry the actual content of the message; they are redundant. From the point of view of the theory of information, everything that makes a language colourful, flexible, rich in shades, many-sided and many-valued is redundancy. How redundant, from such positions, is Tatyana's letter to Onegin! How many informational excesses it contains for the brief and perfectly understandable message "I love you"! And how informationally precise are the hand-drawn signs, understandable to everyone who enters the metro today, where instead of words and phrases laconic symbolic signs indicate "Entrance" and "Exit".
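The idea of efficient coding, in which frequent symbols receive short code words and rare ones long code words, can be sketched with a simple Huffman-style construction in Python. This is only an illustration of the principle, not the code actually used in telegraphy, and it assumes the message contains at least two distinct characters.

```python
import heapq
from collections import Counter

def huffman_codes(text: str) -> dict[str, str]:
    """Build variable-length codes: frequent characters get shorter bit strings."""
    # Each heap entry: [total frequency, unique tiebreaker, {char: code-so-far}]
    heap = [[count, i, {ch: ""}] for i, (ch, count) in enumerate(Counter(text).items())]
    heapq.heapify(heap)
    i = len(heap)
    while len(heap) > 1:
        lo = heapq.heappop(heap)            # the two rarest groups so far
        hi = heapq.heappop(heap)
        merged = {ch: "0" + code for ch, code in lo[2].items()}
        merged.update({ch: "1" + code for ch, code in hi[2].items()})
        heapq.heappush(heap, [lo[0] + hi[0], i, merged])
        i += 1
    return heap[0][2]

message = "snow white and the seven dwarfs"
codes = huffman_codes(message)
compressed_bits = sum(len(codes[ch]) for ch in message)
print(compressed_bits, "bits instead of", len(message) * 8)   # fewer than 8 bits per character
```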

In this connection it is useful to recall the anecdote told by the famous American scientist Benjamin Franklin about a hatter who invited his friends to discuss a draft of his signboard. He intended to draw a hat on the signboard and write: "John Thompson, hatter, makes and sells hats for cash." One of his friends remarked that the words "for cash" were superfluous: such a reminder would be offensive to the buyer. Another also found the word "sells" superfluous, since it goes without saying that a hatter sells hats and does not give them away. A third thought that the words "hatter" and "makes hats" were an unnecessary tautology, and the latter words were dropped. A fourth suggested dropping the word "hatter" as well: the drawn hat says clearly enough who John Thompson is. Finally, a fifth insisted that it was completely indifferent to the buyer whether the hatter was called John Thompson or something else, and proposed doing without this indication altogether. Thus, in the end, nothing remained on the signboard except the hat. Of course, if people used only such codes, without redundancy in their messages, all "information forms" - books, reports, articles - would be extremely brief. But they would lose in intelligibility and beauty.

Information can be divided into types according to various criteria. By truth: true and false;

by way of perception:

Visual - perceived by the organs of sight;

Auditory - perceived by the organs of hearing;

Tactile - perceived by tactile receptors;

Olfactory - perceived by olfactory receptors;

Gustatory - perceived by taste receptors.

by form of presentation:

Text - transmitted in the form of symbols intended to denote the lexemes of a language;

Numerical - in the form of digits and signs denoting mathematical operations;

Graphic - in the form of images, objects, graphs;

Sound - oral or recorded, the transmission of language lexemes by auditory means.

by purpose:

Mass - contains trivial information and operates with a set of concepts understandable to most of society;

Special - contains a specific set of concepts; when it is used, information is transmitted that may not be understood by the bulk of society but is necessary and understandable within the narrow social group where this information is used;

Secret - transmitted to a narrow circle of persons and through closed (protected) channels;

Personal (private) - a set of data about a person that determines the social status and the types of social interactions within the population.

by meaning:

Actual - information valuable at the moment;

Reliable - information obtained without distortion;

Understandable - information expressed in a language understandable to those for whom it is intended;

Full - information sufficient to make the right solution or understanding;

Useful - the usefulness of information is determined by the subject that received information, depending on the amount of possibilities of its use.

The value of information in various areas of knowledge

In the theory of information nowadays many systems, methods, approaches and ideas are being developed. However, scientists believe that new directions will be added to the existing trends in the theory of information and new ideas will appear. As proof of the correctness of their assumptions they point to the "living", developing character of the science and note that the theory of information is penetrating surprisingly quickly and firmly into the most diverse areas of human knowledge. The theory of information has penetrated physics, chemistry, biology, medicine, philosophy, linguistics, pedagogy, economics, logic, the technical sciences and aesthetics. According to the specialists themselves, the study of information, which arose from the needs of the theory of communication and cybernetics, has outgrown their framework. And now, perhaps, we have the right to speak of information as a scientific concept that places in the hands of researchers a theoretical method with which one can penetrate many sciences about animate and inanimate nature and about society, and which allows one not only to look at all problems from a new angle but also to see what has not yet been seen. That is why the term "information" has become so widespread in our time, becoming part of such concepts as information system, information culture and even information ethics.

Many scientific disciplines use the theory of information to emphasize a new direction in the old sciences. Thus arose, for example, information geography, information economics and information law. But the term "information" acquired especially great importance in connection with the development of the latest computer technology, the automation of mental labour, the development of new means of communication and information processing, and especially with the emergence of informatics. One of the most important tasks of the theory of information is the study of the nature and properties of information and the creation of methods for processing it, in particular the transformation of the most diverse modern information into computer programs, with the help of which mental work is automated - a peculiar strengthening of intelligence and hence a development of the intellectual resources of society.

The word "information" comes from the Latin word Informatio, which in translation means minimize, clarification, familiarization. The concept of "information" is the basic informatics, however, it is impossible to give it a definition through the other, more "simple" concepts. The influence of "information" is used in various sciences, while in each science the concept of "information" is associated with various concepts of concepts. Information in biology: Biology studies living nature and the concept of "information" is associated with the appropriate behavior of living organisms. In living organisms, information is transmitted and stored with objects of various physical nature (DNA state), which are considered as signs of biological alphabets. Genetic information is inherited and stored in all cells of living organisms. Philosophic approach: Information is the interaction, reflection, knowledge. Cybernetic approach: Information is characteristics manager Signal transmitted over line.

The role of information in philosophy

Subjectivist traditionalism constantly dominated early definitions of information as a category, a concept, a property of the material world. Information exists independently of our consciousness and may be reflected in our perception only as a result of interaction: reflection, reading, reception in the form of a signal or stimulus. Information is non-material, like all the properties of matter. Information stands in the row: matter, space, time, systemicity, function, etc., which are the fundamental concepts of the formalized reflection of objective reality in its distribution and variability, diversity and manifestations. Information is a property of matter and reflects its properties (state or ability of interaction) and quantity (measure) through interaction.


From the material point of view, information is the order in which the objects of the material world follow one another. For example, the order of letters on a sheet of paper according to certain rules is written information. The order of coloured points on a sheet of paper according to certain rules is graphic information. The order of musical notes is musical information. The order of genes in DNA is hereditary information. The order of bits in a computer is computer information, and so on. For an exchange of information, necessary and sufficient conditions must be present.


The necessary conditions:

The presence of at least two different objects of the material or non-material world;

The presence of a common property of the objects that allows them to be identified as a carrier of information;

The presence of a specific property of the objects that allows them to be distinguished from one another;

The presence of a property of space that allows the order of the objects to be determined. For example, the arrangement of written information on paper is a specific property of paper that allows letters to be placed from left to right and from top to bottom.

There is just one sufficient condition: the presence of a subject capable of recognizing information. This may be a human being and human society, animal communities, robots, etc. An informational message is constructed by selecting copies of objects from a basis and arranging these objects in space in a definite order. The length of the informational message is defined as the number of copies of the basis objects and is always expressed as an integer. One must distinguish between the length of an informational message, which is always measured by an integer, and the amount of knowledge contained in the informational message, which is measured in a unit of measurement that is still unknown. From the mathematical point of view, information is a sequence of integers written in a vector. The numbers are the numbers of the objects in the information basis. The vector is called an invariant of the information, since it does not depend on the physical nature of the basis objects. One and the same informational message can be expressed in letters, words, sentences, files, pictures, notes, songs, video clips, or any combination of all of these.
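A minimal Python sketch of this "invariant" view: a message as a vector of integers, each integer being the position of a symbol in some agreed basis set. The lowercase Latin alphabet plus a space is an arbitrary choice of basis here, made only for illustration.

```python
# The agreed set of basis objects (an arbitrary choice for this example).
base = list("abcdefghijklmnopqrstuvwxyz ")

def encode(message: str) -> list[int]:
    """Represent a message as a vector of integers: indices into the basis."""
    return [base.index(ch) for ch in message]

def decode(vector: list[int]) -> str:
    """Render the same vector back with the basis objects (here, letters)."""
    return "".join(base[i] for i in vector)

v = encode("seven dwarfs")
print(v)           # the length of the message is the length of this integer vector
print(decode(v))   # the same vector could equally be rendered as notes, pixels, etc.
```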


The role of information in physics

Information is data about the surrounding world (an object, a process, a phenomenon, an event) which are the object of transformation (including storage, transmission, etc.) and are used for developing behaviour, for decision-making, for control or for learning.

The characteristic features of information are as follows:

it is the most important resource of modern production: it reduces the need for land, labour and capital and cuts the expenditure of raw materials and energy. For example, if you are able to archive your files (i.e. you have such information), you do not have to spend money on buying new diskettes;

information brings new production into being. For example, the invention of the laser beam caused the emergence and development of the production of laser (optical) discs;

information is a commodity, and its seller does not lose it after the sale. Thus, if a student tells a friend the timetable of classes for the semester, he does not lose these data himself;

information gives additional value to other resources, in particular labour. Indeed, a worker with higher education is valued more highly than one with secondary education.

As follows from the definition, three concepts are always associated with information:

the source of information - the element of the surrounding world (object, phenomenon, event), data about which are the object of transformation. Thus, the source of the information that the reader of this study manual is receiving at the moment is informatics as a sphere of human activity;

the acquirer of information - the element of the surrounding world that uses the information (for developing behaviour, for decision-making, for control or for learning). The acquirer of this information is the reader himself;

the signal - the material carrier that records the information for transferring it from the source to the acquirer. In this case the signal is electronic. If the student takes this manual in the library, the same information will have a paper carrier. Once read and remembered by the student, the information will acquire yet another carrier, a biological one, when it is "recorded" in the trainee's memory.

The signal is the most important element of this scheme. The forms of its presentation, as well as the quantitative and qualitative characteristics of the information contained in it, which are important for the acquirer of the information, are discussed further in this section of the textbook. The main characteristics of the computer as the main tool that maps the source of information into a signal (link 1 in the figure) and "brings" the signal to the acquirer of the information (link 2 in the figure) are given in the part devoted to the computer. The structure of the procedures that implement links 1 and 2 and make up the information process is the subject of consideration in the part devoted to the information process.

The objects of the material world are in a state of continuous change, which is characterized by the exchange of energy between the object and the environment. A change in the state of one object always leads to a change in the state of some other object of the environment. This phenomenon, regardless of how, which states and which objects change, can be considered as the transmission of a signal from one object to another. The change in the state of an object when a signal is transmitted to it is called the registration of the signal.

A signal or a sequence of signals forms a message that can be perceived by the recipient in one form or another and in one volume or another. Information in physics is a term that qualitatively generalizes the concepts of "signal" and "message". If signals and messages can be quantified, then one can say that signals and messages are units of measurement of the amount of information. A message (signal) is interpreted differently in different systems. For example, one long and two short sound signals is, in the terminology of Morse code, the letter D, while in the terminology of the BIOS from Award it indicates a video-card malfunction.


The role of information in mathematics

In mathematics, the theory of information (the mathematical theory of communication) is a branch of applied mathematics that defines the concept of information and its properties and establishes limiting relations for data transmission systems. The main branches of information theory are source coding (compressive coding) and channel (noise-resistant) coding. Mathematics is more than a scientific discipline: it creates a single language for all of science.

The subjects of mathematical study are abstract objects: number, function, vector, set, and others. Most of them are introduced axiomatically, i.e. without any connection to other concepts and without any definition.

Information is not among the subjects of mathematical research. Nevertheless, the word "information" is used in mathematical terms: self-information and mutual information, which belong to the abstract (mathematical) part of information theory. In mathematical theory, however, the concept of "information" is associated exclusively with abstract objects, random variables, whereas in the modern theory of information this concept is treated much more broadly, as a property of material objects. The connection between these two uses of the same term is undeniable: it is precisely the mathematical apparatus of random variables that was used by the author of information theory, Claude Shannon. He himself understood the term "information" as something fundamental (undefinable). Shannon's theory intuitively assumes that information has content: information reduces overall uncertainty, and information entropy measures that uncertainty. The amount of information can be measured. However, he warned researchers against mechanically transferring the concepts of his theory to other areas of science.

"Search for ways to apply the theory of information in other areas of science is not reduced to the trivial transfer of terms from one area of \u200b\u200bscience to another. This search is carried out in a long-term process of extending new hypotheses and experimental verification." K. Shannon.

The role of information in cybernetics

The founder of cybernetics, Norbert Wiener, spoke about information as follows:

"Information is not matter and not energy; information is information." But the basic definition of information that he gave in several of his books is this: information is the designation of content received by us from the outside world in the process of our adaptation to it with our senses.

Information is the basic concept of cybernetics, just as economic information is the basic concept of economic cybernetics.

There are many definitions of this term, and they are complex and contradictory. The reason is obvious: information as a phenomenon is studied by various sciences, and cybernetics is only the youngest of them. Information is the subject of study of such sciences as management science, mathematical statistics, genetics, the theory of mass media (print, radio, television), and informatics, which deals with the scientific and technical problems of information, among others. Finally, philosophers have recently shown great interest in the problems of information: they tend to regard information as one of the main universal properties of matter, connected with the concept of reflection. In all interpretations, the concept of information presupposes the existence of two objects: a source of information and an acquirer (recipient) of information. Information is transferred from one to the other by means of signals which, generally speaking, may have no physical connection with its meaning: this connection is established by agreement. For example, a stroke of the veche bell meant that the people had to gather in the square, but to those who did not know of this arrangement it conveyed no information.

In the situation with the veche bell, a person who is party to the agreement about the meaning of the signal knows that at the given moment there are two alternatives: the veche assembly will either take place or not. Or, in the language of information theory, the uncertain event (the veche) has two outcomes. The received signal reduces the uncertainty: the person now knows that the event has only one outcome, it will take place. However, if it were known in advance that the veche would take place at a certain hour, the bell would convey nothing new. It follows that the less likely (i.e., the more unexpected) a message is, the more information it contains, and vice versa: the greater the probability of the event before it occurs, the less information the signal carries. Approximately such reasoning led, in the 1940s, to the emergence of the statistical, or "classical", theory of information, which defines the concept of information through a measure of the reduction of uncertainty of knowledge about the occurrence of an event (this measure was named entropy). N. Wiener, C. Shannon, and the Soviet scientists A. N. Kolmogorov, V. A. Kotelnikov, and others stood at the origins of this science. They derived mathematical laws for measuring the amount of information, and hence such concepts as channel capacity, the storage capacity of information devices, and so on, which served as a powerful stimulus for the development of cybernetics as a science and of electronic computing technology as a practical application of its achievements.
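A minimal sketch of this reasoning, using the bell example; the probability 0.99 for the "almost certain" case is an illustrative assumption.

```python
import math

def event_information(p):
    # Information (in bits) conveyed by learning that an event
    # with prior probability p has occurred.
    return -math.log2(p)

# If the veche is equally likely to take place or not, the bell carries 1 bit.
print(event_information(0.5))   # 1.0
# If it was already almost certain (assumed p = 0.99), the bell carries little.
print(event_information(0.99))  # about 0.0145 bits
```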

As for determining the value and usefulness of information for the recipient, much here remains unresolved and unclear. If we proceed from the needs of economic management and, consequently, of economic cybernetics, then information can be defined as all the facts, knowledge, and messages that help solve a particular management task (i.e., reduce the uncertainty of its outcomes). Then some possibilities for evaluating information also open up: it is the more useful and the more valuable, the sooner or at the lower cost it leads to a solution of the task. The concept of information is close to the concept of data. There is, however, a difference between them: data are signals from which the information still has to be extracted, and data processing is the process of bringing them to a form suitable for this.

The process of their transfer from the source to the acquirer and their perception as information can be regarded as passing through three filters:

A physical, or statistical, filter (a purely quantitative restriction on the channel capacity, regardless of the content of the data, i.e., from the point of view of syntactics);

A semantic filter (selection of the data that can be understood by the recipient, i.e., that correspond to the thesaurus of his knowledge);

A pragmatic filter (selection, among the understood data, of those that are useful for solving the given task).

This is well illustrated in the diagram taken from E. G. Yasin's book on economic information. Accordingly, three aspects of the study of information problems are distinguished: syntactic, semantic, and pragmatic.
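A minimal sketch of how the three filters could be modelled; the channel limit, the thesaurus, and the task keywords are invented for the example and are not taken from Yasin's scheme.

```python
# Illustrative assumptions only.
CHANNEL_LIMIT = 80                                   # physical/statistical filter: max message length
THESAURUS = {"plan", "output", "shift", "schedule"}  # semantic filter: words the recipient understands
TASK_KEYWORDS = {"plan", "output"}                   # pragmatic filter: what the current task needs

def passes_filters(message: str) -> bool:
    words = set(message.lower().split())
    syntactic = len(message) <= CHANNEL_LIMIT
    semantic = bool(words & THESAURUS)
    pragmatic = bool(words & TASK_KEYWORDS)
    return syntactic and semantic and pragmatic

print(passes_filters("Shift output fell short of the plan"))  # True
print(passes_filters("Weather report for the weekend"))       # False
```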

According to its content, information is divided into socio-political, socio-economic (including economic information), scientific and technical, and so on. In general there are many classifications of information, built on various grounds. Because the concepts are so close, classifications of data are usually built in the same way. For example, information is divided into static (constant) and dynamic (variable), and data into constants and variables. Another division is into primary, derived, and output information (data are classified in the same way). A third division is into control information and notification information. A fourth is into redundant, useful, and false information. A fifth is into complete (continuous) and sampled information. This idea of Wiener's points directly to the objectivity of information, i.e., to its existence in nature independently of human consciousness (perception).

Modern cybernetics defines objective information as an objective property of material objects and phenomena to generate a variety of states which, through interactions, are transmitted from one object (process) to another and are imprinted in its structure. A material system is considered in cybernetics as a set of objects which can themselves be in different states, but the state of each of which is determined by the states of the other objects of the system.

In nature, the set of states of a system is information, and the states themselves are the primary code, or source code. Thus, every material system is a source of information. Cybernetics defines subjective (semantic) information as the meaning or content of a message.

The role of information in computer science

The subject of study of this science is data: methods of their creation, storage, processing, and transmission. Content (also: the "filling" of a site, in context) is a term meaning all types of information (both textual and multimedia: images, audio, video) that make up a web site and are visualized for the visitor. It is used to separate the notion of the information that makes up the internal structure of a page or site (the code) from that which will eventually be displayed.

The word "information" comes from the Latin word Informatio, which in translation means minimize, clarification, familiarization. The concept of "information" is the basic informatics, but it is impossible to give it a definition through other, more "simple" concepts.

The following approaches to the definition of information can be distinguished:

Traditional (everyday), used in informatics: information is facts, knowledge, and messages about the state of affairs that a person perceives from the surrounding world with the help of the senses (sight, hearing, taste, smell, touch).

Probabilistic, used in information theory: information is data about the objects and phenomena of the environment, their parameters, properties, and state, which reduce the degree of uncertainty and incompleteness of knowledge about them.

Information is stored, transmitted, and processed in symbolic (sign) form. The same information can be presented in different forms:

Written sign form, consisting of various signs, among which one distinguishes the symbolic form (text, numbers, special characters), the graphic form, the tabular form, and so on;

The form of gestures or signals;

Oral verbal form (conversation).

Information is presented by means of languages as sign systems, which are built on a certain alphabet and have rules for performing operations on the signs. A language is a particular sign system for presenting information. There exist:

Natural languages: spoken languages in oral and written form. In some cases spoken speech may be replaced by the language of facial expressions and gestures or by the language of special signs (for example, road signs);

Formal languages: special languages for various areas of human activity, characterized by a rigidly fixed alphabet and stricter rules of grammar and syntax. These are the language of music (notes), the language of mathematics (digits, mathematical signs), number systems, programming languages, and so on. At the heart of any language lies an alphabet, a set of symbols or signs. The total number of symbols of an alphabet is customarily called the power of the alphabet.

An information carrier is a medium or physical body for the transmission, storage, and reproduction of information. (These are electrical, light, thermal, sound, and radio signals, magnetic and laser disks, printed publications, photographs, etc.)

Information processes are processes connected with obtaining, storing, processing, and transmitting information (i.e., actions performed with information). That is, these are processes in the course of which the content of the information or the form of its presentation changes.

To support an information process, a source of information, a communication channel, and an acquirer of information are required. The source transmits (sends) the information, and the receiver receives (perceives) it. The transmitted information travels from the source to the receiver by means of a signal (code). A change in the signal makes it possible to obtain information.

As an object of transformation and use, information is characterized by the following properties:

Syntax is the property that determines the way the information is presented on the carrier (in the signal). Thus, this information is presented on an electronic carrier using a particular typeface. Here one can also consider such presentation parameters as the style and color of the font, its size, the line spacing, and so on. The selection of the required parameters as syntactic properties is obviously determined by the intended method of transformation. For example, for a person with poor eyesight the size and color of the font are essential. If this text is to be entered into a computer through a scanner, the paper format is important;

Semantics is the property that determines the meaning of the information as the correspondence of the signal to the real world. Thus, the semantics of the signal "informatics" lies in the definition given earlier. Semantics can be regarded as a certain agreement, known to the acquirer of the information, about what each signal means (the so-called interpretation rule). For example, it is precisely the semantics of signals that a novice driver studies when learning the rules of the road and road signs (in this case the signals are the signs themselves). The semantics of words (signals) is learned by a student of a foreign language. One can say that the point of learning informatics is to study the semantics of various signals, the essence of the key concepts of this discipline;

Pragmatics is the property that determines the influence of the information on the behavior of the acquirer. Thus, the pragmatics of the information received by the reader of this tutorial consists, at the very least, in successfully passing the informatics exam. One would like to believe that the pragmatics will not be limited to this and that the information will serve the reader's further education and professional activity.

It should be noted that signals differing in syntax may have the same semantics. For example, the signals "computer" and "ECM" (electronic computing machine) denote an electronic device for transforming information; in this case one usually speaks of the synonymy of signals. On the other hand, one signal (i.e., information with one syntactic property) may have different pragmatics for different consumers and different semantics. Thus, the road sign known as the "brick", which has a perfectly definite semantics ("no entry"), means a prohibition of entry for a motorist but does not affect a pedestrian in any way. At the same time, the signal "key" can have different semantics: a treble clef, a spring key, a key for opening a lock, a key used in a computer to encode information for protection against unauthorized access (in this case one speaks of the homonymy of signals). There are also signal antonyms, which have opposite semantics, for example "cold" and "hot", "fast" and "slow", and so on.
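A small sketch of how synonymy and homonymy of signals can be represented; the dictionaries are illustrative assumptions.

```python
# Several signals map to one meaning (synonyms); one signal maps to several meanings (homonyms).
meanings = {
    "computer": "electronic device for transforming information",
    "ECM": "electronic device for transforming information",   # synonym of "computer"
}
homonyms = {
    "key": ["treble clef", "spring key", "door key", "cryptographic key"],
}
print(meanings["computer"] == meanings["ECM"])  # True: same semantics, different syntax
print(len(homonyms["key"]))                     # 4: one signal, several meanings
```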

The subject of study of the science of informatics is data: methods of their creation, storage, processing, and transmission. The information itself recorded in the data, its meaningful content, is of interest to the users of information systems, who are specialists in various sciences and fields of activity: a physician is interested in medical information, a geologist in geological, a businessman in commercial, and so on (an informatics specialist, in turn, is interested in information on the problems of working with data).

Semiotics: the science of information

Information cannot be imagined without its receipt, processing, transmission, and so on, that is, outside the exchange of information. All acts of information exchange are carried out by means of symbols or signs, with the help of which one system affects another. Therefore, the main science studying information is semiotics, the science of signs and sign systems in nature and society (the theory of signs). In each act of information exchange one can find three of its "participants", three elements: the sign, the object it denotes, and the recipient (user) of the sign.

Depending on which elements' relations are considered, semiotics is divided into three sections: syntactics, semantics, and pragmatics. Syntactics studies signs and the relations between them, while abstracting from the content of the sign and from its practical value for the recipient. Semantics studies the relations between signs and the objects they denote, while abstracting from the recipient of the signs and from their value for him: it is clear that studying the laws of the semantic representation of objects in signs is impossible without taking into account and using the general laws of construction of any sign systems studied by syntactics. Pragmatics studies the relations between signs and their users. Within pragmatics, all the factors distinguishing one act of information exchange from another are studied, as are all the questions of the practical results of using information and of its value for the recipient.

At the same time, many aspects of the relations of signs among themselves and with the objects they denote are inevitably touched upon. Thus, the three sections of semiotics correspond to three levels of abstraction (distancing) from the specifics of particular acts of information exchange. The study of information in all its diversity corresponds to the pragmatic level. Abstracting from the recipient of the information, excluding him from consideration, we pass to studying it at the semantic level. Abstracting from the content of the signs, we transfer the analysis of information to the level of syntactics. This interpenetration of the main sections of semiotics, connected with different levels of abstraction, can be represented by the scheme "The three sections of semiotics and their interrelation". The measurement of information is accordingly carried out in the same three aspects: syntactic, semantic, and pragmatic. The need for such different measurements of information, as will be shown below, is dictated by the practice of designing and operating information systems. Consider a typical production situation.

At the end of a shift, the section planner prepares data on the fulfilment of the production schedule. These data arrive at the enterprise's information and computing centre (ICC), where they are processed and, in the form of reports on the current state of production, issued to managers. On the basis of the data received, the shop manager decides whether to change the production plan for the next planning period or to take some other organizational measures. Obviously, for the shop manager the amount of information contained in the summary depends on the magnitude of the economic effect obtained from its use in decision-making, on how useful the information received was. For the section planner, the amount of information in the same message is determined by how accurately it corresponds to the actual state of affairs on the section and by the degree of unexpectedness of the reported facts: the more unexpected they are, the sooner they must be reported to management, and the more information the message contains. For the ICC staff, the number of characters, the length of the message carrier, is of paramount importance, since it is precisely this that determines the loading time of the computing equipment and communication channels. Neither the usefulness of the information nor a quantitative measure of its semantic value is of practical interest to them.

Naturally, when organizing a production management system and building models for selecting decisions, we will use the usefulness of information as its measure. When building a system of accounting and reporting that provides management with data on the course of the production process, the novelty of the information received should be taken as its measure. The organization of procedures for the mechanical processing of information requires measuring the volume of messages as the number of processed characters. These three essentially different approaches to the measurement of information do not contradict and do not exclude each other. On the contrary, by measuring information on different scales, they make it possible to assess the informativeness of each message more fully and comprehensively and to organize a production management system more effectively. In the apt expression of Prof. N. E. Kobrinsky, when it comes to the rational organization of information flows, the quantity, novelty, and usefulness of information are related to each other in the same way as the quantity, quality, and cost of output in production.

Information in the material world

Information is one of the general concepts associated with matter. Information exists in any material object in the form of the manifold of its states and is transmitted from object to object in the process of their interaction. The existence of information as an objective property of matter follows logically from the known fundamental properties of matter: structuredness, continuous change (motion), and the interaction of material objects.

The structuredness of matter manifests itself as the internal articulation of a whole, as the regular order of connection of elements within the whole. In other words, any material object, from a subatomic particle to the Metagalaxy (the Universe) as a whole, is a system of interconnected subsystems. Owing to continuous motion, understood in the broad sense as movement in space and development in time, material objects change their states. The state of an object also changes when it interacts with other objects. The set of states of a material system and of all its subsystems constitutes information about the system.

Strictly speaking, by virtue of the indeterminacy and infinity of the property of structuredness, the amount of objective information in any material object is infinite. This information is called complete. However, one can single out structural levels with finite sets of states. Information that exists at a structural level with a finite number of states is called partial. For partial information, the concept of the amount of information makes sense.

From this view the choice of a unit of measurement of the amount of information follows logically and simply. Imagine a system that can be in only two equiprobable states. Let us assign the code "1" to one of them and "0" to the other. This is the minimum amount of information the system can contain. It is the unit of measurement of information and is called the bit. There are also other, more complexly defined methods and units of measurement of the amount of information.

Depending on the material form of the carrier, information is of two main kinds: analog and discrete. Analog information changes continuously in time and takes values from a continuum. Discrete information changes at certain moments of time and takes values from a certain set of values. Any material object or process is a primary source of information. All of its possible states make up its source code. The instantaneous value of a state is presented as a symbol ("letter") of this code. For information to be transmitted from one object to another, the receiver, some intermediate material carrier interacting with the source is required. In nature such carriers are, as a rule, rapidly propagating processes of a wave structure: cosmic, gamma, and X-rays, electromagnetic and sound waves, potentials (and perhaps waves not yet discovered) of the gravitational field. When electromagnetic radiation interacts with an object, its spectrum changes as a result of absorption or reflection, i.e., the intensities of certain wavelengths change. The harmonics of sound oscillations also change upon interaction with objects. Information is transmitted by mechanical interaction as well, but mechanical interaction, as a rule, leads to large changes in the structure of the objects (up to their destruction), and the information is strongly distorted. Distortion of information during its transmission is called disinformation.

The transfer of the source's information onto the structure of the carrier is called encoding. In this process the source code is transformed into the carrier code. A carrier with the source code transferred to it in the form of a carrier code is called a signal. The signal receiver has its own set of possible states, called the receiver code. The signal, interacting with the receiver object, changes its states. The process of transforming the signal code into the receiver code is called decoding. The transfer of information from a source to a receiver can be regarded as an informational interaction. Informational interaction differs radically from other interactions. In all other interactions of material objects, substance and (or) energy is exchanged; one of the objects loses substance or energy and the other gains them. This property of interactions is called symmetry. In an informational interaction the receiver receives information while the source does not lose it; informational interaction is asymmetric. Objective information itself is not material; it is a property of matter, like structuredness or motion, and exists on material carriers in the form of its codes.
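A minimal sketch of this chain of codes (source code, carrier code, receiver code); the particular alphabets and tables are illustrative assumptions.

```python
SOURCE_TO_CARRIER = {"red": "10", "green": "01", "yellow": "11"}             # encoding table
CARRIER_TO_RECEIVER = {v: k.upper() for k, v in SOURCE_TO_CARRIER.items()}  # decoding table

def encode(source_state: str) -> str:
    # Transfer the source state onto the carrier: the result is the signal.
    return SOURCE_TO_CARRIER[source_state]

def decode(signal: str) -> str:
    # The receiver changes its state according to the incoming signal.
    return CARRIER_TO_RECEIVER[signal]

signal = encode("red")
print(signal)          # '10'
print(decode(signal))  # 'RED': the receiver's new state; the source has lost nothing
```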

Information in living nature

Living nature is complex and diverse. The sources and receivers of information in it are living organisms and their cells. An organism has a number of properties that distinguish it from non-living material objects.

The main ones are:

Continuous exchange of matter, energy, and information with the environment;

Irritability, the ability of an organism to perceive and process information about changes in the environment and in the internal medium of the organism;

Excitability, the ability to respond to the action of stimuli;

Self-organization, manifested as changes of the organism to adapt to the conditions of the external environment.

An organism, considered as a system, has a hierarchical structure. This structure, relative to the organism itself, is divided into internal levels: molecular, cellular, the level of organs and, finally, the organism itself. However, the organism also interacts within supra-organismal living systems, whose levels are the population, the ecosystem, and living nature as a whole (the biosphere). Between all these levels circulate flows not only of matter and energy, but also of information. Informational interactions in living nature occur in the same way as in non-living nature. At the same time, in the course of evolution living nature has created a great variety of sources, carriers, and receivers of information.

The reaction to influences from the outside world is manifested in all organisms, since it is caused by irritability. In higher organisms, adaptation to the external environment takes the form of complex activity, which is effective only with sufficiently complete and timely information about the environment. The receivers of information from the external environment are the sense organs: sight, hearing, smell, taste, touch, and the vestibular apparatus. In the internal structure of organisms there are numerous internal receptors connected with the nervous system. The nervous system consists of neurons, whose processes (axons and dendrites) are analogous to information transmission channels. The main organs that store and process information in vertebrates are the spinal cord and the brain. In accordance with the peculiarities of the sense organs, the information perceived by an organism can be classified as visual, auditory, gustatory, olfactory, and tactile.

Reaching the retina of the human eye, a signal excites its constituent cells in a particular way. Nerve impulses are transmitted along axons to the brain. The brain remembers this sensation as a certain combination of states of its constituent neurons. (The example is continued in the section "Information in human society".) By accumulating information, the brain creates within its structure a connected informational model of the surrounding world. In living nature an important characteristic of information for the receiving organism is its availability. The amount of information that the human nervous system is capable of delivering to the brain when reading text is approximately 1 bit per 1/16 of a second.

The study of organisms is hampered by their complexity. The abstraction of structure as a mathematical set, admissible for non-living objects, is hardly admissible for a living organism, because to create a more or less adequate abstract model of an organism it is necessary to take into account all the hierarchical levels of its structure. It is therefore difficult to introduce a measure of the amount of information. It is very difficult to determine the connections between the components of the structure. If it is known which organ is a source of information, then what is the signal and what is the receiver?

Before the appearance of computing machines, biology, which studies living organisms, used only qualitative, i.e., descriptive, models. In a qualitative model it is practically impossible to take into account the informational links between the components of the structure. Electronic computing technology made it possible to apply new methods in biological research, in particular the method of machine modelling, which involves a mathematical description of known phenomena and processes occurring in the organism, the addition of hypotheses about some unknown processes, and the calculation of possible variants of the organism's behaviour. The variants obtained are compared with the real behaviour of the organism, which makes it possible to establish the truth or falsity of the hypotheses advanced. Informational interaction can also be taken into account in such models. The information processes that ensure the existence of life itself are extraordinarily complex. And although it is intuitively clear that this property is directly connected with the formation, storage, and transmission of complete information about the structure of the organism, an abstract description of this phenomenon seemed impossible for some time. Nevertheless, the information processes that ensure the existence of this property have been partially revealed thanks to the deciphering of the genetic code and the reading of the genomes of various organisms.

Information in human society

The development of matter in the process of motion is directed towards the complication of the structure of material objects. One of the most complex structures is the human brain. So far it is the only structure known to us that possesses the property man himself calls consciousness. Speaking of information, we, as thinking beings, a priori assume that information, apart from its presence in the form of signals we receive, also has meaning. Forming in his consciousness a model of the surrounding world as an interconnected set of models of its objects and processes, a person operates with semantic concepts, not with information. Meaning is the essence of any phenomenon, which does not coincide with the phenomenon itself and links it to the wider context of reality. This in itself directly indicates that the semantic content of information can be formed only by thinking receivers of information. In human society it is not the information itself that acquires decisive importance, but its semantic content.

Example (continued). Having experienced the sensation, a person assigns to the object the concept "tomato", and to its state the concept "red". In addition, his consciousness fixes the connection between "tomato" and "red". This is the meaning of the received signal. (The example is continued below in this section.) The ability of the brain to create semantic concepts and connections between them is the basis of consciousness. Consciousness can be regarded as a self-developing semantic model of the surrounding world. Meaning is not information. Information exists only on a material carrier, whereas human consciousness is considered immaterial. Meaning exists in a person's consciousness in the form of words, images, and sensations. A person can pronounce words not only aloud but also "to himself"; "to himself" he can also create (or recall) images and sensations. However, he can restore the information corresponding to this meaning by pronouncing the words or writing them down.

Example (continued). If the words "tomato" and "red" are the meaning of the concepts, then where is the information? The information is contained in the brain in the form of certain states of its neurons. It is also contained in the printed text consisting of these words, and when the letters are encoded with a three-bit binary code its amount is 120 bits. If the words are pronounced aloud, the information will be considerably greater, but the meaning will remain the same. The greatest amount of information is carried by a visual image. This is reflected even in folklore: "it is better to see once than to hear a hundred times." Information restored in this way is called semantic information, since it encodes the meaning of some primary information (semantics). Hearing (or seeing) a phrase pronounced (or written) in a language a person does not know, he receives information but cannot determine its meaning. Therefore, to transmit the semantic content of information, certain agreements between the source and the receiver about the meaning of the signals, i.e., of the words, are necessary. Such agreements can be reached in the process of communication. Communication is one of the most important conditions for the existence of human society.

In the modern world, information is one of the most important resources and, at the same time, one of the driving forces of the development of human society. Information processes occurring in the material world, in living nature, and in human society are studied (or at least taken into account) by all scientific disciplines, from philosophy to marketing. The growing complexity of scientific research problems has led to the need to involve large teams of scientists of different specialties in their solution. Therefore, almost all of the theories considered below are interdisciplinary. Historically, two complex branches of science, cybernetics and informatics, have been engaged in the study of information directly.

Modern cybernetics is a multidisciplinary branch of science that investigates highly complex systems, such as:

Human society (social cybernetics);

The economy (economic cybernetics);

The living organism (biological cybernetics);

The human brain and its function, consciousness (artificial intelligence).

Informatics, which took shape as a science in the middle of the last century, separated from cybernetics and is engaged in research in the field of methods for obtaining, storing, transmitting, and processing semantic information. Both of these branches use several fundamental scientific theories. These include information theory and its sections: coding theory, the theory of algorithms, and automata theory. Research into the semantic content of information is based on a complex of scientific theories under the general name of semiotics. Information theory is an integral, mainly mathematical theory that includes the description and evaluation of methods for extracting, transmitting, storing, and classifying information. It treats information carriers as elements of an abstract (mathematical) set, and the interactions between carriers as a way of arranging elements in this set. This approach makes it possible to describe the information code formally, that is, to define an abstract code and study it by mathematical methods. For these studies the methods of probability theory, mathematical statistics, linear algebra, game theory, and other mathematical theories are used.

The foundations of this theory were laid by the American scientist R. Hartley in 1928, who determined the measure of information for certain communication problems. Later the theory was substantially developed by the American scientist C. Shannon and the Russian scientists A. N. Kolmogorov, V. M. Glushkov, and others. Information theory includes, as sections, coding theory, the theory of algorithms, the theory of digital machines (see below), and some others. There are also alternative information theories, for example the "qualitative information theory" proposed by the Polish scientist M. Mazur. The concept of an algorithm is familiar to every person, even if he does not suspect it. Here is an example of an informal algorithm: "Cut the tomatoes into circles or slices. Put chopped onion into them, pour in vegetable oil, then sprinkle with finely chopped pepper and mix. Before serving, sprinkle with salt, put into a salad bowl and decorate with parsley." (Tomato salad.)

The first rules in the history of mankind for solving arithmetic problems were developed by one of the famous scientists of antiquity, al-Khwarizmi, in the 9th century AD. In his honour, formalized rules for achieving any goal are called algorithms. The theory of algorithms is concerned with finding methods for constructing and evaluating effective (including universal) computational and control algorithms for information processing. To substantiate such methods, the theory of algorithms uses the mathematical apparatus of information theory. The modern scientific concept of algorithms as methods of processing information was introduced in the works of E. Post and A. Turing in the 1920s-1930s (the Turing machine). A great contribution was made by the Russian scientists A. Markov (the Markov normal algorithm) and A. Kolmogorov. Automata theory is a section of theoretical cybernetics that studies mathematical models of actually existing or fundamentally possible devices that process discrete information at discrete moments of time.
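A minimal sketch of such a device: a finite automaton that processes discrete symbols at discrete steps. The particular states and transition table (an automaton recognizing binary strings with an even number of ones) are an illustrative assumption.

```python
# Transition table of a two-state finite automaton.
TRANSITIONS = {
    ("even", "0"): "even", ("even", "1"): "odd",
    ("odd", "0"): "odd",   ("odd", "1"): "even",
}

def run_automaton(word: str, start: str = "even", accepting=("even",)) -> bool:
    state = start
    for symbol in word:                      # discrete information, discrete steps
        state = TRANSITIONS[(state, symbol)]
    return state in accepting

print(run_automaton("1011"))   # False: three ones
print(run_automaton("1001"))   # True: two ones
```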

The concept of an automaton arose in the theory of algorithms. If there are universal algorithms for solving computational problems, there must also be devices (albeit abstract ones) for implementing such algorithms. In fact, the abstract Turing machine considered in the theory of algorithms is at the same time an informally defined automaton. The theoretical substantiation of the construction of such devices is the subject of automata theory. Automata theory uses the apparatus of mathematical theories: algebra, mathematical logic, combinatorial analysis, graph theory, probability theory, and others. Automata theory, together with the theory of algorithms, is the main theoretical basis for the creation of electronic computing machines and automated control systems. Semiotics is a complex of scientific theories studying the properties of sign systems. The most significant results have been achieved in the section of semiotics called semantics. The subject of research in semantics is the semantic content of information.

A sign system is regarded as a system of concrete or abstract objects (signs, words), with each of which a certain meaning is associated in a certain way. In theory it is proved that there can be two such correspondences. The first type of correspondence determines directly the material object denoted by the word and is called the denotation (or, in some works, the nominatum). The second type of correspondence determines the meaning of the sign (word) and is called the concept. At the same time, such properties of correspondences as "meaning", "truth", "definability", "consequence", "interpretation", and others are investigated. For this research the apparatus of mathematical logic and mathematical linguistics is used. The ideas of semantics, outlined by G. W. Leibniz and F. de Saussure in the 19th century, were formulated and developed by C. Peirce (1839-1914), C. Morris (b. 1901), R. Carnap (1891-1970), and others. The main achievement of the theory is the creation of an apparatus of semantic analysis that makes it possible to represent the meaning of a text in a natural language as a record in some formalized semantic (meaning) language. Semantic analysis is the basis for creating devices (programs) for machine translation from one natural language to another.

Information is stored by transferring it to some material carrier. Semantic information recorded on a material storage carrier is called a document. Humanity learned to store information a very long time ago. The most ancient forms of information storage used the arrangement of objects: shells and stones on the sand, knots on a rope. A substantial development of these methods was writing, the graphic representation of symbols on stone, clay, papyrus, paper. The invention of printing was of enormous importance in the development of this direction. Over its history, humanity has accumulated a huge amount of information in libraries, archives, periodicals, and other written documents.

At present, the storage of information in the form of sequences of binary symbols has acquired particular importance. A variety of storage devices are used to implement these methods; they are the central link of information storage systems. In addition to them, such systems use means of information retrieval (search engines), reference facilities (information and reference systems), and means of displaying information (output devices). Such information systems, formed according to the purpose of the information, constitute databases, data banks, and knowledge bases.

The transmission of semantic information is the process of its spatial transfer from a source to a recipient (addressee). Man learned to transmit and receive information even earlier than to store it. Speech is the method of transmission that our distant ancestors used in direct contact (conversation), and we still use it now. To transmit information over long distances it is necessary to use considerably more complex information processes. To carry out such a process, the information must be arranged (presented) in some way. To present information, various sign systems are used: sets of previously agreed semantic symbols, objects, pictures, written or printed words of a natural language. Semantic information about some object, phenomenon, or process presented with their help is called a message.

Obviously, to send a message over a distance, the information must be transferred to some mobile carrier. Carriers can move through space with the help of vehicles, as happens with letters sent by mail. This method ensures complete accuracy of information transmission, since the addressee receives the original message, but it requires considerable time. From the middle of the 19th century, methods of transmitting information spread that use a naturally propagating information carrier: electromagnetic oscillations (electrical oscillations, radio waves, light). The implementation of these methods requires (a small sketch follows the list below):

Preliminary transfer of the information contained in the message onto the carrier, i.e., encoding;

Transmission of the signal thus obtained to the addressee over a special communication channel;

Reverse conversion of the signal code into the message code, i.e., decoding.
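A minimal sketch of the three steps just listed; the text encoding (UTF-8) and the ideal, noise-free channel are illustrative assumptions, since a real channel would also need protection against interference.

```python
def encode(message: str) -> bytes:
    return message.encode("utf-8")   # message code -> carrier (signal) code

def channel(signal: bytes) -> bytes:
    return signal                    # an ideal, noise-free channel

def decode(signal: bytes) -> str:
    return signal.decode("utf-8")    # carrier code -> message code

received = decode(channel(encode("The veche will take place at noon")))
print(received)
```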

The use of electromagnetic carriers makes the delivery of a message to the addressee almost instantaneous, but it requires additional measures to ensure the quality (reliability and accuracy) of the transmitted information, since real communication channels are subject to natural and artificial interference. Devices implementing the process of data transmission form communication systems. Depending on the way the information is presented, communication systems can be divided into sign systems (telegraph, telefax), sound systems (telephone), and video and combined systems (television). The most developed communication system of our time is the Internet.

Data processing

Since information is not material, its processing consists in various transformations. Processing processes include any transfer of information from one carrier to another. Information intended for processing is called data. The main type of processing of primary information obtained by various devices is its transformation into a form that ensures its perception by the human senses. Thus, photographs of space taken in X-rays are converted into ordinary colour photographs using special spectrum converters and photographic materials. Night-vision devices convert an image obtained in infrared (thermal) rays into an image in the visible range. For some communication and control tasks it is necessary to convert analog information; analog-to-digital and digital-to-analog signal converters are used for this.

The most important type of processing of semantic information is determining the meaning (content) contained in a certain message. Unlike primary information, semantic information has no statistical characteristics, that is, no quantitative measure: the meaning is either there or it is not, and how much of it there is, if it is there, cannot be established. The meaning contained in a message is described in an artificial language that reflects the semantic links between the words of the source text. A dictionary of such a language, called a thesaurus, is located in the message receiver. The meaning of the words and phrases of the message is determined by assigning them to certain groups of words or phrases whose meaning has already been established. The thesaurus thus makes it possible to establish the meaning of the message and, at the same time, is replenished with new semantic concepts. The described type of information processing is used in information retrieval systems and machine translation systems.
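A minimal sketch of thesaurus-based interpretation as described above; the concept groups are invented for the example.

```python
thesaurus = {
    "plan": "production concepts",
    "output": "production concepts",
    "tomato": "food concepts",
    "red": "colour concepts",
}

def interpret(message: str) -> dict:
    meanings = {}
    for word in message.lower().split():
        if word in thesaurus:
            meanings[word] = thesaurus[word]
        else:
            thesaurus[word] = "new concept"   # the thesaurus is replenished
            meanings[word] = "new concept"
    return meanings

print(interpret("red tomato harvest"))
# {'red': 'colour concepts', 'tomato': 'food concepts', 'harvest': 'new concept'}
```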

One of the most widespread types of information processing is the solution of computational problems and automatic control problems with the help of computing machines. Information processing is always carried out with some purpose. To achieve it, the order of actions on the information leading to the given goal must be known. Such an order of actions is called an algorithm. Besides the algorithm itself, some device implementing this algorithm is also needed. In scientific theories such a device is called an automaton. It should be noted, as the most important feature of information, that owing to the asymmetry of informational interaction, new information arises when information is processed, while the original information is not lost.

Analog and digital information

Sound consists of wave oscillations in some medium, for example in air. When a person speaks, the oscillations of the vocal cords are transformed into wave oscillations of the air. If we consider sound not as a wave but as oscillations at a single point, then these oscillations can be represented as air pressure changing over time. With a microphone these changes of pressure can be captured and converted into an electrical voltage. Air pressure has thus been transformed into oscillations of electrical voltage.

Such a transformation can take place according to various laws; most often it follows a linear law, for example:

U(t) = k · (P(t) − P0),

where U(t) is the electrical voltage, P(t) is the air pressure, P0 is the average air pressure, and k is the conversion factor.
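A minimal sketch of this linear transformation; the values of k and P0 and the toy 440 Hz "voice" are illustrative assumptions.

```python
import math

K = 0.002       # conversion factor, V/Pa (assumed)
P0 = 101325.0   # average air pressure, Pa (assumed)

def pressure(t: float) -> float:
    # A toy "voice": a pure 440 Hz tone superimposed on the average pressure.
    return P0 + 2.0 * math.sin(2 * math.pi * 440 * t)

def voltage(t: float) -> float:
    return K * (pressure(t) - P0)   # U(t) = k * (P(t) - P0)

print(voltage(0.0))       # 0.0
print(voltage(1 / 1760))  # peak of the tone: about 0.004 V
```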

Both the electrical voltage and the air pressure are continuous functions of time. The functions U(t) and P(t) carry information about the oscillations of the vocal cords. These functions are continuous, and such information is called analog. Music is a particular case of sound, and it too can be represented as some function of time; that would be an analog representation of music. But music is also written down in the form of notes. Each note has a duration that is a multiple of a predetermined unit duration, and a pitch (do, re, mi, fa, sol, and so on). If these data are converted into numbers, we obtain a digital representation of the music.

Human speech is also a particular case of sound. It too can be represented in analog form. But just as music can be broken down into notes, speech can be broken down into letters. If each letter is given its own set of digits, we obtain a digital representation of speech. The difference between analog and digital information is that analog information is continuous while digital information is discrete. The transformation of information from one kind to another is named differently depending on the kind of transformation: simply "conversion", as in digital-to-analog conversion or analog-to-digital conversion; complex transformations are called "coding", for example delta coding or entropy coding; transformations between characteristics such as amplitude, frequency, or phase are called "modulation", for example amplitude-frequency modulation or pulse modulation.
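A minimal sketch of analog-to-digital conversion by sampling and quantization; the sample rate and bit depth are illustrative assumptions.

```python
import math

SAMPLE_RATE = 8000   # samples per second (assumed)
LEVELS = 256         # 8-bit quantization (assumed)

def analog_signal(t: float) -> float:
    return math.sin(2 * math.pi * 440 * t)   # a continuous 440 Hz tone

def to_digital(duration: float) -> list[int]:
    samples = []
    for n in range(int(duration * SAMPLE_RATE)):
        value = analog_signal(n / SAMPLE_RATE)         # sampling at discrete moments
        level = round((value + 1) / 2 * (LEVELS - 1))  # quantization to integer levels
        samples.append(level)
    return samples

digital = to_digital(0.001)   # 8 discrete values instead of a continuous function
print(digital)
```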

Usually analog transformations are fairly simple, and the various devices invented by man cope with them easily. A tape recorder converts the magnetization of the tape into sound, a voice recorder converts sound into the magnetization of the tape, a video camera converts light into the magnetization of the tape, an oscilloscope converts electrical voltage or current into an image, and so on. Converting analog information into digital form is noticeably more complicated. Some transformations the machine cannot perform at all, or manages only with great difficulty: for example, converting speech into text, or converting a recording of a concert into a score; even conversion between representations that are digital by nature can be hard, as when text on paper must be converted into the same text in the computer's memory.

Why, then, use a digital representation of information if it is so complicated? The main advantage of digital information over analog is noise immunity. That is, in the process of copying, digital information is copied as it is and can be copied almost an infinite number of times, whereas analog information becomes noisier during copying and its quality deteriorates. Usually analog information can be copied no more than three times. If you have a two-cassette audio tape recorder, you can perform such an experiment: try to re-record the same song several times from cassette to cassette, and after a few such re-recordings you will notice how much the recording quality has deteriorated. The information on the cassette is stored in analog form. Music in MP3 format can be re-recorded as many times as you like, and the quality of the music does not deteriorate, because the information in an MP3 file is stored in digital form.
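A minimal sketch of why digital copying is noise-immune; the noise level added per analog copy is an illustrative assumption.

```python
import random

def analog_copy(signal, noise=0.05):
    # Each analog copy picks up a little random noise.
    return [s + random.uniform(-noise, noise) for s in signal]

def digital_copy(bits):
    # A digital copy is reproduced exactly, bit for bit.
    return list(bits)

analog = [0.0, 1.0, 0.5, 1.0]
digital = [0, 1, 0, 1]
for _ in range(10):               # ten generations of copies
    analog = analog_copy(analog)
    digital = digital_copy(digital)
print(analog)    # values have drifted away from the original
print(digital)   # [0, 1, 0, 1], unchanged
```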

The amount of information

A person, or some other receiver of information, having received a portion of information, resolves some uncertainty. Take, for example, that same tree. When we saw the tree, we resolved a number of uncertainties: we learned the height of the tree, the species of the tree, the density of its foliage, the colour of its leaves and, if it is a fruit tree, we saw the fruit on it and how ripe it is, and so on. Before we looked at the tree we did not know all this; after we looked at the tree we resolved the uncertainty, we received information.

If we go out into a meadow and look at it, we obtain information of another kind: how large the meadow is, how tall the grass is, and what colour the grass is. If a biologist goes out into the same meadow, he will, in addition, be able to find out what varieties of grass grow in the meadow, what type of meadow it is, which flowers have already bloomed and which are only about to bloom, whether the meadow is suitable for grazing cows, and so on. In other words, he will receive a greater amount of information than we do, since before he looked at the meadow he had more questions; the biologist resolves more uncertainty.

The more uncertainty was resolved in the process of obtaining information, the more information we received. But this is a subjective measure of the amount of information, and we would like to have an objective measure. There is a formula for calculating the amount of information. If we have some uncertainty with N possible outcomes of its resolution, and each outcome has a certain probability, then the amount of information received can be calculated by the following formula, proposed by Shannon:

I = -(p_1 log_2 p_1 + p_2 log_2 p_2 + ... + p_N log_2 p_N), where

I is the amount of information;

N is the number of outcomes;

p_1, p_2, ..., p_N are the probabilities of the outcomes.

The amount of information is measured in bits, an abbreviation of the English words binary digit.

For equiprobable events the formula can be simplified:

I = log_2 N, where

I is the amount of information;

N is the number of outcomes.

Take a coin, for example, and toss it onto the table. It will land either heads or tails: we have 2 equiprobable events. After we have tossed the coin, we have received log_2 2 = 1 bit of information.

Let us try to find out how much information we obtain after throwing a die. A die has six faces, six equiprobable events. We obtain: log_2 6 ≈ 2.6. After we have thrown the die onto the table, we have received approximately 2.6 bits of information.

The probability that we will see a Martian dinosaur when we step out of the house is one in ten billion. How much information about the Martian dinosaur will we receive after stepping out of the house?

-((1/10^10) log_2(1/10^10) + (1 - 1/10^10) log_2(1 - 1/10^10)) ≈ 3.4 · 10^(-9) bits.

Suppose we tossed 8 coins. We have 2^8 variants of how the coins can fall. So after tossing the coins we receive log_2(2^8) = 8 bits of information.
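A minimal sketch of Shannon's formula, checked against the examples above (coin, die, Martian dinosaur, eight coins).

```python
import math

def shannon_information(probabilities):
    # I = -(sum of p_i * log2(p_i)) over all outcomes with p_i > 0.
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(shannon_information([0.5, 0.5]))        # coin: 1.0 bit
print(shannon_information([1 / 6] * 6))       # die: about 2.585 bits
p = 1e-10                                     # Martian dinosaur
print(shannon_information([p, 1 - p]))        # about 3.4e-9 bits
print(shannon_information([1 / 2**8] * 2**8)) # 8 coins: 8.0 bits
```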

When we ask a question and can with equal probability receive the answer "yes" or "no", then after the answer to the question we receive one bit of information.

It is surprising that if we apply Shannon's formula to analog information, we obtain an infinite amount of information. For example, the voltage at a point of an electric circuit can take any value from zero to one volt with equal probability. The number of outcomes here is infinite and, substituting this value into the formula for equiprobable events, we obtain infinity, an infinite amount of information.

Now I will show how to encode "War and Peace" with just a single mark on a metal rod. Let us encode all the letters and signs occurring in "War and Peace" with two-digit numbers; they should be enough for us. For example, we give the letter "A" the code "00", the letter "B" the code "01", and so on; we also encode the punctuation marks, Latin letters, and digits. We re-encode "War and Peace" with this code and obtain a long number, for example 70123856383901874..., and prepend a decimal point and a zero to it (0.70123856383901874...). The result is a number between zero and one. We put a mark on the metal rod so that the ratio of the length of the left part of the rod to the length of the whole rod is exactly equal to our number. Then, if we suddenly want to read "War and Peace", we simply measure the left part of the rod up to the mark and the length of the whole rod, divide one number by the other, obtain the number, and re-encode it back into letters ("00" into "A", "01" into "B", and so on).
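A minimal sketch of this encoding on a tiny invented alphabet; a real rod, of course, runs into the measurement problem discussed below.

```python
ALPHABET = {"A": "00", "B": "01", "C": "02", " ": "03"}   # assumed toy code table
REVERSE = {v: k for k, v in ALPHABET.items()}

def encode_to_fraction(text: str) -> str:
    return "0." + "".join(ALPHABET[ch] for ch in text)

def decode_from_fraction(fraction: str) -> str:
    digits = fraction[2:]                                  # drop the "0."
    pairs = [digits[i:i + 2] for i in range(0, len(digits), 2)]
    return "".join(REVERSE[p] for p in pairs)

mark = encode_to_fraction("ABBA CAB")
print(mark)                        # 0.0001010003020001
print(decode_from_fraction(mark))  # ABBA CAB
```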

In reality we will not be able to do this, since we cannot determine lengths with infinite accuracy. Engineering problems prevent us from increasing the accuracy of measurement without limit, and quantum physics shows that beyond a certain point quantum laws will get in our way. It is intuitively clear that the lower the accuracy of the measurement, the less information we receive, and the higher the accuracy, the more information we receive. Shannon's formula is not suitable for measuring the amount of analog information, but there are other methods for this, which are considered in information theory. In computer technology a bit corresponds to the physical state of the information carrier: magnetized or not magnetized, hole or no hole, charged or not charged, reflects light or does not reflect light, high electric potential or low electric potential. In this case one state is conventionally denoted by the digit 0 and the other by the digit 1. Any information can be encoded by a sequence of bits: text, images, sound, and so on.

Along with the bit, a unit called the byte is often used; it is usually equal to 8 bits. And whereas a bit allows one to choose one variant out of two possible, a byte chooses 1 out of 256 (2^8). To measure the amount of information, larger units are also customarily used:

1 KB (one kilobyte) = 2^10 bytes = 1024 bytes

1 MB (one megabyte) = 2^10 KB = 1024 KB

1 GB (one gigabyte) = 2^10 MB = 1024 MB

Strictly speaking, the prefixes kilo-, mega-, and giga- should be used for the multipliers 10^3, 10^6, and 10^9, respectively, but historically the practice of using multipliers that are powers of two has become established.
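A small sketch contrasting the binary multipliers used in practice with the decimal meaning of the SI prefixes; the 5 GB example size is an assumption.

```python
BINARY = {"KB": 2**10, "MB": 2**20, "GB": 2**30}    # multipliers used in practice
DECIMAL = {"KB": 10**3, "MB": 10**6, "GB": 10**9}   # strict SI meaning of the prefixes

size_bytes = 5 * BINARY["GB"]
print(size_bytes)                   # 5368709120 bytes
print(size_bytes / DECIMAL["GB"])   # about 5.37 "decimal" gigabytes
```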

The Shannon bit and the bit used in computer technology coincide if the probabilities of a zero or a one appearing in a computer bit are equal. If the probabilities are not equal, the amount of information in Shannon's sense becomes smaller, as we saw in the example of the Martian dinosaur. The computer measure of the amount of information gives an upper estimate of the amount of information. Volatile memory, after power is applied, is usually initialized to some value, for example all ones or all zeros. It is clear that right after power is applied to the memory there is no information in it, since the values in the memory cells are strictly determined and there is no uncertainty. Memory can store a certain amount of information, but immediately after power is applied to it there is no information in it.

Disinformation is deliberately false information provided to an adversary or a business partner for the more effective conduct of military operations or of cooperation, for checking for information leaks and the direction of such leaks, or for identifying potential black-market customers. Disinformation (also misinformation) also refers to the very process of manipulating information, such as misleading someone by providing incomplete information, or complete but no longer needed information, by distorting the context, or by distorting part of the information.

The purpose of such influence is always the same: the opponent must act as the manipulator needs. The act of the target against whom the disinformation is directed may consist in making the decision the manipulator needs or in refusing to make a decision unfavourable to the manipulator. But in any case the final goal is the action that the opponent will take.

Disinformation is thus a product of human activity, an attempt to create a false impression and, accordingly, to push towards the desired actions and/or inaction.

Types of disinformation:

Misleading a specific person or group of persons (including an entire nation);

Manipulation (of the actions of one person or a group of persons);

Creating public opinion regarding some problem or object.

Misleading is nothing other than direct deception, the provision of false information. Manipulation is a method of influence aimed directly at changing the direction of people's activity. The following levels of manipulation are distinguished:

Strengthening values (ideas, attitudes) that already exist in people's minds and are advantageous to the manipulator;

Partial change of views on a particular event or circumstance;

Radical change of life attitudes.

Creating public opinion is the formation in society of a certain attitude towards a chosen problem.

