SEO vocabulary - terms and jargon. A Brief Dictionary of SEO Terms

Update (from English "update") - the updating of something, for example a blog update, a search engine index update, etc.

Bot (search robot, spider, crawler) - a special program developed by a search engine's administrators to record changes occurring on the Internet (the appearance of new content on sites, changes to existing content, etc.) in the engine's database. The data the bot collects is entered into the search engine index and subsequently shown in the search results.

Browser - a computer program for viewing websites. By far the most popular browser is Firefox.

VIC (Weighted Citation Index) - one of the indicators of a web page's authority. It is calculated from the number of incoming links to a specific page from other resources, which are in turn weighted by their own "weight" and "significance" on the network.

HF - a high-frequency query; a word or phrase that search engine users look for most often.

Issuance (search results) - the results displayed by a search engine for a specific query.

Doorway generator - a special program that creates doorways. Most often generation occurs in semi-automatic mode, but fully autonomous applications also exist. Generators are divided into desktop and server-based.

Guestbook - a special module that lets site visitors leave messages. On the modern Internet guestbooks are an anachronism, used mainly by spammers to obtain extra links.

Shit site (slang) - a site made not for people but for search engines. In most cases it has no unique content or design and exists solely to make money through link placement and heavy display of advertisements.

Engine - slang name for a CMS (content management system).

Denver - a local server package supporting PHP and MySQL. It is used to create and debug web pages without connecting to real servers on the Internet.

Domain (from French "domaine") - a unique website address on the Internet.

Doorway maker - a person who creates doorways.

Dorgen - the same as "doorway generator".

DSDL - a slang abbreviation meaning "we make sites for people". An alternative acronym is SOM - "sites for people".

Mirror - a copy of a site or of a specific page. It can be created both for safety and for self-serving purposes.

Mirrorer - a special bot (robot) that searches for copies of sites and pages.

IC (Citation Index) - a quantitative indicator of links to a specific site from other resources.

Captcha, CAPTCHA (from English "Completely Automated Public Turing test to tell Computers and Humans Apart") - letters and numbers that the user must enter in order to register somewhere, send a message, etc. It is used for protection against bots.

Cloaking (from English "to cloak" - to disguise) - a technique for misleading a search robot: the search engine is shown one thing and the visitor another (for example, the search engine gets high-quality text while the visitor gets pay-per-click advertising).
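A minimal sketch of the idea (for illustration only - search engines penalize cloaking when they detect it). The bot signatures and page contents below are hypothetical:

    # Serve different content depending on who appears to be asking.
    BOT_SIGNATURES = ("Googlebot", "YandexBot", "bingbot")  # illustrative list

    def pick_page(user_agent: str) -> str:
        if any(sig in user_agent for sig in BOT_SIGNATURES):
            return "<html>keyword-rich text shown to the robot</html>"
        return "<html>pay-per-click ads shown to the live visitor</html>"

    print(pick_page("Mozilla/5.0 (compatible; Googlebot/2.1)"))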

Keywords - the words that occur most often on a site. Competent selection of keywords leads to the site ranking well in search engine results.

Content (from English "content") - the set of materials available on a particular site (graphics, texts, files, etc.). Content determines the value of the site not only for the visitor but also for search engines. Sites with valuable content rank well in the SERP.

Copy-paste (slang) - any text that was not written by the resource owner but copied (copy - paste) from another site.

Copy-paster (slang) - a person who steals someone else's text content.

Copywriter - a person who creates unique content; a profession and one of the ways to make money online.

Cookies (from English "HTTP cookie") - data that is automatically created by a server and stored on the user's computer. Used for purposes such as statistics, authentication, storing personal settings, etc.
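A small sketch of what a server-set cookie looks like, using Python's standard library (the cookie name and value are hypothetical):

    from http.cookies import SimpleCookie

    cookie = SimpleCookie()
    cookie["visitor_id"] = "abc123"          # e.g. an identifier for statistics
    cookie["visitor_id"]["max-age"] = 86400  # keep it for one day

    # The header a server would send; the browser stores the value and
    # returns it with subsequent requests.
    print(cookie.output())  # Set-Cookie: visitor_id=abc123; Max-Age=86400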

Linkopomoyka (link dump) - a chaotic heap of external links on a page. Similar in meaning: link farm and FFA (directories that accept any links).

Muzzle (slang) - the main page of a website, the place a visitor lands after typing the site's domain name. Links from the muzzle are considered the most significant.

Muscle (slang) - the MySQL database.

Cheat - deliberate user actions aimed at falsifying (most often inflating) some site indicator (rating, traffic, etc.).

Nepot - a special filter applied to a resource by a search engine to reduce the weight of its external links.

Noob (from English "newbie") - a beginner; a person who is not very well versed in some subject.

LF - a low-frequency query, i.e. a search query that is not common. Synonyms: "low-frequency", "low-frequency query".

Pag, paga (slang, from English "page") - a web page or site.

Parser (from English "parsing") - a program for automatic text processing. It is most often used by spammers to create special web pages or to process search results.

Parse - to perform targeted automatic processing in order to find the required data. For example, you can parse the search results for the ranking indicators of a web page.
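A minimal parsing sketch using only Python's standard library: extract every link URL and its anchor text from a piece of HTML (the sample markup is hypothetical).

    from html.parser import HTMLParser

    class LinkParser(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = []            # (href, anchor text) pairs
            self._current_href = None

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                self._current_href = dict(attrs).get("href")

        def handle_data(self, data):
            if self._current_href:
                self.links.append((self._current_href, data.strip()))
                self._current_href = None

    parser = LinkParser()
    parser.feed('<p>See <a href="https://example.com">an example</a>.</p>')
    print(parser.links)  # [('https://example.com', 'an example')]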

Spider, robot, bot - an integral part of a search engine that monitors changes in site content and saves the required data to the search engine's servers.

Sandbox - a special search filter (most often attributed to Google) designed to hold down the rankings of newly created sites. Similar filters exist in other search engines as well.

Plagiarism - illegal reprinting and distribution of other people's works without attribution; falls under copyright law.

Plugin (from English "plug-in") - a software module that extends the functionality of a program. Plugins for CMS, for example, are very widespread nowadays.

Search spam - all manner of content created on the web to attract search bots; material that is of no value to a living person.

Run - the process of registering a site on various resources in order to obtain incoming links. Directories, forums, blogs, etc. are most commonly used. In most cases the run is performed by special programs.

Product (slang, from English "productivity") - the actual ratio of the number of visitors to a resource to the number of clicks they make on certain objects.

Proxy, proxy server - a special server that ensures user anonymity. There are paid and free proxy servers.

Pseudo-satellite - a small site created to make money on the Internet; it is not distinguished by high-quality content.

Puzomerka (slang) - counter images that display a site's TIC and PR values as well as other statistics (ratings, visits per day or per hour, etc.).

Frame (slang) - the Rambler search engine.

Ranking - a multilevel process that takes place in the bowels of any search engine. The main goal of ranking is to build adequate (relevant) results for a search query.

Ratio (from Latin "ratio") - the actual ratio of the number of visitors to a resource to the number of clicks they make on certain objects on the site. Most often used in e-commerce (the visitor-to-sales ratio).

Redirect (from English "redirect") - forced redirection of a site visitor to some other resource. Most often used in adult and doorway schemes. The redirect is performed by a special script.
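At the HTTP level a redirect is simply a 3xx status code plus a Location header. A minimal standard-library sketch (the target URL is hypothetical):

    from http.server import BaseHTTPRequestHandler, HTTPServer

    class Redirector(BaseHTTPRequestHandler):
        def do_GET(self):
            self.send_response(302)  # temporary redirect
            self.send_header("Location", "https://example.com/landing")
            self.end_headers()

    # To try it: HTTPServer(("localhost", 8080), Redirector).serve_forever()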

Relevance (from English "relevant") - the actual match between the data entered into a search engine and the results it displays.

Refka, ref-link - a special link to an affiliate program containing an identifier, posted on the referrer's website. The affiliate program's server records each click on it and, as a result, credits commissions to the referrer's account.
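What distinguishes a ref-link is only the identifier appended to the URL. A short sketch (the parameter name and account id are hypothetical):

    from urllib.parse import urlencode

    base = "https://partner.example.com/signup"
    ref_link = f"{base}?{urlencode({'ref': 'user42'})}"
    print(ref_link)  # https://partner.example.com/signup?ref=user42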

Submit (from English "submit") - registration of a resource in search engines, directories, forums, etc., which leads to an increase in site traffic. A separate category is "autosubmit", where registration is done by special automated tools (programs, scripts, etc.). Nowadays autosubmit comes very close to the concept of spam.

Site crashed - full or partial unavailability of the server on which a website is located. The expression is also used when a site becomes very slow.

Sapper (slang) - a moneymaker who earns on Sape (the Sape link exchange).

Satellite - a special site created to assist in promoting the main resource. So-called "satellite networks", i.e. several auxiliary sites, are most commonly used. A satellite can have unique content and rank quite well in the SERP.

SDL (slang) - a site for people.

SEO (from English "Search Engine Optimization") - optimization of a site for search engines.

SICKLE, SERP - the keyword search results loaded into the user's browser ("sickle" is Russian slang for SERP). The higher a site rises in the results, the more visitors come to it.

CJ script - a special script that analyzes a visitor's behavior and performs certain actions to transfer the visitor to one of the traders. Most often used in adult. Through CJ a visitor can wander for hours within the same trading network.

Synonymizer - a script or program that generates text content from existing content by rearranging words in a sentence and substituting synonyms. Texts are often synonymized to create doorways, shit sites, etc.
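A naive synonymizer sketch: word-for-word substitution from a tiny hand-made dictionary. Real tools use far larger synonym bases and try to keep the grammar intact.

    SYNONYMS = {"fast": "quick", "buy": "purchase", "cheap": "inexpensive"}

    def synonymize(text: str) -> str:
        return " ".join(SYNONYMS.get(word, word) for word in text.split())

    print(synonymize("buy cheap fast hosting"))
    # -> purchase inexpensive quick hosting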

Skim - in SEO terminology, a type of traffic. Most often skim traffic comes from CJ scripts. Wholesale purchase of such visitors is widespread.

Script (from English "script") - a program or program scenario file.

Snippet (from English "snippet" - a scrap, excerpt or fragment) - short text about a page's content that appears in the search results, immediately below the page's URL.

Software (from English "soft") - a program or software in general. In slang: softina or softinka.

Spam (from English "spam") - in SEO this term covers everything connected, one way or another, with massive, automated promotion of a resource through link placement.

Splog (from English "splog") - a spammy auto-filled site (or blog) with no unique content, created for moneymaking (selling links, displaying ads, etc.).

MF (midrange) - a medium-frequency query, i.e. a search query that cannot be called high-frequency but is not low-frequency either.

Query statistics - the number of user requests to a search engine for the same keyword.

Eaten pages (slang) - the number of a site's pages in the search engine index.

Thematic directory - a special directory created for posting links on a specific topic. A link from a good thematic directory is very valuable.

Title (slang) - the information contained in the title tag.

TIC - the thematic citation index used by Yandex; a rough analogue of Google PageRank.

Tlog - a specialized blog with little or no text content, consisting mainly of media (pictures, online video, music).

TOP - the highest positions in the search results. Most often, getting into the TOP means a site being shown among the top ten results for a specific keyword.

Springboard - a special page, or even a separate one-page site, whose main task is to display a link to a specific resource.

TrustRank (from English "trust") - an automatically determined level of trust that a search engine assigns to a specific site. Sometimes trust extends to free domains.

Traf, traffic - the number of visitors to a site; also used as a measure of attendance.

Trade, trading - the process of exchanging visitors between resources. Most often trading is done through special scripts.

Unique content - any information available only on a specific resource. Most often intended for humans, but it can also be generated for search engines.

Unique visitor - a visitor who came to a resource for the first time within a certain period. Most often a visitor's "uniqueness" is determined by IP address or cookies.
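A sketch of counting unique visitors by (day, IP, cookie id), as the entry describes; a real analytics system would handle proxies and expiring cookies more carefully.

    from datetime import date

    seen = set()
    unique_visits = 0

    def register_hit(ip: str, cookie_id: str) -> None:
        global unique_visits
        key = (date.today(), ip, cookie_id)
        if key not in seen:          # count each visitor once per day
            seen.add(key)
            unique_visits += 1

    register_hit("203.0.113.7", "abc123")
    register_hit("203.0.113.7", "abc123")  # same visitor, not counted again
    print(unique_visits)                   # 1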

Feed - in SEO terminology, the page a doorway visitor lands on after the redirect has occurred.

Freehost (from English "free host") - free hosting; a place where you can open a website at no cost. A typical example of free hosting is Yandex Narod.ru.

Hitbot - a script or program that inflates ratings, puzomerki, impressions and clicks. Most often used in spam.

Hamster (slang, from English "homepage") - a home page; the personal site of a person, musical group, etc.

Host - in SEO, one unique visitor to a resource.

Hrumer (Xrumer) - one of the most powerful spam programs, widespread on the Internet.

Citation - a measure of the quantity and quality of links to a specific site from other resources. Citation is used by search algorithms and is presented in the form of a so-called citation index, whose value is most often expressed as a number.

Usability (from English "usability") - a measure of how convenient and understandable a resource is for a visitor. It is considered one of the most important points in Internet marketing.

Ya (slang) - one of the Yandex services, "Yandex.Money".

Yaka (slang) - the site directory of the Yandex search engine. It is believed that getting a site into this directory improves its results in Yandex search.

Yaha (slang) - the Yahoo search engine.

Yasha (slang) - the Yandex search engine.

The SEO Specialist's Dictionary
Definitions of SEO concepts

SEO (Search Engine Optimization) - the full range of actions to optimize all internal and external factors of a site in order to improve its position in search engine results.

SEO specialists, or optimizers - specialists who apply optimization rules to attract a search audience, competently carrying out all the necessary procedures on a site to get it into the search engines' TOP.

PR (PageRank) - Google's citation index. It determines the link "weight" of a page and affects placement in Google search results. PR is assigned separately to each page of a site.

Down - unavailability of the server or hosting; downtime. An unpleasant thing, since your site is accordingly inaccessible to visitors as well.

Search engine indexing - adding a page to the search engine index. From that moment the search engine knows about the page, and it will be displayed in search results at the position it deserves in the engine's opinion (ranking).

IC - citation index; the number of referring sites.

Search engine cache - a snapshot of a site page stored by the search engine: the version of the page recorded at the time of indexing. To see a newer version of the page, you need to click "refresh".

Keywords (keyword, key, query) - the words and phrases by which people will find your site in search engines.

Cloaking - a technology whereby one version of a page is served to a person and another version to the search engine robot (spider).

Content - the material a site is filled with: its texts and other content.

Muzzle - a site's home page.

Spider (indexer) - a robot that scans the content of pages on the Internet by following links and adds site content (the HTML code of pages) to the database.

Positions in the search results - the TOP, or "sickle" (SERP - Search Engine Results Page): the search results page of a search engine.

Pessimization - artificial lowering of relevance by the search engine.

Ref-link (ref-code) - the so-called referral or affiliate link, which in addition to the main address (URL) contains extra characters (a code) identifying the account of the person who invited you to follow the link. Payments you make on that site are recorded and credited to the account of the person whose link you followed.

Ref (referral) - a person who registered an account in an affiliate program via your link.

Search engine ranking - determining the place a page occupies in the search results.

Relevance (from English "relevant" - pertinent, appropriate) - with respect to search engine results, the degree of correspondence between the query and what was found.

Rewriting - derived from copywriting: if copywriting is the writing of original text, rewriting is the alteration of existing text. Ideally, rewriting aims to produce texts that are similar to the originals but unique from the search engines' point of view.

Submit - the procedure of adding information about your site to directories.

SDL - a site for people.

Delivery system - the system that arranges sites in a certain order when a query is entered into the search form.

Semantic core - the set of keywords selected for a site.

Snippet - a part of the page text, usually containing the words of the search query, that the search engine displays in the results for a given query as a description of the proposed sites.

Link ranking - taking into account the text of external links to your site.

Counter - a script that counts the number of visitors to a site. A most important optimizer's tool.

TIC - the Yandex thematic citation index. The Yandex analogue of Google PR, but calculated cumulatively for the entire site.

VIC - an internal scale for calculating site importance, used by Yandex and available to it alone.

Thematic users - users who are interested in the topic of your site.

Topic - the subject matter of the content of a web page.

Traffic - the flow of a site's visitors.

Host (hosting) - hosting is a service for placing someone else's website on your web server. The sites that provide this service are also called hosts.

Hoster - a company offering hosting services.

YaK - the Yandex directory.

I continue publishing the list of abbreviations started in a previous post. So let's continue...

  1. HF query (high-frequency) - a query that Yandex or Google users enter very often (probably somewhere over 10,000 requests per month). High-frequency queries are also distinguished (as a rule) by an increased level of competition in the search results, so it is especially difficult to rank for them. You can check how often users enter particular queries in the search engines' query statistics services.
  2. Mid-frequency query (MF) - usually understood as queries entered several thousand times a month. This number is individual for each topic and depends on its popularity on the Internet. Competition here is, as a rule, somewhat lower, but still considerable.

Query frequency, updates, nausea, bans and relevance

  1. LF query (low-frequency) - a low-frequency query, entered several hundred times a month. You can rank for low-frequency queries without any external optimization at all; competently carried-out internal optimization (interlinking articles) is enough.
  2. Through link ("draft") - a link present on all pages of a site (the main page plus all internal ones). A through link is good, but when promoting for queries (to reach the top) it has no great advantage over a regular link from the home page.
  3. Yaka - the Yandex directory.
  4. DMOZ - the Open Directory Project, a popular international directory (dmoz.org). It has a great influence on PR during promotion (there is an opinion that this does not apply to the Runet). Adding a site to DMOZ is free of charge. A Russian-language resource can be added at dmoz.org/World/Russian.
  5. Cloaking - an optimization method in which the search robot is served one version of content and the resource's visitors another. If detected, it is punishable by a ban (exclusion of the resource from the search engine's database).
  6. Pessimization of a site - the lowering of a resource's position when it is ranked. It usually results from the use of "black" optimization methods. Yandex and Google apply it in order to increase the relevance of their search results.
  7. Search relevance - the match between the answer and the question. Two components matter: completeness (nothing is lost) and precision (nothing extra is found).
  8. Ban - prohibiting a resource from being indexed by one or more search engines. For example, if Yandex has banned you, your site will not be found for any query.
  9. Nausea - the degree of unnaturalness of a text. Nausea rises with obvious overuse of keywords, a large number of strong tags, etc. It does not necessarily lead to pessimization, much less to a ban, but it can spoil the search engines' opinion of your resource.
  10. Search spam - any technique or text designed only for search engine robots and not intended for a "live" visitor - for example, placing hyperlinks invisible to the user (say, as a transparent 1×1 image) or creating so-called link farms (link dumps): web pages with a huge number of hyperlinks to various resources.
  11. AP (update) - an update of the Yandex index (with Google everything happens almost in real time) leading to changes of positions in the search results. Updates happen in two ways: instant cache clearing and gradual replacement. In the first case we see strong changes in the results right at the start of the update, because the cache is cleared and the results are generated in real time; in the second, the results are updated gradually.
  12. Search engine SERP (search engine result page) - the set of links to sites, with descriptions, returned as the result of a search. You entered a query and received a list of resources that, according to the search engine, match it. This list is the search results.
  13. The semantic core - the complete list of queries for which a project will be promoted. The structure of a site should be developed only after compiling the semantic core: it gives you an idea of the sections of your future website and the material that will need to be posted on it.
  14. Link ranking (Link Popularity) - the influence of the text of a hyperlink placed on someone else's site on the relevance of your web page for the search queries contained in that text (the link's anchor). This is anchor-text search - perhaps the most interesting of the criteria influencing document ranking in the search results.

    Link ranking builds a hierarchy based on authority in Yandex or Google search results. If many other resources link to a resource, it is authoritative. A webmaster cannot control all the referring websites, so the link criterion is more objective than the textual one. Complementing each other, the two criteria allow search engines to deliver relevant results.

  15. Link popularity - a measure of how many hyperlinks from external resources lead to your site, document, or the domain on which it is located. Link popularity varies from search engine to search engine, since each has its own database, and the number of links stored in different systems' databases differs. Google PR and Yandex VIC are indicators of link popularity.
  16. Free-for-all (FFA) pages - pages where anyone can add their own hyperlink. Since the process is automated, there is no way to control it: thousands upon thousands of off-topic links are found on FFA pages. Their only goal is to increase link popularity.

Anchors, engines, types of optimization, parsers and sandbox

  1. Anchor - the text of a link.
  2. "Black" optimization - promotion by dishonest methods; search engine spam. To achieve results, it uses invisible text, doorways with redirects, cloaking, and link spam (placing hyperlinks invisible to the user).
  3. "Gray" optimization - "blind" optimization that ignores the specifics of a project. It is carried out, for example, by creating doorways without redirects or by chaotic link exchange. The effect of such optimization is minimal.
  4. "White" optimization - work on the content and structure of a site to make it as convenient as possible for visitors and available for indexing by search engines. It is carried out by optimizing site navigation, cleaning up the code, adding content, and placing links on thematic resources.
  5. Engine - the program code on which a site is built. Most often this means a CMS (Content Management System). Optimization puts forward a number of requirements for engines that many CMS developers usually neglect.
  6. Fat link (slang) - a hyperlink from a web page with great authority (TIC, PR). The importance of that referring page is, in turn, the higher, the more numerous and the more authoritative the links leading to it.
  7. Sharpen - in SEO, to optimize a page or text for a particular keyword.
  8. Cybersquatters - people who register domains for resale. A cybersquatter's margin can exceed several hundred percent.
  9. Muzzle links - links from sites' main pages (muzzles).
  10. Nepot filter - a search engine's refusal to take hyperlinks from spam sites into account.
  11. Nepot list - a list of resources that fell under the restrictions imposed by the nepot filter.
  12. Parser - a program that automatically processes (parses) website pages in order to obtain the necessary data. There are many kinds of parsers, depending on the task - for example, parsers for collecting keywords from the wordstat.yandex.ru service.
  13. Parse - the process of analyzing (parsing) data, the main purpose of which is to find the required elements in a file (links, logical parts of the text, etc.).
  14. You can parse the search results for a resource's positions, parse a page to find a particular link on it, and so on.
  15. The sandbox - one of the most serious obstacles a new project must overcome before getting into the search results. The Google sandbox does not allow new resources to improve their positions artificially. Not every search engine has such filtering. Sites can stay in the sandbox from two weeks to a year; during this period a new project will have time to reach high positions in other search engines, but not in Google.

Good luck to you! See you soon on the pages of the blog.


Beginners in this field often encounter many unfamiliar terms and concepts, and in most cases this is exactly why novice optimizers run into problems and questions. Believe me, knowing and understanding basic SEO terms will significantly increase the speed at which you absorb information about SEO. That, in turn, will let you grasp the intricacies of the field quickly, saving both your time and your money. To help you get oriented in SEO, familiarize yourself with the most common SEO terms below. So, let's begin.

Russian SEO terms:

AP, or update - an update of the database of one of the search engines. By the type of data updated, updates are usually divided into: TIC update, PageRank update, Yandex catalog update, search results update, site mirrors update, favicon update.

Search engine ban - the complete removal of absolutely all pages of a particular site from the search engine's database and, accordingly, from its search results.

Backlinks - incoming links to a site that are indexed by search engines.

HS (slang) - a site made specifically to extract some (most often material) benefit for its owner. Such sites are usually promoted only by "black" SEO methods.

Keywords - words that help search engines determine the topic of a particular web page.

Meta tags - specific information about a web page, enclosed in special HTML tags and providing structured metadata about it. This information helps search engines determine the topic and purpose of each page.

Organic search - search results that a webmaster receives without paid advertising; in other words, traffic attracted to a site solely by optimizing the content of its pages.

Pessimization - a lowering of a site's positions in the search results when a search engine detects artificially generated interest in a page. If a site is promoted only in this way, absolutely all of its pages can fall under pessimization. Most often pessimization is a direct path to a ban if the situation is not corrected in time.

Keyword density (nausea) - the frequency of use of a specific keyword or key phrase on a specific page of a site. With excessive nausea, Yandex may treat the page as spam, and an ordinary user will find such text inconvenient to read, to put it mildly.
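Density is usually computed as the number of occurrences of the keyword divided by the total word count. A simple sketch:

    import re

    def keyword_density(text: str, keyword: str) -> float:
        words = re.findall(r"\w+", text.lower())
        if not words:
            return 0.0
        return words.count(keyword.lower()) / len(words) * 100

    text = "seo terms and more seo terms about seo"
    print(f"{keyword_density(text, 'seo'):.1f}%")  # 37.5%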

Search engine - a hardware and software complex, accessed through a particular site, whose main purpose is to help the user find the information he is interested in.

Search queries (LF, MF, HF) - what ordinary users enter into the search form of search engines. Depending on how often users search for them, queries are divided into low-frequency, mid-frequency and high-frequency.

Ranking - the ordering by a search engine of site pages according to their maximum correspondence to a certain query, from the point of view of the engine's algorithm. Sometimes this view does not reflect the interests of the users doing the searching.

SDL - a site made for people.

Title (the HTML title tag) - an informational element that defines the actual "title" of a particular web page. The content of the title tag is usually displayed in the browser's top bar; it is also shown in the search engines' SERP as the hyperlink text.
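A sketch of pulling the title (and the meta description often used for the snippet) out of a page with Python's standard-library parser; the sample HTML is hypothetical:

    from html.parser import HTMLParser

    class TitleMetaParser(HTMLParser):
        def __init__(self):
            super().__init__()
            self.title = ""
            self.description = ""
            self._in_title = False

        def handle_starttag(self, tag, attrs):
            a = dict(attrs)
            if tag == "title":
                self._in_title = True
            elif tag == "meta" and a.get("name") == "description":
                self.description = a.get("content", "")

        def handle_endtag(self, tag):
            if tag == "title":
                self._in_title = False

        def handle_data(self, data):
            if self._in_title:
                self.title += data

    p = TitleMetaParser()
    p.feed('<head><title>SEO terms</title>'
           '<meta name="description" content="A short glossary."></head>')
    print(p.title, "|", p.description)  # SEO terms | A short glossary.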

TIC - the thematic citation index of the Yandex search engine, used for quality ranking of sites in the SERP. It is designed to assess the quality of content not just of a specific page (as Google PageRank does) but of the site as a whole. Another distinctive feature of TIC is the priority it gives to the thematic proximity of the referring sites. The number and authority of linking sites also play a huge role in search engine optimization.

Top results - positions on the first page of the search results (SERP). These positions collect the vast majority of SERP traffic.

English SEO terms:

LSI (Latent Semantic Indexing) - an algorithm used by Google (and quite possibly by other search engines) to determine the relevance of the words used on a particular website page.

PageRank (PR) - Google's algorithm that calculates the "importance" of a particular web page based on the quantity and quality of the other web pages linking to it.
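The simplified formula published by Google's founders is often quoted. Here d is the damping factor (commonly set to 0.85), T_1, ..., T_n are the pages linking to page A, and C(T) is the number of outgoing links on page T:

    PR(A) = (1 - d) + d \left( \frac{PR(T_1)}{C(T_1)} + \cdots + \frac{PR(T_n)}{C(T_n)} \right)

Since each page's PR depends on the PR of the pages linking to it, the values are computed iteratively until they converge.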

SEO (Search Engine Optimization) - the actions required to optimize a site for search engines.

SERP (Search Engine Results Page) - the list of web pages a search engine displays after a user enters a specific search query. This list is also called the search results.

You do not need to memorize all the terms at once; it is enough to print out the SEO terms above, and over time you will come to understand them easily and apply them in practice. Basic SEO terms will be posted here from time to time, so be sure to check back periodically!

Search engine (PS, search engine) - a system that helps the user find the necessary information on the Internet; examples are google.ru, yandex.ru, rambler.ru.

SEO (Search Engine Optimization) - search engine optimization of a site: activities to improve the site's code, content and structure. All this is done to raise the site's position in the results of search engines for certain user queries.

SEO optimizer (SEO specialist, sometimes simply a webmaster) - a person who optimizes sites for search engines.

Query (search query) - what the user types into the search bar of a search engine; most often a short, well-formed phrase, for example "seo terms".

Query statistics - information about user requests to a search engine by keywords.

Content - what an information resource, such as a website or blog, is filled with: its information. Content can be text, graphics, multimedia (video, audio) - anything from which information can later be obtained.

High-frequency query (HF) - a query that users search for very often. Query frequency is the number of times a specific query is made to a search engine by users per month.

Mid-frequency query (MF) - a query that is neither low-frequency nor high-frequency.

Low-frequency query (LF) - a query that users do not search for often, i.e. a query with a low frequency.

Relevance - the degree to which a page matches the corresponding search queries.

Bot (robot, spider, search robot) - a program that is an integral part of any modern search engine. It is designed to crawl site pages and enter them into the engine's database for subsequent indexing. In its principle of operation, the spider resembles an ordinary browser: it visits a page, downloads it, and then saves information about the page to the search engine's database.

Indexer - a program designed to index the pages of sites or documents on the network.

Indexing - the process of processing the pages collected by the spider, performed by the indexer program, as a result of which the search engine index is formed.

Index - the search engine's database, created in the process of indexing the pages collected by the search bot.

Citation - the quantity and quality of links, expressed as one of the citation indices, for example IC/TIC (Yandex, Aport) or PageRank (Google).

IC (citation index) - the citation index of a document or site.

YAK - the Yandex catalog.

TIC - the thematic citation index. It determines the place a site occupies in the YAK, and TIC is also used to determine the cost of placing links on a site if its owner sells them.

VIC - the weighted citation index. If all the parameters of a site and of the sites linking to it are taken into account, the result is the VIC. It determines a site's position in Yandex search results.

PageRank (PR) - the citation index according to Google; it plays the same role as Yandex's citation indices.

Link - a hyperlink leading from one document/web page/site to another. The most popular search engines take links into account when ranking results in the SERP; a link is therefore a very important tool for an SEO optimizer.

Ranking - the ordering by a search engine of web pages (sites) according to their maximum correspondence to a specific query.

Link ranking - an algorithm used by search engines to calculate the relevance of a site to a query. It is based on the analysis of links to the site that include a keyword. Link ranking is one of the factors affecting a site's overall relevance to a query.

SERP (search engine result page; search results, "sickle") - the page a search engine returns to the user in response to a query, containing links to the resources that, in the engine's opinion, are relevant to that query.

Top results - positions in the SERP that bring tangible traffic. Usually the top is understood as positions 1-3 of the results, less often 1-10.

AP (update) - a recalculation of positions and indicators in accordance with the new data collected by the search engine since the last update.

Pessimization - artificial lowering of relevance by the search engine in response to the site owner artificially inflating it.

Ban - removal of a site or page from the search results.

Muzzle - the main page of a site; it usually has the highest TIC and PR values.

Muzzle drop-out - the main page no longer appears in the search results.

Cloaking - a "black" search engine optimization technique (prohibited by search engines) in which the information served to the user and to search robots on the same page differs. This can be achieved, for example, by making the important text served to the search engine invisible; other implementations are also possible.

Doorway - a site or page created and optimized for a specific search query in order to take a high place in the search engine results. Doorways are a prohibited SEO technique.

Parsing - automatic processing performed to obtain required data: for example, parsing the SERP for a site's positions, or parsing to collect the addresses of sites on a desired topic.

Parser - a program used for parsing.

Submit (rega, registration) - registration of a resource (in our case, a website) intended to help increase citation. Accordingly, autosubmit is automatic registration (several times inferior to manual registration in quality, but many times superior in speed).

Submitter (registrar) - a program that increases the speed of submission, and perhaps even its quality; an example is AllSubmitter.

The article is taken from open sources: http://dleman.ru/faq/1103-seo-terminy.html