SEO Dictionary
Term - definition
.htaccess - Apache's directory-level configuration file, which can be used for password protection and redirecting files.
Always make a copy of your current .htaccess file before editing it, and avoid experimenting on a site where downtime would cost you too much.
301 - Moved Permanently - The file now has a new location; a permanent server redirect. This is the standard form of redirection for most websites and pages. If you are thinking of moving a site to a different location, consider relocating just a single file or folder first and seeing how it ranks. Depending on site authority and crawl frequency, it can take anywhere from a few days to several weeks for a 301 redirect to be picked up.
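On an Apache server, a 301 redirect is typically set up in the .htaccess file. A minimal sketch (the file paths and domain below are hypothetical examples, not real locations):

```apacheconf
# Permanently redirect a single moved file (hypothetical paths)
Redirect 301 /old-folder/old-page.html http://www.example.com/new-folder/new-page.html

# Or redirect an entire old hostname to a new one using mod_rewrite
RewriteEngine On
RewriteCond %{HTTP_HOST} ^old-domain\.com$ [NC]
RewriteRule ^(.*)$ http://www.example.com/$1 [R=301,L]
```

After uploading the change, request the old URL in a browser (or with a header-checking tool) to confirm the server answers with a 301 status rather than a 302.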
404 - Not Found - The URL could not be located on the server. Some content management systems send 404 status codes even for documents that exist. Make sure that existing files return a 200 status code and that only missing files return a 404. Ask your host whether you can set up a custom 404 error page; it makes it easy to show visitors your popular or most relevant navigational choices. Search engines request a robots.txt file to determine which parts of your site they may crawl, and most browsers request a favicon.ico file when loading your site. Creating both files keeps your log files clean, allowing you to focus on genuine site errors.
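On Apache, a custom 404 page can be configured with a single .htaccess directive. A minimal sketch (the file name is a hypothetical example):

```apacheconf
# Serve a custom error page for missing files (path is hypothetical)
ErrorDocument 404 /custom-404.html
```

A plain-text robots.txt file at the site root - even one containing only "User-agent: *" followed by "Disallow:" - will likewise stop 404 entries for robots.txt from cluttering the logs.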
Above the Fold – Originally a newspaper term for the upper half of the front page; in web marketing it refers to the portion of a page visible before scrolling. Some describe ‘above the fold’ as an ad position at the top of the screen, but because of banner blindness, common ad locations do not perform as well as ads that are properly integrated into content. Ads that match the look of the surrounding content tend to perform better. Google's Top Heavy (page layout) algorithm penalizes sites that place too many advertisements above the fold.
Absolute Link – A link that specifies the linked page's full URL. Links often use relative paths rather than the full reference URL inside the a href tag. Absolute links are generally preferred over relative links because of hijacking and content-scraping concerns.
Absolute link sample:
<a href="http://seobook.com/folder/filename.html">Cool Stuff</a>
Relative link sample:
<a href="../folder/filename.html">Cool Stuff</a>
Ad Retargeting (see retargeting)
AdSense - Google's contextual advertising program. Publishers display relevant ads in or near their content and share the resulting click revenue with Google.
AdSense offers an accessible, automated source of ad revenue, which helps publishers establish a baseline for the value of their ad inventory. AdSense often underprices that inventory, but that is the expected trade-off for automating ad sales.
AdSense pricing models include:
- cost per click - meaning promoters will only pay if someone clicks on their ad
- CPM - advertisers pay per ad impression. Sites can be targeted via specific category, keyword or relevant demographic information.
The ad formats include:
-text
-graphic
-animated graphics
-video
Click through rate (CTR) can be impacted by:
-a site’s function
-commercial orientation of a site
-the relevancy of advertisers in their vertical
Also worth mentioning: being overly aggressive with site monetization can prevent a site from ever becoming very profitable, so restraint matters.
Effective monetization models depend on vertical:
-AdSense
-selling your own services
-affiliate marketing
-direct ad sales
AdWords - Google's pay per click advertising platform, widely used for basic website advertising. Most AdWords ads are keyword targeted and sold on a cost per click basis, where ad position depends on click-through rate in addition to the maximum bid. Google has also extended its ad network into video ads, print ads, demographic targeting, and affiliate ads.
AdWords is a multifaceted marketplace.
AdSense site - (MFA) Made For AdSense - websites created primarily as a venue for AdSense advertisements. These sites tend to add little value for visitors. An offline analogue would be much of television programming, which exists largely to carry advertising.
Affiliate Marketing - Merchants extend their market reach and mindshare through affiliate programs by paying on a cost per action (CPA) basis: affiliates are paid only when a visitor completes the specified action.
Most affiliates earn little, for a few reasons: they are not dynamic marketers, they lack motivation, and they chase get-rich-quick programs that do not work. A small number of affiliates make thousands or even millions of dollars a year through deep automation and the ability to tap large traffic streams. Characteristically, niche affiliate sites outperform broad ones (per unit of effort) because they convert at a higher rate.
It’s more difficult to sell a conversion than to sell a click, and search engines work to filter out thin affiliate sites using:
-duplicate content detection algorithms
-manual review
-landing page quality scores applied to paid ads
AJAX - Asynchronous JavaScript and XML lets a web page request data from a server without loading a new page.
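A minimal sketch of the idea, using the modern fetch API rather than the original XMLHttpRequest (the URL and JSON field are hypothetical examples):

```html
<!-- Load data into the page without a full page reload (URL is hypothetical) -->
<div id="result">Loading…</div>
<script>
  fetch("/api/data.json")                 // asynchronous request to the server
    .then(response => response.json())    // parse the response body as JSON
    .then(data => {
      // update part of the page in place; no new page load occurs
      document.getElementById("result").textContent = data.message;
    });
</script>
```

The key point for SEO is that content loaded this way is not part of the initial HTML, so search engines may not see it unless it is rendered or served another way.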
Alexa – A search service which is owned by Amazon.com and can be used to determine website traffic.
Alexa is free, and its data is biased toward sites popular with marketing and webmaster communities.
Algorithm (algo) – This is employed by search engines to decide which pages are the most applicable to a specific search term.
Google frequently updates its algorithm.
The quality of a website’s content will always be the most important.
Alt Attribute - Provides a text equivalent for an image, helping search engines (and blind or visually impaired users) understand what the image shows.
Example usage:
<img src="http://www.seobook.com/images/whammy.gif" height="140" width="120" alt="Press Your Luck Whammy." />
Alt Text - A description of a graphic that the end user will only see if the graphic is undeliverable. Alt text is important because search engines are unable to distinguish one picture from another. Alt text, when correctly used, is a precise depiction of the accompanying picture. Those who are blind or partially blind can use alt text to understand graphics.
Analytics - Software that collects and analyzes data about website usage, tracking page views, conversion figures, and user paths from log files or via a JavaScript tracking code; Google Analytics is a common example.
Marketers who take advantage of the opportunity to track the actions of visitors will benefit more than those who do not.
Anchor Text – The text users click on to follow a link. When the link is an image, the image's alt attribute serves in place of anchor text.
When links are naturally occurring, they generally have a vast range of anchor text combinations. Too much similar anchor text can end up being filtered. Anchor text should be mixed up when building links.
Example of anchor text:
<a href="http://www.seobook.com/">Search Engine Optimization Blog</a>
When targeting Google, the general idea is that no more than 10% to 20% of anchor text should be the same. Backlink Analyzer can be utilized to examine the anchor text profile of your competition.
Android – Google's operating system, which powers cell phones and other devices.
API - Application Program Interface - a set of routines for accessing a program's functions. Most major search products offer an API.
Arbitrage – Exploiting market inefficiencies: buying items and re-selling them at a profit. On the web, thin content sites filled with AdSense advertisements buy traffic from larger search engines in the expectation that a portion of that traffic will click on the ads. Arbitrage is also common with shopping search engines, as buying traffic is a common way they acquire it.
ASP - Active Server Pages - Microsoft's server-side scripting technology for building dynamic web pages.
Authorities - Topical authorities are reliable and respected sites. A topical authority is a page that is referenced by many other topical authority and hub sites; a topical hub, in turn, references the authorities.
Topical authority examples:
-big field brands
-a reputable blogger discussing your subject/topic
-Wikipedia page on your subject matter
Authority - How trustworthy or reliable a site is for a specific search query. “Authority” is a result of correlated incoming links from sites that can also be trusted. This directly relates to a domain's ability to perform well in rankings. The publication of original content is important, as is site history, site age and trends in web traffic.
Authority Site - A site with inbound links from associated hub sites; a site which is trusted. Concurrent citation from principal hubs allows an authority site to have better search results placement. Wikipedia is an authority site.
Automated Bid Management Software - Pay per click search engines are becoming more intricate. Some search engines utilize software that helps users control their ad spending. Tools are available that help look at conversion and ROI rather than simply viewing price per click.
Prevalent bid management tools include:
-KeywordMax
-Atlas OnePoint
-BidRank
B2B - Business to Business.
B2C - Business to Consumer
Backlink (back link) - (incoming link) A link to one page from another page. Usually referred to as a link from one website to another website.
Bait and Switch – A deceptive marketing method that draws attention by appearing to have a specific purpose, but once authority or trust is gained, the purpose is changed.
Sites that are commercial might try to appear to be more informative or noncommercial, for example.
Behavioral Targeting - Ad targeting that revolves around past searches. If one recently searched for engagement rings on Google, and later visits a completely different website, they might see an advertisement from a jeweler on the other website.
Bid Management Software (see Automated Bid Management Software)
Bing - Search engine for Microsoft; also controls the organic search results for Yahoo! Search
Bing Ads - Paid search program for Microsoft; a competitor for Google AdWords; controls paid search results on Yahoo! Search.
Black Hat - Search engine optimization tactics that run counter to best practices, such as the Google Webmaster Guidelines.
Black Hat SEO - Black hat SEO covers practices such as artificial growth hacking, scraping, keyword stuffing, and link schemes, any of which can earn a ranking penalty or an outright ban from results pages. Search engines label deceptive marketing practices as black hat SEO (white hat SEO techniques, by contrast, operate within the guidelines).
Search algorithms can be tested to understand how search engines work which can be helpful overall.
Block Level Analysis – This is the process of breaking pages down into smaller blocks or points. This can assist in distinguishing if a page is an advertisement, or a component of a navigational system, and help with web search performance.
Blog – A blog is comparable to an online diary or journal, with the most recent blog entry appearing first. A blog can provide information on a product or service, or even advice, and generally allows interaction from visitors as they can post comments. Blogs can be incredibly influential as visitors continuously return to read recently posted information. The primary focus of bloggers is content creation.
Popular blog sites include Typepad, Blogger, Movable Type, and Wordpress.
Blog Comment Spam – Comments that offer no relevance or value, left either manually by a person or automatically by software. An automated spam comment might be left by software promoting a product such as Viagra. A manual one might read something like, “We don’t know each other, but check out my fitness page!”
Blog spam comments are beginning to look more realistic, as fake personalities can be created to converse and respond to one another.
Blogger - A free, Google-owned blogging platform, which makes it quite popular. It can be used to publish sites and is easy for a novice to learn. Those serious about making money online should publish content to their own domain rather than build equity on a website owned by someone else.
Blogroll - This is a list of links found on a blog that generally link to friends' or associates' blogs, or blogs which are owned by the same company.
Bold - A heavier font weight. Bold text stands out more than italic or plain text, so it is more likely to be read by someone scanning a web page. Use it sparingly so the page still appears natural.
Example use:
-<b>words</b>
-<strong>words</strong>
Either would render the word in bold.
Bookmarks – Users can bookmark their favorite pages. If a page attracts a large number of bookmarks, it can be assumed to contain worthwhile, quality information, and search engines may ultimately use bookmark data as a relevancy signal. Social bookmarking sites are also called tagging sites; Del.icio.us is one example, and Yahoo! MyWeb lets users tag results. YouTube and Google Video let users tag content as well.
Some meta news sites let users tag pages. When one receives many votes for their story they can ultimately find the story on the homepage. Digg and Slashdot are news sites that let visitors vote on content.
Boolean Search – A way to refine search results using the logical operators AND, NOT, and OR. Most search engines default to AND: a search for SEO Book returns results containing both SEO and Book.
Have a look at some other examples listed:
-A Google search for the quoted phrase "Miniature Dachshund" will return only results containing that exact phrase.
-A search that is done on Google for Miniature Dachshund -Max will reveal results that consist of Miniature AND Dachshund but nowhere will the word “Max” be found in said results. It will be as if Max was not included in the search at all.
There are search engines that let users look for other filtering patterns. Have a look:
-A number range: 9..20 would match numbers that fall between 9 and 20.
Bot - (robot, crawler, spider) A program that performs a task automatically. Search engines use bots to find pages and add them to their search indexes. Spammers, on the other hand, use bots to “scrape” content in order to steal or plagiarize it.
Bounce Rate - The percentage of visitors who view a site and then leave without viewing any other pages.
Brand – This is essentially how customers will connect to a company.
Brands are created through interaction with clients. In terms of the web, search engines will seek out indications such as recurrent web traffic and utilize them with algorithms such as Panda.
Branded Keywords - Keywords which are connected with a brand. These will be valuable keywords or phrases. Different programs might push for affiliates to bid on core branded keywords while others discourage it.
Bread Crumbs – This is used to describe the Web site navigation that would appear in a horizontal bar, usually on the top of a page over the primary content. It assists visitors in navigating the site.
Breadcrumb Navigation – A navigation scheme that gives visitors easy access between pages and helps them better understand a site's structure.
Sample of breadcrumb navigation:
Home > SEO Tools > SEO for Firefox
This essentially provides organization between pages.
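In markup, the breadcrumb trail above might be sketched as a simple chain of links; a minimal example (the class name and URLs are hypothetical):

```html
<!-- Breadcrumb trail: each level links back to a parent page (URLs hypothetical) -->
<div class="breadcrumbs">
  <a href="/">Home</a> &gt;
  <a href="/seo-tools/">SEO Tools</a> &gt;
  SEO for Firefox
</div>
```

Note that the current page ("SEO for Firefox") is plain text rather than a link, since visitors are already on it.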
Brin, Sergey - Google Co-Founder
Broken Link - A malfunctioning hyperlink that fails to take visitors to where they are trying to go.
Here are some reasons as to why a link might be broken:
-a website has gone offline.
-the content might have been meant to be temporary
-the location of a page might have been moved
It is not uncommon for bigger websites to have a couple of broken links but there should not be too many. There could be a large amount of content which is outdated if there are numerous broken links. Broken links negatively impact search engine results.
Xenu Link Sleuth is a free program that finds broken links.
Browser - The client program used to view content on the web.
The most well-known web browsers are Microsoft's Internet Explorer, Opera, Mozilla's Firefox, and Safari.
Buying Cycle – This is the cycle that buyers go through before making a big purchase. It entails looking up products or services that match what they want. Branded keywords are used to help reach buyers and will have more noticeable rates of conversion.
Here are the stages that usually make up the buying cycle:
-Problem is discovered: The potential buyer finds they need something.
-Search: The potential buyer will search for keywords that will help to solve their problem, for instance, if they need a lawn care company to mow their grass, they might search for “lawn care in Orlando.”
-Evaluate: They will compare different businesses and read reviews.
-Decide: They will determine what matches their needs.
-Purchase: They will buy the product or service.
-Reevaluate: They might review their purchase and experience.
Cache - A copy of a web page stored by a search engine. Some people don't realize that a web search actually looks through files in the search engine's index, not the live web. Many search engines present a link to a page's cached version alongside the listing.
Calacanis, Jason - Founder of Weblogs, Inc. and a popular blogger; he later became general manager of the Netscape site.
Canonical Issues – Issues involving duplicate content, which is very difficult to avoid entirely. The canonical version is the legitimate, preferred version of a page; a 301 server redirect to it effectively resolves canonical issues. In other words, one can avoid duplicate content problems by specifying the preferred, canonical version of the site.
Canonical URL – Many content management systems are riddled with errors that leave duplicate content indexed under multiple URLs, because of varying link structures. The trusted version of a URL is the canonical version, and search engines consolidate ranking signals such as PageRank onto it.
Webmasters should use consistent linking structures throughout their sites. It is a good idea to end internal links at a trailing / rather than including index.html in the URL.
Here are some samples of URLs that could potentially utilize the same information even though the web addresses vary:
-http://website.com/
-http://website.com/index.html
-http://www.website.com/
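One common fix is a rel=canonical tag in the head of each variant URL, naming the preferred version so search engines consolidate the duplicates; a minimal sketch (the domain is the hypothetical one from the list above):

```html
<!-- In the <head> of every variant URL, name the preferred version -->
<link rel="canonical" href="http://www.website.com/" />
```

A 301 redirect from the non-www hostname to the www hostname (or vice versa) accomplishes the same consolidation at the server level and is generally the stronger signal.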
Catch All Listing – A method employed by pay per click search engines to monetize long tail terms that marketers are not yet bidding on. It is most effective on competitive keywords, and because the clicks are already pre-qualified, such listings can also be valuable for directories.
Chrome - A web browser for Google, and also an OS
Click Fraud - Advertisers pay to appear higher in sponsored listings by bidding on specific search terms; clicks that never convert and that commonly come from the same ISP are a sign of click fraud. Competitors may use it to drain a company's ad budget and push it out of the search market; a publisher engaging in click fraud is typically trying to inflate their own ad earnings fraudulently.
Cloak – Serving different content to a search engine spider than is shown to users. It is a risky practice that can get a site removed from search results entirely.
Cloaking - Showing altered content to search engines. It can get a site removed from search engine results, though it is considered legitimate within certain guidelines, such as modifying the user experience based on a visitor's location.
Clustering – This is done with search results so that the listings are limited and grouped or clustered together. This makes the results look more organized. Hubs can be grouped together on a specific topic, too.
CMS - Content Management System – This is a feature that makes the process of adding information to a site simple.
A prevalent content management system would be a blog software program. Duplicate content can end up being a concern because of errors that are found within content management systems that impact indexing content. One can use a blog program like Wordpress and not need skills in coding.
Co-citation – Links that appear close together on a page may be considered topically related by authority-based search algorithms.
Code Swapping - (bait and switch) Modifying content once high rankings have been obtained.
Comments – Visitors to blogs and other sites are able to leave feedback or comments that are relevant to the content. This allows for interaction.
Comments Tag – A way to annotate source code with notes that do not render on the page.
HTML comments in the source code of a document will come up as something like <!-- your comment here -->. They can be viewed but will not show up in the typical formatted HTML version of a file. An outdated technique involves stuffing keywords in comment tags, but it doesn’t seem to be effective anymore as it just brings risk to a site.
Compacted Information - Information that correlates with a product. For instance, ISBN is something that goes with the majority of published books.
Product databases are proliferating online, so having exclusive, original content associated with such data is important for getting it indexed.
Concept Search – This is when a search is performed to match results on a conceptual level instead of just with words.
An example of this would be a search engine comprehending that a specific search term is relevant to another term. It is looking for search intent instead of just the actual search.
Conceptual Links – These are links that a search engine will try to comprehend beyond only words. Intent in a search is considered. This is an advanced technique that sometimes examines co-citation, too.
Content (copy or text) – This is the part of a web page that is valuable to a visitor. Ads and boilerplates are not considered content.
Contextual Advertisement - Content-related advertising
Contextual Advertising - This is a technique that will display advertisements that are relevant to a web page's content.
Conversion - (goal) The achievement of a measurable goal on a website. Product sales, ad clicks, and site sign-ups are all examples of conversions. Some types of online advertising are easier to track than others; offline ads, for instance, are difficult to track.
Achieving a goal online could come in the form of:
-the sale of a product
-getting an email
-a survey being filled out
-a form being filled out
-a phone call from a consumer
-receiving feedback
-a web site share
It can be simple to track conversion sources through analytic programs and affiliate tracking.
Conversion Rate - The percentage of visitors who convert - see conversion.
Cookie – A small data file stored by the browser, used to track users in order to personalize their experience and to assist in tracking conversions.
CPA - Cost per action. The efficiency of numerous forms of online advertising is dependent on cost per action. This is the structure for numerous affiliate marketing programs. Actions can be clicks on advertisements, or actually purchasing a service or product, for instance.
CPC - Cost per click. Some ads are paid for per click. Luckily, they can be contextually targeted in hopes of being more effective.
CPM - (Cost Per Thousand impressions) A metric used to quantify the average cost of a thousand ad impressions; M is the Roman numeral for one thousand. CPM helps determine how lucrative a website's ad inventory is.
Crawl Depth – How deeply a website is crawled and indexed.
Since long tail searches are more targeted, pages need to be indexed to rank for them. For a larger site to be indexed deeply, it needs good link equity and must avoid duplicate content problems.
Crawl Frequency – How frequently a website is crawled.
A more regularly crawled site is likely more trusted or commonly updated than a site that is less crawled. Heavily duplicated content in sites will cause them to be crawled less frequently.
Crawler - (spider, bot) A platform that travels through a website by means of the link structure to collect data.
Crawling – Websites get included in search results thanks to “spiders” sent out by Google to gather information on fresh web pages. This happens continuously; a web page does not need to be submitted to Google to be included in search results.
CSS - Cascading Style Sheets is a method for adding styles to web documents.
Note: The utilization of external CSS files simplifies the process of modifying page designs by making edits to one file.
An external CSS file can be linked to utilizing code in the head of HTML documents that is comparable to:
<link rel="stylesheet" href="http://www.seobook.com/style.css" type="text/css" />
CTR - Clickthrough Rate – this is a way to determine the relevancy of a keyword. It is the percentage of visitors who click on an advertisement. Banner ads are not as effective as search ads, as search ads tend to have better CTR.
A search engine has the ability to distinguish if a specific search query is navigational or informational or transactional by examining the comparative CTR of various listings on the search result page as well as the CTR of visitors who have recurrently done a search for a specific keyword.
Cybersquatting – The registration of domains related to other brands in order to take advantage of the worth created by that brand.
Dayparting – Turning ad campaigns on or off, or raising and lowering bids, based on the times the target audience is assumed to be available: bid more when the audience is likely online, less when it is not.
Dead Link - A nonfunctional link, or a link to a page whose URL was changed without a 301 redirect, so that links from other sites now return a 404 error because the old URL no longer exists.
Dedicated Server - A server that serves only one website or a single person's small collection of websites.
Dedicated servers cost more money than virtual servers but are generally more reliable.
Deep Link - A link that points to a website’s internal page.
The majority of websites will have interior page links or deep links. Content is sometimes based on simple linking opportunities.
Deep Link Ratio - The ratio of the number of links which point to internal pages within a website, to the overall amount of links pointing to a site.
A higher deep link ratio is generally a good indication of a valid natural link profile.
Del.icio.us - A social bookmarking website that is quite famous.
De-Listing – Being removed from a search engine's index, either temporarily or permanently.
De-indexing can happen when:
-The website is new and has not been crawled yet.
-An update is taking place and crawl priorities are being readjusted; this could be due to duplicate content or link quality, for instance.
-Pages with new locations are not being correctly redirected
-Search Spam
-The manual removal of a website by a person
Demographics - Data that describes population segments or parts.
Internet Marketing Platforms like AdWords allow target advertising, so ads will be targeted to a certain demographic in regards to people conducting online searches. Demographics that might be considered for target advertising include age, gender and location.
Description – Search engines and directories offer a small description close to listings with the goal of adding context.
Directories which are higher in quality steer away from promotional descriptions and prefer ones that provide an accurate description of the site.
Here is what search engines will ultimately do:
-use the description from a major directory for a site's homepage
-utilize the page meta description
-try to extract a description from the content of a page that is related to a specific search query
Digg - An interactive news site in which visitors are able to vote on stories
Directory - A categorized collection of websites, typically organized by human editors with topical expertise.
Some directories will center around certain niche subjects; others are inclusive. Yahoo! Directory is a popular, trusted directory with good editorial control, so its links will be more trusted by search engines.
Directory Page - A page consisting of links to related web pages.
Disavow - The link disavow tool lets webmasters indicate which inbound links to their website they do not vouch for.
Recovering from a manual link penalty typically means disavowing some low quality links. The disavow tool also helps with recovery from automated link penalties, though Google must recrawl the linking pages before the disavowals are applied. Manual penalties can largely be avoided by steering clear of low quality links in the first place.
DMOZ - The Open Directory Project, the largest human-edited directory of websites. DMOZ is owned by AOL and is mostly run by volunteers.
DNS - Domain Name System or Domain Name Server. This is a mechanism which is utilized in resolving a domain name to a particular TCP/IP Address.
Document Freshness (see fresh content)
Domain - A scheme used for the logical or geographic organization of the web. A domain can also refer to a particular website.
Doorway - (gateway) A web page that was created with the intention of drawing traffic from a search engine. A doorway page that redirects visitors to a different website or page is employing cloaking.
Doorway Pages - Pages that were made with the intention of ranking for highly targeted search queries; they generally redirect searchers to an advertisement filled page.
There are webmasters that will cloak doorway pages on reliable domains, and as a result will make a lot of money – they are eventually de-listed.
Dreamweaver - Prevalent editing and web development software
Duplicate Content - Content that has been copied or duplicated, which can reduce a search engine's trust in a site.
Search engines prefer not to index several versions of similar content; printer friendly pages are a common unintentional source of duplicates. Because some automated content generation methods depend on reusing content, search engines are cautious and may filter content that looks too similar to other content.
Dwell Time – This is the length of time spent by a visitor on a destination site prior to going back to search results.
The amount of time one dwells on a site is not necessarily a strong relevancy signal due to the fact that some searches take more time for users to find what they are looking for. Repeat visits to a website, relative CTR and branded searches can give a better idea of whether or not a visitor is satisfied with a certain website.
Dynamic Content - Content that changes over time or is generated on the fly by server-side scripts such as PHP.
Dynamic URLs should look static to search engines, so URL rewriting is useful.
Dynamic Languages - Server-side programming languages, such as PHP or ASP, used to build pages on the fly.
E-commerce Site - A website that sells products or services at retail.
Earnings Per Click - An estimate of the earnings generated per ad click, used by contextual advertising publishers.
Editorial Link – Links are counted as votes of quality. Editorial links that were earned rather than bought are more desirable.
TrustRank is one algorithm for estimating site trust. Paid links can still count as editorial votes if they pass editorial quality standards or editorial control; link farm links will not help with ranking.
Emphasis - An HTML tag for emphasizing text.
Content must be easy for humans to read; a page that is hard to read or converts poorly looks bad to both visitors and search engines. Putting every keyword in emphasis only makes a page harder to read.
<em>emphasis</em> would appear as the word emphasis in italics
Engagement Metrics - The extent of how engaging specific content within a site is to users, or how engaging a site in its entirety is to users.
End user behavior is analyzed to improve search engine rankings. An algorithm such as Panda can offer a ranking boost to a site with a high CTR or many repeat visits from brand-related searches.
Entities – These are things, places or people that search engines try to offer background information for.
An example of an entity is a brand. Movies and songs can also be entities.
Entry Page - The page through which a visitor enters a website.
When purchasing pay per click ads, make sure visitors land on the page most relevant to the keyword they searched for. When link building, make sure that:
-visitors are sent to the most relevant page when clicking a link
-search engines can understand what different pages of your site are associated with.
Ethical SEO - Search engines tend to characterize SEO services that manipulate their relevancy algorithms as unethical.
Some search marketers describe competitors' services as unethical while calling their own ethical; in practice, a technique is not inherently ethical or unethical, only effective or ineffective.
Two frauds include:
-Failing to disclose risks: Some SEOs employ high risk techniques and fail to disclose risks to clients.
-Charging for nothing: There is often little to no start-up cost to selling SEO services, and some people who claim to offer them do not actually know what they are doing. They essentially charge small businesses for nothing.
Everflux - The continual updating of the major search indexes.
Google now updates its index constantly, though it once did so only about once a month (see Google Dance).
Expert Document - Quality page that links to numerous non-affiliated topical resources.
External Link - Link which references a different domain.
Linking to other relevant sources helps search engines understand what a site is about. It is a good idea to link out to quality editorial sites.
Favicon - A small icon that appears in a web browser next to a URL. To associate a site with a favicon, upload an image named favicon.ico to the site's root directory.
Favorites (see bookmarks)
Feed – Blogs and other content management systems let readers subscribe to content update notifications by means of RSS or XML feeds.
Feeds may also refer to merchant product feeds (which are not very effective for content generation because of duplicate content filters) or syndicated pay per click feeds. A feed is content delivered to a user through special websites or programs such as news aggregators.
Feed Reader - A website or software utilized for subscribing to feed update notifications.
FFA - (Free For All) A page or site with little if any unique content that links out to unrelated websites and lets anyone add a link. Search engines penalize sites that offer no value to human users.
Filter – If a site's activity appears unnatural, search engines may filter the site out of search results.
Publishing a lot of duplicate content is one red flag that can get a site filtered out of search results; spamming can also trigger a penalty.
Firefox - Prevalent open source web browser.
Flash - Popular vector graphics-based animation software that simplifies the process of making websites appear rich and interactive.
Search engines can have a difficult time indexing and ranking Flash websites because they tend to contain minimal relevant content. When using Flash:
-embed Flash files within HTML pages
-describe what is in the Flash file with a noembed element
-publish Flash content in separate files so the appropriate Flash file is embedded in each relevant page
Forward Links (see Outbound Links)
Frames - A method utilized to show multiple smaller pages on the same screen or display. Spiders have difficulty properly navigating them so they are bad for SEO. Using frames to build a site today is not recommended or needed.
Fresh Content – Recently published content or dynamic content that draws attention to a website.
Fresh content is advantageous because it:
-Growing audience: gives people a reason to come back to, and link to, the site.
-Spreading ideas: fresh ideas spread quickly.
-Expanding archives: The more fresh content you create, the bigger your catalog of pertinent content, which means better ranking opportunities.
-Frequent crawling: The website will be crawled more often.
A risk of overdoing “fresh” content is that content from a low quality source tends to have poor engagement metrics, which can trigger a Panda penalty.
Freshness (see fresh content)
FTP - File Transfer Protocol or FTP is a protocol for conveying data between computers.
Many content management systems, and even web development tools like Dreamweaver, include FTP capabilities. Cheap or free FTP programs include LeechFTP, CuteFTP, and Core FTP.
Fuzzy Search – A search that matches terms even when they are spelled incorrectly; fuzzy search corrects misspellings and still lists matching results.
GAP - Google Advertising Professional or GAP is a program that marketers utilize to be considered proficient AdWords marketers.
Gateway Page - (doorway page) A web page that was made to draw traffic from a search engine and redirect the traffic to another page or site.
Gizmo - (widget, gadget) A small application used on web pages to provide a specific function, such as displaying an IP address or a hit counter. Gizmos can be good link bait.
Google - The world's largest search engine in terms of reach; created by Stanford students Larry Page and Sergey Brin.
Google AdSense (see AdSense)
Google AdWords – Google’s advertising product and primary revenue source, offering PPC and CPM advertising in text, banner, and rich media formats, with site targeting available.
Those who use this service can show ads on:
-The Google Search Network, which includes standard Google Search, Maps, Google Shopping, and Google's many search partners.
-The Google Display Network, made up of Google properties such as Blogger, Gmail, and YouTube, plus partner sites.
If you opt for PPC, you can set bidding to automatic or manual: with manual bidding you select the bid amount, while with automatic bidding Google chooses it within your budget. With both CPC and CPM you can set a maximum bid amount.
Google Base - Free database created by Google containing information structured semantically.
Google Base can help Google better understand which information is commercial and how different vertical search products should be structured.
Google Bomb - Manipulating Google search results for humorous or political effect; for instance, a search for “miserable failure” once returned George W. Bush as the top result.
Google Bowling – A malicious technique to knock a competing site out of search results by pointing many low trust low quality links at their website. Also known as reverse SEO. This is easier to do with newer sites.
Google Checkout – Google’s payment service which assists in enhanced understanding of merchant conversion rates as well as keyword and market value.
Google Dance - Google once updated its index roughly monthly, and this update was referred to as the Google Dance. Google now updates its index constantly, which is called everflux.
Google Dance is also a reference to an annual party at the Google headquarters.
Google Juice - (authority, trust, PageRank) Trust from Google, which flows through outgoing links to other pages.
Google Keyword Tool - Google's keyword research tool that attempts to determine the competition for a specific keyword, and then recommends related keywords.
Google OneBox – A part of the search results page that Google utilizes to show vertical search results from Google owned vertical search services.
Google Sitelinks – When Google believes one search result is far more relevant than the others, it lists multiple deep links into that site at the top of the results page.
Google Sitemaps – Webmasters are able to help Google index content through this program. This is normally done using XML markup.
Building high quality editorial links is the preferred way to have a site stay in search indexes.
Google Supplemental Index – This is a location for pages with lower trust scores. If a site has a lot of duplicate content, or the site does not have a lot of trust, it might end up in Google’s Supplemental Index.
Google Traffic Estimator – A tool that estimates bid prices and how many Google searchers will click an ad for a specific keyword.
If no bid price is submitted, the tool returns an estimate of the bid needed to rank #1 for 85% of Google queries for that keyword.
Google Trends - Tool that lets a person view how search volumes via Google for a specific keyword change throughout time.
Google Webmaster Guidelines – A subjective and constantly changing collection of conditions that can be used to justify penalizing a website.
Some guideline components are ambiguous, and some believe Google's labeling of ads in its own search results is deceptive. The chief objective of the guidelines is to keep SEO ROI to a minimum.
Google Webmaster Tools - Google tools that let webmasters see recent search traffic trends, set a target market, request that specific pages be recrawled, and request a manual review by Google’s editorial team. Note: Google can use webmaster registration data to penalize other sites owned by the same webmaster.
Google Website Optimizer - A free multivariate testing tool that helps AdWords advertisers improve their conversion rates.
GoogleBot - Google's search engine spider.
Google has multiple spiders, including vertical search spiders and ad targeting spiders, which share a common crawl cache.
Guestbook Spam - Low quality automated links dropped into guestbooks; search engines place little trust in them.
GYM - Google - Yahoo - Microsoft
Headings - A brief description of the subject of a section.
Heading elements range from H1 to H6; the lower the number, the more important the heading. Only one H1 element should be used per page. Here is how the source would look:
<h1>Your Topic</h1>
Hidden Text – An SEO method that shows text to search engine spiders that human visitors will not see. Using hidden text is usually not worth the risk for legitimate sites.
Hijacking - Tricking a search engine into thinking that your URL actually hosts another website. This is usually done using a meta refresh or 302 redirect.
Hit – A hit occurs whenever a server sends an object (a document, image, or other file). Now a mostly meaningless metric, replaced by impressions and pageviews.
Home Page – A website’s main page, which helps users navigate the site. It is easy to build links to a homepage, but the page must remain focused; other, more specific pages often end up ranking better in search results.
Host (see Server)
HTML - (HyperText Markup Language) The instructions, or “markup,” used to add formatting and web functionality to plain text online. HTML is the primary language in which the World Wide Web's pages are written, and the main language search engines read.
HTTP - HyperText Transfer Protocol, the leading protocol for communication between web browsers and web servers; it defines how data is transferred to the browser.
Hub - (expert page) a trusted page with high quality content that links out to related pages.
Hubs - Topical hubs will reference numerous authorities and be referenced by many as well; trusted pages
Hummingbird - A search algorithm update by Google that allowed for improved conversational search.
In Bound Link - (inlink, incoming link) A link that points to a website from another site. Inbound links from related pages confer trust and PageRank. Use the link: function to see a sample of the links pointing to a document.
Index – The collection of data a search engine searches through to find matches for a query; larger search engines index billions of documents. Index can also refer to the root file of a folder on a web server.
When search engines process a query, they look words up in reverse indexes and return results ranked by relevancy vectors. Semantic analysis lets search engines return near matches as well.
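As a simplified sketch of the reverse-index idea (real engines also store word positions, term weights, and compression, none of which appear here), a minimal inverted index maps each word to the documents containing it:

```python
def build_inverted_index(docs):
    """Map each word to the set of document ids containing it."""
    index = {}
    for doc_id, text in docs.items():
        for word in set(text.lower().split()):
            index.setdefault(word, set()).add(doc_id)
    return index

def search(index, query):
    """Return ids of documents containing every word in the query."""
    word_sets = [index.get(w, set()) for w in query.lower().split()]
    return set.intersection(*word_sets) if word_sets else set()

# Hypothetical toy corpus for illustration.
docs = {1: "cheap flights to Paris", 2: "Paris hotel deals", 3: "cheap hotel"}
index = build_inverted_index(docs)
print(search(index, "cheap"))        # {1, 3}
print(search(index, "cheap paris"))  # {1}
```

Looking up query words in the index is far faster than scanning every document, which is why search engines search "via reverse indexes by words."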
Indexed Pages - The pages of a site that a search engine has indexed.
Information Architecture – The meaningful structure, design, and organization of content. Good information architecture considers how both search spiders and humans will access a website.
Here are some suggestions:
-make sure there is a particular topic being focused on.
-utilize descriptive page titles and meta descriptions for explaining a page’s content.
-use clean, descriptive folder and file names
-use headings
-utilize breadcrumb navigation
-utilize descriptive link anchor text
-link to related information from within the content area of your web pages
-make it easy for people to find what they want
-avoid duplicate content
Information Retrieval - Searching through databases to find pertinent information.
Inktomi – A search engine that pioneered the paid inclusion business model; purchased by Yahoo! in 2002.
Inlink - (inbound link, incoming link) Inbound links from related pages which are trusted
Internal Link – Links that go from one page to another on the same site.
Descriptive internal linking makes it easier for search engines to understand what a website is about. Use consistent navigational anchor text for each section of a site, highlighting the other pages in that section; this also improves a website’s usability.
Internet Explorer - Microsoft's web browser.
Inverted File (see Reverse Index)
Invisible Web - Parts of the web that cannot be easily accessed by crawlers because of limitations in terms of search technology, information architecture problems or copyright issues.
IP Address - Internet Protocol address. Every computer connected to the internet has an IP address. Some servers and websites have a unique IP address, but most web hosts place many websites on a single host.
Many SEOs refer to unique C-class IP addresses. Every site is hosted at a numerical address such as aa.bb.cc.dd, and sometimes many sites share the same IP address.
ISP - Internet Service Providers sell end users access to the web. Some also sell usage data to web analytics companies.
JavaScript – This client-side scripting language can be embedded into HTML documents to add dynamic features.
Most JavaScript content is not indexed by search engines. JavaScript is combined with other technologies in AJAX to make web pages more interactive.
Keyword - A phrase or single word that targeted prospects search for.
Brand-related and longtail keywords are generally more valuable than short generic keywords because they imply stronger intent, coming later in the buying cycle.
Keyword Cannibalization – Reusing the same keyword on an excessive number of web pages within the same site. This makes it difficult to know which page is the most relevant for a search.
Keyword Density – How frequently a given keyword appears within a page's content, expressed as a share of all the words on the page; this was once a way search engines measured relevancy. If keyword density is too high, a page is likely to be penalized for keyword stuffing.
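The measure itself is simple to compute. A rough sketch (counting whole, space-separated words only and ignoring punctuation handling; the sample text is purely illustrative):

```python
def keyword_density(text, keyword):
    """Percentage of the words on a page that are the given keyword."""
    words = text.lower().split()
    if not words:
        return 0.0
    return 100.0 * words.count(keyword.lower()) / len(words)

sample = "seo tips and seo tools for seo beginners"
print(keyword_density(sample, "seo"))  # 37.5 (3 of 8 words)
```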
Keyword Funnel - The relationship between the various related keywords that searchers look for. Some searches are closely associated with others as a result of spelling mistakes, poor search relevancy, and automated query refinement.
Keyword Research - Determining relevant keywords for targeting. Keyword discovery methods include:
-viewing analytics data
-checking copy on competing sites
-reading customer feedback
-checking what people search for in search boxes on site
-talking to customers directly and asking them how they discovered your business
Keyword Research Tools - Tools to help determine good potential keywords to use, looking at search trends, bid rates and relevant sites’ page content.
Prevalent keyword research tools include:
-SEO Book Keyword Research Tool - free keyword tool that links to social bookmarking and tagging sites, Google Suggest, and more.
-Bing Ad Intelligence
-Google AdWords Keyword Planner
-Wordtracker
Note that the numbers from keyword research tools are best treated as relative, qualitative measurements rather than exact counts.
Keyword Spam - (keyword stuffing) Overuse of a keyword
Keyword Stuffing - (keyword spam) Overuse of a keyword
Keyword Suggestion Tools (see Keyword Research Tools)
Kleinberg, Jon – Scientist; did a great deal of research regarding search relevancy algorithms.
Knowledge Graph – Google will scrape information from third parties and show it in search results in an extended format.
The goal is to:
-Answer questions from users fast without having them leave the search results
-Possibly raise clickthrough rates on ads
-Boost monetization of the search result page.
Landing Page - The page a user lands on after clicking an advertisement.
Landing Page Quality Scores - A measure Google uses in its AdWords program to filter ads and keep them relevant.
Latent Semantic Indexing (LSI) - A technique in which search engines index sets of words that usually occur together in documents. Instead of relying on synonyms alone, LSI looks at which other words co-occur with the main keyword to determine what the page is really about to the visitor, and whether its keywords are spammy.
Link - Connecting one web document to another, or to another position in the same document; a part of a web page one can click on to go to another page.
Links are considered a vote of trust by most search engines.
Link Baiting – Attracting high quality links to a site by creating, formatting, and targeting information, often aimed at social media.
Link Building - Actively cultivating incoming links to a site. Having many good quality links helps ranking, and a link from one high authority site is worth more than many links from low authority sites.
Here are some link building tips:
-build unique content that is high quality and link-worthy
-come up with good marketing ideas
-mix anchor text
-acquire deep links
-register the site in directories that are high quality such as DMOZ, Business.com and the Yahoo! Directory
-create link bait
-get attention from bloggers
Link Bursts - A sudden surge in the number of links pointing at a website.
Naturally occurring links tend to accumulate gradually over time.
Link Churn - The speed that a site loses links.
Link Disavow (see Disavow)
Link Equity - The strength of a site as measured by the number of its inbound links and the trust level of the sites providing those links.
Link Exchange - A reciprocal linking arrangement, frequently enabled by sites dedicated to directory pages. Link exchange links carry little value on their own.
Link Farm - a collection of sites that link to each other.
Link Hoarding – Basically an attempt to keep all link popularity to yourself by not linking to other sites.
Link hoarding is not a good idea because:
-a site that does not link out to other sites is unlikely to attract many links itself
-outbound links to relevant resources can help credibility
Link Juice - (trust, pagerank, authority)
Link Partner - (reciprocal linking, link exchange) Two sites that link to one another. Search engines do not regard these as high value links.
Link Popularity - A measure of a site's value based on the number and quality of the sites linking to it.
Link Reputation - A mixture of anchor text and link equity.
Link Spam - (comment spam) Unwanted links, often placed in venues such as blog comments.
Link Text - (anchor text) The visible, clickable text of a link. Search engines use anchor text to gauge the referring page's relevancy to the linked landing page's content.
Link Velocity – The speed at which a page or website accrues fresh inbound links. A page that gets a surge of inbound links in a short time might be flagged for manual editorial review.
Live.com - Microsoft portal that was utilized as their search brand prior to rebranding as Bing.
Log Files - Server files that let you see where your main traffic is coming from and what users are searching for to bring them to your website.
Note: Analytics programs will show more data.
Long Tail – Longer, more specific, more precise searches. A long tail keyword generally contains more words, and these highly specific searches generally bring a more valuable visitor to your site than the more heavily searched short keywords. If you were looking for a plumber in Portland, a long tail keyword example would be: highly rated 24 hour plumber Portland, OR.
Looksmart – A paid search provider that started as a directory service.
LSI - Latent Semantic Indexing, a method search systems use to statistically model language based on the similarity of pages and keyword co-occurrence. Note that a result might not contain the exact search term but can still be returned because its page contains many related words.
Manual Penalty - If a Google engineer decides a site is in violation of the Google Webmaster Guidelines, the site will receive a penalty. A review can be requested in Google Webmaster Tools after a problem is fixed.
Sites that get a manual penalty will usually see a warning come up in Google Webmaster Tools, but sites dealing with automated penalties might not get a warning.
Manual Review - Search engines combine automatic relevancy algorithms with a manual review process to catch search spam and to train those algorithms. Unusual usage data is one flag for manual review.
Meta Description - A description, usually one or two sentences, shown under the page title in search results.
Descriptions are important because they are what searchers use to determine if your site is relevant. It is important to keep it brief and unique. This is how the code appears:
<meta name="description" content="Your meta description here." />
Meta Keywords - This tag highlights keywords and phrases the page is targeting.
The code will look like:
<meta name="Keywords" content="keyword phrase, another keyword, yep another, maybe one more ">
Search engines place little weight on meta keywords, and the tag is no longer commonly used by SEOs.
Meta Refresh - A meta tag that causes a browser to refresh to a different URL location.
A meta refresh looks something like this:
<meta http-equiv="refresh" content="10;url=http://www.site.com/folder/page.htm">
A 301 or 302 redirect is usually preferable to a meta refresh.
Meta Search - A search engine that combines the top-ranked results from several search engines into one new result set.
Meta Tags - A collective way to describe the meta description and meta keywords tags in the HEAD of a page. The page title, a very important element that search engines use to decide what a page is about, is often grouped with the meta tags as well.
Mindshare – How many people think of your product when thinking of your category’s products.
Strong mindshare means a site is more likely to be linked to and to earn top rankings, and mindshare-driven links are generally of better quality than average links.
Mirror Site - A duplicate of a site at a different web address. This is bad for SEO, as search engines look for unique content.
Monetize - To generate revenue from a site, commonly with AdSense ads.
Natural Search (see Organic Search Results)
Natural Language Processing - Algorithms that try to comprehend what a search query is actually about and intended for, instead of just matching keywords with results.
Natural Link (see Editorial Link)
Natural Search Results - Search engine results that are not paid for or sponsored.
Navigation - The system that lets website visitors understand where they are, where they have been, and how that relates to the rest of the website.
Plain HTML navigation is better than navigation coded in Flash or JavaScript, because search engines cannot easily index the latter.
Negative SEO - Trying to have a harmful effect on a third party site’s ranking.
Link strategies can be white hat or gray hat or even black hat. To harm a page’s ranking, a competitor might point low-quality links at a page, which would cause the page to be filtered out of search results. Negative SEO can also be engaged in without the use of links.
Niche – The subject a website concentrates on; a site that pertains to a certain topic. It is simpler to compete in small, fresh niches than to try to take over larger verticals; gain authority there, then go after larger markets.
Nofollow - An attribute used to prevent a link from passing link authority, found within individual link code or applied page-wide in the HEAD section. It is frequently used on sites with user generated content, such as blog comments; it is essentially a command telling search engine crawlers to ignore a link, for example when the link points to untrusted content.
Here is the code to use nofollow on a link:
<a href="http://www.seobook.com" rel="nofollow">anchor text</a>
Note: Link hoarding is not a good practice; linking out to relevant resources can boost your relevance in search engines.
Noindex - Type of link condom. A command that tells robots not to index a page or link, generally found within individual link code or in the HEAD section of a page.
Non Reciprocal Link – When site A links to site B but site B does not link back, the link is non-reciprocal. Search engines tend to give non-reciprocal links more value because they are less likely to result from collusion.
Not Provided (see Keyword (Not Provided))
Off-Page SEO – Off-site, promotional methods of raising the ranking of websites, such as through social media.
On-Page SEO – Methods of making a site visible to search engines that are controlled on a web page, such as having a relevant URL, pages which load fast, and clear navigation.
Ontology – The effort to create a comprehensive, conceptual schema about a domain. It is an ordered data structure which consists of the pertinent entities and their relationships and guidelines contained in the domain.
Open Directory Project, The - (see DMOZ)
Open Source - Software distributed with its source code such that developers can adjust it as they deem appropriate. It is a method of rapidly building mindshare and exposure.
Opera - A fast, standards-based web browser.
Organic Link - A link published only because the webmaster believes it adds value for users.
Organic Search Results – Search results pages contain both paid listings and unpaid listings; the unpaid listings are the organic (or natural) search results. They are ordered by relevancy, which is determined by page content, linkage data, trust related data, and other factors. Up to 80% or more of clicks go to organic search results.
When organic results are seen at the top of the search engine results page, they have generally been optimized well. Google does provide results based on the location and search history of users as well.
Outbound Link - A link from one website which points at another website.
Referencing different resources can help in building credibility; and linking to useful documents can help search engines in comprehending what your website is about. When you link to other websites, they might link back to yours.
Outlink - An outgoing link.
Overture - A company, originally called GoTo, that pioneered selling targeted search listings on a pay per click basis; later bought by Yahoo! and renamed Yahoo! Search Marketing.
Page Title (see Title)
Page, Larry - Google Co-Founder
PageRank - (PR) A measure of link popularity and trust (among other proprietary factors) assigned by Google's algorithm; a scale revolving around link equity. This term is often confused with Toolbar PageRank.
Here is the PageRank formula:
PR(A) = (1-d) + d (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn))
PR = PageRank
d = dampening factor (~0.85)
C = the number of outbound links on a page
PR(T1)/C(T1) = the PageRank of page T1 divided by its total number of outbound links
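The formula above can be iterated to a fixed point. A toy sketch of that iteration (the three-page graph is purely illustrative, and it assumes every page has at least one outbound link; dangling pages need extra handling that is omitted here):

```python
def pagerank(links, d=0.85, iterations=50):
    """links: dict mapping each page to the list of pages it links to.
    Iterates PR(A) = (1-d) + d * sum(PR(T)/C(T)) over the pages T linking to A."""
    pages = set(links)
    pr = {p: 1.0 for p in pages}  # start every page at PR 1.0
    for _ in range(iterations):
        pr = {p: (1 - d) + d * sum(pr[q] / len(links[q])
                                   for q in pages if p in links[q])
              for p in pages}
    return pr

# Hypothetical graph: A links to B and C, B links to C, C links back to A.
ranks = pagerank({"A": ["B", "C"], "B": ["C"], "C": ["A"]})
print(ranks)  # roughly A: 1.16, B: 0.64, C: 1.19
```

Note how C, which receives links from both A and B, ends up with the highest score, while B, which receives only half of A's vote, ends up lowest; with this non-normalized form of the formula, the scores sum to the number of pages.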
Paid Inclusion - A system which lets websites buy relevant exposure so long as they pass editorial quality guidelines.
Paid Link (see Text Link Ads)
Paid Search – Paying to have a website appear at the top of the SERP. Top boxes showcase companies that have paid to submit to Google Shopping, while the boxes below them have paid to appear for particular search terms.
Examples:
-PPC: Pay-per-click – one pays a search engine each time their ad is clicked on.
-CPM: Cost-per-impression – one pays for a certain number of times (typically per thousand impressions) their ad shows up on a page.
Panda Algorithm - A Google algorithm that sorts websites by quality, mainly the quality of their content. The Panda patent references signals such as a site’s link profile and entity related search queries. A site with shallow content, lots of ads, and few repeat visits is likely to be penalized, as are sites containing duplicate content.
Pay For Inclusion - PFI is when a website pays a fee to be included in a search engine or directory. Be mindful of Google's prohibition on paid links.
Pay For Performance - Payment arrangement in which affiliated sales workers receive a commission for getting consumers to perform specific tasks. Generally speaking, those who publish contextual ads are paid per ad click. On the other hand, affiliate marketing programs pay for conversions.
PDF - Portable Document Format, a popular Adobe Systems file format that allows for the viewing of files in the original printer friendly context.
Penalty - Search engines can penalize sites, or ban them from ranking highly in search results, if they are suspected of spamming. Penalties can be automatic or manually applied. An algorithmic penalty may lift on its own after a certain amount of time, while a manual penalty might last longer or require contacting the search engine.
Sites can be filtered for an array of reasons.
Penguin Algorithm – An algorithm by Google that gives sites with unnatural link profiles penalties.
Google complicated the emphasis on links when it launched the Penguin algorithm, because it updated its on-page keyword stuffing classifiers at the same time. Penguin was originally called the "spam update" and was renamed after complaints. Updates are often run close together, which makes it hard to tell which update hurt a site. A site with a manual penalty will likely see a warning in Google Webmaster Tools, but a site penalized by Panda or Penguin will not necessarily get one.
Personalization - The modification of search results and customizing of them based on a person's search history, location and recently viewed content.
PHP - PHP: Hypertext Preprocessor is an open source server-side scripting language used to render dynamic, interactive web pages.
Pigeon Update - An algorithmic update to Google local search results which tied in more signals that have been linked to regular web search.
Portal - A web service offering a wide array of features such as email and news to try to get users to set the portal as their “home page” on the web. Examples of portals include Google, Yahoo, and MSN.
PPA - (Pay Per Action) Comparable to Pay Per Click, but with PPA, publishers are paid only when click-throughs result in conversions.
PPC - (Pay Per Click) An advertising model in which advertisers pay ad networks such as Google when a user clicks their ad. Google AdWords is an example.
Precision – Usually measured as a percentage, this is a search engine’s ability to list results that satisfy the query.
Search spam is a challenge when it comes to a search engine's precision.
Profit Elasticity - A measure of the profit potential of differing economic conditions, based on adjusting price, supply, and other variables to shift the point where the supply and demand curves cross.
Proximity - The closeness of words.
When a page contains words that are near each other, it may be more likely to satisfy a query containing both words. However, when keywords are repeated too often, it can be a sign of low quality content.
QDF – “Query deserves freshness” is an algorithmic signal that tells Google a certain search query should favor recent results, based on scenarios like a surge in search volume for a specific topic. When older content is updated or sees a readership increase, it can be treated as fresh, just like newly published content.
Quality Content - Linkworthy content
Quality Link – Search engines count links as votes of trust. Quality links carry more weight than low quality links.
High quality links are correlated with:
-Trusted Source: If a link is from a seemingly trusted website, it will count more than one from a rarely used website.
-Hard to Get: When a link is more difficult to acquire, search engines are more likely to trust it.
-Aged: Links from older sources are sometimes more trusted by search engines than new links.
-Co-citation: Links from pages that also link to competing sites help search engines understand what community your site belongs to.
-Related: Links which are from related pages or websites have the tendency to count more than links from unrelated sites.
-In Content: Links found within a page’s main content area are more likely to be editorial links, and editorial links carry more weight.
A high quality link has a lot of weight, perhaps even more so than anchor text.
Query - The search string which is entered into a search engine.
Query Refinement – When searchers do not find their results relevant, they may refine their search query. Some search engines also suggest alternate queries they believe may be more relevant for the searcher. Query refinement is both a manual process (searchers simply search again) and an automated one. Here is how search engines automatically refine queries:
-Google OneBox: promotes a vertical search database towards the top of the results.
-Spell Correction: offers what the search engine believes to be the right spelling
-Inline Suggest: provides related search results.
Recall - The portion of relevant documents which were retrieved in comparison to all relevant documents.
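Precision and recall can be expressed numerically. A minimal sketch in Python (the document IDs and relevance judgments here are hypothetical):

```python
# Precision: fraction of retrieved documents that are relevant.
# Recall: fraction of all relevant documents that were retrieved.
def precision_and_recall(retrieved, relevant):
    retrieved, relevant = set(retrieved), set(relevant)
    hits = retrieved & relevant
    precision = len(hits) / len(retrieved) if retrieved else 0.0
    recall = len(hits) / len(relevant) if relevant else 0.0
    return precision, recall

# Hypothetical example: the engine returned 4 pages, 3 of which are
# relevant, out of 6 relevant pages that exist in the index.
p, r = precision_and_recall(
    retrieved=["a", "b", "c", "d"],
    relevant=["a", "b", "c", "e", "f", "g"],
)
print(p, r)  # 0.75 0.5
```

Search spam hurts precision (irrelevant pages are retrieved) without necessarily affecting recall.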
Reciprocal Link(s) - (link exchange, link partner) Two sites that link to one another. Sometimes these occur because websites are attempting to build false authority or are using low quality link schemes.
Ranking manipulation is likely taking place if all links on a site are reciprocal.
Quality reciprocal link exchanges are not necessarily a negative, but most reciprocal link offers are of low quality. Having too many low quality links will negatively impact rankings.
Redirect - A method to change a landing page's address, such as when a site is changed to a new domain. This is how search engines are made aware of a page location change. 301 redirects signal permanent changes while 302 redirects signal temporary changes.
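On an Apache server, redirects are often configured in the .htaccess file mentioned earlier. A minimal sketch (the paths and domain are hypothetical):

```apache
# Permanent (301) redirect of a single file to its new location
Redirect 301 /old-page.html http://www.example.com/new-page.html

# Temporary (302) redirect, e.g. for a seasonal promotion
Redirect 302 /promo.html http://www.example.com/seasonal-offer.html
```

As the .htaccess entry above warns, back up the file before editing it.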
Referrer - The source a visitor to a website came from.
Regional Long Tail - (RLT) coined by onlinedevelopment.co.uk's Chris Paston - helpful for the service industry, a keyword term containing a city or region name.
Registrar - A company that lets users register domain names.
Reinclusion - When a site is penalized for spamming but fixes the issue, it can request reinclusion.
Relative Link - A link that specifies the location of the linked page relative to the current URL, rather than as a full address. Absolute links are preferred over relative links because of hijacking concerns.
Example of a relative link
<a href=""../folder/filename.html"">Cool Stuff</a>
Absolute link example:
<a href=""http://seobook.com/folder/filename.html"">Cool Stuff</a>"
Relevancy - A measure of how beneficial searchers find search results.
Many search engines bias organic search results toward informational resources.
Remarketing (see retargeting)
Repeat Visits - Visits from people who have been to a website before. Repeat visits are a sign of strong engagement, which can help a site rank higher in algorithms such as Panda. Sites without repeat visits are sometimes considered low quality.
Reputation Management - Making sure your brand related keywords display results that will reinforce your brand.
Resubmission – Comparable to search engine submission, resubmission is by and large a worthless service sold by businesses that cheat customers out of money.
Retargeting - Advertising programs that target persons who have visited a website in the past, or looked at a particular product, or even added a certain product to their online shopping cart. There is activity bias associated with retargeted ads.
Reverse Index - An index of keywords that is comprised of records of relevant documents containing those keywords.
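The idea behind a reverse (inverted) index can be sketched in a few lines of Python; the documents here are hypothetical:

```python
from collections import defaultdict

# Map each keyword to the set of document IDs containing it,
# so lookups go from keyword -> documents instead of scanning text.
def build_reverse_index(docs):
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for word in text.lower().split():
            index[word].add(doc_id)
    return index

docs = {
    1: "quality links build trust",
    2: "trust is earned over time",
}
index = build_reverse_index(docs)
print(sorted(index["trust"]))  # [1, 2]
```

Real search engine indexes also store positions and weights per keyword, but the keyword-to-documents mapping is the core structure.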
Rewrite (see URL Rewrite)
RLT see Regional Long Tail
Robots.txt - A file in a website’s root directory that controls search engine spider behavior; it tells search engines which parts of the site to avoid crawling.
Keep files off public servers if you do not want them indexed by search engines.
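A minimal robots.txt looks like this (the directory names are hypothetical):

```
User-agent: *
Disallow: /cgi-bin/
Disallow: /private/

User-agent: Googlebot
Disallow: /no-google/
```

Note that robots.txt is a crawling directive, not access control; as stated above, keep truly private files off public servers.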
ROI - (Return On Investment) The use of analytics software to quantify the return received for each marketing dollar spent. ROI is a useful measurement, but some marketers prefer more refined profit elasticity calculations.
RSS - Really Simple Syndication (or Rich Site Summary) is a way to syndicate information to a feed reader or other software that lets people subscribe to a channel of interest.
Sandbox - There has been hearsay that Google places new sites into a “sandbox,” which makes it so they are unable to rank well until it’s been a certain period of time. Not a universally accepted notion.
Scrape - duplicating a site's content, frequently facilitated by automated bots.
Scumware - Invasive software that generally targets ads, violates privacy, and is unknowingly installed.
SE (Search Engine)
Search Engine - (SE) A program that searches documents for matches to a keyword phrase and returns a list of the most relevant matches. Google and Yahoo! Search are popular search engines. A search engine consists of an index, a spider, search results, and relevancy algorithms.
Search Engine Spam - Pages that cause search engines to deliver less relevant results. Search Engine Optimizers are sometimes viewed as spammers.
"Search History - Search history information is stored by many search engines and can be utilized for better ad targeting or to make information easier to find.
Having a lot of brand related search queries is a good signal of quality."
Search Marketing - The marketing of a website in search engines. This is generally done through SEO, paid inclusion and by purchasing pay per click ads.
SEM - Search engine marketing is the practice of making a website visible in search engines such as Bing and Google to draw in new and returning visitors alike. It generally combines organic search and paid tactics. Also known as: Search Marketing.
SEO – Stands for Search Engine Optimization, the publishing of information in a way that assists search engines in comprehending that your information is relevant to certain search queries. Consists of SEO copywriting, information architecture, keyword research, link building, mindshare building, and viral marketing.
The hope is through SEO, the number of visitors that go to a web site will be increased, and the site will have a higher ranking in the search results.
SEO Copywriting - Writing copy in such a way that makes it relevant to a vast variety of relevant search queries.
Here is how to write SEO friendly titles:
-Write titles that are literal and associated with things persons will search for.
-Write page titles that are compelling to link to. The more people that link to them, the better.
SERP - A search engine results page; the page showing results for a search.
Server - Computer that is employed to host files and serve these files to the World Wide Web.
Server Logs - Files hosted on servers that show website traffic sources and trends.
Server logs are generally less user-friendly than analytics software. Not all hosts provide server logs.
Siphoning - Methods that are meant to steal web traffic from other sites; often use spyware and cybersquatting.
Site Map - A page or organized group of pages that improve the usability of a site and link to every user accessible page on a website. An XML sitemap can frequently be found in the root directory of a site to assist search engine spiders in finding all of the site pages.
Here are some tips:
-On page navigation on larger sites should help search engines find all pertinent web pages.
-Only list the most important pages on a site map, at least on a bigger site.
-Site maps can utilize descriptive anchor text to assist search engines in comprehending what your pages are about.
SMM - (Social Media Marketing) is brand or website promotion through social media
SMP - (Social Media Poisoning) Rand Fishkin's term - any black hat techniques (sometimes illegal) designed to make a competitor out to be a spammer - For example, one might partake in blog comment spamming in the name of a competitor.
Social Bookmark - A form of social media in which users’ bookmarks are aggregated for public access.
Social Media - Online technology for information sharing, such as forums, blogs, wikis, social bookmarking, reviews and rating sites. Facebook and Twitter are the biggest social media sites online.
Social Media Marketing - (SMM) Brand or website promotion via social media
Spam - Email messages which are unsolicited.
Search engines will call low quality search results spam. They have vague, constantly changing guidelines that determine which marketing techniques are acceptable. The majority of algorithms are lenient to avoid falsely flagging content as spam, but it is important not to build many low quality links, host duplicate content, etc. If your site is banned from a search engine, you can fix the problem and request reinclusion.
Spam Ad Page - (SpamAd page) A Made For Adsense page that employs machine-generated text for content, and does not have actual value to users. Spammers tend to create sites with many of these pages.
Spamdexing - Spamdexing also known as search engine spamming is a method of altering web pages to increase the chance of them being placed towards the top of search engine results.
Spammer - A person who employs spam to accomplish a goal.
Spamming - The act of creating spam.
Spider - (bot, crawler) A specialized bot used by search engines to find and "crawl" web pages for indexing. Google's spider is known as Googlebot.
Splash Page - A rich or nicely designed web page that does not have good usability or offer a lot of content worth indexing. Generally just a graphic page that is flashy in appearance.
Be sure to fill home pages with content that is relevant.
Splog - Spam Blog that might consist mostly of low quality, automated content and is not of much worth
Spyware - Software programs that have the ability to spy on web users; generally utilized for the collection of consumer research.
"SSI - Server Side Includes make it simple to update websites; they are a means of calling portions of a page in from another page.
In order to use this you must:
-end file names in a .shtm extension or .shtml
-use PHP or a similar language
-change your .htaccess file so .htm files or .html are processed like they were .shtml files.
Here is the code which would be used for the creation of a server side include:
<!--#include virtual=""/includes/filename.html"" -->"
Static Content - Content that does not frequently change. Content which lacks social elements or dynamic programming languages.
Static sites often times are successful, but fresh content is good for SEO because:
-When content is built frequently it ultimately results in a strong archive.
-By updating content regularly you keep building brand equity and mindshare, and provide content that is worth linking at.
Static Page - A web page that lacks dynamic content or variables such as session IDs in the URL. Static pages tend to be easy for search engine spiders to crawl.
Stemming – Utilizing the stem of or portion of a word to help meet search relevancy requirements. For instance, searching for dancing can return results that contain the word “dance.”
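A crude illustration of stemming by suffix stripping (real search engines use far more sophisticated, linguistically informed stemmers such as the Porter stemmer; the suffix list below is a toy assumption for illustration):

```python
# Toy stemmer: strip a few common English suffixes so that related
# word forms map to the same stem. Not linguistically accurate.
SUFFIXES = ("ing", "ed", "es", "e", "s")

def stem(word):
    for suffix in SUFFIXES:
        # Keep at least 3 characters so short words are not mangled.
        if word.endswith(suffix) and len(word) - len(suffix) >= 3:
            return word[: -len(suffix)]
    return word

print(stem("dancing"))  # danc
print(stem("dance"))    # danc
```

Because "dancing" and "dance" share the stem "danc", a query for one can match documents containing the other.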
Stickiness - Lessening bounce rate; changes to a website that entice users to spend more time on the site and view more pages improve the site’s “stickiness”.
Stop Words - Common words (for instance: to, a, and, is ...) that do not add relevancy to a search query, and are eliminated from the search query.
It is okay to use stop words in content. They just frequently go ignored because they do not have much use in terms of a search as they are so common.
Submission - Making related websites aware that your website exists. The most effective way to submit a site is to have others link to it. Sometimes, vertical search systems or topical systems will require submission, however, there is no need to submit a site to large scale search engines.
Supplemental Index - (supplemental results) Pages that are relevant to a search query but have low PageRank frequently appear in the search engine results page labeled “Supplemental Result.” This does not signal a penalty, just low PageRank.
Supplemental Results - Documents that have less trust and rank lower than main search index documents.
Google and other search engines have numerous indices. Documents might be less trusted because of:
-limited link authority in comparison to the site's number of pages
-duplicate content
-complex URLs
Documents in the supplemental results are not crawled as often as documents in the main index.
Tagging, tags (see Bookmarks)
Taxonomy - A generally hierarchical classification system of controlled vocabulary utilized in the organization of topical subjects.
Technorati - Blog search engine that tracks prevalent stories and link relationships.
Telnet - Internet protocol that lets a user log into a remote computer to perform tasks such as file manipulation and script execution.
Teoma – Powers Ask.com; a search engine which is topical community-based and reliant on the notion of authorities and hubs.
Term Frequency - The amount of times a certain keyword appears in a document or document collection.
Term Vector Database – A weighted index of documents whose goal is understanding a document’s topic based on its similarity to other documents; the most relevant documents are then matched to a search query based on vector angle and length.
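Vector-angle matching of the kind described above is commonly computed as cosine similarity between term-count vectors. A minimal sketch (the term counts are hypothetical):

```python
import math

# Cosine similarity: the smaller the angle between two term vectors,
# the more topically similar the documents are assumed to be.
def cosine_similarity(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Hypothetical counts for the vocabulary ("seo", "links", "recipes")
doc = [4, 2, 0]
query = [1, 1, 0]
print(round(cosine_similarity(doc, query), 3))  # 0.949
```

A score near 1.0 means the document and query point in nearly the same direction in term space; unrelated term distributions score near 0.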
Text Link - A plain HTML link that does not require special code or graphics.
Text Link Ads - Advertisements formatted as text links.
Because the web was once based primarily on text and links, people are more likely to pay attention to text links than to other ad formats. Nonetheless, search engines aim to count only editorial links as votes, so links grouped with paid links are less likely to carry much weight in search engines.
Time On Page - This is the amount of time one spends on a page before clicking onto something else; it indicates quality.
Title - The title should describe a document’s content.
This is a very significant component of SEO. Titles should be:
-Unique to that specific page
-Descriptive
-Not long - 8 to 10 words or less in length is best.
Titles are the links that searchers will click on in search results. When people link to documents, they generally utilize the title as link anchor text. It should certainly be enticing. An emotionally-driven or controversial title can be successful, too.
Be SEO friendly by:
-Writing literal titles
-Writing page titles which are compelling
Toolbar - Search toolbars are often distributed by search companies to acquire marketshare. Toolbars can come with beneficial options like pop-up blockers, form autofill, and more. They can also be used to track usage data.
Toolbar Pagerank - (PR) A value between 0 and 10, assigned by Google and displayed in its toolbar, that approximates a page's importance. Toolbar PageRank is updated only a few times per year and is frequently confused with actual PageRank, which is computed continuously.
Top Heavy - This is a Google algorithm that penalizes websites with a high ad density above the fold and web sites that complicate the process of finding the content one searched for prior to landing on the page.
Topic-Sensitive PageRank – Process of calculating PageRank that, rather than creating a single global score, produces topic related PageRank scores.
Trackback - Automatic notification that your site was mentioned by another site. Trackbacks are easy to spam so often times, publishers turn them off.
TrustRank - A search relevancy algorithm that places extra weight on links from trusted seed websites controlled by educational institutions, major corporations, or government bodies. It is a process of distinguishing valued pages from spam.
Unethical SEO - Some search engine marketers attempt to market their services as being ethical, while making people believe that services from alternate providers are unethical (this usually is not the case). There are risks to certain SEO techniques and a quality provider will share this with clients.
Update - Search engines, to keep their results fresh, often times update their algorithms and data sets. The majority of big search engines are constantly updating their search index and relevancy algorithms.
URL - Uniform Resource Locator - address or Web Address for a web document.
URL Rewrite - A technique that makes URLs more unique and descriptive to facilitate better sitewide indexing.
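On Apache servers, URL rewriting is typically done with mod_rewrite, often in the .htaccess file. A minimal sketch (the parameter and path names are hypothetical):

```apache
RewriteEngine On
# Map a descriptive URL like /widgets/blue-widget onto the real
# dynamic script product.php?item=blue-widget
RewriteRule ^widgets/([a-z-]+)$ product.php?item=$1 [L]
```

Visitors and search engines see the clean, keyword-rich URL while the server quietly serves the dynamic page.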
Usability – A measure of how simple it is for customers to carry out their anticipated actions.
The structure and formatting of text and hyperlink based calls-to-action can radically increase the usability of a website, and consequently conversion rates.
Usage Data – Signs of quality include a great deal of traffic, a high clickthrough rate, many repeat visitors, long dwell time, and numerous page views by one visitor, to name a few. Some search engines will leverage these signals to better high quality documents’ rankings via algorithms like Panda.
User Engagement (see usage data)
User Generated Content - (UGC) Blogs, Folksonomies, Social Media, wikis, and more count on User Generated Content. Technically speaking, it could be said that Google uses the entire web as user generated content.
Vertical Search - A search service focused on specific information, a specific field, or a specific information format.
Business.com is a B2B vertical search engine, for instance.
Viral Marketing – Self-propagating marketing techniques. Word of mouth, blogging, and email are common channels.
Numerous social bookmarking sites as well as social news sites result in secondary citations.
Virtual Domain - Website that is hosted on a virtual server.
Virtual Server - A server that lets many top level domains be hosted on a single computer. Dedicated hosting is recommended for big commercial platforms, while a virtual server can be good for smaller applications.
Walled Garden - A collection of pages that link to one another but are not linked to by other pages. A walled garden will likely have low PageRank even if it can still be indexed.
Web 2.0 - Categorized by websites that promote user interaction.
Weblog (see Blog)
Webmaster Tools (see Google Webmaster Tools)
White Hat SEO – A method of making a website more visible in a way that is fair and not manipulative (keyword stuffing, poor content and artificial link-building are bad practices – these are labeled as black hat SEO). Techniques which fall into the right guidelines are white hat SEO. Search engines set up guiding principles which assist them in making billions of dollars in ad revenue from the attention of searchers and work of publishers.
Whois – Every single domain has an owner of record. This information is stored in the Whois record for said domain. Ownership data can be hidden depending on the domain registrar.
Widget – (gizmo or gadget) These are programs that can make good link bait; small applications utilized on web pages to provide precise functions such as IP address display or a hit counter.
Wiki - Software that uses collaborative editing to publish information.
Wikipedia - Free online encyclopedia that utilizes wiki software.
Wordnet - A lexical database of English words that assists search engines in comprehending word relationships.
WordPress - Popular open source blogging software platform, available both as a hosted service and as a downloadable program.
Remember that content published to your own domain is better for building a brand, because you accrue link equity and age related trust.
Wordtracker – A keyword research tool filled with rich features that collects data from prevalent meta search engines such as Dogpile.
XHTML - Extensible HyperText Markup Language is a set of stipulations intended to move HTML to conform to XML formatting.
XML - Extensible Markup Language is an easy and flexible text format that comes from SGML; its purpose is to simplify syndicating information with technology like RSS.
XML Sitemap - This is a listing of every page on a website in a document hosted on the website’s server. This lets webmasters notify search engines when new pages have been updated or added.
This is great for sites that have pages with few links as it can help them to get discovered by Google.
With WordPress, for instance, a basic sitemap is generated automatically, and plugins can notify search engines when it changes.
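A minimal XML sitemap entry follows the sitemaps.org format (the URL and date are hypothetical):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/page.html</loc>
    <lastmod>2015-01-01</lastmod>
  </url>
</urlset>
```

Each page gets its own `<url>` block; the optional `<lastmod>` date tells crawlers when the page last changed.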
YouTube - Google's Video upload and syndication website that allows visitors to post, share and view videos