Tuesday, 28 January 2014

When Was The Last Time You Checked Your Links?



An important component of the search engine ranking algorithm is the inbound links that point to a website. The search engines use this link information as one factor to determine the authority of a website, and in turn where it should appear on the search engine results page. Good links from high-quality, industry-authority websites convey that the linked-to website is high quality by association and pass on some of that authority. Bad links, on the other hand, can really wreak havoc on a site, especially in recent years after the release of the Google Penguin update, which targets websites that have lots of inbound links from low-quality sites.

It is the responsibility of the website owner to keep an eye on their inbound link portfolio and take action if anything seems amiss. Unfortunately, given the nature of the web, a website owner doesn't have complete control over their inbound links. Any website out there can link to your website without your permission. Sometimes negative SEO is at play, and competitors or spammers set out to sabotage your site by building lots of low-quality links in a short period of time (a huge red flag to the search engines). In other cases, website owners don't do their research and make the mistake of outsourcing SEO link building efforts to black hat SEO firms that will also build these low-quality links to try to beat the system and meet a certain quota of links per month. No matter how these bad links show up, website owners need to do their best to remove them once they appear, and hopefully before they do any permanent damage to the search engine authority of the website.

On a regular basis a website owner will need to conduct a manual link audit to determine the quality of the site's inbound links. You can pull a list of inbound links from your Google Webmaster Tools account. It's also a good idea to get an additional list from a paid tool like Moz, since each program gathers links in different ways and might give you a slightly different list of inbound links. You will then need to analyze the list of links. If you recognize a domain as "good" (for example, a site that you've been actively guest blogging or blog commenting on), you can skip over it. It's the ones that you don't immediately recognize that you will need to manually visit to determine whether the link occurred naturally or comes from a low-quality site that could hurt you.

When you spot a link from a low-quality site, you will need to try to get that link removed ASAP. To start, locate the contact information of the site owner and send an email stating that you'd like the link removed. Follow up if necessary. If you see a significant number of poor links, you can also create a disavow list and submit that to Google, essentially telling Google that you know those links are bad and don't want Google to consider them as a part of your link portfolio for SEO purposes.

Checking your inbound links certainly isn’t a fun part of SEO, but it needs to be done. If you don’t keep an eye on your inbound links they can get out of control and a search engine penalty can sneak up on you.

How To Connect Google Analytics To Your WordPress Site

Tracking your website's visitors is a very important task when it comes to monitoring your online success. If you don't know how many people are coming to your website and from what sources (Google, Facebook, other sites, etc.), how can you effectively grow your online presence? You can easily track traffic to your website with Google Analytics and the Google Analyticator plugin. There are many WordPress plugins that you can use to connect Google Analytics to your WordPress site, but I feel Google Analyticator has the best options and is the most user friendly. It also allows you to monitor your traffic directly from the WordPress admin dashboard so you don't have to log into any other platform. Even better, it's available for free!

Google Analytics:
To begin, sign up for Google Analytics if you haven't already done so.
1. Log in to Google Analytics and, inside the dashboard, add a website profile.
2. After entering your website URL and naming your profile, click save and finish.

Google Analyticator:
Installing the Google Analyticator plugin is extremely easy. If you don’t know how to install a plugin or need help, then please read my previous post “How To Install a WordPress Plugin”.

Once the plugin is activated go into Google Analyticator’s settings. You’ll need to grant Google Analyticator access to your Google Analytics account.

1. Click the button to continue.
2. A popup with your Google account info will come up asking you to grant access to the plugin. Click accept.
3. Now that the two are hooked up, you'll see a dropdown next to “Google Analytics Logging”. Make sure that is enabled.
4. Next to “Google Analytics account”, select the website profile you set up earlier.
5. To finish the setup, scroll to the bottom of the page and click the blue “Save Changes” button.

You’ve now successfully linked Google Analytics to your website!

Required Elements of a Modern SEO Campaign

SEO has undergone multiple transformations through the years. During the early days, the focus was on building links to your website. At one point, nearly any link would help. However, that’s no longer the case. The only way a link is going to help your SEO efforts today is if it generates traffic and comes from a relevant, high quality source. An SEO campaign looks much different than it did even just a few years ago and also requires more creativity and effort.

Here are the 5 required elements of a modern SEO campaign:
Keyword research:
The fact that keywords play an important role in SEO is nothing new. Keywords are at the foundation of the campaign. What’s no longer accepted is “stuffing” keywords into your content. Keywords should be implemented into content naturally, so that it’s not even noticeable by those that are reading it. In order to utilize keywords, you first need to select the best keywords to use; the keywords that have search volume and are most relevant to what you offer. Finding these keywords requires keyword research. Just guessing which keywords to use could result in missed opportunities since target audience members may have different ways of searching that you haven’t thought of.

On site optimization:
Once keyword research has been completed, the next step in the process is to optimize the whole website. This provides you with a solid foundation for the rest of your SEO activities. By implementing keywords into the meta tags and on page content, those pages are more likely to appear for related search terms. On site optimization isn’t enough, since the site needs to earn trust, but it’s an important first step.

Industry research:
Link building is still a part of an SEO campaign, but the key is to build links from relevant sites that are in some way related to your industry. You will need to spend time browsing the web looking for potential opportunities such as industry directories, blogs, forums, etc. You should already be regularly visiting industry sites to stay up to date with current trends, so you can start with those sites. The next step is to use an SEO tool like Moz to pull a list of your competitors' inbound links and look for opportunities in those lists as well.

Content marketing:
Traditional link building efforts are worthwhile, but what’s going to generate natural inbound links to your site is the thought leadership content that you create. Start by implementing a blog on your site and posting quality content on a regular basis. Once you’ve gotten into that groove, you can look for additional guest posting opportunities on other industry sites. Publishing informational content will increase the awareness of your business across the web.

Social media:
Social media is now closely tied to SEO since the search engines consider social signals as a part of the search ranking algorithm. All content that is published should be shared in social media in order to stimulate this kind of activity.

Complete SEO Glossary 2014

Above the fold (ATF): Originally a newspaper term, above the fold means on the top half of the page. Placing a story above the fold makes it more visible. In Web publishing, in which no fold exists, premium placement generally means toward the top of the page, in a position where visitors don't have to scroll down. Screen resolutions differ, of course, so if you design your page using a resolution of 1280 x 1024, for example, your own fold is way down the page. The higher the resolution, the more material you can put into each "fold" portion of the page, because high resolutions make text and graphics smaller. (In effect, high resolution makes the screen bigger.)

Algorithm: A formula or set of steps for solving a particular problem. To be an algorithm, a set of rules must be unambiguous and have a clear stopping point. Algorithms can be expressed in any language, from natural languages like English or French to programming languages like FORTRAN. We use algorithms every day. For example, a recipe for baking a cake is an algorithm. Most programs, with the exception of some artificial intelligence applications, consist of algorithms. Inventing elegant algorithms (algorithms that are simple and require the fewest steps possible) is one of the principal challenges in programming.

Alt Attribute: XHTML attribute that provides alternative text when non-textual elements, typically images, cannot be displayed. It is placed on the image tag, a very important tag that directs the browser to either a GIF or JPEG file; the browser then displays that image file where the tag is placed.
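Example: <img src="photo.jpg" alt="description of the image" />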

Backlink: A link at another site, leading to your site. Also called an incoming link. The number and quality of backlinks represent the most important factor in determining a site's PageRank. The value of any backlink is determined partly by the PageRank of the linking site, which is determined partly by the quality of its backlinks, and so on.

Bandwidth: Bandwidth refers to the amount of data that can be transferred from one place to another within a certain amount of time. Digital devices measure bandwidth in bytes per second. The bigger the bandwidth, the faster the data can be transferred.

Blog: A blog (short for "weblog") is a journal that's available on the internet. Updating a blog is referred to as "blogging" and the person keeping the blog is a "blogger". Blogs are usually listed in chronological order, with the most recent entry first. Many blogs are available as RSS feeds, which means they are delivered to a feedreader.

Blogosphere: The term blogosphere describes the information available on blogs and/or the sub-culture of those who create and use blogs. By its nature, the blogosphere tends to be democratic, inclusive, and encourages two-way communication between its participants.

Buzz keyword: Coined by Dr. Clinton Cimring in 2006, a buzz keyword is a newly created keyword that has only recently become recognized as an auto-suggestion in Google and Yahoo!.

Cloud Hosting: Cloud Hosting is web hosting where more than one server is used as a host. This may be in the context of multiple servers in one single location or multiple servers in multiple locations. The benefit of the first option of cloud hosting is that if there is a power outage in one geographic location, the second server would not be affected. The benefit of the second option of cloud hosting would be that server resources can be combined for the same files.

Cloaking: A type of search-engine subterfuge in which an indexed Web page is not shown to visitors who click its link in Google (or another search engine). The cloaking works two ways: Visitor content is cloaked from Google, and Google's indexed content is cloaked from visitors. This serves to give a high PageRank to content that ordinarily would rate a low PageRank. Cloaking is not always illicit. Certain types of cloaking are used to deliver pages tailored to a visitor's ISP (America Online, for example) or specific Web browser.

Congregate Websites: Congregate website is a term coined by Dr. Clinton Cimring in 1998, which refers to any website that compiles other websites in bulk, such as directories or networking websites.

Content Dilution: Content dilution is the result of either 1) having too much content on a page that is being optimized, thereby diluting the densities of keywords that would otherwise be featured, or 2) having too many pages on a website, thereby losing the ability to feature all of them.

CRM: Customer relationship management, or CRM, software is for tracking the traditional sales process including marketing automation, lead generation, sales forecasting, measuring ROIs, etc. Note: Although BatchBook is often referred to as a CRM, it is actually more about managing contact information than sales leads, although it could do both.

Cross linking: Intentionally or unintentionally, cross linking creates large backlink networks among sites that exist in the same domain or are owned by the same entity. Unintentional cross linking happens when a site generates a large number of pages with identical navigation links or when at least two sites mutually link related content. When cross linking is done intentionally, the Webmaster is seeking to raise the PageRank of the involved sites. Excessive cross linking can backfire. If Google decides that the resulting enhanced PageRank is artificial, any or all of the sites might be expelled from the Web index. Innocent cross linking between two related sites is usually not a problem.

Deepbot: The unofficial name for Google's monthly spider. Freshbot is the unofficial name of Google's frequently crawling spider. The official name for both crawlers is Googlebot.

Density: Most search engines look for keyword density. Some will only look at the first 200-400 characters of your site, and count the number of times the keyword appears. Some index a small amount of text from the top, middle, and bottom parts of your web page, and search them for keywords. Generally keyword density should be in the 6-8% range. Simply repeating the keyword will not work because some search engines consider grammar structure in their calculations. For a very competitive keyword you could aim a little higher, perhaps targeting a 10% range, but you have to take into consideration that the search engine may consider this spamming.
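To make the density figure concrete, here is a minimal Python sketch of one way it can be computed (the sample text and keyword are made up, and real search engines weight sections, tags, and grammar far more heavily):

```python
# Illustrative keyword density calculation: the keyword's share of all
# words on the page, expressed as a percentage.
import re

def keyword_density(text, keyword):
    """Return the keyword phrase's share of total words, as a percentage."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    if not words:
        return 0.0
    kw_words = keyword.lower().split()
    n = len(kw_words)
    # Count occurrences of the keyword phrase in the word stream.
    hits = sum(1 for i in range(len(words) - n + 1) if words[i:i + n] == kw_words)
    # Each occurrence accounts for n of the page's words.
    return 100.0 * hits * n / len(words)

text = "SEO tips: good SEO starts with research. SEO is ongoing."
print(round(keyword_density(text, "seo"), 1))  # → 30.0
```

A density that high would of course be read as stuffing; the point is only to show the ratio being measured.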

Directory Submissions: The act of supplying a URL to a search engine in an attempt to make a search engine aware of a site or page.

Domain: The first- and second-level address of a Web site. Top-level destinations are defined by the domain extension: .com, .net, .org, .biz, and others. The second level adds a domain name: yoursite.com.

Domain name: The second-level domain address that identifies and brands a site, such as google.com and amazon.com.

Domain name registration: The process of taking ownership of a domain name. Registrations are processed by dozens of registrars approved by ICANN (Internet Corporation for Assigned Names and Numbers). The cost of domain ownership is no more than $35 per year. (Hosting the domain's Web site is an additional expense.) Registration takes place online, and the activation of a new domain (or moving a domain from one host to another) generally requires no more than 48 hours.

Doorway page: An entry page to a Web site, sometimes known as a splash page. Doorway pages endure a negative connotation due to illicit techniques that send visitors to an entirely different site than the destination they clicked in Google.

Dynamic content: Web pages generated by an in-site process that depends on input from the visitor. Most dynamic content comes from a database operating behind the scenes, feeding information to a Web page created in response to a visitor's query. Search engines are among the largest producers of dynamic content; every Google results page, for example, is pulled from the background index in response to a keyword query. Google's spider generally avoids portions of sites that rely on dynamic page-generation, making it difficult to index the content of those sites.

Feed reader: A feed reader (also known as an RSS reader, news reader, or feed aggregator) is an application (desktop or web-based) that allows you to subscribe to multiple RSS feeds, allowing you to read the content from many websites from one place.

Folksonomy: The word folksonomy is a combination of folks, meaning "people", and -onomy, meaning "management". Users create informal social categories using tags to organize content so that others may easily find and share it.

Forum: A forum is a web-based application that allows people to hold discussions through individual posts. The posts will be displayed in chronological order or as threaded discussions.

Fresh crawl: Google's frequent scan of Web content that occurs between the deep monthly crawls. Google does not publicize the schedule of its intermediate crawls or its target sites. The term "fresh crawl" is an unofficial one used by Webmasters, site optimizers, and other Google obsessives.

Freshbot: The unofficial name for Google's near-daily spider. Deepbot is the unofficial name of Google's monthly-crawling spider. The official name for both crawlers is Googlebot.

Geo-targeting: Geo-targeting applied to organic SEO is the process of combining keywords with geographic criteria such as city names, metropolitan areas, or zip codes. An example may be, "personal trainer boca raton." These results would appear on the left-hand side of the screen. Geo-targeting applied to SEM is displaying ads based on the target audiences' IP Address. In this case, the searcher would type in, "personal trainer," and the results would appear on the right hand side for searchers who live in zip codes within Boca Raton or the surrounding area.

Heading Tag: Headings (h1-h6) are used as the topics of the website's sections.
Example: <h1>Heading tag</h1>

HTTP: Short for Hypertext Transfer Protocol, the underlying protocol used by the World Wide Web. HTTP defines how messages are formatted and transmitted, and what actions Web servers and browsers should take in response to various commands. For example, when you enter a URL in your browser, this actually sends an HTTP command to the Web server directing it to fetch and transmit the requested Web page. HTTP is called a stateless protocol because each command is executed independently, without any knowledge of the commands that came before it. This is the main reason that it is difficult to implement Web sites that react intelligently to user input. This shortcoming of HTTP is being addressed in a number of technologies, including ActiveX, Java, JavaScript, and cookies.
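To illustrate the stateless command model, here is a minimal Python sketch of the raw text a browser sends for a single GET request (the host and path are hypothetical; nothing is actually transmitted):

```python
# The raw text of one self-contained HTTP GET command. The server needs
# no memory of earlier commands to answer it, which is what "stateless"
# means in practice.
request = (
    "GET /index.html HTTP/1.1\r\n"   # method, path, and protocol version
    "Host: www.example.com\r\n"      # which site on this server we want
    "Connection: close\r\n"          # close the connection after responding
    "\r\n"                           # blank line ends the headers
)
print(request)
```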

Index: In the context of Google, the index is the database of Web content gathered by the Google spider. When Google receives a search query, it matches the query keywords against the index.

Internet Marketing: Internet marketing is the act of promoting products and services by increasing a web site's online visibility. Some of these promotion techniques include: natural SEO, pay per click advertising, e-mail marketing, newsletter distribution, blogging, community forums, article writing and distribution, and banner advertising.

IP Address: Short for "internet protocol address", this is a unique number that identifies a computer connected to the Internet to other Internet hosts. An example of an IP address is 127.0.0.1.

Keyword: As an optimization term, a keyword represents a core concept of a site or a page. The site's content, XHTML tagging, and layout strategies are based on effective deployment of keywords, which could also be key phrases. Google matches search results to keywords entered by its users and assigns a PageRank in part on how consistently a site presents its keywords.

Keyword Count, Occurrence: How often a keyword or keyword phrase occurs in a particular XHTML page section. The keyword count is used in a calculation to determine the keyword density.

Keyword density: A proportional measurement of keywords embedded in a page's content. High keyword density focuses the page's subject in a way that Google's spider understands. The spider can interpret too high a density as spam, which results in a lower PageRank or elimination from the index. Most optimization specialists recommend a density between 5 and 15 percent.

Keyword stuffing: The attempt to gain a higher PageRank (or higher ranking in any search engine) by loading a page's XHTML code or text with keywords. In most cases a visitor can't see the keywords because they're buried in XHTML tags, camouflaged against the background color of the page, or reduced to a tiny typeface. Keyword stuffing violates Google's guidelines for Webmasters and can result in expulsion from the index.

Link farm: A site whose only function is to display outgoing links to participating Web sites. Link farms are disreputable versions of legitimate, topical link exchange sites through which visitors gain some content value. Link farms often have no topicality and present no guidelines or standards of submission. Google does not explicitly threaten expulsion for joining link farms, but it discourages their use.

Link Popularity: A measure of the quantity and quality of sites that link to your site. A growing number of search engines use link popularity in their ranking algorithms. Google uses it as its most important factor in ranking sites. HotBot, AltaVista, Microsoft Bing, Inktomi, and others also use link popularity in their formulas. Eventually every major engine will use link popularity, so developing and maintaining it are essential to your search engine placement.

Link Text or Anchor Text: Link text is the clickable text which connects one web page to another.
Example: <a href="page.html">link text or anchor text</a>

Local Internet Marketing: Internet marketing geo-targeted through directories, Google maps, and social networking sites.

Geotargeted SEO: SEO targeted toward a city, state, or metropolitan area by utilizing either geo-targeted terms and keywords in content or jargon for that specific area.

Manual Submission: Adding a URL to the search engines individually by hand.

Meta tag: Positioned near the top of an XHTML document, the meta tag defines basic identifying characteristics of a Web page. Often, several meta tags are used on each page. In those tags you set the page's title, description, and keywords.
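Example: <meta name="description" content="A short description of the page" />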

Mirror site: Mirror sites duplicate content and are used for both legitimate and engine-spamming purposes. Legitimate mirror sites assist in downloading when a great deal of traffic is trying to reach a page or acquire a file. Illicit mirror sites attempt to fill a search results page with multiple destinations owned by a single entity. When Google discovers a mirror site whose only purpose is to dominate a search page, that site risks expulsion.

Optimization: A set of techniques to improve a Web site's presentation to visitors and its stature in a search engine's index. As a specific field, SEO has suffered in reputation due to unscrupulous individuals and companies using tactics that degrade the integrity of search results and violate guidelines set by those engines. Generally, any optimization scheme that tricks a search engine also tricks visitors to that site, making online life worse for everyone involved. Pure optimization, though, helps everyone: the Webmaster, the search engine, and the visitor. The true values of optimization are clear content, coherent navigation, wide reputation for quality, and high visibility in search engines.

Organic SEO: Organic SEO, or natural SEO, refers to SEO results appearing on the left-hand side of a search engine results page. It is distinguishable from SEM, which primarily focuses on pay-per-click advertising, usually the sponsored links appearing on the right-hand side of the screen. Organic SEO usually has a much higher return on investment than SEM; in fact, it usually has the highest ROI of any advertising or marketing medium.

Outgoing link: A link from your page to another page. Outgoing links don't build PageRank by volume, as incoming links (backlinks) do. However, Google pays attention to the text elements of outgoing links, and a page's optimization can be strengthened by consistent placement of key concepts in that text.

Page redirect: A background link that sends site visitors to another site. Page redirects can be used legitimately, as when a site moves from one domain to another. In that scenario, the Webmaster sensibly keeps the old domain active for a while, seamlessly sending visitors to the new location when they click the old one. As an illicit optimization technique, page redirects deflect visitors from the site indexed by Google to another site that would not be able to gain as high a PageRank. This type of redirect, when uncovered by Google, risks the expulsion of both sites from the index.

PageRank: Google's proprietary measurement for ordering pages in its Web index. PageRank is the most intense point of focus, speculation, observation, and desire in the Webmaster and optimization communities. More than any other single marketing factor, PageRank has the power to determine a site's visibility. A high PageRank moves a page toward the top of any search results page in Google when that page matches the user's keywords. Obtaining a PageRank high enough to break a page into the top ten is the primary goal of Google optimization. An approximate version of any page's PageRank can be checked by displaying the page in Internet Explorer while running the Google Toolbar.

Pixel tracking/web bugs/patty mail: Implementation of a piece of code into the website of the advertiser. This then tracks the user's behavior on the website and reports information back to the ad-serving system. Pixel tracking is used for optimization and for tracking conversions.

Prominence: Prominence is the ratio of the position of one keyword or keyword phrase to the positions of the other keywords in an XHTML section of the page. For example, the text enclosed by the BODY tag is one of the sections of the page in which we measure keyword prominence. Your most important keywords must appear in the crucial locations on your web pages, because search engines favor pages where keywords appear closer to the top of the page. They should preferably appear in the first paragraphs of your page. Also keep in mind that keywords placed closer to the bottom of your page will have a negative effect on the overall keyword prominence calculations.

Rank Theft: A term coined by Dr. Clinton Cimring in 2006, rank theft is a method directories and other congregate websites use to gain the would-be page rank attributed to a website by offering subpages on their own domain. This is different from Google Jacking or spoofing, where one website is redirected to another website. Most recently, sites like Myspace and Merchant Circle began offering subpages for free with the hopes that users would use these pages instead of building their own websites. Based on this approach, myspace.com's page rank rose approximately 1 point per month. Merchantcircle.com rose from a PR of 0 in 2005 to a PR of 7 in 2008. In contrast, websites like wordpress.com and blogger.com offer subdomains rather than subpages in order to avoid these false attributes.

Robots.txt file: A simple text file that stops Google (and other search engines that recognize the file and its commands) from crawling the site, selected pages in the site, or selected file types in the site.
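A minimal illustrative robots.txt (the directory paths are hypothetical):

```
User-agent: *
Disallow: /private/
Disallow: /cgi-bin/

User-agent: Googlebot
Disallow: /test-pages/
```

The first block applies to all crawlers that honor the file; the second adds a rule only for Googlebot.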

RSS feed: RSS stands for Really Simple Syndication. An RSS feed is a document that contains either a summary of content from a web site or the full text of a website. RSS feeds make it possible for people to keep up with their favorite web sites automatically rather than checking them manually.

SE (search engine): A site, such as Google.com, that matches keywords to Web page content.

SEM (search engine marketing): SEM is SEO that focuses on the marketing aspect of optimization in order to produce results, rather than on the backend programming, coding, content, and design of a website. This marketing is usually associated with pay-per-click campaigns, banner ads, and affiliate networks intended for branding rather than action by viewers. The long-term costs of SEM are usually much higher than those of organic SEO, and it may even have a negative return on investment.

SEO: Search engine optimization is a manipulation of a search engine's algorithm in order to have a website appear higher in search engine results. For an update, see SEO 2.0. It is highly debated in the web community whether the approach was invented by Dr. Clinton Cimring or Danny Sullivan of searchengineland.com.

SEO 2.0: SEO 2.0, a term coined by Dr. Clinton Cimring in 2007, is an optimization reaction to Google's new Universal algorithm and to Web 2.0 social media sites like Digg, del.icio.us, Technorati, and StumbleUpon. It includes the following results within Google: images, videos, indented pages, subpages, a search within a search, and SML descriptions. 

SEO Copywriting: Writing specifically for web pages involves incorporating target keywords that tell the search engines what a specific web page is about. Effective SEO copywriting achieves two goals. 1) It creates persuasive, informative content for the web site visitor while 2) maintaining an optimum keyword count for the search engines to index.

SEO Footprint: An SEO footprint is the imprint a search engine optimizer leaves on the web that can be used to trace his/her activity through various sites. It can be used to locate multiple accounts and multiple sites he/she owns. An SEO footprint is an obvious sign of search engine manipulation and can be used by Google or the competition to rip apart his/her network.

Search Engine Positioning: Typically, a search engine works by sending out a spider to fetch as many documents as possible. Another program, called an indexer, then reads these documents and creates an index based on the words contained in each document. Each search engine uses a proprietary algorithm to create its indices such that, ideally, only meaningful results are returned for each query.

Search Engine Ranking: A program that searches documents for specified keywords and returns a list of the documents where the keywords were found. Although search engine is really a general class of programs, the term is often used to specifically describe systems like AltaVista and Excite that enable users to search for documents on the World Wide Web and USENET newsgroups.

SERP: Search engine results page. A page of links leading to Web pages that match a searcher's keywords.

Server: A server is a computer running administrative software that controls access to the network and its available resources such as printers and disk drives. It also provides resources to computers that are operating on the network. A server can also be a program that contains data or files and that responds to commands.

Social Media: The term social media describes media that is posted by the user and can take many different forms. Some types of social media are forums, message boards, blogs, wikis and podcasts. Social media applications include Google, Facebook and YouTube.

Social Networking: A social networking site allows you to identify your contacts and establish a link between you and each of your contacts.

Spam: Generally refers to repeated and irrelevant content. As an optimization term, spam refers to loading a page with keywords or loading a search engine's index with mirror sites. Google reacts strongly to spamming, and takes harsh measures against Web sites that use spamming techniques to improve PageRank.

Spider: An automated software program that travels from link to link on the Web, collecting content from Web pages. Spiders assemble this vast collection of content into an index, used by search engines to provide relevant sites to searchers. Spiders are also called crawlers, bots (short for robots), or Web bots. Google's spider appears in Webmaster logs as Googlebot.

StopWords: Words that are common in a full-text file but have little value in searching. Words in a stopword file will be excluded from the indexes, considerably reducing the size of the indexes and improving search performance. For example, these are stopwords: a, about, an, are, as, at, be, by, com, for, from, how.
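A minimal Python sketch of how an indexer might drop stopwords before building its index (the word list is abridged to the examples above):

```python
# Stopword removal: filter out common low-value words so the index only
# stores terms that are useful for searching.
STOPWORDS = {"a", "about", "an", "are", "as", "at", "be", "by", "com", "for", "from", "how"}

def remove_stopwords(words):
    """Return the words worth indexing, preserving their order."""
    return [w for w in words if w.lower() not in STOPWORDS]

print(remove_stopwords(["How", "to", "optimize", "a", "page", "for", "search"]))
# → ['to', 'optimize', 'page', 'search']
```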

Tag: A tag is a keyword used to describe a piece of data (such as a blog post, photo, video, etc.). Tags can either be assigned by the author of the content or the consumer of the content.

Title Tag: XHTML tag used to define the text in the top line of a Web browser, also used by many search engines as the title of search listings.

Title Attribute: Link title is the attribute of the link and adds information about the link, it is rendered as a tool tip in the browser.
Example: <a href="title-attribute.html" title="link title">text</a>

Trackback: When a blog links to another blog, a trackback is a notification sent between the two blogs letting the receiving blog's author know (s)he is being linked to (this implies that both blogs have the ability to send and receive trackbacks).

Tweet: A "Tweet" is an individual message (or "update") posted from Twitter.

URL: A URL (Uniform Resource Locator) is the address of documents and resources on the internet. Most search engines look for the keywords in the domain name, folder name and page name. Keywords should be separated by hyphens.
Example: http://www.keyword1.com/keyword2-keyword3.html

Web 2.0: Web 2.0 is a trend in the use of World Wide Web technology and web design that aims to facilitate creativity, information sharing, and, most notably, collaboration among users. These concepts have led to the development and evolution of web-based communities and hosted services, such as social-networking sites, wikis, blogs, and folksonomies (the practice of categorizing content through tags). Although the term suggests a new version of the World Wide Web, it does not refer to an update to any technical specifications, but to changes in the ways software developers and end-users use the internet.

Web 3.0: Web 3.0 is a phrase coined by John Markoff of the New York Times in 2006, which refers to a supposed third generation of Internet-based services that collectively comprise what might be called "the intelligent Web": services using semantic web, microformats, natural language search, data mining, machine learning, recommendation agents, and artificial intelligence technologies, which emphasize machine-facilitated understanding of information in order to provide a more productive and intuitive user experience. Nova Spivack defines Web 3.0 as the third decade of the Web (2010–2020), during which he suggests several major complementary technology trends will reach new levels of maturity simultaneously.

Wiki: A wiki is a website or piece of software that allows users to create and edit webpages. Users are able to link to outside sites and collaborate on the information that is posted.