Top SEO Strategies You Need For Your Online Success – Part Two
Have you read Top SEO Strategies You Need For Your Online Success – Part One? If not, read Part One here before continuing.
There’s no doubt about it. Optimizing pages to satisfy search engines can be a tedious and demanding task – not just initially, but for as long as the website remains live on the web.
Basically, your search engine optimization never ends.
You strive for high page rank. That can mean an actual score like the one Google assigns to individual web pages or merely a conceptual rating that provides your website with more search engine recognition and stature than other sites in your area of interest.
Either way, the goal is to make your website more popular, more visible, more important than all the competition.
You might not reach the top of the heap, but that’s where you have to aim in order to land anywhere near the top.
Not that you can’t reach the very top. You can. It’s just not necessary in order to reap all the benefits – at least, from a strictly search engine perspective.
Let’s face it. If you land in the top three positions (or even on the first page) of search results, you’ll most likely capture the same amount of traffic that the number one website enjoys. Maybe even more.
It all depends on your description. Or should we say, the description that a search engine displays in your listing – since meta description tags are rarely used anymore.
If your description more closely matches what a viewer is searching for, they’ll go to your website first. Regardless of what results position you happen to be in.
And even if they don’t go there first, they’ll most likely get there eventually. Unless, of course, one of the other websites has totally and completely satisfied their needs and they don’t feel compelled to continue their search.
The point is, it’s not entirely about what position you gain in search engine results. It’s about targeting a specific keyword (search term) and then making certain you accomplish these two things…
- Your website ranks high for that keyword.
- Your website can deliver on the viewer’s expectations for that keyword.
Of course, delivering the viewer’s expectation is fairly straightforward.
If the search term is “improve golf swing”, it’s a pretty safe bet the viewer is looking for something to improve their golf swing. As long as you provide information or a product (or both) that can satisfy that need, you’re well within striking distance.
Covering the first accomplishment – getting a high rank for your website – is a whole lot more involved.
It’s not just about satisfying a specific viewer need. Instead, it’s all about convincing a search engine that your website is superior with regard to satisfying a specific viewer need. For example…
There are over two million web pages associated with improving one’s golf swing. Some contain information, some contain products. Some contain nothing more valuable than a brief mention of the search term.
Regardless, millions of pages show up in the total search results when a viewer types in “improve golf swing” (approximately 50,000 results if you put quotes around it, which the majority of searchers don’t include).
All you have to do is dive into that vast ocean of search results and somehow manage to dog-paddle your web page past all the other possibilities and onto the sandy beach. Where only a few top ranked pages are currently basking in the sun.
The only question is, how do you accomplish that? How do you wind up in front of all those other web pages?
You start by analyzing each of those top ranked pages. You sift through their source code, their web content, their design techniques. Whatever it takes to find out exactly what they’re doing that placed them in the top results positions.
And then you do the same thing. Only better. And you keep doing it until you reach your ultimate goal.
That goal might just be the number one position. Or maybe it’s getting listed in the top three. Or maybe you’re willing to settle for any position on the first page of search results.
It doesn’t really matter.
Whatever goal you’ve set, whatever position you’re shooting for, you level your sights on the top ranked web pages and then do everything they’re doing and more.
Of course, if you’re targeting a less sought-after search term, you won’t have to work nearly as hard. And that’s why so many savvy webmasters do just that…
They deliberately seek out search terms that are valuable to their particular niche, but don’t have nearly the amount of competition associated with them.
That way, simply implementing the basic optimization techniques will most often ensure them a top position in search results for any one of those keywords.
Of course, you have to know which optimization techniques work for which search engines or directories. They’re all different. They all set their own criteria for what elements are most important.
Some put the greatest emphasis on link popularity. Others place a good deal more value on the count and density of a specific keyword on individual web pages. Still others are more interested in seeing a basic theme or topic carried throughout the entire website.
Fortunately, if you limit your optimization efforts to satisfying the top players – Google, Yahoo, MSN, and Open Directory (DMOZ) – you can cover the most important SEO bases simultaneously.
For example, even though having the keyword in your page title might not carry a great deal of weight with Yahoo, it’s an absolute must when it comes to satisfying Google. So put your keyword in the title.
Although DMOZ doesn’t care so much about links pointing to your page from other websites, Google, Yahoo, and MSN do place a considerable amount of value in how “popular” your page is.
And all of them want to see a fair amount of quality keyword-rich content and a solid topic or niche theme throughout.
By incorporating all of the most important optimization techniques – the ones that are universally perceived as most valuable – you’ll find that you have automatically satisfied the top players.
And speaking of top players, Google is the one that you need to aim most of your time and energy toward. And to assist you in that regard, the majority of this particular report contains Google specific information.
Concentrate on rising to the top of Google’s results and everything else will naturally fall into place. It’s just that simple.
SEO Strategy – Google Style
Google Webmaster Tools
They’re free and yet very few webmasters take advantage of the tools that Google has made available. And that includes Google Sitemaps, one of the best methods for getting your pages crawled and subsequently indexed (we’ll talk about that one in depth in the next segment).
Listed below you’ll find some of the free SEO tools that you should be using on a regular basis.
NOTE: In order to use any of these tools, you’ll need a special key. Just click on “Get a Free Google API Key” or go to http://www.google.com/api and submit the form. The key will then be sent to whatever email address you specify.
Google Rankings
http://www.googlerankings.com/index.php
This tool allows you to locate the search results position for any given keyword and URL address. You can input one word at a time or multiple keywords.
You also have three choices with regard to where the search will be conducted. That gives you the option of seeing what position is held in one or more of the three major contenders… Google, Yahoo, and MSN.
The nice thing about this particular tool – aside from the valuable information it provides – is the fact that it’s relatively fast, unlike other tools of this type that can take several minutes to complete the search and results process.
Google SEO Tool
http://googlerankings.com/ultimate_seo_tool.php
When it comes to keyword optimization, this tool is an absolute must. There are two steps involved which return information about keyword count, keyword density, and keyword position.
Step 1
Analyze Keywords – Gives you a list of 1-, 2-, and 3-word phrases that appear “x” or more times on any given page (“x” is the amount you choose when first filling out the form). You also receive the density percentage for each phrase listed.
It will also display the page title, the meta description and keywords tags, and the top five most often used keywords.
Step 2
Create Position Report – Tells you what position the web page holds in Google search results for each of the top five words found in Step 1.
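If you ever want to run that kind of Step 1 analysis yourself, the keyword math is simple enough to script. Here is a minimal sketch in Python – the sample text is just a placeholder, and the density formula shown is one common convention, not necessarily the exact one this tool uses…

import re
from collections import Counter

def keyword_density(text, min_count=2):
    # Break the page text into lowercase words.
    words = re.findall(r"[a-z0-9']+", text.lower())
    total = len(words)
    report = {}
    # Build 1-, 2-, and 3-word phrases, just like the tool's Step 1.
    for size in (1, 2, 3):
        phrases = (" ".join(words[i:i + size]) for i in range(total - size + 1))
        for phrase, count in Counter(phrases).items():
            if count >= min_count:
                # Density: words devoted to this phrase as a share of all words.
                report[phrase] = (count, round(100.0 * count * size / total, 2))
    return report

page_text = "improve your golf swing with drills that improve your golf swing"
for phrase, (count, density) in keyword_density(page_text).items():
    print(f"{phrase}: count={count}, density={density}%")

Run against a real page’s text, this gives you the same count-and-density view the tool reports.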
Googlerankings Position Tracking
http://googlerankings.com/positiontracking/
This is an excellent means of staying on top of all your search engine positions. You create a free account and then log in to input whatever URL addresses and keywords you want to keep track of.
It allows you to check your ranking history, create charts, or download data to your spreadsheet application.
Google AdWords Keyword Tool
https://adwords.google.com/select/main?cmd=KeywordSandbox
Use this suggestion tool to get ideas for new keywords that can help improve your ad relevance. Enter one or more keywords and Google will show you matching queries and alternatives. It can be very helpful when running AdWords campaigns.
Google Suggest
http://www.google.com/webhp?complete=1&hl=en
As soon as you start typing in the search query, Google will begin to suggest similar search terms. It will also show you how many results exist for each of those terms. Very helpful when compiling keyword lists or determining niche markets.
Google Sponsored Links
http://www.google.com/sponsoredlinks
Conduct a search in Google that returns sponsored link results only. This is extremely useful when you’re trying to find the proper wording for your AdWords ads or need to see how your competition is doing.
Search Term Difficulty Checker
http://www.searchguild.com/difficulty/
This one doesn’t happen to be directly from Google but it has such tremendous value, it definitely had to be included here.
All you do is enter your Google API Key and a search term. (If you don’t have an API key, you can get one for free at http://www.google.com/api.)
The program will return a score factor that will let you know how difficult it would be to gain a position on the first page of Google search results for the keyword (search term) you just queried. The lower the score, the easier it will be.
Now, whenever you come up with a keyword you think might have potential, you can find out right away whether or not it’s even worth investing any time and effort – both from a traffic-generating perspective and an SEO standpoint.
Google Sitemaps
Everyone knows about sitemaps. Traditionally, a sitemap is a separate page where you include links to every public page on your website.
Sometimes they include brief descriptions of the different pages and the content they contain. Sometimes they are nothing more than a long and somewhat generic list of page links.
Some people create sitemaps with the sole purpose of giving their viewers a comprehensive web page directory.
Some people create sitemaps simply to make certain the search engine crawlers find each and every available page on their website.
And then came Google Sitemaps…
Like all search engine crawlers, GoogleBot is out there with the express purpose of gathering valuable data that can be added to its searchable index. The sooner it can return with new and updated information the better. For both Google and the people who use their search engine.
With that in mind, the Google sitemap service offers a twofold solution.
First, it lightens GoogleBot’s burden of having to constantly crawl the same places over and over again looking for new and updated content.
Now, with a system that tells the bot when and where to crawl, the result is simply a great deal of time being saved. Time that can be spent much more efficiently.
Rather than waste time on pages that have not been (and might never be) updated or changed, the bot can zero in on places that have valuable and current content that can be added to the search database.
For webmasters, Google Sitemaps offers a way to send immediate notification when any change or addition takes place within their websites. This not only increases the possibility of getting pages indexed faster, it ensures that GoogleBot can easily locate pages that are available and bypass any and all pages that aren’t meant to be public.
For the sitemap files themselves, there are two different types that you can implement.
The first one is your typical list of individual pages (just like any other sitemap would display). The second type would be used as an index, listing multiple sitemaps (in the event you have more than one).
The limit is 50,000 URLs per sitemap with a maximum of 1,000 sitemaps.
Google accepts plain text versions but gives higher priority to sitemaps written in XML format. That’s because the XML version includes valuable notification options that can be associated with each URL.
Here is a brief explanation of each of those options.
Last Modified <lastmod>
Allows you to specify the exact time and date a page was last changed or updated. This should conform to the ISO 8601 format (you can read the specifications at http://www.w3.org/TR/NOTE-datetime). If you choose not to include the time, the format for the date alone would be YYYY-MM-DD. March 9, 2006, for example, would be displayed as <lastmod>2006-03-09</lastmod>.
Change Frequency <changefreq>
Allows you to specify how often a page will change or be updated. Valid values are always, hourly, daily, weekly, monthly, yearly, and never. Be aware, however, that the value is merely used as a guide and not a command. It’s possible that any given page can be crawled more or less frequently than the specified value.
Priority <priority>
Allows you to specify a number that tells how important you feel any page is in relation to all the other pages on your website. Valid values range from an absolute low of 0.0 to a maximum high of 1.0 (the default priority value of a page is 0.5).
Keep in mind that the priority you set has no bearing on what search engine results position your page achieves (if any). It merely tells GoogleBot which page should be given the most importance when crawling your website.
XML Sitemap Example
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
<url>
<loc>http://www.example.com/</loc>
<lastmod>2005-01-01</lastmod>
<changefreq>monthly</changefreq>
<priority>0.8</priority>
</url>
<url>
<loc>http://www.example.com/page1.html</loc>
<changefreq>weekly</changefreq>
</url>
<url>
<loc>http://www.example.com/page2.html</loc>
<lastmod>2004-12-23</lastmod>
<changefreq>weekly</changefreq>
</url>
<url>
<loc>http://www.example.com/page3.html</loc>
<lastmod>2004-12-23T18:00:15+00:00</lastmod>
<priority>0.3</priority>
</url>
<url>
<loc>http://www.example.com/page4.html</loc>
<lastmod>2004-11-23</lastmod>
</url>
</urlset>
Sitemap Index Example
<?xml version="1.0" encoding="UTF-8"?>
<sitemapindex xmlns="http://www.google.com/schemas/sitemap/0.84">
<sitemap>
<loc>http://www.example.com/sitemap1.xml.gz</loc>
<lastmod>2004-10-01T18:23:17+00:00</lastmod>
</sitemap>
<sitemap>
<loc>http://www.example.com/sitemap2.xml.gz</loc>
<lastmod>2005-01-01</lastmod>
</sitemap>
</sitemapindex>
Notice the additional .gz extension. To reduce bandwidth, you have the option of compressing your sitemap files using gzip. Uncompressed sitemap files cannot exceed ten megabytes.
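If you’d like to handle that compression step yourself, Python’s standard gzip module covers it. A minimal sketch (the file names are placeholders) that also checks the uncompressed file against the ten megabyte limit before compressing…

import gzip, os, shutil

SIZE_LIMIT = 10 * 1024 * 1024  # ten megabytes, uncompressed

# Refuse to compress a sitemap that already breaks the size rule.
if os.path.getsize("sitemap1.xml") > SIZE_LIMIT:
    raise SystemExit("sitemap1.xml exceeds ten megabytes; split it first")

# Write a gzip-compressed copy alongside the original.
with open("sitemap1.xml", "rb") as src, gzip.open("sitemap1.xml.gz", "wb") as dst:
    shutil.copyfileobj(src, dst)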
Naturally, if you have a relatively small website, managing your sitemap won’t be difficult or overly time consuming. But having a program that automates the process of updating and delivering the sitemap would still be beneficial.
Of course, you probably don’t have one small website. You most likely have (or will have at some point) numerous websites with hundreds if not thousands of pages each. And under those circumstances, an automated system would definitely be an asset.
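As a rough illustration of what such a system automates, here is a minimal Python sketch that assembles an XML sitemap from a handful of pages, pulling each <lastmod> date from the file’s modification time. The site URL, file names, and output name are all assumptions for the example…

import os, time
from xml.sax.saxutils import escape

SITE = "http://www.example.com"                      # assumed site address
PAGES = ["index.html", "page1.html", "page2.html"]   # hypothetical files

entries = []
for page in PAGES:
    # Use the file's modification time as the <lastmod> date (YYYY-MM-DD).
    lastmod = time.strftime("%Y-%m-%d", time.gmtime(os.path.getmtime(page)))
    entries.append(
        "  <url>\n"
        f"    <loc>{escape(SITE + '/' + page)}</loc>\n"
        f"    <lastmod>{lastmod}</lastmod>\n"
        "  </url>"
    )

# Assemble the same structure shown in the XML sitemap example above.
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.google.com/schemas/sitemap/0.84">\n'
    + "\n".join(entries)
    + "\n</urlset>\n"
)

with open("sitemap1.xml", "w") as out:
    out.write(sitemap)  # keep it under 50,000 URLs per sitemap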
Sitemap Equalizer ( http://www.sitemapequalizer.com ) is the best program for doing that. Especially if you want to make absolutely certain everything has been taken care of accurately and properly.
It provides a powerful web spider that will crawl your entire site beforehand, making certain there are no dead ends or traps where a search engine spider can get stuck in a loop, unable to access all of your pages.
For more information about Google’s sitemap service, check out the following pages of their website…
- Google Sitemaps : http://www.google.com/webmasters/sitemaps/
- Google Sitemaps Overview: http://www.google.com/webmasters/sitemaps/docs/en/navigation.html
Google Friendly Design
No information about SEO strategy would be complete without mentioning how basic design elements can affect indexing and page rank. And in this instance, what works best for Google basically applies to all search engines.
The first thing you need to understand is this…
When it comes to good optimization, the only one that really matters – the only one you need to satisfy – is the search engine crawler.
Naturally, nice clean design and proper navigation are important to your viewer. But great website presentation and performance isn’t much good if it doesn’t comply with search engine standards or requirements.
Unlike viewers, who can see your website both from the outside and the inside, search engine crawlers only get to experience your website from the inside, by following the source code from top to bottom.
And they’re on a specific mission… to locate information that will help index any given page. If everything is laid out properly, the crawler will have no problem locating keywords that have been deliberately and properly placed within its path.
That allows the crawler to accurately index your web page. Which, of course, is what you ultimately want. Web pages that are indexed according to the keywords that will provide you with the greatest benefit.
If the design is jumbled (or causes the source code to contain a large volume of unnecessary elements), there’s a good chance the crawler will never come up with a viable indexing choice. And since the crawler is always in a hurry, it’s not about to stick around for any additional or extended length of time on your behalf.
If, on the other hand, the important information – the keywords you’ve carefully and painstakingly chosen – is located in all the right places and used in the proper context, a crawler won’t have a bit of difficulty determining exactly how that particular page should be indexed.
Primarily, those crawler-friendly locations include places like the page title, clearly visible and high-placement heading tags (<h1>, <h2>, and so on), and the first paragraphs and/or sentences of the main text content.
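If you want to see a page the way a crawler does, you can strip it down to exactly those elements. Here is a minimal Python sketch (the sample HTML is a stand-in for a real page) that pulls out the title, the heading tags, and the paragraph text…

from html.parser import HTMLParser

class CrawlerView(HTMLParser):
    """Collect the text a crawler weighs most: title, headings, paragraphs."""
    WANTED = {"title", "h1", "h2", "h3", "p"}

    def __init__(self):
        super().__init__()
        self.stack = []   # currently open tags
        self.found = []   # (tag, text) pairs in document order

    def handle_starttag(self, tag, attrs):
        self.stack.append(tag)

    def handle_endtag(self, tag):
        # Pop back to the matching opening tag (assumes well-formed HTML).
        while self.stack and self.stack.pop() != tag:
            pass

    def handle_data(self, data):
        if self.stack and self.stack[-1] in self.WANTED and data.strip():
            self.found.append((self.stack[-1], data.strip()))

html = """<html><head><title>Improve Your Golf Swing</title></head>
<body><h1>Improve Your Golf Swing in 30 Days</h1>
<p>These simple drills can improve your golf swing fast.</p></body></html>"""

parser = CrawlerView()
parser.feed(html)
for tag, text in parser.found:
    print(f"<{tag}> {text}")

If the keywords you care about don’t show up in that stripped-down view, the crawler probably isn’t seeing them either.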
Should you ever consider incorporating the most flashy and innovative techniques on your website, think again. Doing so is never going to impress or solicit favor from search engine crawlers. (It probably won’t even impress your human visitors.)
Following is a basic list of what most search engine crawlers can’t process (extract information from)…
- Image text
- Multimedia (such as Flash and streaming video)
- Pages that require login or cookies
- PDF files
- XML
- Java applets
In addition, most search engine crawlers have a hard time with things like frames and dynamically generated content (for example, URLs that include “?”).
If the crawlers can’t navigate your site (and remember, they’re navigating through the source code rather than the outside elements), they can’t properly index your website.
Worst-case scenario is that they’ll leave prematurely and never wind up fully indexing your website.
In order to optimize your pages in such a way that you satisfy both human visitors and search engine crawlers, you need to do the following:
- utilize the best keywords for your topic or niche
- place keywords where they are most effective and advantageous
- use keywords in their proper context
- include the correct amount of keywords throughout all locations
As long as you accomplish that, you’ll have a website that’s not only people friendly, but search engine friendly as well.
Checklist
- The goal is to make your website more popular, more visible, more important than the competition.
- Although it’s not necessary to reach the number one search results position, you need to aim there in order to land anywhere near the top.
- If your description more closely matches what a viewer is searching for, they’ll go to your website first regardless of what your results position happens to be.
- It’s not exclusively about position. It’s about targeting a specific keyword and then making certain your website 1) ranks high for that keyword and 2) can deliver what the viewer is searching for.
- In order to compete with websites in top results positions, you need to find out what they’re doing and then do the same thing, only better.
- If you limit your optimization efforts to the top search engines and directories, you can cover the most important SEO bases simultaneously.
- Take advantage of all the free SEO webmaster tools that Google and other websites have available.
- Use Google Sitemaps to make certain the crawler finds all available pages.
- Use Google Sitemaps to help get your pages indexed faster.
- Submit XML sitemaps so you can take advantage of the notification options such as the date a page was last modified and the frequency you anticipate a page will be changed or updated.
- Indicating priority only tells how important a page is in relation to all the other pages on your website. It has no bearing on what position your page will hold in search engine results.
- Use Sitemap Equalizer ( http://www.sitemapequalizer.com ) to create and manage all of your sitemaps.
- Don’t design your web pages for viewers only. Design them to help search crawlers easily and quickly locate the specific information and keywords that you want your page indexed for.
- Crawler-friendly locations include the page title, high-placement heading tags, and the first paragraphs or sentences of the main text content.
- Most search engine crawlers can’t extract information from image text, multimedia such as Flash and streaming video, pages that require login, PDF files, XML, and Java applets.
- Most search engine crawlers have a difficult time with things like frames and dynamically generated content and pages.
- To satisfy both humans and crawlers, you need to utilize the best keywords, place keywords where they are most effective, use keywords in their proper context, and include the correct amount of keywords throughout.