Shell Harris

Posts Tagged ‘SEO Strategies’

PRWeb vs. PRNewswire Press Release Service for SEO – SEO Tip Week 38

In 52 SEO Tips, Link Building, SEO Strategies on September 25, 2007 at 6:31 pm

When submitting press releases as part of our SEO services, we are often asked why we choose PRWeb.com as our vehicle for submission rather than PRNewswire. Compared across our big three parameters, PRWeb is the better service for our purposes. PRNewswire seems more like something large companies use to get news out, because it sends your release more to journalists and less to Web outlets. This quote summarizes it nicely: “Services such as PRNewswire and Newswire provide a far more targeted channel to specific demographics than the cheaper alternatives, however unless you’re willing to pay top-rates, the SEO benefit (on a keyword level) is less.”

Pricing
PRWeb.com is more affordable, which our clients certainly appreciate.

PRWeb: No membership fee. Their “average” press release costs $80, with the SEO Visibility option at $200.

PRNewswire: Membership-based, with an annual fee of $150. From their site: “The cost of distributing your news release is determined by the newsline you select and the length of your news release. Each newsline covers a specific geographical area ranging from local, regional, national and international. Prices start at $180 for a city/metro or statewide distribution. A national distribution starts at $680.” (See the PRNewswire Toolkit.)

Reach
PRWeb’s reach appears to be better on the web, which is our focus.

PRWeb: “Gets picked up in leading online news sites like Yahoo! News, Google News, Ask.com, and Topix. Additionally, your press release is distributed through a host of other online news sites including our own PRWeb.com and eMediaWire.com, which deliver over 50 million page views each month.”

PRNewswire: “Your message will reach mainstream and industry trade media, thousands of web outlets and PR Newswire for Journalists, a digital media channel serving more than 85,000 registered journalists across the globe.” There is no mention of Google News, only “among 3,600 of the world’s most widely accessed Web sites.”

Cached Pages
Of course this is just one example, but Google may index PRWeb better as well.

PRWeb: August 28 press release – Cached

PRNewswire: August 28 release – Not cached

Both sites have their advantages, but for SEO it would appear that PRWeb.com is still the best choice for helping your rankings.

When to Use a 301 vs. 302 Redirect – SEO Tip Week 35

In 52 SEO Tips, Google, Live Search (MSN), Search Engine Optimization, Search Engines, SEO Strategies, Yahoo Search on September 2, 2007 at 9:51 am

There are two types of redirects you can use: a 301 and a 302. These numbers refer to the HTTP status code returned by the server for a given URL. A 301 redirect tells the search engine that the page has moved permanently to the new URL. A 302 redirect tells the search engine that the move is only temporary, and that you may decide to show content at the original location in the future without a redirect.
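An easy way to verify which status code your server actually sends is to request the URL with a client that does not follow redirects. Here is a minimal, self-contained sketch using Python’s standard library against a throwaway local server; the paths are placeholders, not real URLs:

```python
import http.client
import http.server
import threading

class RedirectHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        # 301: the page has moved permanently to the new URL
        self.send_response(301)
        self.send_header("Location", "/new-page.html")
        self.end_headers()

    def log_message(self, *args):
        pass  # keep the demo quiet

server = http.server.HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

# http.client does not follow redirects, so we see the raw status code
conn = http.client.HTTPConnection("127.0.0.1", server.server_address[1])
conn.request("GET", "/old-page.html")
resp = conn.getresponse()
print(resp.status, resp.getheader("Location"))  # 301 /new-page.html
server.shutdown()
```

Swapping `send_response(301)` for `send_response(302)` is all it takes to change how the engines treat the move, which is why checking the raw status is worth the minute it takes.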

301 Redirects
All three major search engines handle 301 redirects the same, that is to say they ignore the original URL and instead index the destination URL. For example, www.beekerfurniture.com uses a 301 redirect to www.hendersonsfurniture.com and Google, MSN and Yahoo all return the result http://www.hendersonsfurniture.com when searching for “beeker furniture”. The word beeker doesn’t appear anywhere on the hendersonsfurniture.com site, and a site search in Google shows that only the home page has any relevance for the word. Clicking on the Cached link in the site search results further shows that the word only exists in links pointing to the site, “These terms only appear in links pointing to this page: beeker.” Those links Google is referring to are actually pointing to http://www.beekerfurniture.com and the 301 redirect is passing along the relevance of the word beeker to hendersonsfurniture.com.

301 redirects can be very powerful when you redesign your site and the URLs change, move to a different domain, acquire a new domain, or implement a URL rewrite. In most cases, this is the type of redirect you want to use because you know exactly how the search engines will respond.

302 Redirects
The three major engines handle 302 redirects very differently, and because of this 302s are typically not recommended.

Google treats 302 redirects differently depending on whether they are on-domain or off-domain. An example of an on-domain redirect is athletics.mlb.com which uses a 302 redirect to http://oakland.athletics.mlb.com/index.jsp?c_id=oak. If you search for “oakland a’s” in Google you will see that athletics.mlb.com is displayed in the results because links point to that URL, which in turn uses a 302 redirect to the destination page. This is a great example where 302 redirects can be used effectively, since the shorter URL looks much more enticing in the results pages.

Off-domain 302 redirects would be ripe for hijacking situations if treated the same way. Because of this, in most cases, Google will treat off-domain 302 redirects like 301s, where they will ignore the original URL and instead index the destination URL. I say most cases because Google will sometimes determine that the 302 is legitimate and index the original URL instead. An example of an off-domain redirect is pets.roanoke.com which uses a 302 redirect to a third-party site http://www.gadzoo.com/roanoke/pets.aspx. In this case, Google determined that this was a legitimate use of a 302 redirect and displays pets.roanoke.com when searching for “pets roanoke”.

MSN treats 302 redirects exactly how it treats 301 redirects: it always ignores the original URL and instead indexes the destination URL. A search for “oakland a’s” in MSN shows the URL oakland.athletics.mlb.com/index.jsp?c_id=oak in its results. And a search for “pets roanoke” shows http://www.gadzoo.com/roanoke/pets.aspx in its results.

Yahoo takes the same stance that MSN takes, except that they reserve the right to make exceptions in handling redirects. A search for “oakland a’s” in Yahoo shows the URL http://www.oaklandathletics.com in its results. (www.oaklandathletics.com also uses a 302 redirect to http://oakland.athletics.mlb.com/index.jsp?c_id=oak) But a search for “pets roanoke” shows http://www.gadzoo.com/roanoke/pets.aspx in its results.

There are very few times when you actually want a 302 redirect, although they are used more often than 301s merely because most people don’t know the difference. 302 redirects are often the default in website control panels, and JavaScript or meta-refresh redirects return no 3xx status at all, leaving the search engines to guess at your intent. In certain situations, however, 302 redirects work wonders.

As with all our tips, please use them responsibly. When in doubt, use a 301 redirect.

Google’s Supplemental Results – SEO Tip Week 29

In 52 SEO Tips, Google, Search Engines, SEO Mistakes, SEO Strategies on July 20, 2007 at 6:19 pm

Google is indexing more pages now than ever before, but that’s not always a good thing. Sometimes these pages get sent to the supplemental index instead of the main index. It’s perfectly normal for most sites to have some pages in the supplemental index, but if your main pages (and especially your home page) get sent there, you’ll likely not see much traffic from Google anymore.

My site’s listed in the supplemental results, what does that mean?
As Google states, “Supplemental sites are part of Google’s auxiliary index.” Google will always show results from its main index before showing results from the supplemental index. This means that supplemental pages will almost never show up in searches, appearing only for highly specialized queries where few or no results come from the main index. With so many blogs and tag pages out there, even long, obscure multi-word searches will bring back at least a few non-supplemental results.

How did my site get in the supplemental index?
One way pages end up in the supplemental index instead of the main index is a lack of PageRank (PR). This could be because you orphaned the page (no links pointing to it), the page lies too many clicks away from your home page, or your home page itself has a very low PR. If this is the case, you should work on your link building to those important pages of your site and build up their PageRank.

The other way your pages end up in the supplemental index is by having duplicate content on your page. This could be because you used the same manufacturer written product description that dozens of other sites use, you copied content from another website, or your pages have very little content and too much template which is duplicated on all pages. If this is the case, try writing unique content or changing your template so it doesn’t have the same elements on every page.

I changed my pages, what’s next?
Now that you’ve fixed your pages, getting them out of the supplemental index can be a long and hard process because the supplemental spider doesn’t come along very often. You should create or edit your Google Sitemap XML file and hope that will be enough. If that doesn’t work, try changing the name (URL) of those pages and deleting the old files.

Feel free to add your own observations about supplemental results here, we’d love to hear your stories.

Flat Site Architecture is SEO-Friendly – SEO Tip 27

In 52 SEO Tips, Search Engine Optimization, SEO Mistakes, SEO Strategies, Website Conversion on July 6, 2007 at 11:56 am

Why should you use a flat site architecture rather than a deep, or nested, site architecture if SEO is important to your site?

In my previous life as a website designer and HTML developer, I loved to have a folder/directory for everything. While I’m not an organized person (ask my wife), I did like keeping my files structured in clearly labeled directories, so nesting directories four or five levels deep was common practice. When I transitioned to being an SEO specialist, my ideas on structuring files and site architecture began to change, and here is why.

A flat site offers quick access to all the pages within the site. A minimal number of clicks should be needed to reach any page; no more than three is ideal. In the eyes of the search engines (SEs), fewer clicks mean higher importance, because more important information should be easier to reach. Home page information is the most important, one click from the home page is secondary information, two clicks is tertiary information, and so forth.

Think of it like bodies of water. Your home page is the ocean, and off of the home page are large rivers, then smaller rivers, then streams, then creeks and brooks, until finally the smallest trickle of water is all that is left. Don’t let your products, services or information sit at the end of the trickle, eventually drying up. Closer to the ocean is always better, and that is how the search engines will rank your pages too.

I’ve seen some sites place everything in the root folder, and this isn’t good practice either. Structure your site in a way that makes sense, but be aware that more clicks can mean fewer viewers, both from search engine traffic and from visitors on your site.

SEO Titles, Using the Title Tag – SEO Tip Week 25

In 52 SEO Tips, Search Engine Optimization, SEO Copywriting, SEO Mistakes, SEO Strategies on June 22, 2007 at 5:35 am

Page titles are one of the most important parts of any web page, especially when performing search engine optimization. A page title is located at the very top of your browser’s window. It can bring traffic in abundance or completely isolate your site. Knowing how to properly word a page title is critical to your site’s SEO success. Let’s look at how page titles can be used to increase your site’s traffic.

Snowflakes and Titles
It is said no two snowflakes look exactly the same. Well, the same should be said for page titles on your site.

Every page on your site should have a different focus from the other pages on your site, or you are repeating yourself and duplicating content. So if all your pages are telling a different story, shouldn’t they all have a different title? And that title should effectively reflect the content of the page.

Missing an Opportunity
In my daily web searches, I see many missed opportunities in poorly titled web pages. Pages simply named “Untitled” or “Untitled Document” can be found by the millions (79 million in one Google search). Search engines depend on titles to gather information about your web pages, and a page title without a unique description does not help them – in fact, a generic page title makes the page nearly impossible to find.

Putting Your Company Name in the Title Tag
I’m not against putting your company name in your page title – after all, it will help build brand awareness. But I am against putting only your company name in the title, and until you become a household name, I would suggest putting your company name at the end of your title. The focus of your web pages should be on what people would search for to find your company. In other words, you want targeted keyword phrases in the title. Let me give an example:

If your company name is “Miller & Sons” and you sell fishing equipment near Whitefish, Montana, you should not limit your title to “Miller & Sons”. Instead try “Fishing Rods & Reels in Whitefish, Montana – Miller & Sons”. With this you are netting traffic searching for your product, your location and your company name. Keeping the location in the title is very important if you serve only a regional or local market.
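To make the pattern concrete, here is a tiny helper that assembles a title in that order. The function name and the 65-character cutoff are our own choices; the cutoff is a common rule of thumb for what engines display, not a published limit:

```python
def page_title(keyword_phrases, company=None, max_len=65):
    # Keyword phrases lead, the company name trails, per the advice above
    parts = list(keyword_phrases) + ([company] if company else [])
    title = " - ".join(parts)
    if len(title) > max_len and company:
        # Drop the brand before dropping any keywords
        title = " - ".join(keyword_phrases)
    return title

print(page_title(["Fishing Rods & Reels in Whitefish, Montana"], "Miller & Sons"))
# Fishing Rods & Reels in Whitefish, Montana - Miller & Sons
```

The point is not the code itself but the ordering it enforces: searchable phrases first, brand last, and the brand is the first thing to go when space runs out.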

Let Your Copy Be Your Guide
When deciding on your page titles, read the page first and let that guide your decision. If you can’t distill the theme of your page copy into a concise page title, you may need to break the copy into more than one page.

If you sell toys on your site, your page copy should have the keyword “toys” and so should your page title. Even better, it should include what type of toys. Do you sell dog toys? Cat toys? Children’s toys? Your title should convey this. Adding in other possible search terms is also a good idea. An example of a toy site home page title could be “Children’s toys and games – toys for boys and girls of all ages”. You have your most important keyword, “toys,” listed twice and have added other important keywords such as “games,” “boys” and “girls.”

Suppose one of the sub-pages of your toy site showcases Leap Frog’s Discovery Ball. What type of toy is this? It’s an educational toy, and you should use that in your title along with the actual toy name. This gives you an opportunity to be found in the search results for the toy name, the popular toy company and the heavily searched key phrase “educational toy.” A title catering to SEO for this example would be: “Leap Frog Discovery Ball – Educational Toys”. Since the more targeted term is the name of the toy, it goes first; on a category page, “Educational Toys” would be shown first instead.

Now that you have integrated your keyword phrases from your page copy into your title, you’ll find that getting found in the search engines is a much easier task.

Underscores vs. Dashes – SEO Tip Week 24

In 52 SEO Tips, Search Engine Optimization, SEO Strategies on June 15, 2007 at 3:58 pm

Spaces should never be used in URLs or file names because the space character gets translated to “%20” by the browser, and this can wreak havoc with both readability and statistics or analytics programs. The question then remains: which is better to use instead of spaces, underscores “_” or dashes “-”?

As far as Google is concerned Big_Oak consists of one word, “Big_Oak”, and Big-Oak consists of two words, “Big” and “Oak”.

The reason Google does not treat the underscore as a word separator is that Google was created by programmers, who knew that programmers often want to search for programming topics. Many computer programming languages use the underscore character in such ways that CLASS is different from _CLASS.

Because of this, I always recommend using dashes instead of underscores in your filenames and URLs. Be careful not to use too many dashes in your domain name, as that could get your site flagged for other reasons. I prefer to have a domain name with no dashes, and to use dashes where appropriate in the directory and file structure.
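One way to see the distinction is that in most text-processing tools, the “word character” class includes the underscore. This Python sketch mirrors the behavior described above; it is an analogy, not Google’s actual tokenizer:

```python
import re

def tokens(path):
    # \w matches letters, digits, AND the underscore, so underscore-joined
    # words stay fused while dashes act as separators -- the same split
    # described above for Big_Oak vs. Big-Oak
    return re.findall(r"\w+", path.lower())

print(tokens("/blog/Big-Oak-tips"))  # ['blog', 'big', 'oak', 'tips']
print(tokens("/blog/Big_Oak_tips"))  # ['blog', 'big_oak_tips']
```

The dashed URL yields the separate words “big” and “oak”; the underscored one yields a single fused token that no one will ever type into a search box.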

Example URL:
http://www.bigoakinc.com/blog/category/52-seo-strategies/

Here are other things about Google to keep in mind when choosing filenames and URL structure:

  • There is no difference between lower-case and upper-case:
    big oak, Big Oak, BIG OAK, and biG Oak are all the same.
  • The ampersand “&” is a word separator:
    Big&Oak is treated as two words.
  • Singular words are not the same as plural words:
    oak and oaks are treated as different words.
  • Google cannot read words that are within other words:
    bubble will not be seen inside of bubblegum.

As with any tip, keep in mind that it’s a combination of many factors which will ultimately decide your placement in the search engine rankings and quite often every little bit counts.

Update: A Test

I created a test page to illustrate how Google reads words.

Examples:

A search for Test_travveran shows the sample page.

A search for Flibstopper Test shows the sample page. The two words are even highlighted in the URL. The word “test” appears in the page title.

A search for travveran shows no results in Google. Google did not read my made-up word from the URL or content because it only appeared in phrases with underscores.

A site search for choosing colors shows all the pages in our Out on a Limb section because those two words appear in the navigation on all pages.

A site search for “choosing colors” (in quotes) shows no pages because those two words do not appear together on our site; choosing_colors on the test page is treated as a single word.

A site search for “the blue pill” (in quotes) shows our test page since dashes are treated as word separators.

A site search for “bush seo” (in quotes) shows our test page since the ampersand “&” also acts as a word separator.

Similarly, Google can find the page with reality tv and bubblegum, but it cannot find the page with bubble or 1971.

Even though many of the stranger examples have little relevance to SEO, it’s a good idea to understand how Google reads and understands words.

Keep Fresh Content on Your Homepage – SEO Tip Week 22

In 52 SEO Tips, SEO Strategies on June 1, 2007 at 8:15 pm

Adding fresh content to your site and your homepage is an oft-heard bit of advice from SEO consultants and advice columns. Yes, it is a good idea to keep your content up-to-date and fresh, but information that isn’t pertinent to your site, such as a weather feed or generalized information you can find on a host of other sites, isn’t likely to draw much attention from the search engines. You need something more substantial and more on topic with your site.

Look for content that speaks to your audience. For example, maybe you own an SEO company and adding new content to your home page looks like a good idea. Well, if you write a blog on a consistent basis, then your own blog posts can become homepage content through an RSS feed. As a matter of fact, we implemented this very tactic today on our own home page. If you look at the Big Oak homepage, you will see an area displaying an RSS feed of this blog (SEO Blog) with a snippet from the two most recent blog posts. Now we have content that will change whenever this SEO blog is updated, and most of the time the posts will be on topic. The search engines will notice that our home page changes frequently, visit more often, and give our site a higher value because of this.
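If you want to roll this yourself, the feed handling is straightforward with the standard library. A minimal sketch, where the feed contents below are placeholders standing in for a real blog’s RSS output:

```python
import xml.etree.ElementTree as ET

# Placeholder feed standing in for a real blog's RSS output
SAMPLE_RSS = """<rss version="2.0"><channel>
<item><title>Tip 38</title><description>PRWeb vs. PRNewswire...</description></item>
<item><title>Tip 35</title><description>301 vs. 302 redirects...</description></item>
<item><title>Tip 29</title><description>Supplemental results...</description></item>
</channel></rss>"""

def latest_snippets(rss_text, count=2):
    # Pull the newest `count` items from the feed for display on the home page
    items = ET.fromstring(rss_text).iter("item")
    return [(item.findtext("title"), item.findtext("description"))
            for _, item in zip(range(count), items)]

for title, desc in latest_snippets(SAMPLE_RSS):
    print(f"{title}: {desc}")
```

In practice you would fetch the feed URL on a schedule (or via your blog platform’s widget) rather than embedding the XML, but the parsing step is the same.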

If you aren’t an avid blogger, you can add content from other blogs or from search result and news feeds, but this can draw traffic away from your site if your visitors follow links included in the third-party feeds.

Fresh content can cause your rankings to fluctuate based on the content displayed, but these fluctuations are usually minor and the benefits outweigh the negatives. Keeping a watch on your fresh content should be part of your strategy.

Be creative and think about what content would be of use to your visitors. Think about adding new products to the homepage or specials that change frequently. The key is to stay vigilant, and automation is a good answer. RSS feeds are a great way to do this, and many shopping carts have the option to provide an RSS feed. Other ideas include writing tips in advance (maybe 52 industry tips) that change out weekly. They don’t have to link anywhere and can simply be placed on the home page every Monday, and so on. It is important to make it fun and easy; otherwise it will become a struggle, especially if you are providing the content yourself rather than displaying content from other sites or feeds. Be sure to get permission if you aren’t sure about copyright issues when using content from other sites.

Please pass along any ideas you have for keeping your homepage fresh with relevant content.

Control What Gets Indexed – SEO Tip Week 19

In 52 SEO Tips, SEO Strategies on May 11, 2007 at 1:47 pm

Before reading this tip, make sure that you have read the tips from week 7 (Duplicate Content & URL Canonicalization) and week 14 (Every Web Page Should Be Unique) to understand why unique content is valuable.

Under normal circumstances, you want every page to be unique, and for there to be one, and only one, URL displaying that content. However, there are situations where you might use multiple URLs for specific purposes, such as tracking your Pay Per Click (PPC) campaigns. Giving your PPC ads individual specific URLs makes them easier to track in your website statistics or analytics packages.

Example Original URL: www.company.com/mypage.html
Example PPC Ad URL: www.company.com/mypage.html?src=ppc001

The problem with giving ads unique URLs is that they may end up being indexed by the search engines, especially Yahoo. The best solution is to use a 301 redirect to the original URL for that page, thereby capturing any importance the individual URLs may have. If you cannot use a 301 redirect for technical or other reasons, you can stop these URLs from being indexed with your robots.txt file.
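Server-side, that 301 can be as simple as stripping the tracking parameter and redirecting to what remains. A sketch of the idea in Python, where the parameter name "src" follows the example above and the helper name is our own:

```python
from urllib.parse import urlsplit

def canonical_url(url, tracking_param="src"):
    # Return the canonical URL to 301-redirect to, or None if the URL
    # carries no tracking parameter and is already canonical
    parts = urlsplit(url)
    pairs = [p for p in parts.query.split("&") if p]
    kept = [p for p in pairs if not p.startswith(tracking_param + "=")]
    if len(kept) == len(pairs):
        return None  # nothing to strip
    return parts.path + ("?" + "&".join(kept) if kept else "")

print(canonical_url("/mypage.html?src=ppc001"))  # /mypage.html
print(canonical_url("/mypage.html"))             # None
```

Your web framework or server configuration would issue the actual 301 to the returned URL; the statistics package still sees the original tracked request, so nothing is lost for reporting.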

If you only have a few URLs that you want to disallow, you can list them individually:

User-agent: *
Disallow: /mypage.html?src=ppc001
Disallow: /mypage.html?src=ppc002

This can become problematic if you are trying to disallow hundreds or even thousands of different URLs. This is where wildcards come in. While wildcards are not part of the standards of robots.txt files, Google, MSN, and Yahoo all support them. This can be extremely helpful in trying to get rid of some of those tricky content duplication problems.

The two supported wildcards are the asterisk "*", which matches any sequence of characters, and the dollar sign "$", which signifies the end of a URL string. Trailing asterisks are redundant and not needed, since matching anything that follows is the natural behavior of the robots.txt standard.

CompUSA Example

The CompUSA home page is indexed hundreds of times by Yahoo with many variations of referral and tracking parameters.

If they wanted to get rid of all URLs with parameters on their default.asp page, they could use:

User-agent: *
Disallow: /default.asp?

Additional Examples

If you have printable versions of all your HTML pages that contain "_print" in the URL, you would use:

User-agent: *
Disallow: /*_print*.html$

If you use a session id parameter called "sessionid" for users who are logged in, you would use:

User-agent: *
Disallow: /*sessionid=

If you have a private folder called "private_memories" and you don’t want hackers to learn the full name of the folder simply by looking at your robots.txt file, you would use:

User-agent: *
Disallow: /private*/
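As a way to sanity-check patterns like the ones above before deploying them, the two wildcards can be approximated with a regular expression. This is a rough sketch of the matching rules just described, not a full robots.txt parser, and real crawlers may differ in edge cases:

```python
import re

def robots_match(pattern, path):
    # "*" matches any run of characters; a trailing "$" anchors the end
    # of the URL. Patterns implicitly match from the start of the path.
    anchored = pattern.endswith("$")
    if anchored:
        pattern = pattern[:-1]
    regex = "^" + ".*".join(re.escape(piece) for piece in pattern.split("*"))
    if anchored:
        regex += "$"
    return re.search(regex, path) is not None

print(robots_match("/default.asp?", "/default.asp?cm_re=tracking"))  # True
print(robots_match("/*sessionid=", "/cart.asp?sessionid=42"))        # True
print(robots_match("/*_print*.html$", "/about_print.html"))          # True
print(robots_match("/*_print*.html$", "/about_print.html?x=1"))      # False
```

The last example shows why the "$" matters: without it, the pattern would also block printable pages that happen to carry query strings.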

As you can see, there are many uses for the robots.txt file now that all of the big 3 search engines support wildcards. Hopefully the official specification for robots.txt will support wildcards in the future, and all bots will understand them.

Search Engine & Directory Submissions – SEO Tip Week 18

In 52 SEO Tips, Link Building, SEO Strategies on May 4, 2007 at 2:37 pm

Directories are an easy way to build links because anyone can submit and get listed. For the same reason, directories can be of little use. Getting into directories can be time-consuming, but it is a one-time affair and usually worth the time. They provide one-way links, which will increase your online presence. Not all directories are created equal, and paying for the better ones is often money well spent.

Inclusion Tips

Select the best category for your site and follow the instructions on the submission form carefully. Write your descriptions without sensational text; descriptions should describe the content of the site concisely and accurately. When submitting to directories, make sure to vary your anchor text and use keywords in the description and title fields.

Finding the category that best matches your site’s theme or content will increase traffic from the directory and provide a higher-quality one-way link for the search engines to follow.

Good Directories

A few of the more search engine friendly directories for valuable links are the following:

Yahoo! Directory: The Yahoo! Directory is the biggest and oldest directory on the web, and one of the few directories that can provide traffic.

DMOZ: The Open Directory Project (DMOZ) is a free, volunteer-run directory. Google and many other sites pull information directly from DMOZ and display the description in the search engine results.

Business.com: A huge business-related directory.

Best of The Web: One of the oldest directories on the web.

Aviva Directory: A well-marketed directory.

GoGuides.org: Another well-run directory that has been around a while.

Blog Directories

Here are some recommended blog directories.

MyBlogLog: Everyone who reads a web site or blog can learn about and engage with one another, and in the process take the conversation to a whole new level.

Blogcatalog: Labeled as “the premiere blog directory on the internet”. Promote your own blog or find blogs on various topics.

Bumpzee: Join a community and you are fed an ever-updating list of blogs and blog posts submitted and chosen by like-minded individuals.

A more complete list of online directories is on our website. Aviva Directory also has a list of the best directories.

Submit Press Releases – SEO Tip Week 16

In 52 SEO Tips, Link Building, Search Engine Optimization, SEO Copywriting, SEO Strategies on April 20, 2007 at 6:41 am

Submit Press Releases for Link Building

Finding one-way links to your website can be a tough task. Finding quality one-way links to your website can be even more difficult.

Wouldn’t it be wonderful if people put links to your site without you even asking? Well, submitting a press release can help accomplish this. Similar to my tip on SEO with articles, using a press release to build links and increase search engine rankings can be effective. What so many people overlook when submitting press releases, which are copyright-free, is that you can put links to your site within the content of the press release. We used this method of promotion for our online toy store client, and you can see one of their press releases.

Not only can you place helpful links to your home page in the body of the press release, but you can also use deep linking and link to interior pages of your website. Deep linking is often harder to achieve, as most websites would rather link to your home page. We use PRWeb.com, where the cost is higher for text links than for plain URLs.

Press Release Submit Site

As I wrote, PRWeb.com does charge a fee for submission, but there are many free press release submission sites as well. Here are some that we also use:

Press release submission is not a magic bullet for SEO success, but it is a valuable tool in the SEO consultant’s toolbox. Whether it is a product release, a major hire, or a strategic acquisition, you can make the press release perform double duty as both an internet marketing tool and a link-building tool. Big Oak provides press release writing & submission as part of our SEO strategy.