No, these aren’t “myths” disguised as “common mistakes.” I’ve already beaten the SEO myths theme to death with my previous three articles.
What follows are innocent mistakes that many SEOs make. Some of these things catch even the best of us…
1. Google AdWords Keyword Tool Set To Broad Match
The Google AdWords Keyword Tool defaults to “Broad match” mode, which yields useless data from an SEO perspective — useless in that the numbers are hugely inflated to include countless phrases incorporating the search term specified. For example, the Keyword Tool reports 30.4 million queries for “shoes”, but that includes multi-word phrases such as “dress shoes,” “leather shoes,” “high heeled shoes,” and even “horse shoes,” “snow shoes,” and “brake shoes.”
In Exact match mode, the query volume for “shoes” drops to 368,000. The difference between those numbers is striking, isn’t it? So remember: when doing keyword research for SEO in the AdWords Keyword Tool, untick the box next to Broad match and tick the box next to Exact.
2. Disallowing when you meant to Noindex
Ever notice listings in the Google SERPs (search engine results pages) without titles or snippets? That happens when your robots.txt file has disallowed Googlebot from visiting a URL, but Google still knows the URL exists because links were found pointing there. The URL can still rank for terms relevant to the anchor text in links pointing to disallowed pages. A robots.txt Disallow is an instruction to not spider the page content; it’s not an instruction to drop the URL from the index.
If you place a robots noindex meta tag on the page, you’ll need to allow the spiders access to that page so they can see the tag. Another mistake is to use the URL Removal tool in Google Webmaster Tools instead of simply “noindexing” the page; rarely (if ever) should the removal tool be used for anything. Also note that there’s a Noindex directive in the REP (Robots Exclusion Protocol) that Googlebot obeys, albeit unofficially. More on disallow and noindex here.
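To make the distinction concrete, here is a minimal sketch (with a hypothetical /private/ directory) showing both mechanisms:

    # robots.txt: blocks crawling, but NOT indexing; the URL can still
    # show up in the SERPs as a bare, snippet-less listing
    User-agent: *
    Disallow: /private/

    <!-- In the page's <head>: drops the URL from the index, but it only
         works if crawling is still allowed so the spider can see it -->
    <meta name="robots" content="noindex">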
3. URL SERP Parameters & Google Instant
I just wrote about parameters you can append to Google SERP URLs. I’ve heard folks complain that they can no longer add parameters such as &num=100 or &pws=0 to the end of Google SERP URLs since Google Instant appeared on the scene. Fear not: it’s a simple matter of turning Google Instant off, and URL parameters will work again.
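For instance, with Instant switched off, a SERP URL along these lines returns 100 results with personalization disabled (“shoes” is just a placeholder query):

    http://www.google.com/search?q=shoes&num=100&pws=0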
4. Not using your customer’s vocabulary
Your customer doesn’t use industry-speak. They’ve never used the phrase “kitchen electrics” in a sentence, despite the fact that it’s the industry-accepted term for small kitchen appliances. Your customer may not search in the way you think makes intuitive sense. For example, I would have guessed that the plural “digital cameras” would beat the singular “digital camera” in query volume, yet it’s the other way around according to the various Google tools.
Sometimes lawyers being sticklers get in the way, such as a bank’s lawyers insisting the term “home loan” be used and never “mortgage” (since technically the latter is a “legal instrument” that the bank does not offer). Many times the right choice is obvious, but it’s internal politics or inertia keeping the less popular terminology in place (e.g. “hooded sweatshirt” when “hoodie” is what folks are searching for).
5. Skipping the keyword brainstorming phase
Too rarely do I hear that a site’s content plan was driven by keyword brainstorming. Keyword brainstorming can be as simple as using Google Suggest (which autocompletes as you type and is built into Google.com) or Soovle (which autocompletes simultaneously from Google, Bing, Yahoo, YouTube, Wikipedia, Amazon, and Answers.com). The idea is to think laterally.
For example, a baby furniture manufacturer discovers the popularity of “baby names” through looking at popular terms starting with “baby” and decides to build out a section of their site dedicated to related terms (“trends in baby names”, “baby name meanings”, “most overused baby names” etc.).
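If you’d rather pull suggestions in bulk than type seed terms one at a time, a short script can help. Here’s a minimal sketch in Python, assuming Google’s unofficial suggest endpoint (suggestqueries.google.com) still responds in its usual format; treat both the URL and the response shape as assumptions to verify:

    import json
    from urllib.parse import quote
    from urllib.request import urlopen

    def suggestions(seed):
        # Unofficial Google autocomplete endpoint; the response is a JSON
        # array of the form [query, [suggestion1, suggestion2, ...]]
        url = ('https://suggestqueries.google.com/complete/search'
               '?client=firefox&q=' + quote(seed))
        with urlopen(url) as resp:
            return json.loads(resp.read().decode('utf-8'))[1]

    for phrase in suggestions('baby'):
        print(phrase)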
6. Mapping URLs to keywords, but not the other way around
It’s standard operating procedure to map all of one’s site content to keyword themes (sometimes referred to as primary keywords, declared search terms, or gold words). What’s not so common is to start with a target (i.e. most desired) keyword list, map each keyword to the most appropriate page to rank for it, and then optimize the site around those keyword-to-URL pairs.
For example, “vegan restaurants in phoenix” could be relevant to five different pages, but the best candidate is then chosen. The internal linking structure is then optimized to favor that best candidate, i.e. internal links containing that anchor text are pointed to the best candidate rather than spread out across all five. This makes much more sense than competing against oneself and none of the pages winning.
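In practice, this can be as simple as maintaining a keyword-to-URL map and checking every internal link against it. A minimal sketch in Python, with hypothetical keywords and URLs:

    # One best-candidate page per target keyword (hypothetical data).
    keyword_to_url = {
        'vegan restaurants in phoenix': '/phoenix/vegan-restaurants/',
        'vegan brunch in phoenix':      '/phoenix/vegan-restaurants/brunch/',
    }

    def internal_link_target(anchor_text):
        # Point internal links carrying this anchor text at the designated
        # best candidate rather than spreading them across relevant pages.
        return keyword_to_url.get(anchor_text.lower())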
7. Setting up a free hosted blog
Free hosted blog platforms like WordPress.com and Blogger.com provide a valuable service; over 18 million blogs are hosted on WordPress.com. They’re just not a service I would sign up for if I cared about SEO or monetization. They aren’t flexible enough to let you install your own choice of plugins or themes/frameworks to trick out the blog with killer SEO. And for Heaven’s sake, don’t make your blog a subdomain of wordpress.com. For $10 per year, you can get a premium WordPress.com account under your own domain name.
Did you know putting AdSense ad units on your WordPress.com blog is against the service’s Terms & Conditions? Much better to get yourself a web host and install the self-hosted version of WordPress so you have full control over the thing.
8. Not properly disabling Google personalization
Not long ago, Google started personalizing results based on search activity even for non-logged-in users. For those who thought that logging out of Google was sufficient to get non-personalized results, I’ve got news for you: it isn’t. Click on “Web History” in the Google SERPs and then “Disable customizations based on search activity.” Or, for an individual query, you can add &pws=0 to the end of the Google SERP URL (but only if Google Instant is off; see above).
9. Not logging in to the free tools
Some of the web-based tools we all use regularly, such as Google Trends, either restrict the features or give incomplete (or less accurate) data if not logged in. The Google AdWords Keyword Tool states quite plainly: “Sign in with your AdWords login information to see the full list of ideas for this search”. It would be wise to heed the instruction.
10. Not linking to your top pages with your top terms on your home page
The categories you display on your home page should be thought through in terms of SEO. Same with your tag cloud, if you have one, and the “Popular Products” you feature. In your mind, translate “Popular Products” into “products for which I most want to get to the top of Google.”
11. Not returning a 404 status code when you’re supposed to
As I mentioned previously, it’s important to return a 404 status code (rather than a 200 or 301) when the URL being requested is clearly bogus/non-existent. Otherwise, your site will look less trustworthy in the eyes of Google. And yes, Google does check for this.
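This is easy to spot-check yourself. A minimal sketch in Python (example.com standing in for your own domain); http.client doesn’t follow redirects, so you see the raw status code the server returns for a bogus URL:

    import http.client

    conn = http.client.HTTPSConnection('example.com')  # your domain here
    conn.request('GET', '/this-page-definitely-does-not-exist')
    print(conn.getresponse().status)  # you want 404 here, not 200 or 301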
12. Not building links to pages that link to you
Many amateur SEOs overlook the importance of building links to pages that link to their sites. For commercial sites, it can be tough to get links that point directly to your site. But once you have acquired a great link, it can be a lot easier to build links to that linking page and thus you’ll enjoy the indirect benefit.
13. Going over the top with copy and/or links meant for the spiders
Countless home pages have paragraphs of what I refer to as “SEO copy” below the footer (i.e. after the copyright statement and legal notices) at the very bottom of the page. Oftentimes they embed numerous keyword-rich text links within that copy. They may even treat each link with bold or strong tags. Can you get any more obvious than that? I suppose you could, if you put an HTML comment immediately preceding the copy that said “spider food for SEO!” (perhaps “Insert keyword spam for Google here” would be more apropos?).
14. Not using the canonical tag
The canonical tag (errr, link element) may not always work, but it certainly doesn’t hurt, so go ahead and use it, especially on an ecommerce site. For example, if you have a product mapped to multiple categories resulting in multiple URLs, the canonical tag is an easy fix.
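The markup itself is a single line in the <head> of each duplicate URL, all pointing at the one version you want indexed (hypothetical product URL below):

    <link rel="canonical" href="http://www.example.com/blenders/acme-3000/" />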
15. Not checking your neighborhood before settling in
If you were buying a home, you’d check out the area schools and the crime statistics, right? Why wouldn’t you do the same when moving into a new IP neighborhood? Majestic SEO has an IP neighborhood checker. This is especially important for the small-time folks: you don’t want to be on the same IP address (shared hosting) with a bunch of dodgy Cialis sites.
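Finding the IP address you share is the easy part; a one-line lookup (example.com standing in for your domain) gives you the address to feed into Majestic SEO’s checker or a reverse-IP tool:

    import socket
    print(socket.gethostbyname('www.example.com'))  # the IP you share with your "neighbors"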
16. Doing too much internal linking
Don’t water down your link juice so much that only a trickle goes to each of your pages. An article page should flow PageRank to related topics, not to everything under the sun (i.e. hundreds of links).
17. Trusting the data in Google Webmaster Tools
Ever notice that Google Webmaster Tools’ data doesn’t jibe with your analytics data? Trust your analytics data over the Webmaster Tools data.
18. Submitting your site for public site review at a conference where Google engineers are present
D’oh! (Insert Homer Simpson voice here.) Unless you’re absolutely sure you have nothing weird going on within your site or link neighborhood, this is pretty much a suicide mission. Corollary: talking to Matt Cutts at a conference without covering your badge up with business cards. Note this mistake was contributed by a guy we’ll call “Leon” (you know who you are, “Leon”!).
19. Cannibalizing organic search with PPC
Paying for traffic you would have gotten for free? Yeah, that’s gotta hurt. I wrote about this before in Organic Search & Paid Search: Are They Synergistic or Cannibalistic?
20. Confusing causation with correlation
When somebody tells me they added H1 tags to their site and it really bumped up their Google rankings, the first question I ask is: “Did you already have the headline text there and just change a font tag into an H1, or did you add keyword-rich headlines that weren’t present before?” It’s usually the latter. The keyword-rich text at the top of the page bumped up the keyword prominence (causation). The H1 tag was a correlation that didn’t move the needle.
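In markup terms, the two cases look like this (hypothetical snippets). Only the second introduces new keyword-rich text, and that text, not the tag, is what moves the needle:

    <!-- Case 1: same text, new tag (correlation only) -->
    Before:  <font size="5">Welcome to our site</font>
    After:   <h1>Welcome to our site</h1>

    <!-- Case 2: a keyword-rich headline that wasn't there before (causation) -->
    After:   <h1>Handmade Leather Dress Shoes</h1>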
21. Not thinking in terms of your (hypothetical) Google “rap sheet”
You may recall I’ve theorized about this before. Google may not be keeping a “rap sheet” of all your transgressions across your network of sites, but they’d be foolish not to. Submitting your site to 800 spam directories over a span of 3 days is just plain stupid. If it’s easy enough to see a big spike in links in Majestic SEO, then it’s certainly easy enough for Google to spot such anomalies.
22. Not using a variety of anchor text
Using the same anchor text in link after link just doesn’t look natural. Think link diversity.
23. Treating all the links shown in Yahoo Site Explorer as “followed”
Don’t ask me why Yahoo Site Explorer includes nofollowed links in its reports, but it does. Many Site Explorer users wrongly assume all of the links reported under the “Inlinks” tab are followed links that pass link juice.
24. Submitting a Reconsideration Request before EVERYTHING has been cleaned up
This may not be “super-common” because many SEOs have never submitted a “Reconsideration request” to Google. But if you have or plan to, then make sure everything — and I mean EVERYTHING — has been cleaned up and you’ve documented this in your submission.
25. Submitting to the social sites from a non-power-user account
Nothing goes flat faster than a submission from an unknown user with no history, no followers, no “street cred”. Power users still rule, Digg redesign or not.
Bonus tip: Stop focusing on low-value (or no-value) activities
Yes, I’ll beat on the meta keywords tag yet again: Google never supported it, and all it amounts to is free info for your competitors. Guaranteed there are items on your SEO to-do list like this that aren’t worth doing. Be outcome-focused, not activity-focused. Focus on what matters.
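In case you’re wondering what “free info for your competitors” looks like, it’s this, sitting in plain view in your page source (hypothetical content):

    <meta name="keywords" content="discount running shoes, cheap trail runners, clearance sneakers">

Any competitor can view source and read off exactly which terms you’re targeting.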