16 DIY Magento SEO Tweaks

Over the past year, I’ve been learning more and more about optimizing a Magento store. Unfortunately, Magento is not search engine friendly, at least up to Community Edition 1.8 and Enterprise Edition 1.12.

[Image: Magento Tricks by Traffic Motion]

Later releases of the platform may have solved some of these problems, but many Magento sites out there are still on these older versions. Even worse, upgrading from one version to the next can cost tens of thousands of dollars, and there is little incentive to do that if the only benefit is some minor SEO improvements.

I’ve divided up this post into three sections. The first is attempting to speed up Magento, the second is on-site improvements including a subsection on structured data, and the final section is on collecting more data from third party tools like Google Analytics and Webmaster accounts from Google, Bing, and Yandex.

Speeding Up Magento

Server Speed – Hardware & Software

The first thing you should do is either read Magento Site Performance Optimization, or give a copy of the book to your IT team or Magento developers. While not every host will want you running HHVM or have a different type of PHP caching solution, you’ll want to familiarize yourself with how hardware and software on the server can improve your store’s speed.

Improve Speed from the Backend

There are a lot of little tricks and buttons to press in the Magento System Configuration panels, and some of them may be very useful for incremental gains in speed.

Disable These Four Caches

Oddly enough, disabling a handful of Magento’s caches may actually speed up your site. The reasoning is counterintuitive, but makes perfect sense once you realize the database is the bottleneck at which Magento can get stuck. Allowing Magento to generate these resources on the fly, rather than querying the database to find cached versions that may not exist, can make your server response time drop (or at least stabilize).

.htaccess Tweaks

Whether you’re on Nginx or Apache, there are a number of tweaks you can make to your store’s .htaccess file (or its Nginx configuration equivalent) that can boost your speed. These mostly involve compression and setting expiry headers, but they can make your pages load faster for new and returning visitors.

On-Site Improvements for Better Crawling

Setting a Unique Home Page Title

While “best practice” for product pages is “Brand Product | Description | Site Brand,” the home page should have the site’s name first. But with how Magento handles title suffixes, you might have a home page title of “Site Brand | Site Brand” if you don’t use this very simple trick to set the home page’s title tag.

Turn Off Category Paths in URLs

This trick was confirmed by Everett Sizemore, but it’s something I’ve recommended for a while. Instead of letting Magento build product URLs with all sorts of different category structures depending on how many categories a product is in, just turn them off. It’s an easy fix in the System > Configuration > Catalog > Search Engine Optimization tab. While you’re there, turn on the Canonical tags.

Cache Breadcrumbs

Magento has a problem in that breadcrumbs are generated the first time a user visits a page (and they will differ depending on the path that user took to get there). Unfortunately, the breadcrumbs are not added to the page cache, so subsequent users and search engines will not have the benefit of seeing the breadcrumb navigation links. This seems like a strange bug, but it’s an easy fix.

NoIndex Filtered Navigation Pages

In order to keep your site from having tons of duplicate content indexed by Google due to all of the different filtered navigation, use this simple extension to set the robots meta tag to NOINDEX,FOLLOW. That way, you can focus the search robots on crawling and indexing the more important pages – categories and product pages.

NoIndex Internal Search Results

While we’re using the scalpel on our pages to keep them out of the search index, let’s get internal searches out of the index. This easy layout update will set the robots meta tag to NOINDEX,FOLLOW on any search results pages that Google or Bing may crawl.

Structured Data Implementation

Basic Schema.org Markup

I can’t do any better than the job Robert Kent did in his book Magento Search Engine Optimization, so just buy that book, follow the instructions for modifying some of your template files, and enjoy easy, valid structured data for your product pages, logo, and any contact info you have on the site.
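To give a flavor of the end result (the product name and price here are invented for illustration, not taken from Kent’s book), Schema.org product microdata generally looks something like this:

```html
<!-- illustrative Product/Offer microdata for a product page template -->
<div itemscope itemtype="http://schema.org/Product">
    <span itemprop="name">Barista Pro 9000</span>
    <div itemprop="offers" itemscope itemtype="http://schema.org/Offer">
        <meta itemprop="priceCurrency" content="USD" />
        <span itemprop="price">199.99</span>
        <link itemprop="availability" href="http://schema.org/InStock" />
    </div>
</div>
```

The itemprop values map onto data Magento already renders in the product view templates, which is why the book’s approach is mostly a matter of adding attributes to existing template markup.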

Extend Schema.org with ProductOntology

One thing I can help you add to your basic Schema.org markup is ProductOntology.org data, which provides external reference classes for each of your products. What’s the benefit? I don’t know. What’s the potential downside? None that I can think of.
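To make it concrete: ProductOntology classes attach to Schema.org Product markup through the additionalType property. A sketch with a hypothetical product (a ProductOntology ID is just the matching Wikipedia article title):

```html
<div itemscope itemtype="http://schema.org/Product">
    <!-- additionalType points at a ProductOntology class derived from a Wikipedia title -->
    <link itemprop="additionalType" href="http://www.productontology.org/id/Espresso_machine" />
    <span itemprop="name">Barista Pro 9000</span>
</div>
```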

Facebook/Pinterest Open Graph Cards

Facebook and Pinterest both use Open Graph markup to provide metadata that they can read and use in enhanced links and pins of your site’s content. Implementing this type of markup is actually very simple.
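For illustration (the site name, title, and URLs below are invented), the relevant Open Graph tags on a product page might look like this:

```html
<meta property="og:type" content="product" />
<meta property="og:title" content="Barista Pro 9000 Espresso Machine" />
<meta property="og:description" content="Semi-automatic espresso machine with a built-in grinder." />
<meta property="og:url" content="https://www.example.com/barista-pro-9000" />
<meta property="og:image" content="https://www.example.com/media/catalog/product/barista-pro-9000.jpg" />
<meta property="og:site_name" content="Example Store" />
```

These go in the head of the page, typically by editing the theme’s head template or a layout update, and both networks will pick them up when a URL is shared or pinned.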

Twitter Cards

Similarly, Twitter has its own set of meta tags that you can add to your site. The result is more information contained in tweets of your products, including images, prices, and references to your site’s official Twitter page. Again, the implementation is easy.
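As an illustration only (the handle, title, price, and image path are invented), a product-style card uses tags along these lines; the exact tag set depends on which card type you implement and validate with Twitter:

```html
<meta name="twitter:card" content="product" />
<meta name="twitter:site" content="@yourstorehandle" />
<meta name="twitter:title" content="Barista Pro 9000 Espresso Machine" />
<meta name="twitter:description" content="Semi-automatic espresso machine with a built-in grinder." />
<meta name="twitter:image" content="https://www.example.com/media/catalog/product/barista-pro-9000.jpg" />
<meta name="twitter:label1" content="Price" />
<meta name="twitter:data1" content="$199.99" />
<meta name="twitter:label2" content="Availability" />
<meta name="twitter:data2" content="In stock" />
```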

Collect More Visitor and Search Engine Data

Increase Google Analytics Site Speed Sample Rate

By default, Google Analytics samples 1% of your site’s visitors for server response, page load, and other site speed data. For websites with 50,000 monthly page views, this is only 500 samples per month, which is definitely not enough data to determine if your store is fast or slow. This simple tutorial develops a custom extension to up that sample rate to 10%, which gives you a much better gauge of the performance of your website.
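However the tutorial’s extension wires it into Magento, the change itself boils down to one parameter in the tracking snippet. A sketch for the classic ga.js tracker (the tracking ID is a placeholder):

```javascript
// Classic ga.js command queue: the tracker reads commands pushed onto _gaq.
var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXXXX-1']); // placeholder tracking ID
// Sample 10% of visitors for site speed timings instead of the default 1%.
_gaq.push(['_setSiteSpeedSampleRate', 10]);
_gaq.push(['_trackPageview']);
```

If your store is on Universal Analytics instead, the equivalent is passing {'siteSpeedSampleRate': 10} as the third argument to ga('create', ...).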

Add Webmaster Verification Meta Tags

Magento has a section in the backend to add miscellaneous tags and scripts to every page of the site. But what if you just want to add tags to your homepage and maintain them in a separate block? Google, Bing, and Yandex require a verification meta tag on the homepage, so why include it on all 10,000 other pages of your site? Let’s keep the HTML cleaner and implement the tags on just one page.

Google Removes Author Photos from Search Results

The big news in SEO this week was Google’s announcement that author photos will no longer appear in most search results. This is a curious move by Google, and signals the end of one of the big reasons bloggers and writers of all types flocked to Google Plus.


Going forward, instead of an author photo showing up in the search results for logged out users, along with Circles data from Google Plus, all that will appear in the SERPs is a by-line. Author photos can still show up for users logged in to their Google Plus account, depending on whether they have the individual author in their circles. In Google News, a smaller author photo may appear.

So why were author photos dropped by Google? According to John Mueller, “we’re simplifying the way authorship is shown in mobile and desktop search results, removing the profile photo and circle count. (Our experiments indicate that click-through behavior on this new less-cluttered design is similar to the previous one.)”

Interestingly, Mueller doesn’t indicate whether click-through behavior is similar on the actual search results that showed authorship photos compared to results without a photo. Just that “click-through behavior” is similar from one to the other.

His explanation also flies in the face of research done by Google and others on eye tracking. Remember this eye tracking study from 2006? With 10 blue links, search behavior is F-shaped, while “chunking” is exhibited around images where rich search results are displayed. Google is now indicating that click behavior is the same on results with images as with only 10 blue links, even though eye-tracking behavior has repeatedly been shown to differ depending on the presence or absence of rich media in search results.

The elimination of widespread author photos in the search results is also curious, as it raises questions about Google’s intentions with its Google Plus social network. One of the big draws of Google Plus was the ability to have one’s photo show up in the search results. This was Google’s enticement for anonymous authors to come out from the cold, join their network, start building relationships, and dive deep into Google Plus.

With Google’s focus on authority, authenticity, and veracity in its semantic search strategy, verifying an account as a real person through Google Plus was a huge piece of the puzzle. According to David Amerland’s book Google Semantic Search, Google specifically gave people an incentive to join Google Plus, establish authorship markup on their articles, and link all of their other social networks together. The main ego-bait was the author photo showing up in SERPs.

Will a simple by-line, rather than the ego-bait of an actual photo, be enough to draw new users to Google Plus? I can’t speak for anyone else, but my reaction has been a resounding “Meh.” When I do research on Google (something I do increasingly less as I’ve taken a liking to DuckDuckGo’s results), I’ve always been drawn to authorship photos. It’s a lot easier to remember a face than a name, especially since I’m far more likely to skim over by-lines than photos.

Without the increased click-through rates I was seeing on properly verified articles due to the author photo showing on Google, I can’t think of much reason to spend a lot more time developing a presence on the social network. I barely use Facebook, LinkedIn, or Twitter, because there’s almost NO incentive to use them and I have more important things to do than establish a presence on those other networks.

It was Google Plus that I used the most, and the selfish incentive of building my “author rank” over time and having my image appear in more searches was a strong incentive to add people and companies to my circles and engage in discussions with them (even though I was often the sole person commenting on Google Plus posts for many companies).

For now, I’m sure I’ll keep my Google Plus and see where Google takes its authorship markup, but I’ll be far less likely to continue developing my own “contributor to/authorship” profiles around the web at the same pace and with the same enthusiasm at which I once did.

How to Noindex Internal Search Results in Magento

If you’re like me, you’re tired of seeing twice as many pages indexed by Google as you have actual pages and products on your ecommerce website. One way to combat this is to use the Robots.txt file to Disallow Google, Bing, and other web bots from crawling the content of your pages.

But that doesn’t actually keep the pages out of the index! It only keeps them from being crawled and having their content indexed and searchable. The page URLs themselves can (and most likely will) still appear in search results, with a message on Google saying that the page’s content is blocked by the site’s Robots.txt file.

So using the Disallow command in Robots.txt won’t reduce the number of pages indexed by the search engines. The only way to do that (besides tedious manual removal requests) is to include a NOINDEX meta tag on pages that you do not want indexed. We already took a look at Noindexing filtered navigation pages of product listings.

In this article, we’ll take a quick look at Noindexing the search results. Why is this important? Well, read it from the man himself, Matt Cutts, who wrote a “Search Results in Search Results” post on his blog all the way back in 2007. So yes, keeping Magento search results pages out of the Google index is important. Let’s look at how to do it!

Updating the local.xml Layout File

Navigate to the magentoroot/app/design/frontend/base/default/layout/local.xml file and open it. If it doesn’t exist, create it.

The code that we have to add to the file is this:

<?xml version="1.0" encoding="UTF-8"?>
<layout version="0.1.0">
	<!-- other layout code -->
	<catalogsearch_result_index translate="label">
		<reference name="head">
			<action method="setRobots"><value>NOINDEX,FOLLOW</value></action>
		</reference>
	</catalogsearch_result_index>
</layout>

Wrapping Up Noindexing Internal Search Results

That’s it? We’re done? Well, yeah, that’s basically all there is to it. In some cases, you may also want to Noindex the advanced search pages, handled by catalogsearch_advanced_index and catalogsearch_advanced_result. You can use the exact same code snippet as above, just replacing catalogsearch_result_index with those two handles.
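Assuming the same local.xml approach, and assuming you want both advanced search pages noindexed, the variants would look like this:

```xml
<catalogsearch_advanced_index>
	<reference name="head">
		<action method="setRobots"><value>NOINDEX,FOLLOW</value></action>
	</reference>
</catalogsearch_advanced_index>
<catalogsearch_advanced_result>
	<reference name="head">
		<action method="setRobots"><value>NOINDEX,FOLLOW</value></action>
	</reference>
</catalogsearch_advanced_result>
```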

This is a pretty simple fix, and can help Google keep a lot of your internal search pages out of their index, especially if you occasionally link to a search result as the most relevant and user-friendly method to find a product or list of related products.

If you followed along with the tutorial in the base default theme, you can do a search for anything, get the results, and check out the Page Source. You’ll see something like the following code, indicating that this simple fix has worked.

<title>Search results for: 'nokia'</title>
<meta name="description" content="Default Description" />
<meta name="keywords" content="Magento, Varien, E-commerce" />
<meta name="robots" content="NOINDEX,FOLLOW" />

Adding Google, Bing, Yandex, and Pinterest Verification Meta Tags to Magento

Meta tags are an easy way to add verification to your Magento store for Google Webmaster Tools, Bing Webmaster Tools, Yandex.Webmaster, and Pinterest for Business accounts. In this article, we’ll go over two easy ways to add these tags to your Magento store.


One alternative to adding meta tags is to upload a file from each of these sites to your site’s root directory. Unfortunately, when developers, IT, and version control systems are involved, this can be a much longer process than it should be. Instead of taking a few seconds, it may take days or even weeks just to get a new file (let alone four) uploaded to the root directory of your store.

Thankfully, Magento provides a couple of workarounds for this: one that is extremely simple, and one that is a little more sophisticated but cleaner and easier to read and maintain. We’ll start with the simpler method.

Miscellaneous Head Scripts

Log in to the admin section of your store, and navigate to System > Configuration > Design > HTML Head > Miscellaneous Scripts, and simply copy-paste the verification code there.

The code for our TrafficMotion website would look like this:

<meta name="msvalidate.01" content="14B60B177AD73A9CB88E52E10DE81E9F" />
<meta name="google-site-verification" content="Y0KjL1s3sCz5_RnzmBPdNIiMKE7n-iPy3OzjlRlPvdI" />
<meta name="p:domain_verify" content="8c5c5b782d04896448cc802df0e4e97b" />
<meta name='yandex-verification' content='7754504dbf4109f0' />

Easy, right? So what’s the problem with this method?

There really isn’t one, but the scripts are added to the <head> of every page of your site, where Google, Bing, Yandex, and Pinterest only require it on the home page.

To put it simply, why clutter up every page of your site, when you don’t need to? Google or Bing aren’t going to bother verifying the tag on every single page of your store, so why bother including it? It’s just extraneous code for the vast, vast majority of your pages.

So let’s add these pieces of code a different way, and just get them on the home page.

Block & Layout Update

First, navigate to System > Configuration > General > Content Management > WYSIWYG Options. We’re going to change the setting for “Enable WYSIWYG Editor” to “Disabled by Default.” If we don’t do that, Magento’s WYSIWYG editor will strip out the meta tags we’ll be adding to a new static block.

Adding a New CMS Static Block

Once you’ve done that, go to CMS > Static Blocks > Add New Block. You can name it anything you want, but we’ll give it the Block Title of “Validation Codes” and the Identifier of “html_validation_codes” for the purposes of this tutorial.

Then, in the Content, copy-paste in your meta tag verification codes. Now, we just have to get that block into our home page.

Adding the Validation Codes Block to the Home Page

Now, go to CMS > Pages > Home page, and hit the Design Tab under Page Information. This is the part where it can get tricky for new store owners, but it’s really not too difficult.

There may already be some code in the Layout Update XML field if you’re using the base default theme, or it may be empty. In either event, we’re going to add a little snippet of code to reference our Validation Codes static block. Here’s what you would include if you’ve followed along so far:

<reference name="head">
    <block type="cms/block" name="validation_code">
        <action method="setBlockId"><block_id>html_validation_codes</block_id></action>
    </block>
</reference>

Wrapping Up Validation Codes

Now, check the Page Source of your Magento store home page, and you should see the block of code just under our customized Google Analytics code. It looks just like this:

<meta name="msvalidate.01" content="14B60B177AD73A9CB88E52E10DE81E9F" />
<meta name="google-site-verification" content="Y0KjL1s3sCz5_RnzmBPdNIiMKE7n-iPy3OzjlRlPvdI" />
<meta name="p:domain_verify" content="8c5c5b782d04896448cc802df0e4e97b" />
<meta name='yandex-verification' content='7754504dbf4109f0' />
<script type="text/javascript">//<![CDATA[
        var Translator = new Translate([]);

Then, you can check for the code on any other page of your website, and it won’t be there! It’s not a huge savings in terms of the amount of HTML on the page, but it does keep a few lines of extraneous code off of them. No search engine would pay much attention to those meta tags if they were on every page anyway, so let’s keep the category, CMS, and product pages a little cleaner.

Essential .htaccess Modifications for Speeding Up Your Website

One of the largest concerns for website owners should be the speed of their sites. While Google does not actively penalize slow sites, there is a definite opportunity cost of having pages that take too long to download. This comes mainly in the form of Google and other search engines not crawling your site as deeply or as frequently as they otherwise would.

[Image: This snail is tired of waiting for your store website to load]
For ecommerce websites, this means that it can take weeks instead of days for Google to crawl new products, even after being made aware of them through the XML sitemap. With new products hitting the market every day, site speed can mean the difference between being on the search engine today or in two weeks, and the commensurate loss of income due to having a slow site.

While there are many factors out of the control of site owners, .htaccess modifications are relatively easy to make. The modifications in this article will work for any site on an Apache server (Magento, WordPress, or plain old HTML), but we’ll focus on Magento right now.

I’ve been writing a lot about Magento lately, and a default Magento install references several Apache server modules natively in its .htaccess file. Problem is, they’re either commented out or not filled in properly.

So let’s remedy that situation by fixing up the default Magento .htaccess file.


Let’s start with mod_deflate located on Line 74 of the Magento .htaccess located in the root directory of the install. The basic file looks like this.

<IfModule mod_deflate.c>

## enable apache served files compression
## http://developer.yahoo.com/performance/rules.html#gzip

    # Insert filter on all content
    ###SetOutputFilter DEFLATE
    # Insert filter on selected content types only
    #AddOutputFilterByType DEFLATE text/html text/plain text/xml text/css text/javascript

    # Netscape 4.x has some problems...
    #BrowserMatch ^Mozilla/4 gzip-only-text/html

    # Netscape 4.06-4.08 have some more problems
    #BrowserMatch ^Mozilla/4.0[678] no-gzip

    # MSIE masquerades as Netscape, but it is fine
    #BrowserMatch bMSIE !no-gzip !gzip-only-text/html

    # Don't compress images
    #SetEnvIfNoCase Request_URI \.(?:gif|jpe?g|png)$ no-gzip dont-vary

    # Make sure proxies don't deliver the wrong content
    #Header append Vary User-Agent env=!dont-vary

</IfModule>


To get this module working, all we have to do is remove some of the # comments from certain lines. Removing some of the extraneous comments, the modified version would look like this:

<IfModule mod_deflate.c>
        SetOutputFilter DEFLATE
        BrowserMatch ^Mozilla/4 gzip-only-text/html
        BrowserMatch ^Mozilla/4.0[678] no-gzip
        BrowserMatch bMSIE !no-gzip !gzip-only-text/html
        SetEnvIfNoCase Request_URI \.(?:gif|jpe?g|png)$ no-gzip dont-vary
        Header append Vary User-Agent env=!dont-vary
</IfModule>


A default install of Magento makes no mention of mod_gzip in its .htaccess, so that’s our next step. Add these lines to your .htaccess to take advantage of this compression module.

<IfModule mod_gzip.c>
    mod_gzip_on Yes
    mod_gzip_dechunk Yes
    mod_gzip_item_include file \.(html?|txt|css|js|php|pl)$
    mod_gzip_item_include handler ^cgi-script$
    mod_gzip_item_include mime ^text/.*
    mod_gzip_item_include mime ^application/x-javascript.*
    mod_gzip_item_exclude mime ^image/.*
    mod_gzip_item_exclude rspheader ^Content-Encoding:.*gzip.*
</IfModule>


Finally, Magento has some mention of mod_expires but it is just a default expiry date of “access plus 1 year,” which is not very helpful or specific for store owners who may update some files (CSS tweaks, for example) on a continuing basis, while rarely touching others (images, for instance). Here is what Magento includes by default:

<IfModule mod_expires.c>

## Add default Expires header
## http://developer.yahoo.com/performance/rules.html#expires

    ExpiresDefault "access plus 1 year"

</IfModule>


Not much to it, right? Let’s beef it up a little bit by making it more specific. Change these expires headers time periods according to your own site’s development process:

<IfModule mod_expires.c>
        ExpiresActive On
        ExpiresDefault "access plus 1 days"
        ExpiresByType text/html "access plus 1 days"
        ExpiresByType image/gif "access plus 90 days"
        ExpiresByType image/jpeg "access plus 60 days"
        ExpiresByType image/png "access plus 90 days"
        ExpiresByType text/css "access plus 14 days"
        ExpiresByType text/javascript "access plus 7 days"
        ExpiresByType application/x-javascript "access plus 7 days"

That’s all there is to it. Just edit the code in your favorite code editor, upload the new version of the .htaccess file, and enjoy a faster Magento (or WordPress or other) website!

1and1 Debt Collection Agency Threat Letter

I’m posting this under the Webmaster Tools category, even though it’s more of a generalized rant against a lack of customer service and respectful negotiation.

We’ve been customers of 1and1 since 2005 or 2006 when we switched to their domain and hosting services. I’d be hard pressed to remember any missed or late payments in those 7-8 years, and we’ve remained loyal 1and1 customers throughout, despite the often difficult and counter-intuitive user interface.

But when our business needed to switch debit cards a few weeks ago, we were without a payment option for the space of about a week. A few things tried to bill us but were declined. Usually, we got an email about the declined payment, let the company know of the situation, and made the payment as soon as the new card was ready.

Not 1and1, though. They sent a letter threatening to freeze the account and even send us to collections! For $44.97!

I think two rational organizations can negotiate a late payment of $44.97 without getting collections and courts involved, right?

There’s almost nothing in this letter that I disagree with, in terms of the facts. Yes, we were late making a payment. Yes, 1and1 can cancel the contract. Yes, they can freeze the account and take all of our sites offline for days. Yes, they can sell our account to a collections agency and tack on another $18.95 if they want.

What I have a problem with here is the lack of respect that 1and1 shows for customers that have loyally supported them for nearly eight years. And it’s not as though we had ever had a history of late payments with them, or a history of attempting to steal their services without paying for them.

I might not even have a problem with a letter threatening a freeze on our account if it went unpaid. But a threat to send us to collections after a single missed payment is simply overkill.

In fact, it’s enough to persuade us to leave 1and1 for good. Most of our clients’ websites are on HostGator anyway, which gives us and them an easier way to access… well, just about everything that we ever need to access to service their websites. And HostGator didn’t immediately send us what amounts to legal threats when our old card was declined for a few days.

What happens when 1and1 sends accounts to collections? Obviously, a $44.97 problem immediately turns into a larger one, with the $18.95 charge, plus whatever charges the collection agency is legally allowed to tack on. Also, the collection agency would be able to sue us and get a judgment. Both the collection and the judgment would appear on my credit report for 7-10 years, whether I paid or not.

And of course, if the collection agency was really nasty, and I didn’t bother showing up to a court hearing to tell them all of my income and assets so they could deduct their judgment from my bank account or something, they could ask the court for a “bench warrant,” which amounts to a day in jail for 1and1 debtors’ prison.

After working with enough low income homeowners and consumers on various foreclosure, real estate, and financial websites, I know that a threat to send to someone to a debt collection agency is really a threat of jail time, or legalized killing by a police officer for resisting 1and1’s debtors’ prison.

After being with 1and1 for close to a decade now, I’m really disappointed by their lack of communication, negotiation, and customer service skills. Rather than waiting for more than one payment to be declined or putting a temporary freeze on the account and then sending legal threats, 1and1 jumped straight to debt collection agency threats.

Ironically enough, I think I received this letter after we had made the payment on the account. So it was resolved even though our websites were frozen over a weekend.

But no thank you, 1and1. I don’t know if your letter contained hollow threats or not, but we’re not intimidated by your threats of collections, with all that implies: lawsuits, judgments, hearings, bench warrants, jail, tasers, and guns.

I’m sure you can find others to pay for your services in the future, but we have too much respect for ourselves to continue being your client.

There are plenty of other web hosting companies out there who would be happy to receive our business, and who understand that sometimes payments are declined for any number of reasons (lost, stolen, expired cards), and will give customers the benefit of the doubt for at least a few weeks without resorting to threats.

[Note: Google is full of horror stories of 1and1 actually following through on its collection agency threats. I guess they weren’t hollow threats at all. Here’s one story, and another, and another, and another, and another.]

Common Mistakes in Using Google’s Disavow Links Tool

Check out the latest Google Webmaster Help video on YouTube by Matt Cutts: “What are common mistakes you see from people using the ‘disavow links’ tool?”

A few takeaways:

  1. Google’s parser is looking for .txt files, so don’t upload a .doc or .xls file and expect it to be able to make sense of your disavow requests.
  2. Instead of looking for individual links on individual pages to disavow, you might need to disavow the entire domain, depending on how bad your link profile looks to Google. You can do a “domain:” disavow in these cases.
  3. Use the right format for domain: disavows. Don’t include the http:// or www prefixes. Instead, list the domain itself, like “domain:example.com”, and Google will be able to interpret it.
  4. Include any comments or justifications in the reconsideration request, rather than in the disavow file uploaded to Google. In fact, don’t include a lot of comments in a disavow links .txt file.
  5. Disavow won’t cure all of your SEO problems. Try to clean up those links first and remove them from the internet. Use disavow for links you can’t remove yourself.
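Putting points 1 through 4 together, a minimal disavow.txt might look like this (the domains and URL are invented for illustration):

```text
# paid links we could not get removed
domain:spammy-link-network-example.com
domain:cheap-directory-example.net
http://forum-example.org/threads/great-deals-12345
```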

Watch the entire video below.