I was recently on a conference call with a group of colleagues charged with defining IBM’s external search strategy. There was a delightful buzz among participants, considering the collective brain power straining the conference call line with the weight of its wit. I’m not talking about my brain power, but that of the subject matter experts from whom I was attempting to extract search wisdom.
Just as I was about to pronounce the call a success and send them back to their desks with some homework, a colleague said something that stopped me cold:
“I don’t know why we’re devoting so much effort to this. After all, search is just a tactic.”
Since the call, I have struggled to compose an adequate response to this statement. It seems innocuous at first hearing. After all, search is in fact approached as a tactic alongside other advertising tactics by corporations all over the globe. But one word renders the statement far more insidious than it first appears: just.
If search is just a tactic, much of my work over the last several years is pointless. In particular, the search-first approach to content strategy, which is at the heart of my research, seems a colossal waste of time.
To my way of thinking, this is not just bad for me. It’s bad for the entire web content industry. Treating search as just another tactic alongside e-mail campaigns undermines its central place in the web user experience. If search is just a tactic, we will never escape the stigma of search engine optimization as glorified spamming.
Search engine optimization (SEO) has a bad reputation among many in the industry. It started in the days of AltaVista, when scammers crammed the margins and metadata of their pages with keywords that were only marginally relevant to their users. This compounded the poor usability of context-insensitive search algorithms by loading the search engine results pages (SERPs) with junk. In a sea of unstructured text, users looked to search engines to help them find relevant content. Search engines couldn’t help them, in part because of dishonest tactics designed to drive traffic to web pages with no regard for user satisfaction.
Google was game changing for two reasons. First and foremost, it was able to discover the context of pages by looking at the linking relationships between them. This improved results dramatically. Second, and nearly as important, it designed its algorithm to exclude pages that appeared to engage in keyword spamming from its index. And it hired thousands of editors to review the pages its algorithms flagged for exclusion.
But search spammers are as clever as malware writers. As malware writers are always trying to stay one step ahead of the security industry, search spammers are always trying to stay one step ahead of the search industry. They know what triggers Google’s flags and they avoid it. Instead of loading titles and H1 tags with keyword variations, they load alt attributes and other hidden text with keywords. These tactics are designed to pump up the keyword density of pages artificially. Google is continuously adding new flags to punish these folks, and the beat goes on.
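The keyword-density manipulation described above is easy to see in miniature. Here is a minimal sketch of my own (not any search engine’s actual algorithm) that computes how often a keyword appears relative to all words on a page, first over the visible copy alone and then with hidden text such as stuffed alt attributes folded in:

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of words in `text` that match `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

# Hypothetical page copy and stuffed alt-attribute text, for illustration only.
visible = "Our travel guide covers hotels, flights, and local food."
hidden_alt_text = "cheap cheap cheap cheap cheap"

print(keyword_density(visible, "cheap"))                        # 0.0
print(keyword_density(visible + " " + hidden_alt_text, "cheap"))  # jumps well above zero
```

The point of the sketch is that text invisible to the user still counts toward a naive density score, which is exactly the loophole the spammers exploit and the search engines keep closing.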
The goal of black-hat SEO is to entice unsuspecting users to click into pages whether they are interested in them or not. In so doing, black-hat practitioners can pump up their traffic numbers and make money off that traffic. Google has designed the ultimate poison pill for black-hat SEO, however. Users who land on an irrelevant page from Google more often than not bounce back to the SERP. Google measures bounce rates and pushes pages with high bounce rates down in the rankings. Pages with good engagement rates (the opposite of bounce) climb in the rankings. Black-hat SEO might work for a brief time, but it is never sustainable.
Most SEO that is practiced today is akin to black-hat SEO. Despite the best intentions of everyone involved, the process leads to higher traffic, higher bounce rates and lower engagement rates. The process artificially manipulates the experience to get the desired result. This kind of SEO is just a tactic.
In gray-hat SEO, pages are created naturally and an SEO consultant is brought in after the fact to try to improve their search-engine effectiveness. SEO gurus find the words that have the highest demand in Google and recommend placing these words in strategic places on the page (title tags, H1, H2, body copy, alt attributes, etc.). Because the keywords chosen have relatively high query volumes in Google, ranking well for those words results in more traffic to your pages.
If traffic is all you (or your customers) care about, gray-hat SEO is not so bad, as long as the words the SEO guru identifies are relevant to the original page. But in my experience, it is very difficult to find keywords that are both high demand and relevant to the content. Most of the time, the chosen words are only marginally relevant to the content as written. It’s a bit of a challenge for the writer to fit them in without doing damage to her content. Often the writer resists, leaving too little keyword density to ever rank well for the words chosen. Just as often, the writer tries and fails, degrading the message implicit in the content.
Gray-hat SEO failure has two guises. In the one, the page never ranks well for lack of keyword density. In the other, the page ranks well, but the content is not relevant enough to the keyword to get much engagement. Users who scan the page might find the keywords in it, but they’ll bounce anyway because it is just not relevant enough to what they’re looking for.
The other aspect of gray-hat SEO is link building: finding link partners after the fact and convincing them to link to your pages. If the link partners are willing to do this without link swapping or other under-the-table deals, it is OK. But in my experience, most of the folks you approach with these after-the-fact link-building exercises require link swapping. Link swapping is one of the flags Google looks for to detect black-hat SEO. The risk of getting caught is not worth it for most content owners, unless you have a pre-existing relationship and the linking site is an obvious destination for your audience.
Black-hat and gray-hat SEO have such a bad reputation that they convince very smart people like Mike Moran and Bill Hunt to advise us to “write for people first, not for search engines” (2009, 306). I certainly have some sympathy with this attitude. Writing in this way shows your users that you have integrity; that you’re not trying to rig the system. The problem with it is in practical application: if you start out writing for people and do SEO after the fact, you inadvertently appear to engage in gray-hat SEO.
The primary way gray-hat SEO is akin to black-hat SEO is in the user experience. In trying to balance writing for people and writing for search engines, you risk building pages that have high bounce rates and low engagement rates. Not only is it difficult to maintain search rankings with this approach, it creates a lot of negative branding and missed opportunities for brand loyalty.
White-hat content strategy does not see users as mere statistics with which to generate revenue. They are potential loyal customers or collaborators with whom you want to establish a relationship. You should care if you are forcing them to bounce off your pages. Not only does this create a negative brand experience for them, it does not engender repeat visits or other manifestations of brand loyalty.
I call it white-hat content strategy rather than white-hat SEO because SEO is only a component of a larger strategy. The basic premise is to write for your users. But how do you learn to do that? If you understand how your audience thinks and writes, you can write in ways that connect with them, and ultimately foster the kind of trust they need to become loyal customers or collaborators.
What audience analysis tools do you use to understand the way your target audience thinks and writes? On the web, one of the best and easiest is keyword research. If you can understand what words your target audience uses in their search queries and social media writing, you can use those words to connect with them. Keyword research is not done to pump up the volume of your traffic. It’s done to write with the words that are most relevant to your target audience. If you write with those words, the audience you attract will be much more likely to engage with your content and start developing a relationship of trust with you.
One problem SEO is supposed to address is corporate speak. In corporations, we get indoctrinated in the nomenclature of our brands and offerings. The longer we are part of this insular culture, the less aware we are that we describe our product families with a different vocabulary than our target audience uses. If we write in that vocabulary and then do SEO after the fact, the content will clash with the keywords and be ineffective for both search engines and users.
Search-first content strategy solves the problem by insisting that keyword research be done at the very inception of the content. Keyword research is a way of educating writers and editors in the vocabularies of their audiences. When they begin to speak the language of their audiences, they can write more relevant web pages for them. The only SEO in the process is baked into the templates and authoring environments in which the writers work. By writing for search engines, you are writing for a well-defined target audience. By writing for a well-defined target audience, you are writing for search engines.
A side benefit is link building. Search-first content is link bait: It’s the kind of content collaborators will want to post and tweet and otherwise generate buzz about. This buzz naturally results in links. This kind of content does not require link swapping and other under-the-table deals common in gray-hat SEO.
Now that I’ve laid out the overall strategy, you might understand why I reacted so negatively to the notion that search is just a tactic. Search is not just a tactic. It is the primary way users find the content they’re looking for on the web. It is a central feature of digital content. It needs to be at the center of a digital content strategy, not something tacked on after the fact.
James Mathewson is the Global Search Strategy and Expertise Lead for IBM and co-author of Audience, Relevance and Search: Targeting Web Audiences with Relevant Content.