There is still much uncertainty about keywords, and that's not surprising: keyword "science" is not settled, and probably never will be.
The temptation is to "stuff" content with as many relevant words as possible in the hope that they will match the sorts of words that people are searching for. That is how search engines used to work, and it's a simple concept: pages with lots of keywords match more queries and so appear in more search results.
The problem with this approach is that the search result is more often than not completely irrelevant.
It is also susceptible to spamming because one way to get lots of people to your site used to be to cram as many different keywords into a piece of text as possible. Spammers work on very small margins: send out a thousand fake emails and even one click is worth it – especially when you’re actually sending a million!
That’s why today’s search engines are much more sophisticated. Their business is to produce relevant results, which makes them trusted (and more popular), which in turn means that their advertisements get seen by more people.
That requires an accurate search result to come from a very few words, even if those words aren’t the ones used by the searcher.
Actually, we all use a host of words for a single concept – for example: rooms, hotels, accommodation, stays, beds, auberge, boarding house, caravansary, dump, fleabag, flophouse, hospice, hostel, hostelry, house, inn, lodging, motel, motor inn, public house, resort, roadhouse, rooming house, spa, tavern – and modern search engines realise this.
It gets more complicated because sometimes we say one thing and mean another, or we don’t know what we want but we expect the search result to tell us what it is, or … our spolling is simply not up to scrotch.
Search engines know this …
And because we’re not always forthcoming with our search queries, Search Engines also use other signals to decide exactly what is the best search result for the supplied phrase.
For example, if you're signed in to any Google service, your previous search history will be used to tailor your current results: if you regularly search for a term or phrase — like 'hotels in Town' — your results will reflect that history, and are likely to differ significantly from those of someone using the same query for the first time (perhaps a potential customer).
Your results are also affected by geography: a search from London Heathrow's departure lounge will return a different set of results from the same search carried out in the arrivals lounge at JFK, even on the same tablet computer.
And this "tailoring" can happen even if you're NOT logged in to a Google account.
In fact, Google have been doing something similar for a long while: when Google's Executive Chairman Eric Schmidt told a 2010 keynote at the IFA in Berlin that "We know where you are. We know where you've been. We can more or less know what you're thinking about.", he wasn't being Orwellian, just hinting at better search results.
What all this means for Organic Search is that the days of “stuffing” and “keyword densities” are over.
While we might not have reached predictive perfection — although predictive search is already changing the way people use keywords — we can finally concentrate on producing good, insightful, useful, informative and engaging content instead of worrying that we haven't used the word "hotel" often enough in the first paragraph.
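For anyone unfamiliar with the jargon, "keyword density" is nothing more than the proportion of a page's words taken up by a target term. A minimal sketch of the calculation (the function name and the sample text are our own, purely for illustration — no search engine publishes its actual formula):

```python
import re

def keyword_density(text: str, keyword: str) -> float:
    """Fraction of words in `text` that exactly match `keyword` (case-insensitive)."""
    words = re.findall(r"[a-z']+", text.lower())
    if not words:
        return 0.0
    return words.count(keyword.lower()) / len(words)

page = "Our hotel offers hotel rooms near the hotel district."
print(f"{keyword_density(page, 'hotel'):.0%}")  # 3 of the 9 words are "hotel"
```

A density like the 33% above is exactly the sort of unnatural repetition that reads as "stuffed" — the point of this section is that chasing such a number is now counterproductive.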
In fact, early in 2012, top Googler Pierre Far told the London Search Marketing Expo (SMX London 2012) that overusing "key" keywords such as "hotel" was almost intrinsically seen as spammy on sites whose core business is hotels.
That’s not to say that you can be lazy about content — a page about a hotel should be full of hotel keywords — it’s just that you should concentrate on presenting good information in an engaging way.
All the major search engines are now following Google's lead in downgrading "over-optimised" websites. Google's main tool is its series of Penguin updates, the latest released just two weeks ago. These effectively downgrade sites with suspicious levels of poor-quality in-bound links, anchor text over-optimised for high-volume keywords, or thin content.
Another recent update to Google’s algorithm is aimed at sites using keyword-rich domains and little relevant content; these include hastily-constructed affiliate-style sites which pretend to deliver important information but are actually just link farms pumping visitors to third-party sites. A particular favourite topic here is travel keywords.
To sum up, here are some simple guidelines for content, based on Google and Bing guidelines, as well as empirical data from our own (and industry) sources …
- Do be aware of keyword volumes, but use them as a guide to the sorts of things you should be writing about.
- Do consider your end users and write content that engages with their needs: local phrasing, in-depth information, a readable style.
- Don’t keep using the same word until the text seems stilted and robotic; use synonyms, alternatives, other ways of saying the same thing.
- Do keep your content relevant to the subject of the page; drifting off to other topics will dilute its effectiveness. One topic, one page.
- Don’t repeat yourself; say something new.
- Do use short paragraphs.
- Do make your content readable, understandable and interesting, with a call to action!
- Don’t worry about keyword densities or cramming as many keywords into the text as possible; if the page is relevant, you ARE effectively covering keywords.
- Don’t repeat yourself! (This is important.)
Finally, if you’ve managed to reach this sentence, you’ve seen how even vaguely interesting content can make people read on past the first paragraph.
And the longer you engage a visitor, the greater the chance they will convert; longer content also provides more keyword opportunities to bring more potential customers to the site.