Archive for the 'Quality' Category

The Truth About Modern Content

Monday, September 28th, 2015

The mantra “Content is King” has been bandied around for years now: it was certainly huge when I started this blog in 2007.

As a “content person” I’ve always wanted this to be true, and acted as if it was. In 2009, I was running what we would now call “Content Marketing” for a big travel client using a network of authors with their own profiles, long before “Authorship” came to the fore.

The truth, however, is that between then and now content, while important, has still played second or third fiddle to all sorts of other factors in SEO, especially inbound links and technical tweaks. There have always been ways to artificially boost a site’s authority because — until recently — search engine algorithms have been pretty simple beasts, and it has been very difficult to turn content “effectiveness” into a meaningful metric.

Google’s Hummingbird update was probably the beginning of the end for technical fiddles. In fact, it’s wrong to call Hummingbird an “update”; it was much more than that. We’d all been talking about “Semantic Search” for years, but Hummingbird meant that semantics — the basis of real language — really would play a part in search quality.

King Tut’s Tomb

The reasoning behind quality content being a ranking factor is simple. People go to search engines to find answers to their questions, and if they get unsatisfactory answers they don’t come back. So search engines need to find the sites giving those best answers and feature them highly. That doesn’t necessarily mean the sites that everyone knows about — in other words, the ones with the most inbound links.

Any researcher will know the joy of finding something that has never been seen before, or which has been lost. In real life, you might liken this to the discovery of Tutankhamen’s Tomb. All the wonderful treasures were there to be found because all trace of the tomb had been lost for centuries.

The search engines need to do two things to ensure they show the best search results:

  • They need to find that content
  • They need to assess it just as an end user would and ascribe a value to it

The problem with the first requirement is that inbound links have become hugely devalued as a result of years of “Black Hat” SEO techniques, so some other way to validate existing ranking factors is required. The move towards the “entity” model is one, where a simple mention of your URL on another site is seen as an endorsement, or — as in the case of a poor review — exactly the opposite. And then there’s Social Media: not a ranking factor in itself, as Google’s John Mueller repeatedly points out, but certainly a flag that something is making waves and should be investigated.

All this means that keeping your house in order with good technical SEO, Meta tagging, Schema and fast server response times is still important.
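To make “Schema” a little less abstract, here is a minimal, illustrative snippet of schema.org structured data for a hotel page, the kind of markup that tells a search engine what entity the page is about rather than leaving it to guess from a bag of keywords. The hotel name, address and telephone number are placeholders, and the properties you actually mark up will depend on your own pages:

    <!-- Illustrative only: all the hotel details below are placeholders -->
    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "Hotel",
      "name": "Example Hotel",
      "url": "https://www.example.com/",
      "telephone": "+44 20 0000 0000",
      "address": {
        "@type": "PostalAddress",
        "streetAddress": "1 Example Street",
        "addressLocality": "London",
        "postalCode": "EC1A 1AA",
        "addressCountry": "GB"
      }
    }
    </script>

Markup like this doesn’t replace good content; it simply gives the “entity” model something unambiguous to hang your content on.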

Say it like it is

But this is a blog about content, and it is the second point that matters to us. Once upon a time, and not so long ago, all you needed to do with content was make sure you had your chosen keywords in the right ratios, and that was it: no attention to quality or readability or relevance was really required. Today, Semantic Search means you don’t need to worry about keyword densities; you need only write authoritatively about your subject in a natural way. Of all the search engines, Google has perhaps spent most time — and money — trying to get content assessment right, even lashing out big bucks to buy large chunks of time on mega-computers owned by the US Department of Defense.

Today, as a result of all this effort, Google can now read as well as any 10-year-old: a 10-year-old that can speak 123 languages and name 170 million references in less than a second.

That means that, for the first time in forever, what you write on your site is as important as anything else, for if a search engine arrives via a tweet, a Google+ posting or even an inbound link, unless it finds something useful, unique and usable there, it will shrug its subset-evaluation functions and move on.

Google have gone public about this. In February 2015, New Scientist magazine published an article called “Google wants to rank websites based on facts not links” saying preferential treatment would be given to websites which carried more truthful information. Basically, liars won’t prosper.

As an SEO I increasingly examine my chosen specialty with despair. It was never easy, but it is becoming more and more difficult to see what more you can do, whether code updates, responsive layout tweaks or time-to-first-byte improvements, to boost a site’s rankings. Hopefully, there is something now which works … content.

Key facts about keywords

Wednesday, October 17th, 2012

There is still much uncertainty about keywords, and it’s not surprising: keyword “science” is not — and probably never will be — settled.

The temptation is to “stuff” content with as many relevant words as possible in the hope that they will match the sorts of words that people are searching for. That was how Search Engines used to work and it’s a simple concept: pages with lots of keywords will be relevant to more people and so appear in more search results.

The problem with this approach is that the search result is more often than not completely irrelevant.

It is also susceptible to spamming because one way to get lots of people to your site used to be to cram as many different keywords into a piece of text as possible. Spammers work on very small margins: send out a thousand fake emails and even one click is worth it – especially when you’re actually sending a million!

That’s why today’s search engines are much more sophisticated. Their business is to produce relevant results, which makes them trusted (and more popular), which in turn means that their advertisements get seen by more people.

That requires an accurate search result to come from a very few words, even if those words aren’t the ones used by the searcher.

Actually, we all use a host of words for a single concept – for example: rooms, hotels, accommodation, stays, beds, auberge, boarding house, caravansary, dump, fleabag, flophouse, hospice, hostel, hostelry, house, inn, lodging, motel, motor inn, public house, resort, roadhouse, rooming house, spa, tavern – and modern search engines realise this.

Mistakes

It gets more complicated because sometimes we say one thing and mean another, or we don’t know what we want but we expect the search result to tell us what it is, or … our spolling is simply not up to scrotch.

Search engines know this …

And because we’re not always forthcoming with our search queries, Search Engines also use other signals to decide exactly what is the best search result for the supplied phrase.

For example, if you’re signed in to any Google service, your previous search history will be used to tailor your current results, meaning that if you regularly search for any term or terms — like ‘hotels in Town‘ — your results will be affected by this. If yours is a regular search, it’s likely that your results will differ significantly from those of someone else who has used that query for the first time (perhaps a potential customer).

Your results are also affected by geography: a search at London Heathrow’s departure lounge will return a different set of results to the one you carry out in the arrivals lounge at JFK using the same tablet computer.

And this “tailoring” can happen even if you’re NOT logged in to a Google account.

One of the big SEO talking points of 2012 has been tracking users — and their marketing preferences — using methods that do not require cookies or other overt tracking solutions, even down to profiling their hardware. This is partly because of new EU rules which require sites to declare their use of cookies for privacy protection, but also because Google (and others) are withholding search data from end users like us: the infamous “not provided”.

Brand

However, Google have been doing something similar for a long while: when Google’s Executive Chairman Eric Schmidt told a 2010 keynote audience at IFA in Berlin, “We know where you are. We know where you’ve been. We can more or less know what you’re thinking about”, he wasn’t being Orwellian, just hinting at better search results.

What all this means for Organic Search is that the days of “stuffing” and “keyword densities” are over.

While we might not have reached predictive perfection – although predictive search results are physically changing the way people use keywords — we can finally concentrate on producing good, insightful, useful, informative and engaging content instead of worrying that we haven’t used the word “hotel” often enough in the first paragraph.

In fact, early in 2012 top Googler Pierre Far told the London Search Marketing Expo (SMX2012) that overusing “key” keywords such as “hotel” was almost intrinsically seen as spammy on sites whose core business is hotels.

That’s not to say that you can be lazy about content — a page about a hotel should be full of hotel keywords — it’s just that you should concentrate on presenting good information in an engaging way.

All the major search engines are now following Google’s lead in downgrading “over-optimised” websites. The major tool here is Google’s series of Penguin updates, the latest rolled out just two weeks ago. These effectively downgrade sites with suspicious levels of poor-quality inbound links, anchor text aggressively targeting high-volume keywords, or thin content.

Another recent update to Google’s algorithm is aimed at sites with keyword-rich domains and little relevant content; these include hastily constructed affiliate-style sites which pretend to deliver important information but are actually just link farms pumping visitors to third-party sites. Travel keywords are a particular favourite here.

Takeaways

To sum up, here are some simple guidelines for content, based on Google and Bing guidelines, as well as empirical data from our own (and industry) sources …

  • Do be aware of keyword volumes but use them as a guideline for the sorts of things you should be writing about
  • Do consider your end user and write content that engages with their needs: local phrasing, in-depth information, all written in a readable way
  • Don’t keep using the same word so that the text seems stilted and robotic; use synonyms, alternatives, other ways of saying the same thing.
  • Do keep your content relevant to the subject of the page; drifting off to other topics will dilute its effectiveness. One topic, one page.
  • Don’t repeat yourself; say something new.
  • Do use short paragraphs
  • Do make your content readable, understandable and interesting, and include a call to action!
  • Don’t worry about keyword densities or getting as many keywords into the text as possible; if the page is relevant, you ARE effectively covering keywords!
  • Don’t repeat yourself! (this is important).

Finally, if you’ve managed to reach this sentence, you’ve seen how even vaguely interesting content can make people read on past the first paragraph.

And the longer you can engage visitors, the more chance you have that they will convert, and the more keyword opportunities you create to bring further potential customers to the site.

News from The Front

Sunday, September 2nd, 2007

I’m still amazed at the number of people who ask for a “splash page” on their site: preferably something with lots of animated gifs “because they look nice”.

It’s become almost a mantra with me that home pages must provide a reason for the visitor to come back. (Actually, all pages should give the visitor a reason to come back because it’s just as likely that they’ll parachute in as a result of a link from StumbleUpon or Facebook or some search engine.) So your homepage should feature fresh content, perhaps even some random call to action, to keep it interesting.

This was all confirmed by a reread of Steve Krug’s Don’t Make Me Think. And then, as I looked over the homepage of this very site, it dawned on me that it too was a splash page. One visit was enough to know everything it said and there was precious little novelty: no reason to come back.

Needless to say, I’ve begun a rewrite of the JWC home page. There’s still some way to go — I will be adding some live updated content in the form of RSS too — but there are certainly lessons to be learned.

They include:

  • Never be afraid to re-examine your content
  • Never be complacent about your site, and
  • Take your own good advice

Good Lessons

Don’t Make Me Think was first published in 2000, but almost all of it is still relevant. I’ll be regurgitating much of it in the coming weeks, with some more up-to-date insights of my own.

But usability is a vital part of good SEO and you neglect it at your peril. It’s not just a question of hard-to-use websites not being “sticky” (actually, studies show that people will persevere with an inaccessible website because they fear the alternative won’t be much better): a usable site makes for better SEO because it is attractive to both humans and robots.

The Case for Web Content Planning – Part 1

Tuesday, August 28th, 2007

If you’re new to the wide world of the web, or even considering the relaunch of an existing site, then you should really be giving some thought to a strategy for content.

A lot of sites around today happened without a content plan: they simply grew organically from the germ of an idea and the basis of a design. That’s all right for hobby sites, but when it comes to content with a purpose, wishy-washy organic growth won’t cut it.

Obviously, you want a richly populated, deeply interesting site; one which will attract those all-important backlinks from popular sites because it has something interesting to say. Like a novel, every good website’s content needs a plot. And while that plot may develop over the coming years, it should always fit perfectly with your business objective at any point in time: there should be no gaps, no awkward pauses, no pages that are hinted at but just aren’t there.

So from the get-go, you should have a plan for web content so that as your business grows, the content grows with it. Get it right and within five years you’ll have a huge resource on your hands with a minimum of effort. Get it wrong and you’ll be left with a nightmare of time-consuming revision, rewriting and damaging contradiction.

Back-Breaking Back-Links

And there are great SEO benefits in well-planned and seamless content. Right now you’re probably thinking about cross-linking campaigns and not looking forward to the prospect. It’s a laborious enterprise and often more trouble than it’s worth, not least because the websites most likely to reciprocate are those with the smallest PageRank and hence the least clout, SEO-wise.

But, if your site is full of interesting, joined-up content, there’s more chance that sites with good PageRank will link to you automatically. You don’t have to be a TIME, Wikipedia or a BBC to be interesting enough for TIME, Wikipedia or the BBC to link to you: you just have to be relevant, original and authoritative.

Five ways to damage your SEO with content

Monday, August 27th, 2007

1. Write Rubbish

If your content makes no sense, if it’s dull and irrelevant, if not even your mother would make it all the way through, then you can be sure it will be bad for SEO. Make content interesting, make content readable, make content fun!

2. Duplicate It

There’s nothing quite so annoying as content repeated again and again. I mean, there is NOTHING so annoying as content which is repeated time after time. Really, repeating content again and again and again is really, REALLY, really annoying. The search engines don’t like it either; you might even describe it as SEO’s worst nightmare. Don’t use content which is duplicated (or even just summarised) elsewhere on the internet — even if it’s your own copyright, and especially if it’s duplicated on the same site. Check for originality on copyscape.com if you’re not certain, and even if you are. Even if the content is your copyright and has been copied by someone else, it can hit your SEO if the copying site has a higher PageRank than yours.

3. Make It Invisible

For search engines, invisible text equals SEO scam. Technically, making content invisible to the naked eye — for example, making it the same colour as the background or making it transparent or putting it in comment tags — comes under the heading of “Black Hat SEO”, or cheating. It’s a way of artificially loading content with keywords [SEO, content, search engine, timeshare, cialys, pre5cription5] to bump up the density, and the search engines got wise to it years ago. It will hurt your SEO.
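To be explicit about what those tricks look like in markup (the keywords here are invented, and every one of these patterns is something to avoid, not copy):

    <!-- 1. Text the same colour as the background -->
    <p style="color:#ffffff; background-color:#ffffff;">
      cheap hotels cheap hotels cheap hotels
    </p>

    <!-- 2. Text made transparent -->
    <p style="opacity:0;">best hotel deals best hotel deals</p>

    <!-- 3. Keywords stuffed into a comment tag -->
    <!-- hotels rooms accommodation lodging inn motel resort -->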

4. Use JavaScript To Present It

Search engines just won’t index content which is provided by JavaScript. There have been too many SEO scams using scripts in the past and Google and the rest aren’t taking chances any more. If all you can see in the content source code is a ton of JavaScript, then you can be sure that the search engines won’t be seeing it either.
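A quick sketch of the difference, using placeholder hotel copy. The first version only exists once a script has run, so the page source a spider downloads contains nothing worth indexing; the second says exactly the same thing in plain HTML:

    <!-- Invisible to spiders: the content only appears after the script runs -->
    <div id="intro"></div>
    <script type="text/javascript">
      document.getElementById("intro").innerHTML =
        "<p>Welcome to our hotel: rooms, rates and availability.</p>";
    </script>

    <!-- Visible to spiders and humans alike: the same content as plain HTML -->
    <p>Welcome to our hotel: rooms, rates and availability.</p>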

5. Make It Chaotic

Content should make sense. Part of that is how it is organised. One of the best ways to ruin your SEO is to order content in an illogical, inconsistent fashion so that the reader doesn’t know whether they are at the beginning, middle or end. This extends to your <h> tags: use them in the order <h1>, <h2>, <h3> … <h6>. Keeping content organised means the search engine spiders can crawl it, index it and rank it to the best effect.
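As a sketch of what that organisation looks like in markup (the page topic is invented), the heading tags should map out the structure of the content, in order and without skipping levels:

    <!-- One <h1> per page, then <h2> sections and <h3> subsections in order -->
    <h1>Hotels in Example Town</h1>
      <h2>Where to stay</h2>
        <h3>City centre</h3>
        <h3>Near the airport</h3>
      <h2>How to book</h2>
        <h3>Online</h3>
        <h3>By telephone</h3>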

Why Content is on the Rise

Sunday, August 26th, 2007

This may not yet be the Golden Age of Content, but it IS coming. Getting those all-important search engine placings has, until now, been a matter of juggling organic search elements.

These include keyword factors like good meta tags, keyword density in text, internal links and even the domain name itself; domain registration age and history; good backlinks, their relevance to the topical neighbourhood, the age of those links and the quality of the sending domain; and metrics such as the time spent on pages and the number of searches.

Google’s current algorithm certainly has some direct analysis of content beyond keyword densities, and there is some speculation that further content endorsement comes from good old human beings (search specialists will also tell you that a good route to prominent Google placement is via The Open Directory Project — dmoz.org — which is entirely human-based).

Google’s own comment on their search algorithm is simply: “Google’s complex automated methods make human tampering with our search results extremely difficult”.

Google watchers say the algorithm changed last year to the detriment of many existing sites relying on the arsenal of so-called “White Hat” tricks such as keyword density and the Long Tail. One way of regaining lost ground that seemed to work was increased pagination: more content. It seems as if Google (and other search engines) have good, relevant and interesting content in their sights.

But isn’t that what search engines were meant to be? Somewhere people went to find readable pages relevant to their interests.

You should take note of this now. Don’t abandon White Hat, but be aware that your site should be more than a series of search engine algorithm tricks. Content is king.