Archive for the 'Strategy' Category

The Truth About Modern Content

Monday, September 28th, 2015

The mantra “Content is King” has been bandied around for years now: it was certainly huge when I started this blog in 2007.

As a “content person” I’ve always wanted this to be true, and acted as if it was. In 2009, I was running what we would now call “Content Marketing” for a big travel client using a network of authors with their own profiles, long before “Authorship” came to the fore.

The truth, however, is that between then and now content, while important, has played second or third fiddle to all sorts of other factors in SEO, especially inbound links and technical tweaks. There have always been ways to artificially boost a site’s authority because, until recently, search engine algorithms were pretty simple beasts, and it has been very difficult to turn content “effectiveness” into a meaningful metric.

Google’s Hummingbird was probably the beginning of the end for technical fiddles. In fact, it’s misleading to call Hummingbird an “update”; it was much more than that. We’d all been talking about “Semantic Search” for years, but Hummingbird meant that semantics, the basis of real language, really would play a part in search quality.

King Tut’s Tomb

The reasoning behind quality content being a ranking factor is simple. People go to search engines to find answers to their questions, and if they get unsatisfactory answers they don’t come back. So search engines need to find the sites giving those best answers and feature them highly. That doesn’t necessarily mean the sites that everyone knows about — in other words, the ones with the most inbound links.

Any researcher will know the joy of finding something that has never been seen before, or which has been lost. In real life, you might liken this to the discovery of Tutankhamen’s Tomb. All the wonderful treasures were there to be found because all trace of the tomb had been lost for centuries.

The search engines need to do two things to ensure they show the best search results:

  • They need to find the content in the first place
  • They need to assess that content just as an end user would and ascribe a value to it

The problem with the first of these is that inbound links have been heavily devalued by years of “Black Hat” SEO techniques, so some other way to validate existing ranking factors is required. The move towards an “entity” model is one, where a simple mention of your URL on another site is seen as an endorsement or, as in the case of a poor review, exactly the opposite. And then there’s Social Media: not a ranking factor in itself, as Google’s John Mueller repeatedly points out, but certainly a flag that something is making waves and should be investigated.

All of which means that keeping your house in order with good technical SEO, meta tagging, Schema markup and fast server response times is still important.
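
If Schema markup is new to you, here is a rough idea of what it involves: a small sketch, in Python purely for convenience, that builds a Schema.org “Organization” snippet as JSON-LD. The organisation name, URL and profile below are placeholders, not recommendations.

    import json

    # A minimal, hypothetical Schema.org "Organization" snippet, serialised as
    # JSON-LD ready to drop into a <script type="application/ld+json"> tag.
    # All names and URLs below are placeholders.
    organisation = {
        "@context": "https://schema.org",
        "@type": "Organization",
        "name": "Example Travel Co",
        "url": "https://www.example.com",
        "sameAs": ["https://twitter.com/example"],  # profiles that confirm the entity
    }

    print('<script type="application/ld+json">')
    print(json.dumps(organisation, indent=2))
    print('</script>')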

Say it like it is

But this is a blog about content, and it is point 2 that matters to us. Once upon a time, and not so long ago, all you needed to do with content was make sure you had your chosen keywords in the right ratios, and that was it: no attention to quality, readability or relevance was really required. Today, Semantic Search means you don’t need to worry about keyword densities; you need only write authoritatively about your subject in a natural way. Of all the search engines, Google has perhaps spent the most time, and money, trying to get content assessment right, even splashing out on large chunks of time on supercomputers owned by the US Department of Defense.
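
For anyone who missed that era, the sketch below shows roughly how the old metric was calculated; the sample copy is made up, but the arithmetic is the point.

    import re

    def keyword_density(text: str, keyword: str) -> float:
        """Occurrences of a keyword as a percentage of total words --
        the crude metric that semantic search has made largely irrelevant."""
        words = re.findall(r"[a-z']+", text.lower())
        if not words:
            return 0.0
        hits = sum(1 for word in words if word == keyword.lower())
        return 100.0 * hits / len(words)

    sample = "Cheap flights to Rome. Book cheap flights today with our cheap flights finder."
    print(f"{keyword_density(sample, 'cheap'):.1f}%")  # about 23% -- absurdly 'dense' even by the old standards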

Today, as a result of all this effort, Google can now read as well as any 10-year-old: a 10-year-old that can speak 123 languages and name 170 million references in less than a second.

That means that, for the first time in forever, what you write on your site matters as much as anything else. If a search engine arrives via a tweet, a Google+ post or even an inbound link and doesn’t find something useful, unique and usable there, it will shrug its subset-evaluation functions and move on.

Google have gone public about this. In February 2015, New Scientist magazine published an article called “Google wants to rank websites based on facts not links”, saying that preferential treatment would be given to websites carrying more truthful information. Basically, liars won’t prosper.

As an SEO, I increasingly examine my chosen specialty with despair. It was never easy, but it is becoming more and more difficult to see what further code updates, responsive-layout tweaks or time-to-first-byte improvements can be done to boost a site’s rankings. Hopefully, there is now something that works … content.

The Case for Web Content Planning – Part 1

Tuesday, August 28th, 2007

If you’re new to the wide world of the web, or even considering the relaunch of an existing site, then you should really be giving some thought to a strategy for content.

A lot of sites around today happened without a content plan: they simply grew organically from the germ of an idea and the basis of a design. That’s all right for hobby sites, but when it comes to content with a purpose, wishy-washy organic growth won’t cut it.

Obviously, you want a richly populated, deeply interesting site; one which will attract those all-important backlinks from popular sites because it has something interesting to say. Like a novel, the content of every good website needs a plot. And while that plot may develop over the coming years, it should always fit perfectly with your business objective at any point in time: there should be no gaps, no awkward pauses, no pages that are hinted at but just aren’t there.

So, from the get-go, you should have a plan for web content, so that as your business grows, the content grows with it. Get it right and within five years you’ll have a huge resource on your hands with a minimum of effort. Get it wrong and you’ll be left with a nightmare of time-consuming revision, rewriting and damaging contradiction.

Back-Breaking Back-Links

And there are great SEO benefits in well-planned, seamless content. Right now you’re probably thinking about cross-linking campaigns and not looking forward to the prospect. It’s a laborious enterprise and more trouble than it’s worth, not least because the websites most likely to reciprocate are those with the smallest PageRank and hence the least clout, SEO-wise.

But if your site is full of interesting, joined-up content, there’s more chance that sites with good PageRank will link to you of their own accord. You don’t have to be TIME, Wikipedia or the BBC to be interesting enough for TIME, Wikipedia or the BBC to link to you: you just have to be relevant, original and authoritative.

Content By The Numbers

Monday, August 20th, 2007

So why beef up the content on your website? The logic goes that the more people you get to come to your website, the more business you get. Logical really …

It used to be said that success on a website was defined as a “conversion rate” of anything better than 1% — that is, for every 100 people coming to your website, one made a purchase.

Yet there is a potential problem. The more varied the content you offer, the more your visitor numbers should rise, and that is good. At the same time, though, the more varied the content, the more varied your audience will be. In fact, much of your audience will consist of people who have no need for your prime service. By that logic, page views may go through the roof and sales may double or treble, but the conversion rate will fall.
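
To put some entirely made-up numbers on that (the visit counts and conversion rates below are hypothetical, chosen only to show the shape of the arithmetic):

    # Hypothetical figures: broader content brings a bigger but less targeted
    # audience, so the conversion rate falls while absolute sales still climb.
    before_visits, before_rate = 10_000, 0.010   # 1.0% conversion
    after_visits,  after_rate  = 40_000, 0.006   # 0.6% conversion

    before_sales = before_visits * before_rate   # 100 sales
    after_sales  = after_visits * after_rate     # 240 sales

    print(f"Before: {before_sales:.0f} sales from {before_visits:,} visits at {before_rate:.1%}")
    print(f"After:  {after_sales:.0f} sales from {after_visits:,} visits at {after_rate:.1%}")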

No matter. This is a case where it pays not to get caught up in the jargon. The key phrase is sales DOUBLING or TREBLING. There are some analogies with the retail idea of “pile it high and sell it cheap”: if you sell lots, you can afford a smaller profit margin, though that’s not the whole story. Getting many more people to come to your website will help to make your brand a household name. And as the business develops, and as markets develop too, the people who drift through the site today may be the customers of tomorrow looking for the product of the moment.

That’s why some content streams might not seem to be aimed at your “target audience”. Your net needs to be cast wider.

Five Reasons to have a CMS

Thursday, August 16th, 2007

1. Because you always have access to your content

Managing content using flat-XHTML and a code editor means you ALWAYS need a code editor. Even if you have a team running a round-the-clock service there will always be gaps. A good Content Management System or CMS means an authorised user can nip into any cybercafé in the world and change stuff to their heart’s content.

2. Because you can keep tabs on who’s doing what

Flat-XHTML is anonymous. A good CMS will include an “audit trail”: a clear record of who has done what, and to which pages. This means sources of error can be pinpointed; ageing content can be freshened up or removed; differences in individual workloads can be managed; and weak areas can be highlighted. This may seem a little Big Brotherish but, in reality, it’s about spotlighting excellence as well as under-achievement.
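
If that sounds abstract, the records involved are very simple. Here is a minimal sketch of one, in Python; the field names and sample values are illustrative rather than taken from any particular product.

    from dataclasses import dataclass, field
    from datetime import datetime, timezone

    @dataclass
    class AuditEntry:
        """One line in a CMS audit trail: who did what, to which page, and when."""
        user: str
        page: str
        action: str            # e.g. "created", "edited", "deleted"
        timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

    audit_log: list[AuditEntry] = []
    audit_log.append(AuditEntry(user="jsmith", page="/offers/rome", action="edited"))

    for entry in audit_log:
        print(f"{entry.timestamp:%Y-%m-%d %H:%M} {entry.user} {entry.action} {entry.page}")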

3. Because you can maintain a style

Even with the world’s best-written style manual, bespoke additions to flat-XHTML content produce differences: it’s human nature to squeeze and poke things into position and cumulative, piecemeal changes can be difficult to roll back. A good CMS enforces style by limiting changes to content to those sanctioned by content managers.

4. Because it helps you to delegate

A good CMS allows access at different levels: from the Site Manager, who can do anything, to the writer, who can only enter and revise text. Now your website can be built by people with no web-development skills, which is most of us.
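
In practice that usually means role-based permissions of some kind. Here is a minimal sketch; the role names and permission sets are illustrative only, not those of any particular CMS.

    # Illustrative roles and permissions; a real CMS defines its own.
    PERMISSIONS = {
        "site_manager": {"edit_text", "publish", "change_templates", "manage_users"},
        "editor":       {"edit_text", "publish"},
        "writer":       {"edit_text"},
    }

    def can(role: str, action: str) -> bool:
        return action in PERMISSIONS.get(role, set())

    print(can("writer", "edit_text"))         # True
    print(can("writer", "change_templates"))  # False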

5. Because things change

A good CMS allows you to change your site as required in the least disruptive way. So, if your company totally rebrands, then the website can totally rebrand too (and at much less cost). And if new rules or new ways of thinking come along, you can meet the challenge easily, because your content is held as raw data by the CMS.