Archive for the 'SEO' Category

The Truth About Modern Content

Monday, September 28th, 2015

The mantra “Content is King” has been bandied around for years now: it was certainly huge when I started this blog in 2007.

As a “content person” I’ve always wanted this to be true, and acted as if it was. In 2009, I was running what we would now call “Content Marketing” for a big travel client using a network of authors with their own profiles, long before “Authorship” came to the fore.

The truth, however, is that between then and now content, while important, has played second or third fiddle to all sorts of other factors in SEO, especially inbound links and other technical tweaks. There have always been ways to artificially boost a site’s authority because — until recently — search engine algorithms have been pretty simple beasts, and it has been very difficult to turn content “effectiveness” into a meaningful metric.

Google’s Hummingbird was probably the beginning of the end for technical fiddles. In fact, it’s wrong to call Hummingbird an “update” at all; it was much more than that. We’d all been talking about “Semantic Search” for years, but Hummingbird meant that semantics — the basis of real language — really would play a part in search quality.

King Tut’s Tomb

The reasoning behind quality content being a ranking factor is simple. People go to search engines to find answers to their questions, and if they get unsatisfactory answers they don’t come back. So search engines need to find the sites giving the best answers and feature them highly. That doesn’t necessarily mean the sites that everyone knows about — in other words, the ones with the most inbound links.

Any researcher will know the joy of finding something that has never been seen before, or which has been lost. In real life, you might liken this to the discovery of Tutankhamen’s Tomb. All the wonderful treasures were there to be found because all trace of the tomb had been lost for centuries.

The search engines need to do two things to ensure they show the best search results:

  • They need to find the content in the first place
  • They need to assess that content just as an end user would, and ascribe a value to it

The problem with the first point is that inbound links have been heavily devalued by years of “Black Hat” SEO techniques, so some other way to validate existing ranking factors is required. The move towards the “entity” model is one: a simple mention of your URL on another site is seen as an endorsement or — as in the case of a poor review — exactly the opposite. And then there’s Social Media: not a ranking factor in itself, as Google’s John Mueller repeatedly points out, but certainly a flag that something is making waves and should be investigated.

All this means that keeping your house in order — good technical SEO, Meta tagging, Schema markup and fast server response times — is still important.
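To make that concrete, here is a minimal sketch of the sort of mark-up involved; the hotel, the URLs and every value are purely illustrative, not taken from any real site …

<head>
  <title>The Example Arms Hotel, York</title>
  <!-- a concise, human-readable summary for the results page -->
  <meta name="description" content="A family-run hotel two minutes from York Minster, with 24 en-suite rooms and free parking.">
  <!-- Schema.org structured data, in JSON-LD form -->
  <script type="application/ld+json">
  {
    "@context": "https://schema.org",
    "@type": "Hotel",
    "name": "The Example Arms Hotel",
    "url": "https://www.example.com/",
    "address": {
      "@type": "PostalAddress",
      "addressLocality": "York",
      "addressCountry": "GB"
    }
  }
  </script>
</head>

Google’s structured-data testing tool will happily validate snippets like this before you put them live.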

Say it like it is

But this is a blog about content, and it is the second point that matters to us. Once upon a time, and not so long ago, all you needed to do with content was make sure you had your chosen keywords in the right ratios: no attention to quality or readability or relevance was really required. Today, Semantic Search means you don’t need to worry about keyword densities; you only need to write authoritatively about your subject in a natural way. Of all the search engines, Google has perhaps spent the most time — and money — trying to get content assessment right, even lashing out big bucks to buy large chunks of time on supercomputers owned by the US Department of Defense.

As a result of all this effort, Google can now read as well as any 10-year-old: a 10-year-old that speaks 123 languages and can name 170 million references in less than a second.

That means that, for the first time in forever, what you write on your site is as important as anything else: if a search engine arrives via a tweet, a Google+ posting or even an inbound link and doesn’t find something useful, unique and usable there, it will shrug its subset-evaluation functions and move on.

Google have gone public about this. In February 2015, New Scientist magazine published an article called “Google wants to rank websites based on facts not links” saying preferential treatment would be given to websites which carried more truthful information. Basically, liars won’t prosper.

As an SEO I increasingly examine my chosen speciality with despair. It was never easy, but it is becoming more and more difficult to see what more you can do — code updates, responsive layout tweaks, time-to-first-byte improvements — to boost a site’s rankings. Hopefully, there is now something which works … content.

Key facts about keywords

Wednesday, October 17th, 2012

There is still much uncertainty about keywords, and it’s not surprising: keyword “science” is not — and probably never will be — settled.

The temptation is to “stuff” content with as many relevant words as possible in the hope that they will match the sorts of words that people are searching for. That was how Search Engines used to work and it’s a simple concept: pages with lots of keywords will be relevant to more people and so appear in more search results.

The problem with this approach is that the search result is more often than not completely irrelevant.

It is also susceptible to spamming because one way to get lots of people to your site used to be to cram as many different keywords into a piece of text as possible. Spammers work on very small margins: send out a thousand fake emails and even one click is worth it – especially when you’re actually sending a million!

That’s why today’s search engines are much more sophisticated. Their business is to produce relevant results, which makes them trusted (and more popular), which in turn means that their advertisements get seen by more people.

That requires an accurate search result to come from a very few words, even if those words aren’t the ones used by the searcher.

Actually, we all use a host of words for a single concept – for example: rooms, hotels, accommodation, stays, beds, auberge, boarding house, caravansary, dump, fleabag, flophouse, hospice, hostel, hostelry, house, inn, lodging, motel, motor inn, public house, resort, roadhouse, rooming house, spa, tavern – and modern search engines realise this.

Mistakes

It gets more complicated because sometimes we say one thing and mean another, or we don’t know what we want but we expect the search result to tell us what it is, or … our spolling is simply not up to scrotch.

Search engines know this …

And because we’re not always forthcoming with our search queries, Search Engines also use other signals to decide exactly what is the best search result for the supplied phrase.

For example, if you’re signed in to any Google service, your previous search history will be used to tailor your current results, meaning that if you regularly search for any term or terms — like ‘hotels in Town’ — your results will be affected by this. If yours is a regular search, it’s likely that your results will differ significantly from those of someone else who has used that query for the first time (perhaps a potential customer).

Your results are also affected by geography: a search at London Heathrow’s departure lounge will return a different set of results to the one you carry out in the arrivals lounge at JFK using the same tablet computer.

And this “tailoring” can happen even if you’re NOT logged in to a Google account.

One of the big SEO talking points of 2012 has been tracking users — and their marketing preferences — using methods that do not require cookies or other overt tracking solutions, even down to profiling their hardware. This is partly because of new EU rules which require sites to declare their use of cookies for privacy protection, but also because Google (and others) are withholding search data from end users like us: the infamous “(not provided)”.

Brand

However, Google have been doing something similar for a long while. When Google’s Executive Chairman Eric Schmidt told a 2010 IFA keynote in Berlin that “We know where you are. We know where you’ve been. We can more or less know what you’re thinking about”, he wasn’t being Orwellian, just hinting at better search results.

What all this means for Organic Search is that the days of “stuffing” and “keyword densities” are over.

While we might not have reached predictive perfection — although predictive search results are physically changing the way people use keywords — we can finally concentrate on producing good, insightful, useful, informative and engaging content instead of worrying that we haven’t used the word “hotel” often enough in the first paragraph.

In fact, early in 2012, top Googler Pierre Far told the London Search Marketing Expo (SMX 2012) that overusing core keywords such as “hotel” was almost intrinsically seen as spammy on sites whose core business is hotels.

That’s not to say that you can be lazy about content — a page about a hotel should be full of hotel keywords — it’s just that you should concentrate on presenting good information in an engaging way.

All the major search engines are now following Google’s lead in downgrading “over-optimised” websites. Google’s main tool here is the series of Penguin updates, the latest rolled out just two weeks ago. These effectively downgrade sites with suspicious levels of poor-quality inbound links, anchor text over-optimised for high-volume keywords, or thin content.

Another recent update to Google’s algorithm is aimed at sites with keyword-rich domains and little relevant content; these include hastily-constructed affiliate-style sites which pretend to deliver important information but are actually just link farms pumping visitors to third-party sites. Travel keywords are a particular favourite.

Takeaways

To sum up, here are some simple guidelines for content, based on Google and Bing guidelines, as well as empirical data from our own (and industry) sources …

  • Do be aware of keyword volumes but use them as a guideline for the sorts of things you should be writing about
  • Do consider your end user and write content which engages with their needs: local phrasing, in-depth information and a readable style
  • Don’t keep using the same word so that the text seems stilted and robotic; use synonyms, alternatives, other ways of saying the same thing.
  • Do keep your content relevant to the subject of the page; drifting off to other topics will dilute its effectiveness. One topic, one page.
  • Don’t repeat yourself, say something new.
  • Do use short paragraphs
  • Do make your content readable, understandable and interesting, and include a call to action!
  • Don’t worry about keyword densities or getting as many keywords into the text as possible; if the page is relevant, you ARE effectively covering keywords!
  • Don’t repeat yourself! (this is important).

Finally, if you’ve managed to reach this sentence, you’ve seen how even vaguely interesting content can make people read on past the first paragraph.

And the longer you can engage the visitor, the more chance you have that they will convert, and the more keyword opportunities you create to bring more potential customers to the site.

Getting Google to keep coming back

Tuesday, July 15th, 2008

A friend of mine has recently launched a blog on an unsuspecting world, starting from scratch.

She downloaded a copy of WordPress, bought some blog-only web space for a startling £27 a year, took one of the many WordPress themes and tweaked it until it looked right to her. This all happened on June 27.

Then she regularly surfed the web news directories, pulled out the stories that caught her eye, rewrote them and stuck them on her new site at the rate of at least three a day.

Now, with a bit of a tweak here and there, and the inclusion of a WordPress plug-in that creates a sitemap on the fly, she is beginning to scent the sweet smell of success. While her blog has no PageRank and so far no-one is linking to it, she is already getting a daily index from Google and others.
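For anyone wondering what such a plug-in actually produces: a sitemap is just a small XML file listing your pages for the crawlers. A minimal sketch (the blog URL here is made up, for obvious reasons) looks like this …

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://secret-blog.example.com/my-first-post/</loc>
    <lastmod>2008-06-27</lastmod>
    <changefreq>daily</changefreq>
  </url>
  <!-- … one <url> entry per post, regenerated on the fly as she publishes -->
</urlset>

With that file in place and submitted to the engines, each new post is one fetch away from being indexed.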

I can’t tell you much more. I’ve promised that I won’t let on what the blog is called, or link to it: apparently, she’s using it as a test bed to see just how hard (or easy) it is to get top SEO purely from content. If it were me, I’d be shouting it from the rooftops, but then I’m a bloke so I have an ego to massage.

I can admit that I’m extremely jealous of her success because, for technical reasons, I’m still unable to add any pages to the site that is my day job.

But my friend is right. It proves that once you take away all the coding factors, like semantic XHTML and proper linkage to and from the site, good SEO is REALLY all about content these days.

Provide good content, and provide it regularly, and Google and its rivals will come knocking.

According to best estimates of when the next PageRank update takes place, we have just 55 days to see how much daily indexing equates to good PR.

Hidden text that works for everyone!

Thursday, July 10th, 2008

So hidden text on the web is a BAD THING, we all know that. Gone are the days when it was considered cool to stick loads of words on a page — usually at the bottom — in the same colour as the background, possibly in 5px type.

Of course the search engines got wise to this “keyword loading”: it contributed nothing to the content of the page, after all. It got dumped into that category of “Blackhat technique”.

Good content is all about value to the reader and every word should count, so stuffing lots of “invisible” text on a page is simply wasted pixels. If you want to increase the keyword densities of your pages, simply write more (or at least write better).

But hang on. Never say never. There is a very good reason for including “invisible” text on your page, and not just the correct use of alt- and title-tags.

Those with a visual impairment rely on the text on a page completely: pretty pictures make no difference to them, so make the page work for people who can only read text. That means fully explaining text links and adding blocks of text to substitute for images.

This is all achieved using the CSS property display: none;

Create a style called .accessible (or .ted or .jarvis or whatever, it’s not important) thus …

.accessible {display: none;}

Now, any time you want to add “hidden” text, you can do it simply by wrapping it inside this class.

That means that the phrase …

The <span class="accessible">cat sat on the </span>mat

renders to an ordinary browser as …

The mat

but to a screen reader as …

“The cat sat on the mat”

Of course it’s a frivolous example but you might use this technique to improve a list-based navigation.

One site I worked on had a left nav where the code was a horrible table-based affair …

<table width="180" border="0" cellpadding="0" cellspacing="0" class="bgrleftmenu">
  <tr>
    <td width="20"><img src="../images/default/spacer.gif" width="20" height="36"></td>
    <td colspan="2" class="headerleft">Menu</td>
  </tr>
  <tr>
    <td colspan="3"><img src="../images/default/spacer.gif" width="20" height="8"></td>
  </tr>
  <tr>
    <td>&nbsp;</td>
    <td width="13"><img src="../images/default/spacer.gif" width="4" height="7"></td>
    <td width="147"><a id="ctl00_LeftUserMenu1_LeftMenu1_hlinkHome" class="linkyellow12" href="default.aspx">Home</a></td>
  </tr>
  <tr>
    <td>&nbsp;</td>
    ...
    <td><img src="../images/default/spacer.gif" width="4" height="7"></td>
    <td><a class="linkyellow12" href="../en/help.aspx">Help</a></td>
  </tr>
  <tr>
    <td colspan="3"><img src="../images/default/spacer.gif" width="1" height="12"></td>
  </tr>
</table>

However, the Semantic alternative was not only much more elegant, it worked better in terms of accessibility AND in terms of SEO!

<h2><span class="accessible">Site </span>Menu</h2>
<ul>
  <li>
    <a href="./." title="Go to the Home Page">
      <span class="accessible">Go to the </span>Home<span class="accessible"> Page</span>
    </a>
  </li>
  ...
  <li>
    <a href="../en/help.aspx" title="Need Help? Get it here!">
      <span class="accessible">Need </span>Help<span class="accessible">? Get it here!</span>
    </a>
  </li>
</ul>

In a common or garden web browser both of these would produce a standard vertical navigation …

:: Home

:: Help

But via a screen reader you get …

:: Go to the Home Page

:: Need Help? Get it here!

“But what’s the point of all this?” I hear you cry. “Are you just being nice to blind people?”

Well, yes — and remember that a very large share of the world’s population has some form of sight impairment — but there’s one “blind” individual that’s important to everyone interested in content and SEO: your local search engine.

Search engines, whatever flavour (but we’re all thinking Google, aren’t we) are effectively “blind”. That text-light, image-heavy page may look good to humans with perfect eyesight and a true sense of colour dynamics, but to Google it’s just a load of source code.

Make your site more accessible to those with a visual impairment and you also make it more accessible to the search engine spiders, but use a technique like this and you actually get more keywords on your page with no penalties!
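One caveat, though: screen reader support for display: none; has never been entirely consistent, and many readers will skip such text altogether, so test before you rely on it. A more dependable variant of the same idea (the class name, as before, is unimportant) simply parks the text off-screen, where browsers won’t paint it but readers will still announce it …

.accessible {
  position: absolute;   /* take the text out of the normal page flow */
  left: -9999px;        /* park it well off-screen; it is still read aloud */
}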

Adding more content to an image-focused site

Wednesday, July 2nd, 2008

My current project — smartlivecasino.com — is meant to be visually attractive. It’s an entertainment experience, after all.

Unfortunately, the people who originally designed the site saw it more as a work of art than a sales tool. As a result, there are loads of pretty pictures, most of which don’t even have alt-tags, and little actual text.

To the human eye it looks fine, but to any ONE — or any THING — not relying on vision there’s a problem. Strip away the imagery and there’s very little for a search engine spider to index except for a few links and disconnected phrases: not exactly what you’d call good content. No doubt we’ve all seen worse: sites where even the text is displayed as a GIF image, and an un-tagged one at that.

For good SEO any site needs words, and sentences made up from these words, and paragraphs made from these sentences. The bottom line is that only by increasing the number of words on a page can one hope to improve keyword densities to that sweet spot of between 5% and 15% of the total.
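To put numbers on that (the keyword and the counts are purely illustrative), the sum is simply …

keyword density = (occurrences ÷ total words) × 100
e.g. 12 mentions of “roulette” in a 150-word page = 12 ÷ 150 × 100 = 8%

… comfortably inside that 5% to 15% band.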

Indeed, the current version of smartlivecasino.com comes close to running the risk of agitating the search engines because the density of certain keywords is greater than 20%: to a search engine that could look like “blackhat” SEO.

But before undertaking a major redesign, can anything else be done to improve matters?

Well, remember the bit about any ONE? If you consider the page from a disability access standpoint, there are plenty of things that could be done to make it more useful to someone with a visual impairment, or a search engine.

A good first step would be to alt-tag all the images using clear keyword-rich phrases. A stage further would be to add keyword-rich TITLE tags to images, links and any other media. Neither of these measures would disturb the look of the page in any way but they would give more content for the search engines to spider.
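As a sketch of what that might look like (the file name and wording here are invented, not lifted from the real site) …

<img src="roulette-wheel.jpg"
     alt="Live roulette wheel in the Smart Live Casino studio"
     title="Watch live roulette streamed from our London studio">

<a href="live-roulette.html"
   title="Play live roulette, streamed on webcam and UK digital TV">Live Roulette</a>

Nothing on screen changes, but a screen reader (and a search engine spider) now has several keyword-rich phrases to work with.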

However, the line NEVER to cross is to include hidden text in your page, artificially increasing the keyword density by peppering it with white text on a white background or with commented-out phrases which bear no relation to the page. That sort of thing WILL get you in Google’s bad books.

I say “never” but it depends on what you mean by hidden; however, that’s another post …

Content and SEO with a twist

Friday, June 20th, 2008

So my latest challenge is a website that some people may find uncomfortable. Smart Live Casino is one of the world’s growing number of gambling websites.

Smart Live’s “twist” is just that: it’s live roulette, streamed via webcam or broadcast on UK digital television (SKY 851 and Freeview 22) from early evening to the wee small hours, presented by attractive croupiers, in a relaxed style.

I’ve never been much of a gambler myself, although I’ve always enjoyed the spectacle of gambling events like casinos or horseracing. And the vast numbers of people who flock to the great gambling meccas like Las Vegas or the Aintree Grand National show just how popular it is. For most, it’s just a hobby: a way to release tension at the end of a busy day. And for most, it’s completely harmless. Sure, they may lose occasionally, but don’t we all take risks every day, and aren’t we okay if things don’t go quite as we’d planned?

Still, there is an understandable air of uneasiness when it comes to the subject, especially when people don’t want to be seen to publicly endorse a lifestyle which others object to.

This has proved to be a problem in relation to our attempts to launch a Pay Per Click campaign with the world’s biggest search engine company, Google.

Google (motto: “don’t be evil”) has a strict policy when it comes to online gambling advertising — they don’t do it! Geographical casinos are allowed, as are non-profit casino games such as those used by charities at social events. And if you make poker chips or roulette wheels, or have a surefire system to beat the house, you can advertise those too. So you will still see PPC ads for gambling on Google properties when you search for relevant keywords.

However, you soon learn that some of these ads are not what they seem. The URL from a recent ad for “freegamblepackage.com” was

http://www.google.co.uk/aclk?sa=l&ai=BvJfi2WVWSImMB6HmQru-5ZgKscj_Qb3ooq8FpbeaBfCzpQEIABABGAEoAzABOAFQn4rcigJgu76ug9AKoAHHteL5A8gBAYACAdkDLsK5FNgVhxjgAwg&sig=AGiWqty1Q9aMG-bEIwSlo_el85zW6eP2EQ&q=http://www.freegamblepackage.com/%3Faff%3D52123%26c%3D1

However, after the inevitable blank screen while the background referral script worked out where it was being linked from, the page was redirected to http://www.primecasino.com/?aff=52123, which is a rival online casino and therefore not allowed under Google rules.

What’s going on here is an affiliate scheme: not of itself illegal (even Smart Live Casino is dipping a toe into the partnership model), but the way campaigns like the one above currently operate is just downright sneaky. In the example, simply typing freegamblepackage.com into a browser produces a lame page for another surefire system to beat the roulette wheel, but the domain is almost certainly a satellite website run by the affiliate marketeers.

Google say they are investigating and offenders will be removed. Another campaign run by online casino giants 888.com is (at the time of writing) now pointing to a geographical casino and not their online one.

Sadly, the reality is that as soon as these scams are stopped a new one pops up to take its place.

Another SEO manager told me recently that he didn’t believe in “ethical” SEO, and he is right … to an extent. Today’s “tweak” inevitably becomes tomorrow’s “White Hat technique” and next week’s “Black Hat swindle”. I would still err on the side of Google’s “don’t be evil”, although I might add “unless you know you won’t get caught”.

Me, I’m no gambler. I always believe I’ll get caught.

The Mystery of Google’s Page Rank Punishment

Tuesday, October 30th, 2007

So, you know I wrote about the cuts in Google’s page rank and how it was hitting people who’d bought in links and were feeling the pain of Google’s ethical stance? Well now I’m not so sure.

Yesterday, Barablu — my latest SEO project — felt the sting of demotion too. Its page rank fell one point, from 5 to 4.

Yet (as far as I can see) Barablu has NEVER in the past operated any dodgy practices, especially link buying: I wish the same could be said for the competition. No, the loss of a PR point in this case at least must be more than a Google moral backlash.

As usual, Google are keeping tight-lipped about the reasons behind the recent PR massacre, and to be sure there are many sites around that have suffered more than Barablu. Yet what makes this whole adjustment even more puzzling for me is that I know for a fact that sites which don’t exist are maintaining their page rank!

Now I’m not talking about some dodgy blackhat technique: the site in question — which I shouldn’t name for confidentiality reasons — ceased operations back in July because the owner couldn’t afford (or couldn’t be bothered) to pay his site dues. It was duly decommissioned and all the pages deleted; if you go there now you get the usual 404 errors.

However, if you search Google for the site right now you’ll be told that it has 41 pages and a PR of 4, albeit with no backlinks! As they say, go figure!

Go Tell The Marines!

In truth, Barablu’s real problem is years of unwitting, benign neglect. It was first in its field — making free calls using a mobile phone — and it still out-features the competition but it languishes in the lower reaches of the search engine rankings on almost all of its keyword phrases because, until now, no-one ever said anything. A recent comment on an Italian blog summed it up: “Even if Barablu is not very visible — not advertised properly — the software offers some interesting services you should try.”

As I write, I’m waiting to unleash a new Barablu website on an unsuspecting world, but for now all I can practically do is encourage everyone here to blog their socks off about Barablu and its associated technological fields via the Barablu Blog (catchy name, don’t cha think!).

For you see, as I think I’ve been saying for a while now, Content is King! And even with the meagre resources at hand right now, Barablu’s SEO is actually improving. See you at the top … .

Google Reinforces the Content Route

Thursday, October 25th, 2007

Google have just dropped a bombshell on many sites that obviously thought they’d got SERPS licked. They’ve cut huge swathes off the PageRank scores of many big-name sites including engadget.com, forbes.com and problogger.net.

In some cases the figure has dropped by as many as THREE points (remember PR is not a linear scale: a PR of 2 is not just TWICE as good as a PR of 1, it’s something like SEVEN TIMES better; a three-point fall could therefore mean the underlying score has dropped by a factor of 7 × 7 × 7, roughly 350 times!)

Which raises two questions …

  • Why have Google done this, and
  • What does it mean for the affected sites?

Taking the second first, the answer is not entirely clear. PageRank has been a controversial issue for some while now; some even argue it’s meaningless. It’s usually summed up as “the number of good sites pointing to yours”, and although the precise PR algorithm has changed since the original Google patent, it’s still largely based upon “backlinks”.

Put very simply, the better the PageRank of sites which link to yours, the better your PageRank will be. Conversely, lots of links from poor sites can actually harm your page rank; that’s one reason why link-swapping campaigns are such poor value if your site is already doing well.
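For the curious, the simplified formula from Brin and Page’s original paper (long since tweaked, as noted above) runs like this …

PR(A) = (1 − d) + d × ( PR(T1)/C(T1) + … + PR(Tn)/C(Tn) )

… where T1 to Tn are the pages linking to page A, C(T) is the number of outbound links on page T, and d is a “damping factor”, usually set around 0.85. Each page shares its own PageRank out among the pages it links to, which is exactly why one link from a strong page can outweigh dozens from weak ones.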

Incidentally, the PR you see in the Google Toolbar or other SEO tool may be misleading: Google calculates PageRank on a regular basis, but the figure it displays to the world is “out of date” by several months.

What’s behind Google’s recent PR raid seems to be a question over the validity of these backlinks. Of late, one of the tools of the professional SEO has been to sidestep the problem of gathering backlinks by natural, organic means — which usually takes a very long time — by running “backlink campaigns”. These exercises can often run into many thousands of dollars and consist of “buying” stories on well-placed blogs, and links from directories, forums and other sites. Sites like PayPerPost.com exist solely to put willing bloggers in touch with SEOs looking for another backlink.

Recently, however, Google announced a crackdown on websites and search agencies that were buying links in order to artificially ramp-up search position (you can see a fuller list of the sites affected here). This chimes in with the search giant’s stated aim of attempting to make web searches honest — if you search for something, they argue, what you should get is a list of the most appropriate sites, not those with the biggest SEO budget. Content, once more, is king!

This leaves me in a quandary. My day job is to get the free mobile phone calls site Barablu.com back to the top of the heap where it belongs, and my weapon of choice is to improve the content of the site by writing more, getting more people to contribute and making the site itself more accessible, more usable and simply more fun!

However, one thing that Barablu lacks — mainly because, unlike the competition, it has never bothered with SEO before — is backlinks. Barablu’s current PR is 5: lower than its rivals, but (these days) suddenly higher than searchengineguide.com and seo-scoop.com. All of a sudden, a backlinks campaign looks rather less attractive than it did.

Besides, these days PageRank is just one of a hundred or so metrics used by Google to order web sites. Does that make it irrelevant? At the time of writing, this very site has a PR of ZERO, yet it still tops Google searches for some terms.

Yet on reflection, I still think PR is relevant. It still seems to have some bearing on just how often, and how deeply, your site gets indexed, and there are many other differences you notice when your Google PR increases.

So I reckon backlink campaigns will continue, only probably much more carefully, and much less visibly.

Statistical Nightmares

Wednesday, October 17th, 2007

I’ve just finished my first week at barablu.com, and we’re all expecting the results to start coming through soon.

Actually, it looks like the results are coming through sooner than we expected and in surprisingly good shape. Hopefully it’s not all a case of lies, damned lies and statistics.

There are a lot of things going on in the background at barablu right now which it would be wrong to discuss here at present; however, my raison d’être being content, I had to make some early moves to give the website a taste of things to come, and that meant the Barablu Blog!

I didn’t start the blog; it’s been around since the beginning of the year, but the postings have been sparse and sporadic. Indeed, the decision had been taken to remove the link to the blog because its age made it an embarrassment. This is a key factor in content: unless you are dealing with some sort of historical event, old copy is just plain bad. Even if you have no plans to add new material, you should give what you have the once-over now and again. One site I recently bumped into was talking about presidential visits, which seemed fine until it became clear they were talking about Clinton. (And yes, I know Mr Clinton is STILL “President Clinton”; but this wasn’t about some book tour in support of the spousal Senator for New York’s electoral ambitions.)

Anyway, every day since I arrived I’ve been posting to the Barablu Blog: about getting my old PDA to work with the software, about the latest Nokia phone (the gorgeous N95 8GB) and about my quest for a new telephonic gadget. And it looks like it is having some effect: there is new movement in the daily rankings, in an upwards direction.

In fact, there’s been quite a bit of movement; more than might normally be expected.

It is unlikely to continue at quite such a breakneck pace, though. The statistical nightmare is that the people who have taken me on may well expect the numbers to keep growing in leaps and bounds; that’s the danger of any statistical calculation based on a very small set of results.

Yet, all in all, it does show that even the smallest efforts can — and do — have some effect on SERPS and, if the first results from the blog are confirmed by continuous improvement, it also shows the power of content with the new search algorithms.

What The Public Wants

Wednesday, September 12th, 2007

If the findings of a new US report are true, then content editors are going to need to rethink their news values.

For it seems that while hardened journalists are insisting that the headlines should be concentrating on Iraq, the world financial crisis and the debate about immigration, what web users are REALLY interested in is Britney Spears, the rise of Nintendo and the release of the iPhone.

Tom Rosenstiel, who helped to write the report for the Project for Excellence in Journalism, told the BBC …

“Users gravitated towards more eclectic stories. There was a sense that users were sifting through a lot of raw information; rumour, gossip, propaganda and the news were all thrown into the mix.”

The study compared headline news in nearly 50 mainstream news sources, including TV, radio and online, to that of three user-driven news sites. Seventy per cent of stories selected by Reddit, Digg and Del.icio.us came from blogs or non-news websites with only 5% of stories overlapping with the top 10 stories in the mainstream media.

The question is what does this really mean for content? Is all that journalistic training and experience for nought?

Firstly, Reddit, Digg and Del.icio.us (and StumbleUpon and the rest) are favourite haunts of a tech generation, just the sort of people fascinated by the Wii or the iPhone or Britney, so the comparison with sites like TIME.com is not a direct one.

Second, the people who use Reddit, Digg, Del.icio.us et al are more likely looking for something light-hearted and off-beat. The Age of Citizen Journalism is here: there is plenty being said on blogs and news-you-can-use sites, and as election year dawns in the US the level of comment will only increase.

Actually, the researchers found that traditional news outlets like TIME.com accounted for one in four stories on the user news sites, and fewer than one in a hundred were actually original.

“That suggests that people are re-aggregating the news in the style of citizen editors rather than journalists,” Rosenstiel told BBC News. “These sites offer people a different take on the news, but it doesn’t mean that traditional journalism has become irrelevant. They are forming more of a secondary conversation about the news.”

So newsmen and women shouldn’t be reaching for their pink slips just yet. The new citizen journalist is more likely a citizen commentator, or just someone sharing their opinion over a few beers (but possibly without the beers).

For content professionals there is a silver lining. All this talk implies that there is an unquenchable thirst for something interesting on the web. All you have to do now is cater to that thirst!