Archive for the 'Google' Category

Key facts about keywords

Wednesday, October 17th, 2012

There is still much uncertainty about keywords, and that’s not surprising: keyword “science” is not settled, and probably never will be.

The temptation is to “stuff” content with as many relevant words as possible in the hope that they will match the sorts of words that people are searching for. That was how Search Engines used to work and it’s a simple concept: pages with lots of keywords will be relevant to more people and so appear in more search results.
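To see just how crude that old model was, here is a minimal sketch, in Python with entirely made-up pages, of a “count the keywords” ranker; it illustrates the idea only, not how any real search engine was built:

```python
# A toy "count the keywords" ranker, illustrating the old approach described
# above. The pages and the query are made up; no real engine works like this.

def keyword_score(page_text: str, query: str) -> int:
    """Score a page by simply counting occurrences of each query word."""
    words = page_text.lower().split()
    return sum(words.count(term) for term in query.lower().split())

pages = {
    "stuffed": "hotel hotel hotel cheap hotel best hotel hotel rooms hotel",
    "useful": ("A small family-run hotel near the station with quiet rooms "
               "and a cooked breakfast included in the price."),
}

query = "hotel rooms"
ranking = sorted(pages, key=lambda name: keyword_score(pages[name], query), reverse=True)
print(ranking)  # ['stuffed', 'useful'] -- the stuffed page "wins" despite saying nothing
```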

The problem with this approach is that the search result is more often than not completely irrelevant.

It is also susceptible to spamming because one way to get lots of people to your site used to be to cram as many different keywords into a piece of text as possible. Spammers work on very small margins: send out a thousand fake emails and even one click is worth it – especially when you’re actually sending a million!

That’s why today’s search engines are much more sophisticated. Their business is to produce relevant results, which makes them trusted (and more popular), which in turn means that their advertisements get seen by more people.

That means producing an accurate search result from just a few words, even when the pages that best answer the query don’t use the exact words the searcher typed.

Actually, we all use a host of words for a single concept – for example: rooms, hotels, accommodation, stays, beds, auberge, boarding house, caravansary, dump, fleabag, flophouse, hospice, hostel, hostelry, house, inn, lodging, motel, motor inn, public house, resort, roadhouse, rooming house, spa, tavern – and modern search engines realise this.
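One rough way to picture that is query expansion: before matching, the engine widens the query with related words. The synonym table and sample page below are invented for the example; real engines learn these relationships statistically and at vast scale:

```python
# A toy query-expansion sketch: the searcher's words are widened with synonyms
# before matching. The synonym table and the sample page are invented for the
# example; real engines learn these relationships rather than hand-coding them.

SYNONYMS = {
    "hotel": {"hotel", "inn", "motel", "hostel", "accommodation", "lodging"},
}

def expanded_terms(query: str) -> set:
    """Return the query words plus any known synonyms."""
    terms = set(query.lower().split())
    for term in list(terms):
        terms |= SYNONYMS.get(term, set())
    return terms

page = "a friendly coastal inn offering comfortable lodging for walkers"
query = "cheap hotel"

matches = expanded_terms(query) & set(page.split())
print(sorted(matches))  # ['inn', 'lodging'] -- relevant even without the word "hotel"
```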

Mistakes

It gets more complicated because sometimes we say one thing and mean another, or we don’t know what we want but we expect the search result to tell us what it is, or … our spolling is simply not up to scrotch.

Search engines know this …

And because we’re not always forthcoming with our search queries, Search Engines also use other signals to decide exactly what is the best search result for the supplied phrase.

For example, if you’re signed in to any Google service, your previous search history will be used to tailor your current results. If you regularly search for a term — like ‘hotels in Town’ — your results are likely to differ significantly from those of someone using that query for the first time (perhaps a potential customer).

Your results are also affected by geography: a search made in the departure lounge at London Heathrow will return a different set of results from the same search carried out in the arrivals lounge at JFK on the same tablet computer.

And this “tailoring” can happen even if you’re NOT logged in to a Google account.

One of the big SEO talking points of 2012 has been tracking users — and their marketing preferences — using methods that do not require cookies or other overt tracking solutions, even down to the profile of their hardware. This is partly because of new EU rules which require sites to declare their use of cookies for privacy protection, but also because Google (and others) are withholding search data from website owners like us: the infamous “data not provided”.

Brand

However, Google have been doing something similar for a long while: when Google’s Executive Chairman Eric Schmidt told his 2010 keynote audience at the IFA in Berlin, “We know where you are. We know where you’ve been. We can more or less know what you’re thinking about,” he wasn’t being Orwellian, just hinting at better search results.

What all this means for Organic Search is that the days of “stuffing” and “keyword densities” are over.

We might not have reached predictive perfection — although predictive search results are already changing the way people use keywords — but we can finally concentrate on producing good, insightful, useful, informative and engaging content instead of worrying that we haven’t used the word “hotel” often enough in the first paragraph.

In fact, early in 2012, top Googler Pierre Farr told the London Search Marketing Expo (SMX2012) that overusing “key” keywords such as “hotel” was almost intrinsically seen as spammy on sites whose core business is hotels.

That’s not to say that you can be lazy about content — a page about a hotel should be full of hotel keywords — it’s just that you should concentrate on presenting good information in an engaging way.

All the major search engines are now following Google’s lead in downgrading “over-optimised” websites. Google’s main tool is the series of Penguin updates, the latest released just two weeks ago. These effectively downgrade sites with suspicious levels of poor-quality inbound links, anchor text over-optimised for high-volume keywords, or thin content.

Another recent update to Google’s algorithm is aimed at sites using keyword-rich domains and little relevant content; these include hastily-constructed affiliate-style sites which pretend to deliver important information but are actually just link farms pumping visitors to third-party sites. A particular favourite topic here is travel keywords.

Takeaways

To sum up, here are some simple guidelines for content, based on Google and Bing guidelines, as well as empirical data from our own (and industry) sources …

  • Do be aware of keyword volumes, but use them only as a guide to the sorts of things you should be writing about.
  • Do consider your end user and write content that engages with their needs, using local phrasing and in-depth information, written in a readable way.
  • Don’t keep using the same word until the text seems stilted and robotic; use synonyms, alternatives, other ways of saying the same thing.
  • Do keep your content relevant to the subject of the page; drifting off to other topics will dilute its effectiveness. One topic, one page.
  • Don’t repeat yourself; say something new.
  • Do use short paragraphs.
  • Do make your content readable, understandable and interesting, and include a call to action!
  • Don’t worry about keyword densities or cramming as many keywords into the text as possible; if the page is relevant, you ARE effectively covering keywords!
  • Don’t repeat yourself! (this is important).

Finally, if you’ve managed to reach this sentence, you’ve seen how even vaguely interesting content can make people read on past the first paragraph.

And the longer you can engage a visitor, the more chance there is that they will convert, and the more keyword opportunities you create to bring other potential customers to the site.

You say potato and I say potato – the language of SEO

Monday, October 8th, 2012

Scotland — and the world — lost a whole language last week.

Retired engineer Bobby Hogg, the last native speaker of a dialect originating from a remote coastal village in northern Scotland, died aged 92 — and so did the dialect he spoke, Cromarty Fisherfolk, a mixture of old Scottish and military English from the soldiers that used to be stationed nearby.

It also appears to have been the only Germanic language in which no “wh” pronunciation existed — so ‘what’ would become ‘at’ and ‘where’ would just be ‘ere’ — and the only Scots dialect to drop the aspirated “H”, so that “heavy” became ‘evvy’.

Linguistics is important for SEO — queries are what it’s all about — and while one may not want a website that caters exclusively for Cromarty Fisherfolk, there is scope in using language which may only make sense within a distinct region. Wikipedia details at least 40 different types of English, and that doesn’t include the variations in dialect between counties and states.

Standards

One bugbear of many British English speakers is “Americanisms” — language deemed to have come about from the power of American (mainly US) culture on the wider world.

There are even words for feelings about Americanisms: Amerilexicophobia means “fear of American words”, while worse still is Amerilexicomania, a craziness about American words to the point where you lose all rationality.

Some Brits go around changing every “ize” they see at the end of a word to “ise” — like “standardise” instead of “standardize” — but it turns out that it was actually the English who changed the spelling, stealing the “s” from the French, while Americans kept to the standardized form. It’s a similar story with “autumn” and “fall”. I recommend Bill Bryson’s “Made in America” for lots of revelations about how US English is sometimes more traditional than Amerilexicomaniacs give it credit for.

The problem with running a global website is that you have visitors using all types of English — from Maltese to Mancunian, and Harvard to Harare.

One solution is to use an “international” English, using words which are generally understood wherever the reader is from. There is a similar dilemma when it comes to variations between German spoken in Germany, Austria or Switzerland, or Spanish in Medellin or Madrid.

Targeted

The problem for users of that approach is that Google and the other search providers are going the other way, taking account of all the regional and local variations in speech to offer a search result which they feel meets the needs of local end users.

And as search becomes increasingly targeted and localized — especially with mobile searches — the pressure for us to similarly localize our pages can only increase too.

Getting Google to keep coming back

Tuesday, July 15th, 2008

A friend of mine has recently launched a blog on an unsuspecting world, starting from scratch.

She downloaded a copy of WordPress, bought some blog-only web space for a startling £27 a year and took one of the many WordPress themes and tweaked the content until it looked right to her. This all happened on June 27.

Then she regularly surfed the web news directories, pulled out the stories that caught her eye, rewrote them and stuck them on her new site at the rate of at least three a day.

Now, with a bit of a tweak here and there, and the inclusion of a WordPress plug-in that creates a sitemap on the fly, she is beginning to scent the sweet smell of success. While her blog has no PageRank and so far no-one is linking to it, she is already being indexed daily by Google and others.

I can’t tell you much more. I’ve promised that I won’t let on what the blog is called, or link to it: apparently, she’s using it as a test bed to see just how hard (or easy) it is to get top SEO purely from content. If it were me, I’d be shouting it from the rooftops, but then I’m a bloke so I have an ego to massage.

I can admit that I’m extremely jealous of her success because for technical reasons I’m still unable to add any pages to the site which is my day-job.

But my friend is right. It proves that once you take away all the coding factors, like semantic XHTML and proper linkage to and from the site, good SEO is REALLY all about content these days.

Provide good content, and provide it regularly, and Google and its rivals will come knocking.

According to the best estimates of when the next PageRank update will take place, we have just 55 days to find out how far daily indexing translates into good PR.

The Mystery of Google’s Page Rank Punishment

Tuesday, October 30th, 2007

So, you know I wrote about the cuts in Google’s page rank and how they were hitting people who’d bought in links and were feeling the pain of Google’s ethical stance? Well, now I’m not so sure.

Yesterday, Barablu — my latest SEO project — felt the sting of demotion too. Its page rank fell one point, from 5 to 4.

Yet (as far as I can see) Barablu has NEVER in the past operated any dodgy practices, especially link buying: I wish the same could be said for the competition. No, the loss of a PR point in this case at least must be more than a Google moral backlash.

As usual, Google are keeping tight-lipped about the reasons behind the recent PR massacre and to be sure there are many sites around who’ve suffered more than Barablu. Yet, what makes this whole adjustment even more puzzling for me is that I know for a fact that sites which don’t exist are maintaining their page rank!

Now I’m not talking about some dodgy blackhat technique: the site in question — which I shouldn’t name for confidentiality reasons — ceased operations back in July because the owner couldn’t afford (or couldn’t be bothered) to pay his site dues. It was duly decommissioned and all the pages deleted; if you go there now you get the usual 404 errors.

However, if you search Google for the site right now you’ll be told that it has 41 pages and a PR of 4, albeit with no backlinks! As they say, go figure!

Go Tell The Marines!

In truth, Barablu’s real problem is years of unwitting, benign neglect. It was first in its field — making free calls using a mobile phone — and it still out-features the competition but it languishes in the lower reaches of the search engine rankings on almost all of its keyword phrases because, until now, no-one ever said anything. A recent comment on an Italian blog summed it up: “Even if Barablu is not very visible — not advertised properly — the software offers some interesting services you should try.”

As I write, I’m waiting to unleash a new Barablu website on an unsuspecting world, but for now all I can practically do is encourage everyone here to blog their socks off about Barablu and its associated technological fields via the Barablu Blog (catchy name, don’t cha think!).

For you see, as I think I’ve been saying for a while now, Content is King! And even with the meagre resources at hand right now, Barablu’s SEO is actually improving. See you at the top … .

Google Reinforces the Content Route

Thursday, October 25th, 2007

Google has just dropped a bombshell on many sites who obviously thought they’d got SERPS licked. They’ve cut huge swathes off the PageRank scores of many big name sites including engadget.com, forbes.com and problogger.net.

In some cases the figure has dropped by as many as THREE points (remember PR is not a linear scale: a PR of 2 is not just TWICE as good as a PR of 1; it’s something like SEVEN TIMES better!).
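That “seven times” figure rests on the widely held assumption that toolbar PR is a roughly logarithmic scale with a base somewhere around seven; Google has never published the real numbers, but under that assumption the arithmetic works out like this:

```python
# Back-of-the-envelope sketch ASSUMING toolbar PageRank is logarithmic with a
# base of about 7. Google has never published the real scale or base.

BASE = 7  # assumed, purely for illustration

def raw_score(toolbar_pr: int) -> float:
    """Hypothetical underlying score implied by a toolbar PR value."""
    return float(BASE ** toolbar_pr)

print(raw_score(2) / raw_score(1))  # 7.0   -> PR 2 is ~7x PR 1, not 2x
print(raw_score(5) / raw_score(2))  # 343.0 -> why a three-point drop hurts so much
```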

Which raises two questions …

  • Why have Google done this, and
  • What does it mean for the affected sites?

Taking the second first, the answer is not entirely clear. PageRank has been a controversial issue for some while now; some even argue it’s meaningless. It’s usually summed up as “the number of good sites pointing to yours”, and although the precise PR algorithm has changed since the original Google patent, it’s still largely based upon “backlinks”.

Put very simply, the better the PageRank of sites which link to yours, the better your PageRank will be. Conversely, lots of links from poor sites can actually harm your page rank; that’s one reason why link-swapping campaigns are such poor value if your site is already doing well.
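For the curious, here is a toy version of the originally published PageRank idea, run on a made-up three-page web. Google’s production algorithm has long since moved beyond this, but the “votes from linking pages” mechanism is the part that matters for this discussion:

```python
# A toy version of the originally published PageRank formula on a hypothetical
# three-page web. Real-world PageRank has many refinements; this only shows the
# basic "votes from linking pages" mechanism.

DAMPING = 0.85  # chance the "random surfer" follows a link rather than jumping anywhere

links = {          # made-up graph: page -> pages it links out to
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
}

pr = {page: 1.0 / len(links) for page in links}  # start every page equal

for _ in range(50):  # iterate until the scores settle
    new_pr = {}
    for page in links:
        incoming = sum(pr[src] / len(outs) for src, outs in links.items() if page in outs)
        new_pr[page] = (1 - DAMPING) / len(links) + DAMPING * incoming
    pr = new_pr

print({page: round(score, 3) for page, score in pr.items()})
# "C" is linked to by both "A" and "B", so it ends up with the highest score.
```

Even in this tiny example the page with the most incoming links comes out on top, which is exactly why backlinks became such a sought-after currency.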

Incidentally, the PR you see in the Google Toolbar or other SEO tools may be misleading: Google calculates PageRank on a regular basis, but the figure it displays to the world is “out of date” by several months.

What’s behind Google’s recent PR raid seems to be a question over the validity of these backlinks. Of late, one of the tools of the professional SEO has been to sidestep the problem of gathering backlinks by natural, organic means — which usually takes a very long time — by running “backlink campaigns”. These exercises can often run into many thousands of dollars and consist of “buying” stories on well-placed blogs, and links from directories, forums and other sites. Sites like PayPerPost.com exist solely to put willing bloggers in touch with SEOs looking for another backlink.

Recently, however, Google announced a crackdown on websites and search agencies that were buying links in order to artificially ramp-up search position (you can see a fuller list of the sites affected here). This chimes in with the search giant’s stated aim of attempting to make web searches honest — if you search for something, they argue, what you should get is a list of the most appropriate sites, not those with the biggest SEO budget. Content, once more, is king!

This leaves me in a quandary. My day job is to get the free mobile phone calls site Barablu.com back on the top of the heap where it belongs, and my weapon of choice is to improve the content of the site by writing more, getting more people to contribute and making the site itself more accessible, more usable and simply more fun!

However, one thing that Barablu lacks — mainly because, unlike the competition, it’s never bothered with SEO before — is backlinks. Barablu’s current PR is 5, lower than its rivals’ but (these days) higher than searchengineguide.com and seo-scoop.com. Suddenly, the attractiveness of a backlink campaign is less than it was.

Besides, these days PageRank is just one of a hundred or so metrics used by Google to order web sites. Does that make it irrelevant? At the time of writing, this very site has a PR of ZERO, yet it still tops Google searches for some terms.

Yet on reflection, I still think PR is relevant. It still seems to have some bearing on just how often, and how deeply, your site gets indexed, and there are many other differences you notice when your Google PR increases.

So I reckon backlink campaigns will continue, only probably much more carefully, and much less visibly.

Why Content is on the Rise

Sunday, August 26th, 2007

This may not yet be the Golden Age of Content, but it IS coming. Getting those all-important search engine places has, until now, been a matter of juggling organic search elements.

These include keyword factors like good meta tags, keyword density in text, internal links and even the domain name itself; domain registration age and history; good backlinks and relevance to the topical neighbourhood; the age of links and the quality of the sending domain; and metrics such as the time spent on pages and the number of searches.

Google’s current algorithm certainly has some direct analysis of content beyond keyword densities, and there is some speculation that further content endorsement comes from good old human beings (search specialists will also tell you that a good route to prominent Google placement is via the Open Directory Project — dmoz.org — which is entirely human-based).

Google’s own comment on their search algorithm is simply: “Google’s complex automated methods make human tampering with our search results extremely difficult”.

Google watchers say the algorithm changed last year to the detriment of many existing sites relying on the arsenal of so-called “White Hat” tricks such as keyword density and the Long Tail. One way of regaining lost ground that seemed to work was increased pagination: more pages, more content. It seems as if Google (and the other search engines) now have good, relevant and interesting content firmly in their sights.

But isn’t that what search engines were meant to be? Somewhere people went to find readable pages relevant to their interest.

You should take note of this now. Don’t abandon White Hat, but be aware that your site should be more than a series of search engine algorithm tricks. Content is king.