Affiliate thoughts for 2007 – keeping ahead of the chop

Easy come easy go… 

It’s no news to say that the days of easy rankings with easy commissions are long gone. With some search engines, it just no longer works. Anyone (and lots do) can whack up a DB or add a feed from some central source. It’s child’s play, and from a search engine viewpoint it’s just not welcome. They’d be happy to kick yo ass as soon as look at ya, and who could reasonably blame them? You can have the most well-linked, beautifully constructed site in the world, full of some mythical kw density perfection, CSS’ed to the nth with elements positioned to the max, but if you aren’t saying anything new, then the chances are that things could get pretty serious pretty quickly. Search engine death could well become you. Sure, you’ll get spidered, but expect to go supplemental pretty quickly, and if that doesn’t happen then you might get extra lucky and get lumbered with a nice fat -31 ranking penalty.

Fat or thin?

Over the years there’s been quite a bit of discussion on what constitutes a thin or a fat affiliate. Let’s look at travel. Fat boys like TripAdvisor, for example, are flying, with lots of top spots on a range of travel-related kws, whereas others are floundering.

I recall a time when, for like 4 or 5 years, a particular little travel network absolutely kicked arse on all of the big 3: Google, MSN and Yahoo. Be it ‘hotel in town‘ or ‘town hotels’, these guys had top spots, usually in the top 5 positions. They were nothing other than a well-constructed, well-linked network of affiliate feeds that did little other than pump out content that their suppliers provided. It really was an education to look at what these people had done. Their strategy was, for the time, basically fab. They hosted a variety of big sites across a variety of IPs. They mixed pages up with a mishmash of approaches: varying page element factors, curtailing product description content, differing kw and key-phrase densities, different navigational placement, text types… God, you name it, they’d factored it in one way or another, and it paid them big dividends.

I guess really it was a day when it was all about getting as many pages into the search engine DBs as you possibly could. Their duplicate content filters were so underdeveloped that, provided you did enough variation in the places that mattered (page naming, title tags, H tags, general kw peppering here and there in your content spread, etc.), you’d be pretty OK. In fact you got massively rewarded, and could do some great stuff with inward link creation too. You didn’t have to worry about going out and sourcing zillions of links from here, there and everywhere; you’d just create your own and ensure they were appropriately placed and hidden across a network of unidentifiables, at least in the sense of what the spider saw and registered!
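Just to make that concrete, here’s a toy sketch of the sort of variation in the places that mattered. It’s entirely illustrative and mine alone: the record, file names and templates are invented, not lifted from that network. The idea is simply one supplier record rendered with different file names, title tags and H1s on each site.

```python
# Purely illustrative: one feed record, rendered differently per site so the
# pages looked less alike to a 2007-era duplicate content filter.
# The record, file names and templates below are invented for the example.

RECORD = {"town": "Barcelona", "name": "Hotel Example"}

SITE_TEMPLATES = [
    # (file name, title tag, H1)
    ("hotels-in-{town}.html", "{town} hotels - {name}", "Hotels in {town}"),
    ("{town}-hotel-deals.html", "Cheap {town} hotel deals", "{name}, {town}"),
]

def render(record, site_index):
    """Return the varied page elements for one site in the network."""
    filename, title, h1 = SITE_TEMPLATES[site_index % len(SITE_TEMPLATES)]
    return {
        "file": filename.format(**record),
        "title": title.format(**record),
        "h1": h1.format(**record),
    }

if __name__ == "__main__":
    for i, _ in enumerate(SITE_TEMPLATES):
        print(render(RECORD, i))
```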

A different breed of engine

Today of course, these guys are nowhere to be seen, at least not in any recognisable guise. Their network was nuked and they don’t rank for jack no more. Things like the Google eval team have given people using that particular strategy a short sharp shock.

New-generation networks, if they hope to have sustainable long-term SERP viability, have to be a whole lot smarter in ’07. Content feeds and databases, particularly with regard to outputting their contents within a site, need special attention. Noindex tags and robots exclusion protocols really are serious considerations; not using them could be a huge folly. Drastic? Perhaps so, but what with duplication filters and all, the question is almost one of: can you afford not to?
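As a rough illustration of what I mean (a minimal sketch of my own, assuming a page is just a dict with a “from_feed” flag; none of this is from any particular platform), something as simple as stamping feed-built pages with a robots noindex tag goes a long way:

```python
# A minimal sketch: pages built straight from a supplier feed get a
# "noindex, follow" robots meta tag; pages with original copy stay indexable.

def robots_meta(page):
    """Return the robots meta tag to emit in the page's <head>."""
    if page.get("from_feed"):
        # Keep duplicated feed content out of the index, but still let
        # the spider follow links through to pages that do add value.
        return '<meta name="robots" content="noindex, follow">'
    return '<meta name="robots" content="index, follow">'


if __name__ == "__main__":
    feed_page = {"body": "Supplier blurb, word for word", "from_feed": True}
    own_page = {"body": "Our own review, photos and guest feedback", "from_feed": False}
    print(robots_meta(feed_page))  # noindex, follow
    print(robots_meta(own_page))   # index, follow
```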

Sure, there will always be those who look to employ methods of circumvention; all that lovely content is just too good to pass up on, after all, right? Not sure about you, but I’ve seen all manner of interesting adaptations: things like replacing keywords and phrases programmatically, so that a phrase like, um… ‘this hotel is decorated to a fine standard’ is changed to read ‘this fine placename hotel is adorned to a splendid configuration’ instead, or variations upon that theme. I’ve seen sites that rank well by using contractions of product descriptions, e.g. chopping the first 40 characters from the phrase and outputting the remaining 180 chars. I’ve seen others that just hide them altogether, via a document.write or iframe method. Some go as far as employing people to write phantom reviews, and some even write programs that write reviews on the fly! It really is incredible to see the ingenuity and nous that people have with this stuff; it really is the most elegant of spamination. I think it’s fair to say that people do this because they realise that things may well be tenuous; they know that unless you are whitelisted you need to tread very carefully, as your income stream is very precarious.
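For the curious, the two simplest of those tricks boil down to not much more than this. A throwaway sketch of my own; the swap table and the 40/180 offsets are examples, not lifted from any real site:

```python
# Illustrative only: naive synonym swapping, plus chopping a fixed prefix off
# a supplier description and keeping a fixed slice of what remains.

SWAPS = {"decorated": "adorned", "fine": "splendid", "standard": "configuration"}

def respin(description):
    """Swap a few words so the text no longer matches the feed verbatim."""
    return " ".join(SWAPS.get(word, word) for word in description.split())

def contract(description, skip=40, keep=180):
    """Drop the first `skip` characters and output the next `keep`."""
    return description[skip:skip + keep]

if __name__ == "__main__":
    feed_text = "this hotel is decorated to a fine standard"
    print(respin(feed_text))    # this hotel is adorned to a splendid configuration
    print(contract(feed_text))  # "rd" for this short example; real feed blurbs are longer
```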

As simple as adding value then…

Perhaps it’s simple though: isn’t it all about thinking in terms of adding value, going above and beyond what your competitors are doing, and seriously asking yourself whether you’d pass some random manual inspection? Which, let’s face it, if you are ranking in a competitive earning space, you are likely to receive sooner or later. You’d be an idiot for thinking that just because you managed to outwit the bot via some clever use of string functions, or tag placement, or link generation, a human wouldn’t pick up on it and notice something amiss. It isn’t unreasonable to assume they’d ask whether your site handles all the lookup processes. Does it check for availability? Are the payments handled on-site, or do they go off elsewhere? They’d see through a hidden frame or an include or some obfuscated URL redirect. You just will not be able to get away with what you once did, and if you think you will, then I wish I could share your complacency, as any serious examination of what you do would look at exactly these things.

On the positive side, some of the better providers and networks do offer more advanced solutions, of course; this helps insulate both them and their partners, and it’s basic good business sense. But lots don’t, and for those who are getting hit with the resulting penalties, it’s a bit of a shame at best and a damn tragic waste at worst.

Should these guys be helping their income generators in this way?

If you are a search rep then you’d prolly say no, it sucks and doesn’t help in the goal of delivering varied, unique content. But OTOH, why would any big supplier expose themselves to the vagaries of singular URL income streams that could be cut off at the whim of a policy shift? I know what I’d say, of course: I go with the majority scatter-and-seed approach. Watch the Darwinian process evolve and reward my best performers. I’d also help nurture and protect newcomers, my future top performers. Give them tools to get their users interacting, enable the creation of communities and feedback tools, make it all that little bit different, employ advisors to help steer and encourage, and generally add value all round. But I guess I’m me, and not some multi-layered corp that moves real slow.

I’ve used travel as it’s an easy example to flesh out and one that I’m at least familiar with. I do wonder whether other sectors face similar challenges; I expect they do, no doubt to both lesser and greater extents, especially in some of the mass product markets. It would be great to read some input, so feel free to call me out!
