Monday, December 10, 2007

How many journalists check their web analytics against articles each morning?

Frederick is someone you don't want to mess with.


A former student at the University of Westminster now running his own business, he has the eye of a back-end coder with a radar for the new new thing.

So when I got Skyping with him about SEO and keyword-rich linking, Frederick already had some ideas for giving an article more Google juice.

More on that on Thursday when I'll record a brief interview with him.

He's already built an application that handles symmetrical uploading and downloading from your site or outernet.

Clever lad.

But to that question.

How many journalists check their Google ranking via a range of analytics in the morning?

And how soon before we reach more defined ground where writing for human consumption is still the end goal, but you have to satisfy the Google beast, or its bots, first along the way?

Looking at inbound and outbound links, as well as keyword density.
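
To make that concrete, here is a minimal Python sketch of how a keyword-density check could work; the article text, keyword and function name are placeholders of my own, not anything from Frederick's toolkit.

    import re
    from collections import Counter

    def keyword_density(text, keyword):
        # Break the article into lowercase word tokens.
        words = re.findall(r"[a-z']+", text.lower())
        if not words:
            return 0.0
        # Density = occurrences of the keyword / total word count.
        return Counter(words)[keyword.lower()] / float(len(words))

    article = "Data mining journalism puts public data to work for journalism."
    print("keyword density: {:.1f}%".format(keyword_density(article, "journalism") * 100))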

Data mining journalism
We truly are heading for an automated journalism society.

And the future really will have more and more clever clogs writing articles and dodging filtering software to reach higher penetration, in the same way spammers constantly rewrite their code to beat filters.

This morning, in a bid to monitor the bots entering Viewmag, I found myself writing a new robots.txt file.
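
For anyone who hasn't poked at one, a robots.txt file is just a plain text set of rules telling crawlers what they may index; mine is along these lines, although the directory name here is only a stand-in, not the actual Viewmag layout.

    User-agent: *
    Disallow: /private-dump/
    # Images stay crawlable, so Google Images can still pick them up.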

Then we delved into the MySQL database via phpMyAdmin.

Frederick touched on a subject already made popular by Adrian Holovaty, but not used anywhere near as much: public data.

The amount of data made available by government bodies and the like is there to be mined by journalists, according to Frederick.
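
For a rough picture of what that mining might look like in practice, here is a small Python sketch; the file name and column headers are invented for the example, not a real government dataset.

    import csv
    from collections import defaultdict

    # Tally spending per department from a downloaded public-sector CSV.
    totals = defaultdict(float)
    with open("council_spending.csv", newline="") as f:
        for row in csv.DictReader(f):
            totals[row["department"]] += float(row["amount"])

    # The biggest spenders are often where the story starts.
    for dept, total in sorted(totals.items(), key=lambda item: item[1], reverse=True):
        print(dept, round(total, 2))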

More on that and more on Thursday.

2 comments:

Cliff said...

How many journalists check their Google ranking via a range of analytics in the morning?

I don't necessarily check my stats every morning, but I do have basic site stats emailed to me every Monday and then I check those against the stats package for my websites.

About once a week as well, I check certain key search words and have found that my CMS is doing a bang up job with providing effective SEO integration and Google is keeping me in the top 10 for several search phrases.

What did you do to your robots.txt and why did you feel compelled to edit the file???

Cliff Etzel - Solo Video Journalist
bluprojekt

Dr David Dunkley Gyimah said...

Sounds like you're bang on.

I think the horizon would be how well an article is scoring against a competitor's, but if your CMS is doing a top-ten job at the mo, then you're on that road.

Question: when you find an article is a hit, do you follow up to mine its rich slipstream?

There was an outdated robots.txt file.

Sometimes I dump stuff on my server that I don't want picked up and indexed by bots, so the file merely disallows them.

I don't mind my pics being picked up, as their file sizes are minuscule, so I have removed the tag on this.

I'll go search for it on my other Mac and post it for you to see.