Why Google Should Offer a Choice of Search Algorithms

Sifting and ranking millions of websites into a top 50 or so is obviously difficult. (I say 50 because people probably rarely dig further than the first few pages of results.)

Most of the time, Google's algorithm works amazingly well given the scale of the challenge, especially considering that spammers are constantly trying to take advantage of it.

But I think there might be room for improvement.

Google definitely seems to rely at least partly on the popularity of a site (as measured in click-throughs) to rank it. This is clearly open to all sorts of abuse, so I think ranking should depend more on the merit of the text than on the number of clicks.

Popularity is all well and good, but just look at some of the things that are popular. I'm not being snobby about this - I can enjoy I'm a Celeb as much as the next person, but it would be a bit annoying if I had to craft a fiendishly complex search term or scroll down 1,000 results before I could watch Newsnight.

A choice of algorithms would help to fix this problem. One could search by popularity, another by subject or level of language used, and so on. It would also be a good idea if there were a learning algorithm that was manually adjusted by actual people moving things up and down the rankings - experts in their field, perhaps, though obviously there'd be issues of staffing and bias to address. A textual-meritocracy approach might also enable blogs to be ranked more fairly alongside traditional websites. At the moment, although they do appear in normal search results, they seem to be somewhat ghetto-ised into a specific blog search option.

The various considerations that Google uses now to rank things could be offered to the user - slide this to place more weight on posting frequency, or newness, or popularity, or meta information relevance, and so on. A particular advantage of this approach would be that it would shift some of the burden of defeating spamming tactics to the user.
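To make the slider idea concrete, here is a minimal sketch of how user-adjustable weights might combine ranking factors. The factor names, the 0-1 normalisation, and the weighted-average formula are all my own illustrative assumptions - Google's actual signals and scoring are not public.

```python
def rank_score(factors, weights):
    """Combine normalised ranking factors (each 0.0-1.0) using
    user-chosen slider weights. Returns a weighted average."""
    total = sum(weights.values())
    if total == 0:
        return 0.0
    return sum(weights[name] * factors.get(name, 0.0)
               for name in weights) / total

# A hypothetical page, scored on three illustrative signals:
page = {"popularity": 0.9, "freshness": 0.2, "meta_relevance": 0.6}

# A user who distrusts popularity slides that weight right down,
# so the page's high click-through count barely helps it:
score = rank_score(page, {"popularity": 0.1,
                          "freshness": 1.0,
                          "meta_relevance": 1.0})
```

A side effect of exposing the weights like this is the spam-shifting point above: a link farm that games one signal only wins with users who happen to weight that signal heavily.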

Google's advanced search is a step in the right direction, but bizarrely the link to it from the main Google page was recently removed.

Another recent odd design decision, in my opinion, was the introduction of JavaScript/meta redirects on each result click. Is Google trying to discourage scrapers or something? Don't forget, Google, you are in effect a very large and advanced scraper yourself, and you rely on the openness of the web to function.

Google could also do more to transparently improve the way the internet is developing, and learn from other web companies. Take Twitter, for example. Can't stand it, personally, but I can see why hashtags and real-time trend updating might be useful to some people.

This is nothing that can't be done with current websites. #Webhashtags could be used on websites in exactly the same way they're used on Twitter, or it could be very easily incorporated into metadata. Faster results updating could be achieved by either directing spider and processing resources more evenly across the spectrum of website popularity, or by (perhaps more realistically) introducing a 'push' mechanism for websites to send updates to search engines.
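As a sketch of the metadata route, here is how a crawler might pull hashtags out of a page's head. The `webhashtags` meta name is this article's proposal, not any real standard, and the parser is a deliberately minimal illustration.

```python
import re
from html.parser import HTMLParser

class HashtagMetaParser(HTMLParser):
    """Collects hashtags from a hypothetical
    <meta name="webhashtags" content="#tag1 #tag2"> element."""
    def __init__(self):
        super().__init__()
        self.hashtags = []

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name") == "webhashtags":
            self.hashtags += re.findall(r"#\w+", a.get("content", ""))

parser = HashtagMetaParser()
parser.feed('<head><meta name="webhashtags" '
            'content="#search #ranking"></head>')
# parser.hashtags is now ["#search", "#ranking"]
```

A search engine aggregating these tags across its index could then offer Twitter-style trend lists without any change to how pages are written, beyond one extra meta element.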

Updated 22.03.2013 - Typo.