First, let me start by saying that what you want is to apply traditional research methods to search engine results. Many SEOs have done this before you and, as a rule, kept the findings to themselves, since sharing "amazing results" usually means giving up your advantage. With that said, I will share, as far as I can, some pointers on what to look for.
- Determine which part of the algorithm you are trying to test.
Different types of searches trigger different algorithms.
Broad searches
For example, for broad searches, engines tend to return many kinds of results. Common components of these results include:
- News feeds
- Products
- Images
- Blog posts
- Local results (based on a Geo IP lookup).
Which of these result types gets thrown into the mix can vary depending on the word.
Example: "cats" returns images of cats plus news; "shoes" returns local shopping results for shoes. (This is based on my IP in Chicago on October 6th.)
The goal of returning results for a broad term is to provide a little of everything so that everyone is happy.
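If you want to track that mix systematically, a rough sketch follows. It fetches a results page and checks for marker strings; the MARKERS values below are hypothetical placeholders, not real Google markup, so inspect the actual SERP HTML for your engine and date before relying on them.

```python
# Rough sketch: check which result types an engine mixes into a broad query.
# The marker strings are hypothetical placeholders -- inspect the real SERP
# HTML for your engine/date, since the markup changes constantly.
import urllib.parse
import urllib.request

MARKERS = {
    "news": "News results",       # placeholder marker text
    "images": "Image results",    # placeholder marker text
    "products": "Shopping",       # placeholder marker text
    "local": "Local business",    # placeholder marker text
}

def result_mix(query):
    url = "http://www.google.com/search?q=" + urllib.parse.quote(query)
    req = urllib.request.Request(url, headers={"User-Agent": "Mozilla/5.0"})
    html = urllib.request.urlopen(req).read().decode("utf-8", "replace")
    return [kind for kind, marker in MARKERS.items() if marker in html]

for term in ("cats", "shoes"):
    print(term, "->", result_mix(term))
```

Run the same terms from different IPs and on different days and you can start charting how the mix shifts.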
Regional modifiers
As a rule, any time a regional term is attached to a search, it dramatically changes the results. If you search "Chicago Web Design", because the word "Chicago" is attached, the results will start with the top 10 local results (the listings to the right of the map); after those 10 listings, results are displayed in the normal fashion.
The results in the local "top ten", as a rule, are very different from the organic listings below them. This is because local results (from Google Maps) rely on completely different data for ranking.
Example: having a phone number with a Chicago area code on your website will help in local results ... but NOT in overall results. The same goes for your street address, a Yellow Pages listing, etc.
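As a quick illustration of auditing that signal on your own pages, here is a minimal sketch that scans a page for phone numbers in Chicago area codes (312/773). The URL and the phone-number regex are illustrative assumptions, not anything Google documents.

```python
# Minimal sketch: check whether a page exposes a phone number in a given
# local area code -- one of the signals that (per the text above) helps
# local rankings but not overall rankings.
import re
import urllib.request

def has_local_number(url, area_codes=("312", "773")):  # Chicago area codes
    html = urllib.request.urlopen(url).read().decode("utf-8", "replace")
    # Matches formats like (312) 555-0142, 312-555-0142, 312.555.0142
    for match in re.finditer(r"\(?(\d{3})\)?[\s.-]?\d{3}[\s.-]?\d{4}", html):
        if match.group(1) in area_codes:
            return True
    return False

print(has_local_number("http://example.com/contact"))  # hypothetical URL
```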
Speed of results
Currently (as of 10/06/09), Google is beta testing Caffeine. The main feature of this build is that it returns results in roughly half the time. Not that you could call Google slow right now ... but speeding up the algorithm matters when millions of queries happen every hour.
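If you want to measure that speed difference yourself, a crude sketch follows: it times the same query against the Caffeine sandbox and regular Google and averages over several runs. Note that this measures round-trip time, so network jitter is mixed in with any algorithm speedup.

```python
# Quick-and-dirty latency comparison between two builds of the same engine.
# A single sample is dominated by network noise, so average over many runs.
import time
import urllib.parse
import urllib.request

def avg_latency(base, query, runs=20):
    total = 0.0
    for _ in range(runs):
        start = time.time()
        req = urllib.request.Request(
            base + urllib.parse.quote(query),
            headers={"User-Agent": "Mozilla/5.0"},
        )
        urllib.request.urlopen(req).read()
        total += time.time() - start
    return total / runs

print("caffeine:", avg_latency("http://www2.sandbox.google.com/search?q=", "cats"))
print("current: ", avg_latency("http://www.google.com/search?q=", "cats"))
```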
Reducing spam listings
We have all run into a search that was riddled with spam. Over the past 10+ years, one of the biggest battles on the Internet has been between search engines and search engine optimizers. Gaming Google (and other engines) is very profitable, and fighting it is where Google spends much of its time.
A good example is the new Google Caffeine release (http://www2.sandbox.google.com/). So far my research, along with that of several others in the SEO field, has found it to be the first build in the last 5 years to give more weight to on-site elements (keywords, internal site linking, etc.) than previous builds. Before this, each "release" seemed to favor inbound links more and more ... this is the first step back toward "content".
Ways to test the algorithm
Compare two builds of the same engine. This is currently possible by comparing Caffeine (see the link above, or Google "google caffeine") with current Google.
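A minimal sketch of that comparison, assuming you scrape both result pages and diff the ranked URLs. The link-extraction regex is a naive placeholder; adjust it after inspecting the real markup, which changes often.

```python
# Sketch: compare the ranked URLs two builds return for the same query.
import re
import urllib.parse
import urllib.request

def top_urls(base, query, count=10):
    req = urllib.request.Request(
        base + urllib.parse.quote(query),
        headers={"User-Agent": "Mozilla/5.0"},
    )
    html = urllib.request.urlopen(req).read().decode("utf-8", "replace")
    # Naive extraction of result links; placeholder pattern, not real markup.
    urls = re.findall(r'<h3[^>]*>\s*<a href="(http[^"]+)"', html)
    return urls[:count]

caffeine = top_urls("http://www2.sandbox.google.com/search?q=", "web design")
current = top_urls("http://www.google.com/search?q=", "web design")
for rank, (a, b) in enumerate(zip(caffeine, current), 1):
    flag = " " if a == b else "*"  # flag positions where the builds disagree
    print(f"{flag} #{rank}  caffeine: {a}\n        current:  {b}")
```

Run this across a batch of terms and the positions flagged with `*` show you where the new build's weighting actually diverges.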
Compare local results across regions. Try searching terms, such as "web design", that return local results even without a local keyword modifier. Then use a proxy (found through Google) to search from different places. You will want to make sure you know the proxy's location (find a site on Google that tells you your IP address or the city of your Geo IP). Then you can see how different regions return different results (a sketch follows the warning below).
Warning ... DON'T pick the term "locksmith" ... and be careful with any terms whose results are riddled with spam listings. Google local is pretty easy to spam, especially in competitive markets.
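Here is the proxy test sketched in code. The proxy addresses are placeholders; substitute proxies whose city you have actually verified, and pick a clean term per the warning above.

```python
# Sketch: run the same query through proxies in different cities to see how
# Geo IP shifts the results. Proxy addresses below are placeholders.
import urllib.parse
import urllib.request

PROXIES = {
    "chicago": "http://203.0.113.10:8080",  # placeholder proxy
    "seattle": "http://203.0.113.20:8080",  # placeholder proxy
}

def search_via(proxy, query):
    opener = urllib.request.build_opener(
        urllib.request.ProxyHandler({"http": proxy})
    )
    url = "http://www.google.com/search?q=" + urllib.parse.quote(query)
    return opener.open(url).read().decode("utf-8", "replace")

for city, proxy in PROXIES.items():
    html = search_via(proxy, "web design")
    print(city, len(html))  # diff the raw pages (or the parsed URLs) per region
```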
As mentioned in the previous answer, compare the number of clicks users need to find their result. You should know that at present none of the major engines uses "bounce rates" as an indicator of a site's relevance. This is probably because it would be easy to make it look as if your result has a bounce rate in the 4-8% range without actually having one that low ... in other words, it would be easy to game.
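To see why it would be so easy to game, note that bounce rate is just single-page sessions divided by total sessions, computed from events the site itself controls. A toy illustration (the session data is made up):

```python
# Why bounce rate is gameable: it is single-page sessions / total sessions,
# and the site controls what counts as a pageview. Sessions here are lists
# of pages viewed per visit.
def bounce_rate(sessions):
    bounces = sum(1 for pages in sessions if len(pages) == 1)
    return bounces / len(sessions)

honest = [["/"], ["/", "/pricing"], ["/"], ["/", "/about", "/contact"]]
# A site could fire a second tracked "pageview" automatically, dropping its
# apparent bounce rate to near zero without any real engagement change.
gamed = [pages + ["/auto-beacon"] for pages in honest]

print(bounce_rate(honest))  # 0.5
print(bounce_rate(gamed))   # 0.0
```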
Keep track of how many search refinements users make, on average, for a given term before finding the desired result. This is a good indicator of how well the engine guesses the intent of a query (as mentioned in that answer).
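A sketch of that metric, assuming you have a (hypothetical) search-session log of query and click events: count how many extra queries a user issues before their first click.

```python
# Sketch: estimate average query refinements before a click, from a
# hypothetical session log of (action, text) events.
def refinements_before_click(session):
    count = 0
    for action, _ in session:
        if action == "query":
            count += 1
        elif action == "click":
            return count - 1  # refinements = queries beyond the first
    return None  # user gave up without clicking

sessions = [
    [("query", "shoes"), ("click", "example.com")],
    [("query", "shoes"), ("query", "running shoes chicago"), ("click", "example.org")],
]
counts = [refinements_before_click(s) for s in sessions]
print(sum(counts) / len(counts))  # 0.5 refinements on average for this term
```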
** Disclaimer: these views are based on my experience as of October 6, 2009. The one constant with SEO and the engines is that they change EVERY DAY. Google may release Caffeine tomorrow, and that would change a lot ... which is what makes SEO fun to research!
Cheers