@briankrebs but don't we know specific ways this happens?
I think a common strategy now is to set up a page to earn a high search rank, then hot-swap it for a worse site.
Given dynamic page creation, this could be a very hard thing for Google to track. Essentially, how many kinds of users does Google have to impersonate as it crawls the web to detect all of the subterfuge?
What if a "good" site is actually serving "bad" pages to old Android phones?
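The bait-and-switch being described is usually called "cloaking": the server inspects who is asking and returns different content accordingly. Here is a minimal sketch of the idea; the page contents, function name, and user-agent checks are all illustrative assumptions, not anything from the thread.

```python
# Hypothetical sketch of user-agent cloaking. The server decides which
# page to return based on the User-Agent header of the request.
# GOOD_PAGE, BAD_PAGE, and serve() are invented names for illustration.

GOOD_PAGE = "<html>legitimate content, shown to crawlers</html>"
BAD_PAGE = "<html>malicious content, shown to targeted visitors</html>"

def serve(user_agent: str) -> str:
    ua = user_agent.lower()
    # Show the clean page to search-engine crawlers so the site ranks well...
    if "googlebot" in ua:
        return GOOD_PAGE
    # ...but target, say, browsers on old Android builds with the bad page.
    if "android 4" in ua:
        return BAD_PAGE
    # Everyone else also sees the clean page, making spot checks harder.
    return GOOD_PAGE
```

This is exactly why "how many kinds of users does Google have to be" matters: a crawler that only ever identifies itself as Googlebot would never see `BAD_PAGE`.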
@briankrebs btw, as I've mentioned, I think the dynamic nature of the new web is the reason Google has given up on caching.
A cache would only contain an image of the page as served to the robot.
It would increasingly *not* look like what we see when we visit the same place.