I think Groucho Marx could sum up this article as "Who are you going to believe, me or your own eyes?"

I didn't keep a record of my searches from 2000, but I do remember that I was extremely impressed and satisfied with Google back then. That is no longer the case. I am frustrated on a daily basis by the results of many searches.

It's not really your fault though. I blame it on monoculture. Google has such a huge hold on the search market that it's probably not even worth SEO people's time to bother gaming other search engines. There is a whole industry feeding off your success, and we all suffer for it.

After the China attacks earlier this/last year, Google put out a decree that they would phase out Windows machines among their employees. I thought it made sense. Windows is too easy a target because of its success. The same is true in nature, and the same is true of search engines.

The best we can hope for is a new search engine that is insignificant enough in terms of market share to avoid the scammers. As long as Google is the dominant search engine, it will never get better.



> I am frustrated on a daily basis by the results of many searches.

I have some speculation on this. Are you sure that your frustration comes from unusually poor results? Could it be that we're so used to things being perfect now, that a page of mediocre results looks like the end of the world?

Look at 2000. No Shazam. If you heard a song on the radio, that was the end of the story. There were some services you could call for $5 and a human would help you identify it. Now in 2011 you press a button and wait 5 seconds.

The web had a lot less information on it. A few hundred million people were connected. Now, in 2011, a few billion people can connect. Huge shift in quantity. Very tough to keep the S/N going strong.

2000: No Wikipedia. Now in 2011 if Wikipedia isn't in your top results you might be upset, but in 2000 you were happy even though Wikipedia didn't yet exist!

So I pose the question to you: Are you absolutely, positively sure that Google's quality has declined and what you are seeing isn't just a side-effect of everything else being so awesome?


> Very tough to keep the S/N going strong.

I think there's also a problem in that the quality of the content Google has to index is steadily declining. In the '90s, many people kept lists of links to good stuff they had found on the internet. Google could crawl those links and draw conclusions from them. Now people hardly do that anymore, because they can just google it.

So Google has to employ ever more powerful algorithms just to maintain the same level of search quality. I think herein lies an existential threat to search engines.


Yes, you can probably whip out the Louis CK bit. I'll admit that part of it is acclimatization: I've come to expect good results all the time.

That doesn't detract from the fact that results are bad for many searches and that the Google monoculture has thwarted its earlier success. I'm absolutely sure that my satisfaction with their product has declined. I know I look forward to a better search engine than Google. In 2000, Google was the better search engine.


I would say the noise level across the whole internet has increased because people found out that they could make easy money from content-void or misleading websites.


I think what Matt is trying to say is that your memory of what Google was like in 2000 could be inaccurate. So he's demonstrating what Google was actually like in 2000.

Of course you were impressed and satisfied with Google back then — it was so much better than the alternatives. I'm sure you would have been just as (perhaps more) satisfied if 2011 Google existed in 2000. We've had 11 years to pinpoint the deficiencies of Google. Couple that with 11 years of rapid progress on the internet — we just expect more now.

What's confounding to me is that there's an obvious solution to Google's biggest problem (spammers and scammers):

  1. Allow users to individually block URLs and domains on a permanent basis.
  2. Accumulate massive amounts of data regarding blocked URLs/domains (a good quality indicator).
  3. Integrate this data into the PageRank algorithm.
I'm not sure why it's taking Google so long to do this, at least step 1.
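
For what it's worth, here's a minimal sketch of what steps 1 and 2 could look like: a per-user post-filter over results plus an aggregate count of blocks per domain. The class and field names are made up for illustration and are obviously not Google's actual implementation:

  # Illustrative only: per-user domain blocking (step 1) plus an aggregate
  # count of blocks per domain (the raw material for steps 2/3).
  from collections import Counter
  from urllib.parse import urlparse

  class BlocklistFilter:
      def __init__(self):
          self.blocked_domains = {}      # user_id -> set of blocked domains
          self.block_counts = Counter()  # domain  -> number of users who blocked it

      def block(self, user_id, url):
          domain = urlparse(url).netloc.lower()
          self.blocked_domains.setdefault(user_id, set()).add(domain)
          self.block_counts[domain] += 1  # aggregate quality signal

      def filter_results(self, user_id, results):
          # Drop results whose domain this user has blocked.
          blocked = self.blocked_domains.get(user_id, set())
          return [r for r in results
                  if urlparse(r["url"]).netloc.lower() not in blocked]

  f = BlocklistFilter()
  f.block("user42", "http://spam-farm.example/cheap-pills")
  results = [
      {"url": "http://spam-farm.example/page", "title": "Buy now"},
      {"url": "http://en.wikipedia.org/wiki/PageRank", "title": "PageRank"},
  ]
  print(f.filter_results("user42", results))  # only the Wikipedia result survives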


A reason could be that it would slow down searches too much. First, every one of your searches would have to hit a server storing your blacklist. Second, it makes caching the results of popular queries nigh impossible. Finally, some people would build blacklists so large and convoluted that the top hit for some of their searches would be something that currently sits on Google's page 20.


One might think that personalized domain blacklisting is simpler than social search. http://www.google.com/support/websearch/bin/answer.py?hl=en&...


How about they build it in as a feature of Chrome? The servers can still send the canonical list, using cached results and all, but then it gets filtered down (and perhaps in the future reranked a bit) on the user's machine. As a bonus, it increases the value of switching to Chrome.
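
Roughly, and purely as a sketch (the names and the local blocklist storage are assumptions, not anything Chrome actually ships): the server keeps returning the same cacheable canonical list for a query, and only the last filtering step is personalized, on the client:

  # Illustrative client-side filter: the canonical result list stays shared
  # and cacheable; the browser drops locally blocked domains before display.
  from urllib.parse import urlparse

  LOCAL_BLOCKLIST = {"spam-farm.example", "scraper-mirror.example"}

  def client_side_filter(canonical_results, blocklist=LOCAL_BLOCKLIST):
      kept = []
      for result in canonical_results:
          domain = urlparse(result["url"]).netloc.lower()
          if domain not in blocklist:
              kept.append(result)
      return kept

Since the canonical list is identical for every user, server-side caching is untouched; only the final, cheap filtering step differs per machine.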


Aha, good explanation, thanks. I still have to think that, with all of Google's brainpower, they could figure something out. They already allow "starred" results, which I presume would function somewhat similarly to a blacklist from a technical standpoint (just showing instead of hiding).


Sure, they could use other ranking methods like user blacklists, whitelists (bookmarks/favorites), timing the number of seconds until someone hits the back button, etc. But as long as Google garners the majority of the search market, it will all be manipulated.
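
As a toy illustration of the back-button idea (the thresholds and event format are invented for the example, not a description of what Google does): treat a quick bounce back to the results page as a negative vote for that result.

  # Toy dwell-time signal: short visits count against a result, long visits
  # count for it. Threshold values are arbitrary assumptions.
  def dwell_time_signal(click_time, back_time, threshold_seconds=10):
      dwell = back_time - click_time
      if dwell < threshold_seconds:
          return -1   # quick bounce: likely a bad result
      if dwell > 3 * threshold_seconds:
          return +1   # user stayed a while: likely useful
      return 0        # inconclusive

  print(dwell_time_signal(click_time=100.0, back_time=104.5))  # -1, a bounce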


My experience is entirely different.

I do a lot of Perl. It used to be that as soon as something on CPAN matched my search, I'd get a handful of relevant results, then the CPAN match, then pages and pages of the same result from different CPAN mirrors. That no longer happens, and my searches for half-remembered Perl discussions are much, much less likely to end in frustration.

But that's me, and YMMV.


We were all very impressed with IE6 in 2001 as well, and now, even with IE9, that is no longer the case.



