Building 2k Unique SEO Pages with GPT-3 (mackgrenfell.com)
13 points by mektrik on Oct 8, 2021 | hide | past | favorite | 12 comments


Wow. AI enabled Zapier to create ~29,800 variations of landing pages... just so they can market to anyone who googles some variation of the apps Zapier offers integrations for. 29,800 virtually worthless, promotional content-marketing articles.

Essentially, AI is polluting human-written content with bland, AI-written content. This is terrible.

This is the type of thing that makes the internet less valuable and less usable. There should be a disclaimer on content/articles created by AI. To filter them out from a web search would be a fantastic feature.


If Google won't deploy AI to filter out AI-generated content, they'll become irrelevant, and people will migrate, just as they did away from AltaVista.

I'd be surprised if Google didn't already devalue "content" created by GPT-3.


I have seen an extreme number of fake blogs showing up on search engine front pages recently. A massive army of randomly generated blogs from many countries trying to sell you all kinds of (real) products for their local markets.

Maybe GPT-3 is to blame for that.


Same here. Occasionally I will do an esoteric search and the only results I get are obviously pages that were written automatically. They don't contain normal language, but rather a mish-mash of text that is meaningless outside of any context. Very weird and hacky; I always feel suspicious of such pages, as if viewing them would somehow get me one step closer to having my computer hacked.


What else would text-generating AI be good for? 99% of it is for this. Let's just wait for the automated news sites that steal content from real sites and use AI to rewrite it so it looks like original content.


You don't have to wait. It's already here:

https://en.wikipedia.org/wiki/Automated_journalism


They've always been there. That's literally what SEO companies do: create thematic content farms, link them to each other to become relevant, and then sell backlinks to customers.

The articles were just written by cheap labour overseas, and spinners were used to generate multiple unique pages.

Eg. {this is great|awesome}, I can {generate 4 articles with this|make money while I drink pina colada on the beach}
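A spintax template like the one above can be expanded with a few lines of Python. This is a minimal sketch assuming flat (non-nested) spintax; real spinners also handle nested groups and weighted options:

```python
import itertools
import re

def expand_spintax(template):
    """Expand a flat {a|b|c} spintax template into every variant."""
    # Split into literal text and {...} option groups.
    parts = re.split(r"(\{[^{}]*\})", template)
    choices = []
    for part in parts:
        if part.startswith("{") and part.endswith("}"):
            choices.append(part[1:-1].split("|"))
        else:
            choices.append([part])
    # The Cartesian product of all option groups yields every "unique" page.
    return ["".join(combo) for combo in itertools.product(*choices)]

variants = expand_spintax(
    "{this is great|awesome}, I can {generate 4 articles with this|"
    "make money while I drink pina colada on the beach}"
)
# Two groups of two options each -> 4 variants
```

Two option groups with two choices each already produce four "unique" articles; a handful of groups per paragraph multiplies that into thousands.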


I ran a business 10 years ago that was receiving tons of user queries (think like a search engine). In order to rank first on Google, we were just counting the X thousand most popular queries, generating landing pages with little content for all of them, and then submitting them as multiple sitemaps. It worked really well, but that was before Panda. Google became pickier about unique content after that, I've heard.


Google's ranking team will obviously respond to this kind of page replication/variants if it becomes a common thing. What will be interesting to see is whether they favor it (because it gives searchers better, more targeted, and more actionable results) or punish it (because it pollutes the search results with near duplicates).

Perhaps sites will need to add a new form of canonical URL metadata to hint at this kind of customization?


This was going to happen eventually. I'm glad they're at least being open about it here.


Thanks, I hate it.


Oh christ, no.

