Half of planned US data center builds have been delayed or canceled, growth limited by shortages of power infrastructure and parts from China — the AI build-out flips the breakers
The dark mode looks better. In light mode they need to improve the contrast: the lines in the visualisations are almost invisible, and so is the data table on the left.
I have a love/hate relationship with WordPress. I prefer it over any other CMS for reasons I can't fully explain. Probably nostalgia. I love the editor, the sidebar, the ability to find niche plugins in an endless marketplace, and how easy it is to get a site running.
What I don't love is how insecure it is. You're entirely dependent on plugins (if you're not a developer), and if WordPress updates and a plugin goes inactive, you're sitting on a vulnerability. It adds real stress for something that's just supposed to be a fun personal site.
I stopped building on it two years ago, even though I still like it more than Webflow and most alternatives I've tried. It's a bit sad.
This feels like one of those stories that spreads because it’s too good not to.
People see that Claude is a top contributor in an OpenAI-related repo and jump to conclusions about what it means.
Same with the Ballmer/Mac story. It sort of happened, but not really how people tell it. The Mac was there, but he wasn’t actually using it the way the story implies.
A couple years back John Reilly posted on HN "How I ruined my SEO" and I helped him fix it for free. He wrote about the whole thing here: https://johnnyreilly.com/how-we-fixed-my-seo
Happy to do the same for you if you want.
The quickest win in your case: map all the backlinks the .net site got (happy to pull this for you), then email every publication that linked to it. "Hey, you covered NanoClaw but linked to a fake site, here's the real one." You'd be surprised how many will actually swap the link. That alone could flip things.
Beyond that there's some technical SEO stuff on nanoclaw.dev that would help - structured data, schema, signals for search engines and LLMs. Happy to walk you through it.
update: ok this is getting more traction than I expected so let me give some practical stuff.
1. Google Search Console - did you add and verify nanoclaw.dev there? If not, do it now and submit your sitemap. Basic but critical.
2. I checked the fake site and it actually doesn't have that many backlinks, so the situation is more winnable than it looks.
3. Your GitHub repo has tons of high-quality backlinks, which is great. Reach out to those places and tell the story. I'm sure a few will add a link to your actual site. That alone makes you way more resilient to fakers going forward. This is only happening because everything is so new. Here's a list of all the backlinks pointing to your repo:
4. Open social profiles for the project - Twitter/X, LinkedIn page if you want. This helps search engines build a knowledge graph around NanoClaw. Then add Organization and sameAs schema markup to nanoclaw.dev connecting all the dots (your site, the GitHub repo, the social profiles) - there's a sketch of what that markup can look like right after this list. This is how you tell Google "these all belong to the same entity."
5. One more thing - you had a chance to link to nanoclaw.dev from this HN thread but you linked to your tweet instead. Totally get it, but a strong link from a front-page HN post with all this traffic and engagement would do real work for your site's authority. If it's not crossing any rule (specific use case here, so maybe check with the mods, haha), drop a comment here with a link to nanoclaw.dev. I don't think anyone here would mind if it gets you a few steps closer to beating that fake site.
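To make point 4 concrete, here's a rough sketch of the Organization + sameAs markup - not the author's actual markup, and the repo/profile URLs are placeholders you'd swap for the real ones. It's just a small Python snippet that prints the JSON-LD so the structure is easy to see:

    # Sketch only: prints Organization + sameAs JSON-LD tying nanoclaw.dev,
    # the GitHub repo, and the social profiles together as one entity.
    # All repo/profile URLs below are placeholders, not the real ones.
    import json

    schema = {
        "@context": "https://schema.org",
        "@type": "Organization",
        "name": "NanoClaw",
        "url": "https://nanoclaw.dev",
        "sameAs": [
            "https://github.com/example/nanoclaw",          # placeholder: real repo URL
            "https://x.com/example_nanoclaw",               # placeholder: X/Twitter profile
            "https://www.linkedin.com/company/example",     # placeholder: LinkedIn page
        ],
    }

    # Paste the printed JSON into nanoclaw.dev inside a
    # <script type="application/ld+json"> ... </script> tag in the <head>.
    print(json.dumps(schema, indent=2))

The sameAs array is the part that tells Google the site, the repo, and the social profiles all belong to the same entity.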
If I was the author, however, I'd still feel like I've been put in a predicament where I need to spend personal agency to fix something that Google has broken.
While that may just be a fact of life, my internal injustice-o-meter would be raging. Like, Google is going to take hours of my life because they, with all their billions of capital, can't figure out the canonically-true website when it's RIGHT THERE in the GitHub repository?
Ugh. I guess that's just the times we live in. But it makes me rage against the machine on the author's behalf.
I had the exact same thought while reading the above comment, as helpful and generous as it is. Google's entire business model is to help people find things on the internet. They're an insanely well resourced company with all kinds of smart programmers. They have a moral and financial incentive to direct people to canonical sources of information. And STILL it's on this open-source dev to do all the steps outlined just to get the situation corrected?
Google's business model is to help Google's customers pay money to Google. Google Search's customers are mostly scammers who run adverts. Helping the user find a thing is at odds with helping the user find a scam that pays Google money.
This is somewhat true; despite what HNers seem to think, online ads are not very effective (in terms of convincing people to buy things), and Google 'screws over' its advertising customers as often as it delivers deficient search results to users.
The billions of capital are exactly why they don't care about you. Also, Google didn't break anything. The only person who can claw out a place for you in this giant machine is you - all while billions of others attempt to do the same.
So the feeling is fine, and if he's going to bother at all, which he is, he should be doing it efficiently. Everything so far has been panic and inefficiency.
No it's not; it's a sales pitch that intentionally ignores some of the things pointed out in the article. The author has already invested time in proper SEO optimization, legit websites already link to it, and so on - it's all explained in the article.
From the perspective of a spammer: they need something like 2 million MAU just to earn below minimum wage. You're never getting those figures by doing something legit and actually useful to a tiny subset of people. You either need a vague site beyond any point of usefulness to anyone, or you need a network of knockoff sites. The reason you can't compete with these shitty SEO-spam versions of your site is that they already have a network of "authoritative" (in Google's eyes) sites, and all they have to do is link from them to a new one to expand their shitty network.
From the perspective of SEO agencies: They can't guarantee results. They can tell you vague, easily-googleable best practices and give you an output of some SEO SaaS that's far too expensive for an individual to purchase. Ahrefs(.com) is the prime example of this, the cheapest paid version costs $129/month. Do you care about SEO that much? No, so you go to these agencies and give them money for them to give you the output of such a tool. But that SaaS also only contains vague and nebulous "things to fix" to follow "best practices" because they also cannot know what drives traffic to your competitor from the outside perspective.
My best suggestion would be to start a website from day one. It doesn't matter how good the website is at first; Google favours sites that have existed for longer. If you're creating a website after the knock-off version already exists, you might as well give up immediately; it's gonna be near impossible to recover from that.
> No it's not, it's a sales pitch that intentionally ignores some of the things pointed out in the article.
Sales pitch or not, someone offering their time to help me with a problem feels generous to me. To each their own, I suppose.
But again, you reinforce my point in your last sentence. Now anytime I want to make any little toy project (because how can anyone know when their toy project will blow up overnight?) I have to make a full blown website just to ensure I don't get SEO-spammed into oblivion?
My point still stands. Google is the problem and while we likely can't effectively do anything about it, it's frustrating as hell.
I never said Google isn't the problem. What I said is that going to an agency isn't gonna fix that problem any more than running a SaaS tool yourself will, because they're not Google and they have no insight into why Google prioritised one website over the other. Because, as you've pointed out, Google is the problem.
> I have to make a full blown website just to ensure I don't get SEO-spammed into oblivion?
No, I said a crappy one on purpose. How good it is doesn't matter; the sooner Google knows about the domain, the better. It might as well be a copy of your README file, using one of the million SSGs GitHub supports that will turn that README into a website. The only thing that matters is that the website exists and that Google knows about it before the other one.
That's why many people purchase the domain on day 1, before they even start building the thing, and also why many have like a dozen domains in their account - a boulevard of broken dreams there to remind them once a year that they haven't done anything with them.
Still cheaper than an SEO agency, or in most cases even one month of Ahrefs access.
If NanoClaw generates some revenue, you should trademark the name and also buy nanoclaw.com. Move the site to the .com domain and then do the steps above. All things being equal, the ".com" TLD should get you a higher page rank than your existing ".dev". Google is ranking the ".net" fake page higher than your ".dev". If your page wasn't on the .dev TLD, it might be second already.
All this work to solve one website's problem... You can be sure MANY other open source projects are facing the same issue. It's just not a viable solution. There is something wrong with Google. Google has to fix it.
The writing style just has several AI-isms; at this point, I don't want to point them out because people are trying to conceal their usage. It's maybe not as blatant as some examples, but it's off-putting by the first couple of paragraphs. These days, I lose all interest in reading when I notice it.
I would much, much, much rather read an article with imperfect English and mistakes than an LLM-edited article. At least I can get an idea of your thinking style and true meaning. Just as an example - if you were to use a false friend [1], an LLM may not deal with this well and conceal it, whereas if I notice the mistake, I can follow the thought process back to look up what was originally intended.
> Using them isn't an advantage, but not using them is a disadvantage. They handle the production part so we can focus on the part that actually matters: acquiring the novel input that makes content worth creating.
I would argue that using AI for copywriting is a disadvantage at this point. AI writing is so recognisable that it makes me less inclined to believe that the content would have any novel input or ideas behind it at all, since the same style of writing is most often being used to dress up complete garbage.
Foreign-sounding English is not off-putting, at least to me. It even adds a little intrigue compared to bland corporatese.
I get using a spell checker. I can see the utility in running a quick grammar check. Showing it to a friend and asking for feedback is usually a good idea.
But why would you trust a hallucinogenic plagiarism machine to "clean" your ideas?
It did not feel off at all. I read every single word and that is all that counts.
I think what you are getting wrong is thinking that the reader cares about your effort. The reader doesn't care about your effort. It doesn't matter if it took you 12 seconds or 5 days to write a piece of content.
The key thing is people reading the entirety of it. If it is AI slop, I just automatically skim to the end and nothing registers in my head. The combination of em dashes and the sentence structure just makes my mind tune it out.
So, your thesis is correct. If you put in the custom visualization and put in the effort, folks will read it. But not because they think you put in the effort. They don't care. It's because right now AI produces generic fluff that's almost too perfectly correct. That's why I skip most LinkedIn posts as well. Like, I personally don't care if it's AI or not. But mentally, I just automatically discount and skip it. So, your effort basically interrupts that automatic pattern recognition.
Fair point. This is more mindset than case study; the proof is still being built across client work. Though I'd say the same was true for SEO in the early days: people speculating on what made Google rank certain sites higher, what made pages index faster, etc. The frameworks came before the proven playbooks.