Have you ever seen a site rank for irrelevant keywords or queries you don't want to rank for? Google's John Mueller says that if so, your titles and content may be too vague and need to be made clearer.
He said that you can either ignore the fact that you're ranking high for those queries, or work to improve your content overall, adding that “sometimes a page will rank high for something you don't expect, and you can't prevent that, but it won't negatively impact the rest of your site.”
John wrote this in response to a question on LinkedIn from Alvaro Picho Torres, who asked:
If I want to prevent my website or a specific URL from appearing in impressions for a specific query or search, is it better to set a meta noindex in the head or block it in robots.txt?
My case: I did an SEO audit in the industrial metal coatings vertical, and my website was showing up in the SERPs and in Google Search Console impressions for “metal coatings workshops,” which skewed the reference queries. I want to remove those impressions without deleting the posts.
John Mueller responded that blocking Google wouldn't do much good, but improving the content might. He said:
If you noindex a page or disallow it with robots.txt, it won't show up in regular searches either. Just ignore it, or be a bit more clear if your title or description is vague. Sometimes a page may rank for unexpected reasons and you can't prevent it, but it shouldn't negatively impact the rest of your site.
If you want to block indexing of a page entirely, noindex is the appropriate mechanism.
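For reference, the mechanism Mueller points to is the robots meta tag in the page's head (the example path below is a hypothetical placeholder, not from the discussion):

```html
<!-- In the <head> of the page you want kept out of the index.
     Google can still crawl the page, but drops it from search results
     after the tag is seen on a recrawl. -->
<meta name="robots" content="noindex">
```

By contrast, a robots.txt rule such as `Disallow: /metal-coatings-workshops/` only blocks crawling; the URL can still be indexed from links elsewhere, and a crawler that is blocked can never see a noindex tag on that page, which is why blocking in robots.txt is generally the wrong tool for de-indexing.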
What would you do?
Forum discussion on LinkedIn.