Can you force pages out of Google's index during its next crawl?

  • Question
  • Updated 9 years ago
  • Answered
I'm going to delete quite a few pages soon, and I don't want visitors to click a link in Google and get a 404.

Donald

  • 2991 Posts
  • 37 Reply Likes
  • http://youcanneverbetoosmart.com

Posted 9 years ago


lambofgod

  • 875 Posts
  • 35 Reply Likes
I'm not too sure if it will unindex them, but to keep Google from indexing a page you can use this meta tag:

<meta name="robots" content="noindex">

Emmy

  • 5892 Posts
  • 299 Reply Likes
Hi there,

The noindex meta tag asks search engines not to index a page, but it is meant to go in the head section of the page, which you can't edit in Yola. I know some people place meta tags outside the head section in an HTML Widget on their Yola page, and this may sometimes work, but I can't guarantee it, since search engines look for meta tags in the head section.

Emmy
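[For reference, this is what the standard placement Emmy describes looks like. A minimal sketch; the page title and body text are hypothetical:]

```html
<!DOCTYPE html>
<html>
  <head>
    <!-- Search engines expect robots directives here, inside <head> -->
    <meta name="robots" content="noindex">
    <title>Page scheduled for deletion</title>
  </head>
  <body>
    <p>This page will be removed soon.</p>
  </body>
</html>
```

[If you can't edit the head section at all, the same noindex directive can also be sent as an `X-Robots-Tag` HTTP response header, though that requires server-level access that a hosted site builder may not offer.]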

lambofgod

  • 875 Posts
  • 35 Reply Likes
Really? It won't work at the top of the page?

Emmy

  • 5892 Posts
  • 299 Reply Likes
Hi lambofgod,
Meta tags are meant to go in the head section of a page, so putting them elsewhere might not work. I can't say for sure; it might work in some circumstances and not in others. Search engines read meta tags to learn about a page, and the standard placement for them is the head section. A search engine might still take notice of a meta tag outside the head section, but I can't say for certain either way.

Emmy

Donald

  • 2991 Posts
  • 37 Reply Likes
Thanks for the answers. You can "disallow" pages in Webmaster Tools, though, can't you?
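[A Disallow rule goes in a robots.txt file at the site root rather than in Webmaster Tools itself. A minimal sketch with hypothetical paths:]

```
# robots.txt — served from the root of the site, e.g. /robots.txt
User-agent: *
Disallow: /old-page.html
Disallow: /old-section/
```

[Note that Disallow only stops crawlers from fetching those URLs; on its own it does not remove pages that are already in the index, which is why noindex or a removal request is usually combined with it.]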