Any ideas why Google won’t index pages, reporting them as ‘noindex’? I can assure you that the index box is checked and the ‘noindex’ box for each page is unchecked.
Perhaps code is causing this problem, though I am not sure. Any ideas?
I have the same problem: all pages that previously had a noindex on them are still treated as noindex by Google, even after I removed it. It has been about a month since I removed it; I have asked Google to index the pages again and re-uploaded the sitemap, without this helping.
No, I had forgotten about this until I saw this post yesterday.
I have checked with an SEO tool I have, and it indicates the cause is that these pages are marked noindex in the sitemap.
I know very little about sitemaps and don’t know where to find the noindex part in the sitemap, but I found this in robots.txt:
User-Agent: *
Disallow: /impressum
Could this have something to do with it, and how can I fix it?
It’s the same website you’ve been helping me with on the image comparison for the past week. No, the website has not been indexed before; I set all the pages to noindex while I was working on them and turned it off when I felt they were good, but they still show as noindex in Google Search Console.
In Google Search Console, on the “Page indexing” page (Index → Pages), scroll down and you should see the reason why your pages are not indexed.
It should look something like this:
In the “Source” column you have the source of the reason (can be “caused” by the website or by Google).
Keep in mind that for the “Crawled currently not indexed” reason, Google clearly states the following:
“The page was crawled by Google, but not indexed. It may or may not be indexed in the future; no need to resubmit this URL for crawling.”
In other words, Google may decide not to index your page, and in my experience pages with thin content usually don’t get indexed.
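If you want to double-check outside of Search Console, here is a minimal sketch of how a page can still signal noindex even after a builder setting is turned off. The URL is hypothetical; substitute one of your own pages:

```shell
# Hypothetical URL - replace with one of the affected pages
URL="https://example.com/impressum"

# 1. A noindex can be sent in the HTTP response headers (X-Robots-Tag)
curl -sI "$URL" | grep -i 'x-robots-tag' || echo "no X-Robots-Tag header"

# 2. Or it can sit in the HTML as a robots meta tag
curl -s "$URL" | grep -io '<meta[^>]*name="robots"[^>]*>' || echo "no robots meta tag"
```

If either command prints a line containing `noindex`, the page is still telling Google not to index it, regardless of what the site builder shows.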
Sorry, my Search Console is in Norwegian, but you can see here that 18 pages are not indexed because of “noindex”. 4–5 of them are correctly listed as noindex, but the rest should be indexable.
Hi! If you’ve removed the noindex option in Sitejet, then in Google Search Console you can click the first item showing that the 18 pages are not indexed because of “noindex”.
On the screen that appears, press the Validate Fix button to instruct Google’s crawler to crawl the pages again. Keep in mind that it may take a few days before the validation completes.
Hi @Pal_Nikolaisen @Lucian_Dinu, I followed the instructions above, Lucian, and Google still hasn’t crawled the pages weeks later… Any ideas here?
I have tried this now; I will come back in a few days to report whether it worked.
But I wonder what that line from robots.txt, “Disallow: /impressum”, means. I have tried googling it without finding a sensible answer. Is it something specific to Sitejet?
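For what it’s worth, that line is not Sitejet-specific: robots.txt is a web-wide standard. A quick sketch of inspecting it, using a hypothetical domain you would replace with your own:

```shell
SITE="https://example.com"   # hypothetical domain - substitute your own

# robots.txt is a plain-text standard, not a Sitejet feature.
#   "User-Agent: *"         -> the rules apply to every crawler
#   "Disallow: /impressum"  -> do not CRAWL URLs whose path starts with /impressum
# A Disallow rule blocks crawling, not indexing; but it can stop Google
# from re-crawling the page and noticing that a noindex was removed.
curl -s "$SITE/robots.txt" | grep -iE '^(user-agent|disallow)' || echo "no rules found"
```

So if any of the affected pages fall under a Disallow rule, Google may never re-fetch them to see that the noindex is gone; removing the rule would be worth trying.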
Hey @Ryan_Crosbie - while fixing this issue we have provided extensive support to make sure all pages are set to “index” so that Google is able to crawl them.