We blocked a subdirectory using robots.txt (on purpose), and later removed the block, but the page is still not being crawled. Could you tell me what the reason might be?
- I have created a custom sitemap that includes the page.
- The “solutions” subdirectory has been removed from robots.txt.
- The page is not blocked by a noindex meta tag.
Any help would be highly appreciated!
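For reference, the robots.txt check from the list above can be verified programmatically. A minimal sketch using Python's standard-library `urllib.robotparser` (the domain and the remaining `Disallow` rule are placeholders, not the actual site):

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content after the "solutions" rule was removed
robots_txt = """User-agent: *
Disallow: /private/
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

# The /solutions/ page should now be crawlable...
print(rp.can_fetch("Googlebot", "https://example.com/solutions/"))  # True

# ...while the remaining rule still blocks /private/
print(rp.can_fetch("Googlebot", "https://example.com/private/page"))  # False
```

In practice you would point the parser at the live file with `rp.set_url(".../robots.txt")` and `rp.read()` instead of an inline string; the inline version just makes the rule logic easy to see.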
It can take Google several days to re-crawl and re-index pages.
1. In Google Search Console, fetch the pages (“Fetch as Google”).
2. Also in Search Console, re-submit your robots.txt and run it through the robots.txt tester.
Steps 1 & 2 should speed up the process (instead of waiting for an automatic update).
In any case, this URL is already indexed: