Search Engine Bots Discovery
By default, this setting will block Search Engine Bots from crawling and indexing your website.
Search Engine Bots should crawl and index your website only on production instances, so we recommend leaving this setting in the blocked state on all other instances.
To prevent Search Engine Bots from discovering your website, navigate to Portal > Selected Project > Selected Environment > Application Management > Search Engine Bots Discovery tab and click Block Search Engine Bots.
Confirm that you want to prevent Search Engine Bots Discovery.
The Search Engine Bots Discovery tab will now show an updated description.
In the Deployments tab, you will be required to run the next deployment without a cache. Automatic deployment will also happen without a cache.
After a successful deployment, Search Engine Bots will be blocked from crawling, discovering, and indexing your website.
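Blocking of this kind is commonly enforced through a restrictive robots.txt. As a hedged sketch (the exact rules your platform serves may differ), the effect of a blanket block can be verified locally with Python's standard-library robots.txt parser:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules typically served while Search Engine Bots
# are blocked: a blanket Disallow for every user agent.
blocked_rules = [
    "User-agent: *",
    "Disallow: /",
]

parser = RobotFileParser()
parser.parse(blocked_rules)

# With "Disallow: /", no path on the site may be fetched by crawlers.
print(parser.can_fetch("Googlebot", "https://example.com/"))
```

Well-behaved crawlers such as Googlebot honor these rules and skip the site entirely.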
To allow Search Engine Bots to crawl, discover, and index your website, navigate to Portal > Selected Project > Selected Environment > Application Management > Search Engine Bots Discovery tab and click Allow Search Engine Bots.
Confirm that you want to allow Search Engine Bots Discovery.
The Search Engine Bots Discovery tab will now show an updated description.
In the Deployments tab, you will be required to run the next deployment without a cache. Automatic deployment will also happen without a cache.
After a successful deployment, navigate to Magento Admin Panel > Content > Design > Configuration > Default Website > Search Engine Robots. Confirm that Default Robots is set to INDEX, FOLLOW and that Custom instructions for robots.txt are either empty or contain the correct instructions for Search Engine Bots.
Click Save Configuration and flush the invalidated cache types. That is it!
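Once discovery is allowed, the served robots.txt should no longer restrict crawlers. As a minimal sketch, assuming an empty Disallow rule (which places no restrictions), the same standard-library parser can confirm that bots may fetch any path:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt rules after re-enabling discovery; an empty
# Disallow value means no path is restricted for the matching user agents.
allowed_rules = [
    "User-agent: *",
    "Disallow:",
]

parser = RobotFileParser()
parser.parse(allowed_rules)

# With an empty Disallow, every path on the site is crawlable.
print(parser.can_fetch("Googlebot", "https://example.com/any-page"))
```

You can run the same check against your live site's robots.txt to double-check the deployment took effect.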