Block crawling of a page with robots.txt


Create a robots.txt file that blocks these pages: https://web-stories.in/c and https://web-stories.in/shop.php





User-agent: *
Disallow: /c
Disallow: /shop.php





This robots.txt file tells all crawlers not to crawl any URL whose path begins with /c or /shop.php. Note that Disallow rules are prefix matches, so Disallow: /c also blocks paths such as /contact or /cart; use a more specific rule (for example Disallow: /c/) if you only want to block that one section. Also keep in mind that blocking crawling does not guarantee the pages stay out of search results: a blocked URL that is linked from elsewhere can still be indexed without its content. To keep a page out of the index entirely, allow crawling and add a noindex meta tag or X-Robots-Tag header instead.
To create the file, open a text editor, save the rules above as robots.txt, and upload it to the root directory of your website so that it is reachable at https://web-stories.in/robots.txt.
Once the file is in place, well-behaved search engine crawlers will fetch it before crawling your site and will respect its instructions, skipping /c and /shop.php.
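Before uploading, the rules can be sanity-checked offline with Python's standard-library robots.txt parser. This is just a quick local sketch (nothing is fetched over the network); the URLs are the site's own pages from above:

```python
from urllib import robotparser

# The proposed robots.txt rules, as a string.
RULES = """\
User-agent: *
Disallow: /c
Disallow: /shop.php
"""

# Parse the rules locally and check what a crawler would be allowed to fetch.
rp = robotparser.RobotFileParser()
rp.parse(RULES.splitlines())

print(rp.can_fetch("*", "https://web-stories.in/c"))         # False: blocked
print(rp.can_fetch("*", "https://web-stories.in/shop.php"))  # False: blocked
print(rp.can_fetch("*", "https://web-stories.in/contact"))   # False: /c matches as a prefix
print(rp.can_fetch("*", "https://web-stories.in/about"))     # True: not blocked
```

The /contact result illustrates the prefix-matching behaviour mentioned above: Disallow: /c blocks more than just the /c page.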























