Hi,
Is there a way to edit robots.txt so that I can disallow a specific directory? For example, /cdn-cgi/.
If disallowing a single directory isn't possible, can I disallow the entire support site instead?
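For reference, the rules I have in mind use standard robots.txt syntax (the /cdn-cgi/ path is just my example):

```
# Block all crawlers from one directory
User-agent: *
Disallow: /cdn-cgi/

# Or, to block the entire site:
# User-agent: *
# Disallow: /
```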
Note: adding a noindex tag won't solve my issue, because Google will still crawl the pages I want to block; noindex only keeps them out of the index, it doesn't stop crawling.
Thanks
Disallow a directory in robots.txt