
robots.txt

Posted: Wed Feb 23, 2022 1:48 am
by kevindef
Hi,
Just a quick question: why does the robots.txt generated by WB use the default rule set that "allows all", even though I have set each page to either allow or disallow?

For example:
This is what I get from WB:
User-agent: *
Allow: /
Sitemap: http://www.mydomian.com.au/sitemap.xml

Instead of what I should be getting:
User-agent: *
Disallow:
Allow: /
Allow: /index
Disallow: /privacy
Disallow: /terms
Disallow: /error-contact
Disallow: /sucsess

Sitemap: http://www.mydomian.com.au/sitemap.xml

Thanks
Kevin

Re: robots.txt

Posted: Wed Feb 23, 2022 7:15 am
by Pablo
The default rule can be set in the first 'rule' property of the robots.txt dialog.
https://wysiwygwebbuilder.com/robots_txt.html
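
For illustration, here is roughly how the pieces fit together once the first rule is set (a sketch based on the pages in the example above; the exact output depends on your dialog settings):

User-agent: *            # applies to all crawlers
Allow: /                 # first rule = the site-wide default set in the dialog
Allow: /index            # per-page allow/disallow settings follow the default
Disallow: /privacy
Disallow: /terms
Disallow: /error-contact
Disallow: /sucsess

Sitemap: http://www.mydomian.com.au/sitemap.xml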

Re: robots.txt

Posted: Wed Feb 23, 2022 9:03 pm
by kevindef
Thanks