robots.txt
Posted: Wed Feb 23, 2022 1:48 am
Hi,
Just a quick question: why does the robots.txt generated by WB use the default "allow all" rule set, even though I have set each page to either allow or disallow?
For example, this is what I get from WB:
User-agent: *
Allow: /
Sitemap: http://www.mydomian.com.au/sitemap.xml
instead of what I should be getting:
User-Agent: *
Disallow:
Allow: /
Allow: /index
Disallow: /privacy
Disallow: /terms
Disallow: /error-contact
Disallow: /sucsess
Sitemap: http://www.mydomian.com.au/sitemap.xml
Thanks
Kevin