fix robots.txt
Currently the robots.txt file is useless: the whole value is interpreted as a single path, "/icons/ /fonts/ *.js *.css". The only URLs that rule would actually disallow are nonsense like `https://cobalt.tools/icons/ /fonts/ bla.js .css`, so in practice nothing is blocked. Each path needs its own Disallow line.
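As a quick sanity check, the broken vs. fixed rules can be compared with Python's standard `urllib.robotparser` (a minimal sketch; note this parser does plain prefix matching and ignores `*` wildcards, which is enough to show the broken rule never matches a real URL):

```python
from urllib import robotparser

# The single-line rule from the old robots.txt: parsed as ONE path
# containing spaces, which no real URL ever starts with.
BROKEN = [
    "User-Agent: *",
    "Disallow: /icons/ /fonts/ *.js *.css",
]

# The fixed version: one Disallow directive per path.
FIXED = [
    "User-Agent: *",
    "Disallow: /icons/",
    "Disallow: /fonts/",
    "Disallow: /*.js",
    "Disallow: /*.css",
]

def allowed(rules: list[str], url: str) -> bool:
    """Return True if a crawler obeying these rules may fetch the URL."""
    rp = robotparser.RobotFileParser()
    rp.parse(rules)
    return rp.can_fetch("*", url)

url = "https://cobalt.tools/icons/favicon.png"
print(allowed(BROKEN, url))  # True  - the one-line rule matches nothing real
print(allowed(FIXED, url))   # False - the /icons/ prefix rule now applies
```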
This commit is contained in:
parent
463ece02c7
commit
d936dd73fe
1 changed file with 4 additions and 1 deletion
@@ -1,2 +1,5 @@
 User-Agent: *
-Disallow: /icons/ /fonts/ *.js *.css
+Disallow: /icons/
+Disallow: /fonts/
+Disallow: /*.js
+Disallow: /*.css