Errors with robots.txt files

When I use Google PageSpeed Insights, I see a common error related to my robots.txt files.

Can someone tell me whether this is normal, and how I can explain this error to my clients when they see it?


Hey @David_McArdle,

could you let me know the website IDs where this is happening for you?

Sure, it is project 255285

I was told that putting a * in front of the .js and .css rules could resolve the issue. Is this something you can update on your end?
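For context, wildcard rules like that are typically used to make sure crawlers can fetch CSS and JavaScript assets. A minimal sketch of what such a robots.txt might look like (the Disallow line is purely illustrative, not from the project in question):

```
User-agent: *
# Allow crawlers to fetch JS and CSS so pages can be rendered
Allow: /*.js$
Allow: /*.css$
# Hypothetical blocked path, shown only for illustration
Disallow: /admin/
```

The `$` anchors the pattern to the end of the URL, so only paths actually ending in `.js` or `.css` are matched.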

Hey @David_McArdle - I have created a developer ticket, and we will discuss soon which adjustments we have to make. Until then, I would recommend uploading a manual robots.txt.

This has been fixed.