Robots.txt gets replaced by the default when publishing, and the indexing setting doesn't work

Hi,

I’m using Sitejet as a Plesk extension, and I find Sitejet’s behavior confusing.

I built a website and published it. Then I started adding more content. I saved, but found that the content wasn’t updating for some reason.

Then I realized I have to hit “Publish” every single time to make the changes go live.

Of course, I then always get the “Existing files in your websites document root will be removed. Consider to create a backup of the domain if necessary” notification, and publishing deletes my robots.txt and replaces it with the default version (which has the sitemap URL pointing to the preview.sitehub.io website, not the actual domain). Custom files I’ve uploaded to the root, like favicon files, are not affected.
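
For reference, the regenerated file looks roughly like this (paraphrased from memory; the exact contents and URL shape may differ, but the point is that the Sitemap line targets preview.sitehub.io rather than my domain):

```
User-agent: *
Allow: /

Sitemap: https://mysite.preview.sitehub.io/sitemap.xml
```

It should point to the sitemap on the actual domain instead, e.g. `Sitemap: https://www.example.com/sitemap.xml`.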

To top it off, every page is given a “noindex, nofollow” meta tag even when Noindex is turned off!

When I manually edit the HTML to index, follow, it gets replaced by noindex, nofollow when I hit Publish!
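
Concretely, this is the kind of tag I see in the published page source (illustrative snippet):

```html
<!-- What publishing writes, even with Noindex turned off: -->
<meta name="robots" content="noindex, nofollow">

<!-- What I set by hand, which gets overwritten on the next publish: -->
<meta name="robots" content="index, follow">
```

With the first tag in place, search engines are told not to index the pages or follow their links at all.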

Hey @Polarfox - could you let me know your Plesk ID and the Website ID? I will forward this to a developer to check on this behavior.

My Sitejet ID is 383369.

As for the Plesk ID, I don’t know where to find it.

Hey there,

Currently, the website is not live, so it is a bit difficult for me to test the issues you are mentioning, especially the index/noindex one.

Would you like to use your own robots.txt and sitemap.xml, or the ones that we provide?

Are you planning to publish the website soon? Also, a screencast showing the issues would make it even easier to see what is happening on your side.

If you would rather not share a screencast here, you can send a support email to help@sitejet.io with a link to this thread, and I can take a look for you directly.

Let me know what works for you.

I don’t get it. My website has been live since last Friday?

I sent an e-mail just now.

Hey @Polarfox - with the recent release, these issues should be solved: Sitejet Builder - Plesk Extensions

Please take a look and let me know :pray:

Hi,

I noticed the indexing setting started working early Friday morning. The custom robots.txt is still replaced after publishing the website, but the sitemap URL is now shown correctly as /sitemap.xml, not the preview website like before. After this update, Google Search Console finally approved my indexing request.
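
For anyone checking the same thing, fetching the file after publishing now shows the corrected line (example.com standing in for my actual domain; the rest of the output is paraphrased):

```
$ curl -s https://www.example.com/robots.txt
User-agent: *
Allow: /

Sitemap: https://www.example.com/sitemap.xml
```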

Is there a way to keep using my own robots.txt, or do you recommend I just leave it at the default settings?

Perfect. Glad that most of it is sorted out. Could you ask Plesk support about the robots.txt? I reckon they handle it differently than we do with the standalone Sitejet product.

Keep me updated if you need additional help.
