My robots.txt won't change

Hello,

I finished my site and enabled search-engine indexing in the dashboard => SEO => “Miscellaneous”. But the default robots.txt file still contains:

User-Agent: * 
Disallow: /

In the robots.txt field on the settings page, I entered:

User-Agent: * 
Disallow:

And I also tried with:

User-Agent: * 
Allow: /

Then I saved and published the site, but I still see:

If indexing is disabled, a unique robots.txt will be published only on the subdomain telling search engines to ignore the domain. (The actual content is: User-Agent: * Disallow: /)

Why?

Thanks.