We've recently run into an issue where the robots.txt files for our client websites are being marked as 'unreachable' and 'Not fetched - N/A' in Google Search Console.
In some cases Google is able to fetch the file and then an hour later it can't. For most of our clients, the robots.txt file hasn't been fetched since May 10-11.
Our standard robots.txt file looks like this:
User-agent: *
Disallow:
Sitemap: https://www.yourwebsite.com/sitemap.xml
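One quick sanity check (rough sketch only; the URL is a placeholder and the user-agent string is just Googlebot's published one) is to request the file with a Googlebot-style user agent and confirm the server returns a 200 with the plain robots.txt body rather than a CDN/firewall challenge page:

import urllib.request

# Placeholder URL - swap in the actual client domain.
URL = "https://www.yourwebsite.com/robots.txt"
# Googlebot's published desktop user-agent string, so the request looks crawler-like.
UA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"

req = urllib.request.Request(URL, headers={"User-Agent": UA})
with urllib.request.urlopen(req, timeout=10) as resp:
    # A healthy setup should print 200, a text/plain content type, and the file shown above.
    print(resp.status, resp.headers.get("Content-Type"))
    print(resp.read().decode("utf-8", errors="replace"))

This won't reproduce a block that keys off Google's actual IP ranges (a common culprit when a CDN or firewall rate-limits crawlers), but it does rule out obvious server-side problems like 5xx responses or redirects on the robots.txt URL.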
Anyone have any ideas what's going on and how to fix this issue?