- Googlebot can't see your content
- Your content is not indexed
- People can't find your content, because it isn't indexed
Simple to fix, right? Just remove the protection from your JS and CSS files. Maybe not so simple. Popular dynamic site generators like WordPress and Drupal have their own directory structures and stylesheet concatenation, and allowing indexing per file type can be tricky because the robots.txt standard has only limited support for wildcards.
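As a sketch, a robots.txt rule set using Google's wildcard extensions (`*` and `$`, which go beyond the original robots.txt standard) could open up assets by file type. The WordPress paths below are typical defaults used for illustration; verify them against your own installation:

```txt
User-agent: Googlebot
Allow: /wp-includes/*.js$
Allow: /wp-content/*.css$
```

Note that crawlers which follow only the original specification will ignore these wildcard rules, so it is worth checking the file with a robots.txt testing tool before relying on it.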
In addition to your own server, your site probably loads resources from third-party servers as well. These are out of your control and can block resource loading on their own. A practical example:
- Your site uses the popular Yandex Maps API to provide driving directions to your store
- The Yandex server blocks Googlebot from fetching the Yandex code: https://api-maps.yandex.ru/robots.txt
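You can check this kind of blocking programmatically. The sketch below uses Python's standard-library `urllib.robotparser`; the robots.txt content and the URL are hypothetical stand-ins for a third-party API server that disallows all crawlers, rather than the actual Yandex rules:

```python
from urllib import robotparser

# Hypothetical robots.txt content, imitating a third-party server
# that refuses every crawler, including Googlebot.
rules = """
User-agent: *
Disallow: /
""".splitlines()

rp = robotparser.RobotFileParser()
rp.parse(rules)

# If Googlebot may not fetch the script, Google cannot fully render
# pages that depend on it.
print(rp.can_fetch("Googlebot", "https://example.com/2.1/?lang=en"))  # False
```

In a real audit you would point the parser at the live file with `rp.set_url("https://example.com/robots.txt")` followed by `rp.read()`, then run the same `can_fetch` check for each external resource your pages load.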