Google needs to be able to crawl your website so it can look up and compare information such as price, shipping cost, and language. If Google isn't allowed to crawl your website, some products or the whole account may be disapproved. If your products and/or account are currently disapproved for this reason, make sure your robots.txt file allows both the user-agent 'Googlebot' (used for landing pages) and the user-agent 'Googlebot-Image' (used for images) to crawl your full site. The robots.txt file can usually be found in the root directory of the web server (for example, http://www.yourdomain.com/robots.txt).
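As a minimal sketch, a robots.txt that explicitly permits both crawlers might look like the following (the paths shown are placeholders; adjust or remove the Disallow rules to match your own site):

```
# Allow Google's landing-page crawler full access
User-agent: Googlebot
Disallow:

# Allow Google's image crawler full access
User-agent: Googlebot-Image
Disallow:

# Rules for all other crawlers (example only)
User-agent: *
Disallow: /private/
```

An empty `Disallow:` line means nothing is blocked for that user-agent. Note that crawlers use the most specific matching group, so a broad `Disallow` under `User-agent: *` does not block Googlebot or Googlebot-Image when they have their own groups, as above.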