# Global rules
User-agent: *
Allow: /
Disallow: /api/
Disallow: /admin/
Disallow: /dev/
Disallow: /temp/

# Priority paths - ensure thorough crawling
Allow: /industries/
Allow: /solutions/
Allow: /case-studies/

# Sitemaps
Sitemap: https://tetraheat.com/sitemap.xml

# Crawl rate settings (note: Google ignores Crawl-delay; set crawl rate in Search Console instead)
Crawl-delay: 10

# Host settings (non-standard directive; recognized by Yandex only)
Host: https://tetraheat.com

# For more information about how to use robots.txt, see:
# https://developers.google.com/search/docs/crawling-indexing/robots/intro