robots.txt (1236B)
User-agent: TurnitinBot
Disallow: /

User-agent: AI2Bot
User-agent: Ai2Bot-Dolma
User-agent: Amazonbot
User-agent: anthropic-ai
User-agent: Applebot
User-agent: Applebot-Extended
User-agent: Bytespider
User-agent: CCBot
User-agent: ChatGPT-User
User-agent: Claude-Web
User-agent: ClaudeBot
User-agent: cohere-ai
User-agent: Diffbot
User-agent: DuckAssistBot
User-agent: FacebookBot
User-agent: facebookexternalhit
User-agent: FriendlyCrawler
User-agent: Google-Extended
User-agent: GoogleOther
User-agent: GoogleOther-Image
User-agent: GoogleOther-Video
User-agent: GPTBot
User-agent: iaskspider/2.0
User-agent: ICC-Crawler
User-agent: ImagesiftBot
User-agent: img2dataset
User-agent: ISSCyberRiskCrawler
User-agent: Kangaroo Bot
User-agent: Meta-ExternalAgent
User-agent: Meta-ExternalFetcher
User-agent: OAI-SearchBot
User-agent: omgili
User-agent: omgilibot
User-agent: PerplexityBot
User-agent: PetalBot
User-agent: Scrapy
User-agent: Sidetrade indexer bot
User-agent: Timpibot
User-agent: VelenPublicWebCrawler
User-agent: Webzio-Extended
User-agent: YouBot
Disallow: /categories/
Disallow: /css/
Disallow: /december-adventure/
Disallow: /js/
Disallow: /now/
Disallow: /posters/
Disallow: /posts/
Disallow: /pubs/
Disallow: /tags/
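
The file contains two groups: TurnitinBot is disallowed from the entire site, while the remaining AI and scraper user agents share one group that disallows only the listed directories. Below is a minimal sketch of how a compliant crawler would interpret such grouped rules, using Python's standard-library urllib.robotparser; the example.com host and the shortened rule set are illustrative assumptions, not part of the file above.

from urllib.robotparser import RobotFileParser

# Shortened stand-in for the robots.txt above (illustrative only).
rules = """\
User-agent: TurnitinBot
Disallow: /

User-agent: GPTBot
User-agent: CCBot
Disallow: /posts/
Disallow: /tags/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# TurnitinBot is blocked everywhere; GPTBot only from the listed paths.
print(parser.can_fetch("TurnitinBot", "https://example.com/"))         # False
print(parser.can_fetch("GPTBot", "https://example.com/posts/hello/"))  # False
print(parser.can_fetch("GPTBot", "https://example.com/about/"))        # True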