A good[ish] website

Web development blog, loads of UI and JavaScript topics

Block bad bots in your robots.txt

Filed under: Webdev— Tagged with: seo, tools

This post shows how to block bad bots in your robots.txt file, and tries to figure out whether doing so accomplishes anything. Spoiler alert: probably not.

Blocking bots in your robots.txt doesn't stop a human from scouring your site; it merely deters some web-based tools, automated scraping attempts, and maybe really dumb script kiddies.

For example, blocking wget does actually block it, but any human with more than three brain cells knows how to bypass the robots limitation:

$ wget -e robots=off https://example.com/

I don’t actually have any data to show if blocking bad bots hampers attempts to scrape or otherwise prod your site, just guesswork and conjecture.

List of bad bots

This is where it gets tricky immediately: there used to be a site called, but the domain has been sniped and replaced by a placeholder page with affiliate links. I've still got a copy of the bots listed on that site (from July 11th, 2020).

// Copied from at 2020-06-11.
const badBots = [
  'Big Brother',
  'Black Hole',
  'Bookmark search tool',
  'BSpider/1.0 libwww-perl/0.40',
  'CACTVS Chemistry Spider',
  'Checkbot/x.xx LWP/5.x',
  'Crescent Internet ToolPak HTTP OLE Control v.1.0',
  'CyberPatrol SiteCat Webbot',
  'DISCo Pump 3.0',
  'DISCo Pump 3.2',
  'Download Demon',
  'Download Demon/',
  'Download Demon/',
  'Express WebPictures',
  'Express WebPictures (',
  'FairAd Client',
  'fido/0.9 Harvest/1.4.pl2',
  'Flaming AttackBot',
  'FlashGet WebWasher 3.2',
  'FrontPage [NC,OR]',
  'GetRight/4.2b (Portuguxeas)',
  'GetURL.rexx v1.05',
  'Go!Zilla (',
  'Go!Zilla 3.3 (',
  'Go!Zilla 3.5 (',
  'Hatena Antenna',
  "Hazel's Ferret Web hopper",
  'HTTrack 3.0',
  'HTTrack [NC,OR]',
  'Image Stripper',
  'Image Sucker',
  ' url crawler',
  'Indy Library',
  'Indy Library [NC,OR]',
  'Internet Ninja',
  'Internet Ninja 4.0',
  'Internet Ninja 5.0',
  'Internet Ninja 6.0',
  'ITI Spider',
  'JOC Web Spider',
  'Kenjin Spider',
  'Keyword Density/0.9',
  'KIT-Fireball/2.0 libwww/5.0a',
  'larbin (',
  'larbin_2.6.2 (',
  'larbin_2.6.2 (larbin2.6.2@unspecified.mail)',
  'larbin_2.6.2 (listonATccDOTgatechDOTedu)',
  'larbin_2.6.2 (',
  'larbin_2.6.2 larbin2.6.2@unspecified.mail',
  'larbin_2.6.2 listonATccDOTgatechDOTedu',
  'LinkScan/8.1a Unix',
  'Mass Downloader',
  'Mass Downloader/2.2',
  'Mata Hari',
  'Microsoft URL Control',
  'Microsoft URL Control - 5.01.4511',
  'Microsoft URL Control - 6.00.8169',
  'MIDown tool',
  'Mister PiX',
  'Mister Pix II 2.01',
  'Mister Pix II 2.02a',
  'Mister PiX version.dll',
  'MOMspider/1.00 libwww-perl/0.40',
  'Net Vampire',
  'Net Vampire/3.0',
  'NetCarta CyberPilot Pro',
  'NetScoop/1.0 libwww/5.0a',
  'NetZip Downloader 1.0 Win32(Nov 12 1998)',
  'NetZip-Downloader/1.0.62 (Win32; Dec 7 1998)',
  'Offline Explorer',
  'Offline Explorer/1.2',
  'Offline Explorer/1.4',
  'Offline Explorer/1.6',
  'Offline Explorer/1.7',
  'Offline Explorer/1.9',
  'Offline Explorer/2.0',
  'Offline Explorer/2.1',
  'Offline Explorer/2.3',
  'Offline Explorer/2.4',
  'Offline Explorer/2.5',
  'Offline Navigator',
  'Open Text Site Crawler V1.0',
  'Openfind data gatherer',
  'Oracle Ultra Search',
  'Papa Foto',
  'QueryN Metasearch',
  'Radiation Retriever 1.1',
  'RepoMonkey Bait & Tackle/v1.01',
  'Resume Robot',
  'SafetyNet Robot 0.1',
  'SmartDownload/1.2.76 (Win32; Apr 1 1999)',
  'SmartDownload/1.2.77 (Win32; Aug 17 1999)',
  'SmartDownload/1.2.77 (Win32; Feb 1 2000)',
  'SmartDownload/1.2.77 (Win32; Jun 19 2001)',
  'Solbot/1.0 LWP/5.07',
  'Spanner/1.0 (Linux 2.0.27 i586)',
  'Sqworm/2.9.85-BETA (beta_release; 20011115-775; i686-pc-linux',
  'SuperBot/3.0 (Win32)',
  'SuperBot/3.1 (Win32)',
  'Teleport Pro',
  'Teleport Pro/1.29',
  'Teleport Pro/1.29.1590',
  'Teleport Pro/1.29.1634',
  'Teleport Pro/1.29.1718',
  'Teleport Pro/1.29.1820',
  'Teleport Pro/1.29.1847',
  'The Intraformant',
  'URL Control',
  'URLy Warning',
  'Valkyrie/1.0 libwww-perl/0.40',
  'VCI WebViewer VCI WebViewer Win32',
  'Web Image Collector',
  'Web Sucker',
  'WebAuto/3.40 (Win98; I)',
  'WebCapture 2.0',
  'WebCopier v.2.2',
  'WebCopier v2.5',
  'WebCopier v2.6',
  'WebCopier v2.7a',
  'WebCopier v2.8',
  'WebCopier v3.0',
  'WebCopier v3.0.1',
  'WebCopier v3.2',
  'WebCopier v3.2a',
  'WebCrawler/3.0 Robot libwww/5.0a',
  'WebGo IS',
  'WebLinker/0.0 libwww-perl/0.1',
  'WebmasterWorld Extractor',
  'WebReaper []',
  'WebReaper []',
  'WebReaper v9.1 -',
  'WebReaper v9.7 -',
  'WebReaper v9.8 -',
  'WebReaper vWebReaper v7.3 - www,',
  'WebSauger 1.20b',
  'WebSauger 1.20j',
  'WebSauger 1.20k',
  'Website eXtractor',
  'Website Quester',
  'Website Quester -',
  'Website Quester -',
  'Webster Pro',
  'WebZIP/2.75 (',
  'WebZIP/3.65 (',
  'WebZIP/3.80 (',
  'WebZIP/4.0 (',
  'WebZIP/4.1 (',
  'WebZIP/4.21 (',
  'WebZIP/5.0 (',
  'WebZIP/5.0 PR1 (',
  'WhoWhere Robot',
  'WWW Collector',
  'WWWWanderer v3.0',
  'Xaldon WebSpider',
  'Xaldon WebSpider 2.5.b3',
  "Xenu's Link Sleuth 1.1c",
  'Zeus 11389 Webster Pro V2.9 Win32',
  'Zeus 11652 Webster Pro V2.9 Win32',
  'Zeus 18018 Webster Pro V2.9 Win32',
  'Zeus 26378 Webster Pro V2.9 Win32',
  'Zeus 30747 Webster Pro V2.9 Win32',
  'Zeus 32297 Webster Pro V2.9 Win32',
  'Zeus 39206 Webster Pro V2.9 Win32',
  'Zeus 41641 Webster Pro V2.9 Win32',
  'Zeus 44238 Webster Pro V2.9 Win32',
  'Zeus 51070 Webster Pro V2.9 Win32',
  'Zeus 51674 Webster Pro V2.9 Win32',
  'Zeus 51837 Webster Pro V2.9 Win32',
  'Zeus 63567 Webster Pro V2.9 Win32',
  'Zeus 6694 Webster Pro V2.9 Win32',
  'Zeus 82016 Webster Pro V2.9 Win32',
  'Zeus 82900 Webster Pro V2.9 Win32',
  'Zeus 84842 Webster Pro V2.9 Win32',
  'Zeus 90872 Webster Pro V2.9 Win32',
  'Zeus 94934 Webster Pro V2.9 Win32',
  'Zeus 95245 Webster Pro V2.9 Win32',
  'Zeus 95351 Webster Pro V2.9 Win32',
  'Zeus 97371 Webster Pro V2.9 Win32',
  'Zeus Link Scout',
]

export default badBots

Questions about that list:

  • Is it up to date? Probably not.
  • Does the author of the list know what they’re doing? I don’t know.
  • Should I feel safer after blocking these bots? Definitely not.
  • Does it actually do anything? Yes... probably...

Making the robots.txt file

The following allows everything:

User-agent: *
Allow: /

The syntax for disallowing is not much different:

User-agent: userAgentName
Disallow: /

Targeting more than one user agent? Just list them one-by-one before the Disallow: / statement:

User-agent: userAgentName
User-agent: userAgentName2
Disallow: /

See the official documentation at
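Putting the pieces together, a robots.txt that allows everyone by default but blocks a couple of bots from the list (picked here purely for illustration) would look like this:

```text
User-agent: *
Allow: /

User-agent: Teleport Pro
User-agent: Mass Downloader
Disallow: /
```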

Here’s a simple Node script that creates an example robots.txt file, allowing everything and then disallowing the bad bots:

import badBots from './badBots.js'

const bots = badBots.map((badBot) => `User-agent: ${badBot}`).join('\n')

const robotsTxt = `User-agent: *
Allow: /

${bots}
Disallow: /`

export default robotsTxt
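As a sanity check, here’s a self-contained sketch of the same idea with a tiny inline list (two bots picked arbitrarily), so you can see the shape of the generated file:

```javascript
// Self-contained version of the generator with a tiny inline list.
const badBots = ['Teleport Pro', 'Mass Downloader']

// One User-agent line per bot, then a single Disallow covering them all.
const botLines = badBots.map((bot) => `User-agent: ${bot}`).join('\n')

const robotsTxt = `User-agent: *
Allow: /

${botLines}
Disallow: /
`

console.log(robotsTxt)
```

Running it prints the allow-all group first, followed by one User-agent line per bad bot and the shared Disallow: / rule.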

What are the big guns doing?

Alexa top 25 global sites (as of November 18, 2020):

(Table columns: Site · Has robots.txt · Blocks bots — the per-site rows didn’t survive.)

Most of them block paths in their applications and some well-known bots, but nothing major, which makes me think more and more that this is useless. Or they’re doing the blocking at the web server level (NGINX, Apache) instead.
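If you wanted to enforce the list at the server level yourself (which, unlike robots.txt, actually refuses the request), a minimal sketch in Node could look like this — the isBadBot helper and the two-bot list are hypothetical stand-ins for illustration, not anything the big sites are confirmed to run:

```javascript
// Hypothetical server-side enforcement: reject requests whose
// User-Agent header matches an entry in the bad-bot list.
const badBots = ['Teleport Pro', 'Mass Downloader'] // subset for illustration

// True if the incoming User-Agent string contains any listed bot name.
const isBadBot = (userAgent = '') =>
  badBots.some((bot) => userAgent.includes(bot))

// In a request handler you'd check the header before serving anything:
// if (isBadBot(req.headers['user-agent'])) { res.statusCode = 403; res.end() }

console.log(isBadBot('Teleport Pro/1.29')) // true
console.log(isBadBot('Mozilla/5.0'))       // false
```

Unlike a robots.txt entry, this check doesn’t rely on the bot being polite — though anyone can spoof a User-Agent, so it only stops the laziest scrapers.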

Comments would go here, but the commenting system isn’t ready yet, sorry. Tweet me @hiljaa if you want to make a correction etc.

  • © 2021 Antti Hiljá
  • About
  • Follow me in Twatter → @hiljaa
  • All rights reserved yadda yadda.
  • I can put just about anything here, no one reads the footer anyways.
  • I love u!