Enter an IPv4 or IPv6 address to check if it's a valid crawler.
Try one of these examples:
- 184.108.40.206 (Googlebot)
- 220.127.116.11 (Bingbot)
- 18.104.22.168 (Twitter)
- 22.214.171.124 (Facebook)
Why do you need to validate search engine IP addresses?
- Identify and block requests from bots falsely claiming to be search engine crawlers
- Filter fake bot traffic from your logs so you can see genuine search engine crawler activity clearly
How can you validate requests are from genuine crawlers?
Some crawlers, such as DuckDuckGo's, publish a fixed list of IP addresses; Facebook and Twitter publish lists of IP ranges. Most search engines, however, recommend a method called reverse DNS lookup to validate their crawlers' IP addresses.
First, run a reverse DNS lookup on the IP address in your logs to get its hostname and check that it belongs to the search engine's domain. Then run a forward DNS lookup on that hostname and confirm it resolves back to the original IP address.
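The two-step check above can be sketched in Python using the standard library's `socket` module. The domain suffixes below are illustrative assumptions for a couple of well-known crawlers; consult each search engine's own documentation for the authoritative list.

```python
import socket

# Assumed suffixes for illustration; verify against each crawler's docs.
CRAWLER_HOST_SUFFIXES = (".googlebot.com", ".google.com", ".search.msn.com")

def hostname_matches_crawler(hostname, suffixes=CRAWLER_HOST_SUFFIXES):
    """Pure check: does the hostname end in a known crawler domain?"""
    return hostname.rstrip(".").endswith(suffixes)

def is_genuine_crawler(ip, suffixes=CRAWLER_HOST_SUFFIXES):
    """Reverse DNS lookup, domain check, then forward-lookup confirmation."""
    try:
        # Step 1: reverse lookup the IP to get its hostname.
        hostname, _, _ = socket.gethostbyaddr(ip)
    except socket.herror:
        return False
    if not hostname_matches_crawler(hostname, suffixes):
        return False
    try:
        # Step 2: forward lookup must resolve back to the original IP.
        _, _, addresses = socket.gethostbyname_ex(hostname)
    except socket.gaierror:
        return False
    return ip in addresses
```

The suffix check matters: a reverse lookup alone can be spoofed by anyone who controls the reverse DNS zone for their own IP range, so both the domain match and the forward-lookup round trip are required.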
The full list of crawlers that can be validated is as follows.