Enter an IPv4 or IPv6 address to check if it's a valid crawler.
Try one of these examples:
- 66.249.90.77 (Googlebot)
- 40.77.167.15 (Bingbot)
- 199.59.149.165 (Twitter)
- 69.63.176.13 (Facebook)
Why do you need to validate search engine IP addresses?
- Identify and block requests from bots falsely claiming to be search engine crawlers
- Filter spoofed bot traffic out of your logs so you can analyse genuine search engine crawler activity
How can you validate requests are from genuine crawlers?
Some crawlers, such as DuckDuckGo, publish a fixed list of IP addresses. Facebook and Twitter publish lists of IP ranges. Most search engines instead recommend validating their IP addresses with a reverse DNS lookup.
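For crawlers that publish fixed addresses or ranges, validation is a simple membership check. A minimal sketch using Python's standard `ipaddress` module (the CIDR ranges below are illustrative placeholders; always fetch the current ranges from the crawler's official published list):

```python
import ipaddress

# Illustrative ranges only -- substitute the crawler's officially published list.
CRAWLER_NETWORKS = [
    ipaddress.ip_network("69.63.176.0/20"),   # example Facebook-style range
    ipaddress.ip_network("199.59.148.0/22"),  # example Twitter-style range
]

def ip_in_published_ranges(ip: str, networks=CRAWLER_NETWORKS) -> bool:
    """Return True if the address falls inside any published CIDR range."""
    addr = ipaddress.ip_address(ip)
    return any(addr in net for net in networks)

print(ip_in_published_ranges("69.63.176.13"))  # True
print(ip_in_published_ranges("192.0.2.1"))     # False
```

The same check works unchanged for IPv6, since `ipaddress.ip_address` and `ip_network` accept both address families.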
First, run a reverse DNS lookup on the IP address in your logs to get the hostname, and check that the hostname belongs to the crawler's official domain (for example, Googlebot hostnames end in googlebot.com). Then run a forward DNS lookup on that hostname and confirm it resolves back to the original IP address. The domain check matters because anyone who controls an IP address can set its reverse DNS record to any hostname they like; only the forward lookup on the official domain proves ownership.
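The two-step check above can be sketched in Python with the standard `socket` module. The resolver functions are injectable parameters (an assumption for testability, not part of any crawler's API), so the logic can be exercised without live DNS; by default they perform real lookups:

```python
import socket

def validate_crawler(ip, allowed_domains,
                     reverse=lambda ip: socket.gethostbyaddr(ip)[0],
                     forward=lambda host: socket.gethostbyname(host)):
    """Validate a crawler IP via reverse DNS, domain check, then forward DNS."""
    try:
        hostname = reverse(ip)  # step 1: IP -> hostname
    except OSError:
        return False
    # hostname must belong to one of the crawler's official domains
    if not any(hostname == d or hostname.endswith("." + d) for d in allowed_domains):
        return False
    try:
        return forward(hostname) == ip  # step 2: hostname must resolve back to the IP
    except OSError:
        return False
```

A live call would look like `validate_crawler("66.249.90.77", ["googlebot.com", "google.com"])`, where the allowed domains are those the search engine documents for its crawler.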
The full list of crawlers that can be validated is as follows.