First, consider offering a bot API. If another company is going to scan your site anyway, and the API serves the information they want, your site becomes more valuable to them. An API also significantly reduces the load on your server and gives you full visibility into who is scanning you.
Second, from personal experience (I wrote web crawlers a long time ago): as a rule, you can tell from monitoring which client accessed your site. If they use an automated tool or an HTTP library from a programming language, the request signature (user agent, headers, timing) will differ from that of a regular browser. Beyond that, you can watch the access log and update your .htaccess to ban offenders (if that is what you are looking for).
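As a minimal sketch of the .htaccess approach (Apache 2.4 syntax; the IP address and user-agent strings below are placeholders, not a curated blocklist):

```apache
# Deny a specific crawler IP spotted in the access log (placeholder address)
<RequireAll>
    Require all granted
    Require not ip 203.0.113.9
</RequireAll>

# Reject requests whose User-Agent looks like a common HTTP library (placeholders)
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} (curl|python-requests|scrapy) [NC]
RewriteRule .* - [F,L]
```

Keep in mind that a determined crawler can spoof its user agent and rotate IPs, so treat this as a first filter rather than a complete defense.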
Their behavior is usually different and pretty easy to spot: repeated, very consistent page requests.
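That regularity can be detected programmatically. Here is a sketch in Python that flags IPs whose inter-request gaps are numerous and suspiciously uniform; the sample timestamps are made up, and in practice you would parse them out of your access log:

```python
from statistics import pstdev

# Hypothetical parsed access-log entries: (ip, unix_timestamp).
# A real script would extract these from the Apache/Nginx access log.
requests = [
    ("203.0.113.9", t) for t in range(0, 100, 2)          # bot: one hit every 2 s
] + [
    ("198.51.100.4", t) for t in (3, 11, 14, 42, 60, 95)  # human-like browsing
]

def flag_bots(entries, min_gaps=10, max_jitter=0.5):
    """Flag IPs with many requests whose intervals are almost
    perfectly regular (low standard deviation of the gaps)."""
    by_ip = {}
    for ip, ts in entries:
        by_ip.setdefault(ip, []).append(ts)
    flagged = []
    for ip, times in by_ip.items():
        times.sort()
        gaps = [b - a for a, b in zip(times, times[1:])]
        if len(gaps) >= min_gaps and pstdev(gaps) <= max_jitter:
            flagged.append(ip)
    return flagged

print(flag_bots(requests))  # the evenly spaced IP stands out
```

The thresholds (`min_gaps`, `max_jitter`) are arbitrary starting points; tune them against your own traffic so that slow, bursty human browsing is never flagged.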
Check out this other post for more on how to deal with them, along with some thoughts on how to identify them:
How to block intruders who crawl my site?