1. Require a login for access, so bots can't download your website's HTML anonymously.
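A minimal sketch of the login gate above, assuming a hypothetical in-memory session store (`ACTIVE_SESSIONS` and the cookie name `session` are illustrative; a real site would use a framework's auth layer such as Flask-Login or Django's auth):

```python
# Hypothetical token -> username store; illustrative data only.
ACTIVE_SESSIONS = {"a1b2c3": "alice"}

def serve_page(request_headers: dict) -> tuple[int, str]:
    """Return (status_code, body); anonymous requests get 401, not the HTML."""
    token = request_headers.get("Cookie", "").removeprefix("session=")
    user = ACTIVE_SESSIONS.get(token)
    if user is None:
        return 401, "Login required"
    return 200, f"<html>secret content for {user}</html>"

print(serve_page({"Cookie": "session=a1b2c3"}))  # logged-in user
print(serve_page({}))                            # anonymous bot
```

The point is simply that the HTML body is never assembled for unauthenticated requests, so an anonymous crawler gets nothing to parse.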
2. Change your website's HTML regularly. By changing your markup's structure regularly, you break scrapers that rely on fixed element names and selectors.
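One way to rotate markup automatically is to derive class names from a per-deploy seed, so the template and stylesheet agree within one build but every selector changes on the next rotation. A sketch, where `DEPLOY_SEED` is a hypothetical value you would change on each deploy:

```python
import hashlib

DEPLOY_SEED = "build-2024-07-01"  # hypothetical: rotate this on every deploy

def rotated_class(base: str) -> str:
    """Derive a stable-per-deploy, opaque class name from a logical base name.

    A scraper hard-coded to the previous deploy's class name breaks as soon
    as the seed changes.
    """
    suffix = hashlib.sha256(f"{DEPLOY_SEED}:{base}".encode()).hexdigest()[:8]
    return f"c{suffix}"  # e.g. 'c3fa91c20': gives no hint of its meaning

print(rotated_class("price"))
```

Because the name is derived, not stored, both your HTML generator and your CSS build step can call the same function and stay consistent.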
3. Use CAPTCHAs to block automated access to your website.
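To illustrate the idea only, here is a toy arithmetic challenge; a real site should use an established service such as reCAPTCHA or hCaptcha, since a challenge this simple is trivially solvable by a bot:

```python
import random

def make_challenge() -> tuple[str, int]:
    """Return (question shown to the user, expected answer kept server-side)."""
    a, b = random.randint(1, 9), random.randint(1, 9)
    return f"What is {a} + {b}?", a + b

def check_answer(expected: int, submitted: str) -> bool:
    """Validate the user's submission; non-numeric input simply fails."""
    try:
        return int(submitted) == expected
    except ValueError:
        return False
```

The server stores `expected` in the session and only serves the protected page once `check_answer` passes.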
4. Don't expose your complete dataset. The less data you expose, the less information an attacker can collect.
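A common way to apply this is pagination with a hard cap, so the public endpoint can never be walked end to end. A sketch with hypothetical limits (`PAGE_SIZE`, `MAX_PAGE`):

```python
# Stand-in for the full dataset (1000 records).
DATASET = [f"record-{i}" for i in range(1000)]

PAGE_SIZE = 20
MAX_PAGE = 5  # hypothetical cap: at most 100 records are ever reachable

def get_page(page: int) -> list[str]:
    """Serve at most PAGE_SIZE items, refusing to page past MAX_PAGE."""
    if page < 1 or page > MAX_PAGE:
        return []
    start = (page - 1) * PAGE_SIZE
    return DATASET[start:start + PAGE_SIZE]
```

Even a scraper that enumerates every page number only ever sees `PAGE_SIZE * MAX_PAGE` records, not the whole dataset.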
5. Don't accept requests if the User-Agent header is empty or missing. Attackers sometimes forget to set request headers on their bots.
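The check itself is a one-liner; a sketch over a plain header dict (any web framework exposes the same header):

```python
def is_suspicious(headers: dict) -> bool:
    """True when the User-Agent header is missing, empty, or whitespace-only.

    Such requests can be rejected outright (e.g. with a 403) before any
    content is rendered.
    """
    return not headers.get("User-Agent", "").strip()
```

Note this only catches careless bots: a well-written scraper will send a browser-like User-Agent, so treat this as one cheap filter among several, not a complete defense.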
6. Use and require cookies; use them to track user and scraper behavior.
7. Load your content with JavaScript and Ajax. This makes the content inaccessible to HTML parsers that do not run JavaScript, which is often an effective deterrent to inexperienced programmers writing scrapers.
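The cookie-tracking tip above can be sketched as a server that issues an ID on first visit and counts requests per ID, so unusually high request rates can be flagged as likely scrapers. `ISSUED` and the cookie name `cid` are hypothetical; a real deployment would persist counters in a shared store rather than process memory:

```python
import secrets

# Hypothetical in-memory tracker: cookie id -> request count.
ISSUED: dict[str, int] = {}

def handle_request(headers: dict) -> tuple[dict, str]:
    """Issue a tracking cookie on first visit and count requests per cookie."""
    cid = headers.get("Cookie", "").removeprefix("cid=")
    if cid not in ISSUED:
        cid = secrets.token_hex(8)   # new visitor: mint a fresh id
        ISSUED[cid] = 0
    ISSUED[cid] += 1
    return {"Set-Cookie": f"cid={cid}"}, "<html>page</html>"
```

A client that refuses cookies (many naive scrapers do) gets a fresh ID on every request, which is itself a detectable pattern.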