A main competitive advantage for this online hotel metasearch engine is their content. Attracting and retaining customers depends heavily on their investment in image algorithms, geographic data, and other intellectual property for accumulating proprietary information. However, an infiltration of scraping bots began to distort their web traffic to the point that any analysis of the traffic data was deemed unreliable for making business decisions. Without reliable data, they were in a weak position to negotiate good rates with their suppliers, and ultimately they risked losing customers if they could not offer the best pricing from those suppliers.

As a tech company, the hotel metasearch engine first attempted to build their own solution to the scraping bot problem, but they quickly realized that the character of the bot traffic changed every day. Maintaining the homegrown solution was a massive undertaking in time and resources just to analyze traffic statistics and tune the algorithms reactively, let alone proactively. They also considered applying a blanket filter over all traffic. However, that approach could neither distinguish the good bots from the bad bots, nor target the specific regions where the bad traffic was concentrated.
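A blanket filter fails here because both a legitimate search-engine crawler and a scraper can present any User-Agent string. One standard way to separate them, sketched below with hypothetical names, is forward-confirmed reverse DNS: Google documents that a genuine Googlebot IP resolves to a `googlebot.com` or `google.com` hostname, and that hostname resolves back to the same IP. This is only an illustration of the distinction the metasearch engine needed, not the vendor solution they adopted.

```python
import socket

# Hypothetical helper (illustrative only): verify that a client claiming
# to be Googlebot really originates from Google, using the documented
# forward-confirmed reverse DNS check.
def is_verified_googlebot(ip: str) -> bool:
    try:
        # 1. Reverse DNS: the IP should resolve to a googlebot.com
        #    or google.com hostname.
        host, _, _ = socket.gethostbyaddr(ip)
        if not host.endswith((".googlebot.com", ".google.com")):
            return False
        # 2. Forward confirmation: the hostname must resolve back to the
        #    same IP, so a scraper cannot simply fake its PTR record.
        return ip in socket.gethostbyname_ex(host)[2]
    except (socket.herror, socket.gaierror):
        # No PTR record, or the hostname does not resolve forward:
        # treat the client as unverified.
        return False
```

A request that passes this check can be allowed through as a good bot even while aggressive filtering is applied to the rest of the automated traffic, which is exactly the granularity a blanket filter lacks.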