The client asked me not to show advertising banners to bots, because the company is losing money as a result.
I implemented this logic on the non-AMP pages by parsing the User-Agent with JavaScript on the client side. I chose the client side because of caching: I was afraid that if the page were cached while a bot was crawling the site, every human visitor afterwards would see no ads at all.
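The client-side check I use looks roughly like this (a minimal sketch; the ad container id and the bot pattern are illustrative, not my exact production code):

```javascript
// Very rough heuristic: common crawler keywords in the User-Agent string.
// This only catches bots that identify themselves honestly.
const BOT_PATTERN = /bot|crawler|spider|crawling/i;

function isLikelyBot(userAgent) {
  return BOT_PATTERN.test(userAgent);
}

// In the browser, inject the ad markup only for (apparently) human
// visitors, so the cached HTML itself is identical for everyone.
if (typeof document !== 'undefined' && !isLikelyBot(navigator.userAgent)) {
  const slot = document.getElementById('ad-slot'); // hypothetical container id
  if (slot) {
    slot.innerHTML = '<!-- ad markup goes here -->';
  }
}
```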
Now I face the challenge of applying the same logic to the site's AMP pages, and I'm wondering whether I can do it on the client side there too, since AMP doesn't allow custom JavaScript in the usual way.
The term you are looking for is 'cloaking', and it is harshly penalized by search engines such as Google: https://support.google.com/webmasters/answer/66355?hl=en&ref_topic=6001971
Google also penalizes sites for a number of reasons: to keep its search results relevant, to avoid sending users to sites that are painful to use because of the amount of interstitial ads, and probably, less openly, to make its own Google ads more appealing than more intrusive alternatives.
In short, it's a bad idea: your site will get caught, and it will suffer as a consequence.
That said, you should be able to filter content based on the User-Agent. Most well-behaved bots will advertise that they are bots, but not all.
Unless you have an explicit list of IP addresses to serve different content to, you won't easily catch the bots that impersonate users without resorting to underhanded techniques.
This makes me ask: exactly how are they losing money? If it's 'lost profits', then it isn't actually losing money; the bots would never have responded to the ads anyway.
If it's bandwidth, the cost is minimal compared to the loss you will incur if you serve different content to bots than to humans and get caught.
If the bots are re-serving your content to your users with the ads filtered out, then you need to block those bots outright, or otherwise make them prove they are human before continuing; a CAPTCHA of some sort would be best.
If it's simply a reporting issue, most bots will report that they are bots, and Google Analytics should be able to filter them out with some tweaking; the ones that don't can't easily be distinguished anyway.