I solved it in a nice way a couple of years ago.
I had an email form on a small business website and wanted it to be as accessible as possible; spam bots found it, and the junk began to drown out legitimate messages. From the server logs I could see that the bots submitted the form without retrieving it first - someone had cached my form and simply sent a POST whenever they had some garbage for me to read. A hidden form input would help for a few days, but then some bot owner would figure out the correct value, cache it, and the flood would start again.
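For illustration, here is a minimal sketch of that kind of static hidden-field check, assuming a Python handler; the answer does not show the original check, so the field name and value below are made up.

def looks_like_a_browser_submission(post_fields):
    # A cached or replayed POST from a bot will not carry the current hidden
    # value; as noted above, this held up only until a bot owner copied it.
    return post_fields.get("not_a_robot") == "3f9c"  # value rotated by hand now and then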
I did not have a backend where I could add session information to the form, and I did not want to add one. Instead, between the "Enter your message here" field and the hidden element, I inserted script output that writes:
<!-- instructions for spam robots: we are a waste of your money, go away, thanks -->
<div class="float-left" style="font-size: x-small;">
There will be a short delay before you may submit the form. If you
have been typing in your information, the delay may already have
ended.
<br/><span>
4 ...
</span><span>
<!--
d92cbd14985295ac27929a6db7891a90ec4173a8358dcadab134cc589ce2de54
1468365bd33b520754ddb8223252e7e6e7584ddb956ef1bb28628e27cfea86c6
-->
The block of garbage is randomly generated each time, which makes it hard to compress. I experimented with how long the garbage block needed to be; once I got the form page up to about 200K, the spam messages stopped.
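For illustration, a minimal sketch of how such an incompressible garbage block could be generated; the original answer does not show its script, so the Python below, the function name, and the sizes are assumptions rather than the author's actual code.

import os

def garbage_comment(size_chars=200_000):
    # Random bytes rendered as hex and wrapped in an HTML comment like the
    # sample above.  Random data has no patterns to exploit, so even after
    # gzip the block stays large (hex still carries ~4 bits per character).
    raw = os.urandom(size_chars // 2)              # 2 hex characters per byte
    hex_text = raw.hex()
    lines = [hex_text[i:i + 64] for i in range(0, len(hex_text), 64)]
    return "<!--\n" + "\n".join(lines) + "\n-->"

print(garbage_comment(1_024)[:100])  # e.g. embed the full block in the form page

Presumably, as the comment addressed to the robots suggests, the point is that every re-fetch of the form then costs the spammer real bandwidth.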
In practice this is not much more extra data than adding a few images to the page. Even for a hypothetical dial-up client (200K at roughly 5 KB/s is on the order of 40 seconds), the delay between the text field rendering and the submit button rendering is shorter than the time it would probably take to compose a message.
rob