Seconded. Websites with logins are practically unusable without this kind of tool. We had to disable it once, and our database got flooded with unverified accounts. Absolutely awful.
Not easily, and not at the time, no. Back then, CAPTCHAs really were a very easy way to quickly reduce bot problems.
You'd get random spam flooding your forums and the like, and setting up a CAPTCHA had an immediate, palpable effect on cutting down the spam coming in from random bot farms and shit.
I can personally confirm that when I implemented a CAPTCHA on the forums I maintained 14 years ago, it reduced spam substantially.
There's no point in arguing what once was. Things have changed. CAPTCHAs are now less effective, far more invasive, and for many people, far more troublesome.
Cling to them if you like. I no longer use them on any of my sites, because I care about my users.
What will be effective depends on the nature of the site and that of the bots causing trouble. For example, a forum can limit posting privileges until an account builds a reputation, a paid goods/services site can restrict access until a purchase is made, a web service can use revocable credentials, and a data download site can use rate limits. (That last one is actually useful in a variety of situations, and can be done at the network level instead of or in addition to the application level.)
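For the rate-limiting measure mentioned above, a token bucket is one common application-level approach. This is a minimal illustrative sketch, not a production implementation; the class and parameter names are made up for the example:

```python
import time

class TokenBucket:
    """Per-client token-bucket rate limiter (illustrative sketch).

    Each client gets a bucket that refills at a steady rate and
    allows short bursts up to `capacity` requests.
    """

    def __init__(self, capacity, refill_per_sec):
        self.capacity = capacity              # maximum burst size
        self.tokens = float(capacity)         # start with a full bucket
        self.refill_per_sec = refill_per_sec  # tokens added per second
        self.last = time.monotonic()

    def allow(self):
        now = time.monotonic()
        # Refill tokens for the elapsed time, capped at capacity.
        elapsed = now - self.last
        self.tokens = min(self.capacity, self.tokens + elapsed * self.refill_per_sec)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True   # request permitted
        return False      # over the limit, reject or delay

# Allow bursts of 5 requests, refilling 1 token per second.
bucket = TokenBucket(capacity=5, refill_per_sec=1)
results = [bucket.allow() for _ in range(7)]  # 7 back-to-back requests
```

With these numbers, the first five rapid requests pass and the next two are rejected until the bucket refills. In practice you'd keep one bucket per client key (IP, account, API token), and network-level equivalents exist in most reverse proxies and firewalls.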
There is no silver bullet, but there are lots of small measures that can be very effective when applied thoughtfully, without turning a site into a frustrating-to-use surveillance tool for Google at the expense of the humans who want to or have to use it.
Even a small, locally hosted, activate-only-once, simple image or text-based CAPTCHA would be preferable to the ones operated by third parties.
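To make that concrete, here is a minimal sketch of a locally hosted text challenge of the kind described, using a trivial arithmetic question standing in for whatever prompt you'd actually render. The function names are hypothetical, and a real deployment would store the answer server-side keyed by the token:

```python
import random
import secrets

def make_text_captcha():
    """Generate a simple local text challenge (illustrative sketch)."""
    a, b = random.randint(1, 9), random.randint(1, 9)
    token = secrets.token_hex(8)        # opaque id to look the answer up later
    question = f"What is {a} plus {b}?"
    answer = str(a + b)
    return token, question, answer

def check_answer(expected, submitted):
    # Constant-time comparison avoids trivial timing leaks.
    return secrets.compare_digest(expected, submitted.strip())

token, question, answer = make_text_captcha()
ok = check_answer(answer, answer)  # the correct answer passes
```

No third party sees the user, nothing is loaded cross-origin, and it only has to be solved once at signup rather than on every action.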