Thursday, May 29, 2025

MacInTouch Paused

Ric Ford:

Website traffic is overwhelmingly dominated now by “bots” executing sophisticated cyberattacks and sucking up every scrap of content; only a tiny fraction of our traffic comes from legitimate human visitors. Unfortunately, these rampant and rising abuses and attacks drive rising server costs, and there’s no practical way to stop them — they originate from networks at Microsoft, Amazon, Oracle, Tencent, Russia, hosting companies, proxies, and limitless other networks everywhere in the world.

I personally need to stop and take a break for a while to re-assess priorities and approaches going forward. I’m putting macintouch.com on pause in an attempt to stem the rising costs, but I’ll note that tidbits.com offers an alternative with similar history and values.

I remain enormously grateful for the wonderful support and collaboration of the MacInTouch community over these past decades, regardless of the murky future we’re all facing. Thank you.

Miguel Arroz:

I’ve been following MacInTouch for… decades… I don’t even know any more. Sad to see the site being paused. I’m hoping Ric brings it back sometime in the future, but whatever his decision is, I’m thankful for many, many years of great content about the Macintosh and Apple.

Same.

Adam Engst:

I understand all too well what he’s going through, and I wish him the best of luck in figuring out his next steps.

[…]

Our hosting plans don’t have any visit-based limits, so I only worry about bandwidth, and since we use Cloudflare for caching and bot protection, that’s generally not a huge issue. The big win recently was switching to Cloudflare’s bot prevention to block what could literally be hundreds of spambot-created accounts per day.

I’ve had intermittent problems with bots but so far have been able to avoid adding Cloudflare.

Adam Tow:

When I left The Wall Street Journal in 2014, one of my last tasks was to ensure all the article links remained active, even as the front pages redirected to WSJ’s tech section. Eleven years later, many of those links still work. Some embedded videos are gone, but the core content has largely survived.

The same cannot be said (right now) of Macintouch. With its pause, past articles, such as this one, now return 404 Page Not Found errors. It’s yet another reminder of the impermanence of the internet. Beloved, long-running sites can vanish overnight, taking decades of knowledge with them.

And don’t forget the forums.


6 Comments


We inch ever closer to making the dead internet theory a reality.


I've had MacInTouch in my bookmarks for...25+ years? While Ric and the site haven't produced much essential coverage for the better part of a decade, it still makes me sad to lose another of the old guard of the Mac community.


At the risk of sounding uncharitable, I really wonder if Ric's problems were caused by his use of Wordfence (or WordPress more generally). Wordfence has had *issues* over the years. Macintouch was always snappy and responsive when he wasn't using it; then it would be enabled again, and it would blanket-ban VPN connections. What are bots, Russians, etc. seeking on Macintouch specifically that they aren't getting from every other website in the world? I find it hard to imagine what sort of honeypot Macintouch would have been, compared to any of the bigger, less "one guy's project" sites.

Sad to see it go, but would be sadder if the cause was in part self-inflicted.


Hacking of websites has become fully automated. Every site will be sniffed, again and again and again, and I bet that most of the time no human is even aware of it on either side.

Only big targets will have humans doing clever stuff. The rest is just bots trying shit.

It's a bloody nuisance. WordPress can be protected just like any other CMS, by the way. I wouldn't say it's either better or worse than any of the big names.


"what are bots, and Russians etc seeking on Macintouch specifically that they aren't getting from every other website in the world"

They don't care. If you look at reports from other website owners, you'll see that many of these AI crawlers are extremely poorly written. They'll hit the same website thousands of times per second, re-read the same address multiple times, follow every link regardless of its appearance, check robots.txt for disallowed links specifically to find more links they can scrape, and so on.

People's response is usually to implement proof-of-work checks like Anubis, which seems to work somewhat well for now.
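Anubis-style schemes are essentially hashcash applied to page views: the server hands the browser a random challenge, and the browser must burn CPU finding a nonce whose hash meets a difficulty target before it gets the page. That cost is negligible for one human visit but adds up fast for a crawler hammering thousands of URLs. A minimal Python sketch of the general idea, not Anubis's actual protocol (the difficulty value and function names here are illustrative):

```python
import hashlib
import os

# Difficulty: the SHA-256 digest of (challenge + nonce) must begin with
# this many zero bits. Real deployments tune this; 16 is illustrative
# (~65,000 hash attempts on average per page load).
DIFFICULTY_BITS = 16

def verify(challenge: bytes, nonce: int) -> bool:
    """Server-side check: cheap, a single hash."""
    digest = hashlib.sha256(challenge + nonce.to_bytes(8, "big")).digest()
    # Digest passes if its top DIFFICULTY_BITS bits are all zero.
    return int.from_bytes(digest, "big") >> (256 - DIFFICULTY_BITS) == 0

def solve(challenge: bytes) -> int:
    """Client-side work: brute-force a nonce. In real systems this runs
    as JavaScript in the visitor's browser."""
    nonce = 0
    while not verify(challenge, nonce):
        nonce += 1
    return nonce

if __name__ == "__main__":
    challenge = os.urandom(16)  # server issues a fresh challenge per visit
    nonce = solve(challenge)
    print(verify(challenge, nonce))
```

The asymmetry is the point: verification is one hash for the server, while solving costs thousands, so a scraper re-fetching the same site thousands of times per second pays the toll on every request.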


Best thing I ever did was move to a fully static site setup with Cloudflare in front. Costs are negligible, scales to infinity and I never have to worry about hacks. But I get that it wouldn’t be applicable to everyone.
