Mia Sato:
Packed with SEO keywords and phrases and generated using artificial intelligence tools, the Get Bullish store blog posts act as a funnel for consumers coming from Google Search, looking for things like Mother’s Day gifts, items with swear words, or gnome decor. On one hand, shoppers can peruse a list of products for sale — traffic picks up especially around holidays — but the words on the page, Dziura says, are not being read by people. These blogs are for Google Search.
[…]
It’s a universal experience for small business owners who’ve come to rely on Google as a major source of traffic and customers. But it’s also led to the degradation of Google’s biggest product, Search, over time. The problem is poised to only continue to spiral as business owners, publishers, and other search-reliant businesses increasingly use artificial intelligence tools to do the search-related busywork. It’s already happening in digital media — outlets like CNET and Men’s Journal have begun using generative AI tools to produce SEO-bait articles en masse. Now, online shoppers will increasingly encounter computer-generated text and images, likely without any indication of AI tools.
In April, e-commerce company Shopify — which is used by millions of businesses, including toymaker Mattel and Kim Kardashian’s brand, Skims — launched an AI tool that allows businesses to generate product descriptions using keywords. Other AI companies offer tools that generate entire websites using automation tools, filling sites with business names, fake customer testimonials, and images for less than the price of lunch.
[…]
“If I made a blog post that was just what you would want as a person — ‘Here are 25 gift items under $25,’ [added] a picture of each one, a price, and a link — Google would not like it. Google would hate that list,” she says. “So here we are with all this text that is written only for a search engine.”
Via Nick Heer:
The sharp divergence between writing for real people and creating material for Google’s use has become so obvious over the past few years that it has managed to worsen both Google’s own results and the web at large. The small business owners profiled by Sato are in an exhausting fight with automated chum machines generating supposedly “authoritative” articles. When a measure becomes a target — well, you know.
[…]
At the same time, I have also noticed a growing number of businesses — particularly restaurants — with little to no web presence. They probably have a listing in Apple Maps and Google Maps, an Instagram page, and a single-page website, but that could be their entire online presence.
Previously:
Artificial Intelligence ChatGPT Google Search Search Web
Abner Li:
In an unexpected announcement today, Google Domains is “winding down following a transition period,” with Squarespace taking over the business and assets.
[…]
Google cited “efforts to sharpen our focus” in selling the Google Domains registrar business, which launched in 2014 and has lately been a big proponent of HTTPS and new top-level domains (TLDs). The service exited beta in 2022.
Gergely Orosz (Hacker News):
This is 10 million domains sold. Millions of customers like me learn again (and again!) that you cannot trust Google to keep their own products alive.
Show me another vendor that throws away customers like this…
[…]
Also: as a Google Domains customer, why am I hearing about this news from Squarespace’s press release?
Google (my provider, and where I’m a customer) has not told me this is happening.
John Gruber:
I had an alert that the new domain name I registered would be put on “hold” today if I didn’t verify the email address I used to register it. I hadn’t seen any verification emails from Google about it. I told Google Domains to send another verification email. Still didn’t see it. Turns out that even though I used an @gmail.com address to register the domain, every single email from Google Domains was being flagged as spam. So Google’s own email service considers all emails from Google’s own domain name service to be spam.
David Heinemeier Hansson:
Google will eventually kill every single service you care about, if they can’t find a way to directly monetize it with ads at a scale of billions. They’re institutionally incapable of being in the product or service business, because neither products nor services butter Google’s bread. Advertisement does.
See also: Hacker News.
Previously:
Acquisition Domain Name System (DNS) Google Squarespace Sunset Web
WebKit:
In Safari 17, Private Browsing gets even more private with added protection against some of the most advanced techniques used to track you. Technical changes include:
- Adding blocking for known trackers and fingerprinting.
- Adding support for mitigating trackers that map subdomains to third-party IP addresses.
- Adding blocking for known tracking query parameters in links.
- Adding noise to fingerprintable web APIs.
- Adding console log messages when blocking requests to known trackers.
- Adding support for blocking trackers that use third-party CNAME cloaking.
- Adding support for Private Click Measurement for direct response advertising, similar to how it works for in-app direct response advertising.
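The “adding noise to fingerprintable web APIs” item in the list above can be pictured with a small sketch. This is a conceptual illustration only, not WebKit’s actual implementation; the function name and the 5% flip rate are invented for the example.

```typescript
// Conceptual sketch (not WebKit's code): perturb a canvas readback slightly so
// the same device no longer produces a byte-identical fingerprint across sites.
function addCanvasNoise(pixels: Uint8ClampedArray): Uint8ClampedArray {
  const noisy = new Uint8ClampedArray(pixels);
  for (let i = 0; i < noisy.length; i += 4) {
    if (Math.random() < 0.05) {
      noisy[i] ^= 1; // flip the low bit of the red channel for this pixel
    }
  }
  return noisy;
}
```

The change is imperceptible to users but enough to break hash-based fingerprints that depend on exact pixel values.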
Benjamin Mayo:
Adding tracking parameters to links is one way advertisers and analytics firms try to track user activity across websites. Rather than storing third-party cookies, a tracking identifier is simply added to the end of the page URL. This would circumvent Safari’s standard intelligent tracking prevention features that block cross-site cookies and other methods of session storage.
Navigating to that URL allows an analytics or advertising service at the destination to read the URL, extract those same unique parameters, and associate them with its backend user profile to serve personalized ads.
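As a rough illustration of the link-decoration pattern Mayo describes, here is a hedged TypeScript sketch. The parameter name xclid and the collection endpoint are placeholders, not any real vendor’s parameter or API.

```typescript
// Hypothetical sketch of link decoration; "xclid" and the endpoint are made up.

// Referring site: append a per-user click identifier to an outbound link.
function decorateLink(destination: string, clickId: string): string {
  const url = new URL(destination);
  url.searchParams.set("xclid", clickId);
  return url.toString();
}

// Destination site: read the identifier back out and report the visit.
function reportClickId(): void {
  const clickId = new URLSearchParams(window.location.search).get("xclid");
  if (clickId !== null) {
    void fetch("https://analytics.example/collect", {
      method: "POST",
      body: JSON.stringify({ clickId, page: window.location.pathname }),
    });
  }
}
```

No cookie is involved, which is why this survives cookie-based tracking prevention and has to be addressed at the URL level instead.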
Cory Underwood:
Many of these items are already handled by WebKit’s Tracking Prevention, which historically hasn’t leveraged the full suite of capabilities when used in Private Mode, as nothing in private mode was persisted beyond the tab being closed.
[…]
I also believe that the blocking for known trackers is likely to leverage the tracker list provided by DuckDuckGo, in much the same way that IP Address Obscurification (released in iOS 15) works today. This is new behavior: previously the identified domains would be routed across the internet to mask the user’s IP address, but now they will be blocked at the network layer and the external resource won’t be loaded, similar to what Brave’s Shield technology does today. This may cause website features to fail unless you have designed the site to fail gracefully. There is a high likelihood that this will affect attribution and analytics platforms and prevent them from being loaded in Private Browsing instances.
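Underwood’s point about failing gracefully can be illustrated with a small sketch. The analyticsQueue global and trackEvent helper are hypothetical names, not a real analytics SDK; the idea is simply to guard calls into a script that may never have loaded.

```typescript
// Hypothetical sketch: guard analytics calls so the page keeps working when a
// blocked tracker script never loads. "analyticsQueue" is an invented global.
declare global {
  interface Window {
    analyticsQueue?: { push: (event: object) => void };
  }
}

export function trackEvent(name: string, data: Record<string, unknown> = {}): void {
  // If the third-party script was blocked (e.g. in Private Browsing), optional
  // chaining turns this into a no-op instead of a thrown TypeError.
  window.analyticsQueue?.push({ name, data, ts: Date.now() });
}
```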
Jeff Johnson:
In particular there’s a QUERY_PARAM.wplist file that contains a list of URL query parameters to be removed. This list currently comprises 25 of the usual suspects, click identifiers such as fbclid (Facebook), gclid (Google), and msclkid (Microsoft).
Incidentally, most of these parameters, and others not on Apple’s list, are already automatically removed by my Safari extension StopTheMadness.
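For illustration, here is a stripped-down sketch of what removing listed parameters from a link could look like. The three-entry list only echoes the identifiers quoted above, not Apple’s full 25-entry wplist, and this is not Apple’s or StopTheMadness’s actual code.

```typescript
// Minimal sketch of query-parameter stripping in the spirit of Apple's list.
const KNOWN_TRACKING_PARAMS = ["fbclid", "gclid", "msclkid"];

function stripTrackingParams(link: string): string {
  const url = new URL(link);
  for (const param of KNOWN_TRACKING_PARAMS) {
    url.searchParams.delete(param);
  }
  return url.toString();
}

// stripTrackingParams("https://example.com/page?gclid=abc&ref=news")
//   => "https://example.com/page?ref=news"
```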
[…]
I’ve been informed by a reliable source that Apple’s QUERY_PARAM.wplist list of tracking query parameters came from PrivacyTests.org.
See also: MacRumors.
Previously:
Update (2023-07-25): See also: Hacker News.
Francisco Tolmasky:
This Safari query parameter removal thing is just going to become a cat-and-mouse game, right? Google can easily start using gclid2 or even switch to using something like “a”, which you’d be less willing to indiscriminately chop off the URL. Eventually, you could even imagine a dynamic query key scheme, where the tracker can identify its key with a credit-card-style hash function. For example, if f(KEY) = (char1 + char2 + … + charN) % 64 = 39, then it knows that query key is its tracking ID.
[…]
My point is that you only feel safe mutating query parameters if you know for sure that they are tracking query parameters. You wouldn’t, for example, ever want to remove or mutate “q=”. That would break search. If trackers just start using less identifiable query keys, then it becomes very difficult to do anything to them without also potentially breaking legitimate websites.
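To make Tolmasky’s hypothetical concrete, here is a sketch of the checksum-recognized query key he describes. The modulus 64 and target value 39 come from his example; the function names and everything else are invented for illustration.

```typescript
// Sketch of a rotating query key that identifies itself by checksum rather
// than by a fixed name like "gclid".
function isTrackingKey(key: string): boolean {
  const sum = [...key].reduce((acc, ch) => acc + ch.charCodeAt(0), 0);
  return sum % 64 === 39;
}

// The tracker (not the browser) scans an incoming URL for any key passing the
// checksum; the key name itself can change freely between campaigns.
function findTrackingId(link: string): string | null {
  for (const [key, value] of new URL(link).searchParams) {
    if (isTrackingKey(key)) {
      return value;
    }
  }
  return null;
}
```

A browser can’t safely strip keys recognized only by such a rule without risking collisions with legitimate parameters, which is exactly the difficulty Tolmasky raises above.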
iOS iOS 17 Mac macOS 14 Sonoma Privacy Safari StopTheMadness Top Posts Web
Asmodee (in 2021):
Asmodee announces the acquisition of Board Game Arena (BGA), the digital multiplayer board game platform, to give BGA high visibility in the consumer market and accelerate the release of Asmodee successes long awaited by players.
Founded in 2010 by Grégory Isabelli and Emmanuel Colin, BGA has emerged as the global leader in online board games. The platform provides official online versions of more than 250 games, supported in 40 languages, to more than 5 million members around the world.
Dan Luu:
I wonder if it’s possible for a good platform to unseat BGA as the default place people play board games online within a decade. BGA offers a more mainstream-friendly UX than the major alternatives (BSW, yucata), but it’s comically bad on both performance and correctness (I frequently see blatant concurrency bugs that appear to come from not thinking about concurrency at all, serious rules errors go unfixed for years, etc.)
My feeling is that this is effectively impossible because you’d need to have hundreds of games to compete but, with how little money there is in online board games, the only way to get there is to let basically anyone implement games for your platform with little quality control, guaranteeing extremely buggy code.
[…]
When BGA got acquired, there was grumpiness about how the owners made $$ from volunteer labor for an indie site.
Some predicted this would cause mass migration to yucata but most devs doing free labor for a platform just want to be on the biggest platform and most users want some combination of the biggest platform and the least janky UX[…]
Previously:
Acquisition Board Game Board Game Arena Business Web