Thomas Ricouard:
With Mastodon, it was time: I could finally make my own social network app, and with iOS 16 and all the great new SwiftUI APIs that came with it, the timing was perfect.
[…]
The pinning and reading remote timelines feature shipped as part of the initial release, and I received positive feedback about it every day. I know I’m not the first to do it; other apps were doing it before. However, making it a core feature, placing it in front of the user, and keeping it easy to use really helped raise awareness of it.
[…]
The packages are split by domains and features. There is very little code in the app itself; everything is self-contained in its own package. This makes it easier to test (even if, for now, there are barely any tests) and faster to work at the package level, with SwiftUI previews, shorter build times, and so on.
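A minimal sketch of what that kind of domain/feature split can look like in a Package.swift. The target names here are illustrative, not the actual Ice Cubes manifest:

```swift
// swift-tools-version: 5.7
// Package.swift — a hypothetical domain/feature split, not the actual Ice Cubes layout.
import PackageDescription

let package = Package(
    name: "AppPackages",
    platforms: [.iOS(.v16)],
    products: [
        .library(name: "Models", targets: ["Models"]),
        .library(name: "Network", targets: ["Network"]),
        .library(name: "Timeline", targets: ["Timeline"]),
    ],
    targets: [
        // Domain packages: plain types and API clients, no UI.
        .target(name: "Models"),
        .target(name: "Network", dependencies: ["Models"]),
        // Feature packages: self-contained SwiftUI screens the app target just embeds,
        // so previews and tests build only this slice of the app.
        .target(name: "Timeline", dependencies: ["Models", "Network"]),
        .testTarget(name: "TimelineTests", dependencies: ["Timeline"]),
    ]
)
```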
[…]
It has one main view, one view model, and then it’s composed of small, targeted subviews. […] The idea is to connect and do the minimum amount of updating possible in those subviews, to keep updates while scrolling to a minimum (actually next to none in the case of scrolling a list of statuses). This played a big part in improving performance while scrolling the timeline in the 1.5.X versions of the app.
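A rough sketch of that shape, with hypothetical types standing in for the real ones: the row is composed of small leaf views that take plain values, so SwiftUI only invalidates a subview when the value it actually displays changes:

```swift
import SwiftUI

// Illustrative types, not the actual Ice Cubes code.
struct Status: Identifiable {
    let id: String
    let author: String
    let content: String
}

@MainActor
final class TimelineViewModel: ObservableObject {
    @Published var statuses: [Status] = []
}

// One main view observes the view model…
struct TimelineView: View {
    @StateObject private var viewModel = TimelineViewModel()

    var body: some View {
        List(viewModel.statuses) { status in
            StatusRowView(status: status)
        }
    }
}

// …while the row and its subviews take plain values: no observation,
// so nothing in them is re-evaluated just because the list scrolls.
struct StatusRowView: View {
    let status: Status

    var body: some View {
        VStack(alignment: .leading) {
            StatusAuthorView(author: status.author)
            StatusContentView(content: status.content)
        }
    }
}

struct StatusAuthorView: View {
    let author: String
    var body: some View { Text(author).font(.headline) }
}

struct StatusContentView: View {
    let content: String
    var body: some View { Text(content) }
}
```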
The repo is here.
Ice Cubes, iOS, iOS 16, iOS App, Mastodon, Model-View-ViewModel (MVVM), Open Source, Programming, Swift Programming Language, SwiftUI
Tom Hormby (in 2010, via Dave Mark):
Steve Jobs hired Sakoman in 1984 to help work on a laptop version of the Macintosh after the successful release of the HP Portable. When Jobs left Apple, these laptop plans were scrapped, and Sakoman helped lead the teams creating the Mac Plus, Mac SE, and Mac II.
He found the work uninteresting, however. He wanted to leave Apple to work on handheld computers, and he recruited Jean-Louis Gassée to lead a brand-new company that would be bankrolled by Lotus founder Mitch Kapor. The plan fell through, since it appeared that Apple would probably sue the nascent company.
To keep the talented Sakoman from defecting, Gassée proposed creating a skunk works project to create an Apple handheld computer. Gassée got permission to start the project from Sculley (without telling him what was being researched), and Sakoman set to work.
[…]
Sakoman’s end goal for Newton was to create a tablet computer priced about the same as a desktop computer. It would be the size of a folded A4 sheet of paper and would have cursive handwriting recognition and a special user interface.
However, they ended up focusing on the smaller and cheaper Junior model.
Handwriting, History, Newton
Adam Chalmers (via Jim Rea):
My computer science degree had taught me all about algorithms, data structures, type systems and operating systems. It had not taught me about containers, or ElasticSearch, or Kubernetes. I don’t think I even wrote a single YAML file in my entire degree.
[…]
This article is aimed at engineers who need to deploy their code using Kubernetes, but have no idea what Kubernetes is or how it works. I’m going to tell you a story about a junior engineer. We’re going to follow this engineer as they build a high-quality service, and when they run into problems, we’ll see how Kubernetes can help solve them.
[…]
Kubernetes exists to solve one problem: how do I run m containers across n servers?
Its solution is a cluster. A Kubernetes cluster is an abstraction. It’s a big abstract virtual computer, with its own virtual IP stack, networks, disk, RAM and CPU. It lets you deploy containers as if you were deploying them on one machine that didn’t run anything else. Clusters abstract over the various physical machines that run the cluster.
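As a concrete illustration (a generic manifest, not one from the article), the “run m containers across n servers” request is what a Deployment expresses: you declare how many replicas of a container you want, and the cluster decides which machines run them:

```yaml
# A minimal Deployment: "run 3 replicas of this container somewhere in the cluster."
# The scheduler, not you, picks which of the n servers each replica lands on.
# Names and image are hypothetical.
apiVersion: apps/v1
kind: Deployment
metadata:
  name: my-service
spec:
  replicas: 3
  selector:
    matchLabels:
      app: my-service
  template:
    metadata:
      labels:
        app: my-service
    spec:
      containers:
        - name: my-service
          image: example.com/my-service:1.0.0
          ports:
            - containerPort: 8080
```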
Kubernetes, Programming, Web
Dan Moren:
Once upon a time, Apple offered its lightest notebook in two sizes: the 13-inch it sells today and a smaller 11-inch model. Alas, only the good die young; the 11-inch Air was discontinued in 2016, and Apple’s other small laptop (and putative Air replacement), the 12-inch MacBook, followed in 2019.
Nowadays, the smallest Mac laptop you can get is that 13-inch Air, and while it’s shrunk down to be a bit closer to the 11-inch in many dimensions, it’s still larger and heavier than both of those discontinued models. That’s a shame, because a small, light laptop has a lot going for it.
To me, this is the biggest surprise of the Apple Silicon transition. A lot of people expected something like this right out of the gate. Aside from the butterfly keyboard, the knock against the 12-inch MacBook was that it was too slow. The M1 processor, or even one of the recent A-series ones, would seem to be the solution. Apple kept saying that making their own processors would let them make Macs that were not possible with Intel. Yet, after years of 11- and 12-inch MacBooks with Intel processors, we’ve seen two generations of Apple Silicon MacBook Airs that start at 13 inches.
Apple M1, ARM Macs, Mac, MacBook
Will Yager:
Every time I tried to take a picture of the engraved text, the picture on my phone looked terrible! It looked like someone had sloppily drawn the text with a paint marker. What was going on? Was my vision somehow faulty, failing to see the rough edges and sloppy linework that my iPhone seemed to be picking up?
[…]
Well, I noticed that when I first take the picture on my iPhone, for a split second the image looks fine. Then, after some processing completes, it’s replaced with the absolute garbage you see here.
[…]
Significantly more objectionable are the types of approaches that impose a complex prior on the contents of the image. This is the type of process that produces the trash-tier results you see in my example photos. Basically, the image processing software has some kind of internal model that encodes what it “expects” to see in photos. This model could be very explicit, like the fake moon thing, an “embodied” model that makes relatively simple assumptions (e.g. about the physical dynamics of objects in the image), or a model with a very complex implicit prior, such as a neural network trained on image upscaling. In any case, the camera is just guessing what’s in your image. If your image is “out-of-band”, that is, not something the software is trained to guess, any attempts to computationally “improve” your image are just going to royally trash it up.
Via Nick Heer:
This article arrived at a perfect time as Samsung’s latest flagship is once again mired in controversy over a Moon photography demo. Marques Brownlee tweeted a short clip of the S23 Ultra’s one-hundredfold zoom mode, which combines optical and digital zoom and produces a remarkably clear photo of the Moon. As with similar questions about the S21 Ultra and S22 Ultra, it seems Samsung is treading a blurry line between what is real and what is synthetic.
[…]
Another reason it is so detailed is that Samsung specifically trained the camera to take pictures of the Moon, among other scenes.
Update (2023-03-14): ibreakphotos (MacRumors):
So, while many have tried to prove that Samsung fakes the moon shots, I think nobody succeeded - until now.
[…]
The moon pictures from Samsung are fake, and Samsung’s marketing is deceptive. It is adding detail where there is none (in this experiment, it was intentionally removed). In this article, they mention multi-frames and multi-exposures, but the reality is that it’s AI doing most of the work, not the optics; the optics aren’t capable of resolving the detail that you see.
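For context, the experiment behind this claim reportedly downsized a sharp Moon photo to 170×170 pixels, gaussian-blurred it to destroy the remaining detail, displayed the result on a monitor, and photographed that; the phone still produced a crisp Moon. A rough sketch of the degradation step, using Core Image with illustrative parameters (the function and values are mine, not from the original post):

```swift
import CoreImage
import CoreImage.CIFilterBuiltins

// Strip detail from a source moon photo before re-photographing it:
// downscale hard, then gaussian-blur what's left. Any texture the phone
// later shows beyond this cannot have come through the lens.
func degrade(_ image: CIImage) -> CIImage {
    // Shrink to a fraction of the original size (the experiment used 170×170).
    let tiny = image.transformed(by: .init(scaleX: 0.1, y: 0.1))
    let blur = CIFilter.gaussianBlur()
    blur.inputImage = tiny
    blur.radius = 4
    // Blur extends the image extent, so crop back; force-unwrap is fine for a sketch.
    return blur.outputImage!.cropped(to: tiny.extent)
}
```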
Via John Gruber:
Have to say I’m surprised both Raymond Wong and Marques Brownlee were taken in by this. These “amazing” moon photos seem impossible optically, and, more tellingly, no one is able to get these Samsung phones to capture similarly “amazing” 100× zoom images of random objects that aren’t the moon.
It’s strange, since Brownlee says “It’s not an overlay,” but then references the Huawei controversy, where it was established that the details came from an ML model rather than a bitmap overlay. So he knows that Samsung might be doing the same thing, yet he seems to assume it’s just a great lens.
Nick Heer:
Samsung has explained how its camera works for pictures of the Moon, and it is what you would probably expect: the camera software has been trained to identify the Moon and, because it is such a predictable object, it can reliably infer details which are not actually present. Whether these images and others like them are enhanced or generated seems increasingly like a distinction without a difference in a world where the most popular cameras rely heavily on computational power to make images better than what the optics are capable of.
Update (2023-03-16): Samsung (via Hacker News):
As part of this, Samsung developed the Scene Optimizer feature, a camera functionality which uses advanced AI to recognize objects and thus deliver the best results to users. Since the introduction of the Galaxy S21 series, Scene Optimizer has been able to recognize the moon as a specific object during the photo-taking process, and applies the feature’s detail enhancement engine to the shot.
When you’re taking a photo of the moon, your Galaxy device’s camera system will harness this deep learning-based AI technology, as well as multi-frame processing in order to further enhance details.
Update (2023-03-21): John Gruber:
There’ve been a couple of follow-ups on this since I wrote about it a few weeks ago. Marques Brownlee posted a short video, leaning into the existential question of the computational photography era: “What is a photo?” Input’s Ray Wong took umbrage, in this Twitter thread, at my having said he’d been “taken” by Samsung’s moon photography hype.
Samsung’s phones are rendering the moon as it was, at some point in the past when this ML model was trained.
And that’s where Samsung steps over the line into fraud. Samsung, in its advertisements, is clearly billing these moon shots as an amazing feature enabled by its 10× optical / 100× digital zoom telephoto camera lens. They literally present it as optically superior to a telescope. That’s bullshit. A telescope shows you the moon as it is. Samsung’s cameras do not.
Android, Camera, iOS, iOS 16, Photography, Samsung