Archive for March 2017
Monday, March 20, 2017
Rich Trouton notes that Parallels Desktop Lite is now available in the Mac App Store. This is possible because it uses the Hypervisor framework instead of kernel extensions. So it runs in the sandbox, and you get these weird little alerts—which don’t look official, but I guess they must be—that it wants access to various parts of your system, such as “EFI”.
It’s free to run Mac and Linux VMs, and there’s a $60 In-App Purchase to run Windows.
None of the limitations compared with the Standard Edition seem to affect me. It was easy to install both macOS 10.12 and Mac OS X 10.11 by dragging and dropping the installer app—no need to create a bootable installer volume. After installing the Parallels Tools, I could copy and paste back and forth with the host OS.
It feels a bit slower than VMware to me.
Previously: VMware Fusion, Parallels Desktop 9 and Parallels Access, Turning Off Ads in Parallels.
Mark Gurman (Hacker News):
Tim Cook has talked up a lot of technologies since becoming Apple Inc.’s chief executive in 2011. Driverless cars. Artificial intelligence. Streaming television. But no technology has fired up Cook quite like augmented reality, which overlays images, video and games on the real world. Cook has likened AR’s game-changing potential to that of the smartphone. At some point, he said last year, we will all “have AR experiences every day, almost like eating three meals a day. It will become that much a part of you.”
The technology is cool, but I just don’t see AR glasses being useful to most people most of the time.
People with knowledge of the company’s plans say Apple has embarked on an ambitious bid to bring the technology to the masses—an effort Cook and his team see as the best way for the company to dominate the next generation of gadgetry and keep people wedded to its ecosystem.
Too bad there’s no bloody ROI in an ambitious bid to fix their bug-ridden software.
But Apple really has no choice, says Gene Munster, a founding partner at Loup Ventures who covered the company for many years as an analyst. Over time, Munster says, AR devices will replace the iPhone. “It’s something they need to do to continue to grow,” he says, “and defend against the shift in how people use hardware.”
I don’t believe that.
Pierre Lorenzi (via Stephen Hackett):
HyperCardPreview is a Mac OS X application that can display HyperCard stacks, with a look very faithful to the original. It makes the stack files alive again in the Finder with their real icons, so they don’t appear as “binaries”, and provides a QuickLook plug-in.
It can only visualize the stacks: it can’t modify or execute them. But it lets the user inspect the stacks, backgrounds, cards, buttons, and fields, see their scripts, and get their text contents.
It’s written in Swift, without using any deprecated APIs.
Dan Counsell (tweet, Hacker News):
If Apple made a mini tower that was upgradable and could take a full sized graphics card (or two), I’d have purchased it in a heartbeat. However, they don’t. There’s no doubt that Apple has a refresh for the desktop market in the works, I just don’t know if it’s going to be enough to satisfy the creative market who seem to be slowly migrating to Windows.
Building a Hackintosh is not for everybody. It’s not a simple process: there is an overwhelming number of parts to choose from, and on top of this you need to pick the ones that are compatible. Once you’ve built the machine, you need to get macOS running on it, which is not a quick process. If you want to do it, do your research and take your time. I’d probably say this build took me around 8 hours from unboxing the components to getting macOS installed.
I’ve been running this machine for a couple of weeks now and I couldn’t be happier. It’s super fast and I can easily switch between Mac and Windows. I’ve switched off auto-updates in Sierra. While system updates should work just fine, I prefer to hold off until the community over at tonymacx86 have confirmed there are no issues. This is probably one of the major drawbacks to running a Hackintosh.
The performance this machine was able to achieve, at the price he paid, is staggering. On single-core tasks, it’s faster than any Mac Apple currently sells and, if you forgo all the bells and whistles, it can be built for about $1,800.
Previously: Video Pros Moving From Mac to Windows for High-End GPUs, Building My $1,200 Hackintosh.
Update (2017-03-20): See also: Stephen Hackett, Kirk McElhearn.
Terry Crowley (via Simon Willison):
Unfortunately, [FrontPage’s] Preview view ended up being a gift that kept on giving. The complexity it introduced had nothing to do with any failure in the initial programming of the feature. The challenges were that as we added new functionality, Preview required special consideration — additional specification — about how it should behave and interact with these new features.
When the Word team went to estimate the cost of these features, they came back with estimates that were many times larger than PowerPoint’s estimates. Some of this cost was because the PowerPoint code architecture was better structured to add these types of visual features. But the bulk of the growth in estimates was because Word’s feature set interacted in ways that made the specification (and hence the implementation) more complex and more costly.
What I found is that advocates for these new technologies tended to confuse the productivity benefits of working on a small code base (small N essential complexity due to fewer feature interactions and small N cost for features that scale with size of codebase) with the benefits of the new technology itself — efforts using a new technology inherently start small so the benefits get conflated.
The framework typically fails to evolve along the path required by the product — which leads to the general advice “you ship it, you own it”. This means that you eventually pay for all that code that lifted your initial productivity. So “free code” tends to be “free as in puppy” rather than “free as in beer”.
In fact, it was clear that [Google Docs was] using an asymmetric technical attack of leveraging their simplicity to deliver sharing and co-editing features. These features were clearly differentiated and would be immensely hard to deliver on top of the existing highly functional (large N) Office apps. As we started building the web apps, we needed to decide whether we were going to “walk away” from our own complexity like we had when we developed OneNote or embrace the existing complexity, with the costs and constraints that that would imply for future development.
Friday, March 17, 2017
Now repeat all the above, but this time rotate the image back to the way it was (that is, you’ll need to tap the rotate button three times). Tap Done.
You will now be able to zoom into the image via the pinch-expand gesture, and will be able to keep going all the way down to pixel detail – although the pixels will be blurred because of the way iOS handles zoom.
Katyanna Quach (via John Gruber):
Promos for Disney’s new Beauty and the Beast flick, released in American cinemas today, are being mixed into Google Home’s responses to questions. In a test with a Home owned by a Reg writer, the chatbot device started touting the kids movie in between telling the time and news headlines. This is with the default configuration: no opt-ins or opt-outs.
Google’s official response to this is absolutely pants-on-head mad:
This isn’t an ad; the beauty in the Assistant is that it invites our partners to be our guest and share their tales.
I don’t care how much storytelling tinsel an advertisement happens to be wrapped in — an ad is an ad.
I looked through Google’s support documentation for Home and even downloaded the app looking for anything that would specifically state that the device could be used for advertising. Nothing on the Google Home website implies that ads would start running on the device.
Update (2017-03-20): Nick Heer:
Disney may not have paid Google to tell Home users about their new movie, but that’s what it felt like to a lot of people.
Tim Hardwick (via Luc Vandal, Hacker News):
Previous star ratings given by users will be used to personalize their Netflix profiles, but the ability to rate a TV series or movie by awarding stars is set to disappear altogether, according to Variety.
According to Netflix, at one point subscribers had awarded over 10 billion 5-star ratings and more than half of all members had rated more than 50 titles. However, the company eventually concluded that star ratings had become less relevant, with some users giving documentaries 5 stars and silly movies just 3 stars, even though they would watch the silly movies more often than the highly rated documentaries.
“We are addicted to the methodology of A/B testing,” [Netflix’s Todd] Yellin said. The result was that thumbs got 200% more ratings than the traditional star-rating feature.
I don’t think people agree on what a rating is supposed to mean, anyway. Secondly, the thumbs aren’t really a binary system because you can choose not to rate at all. So it seems to me that they’re effectively just getting rid of the 2- and 4-star options.
Update (2017-03-19): Jeff Johnson:
I think movies should be rated by how many times you’d watch them. 0 if you regret watching, 1 for a decent movie, and going up to infinity.
Update (2017-03-20): John Gruber:
For a personally curated collection, 5-star ratings can be meaningful. But for a recommendation service that averages ratings among all users, they are not. It’s the difference between designing for the ideal case of how people should behave versus designing for the practical case of how people actually behave.
Binary ratings make a lot more sense in certain contexts, and with YouTube, it’s a natural fit. You don’t rate a movie on YouTube; you generally rate a cat video, a TED Talk, or something short.
I disagree that this type of rating will work on Netflix.
What I didn’t expect was how awful the text looked on it. I hooked up the monitor to the MBP using my Apple TV HDMI cable. The text was unreadable. I use similar TV-style monitors for my main system and they display text just fine. However, I’m using normal display ports and cables for my mini. This is the first time I’ve gone HDMI direct.
All the searches lead to this ruby script. The script builds a display override file containing a vendor and product ID with 4:4:4 RGB color support. The trick lies in getting macOS to install, read, and use it properly. That’s because you can’t install the file directly to /System/Library/Displays/Contents/Resources/Overrides/ in modern macOS. Instead, you have to disable “rootless”.
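For reference, the override the script produces is just a small plist dropped into a vendor/product-ID-named folder. The sketch below shows the general shape; the vendor and product IDs are hypothetical, and the actual patched EDID bytes come from your specific display, so the data value here is only a placeholder:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
	<key>DisplayProductName</key>
	<string>Display with forced RGB mode (EDID override)</string>
	<!-- hypothetical IDs; the script reads the real ones from your display's EDID -->
	<key>DisplayVendorID</key>
	<integer>4268</integer>
	<key>DisplayProductID</key>
	<integer>701</integer>
	<!-- placeholder; the script emits the patched EDID bytes here -->
	<key>IODisplayEDID</key>
	<data>AP///placeholder</data>
</dict>
</plist>
```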
Daniel Jalkut (tweet):
So, as a rule, Swift programmers who want to be advanced debuggers on iOS or Mac platforms, also need to develop an ability for mapping Swift method names back to their Objective-C equivalents. For a method like UIView.layoutSubviews, it’s a pretty direct mapping back to “-[UIView layoutSubviews]”, but for many methods it’s nowhere near as simple.
If the API has been rewritten using one of these rules, it’s almost certain that the Swift name of the function is a subset of the ObjC method name. You can probably leverage the regex matching ability of lldb to zero in on the method you want to set a breakpoint on[…]
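To make the mapping concrete, here’s a rough sketch (illustrative Python, not part of Jalkut’s post) of turning a Swift-style method reference into an lldb `--func-regex` pattern. The optional category-name group is my assumption about how the Objective-C names can vary:

```python
import re

def objc_breakpoint_regex(swift_name: str) -> str:
    """Build an lldb --func-regex pattern from a Swift-style method
    reference like 'UIView.layoutSubviews'. The Swift name is usually a
    subset of the Objective-C selector, so a loose pattern anchored on the
    class name and the first selector piece is often enough."""
    type_name, _, method = swift_name.partition(".")
    # Match instance (-) or class (+) methods, with an optional
    # category name, e.g. "-[UIView(Geometry) layoutSubviews]".
    return r"[-+]\[%s(\(\w+\))? %s" % (re.escape(type_name), re.escape(method))

# The resulting pattern would be passed to lldb as:
#   breakpoint set --func-regex '<pattern>'
pattern = objc_breakpoint_regex("UIView.layoutSubviews")
assert re.search(pattern, "-[UIView layoutSubviews]")
```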
Thursday, March 16, 2017
Itai Ferber et al. (via Swift Evolution):
Foundation’s current archival and serialization APIs (NSPropertyListSerialization, etc.), while fitting for the dynamism of Objective-C, do not always map optimally into Swift. This document lays out the design of an updated API that improves the developer experience of performing archival and serialization in Swift.
- It aims to provide a solution for the archival of Swift struct and enum types
- It aims to provide a more type-safe solution for serializing to external formats, such as JSON and plist
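The shape of the proposal — types declare what they serialize, and decoding fails loudly on a type mismatch — can be loosely sketched outside Swift. This Python analogy is mine, not the proposal’s API, and the `Player` type is invented:

```python
import json
from dataclasses import dataclass, asdict, fields

@dataclass
class Player:
    name: str
    level: int

def decode(cls, data: str):
    """Decode JSON into a dataclass, raising on a type mismatch --
    roughly the kind of guarantee the Swift proposal aims for, in
    contrast to stringly-typed NSCoding-style archival."""
    raw = json.loads(data)
    for f in fields(cls):
        if not isinstance(raw.get(f.name), f.type):
            raise TypeError(f"{f.name} should be {f.type.__name__}")
    return cls(**raw)

encoded = json.dumps(asdict(Player(name="mj", level=3)))
assert decode(Player, encoded) == Player(name="mj", level=3)
```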
Daniel Rubino (via Steve Lubitz):
The Precision 5520 was supplied by Dell for this review. It features a 4K IGZO display, 32GB of RAM, an Intel Xeon processor, NVIDIA Quadro graphics, and a 512GB Class 50 solid-state drive (SSD).
The starting price for the Precision 5520 is $1,399. The unit we evaluated in this review retails for $2,867.
It also has room for a second hard drive or SSD, real function keys, a T arrow key arrangement, and a trackpad that clicks. It is about the same weight as the 2012 Retina MacBook Pro and gets the same battery life (7 hours).
The most similar Mac is a $3,099 MacBook Pro with a lower resolution display, 16 GB of RAM (max), a 2.9 GHz i7, and a Radeon Pro 460.
Previously: The Curious State of Apple Product Pricing, New MacBook Pros and the State of the Mac.
Jacek Suliga (via Peter Steinberger):
When LinkedIn embraced Swift and rebuilt its flagship iOS app using (mostly) Swift, I spent a good amount of time profiling the new compiler characteristics. At that time (mid 2015), it was clear that the compiler was CPU-bound (I/O and memory had no visible impact on build speed). However, the build times were determined primarily by the number of cores and threads available to the compiler, more so than by the CPU clock speed.
Based on this analysis, we decided to use 12-core Mac Pros for core development, and we saw a significant (2-3 times) speedup of our builds compared with quad-core laptops.
As you can see, the 12-core Mac Pro is indeed the slowest machine to build our code with Swift, and going from the default 24 jobs setting down to only 5 threads improves compilation time by 23%. Due to this, even a 2-core Mac mini ($1,399.00) builds faster than the 12-core Mac Pro ($6,999.00).
For machines with more than 4 cores, consider reducing the number of concurrent jobs allowed for Xcode until the concurrency issue described above has been resolved.
Be careful with disabling Spotlight indexing for ~/Library/Developer as it will also exclude dSYM symbol files of builds and archives.
Xcode uses Spotlight to search for matching dSYM files when symbolicating crash reports. Turning off Spotlight indexing for them might break crash symbolication completely.
Update (2017-03-16): Joe Groff:
This smells like a hardware or kernel issue.
Blaming any particular component is premature. “There Is A Bug Exposed By Swift 3 Running on Sierra” is all we know.
Core count doesn’t seem to matter either, we observed the same issue on 4-core Xeon setups. Seems to be Xeon-specific.
The bug report is here.
See also: Colin Cornaby.
Robert Obryk and Jyrki Alakuijala:
Guetzli [guɛtsli] — cookie in Swiss German — is a JPEG encoder for digital images and web graphics that can enable faster online experiences by producing smaller JPEG files while still maintaining compatibility with existing browsers, image processing applications and the JPEG standard. From the practical viewpoint this is very similar to our Zopfli algorithm, which produces smaller PNG and gzip files without needing to introduce a new format; and different than the techniques used in RNN-based image compression, RAISR, and WebP, which all need client and ecosystem changes for compression gains at internet scale.
Guetzli specifically targets the quantization stage in which the more visual quality loss is introduced, the smaller the resulting file. Guetzli strikes a balance between minimal loss and file size by employing a search algorithm that tries to overcome the difference between the psychovisual modeling of JPEG's format, and Guetzli’s psychovisual model, which approximates color perception and visual masking in a more thorough and detailed way than what is achievable by simpler color transforms and the discrete cosine transform.
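The size/quality lever Guetzli is searching over can be shown with a toy model. This is not Guetzli’s algorithm — just a standard-library demonstration that coarser quantization discards detail and therefore compresses smaller, which is the tradeoff its psychovisual search navigates:

```python
import math
import zlib

# A toy stand-in for the DCT coefficients of an image: a smooth signal.
coeffs = [round(100 * math.sin(i / 3)) for i in range(1024)]

def quantize_and_compress(step: int) -> int:
    """Quantize with the given step, then compress; returns size in bytes.
    Coarser steps collapse the signal onto fewer symbols with longer
    repeats, so the entropy coder does better."""
    quantized = bytes((q // step) % 256 for q in coeffs)
    return len(zlib.compress(quantized))

fine, coarse = quantize_and_compress(1), quantize_and_compress(16)
assert coarse < fine  # coarser quantization yields a smaller "file"
```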
Previously: Brotli, Zopfli, JPEG Image Compression, Lepton Image Compression.
Wednesday, March 15, 2017
Given the preceding list, a strong case could have been made for Apple to price its new wireless headphones at $249, or even $299. The fact that Samsung priced its Gear IconX at $199 seemed to suggest a sub-$200 retail price for AirPods was unlikely. Instead, Apple sent shockwaves pulsing through the market by pricing AirPods at only $159. The action instantly removed all available oxygen from the wireless headphone space. The idea of Apple coming out with a new product that would underprice nearly every other competitor was unimaginable ten years ago.
At $269, Apple Watch Series 1 is one of the lowest-priced smartwatches worth buying in the marketplace. Attractive pricing was one key factor driving record Apple Watch sales this past holiday quarter. In fact, even the Apple Watch Series 2, at $349, is one of the lowest-priced smartwatches in its class[…]
However, it seems like most Mac models are not as competitively priced right now, Apple is focusing on the high end of the tablet market, and it sounds like iPhone prices will be expanding up rather than down.
Update (2017-03-17): See also: MacRumors.
AirPods are still showing a delivery estimate of “6 weeks”. Either demand remains unexpectedly strong or production remains unexpectedly difficult (or some combination of both).
Other code in the view controller switched on the message’s existence to determine which views to allocate, how to lay them out, and so on. The optionality of the property represents more than existence of a simple string. It had implications in the rest of the view controller and the rest of the view layer.
The more essential point here is that the string now no longer represents a string that might be there or might not — it now represents the presentation style or mode that the view controller is in. Was it created in the context of a push notification or through normal browsing? The answer is in this barely related property!
His solution is to replace it with an enum or a promise, both of which have other benefits besides clarity.
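A rough sketch of the enum version, transposed to Python with made-up names (the original is about a Swift view controller, but the shape is the same): each presentation mode is an explicit case, so nothing is smuggled through an optional’s nil-ness.

```python
from dataclasses import dataclass
from typing import Union

# Instead of an Optional[str] whose None-ness secretly encodes the mode,
# make the presentation mode explicit. All names here are hypothetical.

@dataclass
class FromNotification:
    message: str

@dataclass
class NormalBrowsing:
    pass

Mode = Union[FromNotification, NormalBrowsing]

def title(mode: Mode) -> str:
    # Each case is handled explicitly; there is no "is it nil?" guessing.
    if isinstance(mode, FromNotification):
        return f"Via notification: {mode.message}"
    return "Browsing"

assert title(FromNotification("New reply")) == "Via notification: New reply"
assert title(NormalBrowsing()) == "Browsing"
```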
A unique feature of Allo is that you can use Assistant while in the middle of a conversation with a friend. You could, for example, ask Assistant to search for restaurants in a certain area, while you’re talking to a friend about where to eat.
But Assistant isn’t perfect, and sometimes it responds with answers unrelated to questions at hand, or it will respond with an answer to an earlier question — and it’s then that it can inadvertently reveal a previous search query.
Google responded to our story: “We were notified about the Assistant in group chats not working as intended. We’ve fixed the issue and appreciate the report.”
Previously: Google Reneges on Allo Privacy Feature.
When Cocoa says Model-View-Controller it’s mostly trying to evoke the notion of Separated Presentation and Content in application design (the idea that the model and view should have a decoupled design and be loosely linked at construction). To be fair, it’s not just Cocoa that uses Model-View-Controller in this way: most modern uses of the term are really intended to convey Separated Presentation rather than the original Smalltalk-80 definition.
The important point to note is that the controller is the center of the object graph with most communication passing via the controller – distinct from the Smalltalk-80 version where the model was the center of the graph.
In my opinion, the critical failing for Cocoa Bindings remains the difficulty in adding custom transformations and custom properties. These could both be done but the work involved in registering transformers and exposing bindings dictionaries made it a tiresome affair. It always just seemed easier to pass the data through the view controller without bindings. This meant that Bindings tended to help the simplest problems (that didn’t need much help) but didn’t have much impact on the harder problems.
Since Cocoa Bindings in Mac OS X 10.3, there haven’t really been any clear attempts by Apple to alter the design pattern used for Cocoa apps.
The problem with using the same name is shadowing: since the names are the same, accessing the original definition is now hard. Again, this wouldn’t really be a problem if it weren’t for the fact that the old MVC solved exactly the kinds of problems that plague the new MVC.
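A minimal sketch of the Smalltalk-80 arrangement the article describes, where the model is the hub and views observe it directly (illustrative Python, not Cocoa API):

```python
# Smalltalk-style MVC: the model is the center of the object graph and
# notifies its observers itself; no controller mediates the update.

class Model:
    def __init__(self):
        self._observers = []
        self._value = 0

    def add_observer(self, fn):
        self._observers.append(fn)

    def set_value(self, v):
        self._value = v
        for fn in self._observers:  # model pushes changes to views
            fn(v)

class Label:
    def __init__(self, model):
        self.text = ""
        model.add_observer(self.update)

    def update(self, v):
        self.text = f"Count: {v}"

model = Model()
label = Label(model)
model.set_value(3)
assert label.text == "Count: 3"
```

In the Cocoa reading of "MVC" (Separated Presentation), a controller would sit between these two objects and do the forwarding itself, which is exactly the distinction the quoted passage draws.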
Previously: Model Widget Controller (MWC) a.k.a.: Apple “MVC” Is Not MVC.
Ben Francis (Hacker News):
During this five year journey hundreds of members of the wider Mozilla community came together with a shared vision to disrupt the app ecosystem with the power of the open web. I’d like to reflect on our successes, our failures and the lessons we can learn from the experience of taking an open source browser based mobile operating system to market.
In a rush to get to market we imitated the app store model with packaged apps, adding to the problem we set out to solve. Lost focus on what makes the web the web (URLs). Too much focus on the client side rather than the server side of the web stack.
After realising “open” on its own doesn’t sell, ultimately chose a strategy to compete mainly on price, which is just a race to the bottom. Suffered from a lack of product leadership and direction after the initial launch.
Tuesday, March 14, 2017
It’s not like iOS isn’t meant to deal with asynchronous events. In fact, it has lots of different techniques… all mutually incompatible, owing to the long development of the Cocoa and Cocoa Touch APIs over the years, and the legacy of Objective-C.
So now let’s look at how ReactiveX deals with this. In Rx, things that change over time are offered as observables. […] So, where’s the rub? For starters, it takes a long time to get the hang of Rx, both to get into its mindset (it sucks when you just need to get the current value of something, but it’s an observable, so there’s no real concept of a “current” value). And then there are all the operators. Follow that link to the operators that I linked a few paragraphs back. There are over 70 of them.
In practice, there are lots of ways to mess this up. Are you right to only want one event? What if you forget to capture self weakly? The worse problem is when your closure is never called: setting breakpoints in this code will do nothing, so it’s a hair-pulling exercise to figure out if fooObservable is not producing events, or if your chain is screwing them up somehow.
In my experience, Rx turns out to be far more costly than it would originally appear from the propaganda. I can’t speak for my ex-employer of course, but my own takeaway is that 1) adopting Rx can take way longer than you’d expect, 2) RxSwift seems to really slow down the Swift compiler (possibly because of having to do a bunch of type inference through those chains of Rx operators).
I don’t buy the idea of replacing everything with it. But use the right tool for the right job. For chained operations it is better than promises/futures and much better than callbacks/closures.
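For readers who haven’t used Rx, a stripped-down observable (illustrative Python, nothing like the real RxSwift surface) shows both the appeal and the “no current value” complaint: values flow through operator chains, but a subscriber only sees events emitted after it subscribes.

```python
# A bare-bones observable, to make the discussion concrete. Unlike a
# stored property, it has no "current value": subscribers only receive
# events emitted after they subscribe.

class Observable:
    def __init__(self):
        self._subscribers = []

    def subscribe(self, fn):
        self._subscribers.append(fn)

    def emit(self, value):
        for fn in self._subscribers:
            fn(value)

    def map(self, transform):
        # One of the ~70 operators; each returns a derived observable.
        mapped = Observable()
        self.subscribe(lambda v: mapped.emit(transform(v)))
        return mapped

clicks = Observable()
received = []
clicks.emit(0)  # emitted before subscribing: silently lost
clicks.map(lambda v: v * 2).subscribe(received.append)
clicks.emit(1)
clicks.emit(2)
assert received == [2, 4]  # only post-subscription events, transformed
```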
Adam C. Engst:
I’m sure many of you are nodding your heads in agreement, or wondering why I’m telling you all this. I’m no frequent flyer, so I’m sure that these improvements have been obvious to road warriors for a while. But I wanted to document how to do all this because I saw a lot of people still relying on paper boarding passes on that last trip. The experience of relying on an iPhone is so good, and so much better than dealing with paper, that if you’ve avoided it so far, I urge you to give it a try next time. But feel free to print a paper boarding pass as a backup until you’re comfortable with using your iPhone instead!
At the end of the check-in process, the apps I’ve used have provided a relatively clear Add to Wallet link or button. Tap it, and when it hands you off to Wallet, make sure everything looks correct in the boarding pass and then tap the Add link in the upper-right corner. You may or may not need to do this for each boarding pass; check to see what’s in Wallet after the first one. Now you’re ready for when you go to the airport.
I’d still want a paper copy in case my iPhone spontaneously shuts down. Also, sometimes they won’t let you swipe the pass yourself, and I don’t want to hand my iPhone to the attendant. I admit to not really appreciating the benefit of Wallet here. It’s an extra step to use, but then you can access the boarding pass from the lock screen instead of having to open the app. And maybe it saves you from having to rely on the airline app at the critical moment? I’ve twice had an airline app stop working after arriving at the airport but before boarding. So now I always take a screenshot of the boarding pass just in case.
Eject (via Brian King):
One common pain point with Interface Builder is that as a view becomes more dynamic and is managed more programmatically, Interface Builder becomes less helpful. This tool lets developers use Interface Builder without that concern, giving them an Eject button to hit when Interface Builder starts getting in the way, and provides an easy path to transition to full programmatic view layout.
Previously: Working Without a Nib.
On 15 March 2017, Dropbox will convert all Public folders on free accounts to private folders, breaking existing links. Dropbox now recommends using a shared folder or shared link for sharing files with others. Dropbox Plus and Dropbox Business users can continue to use the Public folder until 1 September 2017.
Although it’s annoying to lose the Public folder and its HTML rendering capability, Dropbox remains useful because it’s integrated so well into the Mac and iOS experience, and it just works.
However, some of its recent behavior has been troubling. I just want a basic folder that syncs and doesn’t peg my CPU when there’s filesystem activity in an unrelated folder.
Previously: Dropbox Discontinues HTML Rendering.
Auto-play videos suck. They use bandwidth, and their annoying sounds get in the way when you’re listening to music and open a web page. I happen to write for a website that uses them, and it annoys me to no end. (My editors have no control over those auto-play videos, alas.)
But you can stop auto-play videos from playing on a Mac. If you use Chrome or Firefox, it’s pretty simple, and the plugins below work both on macOS and Windows; if you use Safari, it’s a bit more complex, but it’s not that hard.
To do the whole thing from Terminal:
defaults write com.apple.safari WebKitMediaPlaybackAllowsInline 0
defaults write com.apple.safari com.apple.Safari.ContentPageGroupIdentifier.WebKit2AllowsInlineMediaPlayback 0
(Or substitute com.apple.SafariTechnologyPreview.)
I fought for six months to get those [videos] turned off. When I left, they turned ’em right back on.
Monday, March 13, 2017
Jasim A Basheer (via Andy Matuschak):
“enables more powerful integrations for third-party developers” is stating it lightly. This is what the fine folks at Bohemian Coding have done — they opened up Sketch’s file format into neat JSON, making it possible for anyone to create and modify Sketch-compatible files.
Can you imagine what kind of new things will now be possible? One word: design automation (okay, two words!). You want Artboards that showcase a font and its variations, like a Google Fonts page? There’s probably going to be a script to generate that file. There will be websites from which you can download freshly brewed Sketch files based on what you ask — say an image gallery, or a landing page, or a signup form. You’ll be able to pick your brand colors, choose a theme, randomize it, and voila! you have a Sketch design to start playing with. Someone could even build a Sketch equivalent that runs on the browser. The possibilities are many!
And Sketch itself comes with an in-built REPL where anyone can whip up a plugin in no time, and just hit Save to package it into a distributable plugin. There is nothing extra to do — it is all in the app. This makes it very easy for users of Sketch to get started with plugin development, and they happily take the bait!
But I wanted to make a point with all this: even though plugin development is undocumented and painful, developers still build crazy useful things on top of it.
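A toy example of what “design automation” could look like once the format is plain JSON. Note the key names below are invented for illustration — the real schema is whatever Bohemian Coding documents, not this:

```python
import json

# Hypothetical document structure; real Sketch files have their own schema.
def make_artboard(name, fills):
    return {"class": "artboard", "name": name,
            "layers": [{"class": "rect", "fill": f} for f in fills]}

# Generate a palette artboard from brand colors -- the kind of script
# the post imagines people will write.
doc = {"pages": [{"artboards": [make_artboard("Palette", ["#ff0000", "#00ff00"])]}]}
serialized = json.dumps(doc)
assert "Palette" in serialized
```

Any tool that can read and write JSON can now generate or rewrite design files, which is the whole point of the change.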
When I was in high school in a small town in upstate New York, I didn’t really have anyone around to help develop or even share my interest in technology with. Twitter was my connection to the world I wanted to live in. Although I’d been a member of several forums in the past, I liked Twitter more than any forum because there was no pretense of being limited to any particular topic. In 2008, Twitter was accessible on my iPod touch in a way that other communities weren’t. From that iPod, I followed people who talked about Mac software, making web pages, podcasting, and politics, and that stream of information helped me figure out what I wanted to do with my future.
What started as a way for me to fill a void in the types of people I knew in “real life” changed as I left that small town. Today, Twitter is how I get my news.
Previously: Anil Dash’s Advice for Twitter.
Update (2017-03-20): See also: Stephen Hackett.
Opening Console, you’ll see the ceaseless spew from a cornucopia of processes, including many I never want, and will never use. It might be ‘quiet’ at times, but what I’ve found is that a number of Apple services get triggered from time to time to go into a state of endless bitching and moaning, often with messages that equate to “fix this bug someday”.
For example, here is this lovely new Apple bug involving touchui. On a Mac Pro no touch user interface exists, but the engineers at Apple don’t bother to test much any more, so the com.apple.nowplayingtouchui apparently is just going to fail forever.
The spew has been getting worse the last few versions of macOS, and it’s also more of a problem with 10.12 and its new logging subsystem. I’m running into more cases of runaway logging where the same message is repeated multiple times per second. Not only does this hog the CPU, but Console falls behind so that (even with a filter to hide the spew) it can take minutes for actual, relevant log entries to show up. The only remedy seems to be to restart the Mac.
Backblaze (via Hacker News):
Files can be written to the Vault even if a Storage Pod is down, with two parity shards to protect the data. Even in the extreme — and unlikely — case where three Storage Pods in a Vault are offline, the files in the vault are still available because they can be reconstructed from the 17 available pieces.
We use Reed-Solomon erasure encoding. It’s a proven technique used in Linux RAID systems, by Microsoft in its Azure cloud storage, and by Facebook too. The Backblaze Vault Architecture is capable of delivering 99.99999% annual durability thanks in part to our Reed-Solomon erasure coding implementation.
We developed our Reed-Solomon implementation as a Java library. Why? When we first started this project, we assumed that we would need to write it in C to make it run as fast as we needed. It turns out that modern Java virtual machines working on our servers are great, and just-in-time compilers produce code that runs pretty quickly.
Yes we do plan on expanding to more datacenters, and we do have emergency plans in place, though we do choose our datacenters carefully to make sure that we avoid any natural-disaster prone areas. As for backing up our own data - we certainly do make backups of our core info/necessary data. As for the user data that we store, that’s backed up across the storage pods in a vault as discussed in that post. We do not replicate customer data across multiple datacenters. At our price-point, that’s just not feasible.
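As a back-of-the-envelope check on the 17-of-20 scheme: data is lost only if more than 3 of a vault’s 20 shards fail. The 1% annual shard-failure rate below is my assumption, not a Backblaze figure, and real models also account for repair (failed drives are replaced long before a year passes), so this understates the durability they quote:

```python
from math import comb

n, k, p = 20, 17, 0.01  # 20 shards, any 17 recover; 1% annual failure (assumed)

# Data is lost only if more than n - k = 3 shards fail (binomial model,
# independent failures, no repair).
p_loss = sum(comb(n, i) * p**i * (1 - p)**(n - i)
             for i in range(n - k + 1, n + 1))
durability = 1 - p_loss
assert durability > 0.999  # already "three nines" even without repair
```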
Glyn Moody (Reddit):
For the last four years, the Web has had to live with a festering wound: the threat of DRM being added to the HTML 5 standard in the form of Encrypted Media Extensions (EME). Here on Techdirt, we’ve written numerous posts explaining why this is a really stupid idea, as have many, many other people. Despite the clear evidence that EME will be harmful to just about everyone -- except the copyright companies, of course -- the inventor of the Web, and director of the W3C (World Wide Web Consortium), Sir Tim Berners-Lee, has just given his blessing to the idea[…]
The question which has been debated around the net is whether W3C should endorse the Encrypted Media Extensions (EME) standard which allows a web page to include encrypted content, by connecting an existing underlying Digital Rights Management (DRM) system in the underlying platform. Some people have protested “no”, but in fact I decided the actual logical answer is “yes”.
The reason for recommending EME is that by doing so, we lead the industry who developed it in the first place to form a simple, easy to use way of putting encrypted content online, so that there will be interoperability between browsers.
Anne van Kesteren:
The fact that the CDM (DRM code in the article) is not part of the standard means the promise of interoperability is false.
And the fact that CDM sandboxing is not defined means you allow for a race to the bottom in terms of end-user security.
The EFF, along with the Free Software Foundation (FSF) and various other groups, has campaigned against the development of the EME specification. They signed an open letter voicing their opposition and encouraged others to sign a petition against the spec.
However, it’s not clear that EME does anything to exacerbate that situation. The users of EME—companies like Netflix—are today, right now, already streaming DRM-protected media. It’s difficult to imagine that any content distributors that are currently distributing unprotected media are going to start using DRM merely because there’s a W3C-approved framework for doing so.
Sarah Perez (via Farshad Nayeri):
Facebook’s Messenger bots may not be having the impact the social network desired. Just yesterday, online retailer Everlane, one of the launch partners for the bot platform, announced it was ditching Messenger for customer notifications and returning to email. Following this, Facebook today announced an upgraded Messenger Platform, which introduces a new way for users to interact with bots: via simple persistent menus, including those without the option to chat with the bot at all.
One of the problems with Facebook’s bots is that it’s often unclear how to get started. The directory of bots in Messenger wasn’t initially available and now only reveals itself when you start a search in the app. And it hasn’t always been obvious how to get a bot talking, once added, or how to navigate back and forth through a bot’s many sections.
Previously: Bots Won’t Replace Apps.
Friday, March 10, 2017
The downside of not having a default case is, of course, more boilerplate to write. […] This isn’t fun to write, and it would get even worse with more cases. The number of states the switch statement must distinguish grows quadratically with the number of cases in the enum.
You can make this considerably more manageable with some intelligent application of the _ placeholder pattern. While we saw above that a single default clause is not enough, one pattern per case is.
Unfortunately, like the enum example I talked about in the previous post, this conformance to Equatable is very fragile: every time you add a property to the struct, you have to remember to also update the implementation of the == function. If you forget, your Equatable conformance will be broken, and depending on how good your tests are this bug has the potential to go undetected for a long time — the compiler won’t be able to help you here.
It occurred to me to use the standard library’s dump function as a safeguard. dump is interesting because it uses Swift’s reflection capabilities to create a string representation of a value or object that includes all storage fields.
The biggest drawback of the solution might be that dump is not a perfectly reliable way to determine equality. It should be pretty good at avoiding false negatives, but you’ll probably see some false positives, i.e. values that really are equal but whose dump outputs are different.
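Swift’s dump has no direct Python equivalent, but the underlying idea, cross-checking a handwritten equality against a reflection-based one in tests, translates. A hypothetical sketch using vars() in place of dump output (the Point class and its forgotten field are invented for illustration):

```python
class Point:
    """Handwritten equality that forgot the newly added `z` field,
    like an out-of-date Equatable conformance in Swift."""
    def __init__(self, x, y, z=0):
        self.x, self.y, self.z = x, y, z

    def __eq__(self, other):
        # Bug: `z` was added later and never compared here.
        return isinstance(other, Point) and self.x == other.x and self.y == other.y

def reflective_equal(a, b):
    """Compare every stored attribute via reflection, analogous to
    comparing the output of Swift's dump()."""
    return type(a) is type(b) and vars(a) == vars(b)

p, q = Point(1, 2, z=3), Point(1, 2, z=4)
print(p == q)                  # True: the handwritten == misses z
print(reflective_equal(p, q))  # False: reflection catches the difference
```

A test asserting that the two agree on a few sample values will start failing as soon as someone adds a field without updating the handwritten comparison.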
Aaron Turon (via Joe Groff, Hacker News):
There are three dimensions of the reasoning footprint for implicitness:
- Applicability. Where are you allowed to elide implied information? Is there any heads-up that this might be happening?
- Power. What influence does the elided information have? Can it radically change program behavior or its types?
- Context-dependence. How much do you have to know about the rest of the code to know what is being implied, i.e. how elided details will be filled in? Is there always a clear place to look?
The basic thesis of this post is that implicit features should balance these three dimensions. If a feature is large in one of the dimensions, it’s best to strongly limit it in the other two.
The ? operator in Rust is a good example of this kind of tradeoff. It explicitly (but concisely) marks a point where you will bail out of the current context on an error, possibly doing an implicit conversion on the way. The fact that it’s marked means the feature has strongly limited applicability: you’ll never be surprised that it’s coming into play. On the other hand, it’s fairly powerful, and somewhat context-dependent, since the conversion can depend on the type where it is used, and the type expected in the scope it’s jumping to. Altogether, this careful balance makes error handling in Rust feel as ergonomic as working with exceptions while avoiding some of their well-known downsides.
For both applications, complying with Apple’s sandboxing and feature constraints to get them approved for sale would have required significant rewrites. And in Jettison’s case, it would also require that buyers download a separate helper app to enable its full functionality. I realize that some people will be put off or inconvenienced by the fact that these apps are no longer in the Mac App Store – my apologies if you’re one of those folks, but it just doesn’t make sense for Jettison and HistoryHound.
Their other two products are App Tamer, which I’ve been meaning to try to reduce Dropbox’s CPU use, and the venerable Default Folder X. Those are probably impossible to sandbox.
Update (2017-03-11): See also: Sip and Videoloupe (via Bad Uncle Leo).
A year after release, the supporter model is still working well. If you’re not familiar, the basic idea is that someone can download Time Out and use it for free, but some features only work for an hour at a time, as often as they like. So they can try all of the functionality, at their own pace, and decide if the advanced features are useful to them. If so, they can become a supporter for three, six, or twelve months. This permanently unlocks all of the current features as a reward. Even when the supporter period expires, those features remain fully available. So they can choose to extend their supporter status, or just keep using the app without paying any more. Of course, I hope that people do renew, to help fund ongoing sustainable development.
At present, about 9% of people who download Time Out end up purchasing one of the supporter options, which is a reasonable “conversion rate”; normal trial apps often average more like 5%. I feel pretty comfortable with that. But I’m also happy that people who choose not to become a supporter can still use a great break reminder tool to help them get or stay healthy.
Looking at both editions combined, you can see that most of the purchases were through the Mac App Store[…] Again, combining them into one chart, you can see that [revenue is] pretty much neck-and-neck for direct vs Mac App Store, due to the larger slice of the pie that Apple takes.
Vincent Esche (via Todd Ditchendorf):
One would expect both of these minimalistic programs to be semantically equivalent and thus be handled identically by the compiler, no?
The answer to this question and many more can be found in the Swift project’s official documentation on its “Type Checker Design and Implementation” (emphasis mine):
Swift implements bi-directional type inference using a constraint-based type checker that is reminiscent of the classical Hindley-Milner type inference algorithm. […]
The Swift language contains a number of features not part of the Hindley-Milner type system, including constrained polymorphic types and function overloading, which complicate the presentation and implementation somewhat. On the other hand, Swift limits the scope of type inference to a single expression or statement, for purely practical reasons: we expect that we can provide better performance and vastly better diagnostics when the problem is limited in scope.
Repeat after me: “Swift limits the scope of type inference to a single expression or statement”.
Thursday, March 9, 2017
Adam C. Engst:
I won’t pretend that either of these earbuds is anywhere near as good as Apple’s AirPods. They’re not: their Bluetooth pairing can be annoying, audio quality won’t be as good, battery life is shorter, it’s more difficult to plug in their charging cables, their microphone quality is weak, and neither supports Siri or voice commands. And heck, you only get one, something that AirPods allow but don’t require.
But these earbuds are so cheap that you may not care. I wrote this while at the ASMC Summit, a conference for independent Apple resellers, and when I showed these two little earbuds to one attendee, he said that he didn’t even bring his AirPods to the conference for fear of losing them. Just as the best digital camera is the one in your pocket, the best earbud is the one you have with you.
Also, I’m talking about a category which has many more than these two entrants. When you search for “Bluetooth earbud” on Amazon, you’ll see a variety of comparable products with slightly different industrial designs that you might find more compelling. Some are as cheap as $5, and there’s no reason I can see to spend more than $20 given how similar they look. I wouldn’t expect any of them to last for all that long, and they’d be easy to lose, but a super low price makes up for a lot of sins.
Prior to getting AirPods, I used a Jawbone headset, which had many of the same issues, except that it cost almost as much as the AirPods. Even so, I usually preferred it to wired earbuds. So even if you don’t want AirPods, I recommend giving one of these a try.
Overnight, developers have noticed a silent policy change to the iTunes Connect interface, which does not seem to have been formally announced by Apple.
Developers are no longer able to edit descriptions, update notes or any other metadata for their apps without making a new version, which must be submitted to App Review for approval.
Now a mention of sale (promo) on App Store requires TWO separate version updates: first to add it, then to remove it. 🤦
Update (2017-03-10): Benjamin Mayo:
They rolled this back a day later (original post has been updated).
None of these figures are very large individually but, collectively, I’d conservatively estimate that I have about 1 GB of cached data on my iPhone that could be purged. I wish there were a button in every app’s settings panel to dump old or expired data, but I suspect this is a lot harder than it seems: how can iOS reliably know what’s old and expired?
Every app should implement a Clear Cache feature.
Mozilla (MacRumors, Hacker News, Reddit):
Mozilla is growing, experimenting more, and doubling down on our mission to keep the internet healthy, as a global public resource that’s open and accessible to all. As our first strategic acquisition, Pocket contributes to our strategy by growing our mobile presence and providing people everywhere with powerful tools to discover and access high quality web content, on their terms, independent of platform or content silo.
Pocket brings to Mozilla a successful human-powered content recommendation system with 10 million unique monthly active users on iOS, Android and the Web, and with more than 3 billion pieces of content saved to date.
Pocket will continue on as a wholly-owned, independent subsidiary of Mozilla Corporation. We’ll be staying in our office, and our name will still be on the wall. Our team isn’t changing and our existing roadmap has been reinforced and is clearer than ever. In fact, we have a few major updates up our sleeves that we are really excited to get into your hands in the coming months.
How does Mozilla fit into this equation? They’re adding fuel to our rocketship. We have worked closely with Mozilla as we partnered with their Firefox team, and established a deep trust with their team and vision. They have extraordinary resources, global scale, and reach to put Pocket in more places, and help us build an even better product, faster.
Previously: Pinterest Acquires Instapaper.
There’s a handful of basic words and letters that when typed into the URL field will instantly crash Safari on the Mac. Versions of Safari on iPhone and iPad don’t seem to be affected.
As mentioned by reader “MB” in the comments below, this bug can be deactivated by turning off Safari suggestions: open Safari’s preferences (Cmd+comma), click the Search tab, and remove the check from the Include Safari Suggestions box.
Wednesday, March 8, 2017
FSMonitor is a macOS app that monitors all changes in the file system.
- Track all changes to the file system, including file creation, deletion, change of content, renames, and change of attributes.
- Examine the changed files with any of the four provided display modes.
Time to stop using fs_usage directly.
assdass (via Ben Sandofsky, Hacker News):
Just got this message for a few of my apps that are live in the app store (and have been for years).
"Your app, extension, and/or linked framework appears to contain code designed explicitly with the capability to change your app’s behavior or functionality after App Review approval, which is not in compliance with section 3.3.2 of the Apple Developer Program License Agreement and App Store Review Guideline 2.5.2. This code, combined with a remote resource, can facilitate significant changes to your app’s behavior compared to when it was initially reviewed for the App Store. While you may not be using this functionality currently, it has the potential to load private frameworks, private methods, and enable future feature changes.
This includes any code which passes arbitrary parameters to dynamic methods such as dlopen(), dlsym(), respondsToSelector:, performSelector:, method_exchangeImplementations(), and running remote scripts in order to change app behavior or call SPI, based on the contents of the downloaded script. Even if the remote resource is not intentionally malicious, it could easily be hijacked via a Man In The Middle (MiTM) attack, which can pose a serious security vulnerability to users of your app.
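The pattern Apple describes is easiest to see in a toy example. In any dynamic runtime, a selector that arrives as data can reach methods that static review never saw exercised. This is a hypothetical Python sketch of the shape of the problem (the class and method names are invented for illustration; the Objective-C equivalent would use performSelector: or dlsym):

```python
class AppDelegate:
    def show_banner(self, text):
        return f"banner: {text}"

    def enable_hidden_feature(self):
        return "feature that review never exercised"

def handle_remote_command(app, command: dict):
    # The selector arrives as data, so a static scan of the shipped
    # code cannot tell which methods this line is able to reach.
    method = getattr(app, command["selector"])
    return method(*command.get("args", []))

app = AppDelegate()
# Looks innocuous at review time...
print(handle_remote_command(app, {"selector": "show_banner", "args": ["hi"]}))
# ...but the server (or a MITM attacker) can name any method later.
print(handle_remote_command(app, {"selector": "enable_hidden_feature"}))
```

This is also why the rejection text mentions man-in-the-middle attacks: whoever controls the command stream controls the dispatch.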
Apple today has started informing developers that use “hot code push” SDKs that it will soon start rejecting their applications.
While Apple has yet to publicly comment on the change, the email sent to affected developers seems to imply that services like Rollout.io are the cause.
It really shouldn’t come as too big of a surprise that Apple is starting to crack down on these types of SDKs. Seeing that they allow changes to be made to an app after App Store review, it’s really a miracle that they have lasted so long in Apple’s generally rather restricted ecosystem. Whether or not this is a good policy on Apple’s part, though, is up for debate.
How can Rollout allow you to push code-level updates to live iOS apps and be fully compliant with Apple’s guidelines? Glad you asked.
Apple’s guidelines explicitly permit you to push executable code directly to your app, bypassing the App Store, under these two conditions:
- The code does not provide, unlock or enable additional features or functionality
Rollout isn’t intended to push new features or functionality. It is meant to tweak or fix them, avoiding the minor releases needed to fix bugs, add logging or tracking, update messages, force users to upgrade, etc.
Many tweets about Apple rejecting apps using Rollout.io have mentioned React Native, but AFAICT, RN isn’t affected
I can see why people would jump to the conclusion that these rejections are about apps bypassing review, but that may not be the case.
It may be that Apple is more concerned about MITM attacks being used to hijack apps, or dynamic selectors being used to call private APIs.
App review should just say “We don’t allow Rollout” instead of scaring the shit out of everyone with respondsToSelector and performSelector.
The machine doesn’t know or care what’s public and what’s private. There’s no security boundary between the two. Private APIs do nothing that a third-party developer couldn’t do in their own code, if they knew how to write it. The only way Apple can check for private API usage is to have a big list of all the private APIs in their libraries and scan the app looking for calls to them. This is fundamentally impossible to do with certainty, because there’s an unlimited number of ways to obfuscate such calls.
Functionality that needs to be restricted due to privacy or security concerns has to be implemented in a completely separate process with requests from apps being made over some IPC mechanism. This is the only way to reliably gate access.
Apple’s prohibition against using private APIs is like an “employees only” sign on an unlocked door in a store. It serves a purpose, but that purpose is to help keep well-meaning but clueless customers away from an area where they might get confused, or lost, or hurt. It won’t do anything for your store’s security.
Update (2017-03-09): Rollout:
While Apple has not modified its guidelines, it appears that these guidelines are now being interpreted in a more narrow way. We are disappointed that Apple has made this change before we have had an opportunity to address any concerns. We have already reached out to Apple to discuss and are committed to adjusting our offering as needed to remain in compliance under the more narrow interpretation of the guidelines.
Update (2017-03-10): See also: Dave Verwer.
Ray Holley (via Dori Smith):
Indeed, [Tom] Negrino walked with a bit of a sway, but he went everywhere vigorously and purposefully. He was the author of 48 books, focusing on Macintosh computers and software. He wrote on his website, “I’ve been writing about Macs, other computers and software since dinosaurs ruled the earth. OK, it’s actually been since 1987.”
Negrino was a contributing editor for Macworld Magazine and a leading figure in the Macintosh movement in Southern California, where he met Smith.
Thank you for decades of good writing. Thank you for recommending my software. Best wishes for whatever comes next.
Update (2017-03-08): Negrino’s own announcement from May 2016.
Update (2017-03-09): See also: John Gruber, Kirk McElhearn, Jason Snell.
Update (2017-03-11): See also: Adam C. Engst (tweet).
Update (2017-03-12): See also: Andy Ihnatko.
Update (2017-03-14): See also: John Moltz.
Update (2017-03-15): See also: Jeff Carlson and Dori Smith.
Update (2017-03-16): See also: Jean MacDonald.
I’m glad to see lodging covered as part of the scholarship. Most of the time, the WWDC ticket itself is not the most expensive thing about the trip.
Bravo. When I attended WWDC on a student scholarship (in 2002, coincidentally the last year in San Jose), lodging was not included. However, Apple did book an inexpensive hotel, within walking distance, and arrange roommates. With triple occupancy, the lodging ended up being only about $250 per person for the whole week. Most of the food was included, so the main costs were airfare and ground transportation from/to SJC (not far).
Ever since updating to macOS 10.12.2, my MacBook Pro has had horrible problems with Bluetooth. Multiple times per day, the keyboard disconnects. Sometimes it reconnects automatically a few seconds later. Sometimes it reconnects only after I power cycle it or toggle Bluetooth. Sometimes to get it to reconnect I have to reset the Mac’s Bluetooth module by holding down the Option and Shift keys (on the internal keyboard, natch) to access the Debug submenu of the Bluetooth menu bar icon. And sometimes all that fails and I have to reboot the Mac.
At first I thought this was due to a hardware problem with my original aluminum Apple keyboard. I had been able to extend its life by making better connections to the batteries, but power problems with this model seem to be common, and eventually something inside of it breaks.
Liking the keyboard’s feel, but tired of dealing with the AA batteries, I replaced it with a Magic Keyboard (Amazon). I ended up liking the Magic Keyboard slightly less, as it’s flatter, it’s harder to feel the edges of the keys, and the left-right arrow keys are harder to find because they aren’t half-size.
More importantly, the Magic Keyboard also would disconnect all the time. Sometimes it would reconnect and think that a key was stuck down. I’d see the same letter repeat for several lines, or see several lines of text delete one character at a time. Still suspecting a hardware problem, I reported these problems to Apple Care. After ruling out Wi-Fi as a cause and also reproducing the problem on a second Mac, I got them to send me a replacement Magic Keyboard. It exhibited the exact same problems. Curiously, the Magic Keyboard also did not work reliably when directly connected via Lightning. I had thought that when used with a cable it would act like a regular USB keyboard, but apparently the cable only provides charging and Bluetooth pairing assistance.
Thinking/hoping that the problem was with Apple’s keyboards, I then bought a Logitech K811 (Amazon), which I’d heard good things about. Indeed, it’s a great keyboard. It’s like an improved version of the Apple aluminum keyboard that I liked so much. It can pair with three different devices at once and quickly switch between them. It’s still low-profile, but the keys have slightly more travel than Apple’s, they’re slightly clickier, and there are larger spaces between them, so it’s easier to feel their edges. It has the T-shaped arrow key layout, and you also get an extra function key: F13. One downside is that some of the hardware functions (like brightness) are assigned to different F numbers than on the internal keyboard, and I have had a hard time getting used to this.
There are also a bunch of software issues compared with the Apple keyboard. The OS doesn’t know the keyboard’s battery level. You need to install a kernel extension to make the media keys behave as standard function keys. Both the menu bar and flashing bezel indicators for Caps Lock get out of sync with the actual state of the key. It keeps forgetting the level I’ve set for the keyboard backlight. LaunchBar and Dictation can’t detect taps of the fn key.
And the Enter key doesn’t work. You’re supposed to be able to type Enter by pressing fn-Return, but (unlike with Apple’s keyboard) this just generates a Return. Logitech support first blamed this on a defective keyboard and sent me a replacement that had the same problem (as did another, older, Logitech keyboard that I tried). They then blamed a recent OS update, but I reproduced the problem on 10.10. It’s possible to use Karabiner Elements to program another key to act as Enter, but that didn’t seem worth the extra software to me. Instead, I opted to use the alternate keyboard shortcuts—unfortunately not consistent—in the apps where I used Enter: Control-Return to execute a BBEdit shell worksheet command, Command-Return to send a tweet in Tweetbot, Command-Return to submit an event edit in Fantastical, and Command-K to compile an AppleScript in Script Debugger.
The main problem, though, is that the K811 is subject to the same disconnection issues as Apple’s keyboard, although it seems to be slightly better at auto-reconnecting and does not repeat keys. I’m now convinced that the Bluetooth keyboard problems, which others have also noticed, are due to an OS bug. And it’s not limited to keyboards: when I tested a Microsoft Bluetooth mouse, it also kept disconnecting. Fortunately, my wireless mouse does not rely on Bluetooth.
The keyboard disconnections have gotten so frequent that I pulled my Apple aluminum USB keyboard out of storage. It works reliably, but I miss the narrower layout of the newer keyboards (which keep my mouse more centered), I keep forgetting that the corner key is Control rather than fn, and I miss the dual-purpose function keys that fn enables.
Update (2017-03-08): Addison Webb:
I’m having the exact same issues with my Late 2015 iMac. It’s super annoying and I also solved the problem with my Apple USB keyboard.
I have two Logitech BT keyboards, K780 and K380. Both completely unusable with Sierra, yet worked flawlessly with El Cap.
I was able to get the Magic Keyboard to work in wired mode by connecting it via Lightning and then unpairing it in the Bluetooth pane in System Preferences. I expect this to be more reliable, though it keeps auto re-pairing even before I reboot. I don’t want to turn Bluetooth off entirely because I use it for my AirPods and Universal Clipboard.
Sierra also introduced a couple of serious bugs with the way keyboards and trackpads are interpreted. I occasionally notice keypresses getting “stuck”, and my cursor sometimes lags when it is moved. Both of these bugs have been destructive for me: I have, more than once, deleted the wrong file, and have selected the wrong action in several applications.
Same problem with an MX Master & K780. My MBP lives on a swing arm, & the problem is reduced when I move it away from the desk.
Update (2017-03-17): Dan Frakes recommends FunctionFlip, which uses the accessibility APIs to invert the behavior of the function keys so that a kernel extension is not needed.
Tuesday, March 7, 2017
Tom Scocca (via Andrew Abernathy):
So when I saw the news that Google’s search result box has been giving people bogus information in its algorithmic search for the One True Answer to various questions, I thought about the onions. If Google can’t figure out whether Barack Obama is plotting a coup or not, or whether or not MSG is lethal, can it at least recognize that the lie about cooking onions is a lie?
I typed “how long does it take to caramelize onions” into Chrome. The answer was worse than I could have imagined[…]
Not only does Google, the world’s preeminent index of information, tell its users that caramelizing onions takes “about 5 minutes”—it pulls that information from an article whose entire point was to tell people exactly the opposite. A block of text from the Times that I had published as a quote, to illustrate how it was a lie, had been extracted by the algorithm as the authoritative truth on the subject.
In fact, it made the lie even worse, because Google’s automated text analysis is too dumb to recognize that “about 5 minutes” followed by “about 5 minutes longer” means 10 minutes.
See also: Google’s “One True Answer” problem.
We use a Nest Cam in the bedroom as a baby monitor. It has worked well, and I did what I could to configure it for increased privacy. Most of the time it’s off. I did not sign up for the Nest Aware service that stores footage in the cloud. I also turned off the activity history, which saves snapshots. The camera should only be active when I log in from an iOS device or the Web.
Around 4:30 AM, the Nest Cam was supposed to be off, but its status lights came on. First blinking green, then solid blue, then yellow. This was a scary sight, especially in the dark, half awake. According to Nest’s key, these colors correspond to someone “remotely watching the live video stream,” “booting up or rebooting,” and “trouble connecting to your Wi-Fi network.”
I contacted Nest support to see what might have happened but have not yet heard anything reassuring.
First, I was told that it’s normal for the camera to turn on in the night if the night vision feature is set to Auto. It was, but it doesn’t make sense to me that the camera would be checking for ambient light changes or motion when it’s supposed to be off.
Then I was told that, even without Nest Aware, I should be able to see 3 hours of footage in the app. I guess that was referring to this, but I had the activity history feature off, and in any case it was now more than 3 hours later. So there was no evidence in the app of whether the camera had actually been on and recording anything.
Nest says that there is no way to see when an account has been accessed. However:
We have the best encryption available and I can assure you that there was no security breach. Our system has AES-128 bit encryption, if your system was breached, we would have known and would have informed you.
I don’t see how they could know whether someone had broken into my account. Then I was told:
When someone is watching the led light will blink only blue, since your camera blinked with multiple colors, that means that the camera was disconnected from Wi-Fi and was trying to connect back to the Network.
This makes sense except that we had blinking green, not blinking blue, and blinking green is supposed to mean live viewing.
Best case: Faulty memory about the colors, there was some sort of Wi-Fi problem in the night, and the Nest Cam reported this through its lights. I don’t think this is it, though, because I only got one e-mail notification of the camera getting disconnected from the network, and that was after manually unplugging it.
It’s also possible that the Nest Cam malfunctioned or got hacked. At present, there doesn’t seem to be any way to investigate this. However, my case has been escalated, so perhaps the next person I hear from will know more.
Previously: Vizio Tracking TV Viewing.
Update (2017-03-21): It’s now been two weeks, and Nest never followed up like they said they would.
I have been reporting this problem (and others like it) to Apple Support for a long time now. So far I have been completely ignored. No fixes were ever made for the UI problems I have found in 10.11 and 10.12.
I was easily able to reproduce all but one of his examples on my Mac. I had seen a few of these cases before, but my use patterns are such that I don’t run into this bug very often, and thus it has not caused as much frustration for me.
In what world is it OK to have a cursor like this one when the mouse is right in the middle of a paragraph of text in a word processor?
I really don’t know what it would take to make @apple acknowledge the flakiness of context-based cursor changes in #macOS AND ELIMINATE IT.
Update (2017-03-08): Nick Heer:
It was trivial for me to reproduce Stephen Braddy’s bug video, and it’s something I’ve noticed all the time on MacOS for the past couple of major versions of the operating system.
The goal of this release is to provide additional camera raw support, lens profile support and address bugs that were introduced in previous releases of Lightroom.
The direct download link is here.
The update claims to fix a bug introduced in 6.8 that broke the auto-import feature. Because of that bug, and one causing an incorrect warning on quit, I have reverted to 6.7 for the last few months. Unfortunately, customers are reporting that the auto-import bug is only fixed for new libraries. So I will continue to use 6.7.
Monday, March 6, 2017
Eric Johnson quoting Ron Johnson (via Joe Rossignol):
“I remember the day I came in and told Steve about the Genius Bar idea and he says, ‘That’s so idiotic! It’ll never work!’” Johnson said. “He said, ‘Ron, you might have the right idea, but here’s the big gap: I’ve never met someone who knows technology who knows how to connect with people. They’re all geeks! You can call it the Geek Bar.’”
“And I said, ‘Steve, kids who are in their 20s today grew up in a very different world. They all know technology, and that’s who’s going to work in the store.’”
Natasha Singer (MacRumors, Hacker News):
Mobile devices that run on Apple’s iOS and MacOS operating systems have now reached a new low, falling to third place behind both Google-powered laptops and Microsoft Windows devices, according to a report released on Thursday by Futuresource Consulting, a research company.
While school administrators generally like the iPad’s touch screens for younger elementary school students, some said older students often needed laptops with built-in physical keyboards for writing and taking state assessment tests.
The public school system in Eudora, Kan., for instance, used to have rolling carts of iPads for elementary school classrooms and MacBook carts for older students to share. But last year, when administrators wanted to provide a laptop for each high school student, the district bought 500 Chromebooks at about $230 each.
To compete with Chromebooks, Microsoft announced last month that it had worked with Acer, HP and Lenovo to develop low-cost Windows laptops for schools, with prices starting at $189.
This is sounding like a familiar refrain, but it seems like either Apple doesn’t care about this market or it completely misjudged its needs. I haven’t used a Chromebook, but at least on paper it seems like a near-perfect machine for education: great price and durability, a real keyboard, a larger screen than on Apple’s cheaper devices, cloud-based productivity apps, and little need for administration. Some students would need the full power of a Mac or PC, but most education uses wouldn’t.
People familiar with Apple’s plans said the iPhone releases this year would include two models with the traditional LCD and a third one with an OLED screen.
They said Apple would introduce other updates including a USB-C port for the power cord and other peripheral devices, instead of the company’s original Lightning connector.
My expectation has been that iPhones will never switch to USB-C — that Apple would stick with Lightning until they can do away with external ports entirely.
I have no inside dope on this, but it rings false to my ears. If there’s any truth to it, I’d bet that this year’s iPhones will ship with USB-C chargers that use a USB-C-to-Lightning cable to connect to the phones. That makes sense, given that Apple has dropped USB-A ports from the newest MacBook models.
Joe Rossignol (via John Gruber):
All three iPhones rumored to be launched in 2017 will retain Lightning connectors with the addition of USB-C Power Delivery for faster charging, including an all-new OLED model with a larger L-shaped battery and updated 4.7-inch and 5.5-inch models, according to KGI Securities analyst Ming-Chi Kuo.
My initial expectations matched Gruber’s. I didn’t think Apple would drop Lightning. That said, I think there’s a pretty good case to be made that they should. That would take some courage because people don’t like change. But I don’t think it would be that hard of a sell. Lightning was the right choice when Apple adopted it because it was better than the alternatives that existed at that time. But now we have USB-C, which is an emerging standard and offers many of the same benefits, plus some of its own—though also some confusion.
Imagine that Lightning didn’t exist today. Would there be a compelling reason for Apple to invent it? USB-C looks like it will become a widespread standard. If Apple doesn’t switch, it had better have a good explanation for why it’s inconveniencing its customers with a proprietary connector.
And if it’s going to switch, why not do it now? Why ship a few hundred million more devices with Lightning when you know USB-C is the future? It’s better to be a little early than a little late on these transitions.
Update (2017-03-09): See also: Accidental Tech Podcast.
Amazon (via Sam 北島-Kimbrel, Hacker News):
The Amazon Simple Storage Service (S3) team was debugging an issue causing the S3 billing system to progress more slowly than expected. At 9:37AM PST, an authorized S3 team member using an established playbook executed a command which was intended to remove a small number of servers for one of the S3 subsystems that is used by the S3 billing process. Unfortunately, one of the inputs to the command was entered incorrectly and a larger set of servers was removed than intended. The servers that were inadvertently removed supported two other S3 subsystems. One of these subsystems, the index subsystem, manages the metadata and location information of all S3 objects in the region. This subsystem is necessary to serve all GET, LIST, PUT, and DELETE requests. The second subsystem, the placement subsystem, manages allocation of new storage and requires the index subsystem to be functioning properly to correctly operate. The placement subsystem is used during PUT requests to allocate storage for new objects. Removing a significant portion of the capacity caused each of these systems to require a full restart. While these subsystems were being restarted, S3 was unable to service requests.
While this is an operation that we have relied on to maintain our systems since the launch of S3, we have not completely restarted the index subsystem or the placement subsystem in our larger regions for many years. S3 has experienced massive growth over the last several years and the process of restarting these services and running the necessary safety checks to validate the integrity of the metadata took longer than expected.
From the beginning of this event until 11:37AM PST, we were unable to update the individual services’ status on the AWS Service Health Dashboard (SHD) because of a dependency the SHD administration console has on Amazon S3.
Amazon Web Services (Hacker News):
The dashboard not changing color is related to S3 issue. See the banner at the top of the dashboard for updates.
Jim Dowling (via Hacker News):
Aside from the outage, there are many limitations of working with S3 that make it a less than ideal long term storage technology, and most of its problems relate to S3 object replication and metadata. S3 is an eventually consistent key-value store for objects. However, eventual consistency tells us nothing about what guarantees S3 provides.
Netflix does not trust the metadata provided by S3. They have replaced it with their own metadata service, s3mper, which is essentially an eventually consistent key-value store that stores a copy of the metadata in S3 [s3mper]. Netflix rewrote their applications to account for s3mper. In the diagram below, you can see that application programming becomes more complex. Creating an object in S3 becomes a write to DynamoDB and a create operation in S3. This is not done transactionally. All S3 read/list operations need to be re-written to query DynamoDB and S3 and compare the results.
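The pattern described above can be illustrated with a toy sketch (this is my own illustration, not Netflix’s actual s3mper code): a consistent metadata store (DynamoDB in s3mper’s case) is treated as the source of truth, and an eventually consistent object-store listing is checked against it, so the caller can detect when the LIST result is stale.

```python
# Toy model of the consistent-listing check: keys are plain strings,
# and the two stores are stand-ins for DynamoDB and S3.

def consistent_list(metadata_keys, s3_keys):
    """Merge the two listings and flag any keys the metadata store
    says should exist but the eventually consistent store hasn't
    surfaced yet -- those indicate a stale LIST that needs a retry."""
    missing = set(metadata_keys) - set(s3_keys)
    listing = sorted(set(metadata_keys) | set(s3_keys))
    return listing, missing

# Simulate a write that has reached the metadata store but is not yet
# visible in the object store's LIST index:
metadata = {"logs/a.gz", "logs/b.gz", "logs/c.gz"}
s3_list = {"logs/a.gz", "logs/b.gz"}  # c.gz not visible yet

keys, to_retry = consistent_list(metadata, s3_list)
print(keys)      # ['logs/a.gz', 'logs/b.gz', 'logs/c.gz']
print(to_retry)  # {'logs/c.gz'}
```

This is why the application rewrite is unavoidable: every read and list path has to consult both stores and reconcile the results, rather than trusting a single LIST call.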
For me, the S3 outage brought down part of my FastSpring store, and a bunch of serial number reminder e-mails and crash reports didn’t go out because Amazon SES kept failing. My server code had assumed that sending e-mails would always succeed. In fact, it relied on sending e-mails to myself in order to report errors with the site and store. I’ve since added SparkPost and FastMail as backup SMTP providers.
I also plan to store e-mails in a database until they’ve been successfully sent. This seemed like it would be really easy, but I ran into a weird issue with my database layer not saving, and I haven’t had time yet to track that down.
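The store-then-send approach looks roughly like this (a minimal sketch, not my actual server code; the provider functions are placeholders standing in for SES, SparkPost, and FastMail): queue each message in a database first, try each provider in order, and delete the row only once a send succeeds, so an outage at one provider never loses mail.

```python
import sqlite3

# Outbox table: a message stays queued until some provider accepts it.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE outbox (id INTEGER PRIMARY KEY, rcpt TEXT, body TEXT)")

def queue_mail(rcpt, body):
    db.execute("INSERT INTO outbox (rcpt, body) VALUES (?, ?)", (rcpt, body))
    db.commit()

def flush_outbox(providers):
    """providers: list of send callables that raise on failure."""
    for row_id, rcpt, body in db.execute(
            "SELECT id, rcpt, body FROM outbox").fetchall():
        for send in providers:
            try:
                send(rcpt, body)
                db.execute("DELETE FROM outbox WHERE id = ?", (row_id,))
                db.commit()
                break
            except Exception:
                continue  # try the next provider; row stays queued if all fail

# Placeholder providers: the first simulates an SES outage.
def ses(rcpt, body):
    raise RuntimeError("SES unavailable")

def sparkpost(rcpt, body):
    pass  # send succeeds

queue_mail("me@example.com", "crash report")
flush_outbox([ses, sparkpost])
print(db.execute("SELECT COUNT(*) FROM outbox").fetchone()[0])  # 0
```

If every provider fails, the row simply remains in the outbox for the next flush, which is the property that was missing when the code assumed sending would always succeed.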