Archive for January 4, 2024

Thursday, January 4, 2024

Niklaus Wirth, RIP

Bertrand Meyer (Hacker News, Slashdot, Reddit):

We lost a titan of programming languages, programming methodology, software engineering and hardware design. Niklaus Wirth passed away on the first of January. We mourn a pioneer, colleague, mentor and friend.

David M Williams:

Wirth is well-remembered for his pioneering work in programming languages and algorithms. For these achievements, he received the ACM Turing Award in 1984, was inducted as a Fellow of the ACM in 1994, and was made a Fellow of the Computer History Museum in 2004.

These achievements include, among many others, being chief designer of the programming languages Euler (1965), PL360 (1966), ALGOL W (1968), Pascal (1970), Modula (1975), Modula-2 (1978), Oberon (1987), Oberon-2 (1991), and Oberon-07 (2007).

Of these, perhaps the best-known and most used is Pascal. It was the major teaching language of introductory Computer Science courses well into the 1990s, when Java, and later Python, began to take over.

Mike James:

Pascal was a language designed specifically for teaching good programming practice. These were more innocent times, and object-oriented programming had not taken hold. What mattered was getting away from the unstructured mess of assembler and Fortran to a modern implementation of structured programming. Pascal, as first introduced, was a vehicle for writing structured code: it had control structures that eliminated the need to use goto. Today it looks fairly standard, but at the time many programmers hated its over-constrained fussiness. Yes, there was a big anti-structure contingent. We have come a long way since then.

So it was, too, with his next language, Modula. The buzzword of the time was structured-modular programming, and Modula pushed further into the encapsulation of code into modules which interacted in controlled ways. This approach eventually developed into encapsulation within the object-oriented paradigm.

Pascal was a huge success in the sense that most university Computer Science departments adopted it as their main teaching language. This was a golden age because they had a language which was built to make what they were teaching clear. Compare this to today’s mess of different languages, each with flawed academic credentials. However, Pascal only took off in the wider world when Borland introduced Turbo Pascal, a much more capable and practical programming environment than that found in the original.

Jeff Dean:

Pascal was the first language I used seriously (initially on the UCSD p-System and later via Turbo Pascal), and I got my hands on this great book that he wrote when I was in middle school.

I also love this anecdote about how he responded when asked how to pronounce his name:

“Whereas Europeans generally pronounce my name the right way (‘Ni-klows Wirt’), Americans invariably mangle it into ‘Nick-les Worth’. This is to say that Europeans call me by name, but Americans call me by value.”
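
The pun rests on a real distinction that goes back to ALGOL, which Wirth worked on: a by-value argument is evaluated once, before the call, while a by-name argument is re-evaluated each time the callee uses it. Swift has no true call by name, but as a rough sketch a closure parameter can stand in for it:

    // Call by value: the argument expression is evaluated once, before the call.
    func callByValue(_ x: Int) -> Int {
        return x + x
    }

    // Simulated call by name: the argument is re-evaluated on each use.
    // (Swift has no true call by name; a closure stands in for it here.)
    func callByName(_ x: () -> Int) -> Int {
        return x() + x()
    }

    var counter = 0
    func nextValue() -> Int {
        counter += 1
        return counter
    }

    print(callByValue(nextValue()))   // nextValue() runs once: 1 + 1 = 2
    counter = 0
    print(callByName { nextValue() }) // nextValue() runs twice: 1 + 2 = 3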

See also: Introduction to Macintosh Pascal, An Introduction to Programming Using Macintosh Pascal.

Previously:

Update (2024-01-05): See also: Association for Computing Machinery, John Carmack, Tim Sweeney.

Update (2024-01-09): J. B. Rainsberger:

I’ve been reading Kent Beck’s writing on Substack and on the occasion of the death of Niklaus Wirth, he shared part of a conversation he’d had with the professor when Kent had arranged to sit next to him on the flight home from a conference they’d both spoken at.

Extreme Programming was just starting to crackle & pop, so I’m sure I was a bit over-enthusiastic. After I had given an impassioned explanation of incremental design & refactoring, he paused, looked at me with those eyes, and said, “I suppose that’s all very well if you don’t know how to design software.” Mic. Drop.

Update (2024-02-06): Liam Proven:

Wirth is justly celebrated as the creator of the Pascal programming language, but that was only one step in a series of important languages and research projects. Both asteroid 21655 and a law of computer design are named after him. He won computer-science boffinry’s highest possible gong, the Turing Award, in 1984, and that page has some short English-language clips from a 2018 interview.

Via John Gruber:

Wirth’s Law encapsulates Wirth’s philosophy: “The hope is that the progress in hardware will cure all software ills. However, a critical observer may observe that software manages to outgrow hardware in size and sluggishness.” Or, as he rephrased it in his paper describing Project Oberon: “In spite of great leaps forward, hardware is becoming faster more slowly than software is becoming slower.” In many ways, this remains the fundamental problem of our entire industry. It’s a truism, and can only be mitigated.

Deeje Cooley:

RIP Niklaus Wirth. In the late 1980s, I really cut my teeth as a developer on Object Pascal and the MacApp framework (using MPW), and your book “Algorithms & Data Structures” was so influential to me. I still have my copy.

NanoRaptor (via John Gruber):

Apple’s classic Pascal poster, remade as a nice clean vector image.

Bertrand Meyer (via Hacker News):

A peculiarity of my knowledge of Wirth is that unlike his actual collaborators, who are better qualified to talk about his years of full activity, I never met him during that time. I was keenly aware of his work, avidly getting hold of anything he published, but from a distance. I only got to know him personally after his retirement from ETH Zurich (not surprisingly, since I joined ETH because of that retirement). In the more than twenty years that followed I learned immeasurably from conversations with him.

[…]

Like a Renaissance man, or one of those 18th-century “philosophers” who knew no discipline boundaries, Wirth straddled many subjects. It was in particular still possible (and perhaps necessary) in his generation to pay attention to both hardware and software. Wirth is most remembered for his software work, but he was also a hardware builder. The influence of his PhD supervisor, computer design pioneer and UC Berkeley professor Harry Huskey, certainly played a role.

Stirred by the discovery of a new world through two sabbaticals at Xerox PARC (Palo Alto Research Center, the mother lode of invention for many of today’s computer techniques) but unable to bring the innovative Xerox machines to Europe, Wirth developed his own modern workstations, Ceres and Lilith.

Martin Odersky (via Hacker News):

I was privileged to have worked with him as his PhD student, and to have learned a lot from him. In this note I want to write about some of the ways Niklaus influenced my work and my approach to programming.

Update (2024-02-28): See also: Simson Garfinkel and Eugene H. Spafford.

How to Be Optimistic About Technology Now

Nick Heer:

If you measure your level of optimism by how much course-correction has been working, then 2023 was a pretty hopeful year. In the span of about a decade, a handful of U.S. technology firms have solidified their place among the biggest and most powerful corporations in the world, so nobody should be surprised by a parallel increase in pushback for their breaches of public trust. New regulations and court decisions are part of a democratic process which is giving more structure to the ways in which high technology industries are able to affect our lives.

[…]

That is a lot of change in one year and not all of it has been good. The Canadian government went all-in on the Online News Act which became a compromised disaster; there are plenty of questions about the specific ways the DMA and DSA will be enforced; Montana legislators tried to ban TikTok.

[…]

If there was one technology story we will remember from 2023, it was undeniably the near-vertical growth trajectory of generative “artificial intelligence” products. It is everywhere, and it is being used by normal people globally.

Previously:

Doom at 30

Wouter Groeneveld:

On 10 December 1993, John Carmack, John Romero, Sandy Petersen, and the rest of the id Software crew completely changed the world by releasing the most violent and satisfying DOS shooter ever created. Hundreds of so-called “DOOM clones” followed, frantically trying to join in on the cash-grabbing fun. Several controversial lawsuits were filed and political statements made (and dismissed and resurrected) because of all the gore.

But DOOM was more than a grown-up version of id’s previous first-person shooting attempt, Wolfenstein 3D. DOOM was also a technical masterpiece, mainly thanks to Carmack’s knack for implementing optimization techniques after sifting through academic papers on cutting-edge computer graphics rendering algorithms.

Twitch:

Watch as David L Craddock moderates a chat between John Romero and John Carmack about Wolfenstein, DOOM and Quake. One hour of great conversation on DOOM’s 30th Anniversary.

Ted Litchfield (via Hacker News):

The conversation was understandably warm and celebratory, but I was also surprised at how critical the two were of their own work. Carmack alluded to “flashier” (and potentially technically riskier) graphical effects he wishes he had built into Doom’s engine, and he noted that he thinks the more grounded, military sci-fi aesthetic of Episode One has aged better than the abstract hellscapes later in the game.

Romero, meanwhile, contrasted Doom with the id games before and after, arguing it represented a technical “sweet spot” before Quake and full 3D acceleration started to seriously complicate development and limit how many enemies they could fit on screen. The developer praised Doom’s engine for allowing more complex maps than Wolfenstein, though, ruefully remarking that “Making levels for Wolfenstein had to be the most boring level design job ever.”

tcmb:

It’s also worthwhile to remember where the team honed their craft, as was also mentioned in this session: the early id team worked for a publisher called Softdisk that provided a game subscription where customers received a new game every month. This was basically a way to iterate on the practice and process of game development in one-month cycles. The shareware release of [Wolfenstein] had four months of development, which sounds crazy short by today’s standards, but for them it was unusual to have so much time.

There are several talks on YouTube by John Romero where he tells the story of the early id Software[…]

Liam Proven:

Just as Doom redefined video games in 1993, Windows NT redefined PC operating systems. The first version came out just a few months before Doom, and it was even more influential. ’93 also saw the release of NCSA Mosaic, the OG web browser. Mosaic’s commercial spin-off started under the name Mosaic Communications Corporation, and somehow, that company homepage is still there. Later, Mosaic Corp evolved into Netscape, and that begat today’s Mozilla.

Apple’s Mac Gaming Push

Raymond Wong (MacRumors, Slashdot):

No doubt “losing” in gaming for decades has not been fun for Apple. It’s certainly painful and disappointing for Mac users both new and old, who have to buy a separate PC or console to play AAA games. But in 2023, the winds of change began to blow.

[…]

Gaming on the Mac from the 1990s until 2020, when Apple made a big shift to its own custom silicon, could be boiled down to this: Apple was in a hardware arms race with the PC that it couldn’t win. Mac gamers were hopeful that the switch from PowerPC to Intel CPUs starting in 2005 would turn things around, but it didn’t, because by then GPUs had become the more important hardware component for running 3D games, and the Mac’s support for third-party GPUs could only be described as lackluster.

[…]

When Apple announced the M3 chips at its “Scary Fast” October event, it touted certain hardware features built into the silicon that were big firsts for Macs. There’s the aforementioned hardware-accelerated ray tracing and mesh shaders, which make games look more realistic with real-time lighting and models with more detailed polygons and textures. But the feature few people are talking about, and the one that could make games on M3-powered Macs and future Apple computers really shine, is Dynamic Caching.

I don’t think the hardware has been the main problem. I wonder how much they’re doing with developers behind the scenes because I doubt Game Porting Toolkit and this PR push on their own are convincing anyone.

When I ask Apple’s marketing managers what they’re doing to improve the distribution of games on the Mac, specifically through the Mac App Store, I get less reassuring responses. I remind them that, unlike with the App Store for iOS, developers have had a rocky history getting their software sold on the Mac App Store. Panic, makers of the indie game Firewatch and the Playdate handheld, famously gave up on bringing its critically acclaimed Untitled Goose Game to the Mac App Store because of some seemingly arbitrary Mac App Store policies (though it’s still available on the Mac via other platforms like Steam).

Previously:

Update (2024-02-01): Damien Petrilli:

All they focus on is trying to seduce gamers. They listen to them and try to provide what they think is missing: “more power, ray tracing, better GPUs, etc”. However, look at the Switch: underpowered by any metric, yet very successful.

Apple is blinded by its dogma that “developers owe us everything,” and thus doesn’t see their value (it has forgotten its past, when the Mac was struggling to get any software).

The reality is that game developers bring in gamers. Not the opposite.

Apple won’t accept this because it goes against its vision of the world: it would have to consider developers as equal partners. But Apple sees itself at the top of the hierarchy, which is why Apple platforms won’t go mainstream in gaming anytime soon.

What Apple needs to seduce is not gamers but game developers. Once game developers are on board, gamers will follow.

David Feldman, RIP

Legacy:

He earned a BS in Computer Science from Dartmouth College and an MBA from Harvard Business School. In 2023, he served as Distinguished Visiting Technologist at the MIT Center for Art, Science & Technology, and received an honorary Doctor of Fine Arts from the Maine College of Art and Design for developing the technology for the monumental sculptures of Studio Janet Echelman. At Apple Computer, David served as principal engineer for the file system on System 7 and invented the concept of an alias to a file or folder. He often laughed that perhaps his most well-known computer science legacy would be his recording of the “quack” beep sound, the first human voice on all Macintosh computers. He co-founded several successful startups, and founded Feldman Advisors, a technology investment, strategy and development consulting firm.

Jim Luther:

Dave made it easier to find things, and then find them again, on the Macintosh by adding CatSearch, CreateFileID, ResolveFileID, DeleteFileID, and ExchangeFiles to the System 7 File Manager, and by making backward-compatible changes to the HFS file system format for file reference IDs. These API and file system changes made it possible for the Alias Manager to track files that had been moved or renamed, made Finder Alias files possible, and made searching for files by name, date, size, etc. much faster.

The descendants of those APIs are in modern macOS and are supported by Apple’s APFS file system.

I worked on Dave’s code years later, and later on enhancements to the features he helped introduce. It was one of the many times I was standing on the shoulders of giants.
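
For today’s developers, the closest descendant of those alias records is bookmark data, which can likewise resolve a file after it has been moved or renamed. A minimal Swift sketch, using hypothetical paths and with error handling pared down to top-level try:

    import Foundation

    // A file to track; the path here is just an example.
    let original = URL(fileURLWithPath: "/tmp/example.txt")
    _ = FileManager.default.createFile(atPath: original.path,
                                       contents: Data("hello".utf8))

    // Create bookmark data, the modern descendant of an alias record.
    let bookmark = try original.bookmarkData()

    // Rename the file, as a user might in the Finder.
    let moved = URL(fileURLWithPath: "/tmp/renamed.txt")
    try FileManager.default.moveItem(at: original, to: moved)

    // Resolving the bookmark still finds the file at its new location.
    var isStale = false
    let resolved = try URL(resolvingBookmarkData: bookmark,
                           bookmarkDataIsStale: &isStale)
    print(resolved.path) // /tmp/renamed.txt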

David K. Every:

Insert a floppy disk and rename it “KMEG JJ KS”, or “Like Wow Man.HFS For 7.0!” (where the space after ‘Man.’ is actually an option-space; you’ll have to type this somewhere else like the Note Pad then cut/paste it into the disk name), or “Hello world JS N A DTP” (exactly as is).

Eject the disk using Command-E and double-click on the ghosted disk icon. The resulting message is “HFS for 7.0 by dnf and ksct”. In other words, it is saying “Hierarchical File System for System 7 by David N. Feldman and Kenny S. C. Tung.”

Previously: