OS X equals stability, security, and peace of mind. That’s been a central tenet of Apple’s marketing philosophy since it launched OS X. In the early 2000s, Apple focused on OS X’s roots in Unix and BSD as well as the lack of malware. Today, following some high-profile malware attacks, the company’s messaging focuses on Time Machine and application sandboxing. The features and sales pitch may have changed, but the argument hasn’t. Apple presents its operating system and software as the best possible solution whether you’re a kid or a Mac Pro-wielding power user.
The actual quality of the OS that backed up those claims, however, may be slipping. A recent article from Lloyd Chambers, a professional software engineer and Mac user for the better part of three decades, argues that a significant number of low-level problems in OS X are undermining its reputation for quality and reliability in high-end work environments.
The Finder, according to Chambers, will periodically freeze midway through file copies, damaging the boot drive’s file system. Add enough volumes to a Time Machine backup set and it starts silently excluding critical drives from the backup list, even after being explicitly told to include them. Plug a Fusion Drive SSD into a pre-existing OS X install, and Apple’s Disk Utility can’t return the drive to a usable state without erasing every other mounted volume.
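For anyone who wants to verify whether Time Machine has quietly dropped a volume, the exclusion list can be inspected directly. The minimal Python sketch below reads the backup preferences file with plistlib; the preference path and the SkipPaths and ExcludedVolumeUUIDs keys are assumptions based on common OS X configurations rather than a documented API, so treat the output as a starting point.

```python
#!/usr/bin/env python3
# Sketch: print the paths and volume UUIDs that Time Machine is currently
# excluding, by reading its preference file directly.
# Assumptions: the plist lives at the path below and uses the SkipPaths /
# ExcludedVolumeUUIDs keys; both may vary between OS X releases.
import plistlib

TM_PREFS = "/Library/Preferences/com.apple.TimeMachine.plist"

def main():
    # Python 3's plistlib.load handles both XML and binary plists.
    with open(TM_PREFS, "rb") as fh:
        prefs = plistlib.load(fh)

    print("Paths Time Machine will skip:")
    for path in prefs.get("SkipPaths", []):
        print("  " + path)

    print("Volume UUIDs Time Machine will skip:")
    for uuid in prefs.get("ExcludedVolumeUUIDs", []):
        print("  " + uuid)

if __name__ == "__main__":
    main()
```

For a single path, Apple’s tmutil command-line tool (tmutil isexcluded /path) should report much the same thing, assuming it is present on the installed OS X release.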
All operating systems have bugs, and Chambers acknowledges that he’s a high-end professional user – not Apple’s typical customer. Several of the problems he documents, particularly the Time Machine bugs, may be related to the number of storage volumes in his workstation. So what’s the big deal?
The curious case of power users
Power user problems are edge cases, almost by definition, but solving their issues early can provide critical insights when it comes time to ramp mass market features later on. For a simple example, consider the evolution of symmetric multiprocessing (SMP) support in modern desktops.
Before Hyper-Threading debuted in 2002, PC desktops (and plenty of workstations) were single-core, single-socket machines. Back then, dual-socket workstation motherboards were expensive enough on their own, before you factored in registered ECC RAM and substantially pricier processors. Hyper-Threading challenged that cost structure in 2002, and AMD’s Athlon 64 X2 demolished it altogether in 2005.
The switch to multi-core processing went as smoothly as it did because both Microsoft and Apple foresaw the need to support the technology years in advance – even when doing so meant catering to a handful of users with specialised applications and highly specific needs.
Desktop hardware isn’t evolving like it used to, but software is. One of Chambers’ theories is that the sheer number of internal volumes he has mounted (up to 10) may be tripping up Time Machine’s functionality. How many people have 10 internal HDD volumes? Darned few.
But one of the emerging trends for desktops and laptops in the home is that they serve as a sort of command centre for other peripheral devices. Contemporary NAS devices like the WD My Book Duo, Synology DS213+ and Seagate BlackArmor NAS 220 all prominently feature Time Machine support. All of them allow for some degree of storage daisy chaining. When you start to think of a Mac as the interface point for a wide range of other devices and an integrated multi-disk backup solution, 10 volumes is still a lot – but it’s not a stratospherically high number.
The general trend across both cloud sharing and home networks, meanwhile, is to make external storage options as transparent as possible. Microsoft’s Windows 8 Storage Spaces doesn’t work particularly well, but it has the right idea. Unified, simplified storage management across a variety of devices is attractive for a host of reasons – and Time Machine currently falls short of delivering it.
The halo effect still matters
No one is arguing that Mac sales (much less Mac Pro sales) account for the lion’s share of Apple’s revenue any more. Nevertheless, Apple’s high average selling prices (ASPs) on notebooks exist because consumers, for a variety of reasons, believe that Apple products are better value than competing products from Dell, HP, or Lenovo.
If professional users start migrating towards other solutions, Apple’s brand image will suffer. Apple products, for better or worse, command a great deal of attention. The disastrous debut of Final Cut Pro X in June 2011 made headlines across tech publications. It also illustrated that even long-time video production houses can be swayed to adopt other solutions.
Under Tim Cook, Apple has committed to a yearly OS refresh cycle, despite scant evidence that the faster cadence results in a better final product. Microsoft’s own plans for Windows, for what it’s worth, don’t inspire confidence on this front, either. Quick updates make good sense in the iOS world. Whether they work for a product as complex as OS X or Windows isn’t at all clear.
At the very least, we agree with Chambers that these trends in OS X development are troubling. Not because any of the bugs are disastrous in and of themselves, but because all the movement is in the wrong direction.
Image Credit: MacPerformanceGuide