I've gone (back to) Mac & why you should, too.

An overview of Apple’s technologically significant M1 chip architecture. Neato torpedo…

Back in 2009, I needed to have the CPU repasted in my MacBook Pro as it was running hot. In a move that was ill-advised in retrospect, I poked around Craigslist until I settled on someone offering repair services whom I felt I could trust. I spoke to him on the phone a few times, and we arranged to meet. I dropped off the laptop, we chatted jovially for about ten minutes, and then I went home.

I never saw that machine again.

I had been a contented Apple user since the (very) late 1970s, but as the 2000s wore on, my satisfaction had been whittled away by a range of issues like Apple’s pricing, the rise of the walled garden, and limitations preventing me from using non-Apple gear, of which I have a lot. As a working writer, the most pressing issue at the time, however, was that I had $600 to get a replacement machine so I could get back to work. It had taken me four years to save up the $2,400 for the MacBook Pro in the first place, and a used machine wasn’t going to cut it. I had no choice. I had to buy a Windows machine.

I started with an HP ProBook I picked up open box at Microcenter for $600. It was okay. After a couple of years it became nigh unusable for my Team Fortress 2 gaming, so I got another $600 open box deal at Microcenter, this time a Lenovo Flex 3 with discrete graphics. To its credit, my wife still uses it to this day. It’s slow as hell, but she won’t give it up. Not blowing a ton of cash on these machines, however, allowed me to save up another chunk of cash and, happy with Windows 10 at the time, I dropped $1,800 on my first gaming laptop, an Alienware 15 R3 with an 8GB GTX 1060 graphics card. Despite all my research, it turned out to be something of a shitbox and now sits in a drawer with a dead battery that’s really hard to replace. So, early on in the pandemic, I bought an open box HP EliteBook 840 G6 for way less than its street value. And yes, it was also $600. If you’re curious, I don’t have any particular affinity for the number 600; it’s just a hilarious coincidence.

Just don’t buy Dell. Period. They’re crap, and they only care about their enterprise customers and their hot XPS laptop, which they recently sabotaged with a hideous new industrial design. Typical Dell, am I right?

Now, more than a decade later, I have a new M1 Mac Mini on my desk, an iPhone 12 Mini in my pocket, and a 2013 MacBook Pro I got from a friend. Yes, I still have the HP EliteBook 840 (running the thoroughly disappointing Windows 11 which I’m planning to downgrade back to Windows 10), but I’ve barely used it since I got the Mac Mini, aside from taking the time to charge it up and make sure it has all updates. And it was an easy switch, as most of my tools are online, run from my web host or on my NAS, and/or have long been cross-platform. With a little practice I was back in the groove.

So, what prompted this sudden turn-around?

The first thing that caught my eye was the then-new iPhone 12 Mini. I’d been an avid BlackBerry user for years before migrating to the iPhone 4s, but when I lost my MacBook Pro to stupidity, I started trying out Android devices as they integrated better with Windows. Those early days were harsh, and after working through a range of Android devices, I eventually circled back to BlackBerrys running BBOS 10, which had Android app support. I’m pretty sure you already know how that ended, which is when I bought my first truly great Android device, the OnePlus 2. Following that was an LG V30, then a Samsung A70, and somewhere along the way I realized these things were getting too damn big. That iPhone 12 Mini spoke to me, so I bought one.

Just one month later in 2020, Apple shocked the world with the announcement of the first M1-powered systems. For the second time in years, I was excited about something from Apple. After six months of being held in thrall by the iPhone 12 Mini and ingesting an unhealthy number of YouTube videos running test after test that illustrated the M1’s astounding performance and efficiency, I felt it was time to dip my toes back into Apple’s newly inviting waters.

I pulled the trigger in early 2021.

There was more to my decision-making, though. I’d been monitoring the battle between Epic Games and Apple in hopes that the Cupertino behemoth would start opening up more of iOS. I’d also acquired the aforementioned 2013 MacBook Pro from a friend and was pleasantly surprised to find that it ran the then current macOS just fine, which reminded me of Apple’s legendary reliability and long life.

Microsoft had just rolled out Windows 10 to “correct” the Windows 8/8.1 Start Menu debacle and were doing all kinds of amazing new stuff. I’d come to believe that Apple and Microsoft had switched places, something like a Silicon Valley version of Freaky Friday (Phreaky Phriday would be an apropos title for the film). Microsoft’s efforts were exciting and full of promise, while Apple was building ludicrously expensive cheese graters, spending five years apologizing for the atrocious “Butterfly” keyboard, ignoring their other keyboard “innovation,” the TouchBar, figuring out more important things to remove, and not much else of interest.

Apple’s moneymaker was the iPhone, and from the outside looking into Apple’s legendarily opaque operations, their mobile Golden Goose appeared to receive all of the company’s focus. The Mac had been left to Apple’s chief designer Jony Ive and his bizarre, experimentally unnecessary proclivities, which are exemplified more by the removal of features that forced the creation of new revenue streams (e.g., AirPods, removing the wall charger from iPhone boxes, etc.) than by the workaday aspects of technological evolution. When Steve Jobs called for the removal of the floppy drive from the original iMac, it was because he predicted that the CD-ROM drive would replace the older, slower, less capacious storage medium.

Steve Jobs was right…

Why did Jony Ive, Apple’s design chief, remove the headphone jack? Were people not using headphones anymore? Were headphone jack modules too big for their thin, innovative industrial designs? Were Apple’s customers demanding true wireless earbuds? No. None of these things were true. They did it because it meant they could sell far more overpriced AirPods and make a load more money than they would by including a cheap pair in the box. And with the removal of the charging adapter, they outright claimed it would contribute to saving the planet, when it’s just another accessory being monetized, one whose packaging adds even more waste to our overflowing landfills.

So, you might be asking yourself why, amid all this chaos, I would jump ship… again. I mean, if Apple is doing all this shady crap with their walled garden and taking a 30% cut of sales from the App Store and everything else, why would I again immerse myself in Apple’s “flawed” ecosystem after swimming in the warm seas of Windows for a decade? Well, for one, that ecosystem isn’t as flawed as it was in 2009. Two, it’s complicated, for a raft of personal reasons I’ve already touched on. It hasn’t helped that, after a few years of reliable innovation, Microsoft has been sending confusing, mixed signals about its future plans. They were doing phones, then not. They were doing ARM-based Surface tablets, then those shriveled up when their performance failed to meet expectations. They were going to bring augmented reality to the masses with the HoloLens. Where the hell’s that thing?

I believe that Cook and Ive had a disagreement on the ultimate direction of Apple after six long years of the Butterfly keyboard debacle, rising thermals from Intel parts (and Intel’s inability to get past its 10nm process woes), and languishing sales that were at odds with Ive’s industrial design desires. All of the new systems essentially roll back Apple’s design ethos to pre-2016 forms: the iPhone 12 clones the wildly popular iPhone 5, and the Late-2021 MacBook Pros sporting the new M1 Pro and Max chips recall the look and, more importantly, most of the ports of the pre-Butterfly designs, ditching the TouchBar for full-size function keys.

I’ve long understood that corporations do what they do for themselves and their shareholders. Many of them recognize that being somewhat responsive to the needs of their customers and producing products that people actually want to buy is easier than… doing the opposite (ahem, Dell.) I’ve also come to understand that it’s all just noise. The industry is what it is, and until consumers actually speak with their wallets, something we most decidedly do not do, Apple and Microsoft and every other gigantic corporation will continue to do as they please. So, until we put our money where our collective mouths are, we have to base our purchasing decisions on something.

Instead, we must focus on who is making the most compelling technological innovations: the chip designs that take years and thousands of super smart people who know math, physics, and programming to pull off, and that deliver the features and forward thinking I consider when deciding which path I’ll follow. Now that Microsoft has effectively abandoned any semblance of a desire to innovate in favor of reliable, consistent revenue generation, the tea leaves say that Apple’s silicon will be the one to watch for the coming decade.

My desktop as of today, Feb. 20th, 2022, with labels. Kinda looks like a cockpit, no?

I mean, come on! I’ve got a $1,500 Mac Mini on my desk that can reportedly match or even exceed the performance of a Mac Pro “Cheese Grater” costing ten thousand dollars or more! The guys over at Max Tech on YouTube haven’t been able to sell their $15K Mac Pro, even as they perform most of their editing on a 24" M1 iMac. The leap in power and efficiency Apple has brought to the table is insane, and they’ve only just started.

I’d be an idiot not to ride that wave…

Of course, I’m not everyone. I’m a writer with casual gaming tendencies. I don’t push my hardware that much anymore. Back in the early 2000s I was running a Citrix MetaFrame server in the garage office we had at the time. I’d tired of Windows and was running Caldera OpenLinux on my desktop, but I still needed Windows apps to write Windows books, and I had $30k of Citrix’s enterprise software sitting around from a previous book, so I put it to work. Prior to that, I was using Macs exclusively and worked on Windows material using Connectix’s Virtual PC. I’d say that at least 25% of my Windows-based work was done using systems other than Windows. Clever girl.

It’s hilarious that my wife found this while I was writing this article. This is an actual Beta 2 disc I used for my work on some of the books of the time. The server was running Windows 2000 Server, which ran the Citrix MetaFrame software serving Microsoft Office apps to my Caldera OpenLinux desktop system while I ran XP in Virtual PC on my PowerMac 7300/200. So very meta when meta wasn’t even meta yet…

Times, for better or worse, have changed, and so has technology. Whilst I remain an avid humanist and decry corporate efforts to commodify the planet and its contents for the lascivious pleasure and seemingly bottomless enrichment of the 1%, I do love me some sweet, sweet computer hardware and high-quality software. As previously mentioned, even after being separated for nigh a decade, I eased back into using macOS like putting on an old pair of comfortable shoes. None of the basics had left me, and I only needed to retrain myself on the lesser-used functions involving shortcuts and modifiers. Windows did a number on me in that regard, but it’s all about getting stuff done and how little your OS gets in your way.

To that end, macOS is aces. Windows has gone through many significant changes in the last twenty years. Windows 7 was the last, best exemplar of the classic Windows UI. Windows 8, however, decided to rip off its clothes and run through City Hall with a picket sign reading “Mission Accomplished” before getting tackled by cops and made to put on pants, only for Windows 10 to come bail it out. But even Windows 10 couldn’t commit to Microsoft’s own legacy, and now Windows 11 is breaking everything in order to be more Apple-like. If you haven’t used a Mac lately, you might think this means Windows is ultra-modern and macOS is languishing in the past, but that’s simply not the case.

The Mac may retain the familiar form of classic System 7 coupled with the modern sensibilities of Mac OS X 10.4 “Tiger,” but it also contains a dizzying array of functionality improvements and new features. Apple simply didn’t see the need to change the basic conceit of the user interface and user experience of its venerable OS. The Mac is of the future while paying deep homage to the past. I’m inclined to think that anyone familiar with the macOS of a decade or more ago would have a similar experience to mine. Microsoft, on the other hand, can’t even figure out what to do with its Control Panel items.

Today’s application climate does, however, play a significant role. I don’t think my switch back to the Mac would have been possible without the internet’s comprehensive shift towards web-enabled and web-powered apps. As Apple has been able to maintain a 15% share of the desktop market (which includes laptops) and a large percentage of Mac users are creatives, lots of apps are cross-platform. It also doesn’t hurt that the iPhone and iPad are both juggernauts of their respective markets.

All this exposition, for what again?

When all is said and done, what do I now derive from the new M1 Mac Mini mounted vertically on my desk that I couldn’t have gotten from my three-year-old 8th-gen HP business laptop running Windows 11? Peace of mind, for one. Windows 11 is a mess. Just search for “Windows 11 problems” and you’ll find an endless slew of examples. Here’s one that Forbes posted just the other day:

Cool, man… Click to read the article. NOTE: You get four free reads if you don’t already subscribe.

This might even sound familiar, since it seems like the vast majority of Windows updates throw some kind of flag on the field that even Microsoft can’t predict before it ushers the baby code into the dark world of uncontrolled component drivers. Is it any wonder the Redmond giant wants to clamp down on hardware diversity? Apple has been doing it for decades, and their reliability reputation is significantly better than Microsoft’s.

That’s not to say Macs don’t have issues. They do, just far less frequently, at least in my experience. I generally go weeks without restarting my Macs. Windows 10 was good for a few days, but Windows 11 has been a near daily reboot cycle on the EliteBook (which has good driver support), mostly because of stupid errors, app errors, and persistent UI faults that won’t go away until you IT Crowd the damn thing. The worst I’ve seen on the M1 Mac Mini is easily fixed by restarting the faulty app itself.

I’ve got about fifty tabs open in seven tab groups on Safari, am running eight to ten applications, have a range of support apps running in the background, and it’s just peachy. It doesn’t get loud, like, at all. I don’t think I’ve ever heard the fan. It also never slows down, no matter how much I load up the system. Everything remains perfectly responsive and if there are any faults, they’re typically from an app’s services, poor coding practices, and/or lagging online content.

I’ve also not had to worry about legacy apps. Rosetta 2, the translation system that converts x86 code to run on ARM (there’s a Bilbo “there and back again” joke in there somewhere), is seamless. Literally seamless. I honestly don’t know if I have any non-native apps installed. I might, and they just run. No popups, alerts, weird icon flags, or anything. I also haven’t had the need to run Windows virtualized. Sure, I had it installed in Parallels, but just to goof around with the ARM version of Windows 11. I haven’t fired it up in months.

(For the record, I just fired it up so I could update it, and I’m having to wait for Parallels and Windows to perform all their updates before I can start using it. The entire process took half an hour, not for lack of performance, but because Windows takes forever to get anything done. Even in 2022, Windows Update chokes on its own dependencies: updates fail on the first try because something else didn’t get updated first, and this isn’t limited to beta builds. Super fun.)
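Incidentally, if you ever get curious whether a given process is actually running natively or under Rosetta 2 translation, macOS exposes a documented sysctl key for exactly that. Here’s a minimal C sketch built around that check; the helper name and printed messages are my own, not Apple’s.

```c
#include <errno.h>
#include <stdio.h>
#include <sys/sysctl.h>

/* Returns 1 if this process is running under Rosetta 2 translation,
 * 0 if it is running natively, and -1 if we can't tell. */
static int process_is_translated(void) {
    int ret = 0;
    size_t size = sizeof(ret);
    if (sysctlbyname("sysctl.proc_translated", &ret, &size, NULL, 0) == -1) {
        /* The sysctl doesn't exist on Intel Macs (or very old macOS versions). */
        return (errno == ENOENT) ? 0 : -1;
    }
    return ret;
}

int main(void) {
    switch (process_is_translated()) {
        case 1:  puts("Running under Rosetta 2 translation."); break;
        case 0:  puts("Running natively.");                    break;
        default: puts("Couldn't determine translation state."); break;
    }
    return 0;
}
```

Build it for Intel only (clang -arch x86_64) and run it on an M1 and it will report translation; build it natively and it won’t. That’s about as much thought as Rosetta 2 ever asks of you.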

What do you tell people about upgrading to Apple Silicon?

Just do it. Unless you use one or more tools that work only on Windows in an x86 hardware environment, or you’re a PC gamer, just do it. You don’t need to spend $1,500 like I did, either. I got 16GB of RAM and 1TB of storage, but I always overbuy, as I generally get more years of use out of the expenditure rather than just getting what I need in the moment. It’s also important to understand that with Apple’s new chip architecture you are stuck with whatever RAM and storage you select at the time of purchase: the memory is integrated into the chip package and the storage is soldered to the board.

The vast majority of users will do fine with the baseline 8GB of RAM and 256GB of local storage. Two Thunderbolt 4 ports on the back ensure that large, high-quality external drives are fast, and they’re way cheaper than buying more storage from Apple. Here are some additional pointers about getting (back) into the Mac in the 2020s:

  • There’s a rumored Apple Event coming on March 8th. Wait until then to see what they roll out. Any new M-series chips will be incrementally better than last gen parts, but it’s always best to get the latest hardware if you can. You will derive performance, efficiency, and feature benefits from the changes Apple makes to the previous iteration. Then again, it’s not always about the performance, but the value.
  • Regularly check MacPrices.net for updates on sales. It’s best to keep a tab open to the Latest Deals page. Base model M1 Mac Minis can get down to $650, which is a steal for a machine that will get official support for seven years and actually last longer. Seriously. Also, get AppleCare. It’s worth it because Apple makes it worth it. Be patient, and you can get open box deals for as low as $550. My Late-2013 15" MacBook Pro is officially supported up to the previous major version of macOS. That’s NINE YEARS of support. Windows 8 and 8.1 were current for only four years, three if you count the fact that Windows 10 was released one year before Microsoft ended support for Windows 8 in 2016. Windows 11 was released six years later, in 2021. As of yet, we don’t know how Microsoft will arrange support for Windows 10, which works on all PCs, and Windows 11, which is only supported on PCs with TPM 2.0 modules and a few other requirements. That’s not a good look, Mr. Nadella.
  • Do NOT buy any MacBook Pro with a TouchBar. The TouchBar models made from 2016 to 2019 have the atrocious Butterfly keyboards, which fail if you cough lightly or dust anything within ten feet (fifteen feet if the lid is closed.) The only exception is the 13" M1 MacBook Pro, and if you’re looking at that, just get a MacBook Air. Apple replaced the function keys with a touch display, then ignored it, so it’s effectively useless. So, unless you plan on using it as a laptop with an external keyboard or like replacing parts frequently, just avoid them. The article below gives you a rundown of which models to avoid when shopping for used deals.
  • Reflect on what Microsoft has done for you lately, or to you if you’re in a particularly salty mood, and recognize that these are all just computers that perform tasks for you. They aren’t pantheons or ideological camps, despite appearances. They’re machines and you likely need one to perform tasks. I suggest the Mac because of their qualities and benefits over buying Windows. With a Mac you know what you’re getting. With a Windows box, god knows what they slapped into that box, or how poorly. There are plenty of pre-built PC horror stories. Laptops aren’t spared, either, as manufacturers use a wide range of parts they could get deals on, even in the same model, often making driver management a maddening crap shoot.

I recently posted a piece discussing the lowly, but highly functional keyboard shortcut and the idea that keeping your hands on the keyboard is the key to efficiency, but it also serves to illustrate yet another core advantage Apple’s macOS has over Windows.

This is clearly my personal opinion, but I’ve found the Mac to be a solid, thoughtful, all-around computing ecosystem that has remained consistent for decades and reliable to a refreshing fault. While we may pay a premium, something that rankles to this day and that I will complain about until Apple addresses it, you get a system that will last many years longer than the competition. And if you care about such things, Apple products have excellent resale value, making upgrades far less onerous for your wallet. Then again, I’ve found it difficult to give up my old gear when the time comes for said upgrades. That could be why I still have a MacBook Pro from 2013 in active use and a load of my old Mac stuff in storage, including my PowerBook 145b, the machine on which I started my writing career.

Apple Silicon is an evolutionary revolution…

Marketing and PR types like to sling the word “Revolutionary” around a lot when describing their marginally improved products. In many cases it’s pure hyperbole, but Apple’s ARM-based systems, starting with the M1 in its base, Pro, and Max variants, are more than the sum of their parts. Under the hood it may be an ARMv8.4 part derived from their iPhone processors, but Apple’s chip engineers have built a custom microarchitecture that leapfrogs all of the current work being done in CPUs.

Intel’s 12th-generation Core processors are damned fast, but they suck up a ton of power to achieve that performance and fall flat when unplugged. Meanwhile, Apple’s M1 sips around 30 watts of power, performs exactly the same plugged in or on battery, and can last a day on one charge. It’s not magic, and Apple didn’t create new technologies the likes of which we have never seen before, but they did learn from their experience designing and improving the iPhone. And all this after Steve Jobs had vocally denied having a phone in the works at all before gleefully announcing it on stage at San Francisco’s Moscone Center in January of 2007.

Those first iPhones used fairly standard ARM designs, but in 2010 Apple rolled out the 45nm (“thicc” in nerd parlance) A4, its first in-house-designed SoC, built around a single 32-bit ARM core. One decade later, the 64-bit A14 Bionic sports two performance and four efficiency cores built on a 5nm process, along with a quad-core GPU, a 16-core Neural Engine, and a small constellation of support cores. The M1, derived from the A14, can run for up to 18 hours in a $999 MacBook Air while still offering desktop-grade professional video editing chops. If it can do that, I’m pretty sure it can handle YouTube videos and writing emails.

Some people would call these innovations a “sea change,” myself included. I think if Steve were alive he would be blasé about it, having planned for it years in advance. He was prescient about so many things, removed older technologies at the right time to push the industry towards adoption of incoming technologies, and kept everything he could as close to his chest as possible to limit tipping off the competition. When Cook and Ive took on his role jointly it became clear they didn’t share Steve’s deep insight, likely born of his relentless research into coming innovations. In other words, Jobs was playing 4D chess while Cook and Ive were still trying to figure out checkers, at least until Ive left Apple to start his own design firm.

“Steve used to say that we make the whole widget. We’ve been making the whole widget for all of our products, from the iPhone, to the iPads, to the watch. This was the final element to making the whole widget on the Mac.” — Greg “Joz” Joswiak, Apple CMO

It’s clear that one thing survived Jobs’s passing, though: the plans laid to make the “whole widget.” Apple ditched the PowerPC because IBM couldn’t fab a part that ran cool enough for a PowerBook, and the move to Intel parts was just a stopgap along the way to said widget. As I said earlier, removing the headphone jack to replace it with egregiously overpriced Bluetooth earbuds was an entirely cynical, capitalistic thing to do, in the same vein as taking the charger out of the iPhone box and packaging it separately while claiming to be saving the planet, but really just creating nearly twice the landfill per device sold. (Pro Tip: You are allowed to change the box design, Apple.)

Apple’s secrecy strikes again. Microsoft suddenly flounders?

Back in the mid-2010s, as Apple was spinning further and further away from the path Jobs had set them on, Microsoft was having a kind of renaissance under new CEO Satya Nadella, who, with CPO Panos Panay’s emotionally engaging presentation style, sold us on a new age of innovation that would be issuing forth from Redmond. Windows 10 had a whole new Agile-based release process, they were working on the HoloLens augmented reality headset, and they were talking about bringing a new (not) phone to market. They showed off hot new gear in October 2019 like the (not a phone, per se) Surface Duo, which finally shipped in September 2020, and the never-shipped Surface Neo tablet, while the HoloLens moved upmarket as a consumer-unfriendly enterprise device priced at nearly $5,000.

Then, at a November 2020 event, Apple dropped a bomb: the all-new, didn’t-see-it-coming M1 chip, slotted into the fanless MacBook Air, the fanned 13" MacBook Pro, and the Mac Mini. The surprise seemed sufficient to push Microsoft off course, and they didn’t correct well. With Microsoft’s ARM-based hardware and software initiative struggling to keep its head above water in the ankle-deep shallows of a calm lake, Apple’s success with its shift to Apple Silicon effectively bullied the competition into handing over their floaties. But I don’t think this indicates that Apple has any real power over Microsoft, only that Microsoft’s efforts weren’t as thought out as we’d hoped. This is nothing new, however, as Microsoft has had trouble keeping the lumbering juggernaut on its tracks for many, many years.

As a consumer, the takeaway here should be that Apple is offering a well-supported, stable, secure, and capable platform that meshes extremely well with the tens of millions of iPhones and iPads already in the hands of Windows users, and that Microsoft has not seemed capable of correcting course in a timely manner. I don’t think these factors will cause a mass exodus from Windows, but they certainly won’t help Microsoft maintain its massive lead over Apple in installed user base.

I know this is a lot to think about.

I think the two most important things you should take away from this are that the Internet has had a profound effect on cross-platform compatibility and that Apple is doing some amazing things. Whether they will pan out in the future is only for the future to know, but from what I see, the best bet most of us have for a real value in computing gear has started to shift from the WinTel (Windows on Intel) industry towards the Mac.

Apple’s traditional market share has been around 14%. It’s now 16%, while Windows has been steadily declining, something that’s easier to see with a larger data set. [SOURCE: statcounter]

Of course, Apple’s not going to jump from 16% market share to overtake Microsoft’s 76% anytime soon, but the Cupertino company has laid the foundation.

From my perspective, that foundation’s far beefier than Microsoft’s for the foreseeable future. Let’s agree to meet up in 2032 to discuss.

PS: Let’s keep Ukraine in our thoughts. That sovereign country is being attacked by a thug who knows nothing but violence. I normally try to be positive and constructive in these postscripts, but this time is different. The Russian Kleptocracy needs to come to an end and the Russian people and the peoples of the former Soviet states must be freed from Putin’s mob-inspired tyranny of lies, murder, and police violence. Even the Russian citizenry know this is wrong. They’re out there protesting when they know they face violent arrest. Nearly 2,000 have been arrested already.

PPS: Be kind to each other. We can do it if we just try. So, try harder, for all our sakes.


America's most underrated musical genius | Bruce Hornsby

Bruce Hornsby with The Grateful Dead performing at Soldier Field on July 4, 2015 in Chicago. Jay Blakesberg/Invision for the Grateful Dead/AP Images

I’m not going to go into a long, winding diatribe about how and why and when Bruce became a humble god living among us mortals. He’d just deny it. Instead, I’ll show you.

Hell, this is from 1999. There’s another 20 years of new stuff to take in from there, and he’s still going. Not now, of course. Pandemic, anyone?

You certainly know him from the title track from his debut album, The Way It Is, but that was back in 1986. What did he do from 1986 to 1999? Hmm…

  • The Way It Is (1986)
  • Scenes From The Southside (1988)
  • A Night On The Town (1990)
  • Harbor Lights (1993)
  • Hot House (1995, a personal favorite… fantastic album)
  • Spirit Trail (1998)

Not to mention countless live shows, playing with the Grateful Dead (a lot.)

So, after this show, what more can you expect to find? Take a look…

  • Here Come the Noise Makers (2000, a collection of live recordings from 1998 to 2000, as the Noisemakers would become his new band)
  • Big Swing Face (2002)
  • Halcyon Days (2004)
  • Intersections (2006, an essential box-set of loads of unreleased material, live and studio)
  • Ricky Skaggs & Bruce Hornsby (2007, some freaking amazing bluegrass)
  • Camp Meeting (2007, an album of jazz tracks!, very lovely)
  • Levitate (2009)
  • Bride of The Noisemakers (2011, another live collection I can listen to all day long)
  • Rehab Reunion (2016, where Bruce spends more time playing the dulcimer, and it’s amazing)
  • Absolute Zero (2019)

For a more complete listing of almost everything Bruce has done, check out his discography on Wikipedia.

Bruce is a freaking space wizard. He’s a master of two-handed independence at the piano and incorporates that skill into many tracks, like Spider Fingers from the Hot House album, and he loves to improvise when playing live, as you can see here…

Here’s the album version. See if you can pick out how each hand is playing…

Then, there’s all those times Bruce played with The Grateful Dead, and The Dead were always great with bootlegging, so here’s a show from 1990… with Bruce on the keys…

All in all, Bruce makes fantastic music, and I think you’ll love it as much as I do; there’s tons to choose from. And if you love live shows as much as I do, then check out Nugs.net, where you can purchase official “bootlegs” that are simply wonderful. There are numerous free shows to pick from, and they offer FLAC versions of most shows, all recorded from the board and sounding FABULOUS!!

If, when you look up from a five-hour binging session, you’re surprised that so much time has passed, I’ll happily take the blame ;)


Hotel Transylvania: Transformania | Film Review

Despite the growing recognition in the West that animation is not a medium made expressly for children, there is an expansive industry that revolves around productions that aim specifically for that audience. Ahem… Disney? Heard of ‘em? There are a number of outfits that play in that space these days. Warner, Illumination, Pixar, DreamWorks, Universal, and Sony Pictures Animation, in no particular order, are a smattering. Each studio has their own “voice” but Sony’s Hotel Transylvania franchise is a series of films that speaks in its own voice, and that voice is Genndy Tartakovsky.

And yet, Tartakovsky’s participation in the fourth entry in the tetralogy is limited to writing the story, co-penning the script, and acting as an executive producer. Instead, the helm was handed off to co-directors Derek Drymon and Jennifer Kluska, both first-timers in the director’s chair (that’s a big chair.) Tartakovsky wasn’t the only one to decline participation in this latest, and reportedly final, installment, either. Most notably, Adam Sandler was replaced by Brian Hull as the voice of Count Dracula, and Kevin James’ Frankenstein was replaced by Brad Abrell. Trust me when I say that you won’t miss them. The replacements are almost exact matches, indicating that star power just isn’t necessary to support feature-length animation. All it does is foster bloated production costs.

Sony describes Hotel Transylvania: Transformania as follows:

When Van Helsing’s mysterious invention, the “Monsterfication Ray”, goes haywire, Drac and his monster pals are all transformed into humans, and Johnny becomes a monster. In their new mismatched bodies, Drac, stripped of his powers, and an exuberant Johnny, loving life as a monster, must team up and race across the globe to find a cure before it’s too late, and before they drive each other crazy. With help from Mavis and the hilariously human Drac Pack, the heat is on to find a way to switch themselves back before their transformations become permanent. [SOURCE: IMDB.com]

At its core, Hotel Transylvania is both an homage to classic luminaries of the Golden Age of animated shorts like Wile E. Coyote, Tom & Jerry, and Popeye the Sailor Man, plus more contemporary adherents to classic animated hijinks like Roger Rabbit and Ed, Edd n Eddy (a meta turducken, if you will), and entirely its own thing, something like Rush is to rock music or Ben Folds Five is to alternative.

Nuts & Bolts

As the fourth entry, HT4 is a solid evolution of the defined art style, but it doesn’t stray from the franchise’s established parameters. This is no Spider-Verse, and yet the color palette is rich and bright, the lighting and luminance effects are excellent, and the correct application of bokeh and lens effects brings believable depth to most scenes. It’s clear that Sony has been working hard to refine their process. That said, some of the textures don’t look great (the rock slide in the opening musical number looks like crap), which is a big oversight on such a high-visibility production.

Character designs, on the other hand, are peak franchise, hewing closely to the immutable nature of classic cartoon character design: these characters are iconic and must remain recognizable. They only change in service of a gag and always return to their pristine base form when the gag is complete. This is the same basic rule that governs character design in all cartoons, where these icons always wear the same clothes and so on. Imagine Fred Flintstone wearing actual pants or Jessica Rabbit wearing something other than her signature red, sequined dress… oh. Well, not that, but you get the idea. Thanks, Disney Whitewashing Division…

The storytelling is solid, if a bit on the basic side, something I’d expect of Illumination’s work. It’s not bad, but it’s also not sophisticated. The comedy revolves almost entirely around physical gags, which, in turn, define the entire film. In a film that is more a buddy adventure flick, à la the far superior Emperor’s New Groove, than another entry in the HT canon, it works to a degree, but not entirely. There’s little to no meta humor that plays to adults and kids on different levels, depending on your depth of knowledge. It comes down to the gag, and that gag is always a physical comedy bit. The approach takes away from the core conceit: two people coming to terms with each other by walking in each other’s shoes.

Unexpected hair loss syndrome…

If anything jumps out at me, it’s that the film feels claustrophobic. We start in Drac’s house, then the hotel, the plane, the provincial town, the bus, the jungle, and the cave, then back to the hotel. While the story is about Johnny and Drac and their journey towards accepting their differences by spending time in each other’s shoes, it pulls back from the more expansive world-building exemplified in the Tartakovsky-helmed entries. Since each scene needs to serve as the setup for the next visual gag, any potential avenue for additional, foundational scope and scale is ditched in favor of setting up the next gag.

And that’s where it lies: at the intersection of storytelling and spectacle, with the visuals winning out over the writing by a long shot. It’s not a terrible movie, to be sure, but you won’t derive much from it beyond the most deconstructed joy, the kind that makes a little kid roll around in the dirt with the giggles but generally doesn’t survive the transition to the higher modes of cognition that come with age.

It should be a hit for years to come as a go-to for quiet toddler mommy hours worldwide.


Are keyboard shortcuts really more efficient than a mouse?

As a Mac user, there is one very specific admonition from Windows and Linux users I hear frequently: too much mouse! But I’m not just a Mac peep, I’m also a Windows and Linux guy. I started as a Tandy Radio Shack kid, became an Apple ][ kid, then a PC teen, and ultimately a Mac jerk (read: young adult.) In the mid-’90s, when the public internet was spun up, I was still a Mac guy sporting a PowerBook 145b trailing a veritable thatch of SCSI peripherals. But the story gets even more complicated.

When I met my wife and started my second phase of life as a writer, I was working on Windows books. Without any PCs at the time, I did all my work in Connectix Virtual PC. Later I would upgrade to a PowerMac 7300/200 and buy a cheap Packard Bell from Circuit City to act as a proxy server for our little dial-up network (and despite PB’s history of shit, it was a real workhorse for years.) Without going into more exhaustive detail, I would use Macs for many more years, then switch to Windows laptops for about a decade until the launch of Apple’s M1 Macs would draw me back in.

My Linux journey began in New Mexico with Caldera OpenLinux at a time when there were few niceties, and certainly no app stores. I had to compile my own apps, and in order to do that, I had to learn how to do it all from scratch. I’ve been using one form of Linux (called a distribution, or distro for short) ever since, primarily on a Dell Venue 11 Pro tablet. My current favorite is Linux Mint, and it is light years beyond Caldera OpenLinux, a distro that passed away a long time ago amidst a raft of controversies brought about by the SCO Group’s attempt to take “ownership” of Linux. tl;dr: they lost.

Over the years, working with macOS, Windows, and Linux, I have developed what I would call a more than passing level of expertise in all three of the most important desktop operating systems. I’m not a rare bird, exactly, but I doubt my archetype is very common. What this extensive experience does grant me, however, is deep insight into the user experience of these desktop systems. I think that gives me the right to say the following:

Keeping your hands on your keyboard, or not having to reach for your input device, increases efficiency? It’s a stupid myth.

I hear it all the time from almost all Linux and Windows power users. Taking your hand off your keyboard to manipulate your mouse/trackball/trackpad/etc., according to these naysayers, is an enormous waste of time. A killer of efficiency, if you will. Personally, I am a trackball user. My current device is an Elecom Huge (Japanese brand, funny names) and it’s marvelous, but I digress. I have, however, used mice for years as well, primarily for gaming. I also have a lightly used Apple Magic Trackpad. For my keyboard, I switch between a Logitech MX Keys for Mac and an Apple Magic Keyboard.

I use keyboard shortcuts the least in Linux. The problem I’ve found is that, with each new epoch of Linux and its built-in fragmentation, there is no unified set of them. GNOME and KDE have their own, and some distros introduce their own on top, like System76’s Pop!_OS-flavored window tiling manager. And in Linux terminology the Windows/Command key (usually called Super) is fluid, mostly due, I believe, to general saltiness over having to use the bespoke key of a proprietary OS. Unless I dedicated all of my time to Linux, I doubt I’ll ever develop real proficiency with its keyboard shortcuts. One thing that Linux nerds say all the time, however, is true: using the command line can be faster. I update all of my software by typing out a few commands. There is a learning curve, though.

I use keyboard shortcuts about half the time in Windows. The issue with Windows isn’t that the keyboard shortcuts are a chaotic mess, but that they traditionally use the CTRL key as the primary modifier, which is stupid since you have to manipulate it with your pinkie finger. We’ve had the Windows key since 1994, and even to this day with Windows 11, it’s effectively useless. Cut, copy, and paste all use the CTRL key, and switching windows is performed using ALT+TAB. CTRL+W will close “some” windows or tabs, but the default is ALT+F4. So, the world of Windows keyboard shortcuts is long established, and yet it’s nearly as chaotic as the mess that is shortcuts in Linux.

You saw this coming a mile away, but yes, I use keyboard shortcuts the most on macOS. Part of that is a direct result of Apple’s fully integrated environment, which presents a mostly standardized set of F-keys that handle things like media control, display settings, and navigation features. The other part comes from Apple’s dedication to Command key placement and its role as the central modifier for all basic shortcuts. There is always one Command key to the left of the spacebar, and often one to the right, depending on the keyboard. CMD+Q quits apps. CMD+W closes a window or tab. CMD+TAB switches apps. CMD+` switches windows within an app. CMD+[ or CMD+LEFT is browser back. Cut, copy, and paste are as expected. Moving around the Finder is a breeze, too. CMD+UP/DOWN/LEFT/RIGHT lets you navigate folders and files without ever touching the mouse, and adding SHIFT to the arrow keys selects files as expected. There are dozens more, and if you add the Control, Option, and Shift keys, there are hundreds. For instance, OPT+R gets you this: ®. And yes, I used CMD+SHIFT+4 to make this screenshot, which shows more.

Just a selection of the most common keyboard shortcuts for macOS.

Now, I realize this is starting to sound like I’m cheerleading for the keyboard and against wasting time touching my filthy (not literally) trackball to do stupid mouse tricks that would be “easier” on the keyboard, but that wouldn’t be accurate. In reality, I use my trackball far more often where it matters to me: writing and editing.

Despite being able to OPT+LEFT/RIGHT to jump my cursor around one word at a time, I find it far more accurate to mouse to the edit insertion point. Think about it… a mouse pointer is an extension of your hand-eye coordination. When you are accustomed to your input device, the pointer goes where you look. This combination is, as Steve Jobs predicted, key to a fast and efficient user experience. There are no keyboard shortcuts to make the cursor jump to where you’re looking, at least not yet. Imagine spotting a typo two paragraphs above where you’re writing. Now, get to it with your keyboard, then with your mouse. How much time did you spend doing each?

I think the difference is stark, and clearly in favor of a mouse.

Ultimately your experience will be your own or, in internet parlance, your mileage may vary. It all comes down to how you use your computer and the level of proficiency you have developed in one or more operating systems. It’s quite clear that macOS is where I feel most comfortable, and I would argue that most people who haven’t already used a Mac would come to the same conclusion if they could transition seamlessly. But we all know that’s not yet a thing, much like flying cars or setting up a new Android phone with the click of a button.

To the question of efficiency, I think the idea that keeping your hands on your keyboard is the single most efficient way to work is bullshit. I’ll admit that I’m likely biased because I use a trackball. It’s always in the same place and never moves, making it a lot easier to get my hand on it. The very large ball makes it easy to get the pointer exactly where you want it (see the link to the Elecom Huge above), and I don’t think very many people believe clicking a mouse button is difficult (I’m not including people who actually do find it difficult and need accessibility tools to help them.) Even so, any mouse, trackpad, or trackball leverages hand-eye coordination, and I think that’s a hard combo to beat when it comes to general computer navigation.

So, to all you keyboard masters out there who can fly around a computer never touching a mouse, kudos to you, but I don’t think the vast majority of the billions of computer users have the same experience. The combination of familiarity in your operating system and a good quality input device is the way almost all of us interact with our systems. That’s all I have to say on that.

PS: Voting rights. Seriously, voting rights. Can’t hear me? VOTING RIGHTS.

PPS: What will you do when you’re not allowed to vote because you mailed in your ballot a little late and any buffer time was removed or you can’t vote by mail any more? Does voting still happen on a weekday and you don’t get that day off to go vote? When the polls close earlier than they used to or your state closed a lot of polling places? When you’ve stood in line for hours only to be turned away because the rules changed and polls no longer stay open long enough to let everyone in line vote?

PPPS: What voter fraud? Where’s the proof? Why doesn’t anyone ask for said proof and, when bullshit is tendered as “proof” why isn’t it raked over the coals like it should? Have a nice day. I love you all, even the people who don’t deserve it. We should all get the chance to live a life, not just the ones who falsely believe they’re superior to everyone else.


Using introspection to understand my own limitations