Can we please end the stupid "Natural" vs. "Reverse" scrolling debacle?

I am quite well versed in the extensive history of the operating system wars. Apple effectively created the home computer industry, and Microsoft has been doggedly chasing after it ever since. This is not a new story in the slightest.

Historical documents tell us that Steve Jobs managed to arrange a tour of Xerox’s Palo Alto Research Center, frequently called Xerox PARC, which had been tasked by its parent company with developing a wide range of modern computing technologies for the future. The PARC teams were responsible for laser printers, Ethernet, the desktop paradigm for graphical user interfaces, e-paper, Very Large-Scale Integration (VLSI, the process that made all of the computer chips we use today even possible), and the Alto, among other things.

A Xerox Alto II on display… somewhere. Unfortunately, the contributor didn’t note where they had taken the picture, so it’s anyone’s guess, though likely a computer museum… again, somewhere.

The Alto II was an amalgam of all of the technologies developed at PARC and was effectively a “desktop” computer. The idea might seem insane when you look at the literal behemoth above, but this is where we were back then. The Alto II was never released as a product for the general public and only about 2,000 were built, mostly deployed inside PARC with some distributed to universities. Many believe that the Mac team’s exposure to the work on the GUI at PARC helped them break through some of their final stumbling blocks before the release of the Macintosh in 1984. I’m pretty sure we’ve all seen the Super Bowl commercial.

There was, however, one more piece of radically new hardware that had been invented a decade earlier by a bloke named Douglas Engelbart at the Stanford Research Institute (SRI) which brought it all together, a device that would become known worldwide as…

…The Computer Mouse.
The inventor of the computer mouse, Douglas Engelbart, never profited from his invention. During an interview, he said “SRI patented the mouse, but they really had no idea of its value. Some years later it was learned that they had licensed it to Apple Computer for something like $40,000.” Not a great deal, peeps…

Avoiding the rabbit hole…

So as to avoid a long(er) narrative about the history of the mouse, let’s leave it there and move to this piece’s raison d’être: the mind-bogglingly silly problem of scroll wheel scrolling direction.

You try to use fewer words to describe it whilst ensuring all readers know what you’re talking about! I’m open to suggestions.

I’m quite sure most casual users who have always used a Mac or a Windows machine likely haven’t given a single thought to which direction the document in the display scrolls when you roll the scroll wheel up or down or drag two fingers across a trackpad. It’s kind of a thing that most people get. Kids learn all this stuff before they can string words together, for crying out loud!

But for users like myself who use both macOS and Windows operating systems, it’s a pain. (For the record, I’m also a Linux power user. Mint, Pop, and Fedora, in that order.) Here’s why:

Take a look at Kunal Rathore’s excellent article on the Mac’s natural scrolling versus Windows’ established reverse scrolling implementation, including a lovely animated illustration to, uh… illustrate the difference for those who have withstood the awesome attraction of “the other side”.

As you can plainly see, they just do the opposite. It’s not like you can iterate on a linear motion control. It goes up. It goes down. There’s not a lot of room for creativity. If you take a look at the Mouse settings in macOS and Windows you’ll start to see where the problem lies. Let’s look at macOS 12.4 “Monterey” and how it handles scrolling options:

WHAT? The Mac makes this an OPTION???!!! I sense the Windows user inside me starting to get nervous…
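Side note for the tinkerers: that checkbox maps to a single global preference under the hood. Here’s a minimal sketch of flipping it from a script, assuming the long-standing com.apple.swipescrolldirection default; you may need to log out and back in, or at least relaunch apps, before everything picks up the change.

```python
# Minimal sketch: the System Preferences checkbox maps to a global default.
# Assumes the long-standing com.apple.swipescrolldirection key; a log-out (or
# at least relaunching apps) may be needed before everything notices the change.
import subprocess

def set_natural_scrolling(enabled: bool) -> None:
    """True = Apple's 'natural' direction, False = classic Windows-style scrolling."""
    subprocess.run(
        ["defaults", "write", "-g", "com.apple.swipescrolldirection",
         "-bool", "true" if enabled else "false"],
        check=True,
    )

if __name__ == "__main__":
    set_natural_scrolling(False)  # make the Mac scroll like my EliteBook
```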

Well, how does Windows handle this, then? Let’s take a look at my HP EliteBook running Windows 10 and its scroll wheel options:

I like Dark Mode. Sue me.

As you can see, the only options for controlling the scroll wheel in Windows are to tweak how quickly and how far it scrolls when you twiddle the control, but not direction.

Grrr.

Yes. All this is the fault of Microsoft…

So, I think we can all agree that this is rather silly, has no place as an issue on computers made since 2010 (so we’re 12 years late), and is just dirt easy to fix. And no, I don’t want any registry hacks or third-party utilities. The option is built into everything else, so yeah. Clearly, everyone else seems to have cracked the code, as it were.

I’m not a coder on any level, but if programmers can make mistakes that lose people hundreds of millions of dollars and then fix that epic screwup, I’m quite sure Microsoft can add a scrolling direction toggle to a Windows update coming sometime this year, and that’s being generous.

I’d be willing to wager a farthing and a nice six-pack of gluten-free beer that a couple of their superstar hackers could whip up a fix in an afternoon. Tell “Up” and “Down” to do the opposite.

Ooooooooooo! Rocket Science!!!
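For the record, this is roughly what those third-party utilities and registry hacks I just waved off actually do: they flip a per-device FlipFlopWheel value under the registry’s HID branch. A rough sketch, assuming that well-known value, run from an elevated prompt, with a mouse replug (or reboot) afterwards:

```python
# Rough sketch of the workaround I'd rather not need: set FlipFlopWheel
# (1 = reversed, 0 = Windows default) on every HID entry that already has it.
# Run as Administrator; replug the mouse or reboot for it to take effect.
import winreg

HID_ROOT = r"SYSTEM\CurrentControlSet\Enum\HID"

def reverse_scroll(enable: bool = True) -> int:
    """Flip the scroll direction for every HID device exposing FlipFlopWheel."""
    changed = 0
    with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, HID_ROOT) as hid:
        for i in range(winreg.QueryInfoKey(hid)[0]):              # each HID device
            device = winreg.EnumKey(hid, i)
            with winreg.OpenKey(hid, device) as dev_key:
                for j in range(winreg.QueryInfoKey(dev_key)[0]):  # each instance
                    instance = winreg.EnumKey(dev_key, j)
                    try:
                        with winreg.OpenKey(dev_key, instance + r"\Device Parameters",
                                            0, winreg.KEY_READ | winreg.KEY_SET_VALUE) as params:
                            winreg.QueryValueEx(params, "FlipFlopWheel")  # only touch entries that have it
                            winreg.SetValueEx(params, "FlipFlopWheel", 0,
                                              winreg.REG_DWORD, 1 if enable else 0)
                            changed += 1
                    except OSError:
                        pass  # no Device Parameters key, no FlipFlopWheel, or access denied
    return changed

if __name__ == "__main__":
    print(f"Updated {reverse_scroll(True)} device entries. Replug the mouse or reboot.")
```

If a script this small can do the job device by device, a proper direction toggle in Settings really shouldn’t be beyond a Windows update.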

I’d be interested in hearing how or why this would not be a feasible endeavor. Both pragmatic and loony answers are welcome. I only have two rules: have fun, and be nice.

The Conclusion…

It’s time. Get it done, Microsoft.


PS: The violence in America is intensifying. There are reports of mass shootings coming in daily now. The Laguna Woods shooting was just a few miles from our home. There are very clear signs that America is heading towards some kind of event or series of events that will likely change life in this country forever. I’d much rather it be a march on the Capitol for a protest of epic proportions than another mass shooting.

PPS: One way we can look at this is via representation, one of the core tenets of the foundations of American society. Most Americans are represented, at least the ones who aren’t homeless, illegal immigrants, or Black. Those should be easy red flags. Then divide them into Democratic, Independent, and Republican silos. There’s more red flags. They’re everywhere you look. We keep divvying ourselves up into little groups of people who should get more than everyone else. And we know very well that the big group with the most rabid desire is the white supremacy movement. Another is the National Rifle Association. I get it. It’s easy to get them mixed up.

PPPS: When reflecting on the tragedy of these terrible times, when some of our “representatives” choose to dissemble in the face of the slaughter of nineteen 9-to-11-year-olds in the name of their god, the gun, remember that the NRA membership is fewer than 2 million… in a country of 330 million.

PPPPS: How’s that for representation!


How to sacrifice children for a thriving weapons industry | 2022 Q2 Update


Welcome one and all! Is your weapons industry suffering in the markets when there’s little direct war action to feed the coffers for your “investors”? Our simple program allows you to generate almost unlimited “excitement” for weapon, accessory, and ammunition sales.

2022 has been a year of explosive growth for the gun industry, and we see nothing but profits for the ongoing future. Take a look at some of the successful results from our non-stop “marketing” campaign for “freedom”.

NOTE: The continued performance of gun “enthusiasm” despite the “unintended restructuring” of the National Rifle Association and their extensive messaging platform is a huge cost savings since we no longer funnel funds into their Ackerman McQueen-led AgitProp strike team. We applaud the NRA’s continued efforts to remain relevant, however.

Without further ado, here are examples of “successful results” from our industry’s non-stop “marketing campaigns”…

May 24th, 2022: Uvalde, TX. — Robb Elementary School — 21 murdered

February 14th, 2018: Parkland, FL. — Marjory Stoneman Douglas High School — 17 murdered

December 14th, 2012: Newtown, CT. — Sandy Hook Elementary School — 27 murdered

This is how. You do everything in your power to get gun laws relaxed, if not stricken entirely, so that more and more guns reach more and more true, red-blooded Americans. That, however, only reaches a certain saturation point. It’s not enough that one person own one gun. They need to need more guns. In order for your profit margins to soar in the short term, you need a catalyst.

You need the “Secret Sauce”; fear, uncertainty, and doubt, also known by its acronym, FUD.

As silly as FUD sounds, it is a devastating weapon that can strike without provocation or warning, and at scale, and all without specific instruction. It’s not enough that they feel unsafe in their homes, they need to feel existential dread. One of the most proven methods of instilling deep, unwavering dread is to see their children slaughtered like fish in a barrel, repeatedly. Preferably with some breathing room in between events so we can hone our messaging. We’d also like to thank Alex Jones who has been an indispensable asset in our efforts. We wish him well in his future endeavors.

One side benefit, as well, is that with so many excess guns and ammunition, a lot gets stolen, which must be replaced, which means insurance claims, and so on and so forth, providing a solid foundation for “gravy” profit margins in the peripheral markets and for suppliers. That also puts a load of guns into the “grey” and “black” markets, some of which generate more profits for us at gun shows, while illegal arms feed into the suburban, urban, crime, and minority fear secondary markets. And “Ghost Guns” are fine for now, but as they start to make headway, we’ll need to kick off a “Vuse Maneuver” so we can leverage their work into our profit with minimal capital outlay.

As you can see here, the data is clear. Sales were distressingly low from the mid-90’s to the mid-00’s, but through our continued efforts on multiple fronts, we’ve been able to “eliminate” roadblocks to unlimited growth potential. A strong domestic weapons industry is critical. If we fail to protect this precious resource, hundreds of “Job creators” WILL continue to suffer, and we simply cannot allow that to happen.

You can’t leave it at that, however, you’ve got to dig deeper. You need help from the community, even if you have to manufacture consent. People are sheep. They don’t know what they need. But we, the wealthy and powerful, the ones manipulating things behind the curtains and maintaining a carefully curated veil of legitimacy, we do. Well, we know what we need, and in the end, that’s the only important thing we care to know. Knowing more is exhausting.

At the end of the day, however, we applaud this young man for his crowning achievement as 2nd Place Master Provocateur in our fight to return America to its “Roots”: a “colored folk” shooting range dotted with murderous, little fiefdoms teeming with happy, little aryans, bristling with their powerful, little assault rifles. Our Manifest Destiny is at hand, my fellow Americans!

Won’t it be wonderful?


In case it’s not clear, this is satire. I’m upset about Uvalde, TX. How can I not be? These little babies were murdered, slaughtered like so many pigs in an abattoir, just like Parkland and Sandy Hook and thousands of other mass shootings year after year after year before, and if things don’t change, many more in the coming years. And we pretend this is a “mental health” issue, claim the time after tragedy is exclusively for grieving and that discourse of any kind dishonors the recently deceased, then make deals behind closed doors and in exchange for “contributions” to give the gun industry its fucking money’s worth.

A lot more than our children pay the price, as well. Just look up how many have been killed by gun violence in America in 2022 alone. The humanists among us are traumatized because we feel emotions; the closer we are, the more the heartache compounds. We’ve been here before, and we’ll be here again unless the people act, and that means getting out into the streets, which is what I hope we finally do.

I wrote the above piece back in 2017. I wrote another on my blog, the one that was destroyed by The-Host-Who-Shall-Not-Be-Named, and no, it’s not Blue Host. I dislike them openly. I may repost it at some point. It takes a lot of work to convert from WordPress Gutenberg post formatting, so probably not.

I hope we get some of this shit figured out before someone does something so stupid we can’t come back from it.

I really do.


PS: Depression is a deeply misunderstood ailment that often doesn’t feel real, either to people who suffer from it and don’t know or to those lucky few who’ve never been depressed, though I’m not sure the latter is possible. I’ve suffered from depression for decades now and I doubt it will ever go away, but I’ve still managed to write… at least until Covid-19 came along…

PPS: My wife, daughter, and I had Covid very early, like in February 2020. Fortunately for us, our primary symptom was body-wide pain and we had none of the other, potentially fatal problems. I’ll just gloss over the fact that I happened to get Shingles at the same time, but for those of you who know, you know. What came after, however, has been the most catastrophic symptom of my Covid infection: Long Covid brain fog. If it was already hard enough to write while depressed, Shingles kicked it up a few thousand levels of distressing.

PPPS: I’ve made countless commitments to restart my writing efforts, but I’ve only managed a few spurts of activity over the last few years. I don’t know if I can change that, but I’ll damn well try. I can’t sit idly by, randomly opining on the internet while atrocities are committed on a near daily basis. I need to start being a more active participant in this great experiment gone wrong, a sentiment I hope many more Americans like myself, humanist, open-minded, pragmatic, and forward-thinking, will choose to adopt. I’ll do my best, and all I can do is thank you for reading my words.


How to kill an over-powered super villain

Thanks to Darrel Miller for the heavy lifting here. Check out his expansion pack for the CC&VF tabletop RPG.

Got a super villain problem? Need to “eliminate” the threat when faced with complicated geo-political circumstances? Not sure how to get it done without escalating the situation?

I’ve got a few ideas you’re welcome to…

  • The leader of a threatened nation that is meeting to negotiate with the villain in their sinister lair could have an explosive device surgically implanted that would explode during a scheduled meeting. It would be best if the device replaced an existing, known medical implant, e.g., an artificial hip, so as to avoid tipping off the scoundrel. When calculating yield, go overboard. Nothing angers an arch-enemy more than almost being killed. Of course, ’tis a noble sacrifice on the part of the official who chooses this most ultimate of tasks. We should honor that.
  • When the villain is showing off their tank full of sharks with lasers on their frickin’ heads, kick them in. Despicables such as these are frequently so over-confident that they thoughtlessly place themselves in easily exploited positions, believing that their target would never suspect that they are the lunch. If said villain has managed to protect themselves from the sharks, sacrifice yourself. It’s highly unlikely that, in the mad scrum of blood and gore, the sharks could limit their fervor for feeding.
  • If Mein Führer appears to have thought of all potential soft vectors their adversaries might employ in an effort to stop their advances without engaging in outright battle, bomb them. The problem might be that they’d have access to an advanced weapon that, even if they were eliminated, could still wreak untold destruction. To bypass that potential, carry a number of bunker busters into the upper atmosphere on giant balloons, deploy them on giant, remotely controllable ram-air parachutes, and drop them when over the target or targets. Location intelligence here is key, so work your double-agents and spy agencies hard.
  • Travel forward in time to get an older, mostly broken version of your foe and take him to his young self to use as leverage to convince the junior villain to travel forward in time to kill his older self, thereby preventing the wreck that he would become. If the villain’s future turns out annoyingly positive, you can shift to another dimension where things didn’t work out and engineer your desired outcome, but then… you’d be the villain. If any of this makes sense, you get a cookie. It will be delivered via Einstein-Rosen Bridge portal services LLC. Please make sure someone was/is/will be there to sign for the package.
  • For those of you with a bent for the subtle, long game with an unknowable outcome, try using public education as a trojan. With a bespoke curriculum designed to gently guide each subsequent generation towards very specific ideological precepts, you can steer the populace towards the opinion you want the court of public opinion to espouse. When the time comes for that inevitable super villain’s appearance, you’ll have already set the groundwork, at which point the populace will have determined how best to eliminate the future threat… Maybe.
  • Finally, lots of people have power suits to make them strong, but how many have built a power suit designed to keep a baddie under control? Simply design a highly complex, mobile, armored suit powered by a super-computer-grade A.I. that will stop said villain from being able to do anything but eat & drink, keep clean, and watch hundreds of hours of Night Court, Cheers, and 3rd Rock from the Sun. Racquetball on Tuesdays is mandatory. Don’t be late (not that they have any choice, of course.)

Nobody likes a villain, but that doesn’t really matter because, despite our likes and dislikes, the world doesn’t work like that and we get baddies. Period. Full stop. In fact, they appear to be about as common as your standard, everyday, shit-eating housefly. While our “fortune” regarding the percentage of villains who have achieved “success” has leaned in favor of less mass-murdery types for several decades, there are still plenty of petty, self-important tyrants about to spoil most anyone’s day/year/millennia.

And they do, with alarming frequency.

There’s no simple answer to the question of how to deal with a super villain as each is unique and brings their own set of conundrums heroes need to mitigate or manage somehow. If there were easy fixes, we’d not be facing the situation in Ukraine in 2022. But not everyone agrees that other humans have the right to be free or choose their path or just to have the opportunity to take another breath. It’s terrible. We watch the carnage on TV all day long and feel like we have no power to foment change while it appears the villains have all the power to get away with whatever they want…

…then again, I could rant for days. So I won’t.

I’ll leave you with two thoughts:

  1. If you get enough people to believe in the potential of some concept or thing, you can make it real. Just having the idea isn’t enough and no single person is enough to surmount any and all problems on the road to willing it into existence. It takes a village, quoth he with spite roiling on the tip of his tongue at his need to utter her words, after all. The Randian “John Galt” is an unobtainable mythos. We can’t have it, and the efforts being made in the attempt will destroy the world as we know it, unless we stop it as a collective of peoples who may not share the same ideologies, but do share the desire for our cultures to continue, in peace. I’m not suggesting it will be easy, but we could start with a) not killing each other for a while and b) getting people housed, fed, productive, and inoculated, not necessarily in that order and not as severe as I make it sound. That and trying to save our planet from ourselves should keep us busy for a few decades.
  2. Money isn’t real. It’s a disease, but it’s like the common cold or the flu; we know how to fight it even though it keeps coming back for more. We just have to manage it better until people get used to money being as meaningless as it ever was and get on with the business of progressing as a species. Oh, I’d say about 1,200 years of solid work should put a good dent in the problem. That’s going to take a lot of hero-types who will step forward and stand up for logical, meaningful, humanist policies instead of forever bending our knees to our corporate overlords. Your first target is stupidity. And because stupidity is so stupid, you will have to be ruthless. It’s only fair, since ruthless is all they’ve been to the rest of us, especially since most of us oppressed have been making their world turn for decades. And by ruthless I don’t mean violent, just that you need to stop letting their stupid run the show. This also means that you must also stop money from being the loudest asshole in the room. If I figure out how, I’ll write it down, but for now people far smarter than me should already have some ideas.

Of course, and I think this goes without saying, if you ever get the opportunity to kick the villain into a volcano, just do it.

Why the hell not? What’s the worst thing that could happen? World War III?

PS: ^^^


America's most underrated musical genius | Bruce Hornsby

Bruce Hornsby with The Grateful Dead performing at Soldier Field on July 4, 2015 in Chicago. Jay Blakesberg/Invision for the Grateful Dead/AP Images

I’m not going to go into a long, winding diatribe about how and why and when Bruce became a humble god living among us mortals. He’d just deny it. Instead, I’ll show you.

Hell, this is from 1999. There’s another 20 years of new stuff to take in from there, and he’s still going. Not now, of course. Pandemic, anyone?

You certainly know him from the title track from his debut album, The Way It Is, but that was back in 1986. What did he do from 1986 to 1999? Hmm…

  • The Way It Is (1986)
  • Scenes From The Southside (1988)
  • A Night On The Town (1990)
  • Harbor Lights (1993)
  • Hot House (1995, a personal favorite… fantastic album)
  • Spirit Trail (1998)

Not to mention countless live shows, playing with the Grateful Dead (a lot.)

So, after this show, what more can you expect to find? Take a look…

  • Here Come The Noise Makers (2000, a collection of live recordings from 1998 to 2000 with the Noisemakers, who would become his new band)
  • Big Swing Face (2002)
  • Halcyon Days (2004)
  • Intersections (2006, an essential box-set of loads of unreleased material, live and studio)
  • Ricky Skaggs & Bruce Hornsby (2007, some freaking amazing bluegrass)
  • Camp Meeting (2007, an album of jazz tracks, and very lovely)
  • Levitate (2009)
  • Bride of The Noisemakers (2011, another live collection I can listen to all day long)
  • Rehab Reunion (2016, where Bruce spends more time playing the dulcimer, and it’s amazing)
  • Absolute Zero (2019)

For a more complete listing of almost everything Bruce has done, check out his discography on Wikipedia.

Bruce is a freaking space wizard. He’s a master of two-handed piano technique, incorporates that skill into many tracks, like Spider Fingers from the Hot House album, and loves to improvise when playing live, as you can see here…

Here’s the album version. See if you can pick out how each hand is playing…

Then, there’s all those times Bruce played with The Grateful Dead, and The Dead were always great with bootlegging, so here’s a show from 1990… with Bruce on the keys…

All in all, Bruce makes fantastic music and I think you’ll love it all as much as I do, considering there’s tons to choose from. And if you love live shows as much as I do, then check out Nugs.net where you can purchase official “bootlegs” that are simply wonderful. There are numerous free shows to pick from, and they offer FLAC versions for most shows, all recorded from the board and sounding FABULOUS!!

If, when you look up from a five-hour binging session, you are surprised that so much time has passed, I’ll happily take the blame ;)


I've gone (back to) Mac & why you should, too.

An overview of Apple’s technologically significant M1 chip architecture. Neato torpedo…

Back in 2009, I needed to have the CPU repasted in my MacBook Pro as it was running hot. Ill-advisedly, in retrospect, I poked around Craigslist until I settled on someone offering repair services that I felt I could trust. I spoke to him on the phone a few times, and we arranged to meet. I dropped off the laptop, we chatted jovially for about ten minutes, and then I went home.

I never saw that machine again.

I had been a contented Apple user since the (very) late 1970’s, but as the 2000’s wore on, my satisfaction had been whittled away by a range of issues like Apple’s pricing, the rise of the walled garden, and limitations preventing me from using non-Apple gear, of which I have a lot. As a working writer, the most pressing issue of the time, however, was that I had $600 to get a replacement machine so I could get back to work. It took me four years to save up the $2,400 for it in the first place and a used machine wasn’t going to cut it. I had no choice. I had to buy a Windows machine.

I started with an HP ProBook I picked up open box at Microcenter for $600. It was okay. After a couple of years it became nigh unusable for my Team Fortress 2 gaming, so I got another $600 open box deal at Microcenter, this time a Lenovo Flex 3 with discrete graphics. To its credit, my wife still uses it to this day. It’s slow as hell, but she won’t give it up. Not blowing a ton of cash on these machines, however, allowed me to save up another chunk of cash and, happy with Windows 10 at the time, I dropped $1,800 on my first gaming laptop, an Alienware 15 R3 with an 8GB GTX 1060 graphics card. Despite all my research, it turned out to be something of a shitbox and now sits in a drawer with a dead battery that’s really hard to replace. So, early on in the Pandemic, I bought an open box HP EliteBook 840 G6 for way less than its street value. And yes, it was also $600. If you’re curious, I don’t have any particular affinity for the number 600, just hilarious coincidence.

Just don’t buy Dell. Period. They’re crap and only care about their enterprise business and their hot XPS laptop, which they recently sabotaged with a hideous new industrial design. Typical Dell, am I right?

Now, more than a decade later, I have a new M1 Mac Mini on my desk, an iPhone 12 Mini in my pocket, and a 2013 MacBook Pro I got from a friend. Yes, I still have the HP EliteBook 840 (running the thoroughly disappointing Windows 11 which I’m planning to downgrade back to Windows 10), but I’ve barely used it since I got the Mac Mini, aside from taking the time to charge it up and make sure it has all updates. And it was an easy switch, as most of my tools are online, run from my web host or on my NAS, and/or have long been cross-platform. With a little practice I was back in the groove.

So, what prompted this sudden turn-around?

The first thing that caught my eye was the then new iPhone 12 Mini. I’d been an avid BlackBerry user for years before migrating to the iPhone 4s, but when I lost my MacBook Pro to stupidity, I started trying out Android devices as they integrated better with Windows. Those early days were harsh, and after working through a range of Android devices, I eventually circled back to BlackBerrys with BBOS 10, which had Android app support. I’m pretty sure you already know how that ended, which is when I bought my first truly great Android device, the OnePlus 2. Following that was an LG V30, then a Samsung A70, by which point I could see that these things were getting too damn big. That iPhone 12 Mini spoke to me, so I bought one.

Just one month later in 2020, Apple shocked the world with the announcement of the first M1-powered systems. For the second time in years, I was excited about something from Apple. After six months of being held in thrall by the iPhone 12 Mini, and after ingesting an unhealthy number of YouTube videos running test after test illustrating the M1’s astounding performance and efficiency, I felt it was time to dip my toes back into Apple’s newly inviting waters.

I pulled the trigger in early 2021.

There was more to my decision-making, though. I’d been monitoring the battle between Epic Games and Apple in hopes that the Cupertino behemoth would start opening up more of iOS. I’d also acquired the aforementioned 2013 MacBook Pro from a friend and was pleasantly surprised to find that it ran the then current macOS just fine, which reminded me of Apple’s legendary reliability and longevity.

Microsoft had just rolled out Windows 10 to “correct” the Windows 8/8.1 Start Menu debacle and were doing all kinds of amazing new stuff. I’d come to believe that Apple and Microsoft had switched places, something like a Silicon Valley version of Freaky Friday (Phreaky Phriday would be an apropos title for the film). Microsoft’s efforts were exciting and full of promise, while Apple was building ludicrously expensive cheese graters, spending five years apologizing for the atrocious “Butterfly” keyboard, ignoring their other keyboard “innovation,” the TouchBar, figuring out more important things to remove, and not much else of interest.

Apple’s moneymaker was the iPhone, and from the outside looking into Apple’s legendarily opaque operations, their mobile Golden Goose appeared to receive all of Apple’s focus. The Mac had been left to Apple’s chief designer Jony Ive and his bizarre, experimentally unnecessary proclivities, which are exemplified more by the removal of features that forced the creation of new revenue streams (e.g., AirPods, removing the wall charger from iPhone boxes, etc.) than by the workaday aspects of technological evolution. When Steve Jobs called for the removal of the floppy drive from the original iMac, it was because he predicted that the CD-ROM drive would replace the older, slower, less capacious storage medium.

Steve Jobs was right…

Why did Jony Ive, Apple’s design chief, remove the headphone jack? Were people not using headphones anymore? Were headphone jack modules too big for their thin, innovative industrial designs? Were Apple’s customers demanding true wireless earbuds? No. None of these things were true. They did it because it meant they could sell far more over-priced AirPods and make a load more money than they would by including a cheap pair in the box. And with the removal of the charging adapter, they outright claimed it would contribute to saving the planet, when it’s just another accessory being monetized and whose packaging adds even more waste to our overflowing landfills.

So, you might be asking yourself why, amid all this chaos, would I jump ship… again? I mean, if Apple is doing all this shady crap with their walled garden and taking a 30% cut of sales from the App Store and everything else, why would I again immerse myself in Apple’s “flawed” ecosystem after swimming in the warm seas of Windows for a decade? Well, for one, that ecosystem isn’t as flawed as it was in 2009. Two, it’s complicated, for a raft of personal reasons I’ve already touched on. It hasn’t helped that, after a few years of reliable innovation, Microsoft itself has been sending mixed, confusing signals about its future plans. They were doing phones, then not. They were doing ARM-based Surface tablets, then those shriveled up when their performance failed to meet expectations. They were going to bring augmented reality to the masses with the HoloLens. Where the hell’s that thing?

I believe that Cook and Ive had a disagreement over the ultimate direction of Apple after six long years of the Butterfly keyboard debacle, rising thermals from Intel parts (along with Intel’s inability to get past 10nm processes), and languishing sales that were at odds with Ive’s industrial design desires. All of the new systems essentially roll back Apple’s design ethos to pre-2016 forms: the iPhone 12 clones the wildly popular iPhone 5, while the Late-2021 MacBook Pros sporting the new M1 Pro and Max chips recall the look and, more importantly, most of the ports of the pre-Butterfly designs, ditching the TouchBar for full-size function keys.

I’ve long understood that corporations do what they do for themselves and their shareholders. Many of them recognize that being somewhat responsive to the needs of their customers and producing products that people actually want to buy is easier than… doing the opposite (ahem, Dell.) I’ve also come to understand that it’s all just noise. The industry is what it is, and until consumers actually speak with their wallets, something we most decidedly do not do, Apple and Microsoft and every other gigantic corporation will continue to do as they please. So, until we put our money where our collective mouths are, we have to base our purchasing decisions on something.

Instead, we must focus on who is making the most compelling technological innovations: the chip designs that take years upon years and thousands of super-smart people who know math and physics and science and programming, and that offer the features and forward-thinking I consider when deciding which path I’ll follow. Now that Microsoft has effectively abandoned any semblance of a desire to innovate in favor of reliable, consistent revenue generation, the tea leaves say that Apple’s silicon will be the one to watch for the coming decade.

My desktop as of today, Feb. 20th, 2022, with labels. Kinda looks like a cockpit, no?

I mean, come on! I’ve got a $1,500 Mac Mini on my desk that can reportedly often match or exceed the performance of a Mac Pro “Cheese Grater” costing ten thousand dollars or more! The guys over at Max Tech on YouTube haven’t been able to sell their $15K Mac Pro while performing most of their editing on a 24" M1 iMac. The leap in power and efficiency Apple has brought to the table is insane, and they’ve only just started.

I’d be an idiot not to ride that wave…

Of course, I’m not everyone. I’m a writer with casual gaming tendencies. I don’t push my hardware that much anymore. Back in the early 00’s I was running a Citrix MetaFrame server in the garage office we had at the time. I’d tired of Windows and was running Caldera OpenLinux on my desktop, but still needed Windows apps to write Windows books and had $30k of Citrix’s enterprise software sitting around from a previous book, so I put it to work. Prior to that, I was using Macs exclusively, and worked on Windows material using Connectix’s Virtual PC. I’d say that at least 25% of my Windows-based work was done using systems other than Windows. Clever girl.

It’s hilarious that my wife found this as I am writing this article. This is an actual Beta 2 disc I used for my work on some of the books of the time. The server was running Windows 2000 Server, which was running the Citrix MetaFrame software serving Microsoft Office apps to my Caldera OpenLinux desktop system while I ran XP in Virtual PC on my PowerMac 7300/200. So very meta when meta wasn’t even meta yet…

Times, for better or worse, have changed, and so has technology. Whilst I remain an avid humanist and decry corporate efforts to commodify the planet and its contents for the lascivious pleasure and seemingly bottomless enrichment of the 1%, I do love me some sweet, sweet computer hardware and high-quality software. As previously mentioned, even after being separated for nigh a decade, I eased back into using macOS like putting on an old pair of comfortable shoes. The basics hadn’t left me, and I only needed to retrain myself on the lesser-used functions involving shortcuts and modifiers. Windows did a number on me in that regard, but it’s all about getting stuff done and how little your OS gets in your way.

To that end, macOS is aces. Windows has gone through many significant changes in the last twenty years. Windows 7 was the ultimate exemplar of the classic Windows UI. Windows 8, however, decided to rip off its clothes and run through City Hall with a picket sign reading “Mission Accomplished” before getting tackled by cops and made to put on pants before Windows 10 came to bail it out. But even Windows 10 couldn’t commit to Microsoft’s own legacy, and now Windows 11 is breaking everything in order to be more Apple-like. Had you not used a Mac lately, you might think that this means Windows is ultra-modern and macOS is languishing in the past, but that’s simply not the case.

Using a Mac today may retain the familiarity of classic System 7’s form coupled with the modern sensibilities of Mac OS X 10.4 “Tiger,” but it also offers a dizzying array of functionality improvements and new features. Apple simply didn’t see the need to change the basic conceit of its venerable OS’s user interface and user experience. The Mac is of the future while paying deep homage to the past. I’m inclined to think that anyone familiar with the macOS of a decade or more ago should have a similar experience to mine. Microsoft, on the other hand, can’t even figure out what to do with its Control Panel items.

Today’s application climate does, however, play a significant role. I don’t think my switch back to the Mac would have been possible without the internet’s comprehensive shift towards web-enabled and web-powered apps. As Apple has been able to maintain a 15% share of the desktop market (which includes laptops) and a large percentage of Mac users are creatives, lots of apps are cross-platform. It also doesn’t hurt that the iPhone and iPad are both juggernauts of their respective markets.

All this exposition, for what again?

When all is said and done, what is it that I now derive from the new M1 Mac Mini mounted vertically on my desk that I couldn’t have gotten from my three-year-old 8th-gen HP business laptop running Windows 11? Peace of mind, for one. Windows 11 is a mess. Just search for “Windows 11 problems” and you’ll find an endless slew of examples. Here’s one that Forbes posted just the other day:

Cool, man… Click to read the article. NOTE: You get four free reads if you don’t already subscribe.

This might even sound familiar since it seems like the vast majority of Windows updates offer some kind of flag on the field that even Microsoft can’t predict before it ushers the baby code into the dark world of uncontrolled component drivers. Is it any wonder the Redmond giant wants to clamp down on hardware diversity? Apple has been doing it for decades, and their reliability numbers are significantly better than Microsoft’s.

That’s not to say Macs don’t have issues. They do, just far less frequently, at least in my experience. I generally go weeks without restarting my Macs. Windows 10 was good for a few days, but Windows 11 has been a near daily reboot cycle on the EliteBook (which has good driver support), mostly because of stupid errors, app errors, and persistent UI faults that won’t go away until you IT Crowd the damn thing. The worst I’ve seen on the M1 Mac Mini is easily fixed by restarting the faulty app itself.

I’ve got about fifty tabs open in seven tab groups on Safari, am running eight to ten applications, have a range of support apps running in the background, and it’s just peachy. It doesn’t get loud, like, at all. I don’t think I’ve ever heard the fan. It also never slows down, no matter how much I load up the system. Everything remains perfectly responsive and if there are any faults, they’re typically from an app’s services, poor coding practices, and/or lagging online content.

I’ve also not had to worry about legacy apps. Rosetta 2, the translation system that converts x86 code to ARM and back again (there’s a Bilbo joke in there somewhere), is seamless. Literally seamless. I honestly don’t know if I have any non-native apps installed. I might, and they just run. No popups, alerts, weird icon flags, or anything. I also haven’t had the need to use Windows virtualized. Sure, I had it installed in Parallels, but just to goof around in the ARM version of Windows 11. I haven’t fired it up in months.

(For the record, I just did so I could update it and I’m having to wait for Parallels and Windows to perform all their updates before I can start using it. The entire process took half an hour, not for lack of performance, but because Windows takes forever to get anything done. Even in 2022, Windows Update chokes on its own dependencies where updates fail on the first try because something else didn’t get updated first, and this isn’t limited to Beta versions. Super fun.)
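As for my admission above that I honestly don’t know which of my apps are non-native: curiosity eventually got the better of me. Here’s a quick, unofficial sketch that asks the stock file tool which architectures each bundle in /Applications actually ships; anything reporting only x86_64 is what Rosetta 2 has been quietly translating this whole time.

```python
# A curiosity sketch (not an Apple-sanctioned method): report which apps in
# /Applications ship arm64 code, x86_64 code, or both, by asking the stock
# `file` tool about each bundle's main executable (named by CFBundleExecutable).
import plistlib
import subprocess
from pathlib import Path

def app_architectures(app: Path) -> str:
    """Return 'universal', 'native (arm64)', 'Intel-only (Rosetta 2)', or 'unknown'."""
    with open(app / "Contents" / "Info.plist", "rb") as fh:
        exe_name = plistlib.load(fh).get("CFBundleExecutable", app.stem)
    out = subprocess.run(["file", str(app / "Contents" / "MacOS" / exe_name)],
                         capture_output=True, text=True).stdout
    has_arm, has_x86 = "arm64" in out, "x86_64" in out
    if has_arm and has_x86:
        return "universal"
    if has_arm:
        return "native (arm64)"
    if has_x86:
        return "Intel-only (Rosetta 2)"
    return "unknown"

if __name__ == "__main__":
    for app in sorted(Path("/Applications").glob("*.app")):
        try:
            print(f"{app.name:<40} {app_architectures(app)}")
        except Exception:
            print(f"{app.name:<40} unreadable")  # odd or permission-restricted bundles
```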

What do you tell people about upgrading to Apple Silicon?

Just do it. Unless you use one or more tools that work only on Windows in an x86 hardware environment or are a PC gamer, just do it. You don’t need to spend $1,500 like I did, either. I got 16GB of RAM and 1TB of storage space, but I always overbuy as I generally get more years of use out of the expenditure rather than getting only what I need in the moment. It’s also important to understand that with Apple’s new chip architecture you are stuck with whatever RAM and storage you select at the time of purchase, as the RAM is integrated into the chip package and the storage is soldered to the board.

The vast majority of users will do fine with the baseline 8GB of RAM and 256GB of local storage. Two Thunderbolt / USB 4 ports on the back ensure that large, high-quality external drives are fast, and they’re way cheaper than buying more storage from Apple. Here are some additional pointers about getting (back) into the Mac in the ’20s:

  • There’s a rumored Apple Event coming on March 8th. Wait until then to see what they roll out. Any new M-series chips will be incrementally better than last gen parts, but it’s always best to get the latest hardware if you can. You will derive performance, efficiency, and feature benefits from the changes Apple makes to the previous iteration. Then again, it’s not always about the performance, but the value.
  • Regularly check MacPrices.net for updates on sales. It’s best to keep a tab open to the Latest Deals page. Base model M1 Mac Minis can get down to $650, which is a steal for a machine that will get official support for seven years and actually last longer. Seriously. Also, get AppleCare. It’s worth it because Apple makes it worth it. Be patient, and you can get open box deals for as low as $550. My Late-2013 15" MacBook Pro is officially supported up to the previous major version of macOS. That’s NINE YEARS of support. Windows 8 and 8.1 were out for only four years, three if you count the fact that Windows 10 was released one year before Microsoft ended support for Windows 8 in 2016. Windows 11 was released six years later in 2021. As of yet, we don’t know how Microsoft will arrange support for Windows 10, which works on all PCs, and Windows 11, which is only supported on PCs with TPM 2.0 modules and a few other requirements. That’s not a good look, Mr. Nadella.
  • Do NOT buy any MacBook with a TouchBar. They were made from 2016 to 2019 and have the atrocious Butterfly keyboards, which fail if you cough lightly or dust anything within ten feet (fifteen feet if the lid is closed.) The only exception is the 13" M1 MacBook Pro, and if you’re looking at that, just get a MacBook Air. Apple replaced the function keys with a touch display, and then ignored it, so it’s effectively useless. So, unless you plan on using it as a laptop with an external keyboard or like replacing parts frequently, just avoid them. The article below gives you a rundown of which models to avoid when shopping for used deals.
  • Reflect on what Microsoft has done for you lately, or to you if you’re in a particularly salty mood, and recognize that these are all just computers that perform tasks for you. They aren’t pantheons or ideological camps, despite appearances. They’re machines and you likely need one to perform tasks. I suggest the Mac because of their qualities and benefits over buying Windows. With a Mac you know what you’re getting. With a Windows box, god knows what they slapped into that box, or how poorly. There are plenty of pre-built PC horror stories. Laptops aren’t spared, either, as manufacturers use a wide range of parts they could get deals on, even in the same model, often making driver management a maddening crap shoot.

I recently posted a piece discussing the lowly, but highly functional keyboard shortcut and the idea that keeping your hands on the keyboard is the key to efficiency, but it also serves to illustrate yet another core advantage Apple’s macOS has over Windows.

This is clearly my personal opinion, but I’ve found the Mac to be a solid, all-around, thoughtful computing ecosystem that has remained consistent for decades and reliable to a refreshing fault. While we may pay a premium, something that rankles to this day and that I will complain about until Apple addresses it, you get a system that will last for many years longer than the competition. And if you care about such things, Apple products have excellent resale value, making upgrades far less onerous for your wallet. Then again, I’ve found it difficult to give up my old gear when the time comes for said upgrades. That could be why I still have a MacBook Pro from 2013 in active use and a load of my old Mac stuff in storage, including my PowerBook 145b, the machine on which I started my writing career.

Apple Silicon is an evolutionary revolution…

Marketing and PR types like to sling the word “Revolutionary” around a lot when describing their marginally improved products. In many cases it’s pure hyperbole, but Apple’s ARM-based systems, starting with the M1 and its base, Pro, and Max variants, are more than the sum of their parts. Under the hood it may be an ARMv8.4 architecture part derived from their iPhone processors, but Apple’s chip engineers have built a new architecture that leapfrogs all of the current work being done in CPUs.

Intel’s 12th generation Core processors are damned fast, but they suck up a ton of power to get there and fall flat when unplugged. Meanwhile, Apple’s M1s sip 30 watts of power, perform exactly the same plugged in or on battery power, and can last a day on one charge. It’s not magic and they didn’t create new technologies the likes of which we have never seen before, but they did learn from their experiences designing and improving the iPhone. And all this after Steve Jobs had vocally denied having a phone in the works at all before gleefully announcing it on stage at San Francisco’s Moscone Center in January of 2007.

Those first iPhones used more standard ARM designs, but by 2010 Apple would roll out their 45nm (“thicc” in nerd parlance) A4 chip with a single 32-bit core that had been designed in-house. One decade later, the 64-bit A14 Bionic sports two performance and four efficiency cores built on a 5nm process, with a quad-core GPU and a 16-core Neural Engine as well as a small constellation of support cores. The M1, derived from the A14 chips, can run for up to 18 hours in a $999 MacBook Air while still offering desktop-grade professional video editing functionality. If they can do that, I’m pretty sure they can handle YouTube videos and writing emails.

Some people would call these innovations a “sea change,” myself included. I think if Steve were alive he would be blasé about it, having planned for it years in advance. He was prescient about so many things, removed older technologies at the right time to push the industry towards adoption of incoming technologies, and kept everything he could as close to his chest as possible to limit tipping off the competition. When Cook and Ive took on his role jointly it became clear they didn’t share Steve’s deep insight, likely born of his relentless research into coming innovations. In other words, Jobs was playing 4D chess while Cook and Ive were still trying to figure out checkers, at least until Ive left Apple to start his own design firm.

“Steve used to say that we make the whole widget. We’ve been making the whole widget for all of our products, from the iPhone, to the iPads, to the watch. This was the final element to making the whole widget on the Mac.” — Greg “Joz” Joswiak, Apple CMO

It’s clear that one thing survived Jobs’ passing, though: the plans laid to make the “whole widget.” Apple ditched the PowerPC because IBM was incapable of fabbing a part that could run cool enough for a PowerBook, and the move to Intel parts was just a stopgap along the way to said widget. As I spoke of earlier, the removal of the headphone jack to replace it with egregiously overpriced Bluetooth earbuds was an entirely cynical, capitalistic thing to do, in the same vein as taking the charger out of the iPhone box and packaging it separately while claiming to be saving the planet, but really just creating nearly twice the landfill entrée per device sold through. (Pro Tip: You are allowed to change the box design, Apple.)

Apple’s secrecy strikes again. Microsoft suddenly flounders?

Back in the mid-2010’s, as Apple was spinning further and further away from the path Jobs had set them on, Microsoft was having a kind of renaissance with new CEO Satya Nadella who, with CPO Panos Panay’s emotionally engaging presentation style, sold us on a new age of innovation that would be issuing forth from Redmond. Windows 10 had a whole new Agile-based release process, they were working on the HoloLens Augmented Reality glasses, and talking about bringing a new (not) phone to market. They showed off hot new gear in September 2020 like the (not a phone, per se) Surface Duo and the never-shipped Surface Neo tablet while the HoloLens moved upmarket as a consumer-unfriendly enterprise device priced at nearly $5,000.

Then, at a November 2020 event, Apple dropped a bomb: the all-new, didn’t-see-it-coming M1 chip slotted into the fanless MacBook Air and the fanned MacBook Pro 13" and Mac Mini. The surprise seemed sufficient to push Microsoft off course, and they didn’t correct well. With their ARM-based hardware and software initiative struggling to keep its head above the water in the ankle-deep shallows of a calm lake, Apple’s successes with their shift to Apple Silicon effectively bullied their competition into handing over their floaties. But I don’t think this indicates that Apple has any real power over Microsoft, only that Microsoft’s efforts weren’t as thought out as we’d hoped. This is nothing new, however, as Microsoft has had issues keeping the lumbering juggernaut on its tracks for many, many years.

As a consumer, the takeaway here should be that Apple is offering a well-supported, stable, secure, and capable platform that meshes extremely well with the tens of millions of iPhones and iPads that reside in the hands of Windows users, and that Microsoft has not seemed capable of correcting course in a timely manner. I don’t think these factors will cause a mass exodus from Windows, but they certainly won’t help Microsoft maintain their massive lead over Apple in the installed user base.

I know this is a lot to think about.

I think the two most important things you should take away from this are that the Internet has had a profound effect on cross-platform compatibility and that Apple is doing some amazing things. Whether they will pan out in the future is only for the future to know, but from what I see, the best bet most of us have for a real value in computing gear has started to shift from the WinTel (Windows on Intel) industry towards the Mac.

Apple’s traditional market share has been 14%. It’s now 16%, while Windows has been steadily declining, something that’s easier to see with a larger data set. [SOURCE: statcounter]

Of course, Apple’s not going to jump from 16% market share to overtake Microsoft’s 76% anytime soon, but the Cupertino company has laid the foundation.

From my perspective, that foundation’s far beefier than Microsoft’s for the foreseeable future. Let’s agree to meet up in 2032 to discuss.

PS: Let’s keep Ukraine in our thoughts. That sovereign country is being attacked by a thug who knows nothing but violence. I normally try to be positive and constructive in these postscripts, but this time is different. The Russian Kleptocracy needs to come to an end and the Russian people and the peoples of the former Soviet states must be freed from Putin’s mob-inspired tyranny of lies, murder, and police violence. Even the Russian citizenry know this is wrong. They’re out there protesting when they know they face violent arrest. Nearly 2,000 have been arrested already.

PPS: Be kind to each other. We can do it if we just try. So, try harder, for all our sakes.


Hotel Transylvania: Transformania | Film Review

Despite the growing recognition in the West that animation is not a medium made expressly for children, there is an expansive industry that revolves around productions aimed specifically at that audience. Ahem… Disney? Heard of ‘em? There are a number of outfits that play in that space these days: Warner, Illumination, Pixar, DreamWorks, Universal, and Sony Pictures Animation, in no particular order, are a smattering. Each studio has its own “voice,” but Sony’s Hotel Transylvania franchise speaks in a very particular one, and that voice is Genndy Tartakovsky.

And yet, Tartakovsky’s participation in the fourth entry in the tetralogy is limited to writing the story, co-penning the script, and acting as an executive producer. Instead, the helm was handed off to co-directors Derek Drymon and Jennifer Kluska, who are both first-timers in the director’s chair (that’s a big chair.) Tartakovsky wasn’t the only one to scale back participation in this latest, and reportedly final, installment, either. Most notably, Adam Sandler was replaced by Brian Hull as the voice of Count Dracula, and Kevin James’ Frankenstein was replaced by Brad Abrell. Trust me when I say that you won’t miss them. The replacements are almost exact matches, indicating that star power just isn’t necessary to support feature-length animation. All it does is foster bloated production costs.

Sony describes Hotel Transylvania: Transformania as follows:

When Van Helsing’s mysterious invention, the “Monsterfication Ray”, goes haywire, Drac and his monster pals are all transformed into humans, and Johnny becomes a monster. In their new mismatched bodies, Drac, stripped of his powers, and an exuberant Johnny, loving life as a monster, must team up and race across the globe to find a cure before it’s too late, and before they drive each other crazy. With help from Mavis and the hilariously human Drac Pack, the heat is on to find a way to switch themselves back before their transformations become permanent. [SOURCE: IMDB.com]

At its core, Hotel Transylvania is both an homage to classic luminaries of the Golden Age of animated shorts like Wile E. Coyote, Tom & Jerry, and Popeye the Sailor Man, and even more contemporary adherents to classic animated hijinks like Roger Rabbit and Ed, Edd n Eddy (a meta turducken, if you will), and entirely its own thing, something like Rush is to rock music or Ben Folds Five is to alternative.

Nuts & Bolts

As the fourth entry, HT4 is a solid evolution of the defined art style, but it doesn’t stray from the franchise’s established parameters. This is no Spider-Verse, and yet the color palette is rich and bright, lighting and luminance effects are excellent, and the correct application of bokeh and lens effects brings believable depth to most scenes. It’s clear that Sony has been working hard to refine their process. That said, some of the textures don’t look great (the rock slide in the opening musical number looks like crap), which is a big oversight on such a high-visibility production.

Character designs, on the other hand, are peak franchise, hewing closely to the immutable nature of classic cartoon character design: these characters are iconic and must remain recognizable. They only change in service of a gag and always return to their pristine base form when the gag is complete. This is the same basic rule that defines character design in nearly every cartoon, where the icons always wear the same clothes, and so on. Imagine Fred Flintstone wearing actual pants or Jessica Rabbit wearing something other than her signature red, sequined dress… oh. Well, not that, but you get the idea. Thanks, Disney Whitewashing Division…

The storytelling is solid, if a bit on the basic side, something I’d expect of Illumination’s work. It’s not bad, but it’s also not sophisticated. The comedy revolves almost entirely around the physical gags which, in turn, define the entire film. In a film that is more a buddy adventure flick, a la the far superior Emperor’s New Groove, than another entry in the HT canon, it works to a degree, but not entirely. There’s little to no meta humor that pleases adults and kids alike, depending on your depth of knowledge. It comes down to the gag, and that gag is always a physical comedy bit. The approach takes away from the core conceit: two people coming to terms with each other by walking in each other’s shoes.

Unexpected hair loss syndrome…

If anything jumps out at me, it’s that the film feels claustrophobic. We start in Drac’s house, then the hotel, the plane, the provincial town, the bus, the jungle, and the cave, then back to the hotel. While the story is about Johnny and Drac and their journey toward accepting their differences, and spending time in each other’s shoes, it pulls back from the more expansive world-building exemplified in the Tartakovsky-helmed entries. Since each scene needs to serve the setup for the next visual gag, any potential avenue for additional, foundational scope and scale is ditched in its favor.

And that’s where it lies: at the intersection of storytelling and spectacle, with the visuals winning out over the writing by a long shot. It’s not a terrible movie, to be sure, but you won’t derive much beyond the most deconstructed joy, the kind that makes a little kid roll around in the dirt with the giggles but generally doesn’t survive the transition to the higher modes of cognition that come with age.

It should be a hit for years to come as a go-to for quiet toddler mommy hours worldwide.


Are keyboard shortcuts really more efficient than a mouse?

As a Mac user, there is one very specific admonition from Windows and Linux users I hear frequently: too much mouse! But I’m not just a Mac peep, I’m also a Windows and Linux guy. I started as a Tandy Radio Shack kid, became an Apple ][ kid, then a PC teen, and ultimately a Mac jerk (read: young adult). In the mid-’90s, when the public internet was spun up, I was still a Mac guy sporting a PowerBook 145b sprouting a veritable thatch of SCSI peripherals. But the story gets even more complicated.

When I met my wife and started my second phase of life as a writer, I was working on Windows books. Without any PCs at the time, I did all my work in Connectix Virtual PC. Later I would upgrade to a PowerMac 7300/200 and buy a cheap Packard Bell from Circuit City to act as a proxy server for our little dial-up network (and despite PB’s history of shit, it was a real workhorse for years). Without going into more exhaustive detail, I would use Macs for many more years, then switch to Windows laptops for about a decade until the launch of Apple’s M1 Macs drew me back in.

My Linux journey began in New Mexico with Caldera OpenLinux at a time when there were few niceties, and certainly no app stores. I had to compile my own apps, and in order to do that, I had to learn how to do it all from scratch. I’ve been using one form of Linux (called a distribution, or distro for short) or another ever since, primarily on a Dell Venue 11 Pro tablet. My current favorite is Linux Mint, and it is light years beyond Caldera OpenLinux, a distro that passed away a long time ago amid a raft of controversies brought about by the SCO Group’s attempt to take “ownership” of Linux. tl;dr: they lost.

Over the years, working with macOS, Windows, and Linux, I have developed what I would call a more than passing level of expertise in all three of the most important operating systems. I don’t think I’m a rare bird, but I doubt my archetype is all that common, either. What this extensive experience does grant me, however, is deep insight into the user experience of these desktop systems. I think that gives me the right to say the following:

Keeping your hands on your keyboard, or not having to reach for your input device, increases efficiency? It’s a stupid myth.

I hear it all the time from almost all Linux and Windows power users. Taking your hand off your keyboard to manipulate your mouse/trackball/trackpad/etc., according to these naysayers, is an enormous waste of time. A killer of efficiency, if you will. Personally, I am a trackball user. My current device is an Elecom Huge (Japanese brand, funny names) and it’s marvelous, but I digress. I have, however, used mice for years as well, primarily for gaming. I also have a lightly used Apple Magic Trackpad. For my keyboard, I switch between a Logitech MX Keys for Mac and an Apple Magic Keyboard.

I use keyboard shortcuts the least in Linux. The problem I’ve found myself facing is that, with each new epoch of Linux and its built-in fragmentation, there is no unified set of them. GNOME and KDE have their own, and some distros introduce more on top, like System76’s Pop!_OS and its tiling window manager. And in Linux terminology, even the name of the Windows/Command key is fluid (it’s usually called Super), mostly due, I believe, to general saltiness over having to use the bespoke key of a proprietary OS. Unless I dedicate all of my time to Linux, I doubt I’ll ever develop real proficiency with its keyboard shortcuts. One thing that Linux nerds say all the time, however, is true: using the command line can be faster. I update all of my software by typing out a few commands. There is a learning curve, though.
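For the curious, here’s roughly what that looks like on an apt-based distro like Mint, where those few commands boil down to a package-list refresh followed by an upgrade. This is just a minimal sketch, wrapped in a little Python purely for illustration; in practice the two apt calls are typed straight into the terminal.

    # Minimal sketch: refresh the package lists, then apply available upgrades.
    # Assumes an apt-based distro (Mint, Ubuntu, Debian) and sudo privileges.
    import subprocess

    def update_system() -> None:
        """Run the usual two-step update: 'apt update', then 'apt upgrade'."""
        subprocess.run(["sudo", "apt", "update"], check=True)
        subprocess.run(["sudo", "apt", "upgrade", "-y"], check=True)

    if __name__ == "__main__":
        update_system()

Other distros swap in their own package managers (dnf on Fedora, pacman on Arch, and so on), but the shape of the routine is the same.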

I use keyboard shortcuts about half the time in Windows. The issue with Windows isn’t that the keyboard shortcuts are a chaotic mess, but that they traditionally use the CTRL key as the primary modifier, which is stupid since you have to manipulate it with your pinkie finger. We’ve had the Windows key since 1994, and even to this day with Windows 11, it’s effectively useless. Cut, copy, and paste all use the CTRL key, and switching windows is performed using ALT+TAB. There is a CTRL+W shortcut that will close some windows or tabs, but the default for closing a window is ALT+F4. So, the world of Windows keyboard shortcuts is long established, and yet it’s nearly as chaotic as the mess of shortcuts in Linux.

You saw this coming a mile away but yes, I use keyboard shortcuts the most on macOS. Part of that is a direct result of Apple’s fully integrated environment, which presents a mostly standardized set of F-keys for functions like media control and access to display and navigation features. The other part comes from Apple’s dedication to Command key placement and its role as the central modifier for all basic shortcuts. There is always one Command key to the left of the spacebar, and often one to the right, depending on the keyboard. CMD+Q quits apps. CMD+W closes a window or tab. CMD+TAB switches apps. CMD+~ switches windows in an app. CMD+[ or CMD+LEFT is browser back. Cut, copy, and paste are as expected. Moving around the Finder is a breeze, too: CMD+UP/DOWN/LEFT/RIGHT lets you navigate all windows and files without ever touching the mouse, and using SHIFT instead of CMD selects files as expected. There are dozens more, and if you add the Control, Option, and Shift keys, there are hundreds. For instance, OPT+R gets you this: ®. And yes, I used CMD+SHIFT+4 to take this screenshot, which shows more.

Just a selection of the most common keyboard shortcuts for macOS.

Now, I realize this is starting to sound like I’m cheerleading for the keyboard and against wasting time touching my filthy (not literally) trackball to do stupid mouse tricks that would be “easier” on the keyboard, but that wouldn’t be accurate. In reality, I use my trackball far more often where it matters to me: writing and editing.

Despite being able to use OPT+LEFT/RIGHT to jump the cursor around one word at a time, I find it far more accurate to mouse to the edit insertion point. Think about it… a mouse pointer is an extension of your hand-eye coordination. When you are accustomed to your input device, the pointer goes where you look. That combination, as Steve Jobs predicted, is key to a fast and efficient user experience. There are no keyboard shortcuts to make the cursor jump to where you’re looking, at least not yet. Imagine spotting a typo two paragraphs above where you’re writing. Now, get to it with your keyboard, then with your mouse. How much time did you spend doing each?

I think the difference is stark, and clearly in favor of a mouse.

Ultimately, your experience will be your own or, in internet parlance, your mileage may vary. It all comes down to how you use your computer and the level of proficiency you have developed in one or more operating systems. It’s quite clear that macOS is where I feel most comfortable, and I would argue that most people who haven’t already used a Mac would come to the same conclusion if they could transition seamlessly. But we all know that’s not yet a thing, much like flying cars or setting up a new Android phone with the click of a button.

To the question of efficiency, I think the idea that keeping your hands on your keyboard is the single most efficient use case is bullshit. I’ll admit that I’m likely biased because I use a trackball. It’s always in the same place and never moves, making it a lot easier to get my hand on it. The very large ball makes it easy to get the mouse pointer exactly where you want it (see the link to the Elecom Huge above), and I don’t think very many people believe clicking a mouse button is difficult (I’m not including people who actually do find it difficult and need accessibility tools to help them). Even so, any mouse, trackpad, or trackball leverages hand-eye coordination, and I think that’s a hard combo to beat when it comes to general computer navigation.

So, to all you keyboard masters out there who can fly around a computer without ever touching a mouse, kudos to you, but I don’t think the vast majority of the billions of computer users have the same experience. The combination of familiarity with your operating system and a good-quality input device is how almost all of us interact with our systems. That’s all I have to say on that.

PS: Voting rights. Seriously, voting rights. Can’t hear me? VOTING RIGHTS.

PPS: What will you do when you’re not allowed to vote because you mailed in your ballot a little late and any buffer time was removed or you can’t vote by mail any more? Does voting still happen on a weekday and you don’t get that day off to go vote? When the polls close earlier than they used to or your state closed a lot of polling places? When you’ve stood in line for hours only to be turned away because the rules changed and polls no longer stay open long enough to let everyone in line vote?

PPPS: What voter fraud? Where’s the proof? Why doesn’t anyone ask for said proof and, when bullshit is tendered as “proof”, why isn’t it raked over the coals like it should be? Have a nice day. I love you all, even the people who don’t deserve it. We should all get the chance to live a life, not just the ones who falsely believe they’re superior to everyone else.