Apple has jumped the shark

It wasn't long ago that you couldn't walk down the street without tripping over some new Apple rumor or buzz over the latest and greatest Apple gear. Now, it's all about the white noise we hear from the tech industry as a whole. Are we living in a politics-style news bubble, is Microsoft beating Apple at its own game, or does the Cupertino megalith have something up its sleeve that would make the ghost of Steve Jobs giggle?

I remember, albeit vaguely, when Steve Jobs rolled out the brand new iMac in 1998. You really had to be there to see it live, but I did manage to see it eventually. I was, at the time, an Apple "phanboi". Ever since my first Apple ][, I had loved Apple. I've owned lots of Apple products. I drank the Kool-Aid, as it were. After all, Apple was dropping new products like creepy old ladies drop candy on Halloween. The iMac, the PowerMac, then the shift to Intel and the advent of the MacBook. Then, like a bolt from the blue, the iPhone and the iPad. The computer industry was having a hard time keeping up. It looked like Apple had a crystal ball. The competition tried to copy Apple's formula, but Steve beat them handily, time and time again.

Then Steve Jobs died.

I'm sorry if that seems harsh (you might want to have that looked at), but it did happen. Tim Cook was installed as CEO to shepherd along what was already one of the single most valuable companies in the entire world. It didn't take long to see that Apple did not have a crystal ball, though. What they had was Steve Jobs. Despite all of the ludicrous flaws and foibles that are common to genius, he had a crystal ball in his head. He could see the trend-makers and beat the competition to the punch, but the one thing he couldn't do was teach that trick to anyone else.

Even with annoying British designer Jony Ive at his side, Tim Cook has been struggling to define a course for Apple that still pops out innovations. There is no Steve micromanaging every tiny detail every day, all day long. So, Apple just plodded along and started to copy what others had done while chasing them. The iPhone got bigger. The iPad got smaller. The Apple TV added voice and games. The Mac Pro got more expensive. Every exercise that used to produce real innovation melted into iterative microchange with a premium price attached for good measure. Apple, in my estimation, jumped the shark around the iPhone 6s and/or iPad Pro.

While the rest of the industry has long been hawking the 2-in-1 lappy, née tablet, in full awareness that the tablet market is tailing off, Apple still makes nothing more than traditional laptops. Where you can get a tablet that runs full-octane Windows 10, your iPad Pro still runs tablet software. Grab yourself an overpriced Samsung Galaxy S8 and you can take it swimming, while Apple still slaps you on the wrist if you get its gear damp. If you want something hot and new in Apple products, just grab yourself the new MacBook Pro with its amazing Touch Bar, a video strip that replaces the function key row. Huzzah.

And, of course, everything Apple does is promoted with breathless intensity. Every event is Bob Hope presenting the recently risen Jesus Christ atop a gleaming, floating cloud hovering over Trump's Mar-a-Lago. Yet there were few showmen of the same caliber as Steve Jobs, and Cook has not followed in his mentor's footsteps. Nobody has. The only person in tech today I can think of who has a presence as compelling as Jobs is Microsoft's Panos Panay. Panos is a natural on stage, speaks in an unscripted manner, interacts well with the crowd, and is enthusiastic about Microsoft's Surface product line like an amp cranked up to 11.

Now Apple rolls out a $5,000 iMac Pro?! Is this Apple's response to Microsoft's astonishing, if subtly flawed, Surface Pro? I'm not going to dig into the world of pain that is Intel's i9 X-series multicore mega-parts clusterfuck, but Apple has bought in completely. The stupid thing is that the X-series gear is designed for enthusiasts (sorta, more cobbled together, but then I'm quibbling) and is meant to be built, not presented. Apple "presents" gear. You are meant to take it as it comes, use it as long as you can, and replace it with another steeply overpriced gadget they've breathlessly announced. In a sense, Apple is lucky that the mobile phone blew up, since that kind of gear is right up their alley.

None of this bodes well for a company that has long been playing at the edges of market share. I don't mean to suggest that Apple will go away. Far from it, but it does risk sliding back into the same tasteless, colorless mire it fell into when it first lost Steve Jobs. It's a fascinating history, and if you don't know it, go look it up. You'll be amazed.

TL;DR - Steve brought in former Pepsi head John Sculley to make corporate things work better after Apple's early success with the Apple ][, replacing Mike Markkula as CEO (yeah, Steve wasn't CEO). Following the introduction of the Macintosh in 1984 and a lot of grief in the executive suites, the board sided with Sculley and stripped Steve of his duties. Steve went off to found NeXT and buy Pixar, while Sculley's successor, Mike "The Diesel" Spindler, screwed up Apple's next-gen OS and mobile aspirations. This led to Gil Amelio signing his own pink slip by suggesting Apple bring Steve Jobs BACK to consult. Then iCEO Jobs cut loads of fat from Apple's projects roster, started work on Mac OS X, ushered in the iMac, and began the road to making Apple one of the most powerful companies in the world before he died. Crazy, eh?

Yes, Apple has a huge share of the market in the iPhone, but all of the momentum they built over the years with desktops, laptops, and mobile devices is starting to catch the edges of reality and slow down. I don't think Tim Cook has much longer as CEO, and somebody needs to hand that Ive dude a severance check. His moody crap is really starting to bother me.

Why outrage doesn't matter anymore

Here's the thing. We, as Americans, have lost the ability to lash indignant hellfire at those who violate our collective sensibilities. Cops outright murder Black people almost every day, and we get mad, but nothing changes. Our politicians lie, cheat, steal, and send dick pics to minors, and we get mad, but nothing changes. Companies like Facebook, Google, Uber, the entire oil industry, the entire pharmaceutical industry, and the medical establishment sell each of us for $12 a year, trick hundreds of thousands of people into indentured servitude, skirt the law with impunity, and price gouge with lip-licking voraciousness, and nothing changes. Banks, housing, dialysis, the battle for a living wage, the death of education, the abandonment of the separation of church and state, and much, much more are killing us all every day.

And nothing changes.

Sure, we get mad, we harangue people on the internet, we watch "news" programs that are little more than shills for an ideology, and nothing changes. That's because those who benefit from the status quo have discovered a simple truth. As long as there are enough Americans making enough money to maintain a decent credit score and there's enough "white" noise in the media, nobody gives a fuck.

Americans have their interests predetermined. Get born, go to school, get a job, buy a house, raise two and a half kids, buy a few cars, take a vacation every year on your tax return, enjoy blockbuster movies at your local cinema, keep the riffraff out of your neighborhood, keep your head down, and go to church every Sunday, or at least on the major Christian holidays. Most of all, however: don't question anything outside of your little sphere of reality.

We've been conditioned to be selfish and to think that's okay. There are many, many, MANY things wrong with American society, politics, education, social justice, healthcare, and a billion other things, but one of the biggest, simplest pieces is this grab, grab, grab ideal we have (that was an intentional reference to capitalist-in-chief Trump). If we cared, stopping rape would matter. If we cared, women would have the same chances and choices as men. If we cared, we would make real reparations to Native Americans and the descendants of slaves. If we really cared, we'd recognize that all of us are descended from immigrants, except the people we stole this country from in the first place, of course.

So, until we decide we aren't going to be selfish and share just a little with everyone else, things won't change.

And that really sucks.

America's Ghost in the Shell really sucks

So, I'm watching Ghost in the Shell and the only thing I can think is, "these gringos don't get Japanese storytelling." Not that such a thing stops white people from stealing non-white media and remaking it in their forced, anemic, pasty white image. It's clear that they cribbed mostly from the original film and Ghost in the Shell: Stand Alone Complex (an anime series) to create a franken-movie, not bothering to understand that the film and the series are two different tales which just happen to use many of the same characters. Just to be clear, GITS: SAC is one of the best anime series ever made. It's the only anime series that ever made me weepy over a robot. A ROBOT. GITS, the feature film, is considered one of the best examples of the art of animation, up there with Akira, Wings of Honneamise, and most of Miyazaki-san's works. The new American GITS would be awesome as an episode of the new Mystery Science Theater 3000 series on Netflix.
As for the whitewashing, it's just plain stupid. Everybody is hating on it, and Hollywood isn't listening. Dumb. You idiots are already losing tons of money because people don't want to spend $50 to watch TV in a big room for a few hours. There's TV at home, and it has better stuff playing. Whitewashing is also racist and puerile. We've got enough hate going around without having it shoved in our faces by what's supposed to be entertainment.

If the racism weren't enough, there's the shitty, moody pacing and the constant, nagging remedial reminders that "Major" isn't really human, which is what the story is supposed to be about. So, GITS 101... What makes a human human? Can a thing be human if it contains the mere consciousness of a being, or is that just a clever copy that only seems alive? Ultimately, it questions the soul and where it resides, if at all. This is a subtlety that American filmmakers just can't seem to grasp.

See, there's this thing in Japanese storytelling (even I don't fully grok it, but I believe I'm well ahead of the curve for Westerners) that focuses on the experiential aspects of a tale. For example, in Mamoru Oshii's 1995 theatrical version, there are extended scenes which feature nothing but Kenji Kawai's haunting vocal track and shots of New Port City in Japan. Nothing in this sequence advances the story, which would be considered criminal in Western filmmaking, but it adds both a layer of familiarity and presence to the teeming locale and injects a deeply emotional tone through the score.

Japanese storytelling often features the seasons, with special attention to cherry blossoms in spring, the beach in summer, festivals and fireworks in fall, and Christmas in winter. My intuition tells me that this is derived from the strong sense of tradition in Japan, as Japanese people culturally seek out the beauty and significance of life, the world, nature, and even human works. These are the bits and bobs that get left out or wholly misunderstood when translating Japanese media into American fare.

And that's all I have to say on that.

How Pixar killed traditional animation

It's hard not to think of Pixar and, by extension, Apple as amazing American institutions built by the astonishing, guiding hand of the late Steve Jobs. Pixar, after all, has churned out a steady stream of box office smashes, with the occasional stumble. Yet, at the same time, Pixar has become a cancer infecting the Western world's lauded history of traditional, hand-drawn animation. We are paying a very steep price for Pixar's success today, and for the foreseeable future.

There is no question that Steve Jobs was a visionary who reshaped our expectations of computers and technology. He and friend Steve Wozniak almost single-handedly created the personal computer market in the mid-'70s. However, by 1985 he had been ousted from his own company because he didn't fit the standard corporate mold. Steve didn't rest, however. He founded NeXT and later bought Pixar. Pixar had a megahit with Toy Story in 1995, and Jobs sold NeXT to Apple in 1997. That same year, Steve Jobs returned to Apple as interim CEO.

While Jobs was reshaping what we understand as personal technology, Pixar was hard at work creating a new kind of animation using 3D rendering technology it had invented. That, however, is where the two diverge. Where Apple created an environment in which other manufacturers would compete with it, Pixar carved out a niche that would eventually become the entire market, forcing all comers to migrate to 3D or fall behind.

It's hard to ignore a studio whose every release rakes in hundreds of millions worldwide, time after time, almost without fail. Even the films considered relative failures by critics made tons of money for Pixar and distributor Disney. Spielberg, Katzenberg, and Geffen's DreamWorks SKG was practically the first studio formed to take Pixar head-on, and it eventually landed a number of hits, most notably the popular Shrek franchise. Others would make their marks as well, like Sony Animation, Blue Sky, and Universal. Even Disney started making 3D features in-house.

1995 saw the theatrical release of six traditionally animated films and Toy Story, the first feature-length 3D animated film. Toy Story went head-to-head with Disney's Pocahontas and A Goofy Movie, and Amblin's Balto. By 2012, Pixar pitted the Celtic-themed Brave against seven CG-based and three stop-motion films.*

2011's Winnie the Pooh, from Disney, was the last significant traditionally animated feature to be released in the US.

2015 saw the release of Nickelodeon's The SpongeBob Movie: Sponge Out of Water, which featured traditional, CG, and live-action sequences, so it can't be counted as a traditional film. Feel free to poke around the lists yourself, though it's quite depressing. In fact, Japan is the only major media producer that predominantly uses traditional animation, though it is commonly blended with cost-saving CG backgrounds and other non-character elements. Most Japanese feature-film releases are also traditionally animated, and Japan's most applauded animation director, Hayao Miyazaki, only rarely uses CG animation, and never for anything important.


So, the result has been the almost complete dissolution of Western traditional animation studios. Period. It's not really a matter of efficiency of output; after all, Japan produces a literal fuckton of animation over four seasons each and every year. It seems like it's come down to mere one-upmanship, and that sucks for animators or anyone who wants to go into animation.

Traditional animation is an art form. It is based purely in art as a creative, visual outlet that springs from human hands and is viewed by human eyes. While 3D animation can be, and often is, beautiful, it is far less organic in variance and creative and/or cultural diversity, and it is frequently devoid of emotional impact. That last bit is critical. Sure, a story can be strong, and when edited together well, with good voice acting and a compelling soundtrack, a CG film can be emotionally engaging. They simply lack the additional tonal quality of analog.

Like vinyl records. 

* NOTE: All films noted or referenced were released primarily in the US market. 

Disney, Stop Ruining Animated Classics

So, I watched Beauty & The Beast (2017) last night. Wow. Just SUPER wow, and not for the reasons you might think. It's a dud. A flop. I hate it. The entire opening musical sequence, so full of life and deeply engaging in the animated feature, is flat, limp, and lifeless. Everything after that is a disjointed, misshapen mutation of the brilliant, reinvigorating, emotional feature-length animated version from 1991.


Disney had a hit in 1977 with The Rescuers, but was having trouble with 1981's The Fox & The Hound, 1985's The Black Cauldron, and 1986's The Great Mouse Detective. They struck gold in 1989 with The Little Mermaid, and 1991's Beauty & The Beast made it the first two steps in a long string of successes (barring 1995's Pocahontas, but that's just me). Had they not buggered about with their long history of storytelling, we might not have ended up with a 2017 version.

My recommendation to Disney? Stop it. Do NOT remake one of the most beloved animated films of all time, The Lion King. Stop all plans for other remakes. People don't want live action. They want new stories, not rehashed versions of old ones. If they hadn't remade Beauty & The Beast, they wouldn't have invited discussions of misogyny and rape culture. It would have remained one of Disney's greatest films. Now it's just poop.

Hollywood is having a hard time making ends meet, with fewer people going to theaters to see films. It doesn't help that it costs nearly $50 for a family of four to see one movie, or that TV series from Netflix, Hulu, Amazon, HBO, and others are far more engaging and far more affordable. It's also not helpful that most movies that come out of Hollywood these days are designed for the Chinese box office. After all, the money goes where the money is, and America, you just ain't it anymore.

The bigger problem, aside from the economy and jobs not being where Washington would like us to think they are (sorta a non sequitur, and kinda not), is that, and I know you're not expecting this, Pixar ruined everything. In 1995, Pixar dropped a bomb on the animated film industry with Toy Story, and made precious few missteps up until being acquired by Disney in 2006.

More insidious, however, is that Pixar did for Hollywood what Apple had done for technology: changed its industry forever. There's no question that Steve Jobs was an amazing person, and he did usher in a wide range of technological advances via Apple, but the same can't be said for animation. The sad truth is that Pixar killed traditional, hand-drawn animation as an American art form.

I don't mean to suggest that 3D animation isn't a form of art. Indeed, it can be beautiful, as Pixar, Blue Sky, DreamWorks, and others have shown, but it is not the evolution of animation. We've seen complaints too many times, like those about the characters from Frozen looking exactly like the characters from Tangled, probably because they do. Traditional, hand-drawn animation allows for any number of styles.

Mulan featured stylized Chinese art forms. Lilo & Stitch featured Chris Sanders's unique character design and beautiful watercolor backgrounds instead of the traditional gouache. The Emperor's New Groove featured character designs based on South American art styles. Hercules drew from Greek forms of art and architecture for its look. It's hard to say where any 3D animated film draws its inspiration from.

So, that's pretty much it. America needs to get back to its roots and stop taking shortcuts to everything. We need to stop killing art and demonizing the artistic. Disney needs to get back to creating amazing, hand-drawn animation, and soon, or it will become the company whose pillars are Marvel, Star Wars, and a bunch of old stuff they used to make. What a legacy, Bob. What a legacy.


Salted Wounds: The Ultimate Fuck You

I've talked about this before. I don't talk about being homeless a lot. It's painful, living in hotels for two and a half years. It's a short story if I leave out a lot of the detail. Two and a half years ago, around Christmas, we were evicted from our apartment in Mission Viejo because my publisher sat on their asses getting me the advance check for my book, Getting An IT Help Desk Job For Dummies (available on Amazon, cheap plug). I worked my ass off to finish that book. 330 pages in three months. They didn't care. Companies don't care, so we were screwed. We figured we'd be out for a month or two, then find a new place.

The thing is, nobody would take us.

It's easy. Just have the biggest recession since The Great Depression, mix in one guy with no college degree and 20 years of professional writing and IT experience, a disabled daughter, and a wife who has to take care of said daughter 24/7, then throw in shitty credit and families that lack any kind of real empathy, and you have us. By 2008 I was making $75,000 a year with full benefits. After the recession hit, I could barely make $20 an hour on contract. I moved from job to job, working my ass off, mostly for idiots who refused to listen to simple reason, but those are different stories.

I will say, I had this one client for six months. He ran a small business in Tustin. He wanted to automate his business operations, and had heard I was the guy. We talked and he told me he wanted a system that did X, Y, and Z and all for free. I told him that X and Y could be done but that Z would need to be handled by a different tool and there would need to be integration and that he'd have to spend some money. He hired me, but he didn't listen. I spent six months trying to convince him that there was no way in hell there was any system out there that would do exactly what he wanted, out of the box, for nothing. Nice guy, but dumb as a box of bricks.

Eventually, the jobs ran out. Nobody was hiring anymore and the economy was changing. I was too old, had too much experience, and didn't have a degree. I worked for SendGrid for almost a year earning $65,000 with meager benefits, but was fired after I pointed out that some marketing material lied about what the company was capable of doing for clients. Learn that lesson, kids. Honesty doesn't win you anything in this world. I got a gig with Mirantis for a bit, but their own internal squabbling resulted in my position being eliminated before I could even get started, so there's that.

That's when the book gig landed in my lap, which ultimately resulted in us losing our apartment. We fought tooth and nail to get back on our feet. We begged and begged and begged. We fought with our respective fathers, Rima and I. I've personally spent what seems like months trying to convince my father that the help he was giving us wasn't helping. We made several abortive attempts to buy an RV to live in, one of which resulted in us getting robbed outright to the tune of $3,500, by a former OC sheriff, no less. All the while, Rima was keeping Leah in school and getting a college education, and I eventually started working for Uber. All that time, we still had to pay the hotel, usually with money we didn't have.

So, two years later, we're still in hotels and we get a letter from the Anaheim Housing Authority. They've pulled Leah from the California State Section 8 waiting list at the request of Regional Center of Orange County, the State services group that helps with her disabilities. So, we did as they asked and we started to feel as if things were finally going to change.

We filled out the application while at a hotel in Santa Ana. The forms said we didn't have to live in Anaheim, but apparently that's not how they feel internally, so we were declined. We moved to a hotel in Anaheim and applied again. They looked over everything, and we were declined again, this time for insufficient information. Of course, we had sent the information; it's just that someone else looked at the file, didn't know what was going on, and sent out the denial. Maybe I could get a job there making their shit work.

Finally, we get them to understand they have what they asked for, and they finally approve our application! Good lord, the heavens opened up that day, but only for about a fraction of a second. So, we go in, have a meeting, get our fingerprints taken and a background check run, and are handed a voucher to go talk to a landlord at one apartment building. That's where we find out about Kathy. Kathy Nutter. Kathy Nutter, the head of the Orange County Community Housing Corporation, a non-profit group that has been operating in Anaheim for 40 years.

She was so nice. She talked to us, we told her our story, and she empathized! So few people were doing that, it felt new. She agreed to let us rent from her and we started doing the paperwork. She arranged to have some couches a board member was giving away moved in and it turned out her daughter had a fridge she didn't need anymore, so we got that too. We finalized things, got the keys, and started moving our stuff.

That turned out to be a mistake.

When we had first seen the apartment, we didn't see any cockroaches or smell anything, but by the time we started moving in, we found roaches everywhere and a smell of urine in the kitchen that was just overpowering. We stayed in the hotel one more night. The next night, out of money from the move, we stayed at the apartment. We cobbled together sleeping arrangements and eventually fell asleep, only to be awoken by the screams of our daughter, who had roaches crawling on her face and body. We haven't stayed there another night since.

We told Kathy what happened, and she appeared to be shocked. She said she'd have the maintenance people over there in no time to take care of things. They sprayed something and went away. A few days later, when we checked, we found plenty of roaches, so we told Kathy again. She sent her people over again, and they put down gel and gave us bug bombs. We bagged up some of our things that weren't in boxes and set off about eight bombs. Two days later, we found plenty of dead roaches, but more than enough alive and well, crawling all over the place, to sustain the horror.

Kathy said she'd never had a complaint of roaches or bad smells before. Funny thing is, I spoke to a neighbor and she said they had roaches and had complained many times, and that a sewer line had broken last year and stunk up the entire building for several weeks. Wow.

We've taken pictures and videos. We've documented all of our communications with Kathy. We gave her every chance in the world to make it right, and she didn't take a single one. It turns out that the loving, friendly, helpful, generous Kathy Nutter we thought we knew was just another sleazy slumlord like all the rest. The rent for that place was $1,400. We were going to pay less than $400 a month for our share, the Anaheim Housing Authority would cover the $1,100 remaining, and it was a pigsty at best.

Now, I haven't even mentioned the homeless guy living, literally, on our front doorstep, or the half dozen or so drug dealers who were running shop in the alleyway behind the building. Everything about this deal just felt shady, and we were being screwed over royally. We thought we were safe, having made it into the helpful arms of a HUD-approved group. Things just keep getting better, though.

We asked Kathy to reimburse us for the cost of the hotel. That was denied. We asked for compensation for the money we spent moving in. That was denied. We contacted AHA and asked them to come look at the place again. They did that, saw how bad things were, and said they would cancel their contract with OCCHC. They then suggested we move into another apartment, which looked good from what we could tell, but turned out to be managed by Kathy Nutter, as well. Isn't that a kick in the pants!?

So, we are now asking the AHA to help us get on another waiting list or help us fix this problem some other way, and so far we haven't heard back. It seems they might have hung us out to dry, as well. Only time will tell.

To get our money back, we need to take the OCCHC to court. Small claims. Yay. So that's what we're doing now. And we're back in a hotel. Me, three weeks into a horrible back-pain episode. Leah still going to school. Bills have to be paid. Paychecks only come in when they do, and it's all hell.

So, this is that Fuck You I was talking about. It's the Fuck You that the Universe is giving us. The Jumbo Middle Finger of Fate.

I tell you, it's the most awesome thing ever.

Damon's The Great Wall isn't so bad

You know, if more Americans watched Chinese movies, they might understand films like The Great Wall better. I'm more of a Japanophile myself, but I watch a good number of Chinese epics. Netflix is loaded with them, and some of them are quite good.

The thing you need to understand about Chinese culture is that the ideal of working for the benefit of the whole has been around for a lot longer than Mao Zedong's Cultural Revolution (communism, if you didn't learn about Mao in school). The other thing you need to understand is the Chinese love for tales of mythology. Chinese culture has been around for a very, very long time, so they've got a lot of them.

Now, before we get to the idea of whitewashing, I'll say up front, I disagree. Matt Damon plays a white dude who tries to get to China to trade for gunpowder. Despite being a highly skilled mercenary who is mad down with the bow and arrow skills, he loses all but one of his cohort, only to be taken into custody by the Chinese Army at the eponymous wall.

Without revealing any spoilers, I'll say that Damon's stonehearted mercenary is ultimately swayed by the amazing qualities of the Chinese people to connect and fight a shared enemy. He learns that there is much more to life than just fighting for food and money. I'm not suggesting that China has been some oasis from pain and fear and life as we know it all this time, nor that communism is a fix for our ails, but this communal ideal is a primary element of Chinese culture, and it helps to understand that if you're going to watch these films.

On that note, if you want to see some subtly subversive Chinese filmmaking, check out Chronicles of the Ghostly Tribe. On its surface, it looks like a love letter to the Cultural Revolution, but check out the overtly hyper-positive attitudes and glassy-eyed recitation that make it more clownish than oppressive. It's not a bad film, either.

The only thing I want to say now is that you should give this film a chance. It's an epic fantasy that might even be a little too short to tell the entire tale, but it works. You can even skip the beginning bit right up until they get to the Great Wall. That's where I would have started the film with a short, explanatory preamble.

Why Pharah Sucks


Pharah. That flying rat. The one and only flying hero in Overwatch is the single most annoying hero of the bunch. Nobody else can fly. Mercy can glide and Winston can jump really far, but no other hero has flight capability like Pharah, which makes her suck. Pharah's missiles are also quite lethal, which means anyone playing her who is even vaguely good at twitch games can do a load of damage, and it's very difficult to counter her.

Counter her with a sniper, like Ana or Widow, and it works, but they have to be really good snipers. Counter her with Torbjorn, and her missiles can destroy that Swede's turret in a couple of shots before the gun's bullets can deal enough damage. Counter her with another Pharah, and it's like watching two first-timers have a dogfight.

So, Blizzard. Dump Pharah. Having a single hero that can control the skies is just dumb. There's no balance. Drop Pharah like a hot potato. Ditch her like an ugly blind date. Be smooth. Remove.


The future face of computing

You might not have noticed, since you’re most likely looking at your smartphone, but a significant amount of the time people spend on the internet is through a mobile device, predominantly said smartphone. We’ve got some nifty charts lined up from comScore so you can see just how much.

[[Had an image here, but need to go track it down. Hopefully I don't forget. ed.]]

As illustrated in the above chart, things start shifting around 2012, when mobiles started to take over from desktops (which includes laptops, I presume), and that gap continues to widen. In general, this shows what we likely already know just from looking around; when you need the internet, you reach for your phone around two out of every three times. What’s even more interesting is that age is not a factor anymore. Older people may not use the internet as much, or grab their smartphone as often, but the trend is remarkably stable across age groups.

[[Same here. ed.]]

In all cases, from 19 to 65 and up, people tend to grab their smartphone first. This chart covers users who have both a smartphone and a tablet, but the trend is clear. Mobiles are taking over the internet. What you don’t need a chart to see is that so-called phablets appear to have won, with even Apple having rolled out a larger mobile with the release of the iPhone 6 Plus, but that’s a discussion for another story. Here, we are going to talk about the future of computing, and the above data is where we are now.

Pocket Power

Computing comes in many diverse forms these days: there are smartphones (natch), feature phones (the not-cool ones), tablets, laptops in various form factors, and desktops of endless variety. If only that were it, however. There are also smart TVs, smart DVD and Blu-ray players, video game consoles like the Wii U, PlayStation 4, and Xbox One, scads of media devices like the Apple TV, Roku, and Chromecast, and enough smartwatches to choke a blue whale. If that weren’t enough, there are photo frames, radios, and even TV remotes that use the internet to make them better (though that’s debatable).

In part, it’s no wonder people gravitate to their mobile devices when seeking information. There’s so much… internet, it’s hard to break it down into the small, easy-to-swallow pieces users desire. This is a primary emotional component in why certain apps do so well on the various app stores. The better an app performs at granting the desired instant gratification, the more likely users will treat it as their go-to solution. It explains a lot when you consider that desktops in general don’t have “apps” and tablets can’t fit in your pocket (ignoring that some phones have a hard time fitting into the average pocket). To its credit, Microsoft does have apps on its Windows 8 and 10 desktops, but the app store has yet to mature and present developers with an enticing marketplace.

Future Tense

As you can well imagine, with PC sales dropping like a stone and smartphone and tablet sales soaring, systems producers are scrambling to figure out what the consumer is going to want in the next two to five years, and it’s hard to imagine where that might go. I thought I’d have a think on it and see if I can’t use the old Predict-O-Tron to suss out a few things. First, however, let’s remind ourselves about the categories we’re looking at:

  • Mobile phones (i.e., Android, iOS, Windows Phone, and other mobile OS)
  • Ultraportable devices (i.e., iOS & Android tablets, Chromebooks)
  • Portable devices (i.e., PC & Mac tablets, laptops, 2-in-1s, convertibles)
  • Micro systems (i.e., PC sticks, Intel NUC, & other tiny non-portables)
  • Desktop systems (i.e., small form factor, mini tower, tower, and all-in-one systems)
  • Gaming systems (i.e., Microsoft’s Xbox One, Sony’s PlayStation 4, Nintendo’s Wii U)
  • Smart systems (i.e., home media, TVs, and DVD/BR players)

With that in mind, let’s dig into what might happen by 2020:

  • The Phablet – I don’t think phone sizes are going to increase much from here on out. There’s little incentive for consumers to have a seven- or eight-inch phone they can’t fit into their pockets. 5-5.5″ devices are going to inhabit the sweet spot in this market, while smaller devices will continue to fill the needs of the budget market. As always, there will be outliers, but they will be few and, unless extremely crafty, won’t last long. Advances will continue in hardware and software development. We’ll likely see better cameras, some interesting iterations of curved displays, and desktop-style docking as Microsoft revealed with its new Lumia 950 and 950XL. I have a feeling that we might see some edge-less displays in the next few years, which would allow the introduction of a workable folding device.
  • The CPU – Few people but hardcore nerds talk about CPUs all that much, but they are critical to computing. Clock speeds won’t likely get much higher; they’ve been in stasis for a number of years already. What has grown is the number of cores, the efficiency of manufacturing, and the ability to reduce the amount of heat produced. One key example is Intel’s Atom line. When those parts first appeared, they were anemic and slow. The current crop is fast, capable, and efficient. As such, we will continue to see advances in multi-core parts, further reductions in heat production and power consumption, and increased capability for complex operations and multitasking.
  • Desktop systems – Due to the continued miniaturization of computer tech, we will witness the ongoing death of the general-purpose desktop. The “box” remains a staple in enterprise deployments, but even that’s being eaten away by the laptop. The consumer desktop is already being euthanized by tablets and will likely be replaced by a selection of small-form-factor and tiny desktop systems like Intel’s NUC series and an increasing number of all-in-one systems similar to Apple’s iMac. I have a feeling these won’t catch fire, but they will maintain at least some presence in the market.
  • The Laptop – If anything has been an effective agent of chaos against the forces of the desktop, it’s the laptop. Long gone are the days when a laptop was three or more times as costly as a desktop PC, and with the growing performance of low-cost CPU parts, the multi-component relic of the PC just isn’t appetizing to many consumers. Yet, since the introduction of the iPad, Apple’s ultimate disruptor, the laptop has been facing its own foe, and it has fought back hard. The disastrous “netbook” era was short-lived and ill-conceived as a possible combatant to the tablet, so more tablets were released.
    The most compelling change in the laptop battle, however, is convergence. In other words, if you can’t beat them, join them, or, if you’re the Borg, assimilate them. Asus’ popular Transformer line of 2-in-1 devices is testament to that, but it’s Microsoft’s surprise rollout of the gorgeous and amazing Surface Book that is likely going to act as a template for the future of the laptop. By simply removing the display, you get a tablet PC. Reattach the display, and you get a full-on laptop with longer battery life and even gaming-grade discrete graphics as an option. I predict that various forms of this configuration will become popular. There is, however, the matter of systems interconnect. After all, nobody will want a tablet that weighs five pounds, even if it sports a Core i7 and 32GB of RAM. We may see some very interesting innovations in this particular space.
  • All The Others – There’s not much more to say, really. Gaming systems are going to get more game-ier. It’s unlikely that Sony or Microsoft will diverge from their course. Both command very lucrative markets, and both see only each other as their nemesis. Nintendo, on the other hand, literally owns the dedicated mobile gaming market, but still has to wage war against the monster that is smartphone gaming. I’d bet my shirt that Nintendo knows it needs to emphasize that some games are just better with physical controls and will offer some spin on that when codename “NX” is revealed sometime next year. They’ve finally figured out they need to release some games on iOS and Android, but to really thrive they need to continue to innovate.
    Smart devices are going to get smart-ier. It’s not clear what will happen in the smart device vs. smart add-on war. Smart TVs lock you into that manufacturer’s ecosystem, while add-on devices like the Roku offer a much wider range of options. It will be interesting to see if anyone develops a line of TVs with a low-cost, interchangeable dock in the foot that allows the attachment of an Apple TV or Roku 4 box, bypassing “smart” altogether. That’d be smart, if you’ll pardon the pun. If anything takes off in consumer electronics, though, I think it will be better, open, touch-and-pair wireless display technology using a combination of NFC and Wi-Fi. The stupid dongle thing is a nuisance and restrictive.

In general, much of what is likely to come in the next five years won’t be astonishing, but iterative, an evolution, if you will, and much of it will allow access to the internet in some form. I can’t predict the major breakthroughs we’ll need to move beyond where we are now, but there’s still plenty of room for continued development in the tech we have. For example, if the Surface Book were to really spawn a PC-replacement market, it would need enough functionality to operate as a tablet without the base, but still be able to offer real power when docked. That’s going to require some real innovations in board-level interconnectivity as well as license allowances for dual-CPU systems in the consumer market.

I also expect we’ll see some new takes on the docked smartphone tablet in the next few years. If you look at how Microsoft uses a tiny little dock to turn the Lumia 950 and Lumia 950XL into a desktop analog, you can easily imagine that functionality being integrated into tablet and laptop forms. The shell is just dead until you slide in your phone. This is nothing new. The Palm Foleo was one of the more beautiful implementations of this concept, but it was cancelled before it was released. There was going to be a Foleo 2, but that never took form as Palm fell apart, was bought by HP, and was then unceremoniously shut down. More recently, the Motorola Atrix 4G and its optional laptop dock actually made it to market, but ClamCase’s ClamBook never really materialized. About the only place you can get such functionality today is BlackBerry’s elegant Blend for OS 10.3, Windows and Mac OS X desktop software that lets you work on your BlackBerry via your system’s keyboard and mouse, even without a direct internet connection.

Whatever happens, though, it promises to be interesting. After all, we’ve seen Apple go from defining the market to copying from others, and Microsoft go from being an oldster with stars in its eyes about the Good Old Days to being one of the hottest shops for real innovation, all in just the last few years. It’s going to be a real hoot to see what comes down the pike next to thrill and entice us.

I’ll Disable My Ad Blocker When You Stop Exploiting Me

On January 8th, ExtremeTech published a piece about Forbes forcing users to disable their ad blockers in order to see any content, and guess what happened. Malware.

For the past few weeks, Forbes.com has been forcing visitors to disable ad blockers if they want to read its content. Visitors to the site with Adblock or uBlock enabled are told they must disable it if they wish to see any Forbes content. Thanks to Forbes’ interstitial ad and quote of the day, Google caching doesn’t capture data properly, either.

What sets Forbes apart, in this case, is that it didn’t just force visitors to disable ad blocking — it actively served them malware as soon as they did. Details were captured by security researcher Brian Baskin, who screenshotted the process:

[Screenshot: Brian Baskin’s capture of the malware prompt]

And now back to the original piece…

One of the things I loved about the internet in the 2000s is that it was an overflowing treasure trove of content. Following in the footsteps of AltaVista and Yahoo!, Google had made the internet accessible. PHP, Java, and Ajax were coming on strong and reforging plain HTML to make the internet usable. Creative types and entrepreneurs were developing new ways to leverage the internet. Sure, there were ugly things like HoTMaiL, GeoCities, and MySpace, but we also got YouTube, Amazon, Wikipedia, and Craigslist. Gmail showed up, Facebook took its early steps, and Twitter popped up out of nowhere.

There were also a lot of ads. Corporate Earth had found the internet to be a new resource to exploit, and exploit it did. This was the era of the internet that created the need for the pop-up blocker feature added to just about every browser on the planet. New advances in web tech also created new ways for site developers to be more efficient and more expressive. This brought the Flash revolution and early JavaScript-based pop-ups. Both website owners and ad network owners were having conniptions over click rates and revenues, and web users were getting really sick of ads splashed into every corner.

Hell, they still are.

Yet Google built its entire empire, one of the largest companies on Earth, almost entirely on the simple concept of plain text ads that didn’t stand out like a sore thumb, but few others followed that lead. From this incomplete history, admittedly lacking nuance, we know today that advertising on websites is deeply annoying. We have interstitials, ads which pop up between stories on some websites, or when you jump from one site to another, to keep you from reading before you look at the sponsor’s message. We have all manner of JavaScript-based pop-ups that appear when you scroll down far enough, try to click the Close Tab control, or flip up to ask you to complete a survey telling them how much you loathe their website because of the ads. Even I use them to hawk my book or get you to follow me on Twitter.
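
Part of why these pop-ups are everywhere is that they’re trivial to build. Here’s a minimal TypeScript sketch, entirely my own illustration, of the two triggers described above; showOverlay() is a hypothetical stand-in for whatever renders the actual overlay, and the 60% scroll threshold is an arbitrary, illustrative choice.

    // showOverlay() is a hypothetical stand-in for rendering the ad overlay.
    function showOverlay(reason: string): void {
      console.log(`pop-up triggered by ${reason}`); // render the overlay here
    }

    // Trigger 1: fire once the reader has scrolled far enough down the page.
    let scrollFired = false;
    window.addEventListener("scroll", () => {
      const depth = window.scrollY / (document.body.scrollHeight - window.innerHeight);
      if (!scrollFired && depth > 0.6) { // 60% down the page (illustrative)
        scrollFired = true;
        showOverlay("scroll depth");
      }
    });

    // Trigger 2: "exit intent". A mouseout event with no relatedTarget and a
    // near-zero Y coordinate usually means the cursor is headed for the tab
    // bar or the Close control.
    document.addEventListener("mouseout", (e: MouseEvent) => {
      if (!e.relatedTarget && e.clientY <= 0) {
        showOverlay("exit intent");
      }
    });

That’s the whole trick: a dozen lines of script standing between you and the close button.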

DISCLOSURE: I employ Google’s AdSense on my site and a few others and, get this, I earn a whopping $30 a month. It pays a few internet-related bills. whee.

Then there’s the “Despicable”-class items. These are more behaviors than actual ads. The most common one people come to know and despise is Link Bait, links with titles shrouded in mystery, dropping just enough bombshell to get you to click. Then, of course, the resulting page is saturated in ads. One of the even more painful forms of this is the “Amazing List”-class. Here’s a simple tutorial; think up something gross or sexual, find five or more celebrities who have possibly admitted to doing it, create a gallery of these entries with one entry per page, entitle it something like “7 Celebrity Men Who Have Worn Women’s Panties”, now advertise. Guess what! Schlubs have to load that many pages, each full of ads, just to get through the list. Hideous.

Enter the ad blocker. Ad blockers promise one thing: to block ads from appearing in your browser. The results are simply astounding, if you use the right one. I personally use AdBlock, a plugin for Chrome on Windows, which effectively blocks all ads I don’t want to see, but allows advertisers who behave responsibly to display their tasteful ads. AdBlock is one of the most popular because it works well. In fact, it works so well that the internet advertising industry and sites that derive revenue from ads instead of subscriptions are engaging in collective howls of “foul,” claiming that it works too well and too many people are using it. They’re simply losing too much money, and they’ll have to stop publishing if we don’t let them violate our eyeballs with their ads (or the ears of our blind friends who must endure hideously convoluted crap in their screen readers).
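
Under the hood, the idea is straightforward: test the host of each outgoing request against a list of block patterns, with a whitelist of well-behaved advertisers taking precedence. Here’s a toy TypeScript sketch of that behavior (my own illustration, not AdBlock’s actual code, and the domains are hypothetical):

    // Toy model of ad-blocker filtering. Whitelist rules (the "responsible
    // advertisers") win over block rules. Domains are made-up examples.
    const blockPatterns: RegExp[] = [
      /(^|\.)adserver\.example$/,
      /(^|\.)trackers-r-us\.example$/,
    ];

    const whitelist: RegExp[] = [
      /(^|\.)tasteful-ads\.example$/, // behaves responsibly, so allowed
    ];

    function shouldBlock(requestUrl: string): boolean {
      const host = new URL(requestUrl).hostname;
      if (whitelist.some((rule) => rule.test(host))) return false; // exceptions first
      return blockPatterns.some((rule) => rule.test(host));
    }

    console.log(shouldBlock("https://ads.adserver.example/banner.js"));  // true
    console.log(shouldBlock("https://tasteful-ads.example/banner.js"));  // false

Real blockers use big community-maintained filter lists and smarter matching, but that block-with-exceptions structure is the heart of it.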

It’s gotten so bad, in fact, that now it’s difficult to go to just about any website without seeing some pop-up (am I the only one seeing the irony here?) begging visitors to please whitelist the site so it can continue to exist. Some truly heinous asshats will just block the content altogether until you disable your ad blocker. If that weren’t bad enough, ads are just about everywhere. They’re in our Free-To-Play games, which should really be called Free-To-Play-But-Costs-Money-To-Play-Well games. They flash brightly on giant electronic signs in our cities, blinding us while we drive at 80 MPH on the freeway. They invade our shows on Hulu, even when we pay a subscription fee (that’s changed lately, but it illustrates the point). We’ve been fed pre-movie ads in the form of trailers for so many decades, we now look forward to them! Billions of revenue dollars flow from one corporate entity to the next because of ads, but ad blockers have been putting a dent in that, at least on the internet.

Well, so what!

Who cares if you obnoxious ad people and website operators complain that not every human being on Earth is actively enthralled by your short-form, advert-oriented expositions of so-called creativity? You are hawking stuff, and not everybody wants to look at gaudy promotional material every waking minute of every day so you can make a few more millions, shocking though that may be. If your damned ads weren’t so freakishly annoying and obtrusive, we probably wouldn’t be blocking them! They slow down page loading times. They require plugins people don’t want or need and likely shouldn’t be using, because they open security holes on their systems. Ad networks have even been a source of viral attacks on millions of unsuspecting people who never once thought they might get a virus from a respectable website.

In a nutshell, you are exploiting us and we don’t like it. We now have the power to stop it on the internet, and that bothers you. When ReplayTV and TiVo first came out, they had the ability to skip ads. Where’s ReplayTV now? Dead. Where’s TiVo? They had to cripple the function to survive, but have recently announced a new console that brings back ReplayTV’s long-coveted 30-second jump, thumbing their noses at their oppressors. Millions of people are cutting the cord, ditching cable TV, and getting subscriptions to Netflix or Hulu’s new ad-free program, or just getting TV the old-fashioned way, through an antenna. YouTube even has an ad-free service for $10 a month.

Nobody loves your ads, because you abuse the privilege, and some people will take that abuse to the extreme. So, here’s the breakdown. You stop horribly exploiting us users, and we’ll stop blocking your entirely reasonable, unobtrusively placed ads, and you can go on making revenue.

Better yet, why not try charging a super-small monthly fee to go ad-free, and no, you don’t get to spam those who don’t pay. Just consider asking for a few bucks a month. This is the internet, after all. You can reach millions of people instead of a few thousand in a neighborhood. You can make real money. It’s not that hard.

Look at Google.