Living In the Future: Windows Phone, Pixel, and iPhone

My daughter is home from her first semester at college. It’s wonderful to be able to hop in the car and run to Starbucks with her, just like the good ol’ days. Great one-on-one time, no worries, just her and me.

She did the driving this time, and I asked my phone a question about something we were discussing. She immediately jumped all over me, in a good-natured mocking sort of way, for “talking to my phone,” which “NOBODY does, dad. Except you and two of my friends who always try to talk to Siri and it always fails.” This is a point I have made many times over the past couple of years with my Pixel and Pixel 2 phones: Google users use their voices, iPhone users don’t. Why? For the simple reason that Siri sucks, Google Assistant is great, and voice is vastly superior to typing on a tiny glass keyboard (yes, even an XL phone screen is tiny compared to an actual computer keyboard).

This also reminded me of a post I meant to write just after Thanksgiving but never did. Walking by a big, printed canvas photo hanging on our wall, one taken in March 2014, I was struck by the memory of having shot it with my Lumia Icon, a Nokia Windows Phone with an AMAZING 20-megapixel PureView camera and a Zeiss lens. The detail and color on that 16×20 print are spectacular. Again, this was almost 5 full years ago. The phone was white, just like the Pixel 2 I recently sold in exchange for a Pixel 3 XL, and it felt pleasingly similar in the hand. Along with the camera, the 441ppi 1920 x 1080 AMOLED display was simply stunning (Apple’s newest phone, the XR, has a 326ppi 1792 x 828 LCD for comparison), as was the responsiveness of the entire system. Its 5-inch screen was an incredible jump in size from the iPhone 4S with Siri that it replaced, and handing it to my iPhone-toting family invariably drew a mix of mocking and disdainful reactions. “You are SO weird, dad!” But then I’d bark out a few Hey Cortana voice commands, and the phone would actually do what I told it to do. Miraculous!

And oh, what Microsoft taught the world to do with typography as UI! Apple customers take for granted the use of different font sizes throughout the iOS experience, which replaced the gaudy “skeuomorphic” (only die-hard Apple devotees, as I once was, would be familiar with such terms) design elements of previous iOS versions. But Microsoft often shows the way for others to follow, thanks to the decades of time, talent, and treasure it has poured into basic research, including anthropological studies by actual Microsoft-employed anthropologists, and that was definitely the case with Windows Phone and its beautiful typography. People’s attention can be wrangled and focused by the simplicity of clean typography alone, with overarching elements rendered larger than the ones beneath them that invite further drilling down.
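For the spec-curious, those pixel-density numbers fall straight out of resolution and diagonal size. Here is a minimal sketch in Python, assuming the Lumia Icon’s 5-inch diagonal mentioned above and the iPhone XR’s published 6.1-inch diagonal; both diagonals are rounded marketing figures, which is why the XR computation lands a couple of ppi under Apple’s official 326:

```python
import math

def ppi(width_px: int, height_px: int, diagonal_in: float) -> float:
    """Pixels per inch: diagonal pixel count divided by diagonal inches."""
    return math.hypot(width_px, height_px) / diagonal_in

# Lumia Icon: 1920 x 1080 on a 5-inch panel (both figures stated above)
print(round(ppi(1920, 1080, 5.0)))   # ~441

# iPhone XR: 1792 x 828; the 6.1-inch diagonal is Apple's rounded
# marketing figure, so this lands a touch under the official 326
print(round(ppi(1792, 828, 6.1)))    # ~324
```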

It saddens me to contemplate what so many iPhone users miss out on, just because they have to use what “everyone else” is using. The things being accomplished by Google through software, through artificial intelligence, in computational photography, are insanely next-level, especially compared to Apple’s phone cameras. Most people know that “Google phones are better at low light photography,” and in practice, every time we are out at night, the iPhone users insist that we use my Pixel for the group pics. To them, it is worth going out of their way to find and harass the Google phone guy into taking the picture rather than using their own iPhones. Which, of course, I am very glad to do, because nothing bothers me more than suffering through embarrassingly bad low-light iPhone photography when a simple year-old Pixel 2 would have made all the difference.

The iPhone users are aware of the picture-taking inferiority of their devices, but mostly unaware of the other shortcomings. This is especially true for kids, teens, and college students, but also for run-of-the-mill people who simply don’t care about the latest and greatest. As long as it’s got an Apple logo, they know their star-bellied Sneetch status remains inviolate. As long as the message bubble is blue and not green, they are safe. As long as they are paying 20% price hikes over last year for the privilege of changes or improvements that are indiscernible to 99% of them, all is well.

Yes, I know “trust” and “Google” and blah blah blah. But if anyone believes Apple is not collecting more data than its users know or suspect, I would venture to guess they are mistaken. No, it may not be as much as Google. (And Facebook is beyond any of this, so I leave them out of the discussion entirely.) Even if an iPhone user is giving their money to Apple instead of Google or Samsung or LG, they are still being tracked by every app or website they use on that iPhone, so it really does not matter in the big scheme of user privacy considerations.

What I am talking about is the pure joy of experiencing the latest technological innovations from the greatest technological innovators, Microsoft and Google. I really do wish Windows Phone had survived, because I truly loved the Windows Phone tile-based UI, with the ability to customize both the size and placement of individual app tiles on the screen, something that still cannot be done in either iOS or Android. If this is what we had 5 years ago, what would we have now, if full-fledged development had continued? One can only wonder.

Don’t Try To Make It Something It Isn’t

An IBM PCjr was not a Commodore 64. An iPhone was not a Blackberry. An iPad is not a Mac or a PC. The PCjr was far more useful than a Commodore 64 for things that were rapidly becoming more important than programming sprite graphics exactly the way it was done on the C64, or saving programs to an already-purchased Commodore cassette recorder; still, it was a bitter pill to swallow for a 14-year-old boy who cared nothing about IBM-compatible Okidata dot matrix printers. I didn’t care about writing papers that I could print out for school; I wanted to do what I had taught myself to do, the way I had taught myself to do it, and I needed a C64 for that. Soon enough, however, the capabilities of the PC changed the daily existences of millions, even billions, of people around the world, and the C64 (along with the PCjr) was eventually, inevitably, left for dead.

I also recall years spent in a corporate environment with people who were more obsessed with their Blackberries than any teenager with an iPhone. The thumb-operated physical keyboards blazed away at all hours of the day and night, at desks, during meetings, around the house. Then the iPhone came along, and it had NO KEYBOARD BUTTONS. How IDIOTIC! It may seem hard to fathom for those who did not witness it firsthand, but for a long time, many Blackberry loyalists completely eschewed iPhones. But what the iPhone COULD do, once understood, changed the daily existences of millions, going on billions, of people around the world in the years that followed, once again relegating a beloved technology to the graveyard of gadgets past.

Will the same be written about the iPad vanquishing laptops and their desktop OSes when the annals of tech history are updated at some point in the future? No one can answer that (yet). I can state that I am banging out the letters that form the words which convey these thoughts on an Apple Magic Keyboard paired wirelessly via Bluetooth to a 10.5-inch 2017 iPad Pro, alongside an Apple Pencil which is soon to tap the Publish button in the WordPress interface above and to the right. I have the iPad oriented vertically, in portrait mode, rather than in the traditional PC landscape orientation that is preferred by so many. It is my personal preference to closely replicate the experience of writing in a portrait-oriented pad or notebook, a preference I am free to exercise because this is an iPad, not a laptop. The device also affords me the unreasonably delightful experience of slowly flicking the page up and down with the Apple Pencil to linger over what I have written as I write, then touching the screen with the Pencil tip directly where I want to make a revision, all without having to grasp a mouse and hold buttons down while some primitive, non-True Tone screen featuring Stone Age refresh rates jerkily scrolls up or down as it impatiently waits for a mouse button click to engage a disembodied pointer icon for insertion of a cursor to make a change.

For more than a few iPad users, the choice of device/screen orientation is removed altogether, dictated by the Apple Smart Keyboards attached to their iPads, which force the horizontal layout of the screen. Many others simply prefer landscape mode, likely as a subconscious vestige of years of traditional Mac or PC use, or perhaps due to the split screen that the iPad Pro can enable in landscape for exceedingly useful multitasking. In fact, as I sit here typing in portrait mode, contemplating said utility of landscape split screen while I ponder a synonym for the word “beautiful” that I employed in the preceding paragraph, I just rotated the iPad into landscape and dragged a Safari tab over to split the screen and look up a better word than “beautiful.” “Delightful” is much better. There, the change has been made.

And now, after scrolling through my words for one final pass with a screen possessing a buttery smoothness that is still, if you pause to consider it, almost impossible to describe with words, an incomparable experience that is available on no device other than the iPad Pro, I use the Pencil to touch the Publish button.

Apple, Google, or Microsoft?

For me, the answer is yes.

I try them all, I love them all, I hate them all. Ok, maybe I don’t hate any of them, but the products produced by Apple, Google, and Microsoft elicit some very strong emotions from their users, myself included. Being old enough to have lived through the rise of each of these tech titans, and the hopes and dreams of new and better futures they promised, allows one to look back and evaluate who was, is, and will be most likely to deliver. To do so, we need to look not only at the founders, but also at the ones chosen to succeed them, and why.

Steve Jobs wanted to create beautiful things. He loved beauty, and thought the world would be a better place if it were filled with more beauty, produced by more people. The Mac was his solution to that problem. Another thing he loved was music – simply look at what he decided to name his company after: the record label of his favorite band, the Beatles. So he continually threw everything he could behind music: sound on the Mac, iPod, iTunes, ripping and burning, Apple Music streaming, Beats headphones, Apple Music (formerly iTunes) Festivals, headphone jacks in EVERYTHING, including the iconic ads for iPods featuring silhouettes of corded headphones worn by people listening to music on their beloved devices.

Bill Gates wanted . . . well, I’m not exactly sure what. But business dominance was his early pursuit, and he was comfortable with winning at any cost. So, perhaps his dream was to dominate the nascent arena of computing, and to do so by outsmarting, or more specifically, outmaneuvering, his competitors. If the burgeoning world of corporate computing, with its logical extension of personal computing, were a game, he would lay it out as a strategic battlefield with pieces to be moved, escape avenues for adversaries to be cut off, and unconditional capitulation the preferred outcome. Only the full weight and resources of the U.S. federal government and court system thwarted his ultimate victory.

Sergey Brin wanted something other than the Soviet Union. In other words, something other than a crumbling, totalitarian society paralyzed by corruption and by the lies required to maintain its power structure. For him, born in Moscow and then transplanted to the United States and immersed in its institutions of higher learning and cooperation, information and the freedom to use it were all the power he seemed to desire. That impossibly altruistic motivation for him and fellow Stanford computer scientist Larry Page was enough to drive the creation of a company that would grow to be potentially more powerful than Apple and Microsoft combined, and conceivably as threatening as the Soviet Union was, minus the nukes and troops. You might say that does not sound like much of a threat, but remember, the objective of the Soviet Union was not to annihilate civilization. It was to control it. To manage any threat that might be posed to the Soviet Union itself. No different, really, than any other country’s overriding concern. The most valuable resource the Soviet government had was its intelligence apparatus, which could identify and eliminate small or medium threats before they became larger and more dangerous to the state. The Soviet Union’s leadership would have done anything to acquire the information that Google now possesses on billions of people around the world: where we go, who we talk to, what we say, what we buy, read, watch, and cheer for, the schools and classes we have attended, and who taught them.

Somewhere along the line, something in each of these companies changed. And it changed when the original leadership was replaced. In the case of Apple, it remains to be seen whether the change has been for the better or the worse since Tim Cook took the helm; yes, they are more protective of consumer privacy, but they are not as obsessed with providing the best tools with which to create, and they have approached or even surpassed Microsoftian levels of anti-competitive consumer lock-in behavior. Not to mention the seemingly trivial, yet really not, decision to eliminate the headphone jack not only from their own portable devices but also, by virtue of their market leadership, from most of the important portable devices being turned out today. With Microsoft, it seems as though things are better now that Gates and Gates Jr. (Steve Ballmer, an even more hyper-competitive, less nerdy version of Gates) have been succeeded by the more cooperative, plays-well-with-others Satya Nadella. And with Google, we are all far worse off than we were before the founders brought on “adult leadership” in the form of Eric Schmidt, who introduced maximum profit as the main reason for Google to exist, rather than organizing the world’s information under the guiding principle of “don’t be evil” (a principle Google has recently, officially dropped as it pursues censored product offerings monitored by the Chinese government, as well as the use of its internally developed AI advances for military purposes by national governments).

Where does all of this leave us? Well, it means that in order to figure out where we might be headed, it is really useful to understand where we came from. And in a world where so much of our daily existence incorporates certain devices and services, we can ask ourselves what we prioritize in those devices, and what we are willing to give up in exchange for what they provide. Will we give up more money? Camera quality? Some freedom of choice in how to do things? Interoperability with our families and friends? Privacy? That’s what I endeavor to explore and explain here. Not only for you, but for myself. As one who freely and happily employs technology and devices from all three of these extraordinarily powerful entities, I am keenly aware of the trade-offs that I make with each decision. (I exclude Facebook from all of this for the simple reason that, given what has already transpired so publicly with that company, I can only recommend that people stay as far away from any Facebook product as possible. Please know that I am exercising tremendous personal restraint in the measured word choice employed here, and that I do believe it is just a matter of time before the company is really taken to the woodshed by various governmental authorities, including those in the United States and Europe.) I would like to share some of that experience and insight with anyone who is interested, with whatever small slice of finite attention he or she is able to spare.

Consider the iPad Pro 14.9

Builder of games, maker of apps, and hacker of things Steve Troughton-Smith tweeted a thought into existence outside of his own mind on Halloween, but I just came across it via 9to5mac today:

[embedded tweet]

There’s a nice thread there, and it makes perfect sense. Just as the Air laptops came in 11 and 13-inch varieties and the Pros came in 13 and 15-inch, I believe that size-specific delineation is the right one for the iPad lines as well. 11 inches is portable as hell, and I certainly do not begrudge those who rely on iPad Pros for their professions and either want or need a smaller size. For my money, though, if there were to be just two sizes of Pro, 13 and 15 inches (or 12.9 and 14.9, whatever) seem right. The extra inches for photo, video, coding, going over blueprints or other digital designs with clients, and all other manner of creating would be so in demand that even Apple must be aware of it and working towards that eventuality. If one size had to be dropped, I suspect it would be the 11-inch Pro, leaving us with a 13 and a 15-inch Pro, along with consumer sizes of 11 inches and something smaller. [I refuse to call it Mini; the word “Mini” is insulting to me, whether applied to iPads or the newest Macs, because that word connotes some lesser form of a thing. These devices are full-fledged, consummately capable machines, and naming them “Mini” is as ridiculous as calling something with nothing more than a larger screen size “Max.” Unless, of course, you are someone other than Apple, which, if Apple isn’t careful, is exactly what it is in danger of becoming.]