Pokemon Go Observations

Pokemon Go is on another level. It was released in the U.S. four days ago and is expected to surpass Twitter in daily active users within the next two days (it's already soared past Tinder). What's crazier is that the app is being used for an average of 43 minutes, 23 seconds a day, significantly higher than WhatsApp, Instagram, Snapchat, and Messenger. It seems that even American volunteers fighting ISIS are addicted to it.

While walking in the San Francisco Botanical Garden today, I saw nearly 20 groups of people hellbent on catching Pokemon: people staring at their iPhones, occasionally flicking the screen, and walking around erratically. The cashier at the Japanese Tea Garden was playing it. A group of kindergartners was playing it. Two senior citizens were playing it. It was absolutely surreal. This was a secret world that only Pokemon Go users knew about, and there were constant smiles as people walked by and noticed each other playing. A couple of groups of strangers even gave each other pointers. And I'm far from the only one to have noticed this very evident, large-scale change in human behavior. As one person put it, "[I] haven't seen this level of shared street experience with strangers since 9/11 in NYC."

There are quite a few people who think this is a fad. I really hope it's not. Although this is anecdotal, the app seems to have, practically overnight, encouraged millions of people to explore, exercise, and bond with each other in public. Yes, this bodes well for the future of augmented reality, but more importantly, it injects some community back into a world that so desperately needs it.


Instant Punditry

Today, LinkedIn was acquired by Microsoft for $26.2B. Immediately after the news broke, my Facebook and Twitter feeds were inundated with articles and videos on the acquisition. The "thought leaders" of the Internet went into overdrive, whipping up articles and listicles positing that this is a good deal, that this is a horrible deal, that ___ will benefit most from the deal, that ___ won't benefit from this deal, that you should buy Twitter, and that ___ will be purchased shortly by ___ as well.

For a great example, check out this Inc. article, which provides zero original insight except for this classic intro: "I'm a customer of both LinkedIn and Microsoft. Heck, I wrote the first draft of this article on MS Word."

The best analyses are rarely the earliest. It takes time to let the hype bias wear off, digest all the available information, reach out to sources, do additional research to prove out hypotheses, and arrive at a truly insightful take. However, the system is heavily biased against this. Given the eyeball-driven nature of most online media and the instantaneous flow of information, the biggest prize goes to those who ride the initial wave of excitement immediately following the event. The world won't care about the acquisition two weeks from now, and a well-researched article that captures 100% of total interest then will still reach a much smaller audience than a surface-level opinion piece that captures 10% of total interest immediately after the announcement. News should be time-sensitive, but the fluffy punditry that inevitably follows it shouldn't.
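To make that audience math concrete, here's a toy back-of-the-envelope model. The decay rate and interest figures are invented assumptions for illustration, not measured data:

```python
# Illustrative model: attention to a news event decays roughly exponentially.
# All numbers here are made-up assumptions for the sake of the example.

day_one_interest = 1_000_000   # hypothetical readers paying attention on day 1
daily_decay = 0.7              # assume 30% of attention evaporates each day

# A shallow hot take published on day 1 that captures 10% of peak attention:
hot_take_readers = 0.10 * day_one_interest

# A thorough analysis published two weeks later that captures 100%
# of whatever attention remains:
remaining_interest = day_one_interest * (daily_decay ** 14)
deep_dive_readers = 1.00 * remaining_interest

print(int(hot_take_readers))   # 100000
print(int(deep_dive_readers))  # 6782
```

Under these (admittedly arbitrary) numbers, the day-one hot take reaches roughly 15x the audience of the far better two-weeks-later piece, which is the asymmetry the paragraph above describes.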

Techies and Sneakerheads

The tech world is infatuated with sterile design. Flat, non-skeuomorphic, thin, small, minimally colored, crisp lines, and "minimaluminiumalism." Although there are some rebellious trends nipping at this aesthetic (brutalism, anyone?), expressiveness in tech products has been gradually sapped in favor of cleanness.

Compare this to the world of shoe design. Google any Jordan shoe, any LeBron shoe, any shoe that's remotely coveted by sneakerheads, and you'll get something like this:

The best sneakers are richly colored (sometimes to the point of garishness), expressive, and made of a hodgepodge of materials ranging from fiber mesh to hyperposite. Imagine how a laptop or website version of the LeBron 12s (pictured above) would look. The tech products your hands engage with seem to derive from a completely different, almost alien design consciousness than the one behind the shoes on your feet.

This difference was exemplified by Under Armour's Curry Two Low "Chef" shoes that came out this week. On paper, they followed the classic formula for successful sneakers: backed by the best basketball player in the world (one who's also universally well-liked), expensive, in limited supply, and targeted at casual sneakerheads rather than pro basketball players. Yet the shoes tanked. Why? Because they look like the MacBook Pro of shoes.

The collective sneakerhead world excoriated the shoes for their blandness, comparing them to "the bathtubs in that Cialis ad" and joking that they were made for "emergency room nurses", "arcade managers", and "guys who used to wear New Balance but were still having too much sex." Funny thing is, the same "blandness" the Curry Twos were panned for is the very blandness that made Apple the most valuable company in the world.

Maybe the sneakerheads are onto something. A product can only be so unicolor. A product can only be so thin. A product can only be so bare. A product can only be so aluminum. Perhaps we've reached an inflection point in the cycle of tech design trends.

P.S. No, I don't hate MacBook Pros.


Apple's Disruption

There's been some chatter in the tech sphere about Apple being ripe for disruption. Marco Arment and Walt Mossberg both recently came out with great pieces exploring this subject (here and here). The thought is that AI will be the next big platform, one that touches every facet of our lives (e.g. cars and homes) and that will be more universal than mobile, eventually eclipsing it (mobile becomes a subset of the overall AI platform). Mobile is the current "it" platform and also happens to be Apple's cash cow. In this shift from mobile to AI, the winner won't be the best hardware maker/designer (indeed, many AI systems, like the Echo, make minimal use of hardware). The winner will be the company with the most advanced AI, and the general sentiment in the tech community is that Apple is behind. This is partly due to its stance on privacy, which has kept it from gathering consumer data on the scale that Facebook and Google have (data that is vital to training top-notch AI), and partly due to its historical/cultural emphasis on hardware over software.

But then, just today, Tech Insider threw a wrench in this narrative. If its sources are correct, Apple's recent acquisition of VocalIQ has helped it leapfrog Google and Facebook in some major areas of AI, especially around speech. VocalIQ has the ability to remember context, which has led to test results in which "VocalIQ's success rate was over 90%, while Google Now, Siri, and Cortana were only successful about 20% of the time." The sources were vague on VocalIQ's other capabilities, and the high accuracy may well be constrained to a specific domain, which would have easily allowed VocalIQ to outperform general-purpose AI like Google Now, Siri, and Cortana.

Google Now has also had context for some time now; for example, if you ask "Who is Barack Obama?" and follow up with "How tall is he?", Google Now will automatically assume "he" refers to Barack Obama. Perhaps VocalIQ has found a different, more powerful way to integrate context into its algorithms. Combine this possibility with reports that Apple is building an Echo competitor and will open up the Siri API to third-party devs, and Apple might not be in so much trouble after all.
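To illustrate what "remembering context" means mechanically, here's a toy sketch of cross-turn pronoun resolution. Everything in it (the fact table, the matching rules, the `ToyAssistant` class) is invented for illustration and bears no resemblance to the actual internals of VocalIQ, Siri, or Google Now:

```python
# Toy illustration of cross-turn context in a voice assistant.
# The facts and the naive resolution rules are invented for this example;
# real systems are vastly more sophisticated.

FACTS = {
    "Barack Obama": {"height": "6 ft 1 in", "occupation": "former U.S. president"},
}

class ToyAssistant:
    def __init__(self):
        self.last_entity = None  # the entity mentioned in a previous turn

    def ask(self, question):
        q = question.lower()
        # Naive entity detection: look for a known name in the question.
        for name in FACTS:
            if name.lower() in q:
                self.last_entity = name
        # Naive pronoun resolution: "he"/"she" falls back to the last entity.
        entity = self.last_entity
        if entity is None:
            return "I don't know who you mean."
        if "tall" in q:
            return f"{entity} is {FACTS[entity]['height']}."
        return f"{entity} is a {FACTS[entity]['occupation']}."

assistant = ToyAssistant()
print(assistant.ask("Who is Barack Obama?"))  # Barack Obama is a former U.S. president.
print(assistant.ask("How tall is he?"))       # "he" resolves via stored context
```

The second question contains no name at all; the only way to answer it correctly is to carry state between turns, which is exactly the capability the 90%-vs-20% comparison is about.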

In Defense of Floppy Disks

Recently, news that floppy disks are still being used to coordinate the U.S. nuclear program has been making the rounds on the Internet. This has led to outrage amongst genuinely concerned citizens and armchair experts eager to use this as an example of the U.S. government's technological incompetence. The reasoning goes: floppy disks are ancient, nukes are devastatingly powerful, ergo why isn't the system controlling nukes more modern?

The problem with this obvious reasoning is that it depends largely on societal norms. We live in a world where it has become ingrained in our minds that the latest, most up-to-date technology is the safest. And in many cases (e.g. security patches), it is.

But in this specific case, it might not be. Let's reason this out from first principles. Fundamentally, the U.S. military is much stronger in the world of atoms than in the world of bits. Its manpower and firepower are magnitudes more powerful than its cybersecurity capabilities. Moreover, the world of bits is much more volatile; even the most up-to-date systems have gaping security holes waiting to be exploited, and it's a constant cat-and-mouse game between defenders and attackers.

Given this, would you rather have the gates to America's nuclear program be digital or physical? There's no question. You can rely on overwhelming firepower and manpower to protect a physical object such as a floppy disk; no amount of cybersecurity can offer comparable protection for digitally connected systems. So how do you become 100% certain that your digital connection isn't hacked? Don't connect things digitally. Sometimes the best tech is low-tech, and in this case, floppy disks are probably the best tech.

Granted, the military might have wanted to update its systems, and it may very well be due to incompetence and bureaucracy that they haven't been modernized. But that doesn't mean sticking with floppy disks was the wrong choice.


The Importance of Touch

The other day, I stumbled across my old HP laptop, which was gathering dust back home. I've been a MacBook Pro user for about a year now, but before then I was a staunch PC guy. The specs were similar, so I gave the HP laptop another chance.

It was disgusting. 

Again, this is coming from a once-diehard PC guy; I was the dude in any group who regarded Mac users as gullible pseudohipsters overpaying for an okay laptop (in terms of specs). And granted, the HP laptop performed pretty comparably; that wasn't the main issue. The main, immediate issue I had with my once-beloved HP laptop was how goddamn flimsy it felt. My hands almost cringed at the laptop's shitty plastic shell. Gone was the smooth coolness of the MacBook's aluminum unibody. They had similar specs, but the HP felt magnitudes more low-quality simply because of the material it was wrapped in.

The external material used in many consumer hardware products is one of the most important parts of the product. It's the material the consumer interacts with, in some cases, for hours on end every day. It also tends to be one of the more overlooked steps in the process of product creation, the part that takes a back seat to the product's visual design.

Metal (especially aluminum) and glass will always feel more high-quality than plastic. A clear example is the Mac family vs. most PCs. The same comparison extends to most iPhones vs. most Androids, or to almost any item used in daily life.

In fact, Apple's emphasis on metal and glass might be one of the more overlooked factors in why it has such a good reputation for quality amongst its users. Every single day, users turn on their Macs or iPhones and feel the unibody aluminum or the cold, smooth glass. This hooks users, and over time their hands won't settle for anything else.

This strategy is extremely effective when your competitors are trying to compete on price and, as a result, are churning out products that might have similar specs but are made of plastic. Price is a one-time cost, but a subpar touch experience accumulates over the lifetime of the product, and customers will notice it immediately after trying higher-quality alternatives. In the long run, the item with the higher-quality external materials will inevitably win (holding specs constant) simply due to higher retention, even if its initial pool of users is smaller.

Expectations and Reality

Expectations and reality start out as two separate parallel lines.

These parallel lines want to converge, they must converge, and when they do converge, the intersection point is an accomplishment.

Yet too often people change the trajectory of their expectations to intersect with reality. Maybe because they realize their expectations are "unrealistic" or simply too difficult. Maybe because their inner fire slowly flames out. Thus they bend their expectations, and their accomplishments end up simply in line with reality.

The small minority who hold on to their expectations (or even raise them) will confound reality. Subconsciously stubborn, they will give reality no choice but to twist and bend to accommodate their expectations.

Expect nothing less, and you will move reality.