You Are Probably a Bad Person

Or, the Cure to Anxiety

Maybe you expected to see this subhead say something like “and here’s why that’s okay” or “and it doesn’t matter”. But this is not some ode to self-care. Nor is it some weird segue into explicating the doctrine of original sin (I’m no theologian, though I am a notoriously bad Catholic). No, it is a simple observation that, in general, you are most likely a bad person, on the basis of sheer probability. So am I, though my ego is already trying to assert that if anything I am probably worse than you, in some perverted desire to stand out from the crowd.

The reason you are probably a bad person is largely the classic sin of omission, by the way. You may donate to charity, and yet how much of your income do you actually contribute versus, say, your spending on clothes? Perhaps you could have helped save some lives by buying off-brand, but you did not. Do you donate blood? Oh, you don’t because you are afraid of needles? Well, that blood would surely have been appreciated by that person who just died in a hospital because they had no matching donors. How many times a day do you judge someone silently or aloud, consciously or unconsciously, blithely assigning a complex, unknown person a label, epithet or verdict while taking barely any time to truly know them? If you are scrolling Instagram while you read this with half an eye and little of a mind, did you pause to consider how exploited the labor that assembled your iPhone is?

This series of increasingly inane accusatory questions was designed to lead to the inescapable conclusion that we do not truly have a way of knowing whether we are actually doing as much good as we should. Sure, we can strive for the bare minimum accepted by nearly all cultures worldwide since time immemorial, i.e. not murder anyone. Perhaps we also do not steal, and try to maintain that with the utmost scrupulosity. I’d wager, however, we’ve all stolen here or there, e.g. via a slightly adjusted timesheet, or a purloining of office goods. How bad is that? Well, how bad were any of the notes above? What truly IS our obligation to our fellow people?

A much holier and better person than I, Pope Francis, would say, much as he did in his most recent encyclical Fratelli Tutti, that our obligation is to love one another. I’d concur, and conclude that by that measure most of us are, once again, bad people. Do you really love even many of the people you interact with? And by love, I mean, knowingly and willingly put their best interests above your own, consistently? I’ll be frank in that I don’t, being relatively selfish.

With the rate of our failures as high as it is, it is understandable why our nature is hardwired to think of ourselves as “good” people, deep down. The cognitive dissonance of coldly accepting that we fail most of the time, and are bad people who do not live up to our full potential, is ultimately often too painful. But ironically, it is the only path forward to a key insight: once you accept the daily struggle, it becomes a liberation. Much like cleaning or exercise or any of the minutiae of rewarding habits, which seem too difficult if you believe there is a nearby finish line when in fact the effort must be expended daily just to keep up, avoiding the reality that this is your lot in life is likely to culminate in depression. I’d wager that the skyrocketing rates of mental health issues being diagnosed in the early 21st century, the Age of Anxiety, are in no small part driven by the disjunct between this calm acceptance and the many business models that rely on spurring us into a cycle of endless consumption in pursuit of “end goals”.

To those tragically contemplating suicide, perhaps that does actually seem like an end, the only actual closure. Unfortunately, your life is a hefty amount to wager on such a reality. Succumbing to despair is alien to the essence of humanity; please reconnect with someone, anyone, and learn how much you matter beyond your wildest dreams. And that, interestingly enough, is perhaps the oddest secret of all that you discover once you accept your flaws as well as those of others: You realize that we are all in this together. Rather than expecting more from others, you more often than not exist in a state of astounded gratitude at how, despite all our flaws, so many people end up doing the right thing, in a given moment. Sure, they may fail later, but isn’t that part and parcel of being mostly bad?

If so, maybe the secret of calm, poise, equanimity, whatever you care to call it, is in the simultaneous recognition of personal and external flaws. You don’t cut someone else slack because you are the better person – you cut someone else slack because you look at them and say, with a smile: “I see you, my brother/sister/fellow/comrade.” It is not necessarily that in that moment you find peace because you find yourself to be equal, because it is very easy for many to then follow that moment of clarity with self-doubt, a negation of your personal virtues and pride in favor of returning to contemplating your own flaws. Peace only follows for a time, as long as the balance of that recognition and knowledge of self-worth is maintained. It is a delicate balancing act, one the world’s current state plus the vagaries of human nature are not inclined to tolerate.

But this is not an age of tolerance, despite what woke/social justice movements may want you to think. We tolerate each other less than ever before; we tolerate ourselves less than ever before. Suicides are skyrocketing; children injure themselves; drug abuse abounds; the coldest of cold wars spreads steadily across the globe, sparking internecine conflict in every nation.

There are many symptoms of this intolerance, and even more potential causes. Clergy and demagogues will appeal to the better angels of our nature as the solution, but I propose a more sardonic approach befitting this very nihilistic (if not quite completely) age: Let’s reveal the worse devils of our nature. Let’s begin, not with self-care, but with self-recognition. We are probably bad people, in many ways.

The thing is, if we all are, maybe no one of us can or should feel superior to another, for any reason. And if that’s true, what then follows? I don’t know about you, but when I realize that, I breathe a sigh of relief. The pressure is off, the rat race is a lie, the comparisons are all invalid. No, all I need to do is accept that likelihood, and then go about changing it. And even if I don’t get too far, at least I got somewhere.

The Art of Disagreeing

Is learning how to disagree an art? You may disagree. However, that is entirely within your rights. Art is notoriously subjective, after all. Moreover, I acknowledge that the fact you disagree is very unlikely to be motivated solely by the desire to somehow hurt me; if it is, then I regret to inform you that although our opinions are, in some mysterious way, part of ourselves, I am well aware that they are only opinions, and thus I should not take disagreement with them personally. So let us note that we disagree on this matter, and proceed with our discussion.

The above paragraph contains much that I aspire to be, and also does remind me why some tell me that at times I give off every sign of being somewhat android-like. But hey, as a mid-millennial (born in ’91, so barely old enough to remember pre-Internet days but young enough to be shaped very much by it), I’ve been exposed to enough disagreement that a strategy had to be developed to survive and indeed thrive online. More personally, as a Catholic raised conservative (although now a known troll with a grab-bag of oft-conflicting views) in Seattle, learning how to disagree politely was a necessity in maintaining amity.

It’s interesting that such a thing is not taught, but rather expected to be learned. When you think about it, humans are hardwired to find agreement, if possible, within their own tribe, and then dwell comfortably ensconced in an environment where shared beliefs are reinforced and peace of mind remains unchallenged. But that is difficult to do, as the near-limitless network of the internet both enables you to find likeminded others and dooms you to potential exposure to ideas and facts you may find downright distasteful. So why not learn how to disagree? I am hardly an expert in anything, but given a rich history of disagreeing passionately, disagreeing purely for the hell of it, disagreeing casually, disagreeing to the point I have incurred considerable costs both financial and physical by doing so, and disagreeing because I have an odd mix of beliefs and principles, I figured I may as well spell out the art of disagreeing.

1: Emotion & Ego

Disagreeing is neurologically painful. As in, disagreement with others can trigger the same responses in your brain that physical pain does. Much as heartbreak can produce actual physiological distress, so can disagreement. That is because disagreements generally involve emotion and the ego. Our opinions, we like to think, are part of what makes us who we are. They are crucial to our identities.

But only to the extent that we consciously let them. Most of what we believe is likely not strictly provable and therefore not necessarily true. None of us is as objective or rigorously consistent as we like to think we are. So, when any disagreement over any topic whatsoever arises, especially in an emotionally charged scenario, we must recognize the role of the ego and the level of emotion that is currently engaged. Is this disagreement related to something that I like to think is a core part of my identity? How “heated” am I getting given this disagreement? Am I engaging in the classic signs of unconscious distress, e.g. biting my fingernails, rubbing the back of my head, ruffling hair?

In any pending or current disagreement, one must recognize consciously the roles of emotion and ego to start.

2: Detach & Engage

From there, one must emotionally detach, if possible. I say only if possible because although we should all be able to rule our emotions, we aren’t perfect. Also, one’s inhibitions may be compromised; shockingly enough, I find after a significant amount of booze I am somewhat less thoughtful and more prone to emotional outbursts. (On a sidenote, isn’t it hilarious how we can deliberately deprive ourselves of our rationality and self-control, some of our best attributes?)

But we shouldn’t detach just from emotions. We also must detach from the specific environment and recognize precisely what the other person is likely feeling as well. It is critical to do so in order to lay the foundation to engage. That person is potentially also feeling discomfort and, unless they are being deliberately malicious, is also facing the same challenges to their ego. They may or may not really mean what they say, or understand the full implications, but they are using the imperfect tools of language and speech to try to convey something that may be meaningful to them. So just as you are, they too may be. It usually takes a step or two for me to recall that, and sometimes sadly I don’t, since my range of empathy is not the hugest (the flipside, as a dearly loved one once told me, of having the emotional range of a teaspoon).

3: Repair & Relate

Many writers far more eloquent than I have found ways to convey how little other people think about you. That seems harsh, or even rude, but honestly, it’s true. People generally tend to be preoccupied with their own affairs. Disagreements are no different, and are arguably more exacerbated nowadays, because it’s never been easier to tell someone they are wrong, even anonymously. But once one has detached and engaged, then one can repair the thread of the conversation that has been disrupted by a disagreement. It’s not that the thread is irrevocably torn – perhaps if I were a better writer I’d use the metaphor of a flowing stream disrupted by a large stone.

And the surest way to repair that thread is to relate. It is not common in casual conversation to acknowledge a disagreement as frankly and fully as I have above, but as I am not a fan of many social norms or conventions, I happily flout them by saying something along the lines of “I see that we are disagreeing and it’s okay, for I respect your opinion and recognize that I may not be right, but we can leave it at that and move on”. One of my dearest colleagues has Slacked me the phrase “let’s move on” approximately 341 times thus far. If I were a more skilled conversationalist, maybe I’d have a defter tactic, but nobody has ever accused me of subtlety before, so it is what it is.

From there, as a natural human instinct you will recognize in many conversations, people tend to find something in common once again to reassure the other person that they are on “the same page”. (That may be the most commonly used phrase, actually.) Next time you are at a bar and overhear a conversation, it is quite likely that you will recognize this pattern in many a dialog. It makes sense, right? After disagreement, you wish to return to harmony, to once again find common ground so that you can reassure the person you are speaking to that you are in accord. It’s the human and humane thing to do.

Why does all of this matter? Because in 2020, we are disagreeing more than we have in quite a while. It is quite likely that I do not share some of your most passionately held convictions, whereas you may have no idea what mine are. (Frankly, neither do I sometimes.) But disagreement is not the end of the world, despite what most of social media and nearly all media would like you to think. They profit from your outrage and pain, after all. It is okay – more than okay, it is good to disagree, because then we collectively have better odds of arriving at the truth, rather than putting all our eggs in one basket via unanimity. So embrace disagreement.

You can be assured that I shall.

P.S. After further thought and especially the furor around the latest nominee to the US Supreme Court, I wanted to make a note regarding when one ought to disagree. Should I disagree with the mugger who accosts me with a demand for my wallet and phone? Violently. Should I disagree with a friend on abortion? Sure, per my beliefs and worldview. But unless actual violence is set to occur, one can still disagree and value the relationship. Few relationships should ever be sacrificed over matters of principle, because most of the time they don’t need to be. Antonin Scalia and Ruth Bader Ginsburg were both avowed intellectual enemies and yet friends. Hector and Achilles bonded before their final duel. In the last days of 1914, soldiers played football in the no-man’s-land of France. Violence and toxicity – essentially, when the costs of disagreeing become too high to be borne within reason – are really the only occasions for letting an irrevocable rift emerge. Sadly, these times do occur. Over the next decade, I am likely to lose friends due to my beliefs. The thing is, that’s okay – to everything, there is a season. Sometimes it is better to gently fade rather than rage against the dying of the light.

Love & Death in the Time of Coronavirus

A few weeks ago, my good friend Alex died. It was not due to coronavirus; rather, some other illness took his life, in what was a complete shock to everyone who knew him, from friends to colleagues to acquaintances. At the same time, the all-encompassing threat of the COVID-19 pandemic kept encroaching slowly upon everyday consciousness, like the incoming tide that so relentlessly, subtly creeps up to lap at your feet.

When there is such an unprecedented tragedy of epic proportions occurring, it is hard to contextualize something as final and personal as a friend’s death. It is difficult to, at times, walk or run past a bar in your friend’s old neighborhood and think, “Oh, that was his favorite spot, and five weeks ago we played pool together there for a few hours,” while thousands of people around you are facing a future rife with uncertainty. It’s as if a palpable tension lies in the air, like the slight smoky moisture before a summer storm. And through that atmosphere you just happen to be strolling, with a much more immediate concern dwelling on your mind, even as the air imperceptibly clings to you.

Just before travel restrictions began shutting down airports, and on the same day bars and restaurants were closed, and a week or so before we were all ordered to stay home if at all possible, a group of friends and family were able to convene briefly at a beer hall. There is no agenda for such a thing, really; Alex’s sister just wished to meet all of the friends she’d heard much about over the past few years. (By the way, I have to come clean – I was a good friend of Alex’s, but I was in no way as close to him as some of my other good friends were, so I was affected far less than they. In addition, I am well acquainted with death, unfortunately.) And it was a remarkably cathartic experience, even if no overt emotion was expressed. People milled about and discussed this and that, calmly relaying plans to hold an honorary brewery crawl in Arizona where Alex had gone to undergrad, or learning about what his sister and her husband were doing in London.

It was one of those seemingly ordinary things, but performed as a ceremony of sorts, that we all cling to so fervently when the fabric of the everyday is rent and the shocking or the cataclysmic occurs. The mysterious comfort in a sadly customary ritual – a group of those who wished to pay respects, those who knew someone who is suddenly gone, gathering together even if their only tenuous bond is that one, absent individual – is easily taken for granted. Now, in the age of the first global pandemic in a century, it is even more apparent how even the most minor of liberties, taking a stroll outside, is suddenly that much more precious when it is lost. We are confronted with losses of all kinds in all walks of life; in response, we grasp for the familiar to rebuild our rituals, being a resilient species. Love, primarily, is what we’ll return to in unconscious instinct, whether it be eating out more than we care to, in order to keep local small businesses afloat, or checking in on friends we don’t speak to that often, or donating to healthcare-focused charities. It’s unfortunate it takes the worst to bring out the best in us, but that’s the way it goes.

There are thousands if not millions of words being written to express that precise sentiment, but probably only hundreds will acknowledge the inevitable: Upon the end of the extraordinary measures taken to combat the COVID-19 pandemic, we will also revert to old habits. Nearly all of us…will eventually cease to grasp the exhilaration of freedom of movement; will forget to truly appreciate an in-person conversation over a fine meal at a restaurant; may chafe at the economic, fiscal, monetary and social debts incurred to defeat the pandemic, forgetting how critical they may have been; and will ultimately return to much of what we were like before this truly global catastrophe. Much will have changed in, say, remote working, government powers, public health approaches, etc. But people won’t really have changed all that much. (Including myself, frankly; I am already a tad wistful for a return to normalcy.) Maybe a little. People don’t change much; and if they do, it usually takes time, as well as often a tragedy or triumph…so just perhaps the pandemic may qualify…among other things.

Perhaps there is another thing, as of late, that belongs among those other things. One thing that won’t change is our memory – my friends and I, that is – of Alex.

The second wave of digital addiction is coming…or is it?

Thesis: Even as modern societies struggle with the adverse effects of digital immersion on mental and social health, a second wave of even more immersive experiences is imminent, with unanticipated yet not wholly negative consequences for cultural norms.

Virtual reality is quite hard – augmented reality may be even harder, all said and done, based on my considerably uninformed knowledge of its inherent engineering challenges when interacting with the physical world. But as Oculus and other companies – aided by massive capital infusions despite the fact they may never release a viable product, *cough* Magic Leap *cough* – continue to slowly advance the basic tech within the space, it isn’t inconceivable that we could encounter an even more immersive digital experience than that currently proffered by smartphones.

That does not bode well for the human brain’s ability to resist the chemical rush of notifications, always-on communication and the rare yet always treasured brand-new, high-quality meme. If you think memes have broken the ability of everyone under the age of 30 to converse normally now, wait until they are interactive (arguably, TikTok is already doing this). But however much more compelling a seamless digital-physical reality may prove in major urban centers boasting sufficiently advanced tech, there may actually be some more positive outcomes than may be supposed at first.

As #InstagramReality is already proving, it’s irresistibly tempting for people to try to portray themselves in as flattering a light as possible online – that’s just human nature. With more virtual reality overlays possible in videosharing and photos overall, the norms of any reality are going to be so obviously reset to impossibly glamorous, stylized levels that more authentic communication will become chic again to some degree. This is already happening – people will still obviously want to look their best, but a new median will arise that blends digital polish with physical imperfections more closely. (I tend to agree with Alex Danco that glasses or headsets for virtual reality are a tough sell beyond probably just gaming, hence why I think its applications will still be primarily conveyed via smartphone interfaces.)

But another intriguing possibility pops up here. Deepfakes are already starting to proliferate, so it’s only a matter of time before they become a two-way street. Rather than being used for lulz or cons, both governments and individuals will likely employ AR-empowered ‘live’ filming and photography to further their agendas, e.g. to evade facial recognition, find hidden details or promulgate whatever narratives they want. Services to help evade identification and/or preserve privacy will pop up as tools available for any imagery editing, while still letting people use social media like normal. Heck, that may finally be another use case for crypto-based payments that will help popularize them even further.

I’d argue that’s more of a social positive than may be suspected when one contemplates a future a la Blade Runner 2049, wherein AR displays are primarily for advertising or, well, overlaying a VR presence onto a human woman for reasons you can probably guess at. Holograms are still likely a ways off, but there could also be value in rendering communication in an even more lifelike fashion via screensharing, or any manner of telehealth. Think of any given virtual appointment you can conduct via video with a primary care physician right now – One Medical comes to mind – and imagine a much more lifelike consultation session wherein your phone’s camera scans whatever portion of yourself is visible and recreates it as a 3-D model for the physician to analyze on their end. (Yes, this will lead to hilarious videos that will inevitably leak online.)

All in all, there could be quite a few more positives to AR in the future than an even-worse digital addiction epidemic. As always, there is more nuance in reality than may be suspected at first.

The Cycle of Centralization, Part 1

Thesis: What makes the most sense to be decentralized? The maintenance and operations of the people closest to doing the thing. What makes sense to be centralized? The most common denominators and broadest parameters of the scenarios, e.g. laws that apply to all humans based on human nature, and decisions directly pertaining thereto.

Human history rhymes, but it does not repeat, as the saying goes. Composed of intertwining cycles, history often lurches from one extreme to the other, although it’s worth contemplating just how much swifter such lurches have become given the compounding powers of technology. A critical cycle that often seems to provoke significant division is the basic swing between centralization and decentralization. Especially nowadays, as we head into the Roaring 20s 2.0 (as if that wasn’t confusing enough already) and Bernie Sanders appears to be somehow gaining steam again, Venezuela rampages toward complete failed-state status, Putin appears immortal and Boris Johnson proves UK voters only trust politicians with terrible haircuts, it appears that the center truly cannot hold.

However, there is a much more even tension, and potential balance, between the forces of centralization and decentralization than may seem at first glance. I’d contend that right now, several different phases of the centralization cycle are occurring across political systems, economies and sectors (all of course related):

  1. China is attempting to stick the landing of a pseudo-capitalist yet centralized state, but there are more troubling signs for China than many suspect.
  2. The US is recentralizing somewhat and potentially will recentralize even further if, as seems likely, antitrust talk becomes action in the next couple of political cycles. However, the groundwork at the fringes is being laid for decentralization.
  3. The EU is slowly decentralizing, as potentially the most audacious experiment in recent political history starts to totter.
  4. Centralized online information flows are approaching senescence and thus are likely to start splintering; the decentralization/destruction of traditional news media is nearly complete.
  5. Entertainment within the most developed nations is mostly centralized but beginning to fray at the edges as consolidation reaches its apex, e.g. Disney becoming a monolith in the US. In less-developed nations, however, entertainment cycles are still oscillating significantly.
  6. The energy industry is looking set to start decentralizing very slowly over the next several decades.

What’s interesting and instructive about all the above is that they all exemplify how and why centralization SHOULD occur, as opposed to where it often naturally comes into being due to system design and/or sheer scale. Let’s examine each of the three main arenas above that have contributed to these different phases of the centralization cycle:

Information Flows

Is it now time to declare that open source has won? Or will another attempt at a Microsoft-like monopoly potentially rise again in the future? I think not, for a variety of reasons, but the most compelling one is that there is no point in trying to wall off access to code bases, and the ability to innovate upon them, when you can instead own the infrastructure behind them and learn from the best of what unpaid volunteers decide to offer. It’s simply better business.

For a similar reason, Apple is pivoting mainly to services: they have seen the writing on the wall and know that continuing to offer the best hardware-and-software fusion in the world for mobiles is not possible for much longer, given the degree of competition and innovation already beating them (for example, it’s arguable Google has been making overall better phones for at least a few years).

In short, it’s much easier to profit from a relatively open, accessible database for experimenting in code and/or offering the infrastructure for people to get their projects up and running. (That’s why Google Cloud, Azure and AWS have grown so massively in the past several years.) The primary reason is that decentralization works extremely well for systems with unified rules of logic that operate in intangible environments.

However, as those systems create ever-improving and more widely accessible tools that are used to promulgate the multifarious viewpoints of individuals, the degree of decentralization is likely to regress for a time. Why? Because even though each person is unique, it’s not by much. We all share the same biological makeup, and thus tend to be co-opted by particular narratives of fear, passion and humor. Accordingly, there is much money and power to be had by owning the largest publishing platforms where narratives can proliferate and draw in attention, which is then monetized in a variety of ways.

Social media platforms are the current primary actors in this drama, and in their efforts to finetune that monetization machine they’ve kinda made a mess of it much faster than even yellow journalism did in the 1890s. Once the intended audiences and participants in narrative design become aware of manipulation – however benign the intentions of those who designed the narrative algorithms on Facebook, Twitter and other media hubs actually were – fragmentation becomes inevitable, even for those ostensibly trying to be merely marketplaces/publishing platforms.

For providers of news, it is even worse. Media centered on news in general is notoriously difficult to centralize unless total control over citizens’ existences is in place, because shades of interpretation of even something as simple as a tragic helicopter accident can spin out of control and spawn a thousand lines of discourse in a matter of hours, in the online world. Granted, much of the discourse is the same, but as it is communicated in varying degrees of proficiency it all appears checkered until usually a few main strands consolidate. News is necessarily ephemeral; decentralization is inevitable. Technology has only accelerated the ebbs and flows of the news decentralization and recentralization cycle. Only the most advanced brands have resisted destruction in online news, e.g. The Wall Street Journal, the New York Times, and even they are feeling the pinch. I don’t think they’ll die completely but will be recast in a new paradigm that honestly I haven’t quite figured out yet. It makes sense that at the very end of the traditional news breakdown, the last entities standing are the ones that encapsulate tried-and-true clusters of human mentalities and traits; by that I mean the WSJ is the brand for conservatives and the whole host of associated narratives, traits and ideas, likewise for the NYT and liberals.

The good news is that new brands will inevitably arise as discourse and economic and political realities evolve. Centralization will rise again, though it’ll take some time. What is difficult to anticipate is the degree to which augmented/virtual reality may help transform the experience of news and information access in general. Trust will be at a premium, so brands could resurface much faster than anticipated as people instinctively flock to what providers they can trust, but it is also possible a prolonged era of distrust and fragmentation is upon us. Whichever outlets figure out the best way to present immersive news with significant messaging around a balanced perspective will be the Fox plus WSJ, or CNN plus NYT, of the 2030s. Exciting times, indeed.

Up next are nation-states, where some truly interesting events are occurring.

Surprise with joy

The sentiments of Christmas are so enduring because they so powerfully combine both the nostalgia and idealism intrinsic to human nature. Peace on earth to all humankind – who would openly disparage that? Joy to the world – however you follow that up, or for whatever reason you mention it, it is hardly a declaration that many would take exception to.

But joy is harder and harder to come by as you age. It’s not quite like the joy I observed this morning on my six-year-old nephew’s face, as he unwrapped a Lego Batman set, and his entire hour, or maybe even day, was made joyful. When you are younger, it is much easier to encounter joy, and also despair, and such intensities of emotion in general. But as you get older, joy and similar intensities are necessarily muted due to accumulated experience…and sadly, often worries. It becomes rarer.

As with all things rare, however, it becomes more valuable. And in a way, as that sensation of joy rarefies, it is also refined. It becomes more surprising, and almost always is found in establishing connections. Given the pressures and vagaries of life, adults usually know that time and connection – whether it be enjoying the company of friends or family or even finding connection within oneself between warring sentiments – are priced at premiums. Consequently, they value time spent with others over material expressions of value (if their priorities and ability to communicate are attuned, which is not always the case). Children may not value the joy of explicit connection with others as consciously, but they can more easily reach the joy of being in a particular moment, or receiving a particular thing.

It’s harder and harder to preserve that sensation, nowadays, and even harder to achieve the joy of true connection. We are drowning in information even as we thirst for connection – sitting at home alone because going out is either expensive or doesn’t seem like precisely what we want right now or both, bombarded with images and headlines that are surpassingly negative and/or appetite-inducing, as those are what hack our brains all too effectively. It is possible to achieve some connection online, and feel some approximation of joyfulness thereby, but we are physical beings who will always crave the full ability to speak face to face and approach as close a connection as possible.

But we all unconsciously comprehend and are more openly grappling with these phenomena. And so we reach for the trappings and traditions of the holiday season, making the effort to go home both near and far and see friends, family, loved ones. It may be stressful, and it would certainly be easier at first to try to grasp once again the immediate joy that my seven-year-old goddaughter reveled in when she woke up Christmas Day, but past a certain age, deep down you know what type of joy you really wish for. The more difficult, and more refined, joy of connection.

I suppose they are both joys of connection, tied together in a perpetuating loop, for to my nephew and goddaughter, the gifts received were hard evidence of love and attention – children are savvy and thus look for such proof constantly, after all. But once you can fend for yourself, the gifts we look for are more difficult to come by as they necessitate giving first. You must reach out, whether in word or deed, gift or gab, and meet your fellow men and women, your tribe by blood or bond or both, or even acquaintances and strangers, in just the smallest gesture. It’s difficult – and thus surprising when successful. But it’s the holiday season – is it not the most wonderful time of the year to surprise with joy?

The Misinformation Age

The coming of the internet promised easier access to quantities of information on a previously unimaginable scale. Until the first truly ingenious browsing tool was invented by a company with a name alluding to quantities, there was fierce competition for the ability to find that information. There is still competition for the best aggregation of information, and arguably no one compelling solution. But now, after the first few decades of the information age, it is unmistakable that the central issue in developed information economies is neither access to nor the quantity of information, but rather its curation and verification.

A few months ago a Forbes article posited that 2.5 quintillion bytes of data were being created every day. On a purely anecdotal basis, it is possible to consume thousands of tweets per day. (You really shouldn’t.) I don’t even have the nerve to search for how much content is being generated on Facebook or Instagram. We are drowning in information, even if you only regularly visit, say, Reddit or Instagram.

The effects of such inflows of information are still being studied. But this post is not about the consequences, good or evil, of the volume of information. Rather, it is about the value, or lack thereof. More importantly, it is about the market opportunity for information curation and validation that I believe is set to be one of the more important in the next decade or so.

The Personal Touch

Just a few hours ago I was strolling in Seattle’s First Hill neighborhood with some cousins, one of whom asked me what the price of a ticket to the Seattle Aquarium was. This type of information request is still commonplace in social gatherings, as it is potentially highly efficient, fosters companionship and serves as an indicator of trust and perhaps affection…or at the very least as something to fill a gap in conversation. However, if you pause to think about it, such requests, even though very common, are potentially quite flawed when viewed through a lens of objectivity. First, it presumes I have either gone to or contemplated attending the aquarium to the degree I looked up ticket prices; second, it presupposes that is something I’d be interested in; and third, it could, though not necessarily does, carry the implication that what I enjoy she also would enjoy.

In this particular case, my cousin was simply being savvy; we tend to enjoy the same things, and I have looked up ticket prices to the aquarium. This is the key reason behind the frequency of such questions amid relatives and friends; we presume that our similarity, whether genetic or not, will result in similar preferences. To some degree this can be accurate, but only to some degree. By and large, though, the disparity in tastes means that random asks (“Hey, do you recommend this coffee shop?” to a friend who resides in that neighborhood as you happen to pass by it) and the subsequent recommendations can backfire.

That is the degree of error implicit in personal recommendations regarding fairly low-stakes decisions, such as where to get coffee. But we all engage in such requests and recommendations constantly, as it is a personal and personable route to gathering potentially useful information, even though it is riddled with subjectivity. What happens, however, when we remove such interactions one step from the face-to-face, into the digital realm?

Sharing Isn’t Necessarily Caring

By now, multiple tropes have sprung up around the sharing of content by connected yet semi-remote people on social networks, from the “crazy uncle that shares memes about building the border wall” to the “Starbucks-sipping soccer mom trying to stay hip”. Regardless of whether you chuckle or get vaguely peeved at the content shared, the fact is you consume at least part of it, often the most salacious, incendiary or misleading tidbit that spawned the headline.

This is what I like to think of as two-degree-distant information, as opposed to the one-degree, face-to-face interaction. You are most often connected on social media to people whom you actually have met in person, as extreme extroverts are rare. These are people whom you have at least some slight investment in staying connected to. So, how do you treat information from them? If it’s purely personal, that is the best of all possible worlds, as such is the most useful if mundane purpose of social media, e.g. learning my friend in London has been training for races. That way, you can bond over new fitness routines when you see him or her again. 

What if this information is not personal, nor related to a topic that person is invested in deeply enough that others would trust their judgment? To illustrate, there is a difference between your friend Bob, who is a data scientist, sharing an article about “10 common myths in data science” versus said Bob opining about the best way to cook turkey just because he really likes turkey. (By the way, turkey tends to be garbage meat so there really are few to no good ways of cooking it. Stick to better meats like pork, chicken, beef, fish, alligator, quail, duck, rabbit…you get the picture.) The disparity obviously lies in areas of expertise. But how many of your friends and family really stick to their areas of expertise when sharing content and/or information? To take my personal egregious example, I retweet dozens of observations and articles weekly, simply because I found them interesting. That’s it. My sole criterion is my personal interest. Fortunately, nobody regards me as an expert, but at best as a prolific, weak-form curator.

The point of this line of questioning around information gathered from personal connections is that it represents a significant share of the information we are bombarded with each day, but even though it is generated by people we are connected to and care about in varying degrees, that doesn’t necessarily make it valuable, or even accurate. It merely makes it more personal. But therein lies the rub.

The Market for Misinformation

A huge market exists for misinformation, and it’s all in your own head. We are primed since birth with information that then shapes our worldviews and likes and dislikes. Information that then contradicts those established, favored facts and beliefs is cognitively more difficult to process, and consequently we seek to avoid such jarring revelations. It takes valuable time to read through something after all, even if you continue to disagree with it after the fact. 

Hence why we are, in a way, doubly primed to favor the articles shared by friends that we get along with very well, especially on topics on which we know we share similar opinions. Confirmation bias is a heady drug. Accordingly, a huge market for misinformation has always existed and always will, and it is becoming vaster every week as the internet continues its inexorable expansion. The best-known examples include the 2016 election in the US and the abuse of Facebook networks in Southeast Asia, but there are countless salient iterations of the desire for misinformation that confirms what certain markets, or tribes, so to speak, believe.

But, amid this flood of information, much of it misinformation, what can we actually believe? How can we figure out what is true or not?

Quid Est Veritas

Pontius Pilate’s famous question of “What is truth?” often serves as part of the reason he is maligned. But he shouldn’t be for that question alone. It is one of the fairest questions we can ask, and especially nowadays, asking “What is true?” more frequently is a noble endeavor. (What actual truth is, in noun form, is a much harder question to answer, frankly.) What can actually serve as useful criteria for assessing truthfulness, in an age where even bastions of public information like the Wall Street Journal and New York Times are decried as shills for agendas and the very office of the US presidency is occupied by someone whose accuracy can be most generously described as suspect?

Here are a few rules that I think are the most practical:

  1. Information from a source that neither helps nor confirms that source in any way.
  2. Information from a source that actively hurts that source.
  3. Information from a source that corresponds to the most widely and historically held beliefs.  

All of these except the second can be difficult to achieve in some ways, but they are so simple that they are quite useful as heuristics. For example, it does the Pew Research Center little to no good to publish the results of a survey confirming that the American political spectrum is becoming more and more bimodal – i.e. there is less common ground between registered Democrats and Republicans than there has been in decades. One could argue that such bad news would potentially benefit its social reach, under the belief that “if it bleeds, it leads”, but since everyone has already seen such a bimodal distribution in action in the wake of the 2016 election, the finding does Pew little good beyond reinforcing its utility as a source that keeps publishing findings many presume are common knowledge. Hence, Pew is likely to be trusted.

It may be a lesser-known law of public relations that in nearly all cases it is best for corporations to admit any wrongdoing and accept the consequences rather than try to execute a cover-up. At the individual level, we all too frequently try to cover things up, so this advice may seem rather too good to be true. It often holds, however: getting out ahead of a negative finding is the best course of action, simply because people already tend to believe the worst, so it’s best to admit to the truth as best as it is known. For example, Facebook’s Cambridge Analytica debacle is thought to have contributed to its record-setting market cap loss in the past few months. But until Facebook owned up to the full extent of the breach, the rumors swirling around its true reach were far worse than reality. Furthermore, once a company confirms the extent of wrongdoing, there is a strong predilection to believe that at least that extent is true. You may still suppose, if you are negatively biased toward Facebook like I am, that there is even more skulduggery lurking in the shadows, but even I am willing to believe that what they’ve admitted thus far is true.

Last but not least, the historical process of trial and error conducted by humans over millennia is perhaps one of the most reliable heuristics to use. We simply know that certain foods, prepared and paired in a certain way, lasted for centuries in a given region because the combination was nutritious, tasty and easy to produce given the region’s natural abundance. For a concrete dietary example, there was little reason to strip the nutritious husk from a grain of rice; whole rice served our ancestors well long before we achieved the ability to create white rice. We know that we can trust certain people in certain scenarios. Utilizing the inherited wisdom of our ancestors can reduce errors of all kinds.

The Market Opportunity

If I were a much smarter man, I’d already be working on the company that can crack this. But it’s a bit complicated, so I’m still thinking my way through it. Essentially, given the flood of information we all bathe in daily, we often still employ the classic if flawed heuristics of relying on personal recommendations and/or authorities, e.g. the New York Times. The problem with relying on a brand is that a brand, although it can be reliable given its incentives, is at some point reliant on a person or machine with bias. (Yes, all machines have biases too.) Driven by our tribal instincts, we still unconsciously tend to associate the most popular brands, or information that aligns with our views however achieved, with veracity, to a dangerous degree.

A curator is necessary. And I think it isn’t impossible to improve upon the current state of mass media by simply creating a customized blend of sources that can be trusted, but only on certain topics. That, of course, is just the first step. Next, a careful study of the track record and biases of each source, such as the Wall Street Journal editorial page, must be codified and then continually monitored to avoid bias drift. As the final step, a short, handy rating of sorts could be applied, either to filter down to acceptable levels of certain types of bias or even to only ever present pairs of countering biases.

This may sound like overkill. Why not just read Matt Levine for his take on notable Things That Happen? No offense to Mr. Levine, but then we must rely on his take, which, as excellent as it often is, is limited to, as he declares, Money Stuff. Furthermore, we are relying on human-scale curation. I’m envisioning ratings for the entire internet, which necessarily demands machine scale. Granted, there will still be some bias involved as humans typically have to build things, but at the least it’d be a step further along than the human-designed algorithms that already dominate our social media feeds. Imagine a Twitter feed with every article possessing a short bar indicating veracity, which one can click for a full rationale based on the source, the article’s topic, the source’s track record on that topic, demonstrated domain expertise, and more.
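To make the idea a bit more concrete, here is a minimal sketch of what such a per-source, per-topic veracity rating might look like. Everything in it is a hypothetical illustration of my own: the `Source` structure, the weights, and the numbers are placeholders, not a real system, which would need calibration against a labeled corpus and ongoing monitoring for bias drift.

```python
# A toy sketch of per-topic veracity scoring. All names, weights and
# numbers here are hypothetical illustrations, not a real rating system.

from dataclasses import dataclass, field

@dataclass
class Source:
    name: str
    # Hypothetical per-topic track record: fraction of past claims on
    # that topic that later checked out (0.0 to 1.0).
    track_record: dict = field(default_factory=dict)
    # Topics where the source has demonstrated domain expertise.
    expertise: set = field(default_factory=set)

def veracity_score(source: Source, topic: str) -> float:
    """Blend the source's track record on a topic with a small bonus
    for demonstrated expertise; unknown topics fall back to a neutral
    prior. Weights are arbitrary placeholders."""
    record = source.track_record.get(topic, 0.5)
    expertise_bonus = 0.1 if topic in source.expertise else 0.0
    return min(1.0, 0.9 * record + expertise_bonus)

# Illustrative only: these numbers are invented for the sketch.
wsj = Source(
    name="WSJ",
    track_record={"markets": 0.92, "celebrity gossip": 0.4},
    expertise={"markets"},
)

print(round(veracity_score(wsj, "markets"), 3))           # high: on-expertise topic
print(round(veracity_score(wsj, "celebrity gossip"), 3))  # lower: off-expertise topic
```

The score behind each article’s “veracity bar” would then simply be this function evaluated for the article’s source and topic, with the full rationale spelling out which inputs drove the number.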

A host of minefields remains to be explored as I think through this idea, but that is as far as I have gotten. In the Misinformation Age, such a tool is more than important – it is critical.

The Graduation Speech No One Will Let Me Give

(Armed with nothing more than mid-to-late 20s hubris, here’s the graduation speech that I wish most undergraduates and, heck, all other types of graduates could hear.)

Most of you will fail at most of the things you undertake in your life. Some of those failures will be so small you may not even consider them failures, or even notice them. Some of them will be noticeable, at least to you. And a few will be catastrophic or nearly so. The volume of mistakes and failures, of course, will vary considerably. Many of you already have heard the proclamations to own your failures, perhaps to not let them define you. Some of you have already failed quite a bit and have moved past it. Unfortunately, there’s some bad news: It will never stop. You will fail continuously for the rest of your life, most of the time you try things.

By the way, I do mean all of you, if not all of humanity. No matter how intelligent you are, you will fail at most things. Amusingly enough, the more intelligent you are, the larger your mistakes tend to be, as you may have the talent to land in a very important role that requires making very critical decisions, one of which may well spectacularly backfire.

Oh, I’m not finished. You are going to die, no matter how much money the current crop of the mega-rich desperately throws at longevity initiatives. You also are inheriting this epoch’s particular problems, which are intriguing inasmuch as they seem to dwarf even the past couple decades’ passels of problems, which is saying something given we’ve all lived through 9/11, the dot-com crash, the financial crisis, etc. You are now uncomfortably, increasingly aware that the grownups who are supposed to be managing the world may not be doing such a great job. Some of you think you can do a better job. Some of you think that just the right set of current grownups need to take the reins. Nearly all of you don’t want to think of yourselves as full-fledged grownups – more responsible for the current state of affairs than you’d like to admit, and already in the thick of the real world now that the last comfort blanket has been tugged away.

However, a smaller subset of you is keenly aware, having been on the inside of institutions or being perceptive students of statistics, that all of the observations just made have one thing in common: They all acknowledge the objective reality we live in, as demonstrated by statistics. A nicer way to phrase my opening line is that most people aren’t that successful at what they aim to do. That’s the simple reality we live in. If most people succeeded even half the time in what they were trying to do, the world would be remarkably different. For example, once you get into a corporate job, you will be struck by how often the people actually running the businesses that make up the economy aren’t good at their jobs. Some of them are good some of the time, very few are true high performers, and some are outright incompetent.

Now, this moment is where one would expect the narrative to reverse. This is where I should flip my statements and explain why it’s actually all okay. That’d be a little too easy, however. And it wouldn’t be entirely accurate, because some of you will burn with ambition or hunger or jealousy or all of the above your whole lives, internally dissatisfied with the fact that you didn’t measure up, in some way, to what you thought you should achieve. Others will temper ambition, assess what truly makes them happy, then pursue that, and realize it to some degree, or maybe entirely. Many others will exist in some nebulous area between those two states, shifting depending on their changing environments, searching for fulfillment.

Such searching is what got this species to where it is. It is unavoidable, hardwired into your very flesh, blood, bone and brain. The downside is that such focus on the extrinsic, especially in the US, often tends to neglect the intrinsic. But failure knows no such boundaries. And thus, confronted with the possibility of failure on all fronts, the focus on the extrinsic, the searching for achievements outside of oneself, ignores the one area where you have the most control: the intrinsic. You.

The extrinsic matters too, of course. What you give to the world, what you accomplish or fail to accomplish for the most part, all is critical for actualization of the interior. But you are the node between those extrinsic affairs, operated by the intrinsic. And, again, there is only one of those areas where one can credibly say you have more than a fighting chance, and that is inside your own head.

Failures at most things in life, which again often go unnoticed – for example, being lazy and skipping that intended workout, or blowing a test that was a last-ditch effort – are all too often measured in terms of the extrinsic. But measurements that take into account only one part of a system often do so at their ultimate peril, ignoring the holistic nature of the intrinsic person accomplishing the extrinsic tasks.

To remain cognizant of the intrinsic interior, where we have the most control, we must recognize and declare our own value. Please do not mistake this for the abused term “self-care”. Recognition of one’s own value does not mean approval or condemnation. Rather, it is a clearheaded assessment that you do matter, just as every human being on this planet matters, in an incalculable, intrinsic fashion. That is the first step. The second, necessary step is to recognize that the purpose of this intrinsically valuable creature is to accomplish something in this world. However, given human flaws, such accomplishments are not only hard to define and find but also difficult to pull off. We must come to terms with the extent to which we may and probably will fail.

From there, we must define the terms of our failures internally. What IS a failure, after all? Is it merely missing a goal? Should that have been a goal in the first place? Was that goal unreasonable? Was it reasonable, and within your capabilities, and thus an actual failure? By asking just those few questions, it should already become clear again that most of the time you will fail. And yes, many of those failures will be bad. But they will not be significant. They will represent small morasses of frustration that, if dug into, could yield experience. That is, of course, if you are willing to do the dirty work of digging into them. Which is what one must, if one wishes to become resilient. Resilient to the degree that one can move past that particular failure…on to new ones.

That process is not easy, and it is never-ending. At times it will be easier, and then become harder again, because humans aren’t particularly good at follow-through. That is the primary reason failure is so common, by the way. The virtue of persistence is preached constantly, as it has to be, but astonishingly few of us practice it, well, persistently.

The only way to render that process marginally easier is to embrace humility. Humility and security go hand in hand. And it is fascinating how most of us, including myself, do not possess much interior security. Hence our constant search for the extrinsic, for the outward validation or confirmation via achievements that we are worth something, or anything. The most successful among us can still suffer from that particular affliction. But achieving that inner security, that poise, that self-command, isn’t achievable through purely intrinsic means, e.g. meditation. It is achievable only in a blending of both the extrinsic, outward achievements, which in turn transform the intrinsic person that you are, which enables further outward achievements. It’s a quintessential chicken-and-egg cycle, unceasing in its oscillations.

But when a balance has been struck for a time, and humility is obtained, from that bedrock can sprout the tried-and-true, best approach to failure: learning. Again, your cycle of failures will never cease. With grace and luck and diligence and endless work, you may mitigate most of your failures. You may lessen their incidence. But you will possess the resilience and the tools to embrace your endless probability of failure.

Let me be clear: This may seem like a grim way of looking at the world until you actually embrace it. Once you do, however, it is the most reliable source of serenity in the often mundane humdrum business of life. Sure, it is tiring, but it is the seasoning that lends life’s rare successes their true sweetness. Not only because they finally happened, but because you can place them in their proper context. Your successes won’t completely define you, and neither will your failures. You are too complex for that, so do not attempt to measure your self-worth only by what you have done, but by who you are, what you can accomplish, and, maybe most importantly, what you can’t.



It’s rather sardonically amusing that, even amid the latest uproar around giant tech companies’ violations of user privacy, most people don’t really seem to care all that much about their privacy. Exchanging one’s browsing history and app usage for free access to such great tools as Gmail, Google Drive, Facebook and the like still seems a good bargain. (As a dedicated Google ecosystem user, I am a perfect example.) Now, it would seem that people SHOULD care more than they do. You don’t know what you’ve got till it’s gone, right? Corporations’ and governments’ intentions may seem fairly benign now, but what could happen should we willingly compromise our complete identity digitally?

The way it appears to me is that there are a few potential paths forward. First, slowly but surely, regulations such as the EU’s GDPR are enacted that curb or at least set the boundaries for data collection and usage, and the worst remaining threats to citizens’ privacy are governing bodies themselves. Second, an uneasy equilibrium between governments and corporations emerges, wherein citizens gain some greater agency via market mechanisms, with incumbents cleaning up their practices in order to retain consumers, e.g. Google shifting Chrome features somewhat to forestall DuckDuckGo’s encroachment upon its turf. Last but not least, however, the status quo could essentially worsen, as the majority of consumers simply cease to care about said violations of privacy. What’s interesting about that last potential outcome is this: if consumers simply begin, or continue, to presume that surrendering their data is the effective price they pay for greater convenience (such as more effective ads on various websites), how will that actually translate into negative outcomes?

Presuming every centralized data hoard eventually suffers a system breach, leakage of confidential information is the most likely negative outcome, and one that has already been demonstrated. The allure of being able to access and then auction off personal data is only likely to increase as the quantity and quality of said personal data increases. But you could argue that such an occurrence is all but inevitable. The only plausible alternative is for each person to securely manage access to their own information…which could still lead to security breaches, but at far lower probability. Few are willing to invest much time or effort in such security. So the majority of users are likely also willing to endure such occurrences, as long as they remain tolerably rare. But it’s key to note that this essentially tacit auction of personal data will also only persist as long as users feel they are being compensated adequately. Amazon must continue to give Prime members discounts, free shipping, access to more channels and more, if it is to continue tracking and targeting. In fact, Amazon will likely have to ramp up its features to retain and continue to attract users, simply as Prime members become accustomed to such perks. If companies do not keep up, in short, they are likely to see their user base contract. Facebook may well face such a challenge in the next year or so on its core platform, offset only by its potential growth in Asia and Africa.

So it appears that there is definitely a market value for privacy that we are willing to pay, but over time we should expect scope creep in what we demand as compensation for surrendering our personal information. This is where things get even more interesting. Where could equilibrium eventually emerge?

All forms of digital commerce seem the most likely realm for equilibrium. Rewards and membership programs are proven to work exceedingly well, as psychologically they do not violate longstanding sociocultural norms, and usually entail only the types of information gathering most of us are fairly amenable to (many people, for example, could presume that I am interested in exotic coffees and thrillers based on just a few minutes’ conversation). Competition between such buy-in subscription platforms will then result in services at various degrees of convenience and varying treatments of privacy, with additional services coming at the cost of additional layers of completeness in companies’ physical and psychological profiles of users.

Social media is a much trickier proposition. One of the reasons I think LinkedIn has largely escaped as much scrutiny as Facebook is a) Facebook’s more aggressive approach; and b) the fact that Facebook has billed itself as a network of true, personal connections, as opposed to business-oriented networking. Facebook shouldn’t be viewed as “feel-good” in any sense; it is just as ruthless as any other company. But the disconnect between its marketing and its unveiled reality is more jarring on some subliminal level, I believe, which provokes more visceral reactions such as leaving Facebook (or barely using it, which could still prove deleterious to Facebook’s core business, eventually). But Facebook itself needn’t perish as long as it can provide a decent-enough service and clarify to users what it is really doing with their data, to the extent digital commerce companies do. Again, people are willing to part with privacy for a certain extent of compensation. An amazing UI, a handy events scheduling feature and easy buy-in could well prove sufficient for many. What Facebook has been failing at is figuring out where its limits truly lie.

Let’s presume, however, that the limits to not just Facebook but also other services are far less than they may currently seem. Let’s return to that original third premise wherein consumers simply cease to care really at all about privacy. What type of market compensation would be required for consumers to willingly surrender nearly all their information, if not all? Even governments don’t have that much information (at least outside of China, soon enough). I presume it would be a multi-service platform offering unprecedented convenience, with enough security cachet it could coax even wary users into compliance.

Let’s start with a base case of Amazon Prime, and dial it up to where Bezos seems to want to take it in about five years. First, groceries and basic medications would be provided for additional prices (still discounted) to Prime customers, who would then be coaxed into giving up their eating patterns and potentially additional physical information. This in turn would of course only improve the predictive power and matching of advertising within Amazon for all other products. (More vegetables being ordered? Why not purchase this scale to help with your weight loss efforts?) The next step would be a more holistic psychological profile via analysis of streaming patterns based on the types of food most recently consumed and medications taken. How frequently does this person eat junk food and then stream 17% longer into the night? Does this spark melatonin supplement purchases?

As you can see, the cascading effect could continue in what would seem, from the company’s perspective, a perfectly virtuous cycle. The trick would be to assuage any consumer concerns by ensuring excellent quality across the board (or at least a tolerable range), as well as potentially taking the radical step of providing a monthly summary of the data tracked, with opt-in or opt-out choices…accompanied by clear warnings of what would be surrendered each time. This would work best for a company determined to create its own complete marketplace/one-stop shop, as opposed to serving as the entry point rather than the vendor, a la Google. It could both soothe users’ fears and inculcate in them a sense that the company is always willing to engage with their needs (not to mention an illusory sense of control, frankly).

That’s the outcome I view as most likely, simply because buying into ecosystems is going to become nearly irresistible unless decentralization efforts become far more compelling and of sufficient quality. It’s a bit disturbing to contemplate, but it does seem the most probable, given the current landscape.

Controlling Time

A favorite if frustratingly smug exclamation of mine is “Time is a biological construct.” It’s not strictly true; time is more a physical construct than anything else, one that enables our limited minds to make sense of a world in which we are bodily bound to three dimensions yet can dimly grasp the existence of more. Time is necessary, but it is also the great limiter, a factor that constantly thwarts our ability to achieve and perceive. Moreover, humans are notoriously bad at really understanding how the inexorable advance of time flows all around us, and yet how it can, with no small effort, be somewhat mastered. As perhaps my favorite examples of the vagaries of time all reside within financial markets, I’ll illustrate what I mean later; first, though, it is important to make a few notes about temporality in general.

Everyone lives their own time

This point is readily self-evident and usually acknowledged once considered. One of the most common, if at times tedious, internal exclamations adults make upon seeing friends’ small children after some years is some variation on “How time has flown!” This is entirely normal – as noted above, we aren’t particularly good at living outside our own perceptions in the first place. But it is important to recall that each and every person experiences and perceives time at their own pace, and the phenomenon of aging is perhaps the easiest gateway to comprehending this. When you are young, experiences are painfully and preciously immediate, as they tend to be novel. As you age, the accumulation of lived experiences necessarily diminishes novelty, and thus time seems to pass faster, as you encounter less that is new for the first time. Accordingly, it is clear that perception is what really defines time for us, and each person perceives time uniquely. Each person’s life is, to them, a unique accumulation of lived experiences perceived differently upon each viewing, like mutating layers of rock strata banded in kaleidoscopic hues.

Perceiving time in and of itself can change it…particularly memories

When it comes to memories, the human brain is a tricky thing. Recent research suggests our prior model – that experiences are consolidated before being moved to long-term storage – may be inaccurate; rather, the process is continuous, with memories being stored even as they are formed, at varying rates (sleep, for example, is critical for the brain to organize its memory warehouse, so to speak).

But as the brain – and indeed who you are – can be viewed as a collection of memories encoded in neurons, weakened or strengthened by the firing of synapses to unearth particular instances, it stands to reason that by choosing to recollect certain moments more frequently, you favor the preservation of some memories over others. Two things then occur. First, you prune the brain’s entire warehouse in favor of the memories most often remembered – not bothering to preserve what you ate for breakfast on July 19, 2017, while retaining the memory of your goddaughter’s baptism. Second, even in recollecting that baptism, you do not remember it the same way each time. We crave familiarity and repetition at times, but novelty brings exciting stimulus, so as we retread the familiar, some slight nuance may be either lost or added, depending on what is focused upon each time.

Being relative, time entraps us but is within our control

It’s intriguing how frequently time is cast as a pitiless destroyer of all things. The truth of the matter is that we live time linearly, but as best we can tell, it is curvable. Moreover, our perception of it is elastic. Some may quibble that our perception may be elastic but time itself is not, so does it matter? I’d respond that if our perception of time is the building of memory, and we are our memories, doesn’t our manipulation of perception ultimately matter more than the reality of time? Frankly, we don’t yet know enough about time to truly escape it, but it is much more within our control than might first be thought.

A life of routine without pauses for appreciation or reflection will surely accelerate your perception of time, much as you can absentmindedly commute to work in what seems like no time at all, only to realize the pattern is so ingrained that 40 minutes passed in what appeared to be the span of 10. However, a life of pure novelty is impossible, so the ideal prescription appears to be a balance between the two. Moreover, pure novelty isn’t required to slow down time; rather, different circumstances can induce a slowing of time, or an acceleration.

For example, consider this: Find as close to a silent space as possible and close your eyes. Begin to inhabit each passing moment. Reflect upon the immediacy of every sensation remaining to you. At the other extreme of inducement, much as soldiers in combat report time dilating in their perception, scenarios that hijack the body’s alert system can also dilate time, such as moments of extreme physical effort or perceived danger. (I can assure you that the dozen or so seconds of freefall I experienced while skydiving felt quite long.) Now, should one wish to kill time, so to speak, rather than slow it down, it’s simple: Seek out diversions that are of necessity novel but not so novel that they require active perception. Hence the common temptation to rewatch a favorite film when bored but lazy on a weekend afternoon.

The payoff, or what prompted this reflection on time

The primary impetus of this post was an idle reflection in a bar on a lazy Friday afternoon, when some of my mid-20s peers were bemoaning the onset of more severe hangovers, attributing them to “getting old”. I was sardonically amused, reflecting on how the mystery novels from the 1930s I used to read would call men of thirty to thirty-five “young”, and how relatively quickly the times can change to land us in our youth-obsessed world. I don’t really blame the baby boomers for their preoccupation with their 20s and so on; after all, some young men and women of the American and French revolutions seem to have unconsciously evoked the same sentiments. Each age has its own preoccupation with youth; ours is just exacerbated by the proliferation of micro-narratives that become long-term by preying upon common human fears. But it’s rather tiresome to try to mark out your life in conventional signposts and give in to the worst vapidities of our time, focused relentlessly on looking and staying youthful. (Caveat: I’m as bad as any of my peers; having always looked prematurely older than I am, by virtue of balding early and hitting puberty very young, I frequently refer to myself as essentially 40.)

But all these typical comments aren’t really a significant issue, although if overindulged they can prompt harmful behaviors. What matters more is this: especially given the return of volatility to global markets, and increasing doubt about whether the narrative of global economic recovery is real, understanding time and your relation to it is crucial when it comes to investing. Many will recall the basic lesson they’ve heard time and again – “buy low, sell high” – and consequently begin hoarding cash, hoping for a market dip in which to buy everything cheaply. This isn’t a bad plan, but it is going to be very hard to follow. Until it happens, you don’t know how you will react to it. And the ‘it’ in this case is another recession, which could be worse or milder than the last one (the evidence for either is mixed at best). For those of us overexposed to financial assets, the point isn’t that pivoting all your money into capital preservation right now, then rotating back in once the S&P 500 drops a substantial percentage, will save you; rather, as overall returns dip lower and lower, one should remember how stress elongates the perception of time. This too shall pass, in other words.

Furthermore, to hearken back to what I noted earlier, it’s easy to remember the financial crisis and Great Recession differently each time. But it took quite some time for everything to unspool, and there were quite a few hiccups along the way. We have selectively pruned our memories to compress that period, which can leave us with a misleading mental map of 1) how long it took us to realize we were in deep trouble, 2) how long it took financial markets to price in the level of risk, and 3) the actual level of pain and worry experienced. Consequently, one must soberly review the actual chain of events at length and assess how history is rhyming – not repeating – this time around. For my own purposes, for example, accumulating cash remains a sound strategy, yet I am maintaining exposure to emerging markets, with a healthy dose of aggressive risk-taking, as there could be a brief window in which capital flows into emerging markets should the US experience the first rumblings of another significant shock. That window may well be quite brief, however, as current US monetary policy may induce flows into the dollar, and emerging-market companies may feel the pinch of higher debt payments much more acutely than others. Bearing in mind all I’ve said, I’m also retracing the history of the financial crisis and debating whether the same pattern of contagion may recur – and I have no firm conclusions as of yet.

In conclusion, cultivating awareness and control of time in this period of heightened volatility, both financial and political, will temper the tendency to let outliers or subtly altered memories overly affect us.