Surprise with joy

The sentiments of Christmas are so enduring because they so powerfully combine both the nostalgia and idealism intrinsic to human nature. Peace on earth to all humankind – who would openly disparage that? Joy to the world – however you follow that up, or for whatever reason you mention it, it is hardly a declaration that many would take exception to.

But joy is harder and harder to come by as you age. Adult joy is not quite like the joy I observed this morning on my six-year-old nephew’s face as he unwrapped a Lego Batman set, his entire hour, or maybe even day, made joyful. When you are younger, it is much easier to encounter joy, and also despair, and such intensities of emotion in general. But as you get older, joy and similar intensities are necessarily muted by accumulated experience…and, sadly, often by worries. Joy becomes rarer.

As with all things rare, however, it becomes more valuable. And in a way, as that sensation of joy rarefies, it is also refined. It becomes more surprising, and almost always is found in establishing connections. Given the pressures and vagaries of life, adults usually know that time and connection – whether it be enjoying the company of friends or family or even finding connection within oneself between warring sentiments – are priced at premiums. Consequently, they value time spent with others over material expressions of value (if their priorities and ability to communicate are attuned, which is not always the case). Children may not value the joy of explicit connection with others as consciously, but they can more easily reach the joy of being in a particular moment, or receiving a particular thing.

It’s harder and harder to preserve that sensation nowadays, and even harder to achieve the joy of true connection. We are drowning in information even as we thirst for connection – sitting at home alone because going out is either expensive or doesn’t seem like precisely what we want right now or both, bombarded with images and headlines that are surpassingly negative or appetite-inducing, as those are what hack our brains all too effectively. It is possible to achieve some connection online, and feel some approximation of joyfulness thereby, but we are physical beings who will always crave the full ability to speak face to face and approach as close a connection as possible.

But we all unconsciously comprehend these phenomena, and we are grappling with them more openly. And so we reach for the trappings and traditions of the holiday season, making the effort to travel home, whether near or far, and see friends, family, loved ones. It may be stressful, and it would certainly be easier at first to try to grasp once again the immediate joy that my seven-year-old goddaughter reveled in when she woke up Christmas Day, but past a certain age, deep down you know what type of joy you really wish for. The more difficult, and more refined, joy of connection.

I suppose they are both joys of connection, tied together in a perpetuating loop, for to my nephew and goddaughter, the gifts received were hard evidence of love and attention – children are savvy and thus look for such proof constantly, after all. But once you can fend for yourself, the gifts we look for are more difficult to come by, as they necessitate giving first. You must reach out, whether in word or deed, gift or gab, and meet your fellow men and women, your tribe by blood or bond or both, or even acquaintances and strangers, in just the smallest gesture. It’s difficult – and thus surprising when successful. But it’s the holiday season – is it not the most wonderful time of the year to surprise with joy?


The Misinformation Age

The coming of the internet promised easier access to quantities of information on a previously unimaginable scale. Until the first truly ingenious browsing tool was invented by a company with a name alluding to quantities, there was fierce competition for the ability to find that information. There is still competition for the best aggregation of information, and arguably no one compelling solution. But now, after the first few decades of the information age, it is unmistakable that the central issue in developed information economies is neither access to nor quantity of information, but rather curation and verification.

A few months ago a Forbes article posited that 2.5 quintillion bytes of data were being created every day. On a purely anecdotal basis, it is possible to consume thousands of tweets per day. (You really shouldn’t.) I don’t even have the nerve to search for how much content is being generated on Facebook or Instagram. We are drowning in information, even if you only regularly visit, say, Reddit or Instagram.

The effects of such inflows of information are still being studied. But this post is not about the consequences, good or evil, of the volume of information. Rather, it is about the value, or lack thereof. More importantly, it is about the market opportunity for information curation and validation that I believe is set to be one of the more important in the next decade or so.

The Personal Touch

Just a few hours ago I was strolling in Seattle’s First Hill neighborhood with some cousins, one of whom asked me what the price of a ticket to the Seattle Aquarium was. This type of information request is still commonplace in social gatherings, as it is potentially highly efficient, fosters companionship and serves as an indicator of trust and perhaps affection…or at the very least as something to fill a gap in conversation. However, if you pause to think about it, such requests, common as they are, are potentially quite flawed when viewed through a lens of objectivity. First, it presumes I have gone to the aquarium, or at least contemplated attending to the degree that I looked up ticket prices; second, it presupposes that the aquarium is something I’d be interested in; and third, it may, though not necessarily, imply that what I enjoy she would also enjoy.

In this particular case, my cousin was simply being savvy; we tend to enjoy the same things, and I have looked up ticket prices to the aquarium. This is the key reason such questions are so frequent among relatives and friends: we presume that our similarity, whether genetic or not, will result in similar preferences. To some degree this can be accurate, but only to some degree. By and large, the disparity in tastes means that casual asks – “Hey, do you recommend this coffee shop?” posed to a friend who lives in the neighborhood as you happen to pass by – and the recommendations that follow can easily backfire.

That is the degree of error implicit in personal recommendations regarding fairly low-stakes decisions, such as where to get coffee. But we all engage in such requests and recommendations constantly, as it is a personal and personable route to gathering potentially useful information, even though it is riddled with subjectivity. What happens, however, when we move such interactions one step away from the face-to-face, into the digital realm?

Sharing Isn’t Necessarily Caring

By now, multiple tropes have sprung up around the sharing of content by connected yet semi-remote people on social networks, from the “crazy uncle who shares memes about building the border wall” to the “Starbucks-sipping soccer mom trying to stay hip”. Regardless of whether you chuckle or get vaguely peeved at the content shared, the fact is you consume at least part of it, often the most salacious, incendiary or misleading tidbit that spawned the headline.

This is what I like to think of as two-degree-distant information, as opposed to the one-degree, face-to-face interaction. You are most often connected on social media to people whom you have actually met in person, as extreme extroverts are rare. These are people whom you have at least some slight investment in staying connected to. So, how do you treat information from them? If it’s purely personal, that is the best of all possible worlds, as such sharing serves the most useful, if mundane, purpose of social media – e.g. learning my friend in London has been training for races. That way, you can bond over new fitness routines when you see him or her again.

What if the information is not personal, nor on a topic the person is invested in to the degree that multiple individuals trust their judgment? To illustrate, there is a difference between your friend Bob, who is a data scientist, sharing an article about “10 common myths in data science” versus said Bob opining about the best way to cook turkey just because he really likes turkey. (By the way, turkey tends to be garbage meat, so there really are few to no good ways of cooking it. Stick to better meats like pork, chicken, beef, fish, alligator, quail, duck, rabbit…you get the picture.) The disparity obviously lies in areas of expertise. But how many of your friends and family really stick to their areas of expertise when sharing content and/or information? To take my own egregious example, I retweet dozens of observations and articles weekly, simply because I found them interesting. That’s it. My sole criterion is my personal interest. Fortunately, nobody regards me as an expert, but at best as a prolific, weak-form curator.

The point of this line of questioning around information gathered from personal connections is that it represents a significant share of the information we are bombarded with each day. Even though it is generated by people we are connected to and care about in varying degrees, that doesn’t necessarily make it valuable, or even accurate. It merely makes it more personal. But therein lies the rub.

The Market for Misinformation

A huge market exists for misinformation, and it’s all in your own head. We are primed from birth with information that shapes our worldviews and likes and dislikes. Information that contradicts those established, favored facts and beliefs is cognitively more difficult to process, and consequently we seek to avoid such jarring revelations. It takes valuable time to read through something, after all, even if you continue to disagree with it after the fact.

Hence we are, in a way, doubly primed to favor the articles shared by friends we get along with very well, especially on topics on which we know we share similar opinions. Confirmation bias is a heady drug. Accordingly, a huge market for misinformation always has existed and always will, and it becomes vaster every week as the internet continues its inexorable expansion. The best-known examples include the 2016 election in the US and the abuse of Facebook networks in Southeast Asia, but there are countless salient iterations of the desire for misinformation that confirms what certain markets, or tribes, so to speak, believe.

But, amid this flood of information, much of it misinformation, what can we actually believe? How can we figure out what is true or not?

Quid Est Veritas

Pontius Pilate’s famous question of “What is truth?” often serves as part of the reason he is maligned. But he shouldn’t be for that question alone. It is one of the fairest questions we can ask, and especially nowadays, asking “What is true?” more frequently is a noble endeavor. (What actual truth is, in noun form, is a much harder question to answer, frankly.) What can actually serve as useful criteria for assessing truthfulness, in an age where even bastions of public information like the Wall Street Journal and New York Times are decried as shills for agendas and the very office of the US presidency is occupied by someone whose accuracy can be most generously described as suspect?

Here are a few rules that I think are the most practical:

  1. Information that neither helps nor confirms its source in any way.
  2. Information that actively hurts its source.
  3. Information that corresponds to the most widely and historically held beliefs.
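These three rules can even be sketched as a toy scoring function. Everything here – the names, the weights, the inputs – is invented purely for illustration, not a calibrated system:

```python
def trust_score(benefits_source: bool, hurts_source: bool,
                matches_longstanding_belief: bool) -> int:
    """Toy heuristic: score a claim's trustworthiness using the three rules.

    The weights are arbitrary illustrations, not calibrated values.
    """
    score = 0
    if not benefits_source:          # Rule 1: no self-serving angle
        score += 1
    if hurts_source:                 # Rule 2: admission against interest
        score += 2                   # weighted highest, as the strongest signal
    if matches_longstanding_belief:  # Rule 3: consistent with inherited wisdom
        score += 1
    return score

# A corporate mea culpa that hurts its source and matches what people
# already suspected scores highest.
print(trust_score(benefits_source=False, hurts_source=True,
                  matches_longstanding_belief=True))  # 4
```

Rule 2 gets the heaviest weight here because an admission against interest is the strongest single signal of the three.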

All of these except the second can be difficult to verify in some ways, but they are simple enough to be quite useful as heuristics. For example, it does the Pew Research Center little good to publish the results of a survey confirming that the American political spectrum is becoming more and more bimodal – i.e. there is less common ground between registered Democrats and Republicans than there has been in decades. One could argue that such bad news would benefit Pew’s social reach, on the theory that “if it bleeds, it leads”, but since everyone has already seen that bimodal distribution in action in the wake of the 2016 election, publishing it does Pew little good beyond maintaining its utility as a source that keeps publishing findings many presume are common knowledge. Hence, Pew is likely to be trusted.

It may be a lesser-known law of public relations that in nearly all cases it is best for corporations to admit any wrongdoing and accept the consequences rather than attempt a cover-up. At the individual level, we so frequently try to cover things up that a candid admission can seem almost too good to be true. Yet getting out ahead of a negative finding is usually the best course of action, simply because people already tend to believe the worst, so it’s best to admit the truth as best it is known. For example, Facebook’s Cambridge Analytica debacle is said to have contributed to its record-setting market cap loss in the past few months. But until Facebook owned up to the full extent of the breach, the rumors swirling around its true reach were far worse than reality. Furthermore, once a company confirms the extent of its wrongdoing, there is a strong inclination to believe that at least that much is true. You may still suppose, if you are negatively biased toward Facebook as I am, that even more skulduggery lurks in the shadows, but even I am willing to believe that what they’ve admitted thus far is true.

Last but not least, the historical process of trial and error conducted by humans for millennia is perhaps one of the most reliable heuristics of all. Certain foods, paired and prepared in a certain way, persisted for centuries in a given region because the combination was nutritious, tasty and easy to produce given the region’s natural abundance. To stay with diet for a concrete example: there was no pressing reason to strip the nutritious husk from a grain of rice, which served our ancestors well until milling finally made white rice possible. Likewise, we know that we can trust certain people in certain scenarios. Utilizing the inherited wisdom of our ancestors can reduce errors of all kinds.

The Market Opportunity

If I were a much smarter man, I’d already be working on the company that can crack this. But it’s a bit complicated, so I’m still thinking my way through it. Essentially, given the flood of information we all bathe in daily, we often still employ the classic if flawed heuristics of relying on personal recommendations and/or authorities, e.g. the New York Times. The problem with relying on a brand is that a brand, however reliable its incentives may make it at times, ultimately depends on people or machines with biases. (Yes, all machines have biases too.) Driven by our tribal instincts, we still unconsciously tend to associate the most popular brands, or information that aligns with our views however acquired, with veracity, to a dangerous degree.

A curator is necessary. And I think it isn’t impossible to improve upon the current state of mass media. The first step is simply a customized blend of sources that can be trusted, but only on certain topics. Next, the track record and biases of each source, such as the Wall Street Journal editorial page, must be carefully studied, codified and then continually monitored to avoid bias drift. As a final step, a short, handy rating of sorts could be applied, either to filter down to acceptable levels of certain types of bias or even to always present pairs of countering biases.

This may sound like overkill. Why not just read Matt Levine for his take on notable Things That Happen? No offense to Mr. Levine, but then we must rely on his take, which, as excellent as it often is, is limited to, as he declares, Money Stuff. Furthermore, we would be relying on human-scale curation. I’m envisioning ratings for the entire internet, which necessarily demands machine scale. Granted, there will still be some bias involved, as humans typically have to build things, but at the least it’d be a step further along than the human-designed algorithms that already dominate our social media feeds. Imagine a Twitter feed where every article carries a short bar indicating veracity, which one can click for a full rationale based on the source, the article’s topic, the source’s track record on that topic, demonstrated domain expertise, and more.
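To make the idea concrete, here is a minimal sketch of how such per-topic veracity ratings might be represented. All names, topics and numbers are invented; a real system would derive track records from verified claim histories at machine scale:

```python
from dataclasses import dataclass, field

@dataclass
class Source:
    name: str
    # Hypothetical per-topic track records in [0, 1], e.g. the fraction of a
    # source's past claims on that topic later verified as accurate.
    track_record: dict = field(default_factory=dict)
    default_score: float = 0.5  # topics with no history get a neutral rating

    def veracity(self, topic: str) -> float:
        return self.track_record.get(topic, self.default_score)

def rate_article(source: Source, topic: str) -> str:
    """Map a source's topic-specific track record to a short label,
    like the 'veracity bar' imagined above. Thresholds are arbitrary."""
    v = source.veracity(topic)
    if v >= 0.8:
        return "strong"
    if v >= 0.5:
        return "mixed"
    return "weak"

# Invented example: a data scientist sharing on data science vs. cooking.
bob = Source("Bob", {"data science": 0.9, "cooking": 0.3})
print(rate_article(bob, "data science"))  # strong
print(rate_article(bob, "cooking"))       # weak
print(rate_article(bob, "politics"))      # mixed (no track record)
```

The design choice worth noting is that veracity attaches to a (source, topic) pair, not to the source alone – exactly the Bob-on-turkey distinction from earlier.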

A host of minefields remains to be explored as I think through this idea, but that is as far as I have gotten. In the Misinformation Age, such a tool is more than important – it is critical.


The Graduation Speech No One Will Let Me Give

(Armed with nothing more than mid-to-late 20s hubris, here’s the graduation speech that I wish most undergraduates and, heck, all other types of graduates could hear.)

Most of you will fail at most of the things you undertake in your life. Some of those failures will be so small you may not even consider them failures, or even notice them. Some of them will be noticeable, at least to you. And a few will be catastrophic or nearly so. The volume of mistakes and failures, of course, will vary considerably. Many of you already have heard the proclamations to own your failures, perhaps to not let them define you. Some of you have already failed quite a bit and have moved past it. Unfortunately, there’s some bad news: It will never stop. You will fail continuously for the rest of your life, most of the time you try things.

By the way, I do mean all of you, if not all of humanity. No matter how intelligent you are, you will fail at most things. Amusingly enough, the more intelligent you are, the larger your mistakes tend to be, as you may have the talent to land in a very important role that requires making very critical decisions, one of which may well spectacularly backfire.

Oh, I’m not finished. You are going to die, no matter how much money the current crop of the mega-rich desperately throws at longevity initiatives. You also are inheriting this epoch’s particular problems, which are intriguing inasmuch as they seem to dwarf even the past couple decades’ passels of problems, which is saying something given we’ve all lived through 9/11, the dot-com crash, the financial crisis, etc. You are now uncomfortably, increasingly aware that the grownups who are supposed to be managing the world may not be doing such a great job. Some of you think you can do a better job. Some of you think that just the right set of current grownups needs to take the reins. Nearly all of you don’t want to think of yourselves as full-fledged grownups – grownups who are more responsible for the current state of affairs than you realize, and who are already in the thick of the real world, now that the last comfort blanket has been tugged away.

However, a smaller subset of you is keenly aware, having been inside institutions or being perceptive students of statistics, that all of the observations just made have one thing in common: they acknowledge the objective reality we live in, as demonstrated by statistics. A nicer way to phrase my opening line is that most people aren’t that successful in what they aim to do. That’s the simple reality we live in. If most people succeeded even half the time at what they were trying to do, the world would be remarkably different. For example, once you get into a corporate job, you will be struck by how often the people actually running the businesses that make up the economy aren’t good at their jobs. Some of them are good some of the time, very few are true high performers, and some are outright incompetent.

Now, this moment is where one would expect the narrative to reverse. This is where I should flip my statements and explain why it’s actually all okay. That’d be a little too easy, however. And it wouldn’t be entirely accurate, because some of you will burn with ambition or hunger or jealousy or all of the above your whole lives, internally dissatisfied with the fact that you didn’t measure up, in some way, to what you thought you should achieve. Others will temper ambition, assess what truly makes them happy, then pursue that, and realize it to some degree, or maybe entirely. Many others will exist in some nebulous area between those two states, shifting depending on their changing environments, searching for fulfillment.

Such searching is what got this species to where it is. It is unavoidable, hardwired into your very flesh, blood, bone and brain. The downside is that such focus on the extrinsic, especially in the US, often tends to neglect the intrinsic. But failure knows no such boundaries. And thus, confronted with the possibility of failure on all fronts, the focus on the extrinsic, the searching for achievements outside of oneself, ignores the one area where you have the most control: the intrinsic. You.

The extrinsic matters too, of course. What you give to the world, what you accomplish or fail to accomplish – all of it, for the most part, is critical for actualization of the interior. But you are the node between those extrinsic affairs, operated by the intrinsic. And, again, there is only one of those areas where one can credibly say you have more than a fighting chance, and that is inside your own head.

Failures at most things in life, which again often go unnoticed – for example, being lazy and skipping that intended workout, or blowing a test that was a last-ditch effort – are all too often measured purely in extrinsic terms. But measurements that take into account only one part of a system often do so to their ultimate peril; they miss the holistic nature of the intrinsic person accomplishing the extrinsic tasks.

To remain cognizant of the intrinsic interior, where we have the most control, we must recognize and declare our own value. Please do not mistake this for the abused term “self-care”. Recognition of one’s own value does not mean approval or condemnation. Rather, it is a clearheaded assessment that you do matter, just as every human being on this planet matters, in an incalculable, intrinsic fashion. That is the first step. The second, necessary step is to recognize that the purpose of this intrinsically valuable creature is to accomplish something in this world. However, given human flaws, such accomplishments are not only hard to define and find but also difficult to pull off. We must come to terms with the extent to which we may, and probably will, fail.

From there, we must define the terms of our failures internally. What IS a failure, after all? Is it merely missing a goal? Should that have been a goal in the first place? Was that goal unreasonable? Was it reasonable, and within your capabilities, and thus an actual failure? By asking just those few questions, it should already become clear again that most of the time you will fail. And yes, many of those failures will be bad. But they will not be significant. They will represent small morasses of frustration that, if dug into, could yield experience. That is, of course, if you are willing to do the dirty work of digging into them – which one must, if one wishes to become resilient. Resilient to the degree that one can move past that particular failure…on to new ones.

That process is not easy, and it is never-ending. At times it will be easier, and then become harder again, because humans aren’t particularly good at follow-through. That is the primary reason failure is so common, by the way. The virtue of persistence is preached constantly, as it has to be, but astonishingly few of us practice it, well, persistently.

The only way to render that process marginally easier is to embrace humility. Humility and security go hand in hand. And it is fascinating how most of us, including myself, do not possess much interior security. Hence our constant search for the extrinsic, for the outward validation or confirmation via achievements that we are worth something, or anything. The most successful among us can still suffer from that particular affliction. But achieving that inner security, that poise, that self-command, isn’t possible through purely intrinsic means, e.g. meditation. It is achievable only by blending the two: extrinsic, outward achievements transform the intrinsic person that you are, which in turn enables further outward achievements. It’s a quintessential chicken-and-egg cycle, unceasing in its oscillations.

But when a balance has been struck for a time, and humility is obtained, from that bedrock can sprout the tried-and-true, best approach to failure: learning. Again, your cycle of failures will never cease. With grace and luck and diligence and endless work, you may mitigate most of your failures. You may lessen their incidence. But you will possess the resilience and the tools to embrace your endless probability of failure.

Let me be clear: This may seem like a grim way of looking at the world until you actually embrace it. Once you do that, however, it is the most reliable source of serenity in the often humdrum business of life. Sure, it is tiring, but it is the seasoning that lends life’s rare successes true sweetness. Not only because they finally happened, but because you can place them in their proper context. Your successes won’t completely define you, and neither will your failures. You are too complex for that, so do not attempt to measure your self-worth by what you have done alone, but by what you have done, who you are, what you can accomplish, and, maybe most importantly, what you can’t.




It’s rather sardonically amusing that, even amid the latest uproar around giant tech companies’ violations of user privacy, most people don’t really seem to care all that much about their privacy. Exchanging one’s browsing history and app usage for free access to such great tools as Gmail, Google Drive, Facebook and the like still seems a good bargain. (As a dedicated Google ecosystem user, I am a perfect example.) Now, it would seem like people SHOULD care more than they do. You don’t know what you’ve got till it’s gone, right? Corporations’ and governments’ intentions may seem fairly benign now, but what could happen should we willingly surrender our complete identity digitally?

The way it appears to me is that there are a few potential paths forward. First, slowly but surely, regulations such as the EU’s GDPR are enacted that curb, or at least set the boundaries for, data collection and usage, leaving governing bodies as the worst remaining threats to citizens’ privacy. Second, an uneasy equilibrium between governments and corporations emerges, wherein citizens gain some greater agency via market mechanisms, with incumbents cleaning up their practices in order to retain consumers, e.g. Google shifting Chrome features somewhat to forestall DuckDuckGo’s encroachment upon its turf. Last but not least, however, the status quo could essentially worsen, as the majority of consumers simply cease to care about said violations of privacy. What’s interesting about that last potential outcome is this: if consumers simply begin or continue to presume that surrendering their data is the effective price they pay for greater convenience (such as more effective ads on various websites), how would that actually translate into negative outcomes?

Presuming every centralized data hoard eventually suffers a breach, leakage of confidential information is the most likely negative outcome – one that has already been demonstrated. The allure of being able to access and then auction off personal data is only likely to increase as the quantity and quality of said personal data increase. You could argue that such an occurrence is all but inevitable; the only plausible alternative is for each person to securely manage access to their own information…which could still lead to breaches, though at far lower probability, and few are willing to invest much time or effort in such security. So the majority of users are likely also willing to endure such occurrences, as long as they remain tolerably rare. But it’s key to note that this tacit auction of personal data will persist only as long as users feel they are being compensated adequately. Amazon must continue to give Prime members discounts, free shipping, access to more channels and more, if it is to continue tracking and targeting. In fact, Amazon will likely have to ramp up its features to retain and continue to attract users, simply because Prime members become accustomed to such perks. If companies do not keep up, in short, they are likely to see their user base contract. Facebook may well face such a challenge in the next year or so on its core platform, offset only by its potential growth in Asia and Africa.

So it appears that there definitely exists a market value for privacy that we are willing to pay, but over time, scope creep among what we expect to be compensated for surrendering our personal information is to be expected. This is where things get even more interesting. Where could equilibrium eventually emerge?

All forms of digital commerce seem the most likely realm for equilibrium. Rewards and membership programs are proven to work exceedingly well, as psychologically they do not violate longstanding sociocultural norms, and they usually entail only the types of information gathering most of us are fairly amenable to (many people, for example, could presume that I am interested in exotic coffees and thrillers based on just a few minutes’ conversation). Competition among such buy-in subscription platforms will then result in services at varying degrees of convenience and with varying treatments of privacy, with additional services coming at the cost of additional layers of completion in companies’ physical and psychological profiles of users.

Social media is a much trickier proposition. One of the reasons I think LinkedIn has largely escaped the scrutiny Facebook faces is a) Facebook’s more aggressive approach; and b) the fact that Facebook has tried to bill itself as a network of true, personal connections, as opposed to business-oriented networking. Facebook shouldn’t be viewed as “feel-good” in any sense; it is just as ruthless as any other company. But the disconnect between its marketing and its unveiled reality is more jarring on some subliminal level, I believe, which provokes more visceral reactions such as leaving Facebook (or barely using it, which could still prove deleterious to Facebook’s core business, eventually). But Facebook itself needn’t perish as long as it can provide a decent-enough service and clarify to users what it is really doing with their data, to the extent digital commerce companies do. Again, people are willing to part with privacy for a certain extent of compensation. An amazing UI, a handy events scheduling feature and easy buy-in could well prove sufficient for many. What Facebook has been failing at is figuring out where its limits truly lie.

Let’s presume, however, that the limits for not just Facebook but other services as well are far less restrictive than they may currently seem. Let’s return to that original third premise, wherein consumers simply cease to care at all about privacy. What type of market compensation would be required for consumers to willingly surrender nearly all their information, if not all of it? Even governments don’t have that much information (at least outside of China, soon enough). I presume it would be a multi-service platform offering unprecedented convenience, with enough security cachet to coax even wary users into compliance.

Let’s start with a base case of Amazon Prime, and dial it up to where Bezos seems to want to take it in about five years. First, groceries and basic medications would be provided at additional (though still discounted) prices to Prime customers, who would then be coaxed into giving up their eating patterns and potentially further physical information. This in turn would of course only improve the predictive power and matching of advertising within Amazon for all other products. (More vegetables being ordered? Why not purchase this scale to help with your weight-loss efforts?) The next step would be a more holistic psychological profile via analysis of streaming patterns against the types of food most recently consumed and medications taken. How frequently does this person eat junk food and then stream 17% longer into the night? Does this spark melatonin supplement purchases?

As you can see, the cascading effect could only continue in what would seem a perfectly virtuous cycle from the company’s perspective. The trick would be to assuage any consumer concerns by ensuring excellent quality across the board (or at least a tolerable range), as well as potentially taking the radical step of providing a monthly summary of the data tracked, with opt-ins and opt-outs accompanied by clear warnings of what would be surrendered each time. This would work best for a company determined to create its own complete marketplace/one-stop shop, as opposed to serving as the entry point rather than the vendor, a la Google. It could both soothe users’ fears and inculcate in them a sense that the company is always willing to engage with their needs (not to mention an illusory sense of control, frankly).

In truth, that’s the outcome I view as most likely, simply because buying into ecosystems is going to become nearly irresistible unless decentralization efforts become much more compelling and of sufficient quality. It’s a bit disturbing to contemplate, but it does seem the likelier path, given the current landscape.


Controlling Time

A favorite if frustratingly smug exclamation of mine is “Time is a biological construct.” That’s not strictly true; time is better described as a physical construct, one that enables our limited mentalities to make sense of a world in which we are bodily bound to three dimensions yet can dimly grasp the existence of more. Time is necessary, but it is also the great limiter, a factor that constantly thwarts our abilities to achieve and perceive. Moreover, humans are notoriously bad at truly understanding how the inexorable advance of time flows all around us, and yet how it can, with no small effort, be somewhat mastered. As perhaps my favorite examples of the vagaries of time all reside within financial markets, I’ll illustrate what I mean later, but first it is important to make a few notes about temporality in general.

Everyone lives their own time

This point is readily self-evident and usually acknowledged when thought about. Perhaps one of the most common, if at times tedious, internal exclamations adults make upon seeing friends’ small children after some years is some variation on “How time has flown!” This is entirely normal – as noted above, we aren’t particularly good at living outside our own perceptions in the first place. But it is important to recall that each and every person experiences and perceives time at their own pace, and the phenomenon of aging is perhaps the easiest gateway toward comprehending this. When you are young, experiences are painfully and preciously immediate, as they tend to be novel. As you age, the accumulation of lived experiences necessarily diminishes novelty, and thus time seems to pass faster because less of what you encounter is new. Accordingly, it is clear that perception of time is what really defines time for us, and each person perceives time uniquely. Each person’s life is, to them, a unique accumulation of lived experiences perceived differently upon each viewing, like mutating layers of rock strata banded in kaleidoscope hues.

Perceiving time in and of itself can change it…particularly memories

When it comes to memories, the human brain is a tricky thing. Recently, it has seemed as though our prior model – that experiences are consolidated before being moved to long-term storage – may be inaccurate; rather, the process is continuous, with memories being stored while they are being formed, at varying rates (sleep, for example, is critical for the brain to organize its memory warehouse, so to speak).

But as the brain – and indeed who you are – can be viewed as a collection of memories encoded in neurons, weakened or strengthened by the firing of synapses to unearth particular instances, it stands to reason that by choosing to recollect certain moments more frequently, you favor the preservation of some memories over others. Two things then occur. First, you cull the brain’s entire warehouse down to the memories most often remembered, pruning by simply not bothering to preserve what you ate for breakfast on July 19, 2017 in favor of retaining the memory of your goddaughter’s baptism. Second, even in recollecting that baptism, you do not remember it the same way each time. We crave familiarity and repetition at times, but novelty brings exciting stimulus, so as we retread the familiar, some slight nuance may be lost or added, depending on what is focused upon each time.

Being relative, time entraps us but is within our control

It’s intriguing how frequently time is cast or referred to as a pitiless destroyer of all things. The truth of the matter is that we live time linearly, but really, it is curvable, as best as we can tell. Moreover, our perception of it is elastic. Some may quibble that our perception may be elastic but not time itself, truly, so does it matter? I’d respond that if our perception of time is the building of memory, and we are our memories, doesn’t our manipulation of perception ultimately matter more than the reality of time? Frankly, we don’t quite know enough about time yet to truly escape it, but it is much more within our control than may be thought at first.

A life of routine without pauses for appreciation or reflection will surely accelerate the perception of time, much as you can absentmindedly commute to work in what seems like no time at all, only to realize the pattern is so ingrained that 40 minutes passed in what felt like 10. However, a life of pure novelty is impossible, so the ideal prescription appears to be a balance between the two. Moreover, pure novelty isn’t required to slow down time; rather, different circumstances can be deployed to slow time, or to accelerate it.

For example, consider this: Find as close to a silent space as possible and close your eyes. Begin to inhabit each passing moment. Reflect upon the immediacy of every sensation remaining to you. At the other extreme of inducement, much as soldiers in combat report time dilating in their perception, scenarios that hijack the body’s alert system can also dilate time, such as moments of extreme physical effort or perceived danger. (I can assure you that the dozen or so seconds of freefall I experienced while skydiving felt quite long.) Now, should one wish to kill time rather than slow it down, it’s simple: Seek out diversions that are of necessity novel, but not so novel that they require active perception. Hence the common temptation to rewatch a favorite film when bored but lazy on a weekend afternoon.

The payoff, or what prompted this reflection on time

The primary impetus for this post was an idle reflection in a bar on a lazy Friday afternoon, as some of my mid-20s peers bemoaned the onset of more severe hangovers, attributing them to “getting old”. I was sardonically amused, reflecting on how the mystery novels from the 1930s I used to read would call men of thirty to thirty-five “young”, and how quickly the times can change to land us in our youth-obsessed world. I don’t really blame the baby boomers for their predilection for the 20s and so on; after all, some young men and women of the American and French revolutions seemed to unconsciously evoke the same sentiments. Each age has its own preoccupation with youth; ours is merely exacerbated by the proliferation of micro-narratives that endure by preying upon common human fears. But it’s rather tiresome to try to mark out your life in conventional signposts and give in to the worst vapidities of our time, focused relentlessly on looking and staying youthful. (Caveat: I’m as bad as any of my peers, except that having always looked prematurely older than I am – by virtue of balding early and hitting puberty very young – I frequently refer to myself as being essentially 40.)

But all these typical comments aren’t really a significant issue, although if overindulged they can prompt harmful behaviors. What is more important to note is that – especially given the return of volatility to global markets, and increasing concern over whether the narrative of global economic recovery is truly real – understanding time and your relation to it is crucial when it comes to investing. Many will recall the basic lesson they’ve heard time and again: “Buy low, sell high.” Consequently, they will begin hoarding cash and hope for a dip in the markets, aiming to buy everything cheaply. This isn’t a bad plan, but it is going to be very hard to follow. Until it happens, you don’t know how you will react to it. And the ‘it’ in this case is another recession, which could be worse or milder than the last one (the evidence is mixed at best when it comes to assessing which). Plus, for those of us overexposed to financial assets, the point is not that pivoting all your money into capital preservation right now and rotating out once the S&P 500 drops a substantial percentage will help; rather, as overall returns dip lower and lower, one should remember how stress elongates the perception of time. This too shall pass, in other words.

Furthermore, to hearken back to what I noted earlier, it’s easy to remember the financial crisis and Great Recession differently each time. But it took quite a while for everything to unspool, and there were quite a few hiccups along the way. We have selectively pruned our memories to compress that time, which can leave us with a misleading mental map of 1) how long it took us to realize we were in deep trouble, 2) how long it took financial markets to price in the level of risk, and 3) the actual level of pain and worry experienced. Consequently, one must soberly review the actual chain of events at length, and assess how history is rhyming – not repeating – this time around. For my purposes, for example, accumulating cash is still a sound strategy, yet I am maintaining exposure to emerging markets with a healthy dose of aggressive risk-taking, as there could be a brief window in which capital flows to emerging markets, should the US experience the first rumblings of a significant shock again. However, that window may well be quite brief: current US monetary policy may induce flows into the dollar, and emerging-market companies may feel the pinch of higher debt payments much more acutely than others. Bearing in mind all I’ve said, I’m also retracing the history of the financial crisis and debating whether the same pattern of contagion may occur – and I have no firm conclusions as of yet.

In conclusion, cultivating as much awareness and control of time as possible in this period of heightened volatility, both financial and political, will help assuage the tendency to let outliers or subtly altered memories overly affect us.


Reflections: Looking Back at 2017

Frankly, I’ve never understood the antipathy toward New Year’s resolutions. They are simple bargains you make with yourself or others toward self-improvement…what’s not to like? Perhaps many take them too seriously, in which case, I refer them to Captain Barbossa and urge: “They’re more like guidelines, anyhow.”

So as I begin to assemble my list of resolutions (I tend to go big or go home and thereby complete about 30% to 40% of them, to be even more frank), I thought I’d review some key lessons learned in 2017.

Never Let Your Bias Cloud Your Judgment

It is no secret that I still detest President Trump and much if not all of what he stands for. Even if disaster hasn’t materialized yet, I still believe he is incompetent, unfit for office and a poor example. Those sizable caveats aside, I must admit his administration (which may or may not really be led by him) has made some laudable moves: at least a few decent judicial appointments, more explicit support of the recent Iranian uprising and actual commitments of support to Ukraine. The verdict on the tax bill is still out, as frankly it is quite complex and I haven’t had time to delve into it fully; and given the extent of Republican dominance, one would have expected even Trump to accomplish more. But as has been made manifestly clear, any administration is capable of making some good decisions, and one shouldn’t let one’s bias prevent recognition of them. (I could go on and on about this with regard to personal investing, but this is supposed to be a brief reflection, not a manifesto.)

Work & Personal Life Shouldn’t Be a Balance, But a Cycle

Work and personal life shouldn’t be a zero-sum game. People all too often view time as a fixed quantity, when one should view it bimodally: time’s finite units have variable productivity, and thus time really can expand or contract for all practical purposes. Being fortunate enough to enjoy most of my job, and with 2017 my most challenging year professionally to date, it became a valuable lesson in prioritization. I found that to keep enjoying work, I had to keep exercising to reduce stress and attend the most meaningful family events, such as birthdays or reunions. Such events are useful for grounding perspective. It’s not about taking time off just to rest per se, but about switching activities so dramatically from your workday’s typical roster – going without screen interaction, for example – that it acts as a corrective to your normal cycles. And you can indulge in such resets all the time. Take a different bus or driving route to work. Take a two-hour lunch. Use 15 minutes of downtime at work to read a non-work-related book or article, or finally renegotiate that Comcast bill you’ve been putting off. (Everyone does this, but indulge in it more meaningfully, as no halfway decent manager expects you to really work all eight hours of a given day.) Work 10 hours one day and then arrange to work six the next. Experiment with no headphones for the first couple hours of the day. Grab lunch near your office with a friend or family member you haven’t seen in a while. Humans thrive on change sufficiently large to excite and yet sufficiently small so as not to be painful.

Recognize When You Cease to Be Effective

Rather than suffer burnout, take more time off than you think you need. Perhaps one of my most frequently repeated mantras to my colleagues is “Recognize when you cease to be effective.” It’s very tempting to pull long shifts and feel virtuous about your persistence and fortitude in working a 12-hour day, but you are likely ineffective for much of it, essentially banging your head against a wall. Worse, you may well be slowing others down with your inefficacy. So learn to recognize when your productivity is plateauing or no longer improving, and take a break.

Willpower Isn’t Truly Finite

Willpower is finite. But you can let habits do the heavy lifting, making the deployment of willpower variable. This year I made some significant mistakes, both personal and professional, primarily because I hadn’t built good habits to take up the slack of willpower deployment at work, and thus ran out too early in a given stretch. A key focus in 2018 will be letting habits do the hard work. In the end, it’s about being constantly honest with yourself – radically candid, if you will – and identifying when you are prone to which personal failings.

Use Your Personas Wisely, But Always Be Cognizant of Why

Personality is a fluid construct: people deploy different facets of themselves at different times via what can be called personas. In a professional setting, for example, most have a set persona. It can evolve over time, be markedly different from one’s personal persona, or be quite similar to it. I am notorious for a fairly bombastic, extroverted, exuberant persona at work – which is amusing, because like many I am neither introvert nor extrovert but a blend of the two. I don’t care much for personality tests or similarly rigid boxes of self-identification, but I know that I am far more introverted than many of my colleagues may suspect. By virtue of my role, and because I enjoy indulging certain other facets of my personality at times, my extroverted persona became more useful over time. It is critical only to recognize that carrying on in just one persona all the time can be rather exhausting, and thus you should always be cognizant of why you are utilizing those particular characteristics of your entire personality.




5 Key Lessons Learned from First-time Management

2017 was my most professionally challenging year ever, due in no small part to the fact that I became an official manager at its start, with two direct reports. Although I had been managing people’s workflows de facto for at least a year before then, and I have managed far larger teams elsewhere, it was an intriguing and by and large rewarding experience to practice management in a true corporate setting. So, without further ado, here are the five key takeaways I’ve gleaned thus far from the day-to-day responsibilities of managing, in no particular order.

Leverage your personality.

Unfortunately, my workplace persona’s quirks include occasionally singing a quiet word or two from a song, silently head-bobbing at my desk to music, and making mediocre-to-atrocious puns. (Yep, it can be just as annoying as it sounds, so I constantly work on curbing these indulgences.) Most of the time, it works within my team’s camaraderie, but if you also tend to come off as overly bombastic or, at worst, pompous, you have to harp on egalitarian themes to compensate. Moreover, you can’t be sensitive in any way to people poking fun at said mannerisms, because that lets them blow off steam if need be. And it is best of all to check your team’s mood frequently and react accordingly, because to everything there is a season. As you may be able to guess, such gregariousness and humor can help create a welcoming environment.

Sticking with just a handful of persona-delineating traits gives people boundaries and consistency, which are crucial to maintaining a team’s stability. My colleagues know that if it is a particularly busy time for me, I tend to remain silent, and they can respond accordingly. Moreover, I can puncture tension or stress with a silly observation if need be. My team is odd in that one of my reports is nearly as experienced as I am, while my department as a whole is very congenial and closely knit, so being able to signal the limits of your ego is critical. Frankly, I still fail most often by simply indulging in silly or mildly amusing hijinks too much, even if my team and other colleagues find them enjoyable. It is useful in that I clearly show I don’t take myself very seriously at all, but I usually take it too far, and thus must keep working on toning it down.

There are other examples of deployment, but relying too much on your personality can also lead to a major pitfall, which brings me to the next lesson:

Favoring radical candor over cruel kindnesses.

I am a big proponent of radical candor. But it is devilishly hard to maintain. Nobody likes to disappoint people or to chide and correct behaviors. Yet to refrain is to embrace a false kindness that is actually crueler than candor: inevitably, either a flaw grows so serious that it results in a very poor outcome, or performance remains mediocre and the employee typically grows frustrated.

For a relatively young manager, the key hurdle is summoning the gravitas and empathy to communicate difficult truths thoroughly and clearly. Empathy, luckily enough, comes fairly readily to me, as I have worked through both of the roles my direct team handles, so I know the frustrations and difficulties they face. (If you don’t have that benefit, you frankly have to solicit that clarity and honesty from your team much more frequently.) Gravitas is much more difficult for most, because there are the traps of bluster, presumption or, most damning of all, the demand for respect and consequent self-justification. That simply doesn’t work, especially for me, given my youth and level of experience. So that leaves only one option: radical candor concerning your own flaws and level of experience. As a first-time manager, I most frequently fall back on an approach resembling that of a team lead rather than a more senior manager, e.g. “I have done it this way in the past, but is it the best way? I don’t know, to be honest, so let’s explore this together.” The crucial concept underlying such an approach is that since you are the one in charge, responsibility ultimately lies with you, and thus you are the decider.

That approach only works well if buttressed by two more key concepts.

Respect is best earned via demonstration and offering of loyalty.

You can’t demand respect – that is probably one of the surest ways to lose it. In a team of relatively young people without a ton of experience, I simply can’t afford to regard myself as meriting my peers’ respect on the basis of slight seniority, the fact I was chosen as manager, or significant outperformance. Even if all of those were true, only outperformance is likely to work, and sadly, I am neither skilled nor talented enough to outperform on every single project. Accordingly, what works best is extending respect to your team members for their relative merits and continually reiterating and demonstrating that respect as warranted. This is further reinforced by shows of loyalty, e.g. lobbying for promotions or raises and offering praise in public settings. People may admire significant intelligence or hard work, but ultimately they care for and respect those who demonstrate care and respect for them. The first two matter as well, but proffering respect is even more important for first-time managers in particular.

Sincerity also cannot be emphasized enough, which ties back into personality. My personality type lends itself to exaggeration and overly effusive praise, so I must be wary and rein it in as appropriate. Conversely, if you are more taciturn, your encomia may carry more weight, but you have to remember to voice them more frequently than you naturally would.

Exhibiting trust by offering ownership.

There are no ideal situations in business. At some point, you simply have to trust someone with an important project, try to hand them all the tools to succeed, and then pay consistent attention and offer constant support to mitigate any potential issues. Giving people a clear sense of ownership is crucial – and people usually can sense when you are still working behind the scenes. Everyone appreciates support, but they have to know and appreciate that they are being entrusted with an important task, and that whether it succeeds or fails is primarily on them. You need never state the plain truth outright – that you will stand up for them and take the blame if the endeavor fails, since it would ultimately be your fault, while all the credit is theirs. Instead, the fact that you have their back must be implicit.

There is one final key lesson I learned:

Consensus must allow for respectful dissent, which can be healthy.

This is a key piece of radical candor, yet it also ties into personality. I am a terrible liar, and I sometimes disagree with certain choices made by clients or even my superiors. Obviously I don’t need to proclaim when the latter occurs, but I must be upfront with my team members should they also express concern, as I cannot dissemble and, more importantly, they deserve to hear my reasoning. Of course, you must also ground the explanation of your differing opinion in the context that the decision has been made and that you could well be wrong too. The important thing to remember is that dissent is not bad – actually, it is often good. If your team members feel secure enough to dissent (in a respectful manner) while signaling that they will still abide by your decision, that is a good sign, as there is more than sufficient trust and camaraderie present. If you also adequately explain why you disagree with the choice made by the client or your own boss, then you are exhibiting respect once again, by showing that you appreciate their sophistication and maturity. Plus, it teaches a valuable lesson: few to no business decisions are foolproof.



Path to a New World Order: Musings on Tribalism vs. Nation-states vs. Hyper-connection

Looking back upon the birth of the Internet, the more hyperbolic encomia that proclaimed it a powerful tool to end division and promote harmony remain about as woefully erroneous in hindsight as Kurzweil’s continued predictions of the Singularity’s date. The Internet only aids and abets harmony as much as a given user wants to aid and abet harmony. You can’t really get away from the perpetuity of human nature, no matter how much historical cycles actually vary, it seems.

But this view omits the actions of the many who, even intermittently, do employ the Internet to contribute, share, empathize, or otherwise perform positive actions that build community. Ostracized or lonely individuals can find communities anywhere, no matter how niche the topic. This is still tribalism, but it is the more positive kind, as opposed to the darkest recesses of 4chan or the peculiar, puerile idiocy of the alt-right’s racist outcroppings. Broader tribalisms that are relatively neutral in effect – those organized around common historical, linguistic, social and perhaps ideological bonds – are much more powerful in general, given their wider common denominators. The best current examples that come to mind are secession movements in Europe.

One thing I often ponder is how strong such positive outpourings of tribalism may truly prove to be. Viewed purely as an informational flow mechanism, the Internet can enable any number of niche communities. Of those many, a few that could perhaps never have achieved the critical mass needed to spark change now can. This is often called the tyranny of the minority, as most notoriously exemplified by the extent to which splinter factions of white nationalism in the US now enjoy media coverage beyond their wildest dreams.

So where does this trend go from here? There is a sweet spot between communities that exist mainly thanks to the enabling information flows of the Internet and extant communities that are now empowered more than ever to deliver their message effectively. For example, Catalan nationalists and Scots currently enjoy greater abilities to organize and promulgate their messages than ever before, as opposed to, say, Occupy Wall Street, because the former two had much stronger real-world ties. (It’s worth noting that Occupy Wall Street did enjoy a rich historical heritage in America of fairly left-leaning anarchic factions popping up every once in a while.)

Few communities can clear that hurdle of effective real-life ties (aka skin in the game – however much Nassim Nicholas Taleb irks me, he did popularize that highly effective phrase) merged with the power of modern communication infrastructure. Yet will more end up doing so if the first few are successful? Should Catalonia successfully secede, will Scotland be next, emboldened not only by that example but by the parent state’s own Brexit?

It is tempting to think so, mainly because the neoliberal global political order that has endured since the end of World War II has rarely looked so strained. Key nation-states were the primary components of said order, with the indispensable nation of the US – which challenges so many political theories – seemingly the most troubled. But at the same time that separatist parties (essentially, those that can be classified as status quo challengers) are empowered by the Internet, the multinational corporations that provide consumers with the very means of access, and that grow increasingly monopolistic in major sectors, are growing in power too. And whether they admit it or not, businesses such as Facebook or Amazon or Shell or BP have a vested interest in catering to diverse groups while still unifying them under one massive umbrella: their customer base. Hence, to court customers, corporations will enable the formation of communities – because, after all, connection is what humans crave so avidly – so long as those communities buy into that most basic yet essential community: that of the corporation’s customers.

This is neither wrong nor right – it is simply the natural evolution of the corporation’s strategies as it currently exists. But since they have such powerful incentives to keep the peace, as it were, or at least to maintain the ability to keep growing and tapping into newer markets, hyperconnection is a definite end goal. Facebook and Google are the most obvious exemplars, seeing as their businesses are so uniquely dependent on eyeballs. But no multinational is far behind.

Meanwhile, we have the supposedly beleaguered nation-state, which now has to contend with not only empowered internal tribalist factions but also multinationals’ suite of incentives. But the nation-state can’t be counted out just yet – apart from empires, it is the most successful form of political organization thus far, after all. And the oldest nation-states, those that most successfully integrated their various tribes, do seem robust. France has had its splinter separatist movements for a while, but few look able to attain critical mass, thanks to how successfully monarchs and then draconian republican governments stamped out or absorbed local dialects and identities. The United Kingdom was less successful, and faces the consequences. China is an exemplar of the absorption of tribal factions, often at immense human cost, although it is hardly alone in that respect, as a look at the US shows. (That unique nation was essentially founded in part on the ideal of tribal factions burying the hatchet just long enough to compromise endlessly in favor of common interests.) Germany is more in line with France in terms of how well it stitched together duchies, kingdoms and the like. Thus far, in fact, only a few nations that, owing to their particular histories, never assimilated their citizens to a sufficient degree look to be in danger of actual secession.

But will that really matter? How small does a nation-state have to be to not count as a nation-state? Presuming that a nation-state is a group of people or peoples with sufficient weight of history, customs, languages and such in common to assemble under one banner, does size truly matter once it grows to a scale dwarfing that of tribes?

Hence we now arrive at an interesting crux where the current global order appears balanced between growing hyperconnection fostered by pure economic/financial interests (multinationals and partially certain nation-states with globe-spanning economies), tribalism enabled by said connections that can undermine certain nation-states, and the former status quo largely based upon a handful of nation-states (the G20, to be generously inclusive) mostly reliant on both trade and information flows continuing yet also suffering their consequences.

Where this ends up is truly anyone’s guess. But separation in and of itself is not truly a bad thing. Sometimes the consequences simply aren’t worth it – even if the issue of slavery hadn’t existed in the US, I contend that secession still wouldn’t have been worth it, as it would have increased the longer-term probability of warfare, which should be avoided at all costs. (Hence why I am still against secession, for what it’s worth.) The same COULD be true of, say, Catalonia and Spain, or Scotland and England. In both instances the odds of future conflict are lower, but that is due more to the degree of mutual benefit both sides derive from refraining.

Actually, that is the real question: what degree of mutual benefits – and, for that matter, shared potential costs – can help assess whether tribalist instincts will triumph? For example, I don’t really believe secession will happen in the US, despite the unfortunately perceptive Steve Bannon’s prognostications about California, mainly because the mutual benefits are too obviously great. The same could be said of Scotland, although Catalonia is a tossup. So wherever it is possible to ascertain that disparity, it should be easier to assess where tribalism will triumph. And, in a fairly cynical move on my part, I am also willing to bet that multinational corporations will do their best to subtly undermine such processes, when absolutely necessary and when they can, to ensure the bottom line remains intact – since the status quo has been so beneficial to many such enterprises, they have a pretty sizable incentive to preserve much of the current balance of power.

Granted, that is all contingent on humans’ ability to not shoot themselves in the foot when acting in concert. I have been very wrong about that before, and frankly, on a personal level, I continually go against my own best interests. But thanks to modern information flows, even as lies and other fannings of tribalist flames spread swiftly, the benefits of communicating with other, similar members of an online tribe – however varied their motives – can accrue just as quickly. Time will tell which bonds prove the most crucial.


In Favor of Information Overload

As technology advances, the pace of innovation quickens almost exponentially. At times I ponder whether the excesses of the information age imply that in the most developed, richest nations we are experiencing its profligate last stages, before augmented and virtual reality vault the information age into an entirely higher stratosphere. After all, with spurious information now being created by the gigabyte as I type this, alongside the perhaps even more massive totals of accurate information being generated yet never analyzed, are we drowning in our own excess?

The primary challenge for anyone looking to remain relatively well informed is how to deal with the torrent of information bombarding your typical day, with or without your consent. You already take in more than you may think – for example, right now I am writing this while listening (please don’t judge me) to Demi Lovato, after listening to a medley of Rostam’s latest album Half-Light, just after checking Twitter, with several tabs open to articles, one of which I am about to cite. Can the typical human brain handle this much information?

Yes and no. It is obvious that we have limited focus and attention spans, and consequently have to make deliberate choices about what we really want to pay attention to (for example, I fear that I am not paying THAT much attention to Ms. Lovato’s lyrics). The usual complaint that follows – bearing in mind that training your ability to focus is probably one of the greatest competitive differentiators for employees nowadays, in my experience – is that focus is increasingly hard to sustain given exposure to so much information flow. So that line of reasoning goes, we should limit our intake very selectively: prune your Twitter feed, cull your magazine subscriptions, etc.

I disagree. I think that we underrate the human brain’s ability to handily reject information that isn’t relevant to the needs of the moment. Case in point: I’m about a third of the way through my customary Saturday morning reading time, and I can report that I have read two interviews in Quanta magazine – one with Ken Ono, a gifted mathematician, about his book on an Indian math prodigy born in the 19th century, and another with physicist Neil Johnson about modeling extremist behavior online. I read an Aeon article about the transformative nature of the changing-self paradigm, and another about creepypasta. I read an Art of Manliness article about four key money-saving principles. And last but not least, I read an intriguing article from Quanta (again) that I recognized would prove of use in writing this contemplation of information intake.

That article explores how the “information bottleneck” – wherein the brain unconsciously prunes less-relevant details from memories to preserve more important ones for working memory – could prove useful as a concept to trainers of deep neural networks. That process of mental pruning is constant: I am already losing details of the articles listed above, apart from the one I just cited, since I parsed that one repeatedly. Such repetition increases retention because reviewing fires neurons down the same pathway again, signaling to the brain that this is worth keeping in working memory.
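For readers who want the concept in symbols: the standard information-bottleneck formulation (due to Tishby and colleagues, and not spelled out in the article itself) seeks a compressed representation T of an input X that discards as much of X as possible while staying predictive of a target Y:

```latex
% Information bottleneck objective: choose the stochastic encoding p(t|x)
% minimizing the information retained about the input, I(X;T), while the
% multiplier beta rewards keeping information about the target, I(T;Y).
\min_{p(t \mid x)} \; I(X;T) - \beta \, I(T;Y)
```

A larger beta favors prediction over compression – a rough analogue of the brain deciding that a given detail is worth the storage.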

Now, if this process is already underway, what do we truly lose by deliberately exposing ourselves to as much information as possible, or even more than we can consciously process? Isn’t that the wiser course, since then the primary challenge is to hone our sense of what is best to retain in working memory? Rather than throttle the stream of information at the source, why not gently nudge its flows into more disparate channels of varying priority? In brief, train your prioritization, not your volume of intake.

The cultivation of one’s stores of information is critical, of course, but maintaining the diversity of the mental ecosystem, so to speak, is as important as the microbiome’s variety is to the body’s overall health. Managing the only truly limited resource – one’s precious time – is then the accompanying, paramount challenge.


Path to a New World Order: Deglobalization via Demographics

Everyone likes to think they are a contrarian in some way, in an ironic twist reminiscent of the fact that most people rate themselves as above-average drivers. I am as guilty as anyone of this phenomenon, which is why I am predisposed to agree with the authors of an intriguing new paper, available here. In brief, the authors argue that aging demographics will leave savings lower than investment, which will contribute to rising real interest rates, inflation and wage growth, even as inequality falls. You should read the paper for a full runthrough of their reasoning, but I couldn’t help taking their conclusions a few steps further.

Ever since Fukuyama declared “The End of History,” many have been trying to poke holes in the theory that the current world order is truly that superior. Some say that rising inequality in developed nations (noted in the aforementioned paper), as well as Brexit, populist political movements and, most tellingly, Trump’s surprising victory, are all symptoms of backlash against the consequences of globalization.

You can’t turn back the clock, however much you may wish to. Current populist movements succumb to the allure of nostalgia when they wish to have their cake and eat it too: enjoying all the benefits of globalization – relatively inexpensive cellphones, access to a massive diversity of inexpensive goods, etc. – without facing the consequences of exporting many jobs.

But what if the clock moves even further forward, as a consequence of globalization? Globalization inadvertently helped exacerbate inequality, which in turn was worsened further by certain policy decisions, both monetary and fiscal. That doesn’t mean globalization is inherently bad – after all, doesn’t a white working-class man still enjoy a higher standard of living than a factory worker in Bangladesh? But as developed nations grew even richer, and the nations best positioned to take advantage of globalization rode the boom of serving as producers to unimaginably wealthy consumers, demographics began to change. Median ages crept upward as birth rates declined and immigration proved insufficient to stem the tide. Intriguingly, and as a result of one of the most ineptly cruel policy decisions of all time, even China faces a demographic shortfall due to its one-child policy.

So as demographics shift, let’s presume that the conclusions put forth in the paper cited above are correct. Should labor become scarcer, its value will only rise. Consequently, the technological advances that have been progressing steadily if slowly in the background – those that further erode reliance on labor – will become even more in demand. Increased automation of manufacturing processes will become even more widespread, rendered far more intelligent by the application of advanced artificial intelligence programs.

More importantly, those processes could well become localized. Why rely on overseas factories if it becomes cost-effective enough to make things domestically and reduce transportation costs? I am well aware that’s a significant supposition, and it likely requires additional factors such as tax reductions, significant technological advances and perhaps even subsidies. But in many industries, I think many underestimate the extent to which companies rely on overseas labor simply because it is currently cheaper. Once human labor costs and labor-replacing machinery costs intersect, local production may well seem more reasonable. If the US was still the third-most prolific producer of cotton in 2016, for example, and a company such as Atlanta-based Sewbots can automate most clothing production, perhaps “made in the USA” won’t signal loftier prices as much.
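The crossover logic above can be sketched in a few lines. Every number here is hypothetical and purely illustrative – the point is only the shape of the comparison, not any real industry’s figures:

```python
# Toy break-even comparison: per-unit cost of overseas labor (plus freight)
# versus domestically amortized automation. All figures are invented.

def per_unit_cost_overseas(labor_cost: float, shipping_cost: float) -> float:
    """Overseas production: cheap labor plus transport back home."""
    return labor_cost + shipping_cost

def per_unit_cost_domestic(machine_capex: float,
                           units_over_lifetime: float,
                           maintenance: float) -> float:
    """Domestic automation: machinery amortized over its output, plus upkeep."""
    return machine_capex / units_over_lifetime + maintenance

overseas = per_unit_cost_overseas(labor_cost=4.00, shipping_cost=1.50)
domestic = per_unit_cost_domestic(machine_capex=2_000_000,
                                  units_over_lifetime=1_000_000,
                                  maintenance=2.75)

print(f"overseas: ${overseas:.2f}/unit, domestic: ${domestic:.2f}/unit")
# As machine costs fall or output per machine rises, the domestic side of
# the inequality wins, and the "made locally" premium shrinks.
```

The interesting variable is `units_over_lifetime`: the better the machines get, the faster the amortized capital cost drops below the wage-plus-freight floor.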

Once the factors of production for many types of goods currently located overseas are again located domestically, how different will globalization look? I’m not supposing trade will necessarily shrink by a massive amount – after all, natural resources are distributed unequally – and the flow of human capital and information via the Internet will continue, hopefully. But the world will lose one primary avenue of globalization, although, as I said earlier, the clock can’t be turned back, and cultural exportation and assimilation and, inevitably, the ensuing backlash will continue.