I love sleep. I’m not ashamed to admit it, even as someone who used to repeat tropes like “I can sleep when I’m dead” and “Sleep is for the weak.” After the better part of a year embracing the value of sleep, I have returned to the life of minimizing those precious unconscious hours. It seemed fine to run on six hours a night during undergrad, but now that I think back to all the times I fell asleep in class, maybe it wasn’t so great even then. The science seems to show that I was probably deluding myself.

Last week, Joe Rogan released a podcast with the self-titled “Sleep Diplomat”, Dr. Matthew Walker. I rarely listen to The Joe Rogan Experience because the conversations commonly stretch past two hours, but I had several hours of traversing the prairie ahead of me, and Joe is a very good interviewer of very interesting people. An appropriate topic came up: the fact that we can fall unconscious in a state that appears to be sleep, but isn’t actually restful sleep.

First, a bit of background. Neuroscientists have broken down the different phases our brain waves go through when we sleep into five stages: 1, 2, 3, 4, and rapid eye movement (REM). These creatively named stages of sleep reflect the different patterns of brain waves discernible on an electroencephalograph (EEG) output. The first two stages are actually quite similar to wakefulness. You’re resting, not deeply sleeping, and most people will wake from these stages with only a small disturbance. This is sort of “sleeping with one eye open”. In stages three and four, you enter deep sleep. Brain waves slow way down, and your brain starts secreting all sorts of hormones necessary for the repair of both neural and musculoskeletal cells. When you wake up feeling absolutely dreadful, and your limbs feel like they’re made of lead, you were probably in this stage when your alarm went off. The final stage, REM, is when your dreams happen. Your brain is actually more active than it is when you’re awake, and you’re essentially playing out scenarios in your head at 20x speed. Why these scenarios are often fantastical and utterly nonsensical is a mystery, but this process is critical for memory formation.

The reason I provided that little primer is to say succinctly that when you’re in an unfamiliar place – a hotel or a friend’s couch – you never really enter those truly restful third and fourth stages. You sort of flit between stages one and two and maybe a half-baked REM sleep in which half your brain stays conscious. From an evolutionary standpoint, this makes a lot of sense. Before the days of secure society and safe hotels, those who were able to sleep lightly when they weren’t in a known and secure environment probably lasted longer than those who could fully conk out anywhere. Of course, there are exceptions to all of this, and these are broad generalizations, so if you’re the type of person who sleeps soundly and deeply on the road, please consider a career as a traveling salesman so those of us who can’t sleep so well on the road don’t have to suffer the sleep deprivation.

And all of this comes back together with that whiny sentiment. I’ve been on the road most of the last two weeks and haven’t been sleeping particularly well even in my apartment. Combine that with multiple early flights, late nights, and a compulsion to start my day well before dawn, and the perpetual circles under my eyes have deepened to a point that even startled me.

And there’s no end in sight. I have four more nights couchsurfing in Chicago (yes, Invenergy would have paid for a hotel room, but I couldn’t justify accepting the exorbitant rates regardless of who was paying the bill), and then it’s off to Kansas again for the foreseeable future.

Anyway, that’s about as exciting as my life has become. With the current schedule, I’ve been too tired to try to spice things up with any exploring, excepting this little weekend adventure. I came out to Chicago a couple days early to see a lecture by Dr. Jordan Peterson, a psychology professor at the University of Toronto. There is much thinking to be done on the ideas that he unloaded on us yesterday, but hopefully, I’ll have the time (and mental capacity) to write some of those reflections down.



“So. I’m going to give about 10 seconds to get up and get out of bed,” the terrifying voice of a former Navy SEAL commander uttered from the phone sitting on my counter.

With my limbs feeling like lead, I tossed off the covers, rolled onto the floor into the leaning rest, and began my pushups as Jocko counted: “1, 2, 3, 4, 5, 6, 7, 8, 9, 10.”

The motivational morning speech continued as I turned on the lights, folded my blankets, turned my little futon back into a couch, and changed into my workout clothes.

Once I turned off the alarm, I changed back into my sleeping clothes, remade my bed, turned off the lights, and set the alarm for 10 minutes later, 8:50 pm.

I was practicing.

Last week, I had snoozed for over an hour on multiple mornings, skipping my workouts, which started a downward spiral for those days. I’m sure I’m not the only one who has noticed that the resolve to get up and work out in the morning is much stronger when I’m setting the alarm than when I’m responding to it.

Fortunately, we don’t have to rely on resolve. The old trope that we are creatures of habit is both true and extremely powerful. Firmly building good habits takes time, but once they are in place, it’s like being on autopilot. For many parts of our lives, we don’t want to be on autopilot. We want to be focused and present while we’re solving new problems at work or being with friends.

But this energy is finite, and if we’re using up our mental energy on simple things like getting out of bed, that’s taking away from the more important things later in the day. Now, instead of fighting through the internal debate of “to get up or not to get up” as my alarm creeps earlier and earlier into the wee hours of darkness, the habit is reforming. After only about a week of consistency, I’m pretty much unconscious until I’m stepping out into the cold and jogging over to the gym. By that time, the “not to get up” argument has little force.

I’ve built many good habits over the years, and the momentum of those habits has buoyed me through my psychological struggles over the past few months, but there are plenty of habits that I didn’t even know I’d been cultivating.

One particularly important one, which has had some unexpected consequences, is the habit of constant relocation. Before moving to Denver, I was in Fort Collins for about three months. Before that, I was on the road for a month. Before that, I was in Poland for a month. Before that, on the road for one. Before that, Norway for two, Visby for seven, Uppsala for one, Fort Collins for three, Denver for one, Fort Collins for three, on the road for two, southern Seoul for four, central Seoul for four, Fort Collins for two, Milton for one, Pace for seven, Pensacola for seven, Glen Burnie for two, Fort Collins for one, and finally, almost five years back, Annapolis for a long four years (if you don’t count summers and that semester in Colorado Springs).

And to note, I’m still in the cycle. Even though I won’t be moving this year, my office is moving, so it’s hard to justify getting too involved in Littleton or in activities along my commute.

The point here is that this habit of being in one place for only a few months at a time means that my concept of time has been distorted. To me, “a long time” is like a year. Who gets anything done in a year? The projects I’m working on will take at least three years to complete, probably more. Anyone who has become reasonably competent in anything (barring prodigies) worked toward that goal for years, if not decades or even scores.

Although I’ve gone through the motions of settling down to make some legitimate progress on my goals (career, education, fitness, etc.), I haven’t actually made the mental shift. I’m still in the mindset of making sprints instead of training for the marathon. This translates into pervasive impatience. I try to attack all my goals at once, get frustrated that I can’t keep up with them, question why I’m pursuing the goals to begin with, and backslide.

But how does one practice pursuing long-term goals?

I’m still working on this, but here’s what I’m trying. And at least for the first five days, it seems to be going really well.

  1. Patience & Perseverance. Building good habits (and breaking bad ones) takes time. For those like me who view “long” time scales as on the order of months, this process seems to take forever. I’ve lost a lot of my good habits, and my impatience to get them back has led me to rationalizations like “Ok, that was a good week. Now let’s double all my goals, commit to triple the time, and push past where I was a year ago.” That, of course, leads almost immediately to failure, depression, nihilism, and self-doubt. Instead, the current plan is to focus on #2:

  2. Build a solid foundation. Yes, one of my ultimate goals is to finish this damn book that I’ve purportedly been working on for the past two years, but immediately trying to jump back in and build the habit of writing 2,000 words a day is a recipe for disaster. Right now, I don’t have the habit of focusing for a reasonable amount of time (2,000 words typically takes me 3-4 hours), the habit of thinking in a narrative structure, or even the habit of doing cognitively demanding work outside of working hours. Being patient enough to accept that I’m not going to get to my ultimate goal for several months is drastically increasing the odds that I’m actually going to do it, and having the perseverance to do so is greatly helped by #3:

  3. Set rewards. I’m going to take a stab at explaining some neurochemistry here, so please forgive the inevitable mistakes. Good feelings can be understood by which chemicals are elevated in the brain. One of these is dopamine. Dopamine is what makes that chocolate indulgence so attractive. It’s what makes you excited about shopping or opening a big paycheck. It makes you want things enough to pursue them. But here’s the thing: our brains only release dopamine if we can actually see our objective and believe we can achieve it. We can only maintain that dopamine if our actions continue to reinforce that belief, so we need to be making noticeable progress toward the objective. Nebulous goals like “improve” or “do more” don’t trigger a dopamine release. That’s why setting concrete goals is so important. But the desire may wax and wane. There’s another way to hack this system: link the goal to some other desirable thing. For me, food is basically the greatest thing ever, and I will do just about anything to get myself in front of a mountain of delicious food. I’m also pretty easy to please, so a reward like spending the afternoon munching on a fresh loaf of bread and fancy cheese goes pretty far. It also helps to have the reward related to the goal. I have a goal of exercising every day for 30 days. The reward is to treat my abused body with a massage. However, even with these guidelines, I’m bound to slip up, so for the final guideline, I’ll take one of Jordan Peterson’s “12 Rules for Life”:

  4. Compare yourself to who you were yesterday. The reason I’m undertaking this whole endeavor is that the person I was yesterday is not who I want to be tomorrow, but who I am today is somewhere in between. When I slip and fall back into old routines, I have to remember that the effort up to this point has not been in vain. The practice that I’ve done, no matter how small, has been an improvement upon who I was when I started. To throw it all away because of one misstep ignores the fact that even though it may be a step back, it is a step backward from multiple steps forward. Even if it’s just as many steps backward as it was forward, at least I’m back where I started instead of further behind. This point also recognizes the fact that we are changeable. Indeed, we are inevitably and perpetually changing. Making the comparison to who I was yesterday means that I’m not comparing myself to who I was a year ago or a decade ago, and I’m certainly not comparing myself to who anyone else is today. I’m focusing on the little gains, the small steps toward my goals.

Even with these guidelines, though, the process is not easy. I’ve had far too long to build up bad habits, accept some faults, and slide significantly backward from much of the progress I made last year. Fortunately, I have plenty of time to get it back. I’ll almost certainly be here at least until next January (when my apartment lease expires), probably a lot longer. That’s a solid year to rebuild good habits and settle into what it feels like to settle down. This situation isn’t permanent – I know that for certain – but impermanent does not necessarily mean “a few months”. A few years is hardly anything even in the short few decades I’ve already lived.


Convincing myself

Today’s featured image: Rocky Mountain sunrise from the steps of the Capitol. This has been my phone wallpaper since I snapped it on a Saturday morning run shortly after I moved to Denver. I try to keep my wallpaper something local to reinforce a feeling of presence, recognizing the beauty where I am instead of wishing I were somewhere else. It has worked better in other locales.

It’s now been over a year since the current American president took office, and it’s still weird. I don’t think it will ever be not weird, even after he’s gone. Honestly, I’m surprised he’s lasted this long. The day he was elected, I knew that I would need to return to the US, but I’ve spent every day since then trying to convince myself that I was happy about it.

I have yet to be successful, and the closer I got to my return, the more anxious I became. Some of you may remember a rather distressing post from July in which I learned what nihilism actually feels like. For me, philosophies and religions had always been theoretical constructs, mental toys to be mulled over, but never indescribable mental realities. Unfortunately, the nihilistic feeling has continued to haunt me.

I’ve continued to maintain what have been the core aspects of my life and character, but those things come almost entirely out of habit. My fitness routines, work ethic, and other pursuits continue to hobble along because of momentum built over years of reinforcing practice. That momentum is, however, noticeably bleeding off.

I accepted this life of corporate servitude in exchange for a bit of stability that would allow me to pursue personal goals that will take longer than the average of a few months that I’ve been in each new place over the past three and a half years. Yet my fear of stasis constantly undermines each new endeavor. Perhaps the most successful experiment so far was exploiting my newly found unlimited access to a weight room when I successfully broke 200 lbs for the first time in my life (and no, it wasn’t all fat). That took me about a month.

In the ensuing two weeks, I lost all of it, returning to the weight I was when I graduated high school and have been almost invariably since then (and no, that weight change wasn’t fat either, sadly). A week of experimenting with intermittent fasting (or time-restricted eating, if you want to be pedantic about it) left me feeling fantastic and back on track, but a creeping self-loathing brought it to a crashing halt with a weekend full of cheap pizza and doughnuts.

Logically, this doesn’t make any sense. As soon as things start going in the right direction, when I’m proud of what I’m doing and physically feel good, I decide it’s time to pick up some bad habits. It’s absolutely insane.

Well, it is if you (as I often did) accept the premise that living a healthy and productive lifestyle is a good thing. On its face, this seems perfectly obvious. Why would anyone pursue anything else? Even if you suck at living a healthy and productive life (which almost all of us do), you know what you could do to change. It’s the changing that’s the hard part.

But what if even the easy part (knowing you should change) also comes into question?

And that’s where I am: unconvinced of the reason to enforce the discipline that had defined my life for so long.

It is reinforced, in part, by my disastrously cynical outlook on the future. One of our first assignments during my wind power master’s program was to make an argument for or against the proposition: society will do what is necessary to keep the average global temperature within two degrees Celsius above pre-industrial temperatures.

I argued in the negative based on three points:

  1. We have already used up much of the estimated carbon budget
  2. The momentum of established political and social institutions retards the transition away from fossil fuels
  3. Current technology is still unable to transition just the energy sector, much less the transportation, residential, and agricultural sectors, which make up the overwhelming majority of the emissions problem

In sum, our recent frenzy of renewable energy development and well-intentioned political movements are too little, too late.

This fatalism about a world of changing weather patterns and rising tides, combined with a cynical view of mankind’s primitive, tribalistic, and violent tendencies, has brought me to the following outlook for our species:

In the next couple of decades, climatic changes are going to exacerbate political instabilities to a breaking point. Mass migrations, fervent uprisings, and military retaliation will draw the major powers into global conflict. Even in the unlikely event that a rogue actor does not set off a nuclear holocaust, the conventional weaponry deployed will have left most modern civilizations in ruins, and/or the global government that emerges from the destruction will bring us to a 1984-esque society.

With this future to look forward to, why should I prepare for it? What use will all our wind turbines be when climate change leads to the downfall of modern society anyway? For whom ought I work when my expectations of humanity are so abysmal? And when I’ve come to terms with my own mortality and the futility of controlling the world around me, why should I even work for myself?

That’s what I sat down to answer.

The assumptions that lead to this conclusion rest on a couple of logical leaps that are actually quite unreasonable.

The main premise – that global action on climate change will be too slow to prevent a 2-degree temperature rise – is likely still true. A handful of unlikely events would need to coincide for the world to meet its objective under the Paris Agreement. First of all, deployment of renewable energy would need to increase rapidly. Even at the fastest rate the US has ever seen, it would take several decades to transition the energy sector almost completely away from fossil fuels. With the coming phase-out of renewable energy subsidies, it is unlikely we will sustain the rate of development that we saw at the end of 2016. Even if we did increase renewable development, it would need to coincide with a few more factors to get the world to a low-carbon economy before our carbon budget runs out. Developing nations like China and India would need to transition even faster, and for countries much more focused on getting power to their tens of millions of poor, this also seems unlikely. Even in the event that we solve the electricity problem, we are still starting with less than one percent penetration of low-emission light-duty vehicles (consumer cars and trucks) and basically zero low-emissions presence in heavy-duty vehicles (think trucks and construction machinery), shipping, heavy rail, and aviation. Except for some particularly efficient designs in Europe, most residences are heated by burning some sort of fuel; animal agriculture (which is feeding a growing hunger for meat in the developing world) is actually increasing its emissions; and plenty of manufacturing isn’t making much of a shift to electric power. Unless we see a revolutionary breakthrough in battery and hydrogen technology, the trend is in the wrong direction.

Right. I’m off to a good start on the whole positive outlook thing.

The leap comes from the connection of global warming to the downfall of modern society. Fortunately, history does not support the claim that mass migration, resource wars, and civil unrest lead to destruction. Indeed, it is quite the opposite.


Let us look at past instances of these threats and their outcomes. Between 1880 and 1930, some 27 million immigrants flooded onto the shores of a (not-quite-unified) United States that was still rebuilding after a brutally bloody civil war. In many of these years, more than a million immigrants came here, numbers similar to what we see today, but these immigrants were being assimilated into a population about a quarter the size of the current US. While there are certainly differences between the migrations of then and now, there are more similarities. People pushed out of their homelands due to war, famine, and persecution have been moving for centuries, and the deluge in the US did anything but cause social collapse. Of course, there were tensions, gang violence was rampant, and there was a distinct period of transition, but it is impossible to argue that the United States did not, in the end, benefit. During this time, it fought a “splendid little war” that launched it onto the world stage, laying the foundation for what would become the most powerful nation in all of human history.

That process, of course, came through its violent twentieth century. The US emerged from World War I into the now infamous “Roaring Twenties”. By the sacrifice of the more than 100,000 men who fought and died in the fields of Europe, American society prospered. Indeed, it prospered a bit too much. When the financial system came crashing down in 1929, it was reasonable to believe that it might never recover. Indeed, the Great Depression gave ample justification for a rapid increase of government intervention into the American economy. Whatever you feel about this expansion of the federal government, you can’t deny that people at the time were demanding a different system. However, nothing solves a depression like a good world war. Europe was still in the process of recovery as well when the bullets started flying again. After the continent was yet again ravaged by war (and the US was somewhat reluctantly dragged in again), the world entered what may be the most intense period of global prosperity in human history. Western Europe rebuilt itself into idyllic social democracies, former colonies gained their independence and started to learn how to govern themselves (albeit painfully), and the United States entered a period of such strong economic growth that the best thing to do was procreate to make sure your kids could experience the endless prosperity. Even though international tensions were flaring up into prolonged overseas conflict, things at home were so steady that people began fighting for their rights so fiercely that civil rights progressed more rapidly toward equality than at any point since Reconstruction. Even though Eastern Europe and Asia fell under tyrannical Communist regimes, it was only a matter of time until they too began to make the transition to liberal democratic societies.

What was the effect of this global catastrophe we call World War II? Decades of prosperity and innovation, drastic declines in violent death, rapid liberalization of states around the world, and the near eradication of extreme poverty.

It appears that history does not justify my fear of mass migration, war, or civil unrest leading to societal collapse or global tyranny. However, history also does not paint a particularly rosy forecast. Though society will likely persist, it has done so through millions upon millions of violent deaths, barbaric atrocities, and endless injustices. We may not have utopia to look forward to, but it could be worse. Far worse.

And though I’ve still not convinced myself to like my country of residence, the argument I originally made still carries the day. Keeping the worst repercussions at bay will require the full participation of the United States, and our furious building of renewable energy requires all hands on deck.

This doesn’t solve all my personal problems, but it at least reinforces the necessity of doing all I can at work, and being at my best during business hours requires that I take care of myself otherwise.

I’ll dig more into what exactly that means later. Thanks for enduring my ramblings.

Corporations are people

Sitting crammed on the opposite side of my boss’s rickety desk, I stuffed letter after letter into envelopes that my colleague had addressed, as quickly as I could while being mindful of the most dreaded occupational hazard of the office worker: a papercut. The pile of tri-folded letters grew as my boss tossed another stack my direction. We had 86 letters to send, and they needed to get to the post office within the hour. With the equitable division of labor from the director to the office administrator, the task went quickly, making for a relievingly mindless job to finish out the day.


Each of these letters is an invitation that is now on its way to a landowner in Wyoming, where our company is working to secure land where someday dozens of enormous wind turbines will spin in the wind. This form invitation was drafted by some employee who knows how long ago, but its current iteration is only the latest of countless revisions. A young man who relocated here to the high plains of Colorado from the windy city of Chicago crafted the words to convey the humanity that flourishes behind the company logo in the letterhead.


His words will be read by Wyomingites who will make decisions about land likely held by their family for generations. These are the first words in a long conversation that they will have with this young developer as he walks them through the process of taking this hallowed ground on which ancient battles were once fought and generations of settlers farmed and herded to survive; together, they will transform it into a symbol of humanity’s precarious future.


In a few short years, this vast plain in the shadow of the Rocky Mountains will glimmer with the swirling white arms of dozens of wind turbines, stretching hundreds of feet into the wide open sky. Energy, which will be released from the nuclear fusion of atoms in the heart of the sun, will traverse the vast emptiness of space at the speed of light until it collides with the surface of the earth, where it will be absorbed as heat. This heat will warm the air, forcing it to expand, pushing other air out of the way, causing the great gales that those of the Great Plains have come to accept as a part of life. By turning the immense machines, the energy will be passed on to communities across the Mountain West, where it will illuminate homes, cook food, charge smartphones, and run the servers that power the internet. It will also run the billions of dollars worth of life-saving machinery in hospitals, power the computers through which essential knowledge will reach the minds of students, and charge the batteries of the rapidly expanding fleet of electric vehicles.


And in this complex, convoluted chain of events, one seemingly insignificant but indispensable step was the 15 minutes of my life spent stuffing envelopes in my boss’s office.


It is easy to see corporations as nebulous, senseless, inhuman entities that rule over our helpless pseudo-democracy, but it is much harder to see that these corporations are made up of people. Six weeks into my first proper corporate job, there have been many changes to adapt to and many new perspectives to see.


While it is true that some corporations have assumed disquieting power, directed by disquietingly few and unaccountable people, what these corporations do is undeniably the action of a group of people. Whether these groups create things great or small or have an impact that is global or local, they are just people, who go to work every day to labor on, guided by an invisible hand that makes this modern world possible. When you think of the lack of coordination, of direction, of control, you can’t help but see that it’s a miracle that this system functions at all.


Yet it does, and as I slowly come to terms with the fact that I will have my home base here for longer than I’ve been in one place in several years, I also come to terms with the fact that these modern contrivances, such as corporations and money, actually lead to real things, such as electricity. It helps to know that my efforts as a tiny cog in this massive machine are going toward a cause that may lead to a sustainable human society.


But even if I never see the total fruits of my labor, I have experienced, in that moment of stuffing envelopes, that even in large corporations, the words that come to customers have been crafted by, and even physically touched by, human hands.

Discipline is freedom

This is not a prescription for how to live your life. This is not an attempt to explain some human truth. It’s not an argument for some political aim. It’s not even an endorsement of Jocko Willink’s new book, Discipline Equals Freedom, whence I took the title.

It is none of these things because I am not qualified to pontificate on such subjects. I’m not sure anyone is, but what I know of the world and the people who inhabit it is so superficial and incomplete that such pronouncements would necessarily be only the weakest stitching together of the findings of professionals far more learned than me.

It is an exploration of a concept I have been finding to be true, and the successful (so far) implementation of this concept in my life.

Those of you who have been following this blog will have noticed that I have recently taken quite a liking to the lectures of Dr. Jordan Peterson, professor of psychology at the University of Toronto. He often expounds upon the methods that he has learned through his own life and through his practice as a clinical psychologist to improve one’s wellbeing.

At the core of these methods is one simple idea: stop doing things that make you weak.

These things include lying, pursuing instant gratification, comparing yourself to others, assuming that you have nothing to learn from someone, and even slouching.

These things are easy. They are often very natural. We lie to protect our reputation, a survival skill that was essential when the best way to know about a person was from tribe gossip. We chase what is expedient because it feels good to get that hit of dopamine when we eat a sweet treat or find that thing we’ve been shopping for. We measure ourselves against everyone around us because it’s how we determine our place in the hierarchy, a crucial process for the smooth functioning of groups of apes like ourselves. We don’t listen to others because it would make us challenge what we already think we know, and we don’t sit up straight because it takes effort.

Actually, overcoming all of those things takes effort. It’s hard to fight biology. It’s not pleasant. But it’s exactly what makes us different from the rest of the animal kingdom. We can fight our desires and emotions in order to pursue things that will give us greater satisfaction later.

And it’s exactly what Jordan Peterson and Jocko Willink are tapping into with their life rules. You can call these “rules for success”, but really they’re just rules for happiness. Contrary to what my nose tells me when I smell baking cookies or what my tired brain is telling me when I want to waste the day watching YouTube, giving in to urges does not lead to satisfaction.

I’ve had to deal with this recently because elevated levels of anxiety over being unemployed, starting a new job, living at home, and generally feeling directionless have led to some bad habits. As my diet and fitness routine fell apart, I not only lost the physique I had been so proud of but also started to see that lack of discipline creep into other parts of my life, making me more irritable, more stressed, and less happy.

I had a series of false starts (probably cataloged in this blog), but a few things have conspired to put me on what may be a more permanently upward path. The most direct was discovering the motivation of former Navy SEAL commander Jocko Willink. After his discussion of his new book, delivered as a stand-in host of the Tim Ferriss Show, pushed me through a morning workout, I set this track as my alarm. I’ve popped out of bed at 5:00 am for the longest streak I think I’ve maintained since I returned home.

With LCDR Willink’s stereotypically terrifying warrior voice and the enlightening teachings of Jordan Peterson, I have resolved to start taking steps. First off, I have committed to getting my diet back in order. It’s not where I would like it to be, but I’m moving in the right direction. Thinking about cutting sugar and alcohol from my diet (at least 6 days a week) immediately sparked anxiety, an unjustifiable fear of missing out. I have had that feeling since I was in Poland: that I need to take advantage of things while I can because I will never have the opportunity again. In such a great brew town as Fort Collins, refraining from beer most of the week is actually a bit of a daunting challenge. But my secret weapon was my accountabilibuddy. Even with her at a distance, my requirement to report on my adherence to my two rules (1. No distracted eating, 2. No sugar/alcohol) each day has kept me honest.

And even though I’ve been to two breweries since I started, I have found that ordering a glass of water instead of the expected beer has not increased that feeling of missing out but dampened it. By standing by my principles (or at least my diet rules), I strengthened myself, and it felt good. Like waking up at 5:00 am to start the day with exercise, it’s a small victory.

It’s not a victory over anyone else, but over myself. At least, over the self that I don’t enjoy being, over who I was yesterday. It’s my way of putting into practice Jordan Peterson’s fourth rule: “Compare yourself to who you were yesterday.”

These small acts of discipline prove to me that I can work toward and achieve things that are hard, things that I know will bring me greater satisfaction. It liberates me from my base urges and empowers me to drive my own life.

It has proven to me that discipline is freedom.

What could I have done?

I stood there in disbelief, munching a spoonful of black beans as I leaned against the railing outside the cafe. The gentleman had kindly dropped off a colleague and was now slowly backing his small sedan out of the diagonal street parking. Fully turned around, staring out the back windshield, he eased backward ever so slowly. Those parking spots are notoriously dangerous to get out of because it’s almost impossible to see oncoming traffic. Fortunately, traffic had stopped at the light. Unfortunately, the line had backed up, and the red SUV was stopped directly behind him. Really, directly behind him. There was no way that he didn’t see her. Then the SUV shuddered, and a little burst of dopamine shot through my brain as the thunk of the two vehicles colliding satisfied those moments of expectation.

Don’t get me wrong, I felt bad for the guy. Apparently, he was off the hook; it looked like the woman told him not to worry about it even though there was noticeable damage to her rear passenger door. But I know he still felt like an idiot. A few hours later now, his nerves are probably still on edge, and he’ll be beating himself up for days.

And I just stood there like a moviegoer, munching my snacks and enjoying the show. But what could I have done? Shouted? Run out toward him? Thrown my bowl of beans on his windshield to get his attention?

Perhaps. The whole incident took probably 30 seconds, and the car was maybe ten steps away. But I didn’t act because I didn’t know what to do. At the time, it appeared as if he was in control, as if he knew what was behind him. At the time, stopping a collision wasn’t even a thought because it wasn’t going to happen.

But it did. And in the moments after, as I watched the man curse himself back to his car, I thought about all the things I could have done, replaying the moment, trying to figure out when it would have been obvious that he was oblivious to the danger behind him.

I’m still not sure if I would be able to spot a similar situation or if I would figure out quickly enough what to do. We almost never have time to figure out the best course of action in a moment that requires action, but we have a fantastic tool in our mental kit: imagination.

We humans have the wonderful (and terrible) ability to imagine things that don’t exist and situations that haven’t happened yet. We can perform thought experiments to figure out “What would happen if …” long before any such situation arises and while we have more than enough time to figure out how to react.

How to stop a fender bender probably isn’t the most exciting or useful application of this ability, but the process is the same for any situation. Whether we are imagining two vehicles on a collision course or we are imagining a threat from a bloodthirsty attacker, it’s the same imagination, the same consideration of what could be.

But if we don’t take the time and make the effort, we will almost certainly be frozen in disbelief if the situation arises, unable to act because we don’t know what to do.

All men are not created equal

When Thomas Jefferson penned (and Benjamin Franklin revised) the now famous words of the Declaration of Independence in early 1776, he had something very specific in mind. The second half of that opening statement is not an addition to the first but a clarification.

We hold these truths to be self-evident, that all men are created equal, that they are endowed by their creator with certain unalienable rights, that among these are life, liberty, and the pursuit of happiness.

Thomas Jefferson, as well as many of the founding fathers, understood that men (a word that we may now understand as “people”) are equal only in one respect: their natural rights. All people have the rights not to be oppressed by their government, not to be molested in their own communities, not to lose their property to thieves. A government ought to protect these rights and few others. All people ought to be entitled to these rights, and infractions against them ought to be brought to light, prosecuted, and protested.

However, this is just about the only way in which people actually are equal. In our highly individualistic society, we ought to be happy for this fact. Many of us are eager to prove our uniqueness in a world that often demands conformity. Yet when the conversation turns to such differences as intelligence, somehow we have crossed an invisible line. Why this is, I’m not entirely certain. I believe that it has to do with the idea, which has developed especially strongly in the United States, that anyone can do anything they want in life. While this hopeful idealism can be empowering in some sense, we must understand its limitations. Idealism tempered with realism has been the hallmark of the American identity since before the United States was even a coherent political entity.

Idealism lets a poor girl growing up in a broken home believe that she can rise above her situation and bring children into a better home in the future. Realism tells her that it’s going to take many years of hard work, constant failure, and a long series of moments in which she is almost convinced that she will fail. Idealism reassures an aging entrepreneur who has lost everything that it is never too late to start again. Realism ensures that he knows he probably does not have the energy he used to and that he’ll have to leverage different strengths than he did the first time.

Idealism also tells us collectively that we can build a better society with greater opportunities for the current generations and for future ones. Realism helps us understand that each person must find a role in that society in which they are useful, they feel useful, and they provide something that others appreciate. That role depends heavily on what each person’s brain is capable of, especially in today’s technological society in which professions like law, medicine, and engineering require certain cognitive ability. This does not mean, however, that there is not a role for everyone. No matter what role one chooses – whether it be janitor or CEO – one can live a joyful and fulfilling life if they are recognized and appreciated, if their sudden disappearance would be missed.

This is the point of Dr. Richard Herrnstein and Charles Murray’s scientific exploration of cognitive distributions (and proverbial lightning rod), The Bell Curve. The 800-page, exhaustively cited tome is, in fact, a bit too daunting for me to have acquired a copy (and it is conspicuously absent from the public library). But I did find an abridged audiobook, narrated by Murray himself.

If you’ve heard anything about the book but have not read/listened to any of it or read/listened to any of Murray’s other work, you probably have a misconception of the book’s core message.

Herrnstein and Murray stress again and again, from start to finish, a few important points:

  1. Simply knowing the IQ of an individual tells you almost nothing about the person, but knowing the average IQ of groups can lead to insightful statistical analyses.
  2. Though there are significant differences between the averages of racial groups (races being defined by how the participants self-identify), there is so much overlap that the only appropriate way to deal with anyone of any genetic background is as an individual.
  3. The refusal to engage in conversation on the subject has already led to counterproductive public policy and will likely lead to increased extremism on both sides: those insisting that all people have equivalent cognitive ability (wrong) and those insisting that genetic differences are proof of a racial supremacy (also wrong).

The effect that the increasing value of intelligence has had on American society over the past few decades has been significant, and to ignore it is to walk blindly into a trap that will almost certainly tear our country apart even more than it already has. To what extent intelligence determines one’s opportunities, and how much we can actually influence the cognitive ability of the next generation, are questions not yet fully answered. But they never will be if we don’t have the conversation.

All men are not created equal. Once we accept this and have a grown-up conversation about its implications, we can get to work building a better society for everyone, regardless of their IQ.

A conversation about conversations

Aristotle once wrote that “it is the mark of an educated mind to be able to entertain an idea without accepting it.” Of course, you must also remember that Albert Einstein said that “only two things are infinite: the universe and human stupidity.” If you are struggling to reconcile the words of the two great thinkers, just remember the wise words of Abraham Lincoln: “The problem with quotes found on the internet is that they are often not true.”

And yes, if you are paying attention, you’ll have recognized that none of those quotes actually originated with the attributed author. Their exact origins seem to have been lost in the depths of the interwebs, but I’ve shared them because they make a useful point.

The first quote, which seems to be a transmogrification of a quote from Aristotle’s Nicomachean Ethics, has been passed around for years and probably uttered millions of times. I posit that this popularity is not a result of Einstein’s infinite stupidity but a result of the fact that whoever wrote it made a rather astute observation.

Because the source is unknown and therefore the exact original indeterminable, let’s not split hairs about the definitions of each word.

The way I understand it is that one who has been properly educated has the ability to consider and understand ideas with which they may disagree. At least, I hope that this is a matter of education and not of intelligence, which Dr. Richard J. Herrnstein and Charles Murray have conclusively shown does not change much after about the age of six.

Let’s consider for a moment the implications of this assertion. If you are able to understand a contradictory idea, you are able to understand the ideas of others, you can take seriously thoughts that have originated outside your own mind, and you can consider objectively whether or not you ought to incorporate these ideas into your own belief system. You are able to take the thoughts of others not only to regurgitate them in a manner that passes as being well-read but actually to form new and better ideas.

And isn’t this the whole point of an education? To produce minds that will generate new and better ideas that will continue to improve society? To give people the tools to make better lives for themselves, their families, and their communities? As Dr. Jordan Peterson argues, such an education strengthens the mind and helps it avoid such illogical traps as Marxism and neoliberalism.

Despite President Lincoln’s warning, we need not be immediately dismissive of even the most spurious quotes we find on the web. This anonymous author has quite a good point.

If you disagree with anything I’ve written thus far, you are certainly free to dismiss me entirely, vilify me for being an elitist pig, or simply click away in boredom. But what will you have gained? You will leave this page with the same preconceived notions that you had, no better or worse, no more fully formed, no more deeply resolute, no more able to give you strength in times of ethical quandary.

Perhaps I am wrong, and perhaps the internets should have left Aristotle’s Nicomachean Ethics well enough alone. But the validity of my analysis is irrelevant. What matters is whether or not I’ve had a positive effect on the conversation because it is through conversation that we determine what is true, what is good, what is right, and what is useful. If we avoid such conversations, prevent ideas from entering the conversation, or keep ourselves blind to perspectives potentially useful to the conversation, we are all worse for it.

I’m a bit saddened that we actually need to have this conversation, but sometimes we need to refresh the conversation about conversations. With conversation itself now meeting violent rebuke, we need it more than we have in a long time.

Next time, I’ll dig more into the ideas of the controversial writers whose work I slipped into my arguments above and address the anti-speech movement more directly.


The freedom to speak and to listen

Earlier this year, students at Middlebury College in Vermont verbally and physically attacked author Charles Murray and his hosts after what was supposed to be a speech sponsored by a conservative student group. Murray, who has faced this kind of response since he co-authored an extremely controversial book 25 years ago, was forced to give the speech from a sound booth in another part of campus because the protestors filled the lecture hall with such a ruckus. The book I’m referring to is The Bell Curve, which explores the measurement and value of IQ, the separation of classes along lines of intelligence, and the implications for American society.

I would love to go more into the book itself and the misinformed reactions to it, but for this post, I’m only using it to lead into the larger issue at hand, which has been brewing across the US and in Canada. Although I’ve yet to determine precisely how widespread this anti-speech phenomenon actually is, several cases have grabbed headlines, such as the incident at Middlebury and multiple incidents at UC Berkeley.

The protesting students at Middlebury argue that they were fighting against the spread of ideas like eugenics. They argue that their protest was not an attack on free speech but simply a refusal to give “odious” ideas a platform.

While this could easily devolve into an ad hominem rant, I will refrain. What can be said, though, is that the movement to suppress controversial ideas is coming almost entirely from what people like Sam Harris like to call the “regressive left”. Members of the movement often argue that to give a platform to speakers like Charles Murray or Milo Yiannopoulos (even though those two people have little in common except their reviled status among university students) would be to promote ideas that are abhorrent to modern liberal values.

Canada has been drawn into this mostly by University of Toronto professor of psychology, Jordan Peterson. He pushed back against a university initiative that gave seemingly excessive permission to students to challenge professors for not being precisely respectful to transgender students. Dr. Peterson has since become outspoken about the post-modernist roots of the anti-speech movement. In his synopsis, it comes down to the idea that dialogue cannot lead to understanding because each person’s experience is too different for people truly to understand each other. Therefore, people of different opinions must be silenced (or at least not given a platform, which, if taken to its logical conclusion, is silencing them).

I haven’t spent enough time with anyone who actually holds these views to understand if Dr. Peterson’s interpretation is accurate, but it seems to capture the sentiment of anti-speech mobs and postmodernist campus groups that fight “dangerous” or potentially harmful speech.

Whether or not you agree with Dr. Peterson isn’t the point. I disagree with a lot of what he says, and I think he has many underlying biases that cloud his reasoning. However, I love listening to him because he has interesting (if controversial) ideas. If you want to attack those ideas, please do. Actually, just send me a message or tweet at me (@geoffreydesena) because I would love to have the conversation.

And that’s the difference between the resistance from those like Dr. Peterson and the resistance from anti-speech groups. He is not fighting the person, the speaker, or even the speech. He is fighting the idea because that’s how the best ideas come to the fore. When we pit the strongest form of two ideas against each other, we do not get destruction; we get even better ideas.

That was a bit of a mess, but it will serve as a primer for some more coherent thoughts on The Bell Curve, Jordan Peterson, and controversial ideas.