Category Archives: essays

Diaries, Journals, Commonplace Books and Notebooks

What is the difference between a diary and journal? I can’t find much of a difference in how the terms are used. They seem interchangeable, but that only means that somewhere on the Internet, a big flame war exists over the subtle differences between these terms.

According to Merriam-Webster, a diary is “a record of events, transactions, or observations kept daily or at frequent intervals.” After that it says “: JOURNAL”. I had to look up what that meant in the Explanatory Chart. It is a synonymous reference, which is Merriam-Webster’s way of saying that diaries and journals are the same thing.

Merriam-Webster says a journal is “a record of experiences, ideas, or reflections kept regularly for private use.” I noted that there was no synonymous reference back to DIARY. Both, it seems, are a record of events and experiences. The definition of “diary” refers to transactions, which is sort of odd. I think of journal (specifically, a double-entry book-keeping journal) as more transactional than a diary.

I think I use the terms interchangeably, although I say that I journal (verb) more than I say, “write in my diary.”

Journals/diaries don’t seem as popular as they once were. At least, from my reading, it seems that people used to keep diaries more than they do now. There are, of course, famous diaries, like those of John Adams, and John Quincy Adams, or Anne Frank. At one point last year I began reading the diary of Samuel Pepys. I suspect there are three reasons I don’t see as many people admitting to having diaries as they once did:

  1. There are no courses in keeping a diary. Certainly, I never learned how or why to do this in my schooling, a lapse that I am both grateful for and lament.
  2. Time is occupied by other activities. John Quincy Adams, even at his busiest, did not have social media, movies, and television competing for his attention.
  3. Litigation. People worry that what they write can be subpoenaed and so they don’t record anything.

It is something of a shame, really. All of those historical diaries sitting in various collections contain valuable data about everyday life across all walks of life. It seems like there is useful research information in that aggregate data.

For many years, I used red Standard Diaries, keyed to the current year. These were convenient for their ready-made pages, but limiting in that there was only one relatively small page per day. If I wanted to write more, I felt constrained. If I didn’t fill a page, I felt it a waste. Now I use large Moleskine Art Collection Sketchbooks, which have big blank pages that I can use however I see fit.

Assuming that diary and journal are interchangeable, there are two other written records that confuse me from time to time. There is the notebook, which Merriam-Webster defines as “a book for notes or memoranda.” When I think of a notebook, I think of the notebooks of Leonardo da Vinci. The line blurs, it seems to me. Certainly some of his notes were memoranda, some were notes, some were reflections, some designs. Were these not really just “working” journals?

Lab books are another type of notebook. Lab books are supposed to be a scientist’s notes for their experiments and discoveries. They showed progress, evolution of thought and ideas, and ultimately provided a recipe for others to reproduce their results. That is how my “notes” are today, although they are digital rather than in notebook form. But that is not how I was taught to keep a lab book in college. In college, the implication in my chemistry and physics classes was that you had two lab books: one for your raw notes, the other a “cleaned up” version that you turned in for grades. I could never afford two, so I always turned in my messy, raw notes.

A commonplace book is perhaps the most interesting of these forms of recording, and yet Merriam-Webster gives it the shortest shrift: “A book of memorabilia.” I first learned about commonplace books reading a biography of Thomas Jefferson. Back in his time, a commonplace book was a kind of learning tool. He recorded passages from his readings in the book, along with his own notes. It seems like another valuable learning tool that I was never taught in any of my formal schooling. You don’t hear much about commonplace books these days, although there was recently an article about digital commonplace books in the New York Times.

Today, instead of diaries and journals and commonplace books, we have blogs and Twitter and Facebook. And yet I keep thinking about something Walter Isaacson wrote in his biography of Leonardo da Vinci:

His mind, I think, is best revealed in the more than 7,200 pages of his notes and scribbles that, miraculously, survive to this day. Paper turns out to be a superb information-storage technology, still readable after five hundred years, which our own tweets likely won’t be.

If I’d had a commonplace book, I might have copied this passage into it, instead of just highlighting it in the book.

The Pre-Registration Farce

Of the many ridiculous marketing schemes foisted upon us, the concept of “pre-registration” has to be among the most redundant. This occurred to me after I received a weekly notification from my county reminding me that I am, in fact, pre-registered for my COVID vaccine, and that I would be notified with instructions once registration opens.

The county is sending out these weekly reminders because they are being flooded by questions from people asking if they are, in fact, pre-registered for the vaccine. But it got me thinking about what it means to be pre-registered. You can often pre-register for conferences at “early-bird” specials. Aside from that, there doesn’t seem to be much of a difference in the overall process. The beauty of the marketing hides the plain truth: you registered. You have completed the first step of the process, and you will be notified when it is okay to complete the second step.

That is the crux of it: pre-registration makes it sound like you are doing something before the actual thing you are doing. But really, pre-registration is just the first step in the registration process. I convinced myself of this by drawing it out on a timeline.

Clearly, the first step in the process is not registering for the vaccine. The first step is pre-registering. But if you take those first two steps as a whole, you don’t register first, then pre-register. Those two steps could be simply called “registration” with, perhaps, small print indicating that registration will be a two-step process.

I suppose someone might argue that pre-registration gauges demand, but isn’t the assumption, given that everyone has been cooped up for a year, that the demand will be high? It seems, therefore, that the real value of the pre-registration step is make-work: it’s there to make people feel like they are moving through the process. I’ve gotten a start. I’ve taken that first step. I’ve pre-registered. Of course, we could have skipped the whole pre-registration farce, and registered with equal satisfaction that I am on my way.

We seem to like our pre’s. We have pre-boarding when getting on the airplane. This verges on oxymoron. How can you board before you board? And yet, that is what happens for some folks: those with lots of frequent flier miles, active military members, parents traveling with small children, and those who need extra time making their way to the airplane. All of them get to pre-board. They board before they board.

Sometimes, there is the opportunity to get a sneak preview of something. We get to see it before everyone else. Software companies are notorious for their pre-releases. In the book world, things are more forthright: “preview” copies of a book are called “advance review copies”, or ARCs for short. No pre- in the publishing world.

You know that something is ridiculous and meaningless when the efficiency of the business world excludes it. I don’t find myself attending “pre-meetings” or doing any “pre-code reviews.” If someone says, “Why don’t you pre-print these slides for the meeting,” it’s time to have a serious conversation with that person.

Call it what you will, I am at present (pre-) registered for a COVID vaccine in my county. I guess I should be thankful for that.

And in complete fairness, and the spirit of full disclosure (I almost wrote pre-disclosure), I should mention for the record that I pre-published (read: scheduled) this post last night right after pre-writing (read: drafting) it and right before the pre-reading (magazine article) I planned to do before the real reading (Paul Theroux book) commenced.

Brain Drain

Portrait of a weary writer.

I don’t know about you, but as I get older my brain seems to burn out more quickly than it used to. Or, maybe it’s not age, but stretching my ability to comprehend the stuff that I work on. I’ve been working on developing a fairly complex software system in the day job, and so far, it is one of the things I am most proud of in the 26+ years I’ve been working there. But it taxes my brain like nothing else.

Take yesterday, for instance. I started the workday reviewing a list of issues that came up the previous day and that I intended to fix before my first meeting of the day. I was committed to getting it done then, because that meeting was the first of seven I had throughout the day, including one stretch of five hours without a break.

I was hacking my way through elegant (yet convoluted) lines of code, like an explorer making his way through a jungle. I was just getting a sense of how to fix a particular problem when I was distracted by a completely unrelated problem on another project. I had to mentally switch languages, and then dive into that problem. I’d made it about halfway to the solution before I had to give up. It was meeting time!

Over the years I’ve gotten careful about meetings. I don’t like wasting other people’s time, and the meetings that I had scheduled today were necessary and very productive. I will say this, however: it is never a good idea to schedule seven technical meetings on the same day. I do this by accident from time to time. On Monday, for instance, I realized that we needed some design work and began scheduling meetings, looking for time on people’s calendars, and noticed that my Thursday was getting booked up. No worries, I told myself, It’s not until Thursday. You don’t have to think about it until then. After the first two meetings today I began to wonder, not for the first time, how I ever thought all these meetings were a good idea.

(I am reminded of similar experiences in college, pulling all-nighters. An all-nighter is appealing, and even exciting right up until about 3:30 or 4:00 am, at which point, the very idea of an all-nighter is appalling.)

These technical meetings take a lot out of me. I don’t know about you, but it is not easy for me to hold all of the technical connections in my head and see how they fit together. I can do it, but I feel like I’ve been through some kind of mental marathon afterward. I want to lie down and not think at all.

I made tacos for dinner, and have no real memory of making them. When dinner was over and everyone raced from the table, I looked at the dishes piled up by the sink and considered, briefly, just leaving them there. All I wanted to do was get in bed and read.

Once everything was finished, and I got this post written, I knew I could finally get in bed and read–only my brain felt so worn out, I didn’t know if I’d comprehend what I was reading.

I don’t have as many meetings today as I did yesterday–a mere four compared to yesterday’s seven. But four is a lot for a Friday. And these are all technical meetings. And next week brings the rehearsals for our second milestone demo, which I am giving in a week.

At least this project should wrap up in April. On the other hand, there are other projects queued up that I’ve been asked to work on. I’d say I could look forward to a vacation, but with the pandemic we haven’t planned any vacations for this year yet.

You want a good description of just how mushy my brain is right about now? This is the seventh attempt at a closing paragraph for this essay, one that tries to tie things neatly together in a funny or amusing way, as is my wont for pieces like these. None of the six previous attempts worked. Four of them didn’t even make any sense.

To Snap or Not to Snap?

There are 25,766 photos in my digital photo library. On seeing that number recently, I longed for the days of film cameras with 24- and 36-exposure rolls of film. Back then, if you wanted to take a picture, you needed to make sure it was worth one of those 24 exposures. Now, it seems, I often have dozens of shots of the same scene, each taken milliseconds after the one before it, so similar that only a computer could detect the differences. And why bother deleting them? It’s not like they are taking up space in an album somewhere, accumulating dust and wrinkling at the corners.

For the last several years, I have deliberately cut back on the pictures I take. Having a digital camera (as part of my phone) everywhere I go was a novelty at first, but I realized after the first ten or fifteen thousand photos that I could either be an observer or a participant. Others may balance these roles better than I can. I found when I took photos, I didn’t feel like a participant in the event. I didn’t remember the event as well. I wasn’t present in the moment, or to use the phrase of the day, I wasn’t mindful.

Back when photos came in batches of 24 or 36, it was relatively easy to curate them. It took no time to flip through 24 photos and pick out the ones that were worth keeping, or sticking in a photo album. With nearly 26,000 photos that is nearly impossible. I did the math–that number averages out to about five photos a day, every day, for the last 13 years.

The cost of the film alone would have made me much more selective about my photos. I checked and a 4-pack of 24-exposure film costs about $35. That’s about $0.36 per photo. It costs around $5 to develop a roll of that film, so call it another $0.21/photo. That’s $0.57 per photo. Looking at the 25,766 photos in my library, it would have cost me around $14,700 if those photos had been on film instead of digital. There’s no way I would have taken even a tenth of those photos if it cost me that much money.
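The arithmetic is easy to check. Here is the same back-of-envelope calculation as a short Python sketch; the prices and library size are the ones quoted above, and the rounding is mine:

```python
# Back-of-envelope cost of shooting 25,766 photos on film,
# using the prices quoted above.
PACK_PRICE = 35.00            # 4-pack of 24-exposure rolls
EXPOSURES_PER_PACK = 4 * 24   # exposures in one pack
DEVELOPING_PER_ROLL = 5.00    # developing one 24-exposure roll

film_per_photo = PACK_PRICE / EXPOSURES_PER_PACK        # ~$0.36
developing_per_photo = DEVELOPING_PER_ROLL / 24         # ~$0.21
per_photo = film_per_photo + developing_per_photo       # ~$0.57

library = 25_766
print(f"${per_photo:.2f} per photo, ${library * per_photo:,.0f} total")
```

Run it and the total lands a little under $15,000, which is the point: the cost of film imposed a discipline that digital storage never will.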

Also, I’m not a big photo browser. I generally don’t go back and look at old photos unprompted. I’ll do it if the kids ask me to show them something, or if a specific need comes up, or if I am reminded of something. But with so many photos in such a disorganized jumble of bits and bytes, it can be hard to find the specific photo I am looking for.

Thinking about all of this, I think I made the right decision to cut back on how many photos I take. I’d rather enjoy the moment and remember it (and maybe write about it later) than have a hundred photos that I won’t likely look at again. Also, there are usually half a dozen other people taking the same photo, so it’s around somewhere if I want it.

Whether ’tis nobler in the mind to suffer the snaps and flashes of outrageous snapchatters, or take pen in hand and scribble what I remember and enjoy the moment, I think I’ll enjoy the moment.

How to Learn to Write Code in 37 Short Years, Part 1: Hello World!

Recently, I passed a personal milestone. It is an arbitrary milestone, one for which I am the sole judge, but it is one that has been 37 years in the making. I have been a professional developer (coder) for about 27 years now. In that time, I’ve generally had no problem considering myself a professional. No impostor syndrome there. But there is one thing I have refrained from calling myself: an expert.

Until now, that is. Now, after 37 years of learning how to write code, I think I do it at a level which could be considered expert. Perhaps even by someone other than myself.

“Developer” sounds a little phony to me. I’ve read about people who are developers (land developers) and I’ve never really been clear on what that entails. I prefer referring to myself as a (professional–now expert) “coder.” It’s less formal, but despite its reputation, I’ve always felt there is something informal about coding. I’ve been wary of the term “expert.” It seems to me that it is overused to such an extent as to water down the meaning. I’ve seen all kinds of books about becoming an “expert” this or that in 30 days. Maybe I’m just bitter at being a slow learner, but it took me 37 years before I finally considered myself an “expert.”

One thing I don’t consider myself is a software engineer. An engineer has a specific meaning in my mind, and entails a certain kind of formal education in software development that I lack. I am not an engineer. I am immediately suspicious when I see “software engineer” on a resume when I don’t see an engineering degree along with it. I’m much less suspicious if I see “coder.”

I tend to be a slow learner, perhaps because I dive in head-first and try everything at once. I’ve written about how it took me 14 years of writing and submitting stories, collecting more than a hundred rejection slips before selling my first story. It took me 11 years of teaching myself to write code, before I landed my first (and so far, only) professional gig. In that regard, it only took another quarter century or so before I felt I could call myself an expert.

So how does one learn to write code in 37 short years? For me, it began with hangman, and WarGames.

The first computer I ever saw was a Commodore VIC-20. I saw it in my 5th grade math class sometime in the late winter or early spring of 1983. There are exactly 3 things that I remember from that math class. First, our teacher was missing part of a finger. Second, one of our lessons was learning to read the stock pages in The Providence Journal. Third, was the Commodore VIC-20.

There was no math associated with our introduction to that computer. I remember it was wheeled into the classroom, connected to a television set. We spent the class using the VIC-20 to play Hangman. I didn’t see a line of code during that introduction, but I was intrigued by what I saw.

The summer of 1983 was my last on the east coast before I moved with my family to Los Angeles. It was the summer that WarGames with Matthew Broderick and Ally Sheedy made its debut. I saw the movie in New York with my cousins. I don’t recall the movie making much of an impression one way or another at the time. What I remember most about that day was going back to my cousin’s house after the movie, and being introduced to his Timex Sinclair 1000. It was the first computer I ever laid my own hands on.

My cousin turned on the computer and showed me how to write a simple program in BASIC. The program was:

10 PRINT "HELLO"
20 GOTO 10

The program doesn’t do much, but something in my brain clicked. It was like I understood the concept of programming in that instant. With a finite (even small) set of instructions, and some basic logic, you could make the computer do all kinds of things.

That evening, we made the computer break into a top secret installation. We didn’t have a modem or any kind of connection to the outside world. But using our memories of WarGames, and my quick absorption of BASIC, we wrote a program that made it seem like we were hacking into some secret computer system. I can’t remember what the program looked like, but it was probably something like this:

20 PRINT "Enter your password:"
30 INPUT x
40 IF x = "password" THEN GOTO 50 ELSE GOTO 60
50 PRINT "Welcome to Global Thermonuclear War"
60 PRINT "Wrong password"

Yeah, we didn’t have the logic quite right, but the idea that through some simple instructions you could make the computer do all kinds of things was a revelation for me.
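For the record, the bug is a fall-through: a correct password prints the welcome line and then keeps right on going into the “Wrong password” line. Sketched in modern Python rather than BASIC (the password and messages are the imagined ones from above), the fixed logic looks like this:

```python
# The same fake login, with exclusive success/failure branches
# instead of the BASIC fall-through.
def check_password(attempt: str) -> str:
    """Return the system's response to one password attempt."""
    if attempt == "password":
        return "Welcome to Global Thermonuclear War"
    return "Wrong password"

print(check_password("password"))  # Welcome to Global Thermonuclear War
print(check_password("joshua"))    # Wrong password
```

In the BASIC original, the fix would have been a GOTO after the welcome line to jump past the failure message.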

At the end of that summer, we said goodbye to the east coast, and hello to Los Angeles. Any move is tough on an 11-year old, but moving across the country, away from all of your friends is particularly tough. But I had something cooking in my mind that I was looking forward to. I was going to figure out a way to get my own computer. I didn’t particularly like the keyboard on the Timex computer. What I had in mind was the VIC-20 I’d seen in my 5th grade math class. I thought about it so often that I’d dream about it. I remember a couple of occasions that I dreamed I’d gotten a VIC-20. I was so excited! Then I’d wake up, uncertain at first whether it had been a dream, and then crestfallen to realize that it had.

Until one day, I had one! My very own Commodore VIC-20. And it came with a tape drive! My real coding experience was about to begin…

A Weekend Traveling the World

I spent the weekend traveling the world, an event I had been training for my entire life. That training was inspired by–although I didn’t know it at the time–a talk with my mom when I was 5 or 6 years old, about the value of books. “Books can take you anywhere,” I remember her telling me. I seemed always to interpret things she told me literally, so there I was, a youngster just beginning to read, discovering just how a book could take me places.

I quickly began to develop my imagination, realizing that this was the boarding pass required to turn pages of words into experiences. I drew a lot, I read more and more, I began to write my own stories. The earliest story I remember writing was for a social studies project in 3rd grade. Around that time I grew interested in airplanes and flying. I had no access to planes, but I did have access to The Student Pilot’s Flight Manual, and from that, I learned to draw control panels and would use those drawings to pretend I was flying a plane here and there.

The more I wrote, the more I read, the more my imagination improved. It was a painfully slow process day-to-day, but exercising it as I did, year in and year out, seemed to hone my imagination into something I had more and more control over. I wrote more stories, I began submitting them, and eventually, even began to sell them. I greatly expanded the focus of my reading–from what was initially mostly science and science fiction to everything and anything that could interest me. I’ve often thought it interesting that, when reading an essay about quantum mechanics, I visualize what is being described as if I could actually see it. When reading about the death of a star by supernova, I am there, hovering at the outskirts of that unfortunate solar system to witness the event.

Stories pull me in, and the world melts away. It is a wonderful talent to have, although it has its darker side. I often envision what-if scenarios, and that same imagination makes them often feel too real for comfort.

We like getting out as a family. We like road trips, both long and short, and in years past, our weekends would often be full of exploring nearby places (sometimes to the point where I needed a weekend off, just to relax). We’d drive down to Florida a few times a year, stopping at places along the way. We’d drive up to Maine in the summers doing the same. For a year now, we’ve been mostly stuck at home like everyone else, and the need to get out has been growing, even in me, someone perfectly content to stay in. It is an irony we are all currently experiencing that I am desperate to travel and cannot.

Which is how I came to Paul Theroux’s The Great Railway Bazaar on Saturday morning. I enjoy travel books, but hadn’t read anything by Theroux, and so first thing Saturday, after building a fire in the fireplace, I sat with the book and traveled (mostly by train) from London through the Middle East, and into India, and then up to Japan, and across the Trans-Siberian Railway, arriving, early Sunday morning, back in London.

This was the event that I had been training for all these years since my mom first put the idea in my head that books could take me anywhere. I went back in time and across large swaths of the world in a little over a day, sitting on my couch, in front of a fire, with temperatures in the teens outside. It didn’t feel like reading. It felt like traveling, it felt like I was there. I could see it, smell it, taste it, hear it. It was wonderful.

I finished The Great Railway Bazaar this morning, and decided I needed more, so now I am making my way through Theroux’s three collections of essays (starting with the most recent one). The weekend may be coming to an end, but my travels, it seems, are just beginning.

30 Years of L.A. Story

Steve Martin’s L.A. Story is one of my favorite movies. I thought it first debuted 30 years ago this summer, but it turns out, it was first released on February 8, 1991, so it is now just over 30 years old.

I saw the movie for the first time with my brother and distinctly recall the advertising for the movie as “the first great comedy of the 1990s.” I loved it. Aside from its Shakespearean overtones, it caricatured Los Angeles in a way that seemed to perfectly capture all that the city was about in the early 90s. At the time I first saw the movie, I’d been living in L.A. for about 8 years, with another 11 years to go, and the film was something I could recognize about the place where I lived.

L.A. Story became the first video I repeatedly rented in college. My roommates and I would watch the movie over and over again until we had every line of the film memorized (I can still remember most of the lines today). Enya’s music from the film is part of the Littlest Miss’s and my nap playlist. I am often reminded of the street art that appears in the film when I see photos of Santa Monica street art posted on Twitter by my great-great-great-grandboss.

Even though L.A. didn’t seem so to me at the time, L.A. Story captured an idealized version of L.A. for me, one that I look back on fondly–something I never imagined I’d do while living there. I watched the movie for the first time in a while last summer and it was just as good as I remembered it being. It is one of those movies that doesn’t lose its luster as it ages.

When I first saw the film, I was nearly 19 years old. Thirty years later, as I sat down to write this post, a strange thing occurred to me. I had to look it up to confirm it, but confirm it I did. I am today, nearly 4 years older than Steve Martin was when the film came out. Even so, my hair isn’t quite as white as his was (except maybe on the sides).

Today when I think about L.A. Story, I sometimes wonder whatever happened to Harris K. Telemacher and Sara McDowel. Did they really live happily ever after? And what about SanDeE* (“Big-S, small-A, small-n, big-D, small-E, big-E… and there’s a star at the end”) and Roland? Whenever a story makes me wonder about where the characters might be thirty years later, it is a good story.

Ticking Clock: Behind the Scenes at 60 Minutes: A Fascinating Read–And a Struggle

It is rare that I don’t know what to make of a book. If I zip through a book with ease, it is usually a sign that I enjoyed it. If I struggle through it but finish, it was okay, but not necessarily something I’d write home about. But what about a book that I zip through with ease, and struggle with along the way? That doesn’t happen often, but it happened while reading Ira Rosen’s new book, Ticking Clock: Behind the Scenes at 60 Minutes.

Rosen was a long-time producer at 60 Minutes, working with many of the correspondents, especially Mike Wallace, and his book is about his time as a producer in television. (He also worked for ABC for a time before returning to 60 Minutes.) Hollywood memoirs are a kind of guilty pleasure of mine, and I particularly enjoy memoirs and biographies about journalists: My War by Andy Rooney, A Reporter’s Life by Walter Cronkite, A Life on the Road by Charles Kuralt, to name just a few. But I struggled with Rosen’s book in ways that I did not with these other books.

While I wouldn’t characterize Rosen’s book precisely as mean-spirited, it certainly came across as the work of someone who decided to air all of his grievances and show the worst sides of the people he worked with. I think this would be understandable if Rosen had been treated poorly and that poor treatment affected him in a negative way. Rosen recounts many times when people like Mike Wallace, Diane Sawyer, and Morley Safer treated him or other people rudely. But what made Rosen’s account interesting was that he never seemed to mind this treatment. He was sort of immune to it, and focused on doing the best job he could as a producer. So why complain about it now, in a book? I just couldn’t understand that.

Perhaps the reason is perspective: Rosen writes from the point of view of a producer, while the other books I’ve read are from the points of view of the reporters themselves. Andy Rooney was grumpy at times, as everyone knows–that’s part of what people loved about him. He had complaints about CBS, but he generally didn’t single out people, but the organization as a whole. Cronkite and Kuralt, in their books, seemed to handle this by omission: they wrote about people they admired, or who helped them out, and didn’t mention those who created problems or roadblocks.

Still, despite Rosen’s dramatic characterizations of those correspondents he worked with, the book was endlessly fascinating. Reading it, I felt like a fly on the wall at some interesting conversations. It made some of the more outlandish stories Rosen had to tell about people all the more out of place in the book, and made me wonder: was the book written as a memoir, or as a memoir disguised as a vehicle for Rosen to vent about his treatment as a producer? Maybe it was both, and maybe that’s what made it both a fascinating read and a struggle.

How Much Does It Cost To Browse the Internet, Ad-Free?

Nothing makes me give up on a website faster than seeing every available space on the page filled with ads. If the article I am reading is interesting enough, I’ll try to continue reading only to find that I have to scroll past a large ad every paragraph or two and then try to figure out if the text that I am reading is part of the original article, or ad copy. When the popups asking me to subscribe start, I’m out.

Economics was my worst subject in college, but it seems to me there must be a diminishing return for all of that advertising. If people bail before reading the article, let alone the ads, how can the site be worth advertising on?

I was thinking about this, and as my thoughts wandered, I began to think about cable TV. When I was a kid, there were 3 network channels, and UHF. Cable was a new phenomenon when I was 9 or 10 years old. The thing about some of the channels (like early HBO) that impressed me was that you could watch movies without commercials. Sure, you paid a monthly fee for that privilege, but it seemed moderate enough (to my 10-year-old self) to make skipping the commercials worthwhile.

I also recall the early days of the Internet, which came into its own in 1994, the same time I graduated from college and began my career. Back then, there wasn’t much advertising on websites. Indeed, for a time it seemed anathema. I remember sometime in the late 1990s, when I first saw a Yahoo! commercial on television, and thought, Wow, they have the kind of money to buy a television spot? In those days, you didn’t have to worry about pages filled with ads. You just had to be careful of the blinking text that was all the rage for a time as people learned to use HTML.

The early days of Facebook also seem, in my memory, to be relatively ad-free, at least compared to today. I suppose that is the classic bait-and-switch of these services: grab you with the services, and then start putting ads in front of you if you want to continue using it–which, of course, many people do. A few months ago, I wondered why Facebook and other social media companies didn’t offer an ad-free version, one that users would subscribe to via a monthly or annual fee. Imagine what it would be like to use these services without ads. After a few minutes’ thought (I was walking through the park and clearly remember where I was as I pondered this), I realized that social media companies must make far more money off showing ads to individuals than they would from any reasonable subscription fee that those individuals could pay.

Isaac Asimov, in his science essays, would occasionally explore extremes. I especially loved those essays: what’s the smallest possible distance? The largest? The coldest temperature that can exist? The hottest? It was a thought experiment as much as anything, and thinking of those essays made me wonder: is it possible to estimate how much it would cost the average individual to browse the Internet completely ad-free? It doesn’t matter what the answer is. What matters is the possibility of calculating it. I did some rough browsing on this question (seeing plenty of ads in the process) and didn’t come up with much. The cost questions center on how much ad companies make on people, or how much access to the Internet costs. Neither of those is what I am interested in.

Put another way: for the Internet to continue to have new content in much the same way it does today, but to be entirely ad-free, how much would access cost an individual? I imagine it would require summing the ad revenues of countless companies and then dividing that number by the total number of Internet users. What would the order of magnitude be for, say, one month of ad-free Internet browsing? Would it be $50/month per person? $500/month? $5,000/month?
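Out of curiosity, here is a minimal back-of-the-envelope sketch of that division in Python. Both input figures (roughly $500 billion a year in global digital ad revenue, roughly 5 billion Internet users) are loose, illustrative assumptions of mine, not sourced data, so the output is at best an order-of-magnitude guess:

```python
# Back-of-envelope estimate: cost per user of an ad-free Internet,
# assuming ad revenue must be replaced dollar-for-dollar by subscriptions.
# Both figures below are rough, illustrative assumptions, not sourced data.

GLOBAL_AD_REVENUE_PER_YEAR = 500e9   # assumed ~$500 billion/year in digital ad revenue
INTERNET_USERS = 5e9                 # assumed ~5 billion Internet users

per_user_per_year = GLOBAL_AD_REVENUE_PER_YEAR / INTERNET_USERS
per_user_per_month = per_user_per_year / 12

print(f"${per_user_per_year:.0f}/year, or about ${per_user_per_month:.2f}/month per user")
```

Under those assumptions, the answer lands closer to $10/month than to $50/month, though changing either assumption shifts the result proportionally.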

I guess I’d like to know the answer, because once I know it, I’d wonder if it would be worth paying.

Mount To-Be-Read and the Danger of the Doubling Charm

There is a scene in one of the Harry Potter films where Harry and his friends end up in a treasure vault that has been booby-trapped with a “Gemino Curse,” a variant of the Doubling Charm. Everything touched instantly doubles; touch the copies and they double, too, without end. I have sympathy for Harry in that scene. I know the feeling. Each book I read spawns more books to read, and those books spawn more books. This continues in an endless doubling, tripling, quadrupling that makes me increasingly doubtful of my ability to read every book ever written.

At various times, my to-be-read list can have anywhere from dozens to scores of books on it, each one a butterfly’s wing-flap leading to who knows how many other books to read.

This was illustrated to me in a stark way this afternoon, after I began playing around with the Mind-Map plug-in for Obsidian, my new favorite text editor. I was trying to see how my reading had progressed–and how Mount To-Be-Read had grown–since the weekend, just a few days ago.

I picked the New York Times Book Review as my starting point. (The Washington Post and a few other lists may have been involved as well.) From this I started listing out the books that interested me and that I ultimately read. From there, I began listing books I came across in those books that interested me and that I either noted on my list, or read. From there… well, you get the picture.

This formed a simple outline in my text file, and with a few keystrokes, I’d turned it into a mind-map:

A mind-map of recent reading.

Since Sunday, I’ve read 3 of the books on the mind-map (Probable Impossibilities by Alan Lightman, In Praise of Wasting Time, also by Alan Lightman, and When Einstein Walked with Godel by Jim Holt). I’ve also nearly finished a fourth (as in, I will finish it this evening). The three books that I have finished spawned eight other books that have since been added to the mountain that is my to-be-read list. If we go with 2.67 new books per book I read, those eight newly added books will spawn 21 more books to add. Those 21 books will spawn 56 additional books.
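The arithmetic above is just a geometric progression, and can be sketched in a few lines of Python. The 8/3 spawn rate comes from my own count (3 books read spawned 8 new ones); the rounding to whole books is my own choice:

```python
# A sketch of the "Gemino Curse" growth of a to-be-read list:
# each finished book spawns, on average, SPAWN_RATE new books to read.

SPAWN_RATE = 8 / 3  # ~2.67 new books per book read (3 books read -> 8 added)

generation = 8      # the 8 books just added to the pile
total_added = generation
for i in range(2):
    # each generation of newly added books spawns the next one
    generation = round(generation * SPAWN_RATE)
    total_added += generation
    print(f"generation {i + 2}: {generation} new books")

print(f"total added after three generations: {total_added}")
```

Three generations in, those original 3 finished books have put 85 new ones on the mountain, at least under my rough spawn-rate assumption.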

You get the idea. I’m reminded of poor Ali Sard, in Dr. Seuss’s Did I Ever Tell You How Lucky You Are?. Ali is the one who had to mow grass in his uncle’s back yard, quick-growing grass:

The faster he mows it, the faster he grows it.

The faster I read, the more I fall behind.

The Most Successful People Wake Up Before 6:30am

Part of the beauty of the Internet is that it is an open forum for the exchange of ideas. Part of the problem with the Internet is that people can say anything. All things being equal, I’d prefer an Internet where people could say anything, but I’d certainly like to see more skepticism among its users.

There is so much on the Internet today that it is hard to get noticed without resorting to some kind of extreme. The result is outlandish statements designed to attract attention, like one I encountered yesterday morning that asserted, “The most successful people wake up before 6:30 am.” When I see an assertion like this, I roll my eyes. It is clearly an attempt at getting our attention. (The mildly observant reader will notice that I used that very line in the title of this post. I did so as a sort of experiment. I wonder (a) if this post will get more attention, and (b) if it will become one of the many evergreen posts I’ve written over the years.)

An assertion like “the most successful people wake up before 6:30 am” is so fraught with generalities that it is hard to know where to begin in interpreting it. To know what “most successful” means, we first have to know what “successful” means. It seems to me that the definition of success varies based on a number of factors, and the assertion leaves it a generality. Successful in what? In life? In a career? In happiness? In love?

If we can manage to get to a definition of success that we can agree on, then we have to define what we mean by “most successful.” Is this the majority of successful people, meaning 50% plus one? Is it a standard deviation out, the narrow end of a bell curve?

To me, an assertion like this is just asking for counter-examples. I consider myself a successful person based on my own criteria for success, but I have spent only a relatively small fraction of my life waking before 6:30 am. These days, I generally wake up somewhere between 6:15 and 6:45 am. I rarely use an alarm. I’m not trying to wake up before 6:30 am. And I’m not sure I feel any more successful on days that I wake up at 6 am than on days that I sleep in until 7 am. (Sometimes, waking at 6 am, I feel more sluggish than when I wake at 7 am.)

To me, an assertion like this has a specific type of person and a specific type of success built into it. It implies that someone working a night shift, for instance, can’t end up in the group of “most successful” people because they sleep through the day and may not wake up until the sun is going down. I suspect there are a lot of successful first-responders, doctors, pilots, and others who fall into this group.

And why 6:30 am? That seems arbitrary to me. I suspect part of the problem is that when culling examples of people we deem successful, we find that these people wake up before 6:30 am. But aren’t we being selective in our sample? We are selecting people who are likely outliers in their success, who have the means to broadcast their success, and who have discussed what they attribute their success to (or have had it implied about them). These outliers have to represent a small fraction of successful people, many of whom are too busy being successful to get involved in publicizing their success and writing about it for others to later cull as examples.

A final problem with an assertion like “the most successful people wake up before 6:30 am” is that it has the illusion of being readily actionable–which alone makes it suspicious in my mind. It implies that because a person gets up before 6:30 am, they are successful, a fallacy encapsulated by the Latin phrase post hoc ergo propter hoc (“after this, therefore because of this”). I suspect that there are many successful people who wake up after 6:30 am. I suspect that there are many unsuccessful people who wake up well before 6:30 am.

Still, I can see people being lured in by the simplicity of the implication. All I have to do to change my life is get up 2 hours earlier!

Vague assertions like these annoy me (clearly). Yesterday, the Little Miss was watching a gamer on YouTube play Minecraft. The gamer was describing the two types of Minecraft players: those who play in survival mode, and those who play in creative mode. “People who play in creative mode,” this gamer stated, “are inherently lazy.”

I’ve played my share of Minecraft, and I have never once played in survival mode. I always play in creative mode. I never once thought of it as lazy. I thought of it as the most efficient way to get the most enjoyment out of the game that I could, by doing exactly what I wanted to–build things.

Looking at these two assertions, I come to the conclusion that I must have a fairly thick skin. I am not among the most successful people, and I am inherently lazy, it seems.

But I’m okay with that.

Fireplace Philosophy

We’ve been using our fireplace quite a bit this winter. We didn’t use it at all last winter. I like to think it has to do with the cold weather, but really, it is more about the ambiance. We have an open living room/dining room/kitchen area, and the fireplace is in the living room, just beneath the TV mounted above it. It is visible from anywhere in the living room/dining room/kitchen. In fact, it is visible from right here at my desk in the office.

Sitting by the fireplace, reading.

There is one corner of the sofa that is closer to the fireplace than the others. I like to sit there and read while the fire is roaring. I can feel the warmth from the fire. The only downside is that the flames often distract me from my book. I like to watch them dance around. I enjoy watching sparks separate from the flames like miniature sky lanterns, zipping up the chimney.

The burning wood smells pleasant. The crackling of the logs and snapping of the air around the flames is calming. Perhaps a fireplace is so enjoyable because it engages nearly all of our senses at once: we see the flames, feel the warmth, smell the wood, and hear the snap-crack-pop of the air and logs. Lately, I’ve been keeping the fire going most of the day, allowing it to burn out at night before we head off to bed.

Sometimes, however, I am distracted. When I am busy with work or stuck in meetings, the fire may die down, and when I wander from the office to the living room, I’ll notice that the flames have gone out and only smoldering embers remain. That’s my cue to get things going again.

Until this afternoon, however, I never actually witnessed a flame wink out. I was sitting on the couch, reading the essay titled “A Mathematical Romance” in Jim Holt’s enjoyable book, When Einstein Walked With Godel. There is some tricky (for me) mathematical discussion in the book, and whenever I read tricky mathematical discussions, I have to pause and visualize each step in my head to make certain I am getting it. In this case, I looked up from the book and stared at the fireplace. A single flame danced sluggishly about. I watched the flame flicker slowly here and there. And then, just like that, it winked out. It was just gone. All that remained were the smoldering embers of the log.

I stared at the log, all thoughts of math gone from my head. I was overcome by a kind of sadness. I’d witnessed the death of a flame. I’d seen it wink out of existence. In that moment, my mind jumped billions of years into the future, to an outpost somewhere outside our solar system. I seemed to sit there, in a comfortable room, warmed by a source of heat I couldn’t quite see. I was reading a book of science essays, very much like the book I’d been reading minutes–and billions of years–ago. Before me was a view of a very dim star, centered against a background of other stars.

I looked up from my book in order to visualize the math that I’d been reading about–just in time to see that dim star–our star, the sun–flicker for the last time, and then wink out completely.

Who knows if there will be anyone around several billion years from now to watch the sun wink out the way I watched the flame in the fireplace wink out. If they happen to be our descendants, I imagine they will be unrecognizable. But if they recognize the star that served as the life source for their ancestors, I can imagine them feeling a moment of sadness as they watch it wink out. I wondered, as I rose from the sofa to light more wood, is it better to notice and feel that momentary sense of loss? Or is it better for flames to wink out unnoticed by others, like a dog walking into the woods to die?