8-track tapes

Preserving past communication for the future

When I first wrote about “dead media” in 2005, the most common way for people to obtain music, movies, software, and other digital content was to buy it on physical objects. And sure, vinyl has been making a comeback, and some people do still buy audio on CDs and video on DVDs (or, more likely, Blu-ray discs). But we all know that physical media for most types of content is racing toward obsolescence. Most of us stream tunes using a service like Apple Music or Spotify; we stream video from Netflix, Hulu, and similar providers; and as for software…well, anything that doesn’t already run in a web browser can be downloaded. No one expects to buy this stuff on objects in boxes anymore.

And so—spoiler alert!—one could argue that almost all forms of physical media are either already dead, or are about to be. But that doesn’t mean their legacy—or their future—is any less interesting.

Around the time I was in junior high school, I began the process of replacing all my 8-track tapes with cassettes. About two decades later, I replaced my cassettes with CDs, and then my CDs with MP3 files, which now seem quaint compared to modern digital audio formats. The same is true of all those videocassettes, floppy disks, and many other assorted media that used to seem so valuable to me but are now unwanted trash. But even today, I occasionally run across a data storage device of some kind (a Zip disk! a microcassette!) that probably contains interesting and useful information, but that I have no way to access. This problem is neither new nor—despite current media delivery methods—completely solved.

It’s Dead, Jim

When a type of media can no longer be decoded, displayed, or presented readily, it’s said to be “dead.” So 8-track tapes have been dead for a very long time. Even though you can, with some effort, still locate a working player, new media is not being created in that format, and the existing media is deteriorating—sooner or later it will be completely unusable, even if you have the necessary equipment. This process is not unique to modern times. Media formats have come and gone regularly for as long as humans have had the ability to communicate. But although technology must march on, we still lose something valuable every time media dies: the words, images, sounds, or ideas it contained.

This may not seem like much of a problem when it comes to bad pop music from the ’70s, but think about older media. Computer punch cards. Wire recorders, which predated tape recorders. The wax cylinders used in early phonographs. Stereopticon images. Magic lantern slides. Over the millennia, thousands of varieties of media have been used to record and exchange information, and as many of those media have died, the information has slipped from our grasp too. Never mind that more recent media are quantifiably better in almost every respect; if the only recording of someone’s voice, say, from over 100 years ago is in a fragile and rapidly disintegrating medium that can only be retrieved with nonexistent equipment, that does us no good today—and yet it’s clearly something of tremendous historical interest.

Book of the Dead

In 1995, well-known science fiction author Bruce Sterling presented a manifesto called The Dead Media Project: A Modest Proposal and a Public Appeal at the International Symposium on Electronic Arts in Montréal. Sterling described the urgent need for someone to catalog all the forms of media humanity has created and then allowed to die off, documenting their successes, failures, and all the elements—cultural, political, financial, and technological—that may have contributed to their demise. This project would not only provide an important historical record, but crucially, would help technologists of the present and future to avoid the many mistakes of the past. Every medium in use today will surely be superseded or replaced someday—yes, even SSDs, flash drives, and the like. Perhaps paper books will still be around long after DVDs are no more than a faint memory; perhaps blogs and other websites will survive for decades or centuries. But sooner or later, the world will move on to new ways of communicating. It always does.

Sterling proposed, specifically, that someone undertake the task of writing what he called The Handbook of Dead Media—as he put it, “A naturalist’s field guide for the communications paleontologist.” And as a tiny incentive, he offered a crisp, new $50 bill to the first person or group to produce such a book. He even offered his own notes on the subject for anyone to use freely, and started an email discussion list and website called the Dead Media Project where members of the public could contribute their own information about dead media. All this data was there for the taking, royalty-free, for anyone willing to sit down and do the research, track down all the relevant facts and images, and produce a nice coffee-table book that could be used as a reference for people like Sterling, who imagine future technologies for a living.

What Don’t You Know?

As Sterling admitted, such a project is full of ambiguities. What counts as media, anyway? Does it include, for example, delivery mechanisms such as carrier pigeons, pneumatic tubes, or the telegraph? Does it include ephemeral means of communication, such as Native American smoke signals? What about media production devices, such as unusual typewriter designs? Or particular methods of encoding information—say, obsolete computer file formats? For that matter, what does it really mean for media to be dead? Are there degrees of deadness? If media can still be recovered somehow, does it make the cut? If a few scattered hobbyists actively use a medium that’s otherwise dead, should it be included? If the Vatican uses smoke signals to indicate the selection of a new pope, does it mean that medium is still alive? The questions, to which there are no definitive answers, go on and on. One could spend months simply trying to define the scope of the project.

Now, over 20 years later, The Handbook of Dead Media still does not exist—although Garnet Hertz’s 2009 A Collection of Many Problems (In Memory of the Dead Media Handbook) was a vigorous nod in the right direction. The Dead Media Project itself is, if not dead, barely kicking—the website hasn’t been actively maintained since mid-2001, and the mailing list no longer functions. Other programs and websites devoted to the preservation of old media, especially early audio recordings, are in operation, but as yet, I’ve seen no signs that anything approaching the scope of Sterling’s manifesto is even in the works.

Needless to say, I myself find this project enormously interesting and inspiring—linguistically, historically, and technologically. Compiling The Handbook of Dead Media would be right up my alley, a project I could really sink my teeth into. And if I ever become independently wealthy with absolutely nothing to do for a year or two, I’ll get right on it. The problem, of course, is that there’s almost certainly not enough money in such a book to reimburse an author for the time required to research and write it. That, like the death of media, is a great pity.

Speaking of death and books… I wrote a book called Take Control of Your Digital Legacy, which is all about ensuring that your digital data outlives you. The premise is that, just as you may have a will that details what should happen to your physical possessions and money after you die, you should take analogous steps to make sure that your files, email, photos, social media posts, and other digital data are handled however you want them to be; if you don’t decide on that now, your heirs are almost certain to make different decisions. Among many other topics in the book, I talk about the tendency of media to become obsolete (including the media to which you might want to commit your data for posterity, such as hard drives and flash drives), and I offer suggestions for steps you can take to improve the odds of indefinite media availability.

Note: This is an updated version of an article that originally appeared on Interesting Thing of the Day on April 19, 2005.

Source: Interesting Thing of the Day

A color TV being tested

CBS made the first commercial color television broadcast on June 25, 1951, and we remember that anniversary today as Color TV Day. (Things got a bit messy and confused after that, but I think we can let bygones be bygones now.) The switch from black-and-white to color TV was a big deal, so much so that my middle-class family couldn’t afford our first color TV until the mid-1970s, when I was in elementary school. (I remember being totally entranced—I’d had no idea television could look like that, and all of a sudden, everything we watched seemed real for the first time. It was a magical experience for me.) Now, the occasional movie or TV show shot in black and white is considered stylishly retro and artsy, but few of us would want to go back to a world with no color on our screens at all.

Image credit: By Archives New Zealand [CC BY-SA 2.0], via Wikimedia Commons


Image of His Imperial Majesty Emperor Norton I, also known as Joshua A. Norton.

Monarch of San Francisco

When I lived in San Francisco, I always bristled to hear people call it “San Fran.” People who live in other parts of the world may think “San Francisco” has too many syllables, but locals don’t ever call it “San Fran.” Ever. And only in an effort to be intentionally gauche or ironic would a resident call it “Frisco.” That’s just wrong, and it immediately identifies anyone who says it as clueless. This judgment goes way, way back. A century and a half ago, by the emperor’s decree, calling the city “Frisco” was a high misdemeanor punishable by a $25 fine.

Today’s interesting “thing” is ostensibly a person, though in fact it’s more of a concept: the notion that someone could declare himself to be an emperor, and—without any force or intimidation—actually get an entire city to go along with the fantasy, at least superficially, for more than 20 years. I am speaking of one of San Francisco’s most colorful historical figures: Joshua A. Norton, a.k.a. His Imperial Majesty Norton I, Emperor of the United States and Protector of Mexico.

The New Emperor’s Clothes

Joshua Norton was born in England in the early 1800s—sources vary as to the exact year of his birth, but it was somewhere between 1811 and 1819. From 1820 to 1849, he lived with his parents in South Africa. He then moved to the United States, taking up residence in San Francisco. Norton had come to the United States with a small fortune, and for several years, he was a successful businessman. Then a major investment turned out poorly, and Norton went bankrupt in 1858. He left the city, but returned the following year—wearing an army uniform with gold-plated epaulets and a funny hat. He presented a piece of paper to the editor of a local newspaper declaring himself to be “Norton I, Emperor of the United States.” The editor found Norton’s claim to be so amusing and bold that he printed the declaration on the front page of the paper as though it were genuine. And, surprisingly enough, the general public’s reaction was apparently “Cool. We have our own emperor.” Norton began striding regally around San Francisco in his uniform as if he really were the nation’s supreme ruler, and the city’s residents indulged him.

By all accounts, Norton, whatever his eccentricities or mental deficiencies, was a kind, affable, and benevolent man who was much loved by almost everyone he encountered. Whether or not he truly believed he had any power or right to rule, the people of San Francisco treated him with great deference and respect. He spent his days patrolling the streets, inspecting sidewalks and cable cars, supervising street repairs, and praising citizens for any good deed he observed. He especially liked to monitor “his” police force to make sure they were serving the city’s needs properly.

Norton was given free meals at restaurants, and in fact, restaurants that served him liked to put up brass plaques bragging that they were an official supplier of the emperor—it was good for business. Theaters reserved prime seats for Norton and his two dogs at every opening, and he was invited to lead parades and participate in various civil ceremonies. A local printer even provided Norton with his very own imperial currency, which merchants accepted as cash. (These notes are worth thousands today at auction.)

On one occasion, a police officer who didn’t appreciate Norton’s status arrested him and attempted to send him to a mental institution. The public outcry was immense. The police chief released Norton with a formal apology, and thereafter, all police officers saluted the emperor when they passed him on the street.

Laying Down the Law

Emperor Norton is perhaps best known for his frequent and audacious decrees, which the local newspapers always printed with great relish. (And when there wasn’t an actual decree, sometimes the newspapers made them up.) An 1860 decree dissolved the United States of America; a few months later, another decree prohibited Congress from meeting in Washington, D.C. In 1869, Norton issued a decree abolishing both the Democratic and Republican parties. Of course, no one ever took these decrees seriously, but this apparently didn’t diminish Norton’s enthusiasm for his vocation.

Some of the decrees, though, were strictly local—and showed an uncanny degree of foresight. In 1872, Norton decreed that a suspension bridge should be built between Oakland and San Francisco by way of Goat Island (now called Yerba Buena Island). Later that year, another decree demanded that a survey be undertaken to determine whether a bridge or tunnel was the best way to connect San Francisco and Oakland—and the members of the city’s Board of Supervisors were to be arrested for having ignored his earlier decrees. (They weren’t.) But a bridge very much like the one Norton had described was built eventually; the San Francisco–Oakland Bay Bridge opened in 1936. And in 1974, roughly a century after Norton’s decree, so did a tunnel—it’s used by BART, the Bay Area Rapid Transit system.

Norton’s reign as emperor lasted for 21 years. He died in 1880 of an apparent stroke while walking down the street. Although Norton died penniless, the city gave him a regal burial, and about 30,000 people turned out to watch his funeral procession.

In the years since, some people have argued that Norton really was the emperor, inasmuch as the public (at least in San Francisco) acknowledged that title. And there’s something to be said for that line of thought; after all, the only reason any government is legal or valid is ultimately the general assent of those governed. The United States of America declared itself into existence once; why shouldn’t an individual do the same?

I, Joe I

Norton did not have a successor, but there’s no reason someone else couldn’t step into the role, even now. Knowing what I do about the follies and failings of the nation’s current government, it seems clear to me that we should once again bring in an emperor to enforce order, fairness, decorum, and maybe a bit of whimsy. Clearly this individual must be someone with intelligence, charm, good humor, and poor fashion sense. As the logical choice, I’d be honored to serve.

And so, citizens of the United States and/or Mexico, should you choose to accept me as your new emperor by public acclaim, I will humbly and faithfully discharge the duties of the office. I promise to come up with great ideas that can’t possibly be implemented for a century; to issue impressive-sounding, unreasonable, and unenforceable decrees on a regular basis; to dress in a goofy uniform; and to behave in a generally benevolent and weird manner. In other words…pretty much life as usual. I’ll be expecting to receive your tax payments shortly. Oh, and Congress: you’re fired.

Note: This is an updated version of an article that originally appeared on Interesting Thing of the Day on May 26, 2005.


Until today I could not have told you exactly what a praline is, though I had the vague idea that it involved nuts and sugar. It turns out that a praline is—are you ready?—any of numerous confections composed mainly of nuts and sugar. I was unaware until just now, but apparently there are French, Belgian, and American varieties of praline (among others), and they’re so different from each other that only the nuts and the sugar are really the constants. The word is even pronounced differently depending on where you are and what type of praline you’re talking about. The kind they make in New Orleans involves pecans, butter, white and brown sugar, vanilla, and milk or cream. But you can have any kind you want today.

Image credit: Charles Barilleaux [CC BY 2.0], via Flickr


The Dvorak keyboard layout

Chasing QWERTY

When I was first learning to type, I asked the same question everyone else asks: why are the keys arranged so stupidly? Why aren’t they laid out in a more logical order, as in, just to take one random example, alphabetically? The answer I’ve heard countless times is that the first typewriter keyboards were arranged alphabetically, but that caused mechanical problems—once typists became reasonably proficient, the keys jammed frequently because the hammers corresponding to certain frequently used letter sequences were too close together. As a result, so the story goes, the QWERTY layout was designed to prevent jamming by moving those letters farther apart, thus slowing down the typists to a speed the machine could handle. Meanwhile, a more sensible and efficient layout called the Dvorak Simplified Keyboard has been around for a long time, but never became very popular because QWERTY simply had too much momentum in the marketplace.

That story has frequently been used (even by me) as an example of how an inferior, inefficient design came to be the standard—and remained so, long after the original reasons for its success became irrelevant. But the truth is more complicated and surprisingly controversial.

Faster than a Speeding Typist

The first issue is whether the switch to the QWERTY layout was truly intended to slow down typists—whether it was a matter of deliberately inconveniencing people for the sake of machines or whether it was simply expedient engineering. Evidence strongly suggests the latter. I have never seen any statistics as to how fast anyone could type using the original, alphabetical keyboard layout, but I think it’s fair to imagine that it would be faster only for people who have to look at the keys while they type. For well-trained touch-typists, I very much doubt that an alphabetical layout would yield any speed increase over QWERTY, and there is some evidence to suggest exactly the opposite. QWERTY may have its faults, but it seems to me that one need not get upset (as many have) that we’re all using a layout that’s substantially worse than the original.

But is there some layout that’s demonstrably much better than QWERTY? A lot of people think the Dvorak Simplified Keyboard meets that description. Dr. August Dvorak, a professor at the University of Washington in Seattle, designed this alternative layout in 1932 and patented it in 1936. Dvorak’s goal was to reduce typing fatigue by minimizing finger movement, so he put the most commonly used letters (including all the vowels) on the “home” row, while placing letters like Q and Z and certain punctuation characters in spots that are harder to reach. The Dvorak layout also favors the right hand, on the grounds that the majority of people are right-handed. Dvorak naturally claimed that his design was much better than QWERTY, but we need not take his word for it. A study performed by the U.S. Navy in 1944 showed speed improvements of as much as 75% when people who had previously learned the QWERTY layout were retrained on Dvorak keyboards. And that, say Dvorak supporters, should be that.
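Dvorak’s home-row rationale is easy to illustrate with a few lines of code. The sketch below is my own rough illustration (not part of the Navy study or any formal analysis): it counts what fraction of the letters in a sample sentence fall on each layout’s home row.

```python
# Rough illustration of Dvorak's design goal: common letters (including
# all the vowels) sit on the home row, so more keystrokes happen without
# moving the fingers. This counts home-row letters only; it ignores
# shifted characters, punctuation, and the semicolon key.

QWERTY_HOME = set("asdfghjkl")
DVORAK_HOME = set("aoeuidhtns")

def home_row_fraction(text, home_row):
    """Return the fraction of alphabetic characters in `text` that are
    typed on the given home row."""
    letters = [c for c in text.lower() if c.isalpha()]
    if not letters:
        return 0.0
    return sum(c in home_row for c in letters) / len(letters)

sample = "the quick brown fox jumps over the lazy dog"
print(f"QWERTY home row: {home_row_fraction(sample, QWERTY_HOME):.0%}")  # 29%
print(f"Dvorak home row: {home_row_fraction(sample, DVORAK_HOME):.0%}")  # 51%
```

Even on this tiny sample, the home row handles roughly half the keystrokes under Dvorak versus well under a third under QWERTY—which is exactly the effect Dvorak was aiming for, whatever it may or may not do for typing speed.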

Dark Days for Dvorak

However, another study—this one performed in 1956 by the U.S. General Service Administration (GSA)—failed to confirm the results of the Navy test. It found, in a nutshell, that QWERTY was at least as efficient as Dvorak, and possibly more so. Further research conducted from the 1950s through the 1970s showed little or no advantage for Dvorak.

Two economists, Stan Liebowitz and Stephen E. Margolis, have written extensively about the Dvorak-versus-QWERTY debate. They mention the research suggesting QWERTY’s superiority and point out a number of significant flaws in the 1944 Navy study, not the least of which is the fact that Dvorak himself apparently oversaw that research in some capacity. In short, they say, the Dvorak layout isn’t and never was any better than QWERTY, and the only thing the pro-Dvorak studies really prove is that anyone who is retrained on any keyboard layout will get better. Therefore, no one should use QWERTY as an example of the free market “choosing” an inferior technology.

Rejoinder to the Rebuttal

But wait! Dvorak’s not out yet. Supporters of Dvorak claim that Liebowitz and Margolis have an axe to grind, and that in the very process of showing how earlier research was fudged, they fudged facts themselves. Specifically, pro-Dvorak folks say, the economists failed to mention that Earl Strong, who was in charge of the 1956 GSA study, had a personal grudge against Dvorak and had made public statements before that study was even performed voicing his opposition to any alternative keyboard layout. Randy C. Cassingham, author of the 1986 book Dvorak Keyboard: The Ergonomically Designed Keyboard, Now an American Standard (an unbiased title if ever I heard one), attempted to debunk Liebowitz and Margolis’s findings soon after they were originally published, but his work has been little noticed—except by other die-hard Dvorak fans looking to bolster their position.

The ferocity with which both pro- and anti-Dvorak views are evangelized in some circles rivals that of a religious or political cause. Both sides selectively downplay or emphasize whichever facts suit them best, and there’s precious little research on the subject that’s both truly objective and modern enough to have been performed using computers rather than typewriters. Anecdotally, Dvorak users frequently cite greater comfort as one reason for preferring it, and some claim that because Dvorak involves less finger movement, it’s less likely to contribute to repetitive stress injuries. Opponents counter that if you truly can type faster with Dvorak, then the increased number of movements will offset the ergonomic gains made by the decreased range of motion. And the debate goes on and on.

The Best Test

Virtually all modern computers, tablets, and smartphones include the capability of switching into a Dvorak layout if that’s what you prefer (though keys on physical keyboards won’t match the characters they type unless you perform some minor surgery on your keyboard or put stickers over the existing letters). So if you want to try out Dvorak yourself, you need only consult your device’s Help or your favorite search engine to find out how to change that setting (in some cases, you might have to download an app). Chances are you’ll find that it takes a few weeks or so to retrain yourself to the point where you’re about as fast as you were using QWERTY. If you can tolerate the temporary loss in productivity, you may find the experiment useful.
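If you switch your system to Dvorak while keeping a physical QWERTY-labeled keyboard, the mismatch between key labels and on-screen output is easy to simulate. This little sketch (a toy of my own, not anything you need for the actual switch) remaps each QWERTY key position to the character Dvorak assigns to that same physical key:

```python
# Map each QWERTY key to the character the Dvorak layout assigns to the
# same physical key position. Letter and basic punctuation keys only;
# shifted characters and the number row are omitted for brevity.
QWERTY_KEYS = "qwertyuiopasdfghjkl;zxcvbnm,./"
DVORAK_KEYS = "',.pyfgcrlaoeuidhtns;qjkxbmwvz"
QWERTY_TO_DVORAK = str.maketrans(QWERTY_KEYS, DVORAK_KEYS)

def press_qwerty_labels(text):
    """What appears on screen if you press the QWERTY-labeled keys for
    `text` while the system layout is set to Dvorak."""
    return text.translate(QWERTY_TO_DVORAK)

print(press_qwerty_labels("hello"))  # the keys labeled h-e-l-l-o produce "d.nnr"
```

This is why the stickers (or keycap surgery) matter: until your fingers relearn the positions, pressing the keys labeled "hello" gets you gibberish.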

I tried learning Dvorak myself, years ago, but gave up before I became proficient—I had to get my work done. On the other hand, I can already type as fast as I should ever need to using QWERTY. For me, the bottleneck is usually how fast I can think, not how fast I can type, so I suspect Dvorak would not make my life meaningfully better. If I spent all day, every day, doing straight transcription or had a quota to meet, though, I might relish any chance for a potential speed increase. Better yet, I might start looking for a job that didn’t involve typing at all.

Note: This is an updated version of an article that originally appeared on Interesting Thing of the Day on August 27, 2007.

Image credit: By User:Tiki katin [CC0], via Wikimedia Commons
