Archive for July 2018

A Lichtenberg Figure

Artificial lightning fossils

Go into almost any gift shop these days, and you’ll see paperweights, ornaments, or other decorations consisting of little crystal blocks with 3D laser-etched figures inside: a local landmark, a comic book hero, or whatever. You can even find booths set up where someone will take a 3D scan of your head and convert it into a work of art right there on the spot. Working from the scanned coordinates, a laser beam focuses on a tiny spot inside the crystal, heating it until it cracks slightly, and in so doing makes a visible dot. It repeats this a few thousand times at various spots throughout the glass block and next thing you know, you’ve got a lovely image, especially when it sits on its custom-made stand (available at extra cost), illuminated from beneath by colored LEDs. I thought this was a pretty cool effect the first time I saw it many years ago, but now that these things are so ubiquitous they’ve lost their appeal to me.

So my initial reaction upon seeing a Lichtenberg figure, after receiving a recommendation from a reader to look into them, was similarly ho-hum. The picture appeared to be just the same as the laser crystal art—right down to the LED-illuminated base—except that the pattern looked like lightning rather than someone’s head. Lovely, I thought, but no big deal. However, on closer examination I discovered that the similarities were superficial. What had made this image was in fact a wholly different and much more interesting phenomenon.

Fractals and Photocopiers

During a thunderstorm, if you look up into the sky, you may see lightning that appears not as a single jagged bolt but as a series of interconnected branches that look very much like roots coming from the clouds. If you were to look more closely at a photograph of the lightning, you’d notice that the irregular branches have smaller sub-branches, those branches have even smaller sub-branches, and so on. This gives lightning an overall shape resembling what mathematicians call a fractal, in which a pattern keeps repeating itself at smaller and smaller (or larger and larger) scales. Fractal-like shapes appear frequently in nature, most obviously in ferns and some kinds of trees, and are often seen in computer-generated artwork.
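For the curious, that self-similar branching idea is easy to see in a few lines of code. The little Python function below is just an illustrative toy (not anything from Lichtenberg’s experiments): each branch spawns two sub-branches at half its length, and the recursion stops once the branches get too small.

```python
# Toy illustration of fractal self-similarity: every branch spawns
# `splits` smaller sub-branches, shrinking by `ratio` each level,
# until the branches fall below a minimum length.
def branch_lengths(length, min_length=1.0, ratio=0.5, splits=2):
    """Return the lengths of every segment in a simple fractal tree."""
    if length < min_length:
        return []
    lengths = [length]
    for _ in range(splits):
        lengths.extend(branch_lengths(length * ratio, min_length, ratio, splits))
    return lengths

segments = branch_lengths(8.0)
# Levels: one branch of 8.0, two of 4.0, four of 2.0, eight of 1.0
# -> 15 segments in all
```

Starting from a trunk of length 8, the pattern repeats at each smaller scale exactly as the paragraph above describes, which is why zooming in on a fractal (or a photo of lightning) keeps revealing the same branching shape.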

In the case of lightning, the shape exists for just a fraction of a second. But what if you could capture that shape in a static form? German physicist Georg Christoph Lichtenberg discovered how to do just that in 1777. He generated a strong electrostatic charge and discharged it through a needle touching a plate made of insulating material. He then covered the plate with resin dust, which stuck to the charged parts of the plate. The result was a two-dimensional image showing numerous radial lightning-like branches. Lichtenberg himself saw no practical use for this technique, but it was later to become the basis of all photocopiers and laser printers, which work by selectively charging portions of a drum, which then picks up toner in just the charged spots and transfers it onto paper.

Occasionally, under certain very specific conditions, Lichtenberg figures appear naturally after a lightning strike. People who have been struck by lightning sometimes have red marks on their skin in this characteristic shape, and the figures may also appear on grass or other vegetation. And when lightning strikes sand, the sand occasionally melts into fragile glass tubes called fulgurites, which roughly resemble lightning branches and are, in the most literal sense, natural lightning fossils.

Blocking a Charge

The most popular way to make Lichtenberg figures artificially today is to take a block of transparent acrylic and use a particle accelerator to bombard it with electrons. Because the electron beam is quite powerful, the electrons penetrate deep into the plastic. Acrylic is an insulator (or dielectric), meaning that it can hold a charge but current can’t pass through it. However, when the strength of the electric field passes a certain threshold, all those extra electrons begin knocking more electrons free, ionizing the acrylic molecules. This rapidly becomes a chain reaction (known as “dielectric breakdown”) in which portions of the acrylic spontaneously change from being highly insulating to being highly conductive. As the electrons all rush along this newly conductive path to escape the acrylic, they leave a visible trail of ionized acrylic molecules. Sometimes this breakdown (which, like lightning, takes a tiny fraction of a second) occurs spontaneously, as the electrons migrate toward an irregularity or defect in the acrylic’s surface; other times an artist provokes the discharge manually by striking the acrylic with a sharp, conductive object (like a nail). In any case, the result is a loud, bright spark and a lasting impression of the discharge path.
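Physicists often model this kind of branched discharge with simple particle-aggregation simulations. The sketch below is a minimal, assumption-laden toy (diffusion-limited aggregation, a standard stand-in for breakdown-like branching, not the actual physics described above): random walkers wander a grid until they touch a growing cluster and stick, and the cluster that emerges is branchy and lightning-like.

```python
import random

# Toy diffusion-limited aggregation (DLA): random walkers stick to a
# growing cluster, producing branched, Lichtenberg-like patterns.
# This is an illustrative model, not a simulation of real dielectric
# breakdown in acrylic.
def grow_cluster(n_particles=50, size=41, seed=1):
    random.seed(seed)
    c = size // 2
    cluster = {(c, c)}  # start from a single central seed point
    for _ in range(n_particles):
        # drop a walker at a random spot on the grid
        x, y = random.randint(0, size - 1), random.randint(0, size - 1)
        while True:
            # if the walker is next to the cluster, it sticks there
            if any((x + dx, y + dy) in cluster
                   for dx, dy in ((1, 0), (-1, 0), (0, 1), (0, -1))):
                cluster.add((x, y))
                break
            # otherwise take a random step, wrapping around the edges
            x = (x + random.choice((-1, 0, 1))) % size
            y = (y + random.choice((-1, 0, 1))) % size
    return cluster
```

Printing the cluster as a grid of characters shows irregular arms radiating from the center, smaller offshoots sprouting from larger ones, much like the frozen discharge paths in the acrylic.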

Of course, the end result of all this is still a paperweight or desktop sculpture. But if you like to think of art as holding up the mirror to nature, this is just what you need: a miniature, three-dimensional image of lightning frozen in time.

Note: This is an updated version of an article that originally appeared on Interesting Thing of the Day on August 14, 2006.

Image credit: By Bert Hickman [GFDL, CC-BY-SA-3.0, or CC BY 2.5], via Wikimedia Commons

Source: Interesting Thing of the Day

Harry Potter wordmark

According to the historical record, Harry Potter was born on July 31, 1980. That makes him 38 today. Happy birthday, Harry! By a complete coincidence, July 31 is also the birthday of author J.K. Rowling, who turns 53 today. (For those keeping score, actor Daniel Radcliffe celebrated his 29th birthday just last week, on July 23.) I, on the other hand, have the dubious distinction of sharing a birthday with Severus Snape (January 9). The late Hogwarts headmaster would have turned 58 this year, but he died at age 38—the same age Harry is today. Weird to think about, right?

Image credit: Public domain, via Wikimedia Commons

Source: Interesting Thing of the Day

A selection of style guides

I’m OK, you’re okay

When I received the edited manuscript of one of my books from the publisher, I was annoyed to find that every instance of “OK” had been turned into “okay.” Well, not quite every instance: in places where I was talking about a button on a computer screen that actually said “OK,” that was allowed to stay. There followed a lively exchange between my editor and me, she claiming that it had to be “okay” because that’s what the publisher’s style guide said, and I claiming “okay” is etymologically illegitimate, style guide or no. I couldn’t countenance the thought of having a book with my name on it include grating juxtapositions like “It’s okay to click the OK button.” I eventually got my way, though I lost quite a few other battles over differences between my style of writing and what the publisher prescribed.

In cases like these, a dictionary or English textbook is of little help. Most dictionaries say that both “okay” and “OK” are acceptable, sometimes along with “O.K.” and “o.k.,” the only difference being which is listed as the preferred spelling. But preferred by whom? Under what circumstances? And why? The question is even trickier when it comes to recently coined terms. Should there be a hyphen in “email”? Is “website” one word or two? Is “internet” capitalized always, sometimes, or never? And then there are questions of usage that even scholars debate. Are expressions such as “for free” and “from whence” redundant? Is it mandatory, optional, or forbidden to use “they,” “them,” and “their” as singular, gender-neutral pronouns?

A Matter of Style

If you want to know the preferred way to use a term for a particular type of writing, you should consult a style guide. In high school or college, your instructors probably insisted that papers you write follow a set of rigid guidelines in order to make the writing consistent and clear. Academic style guides are often referred to by their authors, such as “Turabian” (Kate L. Turabian’s A Manual for Writers of Research Papers, Theses, and Dissertations) or “Strunk and White” (The Elements of Style by William Strunk Jr. and E.B. White). Other style guides are specific to fields such as psychology, mathematics, or linguistics. And most publishers require their authors and editors to follow a house style guide, whether writing magazine articles or technical books.

Even outside the worlds of academic and professional writing, issues of usage are still quite common and quite important. English, like all languages, is constantly changing. The more frequently a given usage occurs, the more likely it is to become canonized as “proper”—even if the most frequent usage is based on a misunderstanding. This happens automatically and often haphazardly, which is why English has so many strange spellings and inconsistent rules. The further the language strays from its past conventions, the harder it is to learn, teach, and use. Although language change is inevitable, those who care about the ability to communicate clearly will naturally want to choose ways of writing that adhere to conventions in such a way as to avoid distracting readers. A style guide can help a writer make those decisions.

The job of a style guide author is to figure out, for a given point in time and type of publication (and usually, a specific audience), which way of saying or writing something is best—not necessarily correct. This is not as easy as it sounds. The English textbooks you read in school made it sound as if the language is ordered by sacred, inflexible rules, but in reality, there is ultimately no single objectively correct way to say anything. There are only better ways—that is, less ambiguous or more commonly used—and worse ways. So style guide authors get to proclaim which variant is, in their opinion, the one that serves the language best in some particular context.

To return to the “OK” example, everyone agrees on how it should be pronounced, but the spelling is another issue because there are numerous competing theories as to the expression’s origin. According to reliable sources, “O.K.” most likely came from the initials of a tongue-in-cheek alteration of “all correct” as “oll korrect.” There’s also a claim that it originated in the presidential campaign of Martin Van Buren as the initials for his nickname, “Old Kinderhook.” Some linguists think its origins are much older, from the West African language Wolof spoken by many slaves in the U.S. In Wolof, a word that sounds like “waw-kay” means “OK” (more or less), and it’s at least plausible that this was the term’s entree into English. There are probably a dozen other theories as well. The consensus seems to be that the letters O and K don’t stand for anything individually (at least, not now—even if they once did), so it would be misleading to include periods. But as to whether it should be written as a phonetic word (“okay”) or a pair of uppercase letters (“OK”), that is a matter of strenuous debate.

Choosing a Guide

Sorting through all these theories and opinions—for hundreds or thousands of expressions—is one of the tasks a style guide author must undertake. Style guides also provide detailed instructions for the use of punctuation and capital letters, typographical elements such as bullets and italics, and grammatical advice on such matters as split infinitives, dangling prepositions, and irregular verbs. Of course, advice is all it is or can be; no two style guides agree on everything. As a result, in choosing a style guide—and in choosing how diligently to follow it—one must consider not only the style guide’s intended audience but also the author’s credentials, attitude, and rationale for making decisions. Good writers must ultimately be prepared to make, and justify, their own decisions about style.

If I had to recommend just one general-purpose style guide, it would be Garner’s Modern English Usage (previously titled Garner’s Modern American Usage, and before that, A Dictionary of Modern American Usage). Garner’s advice is absolutely brilliant, and the reasoning he provides for almost all of his usage pronouncements strikes me as both thorough and reasonable. He’s detailed and strict where he needs to be; in other places, he is appropriately critical of silly, anachronistic writing rules that no reasonable person should have to follow. In other words, I trust him with my language. That’s not to say I agree with him all the time, or that he covers everything I wonder about (technical terms, for instance, are a bit sparse). But I find his advice refreshingly sane. Another favorite is The Chicago Manual of Style, which (along with the Apple Style Guide and a few other sources) informs the house style guide we use at Take Control Books.

I like to do things right, to the extent that I can figure out what “right” is. I think I’m fairly proficient in English, but it’s still a perplexing language, and I’d be lost without a good style guide. I like to think that by choosing a guide wisely, I’m helping to keep English a little more sane.

Note: This is an updated version of an article that originally appeared on Interesting Thing of the Day on May 8, 2003, and again in a slightly revised form on September 16, 2004.

Image credit: fixedandfrailing [CC BY-SA 2.0], via Flickr

Source: Interesting Thing of the Day

Old Penguin paperback books

On July 30, 1935, Penguin Books published its first paperback books. And so, each year on this date, we celebrate Paperback Book Day. I’ve written a few paperbacks myself, and although I run a publishing company that currently sells only ebooks, I still have tremendous fondness for paperbacks (and hundreds of them on my shelves). Among many other virtues, paperbacks require no battery power and can be read in the bathtub without worrying about ruining an expensive device. Pick up a good paperback today and immerse yourself in another world for a few hours.

Source: Interesting Thing of the Day

A recreation of spontaneous human combustion

Answering the burning questions

As a kid, I always wanted to be a mad scientist or inventor of some kind. So I taught myself just enough about chemistry and electronics to be dangerous, and I often had some sort of project or experiment underway. Around age 16 or 17, I was hard at work on my latest contraption, using my bed as a workbench since my desk was perpetually covered with junk. This project involved some soldering, a task at which I was moderately skilled. However, as I was leaning over my work, trying to steady myself by resting my elbow on the mattress, my arm slipped and I fell forward onto the bed with the soldering iron sandwiched between my forearm and the bedspread. Apart from the initial shock, the first sensation I recall experiencing was the smell of burning flesh and hair, followed by the realization that I had ruined my bedspread, and then very shortly thereafter, a good bit of pain.

Any number of lessons could be learned from such an experience—for instance, “Don’t solder in bed.” It’s also a reminder that there are any number of ways to generate dangerous levels of heat in close proximity to one’s body. Fortunately, this incident did not set me on fire. But if conditions had been just right, could this run-in with the soldering iron have reduced me to ash? This is just the sort of question pondered by those who investigate the phenomenon known as Spontaneous Human Combustion (SHC).

Up in Flames

Let’s start with some science. Spontaneous combustion, in and of itself, is a well-known phenomenon—and though perhaps surprising, it’s not in any way mysterious. The term is used when something erupts into flame without any apparent exterior cause (such as a spark). One classic example is the pile of oily rags. Some types of oil undergo a fairly rapid oxidation when exposed to air; it’s a straightforward chemical reaction that happens to produce some heat. If the heat cannot escape (by being dissipated into the air), it may eventually build up to a temperature above the flash point of the oil, at which point it will begin burning—just as it would if someone had dropped a hot coal onto the rags. A similar type of spontaneous combustion can occur in a pile of hay, due to the heat produced by bacteria that feed on the hay. Mechanics and farmers know all about these phenomena and generally take appropriate precautions to prevent heat from building up to dangerous levels.

But for centuries, stories have circulated claiming that a comparable process can set a human being on fire. What’s the source of this belief? Could it possibly be true? And if so, how does it happen?

Is It Hot in Here, Or Is It Just Me?

Picture this: you walk into a room and find the charred remains of a person, with only a few identifiable body parts left intact—a leg, perhaps, or maybe the skull. The area immediately surrounding the spot where the person breathed their last is also heavily charred, but the fire never spread to the rest of the room. It looks as if the person somehow caught on fire, but then burned up so rapidly that the fire was unable to spread. There being no apparent source of ignition nearby, it’s easy to conclude that the origin of the fire must have been internal. Hence spontaneous human combustion.

What I’ve just described may sound unbelievable, but such scenes have been reliably documented hundreds of times in the last few centuries. (It’s also been fictionalized—for example, by Charles Dickens in Bleak House.) The question is not whether the bodies burned, it’s how they burned.

Light Me Up

Human fat—of which there’s usually an ample quantity even in thin people—does make a good fuel. If heated sufficiently (and able to ooze out of the body), it could certainly burn quite well, perhaps using the person’s clothing as a wick like an inside-out candle. And plausibly, by the time all the fat in a body had burned, nothing would be left but ashes—not unlike cremation. But how would the fat get that hot in the first place? Some believers in SHC posit a heretofore-unidentified chemical reaction in the body that could produce a buildup of heat much like the one in the pile of oily rags. But so far, no direct evidence of such a reaction exists, and scientists remind us that combustion itself cannot take place inside the body anyway due to a lack of oxygen. In addition, there have been cases where the skin burned and not the internal organs, but no cases in which it was the other way around—weakening the theory that the source of the heat was internal. Other investigators have pointed out that no strange chemical reaction is necessary, that mundane sources of fire such as a cigarette, candle, or maybe even a spark from static electricity could potentially set fire to a person’s clothing and thus produce the required heat—consuming the “evidence” in the process.

Wouldn’t a person who has just caught on fire tend to notice that fact and do something about it? Advocates of the SHC theory generally claim that the heat is so intense and sudden that the victim is consumed before having the chance to do anything. Critics say that in fact a lower temperature of combustion explains the evidence better—that the bodies appear to have slowly smoldered over a number of hours, and that the smaller amount of heat kept the fire from spreading. And a person may not react if, for example, they were already dead—or maybe just drunk or drugged.

All that to say: the (rare and tragic) phenomenon of human combustion is not in question, but its spontaneity is. Even in the few cases where the victim survived or witnesses attested that there was no apparent external source of flame, there is no scientific need to imagine a mysterious internal source of heat—and no reason you should lose sleep worrying that you didn’t drink enough water today and therefore might suddenly burst into flames. All the same, I recommend doing your soldering on a nice solid table.

Note: This is an updated version of an article that originally appeared on Interesting Thing of the Day on October 22, 2004.

Image credit: By Good Video [CC BY-SA 4.0], via Wikimedia Commons

Source: Interesting Thing of the Day