Remembering Ralph Baer, Part 2

I lived in Australia in the early 1980s, and became an honorary member of a wonderful family whose friendship I still cherish. They had a contraption that I had never seen before hooked to their television set—it turned the TV into a video game. Their youngest son, who was about 6, would goad me into playing the ping-pong game with him, because, honestly, I was hopelessly bad at it. I can still hear his squeals of laughter at my incompetence.

About a decade later, my nephew got his first game console and asked me to play. Even though I was a lot more computer-savvy by that point, I was still a pathetic opponent. I could never get the hang of the controls or understand the goals of these games. I really didn’t see what all the fuss was about.

Ralph Baer in his home workshop, 2003. ©2003 Smithsonian Institution; photo by Jeff Tinsley.

Then in 2003, I met Ralph Baer, the “father of the home video game,” as he is often known. I was at his home on a collecting trip with colleagues from the National Museum of American History. Ralph was a warm, funny, charming, and marvelously creative individual who told us the now-familiar story of why he invented a way to play games on a television set. All you could do with a TV, he said, was turn it on and off or change the channel. Ralph thought that was a frustrating waste of good technology.

During the course of talking about papers and artifacts that Ralph would donate to the national collections, he showed us the “Brown Box,” the prototype for the first multiplayer, multiprogram video game system. Of course, he challenged me to a game of ping-pong. But losing to the inventor of the game somehow didn’t sting as much as losing to a 6-year-old Australian.

Losing to the inventor of the home video game. © 2003 Smithsonian Institution; photo by Jeff Tinsley.

On the day I learned of Ralph’s passing, I came across an article about two German graduate students who are working on a way to make pedestrian crossings safer—by giving people who are waiting for the “walk” signal the chance to play video games against their counterparts on the other side of the street.

I think Ralph would have been tickled by this testament to his legacy . . . but I dread the day when I will have to win a video game before I can cross the street.

Rest in peace, Ralph.

Remembering Ralph Baer (1922-2014)

The Lemelson Center notes, with great sadness, the passing of Ralph Baer, considered by many to be the “father of video games.”

Ralph Baer plays his Telesketch game, 1977.

In 1966, Baer transformed people’s relationship to home television by inventing a way for them to interact with their sets, playing games like Ping-Pong, tennis, checkers, and more. His work led directly to the game Odyssey in 1972, the first home video game for the consumer market, and launched a million-dollar industry, though no one predicted that at the time. That same year, a young Nolan Bushnell played Odyssey at a trade show. Bushnell went on to found Atari and create the arcade version of Baer’s Ping-Pong game, the now-famous Pong. Baer’s groundbreaking work shaped the leisure-time activities of a large segment of the world’s population and spawned numerous businesses, but the historical record of his achievements very nearly disappeared. In 2008, Baer told us the following story of how his original documents and apparatus were lost … and then found. They are now preserved in the Smithsonian’s National Museum of American History, where Baer’s home workshop will go on display next summer.

In the late 1990s I became aware that there was a growing community of classic video-game enthusiasts in the U.S. and elsewhere and that collecting hardware (game consoles, accessories, etc.) was an ongoing, growing hobby. So I became very concerned about the fate of all of the TV game hardware, as well as its supporting documentation, that we had built at Sanders Associates in Nashua, N.H., from 1966 through 1969. Hence, I began to inquire into the whereabouts and recoverability of the developmental games and the many supporting documents that Bill Harrison, Bill Rusch, and I generated during that period.

Photocopies of some of that documentation, used in various lawsuits against infringers of our patents, had been in my possession, but that was about all. Fortunately, people at Sanders were very cooperative and agreed that this material should be preserved. So I took charge of the collection effort and went after the hardware because I knew where it was stored and because it was most vulnerable to loss or destruction. Tracking down original documentation took place several years later.

There was a series of video-game-related patent infringement lawsuits, starting in the mid-1970s and finally ending in the mid-1990s, and I was not only deposed frequently but often appeared as an expert witness in court. As a result, I had a chance to review our game-development documents over and over again during that twenty-year period. That fact would later be of great help to me in my new task of trying to locate and preserve both hardware and documentation.

Depositions for the last of the lawsuits took place in Chicago in 1997 and I found myself once again in the offices of Leydig, Voit & Mayer, an intellectual-property law firm. During that deposition, video was shot in the Leydig, Voit & Mayer offices (these VHS tapes were recovered in 2004 by my friend and associate, David Winter); the footage shows me handling every one of the eight different models of TV games, as well as other apparatus, that we had built at Sanders in the 1960s. In July 1997 I got the recovery effort rolling and received about half of the games from Leydig, Voit & Mayer a year later; the other games could not be located.

Because of health issues, I needed someone to help me continue the effort of data collection. I felt that that individual had to be a person of integrity who was a dedicated video-game historian and who had an existing, detailed knowledge of much of the early 1970s hardware, starting with the Magnavox Odyssey 1TL200. Preferably, he or she would also be someone who understood the technology and who was intensely interested in the details of the genesis of video games.

The only person I knew who qualified under these preconditions was David Winter in Paris, France. I began corresponding frequently by email with David starting in 1998. He was then in his mid-twenties, had a background in electrical engineering and software, and had already amassed a substantial collection of classic video-game consoles. What is more, he turned out to be that rare individual who is a true collector, intensely interested in the history of video games. He was then already exceptionally knowledgeable about the technical and marketing details of video-game hardware from the early 1970s.

And in many ways, he reminded me of myself. I came back to the U.S. from Paris as a GI in 1946, World War II being over, and brought eighteen tons of foreign small arms home with me. I had been teaching the use of some of these weapons to GIs and in the process was thoroughly drawn into understanding those weapons’ heritage and technology. I became the self-motivated collector of, and resident expert on, foreign small arms for the War Department. This intense experience made me intimately familiar with what it takes to be a collector—and I recognized those traits in David Winter.

With my health deteriorating, I decided to invite David to accompany me on a trip to Chicago in June 2002 to try to locate the missing TV game hardware items. Unfortunately, in two long, hot, and dusty days at the storage facility, we located only one hardware item, but we did find hundreds of photocopies of documents that shed new light on the early history of video games. For example, we found records of the pre-Odyssey product consumer acceptance tests carried out by Magnavox, as well as many internal documents that fleshed out the details of how the home-console video-game business really got started at Magnavox. With the availability of all that data, we had moved a big step forward in our mission to create a detailed historical track of who did what, when, and where to make the home-console video-game industry a reality.

The next year, David went to Chicago again, this time on his own, to take one more crack at the storage boxes in an effort to find Harrison’s, Rusch’s, and my original documents. He was eminently successful, locating the standard-issue Sanders notebooks kept by Bill Rusch and Bill Harrison during 1967 and 1968 as well as the original loose-leaf papers containing my 1966-72 notes, including the cover sheet and four-page document that I wrote on September 1, 1966. In that document I had laid out the concept of playing games using an ordinary TV set and proposed a lot of game ideas, thus making that document the closest thing to the Magna Carta of the home video-game industry. David also found the original guest book used by Magnavox in May 1972 to sign in visitors at its San Francisco showroom. There Nolan Bushnell, later president of Atari, signed that book and played the Odyssey Ping-Pong game hands-on. That act became the genesis of the arcade game Pong.

Baer demonstrating his original video game equipment in 2003. © 2003 Smithsonian Institution; photo by Jeff Tinsley.

In October 2002 I had contacted David Allison, chair and curator of the Division of Information Technology and Communications at the Smithsonian’s National Museum of American History, and we agreed, in principle, that the Smithsonian would become the repository for the game apparatus, as well as some of the supporting photocopied documentation. Finding and donating the original documents was not even a remote possibility at the time. But in January 2006 I sent the newly located original documents to the Smithsonian. These have been digitized by the Lemelson Center and are available as part of the finding aid to the materials that I have donated.

The search goes on for additional documents relevant to the 1966-75 period that brackets the conceptual, developmental, and early production and marketing years of the nascent console video-game industry. As they surface, I expect to add them to the material already in the custody of the Smithsonian. Through the work of the Lemelson Center’s Modern Inventors Documentation (MIND) program, the fragile historical record representing my nearly lost legacy and that of other inventors as well is preserved and accessible.

For more about Ralph Baer, watch the video of his visit to Spark!Lab in 2009 and listen to our 2007 podcast.

A Twist of Fate: The Invention of the Rubik’s Cube

Happy 40th birthday, Rubik’s Cube!

I’ve practically grown up with the toy, which I first encountered around 1981 when my elementary school classmate Matt dazzled us with his ability to solve it in mere minutes while the rest of us struggled to master the 3×3 cube. We didn’t have the advantage of online instructions or videos to give us helpful tips, since we didn’t have the World Wide Web yet. So puzzle-loving kids and adults invested hours solemnly twisting the cube segments over and over again. This was at the height of the toy’s popularity in the U.S., which quickly waned but never quite died.

Original Rubik’s cube prototype.

Today the toy and its inventor are celebrated in Beyond Rubik’s Cube, a traveling exhibition at the Liberty Science Center in New Jersey. Born 70 years ago on July 13, 1944, Hungarian Ernő Rubik is the man behind the Cube. His mother, Magdolna Szántó, was a poet and his father, Ernő Sr., was an aircraft engineer known for his glider designs. He said of his father: “Beside him I learned a lot about work in the sense of a value-creating process which has a target, and a positive result too.” (1) Young Ernő studied sculpture, design, and architecture in Budapest and eventually became a professor of architecture.

In 1974 he thought up the idea for the Rubik’s Cube in order to help teach 3-dimensional design to his students. Initially, he created a 3x3x3 rotating cube out of wood. “There was a workshop in the school, and I just used wood as a material because it is very simple to use and you don’t need any sophisticated machines. So I made it by just using my hands—cutting the wood, drilling holes, using elastic bands and those kind of very simple things.” (2) The following year he applied for a patent, which he received in 1977. Since this was Soviet-era Hungary, when the “Iron Curtain” divided Eastern and Western Europe, Rubik’s options were limited for manufacturing and marketing his invention. He worked with a small Hungarian company, Politechnika, to start selling colorful plastic versions of his “Bűvös Kocka,” translated into English as “Magic Cube.”

A disassembled Rubik’s cube, via Wikimedia Commons.

Bűvös Kocka packaging.

Rubik’s big breakthrough came when an expat Hungarian entrepreneur took the Magic Cube to the Nuremberg Toy Fair in Germany in 1979. There Tom Kremer, who owned a games and toys company called Seven Towns Ltd., saw the Cube and believed it could be a great success on the toy market if he could just find the right company to license it. Fluent in Hungarian and English, Kremer negotiated a deal with the Ideal Toy Company, which renamed it the “Rubik’s Cube” and launched it on the international market in 1980.

Hungarian patent.

The Rubik’s Cube was an immediate worldwide sensation, winning many Toy of the Year awards in 1980 and 1981. Approximately 100 million were sold by 1982, but almost as quickly as it rose to fame the Cube seemed doomed to become a one-hit wonder. By 1986, The New York Times reported it had been “retired to the attic, the garbage heap and, with a bow to its elegance and ingeniousness, to the permanent collection of the Museum of Modern Art.” (3) However, the colorful toy never really disappeared, and over time it morphed into a popular culture icon. Today the number of Rubik’s Cubes sold worldwide is estimated at about 350 million.

Hungarian stamp honoring the Rubik’s cube.

In Hilton, New York, Northwood Elementary School students are petitioning to get the Cube inducted into the National Toy Hall of Fame. “The project started in Jenny Ames’ and Julie Fiege’s sixth grade classes in November… Students worked in groups to pick a toy that they thought should be inducted, conducted research and then presented their argument to a panel of judges…The presentation included criteria set by the Hall of Fame—icon status, longevity, discovery and innovation. The Rubik’s Cube won! So now the entire C Core—six teachers and 160 students—is working to get the Cube nominated for its 40th birthday this year.” (4) With luck, the Rubik’s Cube will win induction into the Toy Hall of Fame in November, which would also honor Ernő Rubik’s 70th birthday.

Northwood Elementary School students.

For teachers and families, there is now an educational program called “You Can Do the Rubik’s Cube” focusing on math learning and 21st Century Skills. As Ernő Rubik said, “If you are curious, you’ll find the puzzles around you. If you are determined, you will solve them.” (5)



Inspiring Inventor: Stephanie Kwolek (1923-2014)

I want to pay homage to one of our favorite inventors, Stephanie Kwolek, who passed away June 18 at the age of 90. The DuPont chemist who invented Kevlar®, Kwolek came to the Lemelson Center in 1996 to participate in an “Innovative Lives” program, speaking with middle-school students about her childhood inspirations, life, and career. We were so intrigued by her personal and professional stories, and the impact of her invention, that we highlighted her in the Center’s “She’s Got It: Women Inventors and Their Inspirations” video, podcast, and educational materials. We also prominently featured her in our award-winning exhibition Invention at Play.

Kwolek with Kevlar Button.

Evaluations showed that, of the diverse inventors in Invention at Play, Kwolek was the most inspiring for museum visitors of all ages and backgrounds. They were impressed by the fact that she was a female inventor who started working at DuPont in 1946, when few women were hired as scientists. They were also impressed, of course, by her important invention of the 1960s. The polymer fiber that Kwolek created―Kevlar®―is very lightweight, stiff, and, pound for pound, five times stronger than steel! It’s also chemical- and flame-resistant. Today Kevlar® is used in bullet-resistant vests, cut-resistant gloves, fiber-optic cables, helmets, tires, sports equipment, and even the International Space Station. If you look around your home or office, you’re bound to have at least one product that contains Kevlar.

Kwolek with Kevlar products.

International Space Station. Via Wikimedia Commons.

Kwolek earned many important awards and professional accolades, including induction into the National Inventors Hall of Fame in 1995, the National Medal of Technology in 1996, and the Lemelson-MIT Lifetime Achievement Award in 1999. As our senior historian Joyce Bedi said, “She was a wonderful person and an inspiration to many, especially young women interested in science and invention.” We were indeed lucky to have known her.

Kwolek, age 3, on a horse.

Concerned about Inequality? Blame the Ancient Coppersmiths.

The following is a guest post by Edward Tenner, a senior research associate of the Lemelson Center and author of Why Things Bite Back and Our Own Devices.

More than 50 years ago, the Israeli archaeologist Pessah Bar-Adon discovered a trove of over four hundred copper objects and other priceless artifacts in a cave high in the cliffs overlooking the Dead Sea. The Polish-born scholar, who had lived for years among the Bedouin, had uncovered the “Cave of the Treasure” by a dry riverbed known as Nahal Mishmar. Originally chosen for its remoteness, the site was used by a vanished civilization over six thousand years ago. The artifacts helped define the Chalcolithic or Copper Age of technology (4500-3600 BCE), first recognized in the early twentieth century. This era marked a transition from the Stone Age to metallurgy, which brought with it the rise of villages controlled by chiefs, an expansion of agriculture, and the development of specialized crafts on an unprecedented scale.

Crown with Building-Façade Decoration and Vultures. Copper.
H. 17.5 cm; Diam. 16.8 cm. Naḥal Mishmar, 4500–3600 BCE. Israel Antiquities Authority: 1961-177, exhibited at the Israel Museum, Jerusalem. Collection of Israel Antiquities Authority. Photo © The Israel Museum, by Ardon Bar Hama

Masters of Fire: Copper Age Art from Israel, an exhibition currently on view at the New York University Institute for the Study of the Ancient World (it will travel next to the Legion of Honor Museum in San Francisco), features stunning and often enigmatic objects (like anthropomorphic and zoomorphic ossuaries that held the bones of the dead) from Nahal Mishmar and other Chalcolithic-period sites. Beyond examining the artistry of the objects, the exhibition raises a fascinating question: Did mastery of metallurgy open the door to ancient, and thus modern, inequality?

Connections between metalworking and elitism have been made in other contexts. Investigations of the 5,000-year-old South Tyrol mummy Ötzi, for example, suggest that his copper-bladed axe was not only a highly efficient tool but a status symbol available solely to a small number of elite males. While the Nahal Mishmar hoard yielded clearly practical copper objects—such as heads of the maces that were the characteristic weapon of the period—it also included finely worked “scepters,” “crowns,” and vessels that suggest the wealth and prestige of those reburied there, and the riches of their temples.

Mace Head with Vertical Rows of Protruding Knobs. Copper. H. 7.3 cm; Diam. 3.3 cm. Naḥal Mishmar, 4500–3600 BCE. Israel Antiquities Authority: 1961-108, exhibited at the Israel Museum, Jerusalem.
Photography by Elie Posner © The Israel Museum, Jerusalem.

Scepter with Grooved Shaft and Four Horned Animal-Head Finials. Copper. H. 8.2 cm; Diam. (Shaft) 1.8 cm. Naḥal Mishmar, 4500–3600 BCE. Israel Antiquities Authority: 1961-86, exhibited at the Israel Museum, Jerusalem.
Photography by Elie Posner © The Israel Museum, Jerusalem.

Whatever the objects mean and however they were used, the cost of obtaining, transporting, and smelting ore, and working copper—not to mention military protection of the new wealth—promoted inequality and social stratification. As Thomas E. Levy notes in the handsomely illustrated exhibition catalog, it took thirty-five hours of smelting time and fifty hours of work to produce a single copper axe. An egalitarian Neolithic society could not finance such skilled and specialized production without changing its character. “Once this transition was put in place, by the early fourth millennium BCE,” Levy writes, “there was no returning.”

If you’re in New York through this weekend, or in the Bay Area this summer or fall, don’t miss this mesmerizing exhibition (there’s an excellent review by Edward Rothstein of the New York Times).

Inventing a No-Chip Manicure

I used to think manicures were only for elegant ladies who walked poodles, had afternoon tea, and were always perfectly coiffed. In other words, not me. If it’s raining outside, even a little bit, I’m the person who gets drenched in spite of using an umbrella. If I had a choice between spending time in a salon and hiking with a guidebook, I’d choose the woods. And every time I’ve gotten a manicure—without fail—I’ve chipped it the same day or shortly thereafter.

Inventor Hedy Lamarr. Image credits: Wikimedia Commons.

All of this added up to skepticism about a no-chip nail innovation that was recommended to me: the shellac manicure. The shellac nail polish and manicure system is credited to Creative Nail Design, or CND, a company that spent five years testing and improving the product before releasing it to the market. Unlike a regular manicure, shellac lasts up to two weeks and is touted as chip-free.

Image Credit: Wikimedia Commons

So how does it work? There isn’t much professionally written about its exact science, but the general idea is that the specially formulated shellac nail polish is applied like normal polish. Nails are then cured by placement under a UV lamp after each coat. And unlike a normal manicure, nails are dried and ready to go immediately, which helps a lot when you’re fishing around in your purse for money to pay.

On a related note, while researching this blog post, I came across the story of a scientist named Hope Jahren, who hacked Seventeen magazine’s #ManicureMonday on Twitter in fall 2013. #ManicureMonday is traditionally a place for girls and women to post images of their manicures, but Hope wanted to show girls that it’s not just how their hands look that matters, but what they do with them. So she and other scientists tweeted images of their manicured hands doing all sorts of fun, science-related stuff. The Smithsonian has gotten in on the fun, showcasing the great work being done behind the scenes at our various museums and research centers. I think this is a great message to send girls and women who are future scientists, inventors, and innovators—that you can have fun with fashion and be a serious, smart professional in a STEM field.

A scientist takes part in #ManicureMonday. Image credit: Sara Kross (@Sara_kross), Twitter.

Like many successful inventions, the shellac manicure has made my life easier and is a vast improvement over easily chipped nail polish. There seem to be continual updates to the system and more variety in colors and textures. Maybe you’ll see me on the next #ManicureMonday.


The Cell Phone’s True Magic

“Any sufficiently advanced technology is indistinguishable from magic.” Thus pronounced Sir Arthur C. Clarke, the prophetic author of 2001: A Space Odyssey and many other science fiction classics, in his “Third Law” of 1973. The connection between technology and magic can be traced back at least to the 19th century, when electrical inventors Thomas Edison and Nikola Tesla were popularly known as wizards—a label that neither apparently tried to shed. In the era of Edison and Tesla, magic was part of the mystique of invention, and, frankly, an effective marketing strategy. Both electrical inventors were self-promoters fixated on reputation and market share.

The cover of The Daily Graphic of New York, 9 July 1879, pronounced Edison a “wizard.” SI negative #80-18655

Sir Arthur’s statement prompted me to reflect on whether, and how much, times have changed. It seems to me that we are thoroughly accustomed and adapted to today’s electrical devices that have transformed our public and personal lives. They no longer astonish. Yet these gadgets are more powerful, mysterious, and “magical” than ever. Their inner workings are infinitely miniaturized and shrouded in impenetrable disposable boxes. We are increasingly reliant on latter-day magicians, a priesthood of technicians, to mediate between them and us.

Of all the devices that surround us, the cell phone may qualify as the most magical. Yet, unlike other disruptive technologies—the telegraph, telephone, and light bulb—few people are familiar with the name of its inventor, Martin “Marty” Cooper. I had the privilege of interviewing him in a public forum last month. (Watch the interview on C-SPAN3).

The author (left) interviews cell phone inventor Martin Cooper. Photo by Chris Gauthier.

While working at Motorola, Cooper introduced the public to the first true cell phone in 1973 (the year that Clarke handed down his Third Law), and brought forth what is surely the most ubiquitous technology on the planet. It is estimated that the number of cell phones in use in 2014 will actually exceed the world population of seven billion. A game-changing technology by any measure, it has become the continually evolving platform for a dizzying array of novel devices whose social impacts are incalculable. We humans are the ultimate social animals and, for good or ill, the cell phone is the perfect tool for addressing our social needs—and neediness.

Yet Marty Cooper refuses to mystify himself or his amazing accomplishment. One of the most down-to-earth people I have ever met, the trim white-haired Cooper is above all a teacher, devoted to building bridges between the public and technology.

During our interview last month, I asked Cooper about his “Eureka moment” in the discovery of the cell phone. He immediately rejoined that there never was such a moment—and added that I should not have expected one. Like almost all major inventions, he said, the cell phone was the culmination of many small, often anonymous improvements made over a long period. He gave ample credit to his coworkers at Motorola as well as to other engineers, and stressed that he never came forth with the mythical “Aha!” It was not magic, but plain hard work. The greatest challenge was reducing the size of the phone, called the DynaTAC, from that of a 2.5-pound brick (hence the nickname). Even after the major breakthrough of the invention of the integrated circuit, shrinking the handset was a long, hard slog.

Two early Motorola “brick” phones and an early flip phone. Photo by Chris Gauthier.

Well, I asked, what about your April 3, 1973, public demonstration of the first cell phone in New York City? Was it anything like Samuel Morse’s first telegraph message, “What hath God wrought?,” a line borrowed from the Bible? No, his was hardly a mystical moment, Cooper remembered. He made the call not to a colleague, but to Joel Engel, his rival at AT&T, who was then working on a cellular car phone. Known for his puckish sense of humor, Marty said: “Joel, I’m calling you from a cellular phone, a real cellular phone, a handheld, portable, real cellular phone,” making sure that Engel got the point.

Another major thing Cooper wanted us to know is that a cell phone is not a phone at all. Unlike illusionists who try to distract our gaze when they perform their tricks, he pointed out exactly where we should look. He explained that the cell phone is a special kind of radio, tracing its lineage not to Bell but to radio pioneer Guglielmo Marconi (that’s why the sound you hear when you first turn on your handset isn’t the once-familiar dial tone but the hiss of radio static). As a communications engineer, he is clearly proud of this lineage.

As for his greatest accomplishment, Cooper is quoted as saying the “personal telephone [is] something that would represent an individual, so you could assign a number not to a place, not to a desk, not to a home, but to a person.” The capability of calling an individual rather than a place, he insists, was the real value of his technical breakthrough. AT&T was focused on the car phone—place, not person. Told repeatedly that the personal cell phone was impossible, Cooper prides himself on his perseverance and faith in himself. No surprise then that he puts great stock in the cell phone as an agent of human individuality in our mass culture. That is its essence and true magic.

Cooper holds up his invention. Photo by Chris Gauthier.

In May, the Lemelson Center will join with Smithsonian Magazine and the Arthur C. Clarke Center for Human Imagination to explore themes of invention both magical and factual. The 2014 “The Future Is Here” festival will focus on “Science Meets Science Fiction: Imagination, Inspiration, and Invention.” We are inviting the public to explore with us the tantalizing realm where the real, the imagined, and the illusory meet, if not in the “Twilight Zone,” at least in the same neighborhood. “Science Meets Science Fiction” will open new vistas on society’s future by highlighting the visionaries in science, invention, and science fiction who epitomize human imagination and creativity. We invite you to join us. But we’ll ask you to silence your cell phones.

Inventing the Modern Organic Farm

As I sliced into a perfectly ripe, farm-fresh, red tomato, thoughts of a hot summer day flashed in my head. To me, there is nothing more satisfying than a juicy, salty, sweet tomato when the August sun is high in a cloudless sky. But it was late May, the temperature was a cool 45 degrees, and this wasn’t a typical tomato. It was grown during the coldest months of winter on a windswept peninsula off the coast of Maine, and it wasn’t grown using pesticides or chemical fertilizers. And guess what? It tasted absolutely divine.

Tomatoes just like these German Johnsons can be grown year-round in an unheated greenhouse. Photo courtesy of Johnny’s Selected Seeds.

“I’ve always been fascinated by the word ‘impossible,’” says Eliot Coleman, the pioneer farmer behind this tomato. It’s a fascination that has led Coleman to invent, create, and innovate tools and techniques that have taken on the “impossible” in organic farming. His innovations have been instrumental in changing the way people grow food through the coldest winter months. Indeed, without Coleman, the White House probably wouldn’t be growing greens in December.

American consumers’ eating habits are changing, and the latest iteration of the US Department of Agriculture’s Farm Bill reflects that. It’s considered to be one of the most progressive farm bills to come out of Washington in decades. With significant growth in spending on local and regional food systems (from $10 million annually to $30 million), and a new emphasis on organic foods, the 2014 Farm Bill—signed by President Obama in February—goes a long way toward supporting the small farmer. Many of the ideas proposed in the bill find their roots in the early organic revolution of the 1960s, which was led, in part, by Eliot Coleman.

The son of a Manhattan stockbroker, Coleman came to farming by happenstance. After graduate school in Vermont, he found himself teaching Spanish at a college in New Hampshire, where he met his first wife, Sue. One day while shopping in a general store, Eliot came across the book “Living the Good Life” by Helen and Scott Nearing. Struck by the Nearings’ experience living “off the grid” in mid-coast Maine, Coleman was inspired to seek out a similar adventure of his own. He and Sue left New Hampshire in 1968 with $5,000 in savings and bought a piece of property from the Nearings in Harborside, Maine. There, with not a structure in sight, some of the least promising soil imaginable for growing crops, and nothing but a few hand tools and boundless energy, the Colemans began what would eventually become Four Season Farm, and a new organic year-round farming philosophy emerged.

But Eliot Coleman wouldn’t say that there was anything innovative about the way he approached organic farming. He’d say that it was simply an extension and adaptation of farming techniques practiced throughout Europe and the Americas before the advent of industrial farming. The old ways of doing things emphasized ecosystem management: compost, crop rotation, and naturally occurring soil nutrients.

“Using compost and natural systems to grow food was so simple,” he says. “The world’s best fertilizer, compost…is made for free in your backyard from waste products. The soil, the natural world was giving me everything I needed as inputs for this system. This place really is well designed, isn’t it? And it’s only because an awful lot of people haven’t been paying attention to [the fact that the natural world is well designed] is why we have difficulties.”

Yet part of what makes Eliot Coleman innovative is the disdain and skepticism with which he views many modern trends in farming, such as reliance on chemical fertilizers, monocropping, and industrial-scale tools. Central to his philosophy is the conviction that there is far more value in diversity and sustainability.

Coleman began his farm by clearing the land by hand and working to make the rocky, acidic soil more balanced and fertile. It was a slow process, one acre giving way to two acres and so on—a process that continues to this day. Along the way there have been countless challenges, giving Coleman many opportunities to be creative in finding solutions.

For example, how do you weed between 30-foot rows of lettuce quickly and without breaking your back? This was a problem Eliot took on head-on, and he devised the Collinear Hoe:


The Collinear Hoe, from Johnny’s Selected Seeds, a garden and farm supply company that Eliot Coleman works closely with to develop his ideas into production models. Photo courtesy of Johnny’s Selected Seeds.

Watch Eliot Coleman demonstrating how to use the Collinear Hoe here:

Or, how about a quick way to incorporate the right amount of compost within your soil so your compost isn’t too deep or too clumpy? Well, hook up a cordless drill to a tiller with small tines and you get Coleman’s “tilther.” What used to take 25 minutes now takes five.

Tiller mixing compost into soil.

Eliot Coleman prepares a bed in the garden using his invention, the Tilther, to mix compost into the soil. Photo courtesy of Johnny’s Selected Seeds.

Mr. Coleman shares Benjamin Franklin’s belief that “As we benefit from the inventions of others, we should be glad to share our own…freely and gladly.” So, he was never interested in obtaining patents for his inventions. He just wanted a tool that would make farm work a little easier. Any ideas he had, he gave to an engineer or manufacturing company so that they could perfect the tool or product. That way, Eliot and his farmer friends could all benefit from it.

Perhaps his most significant contribution to small-scale commercial organic farming is the moveable hoophouse. The latest iteration is the New Cathedral Modular Tunnel, a structure that allows users to grow crops in progression with the seasons. When one area of the garden needs to be covered, the tunnel or greenhouse is lifted by four people and moved, or pushed along tracks that run the length of the fields. This invention is what allows Eliot to grow juicy red tomatoes all year long.

Putting up frames for modular greenhouse.

Eliot Coleman poses with his daughter Clara Coleman at Four Season Farm in Harborside, Maine. The two have just completed framing part of the 14’ Gothic Modular Moveable Tunnel, based on Mr. Coleman’s designs, September 2013. Photo courtesy of Johnny’s Selected Seeds.

The latest innovation Mr. Coleman has helped usher in is a tool called the Quick Cut Greens Harvester, which, like the tilther, uses a cordless drill as its motor. Most exciting about this invention, which makes harvesting fresh salad greens much easier than the old method of cutting by hand, is that it was invented by a 16-year-old named Jonathan Dysinger, who visited Four Season Farm and was encouraged and inspired by Mr. Coleman to pursue the idea.

Watch Eliot Coleman demonstrate the harvester here:

Eliot Coleman’s contributions to small-scale and organic farming are numerous. From his philosophy to the methods and tools that make it a viable business, rejecting the conventional and daring to try the impossible are the hallmarks of his work and legacy.


Remembering Apple’s “1984” Super Bowl ad

Today marks the 30th anniversary of Apple’s famous “1984” television ad, which aired on January 22, 1984 during the third quarter of Super Bowl XVIII between the Los Angeles Raiders and Washington Redskins. Historian Eric Hintz describes how the “1984” ad and the introduction of the Apple Macintosh were key milestones in both the history of computing and the history of advertising.

The Super Bowl is a cultural event that attracts the attention of more than just football fans. In 2013, Super Bowl XLVII was the third-most-watched telecast of all time, with an average viewership of 108.7 million people. With so many eyeballs tuned in, advertisers bring out some of their best work, and casual fans tune in for the groundbreaking TV commercials as much as for the game. Who could forget Steelers Hall of Famer “Mean” Joe Greene selling Coca-Cola (1979) or the Budweiser guys turning “Wassuuuup?!?” (2000) into everyone’s new favorite catchphrase? However, Apple’s “1984” ad during Super Bowl XVIII is arguably the most famous Super Bowl commercial of all time.

In 1983, the personal computing market was up for grabs. Apple was selling its Apple II like hotcakes but was facing increasing competition from IBM’s PC, its “clones” from makers like Compaq, and rivals such as Commodore. Meanwhile, Apple, led by Steve Jobs, was busy developing its new Macintosh computer. Remember that in 1983, most businesses and governments still employed large, expensive, and technically intimidating mainframes. And while the first personal computers of the early 1980s were smaller and less intimidating, they still greeted users with a black screen, green text, and a terse C:\> command prompt.

Drawing inspiration from the pioneering Xerox Alto and improving on the underperforming Apple Lisa, Jobs and the Apple team built the Apple Macintosh with several revolutionary new features we now take for granted. A handheld input device called a “mouse.” A graphical user interface with overlapping “windows” and menus. Clickable pictures called “icons.” Cut-copy-paste editing. In short, Jobs and his team were creating an “insanely great” personal computer that was intuitive and easy to use—one he hoped would shake up the PC market. At the same time, Apple had recently lured marketing whiz John Sculley away from Pepsi to be the firm’s new chief executive. Sculley, who had masterminded the “Pepsi Generation” campaign, raised Apple’s ad budget from $15 million to $100 million in his first year.

Apple Macintosh (“classic” 128K version), 1984, catalog number 1985.0118.01, from the National Museum of American History.

Apple hired the Los Angeles advertising firm Chiat/Day to launch the Macintosh in early 1984; the account team was led by creative director Lee Clow, copywriter Steve Hayden, and art director Brent Thomas. The trio developed a concept inspired by George Orwell’s dystopian novel, 1984, in which The Party, run by the all-seeing Big Brother, kept the proletariat in check with constant surveillance by the Thought Police. In the ad, IBM’s “Big Blue” would be cast as Big Brother, dominating the computer industry with its dull conformity, while Apple would rewrite the book’s ending so that the Macintosh metaphorically defeats the regime. To direct the commercial, Chiat/Day hired British movie director Ridley Scott, who’d perfected the cinematic look and feel of dystopian futures in Alien (1979) and Blade Runner (1982). The 60-second mini-film was shot in one week at a production cost of about $500,000. Two hundred extras were paid $125 a day to shave their heads, march in lock-step, and listen to Big Brother’s Stalinist gibberish. Shot in dark, blue-gray hues to evoke IBM’s Big Blue, the only splashes of color were the bright red running shorts of the protagonist, an athletic young woman who sprints through the commercial carrying a sledgehammer, and Apple’s rainbow logo. The commercial never showed the actual computer, but ended with a tease: “On January 24th, Apple Computer will introduce Macintosh. And you’ll see why 1984 won’t be like ‘1984.’”

Scenes from Apple’s “1984” Super Bowl advertisement.

When shown the finished ad in late 1983, Apple’s board members hated it. Sculley, the Apple CEO, instructed Chiat/Day to sell back both the 30- and 60-second time slots they’d purchased from CBS for $1 million, but the agency was able to unload only the 30-second slot. Apple was faced with the prospect of eating the $500,000 production cost of an ad that could really only air during calendar year 1984, so it swallowed hard and let the ad run once during the third quarter of the Super Bowl. Some 43 million Americans saw the ad, and when the football game returned, CBS announcers Pat Summerall and John Madden asked one another, “Wow, what was that?”

The ad, of course, was a sensation. The commercial’s social and political overtones held particular resonance in the mid-1980s, as the United States and Soviet Union were still engaged in an ideological Cold War. And, like Lyndon Johnson’s famous “Daisy” ad from the 1964 presidential campaign, the ad aired only once in primetime, but was replayed again and again on the network news that evening as the ad itself became a buzz-worthy source of free publicity. But even the mystique of the single airing wasn’t entirely true. Chiat/Day had quietly run the ad one other time, at 1 a.m. on December 15, 1983 on KMVT in Twin Falls, Idaho, so that the advertisement qualified for the 1983 advertising awards.  As expected, the ad won several prestigious awards, including the Grand Prize at the Cannes International Advertising Festival (1984) and Advertising Age’s 1980s “Commercial of the Decade.” But the ad’s most enduring legacy is that it cemented the Super Bowl as each year’s blockbuster moment for advertisers and their clients.

While the ad aired during the Super Bowl on January 22, it merely pointed to Macintosh’s official debut two days later. On January 24, 1984, Apple held its annual shareholders meeting at the Flint Center auditorium on the campus of De Anza College, just a block from Apple’s offices in Cupertino, California. After dispensing with the formalities of board votes and quarterly earnings statements, the real show began. Steve Jobs walked on stage in a double-breasted suit and bow tie and rallied the troops by tweaking his chief rival: “IBM wants it all and is aiming its guns on its last obstacle to industry control, Apple.  Will Big Blue dominate the entire computer industry, the entire information age?  Was George Orwell right?”

Jobs then presented perhaps the greatest new product demonstration in history. He walked over to a black bag, unzipped it, and set up the Macintosh to wild applause. Then he inserted a floppy disk and began the demonstration of the Mac’s windows, menus, fonts, and drawing tools, all set to the stirring theme from Chariots of Fire. Then, the Mac spoke for itself: “Hello, I am Macintosh…”

So when you watch the Super Bowl on February 2 this year, it’s possible that the ads will overshadow the game. And for that you can thank Apple’s Macintosh, Chiat/Day and “1984.”

Innovating New Traditions

As Thanksgiving approaches, our thoughts naturally turn to traditions—national traditions like the Macy’s Thanksgiving Day Parade and our own personal traditions, which in my family means kielbasa and apple pie, going to the local Christmas tree farm, and my family members pretending to be shocked when I decline a serving of carrots for the 28th year in a row. (And, of course, my mother’s mashed potatoes, over which I rhapsodized in a previous post.)

Woodcut of a turkey

Woodcut, The Marchbanks Calendar–November by Harry Cimino. Smithsonian American Art Museum.

We all have traditions, but where did they come from? When we deep-fry the turkey or add a spiral ham to the menu, it may not seem particularly innovative. But the technology behind these yummy traditions had to come from somewhere. While doing some Thanksgiving-inspired Googling, I came across this fun video from History on the invention of deep-fried turkeys, turduckens, and honey baked hams:

While we may not know who invented the deep-fried turkey, we can take a look at Harry Hoenselaar’s patent (US Patent No. 2,470,078) for an “apparatus for slicing ham on the bone.” Hoenselaar’s invention was ingeniously created out of various objects found around his home—a pie tin, brackets, a hand drill, and a broom handle, to name a few. The patent application reads:

In the meat industry there is a large market for sliced meats, particularly for ham slices, but the bone construction and the shape of a ham is such that no wholly satisfactory method of slicing it exists. This statement also applies to legs of lamb and other like cuts of meat.

It is an object of the invention to provide a method and a machine for slicing ham and other joints, which are of exceptional efficiency in operation. Another object of the invention is to prepare ham for the market in a new and superior form.

Millions of spiral cut hams are sold every year, so I believe we can safely say that Hoenselaar accomplished what he set out to do—create an “efficient” ham.

Patent drawing of the ham slicing machine.

Patent drawing by Harry Hoenselaar.

So whatever your traditions are this Thanksgiving, enjoy the holiday!

And remember, when frying a turkey, safety first!