I lived in Australia in the early 1980s, and became an honorary member of a wonderful family whose friendship I still cherish. They had a contraption that I had never seen before hooked to their television set—it turned the TV into a video game. Their youngest son, who was about 6, would goad me into playing the ping-pong game with him, because, honestly, I was hopelessly bad at it. I can still hear his squeals of laughter at my incompetence.
About a decade later, my nephew got his first game console and asked me to play. Even though I was a lot more computer-savvy by that point, I was still a pathetic opponent. I could never get the hang of the controls or understand the goals of these games. I really didn’t see what all the fuss was about.
Then in 2003, I met Ralph Baer, the “father of the home video game,” as he is often known. I was at his home on a collecting trip with colleagues from the National Museum of American History. Ralph was a warm, funny, charming, and marvelously creative individual who told us the now-familiar story of why he invented a way to play games on a television set. All you could do with a TV, he said, was turn it on and off or change the channel. Ralph thought that was a frustrating waste of good technology.
During the course of talking about papers and artifacts that Ralph would donate to the national collections, he showed us the “Brown Box,” the prototype for the first multiplayer, multiprogram video game system. Of course, he challenged me to a game of ping-pong. But losing to the inventor of the game somehow didn’t sting as much as losing to a 6-year-old Australian.
On the day I learned of Ralph’s passing, I came across an article about two German graduate students who are working on a way to make pedestrian crossings safer—by giving people waiting for the “walk” signal the chance to play video games against their counterparts on the other side of the street.
I think Ralph would have been tickled by this testament to his legacy . . . but I dread the day when I will have to win a video game before I can cross the street.
The Lemelson Center notes, with great sadness, the passing of Ralph Baer, considered by many to be the “father of video games.”
Ralph Baer plays his Telesketch game, 1977.
In 1966, Baer transformed people’s relationship to home television by inventing a way for them to interact with their sets, playing games like ping-pong, tennis, checkers, and more. His work led directly to the Odyssey in 1972, the first home video game for the consumer market, and launched what would become a multibillion-dollar industry, though no one predicted that at the time. That same year, a young Nolan Bushnell played Odyssey at a trade show. Bushnell went on to found Atari and create the arcade version of Baer’s ping-pong game, the now-famous Pong. Baer’s groundbreaking work shaped the leisure-time activities of a large segment of the world’s population and spawned numerous businesses, but the historical record of his achievements very nearly disappeared. In 2008, Baer told us the following story of how his original documents and apparatus were lost … and then found. They are now preserved in the Smithsonian’s National Museum of American History, where Baer’s home workshop will go on display next summer.
In the late 1990s I became aware that there was a growing community of classic video-game enthusiasts in the U.S. and elsewhere, and that collecting hardware (game consoles, accessories, etc.) was an ongoing, growing hobby. So I became very concerned about the fate of all of the TV game hardware, as well as its supporting documentation, that we had built at Sanders Associates in Nashua, N.H., from 1966 through 1969. Hence, I began to inquire into the whereabouts and recoverability of the developmental games and the many supporting documents that Bill Harrison, Bill Rusch, and I generated during that period.
Photocopies of some of that documentation, used in various lawsuits against infringers of our patents, had been in my possession, but that was about all. Fortunately, people at Sanders were very cooperative and agreed that this material should be preserved. So I took charge of the collection effort and went after the hardware because I knew where it was stored and because it was most vulnerable to loss or destruction. Tracking down original documentation took place several years later.
There was a series of video-game-related patent infringement lawsuits, starting in the mid-1970s and finally ending in the mid-1990s, and I was not only deposed frequently but often appeared as an expert witness in court. As a result, I had the chance to review our game-development documents over and over again during that twenty-year period. That fact would later be of great help to me in my new task of trying to locate and preserve both hardware and documentation.
Depositions for the last of the lawsuits took place in Chicago in 1997, and I found myself once again in the offices of Leydig, Voit & Mayer, an intellectual-property law firm. During that deposition, video was shot in the Leydig, Voit & Mayer offices (these VHS tapes were recovered in 2004 by my friend and associate, David Winter) showing me handling every one of the eight different models of TV games, as well as other apparatus, that we had built at Sanders in the 1960s. In July 1997 I got the recovery effort rolling and received about half of the games from Leydig, Voit & Mayer a year later; the other games could not be located.
Because of health issues, I needed someone to help me continue the effort of data collection. I felt that individual had to be a person of integrity who was a dedicated video-game historian and who had an existing, detailed knowledge of much of the early 1970s hardware, starting with the Magnavox Odyssey 1TL200. Preferably, he or she would also be someone who understood the technology and who was intensely interested in the details of the genesis of video games.
The only person I knew who qualified under these preconditions was David Winter in Paris, France. I began corresponding frequently by email with David starting in 1998. He was then in his mid-twenties, had a background in electrical engineering and software, and had already amassed a substantial collection of classic video-game consoles. What is more, he turned out to be that rare individual who is a true collector, intensely interested in the history of video games. He was then already exceptionally knowledgeable about the technical and marketing details of video-game hardware from the early 1970s.
And in many ways, he reminded me of myself. I came back to the U.S. from Paris as a GI in 1946, World War II being over, and brought eighteen tons of foreign small arms home with me. I had been teaching the use of some of these weapons to GIs and in the process was thoroughly drawn into understanding those weapons’ heritage and technology. I became the self-motivated collector of, and resident expert on, foreign small arms for the War Department. This intense experience made me intimately familiar with what it takes to be a collector–and I recognized those traits in David Winter.
With my health deteriorating, I decided to invite David to accompany me on a trip to Chicago in June 2002 to try and locate the missing TV game hardware items. Unfortunately, in two long, hot, and dusty days at the storage facility, we located only one hardware item, but did find hundreds of photocopies of documents that shed new light on the early history of video games. For example, we found records of the pre-Odyssey product consumer acceptance tests carried out by Magnavox, as well as many internal documents that fleshed out the details of how the home-console video-game business really got started at Magnavox. With the availability of all that data, we had moved a big step forward in our mission to create a detailed historical track of who did what, when, and where to make the home-console video-game industry a reality.
The next year, David went to Chicago again, this time on his own, to take one more crack at the storage boxes in an effort to find Harrison’s, Rusch’s, and my original documents. He was eminently successful, locating the standard-issue Sanders notebooks kept by Bill Rusch and Bill Harrison during 1967 and 1968 as well as the original loose-leaf papers containing my 1966-72 notes, including the cover sheet and four-page document that I wrote on September 1, 1966. In that document I had laid out the concept of playing games using an ordinary TV set and proposed a lot of game ideas, thus making that document the closest thing to the Magna Carta of the home video-game industry. David also found the original guest book used by Magnavox in May 1972 to sign in visitors at its San Francisco showroom. There Nolan Bushnell, later president of Atari, signed that book and played the Odyssey Ping-Pong game hands-on. That act became the genesis of the arcade game Pong.
In October 2002 I had contacted David Allison, chair and curator of the Division of Information Technology and Communications at the Smithsonian’s National Museum of American History, and we agreed, in principle, that the Smithsonian would become the repository for the game apparatus, as well as some of the supporting photocopied documentation. Finding and donating the original documents was not even a remote possibility at the time. But in January 2006 I sent the newly located original documents to the Smithsonian. These have been digitized by the Lemelson Center and are available as part of the finding aid to the materials that I have donated.
The search goes on for additional documents relevant to the 1966-75 period that brackets the conceptual, developmental, and early production and marketing years of the nascent console video-game industry. As they surface, I expect to add them to the material already in the custody of the Smithsonian. Through the work of the Lemelson Center’s Modern Inventors Documentation (MIND) program, the fragile historical record representing my nearly lost legacy and that of other inventors as well is preserved and accessible.
“The Rise of Innovation Districts” is a new report developed by Bruce Katz and Julie Wagner of the Brookings Institution’s Metropolitan Policy Program. Courtesy of the Brookings Institution.
In their report, Katz and Wagner trace what they call “a remarkable shift…in the spatial geography of innovation” away from the suburbs and back to cities. As documented by historians like Bill Leslie and Scott Knowles, many high-tech firms moved to the suburbs in the years after World War II, where they built sprawling, space-age corporate campuses and R&D facilities. Some of the best-known examples include the General Motors Technical Center (Warren, MI, in 1956), IBM’s Thomas Watson Research Center (Yorktown Heights, NY, in 1961), and AT&T’s Bell Laboratories (Holmdel, NJ, in 1962). The thinking at the time was to isolate industrial scientists from the suits and bean-counters at headquarters (and from competitors) by plopping them in an idyllic, university-like setting where they could invent cutting-edge new technologies.
In 1962, AT&T opened a sprawling 472-acre campus for Bell Labs in suburban Holmdel, NJ. It was designed by modernist architect Eero Saarinen, who created similar suburban campuses for General Motors (Warren, MI) and IBM (Yorktown Heights, NY). Photo courtesy of user MBisanz on Wikimedia Commons.
However, according to Katz and Wagner, by virtue of their suburban locations, these campuses were “accessible only by car, with little emphasis on quality of life or on integrating work, housing, and recreation.” Thus, “a new complementary urban model is now emerging” giving rise to what they call “innovation districts.”
These districts, by our definition, are geographic areas where leading-edge anchor institutions and companies cluster and connect with start-ups, business incubators, and accelerators. They are also physically compact, transit-accessible, and technically wired, and offer mixed-use housing, office, and retail.
In contrast to the post-war suburban campuses, think Kendall Square in Cambridge, MA; University City in West Philadelphia, PA; and the South Lake Union district in Seattle, WA. These urban districts feature big research universities (MIT, Univ. of Pennsylvania, Drexel) and big high-tech firms (Amazon, Microsoft) that serve as anchors for attracting additional high-tech startups, plus the housing and complementary retail businesses that support them.
As Bruce Katz argues in this short video, innovation is increasingly taking place where people come together, not in isolated spaces. Courtesy of the Brookings Institution, via YouTube.
Katz and Wagner explain some of the reasons for this geographic shift back to cities. First, in terms of demographics, married families with children—the residents most likely to enjoy the suburbs—now constitute less than 20% of all U.S. households. Young professionals and retired empty-nesters increasingly prefer to live in cities where they can walk to the gym, visit a museum after work, or meet a friend at a coffee shop or tavern.
But a more interesting observation concerns the changing nature of innovation strategies and why today’s innovators and entrepreneurs favor the inter-connectedness of cities. During the mid-20th century, the big high-tech firms pursued a linear model of innovation, in which they believed pure scientific research (R) would lead directly to the development (D) of marketable new technologies, all within the confines of the firm’s suburban R&D labs. That strategy worked for a while: researchers at Bell Labs, for example, developed transistors and lasers while earning 11 Nobel Prizes. But Bell Labs would eventually fall victim to the “campus curse” as the isolation and insularity of its pristine suburban labs slowed the pace of innovation. In fact, AT&T spun off Bell Labs to Lucent Technologies in 1996 (Lucent later became part of Alcatel-Lucent), and the famous Holmdel, NJ, laboratory, designed by modernist architect Eero Saarinen, now sits empty. It may be converted into a medical center…or razed.
Instead, most small and medium-sized businesses are increasingly turning toward an “open innovation” strategy, in which they develop some of their own in-house technologies, while also partnering with other firms to buy or license certain new inventions. As Katz noted in his talk, this mixed, open innovation strategy “craves proximity” and “extols integration” in order to make the necessary connections. So in terms of geography, it makes more sense to locate a startup in an urban innovation district where the density of development increases the odds of finding new ideas and technology partners—at the “incubator” next door or in a serendipitous exchange at the corner coffee shop.
Coffee shops (like Detroit’s Great Lakes Coffee) are now places for entrepreneurs to work and network. Photo credit: Marvin Shaouni, originally published in Model D, and featured in the Brookings report.
As Katz and Wagner note in their full report, these 21st-century innovation districts are in many ways a return to the original 19th- and 20th-century industrial districts that flourished in the first wave of industrialization. In these districts, firms from the same industry would cluster in a city neighborhood as workers walked to work and patronized local businesses. We see this clearly in some of our Places of Invention exhibition case studies. For example, in 19th-century Hartford, skilled machinists in the Coltsville neighborhood lived in company-built housing and walked a few hundred yards to Samuel Colt’s famous armory, where they mass-produced revolvers with interchangeable parts. After their shift, they might walk next door to Charter Oak Hall to practice with the Colt Armory band, take in an evening lecture, or use the lending library while mingling with fellow workers. To use Katz and Wagner’s language, the Colt Armory was the “anchor firm” of the Hartford “innovation district,” which grew to include the Weed Sewing Machine Co., the Pope Manufacturing Co. (bicycles, automobiles), both the Underwood and Royal Typewriter companies, and Pratt & Whitney (machine tools), the last of which was a spin-off founded by two former Colt machinists. Similarly, in the 1950s several medical device firms clustered around two Twin Cities anchors—the University of Minnesota’s Variety Club Heart Hospital and Medtronic—in “Medical Alley” Minnesota.
A bird’s-eye view of “Coltsville,” 1877. This industrial village along the Connecticut River in Hartford included Samuel Colt’s famous onion-domed factory (foreground), and behind it, workers’ housing, a baseball field, and a church. To the right of the armory and below the church is Charter Oak Hall, where workers could engage in numerous leisure activities. A detail from the lithograph “City of Hartford” (1877) by O. H. Bailey, courtesy of The Connecticut Historical Society.
Katz and Wagner believe there is strong potential for the growth of innovation districts in several U.S. cities. Indeed, as new urban innovation districts emerge in cities like St. Louis, Detroit, and Boston, civic leaders would be wise to brush up on their history and learn lessons from earlier Places of Invention.
Chesbrough, Henry. Open Innovation: The New Imperative for Creating and Profiting from Technology. Boston: Harvard Business School Press, 2003.
Godin, Benoit. “The Linear Model of Innovation: The Historical Construction of an Analytical Framework.” Science, Technology & Human Values 31 (2006): 639–667.
I want to pay homage to one of our favorite inventors, Stephanie Kwolek, who passed away June 18 at the age of 90. The DuPont chemist who invented Kevlar®, Kwolek came to the Lemelson Center in 1996 to participate in an “Innovative Lives” program, speaking with middle-school students about her childhood inspirations, life, and career. We were so intrigued by her personal and professional stories, and the impact of her invention, that we highlighted her in the Center’s “She’s Got It: Women Inventors and Their Inspirations” video, podcast, and educational materials. We also prominently featured her in our award-winning exhibition Invention at Play.
Of the diverse inventors in Invention at Play, evaluations showed that Kwolek was the most inspiring for museum visitors of all ages and backgrounds. They were impressed by the fact that she was a female inventor who started working at DuPont in 1946, when few women were hired as scientists. Of course, they were also impressed by her important invention in the 1960s. The polymer fiber that Kwolek created―Kevlar®―is very lightweight, stiff, and, pound for pound, five times stronger than steel! It’s also chemical- and flame-resistant. Today Kevlar® is used in bullet-resistant vests, cut-resistant gloves, fiber-optic cables, helmets, tires, sports equipment, and even the International Space Station. If you look around your home or office, you’re bound to find at least one product that contains Kevlar®.
The following is a guest post by Edward Tenner, a senior research associate of the Lemelson Center and author of Why Things Bite Back and Our Own Devices.
More than 50 years ago, the Israeli archaeologist Pessah Bar-Adon discovered a trove of over four hundred copper objects and other priceless artifacts in a cave high in the cliffs overlooking the Dead Sea. The Polish-born scholar, who had lived for years among the Bedouin, had uncovered the “Cave of the Treasure” by a dry riverbed known as Nahal Mishmar. Originally chosen for its remoteness, the site was used by a vanished civilization over six thousand years ago. The artifacts helped define the Chalcolithic or Copper Age of technology (4500-3600 BCE), first recognized in the early twentieth century. This era marked a transition from the Stone Age to metallurgy, which brought with it the rise of villages controlled by chiefs, an expansion of agriculture, and the development of specialized crafts on an unprecedented scale.
Masters of Fire: Copper Age Art from Israel, an exhibition currently on view at the New York University Institute for the Study of the Ancient World (it will travel next to the Legion of Honor Museum in San Francisco), features stunning and often enigmatic objects (like anthropomorphic and zoomorphic ossuaries that held the bones of the dead) from Nahal Mishmar and other Chalcolithic-period sites. Beyond examining the artistry of the objects, the exhibition raises a fascinating question: Did mastery of metallurgy open the door to ancient, and thus modern, inequality?
Connections between metalworking and elitism have been made in other contexts. Investigations of the 5,000-year-old South Tyrol mummy Oetzi, for example, suggest that his copper-bladed axe was not only a highly efficient tool but a status symbol available solely to a small number of elite males. While the Nahal Mishmar hoard yielded clearly practical copper objects—such as heads of the maces that were the characteristic weapon of the period—it also included finely worked “scepters,” “crowns,” and vessels that suggest the wealth and prestige of those reburied there, and the riches of their temples.
Whatever the objects mean and however they were used, the cost of obtaining, transporting, and smelting ore, and working copper—not to mention military protection of the new wealth—promoted inequality and social stratification. As Thomas E. Levy notes in the handsomely illustrated exhibition catalog, it took thirty-five hours of smelting time and fifty hours of work to produce a single copper axe. An egalitarian Neolithic society could not finance such skilled and specialized production without changing its character. “Once this transition was put in place, by the early fourth millennium BCE,” Levy writes, “there was no returning.”
If you’re in New York through this weekend, or in the Bay Area this summer or fall, don’t miss this mesmerizing exhibition (there’s an excellent review by Edward Rothstein of the New York Times).
I used to think manicures were only for elegant ladies who walked poodles, had afternoon tea, and were always perfectly coiffed. In other words, not me. If it’s raining outside, even a little bit, I’m the person who gets drenched in spite of using an umbrella. If I had a choice between spending time in a salon and hiking with a guidebook, I’d choose the woods. And every time I’ve gotten a manicure—without fail—I’ve chipped it the same day or shortly thereafter.
All of this made me skeptical of a no-chip nail innovation that was recommended to me: the shellac manicure. The shellac nail polish and manicure system is credited to Creative Nail Design, or CND, a company that spent five years testing and improving the product before releasing it to the market. Unlike a regular manicure, shellac lasts up to two weeks and is touted as chip-free.
Image Credit: Wikimedia Commons
So how does it work? There isn’t much professionally written about its exact science, but the general idea is that the specially formulated shellac nail polish is applied like normal polish. Nails are then cured by placement under a UV lamp after each coat. And unlike a normal manicure, nails are dried and ready to go immediately, which helps a lot when you’re fishing around in your purse for money to pay.
On a related note, while researching this blog post, I came across the story of a scientist named Hope Jahren, who hacked Seventeen magazine’s #ManicureMonday on Twitter in fall 2013. #ManicureMonday is traditionally a place for girls and women to post images of their manicures, but Hope wanted to show girls that it’s not just how their hands look that matters, but what they do with them. So she and other scientists tweeted images of their manicured hands doing all sorts of fun, science-related stuff. The Smithsonian has gotten in on the fun, showcasing the great work being done behind the scenes at our various museums and research centers. I think this is a great message to send girls and women who are future scientists, inventors, and innovators—that you can have fun with fashion and be a serious, smart professional in a STEM field.
A scientist takes part in #ManicureMonday. Image credit: Sara Kross (@Sara_kross), Twitter.
Like many successful inventions, the shellac manicure has made my life easier and is a vast improvement over easily chipped nail polish. It seems like there are continual updates to the system and more variety in colors and textures. Maybe you’ll see me on the next #ManicureMonday.
It’s the time of year when mosquitos are hatching in preparation to swarm us, bite us, and make us itch. Mosquitos are, and always have been, not only an annoyance but also a major health risk, spreading potentially deadly diseases such as yellow fever, malaria, and West Nile virus. Many natural products, such as citronella candles, smoke, and various plant extracts like eucalyptus oil, have been used to repel mosquitos, with only modest results. The real breakthrough in repellents, however, came with the invention of DEET.
DEET is a synthetic repellent developed for the U.S. Army for use by military personnel in insect-infested areas. Inventor Samuel Gertler of the U.S. Department of Agriculture received a patent in 1946 for using DEET as an insect repellent in the form of a cream, lotion, or powder. DEET was not registered for use by the general public until 1957.
Researchers at the University of California, Davis, discovered that mosquitos find the smell of DEET unappealing and consequently avoid areas that smell like it. Many companies have created an array of insect repellent products containing varying concentrations of DEET, including sprays, sunscreens, wipes, and sticks. It is estimated that each year 78 million people in the U.S. and 200 million people globally use DEET[i].
Although DEET is generally considered the best mosquito repellent on the market, it is not without concerns. Even though the EPA has determined that it is only “slightly toxic,” products containing DEET have been reported to cause rashes and there have been some cases of children becoming ill from its use[ii]. In extremely strong doses, it is capable of melting plastic and nylon.[iii] Additionally, DEET is expensive for people in places that need it most—such as Africa. The results of a 2010 study by researchers who identified some DEET-insensitive mosquitos are also of concern. They found that the gene adaptation that makes mosquitos insensitive can be passed on to the next mosquito generation.
But have hope! Inventing a more effective synthetic mosquito repellent may be on the horizon. Researchers at the University of California, Riverside, have identified the olfactory receptors mosquitos use to detect and dislike DEET. They have also identified three compounds in natural products that mimic DEET. The research team’s leader, Anandasankar Ray, said that the compounds they identified “are approved by the Food and Drug Administration for consumption as flavors or fragrances, and are already being used as flavoring agents in some foods. But now they can be applied to bed-nets, clothes, curtains—making them ward off insects.”[iv] “One of them is present in plum,” he says. “The other is present in orange and jasmine oil. Some of them are present in grapes. And, as you can imagine, they smell really nice.”[v]
Unfortunately, the commercial development and production of such DEET-mimicking repellents are still several years away. So it seems the only comfort I’ll get from mosquitos this summer will come from the belief that an inexpensive, natural, and fully effective mosquito repellent will be invented in my lifetime.
“Citius, Altius, Fortius”—translated as “faster, higher, stronger”—is the motto of the modern Olympic Games. This phrase could also sum up the goals of scientists, engineers, and other inventors working with athletes to develop new and improved sports equipment, clothing, and even technical skills. At the Olympic Games, viewers around the world see the latest science and technology in action as skiers, skaters, and sledders take to the slopes, rinks, and tracks in Sochi, Russia, to compete for quadrennial Olympic glory. This post first appeared on O Say Can You See.
As summarized in the introduction to this year’s NSF-NBC video series, “this enlightening 10-part video collection, narrated by NBC Sports’ Liam McHugh, delves into the physics, engineering, chemistry, design and mathematics behind the ‘world’s foremost sporting event.’” The videos feature U.S. Olympians and Paralympians whose names you may know, alongside scientists and engineers whose important research has been funded by NSF. Complementary educational materials are provided for the budding scientists and engineers in our lives.
A poster from the first Winter Olympic Games, which were held in Chamonix, France, in 1924. Via Wikimedia Commons.
Personally, I am fascinated by the Winter Olympics, the first of which was held in Chamonix, France, in 1924. I generally favor the figure skating and alpine skiing events because I’ve experienced the pain and pleasure of trying them myself—so I played those videos first.
Skating is all about physics, which Olympic hopefuls like Gracie Gold and Ashley Wagner make look easy as they gracefully jump and spin on the ice. The skiing video features Julia Mancuso, who has won multiple Olympic medals, including a bronze in Sochi, and Heath Calhoun, an Iraq veteran and 2010 and 2014 Paralympics contender. The scientists and engineers behind them are stars, too, like Dr. Kam Leang of the University of Nevada, Reno, who uses nano-scale carbon tubes to help reduce vibration in skis.
A commemorative stamp for the 1932 Winter Olympic Games held in Lake Placid, New York. It is in the collection of the Smithsonian’s National Postal Museum.
Of course, during the Olympics TV marathon, I often end up watching less popular sports that are sometimes ignored in the U.S. during intervening years. The video about the engineering behind bobsledding, featuring U.S. team members Steve Holcomb and Steve Langton, piqued my interest in watching that sport more carefully. I didn’t realize that bobsledding is one of the most dangerous sports, and the video illustrates numerous issues of weight, stability, speed, and drag that engineers must address to meet the sport’s official requirements.
Bonnie Blair’s speed skin from the 1992 Winter Olympics in Albertville, France.
Check out Shani Davis’ cutting-edge speed skating suit in the video clip “Engineering Competition Suits.” Perhaps one day, he will donate his suit to the Museum to join Bonnie Blair’s speed skating suit from the 1992 Olympics, which was cutting-edge in its time. We also have a pair of Apolo Anton Ohno’s speed skates among other great Olympics-related objects in the Museum’s sports collections. Doubtless NBC Olympics coverage will mention more than once that Blair and Ohno are the most decorated U.S. Winter Olympic athletes, with six and eight medals respectively, while Davis has won two gold medals at the last two Olympics and is competing in Sochi for more.
Shaun White’s outfit and snowboard.
The short video about the physics of snowboarding featuring Shaun White reminded me a lot of skateboarding, which the Lemelson Center for the Study of Invention and Innovation featured during Innoskate 2013, albeit on a much smaller, temporary half pipe built simply as a demonstration stage. Interestingly, White is both a medal-winning skateboarder and snowboarder and competed in the latter sport in Sochi. I should note that the Museum’s sports collection includes a Burton snowboard donated by White as well as an accessible snowboard invented by then-students Nathan Connolly and Matt Capozzi, who were featured in the Lemelson Center’s Invention at Play exhibition.
An accessible snowboard invented by then-students Nathan Connolly and Matt Capozzi, who were featured in the Lemelson Center’s Invention at Play exhibition. (0174706).
If this year’s NSF-NBC video series just whets your appetite, be sure to watch their previous collaboration, the “Science of the Olympic Winter Games 2010,” with informational segments about the science behind skiing, ski jumping, ice skating, and more.
We can thank Baron Pierre de Coubertin for reinventing the Olympic Games starting in 1896. An aristocratic French educator, he was inspired by ancient Greek culture and also the opportunity to use sports as a way to encourage intercultural communication and trust. The three core values of the Olympic Movement are Excellence, Respect, and Friendship, the latter defined in part as “build[ing] a peaceful and better world thanks to sport, through solidarity, team spirit, joy, and optimism.” Hopefully this year’s games in Sochi will live up to these values that helped spawn this international sports festival 118 years ago.
U.S. stamp commemorating the centennial of the Olympic Games.
Today marks the 30th anniversary of Apple’s famous “1984” television ad, which aired on January 22, 1984, during the third quarter of Super Bowl XVIII between the Los Angeles Raiders and Washington Redskins. Historian Eric Hintz describes how the “1984” ad and the introduction of the Apple Macintosh were key milestones in both the history of computing and the history of advertising.
The Super Bowl is a cultural event that attracts the attention of more than just football fans. In 2013, Super Bowl XLVII was the third most-watched telecast of all time, with an average viewership of 108.7 million people. With so many eyeballs tuned in, advertisers bring out some of their best work, and casual fans watch for the groundbreaking TV commercials as much as for the game. Who could forget Steelers Hall of Famer “Mean” Joe Greene selling Coca-Cola (1979) or the Budweiser guys coining “Wassuuuup?!?” (2000) as everyone’s new favorite catchphrase? But Apple’s “1984” ad during Super Bowl XVIII is arguably the most famous Super Bowl commercial of all time.
In 1983, the personal computing market was up for grabs. Apple was selling its Apple II like hotcakes but was facing increasing competition from IBM’s PC and “clones” made by Compaq and Commodore. Meanwhile, Apple, led by Steve Jobs, was busy developing its new Macintosh computer. Remember that in 1983, most businesses and governments still employed large, expensive, and technically intimidating mainframes. And while the first personal computers of the early 1980s were smaller and less intimidating, they still featured black screens with green text-based commands like C:\> run autoexec.bat.
Drawing inspiration from the pioneering Xerox Alto and improving on the underperforming Apple Lisa, Jobs and the Apple team built the Apple Macintosh with several revolutionary new features we now take for granted. A handheld input device called a “mouse.” A graphical user interface with overlapping “windows” and menus. Clickable pictures called “icons.” Cut-copy-paste editing. In short, Jobs and his team were creating an “insanely great” personal computer that was intuitive and easy to use—one he hoped would shake up the PC market. At the same time, Apple had recently lured marketing whiz John Sculley away from Pepsi to be the firm’s new chief executive. Sculley, who had masterminded the “Pepsi Generation” campaign, raised Apple’s ad budget from $15 million to $100 million in his first year.
Apple Macintosh (“classic” 128K version), 1984, catalog number 1985.0118.01, from the National Museum of American History.
Apple hired the Los Angeles advertising firm Chiat/Day to launch the Macintosh in early 1984; the account team was led by creative director Lee Clow, copywriter Steve Hayden, and art director Brent Thomas. The trio developed a concept inspired by George Orwell’s dystopian novel, 1984, in which The Party, run by the all-seeing Big Brother, kept the proletariat in check with constant surveillance by the Thought Police. In the ad, IBM’s “Big Blue” would be cast as Big Brother, dominating the computer industry with its dull conformity, while Apple would re-write the book’s ending so that the Macintosh metaphorically defeats the regime. To direct the commercial, Chiat/Day hired British movie director Ridley Scott who’d perfected the cinematic look and feel of dystopian futures in Alien (1979) and Blade Runner (1982). The 60-second mini-film was shot in one week at a production cost of about $500,000. Two hundred extras were paid $125 a day to shave their heads, march in lock-step, and listen to Big Brother’s Stalinist gibberish. Shot in dark, blue-gray hues to evoke IBM’s Big Blue, the only splashes of color were the bright red running shorts of the protagonist, an athletic young woman who sprints through the commercial carrying a sledgehammer, and Apple’s rainbow logo. The commercial never showed the actual computer, but ended with a tease: “On January 24th, Apple Computer will introduce Macintosh. And you’ll see why 1984 won’t be like ‘1984.’”
Scenes from Apple’s “1984” Super Bowl advertisement. From Folklore.org.
When shown the finished ad in late 1983, Apple’s board members hated it. Sculley, the Apple CEO, instructed Chiat/Day to sell back both the 30- and 60-second time slots they’d purchased from CBS for $1 million, but the agency was only able to unload the 30-second slot. Faced with the prospect of eating the $500,000 production costs of an ad that could really only air during calendar year 1984, Apple swallowed hard and let the ad run once during the third quarter of the Super Bowl. Some 43 million Americans saw the ad, and when the football game returned, CBS announcers Pat Summerall and John Madden asked one another, “Wow, what was that?”
The ad, of course, was a sensation. The commercial’s social and political overtones held particular resonance in the mid-1980s, as the United States and Soviet Union were still engaged in an ideological Cold War. And, like Lyndon Johnson’s famous “Daisy” ad from the 1964 presidential campaign, the ad aired only once in primetime but was replayed again and again on the network news that evening, as the ad itself became a buzz-worthy source of free publicity. But even the mystique of the single airing wasn’t entirely true. Chiat/Day had quietly run the ad one other time, at 1 a.m. on December 15, 1983, on KMVT in Twin Falls, Idaho, so that the advertisement qualified for the 1983 advertising awards. As expected, the ad won several prestigious awards, including the Grand Prize at the Cannes International Advertising Festival (1984) and Advertising Age’s “Commercial of the Decade” for the 1980s. But the ad’s most enduring legacy is that it cemented the Super Bowl as each year’s blockbuster moment for advertisers and their clients.
While the ad aired during the Super Bowl on January 22, it merely pointed to Macintosh’s official debut two days later. On January 24, 1984, Apple held its annual shareholders meeting at the Flint Center auditorium on the campus of De Anza College, just a block from Apple’s offices in Cupertino, California. After dispensing with the formalities of board votes and quarterly earnings statements, the real show began. Steve Jobs walked on stage in a double-breasted suit and bow tie and rallied the troops by tweaking his chief rival: “IBM wants it all and is aiming its guns on its last obstacle to industry control, Apple. Will Big Blue dominate the entire computer industry, the entire information age? Was George Orwell right?”
Jobs then presented perhaps the greatest new-product demonstration in history. He walked over to a black bag, unzipped it, and set up the Macintosh to wild applause. He inserted a floppy disk and demonstrated the Mac’s windows, menus, fonts, and drawing tools, all set to the stirring theme from Chariots of Fire. Finally, the Mac spoke for itself: “Hello, I am Macintosh…”
So when you watch the Super Bowl on February 2 this year, it’s possible that the ads will overshadow the game. And for that you can thank Apple’s Macintosh, Chiat/Day, and “1984.”
As Thanksgiving approaches, our thoughts naturally turn to traditions—national traditions like the Macy’s Thanksgiving Day Parade and our own personal traditions, which in my family means kielbasa and apple pie, going to the local Christmas tree farm, and my family members pretending to be shocked when I decline a serving of carrots for the 28th year in a row. (And, of course, my mother’s mashed potatoes, over which I rhapsodized in a previous post.)
Woodcut, The Marchbanks Calendar–November by Harry Cimino. Smithsonian American Art Museum.
We all have traditions, but where did they come from? When we deep-fry the turkey or add a spiral ham to the menu, it may not seem particularly innovative. But the technology behind these yummy traditions had to come from somewhere. While doing some Thanksgiving-inspired Googling, I came across this fun video from History on the invention of deep-fried turkeys, turduckens, and honey baked hams:
While we may not know who invented the deep-fried turkey, we can take a look at Harry Hoenselaar’s patent (#2,470,078) for an apparatus for slicing ham on the bone. Hoenselaar ingeniously built his invention out of various objects found around his home—a pie tin, brackets, a hand drill, and a broom handle, to name a few. The patent application reads:
In the meat industry there is a large market for sliced meats, particularly for ham slices, but the bone construction and the shape of a ham is such that no wholly satisfactory method of slicing it exists. This statement also applies to legs of lamb and other like cuts of meat.
It is an object of the invention to provide a method and a machine for slicing ham and other joints, which are of exceptional efficiency in operation. Another object of the invention is to prepare ham for the market in a new and superior form.
Millions of spiral cut hams are sold every year, so I believe we can safely say that Hoenselaar accomplished what he set out to do—create an “efficient” ham.
Patent drawing by Harry Hoenselaar.
So whatever your traditions are this Thanksgiving, enjoy the holiday!