Thank you for being a friend to the Lemelson Center! We hope you had a festive and inventive holiday season. Remember to visit our new website in 2015!
I lived in Australia in the early 1980s, and became an honorary member of a wonderful family whose friendship I still cherish. They had a contraption that I had never seen before hooked to their television set—it turned the TV into a video game. Their youngest son, who was about 6, would goad me into playing the ping-pong game with him, because, honestly, I was hopelessly bad at it. I can still hear his squeals of laughter at my incompetence.
About a decade later, my nephew got his first game console and asked me to play. Even though I was a lot more computer-savvy by that point, I was still a pathetic opponent. I could never get the hang of the controls or understand the goals of these games. I really didn’t see what all the fuss was about.
Then in 2003, I met Ralph Baer, the “father of the home video game,” as he is often known. I was at his home on a collecting trip with colleagues from the National Museum of American History. Ralph was a warm, funny, charming, and marvelously creative individual who told us the now-familiar story of why he invented a way to play games on a television set. All you could do with a TV, he said, was turn it on and off or change the channel. Ralph thought that was a frustrating waste of good technology.
During the course of talking about papers and artifacts that Ralph would donate to the national collections, he showed us the “Brown Box,” the prototype for the first multiplayer, multiprogram video game system. Of course, he challenged me to a game of ping-pong. But losing to the inventor of the game somehow didn’t sting as much as losing to a 6-year-old Australian.
On the day I learned of Ralph’s passing, I came across an article about two German graduate students who are working on a way to make pedestrian crossings safer—by giving people who are waiting for the “walk” signal the chance to play video games against their counterparts on the other side of the street.
I think Ralph would have been tickled by this testament to his legacy . . . but I dread the day when I will have to win a video game before I can cross the street.
Rest in peace, Ralph.
The Lemelson Center notes, with great sadness, the passing of Ralph Baer, considered by many to be the “father of video games.”
In 1966, Baer transformed people’s relationship to home television by inventing a way for them to interact with their sets, playing games like Ping-Pong, tennis, checkers, and more. His work led directly to the Magnavox Odyssey in 1972, the first home video game console for the consumer market, and launched what would become a multibillion-dollar industry, though no one predicted that at the time. That same year, a young Nolan Bushnell played the Odyssey at a trade show. Bushnell went on to found Atari and create the arcade version of Baer’s Ping-Pong game, the now-famous Pong. Baer’s groundbreaking work shaped the leisure-time activities of a large segment of the world’s population and spawned numerous businesses, but the historical record of his achievements very nearly disappeared. In 2008, Baer told us the following story of how his original documents and apparatus were lost … and then found. They are now preserved in the Smithsonian’s National Museum of American History, where Baer’s home workshop will go on display next summer.
In the late 1990s I became aware that there was a growing community of classic video-game enthusiasts in the U.S. and elsewhere and that collecting hardware (game consoles, accessories, etc.) was an ongoing, growing hobby. So I became very concerned about the fate of all of the TV game hardware, as well as its supporting documentation, that we had built at Sanders Associates in Nashua, N.H., from 1966 through 1969. Hence, I began to inquire into the whereabouts and recoverability of the developmental games and the many supporting documents that Bill Harrison, Bill Rusch, and I generated during that period.
Photocopies of some of that documentation, used in various lawsuits against infringers of our patents, had been in my possession, but that was about all. Fortunately, people at Sanders were very cooperative and agreed that this material should be preserved. So I took charge of the collection effort and went after the hardware because I knew where it was stored and because it was most vulnerable to loss or destruction. Tracking down original documentation took place several years later.
There was a series of video-game-related patent infringement lawsuits, starting in the mid-1970s and finally ending in the mid-1990s, and I was not only deposed frequently but also often appeared as an expert witness in court. Therefore I had a chance to review our game-development documents over and over again during that twenty-year period. That fact would later be of great help to me in my new task of trying to locate and preserve both hardware and documentation.
Depositions for the last of the lawsuits took place in Chicago in 1997 and I found myself once again in the offices of Leydig, Voit & Mayer, an intellectual-property law firm. During that deposition, video was shot in the Leydig, Voit & Mayer offices (these VHS tapes were recovered in 2004 by my friend and associate, David Winter) showing me handling every one of the eight different models of TV games, as well as other apparatus, that we had built at Sanders in the 1960s. In July 1997 I got the recovery effort rolling and received about half of the games from Leydig, Voit & Mayer a year later; the other games could not be located.
Because of health issues, I needed someone to help me continue the effort of data collection. I felt that that individual had to be a person of integrity who was a dedicated video-game historian and who had an existing, detailed knowledge of much of the early 1970s hardware, starting with the Magnavox Odyssey 1TL200. Preferably, he or she would also be someone who understood the technology and who was intensely interested in the details of the genesis of video games.
The only person I knew who qualified under these preconditions was David Winter in Paris, France. I began corresponding frequently by email with David starting in 1998. He was then in his mid-twenties, had a background in electrical engineering and software, and had already amassed a substantial collection of classic video-game consoles. What is more, he turned out to be that rare individual who is a true collector, intensely interested in the history of video games. He was then already exceptionally knowledgeable about the technical and marketing details of video-game hardware from the early 1970s.
And in many ways, he reminded me of myself. I came back to the U.S. from Paris as a GI in 1946, World War II being over, and brought eighteen tons of foreign small arms home with me. I had been teaching the use of some of these weapons to GIs and in the process was thoroughly drawn into understanding those weapons’ heritage and technology. I became the self-motivated collector of, and resident expert on, foreign small arms for the War Department. This intense experience made me intimately familiar with what it takes to be a collector–and I recognized those traits in David Winter.
With my health deteriorating, I decided to invite David to accompany me on a trip to Chicago in June 2002 to try to locate the missing TV game hardware items. Unfortunately, in two long, hot, and dusty days at the storage facility, we located only one hardware item, but did find hundreds of photocopies of documents that shed new light on the early history of video games. For example, we found records of the pre-Odyssey product consumer acceptance tests carried out by Magnavox, as well as many internal documents that fleshed out the details of how the home-console video-game business really got started at Magnavox. With the availability of all that data, we had moved a big step forward in our mission to create a detailed historical track of who did what, when, and where to make the home-console video-game industry a reality.
The next year, David went to Chicago again, this time on his own, to take one more crack at the storage boxes in an effort to find Harrison’s, Rusch’s, and my original documents. He was eminently successful, locating the standard-issue Sanders notebooks kept by Bill Rusch and Bill Harrison during 1967 and 1968 as well as the original loose-leaf papers containing my 1966-72 notes, including the cover sheet and four-page document that I wrote on September 1, 1966. In that document I had laid out the concept of playing games using an ordinary TV set and proposed a lot of game ideas, thus making that document the closest thing to the Magna Carta of the home video-game industry. David also found the original guest book used by Magnavox in May 1972 to sign in visitors at its San Francisco showroom. There Nolan Bushnell, later president of Atari, signed that book and played the Odyssey Ping-Pong game hands-on. That act became the genesis of the arcade game Pong.
In October 2002 I contacted David Allison, chair and curator of the Division of Information Technology and Communications at the Smithsonian’s National Museum of American History, and we agreed, in principle, that the Smithsonian would become the repository for the game apparatus, as well as some of the supporting photocopied documentation. Finding and donating the original documents was not even a remote possibility at the time. But in January 2006 I sent the newly located original documents to the Smithsonian. These have been digitized by the Lemelson Center and are available as part of the finding aid to the materials that I have donated.
The search goes on for additional documents relevant to the 1966-75 period that brackets the conceptual, developmental, and early production and marketing years of the nascent console video-game industry. As they surface, I expect to add them to the material already in the custody of the Smithsonian. Through the work of the Lemelson Center’s Modern Inventors Documentation (MIND) program, the fragile historical record representing my nearly lost legacy and that of other inventors as well is preserved and accessible.
Happy 40th birthday, Rubik’s Cube!
I’ve practically grown up with the toy, which I first encountered around 1981 when my elementary school classmate Matt dazzled us with his ability to solve it in mere minutes while the rest of us struggled to master the 3×3 cube. We didn’t have the advantage of online instructions or videos to give us helpful tips, since we didn’t have the World Wide Web yet. So puzzle-loving kids and adults invested hours solemnly twisting the cube segments over and over again. This was at the height of the toy’s popularity in the U.S., which quickly waned but never quite died.
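Part of why the cube was so hard to master is sheer combinatorics: a standard 3×3×3 cube has roughly 4.3 × 10¹⁹ reachable states. The well-known counting argument behind that figure can be checked in a few lines of Python (a sketch; the variable names are mine):

```python
from math import factorial

# The 8 corner pieces can be permuted 8! ways; each has 3 orientations,
# but the total corner twist is constrained, leaving 3^7 free choices.
corners = factorial(8) * 3**7

# The 12 edge pieces can be permuted 12! ways; each has 2 orientations,
# but the total edge flip is constrained, leaving 2^11 free choices.
edges = factorial(12) * 2**11

# Corner and edge permutations must share the same parity,
# which halves the raw count.
total = corners * edges // 2
print(f"{total:,}")  # 43,252,003,274,489,856,000
```

Even at one twist per second, exhausting those states would take vastly longer than the age of the universe—which puts Matt’s minutes-long solves in perspective.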
Today the toy and its inventor are celebrated in Beyond Rubik’s Cube, a traveling exhibition at the Liberty Science Center in New Jersey. Born 70 years ago on July 13, 1944, Hungarian Ernő Rubik is the man behind the Cube. His mother, Magdolna Szántó, was a poet and his father, Ernő Sr., was an aircraft engineer known for his glider designs. Rubik said of his father: “Beside him I learned a lot about work in the sense of a value-creating process which has a target, and a positive result too.” (1) Young Ernő studied sculpture, design, and architecture in Budapest and eventually became a professor of architecture.
In 1974 he thought up the idea for the Rubik’s Cube in order to help teach 3-dimensional design to his students. Initially, he created a 3x3x3 rotating cube out of wood. “There was a workshop in the school, and I just used wood as a material because it is very simple to use and you don’t need any sophisticated machines. So I made it by just using my hands—cutting the wood, drilling holes, using elastic bands and those kind of very simple things.” (2) The following year he applied for a patent, which he received in 1977. Since this was Soviet-era Hungary, when the “Iron Curtain” divided Eastern and Western Europe, Rubik’s options were limited for manufacturing and marketing his invention. He worked with a small Hungarian company, Politechnika, to start selling colorful plastic versions of his “Bűvös Kocka,” translated into English as “Magic Cube.”
Rubik’s big breakthrough came when an expat Hungarian entrepreneur took the Magic Cube to the Nuremberg Toy Fair in Germany in 1979. There Tom Kremer, who owned a games and toys company called Seven Towns Ltd., saw the Cube and believed it could be a great success on the toy market if he could just find the right company to license it. Fluent in Hungarian and English, Kremer negotiated a deal with the Ideal Toy Company, which renamed it the “Rubik’s Cube” and launched it on the international market in 1980.
The Rubik’s Cube was an immediate worldwide sensation, winning many Toy of the Year awards in 1980 and 1981. Approximately 100 million were sold by 1982, but almost as quickly as it rose to fame the Cube seemed doomed to become a one-hit wonder. By 1986, The New York Times reported it had been “retired to the attic, the garbage heap and, with a bow to its elegance and ingeniousness, to the permanent collection of the Museum of Modern Art.” (3) However, the colorful toy never really disappeared, and over time it morphed into a popular culture icon. Today the number of Rubik’s Cubes sold worldwide is estimated at about 350 million.
In Hilton, New York, Northwood Elementary School students are petitioning to get the Cube inducted into the National Toy Hall of Fame. “The project started in Jenny Ames’ and Julie Fiege’s sixth grade classes in November… Students worked in groups to pick a toy that they thought should be inducted, conducted research and then presented their argument to a panel of judges…The presentation included criteria set by the Hall of Fame—icon status, longevity, discovery and innovation. The Rubik’s Cube won! So now the entire C Core—six teachers and 160 students—is working to get the Cube nominated for its 40th birthday this year.” (4) With luck, the Rubik’s Cube will win induction into the National Toy Hall of Fame in November, honoring Ernő Rubik’s 70th birthday as well.
For teachers and families, there is now an educational program called “You Can Do the Rubik’s Cube” focusing on math learning and 21st Century Skills. As Ernő Rubik said, “If you are curious, you’ll find the puzzles around you. If you are determined, you will solve them.” (5)
As I was preparing for my summer vacation, I realized I needed a new beach chair. The one I had was starting to rust, the seat fabric was fraying, and it was difficult to unfold no matter how much WD-40 I sprayed on it. I thought this would be a 10-minute errand, taking longer to drive to the store than it would for me to select and pay for my chair. Clearly I hadn’t shopped for a beach chair very recently, though, because there were what seemed like dozens from which to choose. They all looked pretty similar at first glance, but upon closer inspection there were many varieties: chairs that sit low in the sand, chairs that sit high; ones that recline and others that lie flat; chairs that are backpacks, have coolers built in, or have wheels.
My shopping experience made me curious about the origins of the beach chair, and whether the design had changed much over time. My research led me first to the story of Wilhelm Bartelmann, a German basketmaker who invented the “strandkorb,” a wicker chair designed for the beach. In 1882, the basketmaker is said to have been approached by a woman who had been advised by her doctor that sea air would be good for her—but that she was not supposed to sit on the sand because of another ailment. She asked if Bartelmann could create a chair that would allow her to enjoy the beach while keeping her off the sand. Thus, the strandkorb was born. The basket-like chair provided comfortable seating on the beach, while also protecting its users from sun, sand, and wind. The next year, Bartelmann began making two-seater chairs, and also established a successful strandkorb rental business.
So what about beach chairs in the United States? Canopied chairs like the strandkorb were certainly used in the U.S., but I was curious to learn whether there were other designs that would look more familiar.
The earliest patent I discovered that specifically mentions a chair for the beach is Helen Petrie’s 1892 “Seaside Seat.” While there are other, earlier U.S. patents for folding, reclining, and convertible chairs, Petrie’s patent (number 470,255) specifically references the beach: “My invention relates especially to a foldable reclining seat or lounge for use in camps, on yachts, at beaches, and in similar places.” Unlike Bartelmann’s rather substantial chair designed to shield the user from the elements, Petrie’s design appears open, more compact, and easily movable—not unlike many of today’s beach chairs.
Indeed, beach chair designs do not seem to have changed all that much since Petrie’s time. Materials and manufacturing technology have changed, but the basic concept of what a beach chair is (and does) is fairly similar. Still, inventors have found creative ways to elaborate on the basic “seaside seat.” Here are a few of my favorites:
So with all these cool, innovative beach chairs on the market, which did I decide on? I opted for one that I can carry like a backpack, reasoning it would be easiest to get to and from the beach with all the other paraphernalia that a visit to the shore always entails. (My store, sadly, did not have a suitcase beach chair for sale.)
James Hamilton received a patent for a “Combination Backpack/Beach Chair” in 1985. Thanks, Mr. Hamilton, for making my vacation to the beach both comfortable and convenient!
On June 9, 2014, I attended a program at The Brookings Institution with my colleagues Laurel Fritzsch and Lemelson Center fellow Matt Wisnioski. The program marked the release of a new report entitled “The Rise of Innovation Districts: A New Geography of Innovation in America,” developed by Bruce Katz, Vice President and Co-Director of Brookings’ Metropolitan Policy Program (MPP), and Julie Wagner, a non-resident senior fellow with MPP. The report and accompanying program provided a present-day and more policy-oriented perspective on many of the issues the Lemelson Center is exploring through Places of Invention, our exhibition (and accompanying book) set to open in Summer 2015.
In their report, Katz and Wagner trace what they call “a remarkable shift…in the spatial geography of innovation” away from the suburbs and back to cities. As documented by historians like Bill Leslie and Scott Knowles, many high-tech firms moved to the suburbs in the years after World War II where they built sprawling, space-age corporate campuses and R&D facilities. Some of the best-known examples include the General Motors Technical Center (Warren, MI, in 1956), IBM’s Thomas J. Watson Research Center (Yorktown Heights, NY, in 1961), and AT&T’s Bell Laboratories (Holmdel, NJ, in 1962). The thinking at that time was to isolate industrial scientists from the suits and bean-counters at headquarters (and from other competitors) by plopping them in an idyllic university-like setting where they could invent new cutting-edge technologies.
However, according to Katz and Wagner, by virtue of their suburban locations, these campuses were “accessible only by car, with little emphasis on quality of life or on integrating work, housing, and recreation.” Thus, “a new complementary urban model is now emerging” giving rise to what they call “innovation districts.”
These districts, by our definition, are geographic areas where leading-edge anchor institutions and companies cluster and connect with start-ups, business incubators and accelerators. They are also physically compact, transit-accessible, and technically-wired and offer mixed-use housing, office, and retail.
In contrast to the post-war suburban campuses, think Kendall Square in Cambridge, MA; University City in West Philadelphia, PA; and South Lake Union district in Seattle, WA. These urban districts feature big research universities (MIT, Univ. of Pennsylvania, Drexel) and big high-tech firms (Amazon, Microsoft) that serve as anchors for attracting additional high-tech startups, plus the housing and complementary retail businesses that support them.
As Bruce Katz argues in this short video, innovation is increasingly taking place where people come together, not in isolated spaces. Courtesy of the Brookings Institution, via YouTube.
Katz and Wagner explain some of the reasons for this geographic shift back to cities. First, in terms of demographics, married families with children—the residents most likely to enjoy the suburbs—now constitute less than 20% of all U.S. households. Young professionals and retired empty-nesters increasingly prefer to live in cities where they can walk to the gym, visit a museum after work, or meet a friend at a coffee shop or tavern.
But a more interesting observation concerns the changing nature of innovation strategies and why today’s innovators and entrepreneurs favor the inter-connectedness of cities. During the mid-20th century, the big high-tech firms pursued a linear model of innovation, in which they believed pure scientific research (R) would lead directly to the development (D) of marketable new technologies, all within the confines of the firm’s suburban R&D labs. That strategy worked for a while: for example, researchers at Bell Labs developed transistors and lasers while earning 11 Nobel Prizes. But Bell Labs would eventually fall victim to the “campus curse” as the isolation and insularity of its pristine suburban labs slowed the pace of innovation. In fact, AT&T spun off Bell Labs to Lucent Technologies in 1996 (Lucent later became part of Alcatel-Lucent), and the famous Holmdel, NJ laboratory, designed by modernist architect Eero Saarinen, now sits empty. It may be converted into a medical center…or razed.
Instead, most small and medium-sized businesses are increasingly turning toward an “open innovation” strategy, in which they develop some of their own in-house technologies, while also partnering with other firms to buy or license certain new inventions. As Katz noted in his talk, this mixed, open innovation strategy “craves proximity” and “extols integration” in order to make the necessary connections. So in terms of geography, it makes more sense to locate a startup in an urban innovation district where the density of development increases the odds of finding new ideas and technology partners—at the “incubator” next door or in a serendipitous exchange at the corner coffee shop.
As Katz and Wagner note in their full report, these 21st century innovation districts are in many ways a return to the original 19th and 20th century industrial districts that flourished in the first wave of industrialization. In these districts, firms from the same industry would cluster in a city neighborhood as workers walked to work and patronized local businesses. We see this clearly in some of our Places of Invention exhibition case studies. For example, in 19th-century Hartford, skilled machinists in the Coltsville neighborhood lived in company-built housing and walked a few hundred yards to Samuel Colt’s famous armory where they mass produced revolvers with interchangeable parts. After their shift, they might walk next door to Charter Oak Hall to practice with the Colt Armory band, take in an evening lecture, or use the lending library while mingling with fellow workers. To use Katz and Wagner’s language, the Colt Armory was the “anchor firm” of the Hartford “innovation district,” which grew to include the Weed Sewing Machine Co., the Pope Manufacturing Co. (bicycles, automobiles), both the Underwood and Royal Typewriter companies, and Pratt & Whitney (machine tools), the last of which was a spin-off founded by two former Colt machinists. Similarly, in the 1950s several medical device firms clustered around two Twin Cities anchors—the University of Minnesota’s Variety Club Heart Hospital and Medtronic—in “Medical Alley” Minnesota.
Katz and Wagner believe there is strong potential for the growth of innovation districts in several U.S. cities. Indeed, as new urban innovation districts emerge in cities like St. Louis, Detroit, and Boston, civic leaders would be wise to brush up on their history to learn lessons from earlier Places of Invention.
Chesbrough, Henry. Open Innovation: The New Imperative for Creating and Profiting from Technology. Boston: Harvard Business School Press, 2003.
Godin, Benoit. “The Linear Model of Innovation: The Historical Construction of an Analytical Framework.” Science, Technology & Human Values 31 (2006): 639–667.
Katz, Bruce and Julie Wagner. The Rise of Innovation Districts: A New Geography of Innovation in America. Washington, DC: Brookings Institution, 2014. Accessed 19 June 2014.
Leslie, Stuart W. and Scott Knowles. “Industrial Versailles: Eero Saarinen’s Corporate Campuses for GM, IBM, and AT&T,” Isis 92, no. 1 (March 2001): 1-33.
Rigby, Bill and Alistair Barr. “Will Apple, Google, Facebook, and Amazon fall victim to the ‘campus curse?’” San Jose Mercury News, 28 May 2013. Accessed 19 June 2014.
I want to pay homage to one of our favorite inventors, Stephanie Kwolek, who passed away June 18 at the age of 90. The DuPont chemist who invented Kevlar®, Kwolek came to the Lemelson Center in 1996 to participate in an “Innovative Lives” program, speaking with middle-school students about her childhood inspirations, life, and career. We were so intrigued by her personal and professional stories, and the impact of her invention, that we highlighted her in the Center’s “She’s Got It: Women Inventors and Their Inspirations” video, podcast, and educational materials. We also prominently featured her in our award-winning exhibition Invention at Play.
Of the diverse inventors in Invention at Play, evaluations showed that Kwolek was the most inspiring for museum visitors of all ages and backgrounds. They were impressed by the fact that she was a female inventor who started working at DuPont in 1946 when few women were hired as scientists. Of course they were impressed also by her important invention in the 1960s. The polymer fiber that Kwolek created―Kevlar®―is very lightweight, stiff, and, pound for pound, five times stronger than steel! It’s also chemical- and flame-resistant. Today Kevlar® is used in bullet-resistant vests, cut-resistant gloves, fiber-optic cables, helmets, tires, sports equipment, and even the International Space Station. If you look around your home or office, you’re bound to have at least one product that contains Kevlar®.
Kwolek earned many important awards and professional accolades, including being inducted into the National Inventors Hall of Fame in 1995 and receiving the National Medal of Technology in 1996 and the Lemelson-MIT Lifetime Achievement Award in 1999. As our senior historian Joyce Bedi said, “She was a wonderful person and an inspiration to many, especially young women interested in science and invention.” We were indeed lucky to have known her.
My family is obsessed with Dunkin’ Donuts coffee. The ubiquitous pink and orange logo is emblazoned in my memory. Grandpa Andy’s routine consists of a daily walk to the nearest Dunkin’s — about 500 yards from the house he built — for two cups of medium regular. Those cups sit on the counter until the moment comes to zap one in the microwave for a minute before he sits down with the daily crossword. His affection for the beverage is so well known that people bring him souvenir coffee cups from all over the world. His collection now contains over 2,500 unique cups. The irony? He always drinks his coffee out of Dunkin’ Styrofoam, never out of a ceramic cup.
So when I flew the coop at the age of 18, the best coffee I’d ever had was a Dunkin’ iced hazelnut, extra extra. Now, not an insignificant number of years later, I can’t even bring myself to drink the stuff. What changed? Sure, my tastes have changed. But back then, that’s all that was available in the small city I called home. These days, the proliferation of specialty coffee shops has fundamentally changed how the world consumes coffee. Innovations in coffee consumption are at the heart of this transformation.
Coffee History, In Brief
The roasting and brewing of coffee didn’t change much between the 1400s and late 1800s. From Yemen to China to Ethiopia, the process by which green coffee beans were converted into the dark, caffeinated liquid was pretty much the same: small batches of dried beans were cooked in a pan over a source of heat, then pulverized and steeped in hot water.
That all changed in the 1880s, when Jabez Burns patented his coffee roaster.
Suddenly, coffee could be roasted in volume. At the same time, the industrial revolution introduced workers to unnatural sleep patterns and long hours. While coffee roasters previously had paid particular attention to sourcing good-quality coffee beans, producing in large quantities to satisfy the needs of workers resulted in lower-quality roasts. The invention opened a new market from which iconic brands such as Folgers and Maxwell House emerged.
Then, in 1890, an inventor from New Zealand named David Strang created instant coffee, and (in my opinion) the quest for the perfectly roasted cup took a huge leap backwards. I know some people swear by instant coffee. But instant coffee is a poor substitute for the real thing. Despite this, the popularity of the powdered substance grew. It was sent overseas with the troops during the two world wars, and it became a staple in many households throughout the century. This era is known as the first wave of coffee.
Folgers and Maxwell House may be the best-known mass producers of coffee, both instant and traditional. And up until the 1950s, most coffee consumption took place at home, in sit-down restaurants, or on the fields of battle. But then came the second wave of coffee, with the introduction of quick-stop coffee shops like Dunkin’ Donuts, Peet’s Coffee, and Starbucks. It might not be fair to put all of those in the same category, and depending on your loyalties, you might be offended by the grouping. Peet’s Coffee is known to have been the first to rediscover, if you will, that coffee—depending on its origins and roasting methods—could have nuances in flavor. It was also at a Peet’s that many Americans first encountered espresso and the myriad drinks that have since proliferated.
With this second wave, the emphasis shifted from the need for mass production to the desire for a better cup of coffee, and retail prices rose accordingly. This led to inventions such as the French press coffee maker, the moka pot, the Mr. Coffee automatic coffee maker, and (my personal favorite) the Chemex.
Chemex is my favorite because it was invented, and is still produced, in a town I consider a second home: Pittsfield, MA. It also, until recently, made the best cup of coffee I had ever tasted.
The Third Wave
Last year, I discovered a coffee shop in Portland, ME, called Speckled Ax. It’s a specialty coffee shop where the owner, Matt Bolinder, is the roaster and also the main importer. He is part of a growing trend of coffee aficionados who will travel the world looking for the best coffee beans to bring home and roast. The third wave, according to Climpson and Sons, well-known coffee experts,
“…is focused on craftsmanship; where beans are sourced from farms instead of countries and roasting is about bringing out unique characteristics of a bean. …third wave is in the throes of achieving the same level of detail and understanding from bean to cup that wine connoisseurs have demanded for decades – farm, harvest, processing style, roast date, coffee variety and tasting notes.”
At Speckled Ax, and countless other small, independent shops popping up around the country, the emphasis is on creating a unique cup of coffee, unparalleled in flavor, color, texture, aroma, etc.
The quest for the perfect cup has led to many innovations in roasting and brewing. At Speckled Ax, for example, Matt Bolinder roasts his coffee beans over a wood fire—“Our object is to complement the distinctive flavors inherent to our select coffees with the subtle aromatics that only a wood fire can impart.”
On the brewing side of things, I recently came across a process called “Steampunk” at a coffee shop here in Washington, DC, called La Colombe. This process produces the best cup of coffee I’ve ever tasted, using beans selected with the utmost care by Todd Carmichael, a celebrity in the coffee world. The Steampunk process takes concepts as old as coffee brewing itself and aims to use modern technology to achieve the perfect cup.
But can there really ever be a perfect cup of coffee?
No. No there can’t be. Perfection is impossible. But I’d love to be proved wrong.
The following is a guest post by Edward Tenner, a senior research associate of the Lemelson Center and author of Why Things Bite Back and Our Own Devices.
More than 50 years ago, the Israeli archaeologist Pessah Bar-Adon discovered a trove of over four hundred copper objects and other priceless artifacts in a cave high in the cliffs overlooking the Dead Sea. The Polish-born scholar, who had lived for years among the Bedouin, had uncovered the “Cave of the Treasure” by a dry riverbed known as Nahal Mishmar. Originally chosen for its remoteness, the site was used by a vanished civilization over six thousand years ago. The artifacts helped define the Chalcolithic or Copper Age of technology (4500-3600 BCE), first recognized in the early twentieth century. This era marked a transition from the Stone Age to metallurgy, which brought with it the rise of villages controlled by chiefs, an expansion of agriculture, and the development of specialized crafts on an unprecedented scale.
Masters of Fire: Copper Age Art from Israel, an exhibition currently on view at the New York University Institute for the Study of the Ancient World (it will travel next to the Legion of Honor Museum in San Francisco), features stunning and often enigmatic objects (like anthropomorphic and zoomorphic ossuaries that held the bones of the dead) from Nahal Mishmar and other Chalcolithic-period sites. Beyond examining the artistry of the objects, the exhibition raises a fascinating question: Did mastery of metallurgy open the door to ancient, and thus modern, inequality?
Connections between metalworking and elitism have been made in other contexts. Investigations of the 5,000-year-old South Tyrol mummy Oetzi, for example, suggest that his copper-bladed axe was not only a highly efficient tool but a status symbol available solely to a small number of elite males. While the Nahal Mishmar hoard yielded clearly practical copper objects—such as heads of the maces that were the characteristic weapon of the period—it also included finely worked “scepters,” “crowns,” and vessels that suggest the wealth and prestige of those reburied there, and the riches of their temples.
Whatever the objects mean and however they were used, the cost of obtaining, transporting, and smelting ore, and working copper—not to mention military protection of the new wealth—promoted inequality and social stratification. As Thomas E. Levy notes in the handsomely illustrated exhibition catalog, it took thirty-five hours of smelting time and fifty hours of work to produce a single copper axe. An egalitarian Neolithic society could not finance such skilled and specialized production without changing its character. “Once this transition was put in place, by the early fourth millennium BCE,” Levy writes, “there was no returning.”
If you’re in New York through this weekend, or in the Bay Area this summer or fall, don’t miss this mesmerizing exhibition (there’s an excellent review by Edward Rothstein of the New York Times).
I used to think manicures were only for elegant ladies who walked poodles, had afternoon tea, and were always perfectly coiffed. In other words, not me. If it’s raining outside, even a little bit, I’m the person who gets drenched in spite of using an umbrella. If I had a choice between spending time in a salon and hiking with a guidebook, I’d choose the woods. And every time I’ve gotten a manicure—without fail—I’ve chipped it the same day or shortly thereafter.
All of this added up to skepticism when a no-chip nail innovation was recommended to me: the shellac manicure. The shellac nail polish and manicure system is credited to Creative Nail Design, or CND, a company that spent five years testing and improving the product before releasing it to the market. Unlike a regular manicure, shellac lasts up to two weeks and is touted as chip-free.
So how does it work? There isn’t much professionally written about its exact science, but the general idea is that the specially formulated shellac nail polish is applied like normal polish. Nails are then cured by placement under a UV lamp after each coat. And unlike a normal manicure, nails are dried and ready to go immediately, which helps a lot when you’re fishing around in your purse for money to pay.
On a related note, while researching this blog post, I came across the story of a scientist named Hope Jahren, who hacked Seventeen magazine’s #ManicureMonday on Twitter in fall 2013. #ManicureMonday is traditionally a place for girls and women to post images of their manicures, but Hope wanted to show girls that it’s not just how their hands look that matters, but what they do with them. So she and other scientists tweeted images of their manicured hands doing all sorts of fun, science-related stuff. The Smithsonian has gotten in on the fun, showcasing the great work being done behind the scenes at our various museums and research centers. I think this is a great message to send girls and women who are future scientists, inventors, and innovators—that you can have fun with fashion and be a serious, smart professional in a STEM field.
Like many successful inventions, the shellac manicure has made my life easier and is a vast improvement over easily chipped nail polish. There seem to be continual updates to the system, with more variety in colors and textures. Maybe you’ll see me on the next #ManicureMonday.