The author, John Marek, is a writer and executive director of the Anson Economic Development Partnership.
The earliest humans undoubtedly discovered the value of fire as a means of food preparation by accident. A lightning strike caused a forest fire; Grog and his family found a few incinerated woodland creatures, decided they tasted pretty good, and as a bonus, didn’t make you sick like uncooked meat occasionally did. At some point, they figured out there were ways to capture that fire and keep it burning. The big leap, though, was when ancient man advanced from collecting fire to manufacturing it.
Any former Boy or Girl Scout will tell you that while it is technically possible to start a fire by rubbing two sticks together, it is not a quick or intuitive process. At some point in human evolution, some dude (or possibly dudette) decided he would rub a couple of sticks together for several hours and see what happened. Ultimately, he became a hero, but I suspect he got some mean Tweets along the way. Even more intriguing are the undoubtedly numerous failures; that poor guy who devoted his life to rubbing two woodchucks together, and it never really paid off.
Although there were some advances in the quality and efficiency of the heat source over the years, the cooking of food remained fundamentally unchanged for centuries, right up through World War II. Then a chance discovery (or at least that’s how one story goes) by an engineer at Raytheon, a company that manufactured magnetron tubes for early radar systems, changed everything.
Legend has it that Percy Spencer was carrying a peanut butter bar in his pocket while performing tests on a more robust tube and noticed that the bar had become very warm. He began experimenting with other foodstuffs – including, yes, popcorn – and determined that the radar waves produced by the magnetron were cooking it. That’s a cool story, but other sources say scientists had figured out the link between microwave radiation and cooking a decade earlier.
Whatever the actual origin, this new “microwave” cooking method was destined to take over the world, but not for another 20 years. Commercializing the technology turned out to be a challenge. A 1950s magnetron capable of cooking food was the size of a refrigerator and required its own dedicated cooling system. As the technology evolved, however, the units got smaller and the prices more affordable.
The first widely available consumer microwave, the Amana Radarange, came out in 1967 and was priced at $495. That’s the equivalent of $3,900 today, so a luxury item to be sure, but something the checkbook of an upper-middle-class family could conceivably cover. The original Radaranges were larger than the typical microwave today and covered in beautiful chrome. Many of the ones still in existence are fully functional, a testament to their quality and design.
As the technology and manufacturing techniques improved through the 1970s, the “microwave oven” became increasingly accepted and affordable. Nevertheless, there was a contingent of consumer-safety advocates who questioned the notion of cooking food with radiation. After testing 15 microwave ovens in March 1973, Consumers Union, the organization behind Consumer Reports, warned that none could be considered “completely safe” because there was no reliable data on what constituted a safe level of microwave radiation emission. And there was, of course, the “metal” issue. Objects with even a trace of metal in them would arc in a tremendous light show, burst into flame, and perhaps even destroy the microwave. While not putting metal in the microwave is now an intuitive part of our human experience, it was a difficult concept for many Carter-era cooks to grasp.
I recall a television commercial from maybe 1975 or 1976 in which an appliance store chain, Highland, offered free cooking classes with the purchase of any microwave. The joke was that the family in the ad could only figure out how to make popcorn in their new machine:
“The popcorn’s really tender tonight, Dear!”
“And tomorrow night, popcorn casserole.”
The microwave has become such an established part of our lives over the past 50 years that I think even those of us who can remember a time without them are astonished to recall how things were done before.
TV dinners came in tin trays that had to be heated anywhere from 30 minutes to an hour in the oven, so while convenience foods took the work out of preparing meals, there was still a lengthy cooking process. Pre-microwave, the “boil pouch” was a popular form of prepackaged cooking. These were plastic pouches of prepared frozen foods that were dropped into a pot of boiling water for 5-10 minutes until the contents thawed and heated through. Of course, it took 10 minutes to bring the pot of water to a boil, and fishing the pouch out of the boiling water was a skill in itself, so they weren’t as convenient as they may sound.
My family also made extensive use of the deep-fryer. You could prepare a serving of fries, breaded mushrooms, even chicken and veal cutlets in a few minutes once the oil was up to temperature. However, having a vat of 400-degree oil sitting on your kitchen counter comes with its own set of challenges and risks.
We got our first microwave in 1979. By then, prices had fallen to a level that just about anyone could afford. Initially, my mother wasn’t entirely sure what to make of it, so maybe those Highland “popcorn” commercials weren’t that far off base. She figured it out in short order, though, and within a few weeks was heating soups, frozen dinners and leftovers like a pro. My father, whose reading material consisted mainly of tabloids with headlines like “Man Uses Microwave, Turns Into Bloodthirsty Killer,” was a tougher sell. Looking back on it, I’m not sure I ever saw him use it.
What about you? Do you recall a time before the microwave? When did you get your first one?