
Florida “Crackers”

The word “cracker” is often used to refer to people who have lived in Florida (and Georgia) for multiple generations. The term originally referred to Florida cowmen and pioneers but is now used to describe people whose families settled in Florida before the invention of window screens, air conditioning, insect repellant, and the Interstate Highway system. To paraphrase Jeff Foxworthy, if your family’s roots in Florida go back earlier than, say, 1950, you may be a cracker.

The term “cracker” is often said to refer to the “crack” of the slavemaster’s or the cattle herder’s whip. This is the History Channel’s theory on the origin of the word, and it is also the theory advanced by the Florida Historical Society. And while it is true that Florida cowmen herded cattle by whip and dog rather than by lasso, the term cracker probably derives from the Middle English word craik, meaning a braggart. That is clearly the meaning of the term as it appears in Shakespeare’s play King John (1595): “What cracker is this … that deafes our ears / With this abundance of superfluous breath?” In contemporary Gaelic, craik is still used to refer to a vain, boastful person. The “crack of the whip” business is probably a confection.

Other theories I have heard include the guess that “cracker” is a variation on the Spanish word cuaquero, meaning Quaker. In the colonial era, the Catholic Spanish referred to all Protestants derisively as “Quakers,” so “cracker” was intended as a slur on the Protestant Scots-Irish settlers of the region. Still another guess is that the word derives from “cracking” (grinding) corn to make corn flour. Still another is that it refers to the hardtack biscuits (crackers) that were the cowboys’ staple fare. There is no serious etymological evidence favoring any of these interpretations.

Whether the term is a compliment or a disparagement depends on context. When the African American relatives of Trayvon Martin refer to his killer as a “redneck cracker,” cracker is synonymous with bigot or racist. And if a Yankee calls a long-haired country boy from North Florida a “cracker,” cracker is synonymous with ignorant or backward. But if two cattle herders from Kissimmee are in the bar having a few beers and refer to each other as “crackers,” it is a term of endearment.

Mike Miller’s essay on the Florida cracker (“Florida Cracker: Smile When You Call Me a Cracker, Stranger!”) sets out the following standards. A proper cracker:

  • Has a family history that predates the huge explosion in Florida’s population after the Second World War.
  • Is self-reliant (“When modern civilization collapses, the Florida cracker will be hunting, fishing, trapping and growing his own food while the rest of us will be standing in line at the government owned grocery store with our ration stamps.”)
  • Is white, was raised in a rural setting, and self-identifies as a Southerner (usually meaning that there is a Confederate battle flag displayed on the cracker’s pickup truck).

Real crackers also eat cooter (a soft-shell turtle), grits, chitlins (pig intestines cleaned and deep-fat fried), greens, fried alligator tail meat, hominy (white corn soaked in lye and then boiled), and pineywoods rooters (feral hogs that crackers love to hunt). Cracker politics run to anti-Establishment, anti-big-government, anti-welfare-state conservatism.

Excerpt from Florida: An Unnatural History of America’s Weirdest State

Let me know if you’d care to read more…

The lies begin at the border. When you cross over into Florida from Georgia or Alabama, there are large signs that read “Welcome to the Sunshine State.” (They also announce that Florida is “Open for Business,” an addition made at the behest of the very conservative Governor Rick Scott.) But contrary to the implication of the state’s famous nickname, Florida is not the nation’s sunniest state. Not even close. The state’s Percent Sun number (the average percentage of time between sunrise and sunset that the sun reaches the ground) is a respectable 66%, but that is exceeded by Arizona (85%), Nevada (79%), New Mexico (76%), Colorado (71%), Hawai’i (71%), California (68%), and Wyoming (68%). Florida averages only 101 clear days annually (days with no clouds), which is fewer than the number of clear days in 22 other states. Where we do shine, so to speak, is in the number of hot and humid days, but somehow, “Welcome to the Muggy State” and “Florida: the Partly Cloudy State” just don’t work as marketing slogans.

Also contrary to popular perception, Florida was not named for its abundant flowering species. Florido is the Spanish word for “flowery” (also “florid,” incidentally), so many people assume that Florida means “the land of flowers.” Not so. The name Pascua Florida was bestowed upon the state by the Spanish explorer Ponce de Leon on Easter Day, 1513, as Ponce and his crew sailed into what is now Matanzas Bay. In Spain, the Easter celebration is known as the Feast of the Flowers, and a literal translation of Pascua Florida is therefore “Flowery Easter.” The name Ponce chose for the state was not intended to refer to the orchids, violets, petunias, and other flowering plants that are native to Florida but rather to the Catholic celebration of the death and resurrection of Christ.

Juan Ponce de Leon led the first Spanish expedition into what is now Florida, and so he is often said to have been the state’s “discoverer.” Of course, Florida had been occupied for 12,000 years or so before Ponce and his hardy band of conquistadores showed up. Ponce did not “discover” the state; he only initiated (or rather, tried to initiate) the European conquest of Native American lands in the region.

Ponce’s 1513 expedition came ashore somewhere near present-day St. Augustine. This was when the name Pascua Florida was bestowed. He then sailed south and eventually came upon what we now know as the Florida Keys, the outermost of which are the Dry Tortugas. Tortuga is Spanish for “turtle,” and Ponce evidently chose the name because of the turtles he observed there. (They were “dry” because there seemed to be no fresh water anywhere on the islands.) He then turned north, landed near what is now Port Charlotte, ran into some very unfriendly natives, and high-tailed it back to Puerto Rico.

Ponce returned to the Port Charlotte area in 1521 with the intention of establishing a colony but again ran afoul of the local tribesmen, the Calusa. Ponce was fatally wounded in this confrontation, and the Spanish colonization effort was abandoned until 1565. After the faceoff with the Calusa, the conquistadores withdrew to their base in Cuba, where Ponce de Leon died from his wounds. His remains were entombed there and subsequently moved to Puerto Rico, where his burial crypt can be viewed today in the Cathedral of San Juan Bautista in Old San Juan.

It is often said that Ponce came to Florida to find the mythical Fountain of Youth. More lies, of course – not that you would know it if you went to St. Augustine to visit a tourist attraction called Ponce de Leon’s Fountain of Youth Archaeological Park. The park houses practically nothing of genuine archaeological significance, and one assumes that the reference to archaeology in its name is intended to impart a faux impression of scientific merit or importance. In fact, there is no legitimate historical evidence to suggest that Ponce was looking for anything like a “Fountain of Youth.” As was true of all the Spanish explorers of the New World, he was looking for gold and silver to curry favor with the Spanish crown. The whole “Fountain of Youth” business is a myth likely perpetrated by one Gonzalo Fernández de Oviedo y Valdés, who is said to have despised Ponce and cooked up the Fountain of Youth story to depict Ponce as a gullible, dim-witted fool. Even in 16th-century Spain, the idea that water could reverse aging was considered pretty unlikely.

Clichés

Say What?

Do people ever think about what they are saying?  For years now, I’ve noticed an increasing reliance in public discourse on clichés that seem profound but are really pretty silly if you give them a moment’s thought.

Consider, for example, the common admonition not to “preach to the choir.”  “Preaching to the choir” means that you are not persuading anyone of anything; you are just wasting time and energy pumping up the passions of those who are already converted.  Have you nothing better to do with your time than “preach to the choir”?

But what is the implication?  If not to the choir, to whom should one preach?  The congregation?  But surely, the congregation is also composed, overwhelmingly, of fellow converts.  The infidels outside the church door?  But on what evidence would one believe that the infidels will be persuaded by the sermon?  If they were inclined to buy the message, wouldn’t they already be inside?

I’ve had two close friends who served as the directors of their respective church choirs.  It turns out that the most difficult thing about directing the church choir is not finding someone to sing the solo part in next Sunday’s hymn, or choosing the pieces to perform in the annual Christmas concert, or even summoning up the courage to tell old Miss Clara that she can’t carry a tune in a bushel basket.  No, the hardest thing about directing the church choir is getting the members to show up for rehearsal and for the Sunday performances.  Finding ways to motivate the choir to stick to it consumes the largest share of the director’s attention.

The lesson?  If you don’t preach to the choir from time to time, one day you’ll turn around and the choir won’t be there.  The larger lesson:  If the passions of the true believers are not fueled pretty regularly, the flame goes out.  And what movement long survives without the ardor of true believers?  Of course you should “preach to the choir.”  That’s the most important preaching most people can ever do.

How often have you been warned not to “reinvent the wheel”?  Like preaching to the choir, reinventing the wheel is the ultimate in pointless effort.  The intended meaning is evidently that someone — somewhere, sometime — has already figured out a solution to the problem at hand, probably a pretty obvious solution at that, and rather than “re-invent” that solution, we just need to discover what it was and adapt it to our own circumstances. “Re-inventing the wheel” would be stupid.

Or would it?  In point of fact, the wheel was reinvented many times in the course of human history.  A potter’s wheel can be dated to Ur in about 3500 BC and probably revolutionized the production of pottery goods.  It took another 300 years before someone in Mesopotamia figured out that a wheel could also be adapted to the task of transporting goods and people.  Independently, the Egyptians re-invented the wheel around 2000 BC and various European cultures re-invented it again, also independently, about 1400 BC.  (There is no evidence that this most recent European reinvention was influenced in any way by the cultures and traditions of the Middle East.)  In contrast, the native civilizations of the Americas never did come up with a wheel.  The Incan, Aztec, and Mayan civilizations were entirely wheel-less, and ditto the various American Indian cultures, whose introduction to the idea of a wheel awaited European contact.

Two points are clear.  First, despite appearances to the contrary, the concept of a wheel and its application to solving human needs are not obvious.  Seemingly the simplest of devices, a working wheel is a pretty sophisticated technological achievement.  The Incan, Aztec and Mayan civilizations were quite advanced in astronomy, horology, mathematics, tapestry, medicine, engineering, metallurgy, architecture, and the like, but none of their great thinkers managed to “re-invent the wheel” and their civilizations were clearly the worse off because of it.

Second, then: if you don’t know what a wheel is or have forgotten how to make one, reinventing the wheel turns out to be quite an excellent idea after all.  The frequent alternative to reinventing the wheel is dragging sticky problems across resistant turf on crude skids yoked to yaks.

Of course, it is not enough to avoid reinventing the wheel; you also have to “think outside the box.”  Apparently, the premise here is that there are no creative solutions inside the box, that what we need is unconventional thinking and new perspectives.  Someone who “thinks outside the box” is novel, creative, smart.  The rest of us are struggling under the dead weight of convention.

But just what is “the box” and what is inside and outside of it?

Inside the box, of course, is what is sometimes called the “conventional wisdom,” which is only a snide way of referring to the accumulated knowledge, facts, lore, and wisdom of the ages.  “Inside the box,” in short, is our common cultural heritage – everything from language to science to custom to ethics to the multiplication tables.  Obviously, the tools in the intellectual “box” are tried and true; they’ve proven themselves over millennia.  That, after all, is why we put them “in the box.” And most of what is outside the box is out there for good reason – failed concepts, flawed insights, unworkable bigotries, and many other varieties of plain old bad ideas, all very much “outside the box.”

Setting aside everything that’s inside the box to think unconventionally sometimes bears fruit.  Einstein’s ideas about relativity were far outside the box but revolutionized science and over time have proven to be correct.  That, by the way, is why the general and special theories of relativity are now very much inside the box.  A moment’s thought will make it obvious that venturing outside the box is a foolish and dangerous enterprise in all but the most exceptional cases.  If you venture too far outside the box, for example, you’ll find that you have to reinvent the wheel…

How often have you been told by the bearers of bad news, “Hey, don’t kill the messenger” or “I’m only the messenger”?  In the modern university, Chairs are to be spared because they are only reporting bad news from the Dean, Deans are spared because they are only describing the bad news from the Provost, the Provost shouldn’t be killed because he only conveys bad news from the President, and the President must be held harmless because he is only passing along bad news from the state legislature or the university’s Board of Governors.  As can be seen, the decision not to kill the messenger pushes blame further and further upstream.  After a bit, the true villain – the guy who deserves to be killed – is either invincible or completely out of your reach.  So while “spare the messenger” starts out as a warning against misplaced rage, in the end it renders the ultimate victim impotent.

Sophocles introduced the phrase “Don’t kill the messenger” in his tragedy Antigone.  In the play, an unnamed messenger appears to announce the suicide of Haemon (“Haemon lies dead, slain by his own hand in wrath against his father’s deed”).  The delivery of this message occasions much grief, more death, and eventually the unraveling of the Theban city-state through the suicides of most of its royal family.  What if the messenger had been waylaid en route and never showed up to deliver his message?  Would Thebes have fallen anyway?

It was the custom throughout ancient Greece and Rome for the rulers to employ messengers to broadcast news throughout the city or region.  Laws decreed that the messengers were to pass unharmed, however grim their news might be.  But clearly, the messenger was an agent of the state and thus a convenient target for popular rage.  Killing the messenger sends an effective, unambiguous communiqué back to the source.  It is a way, and often the only way, to convey one’s displeasure to the ruling class.

The Center for Consumer Freedom is funded by fast food joints and soda companies to lobby against obesity-related public health campaigns.  Its website includes such Op-Ed gems as “Leave calorie counts off the menu; Nutrition is more complex than a few figures can convey!” and “Big fat lies: CDC must retract obesity deaths study.”  Richard Berman, a CCF flack, describes the Center’s lobbying approach thus:  “[Our] strategy is to shoot the messenger . . . We’ve got to attack [food activists’] credibility as spokespersons.”  Apparently, the organization spends some $3 million a year on its lobbying efforts, and if recent statistics on obesity are any guide, it has been pretty successful.  Shooting the messenger, it seems, is a remarkably effective way to stifle a message you don’t want people to hear.

If you “think outside the box,” you might come up with an idea, invention, or development that someone will praise as “the greatest thing since sliced bread.”  Clearly, no one can say who first thought to slice a loaf of bread (vs., say, tearing it into manageable chunks with your hands), but the first whole-loaf-at-a-time bread-slicing machine was invented by one Otto Frederick Rohwedder in Davenport, Iowa, in about 1928 and was pressed almost immediately into commercial use.  Pre-sliced bread was hailed at the time as “the greatest forward step in the baking industry since wrapped bread.”

To designate an idea, invention or development as “the greatest thing since sliced bread” is therefore to claim that it is the best thing to have come along since 1928.  Other “great things” that have come along since would include antibiotics, computers, television, commercial air travel, air conditioning, the Internet, and a great deal more.  I don’t consider pre-sliced bread, least of all the Wonder Bread variety thereof, to be such a great thing and certainly not “greater” than my laptop computer or flat-screen high-definition digital TV.  Good grief!

Claude R. Wickard, the United States’ Food Administrator in the Second World War, was also not convinced that sliced bread was such a great thing.  Wickard ordered a ban on sliced bread in January 1943, on the grounds that pre-sliced bread required a heavier wrapping paper than an unsliced loaf and was therefore inconsistent with wartime conservation efforts.  Assurances that supplies of waxed paper were sufficient and, in any case, not essential to the war effort caused Wickard to rescind the ban some two months later.

All else equal, whole loaves of bread are more economical than pre-sliced loaves.  Pre-slicing exposes every slice to air and therefore to contamination by bacteria, mold spores and other icky stuff.  In consequence, bread destined for pre-slicing must be loaded up with preservatives to stave off premature decay.  Pre-slicing also causes the bread to dry out (go stale) more quickly.  All in all, this adds up to a pretty heavy price to pay for the ability to make faster sandwiches, don’t you think?

Joshua Bickel, a journalist in Chillicothe, Missouri (which claims to be the ancestral home of pre-sliced bread), has written, “The best thing about sliced bread is that you do not have to cut it because it has already been cut for you.”  But surely, the labor savings of sliced bread are trivial in comparison to those realized, say, by the dishwasher, vacuum cleaner, microwave oven, or even the electric coffeepot – unless, I suppose, you have to make a hundred peanut butter and jelly sandwiches in very short order.  If we don’t have the time and energy to slice our own bread, well, no wonder we’ve become a nation of fatsos.  CCF – are you listening?  It’s not the sugar in Coke that has made us obese.  It’s sliced bread!

“It ain’t rocket science!” is another cliché often used ironically to describe any endeavor or solution that is perfectly simple and straightforward – i.e., the sort of thing any idiot ought to be able to figure out, as opposed, say, to “rocket science,” which requires deep, complex, higher-order brain power.  Evidently, “rocket science” stands somewhere near the top in its demand for advanced intellectual prowess, right up there with the frequently voiced alternative, brain surgery.

Or does it?  The basic scientific principle underlying “rocket science” is Newton’s Third Law of Motion:  For every action, there is an equal and opposite reaction.  The three laws of motion have been part and parcel of scientific discourse (“inside the box”) since 1687.  The laws of motion are covered extensively in every high school physics class.  Contrary to implication, their mastery does not seem to require a very advanced intellectual capacity.
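
Just to show how little machinery is involved, here is a back-of-the-envelope sketch, using the standard textbook symbols rather than anything particular to the space program: the rocket throws propellant out the back, the propellant pushes back on the rocket with an equal and opposite force, and from that one principle you get both the engine’s thrust and the celebrated rocket equation.

\[
F_{\text{thrust}} = \dot{m}\,v_e,
\qquad
\Delta v = v_e \ln\!\left(\frac{m_0}{m_f}\right)
\]

Here $\dot{m}$ is the rate at which propellant is expelled, $v_e$ is the speed at which it leaves the nozzle, and $m_0$ and $m_f$ are the rocket’s mass before and after the burn.  The logarithm falls out of a page of freshman calculus; no towering intellect required.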

Workable rockets were known to the Chinese by about 1000 AD, and their potential application to astronautics was understood by the beginning of the 20th century.  You can buy one in any hobbyist’s shop, and teenage boys shoot them off every Fourth of July.  Granted, engineering a rocket big enough to reach outer space and figuring out how to stabilize its flight, steer it to the correct destination, communicate with it, and the like, are by no means trivial.  But still, the rudiments of “rocket science” are not metaphysical mysteries penetrable only by those with 160+ IQs.

The world’s largest collection of rocket scientists is probably to be found at the Kennedy Space Center, just down the road from where I live.  Officials in the jurisdictions that abut KSC were scared to death about what would happen to the local economy once the Space Shuttle was phased out.  NASA anticipated shedding some 3,000–4,000 jobs in the span of two years once the Space Shuttle fleet was retired.  Earlier estimates ran as high as 8,000 “rocket science” jobs lost.  And in fact, NASA had had workforce programs in place since at least 2005 to figure out how to sustain its workforce and to develop “career paths” once the shuttle was gone.  A simple question suffices:  Why do we need federal retention bonuses and expensive job retraining programs to avoid mass unemployment among Space Shuttle scientists and engineers?  If rocket scientists were so damned smart, don’t you think they could figure this out on their own?

So contrary to fatuous clichés, by all means preach to your choir, reinvent your wheel, stay inside the box, and slay the messenger if it is the only way to send a message back to the powers-that-be.  None of this is rocket science and all of it is much better than sliced bread.