Saturday, June 14, 2014

Lately, I have been talking to more and more "older people" - that is, those past the age of 55 or so, many well past 65 - and have noticed a few things that are probably more than anecdotally related.
For one thing, older people seem to think more conservatively (with occasional Conservative leanings, but not necessarily). They may make slightly inappropriate remarks - racist, sexist, or otherwise "narrow-minded" - in ways that would have shocked their younger selves (and certainly surprise me now). They also tend to shut down their willingness to try new things. Xenophobia, wariness of novelty, anxiety about the unfamiliar: all these point to a more conservative way of thinking and dealing with the world.
Why should that be? That the elderly become distressed about "newfangled" technology or foods or customs is a commonplace. A good way to get an easy laugh in a TV show is to show an obviously elderly person trying out a computer or video game - and either getting horribly flustered or mastering it in record time. Like many stereotypes, this one has a ring of truth to it. But no one asks why.
Familiarity suggests access to robust memories. You cannot feel certain about anything you have just encountered for the first time, and as you age, your suspicion of novelty - of untried, unverified things - grows. We like to have memories, the deeper the better, to check for data. (Younger people tend to embrace the new: since pretty much everything in the world is new to them, this is generally an adaptive trait, even if it can lead to trouble when wise counsel from elders is disregarded.) The strongest memories are the oldest ones. What we call habits are actions and expectations that have been repeated and proven safe, useful, and reliable. All this reliability relaxes us and saves us mental (and sometimes physical) energy.
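If a computational analogy helps - and it is only an analogy, not a claim about how neurons actually implement any of this - familiarity behaves like a cache: the first encounter pays the full cost of evaluation, and every repeat is a cheap lookup. A minimal Python sketch, with invented costs:

```python
# Familiarity as a cache -- a loose analogy, not a neural model.
# The 0.1 s "deliberation" and the names are made up for illustration.
from functools import lru_cache
import time

@lru_cache(maxsize=None)
def size_up(newcomer: str) -> str:
    """Expensive first-time assessment of something unfamiliar."""
    time.sleep(0.1)  # stand-in for costly, energy-hungry deliberation
    return f"{newcomer}: seems safe"

for encounter in ["Joe", "Joe", "Joe"]:
    start = time.perf_counter()
    verdict = size_up(encounter)
    print(f"{verdict} (took {time.perf_counter() - start:.4f}s)")
```

The first call pays the full tenth of a second; the familiar "Joe" afterwards costs microseconds. Habits are, in this picture, answers we no longer have to compute.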
And that energy allocation is key.
When we have to make a decision - see this movie rather than that one, or try the weird food our friends are pushing on us - we engage certain parts of our frontal lobes, and this requires increased nutrient-rich blood flow. If we are tired, hungry, or stressed, that blood may be needed elsewhere in the brain, and the mere thought of having to decide something becomes annoying, to say the least. Remember the last time you had to "have a talk" with a significant other or a boss after a few hours' sleep, or while you had the flu? Awful, wasn't it? You didn't have the energy to weigh things up, say the right thing, avoid further trouble. Such things always go better when we are in top form.
As we age, our bodies require more energy just to keep functioning smoothly. Systems start to break down. Food processing becomes less efficient (stomach acid production tends to decline with age, leading to incomplete digestion and nutrient uptake). Muscles, among other things, heal more slowly. And on and on. The last thing a body under that kind of strain needs is the unfamiliar.
Sad to say, the familiar includes one's usual group - whatever that group happens to be. It is sad because the avoidance of the Other, the outsider, can turn into - or merely appear to turn into - bigotry. Me = good. Not-me = bad.
If this is all in the brain, so to speak, does that mean society should give carte blanche to the elderly to say whatever they want? No. Anyone still in possession of their faculties should be reminded that such remarks are not appreciated, no matter how devoid of ill will they may be. What this knowledge should do is help us understand what's going on, foster patience as we introduce new things (no life is without unwelcome novelty), and remind us what kind of energy battle is going on just over and behind the eyes - as it goes on in everyone, every day.
Tuesday, February 25, 2014
Homo economicus - and the Passenger Pigeon
This year marks the 100th anniversary of the death of an entire species. The Passenger Pigeon officially went extinct on Sept. 1, 1914, when "Martha" died, age 29.
Although this is tragic, for many reasons, and is not - alas - a unique event, even in the past 150 years, the extinction of this bird stands out because it didn't happen to some rarity, such as a species endemic to a single island. The species we wiped out once numbered in the hundreds of millions, if not billions. Whole flocks took days to pass overhead, blackening the skies, even changing the weather with the collective force of their wing beats. Here was one species that "should have" withstood any and every assault the human population had the power and stupidity to inflict upon it.
Yet it didn't.
There are many theories about the rapid collapse of this once-numerous bird species, and I won't get into them here. What has always interested me is the psychology - a collective mind, if you will - behind the relentless slaughter.
Human beings are incredibly diverse in their thinking, yet we can make some generalizations (albeit carefully). When faced with a huge supply of some "resource" (a word I have learned to use sparingly), something in the average mind opens up the floodgates. It is open season on whatever that abundant item happens to be: fresh water (e.g., huge lake, regular rains); minerals; fruit; easily caught animals; even human capital, e.g., housework performed by wives (that is, unpaid labor as part of a social contract). The abundant good or service ends up being taken for granted, and it is often exploited, abused. Even if it has high utility - it is very useful, for example, for a working man if he can come home to a clean house and a warm meal - it will end up having low value. That's when the trouble begins.
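Economists call this gap between usefulness and price the diamond-water paradox: markets value the marginal unit, not the total benefit. A back-of-the-envelope sketch - the log utility function and the quantities are my own illustrative assumptions, not real data:

```python
# Diamond-water paradox in miniature: with diminishing marginal utility,
# an abundant good has enormous TOTAL utility but near-zero MARGINAL value.
# U(q) = log(q) is a standard textbook choice; the supplies are invented.
import math

def marginal_utility(units: float) -> float:
    """Utility of one more unit under U(q) = log(q), i.e. U'(q) = 1/q."""
    return 1.0 / units

for good, supply in [("water (abundant)", 1_000_000), ("diamonds (scarce)", 10)]:
    total = math.log(supply)             # total utility of the whole stock
    margin = marginal_utility(supply)    # what one more unit is "worth"
    print(f"{good:18s}  total {total:6.2f}   marginal {margin:.6f}")
```

The abundant good dwarfs the scarce one in total utility yet is valued at almost nothing on the margin - which is exactly how a resource comes to be taken for granted.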
Lacking any grasp of ecology (a word that barely existed at the time), the North Americans of the mid-1800s relentlessly pursued the passenger pigeon, and forced it into oblivion. If some people with the power of persuasion - a big if! - had studied the birds and figured out what role they played in the ecosystem, perhaps they could have curtailed the overkill. In fact, had they looked at the nearly contemporaneous overkill of the bison, a species that went to the brink but did not go over (and thus survives, in much-reduced numbers, to this day), they might have stopped before it was too late.
I think the sheer numbers doomed the passenger pigeon. A smaller population overall, or many locally modest gatherings, might have failed to provoke the frenzy of blood lust, and spared the species. (Habitat loss contributed to the decline by reducing nesting sites, but the slaughter was responsible for more actual deaths.)
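The tragedy even has a simple quantitative shape. Here is a toy logistic-growth model - every number invented for illustration, not actual passenger pigeon demography - showing how a harvest below the sustainable ceiling leaves a stable population, while one above it drives the count to zero within decades:

```python
# Toy logistic growth with a constant annual harvest. All figures are
# made up; this is a sketch of the principle, not a historical model.

def simulate(pop: float, r: float, K: float, harvest: float, years: int) -> list:
    """Yearly population sizes under logistic growth minus a fixed kill."""
    history = [pop]
    for _ in range(years):
        pop += r * pop * (1 - pop / K)   # logistic growth
        pop = max(pop - harvest, 0.0)    # the slaughter
        history.append(pop)
    return history

# With r = 0.5 and K = 100 million, the maximum sustainable yield
# is r*K/4 = 12.5 million birds per year.
steady   = simulate(100e6, 0.5, 100e6, harvest=10e6, years=60)
overkill = simulate(100e6, 0.5, 100e6, harvest=20e6, years=60)

print(f"10M/yr harvest, year 60: {steady[-1] / 1e6:.1f} million birds")
print(f"20M/yr harvest, year 60: {overkill[-1] / 1e6:.1f} million birds")
```

In this sketch the 10-million-a-year harvest settles near 72 million birds indefinitely, while 20 million a year - still a small fraction of the standing population - empties the sky in about a decade. Abundance hides the threshold until it has already been crossed.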
Are we genetically predisposed to act like this? Are we, as author Philip Roscoe suggests in his new book, I Spend Therefore I Am, Homo economicus? Do we always see the world in terms of costs and benefits, calculations of value, and the almighty Bottom Line?
It's an interesting proposal, and I tend to agree - though I wish he were wrong. People defy "conventions" and buck trends all the time, but it's a pretty sad state of affairs if we tend to see the world through money-colored glasses. That outlook is cold and short-sighted, and results in things like disappearing species. In fact, it may have helped usher in the Sixth Great Extinction.
It's time we tweaked that vision, exploded the paradigm.
That other "eco" word - ecology - is, in my opinion, a better alternative.
I will write about that another time.
Monday, December 16, 2013
Emotional Affairs, Familiarity, and the Internet
Quite a departure for today's post: it's more sociological than (neuro)psychological. Though, to be sure, there is an element of the latter - otherwise it wouldn't fit the theme!
A few years ago, a phenomenon called the Emotional Affair caught my eye. Why qualify a word in otherwise normal use? Well, unlike a garden-variety affair, which involves a physical relationship outside of marriage or an equivalent committed relationship, an emotional affair is sex-free. The connection is based solely on emotional intimacy. The love between the two people blooms and grows without much if any physical contact, and is never consummated. Apparently, the incidence of such affairs, sometimes conducted over long distances via e-mail, phone, or texting, is on the rise. How would we know? Since no one thought to survey such "unconsummated affairs" in the 19th century, we have no idea whether they were more common than the novels and plays of the day let on. Still, it is strange to hear that, in this extremely permissive day and age, EAs occur more frequently than before. Don't people just do what they like, hang the consequences? No - some try to compromise: intimacy without cheating (or so it seems). And it is easier than ever to access that intimacy discreetly, thanks to modern technology. A special friend, on a phone or laptop, can be there day and night, ready for a chat.
But the internet is more than e-mail, isn't it? It's a doorway to search engines, which in turn allow us to look up old flames, old crushes, even old lovers. I have no stats on hand, and have no idea if any exist, but I'm willing to bet that the majority of EAs occur between people who knew each other more than 10 years before. In other words, familiarity had already been established with the object of interest, perhaps very well established.
This fascinates me. We have no shortage of potential partners out there, available through face-to-face encounters or online ones, yet so many of us seem drawn to prior relationships. I wonder if this connects with the brain's fundamental tendencies - in (illicit) romance of all things. (I knew I'd bring up the brain sooner or later.)
Whether we are aware of it or not, one of the most stressful things about meeting another human being is reading intentions. Sometimes, depending on us, or on them, we can size a person up in an instant. Usually, it takes far longer to get past good manners, conventions, and the like. To sort out all those signals, the brain has evolved a plethora of mechanisms - from the analysis of facial expressions to the detection of subliminal cues. Culture helps us a lot as well. (That may explain the popularity of movie love stories, for example.) How much easier if a person comes "pre-tested"!
Imagine a woman who has been married for 15 years. She feels a bit bored, and starts daydreaming about an old boyfriend she really clicked with 25 years ago. In previous eras, she would have had a major challenge finding his whereabouts unless he had an unusual name, or his family still lived nearby. Nowadays, that old flame could be a few mouse clicks away. If he's happy to hear from her, the e-mails will shuttle back and forth, drawing on nostalgia, loneliness, facility of communication, and the sense of doing "nothing morally wrong." Of course, it can get very complicated very quickly. Emotional connections are sometimes more powerful than purely sexual ones. Familiarity and nostalgia can fan the flames even higher. After all, we think on some level, I know he's a good guy. He would never lead me astray. (As if only strangers are potentially weak or wicked!)
As I have mentioned in previous posts, the human brain owes much of its complexity to social behavior and - according to my theory - our challengingly diverse diet. One of the most precious features of any environment full of friends and foes, human or edible, is familiarity. If we know Joe is a member of our tribe, we relax a little when we recognize him approaching; when we see a berry we know to be sweet, safe, and easy to eat, we relax and reach for it. Unknowns require more deliberation. They involve more risk.
The world is full of unknowns for the curious, adventurous, sometimes impulsive Omnivore Brain. Navigating the unknowns is a perilous but highly exciting journey.
More in future posts about how we pass on our results to others, helping them avoid some of our mistakes!
Friday, December 6, 2013
Violence, Injustice and the Brain
Yesterday, a great man of peace died. You will have heard about that by now.
Today is the 24th anniversary of the Montreal Massacre. On Dec. 6, 1989, 14 young women were shot to death by a young man in a Montreal university. (You probably have never heard of this tragic event unless you lived in Canada at the time.)
How can there be a connection between the life of Nelson Mandela and a senseless act of violence directed specifically against women (the shooter famously said he was targeting "feminists")? They are such different takes on anger. In fact, they lie at opposite poles.
In one case, a man acted on his anger and hurt, ending the lives of 14 women in their prime of life, and ruining the lives of all who loved them. In the other case, a man who had suffered greatly answered back not with vengeful acts of violence but with love and guidance.
Even in the face of apartheid's brutality, Mandela refused to be ruled by hatred. After 27 years in prison for his resistance in the 1950s and '60s, he did not emerge a bitter man. In fact, when he became president of South Africa, he created the Truth and Reconciliation Commission to bring former adversaries into harmony with each other. (At the same time, he certainly had no desire to sweep all the past wrongs under the rug. He wanted it all out in the open, to be resolved, where apologies and restitution could be made to the victims of apartheid.)
We marvel at this man and his legacy because it is so extreme. The "norm" seems to be revenge and settling scores. A whole film genre seems devoted to this idea, and we have become numbed or inured to such behavior because it seems ubiquitous.
It does not have to be this way.
Everywhere, we hear of nations getting back at each other for recent transgressions, or for ones that are hundreds of years old. On a personal level, we hang onto slights and remember them even after many years have passed. We nurse our wounds, as if letting them heal would only hand further victory to the people who hurt us. The cycle of hurt and hatred never ends.
The human brain evolved emotions and the capacity for culture - which, together, gave us both the incentive to avenge wrongs done to us or our kin and the strength to forgive. Evolution can be cited in both cases, not only the one that sells the most movie tickets. When someone says we are a warrior species and that nothing can be done about it, we have to think of Great Souls like Mandela, Gandhi before him (who, incidentally, began his activism in South Africa), Olof Palme, Martin Luther King Jr., and other peacemakers. They resisted - or, more accurately, rechanneled - the strong impulses born of anger, and by doing so they inspired millions, showing love and guidance instead of hatred and retribution. They lived their values and truly believed in a universal brotherhood.
Human beings and human society are not simple - and neither are the strategies to deal with difficult situations like resource scarcity, natural disasters and interpersonal misunderstandings. We are doomed to repeat the mistakes of history if we cling to the idea that our violent heritage has no room for our equally strong capacity for love and cooperation.
[Image: Memorial to the 14 women]
Friday, June 28, 2013
Why the Movie "World War Z" is Not So Far-fetched
Some people have trouble going to movies. It's hard for them to suspend disbelief, especially when the director, the scriptwriter, and the special effects get the science wrong. By wrong, I don't mean they have the nerve to suggest that time travel could work or that people can fly. No, I'm talking about movies whose premise is embedded in reality yet remains somehow unaware of, or in defiance of, current scientific understanding. It's as if they expect the audience not to know any better. (Not only is that insulting, it ends up acting as a self-fulfilling prophecy.)
I have railed at movies that depict certain animals inaccurately (creating irrational fear outside the cinema!), and at characters with a mental illness who do not act in keeping with what we know about the condition (in other words, the director relies on stereotypes of "otherness" for effect). Misunderstandings of basic biology, in particular, can make my skin crawl.
To my great relief, a movie I saw a few days ago didn't push my buttons - and it was a zombie movie, of all things! World War Z, directed by Marc Forster.
Of course, the movie expects a certain suspension of disbelief: for one thing, there is no such thing as the undead. (If an organism is still reactive and moving, it's alive. It may not have much quality of life, mind you, but it's not dead.) And the disease, passed on by the zombies' ferocious bite, transforms victims in mere seconds.
What I really liked about the movie - besides seeing a very intense Brad Pitt on the screen almost all the time, that is - is the zombie idea itself. The disease, which is never isolated and identified, affects a victim's brain. Suddenly, and I mean suddenly, the zombie becomes a twitchy, speedy, chomping machine with a single motivation: to attack human beings and bite them. That's all. There is no brain eating. The infected zombies want to recruit others.
The infection (which takes hold so rapidly) drives them to do this.
This is where the movie makes the most sense, scientifically.
In nature, there are many examples of pseudo-zombification, cases where a parasite - often a macroscopic one, like a fluke, but plenty of viruses, too - takes over the brain of the host and compels it to complete the parasite's life cycle. It's very, very creepy but true.
Example: toxoplasmosis (caused by the protozoan Toxoplasma gondii), which makes rats lose their fear of cats, disinhibiting them to the point of actually asking Kitty to play. Kitty, not clear on the concept, eats rat, ingests parasite, excretes eggs at a later date.
Example: grasshoppers that commit suicide by jumping into pools, pleasing the hairworm parasites that have taken over their brains. (The worm needs water to reproduce.)
Maybe one day, some parasite will jump from another species and cause humans to act strangely around other people - or around an intermediate host, which then infects more humans. The parasite will change us to get what it wants.
It's horrible to contemplate. But it isn't all that far-fetched.
Thursday, May 16, 2013
The Power of Analogy
Yesterday, I read a fascinating article co-written (with Emmanuel Sander) by one of the great science writers of our time, Douglas Hofstadter. "The forgotten fuel of our minds" can be found in the May 4 issue of New Scientist. According to the two authors, new research examining the use of analogy in everyday thought suggests that it is far from an occasional tool for supporting a proposition or explaining something: it is the very basis of most thinking in the first place. Whether we are learning something new by mapping it onto pre-existing knowledge, or we are simply having a conversation, we take old ideas and use them in novel situations.
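To make "mapping onto pre-existing knowledge" concrete, here is a toy sketch of my own - in the spirit of the article's claim, not an implementation of Hofstadter and Sander's ideas. A relation learned in a familiar domain is reused in a novel one by swapping the roles:

```python
# Analogy as role-swapping -- an illustrative toy, not a cognitive model.
# The classic solar-system/atom analogy: old relation, new labels.

solar_system = {("sun", "planet"): "is orbited by"}

role_map = {"sun": "nucleus", "planet": "electron"}  # the analogy itself

atom = {
    tuple(role_map[role] for role in roles): relation
    for roles, relation in solar_system.items()
}

print(atom)  # {('nucleus', 'electron'): 'is orbited by'}
```

The knowledge is old; only the labels are new - which is the article's point in miniature.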
We use analogy whenever we use metaphors and similes, e.g., "His hair was like a bunch of ropes" or "Your premise doesn't hold water." And everyone uses expressions like these all the time. In fact, it is hard to imagine a conversation devoid of them altogether!
As I was reading, the power of analogy went to work in my own mind: I drew parallels with my ongoing research into the omnivore brain. Could the ability to draw connections with previously acquired knowledge about the world (and oneself) have begun when we "learned" to be omnivores, about two million years ago?
Think about it: a herbivore (an antelope, say, or a cow) doesn't need to differentiate much of its environment, but an omnivore, which must navigate a sea of choices (good, bad, and in between) every time it wants to eat, does nothing but differentiate. It starts off life by slowly learning safe choices from its mother, then adds to this collection of useful facts - presumably by trial and error - as it goes along.
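That learning loop is simple enough to cartoon in code. A hypothetical forager - with invented foods and outcomes, purely for illustration - starts with inherited knowledge and extends it by risky sampling:

```python
# A cartoon of omnivore food learning. Foods, outcomes, and the random
# sampling are all invented; the point is the growing lookup table.
import random

random.seed(42)

world = {"berry_A": True, "root_B": True,      # True = safe to eat
         "fungus_C": False, "grub_D": True, "leaf_E": False}

known_safe = {"berry_A", "root_B"}             # "learned from mother"
known_bad: set[str] = set()

for _ in range(20):
    food = random.choice(list(world))
    if food in known_safe or food in known_bad:
        continue                               # cheap, familiar decision
    # Trial and error: taste the unknown and remember the outcome.
    (known_safe if world[food] else known_bad).add(food)

print("eat:  ", sorted(known_safe))
print("avoid:", sorted(known_bad))
```

Each unfamiliar item costs one risky trial; afterwards it is a cheap lookup - differentiation hardening into habit.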
It is no accident that some of the smartest animals are among the most diverse in their eating habits - bears, crows, raccoons. (Elephants are a notable exception that proves the rule.)
Humans pride themselves on their cognitive abilities, citing logic, reason, and occasional intuitive powers among their strengths. The power of analogy contributes to our success as a species that confronts many choices - some challenging, some outright lethal - every day. Wouldn't it be interesting if that power were due to ancient adaptations to a diverse diet?
Friday, December 21, 2012
Overeating and tradition
It's the time of year when many Christians, former Christians, and those who simply want to jump on the bandwagon celebrate the end of the calendar year with at least one feast day: Christmas. It's no coincidence that the day honoring Christ's birth falls around the winter solstice. It is the gloomiest part of the year in the northern hemisphere, the start of winter, and a time for embracing any excuse to indulge in light, heat, and ample food (and drink). The Roman (pagan) feast of Saturnalia falls around this time, and we all know that the Romans were party animals.
Today is one of the two ironic days of the solar year. It marks the beginning of winter - yet tomorrow the days start growing longer until we reach the summer solstice (around June 21). That day is the first day of summer, of course, but it also heralds the shortening of the daylight hours all the way up to Dec. 21 or 22.
If you look at a calendar from a couple of centuries ago - say, 1800 - you will see how many saints' days there were. People were expected to fast on certain days and feast on others. I suppose this was a system that had been worked out (not necessarily "planned" by an elder or a committee, though who knows what kind of control went on, way back when?) to help with food distribution amongst the populace. It was normal to go without all food for a day or two, or without certain foods - usually scarcities like land animals and their products - for a longer period, e.g., Lent. Afterward, there was a feast of some kind. The entire year was broken up into this on/off pattern, not just at major holidays like Easter.
When people now laugh about overeating at Christmas, they are simply honoring tradition, if you really think about it. The trouble is, they do not honor tradition all the time! They never fast. They never go without certain foods as a way of 1) respecting scarcity and 2) valuing the foods they consume by going without them from time to time (absence makes the heart grow fonder). That lopsided "traditional" behavior no doubt contributes to the epidemic of obesity in Western countries, the plague of food waste, the proliferation of intensive farming operations, and the attendant environmental pollution and animal cruelty. More appreciative consumers would better honor the living beings who feed them.
Maybe it's time to bring back some of the old ways - while we still have the freedom to choose to abstain. Any future scarcities - leading to local or widespread famine - will force us to go without luxuries and maybe without minimal daily nutrients. How about starting a fashion for pre-holiday fasts? They wouldn't involve starvation, just selective abstinence and calorie reduction. How much better roasted bird and high-calorie desserts would taste after a week of bean soup and dry crackers!
HAPPY SOLSTICE!