The next few years could see a shift in emphasis in the non-profit world – at least, if the work of organisations supporting entrepreneurs is an indication of the direction the sector is taking. “Philanthropy is one of those wonderfully antique words that we will stop using in 10 to 15 years,” says Bill Drayton, who founded Ashoka and pioneered the idea of identifying and investing in entrepreneurs. “The business/social boundaries are simply collapsing.”
As models such as venture philanthropy, microfinance and social entrepreneurship are embraced by non-profit organisations, and corporations start to focus on social issues, the barriers between the business and non-profit sectors continue to erode.
Leading the way in breaking down these barriers is an expanding cohort of non-profit organisations whose mission is to support for-profit entrepreneurs. In Afghanistan, for example, Arzu Rugs helps women generate income by sourcing and selling their rugs, providing employment for about 700 female weavers in the country’s villages.
In the US, Count Me In, founded in 1999, also focuses on women, providing loans and other resources to businesses they own. “It’s a long-term approach to poverty alleviation – creating businesses that create jobs,” says Nell Merlino, founder of Count Me In.
Aid to Artisans, a 30-year-old non-profit organisation, helps craftspeople in developing countries gain access to new markets for their products and establish sustainable business models. Those it supports now sell to outlets such as Neiman Marcus, Saks Fifth Avenue, Pottery Barn and Crate & Barrel, as well as to smaller boutiques.
First, the organisation helps artisans develop products that will appeal to western consumers. “Design is really key because you need a good product to develop a viable business,” says David O’Connor, chief executive of Aid to Artisans. “We also do a lot of skills training in areas such as production and marketing. Then we link producers with buyers.”
As well as funding individuals, many of the non-profit organisations targeting entrepreneurs aim to help those entrepreneurs expand their businesses and to have a wider impact.
Ashoka seeks out entrepreneurs (it calls them “Fellows”) with big ideas. By making a financial investment in them and giving them access to its network of other Fellows, as well as to experts in areas such as marketing and accounting, Ashoka believes it can help both the individuals it supports and the communities they serve. Moreover, these entrepreneurs can serve as role models to others. “They have to get local people in thousands of communities to say ‘this idea is better’, and introduce the idea of being a changemaker to the community,” explains Drayton. “Then you have a highly contagious process – and that’s the most important thing social entrepreneurs do.”
A similar philosophy lies behind the work of Acumen Fund. Acumen takes the philanthropic funds it receives from organisations such as Google.org and the Bill & Melinda Gates Foundation, as well as from wealthy individuals, and looks for entrepreneurs serving poor markets.
In India, for example, it supports Drishtee, an internet kiosk franchise business. The kiosks enable farmers to check commodities prices online and give villagers access to services such as health insurance via the internet. As well as taking an equity stake in the business, Acumen also funds individual loans to entrepreneurs who want to buy the start-up kit they need to open a kiosk.
Another organisation that supports entrepreneurs is Endeavor, which identifies what it calls “high impact” entrepreneurs in emerging markets such as Argentina, Mexico, Uruguay, South Africa, India, Egypt and Jordan.
However, rather than raising capital or granting funding, it mentors entrepreneurs in areas such as financing, marketing and leadership development. In Brazil, for example, it helped Tecsis, a technology company, come up with a programme for expansion into international markets, which created more than 1,000 local jobs.
By supporting the right entrepreneurs, the organisation gets a bigger bang for its buck. “In 2006, Endeavor’s worldwide budget was $6.87m and Endeavor entrepreneurs generated revenues of $1.9bn,” says Elmira Bayrasli, the organisation’s head of partnership policy and outreach. “That’s a pretty good return on investment.”
Leslie Lenkowsky, a professor of philanthropic studies at Indiana University’s Center on Philanthropy, believes that for the next generation of donors, this model is highly attractive. “A new generation of philanthropists is coming on the scene,” says Lenkowsky, who also teaches a social entrepreneurship course at the University’s School of Public and Environmental Affairs. “A lot of people now feel traditional philanthropy is something their grandmothers did, and they want to do it differently and more effectively.”
However, some remain sceptical of the market-led approach, Lenkowsky says. He cites comments made in the book Just Another Emperor? The Myths and Realities of Philanthrocapitalism by Michael Edwards, director of the Ford Foundation’s governance and civil society unit. “One of the points Edwards makes is that the examples of successful social entrepreneurship ... most frequently cited typically succeed because of continuing subsidies of one sort or another,” he says. “There are very few that have become market sustainable.”
However, many argue that addressing poverty is not a case of choosing between the market and the charitable sector, but will involve what Lenkowsky calls “a marriage of for-profit and non-profit for philanthropic goals”.
Drayton argues that the old division between sectors is meaningless for entrepreneurs. “Entrepreneurs disrespect boundaries – they don’t care if it’s business or social,” he says. “With every human need, you have a business system serving the need and a social system serving the need and for centuries they haven’t talked to each other – that’s all changed.”
Bayrasli agrees. “For so long, people thought you needed a private sector, government and non-profit world and everyone would do their own thing. But no one was connecting the dots.”
She believes the role of non-profits will remain crucial when it comes to supporting entrepreneurs. “We’ve been asked the question: ‘Why aren’t you doing this as a for-profit organisation?’” she explains. “And the reason is that, in emerging markets, people don’t have a community to go to and they don’t trust anyone. Entrepreneurs are afraid to show their business plans to anyone because they’re afraid their ideas will be stolen.”
Because Endeavor has no financial stake in any of the businesses of its entrepreneurs, she says, they feel confident sharing their ideas and problems. “We connect them to the business people that they would otherwise not trust,” she says. “So we’re creating an environment that’s safe and non-threatening.”
Non-profit organisations can also test new operational models and take risks that for-profit organisations – which are accountable to shareholders and clients – may not be able to. Moreover, when it comes to making loans, legislation means some models need to be run through a non-profit structure.
“If we were a for-profit, we’d be subject to banking laws and those limit what you can do to test new things,” says Merlino of Count Me In. “So we went at it from a non-profit perspective, which allows us to experiment. And the notion of accelerating women’s business growth is not something we might have come to as a for-profit organisation.”
For Ashoka’s Drayton, the type of organisation doing the funding is less important than the process of tracking down and supporting entrepreneurs whose ideas have far-reaching potential. “We don’t care how big our organisation is, the size of the budget, or the number of employees. That is not the purpose,” he says. “What we want to do is change the world, and if we can get 10,000 people in 10,000 communities to take the idea and apply it – even if none of it runs through our organisation – we’ve succeeded.”
Unlike traditional development agencies, the mission of those such as Ashoka, Acumen Fund and Endeavor is to bring about change not by what the organisations themselves do but through the progress of the people they support, in improving lives, generating employment, inspiring others and transforming the way things such as education and healthcare are provided.
“You have to give people fish if they’re starving, but it’s better to teach people to fish,” says Drayton. “But it’s better than either of those two to change the fishing industry – and that’s what entrepreneurs do.”
Source: www.ft.com
Wednesday, July 16, 2008
Monday, June 30, 2008
Science & Environment : Are We in Need of a Neuromorality?
“We have no idea how the brain enables the mind. We know a lot about the localization of function, we know a lot about neurophysiological processes, but how the brain produces mental states—how it produces conscious, rational intentionality—we don’t have a clue.”
Armed with an array of tools that sound like an intergalactic arsenal straight from Star Wars, modern neuroscientists are increasingly well equipped for forays to the frontiers of the human brain. Through the use of positron emission tomography (PET), near infrared spectroscopy (NIRS) and functional magnetic resonance imaging (fMRI), neurotechnology is probing the brain with increasing precision and positing biological explanations for human behavior.
Recent breakthroughs in brain science led Martha J. Farah, director of the Center for Cognitive Neuroscience at the University of Pennsylvania, to write in January 2005: “For the first time it may be possible to breach the privacy of the human mind, and judge people not only by their actions, but also by their thoughts and predilections. The alteration of brain function in normal humans, with the goal of enhancing psychological function, is increasingly feasible and indeed practiced. At the same time, progress in basic neuroscience is illuminating the relation between mind and brain, a topic of great philosophical importance. Our understanding of why people behave as they do is closely bound up with the content of our laws, social mores and religious beliefs.”
Neuroscience’s ability to observe the brain’s activity brings with it great promise. However, it simultaneously raises intriguing issues that reach well beyond the research laboratory and into the everyday lives of people. It would appear that, early in the 21st century, we have arrived at the threshold of asking not whether we can monitor and manipulate brain function, but whether we ought to. Faced with the familiar tendency of ethical questions to trail technological advances, the fledgling field of neuroethics—like its predecessors, medical ethics and bioethics—finds itself in a race to catch up with leading neurotechnological trends.
If neuroscience, with its developing technologies and techniques, can consistently and reliably demonstrate individual psychological traits and/or medical predispositions, what moral framework or ethical guidelines will govern its possible uses? Is there a need for a neuromorality?
Current brain imaging technology is in the process of correlating patterns of brain activity and psychological and personality traits. “Brainotyping” appears to have the potential to identify mental health vulnerabilities, predisposition to violent crime, racial attitudes, risk aversion, pessimism, etc. Most neuroscientists are quick to mention that brain imaging in its current state of development is helpful and informative but far from conclusive. Nonetheless, the potential is real, and ethical questions are beginning to be asked about prospective applications of these scientific advances and the information they might generate. Practical and philosophical problems associated with brain privacy, performance augmentation, and the very nature of personhood are among the areas of concern.
PREDICTION VS PRIVACY
If neuroscience can provide the means to expose predispositions and thus predict behaviors, it will challenge our cultural understanding of personal privacy and when or if society has the right to know what an individual is inclined to think or do before he thinks or does it.
Wouldn’t auto insurance companies be interested in identifying persons with a proclivity for risk-taking? Medical insurers would surely benefit from knowledge of the medical predispositions of their prospective clients. Can we envision a time when neuroimaging would be used to screen applicants for employment? Could the college admissions process someday be supplemented by brain scans to weed out candidates who would likely be diagnosed with schizophrenia during their college years?
Consider the serious implications of brain research in a judicial setting. What if we were able to detect brain patterns or defects that would predict the possibility of criminal behavior? Addressing this theme at an American Enterprise Institute conference in June 2005, Henry T. Greely, director of the Center for Law and the Biosciences at Stanford University, asked these thought-provoking questions: “How accurate will the predictions be? How accurate will society perceive them to be? And when should we act on predictions rather than on actions?”
If neuroimaging could help us discover increased blood flow or brain activation that signaled a predisposition for pedophilia, would it matter whether the information was obtained directly or surreptitiously? With whom should such brain-based knowledge be shared? Should we intervene before the committing of a crime, or perhaps require counseling, notification of neighbors, or the person’s disqualification from jobs with children and adolescents? Is the presumption of innocence rendered null and void by a brain-scan prediction?
If we deem intervention at this stage unacceptable, how would we face the parents of a potential victim knowing we had foreknowledge of the likelihood of a heinous crime being committed?
PERFORMANCE ENHANCEMENT
What could be more natural than the impulse to improve? Doesn’t the desire to develop and become better at mastering life’s skills seem instinctual? Isn’t the desire to do better inherent within human consciousness?
What if “doing better” came in a capsule?
According to UN figures, annual U.S. sales of methylphenidate increased from around 60 million defined daily doses in 1987 to more than 300 million in 1997. Why the dramatic increase? Farah points out that “although methylphenidate (Ritalin) and amphetamine (Adderall) are ostensibly prescribed mainly for the treatment of ADHD, sales figures suggest that they are not uncommonly used for enhancement. Methylphenidate is currently widely used by high school and college students. Surveys have estimated that as many as 10 percent of high school students and 20 percent of college students have used prescription stimulants such as Ritalin illegally.”
These medications, designed to help individuals with a cognitive disorder, are being used for cognitive enhancement by healthy students who claim that they study better and longer and earn higher scores on exams.
Psychopharmacology is poised to play a major role in augmenting brain activity. Mood alteration, memory boosts, appetite suppression, improved libido, focused attention and alertness are not only possible but currently practiced through adjustments in brain chemistry. Increased sophistication in technique and treatment is expected.
Are human effort, ingenuity and accomplishment to be recognized in the same way, whether enhanced or not? In the athletic arena we traditionally say no. Olympic and professional athletes discovered attempting to augment their performance by drugs are disciplined or disqualified. What about the academic classroom or the boardrooms of the business world? Is a college student competing for admission to law school unfairly disadvantaged if others are using psychostimulants to enhance their study skills? Are the enhancers cheating? Is the stressed professional who uses the wakefulness-promoting agent modafinil to stay alert and get more work done in a day playing fair in the pursuit of recognition and reward?
Should we even be concerned? Are these examples any different than drinking a sufficient quantity of coffee to “pull an all-nighter” and thereby finish a paper at school or a proposal at work?
Whereas in time past we may have been reluctant to ask our awakening spouse a question until after his or her daily dose of caffeine, today the stakes would appear to be much higher. Modern mood-altering medications are so effective that they can cause confusion in relationships. Farah asks, “If we fall in love with someone who is on Prozac and then find she is difficult or temperamental off the drug, do we conclude that we don’t love her after all? Then who was it we loved? Are we treating people (including ourselves) as objects if we chemically upgrade their cognition, temperament or sexual performance? People vary in how troubling they find these scenarios, but at least some see a fundamental metaphysical distinction eroding, the distinction between things (even complex biophysical things) and persons.”
PERSONHOOD
What is the defining distinction between a person and a thing that, according to Farah, appears to be eroding? For many, the slippery slope is the distinction between the brain, a physical organ of the body, and the human mind exercising free will.
Are you your brain? Or is there something else that makes us human beings?
Biological determinism, in attempting to answer that question, presents a comprehensive explanation of human nature and behavior based on the brain’s physiological processes. It professes that human conduct arises from the hardwiring of the brain and that free will is implausible. Steven Pinker, professor of psychology at Harvard University, resists the “determinist” label but nevertheless concludes that free will is an irrelevant concept because we are the sum total of our brain’s purely physical mechanisms. If that is so, are we responsible for our attitudes and actions? Pinker proposes that instead of asking whether an individual is personally responsible, we should ask “Does the person have an intact version of the human brain system that ordinarily responds to public contingencies of responsibility?” He suggests that a new neuromorality should focus on exploring and establishing contingencies that can deter inappropriate behavior and not concern itself with personal culpability.
This perspective appears to be gaining significant ground. Farah states, “As ethicists and legal theorists have grappled with neuroscientific accounts of bad behavior, they have increasingly turned to alternative interpretations of responsibility that do not depend on free will, and to so-called ‘forward-thinking’ penal codes, designed not to mete out punishments for previous behavior but to encourage good behavior and protect the public. The ‘disease model’ of substance addiction, and the extension of the medicalized notion of addiction to other compulsive behaviors such as compulsive gambling and compulsive sex, is another way in which brain-based explanations of behavior have impacted society. The disease model emphasizes the deterministic and physiological nature of the behaviors and thereby reduces their moral stigma.”
The disease model, as applied to human behavior, implies that we are helpless victims of our brain’s biochemical reactions, that our brain-based behavior is an irresistible physiological imperative. Psychiatrist and scholar Sally L. Satel argues that, using this model, an addicted person is understood to be unable to control his behavior. However, she suggests that an addict is actually someone who does not control his behavior and has the perception that he is helpless. The primary purpose of drug treatment is to teach the addict that he is not helpless at all, that he has clear-thinking periods in which he makes dozens of microdecisions every day that contribute to whether he continues to use or not. Individuals with a brain disorder such as schizophrenia cannot decide not to hallucinate, just as those with multiple sclerosis cannot choose not to have spasticity. On the other hand, Satel contends that addicts can—and often do—choose to modify their behavior.
So what are we to conclude from this debate? Are we helpless to resist our brain chemistry? Or are we entities exercising free will and conscious choice? As Satel points out, a brain scan cannot tell us whether a “craving impulse is irresistible or whether it was simply not resisted.”
Studies show that the public appears open to the findings of neuroscience, expressing considerable confidence in brain scans as a virtual test of truth. The public’s receptivity reportedly exceeds the confidence that many brain researchers are presently willing to express. Farah suggests that statements like “the brain doesn’t lie” illustrate “a failure to appreciate the many layers of signal processing and statistical analysis that intervene between actual brain function and resulting image or waveform, as well as the complex set of assumptions required to interpret the psychological significance of such images.”
Even among a seemingly open-minded public, most people still believe that they have a mind as well as a brain and that the two are not the same. Advances in neuroscience appear to collide with this almost universally held belief.
SCIENCE VS THE SPIRIT
The Judeo-Christian ethic, deeply rooted in Western civilization, has at its core the belief that man is a free moral agent made in the image and likeness of God and accountable to his Creator for his life’s conduct. The apostle Paul describes this concept in his first letter to the church at Corinth. He writes, “For what man knows the things of a man except the spirit of a man which is in him?” (1 Corinthians 2:11). Paul describes a spirit component in man that enables him to function as a human being, “to know the things of a man.” Could it be that we intuitively acknowledge that we are more than complex biochemical entities because we have “the spirit of man”? Could it be that there is a nonphysical feature that resides in us and functions in conjunction with a healthy human brain system?
Impossible to prove scientifically, but plausible to people of sincerely held conviction, is the belief that human beings have minds capable of a relationship not only with each other but with God. Before we embrace the tenets of a materialist neuromorality, perhaps we should revisit the Judeo-Christian ethic to examine whether it has indeed failed us or whether we have simply failed to appreciate, respect and apply it.
STAY TUNED
In anticipation of increased interest in performance enhancement, neuroethics, to its credit, is attempting to prepare for the challenges ahead. More invasive and sophisticated methods of brain manipulation and enhancement are currently being explored, such as brain nerve stimulation, brain surgery and brain-machine interfaces. How brain research is conducted, what it discovers, and how society will use these findings are issues of great interest, with implications for many aspects of human experience.
Fasten your seatbelt as we journey to the innermost recesses of the mind. Professor Farah declares that the question is “not whether, but rather when and how, neuroscience will shape our future.”
THOMAS E. FITZPATRICK
thomas.fitzpatrick@visionjournal.org
(This article originally appeared in the Fall 2006 issue of Vision. Revised for Spring 2008.)
SELECTED REFERENCES:
1 Martha J. Farah, Ph.D., “Neuroethics: The Practical and the Philosophical,” Trends in Cognitive Sciences (January 2005).
2 Sally L. Satel, M.D., and Frederick K. Goodwin, M.D., Is Drug Addiction a Brain Disease? (1998).
RELATED ARTICLES:
Neurogenesis: Changing Your Mind
Science & Environment : A Penny for Your Thoughts
The going rate for a “thought”—a probe into the thinking of another—was once quite a bargain. Today, more than four centuries since the phrase “A penny for your thoughts?” was first recorded, inflationary accounting makes that ancient penny worth more than $40. Even with the sliding value of the dollar, this still seems quite the bargain. Of course, times haven’t changed much in one sense, considering the buyer still can’t be sure of the veracity of his purchase.
How much would you pay to know what thoughts are swimming around in someone else’s head? And if you could really know their truthfulness how much more would you pay?
Such fantastical questions have long been the bread and butter of fiction. From the Twilight Zone to Minority Report, the idea of reading minds—of seeing the private intentions of another, and the possibility of intervening in those plans—has always been highly attractive.
Not long ago science was satisfied with outlining the areas of the brain responsible for various functions and the processing of sensations: frontal lobe for “higher thought,” optic lobe for vision, etc. Not so today. As Martha J. Farah, director of the Center for Cognitive Neuroscience at the University of Pennsylvania, puts it, “For the first time it may be possible to breach the privacy of the human mind, and judge people not only by their actions, but also by their thoughts and predilections” (See “Are We in Need of a Neuromorality?”).
Penalizing and controlling those once private thoughts may eventually become de rigueur. Science fiction writer Michael Crichton imagined more than 35 years ago that understanding the neural origin of our thoughts and motivations would water the seeds of our desire to intervene. At first, interventions would be for medical reasons, to control seizures and epileptic “brainstorms” that incapacitate their victims. In Terminal Man (1972), Crichton presaged this concept with a computer “that will monitor electrical activity of the brain, and when it sees an attack starting, will transmit a shock to the correct brain area. This computer is about the size of a postage stamp and weighs a tenth of an ounce. It will be implanted in the skin of the patient’s neck.”
In Crichton’s story, Benson, a brain-damaged accident victim, suffers blackouts during which he becomes violently psychotic. Certainly this seems a humanitarian reason to intervene, but it evolves into the sort of “could we, should we” question that is the foundation of many of Crichton’s novels. The conclusion is not surprising. The feedback loop between computer and brain becomes positive rather than negative. The stimulus pathway that was meant to negate the deviant thoughts becomes the pathway to encourage those actions. The sensation created within Benson’s brain becomes pleasurable; more violence is fed with more pleasure.
“It feels so good,” Benson said, still smiling. “That feeling, it feels so good. Nothing feels as good as that. I could just swim in that feeling forever and ever.”
In electronically countering Benson’s seizure, the computer’s input has been reinterpreted as a positive; a good feeling is perceived. This experience drives the seizure forward and, as the story continues, drives Benson to greater and more frequent bouts of violence. His brain molds itself to the new conditions created by the computer’s sensory input.
In a no longer unusual case of fiction becoming fact, Crichton was writing of neural plasticity 35 years ago. With penetrating insight he mused:
Our brains were the sum total of past experiences—long after the experiences were gone. That meant that cause and cure weren’t the same thing. . . . As the Development people said, “A match may start a fire, but once the fire is burning, putting out the match won’t stop it. The problem is no longer the match. It’s the fire."
As for Benson, he had had more than twenty-four hours of intense stimulation by his implanted computer. That stimulation had affected his brain by providing new experiences and new expectations. A new environment was being incorporated. Pretty soon, it would be impossible to predict how the brain would react. Because it wasn’t Benson’s old brain anymore—it was a new brain, the product of new experiences.
In light of recent news concerning remarkable ways to image and bloodlessly dissect the human brain, much of Terminal Man seems medically quaint and naïve. Still, knowledge about the mind is a terrible thing to waste. As the linked articles below bear out, with every passing day we are learning more about the brain and how to manipulate and stimulate it.
This is all meant to be for the good; neurologists are not aiming at mind control. Clearly, however, as Crichton writes above, new experiences create new expectations. New feedback loops come into existence; new stimuli bring about new responses. A penny for your thoughts?
DAN CLOER
RELATED ARTICLES:
Are We in Need of a Neuromorality?
Neurogenesis: Changing Your Mind
Brain Science Has a Change of Mind
Give Sorrow More Than Words: The Neuroscience of Grieving
Neuroscience Enlightens Leadership
FOR MORE INFORMATION
Brain fitness seen as hot industry of the future
Society and Culture : The Moral of the Story
Between “once upon a time” and “happily ever after” lies a timeless, ever-changing world, where everything is possible and dreams do come true.
Countless fairy tales with infinite variations, usually conveying moral, social or political lessons through skillful narrative and interesting characters, have existed throughout history and throughout the world. Consider Aesop’s fables, the basis for so many of our contemporary moral stories. The still-popular tales have lived on for more than two millennia, exemplifying extraordinary power and longevity. Other early influences on our literary tradition abound: Cinderella stories, for example—distressed damsels losing diminutive footwear—are found in ancient Egypt and ninth-century China.
The nature of this genre seems to invite evolution. Originally these deceptively simple stories were passed orally from generation to generation. As the printed word became more accessible, the tales became somewhat less mutable for a time. Today the images we see on the movie screen have firmly implanted themselves in our minds and have all but supplanted the originals.
More significant than the changes themselves, however, is what the evolution of the fairy tale tells us about ourselves and our changing society.
FAIRY TRAILS
The origins of the fairy tales we know today are found in sources as varied as mythology and the Bible. Common themes can be found in most cultures, whether through commonality of experience or because the tales themselves traveled with both conquerors and the conquered. Globetrotting folktales were used sometimes to educate and sometimes to frighten children (and adults) into compliance, graphically warning of the consequences for wrong actions.
As the centuries passed, virtue and a sense of morality ebbed and flowed, both in real life and in the tales that accompanied mankind on the journey. Among medieval peasants, folktales passed from those older and more experienced to younger adults and children as moral lessons for life. Many take place during the hero’s or heroine’s passage from childhood to adulthood, often ending in marriage. Along this fantastic path are not only challenges to be overcome but warnings: the perils of being alone in the woods; the potential pitfalls of physical attractiveness; the dangers of being naïve.
The stories often addressed subjects in veiled terms. According to folklore researcher and retired professor D.L. Ashliman, “many fairy tales owe their longevity to an ability to address tabooed subjects in a symbolic manner” (“Incest in Indo-European Folktales,” 1997). It is not surprising, therefore, to learn how many of these seemingly benign tales have descended from darker stories involving themes of adultery, incest, cannibalism, rape, murder and mutilation.
As Italy emerged from the medieval period and embraced the Renaissance, one of Europe’s first known written story collections was being conceived by Giovanni Francesco Straparola, often considered the father of the literary fairy tale. In 1550, Straparola first published a collection of stories told within the framework of a greater story. These bawdy literary romps, which reflected the relaxed morality of the time, were clearly not meant for children. By writing as though the stories were told by a group of ladies and gentlemen, Straparola was able to justify his use of shocking vernacular language. This pretext allowed the stories to be accepted by the educated classes in Italy and later throughout Europe, anesthetizing them to vulgarity in literature.
SATIRE AND SYMBOLISM
Straparola’s influence is seen in later European writings, including those of his fellow countryman Giambattista Basile (ca. 1576–1632). Basile’s posthumously published collection of 50 stories followed in the same tradition. His timeless social commentaries highlighted the shortcomings of those who descended to the depths for wealth, power and fame. Included are early versions of classic fables we would recognize today.
Half a century later, Charles Perrault and his contemporaries took some of the earlier European peasant tales and massaged them until they were more suited to the aristocratic salon set of 17th-century France, where storytelling was considered an important social art. He customized the stories and added new ones, often making a point of showcasing the difficulties and the challenges of his time. A collection of Perrault’s stories was published in 1697, subtitled Contes de ma Mère l’Oye (literally Tales of My Mother the Goose). Gone was much of the violence, but added was the subtle sexual innuendo expected in the popular culture of the period. Our modern “Cinderella,” “Little Red Riding Hood,” “Sleeping Beauty,” “Bluebeard,” “Puss in Boots” and others are easily recognized in Perrault’s writings.
His work was characterized by typically French actions and lighthearted humor; for example, Cinderella, with undeniable savoir faire, drops her slipper on purpose. And when Perrault’s prince finds the sleeping beauty, who has been slumbering for a century in the woods, one of the first things he notices is her out-of-style clothing. The wicked queen, mother of the prince, upon discovering the clandestine marriage of the pair and their subsequent offspring, orders one of her grandchildren to be cooked for dinner. But not just any recipe will do: the gourmand requests that the child be served with a classic sauce Robert.
A rhyme pointing the moral came at the end of each of Perrault’s stories. His warning to young girls about the nature of wolves, for instance, leaves no doubt that he was not referring to canines in “Little Red Riding Hood.” One English translation reads:
Little girls, this seems to say,
Never stop upon your way,
Never trust a stranger-friend;
No one knows how it will end.
As you’re pretty so be wise;
Wolves may lurk in every guise.
Handsome they may be, and kind,
Gay, and charming—nevermind!
Now, as then, ’tis simple truth—
Sweetest tongue has sharpest tooth!
Perrault’s social circle included Marie-Catherine d’Aulnoy, who published her own stories in an anthology titled Contes de Fées (Fairy Tales); the term lives on.
According to historian Marina Warner in Wonder Tales, many of d’Aulnoy’s stories and similar “Beauty and the Beast” tales were based on the classic fable of Cupid and Psyche. The common thread, fear of an unknown or brutish groom, struck a chord with the women of France, who were beginning to challenge the traditional balance of power and the common practice of arranged marriages. Warner states, “Though the message is largely lost on today’s audience, thoroughly accustomed to choosing not just one partner but several, the French wonder tale was fighting for social emancipation and change on grounds of urgent personal experience.”
The objects of these stories went beyond weddings and women’s issues. The indiscretions and warmongering of the king and his courtesans were also subtly spoofed in the veiled satires, sometimes resulting in exile for the authors.
GRIM TALES
Using stories for political ends was not limited to the French. Neither, obviously, did biblical values tend to be an overriding theme. But often as not, the changing tales did reflect each society’s prevailing interpretations of religious themes. Anti-Semitic blood libel stories—the later-debunked tales of ritual murders and drinking of Christian children’s blood by Jews—were started by early Christian zealots and propagated during the Crusader era. These tales were found throughout Europe and encouraged in Martin Luther’s Germany, and later they even appeared in a well-known collection of folktales.
The Romantic period of the early 19th century saw a growing fascination with a glorified primitive or peasant culture. Germany had mostly recovered from the effects of the Thirty Years’ War, which had left a third of the population dead and the rest struggling with famine and disease. Stepparents and early death had been facts of life for much of the population, and the folktales reflected that reality. The stage was set for the work of Jacob and Wilhelm Grimm, known for their work in promoting a common German culture and language. Today the world at large recognizes the brothers Grimm as the authors of what may well be the best-known anthology of fairy tales, translated into more than 160 languages.
The brothers collected tales from friends and acquaintances, some of whom were fluent in French and intimately familiar with the popular fées. The Grimms declared the tales pure, original and German, yet they were conflated from the writings of Perrault and his contemporaries, from the anthologies of Basile, and from storytellers of the Middle East, Asia and elsewhere. Even with the multicultural influences, however, their stories demonstrated a distinct Germanic flair.
Despite claims of wanting to retain literary purity, the brothers changed the stories over the years. Their earliest manuscript dates from 1810, with various revisions being published from 1812 to 1857 (the last edition being the basis for most of the translated Grimm tales we have today). Each revision took away some of the sexual overtones and gruesome violence against the innocent (though not against wrongdoers), and added lessons in their brand of Christian morality. This sometimes altered the stories in a dramatic way: for example, Snow White’s jealous biological mother from the first edition became a vain stepmother in later editions, changing the theme from a complex mother-daughter rivalry to a much simpler moral lesson against vanity.
MUCH ADO ABOUT BOWDLER
Meanwhile, in Puritan England, where the child mortality rate was high, the fear of eternal damnation for unprepared children had been a driving force in the popularity of instructive literature like John Bunyan’s Pilgrim’s Progress. And those in the privileged, literate classes had tried to restrict the nature of children’s literature to stories that reinforced class distinctions, such as the upper class feeling charitable toward the poor, who always reacted humbly and knew their place in society. They saw danger in fairy tales encouraging upward social mobility, where a peasant could marry into the aristocracy and live happily ever after.
But the 18th century saw changes in English society, with a growing and increasingly literate middle class carrying newfound discretionary income, a budding children’s culture, and money to be made in commercial endeavors. Before long, dozens of volumes of fairy tales were translated from European languages and turned into inexpensive books, which the children of the working classes devoured. In response, the fairy tales underwent dramatic changes, nearly eliminating the fantasy and including even stronger moral lessons, with strained, sometimes unintentionally humorous results.
Onto this post-Puritan stage stepped Thomas Bowdler, whose surname became immortalized as a verb after 1818 when he published his sanitized and paraphrased version of Shakespeare, titled The Family Shakespeare. Bowdlerization was the answer for those who believed suitable literature was to be purely didactic and devoid of fantasy. Piety and virtue were esteemed and enforced, so in books that otherwise ran the risk of being banned outright, material deemed objectionable was deleted or purified.
George Cruikshank, a popular illustrator of the Grimms’ translations and Charles Dickens’s works, became an outspoken moral revisionist in the straitlaced Victorian era. When he tried to turn Cinderella into a promotional tome for teetotalism, however, it was more than Dickens, who was raised on fairy tales, could quietly bear. In Social Dreaming: Dickens and the Fairy Tale, Elaine Ostry remarks that Dickens “helped establish the fairy tale as artistic, respectable and critical of society. He adhered to one vital aspect of the fairy tale tradition: the use of fairy tale to influence the way people acted as social beings. For Dickens and many other writers before and after him, the fairy tale was an essential voice of the nation which carried with it cultural messages. For him, the fairy tale had the power, or the magic, to effect social transformations.” His 1853 “Frauds on the Fairies” counterattacked bowdlerization’s forced revisions with a satirical Cinderella story reworked to be politically correct in that era, 140 years before James Finn Garner did the same with his tongue-in-cheek bestseller, Politically Correct Bedtime Stories.
The furor died, and fairy tales continued their slow evolution. My Book House, a popular set from the early 20th century, included classic literature, fables, fairy tales and stories with historical themes. In keeping with the prevailing ideals of the time, the six-volume anthology was intended to be educational as well as entertaining. The fairy tales included were still somewhat sanitized versions, most notably eliminating all traces of wicked parents. Contributing factors included increased longevity and the exaltation of motherhood as women became more able to choose the size of their families. The romantic ideal was that each child was wanted and precious in the eyes of the parents.
The popularity of literature for children and ethereal art featuring children by a new generation of artists and illustrators, including Jessie Willcox Smith and Maxfield Parrish, helped pave the way for the next major change: the Disney fairy-tale-to-film phenomenon.
FAIRY TALE BENDING
Walt Disney’s film Snow White (1937) broke new ground as the first American full-length animated musical feature. Disney knew his audience—a country that had been through both a world war and an economic depression in one generation. The social and political messages were softened, and the stories were changed to enhance their entertainment value. The project consumed more time and resources than anyone could have expected at the time—nearly $1.5 million was an astronomical sum in the midst of the Great Depression. It was a huge risk and a huge commercial success, as people went without necessities to buy 83 minutes of escape.
Snow White was followed by Pinocchio, Cinderella and Sleeping Beauty. These fairy tale movies, produced before Disney’s death in 1966, were of the same formula, usually involving an adolescent hero/heroine desperately in need of outside help in the spirit of the Grimm versions, but without the violence and harshness. Romantic themes, cheery musical interludes and comic relief before the happy ending became the norm. Villains died or were otherwise disposed of as a result of their own actions, which prevented the blemishing of the pristine character of the hero or heroine. These sunny revisions avoided the unpleasant realities addressed in the earlier tales but also diminished the ability of the hero or heroine to triumph over greater adversity. Yet it was exactly what the paying public of that era wanted, especially for their children.
The late 1960s and ’70s saw a surge of interest in women’s rights in the Western world as the Equal Rights Amendment gained congressional approval in the United States. Australian-born Helen Reddy’s feminist anthem “I Am Woman” hit the top of the U.S. Billboard charts in 1972. In this atmosphere, the Disney-formula heroines were increasingly criticized for their wide-eyed docility. By 1989 the passive princess of the past reemerged in the form of an empowered teenage mermaid taking charge and not listening to anyone—not even her father (see “Set in Celluloid”). Two years later, a beautiful bookworm named Belle tamed the beast and became the new standard for girls everywhere. This calculated reworking of the female protagonist both echoed then-current feelings about femininity and shaped the attitudes of young fans worldwide. More significant and far-reaching is the prevailing trend within these reworked fairy tales of people not looking to a higher authority for guidance but attempting to find solutions from within themselves.
LESSONS SPURNED
With globalization, full-length animated movies have become today’s standard for fairy tales worldwide. Often forgotten are the deeper meanings and lessons of some of the earlier versions, as well as the moralistic revisions of the brothers Grimm. If fairy tales have been a social gauge through the ages, then today’s tales suggest that Western society has shifted even further from supporting biblical values and principles to embracing the concepts of relative morality and self-sufficiency.
The dual forces of cause and effect have been consistently at work through the ages. The mutable fairy tale has always been both an unrelenting influence on society and a mirror of society. From oral tradition, through the literary fairy tale, and now to cinema—we can only imagine what new medium will carry fairy tales to the next generations and what influential messages they will instill.
One thing is certain, however. The current trend in popular fairy tales toward moral ambivalence suggests that the foreseeable future looks disturbingly amoral.
ALICE ABLER
(This article originally appeared in the Fall 2005 issue of Vision. Revised for Spring 2008.)
See also: Set in Celluloid
SELECTED REFERENCES:
1 Jack Zipes (editor), Spells of Enchantment: The Wondrous Fairy Tales of Western Culture (1991).
2 Jack Zipes (editor), The Oxford Companion to Fairy Tales (2000).
3 Marina Warner, From the Beast to the Blond: On Fairy Tales and Their Tellers (1994).
4 Marina Warner (editor), Wonder Tales (1996).
5 Jack Zipes (editor), The Great Fairy Tale Tradition: From Straparola and Basile to the Brothers Grimm (2001).
Labels: Bowdler, Children, Children’s Literature, Fairy Tales, Grimm Brothers, Morals, Perrault, Straparola
Power to the People
Is more and better democracy the way of the future? Will the establishment of democratic forms of government end the strife in contentious regions such as the Middle East? Can the enforced implementation of democracy by the United Nations or the United States solve the world’s problems?
The key to the democratic process is “the people.” To appreciate the future of democratization we need to understand some fundamental aspects of the human mind from which the will of the people originates.
Peace is a basic desire of people everywhere. And if happiness and prosperity accompany it, then so much the better.
We look to our governments to create the conditions required for peace. Is there any form of government that humanity has not tried across time in its quest for harmony? Yet peace, happiness and prosperity remain out of the grasp of most. In fact, no nation can claim it has achieved this enviable state for all its inhabitants.
Some describe the time we live in as the democratic age. They point out that through the evolution of government and the rise of democratic principles in politics and institutions, the world stands at the brink of a positive era. Indeed, compared with the rule of feudal lords, tyrant kings and despotic dictators, who would argue that government based on the needs and desires of ordinary people isn’t a significant step forward? Will democracy therefore continue to spread through the world system, gradually subduing all other forms of government and bringing peace and well-being to all?
Perhaps we should first ask whether the capacity to do so is even inherent in the democratic system. Whatever your view, one thing is certain: democracy has no set definition. It comes in many shapes, sizes and colors and means different things to different people.
Democracy is generally recognized as originating with the Greeks, after revolts in Athens brought a dynasty of tyrants to an end late in the sixth century B.C.E. The term demokratia comes from demos, “people,” and kratos, “rule.” Hence the definition, “the rule of the people.” Aristotle thought that the ideal number of men participating in each democratic system would be about five thousand. No doubt the breadth of contemporary applications would surprise him. Today, the ancient philosopher’s concept of the polis (city) as the basis for democracy has given rise to national and even global models.
While there are many modern variations of democracy, all involve “the people” in processes that either express the will of the majority or act as checks and balances on a centralized authority. America’s founding fathers envisioned what Abraham Lincoln later termed “government of the people, by the people, for the people.” In many republics, “people power” means that the masses of ordinary citizens can exercise their power to remove an elected president from office. The Philippines is an example, having done just that in recent years. On the other hand, several African leaders see themselves as presiding over democracies in that they have been brought to power by the people, yet once in office they rule as dictators. Political analyst Fareed Zakaria differentiates between these various forms in terms of “liberal democracy” and “illiberal democracy.”
For some, “the rule of the people” is the panacea for all problems relating to personal liberty, human rights and freedom in general. For others, it means globalization and its 24-7 partner, the Internet, which transcends political boundaries. In either case, views of democracy tend to be enhanced by the collapse of closed systems, under which the people’s desires have been subjugated by the will of the state.
So while democracy can refer to a doctrine or principle of government, a set of institutional procedures, or a set of behaviors, the core idea is the downward distribution of power and the active participation of the people to influence direction and outcome.
ORIGINAL SINS
The Greeks may have laid the foundation for the system we know as democracy, but the rule of the people has its origins in a much earlier time. In fact, according to the book of Genesis, the seeds were sown just after the creation of humankind in the Garden of Eden, with Adam and Eve. Humans are described as created after the God kind, while animals were created after their own various kinds. Clearly we were meant to be different from other species. Man and woman were given a mind—a physical brain with an additional nonphysical component.
This component is identified in scriptural language as the “spirit in man.” Genesis explains that Adam became a living being once God breathed into him the breath of life (Genesis 2:7). Breath, wind and spirit are related concepts in Hebrew. The ancient Hebrew book of Job teaches that “there is a spirit in man, and the breath of the Almighty gives him understanding” (Job 32:8). Animals do not have such a spirit, and although they have sophisticated brains and can be extremely intelligent, they do not have mental capacity equivalent to human understanding.
Human beings are required to use their minds to process a complexity of information and make a constant stream of decisions of a higher order than is required of animals. Although created by God, we have free moral agency and the ability to use our intellects voluntarily in the decision-making process. This is demonstrated by the instruction given to Adam and Eve relative to two trees in the garden, identified as the tree of life and the tree of the knowledge of good and evil. The fruit of both trees appeared to be good for food, yet God instructed that they eat of one and not of the other. That’s because the trees symbolized two ways of life. One way of life links the human mind to God via His Holy Spirit; the other (forbidden) way rejects God, as humans assume for themselves the right to determine what is right and what is wrong.
Though they may not have realized the full implications, Adam and Eve chose the way of self-determination on behalf of mankind, shunning the direct influence of the Creator. This is the path that humanity has walked ever since, pictured by our first parents’ banishment from the Garden of Eden and the tree of life. Humanity was on its own, with complete freedom to make decisions apart from God. The rule of the people was born.
Human intelligence, with its nonphysical component, is a formidable force, as the accomplishments of human endeavor attest. Yet in crucial areas, such as cooperation and peace in human relationships, the power of the people has not yet produced the desired results. The citizens of the greatest democracy on earth (perhaps the greatest democracy ever) are divided down the middle politically. Anger toward their leaders, who were elected by a system based on the power of the people, boils just beneath the surface. And in Iraq, the Western attempt to replace a tyrant with a democratic system has proved very difficult, to say the least. Perhaps one reason is that many potential beneficiaries of such a system are at cross purposes.
Why is it that human intelligence applied to technology can be so successful, yet when it is applied to a system of government, it seems incapable of producing a lasting environment of peace and safety?
The answer lies partly in the fact that not all knowledge is physical. By refusing to comply with God’s instructions regarding the two trees, the human mind was deprived of access to godly knowledge and confined to the pursuit of physical knowledge. To build and launch space probes, to dam mighty rivers, or to transmit data in nanoseconds requires knowledge of physical laws, and the human mind is very good at accumulating and applying this kind of knowledge. But government, which involves social interaction between people, requires the application of spiritual principles to be truly successful. Success comes through the expression of genuine outgoing concern by those governing toward those being governed, and vice versa.
LAW AND DEMOCRACY
Law plays an important part in democracy. The modern democratic state creates laws to protect the individual liberties of its citizens, as well as to protect citizens from injustices or abuses at the hands of those they elect. Since man took to himself the authority to decide right from wrong, the laws developed within the democratic state are made by man, based on what he decides is right or wrong. And we all know that there are many divergent views regarding moral and ethical values. The best the state can do is to create laws agreed upon by a majority of the people, whether by direct referendum of the masses, by representatives the masses elect, or by judges those elected representatives appoint.
Without a firm moral and ethical basis for the formulation of these laws, they will rest on the vagaries of the human mind deciding for itself the basis for law. Witness the evolution of laws addressing marriage and family relationships within some democratic societies today. Lawmakers struggle even to define marriage, thus compromising their efforts to formulate laws to protect those who participate in the marriage relationship. Who decides what is right and wrong regarding this most important institution? The rule of the people is always going to tend toward a lessening of restraint and thus mediocrity.
Democracy may be better than autocratic or dictatorial forms of government, but it is sadly hollow at the core. As Winston Churchill famously noted, “Indeed, it has been said that democracy is the worst form of Government except all those other forms that have been tried from time to time.” No government is stronger than its moral and ethical foundations. Unfortunately, at the heart of democracy, the people’s determination of what is right is based only on what seems right or what feels right.
ANOTHER SOURCE
What many don’t realize is that another democratic experiment—in the sense of a person taking to himself the right to decide how to live—was undertaken some three thousand years ago and the results documented. King Solomon allowed himself the luxury of trying anything he wanted, without the restrictions of right and wrong. In the first two chapters of the book of Ecclesiastes, he documents the experimentation by which he indulged himself in whatever appealed to him. He says: “I set my heart to seek and search out by wisdom concerning all that is done under heaven. . . . I set my heart to know wisdom and to know madness and folly” (Ecclesiastes 1:13, 17). This accumulated wisdom led him to declare of the human mind, “All the ways of a man are pure in his own eyes. . . . There is a way that seems right to a man, but its end is the way of death. . . . Every way of a man is right in his own eyes” (Proverbs 16:2, 25; 21:2, emphasis added throughout). The prophet Jeremiah, who came from the same Hebrew tradition, was a little more direct when he declared, “I know the way of man is not in himself; it is not in man who walks to direct his own steps” (Jeremiah 10:23). This understanding of the mind of “the people” makes it clear that government based on the will of the people simply will not work.
Perhaps we would all do well to open our minds and ask some of the questions posed by political analyst Zakaria in The Future of Freedom. He asks, “What if liberty comes not from chaos but from some measure of order as well . . . ? What if, as in much of life, we need guides and constraints? And what if liberty is truly secure only when these guardrails are strong?”
The law of God was designed to provide just such guardrails for human conduct. If we stay within them, our conduct will lead to the objectives sought by democracy. According to Moses, speaking to ancient Israel about that law, “the Lord commanded us to observe all these statutes, to fear the Lord our God, for our good always, that He might preserve us alive, as it is this day” (Deuteronomy 6:24). This is the law that provides moral and ethical content as a basis for daily living and protects human rights in the process. Instead of deciding for ourselves what is right or wrong, we need to seek a common basis for law from the One who created us in the first place. Then the individual liberty of the citizen would be protected and rulers would be required to put the good of the people above their own interests. The will of the people would be in harmony with the will of those entrusted to lead, each placing the needs and desires of others before his or her own. Peace and happiness would become a reality.
In biblical terms this system is known as the government of God. God’s benevolent government, based on His law, is the only kind that will successfully address the problems we see surrounding us today. Yes, this may be the democratic age, but it is not going to be the era when humanity solves its problems by its own systems of government, democratic or otherwise. King Solomon tried it all, and his summation, found in Ecclesiastes 12:13 (King James Version) is powerful: “Let us hear the conclusion of the whole matter: Fear God, and keep his commandments: for this is the whole duty of man.”
BRIAN ORCHARD
brian.orchard@visionjournal.org
(This article originally appeared in the Summer 2005 issue of Vision. Revised for Spring 2008.)
RELATED ARTICLE: Guardrails for Human Conduct