Maybe the recent Connecticut
home invasion didn’t mesmerize us for months like the cable news soap operas I
affectionately call “The Guiding Light of Anna Nicole Smith” and “As the World
Turns around Natalee Holloway,” but it still got entangled in the media’s “news
flash” net and held our collective attention for a full 48 hours. In the end,
two men were arrested and charged with robbing, raping, and killing a suburban
family as well as torching their home.
I was not overly surprised by
the villainous events of that day. A 2005 U.S. Department of Justice report
reveals there is one rape for every 1,000 Americans per year and six murders
for every 100,000.
I was also not shocked when the
story became the centerpiece on the marketplace of ideas dinner table that
night. A review conducted by the Project for Excellence in Journalism found that media outlets tend to replay the same select news pieces, giving the stories a life of their own.
What perked up my ears about the home
invasion crime was the media’s obsession with a particular, seemingly
out-of-place detail: one of the alleged perpetrators, Joshua Komisarjevsky, had
been adopted. One newspaper went so far as to title its story, “Alleged
Connecticut Killer Adopted as Baby.”
Why not title the story “Alleged
Connecticut Killer Ate Lima Beans for Lunch?” Is it because lima beans rarely
cause an average Joe to explode into a lawless rampage? Can “defective” genes
be a precursor to crime?
Clearly, the adoptive family, the
press, or both, accepted the premise that biological factors can trigger
violence. It’s possible the family, hoping to distance themselves from the
heinous act and convey that they have “good DNA,” pitched the “he’s not related
to us” angle to reporters. It’s equally possible that members of the press
decided this detail was somehow meaningful. Whatever the case, the idea was
embedded in multiple articles, although there was no outward mention of a
possible link between hereditary factors and criminal behavior.
Newspaper pieces and Internet
blogs revealed how Komisarjevsky’s family struggled for years to straighten out
the wayward boy, who became a burglar at the age of 14. Attempts to make him
feel like part of the family were futile.
This reminded me of a
disturbingly similar story from a 1999 60 Minutes segment, which
described the case of Jeff Landrigan, a young man who was adopted at birth by a
law-abiding family, but who now sits on death row for murder. Landrigan’s
adoptive sister speculated that her brother had bad genes, adding, "I personally think that the day my brother was born, his fate was probably sealed…"
While on death row, Landrigan
found out his birth father was imprisoned on death row in another state and that
his family tree was peppered with felons. He told 60 Minutes he believed
crime was passed down in his family “like cancer or heart disease.”
A body of evidence supports
Landrigan’s theory, although environmental influences are likewise powerful and
should not be discounted. In Change Your Brain, Change Your Life, psychiatrist Daniel Amen states that the cingulate gyrus, which curves through the center of the brain, is hyperactive in murderers. Other researchers have
determined that violent males have low levels of serotonin, a condition that
has a high rate of heritability. The National Institutes of Health conducted a study on the serotonin levels of prison inmates and predicted with 84 percent accuracy which ones would return to crime upon their release.
Dr. Sarnoff A. Mednick’s study
of 14,427 adopted children, as discussed in the New York Times, reveals
how a propensity to chronic criminal behavior may be passed through the genes.
Although Mednick does not believe criminal behavior is directly passed down, he
holds that certain biological factors that might be associated with crime can
be inherited. He cites a biological predisposition towards substance abuse as
an example.
What does this theory mean for
the person looking to adopt? And what are the chances a newly adopted child will have gene-related difficulties? Although there do not seem to be any studies on this topic, it is possible that a greater percentage of adoptees today have problematic tendencies. In the more puritanical past, a
woman was more likely to give up her child simply to avoid stigma and social
ostracism. She may have become pregnant while unmarried or involved in an
affair, but beyond that was law-abiding and well adjusted. A woman who puts a child up for adoption today is arguably more likely to do so for pressing reasons, e.g., problems with illegal substances, imprisonment or family abuse, factors that could be heritable. In addition, celebrities, such as
Madonna and Angelina Jolie, make it fashionable and more common to adopt infants
from foreign lands whose biological predispositions are unscreened and
unknown.
On the other hand, it is
possible there are a smaller number of adoptees today with so-called genetic
flaws. Abortion is now an option for “troubled” women. In Freakonomics,
Steven Levitt and Stephen Dubner say crime has declined over the past twenty
years because “the pool of potential criminals (has) dramatically shrunk,” a
fact they attribute to Roe v. Wade. Although these authors are not arguing for
biological connections to crime, they say women in adverse family environments
are more likely to have children who grow up to be criminals, and these are
typically the women who get the abortions.
In addition, adoptions have
become more open and cooperative. According to the LA Times, adoptive
and natural parents meet at least once in 90% of all infant adoptions, and 25%
of these adoptions are completely open. This means an increasing number of
birth parents and adoptive parents come together in some way, review each
other’s physical and personal history and stay in contact. Genetic secrets are
less likely to be locked away in bureaucratic clinics; problems can be
confronted and resolved to some degree through positive environmental
reinforcement.
Most scientists and
psychologists will tell you the nature vs. nurture debate is complex and by no
means resolved. Landrigan promoted the “my genes made me do it” argument in
several court appeals. In the end, he lost. The US Supreme Court made the final
ruling against him three months ago, and he is likely to be executed soon.
Komisarjevsky’s case is next and
inquiring minds want to know: Will he desperately seek his DNA, or do what most
defendants do and blame it on his “nurture” resume?
Unfortunately, the “lima beans
defense” rarely works.
Unless your head has been super-glued inside a science book, you have observed the furious debate between proponents of intelligent design (ID) and supporters of evolution, a debate that has bounced from courtrooms into opinion pages around the country. Pointing to the complexity of life on earth, IDers posit the existence of an intelligent designer and reject the notion that all can be explained by evolutionary theory.
The issue has become a political tractor, with conservatives and liberals attempting to bulldoze their opponents. Conservatives hope to acquire the seal of authenticity for their theory of ID, an accolade that only "fact-based" and "respectable" science can provide, while liberals want to protect their turf from what they see as a religious crusade into the "objective" halls of learning. The controversy has emerged in Georgia, Kansas, Arkansas, Maryland, Missouri and South Carolina, as well as Pennsylvania, where a judge recently ruled that reading a single sentence about intelligent design in biology class would violate the Establishment Clause of the Constitution. In California, the El Tejon Unified School District permanently cancelled a philosophy class about intelligent design after Americans United for Separation of Church and State filed a lawsuit.
Religion was once the supreme authority on all matters, but when the Enlightenment's onslaught of secular ideas swept over the European continent, it carried away the minds, and sometimes the hearts, of many who had been devout. Seventeenth-century philosopher Baruch Spinoza rebelled against traditional Judaism and Christianity, replacing them to a great extent with the rational and scientifically based metaphysic of determinism. This metaphysic argues in favor of a mechanistic, causal universe and is bolstered by scientific findings, including later Darwinian theory. In keeping with the prior rebellion against religion, today there is arguably a rebellion against the new leader called "science." Kings risk being toppled from their thrones, and ID has emerged as a weapon to be used against this final arbiter of "truth."
Why are IDers making their move now? First, it could be said that science has ventured into "disquieting" areas of study, such as cloning, transgenic engineering, cross-species transplants and stem cell research. There may be an urge to rein it in with philosophical or theological "wisdom." As Albert Einstein, a pantheist and disciple of Spinoza, said, "Science without religion is lame, religion without science is blind." Teaching is never value-free, and an omission can convey a powerful message. When students fail to discuss the ethics of scientific actions and outcomes, they often end up like my former high school classmates: giggling and hurling dissection specimens across the room, behavior that conveys a lack of respect for the animals who died and an inability to comprehend that dissection is considered by many to be ethically impermissible in the first place.
Secondly, science has faltered recently, leaving it vulnerable to attack by those who hope to depose it. Scientific fraud has leapfrogged to the public's attention with confessions by Korean researcher Dr. Hwang Woo Suk, who admitted fabricating cloning studies for the past two years. Esteemed scientific journals published his concocted data, and his peers did not question his work. One journal editor recently stated that scientific error and dupery occur from time to time, even at leading American universities, a statement that taints the image of science as trusted authority.
Thirdly, IDers may feel that any disagreement among prominent scientists opens the hatch to alternative theories. The discovery of "spooky" quantum mechanics occurred in conjunction with a pervasive disillusionment with science and its fundamental tenet: causality. While some physicists, such as Einstein, supported a deterministic hidden variable theory, others, such as Werner Heisenberg and Max Born, defended a framework based on the uncertainty principle. If it is acceptable to teach opposing theories in quantum mechanics, then why not let ID arm wrestle with Darwin? Because words such as "spookiness," "magic" and "trickery" are associated with the quantum world, one could argue that mystical, veiled or opaque theories, such as ID, befit the scientific realm. If quantum strangeness can be taught, why must intelligent design be expelled?
Lastly, postmodernism, which rejects any form of absolute truth, even in science, has permeated modern society, and conservative IDers are embracing it. This is ironic because the "right" has traditionally embraced the objective and absolute while the "left" has endorsed the subjective and contextual. In describing postmodernism, Richard Rorty says, "truth is made rather than found," and Jean-François Lyotard emphasizes the importance of avoiding totalizing grand narratives and maintaining an infinite number of perspectives. Darwinian theory is nothing if not a totalizing grand narrative.
Should ID be allowed to "act up" in science class? Most people might say yes. According to a 2005 CNN/USA Today/Gallup poll on evolution, 84 percent of Americans believe that God created humans in their present form or helped guide their development, while a mere 12 percent say God had no part in the process. The bouncers in the "12 percent club" guard the door from party crashers. They look at "fake" ID, saying it is creationism incognito and that it lacks "real science" credentials. They announce to the crowd, "If you think it qualifies, you've had one too many drinks." They are correct in that intelligent design fails Karl Popper's falsifiability test; it cannot be proved wrong. ID is philosophy, not science. But does this mean it should be denied entry?
I am convinced by the evidence of natural selection and treasure Darwin's theory because it promotes an interconnectedness of all living things, but I hold that the intense battle to keep ID out of the classroom is misguided. The shrill, political feud between conservatives and liberals has spiraled away from protecting students and the Constitution into a rendition of Hannity and Colmes. Do we lack confidence in our children to evaluate, to separate evidence from fiction, to interpret for themselves? Sweeping ID under the rug makes for a huge lump that curious teenagers will investigate. What is the resistance to cross-disciplinary study or "big picture" teaching in which related fields, such as history, philosophy and biology, are integrated? Math partners with chemistry; philosophy and ethics could collaborate with all branches of science. Compartmentalized study may lead to a lack of synthesis, thus an absence of learning in general. Why is postmodernism a no-no in science, but a welcome visitor in other disciplines? No area of study should lose the doubt and humility that a postmodern filter provides. Theories from the past have been toppled, and some that are accepted today will be mocked tomorrow. Fighting may be inappropriate in school, but arm wrestling, well, isn't that a fundamental freedom? Now let's roll up our sleeves and let the theorizing begin.
In his book, Our Final Hour, Cambridge professor and Britain's "Astronomer Royal" Martin Rees predicts humanity has no more than a 50/50 chance of survival into the next century and that by 2020 a million people will perish due to scientific error or terror. Some would call him prescient, while others would interpret his words as alarmist, resembling a layer cake with environmental fears on top of nuclear fears on top of chemical and biological threats, ad infinitum. With a sci-fi flair, he warns of runaway technology, human clones and an ability to insert memory chips into the brain.
Doomsday predictors get much the same respect as the "toxic fumes" sign at the local service station; they impart their wisdom, yet we yawn. Situations which seem grim and overwhelming, even potentially lethal, tend to be ignored. More immediate and "American" concerns, such as consumer goods and personal advancement, monopolize our daily thoughts. This is arguably foolhardy and indicative of the "another doomsday, another dollar" mentality.
Rees is not a lone voice on the scientific stage. The Bulletin of the Atomic Scientists reports we have seven minutes until our final bow at midnight. Other reputable experts surmise that a "gray goo" or nanotechnological catastrophe poses the greatest threat. This involves the invention of miniature, self-replicating machines that gnaw away at the environment until it is devoid of life. It need not be deliberate sabotage, as in technological warfare by one nation against another, but could result from a laboratory mishap.
Astronomers speak of fugitive asteroids that could destroy major sections of our planet within the next 30 years. Others point to atom-crashing tests and their potential for a lethal strangelet scenario. Strangelets are malformed subatomic matter, which could distort all normal matter and dissolve the earth in seconds.
There are streams of alerts from environmental experts who tell us natural disasters are on the rise. They warn of climatic change and tell us the world's species die at a rate 1,000 times greater than they did prior to human existence due to habitat destruction and the introduction of non-indigenous species into the ecosystem. Their conclusion? If we do not reverse the damaging trend, the Earth itself could become uninhabitable.
Should we open our minds to doomsday predictions? And if we accept them, what is the next step to ensure or increase our chance of planetary survival?
In his book, Science, Money and Politics, Daniel Greenberg follows a trail of suspicion. He condemns what he believes to be a self-serving, greedy scientific community, with its bungled research, conflicts of interest and findings that never see the light of day due to suppression by corporate sponsors. But this seems an overly cynical, embellished perspective; there are surely many scientists dedicated to discovery and social responsibility, apart from any personal gain. And we should not forget that offering controversial insights can come at a cost; proponents of "radical" theories often expose themselves to public and professional ridicule.
Regardless of skepticism, the “Pascal’s Wager” game plan seems a good bet. This essentially means we should not gamble with eternity, but instead urge the scientific community to take precautions since Armageddon allows no second chance. Better to err on the side of life, even if it means some black holes will go unexplored and some research grants will be pulled.
Precaution means building contingency plans, such as shields and containment measures, into emerging technologies so that if an experiment goes awry, a safety net will kick into place. It means the scientific community should better police itself. It means committees or boards, both local and international, should be established for oversight and regulation, much like Albert Einstein proposed in 1947 to maintain worldwide peace. Many nation-states and multinational corporations are known for fighting even minimal efforts to regulate dangerous technology, and they must be countered.
There are pragmatic hurdles to be negotiated when trying to impose rules on private parties or on authorities in renegade lands, but the ozone hole "near disaster" demonstrates how the world can cooperate when it comes to life-and-death matters. As cultures dovetail, as communications rise, as borders become more porous, and as the world figuratively shrinks, it will be easier to impose structure and scientific parameters on nations that seem combative today.
Science must shift its course and find new mountains to climb. It looks to us for cues. Due to our materialistic bent as a culture, our cursory endorsement of “progress” and our captivation with the Prometheus-like aura of technology, we subtly ask the scientific community to scale those mountains that are the highest (great accolades can be received), the easiest (the path of least resistance) or the most profit-oriented (grant money from special interests or an emphasis on reducing labor so companies can realize greater proceeds) rather than those that are the most ecological and peace-enhancing.
The research community has rivers of creativity and forests of energy that could instead be directed towards rivers and forests. It could move towards ecological preservation and restoration, peaceful alternatives to conflict and a furthering of life on this planet.
We will know a cultural transition is underway when news reports following fires, earthquakes and other disasters address the impact on natural systems and nonhuman species, rather than just the human and economic consequences, such as the number of homes lost. Our capitalistic culture thrives on the fact that nature is cost-free, which in turn reinforces the notion that it is expendable and devoid of value. This reality must change. Our reality must change. And science must change. It must shift towards peace and ecology. It's as plain as doomsday.
There are financial scandals implicating and even indicting executives from Enron, WorldCom, Global Crossing, Arthur Andersen, Tyco and Merck. There are statistics showing that 10% of American households steal cable television, that 70%-75% of students have cheated in school, and that lying is a normal part of social interaction.
There are the prisoner abuse scandals that point to sadistic behavior by seven American military personnel at the Abu Ghraib prison in Iraq, with follow-up accusations against European troops, and later against the California National Guard at a separate incarceration facility north of Baghdad.
Are these "questionable" acts indicative of the genetic or moral flaws of "a few bad apples," or is there a particular social climate or situation that paves the way for certain types of behavior? Is Shakespeare right that "all the world's a stage," and can this stage transform a law-abiding citizen into a crook, an ethical person into a dissembler, or a pacifist into a sadist?
Numerous experiments have investigated this question and found that a person's behavior is often drastically altered by those around him and by his overall circumstance, rather than by inherent personality traits or professed individual values.
At the Princeton Theological Seminary, two psychologists conducted a study in which seminary students were asked to relate the parable of the Good Samaritan into a tape recorder at a nearby building. A "victim," feigning physical distress and needing help, was positioned en route. The instructor warned half of the students that they were late to make the recording and should hurry to the proper location, while telling the other half that there was no need to rush, but they might as well head over to the building early. Of those in a hurry, 90% walked around the injured "victim" or stepped over him to get to the destination. Of those with time to spare, 63% assisted the man in need. The context of the situation proved essential: the desire to obey orders and to conform to the perceived time limitation played a significant role in the seminarians' decision as to whether to be a Good Samaritan.
Social psychologist Solomon Asch conducted his own experiments in 1951. He wanted to see if an ordinary person would conform with a group decision when it was clear that this decision was erroneous. An unsuspecting subject was placed in a classroom with seven actors who agreed one after the other that two lines were equal in length when there was in fact a huge discrepancy: one line was short and the other was long. Then the subject was asked for his opinion; only 29% of those questioned deviated from the majority, thus establishing the power of conformity and authority. Those who answered correctly reported feeling uncomfortable when doing so. If the decision-making had been related to ethics or aesthetics, rather than the empirically verifiable length of two lines, experts agree that compliance with the group decision would have been even higher.
Asch's work inspired Stanley Milgram, a Yale University psychologist who wondered if people would execute strangers if they were simply told to do so. Milgram was fascinated with the Nuremberg trials and wondered if Eichmann, a high-ranking official of the Nazi Party, was inherently evil or had just followed orders the way any person would.
Milgram invited ordinary Americans, whom he labeled "teachers," to give electric shocks to a stranger, the "learner," when the latter was unable to provide the correct response in a supposed memory test. Of course, there were no real shocks and memory was not being tested, but each teacher was unaware of this. The learner screamed with pain and continually begged the teacher to release him from his straps. The testing device was labeled from 0 to 450 volts. Levels of 100-150 delivered a mild shock, while higher levels were labeled "very strong," "extreme severity," "danger: severe shock," and finally "XXX" (or fatal). Milgram had asked accredited colleagues in advance to guess what they thought the outcome would be; they hypothesized that no one would go above 150 volts with the exception of the rare sadist who would push the lethal 450 lever.
The actual results were quite different: 66% of the teachers "executed" the victim, merely because they were told to do so by an authoritative figure: the psychologist. Of those who refused to go to 450, no one stopped before reaching 300 volts, and nobody helped the victim, proving to Milgram that blind obedience is part of the human condition.
For 25 years, Milgram's obedience experiment was replicated by numerous researchers in the United States, Australia, South Africa and in several European countries with similar results. In a German study, over 85% of the subjects administered a lethal electric shock to the learner.
"Abuse" also resulted from a study conducted at Stanford University by Philip Zimbardo in 1971. He created a mock prison, and average university students were drafted to be prisoners and guards. They passed mental and physical exams, and coins were tossed to decide who would assume which role. The authoritative impact of the guard uniform, with its accompanying nightstick and mirrored sunglasses, converted these previously docile students into increasingly violent enforcers, while the inferior status of the convicts, reinforced by their low-ranking garb, prison numbers (rather than names) and confinement to tiny cells, transformed them into victims. Formerly active prisoners became passive; healthy ones became sick.
Both sets of students said they lost their identities and forgot they were part of an experiment. The illusion became a reality. The two-week study was called off after only six days because the treatment of the prisoners became too brutal and humiliating. The photos revealing the Abu Ghraib violations are astoundingly similar to the video footage taken of the Zimbardo experiment.
There are studies that show how lying, cheating and stealing are also the product of circumstance rather than character and how the notion of "us" vs. "them" is a fallacy. Two Yale University psychologists, Hugh Hartshorne and Mark May, secretly gave approximately 10,000 children the opportunity to lie, cheat, or steal on various academic tests, sporting competitions and other projects. The children's personal values were evaluated during this extensive study; 71% of the kids exhibited "unethical" behavior, and it was concluded that "honest or dishonest behavior is largely determined by circumstances," not moral beliefs.
In another incident, computers in a Manhattan credit union failed and mistakenly permitted customers to withdraw unlimited funds over their available balances. Rather than inconvenience its customers, the credit union decided to trust its patrons not to overdraw their accounts with their ATM cards.
Four thousand customers took advantage of the error; some stole as much as $10,000 from the financial institution. In the end, $15 million remained missing, and the authorities had to be called to make arrests.
There are countless other studies about how external factors are highly predictive of so-called "moral" behavior. It may be disheartening to learn that people are generally conformists, blind followers of authority and highly influenced by circumstance rather than character and values, but there are reasons to be uplifted by these findings.
First, this behavior can be explained: people are genetically predisposed towards conformity and ethnocentrism because these traits aid in survival. We cannot live without the assistance of others, and this type of societal cooperation requires some adherence to established rules. Cultural group selection is arguably as essential as natural selection.
Secondly, political leaders, reformers, revolutionaries, creative thinkers, innovators, public policy developers and anyone with a different slant or position on an issue should rejoice. This paradigm shift means implementing change is much more promising: situations are fluid and malleable while internalized moral beliefs are often rigid.
The recent debate over gay civil unions is a case in point. When Massachusetts and San Francisco began challenging the status quo, polls revealed that most Americans were against the legally recognized pairing of homosexual couples. However, in the course of the past year, statistics have radically changed. It may be the pervasive news coverage, the compelling arguments in favor, the open national debate, or all of the above, but external factors have clearly contributed to a shift in perspective. In a CBS nationwide poll conducted in May 2004, the percentage of people who favored civil unions was up to 56% from 39% in November 2003, while the percentage opposed was down to 40% from 53%.
The ability to change a situation, a political climate, a law, or a collective conscience is indeed possible without heavy-duty pliers or a hammer. Reform is not at war with genetic, entrenched, or internalized stagnation. To a great extent, the only real obstacles are more flexible, external factors, such as whether there are cameras in the prison to monitor prisoner abuse, whether teachers are vigilant about plagiarism and cheating, whether corporate America prosecutes its white-collar criminals, and whether a credit union shuts down its ATMs rather than naively trusting its patrons. Perhaps "The Bad Seed" of circumstance is not so bad after all.
Charlotte Laws is a member of the Greater Valley Glen Council, a syndicated columnist, and the President of a nonprofit organization.