
Tuesday, March 31, 2009

Stem Cell Wars: Adult vs. Embryonic

"Recent spectacular breakthroughs in noncontroversial adult stem cell research and clinical applications to effectuate cures with the mitigation of disease or disability have been well documented," Rep. Chris Smith (R-NJ) said, remarking on the "significant progress" achieved with adult stem cells.

Stem cell research has become highly politicized. But even pro-lifers and conservatives, like former President Bush, are not against stem-cell research. They are against embryonic stem-cell research (and against cloning to create embryos for use in stem-cell research, or in any research).

Embryonic stem-cell (ESC) research is not the only hope for mankind, as we are typically led to believe. The prospects of adult stem-cell (ASC) and umbilical cord stem-cell research are repeatedly ignored by the media and activists. Adult stem-cell research is one of the most promising advances in medical science in recent decades. As Michael Fumento, author of "Bio-Evolution: How Biotechnology Is Changing Our World" (Encounter Books, 2003) and one of the few commentators who has shone a light on adult stem cells, has written: "Scientists have already discovered at least 14 types of ASCs that ... could perhaps be 'trans-differentiated' into all the types of cells we need."

And adult stem cells are not mere pie-in-the-sky hopes of potential medical progress. Adult stem cells are cells at work today. Dr. Scott Gottlieb has written, "Adult stem cells have already been used for more than 20 years as bone-marrow transplants to reconstitute the immune systems of patients with cancer and to treat blood cancers such as leukemia."

Umbilical cord stem cells are another potentially fertile opportunity for medical progress. Cord blood is rich in stem cells. Four million babies are born every year in the United States, and the majority of their umbilical cords are thrown away. Those discarded cords could be used to treat some 11,700 Americans annually, according to the Institute of Medicine.

Regardless of whether you're pro-choice or pro-life, the prospects of adult stem cells look very promising: they are already being used to treat over 80 different diseases, and they may even be capable of transforming into all three major categories of cells, making them just as pliable as ESCs theoretically are. So is there really any point in pursuing embryonic stem-cell research?

◊ ◊ ◊

Supporters of expanded federal funding for embryonic stem cell research were disappointed by former President Bush. And they were pleased by President Obama. But those who truly believe ESC research will bring medical breakthroughs have naught to fear. For there’s a far more promising approach, likelier to produce more benefits and much sooner.

We're being flooded with exciting new developments from the alternatives to ESCs, called adult stem cells. Taken from a person's own body or from umbilical cords or placenta, these cells are treating ever more diseases. Further, ASC research in humans and animals keeps biting away at the alleged trump card of ESCs—that only they can be transformed into every type of cell in the body. Cardiologist Douglas Losordo's research lab at Caritas St. Elizabeth's Medical Center in Boston has now become the latest to indicate that ASCs can do just that.

Reporting their results in the February Journal of Clinical Investigation, they extracted stem cells that originated in the bone marrow from the bloodstream of three humans, thereby saving patients the trouble and pain of direct marrow extraction. They found what they believe to be a heretofore undiscovered type of cell, then injected these cells into the hearts of rats that had suffered heart attacks and subsequent—formerly permanent—damage.

Some of the cells became new heart muscle while others became new blood vessels. Indeed, they grew twice as many new vessels as other rats given a fake treatment. They also grew far less scar tissue, which impedes heart function.

Marrow stem cells have been used to induce either muscle growth or vessel growth in human hearts in hundreds of patients in labs throughout the world. But this appears to be the first time both were grown at the same time by a single type of cell. Losordo's team is now overseeing a trial using these cells on patients with untreatable severe angina. "The safety looks good and the majority of patients are doing much better," he told me.

More exciting yet, Losordo has conducted experiments showing the cells can also become nerve tissue. That means they could be transformed into all three major categories of cells, making them just as pliable as ESCs theoretically are.

Yet several other labs have also found different ASCs (all from marrow) that seem to have this same property. One of them was that of Ira Black, a neurologist at the Robert Wood Johnson Medical School. "I can't say I'm surprised" at Losordo's findings, Black told me. "It's consistent with studies going on across the world. And one of the most exciting areas now is the use of ASCs in heart failure."

In fact, Brazil has announced an ambitious heart-stem cell experiment involving 1,200 patients and 40 institutions across the country. The aim is to eventually replace traditional heart treatments with stem cell therapy, such that 200,000 lives could be saved within three years if the therapy proves effective. It could also reduce the government's costs for heart treatment by $14.2 million a month, the health ministry said.

Brazil also announced that it is financing studies with stem cells for treating spinal cord diseases, diabetes and degenerative nerve disorders like Parkinson's. The U.S. is too. While no ESC treatment has even made it into human trials, ASCs are now being used in about 300 human clinical trials and are treating over 80 different diseases.

As to the plasticity of Losordo's stem cells, Black says converting an ASC into a completely different kind of mature cell "was once thought impossible." Indeed, some ESC researchers desperate for federal handouts still doggedly insist it is, which is on par with saying lab rats can't squeak. The media rarely hesitate to repeat their claims. But "now 10 to 20 different labs have shown" such transformation is possible, says Black.

Losordo, however, says a major advantage of his adult stem cells is that they're much easier to grow than previously discovered types. His team multiplied them 140 times with no change in their structure or effectiveness. Now, "We've got freezers full of them," he says.

He told me he thinks his work combined with that of others could "render moot the debate between ESCs and ASCs." Says Losordo, "We're entering the second phase of development of adult stem cells. We'll soon be working on methods to enhance the efficiency of adult stem cells . . . while ESCs aren't even in the starting gate yet."

He notes ESC researchers continue to be flummoxed in trying to get ESCs to become specific types of mature cells without inducing runaway cell growth: malignancies called "teratomas" or "monster cancer."

Losordo bemoans the broad-based effort by ESC researchers and the media to exaggerate the potential of ESC research while downplaying or even ignoring tremendous breakthroughs in ASC work. "I don't have any personal religious or other objections to ESC research despite the vowels in my name," he says, referring to his Italian Catholic heritage. "But as a clinical investigator I have an obligation to develop therapies that appear to be of most use to my patients."

Would that ESC boosters felt the same.

◊ ◊ ◊

Stem-cell research constitutes one of the most exciting areas in medical science. It promises to prevent, ameliorate and cure diseases for which there are now few if any treatments. Far easier is listing what stem cells don't have the potential to do, but here are a few of the wonders in progress:

  • More than 30 anticancer uses for stem cells have been tested on humans, with many already in routine therapeutic use.


  • By some accounts, the area in which stem-cell applications are moving fastest is autoimmune disease, in which the body's own protective system turns on itself. Diseases for which stem cells currently are being tested on humans include diabetes, lupus, multiple sclerosis, Evans syndrome, rheumatic disease and amyotrophic lateral sclerosis (Lou Gehrig's disease), among many others.


  • Just last February, two different human-autopsy studies demonstrated that stem cells transfused into the marrow work their way into the brain, where they can repair neurons and other vital cells. Other studies have shown that when injected into animals with severed spinal cords, stem cells rush to the injury site effecting repairs. "I think the stem cells may act as a repair squad," says the leader of one of the two studies, Helen Blau of the Stanford University Brain Research Institute. "They travel through the bloodstream, respond to stress, and contribute to brain cells. They clearly repair damage in muscle and other tissues."


  • At a conference in late 2002, French researchers reported that during the last 14 years they had performed 69 stem-cell transplants with an 85 percent disease-free survival rate. Since they improved their procedure in 1992, all of the last 30 transplants have been successful.


  • Stem cells have been injected into damaged hearts and become functional muscle. This destroyed the dogma that heart muscle cannot be repaired, just as stem-cell research also wrecked the firmly held belief that brain tissue cannot regenerate.

Unless you've spent the last several years stranded on a deserted island, you've probably heard of at least some of these medical miracles. But here's what you may have missed. While the overwhelming majority of favorable media coverage of stem cells concerns those pulled from human embryos, called embryonic stem cells (ESCs), not a single treatment listed above has used that kind of cell. In fact, while activists such as spinally injured actor Christopher Reeve rage that but for Bush administration and congressional restrictions on ESC funding he might be walking in a few years, there are no approved treatments—and have been no human trials—involving embryonic stem cells. Each of the above therapies and experiments has involved cells that require no use of embryos.

These are called "adult stem cells" (ASCs), though the term also covers cells found in nonadult tissue such as umbilical cords, placentas and amniotic fluid. Like ESCs, they are precursor cells that eventually become mature, specialized cells. ASCs actually have been used therapeutically to treat leukemia and other diseases since the 1980s. A bone-marrow transplant is a transplant of stem cells from marrow.

Yet when an ESC so much as hiccups, it makes international news, while tremendous breakthroughs with ASCs are as a rule ignored. Welcome to what's been called "stem-cell wars," a deliberate effort to downplay the proven value of ASCs to attract more attention to the potential of ESCs. It is a war that is being fought partly over ethics, but mostly over money.

Okay, so if ASCs have such a huge advantage over ESCs, then why did anybody begin researching ESCs in the first place, to the point where labs and researchers all over the world now are working with them?

Blame it on the dogma—scientific dogma that is. It's long been acknowledged that ESCs carry a boatload of physiological and ethical problems. For example, ESCs implanted into animals have a nasty tendency to cause malignant tumors. That's a major hurdle to overcome, as is the fact that the body rejects them just as it rejects donated organs. Yet it was always believed that ESCs had one huge advantage over their ASC counterparts—that while an ASC could become or "differentiate" into only a few types of mature tissue with those tissues dictated by the source of that ASC, the ESCs could become any type of tissue in the entire body. In medical terminology this is known as "plasticity."

But this has never been more than theory, and lately that theory has begun crumbling under the weight of empirical findings. Or, in other words, it's had a run-in with reality.

"We do not yet know enough about adult stem cells or ESCs to make dogmatic statements of either," declared Dr. Darwin Prockop, director of the Gene Therapy Center at Tulane University, in a letter that appeared in Science.

"There's no law of physics or such that I know of that says that [ASCs] are inherently more limited than embryonic stem cells," Prockop told Citizen.

We do know that ESCs give rise to all three germ layers (as in "germination") that become all the forms of human tissue. But this doesn't necessarily mean that they can be converted into each and every one of those tissues. Moreover, Catherine Verfaillie and her colleagues at the University of Minnesota's Stem Cell Institute recently have found stem cells in human marrow that appear to transform into all three germ layers. "I think Verfaillie's work is most exciting and translatable into the clinical arena," says Dr. David Hess, a neurologist at the Medical College of Georgia in Augusta. "They seem to give rise to every cell in the body. She seems to have a subpopulation with basically all the benefits of ESCs and none of the drawbacks."

Verfaillie calls the cells "multipotent adult progenitor cells," and has isolated them from mice, rats and people. They already have been transformed into cells of blood, the gut, liver, lung, brain and other organs. Just a few months ago, researchers at the Robert Wood Johnson Medical School in New Jersey published a paper explaining that in a mere five hours they had been able to convert bone-marrow cells into neurons both in petri dishes and in rats. Under the old dogma, that was simply impossible. More importantly, "We found that they express genes typical of all three embryonic germ layers," the researchers told Citizen. "In aggregate, our study and various others do support the idea that one [ASC] can give rise to all types of tissue."

And the good news keeps pouring in. One problem with Verfaillie's cells is that, in part because they come from marrow, they are difficult to extract. That problem won't matter down the road when culturing practices are perfected, say researchers, but currently it hinders efforts to keep labs supplied.

Enter Eliezer Huberman and his colleagues at the Argonne National Laboratory outside Chicago. They wanted to find highly plastic ASCs in blood, as they would be far easier to extract and to store. Just how plastic they might be remained to be seen and wasn't even a prime concern. But when the Argonne scientists reported their results in the March 2003 issue of the Proceedings of the National Academy of Sciences, they showed that their stem cells had in fact differentiated into mature cells of all three lineages that ESCs can produce.

Even if it somehow turned out that none of the ASCs really can produce all the cells of the body, perhaps we don't need the ability of cells that are "one size fits all." That's because in recent years researchers have found that they can tease ASCs into many more types of mature tissue than was previously thought possible. Moreover, researchers now seem to be finding ASCs essentially wherever they look—including blood, bone marrow, skin, brains, spinal cords, dental pulp, muscles, blood vessels, corneas, retinas, livers, pancreases, fat, hair follicles, placentas, umbilical cords and amniotic fluid. You don't need "one size fits all" if you can provide all sizes.

At the same time, ESCs have become even more suspect ethically in the eyes of many people. The original ethical concern was that many see the destruction of human offspring, no matter how young, as an abortion. Some prominent abortion opponents believe human life only begins upon implantation in the uterine wall; therefore destruction of embryos would not count as such. Nonetheless, even to some of these people the thought of ripping apart the byproduct of human conception for the sake of science invokes images of Nazi eugenicist Josef Mengele or of Mary Shelley's Dr. Frankenstein.

This more recent worry has nothing to do with destroying life but rather with the creation of it: cloned human life. While growing embryos into blastocysts (see note at end of article) often is referred to as "therapeutic cloning" or "research cloning" to distinguish it from the process of creating a human being, the two processes follow parallel tracks. If that blastocyst is implanted into the womb and it survives, voila!—nine months later you have a clone, just like something out of Star Wars Episode II. No doubt most ESC researchers haven't the least desire to take the next step, but that's not the issue. What counts is that they are developing a technology that others can build upon to refine the process of creating human clones.

Thus, ESCs have in their favor nothing more than a decaying theory that they may have greater plasticity. Going against them are the ethical concerns and the fact that they are years behind ASCs in commercial applications.

But there's a huge ESC industry out there, with countless labs packed with innumerable scientists desperately seeking research funds. Private investors avoid them because they don't want to wait perhaps 10 years for commercial products that very well may not materialize and because they're spooked by the ethical concerns. That leaves essentially only Uncle Sam's piggy bank, primarily grants from the National Institutes of Health, to keep these labs open. This, in brief, explains the "stem-cell wars": the perceived overwhelming need to grossly exaggerate petri-dish advances with ESCs while life-saving new applications of ASCs are downplayed or ignored.

Thus the announcement in 2001 that ESCs could be made into blood cells received almost 500 "hits" on the Nexis media database even though published medical-journal reports of ASCs differentiating into blood cells go back at least to 1971. It's possible to read lengthy articles on the promise of stem cells that mention nothing but ESCs. The influential pro-life figure and former U.S. senator Connie Mack (R-Fla.) even questioned whether ASCs exist, which is on par with questioning the existence of Starbucks.

It's probably not a coincidence that Mack has been a paid lobbyist for ESCs, but most reporters have no financial stake in the issue and it is a complex one. They take their cues from the professional medical journals. And, unfortunately, these are among the leaders in the war against ASCs. The world's most prestigious science journal, Nature, published two in-vitro studies in March 2002 widely interpreted to mean either that ASCs are grossly inferior to what had earlier been believed or even that they're outright worthless.

The Nature writers indicated their studies showed that ASCs probably were not differentiating and multiplying at all; rather, it appeared the cell nuclei were merely fusing, and the resulting fusion gave the impression of a new, differentiated cell forming. The media gobbled it up. Agence France-Presse headlined: "'Breakthrough' in Adult Stem Cells Is Hype, Studies Warn." The Australian Associated Press (AAP) declared, "New Research Tips Debate on Stem Cells." The Washington Post's subhead flatly declared: "Adult Cells Found Less Useful than Embryonic Ones." It was damning ... and false.

Stanford's Helen Blau countered with a big "So what?" In a Nature commentary, she noted that "Cell fusion has long been known to achieve effective reprogramming of cells"—so long, in fact, that her own laboratory was doing it 20 years earlier. Thus, far from showing that ASC research is "hype" or whatever term the particular newspaper or newswire chose to apply, it turns out that cell fusion both complements and encourages the differentiation of adult stem cells, something that has already proved valuable and is clearly very promising.

The idea that differentiation wasn't happening at all was simply bizarre in light of myriad studies and therapeutic applications showing otherwise, including one that appeared in the journal Blood shortly thereafter. Showing that bone-marrow stem cells can be converted into kidney cells, it pointedly concluded: "The process does not involve cell fusion."

"We found no evidence of nuclear material from two cells fusing into one cell," one of the coauthors emphasized to me. In an interview last spring, Prockop told me, "It may well be that fusion is part of the healing process. But clearly we can take mesenchymal cells and differentiate them into various tissues because it's into bone or fat and it's been done over 20 years." Indeed, he specifically explored the fusion issue in a study released in the Sept. 30, 2003, issue of the Proceedings of the National Academy of Science, concluding "Most of the [mesenchymal cells] differentiated without evidence of cell fusion, but up to one-quarter underwent cell fusion with the epithelial cells. A few also underwent nuclear fusion."

Yet another Blood study released last September concluded, "Analysis of DNA content indicates that donor-derived endothelial [stem] cells are not the products of cell fusion." A Lancet study in early 2003 looked at cheek cells from five living women who had received bone-marrow transplants from their brothers several years earlier. They found cells containing the male Y chromosome, a sign that donor marrow stem cells had differentiated into cheek cells. Moreover, the group found almost no evidence of fusion among the cells in the cheek. Of the 9,700 cells that were examined in the study, only two showed signs of possible fusion.

And yet in late October 2003, Nature rushed into publication yet another letter claiming that there was no evidence that stem cells from marrow do anything but fuse. Of all these studies, guess which was the only one to get media attention—and lots of it.

Shortly after Nature's first effort to establish that the wheel doesn't exist, its chief competitor, Science, attempted to show that the Earth is flat after all. First it ran a letter in which authors from the Baylor College of Medicine claimed that they earnestly had tried but failed to find bone-marrow cells that had differentiated into neurons in the brain. Shortly thereafter it ran a paper from Stanford University scientists, led by Irving Weissman, claiming to show that a type of stem cell from marrow could replenish that type of marrow, but that it appeared worthless for creating other tissues. The typical media reaction was UPI's "Promise of Adult Stem Cells Put in Doubt." Weissman eschewed the usual cautionary scientific terminology such as "it appears" or "evidence indicates," or "our particular study has found." Instead he smugly told UPI: "They [the cells] don't make brain; they don't make heart muscle or any of these things."

According to Blau, it was surprising to see this published so rapidly and in such a prestigious and influential publication as Science. The Baylor study, she notes, failed to detect not only neurons but also something far more readily detectable called microglial cells. And forget that "At least 20 reports over the past 15 years have shown that bone-marrow transplantation results in readily detectable replacement of a large proportion of microglial cells in the brain." Some of these reports have even appeared in Science. Says Blau, "If they couldn't see those, how could they possibly see neurons?" It would be like announcing that you had failed to detect a tiny virus under your microscope when you also hadn't been able to see a gnat that accidentally got trapped between the slides. Either your microscope is faulty or you don't know how to use it.

"As to Weissman's paper, where you look and how you look determines what you see, and he doesn't define where he's looking," she says. "Our own experiments have shown there can be a thousand-fold frequency of stem-cell incorporation depending on where you look." Because he didn't say where he looked, "It would be quite difficult to replicate his experiments," she notes. "You could replicate ours, but he did not. The other false assumption he made was to look at a fraction of marrow, the hematopoietic part, and he looked in absence of any damage to the body; yet these are damage-repair cells." In other words, one shouldn't think it remarkable that no ambulance shows up when there's no need for an ambulance.

Weissman is also a notorious opponent of adult stem-cell research insofar as he has made millions of dollars with numerous companies that work with ESCs, according to an exposé in the Washington Monthly. "Was the publication of these two papers a political act designed to harm the image of ASCs in the eyes of the public?" Insight asked Blau.

"That's been a question in many people's minds," she says. "Why these negative findings should have been published in such a prominent way does suggest a political agenda."

In a commentary in the Journal of Cell Science in February 2003, British researchers asked in the very title: "Plastic Adult Stem Cells: Will They Graduate From the School of Hard Knocks?" In a good-humored, indeed sometimes humorous, piece the angst nonetheless came through. "Despite such irrefutable evidence of what is possible, a veritable chorus of detractors of adult stem-cell plasticity has emerged, some doubting its very existence, motivated perhaps by more than a little self-interest." While certain issues still need resolving, the researchers said, "slamming" the "whole field because not everything is crystal clear is not good science."

Even scientists who strongly favor ESC funding readily admit that the issue is highly politicized, with ASCs getting the short end of the stick from research publications, the popular media and the scientific community. Blau, Prockop, Black and Verfaillie are among them. "Most scientists never want a door closed, they want all doors open," says Hess. "And anybody who disagrees with that stance is seen as trying to hold up medical progress."

Another ASC researcher who strongly supports funding for ESCs is Patricia Zuk, whose lab has shown that America's most plentiful natural resource—body fat—can provide a limitless source for stem cells capable of differentiating into bone, muscle, cartilage and fat that can be used to fill in scars and wrinkles. "Certainly it's politicized," she says. But, she adds, "I think a lot of embryonic stem-cell people are right in trying to protect their jobs."

Understandable, yes. But is it right? Forget for the moment the questionable morality of a mass campaign to fool the American public. Zuk admits that the stem-cell wars are "very worrisome" in that they could harm her own efforts to get grant money. Says Hess, "Certainly one of my motivations is I don't want money from adult stem-cell research being pushed into embryonic, though it's already starting to happen."

Activists such as Christopher Reeve have it backward when they say that restrictions on ESC research funding will prevent him from walking again. ASC studies already have enabled quadriplegic animals to walk again, and human trials should be right around the corner. But the chance of ESCs helping people such as Reeve in the next 10 years is practically nil. Reeve should know about this: Many of the amazing ASC studies, including Ira Black's, have been funded by something called the Christopher Reeve Paralysis Foundation.

Moreover, to the extent that breakthroughs with ASCs are confused with ESC technology, it harms public support for ASC research. ESC propagandists are hoping for a seesaw effect; that by exaggerating ESC research and denigrating ASC research they'll push up their side of the board. But, to the extent they succeed, they're only delaying the stream of miracles coming from adult stem cells.

Note: When fertilization initially takes place, whether within a fallopian tube (in vivo) or in a petri dish (in vitro), it forms a single-cell embryo called a zygote. The zygote divides progressively into a multicell embryo. After about five days, the embryo contains many cells with a cystic cavity within its center and is called a "blastocyst." If this blastocyst implants into the uterus and continues to develop, it becomes a fetus. But this is also the stage at which the individual cells become viable for use in ESC experimentation. "Blastocyst" is not to be confused with "blastocyte," which is simply another term for an ESC.

Beginning of the End for Embryonic Stem Cell Research?
By Michael Fumento
Tech Central Station, February 11, 2005
http://www.fumento.com/biotech/stem-cell-research.html

The Stem Cell Cover-Up
By Michael Fumento
Insight on the News - National (Issue: 5/16/04)
http://www.fumento.com/biotech/stemcell.html

Michael Fumento is author of BioEvolution: How Biotechnology Is Changing Our World, which has been published by Encounter Books of San Francisco.

 


Sunday, March 29, 2009

The Truth About the Slave Trade

An awful lot of people seem to be unaware of the history of slavery, especially the origins of African slavery and of slaves in America.

European slave traders did not themselves capture the Africans they transported, but bought them from native slave traders. Since ancient times, war captives, criminals, and debtors in Africa had been sold into slavery among their own people. For several centuries the western Sudanese kingdoms had been supplying slaves to Muslim North Africa on a commercial basis.

When European traders first came to the Guinea coast, African masters sold them their own slaves. As the demand increased, the coastal peoples began making slave raids against peoples farther inland, aided by firearms supplied by the traders. Soon blacks belonging to a variety of tribes from Senegal south to Angola were being enslaved in great numbers.

Captured slaves were marched to the sea in single file, shackled and watched by armed African guards. Fatalities were high on the grim journey. The slaves were placed in compounds (barracoons) at points along the coast, where European traders arriving by ship would examine them and reject the old and infirm. The remaining captives were branded and taken on board. Payment for slaves was in goods such as textiles, firearms, knives and other hardware, and liquor. African kingdoms such as Ashanti and Dahomey grew in power through the slave trade, because of the European goods they received.

In the mid '60s, as the civil rights movement went into decline, American black nationalists began to depict Africa as a paradise lost. Africa was the great motherland, distorted by white people and denied by Negroes. The slave trade, we were told, was imposed by whites.

This vision took hold in the early black studies departments on college campuses. And by the time Alex Haley's "Roots" told its plagiarized tale on television, Americans were given to believe that the slave trade was kept going by white men bagging Africans in the bush. The simpleminded vision of good guys and bad guys was established, but it is not completely accurate.

The first strong African assault came when the great Senegalese filmmaker Ousmane Sembène's "Ceddo" got its U.S. debut more than 20 years ago, and the black nationalist movement was stunned. Sembène showed Africans, European Christians and Muslims working together to sell Africans into slavery.

Origins
The word slave is adapted from Slav, dating from the time when the Germans supplied the slave markets of Europe with captured Slavs. For centuries, the Slavic people of Eastern Europe were the primary source of slaves for Europe and the Near East; as a result, the word for slave in numerous European languages is derived from the word for Slav, the English word being a clear example.

Slavery began with civilization. For hunter-gatherers slaves would have been an unaffordable luxury—there wouldn’t have been enough food to go around. With the growth of cultivation, those defeated in warfare could be taken as slaves.

Slavery has existed throughout history, and practically every country on every continent experienced some kind of slavery: Scandinavia, medieval Europe, Nepal (which abolished slavery in 1924), China, Japan, Korea, Southeast Asia, Thailand, Burma (now Myanmar), Russia, Persia, Polynesia, Hawaii, New Zealand and Easter Island, just to name a few.

Western slavery goes back 10,000 years to Mesopotamia, today’s Iraq, where a male slave was worth an orchard of date palms. Female slaves were called on for sexual services, gaining freedom only when their masters died.

Early abolitionists arose in the form of two Jewish sects, the Essenes and the Therapeutae, who abhorred slave-owning and tried buying slaves in order to free them.

Slavery still exists to this day, and a slave trade is still carried on by Arab slave-traders in the interior of Africa. Slavery is considered a criminal activity in all countries and is outlawed by UN conventions. However, some states, such as Myanmar and Sudan, do facilitate the institution of slavery, according to anti-slavery groups such as Free the Slaves.

Slavery existed in the ancient Mediterranean cultures, in forms such as debt-slavery and the enslavement of prisoners of war. There was slavery in the Bible, and slavery in Rome and Greece. Most gladiators were slaves.

The beginnings of Christianity did not seriously change slavery. Though Christian leaders often called for good treatment of slaves and condemned the enslavement of Christians, the institution itself was not questioned. The shift from chattel slavery to serfdom in medieval Europe was, likewise, an economic rather than a moral development.

Slavery in the Islamic World
The institution of slavery pre-dates Islam in the Arab world, and was permitted under the laws of Islam. Islamic rulers made a custom of enslaving those defeated in war. The Islamic world bought and captured slaves from Europe and Africa on a large scale for roughly a thousand years.

Slavery in Medieval Europe
Slaves (especially from Slavic countries) were traded, mainly in Prague. They were sold by Christians, transported by Jews, and then bought in the Muslim empire. Swedish Vikings were known to terrorize and enslave many Slavs.

Slavery in Africa
Slavery was common and widespread throughout Africa into the 19th century. The Dutch imported slaves from Asia into their colony in South Africa.

The nature of the slave societies differed greatly across the continent. There were large plantations worked by slaves in Egypt, the Sudan, and Zanzibar, but this was not a typical use of slaves in Africa as a whole. In some slave societies, slaves were protected and almost incorporated into the slaveowning family. In others, slaves were brutally abused, and even used for human sacrifices. Despite the vast numbers of slaves exported from Africa, many historians say that the majority of African slaves remained in Africa, continuing as slaves in the regions where they were first captured.

More than 1 million Europeans were captured by Barbary pirates and sold as slaves in North Africa and the Ottoman Empire between the 16th and 19th centuries.

The Atlantic slave trade peaked in the late 18th century, when the largest number of slaves were captured on raiding expeditions into the interior of West Africa. These expeditions were typically carried out by African kingdoms.

Slavery persists in Africa today more so than on any other continent. The Anti-Slavery Society estimated that there were 2,000,000 slaves in Ethiopia in the early 1930s, out of an estimated population of between 8 and 16 million. Slavery continued in Ethiopia until 1942. In northern Nigeria at the turn of the 20th century, approximately 2 million to 2.5 million people were slaves. Slavery in northern Nigeria was finally outlawed in 1936.

Mauritania abolished slavery only in 1981, but several human rights organizations are reporting that the practice continues there today. The trading of children has been reported in modern Nigeria and Benin. In the Sudan, slavery continues as part of an ongoing civil war.

Slavery in Colonial America
Most slaves that were brought to the Americas ended up in the Caribbean or South America, where tropical diseases took a large toll on their population and required large numbers of replacements.

The institution of slavery in the Americas during the 17th century made little distinction as to the race of the slave. White and Native American slavery was common. However, by the 18th century, the overwhelming number of slaves were black, and white and Native American slavery became less common.

Slavery under European rule began with the importation of white European slaves (indentured servants), followed by the enslavement of local aborigines in the Caribbean. Eventually it was replaced primarily with Africans imported through a large slave trade, as the native populations declined through disease.

Slavery in North America
Only a fraction of the enslaved Africans brought to the New World ended up in North America—perhaps 5 percent. The vast majority of slaves shipped across the Atlantic were sent to the Caribbean sugar colonies, Brazil, or Spanish America.

The first slaves brought to the English colonies on the continent landed at Jamestown, Virginia, in 1619. Slavery was legal in most of the 13 colonies, and was ended in many of the states later called "Free States" after the turn of the 19th century. Slavery was abolished in New York state in 1827. In 1807 the U.S. passed legislation that banned the importation of slaves, but not the internal slave trade.

Slavery ended in the U.S. in the 1860s. Lincoln's Emancipation Proclamation of 1863 was a reluctant gesture that proclaimed freedom for slaves within the Confederacy, but Lincoln was the leader of the Union and had no authority over the Confederacy. The proclamation was nothing more than a political and symbolic gesture; it did not free the slaves. It did, however, attempt to change the Civil War's goal, making the abolition of slavery an official war aim. Slaves within the U.S. remained enslaved until the Thirteenth Amendment to the Constitution was ratified late in 1865, eight months after the cessation of hostilities in the Civil War.

Lincoln was opposed only to the spread of slavery into the new states. He refused to free the slaves in the border states, and he even established mechanisms to colonize millions of free blacks in Latin America. His main concern in issuing the proclamation was to preserve the Union; it cast the war as an antislavery crusade in order to mute the growing opposition to the mounting costs and casualties of the war.

Slavery in the Spanish New World Colonies
Slavery in the Spanish colonies began with local Native Americans put to work in the silver mines. However, as these populations shrank due to imported European diseases, African slaves began to be imported.

Slavery in Brazil
During the colonial epoch, slavery was a mainstay of the Brazilian economy, especially in mining and sugar cane production. Slavery was legally ended by the "Golden Law" of 1888.

Brazil obtained more than 35 percent of all African slaves traded; approximately 3 million slaves were sent to this one country. The Portuguese were the first to initiate the slave trade there, and the last to end it.

In the early 1990s evidence of illegal slavery was unearthed in the Amazon region. The Brazilian government has since taken measures against such activities.

By the middle of the 18th century, British Jamaica and French Saint-Domingue had become the largest slave societies of the Caribbean region, rivaling Brazil as a major destination for enslaved Africans.

So, as you can see, there's much more to the history of slavery than is commonly known.


"The Truth About African Slave Trading" by Stanley Crouch, Monday, July 23, 2001 - http://www.racematters.org/sctruthafricanslavetrading.htm

http://history.howstuffworks.com/african-history/african-slave-trade.htm

http://en.wikipedia.org/wiki/Slave_trade

http://knowledgerush.com/kr/encyclopedia/Enslavement/

http://www.newint.org/issue337/history.htm


 


Tuesday, March 24, 2009

Atheism might not mean what you think

Many people think of atheists as those who believe that there is no God. True, but atheism also includes a much broader range of beliefs and non-beliefs. Atheism is the absence of belief in the existence of gods or deities. It is merely the lack of theism.

The word atheism is made up of "a-" + "theism." The prefix "a-" means "without." Theism is a belief in a God or gods. An atheist is simply someone without a belief in God, not necessarily someone who believes that God does not exist.

Agnosticism: "a-" (without) + "gnōsis" (knowledge) is the philosophical view that the truth value of certain claims—particularly metaphysical claims regarding theology, afterlife or the existence of deities, ghosts, or even ultimate reality—is unknown or, depending on the form of agnosticism, inherently impossible to prove or disprove. In other words, an agnostic claims he can't know for sure if a God (or gods) exist.

So atheism answers the question, "What do you believe?" and agnosticism answers the question, "What do you know?" If you admit that you don't know for sure whether there is a god, then you're probably an agnostic. But do you sit on the fence and ignore the question about your belief? The question "Do you have a belief that god exists?" still should be answered: theist or atheist, with the belief in god or without it.

There are four broad categories (and many subcategories/variations):
  1. Explicit Theist: Knows for sure that God exists. Not agnostic and not atheist.
  2. Implicit Theist: Has a belief in God. (May or may not admit to being agnostic.)
  3. Implicit Atheist: Does not have the belief in God. Likely to be agnostic as well.
  4. Explicit Atheist: Knows for sure that God does not exist. Not agnostic and not theist.

NOTE: Similar (but slightly different) terms include "strong" instead of "explicit" and "weak" instead of "implicit."

Agnostic atheism encompasses both atheism and agnosticism. An agnostic atheist is atheistic because he does not believe in the existence of any deity and is also agnostic because he does not claim to have definitive knowledge that a deity does or does not exist. The agnostic atheist may be contrasted with the agnostic theist, who does believe that one or more deities exist but does not claim to have definitive knowledge of this.

George H. Smith suggested that: "The man who is unacquainted with theism is an atheist because he does not believe in a god." Smith coined the term implicit atheism to refer to "the absence of theistic belief without a conscious rejection of it" and explicit atheism to refer to the more commonly used definition of conscious disbelief.

About 2.3 percent of the world's population describes itself as atheist, while a further 11.9 percent is described as "nontheist."

Wikipedia has an entry on "Non-theism." Non-theism, which is similar to agnostic atheism, has various types. "Strong atheism" is the positive belief that a god does not exist. "Weak atheism" (or implicit atheism) could describe someone who does not think about the existence of a deity.

In Western culture, atheists are frequently assumed to be irreligious or unspiritual. However, religious and spiritual belief systems that do not advocate belief in gods, such as some forms of Buddhism, have been described as atheistic. Although some atheists tend toward secular philosophies such as humanism, rationalism, and naturalism, there is no one ideology or set of behaviors to which all atheists adhere.

For more details see Wikipedia:
Atheism
Implicit and explicit atheism

Agnosticism

 


Sunday, March 22, 2009

The Flight of Lawnchair Larry

One of the best urban legends of all time, and it's all true! Larry took 45 weather balloons filled with helium and strapped them to his lawn chair. Then he took flight. For 14 hours.

The Darwin Award is presented each year to the person who improves the gene pool by eliminating himself or herself (usually himself) in the most creative way. The 1997 nominee for the Darwin Award: Larry Walters of Los Angeles—one of the few Darwin contenders to survive his brush with fate.

Larry's boyhood dream was to fly. When he graduated from high school, he joined the Air Force in hopes of becoming a pilot. Unfortunately, poor eyesight disqualified him. When he was finally discharged, he had to satisfy himself with watching jets fly over his backyard.

One day, Larry had a bright idea. He decided to fly. He went to the local Army-Navy surplus store and purchased 45 weather balloons and several tanks of helium. The weather balloons, when fully inflated, would measure more than four feet across. Back home, Larry securely strapped the balloons to his sturdy lawn chair. He anchored the chair to the bumper of his jeep and inflated the balloons with the helium. He climbed on for a test while it was still only a few feet above the ground. Satisfied it would work, Larry packed several sandwiches and a six-pack of Miller Lite, loaded his pellet gun—figuring he could pop a few balloons when it was time to descend—and went back to the floating lawn chair. He tied himself in along with his pellet gun and provisions. Larry's plan was to lazily float up to a height of about 30 feet above his back yard after severing the anchor and in a few hours come back down.

Things didn't quite work out that way.

When he cut the cord anchoring the lawn chair to his jeep, he didn't float lazily up to 30 or so feet. Instead he streaked into the LA sky as if shot from a cannon. He didn't level off at 30 feet, nor did he level off at 100 feet. After climbing and climbing, he leveled off at 11,000 feet. At that height he couldn't risk shooting any of the balloons, lest he unbalance the load and really find himself in trouble. So he stayed there, drifting, cold and frightened, for more than 14 hours.

Then he really got in trouble.

He found himself drifting into the primary approach corridor of Los Angeles International Airport. A United pilot first spotted Larry. He radioed the tower and described passing a guy in a lawn chair with a gun. Radar confirmed the existence of an object floating 11,000 feet above the airport. LAX emergency procedures swung into full alert and a helicopter was dispatched to investigate. LAX is right on the ocean. Night was falling and the offshore breeze began to flow. It carried Larry out to sea with the helicopter in hot pursuit. Several miles out, the helicopter caught up with Larry. Once the crew determined that Larry was not dangerous, they attempted to close in for a rescue but the draft from the blades would push Larry away whenever they neared. Finally, the helicopter ascended to a position several hundred feet above Larry and lowered a rescue line. Larry snagged the line and was hauled back to shore. The difficult maneuver was flawlessly executed by the helicopter crew. As soon as Larry was hauled to earth, he was arrested by waiting members of the LAPD for violating LAX airspace.

As he was led away in handcuffs, a reporter dispatched to cover the daring rescue asked why he had done it. Larry stopped, turned and replied nonchalantly, "A man can't just sit around."


Much of this information was originally published in
Lawnchair Larry Flies!

http://en.wikipedia.org/wiki/Larry_Walters
http://www.snopes.com/travel/airline/walters.asp
A short documentary video about Larry's flight, by Ted Fisher

 


Beware of the Kidney Thieves

Beware of the Kidney Thieves: One of the best urban legends of all time. Drugged travelers awaken in ice-filled bathtubs only to discover one of their kidneys has been harvested by organ thieves.

The story went that a well-financed, highly organized gang operating in various major U.S. cities was drugging business travelers and making off with their kidneys to sell on the organ transplant black market.

The majority of people who had this pass through their hands failed to realize this was but an urban legend, an apocryphal tale told and re-told. Moreover, it was an urban legend that had been around at least since 1991.

As part of the effort to dispel belief in this nonsense, the National Kidney Foundation has asked any individual who claims to have had his or her kidneys illegally removed to step forward and contact them. So far, no one has shown up.

Who knows why, but in 1997 the mind contagion broke out in New Orleans. In January, as the city geared up for its annual Mardi Gras festivities, a rumor began circulating via word-of-mouth, fax, and email to the effect that a highly organized crime ring in New Orleans was carrying out plans to drug visitors, surgically remove organs from their bodies, and sell the organs on the black market.

The viral message, which most often arrived under the header "Travelers Beware," sparked an avalanche of phone calls to local authorities, prompting the New Orleans Police Department to publish an official statement on the Web to calm public fears. Investigators found no substantiating evidence whatsoever.

The story had a familiar ring. Before New Orleans, people said it happened in Houston; before Houston, Las Vegas — where an unsuspecting tourist was drugged in his hotel room by a prostitute and woke up the next morning, supposedly, in a bathtub full of ice, minus a kidney.

A chilling tale, and a dubious one

It's a scenario that has taken many forms, usually arriving by email or from a friend who'd heard it from another friend, whose mother swore it had happened to a distant cousin: the classic FOAF ("friend of a friend") story.

In some versions, the victim — we'll call him "Bob" — was on a business trip alone somewhere in Europe, and went out to a bar one night to have a cocktail. Wouldn't you know it, he woke up the next morning in an unfamiliar hotel room with severe pain in his lower back. He was taken to the emergency room, where doctors determined that, unbeknownst to him, Bob had undergone major surgery the night before. One of his kidneys had been removed, cleanly and professionally.

With minor variations, the same story has been told thousands of times by thousands of different people in many different locales. And it's always based, like the version I heard, and the version you heard, on third-, fourth-, or fifth-hand information.

It is, in fact, false. An urban legend.

Which is not to say that human organs are never traded illicitly in parts of the world where people can get away with it. The case for the existence of an international black market organ trade has become increasingly convincing in recent years. What remain unsubstantiated are the tales of "back room" organ thefts perpetrated in the dark night in secluded alleys and seedy hotel rooms. "There is absolutely no evidence of such activity ever occurring in the U.S. or any other industrialized country," says the United Network for Organ Sharing. "While the tale sounds credible enough to some listeners, it has no basis in the reality of organ transplantation."

In fact, it's all but impossible for such activities to take place outside properly equipped medical facilities, UNOS argues. The removal, transport, and transplantation of human organs involve procedures so complex and delicate, requiring a sterile setting, minute timing, and the support of so many highly trained personnel, that they simply could not be accomplished "on the street," as it were.

Even so, like so many urban legends fueled by irrational fear and ignorance, the organ theft story continues to spread from person to person and place to place, changing and adapting to its surroundings over time like a mutating virus.

Unlike many other urban legends, unfortunately, this one has put real people's lives at risk. A decade or so ago, rumors began spreading in Guatemala to the effect that Americans were kidnapping local children in order to harvest their organs for transplantation in the United States. In 1994, several U.S. citizens and Europeans were attacked by mobs who believed the rumors to be true. An American woman, June Weinstock, was severely beaten and remains critically impaired.

Kidney Theft: The Real Story

In January 2008, ABC News reported that officials in Gurgaon, India were trying to round up members of a criminal gang accused of drugging poor people, stealing their kidneys, and transplanting the organs into the bodies of wealthy customers. "It sounds like the old urban legend of people lured into an apartment or house and then being robbed of their kidneys," the report begins. "But in India, it is no legend."

In the legend, unsuspecting foreign tourists are drugged, kidnapped, and taken to makeshift operating rooms where their kidneys are stolen for sale on the black market. The actual victims, according to police, are poverty-stricken locals, enticed with promises of employment into what can only be described as a real-life house of horrors where they are forced to "donate" their organs at gunpoint. There are foreign tourists involved, to be sure, but in this scenario they are paying customers, not victims.

The police raided a "luxury guest house" owned by the alleged mastermind of the kidney theft ring, Dr. Amit Kumar. Neighbors had reported seeing blood running from the gutters of the building, not to mention "blood-soaked bandages and even bits of flesh" strewn in an open lot nearby.

So, is kidney theft an urban legend, or not? Yes and no. It depends on how the tale is told.

http://www.snopes.com/horrors/robbery/kidney.asp
http://urbanlegends.about.com/od/horrors/a/kidney_thieves.htm
http://urbanlegends.about.com/b/2008/01/28/kidney-theft-urban-legend-or-not.htm

 


The scuba diver in the tree

The story: While assessing the damage done by a forest fire in California, authorities were startled to discover the body of a man dressed in a wetsuit, complete with a dive tank, flippers, and face mask, in the branches of a tree.

Great story. Not an ounce of truth.

The strangely placed victim had suffered severe burns from the forest fire, but an autopsy revealed that he had not died from the flames, but from massive internal injuries. Dental records provided the victim's identification, and investigators contacted his family in an attempt to learn how a man who was dressed for scuba diving could possibly have ended up in the branches of a tree in the midst of hundreds of acres of charred forest.

According to the horrified family, the victim had been diving in the ocean some 30 miles away from the forest on the day that the fire had gotten out of control. As the investigators pieced together the grim details of the man's death, it became apparent that he had been accidentally scooped up along with thousands of gallons of water by one of a fleet of helitankers that had been called in to help the firefighters. Caught up in one of the huge buckets, the unfortunate scuba diver had been dumped along with the sea water in an attempt to put out the forest fire as quickly as possible.

While this story has been told many times since the late 1980s, there has never been a record of a diver in a scuba outfit being accidentally dumped by helicopter tankers on a forest fire. Authorities point out that while water is sometimes taken from lakes and ocean areas in an effort to extinguish forest fires as rapidly as possible, the helitankers suck up the water by means of a hose only a couple of inches in diameter. No one could be drawn into such a small opening and pulled into the tank.

ORIGINAL ARTICLE:
www.unexplainedstuff.com

www.snopes.com

 


AIG the insurer of the Congressional pension trust?

Have you heard this one? "Congress supported a bailout of AIG because that company insures the Congressional pension trust."

It's not true. There is no private insurance on any federal pensions.

http://www.snopes.com/politics/business/aig.asp


Move Over

Motorists in most U.S. states can be fined for failing to slow down or change lanes when passing parked emergency vehicles.

It's true.

http://www.snopes.com/politics/traffic/moveover.asp


Thursday, March 19, 2009

Medical Myths

A list of common medical or medicine-related beliefs espoused by physicians and the general public

Sometimes even doctors are duped, say Rachel C Vreeman and Aaron E Carroll

Physicians understand that practicing good medicine requires the constant acquisition of new knowledge, though they often assume their existing medical beliefs do not need re-examination. These medical myths are a light-hearted reminder that we can be wrong and need to question what other falsehoods we unwittingly propagate as we practice medicine. We generated a list of common medical or medicine-related beliefs espoused by physicians and the general public, based on statements we had heard endorsed on multiple occasions and thought were true or might be true. We selected seven for critical review:

  • We use only 10% of our brains
  • Eating turkey makes people especially drowsy
  • People should drink at least eight glasses of water a day
  • Reading in dim light ruins your eyesight
  • Hair and fingernails continue to grow after death
  • Shaving hair causes it to grow back faster, darker, or coarser
  • Mobile phones create considerable electromagnetic interference in hospitals

We use only 10% of our brains
The belief that we use only 10% of our brains has persisted for over a century, despite dramatic advances in neuroscience. Some sources attribute this claim to Albert Einstein, but no such reference or statement by Einstein has ever been recorded. The myth arose as early as 1907, propagated by multiple sources advocating the power of self-improvement and tapping into each person's unrealised latent abilities. But this really makes little sense—what is the other 90% supposed to be doing? Nothing? C'mon!

Evidence from studies of brain damage, brain imaging, localisation of function, microstructural analysis, and metabolic studies shows that people use much more than 10% of their brains.

Eating turkey makes people especially drowsy
The presence of tryptophan in turkey may be the most commonly known fact pertaining to amino acids and food. Scientific evidence shows that tryptophan is involved in sleep and mood control and can cause drowsiness.

The myth is the idea that consuming turkey (and the tryptophan it contains) might particularly predispose someone to sleepiness. Actually, turkey does not contain an exceptional amount of tryptophan. Turkey, chicken, and minced beef contain nearly equivalent amounts of tryptophan, while other common sources of protein, such as pork or cheese, contain more tryptophan per gram than turkey.

Other physiological mechanisms explain drowsiness after meals. Any large solid meal (such as turkey, sausages, stuffing, and assorted vegetables followed by Christmas pudding and brandy butter) can induce sleepiness because blood flow and oxygenation to the brain decreases, and meals either high in protein or carbohydrates may cause drowsiness. Accompanying wine may also play a role.

People should drink at least eight glasses of water a day
The advice to drink at least eight glasses of water a day can be found throughout the popular press, but there is no medical evidence showing that eight glasses is the proper amount. Just bunk! Water is good for you, of course, but don't drink excessive amounts of it, since that can be dangerous, resulting in water intoxication, hyponatraemia, and even death.

Reading in dim light ruins your eyesight
The fearful idea that reading in dim light could ruin one’s eyesight probably has its origins in the physiological experience of eye strain. Suboptimal lighting can create a sensation of having difficulty in focusing. It also decreases the rate of blinking and leads to discomfort from drying, particularly in conditions of voluntary squinting. The important counterpoint is that these effects do not persist.

The majority consensus in ophthalmology, as outlined in a collection of educational material for patients, is that reading in dim light does not damage your eyes. Although it can cause eye strain with multiple temporary negative effects, it is unlikely to cause a permanent change in the function or structure of the eyes. Hundreds of online expert opinions conclude that reading in low light does not hurt your eyes.

Hair and fingernails continue to grow after death
Morbid information about the body captures the imagination and reinforces medical mythology. Johnny Carson even perpetuated this myth with his joke, "For three days after death, hair and fingernails continue to grow, but phone calls taper off." To quote the expert opinion of forensic anthropologist William Maples, "It is a powerful, disturbing image, but it is pure moonshine. No such thing occurs."

Shaving hair causes it to grow back faster, darker, or coarser
Another common belief is that shaving hair off will cause it to grow back in a darker or coarser form or to grow back faster. It is often reinforced by popular media sources and perhaps by people contemplating the quick appearance of stubble on their own body. Pure BS.

Strong scientific evidence disproves these claims. As early as 1928, a clinical trial showed that shaving had no effect on hair growth. More recent studies confirm that shaving does not affect the thickness or rate of hair regrowth. Stubble merely seems coarser because shaved hair lacks the fine taper of unshaved hair, and the new hair has not yet been lightened by the sun or other chemical exposures, which makes it appear darker than existing hair.

Mobile phones create considerable electromagnetic interference in hospitals
After publication of a journal article citing more than 100 reports of suspected electromagnetic interference with medical devices before 1993, the Wall Street Journal published a front-page article highlighting this danger. Since that time, many hospitals have banned the use of mobile phones, perpetuating the belief. Despite the concerns, there is little evidence that mobile phones actually interfere with hospital medical equipment.


Much of this information was originally published in "Mixed messages: Medical myths" by Rachel C Vreeman, fellow in children's health services research, and Aaron E Carroll, assistant professor of paediatrics, BMJ Publishing Group:
http://www.bmj.com/cgi/content/short/335/7633/1288

 


Tuesday, March 17, 2009

The “Sasquatch” / Bigfoot Legend

After 80 years much of the Bigfoot evidence has been discovered to be a hoax, or due to misidentification, and much of it is just useless anecdotes. What’s needed is an actual specimen.

Gigantic footprints were discovered in the spring of 1958 at a northern California logging camp near Bluff Creek, and by late summer and into the fall, a bulldozer driver, Jerry Crew, began to find massive 16-inch tracks each morning in the freshly leveled dirt around his machine. He cast the footprints in plaster and took them to a local paper, which ran a story, giving rise to the term “Bigfoot” and sparking a flurry of worldwide interest.

The mystery was solved in 2002 when the family of Ray Wallace, the construction contractor who supervised workers at the footprint site, announced in Wallace's obituary that he had hoaxed the prints. While there was a certain amount of denial by Bigfoot researchers, Wallace had a decades-long history of producing jokes and hoaxes. Ray had a friend carve him 16-inch-long feet that he could strap on and make prints with. So it can all be blamed on Ray Wallace: Bigfoot is a hoax that was launched in August 1958.


Patterson’s famous Bigfoot photo of a guy in an ape suit

Bluff Creek was also the site of the next Bigfoot sensation—actual footage of a lanky ape-man loping along a stream bed—the famous Patterson film.

In 2004, Greg Long published “The Making of Bigfoot: The Inside Story,” which examined Roger Patterson's character. Long discovered that there was no lack of witnesses—neighbors, friends, business associates, and even Bigfoot researchers—who would describe Patterson as a shady character who engaged in check fraud and other scams and who couldn't be trusted. Ironically, the film he used at Bluff Creek was purchased with a bad check, and he was arrested for grand larceny for stealing the camera he used. Before the Bluff Creek footage was shot, Patterson made several low-budget Bigfoot films that showed he had plenty of experience creating fake Bigfoot-related footage. Long also claimed he uncovered the actual “man in the suit,” Bob Heironimus, who had acted in Patterson's other Bigfoot films. Considering Patterson's reputation, it's difficult to understand why Bigfoot researchers haven't been more suspicious of the film.

Bob Heironimus claims to have been the figure depicted in the Patterson film, and his allegations are detailed in Long's book. Heironimus was a tall (6 ft), muscular Yakima, Washington native, age 26, when he says Patterson offered him $1,000 to wear an ape suit for a Bigfoot film.

Long uncovered testimony that he contends corroborates Heironimus's claims: Russ Bohannon, a longtime friend, says that Heironimus revealed the hoax privately in 1968 or 1969. Heironimus says he did not publicly discuss his role in the hoax because he hoped to be repaid eventually. In separate incidents, Bob Heironimus and Heironimus's relatives (mother Opal and nephew John Miller) claim to have seen an ape suit in Heironimus's car.

After 80 years, then, much of the Bigfoot evidence has turned out to be hoaxes, misidentifications, or useless anecdotes. Until an actual specimen is produced, the skeptics will continue to hold the field.



Researcher says Bigfoot just a rubber gorilla suit
The Associated Press
ATLANTA August 20, 2008


Turns out Bigfoot was just a rubber suit. Two researchers on a quest to prove the existence of Bigfoot say that the carcass encased in a block of ice—handed over to them for an undisclosed sum by two men who claimed to have found it—was slowly thawed out and discovered to be a rubber gorilla outfit.

The hoaxers, Atlanta residents Matthew Whitton and Rick Dyer, claim it was not for profit, but they received $50,000 from a California Bigfoot tracker who now plans to sue to get the money back.

The two Georgia men’s tale of having found a Bigfoot carcass in the North Georgia woods really started to stink when California Bigfoot enthusiasts finally examined the body and found it was just a costume.

“There will be legal action,” said Catherine Ortez, who works for Searching for Bigfoot, Inc. in Menlo Park, Calif. The organization paid for rights to the men's story and their find. “If this was a joke, it was very methodical and thought-out,” she said.

Searching for Bigfoot was founded by Tom Biscardi, who authenticated and promoted the alleged Georgia Sasquatch. Biscardi, who did not return calls requesting comment, has his own credibility issues, according to a police officer in a nearby jurisdiction.

“He was involved in a similar hoax a few years back,” said Agent Dan Ryan with the Palo Alto (Calif.) Police Department.


Much of this info from the Skeptics Society:
http://www.skeptic.com/eskeptic/08-08-15

 


Sunday, March 15, 2009

Nostradamus Didn't Predict Anything

The prophecies of Nostradamus have a magical quality for those who study them: They are muddled and obscure before the predicted event, but become crystal clear after the event has occurred.

Michel de Nostredame (1503–1566), usually Latinized to Nostradamus, was a French apothecary. According to many in the popular press, he was a great seer who prophesied future events. Nostradamus wrote four-line verses (quatrains) in groups of 100. Some claim that Nostradamus predicted the Challenger space shuttle disaster on January 28, 1986. Of course, they didn't recognize that he had predicted it until it was too late.

The writings of Nostradamus are so cryptic that they can be interpreted to mean almost anything. With a little imagination, we can shoehorn just about any event into some passage in Nostradamus.

One thing Nostradamus didn't predict was that he would become a one-man industry in the 20th century. Publishing houses will never go broke printing the latest predictions culled from the manuscripts of Nostradamus.

http://www.skepdic.com/nostrada.html
http://en.wikipedia.org/wiki/Nostradamus

 


Friday, March 6, 2009

The Fifth Third Bank's unusual name

Fifth Third Bank's history can be traced back to 1858, when the Bank of the Ohio Valley opened in Cincinnati.

The Third National Bank was organized five years later, in 1863, and in 1871 the younger bank acquired the older one, beginning a 130-year history of banking acquisitions and mergers in which numerous smaller banks were absorbed and renamed.

Fifth Third's unusual name is the result of the 1908 merger of two banks, The Fifth National Bank and The Third National Bank, to become The Fifth Third National Bank of Cincinnati. Because the merger took place during a period when prohibitionist ideas were gaining popularity, it was believed that "Fifth Third" was better than "Third Fifth," which could be construed as a reference to three "fifths" of alcohol. The name went through several changes over the years until, in 1969, it was changed to Fifth Third Bank.

http://en.wikipedia.org/wiki/Fifth_Third_Bank

 
