# ABUSING STATISTICS FOR POLITICS AND PROFIT

## Beware of these abusive techniques.

There are many statistical functions that may be used to compare values. Unfortunately, there are more ways to use them wrong than to use them right. There are four ways to abuse statistics:

1. Bad data collection
2. Bad application of statistical functions
3. Forming the wrong conclusion
4. Sneaky tricks to mislead the unwary

Here is a list of the wrong things to do in statistics, with some horrific examples of how politicians and "scientists" have used them to mislead people. The list is sorted according to the four categories above:

### Bad data collection

• Sampling the wrong population: In a study on whether a gene causes homosexuality, one research group studied 15 homosexuals to see if they had a certain gene. They did NOT study anyone who is not homosexual, so they had nothing to compare their results with.
• Lack of a control group: The study above is also an example of this.
• Nonrandom sampling methods: Placing a survey form in a magazine that is read mostly by liberals will tend to elicit a liberal viewpoint, not an average viewpoint.
• Not checking for other variables acting on the sample: The number "Deaths caused by automobile airbags in the years 1990-1998" fails to reveal that most of them happened in 1997 and 1998. The number of airbags in existence (an independent variable) was increasing during that period.
• Not collecting a large enough sample: In the homosexual-genetics study above, they studied only 15 cases. This made the margin of error so large that the results would have been meaningless even if they had tested control subjects.
• Not noting whether the sampling design allows an already sampled item to be sampled again: Sampling with replacement and sampling without replacement require different formulas.
• Using experimental conditions that change the nature of the system being tested: Megadosing lab rats may cause metabolic changes that are not present with smaller doses over longer periods.
• Creating false data to replace data that were lost: In one suspected-carcinogen case, most of the lab rats were accidentally killed by fumigators spraying for bugs. Not wanting to lose a semester on his thesis, the research student obtained new rats to replace the dead ones (for display only), and faked data for the new rats based on data collected from the remaining original rats. He did not revise his error estimate to reflect the smaller number of rats. Unfortunately, the pesticide had induced tumors in the surviving original rats, and a perfectly safe product was removed from the market as a result.
• Discarding samples that don't fit the desired theory: Other laboratories were unable to duplicate the results of a study that concluded that 60 Hz power line radiation causes cell changes. The "researcher" had discarded all of the data that didn't fit his theory, leaving only about a tenth of the data he had collected.
• Reporting more precision than is available: Proponents of the "global warming" theory state that average temperatures have risen by half a degree since temperature readings were first recorded. The problem with this is that early thermometers had an accuracy of plus or minus one degree or worse.
• Changing the collection method in the middle of the experiment: Changes made in the methods used to obtain data may affect a time study.
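The sample-size problem above is easy to quantify. The following Python sketch computes the margin of error for a proportion observed in 15 subjects; the 95 percent confidence level and the normal approximation are my assumptions, since the study reported no error analysis at all:

```python
import math

# Margin of error for a sample proportion, using the normal approximation.
# n = 15 subjects, 4 of whom carried the gene (the reported "26 percent").
n = 15
p = 4 / 15                 # observed proportion, about 0.267
z = 1.96                   # z-score for 95% confidence (my assumption)

standard_error = math.sqrt(p * (1 - p) / n)
margin = z * standard_error

print(f"Observed proportion: {p:.1%}")           # 26.7%
print(f"Margin of error:     +/- {margin:.1%}")  # about +/- 22.4%
```

With so few subjects, the observed "26 percent" is compatible with anything from roughly 4 percent to 49 percent.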

### Bad application of statistical functions

• Using the wrong statistic: The nature of the population under study and the sampling method used must be studied very carefully to select the proper parameter or statistic to use when evaluating the data. In the homosexual-genetics study cited above, they should have tested for a correlation between homosexuality and the presence of the gene. Since they had no data on non-homosexuals, they could not very well do this. Therefore, they just reported the percentage of homosexuals that had the gene (which means nothing -- it could have just as well been an eye-color gene).
• Using the population standard deviation instead of the sample one: This is probably the most oft-repeated mistake, because the sample standard deviation (which divides by n - 1 instead of n) is harder to calculate. Using the wrong one changes the margin of error.
• Using the wrong error calculating function: The error function must be carefully chosen to match the parameter or statistic used.
• Using the wrong averaging method: The averaging method must be carefully chosen to match the type of data used. For example, velocities cannot be averaged using an arithmetic mean. Here is an example of an attempt to qualify a race car:

By rule, the driver must make an average speed of 60 mi/hr over four laps on a one-mile oval in order to qualify for the race. He killed his engine, and averaged only 30 mi/hr on his first lap. He averaged 60 mi/hr on the second. Then he floorboarded the car and averaged 90 mi/hr on the last two laps. He used his calculator to average his lap speeds using the arithmetic mean, and got 67.5 mi/hr. He thought he was in. But the official (who knew his math) said he did not make the race. Here's why:

| Qualifying | FIRST LAP | SECOND LAP | THIRD LAP | FOURTH LAP | Arithmetic mean | Geometric mean | Harmonic mean | Sum of parts |
|---|---|---|---|---|---|---|---|---|
| Distance | 1 mi | 1 mi | 1 mi | 1 mi | --- | --- | --- | 4 mi |
| Time | 2 min | 1 min | 2/3 min | 2/3 min | --- | --- | --- | 13/3 min |
| Speed | 30 mi/hr | 60 mi/hr | 90 mi/hr | 90 mi/hr | 67.5 mi/hr | 61.8 mi/hr | 55.4 mi/hr | 55.4 mi/hr |

In order to qualify, he had to average 60 mi/hr. That means he had to go four miles in less than four minutes, since it takes one minute to go one mile at 60 mi/hr. He took 3 and 2/3 minutes to finish the first 3 laps. That meant he had to go 180 mi/hr on the last lap to make the race.

The correct average speed is the sum of the distances divided by the total time taken to traverse them; do not rely on a ready-made formula. Notice that for the arithmetic mean to work here, the time durations of all of the speeds must be the same, and for the harmonic mean to work, the distances for all of the speeds must be the same (as they were in this special case).
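The race-car arithmetic above can be checked in a few lines of Python; the numbers are taken directly from the table:

```python
import math

# Lap speeds (mi/hr) over four one-mile laps, from the qualifying example.
speeds = [30, 60, 90, 90]
n = len(speeds)

arithmetic = sum(speeds) / n
geometric = math.prod(speeds) ** (1 / n)
harmonic = n / sum(1 / s for s in speeds)

# The correct average: total distance divided by total time.
distance = n * 1.0                            # four one-mile laps
time_hours = sum(1.0 / s for s in speeds)     # each lap takes 1/speed hours
true_average = distance / time_hours

print(f"Arithmetic mean: {arithmetic:.1f} mi/hr")    # 67.5
print(f"Geometric mean:  {geometric:.1f} mi/hr")     # 61.8
print(f"Harmonic mean:   {harmonic:.1f} mi/hr")      # 55.4
print(f"True average:    {true_average:.1f} mi/hr")  # 55.4
```

The harmonic mean agrees with the true distance-over-time average only because every lap was the same length; in general, always divide total distance by total time.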

• Comparing samples with different characteristics as though they were not different: Assuming that accident rates before a particular hazard is publicized are the same as those after publication can cause wrong results when comparing different kinds of accidents to each other.
• Misunderstanding or misapplying a formula: The 55 mi/hr speed limit was the result of misapplying the formula:

Power = Force × Velocity.

It applies only during the acceleration of a car up to speed, not while the car is moving at a steady speed.
• Assuming that an effect is linear: Many economists assume that the effect on the economy of increasing the tax rate is linear. But the effect is really more like subtracting the tax rate from 1 and squaring the result.
• Bad mathematics: In the homosexual-genetics study cited above, the "scientists" couldn't even get their mathematics right. With 15 subjects, every possible result is a multiple of 1/15, and 26 percent is not. They should have either reported 26.7 percent, or rounded it to 27 percent. The ignorance of even the most basic principles of scientific method shows throughout the procedures and calculations they performed.
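The "bad mathematics" point is easy to verify: with 15 subjects, the only reportable percentages are multiples of 100/15. A short Python check:

```python
# With a sample of 15, every possible result is k out of 15 subjects,
# so the only honest percentages are multiples of 100/15 (about 6.67 apart).
n = 15
possible = [round(100 * k / n, 1) for k in range(n + 1)]
print(possible)   # 0.0, 6.7, 13.3, 20.0, 26.7, 33.3, ...
```

26.7 appears in the list; a flat 26 does not, which is why the honest figure was 26.7 percent (or 27 after rounding).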

### Forming the wrong conclusion

• Forgetting to compare the experimental value to the control value: In the homosexual-genetics study above, they obtained a result that "26 percent" of the homosexuals tested had the gene. Without comparing their result to a control group, they "concluded" that the gene causes homosexuality.
• Failing to consider the margin of error: In the homosexual-genetics study, no margin of error was reported at all.
• Confusing correlation with causality: Because the incidence of the disease rickets was strongly correlated to being in certain families, early scientists concluded that the disease was hereditary. Later it was shown that rickets was the result of malnutrition, and that poverty, not rickets, was inherited. Likewise, in the homosexual-genetics study above, they concluded that the gene "caused homosexuality" when they had not even demonstrated a correlation between the presence of the gene and the presence of homosexuality.
• Using faulty logic to reach conclusions: In the homosexual-genetics study above, they deviated from sound logic in three ways:
1. Even though only 26 percent (sic) of the subjects had the gene, they concluded that it "caused" homosexuality. So how do they explain the other 74 percent?
2. They didn't even think about the rate of occurrence of the gene in the heterosexual population. It might be 26 percent too.
3. 25 percent is the expected value for a randomly distributed recessive gene. 4/15 (26.7%) is as close as one can get to 25 percent with a sample size of 15.
Some might think that I keep pointing to this study because of a supposed hatred for homosexuals. NOT SO! I keep indicating the study because there was so much wrong with the "scientific" methods used. It is the proverbial horrid example of science done wrong.
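Point 3 above can be pushed further with a quick binomial calculation. Assuming the 25 percent background rate expected for a randomly distributed recessive gene, finding the gene in 4 of 15 subjects is exactly what chance would predict (this Python sketch is mine, not part of the original study):

```python
from math import comb

# Probability of seeing exactly k carriers among n subjects when the
# population carrier rate is p, using the binomial distribution.
def binom_pmf(k, n, p):
    return comb(n, k) * p**k * (1 - p)**(n - k)

n, p = 15, 0.25          # 15 subjects; 25% background rate (assumed)
p_exactly_4 = binom_pmf(4, n, p)
p_4_or_more = sum(binom_pmf(k, n, p) for k in range(4, n + 1))

print(f"P(exactly 4 of 15) = {p_exactly_4:.3f}")   # about 0.225
print(f"P(4 or more of 15) = {p_4_or_more:.3f}")   # about 0.539
# Four of fifteen is tied for the most likely single outcome under
# pure chance, so the observation is no evidence the gene is special.
```

In other words, the headline number is indistinguishable from what a completely irrelevant gene would produce.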
• Affirming the Consequent: Those saying Global Warming is real are using the effects that Global Warming is expected to produce to "prove" the presence of Global Warming. But they ignore the fact that those effects might have other causes besides Global Warming.

In addition, their conclusion that man is causing Global Warming is also affirming the consequent. There could be other causes, including an increase in the output of the sun (demonstrated by the presence of Global Warming on Venus, Mars, Europa, and Titan).

• False Premise: Some UFO investigators used the following logic to show that a particular UFO sighting was unidentified:

"The object was seen high in the sky, moving toward the east. The weather bureau said that they had launched a balloon fifteen minutes before the sighting. Although the object appeared similar to a balloon, and the motion was characteristic of a balloon, it could not have been the balloon. The weather bureau also stated that there was a west wind. If the object was a balloon, it was heading directly into the wind."

What's wrong with this conclusion? The premise that the west wind would make the object move west is false. By the definition that weathermen use, a west wind comes from the west. It doesn't blow toward the west. The object was probably the weather balloon.

• Confusing a curative effect with removing the cause: Because removing salt from the diet has a curative effect on some cases of high blood pressure, some people (including some government regulators) jumped to the conclusion that salt causes high blood pressure. Actually, removing salt abnormally thins the blood, allowing it to flow through constricted spaces more easily.

### Sneaky tricks to mislead the unwary

• Using numbers to mislead the naive: In the homosexual-genetics study above, they threw out a number, and then "concluded" whatever they wanted to. They counted on the inability of most of the press (and the public) to understand what the number meant, or whether there was any meaning to it at all.
• Defining the premise from the wanted conclusion: The homosexual-genetics study intended to conclude that homosexuals can not help what they do because they are genetically disposed to it. The premise they used is: "If a person is genetically disposed to something, he must do it." Therefore, the entire purpose of the study was to fool the press into thinking the following:
• "Since there is a 'genetic cause' for homosexuality, it can't be a wrongful act."
• "Churches can't call homosexuality a sin, because the homosexual can't help what he does and feels. It's genetic in nature."
But the self-preservation instinct makes even animals steal from and kill other animals, even when they are not being attacked. Those actions are illegal for humans to do, and are called sins by churches. So genetics is not even a valid argument for this cause. The premise is false.

In other words, the whole purpose for the study was to gain political advantage.

• Publishing without verification: Remember cold fusion? Environmentalists use the same trick to force their agendas on others, as did the perpetrators of the homosexual-genetics study.
• Using strong scientific-sounding phrases: When the perpetrators of false science know they do not have a leg to stand on logically, they resort to using scientific-sounding language to confound those untutored in the sciences. They don't fool the scientists, but they fool enough legislators and reporters to get their agenda passed into law. The debates over laetrile, vitamin megadosing, and extraterrestrial spaceship landings are examples of this kind of "science." One UFO "investigator" proposed that UFOs could use microwaves to "steal" power from car batteries through the headlights. Anyone with any knowledge of electronics knows that this statement is pure mouse manure (or this idiot left his car with the headlights on, and the blackbody emission of the filaments had dropped into the microwave region because the battery was almost dead).
• Appealing to emotion instead of logic: This is repeatedly used by environmentalists who have no scientific basis for the product bans they demand. This method is also a favorite of those claiming that aliens are abducting people.
• Using scare tactics: This is similar to appealing to emotions. They present a huge, terrifying scenario of what might happen if they don't get their way. But they have no proof of any of it, so they want to scare you into believing it. The anti-nuclear crowd uses this trick heavily, as do other environmentalists. And the entire global warming argument is made up of these scares.
• Using faulty logic to demonstrate conclusions: This is a common trick used by those favoring an extraterrestrial explanation for UFOs. It goes like this:
• "We called the airport, and they had no planes in the area. The weather bureau had not launched any balloons, and there were no astronomical objects that fit the description. Therefore, this case must have been caused by an extraterrestrial spaceship."
There are several faults in this logic:
1. Private, commercial, and military airports each have knowledge of their own kind of aircraft, but very little knowledge of flights of the other two kinds. Planes flying under Visual Flight Rules (VFR) might not be listed at any of the airports.
2. There are many sources of balloons other than the weather bureau, including children and pranksters.
3. Even if all of the things the investigator thinks of are solidly disproved, there are many mundane causes that the investigator can not possibly think of.
4. Even if it is proven that no known phenomenon could have caused the sighting, it does not logically follow that an extraterrestrial spaceship caused the sighting. Using that same faulty logic, I could "prove" a time ship from the future, a space warp, a Smurf conspiracy, demonic activity, Klingons, or a thousand other flights of fancy.
• Using language designed to mislead: In this case, language is misused to "prove" the case. A sentence is designed to read two different ways, one that represents the truth, and the other misleading people to think what the deceiver wants them to think. Newspaper headlines often contain such deceptions. Here are a few I have collected that can really mislead, with the correct reading in parentheses:
1. "District Dumps Plan to Take Other Trash" (The plan was dumped.)
2. "Milk Drinkers Turn to Powder" (They used powdered milk during a strike.)
3. "Senate Approves Dishonesty Policy" (They decided what to do about dishonesty, not to support it.)
4. "Criminal Investigators hold up traffic" (Not with guns, but maybe the writer thinks it's criminal that he had to wait for them.
5. "Board Suspends Policeman in Wardrobe Case" (He was not in proper uniform, so the board took away his job for a while.)
6. "Environmentalists like NAZIs" (They behave similarly.)
7. "Man Shot in Back, Head Found in Street" (This was not as violent as depicted. He was shot twice. Nobody was decapitated.)
8. "Drunk gets 9 Months in Violin Case" (The writer might think the sentence was too severe, but they didn't stuff him in a case, he broke a violin while he was drunk.)
• Printing misleading charts: This is the sneakiest trick. "News" magazines do it all the time, to slant the news to their own political view. Here are some of the tricks used, and how they mislead:
1. Showing an inappropriately large or small amount of data. This buries or emphasizes the change being reported.
2. Reversing one of the axes of a graph. When someone takes a quick glance at the chart, this makes a change seem to be reversed from what it really is.
3. Using a nonlinear scale where it is not appropriate. Anyone used to a linear scale would be misled.
4. Covering up part of the graph with text, a logo, or the legend. This hides the swing in the value that they want you to forget about.
5. Using a scale that covers so little that parts of the trace run off the edge. This hides data they don't want you to see.
6. Not showing the scale calibrations at all. Keep them in the dark!
7. Not using equal scale divisions in a histogram. This makes the wider division seem more prominent than it is, because the bar is taller (but it includes more of the population too).
8. Making some bars of a histogram wider, when in fact the divisions are really the same size. Ostensibly, this is done to make room for text, but it also makes that bar seem numerically larger than it is.
9. Tilting the chart on the page. A decrease can be made to look like an increase, because the line runs up on the page, even though it runs down on the graph. Tilting the chart the other way makes an increase look like a decrease. Time and Newsweek used this trick during the Carter presidency to make the economy look better than it was, and during the Reagan presidency to make economic gains under Reaganomics look like losses.
10. Changing the linear size of a pictogram in BOTH directions proportional to the change in value, instead of using the area as the value. This makes a value that really doubled look four times as large.
11. Using an odd shape for a pie or bar chart. This makes the areas of the chart not correspond to the actual values.
12. Depicting the same information twice in different ways (employment and unemployment on the same chart).
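Trick 10 is worth a quick calculation. If a value doubles and the artist scales both the width and the height of the pictogram by that factor, the area the eye compares quadruples; a minimal Python sketch:

```python
# A value doubles from 100 to 200.
old_value, new_value = 100, 200
ratio = new_value / old_value            # 2.0

# Honest pictogram: scale the AREA in proportion to the value.
honest_area_ratio = ratio                # area doubles -> looks doubled

# Misleading pictogram: scale BOTH width and height by the ratio.
misleading_area_ratio = ratio ** 2       # area quadruples -> looks quadrupled

print(f"Value ratio:           {ratio:.0f}x")
print(f"Honest area ratio:     {honest_area_ratio:.0f}x")
print(f"Misleading area ratio: {misleading_area_ratio:.0f}x")
```

An honest pictogram scales area, not linear size, in proportion to the value.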

One has to be very wary when reading and interpreting the claims made by others. Many have axes to grind, and they want to use you for a whetstone.