
A peer-reviewed study about Wikipedia's accuracy



http://www.nature.com/news/2005/051212/full/438900a.html

 

However, an expert-led investigation carried out by Nature — the first to use peer review to compare Wikipedia and Britannica's coverage of science — suggests that such high-profile examples are the exception rather than the rule.

 

The exercise revealed numerous errors in both encyclopaedias, but among 42 entries tested, the difference in accuracy was not particularly great: the average science entry in Wikipedia contained around four inaccuracies; Britannica, about three.



So the peer-reviewed study found that the source you use as a primary source is in fact 33% more inaccurate than an encyclopedia that NO ONE would ever accept as a primary source.

 

And this somehow proves you're not a moron? :rolleyes:

Only eight serious errors, such as misinterpretations of important concepts, were detected in the pairs of articles reviewed, four from each encyclopaedia.

Wikipedia's relatively small number of serious errors means it's a good starting point. If something doesn't ring true, it's obviously in your best interest to do further research.


Wikipedia's relatively small number of serious errors means it's a good starting point. If something doesn't ring true, it's obviously in your best interest to do further research.

 

Okay, so never mind that it's less accurate than Britannica..."it's a good starting point."

 

Too bad you don't USE it as a starting point, but as an excuse to avoid further research. :rolleyes:


Okay, so never mind that it's less accurate than Britannica..."it's a good starting point."

 

Too bad you don't USE it as a starting point, but as an excuse to avoid further research. :rolleyes:

I don't remember you providing any research in our debates about regression toward the mean, eugenics, or our other main points of disagreement. The pot is calling me black!


I don't remember you providing any research in our debates about regression toward the mean, eugenics, or our other main points of disagreement. The pot is calling me black!

 

 

I didn't need any; I quoted mathematical fact and backed it up with very simple, relevant examples that everyone BUT you understood. Regression toward the mean is a function of the variance in the statistical distribution of a system. The easiest example of this is a pair of dice, where the roll subsequent to a very high or low roll is closer to the system mean because there are more states available to the system near the mean than there are near the very high or low roll. This holds true for any statistical system with any randomness - which means any statistical system where measurements are correlated by a factor of less than 1. Any eugenics scheme to breed for intelligence is a statistical system with a correlation factor of less than 1. The heritability factor you so desperately cling to (and horribly misunderstand) is the correlation factor of the system that, being less than one, mathematically defines the variance in the statistical distribution of intelligence that ultimately requires that regression toward the mean occur. Period. End of story. No bull sh-- about "error causes regression to the mean." That's the math, and I've said it before. You're just too damn stupid to understand it.
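
The dice example can actually be simulated directly. This is only an illustrative sketch (the number of rolls and the cutoff for an "extreme" roll are invented for the demonstration):

```python
import random

random.seed(0)
# Simulate many rolls of a pair of fair dice; the mean total is 7.
rolls = [random.randint(1, 6) + random.randint(1, 6) for _ in range(100_000)]

# Collect the roll immediately following each extreme roll (<= 4 or >= 10).
extreme_next = [rolls[i + 1] for i in range(len(rolls) - 1)
                if rolls[i] <= 4 or rolls[i] >= 10]

# Successive rolls are independent, so the follow-up rolls average near 7:
# closer to the mean than the extreme rolls that preceded them.
print(sum(extreme_next) / len(extreme_next))
```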

 

And I don't even know why I bothered. You're still too damn stupid to understand it. :rolleyes:


Regression toward the mean is a function of the variance in the statistical distribution of a system. The easiest example of this is a pair of dice, where the roll subsequent to a very high or low roll is closer to the system mean because there are more states available to the system near the mean than there are near the very high or low roll. This holds true for any statistical system with any randomness - which means any statistical system where measurements are correlated by a factor of less than 1.

You've uncharacteristically written an error-free paragraph about regression toward the mean. Color me impressed.

Any eugenics scheme to breed for intelligence is a statistical system with a correlation factor of less than 1. The heritability factor you so desperately cling to (and horribly misunderstand) is the correlation factor of the system that, being less than one, mathematically defines the variance in the statistical distribution of intelligence that ultimately requires that regression toward the mean occur. Period. End of story. No bull sh-- about "error causes regression to the mean."

If you could get rid of I.Q. measurement error, the appearance of regression toward the mean would decline. But given that regression toward the mean takes place for height (where there is no measurement error) it's reasonable to conclude that some of the observed regression toward the intellectual mean is a product of a real, underlying phenomenon.
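
The height claim can be reproduced with a toy model. This is just a sketch: the parent-child correlation of 0.5 and the population numbers are invented for illustration, not taken from any study cited in this thread:

```python
import random

random.seed(1)
R = 0.5                   # assumed parent-child height correlation (illustrative)
MEAN, SD = 175.0, 7.0     # invented population mean/SD in cm

parents = [random.gauss(MEAN, SD) for _ in range(50_000)]
# Give each child the same marginal distribution, correlated with the
# parent by factor R (a standard bivariate-normal construction).
children = [MEAN + R * (p - MEAN) + random.gauss(0, SD * (1 - R**2) ** 0.5)
            for p in parents]

# Children of unusually tall parents (more than 1 SD above the mean)
# remain above average, but sit closer to the mean than their parents:
# regression toward the mean with no measurement error anywhere.
tall = [(p, c) for p, c in zip(parents, children) if p > MEAN + SD]
avg_parent = sum(p for p, _ in tall) / len(tall)
avg_child = sum(c for _, c in tall) / len(tall)
print(avg_parent, avg_child)
```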

 

That said, there are few if any traits which have a narrow sense heritability of exactly one. This means that for just about every imaginable characteristic, there will be regression toward the mean. Despite this regression toward the mean, Darwinism states that if a given trait confers a survival or reproductive advantage, the species will gradually change to have more and more of that trait. What you seem to be saying is that regression toward the mean makes such long-term change impossible. If your ideas were right, human intelligence should have regressed toward the original, ape-like mean that existed 3 million years ago.


You've uncharacteristically written an error-free paragraph about regression toward the mean. Color me impressed.

 

It's the same thing I've been writing - and you've been arguing against - for THREE MONTHS, you idiot. :lol:

 

If you could get rid of I.Q. measurement error, the appearance of regression toward the mean would decline. But given that regression toward the mean takes place for height (where there is no measurement error) it's reasonable to conclude that some of the observed regression toward the intellectual mean is a product of a real, underlying phenomenon.

1) No, because - again - the net effect of individual instances of normally distributed error over the entire population is ZERO. Error does not cause an appearance of regression toward the mean, unless you do something stupid and invalid like split the population into two subsets and extrapolate the behavior of one to the population as a whole while ignoring the behavior of the other. As you've been doing.

 

2) But even despite (1) above, you're now claiming I'm right, and you never said error causes regression toward the mean. You are a !@#$ing piece of work. :devil:
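
Both halves of this dispute can be checked numerically. A minimal sketch (the score scale and error sizes are invented): over the whole population the zero-mean error does net out, while a subset selected for extreme first-test scores does show regression on retest:

```python
import random

random.seed(2)
N = 50_000
true_iq = [random.gauss(100, 15) for _ in range(N)]    # invented scale
test1 = [t + random.gauss(0, 5) for t in true_iq]      # zero-mean test error
test2 = [t + random.gauss(0, 5) for t in true_iq]      # independent retest error

# Over the whole population, the zero-mean error nets out: the two
# test administrations have essentially identical means.
m1, m2 = sum(test1) / N, sum(test2) / N

# But the top 10% on the first test scores lower, on average, on retest.
cutoff = sorted(test1)[int(0.9 * N)]
top = [(a, b) for a, b in zip(test1, test2) if a >= cutoff]
top1 = sum(a for a, _ in top) / len(top)
top2 = sum(b for _, b in top) / len(top)
print(m1, m2, top1, top2)
```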

 

That said, there are few if any traits which have a narrow sense heritability of exactly one. This means that for just about every imaginable characteristic, there will be regression toward the mean. Despite this regression toward the mean, Darwinism states that if a given trait confers a survival or reproductive advantage, the species will gradually change to have more and more of that trait. What you seem to be saying is that regression toward the mean makes such long-term change impossible. If your ideas were right, human intelligence should have regressed toward the original, ape-like mean that existed 3 million years ago.

 

:oops: That's not what Darwinism says. Where do you get this nonsense?


That said, there are few if any traits which have a narrow sense heritability of exactly one. This means that for just about every imaginable characteristic, there will be regression toward the mean. Despite this regression toward the mean, Darwinism states that if a given trait confers a survival or reproductive advantage, the species will gradually change to have more and more of that trait. What you seem to be saying is that regression toward the mean makes such long-term change impossible. If your ideas were right, human intelligence should have regressed toward the original, ape-like mean that existed 3 million years ago.

 

No. He's not. If you had even the slightest shred of knowledge about biology and genetics (which you have proven time and time again that you don't), you'd understand why your above statements are completely and utterly wrong. You've spectacularly displayed again and again that you have no idea how genetic information is passed on or expressed in organisms. Furthermore, you have shown no comprehension of the underlying mechanism of the passing of genetic information. Couple this with your complete lack of basic math understanding, and you have the Holcomb's Arm perfect storm of idiocy.

 

But here's the caveat. You don't have the slightest clue about biology or genetics. So you take your incorrect statement, pass it on as truth, and then use these falsehoods to support your other incorrect assumptions, and continue this self-propagating idiocy.


It's the same thing I've been writing - and you've been arguing against - for THREE MONTHS, you idiot. :lol:

Wrong. I've been perfectly clear about the fact that when results are obtained by random chance, regression toward the mean will generally occur. You may recall the coin flip example I gave a while back. A group of children is asked to predict the outcome of 100 consecutive coin flips. The best child is right 62% of the time. When this kid's retested, her expected score the second time around is 50%--complete regression toward the mean.
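
The coin-flip example is easy to simulate; a sketch, assuming a classroom of 30 children (the class size and number of repetitions are invented for the demonstration):

```python
import random

random.seed(3)

def guess_score(n=100):
    # One child guessing n fair coin flips: each guess is right with p = 0.5.
    return sum(random.random() < 0.5 for _ in range(n))

best_first, retest = [], []
for _ in range(2_000):
    group = [guess_score() for _ in range(30)]  # first round: 30 children
    best_first.append(max(group))               # the best child's first score
    # Success was pure chance, so the best child's retest is just another
    # draw from Binomial(100, 0.5): expected score 50, complete regression.
    retest.append(guess_score())

print(sum(best_first) / len(best_first), sum(retest) / len(retest))
```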

1) No, because - again - the net effect of individual instances of normally distributed error over the entire population is ZERO. Error does not cause an appearance of regression toward the mean, unless you do something stupid and invalid like split the population into two subsets and extrapolate the behavior of one to the population as a whole while ignoring the behavior of the other. As you've been doing.

Measurement error means that the correlation between test and retest will be less than one. Whenever there's a test/retest situation, with a correlation of less than one, there will be regression toward the mean. The fact that normally distributed error is zero over the whole population simply isn't relevant to that phenomenon.

 

:oops: That's not what Darwinism says. Where do you get this nonsense?

I suggest you start reading up on Darwinism.


No. He's not. If you had even the slightest shred of knowledge about biology and genetics (which you have proven time and time again that you don't), you'd understand why your above statements are completely and utterly wrong. You've spectacularly displayed again and again that you have no idea how genetic information is passed on or expressed in organisms. Furthermore, you have shown no comprehension of the underlying mechanism of the passing of genetic information. Couple this with your complete lack of basic math understanding, and you have the Holcomb's Arm perfect storm of idiocy.

 

But here's the caveat. You don't have the slightest clue about biology or genetics. So you take your incorrect statement, pass it on as truth, and then use these falsehoods to support your other incorrect assumptions, and continue this self-propagating idiocy.

You really don't have the faintest clue about what you're talking about, do you?


Measurement error means that the correlation between test and retest will be less than one. Whenever there's a test/retest situation, with a correlation of less than one, there will be regression toward the mean.

 

The mean OF THE ERROR, YOU NITWIT.

 

The fact that normally distributed error is zero over the whole population simply isn't relevant to that phenomenon.

Only because you're taking the correlation of one distribution and applying it to a COMPLETELY DIFFERENT distribution. The distribution of error of a single score and the distribution of scores within the bulk population are NOT THE SAME THING. What part of "Things that are not the same are, in fact, different." is so !@#$ing hard for you to understand? :lol:

 

I suggest you start reading up on Darwinism.

 

Why? You're the one that doesn't understand it. Genetics and the inheritance of traits do NOT cause evolution. That's pretty much central to Darwin's On the Origin of Species. There's a reason he called his theory "natural selection" and not "evolution".


Wrong. I've been perfectly clear about the fact that when results are obtained by random chance, regression toward the mean will generally occur. You may recall the coin flip example I gave a while back. A group of children is asked to predict the outcome of 100 consecutive coin flips. The best child is right 62% of the time. When this kid's retested, her expected score the second time around is 50%--complete regression toward the mean.

 

Measurement error means that the correlation between test and retest will be less than one. Whenever there's a test/retest situation, with a correlation of less than one, there will be regression toward the mean. The fact that normally distributed error is zero over the whole population simply isn't relevant to that phenomenon.

I suggest you start reading up on Darwinism.

Not necessarily.

 

As an example, if you measure someone's height with an 11" ruler, believing it to be 12", and take that measurement 100 times with the same incorrect ruler, you can get the exact same result every single time. The correlation would be a perfect 1, and yet every single measurement contains error.
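
The ruler example distinguishes systematic from random error, and the distinction is easy to check numerically. A sketch (all numbers invented for illustration):

```python
import random

def corr(xs, ys):
    # Pearson correlation, computed from scratch to stay dependency-free.
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5

random.seed(4)
heights = [random.gauss(170, 8) for _ in range(10_000)]

# Systematic error: an 11" ruler believed to be 12" inflates every reading
# by the same factor, so test and retest still correlate perfectly.
biased1 = [h * 12 / 11 for h in heights]
biased2 = [h * 12 / 11 for h in heights]
print(corr(biased1, biased2))   # perfect correlation despite the bias

# Random error is what drags the test-retest correlation below 1
# (and with it produces regression to the mean in selected subsets).
noisy1 = [h + random.gauss(0, 3) for h in heights]
noisy2 = [h + random.gauss(0, 3) for h in heights]
print(corr(noisy1, noisy2))
```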


You really don't have the faintest clue about what you're talking about, do you?

 

Once again: all bluster, no facts or evidence.

 

Just because you don't have the slightest clue how Darwinism or genetics work, you try to cover it up by not supporting your own opinion, and then claiming that other people don't know what they're talking about? :oops:

 

But that's right, you DO know what you are talking about, because according to you, intelligence is determined by 100 genes, and if the heritability is .80, a person will get 80 intelligence genes from their parents. :devil::lol:


Once again: all bluster, no facts or evidence.

 

Just because you don't have the slightest clue how Darwinism or genetics work, you try to cover it up by not supporting your own opinion, and then claiming that other people don't know what they're talking about? :oops:

 

But that's right, you DO know what you are talking about, because according to you, intelligence is determined by 100 genes, and if the heritability is .80, a person will get 80 intelligence genes from their parents. :devil::lol:

 

But don't forget that heritability varies with age. At the age of 2 or so, the heritability of intelligence is .2; it only rises to .80 as you reach adulthood.

 

So apparently you're born with 20 of your parents' 100 intelligence genes, and they gradually give you the other 60 as you grow up.
