Using Statistics Among 'Christians'
Not too long ago, both The Gospel Coalition and Ed Stetzer suggested that evangelicals are erroneously using statistics on divorce when they quote the Barna Group saying things like, "In fact, when evangelicals and non-evangelical born again Christians are combined into an aggregate class of born again adults, their divorce figure is statistically identical to that of non-born again adults: 32% versus 33%, respectively" [1].
Who is correct? Are evangelicals mishandling this information?
I should probably state that Ed Stetzer was not specifically saying that the Barna Group was wrong, but that evangelicals are wrong in saying, "the divorce rate is the same between Christians and non-Christians." However, since this information comes from the Barna Group, by extension it would seem that the Barna Group and Stetzer were in disagreement. When I asked Stetzer if we can trust the Barna Group's statement, he said yes and provided me with a brief explanation. (He has also written out this explanation in further detail on his website.)
I should also disclose that I recently quoted the Barna statistic in a podcast on divorce. Am I erroneously using data? Well, I can say that it was not my intention; but I can also say this matter is not about one statistic being correct and another being wrong. It's about what we are measuring and how we are doing it.
So what's going on here? On one side we have Christians who conduct extensive research using census data collected from every person in the nation, often self-reported. On the other side we have research groups conducting carefully designed questionnaire-based data collection that uses samples. And we probably have reports somewhere in the middle that use data sets from both collection methods.
The national census asks people about their faith and it would seem that many people (especially in the southern US states) simply report that they are Christian regardless of their regenerate (or born-again) state. Some of these self-reporters feel that they are Christian if they attend church on occasion, or maybe even if they've ever attended a Christian church at some point in their lifetime. Maybe they believe they are Christians if their parents had them baptized as babies. Others report that they are Christian even when many theologians would argue that the faith group these people attend is decidedly not Christian, but rather, heretical. Reports generated from this kind of data collection seem to demonstrate that the divorce rate, for example, is "statistically identical" between Christians and non-Christians.
On the other hand, researchers like Ed Stetzer--who conduct sample-based research--design questions that attempt to get at the heart of a person's faith. They inquire in ways that they believe measure a person's actual commitment to their faith, and sometimes they ask questions to assess faith itself. They might ask if the person claimed to be a Christian at the time of divorce. There could be questions about the person's level of involvement or commitment with his or her faith. Collecting data in a more focused study tends to yield decidedly different results, but at the same time, the researcher makes judgments about the person's faith apart from the subject's own self-statement. In these cases, it does seem that active Christians are less likely to get divorced.
But who is right and who is mishandling statistics? They're both right and we're probably all mishandling statistics.
When I see a tweet from John Piper talking about marriage and linking to one kind of report, I understand that his intention is to celebrate numbers supporting the claim that Christians divorce less often. His intentions are good, but the information doesn't accurately report how the measurement was taken--or more specifically, what is actually being measured. (And it's not as if the pastor needs to offer a 22-page academic report on the findings.)
When I said in a podcast that the divorce rate looks like that of the world, I too was pointing to numbers and research to make my point without a good explanation or citation of what's being measured. I probably should have stated the name of the report, or that it comes from census information and self-reported data. (I accept full responsibility for that mistake.) I could have also pointed out that when data is collected differently, the numbers appear differently. But our podcast has a wide variety of listeners (both believers and non-believers), and the point was divorce, not statistical research methods.
The truth is, both reports are correct. If we were to ask how far Seattle is from Salt Lake City, one person could report the number of hours it takes to drive between the cities, another could state the highway mileage, while still another could report the distance as the crow flies and use kilometers. Do we measure from the edge of the city limits, or the center, or from some other point? What route do we take if we drive? How fast are we to go? Does the way we measure make the answer wrong? No. It just needs to be qualified so we know what was measured and how we did it.
The question about divorce rates among Christians is not so much about misusing information, as some have suggested, but about what is being measured. In the case of the divorce rate, it seems one measurement examines the divorce rate among everybody who accepted the title of Christian for themselves at some point in their life, while the other measurement tries to determine who is actually a Christian and then measures only those people. In either case, I'd say the only misuse of the numbers is not stating the background behind the measurement. That being said, we really do have an obligation to share this information honestly and accurately. And we really ought to examine ourselves the next time we are asked if we are a Christian.
___
1. Barna Group, "New Marriage and Divorce Statistics Released," March 31, 2008 [www.barna.org/family-kids-articles/42-new-marriage-and-divorce-statistics-released?q=divorce, accessed September 28, 2012].
* Photo by Leo Reynolds is licensed under a Creative Commons License and used with permission.