A recently released review by leading survey researchers has pointed to holes in the methodology of the National Jewish Population Survey 2000-2001 that could have led to an undercount of the Jewish population.
The review comes as emotions over the controversial population survey conducted by United Jewish Communities are reaching a fever pitch, with much of the attention falling on the Forward and its editor for public criticisms of the study. Officials at UJC commissioned the review after deciding a year ago to delay release of the $6 million survey because screening data was found to be missing. They tapped Mark Schulman, past president of the American Association for Public Opinion Research, to lead the process.
In his report on the survey, Schulman cited a laundry list of errors and missteps in the data collection that he said may have led to an undercount of the overall Jewish population, damaged the survey’s comparability with the previous survey conducted in 1990 and perhaps inflated the proportion of Jews in the new survey who are more religiously identified. The 2000 survey found only 5.2 million American Jews, down from the 5.5 million found in 1990. But Schulman told the Forward that even such general comparisons of the two studies should be done “at your own risk.”
Despite the problems, Schulman commended UJC for airing the study’s faults and thoroughly addressing them. “UJC folks were diligent in trying to identify the issues,” he told the Forward. “They didn’t just put out the data and run.”
Other researchers called for less rhetoric over the numbers and more research into their meaning: “Every survey has its share of ‘kick me’ moments,” said Charles Kadushin of the North American Jewish Data Bank at Brandeis University. “Does this survey have more than its share of kick-me’s? Yes, probably. Is it a total loss? No. Lay people want to know numbers first. Analysis is secondary. But these kinds of studies are best at analysis, so in a sense the best is yet to come.”
For now, though, most of the public discussion about the study appears to be focused on a September 17 opinion article in The New York Times written by Forward editor J.J. Goldberg. In the article, he criticized UJC for publishing what he called “flawed figures” that depict the nation’s Jewish population as shrinking. “Whether out of ideology, ego, incompetence or, as I suspect, a combination of all three, the respected charity has invented a crisis,” Goldberg wrote.
The opinion article caused an uproar within the halls of UJC. The organization’s president and CEO, Stephen Hoffman, was quoted in the Chicago Tribune this week as saying, “J.J. Goldberg slandered us by saying we have an ulterior motive, and that is a flat-out lie.” In his weekly memorandum to UJC leaders, Hoffman said he was saddened, first, that Goldberg “could be so off track regarding our motives and unmindful of the truly significant numbers uncovered by the survey,” and second, “that he could so easily speak so hurtfully about fellow Jews in such a public forum. Have you no shame, Mr. Goldberg?”
Goldberg told the Jewish Telegraphic Agency that UJC leaders were “vilifying the messenger when they don’t like the message.” The population figure “is supposed to be a statistic,” Goldberg told the JTA. “A statistic that is off by a couple of hundred thousands is not a statistic. That’s ridiculous.”
Goldberg also told the JTA: “They’ve created headlines across the country, and they don’t take any responsibility for them.”
Schulman’s report suggests that, indeed, the survey may have undercounted the Jewish population. The report looked at design factors, such as the opening “screening” question, that may have discouraged some people from participating, contributing to a low 28% response rate. The first question — “What is your religion, if any?” — might have caused many Jews to opt out of the study, according to Schulman. He argued that the low response rate raises a “‘yellow flag’ of caution in interpreting the results,” which found only 1.6% of American adults to be Jewish, compared with several well-regarded surveys that put the figure closer to 1.8%. “Other surveys, frankly, ask some nonthreatening questions up front,” Schulman said.
Schulman noted what he said might be an even more problematic finding in the survey: About 38% of Jews identified by screeners as eligible for interviews did not complete their interviews. Schulman pointed to research on this phenomenon showing that unaffiliated and interfaith households were less likely to complete the interview, thus skewing the sample in favor of the more religiously inclined.
While some researchers say those who chose to opt out of the survey were just as likely to be non-Jewish as they were to be Jewish, initial tests suggest otherwise. Schulman described in his review a test conducted by UJC that found that those who did not cooperate with the survey were more than twice as likely to have a traditionally Jewish last name as those who did cooperate. The UJC test found that 0.37% of those who refused to participate bore one of “31 distinctive Jewish names,” compared with 0.16% of those who agreed to cooperate with the survey.
The finding seems to suggest that variables such as the screener question in the survey may have driven off significant numbers of Jews — although, Schulman told the Forward, it could also be that Jews are generally far less likely to respond to surveys.
Schulman’s review called the loss of screening data a “major problem”: most of the information on respondents who did not complete their interviews is irretrievably lost. Schulman suggested that some researchers may wish to develop their Jewish population estimates based solely on the remaining data, which would produce a slightly higher percentage of Jewish households in the overall population.
In his report, Schulman quoted from an examination conducted by Len Saxe, a researcher at Brandeis and co-director of the North American Jewish Data Bank, who found that Jewish households that completed their interviews were much more likely to identify as Jewish by religion than Jewish households that did not. “Jewish ‘completes’ appear to be more strongly identified than are the Jewish ‘incompletes,’ skewing the sample toward more hard-core Jewish respondents,” Schulman wrote.
In an interview with the Forward, Saxe commended the Schulman review: “It identifies key methodological issues, and I support its recommendations that further research be done to figure out whether there was bias and whether the data can be reweighted.”
Researcher Gary Tobin, a longtime critic of UJC’s research projects, appeared emboldened by Schulman’s findings. “I think the report vindicates what many of the critics of NJPS have been arguing for a long time: that the sample is not a good one, that the population is undercounted, that the study is rife with methodological errors,” said Tobin, president of the Institute for Jewish and Community Research.
But, Tobin added, the various issues raised in Schulman’s review “will likely have little impact on the analyses of relationships between variables in the data set.”
“Analysis of these relationships,” Tobin said, “will provide valuable insights into the relationships between the varying backgrounds of Jews, their beliefs, religious practice and the role of religion in family life.”