How Much Does Your Kid Really Know About Israel?

How much do Jewish kids know about Israel? According to a new report, very little.

A team of researchers from Brandeis University created a multiple-choice test for measuring Israel literacy. Like so many multiple-choice tests of literacy conducted over the last century, this one found that kids don’t know enough. While most students knew that Benjamin Netanyahu is the prime minister of Israel, few identified Etgar Keret as an Israeli novelist or located the headquarters of the Palestinian Authority in Ramallah.

The authors provide few details about how they chose the questions to include. They convened a panel of experts and solicited suggestions for “domains and concepts panelists thought were crucial for student understanding.” They don’t say how these “experts” determined what knowledge is “crucial.”

How many adults know the location of the PA headquarters? Why is it “crucial” that a student know that the headline “Ariel Sharon Touches a Nerve and Jerusalem Explodes” refers to the beginning of the second intifada? That the aliyah from the former Soviet Union was larger than the aliyah from Morocco? Is a student who gets these questions wrong illiterate when it comes to Israel?

Tests of random factoids ripped from context perpetuate a view of knowledge as mere memorization and regurgitation, a view that contradicts everything we have learned about learning in the past quarter century.

The educational experts on the panel of advisers had to know this. The report’s writers even acknowledged their own doubts about whether literacy is the kind of thing that can be captured by a multiple-choice exam. The group proceeded despite those doubts.

Multiple-choice tests define student achievement as “knowing the most facts” rather than examining the depth and quality of those facts and how students use them to build arguments. Nonetheless, the authors conclude that isolated facts still play an essential role in supporting student thinking.

The authors’ own findings challenge this claim. They interviewed students about Israel and then gave them news articles to read. Some of the students with the least factual knowledge — the ones who couldn’t, for example, identify Netanyahu as the current prime minister of Israel — were able to develop and argue sophisticated positions based on context and information gleaned from the article. The report notes: “The interviews revealed more about the intellect and ability of students… than their existing knowledge of Israel.” In other words, when it came to discussing Israel, thinking, reading and speaking skills proved more important than the ability to blacken the correct bubble.

Why then a multiple-choice test? The decision suggests a lack of creativity in thinking about what it means to “know Israel.” But other assessment tools do exist.

My own research focuses on the stories students tell about Israel. Over the past few years, I have collected over 400 responses by Jewish high school students to the question, “Tell me the whole history of the state of Israel in as much or as little detail as you want.” Their responses shed light on which events in Israel’s history they think are most important (many start with the Exodus, for example), and include many of their thoughts on the most pressing political issues of the day.

Unlike a multiple-choice test, my students’ accounts show not only what they are misunderstanding but why. They reveal not just what students know, but how they use what they know to build a historical account that matters to them. These accounts tell us far more about what students know, think and feel about Israel than checking whether they have memorized that the PA is in Ramallah, not Nablus.

The bank of test questions isn’t better than nothing. It perpetuates a deeply flawed conception of what it means to know about something. At a time when public education is embracing the Common Core standards, which emphasize the connections between what we know and what we do with that knowledge, leading researchers in Jewish education are doubling down on a multiple-choice test of disconnected facts. Can’t we do better?

Jonah Hassenfeld is a PhD student in Education and Jewish Studies at Stanford University. He is a Jim Joseph fellow and Wexner fellow/Davidson scholar.

The views and opinions expressed in this article are the author’s own and do not necessarily reflect those of the Forward.

