Evidence, Anna. Where’s the Evidence?

Earlier this month, the C.D. Howe Institute published a polemical paper by University of Winnipeg mathematician Anna Stokke. The paper explores the politically charged question of What to Do about Canada’s Declining Math Scores. It’s one of those funny papers, written by an academic wandering outside her field (or to its fringes). Stokke has some insightful things to say, followed by some absolute howlers. Let us begin with the “problem”: on the mathematics components of the Trends in International Mathematics and Science Study (TIMSS) and Programme for International Student Assessment (PISA) tests, most Canadian provinces have shown a statistically significant decline over the past decade or so. The first two questions that come to mind should be

  1. Is it true?
  2. Does it matter?

Stokke states that #1 is true and provides some reasonable evidence to back this up. I agree with her basic position that Canadian scores have declined on these tests. I haven’t seen a compelling explanation for why this is so, but it is definitely borne out by the data. It is worth noting that Canada has declined from being near the top of the international list in 2003 to being near the top of the list in 2012. The decline is real, but it’s from height to height.

Stokke assumes the answer to #2. I won’t go into #2 deeply, but I will note that it is far from obvious that the tests measure the things we value in mathematics education. There are countless possibilities for what we could value in school (computation, reasoning, geometry, algebra, number, problem solving, etc.). No curriculum can do full justice to all of them, so choices have to be made. I have no quarrel with the material emphasized by PISA and TIMSS, but I believe that any political action based on the scores must take such niceties as these into account.

Let’s also note that K-12 education is a provincial matter in Canada: each province creates its own curriculum, assessment standards and professional standards for teachers. Nation-wide decline becomes a problematic issue to study because each province is teaching a different curriculum. There are, of course, similarities, because change doesn’t occur in a vacuum. Provincial ministries of education do talk together and have a pretty good idea of what each other are doing, and American publishers and academic trends tend to influence Anglophone curricula. I am uncertain of the influences on Francophone curricula.

So far, we have a few quibbles, but the argument is fine. International scores are declining. Next comes the bogey du jour: “Discovery-based Instruction,” whatever that is. The issue that Stokke pushes is the difference between structured “direct instruction”[1] and “discovery learning”[2]. If we accept Stokke’s crude dichotomy here, the empirical evidence is pretty strong. In terms of developing skills, teacher guidance is critical. This is why teachers are necessary. Well-structured instruction, with appropriate questioning, feedback and revision to assist student understanding, is crucial to student achievement. So if the goal is to teach students to perform long division, or to multiply polynomials, or to solve quadratic equations, worked examples under the direction of the teacher are clearly and decisively shown to be the best choice for instruction. Stokke makes reference to some of the empirical literature on page 4 of her report, and I have no quarrel with her summary or with her interpretation of the results. (On a side note, Stokke’s footnote 5 on that page is an absolute joke. The “well-informed journalist” is David Staples, who is well-intentioned but hilariously ill-informed on the issue. The claim that phrases are interchanged “to avoid criticism” is at best a wild guess, and at worst yellow journalism.)

So where’s the complaint? It is undeniably true that inquiry forms the basis of what is valued in K-12 education in Canada today. Science classes should and do have laboratory experiments. Social studies classes should and do include source-document analysis and open inquiry into relevant issues. How could one study literature without inquiry? The same goes for mathematics. Students need to acquire the basic tools of mathematics, but they also need to learn how to mobilize these skills to explore questions both practical and theoretical. If there were no room for inquiry in mathematics, Dr. Stokke would be out of a job. Which leads us to the million-dollar questions.

  1. Which skills must be mastered?
  2. Which skills should be developed to the point of acquaintance?
  3. How much inquiry is desirable at each stage of the student’s development?
  4. What do we ultimately hope that our students will gain from their mathematical education?

Notice that these questions are only slightly empirical. At their heart are values. What do we think is important, and why do we think it’s important? Stokke shows some sensitivity to this problem, but she effectively abandons her project and jumps into speculation and assertion. Starting on page 9, she begins spelling out what she thinks is important for each grade of the curriculum. Stokke provides no justification for this and makes no appeal to the research literature on child development; she somehow just knows what every child should learn, and when they should learn it. Frankly, this is just spitting in the wind. She’s one voice in a multitude. I agree with some of her suggestions, but not all. (But then, I only have 20 years of classroom experience and a doctorate in education; what do I know about the matter?) And then it starts to get weird. Stokke writes

One way to redress the balance between instructional techniques that are effective and those that are less so would be to follow an 80/20 rule whereby at least 80 percent of instructional time is devoted to direct instructional techniques and 20 percent of instructional time (at most) favours discovery-based techniques.

Where does this come from? For which students? At what grades? Where’s the evidence? Stokke is relying on her intuition to make an 80/20 prescription. Is she joking? On the positive side, Stokke recognizes that there’s more to the issue than the simple-minded “discovery vs direct” dichotomy suggests. But to make an ad hoc recommendation of this type is to be irresponsible. It gets worse. Stokke generalizes from an American study of first and third grade teacher knowledge, suggesting that Canadian teachers should all be given regular licensure examinations. Seriously? You’ll need some evidence that

  1. The American study applies to Canada.
  2. What’s true for first and third grades applies to all grades.
  3. Licensure exams will actually improve classroom practice.

Ultimately, Stokke’s report reeks of the well-intentioned dilettante. That Stokke loves mathematics and is a believer in quality mathematics education is evident. Unfortunately, it’s also evident that she jumped on a bandwagon, read a small amount of research in the area and picked up a contract from a “think tank” to join a public debate. As an insider, I’m not thrilled with the current state of mathematics education in my country. But I am also convinced that single-cause arguments that fail to acknowledge differences between students—not even considering that age matters—do nothing to clarify our issues, but serve only to support loud and ill-informed public shouting.

[1] Not to be confused with an American product that uses this name. “Direct instruction” should be read generically as any teaching in which the teacher strongly guides or directs learning throughout the process.

[2] Despite the claims of Stokke and a few other public protesters, you simply won’t find this term or anything like it in current curriculum documents in Canada. Individual teachers are undoubtedly using “discovery” methods to some degree or other, but there is nothing in the curriculum to mandate them.

13 thoughts on “Evidence, Anna. Where’s the Evidence?”

  1. This piece is a striking exercise in intellectual arrogance with a thin pretence of being a sober review of mathematician Dr. Anna Stokke’s recent report published by the C. D. Howe Institute. It is a patronizing dismissal-of-the-outsider-who-dares-to-publicly-express-dissent … lathered with a self-assured, sneering tone. I have grown accustomed to education officials similarly revealing a contempt for the general public — all outside their own circle of approval — but this piece rises far enough above the usual undertones to be noteworthy in this regard.

    So … this is the Program Coordinator of Research at Edmonton Public Schools? I wonder if spin-master and loose cannon were in that job description? Consider a few of the gorgeous slurs this tax-lubricated salary has underwritten here:

    “…one of those funny papers”
    “…wandering… to the fringes) of her field”
    “…absolute howlers”
    “…bogey du jour”
    “…Stokke’s crude dichotomy”
    “…an absolute joke”
    “…hilariously ill-informed”
    “…wild guess”
    “…yellow journalism”
    “…abandons her project and jumps into speculation and assertion”
    “…one voice in a multitude”
    “…spitting in the wind”
    “…then it [the C.D. Howe report] starts to get weird”
    “…irresponsible”
    “…It gets worse”
    “…report reeks”
    “…well-intentioned dilettante”
    “…jumped on a bandwagon”
    “…read a small amount of research in the area and picked up a contract from a “think tank” to join a public debate”

    In my list I have left out the numerous dismissive and sarcastic flourishes peppering this piece, which is itself devoid of any sourcing in the research literature (in contrast to the 36 sources cited by Stokke, most of them from the education literature and including such summative pieces as the 2008 NMAP report, which is based upon a qualitative analysis and synthesis of some 16,000 research publications and policy reports, public testimonies, written commentary from 160 organizations, and survey data from 743 active mathematics teachers).

    Perhaps Mr. Macnab is not to be faulted for this knee-jerk reaction. Like many in the Alberta educational constellation, he surely shares responsibility for the current state of affairs in mathematics education in his own province and in those provinces so strongly influenced by their lurch toward so-called constructivist teaching methodology (which Dr. Stokke collects under the more helpful rubric “discovery-based education” in the C.D. Howe report, because it is more meaningful in open public discourse). Like the others, he has apparently invested much time and personal capital in promoting these ideas and, if they fall, stands to lose much of that investment, not to mention, more tangibly, that a handsome salary rests upon his credibility — so clearly the outsider must be marginalized and put in her place.

    I do think it’s easy to understand the bias and personal animus evident here, considering that he positions himself as an “insider” in the provincial system that led the WNCP framework in mathematics, now in use in some form across 8 provinces — all of whose scores declined, and all but one significantly.

    … and … Alberta did fall. Boy, did it fall, under the watch of Mr. Macnab and those for whom he is running interference here.

    As PISA scores are an obscure abstraction to the general public, the following may put Alberta’s decline into perspective:

    From 2003 to 2012 Sweden’s raw PISA score dropped dramatically — particularly in mathematics. So dramatically, in fact, that the OECD took an unusual step: it wrote a 180-page report on the collapse of Sweden’s math education, with recommendations for steps to take to stem that tide.

    Sweden’s PISA math score dropped, in fact, further than any other participating country or economy (of which there were 65 in 2012). In particular Sweden dropped by 30 points! (In the OECD’s estimation, a difference of 41 points is approximately equal to one year of schooling).

    During the same period, Alberta’s score dropped by 32 points.
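
    To put those point drops on the OECD’s own yardstick: Sweden’s 30-point decline works out to roughly three-quarters of a year of schooling (30 ÷ 41 ≈ 0.73), and Alberta’s 32-point decline to slightly more than that (32 ÷ 41 ≈ 0.78).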

    Mr. Macnab’s spin: “Does it matter? [no, because] …it’s from height to height”.

    An interesting perspective, that Alberta dropping further than any country or economy in PISA can be ignored because the 2003 baseline was so high that Alberta is still a bit ahead of a mediocre pack? Out of 10 provinces, Alberta’s decline was second only to Manitoba’s 36-point collapse. Alberta’s high 2003 baseline is small consolation to Manitoba and other provinces hurt similarly by this curriculum, which did not start nearly as high and have now dropped below the world average.

    While Anna’s report distinguishes the different ways each province fares, Mr. M glosses over Alberta’s disastrous turn of affairs, speaking only of Canada’s fall. Canada’s score is rapidly approaching the international average — a truly abysmal bar. But … yes … we’re still above that mark. One must grant that this is better than … the alternative.

    For a “Program Coordinator for Research” Mr. Macnab cites very little of the stuff (as in zero) in this piece. Apparently having the word “research” in one’s title is supposed to suffice for an argument from (one’s own) authority about it.

    Speaking, as Mr Macnab has, about “howlers”, there are a number of noteworthy ones in his own blog post, which deserve separate treatment. Assuming I do not get blocked in the process I shall endeavour to address a few of these in this space in the coming days as I can find time.


    • Thanks for your reply, Robert. Not sure why you think I would block you.

      Please note that I do not dispute the decline in PISA; I call for some subtlety in understanding what the decline means. The main targets of my post are Dr. Stokke’s recommendations, notably the certification exams and the blanket 80/20 split, neither of which has a scrap of evidence provided to support them.

      In your case, you draw a straight line from curriculum change to declining PISA scores. It is far from clear that this could be the “smoking gun” as the curriculum changes were not synchronous and they were not identical. As I note in the blog, it is unlikely that a single cause can be identified.

      Finally, I didn’t cite sources because this is a blog, not an academic paper. To compare Dr. Stokke’s reference list for a published paper to mine in a blog is at best disingenuous.


  2. I did note that you don’t dispute the decline, and did not suggest otherwise.

    I wrote what I did to dispute one of your implied “points” (which you posed by way of a question):
    “2. Does it matter?”
    As you apparently didn’t get my gist I’ll reprise briefly:

    It “mattered” when Sweden fell by a *smaller* amount: enough for the OECD to write a 180-page report concerning its 30-point decline, the largest among all PISA participants. Now, perhaps you wish to imply that it does not “matter” that Alberta’s score fell by more than this, and it may be that the meaning of that for Alberta is qualitatively different than the meaning for Sweden. Perhaps you’d like to make a case for these things (or do you think that innuendo without committing to a position yourself suffices?).

    If you think a mere 32-point drop is unworthy of such attention, perhaps you ought to take up the case with the OECD, which differs on this judgement. But you’ll have to make a case for that — which is something you have not done thus far; you have merely asked a rhetorical question with the implication that you do not believe it to be so, apparently believing that (metaphorical) vocal inflection and raised eyebrows make an effective argument against an obvious conclusion. They do not.

    Why would I “think” you would block me? I have no experience with you, only with some (so far three) of the Alberta educationists who have, indeed, blocked me. I do not hold you responsible for their actions. But I am an intelligent being who learns from experience. I made an agnostic statement about this because my experience with your colleagues has led me to the conclusion that they do not welcome robust disagreement of the sort I bring on their blogs or twitter feeds. I am not complaining about this — it is a free world and that is their prerogative; the right to free expression does not imply the right to invade message space under the ownership of another with said expression. So if you are inclined to block me at any point, be my guest … er … decide that I’m not your guest anymore … or whatever.

    The C.D. Howe report is a policy paper and must conform to certain specifications. Anna’s referencing conformed to those parameters. Perhaps you don’t know, but they do not like too many references or academic jargon, as these reports are for digestion by the general public and policy makers. That is the nature of this kind of policy paper, and you’ll find the same characteristics in typical reports by this and other institutes for policy. It is the norm. However, although they are not academic papers they are subject to a more rigorous review process than I’ve ever seen in an academic journal over 25 years of publishing.

    In your blog piece you characterize Dr. Stokke as having
    “picked up a contract from a ‘think tank’ to join a public debate”
    This is incorrect on more than one level.

    First it suggests that Dr. Stokke shopped around for a venue in order to get coverage for her views. I can excuse you for not knowing this as I do, being close enough to her work to know such things, but she did not pursue C.D. Howe in any way. She was asked, with no prompting from her, to write a report on this subject and was quite surprised by that request. I stand to be corrected on this, but I believe she had never previously had any contact with CDH.

    Second your statement suggests that Dr. Stokke, not having a part in the debates about public education, wanted “a piece of the action” and used this institute to somehow launch her, a nobody, into a conversation of which she was not a part.

    If that’s what you think of her part in this debate then I wonder where on earth you’ve been for the last 4 years?

    Dr. Stokke has been critically involved in the discussion about mathematics education starting with the conference on the subject she single-handedly organized in 2011 at the University of Winnipeg http://mathstats.uwinnipeg.ca/mathedconference/ and a petition she organized about teacher preservice mathematics exposure, signed by over 300 academics: http://www.gopetition.com/petitions/raise-math-prerequisites-for-teachers-in-manitoba.html. She was a founder of WISE Math, wisemath.org, an initiative by professional mathematicians (and, more recently, other concerned citizens outside the mathematical community) across the country to address issues in this field. She has given more interviews than can easily be counted on local and national radio and TV, and has written op-eds and given interviews in newspapers across the country, including the national press. She has organized, run and, with her husband, created resources for a hugely successful low-cost community math education initiative helping hundreds of children in Winnipeg and providing hands-on training for future teachers, and she holds regular high-level meetings with education officials. She has had more-or-less direct influence on significant changes made to curricula in two different provinces, influenced provincial choices for recommended resources, co-organized and ran a session on mathematics education at the 2014 Canadian Mathematical Society Summer meeting and spoken across the country on this subject. She has designed and modified courses and programs servicing pre-service teachers. She has been called on by schools to deliver PD for them; in 2013 she worked with the Ministry of Education to deliver a large-scale training workshop for Manitoba teachers. She is the recipient of major community service awards here for her ongoing work and service to the community. And … she has never been paid for these things, even paying expenses out of her own pocket to keep them going.

    So, first, I think it is clear why C.D. Howe might have felt she was a good choice to write such a piece. And second, to characterize her in language that suggests writing this report was inserting herself vaingloriously into a discussion to which she was not already a party is simply ludicrous. I think this really merits an apology from you.

    Finally, a minor note: your scare quotes around C.D. Howe as a “think tank” come across as a sneer. Does this reflect your view of that institution? You do realise, I’m sure, that CDH is regarded by many as Canada’s premier institute of public policy, and while, like all such institutions, it surely has a perspective, it is completely unaligned in party politics and broadly understood to be nonpartisan in its positions. As the institute often characterizes itself,

    “The C.D. Howe Institute is an independent not-for-profit research institute whose mission is to raise living standards by fostering economically sound public policies. It is Canada’s trusted source of essential policy intelligence, distinguished by research that is nonpartisan, evidence-based and subject to definitive expert review…

    The C.D. Howe Institute’s reputation for independent, reasoned and relevant public policy research of the highest quality is its chief asset, and underpins the credibility and effectiveness of its work. Independence and nonpartisanship are core Institute values that inform its approach to research, guide the actions of its professional staff and limit the types of financial contributions that the Institute will accept.”

    Perhaps you know (or not) that CDH has played a pivotal role in defining the terms of NAFTA, of reform of Canada and Quebec pensions, the development of TFSAs, corporate tax reform and numerous other policies that shape the lives of Canadians today.

    If you really regard this institute as sneer-worthy perhaps you’d like to name a policy institute of any flavour in Canada that in your esteem is of greater influence or less beholden to partisan interests, and would be willing to share this information with us.

    As for references in a blog, I follow quite a number of blogs about education, and it is very common for writers to reference sources directly. For example, here is one such that I follow, and a piece I am reading right now:
    https://gregashman.wordpress.com/2015/06/13/teaching-for-understanding/
    Observe that in this blog post Ashman cites — and links to
    – the influential book “Understanding by Design” by the late Grant Wiggins (et al.)
    – philosopher Dan Dennett’s “Intuition Pumps and other Tools for Thinking”
    – Two blog-posted essays by well-known cognitive scientist Dan Willingham
    – an on-point direct quotation from Alfred North Whitehead
    – Kamii and Dominick’s infamous “study” (taken far more seriously by educationalists than it deserves!) about teaching with, versus without, direct exposure to standard algorithms
    – Another more recent study, less well known but coming to the opposite conclusion using different methodology, by Stephen Norton
    – A 500-page online book of scholarly construct by a consortium of Constructivist-teaching advocates

    And, for Greg Ashman, this is relatively thin referencing. He quite often cites the academic literature pretty densely.

    Certainly some blogs are an obscure person’s blathering off the top of their head into the cybersphere. Perhaps this is how you’d like your blog to be seen. If so this would explain why my presence here appears to have single-handedly doubled your readership (judging from the comment fields on recent articles).

    Are you, then, just sounding off? Or do you, as it seems to me, intend that your title as Program Coordinator of Research at a major school district lends your writing on this matter a weight of authority not borne out in the writing? Citing a few sources might suggest to the reader that you intend your assertions to be taken seriously on some level other than conferred authority based on your position within the system. It also might seem less hypocritical in an article which, from the title to the crumbs of faint praise, reeks of academic snobbery for what you regard as feeble supporting documentation … while countering with nothing of substance beyond rhetorical flourishes and contemptuous language.

    Anyway, this is not about you; I just couldn’t resist speaking my mind about that, the overriding essence of your post. My purpose is to address your assertions concerning Anna’s analysis and recommendations, and there are more groaners in your piece to deal with … I shall return, and we shall talk about the *nature* of what is tested in PISA and the *purpose* of a PISA assessment. Given your position I should think it is your job to know about such things, so I do hope for a thorough — and fact-filled — engagement on the matter. As I’m in transit and unable to be online except in spurts … perhaps you’d like to lead out with some specifics on the matter, which will help expedite our discussion on this next point.


  3. Glad you came back, Robert. I hope that you’ll poke around the blog a bit. You’ll likely discover that we agree far more than we disagree about mathematics education. In a much earlier post, I tried to fairly outline the debate about computation vs. understanding. I conclude with “So here we are, stuck between two ideological positions, with a distressing lack of evidence.” (I just noticed that I promised a quick return to the topic, but got distracted by the Charlie Hebdo massacre.)

    When I wrote “2. Does it matter?” I followed the question with a clarification.

    “It is far from obvious that the tests measure the things we value in mathematics education. There are countless possibilities for what we could value in school (computation, reasoning, geometry, algebra, number, problem solving, etc.). No curriculum can do full justice to all of them, so choices have to be made. I have no quarrel with the material emphasized by PISA and TIMSS, but I believe that any political action based on the scores must take such niceties as these into account.”
    Imagine that we chose to change the emphasis of junior high mathematics (pick a jurisdiction) to give students a thorough course in Euclid. This time has to come from somewhere, so the students wouldn’t be learning some of the things on these international tests. But (and I’m sure you’d agree) they would be getting something very valuable in its place. Students in this imaginary jurisdiction would do very well in the PISA “shape and space” questions but might not fare so well in the others. Now we have to decide if it matters. I’d be pretty happy in this scenario. Politically it would be dynamite, but as a mathematics educator, I would believe that the tradeoff was worthwhile. PISA is a pretty good test, but it should not define our values for us.

    For the record, I am concerned about Alberta’s drop in PISA mathematics, and I have worked hard with my colleagues to understand why this has happened. It simply cannot be solely because of curriculum change. We have PISA math scores from 2003, 2006, 2009 and 2012. Alberta’s scores have dropped in every assessment since 2003. But the students who wrote in 2006 and 2009 were studying precisely the same program of studies as those in 2003. If you check the Alberta Education website, you’ll see that the current program began its roll-out in 2008 for grades K, 1, 4 and 7, with succeeding grades phased in after that. As PISA is written by 15-year-olds (mostly grade 9 students), none of the writers of 2003, 2006 or 2009 would have been taught on the new program. The grade nine students who wrote in 2012 would have been on the old program until grade 6, and the new program for grades 7-9. Clearly the cause of the decline must be something other than the new program of studies. (This should not be read as an endorsement of either the old or the new programs. That’s a discussion for another time.)

    I’ll ignore your speculation about my intentions, except to say that you really missed the mark. Perhaps we can get to know one another and you’ll understand me better.
    You seem perplexed by my job title. In deference to my employer, I won’t get into it here. As I mentioned elsewhere, I am a mathematics teacher with 20 years in the classroom, and I have a PhD in education. It’s not as though I don’t have a horse in this race. As for the lack of references on my part, I didn’t question the research; there was no need. I did question the conclusions that were non sequiturs to the evidence presented. No need for references for that.

    If you care to reread my original post, you’ll see that I agree with the great bulk of Anna’s paper. I think she mischaracterizes a few bits of her argument. And I strongly oppose her two recommendations that are presented without evidence or compelling argument.

    As a final comment, I am impatient with the “discovery learning” bogey. In Alberta, teachers are bound to teach the approved Program of Studies. The program specifies what students are to learn (in a fairly broad way), and this is clarified with assessment exemplars in other documents. Nowhere are teachers told how to teach. The legal obligation is for students to perform and communicate understanding as indicated in the Program of Studies. Teachers are autonomous professionals who must make their own decisions about the best ways to teach. That some dubious techniques can spread is unquestionably true. But the critics who get the most media airtime are pointing to all the wrong places.


  4. Hi John, thx for the reply, mainly checking in to say don’t go away — I’m rather backlogged with urgent stuff upon returning from my trip and it’ll take a while to pick this up from where we left off. What I see in your reply left me a little disappointed, but knowing me I’ll ramble on a bit off the cuff anyway…

    I was hoping for a discussion that includes your own views as to what you consider important to assess, and/or what you hold important from an Alberta Ed perspective, and some specifics about what you believe was assessed in PISA and where you think they differ. Instead I see vagueness and dissembling. Perhaps you take exception to this; I’ll be happy to see you do otherwise. You seem to think this is a key issue, yet you don’t seem to have any particular position, or your position seems to be “there might not be a match, so let’s not worry too much because it’s on someone else’s shoulders to show that we should care.” Well … Alberta Ed has spent huge amounts of money and manpower for well over a decade to participate in PISA, whose sole purpose is to inform policy. You’d think someone … SOMEONE … (like, say, Program Coordinators for Research at major school districts) might have considered this question some time before now and have something less vague to say about it.

    You reiterate your obfuscatory paragraph about what “values” we might have or PISA might assess without offering any specifics. You add on some speculative hypotheses that seem to indicate that you think PISA is a black box and that we’re largely in the dark about what it’s assessing.

    I would have preferred to see some indication that you’re familiar with the philosophy behind the setting of PISA questions, the process by which those questions are selected and set. You do know, I presume, that there is an extensive literature about this — and that it is quite pertinent to the questions you have raised (and thus far treated as if they are deep mysteries or open questions). You say “PISA is a pretty good test, but it should not define our values for us.” But you appear to have nothing to say about what makes it a “pretty good test” and reveal no information about what “values” you think it might define for us. Really? Isn’t your job … ah … to be *on top* of research … which might include knowing what the major international assessment your schools take part in is actually about?

    To help this discussion move ahead while I’m still trying to find my feet back home, I’ll suggest you read up on the RME (Realistic Mathematics Education) philosophy behind PISA questions and specifically address where you feel it does not reflect appropriate “values” for assessment, as this is one place I plan to take the discussion. Thus far it does not appear that you are any better informed than Dave Martin, the Alberta teacher who asserted on a CBC National Radio program the other day that “PISA tests nothing but memorization and algorithms”. Sharon Friesen’s recent statements on CTV were a little more muted than Martin’s but appeared to reveal exactly the same false understanding of the assessment. Does *nobody* in the Alberta Education establishment make it their job to actually know this and to disseminate that information to others shaping policy?

    It might help to toss this into the mix: I’m not particularly fond of PISA, partly on the basis of what it’s assessing — I think that TIMSS is far superior for the purpose at hand. However, I believe data is data and this data should be taken seriously on the basis of what it evaluates. To a large degree, that is not a mystery, and it is either dishonest or obfuscatory to speak of it as if it were.

    I would remind you in the meantime that assessments like PISA do not assess the performance of *students*, but of *school systems*. It is not Dave & Sandra in a Grade 10 class at the school down the street who are being graded here — it’s you and your buddies in the educational establishment.

    I’m not completely convinced that we are, as you put it, “stuck between two ideological positions, with a distressing lack of evidence.” We do, as you indicate, seem to agree (at least verbally) on a number of points, but as to ideological positions it would be interesting to know what you think my ideology is.

    In terms of pedagogy I like to tell people I’m a “what-works-alist”: I don’t care for anyone’s personal theories or beliefs; I’m pretty clear about my disdain for some that are largely repudiated by the research, but have a wait-and-see approach to those that remain untested (except for those in this category being foisted upon millions of children — the equivalent in medicine would be considered criminal). I want to know what the data says … and I do not take seriously your claim that we suffer a “distressing lack of evidence” considering that the world’s largest comparative study on educational interventions (and the world’s most ironically named) was summarily dismissed by the educational establishment when it arrived at clear conclusions that didn’t happen to align with the apparently prevailing ideology within the system.

    On content matters (which I regard as being upstream of pedagogy), I am committed to grade-level benchmarks used by the strongest international systems, and a structure consistent with the subject matter itself (on both counts WNCP math, as used in Alberta, falls far short).

    I’m glad you brought up the timeline but you should know that Dave Martin and I have already been through this twice (last year on twitter following our radio debate, and this year after he reiterated his false statement about it on CBC). You are quite correct that the 2012 PISA cohort had been exposed to the new curriculum starting in Grade 7, as Dave & I agreed. And that there is a 9-year pattern of decline. But you are glossing over the fact that the data points of greatest interest are those in which mathematics is the major domain. That is why you always hear experts discussing the relationship between 2012 and 2003 and not (for example) 2000, 2006 and 2009. Nevertheless, the gradient is unsurprising given that the same people who did that revision have been (according to what teachers in Alberta are telling us) coordinating a “full-court press” of in-service training across the province to align methodology with the philosophy reflected in it. The effects go back at least as long as those people have been influential in your system. And the 1995 curriculum was merely a ‘lite’ version of the current one, and quite radically different from its predecessor — as we have shown by direct comparisons with earlier curricula here in Manitoba. What is more surprising to me is that Alberta performed as well as it did in 2003 under the earlier WNCP ‘lite’ mathematics curriculum.

    You say “clearly” there is some other cause than the curriculum, but all you offer is that there was a decline already in the works. You’ll be happy to know that we agree that the curriculum is not the sole cause of the decline. In fact, if you’d read Anna’s report more carefully you would have known that she identifies at least two other factors that we regard as important to consider.

    I note that you still do not concede that your tone and characterization of Dr. Stokke are inappropriate. You seem to think that offering agreement on a number of points, albeit in a sneering, belittling manner, absolves you from charges of incivility. I respectfully disagree.

    Two things have to be said about Anna’s use of the word “discovery” by way of reply.

    First, she does not use the phrase “discovery learning” — that seems to be an invention of yours here. She uses the phrase “discovery-based instruction” throughout and in a couple of places speaks about “discovery-based learning environments”. Certainly you understand the difference between “learning” and “instruction”, but it’s a little surprising to see you conflating the two here. At one point she speaks broadly of “discovery-based learning” in making a general point about the effects of discovery-based instruction on learning outcomes. On this point she cites Klahr and Nigam (2004) and Clark (1989), two articles more than you have cited thus far.

    Second, she does not leave the term undefined, but lays out what is meant by it in her report (page 4, second column), where it is clear that she is using it as a rubric for numerous troublesome methods and ideas in education which all lean in roughly the same direction but come under a confusing array of labels with minor distinctions that are of little import in the general discussion of policy and certainly of little interest in open public discourse, the target audience of this report (not the hair-splitting, semantic-wrangling realm of the educational academy). Further, the spectrum she lays out corresponds closely to the collection described as “minimal-guidance instruction” in the well-known and pivotal piece on this subject by psychologists Kirschner, Sweller and Clark.


  5. I’d be more than happy to continue the discussion with you, Robert. I do request that you maintain a polite and respectful tone and that you refrain from speculating about what I think. Just ask; I’m happy to tell you.

    There are many overlapping ideas in these issues. As I noted before, we probably agree more than we disagree. Where we disagree, it is unlikely to be because one of us is a fool.


  6. Your complaint about my tone is a bit rich considering my point of entry into this discussion was the numerous gratuitous swipes you had taken at Dr. Stokke. Until I see that backtracked I’m not inclined to take seriously a lecture from you about tone. You set the tone.

    As for speculating about what you think, the only point on which I have done so is where you appear to make a big fuss about “values” in math education as if that were an answer to PISA scores, but fail to take any position yourself regarding those values. The reader is left to speculate what specifically you had in mind, and I did so. If we’re to have a sensible discussion about this it must have a basis more solid than innuendo.

    On that subject, the values purported to be tested by PISA are clearly laid out in the literature on RME. While RME is a distinctively European philosophy (arising out of the Freudenthal Institute in the Netherlands), it broadly matches key issues found in North American “reform” education. Here, for example, is an excerpt from an explanation of RME maintained by the institute:

    “The present form of RME is mostly determined by Freudenthal’s (1977) view about mathematics. According to him, mathematics must be connected to reality, stay close to children and be relevant to society, in order to be of human value. Instead of seeing mathematics as subject matter that has to be transmitted, Freudenthal stressed the idea of mathematics as a human activity. Education should give students the “guided” opportunity to “re-invent” mathematics by doing it. This means that in mathematics education, the focal point should not be on mathematics as a closed system but on the activity, on the process of mathematization (Freudenthal, 1968).
    Later on, Treffers (1978, 1987) formulated the idea of two types of mathematization explicitly in an educational context and distinguished “horizontal” and “vertical” mathematization. In broad terms, these two types can be understood as follows.
    In horizontal mathematization, the students come up with mathematical tools which can help to organize and solve a problem located in a real-life situation.
    Vertical mathematization is the process of reorganization within the mathematical system itself, like, for instance, finding shortcuts and discovering connections between concepts and strategies and then applying these discoveries.”
    http://www.fi.uu.nl/en/rme/

    You insinuate in your post that something about what PISA evaluates does not match your, or perhaps our society’s, values about mathematics. I think the above is a pretty clear statement of value. Perhaps you could be clear about where you feel that description represents improper values for mathematics education.

    I have spent much of the last 10 years — since my appointment to the MB provincial mathematics curriculum steering committee — listening to people involved in the formulation or implementation of the current WNCP mathematics curriculum. As far as I can tell that statement is a pretty good match with the philosophy represented therein, with some minor differences of emphasis. Do you disagree?


  7. Okay, while waiting for a reply concerning how you feel RME matches the values of which you speak, I shall digress slightly to talk about the purpose of assessments like PISA. The value we attach to such assessments is contingent upon their purpose. They are not assessments of individual students’ progress — they are a way to monitor the health of an entire school system and, in particular, to judge the effect of policy changes both within each system (by observing longitudinal changes) and relative to other systems (by watching the larger patterns made by changing policies around the world).

    The thing being assessed in PISA, TIMSS, PCAP is not individual students — and not even individual teachers. It is the *system* itself, the larger policies within it, and effects of systemic variables such as SES, migration, social transformations in society — which is why, a year after each assessment report, a second “contextual report” is released, correlating performance variables against contextual variables such as these.

    It is telling when those within the system — indeed those whose own contributions are tied to these various factors, and who therefore themselves are in the spotlight of the assessment — seek to minimize or explain away the clear message of an assessment like this. Governments spend millions of dollars on these assessments because monitoring the health of our educational system is clearly of great importance and worth far more than the dollars spent.

    Alberta’s decline, like Manitoba’s, is greater than that of any single participating country or economy in the PISA assessment. To learn such things is precisely why those millions of dollars are being spent. Having spent that money and committed to that path, and seeing such a dramatic indicator, why would the system dismiss it? Should they not first strive to identify the causative factor (factor*s*, rather — it’s quite evident that no single factor is likely to be responsible for such a drop) and then introduce corrections where needed? Where is the effort to do so? Where is the analysis, the exact diagnosis? Why is it up to unpaid outsiders like Anna to sound the alarm, and to suggest correctives? In our view a few different factors have worked in concert; they are not unconnected, but it is more than a few changes in curriculum.

    Now, here’s why I dislike PISA for this purpose. It is *merely* a canary in the coal mine. To the extent that it measures performance of our educational system it provides only a temperature scale, blurring the useful detail within its diagnostic. That comes from the RME philosophy, which values “finished” skills for using mathematics in “real life”, looking at complex problem solving, multi-stage problems, “embedded” skills rather than those skills themselves.

    Now, you might say, “aren’t those the things we care about?” That may well be, but such a tool has little diagnostic value. It tells us something … but it does not give us direction for correctives. It is like your family doctor saying “how have you been feeling lately? Rate your health on a scale of 1 to 10”. If you say “3”, what does that tell him? Or, if he pulls out an “overall health thermometer” and determines, in a perfectly vague fashion, that you are “seriously ill” — where does that put you? Certainly further ahead than without knowing this, for you cannot fix what you do not know is broken. But what is needed in such a case is not a thermometer reading of your overall health — it is essential that the doctor knows exactly what is wrong.

    For, suppose the doctor guesses that you have cancer, whereas you actually have failing kidneys. Then he orders up chemotherapy — which places extreme stress on your internal organs leading to kidney failure, making things far worse than before.

    Those of us watching scores decline, and provincial ministries respond, across Canada, have seen an entirely analogous thing occur over and over: As political pressure mounts to address the problem the minister suddenly makes an announcement: New funding has been set aside to improve scores in math education. What is that funding for? In most cases it goes to “teacher training” — to help teachers better learn the “new ways” of teaching mathematics. But … what if … just stay with me here a moment … what if one of the causative factors in the decline has been precisely the pressure teachers are experiencing to cast aside conventional, proven approaches? The push to teach mathematics using more demanding and largely unproven methods that involve less direct teacher guidance but demand far deeper teacher domain knowledge, and lots of whiz-bang ideas like (as a top-name Canadian guru declared at a large teacher-training workshop this spring here in Manitoba) “Teachers should stop being so clear with students” (presumably so that, in struggling on their own to understand concepts and instructions, students will self-generate “deep” understanding).

    In that case, one might expect that the “cure” would not help fix the problem at all. Indeed, it is likely to make it worse. And, perhaps this pattern partly explains the inverse — yes, inverse — correlation between per-student spending across the provinces and their performance in mathematical assessments.

    What is needed is a diagnostic tool that identifies — with fine-grained analysis — precisely where problems lie. That will give us the greatest chance of introducing appropriate correctives. Three provinces — Quebec, Ontario and Alberta — participate in the TIMSS assessment, which is exactly such a tool. Questions in TIMSS isolate specific skills and learning outcomes and test each system’s performance on those specific things. This gives us an effective MRI of the internal mechanisms leading to the systemic declines measured (but not particularly well diagnosed) in assessments like PISA.

    Back in 2013 I pointed to a couple of specific questions from TIMSS 2011 that are very telling about the Alberta situation, and suggest some pretty clear, relatively straightforward, things that could be fixed. The relevant data is summarized in the graphic I produced for the Globe & Mail here
    http://www.theglobeandmail.com/incoming/article15763259.ece/BINARY/PDF%3A+How+Canadian+students+fared+on+two+problems%3F+Not+much+better+than+guessing
    (and also found in a slightly different form in Dr. Stokke’s C.D. Howe report).

    Comparing these TIMSS questions to PISA’s, you will immediately see the contrast. Observe how TIMSS zooms in on specifics. Now, there is no perfect isolation of single learning outcomes (just for example, students must read to understand these questions — how can we completely control for deficiencies on that side?). PISA may have a problem that embeds this skill into a complex exercise surrounded by cognitive clutter. If performance is alarmingly low in that noisy environment, what have we learned about the specific failure of an educational system? Where’s the wound? What medicine should be applied?

    In this case we see two problems concerning arithmetic with fractions:

    The first is purely a test of familiarity with mechanics: Either you know how to combine simple fractions in a sum or difference … or you don’t. Alberta students show that they don’t. Or (keeping in mind what I said about who is being evaluated here) more to the point, the Alberta education system has failed to confer this knowledge upon them — across the system they perform barely better than random guessing. This signals a level of dysfunction with fractional arithmetic for Grade 8 students that is truly alarming. Fraction arithmetic is not a minor detail in one’s education — it is a pivotal outcome for the transition from arithmetic to algebra. Worse, TIMSS data reveals that among Alberta students on this question the *most common* answer was option (a): 1/3 – 1/4 = (1-1)/(4-3). That’s about as bad as it gets and shows *both* lack of procedural fluency *and* lack of understanding.
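
    To spell out the arithmetic on that item: the correct computation is 1/3 - 1/4 = 4/12 - 3/12 = 1/12, whereas subtracting the numerators and denominators separately, as in option (a), gives (1 - 1)/(4 - 3) = 0/1 = 0, a wrong answer arrived at by a procedure that misunderstands what a fraction is.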

    The second is a purely cognitive outcome — there are no mechanics involved. It is about understanding the multiplication of non-integer real numbers, and requires only the coarsest of estimation skills in this circumstance. In this case the Alberta system’s performance was not distinguishable from random guessing.

    On fractions … that’s a complete washout.

    What is the diagnostic value therein? As it turns out there is a pretty obvious culprit: Turn to the current Alberta curriculum and consider where arithmetic of fractions begins.

    Grade 7

    Now, to be sure, students under that curriculum have been required to “demonstrate an understanding of fractions” in various ways since Grade 3. But not to do anything that amounts to arithmetic.

    Don’t look to the 1995 WNCP curriculum for better, because it was just as bad on this particular outcome. This dramatic change occurred 20 years ago, when fractions were moved from Grade 4 to the transition point between middle school and junior high school. The best-performing school systems still introduce arithmetic of fractions in Grade 3 or 4. And the level of damage caused by a bad curriculum will be correlated with the enforcement regime — here again I point to the “full-court-press” of in-service training gearing up for the 2008 revisions that began years earlier than that. To what degree were the 1995 curriculum and the embedded methodologies being used in Alberta classrooms? From my time on the Manitoba Steering Committee I can report that the ministry consultants here expressed frustration at lack of compliance with the first round of WNCP and determination that this would not happen again.

    It may be a surprise that a year after supposedly learning about fraction arithmetic, AB students are still rank novices in the subject. Perhaps this might point to the spiral design in the sequencing of outcomes in WNCP. There is no expectation of mastery within that year. Students are busy “demonstrating an understanding” and not showing in any concrete way that they are progressing in skill. But I would also point to the length of time it takes to develop mastery of complex skills and understanding. It is not something you have because of exposure. It comes through much repeated practice, retrieval and consolidation. A timeline of years is needed, not “just in time” learning, for such things. Understanding the pitfalls to mastery of fractions, it is no surprise that a year after exposure Alberta students still demonstrate little facility with them.

    While you’re going on about “what society values about mathematical education”, I’m interested in the *purpose* of the assessments and their diagnostic value. And while we have good data on the table, I’m interested in fixing what it shows to be broken.


  8. Robert, I appreciate your comments and response, but please remember that you are a guest at my blog, and it is not yours to dictate the agenda. I may or may not respond to your comments, as my time and interest dictate.

    FWIW I agree completely with the first two paragraphs of your most recent comment. Your speculations beginning in the third paragraph are not particularly compelling or interesting.

    I do, by the way, also highly value skill in working with fractions. I’ll see if I can find the time to look at the report containing the (concerning) poorly-answered question you noted.


  9. I’ll save you some time. Here is the TIMSS 2011 directory
    http://nces.ed.gov/timss/results11_math11.asp
    Here are the released questions
    http://nces.ed.gov/timss/educators.asp
    If you have trouble finding the response stats, etc. I can send them to you directly.

    It would be interesting to learn what Edmonton SD plans to do about the clear problems with the fractions sequence in AB WNCP, and how it has happened that even with people like you who so “highly value skill in working with fractions” working for them, this problem has gone on for so long…


  10. Got to admit I scratched my head at this last comment of yours, John (don’t worry, this comment is only commenting on what you wrote). The comment to which this is apparently a response did only two things:

    1. respond to your earlier statement
    “I’ll see if I can find the time to look at the report containing the (concerning) poorly-answered question”
    by providing a link to make it easier for you to look up.

    and

    2. respond to your earlier statement
    “I do, by the way, also highly value skill in working with fractions.”
    by indicating that I would prefer knowing what action is in the works on this matter in your district, versus your sentiments about it.

