It is day 5 of ICME14. The weather here in Oslo is incredible, with temperatures close to 30C in the afternoons, so it feels almost like being in Shanghai. On the other hand, sitting alone in my office at home has a different feel to it than attending a conference in person...
The day started off with invited lectures. This is perhaps the most difficult part of attending an ICME - outstanding scholars from around the world are giving talks simultaneously, and all you have to go by are the titles (and perhaps the abstracts) and your prior knowledge of their work. But even people who write wonderfully can at times give terrible presentations (I have experienced people who spend most of their talk telling the audience that they have prepared far too much and that there is so much of interest that they will have to skip. Good idea: focus instead on showing that what you are discussing - and have time for - is interesting...) So this year I went for an outstanding scholar whom I also know gives good presentations: Tinne Hoff Kjeldsen. Her title was "What Can History Do for the Teaching of Mathematical Modelling in Scientific Contexts: Why and How?"
Kjeldsen has previously worked on how history can be used to reveal metadiscursive rules and make them explicit objects of reflection, to provoke commognitive conflicts, and to provide a window into mathematics in the making. In this talk, she focused on how history can help in teaching modelling. Mathematical modelling has much to offer other disciplines; on the other hand, different disciplines have different ideas of what models should be. She gave three examples: John von Neumann's model in economics, Vito Volterra's predator-prey model and Nicolas Rashevsky's model of cell division. (I am not able to summarize what she said about these...) In the example of von Neumann, we see that the mathematician's purpose can be different from an economist's purpose, which may be to solve concrete problems in practice. In the example of Volterra, D'Ancona claimed that the model could lead to new insights even when the model could not be confirmed by data. In the example of Rashevsky, biologists questioned Rashevsky's assumptions: he investigated possible explanations, while the biologists wanted explanations based in empirical data. (The three cases are analysed further in Jessen & Kjeldsen, forthcoming.)
Kjeldsen referred to Axel Gelfert (2018) on explorative modelling - providing "potential explanations of general patterns". He mentions three functions of explorative models: serving as a starting point, providing proofs of principle, and offering potential explanations. Kjeldsen showed how these functions fit the examples she had given. She pointed out that there are many elements in modelling that are not explicitly captured in "modelling cycle" models of modelling (e.g. Blomhøj and Jensen, 2006), such as the purpose of the modelling, the function of the model and so on. She showed that the account of modelling in Bouman (2005) could be a good supplement, although there are still issues not explicitly captured.
At the end, she discussed an example of using the Rashevsky case with students, and how looking at the contemporary discussion of his model improved the students' own modelling competence. Moreover, history of mathematics can provide a window into mathematical modelling "in the making", and let students reflect on how scientists get ideas, which strategies they use, which choices they make, how they argue and learn, what counts as valid arguments, and what mathematical modelling can provide in scientific contexts. She also discussed the concept of "historical awareness" and how the Rashevsky project helped students develop this.
I really like Kjeldsen's approach of giving in-depth discussions of how history of mathematics can provide specific awareness and knowledge, which is a valuable contribution to the literature on history of mathematics in mathematics education.
(One thought after listening to Kjeldsen is that the teaching of modelling is necessarily limited by the teachers' knowledge of the subjects where mathematical modelling is used. This also reminds me of Trude Sundtjønn's work, where attempts to make mathematics relevant for students in vocational education met hurdles connected to the teachers' (lack of) knowledge of the vocations in question. Too often, examples of modelling in textbooks are simplistic, perhaps because authors try not to presuppose knowledge that teachers and students do not have. A good thing about history is that both the mathematics and the science involved may be simpler than in real-world examples from today.)
The next item on the agenda was a plenary lecture: Mercy Kazima: "Mathematical Work of Teaching in Multilingual Context". Kazima based the lecture on what we know about teaching and learning in a language different from the home language, and on the work on the "mathematical work of teaching" (Ball, Thames & Phelps, 2008). She pointed out that although Ball et al. frame their theory in a general way, it is important to investigate it in different contexts - for instance, it may not fully cover the issues in multilingual classrooms. (As a matter of fact, it is rather common that Western researchers formulate general theories based on local empirical data.) She also referred to Sorto et al. (2018), including knowledge of the obstacles encountered by ELLs, knowledge of the resources that ELLs draw upon in learning mathematics, and knowledge of instructional strategies that help ELLs in mathematics. (ELL means "English Language Learners", so the framework is based in a context where students have one home language while school is conducted in English. Hopefully, these are also relevant to learners who have no interest in learning English in their context...)
She gave some context: teaching in grades 1-4 is in Chichewa or other local languages (with textbooks in Chichewa), while in grades 5-7, teaching is in English. Teachers generally know at least the two languages Chichewa and English. This differentiates Malawi from countries in which teachers only know the language of instruction and not the students' home languages.
She gave some lessons from Malawian studies:
- She pointed out that students' meanings for mathematical terms are often different from the mathematical meanings, and that these are influenced by home languages.
- Code switching can be used effectively to make mathematics accessible to students.
- A bilingual approach, where use of the home language is planned and proactive, can be effective in making mathematics accessible to learners.
- Teacher education does not prepare teachers for teaching in multilingual contexts.
The first of these means that teachers should know the corresponding words in the home language, the different meanings of these words, and how the home language can be used to improve understanding of the mathematical terms. Based on similar discussions of the other three lessons, she discussed four types of mathematical work of teaching related to teaching in a multilingual context: 1) identifying resources in the home language; 2) identifying obstacles in the home language; 3) identifying obstacles in English; and 4) identifying strategies. She then gave examples from Malawi, which I will not try to summarize here. Just one example, though: the equal sign is translated as zikhala, which literally means "will become", which of course invites an unfortunate understanding of the equal sign. Teachers can choose to explain instead using the word chimodzimodzi, meaning "the same as". Such vocabulary work is part of the mathematical work of teaching.
After lunch, there was the second (of three) plenary panels: "Mathematics Education Reform Post 2020: Conversations towards Building Back Better". As it does not make sense to sit looking at a computer for days on end, I decided to skip this one.
The final item on the day's agenda was Topic Study Groups. I returned to TSG12 (on statistics):
Gail Burrill: "Margin of error: connecting chance to plausible". Burrill talked about ways of teaching margins of error and confidence intervals. (My notes below will probably mostly be useful for myself as a way to remember some of her points...) Both teachers and researchers have problems interpreting margins of error and confidence intervals. Seeing the mean as a balance point helps students look at deviations from the mean. Using simulations helps students discuss the probability of getting particular outcomes when drawing a sample of a certain size. They learn the difference between sample size and number of samples. But what happens when we are supposed to say something about the population using our sample?
Task: draw a sample of 30 M&Ms to estimate the true proportion of blue M&Ms in the bag. She hands out M&M bags with different, but known, proportions of blue M&Ms. After a while, students go on to simulating. Then we can ask them: who has a bag where 8 blue M&Ms would be plausible? A range of numbers is given, and this gives a starting point for what a margin of sampling error might be. Back to the M&Ms: all bags now have 40 % blue M&Ms, the students set a margin of error, and it turns out that (often) at least one of the groups does not have 40 % within their margin of error. So students learn that the margin of error does not give an absolute bound.
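(For my own later use, here is a minimal sketch in Python of the kind of simulation I understand Burrill to be describing. The sample size of 30 and the true proportion of 40 % are taken from her example; the number of simulated samples and all other details of the code are my own assumptions, not hers.)

```python
import numpy as np

rng = np.random.default_rng(seed=1)

TRUE_P = 0.40         # true proportion of blue M&Ms (the 40 % bags in her example)
SAMPLE_SIZE = 30      # each group draws 30 M&Ms
NUM_SAMPLES = 10_000  # number of simulated samples (my own choice)

# Simulate the number of blue M&Ms in each sample of 30.
blues = rng.binomial(SAMPLE_SIZE, TRUE_P, size=NUM_SAMPLES)
proportions = blues / SAMPLE_SIZE

# An informal "margin of sampling error": the interval covering the middle
# 95 % of the simulated sample proportions.
low, high = np.percentile(proportions, [2.5, 97.5])
print(f"Middle 95 % of sample proportions: {low:.2f} to {high:.2f}")

# Would 8 blue M&Ms (a proportion of 8/30, about 0.27) be plausible here?
print(f"Share of samples with 8 or fewer blue: {np.mean(blues <= 8):.3f}")
```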
I didn't manage to note all the rich ideas, but I noted the use of StatKey as useful software for sampling distributions. In addition she used TI-Nspire (from Texas Instruments) and applets from Building Concepts Statistics and Probability. (I did get a little lost in thought, as I tried to think about how I can use some of this in teaching quantitative methods to my 400-500 students this fall.)
Cindy Alejandra Martínez-Castro, Lucía Zapata-Cardona & Gloria Lynn Jones: "Critical citizenship in statistics teacher education". Zapata-Cardona discussed the concepts of "critical citizenship" and "statistical investigations". She argued that investigations of crises in society would contribute to critical citizenship. (It is interesting to see the difference between the first two presentations today - the first very detailed both on how the teaching was done and on the resulting mathematical understandings of the students, the second less detailed on the statistical content and more occupied with the general critical citizenship potentially promoted by such work.) In the discussion, a resource for working on critical citizenship was shared: http://iase-web.org/islp/pcs/.
Adam Molnar and Shiteng Yang: "Mathematics ability and other factors associated with success in introductory statistics". This talk was about a diagnostic test used to study factors associated with success in a university course. A methodological point of interest is that the students who answered the test tended to do better than those who did not (that is, people who don't like mathematics tend not to like doing a test either). The main finding was that "college GPA" (which seems to be the grade point average from college in the US) is highly correlated with the results in the introductory statistics course, and that adding the diagnostic test results didn't really improve the model by much.
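(To spell out for myself what "adding the diagnostic test results didn't really improve the model" can look like in practice, here is a toy comparison of two regressions in Python. The data are entirely synthetic and the numbers are made up; only the overall pattern - a diagnostic test that largely overlaps with GPA - is my reading of the talk.)

```python
import numpy as np
from sklearn.linear_model import LinearRegression

rng = np.random.default_rng(seed=2)
n = 300  # hypothetical number of students (synthetic data, not from Molnar & Yang)

# Synthetic data: course score driven mostly by GPA, with a diagnostic test
# that carries little information beyond what GPA already gives.
gpa = rng.normal(3.0, 0.5, size=n)
diagnostic = 0.8 * gpa + rng.normal(0, 0.5, size=n)
score = 20 * gpa + 1 * diagnostic + rng.normal(0, 8, size=n)

X_gpa = gpa.reshape(-1, 1)
X_both = np.column_stack([gpa, diagnostic])

model_gpa = LinearRegression().fit(X_gpa, score)
model_both = LinearRegression().fit(X_both, score)

print("R^2 with GPA only:        ", round(model_gpa.score(X_gpa, score), 3))
print("R^2 with GPA + diagnostic:", round(model_both.score(X_both, score), 3))
```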
(By the way, a feature of online conferences is that the discussion about one talk continues in the chat after the time for live discussion is up, and during the following talks. I'm not sure if this should really be seen as a feature or as a bug - it does of course tend to take attention away from the following paper, while adding flexibility.)
Karoline Smucker and Azita Manouchehri: "Elementary students' responses to quantitative data". The research was on five third grade students (8-9 years of age), and the aim was to look at students prior to explicit instruction. The activity was collecting "wingspans" from fellow students. During graph creation, the students were focused on creating the "right" graph and on following rules. They had trouble graphing the quantitative data, but eventually they decided to create groups, which meant that they made something close to histograms. There were several interesting findings connected to these "histograms", one of which was that some students were careful to include all the original data in the diagram. Their "analysis" included the shapes of the diagrams, the center and the variability (what other third grade classes would look like). Thus, we see that third grade students can get quite far in working with quantitative data without explicit instruction.
In the discussion it was pointed out that these third grade students did better than student teachers do in research: http://dx.doi.org/10.1080/00207390902759584. It is highly interesting to ask why this is. Maybe, as was pointed out, there was one student who had a good idea that the class ran with. On the other hand, as Dani Ben-Zvi argued, student teachers have gone through years of schooling where they learn that bar charts are the way to present data. It was also discussed what would have happened if the students had had access to TinkerPlots instead of paper and pencil.