This year I have made a deliberate effort to encourage students to pose more questions, believing that this gives me a better insight into students’ thinking. On Friday, during a Geography lesson, I saw the benefits of this. In planning the lesson I had decided to model reading the climate statistics of Adelaide, so that students could then explore the climate statistics of their chosen country.
I presented the following table and graph:
I had planned to pose questions like:
- What is the highest average maximum temperature? When does this occur?
- What is the lowest average maximum temperature? When does this occur?
- What is the average rainfall for June?
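As a side note, the three planned questions correspond to simple look-ups over the monthly figures. A minimal sketch in Python, using hypothetical placeholder values rather than the actual Adelaide statistics shown in the lesson:

```python
# Hypothetical monthly averages -- placeholder values, NOT the real
# Adelaide climate statistics from the lesson.
avg_max_temp = {"Jan": 29, "Feb": 28, "Mar": 26, "Apr": 22, "May": 19,
                "Jun": 16, "Jul": 15, "Aug": 16, "Sep": 19, "Oct": 22,
                "Nov": 25, "Dec": 27}          # degrees Celsius
avg_rainfall = {"Jan": 20, "Feb": 21, "Mar": 24, "Apr": 40, "May": 60,
                "Jun": 80, "Jul": 76, "Aug": 68, "Sep": 58, "Oct": 44,
                "Nov": 31, "Dec": 29}          # millimetres

# Highest/lowest average maximum temperature, and when each occurs
hottest = max(avg_max_temp, key=avg_max_temp.get)
coolest = min(avg_max_temp, key=avg_max_temp.get)
print(hottest, avg_max_temp[hottest])   # Jan 29
print(coolest, avg_max_temp[coolest])   # Jul 15
print("June rainfall:", avg_rainfall["Jun"], "mm")
```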
Instead I referred to the data and got students to pose the questions. They asked much higher order questions, such as:
- Who collects the data? – Do they record accurately or can they manipulate the data if they are climate sceptics? (This wasn’t worded in this way, but it was what they were getting at.)
- How accurate is the data?
- What is the area related to the rainfall? How does this affect the data collected?
- Have there been major differences between the climate each year?
- After looking at the average temperatures, and knowing that the temperature can be much higher than these in Adelaide, one student thought that the statistics may be different if the last few years’ data was used, rather than the previous 30 years.
- Is the data reliable?
- When was it recorded?
- Who recorded it?
- Why is January the hottest month, and why does it have more rainfall than February? Isn't February usually hotter?
- Why does the minimum temperature follow the maximum? (Recognition of pattern)
- Why does June have the highest rainfall?
I will continue to encourage students to pose and answer their own and others’ questions.
What sort of questions are your students posing?
What does this show you about their thinking?
Working in a system of standards-referenced assessment: traversing the intersections – Lenore Adie
- Unit planning – intended learning clearly articulated
- SOLO – standards clear: A, B, C, etc.
- Success criteria clear to students – they can then use these to self-assess
What did you learn?
How do you know you learned it?
What got in the way of your learning?
What helped your learning?
How do you feel?
What can I do to help you?
- You need to include more descriptive adjectives.
- You need to work with your writing buddy to edit this piece of work, looking for places to include more descriptive adjectives.
- You already know the key features of the opening of an argument. Check to see whether you have incorporated them in your first paragraph.
Bringing it all together
- Triple Science Guide – Assessment for Learning
- The Association for Achievement and Improvement through Assessment
- Dylan Wiliam's Website
- Education Endowment Foundation
- Learning Sciences – Dylan Wiliam Centre
- National STEM Learning Centre
Further Reading/Resources
Assessment For Learning
Differentiation can occur in regard to:
- intervention – role of adults and students
- journey – how
- process – ways of access
Frameworks – useful reference points
- Bloom's Taxonomy
- SOLO Taxonomy – Structure of Observed Learning Outcomes
These structures/ quadrants can be used to allow students to choose a suitable pathway forward, depending on their current level of understanding.
For example, a quadrant of doubling tasks:
1. Double my number using cubes.
2. Solve the word problem on your table: "Gold coins are doubling in the pirate chest." Do it using a different method.
3. Double muddle! Correct my mistakes.
4. Draw your own…
Using scaffolds (providing floors, not ceilings!) to structure thinking and make meaning:
- True/False cards
- Card sorts
- Venn diagrams
- Double Bubble
- SOLO Maps
- Hexagons – connect concepts through the use of subject specific vocabulary
| Bloom's Taxonomy Cognitive Level | Type of Activity |
| --- | --- |
| Evaluate and Create | Cause and Effect |
Through listening to the discussion generated by students working in pairs/groups on these scaffolds, teachers can make judgements about misconceptions and the further challenges needed. Questioning students to build on their ideas, then allowing time and directing them to resources and peers who can help, without simply explaining answers, can empower students more. Expecting students to respond orally in fully developed sentences, using appropriate vocabulary, also provides practice for more developed written responses.
A space students can go to access further resources/support structures:
- key word lists
- technological support and devices – iPads, tablets, computers
- text books/ revision guides
- graphic organisers
- sentence stems
These can be utilised individually or in pairs.
The aim is to enable all learners, including ourselves, to improve.
Could students do the proposed assessment(s) well but not really have mastered or understood the content in question?
Could students do poorly on the specific assessment(s) but really have mastery of the content in question?
Could students do all the designer-proposed activities in Stage 3 but not really be ready to explain/ justify/ infer meaning or transfer their learning as demanded by assessments in Stage 2?
Could students struggle with the proposed activities in Stage 3 but still be ready to handle tasks in Stage 2 that require higher-order inference and other kinds of meaning-making?
Video: Calculating an effect size (Cognition Education, on Vimeo).
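For readers who want the arithmetic behind the video: an effect size in this context is typically the difference between two mean scores divided by a pooled standard deviation. A minimal sketch (the scores below are hypothetical, and the pooled-SD formula shown is one common variant):

```python
from statistics import mean, stdev

def effect_size(pre, post):
    """(mean(post) - mean(pre)) / pooled standard deviation."""
    pooled_sd = ((stdev(pre) ** 2 + stdev(post) ** 2) / 2) ** 0.5
    return (mean(post) - mean(pre)) / pooled_sd

pre_scores = [12, 15, 11, 14, 13, 16]    # hypothetical pre-test marks
post_scores = [15, 18, 14, 16, 15, 19]   # hypothetical post-test marks
print(round(effect_size(pre_scores, post_scores), 2))  # 1.4
```

Hattie's widely cited "hinge point" for a worthwhile intervention is an effect size of around 0.40.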
Who is correct?
Is the educational community really not checking researchers?
Would the educational community really put someone in charge of the Australian Institute of Teaching and School Leadership who has made strong recommendations that now seem to be questioned?
We are being told that we need to use research-based practices. Which research can we trust?
Professional Learning Team (PLT) meeting and cycle:
1. What is the student ready to learn, and what is the evidence for this in terms of what the student can do, say, make or write?
2. What are the possible evidence-based interventions, and the associated scaffolding processes for each?
3. What is the preferred intervention, and how will it be resourced and implemented?
4. What is the expected impact on learning, and how will this be evaluated?
5. What was the outcome, and how can this be interpreted?