
The iceberg of outcomes

This is the time of the year when both good practice and legislative requirements have schools focusing on their achievement and other student-outcome data. We are looking to see where we have made the biggest impacts so we can celebrate them. We are also looking to see whether there are groups of students, areas of our programmes, or parts of our school that are not making the progress we had hoped for. School leadership teams often spend a lot of time crunching numbers, making the huge mass of collated information meaningful, and enabling colleagues, staff, teachers, Boards, and sometimes even students to make sense of it. The question is: are we focusing on the right things?

[Image: iceberg showing its underside]

WHEN WE ARE LOOKING AT DATA WE NEED TO CONSIDER SOME IMPORTANT FACTORS

Student Management Systems (SMSs) can make the analysis process simpler if they are used effectively and if the information being stored and collated is numerical. Too often, though, SMSs are simply repositories for demographic information and a place to keep test scores, NCEA data, and Overall Teacher Judgments (OTJs). A basic understanding of spreadsheets certainly helps make sense of data at scale, and means different theories can be explored and different views of the data used to work out what it really means.
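For anyone comfortable going a step beyond the spreadsheet, the same kind of exploration can be scripted. Below is a minimal sketch using Python's pandas library; the file name and column names ("year_level", "gender", "reading_score") are hypothetical placeholders for whatever your own SMS actually exports.

```python
# A minimal sketch of exploring an SMS export with pandas.
# The file and column names below are hypothetical placeholders.
import pandas as pd

scores = pd.read_csv("sms_export.csv")

# Summary statistics for the whole school
print(scores["reading_score"].describe())

# The same measure broken down by groupings, so different theories
# about the data can be explored quickly
print(scores.groupby("year_level")["reading_score"].agg(["count", "mean", "median"]))
print(scores.groupby("gender")["reading_score"].agg(["count", "mean", "median"]))
```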

Being “data literate” also means being able to choose appropriate presentation formats for the kinds of data being shared. A table may show everything, but it can be confusing, and significant patterns can be lost. A graph is good for showing differences, but attention should be paid, for instance, to using an appropriate scale.
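To see why scale matters, here is a small illustration (with made-up pass rates) of how the same two numbers can tell quite different visual stories depending on the axis chosen:

```python
# Hypothetical data: the same 4-point gap plotted two ways.
import matplotlib.pyplot as plt

groups = ["2023 cohort", "2024 cohort"]
pass_rates = [78, 82]  # made-up percentages

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(8, 3))

ax1.bar(groups, pass_rates)
ax1.set_ylim(75, 85)  # truncated axis makes the gap look dramatic
ax1.set_title("Truncated scale")

ax2.bar(groups, pass_rates)
ax2.set_ylim(0, 100)  # zero-based axis keeps the gap in proportion
ax2.set_title("Full scale")

for ax in (ax1, ax2):
    ax.set_ylabel("Pass rate (%)")

plt.tight_layout()
plt.show()
```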

Lack of experience with data analysis can also lead to incorrect assumptions, for example, the assumption that a numerical difference is a significant one. By this I mean being sure that the difference in the numbers could not simply be attributed to chance. When numbers are small, the so-called margin of error can be quite large. Think about political polls, which often quote a margin of error of plus or minus something like 3.5%. This means the actual result could be expected to be up to 3.5 percentage points bigger or smaller than the reported figure, and that is with surveys of over 1000 people. In the school context, we may well be talking about samples less than 10% of that size. With a sample of 50 or fewer, variation of 20-30 percentage points in scores could, in fact, be expected simply by chance. This is particularly true if the confidence we have in the accuracy of the measure being used is not that high.
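To make that concrete, here is a back-of-envelope sketch using the standard formula for the margin of error of a proportion at roughly 95% confidence (and assuming, simplistically, a simple random sample):

```python
# Approximate 95% margin of error for a proportion: z * sqrt(p(1-p)/n).
# Using p = 0.5 gives the worst (largest) case.
from math import sqrt

def margin_of_error(n, p=0.5, z=1.96):
    """Margin of error in percentage points for a sample of size n."""
    return z * sqrt(p * (1 - p) / n) * 100

print(f"Poll of 1000: +/- {margin_of_error(1000):.1f} points")  # about 3.1
print(f"Group of 50:  +/- {margin_of_error(50):.1f} points")    # about 13.9
print(f"Group of 25:  +/- {margin_of_error(25):.1f} points")    # about 19.6
```

Since each group of 50 carries a margin of roughly 14 points on its own, a gap approaching 20 points between two such groups can plausibly arise from chance alone.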

In the school context, we also know that comparing one year group with another is not comparing like with like. Different year groups can have completely different compositions and the students can vary wildly in their engagement, confidence, and ability in different components of the curriculum we may be assessing and tracking.

POSITIONING ASSESSMENT DATA IN THE DECISION-MAKING PROCESS

‘Data-driven practice’ and ‘data-informed decision-making’ have become real buzzwords in recent times. Both require consideration of the factors outlined above. They also require that we position assessment data so that it is not the sole determining factor in what we do.

In the same way that good assessment practice means a single test score is not the only input to an OTJ, analysis of OTJ data is not the only indicator of a school achieving successful outcomes for its students, or, indeed, of teachers being successful in their settings. Any assessment should be a point-in-time litmus test of the outcomes being aimed for, not the only criterion. Effective schools and individual educators know far more about their students, collectively and individually, than can ever be captured in a single number or set of numbers. Student outcomes over time are not always well represented on a graph.

I like to think of the things we can put a number on, and therefore ultimately turn into some sort of graph or table, as the proverbial tip of the iceberg. There are so many other things that make up student achievement, outcomes, and success that sit ‘below the surface’ but are nonetheless hugely significant:

[Graphic: the iceberg of outcomes]

These factors below the waterline are things that the ‘tip of the iceberg’ factors can point to, but the link is often not a strong one. They may also be things that the whānau or cultures your students (or groups of students) come from value more highly than those above the surface.

As a parent, I am way more proud of my own kids being good people than I am of any of their academic outcomes. I would think many families take a similar perspective.

So, I guess my challenge in this blog post is to consider several different things as we bring our focused attention onto the year’s data and information and begin making decisions about where to focus next year’s programmes and improvement efforts:

  • Are we examining data in an appropriate way?
  • Are we reading too much into the numbers?
  • Do the numbers show what we are claiming they do?
  • Are the important things captured in the numbers, or, are there other key things that cannot be shown by numbers alone?
  • Are we using the best data and information that we can in our decision-making processes?
  • Are the conclusions we are drawing true for all students and groups of students?

If you would like support thinking about these things more deeply, and/or planning your PLD response to what you have found, do contact us at CORE Education.

 

Image sources

Iceberg photo (top and featured on home page): by AWeith (own work), CC BY-SA 4.0, via Wikimedia Commons
Iceberg of outcomes graphic: by the author, CC BY-ND-SA


Greg Carroll

Greg Carroll is a Learning with Digital Technologies Facilitator/Project Leader. Greg has been teaching since 1989 and was a primary school principal from 1994-2011. He has taught at all levels of primary school, from years 4-8, usually in multi-level classes. Greg has participated in the ICT PD programme as the principal of a cluster school, as Project Director (of a different cluster), and as a National Facilitator, and has been part of ICT PD in some capacity for the past 10 years. Since 2012, Greg has been working as a consultant, which has included working with classes and groups of children in schools, as well as with teachers and other staff. He has also run workshops for other education professionals and worked as a researcher at Natural History New Zealand alongside their iOS development team.

