Measuring the effectiveness of your summer reading initiative provides evidence of its impact. You can then use this information to review, prioritise, and inform future programmes.
Measuring the impact of your summer reading initiative
Using an inquiry approach
Three dimensions of evidence-based practice
Spin-off benefits for schools
Opportunities for gathering evidence of practice
Tools and methodologies for gathering data
Analysis and reporting
How do you find out what difference your summer reading programme has made, and how your students have benefited?
Your school can gather rich information by using an evidence-based practice approach to measure your programme's impact. By adopting this approach, you'll be able to:
- provide a rationale and motivation for summer reading initiatives
- come up with creative ideas for an engaging summer reading programme
- set up goals and targets, and work out your priorities
- acknowledge the efforts and results of everyone involved
- provide information for reporting about outcomes
- use the evidence as you reflect on and refine your summer reading programme for future delivery.
Read more about planning a summer reading initiative in your school.
If your school decides to use an inquiry model to plan an in-school initiative to support summer reading, you'll progress through the following steps:
- Set an objective to maintain student reading levels in the school over summer
- Draft the initial plan for your programme
- Trial this with a group of students
- Modify your draft programme in line with the feedback and results of the trial, to focus on improved student achievement
Dr Ross Todd, Director of Rutgers University’s Center for International Scholarship in School Libraries (CISSL), describes three dimensions of evidence-based practice:
- Evidence FOR practice – national and international research which informs and inspires changes of practice in schools.
- Evidence IN practice – locally generated evidence as a result of changes in practice.
- Evidence OF practice – outcomes, results, impact of initiatives.
Evidence FOR practice
National and international research provides important information about the summer reading “slide”. It also adds the motivation and impetus to address this issue locally, in schools and in homes.
Evidence IN practice
This is about integrating research findings with professional expertise and local evidence and taking a strategic approach to creating initiatives to address summer reading loss.
School data and teacher observations provide a local context around summer reading loss, the implications for teacher practice, and the impact on particular students.
- Is your school aware of the “summer slide” and its impact on student reading progress? Are parents in your school community aware of this issue? How can you get the discussion going about why it matters and what to do about it?
- Is there any existing data from reading-level testing at the end and beginning of the school year (for example, STAR, BURT, or PROBE) that gives you quantitative data about student reading loss over the summer?
- What is happening to address summer reading loss in your school community? Is there a school-wide approach?
Next steps in creating the basis for gathering evidence in practice, as you plan your summer reading initiative:
- Identify your school’s current summer holiday reading practices through a professional development focus
- Discuss possible strategies to meet the needs of your school community, and come up with approaches to try
- Identify the students your school would like to target
- Come up with ways to gather evidence of the impact of your proposed initiatives
- Develop a plan with a multi-pronged approach involving classroom teaching, libraries, and families
Download the Summer reading reflection questionnaire below.
Evidence OF practice
Your school can gather evidence of the impact of your summer reading initiatives in various ways – qualitative and quantitative, formal and informal. It is important to identify the approach you're taking, along with your timeframes for gathering evidence, at the planning stage.
Quantitative data: Information that can be reported in numerical form, in graphs or tables.
Qualitative data: Information about how people feel or think about things, for example, the results of questionnaires or surveys, “voices”, quotes, stories or reports.
The focus is on student learning outcome measures, such as reading attitudes, levels, confidence, or behaviours, rather than on output measures such as the number of books issued or the number of students attending summer reading programmes. This shifts the emphasis from what schools and libraries do to what students achieve, for example:
- Students are reading more, and voluntary reading is increasing, becoming a personal habit.
- Reading test scores show maintenance or improvement in reading levels achieved over summer.
- The number of students who say they enjoy reading is increasing.
- Children are having library books read to them at home.
- Parents are able to talk about their child’s reading with teachers, library staff, and others.
- Positive attitudes develop towards reading and library use.
- Non-library users have started using the library voluntarily, for reading materials.
Your school might find it useful to document other spin-off benefits arising out of your summer reading programme, such as:
- strengthened home / school partnerships
- your students, especially those targeted by your summer reading programme, using and valuing the school library more than before
- stronger relationships developing with the public library
- the development of home literacy practices, with parents more informed and confident about helping their children.
There are 4 main sources of qualitative and quantitative data to draw on as you plan to record the impact of your summer reading programme: your school, your school library, the students who take part in the programme, and your public library. As you plan your approach to data gathering, here are some suggestions under each source.
Your school
- Progress made towards achieving your school’s literacy goals and targets in the Annual Plan
- Any changes in reading levels and attitudes of targeted students
- Information from existing testing programmes (eg Running records, STAR, BURT, e-asTTLe, PM benchmarks, Probe) before and after summer holidays
- Anecdotal reports, conversations and observations from teachers, eg through staff / syndicate meeting discussion
- Strategies used by different teachers / classes and correlation with results
- Liaison with the public library – student participation in public library summer reading programmes
- Feedback from families, results of questionnaires or conversations
- Promotion of holiday reading throughout the year, and uptake by students and parents of reading for pleasure each holiday break
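If your school holds before-and-after test results, even a very simple calculation can turn them into evidence. The short Python sketch below (using made-up student names and scores, not real data) shows how end-of-year and start-of-year scores might be compared to see who held their ground over the holidays:

```python
# Hypothetical reading test scores for a small group of targeted
# students: one set from the end of the school year, one from the
# start of the next year (eg from STAR or PROBE testing).
pre_scores = {"Student A": 42, "Student B": 37, "Student C": 45}
post_scores = {"Student A": 44, "Student B": 33, "Student C": 45}

# Change over the summer for each student (negative = slipped back).
changes = {name: post_scores[name] - pre_scores[name] for name in pre_scores}

# Students whose scores dropped, and the average change for the group.
slipped = [name for name, change in changes.items() if change < 0]
average_change = sum(changes.values()) / len(changes)

print(f"Average change over summer: {average_change:+.1f}")
print(f"Students whose scores dropped: {', '.join(slipped) or 'none'}")
```

A spreadsheet would do the same job; the point is simply to record both measurements at planning time so the comparison is possible later.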
Your school library
- Evidence of change from library management system data, eg borrowing statistics
- Formal and informal surveys of students and their families who have used the library to get books for the holidays
- Liaison with teachers about targeted students’ reading mileage
- Responses from visitors if the library is open during the holidays
- Observation of students and their reading mileage, enjoyment, connections
- Student confidence levels in choosing books independently for pleasure reading
- Photos of student readers
- Uptake by students of any summer reading “challenges”
- Anecdotal reports, conversations and observations
- Displays of favourite books read over the summer with brief “reviews”
The students
- Simple pre/post summer surveys or questionnaires to students and / or their families
- Anecdotal feedback from families
- Feedback from students after the holidays about favourite reads
- Reading logs
- Uptake of competitions, personal challenges / goals
Your public library
- Feedback from the public library about uptake by students of their summer reading programmes
- Participation in challenges or competitions by students
As you plan the processes around your data gathering, you'll discuss and make decisions on:
- liaison and coordination between library and teaching staff around surveying students
- who will gather data, when, and how – timing, frequency, level, sample size, etc
- design of surveys for useful results, with a variety of feedback options – pen and paper, online (eg SurveyMonkey), and face-to-face.
Gathering evidence is only part of the story. The information needs to be analysed and shared, and the results applied to practice. Evidence is all the more convincing if it is triangulated from more than one source – for example, gathering the perspectives of students, teachers, and families gives a more complete and accurate picture.
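Triangulation can be as simple as laying the same question from each group side by side. This minimal sketch (with invented response counts and an invented survey question) tallies the proportion of positive answers from each perspective:

```python
# Hypothetical yes/no answers to the same question – "Did you read for
# enjoyment most days of the holidays?" – gathered from three sources.
responses = {
    "students": {"yes": 18, "no": 7},
    "families": {"yes": 15, "no": 10},
    "teachers": {"yes": 16, "no": 9},
}

# Proportion answering "yes" from each perspective, so the three views
# can be compared side by side when reporting.
positive_rates = {
    source: counts["yes"] / (counts["yes"] + counts["no"])
    for source, counts in responses.items()
}

for source, rate in positive_rates.items():
    print(f"{source}: {rate:.0%} positive")
```

Where the three figures agree, the finding is much harder to dismiss; where they diverge, that gap is itself worth discussing at a staff meeting.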
Sharing the results of any programmes will encourage successful practices to be embedded and developed further, or lead to refinements and improvements in the future.
Student success stories are often a particularly potent way of communicating the impact of summer reading initiatives to colleagues, school managers, and family/whānau.
Read how Clayton Park School used evidence to measure its efforts to tackle the Summer Slide.
Download the reporting template below.
Image: Austin Kleon on Flickr