Thinking, talking, doing science
- Format: Lessons
- Language/s: English
- Target Audience: Schools, Researchers
- EBM Stage: 0 - Why EBM?
- Duration: >15 mins
- Difficulty: Introductory
This resource has been rigorously evaluated.
Key Concepts addressed
The ‘Bright Ideas Time’ is a dedicated slot for talking about science. It need not take more than 10 minutes per session, so it is easy to fit into a lesson.
The Bright Ideas Time is often used as a whole class starter activity and linked to the topic to be addressed in the main lesson. It can equally well be used as a plenary or at a transition point in the lesson.
The classroom ethos is important so that all ideas are valued and it is acceptable for pupils to take risks in their thinking and sometimes to be wrong.
The Executive Summary is reproduced below.
Thinking, Doing, Talking Science (TDTS) is a programme that aims to make science lessons in primary schools more practical, creative and challenging. Teachers are trained in a repertoire of strategies that aim to encourage pupils to use higher order thinking skills. For example, pupils are posed ‘Big Questions’, such as ‘How do you know that the earth is a sphere?’ that are used to stimulate discussion about scientific topics and the principles of scientific enquiry.
Two teachers from each participating school received five days of professional development training delivered by a team from Science Oxford and Oxford Brookes University. The training did not aim to provide participating teachers with a set of ‘off-the-shelf’ lesson plans to be delivered in schools; rather, it sought to support teachers to be more creative and thoughtful in planning their science lessons. In addition, teachers had dedicated time to work with colleagues to plan and review lessons taught as part of the project. Teachers were also encouraged to link pupils’ learning in science, with their learning in numeracy and literacy.
This project sought to assess the impact of the programme on the academic outcomes and attitudes towards science of Year 5 pupils. A total of 655 pupils from 21 schools across England completed the project. Participating schools followed the programme for the entirety of the 2013/14 academic year. A further 20 schools formed a randomised comparison group and did not receive training in the approach until the following year.
Findings from this evaluation have moderate security. The study was set up as a randomised controlled trial, which aimed to compare the progress of pupils who received the programme to similar pupils who did not. The trial was classified as an efficacy trial, meaning that it sought to test whether the intervention can work under ideal or developer-led conditions in ten or more schools.
There was very low drop-out from the project (only 1 school out of the initial 42 that signed up), and the participating pupils appeared to be very similar to those in the comparison group. It is unlikely that the observed result occurred due to chance. In the absence of a nationally recognised science assessment, a test was developed using age-appropriate, curriculum-relevant questions that had previously been used in a similar study. The tests were administered by participating teachers, who did not have access to the test prior to the day it was taken. The security rating is discussed further in Appendix 9 of the main report.
- Thinking, Doing, Talking Science appeared to have a positive impact on the attainment of pupils in science. Overall, Year 5 pupils in schools using the approach made approximately three additional months’ progress.
- There are some indications that the approach had a particularly positive effect on pupils eligible for free school meals, but further research is needed to explore this.
- The programme had a particularly positive effect on girls and on pupils with low prior attainment.
- The approach had a positive impact on pupils’ attitudes to science, science lessons, and practical work in particular.
- National test data will be used to assess the English and mathematics outcomes of participating pupils and to measure the long-term impact of the approach. In addition, further research could be conducted to investigate whether this result can be replicated in a larger number of schools.