
Data Collection

Summarizing my Data Sets

Monitoring Student Progress:

Throughout my action research study, I aimed to consistently and accurately monitor my students' reading comprehension when using graphic organizers. I used three data collection methods: weekly student surveys, weekly comprehension quizzes with written responses, and anecdotal notes. These three methods were chosen because they measured the effectiveness of implementing graphic organizers and explicit instruction on students' reading comprehension.

At the beginning of this study, I had planned to use a pre/post paired t-test. Unfortunately, due to the coronavirus pandemic of 2020, I was unable to give my students the post-test assessment.



Data Set One: Student Surveys

On Fridays, I gave students a check-in feelings survey with a thumbs up, a sideways thumb, and a thumbs down. Students circled the thumb that best represented how they felt about whether the graphic organizer used that week helped them gain a better understanding of the stories we read. Underneath the thumbs, students completed the Two Stars and a Wish portion. In the two stars, students wrote something they enjoyed about the organizer or a way it helped them as a reader. In the wish speech bubble, students wrote something they wished had been different about the graphic organizer used that week. My goal with this survey was to see how my students felt about using a graphic organizer to organize their thoughts while reading and to see what changes I could make when implementing the next graphic organizer.

 

Image: Weekly Survey

Data Set Two: Question and Answer Written Responses

The first form of question and answer written responses that I collected from my students weekly was a set of short answer comprehension questions from the Wonders Curriculum given on the fourth day of each week. The questions corresponded with the weekly skills being taught, and at least one question each week involved creating a graphic organizer to help answer it. At the beginning of my study, I asked my students these questions aloud and took anecdotal notes on their responses. After my CADRE associate observed a lesson in this format, she noted that a comprehension check given this way did not offer me enough insight into how all students were doing with understanding the story and practicing the weekly skill. Because of this, I changed the format to a short answer comprehension quiz in which I gave students a blank copy of the organizer taught that week along with the questions to answer. These comprehension checks allowed me to see whether students were able to use the graphic organizers implemented throughout the week and to assess the comprehension skill taught each week.

The second form of question and answer written responses, which I collected from my students bi-weekly, was the Analytical Response to Writing, which allowed me to see students' ability to answer a "beyond the text" question with a graphic organizer. I used a rubric to score students out of nine points. On this rubric, students earned points in three areas by being rated exceptional (3 points), meets expectations (2 points), satisfactory (1 point), or unacceptable (0 points). The first area was restating the question, where I checked whether students could properly restate a question as they responded to it. The second was connecting to the text, where I looked to see whether responses were comprehensive, accurate, and complete, and whether students supported their answers with facts, knowledge, and details from the text. The last area was structure: students' responses needed to be well organized, developed, and easy to follow. To help with organization, I originally taught my students RAPP, where they restate the question, answer the question in their own words, prove it with text evidence, and proofread their answers. However, after a few weeks of using this model, my students struggled with explaining how the evidence they selected from the text helped prove their answers. Because of this, I taught my students the R.A.C.E. strategy, where students restate the question as a statement, answer the question in their own words, cite text evidence to support the answer, and explain how their evidence supports their answer. My goal with this form of data collection was to see if students could respond to questions that encouraged them to go back into a text in order to form a well organized and developed answer to an essential question.

Data Set Three: Anecdotal Notes

The last form of data I collected was anecdotal notes. During both whole group and small group instruction, I wrote down my observations of how students were doing with completing the organizer. Additionally, I kept track of whether students used or created a graphic organizer to help them answer the written response questions. My goal with taking anecdotal notes was to track how students were doing with using the graphic organizers. From my notes, I was able to see where I needed to make changes in my instruction to help my students grow in their reading comprehension.

