
Data Collection Methods

  • AIMSweb Math Fact Fluency (an assessment that focuses on one-digit and two-digit addition and subtraction facts)

  • Pre-test to Post-test

  • Timed Tests

  • Student Interviews

Why these methods?

I chose the above forms of data collection because I felt they were best suited to show students' growth over time.

  • AIMSweb Math Fact Fluency: This assessment was given in the fall, winter, and spring. It is a summative assessment that provides data to determine whether students are making adequate growth.

  • Pre-Test to Post-Test: Topic tests were given after about two weeks of instruction. These formative assessments helped me determine the effectiveness of my small groups, and they also helped me form the differentiated groups.

  • Timed Tests: Addition and subtraction timed tests were given once a week and lasted eight minutes each. These formative assessments helped me determine which students were becoming faster at completing one-digit addition and subtraction problems, and which students had mastered one-digit addition and subtraction.

  • Student Interviews: The interviews gave me insight into how my students were feeling about the Guided Math Workshop and about their own abilities within it.

Data Analysis

AIMSweb Math

AIMSweb is a summative assessment that shows student growth over time. The first graph shows 1-Digit Math Fact Fluency data: how many one-digit addition/subtraction facts students can answer orally in one minute. The graph shows how many students in my class met or did not meet the goal for each testing window (fall goal: 12, winter goal: 15, spring goal: 16 facts in one minute). Looking at the graph, I can easily see that the large number of students who did not meet the goal in the fall decreased dramatically by winter. This happened because addition and subtraction had been taught and practiced before the winter testing.

The second graph shows data for Math Fact Fluency with 10's: how many addition/subtraction problems containing two multiples of 10 (10, 20, 30, etc.) students can answer orally in one minute. This test was not given in the fall because it is an advanced skill that had not yet been introduced to the students. This graph also shows how many of my students met or did not meet the winter goal of 3 and the spring goal of 5 facts in a minute.

Pre-Test to Post-Test

Because our curriculum is spiraled and topics build off of each other, students who do not perform well on a test may not have the proper tools in place to be successful on the next test. This graph displays the class average on topic pre-tests and post-tests. When analyzing this data, I noticed that the class average increased from the pre-test to the post-test. This tells me that throughout the implementation of the Guided Math Workshop Model, my class improved their test scores. The individualized activities and focused instruction helped my students increase their overall knowledge. I am still wondering why the pre-test scores for Topic 11 and Topic 12 were so low if the curriculum is spiraled. After exploring the data, I found that the Topic 11 test covered subtraction with 10's and the Topic 12 test covered length, so there was not much overlap between the topics.

Timed Tests

After completing weekly timed tests, I saw overall growth in my students' scores. This graph shows timed test scores from three different students (above-level, on-level, and below-level). When analyzing this data, I observed that all three students increased their number of correct math facts from the first test to the last test. Student 1 mastered addition, subtraction, and mixed facts by completing 80 correct problems in 8 minutes. Student 2 mastered addition by completing 80 correct problems in 8 minutes. Finally, Student 3 had not yet mastered addition. Looking at these three students' scores, I can see that each student increased the number of problems answered correctly in 8 minutes. I am still wondering whether the increase in problems correct was due to their mental math strategies. I observed the below-level students counting on their fingers to solve all addition problems, while the on-level and above-level students did not count on their fingers but were inconsistent in how many problems they solved on each test. I am also wondering whether, if I had given each student a goal to meet, their overall scores would have increased each time we took a test.

Student Interviews

At the beginning and at the conclusion of this study, I conducted student interviews to gauge student comfort levels and opinions about the Guided Math Workshop Model. Before the study started, many of my students seemed bored during the math block; they were drawing on their math mats and often talking to neighbors. Toward the end of the study, my students seemed engaged and excited for the Guided Math Workshop: they were eager to start math and involved in the station activities, and there were often smiles on students' faces and conversations between students about how to solve problems. Here are a few of the responses from the beginning and the conclusion of this study:

Before the study: 
"Math is boring and hard."
"I wish we could play more games."
"I don't like sitting on the rug for that long."
"Sometimes the math mats are too hard."
After the study:
"I liked the games."
"I wish we would've done more XtraMath."
"Math was fun and not boring."
"I liked having our groups and working with my friends."