Tip of the Week
With September crossed off the calendar, most of us are now firmly into a new school year. Rules, routines and procedures have been taught and reviewed; initial data has been collected; data teams are getting into a rhythm. But what data are we looking at?
One of the teaching points I revisit frequently with teams is the need to collect data on our own adult practices. Rarely does anyone question this. But they are usually at a loss for how to do it, and even have a hard time deciding what data would be meaningful to collect.
As I always say, if you have something that is already collecting data for you, don't invent something new. Here is an example of how we used a walkthrough form to collect and analyze adult practices, and how we prioritized next steps based on the aggregate data we collected.
My work with the school started as an offshoot of ongoing work in the district, prompted by state reporting of a performance gap between Hispanic students and their white classmates and, more significantly, between English Language Learners and students whose first language was English.
The question the administrator had: What professional development do teachers need to better support English Language Learners in our school?
The Data Collection Tool
To collect data on teacher practices that impact second language acquisition, we used the Sheltered Instruction Observation Protocol (SIOP). This protocol assesses the components of a well-designed lesson for supporting second language acquisition, and allows for the recording of specific evidence for each component.
Lesson #1: You can collect school-wide data on adult practices from any form you currently use in your school.
The Data Collection Method
The walkthrough sheet described above was taken into classrooms, each of which was observed for 10-20 minutes. Each classroom's activities were recorded on a separate sheet. The larger blocks allowed for the recording of specific evidence, while the center block, listing the SIOP lesson features, served as a checkbox: the feature was evident or not. Additional comments were recorded in the margins, as necessary.
This data collection method is no different from the way teacher walkthrough data is collected everywhere. But how do we turn it into a school-wide data set, and then prioritize next steps from it?
Lesson #2: You don't necessarily need to change your data collecting techniques to begin collecting data on adult practices.
The Data Compilation
After all the walkthroughs were finished, we needed to decide where the staff's areas of strength were, where the high-leverage points were (i.e., places where good practices could be made even better), and where certain practices were conspicuously absent altogether. The idea is that it is better to build on current structures and practices than to insert brand new ones.
(NOTE: if data were collected from 80% or more of the staff, I felt there was enough to determine a school-wide focus. Here, I collected data from 89% (16/18) of staff members.)
To determine this as an aggregate, I used a blank data collection form and simply put check marks next to each component observed. In this way, I had a clear visual summary of practices which appeared to be in place, versus practices that were not yet begun in the school. Because of other professional development happening in the building, the administrator wanted to know if there were particular grades that needed more support than others, so I color-coded the checkmarks: red for K-2, green for 3-6.
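To make the compilation step concrete, here is a minimal sketch of the same tally-by-grade-band aggregation in Python, assuming each walkthrough sheet is reduced to the set of SIOP features checked off for that classroom (the feature names and counts are illustrative, not the school's actual data):

```python
from collections import Counter

# Hypothetical walkthrough results: one set of observed features per classroom.
walkthroughs = {
    "K-2": [{"links to past learning", "guided practice"},
            {"guided practice"}],
    "3-6": [{"guided practice"},
            {"links to past learning"}],
}

# Tally each feature separately for the two grade bands, mirroring the
# red (K-2) and green (3-6) checkmarks on the blank summary form.
tallies = {band: Counter() for band in walkthroughs}
for band, sheets in walkthroughs.items():
    for features in sheets:
        tallies[band].update(features)

for band, counts in tallies.items():
    print(band, dict(counts))
```

The point is not the code itself, but that the aggregation is nothing more than counting checkmarks per feature, per grade band.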
Lesson #3: Resist the temptation to disaggregate the data. Your goal is to take the "temperature" of the entire school.
The Data Analysis
So, what do the checkmarks mean?
The administrator wanted to know where to best target additional coaching support. So we looked for patterns and trends across the entire school.
We set some filters for looking at the data:
- 80% or more of classrooms = the practice is widespread and systemic
- 70-79 % of classrooms = the practice is part of the school culture, but could be improved or made more consistent
- 1-69% of classrooms = the practice occurs in pockets or individual classrooms, but is not a part of the school culture, overall
- 0% of classrooms = the practice is not evident in any classrooms in the school
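These filters are easy to express as a small function. Here is a sketch that applies the same cut points to a single practice's tally (the observation counts are hypothetical):

```python
def categorize(observed, total):
    """Apply the decision rules above to one practice's tally."""
    pct = 100 * observed / total
    if pct >= 80:
        return "widespread and systemic"
    if pct >= 70:
        return "part of the school culture"
    if pct > 0:
        return "occurs in pockets"
    return "not evident"

# 16 classrooms observed (the 89% sample described above)
print(categorize(14, 16))  # 87.5% -> widespread and systemic
print(categorize(12, 16))  # 75%   -> part of the school culture
print(categorize(5, 16))   # 31%   -> occurs in pockets
print(categorize(0, 16))   # not evident
```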
We were most interested in the widespread, systemic practices (because these reflect transfer of past professional development into standard practice, and these represent areas of strength for the staff, upon which we can build their learning); practices that are part of the culture but not yet systemic (because these represent the staff's "instructional" level); and practices which are not evident in any of the classrooms (because these demonstrate potential obstacles to overcome). The broad range of practices that are not part of the school culture are not the primary focus, because they may be out of the repertoire of many of the faculty without more intensive support. They are important, but not the top priority at present.
Based on these decision rules, we sorted the practices into each of the categories above. Note that I created a handy chart of the number of tallies and the percentage each represents of the total number of classrooms. It was just as fast for me to create this "cheat sheet" once as to enter it all into a spreadsheet.
I used no special paper, just a corner of the summary sheet - why make things more complicated than necessary? But I did color code the three levels. Color is helpful, and most teachers are familiar with this color scheme from SRBI.
I love a good Excel spreadsheet. You might be tempted to codify the entire data set and enter it into a spreadsheet. Here are the reasons NOT to do this:
- The visual of simple tallies and color coding draws your eye to areas of relative strength and need, without numbers.
- You will have to prioritize your next steps, anyway. This method helps you quickly scan and select a few areas that stand the most reasonable chance of improvement, at the present time.
- Once you prioritize, THEN you can go back to your data sheets and look more deeply at the data for just those areas.
- The time you take to codify the whole set of observations would be better spent beginning to develop a plan for addressing the significant few.
Lesson #4: Develop decision-rules to prioritize a focus BEFORE deeply analyzing the data.
The Next Steps
Sometimes (not usually), our data tell us exactly where to focus teacher support. Other times (most times), they tell us that there are many areas of need, or scattered, seemingly disconnected ones. We can only do so much in one year, so it is up to us to identify the key areas that will give us the greatest changes in practice. This often takes a little detective work and inference, and the combining of small topics into a larger, more meaningful focus.
For example, in our data set, guided practice showed up as a universal structure, but also as an area for targeted support, based on the ways that teachers were using guided practice to support English Language Learners. Similarly, while there were many classrooms with small group activities, there were comparatively few observed where the teacher was directly instructing the entire class. It was inferred, therefore, that support in the more explicit use of the gradual release model might address all of these areas, and would be preferable to addressing each of these pieces as individual topics.
Based on the practices highlighted in our analysis, it was determined that these would be the prioritized areas of support for the teachers:
- Anticipating parts of a lesson where there will be need for adaptations (in materials, content) with regard to language needs of students;
- Selecting language goals (especially for speaking and listening) and sharing them with students;
- Activating prior knowledge (linking to past learning, prior experiences, objectives);
- Development of a more explicit gradual release of responsibility for learning model (define-model-shared practice-guided practice-independent practice);
- Whole class instruction techniques that increase student engagement;
- Building in frequent checks for understanding into instructional sequence.
It was also decided that, since the percentages were consistently higher for the K-2 classrooms, and there was another professional development targeting those grades, the bulk of the coaching support would be delivered to grades 3-6, with occasional grade level team meetings with the other grades, or delivery of tips and materials during staff meetings.
Lesson #5: Be ready to infer underlying reasons for disparate data points, and to explore root causes for some observations.
Developing an Instructional Plan
Where to start?
There are many decision rules when picking the starting point for addressing adult data. In most ways, they are not unlike the rules you use when working with student data:
- Which area impacts the most teachers?
- Which area impacts the most other instructional areas?
- Which area is the foundation for the others, or, if you address it first, would also address several other areas of need?
- Are there foundational needs not directly assessed, that need to be addressed, based on the data collected?
- Which area, if addressed, would lead to the greatest change in practices in the school?
- Which area, if addressed, would lead to the greatest increase in student performance?
- Which area is the easiest to address, with given resources (time, staffing, materials)?
- Which area is the least expensive to address?
- Which area needs to be addressed urgently (that is, time depends on it)?
So, you see there isn't one decision rule.
In the above example, we then went back to our original data sheets to read over the specific comments for the areas we identified using our simple tally mark system. We determined that teachers were using a lot of good practices, but few of them were selected and used to directly address the needs of ELLs in their rooms. In other words, instruction was not necessarily adapted to address the unique needs of these students. In order to do that, however, teachers had to have a way of analyzing their lessons to anticipate areas where English language learning would impact accessibility of the instruction.
Consequently, the decision was made to introduce the staff to Cummins' Quadrants, a tool for analyzing the content and language demands of instruction and activities. This foundational learning would then form the basis for identifying ways to adapt and supplement materials, content and instruction, with the needs of their ELLs in mind. This one piece would then, theoretically, set teachers up for most of the other focus areas: without addressing it first, the other areas might seem disconnected from one another.
Lesson #6: Once target areas are determined, go back and read over the specific data to help construct plans to address these areas.
Because you used a data collection tool and method which are not new to you, you don't have to invent a new tool to collect data on how well the plan is working. There are two basic ways to monitor progress:
- Only collect monitoring data on the specific areas being addressed. For example, if your data showed that teachers were beginning lessons without any introductory activities, you might focus observations ONLY on the Warm-up and Anticipatory Set parts of the data collection tool, leaving the others blank.
- Collect data on all areas. The rationale here is that you've selected a focus area that impacts many other areas, so you want to see if the strategies you choose have impact outside of that specific area. In our example, above, we felt that knowledge of content and language demands might impact a number of areas, so we would use the entire form.
Did You See the Data Teams Steps?
As you see here, with this one example, collecting data on adult practices is no different from collecting data on DRA performance in students. This look at adult practices is helpful for school data teams when deciding on a problem of practice for a school-wide goal, for administrative teams determining goals for Professional Learning Plans for staff, and for professional development committees creating PD plans for the school year.
Will You Help Me?
Will you help me out by taking a brief survey on your experience with this page? (You will NOT be directed to an advertiser! This is for my research, only!)
Schools all over the state (and country) are communicating data to stakeholders through the use of data walls. Businesses have been doing this for decades. This article reviews some of the considerations when using data in various school settings, based on some of the observations I have made in schools over the past several years, as well as teacher feedback.
Personal Data Walls:
Teachers have re-discovered the importance and impact of sharing a student's progress with the student, and inviting that student into the data-driven decision-making process. Special educators have had students track their progress (fluency graphs, sight word checklists, DRA levels, etc.) for years. More and more, classroom teachers are having students do the same in the general ed classroom, and are seeing the impact on even the youngest learners.
First-grade teachers at O'Brien STEM Elementary School in East Hartford use file folders as personal data walls for their students. Students graph their writing scores five times a year and post the graph on the left side of the folder, as well as keep track of their mastery of their first grade sight words in a graph on the right side of the folder. Photocopies of the graphs are sent home periodically to communicate with parents, and the students use their folders during conferences. The folder format creates a handy way to send scores and student work to the receiving second grade teacher at the end of the year.
Classroom Data Displays
In a previous post ("Getting Data Teams Up and Running, 2011"), I shared one of the best classroom data displays that I've seen, where the second grade team at Mayberry Elementary School in East Hartford created a "walking data wall" to show student progress in the DRA2. Teachers at O'Brien found that placing their student reading group table near the display helped keep students focused on their goals. They also met with parents at this table, so that parents could see where their children fell in relation to their peers in their reading progress.
Public Data Displays
When it comes to displaying data outside the classroom, teachers and schools have to make some decisions:
- Who is the intended audience for this display (parents, other students, other teachers/staff)?
- If parents, what is the intended purpose of the display (to inform, to teach, to call to action)?
- If students, how will students (in your class and others) use the data? How will their attention be drawn to it?
- If other teachers, how will the data be highlighted? What will be the intended action?
Recently, I met with teams of teachers at O'Brien STEM Elementary School, in East Hartford, where we discussed how to make hall displays parent friendly. Here are some suggestions for sharing data with the community:
- Use a small amount of data to show the reason behind the current classroom work (e.g., a bar graph showing DRA2 data to show why 'retelling' is the current area of focus in Grade 2). Parents can support school focus more easily if they understand why it is important.
- Avoid "sorting" words like "below basic," "proficient," "substantially deficient," etc. While we may use these words as teachers, they do not evoke positive and encouraging thoughts in parents. Better to provide the goal, and visually show progress toward the goal. People will get the picture, without the "punishing" words.
- Use lots of visuals to show, rather than tell. Classroom teachers can share photos and vignettes of ways that they addressed the data focus, to show parents the kinds of activities that support learners in that area.
- Provide a pocket folder with "take-home" ideas. One fourth grade teacher at O'Brien provided parents with ideas on how to support the grade-level focus at home. Other teachers provide a classroom newsletter as part of the display, to make the display interactive.
- Turn your display into a "waiting room." If there was something I wanted parents to see, I placed a desk next to it during parent conferences. Waiting parents could interact with the display while they waited for their turn at conferences.
School Data Displays
I was at E. C. Goodwin Technical High School last week, and got a chance to take a good look at the data display in their main office conference room, before the school data team meeting convened. Here were the components of the display, simply tacked to the bulletin board:
- The School Improvement Plan (front page with main goals showing)
- CAPT data graph, showing 5 years of standardized assessment data (reading, writing, math and science) for 10th graders
- NOCTI data graph, showing 5 years' performance on the standardized trades exam
- The school professional development plan and calendar for the year
- CMT and CAPT state assessment data, disaggregated by student graduating cohort
- Guiding questions for the School Data Team
- CAPT data graph showing 5 years of reading scores and 5 years of writing scores
- The Reading Action Plan from the SIP
- The most recent English Department Data Team process summary (i.e., their most recent data team minutes)
- A narrative description of their current strategy focus (a teacher-created strategy to make more meaningful connections to literature)
- Dipstick data on a scoring rubric
- CAPT data graph showing 5 years of math scores
- The Math Action Plan from the SIP
- Math screening data (from STAR Math) - grade-level profile
- The most recent Math Department Data Team process summary
- A narrative description of their method of selecting a targeted group of struggling ninth graders for a focus group on working with exponents
- Dipstick data (via "quizlets") for the targeted student group
- Office discipline referrals for the year, by month
- A narrative of school-wide strategies being implemented to address ODRs
- CAPT data graph showing 5 years of science scores
- The most recent Science Department Data Team process summary
- A bulleted instructional plan to address the current student learning focus (developing a problem statement) in Science
- Summaries of professional development to address literacy, numeracy and comprehension strategies in the trades
The display clearly showed the alignment between district, school and departmental goals, as linked by their four guiding questions. On a monthly basis, the team gathered for brief reports, by department, then discussed school-wide strategies to address themes that emerged across disciplines. For example, their most recent debrief revealed a student learning issue around making meaning from text that was technical in nature: assignment directions, math word problems, scientific procedures, technical manual specs, etc. They then discussed the adoption of a school-wide strategy for paraphrasing technical texts (going from part to whole), as well as a school-wide strategy for analyzing problems and procedures (whole to part).
For more examples of public data displays, see my Pinterest board on School Data Walls
. The examples show different formats that schools have chosen to present data. Choose the format that best suits the purpose and culture of your team and school. I will continue to add to the board as I see examples to share, so check back often.
Tags: professional development, data teams, progress monitoring, classroom environment, data displays, leadership teams
Over the past 6 years, I have worked with teachers in countless schools, in many districts, representing grades preschool through continuing education. Almost to a teacher, teams everywhere wish that they could see what other schools across the state are doing when they meet as professional learning teams or data teams.
As a help for all of you who learn by observing others, I have put together this collection of video clips and "real life" ways that schools have implemented the data team process in their communities. They are not offered to represent the "perfect" data team function, but to highlight effective process, in a variety of forms.
For each one, I have highlighted a particular point to observe. As I combed through videos, I looked specifically for those that answered a particular question that teams have asked me. My answer to those questions is, "Well, here is one way that a team addressed that question..."
The teams are not all Connecticut teams, but the resources are. If you are not a Connecticut educator, check your state's Department of Education website for data teams resources, or feel free to access the ones at the Connecticut State Department of Education website.
What is the Data Teams Process?
While most schools have data teams in place, schools vary in the maturity of their data teams. Many teachers want to know what the basic data team process is. In this short video clip, Doug Reeves, of the Leadership and Learning Center, reviews his organization's protocol for data team conversations, which has been adopted by schools all over the country. The images are from Aptos High School, Pajaro Unified School District, Watsonville, CA.
Organizing for Data Teams
Many school districts have become creative in helping to support their teachers when the Data Teams process becomes a district-wide initiative. Here are some nice examples of the use of technology to establish coherence across school districts.
Windham Middle School, Windham Public Schools, Willimantic CT, has created a Wiki for the sharing of district-wide forms and protocols. Wikis can be created in minutes, and provide a place for public or internal sharing of resources and collaboration. This saves meeting time for talking about students and instruction.
Also in Connecticut, Middletown Public Schools posts the dates for all district, school and grade-level data team meetings on its public web page, with links to minutes and progress monitoring reports at all levels. Note the specific focus of each meeting, the specificity of the grade-level team goals, and the concise nature of the reports. Also note how clean the page looks, making it easy for parents, administrators and teachers to find and access information.
Forms, Forms, Forms...
There is a learning curve with anything new, including district-wide forms and electronic templates. Waterloo Community School District has created online video tutorials demonstrating how to use the various tabs and parts of the district's electronic data teams template. The clips are only a few minutes long, and can easily be included in a data team meeting for a mini-PD. How could your district use something like this?
The Connecticut State Department of Education has a number of helpful forms available on its Data-Driven Decision-Making and Data Teams web page, including data team forms for instructional and school data teams.
Types of Data Team Meetings
School Data Team Meetings
Grade-Level Team Meetings
In this fourth-grade team meeting, teachers at Belmont Elementary School analyze common formative assessment data and create an action plan to address three areas: re-teaching of weak areas in measurement; inclusion of test-taking strategies in instruction; and explicit Tier 2 vocabulary instruction to support English Language Learners. Also noteworthy is what they said they would NOT discuss at the meeting, and their plan for when they WOULD. (Length: 11:21)
Child Study Team Meetings
Every Team Has a Purpose...
The State of Pennsylvania breaks data-driven decision-making in the Response to Intervention model into four phases: 1) Universal Screening and Pre-meeting Preparation, 2) Core Team Meetings, 3) Grade-level Team Meetings, and 4) Process Outcomes. This brief video explains the data and conversations of each phase, and the type of meeting that would address that component.
Other Resources and Ideas
Here are links to some other articles on data teams, on this website:
Charting Student Progress
Last week, we talked about different assessments that folks use to monitor student progress. Right now, most schools are in the midst of mid-year benchmark assessments that are also used to mark progress toward grade-level expectations. Among all the students, there are those who have been receiving a "double dose" of support, through formal and informal interventions. How do we know if those interventions are doing what we had hoped they would do? How good is "good enough?"
In this article, I will show you how to create goal lines and trend lines, and then teach you how to use them to determine if a student is progressing at a suitable rate. In this way, you will be able to judge the effectiveness of an intervention and make mid-course corrections to get better progress, if necessary.
What is the "goal?"
Whenever we assess students, we begin with four pieces of data:
- Where they should have been at the previous grade
- Where they should be at the end of the grade
- Where they actually ended up at the end of the previous grade
- Where they are now
Let's use a DRA example, since most of you are familiar with that assessment.
Imagine that you have a student, Joseph, who is in the middle of second grade, and has been receiving Tier 2 reading intervention from his teacher since the start of the year, and Tier 3 reading intervention (TLC) with a reading teacher since September. Here are the data we will use for this student's graph:
Grade Level Expectation:
- 18 (May, Grade 1)
- 20 (September, Grade 2)
- 24 (January, Grade 2)
- 28.5 (May, Grade 2) [NOTE: PowerSchool reports a 28 Non-fiction as 28.5 -- a convention we will use here]
Joseph's DRA levels:
- 6 (May, Grade 1)
- 6 (September, Grade 2)
- 16 (January, Grade 2)
- No assessment for May yet
We can see that Joseph is not at the expected level for this time of the year, in second grade. We can also see that he has made great gains. But are his gains strong enough? Is he making progress toward grade level? Has his specialized instruction adequately targeted his specific needs?
First, we need to plot the line of expected grade level DRA scores:
This is our goal line (or aim line) -- the line that represents how we want all second grade students to perform throughout the course of the year. A student should progress 10.5 "points" over the 10 months of instruction, for a rate of growth of 1.05 "points" per month. While this figure doesn't mean anything real with respect to the actual DRA assessments, it does help us for the next part of the project.
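The arithmetic behind the goal line is simple enough to sketch in a few lines of Python, using the Grade 2 expectations listed above:

```python
# Grade 2 DRA goal line: 18 (end of Grade 1) to 28.5 (end of Grade 2)
start, end, months = 18.0, 28.5, 10
goal_rate = (end - start) / months               # 1.05 "points" per month
goal_line = [start + goal_rate * m for m in range(months + 1)]
print(round(goal_rate, 2))                       # 1.05
```

Plotting `goal_line` against the months of the school year reproduces the aim line described above.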
Plotting Student Data, and Drawing a Trend Line
Now, let's plot Joseph's scores against the grade-level expectations. [Vertical lines represent the start of new interventions]
We don't have Joseph's spring scores for second grade yet -- we want to know if he is on course toward the GLE of 28 for the end of Grade 2, so we can adjust his intervention, if need be. So we have to figure out a way to predict where Joseph might end up, if we leave things just the way they are.
Joseph is currently at a level 16, having finished Grade 1 at a level 6, a difference of 10 "points" over 6 months of instruction. This means he has progressed at a rate of 1.67 "points" per month of instruction. We also note that he has really taken off since adding the TLC to his "menu" of support. But has his growth been adequate? How do we know?
Let's draw a line passing through both his starting score and his last score -- this trend line can help us predict what score he might have with his current rate of growth:
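Here is a sketch of the same trend-line arithmetic in Python, using Joseph's scores from above. A second trend is included, starting at the September intervention change, since the two slopes lead to very different predictions:

```python
# Overall trend: DRA 6 (May, Grade 1) to 16 (January, Grade 2), 6 months apart
overall_rate = (16 - 6) / 6                  # ~1.67 "points" per month

# Trend since TLC began: 6 (September) to 16 (January), 4 months apart
tlc_rate = (16 - 6) / 4                      # 2.5 "points" per month

# Project each trend forward 4 more months, to the end of Grade 2
print(round(16 + overall_rate * 4, 1))       # 22.7
print(round(16 + tlc_rate * 4, 1))           # 26.0
```

Extending the trend line is just multiplying the rate of growth by the months remaining; which starting point you use changes the prediction substantially.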
Joseph has covered a lot of territory since the end of Grade 1. But his reading really started to take off after the second intervention cycle. Prior to that, he really didn't make any progress, at all. What would the trend line look like if we started it with the beginning of the last intervention?
How Good is "Good Enough?"
But how do we know if what we are doing is working for students? In the realm of special education, there is a rule of thumb that, through specially designed instruction, students should make up an additional 0.5 year of progress over the course of a year of intervention. That means a student in an intervention should progress at a rate of 1.5 times the rate of the grade-level goal line. Let's see what that means for Joseph.
We calculated that a second grader must progress from a DRA level 18 at the end of Grade 1, to a 28.5 (a 28 Non-fiction) by the end of Grade 2. This means that he should progress at a rate of 1.05 "points" per month (since we are using 10 months as our school year).
Joseph, we determined, was actually progressing at a rate of 1.67. If we multiply the expected rate (1.05) by 1.5, we get 1.58. In addition, look more closely at his graph. He began really developing as a reader in September, when he started TLC. He went from a 6 to a 16 in 4 months of instruction, a rate of 2.5 "points" per month, much higher than the 1.58 rate we need to accelerate his learning.
So what might we conclude? We could conclude that, in his current interventions, Joseph could reach close to grade level at the end of Grade 2, with continued support. We could also conclude that TLC is the appropriate intervention for him at this time, based on his rate of improvement since September (greater than 1.58 pts per month).
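The 1.5x acceleration rule of thumb reduces to a one-line check. A sketch, using the Grade 2 numbers from this example:

```python
def is_accelerating(student_rate, goal_rate, factor=1.5):
    """Rule of thumb: an intervention should produce at least 1.5 times
    the grade-level rate of growth to close the gap."""
    return student_rate >= factor * goal_rate

goal_rate = (28.5 - 18) / 10                 # 1.05 "points" per month for Grade 2
target = 1.5 * goal_rate                     # ~1.58 "points" per month
print(is_accelerating(2.5, goal_rate))       # Joseph since TLC began: True
print(is_accelerating(1.05, goal_rate))      # merely keeping pace: False
```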
Let's Look at the Rest of the Group...
Here's the graph of some of Joseph's classmates. What could you say about their progress? Are they accelerating (i.e., are they progressing at 1.5 times the expected grade-level growth rate)? What is your evidence? [The rate of growth for each student is at the end of their trend line. Remember, we want a rate of 1.58 pts/month, or higher, to demonstrate acceleration]
Practical Uses of Goal Lines and Trend Lines
In the SRBI model, goal lines and trend lines are extremely useful in determining whether or not a student is progressing adequately in his current instructional program, or may need a more (or less) intensive program of support. Use of these lines can also help us evaluate the overall effectiveness of particular interventions (as in the case of Joseph).
Folks can get quite sophisticated with these charts. As a teacher, I would draw vertical lines each time the intervention plan changed, and write it right on the chart. Then I would plot the slope (rate of progress) during each intervention, instead of an overall line, to see which intervention was the turning point for the child.
If you want to experiment with goal lines and trend lines, choose one intervention group you have, especially one where you sense that students are not progressing at the same rate (so you see varying results). Ask yourself the following questions:
- Is the student progressing?
- Is she accelerating (at 1.5 times the rate of the goal line)?
- Is the intervention plan effective for this child?
For More Information
Mid-year finds teachers everywhere administering all kinds of assessments of student learning progress. Many of you ask what teachers use as ongoing monitoring measurements, when it is not time for a benchmark assessment but you want to know more about how a student is progressing. Here is a short list of published assessments that my current schools are using -- the list is, of course, subject to change. This is not meant to be exhaustive, but an indication of programs that multiple institutions have selected.
Formative Assessment Tools
All of the following are quick to administer (under 15 minutes), quick to score (all but the first are scored automatically by computer), and quick to analyze (the web-based items provide multiple reporting options, and most suggest follow-up activities, or include mini-lessons to address targeted skills as identified on the assessment).
- Uncovering Student Ideas Series (science, math formative assessments), grades K-12 (Page Keeley). Collections of formative assessments - book and electronic forms available for purchase, with many free, downloadable samples for preview and use. Assessments can be used at all grade levels. Can be administered as one-page paper and pencil assessments, as demonstrations with follow-up written assessment, or as performance tasks.
- Easy CBM - Web-based, curriculum-based formative assessments for reading and math, grades K-8. Free individual subscription, site purchases with online monitoring tools available for schools.
- Fraction Nation (Scholastic) - Web-based instruction on fractions in 15 minute mini-lessons, for grades 4-8. Adaptive online assessments - individual and group reporting options. Site licenses available for purchase.
- Fastt Math (Scholastic) - Web-based instruction on basic math facts (addition, subtraction, multiplication and division), in 10 minute mini-lessons, for grades 2 and up. Spanish language instruction available as an option. Adaptive online assessments - individual and group reporting options. Site licenses available for purchase.
- Scholastic Reading Inventory (Scholastic) - Web-based reading assessments for K-12. Useful for universal screen, progress monitoring, placement and benchmark reading levels. Multiple reporting options. Site licenses available for purchase.
- Scholastic Reading Counts (Scholastic) - Web-based reading assessments for K-12. Students select comprehension quizzes for the books they have read independently. Multiple reporting options. Site licenses available for purchase.
- Scholastic Phonics Inventory (Scholastic) - Web-based decoding and sight word assessments for grades 2-12. 10-minute assessments useful for universal screen, progress monitoring, placement and benchmark reading levels. Multiple reporting options. Site licenses available for purchase.
- Accelerated Reader (Renaissance Learning) - Web-based reading assessments for grades K-12. Students select comprehension quizzes for the books they have read independently. Multiple reporting options. Site licenses available for purchase.
- Star Math (Renaissance Learning) - Web-based math assessments for grades K-12. Multiple reporting options. Site licenses available for purchase.
Intervention Programs That Monitor Progress
In addition, the following intervention programs include their own monitoring and reporting tools.