## Determining a Grade Level in Math

When I am asked to consult on or evaluate a student, the student is often years behind in math. As a result, I am often asked to determine the grade level of the student's achievement. Reducing math achievement to a single number is not viable. This post explains why.

### Common Scenario

Here is a common scenario. A school official reported the grade level in math for a student: the 7th grade student tested at a 4th grade level. As a result, the student spent much of her 7th grade year working on 4th grade math. When I started working with her, I discovered that she was quite capable of higher-level math. Six months later, she was taking Algebra 1.

### The Math Spider Web

Math is not nearly as linear as reading. It is more like a spider web of categories (called domains). For example, Geometry is not a prerequisite for Ratios and Proportions, and Fractions is not a prerequisite for Expressions and Equations. Geometry and fractions may appear in problems associated with other domains, but they are not foundational building blocks.

On the other hand, in reading, comprehension and decoding are essential in all grade levels. Unresolved trouble with decoding in 3rd grade causes major problems in 4th grade and beyond.

When a student tests at a 3.2 in reading, this provides a clear picture of where the student is in the progression of reading ability. There are books written at that grade level.

If a student is reported to test at a 3rd grade level in math, the student may have scored higher than 3rd grade in Geometry, at 3rd grade in Measurement and Data, and lower than 3rd grade in the other domains. True, in reading we have students who may decode at a high level and comprehend at a low level, but that is more specific than sorting through 6 domains in math. Then consider that, by middle school, the cumulative number of domains increases to 11.

### The Domains

The image below shows a breakdown of the Common Core State Standards math domains. In a video, I use this graphic to unpack why it is more challenging to determine a single level of ability in math.

If you are presented with a single grade level as an indicator of math ability, I recommend that you ask for a breakdown by domain and for an explanation of how your student will be provided differentiated instruction to address the gaps. This is more appropriate than plowing through all of the math at a lower grade level.

## Planning and Preparing for Math in the Fall

If you are reading this post, it is likely that you have a student or you teach students who struggle with math. Here are suggestions to help your students prepare for the math they will encounter in the fall.

Many students are behind in their math education, and this has long-term implications. The sooner you can address the gaps, the better your student's chance of post-secondary success or competence with math.

## Curriculum Based Assessments

Most testing for IEPs involves standardized testing. As I wrote in a previous post, this testing is important but not sufficient. A major focus of special education is to make the general education curriculum as accessible as possible. Hence, curriculum-based testing is an important complement to standardized testing. For example, the KeyMath3 assessment will speak to problem solving or geometry, but those are broad categories. If I am working with a 3rd or 4th grade student, I am interested in the student's level of mastery in computing the perimeter of a rectangle.

Also, math is very different from reading because it comprises a variety of categories, known as domains. A student testing at a 4th grade level in math does not reveal much information, as I explain in this previous post.

When I conduct evaluations or assessments, I go to the Common Core Standards and assess each standard with curriculum-based problems (see below). The photo shows my planning document; I then transfer the problems to a handout for the student to complete.

## Shopping at the Grocery Store

There are numerous hidden tasks that we undertake while at the grocery store. We process them so quickly or subconsciously that we are not aware of these steps.

As a result, we may overlook these steps while educating students on life skills such as grocery shopping. Subsequently, these steps may not be part of the programming or teaching at school and therefore generalization is left for another day. Yet, the purpose of IDEA is, in essence, preparing students for life, including “independent living.”

To address this, we can take a task analysis approach in which we break down the act of shopping at a grocery store into a sequence of discrete steps or tasks (see excerpt of the task analysis document below).

Step 1 is to administer a baseline pretest, during which we start with no prompting to determine whether the student performs each task and how well each is performed. As necessary, prompting is provided and documented in the table (to distinguish prompted from independent completion). For example, I worked with a client who understood the meaning of the shopping list but started off for the first item without a basket or cart. I engaged him in a discussion about how he would carry the items. At one point I had him hold 7 grapefruits, and it became apparent to him that he needed a cart. (I documented this in the document.)

Other issues that arose were parking the cart in the middle of the aisle, finding the appropriate section of the store but struggling to navigate the section for the item (e.g., at one point I prompted him to read the signs over the freezer doors), and mishandling the money when the cashier announced the total due.

Step 2 is to identify a task or sequence of tasks to practice in isolation based on the results of the pretest. For example, this could involve walking to a section of the store and prompting the student to find an item. Data collection would involve several trials of simply finding the item without addressing any other steps of the task analysis.

Step 3 would be to chain multiple steps together, but not the entire task analysis yet. For example, having the student find the appropriate section and then finding the item in the section.

Eventually, a post-test can be administered to assess the entire sequence to identify progress and areas needing more attention.
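The pretest-to-post-test workflow above lends itself to a simple data structure. Below is a minimal sketch in Python, not the author's actual instrument; the task names, prompt levels, and trial outcomes are hypothetical examples based on the grocery-shopping scenario described above.

```python
# Minimal sketch of a task-analysis data sheet for grocery shopping.
# Task names, prompt levels, and trial notes are hypothetical examples.

PROMPT_LEVELS = ["independent", "verbal", "gestural", "physical"]

tasks = [
    "get basket or cart",
    "find section for item",
    "find item in section",
    "pay cashier",
]

# Each trial records the task, the prompt level needed, and a note,
# so prompted completion is distinguished from independent completion.
trials = [
    {"task": "get basket or cart", "prompt": "verbal",
     "note": "started for first item without a cart"},
    {"task": "find item in section", "prompt": "gestural",
     "note": "prompted to read signs over freezer doors"},
    {"task": "pay cashier", "prompt": "verbal",
     "note": "mishandled money at checkout"},
    {"task": "find section for item", "prompt": "independent", "note": ""},
]

def independence_rate(trials, task):
    """Share of trials for a task completed without prompting."""
    relevant = [t for t in trials if t["task"] == task]
    if not relevant:
        return None  # no data collected yet for this task
    done = sum(1 for t in relevant if t["prompt"] == "independent")
    return done / len(relevant)

for task in tasks:
    rate = independence_rate(trials, task)
    print(task, "->", "no data" if rate is None else f"{rate:.0%}")
```

A summary like this makes it easy to pick which step to practice in isolation (Step 2) and which steps to chain together (Step 3).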

## Asking for Examples of Mastery for IEP Objectives

To ensure the IEP team is on the same page as to what mastery of an objective looks like, the person writing the objective can take two steps:

1. provide an example problem that would be used to assess mastery (and the example problem would have the same language as used in the objective)
2. provide an example of a response to the example problem cited above that would be considered mastery level work
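As a hypothetical illustration of these two steps (the objective wording, problem, and numbers here are my own, not drawn from any specific IEP):

```
Objective: Given a linear equation of the form ax + b = c, the student
will solve for x, showing each step.

Example problem:   Solve 3x + 5 = 20

Mastery-level response:
    3x + 5 = 20
    3x = 15        (subtract 5 from both sides)
    x = 5          (divide both sides by 3)
```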

The graph below is not data. A graph is a representation of summary statistics; it summarizes the data.

The chart below does not show the actual prompts (e.g., what number was shown to Kate), but it does show the individual trials. This is data, with a summary statistic at the end of each row. Here is a link to more discussion about data, with an example of a data sheet I use.
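A trial-based data sheet like this can be modeled as rows of trials with a summary statistic computed per row. The sketch below is illustrative only; the session dates and trial outcomes are invented, not taken from an actual student record.

```python
# Sketch of a trial-based data sheet: each row is a session of trials
# (True = correct, False = incorrect), with a summary statistic at the
# end of the row. Dates and outcomes are invented for illustration.

rows = {
    "Session 1": [True, True, False, True, False],
    "Session 2": [True, True, True, False, True],
    "Session 3": [True, True, True, True, True],
}

def row_summary(outcomes):
    """Percent correct for one row of trials."""
    return 100 * sum(outcomes) / len(outcomes)

for session, outcomes in rows.items():
    marks = " ".join("+" if ok else "-" for ok in outcomes)
    print(f"{session}: {marks}  {row_summary(outcomes):.0f}%")
```

The per-row percentages are the summary statistics a graph would display; the trial-by-trial outcomes are the underlying data.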

The data shown below addresses the student’s effort to solve an equation. Problem 21 is checked as correct and the error in problem 22 is identified. I can use this data to identify where the student is struggling and how to help. NOTE: the math objective would use the same verb as the problem: solve the linear equation.

The excerpt of a data sheet below shows trials from a student's effort to compare numbers.

The data below show a student's effort to evaluate integer expressions.

This applies to areas beyond math. The chart above and the linked data sheet both indicate the prompt and the results, with notes. For example, if I am asking my son to put on his shoes, each row of the data sheet is a trial with the outcome and notes.


## Cutting Up the Math Into Bite-sized Pieces

When I train new math and special education teachers, I explain that teaching math should be like feeding a hot dog to a baby in a high chair: cut up the hot dog into bite-sized pieces, and the baby will still consume the entire hot dog. The same holds for math. Our students can consume the entire math topic being presented, but in smaller chunks.

My approach to doing this is through a task analysis. This is very similar to chunking. It is a method to cut up the math into bite-sized pieces just as we would break up a common task for students with special needs.

While waiting for my coffee order at a Burger King, I saw on the wall a different version of a task analysis. It was a step-by-step set of directions, using photos, for pouring a soft-serve ice cream cone. I thought it was remarkable that Burger King can do such a good job training its employees by breaking a task down, yet in education we often fall short in breaking a math topic down.

## Performance vs Ability

In any effort to assess student ability, performance factors are likely present. It is incumbent upon educators to mitigate those performance issues in order to assess true ability.

For example, I conducted an evaluation of a middle school student who has ADHD. All of her testing records indicated that she would lose focus during assessments, which was problematic for testing. Before we met, I surveyed her on her favorite snack (I didn't know Sour Skittles were a thing) and brought this reinforcer along with a bottle of water. She sat through an entire 1 1/2 hour KeyMath assessment without incident.

## Busy Engagement vs Intellectual Engagement

Owls are symbols of intelligence, but that reputation rests on their appearance of awareness and their deft hunting skills. The claim is that appearance and skill sets are confused with actual wisdom.

I find a parallel between the perceived wisdom of owls and the perceived learning of students. Through my years in education, I have seen teachers praised for their student-centered activities. The students may be energetic and on task during an activity, which is often treated as a touchstone for learning. What is often missing is independent assessment to determine actual learning.

Once I was covering a class for a teacher widely praised for his student-centered and multimedia activities. In the class I covered, the students were taking a test. It was clear that the majority of the students were unsure of their performance. Several were looking around, one pulled out a phone, and a couple looked at other students' papers. Very few were locked in on completing their test.

I am not suggesting that multimedia or student-centered activities are ineffective. My point is that there is a perception that such activities are inherently effective and reflective of actual learning. There is a difference between being intellectually engaged and being busy. The owl deftly executes action and skill, but that does not indicate higher-level functioning. Conceptual understanding requires more than simply being engaged by an activity. Hopefully this is food for thought.