If you are reading this post, it is likely that you have a student or you teach students who struggle with math. Here are suggestions to help your students prepare for the math they will encounter in the fall.
Ask for examples of what mastery looks like for the IEP math objectives. You may not understand the math, but you can compare your student’s work with an example of mastery to get a sense of whether your student is approaching that level.
Get a math evaluation to see where your student is in terms of the curriculum.
Many students are behind in their math education, and this has long-term implications. The sooner you can address the gaps, the better chance your student has for post-secondary success and competence with math.
Most testing for IEPs involves standardized testing. As I wrote in a previous post, this testing is important but not sufficient. A major focus of special education is to make the general education curriculum as accessible as possible. Hence, curriculum-based testing is an important complement to standardized testing. For example, the KeyMath3 assessment will speak to problem solving or geometry, but those are broad categories. If I am working with a 3rd or 4th grade student, I would be interested in the student’s level of mastery in computing the perimeter of a rectangle.
Also, math is very different from reading because it comprises a variety of categories, known as domains. A student testing at a 4th grade level in math does not reveal much information, as I explain in this previous post.
When I conduct evaluations or assessments, I go to the Common Core State Standards and assess each standard with curriculum-based problems, as shown below. The photo shows my planning document; I then transfer the problems to a handout for the student to complete.
A common scenario involves a school official reporting out the grade level in math for a student. For example, a 7th grade student I was helping had tested at a 4th grade level. As a result, the student spent much of her 7th grade year working on 4th grade math.
There are a couple of problems with establishing a grade level in math. First, unlike reading, math is not nearly as linear. The image below shows a breakdown of the Common Core State Standards math categories, called domains. In a video, I use this graphic to unpack why it is more challenging to determine a single level of ability for math. In short, the student could be doing well in some categories and poorly in others. Second, the testing used to establish ability level can be problematic for the student. For example, the student may not have the stamina or attention span to endure a longer assessment.
If you are presented with a single grade level as an indicator of math ability, I recommend that you ask for a breakdown by category and how your student will be provided differentiation to address gaps. This is more appropriate than plowing through all of the math at a lower grade level.
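To make the point concrete, here is a minimal sketch of why a single grade level hides the per-domain picture. The domain names follow the Common Core math domains, but the student, the level estimates, and the two-grade gap threshold are all invented for illustration.

```python
# Hypothetical per-domain level estimates for a 7th grade student.
# These numbers are invented; the point is the contrast between the
# single average and the domain-by-domain breakdown.
domain_levels = {
    "Operations & Algebraic Thinking": 6.5,
    "Number & Operations in Base Ten": 4.0,
    "Number & Operations - Fractions": 3.5,
    "Measurement & Data": 6.0,
    "Geometry": 5.5,
}
current_grade = 7

# A single reported level masks the variation...
average_level = sum(domain_levels.values()) / len(domain_levels)
print(f"Single reported level: {average_level:.1f}")

# ...while a breakdown shows exactly where differentiation is needed.
for domain, level in domain_levels.items():
    gap = current_grade - level
    flag = "<-- target for differentiation" if gap >= 2 else ""
    print(f"{domain}: {level:.1f} {flag}")
```

With the invented numbers above, the single level would read as roughly 5th grade, even though this student needs no remediation in some domains and is several grades behind in others.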
There are numerous hidden tasks that we undertake while at the grocery store. We process them so quickly or subconsciously that we are not aware of these steps.
As a result, we may overlook these steps while teaching students life skills such as grocery shopping. Consequently, these steps may not be part of the programming or teaching at school, and generalization is left for another day. Yet the purpose of IDEA is, in essence, preparing students for life, including “independent living.”
Step 1 is to administer a baseline pretest, starting with no prompting, to determine whether the student performs each task and how well each is performed. As necessary, prompting is provided and documented in the table (to distinguish prompted from independent completion). For example, I worked with a client who understood the meaning of the shopping list but started off for the first item without a basket or cart. I engaged him in a discussion about how he would carry the items. At one point I had him hold 7 grapefruits, and it became apparent to him that he needed a cart. (I documented this in the table.)
Other issues arose as well: parking the cart in the middle of the aisle; finding the appropriate section of the store but struggling to navigate within it for the item (e.g., at one point I prompted him to read the signs over the freezer doors); and mishandling the money when the cashier announced the total to pay.
Step 2 is to identify a task or sequence of tasks to practice in isolation based on the results of the pretest. For example, this could involve walking to a section of the store and prompting the student to find an item. Data collection would involve several trials of simply finding the item without addressing any other steps of the task analysis.
Step 3 would be to chain multiple steps together, but not the entire task analysis yet. For example, having the student find the appropriate section and then finding the item in the section.
Eventually, a post-test can be administered to assess the entire sequence to identify progress and areas needing more attention.
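The pretest data behind Steps 1 through 3 can be sketched as follows. The step names and prompt codes here are invented stand-ins ("I" = independent, "V" = verbal prompt, "P" = physical prompt), not the actual data sheet from the shopping example.

```python
# Hypothetical task analysis for grocery shopping, broken into steps.
task_steps = [
    "Get a cart or basket",
    "Find the correct section",
    "Find the item in the section",
    "Park the cart out of the aisle",
    "Pay the announced total",
]

# One baseline pretest trial: the prompt level recorded for each step.
# "I" = independent, "V" = verbal prompt, "P" = physical prompt.
pretest = {
    "Get a cart or basket": "V",
    "Find the correct section": "I",
    "Find the item in the section": "V",
    "Park the cart out of the aisle": "P",
    "Pay the announced total": "V",
}

independent = sum(1 for step in task_steps if pretest[step] == "I")
pct = 100 * independent / len(task_steps)
print(f"Independent on {independent}/{len(task_steps)} steps ({pct:.0f}%)")

# Steps not yet independent become candidates for isolated practice
# (Step 2) and then for chaining with adjacent steps (Step 3).
targets = [step for step in task_steps if pretest[step] != "I"]
print("Practice targets:", targets)
```

A post-test would simply be another trial of the same table, so progress shows up as the independent count rising toward the full step count.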
To ensure the IEP team is on the same page as to what mastery of an objective looks like, the person writing the objective can take two steps:
provide an example problem that would be used to assess mastery (and the example problem would have the same language as used in the objective)
provide an example of a response to the example problem cited above that would be considered mastery level work
The graph below is not data. A graph is a representation of summary statistics; it summarizes the data.
The chart below does not show the actual prompts, e.g. what number was shown to Kate, but it does show the individual trials. This is data, with summary statistics at the end of each row. Here is a link to more discussion about data, with an example of a data sheet I use.
The data shown below addresses the student’s effort to solve an equation. Problem 21 is checked as correct and the error in problem 22 is identified. I can use this data to identify where the student is struggling and how to help. NOTE: the math objective would use the same verb as the problem: solve the linear equation.
The excerpt of a data sheet shown below presents trials from a student’s effort to compare numbers.
Data below shows a student’s effort to evaluate integer expressions.
This applies to all areas beyond math. The chart and the linked data sheet above indicate the prompt and the results, with notes. For example, if I am asking my son to put on his shoes, each row of the data sheet is a trial with the outcome and notes.
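The relationship between raw trials and the row-end summary statistic can be sketched in a few lines. The objectives and trial outcomes below are invented ("+" correct, "-" incorrect), one row per objective, mirroring the data-sheet layout described above.

```python
# Hypothetical data sheet: each row is an objective with its raw trials.
# The trials themselves are the data; the percentage at the end of each
# row is the summary statistic derived from them.
data_sheet = {
    "Compare two 3-digit numbers": ["+", "+", "-", "+", "+"],
    "Evaluate integer expressions": ["-", "+", "-", "-", "+"],
}

for objective, trials in data_sheet.items():
    correct = trials.count("+")
    pct = 100 * correct / len(trials)
    print(f"{objective}: {' '.join(trials)} -> {correct}/{len(trials)} ({pct:.0f}%)")
```

A graph built from the percentages alone discards the trial-level detail, which is exactly the distinction between a summary and the data itself.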
Learning is not a singular threshold to be met. There are different levels of learning – a continuum (see photo below taken from the book Teaching Mathematics Meaningfully).
A student demonstrating proficiency (fluency) is far different from a student simply showing some level of understanding (acquisition). I remember learning to drive a car with a stick shift. During acquisition (initial understanding), I was looking down at the pedals and the stick shift as I thought through the steps. It is not surprising that many students who only show acquisition of a math topic soon forget it. Despite this, the acquisition stage is often where math in schools resides.
This extends beyond math fact fluency to all math topics, and students should take the next step and demonstrate maintenance. To do this, I recommend that a curriculum-based assessment be given a couple of weeks after a student initially shows what is considered mastery – successfully performing problems aligned with a given math objective.
Below is an excerpt from the book with an explanation of the topics. I use this text in the math for special education courses I teach at different universities.
Testing (results shown on the Present Levels of Performance page shown below) is often confusing for parents, especially in regard to math. The results are often reported in broad terms, e.g. computation or IQ.
Here is an analogy for the testing (in terms of its usefulness for determining instruction, performance, and achievement). We go to the DMV and take an eye test. That test determines whether we have the physical ability to drive, or what we need to ensure that ability. If our vision is diminished, maybe we need glasses in order to drive.
Passing the vision test does not mean we are ready to drive. It means we have the potential to drive. In order to determine if we can actually drive we take a driver’s test.
Similarly, in order to determine what we can actually do in math we need to take a math test (quiz, checkpoint or some type of curriculum based assessment).
Below is a problem aligned with the Common Core State Standards for Math. I used it as part of a curriculum-based assessment to determine the student’s current ability, or present level of performance. She had all types of standardized testing results on record, but I needed to know if she could pass the actual driver’s test.
When I train new math and special education teachers, I explain that teaching math should be like feeding a hot dog to a baby in a high chair. Cut up the hot dog into bite-sized pieces. The baby will still consume the entire hot dog. Same with math. Our students can consume the entire math topic being presented, but in smaller chunks.
My approach to doing this is through a task analysis. This is very similar to chunking. It is a method to cut up the math into bite-sized pieces just as we would break up a common task for students with special needs.
While waiting for my coffee order at a Burger King, I saw on the wall a different kind of task analysis: a step-by-step set of directions, using photos, on how to pour a soft-serve ice cream cone. I thought it was striking that Burger King can do such a good job training its employees by breaking the task down, yet in education we often fall short in breaking a math topic down.
In the effort to assess student ability, performance factors are likely present. It is incumbent upon educators to mitigate these performance issues in order to assess true ability.
For example, I conducted an evaluation of a middle school student who has ADHD. All of her testing records indicated that she would lose focus during assessments and that this was problematic for testing. Before we met, I surveyed her on her favorite snack (I didn’t know Sour Skittles were a thing) and brought this reinforcer along with a bottle of water. She sat through an entire 1 1/2 hour KeyMath Assessment without incident.
Owls are symbols of intelligence, but the purported reasons are based on their appearance of awareness and their deft hunting skills. It is claimed that this appearance and skill set are confused with actual wisdom.
I find a parallel between the perceived wisdom of owls and the perceived learning of students. Through my years in education, I have seen teachers praised for their student-centered activities. The students may be energetic and on task during an activity, which is often considered a touchstone for learning. What is often missing is an independent assessment to determine actual learning.
Once I was covering a class for a teacher widely praised for his student-centered and multimedia activities. In the class I covered, the students were taking a test. It was clear that the majority of the students were hesitant about their performance. Several were looking around, one pulled out a phone, and a couple looked at other students’ papers. Very few were locked in on completing their test.
I am not suggesting that multimedia or student-centered activities are ineffective. My point is that there is a perception that such activities are inherently effective and reflective of actual learning. There is a difference between being intellectually engaged and being busy. The owl deftly executes action and skill, but that does not indicate higher-level functioning. Conceptual understanding requires more than simply being engaged by an activity. Hopefully this is food for thought.