When Interventions Fail: A Troubleshooting Framework for Behavior Analysts

There's a moment that happens to every behavior analyst, usually around 2 AM when you can't sleep: you realize the intervention you've been implementing with complete fidelity for the past month isn't working. The data doesn't lie, and what it's telling you is that despite your best efforts, you're not helping this kid.

And then comes the spiral: Did I choose the wrong procedure? Is my functional assessment wrong? Am I missing something obvious? Should I have seen this coming? What am I going to tell the parents?

Let me save you some of that 2 AM anxiety: sometimes ABA interventions fail. Not because you're incompetent, not because ABA doesn't work, but because human behavior is complex and our first hypothesis isn't always right.

The question isn't whether you'll ever implement an intervention that doesn't work. You will. The question is: what do you do when it happens?


Why Interventions Fail - The Real Reasons

Before we get into troubleshooting, let's talk about the actual reasons interventions don't work. Not the reasons we tell ourselves at 2 AM, but the real, fixable reasons.

Sometimes your functional assessment was incomplete. You identified a function, sure, but maybe behavior is multiply controlled and you're only addressing one function. Or maybe the function shifts across contexts and you assessed in the wrong setting. Or maybe - and this is the one that stings - the family told you what they thought you wanted to hear during the assessment, and the actual contingencies at home are completely different.

Sometimes the intervention is theoretically sound but practically impossible. It requires a level of consistency that this particular family can't maintain given their actual life circumstances. It needs materials they don't have access to. It demands attention from caregivers who are already maxed out. You designed the perfect intervention for a perfect world, and this family doesn't live there.

Sometimes the intervention is working exactly as designed, but you're measuring the wrong thing. You're tracking the target behavior while missing the fact that three other concerning behaviors have emerged. Or you're so focused on behavior reduction that you're not noticing the child is increasingly anxious, withdrawn, or losing skills in other areas.

And sometimes - this is hard to admit - the intervention itself is causing harm. Maybe your extinction procedure is more punishing than you realized. Maybe your prompting strategy is creating learned helplessness. Maybe your reinforcement system is accidentally making the child dependent on external motivation for things they used to do spontaneously.


The Troubleshooting Framework When Interventions Fail

When an intervention isn't working, most of us jump straight to "what should I change?" But that's backwards. The first question should be: "what's actually happening?"

Start with implementation fidelity, but be honest about it. Not "am I implementing the procedure?" but "is the procedure being implemented the way I designed it across all relevant contexts?" Because if you're implementing perfectly at the clinic but the family isn't implementing at home, or the teacher is implementing a modified version at school, you don't actually know if your intervention doesn't work - you just know the inconsistent version doesn't work.

Next, look at your data collection. Are you measuring what you think you're measuring? I've seen interventions "fail" because the data collector was recording something slightly different than what was defined, or recording at times when the behavior was less likely to occur anyway. Pull some reliability data. Watch some video. Make sure you're actually tracking what you think you're tracking.
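
If it helps to make "pull some reliability data" concrete, here's a minimal sketch of an interval-by-interval agreement check, assuming two observers' records stored as simple lists. The interval_agreement function and the numbers are made up for illustration; the point is how quickly a low agreement score tells you the problem may be your definition rather than your intervention.

```python
# A minimal sketch of an interval-by-interval agreement check. The observer
# records below are hypothetical partial-interval data: 1 = behavior scored
# in that interval, 0 = not scored.

def interval_agreement(observer_a, observer_b):
    """Percent of intervals in which both observers scored the same value."""
    if len(observer_a) != len(observer_b):
        raise ValueError("Both observers must score the same number of intervals")
    agreements = sum(a == b for a, b in zip(observer_a, observer_b))
    return 100 * agreements / len(observer_a)

primary   = [1, 0, 0, 1, 1, 0, 1, 0, 0, 1, 1, 1]
secondary = [1, 0, 1, 1, 1, 0, 0, 0, 0, 1, 1, 1]

print(f"IOA: {interval_agreement(primary, secondary):.1f}%")
# Prints 83.3% here. If agreement is low, fix the definition (or the training)
# before you conclude anything about the intervention.
```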

And here's where graphing your data seventeen different ways comes in - sometimes you're just looking at it wrong. You're using a cumulative graph when you need a rate graph. You're looking at daily totals when you need to see within-session patterns. You're focused on one measure when a different dimension of behavior would tell a more complete story. Before you conclude the intervention failed, make sure you're visualizing the data in a way that actually reveals what's happening.
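
As a rough illustration of how much the picture can change, here's a sketch that graphs the same made-up numbers two ways - raw session counts and rate per minute. The counts, session lengths, and the matplotlib choice are all assumptions for the sake of the example: when session length varies, raw counts can look chaotic while the rate shows a clear decrease.

```python
# Sketch: the same made-up session data graphed as raw counts and as a rate.
# Counts and session lengths are invented for illustration.
import matplotlib.pyplot as plt

counts  = [12, 5, 11, 4, 9, 3, 8, 2]        # responses per session
minutes = [30, 15, 45, 20, 50, 20, 60, 20]  # session length varies a lot

rate = [c / m for c, m in zip(counts, minutes)]  # responses per minute

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(9, 3))
ax1.plot(counts, marker="o")
ax1.set_title("Raw counts: looks like chaos")
ax2.plot(rate, marker="o")
ax2.set_title("Rate per minute: steady decrease")
for ax in (ax1, ax2):
    ax.set_xlabel("Session")
plt.tight_layout()
plt.show()
```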

Then - and this is where it gets interesting - look at what else is happening. What behaviors have emerged that you're not measuring? How is the child's overall affect? Are they still engaged with you, or are they starting to avoid sessions? What are the parents saying in those casual comments before and after sessions that they're not saying in formal check-ins?

Now look at your functional assessment data. Not to defend your original hypothesis, but to genuinely test it. What does the current data suggest about function? Does it match your original assessment, or is there a pattern you missed? Is the behavior occurring at times or in situations that don't fit your original function?

Finally, and this is the step people skip, look at the context changes. What else has happened in this child's life since you started the intervention? Did they start a new medication? Change classrooms? Experience a family disruption? Develop a new skill that changed their access to reinforcement? Sometimes interventions "fail" because the environment shifted and we're still treating the old problem.


When to Adjust vs. When to Abandon Ship

Here's the decision point that keeps people stuck: how do you know if you should tweak the intervention or scrap it and start over?

Adjust when:

  • The intervention is showing some effect, even if it's not as much or as fast as you wanted

  • Implementation fidelity has been inconsistent and you haven't really given it a fair test yet

  • The data suggests you're on the right track but need to modify parameters

  • Families are invested in the approach and willing to keep trying modifications

  • Troubleshooting reveals a clear, fixable problem

Abandon ship when:

  • The intervention has been implemented with high fidelity for an appropriate duration and there's zero movement in the data

  • The child's distress or avoidance is increasing even though behavior might be decreasing

  • The intervention is so demanding on the family that they're starting to resent therapy

  • Troubleshooting reveals that your entire conceptualization was wrong

  • You realize you don't have the resources, expertise, or support to effectively help this client

Sometimes you need to go back to assessment rather than forward with modifications. And sometimes the right move is a referral, not a revision.


The Data Review Process for Failed Interventions

When an intervention isn't working, the way you review your data matters as much as what the data shows.

Don't just look at your target behavior graph. Look at everything: session notes, implementation fidelity data, parent reports, other behaviors you're tracking incidentally, skill acquisition in other areas. You're looking for patterns you might have missed when you were focused on your target.

Look at variability, not just trend. Highly variable data with no downward trend might mean your intervention is working sometimes - you need to figure out when and why. Stable data with no change might mean you're not accessing the relevant contingencies at all.
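
If you want a quick numeric check on that distinction, here's a small sketch that estimates a least-squares slope (trend) and a coefficient of variation (variability) for a hypothetical series of session counts. The describe function, the data, and any cutoffs you read into the output are illustrative assumptions, not clinical decision rules.

```python
# Sketch: a quick numeric summary of trend vs. variability for a series of
# hypothetical session counts.
import statistics

def describe(series):
    """Return a least-squares slope and a coefficient of variation."""
    n = len(series)
    xs = range(n)
    mean_x, mean_y = (n - 1) / 2, statistics.mean(series)
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, series))
             / sum((x - mean_x) ** 2 for x in xs))
    cv = statistics.stdev(series) / mean_y
    return slope, cv

sessions = [3, 14, 2, 13, 13, 2, 14, 3]   # hypothetical per-session counts
slope, cv = describe(sessions)
print(f"slope per session: {slope:.2f}, coefficient of variation: {cv:.0%}")
# A near-zero slope with a high CV is the "works sometimes" pattern -
# go figure out when and why.
```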

Look at the context. Is behavior better or worse in certain settings, with certain people, at certain times of day? If your intervention works perfectly in session but not at home, that's not intervention failure - that's a generalization problem, which is a different issue with different solutions.

Look at what happened right before you started seeing problems. Did you change something else? Did the family change something? Did the child's schedule change? Sometimes the intervention isn't the problem - something else is interfering with an intervention that was working fine before.

And here's where graphing your data seventeen different ways actually matters: try visualizing it differently. Switch from a line graph to a bar graph. Look at session-by-session instead of day-by-day. Separate out different contexts. Sometimes the pattern you need to see only becomes obvious when you change how you're looking at it.
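
Here's one small sketch of that last idea, with invented settings and counts: the same hypothetical sessions graphed pooled together and then separated by setting. A pattern that's invisible in the pooled graph can jump out the moment you split it.

```python
# Sketch: the same invented sessions, pooled vs. separated by setting.
import matplotlib.pyplot as plt

sessions = [
    ("home", 2), ("clinic", 9), ("home", 3), ("clinic", 10),
    ("home", 2), ("clinic", 11), ("home", 1), ("clinic", 9),
]

fig, (ax1, ax2) = plt.subplots(1, 2, figsize=(9, 3), sharey=True)

# Pooled: looks like wild, unexplained variability.
ax1.plot([count for _, count in sessions], marker="o")
ax1.set_title("All sessions pooled")
ax1.set_ylabel("Problem behavior (count)")

# Separated: two nearly flat lines - low at home, high at the clinic.
for setting in ("home", "clinic"):
    ax2.plot([c for s, c in sessions if s == setting], marker="o", label=setting)
ax2.set_title("Separated by setting")
ax2.legend()

for ax in (ax1, ax2):
    ax.set_xlabel("Session")
plt.tight_layout()
plt.show()
```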


Having the Conversation When Interventions Fail

Now for the hard part: talking to families when you need to change course.

First, don't wait until you're absolutely certain the intervention has failed. Have ongoing conversations about progress, and start raising concerns as soon as you notice them. "I expected to see more change by now" is easier to say at week three than "This isn't working" at week eight.

Be honest about uncertainty, but not in a way that destroys confidence. "I thought this would work based on our assessment, but the data is telling me we need to reconsider" is very different from "I have no idea what I'm doing." One is professional uncertainty, the other is professional incompetence.

Explain what you've learned, not just what didn't work. "The good news is, the past three weeks taught us that the behavior is more complex than we initially thought. Here's what we now know..." This reframes the "failed" intervention as part of the assessment process, which honestly, it is.

Present options, not just one new plan. "Here are three directions we could go, and here's what the data suggests about each one." This involves the family in decision-making and makes it clear you're figuring this out together, rather than positioning yourself as the expert who has all the answers but sometimes gets them wrong.

And be really clear about what you're asking from them. Are you asking them to try something completely different? Stick with this a bit longer with modifications? Take a break from this target and focus on something else? Families can handle uncertainty better when they know what's being asked of them.


What Not to Do When Interventions Aren't Working

Let's talk about the mistakes people make when interventions fail, because I've made most of them.

Don't keep implementing something that's not working just because admitting failure feels bad. Your discomfort is not more important than this child's progress.

Don't blame the family for implementation problems without first examining whether your intervention was realistic for their actual life. Yes, fidelity matters. But so does designing interventions that real humans in real circumstances can actually implement.

Don't get defensive when families question the approach. They should question it when it's not working. That's them being good advocates for their child, not them doubting your competence.

Don't change too many things at once. If you modify three variables simultaneously and things improve, you have no idea what actually worked. Make one change, measure it, then make another if needed.

Don't abandon data collection just because the data is depressing. When interventions aren't working is exactly when you need data most, because that's how you figure out what to do next.

And don't beat yourself up like you've failed as a clinician. Every behavior analyst who's been practicing for more than a month has implemented interventions that didn't work. What separates good clinicians from mediocre ones isn't whether they ever get it wrong - it's what they do when they realize they have.


Where You Go From Here

Think about an intervention you're running right now that's not going as well as you expected.

Not a complete disaster, just one where you're not seeing the progress you thought you'd see by now. What does your data actually show versus what you hoped it would show?

If you had to troubleshoot it systematically right now, where would you start? Implementation fidelity? Data collection accuracy? Functional assessment validity? Context changes? Or maybe you need to graph it differently to see what's really happening?

And here's the harder question: if you determined this intervention needed to be abandoned rather than adjusted, what would that conversation with the family sound like?

Because here's the reality: interventions that don't work aren't failures if we learn from them and adjust. They're only failures if we keep implementing them despite evidence that they're not helping - or worse, that they're causing harm.

The best behavior analysts aren't the ones who always get it right the first time. They're the ones who notice when something isn't working and have the courage to change course.


Want weekly insights on navigating the real challenges of behavior analysis practice?

Join our Foundations and Fresh Starts newsletter series. Each week, you'll get honest conversations about the parts of clinical work that aren't in the textbooks - delivered straight to your inbox by people who've been there too.

