What is the difference between Evidence and Research?

This article is Part 1 of our ESSA: Evidence of Effectiveness series. Part 2 (Understanding the different tiers of Evidence of Effectiveness) is coming soon.


I know what you might be thinking: tomato/tomahto, right?

For so long we, as educators, have used the words evidence and research somewhat interchangeably. Sure, we know there’s a difference. But often when we reviewed curriculum in our classrooms, set up pilots, and ultimately decided on an instructional purchase, we sought out ‘research-based’ options to guide our choices and decisions – and assumed strong research-based curriculum meant strong evidence of effectiveness.

But does it?

Do research-based programs still have a place in our classrooms? Do they still have merit? And, more importantly, do they still hold a stake in the accountability measures under the Every Student Succeeds Act (ESSA) and meet the requirements for Evidence of Effectiveness?

Let’s take it from the top.

HOW DID WE GET TO ‘EVIDENCE-BASED’ IN THE FIRST PLACE?

Under the Elementary and Secondary Education Act (ESEA), schools were told to implement programs that were grounded in research. When the No Child Left Behind Act (NCLB) came along, policymakers told schools to make sure that the intervention programs they implemented were grounded in “scientifically-based research.” Now, with NCLB being replaced by the Every Student Succeeds Act (ESSA), ‘scientifically-based research’ is being replaced by the need to implement ‘evidence-based interventions.’

DEFINING RESEARCH-BASED VS. EVIDENCE-BASED

Now you’re probably thinking, ‘Wow, that’s a lot of subtle shifts in a short period.’ You might also be thinking, ‘Why the shift to evidence-based? Are they really so different?’

Today, there are clear delineations between the words research-based and evidence-based in education. These simple, hyphenated words become even more weighty (both figuratively and literally) when schools are required to demonstrate Evidence of Effectiveness of their instructional programs. To best set the stage for understanding these distinctions, I’d like to introduce you to John.

          The Analogy

Let’s imagine a student, John, who struggles to find restaurants he and his family can go to that cater to his specific nutritional needs and allergies. His parents are planning a big family reunion but are torn on where to have it. They thought about hosting it at the house because of John’s food restrictions, but they don’t have the room to entertain everyone. A restaurant or catering venue is the best option. John decides to help his family out by doing research. He visits Yelp!® to find high-quality restaurants in his local area that are recommended by others and that he assumes will cater to his special needs. John sees a lot of great recommendations for restaurants he knows his family will love, but whether they can actually cater to his needs is up for debate. Despite the great research, there’s no evidence that they can meet John’s needs.

Just like John, schools face similar challenges when evaluating instructional programs.

          Research-Based

When we get to the bottom of it, research-based curriculum can still hold some merit. In fact, a program’s research base gives us a starting point for evaluating whether it reflects sound practice and is grounded in research. Does it have merit? Is it a high-quality program? How do I know it’s worth putting in front of my kids day in and day out?

NCLB required programs of merit to be “based on scientifically-based research.” The problem with this vague phrase is that it left too much open to interpretation: a lot of different programs could argue that they were “based on research.” In contrast, ESSA lays out four levels of evidence and requires the use of one of the top three for specific funded activities. In a nutshell, ESSA now requires demonstration that a program actually works in practice in a similar environment, and that schools continue to show each year, through a defined study, that it is working in order to receive continued funding for implementing it.

Drawing on our analogy, the research-based information is the good feedback and supportive reviews John found – similar to the testimonials and case studies schools have relied on when making purchasing decisions.

          Evidence-Based

ESSA seeks to hold programs to higher standards through its four tiers of evidence of effectiveness. It requires programs to have undergone testing and systematic evaluation, demonstrating significantly higher levels of effectiveness than was standard practice in the past.

School leaders can no longer rely on research-based programs simply because they come highly recommended, even when they are recognized as high-quality. The shift from research-based to evidence-based was designed to increase programs’ impact on students and improve the return on educational investment for schools at the local level. It focuses on the requirement that the programs being implemented be proven effective. Once needs have been identified, SEAs, LEAs, schools, and other stakeholders determine the interventions that best serve those needs and demonstrate rigorous, relevant evidence of effectiveness at the local level.

Circling back to the analogy, John can do all the research he wants on restaurants (and find some promising ones too!), but how does he know that a particular restaurant meets his dietary needs without evidence? In an evidence-based approach, John would actually have to go to each restaurant several times to test the menus. He would need to learn which dishes are safe, and then, using evidence gathered through personal experience, he could safely pick the venue. Likewise, once we use reviews to choose a product to pilot, we need evidence that it works in our classrooms.

Schools have special needs too. Seeking evidence that a program can meet those needs in settings with similar demographics, structure, and so on is a step closer to improving outcomes for students. Some programs may be cost-effective and appear strong in the abstract but make little sense to implement in a particular district or school.

          ESSA: Evidence of Effectiveness

ESSA defines strong, moderate, and promising evidence of effectiveness. A fourth category is also listed for programs that lack such evidence but demonstrate a rationale and may currently be under evaluation. In brief, the four tiers are defined as follows:

  1. Tier 1 = Strong: At least one randomized, well-conducted study showing significant positive student outcomes.
  2. Tier 2 = Moderate: At least one quasi-experimental (i.e., matched), well-conducted study showing significant positive student outcomes.
  3. Tier 3 = Promising: At least one correlational, well-conducted study with controls for inputs showing significant positive student outcomes.
  4. Tier 4 = Demonstrates a Rationale: Practices that have a well-defined logic model or theory of action, are supported by research, and have some effort underway by a SEA, LEA, or outside research organization to determine their effectiveness.

Based on ESSA’s Evidence of Effectiveness, it is now clear that research-based programs fall under Tier 4 – the ‘lowest’ tier, which demonstrates a rationale rather than proven effectiveness. (In a subsequent blog, we will explore these levels and what they mean at the local level more in-depth.)

ESSA RAISES THE BAR ON RESEARCH AND EVIDENCE

The bar has been raised and the stakes have changed in how we define (and draw the line between) evidence-based and research-based – and ultimately what we (schools and districts) are now being held accountable for. With ESSA, we can’t use the words as interchangeably as we may have been.

For school leaders, this is important on a few different levels.

  • First and foremost, with ESSA being the law of the land, implementing programs that demonstrate strong, moderate, or promising evidence of effectiveness (Tiers 1–3) is mandatory – and the timeline for implementing programs and demonstrating a higher level of effectiveness is just around the corner (2019).
  • Second, I know that your students are always at the forefront of your mind, and your commitment to their success and your teachers’ is what probably keeps you up at night. I know you seek the best solutions with the limited funds you have, making sure that the programs you have in place demonstrate high levels of effectiveness and have a strong impact on student achievement outcomes. When it comes to funding, interventions applied under Title I, Section 1003 (School Improvement) are required to have strong, moderate, or promising evidence (Tiers 1–3) to support them. All other programs under Titles I–IV can rely on Tiers 1–4.
  • Third, implementing programs proven to be at the highest level of effectiveness (Tier 1: Strong Evidence) means they have demonstrated a statistically significant effect on improving student outcomes – and they can save the school money in the long run.
  • Fourth, demonstrating adherence to ESSA’s Evidence of Effectiveness is what will get you those funds. As I mentioned previously, ESSA states that instead of research-based programs, we need to use evidence-based programs. Some ESSA programs – including some competitive grants and Title I, Section 1003 – require the use of “evidence-based” programs that meet higher levels of evidence. That is why it is so important for schools to start making sure their programs meet the top requirements set forth by ESSA. You should look through the “Non-Regulatory Guidance: Using Evidence to Strengthen Education Investments” for best practices and recommendations on ensuring your school adheres to ESSA accountability.
  • Fifth, implementing programs with strong Evidence of Effectiveness can be politically advantageous. After all, programs with strong evidence are more defensible and can be used more flexibly within the district.

WHAT ARE THE NEXT STEPS?

First, make sure to read the upcoming article on what evidence requirements mean to local schools and explore more in-depth the different levels of Evidence of Effectiveness. Second, I highly recommend exploring your state’s ESSA plans and keeping informed through your state and national administrator organizations.