Beginning in spring 2020, reports developed by EdReports.org will be using an updated version of our review tools.
The tools used to create EdReports’ mathematics and English language arts reports have been in use for more than five years. As a learning organization, we regularly examine our review process, evidence guides, and review criteria to determine what, if any, updates need to be made.
Beginning in the summer of 2019, EdReports launched a formal effort to collect feedback on our review tools (review criteria, corresponding evidence guides, and the overall review process) as part of our commitment to continually learn and improve our reports. We have synthesized feedback from the field and have applied strategic updates to our tools. Reviews utilizing the revised tools will begin in spring 2020, with completed reviews launching early next year.
We have put together a list of FAQs that provide insight into our revision process as well as address potential impact on the education field and publishers.
A: EdReports “tools” consist of our review criteria, corresponding evidence guides, our training, and the overall review process. These tools are intended to be applied by trained educator reviewers to determine the degree to which instructional materials meet college and career-ready standards and usability indicators. The goal of our reports is to empower schools and districts with evidence to make critical instructional materials selection decisions.
A: EdReports’ theory of action is that by providing credible information (free, public reports on our website) we will help drive demand for quality instructional materials. As more states, districts, and schools demand quality materials, this in turn will influence publishers to improve the quality of curriculum which will ultimately improve student learning. Consistent with our mission, we are examining how we can make these reports even better and more helpful to the field.
We published our first reviews more than five years ago. Since then, we have received constructive feedback from the field and our reviewers about how to improve our reports. We continually track potential improvements to our review processes and products, and we are always exploring, listening, and reflecting to determine what type of maintenance may be necessary.
In particular, we have made significant changes to indicators focused on usability (gateway 3) to respond to how much materials have improved since our organizational launch in 2015. Then, only 1 of the 19 programs reviewed met expectations for alignment. Now, educators have more aligned programs to choose from, and we realized we could provide more information about product usability to support a district's selection and implementation process.
We have revised the tools—we have not recreated them.
A: The work on these revisions began in 2019 and predates the current health crisis. In order to meet emerging needs from districts, EdReports will begin providing more information in summer 2020 about the components of curriculum that support critical needs, such as remote learning and digital components, for more than 200 existing grade-level materials that meet expectations for alignment. This is in response to the growing need from school districts for more precise information and guidance about specific characteristics of materials (e.g., ease of implementation, student- and parent-facing components, availability of digital components) so that districts can learn more about these components during their curriculum adoptions.
A: Revisions to indicators focusing on questions of usability (gateway 3) have been implemented across all content areas (mathematics, ELA, science). Revisions focusing on questions of standards alignment (gateways 1 and 2) have been made in our K-12 English language arts and K-8 mathematics tools.
Our science and ELA foundational skills review tools are relatively new and therefore do not necessitate revisions at this time.
A: The goal of this tool revision is to minimize disruption for the field and continue to provide high-quality evidence of the greatest value. We have incorporated our learnings and reflections from conducting several hundred reviews, as well as feedback from the field about what information is the most valuable.
Revisions are based primarily on feedback from educators and researchers, including:
A: Gateway 3 focuses on the question of usability: Are the instructional materials user-friendly for students and educators? Materials must be well designed to facilitate student learning and enhance a teacher’s ability to differentiate and build knowledge within the classroom. In order to be reviewed and attain a rating for gateway 3, the instructional materials must first meet expectations for alignment to college and career-ready standards (gateways 1 and 2).
When we released our first mathematics reports in 2015, only one program met expectations for alignment, which meant gateway 3 was not applied to the majority of our early reviews. A similar pattern emerged in early ELA materials reviews. In the past five years, more materials have met criteria for standards alignment. As a result, gateway 3 is becoming a much more important aspect of our reviews. In analyzing research and listening to feedback, we realized we could provide more valuable information to the field around these criteria.
Our revision of gateway 3 addresses details about program format, student populations that require support for language acquisition, learner variance, and teacher supports to help understand standards and how to utilize the materials. The tool will include a ‘Technology Criteria Form’ that allows for the collection of a greater level of detail on features such as data privacy and interoperability with learning management systems.
In addition, gateway 3 changes increase clarity and consistency across all content areas. Primary changes include:
The revisions also clarify student populations and rename the criterion “Differentiated Instruction” to “Student Supports.” Under “Student Supports” we:
Created specific indicators for Learner Variance that look for:
Created specific indicators for English Learners (EL) that look for:
Created a Technology Criteria Form* that will be given to a publisher at the time of materials purchase and posted alongside the report with publisher’s response.
* Due to emerging needs from school districts in response to the COVID-19 health crisis we are revising our Technology Criteria Form to include more information about the components of curriculum that support distance learning, including digital components. The revised form will be available early summer 2020.
A: The standard first step for all EdReports content review expansions is a listening and learning tour. EdReports must first learn about the market and answer key questions about the status of standards, what the need is, and who to engage for feedback. This tour includes researchers, nonprofits, publishers, states, districts, and classroom educators to ensure that we gather feedback and that people feel heard and clear on next steps. The same approach applied to the tool revision process.
A: EdReports considered this question at length, as we have completed 700 grade-level and course reviews in the past five years, with some copyrights going back to 2008. We consider this tool revision scheduled maintenance rather than a radical new approach. We stand by all of our reviews and believe that they provide a host of important evidence. This set of revisions to the tools provides more fine-grained information to help districts make choices.
For these reasons, EdReports will not be conducting retroactive reviews as part of the rollout of the revised tools.* However, EdReports’ established policy is that we stand ready to re-review materials when they have been substantively updated.
Reviews utilizing the new tools begin this spring with the first round of reports published early in 2021.
* Due to emerging school district needs arising from the COVID-19 health crisis, EdReports will provide additional information this summer about critical components of curriculum, such as remote learning and digital resources, for more than 200 existing grade-level materials that meet expectations for alignment.
A: We stand ready to re-review materials when they have been substantively updated. Our desire is to make these decisions collaboratively with publishers. We ask publishers to indicate where the changes are and discuss whether they would have an impact on select indicators or the review overall. Depending on the answer, we may bring together a new review team to re-review the materials in their entirety or to review for just a few indicators.
A: No. In order to ensure an equitable process for all publishers, programs scheduled for review after May 2020 will be using the revised tools. The exception to this policy will be for materials that are part of a K-8 series where a portion of the series was reviewed on the initial tool. The remaining grades in the series will be reviewed using the same tool to ensure consistency throughout the program.
A: Differentiation indicators are still present, but they are now labeled Student Supports and include more expansive coverage. For example, we include indicators that are more specific to student needs, such as those learning English.
A: We have made minor changes to our criteria focusing on standards alignment (gateways 1 and 2) and feel confident that overall alignment ratings will remain consistent. The bulk of our changes are focused on usability (gateway 3), and we will look at new versions (copyrights) of programs with our revised tool as they become available.
A: We are always cautious to make changes because it's important for school districts and other stakeholders to know that our reviews are accurate and stable. We understand that consistency is important, but we will not use consistency as an excuse not to innovate. We are a learning organization that continues to evolve and listen. Our intention is to revise our tools frequently enough to reflect research consensus and ensure our reviews continue to be relevant to the field, but to do so without causing confusion.
A: Regarding English Learners, it is important to note that the research and evidence for student needs in curriculum is still developing. We are proud of the first steps we have taken around new indicators that provide information about English Learner supports in our review tools; however, we are mindful of not forcing a consensus regarding the research, and we respect the emerging evidence-based approaches currently being discussed. At EdReports, our role is not to impose conclusions that have yet to be established. Rather, once consensus is reached, we aim to partner with educators who know materials and think about what the appropriate review criteria are to assess curriculum.
A: Our mathematics tools were the first to be released when we launched EdReports in March 2015. In the past five years we have been listening to and collecting feedback from the field while simultaneously monitoring changes in mathematics programs. The changes to gateways 1 and 2 in mathematics address coherence (gateway 1) and the mathematical practices (gateway 2).
Gateway 1 revisions focus on more detailed information on coherence. We achieve this through these key changes:
Gateway 2 revisions look at all mathematical practices in more depth. We achieve this with these key changes:
Gateways 1 and 2 for our high school mathematics review tool are unchanged.
A: Our revisions to K-12 ELA review tools focus on bundling indicators to make the resulting reports more streamlined.
In gateways 1 and 2, we are making these key changes:
A: Our ELA foundational skills review tools were launched in 2019, and we feel confident in their current status. As with our other content areas, we want our foundational skills reviews to be out in the field long enough for users to provide feedback on what’s working and what can be improved.
A: Middle school science reviews were launched in February 2019 and K-5 reviews will be released in the second quarter of 2020. No revisions will be made to gateways 1 and 2 of the review tools as we feel confident about their current status. As with our other content areas, we want our science reviews to be out in the field long enough for users to provide feedback on what’s working and what can be improved. At this point in time, the current feedback reveals that the science review tools are producing quality evidence that is helpful to the field.
However, we will be updating usability criteria (gateway 3) for our science tools to align with changes being made to our ELA and mathematics tools.
A: No. As a learning organization, we treat this revision as part of a regular cycle of learning and reflection; it does not mean our previous tool was wrong. We stand by our reviews and believe that the indicators provide high-quality information to the field. However, we also knew it was time to reexamine the market and revise our indicators to provide more fine-grained information to help districts make choices.
A: Yes, the foundation of our tools remains the same. Information about standards alignment is focused in the first two gateways. Usability information is found in gateway 3. Although there are score point shifts within some criteria, overall score totals have not changed in gateways 1 or 2. We always recommend looking at the details of a report after first looking to see how a program aligned to the standards.
For additional questions please contact email@example.com.