Information Regarding the Arkansas PLC at Work® Project Contract
2017–2024
Celebrating the PLC at Work® Project in Arkansas
Visit the Solution Tree Blog to read our special bravo to the Arkansas educators from the
98 schools and 58 districts that participated in the PLC at Work Project.
Success Across Arkansas: Results From Cohort Schools
The Journey of a Professional Learning Community: Success Across Arkansas shares dozens of success stories from schools, documenting the positive
impact that the PLC at Work process has on student achievement. For example:
Bayyari Elementary: Fourth-grade math scores increased by 11.4%, and reading scores increased by 16.9%.
Magazine School District: Eighth-grade reading scores improved by 24.8%, and fourth-grade math scores rose by 22.1%.
Marked Tree School District: Tenth-grade reading scores increased by 23%, and seventh-grade English scores by 15%.
In addition, read the Evidence of Excellence stories for Lake Hamilton Junior High and Camden Fairview High School. Notable achievements include:
Lake Hamilton Junior High achieved the best growth scores in English language arts, recognized by the University of Arkansas Office for Education Policy. Watch the school's video highlighting its Model PLC at Work journey.
Camden Fairview High School achieved its highest graduation rate of 91%, readiness in reading scores increased from 12% to 17%, and behavior issues decreased by 14%.
Contact us to start your journey to be a changemaker for students.
Positive Impact on Cohort Schools
Education Northwest Independent Evaluation
For nearly 60 years, Education Northwest has partnered with communities across the United States to advance education through research and evaluation. Its external evaluation of ten Cohort 1 schools in the PLC at Work Project in Arkansas found that after only two years of implementation, students were showing positive growth in ACT Aspire test scores. The greatest impact was seen in student math achievement, with PLC at Work participants seeing an overall positive impact on math ACT Aspire growth (Hanson et al., 2021) and gains three times higher than those seen from National Board certification and the eMINTS Comprehensive Program (Torres & Hanson, 2020; What Works Clearinghouse, 2020). These findings are important because the PLC at Work process focuses on schoolwide transformation and is based on a simple proposition: improved teacher collaboration, trust, and collective responsibility will lead to improved instruction, improved student engagement, and, ultimately, increased student achievement. The external evaluation is designed to establish Every Student Succeeds Act (ESSA) Tier II evidence.
On behalf of Education Northwest, Sun Young Yoon, Principal Researcher, commented on June 17:
"In 2017, Solution Tree Inc. partnered with the Arkansas Division of Elementary and Secondary Education to lay the foundation for expanding Professional Learning Communities at Work® (PLC at Work®) in Arkansas. After three years, Cohort 1 schools successfully implemented PLC at Work. This led to improved communication, trust, and collective responsibility among educators and, in turn, improved student engagement and learning. PLC at Work’s positive impact on student achievement test scores, particularly in math, validated educators’ perceptions about improved student learning and engagement.
"In 2019, Education Northwest conducted an evaluation of PLC at Work® that provides comprehensive information about the implementation and impact of the program in Arkansas. The study describes the implementation processes and challenges faced at the school level, and the academic improvements made by students using a mixed-methods approach that combines quantitative data with qualitative insights from interviews and surveys to provide a more holistic understanding of the relationship between PLCs, teacher practices, and student achievement. Our evaluation also examined implementation fidelity to explicate the mechanisms through which PLC at Work® impacted school leadership and teacher efficacy, ultimately leading to student achievement gains in mathematics during the study period."
Read the full independent evaluation, validating the success of the PLC at Work Project in Arkansas, on the Education Northwest website.
Key findings include:
Students in PLC at Work Cohort 1 schools showed improved academic achievement and higher levels of engagement.
Participation in PLC at Work in Arkansas has had a positive impact on achievement growth in Arkansas, particularly in math.
Cohort 1 schools have seen positive changes in student engagement, including fewer suspensions and expulsions.
All PLC at Work Cohort 1 schools reported positive changes in instructional practices, which led to improved learning opportunities for students.
Educators in PLC at Work Cohort 1 schools improved their culture of collaboration and collective responsibility for ensuring all students learn at high levels.
All educators saw growth in communication, trust, collective responsibility, and efficacy for student learning.
Implementation of key elements of PLC at Work was associated with larger growth in educator trust, collective responsibility, and the creation of schoolwide systems of interventions and extensions.
All PLC at Work Cohort 1 schools received substantial support from school leaders and Solution Tree associates and were able to fully implement the program.
All schools established the core components of PLC at Work.
Customized supports from Solution Tree associates helped schools meet their implementation goals.
Widespread support was necessary for implementing and sustaining PLC at Work.
As a commitment to continuous improvement, Solution Tree is working to complete six independent evaluations by the end of 2024 to assess the effectiveness of the PLC at Work process at specific school and district sites. While some of these evaluations are external to Arkansas, the findings will offer feedback and help validate schools' and districts' positive results in the state. For example, read Policy Analysis for California Education's first-year insights for the California Collaborative for Educational Excellence's Intensive Assistance Model.
Model PLC at Work Schools and Districts
Setting the Record Straight About Solution Tree and the PLC at Work Project
Solution Tree is the trusted partner of more than 8,000 schools and districts in 50 states and stands behind the work we have done in Arkansas. The following sections address the misinformation about the PLC at Work Project. More detailed information related to the PLC at Work Project in Arkansas and further evidence of effectiveness can be found at https://arkansas.solutiontree.com.
False: PLC at Work has no evidence of positive school results.
Fact: Hundreds of U.S. schools implementing the PLC at Work process have achieved positive results.
In Arkansas alone, dozens of schools and districts implementing the PLC at Work process have shared their successes in the ADE (2023) book, The Journey of a Professional Learning Community: Success Across Arkansas. With the schools’ and districts’ permission, we’ve included three examples of evidence-based results here:
● Bayyari Elementary: In 2022, Bayyari was recognized as an Overcoming the Odds school and a School on the Move towards Excellence. Fourth-grade students grew in math by 11.4% and in reading by 16.9%. (Cohort 4, 2020)
● Magazine School District: Eighth-grade students grew in reading by 24.8%, and fourth-grade students grew in math by 22.1%. (Cohort 5, 2021)
● Marked Tree School District: Tenth-grade students grew in reading by 23%, and seventh-grade students grew in English by 15%. (Cohort 6, 2022)
Beyond the Arkansas Department of Education and schools’ own data, Solution Tree engaged a highly respected independent firm to objectively evaluate the PLC at Work implementation in Arkansas. For nearly 60 years, Education Northwest, located in Portland, OR, has partnered with communities across the United States to advance education through research and evaluation. Its external evaluation of Cohort 1 of the PLC at Work Project in Arkansas found that after only two years of implementation, students were showing positive growth in ACT Aspire test scores. The greatest impact was seen in student math achievement, with PLC at Work participants seeing an overall positive impact on math ACT Aspire growth (Hanson et al., 2021) and gains three times higher than those seen from National Board certification and the eMINTS Comprehensive Program (Torres & Hanson, 2020; What Works Clearinghouse, 2020). These findings are important because the PLC at Work process focuses on schoolwide transformation and is based on a simple proposition: improved teacher collaboration, trust, and collective responsibility will lead to improved instruction, improved student engagement, and, ultimately, increased student achievement. The external evaluation is designed to establish Every Student Succeeds Act (ESSA) Tier II evidence.
Beyond empirical student achievement, the Education Northwest study showed positive changes in schools that complement academic growth. In addition to achievement gains, educators implementing the PLC at Work process noted improved attendance, decreased behavioral referrals, and decreased special education referrals (Education Northwest, 2020).
Further evidence of effectiveness in Arkansas includes:
● Arkansas schools in PLC at Work Cohorts 1 (4.44 points), 2 (3.75 points), 3 (5.5 points), 4 (3.75 points), 5 (8.44 points), and 6 (4.32 points) all showed more recovery in school letter grade points from pre-COVID (2018–19) to 2022–23 than non-Cohort schools in the state (3.20 points). (Arkansas Department of Education Data Center, 2023)
● Education Northwest’s ESSA Tier II–aligned research study on PLC at Work implementation in Arkansas from 2016–17 to 2018–19 found a positive impact on teachers and students after only two years (Hanson et al., 2021).
● Schools participating in PLC at Work implementation were found to have improved communication, trust, and collective responsibility among educators and, in turn, improved student engagement and learning (Torres et al., 2020).
● African American students had higher growth in ELA than their peers (Hanson et al., 2021).
● Several student groups realized higher math achievement by statistically significant margins (Hanson et al., 2021).
● PLC at Work had an overall positive impact on math ACT Aspire growth (Hanson et al., 2021).
● PLC at Work had a positive impact for specific student groups on math ACT Aspire growth (Hanson et al., 2021).
● PLC at Work had a larger impact on math achievement gains than other professional learning programs (Torres & Hanson, 2020).
AllThingsPLC.info features over 600 schools and districts that have demonstrated evidence of consistent increases in student achievement. Schools and districts that have demonstrated significant student achievement over three years are recognized as Model PLCs at Work. Promising Practices schools are those that have demonstrated at least one year of improved results. This list is updated on a continuous basis as more schools demonstrate improvement as a result of implementing the PLC at Work process. As of September 2024, Arkansas has thirty-nine Model PLC at Work schools and districts and eight Promising Practices schools.
False: The University of Arkansas Office for Education Policy (OEP) study showed the PLC at Work process has no statistical significance.
Fact: The OEP study’s design was flawed in numerous ways, leading to inaccurate and misleading results.
The University of Arkansas OEP study (Barnes & McKenzie, 2024a, 2024b) has sparked a multitude of questions regarding the state’s investment in the PLC at Work process. “Statistical significance” is often wrongly interpreted by general audiences to be synonymous with “significant” or “meaningful.” This is inaccurate. “No statistically significant differences in student performance” doesn’t necessarily mean there’s no effect. It simply indicates that the study itself couldn’t definitively show one. Therefore, Solution Tree has assembled a group of internal and external research experts (with a combined one hundred-plus years of experience) to review the study’s methodology and evaluate the validity of the findings. In our analysis, the following aspects of the study deserve closer scrutiny: statistical significance, matching, sampling, baseline equivalence, study design, weighted achievement, and value-added models (VAMs). One aspect that is highly concerning is how the study calculated “Weighted Achievement Score.”
The Weighted Achievement Score is a measure the University of Arkansas’s Office for Education Policy uses to determine how well a school scores on annual standardized tests for English language arts and mathematics. However, in the University of Arkansas OEP study, the selection of the Weighted Achievement Score metric was biased against schools participating in the PLC at Work Cohorts, given the study’s observation: “PLC at Work schools enrolled a statistically significantly greater percentage of students who are Hispanic/Latino, are Eligible for Free or Reduced-Price Lunch, and are English language learners than schools not selected to be PLC at Work schools” (Barnes & McKenzie, 2024b, p. 11). The OEP’s website discloses potential bias from the Weighted Achievement Score: “Schools serving more advantaged students typically receive ‘good’ scores because a high percentage of their students pass, while schools serving a larger percentage of students who live in poverty, participate in special education, or are learning English often receive lower scores because a higher percentage of their students are not yet performing at grade level.”
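The mechanics of the disclosed bias can be made concrete with a toy calculation. The per-student weights below (0, 0.5, 1, and 1.25 points by performance level) and the enrollment mixes are invented for illustration only; they are not OEP's published formula or data from any real school:

```python
# Hypothetical sketch of how a point-in-time weighted achievement score
# penalizes schools serving more students who start below grade level,
# even when both schools produce identical growth. The weights and
# enrollment mixes below are assumptions for illustration only.

# Assumed points awarded per student at each performance level.
WEIGHTS = {"in_need": 0.0, "close": 0.5, "ready": 1.0, "exceeding": 1.25}

def weighted_achievement(counts):
    """Average points per tested student, scaled to a 0-100 range."""
    total_students = sum(counts.values())
    points = sum(WEIGHTS[level] * n for level, n in counts.items())
    return 100 * points / total_students

# School A: a more advantaged intake. School B: many more students
# starting below grade level. Suppose both schools moved every student
# up one performance level from the prior year (identical growth).
school_a = {"in_need": 10, "close": 20, "ready": 50, "exceeding": 20}
school_b = {"in_need": 40, "close": 30, "ready": 25, "exceeding": 5}

print(f"School A: {weighted_achievement(school_a):.1f}")
print(f"School B: {weighted_achievement(school_b):.1f}")
# A status-based metric ranks A well above B regardless of equal growth.
```

A growth-based metric would rate these two hypothetical schools identically; the status-based score separates them only by who they enroll, which is the bias the OEP website itself discloses.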
Out of professional courtesy, the external researchers (from prestigious, well-respected universities or organizations) requested that their names and organizations not be shared but permitted us to share their responses. Here were some of their comments.
● Although I believe that the authors worked as best they could and objectively with the available data, unfortunately both the data and the design were too limited to draw meaningful conclusions regarding the efficacy of the PLC intervention.
● The PLC schools were at baseline less high-achieving and higher in underserved subgroups than were the comparison schools (the researchers did what they could with propensity score matches but those quantitative adjustments can’t neutralize the effects of more challenging school environments).
● Many of the comparison schools likely were implementing some strategies similar to the intervention model or ones actually offered by Solution Tree in the same year or previously.
● School-level rather than individual student-level data were used, thus precluding tracking the achievement trajectories of treated students over time.
● No data were available on teacher mobility or attrition, a potentially significant factor given that the teacher is the locus of PLC school impacts.
● No perception data were available to determine program impacts beyond high-stakes achievement scores. Such data could have told a much more compelling story of the model’s successes (or failures) relative to the quantitative outcomes.
● Overall, the achievement results are highly mixed indeed, showing significant effects in both directions but mostly inconclusive outcomes.
● The study did not account for the length of time that schools were in the program. As table 1 shows, the 90 schools evaluated were at very different places in their movement through the program.
● The study did not account for the impact of the COVID year in the trajectory of impact.
● The sample includes both school-based and district-based program implementation models.
● The sample includes both primary and secondary schools which may obscure differential impact.
● Other important factors on school-based scores are not controlled for or explored in the models; factors that could suppress the identification of a significant effect.
● The analysis utilized a composite of the ELA and MATH subscales; specific impacts are not explored.
● The use of weighted achievement may not be the best way to demonstrate the impact of the program. Given the large discrepancy in the sample sizes (and the known variation in types of schools in both), the use of effect size change may provide a better view of how the program is impacting schools.
● In the context of the publicly available data, non-significant results are not surprising. Without accounting for students’ demographics, student prior achievement, and the nesting of students within classrooms, schools, and districts, the level of change attributable to involvement in PLC at Work would likely not be detectable.
● A key limitation is the exclusion of some variables from the matching process, potentially leading to inaccurate matching of the comparison group. This can result in unreliable conclusions, as unobserved confounders may influence both the selection of the PLC program and student outcomes.
● The sampling method for the “All Other AR Schools” group is problematic. Comparison group schools that may have undergone similar interventions could experience dilution effects, reducing the probability of detecting an effect even if one exists.
● Without robust matching and sampling techniques, it becomes challenging to justify the assumptions underlying the methods used. For instance, an event study assumes that in the absence of the intervention, both groups would have similar outcomes. This assumption is difficult to uphold if there are indications of mismatch or dilution effects in the groups.
● The use of weighted achievement scores without adjusting for demographic factors introduces bias against schools serving disadvantaged populations, potentially underestimating the program’s effectiveness.
● The reliance on value-added models (VAMs), which have been criticized for their instability and potential bias, further complicates the interpretation of the results.
● Future evaluations should incorporate more rigorous matching, sampling adjustments, and longitudinal designs to provide a clearer and more accurate assessment of the PLC at Work program’s impact on student outcomes.
False: There is an audit of Solution Tree on the PLC at Work Project.
Fact: Arkansas Legislative Audit is auditing ADE’s spending on the PLC at Work Project.
Arkansas legislators requested an audit during the Legislative Joint Auditing Committee. Arkansas Legislative Audit is auditing the state’s and educational entities’ spending on professional development services. The services Solution Tree has provided to schools and districts throughout the seven years of the PLC at Work Project contract are well documented and have been consistently identified in monthly invoices to ADE.
False: ADE pulled the contract due to the ALA audit.
Fact: Solution Tree requested that its contract be withdrawn from legislative review.
Solution Tree has emphasized two main reasons for requesting the contract be withdrawn from legislative review. First, the contract was no longer primarily focused on Professional Learning Communities at Work, the highly acclaimed school-improvement process of which Solution Tree is the nation’s largest implementor. Second, to implement professional development statewide with fidelity, it is critical to have the support and full cooperation of all stakeholders.
Further, the PLC at Work process provides a strong foundation for the LEARNS Act, but PLC at Work cannot satisfy all LEARNS Act requirements (like tutoring, school choice, etc.). In Year 7, Solution Tree worked with ADE to ensure that the services delivered through the PLC at Work Project aligned with and supported Governor Sanders’s priorities.
False: The 2024 procurement process was manipulated in favor of Solution Tree.
Fact: Solution Tree achieved the highest score of eight vendors from a group of ADE-selected, independent evaluators.
Solution Tree aligned its response to the January 8 Arkansas Department of Transformation and Shared Services Office of State Procurement Request for Proposal (RFP) with ADE’s request for a statewide “system of professional learning.” Specifically, in 2.2 Objective & Goals, the RFP states: “Pursuant to Arkansas Code 6-20-2305(b)(5), the Arkansas Department of Education seeks a Contractor to continue and expand a research-based, cohesive, synchronized system of professional learning, such as professional learning communities, to provide support for adult and student learners across the State” (p. 6). Therefore, based on the RFP’s Objectives & Goals, Solution Tree’s proposal was intended to expand the PLC at Work Project for an additional seven years.
Out of the eight prospective vendors who responded to the RFP, Solution Tree scored a perfect technical proposal score of 700. ADE selected nine total evaluators, external to the vendors and agency, who were a mix of teachers and administrators. Due to scheduling conflicts, five evaluators scored the eight proposals.
False: Solution Tree’s estimated bid based on the RFP was several times higher than other vendors.
Fact: Solution Tree’s bid aligned with seven years of work across the state, not 500 days of professional development. Solution Tree’s bid had a similar average per-day cost to other bids, and other bids from vendors that did not make it all the way through the process had comparable total costs.
Solution Tree’s contracted cost of $99.4 million across seven years aligned with past experience working with ADE to implement the PLC at Work process across the state.
In 2.4 General Requirements, the RFP states, “The Contractor shall provide onsite professional development and/or coaching on at least 500 occasions each year” (p. 6). However, given the implementation of the PLC at Work Project in the state since 2017, and the aforementioned costs associated with each additional year, Solution Tree’s response aligned with the RFP’s Objectives & Goals: “to continue and expand a research-based, cohesive, synchronized system of professional learning, such as professional learning communities, to provide support for adult and student learners across the State.” Therefore, Solution Tree’s response was intended to expand the work of the schools currently part of the three-year implementation of PLC at Work Project, while enabling additional schools to join the Project.
Solution Tree’s bid was higher than other bids because we built a full “system of professional learning” rather than merely offering “500 occasions each year” of professional development. Had we ignored the need for a “system of professional learning” and simply offered 500 days per year, our bid would have been about $24.9 million. This is closer to or less than other vendors’ proposed costs (such as Bailey Education Group’s $25.97 million bid). NIET’s $70 million bid would have been the most expensive had the company advanced to the discussions phase of the RFP process with a higher technical proposal score. Instead, media reports focused on Solution Tree “narrowly beating” MGT of America LLC and its $18.9 million bid.
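The per-occasion arithmetic behind these comparisons can be checked directly. The sketch below treats every bid as covering the RFP's 500 occasions per year across all seven years, which is a simplifying assumption for comparison; the bid totals are the publicly reported figures quoted above:

```python
# Back-of-the-envelope per-occasion comparison of reported bid totals,
# assuming (as a simplification) that each bid covered the RFP's
# 500 occasions per year over the full seven-year term.
OCCASIONS_PER_YEAR = 500
YEARS = 7

bids = {
    "Solution Tree (500-day-only scenario)": 24_900_000,
    "Bailey Education Group": 25_970_000,
    "NIET": 70_000_000,
    "MGT of America": 18_900_000,
}

total_occasions = OCCASIONS_PER_YEAR * YEARS  # 3,500 occasions
for vendor, total in bids.items():
    print(f"{vendor}: ${total / total_occasions:,.0f} per occasion")
```

Under that assumption, the $24.9 million scenario works out to roughly $7,100 per occasion, in the same range as the Bailey bid, which is the sense in which the per-day costs were comparable.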
False: The PLC at Work Project contract was $149.9 million.
Fact: The PLC at Work Project contract was $83.9 million over seven years.
The initial PLC at Work Project contract with ADE from 2017 was for $4 million. While this was indeed a no-bid contract, ADE’s requests to add to that amount across the seven years of the contract are well documented in annual legislator-approved, ADE-executed contract amendments and memos that bring the total amount of the PLC at Work Project contract to $83.9 million, not the $149.9 million previously listed on Transparency.Arkansas.gov and quoted frequently in the media.
Transparency.Arkansas.gov listed a $66 million contract that was initiated under the prior administration but not executed, according to Education Department spokeswoman Kimberly Mundell. We believe this to be the source of the $149.9 million number being attributed to the PLC at Work Project.
False: Arkansas legislators were unaware of the cost of the PLC at Work Project.
Fact: Arkansas legislators reviewed and approved the contract every year.
The initial PLC at Work Project contract with ADE from 2017 was for $4 million. Annually, ADE requested to add to that amount across the seven years of the contract. These requests are well documented in annual legislator-approved, ADE-executed contract amendments and memos.
As the PLC at Work Project added cohorts annually in accordance with ADE’s original design and requests, the annual contract grew to $12.5 million. Due to the proven successes and positive results schools were experiencing across the state, ADE requested that Solution Tree expand the contract’s scope from $12.5 million annually to $14.5 million in Year 5 of the project and to $16.5 million in Year 6. This was done at the request of ADE, and the additional funds were approved through a legislative process. The annual renewal amendments clearly document the contract amounts approved by ADE and legislators. Starting in a Year 3 memo, ADE established that the total cost of the PLC at Work Project could be up to $100,000,000 across seven years.
Each year, the contract was signed by ADE and submitted to ALC for its annual review. For seven years, the contract was signed and approved by the state for the amounts the state designated. Solution Tree provided professional development products and services to Cohort schools in alignment with ADE’s established yearly costs. At the end of Year 7, the ADE-contracted work totaled $83.9 million.
False: Solution Tree requires consultants employed in schools to purchase professional development resources and services, which is a conflict of interest.
Fact: Solution Tree engages educators who have tremendous success in specialized areas in their schools as independent contractors to conduct professional development outside their districts.
Solution Tree has a thorough vetting process for PLC at Work professional development associates. In addition to presentation requirements, associates must demonstrate at least three years of continuous results due to their leadership. Educators who are still employed in schools and districts are hired as independent contractors, not Solution Tree employees. It is their responsibility to abide by school or district contract requirements and take the appropriate time off to conduct professional development services for Solution Tree. Solution Tree’s mission is “Advance the work of our authors”—work that is research based and results driven. Thus, Solution Tree associates act as representatives of our authors’ work that is proven to improve learning outcomes for students.
We believe Arkansas and other states should feel proud of the school leaders who can share their schools’ successes and guide other educators to improve student achievement in their own classrooms.
False: Solution Tree funded the PLC Practitioners Program.
Fact: ADE recruited expert educators to sustain the work.
Arkansas teachers or leaders who shared their PLC at Work expertise as part of ADE’s PLC Practitioners program were neither engaged nor compensated by Solution Tree. The state recruited its own highly qualified PLC-trained leaders to coach other Arkansas schools in a “train the trainer” model specifically designed to sustain the progress made by the PLC at Work Project. ADE provided a stipend directly to these educators, much like a stipend would be provided to a teacher on special assignment or an athletic coach. ADE created the PLC Practitioners program to help sustain the work inside the state without presenter support from Solution Tree. ADE asked educators who had participated in the PLC at Work Project to give back to the state by supporting other educators. The PLC Practitioners Program operated parallel to—but separately from—the ADE-designed and Solution Tree–supported PLC Regional Network.
References
AllThingsPLC.info. (2023). See the evidence. https://allthingsplc.info/evidence/
Arkansas Department of Education Data Center. (2023). Arkansas ASPIRE student achievement data 2016–2022. Retrieved February 21, 2023, from https://myschoolinfo.arkansas.gov/Plus/Schools
Arkansas Department of Education. (2023). The journey of a professional learning community: Success across Arkansas. Retrieved June 5, 2024, from https://dese.ade.arkansas.gov/Offices/special-projects/professional-learning-communities-for-arkansas
Barnes, K., & McKenzie, S. (2024a). Effects of PLC at Work in Arkansas on student academic outcomes [Unpublished University of Arkansas Office for Education Policy study]. https://wehco.media.clients.ellingtoncms.com/news/documents/2024/05/25/Effects_of_PLC_at_Work_in_AR_1.pdf
Barnes, K., & McKenzie, S. (2024b). Professional learning communities and student outcomes: A quantitative analysis of the PLC at Work model in Arkansas schools. Arkansas Education Report, 21(1). https://bpb-us-e1.wpmucdn.com/wordpressua.uark.edu/dist/1/555/files/2024/06/21.1_Professional-Learning-Communities-and-Student-Outcomes.pdf
Education Northwest. (2020). At a glance: Successfully implementing PLC at Work® in Arkansas. Retrieved June 5, 2024, from https://educationnorthwest.org/sites/default/files/plc-at-work-at-a-glance.pdf
Hanson, H., Torres, K., Yoon, S. Y., Merrill, R., Fantz, T., & Velie, Z. (2021). Growing together: Professional Learning Communities at Work® generates achievement gains in Arkansas. Portland, OR: Education Northwest. Retrieved June 5, 2024, from https://educationnorthwest.org/insights/independent-evaluation-validates-success-plc-work-project-arkansas
Torres, K., & Hanson, H. (2020). On the road to impact: Solution Tree Arkansas PLC at Work® Cohort 1 Year 2 milepost memo executive summary. Portland, OR: Education Northwest. Retrieved June 5, 2024, from https://dese.ade.arkansas.gov/Files/20201203104240_plc-at-work-excutive-summary_rv2.pdf
Torres, K., Rooney, K., Holmgren, M., Yoon, S. Y., Taylor, S., & Hanson, H. (2020). PLC at Work® in Arkansas: Driving achievement results through school transformation and innovation–Executive summary. Portland, OR: Education Northwest. https://educationnorthwest.org/sites/default/files/driving-achievement-results-through-school-transformation.pdf
What Works Clearinghouse, Institute of Education Sciences, U.S. Department of Education. (2020, April). eMINTS Comprehensive Program. https://ies.ed.gov/ncee/wwc/Docs/InterventionReports/wwc_EESL_eMIN_IR_apr2020.pdf
Solution Tree Addresses Misinformation on PLC at Work Project in Arkansas
Solution Tree Press Release | June 19, 2024
Little Rock, AR (June 19, 2024)—Solution Tree, the trusted provider of K–12 professional development resources and services to over 8,000 schools and districts across all 50 states, seeks to address recent misinformation regarding the PLC at Work Project in Arkansas.
The initial PLC at Work Project contract with the Arkansas Department of Education (ADE) from 2017 was valued at $4 million. Throughout the following seven years, ADE’s requests and legislative approvals grew the contract to $83.9 million, not the $149.9 million reported.
“Annually, ADE requested to add to the project’s scope through renewal amendments, supported by documented legislative approval. This was done in partnership with ADE to meet the growing needs and successes of Arkansas schools,” stated Solution Tree’s CEO, Jeff Jones. “The incorrect figure of $149.9 million appears to stem from a non-executed $66 million contract listed on Transparency.Arkansas.gov, a discrepancy we are actively seeking to correct.”
In response to the January 8, 2024, Arkansas Department of Transformation and Shared Services Office of State Procurement RFP, Solution Tree aligned its proposal to expand the PLC at Work Project. The $99.4 million proposal aimed to continue and enhance professional learning across the state, in contrast to other bids, which did not encompass the comprehensive “system of professional learning” ADE listed in the RFP’s objective and goals.
Of the eight prospective vendors who responded to the RFP, Solution Tree scored the only perfect technical proposal score of 700. The ADE-selected evaluators, external to the vendors and agency, were a mix of teachers and administrators. With the highest overall score, Solution Tree was then awarded the contract, which would be sent to legislative review
Once discussions around the contract’s expectations shifted to the LEARNS Act, Solution Tree made the strategic decision to withdraw its contract from consideration, as the focus no longer aligned with its proposal of a “system of professional learning” through the PLC at Work Project
“Solution Tree supports Governor Sanders’s LEARNS Act, and PLC at Work addresses many foundational elements of the act, such as promoting improved teacher collaboration, student engagement, and academic achievement. However, our work does not encompass all LEARNS Act requirements, such as tutoring and school choice,” said Jones
From 2017 to 2024, the PLC at Work Project has delivered significant value and impact to over 58 districts and 98 schools in Arkansas. Empirical data and external evaluations highlight substantial academic improvements, such as those listed in the ADE book, The Journey of a Professional Learning Community: Success Across Arkansas. For example:
Bayyari Elementary: Fourth-grade math scores increased by 11.4%, and reading scores increased by 16.9%.
Magazine School District: Eighth-grade reading scores improved by 24.8%, and fourth-grade math scores rose by 22.1%.
Marked Tree School District: Tenth-grade reading scores increased by 23%, and seventh-grade English scores by 15%.
However, the University of Arkansas Office for Education Policy’s (OEP) study raised questions regarding the PLC at Work Project’s statistical significance, prompting Solution Tree to convene two independent researchers and five university and organizational research centers to review the study’s methodology. Consistent concerns were identified regarding statistical significance, matching, sample bias, baseline equivalence, study design, weighted achievement, and value-added models. These limitations highlight the need for a more nuanced interpretation of the data and a peer-reviewed study.
In 2017, Solution Tree contracted with an independent evaluator, Education Northwest, a respected national research laboratory located in Portland, OR, to conduct a three-year evaluation of the impact of the PLC at Work Project on the ten Cohort 1 schools, with full disclosure to ADE. Education Northwest’s findings validated the meaningful improvements in student achievement, showing positive growth in ACT Aspire test scores, particularly in math.
“We will remain committed to the educational success of Arkansas schools through PLC at Work. The evidence of improved academic outcomes underscores the effectiveness and value of this comprehensive professional learning initiative,” Jones said.
As for the much-discussed legislative committee’s audit, Solution Tree has yet to receive any formal inquiries; the audit concerns the monies distributed from ADE to educational entities throughout Arkansas, not Solution Tree itself. Nevertheless, Solution Tree is ready and willing to provide any required information.
“Our engagement in Arkansas continues to prioritize transparency and accountability,” Jones said.
For more information, please visit SolutionTree.com.
For additional documentation or evidence of effectiveness, contact Media@SolutionTree.com.
Solution Tree Requests Arkansas Department of Education Contract
Be Withdrawn From Arkansas Legislative Review
Solution Tree Press Release | June 3, 2024
Solution Tree’s Response to the University of Arkansas
Office for Education Policy (OEP) Study
Solution Tree is a research-based, results-oriented continuous learning organization. We believe it is vitally important to adhere to high standards of scholarly investigation and communication. In reviewing the University of Arkansas OEP study, “Professional Learning Communities and Student Outcomes: A Quantitative Analysis of the PLC at Work Model in Arkansas Schools,” we found the following aspects deserved closer scrutiny: statistical significance, matching, sampling, baseline equivalence, study design, weighted achievement, and value-added models (VAMs).
Statistical Significance
“Statistical significance” is often wrongly interpreted by general audiences to be synonymous with “significant” or “meaningful.” This is inaccurate.
The University of Arkansas OEP study was designed in such a way as to prevent the University of Arkansas from making substantive claims about the impact of PLC at Work process implementation.
“No statistically significant differences in student performance” doesn’t necessarily mean there’s no effect. It simply indicates that the study itself couldn’t definitively show one.
Matching
The two groups studied—“Project Schools” and “All Other AR Schools”—are not necessarily comparable.
It’s virtually impossible to compare apples to apples when looking at student achievement data. Reporting on how likely an effect occurred based on the confines of the study is a more accurate approach.
Because the University of Arkansas did not take all potential variables into account prior to matching, the study has very likely come to false conclusions.
Sampling
Over 200 schools in the “All Other AR Schools” group had received professional development on the PLC at Work process outside of the Arkansas Department of Education (ADE) contract, either prior to Cohort 1 or during the seven-year partnership.
The study only accounts for 90 “Project Schools.” Solution Tree has provided professional development to over 148 districts and 314 individual schools in Arkansas between 2017 and 2024 alone.
Schools in the “All Other AR Schools” group may have already been impacted and showing achievement as a result of the PLC at Work process.
Baseline Equivalence
Without establishing baseline equivalence (a fair comparison between groups) from the outset, the entire study can be considered invalid. Due to apparent flaws in the matching and sampling processes, baseline equivalence was not fully established.
When baseline equivalence is established, the only difference between groups is receiving the intervention (i.e., PLC at Work process implementation). The results can’t be trusted without it.
Even if a study can demonstrate similarity for observed characteristics, unobserved characteristics (like motivation or attitudes) are more likely to be dissimilar among groups.
Study Design
An event study design is only suited to evaluating the impact of a one-time event, not long-term effects with multiple variables to control, as one might find in a seven-year statewide professional development implementation.
Event studies are most often used in finance and economics, such as to study the market reaction to announcements of specific events or news, like pandemic outbreaks, airplane crashes, and rumors.
The use of an event study design is not a suitable framework for this research due to its limitations.
Weighted Achievement
The use of the Weighted Achievement Score metric was biased against PLC at Work Cohort schools.
On page 11, the study notes, “PLC at Work schools enrolled a statistically significantly greater percentage of students who are Hispanic/Latino, are Eligible for Free or Reduced-Price Lunch, and are English language learners than schools not selected to be PLC at Work schools.”
The OEP’s website discloses potential bias from the Weighted Achievement Score metric against “disadvantaged” students: “Schools serving more advantaged students typically receive ‘good’ scores because a high percentage of their students pass, while schools serving a larger percentage of students who live in poverty, participate in special education, or are learning English often receive lower scores because a higher percentage of their students are not yet performing at grade level.”
Value-Added Measures
The study uses value-added models, which are highly problematic and have long been debunked in the research.
To demonstrate the unreliability of VAMs, one study even applied them “to estimate the effects of teachers on an outcome they cannot plausibly affect: student height” (p. 900).
Teachers do not have any influence or impact on student height, and yet, the researchers found “statistically significant” effects.
VAMs are an unfortunate choice for any study seeking accuracy and credibility.
To fully understand the breadth of the research supporting the PLC at Work process, we suggest watching the documentary The Origins of Professional Learning Communities, which offers decades of research, dating back to the 1940s. In addition, we highly recommend reading the research findings in Education Northwest’s external, independent evaluation of the PLC at Work process in Arkansas schools, Growing Together: Professional Learning Communities at Work® Generates Achievement Gains in Arkansas, validating its success and meeting ESSA Tier II requirements.
Bloomington, Ind. (June 3, 2024)—Solution Tree, a premier provider of K–12 professional development resources and services for educators, has requested that its contract with the Arkansas Department of Education (ADE) be withdrawn from legislative review. Solution Tree responded to ADE’s RFP in February with a proposal to continue and expand the PLC at Work Project statewide, winning out over seven other prospective contractors. Solution Tree made the decision to request to withdraw its contract after contract priorities shifted in recent weeks.
In an email to ADE Secretary Jacob Oliva on Sunday, June 2, Solution Tree CEO Jeffrey C. Jones emphasized two main reasons for the decision:
The contract is no longer primarily focused on Professional Learning Communities at Work, the highly acclaimed school-improvement process of which Solution Tree is the nation’s largest implementor.
To implement professional development statewide with fidelity, it is critical to have the support and full cooperation of all stakeholders.
Since 2017, Solution Tree has partnered with over 300 schools and nearly 150 districts in Arkansas to support educators in collaborating effectively and improving education outcomes for students. Schools have been honored with such awards as Schools on the Move Toward Excellence, Office for Education Policy’s Beating the Odds, and Reward Schools. Over 30 schools have achieved the status of Model PLCs at Work or Promising Practices Schools, demonstrating remarkable progress in student achievement. During the COVID-19 pandemic, schools in Arkansas implementing the PLC at Work process experienced less learning loss. Solution Tree’s close collaboration with state legislators, ADE, and the educational community has been crucial to improving student achievement across the state.
According to the PLC at Work process, the key to improved learning for students is continuous job-embedded learning for educators. Through intensive professional development, Arkansas schools have engaged in an ongoing cycle of collective inquiry and action research where teachers work in collaborative teams to achieve high levels of learning for their students. The PLC at Work process is based on three big ideas:
A focus on learning
A collaborative culture and collective responsibility
A results orientation or evidence of student learning
Solution Tree will continue to work with and support the hardworking educators in Arkansas, as the company has done for over a decade. Solution Tree is extremely proud of its work with schools and districts throughout the state and remains committed to improving teacher practice in order to achieve outstanding results for students.
Full Response to the University of Arkansas
Office for Education Policy (OEP) Study
As a research-based, results-oriented continuous learning organization, Solution Tree believes it is vitally important to adhere to high standards of scholarly investigation and communication. In reviewing the University of Arkansas Office for Education Policy (OEP) study, “Professional Learning Communities and Student Outcomes: A Quantitative Analysis of the PLC at Work Model in Arkansas Schools” (Barnes & McKenzie, 2024b), we found little evidence that the paper had been subjected to a rigorous peer review process. Our analysis of the study—including its design, metrics (weighted achievement and value-added models), matching and sampling approaches, evidence of baseline equivalence, and treatment of statistical significance—indicates that this contribution could likely benefit from the peer review functions of identifying gaps in the literature review, highlighting methodological concerns, and encouraging clarity and integrity when reporting results. Further, Solution Tree contacted two independent researchers and five university and organizational research centers to review the study’s methodology to verify our concerns. We offer the following considerations from our analysis and encourage our colleagues at the University of Arkansas to submit their work to a formal peer review process as soon as possible.
First, to frame the foundation of this discussion, we feel it is critical to share the decades of research from education, business, sociology, psychology, and other fields supporting Professional Learning Communities at Work®, a proven process for sustained, substantive school improvement. Esteemed researchers, academics, and authors, like Shirley Hord, Karen Seashore Louis, Judith Warren Little, Milbrey McLaughlin, James Kouzes, and others, have contributed to the vast amount of literature supporting professional learning communities. This is intended to supplement the Review of Literature section added to the revised version of the study (Barnes & McKenzie, 2024b).
Next, briefly, we will describe specific aspects of the study which we believe deserve closer scrutiny: statistical significance, matching, sampling, baseline equivalence, study design, weighted achievement, and value-added models (VAMs).
The Research Supporting the Professional Learning Community at Work Process
The PLC process has been proven to be a highly effective way of ensuring high levels of learning for all students. When properly executed and when implemented with fidelity, it can deliver dramatically improved teaching and learning. U.S. and international research have studied the effectiveness of the PLC process in schools and districts, highlighting the essential elements necessary for success (Chiang, Yin, Lee, & Chang, 2024; Dogan, Pringle, & Mesa, 2016; Figueroa Moya, 2023; Hairon & Goh, 2017; Hauge & Wan, 2019; Kimani, 2024; Lieberman & Miller, 2011; Olsson, 2019; Perez, 2023; Siwy & Meilani, 2024; Tayag, 2020; Tuli & Bekele, 2020).
To fully understand the breadth of the research supporting the PLC at Work process, we suggest watching the documentary The Origins of Professional Learning Communities, which offers decades of research that will be highlighted in this section. In addition, we highly recommend reading the research findings in Education Northwest’s external, independent evaluation of the PLC at Work process in Arkansas schools, Growing Together: Professional Learning Communities at Work® Generates Achievement Gains in Arkansas, validating the success of the PLC at Work Project in ten Cohort 1 schools and meeting ESSA Tier II requirements. “Advocates for Professional Learning Communities: Finding Common Ground in Education Reform” from Revisiting Professional Learning Communities at Work, Second Edition provides research and organizations supporting the PLC at Work process as the most promising path for sustained, substantive school improvement (DuFour, DuFour, Eaker, Mattos, & Muhammad, 2021). Lastly, the Arkansas Department of Education (2023) book, The Journey of a Professional Learning Community: Success Across Arkansas, offers successes from dozens of schools and districts implementing the PLC at Work process.
The Beginnings
Solution Tree’s PLC at Work process dates back to Richard DuFour and Robert Eaker’s (1998) book, Professional Learning Communities at Work: Best Practices for Enhancing Student Achievement, showcasing the authors’ successful experiences as practitioners applying PLC at Work practices in schools. In the 1980s and 1990s, DuFour was driven to lead a high-performing school for students at Adlai E. Stevenson High School in Lincolnshire, Illinois. He looked to the research for best practices and connected with Dr. Robert Eaker, dean of the College of Education for Middle Tennessee State University, who had done research on leadership and organizational development as a former fellow with the National Center for Effective Schools Research and Development. Principles about continuous learning organizations from Tom Peters and Robert Waterman Jr.’s (1982) In Search of Excellence, Peter Senge’s (1990) The Fifth Discipline, and Peter Drucker’s (1992) Managing for the Future had a profound impact on DuFour and Eaker and the evolution of the PLC at Work process. However, research from education, business, sociology, anthropology, psychology, engineering, and other fields has contributed to the evolution of PLCs for many decades.
Foundational ideas supporting the importance of collaborative teams in schools started developing as early as the 1940s and 1950s through W. Edwards Deming’s work using quality circles in post-WWII factories in Japan (Deming, 1967). According to Deming’s early work, a quality circle is a group of employees who meet regularly to find solutions to problems. This concept of working collaboratively is a clear thread woven throughout the research from the 1960s to 1990s about what has the biggest impact on success in schools.
As educators sought to break free from the factory model of schooling, the term professional learning community began to emerge in education through research from Madeline Hunter (1967), Malcolm Knowles (1968), and Benjamin Bloom (1968). Malcolm Knowles’s (1968) research on andragogy influenced theories about how adults learn differently than students, impacting professional development in schools in the decades to come. Adult learners prefer self-directed, relevant, and practical professional development focused on problem solving. Specific to classroom practice, Hunter’s (1967) Teach More—Faster emphasized teachers’ critical role in helping students understand what they will be learning through effective goals, expectations, and feedback. This shift of focusing on learning rather than teaching started making an impact in schools in the 1970s.
Research Into Practice
As the United States pushed for better practices in schools, the idea of applying research to practice began to take hold. Highly influential early educational research supporting the principles of the PLC at Work process includes the following.
Mid-1970s: Research into what makes effective schools, for example, “Changes in School Characteristics Coincident With Changes in Student Achievement” (Brookover & Lezotte, 1977), “Search for Effective Schools” (Edmonds & Frederiksen, 1978), and “Elementary School Climate and School Achievement” (Brookover et al., 1978)
1978: The “RAND Change Study” cited the importance of strong leadership, high motivation, teacher involvement, and long-term support to make effective systemic changes in schools (Berman & McLaughlin, 1978).
1979: Ronald Edmonds, in “Effective Schools for the Urban Poor,” underscored the indispensable characteristics of effective schools: strong leadership, high expectations for all students, orderly environment, a focus on learning, and frequent progress monitoring.
1982: Ronald Edmonds, in “Programs of School Improvement: An Overview,” introduced the seven correlates of Effective Schools.
1989: Milbrey McLaughlin, in “The RAND Change Agent Study Ten Years Later,” verified its findings and cited the importance of teacher networks.
1989: Susan Rosenholtz studied seventy-eight schools, finding teacher collaboration led to greater gains in student achievement.
1993: Judith Warren Little and Milbrey McLaughlin reported that the most effective schools operated as strong professional communities.
1994: Sharon Kruse, Karen Seashore Louis, and Anthony Bryk reported that the most effective schools in terms of student achievement operated as PLCs.
1995: Educational researchers Fred Newmann and Gary Wehlage studied twelve hundred schools, emphasizing that the most successful schools restructured as PLCs.
1996: Sharon Kruse, Helen Marks, and Karen Seashore Louis studied twenty-four schools, reaffirming that schools operating as PLCs had a significant impact on both classroom practice and student achievement.
For over seven decades, professional learning communities have evolved and incorporated valuable ideas from various fields, not just education, making them a powerful tool for systemic change in schools. As PLCs become more widespread, evidence continues to show that they are effective in improving student learning outcomes, many of which are documented in the fourth edition of Learning by Doing: A Handbook for Professional Learning Communities at Work (DuFour, DuFour, Eaker, Many, Mattos, & Muhammad, 2024). The PLC at Work process helps schools address the challenge of ensuring high levels of learning for all students by empowering both teachers and students, creating a truly engaging learning environment.
Given our confidence in the tremendous amount of research supporting the PLC at Work process, Solution Tree evaluated the University of Arkansas study closely, finding aspects that warranted further examination: statistical significance, matching, sampling, baseline equivalence, study design, weighted achievement, and VAMs. The following sections briefly delve into each of these.
Statistical Significance
“Statistical significance” is often wrongly interpreted by general audiences to be synonymous with “significant” or “meaningful.” It is not. In the American Statistical Association’s (2016) statement on statistical significance, Wasserstein and Lazar (2016) plainly state, “Statistical significance does not measure the size of an effect or the importance of a result. Statistical significance is not equivalent to scientific, human, or economic significance” (p. 132). Statistical significance is a measure of the confidence that can be placed in a specific study’s findings. Findings that are not statistically significant have a low level of confidence associated with them.
Therefore, when the University of Arkansas study claims, “We found no statistically significant differences in student performance between PLC at Work and non-PLC schools, measured by average weighted achievement and school-level valued-added growth” (Barnes & McKenzie, 2024b, p. 39), this simply means that their study was designed in such a way as to prevent them from making substantive claims about the impact of the implementation being studied. Statistical significance is a reflection on the design of a study, not on the intervention being studied. Greenland and colleagues (2016) elaborate on this idea in “Statistical Tests, P Values, Confidence Intervals, and Power: A Guide to Misinterpretations,” clarifying that statistical significance is specific to the design of the study, not the intervention itself.
Put plainly, “no statistically significant differences in student performance” doesn’t necessarily mean there’s no effect. It simply indicates that the study itself couldn’t definitively show one.
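The relationship between study design and statistical significance can be illustrated with a small simulation. The sketch below is purely illustrative (the effect size and sample sizes are hypothetical, not drawn from the OEP study): the same real effect is present in every simulated study, yet underpowered designs routinely fail to flag it as "statistically significant."

```python
import random
import statistics

# Illustrative simulation (hypothetical numbers, not data from the OEP study):
# every simulated study contains a real effect of 0.3 standard deviations;
# we count how often a two-sample test would call it "statistically significant."
random.seed(42)

def t_stat(a, b):
    """Welch's t statistic for two independent samples."""
    se = (statistics.variance(a) / len(a) + statistics.variance(b) / len(b)) ** 0.5
    return (statistics.mean(a) - statistics.mean(b)) / se

def detection_rate(n, trials=1000, effect=0.3, crit=1.96):
    """Fraction of simulated studies (n students per group) in which the
    true effect produces |t| above the significance threshold."""
    hits = 0
    for _ in range(trials):
        treated = [random.gauss(effect, 1.0) for _ in range(n)]
        control = [random.gauss(0.0, 1.0) for _ in range(n)]
        if abs(t_stat(treated, control)) > crit:
            hits += 1
    return hits / trials

small = detection_rate(n=20)   # underpowered design
large = detection_rate(n=400)  # well-powered design
print(f"n=20 per group:  true effect flagged significant in {small:.0%} of studies")
print(f"n=400 per group: true effect flagged significant in {large:.0%} of studies")
```

Both simulated designs examine the identical real effect; only the design differs. This is why a null significance finding speaks to the study, not to the intervention.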
Matching
While the revised University of Arkansas study (Barnes & McKenzie, 2024b) improves on the matching details shared since the initial version (Barnes & McKenzie, 2024a), the two groups studied—“Project Schools” and “All Other AR Schools”—are not necessarily comparable. The use of propensity score matching often occurs when “educational researchers are attempting to assess the impact of a program or intervention” (Jacovidis, Foelber, & Hort, 2017, p. 535). When used effectively, propensity score matching can help to establish like groups for comparison, as the technique is based on the likelihood, or propensity, for a group to actually participate in an intervention (Jacovidis et al., 2017). An important requirement for this type of matching, especially for education studies involving student achievement, is to account for all characteristics that may influence the study’s result (the covariates).
However, using propensity score matching and adequately selecting covariates poses challenges. Jacovidis et al. (2017) explain, researchers “are never truly certain whether we have captured all confounding variables” (p. 537). In other words, it’s virtually impossible to compare apples to apples when looking at student achievement data. Therefore, as stated in the Statistical Significance section, reporting on how likely an effect occurred based on the confines of the study is a more accurate approach. But what can help ensure a proper balance of groups is to examine “the distribution of the covariates for each group prior to matching . . . as it can foreshadow issues with common support” (Jacovidis et al., 2017, p. 545). Without taking all potential variables into account prior to matching, the University of Arkansas study has very likely come to false conclusions.
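A standard diagnostic for the balance concern described above is to compute a standardized mean difference for each covariate before matching. The sketch below uses entirely hypothetical school-level percentages (not data from the study) simply to show how the check works:

```python
import statistics

# Hypothetical covariate data (illustrative only; not from the OEP study):
# per-school percentages for two groups of schools prior to matching.
project = {"pct_frl": [72, 68, 80, 75, 66], "pct_ell": [18, 22, 15, 20, 25]}
others  = {"pct_frl": [48, 52, 45, 60, 50], "pct_ell": [6, 9, 4, 8, 7]}

def std_mean_diff(a, b):
    """Standardized mean difference between two samples; absolute values
    above roughly 0.25 are commonly flagged as imbalance in matching
    diagnostics."""
    pooled_sd = ((statistics.variance(a) + statistics.variance(b)) / 2) ** 0.5
    return (statistics.mean(a) - statistics.mean(b)) / pooled_sd

for cov in project:
    smd = std_mean_diff(project[cov], others[cov])
    flag = "IMBALANCED" if abs(smd) > 0.25 else "ok"
    print(f"{cov}: SMD = {smd:+.2f} ({flag})")
```

If a diagnostic like this reveals large pre-matching imbalance on covariates that influence achievement, any post-matching comparison inherits that risk unless every such covariate was captured.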
Sampling
Another shortcoming in the selection process of the schools included in the two groups is that over 200 schools in the “All Other AR Schools” group had received professional development on the PLC at Work process outside of the Arkansas Department of Education (ADE) contract, either prior to Cohort 1 or during the seven-year partnership. Education Northwest controlled for this as part of its study, “Growing Together: Professional Learning Communities at Work Generates Achievement Gains in Arkansas,” where it “excluded certain schools that received partial treatment from the comparison sample prior to conducting the matching process used to establish baseline equivalency” (Hanson et al., 2021, p. 8).
Furthermore, the University of Arkansas study only accounts for 90 “Project Schools”; however, Solution Tree has provided professional development to over 148 districts and 314 individual schools in Arkansas between 2017 and 2024 alone. Thus, the study needed to look beyond the “Project Schools” group and control for this variable. An argument can be made that schools in the “All Other AR Schools” group were already impacted and showing achievement as a result of the PLC at Work process.
Baseline Equivalence
Given that the University of Arkansas study is an impact study, establishing baseline equivalence, or a fair comparison between groups, is vital. Without establishing baseline equivalence from the outset, the entire study can be considered invalid. “Baseline equivalence must exist to accurately estimate program impacts” (Anderson & Maxwell, 2021, p. 5).
An impact study must first determine a starting point of the groups (i.e., the baseline)—in this case, student achievement and growth (according to the research question)—before the implementation of the PLC at Work process. Next, before the study can take place, the two groups have to be similar (i.e., have equivalence). When baseline equivalence occurs, then the only difference between groups is receiving the intervention (i.e., PLC at Work process implementation). As Anderson and Maxwell (2021) put it, “Baseline equivalence is important for impact studies because those studies are designed to say whether a program actually caused outcomes to occur” (p. 1). Thus, with baseline equivalence, the results are reliable, and without it, they can’t be trusted (Anderson & Maxwell, 2021; What Works Clearinghouse, n.d.).
While Table 2 purports to establish baseline equivalence, as stated, due to the flaws in the matching and sampling processes, it seems clear baseline equivalence was not fully established (Barnes & McKenzie, 2024a; Barnes & McKenzie, 2024b). Simply put, without demonstrating that both groups in the study started at the same place and were indeed similar, we cannot be confident in the results. What Works Clearinghouse (n.d.) explains, “If the two groups are different at baseline on key characteristics that could influence the outcomes, the effect found at the end of the study might be due to the differences that already existed at the beginning” (p. 1).
One final note on establishing baseline equivalence in the University of Arkansas study, regarding the use of a quasi-experimental design (QED), is that even if a study can demonstrate similarity for observed characteristics, unobserved characteristics (like motivation or attitudes) are more likely to be dissimilar among groups. Because of this, researchers Anderson and Maxwell (2021) articulate their lack of confidence that “QEDs demonstrate causality” as compared to randomized controlled trials (p. 3). As is evident in the study, “Educational researchers often face the necessity for quasi-experimental designs and their associated threats to internal validity” (Jacovidis et al., 2017, p. 535). This is particularly relevant in education research because researchers often use QEDs due to practical limitations. They must be extra cautious to control variables that might invalidate their findings.
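The baseline-equivalence check itself is mechanical. The sketch below, using hypothetical baseline scores (not study data), applies the What Works Clearinghouse convention of judging the baseline gap in standard deviation units:

```python
import statistics

# Hypothetical baseline achievement scores (illustrative only; not study data).
group_a = [482, 495, 470, 488, 501, 476]
group_b = [455, 448, 462, 451, 440, 458]

def baseline_gap(a, b):
    """Absolute baseline difference in pooled standard deviation units."""
    pooled_sd = ((statistics.variance(a) + statistics.variance(b)) / 2) ** 0.5
    return abs(statistics.mean(a) - statistics.mean(b)) / pooled_sd

gap = baseline_gap(group_a, group_b)
# Thresholds mirror the What Works Clearinghouse convention:
# <= 0.05 SD satisfies equivalence; 0.05-0.25 SD requires statistical
# adjustment; > 0.25 SD fails to establish equivalence.
if gap <= 0.05:
    verdict = "baseline equivalence satisfied"
elif gap <= 0.25:
    verdict = "equivalence only with statistical adjustment"
else:
    verdict = "baseline equivalence NOT established"
print(f"baseline gap = {gap:.2f} SD -> {verdict}")
```

A gap beyond the 0.25 SD threshold means any end-of-study difference could simply reflect where the groups started, which is the core concern raised in this section.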
Study Design
In addition to the identified issues with statistical significance, matching, sampling, and baseline equivalence, we would like to understand the use of an event study design for this impact study. While an event study design can provide a snapshot before, during, and after a one-time event, it is not ideal for evaluating long-term effects with multiple variables to control, as one might find in a seven-year statewide professional development implementation (Martinez-Blasco et al., 2023). For example, factors like curriculum changes, teacher turnover, or leadership transitions can influence the study’s outcomes. Thus, it is illogical to use an event study design to answer the question “How does Solution Tree’s PLC at Work model impact student achievement and growth in Arkansas schools?”
Further, even when used in quasi-experimental designs, event studies do not prove cause and effect. Event studies are most often used in finance and economics, such as to study “the market reaction to announcements of corporate events or news,” like pandemic outbreaks, airplane crashes, and rumors (Martinez-Blasco et al., 2023, p. 2). Therefore, the use of an event study design is not a suitable framework for this research due to its limitations.
In addition to questioning the selected study design, we also question the metrics that were selected for analysis: weighted achievement score and value-added measures.
Weighted Achievement Score
The Weighted Achievement Score is a measure the University of Arkansas’s Office for Education Policy uses to determine how well a school scores on annual standardized tests for English language arts and mathematics. However, in the University of Arkansas study, the selection of the Weighted Achievement Score metric was biased against schools participating in the PLC at Work Cohorts, given this observation: “PLC at Work schools enrolled a statistically significantly greater percentage of students who are Hispanic/Latino, are Eligible for Free or Reduced-Price Lunch, and are English language learners than schools not selected to be PLC at Work schools” (Barnes & McKenzie, 2024b, p. 11).
The OEP’s (2022) webpage, “What Would it Take for My School to Get an ‘A’?,” even indicates potential biases with this metric: “Schools serving more advantaged students typically receive ‘good’ scores because a high percentage of their students pass, while schools serving a larger percentage of students who live in poverty, participate in special education, or are learning English often receive lower scores because a higher percentage of their students are not yet performing at grade level.” As noted in the study, PLC at Work Cohort schools had more disadvantaged students. Thus, the use of the Weighted Achievement Score metric was biased against PLC at Work Cohort schools.
Value-Added Measures
The University of Arkansas study uses value-added models (VAMs), which are highly problematic and have long been debunked in the research (Bitler et al., 2021; Morganstein & Wasserstein, 2014). In 2014, the American Statistical Association (ASA) issued a statement on their use (Morganstein & Wasserstein, 2014), noting “Many VAM models have been proposed and are being used, each with their own set of assumptions. The same set of data has been observed to yield different conclusions from different VAM models” (p. 109). As a sobering example of this unreliability, Bitler et al. (2021) applied value-added models “to estimate the effects of teachers on an outcome they cannot plausibly affect: student height” (p. 900). Obviously, teachers do not have any influence or impact on student height, and yet, when Bitler and colleagues (2021) applied VAMs to this question, they found statistically significant effects. Concerns about “bias and idiosyncratic error” (Bitler et al., 2021, p. 920) make VAMs an unfortunate choice of metric for any study seeking accuracy and credibility.
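The mechanism behind such spurious findings can be sketched with a small simulation. The data below are synthetic and the model is deliberately simplified, so this is not the study's VAM or Bitler et al.'s specification: when hundreds of per-teacher "effects" on pure noise are each tested at the conventional 5% level, some clear the significance bar by chance alone.

```python
import random
import statistics

# Synthetic placebo check, loosely in the spirit of Bitler et al. (2021):
# test many per-teacher "effects" on an outcome the teacher cannot affect.
random.seed(7)

N_TEACHERS = 200
STUDENTS_PER_TEACHER = 25

significant = 0
for _ in range(N_TEACHERS):
    # Height residuals for one teacher's students: pure noise, no true effect.
    residuals = [random.gauss(0, 1) for _ in range(STUDENTS_PER_TEACHER)]
    mean = statistics.fmean(residuals)
    std_err = statistics.stdev(residuals) / STUDENTS_PER_TEACHER ** 0.5
    # |t| > 2.06 corresponds roughly to p < .05 with 24 degrees of freedom.
    if abs(mean / std_err) > 2.06:
        significant += 1

print(f"'Significant' teacher effects on a placebo outcome: {significant} of {N_TEACHERS}")
```

Roughly 5% of the purely random "teacher effects" test as significant, even though no teacher influences the outcome at all.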
Conclusion
The research supporting the PLC at Work process is vast and spans many decades. In addition, specific evidence of effectiveness from Arkansas schools and external evaluators demonstrates the value of implementing the PLC at Work process with fidelity. As demonstrated throughout, several aspects of the University of Arkansas OEP study deserve closer scrutiny and indicate that the study has come to false conclusions.
References
Anderson, M. A., & Maxwell, N. (2021). Baseline equivalence: What it is and why it is needed [Employment brief]. https://www.mathematica.org/publications/baseline-equivalence-what-it-is-and-why-it-is-needed
Arkansas Department of Education. (2023b). The Journey of a Professional Learning Community: Success across Arkansas. Retrieved June 5, 2024, from https://dese.ade.arkansas.gov/Offices/special-projects/professional-learning-communities-for-arkansas
Barnes, K., & McKenzie, S. (2024a). Effects of PLC at Work in Arkansas on student academic outcomes [Unpublished University of Arkansas Office for Education Policy study]. https://wehco.media.clients.ellingtoncms.com/news/documents/2024/05/25/Effects_of_PLC_at_Work_in_AR_1.pdf
Barnes, K., & McKenzie, S. (2024b). Professional learning communities and student outcomes: A quantitative analysis of the PLC at Work model in Arkansas schools. Arkansas Education Report, 21(1). https://bpb-us-e1.wpmucdn.com/wordpressua.uark.edu/dist/1/555/files/2024/06/21.1_Professional-Learning-Communities-and-Student-Outcomes.pdf
Berman, P., & McLaughlin, M. W. (1978). Federal programs supporting educational change, vol. VIII: Implementing and sustaining innovations. Santa Monica, CA: RAND.
Bitler, M., Corcoran, S. P., Domina, T., & Penner, E. K. (2021). Teacher effects on student achievement and height: A cautionary tale. Journal of Research on Educational Effectiveness, 14(4), 900–924. https://doi.org/10.1080/19345747.2021.1917025
Brookover, W. B., & Lezotte, L. W. (1977). Changes in school characteristics coincident with changes in student achievement. East Lansing, MI: College of Urban Development, Michigan State University.
Brookover, W. B., Schweitzer, J. H., Schneider, J. M., Beady, C. H., Flood, P. K., & Wisenbaker, J. M. (1978). Elementary school social climate and school achievement. American Educational Research Journal, 15(2), 301–318. https://doi.org/10.2307/1162468
Chiang, K. M., Yin, H., Lee, I., & Chang, C. H. (2024). Taking stock of the research into professional learning communities: Paradigms, pathways, and possibilities. Teaching and Teacher Education, 139, Article 104431.
Deming, W. E. (1967). What happened in Japan? Industrial Quality Control, 24(2), 89–93.
Dogan, S., Pringle, R., & Mesa, J. (2016). The impacts of professional learning communities on science teachers’ knowledge, practice and student learning: A review. Professional Development in Education, 42(4), 569–588.
DuFour, R., & Eaker, R. (1998). Professional Learning Communities at Work: Best practices for enhancing student achievement. Bloomington, IN: Solution Tree Press.
DuFour, R., DuFour, R., Eaker, R., Many, T. W., Mattos, M., & Muhammad, A. (2024). Learning by doing: A handbook for Professional Learning Communities at Work (4th ed.). Bloomington, IN: Solution Tree Press.
DuFour, R., DuFour, R., Eaker, R., Mattos, M., & Muhammad, A. (2021). Revisiting Professional Learning Communities at Work: Proven insights for sustained, substantive school improvement (2nd ed.). Bloomington, IN: Solution Tree Press.
Edmonds, R. (1979). Effective schools for the urban poor. Educational Leadership, 37(1), 15–24.
Edmonds, R. R., & Frederiksen, J. R. (1978). Search for effective schools: The identification and analysis of city schools that are instructionally effective for poor children. Cambridge, MA: Center for Urban Studies, Harvard University.
Figueroa Moya, D. A. (2023). Making sense of teachers’ collaboration and professional dialogue in Chile: Agency and leadership in three newly formed professional learning communities [Doctoral thesis, University College London]. UCL Discovery. https://discovery.ucl.ac.uk/id/eprint/10183807
Greenland, S., Senn, S. J., Rothman, K. J., Carlin, J. B., Poole, C., Goodman, S. N., & Altman, D. G. (2016). Statistical tests, P values, confidence intervals, and power: A guide to misinterpretations. European Journal of Epidemiology, 31, 337–350. https://doi.org/10.1007/s10654-016-0149-3
Hairon, S., & Goh, J. W. P. (2017). Teacher leaders in professional learning communities in Singapore: Challenges and opportunities. In A. Harris, M. Jones, & J. B. Huffman (Eds.), Teachers leading educational reform: The power of professional learning communities (pp. 86–100). New York: Routledge.
Hanson, H., Torres, K., Young Yoon, S., Merrill, R., Fantz, T., & Velie, Z. (2021). Growing together: Professional Learning Communities Work® generates achievement gains in Arkansas. Portland, OR: Education Northwest. Retrieved June 5, 2024, from https://educationnorthwest.org/insights/independent-evaluation-validates-success-plc-work-project-arkansas
Hauge, K., & Wan, P. (2019). Teachers’ collective professional development in school: A review study. Cogent Education, 6(1), Article 1619223.
Hord, S. (1997). Professional learning communities: Communities of continuous inquiry and improvement. Austin, TX: Southwest Educational Development Laboratory.
Jacovidis, J. N., Foelber, K. J., & Hort, S. J. (2017). The effect of propensity score matching method on the quantity and quality of matches. The Journal of Experimental Education, 85(4), 535–558. http://dx.doi.org/10.1080/00220973.2016.1250209
Kimani, W. M. (2024, April 16). South African study shows the power of sharing daily experiences for teachers to learn how to include all learners. The Conversation Africa. https://allafrica.com/stories/202404170004.html
Kouzes, J. M., & Posner, B. Z. (1987). The leadership challenge: How to make extraordinary things happen in organizations. San Francisco: Jossey-Bass.
Kouzes, J. M., & Posner, B. Z. (2011). The five practices of exemplary leadership (2nd ed.). San Francisco, CA: Pfeiffer.
Kouzes, J. M., & Posner, B. Z. (2017). The leadership challenge: How to make extraordinary things happen in organizations (6th ed.). Hoboken, NJ: Wiley.
Kruse, S., Louis, K. S., & Bryk, A. (1994). Building professional community in schools. Madison, WI: Center on Organization and Restructuring Schools.
Kruse, S., Marks, H., & Louis, K. S. (1996). Teachers’ professional community in restructuring schools. American Educational Research Journal, 33(4), 757–798. https://doi.org/10.3102/00028312033004757
Lieberman, A., & Miller, L. (2011). Learning communities: The starting point for professional learning is in schools and classrooms. Journal of Staff Development, 32(4), 16–20.
Little, J. W., & McLaughlin, M. W. (1993). Teachers' work: Individuals, colleagues, and contexts. New York: Teachers College Press.
Martinez-Blasco, M., Serrano, V., Prior, F., & Cuadros, J. (2023). Analysis of an event study using the Fama–French five-factor model: Teaching approaches including spreadsheets and the R programming language. Financial Innovation, 9(76). https://doi.org/10.1186/s40854-023-00477-3
McLaughlin, M. W. (1990). The Rand change agent study revisited: Macro perspectives and micro realities. Educational Researcher, 19(9), 11–16. https://doi.org/10.3102/0013189X019009011
Morganstein, D., & Wasserstein, R. (2014). ASA statement on value-added models. Statistics and Public Policy, 1(1), 108–110. https://doi.org/10.1080/2330443X.2014.956906
Newmann, F. M., & Wehlage, G. G. (1995). Successful school restructuring: A report to the public and educators. Madison, WI: Center on Organization and Restructuring of Schools, University of Wisconsin-Madison.
Office for Education Policy. (2022). What would it take for my school to get an “A”? Retrieved June 5, 2024, from https://oep.uark.edu/what-would-it-take-for-my-school-to-get-an-a/
Olsson, D. (2019). Improving teaching and learning together: A literature review of professional learning communities. Karlstad University. www.diva-portal.org/smash/record.jsf?pid=diva2%3A1341682&dswid=6307
Perez, T. E. (2023). Professional learning communities as a means to strengthen teacher performance: A systematic review. Journal of Namibian Studies, 35(1), 510–524. https://doi.org/10.59670/jns.v35i.3499
Rosenholtz, S. J. (1989). Teachers' workplace: The social organization of schools. New York: Longman.
Siwy, V. E., & Meilani, Y. F. C. P. (2024). Key success factors of school leadership in implementing professional learning communities: A systematic literature. Feedforward: Journal of Human Resource, 4(1), 41–54.
Tayag, J. R. (2020). Professional learning communities in schools: Challenges and opportunities. Universal Journal of Educational Research, 8(4), 1529–1534.
Tuli, F., & Bekele, A. (2020). Professional learning communities: A review of literature. Journal of Science and Sustainable Development, 8(1), 54–64.
Wasserstein, R. L., & Lazar, N. A. (2016). The ASA statement on p-values: Context, process, and purpose. The American Statistician, 70(2), 129–133. https://doi.org/10.1080/00031305.2016.1154108
What Works Clearinghouse. (n.d.). Baseline equivalence [Standards brief]. https://ies.ed.gov/ncee/wwc/Docs/referenceresources/wwc_brief_baseline_080715.pdf
Summary of Response to the University of Arkansas OEP Study
Solution Tree is a research-based, results-oriented continuous learning organization. We believe it is vitally important to adhere to high standards of scholarly investigation and communication. In reviewing the University of Arkansas OEP study, “Professional Learning Communities and Student Outcomes: A Quantitative Analysis of the PLC at Work Model in Arkansas Schools,” we found the following aspects deserved closer scrutiny: statistical significance, matching, sampling, baseline equivalence, study design, weighted achievement, and value-added models (VAMs).
Statistical Significance
“Statistical significance” is often wrongly interpreted by general audiences to be synonymous with “significant” or “meaningful.” This is inaccurate.
The University of Arkansas OEP study was designed in such a way that it cannot support substantive claims about the impact of PLC at Work process implementation.
“No statistically significant differences in student performance” doesn’t necessarily mean there’s no effect. It simply indicates that the study itself couldn’t definitively show one.
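A minimal simulation, using invented effect sizes and sample sizes rather than anything from the study, illustrates the distinction: when a study is underpowered, a real effect routinely fails to reach statistical significance.

```python
import random
import statistics

# Simulate many small two-group "studies" in which the intervention truly
# works, and count how often a simple test still fails to reach p < .05.
# All numbers are illustrative, not drawn from the OEP study.
random.seed(42)

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    se = (statistics.variance(a) / len(a) + statistics.variance(b) / len(b)) ** 0.5
    return (statistics.fmean(a) - statistics.fmean(b)) / se

TRUE_EFFECT = 0.3   # a genuine, modest effect (in standard-deviation units)
N_PER_GROUP = 30    # a small sample, as in an underpowered study
N_STUDIES = 1000

missed = 0
for _ in range(N_STUDIES):
    control = [random.gauss(0, 1) for _ in range(N_PER_GROUP)]
    treated = [random.gauss(TRUE_EFFECT, 1) for _ in range(N_PER_GROUP)]
    # |t| < 2.0 roughly corresponds to p > .05 at these sample sizes.
    if abs(welch_t(treated, control)) < 2.0:
        missed += 1

print(f"Studies finding no significant difference: {missed / N_STUDIES:.0%}")
```

Despite a real effect in every simulated study, the large majority come back "not statistically significant," which is exactly the difference between failing to detect an effect and demonstrating that none exists.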
Matching
The two groups studied—“Project Schools” and “All Other AR Schools”—are not necessarily comparable.
It’s virtually impossible to compare apples to apples when looking at student achievement data. Reporting how likely it is that an effect occurred, within the confines of the study, is a more accurate approach.
Because the University of Arkansas did not take all potential variables into account prior to matching, the study has very likely come to false conclusions.
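A small sketch with invented school data illustrates the concern. Matching on an observed covariate (here, a poverty rate) can make the groups look comparable while an unobserved confounder (here, a made-up "readiness to change" that is higher in schools that opt in) stays badly unbalanced.

```python
import random
import statistics

# Hypothetical data: matching balances the observed covariate but not the
# unobserved one. Variable names and distributions are invented.
random.seed(3)

def make_school(treated: bool) -> dict:
    poverty = random.uniform(0.2, 0.9) if treated else random.uniform(0.1, 0.95)
    # Unobserved "readiness to change" is higher in schools that opted in.
    readiness = random.gauss(0.7 if treated else 0.4, 0.1)
    return {"poverty": poverty, "readiness": readiness}

treated = [make_school(True) for _ in range(50)]
pool = [make_school(False) for _ in range(300)]

# Match each treated school to the comparison school closest in poverty rate.
matched = [min(pool, key=lambda s: abs(s["poverty"] - t["poverty"])) for t in treated]

def gap(key: str) -> float:
    return (statistics.fmean(t[key] for t in treated)
            - statistics.fmean(m[key] for m in matched))

print(f"Poverty gap after matching:   {gap('poverty'):+.3f}")   # near zero
print(f"Readiness gap after matching: {gap('readiness'):+.3f}") # still large
```

The matched groups are nearly identical on the covariate used for matching, yet the unobserved trait the matching never saw remains far apart, so any outcome difference could reflect that trait rather than the intervention.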
Sampling
Over 200 schools in the “All Other AR Schools” group had received professional development on the PLC at Work process outside of the Arkansas Department of Education (ADE) contract, either prior to Cohort 1 or during the seven-year partnership.
The study accounts for only 90 “Project Schools.” Solution Tree provided professional development to over 148 districts and 314 individual schools in Arkansas from 2017 to 2024 alone.
Schools in the “All Other AR Schools” group may have already been impacted and showing achievement as a result of the PLC at Work process.
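The arithmetic of this dilution concern is simple; the effect size and school means below are hypothetical.

```python
# Hypothetical illustration: if a fraction of the "comparison" schools were
# actually treated, the measured treatment-vs-comparison gap shrinks even
# though the true effect is unchanged.
TRUE_EFFECT = 5.0      # true score gain from the intervention (invented units)
CONTROL_MEAN = 70.0
TREATED_MEAN = CONTROL_MEAN + TRUE_EFFECT

def observed_gap(contaminated_fraction: float) -> float:
    """Measured gap when part of the comparison group was actually treated."""
    comparison_mean = (contaminated_fraction * TREATED_MEAN
                       + (1 - contaminated_fraction) * CONTROL_MEAN)
    return TREATED_MEAN - comparison_mean

for frac in (0.0, 0.25, 0.5):
    print(f"{frac:.0%} of comparison group treated -> observed gap {observed_gap(frac):.2f}")
```

With half of the comparison group already exposed to the intervention, the observed gap falls to half of the true effect, making a real effect that much harder to detect.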
Baseline Equivalence
Without establishing baseline equivalence (a fair comparison between groups) from the onset, the entire study can be considered invalid. Due to apparent flaws in the matching and sampling processes, baseline equivalence was not fully established.
When baseline equivalence is established, the only difference between groups is receiving the intervention (i.e., PLC at Work process implementation). The results can’t be trusted without it.
Even if a study can demonstrate similarity for observed characteristics, unobserved characteristics (like motivation or attitudes) are more likely to be dissimilar among groups.
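The What Works Clearinghouse operationalizes this check as a standardized mean difference at baseline: differences of at most 0.05 standard deviations satisfy equivalence, differences between 0.05 and 0.25 require statistical adjustment, and larger differences fail the standard (What Works Clearinghouse, n.d.). A minimal sketch with hypothetical baseline scores:

```python
import statistics

def standardized_mean_difference(treated, comparison):
    """Effect size of the baseline gap, using the pooled standard deviation."""
    nt, nc = len(treated), len(comparison)
    pooled_var = ((nt - 1) * statistics.variance(treated)
                  + (nc - 1) * statistics.variance(comparison)) / (nt + nc - 2)
    return (statistics.fmean(treated) - statistics.fmean(comparison)) / pooled_var ** 0.5

def equivalence_verdict(smd: float) -> str:
    """WWC-style reading of a baseline standardized mean difference."""
    if abs(smd) <= 0.05:
        return "equivalent"
    if abs(smd) <= 0.25:
        return "requires statistical adjustment"
    return "not equivalent"

# Hypothetical baseline scores for two groups of schools.
treated_baseline = [58, 61, 55, 60, 57, 59]
comparison_baseline = [66, 68, 63, 70, 65, 67]

smd = standardized_mean_difference(treated_baseline, comparison_baseline)
print(f"Baseline SMD = {smd:.2f}: {equivalence_verdict(smd)}")
```

A baseline gap this large fails the equivalence standard outright, meaning any later outcome difference cannot be attributed to the intervention.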
Study Design
An event study design is only suited to evaluating the impact of a one-time event, not long-term effects with multiple variables to control, as one might find in a seven-year statewide professional development implementation.
Event studies are most often used in finance and economics, such as to study the market reaction to announcements of specific events or news, like pandemic outbreaks, airplane crashes, and rumors.
Due to these limitations, an event study design is not a suitable framework for this research.
Weighted Achievement
The use of the Weighted Achievement Score metric was biased against PLC at Work Cohort schools.
On page 11, the study notes, “PLC at Work schools enrolled a statistically significantly greater percentage of students who are Hispanic/Latino, are Eligible for Free or Reduced-Price Lunch, and are English language learners than schools not selected to be PLC at Work schools.”
The OEP’s website discloses potential bias from the Weighted Achievement Score metric against “disadvantaged” students: “Schools serving more advantaged students typically receive ‘good’ scores because a high percentage of their students pass, while schools serving a larger percentage of students who live in poverty, participate in special education, or are learning English often receive lower scores because a higher percentage of their students are not yet performing at grade level.”
Value-Added Measures
The study uses value-added models, which are highly problematic and have long been debunked in the research.
To demonstrate the unreliability of VAMs, one study even applied them “to estimate the effects of teachers on an outcome they cannot plausibly affect: student height” (Bitler et al., 2021, p. 900).
Teachers do not have any influence or impact on student height, and yet, the researchers found “statistically significant” effects.
VAMs are an unfortunate choice for any study seeking accuracy and credibility.
To fully understand the breadth of the research supporting the PLC at Work process, we suggest watching the documentary The Origins of Professional Learning Communities, which chronicles decades of research dating back to the 1940s. In addition, we highly recommend reading the research findings in Education Northwest’s external, independent evaluation of the PLC at Work process in Arkansas schools, Growing Together: Professional Learning Communities Work® Generates Achievement Gains in Arkansas, which validates its success and meets ESSA Tier II requirements.
External Researchers' Responses to the
University of Arkansas OEP Study
The University of Arkansas Office for Education Policy (OEP) study, “Professional Learning Communities and Student Outcomes: A Quantitative Analysis of the PLC at Work Model in Arkansas Schools,” has sparked a multitude of questions regarding the state’s investment in the PLC at Work process. “Statistical significance” is often wrongly interpreted by general audiences to be synonymous with “significant” or “meaningful.” This is inaccurate. “No statistically significant differences in student performance” doesn’t necessarily mean there’s no effect. It simply indicates that the study itself couldn’t definitively show one.
Therefore, Solution Tree has assembled a group of internal and external research experts (with a combined one hundred-plus years of experience) to review the study’s methodology and evaluate the validity of the findings. In our analysis, the following aspects of the study deserve closer scrutiny: statistical significance, matching, sampling, baseline equivalence, study design, weighted achievement, and value-added models (VAMs). Out of professional courtesy, the external researchers (from prestigious, well-respected universities or organizations) requested that their names and organizations not be shared but permitted us to share their responses. Here are some of their comments.
Although I believe that the authors worked as best they could and objectively with the available data, unfortunately both the data and the design were too limited to draw meaningful conclusions regarding the efficacy of the PLC intervention.
The PLC schools were at baseline less high-achieving and higher in underserved subgroups than were the comparison schools (the researchers did what they could with propensity score matches but those quantitative adjustments can’t neutralize the effects of more challenging school environments).
Many of the comparison schools likely were implementing some strategies similar to the intervention model or ones actually offered by Solution Tree in the same year or previously.
School-level rather than individual student-level data were used, thus precluding tracking the achievement trajectories of treated students over time.
No data were available on teacher mobility or attrition, a potentially significant factor given that the teacher is the locus of PLC school impacts.
No perception data were available to determine program impacts beyond high-stakes achievement scores. Such data could have told a much more compelling story of the model’s successes (or failures) relative to the quantitative outcomes.
Overall, the achievement results are highly mixed indeed, showing significant effects in both directions but mostly inconclusive outcomes.
The study did not account for the length of time that schools were in the program. As Table 1 shows, the 90 schools evaluated were at very different places in their movement through the program.
The study did not account for the impact of the COVID year in the trajectory of impact.
The sample includes both school-based and district-based program implementation models.
The sample includes both primary and secondary schools which may obscure differential impact.
Other important factors affecting school-based scores are not controlled for or explored in the models, factors that could suppress the identification of a significant effect.
The analysis utilized a composite of the ELA and MATH subscales; specific impacts are not explored.
The use of weighted achievement may not be the best way to demonstrate the impact of the program. Given the large discrepancy in the sample sizes (and the known variation in types of schools in both), the use of effect size change may provide a better view of how the program is impacting schools.
In the context of the publicly available data, non-significant results are not surprising. Without accounting for the students’ demographics, student prior achievement, and the nesting of students within classrooms, schools, and districts, the level of change attributable to involvement in PLC at Work would likely not be detectable.
A key limitation is the exclusion of some variables from the matching process, potentially leading to inaccurate matching of comparison groups. This can result in unreliable conclusions, as unobserved confounders may influence both the selection of the PLC program and student outcomes.
The sampling method for the “All Other AR Schools” group is problematic. Comparison group schools that may have undergone similar interventions could experience dilution effects, reducing the probability of detecting an effect even if one exists.
Without robust matching and sampling techniques, it becomes challenging to justify the assumptions underlying the methods used. For instance, an event study assumes that in the absence of the intervention, both groups would have similar outcomes. This assumption is difficult to uphold if there are indications of mismatch or dilution effects in the groups.
The use of weighted achievement scores without adjusting for demographic factors introduces bias against schools serving disadvantaged populations, potentially underestimating the program’s effectiveness.
The reliance on value-added models (VAMs), which have been criticized for their instability and potential bias, further complicates the interpretation of the results.
Future evaluations should incorporate more rigorous matching, sampling adjustments, and longitudinal designs to provide a clearer and more accurate assessment of the PLC at Work program’s impact on student outcomes.
Correcting the Narrative:
Response to Misinformation About the PLC at Work Project
Solution Tree is the trusted partner of more than 8,000 schools and districts in 50 states and stands behind the work we’ve done in Arkansas. The following sections address the misinformation about the PLC at Work Project.
The $83.9M PLC at Work Project Contract
The initial PLC at Work Project contract with ADE from 2017 was for $4 million. ADE’s requests to add to that amount across the seven years of the contract are well-documented in annual legislator-approved, ADE-executed contract amendments and memos that bring the total amount of the PLC at Work Project contract to $83.9 million.
As the PLC at Work Project added cohorts annually in accordance with ADE’s original design and requests, the annual contract grew to $12.5 million. Due to schools’ proven successes across the state, ADE requested that Solution Tree expand the contract’s scope from $12.5 million annually to $14.5 million in Year 5 of the project and to $16.5 million in Year 6. This was done at ADE’s request, and the additional funds were approved through a legislative process. The annual renewal amendments clearly document the contract amounts approved by ADE and legislators. Starting in a Year 3 memo, ADE established that the total cost of the PLC at Work Project could be up to $100,000,000 across seven years. Each year, the contract was signed by ADE and submitted to the Arkansas Legislative Council (ALC) for its annual review. For seven years, the contract was signed and approved by the state for the amounts the state determined. Solution Tree provided professional development products and services to Cohort schools in alignment with ADE’s established yearly costs. At the end of Year 7, the ADE-contracted work totaled $83.9 million.
From 2017 to 2024, Solution Tree provided professional development to over 148 districts (over 60% of Arkansas’s districts) and 314 individual schools (almost 30% of Arkansas’s schools), including more than 200 schools outside the ADE PLC at Work Project.
The total for the PLC at Work Project contract is $83.9 million, not the $149.9 million listed on Transparency.Arkansas.gov and quoted frequently in the media. Transparency.Arkansas.gov lists a $66 million ADE contract that was not fully executed. We believe this to be the source of the $149.9 million figure being attributed to the PLC at Work Project.
Solution Tree’s RFP Response
Solution Tree aligned its response to the January 8 Arkansas Department of Transformation and Shared Services Office of State Procurement Request for Proposal (RFP) with ADE’s request for a statewide “system of professional learning.” Specifically, in 2.2 Objective & Goals, the RFP states: “Pursuant to Arkansas Code 6-20-2305(b)(5), the Arkansas Department of Education seeks a Contractor to continue and expand a research-based, cohesive, synchronized system of professional learning, such as professional learning communities, to provide support for adult and student learners across the State” (p. 6). Therefore, based on the RFP’s Objectives & Goals, Solution Tree’s proposal was intended to expand the PLC at Work Project for an additional seven years. The contracted cost of $99.4 million across seven years aligned with past experience working with ADE to implement the PLC at Work process across the state.
In 2.4 General Requirements, the RFP states, “The Contractor shall provide onsite professional development and/or coaching on at least 500 occasions each year” (p. 6). However, given the implementation of the PLC at Work Project in the state since 2017, and the aforementioned costs associated with each additional year, Solution Tree’s response aligned with the RFP’s Objectives & Goals: “to continue and expand a research-based, cohesive, synchronized system of professional learning, such as professional learning communities, to provide support for adult and student learners across the State.” Therefore, Solution Tree’s response was intended to expand the work of the schools currently part of the three-year implementation of the PLC at Work Project, while enabling additional schools to join the Project. Solution Tree’s bid was higher than other bids because we built a full “system of professional learning” rather than merely offering “500 occasions each year” of professional development. Had we ignored the need for a “system of professional learning” and simply offered 500 days per year, our bid would have been about $24.9 million. This is closer to the other vendors’ proposed costs, with the exception of NIET, whose $70 million bid did not make it to the discussions phase of the RFP process due to its low technical proposal score. Instead, media coverage reported that Solution Tree “narrowly beat” MGT of America LLC and its bid of $18.9 million.
Out of the eight prospective vendors who responded to the RFP, Solution Tree scored a perfect technical proposal score of 700. ADE selected nine total evaluators, external to the vendors and agency, who were a mix of teachers and administrators. Due to scheduling conflicts, five evaluators scored the eight proposals.
PLC at Work provides a strong foundation for the LEARNS Act. In Year 7, Solution Tree worked with ADE to ensure that the services delivered through the PLC at Work Project aligned with and supported Governor Sanders’s priorities. However, the PLC at Work process cannot satisfy all LEARNS Act requirements (such as tutoring and school choice).
ADE’s PLC Practitioners Program
Arkansas teachers or leaders who shared their PLC at Work expertise as part of ADE’s PLC Practitioners program were neither engaged nor compensated by Solution Tree. The state recruited its own highly qualified PLC-trained leaders to coach other Arkansas schools in a “train the trainer” model, specifically designed to sustain the progress made by the PLC at Work Project. ADE provided a stipend directly to these educators, much as it would for a teacher on special assignment or an athletic coach. ADE created the PLC Practitioners program to help sustain the work inside the state without presenter support from Solution Tree, asking educators who had participated in the PLC at Work Project to give back to the state by supporting other educators. The PLC Practitioners Program operated parallel to, but separately from, the ADE-designed and Solution Tree–supported PLC Regional Network.
Legislative Committee’s Audit of the Spend
The spend from ADE and educational entities throughout Arkansas is being audited. Solution Tree has not received any formal inquiries regarding the legislative committee’s audit, nor have we been asked to provide any information, which we would be happy to provide.
Value and Impact of the PLC at Work Project
Dozens of schools and districts implementing the PLC at Work process have shared their successes in the ADE (2023) book, The Journey of a Professional Learning Community: Success Across Arkansas. With the schools’ and districts’ permission, we’ve included three examples of evidence-based results here:
Bayyari Elementary: In 2022, Bayyari was recognized as an Overcoming the Odds school and a School on the Move towards Excellence. Fourth-grade students grew in math by 11.4% and in reading by 16.9%. (Cohort 4, 2020)
Magazine School District: Eighth-grade students grew in reading by 24.8%, and fourth-grade students grew in math by 22.1%. (Cohort 5, 2021)
Marked Tree School District: Tenth-grade students grew in reading by 23%, and seventh-grade students grew in English by 15%. (Cohort 6, 2022)
Beyond the Arkansas Department of Education and schools’ own data, Solution Tree engaged a highly respected independent firm to objectively evaluate the PLC at Work implementation in Arkansas. For nearly 60 years, Education Northwest, located in Portland, OR, has partnered with communities across the United States to advance education through research and evaluation. Its external evaluation of Cohort 1 of the PLC at Work Project in Arkansas found that after only two years of implementation, students were showing positive growth in ACT Aspire test scores. The greatest impact was seen in student math achievement with PLC at Work participants seeing an overall positive impact on math ACT Aspire growth (Hanson et al., 2021) and gains three times higher than gains seen from National Board certification and the eMINTS Comprehensive Program (Torres & Hanson, 2020; What Works Clearinghouse, 2020). These findings are important because the PLC at Work process focuses on schoolwide transformation and is based on a simple proposition: improved teacher collaboration, trust, and collective responsibility will lead to improved instruction, improved student engagement, and, ultimately, increased student achievement. The external evaluation is designed to establish Every Student Succeeds Act (ESSA) Tier II evidence.
Beyond empirical student achievement, the Education Northwest study showed positive changes in schools that complement academic growth: in addition to achievement gains, educators implementing the PLC at Work process noted improved attendance, decreased behavioral referrals, and decreased special education referrals (Education Northwest, 2020).
In a recent letter to Governor Sanders and Secretary Oliva, an Arkansas principal stated, “our school community has been working diligently over the past two years to elevate our school’s performance. Through our partnership with Solution Tree, we have successfully improved our school rating from a D to a C. This progress is a testament to the hard work and dedication of our teachers, students and staff, all of whom have embraced the necessary changes with optimism and determination.”
Further evidence of effectiveness in Arkansas includes:
Arkansas schools in PLC at Work Cohorts 1 (4.44 points), 2 (3.75 points), 3 (5.5 points), 4 (3.75 points), 5 (8.44 points), and 6 (4.32 points) all showed more recovery in school letter grade points from pre-COVID (2018–19) to 2022–23 than non-Cohort schools in the state (3.20 points). (Arkansas Department of Education Data Center, 2023)
Education Northwest’s ESSA Tier II–aligned research study on PLC at Work implementation in Arkansas from 2016–17 to 2018–19 found a positive impact on teachers and students after only two years (Hanson et al., 2021).
Schools participating in PLC at Work implementation were found to have improved communication, trust, and collective responsibility among educators and, in turn, improved student engagement and learning (Torres et al., 2020).
African American students had higher growth in ELA than their peers (Hanson et al., 2021).
Several student groups realized higher math achievement by statistically significant margins (Hanson et al., 2021).
PLC at Work had an overall positive impact on math ACT Aspire growth (Hanson et al., 2021).
PLC at Work had a positive impact for specific student groups on math ACT Aspire growth (Hanson et al., 2021).
PLC at Work had a larger impact on math achievement gains than other professional learning programs (Torres & Hanson, 2020).
University of Arkansas Office for Education Policy (OEP) Study
The University of Arkansas OEP study (Barnes & McKenzie, 2024a, 2024b) has sparked a multitude of questions regarding the state’s investment in the PLC at Work process. “Statistical significance” is often wrongly interpreted by general audiences as synonymous with “significant” or “meaningful.” It is not. “No statistically significant differences in student performance” does not necessarily mean there is no effect; it simply indicates that the study itself could not definitively show one. Therefore, Solution Tree assembled a group of internal and external research experts (with a combined one hundred-plus years of experience) to review the study’s methodology and evaluate the validity of its findings. In our analysis, the following aspects of the study deserve closer scrutiny: statistical significance, matching, sampling, baseline equivalence, study design, weighted achievement, and value-added models (VAMs). One aspect that is especially concerning is how the study calculated the “Weighted Achievement Score.”
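The gap between “not statistically significant” and “no effect” comes down to statistical power. A short simulation (a hypothetical sketch with invented numbers, not a re-analysis of any Arkansas data) shows how a real but modest effect routinely goes undetected when the number of units compared is small:

```python
import random
import statistics

random.seed(42)

def significant(a, b):
    """Crude two-sample test: True when |t| exceeds the 5% critical value 1.96."""
    se = (statistics.variance(a) / len(a) + statistics.variance(b) / len(b)) ** 0.5
    t = (statistics.mean(a) - statistics.mean(b)) / se
    return abs(t) > 1.96

# A real effect of 0.2 standard deviations exists by construction.
EFFECT = 0.2

def run_trials(n_per_group, trials=2000):
    """Fraction of simulated studies that detect the (real) effect."""
    hits = 0
    for _ in range(trials):
        treated = [random.gauss(EFFECT, 1) for _ in range(n_per_group)]
        control = [random.gauss(0.0, 1) for _ in range(n_per_group)]
        if significant(treated, control):
            hits += 1
    return hits / trials

small = run_trials(25)    # e.g., a study with 25 units per group
large = run_trials(400)   # the same real effect, many more units

# With few units, most studies report "not significant" despite the real effect;
# with many units, nearly all studies detect it.
print(f"power with n=25 per group:  {small:.2f}")
print(f"power with n=400 per group: {large:.2f}")
```

With 25 units per group, the simulated studies miss the true 0.2-standard-deviation effect the large majority of the time; the effect did not disappear, the studies simply lacked the power to see it.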
The Weighted Achievement Score is a measure the University of Arkansas’s Office for Education Policy uses to determine how well a school scores on annual standardized tests for English language arts and mathematics. However, in the University of Arkansas OEP study, the selection of the Weighted Achievement Score metric was biased against schools participating in the PLC at Work Cohorts, given the study’s observation: “PLC at Work schools enrolled a statistically significantly greater percentage of students who are Hispanic/Latino, are Eligible for Free or Reduced-Price Lunch, and are English language learners than schools not selected to be PLC at Work schools” (Barnes & McKenzie, 2024b, p. 11). The OEP’s website discloses potential bias from the Weighted Achievement Score: “Schools serving more advantaged students typically receive ‘good’ scores because a high percentage of their students pass, while schools serving a larger percentage of students who live in poverty, participate in special education, or are learning English often receive lower scores because a higher percentage of their students are not yet performing at grade level.”
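The OEP’s disclosure can be made concrete with a toy comparison (invented numbers; this is not the OEP’s actual formula or data): a status metric based on the share of students scoring proficient can rank a school serving advantaged students above a high-poverty school even when the latter made far larger gains.

```python
# Hypothetical illustration of how a status (proficiency-rate) metric can
# understate the progress of schools serving lower-scoring populations.
# All numbers are invented for illustration.

# Each school: share of students proficient at baseline and two years later.
school_a = {"name": "advantaged", "before": 0.70, "after": 0.73}    # +3 points
school_b = {"name": "high-poverty", "before": 0.30, "after": 0.42}  # +12 points

def status_score(s):
    """A status metric rewards where students end up, regardless of start."""
    return s["after"]

def growth_score(s):
    """A growth metric rewards how far students moved."""
    return s["after"] - s["before"]

# Under the status metric, School A still "wins" despite far less progress.
assert status_score(school_a) > status_score(school_b)

# Under a growth lens, School B's gain is four times School A's.
assert abs(growth_score(school_b) - 4 * growth_score(school_a)) < 1e-9

print("status: ", status_score(school_a), status_score(school_b))
print("growth: ", round(growth_score(school_a), 2), round(growth_score(school_b), 2))
```

This is why an unadjusted status metric systematically favors schools that start ahead, and why growth- or effect-size-based measures give a fairer view of programs serving schools that start behind.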
Out of professional courtesy, the external researchers (from well-respected universities and organizations) requested that their names and organizations not be shared but permitted us to share their responses. Here are some of their comments.
Although I believe that the authors worked as best they could and objectively with the available data, unfortunately both the data and the design were too limited to draw meaningful conclusions regarding the efficacy of the PLC intervention.
The PLC schools were at baseline less high-achieving and higher in underserved subgroups than were the comparison schools (the researchers did what they could with propensity score matches but those quantitative adjustments can’t neutralize the effects of more challenging school environments).
Many of the comparison schools likely were implementing some strategies similar to the intervention model or ones actually offered by Solution Tree in the same year or previously.
School-level rather than individual student-level data were used, thus precluding tracking the achievement trajectories of treated students over time.
No data were available on teacher mobility or attrition, a potentially significant factor given that the teacher is the locus of PLC school impacts.
No perception data were available to determine program impacts beyond high-stakes achievement scores. Such data could have told a much more compelling story of the model’s successes (or failures) relative to the quantitative outcomes.
Overall, the achievement results are highly mixed indeed, showing significant effects in both directions but mostly inconclusive outcomes.
The study did not account for the length of time that schools were in the program. As table 1 shows, the 90 schools evaluated were at very different places in their movement through the program.
The study did not account for the impact of the COVID year in the trajectory of impact.
The sample includes both school-based and district-based program implementation models.
The sample includes both primary and secondary schools which may obscure differential impact.
Other important factors affecting school-based scores are not controlled for or explored in the models, factors that could suppress the identification of a significant effect.
The analysis utilized a composite of the ELA and MATH subscales; specific impacts are not explored.
The use of weighted achievement may not be the best way to demonstrate the impact of the program. Given the large discrepancy in the sample sizes (and the known variation in types of schools in both), the use of effect size change may provide a better view of how the program is impacting schools.
In the context of the publicly available data, non-significant results are not surprising. Without accounting for students’ demographics, prior achievement, and the nesting of students within classrooms, schools, and districts, the level of change attributable to involvement in PLC at Work would likely not be detectable.
A key limitation is the exclusion of some variables from the matching process, potentially leading to inaccurate matching of the comparison group. This can result in unreliable conclusions, as unobserved confounders may influence both selection into the PLC program and student outcomes.
The sampling method for the “All Other AR Schools” group is problematic. Comparison group schools that may have undergone similar interventions could experience dilution effects, reducing the probability of detecting an effect even if one exists.
Without robust matching and sampling techniques, it becomes challenging to justify the assumptions underlying the methods used. For instance, an event study assumes that in the absence of the intervention, both groups would have similar outcomes. This assumption is difficult to uphold if there are indications of mismatch or dilution effects in the groups.
The use of weighted achievement scores without adjusting for demographic factors introduces bias against schools serving disadvantaged populations, potentially underestimating the program’s effectiveness.
The reliance on value-added models (VAMs), which have been criticized for their instability and potential bias, further complicates the interpretation of the results.
Future evaluations should incorporate more rigorous matching, sampling adjustments, and longitudinal designs to provide a clearer and more accurate assessment of the PLC at Work program's impact on student outcomes.
References
AllThingsPLC.info. (2023). See the evidence. https://allthingsplc.info/evidence/
Arkansas Department of Education Data Center. (2023). Arkansas ASPIRE student achievement data 2016–2022. Retrieved February 21, 2023, from https://myschoolinfo.arkansas.gov/Plus/Schools
Arkansas Department of Education. (2023). The journey of a professional learning community: Success across Arkansas. Retrieved June 5, 2024, from https://dese.ade.arkansas.gov/Offices/special-projects/professional-learning-communities-for-arkansas
Barnes, K., & McKenzie, S. (2024a). Effects of PLC at Work in Arkansas on student academic outcomes [Unpublished University of Arkansas Office for Education Policy study]. https://wehco.media.clients.ellingtoncms.com/news/documents/2024/05/25/Effects_of_PLC_at_Work_in_AR_1.pdf
Barnes, K., & McKenzie, S. (2024b). Professional learning communities and student outcomes: A quantitative analysis of the PLC at Work model in Arkansas schools. Arkansas Education Report, 21(1). https://bpb-us-e1.wpmucdn.com/wordpressua.uark.edu/dist/1/555/files/2024/06/21.1_Professional-Learning-Communities-and-Student-Outcomes.pdf
Education Northwest. (2020). At a glance: Successfully implementing PLC at Work® in Arkansas. Retrieved June 5, 2024, from https://educationnorthwest.org/sites/default/files/plc-at-work-at-a-glance.pdf
Hanson, H., Torres, K., Young Yoon, S., Merrill, R., Fantz, T., & Velie, Z. (2021). Growing together: Professional Learning Communities at Work® generates achievement gains in Arkansas. Portland, OR: Education Northwest. Retrieved June 5, 2024, from https://educationnorthwest.org/insights/independent-evaluation-validates-success-plc-work-project-arkansas
Torres, K., & Hanson, H. (2020). On the road to impact: Solution Tree Arkansas PLC at Work® Cohort 1 Year 2 milepost memo executive summary. Portland, OR: Education Northwest. Retrieved June 5, 2024, from https://dese.ade.arkansas.gov/Files/20201203104240_plc-at-work-excutive-summary_rv2.pdf
Torres, K., Rooney, K., Holmgren, M., Young, S. Y., Taylor, S., & Hanson, H. (2020). PLC at Work® in Arkansas: Driving achievement results through school transformation and innovation–Executive summary. Portland, OR: Education Northwest. https://educationnorthwest.org/sites/default/files/driving-achievement-results-through-school-transformation.pdf
What Works Clearinghouse, Institute of Education Sciences, U.S. Department of Education. (2020, April). eMINTS Comprehensive Program. https://ies.ed.gov/ncee/wwc/Docs/InterventionReports/wwc_EESL_eMIN_IR_apr2020.pdf
Mystery Solved: $66M Contract
Upon digging into the $149,900,000 “contract value” listed on Transparency.Arkansas.Gov, Solution Tree came across an entry for a $66 million contract with the Arkansas Department of Education (ADE), dated 7/1/23 to 6/30/27, that showed $0 for both “amount ordered” and “amount spent.” With a $0 value, this contract was never executed and should not have been included in the listed contract value.
Solution Tree searched our records for the listed contract number, 4600051773, to no avail. We also contacted the Arkansas Office of State Procurement (OSP) and ADE specifically to uncover the mystery $66 million contract. In our email exchanges with OSP and ADE on June 13–14, neither office could identify the contract. In an interview with KUAF’s Matthew Moore on June 14, Solution Tree’s CEO, Jeff Jones, stated that Solution Tree could not find any paperwork for the $66 million contract.
By Monday, June 17, ADE spokeswoman Kimberly Mundell stated in an email to the Arkansas Democrat-Gazette that the $66 million contract listed on Transparency.Arkansas.Gov was initiated by the previous administration but “never executed.” Department of Finance and Administration spokesman Scott Hardin confirmed that no money was spent on the contract. Hardin also clarified that the contract was entered into the Arkansas Administrative Statewide Information System on October 5, 2022.
The contract shows that Solution Tree signed it on August 17, 2022; however, at that time, the contract was only a draft, a possible contract. Solution Tree was told in numerous conversations in 2022 that the contract would not be moving forward, and we never received a copy of any paperwork after the initial draft in August 2022. Therefore, just this month, we were surprised to see that it was countersigned on October 4, 2022, surprised to see a contract number assigned, and surprised that the mystery contract was, in fact, the contract from two years ago that we were told would not move forward. While we were aware we signed a draft in 2022, the final version listed on Transparency.Arkansas.Gov was unknown to us until very recently.
Solution Tree is actively working with the OSP to have the unexecuted contract removed from Transparency.Arkansas.Gov so that the listed PLC at Work Project contract value accurately reflects $83.9 million.
About Solution Tree
For over 25 years, Solution Tree has worked to transform education worldwide, empowering educators to raise student achievement. With more than 60,000 educators attending professional development events and more than 10,000 professional development days in schools each year, Solution Tree helps teachers and administrators confront essential challenges. Solution Tree has a catalog of more than 600 titles, along with hundreds of videos and online courses, and is the creator of Global PD Teams and Avanti, online learning platforms that facilitate the work of teachers and educators.