For the data literacy portion of my portfolio, I have chosen to include my analysis of ACT data for 2016-2017. Through this analysis, I have shown that I can assess test results and create a plan that will support students' future success. I compared several sets of data to determine the strengths and weaknesses they reveal. Based on this comparison, I offered several commendations and recommendations that will help students achieve a higher rate of success on subsequent tests.
An Analysis of End of Course and ACT Data in the Years 2016 and 2017
Names have been removed.
The student population of the school studied was about 1,700 students during the years of this study; however, the school continues to grow and had a population of about 1,900 in the 2018-2019 school year. The community is rural/suburban, with about 41% of students receiving free or reduced lunch. That said, the average free and reduced rate at the feeder elementary schools is about 60%, which suggests that the number of students eligible for aid is much higher than the number who receive it. The school is 88.2% white, 3.8% African American, 3.4% Hispanic, 2.7% two or more races, 1.3% Asian, 0.5% American Indian or Alaska Native, and 0.1% Native Hawaiian or other Pacific Islander.
After considering the data extensively, there seems to be a discrepancy between the ACT results and the EOC results. Scores for the free/reduced lunch and disabilities subgroups improved from one year to the next on the ACT in both Reading and Science; however, scores on the EOC decreased for those same groups. In fact, scores of almost every group decreased on the EOC while scores of almost every group increased on the ACT. In addition, females consistently scored higher than males on all tests, and white students scored higher than minority groups with a few exceptions.
Overall, scores decreased on the EOC and increased on the ACT in both areas observed, as shown in Fig. 1 and Fig. 2. It is also noteworthy that the percentage of students meeting benchmark dropped from the 2016 English EOC to the 2017 Reading ACT, as shown in Fig. 3.
There were two significant changes in the ACT data. The percentage of African American students meeting benchmark on the Reading ACT dropped from 54.3% to 33.3%, as shown in Fig. 3. In contrast, the free/reduced lunch subgroup improved its Reading scores by 0.9 points and its Science scores by one full point, as shown in Fig. 2.
Referring back to Fig. 1, it is worth noting that the Hispanic population increased its percentage of proficient and distinguished scores on both EOCs. The English percentage rose 14.1%, and the Biology percentage rose 20%.
The scores in both areas remained very similar for the student population as a whole. The relatively large increases or decreases in some subgroups were balanced by changes in other subgroups.
The negative changes in the EOC scores seem to stem from what I would deem arrogance. The 2016 EOC scores were higher than the 2017 scores for all groups. My theory is that, because of this success, teachers did not prepare students as well in 2017, assuming the previous year's success would be easily repeated. The physicist William Pollard said, “The arrogance of success is to think that what you did yesterday will be sufficient for tomorrow” (as cited in Couros, 2018). In contrast, the ACT scores went up or changed only slightly (in most groups) from one year to the next. This seems to be the opposite dynamic: teachers and students worked harder to improve ACT scores because the 2016 ACT results, unlike the 2016 EOC results, were not adequate. Both students and teachers, it seems, resolved to work harder to raise those scores.
The school mission, belief statements, and improvement plan of the school in question are not particularly unique or compelling. The mission asserts that students will “develop the knowledge, skills, and attitudes essential for success,” which is common among most schools. The belief statements are focused on the school’s “concern for people, both individually and collectively,” which, again, is a common belief among educators. Finally, the Comprehensive School Improvement Plan (CSIP) targets growth in reading, math, and writing as common goals, but the ways to achieve those goals do not include anything that wasn’t already in place before the CSIP was developed. Of these three artifacts, the CSIP is most concerning.
Although the CSIP identifies several areas for improvement, innovation in how to address those areas is absent. The school outlines that it will continue to use the same interventions as in previous years, yet it expects improved scores. These are unrealistic expectations, and it seems that the “arrogance” mentioned previously has something to do with the lack of change. The school in question has consistently been the highest-scoring high school in the district, so the motivation to develop and implement strategic, data-driven improvement plans is lacking. The CSIP seems to have been copied and pasted (with minor changes in percentages) from previous years to satisfy the requirements mandated by the state. The plan is not referred to in school-wide meetings, nor have I ever been directed to review or become familiar with it. It is an unfortunate reality that many of the “goals” teachers, schools, and districts set are meant to appease “higher-ups” rather than improve student learning.
I would like to commend the school for improving the scores of some subgroups, such as the free/reduced lunch, Hispanic, and African American populations. It is clear that those groups were given extra support to improve their performance on the EOC and ACT. That said, the mission and belief statements are generic and do not address the specific population of the school. Likewise, the CSIP does not seem to offer actual ways to improve student learning beyond what the school already does.
Consequently, there are several recommendations that I would like to make based on the data analysis and artifact review.
The scores from the EOC and the ACT should be analyzed together. For example, the percent scoring in each level on the EOCs for grade 10 should be analyzed and used to target students who may need additional help before they take the ACT in grade 11. This comparison is shown in Fig. 4. The same should be done for the Biology EOC and Science ACT.
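This targeting step could even be automated. The sketch below is a minimal illustration, not the school's actual process: the performance levels (Novice through Distinguished), the benchmark cutoff, and the student roster are all hypothetical examples.

```python
# Hypothetical sketch: flag grade-10 students whose EOC performance level
# falls below a chosen benchmark, so they can receive extra support before
# taking the ACT in grade 11. Levels and roster are illustrative only.

EOC_LEVELS = ["Novice", "Apprentice", "Proficient", "Distinguished"]

def needs_support(eoc_level, benchmark_level="Proficient"):
    """Return True if a student's EOC level is below the benchmark level."""
    return EOC_LEVELS.index(eoc_level) < EOC_LEVELS.index(benchmark_level)

# Example grade-10 English EOC results (hypothetical students)
roster = {
    "Student A": "Apprentice",
    "Student B": "Distinguished",
    "Student C": "Novice",
}

# Students flagged for additional help before the grade-11 Reading ACT
targeted = [name for name, level in roster.items() if needs_support(level)]
```

The same structure would apply to the Biology EOC and Science ACT comparison; only the roster and subject label change.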
A committee should be formed to revise the school’s mission and belief statements so that they are more specific to the student population of Central Hardin. With the formation of this committee, staff and students should be surveyed so that the committee can see a true picture of what Central Hardin is, does, and means to staff and students.
The CSIP should also be revised to include research-based, innovative ideas and methods to improve student learning beyond what is currently being offered. Again, a committee should be formed, the data presented, and a survey sent to staff and students to solicit ideas and recommendations for improving student learning. As stated by Clark and Miller (2017), “we can get a much fuller picture of school improvement issues by expanding our use of data to include stakeholder feedback, which is typically collected through surveys” (p. 4). Staff should be expected to include relevant research citations with their recommendations.
All CSIP strategies should be research- and evidence-based to promote successful implementation and results. Arens and Lewis (2017) state that “Decision-makers should support research or evaluation efforts that examine solutions to relevant and local problems of practice” (p. 9). This means that administrators, and even teachers, need to make use of research to improve student learning.
Arens, S. A., & Lewis, D. (2017). ESSA offers opportunity to use data to benefit all students. Changing Schools, 77(Spring 2017), 7-9.