
Smart Questions for Board Members to Ask About Assessment Data

While most board members are not (and need not be) experts in assessment or education, all board members should play an important role in the success of their organization by asking questions about academic assessment data. The dialogue this promotes has several benefits:

  1. It gives the CEO a sounding board for cementing their own understanding of the data.
  2. It ensures that board members fully understand the data and are closely monitoring the organization’s overall results.

At times, many board members may feel “out of their depth” when it comes to academic data. Asking questions to understand the data, however, is one of the most important things a board member can do. The remainder of this document is intended to help you ask productive questions.

The document separates out Interim Assessments (tests students take during the school year to assess progress to date; these could include tests created by the school, the “ANet” tests, or the “STEP”) from State Tests (the tests your state requires all students to take once each year, usually beginning in 3rd grade) and from nationally normed standardized tests (which schools generally use to measure progress in students before 3rd grade or to compare students to a national sample; these could include, for example, the TerraNova or SAT 10).

Each section is also divided into a set of “General Questions” and a set of “Data-Set Specific Questions.” General Questions gather background about the assessment in general. You are likely to ask them only the first time you review data from that assessment (though you should revisit them whenever new board members join the committee or board, and you may find you want to review some of them each time an assessment is administered to refresh your understanding of the key issues). Data-Set Specific Questions focus on understanding the key takeaways of a particular administration of that assessment; in other words, they are the type of questions your committee or board should consider every time that assessment is administered.

Interim Assessments

General Questions (to ask the CEO):

  • What does this test assess, from your perspective?
  • What kind of decisions do you and other staff base on the data from this assessment?
  • Who wrote this assessment/how was it developed?
  • If it was developed by school staff, how did they decide what skills/standards to assess?
  • How closely did they base the assessment on either your state’s tests or the nationally-normed standardized test your school administers?
  • Is this assessment “formative,” “cumulative,” or “summative”?

Formative means that each assessment tests only the skills taught since the last test. On formative assessments, you want to see a high level of mastery (say, 80% or more correct, on average) for every testing period. Since each test covers only the topics taught since the last one, there is no reason to expect that low-scoring students can “make up” for these low scores later in the year, as subsequent tests will assess different standards than this one did.

Cumulative means that each successive test in the year tests the standards from the last test given as well as the new standards taught (e.g., if teachers taught six standards in September and then another six standards in October, the October test would test not just the six standards taught during October but also the six standards taught during September). On cumulative tests, seeing increasing scores over the course of the year is a good thing and can be telling you that students are “catching up,” as they are mastering both new standards and those taught earlier in the year.

Summative means that each test assesses all the standards for the entire year. It means that you are, in effect, giving the end-of-the-year test repeatedly throughout the year (though the specific questions should be altered). So, on a summative test, you would expect students to score lower at the beginning of the year (when relatively few of the standards have been taught) and to increase their scores throughout the year.
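To make these three patterns concrete, here is a minimal sketch in Python. Every number in it is a made-up assumption (the monthly pacing, the initial mastery rate, and the gain from review), not data from any real assessment:

```python
# Hypothetical illustration of score expectations for formative, cumulative,
# and summative interim tests. Assumptions: six standards taught per month,
# students initially master 70% of a standard, rising to 90% after review.

MONTHS = 4          # assumed testing windows, e.g., Sept-Dec
NEW_PER_MONTH = 6   # assumed standards taught each month
INITIAL, REVIEWED = 0.70, 0.90

for m in range(1, MONTHS + 1):
    new = NEW_PER_MONTH                      # standards taught this month
    old = NEW_PER_MONTH * (m - 1)            # earlier standards, now reviewed
    untaught = NEW_PER_MONTH * (MONTHS - m)  # standards not yet taught

    # Formative: only this month's standards, so a flat ~70% all year is expected.
    formative = INITIAL
    # Cumulative: this month's plus all earlier standards; rises as older
    # standards are reviewed, which is the "catching up" signal.
    cumulative = (new * INITIAL + old * REVIEWED) / (new + old)
    # Summative: all standards for the year, taught or not; starts low and climbs.
    summative = (new * INITIAL + old * REVIEWED) / (new + old + untaught)

    print(f"Month {m}: formative {formative:.0%}, "
          f"cumulative {cumulative:.0%}, summative {summative:.0%}")
```

Under these assumptions, formative scores hold steady near 70%, cumulative scores climb from 70% to 85% as earlier standards are reviewed, and summative scores start near 18% and rise all year, which is exactly the pattern the definitions above predict.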

  • Is this assessment “normed”?
  • In other words, is there a way to interpret our students’ scores against those of a larger sample?
  • Does this test tell us if students are “on grade level” for this point in the year?
  • How similar is this test to our state’s test or to the nationally normed standardized test our school uses?
  • How predictive is data from this test of how our students will score on our state’s test or on the nationally normed standardized test our school uses?

Data-Set Specific Questions:

  • How did our students score on the test overall, by grade level?
  • How does this compare to other schools that took this test? (Data might or might not be available for this comparison.)
  • If our students took this same interim assessment last year, how do this year’s results compare to last year’s?
  • If this test is “normed,” what does the norm data tell us about how our students compare to a national sample or to grade level? (E.g., our scores averaged in the 83rd percentile, or 94% of our students are “at grade level” or above.)
  • Did all subgroups of students score similarly on this assessment or make similar progress since the last test?
  • There are many, many subgroups that you could break the data out by. This can be time-consuming, so while it is important to do, you should work with your CEO to determine a reasonable number of subgroups to look at each testing cycle. The subgroups you look at should be determined based on the specific demographics of your school and the weaknesses you find for certain groups. (A minimal sketch of this kind of breakdown appears after this list.) Possible groups to monitor are:
    • Boys vs. girls
    • Special education vs. general education
    • Minority vs. Caucasian
    • Low income vs. higher income (usually measured in terms of students qualifying for free or reduced-price lunch under the federal guidelines)
    • Students receiving extra services (intervention, tutoring, etc.) vs. those not receiving these services
    • Students who have been at the school for over a year vs. those who have come more recently
  • What do you see as the areas of strength for our students reflected in this data?
  • What do you see as the areas of weakness for our students reflected in this data?
  • What are the key action steps you are taking/overseeing in response to these weaknesses? (Note: The intention here is not that you will provide ideas or an approach to responding to the data; that is the CEO’s role. Your role is to hear the plan and simply establish that there is a plan and that it seems to be a reasonable one.)
  • What are your three biggest takeaways from this data? Where in the data do you see these things?
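For boards whose schools can export student-level results, the subgroup comparison mentioned above can be as simple as the following sketch. It assumes a hypothetical export with one row per student and made-up column names (pct_correct, sped, frl); your school’s student information system will differ:

```python
# A minimal sketch of a subgroup breakdown, using pandas and invented data.
import pandas as pd

scores = pd.DataFrame({
    "student_id":  [1, 2, 3, 4, 5, 6],
    "grade":       [3, 3, 3, 4, 4, 4],
    "pct_correct": [82, 64, 91, 75, 58, 88],
    "sped":        [False, True, False, False, True, False],  # special education
    "frl":         [True, True, False, True, False, False],   # free/reduced lunch
})

# Average score for each subgroup, by grade, so the committee can see at a
# glance whether all groups performed similarly on this administration.
for flag in ["sped", "frl"]:
    print(scores.groupby(["grade", flag])["pct_correct"].mean().round(1), "\n")
```

The point is not the code itself but the habit: the same handful of cuts, run every testing cycle, so the board sees gaps, and progress in closing them, over time.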

State Tests

General Questions:

  • What are the different ratings or scores a student can get on this test?
  • What decisions or ratings do the state or district make based on this test (e.g., school report card rating, performance watch lists, etc.)?

Data-Set Specific Questions:

  • How did our students perform overall by grade level and subject? Specifically, how do our results compare to the scores of:
    • Students in our district
    • Students in the schools around our school, or schools that serve similar students
    • Our students last year
    • Students at the best open-enrollment public school in our district
  • Did all subgroups of students score similarly on this assessment/make similar progress since the last test?
  • There are many, many subgroups that you could break the data out by. This can be time-consuming, so while it is important to do, you should work with your CEO to determine a reasonable number of subgroups to look at each testing cycle. The subgroups you look at should be determined based on the specific demographics of your school and the weaknesses you find for certain groups. Possible groups to monitor are:
    • Boys vs. girls
    • Special education vs. general education
    • Minority vs. Caucasian
    • Low income vs. higher income (usually measured in terms of students qualifying for free or reduced-price lunch under the federal guidelines)
    • Students receiving extra services (intervention, tutoring, etc.) vs. those not receiving these services
    • Students who have been at the school for over a year vs. those who have come more recently
  • Do the “raw scores” (simply the number of questions correct) reflect any important takeaways that are obscured by the “advanced,” “proficient,” “basic,” or “failing” categories that many states use?
  • What do you see as the areas of strength for our students reflected in this data?
  • What do you see as the areas of weakness for our students reflected in this data?
  • What are the key action steps you are taking/overseeing in response to these weaknesses? (Note: The intention here is not that you will provide ideas or an approach to responding to the data; that is the CEO’s role. The Academic Excellence Committee’s role is simply to hear the plan, establish that there is one, and confirm that it seems to be a reasonable one.)
  • What are your three biggest takeaways from this data? Where in the data do you see these conclusions?
  • How do these results compare to the promises made in our accountability plan or charter contract?
  • How do you think our authorizer will view this data in terms of a reflection of our school’s overall performance?

Nationally Normed Standardized Tests

General Questions:

  • Why did we choose to use this particular assessment?
  • How often do we administer this assessment?
  • How similar to or different from our state test is this assessment?
  • What kind of decisions do you and other staff base on the data from this assessment?
  • How are the scores reported out? Possible options:
    • Percentiles
    • Normal Curve Equivalents (NCEs; similar to percentiles)
    • Percentages of students at grade level
    • Grade level equivalents (e.g., grade level 3.4)
  • If scores are presented in terms of “percent of students at grade level,” how do the test makers define “grade level”? (Note: Many test makers define grade level as scoring at the 50th percentile or higher, meaning scoring higher than 50% of American students at that grade level. Since the average American student is currently not truly at grade level in terms of knowledge and skills, this sets a low bar for grade level. Various high-performing charter schools nationally have found that a student must score closer to the 70th percentile to be truly “on grade level,” defined as being on track for success in college.)
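The note above has a practical consequence: the “percent at grade level” figure your board sees can swing widely depending on which percentile cutoff the test maker (or your school) uses. A minimal sketch, with made-up percentile scores, shows the effect:

```python
# How the choice of "grade level" cutoff changes the headline number.
# The student percentile scores below are invented for illustration.
student_percentiles = [12, 34, 45, 51, 55, 62, 68, 71, 77, 83, 90, 96]

for label, cutoff in [("test-maker bar (50th percentile)", 50),
                      ("college-ready bar (70th percentile)", 70)]:
    at_or_above = sum(p >= cutoff for p in student_percentiles)
    share = at_or_above / len(student_percentiles)
    print(f"{label}: {share:.0%} of students at grade level")
```

With these invented scores, 75% of students clear the 50th-percentile bar but only 42% clear the 70th-percentile bar, so it is always worth asking which definition a reported “at grade level” figure uses.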

Data-Set Specific Questions:

  • How did our students perform overall by grade level and subject?
  • How does this compare to how our students scored last year?
  • How does it compare to how students at other charter schools within and outside of our city scored? (Your CEO may need to contact schools individually to arrange to compare data.)
  • What proportion of our students would the test makers say were “at grade level” or above?
  • What proportion do we believe are “at grade level” by our own internal standard for what it means to be on track for college?
  • How does this compare to what percent were “at grade level” last year?
  • Did all subgroups of students score similarly on this assessment/make similar progress since the last test?
  • There are many, many subgroups that you could break the data out by. This can be time-consuming, so while it is important to do, you should work with your school leader to determine a reasonable number of subgroups to look at each testing cycle. The subgroups you look at should be determined based on the specific demographics of your school and the weaknesses you find for certain groups. Possible groups to monitor are:
    • Boys vs. girls
    • Special education vs. general education
    • Minority vs. Caucasian
    • Low income vs. higher income (usually measured in terms of students qualifying for free or reduced-price lunch under the federal guidelines)
    • Students receiving extra services (intervention, tutoring, etc.) vs. those not receiving these services
    • Students who have been at the school for over a year vs. those who have come more recently
  • What do you see as the areas of strength for our students, as reflected in this data?
  • What do you see as the areas of weakness for our students, as reflected in this data?
  • What are the key action steps you are taking/overseeing in response to these weaknesses? (Note: The intention here is not that you will provide ideas or an approach to responding to the data; that is the CEO’s role. Your role is to hear the plan and simply establish that there is a plan and that it seems to be a reasonable one.)
  • What are your three biggest takeaways from this data? Where in the data do you see these things?
  • How do these results compare to the promises made in our accountability plan or charter contract?
  • How do you think our authorizer will view this data, in terms of a reflection of our school’s overall performance?

Questions about Using Results

  • How is our organization using information from internal and external assessments to improve teaching and learning in our schools?
    • What do the teachers do with the assessments?
    • Does the school use this information to inform its practices: what goes on in the classroom, curriculum decisions, school improvement efforts, and program design?
  • How does our organization manage the data on student performance?
  • Is there a system in place that allows our organization to analyze student achievement data on a regular basis?
  • How does this information impact/interface with our organization’s budgeting process?

Questions about Communicating Results

  • How is our organization communicating assessment data to students and parents?
  • How is our organization communicating assessment data to the community and other external audiences?
    • What data are presented in annual reports?
    • What data are reported to the media?
    • What data are reported to the Department of Education?