Madison Metropolitan School District
Art Rainwater, Superintendent
BOARD OF EDUCATION
Minutes for Performance and Achievement
July 30, 2007
Doyle Administration Building
545 West Dayton Street, Auditorium
The Performance and Achievement Committee meeting was called to order by Chair Lawrie Kobza at 5:04 p.m.
MEMBERS PRESENT: Maya Cole, Lawrie Kobza
MEMBERS ABSENT: Johnny Winston, Jr.
OTHER BOARD MEMBERS PRESENT: Arlene Silveira
STUDENT REPRESENTATIVE PRESENT: None
STAFF PRESENT: Sue Abplanalp, Steve Hartley, Kurt Kiefer, Pam Nash, Art Rainwater, Ken Syke, Barbara Lehman - Recording Secretary
1. Approval of Minutes
It was moved by Maya Cole and seconded by Lawrie Kobza to approve the minutes of the Performance and Achievement Committee dated July 23, 2007, as distributed. Motion unanimously carried by those present.
2. Public Appearances
There were no public appearances.
3. Announcements
There were no announcements.
4. A Model to Measure Student Performance
(Packets included a PowerPoint presentation entitled, "Applications of a Value-Added Data Analysis System." A copy is attached to the original of these minutes.)
Mr. Kiefer spoke of the district's long history of intending to look at change over time. This system would take the district to a more sophisticated level, and Dr. Meyer makes it very understandable. The district is collaborating with DPI on longitudinal data systems, and one project outcome was the creation of a value-added system. Madison will be one of the districts used to create that system. Madison is on track, and this system has been developed in other school districts. It is quite impressive in terms of what it does in bringing equity to accountability: it looks at every child from where they start, controls for various factors (e.g., poverty), and looks for growth over time. Benchmarks can be set. The district is looking for this new system to provide not only year-to-year data but also how much growth is occurring and what that growth can be attributed to. Staff is very excited about the equity it can bring. The Superintendent added that the annual report would be presented this fall in a growth format, from baseline data between 2005-06 and 2006-07.
Director Rob Meyer introduced his team: Mike Christian, Scientist; Ernie Morgan, Associate Researcher; Yue Hu, Economics; Emily Cheng, Economics.
Dr. Meyer stated that they started working with Milwaukee about ten years ago. He commented that Madison can contribute a lot and is quite far along on this. He began his presentation by stating that value-added models have their origin in research. Presentation highlights: what value-added is; criteria; why not average attainment or the proficiency rate; the value-added indicator as a residual parameter; issues in the development of a value-added indicator system; Wisconsin Center for Education Research (WCER) experience and contributions.
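The "value-added indicator as a residual parameter" highlighted above can be sketched informally: a district-wide regression predicts each student's current score from prior score and student characteristics, and a school's value added is its mean unexplained residual. The sketch below uses entirely made-up numbers and names (two hypothetical schools, an illustrative poverty control); it is not MMSD data or Dr. Meyer's actual model, only a minimal illustration of the residual idea under those assumptions.

```python
# Illustrative sketch of a value-added indicator as a regression residual.
# All data here is simulated; school effects and coefficients are invented.
import numpy as np

rng = np.random.default_rng(0)
n = 1000
school = rng.integers(0, 2, n)               # 0 = hypothetical School A, 1 = School B
prior = rng.normal(500.0, 50.0, n)           # prior-year test score
low_income = rng.integers(0, 2, n).astype(float)  # demographic control (e.g., poverty)
true_va = np.where(school == 1, 8.0, -2.0)   # invented "true" school effects

# Current score depends on prior score, the control, the school effect, and noise.
current = 20.0 + 0.95 * prior - 10.0 * low_income + true_va + rng.normal(0.0, 15.0, n)

# District-wide regression on prior score and controls (no school terms);
# the residual is growth not explained by student characteristics.
X = np.column_stack([np.ones(n), prior, low_income])
coef, *_ = np.linalg.lstsq(X, current, rcond=None)
resid = current - X @ coef

# A school's value-added estimate is its mean residual relative to the district.
va_a = resid[school == 0].mean()
va_b = resid[school == 1].mean()
print(round(float(va_b - va_a), 1))  # estimated gap, close to the true 10.0
```

Because the comparison starts from where each student begins and controls for characteristics such as poverty, the residual credits (or debits) the school only for growth beyond what those factors predict, which is the equity point made in the presentation.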
Comments by Presenter:
o No Child Left Behind relies on attainment indicators. The performance of schools should be measured correctly; attainment indicators are biased, and distortions occur if you focus on attainment.
o Value-Added completely eliminates the argument that "it is the kid's fault."
o How to give them the tools to get better.
o Do not fixate too much on one year; look at trends over time.
o Focus on board policy to attack the student achievement gap. Breaks down role of classroom, school, and district.
o Very open process.
o Advantage in having the WCER located just two blocks away.
Comments/Questions from the Board:
How is it made visible to the public? Are specific names provided? Each school has a report card, with scores aggregated over grades. There is also additional information that gives estimates as evidence by grade, but not at the teacher level.
The report shows where individual schools are compared to average growth in the district, and it is published on the web. Information by grade is for educators and is not generally available. School names are published.
The Superintendent indicated that he has thought for many years that this has tremendous potential for all aspects of how the MMSD makes decisions about instruction. It is a tool that controls for most factors in order to measure specific ones. This is the more sophisticated direction to go. The district already has a rich database, so many characteristics are already known about MMSD students; the data goes back to 1990. It gives us a leg up, and he wants to use that data in a positive way. It will also be very important for the Board, for learning, and for instruction.
What are the tools we need to move into this type of analysis? Tools mean the intellectual capacity that this team brings. We are anxious to partner with them. We have statistical capacity, but we need time to work with Rob and his team to get to where we are looking at data over 3-4 years.
We have been trying to put some benchmarks to our strategic plan. One of the elements is student achievement and rigorous curriculum, and we would like to move to something more. Can this help us do that, and relatively soon? Yes, one of the benchmarks needs to be helping students grow. Looking at certain student characteristics, such as low income, shows how much the school or district is contributing in terms of growth and why. It can also look at students who start out high and what growth the district is contributing for that student, i.e., how much of it we are responsible for.
How do districts use it for evaluating staff, curriculum, students, etc.? Staff may be getting a little nervous about how the numbers will relate to their jobs. Have you experienced that? They started by reporting by grade level at the schools; Dallas goes to the teacher level. As you get closer to the teacher level, there are more concerns, so it must be used wisely. They made a lot of effort to work with their stakeholders, and Chicago had an open and transparent process. Further down the trail, you will want to measure things right if people are moving kids to other schools. There is a lot of interest in getting it right, and they like the open approach. The National Federation of Teachers has joined the advisory panel. They can run the models in different ways and show the differences. Everyone wants to do better. The system can show that kids are learning and point to the things they are doing; districts can learn from the data, use it, and do some good.
What are we measuring, in the sense of how it will benefit individual students and families to see how kids are growing? Pre-testing on specific things? States have changed tests and report data that make them look better than they are. What are we going to use that measures content? The plan is to use the WKCE tests; we are riding with what the state provides. In terms of rigor in measuring student achievement, that test series has been mapped to state academic standards pretty well, and we are comfortable with that. We do capture other information that can help, such as parent level of education (all voluntarily provided) and mobility. We have a data warehouse environment with about 5,000 different elements, and hopefully we can put it to good use. For individual students, we will now be able to reflect on the individual growth they are making over time and why. This data can then be used for SIP purposes, program evaluation, equity and fairness, etc. The system allows you to control for factors that we know affect students' achievement coming in and, despite those factors, see what kinds of growth we are getting for all children, which is more equitable.
This was one of the most interesting and important discussions we have had on the Board. Very exciting.
FOLLOW UP: Put this item on the next meeting agenda as well. Board members will come up with questions in the meantime.
5. Other Business
There was no other business.
It was moved by Maya Cole and seconded by Lawrie Kobza to adjourn the meeting at 6:16 p.m. Motion unanimously carried by those present.