Grading of Schools

Page history last edited by PBworks 12 years, 9 months ago

This is a thinking space to discuss appropriate responses from the ground to NYC grading of schools.

 

NOT FOR PUBLICATION OR DISTRIBUTION.

Consider this document in process, private and confidential.

 

A Proposed Response as of November 12, 2007

As every educator understands, the purpose of a grade is not punitive. In the best of cases it is a clear definition of the performance standards expected, and an objective measure of progress toward those goals. It can move the organization on the ground away from decision making unrelated to performance and toward an evidence-based strategy.

 

Every educator also understands that the potential effects of a bad grade can be demoralizing and, more to the point, can stop or significantly slow progress toward success. One of the best indicators on the road of lifelong learning is how a student responds to a report of failure. As with a student, so for a school, its teachers, and its administrative staff. If the issue is managing a learning environment, as opposed to tending a teaching machine, it becomes critical to react quickly to take advantage of a teachable moment. Teachable moments appear unexpectedly and disappear just as quickly. Timing is everything.

 

The publication of the grades for New York City schools has created a powerful teachable moment. With a clear, understandable letter grade, the information bears on the performance of the system, as opposed to individuals in the system, which is a promising approach to sustainable improvement. By releasing the specific things being measured, and the why and how of measuring them, the Board has done its job in making the performance standards clear and actionable. It is a reformulation of the rubric approach that is tried and true in our environment.

 

The challenge on the ground is to keep a failure grade from being viewed only in a reward/punishment framework. The danger is that it is viewed as a judgment on an individual, as opposed to a guide to a strategy for improvement. It is the educator's job, in creating a learning community, to leverage the teachable moment to refocus the activity and attention of the learners. As we see all players in the Chelsea community as learners, this response is offered in that spirit.

 

After two years of many initiatives and attempts at the cultural change from a legacy Vocational School environment to the vision of SLCs organized into Professional Learning Communities, we realized that job one is a communication environment that is resilient and scalable. The challenge is to create this environment in real time, without taking time away from the primary value creator in the learning experience: the relationship between a caring professional adult and a student. For the purposes of this response, we take the budget as given. The true limiting resources are the time and focus of our staff, students, and parents.

 

A traditional website is too expensive in time and focus to be practical in our situation. Instead, we've taken a more collaborative approach.

As educators and managers we know that today, more than ever, timing is everything. The approach of "incubating" a website allows us to respond in real time to challenges and opportunities. The site was started in late August; you can see its progress as of November 6 at www.highschoolpubcenter.com.

 

The architecture is based on the idea that each of our constituencies (students, parents, teachers, and the professional business community) has to be able to get useful information and have the opportunity to give useful information. The front page of the website-in-incubation has a link for anyone to ask a question, share a comment, or voice a concern.

 

The vision for the website is to be the central node of a communication environment to support the creation of Professional Learning Teams on the ground. The strategy is to nurture the growth of Professional Learning Teams in the service of creating a robust, sustainable Professional Learning Community. The secret sauce is that we are inventing a method to go from here to there. To see the public face of this approach, click here.

 

 

Professional Learning Teams in Development

In the Soho Business Academy, the AP, Ms. Ware, five teachers, and the guidance counselor have volunteered to mentor 10 students each in the service of getting them into college. In addition, a team is in formation to focus on the students who have been missing in action. Its purpose is to find out why, and what must be done to get them back.

 

In Chelsea Technical Education, a Professional Learning Team has formed through the voluntary efforts of Ms. Tilly, an Art teacher, and Ms. Apellaniz, the School Counselor. They are focused on a pilot program addressing the attendance problem.

 

In the Soho Communication Arts Academy, the kernel of a Professional Learning Team is forming around Ms. Bermack and Mr. Cerny. They are the proto team leading the preparation of students for entrance to design school and for continuing education in the graphic design field. In that context, the energizing focus is an internship program that has been a focus for Mr. Chin.

 

From the overall management point of view, a useful feature of the site is gathering information from our student body so that a personalized experience will be possible, even in the context of 1,000 students. We've incorporated surveys into the website. The surveys and the process are beginning to give us the real-time data we need to make evidence-based time and focus decisions.

 

You can see the results of some of the surveys we've completed in the last 9 weeks here.

Please contact Mr. Tim Timberlake at TTimber (at) schools.nyc.gov for the passwords.

 

Intern Survey: 127 responses as of Nov. 6

Going to College Survey, version 1: 60 responses as of Nov. 6

Going to College Survey, version 1.5: 30 responses as of Nov. 6

 

The Mentoring Survey is in support of a proto team starting to form around Ms. Mack in SBA. She is managing the relationship with the Morgan Stanley Mentoring Program that has been a part of Chelsea for many years. We've adapted the website functionality to support this effort, primarily to eliminate some of the off-task time that can be devoted to managing the students and the interns, and so to achieve a better learning experience both for the mentees and for the mentors from Morgan Stanley. Based on analysis of our previous experience, we are using an online project management tool to bring a little transparency into the process. We were able to transform a paper process into an electronic process at a very low time/focus price.

 

Mentoring Survey

 

What has been the delay in seeing results?

 

Unfortunately, it has taken two years to identify and locate the resources to implement this approach, and the results are not quite visible yet.

 

Why will it work now?

We think it is plausible that significant results will be evident by June for the following reasons:

1. There is a wealth of evidence to suggest that the most effective approach to fix schools is the creation of professional learning communities.

2. The social collaboration web tools have only recently been scaled. Without those tools it was just too time/focus expensive to change a legacy culture that had grown in response to legacy conditions.

3. The visibility created by the release of well defined "grades" has created a unique teachable moment.

4. We have built the communication infrastructure to now leverage this teachable moment into much faster adoption.

5. Proto teams and evangelists have been engaged in each of the three academies and are now engaged in collaborative problem solving.

 

Next steps

Over the next two months, while continuing to nurture the proto teams into teams and the teams into Professional Learning Communities, we will measure unique visits to the website. Our experts tell us that the adoption period for a new form of communication can take a while. We understand that the issue is NOT "driving visitors to the site." Rather, it is creating enough communication value at the site that students, parents, teachers, and administrators choose to visit and interact. This will require a combination of real-time adjustment to the site AND an on-the-ground effort centralizing communications through it.

 

We have the vision.

The drive to get daily feedback on how we are doing. And at least four clear metrics:

1. Attendance

2. Number of students engaged in College Choice

3. Hits to the website

4. Date and time stamps of interactions to the wikis, the PubCenter Project site, and the website.

 

The Department of Education has supplied the teachable moment.

 

It should work.


 

 

 

 

A Progress Report Grade.

This letter grade (A through F) provides an overall assessment of the school’s contribution to student learning in three main areas of measurement:

(I) School Environment,

(II) Student Performance, and

(III) Student Progress.

 

Schools receive additional recognition for Exemplary Student Progress by students most in need of attention and improvement. The overall Progress Report Grade is designed to reflect each school’s contribution to student academic progress, no matter where each child begins his or her journey to proficiency and beyond.

 

Schools are compared to all schools Citywide and to schools with student populations most like their own.

 

A Quality Review Score.

This separate accountability score is based on an onsite Quality Review of the school by an experienced educator. The score represents the quality of efforts taking place at the school to track the capacities and needs of each student, to plan and set rigorous goals for each student’s improved learning, to focus the school’s academic practices and leadership development around the achievement of those goals, and to evaluate the effectiveness of plans and practices constantly and revise them as needed to ensure success.

 

The Quality Review Score is evaluated on a three-point scale: Well Developed, Proficient, and Undeveloped.

The Quality Review Score is not incorporated into the Progress Report Grade and instead is treated as a different, equally important indicator.

 

NCLB Status.

This separate accountability indicator reports the school's status under the accountability system New York State has adopted under the federal No Child Left Behind Act. The Progress Report is designed to supplement the State accountability system. A school's NCLB status is an important basis for assessing the number and characteristics of students in a school who have attained the goal of proficiency in literacy and mathematics. NCLB Status is not incorporated into the Progress Report Grade.

 

General Information

The High School Progress Report evaluates schools that serve some or all of Grades 9–12. A separate Elementary/Middle School Progress Report evaluates schools or portions of schools that serve Grades K–8. Separate Progress Reports are also used to evaluate schools that have substantial populations of Special Education students receiving alternative assessments and High Schools for Transfer Students.

 

Definitions

Peer Schools are high schools that serve student populations with similar entering academic performance levels. To determine the groupings, schools are given a "peer index," which is the average of the Proficiency Levels that the school's actively enrolled students earned on their State ELA and mathematics exams as 8th graders. High schools are then ranked by this peer index. A school's peer group is composed of the 20 schools above and the 20 schools below it on the ranked list, as long as those schools' peer indexes are within 0.5 of the original school's peer index. (In other words, the goal is to compare each school's performance to that of other schools with very similar student populations.) Peer-group schools with a peer index that differs by more than 0.5 from a school's peer index are removed from that school's peer group. Schools with fewer than 20 students with 8th grade scores are given a proxy peer index. This proxy index is calculated by averaging the peer indexes of the 20 schools above it and the 20 schools below it on a ranked list of the percentage of students eligible for Free and Reduced Lunch. A small number of schools do not have peer groups and therefore have no peer-group calculations. As a result, these schools cannot receive overall scores or grades. Specialized High Schools are an exception to this rule and serve as their own peer group. The list of schools in each peer group is provided when Progress Reports are issued each year.
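The ranking-and-windowing rule above can be sketched in code. This is an illustrative reconstruction, not the DOE's implementation; the school names and peer indexes are hypothetical.

```python
# Illustrative reconstruction of the peer-group rule; school names
# and peer indexes are made up, not DOE data.

def peer_group(schools, index):
    """schools: list of (name, peer_index) pairs sorted by peer index.
    Returns the peer group for schools[index]: the 20 schools above
    and 20 below it on the ranked list, keeping only those whose
    peer index is within 0.5 of this school's."""
    _, my_index = schools[index]
    window = schools[max(0, index - 20):index] + schools[index + 1:index + 21]
    return [(name, p) for name, p in window if abs(p - my_index) <= 0.5]

ranked = sorted(
    [("School A", 2.10), ("School B", 2.35), ("School C", 2.40),
     ("School D", 2.55), ("School E", 3.20)],
    key=lambda school: school[1])

# School E's peer index differs from School C's by more than 0.5,
# so it drops out of School C's peer group.
peers = peer_group(ranked, 2)
```

In this toy ranking, School C keeps Schools A, B, and D as peers, while School E falls outside the 0.5 band.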

 

Citywide Range

The Progress Reports evaluate schools in part based on how their students’ performance compares to that of students in other City high schools. Citywide results for 2003–07 (school years 2003–04, 2004–05, 2005–06, and 2006-07) are used as reference points that provide a basis for evaluating schools’ performance over the next several years.

 

In other words, schools' results are not measured on a curve for each successive year, and instead are compared to criteria set by the performance of all City schools in the 2003–07 period.

 

As a result, a school’s grade can potentially improve each year if its overall performance improves, even if all other schools also improve. Roughly speaking, for each element on the Progress Report, the Citywide Range is the range of scores earned by all schools Citywide during the 2003–07 period, excluding the “outliers.” Outliers are schools with scores so unreasonably high or low that it is inappropriate to use them as a basis for comparison. After several years, new Ranges will be calculated to reflect more recent City experience. The 2006–07 Progress Reports will include Citywide reference points for each element, so that schools will be aware of the reference points during the school year in which they are being evaluated.

 

Peer Range

The Progress Reports evaluate schools in part based on how their students’ performance compares to that of students in their peer schools. Like the Citywide Range, Peer Ranges that are used as reference points for evaluation are derived from results from 2003–07 (school years 2003–04, 2004–05, 2005–06, and 2006-07), and will remain fixed for several years. Like the Citywide Range, the Peer Range excludes “outlier” scores that deviate dramatically from other scores.

 

Students in a School’s Lowest Third

 

This category is used to evaluate progress made by students who require the most academic support in a Regents subject area. This category is made up of the one-third of students in the high school who had the lowest scores on the Grade 8 test that corresponds to the Regents subject test being evaluated. For example, the 8th grade ELA exam corresponds to the Regents English exam.

 

Students in Lowest Third Citywide

This category is used to measure schools’ abilities to make exemplary progress with students who entered high school as extremely underperforming. This category is made up of the one-third of students in their first year of high school who scored the lowest weighted averages Citywide on their State Grade 8 ELA and mathematics tests. Students who took only one of the ELA and mathematics exams can be eligible for this category.

 

Minimum N (Number of Students)

The minimum number of values used for all reported calculations at the school level is 20. Elements for which there are fewer than 20 valid observations at a school are not included, because of confidentiality considerations and the unreliability of measurements based on small numbers; such elements are represented on the Progress Reports with the symbol "–".

 

Attribution of Students to Schools

The results of students who are registered at the same school for an entire academic year are attributed to the school where the students are registered. The results of students who transferred between DOE schools within a school year are attributed as follows:

Diplomas are attributed to the school where the student is registered on the date the diploma is awarded. Graduation and Diploma rates are based on the Cohorts described below.

 

Academic credits are attributed to the school where the student is registered at the end of the semester in which the credit was earned (January 1st and June 1st). Summer and Night School credits are not included in the calculation.

 

Regents examinations are attributed to the school where the student is registered at the end of the semester when the Regents exam was taken. The Fall Semester is attributed to the school where the student was registered on January 1st and the Spring Semester is attributed to the school where the student was registered on June 1st. Regents examinations taken over the summer are attributed to that student’s school of enrollment on June 1 preceding the summer. However, these scores are not reflected in the Progress Report. Regents examination results for the Summer of 2007, for example, would be counted in the 2007–08 Progress Report and NOT in the 2006–07 Progress Report.

 

Schoolwide measures, such as Regents completion rates, credit accumulation rates, and test participation rates, include students with long-term absences (LTAs) and students who dropped out by attributing them to the school where they were most recently enrolled when they acquired LTA status or dropped out. Students with valid discharges (e.g., transfers out of state) are excluded from a school's performance rates.

 

Performance Levels

Performance levels (1, 2, 3 and 4) reflect the extent to which a student demonstrates the level of understanding expected at his/her grade level in ELA and Mathematics.

 

Proficiency Ratings

For purposes of the Progress Report, the scale scores awarded by the State on State mathematics and ELA exams are assigned a Proficiency Rating on a continuum from 1.00 to 4.50. These Ratings are used to determine the average incoming proficiency level of a school's entire population. A Proficiency Rating of 1.00 corresponds to the lowest score a student in Performance Level 1 can attain. A Rating of 1.99 corresponds to the highest score a student can attain and still be at Performance Level 1. A Rating of 2.50 corresponds to the midpoint between Performance Level 2 and Performance Level 3. Similarly, Ratings between 2.00 and 3.00 reflect scale scores between the State cut-off scores for Performance Levels 2 and 3, and Ratings between 3.00 and 4.00 reflect scale scores between the State cut-off scores for Performance Levels 3 and 4. Students who exceed the cut-off score for Performance Level 4 are assigned Proficiency Ratings from 4.01 to 4.50; a Rating of 4.50 corresponds to the highest score that can be attained on the test.
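The mapping from scale scores to Proficiency Ratings amounts to linear interpolation between the State cut-off scores. In the sketch below, the cut scores and maximum score are invented placeholders; actual State cut-offs vary by exam and year.

```python
# Sketch of the scale-score-to-rating mapping. The cut scores and
# maximum score are invented placeholders, not real State values.

CUTS = {1: 430, 2: 600, 3: 650, 4: 700}  # lowest scale score in each level
MAX_SCORE = 775                          # highest attainable scale score

def proficiency_rating(score):
    """Map a scale score onto the 1.00-4.50 continuum by linear
    interpolation between performance-level cut scores."""
    if score >= CUTS[4]:
        # Scores above the Level 4 cut-off spread over ratings 4.00-4.50.
        return 4.0 + 0.5 * (score - CUTS[4]) / (MAX_SCORE - CUTS[4])
    for level in (3, 2):
        if score >= CUTS[level]:
            return level + (score - CUTS[level]) / (CUTS[level + 1] - CUTS[level])
    if score >= CUTS[1]:
        # Level 1 scores spread over ratings 1.00 up to (not including) 2.00.
        return 1.0 + (score - CUTS[1]) / (CUTS[2] - CUTS[1])
    return 1.0

# A score midway between the Level 2 and Level 3 cut-offs maps to 2.50.
rating = proficiency_rating(625)
```

With these placeholder cut scores, 625 sits halfway between the Level 2 and Level 3 cut-offs and so maps to a Rating of 2.50, and the maximum score maps to 4.50.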

 

Progress Report 4-Year Graduation Cohort

In order to calculate 4-year diploma and graduation rates, the student population for the 4-year graduation rate is identified using the rules for the NYC Cohort. For example, the 2007 Cohort used for these purposes consists of all the general education students who started Grade 9 in a New York public school for the first time in the 2003–04 school year, as well as students who transferred to the New York public schools during high school and entered as 9th graders in 2003–04, 10th graders in 2004–05, 11th graders in 2005–06, or 12th graders in 2006–07.

 

Discharges, such as students who left the school system and enrolled in another educational setting or program, reached the age of 21, or died prior to completing high school, are excluded from the Cohort. Drop-outs are included in the Cohort. Because the 2006–07 Progress Report 4-year Cohort is based on student data as of the last week in August 2007, it may differ from the actual 2007 NYC Cohort, which is based on student data as of November 30, 2007.

 

Progress Report 6-Year Graduation Cohort

The Progress Report 6-year Cohort is determined using the same methodology as the NYC 4-year cohort, and is also based on student data as of the last week of August.

 

 

Elements of the Progress Report

Considerations in Computing the Overall Progress Report Grade

A Progress Report grade of A, B, C, D, or F will be assigned to each school based on a weighted average of the Report Elements plus any additional recognition the school obtains based on Exemplary Student Progress.

 

The Report Elements (described in detail below) include three main areas of measurement:

(I) School Environment,

(II) Student Performance, and

(III) Student Progress.

 

Particular weight is given to Student Progress and to each school’s performance in relation to peer schools. Recognition for Exemplary Student Progress among students most in need of attention and improvement is reported in a fourth category.

 

School Environment measures preconditions for learning: student attendance and other crucial aspects of the school’s environment, such as safety and parent, student, and teacher engagement in the process of accelerating student learning, as measured by scientific surveys of parents, students, and teachers. The School Environment component of the Progress Report counts for 15% of the overall Progress Report score.

 

 

Student Performance measures the percentage of students at a school who have reached the crucial goal of graduation, with emphasis on the number of students graduating with the Regents Diploma that State law now establishes as the goal for all students.

 

In the future, performance will also measure the percentage of students at the school who are becoming college-ready, as indicated by participation rates of 11th and 12th graders on PSAT, SAT, and ACT exams.

 

Student Progress measures the ability of a school to enhance the performance levels of students from one year to the next, and the incremental gains students make toward the long-term goal of earning a Regents diploma. The measure focuses on the capacities students develop as a result of attending the school, not the capacities they bring with them on the first day. Attention is given to all students in each school and particular emphasis is given to the one-third of students who entered high school at the lowest performance levels.

 

 

In addition, schools can earn additional credit on their Progress Report when their high-need students make exemplary gains. This component of the score can only improve a school’s overall Progress Report Score. It cannot lower a school’s score.
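Taken together, the computation described in this section reduces to a weighted average of the three category scores plus a non-negative bonus. In the sketch below, only the 15% School Environment weight comes from this report; the Performance and Progress weights are placeholders for illustration, not official figures.

```python
# Sketch of the overall Progress Report score. Only the 15%
# Environment weight is stated in this report; the Performance and
# Progress weights below are placeholders, not official figures.

WEIGHTS = {"environment": 0.15, "performance": 0.30, "progress": 0.55}

def overall_score(category_scores, exemplary_credit=0.0):
    """Weighted average of the three category scores, plus additional
    credit for exemplary gains by high-need students. The credit can
    only raise the score, never lower it."""
    base = sum(WEIGHTS[cat] * category_scores[cat] for cat in WEIGHTS)
    return base + max(0.0, exemplary_credit)

scores = {"environment": 60.0, "performance": 70.0, "progress": 50.0}
total = overall_score(scores, exemplary_credit=2.5)  # 57.5 + 2.5 = 60.0
```

The `max(0.0, ...)` guard reflects the rule that the exemplary-progress component can only improve a school's score, never lower it.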

 

Chart in PDF or printed report.

 

 

Reporting Metrics

Each Report Element is measured by how a school performed “Relative to the City Horizon” on that element and also “Relative to the Peer Horizon.”

 

Current Year (Your School’s Score This Year) shows the school’s results for the current year.

 

Proximity to Citywide Score (Your School Relative to City Horizon) shows the school’s results for the current year (2006–07) compared to the Citywide Range, which reflects the range of performance of all schools in the City in the 2003–06 reference period. For example, a score of 50% on a particular element means that the school’s performance on that element in the current year was exactly halfway between the bottom and top scores in the Citywide Range during 2003–06. Similarly, 75% signifies that the school’s score was three-quarters of the distance between the bottom and top of that Range.

 

Scores above 100% are possible if in the year of the Progress Report the school beats the top score in the 2003–06 Range. The Citywide score contributes one-third of the school’s score on every element.

 

Proximity to Peer Score (Your School Relative to Peer Horizon) is calculated in the same way as the Proximity to Citywide score, using the Peer Horizon Range instead of the Citywide Range. This score compares the school’s performance to the range of performance of all peer schools (defined above) during the 2003–06 reference period. The Peer Score contributes two-thirds of the school’s score on every element.
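Both proximity measures are the same linear interpolation against a fixed reference range, then combined one-third/two-thirds. A sketch, with made-up range values:

```python
def proximity(score, range_min, range_max):
    """Where a current-year score falls in a fixed reference range:
    0% at the bottom, 100% at the top. Values above 100% occur when
    the school beats the historical top score."""
    return 100.0 * (score - range_min) / (range_max - range_min)

def element_score(citywide_pct, peer_pct):
    """The Citywide proximity counts one-third and the Peer
    proximity two-thirds toward each element's score."""
    return citywide_pct / 3.0 + 2.0 * peer_pct / 3.0

# Hypothetical element: Citywide range 40-80, Peer range 45-75.
city = proximity(70, 40, 80)  # three-quarters of the way up the range
peer = proximity(63, 45, 75)
combined = element_score(city, peer)
```

With these made-up numbers, the Citywide proximity is 75%, the Peer proximity is 60%, and the combined element score is 65.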

 

Report Elements

Progress Reports include the following report elements by area of measurement:

School Environment

I.1 Attendance Rate is the school’s average daily attendance for Grades 9–12.

I.2–I.5 The Learning Environment Survey is administered yearly to parents, teachers, and middle and high school students. The survey gathers information on how well each school serves student learning from these key members of school communities. Each survey question informs school results in one of four domains.

 

I.2 Safety and Respect

Survey domain which measures the degree to which a school provides a physically and emotionally secure environment for learning. Students who feel safe are more able to engage in academic work and less likely to behave in ways that interfere with academic performance.

 

I.3 Academic Expectations

Survey domain which measures the degree to which a school encourages students to do their best and develops rigorous and meaningful academic goals. Expectations are communicated in direct and subtle ways, and are powerful motivators of student behaviors and performance. Schools with high expectations provide a learning environment in which students believe they are capable of academic success.

 

I.4 Engagement

Survey domain which measures the degree to which a school involves students, parents and educators in a partnership to promote student learning. Schools with a broad range of curricular offerings, activities, and opportunities for parents, teachers and students to influence the direction of the school are better able to meet the learning needs of children.

 

I.5 Communication

Survey domain which measures the degree to which a school effectively communicates its educational goals and requirements, listens to community members, and provides appropriate feedback on each student's learning outcomes. Access to this information can be used to establish a greater degree of agency and responsibility for student learning by all community members.

 

Each school receives a score for each question on the parent, teacher, and student surveys. Each question is linked to one of four domains: Safety and Respect, Academic Expectations, Engagement, and Communication. Question scores are combined to form domain scores of 0 to 10, which appear on the Progress Report.

 

Chart included in PDF or printed version.

 

Student Performance

II.1 4-Year Graduation Rate reflects the percentage of students in the school’s 4-year Cohort (defined above) that graduated with a Regents or Local Diploma. This graduation rate differs from the NYC Cohort graduation rate in that it reflects data as of August (while the City Cohort reflects data as of November 30th) and it does not include IEP and GED diplomas.

 

II.2 4-Year Weighted Diploma Rate by Type weighs each type of diploma based on the relative level of proficiency and college readiness indicated by each diploma type. GEDs, which are not included in the non-weighted Graduation rates, contribute to this measure.

 

Chart included in PDF or Printed Report

 

Diplomas are weighted as shown in the chart. A school in which every student in the 4-year Cohort earned the most heavily weighted diploma would have a 4-year weighted diploma rate of 300%, and a school with no graduates in the 4-year Cohort would earn 0%. This rate uses the same Cohort as the 4-Year Graduation rate.

 

II.3 6-Year Diploma Rate. Like the 4-year rate, the 6-year rate reflects the percentage of students in the school's 6-year Progress Report Cohort that graduated with a Regents or Local Diploma. It differs from the NYC Cohort graduation rate in the same ways as the 4-Year Graduation rate.

 

II.4 6-Year Weighted Diploma Rate by Type weighs each type of diploma based on the relative level of proficiency and college readiness indicated by each diploma type. It creates a weighted diploma rate for students in the 6-year Cohort using the same weights as the 4-year weighted diploma rate above. This rate uses the same Cohort as the 6-Year Diploma Rate.

 

Student Progress

III.1 Percentage Earning 10+ Credits in 1st Year measures the percentage of students at a school who accumulated 10 or more academic credits in their first year of high school. Credit and responsibility for students who are registered at different schools for fall and spring semesters are split equally between the initial school and the transfer school.

 

III.2 Percentage Earning 10+ Credits in 2nd Year measures the percentage of students at a school who accumulated 10 or more academic credits in their second year of high school.

 

III.3 Percentage Earning 10+ Credits in 3rd Year measures the percentage of students at a school who accumulated 10 or more academic credits in their third year of high school.

 

III.4 Average Change in PSAT Score, 2nd to 3rd Year is the average increase or decrease in students’ PSAT scores taken in their 3rd year of high school from their PSAT scores in the 2nd year of high school. This measure is not included on the 2006–07 Report and will not go into effect until PSAT tests have been provided free of charge to successive classes of 10th and 11th graders.

 

III.5 Average Completion Rate for Remaining Regents evaluates a school’s ability to help students progress each year towards passing the five Regents Tests required for a Regents diploma: English, Mathematics, Science, U.S. History, and Global Studies.

 

 

Under the State’s requirements for Regents diplomas, students pass a Regents test when they score 65 or higher. At the beginning of each year, high school students are treated for purposes of this element as eligible to pass any of the five Regents Tests on which they have not yet received a score of 65 or better. This element measures the proportion of all Regents Tests that students were eligible to pass at the beginning of the school year, as compared to the number they passed by the end of the school year. That proportion is calculated by dividing the number of Regents Tests that students at the school passed with a 65 or higher for the first time in the current year (the numerator) by the number of Regents Tests that all students in the school were eligible to pass at the beginning of the school year (the denominator). So, for example, a student who passed U.S. History and Mathematics A (each for the first time) this year contributes two to the numerator.

 

Chart included in PDF and printed report.

 

In order to give schools a choice about whether to give Regents Tests in Grade 9, this element calculates Regents Test eligibility by treating Grades 9 and 10 as a single class. For example, if a student who is currently in her second year of high school passed Living Environment in Grade 9 and English and Math A in the current year in Grade 10, she would contribute 3 (for the tests passed) to the numerator and 5 (for all tests that she was eligible to take at the beginning of Grade 9) to the denominator. Next year, this same student will contribute 2 to the denominator for her third year of high school because she will only be required to pass 2 of the required exams. Exams that transfer students had passed before entering a school are excluded from both the numerator and the denominator. Similarly, if a student passed Mathematics A in Grade 8, that exam is excluded from both the numerator and the denominator.

All students enrolled at the school, as well as students with long-term absences or who have dropped out, contribute to both the numerator and the denominator of this metric. For the purpose of this element, the Mathematics requirement can be satisfied by passing either Mathematics A or Mathematics B, and the Science requirement can be satisfied by passing any of the following Regents exams: Chemistry, Earth Science, Living Environment, or Physics. Regents Competency Tests (RCTs) may be substituted for Regents exams for eligible students.
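The numerator/denominator rule described above can be sketched as follows; the student records are hypothetical.

```python
# Sketch of the Average Completion Rate for Remaining Regents;
# the student records below are hypothetical.

def completion_rate(students):
    """students: list of (eligible_at_start, passed_this_year) pairs,
    where eligible_at_start is the set of required Regents exams not
    yet passed with a 65+ at the start of the year, and
    passed_this_year is the subset passed for the first time this
    year. Returns first-time passes divided by eligible exams."""
    numerator = sum(len(passed) for _, passed in students)
    denominator = sum(len(eligible) for eligible, _ in students)
    return numerator / denominator if denominator else 0.0

students = [
    # 2nd-year student: eligible for all 5 exams, first-time passed 3
    ({"English", "Math", "Science", "US History", "Global Studies"},
     {"English", "Math", "Science"}),
    # 3rd-year student: 2 exams remaining, first-time passed 1
    ({"US History", "Global Studies"}, {"US History"}),
]
rate = completion_rate(students)  # 4 passes / 7 eligible exams
```

Exams passed before entering the school (or in Grade 8) would simply be left out of both sets, which keeps them out of both the numerator and the denominator.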

 

III.6 – Weighted Regents Pass Rate. On a Citywide basis, students’ entering proficiency, as measured by their performance on State Grade 8 subject tests, is predictive of their likelihood of passing the high school Regents exams. These elements evaluate the extent to which a high school helps its students meet or exceed those expectations, rather than fall below them.

 

This element is calculated as follows for each Regents Test:

A weighted Regents pass rate is calculated for each of the five Regents Tests required for a Regents diploma: English, Mathematics, Science, U.S. History, and Global Studies. As is true in calculating the Average Regents Completion Rate, students are treated as passing an exam for purposes of this element when they first score 65 or higher, and students who have passed the same Regents Test in a prior year are not included in this element. In all cases, students who pass one of the five Regents tests will not be penalized if they attempt the test again and fail. However, for Mathematics and Science, students who attempt and pass different Regents exams in subsequent semesters will contribute positively to the school’s Weighted Regents Pass Rate score.

 

Students, including those in Grade 9, who pass a subject Regents Test for the first time contribute to their school’s weighted student pass rate, but students who had lower proficiency upon entering high school are weighted to contribute more. If only one in five students with a student’s entering proficiency is expected, based on prior experience of all City students, to pass a subject Regents Test, then the student’s weight on that Regents is five. If one in two students with the same entering performance level passed the Regents, then that student’s subject weight is two. When the first student passed the Regents with 65 or higher, he would contribute five to his school’s weighted Regents pass rate. When the second student passed with 65 or higher, he would contribute two. Students who score below the passing mark, and who have not yet achieved a passing score of 65 on the same test or on one of the other tests within that subject (e.g., Mathematics or Science), contribute zero. The Weighted Regents Pass Rate is the average contribution of all students who took the exam. (Students who had previously passed that exam and chose to retake it are excluded from this measure.) The weighted contribution of Students with Interrupted Formal Education (SIFE) who pass the English Regents is determined by the historic pass rate of SIFE students. Because the weight that each student contributes is inversely proportional to his/her expectation of passing the Regents Test, all schools have a statistical expectation of one on these measures.
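The inverse-probability weighting above can be sketched as follows. The pass probabilities are invented for illustration, and `weighted_regents_pass_rate` is not an official name; note how the two worked examples from the text (weights five and two) reappear.

```python
# Sketch of the weighted pass-rate logic above. A student's weight is the
# inverse of the Citywide pass probability for students with the same
# entering proficiency; passers contribute their weight, non-passers zero.

def weighted_regents_pass_rate(takers):
    """takers: list of (citywide_pass_probability, passed_flag) pairs."""
    contributions = [(1.0 / p if passed else 0.0) for p, passed in takers]
    return sum(contributions) / len(contributions)

takers = [
    (0.2, True),   # 1-in-5 expected to pass -> weight 5
    (0.5, True),   # 1-in-2 expected to pass -> weight 2
    (0.5, False),  # did not pass -> contributes 0
]
print(weighted_regents_pass_rate(takers))  # (5 + 2 + 0) / 3, about 2.33
```

Because each passer contributes 1/p and passes with probability p, every student’s expected contribution is one, which is why all schools share a statistical expectation of one on this measure.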

 

Elements III.6–III.10 are also calculated using only the population in the school’s lowest third. Students in the Lowest Third are determined by the same State Grade 8 subject tests. Exemplary Student Progress, for General Education High Schools, reflects the percentage of students most in need of improvement who reach a key milestone that is highly predictive of high school graduation. It measures the percentage of particular groups of high-needs students in their first, second, or third years of high school who earn 11 or more credits. This element contributes only additional credit toward the school’s Grade: it can raise but cannot lower the school’s grade. In this way, schools with exemplary instruction and progress are encouraged to enroll students most in need of improvement.

 

The five student groups who are eligible for additional credit through Exemplary Proficiency Gains, listed below, are aligned with the NCLB categories. Students may belong to more than one of these groups, and if these students make exemplary gains, their gains may count twice or three times towards additional credit for that school.

 

Schools receive additional credit if the percentage of students in each of the five categories below making exemplary gains is in the highest forty percent of all schools Citywide. Specifically, 1.5 points are added for each element in which the school’s percentage of qualifying students making exemplary gains is in the top 40% of all schools, and 3.0 points are added for each element in which exemplary gains are in the top 20% of all schools. The percentage of students in each category making exemplary gains is indicated on the Progress Report, followed by a notation indicating whether the school received additional credit for gains among any relevant category of students. Categories in which the school has fewer than twenty students are represented with the symbol “–”.
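The additional-credit rule can be sketched as a small function. This is a reconstruction for illustration only; `additional_credit` is not an official name, and the cut-off values passed in the examples are the English Language Learner figures quoted later in this section (52.2% and 63.3%).

```python
# Sketch of the additional-credit rule above: 3.0 points when the school's
# percentage of students making exemplary gains reaches the top-20% cut-off
# Citywide, 1.5 points at the top-40% cut-off, and nothing when the category
# has fewer than twenty students (shown as "-" on the Progress Report).

def additional_credit(pct_exemplary, n_students, cut_40, cut_20):
    if n_students < 20:          # below the minimum N-value
        return 0.0
    if pct_exemplary >= cut_20:  # top 20% of all schools Citywide
        return 3.0
    if pct_exemplary >= cut_40:  # top 40% of all schools Citywide
        return 1.5
    return 0.0

print(additional_credit(55.0, 30, 52.2, 63.3))  # 1.5
print(additional_credit(70.0, 30, 52.2, 63.3))  # 3.0
print(additional_credit(70.0, 10, 52.2, 63.3))  # 0.0 (too few students)
```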

 

Chart in PDF or Printed version.

 

The minimum score cut-offs to earn exemplary gains in each category are listed in the table below. For example, at least 52.2% of a High School’s English Language Learners must achieve exemplary credit gains in order for the school to earn an additional 1.5 points in that category. Furthermore, at least 63.3% of a High School’s English Language Learners must achieve exemplary credit gains in order for the school to earn an additional 3.0 points in that category.

 

IV.1 English Language Learners: students currently identified as being Limited English Proficient.

 

IV.2 Special Education Students: students currently identified as having special needs.

 

IV.3 Hispanic Students Who Entered High School in Lowest Third Citywide: Hispanic students who performed in the lowest third of all students Citywide in their grade level, in the year the tests were taken, on their Grade 8 English Language Arts and Mathematics exams.

 

IV.4 Black Students Who Entered High School in Lowest Third Citywide: Black students who performed in the lowest third of all students Citywide in their grade level, in the year the tests were taken, on their Grade 8 English Language Arts and Mathematics exams.

 

IV.5 Other Students Who Entered High School in Lowest Third Citywide: other students who performed in the lowest third of all students Citywide in their grade level, in the year the tests were taken, on their Grade 8 English Language Arts and Mathematics exams. If the school did not have 20 students (the minimum N-value) in the category of Hispanic Students in the Lowest Third Citywide or 20 Black Students in the Lowest Third Citywide, then those students are included in the category Other Students in the Lowest Third Citywide.

 

The exemplary gain measures are different for the Specialized High Schools. These schools are evaluated based on the percentage of students earning 85% or higher on each Regents subject. The top-performing Specialized High School in a subject receives 2.0 points, while the second- and third-highest-performing Specialized High Schools in that subject each receive 1.0 point. The five Regents subjects are: ELA, Math, Science, U.S. History, and Global Studies. Cut scores for each subject are listed in the table below:

 

Table available in PDF or printed version.
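The ranking-and-bonus rule for Specialized High Schools can be sketched as follows. The school names and percentages are invented for illustration, and `specialized_bonus` is not an official name.

```python
# Sketch of the Specialized High School bonus above: schools are ranked by
# the percentage of students scoring 85 or higher in a subject; the top
# school receives 2.0 points, the second and third receive 1.0 each.

def specialized_bonus(pct_85_by_school):
    """pct_85_by_school: {school: percent scoring 85+}. Returns bonus points."""
    ranked = sorted(pct_85_by_school, key=pct_85_by_school.get, reverse=True)
    bonus = {school: 0.0 for school in ranked}
    for rank, school in enumerate(ranked[:3]):
        bonus[school] = 2.0 if rank == 0 else 1.0
    return bonus

print(specialized_bonus({"A": 91.0, "B": 88.5, "C": 86.0, "D": 80.0}))
# {'A': 2.0, 'B': 1.0, 'C': 1.0, 'D': 0.0}
```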

 

Final Calculation of Progress Report Grade

Category Scores are calculated by weighting the values of the Proximity to City Horizon (×1) and Proximity to Peer Horizon (×2) measures for School Environment, Student Performance, and Student Progress. As the weighting indicates, Proximity to Peer Horizon counts twice as much as Proximity to City Horizon. These weighted values are then averaged to create scores for School Environment, Student Performance, and Student Progress. The school’s Weighted Total Score (excluding additional credit for Exemplary Student Progress) is a weighted average of School Environment (15%), Student Performance (30%), and Student Progress (55%).
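The two-stage weighting described above can be sketched as follows. This is a reconstruction under the assumption that “averaged” means the weighted values are divided by the sum of the weights (1 + 2 = 3); the input values and function names are illustrative.

```python
# Sketch of the Weighted Total Score above: within each category, Proximity
# to Peer Horizon counts twice as much as Proximity to City Horizon; the
# three category scores are then combined 15% / 30% / 55%.

def category_score(city_horizon, peer_horizon):
    # Weighted average with weights 1 (City) and 2 (Peer).
    return (1 * city_horizon + 2 * peer_horizon) / 3

def weighted_total(environment, performance, progress):
    """Each argument is a (city_horizon, peer_horizon) pair."""
    return (0.15 * category_score(*environment)
            + 0.30 * category_score(*performance)
            + 0.55 * category_score(*progress))

score = weighted_total((0.6, 0.7), (0.5, 0.6), (0.4, 0.8))
print(round(score, 4))  # 0.6367
```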

 

A school’s Weighted Total Score is then assigned a percentile ranking based on the range of all Weighted Total Scores Citywide during the 2006–07 academic year. These Percentile Ranks are used to determine each school’s Target for next year using the values in Table #4 below.

Additional credit for schools obtaining Exemplary Student Progress as defined above is then added to the Weighted Total Score. The formula, a worked example, and the accompanying table are available in the PDF or printed version.

 

 

 
