How many Abbott school districts are in New Jersey?

Put the administration of school construction funds back into the state Economic Development Authority and return control of building projects to the individual school districts, with appropriate oversight. If it needs fixing, then the governor needs to roll up his sleeves and fix it. But these successes seem to be primarily the result of relentless vigilance by ELC and the power of the Supreme Court.

So, last year, he sued again to force compliance. Murphy replied by submitting a statewide strategic plan, two years late, but with no cost estimates and no source of funding.

The court ruled that ELC should wait and see what the governor and Legislature would provide in the budget that was just passed and signed into law. Nonetheless, the Assembly takes its role of ensuring an equitable education system for all students very seriously. He put in good people. When Murphy took office, Sciarra said, he expected a willing and progressive partner, one who had ridden into office with the support of labor and education unions and a promise to restore school funding that had been dramatically reduced by Gov.

Chris Christie, a Republican. Murphy inherited an agency that was well-run, but broke. He has brought back problems we saw during McGreevey — a return to the bad old days of waste, mismanagement and corruption. The SDA was still broke, but no longer well-run.

The failure of any one group of students to improve can result in designation as a school in need of improvement. Additional factors not specifically referenced by the Court, but appropriate for consideration, are school safety and the ability of all students to read at grade level by Grade 3.

Therefore, to determine educational adequacy, the following factors will be considered. These criteria underscore the essential need for a statewide student-level database, because such a database will produce much more rigorous and credible data for making these judgments. Districts with concentrated poverty whose students are consistently attaining acceptable achievement may not be designated Abbott Districts, though their ability to offer a sound educational program in the face of substantial obstacles should be celebrated and shared with other districts.

The Department of Education devised this system to rank every district in the state by seven factors: (1) per capita income level; (2) occupation level; (3) education level; (4) percent of residents below the federal poverty level; (5) density (the average number of persons per household); (6) urbanization (percent of district considered urban); and (7) unemployment (percent of those in the work force who received some unemployment compensation) (Abbott II). The DFG model was revised following the census, as follows: (1) percent of population with no high school diploma; (2) percent with some college; (3) occupation; (4) income; (5) unemployment; and (6) poverty.

Additional criteria for urban districts derive from those districts originally identified as "urban aid districts" by the Department of Community Affairs pursuant to state statute (Abbott II). Municipal overburden, which can be determined by comparing the sum of the county, municipal, and school tax rates among the school districts, must also be considered.

(Abbott II.) In the Educational Facilities and Financing Act, the Legislature provided additional guidance for determining socioeconomic status for new or continuing Abbott Districts.

The criteria may include, but are not limited to: the number of residents per 1,000 within the municipality or municipalities in which the district is situated who receive TANF; and the district's equalized valuation per resident pupil. Given this context and its own analysis, the Department of Education adopts the following economic indicators to guide Abbott classification.

The district must be DFG A and satisfy the following criteria. DFG B districts that also meet the following criteria may be classified as Abbott if they demonstrate additional substantial economic hardship:

If the district has a low-income concentration of less than 60 percent, then it must have an equalized value per capita at least 3 percent below the state average and an equalized tax rate at least 30 percent greater than the state average;

If the low-income concentration is at least 60 percent, then the equalized value per capita must be at least 3 percent below the state average; the equalized tax rate does not factor into the eligibility requirement for these districts; and
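Read together, the two clauses above define a simple decision rule. Here is a minimal sketch, assuming only the 60 percent, 3 percent, and 30 percent thresholds quoted above; the function name and argument names are hypothetical, not from the department's materials:

```python
def meets_abbott_economic_criteria(low_income_pct, value_per_capita,
                                   tax_rate, avg_value, avg_tax_rate):
    """Hypothetical encoding of the economic screen described above."""
    # Equalized value per capita must be at least 3 percent below
    # the state average in every case.
    value_ok = value_per_capita <= 0.97 * avg_value
    if low_income_pct >= 60:
        # At 60 percent or more low-income concentration, the tax
        # rate does not factor into eligibility.
        return value_ok
    # Below 60 percent, the equalized tax rate must also be at least
    # 30 percent above the state average.
    return value_ok and tax_rate >= 1.30 * avg_tax_rate
```

A district with a 65 percent low-income concentration passes on value per capita alone, while one at 50 percent must clear both the value and tax-rate tests.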

In terms of Abbott designation, the remedies, as defined both broadly and specifically, are intended to improve student achievement through the application of resources and support that the community, because of economic conditions, is unable to provide.

It is under those circumstances that the Abbott decision requires that additional state aid and remedies be provided. However, when durable and concentrated poverty are combined with significant educational improvement, Abbott status will be maintained.

Change in status requires a systematic and timely review of current Abbott Districts and of prospective Abbott Districts so that a determination consistent with the intent and spirit of Abbott can be made.

Such a review will occur after the decennial census, although prospective districts could petition the Commissioner for such designation and appeal through the courts if denied. There may be circumstances where the Commissioner independently determines that students in a particular district require the full array of Abbott remedies.

In the instance where a district applies for Abbott designation, a thorough and promulgated review process must be established.

In all cases, both the relevant educational and economic criteria must apply. Abbott designation carries with it the obligation to implement specific court-ordered remedies. Prospectively, one could classify any district eligible for any state-supported remedy as Abbott regardless of the range of special services. It is preferable to limit Abbott designation to districts that qualify for most, if not all, Abbott remedies.

Other districts which may be determined by the Commissioner to be in need of additional remedies but not the full panoply of Abbott remedies would be designated as non-Abbotts.

The redesign of a modified school formula should provide for the "sliding scale" approach for needy, non-Abbotts.

This approach would be targeted to documented district needs, which is preferable to across-the-board Abbott designations. Particular attention should be given to K-8 districts requiring additional resources.

They do not fit into the Abbott models, which are all PK districts. Additionally, high-poverty rural districts may require additional resources to deal with issues of high student mobility and their inability to attract highly qualified teachers. Finally, for those districts not providing programs beyond the fifth or eighth grade, there must be effective articulation between the lower schools and the districts into which they feed.

Eligible districts will be asked to develop a three-year plan to integrate preschool into their curriculum and other programs to be determined. It is anticipated that preschool will then be in place for all eligible children in those districts.

These scores were used to rank the occupation groups on a scale of 1 (least prestigious) to 12 (most prestigious), and a community prestige score was calculated based on the percent of residents who held jobs in each category.

This methodology is very similar to the education measure produced in the first two iterations and has similar shortcomings. While this was noted in the DFG report, experimentation with alternative measures failed to produce better results. To that end, all three DFG reports measured occupational status in the same manner. This stark difference failed to capture degrees of variation that may exist across districts. The most recent report dropped the urbanization variable and added population density.

This was an attempt to measure the same concept in a more refined manner, capturing nuanced differences among the districts that would not show up in a dichotomous variable. The first iteration used average family income. In a later iteration, this was switched to median family income, as the average may be skewed by a small number of outlying observations, and the median has been used since. The second analysis changed the unemployment measure to capture the percent of workers who received unemployment compensation at some point in the previous year.

The most recent DFG analysis noted that some unemployed individuals do not actually receive unemployment compensation. As such, that report reverted to the traditional unemployment rate.

This measure does not include individuals who do not live with any relatives. Later analyses used the more inclusive person-level poverty rate. When the DFGs were developed, exploratory analysis suggested that this variable was no longer a useful indicator of SES, so it was dropped. The report noted that, over time, this had become a less reliable indicator of SES as people became increasingly likely to relocate to pursue better career opportunities.

This variable has not been utilized since that DFG report. While a detailed explanation of this procedure is beyond the scope of this report, a general description will provide insight into how the DFGs are determined. PCA is a technique designed to express the information contained in a group of highly correlated variables in a smaller number of variables.

For example, assume a situation in which an analyst has collected height and weight data for a population. PCA could be used to calculate a new variable called a principal component that captures the same information, but with the use of only one variable instead of two. One could view this combination of the height and weight data as a more generic size measure.

This description is very simplified. In fact, the PCA process will not produce just one principal component. Rather, it will create as many principal components as there are variables in the original analysis.

One would not use all of the principal components, however, because that would be inconsistent with the objective of reducing the number of variables included in the analysis. Instead, only the first principal component, which captures the largest share of the variance, is retained. This is a reasonable approach if the variables included in the analysis impact the first principal component in a manner consistent with expectations (for example, if the results show that higher income decreases the first principal component, it is likely that the first principal component is not measuring SES).
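The height-and-weight illustration above can be sketched numerically. This is a toy example with made-up data, not the department's actual calculation: two correlated variables are standardized, the correlation matrix is eigen-decomposed, and the first principal component becomes a single "size" score that carries most of the information:

```python
import numpy as np

# Made-up, correlated height/weight data for 200 people.
rng = np.random.default_rng(0)
height = rng.normal(170, 10, 200)                        # cm
weight = 0.9 * (height - 170) + rng.normal(70, 4, 200)   # kg, tracks height

X = np.column_stack([height, weight])
Xs = (X - X.mean(axis=0)) / X.std(axis=0)                # standardize each variable

# Eigen-decomposition of the correlation matrix yields the principal components.
corr = np.cov(Xs, rowvar=False)
eigvals, eigvecs = np.linalg.eigh(corr)
order = np.argsort(eigvals)[::-1]                        # largest eigenvalue first
eigvals, eigvecs = eigvals[order], eigvecs[:, order]

# The first principal component gives one "size" score per person; its
# eigenvalue share is the fraction of total variance it explains.
size_score = Xs @ eigvecs[:, 0]
explained_share = eigvals[0] / eigvals.sum()
```

Because height and weight are strongly correlated here, the first component captures well over half of the total variance, which is exactly the condition under which collapsing two variables into one is defensible.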

Once the PCA has been implemented and the first principal component has defined a numeric measure of relative SES, the districts must be grouped into the DFG classes. The first two DFG reports utilized a simple method: the districts were grouped into deciles (ten groups containing an approximately equal number of districts) based on their SES score (the first principal component discussed above). The report noted that this grouping method, while straightforward, was flawed. The process of classifying districts into equally sized deciles did not account for the magnitude of the difference in the SES scores across districts.

This represented a particular problem in the middle of the distribution, where a large number of districts had similar SES scores. One result of this problem was that in some cases, average test scores were higher in lower DFGs. The analysis classified districts based on the range of SES scores. These groupings became the eight DFG categories currently used. Given the expanded use of the DFG classification, particularly the lowest and highest categories, efforts were made to preserve the underlying meaning of these groups.
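The difference between the two grouping methods can be illustrated with hypothetical SES scores (all numbers below are invented). Equal-count deciles split a cluster of nearly identical mid-range scores across many groups, while cutting the score range into equal-width intervals keeps them together:

```python
import numpy as np

# Hypothetical SES scores for 20 districts, bunched in the middle of the range.
ses = np.array([-2.0, -1.8, -0.2, -0.15, -0.1, -0.05, 0.0, 0.02, 0.04, 0.06,
                0.08, 0.10, 0.12, 0.15, 0.2, 0.25, 1.5, 1.7, 1.9, 2.1])

# Decile-style grouping: equal-count bins, blind to how close the scores are.
decile_group = np.argsort(np.argsort(ses)) * 10 // len(ses)

# Range-based grouping: cut the SES *range* into equal-width intervals,
# so districts with similar scores land in the same group.
edges = np.linspace(ses.min(), ses.max(), 9)    # 8 equal-width classes
range_group = np.digitize(ses, edges[1:-1])
```

The 14 middle districts, whose scores differ by less than half a point, are scattered across seven decile groups but fall into only two range-based groups, which is the flaw the later methodology was designed to fix.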

In determining the DFGs using the Decennial Census data, the overarching goal was to continue refining the methodology in ways that make the calculation more accurate while preserving the basic meaning of the DFG classifications (particularly the two lowest and two highest categories). To this end, the department began the process by obtaining feedback from districts regarding modifications that might be required. Through various means of communication, the department received a significant number of comments.

The most common concerns can be classified into one of four categories. It should be noted that questions were not raised regarding the statistical technique used to determine the SES scores or the method for grouping districts into DFG classes.

Given the previous and future uses of the DFGs, one key objective is to preserve the underlying meaning of the groupings, particularly at the low and high ends. In the absence of any compelling reason to modify these methods, the decision was made to continue the same quantitative analysis technique and grouping method used in the development of the DFGs. The four subject areas raised during various discussions were explored at length in developing the DFGs.

The process is discussed and the final decisions are explained here. In reviewing the previous DFG analyses and discussing the measure with representatives from school districts, a number of questions were raised with regard to variables that might improve the DFG calculation. The previous inclusion of one variable, population density, was called into question. When determining whether such variables should be added to the model, several factors were considered.

Empirical results: after experimenting with various models, variables that do a poor job of defining SES should be dropped from the final analysis. In updating the DFGs, six changes to the model specification were tested with the above four considerations in mind. The empirical analysis is straightforward. The first model was a baseline version that included the same seven variables as the previous DFGs. Each additional option made one change, to allow a clear comparison to the baseline version.

Each variable used is discussed below. Table 2 summarizes the results of the PCA models. A review of literature on SES does not reveal frequent use of this measure.

Furthermore, a table in the DFG report suggests that this variable was substantially weaker than the other six in terms of explaining SES. The share of explained variance increases by nearly 10 percentage points, or 14 percent. However, the percent of students classified as LEP is not an appropriate measure for this analysis, as it is at least partly determined by district policy and practice. The census data provide two variables that could be used to measure this phenomenon: (1) the percent of people between the ages of 5 and 17 who do not speak English well, and (2) the percent of households that are "linguistically isolated" (households in which no one over the age of 14 speaks English well).

It should be noted that some analysis was done with the first variable when the DFGs were developed. However, the report concluded that this was not a reasonable measure of SES. The empirical analysis here corroborates those results: including the percent of individuals who do not speak English well decreases the explained variance by roughly 6 percentage points, and including linguistic isolation yields a similarly sized decrease of roughly 5. While it appears that further analysis is warranted, it should be noted that the DFG analysis explored using this variable as an alternative to the poverty measure.

It was determined that poverty was a more appropriate variable. In this analysis, the percent of families with children is explored as a supplement to the other variables; however, the results show a slight decrease in the percent of variance explained (roughly 1 percentage point). As noted in the report, this does not provide information on how poor these individuals are. While the inclusion of this variable seems intuitive, it caused a small decrease in the percent of variance explained (under 1 percentage point).

This idea raises two concerns. First, similar to the percent of students classified as LEP, it is a measure that partly depends on district-level decisions. Second, there appears to be nothing in the research literature on this topic that links disability status to SES.

To explore this linkage, census data are used to estimate the percent of people between the ages of 5 and 20 who have some disability (this measure has the benefit of not being affected by district-level decision-making). The explained variance decreases by roughly 4 percentage points. The census data do not include variables that may be used as a proxy for student mobility. As an alternative, data from the School Report Card were aggregated to the school district level to estimate the mobility rate.

Given the above discussion, it appears that the best model should include six variables: percent of adults with no high school diploma, percent of adults with some college education, occupational status, median family income, poverty rate, and unemployment rate. A considerable number of school districts are engaged in sending-receiving relationships whereby a district educates students from another community on a tuition basis.
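As a sketch of how such a six-variable SES score might be assembled (synthetic data; variable construction and column order are assumptions, not the department's code), one can standardize the variables, extract the first principal component, and orient it so that higher scores mean higher SES, consistent with the sign check described earlier:

```python
import numpy as np

rng = np.random.default_rng(1)
ses_latent = rng.normal(size=300)    # unobserved "true" SES for 300 districts

def noisy(sign):
    # A synthetic variable that tracks latent SES with the given sign, plus noise.
    return sign * ses_latent + rng.normal(scale=0.5, size=300)

# Columns: no diploma, some college, occupation, median income, poverty, unemployment.
# Education deficits, poverty, and unemployment move opposite to SES.
X = np.column_stack([noisy(-1), noisy(1), noisy(1), noisy(1), noisy(-1), noisy(-1)])
Xs = (X - X.mean(axis=0)) / X.std(axis=0)

eigvals, eigvecs = np.linalg.eigh(np.corrcoef(Xs, rowvar=False))
pc1 = eigvecs[:, np.argmax(eigvals)]    # loading vector of the first component

# Orient the component so higher score = higher SES: median income (column 3)
# should load positively; flip the whole component if it does not.
if pc1[3] < 0:
    pc1 = -pc1
ses_score = Xs @ pc1
```

After orientation, income loads positively and poverty negatively, so the resulting score rises with socioeconomic status, matching the expectation test the report applies to its first principal component.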

There may be situations in which a district receives students from a community with substantially different demographics. As designed in the past, the DFGs were based on the characteristics of the community in which the district is located, not the communities in which the enrolled students live.


