
Ed Knows Policy

EKP -- a local (Washington, DC) and national blog about education policy, politics, and research.


Friday, March 31, 2006

California Dreamin'

Big score for Susanna Loeb at Stanford and her colleagues who got four foundations to pony up $2.6 million to fund 23 studies in one year on California's edumess. Not sure if this is going to be new research or just a compilation of stuff these folks have already done.



From the 4-pager:


“Getting Down to Facts” is a research project of more than 20 studies designed
to provide California’s policy-makers and other education stakeholders with the
comprehensive information they need to raise student achievement and reposition
California as an education leader. The purpose of the research project is to
carve out common ground for a serious and substantive conversation that will
lead to meaningful reform by providing ground-level information about
California’s school finance and governance systems necessary to assess the
effectiveness of any proposed reform.

Emphasis is mine. What can that underlined part mean? You tell me.

Anyway, Susanna spreads the love.

Getting Down to Facts Researchers

From Stanford:
· Susanna Loeb
· Anthony Bryk
· Linda Darling-Hammond
· Eric Hanushek
· Michael Kirst
· William Koski
· Rob Reich

· Eric Brunner from Quinnipiac University;
· Jennifer Imazeki from San Diego State University;
· William Duncombe and John Yinger from Syracuse University;
· Thomas Downes from Tufts University;
· Patricia Gandara and Thomas Timar from the University of California at Davis;
· Bruce Fuller from the University of California at Berkeley;
· Russell Rumberger from the University of California at Santa Barbara;
· Margaret Goertz from the University of Pennsylvania;
· Dominic Brewer and Priscilla Wohlstetter from the University of Southern California;
· Allen Odden from the University of Wisconsin;
· Jay Chambers, Tom Parrish, Jesse Levin and Maria Perez from American Institutes for Research;
· Trish Williams and Mary Perry from EdSource;
· Jon Sonstelie and Heather Rose from the Public Policy Institute of California;
· Janet Hansen of the RAND Corporation;
· Ron Bennett and Robert Miyashiro from School Services of California; and
· Ida Oberman and Jim Hollis from Springboard Schools



Wednesday, March 29, 2006

Top 100 High Schools?

I finally got around to reading Andy Rotherham and Sara Mead's critique of the "Challenge Index," which Jay Mathews of the Washington Post got Newsweek to implement for a Top 100 High Schools list.

I have to say, Rotherham and Mead win the debate hands down. I don't think achievement gaps are something parents necessarily care about so much for their own kids, but the critique of the Challenge Index sticks: it is narrow to the point of being trivial.

Total IB and AP tests taken? You could have a bad school that takes lots of those tests and a good one that doesn't. It's that simple.

Reading Mathews' response doesn't make me feel better. He seems incapable of making an argument without relying on anecdotes, which gets tiresome fast, especially if you have any training in or exposure to real social science. And by the way, Jay, not every high school graduate has to go to college. Some schools may excel at preparing non-college-bound students for the workforce and life. We have a complex economy with division of labor and comparative advantage in things like the production of high- and low-skill workers.

First, let me start with my answer to both of these parties. Ranking high schools is fine, but it should reflect the quality of the teachers, meaning their ability to consistently raise student achievement above what it would have been if those same students had been educated by other, less productive teachers. We sometimes call this value added. I know, it's hard to measure value added, but that doesn't mean we should throw our hands up and use a crude measure like the "Challenge Index" (roll eyes).

How should value added be estimated? That's easy. First, you need test scores in every grade, in every year, and in every subject that matters, and you need to test in-migrants when they enter the system. You also need data on the factors outside the school's and teachers' control, such as the students' family background, disability status, and so on. What, you don't have all that data? You don't like to use test scores to measure achievement? Then you must not really be committed to knowing which schools are good. Schools that can't pony up high-quality data on their students' performance should be left off rankings as "don't care enough to measure their progress."

Then you hand the data to someone who knows what they're doing. A good set of value-added indicators would give the total value added, which is the effect on the average student's test scores of attending school X rather than other schools, and the intrinsic value added, which nets out the peer effects. If you are a parent, you care about total performance. If you are an administrator, school board member, or legislator deciding who gets funds and which interventions are working, you care about intrinsic performance, since most schools cannot choose their students (peer group).
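
If you want to see the distinction concretely, here is a minimal sketch with made-up data (every school, variable, and effect size below is hypothetical, not anything estimated from real records). The "total" regression conditions only on prior achievement; the "intrinsic" one also nets out a crude background control, standing in for the peer and family factors a school can't choose.

```python
# Minimal sketch of "total" vs. "intrinsic" value added on synthetic data.
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 3000
df = pd.DataFrame({
    "school": rng.choice(["A", "B", "C"], size=n),
    "prior_score": rng.normal(50, 10, size=n),
    "low_income": rng.integers(0, 2, size=n),  # crude stand-in for family background
})
# Fake outcome: school B adds 3 points; background matters too.
df["score"] = (df["prior_score"]
               + 3 * (df["school"] == "B")
               - 2 * df["low_income"]
               + rng.normal(0, 5, size=n))

# "Total" value added: school effects conditioning only on prior achievement.
total = smf.ols("score ~ prior_score + C(school)", data=df).fit()

# "Intrinsic" value added: also net out factors outside the school's control.
intrinsic = smf.ols("score ~ prior_score + low_income + C(school)", data=df).fit()

print(total.params.filter(like="school"))
print(intrinsic.params.filter(like="school"))
```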

Mathews will whine that it's too complicated, so he can go on advising parents to choose schools with tall flagpoles or whatever it is he thinks is the simple objective measure that fits neatly in a Top 100 list. I would argue that it needs to be more complicated, not less. You might also want to know about "comparative advantage" value added: some schools may be good at educating minority students or English language learners, while others might be good at educating middle-class students but have a negative impact on disadvantaged students. Then parents could really make informed choices, and school choice would have a more positive effect on schools than it can today with our limited understanding of school quality.



Wednesday, March 22, 2006

Hiatus

Light posting for a few days. Be back soon.



Friday, March 17, 2006

Extry, Extry, Get yer KIPP findings here

SRI just put out a study of five KIPP (Knowledge Is Power Program) schools in the San Francisco area. It's a 90-pager, so brace yourself and your printer. Actually, you can skip right to chapter 6, which has the test score results.

Here's a quote from the executive summary:

Because the data are cross-sectional and school-level rather than
student-level, we cannot draw conclusions about the impact of KIPP schools; that
is, we cannot answer the question, Do students who attend KIPP schools perform
better than they would have had they not attended a KIPP school?



I agree that they can't answer the question, but I disagree on why. It's not "because the data are cross-sectional." It's because the comparisons are not really valid: you have self-selected students in schools of choice being compared to the schools those choosers abandoned. But they go on and answer it anyway:

With these limitations in mind, standardized test score results suggest that
KIPP schools are posting gains beyond what would be expected in most subjects
and grade levels, given their demographic composition. Based on publicly
available CST data for 2 years, the percentage of students scoring proficient or
above is consistently higher for KIPP schools than for comparable schools in the
neighborhood—in a few cases dramatically so. Based on cross-sectional fall to
spring SAT 10 data, the percentage of students at or above the 50th percentile
increased in 16 of 17 cases (as defined by subject area and grade level),
ranging from an increase of six percentage points in fifth-grade reading in one
school to an increase of 51 percentage points in sixth-grade math in another
school.

At a minimum they should have run an experimental comparison, for example comparing test scores of kids who applied to the KIPP schools and were admitted by lottery with those who were lotteried out. Not a perfect design, but still better.
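
Here, purely for illustration, is a minimal sketch of that lottery comparison with invented scores (group sizes and distributions are hypothetical). The point is simply that randomization makes the two groups of applicants comparable, so a plain difference in their later scores is meaningful.

```python
# Minimal sketch: compare later test scores of lottery winners vs. losers.
import numpy as np
from scipy import stats

rng = np.random.default_rng(1)
lotteried_in = rng.normal(55, 10, size=200)   # applicants offered a seat (fake scores)
lotteried_out = rng.normal(52, 10, size=200)  # applicants not offered one (fake scores)

t, p = stats.ttest_ind(lotteried_in, lotteried_out)
print(f"difference = {lotteried_in.mean() - lotteried_out.mean():.1f} points, p = {p:.3f}")
```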



Wednesday, March 15, 2006

WaPo on NCLB

Andy Rotherham (eduwonk) has a nice collection of links to the Washington Post story on NCLB and AYP as well as a link to Ed Trust's FAQ on NCLB.

I'm too lazy to copy the links, so for the hordes of people who read this blog but not Eduwonk, here it is from Andy.

Expect a rant from me in the future about AYP and how awkwardly it was devised and rolled out. For now I leave you with Andy's links.



Tuesday, March 14, 2006

Channel One -- evil propaganda?

There is a nice article in Pediatrics looking at whether commercially sponsored television news clips in the classroom (the famous "Channel One") have costs that outweigh the benefits. Specifically, they look at whether teaching styles help blunt the brainwashing effect of the advertising and better imprint the news content.

Of course, once again, Education Week, which kindly picked up the story, dumbed it down too much. They just reported the overall finding that, on average, Channel One viewers remembered more ads than news content. Thank you, Ed Week, for the superficial coverage of research we've grown accustomed to.

Back to the Pediatrics article. It was a randomized trial of 240 middle schoolers. Everyone got Channel One, but a treatment group got "media literacy" instruction. There were two treatment groups, actually, corresponding to two different flavors of the instruction. One flavor was fact-based and the other more emotive. The third arm was a control group that got no special media literacy instruction.
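
As a rough illustration of how a three-arm comparison like this gets summarized, here is a sketch with invented recall counts (these are not the study's numbers; the group sizes and means are made up).

```python
# Minimal sketch: compare mean number of ads recalled across the three arms.
import numpy as np
from scipy import stats

rng = np.random.default_rng(2)
control = rng.poisson(4.0, size=80)      # no media-literacy lesson (fake counts)
fact_based = rng.poisson(3.2, size=80)   # fact-based media-literacy arm (fake counts)
emotive = rng.poisson(3.0, size=80)      # emotion-focused media-literacy arm (fake counts)

f, p = stats.f_oneway(control, fact_based, emotive)
print(f"mean ads recalled: {control.mean():.1f} / {fact_based.mean():.1f} / "
      f"{emotive.mean():.1f}, p = {p:.3f}")
```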

Here's the money graph, taken from their conclusions:

The data show that despite some apparent benefits to young adolescents from
exposure to the program, the AAP's concerns about the commercialization of the
classroom seem justified. Students tended to remember a greater number of ads
than news stories, and students had purchased up to 8 of the 11 advertised items
listed on the survey. It seems unlikely that the program would attract
advertisers if the advertising did not work, and the data showed that students
who liked the program tended to like the ads, and students who liked the ads
purchased more products. Primedia asserts that growth at Channel One has
contributed to a 10% increase in net revenue for its education holdings.



Monday, March 13, 2006

Free Cash Money for Lucky DC Charters

This one caught me by surprise. Reported in the Common Denominator and picked up by the DC Education Blog, this story describes a program that gives piles of money to selected DC charter schools for achievement and "enrollment" and things like getting accredited. Hmm, sounds like a windfall to me.

4 "Gold" schools got $175K each.
4 "Silver" schools got $125K each.
5 "Bronze" schools got ??? (amount not reported)

How do you get gold, silver, or bronze status? High enough percentages of students scoring proficient on the Stanford 9 test in 2004. (If I were a teacher who had left in the last year or two, I'd be annoyed at this delayed recognition.)

Gold: 65-85% in math and reading
Silver: 55-65% in math and reading
Bronze: >55% in math OR reading

And if you missed all of those targets? You still get $50K if you were deemed to have "made significant improvements" under requirements of the federal No Child Left Behind Act.

Plus they got an additional $20K if they were accredited.
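
For the record, here is a toy version of the payout rules as reported above. The thresholds and dollar figures come straight from the article; the improvement and accreditation inputs are hypothetical flags, and the Bronze amount stays unknown because it wasn't reported.

```python
# Toy version of the DC charter award rules described in the post.
def charter_award(pct_math, pct_reading, nclb_improvement=False, accredited=False):
    """Return the award (in dollars) implied by 2004 Stanford 9 proficiency rates."""
    if pct_math >= 65 and pct_reading >= 65:
        award = 175_000          # Gold (reported range: 65-85% in both subjects)
    elif pct_math >= 55 and pct_reading >= 55:
        award = 125_000          # Silver (55-65% in both subjects)
    elif pct_math >= 55 or pct_reading >= 55:
        award = None             # Bronze -- amount not reported
    elif nclb_improvement:
        award = 50_000           # "significant improvements" under NCLB
    else:
        award = 0
    if award is not None and accredited:
        award += 20_000          # accreditation bonus
    return award

print(charter_award(70, 68, accredited=True))  # Gold plus accreditation: 195000
```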

Maybe it's the soft bigotry of low expectations. Whatever, if you are a charter school supporter, you are loving the cash. My guess is that this is just a way to spread the payoff money that was used to sweeten the school voucher deal that Mayor Williams negotiated with the feds.

Why am I not praising this "award" announcement for being generous and meritocratic at the same time? Because it's not meritocratic. The thing I don't like about all this (and it's a criticism of NCLB as it's been implemented) is that the cash is tied to achievement levels, not the value added by the schools. Average test scores in 2004 tell me about:
1. The demographics of the students who attended in 2003-2004 and made it through the year, because demographics explain a large percentage of student achievement, whether you like that fact or not, and
2. The quality of the tested grades in that year, plus the quality of the teachers those kids had in the previous year (when they might have been in a different school), the year before that, the year before that, etc.

So you are rewarding schools based on characteristics that lie outside the schools' control (and possibly reinforcing inequalities while you're at it). In other words, WHY ARE WE STILL USING STONE AGE SCHOOL QUALITY MEASURES? Why can't we use sensible estimates of the value added by the schools?



Saturday, March 11, 2006

Russo on Petrilli

Nice to see bloggers getting out there and doing some journalism, even if it's a brief adoring interview of a Bush official. I haven't made up my mind one way or the other about Mike Petrilli, but I never liked the idea of OII, a playground for political appointees. Russo's blog entry is more about the Fordham Foundation, which Petrilli tries to say is not knee-jerk conservative. Fordham doesn't do serious research, but they do have some good writers and have useful things to say on education policy. Checker Finn is a very good communicator and debater.

One of the commenters asks why nobody at Fordham has teaching experience. I don't think you need teaching experience to do ed research or ed policy wonkism, although it definitely helps. I'd like to see some discussion of how not having teaching experience hurts the researcher. Any teachers out there? Share, please. I never taught K-12, but I do ed research all day and want to know what I'm doing wrong.



Wednesday, March 08, 2006

Ed Week Clueless Again

Education Week ($) is such an important and timely publication that it's a shame they are so clueless when it comes to reporting on education research. Their reporting consists of spewing whatever press releases they receive in largely unmodified form. They "add value" by getting ill-informed quotes from the usual suspects (often hacks) who have clearly not read the study or are commenting outside their field of expertise, or both.

Here's the latest annoyance. I am a regular reader of "Report Roundup" (yee-haw!), which summarizes the latest research. This week's roundup includes a blurb on a report showing that increased education expenditures don't raise test scores. Did they tell their readers who wrote the report? Yes, the "Washington-based American Legislative Exchange Council (ALEC)."

Yes, but who is this authoritative-sounding, Washington-based ALEC? No mention that it is a highly secretive conservative think tank supported by deep-pocketed corporations like Enron, Amoco, Chevron, Texaco, R.J. Reynolds, AT&T, the American Nuclear Energy Council, the Chlorine Chemistry Council, the American Petroleum Institute, and the Pharmaceutical Research & Manufacturers of America. Read more about ALEC, the right-wing propaganda machine, here at Media Transparency.

Don't worry, you too can join ALEC and be part of this network of policy makers. But it will cost you $5,000 minimum. More like $50,000 if you're a corporation.

Most readers of Ed Week, not knowing what ALEC is, might think this is the finding of a piece of serious social science research. Look more closely. I haven't read through it carefully yet, but it doesn't smell so good from a quick leaf-through. First, it uses the NAEP, which tells us how good schools have been over the last 4 to 8 years, not just the most recent year. NAEP is a snapshot of student performance at selected grade levels: what a student knows in grade 8 reflects what they've learned since they started school. It does not support even the most simplistic growth analysis.

Second, it doesn't seem to do the rudimentary things that researchers do, like account for demographics and variation in price levels. The thin regression model and the weird discussion of how a simple percentile score was calculated (I mean, "scaled rank") in the appendix raise some flags that this is not your typical objective research paper. The gratuitous use of state-by-state tables for every possible variable, including 120 pages of filler "state snapshots," seems more like an effort to fill out the 214 pages for an aura of gravitas than actual scholarship.
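
To make the point concrete, here is a minimal sketch with synthetic state-level data showing why those omissions matter: the naive spending coefficient and the one adjusted for cost of living and poverty can tell very different stories. None of this is a reanalysis of the ALEC report; the variables and numbers are invented.

```python
# Sketch: spending-vs-NAEP regression with and without basic controls (synthetic data).
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(3)
states = 50
cost_index = rng.normal(1.0, 0.15, size=states)      # regional price level
pct_poverty = rng.uniform(0.05, 0.25, size=states)
spending = 8000 * cost_index + rng.normal(0, 500, size=states)
naep = 260 + 0.004 * (spending / cost_index) - 60 * pct_poverty + rng.normal(0, 3, size=states)
df = pd.DataFrame(dict(naep=naep, spending=spending,
                       cost_index=cost_index, pct_poverty=pct_poverty))

naive = smf.ols("naep ~ spending", data=df).fit()
adjusted = smf.ols("naep ~ spending + cost_index + pct_poverty", data=df).fit()
print("naive spending coefficient:   ", naive.params["spending"])
print("adjusted spending coefficient:", adjusted.params["spending"])
```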

But don't trust Ed Week to tell you that. This paper gets equal billing with real social science research. Nice.



How big is YOUR labor market?

How large are teacher labor markets? Do teachers search nationally to find a job? Do they buy a house and search in a five-mile radius? Why do we care? Well, we care because it has major implications for the maldistribution of teacher talent. If you live in a poor community, you will probably have a worse teacher than if you live in a wealthier one. We know many of the reasons why this is, but part of the problem is that we need to recognize compensating differentials, or in plain English, you gotta pay more to bring good teachers to the 'hood.

This study is a good investigation of the question using data from New York state on teachers' hometown addresses (where they went to high school) and the locations of their teaching positions. (This is work by Donald Boyd, Susanna Loeb, Hamp Lankford, and Jim Wyckoff. They are doing interesting work in NY. Check out their web page.)

The answer: 40 miles.


Ok, the answer isn't that direct, but 85% of teachers went to HS within 40 miles of the school where they teach (if I read Table 1 correctly).
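
If you had each teacher's hometown and job-site coordinates, the statistic itself is trivial to compute. Here is a sketch with fake coordinates (the actual paper works from real New York records; nothing below resembles their data).

```python
# Sketch: share of teachers working within 40 miles of their hometown (fake data).
import numpy as np

def miles_apart(lat1, lon1, lat2, lon2):
    """Rough great-circle distance in miles (haversine formula)."""
    r = 3959.0
    p1, p2 = np.radians(lat1), np.radians(lat2)
    dp, dl = np.radians(lat2 - lat1), np.radians(lon2 - lon1)
    a = np.sin(dp / 2) ** 2 + np.cos(p1) * np.cos(p2) * np.sin(dl / 2) ** 2
    return 2 * r * np.arcsin(np.sqrt(a))

rng = np.random.default_rng(4)
home = rng.uniform([40.5, -79.0], [45.0, -73.0], size=(1000, 2))  # fake hometown lat/lon
job = home + rng.normal(0, 0.3, size=(1000, 2))                   # fake job-site lat/lon
d = miles_apart(home[:, 0], home[:, 1], job[:, 0], job[:, 1])
print(f"share teaching within 40 miles of home: {(d <= 40).mean():.0%}")
```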

If you're handy with regression coefficients, read Table 4. The interaction effects tell an interesting story.

We need to pay attention to this question because it has major implications for addressing the maldistribution of teaching quality by community. I recommend reading their conclusion. You don't need to know what an odds ratio is to get it:

DISCUSSION AND CONCLUSIONS


In seeking their first teaching jobs, prospective teachers appear to search very close to their hometowns and in regions that are similar to those where they grew up. Location of college plays an independent, although less important, role in teachers’ employment location decisions.


The importance of distance in teachers’ preferences particularly
challenges urban districts, which are net importers of teachers. The number of teacher recruits whose hometown is in an urban area falls short of the number of positions being filled in urban districts, requiring that these districts attract teachers from other regions. Teacher candidates coming from suburban or rural hometowns strongly prefer to remain in those areas, rather than teach in the urban districts—both because of the importance of distance and because teachers have preferences with respect to urbanicity. Thus, urban districts must overcome these preferences in addition to addressing the considerations typically identified with recruiting teachers to difficult-to-staff urban schools, such as salary, working conditions, and the characteristics of the student population. In general, urban schools must have salaries, working conditions, or student populations that are more attractive than those of the surrounding suburban districts to induce sufficiently qualified candidates whose hometowns are in suburban regions to take jobs farther from home and in a different type of region. To the extent that they do not, teachers with suburban hometowns who take jobs in urban areas are likely to be less qualified than those who teach in the suburbs. Moreover, urban districts face a second disadvantage.

If, historically, the graduates of urban high schools have not received
adequate education, then the cities face a less-qualified pool of potential teachers even if they are not net importers. Preferences for proximity lead to the perpetuation of inequities in the qualifications of teachers. Inadequate education is a cycle that is difficult to break.




Tuesday, March 07, 2006

LDH gets the Greenspan treatment in Ed Week

LDH (no, not a type of cholesterol; it's Linda Darling-Hammond, the education professor at Stanford) can tell a joke at a conference and get written up in Ed Week. What is wrong with that rag?


Reporter's Notebook

Educator Condemns Lack of Respect for Teacher Prep
Atlanta


Citing American students’ relatively poor showing in math and science on international tests such as PISA—the Program for International Student Assessment—a nationally renowned professor of education at Stanford University is calling for strengthening rather than bypassing teacher-preparation programs to improve student achievement.

At the annual conference of the Association for Teacher Educators, held here Feb. 18-22, Linda Darling-Hammond decried what she said were attempts by politicians such as former U.S. Secretary of Education Rod Paige to erode teacher preparation.

Pointing out that student-achievement gains are more influenced by classroom teachers than any other factor, she said the focus should be on providing new
teachers with enough tools to be successful.

“We need to be artistic in articulating how to prepare teachers, rather than lowering standards. It would be penny-wise and pound-foolish to bring people into teaching unarmed,” she said.

As an example, Ms. Darling-Hammond pointed to Finland, where students have zoomed to the top on PISA. The Finns, she said, “put most of their investment into teacher education. … They invest in the abilities of professionals.”
To make her point that teacher education is “one of the most important and challenging fields,” Ms. Darling-Hammond outlined a plot for a segment of the popular reality-television show “Survivor”: Six business people
would be dropped into an elementary school that included high-needs students, some with ADHD and some who speak no English. They would be expected to multitask as every teacher does, from teaching students to documenting benchmarks to maintaining discipline. There would be no golf, except on Sundays, but that wouldn’t matter because they couldn’t afford to play anyway on their new salaries.

“The winner,” she concluded to peals of laughter from the audience, “will be allowed to return to his or her job.”

She's picking on a former official? She assumes that the only difference between Finland and the rest of the world is... teacher training? Why doesn't Ed Week leave her alone and let her get a blog like the rest of us? There's nothing to see here. Move along.



A Whale of a Program

The U.S. Department of Education has an Office of Innovation and Improvement (OII) that is famous for supporting market-oriented reforms in education. The outgoing head of the office is Nina Rees, formerly of the Heritage Foundation. (The position is vacant -- leave nominations in the comments and I'll see what I can do.) This office supports all kinds of innovations and improvements, like, um, this one:

Exchanges with Historic Whaling and Trading Partners

CFDA Number: 84.215Y
Program Type: Discretionary/Competitive Grants Program

Description

This program supports culturally based educational activities, internships, apprenticeship programs, and exchanges for Alaska Natives, Native Hawaiians, and children and families of Massachusetts.
The program earmarks funds for certain entities in Massachusetts, Alaska, and Hawaii as follows: $2 million each for (1) the New Bedford Whaling Museum in partnership with the New Bedford Oceanarium in Massachusetts and (2) the Inupiat Heritage Center in Alaska; not less than $1 million each for the New Trade Winds Project to (1) the Alaska Native Heritage Center, (2) the Bishop Museum in Hawaii, and (3) the Peabody Essex Museum in Massachusetts; and not less than $1
million each for the same three entities for internship and apprenticeship programs.


Are those innovations or are they improvements? I can't tell.

By the way, I still don't get why Alaska and Hawaii are part of the United States. Why are the inhabitants of a bunch of tiny islands in the middle of the Pacific Ocean afforded full U.S. citizenship with voting rights in the House and Senate, while residents of the District of Columbia are stuck with unpaid and utterly powerless "shadow representatives" like this poor guy?



Thursday, March 02, 2006

Principal ratings of teacher performance

Brian Jacob and Lars Lefgren have a nice working paper on principals' ratings of teachers (popularized in the latest issue of Education Next). This is a subject worth studying more. Private-sector professionals' wages are typically a function of tenure and supervisor ratings, with some small role played by objective measures of output, performance, or productivity. Public teacher compensation, however, usually follows a strict rule, basing pay on years of experience and degrees received, with no input from principals and no use of student achievement or parent satisfaction. It would be very useful to learn why supervisor (principal) ratings have no role in teacher compensation, and this working paper is a good place to start.

I highly recommend people read Jacob and Lefgren, less for its earth-shattering findings than for the questions it raises. Their paper is essentially a validation study, comparing several measures/predictors of teacher performance against each other.
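
The basic exercise is easy to picture. Here is a minimal sketch with invented data showing how you might check agreement between a principal's rating and a test-score-based value-added estimate; the real paper does this far more carefully, and the noise levels below are made up.

```python
# Sketch: correlation between two noisy measures of the same underlying teacher quality.
import numpy as np

rng = np.random.default_rng(5)
n_teachers = 200
true_quality = rng.normal(0, 1, size=n_teachers)
value_added = true_quality + rng.normal(0, 0.7, size=n_teachers)       # noisy test-score estimate
principal_rating = true_quality + rng.normal(0, 0.9, size=n_teachers)  # noisy subjective rating

print(f"correlation: {np.corrcoef(value_added, principal_rating)[0, 1]:.2f}")
```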

Teacher compensation reform bubbles up as a policy issue every 5 or 10 years, and it's due back for a big showing now (see MA, FL, and TX proposals), but we will hopefully be smarter each time and not repeat the mistakes of the past. New empirical research on the reliability of different methods for measuring teacher quality will be vital to this debate. After all, performance based pay is only as good as our ability to measure performance.



Wednesday, March 01, 2006

Another Master Plan

Ok, it's hard to read Casey Lartigue's long list of history's grand plans for DC Public Schools and not get discouraged before reading the latest one put out by Superintendent Janey.

Mark Lerner is complaining already that it's doomed to failure because it came from the Superintendent instead of "The People," you know, the Entrepreneurial Class.

They may be right to be skeptical, but let's take a look at the ideas put forth. I am still reading...

