
The Source Code for NYC's Forensic DNA Statistical Analysis Tool


README.md

Forensic Statistical Tool Source Code

ProPublica obtained the source code for New York City's Forensic Statistical Tool (FST), developed by employees of the city's medical examiner's office to help analyze complex DNA evidence from crime scenes.

The software source code contained within this repository is Copyright © City of New York and was unsealed by a federal judge after a motion by ProPublica.

For more info, and for links to newly unredacted expert affidavits analyzing this program, please read our story about this source code release.

Read about this program's design and functionality in the medical examiner's office's FST executive summary, user manual, and validation summary.

For more background on FST, read our earlier coverage:


skorgu
5 hours ago

By You Can't Tip a Buick in "He would have to concern anyone who cares about our nation" on MeFi

Deep upthread someone asked how you win a debate against people who willfully disregard facts and reason.

If I have a political project in my participation in these threads, it's in observing the ways that people presume that political disputes are resolved through debate and reason, and then observing how in all of those cases those disputes are instead resolved through organization and action. All of those cases — I stand by the strong universalizing claim here.

So the way you beat them is by out-organizing them, by getting more people on our side than theirs, by denying them space in the room where decisions are made, through materially demonstrating that our side can beat their side, not in the domain of reasoned dispute but in the domain of who can get there first with the most, in the domain of who can be sneakiest when being sneaky matters, in the domain of effectively disorganizing and demoralizing everyone not on our side.

Debate plays almost no role in this. Rhetoric, now rhetoric plays a huge role, as does propaganda, and effective knowledge of human psychology, and material resources — money, media outlets, control of institutional processes. Debate is a sideshow. Facts matter insofar as having the right facts can point you toward the most effective organizational strategy and the most effective way to disorganize your opponents, but simply having more accurate facts than the other side means nothing.

The reason why I am insistent on railing against liberal Enlightenment ideology is that it leads to miserable, ineffective tactics. There's a liberal idea, baked deep into our pre-2016 culture, that says that reasoned debate between formal equals is the best way to resolve disputes, and that therefore the best decision making processes involve constructing a sandbox where we pretend that reasoned debate is how disputes are resolved, and then act according to what we decide on within that sandbox. This idea was so deep in our culture that we forgot that the sandbox was a sandbox; we started to think that that was how the world really worked. And so, just like in 1933, we were helpless when people who recognized the sandbox was a sandbox walked over to the sandbox, took a healthy shit in it, then flipped the whole thing over.

The thing is, it's not just that the sandbox is susceptible to sudden major attack from outside, from people who are like "fuck debate I'm taking what I want." It's that the reality outside the sandbox of reasoned debate is always intruding, and reasoned debate is never determinative of decision-making processes, no matter how intent you are in establishing an abstraction that lets you think that reason rules. The bosses and the capital-owners always put their thumbs on the scale of reasoned debate by buying the participants and the judges, the cops always abuse their position as enforcers of law derived by reason to their own unreasoned benefit. The sandbox is a leaky abstraction; there's always buffer overruns and always people ready to exploit them.

This is why I'm always dismissive of people here and elsewhere who are like "well we just gotta fix our processes by [reforming campaign finance laws/doubling the size of the house of representatives/whatever weird shit Lessig is on about these days]." It's not about processes. It's about organized power. It's about who owns what. It's about who can convince whom of what, not about what's true or what's right. This is the distinction between left and liberal: liberal solutions involve funding fair processes — about trying to patch up the sandbox so we can go back to pretending reason rules — while left solutions are about acknowledging that the sandbox is impossible and (governed by our collective senses of fairness, justice, reason, empathy, and love) making our solutions real in the world.

This is a hard grim thing, though, because if you're coming from the liberal position you can pretend there's a rock-solid foundation for your actions. You can say "well, we have a process, and that process allows for decision making based on reasoned debate, and we followed that process and here's the result it yielded, so we know we have good reason to do what we're doing." If you admit that that foundation, which seems rock-solid, is built on sand, you have no way whatsoever to be certain that what you're doing is right. And because you can't rely on a process to ensure that the conditions you want remain extant, there is no end to the process of struggle — struggle informed by reason, but never governed by it, because reason can't govern, and if you trick yourselves into thinking reason can govern you've gone and made yourself susceptible to attack by nazi thugs who are quite eager indeed to show you your error.


It's a hell of a world we're living in. But living in it beats the alternatives.
duerig
1 day ago
It can be a useful corrective when somebody points out the implicit power relationships in society. Because it is easy to conflate the status quo with 'good' or 'just' or even 'neutral' and 'equal'. But every society has a pattern of power relations and we need to inspect the relations of our society and not blind ourselves to power imbalances.

On the other hand, it seems easy for people who think a lot about power relations to start thinking they are the only thing that matters. Reading this article, I'm left with no reason to ally with the 'left'. If the world really is just different groups competing for power, why should I root for one over the other? The only wise course of action in such a world is to ally with the side that will grant me the most power in exchange.

In order to have a reason to pick one side or another, I have to have more than the knowledge of power relations. I have to have a standard of truth and a standard of justice. If these are discarded in favor of the pursuit of organized power, then it doesn't matter who wins any more.

And this is where defending the process and the sandbox comes in. Because the institutions and processes of society aren't just imaginary, though they do depend on most people following them most of the time. They are also real. They are embodied in our law enforcement services (however flawed) and our elections (even if they don't go our way) and our property (even when unequally distributed). The sandbox isn't an automatic thing. And we have to recognize that it can be threatened and be willing to defend it. But it is a real thing that is worth defending.

The sandbox has been overturned many times in our history. Many of the people who did it had righteous causes and they were filled with a deep sense of fairness, justice, reason, empathy, and even love. But each time it happens, the blood of the tyrants is spilled, the blood of the weak is spilled, the blood of the powerless is spilled. And the new sandbox that was created still had power relations and injustice and unfairness and corruption. And the new tyrants oppressed the new weak and powerless.

The world is better now than it has been in centuries past. But it is not because of those who overturned the sandbox. It is because of those who spent years or decades slowly improving the sandbox they already lived in. Sometimes with better laws. Sometimes with new ways to prevent corruption. Sometimes just with a new way to till the land.
skorgu
1 day ago
1 public comment
CrystalDave
1 day ago
Which, not that I'm 100% down with this (I suspect the illusion-of-sandbox & trying to reinforce it as an ideal holds some value), but it's a hard thing to avoid grappling with, because the core point holds fairly well.

It's all well and good to hold to sandbox rules until someone comes along who wants their goal more than they want to avoid upsetting the sandbox, and forgetting the map/territory distinction there means getting blindsided.

See also: Yonatan Zunger's Tolerance as Treaty, not Moral Precept article & similar.
Seattle, WA

The Supreme Court Is Allergic To Math


The Supreme Court does not compute. Or at least some of its members would rather not. The justices, the most powerful jurists in the land, seem to have a reluctance — even an allergy — to taking math and statistics seriously.

For decades, the court has struggled with quantitative evidence of all kinds in a wide variety of cases. Sometimes justices ignore this evidence. Sometimes they misinterpret it. And sometimes they cast it aside in order to hold on to more traditional legal arguments. (And, yes, sometimes they also listen to the numbers.) Yet the world itself is becoming more computationally driven, and some of those computations will need to be adjudicated before long. Some major artificial intelligence case will likely come across the court’s desk in the next decade, for example. By voicing an unwillingness to engage with data-driven empiricism, justices — and thus the court — are at risk of making decisions without fully grappling with the evidence.

This problem was on full display earlier this month, when the Supreme Court heard arguments in Gill v. Whitford, a case that will determine the future of partisan gerrymandering — and the contours of American democracy along with it. As my colleague Galen Druke has reported, the case hinges on math: Is there a way to measure a map’s partisan bias and to create a standard for when a gerrymandered map infringes on voters’ rights?

The metric at the heart of the Wisconsin case is called the efficiency gap. To calculate it, you take the difference between each party’s “wasted” votes — votes for losing candidates and votes for winning candidates beyond what the candidate needed to win — and divide that by the total number of votes cast. It’s mathematical, yes, but quite simple, and aims to measure the extent of partisan gerrymandering.
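
To make the arithmetic concrete, here is a minimal sketch in Python of that calculation (mine, not the article's or the litigants'; the vote totals are invented, and two-party races with no ties are assumed):

```python
# A minimal sketch of the efficiency gap calculation, assuming
# two-party races with no ties and a bare-majority winning threshold.

def efficiency_gap(districts):
    """districts: list of (votes_a, votes_b) tuples, one per district.

    A vote is "wasted" if it was cast for a losing candidate, or if it
    was cast for a winning candidate beyond the bare majority needed
    to win. Returns (wasted_a - wasted_b) / total_votes; the sign
    indicates which party the map disadvantages.
    """
    wasted_a = wasted_b = total = 0
    for votes_a, votes_b in districts:
        district_total = votes_a + votes_b
        threshold = district_total // 2 + 1  # bare majority needed to win
        if votes_a > votes_b:
            wasted_a += votes_a - threshold  # winner's surplus votes
            wasted_b += votes_b              # all of the loser's votes
        else:
            wasted_b += votes_b - threshold
            wasted_a += votes_a
        total += district_total
    return (wasted_a - wasted_b) / total

# Hypothetical five-district map: Party A wins two blowouts while
# Party B wins three narrow races.
districts = [(90, 10), (90, 10), (45, 55), (45, 55), (45, 55)]
print(f"Efficiency gap: {efficiency_gap(districts):+.1%}")  # +36.2%
```

With these invented numbers, Party A's blowout wins waste far more votes than Party B's narrow losses, so the gap comes out to about 36 percentage points against Party A: exactly the packing pattern the measure is designed to flag.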

Four of the eight justices who regularly speak during oral arguments voiced anxiety about using calculations to answer questions about bias and partisanship. Some said the math was unwieldy, complicated, and newfangled. One justice called it “baloney” and argued that the difficulty the public would have in understanding the test would ultimately erode the legitimacy of the court.

Justice Neil Gorsuch balked at the multifaceted empirical approach that the Democratic team bringing the suit is proposing be used to calculate when partisan gerrymandering has gone too far, comparing the metric to a secret recipe: “It reminds me a little bit of my steak rub. I like some turmeric, I like a few other little ingredients, but I’m not going to tell you how much of each. And so what’s this court supposed to do? A pinch of this, a pinch of that?”

Justice Stephen Breyer said, “I think the hard issue in this case is are there standards manageable by a court, not by some group of social science political ex … you know, computer experts? I understand that, and I am quite sympathetic to that.”


And Chief Justice John Roberts, most of all, dismissed the modern attempts to quantify partisan gerrymandering: “It may be simply my educational background, but I can only describe it as sociological gobbledygook.” This was tough talk — justices had only uttered the g-word a few times before in the court’s 230-year history. Keep in mind that Roberts is a man with two degrees from Harvard and that this case isn’t really about sociology. (Although he did earn a rebuke from the American Sociological Association for his comments.) Roberts later added, “Predicting on the basis of the statistics that are before us has been a very hazardous enterprise.” FiveThirtyEight will apparently not be arguing any cases before the Supreme Court anytime soon.

This allergy to statistics and quantitative social science — or at least to their legal application — seems to present a perverse incentive to would-be gerrymanderers: The more complicated your process is, and therefore the more complicated the math would need to be to identify the process as unconstitutional, the less likely the court will be to find it unconstitutional.


But this trouble with math isn’t limited to this session’s blockbuster case. Just this term, the justices will encounter data again when they hear a case about the warrantless seizure of cell phone records. The Electronic Frontier Foundation, the Data & Society Research Institute, and empirical scholars of the Fourth Amendment, among others, have filed briefs in the case.

“This is a real problem,” Sanford Levinson, a professor of law and government at the University of Texas at Austin, told me. “Because more and more law requires genuine familiarity with the empirical world and, frankly, classical legal analysis isn’t a particularly good way of finding out how the empirical world operates.” But top-level law schools like Harvard — all nine current justices attended Harvard or Yale — emphasize exactly those traditional, classical legal skills, Levinson said.

In 1897, before he had taken his seat on the Supreme Court, Oliver Wendell Holmes delivered a famous speech at Boston University, advocating for empiricism over traditionalism: “For the rational study of the law … the man of the future is the man of statistics and the master of economics. It is revolting to have no better reason for a rule of law than that so it was laid down in the time of Henry IV.” If we hadn’t made much progress in the 500 years between Henry IV and Holmes, neither have we made much progress in the 120 years between Holmes and today. “What Roberts is revealing is a professional pathology of legal education,” Levinson said. “John Roberts is very, very smart. But he has really a strong anti-intellectual streak in him.”

I reached Eric McGhee, a political scientist and research fellow at the Public Policy Institute of California who helped develop the central gerrymandering measure, a couple days after the oral argument. He wasn’t surprised that some justices were hesitant, given the large amount of analysis involved in the case, including his metric. But he did agree that the court’s numbers allergy would crop up again. “There’s a lot of the world that you can only understand through that kind of analysis,” he said. “It’s not like the fact that a complicated analysis is necessary tells you that it’s not actually happening.”

During the Gill v. Whitford oral argument, the math-skeptical justices groped for an out — a simpler legal alternative that could save them from having to fully embrace the statistical standards in their decisionmaking. “When I read all that social science stuff and the computer stuff, I said, ‘Is there a way of reducing it to something that’s manageable?’” said Justice Breyer, who is nevertheless expected to vote with the court’s liberal bloc.

It’s easy to imagine a situation where the answer for this and many other cases is, simply, “No.” The world is a complicated place.


Documentation of the court’s math problem fills pages in academic journals. “It’s one thing for the court to consider quantitative evidence and dismiss it based on its merits” — which could still happen here, as Republicans involved in the Wisconsin case have criticized the efficiency gap method — “but we see a troubling pattern whereby evidence is dismissed based on sweeping statements, gut reactions and logical fallacies,” Ryan Enos, a political scientist at Harvard, told me.

One stark example: McCleskey v. Kemp, a death penalty case decided in 1987. Warren McCleskey, a black man, was convicted of murdering a white police officer and was sentenced to death by the state of Georgia. In appealing his death sentence, McCleskey cited sophisticated statistical research, performed by two law professors and a statistician, which found that a defendant in Georgia was more than four times as likely to be sentenced to death if the victim in a capital case was white than if the victim was black. McCleskey argued that this discrepancy violated his 14th Amendment right to equal protection. In his majority opinion, Justice Lewis Powell wrote, “Statistics, at most, may show only a likelihood that a particular factor entered into some decisions.” McCleskey lost the case. It’s been cited as one of the worst decisions since World War II and has been called “the Dred Scott decision of our time.”


Another instance of judicial innumeracy: the Supreme Court’s 1960 decision in Elkins v. United States, a Fourth Amendment case about federal searches and seizures. In his majority opinion, Justice Potter Stewart discussed how no data existed showing that people in states with stricter rules on the admission of evidence obtained in an unlawful search were less likely to be subjected to such searches. He wrote, “Since, as a practical matter, it is never easy to prove a negative, it is hardly likely that conclusive factual data could ever be assembled.”

This, however, is silly. It conflates two meanings of the word “negative.” Philosophically, sure, it’s difficult to prove that something does not exist: No matter how prevalent gray elephants are, their numbers alone can’t prove the nonexistence of polka-dotted elephants. Arithmetically, though, scientists, social and otherwise, demonstrate negatives — as in a decrease, or a difference in rate — all the time. There’s nothing special about these kinds of negatives. Some drug tends to lower blood pressure. The average lottery player will lose money. A certain voting requirement depresses turnout.
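
As a concrete illustration, here is a minimal sketch in Python (mine, with invented turnout figures; the normal-approximation confidence interval is a standard textbook method, not anything drawn from the cases discussed) of demonstrating a “negative” in this arithmetic sense, a decrease in a rate:

```python
# A minimal sketch: estimating a difference in rates with a 95%
# normal-approximation confidence interval. All numbers are invented.
import math

def rate_difference(hits_a, n_a, hits_b, n_b):
    """Return (difference b - a, CI low, CI high) for two observed rates."""
    p_a, p_b = hits_a / n_a, hits_b / n_b
    diff = p_b - p_a
    se = math.sqrt(p_a * (1 - p_a) / n_a + p_b * (1 - p_b) / n_b)
    return diff, diff - 1.96 * se, diff + 1.96 * se

# Hypothetical: turnout among 10,000 voters not subject to a new
# requirement vs. 10,000 voters subject to it. An interval entirely
# below zero is evidence of a decrease, i.e., a demonstrated "negative."
diff, lo, hi = rate_difference(6000, 10000, 5500, 10000)
print(f"Estimated change in turnout: {diff:+.1%} (95% CI {lo:+.1%} to {hi:+.1%})")
```

With these made-up numbers the interval runs from about -6.4 to -3.6 percentage points and never touches zero: the kind of plainly demonstrable negative the opinion declared “never easy to prove.”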

Enos and his coauthors call this the “negative effect fallacy,” a term they coined in a paper published in September. It’s just one example, they wrote, of an empirical misunderstanding that has proliferated like a tsunami through decades of judges’ thinking, affecting cases concerning “free speech, voting rights, and campaign finance.”

Another example of this fallacy, they wrote, came fifty years later in Arizona Free Enterprise v. Bennett, a 2011 campaign finance case. The topic was Arizona’s public campaign financing system, specifically a provision that provided matching funds to publicly financed candidates. The question was whether this system impinged on the free speech of the privately funded candidates. A group of social scientists, including Enos, found that private donations weren’t following the kind of patterns they’d expect to see if the public funding rule were affecting how donors behaved. The Supreme Court didn’t care and ultimately struck down the provision.

In his majority opinion, John Roberts echoed Stewart and repeated the fallacy, writing that “it is never easy to prove a negative.”


So what can be done?

McGhee, who helped develop the efficiency gap measure, wondered if the court should hire a trusted staff of social scientists to help the justices parse empirical arguments. Levinson, the Texas professor, felt that the problem was a lack of rigorous empirical training at most elite law schools, so the long-term solution would be a change in curriculum. Enos and his coauthors proposed “that courts alter their norms and standards regarding the consideration of statistical evidence”; judges are free to ignore statistical evidence, so perhaps nothing will change unless they take this category of evidence more seriously.

But maybe this allergy to statistical evidence is really a smoke screen — a convenient way to make a decision based on ideology while couching it in terms of practicality.

“I don’t put much stock in the claim that the Supreme Court is afraid of adjudicating partisan gerrymanders because it’s afraid of math,” Daniel Hemel, who teaches law at the University of Chicago, told me. “[Roberts] is very smart and so are the judges who would be adjudicating partisan gerrymandering claims — I’m sure he and they could wrap their minds around the math. The ‘gobbledygook’ argument seems to be masking whatever his real objection might be.”

But if the chief justice hides his true objections behind a feigned inability to grok the math, well, that’s a problem math can’t solve.





skorgu
2 days ago
:(
kazriko
2 days ago
I think in general most branches of the government have an allergy to math, statistics, and even quantifying the results from their policies.

Carrie Fisher apparently sent a cow tongue to a producer who assaulted her friend

Inspired by the people coming forward with stories about Harvey Weinstein, a woman named Heather Ross…
skorgu
2 days ago
<3
acdha
3 days ago
Washington, DC

By scaryblackdeath in "He would have to concern anyone who cares about our nation" on MeFi

I'm noticing recently how my brain gets tired earlier in the day than it used to. My daily job schedule is easier on me than ever (working at home at my own pace), so it can't be that. Earlier today I was thinking about it: not enough sleep? Not drinking enough water? Need more exercise?

And then I saw this new thread and I'm like, "Oh yeah, it's 2017 and the world's a nightmare hellscape and there are people actually cheering it on. That's why I'm so tired by 3:30 in the afternoon."
skorgu
2 days ago

US telcos selling realtime ability to associate web browsing with name & address

skorgu
5 days ago