Measures for Justice Data Portal by Jason Tashea

This week, Measures for Justice (MFJ) released their new data portal for the public. According to MFJ, the idea behind the portal "allows users to review and compare performance data within and across states, and to break them down by race/ethnicity; sex; indigent status; age; offense type; offense severity; and attorney type." They go on to say that "[t]he Data Portal comprises data that has been passed through 32 performance measures developed by some of the country’s most renowned criminologists and scholars. The measures address three primary objectives of criminal justice systems: Public Safety; Fair Process; Fiscal Responsibility."

Anyone who has done multi-jurisdictional criminal justice research will tell you a project like this is desperately needed. Further, this isn't a project from a dilettante who recently became acquainted with the mess that is American criminal justice. Amy Bach, the executive director of MFJ, is an expert on the criminal justice system and has been fighting this battle for criminal justice reform for many years. For these reasons, this project should be taken seriously.

Launching with six jurisdictions, MFJ is doing difficult work that the Bureau of Justice Statistics (BJS) should have been doing years ago. What's most impressive is that they're doing this work with just 22 staff in Upstate New York. Further, it's not just a chance to see comparisons through their clean user interface; they made it easy to export any of the raw datasets their staff meticulously pored over.

This being said, a project of this nature and with this massive scope raises some questions:

  • Sustainability: This project is massive, and keeping it going will take a lot of money and effort. Of late, MFJ has been infused with a lot of money, especially from the tech sector; but what comes of these efforts when they are no longer novel or become banal? Even worse, where does the data go if MFJ has to close? I'd assume someone would step in, but does that contingency exist? For the sake of the effort being made, I hope an off-boarding procedure has been considered.

These challenges were faced by the Sunlight Foundation when they were undertaking their criminal justice data project a number of years ago. Ultimately, they had to off-board the project and never reached their goal of collecting all the criminal justice data from all 50 states plus D.C. and the Feds.

  • Upkeep: The work done by MFJ to get this far has taken six years and, from what I understand, was painstaking. While they plan to add 14 more states in the next three years, what happens to the datasets they've already built? Will we see expansions of these datasets as more data becomes available over time? To do so requires continuous effort and upkeep that isn't necessarily automatable (the process of going to the jurisdiction was, in some cases, a needed step). I'm curious to know the plan for this type of perpetual upkeep.
  • Tracking Impact: This is never easy. Ever. While there will be anecdotal successes of district attorneys or judges who see the data and make change, how can we as the criminal justice community track the impact of this work? I hope to see the innovation and energy put into this project extend to creative ways to track its impact. Page views and download stats will be insufficient, as will success stories from various jurisdictions. I'd be happy to brainstorm on this front. It's a challenge worth tackling.

Beyond these initial concerns, I think MFJ is well positioned to be a normative force in criminal justice stats around the country. While their work is focused on collecting and aggregating this data, I think it would be a missed opportunity if they don't leverage their relationships at the county level to help improve data collection capacity and provide a floor for standardization. My dream is for a government agency or company to create something similar to what Google's General Transit Feed Specification did for transit data. Without a doubt the criminal justice system is more challenging than transit; however, MFJ seems well suited through their relationships to bring this message and support. Perhaps they could build off the National Information Exchange Model (NIEM) and the Global Justice XML data standards. I'm not sure. However, it would be great to see MFJ, or a coalition led by MFJ, flex this muscle.

TLDR

Measures for Justice made a great data portal with county-level criminal justice statistics. It's a great and needed tool. There will likely be challenges around sustainability, upkeep, and measuring impact. With their impressive national network of local stakeholders, perhaps MFJ can help standardize data capture in the criminal justice system. Go MFJ!

Can you code for Miranda? by Jason Tashea

A couple months ago, Andrew Ferguson, a law prof at DC Law School, reached out to ask about my thoughts on a recent article he wrote with Richard Leo, a law prof and Miranda expert, on the creation of a Miranda App. I've finally put my thoughts together for a post. This is that post.

For your background, the article is summarized thusly:

For fifty years, the core problem that gave rise to Miranda – namely, the coercive pressure of custodial interrogation – has remained largely unchanged. This article proposes bringing Miranda into the twenty-first century by developing a “Miranda App” to replace the existing, human Miranda warnings and waiver process with a digital, scripted computer program of videos, text, and comprehension assessments. The Miranda App would provide constitutionally adequate warnings, clarifying answers, contextual information, and age-appropriate instruction to suspects before interrogation. Designed by legal scholars, validated by social science experts, and tested by police, the Miranda App would address several decades of unsatisfactory Miranda process. The goal is not simply to invent a better process for informing suspects of their Miranda rights, but to use the design process itself to study what has failed in past practice. In the article, the authors summarize the problems with Miranda doctrine and practice and describe the Miranda App's design components. The article explains how the App will address many of the problems with Miranda practice in law enforcement. By removing the core problem with Miranda – police control over the administration of warnings and the elicitation of Miranda waiver and non-waivers – the authors argue that the criminal justice system can improve Miranda practice by bringing it into the digital age.

I had many thoughts after reading this article. I like the mix of metaphor and machine, and, specifically, I think the metaphor is strong. I admire the leap the authors want to make from the confederated, bureaucratic nature of Miranda today to a reality where this constitutional protection is treated like one. As they describe, the challenges around Miranda today create hurdles for this tool.

These hurdles are not necessarily overcome by approaching the problem through a tech lens. First and foremost, I think the authors need to prove that the lack of tech is the problem. The article lays out the many problems with Miranda and then proposes the solution. If this project indeed wants to move forward, I'd recommend running a randomized controlled trial with one group that is told Miranda, one group that reads Miranda, and one group that has a more interactive experience via a website. Afterwards, test everyone on their understanding of the warning. If there's a statistically significant difference between the first two groups and the web-based group, then there might be reason to believe that an app could be valuable. Otherwise, I'm hard pressed, at first blush, to believe an app is the solution. (As a side note, I think an interactive, responsive web app would suffice; there's no need to build a proper app for this project.)
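To make the RCT idea concrete: if comprehension were scored as pass/fail, the comparison between the spoken-warning group and the web-based group could be analyzed with a standard two-proportion z-test. The sketch below is illustrative only; the function name and the sample numbers are hypothetical, not from the article.

```python
from math import sqrt, erf

def two_proportion_z_test(pass_a: int, n_a: int, pass_b: int, n_b: int):
    """Two-sided z-test for a difference in pass rates between two groups."""
    p_a, p_b = pass_a / n_a, pass_b / n_b
    # Pooled proportion under the null hypothesis of no difference.
    p_pool = (pass_a + pass_b) / (n_a + n_b)
    se = sqrt(p_pool * (1 - p_pool) * (1 / n_a + 1 / n_b))
    z = (p_a - p_b) / se
    # Two-sided p-value via the normal CDF, Phi(x) = 0.5 * (1 + erf(x / sqrt(2))).
    p_value = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))
    return z, p_value

# Hypothetical results: 40 of 100 pass after a spoken warning,
# 62 of 100 pass after the interactive web version.
z, p = two_proportion_z_test(62, 100, 40, 100)
print(f"z = {z:.2f}, p = {p:.4f}")
```

With those made-up numbers the difference is significant at conventional thresholds; with real data, a pre-registered analysis plan and a power calculation would come first.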

Further, some time needs to be spent defining the tool's core competency. The article is very broad and articulates numerous potential features, but this is dangerous. Called "feature creep," this is a common issue with any ideation process. While there are endless permutations of the tool put forward, it's important to clearly articulate what the tool's core job is and build (with user feedback) that first, while fending off unnecessary features.

On the accessibility front, there's a lack of analysis or definition of what this means. I recommend taking a look at the WCAG 2.0 standards. A project like this should be aiming for an AA standard (the level recommended by the DOJ).

With any project meant to affect the justice system, implementation is the biggest hurdle. Put another way, building this tool is a moot point if no one is going to use it. The fact that there are 500 different versions of Miranda around the country should illustrate how hard it is to standardize this procedure across America's 19,000 law enforcement agencies. Is there an agency that would be willing to test this idea? The status quo is a powerful foe, so I'd recommend finding an agency that is willing to be a part of the project from the ground floor. It'll be easier to test the initial build and find other adopters if there's the support of a law enforcement agency.

Last, I wonder if the tool as described runs the risk of over-informing the user, causing confusion. The article includes a lecture's worth of information, which could overload and limit the user's comprehension. Through user testing, they can figure out what the right amount of information is. But this will require time and effort early in the process.

Beyond these thoughts, I had a couple of random questions about the tool as the article conceives it:

  • The article says that the data would be kept on a "secure server"; however, this misses a number of issues. Beyond "secure server" not being defined (does this mean password protected? encrypted? etc.?), would the data be protected in transit and at rest? Who has access to the server (police, states attorney, defender, courts, researchers, developers)? And further, what would the data retention process and procedure be? A major issue with police bodycams has been the cost of storage; who will pay for all this data (including cumbersome video footage) to be stored?
  • Would the user (the arrestee) be informed that 1. they are being recorded and 2. the data they produce is being collected? 
  • Would the metadata collected be anonymized? Or would there be unique identifiers that would make it easy to admit this information in court? If it is admissible, what are the potential impacts of this new information in court?
  • If the user reaches a point in the process within the tool where they need help or further information, what process would take place? Would the police be the intermediaries? If so, does this diminish some of the value of the tech intervention?
  • The article mentions that the tool would be used at booking and not arrest. Correct me if I'm wrong, but isn't the current standard to read Miranda at the time of arrest? Do they envision that Miranda would be read at arrest and then the app would be given to the person at booking? If the only Miranda warning is given via the tool at booking, then does that create a rights gap between arrest and booking?

They've decided to tackle an interesting and difficult problem, which is exciting. However, there's a lot of work to be done to make this idea viable.

Police should weaponize the DMCA by Jason Tashea

I have no love for the mugshot racket.

If you aren't familiar, mugshot websites are privately held sites that post people's mugshots and exploit search engine algorithms to bring a person's worst night to the top of their search results. To be clear, mugshot sites aren't news or crime blotter websites providing a public service. These sites operate under an extortion model that requires a fee to take down the pictures. However, people often find when they pay a few hundred dollars to take down their photo, their mugshot pops up on a different site. It's an endless game of online reputation whack-a-mole.

To put it mildly, this practice sucks. The Internet is where landlords and potential employers go to find more information on a person. Even if the person was merely arrested and never charged with a crime, their mugshot lives on in infamy. This creates a plethora of collateral consequences that hurt people's ability to move on with their lives (for a lot more on this subject check out the Collateral Consequences Resource Center). Even if the individual can legally expunge the record, the Internet never forgets.

However, over beers with Sarah Lageson from Rutgers, it dawned on me: the only rule that the Internet even comes close to following is the Digital Millennium Copyright Act (DMCA), and this could be a weapon against mugshot websites. Because the individual in the mugshot doesn't own the copyright to the mugshot, there's an opportunity for the police to take a stand on behalf of the citizens they serve and protect.

To put it simply, the DMCA is an American law that codified two international treaties from the '90s, an attempt to control the illegal use and dissemination of copyrighted material. You've likely come into contact with the DMCA as a sad face and a short message on YouTube.

This is the DMCA in action. Someone with a copyright saw their video on YouTube, hadn't given permission for it to be there, and wrote a letter to YouTube (actually it's a form on the site) saying, "That's mine, under the DMCA please take it down." 

Internationally, 94 parties have joined the World Intellectual Property Organization Copyright Treaty. Most notably, the EU has Directive 2001/29/EC, which is their DMCA. This is important because the Internet doesn't have a jurisdiction like a person does. So, the fact that nearly half of the world is onboard with this treaty matters when it comes to enforcement.

Back to the mugshot issue, the reason why these photos are allowed to be circulated in the U.S. is primarily a First Amendment issue. The press will oppose a police department that tries to shut down the public release of these photos. They will argue the public has a right to know, and that public safety demands the release of this information. Whether or not you agree with this position is irrelevant to the point I'm making, which is that the First Amendment is going to keep the photos in the public sphere. So, turning off this spigot of mugshots is an unlikely solution.

This all brings me to my clickbait-y headline: where it is allowed, police departments should use the DMCA to get mugshots taken off these sites en masse.

Why the police? The default copyright holder of a photo is the person who took the photo; in the case of mugshots, that's the police. The police could write takedown notices that would affect tens of thousands of people in a jurisdiction. Not only would this be a huge benefit to the individuals with mugshots on these sites, it would earn the police a win with the community.

Now, this idea has a couple of potential hurdles. First, some governments do not allow themselves to own copyrights. The feds, for one, do not hold copyrights. In the absence of a copyright, a DMCA takedown notice would not work. One potential workaround would be to hire a third-party vendor to take the mugshots, grant them the copyrights, and then work with the vendor to send the DMCA notices.

Second, the Internet is a lawless, apocalyptic hellscape with no justice or jurisdiction. True, but allowing extortion to continue unabated doesn't seem like a good alternative. As I pointed out above, half of the world's countries are signatories of the treaty that was the basis for the DMCA. My understanding is enforcement is not uniform in those countries. Further, mugshot sites could always decide to be hosted in a country that didn't sign/ratify the WIPO Treaty. This is the same struggle with revenge porn websites and "dark net" hubs for child pornography, for example. However, unless we're just going to throw up our hands at the challenges of enforcing our terrestrial laws online, I think this is a battle worth waging.

Last, there's the small hurdle of finding out where mugshots are and which ones come from which police department. Some sites make it easy to know what jurisdiction they pull their mugshots from, like TampaCriminal.com. Others, like Mugshots.com, are a national clearinghouse of these photos. An automated reverse image search could help in this process. This would take some work, but a partnership between a department down to tackle this issue and a savvy developer could overcome this hurdle. Plus, once the tool is built, it should be open sourced and shared with other departments across the nation.

This is just a cursory pitch, but let me know what you think. I'm by no means a copyright expert, and I'm sure there's other issues that I skate by, but I wanted to throw it out there and get some feedback.

UPDATE: Shortly after publishing this piece, I received some helpful feedback from numerous people. One thing that was pointed out was that the reverse image search is an unnecessary step. Instead, the police department or third party contractor could keep a catalog of the image hash from each photo that would allow them to easily search for the images. 
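The hash-catalog suggestion is straightforward to sketch. Assuming the department fingerprints each released image with an exact-match hash like SHA-256 (the function names, case ID, and byte strings below are all hypothetical), a lookup table maps a scraped image back to the record it came from:

```python
import hashlib

def image_fingerprint(data: bytes) -> str:
    """Exact-match fingerprint of an image's raw bytes (SHA-256 hex digest)."""
    return hashlib.sha256(data).hexdigest()

# Catalog built as mugshots are released: fingerprint -> originating case.
catalog: dict[str, str] = {}

def register(image_bytes: bytes, case_id: str) -> None:
    """Record a released mugshot so it can be matched later."""
    catalog[image_fingerprint(image_bytes)] = case_id

def lookup(image_bytes: bytes):
    """Return the originating case for a scraped image, or None if unknown."""
    return catalog.get(image_fingerprint(image_bytes))

# Hypothetical usage: register a release, then match a scraped copy.
register(b"fake-jpeg-bytes", "2016-CR-001234")
print(lookup(b"fake-jpeg-bytes"))  # prints the registered case ID
```

One caveat: an exact byte-level hash breaks if a mugshot site recompresses, resizes, or watermarks the image; matching those copies would require a perceptual hash (e.g., a difference-hash library) rather than SHA-256.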

Second, a concern I was unaware of when I wrote the original post is that there are significant privacy issues when someone uses the DMCA. Specifically, in revenge porn cases a person issuing a notice has to provide private information to prove their identity to the site that is hosting the image. That information is then used to dox the victim, which only continues their public shaming. By having the police department or third-party contractor send the DMCA notice, the personal information of the individual in the mugshot does not have to be divulged to the site. This will save people with mugshots from further online reprisals.

Mugshot websites, still a thing by Jason Tashea

I'm working on a feature piece regarding the use of algorithms in the criminal justice system. One aspect that won't make it into the final draft is that online search algorithms play a role in all this criminal justice and technology work. Here is the excerpt that didn't make the final edit:

Online search algorithms also affect those with criminal records. Julie Cantu of Tampa, Florida found this out when a first date asked about her mug shot he found online. Cantu was arrested in 2010 after blowing below the legal limit during a field sobriety test, but she thought the issue was behind her after the charges were dropped and the record was expunged.

After the date, she found her mug shot with tear streaks running down her face on sites like MugShots.com, Tampacriminal.com and Arrestmugshot.com. These are not newspaper or crime blotter sites that are reporting on local crime. Cantu found herself in the mug shot racket, a series of websites, primarily hosted offshore, that exploit search engine algorithms and demand a fee to take down pictures. After paying $175 to one site, she saw her photo pop up on a different one.

Cantu says she worried that the photo was “going to be there the rest of my life,” which could affect her employment as a nurse.

Luckily for Cantu and the estimated 70 million Americans with a criminal record, Google, which accounts for 65 percent of U.S. search traffic, changed how their search algorithm serves results related to mug shot websites in 2011.

Johnathan Hochman, an Internet marketing consultant based in Connecticut, says that Google did not disclose how the algorithm was changed, but he suspects they “deindexed” mug shot websites, which means they do not show up in search results. Google did not respond to a request for comment.

At the time of publishing, Googling “Julie Cantu Tampa” did not bring up her mug shot on the first five pages of results. However, searching “Julie Cantu Mug Shot” immediately produced the photo.

While Hochman appreciates Google’s effort, he says, “It’s not completely perfect.” He thinks there is need for federal legislation banning the “depublishing” industry, which includes sites like MugShots.com but also revenge porn sites that operate similarly. Calling these sites “extortion”, he says it “is something completely new, and it should be illegal.” 

Tech & Partner Violence: Promise & Peril by Jason Tashea

Last Thursday and Friday I was a part of the "Technology and violence against women: Protection and Peril" symposium hosted by the Ortner Center at the University of Pennsylvania. This was an opportunity for experts in domestic violence, government, justice, law, social work, and technology to come together and discuss the pending and forthcoming policy issues surrounding technology and intimate partner violence (IPV). This was a unique and exciting opportunity to discuss issues that I care deeply about, but, more importantly, to learn about how technology is affecting IPV, good or bad.

The talks were universally enlightening. Topics included new tools and platforms (if you haven't checked out Callisto, fix that before you continue reading), data, smart guns, revenge porn, innovations in post-assault evidence collection, and about a half dozen more.

The challenges are vast regarding IPV, and the promises of new techniques and technologies are not fully realized. Many of the apps being developed might be good in concept but are insufficient in practice. The need to accurately document physical trauma on darker-skinned people remains. Smart guns have the potential to curb gun violence generally, but may be neutral in impacting IPV specifically.

The symposium acted as a place to not only confront these challenges, but to also discuss policy prescriptions. The Ortner Center will be producing white papers on proposed policies. (Here’s hoping that procurement reform and data trusts make the list!)

More broadly than the specific talks themselves, this symposium was unique due to the intersection of topics. The field of justice and technology remains nascent and is obscure to many, so an event like this remains a novel treat. When I asked Susan B. Sorensen, who is a professor at Penn and runs the Ortner Center, about this symposium and her motivation around it, she said it was driven by her curiosity as she witnessed a changing landscape on account of tech. However, she was uncertain about the timing of the event. She admitted that the conference could be too early, which would make it bleeding edge as opposed to cutting edge.

This is a sentiment that I've been feeling for a while. I know I've seen improvement and increased awareness over the past few years; this symposium and our symposium last week are evidence of an evolution toward greater awareness of these issues. However, when I speak to individuals at more established criminal justice organizations or attend so-called "justice and tech" events, they often don't see the intersection of these two worlds or the need for a singular focus on this subject, the way juvenile justice or bail reform has its experts. Yet, to draw this analogy out, I think we are beginning to see the blood coagulate. That Vera now has a VP of tech and justice, that the White House put out its Task Force on 21st Century Policing, and that the media covers this issue with more savvy all indicate that the issues around tech and justice are moving from the shadows and into the mainstream.

After the stimulating symposium, I wanted to reflect on this moment in criminal justice reform and our country’s history on the subject. So, I spent a few hours on Saturday at Philly's Eastern State Penitentiary. Built in 1821, this facility was the first penitentiary in the world and ushered in the promise of a more humane criminal justice system that brought prisoners closer to God through solitary confinement, which would make the individual penitent.

For nearly 100 years, Eastern State was operated under this theory, called the Pennsylvania system, and more than 300 prisons were opened around the world with the same structure.

The Pennsylvania system and Eastern State were controversial from the start. Primarily at odds with Sing Sing Prison and the New York system, which focused on inmate collaboration and labor, Charles Dickens best articulated the conflict between a righteous plan and a ruinous outcome. After his visit in 1842, he went on to write:

In its intention I am well convinced that it is kind, humane, and meant for reformation; but I am persuaded that those who designed this system of Prison Discipline, and those benevolent gentleman who carry it into execution, do not know what it is that they are doing....I hold this slow and daily tampering with the mysteries of the brain to be immeasurably worse than any torture of the body; and because its ghastly signs and tokens are not so palpable to the eye,... and it extorts few cries that human ears can hear; therefore I the more denounce it, as a secret punishment in which slumbering humanity is not roused up to stay. 

Today, many criminal justice advocates and the United Nations consider solitary confinement to be torture. However, it wasn't until 1913 that Eastern State ended this practice, primarily because the use of solitary was incompatible with the facility’s overcrowding. Many other prisons around the world continued to use this practice into the post-war period. 

The lessons of Eastern State are numerous, but the largest may be that good intentions and a righteous plan don't guarantee humane reform. 

Currently, the U.S. is experiencing the largest push for criminal justice reform in the last century. Decades of increased and mass incarceration, the school-to-prison pipeline, and indefinite collateral consequences following system contact are now apparent and intolerable to many.

With increased attention, there are novel and varied solutions to these system ailments, including those that come from the tech industry. This week's symposium focused on some of them. Similar to what Dickens acknowledged in 1842, these are projects that are being built with a benevolent purpose but have little understanding of the nature of the problem, end user, or systems they look to affect. This is creating defective, unreliable, and dangerous tools.

I was impressed with the two women I met at this symposium from the National Network to End Domestic Violence who run TechSafety.org. This is a needed resource (I've been kicking around a similar idea to build off our survey on new criminal justice tech) that individually tests each tool as an end user would. Time and again, they explain, these tools fall short of the mark. They find incorrect geo-tagging, promises to send a text to police in jurisdictions where police do not receive texts, and design that doesn't consider victims/survivors of IPV. These are horrifying errors considering someone is supposed to rely on these tools before, during, or after their personal safety is at risk.

This is unacceptable. Without any independent certification or accountability for such tools, however, there's nothing to stop people from making available well-intentioned websites or apps that jeopardize someone's life. Arguably, there is a level of liability for these developers that could land them in criminal or civil court; however, if that happens, it means the worst has already occurred to the user who attempted to rely on the faulty tool.

The stakes are too high for sloppy or ineffectual “innovations”.

To be clear, this isn't a problem unique to tech. Our analogue criminal justice system is rife with ineffectual ideas that ruined lives: the death penalty, scared straight programs, and the Pennsylvania system are just three of a much longer list. As we witness the incoming wave of next-gen criminal justice innovations, we need to acknowledge how easy it is to screw up reform, and then we need to create standards as a community to mitigate these potential errors.

The Pennsylvania system was supported by the country's first criminal justice reform organization: the Society for Alleviating the Miseries of Public Prisons. Its stated goal was a prison system that reformed people rather than merely holding them in raucous environments with others who had committed crimes. This group thought their model improved the status quo. Largely informed by the majority's Quaker faith, they didn't think they were institutionalizing a human rights violation.

It's important that we acknowledge that righteous paths can still create ruinous results. Altruism will never be sufficient to make tech-for-good good. Events like the one at the Ortner Center are one large step forward in creating awareness of this issue. However, if we, the reform community, don't take meaningful steps to hold technology accountable in the IPV and criminal justice spaces, then we've already embarrassed ourselves to future generations that will look back at this moment as a missed opportunity, or worse, something akin to a human rights violation.

Cell Site Simulators & an FBI FOIA by Jason Tashea

I recently wrote and published a piece on the use of cell site simulators for the ABA Journal, and during that process I FOIA'ed the FBI. This post includes the results of that FOIA and further discussion around the FBI's new guidelines.

For the TLDR crowd: there are these tools (AKA StingRays, IMSI catchers, HailStorm, or cell site simulators) that law enforcement primarily uses without probable cause warrants. The tool forces cellphones in its proximity to give up their location. This approach is being challenged in state and federal court, including in Maryland, which recently decided that you do need a probable cause warrant to use these devices. The trend from the courts seems to be toward requiring probable cause, which is a good thing.

That being said, while interviewing a spokesperson at the FBI for this piece I asked for a copy of the pre-2015 guidelines. I could hear his smirk through the phone as he told me that he wouldn't share it but I could FOIA it. So, I did.

Below is the document that they released pertaining to any pre-2015 FBI guidelines on the use of cell site simulators. The original document was 20 pages; they sent seven.

The biggest change between the two sets of guidelines regards the court order the FBI recommends to legally deploy the tool. Pre-2015, the guidelines recommended a pen register order (or a pen reg order plus a warrant, depending on the jurisdiction). A pen register order does not require law enforcement to show probable cause. The new guidelines do recommend a warrant, which would require probable cause. In case you're wondering, requiring probable cause is an extra layer of due process that law enforcement needs to go through to carry out the search. It's a procedural check on police search and seizure.

While this is an improvement, it's important to remember that they are just guidelines. They do not rise to the level of regulation, let alone law or precedent. The FBI has skated past these guidelines to deploy StingRays before. The older guidelines state that the order/warrant needs to describe the "technique to be deployed," which would inform the court what tool is being used.

The recent Andrews decision in Maryland illustrates how courts are being habitually kept in the dark by law enforcement in regards to the use and deployment of these tools. Granted this is a state case, and not one led by the FBI; however, there's evidence of the FBI's shortcomings as well.

Last, these new guidelines do nothing in regards to the FBI's needless yet fervent non-disclosure agreements with local law enforcement. The FBI makes every agency, no matter the level of government, sign an NDA saying they won't mention, discuss, or acknowledge the use of the tool. The spokesperson at the FBI I spoke with said that the NDAs were to protect the proprietary nature of the tool, which is developed by Florida-based Harris Corp, and not the use of the tool by law enforcement. You can read the NDAs for yourself in the previous link, but my interpretation is that the NDAs go much further than protecting Harris Corp's IP. And cases like Andrews illustrate that local law enforcement agree that the NDA goes beyond protecting the nature of the tool.

The long and short of it is that even with improved guidelines, the NDAs and the lack of controlling law in the vast majority of U.S. jurisdictions leave the situation basically the same.


NeuLaw Criminal Records Database by Jason Tashea

This past weekend, I had the opportunity to travel to Houston and check out NeuLaw's Criminal Record Database. I wanted to take a moment to introduce this project and talk about why it's important.

The U.S. needs a usable criminal records database. Primarily, this is because the closest thing we have right now, the FBI's Uniform Crime Report (UCR), doesn't cut it. First, the UCR doesn't have individual identifiers, so it's impossible to track individual cases. Second, the numbers the UCR releases are cumulative, which means you can't follow the life cycle of a case from original charge to final disposition. Last, the UCR relies on local law enforcement agencies (there are over 18,000 in the U.S.) to voluntarily hand over their data, leaving the dataset incomplete. All of these factors make the UCR woefully deficient. (As a side note, there is discussion about improving and reforming the UCR, but I have not seen any concrete recommendations.)

Understanding the current data problem, it's easier to understand the need for the ambitious NeuLaw project. The database intends to collect tens of millions of criminal records from around the country and compile them in one standardized database. The project offers cumulative longitudinal data, but it also lets researchers drill down to individual cases and follow their journey through the criminal justice system. As of last fall, the database had 22.5 million records, spanning 1977 to 2014, from Harris County, Texas; New York City; Miami-Dade County, Florida; and New Mexico. To go deeper on this project, you can read this article by Pablo, David, Sasha, and Gabe, the project's core team.

This project has not been easy. The team filed numerous freedom of information requests to get this data. Once the data was in hand, they undertook the painstaking effort of standardizing it across jurisdictions. While the whole effort is impressive, it's the standardization process that is amazing. Standardizing charges, dispositions, and human input errors is no small feat, especially when you're talking about tens of millions of records.
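To make the standardization challenge concrete, here is a minimal sketch of the kind of mapping involved: collapsing jurisdiction-specific, free-text charge descriptions into one standard category. The field names, abbreviations, and mapping table are all hypothetical illustrations for this post, not NeuLaw's actual schema or code.

```python
# Hypothetical charge-normalization sketch. Each jurisdiction records the
# same offense in its own shorthand; a lookup table maps raw strings onto
# a shared category, and anything unrecognized is flagged for human review.
CHARGE_MAP = {
    "POSS CS PG1 <1G": "drug_possession",          # Texas-style abbreviation
    "CRIM POSS CONTR SUBST 7": "drug_possession",  # New York-style
    "BURGLARY-2ND DEGREE": "burglary",
}

def normalize_record(record):
    """Map a raw charge string to a standard category, flagging unknowns."""
    raw = record["charge"].strip().upper()
    return {
        "jurisdiction": record["jurisdiction"],
        "raw_charge": raw,
        "std_charge": CHARGE_MAP.get(raw, "UNMAPPED"),
    }

rows = [
    {"jurisdiction": "Harris County, TX", "charge": "POSS CS PG1 <1G"},
    {"jurisdiction": "New York City", "charge": "Crim Poss Contr Subst 7"},
]
print([normalize_record(r)["std_charge"] for r in rows])
# → ['drug_possession', 'drug_possession']
```

Even this toy version hints at the real difficulty: the lookup table has to be built and verified by hand for every jurisdiction's vocabulary, and the "UNMAPPED" pile is where the tens of millions of records earn their painstaking reputation.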

As this project continues to grow, I hope that we have a discussion around the broader use of the project’s data standard. Yes, the database on its own is a needed tool; however, it would be revolutionary to have jurisdictions across the country collect data in its standard format. Undoubtedly, this is a big ask (even bigger than the creation of the database itself), but successfully implementing this standard would be beneficial in two main ways.

First, if the local crime data being created matched the fields in the NeuLaw database, updating and keeping the project current would become immensely easier. This would both increase the usability of the tool and greatly reduce the human hours it takes to keep the project going. Second, the White House, NYU's GovLab, SpotCrime's open data standard, and Measures for Justice, among others, are trying to find or create a standard for justice data. I don't see why we can't explore using the NeuLaw data standard as a national standard. There's no reason to reinvent the wheel if we already have a functional standard just waiting to be packaged as such.

I realize the dream of a national criminal records data standard is a hard ask in a decentralized criminal justice system like ours. However, it is a dream worth fighting for. Without such a standard, we will continue to struggle to build a deep understanding of how our criminal justice system functions. This should matter to everyone involved in criminal justice reform, because it's nearly impossible to find a solution if you can't first understand the problem.

Police Misconduct Databases by Jason Tashea

A piece of mine on the creation of police misconduct databases just came out in the ABA Journal. I had fun writing this piece and getting to know some of the attorneys and advocates putting in the effort to create functional and prolific databases.

I'll let the article speak for itself, just two thoughts on this work generally.

1. Creating new tech isn't the end; it's the means. Tech affecting the criminal justice system is only as useful as our ability to implement and use it. These databases are about better informing the user; they do not supplant advocacy or legal work. Tech won't remove this necessary human component.

2. The police have to get with it. The two main projects in this piece have faced legal battles and acrimony from local police unions and departments. For the police, this is a losing battle. Police departments, like those involved in the White House Police Data Initiative, are opening up their data without the pressure of legal action and public protest (and the sky doesn't fall, either). For example, the Indianapolis Police Department, with help from Code for America, just launched Project Comport, which is a good start. It's also a roadmap for other departments looking to create and launch a public data portal. Police departments need to be thinking about how they default to open.