A New Kind of Expungement App by Jason Tashea

An expungement app was my gateway drug. In 2014, I developed a public-facing triage web app for my old employer (it's still up!). At the time, this project was exciting and novel. However, research I did earlier this year found that these types of apps don't really accomplish anything. Yes, from a technical perspective they work, but they create limited impact. It's not just my project: others built on this model were also producing far fewer expungements than hoped.

In the alternative, there is a more impactful model that leverages local court record databases. However, these databases don't exist in every state and, therefore, the model is not replicable. With these limitations in mind, I think there's potential for an expungement app that has greater applicability across states, regardless of court record databases, and improves attorney workflow, a key component of a worthwhile expungement app.

The database model has worked in Maryland and Pennsylvania. Oversimplified, this model scrapes a publicly accessible court record database so that the tool and the lawyers know, in real time, who can expunge their record and who can't. These tools also auto-populate the appropriate forms to file in court.

Both of these tools do two critical things that have led to their success and impact:

1. They make determining and filing an expungement easier.

2. They integrate into an existing expungement workflow.

Collectively, these tools have created tens of thousands of expungements. This is an order-of-magnitude improvement over the public-facing apps that I've built. As mentioned above, however, this model isn't being replicated widely because only a minority of states have the type of database needed to build a project like this. In other states, this type of database is not publicly accessible or is expensive to access.

In a place like Florida, the law makes it impossible to request criminal record data en masse. You have to do it one case at a time, at a rate of $24 each. This creates too much of a hurdle to replicate the Maryland and Philly database approach. 

So, the question then becomes, do states without a public criminal records database just give up hope of riding the expungement tech wave? I don't think they have to. To that end, I want to propose a model I haven't seen yet, but I think has potential.

The idea is to use OCR (optical character recognition) to read RAP sheets and help determine eligibility. This project could be browser- or phone-based. By either using the camera on a phone or by uploading a PDF to a browser, the machine would read the pertinent data from the RAP sheet, process it through an algorithm that knows the expungement statute, and, if the record is expungeable, populate the appropriate forms in an editable format.
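To make the eligibility step concrete, here is a minimal sketch in Python. It assumes the OCR stage has already produced plain text; the line format and the statute rules encoded below are hypothetical stand-ins, since both vary widely by state.

```python
import re
from datetime import date

# Hypothetical rule: a real tool would encode the actual state statute.
WAITING_PERIOD_YEARS = 3

def parse_rap_sheet(text):
    """Parse OCR'd RAP-sheet text into structured entries.

    Assumes each charge appears on a line like:
    2015-03-02 | THEFT | MISDEMEANOR | CONVICTED
    """
    entries = []
    for line in text.splitlines():
        m = re.match(r"(\d{4})-(\d{2})-(\d{2})\s*\|\s*(.+?)\s*\|\s*(\w+)\s*\|\s*([\w-]+)", line)
        if m:
            y, mo, d, charge, level, dispo = m.groups()
            entries.append({
                "date": date(int(y), int(mo), int(d)),
                "charge": charge,
                "level": level,
                "disposition": dispo,
            })
    return entries

def is_expungeable(entry, today):
    """Apply the (hypothetical) statute to one charge."""
    if entry["disposition"] == "DISMISSED":
        return True  # non-convictions eligible immediately
    if entry["level"] == "FELONY":
        return False  # felony convictions never eligible in this sketch
    waited_years = (today - entry["date"]).days / 365.25
    return waited_years >= WAITING_PERIOD_YEARS

sheet = """2015-03-02 | THEFT | MISDEMEANOR | CONVICTED
2021-06-10 | TRESPASS | MISDEMEANOR | DISMISSED"""

entries = parse_rap_sheet(sheet)
results = [is_expungeable(e, date(2017, 8, 1)) for e in entries]
```

The hard parts in practice would be OCR accuracy on low-quality scans and the many edge cases in real statutes, but the core logic is a rules engine of roughly this shape.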

This project would meet the dual criteria of making an expungement easier to determine and integrating easily into an existing workflow. The tool takes the work out of reading someone's RAP sheet and confirming whether their record is expungeable. It also obliterates the time it takes to fill out the paperwork. By saving overburdened and underfunded legal aid attorneys significant time, this is an app they will want in their workflow.

At the beginning of every project, we have to ask ourselves if what we are doing actually makes the end user's life easier. In the case of public-facing triage sites, the user drop-off rate indicates that that model isn't adding value but is cumbersome to the user. Further, it doesn't add anything to a lawyer's ability to do her job. With these limitations in mind, I think an OCR project is the next evolution of these tools.

What do you think?

AI and Criminal Justice by Jason Tashea

Last week, I spent two days at the Government Accountability Office (GAO) Comptroller General Forum on Artificial Intelligence. It was an event that brought experts together in AI from the fields of autonomous vehicles, cyber security, finance, and criminal justice to discuss the current state of AI, propose what the near future will look like, and present potential policy recommendations and research questions to be taken up by Congress. 

To be a part of this well-run, interdisciplinary conversation was a treat. Often stuck in the world of law and criminal justice, it was a welcome change to hear from experts tackling these issues from non-legal perspectives.

Over the two days, I gained perspective on where AI is broadly and how far criminal justice has to go specifically. Without question, the use of AI in the criminal justice system is nascent. The fact that criminal justice was included (to the exclusion of health care, education, or any other policy area) remains surprising to me. One GAO staffer told me that what partially sparked criminal justice's inclusion was my op-ed in Wired, which discussed the potential application of neural nets in risk assessment tools. While flattering, it illustrated my surprise: the jumping-off point for criminal justice's inclusion in the forum was a prospective use of AI, not its current application.

To that end, it did feel like Richard Berk of Penn, the other criminal justice expert in the room, and I were talking about what-ifs, as opposed to the other experts, who were talking about current challenges. Nevertheless, by listening to experts from industry, government, and academia in other fields, I came away with a new perspective regarding AI in criminal justice. Specifically, without a strong market incentive, if quality AI is going to be created in the criminal justice space, then the federal government needs to help. It can do this by building training datasets and creating a transparency mechanism.

While everyone else in the room represented fields with a strong market component, criminal justice was alone in being a government industry. Yes, there are private defense attorneys and third-party vendors that sell products and services to the criminal justice system; however, this does not offset the fact that the criminal justice sector is government owned and operated. What this means for the AI discussion is that there is no market incentive to develop AI at the rate we see in the medical or transportation fields, where the financial payoff will be tremendous.

If the criminal justice system wants to be serious about AI, it is going to need better training data. As I've written about previously, there are organizations that are trying to collect county-level data and clean it for cross-jurisdictional analysis. However impressive, a scrappy non-profit undertaking this is not ideal given the gargantuan size of the project. A job this big and complex should be undertaken and underwritten by the Bureau of Justice Assistance and curated by the Bureau of Justice Statistics. This would create a public dataset that could be used to train new AI in the worlds of risk assessment and facial recognition, for example.

Since collecting these large datasets can often be cost prohibitive, public training data would lower the bar for researchers and entrepreneurs to tackle difficult problems in criminal justice. It would also allow companies or governments developing these tools to benchmark their creations, an important process in the evolution of these tools and the standard in other industries. 

While bolstering the use of AI in the criminal justice system, the U.S. government needs to get serious about algorithmic and AI oversight. There is a transparency issue regarding these tools that is in direct conflict with an open and transparent court process. For guidance, I'll be watching the E.U.'s General Data Protection Regulation (GDPR), which creates a new right to challenge any algorithm that makes a non-human aided decision about a person. This type of intervention is critical as our technologies grow more complex and more opaque. (Even as I write this, I have reason to believe that this provision of the GDPR will fall short if applied to our court system, especially if cases like Loomis and Malenchik become the standard.) 

The U.S. is a laggard in data and algorithmic regulation, and it's time we take these issues seriously. If the feds don't act, we run the risk of states and localities passing their own laws on these issues. This is already happening in the auto industry. In the same way that we don't want a car to be legal and then illegal as it crosses a county or state line, we don't want algorithmic tools doling out recommendations or justice through a regulatory patchwork. Due to the ubiquity of technology and AI, federal preemption on this issue will be important to provide guidance and take away uncertainty in the market and legal system, which will help bolster research and experimentation in this space.

Without a doubt, criminal justice, outside of early facial recognition work, is not leading on AI issues. However, there are specific paths the government can take to create an environment that welcomes research into this space, while protecting due process and creating transparency. The GAO will put out a report based on this meeting, and I'll post more about that when it's released.

In the meantime, where do you see AI being beneficial in the criminal justice system? What are the challenges?


Partner with our Georgetown Law Course! by Jason Tashea

We are seeking partners in the criminal justice system (corrections, courts, defenders, prosecutors, police, social services) to work with our new course at Georgetown Law.

Keith Porcaro, CTO at SIMLab, and I are teaching a course this fall at Georgetown on criminal justice technology, policy, and law. This course is a lab where small groups of students (3 or 4) will work with system partners on designing potential solutions to an existing problem identified by the partner. In total, we are looking for three criminal justice stakeholders to work with. Using teachings from the class, students will work with the partner organization to map a discrete policy or practice challenge and design prototype solutions that partners can take forward. 

A stakeholder partner will be available to help students learn about their system through a series of 30-minute interviews and provide feedback as needed. At the end of the course, the stakeholder will receive a prototyped solution to the problem they face. The course runs from late August to early December of fall 2017. Interested organizations should fill out this intake form, and we will follow up with you. Please don’t hesitate to reach out if you have any questions, jason@justicecodes.org.

Measures for Justice Data Portal by Jason Tashea

This week, Measures for Justice (MFJ) released their new data portal for the public. According to MFJ, the idea behind the portal "allows users to review and compare performance data within and across states, and to break them down by race/ethnicity; sex; indigent status; age; offense type; offense severity; and attorney type." They go on to say that "[t]he Data Portal comprises data that has been passed through 32 performance measures developed by some of the country’s most renowned criminologists and scholars. The measures address three primary objectives of criminal justice systems: Public Safety; Fair Process; Fiscal Responsibility."

Anyone who has done multi-jurisdictional criminal justice research will tell you a project like this is desperately needed. Further, this isn't a project from a dilettante that recently became acquainted with the mess that is American criminal justice. Amy Bach, the executive director of MFJ, is an expert on the criminal justice system and has been fighting this battle for criminal justice reform for many years. For these reasons, this project should be taken seriously.

Launching with six jurisdictions, the staff at MFJ are doing difficult work that the Bureau of Justice Statistics (BJS) should have been doing years ago. What's most impressive is that they're doing this work with just 22 staff in Upstate New York. Further, it's not just a chance to see comparisons through their clean user interface; they made it easy to export any of the raw datasets their staff meticulously pored over.

This being said, a project of this nature and with this massive scope raises some questions:

  • Sustainability: This project is massive and to keep it going will take a lot of money and effort. As of late, MFJ has been infused with a lot of money, especially from the tech sector; but, what comes of these efforts when they are no longer novel or become banal? Even worse, where does the data go if MFJ has to close? I'd assume someone would step in, but does that contingency exist? For the sake of the effort being made, I hope an off-boarding procedure has been considered. 

These challenges were faced by the Sunlight Foundation when they were undertaking their criminal justice data project a number of years ago. Ultimately, they had to off-board the project and never reached their goal of collecting all the criminal justice data from all 50 states plus D.C. and the Feds.

  • Upkeep: The work done by MFJ to get this far has taken six years and, from what I understand, was painstaking. While they plan to add 14 more states in the next three years, what happens to the datasets they've already built? Will we see expansions of these datasets as more data becomes available over time? To do so requires continuous effort and upkeep that isn't necessarily automatable (going to the jurisdiction in person was, in some cases, a needed step). I'm curious to know the plan for this type of perpetual upkeep.
  • Tracking Impact: This is never easy. Ever. While there will be anecdotal successes of district attorneys or judges who see the data and make change, how can we as the criminal justice community track the impact of this work? I hope to see the innovation and energy put into this project extend to creative ways to track its impact. Page-view and download stats will be insufficient, as will success stories from various jurisdictions. I'd be happy to brainstorm on this front. It's a challenge worth tackling.

Beyond these initial concerns, I think MFJ is well positioned to be a normative force in criminal justice stats around the country. While their work is focused on collecting and aggregating this data, I think it would be a missed opportunity if they don't leverage their relationships at the county level to help improve data collection capacity and provide a floor for standardization. My dream is for a government agency or company to create something similar to what Google's General Transit Feed Specification did for transit data. Without a doubt, the criminal justice system is more challenging than transit; however, MFJ seems well suited through their relationships to bring this message and support. Perhaps they build off of the National Information Exchange Model (NIEM) and the Global Justice XML data standards. I'm not sure. However, it would be great to see MFJ, or a coalition led by MFJ, flex this muscle.
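To make the GTFS analogy concrete: at its simplest, a feed specification is a fixed set of files, each with required columns, plus a validator that tells a county whether its export conforms. The file and column names below are invented for illustration; a real spec would be hashed out with the jurisdictions themselves.

```python
import csv
import io

# Hypothetical "criminal justice feed" spec, modeled loosely on the idea
# behind GTFS: named files, each with a required set of columns.
REQUIRED_COLUMNS = {
    "cases.csv": {"case_id", "county", "filing_date", "offense_code"},
    "dispositions.csv": {"case_id", "disposition", "disposition_date"},
}

def validate_feed(feed):
    """feed: dict mapping filename -> CSV text. Returns a list of problems."""
    problems = []
    for name, required in REQUIRED_COLUMNS.items():
        if name not in feed:
            problems.append(f"missing file: {name}")
            continue
        # Read only the header row to check the columns.
        header = next(csv.reader(io.StringIO(feed[name])))
        missing = required - set(header)
        if missing:
            problems.append(f"{name}: missing columns {sorted(missing)}")
    return problems

# A sample county export with one column missing from dispositions.csv.
sample = {
    "cases.csv": "case_id,county,filing_date,offense_code\n17-001,Kings,2017-01-05,PL155.25\n",
    "dispositions.csv": "case_id,disposition\n17-001,dismissed\n",
}
issues = validate_feed(sample)
```

The value of even this bare-bones check is that every county gets the same, machine-verifiable definition of "complete data," which is what made GTFS feeds interoperable.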


Measures for Justice made a great data portal with county-level criminal justice statistics. It's a great and needed tool. There will likely be challenges around sustainability, upkeep, and measuring impact. With their impressive national network of local stakeholders, perhaps MFJ can help standardize data capture in the criminal justice system. Go MFJ!

Can you code for Miranda? by Jason Tashea

A couple months ago, Andrew Ferguson, a law prof at DC Law School, reached out to ask about my thoughts on a recent article he wrote with Richard Leo, a law prof and Miranda expert, on the creation of a Miranda App. I've finally put my thoughts together for a post. This is that post.

For your background, the article is summarized thusly:

For fifty years, the core problem that gave rise to Miranda – namely, the coercive pressure of custodial interrogation – has remained largely unchanged. This article proposes bringing Miranda into the twenty-first century by developing a “Miranda App” to replace the existing, human Miranda warnings and waiver process with a digital, scripted computer program of videos, text, and comprehension assessments. The Miranda App would provide constitutionally adequate warnings, clarifying answers, contextual information, and age-appropriate instruction to suspects before interrogation. Designed by legal scholars, validated by social science experts, and tested by police, the Miranda App would address several decades of unsatisfactory Miranda process. The goal is not simply to invent a better process for informing suspects of their Miranda rights, but to use the design process itself to study what has failed in past practice. In the article, the authors summarize the problems with Miranda doctrine and practice and describe the Miranda App's design components. The article explains how the App will address many of the problems with Miranda practice in law enforcement. By removing the core problem with Miranda – police control over the administration of warnings and the elicitation of Miranda waiver and non-waivers – the authors argue that the criminal justice system can improve Miranda practice by bringing it into the digital age.

I had many thoughts after reading this article. I like the mix of metaphor and machine, and, specifically, I think the metaphor is strong. I admire the leap the authors want to make between the confederated, bureaucratic nature of Miranda to a reality where this constitutional protection is treated like one. As they describe, the challenges around Miranda today create hurdles for this tool.

These hurdles are not necessarily overcome by approaching the problem through a tech lens. First and foremost, I think the authors need to prove that the lack of tech is the problem. The article lays out the many problems with Miranda and then proposes the solution. If this project indeed wants to move forward, I'd recommend running a randomized controlled trial with one group that is told Miranda, one group that reads Miranda, and one group that has a more interactive experience via a website. Afterwards, test everyone on their understanding of the warning. If there's a statistically significant difference between the first two groups and the web-based group, then there might be reason to believe that an app could be valuable. Otherwise, I'm hard-pressed, at first blush, to believe that an app is the solution. (As a side note, I think an interactive, responsive web app would suffice; there's no need to build a proper app for this project.)
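The analysis for a trial like this can be quite simple: compare comprehension scores across arms with a permutation test, which makes no distributional assumptions. The scores below are invented purely for illustration; real data would come from the trial itself.

```python
import random

# Hypothetical comprehension scores (0-100) for the three study arms.
read_aloud = [52, 48, 55, 60, 50, 47, 58, 53]
read_self  = [54, 51, 49, 57, 55, 52, 50, 56]
web_app    = [68, 72, 65, 70, 74, 66, 71, 69]

def mean(xs):
    return sum(xs) / len(xs)

def permutation_p_value(a, b, trials=10_000, seed=0):
    """Two-sided permutation test on the difference in group means.

    Repeatedly reshuffles the pooled scores into two groups of the
    original sizes and counts how often the shuffled difference is at
    least as extreme as the observed one.
    """
    rng = random.Random(seed)
    observed = abs(mean(a) - mean(b))
    pooled = a + b
    hits = 0
    for _ in range(trials):
        rng.shuffle(pooled)
        diff = abs(mean(pooled[:len(a)]) - mean(pooled[len(a):]))
        if diff >= observed:
            hits += 1
    return hits / trials

# Compare the two traditional-warning arms (pooled) against the web arm.
p = permutation_p_value(read_aloud + read_self, web_app)
```

With made-up numbers this separated, the p-value is tiny; the point is only that the statistical machinery needed here is modest compared to the design and recruitment work.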

Further, some time needs to be spent defining what the core competency of the tool is. The article is very broad and articulates numerous potential features, but this is dangerous. Called "feature creep", this is a common issue with any ideation process. While there are an endless number of permutations of the tool put forward, it's important to clearly articulate what the tool's core job is and build (with user feedback) that first, while fending off various unnecessary features. 

On the accessibility front, there's a lack of analysis or definition of what this means. I recommend taking a look at the WCAG 2.0 standards. A project like this should be aiming for an AA standard (the standard recommended by the DOJ).

With any project meant to affect the justice system, implementation is the biggest hurdle. Put another way, building this tool is a moot point if no one is going to use it. The fact that there are 500 different versions of Miranda around the country should illustrate how hard it is to standardize this procedure in America's 19,000 law enforcement agencies. Is there an agency that would be willing to test this idea? The status quo is a powerful foe, so I'd recommend finding an agency that is willing to be a part of the project from the ground floor. It'll be easier to test the initial build and find other adopters if there's the support of a law enforcement agency.

Last, I wonder if the tool as described runs the risk of over-informing the user and causing confusion. The article includes a lecture's worth of information, and that could overload and limit the user's comprehension. Through user testing, they can figure out what the right amount of information is. But this will require time and effort early in the process.

Beyond these thoughts, I had a couple of random questions about the tool as the article conceives it:

  • The article says that the data would be kept on a "secure server"; however, this misses a number of issues. Beyond "secure server" not being defined (does this mean password protected? encrypted? etc.), would the data be protected in transit and at rest? Who has access to the server (police, state's attorney, defender, courts, researchers, developers)? And further, what would the data retention process and procedure be? A major issue with police bodycams has been the cost of storage; who will pay for all this data (including cumbersome video footage) to be stored?
  • Would the user (the arrestee) be informed that 1. they are being recorded and 2. the data they produce is being collected? 
  • Would the metadata collected be anonymized? Or would there be unique identifiers that would make it easy to admit this information in court? If it is admissible, what are the potential impacts of this new information in court?
  • If the user reaches a point in the process within the tool where they need help or further information, what process would take place? Would the police be the intermediaries? If so, does this diminish some of the value of the tech intervention?
  • The article mentions that the tool would be used at booking and not arrest. Correct me if I'm wrong, but isn't the current standard to read Miranda at the time of arrest? Do they envision that Miranda would be read at arrest and then the app would be given to the person at booking? If the only Miranda warning is given via the tool at booking, then does that create a rights gap between arrest and booking?

They've decided to tackle an interesting and difficult problem, which is exciting. However, there's a lot of work to be done to make this idea viable.

Police should weaponize the DMCA by Jason Tashea

I have no love for the mugshot racket.

If you aren't familiar, mugshot websites are privately held sites that post people's mugshots and exploit search engine algorithms to bring a person's worst night to the top of their search results. To be clear, mugshot sites aren't news or crime blotter websites providing a public service. These sites operate under an extortion model that requires a fee to take down the pictures. However, people often find that when they pay a few hundred dollars to take down their photo, their mugshot pops up on a different site. It's an endless game of online reputation whack-a-mole.

To put it mildly, this practice sucks. The Internet is where landlords and potential employers go to find more information on a person. Even if the person was merely arrested and never charged with a crime, their mugshot lives on in infamy. This creates a plethora of collateral consequences that hurt people's ability to move on with their lives (for a lot more on this subject, check out the Collateral Consequences Resource Center). Even if the individual can legally expunge the record, the Internet never forgets.

However, over beers with Sarah Lageson from Rutgers, it dawned on me: the only rule that the Internet even comes close to following is the Digital Millennium Copyright Act, and this could be a weapon against mugshot websites. Because the individual in the mugshot doesn't own the copyright of the mugshot, it becomes an opportunity for the police to take a stand on behalf of the citizens they serve and protect.

To put it simply, the DMCA is an American law that codified two international treaties from the '90s in an attempt to control the illegal use and dissemination of copyrighted material. You've likely come into contact with the DMCA as a sad face and a short message on YouTube.

This is the DMCA in action. Someone with a copyright saw their video on YouTube, hadn't given permission for it to be there, and wrote a letter to YouTube (actually it's a form on the site) saying, "That's mine, under the DMCA please take it down." 

Internationally, 94 parties have joined the World Intellectual Property Organization Copyright Treaty. Most notably, the EU has Directive 2001/29/EC, which is their DMCA. This is important because the Internet doesn't have a jurisdiction like a person does. So, the fact that nearly half of the world is onboard with this treaty matters when it comes to enforcement.

Back to the mugshot issue: the reason why these photos are allowed to be circulated in the U.S. is primarily a First Amendment issue. The press will oppose a police department that tries to shut down the public release of these photos. They will argue the public has a right to know, and that public safety demands the release of this information. Whether or not you agree with this position is irrelevant to the point I'm making, which is that the First Amendment is going to keep the photos in the public sphere. So, turning off this spigot of mugshots is an unlikely solution.

This all brings me to my clickbait-y headline: where it is allowed, police departments should use the DMCA to get mugshots taken off these sites en masse.

Why the police? The default copyright holder of a photo is the person who took the photo; in the case of mugshots, that's the police. The police could write takedown notices that would affect tens of thousands of people in a jurisdiction. Not only would this be a huge benefit to the individuals with mugshots on these sites, it would earn the police a win with the community.

Now, this idea has a couple of potential hurdles. First, some governments do not allow themselves to own copyrights. The feds, for one, do not hold copyrights. In the absence of a copyright, a DMCA takedown notice would not work. One potential workaround would be to hire a third-party vendor to take the mugshots, grant them the copyrights, and then work with the vendor to send the DMCA notices.

Second, the Internet is a lawless, apocalyptic hellscape with no justice or jurisdiction. True, but allowing extortion to continue unabated doesn't seem like a good alternative. As I pointed out above, half of the world's countries are signatories of the treaty that was the basis for the DMCA. My understanding is enforcement is not uniform in those countries. Further, mugshot sites could always decide to be hosted in a country that didn't sign/ratify the WIPO Treaty. This is the same struggle with revenge porn websites and "dark net" hubs for child pornography, for example. However, unless we're just going to throw up our hands at the challenges of enforcing our terrestrial laws online, I think this is a battle worth waging.

Last, there's the small hurdle of finding out where mugshots are and which ones come from which police department. Some sites make it easy to know what jurisdiction they pull their mugshots from like TampaCriminal.com. Others, like Mugshots.com, are a national clearing house of these photos. An automated reverse image search could help in this process. This would take some work, but a partnership between a department down to tackle this issue and a savvy developer could overcome this hurdle. Plus, once the tool is built, it should be open sourced and shared with other departments across the nation.

This is just a cursory pitch, but let me know what you think. I'm by no means a copyright expert, and I'm sure there's other issues that I skate by, but I wanted to throw it out there and get some feedback.

UPDATE: Shortly after publishing this piece, I received some helpful feedback from numerous people. One thing that was pointed out was that the reverse image search is an unnecessary step. Instead, the police department or third party contractor could keep a catalog of the image hash from each photo that would allow them to easily search for the images. 
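The catalog idea is simple to sketch. The example below uses an exact cryptographic hash for clarity; a real deployment would more likely use a perceptual hash (e.g., pHash) so that resized or recompressed copies of a mugshot still match. The image bytes and booking IDs here are placeholders.

```python
import hashlib

def fingerprint(image_bytes):
    """Exact-match fingerprint of an image file's bytes.

    Note: this only matches byte-identical copies. Surviving the
    recompression that mugshot sites apply would require a perceptual
    hash instead, which is an assumption of the real workflow.
    """
    return hashlib.sha256(image_bytes).hexdigest()

# Department-side catalog: fingerprint -> booking record ID.
catalog = {}

def register(image_bytes, booking_id):
    """Record a photo at booking time so it can be recognized later."""
    catalog[fingerprint(image_bytes)] = booking_id

def match(image_bytes):
    """Return the booking ID if a scraped image came from this department."""
    return catalog.get(fingerprint(image_bytes))

# Placeholder bytes standing in for real image files.
register(b"\x89PNG...booking-1001", "1001")
found = match(b"\x89PNG...booking-1001")
missing = match(b"unrelated image bytes")
```

A department would populate the catalog at booking, then run scraped images from mugshot sites through `match` to decide which DMCA notices it can send.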

Second, a concern I was unaware of when I wrote the original post is that there are significant privacy issues when someone uses the DMCA. Specifically, in revenge porn cases a person issuing a notice has to provide private information to prove their identity to the site that is hosting the image. That information is then used to dox the victim, which only continues their public shaming. By having the police department or third party contractor send the DMCA, the individual in the mugshot's personal information does not have to be divulged to the site. This will save people with mugshots from further online reprisals. 

Mugshot websites, still a thing by Jason Tashea

I'm working on a feature piece regarding the use of algorithms in the criminal justice system. One aspect that won't make it into the final draft is that online search algorithms play a role in all this criminal justice and technology work. Here is the excerpt that didn't make the final edit:

Online search algorithms also affect those with criminal records. Julie Cantu of Tampa, Florida found this out when a first date asked about her mug shot he found online. Cantu was arrested in 2010 after blowing below the legal limit during a field sobriety test, but she thought the issue was behind her after the charges were dropped and the record was expunged.

After the date, she found her mug shot with tear streaks running down her face on sites like MugShots.com, Tampacriminal.com and Arrestmugshot.com. These are not newspaper or crime blotter sites that are reporting on local crime. Cantu found herself in the mug shot racket, a series of websites, primarily hosted offshore, that exploit search engine algorithms and demand a fee to takedown pictures. After paying $175 to one site, she found her photo pop up on a different one.

Cantu says she worried that the photo was “going to be there the rest of my life,” which could affect her employment as a nurse.

Luckily for Cantu and the estimated 70 million Americans with a criminal record, Google, which accounts for 65 percent of U.S. search traffic, changed how their search algorithm serves results related to mug shot websites in 2011.

Johnathan Hochman, an Internet marketing consultant based in Connecticut, says that Google did not disclose how the algorithm was changed, but he suspects they “deindexed” mug shot websites, which means they do not show up in search results. Google did not respond to a request for comment.

At the time of publishing, Googling “Julie Cantu Tampa” did not bring up her mug shot on the first five pages of results. However, searching “Julie Cantu Mug Shot” immediately produced the photo.

While Hochman appreciates Google’s effort, he says, “It’s not completely perfect.” He thinks there is need for federal legislation banning the “depublishing” industry, which includes sites like MugShots.com but also revenge porn sites that operate similarly. Calling these sites “extortion”, he says it “is something completely new, and it should be illegal.” 

Tech & Partner Violence: Promise & Peril by Jason Tashea

Last Thursday and Friday I was a part of the "Technology and violence against women: Protection and Peril" symposium hosted by the Ortner Center at the University of Pennsylvania. This was an opportunity for experts in domestic violence, government, justice, law, social work, and technology to come together and discuss the pending and forthcoming policy issues surrounding technology and intimate partner violence (IPV). This was a unique and exciting opportunity to discuss issues that I care deeply about, but, more importantly, to learn about how technology is affecting IPV, good or bad.

The talks were universally enlightening. Topics included new tools and platforms (if you haven't checked out Callisto, fix that before you continue reading), data, smart guns, revenge porn, innovations in post-assault evidence collection, and about a half dozen more.

The challenges surrounding IPV are vast, and the promise of new techniques and technologies is not yet fully realized. Many of the apps being developed may be good in concept but are insufficient in practice. The need to accurately document physical trauma on darker-skinned people remains unmet. Smart guns have the potential to curb gun violence generally, but may have a neutral impact on IPV specifically.

The symposium acted as a place to not only confront these challenges, but to also discuss policy prescriptions. The Ortner Center will be producing white papers on proposed policies. (Here’s hoping that procurement reform and data trusts make the list!)

Beyond the specific talks themselves, this symposium was unique for its intersection of topics. The field of justice and technology remains nascent and obscure to many, so an event like this is a novel treat. When I asked Susan B. Sorensen, a professor at Penn who runs the Ortner Center, about the symposium and her motivation for it, she said it was driven by her curiosity as she witnessed a landscape changing on account of tech. She was, however, uncertain about the timing of the event, admitting that the conference could be too early, which would make it bleeding edge as opposed to cutting edge.

This is a sentiment I've been feeling for a while. I know I've seen improvement and increased awareness over the past few years; this symposium and our own symposium last week are evidence of an evolution toward greater awareness of these issues. Yet when I speak to individuals at more established criminal justice organizations or attend so-called "justice and tech" events, they often don't see the intersection of these two worlds or the need for a singular focus on the subject, the way juvenile justice or bail reform has its own experts. To draw the bleeding-edge analogy out, though, I think we are beginning to see the blood coagulate. Vera now has a VP of tech and justice, the White House put out its Task Force on 21st Century Policing, and the media covers this issue with more savvy, all of which indicates that the issues around tech and justice are moving from the shadows into the mainstream.

After the stimulating symposium, I wanted to reflect on this moment in criminal justice reform and our country's history on the subject. So, I spent a few hours on Saturday at Philly's Eastern State Penitentiary. Opened in 1829, this facility was the first penitentiary in the world and ushered in the promise of a more humane criminal justice system, one that brought prisoners closer to God through solitary confinement, which would make the individual penitent.

For nearly 100 years, Eastern State operated under this theory, called the Pennsylvania system, and more than 300 prisons around the world were opened with the same structure.

The Pennsylvania system and Eastern State were controversial from the start, primarily at odds with Sing Sing Prison and the New York system, which focused on inmate collaboration and labor. Charles Dickens best articulated the conflict between a righteous plan and a ruinous outcome. After his visit in 1842, he wrote:

In its intention I am well convinced that it is kind, humane, and meant for reformation; but I am persuaded that those who designed this system of Prison Discipline, and those benevolent gentlemen who carry it into execution, do not know what it is that they are doing....I hold this slow and daily tampering with the mysteries of the brain to be immeasurably worse than any torture of the body; and because its ghastly signs and tokens are not so palpable to the eye,... and it extorts few cries that human ears can hear; therefore I the more denounce it, as a secret punishment in which slumbering humanity is not roused up to stay.

Today, many criminal justice advocates and the United Nations consider solitary confinement to be torture. However, it wasn't until 1913 that Eastern State ended this practice, primarily because the use of solitary was incompatible with the facility’s overcrowding. Many other prisons around the world continued to use this practice into the post-war period. 

The lessons of Eastern State are numerous, but the largest may be that good intentions and a righteous plan don't guarantee humane reform. 

Currently, the U.S. is experiencing the largest push for criminal justice reform in the last century. Decades of mass incarceration, the school-to-prison pipeline, and indefinite collateral consequences following system contact are now apparent and intolerable to many.

With increased attention come novel and varied solutions to these system ailments, including some from the tech industry. This week's symposium focused on several of them. Much as Dickens observed in 1842, many of these projects are built with a benevolent purpose but little understanding of the nature of the problem, the end user, or the systems they aim to affect. The result is defective, unreliable, and dangerous tools.

I was impressed with the two women I met at this symposium from the National Network to End Domestic Violence, who run TechSafety.org. It is a needed resource (I've been kicking around a similar idea building off our survey of new criminal justice tech): they individually test each tool as an end user would. Time and again, they explain, these tools fall short of the mark. They find incorrect geo-tagging, promises to send a text to police in jurisdictions where police do not receive texts, and design that doesn't consider victims/survivors of IPV. These are horrifying errors considering that someone is supposed to rely on these tools before, during, or after their personal safety is at risk.

This is unacceptable. Without any independent certification or accountability for such tools, there's nothing to stop people from making available well-intentioned websites or apps that jeopardize someone's life. Arguably, there is a level of liability for these developers that could land them in criminal or civil court; but if that happens, it means the worst has already occurred to the user who attempted to rely on the faulty tool.

The stakes are too high for sloppy or ineffectual “innovations”.

To be clear, this isn't a problem unique to tech. Our analogue criminal justice system is rife with ineffectual ideas that ruined lives: the death penalty, scared straight programs, and the Pennsylvania system are just three of a much longer list. As we witness the incoming wave of next-gen criminal justice innovations, we need to acknowledge how easy it is to screw up reform, and then we need to create standards as a community to mitigate these potential errors.

The Pennsylvania system was supported by the country's first criminal justice reform organization: the Society for Alleviating the Miseries of Public Prisons. Its stated goal was a prison system that reformed people, rather than merely holding them in raucous environments with others who had committed crimes. The group thought its model improved the status quo; largely informed by its members' Quaker faith, it didn't think it was institutionalizing a human rights violation.

It's important that we acknowledge that righteous paths can still create ruinous results. Altruism will never be sufficient to make tech-for-good good. Events like the one at the Ortner Center are a large step forward in creating awareness of this issue. However, if we, the reform community, don't take meaningful steps to hold technology accountable in the IPV and criminal justice spaces, then we've already embarrassed ourselves before future generations, who will look back at this moment as a missed opportunity, or worse, something akin to a human rights violation.