According to the ACLU’s update on Medium, some people on the United States government’s No Fly List are being kept there out of fear of future criminal activity and are not being allowed to contest their placement in court. As a result, the ACLU is petitioning the Ninth Circuit Court of Appeals in Portland on behalf of its clients. In 2014, a federal district judge ruled against the federal government’s procedures for placing people on the No Fly List, under which the government did not even have to notify people that they were on the list.

“[W]ithout proper notice and an opportunity to be heard, an individual could be doomed to indefinite placement on the No-Fly List,” the court ruled. “[T]he absence of any meaningful procedures to afford Plaintiffs the opportunity to contest their placement on the No-Fly List violates Plaintiffs’ rights to procedural due process.”

Following this ruling and the subsequent court-ordered reforms, seven of the ACLU’s clients were cleared to fly, but they still were not told why they wound up on the list in the first place. In addition to these issues, the government has begun to engage in predictive prevention: using algorithms to determine whether someone will commit a crime in the future.

If this sounds familiar, it is because you have likely encountered the idea before in pop culture. In the 2002 film Minority Report, people are arrested based on predictions from psychics rather than an algorithm. Perhaps the best illustration, though, comes from the video game series Watch Dogs, whose latest iteration opens with its protagonist under police scrutiny because of an algorithm that predicts future crimes.

At the heart of Watch Dogs’ critique of these programs is the insinuation that such policies nearly always target Black people and other people of color; fittingly, the story’s main character is a Black hacktivist. Indeed, fiction mirrors reality: in several instances, algorithms have been shown to reflect the racial biases of their creators.

The ACLU’s clients are, unsurprisingly, all Muslim, a fact that feeds into the current climate of xenophobia fueled by the Trump administration’s policies. The ACLU is arguing that government programs like this one, which attempt to predict crime, carry a high risk of error.

Even though the government did not challenge that assertion in district court, the court sided with the government, holding that “undue risk to national security” (a phrase that honestly could mean anything) justifies the federal government’s secrecy and a process that undercuts civil liberties.
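The error-risk argument rests on simple arithmetic known as the base-rate problem: when the behavior being predicted is extremely rare, even a very accurate classifier will flag overwhelmingly innocent people. The sketch below is purely illustrative; its accuracy and population figures are hypothetical assumptions chosen for the example, not details of any actual government system.

```python
# Illustrative sketch of the base-rate problem in predictive screening.
# Every number here is a hypothetical assumption, not a detail of any
# real watchlist program.

population = 300_000_000    # people screened (assumed)
actual_threats = 3_000      # genuine future offenders among them (assumed)
sensitivity = 0.99          # assumed: flags 99% of actual threats
false_positive_rate = 0.01  # assumed: wrongly flags 1% of innocent people

innocent = population - actual_threats
flagged_threats = sensitivity * actual_threats       # ~2,970 correct flags
flagged_innocent = false_positive_rate * innocent    # ~3,000,000 wrong flags

total_flagged = flagged_threats + flagged_innocent
precision = flagged_threats / total_flagged

print(f"People flagged: {total_flagged:,.0f}")
print(f"Chance a flagged person is an actual threat: {precision:.2%}")
```

Under these assumed numbers, roughly 999 of every 1,000 people flagged would be innocent: even a 99%-accurate predictor of a rare event produces errors at exactly the scale the ACLU warns about.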