Even if you’ve never heard of Edward French, his death should terrify you.

Two years ago, the 71-year-old photographer was greeting the sunrise with his camera at a popular San Francisco scenic spot when a pair of robbers shot and killed him. One of those robbers, Lamonte Mims, was in jail just days before the murder. He was released – free to help kill French – because a computer algorithm determined he didn’t pose a grave risk to public safety.

Jurisdictions across the U.S. are snapping up algorithms as tools to help judges make bail and bond decisions. They’re being sold as race- and gender-neutral assessments that allow judges to use science in determining whether someone will behave if released from jail pending trial.

In reality, they’re a dangerous collision of poorly vetted cost cutting and the socialist agendas sweeping this country.

The algorithms scare me because they’re being implemented for the same reason as the early release programs that are getting people killed. The goal isn’t to protect public safety. It’s to empty jail cells and release dangerous criminals on their own recognizance.

As a member of the Senate Judiciary Committee, I’m concerned about the recklessness of public policy that endangers people’s lives, especially in minority communities, where crime often is such a scourge. These algorithms – called pretrial assessment release tools – are the equivalent of using a Magic 8 ball in courtrooms. The results are disastrous to communities and great for criminals.

In my home state of Louisiana, New Orleans decided a few years ago to reduce the jail population. City officials started using a pretrial assessment release tool that was available for free from a nonprofit founded by a former hedge fund manager who became a billionaire through risky investments that turned into gold.

Do you know what happens when you allow a hedge fund manager to restructure your criminal justice system? You get a model that’s fraught with risk.

The new tool comes into play when someone is arrested on a felony charge, such as robbery or rape. It assigns the defendant a score of one to five based on age, criminal history and several other factors. A “one” means the defendant is considered a low risk to public safety. A “five” is considered justification for maximum supervision.
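To make the mechanics concrete, here is a minimal sketch of how a point-based pretrial scoring tool of this general type can work. The factor names, weights and cutoffs below are invented for illustration only; they are not the formula used in New Orleans or by any particular vendor.

```python
# Hypothetical illustration of a point-based pretrial risk score.
# All factors, weights, and cutoffs are invented for this sketch and do not
# reflect the actual tool described in the article.

def risk_score(age: int, prior_felonies: int, prior_failures_to_appear: int,
               pending_charge_is_violent: bool) -> int:
    """Map a handful of defendant attributes to a 1-5 risk level."""
    points = 0
    if age < 23:
        points += 2                                 # youth adds points
    points += min(prior_felonies, 3)                # capped criminal-history weight
    points += min(prior_failures_to_appear, 2)      # missed court dates add points
    if pending_charge_is_violent:
        points += 2

    # Bucket the raw point total into the 1-5 scale described above.
    cutoffs = [1, 3, 5, 7]                          # invented thresholds
    return 1 + sum(points > c for c in cutoffs)     # 1 = minimal, 5 = maximum supervision


if __name__ == "__main__":
    # A young defendant with several non-violent priors can still land in a
    # low-to-middle bucket if the violent-charge flag isn't set.
    print(risk_score(age=18, prior_felonies=3,
                     prior_failures_to_appear=0,
                     pending_charge_is_violent=False))   # prints 3
```

Everything the judge sees rides on how those weights and cutoffs are chosen, and none of that is visible in the single number the tool reports.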

You would think that a risk level of “one” would be limited to people who jaywalk or shoplift. You would be wrong. In practice, a “five” apparently is reserved for people who kill busloads of nuns. Ordinary thugs get a “one” as long as they promise that they’ll spend all their time in church and attend every court appearance. They don’t have to regularly check in with a court officer or even call once a month.

Alan Parson and Richard Sansbury were charged with the robbery of a New Orleans pharmacy in which workers were tied up and a police officer was shot. They received a risk assessment of “one.”

Neither Parson nor Sansbury is from Louisiana. They are believed to be part of an Indianapolis gang responsible for pharmacy robberies in multiple states. But a computer algorithm deems them a minimal risk to public safety.

Then there’s Theron Glover. At 18, Glover faces 160 criminal charges in New Orleans, many related to car break-ins and auto theft. After a series of heists, he received a risk assessment of “three.”

While out on bail, Glover is suspected of progressing from stealing Honda CR-Vs to firing into a crowd of people. Four people were injured, and Glover was hauled back to court. This time, the algorithm gave him a risk assessment of “one.”

These aren’t isolated examples.

The Metropolitan Crime Commission found that 37.6% of the people arrested for violent felonies in New Orleans during the third and fourth quarters of 2018 received the lowest risk level of “one.”  That included more than 32% of the people arrested for homicide and 36.5% of the people arrested for rape.

Algorithms diminish public safety in this country. They ask us to pretend that lengthy arrest records and violent crimes don’t matter.  They ask police to scoop up the bad guys only for the courts to immediately release them.  They turn us into a bad joke.

This is utter insanity, and it’s taking the lives of innocent people like Edward French.
