
In the past three years, data breaches have had a sort of media coming-out party.

Successful attacks on Home Depot, Target, the Internal Revenue Service, banks, and Anthem insurance competed with conventional scandals for top billing in the news. It seemed like just about every person in America suffered a data loss—in 2015, 21.5 million Social Security numbers were stolen from the federal Office of Personnel Management alone.

Security hasn’t improved much in the aftermath, according to experts at a panel on data breaches at RightsCon, a digital rights conference being held in San Francisco this week. But, they said, several steps could help keep consumer data safer.

Make Data a “Toxic Asset”

Bruce Schneier, a leading encryption researcher, writer, and security consultant, said that private companies are collecting all the information they can on their users, assuming that they’ll find ways to use it for profit. “We are basically punch-drunk on data,” he said, though he predicts that in a few years “big data hype will die down.”

Schneier argued that ideally companies would look at data as a “toxic asset,” an idea he’s proposed before. As long as user data is stored on a company server, it can be stolen. And that brings some risk of being sued by consumers or fined by government regulators. The problem, he said, is that “we don’t have a good liability regime right now.”

Both Schneier and other panelists called for measures to raise those financial risks for companies that do a poor job of protecting user data. In economic terms, a data breach is an externality—the costs fall largely on consumers, not on the company with lackluster security. (Air pollution in the absence of environmental regulation is a good analogy, panelists said.) The idea is that if the financial risk were larger, companies would have a greater incentive to collect only the information they really need, to encrypt it, and to delete it from their servers quickly.

Increase Transparency and Choice

Ranking Digital Rights is a nonprofit that rates companies on how well they protect user privacy and freedom of expression. Priya Kumar, a research analyst with the organization, spoke on the panel alongside Schneier. She said that it’s too hard for users to know how companies are protecting consumer data—or even what the companies should be doing. And so consumers can’t factor privacy into their decisions when choosing which online services to use.

However, she said, “companies like to get gold stars, as we’ve learned,” and will change their policies to earn higher scores from independent organizations. Ranking Digital Rights released its first findings, covering 16 big telecommunications and Internet firms, in November 2015.

Make Privacy a Right

Transparency isn’t enough to provide choice, Kumar said. Unless you click “Agree” to a user agreement, you can’t use services ranging from email to online banking to Internet access. If you don’t agree to revised terms of service from Facebook, for example, you can lose a primary tool for connecting with friends and family—along with access to your own family photos. That’s why the “notice and consent” model for protecting privacy should be rethought, she said.

Several panelists said they’d prefer to see privacy as a right protected by the government, rather than a piece of property that consumers can be required to relinquish in order to use important services.
