By Nikki Kallio


As companies, organizations and governmental agencies collect more and more data from individuals, criminal networks are working just as hard to stay a few steps ahead of the safeguards meant to protect it.

People want assurances that their data is safe, and policies surrounding privacy and ethics are moving front and center. Governmental agencies, including law enforcement, are among those working to evolve with the changing technical landscape.

This spring, Fox Valley Technical College’s National Criminal Justice Training Center (NCJTC) announced a partnership with Biometrica Systems to build a new course aimed at training law enforcement in the ethical use of big data and facial recognition in critical investigations.


“A lot of the technology specific to law enforcement investigations is kind of the modern-day DNA, if you will,” says Aaron Tomlinson, NCJTC program development administrator. “There’s really not a lot of case law around it or a national standard or policy for usage.”

Questions from NCJTC clients — and from lawmakers — often center on the balance between when you can use certain information and when you should, Tomlinson says.

Problems arise when information gleaned from that software is not corroborated — in other words, good old-fashioned police legwork is still required, Tomlinson says.

“We have a whole section of the program called ‘start with the wrong,’” Tomlinson says. “Which basically means, start with the fact that the person you’re getting (with the software) is not the right person.”

An incredible amount of information can be gathered from wearable devices, smart refrigerators and beds, later-model cars and more — and Tomlinson says framing a national standard for information use will be key going forward.

“Algorithms are incredible,” Tomlinson says. “But it doesn’t remove the human element of needing verification.”
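
To make that concrete, here is a minimal sketch (Python, with hypothetical names and an assumed similarity threshold) of a workflow that treats a facial-recognition hit as a lead rather than an identification: the software may rank candidates, but nothing becomes actionable until a human corroborates it.

```python
from dataclasses import dataclass

@dataclass
class Candidate:
    name: str
    similarity: float           # score from the matching algorithm, 0.0 to 1.0
    corroborated: bool = False  # set only after independent investigative work

LEAD_THRESHOLD = 0.90  # assumed cutoff; real systems tune this per use case

def triage(candidates: list[Candidate]) -> list[Candidate]:
    """Return candidates worth investigating -- leads, never identifications."""
    return [c for c in candidates if c.similarity >= LEAD_THRESHOLD]

def may_act_on(candidate: Candidate) -> bool:
    # "Start with the wrong": assume the match is wrong until a human
    # verifies it through independent evidence.
    return candidate.corroborated

hits = triage([Candidate("J. Doe", 0.97), Candidate("A. Roe", 0.62)])
for hit in hits:
    print(f"{hit.name}: lead only, actionable = {may_act_on(hit)}")
```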

Hitting the brakes on all new technology — such as when communities ban facial recognition — isn’t necessarily the right action, either. “Imagine if you have a child that went missing,” Tomlinson says. “You’d be eliminating an opportunity to potentially recover that child.”


UW-Oshkosh recently opened the Cybersecurity Center of Excellence in partnership with the Wisconsin Cyber Threat Response Alliance.


The evolution of data use

Conversations around the ethical use of data and technology are not new — they’re simply evolving as the technology does.

“This discussion kind of started back in the 1980s,” says Michael Patton, University of Wisconsin-Oshkosh Information Systems lecturer and co-founder of the UW-Oshkosh Cybersecurity Center of Excellence. “This was really before the internet took off, but organizations were starting to collect data and keep it in digital format.”

At the time, information systems scholar Richard Mason coined the acronym PAPA — privacy, accuracy, property and accessibility — to describe the key ethical issues surrounding the growth in data usage.

Mason recognized the importance of data remaining accurate as well as keeping certain types of data private — think in terms of health care organizations and credit issuers, Patton says.

Accessibility refers to consumers knowing what types of data have been collected and how they’re being used. In terms of property, discussing what kinds of data — such as email addresses — are “owned” by whom and what can be done with them is key to protecting personal information.

“One of the things we know about our society is people who own stuff get to make decisions about how it’s used,” Patton says.


Who is a target?

“There is not a business out there that doesn’t want to know their customers better to sell to them better,” Patton says. And it can be more convenient for consumers who may be happy to discover new products that are a good fit for them.

But there’s always the flip side. Threats to data security come from many directions, including nation states, organized crime, “hacktivists” and others who want personal information for nefarious purposes. One of the biggest issues right now is ransomware.

“The bad guys set it up as a business,” says Jerry Eastman, CEO and founder of the Wisconsin Cyber Threat Response Alliance (WICTRA) and co-founder, with Patton, of the UW-Oshkosh Cybersecurity Center of Excellence. “They even have customer service that will help you with bitcoin and so on.”

Once the bad guys are in your network, they may hold the information for ransom, demanding whatever they think you can afford — just over what your insurance will cover, for example — and that can break a small company.

Criminals will “use software to just scan the internet or they’ll send out phishing emails and if you click on it, they’re in,” Eastman says. “I’ve helped two companies survive — one from ransomware, one from phishing. It’s pretty sad when it happens to a very small business, and they think they may have to close their doors.”

Phishing scams — in which someone tries to steal key information such as Social Security or credit card numbers — may come in the form of a legitimate-looking email from a familiar company that encourages you to click a fraudulent link. These attacks “often rely on human emotions, social engineering people to click on links for whatever reason,” Eastman says.
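
As a simplified illustration of how one such signal can be checked, the sketch below flags a link whose visible text names one domain while the underlying URL points to another. Real mail filters combine many signals; the function names and example addresses here are assumptions.

```python
from urllib.parse import urlparse

def domain(url: str) -> str:
    """Extract the host, tolerating links written without a scheme."""
    if "//" not in url:
        url = "//" + url
    return urlparse(url).netloc.lower().removeprefix("www.")

def looks_suspicious(display_text: str, href: str) -> bool:
    """True when the visible text claims a domain the real link doesn't match."""
    shown = domain(display_text) if "." in display_text else ""
    return bool(shown) and shown != domain(href)

# The classic lure: the text says your bank, the link goes somewhere else.
print(looks_suspicious("www.yourbank.com", "http://yourbank.example.ru/login"))  # True
print(looks_suspicious("www.yourbank.com", "https://www.yourbank.com/login"))    # False
```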

Training employees in best practices is essential, because even though humans are the easiest target, “they’re also the very best defense,” Eastman says. “They’re the best last defense a company has against phishing.”


Preventing data theft 

So, where does the responsibility fall to protect data — with individuals or with companies?

“The answer is yes,” Patton says. “We as individuals need to take some responsibility for how much data we give and what data we give up. But then, organizations need to be responsible to their members.”

On the technical side, organizations can install the right kind of firewalls and encryption and set permissions in such a way that only the people who need access can see data, he says.
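
A minimal sketch of that least-privilege idea, using hypothetical roles and record fields, might look like the following: each role sees only the fields it needs, and everything else is filtered out before the data reaches the user.

```python
# Hypothetical roles and fields; real systems enforce this in the
# database and application layers, not in a single function.
PERMISSIONS = {
    "billing_clerk": {"name", "invoice_total"},
    "nurse":         {"name", "medical_history"},
    "marketing":     {"name"},  # no health or payment data
}

RECORD = {
    "name": "Pat Example",
    "invoice_total": 120.00,
    "medical_history": "confidential",
}

def view(record: dict, role: str) -> dict:
    """Return only the fields this role is permitted to see."""
    allowed = PERMISSIONS.get(role, set())
    return {field: value for field, value in record.items() if field in allowed}

print(view(RECORD, "marketing"))  # {'name': 'Pat Example'}
print(view(RECORD, "nurse"))      # name and medical_history only
```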

Patton argues that organizations should follow models like Europe’s, where opt-in laws mean companies delete any personally identifiable information unless the customer specifically says they can keep it. Here, it’s essentially the opposite: it’s on us to opt out.
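
The difference between the two models comes down to the default. A small sketch, with assumed field names, makes that visible: under opt-in, personal data is deleted unless the customer has said yes; under opt-out, it is kept unless the customer has said no.

```python
def retain_pii(record: dict, model: str) -> bool:
    """Decide whether to keep a customer's personal data under each model."""
    if model == "opt-in":
        # European-style default: delete unless the customer consented.
        return record.get("consented", False)
    if model == "opt-out":
        # U.S.-style default: keep unless the customer objected.
        return not record.get("opted_out", False)
    raise ValueError(f"unknown consent model: {model}")

customer = {"email": "pat@example.com"}  # never expressed a preference
print(retain_pii(customer, "opt-in"))    # False: delete by default
print(retain_pii(customer, "opt-out"))   # True: keep by default
```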

“As with most things, the legislative response is slow — the law always lags behind technology,” Patton says. “And while there are laws that have been passed around credit card numbers and on financial information, there has been less focus on personally identifiable information.”

Organizations also “need to do a better job at filtering out information they don’t need to keep or gather,” Eastman says. “They gather so much, and they have it and then they hold on to it — why do you have my stuff for 10 years? We as a culture need to ask those questions.”
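
One way to act on Eastman’s question is a retention window, after which data is purged automatically. A sketch, assuming a one-year window and hypothetical records:

```python
from datetime import datetime, timedelta

RETENTION = timedelta(days=365)  # assumed window; the right figure varies by data type

def purge_expired(records: list, now: datetime) -> list:
    """Keep only records collected within the retention window."""
    return [r for r in records if now - r["collected"] <= RETENTION]

records = [
    {"email": "old@example.com", "collected": datetime(2014, 1, 1)},
    {"email": "new@example.com", "collected": datetime(2023, 1, 1)},
]
print(purge_expired(records, now=datetime(2023, 6, 1)))  # only the 2023 record survives
```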

“The upside is there are really great people in the world that are starting to see that vision and figure out ways to get better at protecting our own assets,” Eastman says.

One of the points Patton shares in his classes is that technology is neither good nor bad nor neutral. “By introducing that new technology, the world has changed; we can’t go back to a time before that,” he says. So we must continue to have discussions centered on how and why that technology will be used.

“If we’re not talking about it, then people will make those decisions for us,” Patton says. “And we may not like the outcomes.”