The 4 pillars of data collection ethics

Every business collects data of one sort or another, and most businesses recognize that securing the data they collect is vitally important — there’s no question about it.

Once a company scales up, it often starts running into moral and ethical questions about transparency, security, and privacy. A lot of major corporations have become the target of media attention when they’ve dropped the ball, but could your organization do any better in the same situation?

Obviously, a company is in the business of making money, and sometimes that means stepping on a few toes. But knowing where to draw the line can get tricky.

Ultimately, a company that displays honesty and transparency is going to fare better when it comes to public opinion. And with any luck, the executives will be able to sleep better at night. Adhering to data collection ethics represents a deeper commitment to being an ethical company.

The foundations of data collection ethics

There’s no single definitive source of information about data collection ethics, though there are some generally accepted best practices.

Still, it’s important to have a bearing point for data collection ethics, and that boils down to four foundational pillars: respect, stewardship, neutrality, and transparency.

Incorporating these four pillars into your infrastructure, employee training, communications, and all other aspects of your business should illustrate accountability and the importance of an ethical code at your company.

Pillar 1: Respect

When a person agrees to work with a business, they enter into a professional relationship with that company. The relationship may not be as intimate or friendly as a personal relationship, but it’s a relationship nonetheless.

Before potential clients can trust your business, they need to respect you. You can earn that respect by establishing a solid business reputation, maintaining a good track record with customers, and producing convincing sales copy. But it goes both ways.

Mutual respect is a valuable part of any relationship, and because managing data places businesses in a position of power, they have the greater risk of abusing that power. Abuse may be a strong word to use in an article about business, but it’s meant to make a strong point — businesses have a moral obligation to treat their clients, customers, and partners with respect.

When it comes to handling large amounts of data, it’s important to remember that you’re not just handling data; you’re handling people’s data.

That data can contain someone’s personal information or sensitive materials that require a level of privacy. A business that collects data needs to have a healthy respect for these things. So, throughout the entire life cycle of collecting any data, your company needs to be transparent about how you’ll use it, get consent to collect it, maintain it securely, and ensure its confidentiality.

Pillar 2: Stewardship

Data you collect from your customers is not your data. Even if it’s data that your business generated programmatically, it still may not be your data. It’s really data that’s under your company’s care.

Technically, your business may own all the rights to the data, but if a source outside of your business collected that data, it simply means that you’re now the protector of data that used to belong to someone else. You must be a trustworthy and responsible steward of that data.

Data stewardship should be more than just an attitude and a promise. You need to have the infrastructure in place to make it a reality as well.

Your business should keep all the data it manages in the most secure system you can afford. You should also monitor that system continuously and report on it regularly.

Your security framework should protect against internal and external threats. It’s not enough to have a good firewall to protect your data from the internet. It’s also important to carefully weigh, evaluate, and maintain each staff member’s access permissions.

You should hold your team and systems accountable, and that’s something you can manage both internally and externally. Routine maintenance of your security systems should be part of your day-to-day business, but you should also audit and update those systems on a set schedule.
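To make the idea of a permission audit concrete, here’s a minimal sketch in Python. The role names, data categories, and staff records are invented for illustration; the point is simply to compare each person’s granted access against a least-privilege policy and flag anything that exceeds it.

```python
# Hypothetical least-privilege policy: each role may access only the
# data categories it genuinely needs. (Role and category names are
# made up for this example.)
ALLOWED = {
    "support": {"contact_info"},
    "billing": {"contact_info", "payment_data"},
    "engineering": {"system_logs"},
}

def audit_access(staff):
    """Return (name, category) pairs where a grant exceeds the policy."""
    violations = []
    for person in staff:
        allowed = ALLOWED.get(person["role"], set())
        for category in sorted(person["granted"]):
            if category not in allowed:
                violations.append((person["name"], category))
    return violations

staff = [
    {"name": "Ana", "role": "support", "granted": {"contact_info"}},
    {"name": "Ben", "role": "engineering",
     "granted": {"system_logs", "payment_data"}},
]

# Ben's payment_data grant falls outside the engineering role's policy.
print(audit_access(staff))
```

Running a check like this on a schedule, and treating every flagged grant as something to justify or revoke, is one simple way to turn the stewardship principle into a routine practice rather than a promise.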

Pillar 3: Neutrality

Personal opinions shouldn’t affect your business unless your business is specific to a certain niche where offering your subjective perspective makes sense. For example, a company that builds and manages websites for churches should be friendly and welcoming to members of different faiths.

But for the most part, religion, politics, and fringe beliefs shouldn’t interfere with your daily operations. It’s OK for a business to show that it supports various causes, but it’s best not to be exclusionary or risk alienating your audience. Not only is an attitude of inclusion ethical, it’s also a way to attract the widest audience possible.

Regardless of what your company policies are, your data management should be strictly neutral and nonjudgmental. Once you’ve accepted someone’s business, you should ensure that their data is safe from any form of bias.

Part of maintaining neutrality is recognizing people as people. Businesses understand that shorter wait times improve customer satisfaction and vacation time reduces employee fatigue, but that doesn’t cover the whole spectrum of the human experience.

Issues like privacy and dignity are important as well. If a person is sharing private information, they should feel confident that no one will use it against them in any way.

It’s possible for your systems to have a bias for or against certain types of data, though it would be uncommon. It’s much more likely for the human beings who manage data to have certain biases. Your entire company should be impartial when it comes to data management, and you shouldn’t tolerate anyone allowing their prejudices to cloud their judgment when they’re handling data.

Pillar 4: Transparency

Maintaining transparency is a balancing act. A lot of what your business does is proprietary and must be kept confidential. Transparency is a must for businesses that have gone public, but it’s also a foundational part of a company’s ethical data collection practices.

Before you can even think about being honest as a company, you have to be honest with yourself. A lot of ethical gray areas can appear much more black-and-white when emotions come into play.

Emotions can easily cloud judgment. That’s why it’s important to be impartial and make decisions as a team when it comes to data privacy. When your team has decided that your business messed up in a way that affects your customers, it’s time to step up and take responsibility.

To help clarify whether you should share information about a data-related incident, consider the impact on customers. Say you lost a backup but the live data is intact — you can make a new backup and carry on. But if any customer data itself was lost or compromised, that’s the kind of information you should share.

When your business screws up, it’s important to not only do damage control, but also take responsibility. That should include communicating issues and resolutions with affected parties. Your attitude should be: “We’re sorry and we’ll make this right.”

Advanced ethical dilemmas to consider

Supermarket loyalty data

Years ago, supermarkets started using loyalty cards to incentivize people to shop at their establishments exclusively, but a hidden benefit of those cards was the data they gathered. That data helped companies quantify information that was previously hard to track. It helped them understand consumer habits and much more.

But a tale from those early days reveals a dark side to data collection. As the story goes, a man got into a car accident and caused some property damage. The owner of the property sued the driver. The plaintiff’s lawyer was able to get access to the driver’s supermarket loyalty card data. The data showed that the man had purchased a lot of alcohol. The lawyer was able to convince the court that the driver was an alcoholic and that it was likely he was drunk while driving when the accident occurred. That won the case against the driver.

Let’s think about that story in terms of the four pillars. In this case, the biggest failing of the supermarkets was a lack of stewardship. They didn’t protect the driver’s identity and they didn’t respect his privacy.

And while the supermarket may have been neutral in its data collection, someone used the data it shared in a way that wasn’t neutral. The store may have been transparent about the data itself, but it wasn’t transparent about how that data was actually being used.

The whole story illustrates an ethical conundrum. It’s easy to say that an organization should never share personal data, but there’s often a gray area. When does data sharing become a privacy violation? Is it OK to share data that may help resolve a legal issue? Do the ends justify the means? Where do we draw the line?

Facebook’s suicide prevention algorithm

In 2017, there were several instances of people live-streaming the final moments of their lives on Facebook for anyone to see. Facebook responded by introducing an algorithm designed to identify and prevent suicide attempts by monitoring what individuals share on their network. There were immediate concerns over the invasive nature of this new technology, but there were other issues as well.

First off, Facebook started collecting this data without getting consent from its users — this gave people no way to opt out.

In the United States, HIPAA regulations protect sensitive medical information in a number of ways. They help guarantee the anonymity, privacy, and dignity of anyone receiving medical attention. The thing is, Facebook isn’t a medical organization, so it doesn’t have any obligation to comply with HIPAA standards. That means that Facebook can do whatever it wants with the sensitive information it gathers from people with mental health issues.

When the algorithm detects that someone is suicidal, Facebook can go as far as contacting local authorities to address the problem. In one case, a police officer was sent to pick up a person that Facebook tagged as suicidal. The person’s information was then shared with the New York Times. Sharing that information with a news outlet would be a clear violation of that person’s rights under HIPAA — but the police aren’t obliged to adhere to HIPAA regulations either.

Many people might consider it disrespectful for a company to make decisions about releasing people’s information without their consent and without giving them a chance to opt out — but is it OK to do so when it might save lives?

The way that an organization shares the information they collect is a question of stewardship. Is it OK to distribute information to the authorities to help people who might be suffering? Should suicidal ideation be a subject that a company remains neutral about? How transparent is Facebook about its use of personal information?

These questions are not easy to answer and aren’t intended to make companies like Facebook look bad. They simply demonstrate an advanced ethical challenge that companies should think long and hard about before they begin data collection.

A few more examples

Our AI helpers

Siri, Google Assistant, and Alexa are always listening to us. The mere thought of that can be frightening, but the truth behind their listening is more nuanced than that.

They save only the conversations we have with them directly. They’re always listening for a wake phrase, like “Hey Google,” but recording doesn’t begin until the conversation with the assistant starts.

However, once that conversation does start, everything you search for, request, and say is recorded. And those companies can use that data however they want.

They can track your location, spending habits, and interests. Then they can use that data to influence your purchases, shopping preferences, and more. They could even theoretically use that information to influence your relationships, beliefs, and career paths.

The most private data — DNA

The service 23andMe can provide information about your health and family based on genetic data. You send a sample of your DNA, and 23andMe sends you information about it.

That data is incredibly valuable, and 23andMe and other genetic testing companies have shared DNA data with pharmaceutical giants. It’s big business.

The problem is that companies could also sell or leak that data to health insurance companies. That could lead to insurance providers denying people medical coverage for preexisting conditions based on their DNA.

Governmental tracking

The most ethically complex example is China’s social credit system. China is building up a system that uses facial recognition to track citizens and their behavior. The system then weighs that behavior and assigns a score to them.

A person can lose points if authorities catch them jaywalking, if they fail to pay a debt, or if they speak out against the government. A poor credit score can lead to an inability to book a flight, take out a loan, or buy property.

Your homework assignment

Take a moment to consider these examples. Think about the four pillars and how they align with the moral questions in each example.

Do you find each to be an ethical use of data? Why or why not?

If you had complete authority, what would you do differently? What would you do similarly?

There are no easy answers to any ethical question that may arise when it comes to data collection. And with the exponential growth of data science in our society, these moral dilemmas are bound to become more complex and challenging.

Our dedication to the four pillars

Jotform respects your privacy and the privacy of people who fill out your forms. We treat form questions and responses as private data. You can get more information about that from our privacy policy.

Stewardship is also a top priority for us. We provide industry-leading security, encryption, and protection, and we help our users meet HIPAA, CCPA, and GDPR compliance requirements.

Jotform is a completely unbiased company. Anybody can use our software for any ethical purpose, and we’re an equal opportunity employer with offices in multiple countries.

And Jotform has never had a data breach. If we ever did, we would address the issue responsibly and take appropriate action.


AUTHOR
Lee Nathan is a personal development and productivity technology writer. He can be found at leenathan.com.
