I've faced some opposition recently based on my view that the Electronic Frontier Foundation did a disservice to its constituents by focusing so much of its efforts on privacy rather than data ownership. With that in mind, I pose two ethical scenarios to help illustrate my (and the Guardian's) point that solving the data ownership debate will solve far more than just the privacy debate.
First, consider a computer programmer who enjoyed building small applications to give to his friends. He would frequently go to his office on Saturdays, when no one was working, and use his employer's computer to develop these applications. He did not hide the fact that he was entering the building; he had to sign a register at the security desk each time.
The obvious ethical question here is whether using company property for personal gain is permitted. While this raises some quandaries, the answer is just as obvious: if the employer explicitly grants permission, the action is ethical. However, the issue of data ownership also comes into play here. For starters, it is common knowledge that anything created on the job is the sole intellectual property of the employer. This makes sense: since the employer pays you to create and you use the employer's resources to make the item, the employer should reap the benefits.
When data is stored on an employer's system, even if it was created outside of work hours, the data is undeniably in the employer's possession. But does possession equate to ownership? In this instance, I think our current intellectual property laws would say yes: the applications are being developed using the employer's assets, and the scenario does not state that the employee actually obtained permission to use those assets.
What if the employer wanted to act upon this ownership and sell the application? Would that be ethical? Certainly the employer could prohibit the employee from accessing the systems and no one would bat an eye, but how many people would feel comfortable with their employer owning every datum they enter into their work computers?
Second, consider an information security manager who has access to his company's e-mail servers and routinely monitors the contents of messages. During this surveillance, he discovered a great deal of personal use of e-mail, including love letters, marital disputes, affairs, off-color jokes, and gambling. The security manager would provide printed copies of the correspondence to the security and human resources directors, who would then punish employees based on this information, over the vehement objections of those employees.
This issue is more directly related to privacy and is analogous to the domestic surveillance debate that arose after the PRISM scandal. Individuals have some reasonable expectation of privacy in the world, and while numerous legal precedents have held that metadata falls outside privacy protections, no such court rulings have condoned warrantless monitoring of the content of electronic communications.
As a result, some companies have attempted to use End User License Agreements to circumvent this lack of legal precedent by declaring that communications conducted over their systems may be monitored without due process. However, the Federal Trade Commission has been aware of this ploy since 2009 and has condemned companies whose EULAs contradict their marketing campaigns. Given the FTC's stark opposition to commercial surveillance, I find it hard to believe that companies that monitor their employees wholesale would find a warm reception in a court of law.
Our laws, however, are not always correlated with our morals. In this instance, we can reasonably assume that the company is not acting ethically. As companies transitioned from phone to e-mail communications, they gained the ability to move from metadata to content collection, and the debate over data ownership comes into play once again. No one in the 1990s would have suggested that conversations conducted over company phones were owned by the company, so why would a company assume that e-mail communications are suddenly its domain?
Simple: it now has the means to store them. Short of actively tapping employees' phones, there was no way to store phone communications in the 1990s. Since the advent of e-mail, companies can passively store messages on their mail servers, which blurs the line of ownership enough to make unethical decisions seem permissible.
While this example also involves two competing forces that must be balanced, they are not the obvious ones. Readers might be inclined to think the balancing act rests between the right to privacy and productivity, but it actually takes place at the intersection of privacy and an employer's right to reduce fraud, waste, and abuse.
Earlier I mentioned that employers had only metadata before e-mail traffic could be collected and analyzed; they could only marginally ensure that their resources were not being squandered, by looking for long-distance, repetitive, or excessively long phone conversations. With this metadata-only analysis, employers could ensure the ethical use of their resources without infringing on their employees' right to privacy.
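As a rough illustration of how limited that kind of review is, the metadata-only analysis described above could be sketched as follows. The log fields, names, and thresholds here are hypothetical assumptions for illustration, not any real system's schema:

```python
# Hypothetical sketch: metadata-only review of a phone log.
# Field names and thresholds are illustrative assumptions.
from collections import Counter

call_log = [
    # (employee, dialed_number, duration_minutes, long_distance)
    ("alice", "555-0101", 4, False),
    ("alice", "555-0199", 62, True),
    ("bob",   "555-0142", 7, False),
    ("bob",   "555-0142", 9, False),
    ("bob",   "555-0142", 11, False),
]

MAX_MINUTES = 45          # flag excessively long calls
REPEAT_THRESHOLD = 3      # flag numbers dialed repeatedly

flags = []
for employee, number, minutes, long_distance in call_log:
    if long_distance:
        flags.append((employee, number, "long distance"))
    if minutes > MAX_MINUTES:
        flags.append((employee, number, "excessive duration"))

# Count how often each employee dialed each number.
repeat_counts = Counter((emp, num) for emp, num, _, _ in call_log)
for (employee, number), count in repeat_counts.items():
    if count >= REPEAT_THRESHOLD:
        flags.append((employee, number, "repetitive calls"))

# Note: nothing here ever touches the content of a conversation --
# only who called whom, how often, and for how long.
print(flags)
```

The point of the sketch is the last comment: every flag is derived from metadata alone, which is precisely why this style of review can police resource abuse without reading anyone's communications.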
Bringing this metadata collection argument forward to current events: employers have the right to ensure that their system resources are not being used for personal business. Privacy concerns arise when metadata is collected in such volume that it enables the creation of accurate data profiles, as with PRISM, or when schools use school-issued laptops to spy on students (who owns the video footage there?).
So what can administrators do to prevent this gradual erosion of their constituents' civil liberties? If we technologists accept that we must adhere to a professional code of conduct, then we must also accept that our actions must meet a certain standard. The vast majority of computer, information, and data scientists will never be in a position to ruin a life (let alone millions) through abuses of power, but each and every one of us faces ethical decisions every day. Whether we answer to a boss with control issues or a government with an obsession, we must recognize that certain actions are inherently immoral, and the scope is irrelevant.
First, making the right choice with these decisions could mean ensuring that a company doesn't abuse our country's outdated intellectual property laws, or standing up to an overzealous security manager snooping on company e-mail traffic. Above all else, it means guiding our decisions with a consistent moral compass, like James Rest's. Second, administrators need to recognize the nebulous nature of our privacy concerns and that our laws are not the best guide to morality.
“Our laws are focused on data collection, not the usage of data. And, yet, it’s at the usage level where the violations of collective privacy take place. It’s not particularly creepy to know that someone is a Yankees fan if they’re wearing a Yankee’s T-Shirt. But if your algorithm pieces together thousands of pieces of data shared by that person and their friends and develops a portrait of that person from which to judge them… that’s creepy.”
Our laws focus on data collection, but the existence of data is not the concern; its usage and sharing are. In today's interconnected world, individuals are no longer as concerned about what a given company knows about them as about how that information is used and with whom it is shared. These are issues that cannot be solved while we limit the scope of our conversation to privacy; they must be evaluated in the larger discussion of establishing ethical data-ownership legislation.