Educating the mind without educating the heart is no education at all.
– Aristotle
Dear Diary,
Recently I had a conversation on Twitter with a nice lad, which I’ll include here.
Are Google and Facebook and the like that use our personal data for their own good evil? Are their actions in these circumstances wrong? – Dev Community (@ThePracticalDev)
I don’t think they are evil. I think they are smart. However how they use the data could be considered unethical – Diaries of a Developer (@doadeveloper)
Is there a distinction between evil and unethical? – Doug Black Jr. (@dougblackjr)
So, Diary, this got me thinking. What are our roles as developers when it comes to privacy and security, and what responsibilities do we owe to users and to users’ sovereign nations?
What Defines Private and Secure?
So when dealing with this, I have decided to talk about encryption and its role within our society. Absolute privacy and security would be a situation where one person can send a message to another person and no other party has any idea who sent the message to whom, nor what the contents of that message are.
This poses many questions of ethics and general security, each of which I will present below.
Do we back up a user’s encryption key?
Let’s first go over how a secure three-way messaging system could work.
- A system would use asymmetric encryption, with public/private key-pairs, to deliver a group encryption key.
- This group encryption key is then used to encrypt and decrypt messages and files within a group.
- If a user is added, one would simply encrypt the group key with the new user’s public key, which they could then decrypt with their private key.
- All users would be able to read and send messages to each other this way, and the group key is never exposed to the public.
- If the group key is changed, then all old messages would be lost.
This is a fairly simple and straightforward messaging system, and a similar one to how Owncloud does file encryption, which you can read more about here: Owncloud File Encryption.
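To make that concrete, here is a minimal sketch of the scheme in Python using the cryptography package, with RSA-OAEP standing in for the key-pair wrapping and Fernet for the group cipher. It only illustrates the idea above; the names are made up, and it is not how Owncloud or any particular product implements it.

```python
from cryptography.fernet import Fernet
from cryptography.hazmat.primitives import hashes
from cryptography.hazmat.primitives.asymmetric import rsa, padding

# Padding used to wrap (encrypt) the group key under a member's public key.
OAEP = padding.OAEP(mgf=padding.MGF1(algorithm=hashes.SHA256()),
                    algorithm=hashes.SHA256(), label=None)

# Each member has their own public/private key-pair.
bob = rsa.generate_private_key(public_exponent=65537, key_size=2048)

# One symmetric group key encrypts and decrypts all messages in the group.
group_key = Fernet.generate_key()

# The group key is delivered wrapped under Bob's public key,
# so it never travels in the clear.
wrapped_for_bob = bob.public_key().encrypt(group_key, OAEP)

# Bob unwraps it with his private key and can now read and send messages.
bobs_copy = bob.decrypt(wrapped_for_bob, OAEP)
ciphertext = Fernet(group_key).encrypt(b"hello, group")
assert Fernet(bobs_copy).decrypt(ciphertext) == b"hello, group"

# Adding a new member is just wrapping the same group key for their public key.
carol = rsa.generate_private_key(public_exponent=65537, key_size=2048)
wrapped_for_carol = carol.public_key().encrypt(group_key, OAEP)
```

Notice that whoever generates group_key here holds it in plaintext, and that control point is exactly what the rest of this entry worries about.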
However, at what point do we allow a system to control these messages? By that I mean: should the system retain a copy of the group key in case a user loses their private key? Another problem is whether the system should have the ability to decrypt old messages and re-encrypt them with the new key so that all users can still see them. And if a user’s public/private key-pair is stored in the system for use and syncing to other devices, but simply encrypted with a password, what happens when the user loses that password?
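To put the re-encryption question in concrete terms: migrating old messages to a new group key means decrypting them first, which only works if whoever performs the rotation still holds a usable copy of the old key. A small sketch, again with Fernet standing in for the group cipher and the message history made up:

```python
from cryptography.fernet import Fernet

# Hypothetical message history, encrypted under the old group key.
old_key = Fernet.generate_key()
history = [Fernet(old_key).encrypt(m) for m in (b"hello", b"old plans")]

# Rotating the group key. Re-encrypting the history requires decrypting it
# first, which is only possible while the old key is still available.
new_key = Fernet.generate_key()
migrated = [Fernet(new_key).encrypt(Fernet(old_key).decrypt(c)) for c in history]

assert [Fernet(new_key).decrypt(c) for c in migrated] == [b"hello", b"old plans"]
```

If only the clients ever hold the old key, they have to do that migration themselves; if the server can do it for them, the server can also read everything.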
These questions are not easy to answer for developers who do not work in cryptography day in and day out. There are opinions and thoughts about best practices, but no final answer that also preserves complete privacy and security for the group.
What about malicious use of platforms?
If a system is 100% secure and private, then no one but the participants knows what is going on in that system. If, for example, that system is a chat room, then the participants can talk about whatever they want. This is great for law-abiding citizens, but how does this apply to people who are operating outside the law? Terrorists, money launderers, hacktivists, black markets, vile and putrid media, and spammers are all examples of malicious actors and uses of such a chat system. When building these systems, is a developer responsible for what they will be used for? Is the developer liable? Is it ethical to build a system that goes against the common good in order to protect these types of activities?
This is where building systems with this degree of privacy and security collides with the fundamental logic of ethics.
Let’s say the answer to the previous question was a resounding yes: we keep a backup of the user’s encryption key. Why? Because then we could passively filter the content to make sure no malicious use goes undetected. We could prevent spam to channels by disgruntled members, or by members who were invited without background checks. But even if your platform only used the key for those purposes, you still control a key to that chat. If a government body of your country approached you for copies and records of those chats, then because you hold a key, you could be forced to hand it over and give them access. Now, one might say that this is OK, as the government is going to work against bad actors. However, not all governments work towards the common good, and sometimes they don’t work towards any good.
So, What Now?
If you control a key to the chat, you can build anti-spam and anti-abuse filters. You can build a better user experience by giving users ways to recover passwords and update keys and data. However, you are also more susceptible to hackers gaining access to those stored keys, and you are now at the mercy of your governing nation, whether they request that information legally or not.
- Where do you draw the line?
- What is the ethical answer to having a truly secure and private system?
- Should a developer even build a system that is encrypted end-to-end like this, and thereby begin accepting responsibility for actions taken on it?
- If it is 100% private and secure, and it comes out that a terrorist attack was made possible by using this system, who is at fault?
- Does a user have a right to complete 100% privacy and security?
Summary
As I stated on Twitter:
I’m not an ethics professor, so I cannot define the difference. – Diaries of a Developer (@doadeveloper)
Therefore, I do not have the answers to these questions. These are questions I pose to you, Diary, as I seek the answers myself.
Sincerely,
The Developer