An Appeal: Please Build Responsibly
Nidhi Gupta, Chief Product Officer, Hired
Over the last few years, that aura and sheen have faded, giving way, in many cases, to distaste and distrust. The 2016 election was an inflection point: we discovered how big a role social media played in that election cycle, and how those platforms enabled voters to be manipulated. The downsides of the products the industry was so proud of had surfaced, for all of us to see. And as we were processing that, we also discovered how poorly these companies handled the private data we had entrusted to them.
At the heart of all this is a question entrepreneurs have long wrestled with. Their job is to innovate. Should it also be their job to identify the ways in which their technology can be misused? One school of thought says entrepreneurs should simply innovate. Create. Aggressively pursue market share and revenue growth. They cannot be ‘burdened’ by social responsibility, because it would slow them down; social responsibility should come ‘later’, or it should be ‘someone else’s’ job.
I am of the opinion that it is absolutely the innovator’s job to build responsibly. They need to ensure, right at the outset, that the products they are building cannot be misused. They built these products to begin with, and they are most intimately familiar with them, so they are best positioned to design software that cannot be abused. Building responsibly should also mean ensuring that our software works for people of all backgrounds, and that it does not perpetuate the biases that exist in the real world. If some of our smartest, sharpest minds will not consider the social impact of their software, then who will?
Artificial intelligence has become increasingly common in all spheres of software development. While AI has tremendous potential to make our lives easier, the risks of AI, when it is not built responsibly, are grave.
Today, we see AI in all spheres of life: recommendations for which products to buy, which news articles to read, which jobs we are qualified for. One of the biggest risks of AI is that it could perpetuate the same biases that exist in our society today. In a recent interview, Jeff Dean, the Head of AI at Google, said, “You might learn, for example, an unfortunate connotation, which is doctor is more associated with the word ‘he’ than ‘she’, and nurse is more associated with the word ‘she’ than ‘he’.” This is exactly what building responsibly ought to avoid.
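The association Jeff Dean describes can be made concrete with word embeddings, where "more associated" means a higher cosine similarity between word vectors. The tiny 3-dimensional vectors below are invented purely for illustration (real models learn hundreds of dimensions from large text corpora); they are constructed to mimic the biased associations a model can absorb from its training data.

```python
import math

# Hypothetical toy embeddings, hand-made to illustrate the bias pattern.
# A real model (e.g. word2vec) would learn such geometry from text.
embeddings = {
    "he":     [1.0, 0.1, 0.0],
    "she":    [0.1, 1.0, 0.0],
    "doctor": [0.9, 0.2, 0.3],
    "nurse":  [0.2, 0.9, 0.3],
}

def cosine(a, b):
    """Cosine similarity: 1.0 means identical direction, 0.0 unrelated."""
    dot = sum(x * y for x, y in zip(a, b))
    norm = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b))
    return dot / norm

# In this biased toy model, 'doctor' sits closer to 'he' than to 'she',
# and 'nurse' closer to 'she' than to 'he'.
doctor_he  = cosine(embeddings["doctor"], embeddings["he"])
doctor_she = cosine(embeddings["doctor"], embeddings["she"])
print(doctor_he > doctor_she)  # True: the "unfortunate connotation"
```

Auditing these similarity gaps for sensitive word pairs is one practical way to detect whether a trained model has picked up such connotations before shipping it.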
Here are some steps for all of us to take in order to build responsibly.
1. Build your software for everyone. When you are building your innovation, you will often rely on your own social and economic circumstances to make decisions. But those circumstances do not apply to people from different backgrounds. To build products that appeal to, and are usable by, a wide range of people, you need to consider how your software will work for users of all backgrounds and points of view.
2. Hire a diverse team to help with this effort. If engineers of the same socio-economic background and ethnicity build your algorithms, where will diversity of thought come from? Our backgrounds factor into our decision-making, at work and in personal life. If your team has not seen a wide range of backgrounds and experiences, your code is not going to work for everyone. A simple example is the infamous soap dispenser: recent studies suggest that some automatic soap dispensers do not work for people with dark skin, because they were never tested with dark-skinned users.
3. Train your algorithms on normalised data so as not to perpetuate biases. Our world carries a great deal of unconscious bias. You can actually leverage the technology you are building to produce outcomes that are more equitable for people of all stripes. Take an example. The tech industry today is about 20 percent women. Say you are building an algorithm that surfaces job candidates for a given position, trained on a dataset of 1,000 candidates. At the ratios prevalent in the industry today, that dataset will likely contain 800 men and 200 women. If the algorithm is trained on it, a male candidate is four times as likely to be surfaced as a female one. If the dataset is instead normalised to 500 men and 500 women, the algorithm will not bias toward either gender.
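The normalisation step above can be sketched in a few lines. This is a minimal illustration, assuming a hypothetical candidate record with a `gender` field; it balances the 800/200 pool by downsampling the majority group (the alternative, upsampling the minority group to 500, works the same way with sampling-with-replacement).

```python
import random

def normalise_by_gender(candidates, seed=0):
    """Balance a candidate list so both genders are equally represented.

    Downsamples the majority group to the size of the minority group.
    `candidates` is a list of dicts with a 'gender' key ('M' or 'F') --
    a toy schema invented for this sketch.
    """
    rng = random.Random(seed)
    men = [c for c in candidates if c["gender"] == "M"]
    women = [c for c in candidates if c["gender"] == "F"]
    n = min(len(men), len(women))
    balanced = rng.sample(men, n) + rng.sample(women, n)
    rng.shuffle(balanced)
    return balanced

# Toy pool mirroring the 800/200 split from the text.
pool = ([{"id": i, "gender": "M"} for i in range(800)]
        + [{"id": 800 + i, "gender": "F"} for i in range(200)])

balanced = normalise_by_gender(pool)
men = sum(1 for c in balanced if c["gender"] == "M")
women = sum(1 for c in balanced if c["gender"] == "F")
print(men, women)  # 200 200
```

An algorithm trained on the balanced pool sees equal base rates for both groups, so gender alone no longer tips the odds of a candidate being surfaced.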
4. Abuse prevention should be top of mind. As we have seen from the Facebook, Twitter, and YouTube examples, think of the ways your innovation can be misused or abused, and build prevention mechanisms against them. Today, when engineers write code, they are asked to test for ‘edge cases’: unexpected inputs to the code. For example, an engineer who has built a form with a name field does not expect numbers in that field; this is something they would test for and guard the code against. In the same spirit, it should behoove us to protect our code from abuse, to think of the ways someone could misuse our innovation.
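The name-field example above can be sketched as a small validator. The pattern and length limit here are hypothetical choices for illustration: the digit check covers the edge case from the text, and the length cap is one simple abuse guard (rejecting oversized inputs sent to probe or overload a service).

```python
import re

# Hypothetical policy: names start with a letter and may contain
# letters, spaces, hyphens, and apostrophes -- no digits.
NAME_PATTERN = re.compile(r"^[A-Za-z][A-Za-z' -]*$")
MAX_NAME_LENGTH = 100  # abuse guard: reject absurdly long inputs

def is_valid_name(value: str) -> bool:
    """Return True only for inputs that pass both checks."""
    if not value or len(value) > MAX_NAME_LENGTH:
        return False
    return bool(NAME_PATTERN.match(value))

print(is_valid_name("Nidhi Gupta"))  # True
print(is_valid_name("user123"))      # False: digits rejected (edge case)
print(is_valid_name("a" * 10_000))   # False: oversized input rejected
```

The same mindset scales up: just as this validator anticipates inputs the form was never meant to receive, a product team should enumerate uses the product was never meant to enable, and build checks for those too.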
In summary, for entrepreneurs who wonder whether it is their job to ensure their innovations are not misused, I believe the answer is a resounding YES! As to when you should start? Start NOW! Your products, your employees, and the world cannot wait for you to become socially responsible at a later date. Your products are meant to change the world, so please build responsibly and make sure they change it for the better!