
Understand the Ethical Use of Technology

Learning Objectives

After completing this unit, you’ll be able to:

  • Define bias and fairness.
  • Build a diverse team to avoid bias and gaps.
  • Translate company values into processes.
  • Describe the importance of understanding your customers for ethically aligned design.

Responsible Technology Design and Development

We are just beginning to understand how emerging technology impacts society. Diverse issues have arisen, from questions about automation replacing jobs to speculation about the developmental effects of social media.  

Many industries are governed by standards, protocols, and regulations meant to ensure that their products have a positive impact on society. Medical doctors, for example, follow the Hippocratic Oath and have established research review boards to ensure ethical practices. The automobile industry is subject to driving laws and to safety standards for seatbelts and airbags. More generally, in 2011 the United Nations endorsed the Guiding Principles on Business and Human Rights, which define the responsibilities that businesses and states have to protect the rights and liberties afforded to all individuals.

At Salesforce, we understand that we have a broader responsibility to society, and aspire to create technology that not only drives the success of our customers, but also drives positive social change and benefits humanity. Salesforce established the Office of Ethical and Humane Use to blaze a trail in the Fourth Industrial Revolution by helping our customers use our technology to make a positive impact. This effort is anchored in Salesforce’s core values (trust, customer success, innovation, equality, and sustainability).

Salesforce core values emblazoned on a shield—trust, customer success, innovation, equality, and sustainability

When it comes to technology ethics, the questions have never been more urgent—and it’s up to all of us to find the solutions.

What Does It Mean to Be Biased or Fair?

When you create or use technology, especially involving artificial intelligence or automation, it’s important to ask yourself questions of bias and fairness.

At Salesforce, we see bias as “systematic and repeatable errors in a computer system that create unfair outcomes, in ways different from the intended function of the system, due to inaccurate assumptions in the machine learning process.” In the context of statistics, bias is a systematic deviation from the truth, that is, systematic error. From a social and legal perspective, researcher and professor Kate Crawford defines bias as “judgement based on preconceived notions or prejudices, as opposed to the impartial evaluation of facts.”

Fairness is defined as a decision made free of self-interest, prejudice, or favoritism. In reality, it’s nearly impossible for a decision to be perfectly fair. A panel at the Association for Computing Machinery's Conference on Fairness, Accountability, and Transparency in 2018 cataloged 21 different definitions of fairness. If there are so many ways to think about fairness, how can you tell if humans or machines are making fair decisions?

Three people envisioning different equal and fair ways to slice a cake.

To make an informed decision, you first need to understand its impact. A decision that benefits the largest number of people can still exclude a minority, and that exclusion is especially unfair when the minority is one that is routinely overlooked. You need to ask yourself: Are some individuals or groups disproportionately impacted by a decision? Does systemic bias in past decisions or inaccurate data make some groups less likely to receive a fair or impartial assessment? If the answer is yes, you must decide whether, and how, to optimize the decision to protect those individuals, even if doing so doesn’t benefit the majority.
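
To make the question of disproportionate impact concrete, here is a minimal sketch that compares how often an automated decision approves people from different groups. It is a plain-Python illustration only: the decision log, the group labels, and the 0.8 threshold (a common rule of thumb, not a Salesforce standard) are all hypothetical assumptions.

# Minimal sketch: check whether an automated decision approves groups at
# noticeably different rates. The data, group labels, and the 0.8 threshold
# are illustrative assumptions, not a prescribed method.

from collections import defaultdict

# Hypothetical decision log: (group, approved?)
decisions = [
    ("group_a", True), ("group_a", True), ("group_a", False), ("group_a", True),
    ("group_b", True), ("group_b", False), ("group_b", False), ("group_b", False),
]

approved = defaultdict(int)
total = defaultdict(int)
for group, was_approved in decisions:
    total[group] += 1
    approved[group] += int(was_approved)

rates = {g: approved[g] / total[g] for g in total}
print("Approval rate by group:", rates)

# Compare each group's rate to the highest rate. A large gap is a signal to
# investigate the decision, not an automatic verdict of unfairness.
best = max(rates.values())
for group, rate in rates.items():
    if rate < 0.8 * best:
        print(f"{group} is approved at {rate:.0%} vs. {best:.0%} -- review this decision.")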

Is There Such a Thing as "Good" Bias?

Some may argue that not all biases are bad. For example, let’s say a pharmaceutical company manufactures a prostate cancer drug, and it targets only men in its marketing campaigns. The company believes that its marketing campaign targeting is an example of a good bias because it’s providing a service by not bothering female audiences with irrelevant ads. But if the company’s dataset included only cisgender people, or failed to acknowledge additional identities (for example, non-binary, transgender women, transgender men, and agender individuals), then it was likely excluding other people who would benefit from viewing the ad. By incorporating a more complex and accurate understanding of gender and gender identity, the company would be better equipped to reach everyone who could benefit from the drug.

Create an Ethical Culture

Most companies don’t actively set out to offend or harm people. But they can do so unintentionally if they don’t define their core values and put processes in place to ensure that everyone at the company is working in line with them. By defining values, processes, and incentives, leaders can influence the culture at their companies. Leaders can and should teach students and employees how to apply ethics in their work, but if the company culture isn’t healthy, it’s like planting a healthy tree in a blighted orchard. Eventually, even the healthy tree produces bad apples. Leaders must reward ethical behavior while catching and stopping unethical behavior.

It’s important to remember that we are all leaders in this domain. You can make a difference by introducing and maintaining an ethical culture with an end-to-end approach. 

  1. Build diverse teams.
  2. Translate values into processes.
  3. Understand your customers.

Build Diverse Teams

Research shows that diverse teams (across the spectrum of experience, race, gender, and ability) are more creative, diligent, and hardworking. An organization that includes more women at all levels, especially top management, typically has higher profits. To learn more, check out the Resources section at the end of this unit. 

Eleven people with a range of abilities, races, genders, and ages.

Everything we create represents our values, experiences, and biases. For example, facial recognition systems often have more difficulty identifying Black or brown faces than white faces. If the teams creating such technology had been more diverse, they would have been more likely to recognize and address this bias.
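
One practical way a team can surface this kind of gap is to report a model’s accuracy per demographic group instead of as a single overall number. The sketch below is a minimal, hypothetical illustration in plain Python; the records, labels, and group names are made up for the example.

# Minimal sketch: report a classifier's accuracy per demographic group rather
# than as one overall number. All data here is hypothetical.

records = [
    # (group, true_label, predicted_label)
    ("lighter_skin", "match", "match"),
    ("lighter_skin", "no_match", "no_match"),
    ("lighter_skin", "match", "match"),
    ("darker_skin", "match", "no_match"),
    ("darker_skin", "no_match", "no_match"),
    ("darker_skin", "match", "match"),
]

groups = {g for g, _, _ in records}
for group in sorted(groups):
    subset = [(t, p) for g, t, p in records if g == group]
    correct = sum(t == p for t, p in subset)
    print(f"{group}: accuracy {correct / len(subset):.0%} over {len(subset)} samples")

# An overall accuracy of about 83% would hide the fact that one group
# experiences far more errors than the other.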

Development teams should strive toward diversity in every area, from age and race to culture, education, and ability. Lack of diversity can create an echo chamber that results in biased products and feature gaps. If you are unable to hire diverse team members, consider seeking feedback from underrepresented groups across your company and user base.

A sense of community is also part of the ethical groundwork at a company. No one person should be solely responsible for acting ethically or promoting ethics. Instead, the company as a whole should be mindful and conscious of ethics. Employees should feel comfortable challenging the status quo and speaking up; doing so can surface risks to your business. Team members should ask ethical questions specific to their domains, such as:

  • Product managers: What is the business impact of a false positive or false negative in our algorithm? (The sketch below this list puts rough numbers on that question.)
  • Researchers: Who is impacted by our system and how? How can it be abused? How can people try to break the product or use it in unintended ways? What is the social context in which this is used?
  • Designers: What defaults or assumptions am I building into the product? Am I designing this for transparency and equality?
  • Data scientists: What are the implications for users when I optimize my model this way?
  • Content writers: Can I explain why the system made a prediction, recommendation, or decision in terms the user can understand?
  • Engineers: What notifications, processes, checks, or failsafes can we build into the system to mitigate harm?

Employees asking ethical questions specific to their roles.
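
As one worked example of the product manager’s question, the sketch below attaches rough costs to each error type so the trade-off becomes explicit. It is plain Python with entirely hypothetical counts and dollar figures, not real data or a prescribed method.

# Minimal sketch: attach hypothetical business costs to a model's false
# positives and false negatives so the trade-off is explicit. The counts and
# dollar figures below are illustrative assumptions only.

false_positives = 120   # e.g., legitimate customers wrongly flagged
false_negatives = 15    # e.g., risky cases the model missed

cost_per_false_positive = 50      # support time, lost goodwill
cost_per_false_negative = 2_000   # remediation, legal or PR exposure

fp_cost = false_positives * cost_per_false_positive
fn_cost = false_negatives * cost_per_false_negative

print(f"False positive cost: ${fp_cost:,}")
print(f"False negative cost: ${fn_cost:,}")

# The point: tune the model for the error type that actually harms people and
# the business, not just for overall accuracy.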

Notice that these questions involve the perspectives of multiple roles. Involving stakeholders and team members at every stage of the product development lifecycle helps correct for the impact of systemic social inequalities in your system. If you find yourself on a team that’s missing any of these roles, or where you already cover several of them, you may need to wear multiple hats to ensure each perspective is included, and that may mean seeking out external expertise or advice. When employees are dissatisfied with the answers they receive, there should be a clear process for resolving the problem areas, such as a review board. We go into more detail on that later.

Translate Values into Processes

Nearly every organization has a set of values designed to guide its employees’ decision making. There are three ways to put those values into practice.

  1. Incentive structures
  2. Employee support
  3. Documentation and communication

Incentive Structures

Incentive structures reward individuals for specific behaviors or achieving specific goals. Incentive structures should be informed by organizational values. More often than not, employees are rewarded based on sales, customer acquisition, and user engagement. These metrics can sometimes run counter to ethical decision making. 

If an organization wishes to reward behaviors in line with its values, one possible incentive structure is an ethical bounty. Similar to bug bounties, ethical bounties reward employees who identify decisions, processes, or features that run counter to the company’s values or that cause harm to others. A company can also reward sales reps who raise concerns about customer deals, helping it avoid potential legal or public relations risk.

You may also include questions about ethical technology development in your hiring process. This sets the expectation for new employees that the ethical culture you’re building is important to the company and that ethical thinking and behavior is rewarded.

Employee Support 

Responsible organizations provide resources to support employees and empower them to make decisions in line with the company’s values. This can include employee education (Trailhead is a great resource for this) and review boards to resolve difficult issues and ensure employees are following guidelines. 

At Salesforce, we have a data science review board, which provides feedback on the quality and ethical considerations of our AI models, training data, and the applications that use them. 

While building an ethical culture empowers employees to speak up when they have ethical concerns, you may also want to create a clear, anonymous process for employees to submit those concerns. Finally, checklists are great resources for provoking discussion and ensuring that you don’t overlook important concerns. Checklists, although consistent and easy to implement, are rarely exhaustive and must be clear and actionable to be useful. Because they prompt employees to have difficult conversations, checklists help your company build an ethical culture from the ground up.

Checklist on a clipboard to show completion of actions.

Documentation and Communication

Document decision making for transparency and consistency. If a team is at an ethical crossroads, documenting what is decided and why enables future teams to learn from that experience and act consistently rather than arbitrarily. Documentation and communication also give your employees and stakeholders confidence in your process and the resulting decisions.

Understand Your Customers

It should go without saying that you need to understand all of your customers. If you don’t, you could be designing products that ignore a portion of your user base, or that harm some users without your ever realizing it. Ask yourself: Whose needs and values have you assumed rather than consulted? Who is at the greatest risk of harm, and why? Are there bad actors who could intentionally use your product to cause harm, or individuals who might misuse it unknowingly and cause harm by accident? Once you know the answers to these questions, you can work toward solving these problems. We recommend reaching out to a user researcher to learn more about your customers.

Resources
