
The AI Trust Gap: A Cultural Perspective

By Dan Everett

Only 52% of employees are confident their organization will ensure AI is implemented in a responsible and trustworthy way, according to Workday’s Closing the AI Trust Gap report. Trust will be key to getting employees engaged in the change needed to realize AI’s full potential. This post will look at what can be done from a cultural perspective.

What Is Company Culture?

Company culture is an expression of the beliefs that guide and shape an organization's purpose, the operational structures that enable it to execute and achieve its goals, the communication mechanisms that facilitate collaboration between individuals and teams, and the customs that act as guardrails for appropriate behavior.

Let’s look at some ways to use these domains of cultural influence to close the AI trust gap. 

Beliefs: Envisioning a Purpose to Close the AI Trust Gap

  • Vision: Craft a narrative of the future centered on a synergistic partnership between AI and humans – one where the development and use of AI align with human values and uphold the highest ethical standards.
  • Mission: Empower employees with AI-enriched experiences that make work more fulfilling and creative, enabled by the seamless integration of AI technologies into operational processes and activities.
  • Values: Establish transparency and accountability as the foundation of the organization’s AI practices – where these values act as a moral compass, directing decision-making processes and reinforcing responsible behavior in the development and use of AI.

Organization: Creating an Operating Model to Close the AI Trust Gap

  • Roles: Introduce new roles within the organization, such as AI liaisons who provide guidance on AI practices. These liaisons serve as translators, facilitating a better understanding of AI within and across departments.
  • Responsibilities: Clearly define accountability for the responsible development and use of AI. Ensure that every individual in the organization understands their role and responsibilities regarding AI practices.
  • Processes: Establish how AI will be utilized to automate tasks and make decisions within the organization. Incorporate mechanisms for appropriate human review and intervention to minimize the risk of unintended consequences.

Communication: Facilitating Dialogue to Close the AI Trust Gap

  • Terminology: Avoid technical terms and use everyday language when discussing AI initiatives. The goal is to ensure information about AI is easily understandable for everyone within the organization.
  • Format: Cater to diverse learning styles by incorporating a variety of communication mediums like newsletters, videos, and brown bag lunches. Using different formats not only keeps AI communications fresh but also boosts engagement and retention.
  • Channels: Establish avenues for both top-down distribution of AI information and bottom-up reporting of concerns and issues. And give employees the means to independently form peer-to-peer discussion communities. 

Customs: Enforcing Standards to Close the AI Trust Gap

  • Policies: Setting foundational guidelines for AI fairness, transparency, and explainability is not enough to bridge the AI trust gap. Employees are seeking policies that align with the company’s vision and mission and steer AI to benefit everyone collectively.
  • Procedures: Monitoring mechanisms, risk and accountability frameworks, feedback loops, and escalation processes are also not enough to bridge the AI trust gap. Comprehensive training on the procedures is critical to creating trust in the policies and ensuring policies are enforced.
  • Ethics: Go beyond legal compliance in your policies and procedures, addressing broader AI ethical concerns such as diversity, inclusion, and human rights. Demonstrating a commitment to ensuring AI benefits society is crucial for closing the AI trust gap.

Closing the AI trust gap requires a comprehensive strategy for addressing employees’ concerns, one focused on reshaping the company’s beliefs, organizational structures, communication methods, and customs. This reshaping should demonstrate a commitment to using AI not merely as a tool for enhancing worker productivity but as a means of unlocking human potential for the benefit of all.

The suggestions presented here are not all-encompassing; rather, they are intended to stimulate thinking on how to shape corporate culture to bridge the AI trust gap. I welcome your perspectives and insights on this subject.