| Value(s) | Describe the risk | Possible consequences | Priority |
| --- | --- | --- | --- |
| Transparency | The robot's output is not presented in a way that elderly users or their caregivers can easily understand. | Users and caregivers may feel confused or frustrated, leading to dissatisfaction; employees may leave because of increased support demands; customers may switch to competitors; unclear explanations of outputs expose the company to discrimination lawsuits; errors are difficult to identify and correct. | High |
| Trust, transparency | Categorizing output based on probability scores is subjective, making it hard for users to understand or correct issues (see the first sketch following this table). | Loss of trust from families and caregivers, leading to customer attrition; greater difficulty for customer service representatives (CSRs), who cannot adequately explain the system's decisions; potential legal disputes over perceived discrimination. | High |
| Fairness | Input data may be biased toward caregivers willing to travel to suburbs or other specific areas. | Perceived unfairness among caregivers, damaging the company's reputation; potential discrimination against less affluent caregivers who may lack access to transportation. | Med-High |
| Fairness | Use of demographic data such as location, gender, race, and age could result in discriminatory practices (see the second sketch following this table). | Significant legal risk; demographic biases would need to be eliminated from the system. | Very High |
| Fairness | Different stakeholders, including families, caregivers, and data scientists, have differing definitions of fairness. | Internal conflict and confusion, negatively affecting company morale and operations. | Low-Med |
| Fairness | CSRs' personal biases in matching caregivers with families could influence the system's recommendations over time. | Risk of biased outcomes, causing caregiver dissatisfaction, harming the company's reputation, and inviting legal action. | High |
| Fairness | The system's reliance on English-language bios might exclude non-English-speaking caregivers. | Non-English-speaking caregivers and families might feel discriminated against, limiting the company's ability to serve diverse communities effectively. | Med |
| Privacy | Caregivers can opt out of sharing their data. | Caregivers who opt out may be disadvantaged compared to those who do share data. | Med-High |
| Autonomy and job security | Increased automation may threaten CSRs' job security without proper planning and transition strategies. | Lower employee morale; employees proactively looking for other jobs; loss of trust and dissatisfaction among customers and caregivers who value their relationships with CSRs. | Med |
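
To make the transparency and trust rows more concrete, here is a minimal sketch, assuming a simple threshold-based bucketing scheme, of how a raw match probability might be collapsed into the category a family or CSR actually sees. The labels, thresholds, and identifiers below are illustrative assumptions, not the real system; the point is that the cut-offs are a subjective choice, so two caregivers separated by a hundredth of a point can land in different categories with no rationale a CSR can pass on to users.

```python
# Hypothetical sketch of threshold-based bucketing: a raw match probability is
# collapsed into a label, and the cut-offs themselves are a subjective choice
# that is hard to explain. Labels, thresholds, and IDs are assumptions.

from typing import NamedTuple


class MatchResult(NamedTuple):
    caregiver_id: str
    probability: float  # model's estimated match probability, 0.0-1.0
    label: str          # category shown to families and CSRs


# Subjective cut-offs: moving 0.70 to 0.65 silently changes which caregivers
# are presented as strong matches, with no visible rationale for users.
BUCKETS = [
    (0.85, "Strong match"),
    (0.70, "Possible match"),
    (0.00, "Weak match"),
]


def bucket(caregiver_id: str, probability: float) -> MatchResult:
    """Map a raw probability to the first bucket whose threshold it meets."""
    for threshold, label in BUCKETS:
        if probability >= threshold:
            return MatchResult(caregiver_id, probability, label)
    return MatchResult(caregiver_id, probability, "Weak match")


if __name__ == "__main__":
    # Two caregivers separated by 0.01 end up in different categories,
    # which is exactly the kind of outcome a CSR cannot easily justify.
    print(bucket("cg-101", 0.71))  # Possible match
    print(bucket("cg-102", 0.69))  # Weak match
```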
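The fairness rows point toward two mitigations: keeping demographic fields out of the model's inputs and auditing recommendation rates across groups. The sketch below is a minimal illustration under assumed column names (`gender`, `location`, `recommended`, and so on) using a pandas DataFrame of past match decisions; it is not the company's actual pipeline, and dropping columns alone is not sufficient, since proxies such as location can reintroduce the same bias.

```python
# Minimal sketch, assuming a pandas DataFrame of past match decisions, of two
# mitigations implied by the "Fairness" rows: excluding demographic columns
# from the model's inputs, and auditing recommendation rates across groups.
# Column names are illustrative assumptions, not the actual schema.

import pandas as pd

SENSITIVE_COLUMNS = ["location", "gender", "race", "age"]


def drop_sensitive_features(features: pd.DataFrame) -> pd.DataFrame:
    """Remove demographic columns before they reach the matching model."""
    return features.drop(columns=[c for c in SENSITIVE_COLUMNS if c in features.columns])


def recommendation_rate_by_group(decisions: pd.DataFrame, group_col: str) -> pd.Series:
    """Share of caregivers recommended, per demographic group.

    Large gaps between groups signal the discriminatory outcomes the table
    flags, even if the model never saw the demographic column directly.
    """
    return decisions.groupby(group_col)["recommended"].mean()


if __name__ == "__main__":
    features = pd.DataFrame(
        {"years_experience": [3, 7], "gender": ["F", "M"], "location": ["suburb", "city"]}
    )
    print(drop_sensitive_features(features).columns.tolist())  # ['years_experience']

    decisions = pd.DataFrame(
        {"gender": ["F", "F", "M", "M"], "recommended": [1, 0, 1, 1]}
    )
    print(recommendation_rate_by_group(decisions, "gender"))  # F: 0.5, M: 1.0
```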