LAW & TECHNOLOGY Vol. 18, No. 1 (Serial No. 97), pp. 37-57

Risk of Automated Decision-Making by Artificial Intelligence - Focusing on Hiring People with Disabilities -
Keywords:
algorithm, automated decision-making, disabled, disability, employment, recruitment, direct discrimination, indirect discrimination


I. Introduction - Background of the Discussion
II. Overview of Artificial Intelligence
   1. Definition and Overview of Artificial Intelligence
   2. Automated Decision-Making
III. Fair Hiring Procedures for People with Disabilities in the Classical Sense
   1. Definition of Hiring Discrimination
   2. Statutes Relevant to Hiring Discrimination
   3. Types of Discrimination Against People with Disabilities
   4. Cases of Discrimination Against People with Disabilities in Hiring
IV. Uses and Limitations of Automated Decision-Making in Hiring
   1. Use Cases of Automated Decision-Making in Hiring
   2. Limitations of Hiring Through Artificial Intelligence
V. Legislative Regulation of Artificial Intelligence
   1. Foreign Cases of Regulating Automated Decision-Making
   2. Legislative Trends and Discussion Points in Korea
VI. Conclusion


With the rapid development of artificial intelligence, the use of artificial intelligence in the field of employment, and in recruitment in particular, is steadily increasing. Artificial intelligence screens applicants' job applications and analyzes interview videos to assess emotional intelligence, communication skills, cognitive ability, and problem-solving ability. However, such automated decisions made by artificial intelligence carry a risk of discrimination when evaluating human candidates. In Korea, hiring discrimination on the basis of gender, race, or disability is prohibited by various laws, including the Constitution, the Labor Standards Act, the Framework Act on Employment Policy, and the Employment Security Act. In particular, the Act on the Prohibition of Discrimination Against Persons with Disabilities and Remedy Against Infringement of Their Rights and the Act on the Employment Promotion and Vocational Rehabilitation of Persons with Disabilities require special protection for people with disabilities. Discrimination against persons with disabilities can be divided into direct discrimination, in which persons with disabilities are treated unfavorably without justifiable grounds, and indirect discrimination, in which applying standards that fail to take disability into account produces disadvantageous consequences for persons with disabilities even though they are not formally treated unfavorably. When hiring relies on automated decision-making, discrimination against persons with disabilities may arise in three forms: (i) intentional discrimination, in which features associated with disability are deliberately configured so that applicants with disabilities are excluded from recruitment; (ii) automated decisions not to hire persons with disabilities derived from existing data in which prejudice against them is already reflected; and (iii) distorted decisions caused by a lack of representativeness, where the existing data contains little or no information on persons with disabilities.
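The second mechanism above, an automated decision that reproduces prejudice embedded in historical hiring data, can be illustrated with a minimal, entirely hypothetical sketch (the data and "model" below are not drawn from the article; a naive rule that learns past hire rates per group simply inherits the bias of the records it is trained on):

```python
# Hypothetical sketch: a screening rule "learned" from past hiring records
# in which applicants with disabilities were systematically rejected.
# The rule reproduces that prejudice in new, automated decisions.

from collections import defaultdict

# Fabricated historical records: (has_disability, was_hired)
history = [
    (False, True), (False, True), (False, False), (False, True),
    (True, False), (True, False), (True, False), (True, True),
]

def learn_hire_rates(records):
    """Estimate the historical hire rate for each group."""
    hired = defaultdict(int)
    total = defaultdict(int)
    for has_disability, was_hired in records:
        total[has_disability] += 1
        hired[has_disability] += int(was_hired)
    return {group: hired[group] / total[group] for group in total}

def automated_decision(has_disability, rates, threshold=0.5):
    """Pass an applicant only if their group's past hire rate clears the threshold."""
    return rates[has_disability] >= threshold

rates = learn_hire_rates(history)
print(rates)                              # {False: 0.75, True: 0.25}
print(automated_decision(False, rates))   # True: non-disabled applicant passes
print(automated_decision(True, rates))    # False: disabled applicant is screened out
```

No disability-related feature is intentionally targeted here; the disadvantage arises solely because the training data already encodes past discrimination, which is precisely why such systems can effect indirect discrimination without any facially unfavorable treatment.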
In Korea, the Credit Information Use and Protection Act has introduced a definition of automated evaluation and protection standards for owners of credit information, but it is limited to the protection of credit information. A recent amendment to the Personal Information Protection Act provides safeguards against decisions made by automated systems, but it has been criticized because its scope of protection is narrower than that of the GDPR. In addition, special protection measures for workers, such as those in the ILO's code of practice on the protection of workers' personal data, are not separately addressed, so legal protection measures for workers with disabilities in hiring processes based on automated decision-making need to be devised.