RISS Academic Research Information Service


KCI Excellent Registered Journal (KCI우수등재)

Data Augmentation Techniques for Improving Machine Reading Comprehension Performance (기계 독해 성능 개선을 위한 데이터 증강 기법)

      https://www.riss.kr/link?id=A107946895

Additional Information

Multilingual Abstract

Machine reading comprehension is the task of having a computer understand the meaning of a given text and perform inference over it, and it is one of the most essential techniques for natural language understanding. The question answering task provides a way to test the reasoning ability of intelligent systems. The performance of machine reading comprehension has improved significantly with the recent progress of deep neural networks. Nevertheless, improving performance remains challenging when data is sparse. To address this issue, we leverage word-level and sentence-level data augmentation techniques based on text editing, while minimizing changes to existing models and additional cost. In this work, we apply the proposed data augmentation methods to a pre-trained language model widely used in English question answering tasks and confirm improved performance over existing models.
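
The abstract names word-level and sentence-level text-editing augmentation but does not spell out the exact edit operations. As a rough illustration only, the Python sketch below shows what such edits could look like for an extractive QA example: an EDA-style random word deletion, and a shuffle of the sentences that do not contain the answer. The function names, the deletion probability, and the answer-preservation rule are assumptions made for this sketch, not details taken from the paper.

import random

def word_level_augment(context: str, answer: str, p: float = 0.1) -> str:
    # EDA-style random word deletion (illustrative): drop each token with
    # probability p, but keep any token that matches the answer so the
    # extractive label remains recoverable.
    answer_tokens = set(answer.split())
    kept = [tok for tok in context.split()
            if tok.strip(".,") in answer_tokens or random.random() > p]
    return " ".join(kept)

def sentence_level_augment(context: str, answer: str) -> str:
    # Sentence-level edit (illustrative): permute the sentences that do not
    # contain the answer, leaving the answer-bearing sentence(s) in place.
    sents = [s.strip() for s in context.split(".") if s.strip()]
    movable = [i for i, s in enumerate(sents) if answer not in s]
    targets = movable[:]
    random.shuffle(targets)
    out = sents[:]
    for src, dst in zip(movable, targets):
        out[dst] = sents[src]
    return ". ".join(out) + "."

if __name__ == "__main__":
    ctx = ("The Eiffel Tower was completed in 1889. "
           "It stands on the Champ de Mars in Paris. "
           "Gustave Eiffel's company designed and built it.")
    print(word_level_augment(ctx, "1889"))
    print(sentence_level_augment(ctx, "1889"))

A real pipeline would additionally recompute character-level answer offsets after editing; back-translation, cited by the paper's references, is a common sentence-level alternative to simple reordering.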

Citation Information

Journal History

Date         Event                  Details                                                            KCI listing
2021         Evaluation scheduled   Subject to continued-evaluation application (listing maintained)
2016-01-01   Evaluation             Selected as Excellent Registered Journal (continued evaluation)
2015-01-01   Evaluation             Registered journal status maintained (listing maintained)         KCI Registered
2002-01-01   Evaluation             Journal consolidation (listing maintained)                        KCI Registered

Journal Citation Information (base year: 2016)

WOS-KCI Combined IF (2-year):   0.19
KCI IF (2-year):                0.19
KCI IF (3-year):                0.19
KCI IF (4-year):                0.2
KCI IF (5-year):                0.18
Centrality Index (3-year):      0.373
Immediacy Index:                0.07
