- Abstract (Korean)
- Abstract
- 1. Introduction
- 2. Related Work
- 3. Document Summarization Model
https://www.riss.kr/link?id=A108011177

- Year: 2022
- Language: Korean
- Keywords: text summarization; pre-training; MASS; copying mechanism; coverage mechanism; length embedding
- Journal grade: KCI Excellent Registered (KCI우수등재)
- Publication type: Academic journal
- Pages: 25-31 (7 pages)
References
1. Rico Sennrich, "Neural Machine Translation of Rare Words with Subword Units", 2016
2. Dzmitry Bahdanau, "Neural Machine Translation by Jointly Learning to Align and Translate", 2015
3. Kaitao Song, "MASS: Masked Sequence to Sequence Pre-training for Language Generation", 2019
4. Abigail See, "Get To The Point: Summarization with Pointer-Generator Networks", 2017
5. Dongheon Jeon, "Copy-Transformer Model Using Copy-Mechanism and Inference Penalty for Document Abstractive Summarization", 301-306, 2019
6. Yuta Kikuchi, "Controlling Output Length in Neural Encoder-Decoders", 2016
7. Jacob Devlin, "BERT: Pre-training of Deep Bidirectional Transformers for Language Understanding", 2019
8. Ashish Vaswani, "Attention Is All You Need", 2017
Journal History

Date | Event | Detail
---|---|---
2021 | Evaluation scheduled | Subject to continued-evaluation application (registration maintained)
2016-01-01 | Evaluation | Selected as KCI excellent registered journal (continued evaluation)
2015-01-01 | Evaluation | Registered journal maintained
2002-01-01 | Evaluation | Journal merger (registration maintained)
Journal Citation Metrics (base year: 2016)

Metric | Value
---|---
WOS-KCI combined IF (2-year) | 0.19
KCI IF (2-year) | 0.19
KCI IF (3-year) | 0.19
KCI IF (4-year) | 0.20
KCI IF (5-year) | 0.18
Centrality index (3-year) | 0.373
Immediacy index | 0.07