Moon, Yuseok; Cho, Seok Goo; Lee, Jong Wook; Min, Woo Sung; Kim, Chun Choo; Lee, Hae-Kyung; Kim, Yong-Goo; Chang, Hong Seok; Chae, Hyun Seok; Kim, Ho-Youn; Tani, Kenzaburo. Mary Ann Liebert, 2006. Cancer Biotherapy & Radiopharmaceuticals Vol.21 No.3
<P>Radiation therapy and chemotherapy have little effect on renal-cell carcinomas (RENCAs). We investigated the effect of a tumor vaccination strategy on preventing tumor formation after a challenge with RENCA. The hepatitis B surface antigen (HBsAg) was used to enhance antitumor immunity and tumor vaccination efficiency. RENCA cells expressing HBsAg (RENCA/HBS) were completely susceptible to HBsAg vaccination, which implies that HBsAg vaccination induces specific antitumor immunity against HBsAg-expressing cancer cells. As with HBsAg vaccination, vaccination with irradiated RENCA/HBS retarded tumor formation following a RENCA/HBS challenge. After HBsAg vaccination, the irradiated RENCA/HBS tumor vaccine completely prevented tumor formation by RENCA/HBS. Tumor vaccination with irradiated RENCA/HBS (5 x 10(4) cells), but not with RENCA, reduced the tumor rate after a challenge with 5 x 10(6) RENCA cells, whereas a lower tumor load was overcome by the RENCA vaccination alone. These results support the postulate that RENCA/HBS vaccination elicits an antitumor immune response to some putative antigens or enhances general immune competence in immunosuppressed renal tumor patients.</P>
Youngwan Lee; Yuseok Bae; Yong-Ju Lee. The Institute of Electronics and Information Engineers (IEIE), 2023. IEIE Conference Proceedings Vol.2023 No.6
Recently, self-supervised learning (SSL) without supervision (i.e., human annotation) has gained huge popularity in the computer vision domain due to its superior performance over deep supervised learning on downstream tasks such as object detection and segmentation. Thus, models pretrained by self-supervised learning are increasingly preferred over those pretrained by supervised learning. Specifically, as masked language modeling (MLM) in natural language processing (NLP) yields powerful representations in the pre-training phase, masked image modeling (MIM) has emerged as an SSL approach for the computer vision domain. In this context, we review recent self-supervised learning algorithms released since 2020 in terms of pretext tasks, architecture, and backbone networks, and provide several discussion points.
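The masked image modeling pretext task mentioned above centers on hiding random image patches and training a model to reconstruct them. A minimal sketch of the patch-masking step follows; this is a generic illustration, not any specific algorithm from the review, and the 4x4 patch size and 75% mask ratio are illustrative assumptions:

```python
import numpy as np

def mask_patches(image, patch_size=4, mask_ratio=0.75, seed=0):
    """Randomly zero out a fraction of non-overlapping patches,
    as in masked image modeling (MIM) pretext tasks.

    Returns the masked image and a boolean per-patch mask
    (True = patch was masked)."""
    rng = np.random.default_rng(seed)
    h, w = image.shape[:2]
    ph, pw = h // patch_size, w // patch_size  # patch grid dimensions
    n_patches = ph * pw
    n_masked = int(n_patches * mask_ratio)
    # Choose which patches to hide, without replacement
    idx = rng.choice(n_patches, size=n_masked, replace=False)
    mask = np.zeros(n_patches, dtype=bool)
    mask[idx] = True
    masked = image.copy()
    for i in np.flatnonzero(mask):
        r, c = divmod(i, pw)
        masked[r * patch_size:(r + 1) * patch_size,
               c * patch_size:(c + 1) * patch_size] = 0
    return masked, mask.reshape(ph, pw)

# A 16x16 image of ones has a 4x4 grid of patches; 12 of 16 are masked.
img = np.ones((16, 16))
masked_img, mask = mask_patches(img)
```

During pre-training the encoder would see only the visible patches (or the zeroed image) and a decoder would be trained to reconstruct the hidden pixels or tokens.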