RISS Academic Research Information Service

      NETLA를 이용한 이진 신경회로망의 기하학적인 학습방법 = The Geometrical Learning Method of Binary Neural Network using NETLA

      https://www.riss.kr/link?id=A3011025

      Additional Information

      Multilingual Abstract

      This paper describes an optimal synthesis method for a binary neural network (BNN) applied to the approximation problem of a circular region, using a newly proposed learning algorithm [7]. Our objective is to minimize the number of connections and hidden-layer neurons by using the Newly Expanded and Truncated Learning Algorithm (NETLA) for the multilayer BNN. The synthesis method in NETLA is based on the extension principle of Expanded and Truncated Learning (ETL) and on the Expanded Sum of Products (ESP), one of the Boolean expression techniques. It can optimize a given BNN in the binary space without the iterative training required by the conventional Error Back Propagation (EBP) algorithm [6]. Given only the complete set of true and false patterns, the connection weights and threshold values can be determined immediately by the optimal synthesis method of NETLA, without any tedious learning. Furthermore, the number of neurons required in the hidden layer can be reduced and fast learning of the BNN can be realized. The superiority of NETLA over other algorithms was demonstrated on the approximation problem of a single circular region.
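
      The abstract describes a multilayer BNN built from hard-threshold units whose connection weights and threshold values are fixed directly from the given true/false patterns rather than learned iteratively. As a point of reference only, the sketch below shows a generic two-layer binary threshold network that accepts points of a region by intersecting half-planes; the weights, thresholds, and helper names are hand-picked assumptions for illustration, not the synthesis produced by NETLA (which the abstract does not specify).

      ```python
      # Illustrative sketch, not the paper's NETLA synthesis: a two-layer binary
      # neural network with hard-threshold neurons. Hidden units implement
      # half-plane tests; the output unit fires only when every hidden unit fires,
      # giving a coarse (square) approximation of a circular region around the origin.
      import numpy as np

      def threshold_unit(x, w, t):
          """Binary neuron: output 1 if the weighted sum w.x reaches threshold t."""
          return 1 if np.dot(w, x) >= t else 0

      def bnn_forward(x, hidden_w, hidden_t, out_w, out_t):
          """Forward pass of a two-layer BNN with fixed weights and thresholds."""
          h = np.array([threshold_unit(x, w, t) for w, t in zip(hidden_w, hidden_t)])
          return threshold_unit(h, out_w, out_t)

      # Hand-chosen (hypothetical) parameters: four half-plane units enclosing the
      # region, AND-ed together by the output unit.
      hidden_w = [np.array([-1.0,  0.0]),   # fires when x1 <= 1
                  np.array([ 1.0,  0.0]),   # fires when x1 >= -1
                  np.array([ 0.0, -1.0]),   # fires when x2 <= 1
                  np.array([ 0.0,  1.0])]   # fires when x2 >= -1
      hidden_t = [-1.0, -1.0, -1.0, -1.0]
      out_w, out_t = np.ones(4), 4.0        # logical AND of the four hidden units

      for p in [(0.0, 0.0), (0.5, 0.5), (1.5, 0.0)]:
          print(p, "->", bnn_forward(np.array(p), hidden_w, hidden_t, out_w, out_t))
      ```

      In the abstract's terms, the "optimal synthesis" would determine such weights and thresholds directly from the true and false patterns, instead of hand-picking them as above or training them iteratively as in EBP.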

      Table of Contents

      • Ⅰ. Introduction
      • Ⅱ. Main Body
      • 1. Structure of the Binary Neural Network
      • 2. Determination of Weights and Thresholds in the Hidden Layer
      • 3. Extension Principle of the ETL Algorithm
      • 4. Experimental Results
      • Ⅲ. Conclusion
