Christian Martens, Jochen Schuttler, Axel Graser. KAIST Human-Friendly Welfare Robot System Research Center. 2003. International Journal of Assistive Robotics and Me Vol.4 No.2
Service robots, and especially rehabilitation robots, should support disabled persons in daily life situations as well as in the working environment. Currently available systems offer support on a relatively low task level. This puts a high cognitive load on their users, so that controlling these systems turns out to be very tiresome. In order to eliminate this disadvantage, system control on a higher task level becomes a necessary prerequisite. For this purpose, detailed information about the task to be solved, i.e. task-knowledge, has to be supplied. This kind of information enables the system to reason about its internal state as well as about its environmental state in a (semi-)autonomous manner. This paper presents an approach for the logical verification of the input of such task-knowledge. The task-knowledge is represented on the basis of AND/OR-net structures. These net structures can be created easily by non-experts in service robotics (application programmers, e.g. care personnel). The logical verification during knowledge input on a semantic level becomes possible due to the enhanced a priori knowledge about the net elements (i.e. objects and operators) and their relationships.
Rehabilitation Robots: Transfer of Development and Research Results to Disabled Users
Axel Graser, Christian Martens. KAIST Human-Friendly Welfare Robot System Research Center. 2002. International Journal of Assistive Robotics and Me Vol.3 No.1
The development of rehabilitation robots is driven by researchers who work in the area of autonomous robot systems and is also urgently requested by disabled users. Real success in development can be attained only if our results lead to a real benefit for the disabled. This is a very demanding goal, and we have to consider not only research topics but also the kind of handicap and the economic situation, which varies widely across different countries. In this paper we discuss some general aspects of development and research and the approach the IAT has chosen to transfer the results to the users.
Hamacher, Michael; Stephan, Christian; Eisenacher, Martin; Lewczuk, Piotr; Wiltfang, Jens; Martens, Lennart; Vizcaíno, Juan Antonio; Kwon, Kyung-Hoon; Yoo, Jong Shin; Park, Young Mok; Beckers, Johannes; H. WILEY-VCH Verlag. 2007. Proteomics Vol.7 No.15
<P>The Wellcome Trust Conference Centre at Hinxton, UK, was the meeting place of the 7<SUP>th</SUP> HUPO Brain Proteome Project Workshop, entitled “High Performance Proteomics”. It started on Wednesday, March 7, 2007 with a steering committee meeting, followed by a two-day series of talks dealing with the standardization and handling of tissues and body fluids, as well as of proteomics data. The presentations and accompanying vivid discussions created a picture of current strategies and standards in proteomics.</P>
Kim, Young Hye; Marcus, Katrin; Grinberg, Lea Tenenholz; Goehler, Heike; Wiltfang, Jens; Stephan, Christian; Eisenacher, Martin; Hardt, Tanja; Martens, Lennart; Dunn, Michael J.; Park, Young Mok; Meyer, Helmut E. JOHN WILEY & SONS, LTD. 2009. PROTEOMICS CLINICAL APPLICATIONS Vol.3 No.9
<P>The HUPO Brain Proteome Project (HUPO BPP) held its 11th workshop in Kolymbari on March 3, 2009. The principal aim of this project is to obtain a better understanding of neurological diseases and ageing, with the ultimate objective of discovering prognostic and diagnostic biomarkers, in addition to developing novel diagnostic techniques and new medications. The attendees came together to discuss sub-project progress in the clinical neuroproteomics of human or mouse models of Alzheimer's and Parkinson's disease, and to define the needs and guidelines required for more advanced proteomics approaches. With the election of new steering committees, the members of the HUPO BPP drew up a concrete plan promoting the activities, outcomes, and future directions of the HUPO BPP in order to acquire new funding and new participants.</P>