RISS (Research Information Sharing Service)


      Assessing science understanding : a human constructivist view


      https://www.riss.kr/link?id=M7880990

      • Author(s)
      • Publication

        San Diego : Academic Press, c2000

      • Year of Publication

        2000

      • Language

        English

      • Subject Headings
      • DDC

        507.8 (20th ed.)

      • ISBN

        0124983650 (alk. paper)

      • Material Type

        Book (multi-volume set)

      • Place of Publication

        California

      • Title / Statement of Responsibility

        Assessing science understanding : a human constructivist view / edited by Joel J. Mintzes, James H. Wandersee, Joseph D. Novak.

      • Physical Description

        xxii, 386 p. : ill. ; 24 cm.

      • Series

        Educational psychology series (San Diego, Calif.)

      • General Notes

        Companion volume to Teaching science for understanding, 1998.
        Includes bibliographical references and index.
        Includes bibliographical references and index.

      • Holding Institutions
        • Kyung Hee University Central Library
        • National Library of Korea (postal copy service available)
        • Daegu National University of Education Library
        • Seoul National University Central Library
        • Yonsei University Library
        • Ewha Womans University Library
        • Chonnam National University Central Library
        • KAIST (Korea Advanced Institute of Science and Technology) Library
      Additional Information

      Table of Contents

      • CONTENTS
      • Contributors = xvii
      • Preface = xix
      • 1. LEARNING, TEACHING, AND ASSESSMENT : A HUMAN CONSTRUCTIVIST PERSPECTIVE / Joseph D. Novak ; Joel J. Mintzes ; James H. Wandersee
      • The Role of Assessment = 1
      • The Learner = 2
      • The Teacher = 7
      • Knowledge and Knowledge Creation = 8
      • The Social Milieu = 11
      • A Foreword = 13
      • References = 13
      • 2. ASSESSING SCIENCE UNDERSTANDING THROUGH CONCEPT MAPS / Katherine M. Edmondson
      • Concept Mapping to Portray Shared Meaning and Meaningful Learning = 19
      • Concept Maps as Assessment Tools = 22
      • Cases in Point : Assessing Shared Meaning in Specific Disciplines or Content Domains with a View to the Larger World = 30
      • Summary and Conclusions = 35
      • References = 36
      • 3. ASSESSING SCIENCE UNDERSTANDING : THE EPISTEMOLOGICAL VEE DIAGRAM / Joel J. Mintzes ; Joseph D. Novak
      • Understanding Understanding = 42
      • Sharing Meanings (Intersubjectivity) = 43
      • Resolving Inconsistencies (Coherence) = 44
      • Seeking Simplicity (Parsimony) = 46
      • Thinking Critically (Transparency) = 47
      • Introducing Gowin's V Diagram = 47
      • Focus Question = 49
      • Objects and Events = 50
      • Conceptual (Thinking) Side = 50
      • Methodological (Doing) Side = 52
      • V Diagrams in College Biology = 54
      • Template and Rubric = 54
      • Blaine = 57
      • Melodie = 57
      • Corrie = 60
      • Comments and Reflections = 60
      • V Diagrams in College Biology = 60
      • Improvement of Laboratory Study Guides = 62
      • V Diagrams in Junior High School = 64
      • The Need for Greater Epistemological Understanding = 66
      • References = 68
      • 4. "WHAT DO YOU MEAN BY THAT?" : USING STRUCTURED INTERVIEWS TO ASSESS SCIENCE UNDERSTANDING / Sherry A. Southerland ; Mike U. Smith ; and Catherine L. Cummins
      • The Need for Qualitative Assessment Tools = 72
      • What Is a Structured Interview? = 73
      • Structured Interview Tasks = 73
      • Interviews about Instances = 74
      • Prediction Interviews = 76
      • Sorting Interviews = 79
      • Problem-Solving and Process Interviews = 82
      • Conducting Structured Interviews = 84
      • Practicalities of Preparation = 84
      • Student Comfort = 84
      • Timeframe = 85
      • Prompts, Follow-Up Questions, and Interviewer Comments = 85
      • Pilot Studies = 87
      • Potential for Formal Evaluation = 88
      • How to Make Sense of Interview Responses = 89
      • Issues to Consider in the Use of Structured Interviews = 90
      • Mode Validity = 90
      • Content Specificity = 91
      • Using Structured Interviews to Inform Teaching = 91
      • References = 92
      • 5. DIALOGUE AS DATA : ASSESSING STUDENTS' SCIENTIFIC REASONING WITH INTERACTIVE PROTOCOLS / Kathleen Hogan ; JoEllen Fisherkeller
      • The Knowledge and Reasoning Connection = 96
      • Broadening Conceptualizations of Scientific Reasoning = 97
      • Using Verbal Interactions to Assess Scientific Reasoning = 100
      • Steps for Assessing Reasoning Through Verbal Interactions in Classroom and Interview Contexts = 102
      • Step 1 : Determine a Purpose and Approach = 102
      • Step 2 : Establish and Communicate Standards for Reasoning = 104
      • Step 3 : Choose Interactive Settings for Assessment = 106
      • Step 4 : Choose Tasks for Eliciting and Assessing Reasoning = 110
      • Step 5 : Collect Data = 113
      • Step 6 : Analyze Data = 115
      • Step 7 : Give Feedback or Make Claims = 122
      • Challenges, Caveats, and Conclusions = 123
      • References = 124
      • 6. DESIGNING AN IMAGE-BASED BIOLOGY TEST / James H. Wandersee
      • What Can an Image-Based Test Reveal about Biology Learning? = 131
      • What Are the Testing Implications of Paivio's Dual Coding Theory? = 132
      • What Are Some Salient Principles of Visual Perception and Cognition? = 133
      • How Can the Student's Attention be Focused on Selected Aspects of the Photographic Image? = 135
      • When Should a Color Image Be Used Instead of a Monochrome Image? = 135
      • What Is the Author's Model of Image-Based Biology Test-Item Design? = 136
      • What Are Some Examples of Model-Based Test Items? = 138
      • How Can Image-Based Biology Test Results Be Analyzed and Interpreted? = 138
      • References = 142
      • 7. OBSERVATION RUBRICS IN SCIENCE ASSESSMENT / John E. Trowbridge ; James H. Wandersee
      • Observation in the Naturalist Tradition = 147
      • Agassiz' Biology Teaching Legacy = 147
      • Science Advances Through Observation = 149
      • Notable Scientists Who Use Observations as a Basis for Their Work = 150
      • Pedagogy and the New Standards = 152
      • Theory­Ladenness of Observation = 154
      • Teaching the Power of Direct Observation = 154
      • Complementary Sets of Criteria to Assess the Quality of Observation = 154
      • Making Observations Well: The Norris Criteria = 154
      • Reporting Observations Well: The Norris Criteria = 155
      • Assessing Reports of Observations: The Norris Criteria = 155
      • Levels of Observational Competency: The ESC Criteria = 156
      • Human Vision = 157
      • Filtering of Human Vision = 157
      • Journals = 158
      • Constructing Observation Rubrics = 158
      • Teaching How to Make Good Observations = 161
      • Teaching How to Report Observations Well = 163
      • Teaching How to Assess Observations Well = 163
      • Conclusion = 164
      • References = 165
      • 8. PORTFOLIOS IN SCIENCE ASSESSMENT : A KNOWLEDGE-BASED MODEL FOR CLASSROOM PRACTICE / Michael R. Vitale ; Nancy R. Romance
      • Portfolios and Assessment in Science = 168
      • Portfolios and Science Assessment = 168
      • Portfolios in Science Classrooms and Assessment Methodology = 169
      • Recognizing Other Instructional Uses of Portfolios in Science Teaching = 170
      • Summary of Potential Benefits and Problems with Portfolio Assessment in Science = 171
      • Limitations of the Chapter on the Scope of Portfolio Assessment = 171
      • A Cognitive Science Perspective on Knowledge, Learning, and Assessment = 172
      • Core Concepts and Concept Relationships as a Framework for Science Understanding = 172
      • A Methodological Perspective on Science Knowledge and Understanding = 173
      • Concept Mapping as a Practical Strategy for Knowledge Representation in Science Classrooms = 175
      • Knowledge-Based Architectures for Science Teaching and Assessment = 176
      • A Knowledge-Based Portfolio Assessment Model = 182
      • Additional Perspectives Relating to Knowledge-Based Assessment = 182
      • A Model for Knowledge-Based Portfolio Assessment = 184
      • Some Elaborations of the Basic Model for Knowledge-Based Portfolio Assessment = 187
      • Planning and Implementation Guidelines for Knowledge-Based Portfolio Assessment = 188
      • Examples Illustrating the Knowledge-Based Portfolio Assessment Model = 189
      • Formal and Informal Processes for Developing Knowledge-Based Portfolio Tasks = 189
      • Selected Examples of Knowledge-Based Portfolio Tasks = 190
      • Some Variations on the Knowledge-Based Portfolio Examples = 192
      • Implications of the Knowledge-Based Model for Science Teachers and Researchers = 193
      • References = 194
      • 9. SemNet SOFTWARE AS AN ASSESSMENT TOOL / Kathleen M. Fisher
      • What Is the SemNet Software? = 198
      • What Is Meaningful Conceptual Understanding in Science? = 201
      • How Can Meaningful Conceptual Understanding Be Assessed? = 202
      • Using SemNet as an Assessment Tool = 203
      • Nature of Relations = 204
      • Generative Assessments with SemNet = 204
      • Diagnostic Assessment = 205
      • Embedded Assessments = 206
      • Summative Assessment = 206
      • Assessment of Prior Knowledge = 207
      • Strategies for Evaluating Student­Generated Semantic Networks = 207
      • Overview: About Net = 208
      • Quality of Relations = 209
      • Assessing Content = 211
      • Quality of Concept Descriptions = 211
      • Identifying Main Ideas: Concepts by Embeddedness = 212
      • Hierarchies and Temporal Flows = 213
      • Peer Review of Nets = 213
      • Assessing Contributions of Members of a Group = 213
      • Incorporating SemNet-Based Assessment into Printed Tests = 214
      • Assessing Knowledge about Main Ideas = 214
      • Assessing Knowledge about Details = 215
      • Assessing Knowledge about Relations = 215
      • Assessing Ability to Make Fine Discriminations = 216
      • Summary : A Vision for the Future = 217
      • References = 219
      • 10. WRITING TO INQUIRE : WRITTEN PRODUCTS AS PERFORMANCE MEASURES / Audrey B. Champagne ; Vicky L. Kouba
      • Definitions, Assumptions, and Perspectives = 224
      • Assessment = 225
      • Performance Assessment = 225
      • Opportunity to Learn = 226
      • Performance Expectations = 226
      • Inquiry = 227
      • The Learning Environment = 229
      • Discourse in the Science Classroom = 232
      • Evidence That Learning Has Occurred = 233
      • Writing to Inquire = 234
      • Writing to Develop Community Norms = 236
      • The Teacher = 237
      • Theoretical Perspective = 237
      • Social Constructivism = 237
      • Reflective Thinking = 238
      • Rhetoric = 239
      • Performance Expectations = 239
      • Strategies for Developing Performance Expectations = 240
      • An Example of the Process of Developing Performance Expectations = 241
      • National Science Education Science as Inquiry Standard = 241
      • Interpreting the Standard and Descriptions of the Abilities and Knowledge = 242
      • Elaboration of the Abilities and Knowledge = 244
      • Conclusions = 246
      • References = 246
      • 11. THE RELEVANCE OF MULTIPLE-CHOICE TESTING IN ASSESSING SCIENCE UNDERSTANDING / Philip M. Sadler
      • Background Issues = 251
      • History of Multiple-Choice Tests = 251
      • Standardized Tests = 252
      • Underlying Theories = 254
      • The Origins of Testing for Preconceptions = 255
      • Test Construction = 257
      • Item Construction = 257
      • Examples of Constructivist Tests in Science = 258
      • Psychometric Tools = 259
      • Item Response Theory = 259
      • Analysis of Two Items and the J-Curve = 260
      • Test Characteristics = 264
      • Measuring Conceptual Change = 265
      • Placement along a Continuum = 265
      • Curriculum Maps = 267
      • Cognitive Range = 268
      • Logical Networks = 270
      • Implications = 272
      • References = 274
      • 12. NATIONAL AND INTERNATIONAL ASSESSMENT / Pinchas Tamir
      • Student Assessment = 279
      • Curriculum Reform and Student Assessment = 280
      • A Balanced Student Assessment = 280
      • Types of Assessment = 280
      • National Assessment = 281
      • Scoring = 281
      • The Meaning of Scores = 283
      • Scoring Key : An Example = 284
      • International Assessment = 285
      • A Brief Introduction to TIMSS = 285
      • Conceptual Framework for TIMSS = 286
      • TIMSS Curriculum Framework = 287
      • Who Was Tested? = 288
      • Data Collection = 288
      • Rigorous Quality Control = 288
      • United States = 289
      • United Kingdom = 290
      • France = 291
      • Germany = 291
      • Israel = 292
      • Science Education : International = 293
      • Teachers' Autonomy = 293
      • Freedom Built into the System (Not Specific to Biology) = 293
      • Particular Features of the Biology Program = 294
      • Empowering Teachers = 295
      • Differences between SISS and TIMSS = 296
      • Selected Findings = 298
      • Conclusions = 299
      • References = 300
      • Bibliography = 301
      • 13. ON THE PSYCHOMETRICS OF ASSESSING SCIENCE UNDERSTANDING / Richard J. Shavelson ; Maria Araceli Ruiz-Primo
      • Sampling Framework for Evaluating Alternative Science Achievement Tests = 304
      • Psychometric Approaches to Modeling Science Achievement Scores = 306
      • Classical Reliability Theory = 307
      • Generalizability Theory = 307
      • Item Response Theory = 308
      • A Sketch of Generalizability Theory = 309
      • Relative and Absolute Decisions = 313
      • Measurement Error = 313
      • Estimation of Generalizability and Dependability Coefficients = 314
      • Generalizability and Decision Studies = 315
      • Validity of Proposed Interpretations of Assessment Scores = 316
      • Evaluation of Alternative Assessments : Examples and Summary of Findings = 317
      • Concept Maps = 318
      • Performance Assessment = 330
      • Concluding Comments = 337
      • References = 338
      • 14. CAUTIONARY NOTES ON ASSESSMENT OF UNDERSTANDING SCIENCE CONCEPTS AND NATURE OF SCIENCE / Ronald G. Good
      • Defining Understanding : Expert­Novice Studies = 344
      • Assessing Understanding of Science Concepts = 346
      • Assessment as Problem Solving = 346
      • Assessment as Concept Mapping = 347
      • Assessment as Prediction = 348
      • Assessing Beliefs about Science and Scientists = 349
      • NOS Assessment : Problems and Prospects = 349
      • Reasonable Assessment Expectations for Science Teachers = 351
      • References = 353
      • 15. EPILOGUE : ON WAYS OF ASSESSING SCIENCE UNDERSTANDING / Joseph D. Novak ; Joel J. Mintzes ; James H. Wandersee
      • Assessing Science Understanding : A Summary of Tools, Techniques, and Ideas = 355
      • Learning, Teaching, and Assessment : A Human Constructivist View = 359
      • Assessing Assessment and Valuing Student Work = 366
      • Assessing Assessment = 366
      • Valuing Student Work = 369
      • Windows on the Mind : Concluding Remarks = 370
      • References = 373
      • Index = 375
