RISS (Academic Research Information Service)

      • Learning when to ask: The quantity and type of implementation data as a function of when the data are collected in a program evaluation

Bensenberg, Michelle Lyn. The University of Texas at Austin, 2000. Overseas doctoral dissertation (DDOD).


The choices front-line staff make when they adapt programs for local circumstances affect program operations in important ways. Program evaluators stress the importance of gathering information about the implementation process for several reasons, including to improve the accuracy of the implementation, to help interpret program outcomes, and to explain variations between sites. However, the evaluation literature does not indicate whether it makes a difference when evaluators collect implementation data.

This exploratory study included case studies of the implementation of two major welfare reform programs in Texas: Time Limits and the Personal Responsibility Agreement (PRA). It included three groups of Texas Department of Human Services eligibility workers and supervisors. All three groups implemented the PRA in May 1996. Group 1 implemented Time Limits three months later. Groups 2 and 3 did not implement Time Limits during the study period. Groups 1 and 2 were from relatively urban offices, and Group 3 was from rural offices. Data about the implementation of the programs were collected from randomly sampled staff in each group twice a month from July 1996 through November 1996. Sampled staff received an e-mail questionnaire that included four open-ended questions about the implementation process and a question about the degree to which the programs were integrated into their work routine.

The average number of comments per respondent, including the type of comment (staff-related or client-related), is presented for each group by data collection period. Differences over time are explored. Study results are interpreted in the context of each program, and in terms of models of the implementation process. This study provides the first empirical evidence that data gathered by program evaluators about the implementation process may change due to the time of data collection. The study indicates that program characteristics and local office and staff characteristics may influence responses to the questions. Unsolicited comments from front-line staff indicated that the first few weeks of a new program may be too soon to ask some questions about the program implementation. More research is needed to understand how the dynamics of the implementation process affect data gathered about program implementation.
