Bensenberg, Michelle Lyn. The University of Texas at Austin, 2000. Overseas doctoral dissertation (DDOD).
The choices front-line staff make when they adapt programs to local circumstances affect program operations in important ways. Program evaluators stress the importance of gathering information about the implementation process for several reasons: to improve the fidelity of implementation, to help interpret program outcomes, and to explain variations between sites. However, the evaluation literature does not indicate whether the timing of implementation data collection makes a difference. This exploratory study included case studies of the implementation of two major welfare reform programs in Texas: Time Limits and the Personal Responsibility Agreement (PRA). It included three groups of Texas Department of Human Services eligibility workers and supervisors. All three groups implemented the PRA in May 1996. Group 1 implemented Time Limits three months later. Groups 2 and 3 did not implement Time Limits during the study period. Groups 1 and 2 were from relatively urban offices, and Group 3 was from rural offices. Data about the implementation of the programs were collected from randomly sampled staff in each group twice a month from July 1996 through November 1996. Sampled staff received an e-mail questionnaire that included four open-ended questions about the implementation process and a question about the degree to which the programs were integrated into their work routine. The average number of comments per respondent, including the type of comment (staff-related or client-related), is presented for each group by data collection period. Differences over time are explored. Study results are interpreted in the context of each program and in terms of models of the implementation process. This study provides the first empirical evidence that data gathered by program evaluators about the implementation process may vary depending on when the data are collected.
The study indicates that program characteristics, as well as local office and staff characteristics, may influence responses to the questions. Unsolicited comments from front-line staff indicated that the first few weeks of a new program may be too soon to ask some questions about its implementation. More research is needed to understand how the dynamics of the implementation process affect data gathered about program implementation.