This guide provides clear, concise guidance for the researcher dealing with the everyday problems of sampling. Using the Practical Design Approach, the author integrates sampling design into the overall research design and explains the interrelations between research design and sampling choices. Taking the perspective of the researcher faced with sampling decisions, the book lays out the alternatives and the implications of each choice. The treatment is narrative and conceptual throughout; mathematical presentations are limited to the necessary formulas and examples of calculations using them.
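By way of illustration of the kind of formula-plus-worked-example presentation described above, the sketch below (my own, not drawn from the book) applies Cochran's standard sample-size formula for estimating a proportion, with a finite-population correction; the margin of error, confidence level and population size are hypothetical.

```python
import math

def sample_size_proportion(e, N=None, p=0.5, z=1.96):
    """Cochran's sample-size formula for estimating a proportion.

    e : desired margin of error (e.g. 0.05)
    N : population size for the finite-population correction (None = treat as infinite)
    p : anticipated proportion (0.5 is the most conservative choice)
    z : z-score for the confidence level (1.96 for roughly 95%)
    """
    n0 = (z ** 2) * p * (1 - p) / (e ** 2)     # infinite-population estimate
    if N is not None:
        n0 = n0 / (1 + (n0 - 1) / N)           # finite-population correction
    return math.ceil(n0)

# e.g. a +/-5% margin at 95% confidence from a population of 2,000
print(sample_size_proportion(e=0.05, N=2000))  # -> 323
```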
This volume explores the regression (or structural equation) approach to the analysis of time series data, in which the modeller makes an initial specification of a causal structure and then analyzes the data to determine whether there is any empirical support for that specification. The great advantage of time series regression analysis is the possibility of both explaining the past and predicting the future behaviour of variables of interest. Although this volume does not cover Box-Jenkins models for explaining endogenous variables, it does introduce the Box-Jenkins time series method as an alternative approach to modelling the underlying error processes. As such, the book attempts to partially bridge the gap between the two approaches to the analysis of time series data.
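As a concrete illustration of the kind of model described above, the sketch below (assembled for this summary, not taken from the book) estimates a regression with a Box-Jenkins-style AR(1) error process using the statsmodels library; the data and coefficients are simulated.

```python
import numpy as np
from statsmodels.tsa.statespace.sarimax import SARIMAX

rng = np.random.default_rng(0)
n = 200
x = rng.normal(size=n)                      # hypothetical explanatory series
e = np.zeros(n)
for t in range(1, n):                       # AR(1) error process
    e[t] = 0.6 * e[t - 1] + rng.normal(scale=0.5)
y = 2.0 + 1.5 * x + e                       # structural (regression) part

# Regression with AR(1) errors: the regression structure plus a
# Box-Jenkins specification for the disturbances.
model = SARIMAX(y, exog=x, order=(1, 0, 0), trend="c")
result = model.fit(disp=False)
print(result.params)                          # intercept, slope on x, AR coefficient, error variance
print(result.forecast(steps=5, exog=x[-5:]))  # forecasting requires future exog values (placeholders here)
```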
This manual facilitates programme self-evaluation research so that it can be conducted with a minimum of outside technical assistance. Part one discusses three distinct models of evaluation: Input, Process, and Group or Client Level Outcomes. Part two covers the common technical elements of programme evaluation research and presents simple information and instructions on how to perform some specialized research procedures, including choosing samples, selecting research designs, constructing data collection instruments, scheduling data collection, training data collectors and analyzing findings. An annotated guide to practical, hands-on materials on self-evaluation is also provided. Throughout the manual, examples and case illustrations are drawn from a wide range of child abuse prevention programmes. For the evaluator familiar with research design and methods but not expert in child welfare, the manual provides strategies and resources, such as data collection instruments, that are specific to the field. For students of child abuse prevention and programme staff, it provides a guide to the complex process of programme evaluation research.
This volume emphasizes the conceptualization and use of measurement concepts and principles in relation to decisions routinely made in the various phases of direct practice: assessment, planning interventions, implementing interventions, and termination and follow-up. The authors describe measurement concepts and research tools, providing frequent case examples to demonstrate how measurement can facilitate case planning and decision making. More specifically, they show how practitioners can use measurement techniques to help determine client eligibility, to assess client functioning and problems, to determine intervention plans and goals and monitor the extent to which they are implemented, and to estimate the degree of client progress and the extent to which that progress is maintained.
The author expounds the application of this multivariate analysis technique to the social sciences, demonstrating what it can offer the researcher in analyzing particular sets of multidimensional data. He shows how it can be used to determine the number of factors to retain in a factor analysis, to extract the initial factors, to select a subset of variables to represent a much larger set, and to cope with multicollinearity in regression analysis, a persistent problem in behavioural and social science data sets.
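Assuming the technique in question is principal components analysis (the uses listed above are its classic applications), the sketch below, which is illustrative rather than drawn from the book, shows two of them with scikit-learn: inspecting explained variance to decide how many components to retain, and regressing on component scores to sidestep multicollinearity. The data and variable names are invented for the example.

```python
import numpy as np
from sklearn.decomposition import PCA
from sklearn.linear_model import LinearRegression
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(1)
base = rng.normal(size=(300, 3))
# Fourth variable nearly duplicates the first, creating multicollinearity
X = np.column_stack([base, base[:, 0] + 0.05 * rng.normal(size=300)])
y = X @ np.array([1.0, -0.5, 0.3, 0.8]) + rng.normal(size=300)

Z = StandardScaler().fit_transform(X)    # standardize, i.e. work from the correlation matrix
pca = PCA().fit(Z)
print(pca.explained_variance_ratio_)     # variance explained per component guides how many to retain

k = 3                                    # e.g. chosen from a scree plot or an eigenvalue rule
scores = pca.transform(Z)[:, :k]
pcr = LinearRegression().fit(scores, y)  # regression on uncorrelated component scores avoids multicollinearity
print(pcr.score(scores, y))
```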
Research in Health Care Settings provides: an abbreviated review of the step-by-step process of conducting research; a glimpse backstage at the way research is actually done; a discussion of the problems of collaboration; and help in building bridges to the health professional necessarily immersed in the day-to-day problems and emergencies of health care. Applied research is defined as requiring a completely different model from the textbook model traditionally presented. 'Research in the Real World' is shown to require good judgement, flexibility and creativity. The volume is thus essential reading for all social scientists.
The Program Evaluation Kit is a practical guide to planning and conducting programme evaluations. Its nine volumes and more than 1,200 pages contain every technique necessary to evaluate any programme. This edition of the Kit is a major revision of the highly successful and influential First Edition, published in 1978. It reflects the substantial changes in the process of evaluating programmes that have taken place in the last decade. It will be invaluable to novice evaluators in a broad range of professions as well as a compact reference for the more experienced evaluator. Examples from education, management, health and social services are presented, making this edition of the Kit indispensable to evaluators in a multitude of settings.
This provocative volume deals with one of the chief criticisms of ethnographic studies, a criticism that centres on their particularism, or insistence on context: how can these studies be generalized beyond the individual case? Noblit and Hare propose a method, meta-ethnography, for synthesizing qualitative, interpretive studies. They show that ethnographies are themselves interpretive acts, and demonstrate that by translating metaphors and key concepts between ethnographic studies it is possible to develop a broader interpretive synthesis. Using examples from numerous studies, the authors illuminate how meta-ethnography works, isolate several types of meta-ethnographic study and provide a theoretical justification for the method's use.
Research is often seen as a neutral, technical process through which researchers simply reveal or discover knowledge. A broader and more self-reflective stance is advocated in Beyond Method, one in which a knowledge of technique needs to be complemented by an appreciation of the nature of research as a distinctively human process, through which researchers make knowledge. Such an appreciation requires a reframing of understanding and debate about research, in a way that goes beyond considerations of method alone.