
Tuesday, November 10, 2015

Week 5 notes

Rules for Establishing an Argument for Causality
To establish a causal argument—that the intervention caused the change—you need to satisfy the following conditions. Consider that A = Program or Policy and B = Outcome.
  • Temporal Ordering: As measured, does the cause precede the effect in time? The research design can ensure that A occurs before B. For example, you can measure behavior before and after exposure to a behavioral modification program. This way, you can see changes in behavior after exposure to the program. 
  • Covariation: Is the cause statistically associated with the effect? If A causes B, then changes in A will result in changes in B. Does the behavior increase or decrease according to the dosage level of the behavioral treatment program? 
  • Ruling Out Spuriousness: Are there confounding effects that may explain the relationship between the cause and effect? Are there factors, such as C, D, E, and so on, that may explain changes in B? In other words, is something else causing the behavior to change? The behavioral modification program may not have changed the behavior; the threat of harsh legal sanctions may have influenced it instead.
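The covariation condition above can be checked quantitatively with a correlation coefficient. The following is a minimal sketch using hypothetical data: the dosage level of a behavioral treatment program and the resulting behavior change are simulated values, not data from the source.

```python
import random
import statistics

def pearson_r(xs, ys):
    """Pearson correlation coefficient between two equal-length samples."""
    mx, my = statistics.mean(xs), statistics.mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

random.seed(42)
# Hypothetical data: dosage level of a treatment program and the
# change in measured behavior after exposure (simulated, with noise).
dosage = [random.uniform(0, 10) for _ in range(100)]
behavior_change = [0.8 * d + random.gauss(0, 1) for d in dosage]

r = pearson_r(dosage, behavior_change)
print(f"covariation (Pearson r) = {r:.2f}")
```

A strong r shows covariation only; it does not by itself establish temporal ordering or rule out spuriousness.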

Week 5: Week 5 - Developing Outcome Measures (1 of 2)

(Correlation does not equal causation; underlying factors may be at work.)

Confounds are factors, other than the treatment or intervention, that are responsible for the observed outcome changes. In other words, these are factors, other than the program or policy, that caused the change in the outcome.

Two techniques to minimize confounds are:
  • Random Assignment Technique or Use of a Control Group: In the random assignment technique, the planner randomly selects and assigns clients to a treatment group. These individuals experience the program or policy services or activities. The planner also assigns specific stakeholders as a control group. These individuals are not exposed to the program or policy services or activities. This way, the planner compares those who receive the intervention with those who do not receive the intervention and gauges their differences on the outcome measure. 
  • Nonequivalent Comparison Approach: The nonequivalent comparison approach is conducted when randomized assignment is not possible. The planner creates a nonequivalent comparison group to reflect members who are already in the treatment group, based on important characteristics such as age, sex, race, and social class. This technique also allows the planner to chart the differences in outcomes between those who experience the intervention and those who do not, for average client characteristics. 
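The random assignment technique above can be sketched in a few lines. This is a simplified illustration with made-up client identifiers; a real evaluation would also track outcome measures for each group.

```python
import random

def random_assign(clients, seed=0):
    """Randomly split a client list into a treatment group and a control group."""
    rng = random.Random(seed)          # fixed seed so the split is reproducible
    shuffled = clients[:]              # copy so the original list is untouched
    rng.shuffle(shuffled)
    half = len(shuffled) // 2
    return shuffled[:half], shuffled[half:]

clients = [f"client_{i}" for i in range(10)]
treatment, control = random_assign(clients)
print("treatment:", treatment)
print("control:  ", control)
```

Because every client has the same chance of landing in either group, confounds are (on average) balanced across the two groups, which is what makes the comparison of outcome measures meaningful.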
Remember that the strength of the design depends on whether you have a true control group (via random assignment) or have constructed a semi-control group, often referred to as a nonequivalent comparison group. Without one of these, the methodology is weak, and consequently the results can only be treated as speculative.

Week 5: Week 5 - Selecting a Research Design (1 of 2)

  • Face validity
  • Content validity
  • Criterion-related or construct validity
  • Test-retest reliability
Another desirable quality in a research design is the presence of both a pre-test and a post-test. This allows the researcher to determine whether a change has occurred and, more importantly, what the magnitude of the change is.
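The pre-test/post-test comparison amounts to averaging each subject's change score. The following is a minimal sketch; the scores are invented for illustration (say, counts of an undesired behavior before and after the program).

```python
import statistics

def mean_change(pre, post):
    """Average pre-to-post difference across matched subjects."""
    return statistics.mean(b - a for a, b in zip(pre, post))

# Hypothetical behavior counts for five subjects, before and after exposure.
pre  = [12, 15, 9, 20, 14]
post = [ 8, 11, 9, 15, 10]

print(mean_change(pre, post))  # → -3.4 (behavior dropped by 3.4 on average)
```

The sign tells you the direction of the change and the magnitude tells you how large it is, which a post-test alone cannot provide.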

Week 5: Week 5 - Selecting a Research Design (2 of 2)

A family of techniques has been developed to accommodate situations where random assignment is not possible; these are referred to as quasi-experimental designs and are commonly used to evaluate policies conducted at a macrolevel: cities, counties, states, and nations.
Evaluating Policies

To assess impact, we want to compare actual outcomes to desired outcomes (objectives).
Exactly how specific costs and benefits should be defined, however, is a matter of some controversy, and procedures for conducting efficiency analyses tend to be quite complex.
