Rapple Journal 7

Comparisons between independent and dependent variables can be made by checking for correlations. Be wary of spurious correlations: a **spurious correlation** happens by coincidence. Dee gave the example of shoe size and math scores. A person's shoe size correlates with their age, and age in turn affects math scores, so shoe size appears to predict math scores spuriously.

The first consideration when testing for correlations is to identify the correct correlation test. When a **nominal variable** is involved (nominal vs. nominal or nominal vs. ordinal), you can only determine whether there is a relationship or not. You cannot determine whether the relationship is positive or negative, since a nominal value is just a label. This limits the test to **lambda**. Lambda measures association on a scale from 0 (no association) to 1 (total association), which makes it appropriate for nominal variables. To run a lambda test in SPSS: Analyze -- Descriptive Statistics -- Crosstabs -- (place the dependent variable in the row) -- check the Lambda and Chi-square boxes.

For **ordinal variables** (ordinal vs. ordinal, or ordinal vs. continuous), use **gamma** to measure association. Gamma tells you not only whether the variables are associated, but whether that association is positive or negative; it ranges from -1 through 0 to +1.
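Both statistics are simple enough to compute by hand, which helps make sense of the SPSS output. Below is a minimal sketch (the function names and toy tables are my own, not SPSS): lambda is the proportional reduction in error when predicting the dependent (row) variable, and gamma compares concordant and discordant pairs of ordinal values.

```python
import numpy as np

def goodman_kruskal_lambda(table):
    """Lambda for predicting the row (dependent) variable from the column variable.
    `table` is a contingency table: rows = dependent, columns = independent."""
    t = np.asarray(table, dtype=float)
    n = t.sum()
    # Errors guessing the row category with no information: N minus the largest row total.
    e1 = n - t.sum(axis=1).max()
    # Errors when the column category is known: N minus the sum of column-wise maxima.
    e2 = n - t.max(axis=0).sum()
    return (e1 - e2) / e1  # proportional reduction in error

def goodman_kruskal_gamma(x, y):
    """Gamma for two ordinal variables: (concordant - discordant) / (concordant + discordant)."""
    c = d = 0
    for i in range(len(x)):
        for j in range(i + 1, len(x)):
            prod = (x[i] - x[j]) * (y[i] - y[j])
            if prod > 0:
                c += 1  # pair ordered the same way on both variables
            elif prod < 0:
                d += 1  # pair ordered in opposite ways
    return (c - d) / (c + d)

print(goodman_kruskal_lambda([[10, 0], [0, 10]]))  # 1.0 (total association)
print(goodman_kruskal_lambda([[5, 5], [5, 5]]))    # 0.0 (no association)
print(goodman_kruskal_gamma([1, 2, 3], [1, 2, 3])) # 1.0 (perfect positive)
print(goodman_kruskal_gamma([1, 2, 3], [3, 2, 1])) # -1.0 (perfect negative)
```

Note how gamma, unlike lambda, can come out negative, which is exactly why it suits ordinal data.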
 * **MEASURES OF ASSOCIATION AND CORRELATION:**
 * **Correlation** measures the direction and strength of the relationship between two variables. Which statistic to use depends on the variable types; for two continuous variables, Pearson's r is the statistic to use. **Partial correlation** reflects the relationship between two variables while a third variable is controlled, giving a clearer picture of the relationship between the two. **Multiple regression** is used to see how closely one dependent continuous variable can be predicted by a set of variables; it also shows the contribution each predictor makes to the prediction. **Factor analysis** is used when there are many related variables and you want to tease out the relationships within a set of variables.
 * **Correlation coefficients**: Which coefficient you use is limited by the type of variables you are comparing.
 * **Pearson's r** (Pearson's product-moment correlation) can determine not only whether there is a relationship between variables, but also whether that relationship is positive or negative. It is used for continuous variables only. A positive relationship means both variables change in the same direction; a negative relationship means they change in opposite directions. For me this is easiest to see with the variables themselves: when A goes up and B goes down, the relationship is negative; when A and B go up together, it is positive.
 * **Interpreting correlation coefficients**: A scatterplot is a way to visually check a Pearson's r relationship. Pearson's r is dramatically affected by non-linear relationships and by outliers, so look at the scatterplot and make sure the test's assumptions are met. The shared variance between the variables can be calculated by squaring the correlation coefficient. Look at the current research around your topic to see whether your results are practically significant compared to the literature.
 * **Writing correlation results**: *r*(N) = .xx, *p* > .05 (if .05 is the alpha you are using), or you can state the exact p value. An important note: correlation only tells you that there is a relationship between two variables; it does not say that one causes a change in the other.
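To tie the Pearson's r points above together, here is a small sketch in Python (all data and the df/p values are invented for illustration): compute r with NumPy's `corrcoef`, square it for shared variance, watch a single outlier wreck the coefficient, and format an APA-style result string.

```python
import numpy as np

# Invented data: two continuous variables that rise together.
height = np.array([150, 160, 170, 180, 190])
weight = np.array([55, 62, 70, 78, 88])
r = np.corrcoef(height, weight)[0, 1]   # strongly positive, near +1
shared_variance = r ** 2                # proportion of variance the variables share

# One outlier can dramatically change Pearson's r; a scatterplot would reveal it.
height_out = np.append(height, 230)
weight_out = np.append(weight, 40)
r_outlier = np.corrcoef(height_out, weight_out)[0, 1]  # far weaker than r

# Variables moving in opposite directions give a negative r.
price = np.array([10, 20, 30, 40, 50])
demand = np.array([100, 80, 65, 40, 20])
r_neg = np.corrcoef(price, demand)[0, 1]  # strongly negative, near -1

# APA-style result string (made-up df and p; APA drops the leading zero).
r_val, df, p_val = 0.42, 48, 0.003
report = f"r({df}) = {f'{r_val:.2f}'.lstrip('0')}, p = {f'{p_val:.3f}'.lstrip('0')}"
print(report)  # r(48) = .42, p = .003
```

The outlier demonstration is the practical payoff: the correlation collapses even though five of the six points line up almost perfectly, which is why the scatterplot check matters.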
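Multiple regression, mentioned under the measures of association, can likewise be sketched with plain NumPy least squares. The data below are fabricated for illustration: a "score" built from two predictors plus noise, from which `lstsq` recovers each predictor's contribution.

```python
import numpy as np

# Fabricated example: predict an exam score from hours studied and hours slept.
rng = np.random.default_rng(0)
hours_studied = rng.uniform(0, 10, 50)
hours_slept = rng.uniform(4, 9, 50)
score = 5.0 * hours_studied + 2.0 * hours_slept + rng.normal(0.0, 1.0, 50)

# Design matrix with an intercept column; least squares estimates how much
# each predictor contributes to the prediction of the dependent variable.
X = np.column_stack([np.ones(50), hours_studied, hours_slept])
coef, *_ = np.linalg.lstsq(X, score, rcond=None)
# coef[1] should land near 5 (studying) and coef[2] near 2 (sleep).
```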