  #8  
Old 08-02-2012, 02:34 AM
agzg
GreekChat Member
Quote:
Originally Posted by DrPhil
The Joe Paterno items and the survey length.

As for length, they may need to reconsider their survey design. It is difficult enough getting people to complete surveys no matter how many times you distribute them. Alma mater surveys sometimes include items about department and campus dynamics that can impact pride and involvement. There is no way to know the impact of things without conducting the research. That also applies to the Paterno items on this survey. Still seemed strange when I first read it.
Yeah, that was the point I found weird, too. I thought you were saying it was weird that someone posted it here ("doing your research" v. actually doing research).

Alumni surveys are frustrating. Colleges shouldn't be asking alumni questions about departments beyond career services/alumni relations, although I understand why they included those in this case. But their scales are... well... not usual.

My undergrad asked about Admissions. What the flippty what?

Quote:
Originally Posted by KSig RC
StrategyOne has a fairly good reputation, but this questionnaire is kind of awkward - the choice of scales (particularly the Likert portion), the bins, and the wording of the single-sentence questions, and so on.

It makes me wonder what the design goals were, and who had final sign-off ... it seems geared toward some narrow outcomes.



A 12% response rate isn't really out of the ordinary for phone/email hybrid research - it's low-yield. They did not appear to take any measures to account for 'motivated pollers' in this research, and the 58% male split could well indicate a group with incentive to respond at a higher rate (it could also simply reflect the historically higher number of male students, as well - hard to know without the pool demo). Even if that's 'clean', the proximity to PSU's issues kind of dicks up your ability to claim randomness in the response population (particularly via email) - even in a hypothetical sense, we can imagine who would be most likely to respond: those who are more invested, or those who are most angry, skewing to the sides.
I bet campus leadership was heavily involved in the design. My surveys always get the most frustrating when senior leadership freaks out and gets handsy. Why else would they have multiple categories that were essentially the same thing?

Were it a normal alumni survey, I'd say 12% was fine, but given the proximity of the controversy, it's tricky.
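The self-selection worry above can be made concrete with a toy simulation (all numbers here are invented for illustration, not taken from the actual survey): if the angriest alumni answer at a much higher rate than everyone else, a ~12% response pool will report a mean satisfaction well below the population's true mean, even with a large sample.

```python
import random

random.seed(0)

# Toy population of 10,000 alumni; "satisfaction" on a 1-5 Likert scale,
# drawn uniformly so the true mean is about 3.0. (Hypothetical numbers.)
population = [random.choice([1, 2, 3, 4, 5]) for _ in range(10_000)]
true_mean = sum(population) / len(population)

# Self-selection: assume the angriest (score 1) respond far more often
# than the indifferent middle; the overall yield lands near 12%.
def respond_prob(score):
    return {1: 0.30, 2: 0.10, 3: 0.04, 4: 0.06, 5: 0.12}[score]

sample = [s for s in population if random.random() < respond_prob(s)]
sample_mean = sum(sample) / len(sample)
response_rate = len(sample) / len(population)

print(f"response rate: {response_rate:.1%}")
print(f"true mean {true_mean:.2f} vs. observed mean {sample_mean:.2f}")
```

Under these made-up response probabilities the observed mean drops well under the true mean, which is the point: a plausible-looking 12% response rate says nothing about whether the respondents resemble the population.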

Last edited by agzg; 08-02-2012 at 10:05 AM.