Student Life Program Evaluation
Dean of Students
Why conduct evaluation?
- To see whether what we do is having the intended impact.
- To contribute to our understanding of student learning.
- To support increased funding for our services.
- To understand who is using our services, what they are experiencing, and how they are being affected by our services.
How do I get started?
- Attend conference sessions on evaluating programs in your area.
- Watch for webinars on this topic (and be willing to pay to participate).
- Reach out to peer institutions (especially those you admire) to see how they are evaluating their programs.
- Define the general goals of your area as well as your specific programmatic goals, and review options for assessing those goals.
What tools might I use to evaluate my programs?
- Utilization surveys or tracking tools (who is using the service: ethnicity, gender, first-generation status).
- Satisfaction surveys (did they like what we did?)
- Surveys that measure student learning.
- Pre-packaged instruments that measure the impact of the experience (consider a pre-/post-test design to measure change).
- Focus groups to understand individuals’ experiences in depth.
- Interviews with individual students.
- Benchmarking studies.
What to consider when crafting a survey
- Sampling
- Information gathered about our services and programs is most useful when it represents the population of students who use, or could use, the service. For example, if you run a program with 10 in attendance, it is important to have all or nearly all of those present complete the evaluation instrument. The sample should resemble the target population as closely as possible (a sample of all first-years, or of all juniors, will not represent the input of the whole student body).
- Timing is important; the survey should be given as near to the time of the intervention/program as is practical.
- The length of the survey should be consistent with the purpose and importance of the information. The more information you gather, the more you have to analyze; shorter is generally better.
- Test the survey with students prior to its administration: have several students take it in the format in which it will be delivered (e.g., if it is an online survey, have several students complete it online and give you feedback).
- Tips for Creating Effective Survey Questions
- Be clear and concise – Don’t ask questions that are overly wordy or unclear.
- Avoid questions that make assumptions – Don’t ask questions that assume something about your service or program (“Since counseling is a right . . .” or “Because your first two weeks in school will be difficult . . .”).
- Ask one question at a time – For example, don’t ask respondents to agree or disagree with “The Campus Safe Ride Program provides an essential service and is easy to access,” which combines two separate questions.
Focus Groups: when and why to use them
- Focus groups allow for an in-depth exploration of a specific topic. This is an excellent methodology for exploring the development of services (for example, what kinds of services would be useful for out of state students).
- Focus groups function best with a gathering of 5 to 10 individuals from a target group (for example, out-of-state students or first-generation students).
- A trained facilitator sets the ground rules and lays out the purpose of the session, and a second person functions as note taker.
- The discussion is open ended and spontaneous and can last from 45 to 90 minutes.
- Using an ice-breaker at the beginning helps to ease individuals into the conversation.
- Conducting 3 or 4 focus group sessions with relatively homogeneous participants who don’t know each other well will ultimately lead to repetition of certain themes. This repetition is referred to as “saturation,” meaning you have arrived at key ideas that deserve attention and further consideration.
- When scheduling a focus group session, it is a good idea to over-invite participants to ensure a sufficient number attend.
- Use a consent form and collect relevant demographic information at the outset of a session.
- Consider writing up a summary of the key themes that were expressed and sending it to the focus group members for feedback and confirmation, if appropriate.
- Some issues to consider when creating focus groups
- Should there be some separation between groups to promote honest exploration of a particular topic (for example, should men and women meet separately)?
- Are there any power differentials that might impede exploration (seniors versus first year students)?
Benchmarking Studies
- Benchmarking studies allow departments, divisions, and entire institutions to compare what they are doing with “higher achieving” comparison groups. These are best done when there is a focused area or finite issue to compare.
- When seeking out comparison groups, it is important to compare institutional settings that are substantially similar to one’s own. For example, comparing a small, rural, residential liberal arts institution to a large, urban, nonresidential institution may be meaningless or make true comparisons difficult. Type of student (for example, academic preparation) and size of endowment (the resources available) can also make comparisons difficult to translate if there are substantial differences.
- Some benchmarking studies involve looking at “aspirant institutions” – institutions that are not substantially similar but that represent situations that are desirable and worthy of comparison. In some instances, aspirant institutions can be useful to help gain further resources as it is desirable to match these institutions (i.e., “if [fill in institutional name] does this, we definitely want to keep up with them!”).
- Types of information that can be gathered and compared include:
- Staffing levels (how many psychologists do they have?)
- Types of services offered
- Types of duties performed
- Budget available
- Types of processes (how are sexual misconduct cases handled?)
- Because this kind of information gathering is time consuming, it is helpful to have multiple individuals gather the information. Sometimes it is helpful to conduct an online survey, to meet with individuals at a national conference, or to use information judiciously from an institution’s website.
Ethical Considerations
- All evaluation efforts need to be reviewed by our Institutional Review Board (IRB) to ensure ethical and safe practices.
- Subjects should be given a clear understanding of the objectives of the evaluation and how the information will be stored and used.
- If the evaluation covers sensitive topics, that should be clear to participants from the outset, and they should be able to opt in or out of any question.
- Participants should be directed to supportive resources if they are or reasonably could be “triggered” by the material.
- Efforts should be taken to safeguard the confidentiality and privacy of participants. If an evaluation effort is designed to be confidential, participants should not be able to be identified through participation.
- Data from evaluation efforts should be maintained in a secure location and destroyed when it is no longer being used.
- Do not present data out of context or in a way that paints a picture that is simply not representative of the student experience.