Survey Approaches


Surveys can take a variety of forms. They work particularly well for learning about engagement with programs or about program effectiveness, and they can be used to measure student learning outcomes, particularly when learning is spread across a variety of experiences on campus or in classes, or happens over the span of multiple years. To maximize the effectiveness of surveys, keep them short and focused. It can be helpful to start your design by considering what you would include if you could only ask 10 questions. Wherever possible, prioritize questions that will give you information you can learn from and act on.

Counts are useful when what you need to know is how often something is happening or how much engagement you have with one of your initiatives.

For example, you may want to consider counting:

  • Attendance at an event or workshop
  • Products generated from a program or workshop
  • ‘Melt’ (participants dropping out) during a longer program
  • Courses that mention vocation in their syllabi or learning outcomes

Satisfaction surveys give you more information about how participants experienced an event. If you are piloting a workshop, program, or discussion group and hope to learn from participants about whether the structure worked, you may find it helpful to include some general questions like the ones listed below.

 Please indicate your level of agreement with each of the following statements about the five-week faculty workshop in which you recently participated (1=Disagree; 2=Somewhat Disagree; 3=Somewhat Agree; 4=Agree).

  • The communication that I received prior to the workshop was clear.
  • The facilitator of the workshop was well-prepared.
  • The length of the workshop was just right.
  • The readings for the workshop were informative.
  • My expectations for the workshop were fulfilled.
  • I felt comfortable contributing comments during the workshop.
  • The physical location of the workshop was conducive to the discussions that occurred there.

You might also consider asking questions more specific to the mode in which you engaged the content.

For example, a program survey for a discussion group might ask:

  • How, if at all, have your thoughts and feelings about the purpose of college changed as a result of this experience?
  • What was the most meaningful part of the discussion group?
  • How do you think future discussion groups like this could be improved?
  • Is there a specific topic you’d be interested in, or think might be important, for a future discussion group related to leadership and character? If so, what?

Your survey can include open-ended questions, which require the respondent to construct a response. These responses can yield a great deal of information, but they also take much more time to review and gather into a report. Open-ended questions are particularly useful early in a program, when you are still looking for themes or want to understand more about how people are engaging with the program.

Example questions directed towards students:

  • In what way has [insert program/class/curriculum] supported your development of a sense of purpose?
  • What activity or class has most supported your development of a sense of purpose?
  • What is your understanding of vocation? Or, what is your definition of vocation? This question can work well before and after a programmatic intervention if your goal is to create a shared understanding of vocation across campus.

Example questions for a faculty survey to understand what is already happening on campus:

  • Do you have student learning outcomes associated with the topic of vocation in your courses?
  • How do you define vocation?
  • Where do you think the topic of vocation fits in with the courses you teach?
  • What is the one effective thing you do to engage with your students on the topic of vocation?

If you do not have common language around vocation, or if you want a more indirect way to engage student thinking about vocation, you can provide a variety of pre-determined scenarios and ask students which one resonates most with their own thinking. This approach works best when the scenarios are constructed so that it is not obvious that any one scenario is the ‘best’ or ‘right’ answer.

The scenarios survey gives an example in which students are presented with three different ways to think about the work they do and its relationship to a sense of calling. Students then respond to each scenario, indicating how much it feels like them and the way they think about their own work.

A survey with multiple-choice questions is much easier to process and analyze, but it requires knowing enough about your program and its goals to decide what you want to ask.

This is a good example of a multiple-choice survey that targets the specific goals of a discussion group about meaning and purpose.

Please indicate to what extent you agree with the following, using a scale from 1 (strongly disagree) to 5 (strongly agree). I found the discussion group helped me…

  1. Foster my sense of community.
  2. Better understand my sense of purpose.
  3. Develop a deeper understanding of what it means to have a strong friendship.
  4. Strengthen my capacity for resilience.
  5. Learn how to express gratitude.
  6. Develop my ability for critical self-reflection.
  7. Feel more capable of taking meaningful action in college and beyond.

Another example of using multiple-choice questions to understand engagement with a program is the survey used to evaluate the Program for the Theological Exploration of Vocation (PTEV). Note that this survey includes a mixture of multiple-choice and free-response questions. While more examples and context can be found in the linked document above, some key examples of questions that target the effectiveness of programming and what participants learned include:

  • Through participation in [name of program], I developed a better sense of my vocation, calling, or purpose. (Response options: strongly agree, agree, not sure, disagree, strongly disagree).
  • [Name of program] programs helped me identify my skills and talents. (Response options: strongly agree, agree, not sure, disagree, strongly disagree).
  • [Name of program] programs encouraged me to both see and serve the needs of others. (Response options: strongly agree, agree, not sure, disagree, strongly disagree).
  • [Name of program] programs have deepened my appreciation for the mission of [name of college or university]. (Response options: strongly agree, agree, not sure, disagree, strongly disagree).
  • Engaging concepts like vocation, calling, or purpose has helped me move advising/mentoring conversations into deeper and more important matters. (Response options: strongly agree, agree, not sure, disagree, strongly disagree).
  • Overall, how has the [name of exploration project] been received by the faculty and/or staff at [name of college or university]? (Response options: [1] faculty and/or staff have been extremely welcoming of the grant. [2] faculty and/or staff have generally been welcoming of the grant. [3] faculty and/or staff have expressed some resistance to the grant. [4] faculty and/or staff have expressed extreme resistance to the grant. [5] faculty and/or staff do not know enough about the grant to welcome or resist it. [6] not sure.).
  • The impact of [name of program] programs will be felt long after the grant money is gone. (Response options: strongly agree, agree, not sure, disagree, strongly disagree).

You can also consider using standardized measures related to vocation and calling.