Assessment Update - 2015

This post originally appeared on the Software Carpentry website.

About a year ago, I hoped to help answer the question, "Is our students learning?" We are now a bit closer to an answer, and I want to update the community on what we have accomplished this year. First, I want to thank and acknowledge the assessment subcommittee and Blake Joyce, who helped define tasks and possibilities and provided significant feedback. In particular, I want to thank Daniel Chen, Jeramia Ory, and Rayna Harris, who put in a lot of hours moving things along.

Where are we today and why

Currently, we have three surveys, all of which you can preview at the links below:

  • Pre-workshop Survey: Basic information about learner capabilities and attitudes

  • Post-workshop Survey: Feedback on the perceived effectiveness of the workshop, learner motivation, and confidence

  • Long-term Survey: Impacts and outcomes for learners after attending a workshop

As it turns out, there aren't widely accepted, standardized ways to evaluate workshops like ours. We apply evidence-based teaching techniques designed for the classroom, yet we resemble science museums that try to teach within very short exposures. Our teaching materials could have been delivered as an online or self-paced course, but instead we use socially driven, in-person workshops. Given these complexities, here are some of the questions that drove these surveys:

  • Do we create a learning environment that is welcoming and inclusive?

  • Can we generate feedback for our instructors that will help them improve?

  • What do learners think they know before/after our workshops?

  • Are we lowering anxieties and barriers to learning the topics we teach; are we motivating learners?

  • What positive outcomes do learners attribute to their Software Carpentry experience?

  • Who are our learners, and are we setting and meeting goals for diversity?

We think these are reasonable questions given our context and constraints. We know that a two-day workshop is not going to turn learners into experts, but we do think we can set many learners on the path to competency in the best practices of software use in their respective sciences.

What instructors need to know

The most important task for instructors is to remind learners to complete the surveys (and to provide time for them to do so). These surveys will not only help Software Carpentry do a better job in general, but will also generate feedback that helps you improve personally. I like to emphasize to learners that these data are important to us and that we really need them to contribute.

Survey Workflow

Survey Setup

  1. Links to the surveys are part of the workshop template, so they should automatically appear on your agenda. Once your agenda is public, you can remind your learners to take the pre-survey.

  2. During the survey, learners will select your workshop from a list ('Which workshop are you attending?'; Q3 on the pre-survey, Q2 on the post-survey). Check this question to confirm that your workshop is listed; if it isn't, check with admin@software-carpentry.org. Adding workshop sites is a manual step for us, and if something was overlooked, learners have the option to simply type in the workshop name. The survey link generated when you create your workshop template also encodes the workshop ID, which acts as a backup against mistakes in user input (see the sketch below).
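
For those curious what "encoding the workshop ID in the link" can look like, here is a minimal Python sketch. It is an illustration only: the base URL, the 'workshop' query parameter, and the workshop slug below are assumptions made for the example, not the actual format of the links that SurveyMonkey or the workshop template produces.

```python
# Illustrative sketch only -- the real survey links are generated for you by the
# workshop template; the URL format and parameter name here are assumptions.
from urllib.parse import urlencode, urlparse, parse_qs

BASE_SURVEY_URL = "https://www.surveymonkey.com/r/EXAMPLE"  # hypothetical survey link

def survey_link(workshop_id):
    """Build a survey link carrying the workshop ID as a query parameter."""
    return BASE_SURVEY_URL + "?" + urlencode({"workshop": workshop_id})

def workshop_from_link(url):
    """Recover the workshop ID from a survey link, e.g. to cross-check it
    against the workshop a learner selected by hand in the survey."""
    params = parse_qs(urlparse(url).query)
    return params.get("workshop", [None])[0]

link = survey_link("2015-01-01-example")   # hypothetical workshop slug
print(link)                                # .../r/EXAMPLE?workshop=2015-01-01-example
print(workshop_from_link(link))            # 2015-01-01-example
```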

Administering surveys

  1. In any reminders or communications to learners before the workshop, please remind learners to take the pre-survey.

  2. Remind learners to take the pre-survey if they haven't done so before you begin day one.

  3. On day two, leave extra time for learners to take the post-survey. The best way to ensure a high completion rate is to ask learners to complete the survey before they leave the workshop. Please allow time for this, and remind learners who must leave early that they can still complete it at home.

  4. The admin will email the long-term follow-up survey to learners after the workshop.

Viewing survey results

  1. The Software Carpentry administrator will generate a URL for you that lets you view responses to both the pre- and post-workshop surveys. These will also be visible to mentors at the workshop debriefs.

Like everything in Software Carpentry, this effort is open to and benefits from your feedback. Since we are using SurveyMonkey, the usual method of making pull requests isn't possible. There may be a better way to handle changes, but for now, please send suggestions to the assessment list.

What do we do next

As good a start as this is, we still have a lot more we want to do. First, thanks to Data Carpentry, we will be getting professional assessment help from a dedicated evaluator. At the very least, we think our efforts so far will be a great starting point for improved assessments. There are still many questions that will need more effort to address, including:

  • What is the best way to get live feedback during our teaching?

  • Can we get a more quantitative understanding of what skills our learners have and if they are improved by our workshops?

  • What misconceptions do our learners have and how do we address them?

  • How can we get more fine-grained feedback on individual lessons?

  • How can all of this be integrated with AMY to make it easier for our instructors and organization to react to collected feedback?

We have some ideas on this that will have to wait for another blog post. As we gather more data (we have already collected more than 200 responses over the past month), we will report regularly on what we are learning. For now, please let us know your opinions on what we have done so far and what should come next.
