Assessing Our Learners Part I
This post originally appeared on the Software Carpentry website.
Three weeks ago, Jason Williams, Jeramia Ory, and Daniel Chen met at the New York Public Library to work out an initial survey to assess our learners. Greg Wilson and Katerena Kuksenok joined virtually to provide feedback. The goal was to take the comments from the various initial GitHub issues and create a draft of an assessment survey for everyone to provide input. Our first draft is up, so please provide feedback at {{site.github_url}}/assessment/issues/6.
We set out to create an initial set of questions on the following topics:
- attitude/opinions/mindsets,
- facts/declarative knowledge, and
- actual, testable skills.
We also discussed giving out the survey immediately after the workshop ends. This way we can get a decent number of responses about how that particular workshop went for its students. Since some hosts conduct surveys of their own after the workshop, we should aim to set aside about 15 to 20 minutes at the end for feedback.
The survey itself will have the format outlined below.
Introduction
A few words on the purpose and importance of the survey, as well as information on privacy, etc.
Questions about this particular workshop
Quick questions about the overall experience of this particular workshop
- Likert scale: Strongly Disagree to Strongly Agree
- The agenda for the workshop was reasonable
- I got sufficient help from instructors and helpers
- The overall atmosphere of the workshop was welcoming and unintimidating
- I learned valuable things at this workshop
- this question is a bit vague and could lead to unexpected responses
- we really want to capture whether the workshop was useful (at all) to the individual
- possible rephrases per Katerena:
- 'The workshop was worth my time'
- 'I'm glad I went to the workshop'
- Rate how you perceived the pace of the workshop
- Too slow (We dwelled too long on most topics)
- Somewhat slow
- Just right
- Somewhat fast
- Too fast (Most topics presented at a pace I could not follow)
- Inconsistent (Some parts were too slow, and others were too fast)
- 5 or 7 pt. Scale: How did you feel about the balance of lecture to hands-on exercises?
- Too much hands-on ... balanced ... too much lecture
- trying to measure whether people prefer lecture or more hands-on work
- Are our lessons weighted too far towards one side?
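As a rough illustration of how these Likert responses might be tabulated afterwards, here is a minimal Python sketch. It assumes a standard 5-point scale and an exported CSV whose column headers are the question texts; neither of those details is decided yet.

```python
# Hypothetical sketch only: code Likert answers numerically and summarize one
# question. The CSV file name and column headers are illustrative assumptions.
import csv
from collections import Counter

LIKERT = {
    "Strongly Disagree": 1,
    "Disagree": 2,
    "Neutral": 3,
    "Agree": 4,
    "Strongly Agree": 5,
}

def summarize(path, question):
    """Count responses to one Likert question and compute a mean score."""
    counts = Counter()
    with open(path, newline="") as fh:
        for row in csv.DictReader(fh):
            answer = row.get(question, "").strip()
            if answer in LIKERT:
                counts[answer] += 1
    total = sum(counts.values())
    mean = (sum(LIKERT[a] * n for a, n in counts.items()) / total) if total else None
    return counts, mean

# Example (assumed file and column name):
# counts, mean = summarize("responses.csv", "I learned valuable things at this workshop")
```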
Attitudes
Perspectives we want to get from our learners (hopefully ones we have convinced them of). Types of people who come:
- People who come with no intention of using any of the skills (just curious)
- PIs who want their researchers to learn skills used in the lab
- Researchers who need skills to function in the lab
- People who see 'shell, python, R, git, SQL' and want to attend (more detail)
- Self-taught programmers who want more formal instruction and to learn better/best practices
We wanted to create a series of questions that can account for the above types of attendees.
- Ranking: Here are some reasons why people take a Software Carpentry Workshop. Rank your top 3 reasons below. If your reason is not listed, please enter it under 'Other'.
- To get an overview of technologies, skills, and best practices used in my field
- To apply these skills in making my analyses more efficient (automated)
- To share these skills by training others
- To improve my coding or learn a new language
- Because these skills will be needed for my research
- To explore different options for approaching some specific ongoing project(s)
- To gain a specific skill for some specific ongoing project(s)
- The workshop was recommended by my colleagues or advisers
- Because it was a workshop (not an online course)
- Other: ___
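To show how the top-3 ranking could be summarized later, here is a minimal sketch using a simple weighted count. The weights and the sample responses are invented for illustration; we have not settled on an analysis approach.

```python
# Hypothetical sketch: weighted (Borda-style) tally of top-3 ranked reasons.
# The weights and the sample responses below are invented for illustration.
from collections import Counter

def tally_reasons(responses, weights=(3, 2, 1)):
    """Each response is a list of up to three reasons, most important first."""
    tally = Counter()
    for ranked in responses:
        for reason, weight in zip(ranked, weights):
            tally[reason] += weight
    return tally.most_common()

sample = [
    ["Because these skills will be needed for my research",
     "To improve my coding or learn a new language"],
    ["To improve my coding or learn a new language",
     "Because these skills will be needed for my research",
     "To share these skills by training others"],
]
print(tally_reasons(sample))
```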
- What we want to target: we want to find out whether we have made learning difficult things easier and less intimidating, and whether people feel better about some of the topics we cover (assuming they felt intimidated by them previously).
We need feedback on how to better phrase these questions.
- Likert: Before the workshop, did you feel any of the following topics were intimidating? (Not intimidating | Intimidating | Did not know what this was):
- Shell
- Git
- R/Python/Matlab/etc
- SQL
- Likert: If your opinion has changed about these topics, please indicate which ones below: (I now feel less intimidated | I now feel more intimidated | No change):
- same choices as above
- Likert: Before the workshop, how valuable did you consider skills in the following topics? (Not valuable | Valuable | Did not know what this was):
- same choices as above
- Likert: After the workshop, how valuable do you consider skills in these areas? (Not valuable | Valuable | No change):
- same choices as above
- Select all the statements you think this workshop accomplished for you:
- Made learning these skills enjoyable
- Made me more likely to continue learning these skills
- etc.
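For the before/after questions, something like the following sketch could summarize, per topic, how many learners report feeling less or more intimidated. The input format (one dict per respondent) is an assumption for illustration only.

```python
# Hypothetical sketch: count, per topic, how many learners report feeling less
# or more intimidated after the workshop. Input format is assumed: one dict per
# respondent mapping topic -> their answer to the "opinion changed" question.
# Missing answers are treated as "No change" here.
from collections import Counter

TOPICS = ["Shell", "Git", "R/Python/Matlab/etc", "SQL"]

def change_by_topic(responses):
    changes = {topic: Counter() for topic in TOPICS}
    for row in responses:
        for topic in TOPICS:
            changes[topic][row.get(topic, "No change")] += 1
    return changes

sample = [
    {"Shell": "I now feel less intimidated", "Git": "No change"},
    {"Shell": "I now feel less intimidated", "SQL": "I now feel more intimidated"},
]
for topic, counts in change_by_topic(sample).items():
    print(topic, dict(counts))
```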
Important Takeaways
- 3-5 key facts or concepts that we need every learner to understand
- automation
- modular code
- version control
- structured data
Skills: Self-assessment
Originally we thought of asking students to check off which learning objectives from the lessons they thought they could do. But there are too many objectives in the lessons to list in a survey. We need a better (and fast) way to assess actual skills that doesn't take any significant time away from the workshop itself.