Assessment

The Carpentries value a culture of assessment. We are continually evaluating our workshops to understand their impact on learners and how we can improve our content delivery.

Dr. Kari L. Jordan, our Director of Assessment and Community Equity, leads our assessment efforts. She has established the Assessment Network, a space where those working on assessment within the open source/research computing community can collaborate and share resources.

Dr. Erin Becker, our Associate Director, serves as Principal Investigator for our research efforts related to assessing the effectiveness of our workshops. This work is carried out under the supervision of the Institutional Review Board at the University of California, Davis. Our faculty sponsor at UC Davis is Dr. Megan Welsh, Assistant Professor in the UC Davis School of Education.

Please see our IRB proposal and determination letter for more details about our research efforts.

Surveys

We conduct pre- and post-workshop surveys for each workshop.

Data Carpentry’s Pre- and Post-Workshop Surveys

Software Carpentry’s Pre- and Post-Workshop Surveys

Analysis of Software and Data Carpentry’s Pre- and Post-Workshop Surveys

Long-Term Survey

Additionally, we have begun collecting data on the long-term impact of our workshops on both Data Carpentry and Software Carpentry learners. The Carpentries Long-Term Impact Survey was launched in March 2017, and data are collected every six months.

Data

Data sets (provided as CSV files) and assessment reports are available in a GitHub repository, along with the versions of the surveys that were in use at the time of analysis.

Assessment Results

We release regular reports from our survey results. These reports are published through Zenodo and can be accessed through the links below.

Post-Workshop Reports

The analysis in this report serves the following purposes:

  • To inform the community of the impact Data Carpentry workshops have made on its learners.
  • To provide context for the survey responses as they relate to Data Carpentry learners.
  • To discuss what Data Carpentry is doing well, areas of improvement, and questions we should be asking.

The report finds that after a workshop, learners report that their data management and analysis skills have increased, that they are more confident in their ability to use these skills, and that they have a greater appreciation for how these skills (e.g. scripting) improve and promote reproducible research. 95% of learners agree or strongly agree that they would recommend the workshop to a colleague.

Long-Term Survey Reports

The long-term survey assessed confidence, motivation, and other outcomes more than six months after respondents attended a Carpentry workshop. Provided below are a few highlights from the data.

  • 77% of our respondents reported being more confident in the tools that were covered during their Carpentry workshop compared to before the workshop.
  • 54% of our respondents have made their analyses more reproducible as a result of completing a Carpentry workshop.
  • 65% of our respondents have gained confidence in working with data as a result of completing the workshop.
  • 74% of our respondents have recommended our workshops to a friend or colleague.

December 2017

When Do Workshops Work? A Response to the ‘Null Effects’ paper from Feldon et al. Author: Karen R. Word. Contributors: Kari Jordan, Erin Becker, Jason Williams, Pamela Reynolds, Amy Hodge, Maxim Belkin, Ben Marwick, and Tracy Teal.

This was a collaborative response to the paper: Feldon, David F., et al. “Null effects of boot camps and short-format training for PhD students in life sciences.” Proc Natl Acad Sci U S A. 2017 Sep 12; 114(37): 9854–9858. doi: 10.1073/pnas.1705783114. Our data suggest that we are having a positive impact, and we expect that other short-format programs can be similarly effective. Read our full response.

February 2018

Webinar with Rochelle Tractenberg: Debrief

On February 2, the Assessment Network held a webinar with Rochelle Tractenberg. Dr. Tractenberg directs the Collaborative for Research on Outcomes and Metrics at Georgetown University, where she is a tenured professor in the Department of Neurology. Our starting point was the controversy about short-format training which arose last year, following the publication of a 2017 PNAS paper titled “Null effects of boot camps and short-format training for PhD students in life sciences”. The Carpentries design and deliver short-format training for working with software and data; trainees are researchers from various fields. The Carpentries’ initial response to the paper discussed many ways in which we have been successful with respect to our goals for Software Carpentry and Data Carpentry workshops. However, given that short-format training is a known challenge for generating sustainable content learning, we hoped that Dr. Tractenberg’s expertise might shed some light on areas with room for improvement. See the full webinar.