
2015 Post-Workshop Instructor Debriefing, Round 8

This post originally appeared on the Software Carpentry website.

The mentorship team met last week for a discussion with instructors who recently taught, including workshops at the National Center for Atmospheric Research (NCAR) and the University of Texas at Arlington (the latter taught by both authors of this post). Three important issues emerged during our discussion: recording the instructor's shell code, preparing instructors/helpers with answers to challenge questions, and using example scripts to model increasing complexity in coding.

Recording the instructor's shell code:

A common problem for students new to coding is keeping up with the instructor's live coding. To help with this, we suggest following an awesome tip from the Software Carpentry discussion list last year and running the following command in the instructor's shell:

export PROMPT_COMMAND="history 1 >> ~/Dropbox/history.txt"

This command appends every command entered in the instructor's shell to a text file. In this example, the file lives in a shared Dropbox folder, so students can open the file's shared URL and refresh it to see the latest commands entered. Multiple students and helpers said they appreciated having this record for reference. We recommend adding this line to the instructor's .bashrc or .bash_profile to ensure continuity throughout the entire lesson (but remember to remove it when you're done teaching!).
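For instance, here is a minimal sketch of how this might look in an instructor's .bashrc; the SWC_TEACHING guard variable and the Dropbox path are our own illustrative choices, not part of the original tip:

# Hypothetical ~/.bashrc snippet: log each command to a Dropbox-synced
# file that students can watch through a shared link. The SWC_TEACHING
# guard lets you turn logging on only while teaching.
if [ "$SWC_TEACHING" = "yes" ]; then
    export PROMPT_COMMAND="history 1 >> ~/Dropbox/history.txt"
fi

When the workshop ends, unset SWC_TEACHING before opening a new shell (or run unset PROMPT_COMMAND in the current session) to stop logging.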

Preparing instructors/helpers with answers to challenge questions:

While most instructors are familiar with the challenge questions (multiple choice questions, short answer prompts, etc.) listed in the canonical lesson materials, instructors sometimes introduce their own questions or more complicated challenge exercises to the class. Our debriefing participants suggested that the instructor for each lesson share possible solutions to the challenges with the other instructors/helpers a few days before the workshop. This will better equip our helpers and co-instructors with the tools to assist students and solve problems.

Using example scripts to model increasing complexity in coding:

The UT Arlington workshop deviated from the traditional lesson materials by incorporating a workflow built around portions of the Gapminder dataset. In the Unix lesson on Day 1, students learn basic scripting commands and build a script from scratch to modify some of the Gapminder files. On Day 2, however, part of this restructured series of lessons leads students through three pre-constructed shell scripts that prepare, clean, and combine data for later analysis in R. This is a substantial departure from the pedagogy of the original lessons, since the canonical materials lead students through scripting starting from a blank text editor.

We found that offering students pre-constructed scripts had a few advantages. First, the scripts included only commands we had already covered (e.g., mv, cp) or that were intuitive and easy to explain (e.g., echo), which reinforced earlier material. Second, we could model for students how to combine bits of code to accomplish complicated but reproducible tasks. Finally, using these scripts in a workflow mirrors the programming process of many scientists, since most of us use and modify code written by other people to perform our own particular tasks. We believe that teaching students both 1) how to write scripts from scratch and 2) how to read and modify existing scripts will give trainees a robust toolkit for their data-driven research careers.
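To give a feel for what such a pre-constructed script might look like, here is a minimal sketch in the same spirit; the file names (cleaned/gapminder_*.txt) and the exact commands are our own illustrative assumptions, not the actual workshop scripts:

#!/usr/bin/env bash
# combine-data.sh (hypothetical sketch): merge cleaned Gapminder files
# into a single table for analysis in R, using only simple, easily
# explained shell commands.
echo "Combining cleaned Gapminder files..."
head -n 1 cleaned/gapminder_africa.txt > gapminder_all.txt   # keep one header row
for file in cleaned/gapminder_*.txt; do
    tail -n +2 "$file" >> gapminder_all.txt                  # append data rows only
done
echo "Wrote $(wc -l < gapminder_all.txt) lines to gapminder_all.txt"

A script like this lets instructors walk through each line and ask students to predict what it does, reinforcing the Day 1 material while modeling a realistic, reproducible workflow.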