Computer Science


PeerWise: Hassan Khosravi used the PeerWise system in APSC 160 and in CPSC 259 and 304 in 2014–2016. PeerWise supports an online repository of multiple-choice questions that are created, answered, rated and discussed by students. This peer-created and peer-curated content was used by the students for formative assessment. The tool gives the instructor access to extensive analytics on student behaviours, and Hassan has downloaded this information for further analysis; for example, interaction graphs of students mediated by the questions. Hassan also developed a series of scripts that allow questions and answers to be extracted from the PeerWise system and used independently (for example, PeerWise could be used in a single offering and an instructor-curated subset of questions could then be hosted locally in subsequent offerings); however, the benefits of this additional functionality were not sufficient to offset the disadvantage of losing access to the more comprehensive user interface provided by using the PeerWise tool directly, so its development has been suspended.
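One of the analyses mentioned above is building interaction graphs of students mediated by the questions they author and answer. A minimal sketch of that idea, assuming a simplified (author, question, responder) record format — the real PeerWise export format differs — might look like:

```python
from collections import defaultdict

# Hypothetical activity records extracted from PeerWise:
# (question author, question id, responding student).
records = [
    ("alice", "q1", "bob"),
    ("alice", "q1", "carol"),
    ("bob", "q2", "alice"),
    ("bob", "q2", "carol"),
]

def interaction_graph(records):
    """Weighted student-student edges: an author and a responder are
    linked each time the responder answers one of the author's questions."""
    edges = defaultdict(int)
    for author, _question, responder in records:
        if author != responder:
            key = tuple(sorted((author, responder)))  # undirected edge
            edges[key] += 1
    return dict(edges)

print(interaction_graph(records))
# {('alice', 'bob'): 2, ('alice', 'carol'): 1, ('bob', 'carol'): 1}
```

Edge weights then summarize how often each pair of students interacted through the question repository; this is an illustrative sketch, not the analysis scripts described above.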

Computer Science Student Experience Project: Jessica Dawson began a research study to examine the outcomes and experiences of students in CS introductory courses, and in particular, to understand how these experiences may differ for different students (for example, CS majors and non-CS majors). In 2015 baseline data collection began via pre-post surveys (CPSC 110 and CPSC 301) and student interviews and focus groups (CPSC 110). In collaboration with the course instructors, this data will be used to evaluate two new introductory CS courses (CPSC 100 and CPSC 103), each of which has a different target student audience than the existing introductory courses (CPSC 110, 301) the department offers. As part of this project, version 4 of the Computing Attitudes Survey (CAS) developed by Allison Elliott Tew is being used. The CAS has also been administered in CPSC 210 to evaluate changes in attitudes of CS-major students after their first and second programming courses.
PDF Poster (UBC Science Education Open House 2016): Student Experience in Introductory CS Courses

Foundations of Computing Concept Inventory: Steve Wolfman has been developing a set of related concept inventories to assess student progress through our Foundations of Computing Stream (CPSC 121, 221, 320). The process began with the high-level learning goals for the courses, and then drew on data from exams, project submissions and think-aloud interviews. Preliminary results were presented at SIGCSE 2014, and a special session on concept inventories in CS was run at SIGCSE 2015. A draft multiple-choice CI covering the basic material has been piloted on students at the start of CPSC 121 and at the end of CPSC 121 and 221. New questions are still being developed, and further offerings of the CI will be undertaken.
PDF Poster (CWSEI EOY 2014): Misconceptions and Concept Inventory Questions for Hash Tables and Binary Search Trees
Paper (SIGCSE 2014): Misconceptions and concept inventory questions for binary search trees and hash tables
Talk by Steve Wolfman, Nov 2013: Developing a Concept Inventory for the Foundations of Computing Course Sequence
PDF Poster (CWSEI EOY 2013): Developing a Formative Assessment of Instruction for the Foundations of Computing Stream

Mechanical TA Software for Peer Review: CPSC 430 has traditionally used essay questions on the midterm and final exams to judge students’ ability to express concepts discussed in this non-technical class; however, because of the high cost of grading essays it was not feasible to provide opportunities during the term for students to practice such essays. Kevin Leyton-Brown has developed a software system called Mechanical TA (MTA) which allows students to submit brief essays through an online portal and then distributes the essays for peer review. While many such systems are available, the novel feature of MTA is that it divides the students into two groups based on the quality of their peer reviews. The “supervised” pool of students submit their essays and peer reviews as normal, but TAs provide grades on both their essays and their peer reviews. Students whose peer reviews are consistently good graduate into the “unsupervised” pool, where their essays are assigned the median score among the peer reviews and their peer reviews are assumed to be good; TAs need only grade a subset of spot-checks and appealed reviews. Not only does this system reduce TA workload, but the students have an incentive to produce high-quality peer reviews and (with the recent addition of a pool of example essays for calibration) the means to improve their reviewing. A paper has appeared at the ACM Technical Symposium on Computer Science Education (SIGCSE) 2015: Wright, Thornton & Leyton-Brown, Mechanical TA: Partially Automated High-Stakes Peer Grading, doi: 10.1145/2676723.2677278. A survey was run at the end of the fall 2014 offering to explore student opinions about MTA, and the data is currently being analyzed. There is also ongoing work on the user interface (both student-side and instructor-side).
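The two-pool scheme described above can be sketched in a few lines. This is an illustrative sketch only, not the actual MTA code: the graduation threshold, the window of recent reviews considered, and the function names are made-up parameters.

```python
import statistics

GRADUATION_THRESHOLD = 0.8   # hypothetical: minimum review-quality score
WINDOW = 3                   # hypothetical: number of recent reviews checked

def pool(review_quality_history):
    """A student graduates to the 'unsupervised' pool once their last
    WINDOW peer reviews have all been consistently good."""
    recent = review_quality_history[-WINDOW:]
    if len(recent) == WINDOW and min(recent) >= GRADUATION_THRESHOLD:
        return "unsupervised"
    return "supervised"

def essay_grade(peer_scores, ta_score=None):
    """Supervised essays take the TA's grade; unsupervised essays are
    assigned the median of their peer-review scores."""
    return ta_score if ta_score is not None else statistics.median(peer_scores)

print(pool([0.9, 0.85, 0.95]))    # consistently good reviews -> unsupervised
print(essay_grade([70, 82, 90]))  # median peer score: 82
```

Using the median rather than the mean makes the assigned grade robust to a single outlying peer review, which matters once TAs are no longer checking every review.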

The MTA software has been tested with three other courses (CPSC 101, 110 and 301) for other types of assignment (images and code). It is not currently being used in the other courses for several reasons: the instructor interface is still rather fragile, the department's tech staff have concerns about the stability and security of the software, and it appears that the approach is most effective when used for many assignments (such as the weekly essays in 430) but not worth the overhead when used for a small number of assignments. Avenues to overcome the first two issues have been identified, but it remains to be seen whether there are other instructors and courses that can overcome the third.

Computing Attitudes Survey (CAS): Allison Elliott Tew is in the process of developing and validating the Computing Attitudes Survey (CAS), a new assessment instrument to gauge student attitudes and perceptions about learning computer science. The CAS is based on the Colorado Learning Attitudes about Science Survey (CLASS) and extends that work to include specific computing issues such as debugging and data representation. The CAS will be applicable to a broad range of students and was piloted in three introductory courses in the fall of 2011: CPSC 101, CPSC 110 and APSC 160. Various versions of the survey were run in various classes in 2012, 2013 and 2014.
Paper: B. Dorn and A. E. Tew, Empirical Validation and Application of the Computing Attitudes Survey, Computer Science Education, 25(1), 2015, doi: 10.1080/08993408.2015.1014142. Version 4 of the survey has been released.
PDF Poster (CWSEI EOY 2012): Adapting the CLASS for Use in Computer Science

Longitudinal Study of Student Learning: Allison Elliott Tew is designing a research study into assessment of student learning across a sequence of software design courses running from 1st to 4th year.  Implementation details are currently under development.  Initial meetings have been held with faculty who teach the courses.  The first step is to move from learning goals that focus on particular courses to learning goals that capture the progression from novice to expert over a sequence of courses.

Decomposition techniques in teaching proof by induction: Kim Voll applied a decomposition technique when teaching proof by induction in CPSC 121 in spring 2010. Ben Yu interviewed students from both sections of the course taught last term using a think-aloud protocol developed in conjunction with Wendy Adams (UC). The results will be analyzed to determine whether students taught with the decomposition technique demonstrate a stronger ability to perform proof by induction.

Just-in-time-teaching (JiTT) in APSC 160: The instructor has developed screencasts to introduce basic content to students. Students are expected to watch one or more screencasts before coming to class and are assessed on their grasp of this introductory material using clicker questions at the start of class. A collection of in-class problem sets has also been developed that will allow students to explore their understanding of more advanced content. We plan to assess retention of learning at the start of the follow-on course (CPSC 260) in the fall and compare the results with those from last year, when students had taken APSC 160 with more traditional instruction.
APSC 160: Student perceptions of online multimedia instruction with JiTT

Just-in-time-teaching in CPSC 121: The instructor has identified a subset of learning goals called “pre-class” learning goals. These are goals that students are expected to meet before coming to class. Online tests have been developed to assess student learning of those goals. A set of in-class problems has been developed that addresses more advanced learning goals. Comparative survey work indicates dramatic increases in the percentage of students who use the textbook and find it useful to their learning.
Adaptation of JiTT in CPSC 121

Just-in-time-teaching (JiTT) in CPSC 221: One instructor taught both sections in 2008/09 Winter Term 2. Students in one section saw a JiTT approach and the use of in-class activities involving peer instruction and discussion, while students in the other section received more traditional instruction. Students in both sections wrote the same exams and completed the same homework assignments.

Study on the use of online collaborative multiple-choice question repository (PeerWise): Donald Acton and Beth Simon conducted a study of the use of PeerWise by students in Computer Science 2nd and 4th year courses in 2007/08.  A survey was given to the students about how they use PeerWise and whether they feel that submitting or answering questions helps them learn.
PDF Poster by Paul Denny (April 2009): PeerWise - Students sharing and evaluating their multiple choice questions

Self-theories: Beth Simon and collaborators conducted a study of the impact of students’ self-theories (based on the work of Carol Dweck) on success and persistence in beginning programming courses. Through 30 years of research, Dweck has shown that students who adopt a “growth mindset” – that is, believing that through hard work and learning from errors they can improve their intelligence – perform better in a variety of academic settings. A minimal intervention to encourage students to adopt a growth mindset was investigated in both CPSC 111 and APSC 160. This work was presented at the 2008 International Computing Education Research Workshop in Sydney, Australia. Read the paper:
Saying isn't necessarily believing: influencing self-theories in computing

Learning Goals study: Beth Simon in collaboration with Jared Taylor, STLF in Life Sciences, conducted a study of student and faculty perceptions of the usefulness of learning goals. Explicit use of learning goals has also spread to the Computer Science and Engineering department at UC San Diego, home institution of our first STLF. Read the paper (published in the Journal of College Science Teaching):
What is the Value of Course-Specific Learning Goals?

Parsons puzzles: A study was conducted in 2007/08 of a new type of exam question for assessing skills similar to those assessed by code-writing questions. The results have been published in the proceedings of the Fourth International Computing Education Research Workshop.