TCB Cluster Building Workshop

Evaluation of the Theoretical and Computational Biophysics Group Cluster Workshop

Beckman Institute, Room 5602
November 30 - December 1, 2006

Questionnaire: Gila Budescu, TCB Group, UIUC, and modified by David Brandon, TCB Group, UIUC
Analysis and report: David Brandon, TCB Group, UIUC, and Kathryn Walsh, TCB Group, UIUC

The Theoretical and Computational Biophysics Group (TCBG) at UIUC, an NIH Resource for Macromolecular Modeling and Bioinformatics, is headed by Klaus Schulten, with co-PIs Z. Luthey-Schulten, E. Tajkhorshid, and A. Aksimentiev. As part of its outreach, the Resource offers workshops to introduce and transfer its programs and technological solutions to the biomedical community. The Resource produced a day-and-a-half workshop (November 30 - December 1) on cluster building, held in Room 5602 of the Beckman Institute on the University of Illinois at Urbana-Champaign campus. The workshop program consisted of lectures and hands-on sessions providing experience in building a cluster. Instructors at the workshop were Jim Phillips, Senior Research Programmer at TCBG, and Tim Skirvin, Senior Systems Administrator at TCBG.

At the end of the second day of the workshop, participants were asked to complete a general evaluation survey.  The general evaluation form asked about topics such as outcomes of the program, ratings of attributes of lectures and tutorials, and organization and communication.  Participation in the evaluation was voluntary.  A total of 17 general evaluation forms were returned, for an overall response rate of 81%.  Education levels of participants were high; the majority of attendees were PhDs or PhD candidates.

All responses in the following tables are reported as percentages; each row sums to approximately 100%, subject to rounding. Not all respondents answered every item; the number of responses per item is shown next to each question. References to 'agreement' among respondents are calculated by adding together the percentages for the 'agree' and 'strongly agree' responses.
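The percentage and agreement calculations described above can be sketched as follows. This is an illustrative example, not the script used for the actual analysis, and the response counts shown are hypothetical:

```python
# Illustrative sketch of the tabulation method described above.
# Given raw response counts for one survey item, compute the row
# percentages and the combined "agreement" figure.
LEVELS = ["strongly disagree", "disagree", "unsure", "agree", "strongly agree"]

def summarize(counts):
    """Return (percentages by level, agreement %) for one item's counts."""
    n = sum(counts.values())  # number of respondents answering this item
    pct = {level: round(100 * counts.get(level, 0) / n) for level in LEVELS}
    # 'Agreement' adds the 'agree' and 'strongly agree' percentages.
    agreement = pct["agree"] + pct["strongly agree"]
    return pct, agreement

# Hypothetical counts for an item answered by N = 17 respondents:
pct, agreement = summarize({"unsure": 1, "agree": 3, "strongly agree": 13})
print(pct, agreement)  # percentages round to 6 / 18 / 76; agreement 94
```

Note that because each cell is rounded independently, a row may sum to slightly more or less than 100%, which is why the tables are described as summing to ~100%.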

I. Outcome

  N Strongly disagree (%) Disagree (%) Unsure (%) Agree (%) Strongly agree (%)
I1. The Workshop broadened my understanding of concepts and principles in designing and operating clusters. 17   6   35 59
I2. The Workshop improved my ability to set up a cluster. 17     6 18 76
I3. The Workshop improved significantly my computational skills. 17   29 35 24 12
I4. The Workshop taught me techniques directly applicable to my research and/or job/career. 17     6 41 53

Items I1-I4 refer to desirable outcomes for those attending the workshop.  Large majorities of participants indicated that the workshop broadened their understanding of cluster building concepts (94%), improved their ability to set up a cluster (94%), and taught them techniques directly applicable to their research and/or job/career (94%).  About one-third (35%) of respondents indicated that the workshop significantly improved their computational skills.

II. Lectures

  N Strongly disagree (%) Disagree (%) Unsure (%) Agree (%) Strongly agree (%)
II1. The instructors’ knowledge of the subjects was good. 17       29 71
II2. The instructors explained the material well. 17     6 35 59
II3. The instructors provided real-world examples. 17     6 59 35
II4. The instructors were prepared for the lectures. 17       35 65
II5. The lectures were coordinated between instructors. 17     6 24 71
II6. Lectures incorporated recent developments in the field. 17     29 24 47
II7. The range of lectures captured the overall essentials of the field. 17   6 18 41 35
II8. The level of the lectures was appropriate. 17     12 47 41
II9. The underlying rationale of the techniques presented was clear. 17     18 47 35
II10. The relevance of the lectures was strong. 17       59 41

Items II1-II10 evaluated the lectures given by the instructors.  Ratings were quite high, with all or nearly all participants (94-100%) agreeing or strongly agreeing that the instructors were knowledgeable (100%), explained the material well (94%), provided real-world examples (94%), were prepared for the lectures (100%), coordinated their lectures (95%), and provided relevant material (100%). Strong agreement was also found that the lectures incorporated recent developments in the field (71%), captured the essentials of the field (76%), were at the right level for the audience (88%), and presented a clear rationale for the techniques covered (82%).

Lecture Comments

Participants were also given the opportunity to provide written comments about the lectures on day one and day two of the workshop.  Few took advantage of this opportunity, and some of the responses received were not about the lectures.  The comments provided were:

Day One Lecture:

  • "Very good!"
  • "Good introduction to the topic."

Day Two Lecture:

  • "Went well, helped me to understand how to manage a cluster and build one."
  • "Good too!"

Other comments suggested that there was some redundancy in the lectures, that shorter lectures could be presented, and that some material was too geared towards a UIUC audience.

III. Hands-on Practice

  N Strongly disagree (%) Disagree (%) Unsure (%) Agree (%) Strongly agree (%)
III1. The hands-on sessions were important for the learning process in the Workshop. 17       12 88
III2. The concrete examples in the hands-on tutorials increased my understanding of the lectures. 17     24 24 53
III3. The hands-on sessions were long enough. 17   12 12 18 59
III4. The hands-on sessions were coordinated with the lectures. 17   12 24 29 35
III5. There were sufficient instructions to proceed with the hands-on assignments. 17     6 35 59
III6. The supplied computers were adequate for the exercises. 17   6   41 53

Survey items III1-III6 asked about the hands-on practice provided in the workshop, specifically building clusters using the computers and other hardware and software supplied by the workshop.  All participants (100%) felt the hands-on sessions were important to the learning process, and majorities indicated that the hands-on sessions were long enough (76%), coordinated with the lectures (65%), and included sufficient instructions on how to proceed (94%).  Ratings also indicate that the concrete examples increased understanding of the lectures (76%) and that the supplied computers were adequate for the exercises (94%).

IV. General Organization

  N Strongly disagree (%) Disagree (%) Unsure (%) Agree (%) Strongly agree (%)
IV1. The number of participants was reasonable. 17       41 59
IV2. There were enough staff to help the participants. 17       29 71
IV3. There was sufficient information provided about the workshop via the website and e-mails. 17   6   41 53
IV4. Instructors were readily available for Q&A outside the lecture periods. 17     6 18 76
IV5. The Workshop web site was informative before the workshop started. 17   6   41 53

The general organization items, questions IV1-IV5 of the survey, asked about the basic structure and operation of the workshop.  All participants (100%) indicated that the number of participants in the workshop was reasonable and that there were enough staff to help participants. Majorities also indicated that sufficient information about the workshop was provided via the website and e-mails (94%), that the instructors were available for questions outside of the lecture periods (94%), and that the workshop web site was informative before the workshop started (94%).

V. Overall Satisfaction

  N Strongly disagree (%) Disagree (%) Unsure (%) Agree (%) Strongly agree (%)
V1. Overall technical support was good. 17       47 53
V2. Overall general support was good. 17       35 65
V3. The Workshop was well organized. 17       41 59
V4. The balance between lectures and hands-on sessions was optimal. 17   12 12 35 41
V5. The Workshop addressed my research needs. 17     24 47 29
V6. Overall, the Workshop met my expectations. 17       53 47

A final set of scale questions, items V1-V6, rated overall satisfaction.  All participants (100%) indicated that overall technical support was good, that overall general support was good, that the workshop was well organized, and that the workshop met their expectations.  Most participants indicated that the balance between lectures and hands-on sessions was optimal (76%) and that the workshop addressed their research needs (76%).

VI. General Comments

At the end of the survey, participants were asked to respond to two open-ended questions: "What suggestions do you have for improving the workshop?" and "What suggestions do you have for similar workshops?"  A summary of these comments is provided below; raw comments are available by writing workshop+cluster@ks.uiuc.edu.  Suggestions from participants included the following:

  • separate sessions for researchers and administrators
  • more time for the hands-on sessions
  • more info on sample hardware configurations, different applications, and how a job actually runs on a cluster
  • be clear that the workshop only covers parallel processing
  • change to a one-day schedule, avoid winter, support Clustermatic more, allow users to install their own programs
  • help participants cope with parking difficulties
  • consider a workshop on computational chemistry, or on strengths/weaknesses of UNIX/Linux operating system versions