TCBG Cluster Building Workshop

Evaluation of the Theoretical and Computational Biophysics Group Cluster Workshop

Beckman Institute, Room 5602
April 20-21, 2006

Questionnaire: Gila Budescu, TCB Group, UIUC; modified by David Brandon, TCB Group, UIUC
Analysis and report: David Brandon, TCB Group, UIUC, and Kathryn Walsh, TCB Group, UIUC

The Theoretical and Computational Biophysics Group (TCBG) at UIUC, an NIH Resource for Macromolecular Modeling and Bioinformatics, is headed by Klaus Schulten, with co-PIs Z. Luthey-Schulten, E. Tajkhorshid, and A. Aksimentiev. As part of its outreach, the Resource offers workshops to introduce and transfer its programs and technological solutions to the biomedical community. The Resource presented a day-and-a-half workshop (April 20-21) on cluster building, held in Room 5602 of the Beckman Institute on the University of Illinois at Urbana-Champaign campus.  The program of the workshop consisted of lectures and hands-on sessions providing experience in building a cluster.  The instructors were Jim Phillips, Senior Research Programmer, and Tim Skirvin, Senior Systems Administrator, both of TCBG.

At the end of the second day of the workshop, participants were asked to complete a general evaluation survey.  The general evaluation form asked about topics such as outcomes of the program, ratings of attributes of the lectures and tutorials, organization and communication, and so on.  Participation in the evaluation was voluntary.  A total of 20 of 22 general evaluation forms were returned, for an overall response rate of 91%.  Demographically, education levels of participants were high; the majority of attendees were PhDs or PhD candidates.

All responses in the following tables are reported as percentages, with rows adding up to approximately 100% depending on rounding.  Not all respondents answered all items; the number of responses per item (N) is presented next to each question.  References to 'agreement' among respondents are calculated by adding together the percentages for the 'agree' and 'strongly agree' responses.
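As an illustration only (not part of the original report), the short sketch below shows how such per-item percentages and agreement figures could be computed from raw response counts; the counts used here are hypothetical, not actual survey data.

    # Illustrative sketch: computing per-item percentages and the 'agreement'
    # figure (agree + strongly agree) from raw response counts.
    # The example counts are hypothetical, not taken from the survey.

    RESPONSE_OPTIONS = ["strongly disagree", "disagree", "unsure", "agree", "strongly agree"]

    def summarize(counts):
        """Return (N, per-option percentages, agreement %) for one survey item."""
        n = sum(counts.values())
        percentages = {opt: round(100 * counts.get(opt, 0) / n) for opt in RESPONSE_OPTIONS}
        agreement = percentages["agree"] + percentages["strongly agree"]
        return n, percentages, agreement

    # Hypothetical counts for a single item answered by 20 respondents
    example_counts = {"disagree": 1, "unsure": 1, "agree": 7, "strongly agree": 11}
    n, pct, agree = summarize(example_counts)
    print(n, pct, agree)  # 20, {..., 'agree': 35, 'strongly agree': 55}, 90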

I. Outcome

(N = number of responses per item; percentages are shown as Strongly disagree / Disagree / Unsure / Agree / Strongly agree)

I1. The Workshop broadened my understanding of concepts and principles in designing and operating clusters.  N = 20:  0 / 5 / 5 / 35 / 55
I2. The Workshop improved my ability to set up a cluster.  N = 20:  0 / 0 / 5 / 20 / 75
I3. The Workshop improved significantly my computational skills.  N = 20:  5 / 15 / 35 / 20 / 25
I4. The Workshop taught me techniques directly applicable to my research and/or job/career.  N = 20:  5 / 15 / 10 / 50 / 25

Items I1-I4 refer to desirable outcomes for those attending the workshop.  A majority of participants indicated that the workshop broadened their understanding of cluster building concepts (90%), improved their ability to set up a cluster (95%), and taught them techniques directly applicable to their research and/or job/career (75%).  Nearly half (45%) of respondents indicated the workshop significantly improved their computational skills.

II. Lectures

(N = number of responses per item; percentages are shown as Strongly disagree / Disagree / Unsure / Agree / Strongly agree)

II1. The instructors' knowledge of the subjects was good.  N = 20:  0 / 0 / 0 / 25 / 75
II2. The instructors explained the material well.  N = 20:  0 / 0 / 0 / 30 / 70
II3. The instructors provided real-world examples.  N = 20:  0 / 0 / 5 / 25 / 70
II4. The instructors were prepared for the lectures.  N = 20:  0 / 0 / 0 / 30 / 70
II5. The lectures were coordinated between instructors.  N = 20:  0 / 0 / 0 / 30 / 70
II6. Lectures incorporated recent developments in the field.  N = 20:  0 / 0 / 10 / 40 / 50
II7. The range of lectures captured the overall essentials of the field.  N = 20:  0 / 0 / 10 / 55 / 35
II8. The level of the lectures was appropriate.  N = 20:  0 / 0 / 5 / 40 / 55
II9. The underlying rationale of the techniques presented was clear.  N = 20:  0 / 0 / 5 / 55 / 40
II10. The relevance of the lectures was strong.  N = 20:  0 / 0 / 5 / 45 / 50

Items II1-II10 evaluated the lectures given by the instructors.  Ratings were quite high: all participants (100%) agreed or strongly agreed that the instructors were knowledgeable, explained the material well, were prepared for the lectures, and coordinated their lectures, and 95% agreed that the instructors provided real-world examples.  Large majorities also agreed that the level of the lectures was appropriate (95%), that the underlying rationale of the techniques presented was clear (95%), that the range of lectures captured the essentials of the field (90%), and that the relevance of the lectures was strong (95%).

Lecture Comments

Participants were also given the opportunity to provide written comments about the lectures on day one and day two of the workshop. Here are some of the comments provided:

Day One Lecture:

  • "Excellent."
  • "Really good."

Day Two Lecture:

  • "Wonderful! Thanks!"
  • "Great lecture, answered questions & integrated to hands-on"

A further comment, covering both days, approved of the lectures but suggested that more detail be provided on the steps followed in the hands-on sessions.

III. Hands-on Practice

(N = number of responses per item; percentages are shown as Strongly disagree / Disagree / Unsure / Agree / Strongly agree)

III1. The hands-on sessions were important for the learning process in the Workshop.  N = 19:  0 / 0 / 0 / 32 / 68
III2. The concrete examples in the hands-on tutorials increased my understanding of the lectures.  N = 18:  0 / 6 / 11 / 22 / 61
III3. The hands-on sessions were long enough.  N = 18:  0 / 6 / 11 / 28 / 56
III4. The hands-on sessions were coordinated with the lectures.  N = 17:  0 / 0 / 18 / 35 / 47
III5. There were sufficient instructions to proceed with the hands-on assignments.  N = 19:  0 / 0 / 11 / 32 / 58
III6. The supplied computers were adequate for the exercises.  N = 18:  0 / 11 / 17 / 22 / 50

Survey items III1-III6 asked about the hands-on practice provided in the workshop, specifically the building of clusters using the computers and other hardware and software supplied for the workshop.  All participants felt the hands-on sessions were important for the learning process, and majorities (72-90%) indicated that the concrete examples increased their understanding of the lectures, that the sessions were long enough and coordinated with the lectures, that there were sufficient instructions to proceed with the assignments, and that the supplied computers were adequate for the exercises.

IV. General Organization

(N = number of responses per item; percentages are shown as Strongly disagree / Disagree / Unsure / Agree / Strongly agree)

IV1. The number of participants was reasonable.  N = 19:  0 / 0 / 0 / 37 / 63
IV2. There were enough staff to help the participants.  N = 19:  0 / 0 / 0 / 26 / 74
IV3. There was sufficient information provided about the workshop via the website and e-mails.  N = 17:  0 / 0 / 6 / 41 / 53
IV4. Instructors were readily available for Q&A outside the lecture periods.  N = 18:  0 / 0 / 0 / 33 / 67
IV5. The Workshop web site was informative before the workshop started.  N = 18:  0 / 0 / 11 / 17 / 72

The general organization items, questions IV1-IV5 of the survey, asked about the basic structure and operation of the workshop.  All participants indicated that the number of participants in the workshop was reasonable, that there were enough staff to help participants, and that the instructors were readily available for questions outside of the lecture periods.  A majority also indicated that there was sufficient information about the workshop via the website and e-mails (94%), though slightly fewer indicated the website was informative before the workshop started (89%).


V. Overall Satisfaction

(N = number of responses per item; percentages are shown as Strongly disagree / Disagree / Unsure / Agree / Strongly agree)

V1. Overall technical support was good.  N = 19:  0 / 5 / 0 / 37 / 58
V2. Overall general support was good.  N = 19:  0 / 0 / 5 / 26 / 68
V3. The Workshop was well organized.  N = 19:  0 / 0 / 0 / 21 / 79
V4. The balance between lectures and hands-on sessions was optimal.  N = 19:  0 / 0 / 5 / 26 / 68
V5. The Workshop addressed my research needs.  N = 18:  0 / 11 / 11 / 33 / 44
V6. Overall, the Workshop met my expectations.  N = 19:  0 / 0 / 5 / 32 / 63

A final set of scale questions, items V1-V6, rated overall satisfaction.  All participants indicated the workshop was well organized.  Large majorities (94-95%) indicated that overall technical support was good, that overall general support was good, that the balance between lectures and hands-on sessions was optimal, and that the workshop met expectations.  A majority (77%) indicated that the workshop addressed their research needs.

VI. General Comments

At the end of the survey, participants were asked to write responses to two open questions: "What suggestions do you have for improving the workshop?" and "What suggestions do you have for similar workshops?"  A summary of these comments is provided below; raw comments are available by writing to workshop+cluster@ks.uiuc.edu.  Suggestions from participants included the following:

  • more testing of hardware used prior to workshop
  • use newer machines
  • longer workshop
  • more hands-on
  • more real applications