TCB Cluster Building Workshop

Evaluation of the Theoretical and Computational Biophysics Group Cluster Workshop

Beckman Institute, Room 5602
September 22-23, 2005

Questionnaire: Gila Budescu, TCB Group, UIUC; modified by David Brandon, TCB Group, UIUC
Analysis and report: David Brandon, TCB Group, UIUC, and Kathryn Walsh, TCB Group, UIUC

The UIUC's Theoretical and Computational Biophysics Group (TCBG), an NIH Resource for Macromolecular Modeling and Bioinformatics, is headed by Klaus Schulten, with Co-PIs Z. Luthey-Schulten, E. Tajkhorshid, and A. Aksimentiev. As part of its outreach, the Resource offers workshops to introduce and transfer its programs and technological solutions to the biomedical community. The Resource held a day-and-a-half workshop (September 22-23) on cluster building in Room 5602 of the Beckman Institute on the University of Illinois at Urbana-Champaign campus. The workshop program consisted of lectures and hands-on sessions providing experience in building a cluster. The instructors were Jim Phillips, a Senior Research Programmer in the TCBG, and Tim Skirvin, Senior Systems Administrator in the TCBG.

At the end of the second day of the workshop, participants were asked to complete a general evaluation survey. The evaluation form asked about topics such as outcomes of the program, ratings of attributes of lectures and tutorials, and organization and communication. Participation in the evaluation was voluntary. A total of 18 general evaluation forms were returned, an overall response rate of 78%. Demographically, education levels of participants were high; the majority of attendees held a PhD, were PhD candidates, or held a Master's degree.

All responses in the following tables are reported as percentages; rows sum to approximately 100%, subject to rounding. Not all respondents answered every item; the number of responses per item (N) is shown next to each question. 'Agreement' among respondents is calculated by adding together the percentages for the 'agree' and 'strongly agree' responses.
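The agreement calculation described above can be sketched in a few lines of Python. This is an illustrative example, not part of the original analysis; the sample percentages are those reported for item I1 in the Outcome table.

```python
# Illustrative sketch of the "agreement" calculation used in this report:
# agreement = 'agree' percentage + 'strongly agree' percentage for an item.

def agreement(percentages):
    """Sum the 'agree' and 'strongly agree' percentages for one survey item."""
    return percentages.get("agree", 0) + percentages.get("strongly agree", 0)

# Response percentages reported for item I1 (broadened understanding);
# the 'strongly disagree' cell was empty, i.e. no responses in that category.
i1 = {"disagree": 6, "unsure": 6, "agree": 33, "strongly agree": 56}
print(agreement(i1))  # 89, the agreement figure cited for item I1
```

Note that because each category percentage is already rounded, a row may sum to slightly more or less than 100%.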

I. Outcome

  | N | Strongly disagree % | Disagree % | Unsure % | Agree % | Strongly agree %
I1. The Workshop broadened my understanding of concepts and principles in designing and operating clusters. | 18 | | 6 | 6 | 33 | 56
I2. The Workshop improved my ability to set up a cluster. | 18 | 6 | 6 | 6 | 22 | 61
I3. The Workshop improved significantly my computational skills. | 18 | 22 | 11 | 22 | 33 | 11
I4. The Workshop taught me techniques directly applicable to my research and/or job/career. | 18 | | 11 | 17 | 39 | 33

Items I1-I4 refer to desirable outcomes for those attending the workshop.  A majority of participants indicated that the workshop broadened their understanding of cluster building concepts (89%), improved their ability to set up a cluster (83%), and taught them techniques directly applicable to their research and/or job/career (72%).  Nearly half (44%) of respondents indicated the workshop significantly improved their computational skills.

II. Lectures

  | N | Strongly disagree % | Disagree % | Unsure % | Agree % | Strongly agree %
II1. The instructors' knowledge of the subjects was good. | 18 | 6 | | | 6 | 89
II2. The instructors explained the material well. | 18 | 6 | | | 22 | 72
II3. The instructors provided real-world examples. | 18 | 6 | | | 22 | 72
II4. The instructors were prepared for the lectures. | 18 | 6 | | | 22 | 72
II5. The lectures were coordinated between instructors. | 18 | 6 | | | 17 | 78
II6. Lectures incorporated recent developments in the field. | 18 | | 6 | 6 | 44 | 44
II7. The range of lectures captured the overall essentials of the field. | 18 | | 6 | 6 | 39 | 50
II8. The level of the lectures was appropriate. | 18 | | 6 | 6 | 44 | 44
II9. The underlying rationale of the techniques presented was clear. | 18 | | 6 | 6 | 39 | 50
II10. The relevance of the lectures was strong. | 18 | 6 | | 6 | 28 | 61

Items II1-II10 evaluated the lectures given by the instructors. Ratings were quite high, with nearly all participants (94-95%) agreeing or strongly agreeing that the instructors were knowledgeable, explained the material well, provided real-world examples, were prepared for the lectures, and coordinated their lectures. Agreement was nearly as high (88-89%) that the lectures incorporated recent developments in the field, captured the essentials of the field, were at the right level for the audience, presented a clear rationale, and were relevant to the topic of the workshop.

Lecture Comments

Participants were also given the opportunity to provide written comments about the lectures on day one and day two of the workshop.  Few took advantage of this opportunity, and some of the responses were not about the lectures.  Here are the comments provided:

Day One Lecture:

  • "Very good perspective of introductory concepts."

  • "Nice introduction.  Lecturers enjoy the subject, which make it an interesting lecture."

Day Two Lecture:

  • "Good details lecture.  It may need additional details on SGE stuff and on the way things actually work.  However, due to time constraints, I think this is pretty good.  I mean, we were able to build a cluster in one day!"

A further comment on both days approved of the lectures, but suggested more details on the steps followed in the hands-on session be provided.

III. Hands-on Practice

  | N | Strongly disagree % | Disagree % | Unsure % | Agree % | Strongly agree %
III1. The hands-on sessions were important for the learning process in the Workshop. | 18 | 6 | | 6 | 22 | 67
III2. The concrete examples in the hands-on tutorials increased my understanding of the lectures. | 18 | 6 | 6 | 17 | 22 | 50
III3. The hands-on sessions were long enough. | 18 | 6 | 11 | | 22 | 61
III4. The hands-on sessions were coordinated with the lectures. | 18 | | 6 | 11 | 33 | 50
III5. There were sufficient instructions to proceed with the hands-on assignments. | 17 | | | 18 | 35 | 47
III6. The supplied computers were adequate for the exercises. | 18 | 11 | 6 | 33 | 11 | 39

Survey items III1-III6 asked about the hands-on practice provided in the workshop, specifically the building of clusters using the computers and other hardware and software provided by the workshop.  Nearly all participants (89%) felt the hands-on sessions were important to the learning process in the workshop, and majorities (82-83%) indicated that the hands-on sessions were long enough, coordinated with the lectures, and included sufficient instructions on how to proceed.  Slightly less support (72%) was found for the tutorials increasing understanding of the lectures, and half (50%) of the participants indicated that the supplied computers were adequate for the exercises.

IV. General Organization

  | N | Strongly disagree % | Disagree % | Unsure % | Agree % | Strongly agree %
IV1. The number of participants was reasonable. | 18 | 6 | | | 28 | 67
IV2. There were enough staff to help the participants. | 18 | 6 | | 11 | 39 | 44
IV3. There was sufficient information provided about the workshop via the website and e-mails. | 18 | | 11 | | 28 | 61
IV4. Instructors were readily available for Q&A outside the lecture periods. | 18 | | 6 | | 28 | 67
IV5. The Workshop web site was informative before the workshop started. | 18 | | 11 | 11 | 33 | 44

The general organization items, questions IV1-IV5 of the survey, asked about the basic structure and operation of the workshop.  Nearly all participants (95%) indicated that the number of participants in the workshop was reasonable and that the instructors were available for questions outside of the lecture periods.  A majority also indicated that there were enough staff to help participants (83%) and that sufficient information about the workshop was provided via the website and e-mails (89%), though slightly fewer (77%) indicated the website was informative before the workshop started.

V. Overall Satisfaction

  | N | Strongly disagree % | Disagree % | Unsure % | Agree % | Strongly agree %
V1. Overall technical support was good. | 18 | 6 | | 6 | 28 | 61
V2. Overall general support was good. | 18 | 6 | | | 28 | 67
V3. The Workshop was well organized. | 18 | 6 | | 11 | 33 | 50
V4. The balance between lectures and hands-on sessions was optimal. | 18 | | 11 | 11 | 28 | 50
V5. The Workshop addressed my research needs. | 18 | | 6 | 17 | 39 | 39
V6. Overall, the Workshop met my expectations. | 18 | | 6 | 11 | 22 | 61

A final set of scale questions, items V1-V6, rated overall satisfaction.  A majority of participants (89%) indicated overall technical support was good, and nearly all participants (95%) indicated overall general support was good.  A majority (83%) indicated that the workshop was well organized and that the workshop met their expectations.  Most participants (78%) indicated that the balance between lectures and hands-on sessions was optimal and that the workshop addressed their research needs.

VI. General Comments

At the end of the survey, participants were asked to write responses to two open questions: "What suggestions do you have for improving the workshop?" and "What suggestions do you have for similar workshops?"  A summary of these comments is provided below; raw comments are available by writing workshop+cluster@ks.uiuc.edu.  Suggestions from participants include the following:

  • more testing of hardware used prior to workshop
  • workshop for advanced students
  • longer workshop or earlier start
  • more details earlier at website
  • more detailed notes