TCB Cluster Building Workshop

Evaluation of the Theoretical and Computational Biophysics Group Cluster Workshop

Beckman Institute, Room 5602
November 10-11, 2005

Questionnaire: Gila Budescu, TCB Group, UIUC; modified by David Brandon, TCB Group, UIUC
Analysis and report: David Brandon, TCB Group, UIUC, and Kathryn Walsh, TCB Group, UIUC

The Theoretical and Computational Biophysics Group (TCBG) at UIUC, an NIH Resource for Macromolecular Modeling and Bioinformatics, is headed by Klaus Schulten, with co-PIs Z. Luthey-Schulten, E. Tajkhorshid, and A. Aksimentiev. As part of its outreach, the Resource offers workshops to introduce and transfer its programs and technological solutions to the biomedical community. The Resource presented a day-and-a-half workshop on cluster building (November 10-11), held in Room 5602 of the Beckman Institute on the University of Illinois at Urbana-Champaign campus. The program consisted of lectures and hands-on sessions providing experience in building a cluster. Instructors at the workshop were Jim Phillips, Senior Research Programmer, and Tim Skirvin, Senior Systems Administrator, both of the TCBG.

At the end of the second day of the workshop, participants were asked to complete a general evaluation survey. The general evaluation form asked about topics such as outcomes of the program, ratings of attributes of lectures and tutorials, organization and communication, and so on; click here to see the form used. Participation in the evaluation was voluntary. A total of 22 of the 24 general evaluation forms distributed were returned, an overall response rate of 91.7%. Education levels of participants were high; the majority of attendees were PhDs or PhD candidates.

All responses in the following tables are reported as percentages; each row adds up to approximately 100%, subject to rounding. Not all respondents answered every item; the number of responses per item (N) is presented next to each question. 'Agreement' among respondents is calculated by adding together the percentages for the 'agree' and 'strongly agree' responses.
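
The tabulation described above can be illustrated with a short Python sketch. The raw counts used here are hypothetical (the report gives only percentages, not counts); they are chosen so that the output reproduces the figures for item I1 in the first table below.

    # Minimal sketch of the tabulation used in this report.
    # The raw counts are assumed for illustration only.

    RATINGS = ["Strongly disagree", "Disagree", "Unsure", "Agree", "Strongly agree"]

    def tabulate(counts):
        """Return N, per-rating percentages (one decimal place), and the
        'agreement' figure (agree + strongly agree percentages)."""
        n = sum(counts.values())
        pct = {r: round(100.0 * counts.get(r, 0) / n, 1) for r in RATINGS}
        agreement = round(pct["Agree"] + pct["Strongly agree"], 1)
        return n, pct, agreement

    # Hypothetical counts consistent with item I1: 2 unsure, 5 agree, 15 strongly agree.
    n, pct, agreement = tabulate({"Unsure": 2, "Agree": 5, "Strongly agree": 15})
    print(n, pct, agreement)  # 22, {..., 'Unsure': 9.1, 'Agree': 22.7, 'Strongly agree': 68.2}, 90.9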

I. Outcome

Item | N | Strongly disagree (%) | Disagree (%) | Unsure (%) | Agree (%) | Strongly agree (%)
I1. The Workshop broadened my understanding of concepts and principles in designing and operating clusters. | 22 | 0 | 0 | 9.1 | 22.7 | 68.2
I2. The Workshop improved my ability to set up a cluster. | 22 | 0 | 0 | 18.2 | 9.1 | 72.7
I3. The Workshop improved significantly my computational skills. | 22 | 9.1 | 22.7 | 31.8 | 31.8 | 4.5
I4. The Workshop taught me techniques directly applicable to my research and/or job/career. | 22 | 0 | 4.5 | 13.6 | 54.5 | 27.3

Items I1-I4 refer to desirable outcomes for those attending the workshop. Large majorities of participants indicated that the workshop broadened their understanding of cluster-building concepts (90.9%), improved their ability to set up a cluster (81.8%), and taught them techniques directly applicable to their research and/or job/career (81.8%). Over a third (36.3%) of respondents indicated that the workshop significantly improved their computational skills.

II. Lectures

Item | N | Strongly disagree (%) | Disagree (%) | Unsure (%) | Agree (%) | Strongly agree (%)
II1. The instructors’ knowledge of the subjects was good. | 22 | 0 | 0 | 4.5 | 18.2 | 77.3
II2. The instructors explained the material well. | 22 | 0 | 0 | 4.5 | 31.8 | 63.6
II3. The instructors provided real-world examples. | 21 | 0 | 0 | 4.8 | 38.1 | 57.1
II4. The instructors were prepared for the lectures. | 21 | 0 | 0 | 9.5 | 14.3 | 76.2
II5. The lectures were coordinated between instructors. | 21 | 0 | 0 | 4.8 | 19 | 76.2
II6. Lectures incorporated recent developments in the field. | 21 | 0 | 0 | 19 | 23.8 | 57.1
II7. The range of lectures captured the overall essentials of the field. | 21 | 0 | 0 | 33.3 | 23.8 | 42.9
II8. The level of the lectures was appropriate. | 22 | 0 | 4.5 | 13.6 | 27.3 | 54.5
II9. The underlying rationale of the techniques presented was clear. | 22 | 0 | 0 | 4.5 | 50 | 45.5
II10. The relevance of the lectures was strong. | 20 | 0 | 0 | 5 | 25 | 70

Items II1-II10 evaluated the lectures given by the instructors. Ratings were quite high, with nearly all participants (90-96%) agreeing or strongly agreeing that the instructors were knowledgeable, explained the material well, provided real-world examples, were prepared for the lectures, and coordinated their lectures. Agreement was also high (80-96%) that the lectures incorporated recent developments in the field, were at the right level for the audience, presented a clear rationale for the techniques, and were relevant to the topic of the workshop. Agreement that the range of lectures captured the overall essentials of the field was lower (66.7%), with a third of respondents unsure.

Lecture Comments

Participants were also given the opportunity to provide written comments about the lectures on day one and day two of the workshop. Few took advantage of this opportunity, and some of the responses received were not about the lectures. The comments provided were:

Day One Lecture:

  • "Good intro for a novice like me."

  • "Great job. Good setup of ideas."

Day Two Lecture:

  • "Good quality details. Would be also interesting to mention the “big boys” clusters on there and implementation details on them."
  • "This got even more detailed and more helpful."

A further comment, covering both days, approved of the lectures but suggested providing more detail on the steps followed in the hands-on sessions.

III. Hands-on Practice
 

Item | N | Strongly disagree (%) | Disagree (%) | Unsure (%) | Agree (%) | Strongly agree (%)
III1. The hands-on sessions were important for the learning process in the Workshop. | 20 | 0 | 0 | 0 | 15 | 85
III2. The concrete examples in the hands-on tutorials increased my understanding of the lectures. | 20 | 0 | 0 | 10 | 15 | 75
III3. The hands-on sessions were long enough. | 20 | 0 | 5 | 15 | 15 | 65
III4. The hands-on sessions were coordinated with the lectures. | 20 | 0 | 0 | 15 | 35 | 50
III5. There were sufficient instructions to proceed with the hands-on assignments. | 19 | 0 | 0 | 15.8 | 21.1 | 63.2
III6. The supplied computers were adequate for the exercises. | 20 | 0 | 0 | 0 | 30 | 70

Survey items III1-III6 asked about the hands-on practice provided in the workshop, specifically the building of clusters using the computers and other hardware and software supplied by the workshop. All participants (100%) felt the hands-on sessions were important for the learning process in the workshop, and majorities (80-90%) indicated that the concrete examples increased their understanding of the lectures, and that the hands-on sessions were long enough, coordinated with the lectures, and included sufficient instructions on how to proceed. All participants (100%) also found the supplied computers adequate for the exercises.

IV. General Organization

Item | N | Strongly disagree (%) | Disagree (%) | Unsure (%) | Agree (%) | Strongly agree (%)
IV1. The number of participants was reasonable. | 20 | 0 | 0 | 0 | 15 | 85
IV2. There were enough staff to help the participants. | 20 | 0 | 0 | 0 | 20 | 80
IV3. There was sufficient information provided about the workshop via the website and e-mails. | 20 | 0 | 0 | 5 | 20 | 75
IV4. Instructors were readily available for Q&A outside the lecture periods. | 19 | 0 | 0 | 5.3 | 21.1 | 73.7
IV5. The Workshop web site was informative before the school started. | 19 | 0 | 0 | 5.3 | 5.3 | 89.5

The general organization items, questions IV1-IV5 of the survey, asked about the basic structure and operation of the workshop. All participants (100%) indicated that the number of participants in the workshop was reasonable and that there were enough staff to help the participants. Nearly all participants (~95%) indicated that the workshop website and e-mails provided sufficient information, that instructors were readily available for questions outside of lecture periods, and that the workshop website was informative before the workshop started.

V. Overall Satisfaction

Item | N | Strongly disagree (%) | Disagree (%) | Unsure (%) | Agree (%) | Strongly agree (%)
V1. Overall technical support was good. | 20 | 0 | 0 | 0 | 20 | 80
V2. Overall general support was good. | 20 | 0 | 0 | 0 | 15 | 85
V3. The Workshop was well organized. | 20 | 0 | 0 | 0 | 30 | 70
V4. The balance between lectures and hands-on sessions was optimal. | 20 | 0 | 0 | 5 | 40 | 55
V5. The Workshop addressed my research needs. | 20 | 5 | 0 | 10 | 40 | 45
V6. Overall, the Workshop met my expectations. | 20 | 0 | 0 | 0 | 5 | 95

A final set of scale questions, items V1-V6, rated overall satisfaction. All participants (100%) indicated that overall technical support was good, that overall general support was good, that the workshop was well organized, and that the workshop met their expectations. Most participants (85-95%) indicated that the balance between lectures and hands-on sessions was optimal, and that the workshop addressed their research needs.

VI. General Comments

At the end of the survey, participants were asked to respond to two open questions: "What suggestions do you have for improving the workshop?" and "What suggestions do you have for similar workshops?" A summary of these comments is provided below; raw comments are available by writing to workshop+cluster@ks.uiuc.edu. Suggestions from participants included the following:

  • a longer workshop
  • more technical discussions
  • more detailed step-by-step instructions
  • allowing participants to bring their own source code