General Evaluation of the Theoretical and Computational Biophysics Workshop held at the University of Western Australia, Perth

June 7-19, 2004

Questionnaire: Gila Budescu, TCB Group, UIUC; modified by David Brandon, TCB Group, UIUC
Analysis and report: Molly Punke, UIUC, and David Brandon, TCB Group, UIUC

UIUC's Theoretical and Computational Biophysics Group (TCBG), an NIH Resource for Macromolecular Modeling and Bioinformatics, is headed by Klaus Schulten, with co-PIs Z. Luthey-Schulten, L. Kale, and R. Skeel. As part of its outreach, the Resource offers workshops to introduce and transfer its programs and technological solutions to the biomedical community. The Resource participated in a two-week (June 7-12 and June 14-19) summer school sponsored by the Institute of Advanced Studies (IAS) at the University of Western Australia in Perth, Australia. The IAS provided facilities, internet access, and organization of the summer school, while the Resource provided instructors, lectures, tutorials, and 20 Apple G4 laptops loaded with the needed software and tutorial files.

Summer school lectures were given by K. Schulten (UIUC) and Z. Luthey-Schulten (UIUC). Two Resource graduate students and one graduate student from the UIUC School of Chemical Sciences accompanied the lecturers to the summer school and provided instructional support on-site. Tutorials and preparation of the laptops were provided by the graduate students and by other Resource staff, and the on-site tutorial sessions were led by the graduate students. The program of the summer school consisted of lectures and hands-on sessions, with an emphasis on the latter; students continuing in the school after the first week in particular had the opportunity for in-depth modeling of their favorite biological system.

At the end of the first week, students leaving the program were asked to complete a general evaluation questionnaire, and remaining and new students were asked to complete the same form at the end of the second week. The general evaluation form asks about topics such as outcomes of the program, ratings of attributes of lectures and tutorials, organization and communication, and so on. Participation in the evaluation was voluntary. Attendance varied between the two weeks, with 20 persons attending the first week and 18 the second; 11 persons attended both weeks, for a total of 27 individual participants. A total of 20 general evaluation forms were returned, for an overall response rate of 74%. Demographically, the education level of participants was very high: all attendees had a doctoral degree or were PhD candidates, with the exception of four Master's students.

The results of the general evaluation questionnaire are summarized in the tables below.

All responses in the following tables are reported as percentages, with each row adding up to 100%; a value of 0 means no respondent selected that category. Not all respondents answered all items; the number of responses (N) is shown next to each question. Column abbreviations: SD = Strongly Disagree, D = Disagree, U = Unsure, A = Agree, SA = Strongly Agree.
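For readers who want to reproduce the arithmetic, the short sketch below shows how raw response counts for one item would be converted into the row percentages and the "agreement" figures (Agree plus Strongly Agree) cited in the text. The counts used are hypothetical placeholders chosen to be consistent with item I.2's published percentages, not the survey's raw data.

```python
# Illustrative sketch only: convert raw Likert counts for one item into
# the row percentages used in the tables below.

CATEGORIES = ["Strongly Disagree", "Disagree", "Unsure", "Agree", "Strongly Agree"]

def row_percentages(counts):
    """Return (N, percentages), where the percentages sum to 100."""
    n = sum(counts)
    return n, [round(100 * c / n, 1) for c in counts]

# Hypothetical counts consistent with item I.2 (N = 20):
counts = [0, 0, 1, 12, 7]
n, pcts = row_percentages(counts)

print(f"N = {n}")
for label, pct in zip(CATEGORIES, pcts):
    print(f"{label}: {pct}%")

# "Agreement" as cited in the prose = Agree + Strongly Agree
print(f"Agreement: {pcts[3] + pcts[4]}%")
```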

Table I.  Distribution of Outcome Ratings

1. The Summer School broadened my understanding of concepts and principles in the field of Computational and Theoretical Biophysics.
   N = 20 | SD: 0 | D: 0 | U: 0 | A: 35 | SA: 65

2. The Summer School improved my ability to carry out original research in the field of Computational and Theoretical Biophysics.
   N = 20 | SD: 0 | D: 0 | U: 5 | A: 60 | SA: 35

3. The Summer School improved significantly my computational skills.
   N = 20 | SD: 0 | D: 20 | U: 15 | A: 40 | SA: 25

4. The Summer School taught me techniques directly applicable to my career.
   N = 20 | SD: 0 | D: 0 | U: 35 | A: 35 | SA: 30

5. The material presented in the Summer School was relevant to my research.
   N = 20 | SD: 0 | D: 5 | U: 25 | A: 30 | SA: 40

Items I.1-5 refer to desired outcomes.  All respondents agreed that the summer school broadened their understanding of the field.  Most respondents agreed that the school strengthened their research abilities (95%).  The majority of respondents agreed that the school significantly improved their computational skills (65%), that they acquired useful techniques (65%) and that the material was relevant to their own research (70%). 

Table II.  Distribution of Lecture Ratings

1. The instructors' knowledge of the subjects was good.
   N = 20 | SD: 0 | D: 0 | U: 0 | A: 15 | SA: 85

2. The instructors explained the material well.
   N = 19 | SD: 0 | D: 0 | U: 0 | A: 47.4 | SA: 52.6

3. The instructors provided real-world examples.
   N = 20 | SD: 0 | D: 0 | U: 0 | A: 25 | SA: 75

4. The instructors were prepared for the lectures.
   N = 20 | SD: 0 | D: 0 | U: 0 | A: 25 | SA: 75

5. The lectures were coordinated between instructors.
   N = 20 | SD: 0 | D: 0 | U: 0 | A: 45 | SA: 55

6. Lectures incorporated recent developments in the field.
   N = 20 | SD: 0 | D: 0 | U: 5 | A: 25 | SA: 70

7. The range of lectures captured the overall essentials of the field.
   N = 20 | SD: 0 | D: 5 | U: 20 | A: 30 | SA: 45

8. The level of the lectures was appropriate.
   N = 20 | SD: 0 | D: 5 | U: 15 | A: 50 | SA: 30

9. The underlying rationale of the techniques presented was clear.
   N = 20 | SD: 0 | D: 5 | U: 5 | A: 40 | SA: 50

10. We were exposed to a well representative range of techniques.
   N = 20 | SD: 0 | D: 0 | U: 15 | A: 50 | SA: 35

11. The instructors stimulated my intellectual curiosity.
   N = 20 | SD: 0 | D: 0 | U: 0 | A: 25 | SA: 75

Items II.1-11 address the level, scope, and quality of summer school lectures.  All respondents rated the speakers' knowledge as good, and also agreed that the instructors explained the material well, provided real-world examples, were prepared for lectures, and coordinated the lectures with other instructors.  There was agreement that the lectures incorporated recent developments (95%) and captured the field essentials (75%).  Most respondents found the level of the lectures to be appropriate (80%), felt that the rationale of the techniques was clear (90%), and thought the range of techniques presented was representative of the field (85%).  All respondents agreed that the instructors stimulated their intellectual curiosity.

Table III.  Distribution of Research Tutorial Ratings

1. The hands-on sessions were important for the learning process in the Summer School.
   N = 20 | SD: 0 | D: 0 | U: 0 | A: 10 | SA: 90

2. The instructors explained the material well.
   N = 20 | SD: 0 | D: 0 | U: 0 | A: 15 | SA: 85

3. The jobs we ran during the hands-on sessions were useful for understanding the material.
   N = 20 | SD: 0 | D: 5 | U: 0 | A: 35 | SA: 60

4. The concrete examples in the hands-on tutorials increased my understanding of the lectures.
   N = 20 | SD: 0 | D: 0 | U: 5 | A: 15 | SA: 80

5. The hands-on sessions were long enough.
   N = 20 | SD: 0 | D: 15 | U: 10 | A: 40 | SA: 35

6. The hands-on sessions were coordinated with the lectures.
   N = 20 | SD: 0 | D: 0 | U: 0 | A: 40 | SA: 60

7. TAs were well-prepared to answer questions.
   N = 20 | SD: 0 | D: 0 | U: 5 | A: 30 | SA: 65

8. The 'Model Your Own System' opportunity improved my understanding of the lectures.
   N = 14 | SD: 0 | D: 7.1 | U: 14.3 | A: 35.7 | SA: 42.9

Items III.1-8 deal with the level, quality, and scope of the hands-on tutorials. Ratings were very high (95% agreement or above) for the importance of the sessions to the school, the instructors' explanations of the material, the jobs run during the sessions, the examples used in the tutorials, the coordination of the tutorials with the lectures, and the ability of the TAs to answer questions.  A majority also agreed that the sessions were long enough (75%) and that the 'Model Your Own System' section of the school improved their understanding of the lectures (78.6%).

Table IV.  Distribution of Environment & Technical Resources Ratings

1. The Apple Powerbook G4s were adequate for the exercises.
   N = 20 | SD: 0 | D: 0 | U: 0 | A: 10 | SA: 90

2. The Apple Powerbook G4s ran smoothly.
   N = 20 | SD: 0 | D: 0 | U: 0 | A: 10 | SA: 90

3. It was easy to learn how to use the Apple Powerbook G4s.
   N = 20 | SD: 0 | D: 0 | U: 0 | A: 15 | SA: 85

4. The software used in the Summer School ran well on the Apple Powerbook G4s.
   N = 20 | SD: 0 | D: 0 | U: 5 | A: 15 | SA: 80

5. School access to the Internet was sufficient.
   N = 20 | SD: 0 | D: 0 | U: 0 | A: 10 | SA: 90

Items IV.1-5 address the effectiveness of the physical environment and technical support during the summer school.  Most of these items refer to the Apple Powerbook G4 laptop computers set up and taken to Australia for the school.  All respondents agreed that the laptops were adequate for the exercises, ran smoothly, and were easy to learn to use.  A very high percentage of respondents (95%) further agreed that the software used in the summer school ran well on the laptops.  All respondents felt the school provided sufficient access to the internet.

Table V.  Distribution of Communication & Dissemination Ratings

1. Instructors were readily available for Q&A outside the lecture periods.
   N = 20 | SD: 0 | D: 0 | U: 0 | A: 25 | SA: 75

2. The daily noon Q&A period was beneficial.
   N = 18 | SD: 5.6 | D: 5.6 | U: 22.2 | A: 33.3 | SA: 33.3

3. The Summer School web site was informative during the school period.
   N = 20 | SD: 5 | D: 5 | U: 45 | A: 25 | SA: 20

4. The online information was up-to-date.
   N = 20 | SD: 5 | D: 10 | U: 30 | A: 25 | SA: 30

5. The online material was well-organized.
   N = 20 | SD: 0 | D: 0 | U: 15 | A: 50 | SA: 35

Items V.1-5 were designed to assess the effectiveness of the school's communication and dissemination efforts and tools.  All respondents agreed that the instructors were available outside the lecture periods, and a majority agreed that the daily Q&A period was beneficial (66.7%).  However, only about half of the respondents agreed that the Summer School website was informative (45%) and that the online information was up-to-date (55%).  Most agreed that the online material was well organized (85%).

Table VI.  Distribution of General Organization Ratings

1. The number of participants was reasonable.
   N = 20 | SD: 0 | D: 0 | U: 0 | A: 65 | SA: 35

2. There were enough instructors to help the participants.
   N = 20 | SD: 0 | D: 0 | U: 0 | A: 35 | SA: 65

Items VI.1-2 evaluate the general organization of the school.  All respondents agreed that the number of participants was reasonable and that there were enough instructors to help the school participants.

Table VII.  Distribution of Satisfaction Ratings

1. Overall technical support was good.
   N = 19 | SD: 0 | D: 0 | U: 0 | A: 26.3 | SA: 73.7

2. Overall general support was good.
   N = 19 | SD: 0 | D: 0 | U: 0 | A: 42.1 | SA: 57.9

3. The Summer School was well organized.
   N = 19 | SD: 0 | D: 5.3 | U: 10.5 | A: 31.6 | SA: 52.6

4. The balance between lectures and hands-on sessions was optimal.
   N = 19 | SD: 0 | D: 0 | U: 10.5 | A: 63.2 | SA: 26.3

5. The Summer School addressed my research needs.
   N = 18 | SD: 0 | D: 5.6 | U: 22.2 | A: 50 | SA: 22.2

6. Overall, the Summer School met my expectations.
   N = 19 | SD: 0 | D: 15.8 | U: 5.3 | A: 42.1 | SA: 36.8

Items VII.1-6 were intended to measure the attendees' satisfaction across the various summer school components.  The results show that the participants were satisfied with the school.  All respondents agreed that both technical and general support were good, and most respondents felt that the school was well organized (84.2%) and had an optimal balance of lectures and hands-on sessions (89.5%).  A majority (72.2%) indicated the school addressed their research needs, and a majority (78.9%) felt that the summer school met their expectations.

Comments

Thirteen of the 20 respondents took the time to answer two open-ended items: 1) "What suggestions do you have for improving the summer school?" and 2) "What suggestions do you have for similar workshops?". Their comments, in their own words, can be requested from the summer school organizers by emailing brandon@ks.uiuc.edu.

While hard to quantify reliably and less than systematic, open-ended comments can provide insight into important matters that may be improved, including some not covered by the questionnaire and others beyond the control of the organizers. Generally, the comments touch on participants wanting more and/or earlier information about the summer school, wanting more or less coverage of particular topics, and compliments about the school.  Overall, the many narrative responses indicate that the participants cared enough about the school to take the time to write them down, which reflects a certain level of commitment and satisfaction on their part, regardless of whether the comments are positive or negative. The responses also shed more light on some participant expectations and provide new ideas to consider for future training efforts.