Evaluation of the Summer School on Theoretical and Computational Biophysics - Questionnaire 1

Questionnaire: Gila Budescu, TCB group, UIUC
Analysis and report: Gila Budescu and David Brandon, TCB group, UIUC

Summer School feedback form (65 KB)
Summer School feedback form cover (56 KB)

At the end of the summer school, participants completed an evaluation questionnaire with the following sections:
  • I. Outcomes
  • II. Lectures
  • III. Hands-on tutorials
  • IV. Cluster Building
  • V. Environment & Technical Resources
  • VI. Communication & Dissemination
  • VII. General Organization
  • VIII. Satisfaction
  • IX. Comments

The full questionnaire is attached.

Of the 83 forms distributed to participants, 64 were completed and returned, yielding a response rate of 77.1%.

All responses in the following tables are reported as percentages, with each row summing to 100% (subject to rounding). Not all participants responded to every item; the number of responses (N) is shown for each item, and blank cells indicate that no respondent selected that category.
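As a brief illustration of the arithmetic behind the tables, the sketch below converts raw response counts into row percentages and into the combined "Agree + Strongly Agree" figures cited in the commentary. The counts used here are hypothetical, chosen only to be consistent with the published percentages for item I.1; they are not the actual survey data.

```python
# Sketch: derive the row percentages and combined agreement figures
# from raw Likert response counts. Counts below are hypothetical.

CATEGORIES = ["Strongly Disagree", "Disagree", "Unsure", "Agree", "Strongly Agree"]

def row_percentages(counts):
    """Each category's share of responses, rounded to one decimal place."""
    n = sum(counts)
    return [round(100 * c / n, 1) for c in counts]

def combined_agreement(counts):
    """Percentage of respondents choosing Agree or Strongly Agree."""
    n = sum(counts)
    return round(100 * (counts[3] + counts[4]) / n, 1)

# Hypothetical counts for N = 64: 0 SD, 2 D, 6 U, 25 A, 31 SA
counts = [0, 2, 6, 25, 31]
print(row_percentages(counts))    # [0.0, 3.1, 9.4, 39.1, 48.4]
print(combined_agreement(counts))  # 87.5
```

Note that rounding each cell independently is why some rows in the tables sum to 100.1% rather than exactly 100%.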

Table I. Distribution of Outcome Ratings
Item | N | Strongly Disagree | Disagree | Unsure | Agree | Strongly Agree
1. The Summer School broadened my understanding of concepts and principles in the field of Computational and Theoretical Biophysics. | 64 | | 3.1 | 9.4 | 39.1 | 48.4
2. The Summer School improved my ability to carry out original research in the field of Computational and Theoretical Biophysics. | 61 | 3.2 | | 15.9 | 42.9 | 38.1
3. The Summer School improved significantly my computational skills. | 61 | 3.2 | 17.5 | 28.6 | 31.7 | 19.0
4. The Summer School taught me techniques directly applicable to my career. | 64 | | 4.7 | 9.4 | 45.3 | 40.6
5. The material presented in the Summer School was relevant to my research. | 64 | | 4.7 | 7.8 | 45.3 | 42.2

Items I.1-5 refer to desired outcomes. Most respondents agreed that the summer school broadened their understanding of the field (87.5%) and strengthened their research abilities (81.0%). Only 50.7% agreed that their computational skills improved, a lower figure that may reflect the participants' heterogeneous levels of computational expertise. The majority of respondents agreed that they had acquired useful techniques (85.9%) and that the material was relevant to their own research (87.5%).

Table II. Distribution of Lecture Ratings
Item | N | Strongly Disagree | Disagree | Unsure | Agree | Strongly Agree
1. The instructors' knowledge of the subjects was good. | 64 | | 1.6 | 1.6 | 35.9 | 60.9
2. The instructors explained the material well. | 64 | | 3.1 | 9.4 | 60.9 | 26.6
3. The instructors provided real-world examples. | 64 | | 1.6 | 4.7 | 46.9 | 46.9
4. The instructors were prepared for the lectures. | 64 | | 1.6 | 1.6 | 31.3 | 65.6
5. The lectures were coordinated between instructors. | 63 | | 3.2 | 12.7 | 42.9 | 41.3
6. Lectures incorporated recent developments in the field. | 63 | | | 6.3 | 39.7 | 54.0
7. The range of lectures captured the overall essentials of the field. | 64 | | 3.1 | 21.9 | 37.5 | 37.5
8. The level of the lectures was appropriate. | 64 | 1.6 | 6.3 | 20.3 | 48.4 | 23.4
9. The underlying rationale of the techniques presented was clear. | 64 | | 6.3 | 18.8 | 40.6 | 34.4
10. We were exposed to a well representative range of techniques. | 64 | | 1.6 | 6.3 | 46.9 | 45.3
11. The instructors stimulated my intellectual curiosity. | 64 | 1.6 | 1.6 | 1.6 | 45.3 | 50.0

Items II.1-11 address the level, scope, and quality of the summer school lectures. Most respondents rated the speakers' knowledge (96.8%) and their explanation of the material (87.5%) highly. Real-world examples were judged sufficient (93.8%), preparedness was rated high (96.9%), and coordination between instructors was considered good (84.2%). Most respondents felt that the lectures incorporated recent developments (93.7%) and captured the essentials of the field (75.0%). Although the participants' backgrounds were quite heterogeneous in knowledge and expertise, most thought that the level of the lectures was appropriate (71.8%), that the rationale of the techniques was well presented (75.0%), and that the range of techniques was representative (92.2%). There was broad agreement that the speakers succeeded in stimulating intellectual curiosity (95.3%).

Table III. Distribution of Research Tutorial Ratings
Item | N | Strongly Disagree | Disagree | Unsure | Agree | Strongly Agree
1. The hands-on sessions were important for the learning process in the Summer School. | 64 | | 1.6 | 3.1 | 25.0 | 70.3
2. The Model Your Own System opportunity improved my understanding of the lectures. | 60 | 1.7 | 10.0 | 25.0 | 35.0 | 28.3
3. The jobs we ran during the hands-on sessions were useful for understanding the material. | 64 | | 3.1 | 9.4 | 51.6 | 35.9
4. The concrete examples in the hands-on tutorials increased my understanding of the lectures. | 63 | | | 7.9 | 49.2 | 42.9
5. The hands-on sessions were long enough. | 64 | 7.8 | 23.4 | 17.2 | 35.9 | 15.6
6. The hands-on sessions were coordinated with the lectures. | 64 | | | 4.7 | 42.2 | 53.1
7. TAs were well-prepared to answer questions. | 64 | 1.6 | 3.1 | 12.5 | 35.9 | 46.9
8. There were sufficient instructions to proceed with the hands-on assignments. | 63 | | | 6.3 | 30.2 | 63.5

Items III.1-8 deal with the level, quality, and scope of the hands-on tutorials. Except for items III.2 and III.5, the ratings were high. Responses to these two lower-rated items (63.3% and 51.5% agreement, respectively) indicate that participants were not highly satisfied with the Model Your Own System opportunity or with the length of the hands-on sessions. Judging from occasional verbal comments and some of the written comments below, many participants felt that these sessions were too short, that the lack of reasonable Internet access was an obstacle in this context, and that some participants may not have been skilled enough to benefit fully from these sessions.

Table IV. Distribution of Cluster Building Sessions Ratings
Item | N | Strongly Disagree | Disagree | Unsure | Agree | Strongly Agree
1. The cluster-building lecture was useful. | 56 | | 1.8 | 10.7 | 37.5 | 50.0
2. The cluster-building exercises were useful. | 54 | | 1.9 | 11.1 | 25.9 | 61.1
3. I think that I could now build my own cluster. | 54 | 1.9 | 1.9 | 22.2 | 40.7 | 33.3

Items IV.1-3 assess the special cluster-building sessions. The ratings indicate high agreement that both the lecture and the exercises were beneficial.

Table V. Distribution of Environment & Technical Resources Ratings
Item | N | Strongly Disagree | Disagree | Unsure | Agree | Strongly Agree
1. The Sun computer lab was adequate for the exercises (if N/A go to V.3). | 56 | 5.4 | 3.6 | 5.4 | 41.1 | 44.6
2. The computers in the Sun lab ran smoothly (if N/A go to V.3). | 55 | 1.8 | | 7.3 | 43.6 | 47.3
3. The Linux computer lab was adequate for the exercises. | 54 | 1.9 | | 5.6 | 18.5 | 74.1
4. The computers in the Linux lab ran smoothly. | 54 | | 3.7 | 9.3 | 27.8 | 59.3
5. The access to NCSA resources was valuable. | 64 | | 3.1 | 9.4 | 40.6 | 46.9
6. The lecture room (auditorium) was conducive to learning. | 64 | | 1.6 | | 35.9 | 62.5
7. School access to the Internet was sufficient. | 64 | 28.1 | 28.1 | 14.1 | 15.6 | 14.1
8. BioCoRE was helpful in submitting jobs to NCSA. | 56 | 7.1 | 10.7 | 55.4 | 21.4 | 5.4

Items V.1-8 address the effectiveness of the physical environment and technical support during the summer school. All items are rated highly except V.7 and V.8: only 29.7% of participants agreed that their access to the Internet on campus was sufficient, and only 26.8% found BioCoRE helpful in submitting jobs to NCSA. The reasons for the latter are complex and may be attributed equally to BioCoRE and to the general networking infrastructure on campus.

Table VI. Distribution of Communication & Dissemination Ratings
Item | N | Strongly Disagree | Disagree | Unsure | Agree | Strongly Agree
1. Instructors were readily available for Q&A outside the lecture periods. | 64 | | 3.1 | 10.9 | 32.8 | 53.1
2. The daily noon Q&A period was beneficial. | 64 | | 4.7 | 12.5 | 40.6 | 42.2
3. The Summer School web site was informative before the school started. | 64 | | 3.1 | 14.1 | 48.4 | 34.4
4. The Summer School web site was informative during the school period. | 64 | | | 7.8 | 45.3 | 46.9
5. The online information was up-to-date. | 62 | | | 9.7 | 37.1 | 53.2
6. The online material was organized. | 64 | | | 4.7 | 39.1 | 56.3
7. BioCoRE facilitated interactions among students. | 60 | 6.7 | 5.0 | 38.3 | 31.7 | 18.3
8. BioCoRE facilitated interactions between students and School staff. | 60 | 6.7 | 3.3 | 45.0 | 30.0 | 15.0

Items VI.1-8 were designed to assess the effectiveness of the school's communication and dissemination efforts and tools. All items except VI.7 and VI.8 are rated highly. Only half of the participants agreed that BioCoRE facilitated interactions among students (50.0%) or between students and School staff (45.0%), though many were unsure. As in section V, the reasons for this are complex and may be attributed equally to BioCoRE and to the general networking infrastructure on campus.

Table VII. Distribution of General Organization Ratings
Item | N | Strongly Disagree | Disagree | Unsure | Agree | Strongly Agree
1. The number of participants was reasonable. | 64 | | 3.1 | 6.3 | 56.3 | 34.4
2. The cost of the School was reasonable. | 64 | | | 3.1 | 28.1 | 68.8
3. There were enough TAs and support staff to help the participants. | 64 | | 3.1 | 1.6 | 34.4 | 60.9
4. The banquet enhanced the Summer School experience. | 60 | | 1.7 | 11.7 | 33.3 | 53.3
5. The bus trip to Chicago was fun (if N/A go to VII.1). | 37 | | | 18.9 | 32.4 | 48.6

Items VII.1-5 evaluate the general organization of the school; all were rated highly.

Table VIII. Distribution of Satisfaction Ratings
Item | N | Strongly Disagree | Disagree | Unsure | Agree | Strongly Agree
1. Overall technical support was good. | 64 | | | | 43.8 | 56.3
2. Overall general support was good. | 64 | | | 3.1 | 37.5 | 59.4
3. The Summer School was well organized. | 64 | | | 1.6 | 32.8 | 65.6
4. The balance between lectures and hands-on sessions was optimal. | 64 | 1.6 | 6.3 | 14.1 | 35.9 | 42.2
5. The Summer School addressed my research needs. | 64 | | 4.7 | 10.9 | 51.6 | 32.8
6. Overall, the Summer School met my expectations. | 64 | 1.6 | | 3.1 | 37.5 | 57.8

Items VIII.1-6 were intended to measure the attendees' satisfaction with the various summer school components. The results show that the participants were highly satisfied. The only item drawing somewhat less enthusiasm concerned the balance between lectures and tutorials (VIII.4). Ratings in previous sections and the comments below suggest that participants would have liked more hands-on and practical sessions in the program.

39 of the 64 respondents took the time to answer the two open-ended items. Their comments, in their own words, can be requested from the summer school organizers by emailing sumschool03@ks.uiuc.edu.

Although hard to quantify reliably and less than systematic, the substantial number of comments offers insight into matters that could be improved, as well as into issues not covered by the questionnaire or beyond the organizers' control. Overall, the many narrative responses indicate that the participants cared enough about the school to write them down, which reflects a certain level of commitment and engagement regardless of whether the comments are positive or negative. They also shed more light on participant expectations and provide new ideas to consider for future training efforts.