INSTRUCTION THAT FITS THE LEARNER: SCIENCE CONTENT FOR UNDER-PREPARED TEACHERS

GARY K. WEBBER

Kansas Collaborative for Excellence in Teacher Preparation

University of Kansas, Lawrence, KS 66045

gwebber@ku.edu

 

The Kansas Collaborative for Excellence in Teacher Preparation (KCETP) was asked to develop a physics course for middle-level teachers wishing to improve their content knowledge and inquiry skills.  The teachers stated they wanted a deep understanding of physics concepts relevant to their curricula, modeling of inquiry learning, a class held at a location and time that fit their schedule, and graduate credit in the content area.  KCETP was able to design and deliver a course that met their needs, using a combination of Peer Instruction, Modeling Physics, an electronic student response system, and co-teachers from both physics and education backgrounds.  Significant gains in achievement on the Force Concept Inventory and high ratings on the effectiveness of instructional methods from the participants indicated that this format offers considerable advantages for adult learners over a more conventional lecture/lab format.

 

Background

During the fall semester of 2002, one of our partner school districts came to KCETP with a request to help their under-prepared middle school science teachers.  Many of the eighth grade science teachers in the district had moved up from the elementary grades, and needed to improve their content knowledge in physics and chemistry.  Only a few of these teachers had taken physics at the university level, and even these teachers were feeling poorly prepared for the curriculum in their schools.  In addition, many were struggling with implementing inquiry-based laboratory experiences for their students, a district priority.  Representatives from the School of Education, the Physics Department, and KCETP met with the district science coordinator to discuss designing a physics content course, and she shared a list of requirements that had been developed by the teachers.  The teachers felt strongly that the course needed to:

·        provide a deep understanding of the physics concepts relevant to their curricula;

·        model inquiry learning;

·        be held at a location and time that fit their schedules; and

·        offer graduate credit in the content area.

 

 

Research shows that these teachers were well justified in demanding this type of format.  The introductory physics courses offered at most universities and colleges are not specifically designed for teachers [1].  Most teachers take an algebra-based physics course that often leaves them skilled at plugging numbers into equations, but does not challenge the misconceptions they bring into the classroom [2].  This gives them the tools to do well on tests that measure their skill at equation manipulation, but leaves them unprepared to identify, challenge, and correct the misconceptions of their students.  In addition, most introductory science courses are primarily lecture with weekly labs, a format that is not effective in overcoming physics misconceptions [3]. 

 

Introductory physics courses with hundreds of students in large lecture halls, such as those found at Doctoral/Research Universities, rarely demonstrate best practices for teachers to use with their students.  Many teacher preparation programs rely on science methods courses to accomplish this, assuming that teachers will take what they learn in science content courses and combine it with the pedagogical skills they learn in methods courses.  Visits to the classrooms of teachers prepared this way show the fallacy of this assumption.  The tenacity of lecture as the primary mode of delivering content is evidence of the validity of the oft-quoted paradigm that “teachers teach the way they are taught.”  If they learn physics in a lecture mode, they tend to teach it to their students in the same mode. 

 

In addition, nearly all of these teachers had never participated in a formal science course that modeled inquiry laboratory investigations.  Although they all knew what inquiry labs looked like, and many used some form of inquiry in their classrooms, most were using the activities included in their textbooks, which, while adequate, gave little insight into the nature of science.  In a pre-course questionnaire, twelve of the twenty participants specifically asked for inquiry activities they could use with their students.  Clearly, simply offering the standard introductory physics course in the summer would not meet the needs of these teachers.

 

At this point, coincidence and good fortune stepped in.  As a result of connections developed during the work of the Kansas Collaborative for Excellence in Teacher Preparation, the associate chairman of the Physics Department at the University of Kansas (KU), Dr. Robin Davis, generously agreed to assist me in developing the course and to act as instructor of record.  In addition, he agreed to sponsor the creation of the graduate course, and to team-teach the course with me during the summer of 2003. 

 

Next, I had recently attended an informative presentation by Dr. Steve Shawl, professor of astronomy at KU, on the use of Peer Instruction and a student response system in his introductory astronomy course.  He also suggested I investigate Modeling Physics as a possible format for the laboratory component.  After a week of research and investigation, I decided to make these the foundation of the course. 

 

During a videoconference related to a different grant proposal, John Farley at the University of Nevada, Las Vegas, mentioned that he and a colleague, Aimee Govett, had developed and taught a similar course at UNLV one year earlier, and had recently submitted a paper for publication that described the course and the results of their research.  Dr. Farley sent me a copy of the manuscript, and I was surprised and pleased to discover that the format for the UNLV course was almost identical to the one I had decided to use for the KU course.  Dr. Farley generously offered to share the resources his team had developed for the course, and within a month, I had assembled the curricular materials, revised the UNLV curricular resources, and was ready to go!

 

Course Design

The course design framework included three elements: a physics text, content pedagogy and inquiry activities.  Details about each element and the reasoning for the selections made are described below.

 

Physics Text

Selecting a text to support the course proved a greater challenge than anticipated.  The text used for the introductory physics course at KU emphasized a mathematical treatment of concepts that we felt was inappropriate for this course.  This is not to say that a mathematical approach is a poor way to teach physics.  Mathematics is the language of physics, and any treatment of the subject should include the formulae that describe the relationships in the physical world.  That approach is perfectly appropriate for university students who have completed both a high school and a university-level algebra course.  For a two-week summer course, however, it was far too rigorous a treatment of these relationships, especially considering that the eighth-grade students of these teachers would not yet have taken algebra, or would be studying it concurrently.  These teachers needed a text that explained the physics content they would address in their classes in a way that would let them grasp the basic concepts quickly and would help them foster the same understanding in their students.  It needed to be very readable, since they would be asked to read as much as 60 pages each evening in order to cover the range of topics in their curriculum.  In the end, we selected a high school text, Conceptual Physics, ninth edition, by Paul G. Hewitt, published by Addison Wesley.  This text is easy to read and understand, and although it does address basic mathematical relationships, it is not nearly as “math intense” as the college-level texts we reviewed.

 

Content Pedagogy

The pedagogy for content instruction was based on the book “Peer Instruction: A User’s Manual” by Eric Mazur, published by Prentice Hall.  Dr. Mazur is an award-winning professor of physics at Harvard who saw his students mastering computations, but failing to understand basic physics concepts.  A typical class begins with what Dr. Mazur calls a “ConcepTest” over an assigned reading from the previous class.  This short quiz ensures that the students read the assignment in the text, and also gives instantaneous feedback on their grasp of the content.  A ConcepTest uses questions that require no calculations, but demand a deep understanding of basic concepts to answer correctly.  An excellent example of a compilation of this type of question is the Force Concept Inventory (FCI) developed by Hestenes, Wells, and Swackhamer [4].  The FCI consists of 30 multiple-choice questions, and was used as the content mastery evaluation instrument for the summer course.  Dr. Mazur has continued to develop Peer Instruction, and has created a website, Project Galileo [5], at http://galileo.harvard.edu, that is an excellent source of ConcepTests covering a number of different science content areas.  Recently, development of Project Galileo has stopped; Mazur and his team are now developing an Interactive Learning Toolkit (ILT) that will succeed Project Galileo.  More information on the ILT project is available at the Project Galileo website, which will remain online during the development of the ILT. 

 

One interesting and effective element of Peer Instruction is the use of an interactive student response system to poll for student answers to the ConcepTests.  This system uses infrared or radio signals generated by small, hand-held transmitters to send individual responses to a computer, where they are compiled and displayed as a bar graph for the class to view.  This system is very effective in ensuring that the responses are anonymous, and therefore reduces student anxiety over responding incorrectly.  (In fact, it was our experience that this system fosters a very positive, cooperative atmosphere among the students.  As correct responses became more frequent, students encouraged each other and were proud of their performance as a class.)  Dr. Mazur displays each question and gives the students one minute to respond.  He then asks them to turn to their neighbors and discuss the answer for one or two minutes, perhaps trying to convince each other of the correct answer.  They are then encouraged to change their answers if they wish, and the results are displayed.  Finally, the correct answer is discussed for a few minutes.  If the results indicate the concept is understood, the instructor moves on.  If not, it may be necessary to reteach the concept and pose additional ConcepTests.  Student grades are not affected by the ConcepTests.  The computer records all responses, and since each transmitter is registered to a specific student, the instructor has a record of the performance of each student and of the class as a whole.  One low-tech alternative is to use colored index cards, stapled at one corner.  The answer choices are correspondingly color-coded, and students respond by holding up the stapled pack with the color of their answer facing the front. 
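The compile-and-display step the response system performs amounts to tallying votes and rendering a histogram. A minimal sketch in Python, with hypothetical vote data — the function names and data here are ours for illustration, not part of any actual clicker software:

```python
from collections import Counter

def tally(responses):
    """Compile individual student responses into counts per answer choice."""
    return Counter(responses)

def bar_graph(counts, choices="ABCDE"):
    """Render the class-wide distribution as a simple text bar graph."""
    lines = []
    for choice in choices:
        n = counts.get(choice, 0)
        lines.append(f"{choice}: {'#' * n} ({n})")
    return "\n".join(lines)

# Hypothetical first vote and post-discussion revote for one ConcepTest
# (correct answer C); in Peer Instruction the second poll typically
# shifts toward the correct choice after neighbor discussion.
first_vote = list("AABCCACCAACACCBCACCA")
revote = list("CACCCACCCACACCCCACCA")

print(bar_graph(tally(first_vote)))
print()
print(bar_graph(tally(revote)))
```

Displaying both distributions side by side is what lets the instructor decide, on the spot, whether to move on or reteach the concept.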

 

Inquiry Activities

Two major criteria guided our search for an inquiry laboratory format.  We wanted to find a system that (1) contained specific activities which were appropriate for the grade level and curriculum of the participants, and (2) provided a consistent framework within which to present any inquiry activity.  The first criterion was important because many of the teachers had specifically requested new activities to use with their students.  Finding challenging, well-designed, age appropriate inquiry activities can be difficult and time consuming for busy science teachers.  To address this, we searched not only for appropriate activities to present during the course, but also for exemplary sources of activities that the participants could use to further enhance their curricula.  In addition, we scheduled a sharing session during the course and encouraged the teachers to bring copies of their favorite activities to share with their peers. 

 

The second criterion is critical to successfully implement student-centered inquiry activities.  If teachers are introduced to a model that can be used to structure a diverse mix of activities from a number of varied sources, the range of potential activities is greatly expanded.  Even laboratory exercises written in a “cookbook” style with little or no inquiry characteristics can often be modified to move the focus from following directions and replicating a predetermined result to designing and implementing student-directed research to challenge misconceptions.  Toward this end, after introducing the course model, we provided the participants with “cookbook” labs and practiced rewriting them based on the new model.  When teachers become proficient at this technique, the number of activities available to them greatly increases.  With practice, they can begin developing their own inquiry activities, allowing them to challenge misconceptions that arise during instruction for which they have no activities. 

 

Modeling Physics, developed by Halloun and Hestenes at Arizona State University [6], was selected.  This model uses guided inquiry to teach students physics concepts.  It was designed for high school physics courses, but with adaptations it is suitable for eighth-grade students.  A typical Modeling Physics activity begins with a simple teacher-led demonstration of a phenomenon such as constant motion or action/reaction, followed by student observations and discussion.  The teacher then poses one or two questions related to the demonstration, and challenges student teams to develop investigations to address the questions.  The teams design their experiments, with help as needed from the teacher, and conduct them.  Information about the investigation, including the hypothesis, procedure, data, analysis, and conclusion, is recorded on a large whiteboard.  Teams are encouraged to use a variety of communication styles, including but not limited to drawings, charts, prose, even poetry, to describe their investigations and present their results.  When the teams have finished, the teacher selects two or three to present their investigations to the class and defend their conclusions.  The class discusses the presentations, and attempts to reach consensus about answers to the research questions posed earlier.  Teams whose investigations are unproductive or poorly designed are given opportunities outside of the class period to redesign and repeat their investigations. 

 

These three elements, a physics textbook, content pedagogy, and inquiry activities, were combined in a daily 4-hour regimen that began with a series of ConcepTests that gauged understanding of concepts addressed in the reading assignment for that day.  The instructor came to class with far more ConcepTests than were actually used, so that, based on the responses of the participants, concepts that were not understood could be explained and discussed in detail, and followed up with more ConcepTests.  The second half of the daily class was reserved for a Modeling Physics activity.  Eight classes were held during the two-week class, for a total of 32 contact hours resulting in 2 hours of graduate credit.

 

Results

 

Demographic Background of Participants

As shown in Table 1, participants in Physics for Middle School Teachers were mostly female (85%), and had completed a master’s degree (80%).

 

Table 1: Gender and Educational Background

Gender    N    %        Degree      N    %
Male       3   15.00    Bachelors    4   20.00
Female    17   85.00    Masters     16   80.00
Total     20  100.00    Total       20  100.00

 

Of the participants who answered the question about the grade they teach, the majority taught eighth grade, as shown in Table 2.

 

Table 2: What Grade Do You Teach?

Grade             N    %
Fifth              1    5.00
Sixth              2   10.00
Seventh            1    5.00
Eighth             9   45.00
Did not answer     7   35.00
Total             20  100.00


 

Table 3 provides information related to the number of years participants have taught various courses.  None of the participants had taught physics.  Half of the participants had taught physical science courses for five or fewer years.  Nearly all of the participants had taught other subjects (n=19) for two or more years.

Table 3: Number of Years Teaching

Courses in Physics
Years             N    %
0                20  100.00
Total            20  100.00

Courses in Physical Science
Years             N    %
0                  1    5.00
1                  3   15.00
2                  2   10.00
3                  3   15.00
4                  1    5.00
6                  1    5.00
9                  1    5.00
10                 1    5.00
11                 4   20.00
12                 1    5.00
15                 1    5.00
Did not answer     1    5.00
Total             20  100.00

Courses in Other Subjects
Years             N    %
0                  1    5.00
2                  1    5.00
3                  1    5.00
5                  1    5.00
8                  3   15.00
9                  1    5.00
10                 2   10.00
11                 1    5.00
12                 1    5.00
14                 2   10.00
16                 1    5.00
20                 1    5.00
26                 3   15.00
27                 1    5.00
Total             20  100.00

Participant Expectations

At the beginning of the course, participants were asked what they expected to learn.  Their responses are shown in Table 4.  Over half of the participants (n=11) expected to acquire enough understanding of physics to plan their curriculum and teach the course.  In addition, they hoped to acquire new activities and techniques to use in their physics courses.  The third most frequent response was to learn to apply physics concepts to real life. 

 

Table 4: Participants’ Expectations Related to the Course

Response Categories                                                   N     %
Acquire understanding of background and content necessary for
  planning curriculum and teaching physics                           11   30.5
Acquire activities and techniques (learn new hands-on activities,
  techniques for productive labs, and methods of incorporating
  inquiry learning)                                                  11   30.5
Learn to apply physics concepts to real life                          4   11.1
Engage, hook, and excite students about physics                       3    8.3
Acquire a deeper and broader understanding of physics                 3    8.3
Refresh memory of physical science principles                         1    2.8
Share ideas with other teachers                                       1    2.8
Learn how to integrate science and technology requirements            1    2.8
Learn applicable skills                                               1    2.8

            Notes:    1. Respondents may have had multiple comments. 

2.  Percentages may not add up to 100% due to rounding error.

 

Based on the type and quality of science content courses that KU has offered in the past, participants appeared to have reasonable expectations of the physics content course.

 

Pre-Post Content Survey Results

At the beginning of the course, 19 of 20 participants completed the Force Concept Inventory as a pre-survey of their physics content knowledge.  The Force Concept Inventory is a content test composed of 30 questions, so the maximum possible score was 30.  As shown in Table 5, pre-survey scores ranged from 3 to 21, with an average score of 8.5. 


 

 

Table 5. Participant Scores on the Force Concept Inventory Pre-survey

Score             N    %
3                  1    5.0
4                  1    5.0
5                  2   10.0
6                  4   20.0
7                  2   10.0
8                  1    5.0
9                  2   10.0
10                 1    5.0
11                 1    5.0
12                 1    5.0
13                 1    5.0
14                 1    5.0
21                 1    5.0
Did not answer     1    5.0
Total             20  100.0

 

 

At the end of the course, participants completed the Force Concept Inventory again to determine how much they had learned.  Based on the SPSS® Exact Tests option, an add-on module appropriate for small sample sizes and non-normal distributions, the improvement in participants’ post-survey scores was statistically significant (α = .01).  More than half of the participating teachers scored 90% or above on the content test.  Individual gains in correct answers ranged from 8 to 25, and the average post-survey score was 26.8.  Table 6 provides the frequencies of post-survey scores.  Based on the pre- and post-surveys, the course was successful in significantly increasing participants’ knowledge of physics content.
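One standard way to summarize pre/post FCI results, though not reported here, is Hake’s average normalized gain ⟨g⟩ — the gain achieved as a fraction of the gain possible.  A minimal sketch using the class averages reported above (8.5 pre, 26.8 post, out of 30):

```python
FCI_MAX = 30  # the FCI has 30 questions

def normalized_gain(pre_mean, post_mean, max_score=FCI_MAX):
    """Hake's <g>: actual class gain divided by the maximum possible gain."""
    return (post_mean - pre_mean) / (max_score - pre_mean)

# Class averages reported for this course.
g = normalized_gain(8.5, 26.8)
print(f"<g> = {g:.2f}")  # <g> = 0.85
```

A ⟨g⟩ of about 0.85 sits well above the roughly 0.48 average that Hake reported for interactive-engagement physics courses (and the roughly 0.23 typical of traditional lecture courses), consistent with the course’s Peer Instruction and Modeling Physics format.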

 

Table 6. Participant Scores on the Force Concept Inventory Post-survey

Score             N    %
21                 1    5.0
22                 1    5.0
24                 1    5.0
25                 2   10.0
26                 2   10.0
27                 2   10.0
28                 5   25.0
29                 4   20.0
30                 1    5.0
Did not answer     1    5.0
Total             20  100.0

 

Course Evaluation

The course evaluation survey consisted of 25 seven-point Likert-scale questions and two open-ended questions.  The quantitative data were analyzed using SPSS®.  Analysis consisted of frequencies and descriptive statistics.  A summary of the responses to the open-ended questions follows the discussion of the quantitative data.

 

Quantitative Analysis

Table 7 displays the evaluation ratings for Physics for Middle School Teachers.  The table provides the anchors for each question so that readers can determine whether the ratings were positive or negative. 
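The descriptive statistics in Table 7 can be reproduced directly from the rating frequencies.  A minimal sketch, assuming the 11/4/4 counts shown for the first evaluation question fall on ratings 1–3 (an inference consistent with the reported mean of 1.63 and median of 1.00):

```python
from statistics import median

def expand(freq):
    """Turn a {rating: count} frequency table into the underlying ratings."""
    return [rating for rating, n in sorted(freq.items()) for _ in range(n)]

def describe(freq):
    """Return (mean, median) for a seven-point Likert frequency table."""
    ratings = expand(freq)
    return round(sum(ratings) / len(ratings), 2), median(ratings)

# Question 1 ("a lot of fun" = 1 ... "no fun at all" = 7), 19 respondents.
mean, med = describe({1: 11, 2: 4, 3: 4})
print(mean, med)  # 1.63 1
```

Missing responses are simply excluded from the frequency table, which is why means on some questions are computed over 18 or 19 respondents rather than 20.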

 

The majority of participating teachers indicated:

The Physics 701 class was a lot of fun.

They liked the physics for teachers’ class.

They felt good or very good in the course.

The instructor explained the topics well or very well.

They always or almost always talked about the topics related to the discipline of science in class.

They always or almost always talked about scientific principles and laws in class.

They always or almost always talked about the nature of science in class.

For teaching science, the topics in class were useful or very useful.

It was important to their instructor to find out participants’ thoughts on the topics.

The instructor considered their questions seriously.

They could always ask the instructor questions about class.

The class participated very much.

The class made quite a bit of (very much) effort.

Peer instructional methods were effective or very effective.

The Modeling Physics laboratory instructional methods were very effective.

 

Individually, the majority of teachers indicated:

My efforts to understand the teaching objectives were high or very high.

My participation in class was high or very high.

During instruction I was focused or very focused.

 

Participant ratings were more varied for questions 4, 5, 12, 13, and 18 (see Table 7).  Because there was less agreement among participants on these questions, the mean and median ratings tended to fall closer to the middle of the continuum than for other questions.  

 

Overall, participants were most likely to indicate:

They always understood the topics in class.

They had enough time to think about questions and tasks in class.

In class they always talked about how to teach science.

In class they always talked about scientific topics from everyday life.

The class was disciplined. 

Ratings for questions 7 and 8 were still more varied and did not show a clearly distinguishable preference for either end of the scale.  Rather, the median rating for each question was 4.00, the midpoint of the continuum. 

 

The questions were:

The topics in class are difficult to understand/very easy to understand.

The topics in class are very theoretical/very concrete.


 

 

 

Table 7: Physics 701 Course Evaluation

(Entries are numbers of respondents out of 20; ratings are on the seven-point scale shown with each question.)

1. The physics for teachers class is… (1 = a lot of fun, 7 = no fun at all)
   Responses: 1: 11, 2: 4, 3: 4; Missing: 1.  Mean 1.63, Median 1.00
2. I ____ like the physics for teachers’ class. (1 = do like, 7 = do not like)
   Responses: 1: 14, 2: 5, 3: 1.  Mean 1.35, Median 1.00
3. In the physics for teachers class I feel… (1 = very good, 7 = very bad)
   Responses: 1: 10, 2: 6, 3: 3, 4: 1.  Mean 1.75, Median 1.50
4. I understand the topics in class. (1 = never, 7 = always)
   Responses: 2: 1, 3: 2, 4: 2, 5: 8, 6: 7.  Mean 4.90, Median 5.00
5. In order to think about the questions and tasks in class I have… (1 = never enough time, 7 = always enough time)
   Responses: 2: 2, 4: 6, 5: 4, 6: 6, 7: 1; Missing: 1.  Mean 4.79, Median 5.00
6. Our instructor explains the topics… (1 = very well, 7 = very badly)
   Responses: 1: 8, 2: 9, 3: 2; Missing: 1.  Mean 1.68, Median 2.00
7. The topics in class are… (1 = difficult to understand, 7 = very easy to understand)
   Responses: 1: 1, 2: 4, 3: 2, 4: 4, 5: 7, 6: 1; Missing: 1.  Mean 3.79, Median 4.00
8. The topics in class are… (1 = very theoretical, 7 = very concrete)
   Responses: 1: 1, 2: 1, 3: 3, 4: 11, 5: 1, 6: 1, 7: 1; Missing: 1.  Mean 3.89, Median 4.00
9. In class we ____ talk about the topics related to the discipline of science. (1 = always, 7 = never)
   Responses: 1: 11, 2: 4, 3: 2, 4: 1; Missing: 2.  Mean 1.61, Median 1.00
10. In class we ____ talk about scientific principles and laws. (1 = never, 7 = always)
   Responses: 1: 2, 6: 8, 7: 9; Missing: 1.  Mean 5.95, Median 6.00
11. In class we ____ talk about the nature of science. (1 = never, 7 = always)
   Responses: 1: 1, 3: 1, 5: 1, 6: 7, 7: 9; Missing: 1.  Mean 6.00, Median 6.00
12. In class we ____ talk about how to teach science. (1 = always, 7 = never)
   Responses: 1: 7, 2: 4, 3: 2, 4: 1, 5: 1, 6: 4; Missing: 1.  Mean 2.84, Median 2.00
13. In class we ____ talk about scientific topics from everyday life. (1 = never, 7 = always)
   Responses (incomplete in source): 3: 2, 5: 1, 6: 5, 7: 4.  Mean 5.65, Median 6.00
14. For teaching science the topics in class are… (1 = very useful, 7 = of no use at all)
   Responses: 1: 11, 2: 8, 3: 1.  Mean 1.50, Median 1.00
15. It is important to our instructor… (1 = to find out our thoughts about the topics, 7 = to stay with his/her teaching objectives no matter what we think)
   Responses: 1: 7, 2: 7, 3: 3, 4: 2; Missing: 1.  Mean 2.00, Median 2.00
16. Our instructor considers our suggestions… (1 = seriously, 7 = never)
   Responses: 1: 15, 2: 3, 3: 2.  Mean 1.35, Median 1.00
17. We can ask our instructor questions about class… (1 = always, 7 = never)
   Responses: 1: 17, 3: 3.  Mean 1.30, Median 1.00
18. The class is… (1 = very undisciplined, 7 = very disciplined)
   Responses (incomplete in source): 3: 1, 4: 1, 5: 3, 6: 5, 7: 6.  Mean 5.30, Median 5.50
19. The class participates… (1 = not very much, 7 = very much)
   Responses: 5: 1, 6: 3, 7: 16.  Mean 6.75, Median 7.00
20. The class makes ____ effort. (1 = very little, 7 = very much)
   Responses: 6: 7, 7: 13.  Mean 6.65, Median 7.00
21. My efforts to understand the teaching objectives are… (1 = very high, 7 = very low)
   Responses: 1: 11, 2: 6, 3: 2; Missing: 1.  Mean 1.53, Median 1.00
22. I try to participate in class… (1 = very high, 7 = very low)
   Responses: 1: 7, 2: 9, 3: 4.  Mean 1.85, Median 2.00
23. During instruction I am… (1 = very distracted, 7 = very focused)
   Responses: 4: 2, 5: 1, 6: 9, 7: 8.  Mean 6.15, Median 6.00
24. Peer Instruction instructional methods are… (1 = very effective, 7 = very ineffective)
   Responses: 1: 14, 2: 5, 3: 1.  Mean 1.35, Median 1.00
25. Modeling Physics laboratory instructional methods are… (1 = very effective, 7 = very ineffective)
   Responses: 1: 16, 2: 3, 3: 1.  Mean 1.25, Median 1.00

Note: The response distributions for questions 13 and 18 are incomplete in the source table.

 


 

Qualitative Analysis

The course evaluation survey also included two open-ended questions.  The first question asked whether participants’ expectations for the course had been met.  The majority of respondents (N=17, 85%) answered “Yes.”  Table 8 provides the categories of responses to this question.

 

Table 8. Participants’ Expectations for the Course

Response Category                                                       N     %
Yes                                                                    17   34.7
What participants learned or took from the course                      19   38.8
  ·  Opportunity to expand and deepen knowledge of physics              8
  ·  More effectiveness as a teacher                                    1
  ·  Increased comfort in teaching                                      1
  ·  Ability to teach at a higher level                                 1
  ·  Learned so much personally and professionally                      1
  ·  Increased knowledge about the inquiry method of modeling science   1
  ·  Ability to take concepts and turn them into hands-on activities    1
  ·  Learned how to motivate and challenge students in the physical
     sciences                                                           1
  ·  The course motivated and stimulated me                             1
  ·  Ability to bring science and math standards to classes             1
  ·  Learned new teaching strategies                                    1
  ·  Opportunity to discuss the topics with others                      1
Comments about teachers                                                 6   12.2
  ·  Enthusiastic                                                       2
  ·  Great job explaining material, answering questions, and
     demonstrating how to teach physics                                 1
  ·  Treated participants in a personal and professional manner         1
  ·  Always prepared                                                    2
Comments about course                                                   7   14.3
  ·  Course was excellent, informative, well thought out, worthwhile    4
  ·  Would have liked more labs to complete district’s objectives and
     state/national standards                                           1
  ·  Wanted more time to share ideas                                    1
  ·  Course was intense and required a lot of reading                   1

 Notes:   1. Respondents may have had multiple comments. 

2. Percentages may not add up to 100% due to rounding error.

 

In addition to responding yes, most participating teachers also provided additional comments.  Over a third of those comments (N=19) were related to what participants wanted to (or did) learn/take from the course.  As shown by the following quotes, participants most frequently commented that they had wanted to expand and deepen their knowledge of physics.

 

“I grew in understanding, comprehension, as well as awareness of the scope and magnitude of the physical world around us.  I learned how to take concepts and turn them into hands-on (modeling peer instruction, etc.) to motivate students to learn/be challenged in the physical science area.”

 

“I have experienced and learned so much personally and professionally.  I have been motivated and stimulated.  I have such a broader and deeper understanding of the concepts addressed; this will help me to be so much more effective with my students.”

 

Participants also provided comments about the course instructors.  All of these comments were positive.  Some examples are provided below.

 

“____ did a great job explaining the material, answering questions, and showing how to teach physics.”

 

“Both instructors were well prepared for class and treated us in a personal and professional manner.”

 

“____ was fabulous.  He was always prepared, superior at being informative, always listened to our ideas, and was a great conversationalist.”

 

 

The last question on the course evaluation surveys asked participating teachers whether they had any suggestions for improving the course.  Table 9 provides the response categories for this question.  A quarter of the participants did not respond, and another 16.7% stated that they had no suggestions; most of those with no suggestions for improvement added that they thought the course had been great.

 

The two most frequent suggestions were to expand the course over a longer period of time and to offer more courses structured like this one.  Examples of these comments include:

 

“Although teachers do not like to spend more than two weeks in class – more time for concept attainment would have helped me.”

 

“Spread it out over a longer period of time so that we have more time to absorb the information.”

 

“Offer again next summer, as well as concept chemistry!”

 

“Please think about offering a level II course and/or follow-up sessions.”

 

“I would love to see more classes structured like this.”

 

Table 9:  Suggestions for Improving the Course

Response Category                                                             N      %
No Response                                                                   6    25.0
None                                                                          4    16.7
Expand course over longer period                                              3    12.5
Offer more courses structured like this                                       3    12.5
Great job, great course                                                       3    12.5
More time to discuss concepts in class                                        1     4.2
Devote last 30 minutes of course to group discussion                          1     4.2
Offer more labs                                                               1     4.2
Provide more opportunity to convert “cookbook” labs to inquiry                1     4.2
Thank you for being respectful to the people of faith                         1     4.2

Notes:  1. Respondents may have had multiple comments.
        2. Percentages may not add up to 100% due to rounding error.

 

 

Conclusion

There are three indicators by which the success of the course can be judged.  The first is the Force Concept Inventory.  As discussed earlier, the improvement in participants’ scores from the pre-survey to the post-survey was statistically significant.  The second indicator is the quantitative ratings of the course.  Overall, participants rated the course favorably: the majority of items were rated highly on the appropriate Likert scale [response categories were reversed for some questions].  Lastly, qualitative comments from participants further confirmed that the course met their expectations and needs, and participants suggested that KU offer more content courses in the future.  Comments about the course and the instructors were also quite positive.
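The pre/post FCI comparison mentioned above can be sketched with a minimal calculation.  The scores below are hypothetical, not the study’s data; the paired t-statistic and Hake’s class-average normalized gain are standard tools in physics education research for exactly this kind of pre/post comparison.

```python
import math
from statistics import mean, stdev

def paired_t(pre, post):
    """Paired t-statistic for matched pre/post scores (df = n - 1)."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    return mean(diffs) / (stdev(diffs) / math.sqrt(n))

def hake_gain(pre, post, max_score=100.0):
    """Class-average normalized gain <g> = (post% - pre%) / (100% - pre%)."""
    return (mean(post) - mean(pre)) / (max_score - mean(pre))

# Hypothetical FCI percentage scores for eight teachers (illustration only).
pre  = [27, 33, 20, 40, 30, 23, 37, 30]
post = [53, 60, 47, 70, 57, 43, 63, 57]

t = paired_t(pre, post)   # compare against the t-critical value for df = 7
g = hake_gain(pre, post)  # 0.375 here, a "medium gain" in Hake's taxonomy
```

With these invented scores, g = 0.375 and the t-statistic far exceeds the two-tailed 0.05 critical value of about 2.36 for seven degrees of freedom, which is the sense in which a pre-to-post improvement is called statistically significant.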

 

Both quantitative and qualitative data confirm that Peer Instruction is an effective method of content instruction for science teachers.  We attribute this primarily to the emphasis on conceptual understanding and the collaborative, “convince your neighbor” techniques.  When learners are unable to “hide behind the equation,” but instead are required to repeatedly demonstrate and defend their understanding of a concept, misconceptions can be quickly and consistently identified and challenged.  In addition, the immediate feedback to the instructor reveals stubborn misconceptions and provides an opportunity to correct them on the spot.  Multiple ConcepTests reinforce newly acquired knowledge and ensure that students truly understand the concept rather than simply memorizing the correct responses to one or two questions.  The rich data gathered by the student response system allow the instructor to identify struggling students and provide individualized instruction when appropriate.
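The ConcepTest feedback loop described above can be sketched as a simple decision rule.  The thresholds of roughly 30% and 70% correct follow Mazur’s commonly cited guidance for Peer Instruction; the vote data and function names here are hypothetical illustrations, not part of the course’s actual software.

```python
from collections import Counter

def conceptest_next_step(votes, correct, low=0.30, high=0.70):
    """Decide what to do after polling a ConcepTest, following the
    Peer Instruction cycle: revisit the concept if few students are
    right, run "convince your neighbor" discussion in the middle
    band, and move on if most students already have it."""
    frac = votes.count(correct) / len(votes)
    if frac < low:
        return "revisit concept"
    if frac <= high:
        return "peer discussion, then re-poll"
    return "brief wrap-up, move on"

# Hypothetical clicker responses to one multiple-choice ConcepTest.
votes = ["A", "C", "B", "C", "C", "D", "C", "B", "C", "A"]
histogram = Counter(votes)  # the instructor's at-a-glance view of the class
step = conceptest_next_step(votes, correct="C")
```

Here half the class chose the correct answer “C,” so the rule prescribes peer discussion followed by a re-poll; the response histogram is the same immediate feedback that lets the instructor spot a stubborn misconception before moving on.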

 

A majority of the participants rated Modeling Physics as very effective, and many planned to use this instructional model with their students.  The whiteboarding technique was mentioned repeatedly as very effective.  The Socratic discussion following whiteboarding was also mentioned as an effective way to move focus from the teacher to the students, and help them understand the process of constructive criticism.  Participants were encouraged to investigate the Modeling Physics website, and to consider formal instruction in this instructional framework.

 

The course format, which combined content instruction with research-based pedagogy, was well received.  The participants felt their expectations for the course had been met and that the course was a valuable experience.  Their comments at the end of the course reinforced how important a deep understanding of the concepts they teach is to their confidence and sense of competency; this was especially evident in participants with the least formal preparation in the subjects they teach.  For many of the participants, this was their first experience as a student in a science course that was not primarily lecture.  Their positive appraisals, consistent with those of students in courses reformed as part of the KCETP initiative, reinforce the importance of moving toward student-centered instruction at all levels of education.  If teachers do indeed “teach the way they are taught,” it is critical that science instruction model exemplary, research-based techniques throughout the educational experience of all students.

 

Acknowledgements

We gratefully acknowledge the support of John Farley and Aimee Govett at the University of Nevada, Las Vegas.  Their assistance in developing this course was invaluable.  They generously provided many resources used in the course.

 

We gratefully acknowledge the assistance of Marie Steichen and Brenda Fergen of the Office of Educational Innovation and Evaluation (OEIE), Kansas State University, for their excellent work evaluating this and all of the courses and workshops sponsored by KCETP.  OEIE provided the evaluation contained in the “Results” section of this paper.

 

Bios

Gary K. Webber is Administrative Officer of the University of Kansas Center for Science Education, and Project Coordinator of the Kansas Collaborative for Excellence in Teacher Preparation.

 

References

[1]        McDermott, L. C. (1990). A perspective on teacher preparation in physics and other sciences: The need for special science courses for teachers. Am. J. Phys., 58, 734-743.

[2]        Mazur, E. (1997). Peer Instruction: A User’s Manual. Upper Saddle River, NJ: Prentice Hall. (Book includes CD-ROM.)

[3]        Redish, E. F., & Steinberg, R. N. (1999). Teaching physics: Figuring out what works. Physics Today, 52, 24-30.

[4]        Hestenes, D., Wells, M., & Swackhamer, G. (1992). Force Concept Inventory. The Physics Teacher, 30, 141-158.

[5]        Galileo web site: ConcepTests in physics, chemistry, and biology are available on-line from the Project Galileo web site (galileo.harvard.edu).

[6]        Halloun, I. A., & Hestenes, D. (1987). Modeling instruction in mechanics. Am. J. Phys., 55, 455-462.

 

This material is based upon work supported by the National Science Foundation under Grant No. DUE9876676.

Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the National Science Foundation.