Measuring and improving pedagogical content knowledge of student assistants in introductory physics classes
https://www.pck-q-ttu.com/
Texas Tech University

Presenters:
  • Jianlan Wang, Assistant Professor
  • Stephanie Hart, Director, Texas Tech OnRamps
  • Beth Thacker, Associate Professor
  • Kyle Wipfli, Research Assistant
Public Discussion


  • Pati Ruiz

    Facilitator
    Learning Sciences Researcher
    May 11, 2021 | 08:54 a.m.

    Thank you for sharing your project around the use of questioning. The learning assistants (LAs) sharing their perspectives helped contextualize the work that LAs are doing in the classroom. When talking about asking questions in an inquiry-based environment, Dr. Hart draws a distinction between knowing one should ask questions and actually guiding students through questioning. Can you tell us more about the written instrument for PCK-Q? Is the instrument meant for use by researchers only, or for practitioners (the LAs/SAs) as well? Also, can you tell us a little more about what you've seen in terms of how this type of LA/SA preparation benefits student learning in the classroom?

     

  • Jianlan Wang

    Lead Presenter
    Assistant professor
    May 11, 2021 | 12:16 p.m.

    These are great questions. The written instrument contains questions that describe situations that LAs may encounter in their interaction with students. LAs need to analyze the information provided and determine how they would provide support. Below is an example (the picture is not presented here):

    Students are presented with a picture of three setups: a) a single bulb in a circuit with one battery; b) two bulbs in series in a circuit with one battery; and c) two bulbs in parallel in a circuit with one battery. All the bulbs and batteries are identical. Students are asked to rank the brightness of the bulbs.

    You approach a group of students and have a conversation with them as shown below:

    You: So how would you rank the brightness of the light bulbs?

    Students: The bulb in the single circuit is the brightest. The other bulbs are dimmer but equally bright between each other.

    You: Why?

    Students: Because in the single circuit, there is only one bulb that gets all the power from the battery. In both the series and parallel circuits, there are two bulbs that share the power from the same battery. Each bulb gets only one half of the power. So the bulbs in b) and c) are half as bright as the one in a).

    You: Did you calculate the power of each bulb?

    Students: Yes, we did. The equation is P=UI. In the single circuit, the power is UI. In the series circuit, I is the same and U is shared by the two bulbs. The voltage of each bulb is U/2. So the power for each bulb is IU/2. In the parallel circuit, U is the same and I is shared by the two bulbs. The current of each bulb is I/2. So the power for each bulb is still IU/2.

    Then respondents need to analyze what the strengths and difficulties are in students' understanding and articulate how they would respond to students and why. Their answers can indicate their PCK-Q, i.e., orientation toward questioning, knowledge of curriculum, knowledge of students, and knowledge of appropriate guiding questions. 
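    For context on why this dialogue is diagnostic, here is a minimal sketch of the standard textbook circuit analysis, assuming ideal ohmic bulbs of resistance R and an ideal battery of EMF V (these idealizations are mine, not stated in the thread):

```python
# Brightness ranking for the three setups in the PCK-Q example,
# assuming ideal ohmic bulbs (resistance R) and an ideal battery (EMF V).
# These idealizations are an assumption; the thread does not state them.

def bulb_powers(V=6.0, R=2.0):
    """Return per-bulb power (a proxy for brightness) in each setup."""
    p_single = V**2 / R             # one bulb: I = V/R, so P = V^2/R
    p_series_each = V**2 / (4 * R)  # series: I = V/(2R), so P = I^2 * R
    p_parallel_each = V**2 / R      # parallel: each bulb sees the full V
    return p_single, p_series_each, p_parallel_each

single, series, parallel = bulb_powers()
# Under these assumptions, each series bulb gets a quarter (not half) of
# the single bulb's power, and each parallel bulb is as bright as the
# single bulb -- unlike the students' answer, which treats the battery
# as a fixed-power source.
```

    The point of the item is that the students' P = UI bookkeeping silently assumes the battery's current (or voltage) is unchanged across circuits, which is exactly the kind of difficulty a respondent with strong PCK-Q should notice and probe with a guiding question.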

    This instrument can be used to quantify LAs' PCK-Q and predict how they intervene in student learning. Once PCK-Q is measured, a model can be built of how it contributes to student learning, which will be our focus in the last year of this project. The instrument can also be used for LA preparation and for professional development of high school physics teachers. Participating LAs reported that working through certain questions ahead of time better prepared them to teach the physics concepts involved.

    As for the last question, I don't have any data yet. This instrument can predict how LAs intervene in student learning, such as using guiding questions to help students overcome a difficulty or directly providing the answer students need. It's unclear which approach helps student learning more; theoretically, it's the former, i.e., guiding through questions. In the last year, we will test that theory by building a model of how LAs' PCK-Q (measured by this instrument) contributes to students' conceptual understanding (measured by conceptual inventories such as the FCI and BEMA) and critical thinking skills.
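    To make the planned final-year analysis concrete, a hypothetical sketch of the simplest such model is a regression of a learning outcome on PCK-Q scores. All names and numbers below are illustrative, not project data, and the project's actual model may well be more sophisticated:

```python
# Hypothetical sketch: regressing a class's mean normalized FCI gain on
# its LA's PCK-Q score with single-predictor ordinary least squares.
# The data here are made up purely for illustration.
from statistics import mean

def ols_slope_intercept(x, y):
    """Ordinary least squares fit y ~ a + b*x for one predictor."""
    mx, my = mean(x), mean(y)
    sxx = sum((xi - mx) ** 2 for xi in x)
    sxy = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    slope = sxy / sxx
    return slope, my - slope * mx

# Illustrative inputs: LA PCK-Q scores and class-mean normalized FCI gains.
pckq = [1.0, 2.0, 3.0, 4.0]
gain = [0.20, 0.25, 0.35, 0.40]
b, a = ols_slope_intercept(pckq, gain)
# A positive slope b would be consistent with the theory that stronger
# PCK-Q (more guiding questions) supports conceptual understanding.
```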

     
  • Li Ke

    Researcher
    May 12, 2021 | 10:19 a.m.

    Very interesting project! I love how questioning, an important scientific practice, is featured in your teacher education program. I'm curious to learn more about how pre-service science teachers apply these questioning practices in real school settings. Great work!

  • Jianlan Wang

    Lead Presenter
    Assistant professor
    May 12, 2021 | 03:13 p.m.

    Thanks for the comments. Our major focus is on the PCK-Q of learning assistants in non-traditional college physics courses. Learning assistants are undergraduate students who have taken the course, normally with a good grade, and return to facilitate student learning in an inquiry-oriented setting. They are not necessarily pre-service teachers.

    As for pre-service science teacher education, I have applied this method in my methods course and designed an instrument to measure pre-service teachers' PCK-Q. The questions have a similar format but are embedded in science concepts at the K-8 level, such as a food chain and light reflection. In my methods course, one module (3 weeks) is allocated to guiding questions, especially what guiding questions should look like in order to address students' misconceptions.

    In Spring 2021, the pre-service teachers took a pre-survey with 10 questions embedded in 10 science topics prior to this module, and a post-survey with 10 different questions afterward. Normally, they would demonstrate their questioning skills in a microteaching session with students in the field at the end of the module; due to COVID, they conducted a rehearsal with peers instead. I am analyzing the data right now and hope to share the findings at NARST or AERA next year. Looking forward to your feedback.

     
  • Suzanne Otto

    Facilitator
    Teacher / Fellow
    May 13, 2021 | 07:54 a.m.

    I commend these efforts to provide better support for your physics students. Teaching physics is so different from doing physics. As a classroom teacher, my whole purpose is to facilitate my students' learning, and I have been trained in and have practiced the specialties of my craft. Oftentimes in general education courses, graduate-level STEM students serve as TAs and are assigned teaching duties, whether teaching is their passion or not. This works sometimes, but from my own experience long ago, I know that it sometimes does not. Anything you can do to build the teaching capacities of your SAs will pay dividends in the understanding and motivation of the students in their classes.

    Learning to question without providing answers is a great place to start.  Have you partnered with faculty from the Education College to learn about best practices and practicalities in questioning techniques?  What is your process for training SAs?  Do you provide training on STEM education techniques and terminology?  Also, I'd love to see this questioning assessment tool and be able to use it with high school teachers.  It would be insightful to see where we stack up relative to best practices.

     
  • Jianlan Wang

    Lead Presenter
    Assistant professor
    May 13, 2021 | 01:18 p.m.

    Thanks for your comments and the thought-provoking questions. I completely agree about the importance of the teaching capacities of student assistants, including graduate TAs and undergraduate LAs. Graduate TAs normally have stronger content knowledge than LAs do, but they may be equally inexperienced with physics teaching, and their teaching practices may not be as strong as those of instructors. It is very challenging to prepare STEM SAs with appropriate teaching capacities. Do SAs need all the pedagogical strategies suggested by educational theories, such as classroom management and scaffolding argumentation? If not, which strategies are most important to them? How should SAs be prepared with those strategies, considering that the time for pedagogical instruction is very limited and some SAs even have difficulties with physics content knowledge? We take a first step toward answering these questions by identifying questioning as one critical pedagogy for SAs in inquiry-oriented settings. We take the stance of practice-based teacher education (Hiebert & Morris, 2012) and engage SAs in discussion and reflection on concrete cases with the support of educational theories.

    As for your first question: yes, this project involves faculty from the College of Education (me) and the Physics Department (Dr. Beth Thacker). I am an assistant professor in the College of Education with a master's degree in physics. My research interests lie in K-8 science education and high school and college physics education. My work involves K-8 science teacher preparation, so I am somewhat familiar with theories about questioning, like Chin's work (Chin, 2007).

    Q2: So far we have conducted this study at two institutions, where SA training happens in weekly preparation sessions. These sessions mainly focus on physics content knowledge, i.e., previewing the concepts and activities for the coming week. Right now, we are developing and validating the instrument to probe SAs' PCK-Q; later on, we will test using this instrument for SA preparation.

    Q3: We have mentioned STEM education terminology but have not gone deep into discussion of educational theories. Research and my personal experience suggest that SAs, and even K-8 pre-service teachers, do not care much about educational theories, or that the theories taught do not necessarily shape their teaching practice.

    Q4: We will host a virtual workshop at AAPT this summer, on July 10, where we will introduce this instrument. Please join us, or email me at jianlan.wang@ttu.edu for more details. Looking forward to any possible collaboration.

  • Dalila Dragnic-Cindric

    Facilitator
    Postdoctoral Researcher
    May 13, 2021 | 10:27 a.m.

    Congratulations on your work! I fully agree with Suzanne’s comments on what plays out in practice and the importance of building LAs’ capacity to ask good questions. Could you tell us a bit more about your coding scheme and how you assessed the question quality?

    Thank you in advance.

  • Jianlan Wang

    Lead Presenter
    Assistant professor
    May 13, 2021 | 01:30 p.m.

    Thanks. We have coding schemes for both written responses and classroom videos. For written responses, we code how accurate respondents' analyses of students' strengths and difficulties are, and how appropriate their questions are in terms of their potential to support student learning, under the PCK framework of orientation, knowledge of curriculum, knowledge of students, and knowledge of instructional strategies. For classroom videos, we code the questions SAs use in specific vignettes of SA-student interaction, such as the type of question and who the agent of knowledge construction is. There are a lot of details, and I am not sure which aspect(s) to share here. If you are interested, please email me at jianlan.wang@ttu.edu for further discussion. Thanks!
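    As a reader's aid, one way such codes might be recorded is sketched below. The field names and ratings are hypothetical illustrations that mirror the dimensions named above (question type, agent of knowledge construction, and the four PCK components), not the project's actual codebook:

```python
# Hypothetical record for coding one SA-student vignette. Everything
# here is illustrative; the project's real coding scheme is not public
# in this thread.
from dataclasses import dataclass, field

PCK_COMPONENTS = (
    "orientation",
    "knowledge_of_curriculum",
    "knowledge_of_students",
    "knowledge_of_instructional_strategies",
)

@dataclass
class VignetteCode:
    question_type: str    # e.g. "guiding", "evaluative", "procedural"
    knowledge_agent: str  # "student" or "SA": who constructs the knowledge
    pck_scores: dict = field(default_factory=dict)  # component -> 0-2 rating

    def total_pck(self) -> int:
        """Sum ratings over the four PCK components (missing = 0)."""
        return sum(self.pck_scores.get(c, 0) for c in PCK_COMPONENTS)

code = VignetteCode(
    question_type="guiding",
    knowledge_agent="student",
    pck_scores={"orientation": 2, "knowledge_of_students": 1},
)
```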

     
  • Lei Liu

    Researcher
    May 14, 2021 | 01:41 p.m.

    Glad to see that you are distinguishing asking questions from asking guiding questions; I completely agree that they are two different practices. It would be interesting to look at how LAs' questioning skills actually impact students' learning (both what they learn and how they learn). Looking forward to follow-up updates about your research study.

  • Jianlan Wang

    Lead Presenter
    Assistant professor
    May 14, 2021 | 03:04 p.m.

    Thanks for the comments. Yes, questions serve various functions. Learning assistants and pre-service teachers are aware of asking questions as a way to hold students accountable for their own learning. However, they are not as knowledgeable about how their questions may intervene in students' conceptual understanding. This is the focus of this project. Looking forward to further discussion.