Presenters

  • Karl Kosko, Associate Professor, Kent State University (https://www.kent.edu/ehhs/tlcs/profile/dr-karl-w-kosko)
  • Christine Austin, Graduate Assistant, Kent State University
  • Richard Ferdig, Summit Professor of Learning Technologies, Kent State University (http://www.ferdig.com)
  • Enrico Gandolfi, Assistant Professor, Kent State University
  • Jennifer Heisler, Graduate Assistant, Kent State University
  • Maryam Zolfaghari, Graduate Assistant, Kent State University

Project: Design and Implementation of Immersive Representations of Practice (https://xr.kent.edu/)
Public Discussion

Continue the discussion of this presentation on the Multiplex.

  • Barry Fishman

    Facilitator
    Professor
    May 11, 2021 | 04:07 p.m.

    What a great video - I love that the video itself embodies the technological affordances you are focused on. "Professional noticing" is a great frame, and really embodies a lot of what we have come to understand and appreciate about the progression from novice-to-expert.

    Have you experimented at all with guided pathways through an immersive video? I'm reminded of some work that Rand Spiro led in the 1990s around what he called "experience acceleration." Of course, Rand didn't have access to XR technologies!

     
  • Karl Kosko

    Lead Presenter
    Associate Professor
    May 11, 2021 | 04:29 p.m.

Great question! The short answer is yes and no. If I recall correctly, one aspect Rand advocated for in experience acceleration was multiple viewings of shorter clips. We do require at least two viewings in our methods course assignments and in almost all our studies. As for providing more explicit scaffolding, this is a work in progress informed by both our research and use in our methods courses. One thing we are attempting to do now is to shift towards encouraging patterns of noticing rather than focusing on specific moments alone. We believe eye-tracking and machine learning in these contexts will help open opportunities in this regard.

  • Barry Fishman

    Facilitator
    Professor
    May 11, 2021 | 04:54 p.m.

    You do recall Rand's work correctly. I really admire the combination of approaches you are employing here. What a great testbed for experimentation!

     
  • Margie Vela

    Facilitator
    Senior Program Manager
    May 12, 2021 | 12:34 a.m.

    What a great way of training! Technology has surely impacted teaching in many ways. 

    I am really curious whether XR technology could have some application to equity and inclusion training. I would imagine this could have a huge impact in many different ways.

  • Karl Kosko

    Lead Presenter
    Associate Professor
    May 12, 2021 | 08:58 a.m.

    Thanks for the post! I do believe there could be applications to such professional development. One thing we've noticed in our own work is that participants' field-of-view (FOV) can be used to determine differences in who or what they are focusing on in a 360 video (either with a headset or on a laptop screen). In one case, whether participants attended to the mathematics in the video was associated with whether they focused on two girls' contributions to whole-class discussion in the particular video viewed. I imagine something similar could be adapted for an equity-specific focus.
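    As a rough illustration of the FOV idea described above (a hypothetical sketch, not the project's actual tool): given the viewer's yaw in a 360 video and the direction of a target of interest (say, where a student is seated in the scene), a wraparound-aware angle comparison tells you whether the target was inside the viewer's field of view at that moment.

    ```python
    def in_fov(view_yaw_deg: float, target_yaw_deg: float, fov_deg: float = 90.0) -> bool:
        """Return True if a target direction lies inside the viewer's horizontal FOV.

        Angles are in degrees on the 360-degree circle; the comparison
        handles wraparound at +/-180 (e.g., 170 vs. -175 are 15 degrees apart).
        """
        # Smallest signed angular difference between view center and target
        diff = (target_yaw_deg - view_yaw_deg + 180.0) % 360.0 - 180.0
        return abs(diff) <= fov_deg / 2.0

    # Viewer facing 170 degrees, target at -175 degrees: only 15 degrees apart,
    # so it falls inside a 90-degree FOV despite the sign flip.
    print(in_fov(170.0, -175.0))  # True
    ```

    Per-frame checks like this, aggregated over a viewing, are one simple way to operationalize "who or what was in view."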

     
  • Teon Edwards

    Co-founder of EdGE & Game Designer
    May 12, 2021 | 09:46 a.m.

    Very interesting video, and great to see the actual use of the 360 video technology in your presentation. Thank you.

    I'm wondering if you are also looking at the uses of 360 video with students. We've done some work with location-based learning, including the capturing and "annotating" of 360 video to engage learners in the world around them and help facilitate others, who encounter and explore the created videos, in engaging with a place in new/different ways. What aspects of your eye tracking and machine learning, or more generally what you are learning about the use of this tech with teachers, do you think will translate to other audiences/settings? Is exploring this of interest?

    Also, have you thought at all about augmented reality (AR) complements? I'm wondering about how to take advantage of what you are learning, as well as what your teachers are learning, to be offering live, real-time assistance in professional noticing. 

     
  • Karl Kosko

    Lead Presenter
    Associate Professor
    May 12, 2021 | 01:06 p.m.

    Very nice thoughts! We considered annotation tools initially, but given our focus on teacher noticing and some of the things we learned about such noticing, we opted to examine tools like the machine learning and eye-tracking aspects. Part of this is to facilitate more timely feedback, since some annotation features could add to the time teacher educators spend providing feedback (although pre-given annotation features may help to reduce this). For K-12 students, the machine learning aspects could be useful for seeing whether students are noticing/attending to the things we would want them to focus on. I could see this being coupled with timely questions/prompts (for example, engaging in a virtual field trip underwater and seeing if students are attending to certain ecological features/patterns).

    We do have folks on our team with a great deal of background with AR and have thought about aspects of its integration. We haven't integrated this aspect with the machine learning components, but this could be done. 

    I think some of your points regarding the real-time audience are very important. Right now, I think there are limitations in the devices (specific 360 cameras & streaming capabilities) that make this somewhat difficult, but we were discussing things like "virtual observations" pre-pandemic. 

    There's a LOT of what you have in your question that we've discussed as well. Much of it is really "not having enough time" to do it all. It's heartening to see so many NSF projects and scholars delving into these issues (too much great work for just one group!).

  • Kimberly Arcand

    Researcher
    May 12, 2021 | 12:05 p.m.

    This is great!  I'd be interested to know what information the teachers were given before participating, whether they knew the goals of the study or were just given a headset and presented with the video. 

    When I first started watching your video, I thought it was going to be more about students using 360 video technology when remote learning to look around their actual classroom.  I wonder if anyone has been doing that during the pandemic.

  • Karl Kosko

    Lead Presenter
    Associate Professor
    May 12, 2021 | 01:11 p.m.

    Preservice teachers who participate know that we're engaging them in professional noticing tasks and we're studying their use of the technology & their recorded viewing / written responses along these lines. We usually engage them with the task used in the recorded lesson(s) and have them watch a "tutorial" video of "how to watch 360 while watching 360." This tutorial is essential because in our early work with folks we found they didn't always move the perspective (so we made a tutorial to engage them in doing just that). 

    We also have participants engage with and without the VR headsets (some folks can't wear them, and during the pandemic we couldn't provide them to participants). We ended up developing a way to record where participants looked while watching 360 videos in the background (i.e., without screen-recording software). We're now working on improving this tool so teacher educators can use it for research and practice (it is the tool we coupled with our machine learning approach so we can examine what PSTs are focusing on in their 360 video viewing).
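    A minimal sketch of the kind of background viewpoint logging described above (hypothetical names, regions, and sampling rate; not the team's actual tool): sample the viewer's yaw at a fixed rate while the video plays, then sum dwell time per labeled region of the scene.

    ```python
    from collections import defaultdict

    def dwell_seconds(samples, regions, sample_hz=10):
        """Sum viewing time per labeled region from a log of yaw samples.

        samples: iterable of yaw angles (degrees), one per 1/sample_hz seconds.
        regions: dict mapping label -> (min_yaw, max_yaw) in degrees. For
        simplicity this sketch assumes regions do not cross the +/-180 seam.
        """
        counts = defaultdict(int)
        for yaw in samples:
            for label, (lo, hi) in regions.items():
                if lo <= yaw <= hi:
                    counts[label] += 1
        # Convert sample counts to seconds
        return {label: n / sample_hz for label, n in counts.items()}

    # 2 s oriented toward the board, then 1 s toward a small group
    log = [0.0] * 20 + [120.0] * 10
    print(dwell_seconds(log, {"board": (-30, 30), "small_group": (90, 150)}))
    # {'board': 2.0, 'small_group': 1.0}
    ```

    Dwell summaries like this are the sort of feature one might feed to a classifier to characterize patterns of noticing across a whole viewing.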

  • Kimberly Arcand

    Researcher
    May 12, 2021 | 02:00 p.m.

    Thanks for all of the information.  Great to hear about this.

     
  • Elizabeth Adams

    Researcher
    May 12, 2021 | 03:12 p.m.

    Hi Karl and team, this is awesome work. I am wondering if you have used 360 with classroom observational measures. I wonder how raters would go about processing the large amount of information captured here. 

     
  • Karl Kosko

    Lead Presenter
    Associate Professor
    May 13, 2021 | 08:40 a.m.

    We have not, but that is a very interesting measurement question/topic for study. There's definitely enough 360 video material that could be used for such purposes. If this is something of interest for your work, we do have several videos on our YouTube channel and website. (I'd ask that you use a video that has me as the "teacher," since a premise for all the other videos was to focus on students' learning rather than evaluate the teachers; I'm fine with being evaluated/critiqued myself.) I'd be happy to chat about that in more detail.

  • Brian Foley

    Facilitator
    Professor
    May 15, 2021 | 04:16 p.m.

    This is really interesting. We have had our student teachers use a lot of classroom video to evaluate their own teaching and other people's teaching. But the 360 video seems to add a lot more detail to what you can see and hear about the classroom (great demo in your video!). How difficult is it to implement in the classroom?  Do the pupils get distracted by the presence of the 360 camera?  How easy is it to scale the use of this technology?

  • Jamie Mikeska

    Researcher
    May 18, 2021 | 10:58 a.m.

    This is a fascinating project! You mentioned that you plan to provide teachers with feedback to support their noticing. What are your plans for the nature and type of feedback you will use to support teachers' development of their noticing practices?