Presenters: An Embodied, Augmented Reality Coding Platform for Pair Programming (https://insight.ucsd.edu/)

  • Ying Wu, Project Scientist, University of California San Diego (https://insight.ucsd.edu/our-team/)
  • Amy Eguchi, Assistant Teaching Professor, University of California San Diego (https://eds.ucsd.edu/discover/people/faculty/eguchi.html)
  • Thomas Sharkey, PhD Student, University of California San Diego (https://www.tlsharkey.com)
  • Monica Sweet, Co-Director of Research and Evaluation, University of California San Diego
  • Robert Twomey, Assistant Professor, University of Nebraska Lincoln (http://roberttwomey.com)
  • Timothy Wood, Postdoc, University of California San Diego
Public Discussion

Continue the discussion of this presentation on the Multiplex.

  • Michael Chang

    Facilitator
    Postdoctoral Researcher
    May 11, 2021 | 12:15 p.m.

    Thanks for sharing this project! I appreciate how this approach is driven by a desire to increase self-efficacy and self-confidence among non-dominant groups in computer science. Could you talk more about the specific affordances of the AR/VR space, and compare them against using physical embodiments of coding primitives (e.g., wooden blocks)? How does the AR/VR technology specifically augment the experience of the young people?

     
  • Ying Wu

    Lead Presenter
    Project Scientist
    May 12, 2021 | 10:53 a.m.

    Good questions, Michael.  Part of our project centers on deciding what types of affordances are most likely to benefit learning.  We have conducted a need-finding study, interviewing several coding educators in our region.  Many of them rely on physical embodiments of coding primitives to teach core concepts (e.g. using paper airplanes passed between students to represent transfer of information from functions).  In VR/AR, it is possible to visualize this sort of metaphor in a systematic way so that it gets reinforced every time a person uses a function.  We hope that this type of systematicity can benefit learning.  

  • Ying Wu

    Lead Presenter
    Project Scientist
    May 11, 2021 | 01:14 p.m.

    Welcome to Embodied Coding, a project funded by the National Science Foundation through the Cyberlearning for Work at the Human-Technology Frontier program.  This video offers an overview of our work to develop whole-body approaches to learning to code, leveraging Augmented and Virtual Reality (AR/VR).  We hypothesize that human understanding of abstract computational concepts (e.g. functions, data, variables, and so forth) is grounded in embodied experience – and aim to facilitate learning of these concepts through an AR/VR platform whose affordances allow users to exercise their embodied knowledge as they produce computer programs.  Feel free to reach out with questions and comments!  We look forward to your feedback.

     
  • Sara Kazemi

    Graduate Student
    May 11, 2021 | 03:06 p.m.

    I am really looking forward to seeing your hypotheses tested with underrepresented students who are reluctant learners of CS. I was interviewed as a high school CS educator regarding this project! I’m moving on to specialize in interactive intelligence in a CS grad program, so I look forward to following this research. 

     
  • Karl Kosko

    Higher Ed Faculty
    May 11, 2021 | 04:34 p.m.

    Very nice video and interesting project! I noticed in the video that you used both hand tracking and the controllers. When using the controllers to model/construct code, is there haptic feedback? Do you think the lack of haptic feedback with using one's hands may have any sort of 'negative' effect (i.e., use of the controllers may facilitate embodied interaction to a higher degree)? Or do you think the reverse may be the case? 

    I'm looking forward to seeing what you produce in this project!

     
  • Ying Wu

    Lead Presenter
    Project Scientist
    May 12, 2021 | 01:20 p.m.

    Thanks, Karl!  Your point about haptic feedback is intriguing.  For now we are focusing on visual, auditory, and kinesthetic perception. Definitely haptic feedback is something to explore in the future. 

     
  • Kimberly Arcand

    Researcher
    May 12, 2021 | 02:48 p.m.

    I'm also interested in hearing about some of the pros/cons to using hand tracking vs. controllers.  Following!

  • Robert Twomey

    Co-Presenter
    Assistant Professor
    May 14, 2021 | 11:08 a.m.

    Hi Kimberly, thanks for your question! Though we have done initial design studies using the VR headset with hand controllers, we are gravitating toward hand tracking for both our VR prototypes and our eventual AR coding platform. Camera-based hand tracking (with skeleton models) lends itself to a more granular sampling of user gesture (down to wrists, fingers, etc.), unencumbered by the need to hold a physical controller. We are still determining the role of user gesture within our platform, but expect hand tracking will lend itself to gestural expressivity from users, a more natural mode of interface. Quest 2, HoloLens 2, and WebXR, all tools/platforms we are developing with, have implemented hand-tracking interfaces we are building off of. I do like Karl's point above about haptic feedback being one positive affordance of the controllers.
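    To illustrate how skeleton-based hand tracking yields granular gesture data, here is a minimal sketch of a pinch detector driven by per-joint positions, such as a hand-tracking API (e.g., WebXR Hand Input) might report. The joint coordinates and threshold below are hypothetical, for illustration only, and are not the project's actual code:

```javascript
// Euclidean distance between two joint positions ({x, y, z} in meters).
function distance(a, b) {
  const dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
  return Math.sqrt(dx * dx + dy * dy + dz * dz);
}

// A "pinch" fires when thumb tip and index tip come within `threshold`
// meters of each other. 0.02 m is an illustrative guess, not a tuned value.
function isPinching(thumbTip, indexTip, threshold = 0.02) {
  return distance(thumbTip, indexTip) < threshold;
}

// Example: fingers 8 cm apart (open hand) vs. 1 cm apart (pinched).
const open = isPinching({x: 0, y: 0, z: 0}, {x: 0.08, y: 0, z: 0});   // false
const pinched = isPinching({x: 0, y: 0, z: 0}, {x: 0.01, y: 0, z: 0}); // true
```

    Because every joint is tracked, the same pattern extends to richer gestures (grabs, points, two-handed stretches) without any held controller.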

  • Kit-Bacon Gressitt

    Higher Ed Faculty
    May 11, 2021 | 06:21 p.m.

    Interesting, Ying. To me, one of the many uninitiated, it seems akin to VR entertainment. ... I hope outreach to BIPOC students works.

     
  • Teon Edwards

    Co-founder of EdGE & Game Designer
    May 12, 2021 | 09:57 a.m.

    Very interesting idea; I'm really interested in seeing how your hypotheses test out, as well as how the technology itself progresses. Thank you.

    The video imagery focused on VR; I'm wondering about your thoughts on how embodied coding would be impacted by seeing the coding on top of a real-world situation related to what's being coded. For example, what if (samples of) the variables were actually in the room, building on what you suggest about "dropping the variable into the container"?

     
  • Ying Wu

    Lead Presenter
    Project Scientist
    May 12, 2021 | 10:59 a.m.

    Hello, Teon!  Yes -- I agree that AR is a very powerful tool, and from the start, we conceptualized our coding platform in AR.  For prototyping and development purposes, we have been using VR.  The standalone Quest 2 is certainly much cheaper than the HoloLens.  However, we will explore AR in the future as well.  Please stay in touch!  (ywu@ucsd.edu, https://insight.ucsd.edu)

     
  • Sean Strough

    Informal Educator
    May 12, 2021 | 10:53 a.m.

    So cool when you can find a project that is both useful and fun. From what I'm seeing, VR is highly appealing to kids of all ages. I have no doubt that this can only make STEM and CS even more appealing to a wider audience! Good luck!!

     
  • Andres Colubri

    Facilitator
    Assistant Professor
    May 12, 2021 | 10:54 a.m.

    Very cool project, the idea of embodiment in problem solving is very powerful but overlooked. Two questions:

    * The video seems to focus solely on VR, and I'd imagine that working in an AR environment would significantly expand the possibilities of mapping algorithmic entities to physical objects. Are you working in that direction as well?

    * Also from the videos, it seems that you are creating some kind of visual programming environment in VR… where you have containers representing variables, blocks for operations, and connectors for flow of information. I'm wondering if even more direct translations could work, as in puzzle games such as LittleBigPlanet, where algorithmic rules are embedded into a virtual mechanism or system that is more immediately tangible.

  • Ying Wu

    Lead Presenter
    Project Scientist
    May 12, 2021 | 01:23 p.m.

    I appreciate your feedback, Andres!  Yes -- we are planning ultimately to design our platform and activities for AR -- however, we are using VR for prototyping purposes at the moment.  We will definitely check out LittleBigPlanet.  Thanks for the suggestion.

  • Suzy Gurton

    Informal Educator
    May 12, 2021 | 02:24 p.m.

    As a relative new user of VR, I find it fatiguing to use. Is there a limit to how long students are comfortable in the VR environment?

  • Ying Wu

    Lead Presenter
    Project Scientist
    May 13, 2021 | 01:55 p.m.

    Good question, Suzy.  There is variability in people's tolerance of VR.  At times, it can induce motion sickness.  Ultimately, our platform will be instantiated in Augmented Reality, which may prove less fatiguing.

  • Anita Crowder

    Researcher
    May 13, 2021 | 09:32 a.m.

    Thank you for sharing this project.  I have worked with high school students using VR, but not in this context.  I taught programming and they built software for the HTC Vive.  This is very intriguing from a meta-cognitive perspective.  It is almost like physical recursion!  I look forward to hearing more about the findings from your work! 

     
  • Ying Wu

    Lead Presenter
    Project Scientist
    May 13, 2021 | 07:21 p.m.

    Thanks, Anita!  Yes -- we are in the design phase now -- but hope to have the core components of our platform in place by next year.

     

     
  • John Schumann

    Higher Ed Faculty
    May 13, 2021 | 06:56 p.m.

    I find this research very interesting. In my own work, I am looking at abstract concepts that do not have all the characteristics of physical entities. They lack mass, energy, and observability, but nevertheless, they can have causal effects on the world. They are concepts such as democracy, freedom, motivation, emotion, peace, obstruction etc.  I'll be very interested in how this research with abstract computational concepts develops, and to see its possible relevance to the study of less-than-fully-physical concepts.

     

     
  • Ying Wu

    Lead Presenter
    Project Scientist
    May 13, 2021 | 08:04 p.m.

    Indeed, John -- embodied cognition is important for our ability to reason about many abstract concepts above and beyond the domain of computation.  Great to hear from you!

     

  • Eric Hamilton

    Higher Ed Faculty
    May 14, 2021 | 11:44 a.m.

    Ying, we are just starting a CS strand of activity with partners in the Middle East, and I would love to learn whether there are ways that we could connect with the innovation your group is advancing. I am not sure how it could play out, but my intuition is that there could be some powerful synergies.  We are based up the street from you in Los Angeles and Malibu.  I hope to connect post forum on this.  Many thanks.

  • Ying Wu

    Lead Presenter
    Project Scientist
    May 15, 2021 | 08:58 p.m.

    Thanks, Eric -- yes it seems that there are many intersections between our groups.  I would love to talk further.  

  • H Chad Lane

    Higher Ed Faculty
    May 14, 2021 | 01:24 p.m.

    What wonderful ideas to explore for new coding interfaces. I can see how engaging this could be for many learners. I was specifically intrigued by the visual appeal of how control structures looked in the interface.... Have you considered taking (or do you already take) advantage of the fact that you have a 3D space to work in?  I'm curious how you would leverage that: given that all Scratch-like or text-based coding environments are 2D, the affordance of depth seems like a highly novel aspect for investigation. Perhaps there are some novel ways to represent abstraction?  Thanks again, this is incredibly creative work!

     
  • Robert Twomey

    Co-Presenter
    Assistant Professor
    May 14, 2021 | 03:19 p.m.

    Thanks Chad for your comments. Yes, we are particularly excited about the new possibilities that arise from extending visual coding into 3D space. For instance, users can attach code to particular locations in space, harnessing coders' spatial memory and spatial organization strategies to scaffold the arrangement of code logic. With some programming tasks, code might be attached to particular objects. This becomes particularly interesting for programming robots or IoT devices, for instance, where we can display live code execution, debug, or pre-visualize future output spatially attached to the robot or device. Finally, we could even use the architecture of the room as a meaningful framework for code arrangement. We see many exciting avenues to explore!

     
  • Jeremy Roschelle

    Facilitator
    Executive Director, Learning Sciences
    May 17, 2021 | 01:07 a.m.

    Interesting video, team! The "container" metaphor reminded me of Boxer, a variant of Logo that I admittedly worked on circa 1985. You might want to look up Andy diSessa's papers on Boxer -- here's one -- Andy was very thoughtful about principles like Spatial Metaphor and Naive Realism -- and although your tech is newer, the principles may be helpful to you.

     
  • Ying Wu

    Lead Presenter
    Project Scientist
    May 17, 2021 | 02:02 p.m.

    Thanks, Jeremy!  It's great to get the perspective of a classic, old-school developer.  It seems that some of the concepts motivating VPLs such as Scratch were already nascent in Boxer.  I appreciate this paper.

  • Robert Twomey

    Co-Presenter
    Assistant Professor
    May 18, 2021 | 06:42 p.m.

    Thanks Jeremy, this is a great reference for us! I wasn't familiar with Boxer. More evidence of the wealth of ideas and approaches in these lesser-known histories of programming languages. I look forward to digging into the paper. I see the second author is Harold Abelson, who co-authored one of my favorite programming books of all time, Structure and Interpretation of Computer Programs (SICP).

  • Pendred Noyce

    Founder and Executive Director
    May 17, 2021 | 12:35 p.m.

    Very interesting video, and one that raises so many questions, such as how far can you get in teaching about abstract concepts through physical motion and extended, embodied metaphor? I will be fascinated to learn about your findings from this project, especially transfer of concepts from VR to a more traditional coding environment.

     
  • Ying Wu

    Lead Presenter
    Project Scientist
    May 17, 2021 | 02:22 p.m.

    Thanks, Pendred, for your insight.  I agree that transfer of learning to 2D visual programming or text-based coding is an important question to address.  For now, our focus centers on how the affordances of coding in 3D space can facilitate computational concept learning.  But you make a good point -- and we must definitely keep in mind the importance of transfer of knowledge and skills to 2D platforms as we design our coding environment.  Also, it may not be clear from the video, but we are ultimately planning to design a coding system (akin to Scratch in some ways) for Augmented Reality.  The examples shown in the video are just prototypes created in VR for development purposes.

     