Jodi Asbell-Clarke
Director, EdGE at TERC
https://edge.terc.edu/

Teon Edwards
Co-founder of EdGE, Game Designer, and Production Manager
Educational Gaming Environments (EdGE) at TERC

Elizabeth Rowe
Director of Research
Educational Gaming Environments (EdGE) at TERC

Project: Zoombinis: The Full Development Implementation Research Study of a Computational Thinking Game for Upper Elementary and Middle School Learners
http://ZoombinisEdu.com
Presenters’ Choice

Public Discussion

Continue the discussion of this presentation on the Multiplex.

  • Jodi Asbell-Clarke

    Lead Presenter
    Director, EdGE at TERC
    May 15, 2017 | 08:02 a.m.
    Thanks for taking the time to watch our video on Zoombinis. We would love to strike up conversations with folks thinking about the measurement of computational thinking and/or game-based learning assessment in general. Or even if you have questions about the research in general, we’d love to hear from you.

    We are also looking for teachers for our research studies. If you or someone you know is teaching Computational Thinking in grades 3-8 and would like to get involved in Zoombinis research, please go to http://zoombiniseducators.terc.edu/

    Thank you and enjoy!
  • Cyrus Shaoul

    Researcher
    May 15, 2017 | 04:00 p.m.

    This is great stuff! Here are my questions: how can a game like Zoombinis be used as a formative assessment in a classroom? Are you looking at the meta-cognitive aspects of this game? How do you think the affect of the learner changes learning in Zoombinis?

    Thanks so much for this wonderful video.

  • Jodi Asbell-Clarke

    Lead Presenter
    Director, EdGE at TERC
    May 15, 2017 | 04:39 p.m.

    Hey Cyrus. We are looking at the behaviors that players exhibit during gameplay as a way to formatively assess their learning. This allows us to measure implicit learning rather than relying only on the explicit knowledge that learners can express on a test. In our previous studies we have looked at how challenge in game-based learning impacts learning, but we have not connected that research to Zoombinis yet. Thanks!

  • Cyrus Shaoul

    Researcher
    May 15, 2017 | 04:41 p.m.

    Great! Thanks for the link. I will check it out.

  • Daniel Heck

    Researcher
    May 15, 2017 | 04:50 p.m.

    Thank you for sharing your work. My kids and I have been playing the Zoombinis games for many years -- we love them!

    I've always thought of the games as tests of logic and deductive reasoning, not computational thinking. Could you say more about the relationship and distinctions between CT and logic/deduction?

  • Jodi Asbell-Clarke

    Lead Presenter
    Director, EdGE at TERC
    May 15, 2017 | 05:05 p.m.

    Great question. I think they are related but viewed through slightly different lenses. CT is about the process of how to break down problems and abstract general solutions that can then be designed into algorithms for a computer program (or any procedural representation). Boolean logic (AND, OR, NOT) is part of those solutions, and so is deductive reasoning. If deductive reasoning is the process of putting evidence together into a solution, CT is the process of breaking down the problems to solve and then generalizing that process. Does that make sense?
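
    As a concrete illustration of that relationship (this sketch is mine, not code from the game or the project), a single puzzle's rule is a Boolean predicate, while abstraction turns it into a parameterized one:

    # Illustrative sketch only: a Zoombini as a dict of attribute -> value pairs.
    zoombini = {"nose": "red", "eyes": "sleepy", "hair": "spiky", "feet": "wheels"}

    # Deductive reasoning applies a concrete Boolean rule for one puzzle:
    def passes_this_bridge(z):
        return z["nose"] == "red" or z["feet"] == "wheels"  # OR logic

    # Abstraction generalizes it: any puzzle that sorts on a single
    # attribute/value pair is an instance of this parameterized predicate.
    def sorts_on(attribute, value):
        return lambda z: z[attribute] == value

    accepts_red_noses = sorts_on("nose", "red")  # reusable for any attribute
    print(passes_this_bridge(zoombini), accepts_red_noses(zoombini))  # True True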

  • Neil Plotnick

    Facilitator
    Teacher
    May 15, 2017 | 05:06 p.m.

    Always nice to see how games can be used to enhance learning. I would like to know more about the teachers using explicit strategies in their classroom to track computational thinking with students. During the video, you briefly showed a classroom where students were navigating a grid pattern on the floor.

  • Jodi Asbell-Clarke

    Lead Presenter
    Director, EdGE at TERC
    May 15, 2017 | 06:45 p.m.

    Thanks, Neil. We have teachers use a variety of bridge activities, including video clips from the game with discussion points, acting out a puzzle, and using data tables and eventually pseudocode to represent the game's algorithms and their own algorithms for solving the puzzles. We have also created a few Scratch activities using the Zoombinis characters. In Fall 2017 we begin our implementation study to examine this in more detail.
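
    As a purely hypothetical example of where that data-table-to-pseudocode progression can lead (my sketch, not the project's published bridge materials), a student's algorithm for a single-attribute sorting puzzle might be written as:

    # Hypothetical student algorithm as runnable pseudocode: record each
    # attempt in a data table, then infer which attribute the puzzle sorts on.
    ATTRIBUTES = ["nose", "eyes", "hair", "feet"]

    def infer_rule(attempts):
        """attempts: list of (zoombini_dict, accepted_bool) observations."""
        for attribute in ATTRIBUTES:
            accepted = {z[attribute] for z, ok in attempts if ok}
            rejected = {z[attribute] for z, ok in attempts if not ok}
            if accepted and not accepted & rejected:  # values separate cleanly
                return attribute, accepted            # candidate sorting rule
        return None  # no single attribute explains the data yet; keep testing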

  • Shuchi Grover

    Researcher
    May 15, 2017 | 09:23 p.m.

    So cool, Jodi! I'm so keen on educators having ways of providing kids non-programming avenues for building these skills (as you've seen in our VELA STEM+C project).
    Very interested in learning what you find in your studies about which aspects of CT kids are learning. I'm particularly interested in how you define abstraction in the context of Zoombinis and how you measure it.
    All the best to you all in your project!

  • Nicole Reitz-Larsen

    Facilitator
    Educator
    May 16, 2017 | 10:23 a.m.

    I love the explanation you gave earlier of computational thinking. You've got some great discussion going on around the activities, the students, and their thinking. I'm curious to know how you are working with teachers to help build their understanding of computational thinking so that they can facilitate those discussions with students as they are working on and off the computer to build their CT skills.

  • Jodi Asbell-Clarke

    Lead Presenter
    Director, EdGE at TERC
    May 16, 2017 | 10:30 a.m.

    Thanks, Nicole. You tapped right into one of the trickiest parts of our work these days. I think teachers are doing cool CT stuff in class, but they call it different things. There is a lot of discussion of variables in Math and Science, and we have similar problem-solving strategies in many areas... but figuring out how to help teachers see those connections takes a lot of thought and time. We are developing bridge materials to help teachers make these connections between the CT concepts developed in the game and what they are teaching in Math, Science, TechEd, and even ELA and other subjects.

     
  • Nicole Reitz-Larsen

    Facilitator
    Educator
    May 17, 2017 | 11:51 p.m.

    As you work on those resources, I'd be interested to hear how you connect CT concepts, the game and Math, Science, TechEd, ELA, etc.

  • Jodi Asbell-Clarke

    Lead Presenter
    Director, EdGE at TERC
    May 18, 2017 | 07:29 a.m.

    Thanks, Nicole - we will publish our findings and our curriculum after our 2017-2018 study.

     

  • Jeremy Roschelle

    Researcher
    May 16, 2017 | 10:53 p.m.

    Hi Jodi,

    Thanks for the clear discussion of computational thinking and great example of how to use the "classic" Zoombinis to measure it. I was wishing for a closing "punchline" in the video -- suppose your program of research plays out (say in a couple of years) and Zoombinis is a valid assessment of CT. What's the biggest contribution of this work that you can imagine? What's the headline news 2-3 years from now?

    best,

    jeremy

     

  • Jodi Asbell-Clarke

    Lead Presenter
    Director, EdGE at TERC
    May 17, 2017 | 07:27 a.m.

    Thanks, Jeremy - I think that line was left on the cutting-room floor :) Implicit game-based learning assessments, like what we are building with Zoombinis, can include all types of learners, even those who may have barriers to traditional assessments. By measuring what people DO rather than just what they SAY or WRITE, we may be able to unleash the potential of all learners - and nowhere is that more important than in Computational Thinking. Our new line of work is looking at the connections between CT and Autism, ADD, and Dyslexia so that we can leverage the strengths of all learners.

  • Neil Plotnick

    Facilitator
    Teacher
    May 17, 2017 | 07:46 a.m.

    I would be VERY interested in what your work on children with learning differences is showing. I am a licensed Special Education teacher (as well as a CS teacher) and have found that computers are a great tool for several reasons. They have infinite patience, some children will have greater focus with a screen and keyboard compared to a book or teacher, and some students naturally gravitate to computers over other learning scenarios. 

     

  • Jodi Asbell-Clarke

    Lead Presenter
    Director, EdGE at TERC
    May 17, 2017 | 08:29 a.m.

    Thanks, Neil - we just applied for new grants for this work... so stay tuned!

  • Katharine Sawrey

    Graduate Student
    May 17, 2017 | 09:11 a.m.

    Jodi, I'd like to build on this thread. I love the idea of assessments being grounded in students' "doing" rather than their "reporting." I am curious how you qualify computational thinking. The video mentions that your group used the results of 70 players to operationalize CT in terms of click actions in Zoombinis. What do you feel are the strengths and shortcomings of your analysis?

     
  • Chris Dede

    Higher Ed Faculty
    May 17, 2017 | 11:11 a.m.

    We need performance-based ways to measure computational thinking, so this is an important topic.

  • Mary Dussault

    Researcher
    May 17, 2017 | 03:05 p.m.

    Hi Jodi,

     

    It's great to dip back into your work these many years on! I'm also really interested in Katharine's and Chris's comments -- can you give an example of how your analysis of the "big data" of user actions has helped you develop a predictive(?) model of CT? 

  • Jodi Asbell-Clarke

    Lead Presenter
    Director, EdGE at TERC
    May 17, 2017 | 04:59 p.m.

    Hi Mary (great to hear from you!!), and Chris, and Katharine too,

     

    We use the video analysis on our sample of roughly 70 players just to provide ground truth (human analysis) of what strategies players exhibit while they are playing. We are looking at just 5 of the puzzles right now, several levels each, for all these players.

     

    While they solve the Zoombinis puzzles, we watch for indicators of systematic testing (such as holding one variable constant or trying one of each thing to lay out the domain space) and within those events of systematic testing we label evidence of problem decomposition, pattern recognition, abstraction, and algorithm design.

     

    So if they are holding one thing constant, they are in problem decomposition; when they see the particular value of the Zoombini (e.g. red noses) that works for the puzzle, that might be pattern recognition; when they realize that the general rule is that the puzzle is sorting on noses, that is abstraction; and when they repeat the same type of strategy in multiple applications, that is consistent with algorithm design.
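
    For a rough sense of what one of those indicators looks like computationally, here is a simplified sketch of a "held one variable constant" detector; the event format and the detection logic are illustrative assumptions, not the project's actual coding scheme:

    ATTRIBUTES = ["nose", "eyes", "hair", "feet"]

    def held_constant(trials):
        """trials: a player's consecutive attempts, each a dict of
        attribute -> value. Returns attributes the player kept fixed
        while varying others - one indicator of problem decomposition."""
        if len(trials) < 2:
            return []
        return [a for a in ATTRIBUTES
                if len({t[a] for t in trials}) == 1        # 'a' never varied
                and any(len({t[b] for t in trials}) > 1    # something else did
                        for b in ATTRIBUTES if b != a)]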

     

    Once we can get good reliability among ourselves as human coders of these behaviors, we build data mining models to recognize those patterns within huge sets of play data (far more than we could ever code by hand). The more data we use, the more we can refine and train our models to be better predictors of those behaviors. Meanwhile we will study students playing Zoombinis under a variety of conditions to see if/how the behaviors they exhibit in their gameplay predict how they improve on pre/post assessments of the same CT facets. Make sense?
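
    In broad strokes, that pipeline resembles the sketch below; the features shown and the choice of scikit-learn are illustrative assumptions, not necessarily EdGE's actual stack:

    from sklearn.linear_model import LogisticRegression
    from sklearn.model_selection import cross_val_score

    # Each row: features distilled from one player's log on one puzzle level,
    # e.g. [n_attempts, fraction_varying_one_attribute, mean_secs_between_moves]
    X = [[12, 0.75, 3.2], [30, 0.10, 1.1], [8, 0.90, 4.0], [25, 0.20, 0.9]]
    y = [1, 0, 1, 0]  # human-coded label: was systematic testing observed?

    model = LogisticRegression().fit(X, y)
    # Cross-validated agreement with the human coders is the key quality check
    print(cross_val_score(model, X, y, cv=2))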

  • Nicole Reitz-Larsen

    Facilitator
    Educator
    May 17, 2017 | 11:55 p.m.

    I like the description you give of students exploring concepts as they are learning. Are they journaling which concepts they use during the game, to identify how and when they were used and what changes they might make afterward?

  • Jodi Asbell-Clarke

    Lead Presenter
    Director, EdGE at TERC
    May 18, 2017 | 07:28 a.m.

    Hi Nicole - we are not doing journaling with students (but we do have teachers in the study complete logs about their bridging activities). We are looking at implicit learning that is demonstrated by behaviors rather than by formal expressions (writing)... we will tie these findings to other, more explicit assessments as the research goes on. Thanks!

  • Katharine Sawrey

    Graduate Student
    May 17, 2017 | 07:22 p.m.

    Yes, thanks. 

  • Lien Diaz

    Facilitator
    Sr. Director
    May 18, 2017 | 02:38 p.m.

    What a great project! It's interesting to me to hear about the different ways that computational thinking is being measured in this project. Can you share more about how computational thinking is defined in game-based learning? In other words, what are some of the "skills" that are elicited/observed/being researched that are considered computational thinking? 

  • Kathy Perkins

    Director
    May 18, 2017 | 09:55 p.m.

    Thanks for sharing, Jodi! Are you finding signature patterns in the back-end data that help you identify those engaged in productive computational thinking? And those engaging in ways that are not productive?

     

  • Jodi Asbell-Clarke

    Lead Presenter
    Director, EdGE at TERC
    May 19, 2017 | 09:13 a.m.

    Thanks, Lien and Kathy. Zoombinis is about characters that have four different attributes (nose, eyes, hair, and feet), each with five different values. We are able to watch as players choose strategies to discern which values of which attributes are salient to the puzzles in the game.

    In some cases, it is not the Zoombini attributes that the players are testing, as in the case of Pizza Pass. Each pizza troll wants a specific set of toppings for its pizza and ice cream sundaes. Players often start by making their own favorite pizza or by random trial and error, but we can watch as they learn to be systematic in their testing, choosing one topping at a time and keeping track of the results. We can also see them abstract their individual test results into generalized rules.

    Also, because the game allows players to create their own packs of Zoombinis, they can use overall algorithms to help them solve puzzles, like always keeping one attribute of all Zoombinis the same - that reduces the ambiguity of the puzzles for them overall.

    I hope that makes sense. It is hard to get more specific than that outside the context of each puzzle, but play the game and let me know if you want to know how we are analyzing any of the puzzles of interest. We are looking at Allergic Cliffs, Pizza Pass, Mudball Wall, Fleens, and Bubble Wonder. Thanks, Jodi
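
    To make the one-topping-at-a-time strategy concrete, here is a toy model; the toppings, the hidden preference, and the troll's feedback rule are simplifications invented for this sketch, not the real puzzle's mechanics:

    TOPPINGS = ["cheese", "pepperoni", "mushrooms", "peppers", "olives"]
    SECRET = {"cheese", "mushrooms"}  # the troll's hidden preference

    def troll(pizza):
        """Simplified feedback: 'yuck' if any unwanted topping is present,
        'more' if wanted toppings are missing, 'yum' if exactly right."""
        if pizza - SECRET:
            return "yuck"
        return "yum" if pizza == SECRET else "more"

    # Systematic testing: try one topping at a time and track the results.
    wanted = {t for t in TOPPINGS if troll({t}) != "yuck"}
    assert troll(wanted) == "yum"
    print("Deduced toppings:", wanted)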

  • Further posting is closed as the showcase has ended.