Presenters
  • Libby Gerard, Project Learning with Automated, Networked Supports (PLANS), http://wise.berkeley.edu, University of California, Berkeley
  • Lauren Applebaum, Postdoctoral Scholar, PLANS, http://wise.berkeley.edu, University of California, Berkeley
  • Ady Kidron, Post-Doctoral Researcher, PLANS, http://wise.berkeley.edu, University of California, Berkeley
  • Jonathan Lim-Breitbart, Web Designer and Developer, PLANS, http://wise.berkeley.edu, University of California, Berkeley
  • Marcia Linn (http://gse.berkeley.edu/people/marcia-linn), Professor of Cognition and Instruction, PLANS, http://wise.berkeley.edu, University of California, Berkeley; Educational Testing Service; National Science Foundation
  • Beth McBride, PLANS, http://wise.berkeley.edu, University of California, Berkeley
  • Jonathan Vitale (https://www.linkedin.com/profile/view?id=115215126), Post-Doctoral Researcher, PLANS, http://wise.berkeley.edu, University of California, Berkeley
Public Discussion
  • Brian Drayton

    Researcher
    May 16, 2017 | 10:32 a.m.

    Hi, 

An intriguing environment. I am curious what it takes for teachers to learn to make use of the automated scoring, whether they ever disagree with the system's judgments, and what you do about such differences in judgment.

     
  • Marcia Linn

    Co-Presenter
    May 16, 2017 | 06:45 p.m.

    Thanks for your question. PLANS makes several uses of automated scoring. For essays, the most common is to use the scores to guide students to revise their essays. We find that teachers are glad to have help in guiding student revisions. Teachers often comment that they emulate the knowledge integration guidance that the system provides when they make their own comments. 

    For MySystem diagrams, students also get knowledge integration guidance and use the guidance to make revisions to improve the accuracy of the diagrams. Teachers applaud the system since it would take them considerable time to give similar guidance. They are able to spend their time guiding students on other challenges.
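
To make the revise-on-score loop concrete, here is a minimal sketch. The scorer, the knowledge integration (KI) levels, and the guidance messages below are invented stand-ins for illustration; they are not WISE's actual scoring models or feedback text.

```python
# Minimal sketch of a score-then-guide revision loop.
# score_essay is a hypothetical stand-in for an automated
# knowledge-integration (KI) scorer; levels and messages are invented.

def score_essay(essay: str) -> int:
    """Toy KI scorer: counts causal connectives linking science ideas."""
    connectives = ("because", "so that", "therefore", "which causes")
    links = sum(essay.lower().count(c) for c in connectives)
    return min(5, 1 + links)  # KI levels from 1 (lowest) to 5 (highest)

# Guidance keyed to the automated score, emulating score-driven
# revision hints delivered to the student.
GUIDANCE = {
    1: "Add a science idea that explains your claim.",
    2: "Link your ideas: explain how one idea causes another.",
    3: "Add evidence from your investigation to support your links.",
    4: "Consider an alternative explanation and compare it to yours.",
    5: "Strong response! Review your essay for clarity.",
}

def guide_revision(essay: str) -> str:
    """Return the revision hint matched to the essay's automated score."""
    return GUIDANCE[score_essay(essay)]

print(guide_revision("CO2 traps heat because it absorbs infrared radiation."))
# -> Link your ideas: explain how one idea causes another.
```

In the real system the scores come from trained automated models rather than a keyword count, but the shape of the loop (score, then guidance keyed to the score, then student revision) matches the use described above.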

    See: 

    Gerard, L., & Linn, M. C. (2016). Using automated scores of student essays to support teacher guidance in classroom inquiry. Journal of Science Teacher Education, 27(1), 111-129. doi:10.1007/s10972-016-9455-6

Gerard, L. F., Ryoo, K., McElhaney, K. W., Liu, O. L., Rafferty, A. N., & Linn, M. C. (2015). Automated guidance for student inquiry. Journal of Educational Psychology. doi:10.1037/edu0000052

     
  • Marcia Linn

    Co-Presenter
    May 16, 2017 | 06:52 p.m.

    Hi folks,

    It would be great to hear your experiences combining hands-on and simulation-based investigations. 

    Let us know your ideas.

     
  • Martin Storksdieck

    Facilitator
    May 17, 2017 | 09:09 a.m.

    Wow: this is an ambitious project and I love how it integrates hands-on and computer-based exploratory learning. Like Marcia, I am curious about other experiences in this area. But my biggest question is: assuming data show that students learn a variety of things, and that participating teachers can handle the complexity of the environment, how would or could you get this type of hybrid exploration into more classrooms? 

     
  • Beth McBride

    Co-Presenter
    May 17, 2017 | 04:23 p.m.

    Hi Martin,

We agree that the environment is quite complicated for teachers: managing both the hands-on and virtual aspects can be a challenge. We've designed the virtual environment specifically to guide students through the design process and alert them to what materials they will use, how they will test their designs, and so on. This takes some of the load off any teacher using the curriculum. We are also developing succinct guides for teachers that will be embedded in the teacher tools in WISE. These guides would alert teachers to the materials they need, the length and pacing of the project, difficult points for students, and possible classroom openers. Any teacher can use WISE in their classroom, and we hope that by developing a few more resources around these hands-on projects, they will become more usable for teachers new to WISE as well.

     
  • Martin Storksdieck

    Facilitator
    May 18, 2017 | 02:23 a.m.

Thanks for elaborating, Beth. If I understand correctly, you'll increase usability for teachers by lowering barriers to implementation. But what is the recruitment strategy? How do teachers find out about it? I am wondering whether you could get this to places where teachers already are, like PBS LearningMedia.

     
  • Jackie DeLisi

    Facilitator
    May 17, 2017 | 10:00 p.m.

    I love how students can manipulate aspects of models and view graphs. I wonder what students are learning about the data they generate. You mention that students are learning about the content, but have you been able to demonstrate that students develop deeper understandings about collecting data or using evidence to support claims?

     
  • Jonathan Vitale

    Co-Presenter
    May 18, 2017 | 12:04 a.m.

    Hi Jackie,

Yes, getting our students to think deeply about data is definitely one of our goals. One way we've done this is by "flipping" the traditional 'run the model' -> 'view the data' ordering: instead, the students 'make the data' -> 'run the model'. For example, in a unit on climate change, students draw a line graph depicting how much carbon dioxide is in the air over the course of the model's run. They also draw a graph to predict the temperature. When they run the model, it actually evolves according to the carbon dioxide graph they drew. They can then compare the temperature graph produced by the model to their own prediction graph.

    This general approach helps them to interpret data and make connections between what they see on the graph and the scientific context.
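
In miniature, that flipped workflow could be sketched like this. The linear temperature response and all the numbers below are invented for illustration; they are not the actual WISE climate model.

```python
# Sketch of the "flipped" workflow: the student-drawn CO2 graph drives
# the model run, and the resulting temperature curve is compared with
# the student's own hand-drawn prediction.
# The linear temperature response is a toy stand-in for the real model.

def run_model(co2_curve, sensitivity=0.01, baseline=14.0):
    """Temperature (deg C) at each step as a linear response to drawn CO2 (ppm)."""
    return [round(baseline + sensitivity * (ppm - 280), 2) for ppm in co2_curve]

# A student's drawn CO2 graph (ppm over five time steps) and their
# hand-drawn temperature prediction (deg C) for the same steps.
student_co2 = [280, 320, 360, 400, 440]
student_prediction = [14.0, 14.2, 14.5, 15.5, 16.0]

model_temp = run_model(student_co2)

# Per-step gap between prediction and model output: the comparison
# students make when they overlay the two graphs.
gaps = [round(p - m, 2) for p, m in zip(student_prediction, model_temp)]

print(model_temp)  # [14.0, 14.4, 14.8, 15.2, 15.6]
print(gaps)        # [0.0, -0.2, -0.3, 0.3, 0.4]
```

The point of the flip shows up in the last line: the gaps between prediction and model output give students something concrete to interpret and revise.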

    I hope that helps!

     
  • Further posting is closed as the showcase has ended.