NSF Awards: 1451604
Project Learning with Automated Network Support is an NSF-funded cyberlearning initiative. Called PLANS for short, this work is powered by the open-source Web-Based Inquiry Science Environment (wise.berkeley.edu). The PLANS units showcased in this video were developed by an interdisciplinary team of learning scientists, computer scientists, disciplinary experts, teachers, and engineers. Units combine hands-on design with a range of analytic technologies to improve student learning. The video features student designs of solar-powered ovens and self-propelled vehicles, along with the computer models students use to investigate the science principles behind their designs. Automated scoring of knowledge integration essays and drawings guides student exploration. Teachers use these same scores to monitor student progress, lead discussions, plan openers, and assign grades. Results show that diverse learners, including students in urban schools and language learners, gain integrated understanding of complex concepts and practices while developing their identity as scientists. PLANS is transforming science and engineering education for current and future learners.
Units are free and open-source. Check out the units at wise.berkeley.edu.
Brian Drayton
Hi,
An intriguing environment. I am curious what it takes for teachers to learn to make use of the automated scoring, whether they ever disagree with the system's judgment, and what you do about differences in judgment.
Marcia Linn
Professor of Cognition and Instruction
Thanks for your question. PLANS makes several uses of automated scoring. For essays, the most common is to use the scores to guide students to revise their essays. We find that teachers are glad to have help in guiding student revisions. Teachers often comment that they emulate the knowledge integration guidance that the system provides when they make their own comments.
For MySystem diagrams, students also get knowledge integration guidance and use it to revise and improve the accuracy of their diagrams. Teachers applaud the system since it would take them considerable time to give similar guidance. They are able to spend their time guiding students on other challenges.
See:
Gerard, L., & Linn, M. C. (2016). Using automated scores of student essays to support teacher guidance in classroom inquiry. Journal of Science Teacher Education, 27(1), 111-129. doi:10.1007/s10972-016-9455-6
Gerard, L. F., Ryoo, K., McElhaney, K. W., Liu, O. L., Rafferty, A. N., & Linn, M. C. (2015). Automated guidance for student inquiry. Journal of Educational Psychology. doi:10.1037/edu0000052
Marcia Linn
Professor of Cognition and Instruction
Hi folks,
It would be great to hear your experiences combining hands-on and simulation-based investigations.
Let us know your ideas.
Martin Storksdieck
Director and Professor
Wow: this is an ambitious project and I love how it integrates hands-on and computer-based exploratory learning. Like Marcia, I am curious about other experiences in this area. But my biggest question is: assuming data show that students learn a variety of things, and that participating teachers can handle the complexity of the environment, how would or could you get this type of hybrid exploration into more classrooms?
Beth McBride
Hi Martin,
We agree that the environment is quite complicated for teachers - managing both the hands-on and virtual aspects can be a challenge. We've designed the virtual environment specifically to guide students through the design process and alert them to what materials they will use, how they will test their designs, etc. This takes some load off of any teacher using the curriculum. We are also developing succinct guides for teachers that will be embedded in the teacher tools in WISE. These guides would alert teachers to the materials they need, the length and pacing of the project, difficult points for students, and possible classroom openers. Any teacher can use WISE in their classroom, and we hope that by developing a few more resources around these hands-on projects, they will become more usable for teachers new to WISE as well.
Martin Storksdieck
Director and Professor
Thanks for elaborating, Beth. If I understand correctly, you'll increase usability for teachers by lowering barriers to implementation. But what is the recruitment strategy? How do teachers find out about it? I am wondering whether you could get this to places where teachers are, like PBS LearningMedia?
Jackie DeLisi
Research Scientist
I love how students can manipulate aspects of models and view graphs. I wonder what students are learning about the data they generate. You mention that students are learning about the content, but have you been able to demonstrate that students develop a deeper understanding of collecting data or using evidence to support claims?
Jonathan Vitale
Post-Doctoral Researcher
Hi Jackie,
Yes, getting our students to think deeply about data is definitely one of our goals. One of the ways we've done this is by "flipping" the traditional 'run the model' -> 'view the data' ordering. Instead, the students 'make the data' -> 'run the model'. So, for example, in a unit on climate change, the students draw a line graph that depicts how much carbon dioxide is in the air over the course of the model's run. They also draw a graph to predict the temperature. When they run the model, it evolves according to the carbon dioxide graph they drew. Then they can compare the temperature graph produced by the model to their own prediction graph.
This general approach helps them to interpret data and make connections between what they see on the graph and the scientific context.
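To make the flipped ordering a bit more concrete, here is a minimal sketch of the idea (this is not the actual WISE/PLANS model code; the function name and the simple linear forcing rule are assumptions for illustration only): the student-drawn carbon dioxide series is the input that drives the model, and the model's temperature output is what students compare against their prediction graph.

```python
# Illustrative sketch only: a toy simulation driven by a student-drawn CO2 series.
# The names and the linear forcing rule are assumptions, not the real model.

def run_toy_climate_model(co2_series, start_temp=14.0, sensitivity=0.01):
    """Evolve temperature from a student-drawn CO2 time series.

    co2_series: CO2 values (ppm), one per model time step, taken from the
                graph the student drew.
    Returns a list of temperatures, one per step, which students can compare
    against their own predicted temperature graph.
    """
    baseline_co2 = co2_series[0]
    temps = [start_temp]
    for co2 in co2_series[1:]:
        # Toy rule: temperature drifts in proportion to CO2 above the baseline.
        temps.append(temps[-1] + sensitivity * (co2 - baseline_co2))
    return temps

# Example: a student draws a steadily rising CO2 curve...
student_co2_graph = [280 + 5 * t for t in range(20)]
model_temps = run_toy_climate_model(student_co2_graph)
# ...then compares model_temps with the temperature curve they predicted.
```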
I hope that helps!