Jacqueline Barber
Director of the Learning Design Group
Digital Internship Modules for Engineering (DIMEs)
Lawrence Hall of Science, Amplify

Eric Greenwald
Director of Assessment and Analytics
Digital Internship Modules for Engineering (DIMEs)
University of California, Berkeley, Lawrence Hall of Science

Ryan Montgomery
Learning Analytics Lead
Digital Internship Modules for Engineering (DIMEs)
University of California, Berkeley, Lawrence Hall of Science

Kathryn Quigley
Producer and Media Lead
Digital Internship Modules for Engineering (DIMEs)
Lawrence Hall of Science
Public Discussion


  • Sharon Lynch

    Researcher
    May 14, 2017 | 06:33 p.m.

    What an exciting new approach to NGSS's design emphasis, and to conducting research into how students learn to do design. Another great project for the Lawrence Hall of Science.

     

     
  • Jacqueline Barber

    Lead Presenter
    Director of the Learning Design Group
    May 15, 2017 | 07:03 p.m.

    Thanks, Sharon!

  • Tami LaFleur

    Facilitator
    K-12 STEM Coordinator
    May 15, 2017 | 08:37 a.m.

    Very interesting concept to have internships online! It is very authentic to have an email from the 'project director' each morning. Is the project director the teacher? What is the age/grade span for these projects that have been developed? I love the video of the students working; how many students have already been a part of this project? Lastly, it is appealing that your model for engineering has been condensed to three overarching themes: Define, Develop Solutions, and Optimize. We don't want to overwhelm students by thinking there are just 'way too many steps' to the engineering process (even though each of the three has criteria within).

     
  • Jacqueline Barber

    Lead Presenter
    Director of the Learning Design Group
    May 15, 2017 | 07:26 p.m.

    Hi Tami,

    Our Engineering Internships have been designed for middle school students. We have conducted field tests with over a thousand students, and are midway through research trials with hundreds more.

    The "fictional" project director, who greets the students in the initial video, is the one the daily email comes from, and s/he also provides personalized feedback to students on their designs. (The teacher is actually the one who decides what feedback to send students, and when—but that is behind the scenes.) The teacher plays the role of internship coordinator: she coaches students from the side as help is requested, and advises students on what the expectations of an intern are, treating this as students' first foray into the "work world." We set things up in this way for several reasons: 1) to create a more authentic feel, where students are "on first." Even though these internships are used in a school context, we want students to feel like they are in an internship, and that this is their opportunity to shine. Teachers have commented that students really step up to the opportunity; 2) so teachers new to engineering wouldn't feel like they need to have engineering expertise in order to use these with students—many teachers have told us they have really enjoyed being in a sidelines role, both as a change of pace and as a chance to see their students really tackle challenges handed to them "by someone else"; and 3) to provide a more student-centered experience.

    Yes—Cary correctly anticipated my answer to your last comment. Students engage in a full inquiry cycle, but our assessment focuses on one of the three core ideas in engineering from the NGSS—Optimization (and we break that down further).

    Thanks so much for taking the time to look at the video and ask questions!

    Jacquey

  • Cary Sneider

    Higher Ed Faculty
    May 15, 2017 | 09:57 a.m.

    In response to the above question, I understand that this digital module focuses on just one of the three core ideas in engineering from the NGSS, Optimization, and breaks that down into three areas. My question is whether the intention is to replace hands-on work with digital scenarios, or to use a blended approach? Has LHS decided to go all-digital to save the teacher time and trouble?

     
  • Jacqueline Barber

    Lead Presenter
    Director of the Learning Design Group
    May 15, 2017 | 10:00 p.m.

    Hi Cary!

    Our digital internships do include hands-on experiences. Each internship begins with a Research Phase, where students come to understand the problem they are trying to solve and become familiar with the variables involved. It is in this Research Phase that students engage in hands-on experiences. For instance, in the Force and Motion Digital Internship, students are challenged to design and test emergency supply drop pods, attempting to minimize cargo damage to ensure the safety of the cargo, maximize the shell condition so the pod can be used for shelter, and keep pod costs low so that more pods can be built and dropped to as many people in need as possible. In the Research Phase, students conduct the classic egg drop experiment. This serves as a physical model for the design challenge, and helps them gain experience in manipulating the variables of mass, velocity, and duration of collision as ways to reduce impact force.

    They then move into the second phase of their internship—the Design Phase. We created digital design tools where students conduct their ultimate design of solutions. While students learn a lot from the hands-on egg drop experience that couldn't be learned otherwise, they aren't able to make and test countless iterations, as they are with the digital design tool. In this case, using a digital model is more fruitful for learning about optimization than the physical model of the egg drop, and it is authentic to how many engineers work.

    The third phase of the experience is the Proposal Phase, where students write a proposal (responding to the RFP they got at the beginning of the experience). The proposals they write are actually design arguments, in which students muster evidence to support their proposed design.
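    For readers who want the underlying physics: the egg-drop model rests on the impulse-momentum relation, average force = mass × impact velocity / collision duration, so cargo can be protected by reducing mass, reducing impact velocity, or lengthening the collision. A minimal sketch (illustrative only; the function name and numbers are ours, not the project's design tool):

    ```python
    # Impulse-momentum relation behind the egg-drop / supply-pod model:
    # F_avg = (mass * impact velocity) / collision duration.
    # Illustrative sketch only; this is not the project's design tool.

    def average_impact_force(mass_kg, velocity_ms, collision_s):
        """Average force on the cargo during the landing collision, in newtons."""
        return mass_kg * velocity_ms / collision_s

    # Padding that doubles the collision duration halves the average force.
    hard_landing = average_impact_force(2.0, 10.0, 0.01)    # 2000 N
    padded_landing = average_impact_force(2.0, 10.0, 0.02)  # 1000 N
    ```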

    Hope this makes sense!

    Jacquey

  • Vivian Guilfoy

    Senior Advisor
    May 15, 2017 | 11:48 a.m.

    Great project. The internships are inviting and challenging. How do students decide which they will pursue? I'd be very interested in what you have learned so far about student thinking. Do you have early results? Also, I'm curious to know the nature of the interaction with the project director. Do students have any "real time" interaction (online or in person) with the project director, or is it all done in the daily email to students?

  • Jacqueline Barber

    Lead Presenter
    Director of the Learning Design Group
    May 15, 2017 | 10:28 p.m.

    Thanks for your questions, Vivian! We designed the digital engineering internships to follow science units on specific topics. The idea is that students will engage in learning relevant science, and then apply that science as they take on the design challenge in the Digital Engineering Internship. So, for instance, the internship described above, which involves students in creating emergency supply drop pods, would be used as a culminating experience following a unit on Force and Motion. The whole experience serves in some ways as an assessment of their ability to apply their newfound science knowledge to a novel problem. Given this, we haven't offered students a choice of which internship to take on. Because students turn in their proposals (basically an argument supported by evidence for their proposed design), we have seen that students are successful in coming up with a range of defensible designs, and can support why they think so. We don't yet have results regarding the different pathways students take to come up with a solution that optimizes for X or Y. Stay tuned!

  • Tami LaFleur

    Facilitator
    K-12 STEM Coordinator
    May 15, 2017 | 08:05 p.m.

    Thank you, Jacquey. I work with middle school teachers and students, and I think many of them do NOT feel like experts in engineering. Therefore, a program like yours is attractive and exciting. Do you plan to market this to schools? If so, when? I look forward to seeing more. I wonder if any students report wanting to become engineers in their future...

  • Jacqueline Barber

    Lead Presenter
    Director of the Learning Design Group
    May 15, 2017 | 10:41 p.m.

    Thanks, Tami!

    The 1.0 version of these Digital Engineering Internships is currently part of the Amplify Science middle school curriculum, the Lawrence Hall of Science's newest K-8 program, designed from the ground up to meet the NGSS.

    One of the things that I am proud of is that our 6 different internships enable students to step into the role of a range of different kinds of engineers, including mechanical engineers, chemical engineers, biomedical engineers, geotechnical engineers, and food engineers. And each one of the engineering projects has a humanitarian purpose. We are hopeful that this will make engineering appeal to more than the usual suspects.

    Thanks again.

     

  • Tami LaFleur

    Facilitator
    K-12 STEM Coordinator
    May 16, 2017 | 09:54 a.m.

    Yes! Many times people think 'engineer' is a specific job. This sounds like a great way to create awareness of the varying aspects of this pathway.

  • Christopher Whitmer

    Researcher
    May 16, 2017 | 05:29 p.m.

    This is great. How much freedom do they have in problem specification, conceptualization, and analysis in their internships, or are you primarily focused on optimization?

  • Eric Greenwald

    Co-Presenter
    Director of Assessment and Analytics
    May 16, 2017 | 06:20 p.m.

    That's a great question. In the units, students work early on to specify the problem (e.g., articulating/operationalizing design criteria and constraints), develop initial design solutions, and through iterative testing of their designs, get great practice analyzing data.

    We are initially focusing on optimization because it pulls in some of those practices (e.g., attention to design criteria, revising designs based on empirical evidence) and because we expect it to be reasonably tractable by analyzing student interactions with the digital design tool. That is to say, we expect the moves students make in the digital environment (that we are applying our learning analytics model to assess) to reveal a lot about how well they are optimizing solutions. Our long term hope is that we can parlay what we learn about using analytics to assess optimization to assess other science and engineering practices--stay tuned! 

  • Sherry Hsi

    Researcher
    May 17, 2017 | 10:39 a.m.

    I really like the rich problem scenarios that the team has created and offered to students in the online internships. The problems are also inviting to students. I want to do this!  

    I am curious about the learning analytics model you are using, and whether you can tell me more about it. My colleagues at the Concord Consortium are using several analytic methods, one of which uses Monte Carlo Bayesian Knowledge Tracing to assess knowledge growth from physics game log files in the context of a high school inquiry activity, also funded by the National Science Foundation. The possibility of being able to scale dynamic online assessments and formative feedback on more complex learning tasks and activities to many, many students is exciting.

  • Ryan Montgomery

    Co-Presenter
    Learning Analytics Lead
    May 17, 2017 | 02:26 p.m.

    Hi Sherry,
    Thanks for your enthusiasm and your question about our analytics model! We, too, are currently exploring the use of several analytics models. Our main analytics model is a Dynamic Belief Network - similar to the BKT method you mention, but positing (and fitting for) additional connections between skills. We are currently developing two layers of interpretation of the log-file data to feed into our Dynamic Belief Network: 1) a top-down approach consisting of evaluations of student behavior along expert-specified patterns, and 2) a bottom-up approach that uses those pattern evaluations (along with the rest of the log-files) as features in a machine learning analysis to discover additional "optimization-aligned" patterns of behavior.
    And yes, the potential applications and scalability of these kinds of automated analyses are super exciting! Such a wonderful time to be in the field!
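    The BKT method Sherry mentions is the single-skill special case that Dynamic Belief Networks generalize (by adding links between skills). As a minimal sketch of the core update, here is classic Bayesian Knowledge Tracing; the parameter values are illustrative defaults, not the project's fitted values:

    ```python
    # Classic single-skill Bayesian Knowledge Tracing (BKT) update: the special
    # case that Dynamic Belief Networks generalize by adding links between skills.
    # Parameter values below are illustrative defaults, not fitted values.

    def bkt_update(p_known, correct, p_learn=0.15, p_slip=0.10, p_guess=0.20):
        """Return P(skill is known) after observing one scored attempt."""
        if correct:
            evidence = p_known * (1 - p_slip)          # knew it and didn't slip
            marginal = evidence + (1 - p_known) * p_guess
        else:
            evidence = p_known * p_slip                # knew it but slipped
            marginal = evidence + (1 - p_known) * (1 - p_guess)
        posterior = evidence / marginal                # Bayes' rule
        # Chance the skill is learned between opportunities.
        return posterior + (1 - posterior) * p_learn

    # Trace estimated mastery over a sequence of evaluated design moves.
    p = 0.3  # prior probability the student already has the skill
    for outcome in [False, True, True, True]:
        p = bkt_update(p, outcome)
    ```

    Correct attempts raise the mastery estimate and incorrect ones lower it, with slip and guess probabilities keeping any single observation from swinging the estimate too far.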

  • Dale McCreedy

    Facilitator
    Vice President of Audience & Community Engagement
    May 16, 2017 | 10:24 p.m.

    This seems like a perfect stepping stone to exploration and problem solving related to authentic issues occurring in the community where these students live. Have you seen examples of application of or interest in addressing real life problems by the students?  I am wondering if there is movement from the book contexts and characters to personal ownership and application?  Or - have you considered embedding community issues into the internship challenges?

     
  • Michael Kolodziej

    Facilitator
    Associate Vice President
    May 17, 2017 | 06:03 p.m.

    This is also an area of interest for me in terms of scaling this sort of activity. There seem to be some clear trade-offs in the use of models and controlled environments vs. real-world applications, from the availability of meaningful learning analytics to the ability to ensure that all learners, regardless of geographic location or other demographic factors, have the same opportunity to explore and play.

    I’m wondering if there is any difference in the level of engagement, motivation, or learning-outcome achievement between students who work in these types of controlled environments and those who work on a truly authentic, hands-on project.

    Thanks in advance for any thoughts you may have.

  • Eric Greenwald

    Co-Presenter
    Director of Assessment and Analytics
    May 17, 2017 | 07:25 p.m.


    Thanks, Dale and Michael, those are great questions--and you've nicely captured some of the key trade-offs. In piloting and field trials, we've found that kids are really invested in the problems that are the focus of the existing internships (we pilot and iterate on the ideas to make sure that they are solving problems that are meaningful to them). But the idea of kids parlaying these experiences, and their newfound engineering chops, to tackle local problems is a really cool one that I hope we can explore in the not-too-distant future! And, wow, doing so would enable some great research to examine those differences.



     


     

     
  • Michael Kolodziej

    Facilitator
    Associate Vice President
    May 19, 2017 | 01:07 p.m.

    Agreed!  It's really exciting to see learning occur in more deeply situated and even authentic environments, with assessments that have real-world application.  I hope to hear how this project evolves and appreciate the great work!

  • David Webb

    Researcher
    May 16, 2017 | 11:51 p.m.

    Very clever design to the project -- but I shouldn't be surprised. ; )

    Enjoyed watching the video!

  • Eric Greenwald

    Co-Presenter
    Director of Assessment and Analytics
    May 17, 2017 | 07:04 p.m.

    Thanks!

  • Leah Clapman

    Informal Educator
    May 17, 2017 | 02:35 p.m.

    Wow, what a wonderful resource. How do schools access the internships? My son goes to DCPS in Washington, DC and I'm passing this on to his teacher!

  • Eric Greenwald

    Co-Presenter
    Director of Assessment and Analytics
    May 17, 2017 | 07:12 p.m.

    Thanks, Leah! The Engineering Internship units themselves are commercially available as part of the Amplify Science middle school curriculum. The analytics/digital assessment elements are under development as part of this ongoing NSF study, but we'll be taking what we learn and incorporating it into the units for a later version of the curriculum. 

  • Dale McCreedy

    Facilitator
    Vice President of Audience & Community Engagement
    May 17, 2017 | 11:02 p.m.

    With regard to kids taking on the challenges of addressing local issues, be sure to look at Angie Calabrese Barton's project http://videohall.com/p/864 - called i-engineering. Do love this project...

  • Eric Greenwald

    Co-Presenter
    Director of Assessment and Analytics
    May 19, 2017 | 12:47 a.m.

    That is a cool project, indeed! Thanks for the tip!

  • Kathy Perkins

    Director
    May 18, 2017 | 10:07 p.m.

    I love the project design, and the subcategories you have started with: exploration, systematicity, and feedback. Do you find it's pretty clear when students transition from exploration to systematicity?

    I can't wait to learn more about the results from the classroom.

  • Eric Greenwald

    Co-Presenter
    Director of Assessment and Analytics
    May 19, 2017 | 12:54 a.m.

    Thanks! I'd say that the transition is a little fuzzy (we don't have a fixed cutoff for one or the other), and we've seen some evidence that kids go back into an 'explore' mode at different moments throughout their iterative design process. What we're trying to work out is a sense of when it is particularly important to be exploring, and when it's particularly important to approach the design task systematically--and we expect that this is in part a function of the structure of the Internships, and in part a function of how the student is approaching the problem space.
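    One purely hypothetical way to operationalize that distinction from design-tool logs (an illustration, not the project's actual model): label each design revision by how many variables changed at once, on the theory that one-variable-at-a-time changes suggest a controlled, systematic search while multi-variable changes suggest exploration of the design space.

    ```python
    # Hypothetical heuristic (not the project's fitted model): classify each
    # design revision in a log as "systematic" or "explore" by counting how
    # many design variables changed between consecutive submitted designs.

    def label_moves(designs):
        """designs: chronological list of dicts mapping design variables to values."""
        labels = []
        for prev, curr in zip(designs, designs[1:]):
            changed = sum(1 for var in prev if prev[var] != curr.get(var))
            # One variable at a time suggests a controlled, systematic search;
            # several at once suggests exploration of the design space.
            labels.append("systematic" if changed <= 1 else "explore")
        return labels

    # Hypothetical log of three pod designs from the Force and Motion internship.
    log = [
        {"mass": 5, "parachute": 2, "padding": 1},
        {"mass": 2, "parachute": 4, "padding": 3},  # three changes -> explore
        {"mass": 2, "parachute": 4, "padding": 4},  # one change -> systematic
    ]
    ```

    Labels like these could then feed a model such as the Dynamic Belief Network described earlier in the thread as pattern-evaluation features.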

  • Dale McCreedy

    Facilitator
    Vice President of Audience & Community Engagement
    May 19, 2017 | 12:15 p.m.

    The individualized nature of this project and the ability to customize for skills is really quite incredible. Awesome project and using the internship model is fantastic! Thanks!

  • Eric Greenwald

    Co-Presenter
    Director of Assessment and Analytics
    May 19, 2017 | 02:03 p.m.

    Thanks!

  • Further posting is closed as the showcase has ended.