Evaluating Efficacy of Remote Learning Content and Tools

This past spring, our lives as we knew them were turned upside down. The greatest things to come out of this scary, life-altering pandemic were the innovations and truly caring hearts that came to the forefront of our communities. As an educator, I was incredibly thankful to still be collecting a paycheck and have that stressor relieved; even so, my job changed quite a bit. And while it was not all negative, pushing a 100% in-person model online was still an enormous change. Education leaders, ed-tech companies, and organizations got to work immediately, creating resources and support for teachers who were switching to a remote model at the drop of a hat. It warmed my heart to see all the love and collaboration flowing through education communities to help one another get through this quick change!

With all of these quickly produced and released resources flooding into teachers' inboxes, a question came up in my team's collaborative planning meetings: which resources were best, and which should we be focusing our time on? What a great question! I was reminded of that conversation recently as I once again plan for going back to school in a fully remote model. How do we test the efficacy and effectiveness of not only resources, but also digital tools?

How can coaches partner with educators to reflect on digital learning content and tools to enhance remote learning?

ISTE Coaching Standard 3: Collaborator

While searching for ways that educators have been able to reflect on and analyze digital learning content and tools, I came across the article "6 Ways Administrators Can Prove the Efficacy of Digital Tools," written by Eric Sheninger, a digital leadership expert at the International Center for Leadership in Education. Sheninger walks through a list of six ways to prove digital tool efficacy.
He suggests starting with a look at pedagogy, then moving on to the research behind tools and content. You can then examine why you are choosing those resources and finish with reflection. My favorite piece of this post is the set of reflection questions that can be posed to educators to help them think through the effectiveness of the tools and content they are using:

- Did my students learn?
- How do I know if my students learned?
- How do others know if my students learned?
- What can be done to improve?
- What point of view have I not considered?

With wonderful insight from some of my Digital Education Leadership cohort members, we extended these questions to give educators more information when deciding whether to continue with digital learning content or tools.

Did my students learn?
- Which students learned? Was one grouping of students able to access this content or tool with more success than another?
- How can you differentiate so that all students have the same access to the content or tool?

How do I know if my students learned?
- What formative assessment strategies will you, as the educator, use to determine whether students have learned?
- What success criteria will be in place?

How do others know if my students learned?
- How will a student know they were successful?
- How will this learning be easily communicated to parents?
- How will administration see that students are aware of their learning with this digital content or tool?

What can be done to improve?
- Is more scaffolding necessary?
- Is this content within my students' zone of proximal development? If not, how can I ensure students will be able to stretch to understand it?

What point of view have I not considered?
- Is this digital content or tool culturally responsive?
- What trauma-informed practices can be integrated with this content or tool so that students with higher ACEs (adverse childhood experiences) have learning similar to that of students with fewer ACEs?
- How will our ELL (English language learner) students access this content or tool?
- How will a student with a 504 plan, or one receiving specially designed instruction, access this content or tool?

All of these questions will help a coach work through the process of evaluating digital content or tools to ensure they are effective. Through a different approach, educators could rate their digital content or tool using a rubric; ISTE provides one example. No matter what method you choose for reflecting on digital content or tools, the most important piece is that you take the time to actually reflect.

How do you reflect on digital learning content or tools? What are some other pieces of teaching that you feel coaches can help educators reflect on? Comment below!

References

ISTE Standards for Coaches. (n.d.). ISTE. Retrieved August 1, 2020, from https://www.iste.org/standards/for-coaches

Klein, A. (2019, November 18). Digital Learning Tools Are Everywhere, But Gauging Effectiveness Remains Elusive, Survey Shows. Education Week. https://www.edweek.org/ew/articles/2019/09/18/digital-learning-tools-are-everywhere-but-gauging.html

Sheninger, E. (2020, May 6). 6 Ways Administrators Can Prove the Efficacy of Digital Tools. EdTech Magazine: Technology Solutions That Drive Education. https://edtechmagazine.com/k12/article/2017/11/6-ways-administrators-can-prove-efficacy-digital-tools
