Planning for Success with Digital Collaboration

Even before technology entered the classroom, group projects had a bad rap. Students worry that the work will not be shared equally or that others' actions (or inaction) will affect their grade. Teachers likewise want to ensure that collaboration results in all students accessing the content.

The benefit of using technology to facilitate collaboration is that students' actions can be easily quantified and qualified. Features like the Revision History within Google Apps reveal each student's contribution to an assignment in color-coded format. Posts on a discussion board or LMS platform also make a student's level of participation apparent. But what can teachers do to eliminate the need for this "gotcha" approach and instead be proactive about ensuring the success of digital collaboration?

Carefully and intentionally structuring courses and projects is one way that teachers can ensure students have meaningful digital collaborations, thereby satisfying ISTE Coaching Standard 3a, “Model effective…collaborative learning strategies to maximize teacher and student use of digital tools and resources and access to technology-rich learning environments” (Iste.org, 2017).

The Argument for Collaboration

Though it may seem like planning for collaboration is more involved than traditional assignments, the benefits are overwhelming. Dr. Patti Shank makes the following argument for collaboration in the higher education classroom: "[S]ocial interaction can positively influence learning, motivation, and problem-solving, and can help learners gain needed support and overcome frustration" (n.d.). I put together the following infographic to highlight Shank's rationale for incorporating collaborative learning.

Planning for Collaboration

One of my favorite sayings is "failing to plan is planning to fail." Planning is vital to the success of collaboration. According to Shank, "It takes preparation and practice to design and implement good collaborative activities, and learners need preparation and practice to get the most from them" (n.d.). For guidance in what this planning might look like, I turned to an article written by Jan Engle, a coordinator of instruction development at Governors State University.

Build Collaboration into the Course

Engle suggests making your expectations regarding collaboration clear from the beginning. To ensure that the responsibility for learning is shared by all students in a group, Engle makes participation in group work a grade requirement: failing to adequately participate results in an automatic single letter-grade reduction (e.g., an A becomes a B). Engle does this "because really bad group experiences and failure to participate in the online environment just decimate the sense of community we've worked so hard to develop up to that point" (n.d.).

Initially Focus on Process over Product

Even adult learners may enter the classroom unprepared for successful collaboration. Instead of making assumptions about what students can or can’t accomplish as a group, Engle suggests explicitly teaching collaboration. Depending on the age group, this might involve giving students the language to disagree. When I taught English Language Learners, we used the Kate Kinsella framework to provide students with sentence frames. More advanced learners might just need guidance in developing group norms.

Engle (n.d.) asks her groups to collaboratively discuss and then respond to the following questions:

  • How are you going to divide the project so that each team member has a part?
  • Who is going to be responsible for each part?
  • How are you going to communicate during the project?
  • How will members submit their work to the group?
  • What is the deadline for the submissions of individual pieces?
  • Who is going to be responsible for putting the pieces together into one paper [or presentation]?
  • How are you going to handle final proofing?
  • What will you do if somebody does not do his or her part or does not meet deadlines?
  • How are you going to go about answering questions that group members might have about the project?

Scaffold Up to Larger Projects

Beginning the collaboration process with a low-stakes project is a great way to test out the group dynamics and work through conflict. Early in a course, Engle assigns a group project that is “relatively easy and fun in order to emphasize group processes” (n.d.). Once students have the concept down, Engle then moves on to larger collaborative projects. One example of an introductory collaborative activity is an information scavenger hunt designed to introduce students to the basic concepts of research. Engle chose this task because it was easy for students to divide the tasks, was not worth many points, and wouldn’t create much room for conflict since the answers were all either right or wrong.

Engle also suggests introducing smaller collaborative components ahead of time in order to scaffold up to the larger assessment. This might include sharing responses with a partner who is then required to report them out to the class. Or you might use Jigsaw learning, where each group is responsible for reporting on a particular text or concept.

Multiple Modes of Monitoring

Peer Evaluation: While students are welcome to contact Engle at any point with concerns, they also have a say in their teammates' final grades. Collaborative project grades are based partly on the end result and partly on peer evaluation. That peer evaluation is based on a rubric that all students review. I really appreciate the addition of a rubric component to the peer feedback process because it helps students make quantitative evaluations rather than judging based on personal chemistry or connection. An additional step I would take is having students justify each line-item response on the rubric.

Teacher Observation: Whether students are collaborating on a Google Slides presentation, a discussion board, or a wiki page, Engle requires students to give her access throughout the process. One mistake many teachers make is being involved in the initial explanation of the assignment and then checking out until the final product is returned. By being involved every step of the way, you can head off potential inequities and disagreements. Even with this oversight, it is important to encourage a productive struggle before stepping in. Instead of simply solving the problem for students, consider how you might facilitate a resolution.

Self-Assessment: Though not mentioned by Engle as a monitoring strategy, I believe self-assessment to be a valuable tool in helping students ensure they are collaborating successfully. I have found that students are typically harder on themselves than peers (and sometimes even the teacher). Like peer evaluation, self-assessments can be based on a given rubric. In addition to the rubric reflection, I have also had success with asking students to explicitly share the contribution they made to their group on a particular day.

Conclusion

Just as it is essential to teach students rules and routines at the beginning of the school year, it is also essential to explicitly plan for and teach collaboration. The time investment made up front will pay off when learners are able to fairly and successfully participate in the online learning environment.

Sources:

Engle, J. (n.d.). How to promote collaborative active online learning. In Student collaboration in the online classroom (pp. 11-12). Retrieved from http://www.hartnell.edu/sites/default/files/u285/student-collaboration-in-the-online-classroom.pdf

Iste.org. (2017). ISTE Standards For Coaches. [online] Available at: https://www.iste.org/standards/for-coaches [Accessed 19 Jul. 2018].

Shank, P. (n.d.). Considering collaboration. In Student collaboration in the online classroom (pp. 12-13). Retrieved from http://www.hartnell.edu/sites/default/files/u285/student-collaboration-in-the-online-classroom.pdf

Archives and Analysis

Extending Literary Interpretation through Archival Research and Global Collaboration

Visit the Website for this Project

 

Goals

The goals of a 100 level college Introduction to Literature course include learning the rudiments of literary analysis, considering how literature interprets the human condition, and analyzing the cultural and historical contexts of works of literature to interpret the meaning of literature and articulate the contemporary relevance of literary works. For these goals to be realized, a literature course needs to inculcate both independent critical thinking and a classroom (or digital space) community of critical discourse.

Core college courses in the humanities, such as Introduction to Literature, generally include an outcome connected to digital information literacy competencies (see, for example, ISTE Standard for Students 2), such as (in the case of this course) "using contemporary technologies to select and use sources relevant to the study of literature." Instructors of introductory literature courses often interpret these competencies narrowly: students use library databases to find secondary works of interpretation to cite in analytical or interpretive papers. However, requiring novice readers of literature to synthesize scholarly interpretations too soon can undermine the development of students' independent critical, analytical, and interpretive skills. Even with appropriate instruction in synthesis writing, students in an introductory course may transfer previous habits of working with sources, such that their use of secondary sources tends to replace rather than extend independent inquiry.

By contrast, providing students with access to primary sources can give them additional textual and contextual elements that may serve as more productive tools for independent development of new lines of inquiry about the texts they have read. Charlotte Nunes (2015) described her incorporation of digital archives into a first-year literature class, arguing that "students can benefit greatly from even preliminary exposure to archives early in their undergraduate careers, by means of short-term, small-scale archival research tasks" (p. 115).

This teaching unit builds upon Nunes' suggestion by providing modelled and scaffolded access to digital archives. Students develop hypotheses about literary texts and address those hypotheses through contextual documents located in digital archives; in turn, the archival information they locate can problematize their original questions, extending their critical thinking and allowing them to better apply their new understanding of literary, political, and social history to current issues as well.

The use of digital archives to position students as knowledge constructors aligns with ISTE Standard for Students 3. Through critical curation of primary sources located through digital archives, students can use archival technologies to develop inquiries, explore real-world sources, grapple with ill-structured problems presented by how primary sources must be interpreted to provide contextual relevance (rather than with the predigested solutions that may be the focus of students’ use of secondary sources), and pursue more personally owned theories and answers.

 

Barriers

Students who are first or second year students at two-year colleges may or may not have taken the freshman year writing course sequence, may be nontraditional students with considerable life and academic experience, or may lack the preparation typically required by four-year colleges for admission, so their levels of skill in using research methodologies can vary considerably. Nunes (2015) noted that while learning the research strategies involved in archival research is beyond the scope of an introductory literature course, providing students with the “intellectual access” to archival materials can greatly deepen their ability to contextualize their thinking about the historical and social issues they encounter in literature (p. 117). Hence her approach to including primary sources in an introductory literature class typically involved students in working from instructor-provided primary sources.

Similarly, in a study of a problem-based learning project supported by digital archival resources, Chen and Chen (2010) noted that digital libraries face the challenge of effective informational architecture: even when curated by a college library, digital archives may not be intuitively or optimally organized for use by novice students. Likewise, Sharkey (2013), a professor of library science and Head of Information Use and Fluency at Illinois State University, noted that "information and technology are no longer separate entities but are inextricably connected" (p. 34), highlighting the importance of the instructor's role in designing technology fluency instruction that focuses on the higher order thinking that will "give students a high level of aptitude to interact fluently with both (the) information and technology" (p. 37). Thus, the use of digital archives in this learning context presents a twofold barrier: students lack knowledge of digital archival research methods, and instructors need 21st-century teaching competencies that can support students in understanding the nature of the information contained in digital archives, how that information is organized, and how to access and use it (see ISTE Standard for Students 1). What is needed are teaching and instructional design approaches that allow for student use of archival technologies while still foregrounding content learning and the extension of students' critical thinking and inquiry skills.

There are precedents for such an approach. In a controlled study of a Problem-Based Learning unit incorporating digital archives, Chen and Chen (2010) found that the use of digital archives that had been structured by the instructor resulted in deeper learning for students at three phases of the learning process (cognition, action, and reflection), in part because the problem of cognitive overload and the problem of students finding ineffective resources on the Internet were bypassed when more structured resources were presented (p. 25).

 

Solutions

The pedagogical frameworks that form the basis for this unit work together to support the concept of student “intellectual access” and draw upon the social constructivist approaches of Problem- and Project-Based Learning (PBL) in which student design of learning goals and iterative work on developing solutions (ISTE Standard for Students 4) is enacted through collaborative knowledge construction supported by the digital communication media that characterize these students’ world (ISTE Standard for Students 7).

Specifically, this unit develops a pedagogical foundation for the use of digital archives in an introductory literature course through:

  • The Community of Inquiry model and an outgrowth of it, the QUEST Model for Inquiry-Based Learning;
  • The cultivation of 21st-century instructional competencies in instructional design, facilitation, and collaboration (see ISTE Standards for Educators 4, 5, and 6; Florida State University Technology Integration Matrix) to scaffold and support students' development of critical knowledge construction and communication competencies; and
  • The affordances of 21st-century communication venues such as Web 2.0 content authoring tools, the platforms in which today's researchers and tomorrow's graduates will communicate, for similarly supporting students' development of critical knowledge construction and communication competencies.

 

The Community of Inquiry and QUEST Models

The Community of Inquiry model, which has been the subject of research for nearly 20 years at Alberta's Athabasca University and beyond, describes the inter-relationships between three key elements that must be present for a meaningful higher education learning experience to take place among a community of instructors and students: cognitive presence, social presence, and teaching presence. The most essential element, cognitive presence, denotes the cognitivist elements of the learning process (such as experience, questioning, pattern recognition, making and applying connections, and noting and reconciling dissonances). In this model, the overlap between cognitive presence, social presence, and teaching presence creates ways to focus on developing social critical discourse through the design of the educational experience.

QUEST is a model for an instructional unit or learning experience based on Community of Inquiry principles. In the QUEST model, which focuses on the elements of cognitive presence and social presence, students formulate a personalized Question about course content; Understand the topic better by conducting research and sharing sources; Educate and collaborate by interacting with peers through an iterative process of discussion of the shared questions and resources; engage in defining a Solution through reflection on the inquiry process; and Teach others by presenting a final product in a blog or other Web 2.0 genre that engages an authentic audience.

 

21st-Century Communication Media

The use in college humanities courses of multimodal composition assignments presented through blogs, video, and other Web 2.0 technologies represents more than a shift to 21st-century composition media. This movement is situated within a larger pedagogical "social turn" that emphasizes literacy as not merely cognitivist but sociocultural in nature. In this theoretical view, ways of reading and writing, such as genres and written and spoken forms of English, are generated by the communities of practice--professional and institutional, but also historical and cultural--who use these conventions to achieve shared goals (Gee, 2010). Thus people learn not literacy but "literacies" that involve the ability to engage with those communities in terms of their evolving discourse structures. From the perspective of compositionists, "digital literacies" involve the way digital tools are used within sociocultural groups, such as the scholarly communities who use blogs, digital archives, and online journals for scholarly communication, but also the many other sociocultural groups that produce work in online media to learn and communicate (Gee, 2010).

The design for this unit presupposes neither that the unit must be used in a primarily online course (in fact, it is piloted here in two face-to-face course sections) nor that digital composition tools should be chosen outside the context of a comprehensive list of criteria for how learning materials should be optimized for learning. Nevertheless, two factors suggest the use of a blog format for student work in this unit: first, the unit brings together students from across the globe in a condensed timeframe to achieve the desired learning outcomes of articulating the relevance of literature for contemporary contexts; and second, the unit has the reciprocal goal of developing students' existing social media literacies for an academic purpose and relating that purpose to 21st-century research and communication venues. Together, these factors indicate the type of computer-mediated learning context for which the QUEST model suggests a blog format for student work.

For this unit, students from multiple classes and locations need shared venues in which they can locate primary sources in digital archives, post sources and reflections on sources, collaborate through feedback, and share final presentations of the results of their inquiries. They need a research and communication environment that will support the process of development of deeper understanding of a problem, generating ideas, and finding solutions (Kuo, Chen, & Hwang, 2014). On the other hand, they need an environment with affordances that support the creative development of student final products that display learning. Further, they need to work in an environment in which the cognitive load of learning new technologies and methods is minimized.

Criteria considered in the choice of a primary content curation tool for this project included:

  • ease and affordability of access (a free tool was desired)
  • capacity for instructor and student curation of web links to digital archives
  • capacity for supporting student development of creative content
  • capacity for supporting student writing and revising
  • capacity for supporting the small group and peer-to-peer aspects of a research and writing process as well as the “voice to all” and visual presentation aspects of an archival product (Brownstein & Klein, 2006)
  • ease of use, including minimal layers of technology
  • privacy: students should be able to opt out of associating work posted publicly with their names

In addition to this list of criteria, the main venue for student work in this unit was considered in light of how it would support exemplary instructional design in terms of the 6A's Project Idea Rubric and Puentedura's (2003) Matrix Model for designing and assessing network-enhanced courses. Puentedura's model includes a diagnostic tool for selecting computer-based technology tools, known as SAMR, in which the best use of technological pedagogy achieves "redefinition," where "the computer allows for the creation of new tasks, inconceivable without the computer" (Puentedura, 2003). The combination of student digital archive research and curation with the collaborative process of the QUEST model, in the context of a highly creator-friendly blog space, seems to meet this criterion.

After developing two prototypes using different Web 2.0 tools, the instructors for this unit chose to create a Google Sites webpage to house students’ work, interaction, and access to digital archival resources. Support for student use of the site was provided both through a series of modeling sessions (Preparatory Lessons 1 and 2) and through written project instructions and a technology tutorial, as well as daily in-class “check-ins” and individualized support through email and in person.

 

21st-Century Teaching Competencies

Garrison, Anderson & Archer (2000), the authors of the Community of Inquiry model, note that "the binding element in creating a community of inquiry for educational purposes is that of teaching presence" (p. 96). They categorize this element into three indicator categories: instructional management, direct instruction, and building understanding. Instructional management has to do with design and planning, and with considerations of how technological teaching media change learning and call for the sorts of criteria for instructional choices considered above. For this unit, the lead instructor worked with the participating instructor to select and design learning environments, technologies, and a series of learning tasks structured according to the QUEST Model. The second teaching indicator, direct instruction, overlaps with unit design but is not a focus of the QUEST Model. Direct instruction involves both content and pedagogical expertise, and is described by Garrison, Anderson, & Archer as "those indicators that assess the discourse and efficacy of the educational process" (p. 101). A significant shift in computer-mediated instruction, such as that used in this unit, is the shift to instruction that takes place in written rather than verbal formats. Engaging this shift successfully involves much more than following a list of netiquette protocols or providing timely standards-based feedback at each stage of student work, and much more than choosing a unit design framework.
Key aspects of direct instruction considered in this unit included how the instructors would teach the literary skills and content to be employed in students' work, how we would teach requisite technology skills and content, how we would facilitate student engagement with the unit project, and how we would move the learning of the unit along, for instance through intervention or through providing opportunities for reflection. These aspects of direct instruction are nowhere more important than with a community college student population, with its diverse range of backgrounds in terms of culture, literacy, and preparation.

Two key ways in which we addressed direct instruction in this unit were instructor modelling and individualized support and feedback that anticipate and respond to student needs.

Pursel and Xie (2014) studied the use of blogs housed internally by a university to explore which blog patterns led to improved student performance over time. One finding of their study was the relationship between instructors’ use of modeling the behavior expected from students and student achievement. Instructors who model alongside facilitating and making effective technology choices are more likely to leverage student engagement.

Greener (2009), in her article "e-Modeling – Helping learners to develop sound e-learning behaviors," provides a fuller picture of what effective instructional modelling looks like. She calls not just for demonstrating proficient skills, but for taking risks and showing students what it looks like to try new things and face unexpected results, then comparing those approaches to more effective strategies. This sort of modelling leads beyond observation on students' part to collaboration, as students engage with the dynamic and uncharted nature of digital learning environments. The need for direct instruction through modelling of this kind formed the basis for Preparatory Lessons 1 and 2, which precede implementation of the student project in this unit. These lessons were designed to be used in as many iterations as needed, or to be distributed across class days, giving instructors flexible opportunities to provide effective direct instruction through modelling and collaboration as well as through lecture and the videos implemented in the student project.

Our second consideration for direct instruction involved how to proactively support individual student needs, including helping students develop the ability to use digital tools to connect with peer audiences, a key indicator of ISTE Student Standard 7. One approach was to provide modelling through samples of student work, including constructing peer feedback, and in Preparatory Lesson 2 to model not only products of student work but the process of constructing student work and peer feedback. A second was the decision to provide instructor feedback within the same learning and composing space that students occupied. (This decision was partly driven by the lack of a mechanism for a private communication channel between students and instructors on the blog site.) Thus in this unit, instructor provision of individualized feedback (direct instruction) is blended with small group facilitation (building understanding).

“Building understanding,” the third group of teaching indicators in the Community of Inquiry model, is described by Garrison, Anderson & Archer (2000) as follows:

A process that is challenging and stimulating is crucial to creating and maintaining a community of inquiry. This category is very much concerned with the academic integrity of a collaborative community of learners. It is a process of creating an effective group consciousness for the purpose of sharing meaning, identifying areas of agreement and disagreement, and generally seeking to reach consensus and understanding. Through active intervention, the teacher draws in less active participants, acknowledges individual contributions, reinforces appropriate contributions, focuses discussion, and generally facilitates an educational transaction. (p. 101)

Building understanding in a digital learning space is a 21st-century teaching competency that is critical for effective learning by community college students. The instructors of this unit considered whether instructor presence in students' learning spaces would stifle student conversations; our hypothesis that it would instead help to facilitate meaningful conversations was borne out in the exit survey for the pilot implementation of this unit, in which some students requested more instructor feedback evaluating the quality of the peer feedback they were providing, and a number of students expressed frustration at a lack of peer involvement. Suggestions for how instructors of this unit can "build understanding" include: creating a visual social connection between participating groups prior to implementation of the student project, either through synchronous interaction or asynchronous video introductions; and intervening early in the project through email or another private channel to coach and support all students in their peer feedback.

References:

Brownstein, E., & Klein, R. (2006). Blogs: Applications in science education. Journal of college science teaching, 53(6).

Chen, C., & Chen, C. (2010). Problem-based learning supported by digital archives: Case study of Taiwan libraries’ history digital library. The electronic library, 28(1), 5-28. Retrieved from: http://dx.doi.org/10.1108/02640471011005414

Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical Inquiry in a Text-Based Environment: Computer Conferencing in Higher Education. Retrieved from the Athabasca University website: http://cde.athabascau.ca/coi_site/documents/Garrison_Anderson_Archer_Critical_Inquiry_model.pdf

Garrison, D. R., Anderson, T., & Archer, W. (2001). Critical thinking, cognitive presence, and computer conferencing in distance education. American Journal of Distance Education, 15(1), 7–23.

Gee, J.P. (2010). A situated sociocultural approach to literacy and technology. In Elizabeth A. Baker (Ed.), The new literacies: Multiple perspectives on research and practice. New York: Guilford, (pp. 165-193). Retrieved from http://jamespaulgee.com/pdfs/Literacy%20and%20Technology.pdf

Greener, S. (2009). e-Modeling – Helping learners to develop sound e-learning behaviors. Electronic journal of e-learning, 7(3), 265-272. Retrieved from: https://files.eric.ed.gov/fulltext/EJ872416.pdf

ISTE Connects. (2016, January 19). Here’s how you teach innovative thinking. International Society for Technology in Education. Retrieved from https://www.iste.org/explore/articleDetail?articleid=651

Kingsley, T., & Tancock, S. (2014). Internet inquiry. Reading teacher, 67(5), 389-399.

Koehler, M.J., & Mishra, P. (2009). What is technological pedagogical content knowledge? Contemporary issues in technology and teacher education, 9(1), pp. 60-70. Retrieved from: http://www.citejournal.org/volume-9/issue-1-09/general/what-is-technological-pedagogicalcontent-knowledge/

Kop, R. (2010). Using social media to create a place that supports communication. In George Veletsianos (Ed.), Emerging technologies in distance education, Athabasca University Press. Retrieved from http://www.aupress.ca/books/120177/ebook/14_Veletsianos_2010-Emerging_Technologies_in_Distance_Education.pdf

Kuo, F.-R., Chen, N.-S., & Hwang, G.-J. (2014). A creative thinking approach to enhancing the web-based problem solving performance of university students. Computers & Education, 72(c), 220–230.

Lehman, R.M. & Conceição, S. (2013) Motivating and retaining online students: Research-based strategies that work, Jossey-Bass / Wiley. Retrieved from http://ebookcentral.proquest.com/lib/spu/detail.action?docID=1376946

Nunes, C. (2015). Digital archives in the wired world literature classroom in the US. Ariel, 46(1/2), 115-141.

Puentedura, R.R. (2003). A matrix model for designing and assessing network-enhanced courses. Retrieved from the Hippasus website at http://www.hippasus.com/resources/matrixmodel/index.html

Pursel, B.K., & Xie, H. (2014). Patterns and pedagogy: Exploring student blog use in higher education. Contemporary educational technology, 5(2), 96-109.

Redmond, P., Abawi, L., Brown, A., Henderson, R., & Heffernan, A. (2018). An online engagement framework for higher education. Online learning, 22(1), 183-204. doi:10.24059/olj.v22i1.1175

Sharkey, J. (2013). Establishing twenty-first-century information fluency. Reference & user services quarterly, 53(1), 33-39.

Wicks, D. (2017). The QUEST model for inquiry-based learning. [PDF document]. Retrieved from: http://davidwicks.org/iste-2-design-and-develop-digital-age-learning-experiences-and-assessments/quest-model-for-inquiry-based-learning/

Wray, E. (n.d.) Rise Model for Peer Feedback. [PDF document]. Retrieved from: https://static1.squarespace.com/static/502c5d7e24aca01df4766eb3/t/582ca65915d5db470077ce05/1479321178144/RISE_rubric-peer.pdf

Empower Science Students through Activism

Science and technology are fundamentally interwoven with society. The benefits of science and technology are only as great as their application in society, and, inversely, the needs and interests of society are what drive advancements in science and technology. To fully understand the practice of science, students must understand its impact on society; in turn, an understanding of how science and technology affect them and their community will engage and empower students to take action and effect change.

Focusing science students on social and environmental justice issues, or “socio-scientific issues” (Bencze, Sperling & Carter, 2012, p. 129), is advisable in science education, in part, because of the urgency of the challenges they face. Hodson (2003) and dos Santos (2009) argue that “an orientation in school science towards encouraging and enabling students to take sociopolitical action to address socio-scientific issues seems necessary” to address severe socio-scientific issues and have hope for social and environmental sustainability (as cited in Bencze, Sperling & Carter, 2012, p. 132). We can help students prepare for life after school by helping them think, plan and act for the future now.

Hodson (2003) offers the following schema for the emphasis on socio-scientific issues in science education (as cited in Bencze, Sperling & Carter, 2012, p. 132):

  1. Appreciating the societal impact of scientific and technological change, and recognizing that science and technology are, to some extent, culturally determined.
  2. Recognizing that decisions about scientific and technological development are taken in pursuit of particular interests, and that benefits accruing to some may be at the expense of others. Recognizing that scientific and technological development are inextricably linked with the distribution of wealth and power.
  3. Developing one’s own views and establishing one’s own underlying value positions.
  4. Preparing for and taking action.

Technology offers students resources, tools and environments that can connect them to the world outside like never before. As Stornaiuolo and Thomas (2017) argue, “one of the most powerful dimensions of social media for youth activists is its collective nature, as young people no longer need traditional gatekeepers (teachers, librarians, community organizers) to build or share knowledge, find other like-minded people, or plan and coordinate actions.” From #BlackLivesMatter to #MarchForOurLives, young activists have made their voices heard across social media platforms like Facebook, Twitter and Instagram.

News of the ongoing water crisis in Flint, Michigan, only gained traction after scientists with Virginia Tech’s Flint Water Study publicized their findings through social media. “Flint residents fought to be heard, and Dr. Edwards and the Flint Water Study team helped sound the alarm” (Smith, 2016, as cited in Jahng & Lee, 2018, p. 95). This example shows how social media can be a tool for direct action in science. “When scientists are engaged in political actions, they are interested in both educating the public about their scientific research and pressuring responsible target organizations or government agencies to increase regulatory measures to protect citizens from potential harm” (McCormick, 2009, as cited in Jahng & Lee, 2018, p. 93). Students can use this model to guide their own socio-scientific activism.

Student choice is essential to fostering engagement and intellectual investment in a socio-scientific action plan. Ito, Soep, Kligler-Vilenchik, Shresthova and Zimmerman (2015) identify that “young people are often driven to act on issues of public concern when those issues are connected to their deeply felt interests, affinities, and identities” (as cited in Stornaiuolo & Thomas, 2017, p. 347). So, while it is important for educators to give students access and offer initial exposure to socio-scientific issues and current events, students should be allowed the freedom to choose the cause they identify most closely with.

Activism can take many forms in the science classroom. Next Generation Science Standards (NGSS) practices call for students to engage in argument from evidence (practice 7). Science teachers can use the framework of scientific argumentation to support students in activism. For example, students may decide to start an awareness campaign (making flyers and posters or sharing ideas online) to convince others using scientific evidence and reasoning. While the supports and teaching process behind writing a claim with evidence and reasoning will be familiar to science teachers, the context of activism expands the range of modalities students can create as final products and extends the reach of student voice beyond the classroom.

“In an era of struggle and contestation over narrative and meaning, young people today are, in the words of literacy scholar Vivian Vasquez (2014), ‘reading and writing the self into existence,’ using digital participatory cultures to restory schooling and society by making it into their own image” (Stornaiuolo & Thomas, 2017, p. 351).

As students develop a more global perspective and understanding, they are rebuilding the stories that society has written for them as individuals and reshaping the world around them. “In our current landscape of persistent inequality, the efforts of marginalized people to author themselves in order to be heard, seen, and noticed—to assert that their lives matter—has the potential to contribute not only to a new activist imagination but also to the making of a new world” (Stornaiuolo & Thomas, 2017, p. 352). Through activism, science students will see that they have the power to shape the world as they share their knowledge and ideas with others.

References

Bencze, L., Sperling, E., & Carter, L. (2012). Students’ research-informed socio-scientific activism: Re/visions for a sustainable future. Research in Science Education, 42(1), 129-148. doi:10.1007/s11165-011-9260-3

dos Santos, W. L. P. (2009). Scientific literacy: A Freirean perspective as a radical view of humanistic science education. Science Education, 93(2), 361-382.

Hodson, D. (2003). Time for action: Science education for an alternative future. International Journal of Science Education, 25(6), 645-670.

Ito, M., Soep, E., Kligler-Vilenchik, N., Shresthova, S., & Zimmerman, A. (2015). Learning connected civics: Narratives, practices, infrastructures. Curriculum Inquiry, 45, 10-29. doi:10.1080/03626784.2014.995063

Jahng, M., & Lee, N. (2018). When scientists tweet for social changes: Dialogic communication and collective mobilization strategies by Flint Water Study scientists on Twitter. Science Communication, 40(1), 89-108. doi:10.1177/1075547017751948

Stornaiuolo, A., & Thomas, E. E. (2017). Disrupting educational inequalities through youth digital activism. Review of Research in Education, 41(1), 337-357. doi:10.3102/0091732X16687973

Vasquez, V. M. (2014, March). Critical ethnography and pedagogy: Bridging the audit trail with technology. Keynote address presented at the 35th Annual Ethnography in Education Forum, University of Pennsylvania, Philadelphia.

Professional Development Strategies from Tech Coaches

With technology constantly changing, the demand for teachers to integrate new tools into the classroom can be a daunting task. To help with this, districts are hiring technology coaches to teach teachers to use district technology in their daily classroom routines. ISTE Coaching Standard 2 states, “Technology coaches assist teachers in using technology effectively for assessing student learning, differentiating instruction, and providing rigorous, relevant, and engaging learning experiences for all students” (ISTE). More specifically, ISTE Coaching Standard 2e says, “Coach teachers in and model design and implementation of technology-enhanced learning experiences using differentiation, including adjusting content, process, product and learning environment based upon student readiness levels, learning styles, interests, and personal goals.” This made me wonder how coaches could effectively model technology-enhanced experiences that best fit the needs of teachers.

There are many different strategies for improving student and teacher learning. In the ISTE article “Know the ISTE Standards for Coaches: Support learning with technology,” Helen Crompton examines one strategy that helps technology coaches effectively model technology-enhanced learning experiences: project-based learning (PBL). PBL helps students learn content and skills through the process of solving a real-world issue. The teacher presents a driving question to the students, who develop their own line of inquiry to address the problem; the result is a student-generated product that answers the question. Technology can enhance PBL by expanding students’ ability to research, collaborate and share their work, and tech-enhanced PBL can enable teachers to differentiate instruction at various points in the learning process. With modeling and later coaching, student-selected problems and questions, as well as investigation of those problems through technology-enhanced learning experiences, support differentiation of content, process and/or product, while instructional design that embeds student choice addresses learning styles and interests. The technology coach uses PBL, models and coaches teachers to use it effectively with technology, and works with them to plan and implement PBL in their classrooms. The coach can also create a PBL website as a repository of information and work alongside teachers to locate resources, develop activities to scaffold the process, discuss and select appropriate tools, and design assessment methods to evaluate student products. Add in coaching and modeling how to do these things effectively, and this approach meets all elements of the indicator.

Coaching is most meaningful to teachers when it is content-specific and stimulates collaboration between coaches and teachers in a coaching relationship. The advance of technological tools impacts not only teaching but also coaching. As stated previously, virtual coaching allows coaches to spend more time coaching and less time traveling to school sites (White et al., 2015). While video-sharing and online conference platforms can pose a challenge to teachers who are less comfortable with technology, their benefits outweigh the challenges. In addition to helping coaching happen more efficiently, technology can also improve the quality of the experience for teachers. For instance, online video-sharing platforms create a way to share exemplar videos with teachers (Kurz et al., 2017). Other platforms now allow teachers and coaches to interact with uploaded classroom videos, resulting in more timely feedback. These experiences also serve as a model for using technology in meaningful ways in service of a larger goal, in this case, the coaching of teachers.


Instructional Coaching: Using Rubrics to Quantify Qualitative Data for Improved Teaching Outcomes

Feedback can be a powerful tool to improve teaching and learning. Through feedback, new perspectives can be gained as teachers begin to discern what is and isn’t working in their current instructional methods. Feedback also offers suggestions for achieving the goals and standards that drive an educator’s work. There are four different types of feedback: formative, summative, confirmative, and predictive. Formative feedback occurs before an intervention takes place, such as giving students feedback on an assignment where the feedback does not impact the final grade; I explore the benefits of formative feedback in this post. Summative feedback occurs after an intervention, such as when students turn in an assessment and the feedback provided relates to the grade outcome (Becker, 2016). Predictive feedback occurs before any instruction has taken place to ensure that a method will be effective, while confirmative feedback occurs well after summative feedback to ensure that the methods are still effective (Becker, 2016). Of the four types, formative and summative feedback are the most widely used evaluations in educational institutions.

At the end of each quarter, two types of summative evaluation are collected for each of the classes I’ve taught: quantitative and qualitative data assessing my performance as a professor and the course outcomes. The quantitative portion uses a Likert scale ranging from 1 = strongly disagree to 5 = strongly agree, while at the bottom of the evaluation form there is a section where students can provide comments, intended to give constructive feedback for classroom improvement. While the comments are not always written constructively (I am addressing this through a mini-module students are required to complete for all of my classes), it is mainly the common themes that present themselves in the evaluations that are powerful influencers of improving my classes. However, what I’ve learned is that most of the time, summative feedback simply comes too late to improve the current student experience because the issue can’t be addressed until the next time the course is offered. As a technology and instructional coach, helping other educators improve their teaching outcomes would require more timely feedback that utilizes both quantitative and qualitative assessment measures. While most learning management system (LMS) platforms offer a multitude of analytics, quantifying data such as exam scores, class averages for assignments, and average engagement time on the platform, there isn’t an explicit way to collect or quantify qualitative data.
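To make the contrast between these two data streams concrete, here is a minimal Python sketch, using invented evaluation data, that averages the Likert items and surfaces recurring words in the free-text comments (a crude stand-in for spotting common themes):

```python
from collections import Counter
from statistics import mean

# Invented end-of-quarter evaluation data (1 = strongly disagree, 5 = strongly agree)
likert_responses = {
    "Course objectives were clear": [5, 4, 4, 5, 3],
    "Instructor encouraged participation": [4, 4, 5, 5, 5],
}
comments = [
    "More examples in lecture would help",
    "Loved the group work, but lectures need more examples",
    "Group work was great",
]

# Quantitative summary: mean score per Likert item
for item, scores in likert_responses.items():
    print(f"{item}: {mean(scores):.1f}")

# Qualitative summary: naive theme detection via word frequency
# (real theme analysis needs stemming and human judgment)
stopwords = {"the", "in", "was", "but", "would"}
words = [w.strip(",.").lower() for c in comments for w in c.split()]
themes = Counter(w for w in words if w not in stopwords)
print(themes.most_common(3))
```

Even this toy version shows why the comments are the harder half of the problem: “lecture” and “lectures” count separately, and frequency alone cannot tell a compliment from a complaint.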

The ISTE standard for coaching states that coaches should “coach teachers in and model effective use of tools and resources to systematically collect and analyze student achievement data, interpret results, and communicate findings to improve instructional practice and maximize student learning” (ISTE, 2017). If an LMS can collect quantitative data that can be assessed throughout the quarter (through summative feedback), could it also be used to quantify qualitative data (i.e., comments) for improved teaching outcomes? To answer this question, I’d like to address it in two ways: 1) establish an understanding of the value and importance of self-reflection on assessments, and 2) address how rubrics can help quantify qualitative data.

Importance of self-reflection. Self-reflection can give several insights into the effectiveness of teaching. According to the Virginia Journal of Education, self-reflection is a method to confirm current strengths and identify areas of improvement, including continuing education or professional development needs. Educators may use self-reflection to review past activities, define issues that arise throughout the quarter/semester, understand how students are learning, modify a class due to unexpected circumstances, or address whether the teacher’s expectations have been met. Overall, self-reflection improves teacher quality (Hindman & Stronge, n.d.).

Educators may sometimes rely on emotions when deciding whether or not an element worked well in the classroom. However, without context to justify that decision, emotions are not a clear indicator of outcomes. Self-reflection puts a process in place in which educators can collect, analyze, and interpret specific classroom outcomes (Cox, n.d.). Though there are various ways to perform self-reflection (see Figure 1.1), the most effective outcome comes from ensuring that the process has been thoroughly completed.

Figure 1.1 Cox’s Types of Self-Reflection.

For an instructional coach, following the proper self-reflection steps would be a great way to begin a discussion with someone wanting to improve their teaching. An instructional coach would help the educator:

  • Understand their outcome goals,
  • Choose the data collection/reflection method best suited to meet these goals,
  • Analyze the data together to identify needs,
  • Develop implementation strategies to address needs.

Because the process is general, it can be modified and applied to various learning institutions. With my coaching background as a dietitian, and much as I do with my clients’ needs for change, I would also include questions about perceived barriers to implementing change. These questions would include a discussion of any materials or equipment the educator deems necessary but that may be difficult to obtain or may require new skill sets to use fully.

Using rubrics to quantify qualitative data. Part of self-assessment includes using rubrics, in addition to analyzing data, goal setting, and reflection. According to the Utah Education Association (UEA), using a rubric helps to address the question “What do I need to reach my goals?” (UEA, n.d.). Rubrics present expected outcomes and expected performance, both qualitative qualities, in quantifiable terms. Good rubrics should include appropriate criteria that are definable, observable, and complete, and should include a continuum of quality (UEA, n.d.).

If rubrics help quantify qualitative data, then how can rubrics assess reflection? DePaul University tackled that very question, and its response raised further questions: what is the purpose of the reflection, will the assessment process promote reflection, and how will reflection be judged or assessed? (DePaul, n.d.). Educational leader Lana Danielson remarks on the importance of reflective thinking and how technological, situational, deliberate, or dialectical thinking can influence teaching outcomes. Poor reflective outcomes, according to Danielson, are a result of not understanding why teachers do the things they do; great teachers are those who know what needs to change and can identify reasons why (Danielson, 2009). Figure 1.2 describes the four types of reflective thinking in more detail.

Figure 1.2 Grimmett’s Model of the Four Modes of Reflective Thinking.

Developing rubrics based on the various types of reflective thinking will help quantify expectations and performance to frame improvement. The only issue with this model is that it is more diagnostic than quantifiable. A more specific rubric model, developed by Ash and Clayton in 2004, involves an eight-step prescriptive process including:

  • Identifying and analyzing the experience,
  • Identifying, articulating, and analyzing learning,
  • Undertaking new learning experiences based on reflection outcomes (DePaul, n.d.).

The Ash/Clayton model involves developing and refining a rubric based on learning categories related to goals. All of the qualities related to the learning categories are defined and refined at each stage of the reflection process. More information on the eight-step process can be found here.

Regardless of the reflection assessment model used, coaches can capture enough criteria to create rubrics and use them as part of a self-reflection process that helps improve teaching outcomes through new awareness and the identification of learning needs that may be blocking improvement. Most LMS platforms support rubrics as part of assessment in various capacities (some only support rubrics on designated “assignments” but not on features like “discussions,” for example). Each criterion includes quality indicators associated with a number, making qualitative data quantifiable in much the way “coding” in qualitative research allows for quantifiable results. Newer rubric features allow for a range of quality points on common criteria and freeform responses, allowing for modifications to suit the various reflection types. Because of these functionalities and the myriad rubric uses in LMSs today, creating a good-quality rubric is now the only obstacle to implementing rubrics for self-reflection.
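As a sketch of that “coding” idea, the Python snippet below uses an entirely hypothetical three-criterion reflection rubric (the criteria and level names are mine, not from any of the models above) to show how quality indicators mapped to points turn a qualitative judgment into a number:

```python
# Hypothetical reflection rubric: each criterion's quality indicators map to points.
rubric = {
    "Identifies and analyzes the experience": {"missing": 0, "partial": 1, "thorough": 2},
    "Articulates what was learned": {"missing": 0, "partial": 1, "thorough": 2},
    "Plans new learning based on reflection": {"missing": 0, "partial": 1, "thorough": 2},
}

def score_reflection(ratings):
    """Convert per-criterion quality ratings into a single numeric score."""
    return sum(rubric[criterion][level] for criterion, level in ratings.items())

# One educator's self-reflection, coded against the rubric
ratings = {
    "Identifies and analyzes the experience": "thorough",
    "Articulates what was learned": "partial",
    "Plans new learning based on reflection": "missing",
}
max_score = sum(max(levels.values()) for levels in rubric.values())
print(f"{score_reflection(ratings)} / {max_score}")  # 3 / 6
```

Scored this way quarter after quarter, the same rubric can show growth over time, which is exactly what raw comments on their own cannot.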

References

Becker, K. (2016, August 29). Formative vs. summative vs. confirmative vs. predictive evaluation. Retrieved from http://minkhollow.ca/beckerblog/2016/08/29/formative-vs-summative-vs-confirmative-vs-predictive-evaluation/

Cox, J. (n.d.). Teaching strategies: The value of self-reflection. Retrieved from http://www.teachhub.com/teaching-strategies-value-self-reflection

Danielson, L. (2009). Fostering reflection. Educational Leadership, 66(5). Retrieved from http://www.ascd.org/publications/educational-leadership/feb09/vol66/num05/Fostering-Reflection.aspx

DePaul University. (n.d.). Assessing reflection. Retrieved from https://resources.depaul.edu/teaching-commons/teaching-guides/feedback-grading/Pages/assessing-reflection.aspx

Hindman, J. L., & Stronge, J. H. (n.d.). Reflecting on teaching: Examining your practice is one of the best ways to improve it. Retrieved from http://www.veanea.org/home/1327.htm

ISTE. (2017). ISTE standards for coaches. Retrieved from https://www.iste.org/standards/for-coaches

Utah Education Association. (n.d.). Self-assessment: Rubrics, goal setting, and reflection [Presenter’s notes]. Retrieved from http://myuea.org/sites/utahedu/Uploads/files/Teaching%20and%20Learning/Assessment_Literacy/SelfAssessment/Presenter%20Notes_Self-Assessment_Rubrics_Goal_Setting.pdf

Google Forms and the Power of Self-Assessment

Mention the word ‘data’ in a staff meeting and you’ll see teachers stifle eye rolls and sighs, because we know what’s coming next: graphs and charts depicting test scores from the prior school year or quarter, showing all the ways in which our students didn’t meet the district’s lofty goals. This isn’t the kind of data I want to talk about today. I want to talk about data that is meaningful and student-driven.

Data collection and analysis is part of the ISTE Coaching Standard 2h, “…model effective use of technology tools and resources to systematically collect and analyze student achievement data.” Being a self-professed Google junkie, I knew I wanted to cover Google Forms for this post. Then, after researching the many ways Google Forms can be used for data collection, I discovered a post on the blog Lindsay Ann Learning which suggested using Forms for student self-assessment. I’ve used Forms to gather and analyze multiple choice data, but this post opened my eyes to new ways to use Forms for data. It also challenged me to consider how I define “quality” data. Is it the percentage of students who chose the correct letter answer, or is it growth over time as defined by a much broader set of standards and demonstrated through reflection?

What is meaningful self-assessment?

  • A process in which students “1) monitor and evaluate the quality of their thinking and behavior when learning and 2) identify strategies that improve their understanding and skills. That is, self-assessment occurs when students judge their own work to improve performance as they identify discrepancies between current and desired performance.” (McMillan & Hearn, 2008)

Why student-driven data?

  • Students regularly provided with the opportunity to self-assess and reflect on their own learning are more likely to recognize the elements that led to success: hard work, effort, and studying. (Fernandes & Fontana, 1996)

New Data Idea 1: Collect data in the form of student reflection

To test out this new way of collecting data, I made a sample Research Project Self-Assessment in Google Forms. I incorporated the advice shared on Lindsay Ann Learning, including using linear scales with an odd number of choices (to discourage middle-of-the-road stances), incorporating open- and closed-ended questions, and writing questions designed to measure self-perception of learning. Here’s what data from that self-assessment might look like:

Class-wide data as seen from Google Form Responses tab

Sample student report with change over time

New Data Idea 2: Give the power of the rubric to students

Rubrics. I have a love-hate relationship with them. Why? They’re so helpful in understanding where a student is and why, yet almost no student actually reads through them! That’s why I appreciated Jennifer Roberts’ idea. As part of her Memoir Self-Reflection (which you can make a copy of here), students must read through her rubric and rate themselves on each element.

Photo credit: Google Form ‘Memoir Self Evaluation’ made by Jennifer Roberts

Research supports the value of rubrics in helping students meet learning goals. As stated by McMillan and Hearn: “[P]roviding evaluation criteria through rubrics…helps students concretely understand outcomes and expectations. They then begin to understand and internalize the steps necessary to meet the goals.” (2008)

New Data Idea 3: Exit tickets for quick reflection

Exit tickets as formative assessment are nothing new in education. However, using Google Forms to streamline the process can help you easily gauge how students feel about their own learning after a lesson. Here’s a sample Exit Ticket I made. Taking a minute at the end of class for students to self-assess can inform instruction before you dive into assessments and projects with a large portion of your class potentially in the dark.
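Because every Google Form feeds a linked spreadsheet, exit-ticket responses can also be summarized outside the Forms interface. Here is a small Python sketch over a made-up CSV export (the column names are my own, not Forms defaults) that flags how many students rated their confidence low:

```python
import csv
import io

# Made-up export from an exit-ticket Form's linked spreadsheet
export = io.StringIO(
    "Timestamp,Confidence in today's topic (1-5),What is still unclear?\n"
    "9/12 2:58,4,Nothing\n"
    "9/12 2:58,2,The second step of the example problem\n"
    "9/12 2:59,2,How to set up the equation\n"
)

rows = list(csv.DictReader(export))
confidence = [int(r["Confidence in today's topic (1-5)"]) for r in rows]
needs_review = sum(1 for c in confidence if c <= 2)

print(f"{needs_review} of {len(rows)} students rated their confidence 2 or lower")
```

A quick count like this, run right after class, is often all you need to decide whether tomorrow starts with review or moves on.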


Next Steps

Self-assessment data should drive instruction in your class in the same way that data from traditional high-stakes testing does. Below are some next steps you might consider when using self-assessment data to drive instruction.

Sources

Fernandes, M., & Fontana, D. (1996). Changes in control beliefs in Portuguese primary school pupils as a consequence of the employment of self-assessment strategies. British Journal Of Educational Psychology, 66(3), 301-313. doi: 10.1111/j.2044-8279.1996.tb01199.x

Google Forms for Data Collection. (2016). Retrieved from https://lindsayannlearning.com/student-data-google-forms/

McMillan, J., & Hearn, J. (2008). Student Self-Assessment: The Key to Stronger Student Motivation and Higher Achievement. Educational Horizons, 40. Retrieved from https://files.eric.ed.gov/fulltext/EJ815370.pdf

Roberts, J. (2017). Self-Evaluation Google Form for Students. Retrieved from http://www.litandtech.com/2017/09/self-evaluation-google-form-for-students.html

Creating Opportunities for Paraprofessionals to Explore EdTech

I began this school year with the intent of supporting other teachers with technology integration. However, my intentions were put on hold as I became aware of a greater need: supporting our paraprofessionals in digital education through peer coaching. Looking back to fall quarter, I was inspired by Les Foltos, author of Peer Coaching: Unlocking the Power of Collaboration. In my blog post Visionary Leadership and Peer Coaching, I set out to find how coaches can successfully inspire and assist peers with planning and implementing technology integration.

The key principles that stood out to me were:

  • Willingness
  • Personal Relationship
  • Trust and Support vs Judgement
  • Understanding of the Education System
  • Time
  • Reciprocal Communication

Background

Initially I found teachers who were willing to collaborate, yet time seemed to be the biggest barrier to collaboration. Perhaps one of the greatest assets to my ability to collaborate with IAs this year came from a classroom change. In four years, I’ve had classrooms in three corners of the school property. My first year was out in a portable, and not many people made their way out to my space. Then I spent two years in the main building off of a kindergarten classroom. It was not the most convenient to get to, but it was relatively close to the office, so I would have people stop by each day, typically to discuss students or content support. This year, I moved to a room near the staff room and playground. Since all IAs have playground duty, this location created more opportunities for collaboration. After offering repeated open invitations for staff to come into my room, I began to get weekly visits from several IAs. In turn, this has strengthened our personal relationships, helped me understand their level of understanding and their vision for the future, and, most importantly, reinforced collaboration and a team mentality versus working in isolation.

What I discovered was a passionate team of bilingual support staff who want to use more resources but do not have access to the same professional development opportunities as certificated staff (teachers). Unless they seek out professional development themselves, EdTech opportunities are not provided to them. This became my quest: to advocate for and help implement EdTech opportunities for our IAs.

Using Data to Drive Professional Development

Understanding that professional development requires planning, collaboration, and support, I drew on several experiences in creating the upcoming PD: my Peer Coaching Project; informal collaboration with IAs, teachers, and administrators; my Technology Needs Assessment; an ELL Family Tech Event; and a Staff Technology Use Survey.

Peer Coaching Project

For my Peer Coaching Project, I worked 1:1 with an IA serving a small group of first-grade ELL students. Driven by her interest in Seesaw, we moved her small group from working on whiteboards and paper to using Seesaw to read, write, record, and share. The feedback from the project was positive. Here are two of the ten feedback questions I asked the IA:

How has using Seesaw changed student learning in your group? “I really liked how it inspired students who normally run a little behind the others to step forward and slowly but steadily work towards the finish line. Seeing their smile of achievement as they heard themselves read is priceless.”
How would you like to use Seesaw in the future? What are you comfortable doing without my support? “I like the students seeing and saying their sight words, circling the words they still struggle with. Taking a book from their book box, taking a picture of a page, circling the sight words and/or words they didn’t quite get to further work on them.”

Informal Collaboration

After meeting with several instructional assistants individually, I approached our administration about creating a Technology Needs Assessment for instructional assistants. The IAs did not seem aware of any tech support available to them other than troubleshooting software issues. For example, this year all certificated staff were offered an online tech PD with various modules related to new devices and software. Classified staff, however, did not initially receive access to this PD, and access was not widely publicized. It was this data that guided the direction of our upcoming fall workshop.

Technology Needs Assessment

The Technology Needs Assessment provided usable data. By asking 14 intentional questions, I was able to gain insight into how the IAs view, use, and envision technology in the classroom. The IAs are interested in learning more about how to use digital tools both professionally and in the classroom. All data from the Technology Needs Assessment is available through this link.

Staff Technology Use Survey

With the Technology Needs Assessment showing that all IAs are interested in learning about classroom apps and differentiated instruction, I then needed to survey teachers to find out what they are already using. This is also a critical piece given parent feedback from our ELL Family Tech Event: many ELL parents are unable to access, or understand the purpose of, the educational programs that teachers recommend for students. Our IAs are likewise unfamiliar with how to create an account, log in, or use these programs, leading to a lack of enrollment and activity from our ELL students. The staff survey also asked teachers how IAs could use technology to support student learning in class. Below is a sample of the responses.

Workshop Rationale

To plan a three-hour workshop, I consolidated data from the Peer Coaching Project, informal collaboration, Technology Needs Assessment, ELL Family Tech Event, and Staff Technology Use Survey. Wanting to offer intentional, relevant, and personalized PD, I created three segments, all with time to actively engage with digital tools and collaborate with colleagues. I chose the order based on the interest expressed by both classroom teachers and IAs. Because several classroom teachers have shown interest in learning more about Seesaw, the Seesaw segment is scheduled first. The second segment looks at Office 365 and the T-Drive on our district computers, exploring apps IAs are expected to access and use during the year. The final segment gives IAs an opportunity to explore how to access, log in to, and understand the learning objectives of various apps used by classroom teachers. Having a better understanding of Seesaw, Office 365, and educational apps will also assist IAs when translating for families. This PD is scheduled for a half day the week before we return to school in the fall. At present, 17 IAs will be invited to attend, serving both our Bilingual and Special Education populations.

 

Technology Professional Development That Teachers Can Use

Many districts are seeing the value of hiring teachers whose job is to help other teachers integrate technology into their classrooms. Although these positions go by many titles (tech integration specialist, technology coach, educational technology consultant, technology coordinator, etc.) and different districts use people in these roles in different capacities, having a person support and coach classroom teachers as they integrate technology is becoming a necessity in education. ISTE summarizes the role of these professionals in the "ISTE Standards for Coaches". The second standard reads, "Technology Coaches assist teachers in using technology effectively for assessing student learning, differentiating instruction, and providing rigorous, relevant, and engaging learning experiences for all students" (ISTE, 2011).

Teachers have a great deal to stay on top of these days: new and changing standards, standardized testing, teacher evaluations, social-emotional learning needs, and outdated curriculum that needs to be supplemented. Integrating technology in efficient and meaningful ways can make life easier for students, teachers, and families. However, making these shifts and trying new things can be daunting when your plate is already full. Having a specialist whose job is to help teachers make these changes by providing trainings, individualized coaching, and ongoing support can bring tremendous benefits to a district. But with so many teachers on different pages in terms of experience, skill set, and comfort level with technology, it can be hard for a technology coach to provide professional development to a large group with the goal of everyone leaving feeling that their time was valued and that they now have something new to implement in their teaching.

What do Teachers Want/Need? Just Ask!

When you are not currently teaching in a classroom position, it is hard to know exactly what teachers want or need at any particular moment. As with so many things in life, when you don't know or aren't sure, just ask! People appreciate this. It can be done in formal or informal ways. An easy way to formally survey a group of teachers you will be training is to send out a Google Form with a variety of answer formats: multiple choice, open-ended questions, and scales (1-10). Be sure to ask a variety of questions and be specific in your requests for information (Gonzales, 2016). Informal ways of learning your audience's preferences might include visiting the school a week or two beforehand and stopping in classrooms before or after school to chat, or eating lunch in the staffroom and engaging teachers in casual conversation about their technology integration needs. Another option is to "work the room" as teachers arrive at the training and get set up. Gonzales writes in her blog about ed-tech consultant Rodney Turner using this strategy: "If you can't send out a survey ahead of time, you can still get to know your audience the day of the training. Rodney Turner describes how he does this: 'What I love to do is to circulate the room. I come in early, and I set my stuff up and have it done, so that way as people are coming in, I talk to them: "Hi, how are you doing, my name's Rodney, where are you from, what grade, what do you teach…what do you want to learn from this session?" And that has helped me so much in being able to reach out to people to understand where they're coming from'" (Gonzales, 2016).
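If you do gather pre-training responses through a Google Form, a quick way to digest them is to export the responses as a CSV and tally them with a short script. The sketch below is illustrative only: the column names and sample answers are hypothetical, not from an actual survey.

```python
import csv
import io
from collections import Counter
from statistics import mean

# Hypothetical CSV export from a pre-training Google Form.
# In practice you would read the downloaded file instead of this string.
csv_text = """Comfort (1-10),Preferred topic
7,Seesaw
4,Google Classroom
9,Seesaw
5,Assessment tools
"""

rows = list(csv.DictReader(io.StringIO(csv_text)))

# Average self-reported comfort with technology (the 1-10 scale question).
comfort = [int(r["Comfort (1-10)"]) for r in rows]

# Count which training topics teachers asked for most.
topics = Counter(r["Preferred topic"] for r in rows)

print("Average comfort:", mean(comfort))          # 6.25
print("Most requested:", topics.most_common(1))   # [('Seesaw', 2)]
```

Even a rough summary like this lets a coach walk into the training knowing roughly where the group sits and which topic to lead with.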

Enlist Help from the Experts

When teachers want help preparing a lesson or understanding the curriculum, they typically walk next door or down the hall. Note the percentage of teachers in the chart below who say ideas from other teachers are the most helpful form of technology training (Education Week Research Center, 2016).

Teachers respect other teachers and know that "they know what it's like." Teachers are an invaluable asset to each other because every teacher has different skill sets, teaching styles, and teaching experiences. You can learn something from every teacher, and every teacher can learn from you. When a technology coach is planning a professional development training, they should enlist help from the group they will be "training." Find the "experts" in different areas of technology and have them share examples of what they have done in their classrooms, what has worked, and what hasn't. In her blog post, Gonzales talks with tech coordinator Sarah Thomas about how she looks to teachers in the audience as a potential resource: "Not only does this approach enhance her presentation, it also makes the training more enjoyable for the teacher who already has that knowledge. 'There's nothing worse than being at a session where you already know what's going on and you're just kind of being talked at, you know?' says Thomas" (Gonzales, 2017).

Provide Options

If there are several technology coaches in your district, or if you have enlisted the help of teacher leaders (see the paragraph above!), another way to provide staff with technology integration learning experiences suited to their needs is to offer options for professional development. This might be structured as multiple "levels" on the same topic that teachers can self-select into, or as a larger menu of options so that teachers can choose what will be most useful depending on factors such as grade level, subject area, and experience with technology. Another option is to make the trainings optional, or to offer three different session times so that teachers can attend zero, one, two, or three sessions based on their needs. The key here is to give teachers choice in how they spend their time. Everyone wants to feel that their time is valued, especially teachers with limited time and ever-growing demands on it.

Follow-Up

Receiving a lot of new and exciting information can feel both inspiring and overwhelming. You walk out of a professional development session eager to get back to your classroom and try out everything you have learned, but when you arrive at school the next day you are met with a long to-do list just to keep on top of your daily routine. Or, after reflecting on the training, you have some logistical questions to work out before you attempt to implement what you just learned. When this happens, teachers will either struggle through and give the new skill or strategy their best shot, or they will throw in the towel because they don't have what they need to feel confident. This is the time period when we need to "capture" these teachers and give them what they need to feel empowered to make this change in their teaching. Following up in a timely manner is key.

Be sure to send the teachers you are training away with your contact information and a digital link to any resources you shared, or to resources that might help them deepen their understanding of what they have learned (Gonzales, 2016). But as a technology coach, don't rely on teachers to reach out to you. Technology integration, important as we all know it is, is only one aspect of a classroom teacher's job. Reach out to them, whether individually, as a large group, online, or in person. Make that connection and work on building these professional relationships. "What I have said to the teachers I work with is that the time we are together, in person, is just the start of a conversation. Because technology grows and changes so quickly, we can't rely on traditional methods of learning to stay on top of it. We can't wait for a textbook to be published; to really make the most of what the machines can offer us, what we ultimately need is each other, so staying connected is an essential part of any tech training" (Gonzales, 2016).

 

Sources:

 

Flanigan, R. (2016). Ed-tech coaches becoming steadier fixture in classrooms. Education Week, 35(35), 31-32.

 

Gonzales, J. (2016). Cult of Pedagogy. Retrieved May 24, 2018, from https://www.cultofpedagogy.com/tech-training-for-teachers/

 

ISTE. (2017). ISTE Standards for Coaches. Retrieved May 30, 2018, from https://www.iste.org/standards/for-coaches

 

Toward a theory and practice of coaching higher ed faculty

Why re-imagine faculty professional learning?

Among the three 2017-2021 strategic priorities of EDUCAUSE, a Colorado- and Washington, D.C.-based nonprofit higher education association, is "reimagined professional learning." By replacing the term professional development with professional learning in its vocabulary, CEO John O'Brien (2018) indicates the extent to which EDUCAUSE and other higher education associations and collaboratives believe that, for higher education institutions to thrive in a world characterized by "automation, cognitive computing, and digital transformation," faculty members, executives, and information technology (IT) administrators alike must cultivate the very digital-age skill sets—"adaptability, creativity, empathy, problem solving and decision making"—that characterize the digital-age learning and teaching skills defined by ISTE.

Putting this in business management terms, O'Brien's colleague Mike Meyer (2018) notes that community colleges in particular have failed to recognize a key implication: if the user experiences of students from specific populations will make or break an institution's success, then the "purchase and partial integration of" new student pathway, tracking, and learning management systems into an old organizational infrastructure of divisions and committees, with IT (information technology) or ICT (information and communication technologies) remaining in a peripheral role, is a paradigm unable to support a viable future.

IT, in this larger sense, encompasses long-term strategic organizational approaches to developing digital learning environments; creating an administrative, teaching, and learning culture rich in digital creation, problem-solving, and collaboration abilities; and developing the information, data, and media literacy and the ethical capacities that support that culture. Reconceptualizing this central role of IT involves, for one, rethinking how faculty professional learning intersects with the work of IT administrators and instructional designers in delivering an educational product that will serve a student population and sustain the educational market that population represents.

This post is about the quest for a professional learning model that a small rural community college can use to focus on how technology integration impacts faculty pedagogical capabilities. ISTE Standard for Coaches 2 calls for coaches to model technology use and to coach faculty in best practices for assessment, differentiation, and learning design. But the scope of this investigation has implications for how an institution-wide culture might grapple with its own quest to adapt to the current higher education landscape.

At the conclusion of this post, I’ll propose a model for a faculty-led professional development workshop series that uses assessment, differentiation, collaboration, coaching, communities of practice and their associated technologies to suggest how a faculty-led initiative could assist an institution in conceptualizing the sort of professional learning that might support cultivation of innovation and excellence in education.

To get there, I’ll explore the nature of coaching (a professional learning paradigm indebted to the fields of business and athletics) and the principles of adult learning (which apply to professional development for all higher education stakeholders), the collaborative- and action-based best practices for professional learning that are identical to the best practices used in higher ed classrooms and digital learning spaces, examine the implications of a validated construct of higher education teaching for professional learning, and review several professional learning models based on that construct, including a model developed by one of my institution’s sister colleges.

 

Andragogy and coaching fundamentals

Professional development is learning. In my conversations with higher ed stakeholders, I too have dropped the term professional development in favor of professional growth or professional learning. One reason is to emphasize that a higher education culture aimed at cultivating student learning should prioritize faculty development of the leadership, innovation, and teaching skills needed in digital-age higher ed. A second is that I believe an institution's approach to professional development should use the same principles of learning and teaching we expect instructors to enact. Professional development that results in professional learning, according to Zepeda (2015), should involve:

  1. active knowledge construction;
  2. collaborative learning;
  3. application in context, over time, with follow-up feedback that can be incorporated into continual learning; and
  4. differentiation.

The principles of andragogy further reinforce these well-established cognitivist and constructivist learning principles. Professional adult learners are self-directed and need to apply new knowledge immediately. As members of local and disciplinary professional learning communities, they need to collaborate in ways that allow each faculty member to co-learn, co-teach, contribute knowledge, and benefit from collective knowledge. As people learning skills that were often not part of their graduate programs and that may be determined by policymakers rather than by disciplinary best practices, they need access to coaching, technical support, and follow-up as part of professional development projects and infrastructure. And they need access to differentiated learning that engages their particular disciplinary, technological, and pedagogical proficiencies and teaching assignments (Zepeda, 2015). Indeed, one key feature that distinguishes college instructors from K-12 teachers is their heterogeneity (Bachy, 2014). More on this below.

Institutions that seek to respond strategically to the changing higher education paradigms and markets of the 21st century must overcome a common obstacle: inspiring collective commitment among the highly diverse individuals who make up an institution (Gentle, 2014).

Characteristics of higher education institutions that are able to create such mutually committed, emotionally intelligent cultures include:

  • embracing collaboration between faculty and administration that promotes the sharing of best practices;
  • providing sufficient professional staff support for administrators and faculty in their primary roles (as in the current discussion of supporting faculty technological-pedagogical knowledge and practice);
  • focusing on student needs and expectations while protecting faculty from unrealistic demands; and
  • balancing expectations of accountability from staff with respect for, and development of, academic autonomy. (Gentle, 2014)

One key to developing such institutional characteristics is to establish a culture of feedback. Creating such a culture involves much more than stating an "open door policy," providing places for various stakeholders on committees, or conducting annual faculty reviews based on models that may not include clear quantitative measures of technology-supported teaching competencies (Dana, Havens, Hochanadel, & Phillips, 2010).

What can make these practices effective, and develop these characteristics, are relationships based on coaching principles (Gentle, 2014). While these principles are connected below to both faculty peer coaching models and professional learning models, they can also be adopted as consistent practices in the informal relationships that make up an organizational culture, keeping the institutional focus on collaboration and growth.

If coaching facilitates the reflection and practice that allow faculty to grow in their application of specific disciplinary and technology-related 21st-century teaching skills, and to approach challenges through problem-posing and problem-solving, then coaching must be differentiated to address individual faculty members' needs and embedded in the real work of teaching (Zepeda, 2015). While much of the work of coaching involves positive conversations and questions that help a faculty member clarify where she needs growth and respond strategically, coaching also involves modeling and progress monitoring. A higher education institution's professional learning program, or a single professional learning project, can employ coaching effectively at the peer level as well as at the program level. Peer coaching as a form of professional development was introduced by Joyce and Showers (2002) in the 1970s. The hallmark of its effectiveness is not verbal feedback but rather the unexpected growth that happens through collaboration.

Zepeda (2015) compares peer coaching to "clinical" supervisory models, which also mirror constructivist learning approaches. In this cycle, an instructor is presented with a theory or technological pedagogical tool, observes and discusses modeling of the theory or tool in practice, creates and practices an application of the theory or tool, and receives feedback on that practice. Significantly, the model then progresses to coaching, which involves more than observation and feedback: the faculty member steps into the coaching role herself. Joyce and Showers (2002) found that including this integral coaching component led to 95% mastery and transfer by instructors. Indeed, this model parallels Wiggins and McTighe's (2005) Understanding by Design model (also known as UbD or "backwards design") for developing curricula or any other learning experience, such as professional development. In this conceptual framework, learning (or "understanding") results are first identified, then evidence of learning is established, followed by development of instructional materials. A key concept in UbD is the nature of feedback: true feedback is formative and summative feedback defined through specific criteria that enable the learner to improve and meet goals (Wiggins & McTighe, 2005).

 

Using TDPK to understand faculty diversity and provide differentiated coaching

A fundamental consideration for either peer or supervisory/program coaching of postsecondary faculty is the heterogeneity of faculty, not only in terms of individual faculty members’ existing competencies with technology in general, but with knowledge of teaching technologies and of pedagogy/andragogy (which may or may not have been a focus of their graduate study). In addition, the nature and types of knowledge in each discipline vary widely. Individual faculty epistemologies—their individual beliefs about the nature of knowledge and how it can be constructed—also affect their teaching practices.  The complex interaction between these different components of teaching has been studied over time through the development of a series of models for understanding what is involved in postsecondary teaching, and hence in faculty professional learning.

Bachy (2014) summarizes the development of this evolving model, provides validation for it, and begins to suggest how this model relates to constructivist faculty development approaches (such as UbD). She also suggests how this model can provide diagnostic assessments to differentiate professional learning opportunities for faculty.

Bachy’s (2014) model of teacher effectiveness, TPDK, includes the four dimensions of an “individual teacher’s discipline (D), personal epistemology (E), pedagogical knowledge (P), and knowledge of technology (T). Each of these dimensions, as well as their effects on one another, contributes to a unique profile of how a faculty member teachers in terms of disciplinary knowledge, beliefs about learning, knowledge of pedagogies, and knowledge of technologies for communication, learning, and disciplinary knowledge-construction. For example, “when a teacher feels competent in the technology associated with their discipline (TD dimension), it influences their educational choices… and, to a lesser extend (we observe a lower, significant, correlation), their epistemological choices” (Bachy, 2014, p. 33). Thus by identifying four validated aspects of teaching as well as six validated dimensions (such as TD) or relationships between those aspects, faculty members and their reviewers and coaches alike can define and conceptualize an instructor’s teaching strategies and define and conceptualize how to chart professional growth.

 

tdpk model
The TDPK model, showing four knowledge dimensions and the relationships between them. Retrieved from http://www.eurodl.org/materials/contrib/2014/Bachy.pdf
relationships between tdpk dimensions
Relationships between the TDPK dimensions each define a specific type of teaching knowledge. Retrieved from http://www.eurodl.org/materials/contrib/2014/Bachy.pdf

In addition to presenting the historical underpinnings of the TPDK model, Bachy presents the study that provided initial validation of the tool. The survey used in this study, along with the resulting profiles of individual faculty members' educational strategies, can provide a diagnostic tool and a graphically represented profile that help faculty members and their coaches plan and measure professional learning. I tried out the survey and used the same radar charts used in the study to create my own TDPK teaching profile. The value of the radar chart is that it shows the influence each teaching dimension may exert on the others and which of the four dimensions influence an instructor's practices most. (For more explanation of the profiles, and a comparison of the initial experimental profiles of four faculty members to qualitative descriptions of their teaching, see Bachy's article.)

 

faculty tdpk profile
Radar charts based on my completion of Bachy’s survey. To compare with four other faculty profiles and written descriptions of those faculty members, see Bachy’s article at http://www.eurodl.org/materials/contrib/2014/Bachy.pdf
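For coaches or faculty who want to plot a similar radar chart from their own scores, here is a minimal sketch. The four dimension scores below are hypothetical placeholders, not values from Bachy's survey, and the matplotlib plotting calls are shown in comments so the data-preparation step can stand on its own.

```python
import math

# Hypothetical TDPK scores on a 1-5 scale -- illustrative only,
# not results from Bachy's validated instrument.
profile = {"T": 3.2, "D": 4.5, "P": 3.8, "E": 4.0}

def radar_points(scores):
    """Spread the dimensions evenly around a circle and close the polygon.

    Returns (labels, angles, values); angles/values repeat the first
    point at the end so the chart outline joins back up.
    """
    labels = list(scores)
    n = len(labels)
    angles = [2 * math.pi * i / n for i in range(n)]
    values = [scores[k] for k in labels]
    return labels, angles + angles[:1], values + values[:1]

labels, angles, values = radar_points(profile)

# Plotting (requires matplotlib):
# import matplotlib.pyplot as plt
# ax = plt.subplot(polar=True)
# ax.plot(angles, values)
# ax.fill(angles, values, alpha=0.25)
# ax.set_xticks(angles[:-1])
# ax.set_xticklabels(labels)
# plt.show()
```

Comparing two such charts side by side (say, fall and spring) gives a coach and a faculty member a concrete artifact for discussing where growth has happened and which dimension to target next.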

Bachy’s presentation of the TPDK profile as a diagnostic and assessment tool for faculty coaching suggests a number of applications to guide effective professional development and coaching based on better understanding of instructors’ actual educational strategies. One valuable potential use of the TDPK profile in a small college with limited professional learning resources would be for trainers to develop training tracks (with tailored training focuses and materials) based on the most frequently occurring TDPK profiles at the institution. The survey and resulting profile could also be used for a training pre-assessment.

Another compelling aspect of the TPDK model is its affinity for constructivist views of teaching and learning: its understanding of disciplinary knowledge as constructed by the "communities of practice" that make up disciplines, its support for the trend toward student-centered learning that forms the basis of the "pedagogical knowledge" dimension, its move of teacher training beyond a mere focus on tools and into application, and its merging of technology training with pedagogical training.

Thus TDPK provides a validated theoretical model upon which to build approaches to professional learning that use assessment, differentiation, learning by design, coaching, and communities of practice linked with technology. These are the principles that are established in professional development literature (Zepeda, 2015).

 

Models for faculty professional development integrating technology

Institutional support for faculty development of technological pedagogical knowledge should encompass three dimensions. First, faculty need immediate, navigable access to knowledge supports such as tutorials, videos, and a repository of curricular approaches adopted by the institution. Second, faculty need defined, sustained pathways to developing TPCK knowledge, such as trainings or articulated levels of development. Third, faculty need always-accessible support, such as coaching and troubleshooting. The institutions that form the Colorado Community College System (CCCS) have met these needs in a variety of ways. At Front Range Community College (FRCC), defined levels of technological pedagogical training can be earned and result in pay increases. This model allows for both standardization and differentiation: for example, while all teachers who teach online must take a three-credit-hour course in online teaching, certification levels can be earned from a menu of webinars and other options customized by faculty. FRCC has provided the third dimension of a professional development program, ongoing support, through coaching. In 2010, FRCC began creating an instructional coaching program embedded in its professional development approach. Although coaches were hired for each of the college's campuses and its eLearning program, integral parts of the program included collaborative peer coaching through Reflective Practice Groups, workshops with follow-up, and networking (Patterson, 2013).

For a community college seeking to begin an embedded, centralized, and sustainable approach to supporting faculty in teaching effectively with technology, a key consideration is moving beyond providing mere resources or one-shot, conference-style "training" focused on speakers (whether external or in-house) and into practice-based learning similar to the "clinical" approach developed by Joyce and Showers (2002).

peer coaching cycle
Joyce and Showers’ peer coaching model

Dysart and Weckerle (2015) propose a workshop model similar to one I proposed at my institution, incorporating the principles of learning and professional learning that form a common thread through the literature reviewed in this post. Both proposals contain in seed form, on a small scale, the three larger institutional professional development dimensions of accessible resources, specific but differentiated training opportunities, and coaching with feedback. Both suggest that TDPK (or TPACK, a prior conceptual iteration of TDPK) provides a way to differentiate professional learning for the broad diversity of faculty needs with regard to incorporating technology into discipline-specific pedagogy, and both stress the importance of providing technological and pedagogical training that gives content-knowledge experts the self-efficacy they need to teach effectively and, ultimately, to become members of an innovative and collaborative institutional culture. Both incorporate active learning principles and the three research- and theory-based approaches of Learning by Design, Peer Coaching, and Communities of Practice.

dysart and weckerle model
Dysart and Weckerle’s (2015) professional development model. Retrieved from http://www.jite.org/documents/Vol14/JITEv14IIPp255-265Dysart2106.pdf

In Dysart and Weckerle’s (2015) model, a practice-based professional development opportunity would follow Joyce and Showers’ active learning-through-coaching loop in three phases: during training, during teaching, and beyond training. During training, faculty would create a situated lesson or unit incorporating new technology. During the teaching phase, faculty would be supported by peer coaching that here would involve resource-sharing as peer coaches begin the transition to becoming future trainers. After implementation, faculty with similar interests in terms of any dimension of technology, discipline, or pedagogy would continue to share understanding and a repertoire of ideas. In the model I proposed, this repertoire would be housed in a digital repository, and would be housed and extended through technology-based repositories (such as LMS-based curriculum banks of pedagogies developed by faculty at the institution) and through the development of personal learning networks (PLNs) through which faculty could develop and receive ongoing real-time support through shared networking via blogs, twitter accounts that would support the limited centralized instructional design support that is currently available.

In the short term, higher education teaching can be complicated by policy changes (or failures to change) that produce growing pains: faculty may feel hindered from connecting disciplinary best practices to institutional technology decisions, or overwhelmed by a focus on student success that unwittingly makes unrealistic demands on instructors while offering insufficient support for developing the related competencies. Yet it is the faculty leaders themselves, those with balanced technological, pedagogical, epistemological, and disciplinary knowledge, who may be best equipped to find professional learning solutions that will enable higher education institutions to cultivate cultures of collaboration, innovation, and teaching and learning excellence.

 

References:

Bachy, S. (2014). TPDK, A new definition of the TPACK model for a university setting. European journal of open, distance, and e-learning, 17(2), 15-39. Retrieved from http://www.eurodl.org/materials/contrib/2014/Bachy.pdf

Dana, H., Havens, B., Hochanadel, C., & Phillips, J. (2010, November). An innovative approach to faculty coaching. Contemporary issues in education research, 3(11), 29-34. Retrieved from https://files.eric.ed.gov/fulltext/EJ1072680.pdf

Dysart, S., & Weckerle, C. (2015). Professional development in higher education: A model for meaningful technology integration. Journal of information technology education: Innovations in practice, 14, 255-265. Retrieved from http://www.jite.org/documents/Vol14/JITEv14IIPp255-265Dysart2106.pdf

Gentle, P., & Forman, D. (2014). Engaging leaders: The challenge of inspiring collective commitment in universities. New York, NY: Routledge.

Joyce, B., & Showers, B. (2002). Student achievement through staff development. Alexandria, VA: Association for Supervision and Curriculum Development.

Meyer, M. (2018, May 7). How change has changed: The community college as an IT enterprise. Retrieved from the EDUCAUSE website at https://er.educause.edu/articles/2018/5/how-change-has-changed-the-community-college-as-an-it-enterprise

O'Brien, J. (2018, May 7). The future of EDUCAUSE, part 3: Reimagined professional learning. Retrieved from the EDUCAUSE website at https://er.educause.edu/articles/2018/5/the-future-of-educause-part-3-reimagined-professional-learning

Patterson, B. (2013). A model for instructional coaching at the community college. Innovation showcase, 8(12). Retrieved from the League for Innovation in the Community College website at https://www.league.org/innovation-showcase/model-instructional-coaching-community-college

Wiggins, G.P., & McTighe, J. (2005). Understanding by design. Alexandria, VA: Association for Supervision and Curriculum Development.

Zepeda, S.J. (2015). Job-embedded professional development: Support, collaboration, and learning in schools. New York, NY: Routledge.

Elementary Digital Footprints

This week I looked at ISTE Educator Standard 3, Citizen, indicator 3d: “Model and promote management of personal data and digital identity and protect student data privacy.” After reading this standard, I began to think about what it would look like to teach young primary students about their digital identity and digital footprint. This led me to ask the question, “How can you teach primary students to manage their digital footprints?”

As a primary teacher, I know very little about teaching my students about their digital footprints and how to manage them. Reflecting on what my current school does, it seems we are more focused on teaching students computer skills such as typing, editing, and device care than on how to be digital citizens. Knowing this, I wanted to review my district’s technology policy and then find resources to help propel my school toward teaching our students to become responsible digital citizens.

Common Sense Media

Common Sense Media offers fun and engaging activities to help teach students about a variety of technology-related topics. The available lessons cover subjects such as cyberbullying and digital drama, internet safety, creative credit and copyright, and more. One topic they also address is digital footprints and reputation, something I believe I and many of my coworkers have put on the back burner when teaching technology to our students. Not only are multiple topics covered, but all of the lessons are aligned with Common Core Standards. A family tip sheet is also available to send home with students to reinforce their learning outside of the classroom.

Digital Footprint and Reputation: Follow the Digital Trail

I picked this lesson specifically because it helps answer my question of how to teach primary-age students to manage their digital footprints. In this lesson, students learn that the information they put online leaves a digital footprint, or “trail.” This trail can be big or small, helpful or hurtful, depending on how they manage it. Students follow the digital information trails of two fictional animals, make observations about the size and content of each trail, and then think critically about what kinds of information they want to leave behind.

Below is a neat introduction video that can be used for younger students.


Implementing Across Grade Levels or Beyond

After getting feedback from my critical friend and my professors this week, I decided to dig deeper into implementing this teaching across entire grade levels, or even an entire school. As mentioned earlier, my school district has adopted the Common Sense Media program. One idea from my critical friend was to have students learn the material and then teach it to another grade level. I loved this idea because it involves multiple classes and grades, and it works especially well if your school pairs grades through a Big Buddy/Little Buddy system. If buddy classes aren’t available, these lessons can still be taught across grade levels. When reviewing this lesson for my resource, I reached out to my grade-level team, and we discussed two ways of teaching it to all of our kindergarten students. The first was to have one teacher teach the same lesson to each kindergarten class while the other teachers each taught a different subject. The second was to plan a certain week or time when we would all teach the lesson to our classes individually. Implementation can look different between grades and even between schools.
