Category Archives: EDTC 6103

Archives and Analysis

Extending Literary Interpretation through Archival Research and Global Collaboration

Visit the Website for this Project

 

Goals

The goals of a 100-level college Introduction to Literature course include learning the rudiments of literary analysis, considering how literature interprets the human condition, and analyzing the cultural and historical contexts of works of literature in order to interpret their meaning and articulate their contemporary relevance. For these goals to be realized, a literature course needs to inculcate both independent critical thinking and a classroom (or digital space) community of critical discourse.

Core college courses in the humanities, such as Introduction to Literature, generally include an outcome connected to the use of digital information literacy competencies (see, for example, ISTE Standard for Students 2), such as (in the case of this course) “using contemporary technologies to select and use sources relevant to the study of literature.” Instructors of introductory literature courses often interpret these competencies narrowly, as students’ use of library databases to find secondary works of interpretation to cite when writing analytical or interpretive papers. However, requiring novice readers of literature to synthesize scholarly interpretations of texts too soon can undermine the development of students’ independent critical analytical and interpretive skills. Even with appropriate instruction in synthesis writing, students in an introductory course may transfer previous habits of working with sources, so that their use of secondary sources tends to replace rather than extend independent inquiry.

By contrast, providing students with access to primary sources can give them additional textual and contextual elements that may serve as more productive tools for independent development of new lines of inquiry about the texts they have read. Charlotte Nunes (2015) described her incorporation of digital archives into a first-year literature class, arguing that “students can benefit greatly from even preliminary exposure to archives early in their undergraduate careers, by means of short-term, small-scale archival research tasks” (p. 115).

This teaching unit builds upon Nunes’ suggestion by providing modelled and scaffolded access to digital archives. Students develop hypotheses about literary texts and address those hypotheses through contextual documents located in digital archives; in turn, the archival information they locate can problematize their original questions, extending their critical thinking and allowing them to apply their new understanding of literary, political, and social history to current issues as well.

The use of digital archives to position students as knowledge constructors aligns with ISTE Standard for Students 3. Through critical curation of primary sources located through digital archives, students can use archival technologies to develop inquiries, explore real-world sources, grapple with ill-structured problems presented by how primary sources must be interpreted to provide contextual relevance (rather than with the predigested solutions that may be the focus of students’ use of secondary sources), and pursue more personally owned theories and answers.

 

Barriers

First- and second-year students at two-year colleges may or may not have taken the freshman writing course sequence, may be nontraditional students with considerable life and academic experience, or may lack the preparation typically required for admission to four-year colleges, so their levels of skill in using research methodologies can vary considerably. Nunes (2015) noted that while learning the research strategies involved in archival research is beyond the scope of an introductory literature course, providing students with “intellectual access” to archival materials can greatly deepen their ability to contextualize their thinking about the historical and social issues they encounter in literature (p. 117). Hence her approach to including primary sources in an introductory literature class typically involved students working from instructor-provided primary sources.

Similarly, in a study of a problem-based learning project supported by digital archival resources, Chen and Chen (2010) noted that digital libraries face the challenge of effective information architecture: even when curated by a college library, digital archives may not be intuitively or optimally organized for use by novice students. Likewise, Sharkey (2013), a professor of library science and Head of Information Use and Fluency at Illinois State University, noted that “information and technology are no longer separate entities but are inextricably connected” (p. 34), highlighting the importance of the instructor’s role in designing technology fluency instruction that focuses on the higher order thinking that will “give students a high level of aptitude to interact fluently with both (the) information and technology” (p. 37). Thus, the use of digital archives in this learning context presents a twofold barrier: students lack knowledge of digital archival research methods, and instructors need 21st century teaching competencies that can support students in understanding the nature of the information contained in digital archives, how that information is organized, and how to access and use it (see ISTE Standard for Students 1). What is needed are teaching approaches and instructional design approaches that allow for student use of archival technologies while still foregrounding content learning and the extension of students’ critical thinking and inquiry skills.

There are precedents for such an approach. In a controlled study of a Problem-Based Learning unit incorporating digital archives, Chen and Chen (2010) found that the use of digital archives that had been structured by the instructor resulted in deeper learning for students at three phases of the learning process (cognition, action, and reflection), in part because the problem of cognitive overload and the problem of students finding ineffective resources on the Internet were bypassed when more structured resources were presented (p. 25).

 

Solutions

The pedagogical frameworks that form the basis for this unit work together to support the concept of student “intellectual access” and draw upon the social constructivist approaches of Problem- and Project-Based Learning (PBL) in which student design of learning goals and iterative work on developing solutions (ISTE Standard for Students 4) is enacted through collaborative knowledge construction supported by the digital communication media that characterize these students’ world (ISTE Standard for Students 7).

Specifically, this unit develops a pedagogical foundation for the use of digital archives in an introductory literature course through:

  • The Community of Inquiry model and an outgrowth of it, the QUEST Model for Inquiry-Based Learning;
  • The cultivation of 21st century instructional competencies in instructional design, facilitation, and collaboration (see ISTE Standards for Educators 4, 5, and 6; Florida State University Technology Integration Matrix) to scaffold and support students’ development of critical knowledge construction and communication competencies; and
  • The affordances of 21st century communication venues such as Web 2.0 content authoring tools, the platforms in which today’s researchers and tomorrow’s graduates will communicate, for similarly supporting students’ development of critical knowledge construction and communication competencies.

 

The Community of Inquiry and QUEST Models

The Community of Inquiry model, the subject of research for nearly 20 years at Alberta’s Athabasca University and beyond, describes the inter-relationships between three key elements that must be present for a meaningful higher education learning experience to take place among a community of instructors and students: cognitive presence, social presence, and teaching presence. The most essential element, cognitive presence, denotes the cognitivist elements of the learning process (such as experience, questioning, pattern recognition, making and applying connections, and noting and reconciling dissonances). In this model, the overlap between cognitive presence, social presence, and teaching presence creates ways to focus on developing social critical discourse through the design of the educational experience.

QUEST is a model for an instructional unit or learning experience based on Community of Inquiry principles. In the QUEST model, which focuses on the elements of cognitive presence and social presence, students formulate a personalized Question about course content; Understand the topic better by conducting research and sharing sources; Educate and collaborate by interacting with peers through an iterative process of discussion of the shared questions and resources; engage in defining a Solution through reflection on the inquiry process; and Teach others by presenting a final product in a blog or other Web 2.0 genre that engages an authentic audience.

 

21st-Century Communication Media

The use in college humanities courses of multimodal composition assignments presented through blogs, video, and other Web 2.0 technologies represents more than a shift to 21st-century composition media. This movement is situated within a larger pedagogical “social turn” that emphasizes literacy as not merely cognitivist but sociocultural in nature. In this theoretical view, ways of reading and writing such as genres and written and spoken forms of English are generated by the communities of practice–professional and institutional, but also historical and cultural–who use these conventions to achieve shared goals (Gee, 2010). Thus people learn not literacy but “literacies” that involve the ability to engage with those communities in terms of their evolving discourse structures. From the perspective of compositionists, “digital literacies” involve the way digital tools are used within sociocultural groups, such as the scholarly communities who use blogs, digital archives, and online journals for scholarly communication, but also the many other sociocultural groups that produce work in online media to learn and communicate (Gee, 2010).

The design for this unit presupposes neither that the unit must be used in a primarily online course (in fact it is piloted here in two face-to-face course sections) nor that digital composition tools should be chosen outside the context of a comprehensive list of criteria for how learning materials should be optimized for learning. Even so, two factors in particular suggest a blog format for student work in this unit: 1. this unit brings together students from across the globe in a condensed timeframe to achieve the desired learning outcomes of articulating the relevance of literature for contemporary contexts; and 2. the reciprocal goal of developing students’ existing social media literacies for an academic purpose and relating such a purpose to 21st century research and communication venues. Together, these factors indicate the type of computer-mediated learning context for which the QUEST model recommends a blog format for student work.

For this unit, students from multiple classes and locations need shared venues in which they can locate primary sources in digital archives, post sources and reflections on those sources, collaborate through feedback, and share final presentations of the results of their inquiries. They need a research and communication environment that will support the process of developing deeper understanding of a problem, generating ideas, and finding solutions (Kuo, Chen, & Hwang, 2014). At the same time, they need an environment with affordances that support the creative development of final products that display learning. Further, they need to work in an environment in which the cognitive load of learning new technologies and methods is minimized.

Criteria considered in the choice of a primary content curation tool for this project included:

  • ease and affordability of access (a free tool was desired)
  • capacity for instructor and student curation of web links to digital archives
  • capacity for supporting student development of creative content
  • capacity for supporting student writing and revising
  • capacity for supporting the small group and peer-to-peer aspects of a research and writing process as well as the “voice to all” and visual presentation aspects of an archival product (Brownstein & Klein, 2006)
  • ease of use, including minimal layers of technology
  • privacy: students should be able to opt out of associating work posted publicly with their names

In addition to meeting these criteria, the main venue for student work in this unit was also considered in light of how it would support exemplary instructional design in terms of the 6A’s Project Idea Rubric and Puentedura’s (2003) Matrix Model for designing and assessing network-enhanced courses. Puentedura’s model includes a diagnostic tool for selecting computer-based technology tools, known as SAMR, in which the best use of technological pedagogy achieves “redefinition,” where “the computer allows for the creation of new tasks, inconceivable without the computer” (Puentedura, 2003). The combination of student digital archive research and curation with the collaborative process of the QUEST model, in the context of a highly creator-friendly blog space, seems to meet this criterion.

After developing two prototypes using different Web 2.0 tools, the instructors for this unit chose to create a Google Sites webpage to house students’ work, interaction, and access to digital archival resources. Support for student use of the site was provided both through a series of modeling sessions (Preparatory Lessons 1 and 2) and through written project instructions and a technology tutorial, as well as daily in-class “check-ins” and individualized support through email and in person.

 

21st-Century Teaching Competencies

Garrison, Anderson & Archer (2000), the authors of the Community of Inquiry model, note that “the binding element in creating a community of inquiry for educational purposes is that of teaching presence” (p. 96). They categorize this element into three indicator categories: instructional management, direct instruction, and building understanding. Instructional management has to do with design and planning, and with considering how technological teaching media change learning and call for the sorts of criteria for instructional choices considered above. For this unit, the lead instructor worked with the participating instructor to select and design learning environments, technologies, and a series of learning tasks structured according to the QUEST Model. The Community of Inquiry model also features a second teaching indicator, “direct instruction,” which overlaps with unit design but is not a focus of the QUEST Model. Direct instruction involves both content and pedagogical expertise, and is described by Garrison, Anderson, & Archer as “those indicators that assess the discourse and efficacy of the educational process” (p. 101). A significant shift in computer-mediated instruction such as that used in this unit is the move from verbal to written formats of instruction. Engaging this shift successfully involves much more than following a list of netiquette protocols, such as providing timely standards-based feedback at each stage of student work, and much more than choosing a unit design framework.
Key aspects of direct instruction that were considered in this unit included how the instructors would teach the literary skills and content to be employed in students’ work, how we would teach requisite technology skills and content to students, how we would facilitate student engagement with the unit project, and how we would move the learning of the unit along, for instance through intervention or by providing opportunities for reflection. These aspects of direct instruction are nowhere more paramount than with a community college student population, with its diverse range of backgrounds in terms of culture, literacy, and preparation.

Two key ways in which we addressed direct instruction in this unit were instructor modelling, and individualized support and feedback that anticipated and responded to student needs.

Pursel and Xie (2014) studied the use of blogs housed internally by a university to explore which blog patterns led to improved student performance over time. One finding of their study was the relationship between instructors’ modeling of the behavior expected from students and student achievement. Instructors who model, in addition to facilitating and making effective technology choices, are more likely to leverage student engagement.

Greener (2009), in her article “e-Modeling – Helping learners to develop sound e-learning behaviors,” provides a fuller picture of what effective instructional modelling looks like. She calls not just for demonstrating proficient skills, but for taking risks and showing students what it looks like to try new things and face unexpected results, then comparing those approaches to more effective strategies. The need for direct instruction through this sort of modelling, which leads students beyond observation to collaboration as they engage with the dynamic and uncharted nature of digital learning environments, formed the basis for Preparatory Lessons 1 and 2, which precede implementation of the student project in this unit. These lessons were designed to be used in as many iterations as needed, or to be distributed across class days, giving instructors flexible opportunities to provide effective direct instruction through modelling and collaboration as well as through lecture and the videos implemented in the student project.

Our second consideration for direct instruction involved how to proactively support individual student needs, including helping students develop the ability to use digital tools to connect with peer audiences, a key indicator of ISTE Student Standard 7. One approach was to provide modelling through samples of student work, including constructing peer feedback, and in Preparatory Lesson 2 to model not only products of student work but the process of constructing student work and peer feedback. A second was the decision to provide instructor feedback within the same learning and composing space that students occupied. (This decision was partly driven by the lack of a mechanism for a private communication channel between students and instructors on the blog site.) Thus in this unit, instructor provision of individualized feedback (direct instruction) is blended with small group facilitation (building understanding).

“Building understanding,” the third group of teaching indicators in the Community of Inquiry model, is described by Garrison, Anderson & Archer (2000) as follows:

A process that is challenging and stimulating is crucial to creating and maintaining a community of inquiry. This category is very much concerned with the academic integrity of a collaborative community of learners. It is a process of creating an effective group consciousness for the purpose of sharing meaning, identifying areas of agreement and disagreement, and generally seeking to reach consensus and understanding. Through active intervention, the teacher draws in less active participants, acknowledges individual contributions, reinforces appropriate contributions, focuses discussion, and generally facilitates an educational transaction. (p. 101)

Building understanding in a digital learning space is a 21st century teaching competency that is critical for effective learning by community college students. The instructors of this unit considered whether instructor presence in students’ learning spaces would stifle student conversations; our hypothesis was the contrary, that it would help facilitate meaningful student conversations. That hypothesis was borne out in the exit survey for the pilot implementation of this unit, in which some students requested more instructor feedback evaluating the quality of the peer feedback they were providing, and a number of students expressed frustration at a lack of peer involvement. Suggestions for how instructors of this unit can “build understanding” include: creating a visual social connection between participating groups prior to implementation of the student project, either through synchronous interaction or asynchronous video introductions; and intervening early through email or another private channel to provide all students with coaching and support in their peer feedback.

References:

Brownstein, E., & Klein, R. (2006). Blogs: Applications in science education. Journal of College Science Teaching, 53(6).

Chen, C., & Chen, C. (2010). Problem-based learning supported by digital archives: Case study of Taiwan libraries’ history digital library. The Electronic Library, 28(1), 5-28. Retrieved from http://dx.doi.org/10.1108/02640471011005414

Garrison, D. R., Anderson, T., & Archer, W. (2000). Critical Inquiry in a Text-Based Environment: Computer Conferencing in Higher Education. Retrieved from the Athabasca University website: http://cde.athabascau.ca/coi_site/documents/Garrison_Anderson_Archer_Critical_Inquiry_model.pdf

Garrison, D. R., Anderson, T., & Archer, W. (2001). Critical thinking, cognitive presence, and computer conferencing in distance education. American Journal of Distance Education, 15(1), 7–23.

Gee, J.P. (2010). A situated sociocultural approach to literacy and technology. In Elizabeth A. Baker (Ed.), The new literacies: Multiple perspectives on research and practice. New York: Guilford, (pp. 165-193). Retrieved from http://jamespaulgee.com/pdfs/Literacy%20and%20Technology.pdf

Greener, S. (2009). e-Modeling – Helping learners to develop sound e-learning behaviors. Electronic Journal of e-Learning, 7(3), 265-272. Retrieved from https://files.eric.ed.gov/fulltext/EJ872416.pdf

ISTE Connects. (2016, January 19). Here’s how you teach innovative thinking. International Society for Technology in Education. Retrieved from https://www.iste.org/explore/articleDetail?articleid=651

Kingsley, T., & Tancock, S. (2014). Internet inquiry. The Reading Teacher, 67(5), 389-399.

Koehler, M. J., & Mishra, P. (2009). What is technological pedagogical content knowledge? Contemporary Issues in Technology and Teacher Education, 9(1), 60-70. Retrieved from http://www.citejournal.org/volume-9/issue-1-09/general/what-is-technological-pedagogicalcontent-knowledge/

Kop, R. (2010). Using social media to create a place that supports communication. In George Veletsianos (Ed.), Emerging technologies in distance education. Athabasca University Press. Retrieved from http://www.aupress.ca/books/120177/ebook/14_Veletsianos_2010-Emerging_Technologies_in_Distance_Education.pdf

Kuo, F.-R., Chen, N.-S., & Hwang, G.-J. (2014). A creative thinking approach to enhancing the web-based problem solving performance of university students. Computers & Education, 72(c), 220–230.

Lehman, R.M. & Conceição, S. (2013) Motivating and retaining online students: Research-based strategies that work, Jossey-Bass / Wiley. Retrieved from http://ebookcentral.proquest.com/lib/spu/detail.action?docID=1376946

Nunes, C. (2015). Digital archives in the wired world literature classroom in the US. Ariel, 46(1/2), 115-141.

Puentedura, R.R. (2003). A matrix model for designing and assessing network-enhanced courses. Retrieved from the Hippasus website at http://www.hippasus.com/resources/matrixmodel/index.html

Pursel, B. K., & Xie, H. (2014). Patterns and pedagogy: Exploring student blog use in higher education. Contemporary Educational Technology, 5(2), 96-109.

Redmond, P., Abawi, L., Brown, A., Henderson, R., & Heffernan, A. (2018). An online engagement framework for higher education. Online Learning, 22(1), 183-204. doi:10.24059/olj.v22i1.1175

Sharkey, J. (2013). Establishing twenty-first-century information fluency. Reference & User Services Quarterly, 53(1), 33-39.

Wicks, D. (2017). The QUEST model for inquiry-based learning. [PDF document]. Retrieved from: http://davidwicks.org/iste-2-design-and-develop-digital-age-learning-experiences-and-assessments/quest-model-for-inquiry-based-learning/

Wray, E. (n.d.). RISE Model for Peer Feedback. [PDF document]. Retrieved from https://static1.squarespace.com/static/502c5d7e24aca01df4766eb3/t/582ca65915d5db470077ce05/1479321178144/RISE_rubric-peer.pdf

Google Forms and the Power of Self-Assessment

Mention the word ‘data’ in a staff meeting and you’ll see teachers stifle eye rolls and sighs. Because we know what’s coming next…graphs and charts depicting test scores from the prior school year or quarter, showing us all the ways in which our students didn’t meet the district’s lofty goals. This isn’t the kind of data I want to talk about today. I want to talk about data that is meaningful and student-driven.

Data collection and analysis are part of ISTE Coaching Standard 2h, “…model effective use of technology tools and resources to systematically collect and analyze student achievement data.” Being a self-professed Google junkie, I knew I wanted to cover Google Forms for this post. Then, after researching the many ways Google Forms can be used for data collection, I discovered a post on the blog Lindsay Ann Learning which suggested using Forms for student self-assessment. I’ve used Forms to gather and analyze multiple choice data, but this post opened my eyes to new ways to use Forms for data. It also challenged me to consider how I define “quality” data. Is it the percentage of students who chose the correct letter answer, or is it growth over time as defined by a much broader set of standards and demonstrated through reflection?

What is meaningful self-assessment?

  • A process in which students “1) monitor and evaluate the quality of their thinking and behavior when learning and 2) identify strategies that improve their understanding and skills. That is, self-assessment occurs when students judge their own work to improve performance as they identify discrepancies between current and desired performance.” (McMillan & Hearn, 2008)

Why student-driven data?

  • Students regularly provided with the opportunity to self-assess and reflect on their own learning are more likely to recognize the elements that led to success: hard work, effort, and studying. (Fernandes & Fontana, 1996)

New Data Idea 1: Collect data in the form of student reflection

To test out this new way of collecting data, I made a sample Research Project Self-Assessment in Google Forms. I incorporated the advice shared on Lindsay Ann Learning, including using linear scales with an even number of choices (so students can’t default to a neutral, middle-of-the-road stance), incorporating open- and closed-ended questions, and writing questions designed to measure self-perception of learning. Here’s what data from that self-assessment might look like:

Class-wide data as seen from Google Form Responses tab

 

Sample student report with change over time
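Class-wide averages like those shown in the Responses tab can also be tallied yourself once you export the responses as a CSV. The sketch below is a hypothetical illustration, not my actual form: the question columns and ratings are invented for the example.

```python
import csv
import io

# Hypothetical CSV export of linear-scale responses from a Google Form's
# Responses tab. The question columns here are invented for illustration.
SAMPLE_CSV = """\
Timestamp,I cited my sources correctly,I organized my research notes
9/1/2018,4,3
9/1/2018,5,2
9/1/2018,3,4
"""

def summarize_scale_responses(csv_text):
    """Return the class-wide average self-rating for each scale question."""
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    questions = [col for col in rows[0] if col != "Timestamp"]
    return {
        q: round(sum(int(row[q]) for row in rows) / len(rows), 2)
        for q in questions
    }

print(summarize_scale_responses(SAMPLE_CSV))
# {'I cited my sources correctly': 4.0, 'I organized my research notes': 3.0}
```

Running the same summary on each new self-assessment is one low-effort way to watch those averages change over time.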

 

New Data Idea 2: Give the power of the rubric to students

Rubrics. I have a love-hate relationship with them. Why? They’re so helpful in understanding where a student is and why, yet almost no student actually reads through them! That’s why I appreciated Jennifer Roberts’ idea. As part of her Memoir Self-Reflection (which you can make a copy of here), students must read through her rubric and rate themselves on each element.

Photo credit: Google Form ‘Memoir Self Evaluation’ made by Jennifer Roberts

Research supports the value of rubrics in helping students meet learning goals. As stated by McMillan and Hearn (2008): “[P]roviding evaluation criteria through rubrics…helps students concretely understand outcomes and expectations. They then begin to understand and internalize the steps necessary to meet the goals.”

New Data Idea 3: Exit tickets for quick reflection

Exit tickets as formative assessment are nothing new in education. However, using Google Forms to streamline this process can help you easily gauge how students feel about their own learning after a lesson. Here’s a sample Exit Ticket I made. Taking a minute at the end of class to allow students to self-assess can inform instruction before you dive into assessments and projects with a large portion of your class potentially in the dark.
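If you want to act on exit-ticket responses before the next class, even a tiny script over the collected ratings can produce a check-in list. This is a hedged sketch under invented assumptions: the question, the 1–5 scale, and the student ratings are all illustrative, not data from my actual Exit Ticket.

```python
# Hypothetical exit-ticket self-ratings for the question
# "How well could you explain today's concept to a classmate?"
# on a 1-5 scale (1 = lost, 5 = confident).
EXIT_TICKET = {
    "Student A": 4,
    "Student B": 2,
    "Student C": 1,
    "Student D": 5,
}

def flag_for_reteach(ratings, cutoff=2):
    """Return students whose self-rating falls at or below the cutoff."""
    return sorted(name for name, rating in ratings.items() if rating <= cutoff)

print(flag_for_reteach(EXIT_TICKET))  # ['Student B', 'Student C']
```

Raising the cutoff widens the net; the point is simply that a one-minute self-assessment can become tomorrow’s small-group list.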

 

 

Next Steps

Self-assessment data should drive instruction in your class in the same way that data from traditional high-stakes testing does. Below are some next steps you might consider when using self-assessment data to drive instruction.

 

Sources

Fernandes, M., & Fontana, D. (1996). Changes in control beliefs in Portuguese primary school pupils as a consequence of the employment of self-assessment strategies. British Journal Of Educational Psychology, 66(3), 301-313. doi: 10.1111/j.2044-8279.1996.tb01199.x

Google Forms for Data Collection. (2016). Retrieved from https://lindsayannlearning.com/student-data-google-forms/

McMillan, J., & Hearn, J. (2008). Student Self-Assessment: The Key to Stronger Student Motivation and Higher Achievement. Educational Horizons, 40. Retrieved from https://files.eric.ed.gov/fulltext/EJ815370.pdf

Roberts, J. (2017). Self-Evaluation Google Form for Students. Retrieved from http://www.litandtech.com/2017/09/self-evaluation-google-form-for-students.html

Technology Professional Development That Teachers Can Use

Many districts are seeing the value of hiring teachers whose job is to help other teachers integrate technology into their classrooms. Although these positions go by many titles (technology integration specialist, technology coach, educational technology consultant, technology coordinator, etc.) and different districts use people in these roles in different capacities, having a person support and coach classroom teachers as they integrate technology into their classrooms is becoming a necessity in education. ISTE summarizes the role of these professionals in the “ISTE Standards for Coaches.” The second standard reads, “Technology Coaches assist teachers in using technology effectively for assessing student learning, differentiating instruction, and providing rigorous, relevant, and engaging learning experiences for all students” (ISTE, 2011). Teachers have so much to stay on top of these days, such as new and changing standards, standardized testing, teacher evaluations, social-emotional learning needs, and outdated curriculum that needs to be supplemented. Integrating technology in efficient and meaningful ways can make life easier for students, teachers, and families. However, making these shifts and trying new things can be daunting when your plate is already full. Having a specialist whose job is to help teachers make these changes by providing trainings, individualized coaching, and ongoing support can have tremendous benefits for a district. But with so many teachers on different pages in terms of experience, skill set, and comfort level with technology, it can be hard for a technology coach to provide professional development to a large group of teachers and have everyone leave feeling that their time was valued and that they now have something new they can implement in their teaching.

What do Teachers Want/Need? Just Ask!

When you are not currently teaching in a classroom position, it is hard to know exactly what teachers want or need at any particular moment. As with so many things in life, when you don't know or aren't sure, just ask! People appreciate this, and it can be done in formal or informal ways. An easy way to formally survey a group of teachers you will be training is to send out a Google Form with a variety of answer formats: multiple choice, open-ended questions, and 1-10 scales. Be sure to ask a variety of questions and be specific in your requests for information (Gonzales, 2016). Informal ways of learning your audience's preferences include visiting the school a week or two beforehand and stopping in classrooms before or after school to chat, or eating lunch in the staffroom and engaging teachers in casual conversation about their technology integration needs. Another option is to "work the room" as teachers arrive at the training and get set up. Gonzales describes how ed-tech consultant Rodney Turner uses this strategy to get to know his audience the day of the training: "What I love to do is to circulate the room. I come in early, and I set my stuff up and have it done, so that way as people are coming in, I talk to them: 'Hi, how are you doing, my name's Rodney, where are you from, what grade, what do you teach…what do you want to learn from this session?' And that has helped me so much in being able to reach out to people to understand where they're coming from" (Gonzales, 2016).

Enlist Help from the Experts

When teachers want help preparing for a lesson or understanding the curriculum, they typically walk next door or down the hall. Note the percentage of teachers who say ideas from other teachers are the most helpful when it comes to technology training in the chart below (Education Week Research Center, 2016).

Teachers respect other teachers and know that "they know what it's like." Teachers are an invaluable asset to each other because every teacher has different skill sets, teaching styles, and teaching experiences: you can learn something from every teacher, and every teacher can learn from you. When technology coaches plan a professional development training, they should enlist help from the group they will be "training." Find the "experts" in different areas of technology and invite them to share examples of what they have done in their classrooms and what has and hasn't worked. In her blog post, Gonzales talks with tech coordinator Sarah Thomas about how she looks for teachers in the audience as a potential resource: "Not only does this approach enhance her presentation, it also makes the training more enjoyable for the teacher who already has that knowledge. 'There's nothing worse than being at a session where you already know what's going on and you're just kind of being talked at, you know?' says Thomas" (Gonzales, 2016).

Provide Options

If there are several technology coaches in your district, or if you have enlisted the help of teacher leaders (see the paragraph above!), another way to provide staff with technology integration learning experiences suited to their needs is to offer options for professional development. This might mean multiple "levels" on the same topic that teachers can self-select into, or a larger menu of options so teachers can choose what will be most useful depending on factors such as their grade level, subject area, and experience with technology. You might also make trainings optional, or offer three different session times so that a teacher can attend zero, one, two, or three sessions based on their needs. The key is to give teachers choice in how they spend their time. Everyone wants to feel that their time is valued, especially teachers with limited time and ever-growing demands on it.

Follow-Up

Receiving a lot of new and exciting information can feel both inspiring and overwhelming. You walk out of a professional development session unable to wait to get back to your classroom to try out all you have learned, but when you arrive at school the next day you are met with a long to-do list just to keep on top of your daily work routine. Or, after reflecting on the training, you have logistical questions to figure out before you attempt to implement what you just learned. When this happens, teachers will either struggle through and give the new skill or strategy their best shot, or they will throw in the towel because they don't have what they need to feel confident implementing what they have learned. This is the period when we need to "capture" these teachers and give them what they need to feel empowered to make the change in their teaching. Following up in a timely manner is key.

Be sure to send the teachers you are training away with your contact information and a digital link to any resources you shared or that might deepen their understanding of what they have learned (Gonzales, 2016). But as a technology coach, don't rely on teachers to reach out to you. Technology integration, important as we all know it is, is only one aspect of a classroom teacher's job. Reach out to them, whether individually, as a large group, online, or in person. Make that connection and work on building these professional relationships. "What I have said to the teachers I work with is that the time we are together, in person, is just the start of a conversation. Because technology grows and changes so quickly, we can't rely on traditional methods of learning to stay on top of it. We can't wait for a textbook to be published; to really make the most of what the machines can offer us, what we ultimately need is each other, so staying connected is an essential part of any tech training" (Gonzales, 2016).

 

Sources:

 

Flanigan, R. (2016). Ed-tech coaches becoming steadier fixture in classrooms. Education Week, 35(35), 31-32.

 

Gonzales, J. (2016). Cult of Pedagogy. Retrieved May 24, 2018, from https://www.cultofpedagogy.com/tech-training-for-teachers/

 

ISTE. (2017). ISTE Standards for Coaches. Retrieved May 30, 2018, from https://www.iste.org/standards/for-coaches

 

Toward a theory and practice of coaching higher ed faculty

Why re-imagine faculty professional learning?

Among the three 2017-2021 strategic priorities of EDUCAUSE, the Colorado- and Washington, D.C.-based nonprofit higher education association, is "reimagined professional learning." By replacing the term professional development with professional learning in its vocabulary, CEO John O'Brien (2018) indicates the extent to which EDUCAUSE and other higher education associations and collaboratives believe that, for higher education institutions to thrive in a world characterized by "automation, cognitive computing, and digital transformation," faculty members, executives, and information technology (IT) administrators alike must cultivate the very digital age skill sets—"adaptability, creativity, empathy, problem solving and decision making"—that characterize the digital age learning and teaching skills defined by ISTE.

Putting this equation in business management terms, O'Brien's colleague Mike Meyer (2018) notes that community colleges in particular have failed to recognize that if the user experiences of students from specific populations will make or break the success of an institution, then the "purchase and partial integration of" new student pathway, tracking, and learning management systems into an old organizational infrastructure of divisions and committees, with information technology (IT) or information and communication technologies (ICT) remaining in a peripheral role, is a paradigm unable to support a viable future.

Reconceptualizing the central role of IT—which encompasses long-term strategic organizational approaches to developing digital learning environments; creating an administrative, teaching, and learning culture rich in digital creation, problem-solving, and collaboration abilities; and developing the information, data, and media literacy and the ethical capacities that support that culture—involves, for one, reconceptualizing how faculty professional learning intersects with the work of IT administrators and instructional designers in delivering an educational product that will serve a student population and sustain the educational market that population represents.

This post is about the quest for a professional learning model that a small rural community college can use to focus on how technology integration affects faculty pedagogical capabilities. ISTE Standard for Coaches 2 calls for coaches to model and coach teachers in using technology effectively for assessment, differentiation, and learning design. But the scope of this investigation has implications for how an institution-wide culture might grapple with its own quest to adapt to the current higher education landscape.

At the conclusion of this post, I'll propose a model for a faculty-led professional development workshop series that uses assessment, differentiation, collaboration, coaching, and communities of practice, along with their associated technologies, to suggest how a faculty-led initiative could help an institution conceptualize the sort of professional learning that might cultivate innovation and excellence in education.

To get there, I'll explore the nature of coaching (a professional learning paradigm indebted to the fields of business and athletics) and the principles of adult learning (which apply to professional development for all higher education stakeholders), review the collaborative, action-based best practices for professional learning that mirror the best practices used in higher ed classrooms and digital learning spaces, examine the implications of a validated construct of higher education teaching for professional learning, and review several professional learning models based on that construct, including a model developed by one of my institution's sister colleges.

 

Andragogy and coaching fundamentals

Professional development is learning. In my conversations with higher ed stakeholders, I too have dropped the term professional development in favor of professional growth or professional learning. One reason is to emphasize that a higher education culture aimed at cultivating student learning should prioritize faculty development of the leadership, innovation, and teaching skills needed in digital age higher ed. A second is that I believe an institution's approach to professional development should use the same principles of learning and teaching we say we expect instructors to enact. Professional development that results in professional learning, according to Zepeda (2015), should involve:

  1. active knowledge construction;
  2. collaborative learning;
  3. application in context, over time, with follow-up feedback that can be incorporated into continual learning; and
  4. differentiation.

The principles of andragogy further reinforce these well-established cognitivist and constructivist learning principles. Professional adult learners are self-directed and need to apply new knowledge immediately. As members of local and disciplinary professional learning communities, they need to collaborate in ways that allow each faculty member to co-learn, co-teach, contribute knowledge, and benefit from collective knowledge. As learners of skills that were often not part of their graduate programs and may be determined by policymakers rather than by disciplinary best practices, they need access to coaching, technical support, and follow-up as part of professional development projects and infrastructure. And they need access to differentiated learning that engages their particular disciplinary, technological, and pedagogical proficiencies and teaching assignments (Zepeda, 2015). Indeed, one key feature that distinguishes college instructors from K-12 teachers is their heterogeneity (Bachy, 2014). More on this below.

Institutions that seek to respond strategically to the changing higher education paradigms and markets of the 21st century must overcome an obstacle that confronts many organizations: inspiring collective commitment among the highly diverse individuals who make up an institution (Gentle, 2014).

Characteristics of higher education institutions that are able to create such mutually committed, emotionally intelligent cultures include:

  • collaboration between faculty and administration that promotes the sharing of best practices;
  • sufficient professional staff support for administrators and faculty in their primary roles (as in the current discussion of supporting faculty technological-pedagogical knowledge and practice);
  • a focus on student needs and expectations that also protects faculty from unrealistic demands; and
  • a balance between expecting accountability from staff and respecting and developing academic autonomy. (Gentle, 2014)

One of the keys to developing such institutional characteristics is to establish a culture of feedback. Creating such a culture is much more than stating an "open door policy," providing places for various stakeholders on committees, or conducting annual faculty reviews based on models that may not include clear quantitative measures of technology-supported teaching competencies (Dana, Havens, Hochanadel, & Phillips, 2010).

What can make these practices effective and develop these characteristics are relationships based on coaching principles (Gentle, 2014). While these principles are related below to both faculty peer coaching models and professional learning models, coaching principles can also be adopted as consistent practices in the informal relationships that make up an organizational culture, keeping the institutional focus on collaboration and growth.

If coaching facilitates the reflection and practice that allow faculty to grow in their application of specific disciplinary and technology-related 21st century teaching skills, and to approach challenges through problem-posing and problem-solving, then coaching must be differentiated to address individual faculty members' needs and embedded in the real work of teaching (Zepeda, 2015). While much of the work of coaching involves positive conversations and questions that help a faculty member clarify and discern where she needs growth and respond strategically, coaching also involves modeling and progress monitoring. A higher education institution's professional learning program, or a single professional learning project, can employ coaching effectively at the peer level as well as at the program level. Peer coaching as a form of professional development was introduced by Joyce and Showers (2002) in the early 1980s. The hallmark of its effectiveness is not verbal feedback but rather the unexpected growth that happens through collaboration.

Zepeda (2015) compares peer coaching to "clinical" supervisory models, which also mirror constructivist learning approaches. In this cycle, an instructor is presented with a theory or technological pedagogical tool, observes and discusses modeling of the theory or tool in practice, creates and practices an application of it, and receives feedback on that practice. Significantly, the model then progresses beyond observation and feedback to coaching, in which the faculty member steps into the coaching role. Joyce and Showers (2002) found that including this integral coaching component led to 95% mastery and transfer by instructors. Indeed, this model parallels Wiggins and McTighe's (2005) Understanding by Design model (also known as UbD or "backwards design") for developing curricula or any other learning experiences, such as professional development. In this conceptual framework, learning (or "understanding") results are identified first, then evidence of learning is established, followed by development of instructional materials. A key concept in UbD is the nature of feedback: true feedback is formative and summative feedback defined through specific criteria that enable the learner to improve and meet goals (Wiggins & McTighe, 2005).

 

Using TPDK to understand faculty diversity and provide differentiated coaching

A fundamental consideration for either peer or supervisory/program coaching of postsecondary faculty is faculty heterogeneity: individual faculty members vary not only in their existing competencies with technology in general but also in their knowledge of teaching technologies and of pedagogy/andragogy (which may or may not have been a focus of their graduate study). In addition, the nature and types of knowledge in each discipline vary widely. Individual faculty epistemologies—their beliefs about the nature of knowledge and how it can be constructed—also affect their teaching practices. The complex interaction between these components of teaching has been studied over time through a series of models for understanding what is involved in postsecondary teaching, and hence in faculty professional learning.

Bachy (2014) summarizes the development of this evolving model, provides validation for it, and begins to suggest how this model relates to constructivist faculty development approaches (such as UbD). She also suggests how this model can provide diagnostic assessments to differentiate professional learning opportunities for faculty.

Bachy's (2014) model of teacher effectiveness, TPDK, includes the four dimensions of an individual teacher's discipline (D), personal epistemology (E), pedagogical knowledge (P), and knowledge of technology (T). Each of these dimensions, along with their effects on one another, contributes to a unique profile of how a faculty member teaches in terms of disciplinary knowledge, beliefs about learning, knowledge of pedagogies, and knowledge of technologies for communication, learning, and disciplinary knowledge-construction. For example, "when a teacher feels competent in the technology associated with their discipline (TD dimension), it influences their educational choices… and, to a lesser extend (we observe a lower, significant, correlation), their epistemological choices" (Bachy, 2014, p. 33). Thus, by identifying four validated aspects of teaching as well as six validated relationships between those aspects (such as TD), faculty members, reviewers, and coaches alike can conceptualize an instructor's teaching strategies and chart professional growth.

 

The TPDK model, showing four knowledge dimensions and the relationships between them. Retrieved from http://www.eurodl.org/materials/contrib/2014/Bachy.pdf

Relationships between the TPDK dimensions each define a specific type of teaching knowledge. Retrieved from http://www.eurodl.org/materials/contrib/2014/Bachy.pdf

In addition to presenting the historical underpinnings of the TPDK model, Bachy reports the study that provided initial validation of the tool. The survey used in this study, along with the resulting profiles of individual faculty members' educational strategies, can serve as a diagnostic tool, producing a graphically represented profile that helps faculty members and their coaches plan and measure professional learning. I completed the survey myself and used the same radar charts used in the study to create my TPDK teaching profile. The value of the radar chart is that it shows the influences that each teaching dimension may exert on the others and which of the four dimensions most influence an instructor's practices. (For more explanation of the profiles, and a comparison of the initial experimental profiles of four faculty members to qualitative descriptions of their teaching, see Bachy's article.)

 

Radar charts based on my completion of Bachy's survey. To compare with four other faculty profiles and written descriptions of those faculty members, see Bachy's article at http://www.eurodl.org/materials/contrib/2014/Bachy.pdf
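To make the radar chart concrete, here is a minimal sketch of how four dimension scores could be mapped onto the vertices of such a chart. The dimension labels follow Bachy's TPDK model, but the 1-6 score scale and the example profile are my own assumptions for illustration, not values from the article; the resulting points can be fed to any charting tool.

```python
import math

# Assumed dimension order, following Bachy (2014): Discipline,
# Epistemology, Pedagogy, Technology.
DIMENSIONS = ("D", "E", "P", "T")

def radar_vertices(scores):
    """Map a {dimension: score} profile onto evenly spaced radar axes.

    Returns a list of (x, y) points, with the first point repeated at
    the end so the polygon closes when drawn.
    """
    n = len(DIMENSIONS)
    points = []
    for i, dim in enumerate(DIMENSIONS):
        angle = 2 * math.pi * i / n  # axes at 0, 90, 180, 270 degrees
        r = scores[dim]              # distance from center = score
        points.append((r * math.cos(angle), r * math.sin(angle)))
    points.append(points[0])  # close the polygon
    return points

# Hypothetical example profile: strong disciplinary knowledge,
# weaker technology knowledge (scores assumed on a 1-6 scale).
profile = {"D": 6, "E": 4, "P": 5, "T": 2}
vertices = radar_vertices(profile)
```

A lopsided polygon makes the imbalance between dimensions immediately visible, which is exactly what makes the radar representation useful for coaching conversations.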

Bachy's presentation of the TPDK profile as a diagnostic and assessment tool for faculty coaching suggests a number of applications for guiding effective professional development and coaching based on a better understanding of instructors' actual educational strategies. One valuable potential use of the TPDK profile in a small college with limited professional learning resources would be for trainers to develop training tracks (with tailored training focuses and materials) based on the most frequently occurring TPDK profiles at the institution. The survey and resulting profile could also serve as a training pre-assessment.
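One way a trainer might operationalize "most frequently occurring profiles" is to bucket each instructor's four dimension scores into coarse levels and count which coarse profiles recur. This is a hypothetical sketch, not part of Bachy's instrument; the 1-6 scale, the low/mid/high thresholds, and the sample data are all assumptions for illustration.

```python
from collections import Counter

def coarse_level(score, low=2, high=4):
    """Bucket a 1-6 survey score into a coarse level (thresholds assumed)."""
    if score <= low:
        return "low"
    return "high" if score > high else "mid"

def most_common_profiles(surveys, k=2):
    """Return the k most frequent coarse (D, E, P, T) profiles."""
    counts = Counter(
        tuple(coarse_level(s[d]) for d in ("D", "E", "P", "T"))
        for s in surveys
    )
    return counts.most_common(k)

# Hypothetical survey results for three faculty members.
faculty = [
    {"D": 6, "E": 4, "P": 5, "T": 2},
    {"D": 5, "E": 3, "P": 5, "T": 1},
    {"D": 2, "E": 5, "P": 3, "T": 6},
]
tracks = most_common_profiles(faculty)
```

Here the first two instructors share a coarse profile (strong discipline and pedagogy, weak technology), so a trainer might build a track targeting the T dimension for that group.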

Another compelling aspect of the TPDK model is its affinity for constructivist views of teaching and learning: it understands disciplinary knowledge as constructed by the communities of practice that make up disciplines, supports the trend toward student-centered learning that forms the basis of its pedagogical knowledge dimension, moves teacher training beyond a mere focus on tools and into application, and merges technology training with pedagogical training.

Thus TPDK provides a validated theoretical model on which to build approaches to professional learning that use assessment, differentiation, learning by design, coaching, and communities of practice linked with technology: the very principles established in the professional development literature (Zepeda, 2015).

 

Models for faculty professional development integrating technology

Institutional support for faculty development of technological pedagogical knowledge should encompass three dimensions. First, faculty need immediate, navigable access to knowledge supports such as tutorials, videos, and a repository of curricular approaches adopted by the institution. Second, faculty need defined, sustained pathways to developing this knowledge, such as trainings or articulated levels of development. Third, faculty need always-accessible support, such as coaching and troubleshooting. The institutions that form the Colorado Community College System (CCCS) have met these needs in a variety of ways. At Front Range Community College (FRCC), faculty can earn defined levels of technological pedagogical training, which result in pay increases. This model allows for both standardization and differentiation: for example, while all teachers who teach online must take a three-credit-hour course in online teaching, professional development levels of certification can be earned from a menu of webinars and other options customized by faculty. FRCC has provided the third dimension of a professional development program, ongoing support, through coaching. In 2010, FRCC began creating an instructional coaching program embedded in its professional development approach. Although coaches were hired for each of the college's campuses and its eLearning program, integral parts of the program included collaborative peer coaching through Reflective Practice Groups, workshops with follow-up, and networking (Patterson, 2013).

For a community college seeking to begin an embedded, centralized, and sustainable approach to supporting faculty in effectively teaching with technology, a key consideration is moving beyond mere resources, and beyond conference-style, one-shot "trainings" focused on speakers (whether external or in-house), into practice-based learning similar to the "clinical" approach developed by Joyce and Showers (2002).

Joyce and Showers' peer coaching model

Dysart and Weckerle (2015) propose a workshop model similar to one I proposed at my institution, incorporating the principles of learning and professional learning that form a common thread through the literature reviewed in this post. Both proposals contain, in seed form and on a small scale, the three larger institutional professional development dimensions of accessible resources, specific but differentiated training opportunities, and coaching with feedback. Both suggest that TPDK (or TPACK, its prior conceptual iteration) provides a way to differentiate professional learning for the broad diversity of faculty needs with regard to incorporating technology into discipline-specific pedagogy, and both stress the importance of providing technological and pedagogical training to give content knowledge experts the self-efficacy they need to teach effectively and, ultimately, to become members of an innovative and collaborative institutional culture. Both incorporate active learning principles and the three research- and theory-based approaches of Learning by Design, Peer Coaching, and Communities of Practice.

Dysart and Weckerle's (2015) professional development model. Retrieved from http://www.jite.org/documents/Vol14/JITEv14IIPp255-265Dysart2106.pdf

In Dysart and Weckerle's (2015) model, a practice-based professional development opportunity follows Joyce and Showers' active learning-through-coaching loop in three phases: during training, during teaching, and beyond training. During training, faculty create a situated lesson or unit incorporating new technology. During the teaching phase, faculty are supported by peer coaching, which here involves resource-sharing as peer coaches begin the transition to becoming future trainers. After implementation, faculty with similar interests in any dimension of technology, discipline, or pedagogy continue to share understanding and a repertoire of ideas. In the model I proposed, this repertoire would be housed and extended through technology-based repositories (such as LMS-based curriculum banks of pedagogies developed by faculty at the institution) and through personal learning networks (PLNs), in which faculty could give and receive ongoing real-time support via blogs and Twitter accounts that would supplement the limited centralized instructional design support currently available.

In the short term, higher education teaching can be complicated by policy changes, or failures to change, that produce growing pains: faculty may feel hindered from connecting disciplinary best practices to institutional technology decisions, or overwhelmed by a focus on student success that unwittingly makes unrealistic demands on instructors while offering insufficient support for developing the related competencies. Yet it is the faculty leaders themselves, those who possess balanced technological, pedagogical, epistemological, and disciplinary knowledge, who may be best equipped to find professional learning solutions that will enable higher education institutions to cultivate cultures of collaboration, innovation, and teaching and learning excellence.

 

References:

Bachy, S. (2014). TPDK, a new definition of the TPACK model for a university setting. European Journal of Open, Distance and E-Learning, 17(2), 15-39. Retrieved from http://www.eurodl.org/materials/contrib/2014/Bachy.pdf

Dana, H., Havens, B., Hochanadel, C., & Phillips, J. (2010, November). An innovative approach to faculty coaching. Contemporary Issues in Education Research, 3(11), 29-34. Retrieved from https://files.eric.ed.gov/fulltext/EJ1072680.pdf

Dysart, S., & Weckerle, C. (2015). Professional development in higher education: A model for meaningful technology integration. Journal of Information Technology Education: Innovations in Practice, 14, 255-265. Retrieved from http://www.jite.org/documents/Vol14/JITEv14IIPp255-265Dysart2106.pdf

Gentle, P., & Forman, D. (2014). Engaging leaders: The challenge of inspiring collective commitment in universities. New York, NY: Routledge.

Joyce, B., & Showers, B. (2002). Student achievement through staff development. Alexandria, VA: Association for Supervision and Curriculum Development.

Meyer, M. (2018, May 7). How change has changed: The community college as an IT enterprise. Retrieved from the EDUCAUSE website at https://er.educause.edu/articles/2018/5/how-change-has-changed-the-community-college-as-an-it-enterprise

O'Brien, J. (2018, May 7). The future of EDUCAUSE, part 3: Reimagined professional learning. Retrieved from the EDUCAUSE website at https://er.educause.edu/articles/2018/5/the-future-of-educause-part-3-reimagined-professional-learning

Patterson, B. (2013). A model for instructional coaching at the community college. Innovation Showcase, 8(12). Retrieved from the League for Innovation in the Community College website at https://www.league.org/innovation-showcase/model-instructional-coaching-community-college

Wiggins, G.P., & McTighe, J. (2005). Understanding by design. Alexandria, VA: Association for Supervision and Curriculum Development.

Zepeda, S.J. (2015). Job-embedded professional development: Support, collaboration, and learning in schools. New York, NY: Routledge.

Digital Citizenship for Elementary Students

Just when elementary teachers thought they couldn't possibly have anything else stacked on their plates, teaching digital citizenship has been added to their load. However, when a district or school has an intentional, well-organized, and comprehensive plan in place, digital citizenship does not have to seem like another chore or standard to check off. Digital citizenship can be woven into what is already being taught in the classroom and should not be the responsibility of just one person or position; it should become a way of life in the classroom. Children often learn as much, or more, from adults modeling behavior as from adults explicitly teaching skills and behaviors. Crompton (2014) summarizes this well in her blog post on the ISTE website: "Students are much more likely to understand good digital citizenship — the norms of appropriate, responsible technology use — when teachers model it on a regular basis. It is also important for all educators to spend time directly teaching and actively promoting digital citizenship. And keep in mind that it's not just one person's job to teach digital citizenship in a school, but everyone's shared responsibility."

 

Curriculum: A Place to Start

 

Common Sense Media is a tremendous resource for digital citizenship lessons. The lessons address students in grades K-12 and cover all aspects of digital citizenship, such as internet safety, privacy and security, relationships and communication, cyberbullying and digital drama, digital footprint and reputation, self-image and identity, information literacy, and creative credit and copyright. I have taught many of these lessons in grades K-5 and was impressed by the ease of use for teachers, the engagement for students, and the quality and quantity of material covered. There is even a brief tutorial introducing teachers to digital citizenship instruction and this suite of free products. Some of my favorite features of this resource are the "family tip sheets" and the videos. I also like how the lessons are interactive for students and build upon each other across the grades. You can teach just one lesson or use every lesson in the curriculum; it's really up to you to customize what is best for your school or classroom. If you are new to teaching digital citizenship, I recommend Common Sense Media as a good place to start!

 

Why Digital Citizenship?

 

For many years in my elementary classroom, I had only three simple “rules” for students to follow: Be safe, be respectful, and participate as best as you can. Diana Fingal, in her article “Infographic: Citizenship in the Digital Age” on the ISTE website, describes the elements of digital citizenship in similar terms: “The elements of digital citizenship, it turns out, are not so different from the basic tenets of traditional citizenship: Be kind, respectful and responsible, and participate in activities that make the world a better place” (Fingal, 2017). Below is the infographic Fingal shared in her article:

Infographic from: https://www.iste.org/explore/articleDetail?articleid=192&category=Lead-the-way&article=Digital-citizenship-infographic

 

Our students are using technology at skyrocketing rates both in the classroom and at home. Most of them enter kindergarten well versed in how to navigate their way around a phone or tablet and able to manipulate websites and digital cameras. School is a place where we encourage our students to “make mistakes.” We want them to try new things, take risks, and step out of their comfort zones in order to develop and grow as lifelong learners and citizens. We want them to make mistakes when the stakes are low and when they are well supported by adults they trust. It is imperative that we teach our students how to become responsible, respectful, and valuable digital citizens while they are in our classrooms. This is not a skill set they come with; although this generation of digital natives may seem to have it all ingrained, they do not, and this is a teaching opportunity we (as educators) cannot miss. Crompton and Fingal both agree.

“Contrary to popular belief, however, digital natives don’t pick up these skills through osmosis. It falls on parents and educators to teach them how. Just as a teacher would talk to students about etiquette and safety before they enter a public place on a school trip, so must they remind students of what’s expected of them online.” (Crompton, 2014).

 

“Just as all kids throughout the centuries have needed help from their parents, teachers and mentors along the path to becoming good citizens, our digital natives need guidance as they learn how to apply the elements of citizenship to the realities they encounter in a connected world.” (Fingal, 2017).

 

Sources:

 

Common Sense Media website (Retrieved on May 17, 2018) from: https://www.commonsense.org/education/digital-citizenship

 

Crompton, H. (2014).  ISTE.org website (Retrieved on May 20, 2018) from: https://www.iste.org/explore/articleDetail?articleid=142

 

Fingal, D. (2017). ISTE.org website (Retrieved on May 20, 2018) from: https://www.iste.org/explore/articleDetail?articleid=192&category=Lead-the-way&article=Digital-citizenship-infographic

 

ISTE.org. (2017). ISTE Standards for Educators. (Retrieved on April 30, 2018) from: https://www.iste.org/standards/for-educators

Krueger, N. (2014).  ISTE.org website (Retrieved on May 20, 2018) from:  https://www.iste.org/explore/articleDetail?articleid=242

Kiddom: A Tool to Support Standards-Based Grading and Individualized Learning

This week’s post was inspired by a Standards-Based Grading system I observed while subbing in a middle school math class recently. In the class, students were using the Schoology LMS (Learning Management System) to view the math goals (dubbed proficiencies) they had not yet reached for the quarter. They then took that information and sought out resources posted online by the teacher in order to help them meet those goals. Proficiency was demonstrated through quizzes posted online by the teacher. To study for each proficiency, students explored linked Khan Academy videos and completed various practice activities.

The system appealed to me for several reasons. Most importantly, students were aware of those skills they had mastered and which needed more practice. They also had the self-sufficiency to find and use the appropriate resources to help prepare to meet those goals. Students had a high degree of ownership over their learning and technology was providing both students and the teacher with data to analyze learning.

ISTE Educator Standard 6a asks educators to, “Foster a culture where students take ownership of their learning goals and outcomes in both independent and group settings.” While the system I observed used technology to meet those goals, I wanted to explore other tools available. I am not a big fan of Schoology, and I was also curious about learning management systems designed specifically for Standards-Based Grading.

First of all, what is Standards-Based Grading (SBG)?

Standards-Based Grading is a method of assigning students a grade based on mastery of concepts instead of averages across multiple miscellaneous assignments. Ideally, the goals of a course should be driven by standards, and those goals are what an SBG system measures. Instead of traditional letter grades, you may see terms like ‘Developing,’ ‘Approaching,’ ‘Mastery,’ and ‘Exceeding.’ These terms are sometimes converted to a 1-4 number scale.
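As a small illustration of that conversion (the terms match the ones above, but the numeric mapping here is my own sketch, not any particular gradebook's), a 1-4 scale might look like:

```python
# Hypothetical mapping of SBG mastery terms to a 1-4 numeric scale.
MASTERY_SCALE = {
    "Developing": 1,
    "Approaching": 2,
    "Mastery": 3,
    "Exceeding": 4,
}

def to_numeric(term):
    """Convert a mastery term to its numeric equivalent."""
    return MASTERY_SCALE[term]
```

So a student marked ‘Mastery’ on a standard would show up as a 3 in the numeric view.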

Why the shift to Standards-Based Grading (SBG)?

  • Traditional grades are inconsistent. Mastery based on standards is a much more meaningful measure of learning than a traditional letter grade, which also reflects a student’s motivation, interest, and level of home support. Traditional letter grades also tend to be very subjective: what one teacher deems an A+ paper, another might assign a B-. SBG is much more objective. A grade is assigned based on whether or not specific learning goals are met (for example, ‘Student used text evidence to support their analysis’).
  • Traditional grades rarely reflect mastery. As Scriffiny argues, “If we base our grades on standards rather than attendance, behavior, or extra credit (which often has nothing to do with course objectives), we can actually help students grapple with the idea of quality and walk away with a higher degree of self-sufficiency.” (2008)
  • SBG better promotes growth. Like many teachers, I am frustrated when students focus on making the minimum grade and moving on. Learning becomes a siloed, once-and-done experience. It feels inauthentic and drives students away from intrinsic motivation. Teacher and author John Spencer connects traditional grading to students’ fear of taking risks; “…when they see that their grades are based upon mastery rather than averaging, they realize that mistakes are an integrated part of the class flow.” There is value in these risks and mistakes because “when students can see [mistakes] as a natural part of the process, they can use mistakes to guide their reflection and ultimately celebrate their successes and the mistake-laden journey that led them there.” (Spencer, as quoted in Ferlazzo, 2016)
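To make the contrast in these points concrete, here is a minimal sketch (the scores are invented) of how a traditional average differs from one common SBG approach, which scores a standard by the most recent evidence:

```python
# Invented example: four attempts at the same standard over a quarter,
# showing early struggles followed by later mastery.
attempts = [55, 70, 85, 95]

# Traditional grading: average everything, so early mistakes drag the grade down.
traditional = sum(attempts) / len(attempts)  # 76.25

# One common SBG approach: the most recent evidence reflects current mastery.
sbg = attempts[-1]  # 95
```

Under averaging, the student’s early risk-taking permanently lowers the grade; under mastery-based scoring, those early mistakes become part of the process rather than a penalty.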

What is Kiddom?

In searching for ways that technology can support SBG and student ownership, I discovered a blog post by Angela Watson discussing Kiddom. Kiddom is a Learning Management System that supports Standards-Based Grading for teachers in grades K-12, and it is free for teachers and students. Dozens of standards sets are available, including state-specific standards, Common Core standards, the ISTE standards, Next Generation Science standards, and even Social Emotional Learning standards.

Much like Google Classroom or Schoology, Kiddom serves as an online grade book and classroom. After setting up your class and inviting students, you can use the Kiddom library of lessons or upload your own assignments. Kiddom is fully integrated with Google Docs, and you can incorporate lessons from popular sites like Newsela, Khan Academy, and IXL Learning. For each assignment, you can choose multiple standards and rubrics. You can use pre-populated rubrics or create your own.

Kiddom also supports blended learning since you can include non-digital assignments and input the scores based on a rubric. To get an idea of what Kiddom looks like from a student’s perspective, check out this blog post.

How does Kiddom support individualized learning?

The data available to teachers give an accessible and visually appealing overview of where individual students, and the class as a whole, stand in terms of meeting set goals and overall standards. You can see some examples of that data on the page ‘What Insights Do My Reports Offer?’

Using that data to inform instruction, assignments can be given to specific students based on their level of mastery. This is ideal for providing extensions, extra practice, or remediation based on student need.

 

How can Kiddom support student ownership?

Kiddom uses the following graphic to depict the learning cycle that is possible with Standards-Based Grading and the Kiddom software. It’s similar to many cyclical education models where the student defines the tasks and sources, completes the task, reflects, and refines.

 

Using the reporting tools, students have the ability to view their progress toward specific goals. Kiddom provides a student-centered video tutorial for how to interpret reports on Kiddom. Below is a snapshot of the overall view students and parents can see from the Kiddom dashboard. Clicking on individual assignments within a standard reveals comments and the rubric used to assess the work. In this way, students know what they need to do in order to improve their mastery.

Another neat feature that Kiddom provides is the ability to teach using Playlists. Playlists support student choice (as I’ve previously explored). Playlists can be used to offer multiple ways to learn and to show what you know. Additionally, they can be assigned to specific students based on need or interest. One way I can envision using this tool in the Language Arts classroom is to support digital literature circles where each group is reading a different novel.

Are there any drawbacks?

The level of teacher investment and interest in the Kiddom system will likely impact how successful the tool is in supporting student ownership and individualized learning.

Kiddom offers a powerful set of tools if fully utilized. If not fully utilized, I can see this being a glorified online rubric system. For instance, if a teacher is only checking a box on a generic pre-populated rubric and not providing any comments or additional support/differentiation, it’s kind of like using a Ferrari only to transport your kid to soccer practice. It gets the job done, but you don’t need this tool if that’s all you want to accomplish.

To be successful, teachers should provide detailed feedback along with additional support as needed. This is easily accomplished through the differentiated assignment options. Students should be given the option to refine and resubmit work. Standards-Based Grading is about a mind shift as much as it is a grade shift.

 

Sources

Ferlazzo, L. (2016). Response: ‘Freedom to Fail’ Creates a Positive Learning Environment [Blog]. Retrieved from http://blogs.edweek.org/teachers/classroom_qa_with_larry_ferlazzo/2016/09/response_freedom_to_fail_creates_a_positive_learning_environment.html

Kiddom – Collaborative Learning Platform. (2018). Retrieved from https://www.kiddom.co/

Scriffiny, P. (2008). Seven Reasons for Standards-Based Grading. Retrieved from http://www.ascd.org/publications/educational_leadership/oct08/vol66/num02/Seven_Reasons_for_Standards-Based_Grading.aspx

Townsley, M. (2014). What is the Difference between Standards-Based Grading (or Reporting) and Competency-Based Education?. Retrieved from https://www.competencyworks.org/analysis/what-is-the-difference-between-standards-based-grading/

Writing Math: Integrating Universal Design with “Social Turn” Writing Pedagogy

Research-based Frameworks for Addressing and Assessing Online Learning Engagement

I begin this post with two hypotheses about online learning, based on my experience as a community college composition and humanities teacher in both face-to-face and digital formats, and on my experience as a graduate student who has taken digital courses from two public research institutions and one private university. The first hypothesis is that the ratio of nontraditional to traditional students is greater in digital than in brick-and-mortar formats. The second hypothesis is that, despite the prevalence in #edtech online instructor training of “frameworks,” lists of “best practices,” and available technologies, most college teachers and institutions implementing online learning formats could do more to align online course design and instructor behavior with cognitivist and constructivist learning theory, with empirically verified pedagogical strategies, and with systematic piloting and review of digital innovations at the course and program level.

At the intersection of these two hypotheses is the crux of the matter: if a higher ratio of first generation, English language learner, adult, and other nontraditional students is enrolling in online (defined here as any combination of synchronous or asynchronous learning) courses, and the shift to teaching in digital spaces and/or new technological innovations requires teachers to develop new communication, technology and pedagogical design skills while amplifying the negative effect of the lack of such skills, do resulting negative impacts on student retention and motivation create a significant disparate impact for nontraditional students? On the flip side, how can implementation of learning theories such as andragogy and social constructionism, together with more evidence-based review of digital teaching approaches, result in increased success for the less traditional student population that tends to take online courses?

A number of studies confirm that nontraditional students comprise the majority of online learners (Chen, Lambert & Guidry, 2010; Thompson, Miller & Pomykal Franz, 2013). Teachers and institutions who create online learning experiences thus need to consider both the assumptions of adult learning theory and issues of diversity. Adult learners have and use more personal experience in learning; maintain responsibility for their own learning and resist situations where learning appears to be dispensed or controlled by others; and are more intrinsically than extrinsically motivated (Holton, Swanson, & Naquin, 2001). Issues of diversity include adaptation, acculturation, identity formation, and diverse practices and understandings of knowledge acquisition and demonstration (Nielsen, 2014).

I would like to posit that a question as complex as how to build capacity at the course and institutional level for supporting such learners in online formats cannot be addressed effectively without systematic analysis, design, and evaluation of possible solutions that consider not only what innovations may work but why they work and whether they are scalable.

In this post, I respond to recent literature seeking to put principles of learning theory into conversation with systematic qualitative analytical approaches to problems of student engagement by suggesting that other educators join in early implementation of these models as well as in systematic review of results. I’ll discuss two research-based frameworks for online course design and one for course/program review.

 

 

References:

Chen, P., Lambert, A., & Guidry, K. (2010). Engaging online learners: The impact of web-based learning technology on college student engagement. Computers & Education, 54, 1222-1232. doi:10.1016/j.compedu.2009.11.008

Ford, C., McNally, D., & Ford, K. (2017). Using design-based research in higher education innovation. Online learning, 21(3), 50-67. doi:10.24059/oli.v%vi%i

Holton, E.F., Swanson, R.A., & Naquin, S.S. (2001). Andragogy in practice: Clarifying the andragogical model of adult learning. Performance improvement quarterly, 14(1), 118-143. Retrieved from https://onlinelibrary.wiley.com/doi/abs/10.1111/j.1937-8327.2001.tb00204.x

Lehman, R.M. & Conceição, S. (2013) Motivating and retaining online students: Research-based strategies that work, Jossey-Bass / Wiley. Retrieved from http://ebookcentral.proquest.com/lib/spu/detail.action?docID=1376946

Nielsen, K. (2014) On class, race, and dynamics of privilege: Supporting generation 1.5 writers across the curriculum. In Zawacki, T.M. & Cox, M. (Eds.), WAC and second-language writers: Research towards linguistically and culturally inclusive programs and practices (pp. 129-150). Retrieved from https://wac.colostate.edu/books/perspectives/l2/

Oremus, W. (2015, October 25). No more pencils, no more books: Artificially intelligent software is replacing the textbook—and reshaping American education. Slate. Retrieved from http://www.slate.com/articles/technology/technology/2015/10/adaptive_learning_software_is_replacing_textbooks_and_upending_american.html

Redmond, P., Abawi, L., Brown, A., Henderson, R., & Heffernan, A. (2018). An online engagement framework for higher education. Online learning, 22(1), 183-204. doi:10.24059/olj.v22i1.1175

Thompson, N., Miller, N., & Pomykal Franz, D. (2013). Comparing online and face-to-face learning experiences for non-traditional students: A case study of three online teacher education candidates. The quarterly review of distance education, 14(4), 233-251.

Personalizing and Differentiating Teaching with Playlists

ISTE Educator Standard 5a calls for teachers to “Use technology to create, adapt and personalize learning experiences that foster independent learning and accommodate learner differences and needs.” When considering which tool might best serve this purpose, my mind immediately went to Google Apps for Education. GAfE offers many features that support personalized learning such as differentiation by assignment in Google Classroom,  custom redirection based on responses in Google Forms, and utilizing the Google Classroom roster to easily BCC students who need individual attention. Not to mention the host of helpful Chrome extensions like Read&Write for text-to-speech capabilities,  Grammarly for built-in spell checking, and WolframAlpha for science and math help.

For this week’s inquiry, I initially considered spending time exploring HyperDocs, a tool I have previously dabbled in. My students really liked the creative and digital presentation. Yet I was challenged by my professors to consider whether or not a HyperDoc is really just a glorified worksheet. I happened to be exploring one of my favorite educational blogs, Cult of Pedagogy, when I came across a post featuring a tool that was engaging like a HyperDoc, but had more potential for personalization, independence, and differentiation. That tool is called a Learning Playlist, and the idea is the brainchild of teacher Tracy Enos. Jennifer Gonzalez interviews Enos and shares examples of what Learning Playlists look like in her post, “Using Playlists to Differentiate Instruction.”

Figure 1: Playlist for Argument Writing by Tracy Enos, full Google Doc available here

What is a Learning Playlist?

The most basic description of a Learning Playlist is “an individualized digital assignment chart that students work through at their own pace” (Gonzalez, 2016). Tracy Enos developed the idea for Learning Playlists when she grew frustrated with trying to meet all students’ needs with a single lesson. She’s not alone in rejecting the one-size-fits-all approach. In fact, “[N]early 50% of the students in today’s classrooms have some form of learning diversity that impacts how they learn best” (Digitalpromise.org, 2016).  This diversity includes differences in background, history, culture, linguistics, and socioeconomic status (Digitalpromise.org, 2016). Given these needs, it’s no wonder teachers are turning to technology to help meet students’ individual needs. Playlists are one tool that can foster independence, allow for choice, and differentiate based on need. Playlists also take the responsibility for learning and place it in the hands of students.

How is a Learning Playlist created?

Consider the many elements that go into creating a successful unit plan in any content area. There will be guiding questions, lessons, content to review, formative assessments, discussions, and perhaps articles or film clips. In a traditional classroom model, all of that learning takes place at the same time. Each student reads the same short story at the same pace. One day is dedicated to completing one set of questions. Test day is the same for everyone, regardless of need. This model typically meets the needs of those students in the middle of the spectrum while leaving some students struggling to catch up and others bored because they’ve finished early. Meanwhile all students have a low degree of choice and ownership. It’s passive learning.

What if you took the same essential elements of your unit plan and instead digitized them? Lessons could be bookmarked for review whether through a Slideshow, Screencast, or other tool. Formative assessment could occur through ActivelyLearn, EdPuzzle, or Google Forms. Discussions can still be had via Padlet, Google Groups, or Slack. Students could work at their own pace, independently. With this newfound independence, your time is freed to assist struggling learners or to conference individually with students.
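One way to picture such a digitized unit (the task names, tools, and links here are invented for illustration, not taken from Enos’s playlists) is as an ordered list of tasks that a student works through at their own pace:

```python
# Hypothetical playlist: each task has a type, a resource link, and a status.
playlist = [
    {"task": "Watch the plot-structure lesson", "type": "screencast",
     "link": "https://example.com/lesson1", "done": True},
    {"task": "Formative check on plot terms", "type": "quiz",
     "link": "https://example.com/quiz1", "done": True},
    {"task": "Post to the class discussion board", "type": "discussion",
     "link": "https://example.com/discussion", "done": False},
]

def next_task(playlist):
    """Return the first unfinished task, supporting self-paced progress."""
    for entry in playlist:
        if not entry["done"]:
            return entry["task"]
    return None  # everything is complete
```

A student opening this playlist would be pointed to the discussion post, while a teacher scanning the `done` flags can see who needs a conference.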

It’s a pretty revolutionary way to consider teaching. Yet you can still have deadlines and require students to check in daily or weekly. Learning can be reflected on via whole-group discussion days. (I’m an enormous fan of Socratic Seminars.) Students can still work in groups to meet the learning goals. You can even opt for a blended model where some of the lessons are given whole-group, and then students work individually on a Playlist based on their needs.

How can Learning Playlists support independent learning?

Within the Playlist format, there is plenty of room to support independence and choice. Though Enos doesn’t mention the addition of student choice, I can easily see how it can be incorporated into Playlists.

  • Choice of content: have students choose their own short story to apply plot skills to or Civil War battle to research and describe.
  • Choice of task: have students choose which tool they want to use to ‘show they know’: perhaps an Explain Everything screencast to demonstrate the steps of an algebra problem, or a Storyboard That digital story about rock cycles.
  • Choice of question: have students come up with their own essential question and use a Playlist to guide them through the inquiry model.

How can Learning Playlists support differentiation?

Differentiation within Playlists can be accomplished without drawing attention to those students who need extra help. Given the ability to individually assign work within Google Classroom, no student knows they have a different version. Another option is to provide links to the leveled Playlists and let students self-select.

  • Pacing: Students can replay or accelerate a lesson as needed.
  • Leveling: For each Playlist, you can create a Level 1, 2, and 3 Doc/Slide to better meet student needs.
  • Personal Support: Enos leaves several tasks on her students’ Playlist ‘to be determined.’ She then goes back and adds extra resources or practices as needed based on the work students submit.

Sources

Digitalpromise.org. (2016). The Growing Diversity in Today’s Classroom. [online] Available at: http://digitalpromise.org/wp-content/uploads/2016/09/lps-growing_diversity_FINAL-1.pdf [Accessed 4 May 2018].

Gonzalez, J. (2016). Using Playlists to Differentiate Instruction. [online] Cult of Pedagogy. Available at: https://www.cultofpedagogy.com/student-playlists-differentiation/ [Accessed 1 May 2018].

Animation in the Elementary Classroom

Many children’s first experience with technology is animation. So it makes sense that animation can have a valuable and influential impact in the classroom. Currently in my coursework we are looking at the ISTE Standards for Educators, specifically Standard 5: Designer, which calls for educators to design authentic, learner-driven activities and environments that recognize and accommodate learner variability (ISTE, 2017). One of the three indicators for this standard reads, “Use technology to create, adapt and personalize learning experiences that foster independent learning and accommodate learner differences and needs” (ISTE, 2017). One way I have used technology to personalize learning experiences for students is through animation. Through animation I have been able to differentiate my instruction, engage my students, and help all learners make connections to the real world.

 

Using Animation to Hook Students and Create Connections

 

As I mentioned in the previous paragraph, many children’s first experience with screens is with animation. Cartoons have an ability to hook very young children (which can be viewed as positive or negative) and can help toddler- and preschool-age children begin to form connections to “people” outside of their family. In a blog post on ASCD, Janelle Vargo writes about “10 Reasons to Use Animation in the Classroom” (Vargo, 2017). She discusses how she has seen her students adjust their behavior due to the positive influences of animated characters and videos. Of Vargo’s 10 reasons, I have chosen the 5 that I agree with the most and have experienced in my personal and professional life. Here are my “top picks” from Vargo’s list (Vargo, 2017):

 

  1. “Students in K-2 Classrooms Relate to Animated Characters”
  2. “Animated Stories Can Teach Empathy”
  3. “Student’s Imitate the Character’s Behavior”
  4. “Animated Stories are an Effective Way to Convey Information”
  5. “Stories Create a Shared Viewing Experience”

 

Students come to us with social-emotional needs that, when not addressed, can hinder their learning experiences in our classrooms.  Animation is a tool that can be used to help address these needs and create a classroom community based on common language, relatable character “friends”,  and shared experiences.

 

The Art of Creating Animation

 

Most things that we enjoy watching or experiencing, we are bound to want to try out. How is that done? Can I do that? Creating animation in the classroom can be multidisciplinary, is learner-driven, and can be adapted to accommodate a variety of learning styles and skill sets. Animation can be a very authentic learning experience in a variety of subject areas. In my classroom I have had students create animated stories with no guidelines, just to introduce them to the process. Other times I have given a specific assignment (for example: create an animated video on how you got to school this morning) and/or given them instructions for features I wanted them to use or the length of their animated story. In my experience, students approached creating the animations in very different ways. It allowed me to see the diversity in my students’ creativity and how they worked through the design process in different ways. One thing that all students had in common was that they wanted an audience for their animations; they wanted to share their stories with their peers and with me. This is what I want for my students. I want them to want to share their stories and to be comfortable and confident sharing their thinking and their creativity.

 

Best Programs for Elementary Students

 

ABCYA.com is a website that I use frequently in my classroom. I like the variety and value of the programs on the website, and I appreciate that the programs are free. (There is a fee to go ad-free and for use on tablets and phones.) One of the digital tools on ABCYA.com that I have used with my 3rd, 4th, and 5th grade students is Animate. I found the Animate Tutorial, which is only a couple minutes long, to be a great resource, and I have shown it to each class when I introduced the program. The students found it very helpful, and at times I saw them revisiting the tutorial on their own. Animate seems best for upper elementary students who have had no or limited experience creating animation. There is an option to export an animation as a GIF file, although I have yet to do this with my students. Here are some of my favorite features of Animate:

  1. Copy Frame– I like the ability to copy each frame and how this is really emphasized in the tutorial video.
  2. Images– There are numerous images students can choose from when creating their animation. This takes away the pressure to create original drawings for those students who choose not to draw or who have limited time.
  3. Edit Background– There are 7 backgrounds to choose from as well as the option to create your own or upload an image.
  4. Frame Rate– Students can view their animation in slow, medium, or fast speed and choose to play it in a loop as well.

While there are many other options for creating animation in the classroom, I have found success, in terms of student engagement, ease of use, and the ability to accommodate all learners, using Animate from ABCYA.com. I especially like that it is a free program. Most other programs involved costs for purchasing apps and/or weren’t compatible with the platforms I use in my classroom. If you are interested in other options, I encourage you to visit Common Sense Media’s blog post “16 Websites and Apps for Making Videos and Animation.” I found several apps on this list that seem to be good options for when I am ready to take animation to the next level in my classroom.

 

Sources:

Holderman, E. (2014). Common Sense Media website (Retrieved on May 1, 2018) from: https://www.commonsense.org/education/blog/16-websites-and-apps-for-making-videos-and-animation

ISTE.org. (2017). ISTE Standards for Educators. (Retrieved on April 30, 2018) from: https://www.iste.org/standards/for-educators

Vargo, J. (2017). ASCD Website (Retrieved on May 1, 2018) from: http://inservice.ascd.org/10-reasons-to-use-animation-in-the-classroom/