Just when elementary teachers thought they couldn’t possibly have anything else stacked on their plate, teaching digital citizenship has been added to their load. However, when a district or school has an intentional, well-organized, and comprehensive plan in place, digital citizenship does not have to seem like another chore or standard to check off. Digital citizenship can be woven into what is already being taught in the classroom and should not be the responsibility of just one person or position. Digital citizenship should become a way of life in the classroom. Children often learn as much, or more, from adults modeling behavior than from adults explicitly teaching skills and behaviors. Crompton (2014) summarizes this well in her blog post on the ISTE website: “Students are much more likely to understand good digital citizenship — the norms of appropriate, responsible technology use — when teachers model it on a regular basis. It is also important for all educators to spend time directly teaching and actively promoting digital citizenship. And keep in mind that it’s not just one person’s job to teach digital citizenship in a school, but everyone’s shared responsibility.”
Curriculum: A Place to Start
Common Sense Media is a tremendous resource for digital citizenship lessons. These lessons address students in grades K-12 and cover all aspects of digital citizenship, such as internet safety, privacy and security, relationships and communication, cyberbullying and digital drama, digital footprint and reputation, self-image and identity, information literacy, and creative credit and copyright. I have taught many of these lessons in grades K-5 and was impressed by the ease of use for teachers, the engagement for students, and the quality and quantity of material covered. There is even a brief tutorial for teachers to introduce them to digital citizenship instruction and this suite of free products. Some of my favorite features of this resource are the “family tip sheet” and the videos. I also like how the lessons are interactive for the students and build upon each other throughout the grades. You can teach just one lesson or use every lesson in the curriculum; it’s really up to you to customize what is best for your school or classroom. If you are new to teaching digital citizenship, I recommend Common Sense Media as a good place to start!
Why Digital Citizenship?
For many years in my elementary classroom, I had only three simple “rules” for students to follow: be safe, be respectful, and participate as best you can. Diana Fingal, in her article “Infographic: Citizenship in the Digital Age” on the ISTE website, describes the elements of digital citizenship in similar terms: “The elements of digital citizenship, it turns out, are not so different from the basic tenets of traditional citizenship: Be kind, respectful and responsible, and participate in activities that make the world a better place” (Fingal, 2017). Below is the infographic Fingal shared in her article:
Our students are using technology at skyrocketing rates both in the classroom and at home. Most of them enter Kindergarten well versed in how to navigate their way around a phone or tablet and able to manipulate websites and digital cameras. School is a place where we encourage our students to “make mistakes”. We want them to try new things, take risks, and step out of their comfort zones in order to develop and grow as life-long learners and citizens. We want them to make mistakes when the stakes are low and when they are well-supported by adults they trust. It is imperative that we teach our students how to become responsible, respectful, and valuable digital citizens when they are in our classrooms. This is not a skill set they arrive with, and although this generation of digital natives may seem to have it all ingrained in them, they do not; this is a teaching opportunity we (as educators) cannot miss. Crompton and Fingal both agree.
“Contrary to popular belief, however, digital natives don’t pick up these skills through osmosis. It falls on parents and educators to teach them how. Just as a teacher would talk to students about etiquette and safety before they enter a public place on a school trip, so must they remind students of what’s expected of them online.” (Crompton, 2014).
“Just as all kids throughout the centuries have needed help from their parents, teachers and mentors along the path to becoming good citizens, our digital natives need guidance as they learn how to apply the elements of citizenship to the realities they encounter in a connected world.” (Fingal, 2017).
As students in the US finish up the 2017-2018 academic year, teachers, professors, teachers’ assistants, and advisors are looking over summative projects and assignments. These can range from essays, reports, stories, and explications to standardized tests, just to name a few. When these students finish drafting, writing, revising, and editing that last sentence, punctuation mark, or citation of the school year, right before the freedom of summer but not before grades come out, “they will have to do something their parents never did: run their work through anti-plagiarism software,” as NPR Education’s Corey Turner explains in his 2014 piece for “All Things Considered,” where he names Turnitin as a company with a database it utilizes to screen for potential similarity and originality, and he comments that the database is big. Really, really big. Computer technology and the Internet now make plagiarism an easier enterprise. As a result, faculty must be more diligent in their efforts to deter academic dishonesty, and institutions of higher education must provide the leadership and support to make that possible. One study explored the use of a plagiarism detection system to deter digital plagiarism. Its findings suggest that when students were aware that their work would be run through a detection system, they were less inclined to plagiarize, and that, regardless of class standing, gender, and college major, recognition by the instructor of the nature and extent of the plagiarism problem, and acceptance of responsibility for deterring it, are pivotal in reducing the problem. Chris Harrick, Turnitin’s vice president of marketing, describes it this way: “A student submits a paper through Turnitin’s website. The company’s algorithms then compare strings of text against its massive database.” And, as Harrick continues, it doesn’t just check the Internet.
Most of the papers, once they’ve been run through the system and scrubbed of student names, actually stay in the system. When all the comparing is done, the teacher gets a report that gives the percentage of the paper that matched other sources. The report never says: This is plagiarism. Just: This is similar” (NPR, 2014). This part of the product is referred to as the “Originality Checker” and is only one small element of what was previously known as iParadigms and is now the Turnitin LLC empire.
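Harrick’s description — comparing strings of text against a massive database and reporting a similarity percentage — maps onto a standard technique in the plagiarism-detection literature: break each document into overlapping word n-grams (“shingles”) and measure how much the sets overlap. Turnitin’s actual algorithms are proprietary, so the sketch below is only a minimal, hypothetical illustration of the general idea (the function names and the 5-word shingle size are my own choices, not Turnitin’s):

```python
def shingles(text, n=5):
    """Break text into the set of overlapping word n-grams ("shingles")."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def similarity(submission, source, n=5):
    """Jaccard similarity of two texts' shingle sets, from 0.0 to 1.0.

    A real checker would report the highest overlap found across an
    entire database of sources, not a single pairwise comparison.
    """
    a, b = shingles(submission, n), shingles(source, n)
    if not a or not b:
        return 0.0
    return len(a & b) / len(a | b)
```

Crucially, as the NPR piece notes, a score like this never says “this is plagiarism” — it only says “this is similar,” and a human still has to interpret it.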
Beyond the originality checker, there are several other products. The oldest grouping of systems is referred to as the Feedback Studio (TFS), which is considered Turnitin’s core. Examining the usefulness of the product, Tom Dee, a professor in the graduate school of education at Stanford, explains that ‘“these tools are like a hammer or a scalpel,” cautions Dee. “Whether using them is helpful or hurtful depends on the care and discretion with which they’re used”’ (NPR, 2014). TFS is now just one aspect of what Turnitin can do and only one of its products. The company has grown extensively in the previous four years and now includes other products like Revision Assistant, iThenticate, and WriteCheck, and it bought one of its largest competitors, VeriCite, in early 2018. A lot has changed since the company started in 1998, and even since 2015, when Ry Marcattilio-McCracken wrote a commentary piece in the Chronicle of Higher Education, “My Love-Hate Relationship With Turnitin.” He begins by explaining how much he loves Turnitin as a tool in his classes, but despite the obvious time-saving benefits, a student came to him with a couple of questions that he couldn’t necessarily answer. “The student was nontraditional, and this was his first college course in some years. He was concerned first about accidentally plagiarizing, and wondered (naïvely, but completely understandable) if TurnItIn let students run their work through free to make sure this didn’t happen. Second, the student didn’t like the idea of being forced to surrender his work to a company that would make money from it. He was articulate, respectful, and tentative.” This raises the privacy question of who owns what: once a project or paper enters Turnitin’s database, does it now belong to the company? As Marcattilio-McCracken explains, a little searching turned up surprisingly few lawsuits brought against iParadigms, the (former as of 2016) parent company of TurnItIn. But someone had issued a challenge.
Six years ago a court weighed in, and the judge ruled in favor of iParadigms on four grounds, as summarized in the Harvard Journal of Law & Technology: “1) Commercial use can be fair use, and … use can be transformative ‘in function or purpose without altering or actually adding to the original work.’ TurnItIn transformed the work by using the papers to prevent plagiarism and not for factual knowledge; 2) The website’s use does not diminish or discourage the author’s creativity or supplant the students’ rights to first publication; 3) Using the entirety of the papers did not preclude fair use; and 4) TurnItIn’s use does not affect marketability” (2015). Teachers, parents, and administrators have the right to be concerned about privacy rights and about how software tools like Turnitin consume student data. Educators from the Common Core era know that when tools make students feel caught, or simply show them how wrong they are, students will not feel empowered to do their work. Fortunately, Turnitin has evolved, and as it grew it also expanded its capabilities.
In an article entitled “Turnitin and peer review in ESL academic writing classrooms,” Jinrong Li and colleagues share their experience of using Turnitin for peer review in an English as a Second Language (ESL) academic writing course and discuss its advantages, its limitations, and how different features of PeerMark may be used to address some of the challenges identified in previous research on peer review in the L2 writing classroom. Throughout a semester, the students were required to complete three peer review tasks through Turnitin. Based on the instructor’s experience and the students’ reports, the authors found that Turnitin could help shift students’ attention from local to global issues in writing, scaffold students in their effort to provide more helpful comments and to make connections between specific suggestions and holistic advice for writing, and facilitate classroom management during peer review.
Another, newer product of Turnitin’s, introduced in 2016, is Revision Assistant, which was acquired from Pittsburgh-based LightSide Labs in a deal that kept the company’s office in Pittsburgh, led to more hiring, and placed co-founder Elijah Mayfield as VP of New Technologies at Turnitin. LightSide Labs, founded in 2013, employs machine learning algorithms developed by researchers at Carnegie Mellon University’s Language Technologies Institute to help instructors assess student writing and provide real-time automated formative feedback. What LightSide brings to the table is a greater opportunity to grow the Turnitin brand in the K-12 arena, especially the tough-to-enter middle school market, and to add to the tool’s student-facing interface. Revision Assistant helps students become better writers by reducing the amount of time spent scoring essays and the amount of time students spend waiting for feedback. From my personal experience as a public school teacher for six years, working specifically with grades 7-11 in language arts and history, I can safely say that immediately after students finish the last word on their papers, they tend to want to turn them in. Getting students to reread their papers multiple times was like pulling teeth, and only if I made peer editing and self-editing another graded assignment in the grade book would students do it at all. A study done by Turnitin, presented to me in my new-hire orientation, found that “57% of students admit that they receive feedback too late in the writing process and 79% of teachers say feedback is crucial.” With Revision Assistant (RA), students go through 7.9 drafts per student on average (Turnitin Review). In the revision process, students ask for more signal checks and comments.
The more papers, essays, or reports that are collected, the better RA is trained. A team of assessment experts evaluates the initial essays, Turnitin’s curriculum team of veteran teachers crafts actionable comments, and Turnitin’s research team analyzes patterns in the sample essays to build a model that can evaluate future student work. When students submit new essays, RA then provides scoring and feedback.
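The training loop described above — experts score sample essays, and a model learns to score new submissions — can be illustrated with a deliberately tiny toy: score a new essay by looking at the expert scores of its most similar previously scored essays. Revision Assistant’s real models are proprietary machine-learning systems built at a far larger scale; everything below (the bag-of-words features, the nearest-neighbor scoring, the 1-4 scale) is an assumed simplification for illustration only:

```python
from collections import Counter
import math

def bow(text):
    """Bag-of-words feature vector: word -> count."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two bag-of-words Counters."""
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def score_essay(new_essay, scored_essays, k=3):
    """Predict a 1-4 score for a new essay by averaging the expert
    scores of its k most similar previously scored essays.

    `scored_essays` is a list of (essay_text, expert_score) pairs —
    the toy analogue of RA's expert-evaluated training set.
    """
    v = bow(new_essay)
    ranked = sorted(scored_essays,
                    key=lambda pair: cosine(v, bow(pair[0])),
                    reverse=True)
    top = ranked[:k]
    return round(sum(score for _, score in top) / len(top))
```

The point of the sketch is the workflow, not the model: more expert-scored examples mean more reliable neighbors, which is why the amount of collected writing matters so much to a system like RA.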
Initially, the product hit some skepticism about whether Elijah Mayfield and LightSide Labs were intentionally trying to replace educators. But unlike human educators, the software has limitations around advanced writing elements. With the software, writing is restricted to responses to prompts. Examples of prompts include writing about “a time that laughter featured prominently in their lives” and “the pros and cons of social media.” Since the software is primarily aimed at K-12 students and initial community college courses, several of the prompts are framed as responses to pieces of writing. As Turnitin works with school districts and community colleges that are interested in turning their curricula into writing prompts, the library had grown to 84 prompts as of May 2018 and continues to grow with the help of the curriculum team at Turnitin.
For Turnitin’s Revision Assistant, the uphill battle continues among writing instructors, many of whom philosophically object to turning writing into an activity that can be evaluated by a machine. Carl Straumsheim explains in his 2016 article for Inside Higher Ed that the National Council of Teachers of English in 2013 issued a position paper on that topic, in which members of the NCTE’s assessment task force said “the ways in which humans and machines analyze writing continue to serve very different outcomes.” The task force members are also involved in the Conference on College Composition and Communication, the NCTE’s professional organization for writing instructors. “As is the case in K-12 classrooms, teaching writing at the college level that is successful calls for thoughtful response to student writing,” the members said in the statement, naming face-to-face conferences with instructors and feedback on drafts as two examples of responses. “Such human formative assessments are essential building blocks supporting writers’ development” (2016).
Turnitin doesn’t intend for the software to take the place of the educator, instructor, or even teacher’s assistant. Revision Assistant does not come with a traditional grading feature but gives students a score of one to four in each of four categories. Turnitin calls those scores “signal checks,” since they resemble wireless signal strength logos. Continuing from Straumsheim’s piece, “if [students] spend some time with Revision Assistant, they’ll remember that they have to have a hook, that they have to have transitions. Then instructors can start helping them with the things the computer can’t” (Inside Higher Ed, 2016). Mayfield, a linguist and medieval poetry expert at heart, admits that skepticism about technology in writing instruction is a “valid concern,” and said technology has not yet reached a point where it can be used in upper-level courses teaching advanced, open-ended writing. Mayfield goes on to explain that “there is room for technology to help that conversation, but it’s not the most crucial place to have technology insert itself right now. We don’t think of [Revision Assistant] as something for upper-level electives where students are able to engage with teachers in a strong dialogue. We see it as empowerment of students who don’t have those skills already.” When I think of Revision Assistant, I think of it as the ultimate writing coach that I never had in my classes. Before my peers, and especially my teacher, ever looked over my work, a faceless computer could help improve my writing; that would certainly have motivated me, and I hope it would motivate my students, to write. The discussions my students and I had were tough at times because they felt personally picked on for certain elements of their writing or for their writing style.
Discussions and conferences could change because the data would come from a computer system, so a student’s question could become “the signal checks and spot checks on Revision Assistant said I need to work on x, y, or z,” instead of putting the blame and subjectivity on the teacher when pointing out places where a paper could be improved. The tools can extend a teacher’s reach and save time: they check and track student progress and help differentiate instruction. On top of that, they create actionable reports for teachers to deliberate over with their PLC, department, or administration. In Turnitin’s study of the papers submitted through RA, writing scores increased 53% on average from the first to the last round of feedback, and 93% of teachers in the study said that RA improves their students’ writing. It really is a win-win: students receive some initial writing coaching in the form of generic feedback and suggestions, while the teacher gains visibility to identify gaps in their writing and make informed decisions about what to teach and how to teach it. Elijah Mayfield goes on to explain that when he began talking to education companies, it was clear that the emphasis was on measuring student learning. That de-emphasizes the role of collaborative learning. It de-emphasizes the role of essay writing and communication, the stuff that in fact is probably more valuable for the majority of students.
This week’s post was inspired by a Standards-Based Grading system I observed while subbing in a middle school math class recently. In the class, students were using the Schoology LMS (Learning Management System) to view the math goals (dubbed proficiencies) they had not yet reached for the quarter. They then took that information and sought out resources posted online by the teacher in order to help them meet those goals. Proficiency was demonstrated through quizzes posted online by the teacher. To study for each proficiency, students explored linked Khan Academy videos and completed various practice activities.
The system appealed to me for several reasons. Most importantly, students were aware of those skills they had mastered and which needed more practice. They also had the self-sufficiency to find and use the appropriate resources to help prepare to meet those goals. Students had a high degree of ownership over their learning and technology was providing both students and the teacher with data to analyze learning.
ISTE Educator Standard 6a asks educators to, “Foster a culture where students take ownership of their learning goals and outcomes in both independent and group settings.” While the system I observed used technology to meet those goals, I wanted to explore other tools available. I am not a big fan of Schoology, and I was also curious about LMSs designed specifically for Standards-Based Grading.
First of all, what is Standards-Based Grading (SBG)?
Standards-Based Grading is a method of assigning students a grade based on mastery of concepts instead of averages across multiple miscellaneous assignments. Ideally, the goals of a course should be driven by standards. Those goals are what is being measured in an SBG system. Instead of traditional letter grades, you may see terms like ‘Developing, Approaching, Mastery, Exceeding.’ These terms are sometimes converted to a number scale 1-4.
Why the shift to Standards-Based Grading (SBG)?
Traditional grades are inconsistent. Mastery based on standards is a much clearer measure of learning than a traditional letter grade, which also reflects a student’s motivation, interest, and level of home support. Traditional letter grades also tend to be very subjective. What one teacher deems an A+ paper, another might assign a B-. SBG is much more objective. A grade is assigned based on whether or not specific learning goals are met (for example, “Student used text evidence to support their analysis”).
Traditional grades rarely reflect mastery. As Scriffiny argues, “If we base our grades on standards rather than attendance, behavior, or extra credit (which often has nothing to do with course objectives), we can actually help students grapple with the idea of quality and walk away with a higher degree of self-sufficiency.” (2008)
SBG better promotes growth. Like many teachers, I find it frustrating when students focus on making the minimum grade and moving on. Learning becomes a siloed, once-and-done experience. It feels inauthentic and drives students away from intrinsic motivation. Teacher and author John Spencer connects traditional grading to students’ fear of taking risks; “…when they see that their grades are based upon mastery rather than averaging, they realize that mistakes are an integrated part of the class flow.” There is value in these risks and mistakes because “when students can see [mistakes] as a natural part of the process, they can use mistakes to guide their reflection and ultimately celebrate their successes and the mistake-laden journey that led them there.” (Spencer, as quoted in Ferlazzo, 2016)
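The arithmetic difference Spencer points to — mastery versus averaging — is easy to see in a toy gradebook. The sketch below contrasts a traditional average with one common (but by no means universal) SBG convention of weighing only the most recent evidence of learning; the three-score window is my own assumption for illustration, since schools implement this differently:

```python
def average_grade(scores):
    """Traditional grading: average every score, early mistakes included."""
    return sum(scores) / len(scores)

def mastery_grade(scores, window=3):
    """One common SBG approach (assumed here for illustration):
    average only the most recent `window` scores on a standard,
    so early struggles don't permanently drag the grade down."""
    recent = scores[-window:]
    return sum(recent) / len(recent)
```

For a student whose scores on a standard climb from 1 to 4 over a quarter, the traditional average freezes the early mistakes into the grade, while the mastery view rewards where the student actually ended up — which is exactly why mistakes feel safer under SBG.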
What is Kiddom?
In searching for ways that technology can support SBG and student ownership, I discovered a blog post by Angela Watson discussing Kiddom. Kiddom is a Learning Management System that supports Standards-Based Grading for teachers in grades K-12, and it is free for teachers and students. Dozens of standards are available, including state-specific standards, Common Core standards, the ISTE standards, Next Generation Science standards, and even Social Emotional Learning standards.
Much like Google Classroom or Schoology, Kiddom serves as an online grade book and classroom. After setting up your class and inviting students, you can use the Kiddom library of lessons or upload your own assignments. Kiddom is fully integrated with Google Docs, and you can incorporate lessons from popular sites like Newsela, Khan Academy, and IXL Learning. For each assignment, you can choose multiple standards and rubrics. You can choose from pre-populated rubrics or create your own.
Kiddom also supports blended learning since you can include non-digital assignments and input the scores based on a rubric. To get an idea of what Kiddom looks like from a student’s perspective, check out this blog post.
How does Kiddom support individualized learning?
The data available to teachers give an accessible and visually appealing overview of where individual students and the class as a whole stand in meeting set goals and overall standards. You can see some examples of that data on the page ‘What Insights Do My Reports Offer?’
Using that data to inform instruction, assignments can be given to specific students based on their level of mastery. This is ideal for providing either extensions, extra practice, or remediation based on student need.
How can Kiddom support student ownership?
Kiddom uses the following graphic to depict the learning cycle that is possible with Standards-Based Grading and the Kiddom software. It’s similar to many cyclical education models where the student defines the tasks and sources, completes the task, reflects, and refines.
Using the reporting tools, students have the ability to view their progress toward specific goals. Kiddom provides a student-centered video tutorial for how to interpret reports on Kiddom. Below is a snapshot of the overall view students and parents can see from the Kiddom dashboard. Clicking on individual assignments within a standard reveals comments and the rubric used to assess the work. In this way, students know what they need to do in order to improve their mastery.
Another neat feature that Kiddom provides is the ability to teach using Playlists. Playlists support student choice (as I’ve previously explored). Playlists can be used to offer multiple ways to learn and show what you know. Additionally, they can be assigned to specific students based on need or interest. One way I can envision using this tool in the Language Arts classroom is to support digital literature circles where each group is reading a different novel.
Are there any drawbacks?
The level of teacher investment and interest in the Kiddom system will likely impact how successful the tool is in supporting student ownership and individualized learning.
Kiddom offers a powerful set of tools if fully utilized. If not fully utilized, I can see this being a glorified online rubric system. For instance, if a teacher is only checking a box on a generic pre-populated rubric and not providing any comments or additional support/differentiation, it’s kind of like using a Ferrari only to transport your kid to soccer practice. It gets the job done, but you don’t need this tool if that’s all you want to accomplish.
To be successful, detailed feedback should be provided by the teacher along with additional support as needed. This is easily accomplished through the differentiated assignment options. Students should be given the option to refine and resubmit work. Standards-Based Grading is about a mind shift as much as it is a grade shift.
I have a dilemma. No one comes to my office hours anymore. I came to this realization years ago when I would find myself alone in my office, staring at the clock, waiting for my “shift” to be over, or filling that time with grading and lesson planning. On average, I’d have maybe one or two students come see me before the end of the quarter, and it was usually because the situation was dire. Later, I changed my approach to be more flexible: instead of fixed office hours, students could make appointments with me that better accommodated both of our schedules, approaching me either in class or via email to set up a time. For a while, this strategy worked very well to catch struggles and issues earlier on. But despite all of these efforts to be available for students, resolving major issues, addressing prolonged absences, and discussing successful study strategies are not what the typical student emails me about. Now, students email me about anything and everything.
It wouldn’t be too bad filtering through emails if students didn’t also have the expectation that professors respond to any email within 48 hours, during which all of the responsibility for investigating the question gets placed on the instructor. “I wasn’t sure what to do, I was waiting for a response from you,” is the usual response I get if I was too busy to answer a non-urgent email. It’s difficult not to become frustrated in this scenario when about 2.5 hours of my day is spent answering emails. With work-life balance considered, that means that a quarter of my workday is spent unproductively. During that time, I could have been working on assessment, lesson planning, or updating content with current research.
This is not the only email communication concern I have. At least three times a quarter, I need to gently correct students who choose to address me by my first name as opposed to my professional title, Professor Vlad-Ortiz. To their credit, once corrected, students do not repeat that mistake. What happens far more often is unclear communication and an informal tone. Emails starting with “I need you to…” or “lift my registration hold…” demonstrate a misunderstanding of the formality needed to address faculty. Rather than phrasing the request politely, it reads more like a demand. Because of the implications and expectations loaded into each of these emails, it is important to investigate and address appropriate strategies for teaching effective email communication to students.
Why is all of this important? Understanding how to properly communicate online, including email, is part of good digital citizenship. The skills of knowing email appropriateness, tone, and formality are essential to be successful in the 21st century. Though there are several other caveats to good online communication, I’ve identified three basic email communication components to help students get started in practicing successful digital citizenship.
All emails to educators, regardless of their title, should be formal. The educator-student dynamic is professional in nature, so communication should reflect that relationship. Addressing professors by their professional name not only establishes that formal relationship but, as Molly Worthen, Assistant Professor at University of North Carolina, explains, in a world where formality is on the decline, using a professor’s title helps to ensure respect regardless of the professor’s race, age, and gender (Worthen, 2017). This is particularly important considering that it is the more privileged students who tend to violate this formality (Worthen, 2017). Along the lines of respect, the tone of the email should be polite and courteous. By sending an email, the sender is asking for the professor’s time and consideration on a particular matter. Worthen brilliantly explains that requests should not sound like a text message nor like communication with a customer service representative (Worthen, 2017). As with my examples above, the professor doesn’t need to do anything, as in “I need you to lift a hold from my account” or “I need to register for your class…”; rather, the sender should understand that they are asking for a favor. As Mark Tomforde, Associate Professor at University of Houston, very accurately describes, professors are incredibly busy, so emails should truly represent issues that can’t be resolved through any other means. Using email to request anything and everything trivial is disrespectful of the professor’s time and expertise (Tomforde, n.d.). Emails should demonstrate that the sender has already taken several steps toward solving the problem on their own and should clearly define how the reader can help resolve that problem (Purdue, n.d.). Ideally, the issue should be quickly resolved through one email, and the sender should be able to distinguish when it is appropriate to talk in person, as emails should not be substitutions for real conversations (Tomforde, n.d.).
Role of the Educator. According to the ISTE standards for educators, the role of the educator is to “…inspire students to positively contribute to and responsibly participate in the digital world” (ISTE, 2017). The key words in that definition are “positively contribute” and “responsibly participate”. The issues addressed above indicate that there is a weight to the actions and intentions set forth in email and other online communication. The responsibility of the student is to create communication that is both framed positively and courteously while taking responsibility for the resolution of the email’s request. One of the indicators for this ISTE standard charges educators to “create experiences for learners to make positive, socially responsible contributions and exhibit empathetic behavior online that build relationships and community” (ISTE, 2017). Relationships and community rely on the actions of many in order to be successfully built. In building a healthy online community, we can’t expect students to just know how to behave and communicate properly. Skills are not intuitive and should be taught. In order to address this ISTE indicator, I’ve compiled three solutions or strategies that can be used to reverse the current culture and promote good digital citizenship for our students.
1) Professor Modeling. Teaching digital citizenship is a shared responsibility, so it is important for educators to actively address and model proper practices on a regular basis (Crompton, 2014). In addition to using good email etiquette when communicating with students, professors should give students opportunities to explore and practice good etiquette. This can be achieved through explicit instruction. For specific examples, Helen Crompton provides three scenarios of how digital citizenship can be modeled by professors in the classroom. Another example is an activity Mrs. Jizba created in which she has students write two emails, one to a friend and one to their principal. She then engages the students in a conversation about what content, tone, and word choice are appropriate in each scenario. This simple activity clearly demonstrates how students establish the norms of good digital citizenship through modeling and practice.
2) Explicit language in department handbook that is then repeated in syllabi. Just as there are codes of conduct at each institution, departments should include standards of conduct for online communication. In order for these standards to have impact, each faculty member should mirror these standards in their syllabi. Through these collaborative efforts, the message of appropriate online communication is clear and consistent. Both Worthen and Tomforde share their guidelines to help with standard development.
3) Holding students to expectations. Just as important as modeling and creating language in department handbooks and syllabi is holding students to those expectations. That means addressing any violations in a gentle and professional manner. For example, when students address me incorrectly, I respond with, “We are a formal institution and ask that students address all faculty by their professional title; in my case, you would address me as Professor Vlad-Ortiz. Please know that I am telling you this not to reprimand you or make you feel bad, but simply to let you know of our institution’s professional standards so that you avoid potentially offending faculty in the future.” As Worthen concludes, it’s all about treating students as adults (Worthen, 2017). As educators, we prepare students for the real world. If we do not hold students to these expectations, they will not be prepared for their future professional lives.
The International Society for Technology in Education (ISTE) Standards for Educators call (in indicator 5b, Designer) for us to “design authentic learning activities that align with content area standards and use digital tools and resources to maximize active, deep learning.” Designing technology-enhanced instructional materials for the Next Generation Science Standards (NGSS) can be a complex task, so it is best to use a guide. The makers of the NGSS designed the Educators Evaluating the Quality of Instructional Products (EQuIP) Rubric to assess and inform the development of science lessons and units. Educators and curriculum developers can use the EQuIP Rubric criteria as a guide to enhance science instructional materials with technology where it is most effective.
In this post, I select EQuIP criteria from each of its three sections (I. NGSS 3D Design, II. NGSS Instructional Supports, and III. Monitoring NGSS Student Progress) to examine possible technology enhancements for science lessons or units. The EQuIP Rubric may also be used as part of the Primary Evaluation of Essential Criteria (PEEC) for NGSS Instructional Materials Design in evaluations of year-long programs or programs that span multiple grade levels. One useful aspect of the PEEC is its “less” and “more” format, which compares traditional science instruction to ideal NGSS instruction. I will borrow this format to compare how technology is often used in the classroom to how it should be used within each selected criterion.
I. NGSS 3D Design
A. Explaining Phenomena/Designing Solutions: Making sense of phenomena and/or designing solutions to a problem drive student learning. (EQuIP Version 3.0, 2016, p. 2)
Less: Prefabricated models for students to examine, without opportunities to critique the model and create their own. Some digital instructional materials for science are visually beautiful and scientifically accurate, but leave nothing for the student to create on their own (see previous post about coding science models). If our goal is to get students to think about science phenomena through modeling, we, as educators and curriculum designers, should not do all of the thinking and modeling for them, leaving students to interact with our products only as consumers.
More: Opportunities and supports for students to observe phenomena and to critique and design their own models. This strategy is more effective, and often more difficult for educators to design. Technology can provide experiences for students that traditional science instruction cannot, and should be used alongside real-world observations and systems modeling. For example, a computer model can represent a phenomenon that would otherwise be impossible for a middle school student to observe (e.g., the Earth, Moon, Sun system from outer space). Students should be given a variety of opportunities (multiple modalities) to make observations and design models (e.g., observing phases of the moon, creating “hands-on” models with spheres and a light source, or creating their own computer model).
II. NGSS Instructional Supports
E. Differentiated Instruction: Provides guidance for teachers to support differentiated instruction by including appropriate [supports and extensions for students.] (EQuIP Version 3.0, 2016, p. 2)
Less: Traditional, one-size-fits-all instruction in digital format, with suggested differentiation strategies.
More: Differentiation built into digital curriculum. Differentiation is an area where the technological potential is great, but the curriculum design and implementation lags far behind. Computer software’s great advantage over textbooks is wasted if it is not designed to be differentiated, dynamic and supportive. Ideally, multiple levels of support and extensions would be built in so that students can access them when needed.
Newsela provides a good example of how science texts can be adapted to meet the needs of students at different reading levels. Newsela also shows us that if curriculum is to be truly differentiated, then instructional materials must be designed to be far more robust. Such reading supports need to be designed and written into the curriculum before it reaches the classroom.
The same is true for supports like charts, graphs, audio narration, illustrations, animations, additional examples, and additional practice questions. A truly differentiated curriculum could be designed like a “choose-your-own-adventure” book that offers different pathways for students to reach the same learning goal. For such a curriculum to be successful, instruction must be available for all students along the way. Here is an opportunity for “flipping the flipped classroom” (Watson, 2017), where instructional videos are available to students when they need them.
While it would be best to organize and access the instructional materials for this type of differentiated curriculum online, students’ learning experiences and the products of their learning should not all live exclusively online. Online instruction should guide both online and offline learning experiences, like student-to-student dialogue and debate (online with students in other classrooms; offline with students in the same classroom), hands-on science experiments, engineering challenges, and outdoor learning experiences.
Admittedly, designing a curriculum with greater complexity takes more planning, design work and professional development than a traditional, one-size-fits-all curriculum. Educators, curriculum designers and school administrators should take up this challenge so we can adequately serve 21st century science students.
III. Monitoring NGSS Student Progress
F. Opportunity to Learn: Provides multiple opportunities for students to demonstrate performance of practices connected with their understanding of disciplinary core ideas and crosscutting concepts and receive feedback. (EQuIP Version 3.0, 2016, p. 3)
Less: Biased tasks that favor some learners over others as summative assessments. We know that students learn in different ways, but, too often, we offer only a single means (modality) by which they can demonstrate their understanding.
More: Variety in assessment tasks, reflecting learning experiences in multiple modalities. As we vary the learning process, we must vary the assessment process accordingly. Just as curriculum supports would need to be more robust to support both students and teachers in a curriculum with more options, so too must assessment supports be expanded and improved to facilitate the monitoring of student progress.
Adaptive learning software can help provide some of the many pathways students take. In essence, the software guides them on their path and can provide supports along the way. Adaptive learning software in science might present a student with an excerpt from an article, a diagram, or a video clip when the student needs further explanation, or with a supported extension when the student has demonstrated mastery of a concept. The primary limitation of adaptive learning software in NGSS classrooms would be its reliance on multiple-choice questions, whereas the focus of the NGSS is on constructing sound scientific arguments and solving problems.
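The branching just described can be made concrete with a minimal sketch. This is not the logic of any particular product; the resource names, thresholds, and update rule are all illustrative assumptions.

```python
# Minimal sketch of adaptive-branching logic: after each check for
# understanding, the software picks the next resource based on the
# student's running mastery estimate. All names and thresholds are
# hypothetical, chosen only to illustrate the idea.

def next_resource(mastery_score: float) -> str:
    """Choose a follow-up resource from a mastery estimate in [0, 1]."""
    if mastery_score < 0.4:
        return "article_excerpt"      # re-teach with a different modality
    elif mastery_score < 0.7:
        return "labeled_diagram"      # targeted clarification
    elif mastery_score < 0.9:
        return "video_clip"           # reinforce before moving on
    else:
        return "supported_extension"  # student has demonstrated mastery

def update_mastery(mastery_score: float, answered_correctly: bool) -> float:
    """Nudge the mastery estimate up or down after each response."""
    step = 0.15 if answered_correctly else -0.2
    return max(0.0, min(1.0, mastery_score + step))
```

A struggling student (low estimate) gets re-teaching material, while a strong one is routed past the multiple-choice loop toward an extension, which is exactly where richer NGSS tasks could be attached.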
If students learn a chemical process like photosynthesis through physical movement, dance, or song, there should be an option for them to demonstrate their knowledge of a complementary chemical process (cellular respiration) in a similar fashion. Supports for these assessments would include rubrics (perhaps dynamic digital rubrics, where students choose how they will demonstrate their learning and share their proposal with teachers for review/approval) and instructional supports (as mentioned above with “flipping the flipped classroom”) focused on the chosen medium (for example, if a student chooses to make a video, video editing tutorials should be available). Allowing and supporting student choice with thoughtful and robust digital curriculum design will increase student engagement and learning.
This winter quarter (2018) I co-taught a course called Learning with Technology at Seattle Pacific University. It was a course for pre-service teachers from a variety of disciplines. We had students who were already teaching and going back for their Master’s degrees, some who were student teaching, and others who were somewhere in between. The description of the course was as follows:
This course addresses research and promising practices related to how to use technology effectively for student learning. During this course we will address the ISTE Standards for Students, focusing on how students use technology for creativity and innovation, communication and collaboration, research and information fluency, critical thinking, problem solving and decision making, digital citizenship, and technology operations and concepts.
After meeting with my co-instructor, Dr. David Wicks, we set out to design a course that addressed the new student standards and exposed students to one of the many tech integration tools they could use to evaluate the quality of technology-enriched instruction.
The QUEST model
The QUEST model (Wicks, 2017) was used to design the learning modules related to the Student Standards. This model is inquiry-based and aligns nicely with Student Standard 1, Empowered Learner, which asks students to “leverage technology to take an active role in choosing, achieving and demonstrating competency in their learning goals, informed by learning sciences” (ISTE, 2016). We posed questions from the standard each week and asked students to come up with, and post, their own question to a discussion board in the Canvas course. Since they came from such diverse experiences, we asked them to focus their questions on their own content-area specialties so that the questions would be relevant to them and the resources they found would be more useful in their daily practice.
During the Understanding phase, they were asked to do their own research to answer their question. Not only did they have to practice digital literacy skills such as searching and curating content, but they were also actively engaged in the Knowledge Constructor student standard as they “made meaningful learning experiences for themselves and others” and “critically curated a variety of resources using digital tools” (ISTE, 2016). Once they’d found research and resources that helped them answer their question, they posted them to a discussion board so that their thinking was made public to the group. Because they were making that learning public, they were also experiencing Student Standard 2, Digital Citizen, because they needed to cite their sources and “employ effective research strategies.”
In the Educate phase, they interacted with each other’s posts and posed more questions or suggested solutions. We also tried to hold a Google Hangout every other week so that students could learn more about the standard and ask questions of each other and the instructors. This was meant to be a collaborative learning process. It allowed them to stretch their thinking outside their subject-area expertise and to give feedback or share resources that would help others answer their questions. This kind of collaboration is part of Student Standard 7, Global Collaborator, which asks students to “use digital tools to broaden their perspectives and enrich their learning by collaborating with others” (ISTE, 2016).
After this step, they were asked to share their final solution and related resources that addressed their question. They did this on a personal blog that they were using as an ongoing portfolio of their work. They had been put into PLC groups to create job-alike groups (as much as possible), so they only had to comment on three other classmates’ posts, but they were expected to comment on each other’s final blog posts. This was a way for them to teach others and share their work with a larger audience. It helped them experience Student Standard 1b, “Students build networks,” and practice Standard 6, Creative Communicator, as they had to “communicate clearly and express themselves” (ISTE, 2016).
Reflection on the Model
Having experienced the QUEST model as a student myself in the Digital Education Leadership program, I will say that this kind of learning has a steep learning curve for some students. Many of us come from traditional teaching and learning models and are used to being told what to think by the instructor. This model asks students to truly take ownership of their own learning. There are rubrics and checklists in place to help guide the structure of the posts and the type of information that should be included, but the question is unique to each student. For some, this was a little unsettling at first, and they asked a lot of questions that were mostly variations of “Am I doing this right?” Eventually, they settled in. Both David and I had vacations early in the quarter, which, unfortunately for the students, meant that they didn’t get timely feedback on their first module. In retrospect, the first module is the most vital one to respond to quickly so that students are confident they are meeting expectations. I wish I could go back and do that differently.
As for the model itself, it was very freeing for me as a student to focus my energy on the questions that interested me or were relevant to my job as a coach. I think that the students in our class valued that flexibility as well by the end of the quarter. We did run into some difficulties finding a good tool for students to interact with the recordings of the Google Hangouts if they were not able to attend live. They commented on the recordings in YouTube so everyone could see their questions, but it was not ideal. It would be nice to find something that could be embedded in Canvas to keep everything in one place. The other challenge was with Canvas itself: it did not make it easy for students to provide evidence that they’d commented on each other’s blogs. We had them cut and paste their comments into a Canvas group discussion, but that added an extra step for the participants.
The Individual Project – Evaluating Technology Integration
One of the challenges with technology in my own district right now is not necessarily getting teachers to use technology, but getting them to use it purposefully, with good-quality design that engages learners and moves away from simply recreating traditional learning experiences digitally. The largest part of my work in this class was designing the individual project. We wanted our students to have the experience of looking at a lesson through the lens of technology and evaluating it against a model. We chose the Triple E Framework (Kolb, 2017) partly because I like the simplicity of its rubric and questions, and partly because it focuses on developing quality learning experiences that have a technology component rather than making the technology the centerpiece of the lesson.
The project was divided into five phases. The first phase was to find a lesson to evaluate using the rubric. We left the selection pretty wide open and encouraged them to find something in their content specialty area or something they had used, or were thinking of using, with their students.
The next three phases asked students to use the rubrics for Engage, Enhance, and Extend, the three E’s in the Triple E model, to evaluate the lesson based on the three questions central to each area. They used Kolb’s None, Somewhat, and Absolutely scoring system and then wrote a reflection on what they learned about their lesson, using the rubric and some guiding questions that I posed.
The final phase was to put it all together, tabulate the scores and write a reflection on the process of lesson evaluation using this model. We also asked what, if any, suggestions they would make about technology tools, resources or strategies they might use to improve the lesson.
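The tabulation step is simple enough to sketch in a few lines. Note this is my own illustration of the tallying, not Kolb’s official scoring tool: the point values and the example lesson scores below are assumptions for demonstration.

```python
# Sketch of tallying Triple E scores: three questions per E, each scored
# None = 0, Somewhat = 1, Absolutely = 2, for a possible total of 0-18.
# The point mapping and sample scores are illustrative, not official.

POINTS = {"None": 0, "Somewhat": 1, "Absolutely": 2}

def triple_e_total(scores: dict) -> int:
    """Sum the scores across Engage, Enhance, and Extend."""
    return sum(POINTS[answer]
               for component in ("Engage", "Enhance", "Extend")
               for answer in scores[component])

# Hypothetical evaluation of one lesson
lesson = {
    "Engage":  ["Absolutely", "Somewhat", "Somewhat"],
    "Enhance": ["Somewhat", "None", "Somewhat"],
    "Extend":  ["None", "None", "Somewhat"],
}

print(triple_e_total(lesson))  # prints 7
```

Seeing the per-component subtotals side by side is also useful for the reflection phase: a lesson that scores well on Engage but poorly on Extend suggests a specific direction for improvement.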
Reflection on the Project
I discovered a couple of things. First, they needed an example, a checklist, or some more explanation of what type of lesson to look for. Some students chose lessons with very little technology included in the first place. When evaluating with the Triple E rubric, they were hesitant to score those lessons low in areas that asked for technology, or found a number of areas they could not score at all because the lesson did not include enough technology. If I taught this again, I would add supports to that phase to help them pick a lesson that would make the job of evaluation a little easier.
Overall, they did a nice job reflecting on the lesson and the process. I am not sure, however, that they had enough tools in their toolbox to adequately suggest tools and enhancements for their lessons. Although Dr. Wicks often included a tool in our Google Hangouts that was related to the standard, it might be worth working some more hands-on practice with tools into the course. It would give them a broader knowledge of the types of tools available to them, whether teaching to the standard or redefining a learning experience. I am not quite sure how to do that without making the time spent on the class unreasonable, but students could create artifacts with various digital tools to show their learning around the solution to their question, as opposed to just a written reflection in their blog post.
ISTE | Standards For Students. (2017). Retrieved from https://www.iste.org/standards/for-students
Kolb, L. (2017). Triple E Framework. Retrieved from https://www.tripleeframework.com/
During this week’s module we looked at two educator standards from ISTE. The first was Standard 5, “Designer: Educators design authentic, learner-driven activities and environments that recognize and accommodate learner variability,” and the second was Standard 7, “Analyst: Educators understand and use data to drive their instruction and support students in achieving their learning goals.” After reading these standards, I was drawn to the Designer standard, and I wanted to know: “What digital tools are available for kindergarten students that allow for personalized learning?”
With this question in mind, I started my quest to find an answer, and that’s when I found the article “Top 9 Must Have Personalized Learning Apps, Tools, and Resources” by Matthew Lynch on The Tech Edvocate. In it, the author lists nine personalized learning apps that are available to teachers, students, and parents.
What is personalized learning and what does it look like?
What is personalized learning? It can look a little different in every school, in every classroom, and for each student. Teachers create lessons that are challenging without being too hard, and that suit the individual interests of each child. When teachers face 25 or 30 students, if they teach in the conventional way of standing in front of the room and lecturing, they must provide a lesson that will benefit the majority of the room — the average student. That leaves behind those who are below or above expectations. The best teachers find ways to help those students, but it’s a tall order; it takes a superhuman amount of work to find lessons that fit each student every single day. Educational computer programs can identify specific weaknesses in a child’s skills, such as understanding analogies or adding fractions. Teachers can review these outcomes daily, then assign lessons to each student according to his or her needs — for the next time they log on. The computer system does this by constantly assessing how a particular student answers questions and what kind of lesson most engages that student. In a classroom, personalized learning can take on different forms. In a kindergarten classroom, students might use a math program during rotations that adjusts itself to each student’s learning level and helps students grasp skills they are missing. For older grades, it might look like giving students an array of different, personalized tasks. However a teacher structures a student’s personalized learning, I think it’s important to remember that all students shouldn’t be required to show their learning the same way, and digital media open up a host of possibilities beyond the traditional essay, poster, report, or quiz.
Knewton has been around for a while compared to other personalized learning resources. The company uses learning analytics to track past performance and modify future curricular experiences based on that performance. Knewton actually provides the course materials and gives recommendations to students about what to study and to teachers about what to help students study.
Classkick is an iPad app that allows the teacher to see all of the students’ screens as they are working on a problem. Teachers who are in the classroom can use this data to tailor the help they give students. Teachers who are online can use the data to create curated blog posts for the class based on where students are having trouble and can set up individual help sessions with students.
The current national math competencies expect students to solve problems and use critical thinking, which can’t happen before basic fluency is achieved. Reflex is a platform that teaches math fluency using games. As students complete games, they are marked as competent for the math facts they have memorized. A green circle fills up to show students when they have spent enough daily time on Reflex. Teachers and parents get weekly reports on student progress.
There is a saying that the best way to know if you know something is to explain it to someone else. Explain Everything does that and more. It is an excellent tool for creating and designing presentations, forcing students to articulate their understanding, and collaborating with their peers.
Students vary significantly in their reading ability, and it is hard for them to make meaning of a text that is outside what they can read on their own. Newsela affords a personalized reading experience with information from reputable sources such as the History Channel and The Guardian. Analytics are provided to the teacher based on completion and reading comprehension.
Smart Sparrow is a platform that allows for content creation, assessments, and adaptive authoring. Each student will receive an individualized learning experience based on their interactions with the software.
A well-designed personalized learning system focuses on mastery-based learning. In mastery-based learning, students stay with a topic or level until they demonstrate competency. RealizeIt brings mastery-based learning into a personalized environment where students are presented with content at their level and do not progress until mastery has been demonstrated.
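The gatekeeping logic behind mastery-based progression can be sketched in a few lines. This is a generic illustration, not RealizeIt’s actual algorithm; the threshold and topic sequence are hypothetical.

```python
# Sketch of mastery-based progression: a student stays on a topic until
# their recent success rate clears a threshold, then advances to the
# next topic in the sequence. Threshold and topics are illustrative.

MASTERY_THRESHOLD = 0.8  # fraction of recent items answered correctly

def next_topic(current: str, topics: list, recent_results: list) -> str:
    """Advance only when the recent success rate meets the threshold."""
    success_rate = sum(recent_results) / len(recent_results)
    if success_rate >= MASTERY_THRESHOLD:
        i = topics.index(current)
        if i + 1 < len(topics):
            return topics[i + 1]
    return current  # keep practicing the same topic

topics = ["fractions", "decimals", "percents"]
```

The key design point is that time is the variable and mastery is the constant: a student with a 40% recent success rate keeps practicing the same topic, while one at 80% or above moves on.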
Self-regulation is a difficult skill to master. It requires subcomponents such as metacognition and time management. Summit Learning is an entire solution for personalized learning that ultimately helps students be able to be self-directed learners. Students learn content through authentic problems and projects.
Students need personalized learning when it comes to classroom management in addition to instruction. Class Dojo provides teachers with a platform to track student behavior and assign positive and negative remarks. In addition, teachers can send instant messages to specific parents and share photos from the class. Students can choose their own avatar.
Personalized Learning through a Kindergarten Lens
Of all the recommended digital tools for personalized learning, Class Dojo seems to be a great fit for a kindergarten classroom. Class Dojo isn’t just for classroom management; students can also create digital learning portfolios. With these portfolios, teachers can encourage students for any skill or value — whether it’s working hard, being kind, helping others, or something else. The program also gives students a voice by allowing them to showcase and share their learning by adding photos and videos to their own portfolios. One of the many benefits of Class Dojo for a kindergarten classroom is how easy it is for students to post to their portfolios.
Once students have posted to their portfolios, the teacher can review their work and give instant feedback.
The video from Class Dojo’s website gives a quick peek of how it is used throughout a classroom and shows different features that can be used.
Dobo, N. (2017, January 25). The Growing Role of Technology in Personalized Learning. Retrieved May 5, 2018, from https://www.kqed.org/mindshift/47376/the-growing-role-of-technology-in-personalized-learning
Lynch, M. (2017, August 13). Top 9 Must Have Personalized Learning Apps, Tools, and Resources. Retrieved May 5, 2018, from http://www.thetechedvocate.org/top-9-must-personalized-learning-apps-tools-resources/
I begin this post with two hypotheses about online learning, based on my experience as a community college composition and humanities teacher in both face-to-face and digital formats, and on my experience as a graduate student who has taken digital courses from two public research institutions and one private university. The first hypothesis is that the ratio of nontraditional to traditional students is greater in digital than in brick-and-mortar formats. The second is that, despite the prevalence in #edtech online instructor training of “frameworks,” lists of “best practices,” and available technologies, most college teachers and institutions implementing online learning formats could do more to align online course design and instructor behavior with cognitivist and constructivist learning theory, with empirically verified pedagogical strategies, and with systematic piloting and review of digital innovations at the course and program level.
At the intersection of these two hypotheses is the crux of the matter: if a higher ratio of first generation, English language learner, adult, and other nontraditional students is enrolling in online (defined here as any combination of synchronous or asynchronous learning) courses, and the shift to teaching in digital spaces and/or new technological innovations requires teachers to develop new communication, technology and pedagogical design skills while amplifying the negative effect of the lack of such skills, do resulting negative impacts on student retention and motivation create a significant disparate impact for nontraditional students? On the flip side, how can implementation of learning theories such as andragogy and social constructionism, together with more evidence-based review of digital teaching approaches, result in increased success for the less traditional student population that tends to take online courses?
A number of studies confirm that nontraditional students comprise the majority of online learners (Chen, Lambert, & Guidry, 2010; Thompson, Miller, & Pomykal Franz, 2013). Teachers and institutions that create online learning experiences thus need to consider the assumptions of adult learning theory, such as that adult learners have and use more personal experience in learning; maintain responsibility for their own learning and resist situations where learning appears to be dispensed or controlled by others; and are more intrinsically than extrinsically motivated (Holton, Swanson, & Naquin, 2001), as well as issues of diversity such as adaptation, acculturation, identity formation, and diverse practices and understandings of knowledge acquisition and demonstration (Nielsen, 2014).
I would like to posit that a question as complex as how to build capacity at the course and institutional level for supporting such learners in online formats cannot be addressed effectively without systematic analysis, design, and evaluation of possible solutions that consider not only what innovations may work but why they work and whether they are scalable.
In this post, I respond to recent literature seeking to put principles of learning theory into conversation with systematic qualitative analytical approaches to problems of student engagement by suggesting that other educators join in early implementation of these models as well as in systematic review of results. I’ll discuss two research-based frameworks for online course design and one for course/program review.
Chen, P., Lambert, A., & Guidry, K. (2010). Engaging online learners: The impact of web-based learning technology on college student engagement. Computers & Education, 54, 1222-1232. doi:10.1016/j.compedu.2009.11.008
Ford, C., McNally, D., & Ford, K. (2017). Using design-based research in higher education innovation. Online Learning, 21(3), 50-67. doi:10.24059/oli.v%vi%i
Holton, E.F., Swanson, R.A., & Naquin, S.S. (2001). Andragogy in practice: Clarifying the andragogical model of adult learning. Performance Improvement Quarterly, 14(1), 118-143. Retrieved from https://onlinelibrary.wiley.com/doi/abs/10.1111/j.1937-8327.2001.tb00204.x
Nielsen, K. (2014). On class, race, and dynamics of privilege: Supporting generation 1.5 writers across the curriculum. In Zawacki, T.M. & Cox, M. (Eds.), WAC and second-language writers: Research towards linguistically and culturally inclusive programs and practices (pp. 129-150). Retrieved from https://wac.colostate.edu/books/perspectives/l2/
Redmond, P., Abawi, L., Brown, A., Henderson, R., & Heffernan, A. (2018). An online engagement framework for higher education. Online Learning, 22(1), 183-204. doi:10.24059/olj.v22i1.1175
Thompson, N., Miller, N., & Pomykal Franz, D. (2013). Comparing online and face-to-face learning experiences for non-traditional students: A case study of three online teacher education candidates. The Quarterly Review of Distance Education, 14(4), 233-251.
For this week’s inquiry, I initially considered spending time exploring HyperDocs, a tool I had previously dabbled in. My students really liked its creative, digital presentation. Yet my professors challenged me to consider whether a HyperDoc is really just a glorified worksheet. While exploring one of my favorite educational blogs, Cult of Pedagogy, I came across a post featuring a tool that is as engaging as a HyperDoc but has more potential for personalization, independence, and differentiation. That tool is the Learning Playlist, the brainchild of teacher Tracy Enos. Jennifer Gonzalez interviews Enos and shares examples of what Learning Playlists look like in her post, “Using Playlists to Differentiate Instruction.”
Figure 1: Playlist for Argument Writing by Tracy Enos, full Google Doc available here
What is a Learning Playlist?
The most basic description of a Learning Playlist is “an individualized digital assignment chart that students work through at their own pace” (Gonzalez, 2016). Tracy Enos developed the idea for Learning Playlists when she grew frustrated with trying to meet all students’ needs with a single lesson. She’s not alone in rejecting the one-size-fits-all approach. In fact, “[N]early 50% of the students in today’s classrooms have some form of learning diversity that impacts how they learn best” (Digitalpromise.org, 2016). This diversity includes differences in background, history, culture, linguistics, and socioeconomic status (Digitalpromise.org, 2016). Given these needs, it’s no wonder teachers are turning to technology to help meet students’ individual needs. Playlists are one tool that can foster independence, allow for choice, and differentiate based on need. Playlists also take the responsibility for learning and place it in the hands of students.
How is a Learning Playlist created?
Consider the many elements that go into creating a successful unit plan in any content area. There will be guiding questions, lessons, content to review, formative assessments, discussions, and perhaps articles or film clips. In a traditional classroom model, all of that learning takes place at the same time. Each student reads the same short story at the same pace. One day is dedicated to completing one set of questions. Test day is the same for everyone, regardless of need. This model typically meets the needs of those students in the middle of the spectrum while leaving some students struggling to catch up and others bored because they’ve finished early. Meanwhile all students have a low degree of choice and ownership. It’s passive learning.
What if you took the same essential elements of your unit plan and digitized them instead? Lessons could be bookmarked for review, whether through a slideshow, screencast, or other tool. Formative assessment could occur through ActivelyLearn, EdPuzzle, or Google Forms. Discussions could still happen via Padlet, Google Groups, or Slack. Students could work at their own pace, independently. With this newfound independence, your time is freed to assist struggling learners or to conference individually with students.
It’s a pretty revolutionary way to consider teaching. Yet you can still have deadlines and require students to check in daily or weekly. Learning can be reflected on via whole-group discussion days. (I’m an enormous fan of Socratic Seminars.) Students can still work in groups to meet the learning goals. You can even opt for a blended model where some of the lessons are given whole-group, and then students work individually on a Playlist based on their needs.
How can Learning Playlists support independent learning?
Within the Playlist format, there is plenty of room to support independence and choice. Though Enos doesn’t mention the addition of student choice, I can easily see how it can be incorporated into Playlists.
Choice of content: have students choose their own short story to apply plot skills to or Civil War battle to research and describe.
Choice of task: have students choose which tool they want to use to ‘show they know’; perhaps an Explain Everything screencast demonstrating the steps of an algebra problem, or a Storyboard That digital story about the rock cycle.
Choice of question: have students come up with their own essential question and use a Playlist to guide them through the inquiry model.
How can Learning Playlists support differentiation?
Differentiation within Playlists can be accomplished without drawing attention to those students who need extra help. Because Google Classroom lets you assign work to individual students, no one knows they have a different version. Another option is to provide links to the leveled Playlists and let students self-select.
Pacing: Students can replay or accelerate a lesson as needed.
Leveling: For each Playlist, you can create a Level 1, 2, and 3 Doc/Slide to better meet student needs.
Personal Support: Enos leaves several tasks on her students’ Playlists marked ‘to be determined.’ She then goes back and adds extra resources or practice as needed based on the work students submit.
Digitalpromise.org. (2016). The growing diversity in today’s classroom. Retrieved May 4, 2018, from http://digitalpromise.org/wp-content/uploads/2016/09/lps-growing_diversity_FINAL-1.pdf
Gonzalez, J. (2016). Using playlists to differentiate instruction. Cult of Pedagogy. Retrieved May 1, 2018, from https://www.cultofpedagogy.com/student-playlists-differentiation/