Tuesday, October 27, 2015

Week 10: BABR 10/11, J 9, Castek

BABR 10 talks about the fact that different media (mediums?) are appropriate for different forms of feedback and assessment. For instance, some digital tools are most appropriate for students' self-assessment, and some are more suited for feedback from peers and teachers. What are some tools that you can think of that would be useful for each of these types of assessment? 
I love the idea of doing reader-based responses to provide feedback to and among students. I feel like this would be especially helpful in digital environments, where writing incorporates text as well as other components, like images and video, that need to be integrated seamlessly. I also think it's especially useful with digital writing because there are so many digital tools that allow for easily sharing these responses--one that comes to mind is VideoNot.es, which would allow students to annotate videos (or any sort of audio or written work that could be turned into a video) with their thinking. If you had students do peer review with reader-based responses, what guidelines would you give them, or how would you model it for them?

One of the teachers in the Castek article discussed the challenge of creating assessments for online comprehension and research that are "simultaneously authentic and motivating" (548). How do you think we can best accomplish this when we assess students with digital texts? 

What form of digital assessment do you think would best suit your current or future teaching situation, and what purpose would it serve for you? Consider whether it would be more suitable for you to provide static electronic feedback (intertextual vs. marginal/end commentary) or dynamic electronic feedback (asynchronous board discussions, recorded oral feedback, synchronous chats). If I had to pick one, I think that I would choose intertextual static electronic feedback. At the high school level, students would need to do fairly print-heavy writing quite often, even with multimodal texts. I feel like that type of feedback would be really useful for me, especially if I were taking a reader-based response approach, to tell them my thoughts at different points during my reading. I think that makes it easier for students to pinpoint specifically what they might need to work on, since they see where I had those thoughts, even if my comments remain fairly overarching.

BABR 11 discussed digital systems for automatically evaluating students' writing. I am really wary of them, especially considering the concerns listed in the chapter. Do you think these automatic evaluation tools could ever be viable methods of assessment? If so, how would you implement them in your teaching situation?

Have you ever used a program or service for maintaining records and data related to students' assessments over time, or an online service for building student portfolios? This year, my school implemented MasteryConnect. I really like it for its data analysis--it lets me tie individual test questions to academic standards so I can see students' progress with each standard over time. The problem is that it only really allows for multiple-choice assessments, and there isn't much of a way for me to enter scores for other types of assessments. It's useful for rote knowledge related to language arts concepts, but I would ideally like something a bit more comprehensive and literacy-focused and less test-prep focused.

Looking through some of the rubrics in chapter 9 of Johnson, I can estimate about where my students would fall within the dispositions discussed in the chapter. I think that a lot of the types of assessments that the chapter discusses would be useful for helping students improve their new literacies skills through reflection, and the rubrics would be most useful for monitoring their progress. Which of these assessments do you think would be best for promoting growth and progress with students' new literacies dispositions? Personally, I like the idea of using checklists like the ones provided in the chapter. I feel like it creates a tangible tool for students to use to monitor their own behaviors, as well as showing them how they are doing.

16 comments:

  1. After reading BABR 10, I'd say that the most useful tools for feedback assessment that I use right now are the comments on our kidblog. I model how to comment on someone's writing to begin with and teach my students to comment on two peers' writing pieces by giving a 'glow' and a 'grow'. A 'glow' is something specific that reviewers noticed writers did well. For example, students have written 'I like the characters you made' and 'Your title is creative' before on each other's stories. A 'grow' is something reviewers notice writers need to work on. Examples of this are 'check your spelling' and 'don't forget capital letters'. Using the 'glows and grows' model of teaching peer feedback has been useful, and I hope to have students expand on that even more, so that the feedback becomes more authentic with time, as the teacher in Castek brought up! If only we had more tech time to do so lol!

    1. I like that method for constructive feedback! Do you feel like having the access to technology aids in the feedback and reflection process in ways that non-digital forms lack?

    2. I do, but I'm finding that they still need to vocalize or discuss their feedback first. When I am giving feedback, though, it seems to motivate them to make necessary changes more so than non-digital feedback does. I think they enjoy knowing that they are writing for an audience (me) and look forward to it after a while!

  2. One program I've used before to keep and help our class track data is www.jumpro.pe - it is a lot of work to input and track the data at the elementary level, since we teach so many standards, so I just input our summative benchmark data and can see which standards students need the most work on. It is a cool program, though!

    1. Our math data is organized in a similar way, Breanna. It is a great way to identify student strengths and weaknesses regarding certain skills or strands. Using the online data can show you what you need to reteach whole group or what you need to focus on during small-group instruction. I wish I had something similar for literacy data. However, because I have to input literacy data into so many places (data notebooks, report cards, portfolios, etc.), I would be hesitant to try it at this point.

  3. When assessing comprehension of digital texts in an authentic and meaningful manner, I think it is important to consider how students can demonstrate what they know through projects. If students conduct online research, a powerful way of finding out what they understood from the digital text is to have them create a product that demonstrates their knowledge. For example, a teacher can give students a choice board of options like making a poster, writing a letter, or creating a video. These projects can be collaborative or individual and can be done through traditional or nontraditional means.

    1. Very good point. So if you had them research mammals, for example, and helped guide them to relevant digital texts on their level, they could show what they learned in a digital story, a screencast, a blog post, a bubble map, etc. Giving feedback is where I feel like it can get tricky, though. But I do see the importance of doing it; otherwise, the use of digital literacies in creating something to show their knowledge was almost for nothing, because it makes the technology an end result instead of something utilized to incite deeper thought. If that makes sense...haha!

    2. I'd like to do that! I think it's especially conducive when working with digital tools to learn content area knowledge. It also helps assess multiple areas--research skills, content understanding, new literacies skills, etc. Also, good point, Breanna!

  4. (On an unrelated note, I'd like to point out that our blog now has 666 page views, just in time for Halloween...)

  5. Which of these assessments do you think would be best for promoting growth and progress with students' new literacies dispositions?

    To assess my students' learning and school dispositions, I give a survey at the end of every nine weeks. It is for younger students, so there is a statement like "I can read." Then there is a smiley face, a neutral face, and a frowny face. I could add a couple of statements about new literacies too.

    1. I do something similar and that's a great idea! What would it include, though?

      Like "I can look up a new vocabulary word in a search engine" would be a good one for my kids. Maybe "I can type my name" or "I can open the internet" would be a good one for your younger kiddos?

    2. How about adding a couple for your KidBlog? For example, you could say, "I can write my ideas on a blog." Or "I can write comments on someone else's blog."

  6. Nice set of questions and comments from the readings, Angela, and interesting ideas, Breanna and Neka. You have given me some ideas to share with the undergraduates. On a side note, has anyone communicated with Courtney in the last 2 weeks?

    1. I have not. She hasn't gotten in touch with us in a while and I really hope everything is okay! It's not really like her to not participate or respond.

    2. Yeah, I'm pretty sure we haven't heard from her in about 3 weeks now. I think we've all tried contacting her. Do you have her number, Dr. Beach? I'm getting kind of worried.
