
Reflection of Gleanings from Collecting and Interpreting Data

Aleta May

EDET636         10-26-14

Week 8 Reflection

My Research Proposal can be found at: https://aleta57.wordpress.com/2014/10/26/research-proposal-motivation-and-engagement-in-a-blended-learning-reading-environment/

When I read Sunshine’s post, I started thinking about my own observation data and why I am quantifying it as well as interpreting it. I wrote to her:

“How will you collect observation data? I’m curious, because I am familiar with some methods, but less familiar with others. Will your data be anecdotal, where you watch body language and make notes every so many minutes, or will you keep records over time to look for changes? Including interaction with peers is important, because sometimes a distracted peer creates distraction for the focus student, or the opposite: the peer helps draw the focus student in by their actions.”

By reading through Sunshine’s comments, I could identify with the feeling of, “What do I do with all that information?” and, “How can it be more focused?” Here is what I said to her:

“I agree with you that it is easy to get overwhelmed with too much data. There seems to be a balance between keeping questions in an interview focused, yet broad enough to gather unexpected findings. I’m wondering whether it is best to pick one form of data as the center of the picture, while using surrounding information from other data to interpret. I think I may be doing that when I use observation—keeping that as the center.”

As I read Thomas’ post, I reflected more on my own need to narrow my focus, and I came to a similar conclusion as when I read and commented on Ali’s post. In one case, my data needs to be useful so that I may be convincing to others when I propose change or modification to one portion of the Read 180 rotation cycle in a blended learning environment. In another case, my data collection needs to have a centralized focus (engagement/motivation), with the other data serving to interpret that focus data. I kept changing which data source to place at the center, but I keep landing on observations.

Here is what I wrote to Thomas:

“I think you’re right that a narrowed focus yields specific data. I am collecting information from whole class surveys, interviews of 6 students that I will also be observing three times each. I found questions in research that would pertain to my questions, with some modification. I’ll also analyze data from Measures of Academic Progress (MAP) assessment in the area of reading comprehension—but this will be kept global unless my observations lead me to focus on a specific part of comprehension. I am going after motivation/engagement within a blended learning environment for reading (Read 180).”

The issue I have is that in order to observe, I stop modeling for students and supporting their “independent” reading needs. Even with my being the support teacher, the lead teacher can only focus on small group instruction in this rotation model, because the student needs in that group are as great as the student needs in the independent reading group rotation. So I really need to take this observation and truly analyze it to make it valuable to our students and our teachers.

One particular researcher who really speaks to engagement is Louise Rosenblatt. Even though her focus is on reading and literature, she emphasizes a lived-through experience, not just knowing about, and the transaction between the text and the reader. The text is not interpreted on its own merits; meaning arises when the reader interacts with that text, and that interaction is engagement. Here is a chapter from a book written in 2005 that may help both of us with our research regarding how to engage students in learning: http://www.heinemann.com/shared/onlineresources/e00768/chapter5.pdf

Ali’s post led me to think along the same lines as Thomas’ and Sunshine’s posts: keep research focused and purposeful while still noticing the unexpected. Ali asked herself whether she was just adding another variable. Here is how I responded to her as I thought this through for her and for myself:

“I ask myself that question often: Am I just adding another variable? It is difficult to keep these narrowed down for me. However, I think that gathering data is gathering information that can be interpreted through many different lenses. Sometimes I find something I was not looking for, but is still very significant.”

Ali asked, “Did fluency, accuracy and expression improve or decrease?” This question, I believe, will lead the research into meaningful “why” questions. This week’s postings were helpful in that I peered into the window of others’ research, in turn creating a platform from which to view my own research.

Research Proposal: Motivation and Engagement in a Blended Learning Reading Environment


Aleta May, M.S./M.A, Ed.

University of Alaska Southeast

Research Questions

What does the pre-designed, comprehensive, blended learning model contribute to the learners’ reading skills in this specific setting? How will students build background knowledge for, and become engaged in, a book that may relate to their interests but contains many features outside their current mental schemata? Is this model more effective for students who have prior experience with a supplemental computer reading model?

Framework

My research will be based on an interpretive framework and guided by a mixed-methods approach. In an interpretive framework, one data set, primarily qualitative, is used to interpret the traditional standardized quantitative data set, although themes will also emerge from the qualitative data independent of the quantitative. Surveys of reading attitude in both digital and traditional reading settings will reveal information that is both quantitative and qualitative, bridging across data and creating a true mixed-methods research approach.

Method

Quantitative data will be derived from standardized assessments and from less formal survey responses, although survey responses will also serve as qualitative information since they evaluate attitudes and engagement. Quantitative data will be gathered and interpreted on its own merits, but more importantly alongside qualitative data from observations and interviews.

Rationale for this Classroom Research

Students in the Read 180 program have had some experience with reading online, but have mostly read in class using textbooks, worksheets, and some children’s literature. With most students being English Language Learners (ELL), it is important for them to connect with and engage in what they read. Although the materials in this reading program are set to students’ Lexile reading levels, consideration for engagement in reading is limited. The books offered in the program are varied, exposing students to a variety of genres in children’s literature; however, very few are culturally relevant. The ELL students in this group speak Yup’ik and English, and Yup’ik is primarily an oral language. Discussion of what is read, and reading material that is relevant to students’ lives, are motivating factors that are largely missing from this program. Yet the variety of trade books available to students and the novelty of a blended learning approach are motivational factors; the question is how this program can be maximized to enhance student reading motivation and growth. Engagement is a major focus in this research because engagement is key to deep reading, through both digital and traditional means.

Literature Review

Thematic Review of the Literature for Blended Learning Environments

One theme that emerged unexpectedly for me was that students’ attitudes and self-assessments need to be considered within the pool of evidence collected when the impact of blended learning, in this case on literacy skills, is investigated. In one article, the impact of learning on two groups, a blended e-learning (experimental) group and a traditional learning (control) group, was compared. The typical focus was to compare groups primarily by examining achievement data. When the pretest and posttest included the average score (five items each for cognition and skill) and a self-assessment (six items) over two midterms (a five-week interval), the authors found that “the average score on the achievement test for the experimental group was only slightly higher than that for the control group” (Chang, Shu, Liang, Tseng, & Hsu, 2014, p. 9). Taken at face value, blended e-learning achievement test scores were not significantly better than those from traditional learning. However, when the self-assessment results for the two learning platforms were compared, apparent differences caught the authors’ attention: students in the experimental group scored noticeably better in cognition especially, and in skill, but not in attitude. So the question is raised, “Why blended learning?” Given more than five weeks to adjust to a new blended e-learning environment, students’ attitudes toward this style of learning may lead to higher motivation; motivation leads to higher engagement, and engagement leads to retention of learning.

Even at the beginning college level, the issue of course engagement is being addressed through hybrid course design. There is a shift in emphasis from teacher-centered to student-centered pedagogy; this shift includes high school and is starting to reach much younger students. If students prepare before coming to class (or attending online), class time is available for students to collaborate with each other to solve problems and discuss insights. These in-class strategies increase engagement in subject matter (Foote & Mixson-Brookshire, 2014). The instructor’s role becomes that of facilitator: leading group discussion, managing and designing course content, and transmitting information from sources, between groups, or through technology such as video clips.

Side by side with the theme of engagement is the teacher’s functioning as one who promotes collaboration and knowledge sharing within well-planned frameworks. When planning for social technologies, for example, the teacher considers participation, interaction, and synthesis as vital components of planning for productive groups who collectively create and construct new knowledge (Agosto, Copeland, & Zach, 2013). With technology, time and distance boundaries disappear. Students are not always tied to the confines of face-to-face, brick-and-mortar settings or synchronous learning on a learning management system (LMS). Blogs and mobile tools such as Twitter encourage ongoing discussion that is motivational for students. Social media such as blogs encourage students to learn the perspectives of others. English language learners (ELLs) have more time to translate their thoughts and share ideas when not pressured to respond immediately. A framework will encompass teacher participation, personalizing (such as entering into a blog conversation), and simplifying or promoting learning. To be captivated, students need to feel in control and see a connection to their lives (Agosto, Copeland, & Zach). Engaging students in learning means that teachers need to avail themselves of technology that encourages integration of social technologies into their course design.

Planning instruction and gauging its impact means that students’ attitudes need to be considered, and there is a framework for quantifying their attitudes toward reading in different environments, utilizing both traditional and digital media to motivate and expand students’ use of both. The Survey of Adolescent Reading Attitudes (SARA) seeks to do just that. By self-report, students rate their attitudes toward academic and recreational reading, in print and digital media, on a “6-point scale from ‘very good’ to ‘very bad’” (Conradi, Jang, Bryant, Craft, & McKenna, 2013, p. 568).

Closely related to engagement is the use of grouping strategies to build momentum through pace and support in small-group (and sometimes individual) instruction. Technology is a logical mechanism for supporting students as they rotate between groups. There are supplemental reading programs that can focus students on learning word parts and morphemes (small units within a word that have meaning), building background knowledge through video clips, and other common, repetitive activities that free teachers to work with small groups on deeper-thinking activities (Cheung & Slavin, 2013). It is like adding an assistant to the room who motivates students through games. Comprehensive reading programs include the entire reading model, so teachers can work in small groups and check individual progress for planning purposes. When growth is noticeable, students are engaged. Small-group instruction within a blended environment is one way to meet this goal.

Read 180 is a comprehensive program that promotes and capitalizes on small group instruction. The purpose of the program is to remediate students experiencing reading difficulties beginning at the point where “a general stagnation in reading growth in the upper elementary and middle grades” (Kim, Samson, Fitzgerald, & Hartry, 2010, p. 1110) occurs. This is also referred to as the fourth-grade slump, the level at which students transition from “learning to read to reading to learn the new” (Kim et al., p. 1111). The breakdown may occur in an array of reading skill areas. Since readers may need to repeat learning, or fill in gaps in word recognition or language comprehension abilities, the program addresses that need through blended learning that combines small-group direct instruction, small-group supported independent reading, and computer practice lessons that address students’ needs individually. The article reports an experiment to find out what the effects of this unconventional blended reading program, referred to in the article as a mixed-methods intervention, would be when compared with another district after-school program. Among the other studies the article considers, it is not uncommon to find significant positive effects in third grade, but not in fifth grade, for fluency, reading accuracy, and comprehension. In this study only fourth-grade students had notable gains, which may be due to increased time spent in reading instruction. The researchers therefore separated out scores by reading area (such as reading efficiency, comprehension, vocabulary, and fluency). A finding that is very powerful when looking at the impact of engagement on comprehension is that the computer activities, which incorporate leveled text with comprehension accountability and videos that teach vocabulary, word study, and background building, produced compelling improvement.

Students in this study were not taught vocabulary that has application across content areas, because the 30-minute whole-group segment where this is normally taught was omitted. Including this segment may have produced greater gains in follow-up reading comprehension scores.

Looking beyond the Read 180 blended learning program, teacher-designed blended learning assimilates a variety of technologies that can engage students in the deeper thinking that leads to high levels of comprehension, such as blog sites, YouTube video clips, discussion boards, and students’ mobile devices (Tucker, 2013). Small-group instruction can be part of that blended model design, such as teaching students according to individual needs in mini-lessons with the teacher as facilitator. Teachers are able to maximize their instruction time because they have time to analyze student computer data and plan, without the added time usually spent grading papers (Ash, 2012, March). Data is available in real time for teachers to access for each student, giving them the flexibility to use the learning-management system to plan very direct instruction (Ash, 2012, October). To maximize instruction on a larger scale, some programs, such as Hybrid High School in Los Angeles, divide blended learning spaces into math/science and English/language arts/social studies groups (Ash, 2013). In this flexible model, individual instruction occurs through individual stations.

The blended learning plan may be set up as a flexible classroom. Flexible classrooms were developed in the Rocketship program in response to limitations noted when using a blended learning station-rotation program (incidentally, station rotation is the design Read 180 follows). According to the designers and educators in the Rocketship program, station rotation is too rigid, producing students who follow a plan but do not spend much effort learning for themselves as they would in a more self-directed atmosphere (Herold, 2014, January). In trying to expand a flexible, self-directed learning atmosphere within the Rocketship elementary program, the designers found that it was most effective in fourth and fifth grades. Further, in an article entitled “Outside Public System, Blended Models Take Hold,” Herold (2014) found that among a group of private schools, effective communication between educators was considered so essential that a consortium was created.

One way self-directed learning is effective is when students use technology to build knowledge together. This occurs when an environment is designed to augment collaboration among students and between students and teachers. Students feel safe sharing their thinking from an affective stance when parameters for discussion are set within a blended learning environment. In Hew and Cheung (2014), evidence-based practices are explored through a compilation of research to help teacher-designers find the right blend, using a framework to set an environment for deeper, self-directed learning, exploration, and discussion.

Computer-Supported Collaborative Learning (CSCL) must be planned. A questionnaire distributed to college students across five CSCL subject areas found that upholding both cognitive and social learning conditions, for individuals and for groups, is not only important to building knowledge but must be planned into the pedagogical structure when designing a course (Hernandez, Gonzalez, & Munoz, 2014). At the center is CSCL; closely connected are the teacher, the student, and the task, facilitating learning within an organization whose technology supports course and subject pedagogy. More specifically, Computer-Supported Collaborative Blended Learning (CSCBL) may be founded on scripts designed to consider “the space, the pedagogical method, the participants and the history” (the 4SPPIces). The 4SPPIces combine to develop strong, targeted learning objectives and supply a conceptual model for educators to use when communicating with each other about course design (Perez-Sanagustin, Santos, Hernandez-Leo, & Blat, 2012). Within this CSCL design, both quantitative and qualitative data can be organized and studied to guide future ventures and planning. The overarching reason for using CSCL scripts is to improve learning outcomes (Sobreira & Tchounikine, 2012). A script is a way to manage a teaching situation within a complex, multilayered setting, and it provides a framework for students to collaborate within; adapting script phases facilitates change by allowing well-thought-out script editing. When a blended learning environment is changed, the pedagogy of the original learning management system needs to be considered so that changes to planned student interaction uphold the macro-level script (Sobreira & Tchounikine, 2012).

This literature review is broad in scope, but it addresses specific concerns that need to be considered when conducting classroom research in a specific rural village setting using a pre-designed blended learning model for reading, particularly if the research leads the researcher to recommend changes to the model, or to modify and expand it to other learning environments in the seventh- and eighth-grade students’ school day. Observations, interviews, a survey, and close analysis of computer test data will be conducted with an understanding of what the research already says about the various aspects of blended learning. Questions will be answered regarding how the comprehensive blended reading program impacts student engagement that leads to meaningful growth and learning.

Method

Participants

The participants for this action research study are 22 seventh- and eighth-grade students who are in a blended model reading program, Read 180, for the first time. The eighth-grade students participated in an online-only reading program, Lexia, the previous school year. Three of the 22 are proficient in English reading, writing, speaking, and listening skills according to a standardized measure of English language proficiency; the remaining 19 students are limited English proficient. All of the students participated in a dual language or Yup’ik immersion program prior to the seventh grade. The participants’ reading skills have been negatively impacted by living most or all of their lives in a remote Alaskan village off the road system, in that their background knowledge, vocabulary, and culture do not match most books read in school. Most participants read primarily at school, with little practice at home.

During a one-hour time frame, students are divided into three small groups. Each group rotates every 20 minutes to participate in a blended learning environment: instructional software, modeled and independent reading, and small-group instruction. There are twenty-two students divided as follows:

Blue Group:

7th grade: Jamie; Sophie; Jared; Ann; Trisha; Halley; Kiarah. 8th grade: Alfred.

White Group:

7th grade: Jason; Joseph; Preston. 8th grade: N. A.; Kristen; Chelsea.

Yellow Group:

8th grade: Anna; N. K.; Alexia; Charity; Isaiah; Katya; Nelliann; Thomas.

Procedure

I will use behavior observation as an indicator of engagement in reading during the independent reading times. I will observe and interview two students from each group, since observation and note-taking timeframes will be limited. I will use an application on my iPad to collect on-task/off-task observation data; the application description and link are listed in the resources section (YoungStone Innovations, LLC, School Psychology Tools). As I tap whether students are on- or off-task, I will pause to type observation notes as notable behaviors occur. After the observation sessions, I will comment on the objective note-taking, reflecting my professional impressions of what I observed; this is called note making. In Appendix A, there is a copy of the email I sent to the lead teacher for Read 180.
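As a rough illustration, interval tallies like those the app collects could be summarized along the following lines. This is a minimal sketch, assuming simple true/false interval data; the function name, interval spacing, and sample session are hypothetical and are not part of the School Psychology Tools app.

```python
# Hypothetical sketch: summarizing on-task/off-task interval tallies
# after an observation session. All names and data are illustrative.

def percent_on_task(intervals):
    """Return the share of observed intervals marked on-task.

    `intervals` is a list of booleans: True = on-task, False = off-task.
    """
    if not intervals:
        raise ValueError("no intervals recorded")
    return 100.0 * sum(intervals) / len(intervals)

# Example: one ~8-minute observation tapped at roughly 30-second intervals,
# 13 on-task taps out of 16.
session = [True, True, False, True, True, True, False, True,
           True, True, True, False, True, True, True, True]
print(f"On-task: {percent_on_task(session):.1f}%")
```

A summary percentage like this can sit beside the typed notes, so each observation yields one quantitative figure plus the qualitative note-making commentary.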

As a participant observer, I will take notes and pictures. Pictures become artifacts that support the anecdotal notes, which will in turn aid in interpreting data. As I write notes, I believe they create a story that may provide more questions than answers for future reflection. An observation in my situation is a reflection of what occurred that day during group facilitation. One article suggests that creating a vignette, a short story with a detailed description of events, may be a way to record and interpret events in order to identify a problem and suggest action based on a theory of the problem’s basis: “vignettes provide . . . hypothetical snapshots of actual classroom situations; since vignettes are problem-based, either to propose a solution or to evaluate one given” (Jeffries & Maeder, 2011, p. 166). I think writing vignettes of my observations will help me deal with the inherent “lack of uniformity” (p. 166) that occurs when observing real students in real time.

Additionally, I am looking for a snapshot of students’ views about using traditional resources for reading versus using technology resources, and how this may apply to students’ reading within the Read 180 program. For this I will use the Survey of Adolescent Reading Attitudes (SARA), which uses a scale from 1 (very bad) to 6 (very good) (Conradi, Jang, Bryant, Craft, & McKenna, 2013). See Appendix B for a copy of this rating scale. It is a group measure, a rating scale, which takes about 10 to 15 minutes to administer. This may be given right away.
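To show how SARA-style 1–6 ratings might be summarized, here is a small sketch that averages responses by subscale. The item grouping and the responses below are hypothetical illustrations, not the published SARA scoring key.

```python
# Illustrative sketch of averaging survey ratings (1 = very bad ... 6 = very good)
# by subscale. The item-to-subscale mapping is hypothetical.
from statistics import mean

responses = {  # item number -> rating given by one student (made up)
    1: 4, 2: 6, 3: 3, 4: 5, 5: 4, 6: 2, 7: 5, 8: 5,
}

subscales = {  # hypothetical grouping of items into print vs. digital reading
    "print": [2, 3, 6],
    "digital": [1, 4, 5, 7, 8],
}

for name, items in subscales.items():
    avg = mean(responses[i] for i in items)
    print(f"{name}: {avg:.2f}")
```

Subscale averages like these would let a print-versus-digital attitude comparison sit alongside the interview and observation data.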

In Appendix C, there is a survey that evaluates the impact of the Read 180 program on student learning through their motivation to participate as well as how they see this program helping or not helping them to increase their reading skills and desire to read. This survey should be given about the middle of November.

I will individually interview the six students I observe. Question six will be reserved for eighth-grade students, who participated in both Lexia (during 7th grade) and, more recently, Read 180 (during 8th grade). These interview questions were derived from Huang (2012). This interview should be given between the middle and the end of November. See Appendix D for questions.

Also, I will analyze Measures of Academic Progress (MAP) scores from fall to winter in the area of reading comprehension to determine whether reading scores went up. The purpose is to compare how students did or did not increase standardized reading comprehension scores, and the amount of any increase, with how they feel their skills improved and how they feel about using a blended reading model. The rating scale in Appendix B, the survey in Appendix C, and the interview questions in Appendix D should be completed prior to the winter MAP test so that the MAP test will not influence the answers given on the other measures.
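The fall-to-winter comparison could be tabulated along these lines. This is only a sketch: the student labels and scores are invented for illustration, not actual MAP data.

```python
# Minimal sketch of the fall-to-winter score comparison described above.
# Student labels and scores are made up for illustration.

map_scores = {  # student -> (fall score, winter score), hypothetical values
    "Student A": (198, 204),
    "Student B": (210, 209),
    "Student C": (187, 195),
}

# Per-student change from fall to winter.
gains = {s: winter - fall for s, (fall, winter) in map_scores.items()}

# Students whose comprehension score rose.
improved = [s for s, g in gains.items() if g > 0]

print(gains)
print(improved)
```

A table of gains like this is what would then be set beside the survey and interview responses to ask whether perceived improvement matches measured improvement.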

References

Agosto, D. E., Copeland, A. J., & Zach, L. (2013). Testing the benefits of blended education: Using social technology to foster collaboration and knowledge sharing in face-to-face LIS courses. Journal of Education for Library and Information Science, 54(2).

Ash, K. (2012, March). Blended learning mixes it up. Education Week, 31(25), 1-7.

Ash, K. (2012, October). Blended learning choices. Education Week, 32(9), 1-5.

Ash, K. (2013). Spaces for blended learning. Education Week, 32(25), 1-5.

Chang, C-C., Shu, K-M., Liang, C., Tseng, J-S., & Hsu, Y-S. (2014, April). Is blended e-learning as measured by an achievement test and self-assessment better than traditional classroom learning for vocational high school students? The International Review of Research in Open and Distance Learning, 15(2), 1-12.

Cheung, A., & Slavin, R. (2013). Effects of educational technology applications on reading outcomes for struggling readers: A best-evidence synthesis. Reading Research Quarterly, 48(3), 277-299.

Conradi, K., Jang, B. G., Bryant, C., Craft, A., & McKenna, M. C. (2013). Measuring adolescents’ attitudes toward reading: A classroom survey. Journal of Adolescent & Adult Literacy, 56(7), 565-576.

Foote, S. M., & Mixson-Brookshire, D. (2014). Enhancing learning with technology: Applying the findings from a study of students in online, blended, and face-to-face first-year seminar classes. Currents in Teaching and Learning, 6(2), 35-41.

Hernandez, N., Gonzalez, M., & Munoz, P. (2014). Planning collaborative learning in virtual environments. Media Education Research Journal, 42(21), 25-32.

Herold, B. (2014, January). New model underscores Rocketship’s growing pains. Education Week, 33(19).

Herold, B. (2014). Outside public system, blended models take hold. Education Week, 33(19), 1-5.

Hew, K. F., & Cheung, W. S. (2014). Using blended learning: Evidence-based practices. Singapore: Springer.

Huang, S. (2012). A mixed method study of the effectiveness of the Accelerated Reader program on middle school students’ reading achievement and motivation. Reading Horizons, 51(3), 229-246.

Jeffries, C., & Maeder, D. W. (2011). Comparing vignette instruction and assessment tasks to classroom observations and reflections. The Teacher Educator, 46, 161-175.

Kim, J. S., Samson, J. F., Fitzgerald, R., & Hartry, A. (2010). A randomized experiment of a mixed-methods literacy intervention for struggling readers in grades 4-7: Effects on word reading efficiency, reading comprehension and vocabulary, and oral reading fluency. Reading and Writing, 23(9), 1109-1129.

Perez-Sanagustin, M., Santos, P., Hernandez-Leo, D., & Blat, J. (2012). 4SPPIces: A case study of factors in a scripted collaborative-learning blended course across spatial locations. International Journal of Computer-Supported Collaborative Learning, 7, 443-465.

Sobreira, P., & Tchounikine, P. (2012). A model for flexibly editing CSCL scripts. International Journal of Computer-Supported Collaborative Learning, 7, 567-592.

Tucker, C. R. (2013, March). The basics of blended instruction. Educational Leadership. ASCD, www.ascd.org

YoungStone Innovations, LLC. School Psychology Tools (iPad application). http://www.schoolpsychologytools.com

Appendices

Appendix A

The email I sent to the lead teacher for Read 180:

Jeff,

As part of my college project, I need to collect observation data. I will be observing 2 students in each group for about 8 minutes each. I downloaded an app for my iPad to collect on-task/off-task data, and it has a feature that lets me pause and type notes. I am using first names only.

What this means is that on two days next week, such as Monday (or Tuesday) and Thursday, I will collect observation data; then the same two days during the week of November 3rd, and one day each during the weeks of November 10 and 17, for a total of six times. The only exception would be absent students, for whom I may need to make up an observation.

I won’t be of assistance to the students during those days, since I will be sitting back behind Patrick’s desk observing.  I’ll be telling the students that I am taking notes for a class I am participating in—but not telling them I am observing them.  I can share the information with you when I compile it.

Please let me know what you think.

Thank you,

Aleta

Appendix B

Survey of Adolescent Reading Attitudes (SARA). Derived from the research of Conradi, Jang, Bryant, Craft, & McKenna (2013).

Name: _____________________                                   Date: ______________

How do you feel about reading in books, online, and other ways?

Rate each item on this scale, from 6 = “I feel very good about …” down to 1 = “I feel very bad about …”:

6 = very good | 5 = mostly very good | 4 = in between (but closer to good) | 3 = in between (but closer to bad) | 2 = mostly very bad | 1 = very bad

1. How do you feel about reading news online for class?
2. How do you feel about reading a book in your free time?
3. How do you feel about doing research using encyclopedias (or other books) for a class?
4. How do you feel about emailing (or texting) friends in your free time?
5. How do you feel about reading online for a class?
6. How do you feel about reading a textbook?
7. How do you feel about reading a book online for a class?
8. How do you feel about talking with friends about something you’ve been reading in your free time?
9. How do you feel about getting a book or a magazine for a present?
10. How do you feel about texting friends in your free time?
11. How do you feel about reading a book for fun on a rainy Saturday?
12. How do you feel about working on an internet project with classmates?
13. How do you feel about reading anything printed (book, magazine, comic books, etc.) in your free time?
14. How do you feel about using a dictionary for class?
15. How do you feel about using social media like Facebook or Twitter in your free time?
16. How do you feel about looking up information online for a class?
17. How do you feel about reading a newspaper (like the Delta Discovery paper) or a magazine for a class?
18. How do you feel about reading a novel for class?

Appendix C

Adapted from Huang (2012).

Please give your answer under the appropriate column. Only give one answer for each question.

1 = almost never; 2 = rarely; 3 = often; 4 = almost always

1. The Read 180 program increases your reading scores.
2. The Read 180 program increases your reading levels (lexile).
3. The Read 180 program improves your reading comprehension.
4. The Read 180 program increases your vocabulary size.
5. The Read 180 program changes your habits and attitudes toward reading.
6. The Read 180 program improves your motivation (desire) to read.
7. The Read 180 program encourages your social interaction with your friends about book talk.
8. The Read 180 program increases your joy of reading.
 

List three to five positive things you like about the Read 180 program.

a.

b.

c.

d.

e.

List three to five negative things about the Read 180 program.

a.

b.

c.

d.

e.

Eighth grade students only: Which program do you prefer, Lexia or Read 180? Why? ____________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________

What makes one better than the other?

___________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________________

Appendix D

 

Interview Questions for students (Huang, 2012):

  1. What types of books do you like to read? Why?
  2. What are your favorite books? Why do you like reading them? What makes you want to read?
  3. Tell me about reading in your classroom: do you read alone or with others? Do your classmates value reading? How do you know?
  4. What are some things in school that help or get in the way of your wanting to read? How do they help or not help?
  5. What types of Read 180 books do you like to read? Why?
  6. What type of computer-based reading programs do you like? Why? How does the Read 180 program motivate your reading?

Aleta’s Classroom Research of a Balanced Reading Program Approach

Research Questions

What does the pre-designed, comprehensive, blended learning model contribute to the learners’ reading skills in this specific setting? How will students become engaged in a book that may relate to their interests but includes many features outside their current mental schemata, and how will they build background knowledge about a book for independent reading? Is this model more effective for students who have prior experience with a supplemental computer reading model?

Framework

My research will be based on an interpretive framework and guided by a mixed methods approach. In an interpretive framework, one data set, primarily qualitative, is used to interpret the traditional standardized quantitative data set, although themes will also emerge from the qualitative data independent of the quantitative. Surveys of reading attitude in both digital and traditional reading settings will reveal information that is both quantitative and qualitative, bridging across data and creating a true mixed methods research approach.

Method

Quantitative data will be derived from standardized assessments and from less formal survey responses, although survey responses will also serve as qualitative information since they evaluate attitudes and engagement. Quantitative data will be gathered and interpreted on its own merits, but more importantly alongside qualitative data from observations and interviews.

Reflection for Research Question, Method, Framework

Aleta May

10-19-14

Reflection for Research Question, Method, Framework

I really liked reading Nicole’s “Research Question, Justification, and Framework.” I researched behaviorism in the late 1990s. My purpose, though, was to find out what causes or maintains both wanted and unwanted behaviors in students who cannot speak for themselves. As Nicole states, it is vital that students be involved in their own process of learning. It is with this in mind that she looks at using a badging system for maintaining positive communication and contributions among students. Used appropriately, this type of system can build intrinsic motivation to participate and learn. In an article I just read about the Accelerated Reader (AR) program, students were actually deterred by the point system. Some students complained that the scores did not measure what they were good at in reading and that they were not awarded enough points for reading thicker, more difficult books (Huang, 2012). Also, points can lead to competition between students and a feeling of letting their classmates down when the class does not earn enough points as a whole group. In Nicole’s theoretical framework, she states what I believe will serve to establish enough incentive to create accountability among students, acknowledge their efforts, and move toward the real goal of bringing students into a classroom environment with their own culture, so they want to learn because learning itself is the incentive.

Jon’s research question regarding the use of the Socrative interactive response tool summarizes the heart of what I want to look for. Is the program I am assisting with creating an opportunity for student engagement through discussion? In fact, I think the teacher could build this in by transferring the skills students learn during the limited time a rotation model allows to subject-area classes throughout the day. Jon’s framework is built on constructivism, where students build knowledge together. I’m interested in the Socrative interactive response tool he will be using. It sounds like a tool for teaching students to get started, somewhat like reciprocal teaching.

I will need to write my framework, justification, and research question much more succinctly than I did in this week’s initial blog. Jon and Nicole model this for me.

Reference

Huang, S. (2012). A mixed method study of the effectiveness of the accelerated reader program on middle school students’ reading achievement and motivation. Reading Horizons, 51(3), 229-246.

Method/Research Design in the Making

Aleta May

EDET636

Blog Post for October 17, 2014

How will data collection ‘look’ for me? What challenges am I anticipating?

Post your Method/Research Design for peer feedback

Questions include: 1. Quantitative: Does the Read 180 program have an effect on the reading achievement of 7th/8th grade students at Lewis Angapak Memorial School (LAMS)?

I will begin by looking at standardized Measures of Academic Progress (MAP) test scores from September 2014 to note initial differences between 7th and 8th grade students entering Read 180: the 8th grade students have had one semester (one class period per day) on another computer reading program, while the 7th grade students have had much more limited experience. The 7th grade students also took a reading MAZE (fill-in-the-blank from word choices) comprehension test last spring, and will be given this 10-minute test in a whole-group setting at the same time both grades take the MAP tests. The test window for MAP and the AIMSweb MAZE opens December 1st, in time for this research. One purpose of this quantitative assessment is to compare students’ standardized scores after exposure to Read 180, and further to compare the students who have had previous computer reading program experience with those who have not. Another purpose is to document the effectiveness of the Read 180 program using the standardized assessment comparisons (MAP and AIMSweb) (Huang, 2012). To interpret these results, I will include qualitative research.
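The cohort comparison described above boils down to comparing mean fall-to-winter score gains for the two groups. A minimal sketch, using entirely invented placeholder scores (not real student data, and not real MAP RIT values):

```python
# Hypothetical sketch: comparing mean MAP score gains between the
# 8th-grade cohort (prior computer reading program experience) and
# the 7th-grade cohort. All scores below are invented placeholders.

from statistics import mean

def mean_gain(pre, post):
    """Mean per-student gain from the September test to the December test."""
    return mean(b - a for a, b in zip(pre, post))

eighth_pre,  eighth_post  = [205, 198, 212, 201], [209, 204, 215, 206]
seventh_pre, seventh_post = [195, 202, 188, 199], [198, 207, 190, 205]

print(f"8th-grade mean gain: {mean_gain(eighth_pre, eighth_post):.1f}")
print(f"7th-grade mean gain: {mean_gain(seventh_pre, seventh_post):.1f}")
```

Keeping pre and post scores paired per student, rather than comparing group averages alone, is what allows the gain itself to be the unit of comparison.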

  2. Qualitative—What are the students’ views about using the Read 180 program? Does the program promote reading motivation for 7th/8th grade students at LAMS? (Huang, 2012). These questions aim to determine attitudes, beliefs, and personal experiences with the program, to look at how motivation may affect achievement.

To determine students’ views of the Read 180 program, I will provide ten specific statements to be rated on a Likert-type scale, adapted from an article about the Accelerated Reader (AR) program and applied to the program I am researching. In fact, I may use this with 8th grade students to compare Read 180 to Lexia (the program they used last spring). Here are the statements, with 1 being almost never, 2 rarely, 3 often, and 4 almost always (Huang, 2012):

  1. The Read 180 program increases your reading scores.
  2. The Read 180 program increases your reading levels (lexile).
  3. The Read 180 program improves your reading comprehension skills.
  4. The Read 180 program increases your vocabulary size.
  5. The Read 180 program changes your habits and attitudes toward reading.
  6. The Read 180 program fosters your motivation in reading.
  7. The Read 180 program fosters your social interaction with your friends about book talk.
  8. The Read 180 program fosters your joy of reading.
  9. List five positive aspects associated with the Read 180 program.
  10. List five negative aspects associated with the Read 180 program.
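Once collected, ratings like these are usually summarized as a mean per statement. A minimal sketch of that tally, with item labels shortened and all response values invented for illustration:

```python
# Hypothetical sketch: averaging 4-point Likert responses
# (1 = almost never ... 4 = almost always) per survey statement.
# The item labels are abbreviated and the ratings are invented.

from statistics import mean

responses = {
    "increases reading scores":   [3, 4, 2, 3, 4],
    "improves comprehension":     [2, 3, 3, 4, 3],
    "fosters motivation to read": [4, 3, 4, 4, 2],
}

for item, ratings in responses.items():
    print(f"{item}: mean = {mean(ratings):.2f}")
```

Item means above roughly 3 would suggest students perceive a benefit "often" to "almost always"; the open-ended positives and negatives then help interpret why.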

Interview Questions (Huang, 2012):

  1. What types of books do you like to read? Why?
  2. What are your favorite books? Why do you like reading them? What makes you want to read?
  3. Tell me about reading in your classroom: do you read alone or with others? Do your classmates value reading? How do you know?
  4. What are some things in school that help or get in the way of your wanting to read? How do they help or not help?
  5. This one for 8th grade students: What type of computer-based reading programs do you like? Why? How does the Read 180 Program motivate your reading?
  6. Does the Read 180 program cultivate your reading skills? How does it work?
  7. What types of Read 180 books do you like to read? Why?
  8. What are the strengths and weaknesses of using the Read 180 program? Why?

Additionally, I am looking for a snapshot of students’ views about using traditional resources for reading versus technology resources, and how this may apply to students’ reading within the Read 180 program. This is the Survey of Adolescent Reading Attitudes (SARA) scale I have mentioned in a previous blog; it uses a scale from 1 to 6, very bad to very good (Conradi, Jang, Bryant, Craft, & McKenna, 2013).

When I observe students in action, I take pictures. These pictures become artifacts for part of the story they tell. As a participant observer, it is not always best to turn and type while I assist students or enter into their individual reading groups, on their terms. I do not insist on helping; I am there to guide and facilitate as they write about what they read on graphic organizers, figure out what a reading log should say, discover what a word in the text means, take turns reading as a reading partner would, view pictures or additional information from an online resource that sheds light on an unfamiliar topic, and more. As I write notes, I believe they create a story that may provide more questions than answers for future reflection. An observation in my situation is a reflection of what occurred that day during group facilitation. One article suggests that creating a vignette, a short story with a detailed description of events, may be a way to record and interpret observations in order to find a problem to be solved and suggest action based on a theory of the problem’s basis: “vignettes provide . . . hypothetical snapshots of actual classroom situations; since vignettes are problem-based, either to propose a solution or to evaluate one given” (Jeffries & Maeder, 2011, p. 166). I think writing vignettes of my observations will help me deal with the inherent “lack of uniformity” (p. 166) that occurs when observing real students in real time.

References

Conradi, K., Jang, B. G., Bryant, C., Craft, A., & McKenna, M. C. (2013). Measuring adolescents’ attitudes toward reading: A classroom survey. Journal of Adolescent & Adult Literacy, 56(7), 565-576.

Huang, S. (2012). A mixed method study of the effectiveness of the accelerated reader program on middle school students’ reading achievement and motivation. Reading Horizons, 51(3), 229-246.

Jeffries, C., & Maeder, D. W. (2011). Comparing vignette instruction and assessment tasks to classroom observations and reflections. The Teacher Educator, 46, 161-175.

Aleta May

Reflection for Week 6: Writings for October 9 – October 12.

EDET636

I posted my “Method and Research Design for a Blended Reading Environment” and “Data Collection Musings & Their Inherent Challenges” for this week’s assignments. Scott responded that my plan looks well organized and reminded me that a variety of sources is important to any research study. It was really helpful to receive this input, because blended learning is such a broad topic, even though I am narrowing it to Read 180 and its impact on student engagement and student-centered learning. It was also helpful to receive his feedback on my plan to wait before interviewing students, since by then the novelty of the program will have worn off.

Sunshine inspired me to look into computer-based simulated learning environments. I found an article that related to my research in that it discusses the issue of cognitive overload. By briefly researching her topic, I believe I found information that could help her write a rubric for her students’ presentations. Scientific discovery learning based on inquiry learning is the pedagogy behind many science simulation environments: “Scientific discovery learning is in line with aspects of inquiry learning, including processes such as predicting (stating a possible simulation outcome), conducting (carrying out the simulated experiment and collecting data), and reasoning (drawing conclusions about the simulation outcome. . .” (Eckhardt et al., 2013, p. 106).

I noticed that Tammy is using a mixed methods research approach like I am. She added to my thinking about open-ended questions and how I could add these to an interview I wish to give students regarding how they feel about reading in traditional and blended learning environments. I looked up mixed methods research and found an insightful but simple definition: “the joint use of qualitative and quantitative methods in a single study” (Maxwell, 2013, p. 102). One purpose for using more than one data collection method is triangulation, meaning one method sheds light on the data collected by another, leading in turn to a deeper understanding. What if observations and interviews reveal information that seems divergent? Then the researcher is forced “to reexamine your understanding of what is going on” (Maxwell, 2013, p. 104). There is also less inference to make when one source of data seems unclear.

This week I completed my literature review. Working on this simultaneously with thinking about my research design and theoretical framework (recently), has helped pull several parts together to give me focus. It feels good to be at this point. The next logical step is to begin collecting data.

References 

Eckhardt, M., Urhahne, D., Conrad, O., & Harms, U. (2013). How effective is instructional support for learning with computer simulations? Instructional Science, 41, 105-124.

Maxwell, J. A. (2013). Qualitative research design: An interactive approach (3rd ed.). Thousand Oaks, CA: Sage.

Thematic Review of the Literature for Blended Learning Environments

Aleta May

EDET636 Impact of Technology on Learning

October 12, 2014


https://aleta57.wordpress.com/2014/10/13/thematic-review-of-the-literature-for-blend-learning-environments/

Thematic Review of the Literature for Blended Learning Environments

One theme that sprang forth unexpectedly was how students’ attitudes and self-assessments need to be considered within the pool of evidence collected when analyzing the impact of blended learning, in this case on literacy skills. In one article, the impact of learning on two groups, a blended e-learning (experimental) group and a traditional learning (control) group, was compared. The typical focus was to compare groups primarily by examining achievement data. When the pretest and posttest included the average score (five items each for cognition and skill) and a self-assessment (six items) over two midterms (a five-week interval), the authors found that “the average score on the achievement test for the experimental group was only slightly higher than that for the control group” (Chang, Shu, Liang, Tseng, & Hsu, 2014, p. 9). Taken at face value, blended e-learning achievement test scores were not significantly better than those from traditional learning. However, when the self-assessment results for the two learning platforms were compared, apparent differences caught the authors’ eyes: students in the experimental group scored noticeably better in cognition especially, and in skill, but not in attitude. So the question is raised, “Why blended learning?” Given more than five weeks to adjust to the new blended e-learning environment, students’ attitude toward this style of learning may lead to higher motivation. Motivation leads to higher engagement, and engagement leads to retention of learning.

Even at the beginning college level, the issue of course engagement is being addressed through hybrid course design. There is a shift in emphasis from teacher-centered to student-centered pedagogy; this shift includes high school and is beginning at much younger levels. If students prepare for class before coming (or before attending online), there is time in class for students to collaborate to solve problems and discuss insights. These in-class strategies increase engagement with the subject matter (Foote & Mixson-Brookshire, 2014). The instructor’s role becomes facilitating group discussion, managing and designing course content, and transmitting information from sources, between groups, or through technology like video clips.

Side by side with the theme of engagement is the teacher’s function as one who promotes collaboration and knowledge sharing within well-planned frameworks. When planning for social technologies, for example, the teacher considers participation, interaction, and synthesis as vital components of planning for productive groups who collectively create and construct new knowledge (Agosto, Copeland, & Zach, 2013). With technology, time and distance boundaries disappear. Students are not always tied to the confines of face-to-face, brick-and-mortar, or synchronous learning on a learning management system (LMS). Blogs and mobile practices like tweeting encourage ongoing discussion that is motivational for students, and social media like blogs encourage students to learn the perspectives of others. English language learners (ELLs) have more time to translate their thoughts and share ideas when not pressured to respond immediately. A framework will encompass teacher participation, personalizing (like entering into a blog conversation), and simplifying or promoting learning. To be captivated, students need to feel in control and see a connection to their lives (Agosto, Copeland, & Zach, 2013). Engaging students in learning means that teachers need to avail themselves of technology that encourages integrating social technologies into their course design.

Planning instruction and gauging its impact means that students’ attitudes need to be considered. There is a framework for quantifying students’ attitudes toward reading in different environments, and for utilizing both traditional and digital media to motivate and expand students’ use of both. The Survey of Adolescent Reading Attitudes (SARA) seeks to do just that: by self-report, students rate their attitudes for academic and recreational reading, in print and digital media, on a “6-point scale from ‘very good’ to ‘very bad’” (Conradi, Jang, Bryant, Craft, & McKenna, 2013, p. 568).

Closely related to engagement is considering grouping strategies that build momentum through pace and support in small-group (and sometimes individual) instruction. Technology is a logical mechanism for supporting students as they rotate between groups. There are supplemental reading programs that can focus students on learning word parts and morphemes (small units within a word that have meaning), building background knowledge through video clips, and other common, repetitive activities that free teachers to work with small groups on deeper thinking activities (Cheung & Slavin, 2013). It is like adding an assistant to the room who motivates students through games. Comprehensive reading programs include the entire reading model, so teachers can work with small groups and check individual progress for planning purposes. When growth is noticeable, students are engaged. Small-group instruction within a blended environment is one way to meet this goal.

Read 180 is a comprehensive program that promotes and capitalizes on small-group instruction. Its purpose is to remediate students experiencing reading difficulties beginning where “a general stagnation in reading growth in the upper elementary and middle grades” (Kim, Samson, Fitzgerald, & Hartry, 2010, p. 1110) occurs. This is also referred to as the fourth-grade slump, where students transition from “learning to read to reading to learn the new” (Kim et al., 2010, p. 1111). The breakdown may occur in any of an array of reading skill areas. Since readers may need to repeat learning or fill in gaps in word recognition or language comprehension abilities, the program addresses that need through blended learning that combines small-group direct instruction, small-group supported independent reading, and computer practice lessons that address students’ needs individually. The article reports an experiment studying the effects of this blended reading program (referred to as a mixed-methods intervention in the article) compared with another district after-school program. Among the other studies the article considers, it is not uncommon to find significant positive effects in third grade but not in fifth grade for fluency, reading accuracy, and comprehension. In this study only fourth-grade students had notable gains, which may be due to increased time spent in reading instruction. The researchers therefore separated out scores for different areas of reading (such as reading efficiency, comprehension, vocabulary, and fluency). One powerful finding about the impact of student engagement on comprehension is that the computer activities, which incorporate leveled text with comprehension accountability along with videos that teach vocabulary, word study, and background building, produced compelling improvement.
Students in this study were not taught vocabulary that has application across content areas, because the 30-minute whole group segment where this is normally taught was omitted. Including this segment may have produced greater gains in reading comprehension follow-up scores.

Looking beyond the Read 180 blended learning program, teacher-designed blended learning assimilates a variety of technologies that can engage students in deeper thinking leading to higher levels of comprehension, such as blog sites, YouTube video clips, discussion boards, and students’ own mobile devices (Tucker, 2013). Small-group instruction can be part of that blended model design, such as teaching students according to individual needs within mini-lessons with the teacher as facilitator. Teachers are able to maximize their instruction time, because they have time to analyze student computer data and plan without the added time usually spent grading papers (Ash, 2012, March). Data is available in real time for teachers to access for each student, and the result is the flexibility to use that learning management system to plan very direct instruction (Ash, 2012, October). To maximize instruction on a larger scale, some programs, like Hybrid High School in Los Angeles, divide blended learning spaces into math/science and English/language arts/social studies groups (Ash, 2013). In this flexible model, individual instruction occurs through individual stations.

The blended learning plan may be set up as a flexible classroom. Flexible classrooms were developed in the Rocketship program in response to noted limitations of a blended learning station-rotation program (incidentally, station rotation is the design Read 180 follows). According to the designers and educators in the Rocketship program, station rotation is too rigid, producing students who follow a plan but do not spend much effort learning for themselves as they would in a more self-directed atmosphere (Herold, 2014, January). In trying to expand a flexible, self-directed learning atmosphere within the Rocketship elementary program, the designers found it most effective in fourth and fifth grades. Further, in an article entitled “Outside Public System, Blended Models Take Hold,” Herold (2014) found that among a group of private schools, effective communication between educators is essential, so a consortium was created.

One way self-directed learning is effective is when students use technology to build knowledge together. This occurs when an environment is designed to augment collaboration between students, and between students and teachers. Students feel safe sharing their thinking, from an affective stance, when parameters for discussion are set within a blended learning environment. In Hew and Cheung (2014), evidence-based practices are explored through a compilation of research to help teacher-designers find the right blend, using a framework to set an environment for deeper, self-directed learning, exploration, and discussion.

Computer-Supported Collaborative Learning (CSCL) must be planned. A questionnaire distributed to college students across five CSCL subject areas found that upholding both cognitive and social learning conditions, for individuals and for groups, is not only important to building knowledge but must be planned into the pedagogical structure when designing a course (Hernandez, Gonzalez, & Munoz, 2014). At the center is CSCL; closely connected are the teacher, the student, and the task, facilitating learning within an organization whose technology supports the course or subject pedagogy. More specifically, Computer-Supported Collaborative Blended Learning (CSCBL) may be founded on scripts designed to consider “the space, the pedagogical method, the participants and the history” (the 4SPPIces). The 4SPPIces combine to develop strong targeted learning objectives and supply a conceptual model for educators to use when communicating with each other about course design (Perez-Sanagustin, Santos, Hernandez-Leo, & Blat, 2012). Within this CSCL design, both quantitative and qualitative data can be organized and studied to guide future ventures and planning. The overarching reason for using CSCL scripts is improving learning outcomes (Sobreira & Tchounikine, 2012). Adapting script phases facilitates change and allows well-thought-out script editing. The script is a way to manage a teaching situation within a complex, multilayered setting; it provides a framework for students to collaborate within. When a blended learning environment is changed, the pedagogy of the original learning management system needs to be considered so that changes to student interaction planning uphold the macro-level script (Sobreira & Tchounikine, 2012).

This literature review is broad in scope, but it addresses specific concerns that need to be considered when conducting classroom research in a specific rural village setting using a pre-designed blended learning model for reading, particularly if the research leads the researcher to recommend changes to the model or to modify and expand it to other learning environments in the seventh and eighth grade students’ school day. Observations, interviews, a survey, and close analysis of computer test data will be conducted with an understanding of what the research already says about the various aspects of blended learning. Questions will be answered regarding how the comprehensive blended reading program impacts the student engagement that leads to meaningful growth and learning.

References

Agosto, D. E., Copeland, A. J., & Zach, L. (2013). Testing the benefits of blended education: Using social technology to foster collaboration and knowledge sharing in face-to-face LIS courses. Journal of Education for Library and Information Science, 54(2).

Ash, K. (2012, March). Blended learning mixes it up. Education Week, 31(25), 1-7.

Ash, K. (2012, October). Blended learning choices. Education Week, 32(9), 1-5.

Ash, K. (2013). Spaces for blended learning. Education Week, 32(25), 1-5.

Chang, C.-C., Shu, K.-M., Liang, C., Tseng, J.-S., & Hsu, Y.-S. (2014, April). Is blended e-learning as measured by an achievement test and self-assessment better than traditional classroom learning for vocational high school students? The International Review of Research in Open and Distance Learning, 15(2), 1-12.

Cheung, A., & Slavin, R. (2013). Effects of educational technology applications on reading outcomes for struggling readers: A best-evidence synthesis, 277-299.

Conradi, K., Jang, B. G., Bryant, C., Craft, A., & McKenna, M. C. (2013). Measuring adolescents’ attitudes toward reading: A classroom survey. Journal of Adolescent & Adult Literacy, 56(7), 565-576.

Foote, S. M., & Mixson-Brookshire, D. (2014). Enhancing learning with technology: Applying the findings from a study of students in online, blended, and face-to-face first-year seminar classes. Currents in Teaching and Learning, 6(2), 35-41.

Hernandez, N., Gonzalez, M., & Munoz, P. (2014). Planning collaborative learning in virtual environments. Media Education Research Journal, 42(21), 25-32.

Herold, B. (2014, January). New model underscores Rocketship’s growing pains. Education Week, 33(19).

Herold, B. (2014). Outside public system, blended models take hold. Education Week, 33(19), 1-5.

Hew, K. F., & Cheung, W. S. (2014). Using blended learning: Evidence-based practices. Singapore: Springer.

Kim, J. S., Samson, J. F., Fitzgerald, R., & Hartry, A. (2010). A randomized experiment of a mixed-methods literacy intervention for struggling readers in grades 4-7: Effects on word reading efficiency, reading comprehension and vocabulary, and oral reading fluency.

Perez-Sanagustin, M., Santos, P., Hernandez-Leo, D., & Blat, J. (2012). 4SPPIces: A case study of factors in a scripted collaborative-learning blended course across spatial locations. Computer-Supported Collaborative Learning, 7, 443-465.

Sobreira, P., & Tchounikine, P. (2012). A model for flexibly editing CSCL scripts. Computer-Supported Collaborative Learning, 7, 567-592.

Tucker, C. R. (2013, March). The basics of blended instruction. Educational Leadership. ASCD: www.ascd.org