OLOR Series: Research in Online Literacy Education
Author(s): Diane Martinez, Mahli Xuan Mechenbier, Beth L. Hewett, Lisa Melonçon, and Heidi Skurat Harris, with Kirk St.Amant, Adam Phillips, and Marcy Irene Bodnar
Original Publication Date: 15 March 2019
Permalink: <gsole.org/olor/vol2.iss1.b>
Best practices in online writing instruction (OWI) have been a concern for more than a decade, yet students’ voices have not played a major role in OWI research projects to date. This article reports the data from a U.S.-based national online student survey conducted in 2017. When viewed comprehensively, the survey data revealed that students valued instructor expertise and feedback; however, students did not know how the work in online writing courses helped them improve their writing.
Keywords: Online writing instruction (OWI), student voices, survey research, online writing courses (OWCs), hybrid, fully online
1. Background

[1] For the fourteenth straight year, student enrollment in online classes in higher education has increased, and recent reports indicate that 32% of all students are taking at least one online class (Seaman, Allen, & Seaman, 2018). This national growth in online enrollments aligns with growth in online courses and programs in composition and technical and professional communication (TPC). For example, as of 2012, 15% of all TPC degree programs were offered fully online (Melonçon, 2012) and 21% of service courses were offered fully online or in a hybrid format (Melonçon, 2009).

[2] These numbers strongly indicate that educators must deliver effective online instruction. Best—or, more accurately, effective—practices in online writing instruction (OWI) have been a concern for writing studies for more than a decade, as evidenced by the work of the Conference on College Composition and Communication (CCCC) Committee on Computers in Composition and Communication and the CCCC Committee for Effective Practices in Online Writing Instruction (hereafter called the OWI Committee). The OWI Committee, constituted in March 2007 (Hewett & Depew, 2015), was charged with developing a position statement of effective practices for OWI. After several years of research into the issues, the OWI Committee (2013) composed A Position Statement of Principles and Examples of Effective Practices in Online Writing Instruction (hereafter called the OWI Principles document), which described 15 grounding principles for ethical and effective teaching of postsecondary-level writing in online learning.

[3] These OWI principles have been key to new research into OWI because they provide a solid ground for examining both theoretical and practical challenges of OWI pedagogy and administration. They offer a position from which scholars could agree and differ, thus enriching the research and educational understanding of OWI’s possible effects on student writing. However, despite its strengths, the OWI Principles document, like much of the research surrounding instruction that occurs in online environments, failed to address student needs head-on. Indeed, alongside attention to pedagogical practice and growth in courses and programs, OWI researchers have produced vast and diverse research studies and articles that recently were collected to form the Bedford Bibliography of Research in Online Writing Instruction (2017). Nonetheless, despite the research advances that these publications represent, online literacy educators still do not know much about student needs from the student point of view. With most of the research engaging instructional perspectives, students’ stated needs regarding their experiences in online writing courses (OWCs) remain woefully underexplored; exceptions include research published by Patricia Webb Boyd (2008), Beth Brunk-Chavez and Shawn J. Miller (2006), Jennifer M. Cunningham (2015), Angela Eaton (2005, 2013), and Scott Warnock (2018). This report of a national survey of OWI students is the result of the OWI Committee’s attempt to address this gap in published literature.
1.1 Background of the OWI Student Survey’s Construction

[4] In 2014, the OWI Committee conducted an informal OWI-focused literature review, which indicated that no published reports existed of any national surveys dedicated to learning about students’ experiences in OWI. To address this gap in OWI research, in 2015, the OWI Committee tasked a student-survey working group to develop a U.S.-based national online survey for students in OWCs.

[5] The student-survey working group settled on a survey designed to gather primarily quantitative data, using only a few subjective, open-ended questions. The working group decided to keep the survey as short as possible to encourage students to complete it fully, given research that suggests long surveys may be abandoned by respondents (Chudoba, n.d.). Given the ultimate intention of obtaining a generalizable sample of U.S. OWI students, the student-survey working group sought as large and as representative a student sample as possible.

[6] The pilot student survey focused on student experiences regarding preparation for, access to, and learning in OWCs. Reading and alphabetic writing are, at minimum, core components of succeeding in online courses, and the working group was especially interested in learning directly from students what pedagogical components of OWCs they considered helpful in improving their writing. Development of the survey was guided by the following research questions:
[7] Testing the survey as a pilot was critical to refining it for the broadest participant base. To ensure a strong survey instrument, the student-survey working group conducted a pilot of the survey in the fall of 2015 to test question construction and survey length. Its first iteration was piloted with students taking classes from online instructors associated either directly or indirectly with the OWI Committee. When the OWI Committee reported its general results at the CCCC conference in 2016, the audience responses revealed ways to refine the instrument to engage student voices more fully about their OWI experiences. During that same conference, the OWI Committee received word that it had not been reconstituted; despite this unexpected news, the student-survey working group was committed to continuing the work begun with the pilot survey. They subsequently partnered with Macmillan Learning and the newly constituted Global Society of Online Literacy Educators (GSOLE), which stepped up as the sponsor of this work. A second iteration of the survey was created from the pilot survey’s learned lessons, and the working group requested feedback from online technical writing students at Louisiana Tech University in June and July 2017. Further revisions were made to the survey based on the second pilot, and the national survey was distributed in September 2017.

[8] Western Carolina University is the IRB of record for the survey. The survey was designed in SurveyGizmo by Macmillan Learning, and it was intended for students only in fully online writing courses—defined on the survey as “a class that is delivered solely online, meaning your class does not meet in a face-to-face setting. All work and class participation is online”—and hybrid writing courses—defined on the survey as “a class that is delivered online and sometimes meets in a face-to-face setting (maybe once a week, once a month, or once or twice a semester).” Student responses were anonymous. The student-survey working group used both a convenience sample, in that the survey was advertised on listservs and relevant social media, and a purposeful sample, garnered from specific lists of online writing instructors that Macmillan Learning made available. The survey was distributed during the week of September 18, 2017 and remained open until December 31, 2017; three reminder emails were sent during that time. Since the student-survey working group did not have access to students, the recruitment process involved two layers of participant requests: the working group asked instructors who were teaching online to ask their students to complete the survey. There is no exact count of how many people received the email, and there is no way to know who distributed the survey link to students, so calculating a rate of return is impossible. Data were collected and stored in a SurveyGizmo account owned by Macmillan Learning and monitored by IRB-approved members of the survey group.
3. Pertinent Details about the Survey Data

[13] The final survey had approximately 25 questions (fewer depending on how respondents answered several skip-logic questions); 18 were multiple choice, 2 were frequency rankings, and 1 was a Likert scale question on pedagogical activities. There were four open-ended questions. As part of their sponsorship of this research project, GSOLE agreed to house the full dataset from this survey, which can be accessed by the public. Table 1 shows the breakdown of respondents.
[14] The student-survey working group chose to include all responses in the analysis, even partially completed surveys, because they had value for a grounded, initial understanding of student perspectives about OWI. In this written report, responses include n values as well as percentages to account for the mix of completed and partial surveys; however, using all the data prohibited us from making more advanced statistical comparisons, which typically can be done only by comparing variables from complete data sets.

[15] The complete survey responses are weighted toward TPC, at 67% (n=231). This result is likely because of the recruitment strategies used. The student-survey working group had access to a list of individual instructors who primarily taught TPC courses, and survey research indicates that response rates typically are higher when the request for participation comes directly to a personal email rather than from other sources (Lindemann, 2018).

3.1. Demographic and Course Data

[16] To better frame the results and discussion of this survey in subsequent sections, we first provide some demographic information about the students who participated in the survey. Demographic questions appeared in the final quarter of the survey itself. Questions that asked students about the course they were enrolled in appeared in the first part of the survey, and they are included in this section.

[17] 3.1.1. Age. Question 20 asked, “Please identify your age range.” Results appear in Table 2.
[18] A little over half of respondents, 59%, were traditional-age students, who typically are defined as 24 years old or younger (U.S. Department of Education, n.d.). The remainder, 41%, were (as defined in higher education) non-traditional students. The age distribution in this survey was much more traditional than the typical distribution of online-only students, where the mean age of undergraduate online students is 29 (Clinefelter & Aslanian, 2016), possibly indicating that the traditional-age students who took the survey were taking a single online course as part of a traditional on-campus degree. The high number of students over age 24 supports the notion that online learning is a valuable resource for students with complex family lives who need the flexibility of online courses (e.g., Gos, 2015; Noel-Levitz, 2015; Ruffalo Noel-Levitz, 2016). Age data specific to OWI can help faculty and administrators plan and market courses accordingly.

[19] 3.1.2. Gender. Question 21 asked, “What is your gender?” Figure 1 summarizes those results.
[20] Although Figure 1 shows responses of “male,” “female,” and “I do not wish to respond,” the survey did include the option of “I identify as” with an open comment to allow participants to signify how they would like to be identified. This comment box did not generate any usable answers.

[21] The gender breakdown of this survey matches national trends wherein women students are more prevalent in higher education (56% female) (U.S. Department of Education, 2018); additionally, it matches the increase in women students who take fully online and hybrid online courses (approximately 70%), potentially because of family or work obligations (Clinefelter & Aslanian, 2016).

[22] 3.1.3. Academic level and type of institution. Question 22 asked, “I am a [fill in academic level] at a [fill in type of institution].” Students could select from a dropdown menu for the fill-in-the-blank options. Part one included freshman-to-senior options paired with the typical year students would be in school (e.g., freshman/first year), graduate, non-degree seeking, and “other” options. Part two included options for two-year, four-year, and “other.” Figure 2 shows results for academic level reported.
[23] Only 2% (n=7) of the students reported attending a two-year college; the remaining 98% reported attending a four-year college or university. These results likely reflect the survey distribution more than typical online student-to-institution ratios, since recent research has suggested that 36% of students attend two-year colleges (Ginder, Kelly-Reid, & Mann, 2017), and two-year colleges comprise 30% of online enrollments (U.S. Department of Education, 2018).

[24] The distribution of student academic levels is not surprising based on the type of courses being taken by the participants, as shown next.

[25] 3.1.4. Type of course. Question 1 asked, “Please identify the writing class in which you were asked to take this survey.” Students were provided a series of courses from which to choose, and there was an “other” option for writing in a course not listed. Table 3 presents the breakdown of courses identified.
Table 3. Courses Students Enrolled in at Time of Survey (n=413)
[26] Aligning with the recruitment and distribution of the survey link, more students in TPC completed the survey, with first-semester composition (taken broadly to be first-year writing courses) following. “Other” written-in responses included “expository writing,” “analytical and research writing,” “creative writing,” and “master’s level course.”

[27] 3.1.5. Length of course. Question 2 asked, “The duration of this course is,” with options for students to choose from and an open comment box for participants to write in other options. Results are shown in Table 4.
[28] That over three-quarters of respondents were in traditional 13- to 16-week semester courses aligns with the high percentage who reported they were in a four-year college or university.

[29] 3.1.6. Previous online course experience. Question 25 asked, “How many additional online or hybrid courses—in any subject—have you taken since you began college?” Figure 3 displays the results.
[30] Students’ general experience with online courses can be seen as a positive when thinking about OWCs. However, as discussed below, this experience may make it more difficult for students to adjust to an OWC, which requires a different type of engagement than other online courses such as math or basic science. Only 15% (n=51) had not taken an online course prior to their enrollment in the OWC, but this number suggests that care is necessary to ensure that students new to this learning environment are properly oriented to the setting as well as to the particular class. The next section expands on results from the rest of the survey.
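Throughout this section, counts (n) and percentages are reported question by question because partial responses were retained (see paragraph [14]), so the denominator varies across questions. As a minimal, hypothetical sketch of how such per-question tallies might be computed from a CSV export of the publicly available dataset, the following Python snippet drops blank (skipped) answers for each question and reports n alongside percentages; it is not the working group’s actual analysis script, and the file name and column label are placeholders.

```python
# Minimal sketch (not the authors' analysis script): tallying one survey question
# when partial completions are retained, so the denominator (n) varies by question.
# The CSV file name and column label below are hypothetical placeholders.
import pandas as pd

responses = pd.read_csv("owi_student_survey_2017.csv")  # hypothetical export of the public dataset

def summarize(question_column: str) -> pd.DataFrame:
    """Report n and percentage for one question, ignoring blank (skipped) answers."""
    answered = responses[question_column].dropna()      # per-question n excludes skipped items
    counts = answered.value_counts()
    return pd.DataFrame({
        "n": counts,
        "percent": (counts / len(answered) * 100).round(1),
    })

# Example: distribution of responses to Question 1 (type of writing course).
print(summarize("q1_course_type"))
```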
4. Results and Discussion

[31] In this section, results of the three content-based sections—Course Information, Orientation, and Online Course Activities—are reported and discussed, leading to implications of this research at the end of this report.

4.1. Course Information

[32] Other than information reported on the courses in which students were enrolled (see Table 3) and the duration of the courses (see Table 4), this part of the survey asked questions about the physical location where students completed their course work and what types of devices they used for accessing and completing their course work.

[33] 4.1.1. Location. Question 3 asked, “Please rank in order of frequency (with 1 being least frequent and 5 being most frequent) where you do the majority of your work for this online writing course.” Results are presented in Figure 4.
[34] As shown in Figure 4, the most frequent location in which students completed their school work was at home, followed by on campus; however, 274 students reported completing course work at their workplaces, and, therefore, the conditions of such an environment and how they influence both the degree and frequency of engagement are important considerations. For example, when students access and complete their schoolwork at home or on campus, it would be reasonable to think they have a time and place set aside for such activity. Yet when they access and complete their schoolwork in the workplace, the question of whether the schoolwork is sandwiched between workplace tasks (or even completed instead of workplace tasks) arises. In such cases, it would be useful to learn at what kinds of workplaces students work and to what degree they can focus completely on their schoolwork. Certainly, an entire survey regarding location of access and completion of online coursework would be illuminating.

[35] 4.1.2. Device used to access and complete course work. Question 4 asked, “Please rank in order of frequency (with 1 being least frequent and 5 being most frequent) what kind of device you use to do the work for this online writing course.” See Figure 5 for the results.
[36] A majority of students reported using a laptop to access their OWCs; however, use of a mobile phone exceeded use of desktops, tablets, and notebooks, suggesting that the size of the device mattered less than either the convenience factor or the possibility that a mobile device was the only connection the student had with the internet. The Pew Internet Project (2013) reported that 73% of Advanced Placement and National Writing Project teachers surveyed said that their students used mobile phones inside and outside of the classroom to complete their school work. Some of the high school students in that Pew survey would have been college students at the time of this survey, suggesting that they had access to and experience using mobile devices in classrooms. In 2018, over 94% of traditional college-age students (18-29) owned a smartphone (Pew Research Center), and Esteban Vazquez-Cano (2014) reported that “[m]obile learning often takes place outside a formal learning environment, tending to become personalized via users’ personal mobile devices” (p. 1508). One implication of these data is that students may be using technology that is not conducive to the kind of work they are assigned in OWCs, so they may need technological guidance, which is addressed in the next section.

[37] 4.1.3. Mobile application. Question 13 asked, “Does your institution provide a mobile application (e.g., Blackboard Mobile Learn) that supports accessing (getting into or working in) this course?” Results are shown in Figure 6.
[38] For students who responded “yes” to Question 13, Question 14 asked, “Please describe how you use the mobile application in this online writing course (check all that apply).” Results are shown in Figure 7.
[39] The type of device students use to access and work in a course may be influenced by pathways into a course and how easy it is to use the course software on any given device; thus, some universities (e.g., Oregon State University, Colorado State University, and University of Southern California) now have mobile applications available for students to access online courses (Friedman, 2017). Most students who reported that their institution had a mobile app, 60% (n=238), also reported completing two to three activities using it (see Figure 7). It is worth noting that 13% (n=52) of students reported not using the app even when it was available, suggesting a need to learn why they made this choice.

[40] Students who preferred to use mobile devices to check in—for example, with an online group or to ensure they are not missing an announcement—reported doing so for quick activities such as verifying due dates, reading messages, or accessing voice-threads. “Other” responses in the student survey of OWI included completing assignments and quizzes and checking grades. These activities are indeed quick undertakings that are both practical and supported through a mobile device. In a national survey of over 1,500 online students, Dian Schaffhauser (2018) reported that 67% completed their coursework on a mobile device. The most common activities from that survey, which provided no information about the types of courses, included accessing course readings (51%), communicating with professors (51%), and communicating with fellow students (44%) (Schaffhauser, 2018). The findings from that national survey align with the findings from the OWI student survey and provide additional support for understanding ways that students are integrating mobile devices as part of their online learning and how instructors can support mobile technology in OWCs. However, it does not suggest how many students may be attempting to complete their writing assignments via mobile devices.

[41] Additionally, Rochelle Rodrigo (2015) reviewed studies associated with using mobile devices in higher education and discussed mobile apps specifically in terms of OWI and the OWI Principles. She emphasized that it is impossible for any online instructor to be familiar with all devices available for students to complete their work; yet, given that OWI Principles 2 and 10 address institutional responsibility for supporting students’ technology (CCCC OWI Committee, 2013), online writing instructors can provide “reasonable support.” Reasonable support can manifest through instructors being “reasonably aware of some of the major issues that might occur when their OWC interacts with popular mobile devices and operating systems” (Rodrigo, 2015, p. 501), such as incorporating “low-stakes learn-the-technology assignments where students safely can explore how they will interact in a specific course with their individual devices” (p. 504). This survey’s data on students’ minimal use of mobile apps offer online writing instructors a glimpse into how to guide students regarding the assignments for which their mobile devices might work and the work for which they might not. Importantly, despite the useful voice-to-text functions of many mobile devices and their value for students who compose best orally, students likely should arrange to use other devices to complete their writing, such as when revising, editing, and formatting a final paper.

4.2. Orientation

[42] As a section, “Orientation” addressed how students had been prepared by their teachers or the institution for their OWCs.

[43] Question 5 asked, “Were you offered an orientation about taking an online writing class?” This question used skip logic so that students who responded “No” were advanced to the next section of the survey. Results for Question 5 are presented in Figure 8.
[44] Only 28% of respondents indicated they had been offered some sort of orientation to the OWC. Yet, once students enter the online learning platform, orienting them to the space is key to student success (Melonçon & Harris, 2015). Whether orientations take place at the institutional level (Bozarth et al., 2004), in face-to-face settings (Gos, 2015), or at the start of online courses (Dockter, 2016), instructors ideally should receive institutional support to help students adjust to online learning (Minter, 2015). Online orientations have been shown to increase retention in online classes (Taylor, Dunn, & Winn, 2015), and they can vary from a short tutorial video about the affordances of a single classroom to a multi-day on-ground orientation for students new to an online program (Lieberman, 2017).

[45] Distinguishing between an orientation that specifically addresses OWI versus an orientation to online learning more generally is important. The survey question asked about an orientation specifically for an OWC; however, we do not know whether students identified the words “taking an online writing class” as central to the question or whether they simply responded to being offered an orientation of any kind. This lack of clarity existed despite a follow-up question (Question 8, “Did this orientation adequately prepare you for the work in this course?”).

[46] The failure of the survey question to be sufficiently specific aside, the 48% who responded “No” and the 24% who responded “I don’t know” indicate an alarming number without orientation to the OWC; the published literature strongly supports orienting online students to online classes in general (Lee & Choi, 2011) and to OWI in particular (Bozarth, Chapman, & LaMonica, 2004; CCCC OWI Committee, 2013; Gos, 2013; Melonçon & Harris, 2013; Minter, 2013). Additionally, because of the decision to keep the survey short, follow-up questions were not provided regarding actual participation in and components of offered orientations. Therefore, we do not know whether those students who reported having been offered an orientation actually participated in it or what they might have learned from it.

[47] When developing the survey, the student-survey working group wondered whether previous experience in online courses affected whether students were offered an orientation in their current class. Although the question about being offered an orientation and the demographics question about the number of previous online courses students had taken (Figure 3) cannot be correlated statistically, we speculate that prior experience (or lack thereof) in taking online courses is one possible reason students may or may not participate in an orientation if one is offered.

[48] Question 6 asked, “Was the orientation delivered online?” Students could choose to check “online” (meaning “fully online”), “hybrid,” or “face-to-face.” According to OWI Principle 13, “OWI students should be provided support components through online/digital media as a primary resource” (CCCC OWI Committee, 2013, p. 25), which indicates that orientation for an OWC should be online in some fashion, depending on the type of course to be taken. Most respondents reported the orientation as having been fully online (n=102), with only 8 students reporting “No,” and 5 students reporting it was a hybrid orientation.
[49] Additionally, OWI Principle 10 states that “Students should be prepared by the institution and their teachers for the unique technological and pedagogical components of OWI” (CCCC OWI Committee, 2013, p. 21), which means students should be oriented not only to the technology they will encounter in the OWC, but they also should be given a course-specific orientation that includes interface familiarization, lessons, and examples of study habits and skills needed to succeed in an OWC (OWI Committee, 2013). Results from the survey were split between general orientations to online courses, 51% (n=39), and course-specific orientations, 49% (n=35), which we further address below with an open-ended question.

[50] Question 8 asked, “Did this orientation adequately prepare you for the work in this course?” and the majority of students, 78% (n=82), responded positively. For those students who found the orientation helpful in preparing for the online course, the following responses exemplify why they found it helpful.

- The orientation helped give me a better understanding of what I should be prepared for during the semester.
- The orientation allowed me to have a “heads-up” of things that could happen in an online course (good or bad) and where/how to seek help if problems arise.
- The orientation clearly showed me how to navigate the class website and gain access to course material.
- It was very clear that I had to have some basic computer skills, that I had to be self-motivated to check in, and showed me how the course was set up and how to get around in the course.

[51] For those who responded “No” (n=23), the following responses indicate that the orientations were too general to be helpful.

- The orientation was too generic to be of any real assistance. Customizable course orientations would be more beneficial, and would also enable more uniformity in the way that online courses operate.
- It was just general information on whether I would be able to keep up with the demands of an online course.
- The video discussed how to use the interface, but it did not discuss how the professor would use the interface or layout the course (it is a pretty flexible interface from the professor side). It did not create a successful mental model of the course.

[52] The combination of responses indicates that students need—and want—an orientation that includes information about technology, general online learning pedagogy, and course-specific information. Orientation to the online environment and to a specific online course are the first steps to ensuring student success (Lee & Choi, 2011). In the case of OWI, “course-specific” should include a range of information depending on the course, including how tools and activities within a course are designed to help students improve their writing, as indicated in OWI Principle 10 (CCCC OWI Committee, 2013). We encourage writing program administrators to advocate for instructor funding and support in creating online orientations, perhaps through university centers for teaching and learning or through offices of distance education.

4.3 Online Platform and Technical Difficulties

[53] As part of the Online Course Activities section, students were asked questions about their experiences with the learning management system (LMS) or other software programs used to deliver the OWC and about technical difficulties in accessing the course.

[54] 4.3.1. Online platform. Question 9 asked, “In what program do you do most of your online work for this course?” Students were given four options for specific LMS programs, one “I don’t know” option, and one write-in option. In this survey, by far, Blackboard and Canvas were the two most common platforms reported as being used to deliver OWCs: Blackboard at 38% (n=152) and Canvas at 31% (n=123). The other options of Moodle, Google, D2L, and “I don’t know” each accounted for less than 10%. As one would expect, the types of tools used to deliver online courses are vast, and since the survey did not define exactly what was meant by “program,” respondents included a number of responses in the open comment box. Examples of “other” responses included “Zoom,” “Adobe Connect,” “a website,” “Google,” “Word,” and “Eli Review.”
[55] According to Phil Hill (2017), Blackboard continues to be used in just over one-quarter of academic institutions (28%), representing 37% of all student enrollments in the U.S. and Canada. Canvas continues to gain market share, used in 21% of academic institutions and representing 27% of student higher education enrollment. The next closest competitor, Moodle, is used at 25% of higher education institutions but represents only 12% of enrollments in the higher education market, indicating that it is used primarily at smaller institutions. Canvas continues to gain users, pulling market share primarily from Blackboard and D2L/Brightspace (edutechnica, 2018). Thus, the survey results are unsurprising when considered in the broader context of national conversations about learning management systems and online course delivery.

[56] 4.3.2. Technical difficulties. Question 10 asked, “Have you ever had technical difficulty accessing (getting into or working in) this course?” This question used skip logic, so those who did not experience technical difficulties were moved to the next section of the survey. Most students reported they did not have difficulty accessing their OWCs. However, for those students who reported difficulties, 28% (n=111), a follow-up question asked what those difficulties were. Most responses indicated “system” problems in that students had problems accessing the LMS. Common access problems included “server was down,” “system was down,” “freezes,” and “connectivity issues.” The 28% of students who reported problems accessing the course indicated they were using the devices identified in Figure 9.
[57] The reported devices in Figure 9 align with Figure 5, which shows the devices students most commonly used to access and complete their online work (see the Course Information section). Few students reported access problems stemming from how the course was set up within the LMS or other tools used to deliver the course. They further conveyed that if problems existed within the course, instructors were responsive in resolving those issues. The following detailed response encapsulates the findings about technological access:

When I first enrolled in the course, I could view the course but I was unable to create a thread in the discussion board. So I was unable to do the assignment for the week. I simply emailed my professor and a few days later I was able to create a thread. In the past, I have had difficulties with accessing online courses due to poor internet connection at home. Also when using the app on a mobile phone it is very difficult to write in the discussion board. The page jumps around, deletes or hides things you’ve written or doesn’t allow you to fix mistakes.

4.4. Effectiveness of Course Activities

[58] As part of the Online Course Activities section, the following discussion centers on three questions that asked students to rate course tools and activities commonly found in OWCs and then to explain their ratings. For an in-depth discussion of these pedagogical activities and a way to improve online course content, design, and delivery, see “A Call for Purposeful Pedagogy-driven Course Design in OWI” in this issue.

[59] Question 15 asked, “Please rate the effectiveness of the following activities in your current online writing course as they relate to you improving your writing ability or becoming a better writer. There are two rows to add activities that are not presented in the options.” In this question, using a 1-5 Likert scale, students were asked to rate the effectiveness of common tools and activities implemented in their OWCs as they related to improving their writing. Students could indicate whether the item was not incorporated in their current course, and a write-in option also was built in. The items in this question included:
[60] Question 16 followed, asking, “Considering your responses in Question 15, please identify what work in your current online writing course is the most valuable or helpful to you in improving your writing and explain why.” Question 17 asked, “Considering your responses in Question 15, please identify what work in your current online writing course is the least valuable or helpful to you in improving your writing and explain why.” Results in this section include both quantitative data from Question 15 (shown in Figures 10-17) and qualitative data from Questions 16 and 17.

[61] 4.4.1. Discussion boards. The quantitative data regarding discussion boards suggest that most students found this tool helpful in OWCs (see Figure 10).
[62] When coupled with the open-ended responses, however, contrary results about discussion boards emerged. Although we are uncertain why such a discrepancy between the quantitative and qualitative data sets existed, by considering both data sets, we speculate a possible disconnect between students’ general perceptions of this learning tool and their perceptions of—and familiarity with—asynchronous communication for specific uses in their OWCs.

[63] Students who reported discussion boards as being useful generally commented on the benefit of receiving multiple perspectives on their writing.

- I find the discussion boards most valuable and helpful. There, we students can collaborate and gain different perspectives of assigned discussions.
- The discussion board is the most useful because it is nice to have a place where we can communicate. It provides feedback and is similar to a face-to-face class.
- The discussion board as well as instructor feedback on my writing. It helps me to better gauge the work that is expected of me and work out any smaller issues that I may have overlooked during my writing process. It helps extensively with content development.

[64] Students who remarked that discussion boards are among the least useful activities generally perceived participation in discussion as being “forced,” which they reported as resulting in thoughtless and meaningless responses and peer feedback. Michael Wilson et al. (2015) reported that students’ “trust in the ability of other students to mark their work was quite low” (p. 22), which appears to match some results of this survey based on open-ended comments. Specifically, students reported not understanding how discussion posts improved their writing, and they did not see how peer response—either through their responses to others or in others responding to them—improved their writing, as indicated in the comments below.

- Discussion boards can help generate new ideas but don’t necessarily help my writing, in my opinion.
- Discussion posts. They seem to be used as “filler” points for the class. I have yet to complete a discussion post that brings value to the course.
- Discussions. I feel like I don’t get much out of the peer reviews and discussions because the other students in the course don’t give me feedback I find helpful.
- The discussion board is the least valuable because I feel people just are writing to complete it rather than having meaningful discussion.

[65] Only 43% of the respondents in a national survey of online students (Schaffhauser, 2018) found discussion boards helpful as class activities. This result is similar to our findings and, in some ways, connects to data below on peer reviews, wherein students may not have seen the value of individual student comments. Yet, per other studies, when students were guided toward self-reflection and self-assessment of their writing, they expressed that discussion board use could help them improve their writing and learning (Nielsen, 2012; Papadopoulos, Lagkas, & Demetriadis, 2017). Student comments in this study also suggest that the use of discussion boards, while widespread, needs to be carefully considered—and explained pedagogically from a student-centered perspective—when designing OWCs.

[66] 4.4.2. Quizzes, tests, and assigned readings. Figures 11 and 12 display quantitative results for students’ perceptions of the helpfulness of quizzes, tests, and assigned reading. Students wrote comments concerning quizzes and tests that often were coupled with comments about assigned reading; therefore, we present these results together.
[67] Students reported that quizzes generally were assigned to test whether they had completed the reading, suggesting that they understood quizzes can be helpful for gauging their understanding of the readings (e.g., “The quizzes help focus on what’s important in the readings”), but some students commented that they did not know how reading about writing could improve their writing:

- Reading a textbook about writing is not helpful I think it takes more experience and practice to develop writing skills.
- I think the course readings are somewhat hard to follow at times, especially since we can’t see an in-person demonstration or explanation of the material. There are times it feels like it would be quicker and easier to learn some of the book material in class, so we can learn from our peers and professor about our specific concerns right then and there in class.

[68] Some students reported readings as being most helpful. In the open-ended responses for Questions 16 and 17, some students simply listed the textbook as being effective with no other comments; others reported the textbook and readings to be beneficial for several reasons, such as those provided below.

- The text and assigned readings because they were geared to a specialized for of writing.
- The course readings were the most valuable work assignments because I learned to look at writing differently which opened a door to new dimensions of writing. Writing became something I admired, not just a literacy requirement.

[69] One take-away from these data is that quizzes and readings can be seen as “added” assignments if they are not connected to the other assignments in an OWC. For example, Ingrid Spanjers et al. (2015) found that in hybrid environments, frequent quizzes help students “[space] their learning activities” (p. 72), which may positively affect their achieving learning objectives, and feedback on quizzes helps students determine what is important. Additionally, Mary Margaret Kerr and Kristen Frese (2017) noted four reasons that students do not read, one being an underestimation of the importance of the reading. Thus, online writing instructors should explain to students how readings and quizzes connect with learning outcomes and other assignments in the course; additionally, instructors should provide feedback to help students realize the connections and importance of the reading, which is addressed in “A Call for Purposeful Pedagogy-Driven Course Design in OWI.”

[70] 4.4.3. Synchronous chats, podcasts, videos, and PowerPoints. Figures 13, 14, 15, and 16 display the results of student rankings of various multimedia teaching tools. Although these tools typically were considered at least somewhat helpful when instructors used them, students appeared not to have much experience with instructional synchronous chats, podcasts, videos, and PowerPoints, as evidenced by the significant number of responses in the “N/A” category. Regarding OWI as a course that potentially teaches multimodal composition, these results are somewhat dismal. If instructors do not use multimedia (particularly in the TPC OWCs prevalent in the survey responses), it is highly unlikely that such skills are taught as part of the composing processes overall.
[71] When evaluating how students responded to open-ended questions about the helpfulness of multimedia in OWCs, we found that the open-ended responses yielded more information than the numbers/percentages alone. In the open-ended responses, some students simply listed chats, podcasts, videos, or PowerPoints as being most or least helpful with no further explanation, but for those who gave explanations, the responses provide some reasons why students found these tools or activities helpful or not, such as how these activities or tools provided clarification about the course or their assignments.

- Real-time synchronous chats work best for me; if I have questions, I usually get an immediate answer, either from other students or the instructor.
- the [video-recorded] lectures helped me most because while not face to face they were still more informative than me trying to read and figure it out myself.
- i love the videos provided for the lecture. it’s as though you are actually in the classroom. i love the feedback. i like the incorporation of google for like EVERYTHING in the class.
- This is an editing class, so there is not as much writing, but we are learning the skills and grammar rules that strengthen my writing and professionalism. However, The PowerPoint slides we have used in class have been the most helpful.
- Posting presentations and handouts in the discussion forum. We can access these to study for the midterm.

[72] PowerPoints and videos were remarked upon most often when students provided explanations for whether they thought creating such multimodal products as writing activities was helpful. As with reading, a lack of context and of knowing how to apply the information to their composition was a common complaint about videos and PowerPoint slides, as was the volume of these materials.

- I found the video responses to be the least helpful to my writing abilities. They were more focused on responding to social issues but I find writing out thoughts to be a more productive source of learning.
- creating videos. this helps my speaking skills but not my writing skills.
- Videos because I usually don’t have the time to sit down and watch them. If I do, I am distracted and don’t learn anything from them
- Powerpoints. I don’t read / watch them. You can’t use a presentation medium to deliver information without a presenter.
- The work that is the least valuable/helpful to me would have to be any PowerPoint/lecture slides. The course is primarily based off of videos and essays, so in terms of lectures, this is not specifically provided for the class.

[73] Technology is an integral part of OWI; however, as one can see from these responses, students were using multimedia to retrieve information to study from or to get answers to questions they had. Likewise, students expressed that they disliked multimedia in the classroom because they saw these activities as time-consuming activities that simply deliver important or not-so-important information to them. They did not appear to see multimedia as composition devices, which raises the question of whether instructors are using multimedia as delivery devices only or as compositional tools. As Scott Warnock (2015) mentioned, “Using audio/video is one way technology can enhance communications” (p. 158); thus, technology in the OWC can be used both as a delivery system and as a compositional aid that students should be taught to use in support of conveying the message they seek to express. GSOLE advocates that students learn this literacy skill as well as ways to read and write alphabetic text; the use of multimodal technologies should be connected with specific OWI pedagogy (Cargile Cook & Grant-Davie, 2005, 2013; Hewett, 2004-2005, 2006, 2010, 2011, 2015a, 2015b; Hewett & DePew, 2015; Hewett & Ehmann, 2004; Paull & Snart, 2016; Warnock, 2009), which may not be the case given the student feedback in this survey.

[74] 4.4.4. Instructor feedback and peer reviews. Figure 17 displays the quantitative responses regarding the helpfulness of instructor feedback and of peer review, whether giving or receiving feedback.
[75] Respondents overwhelmingly reported that the most helpful type of feedback came from their instructor (Figure 17).

- In my opinion, instructor feedback is the most valuable or helpful. The instructor grades according to rubric and they explain why the paper may or may not have touched on all require information.
- Instructor feedback- clarity in what I am doing wrong and if I am doing something right.
- Direct comments on a written item from the professor. Its the most personal to me and can help me see my downfalls and where I need to improve. I think having students review work is positive, but value of reviews is often not that helpful for me personally.
- Receiving instructor feedback is the most helpful in improving my writing, because sometimes students don’t have all the information that my professor has that could help me.

The responses concerning instructor feedback are noteworthy because they indicate that students value their instructors because of their expertise in writing, they value their instructor’s feedback on their writing, and they want and need advice/directions from instructors regularly. Thus, student responses confirmed that instructor presence is a crucial aspect of successful OWI and can influence student participation, overall learning experiences (Richardson, Besser, Koehler, Lim, & Strait, 2016), and satisfaction in online courses (Ladyshewsky, 2013).

[76] In terms of peer review, in this survey, 22% (n=68) of students reported peer review as helpful and 15% (n=33) did not find it helpful. Students who commented in the qualitative responses that peer reviews were helpful mentioned that they had an audience other than the instructor that they wanted to impress. According to Arianne Rourke (2013), students reported that peer review of their work leads them to “feel less alone, more supported and more motivated to continue with the writing process” (p. 5). Additionally, they appreciated receiving different perspectives on their ideas and their writing, as reflected in the comments below.

- I think the feedback is the most helpful because you get to see different opinions on it.
- Definitely, feedback from my classmates. I was able to see what I was doing wrong or could improve on before submission to the teacher.
- It was most helpful having feedback from peers and my teacher and being able to stay in constant communication because I was able to identify what needed work in my writing.
- Peer reviews is hands down the most helpful asset in this course as it helps other students perfect their own work when critiquing others.

A common thread in these comments is that students may be using peer reviews to assess their own writing. Kristen Nielsen (2012) reported that formative activities, such as peer review, can improve student writing achievement and learning; however, both Nielsen (2012) and Pantelis Papadopoulos et al. (2017) mentioned that such improvement may come from writers analyzing their own work through their analysis of others’ writing and not necessarily from the comments provided by their peers in the review, which could be one reason for the negative comments about peer review received on the survey, such as:

- If I had to pick one, I would say peer feedback, just because peers are usually not trained in written response. The way they frame their comments can sometimes come across as rude or they comment on areas that aren’t that important, instead of areas that would be helpful.
- I find that peer reviews tend to be the least valuable, because most students and simply participating for the credit rather than giving meaningful information to improve the documents.
- Peer feedback. Sometimes peers don’t give legitimate feedback or do not know what they are talking about.

Thus, instructors may need to reconsider how peer review is commonly conducted in OWCs and what guidance students are given for giving—and receiving—peer reviews.

[77] Comments regarding which activities students found to be least useful were not insignificant complaints. To characterize comments from this section of the survey generally, students did not recognize why they were doing the work they were assigned in their OWCs (which may, in fact, be a challenge for students in more traditional classrooms as well); they expressed uncertainty about the structure, content, and participation in an OWC; and they did not see how the assigned work related to improving their writing. Therefore, we must ask: Are the OWCs these students experienced strategically and pedagogically designed to help them become better writers? If not, what features of the OWCs could be improved and how? (A potential answer to this question is posited in “A Call for Purposeful Pedagogy-Driven Course Design in OWI.”) These questions are crucial points for administrators to address in conversations and professional development with faculty. Furthermore, it seems important to ask whether OWCs are designed by writing experts (the instructor), and whether instructors have access to the institutional support they need to provide quality OWI in this primarily digital setting. These issues are addressed in the implications section.

[78] 4.4.5. Other. The pedagogical activities question was followed by two open-ended response questions.
[79] Question 16 asked, “Considering your responses in Question 15 [regarding effectiveness of tools], please identify what work in your current OWC is the most valuable or helpful to you in improving your writing and explain why.”

[80] Question 17 asked, “Considering your responses in Question 15 [regarding effectiveness of tools], please identify what work in your current OWC is the least valuable or helpful to you in improving your writing and explain why.”

[81] Many of the responses to these two questions were similar to those for the preceding open-ended questions, where students were asked to clarify their comments in relation to the Likert-scaled questions, so the responses addressed some of the same issues raised previously.

[82] Question 18 asked, “What is not included in your online writing class that would benefit the learning experience for you in relation to improving your writing?” Student comments in response to this question shed light on pedagogical activities that administrators and faculty should consider, since a few aspects of the course were repeated more than others. These included students asking for more visual resources (i.e., PowerPoints, podcasts, and synchronous chats). Specifically, students requested videos as a way to feel the “human” element of online courses (e.g., “more videos from the professor, the audio video is helpful to bring the human element into the digital classroom.”). There also were several comments associated with desiring better directions and more context for assignments (e.g., “I want lectures on the topic. Stuff to put the reading or blog posts into context. We just have five major assignments and no real curricula in my course. The prof doesn’t seem real because there’s no teaching involved.”).

[83] The final content question of the survey, Question 19, asked, “Do you have any other comments, concerns, or suggestions about online writing courses that you feel we should know?” Student responses did not contain repeated elements as in the previous question. Because students had not been queried about the content of their OWCs, it was an interesting finding that learners addressed context for assignments, grading, and appreciation for their instructors.

- Context for the assignment: Concerned that my fellow classmates and I aren’t given a lot of context for our writing assignments, and therefore we do not put in a lot of time and energy into them...why we’re writing is not made clear, in other words. How is a literary analysis going to help me do my job?
- Grading: Please don’t make a class that is graded based on labor that is bell curved against the other students’ labor. Please don’t make a class that is without lectures and only uses blog posts to convey tiny snippets of learning. Please don’t make a class where even if you do all the assignments perfectly, if someone commented more, it counts as more labor, so they get the higher grade when there’s no set standard for a participation grade. If you guarantee people a B for doing all the assignments moderately okay, but don’t actually let them know if their labor is higher than the rest of the class, or in the middle, they’re gonna complain about how they don’t actually know what their grade is. Thanks.
- Grading: An online writing class should not be graded solely on participation, as that is a biased remark. I, for one, think about my papers for several hours over a span of days before I begin writing it, but I know some people that can crank out a solid paper in less than an hour. That being said, a writing course should be focused on content and correctness, rather than “effort”.
- Appreciation for organized instructors: I love the structure of this course! I love that my instructor sends out emails before the new week begins which tells us what we have in store for that particular week and when we have due dates coming up. I also love that we are given a checklist with what we have to do for each week, as well as additional links that help with that week’s lesson. Organization really helps me in courses!
- Appreciation for organized instructors: It is *so* helpful when the instructor is organized and has explicit expectations for every aspect of the class (when homework is due, what’s the plan for the course, what readings are due and where a student can find them, where and how should a student submit homework, etc). Having clear expectations and directions takes a lot of stress off the student because they don’t have to try to guess what the teacher wants.

[84] Additionally, some students used this space for self-reflection about their learning styles, such as whether they preferred online learning or face-to-face classes, and why they were taking online classes (e.g., schedule conflicts with campus-based courses). These included the following:

- Would prefer in person classes but they don't fit my schedule.
- I don't like taking courses online because it doesn't allow for flowing and interesting discussion among classmates. I took regular composition and found it much more useful. The classroom interaction, exercises, and feedback can't be replicated online.
- The reason I attend online class from home rather than work (where I would prefer) is because Zoom, Slack, and Skype Groups are all blocked from my work by Internet policy. I received exemption for a while, but it has to be renewed every month.
5. Implications

[85] This article has presented the survey data with generalized findings; this section offers a brief discussion of implications. The companion article “A Call for Purposeful Pedagogy-Driven Course Design in OWI” offers analytical discussion of the findings and a model for purposeful, pedagogy-driven course design.

[86] Some of the most illuminating responses from the survey related to the tools and activities in OWCs that students found least helpful for improving their writing and to the importance of the role and presence of the instructor. Students disclosed their perceptions about pedagogical approaches and components of OWC design as they related to improving their own writing without, perhaps, being cognizant that they were providing such information. When viewed comprehensively, student comments in this section revealed a disconnect between the intended pedagogical application (as we speculate based on the scholarly literature) and how the tools and activities were perceived—and used—by students regarding the improvement of their writing. It seems important to note that students did not express that the tools and activities in OWCs were ineffective or unhelpful in and of themselves. Instead, they communicated that the tools and activities were not always implemented in ways that improve student writing, rendering them somewhat ineffective or unhelpful for these respondents.

[87] These results strongly indicate that instructors should consider how their discussion board prompts are designed, how the prompts function to meet learning objectives, what kind of feedback (i.e., grammar, content, and/or organization, among others) the instructor has been providing (and how that feedback relates to improving each individual student’s writing), and what kind of feedback students are expected to provide to each other. Sheri Williams, Amy Jaramillo, and John Carl Pesko (2015) argued that “if instructors make their expectations explicit regarding depth of posts and exploration/problem resolution and collaboration/reflection, students will come to value and use these processes to extend their thinking” (p. 62). How could Williams et al.’s perspective be brought to fruition in an OWC? Warnock (2009, 2018) expressed that discussion boards can be more useful when the instructor takes on a respectful yet critical persona that questions student assumptions and asks them to rethink their statements in an asynchronous discussion. Is this a skill limited to one educator, or can all online writing instructors learn and implement it?

[88] Furthermore, peer feedback arose as a meaningful concern for students. If peer feedback is expected to help students improve their own writing, how should it be taught as a skill (or measured as a competency)? The data made apparent that students valued instructors and instructors’ expertise with writing; furthermore, the data clearly revealed that these student respondents preferred and desired instructor feedback to improve their writing. This point has important implications regarding the role of the instructor in an OWC because instructor feedback is crucial in supporting students’ confidence in their application of course material (Borup, West, & Thomas, 2015; Hewett, 2015b).

[89] Finally, but not to be minimized, the data indicated that instructors should explain the rationale—and long-term learning impact—for their specific uses of readings, quiz assessments, and peer review in each OWC’s context, as well as for the assignments themselves. By taking the time to include a pedagogical foundation for course activities, instructors may motivate students to more effectively attempt or complete tasks in an OWC; as a result, students may more substantially associate successful completion of an assignment with framed and scaffolded learning activities. (See “A Call for Purposeful Pedagogy-Driven Course Design in OWI.”)

[90] This point also emerges in other data discussed above—students expressed that they did not know how the work in OWCs helped them improve their writing. Thus, one implication is that the potential to learn should not only be apparent to instructors and administrators in the form of course learning outcomes—it should be made apparent to students, and the orientation is one place to do so. In other words, the course design should ensure that course activities and course tools are actively and meaningfully contributing to successful completion of course assignments and that students can understand these connections throughout the course (Cargile Cook & Grant-Davie, 2005, 2013; Hewett, 2004-2005, 2006, 2010, 2011, 2015a, 2015b; Hewett & DePew, 2015; Hewett & Ehmann, 2004; Paull & Snart, 2016; Warnock, 2009).
6. Future Studies
[91] Data from students about their learning experiences in OWCs are vital to the future of OWI. As the first attempt to gather student perceptions on a national scale, the survey raised additional questions key to future research. Acknowledging—and including—student perception and student voice in studies associated with OWI will improve faculty understanding of these issues from the learners’ perspectives.
[92] Future studies should focus on and expand attention to aspects of student learning in OWCs beyond what this survey attempted. Especially important, such studies should address the multifaceted nature of access, including physical and cognitive abilities, language proficiency, and socioeconomic conditions. Beyond access, the modality in which the course is offered (i.e., whether it was asynchronous or synchronous and whether it was fully online or hybrid) needs to be considered to provide additional context for student responses; hybrid courses vary in their inclusion of required (or optional) synchronous face-to-face and/or fully online sessions, which affects student experiences. Moreover, studies should address the physical location in which students access the LMS, as it may affect their experiences both in using course tools and in completing course activities. For example, students who access the course primarily from a work location may be completing course tasks under time constraints or in a hectic environment where course activities are completed consistently but in a fragmented, disjointed way.
[93] At this point in the development of OWI, more research is needed to better understand the student learning experience and what students want and need in an OWC. This survey and its results are only the first steps in the necessary evolution of research from the students’ perspective. Researchers interested in building on the approach taken by the student-survey working group have full access to the survey and survey data; we hope they will advance and expand the work presented here.
7. Conclusion
[94] This survey was deliberately developed to understand student experiences and perspectives in online writing courses. Even though the literature on online writing pedagogy is vast and growing along with the number of students taking OWCs, it is worth examining whether instructors’ intentions and their perceptions of what should be happening in OWCs match student perceptions of what is happening to and for them in these classes. Such determinations cannot be made unless students contribute to the body of knowledge on this subject. In other words, simply looking at student evaluations at the end of each semester is insufficient; students must be participants in the research process, which was a major goal of this study.
[95] In reading and assessing student needs from the student point of view, we encourage instructors to consider student preparation for online writing courses and to investigate methods that may more successfully orient students to these courses. Additionally, as instructors develop courses, they should design course elements with attention to which components of online writing courses students find most and least helpful in improving their writing. Exploring creative yet effective ways to integrate the helpful components will improve the student experience in OWI. These efforts may require collaboration with instructional designers, central offices of distance learning, department heads, and writing program coordinators. Interdisciplinary associations of this nature allow for coordination among areas of expertise and communication among faculty, staff, and administrators to meet the common objective of student success and satisfaction.
[96] Even with the limitations of this survey, student responses provide instructors and administrators with new conversations and topics for further investigation regarding how best to teach writing online. One of the most important conversations is how to measure the pedagogical impact of OWCs on student writing. Online writing instruction is distinct from distance education courses in other disciplines, and it is important that students see and understand these distinctions. One point made repeatedly in the student data was the absence of a rationale for course design and course work. If students are made aware of and taught how to use feedback and online writing tools intentionally to improve their writing, they will understand the online writing activities they are asked to do and connect those activities with their own writing strengths and weaknesses, thus improving both their understanding of writing and their writing skills.
8. References
Anson, Ian G., & Anson, Chris M. (2017). Assessing peer and instructor response to writing: A corpus analysis from an expert survey. Assessing Writing, 33, 12-24.
Borup, J., West, R. E., & Thomas, R. (2015). The impact of text versus video communication on instructor feedback in blended courses. Educational Technology Research and Development, 63, 161-184.
Boyd, Patricia Webb. (2008). Analyzing students’ perceptions of their learning in online and hybrid first-year composition courses. Computers and Composition, 25(2), 224-243. doi:10.1016/j.compcom.2008.01.002
Bozarth, Jane, Chapman, Diane, & LaMonica, Laura. (2004). Preparing for distance learning: Designing an online student orientation. Journal of Educational Technology and Society, 7(1), 87-106.
Brunk-Chavez, Beth, & Miller, Shawn J. (2006). Decentered, disconnected, and digitized: The importance of shared space. Kairos, 11(2). Retrieved from kairos.technorhetoric.net/11.2/binder.html?topoi/brunk-miller/index.html
Cargile Cook, Kelli, & Grant-Davie, Keith (Eds.). (2005). Online education: Global questions, local answers. Amityville, NY: Baywood.
Chudoba, Brent. (n.d.). How much time are respondents willing to spend on your survey? SurveyMonkey. Retrieved from https://www.surveymonkey.com/curiosity/survey_completion_times/
Clinefelter, D. L., & Aslanian, C. B. (2016). Online college students 2016: Comprehensive data on demands and preferences. Louisville, KY: The Learning House, Inc. Retrieved from http://www.learninghouse.com/wp-content/uploads/2017/10/OCS-2016-Report.pdf
Clinefelter, D. L., & Aslanian, C. B. (2017). Online college students 2017: Comprehensive data on demands and preferences. Louisville, KY: The Learning House, Inc. Retrieved from https://www.learninghouse.com/wp-content/uploads/2017/10/OCS-2017-Report.pdf
Conference on College Composition and Communication (CCCC) Committee for Effective Practices in Online Writing Instruction. (2013). A position statement of OWI principles and effective practices. Retrieved from http://www.ncte.org/cccc/committees/owi
Cunningham, Jennifer M. (2015). Mechanizing people and pedagogy: Establishing social presence in the online classroom. Online Learning, 19(3), 34-47.
Dockter, Jason. (2016). Improving access with a course orientation. CCCC OWI Open Resource. Retrieved from http://cccc.ncte.org/cccc/owi-open-resource/improve-access?_ga=2.101018690.230881614.1524756874-54836739.1504453572
Eaton, Angela. (2005). Students in the online technical communication classroom. In K. Cargile Cook & K. Grant-Davie (Eds.), Online education: Global questions, local answers (pp. 31-48). Amityville, NY: Baywood.
Eaton, Angela. (2013). Students in the online technical communication classroom: The next decade. In K. Cargile Cook & K. Grant-Davie (Eds.), Online education: Global questions, local answers (pp. 133-158). Amityville, NY: Baywood.
Edutechnica. (2018, March 4). LMS data: Spring 2018 updates. Retrieved from https://edutechnica.com/2018/03/04/lms-data-spring-2018-updates/
Friedman, Jordan. (2017). Ask 4 questions about accessing online courses on mobile. U.S. News & World Report. Retrieved from https://www.usnews.com/higher-education/online-education/articles/2017-02-16/ask-4-questions-about-accessing-online-courses-on-mobile-devices
Ginder, Scott, Kelly-Reid, Janice, & Mann, Farrah. (2017). Enrollment and employees in postsecondary institutions, fall 2016; and financial statistics and academic libraries, fiscal year 2016. NCES, U.S. Department of Education. Retrieved from https://nces.ed.gov/pubs2018/2018002.pdf
Gos, Michael W. (2015). Nontraditional student access to OWI. In B. L. Hewett & K. E. DePew (Eds.), Foundational practices of online writing instruction (pp. 309-346). Fort Collins, CO: Parlor Press.
Hewett, Beth L. (2004-2005). Asynchronous online instructional commentary: A study of student revision. Readerly/Writerly Texts: Essays in Literary, Composition, and Pedagogical Theory, 11 & 12 (double issue), 47-67.
Hewett, Beth L. (2006). Synchronous online conference-based instruction: A study of whiteboard interactions and student writing. Computers and Composition, 23(1), 4-31.
Hewett, Beth L. (2013). Fully online and hybrid writing instruction. In H. Brooke Hessler, A. Rupiper Taggart, & K. Schick (Eds.), A guide to composition pedagogies (2nd ed., pp. 194-211). Oxford, England: Oxford University Press.
Hewett, Beth L. (2015a). Reading to learn and writing to teach: Literacy strategies for online writing instruction. Boston, MA: Macmillan.
Hewett, Beth L. (2015b). The online writing conference: A guide for teachers and tutors. Boston, MA: Macmillan.
Hewett, Beth L., & DePew, Kevin E. (Eds.). (2015). Foundational practices of online writing instruction. Fort Collins, CO: Parlor Press.
Hewett, Beth L., & Ehmann, Christa. (2004). Preparing educators for online writing instruction. Urbana, IL: NCTE.
Hill, Phil. (2017, October 31). State of higher ed LMS market for U.S. and Canada: Fall 2017 edition. E-literate. Retrieved from https://mfeldstein.com/state-higher-ed-lms-market-us-canada-fall-2017-edition/
Kerr, M. M., & Frese, K. M. (2017). Reading to learn or learning to read: Engaging college students in course readings. College Teaching, 65(1), 28-31.
Kimme Hea, Amy C. (Ed.). (2009). Going wireless: A critical exploration of wireless and mobile technologies for composition teachers and researchers. Cresskill, NJ: Hampton Press.
Ladyshewsky, Richard K. (2013). Instructor presence in online courses and student satisfaction. International Journal for the Scholarship of Teaching and Learning, 7(1), 1-23.
Lavrakas, Paul J. (2008). Encyclopedia of survey research methods. Thousand Oaks, CA: Sage Publications, Inc. doi:10.4135/9781412963947
Lee, Youngju, & Choi, Jaeho. (2011). A review of online course dropout research: Implications for practice and future research. Educational Technology Research & Development, 59(5), 593-618.
Lieberman, Mark. (2017, September 13). Welcome aboard. Inside Higher Ed. Retrieved from https://www.insidehighered.com/digital-learning/article/2017/09/13/orientation-programs-set-online-learners-success
Lutkewitte, Claire (Ed.). (2016). Mobile technologies and the writing classroom. Urbana, IL: NCTE.
McArdle, Casey. (2016). Mobile learning just keeps on running: Renegotiating online collaborative spaces for writing students. In C. Lutkewitte (Ed.), Mobile technologies and the writing classroom (pp. 117-132). Urbana, IL: NCTE.
Melonçon, Lisa. (2009). Understanding the service course. Unpublished raw data.
Melonçon, Lisa. (2012). The rise of academic programs: A call for collaboration. Intercom, 59(7), 13-15.
Melonçon, Lisa, & Harris, H. (2015). Preparing students for OWI. In B. L. Hewett & K. E. DePew (Eds.), Foundational practices of online writing instruction (pp. 411-438). Fort Collins, CO: Parlor Press.
Minter, Deborah. (2015). Administrative decisions for OWI. In B. L. Hewett & K. E. DePew (Eds.), Foundational practices of online writing instruction (pp. 211-226). Fort Collins, CO: Parlor Press.
Murphy, Daniel. (2002). Surveys and questionnaires. In M. M. Lay & L. Gurak (Eds.), Research in technical communication (pp. 93-110). Westport, CT: Praeger.
Noel-Levitz. (2015). 2015-16 national online learners priorities report. Coralville, IA: Noel-Levitz. Retrieved from www.noellevitz.com/Benchmark
Ooi, Keng-Boon, Hew, Jun-Jie, & Lee, Voon-Hsien. (2018). Could the mobile and social perspectives of mobile social learning platforms motivate learners to learn continuously? Computers & Education, 120, 127-145.
Paulhus, Delroy, & Vazire, Simine. (2007). The self-report method. In R. W. Robins, R. C. Fraley, & R. F. Krueger (Eds.), Handbook of research methods in personality psychology (pp. 224-239). New York, NY: Guilford Press.
Paull, Joanna N., & Snart, Jason A. (2016). Making hybrids work: An institutional framework for blending online and face-to-face instruction in higher education. Urbana, IL: NCTE.
Richardson, J. C., Besser, E., Koehler, A., Lim, J., & Strait, M. (2016). Instructors’ perceptions of instructor presence in online learning environments. International Review of Research in Open and Distributed Learning, 17(4), 82-103.
Rife, Martine C. (2013). Invention, copyright, and digital writing. Carbondale, IL: Southern Illinois University Press.
Rodrigo, Rochelle. (2015). OWI on the go. In B. L. Hewett & K. E. DePew (Eds.), Foundational practices of online writing instruction (pp. 493-516). Fort Collins, CO: Parlor Press.
Rourke, Arianne Jennifer. (2013). Assessment ‘as’ learning: The role that peer and self-review can play towards enhancing student learning. The International Journal of Technology, Knowledge, and Society, 8(3), 1-12.
Ruffalo Noel Levitz. (2016). 2015-16 national online learners satisfaction and priorities report. Cedar Rapids, IA: Ruffalo Noel Levitz.
Schaffhauser, Dian. (2018). Two-thirds of online students do some coursework on a mobile device. Campus Technology. Retrieved from https://campustechnology.com/articles/2018/06/19/two-thirds-of-online-students-do-some-coursework-on-a-mobile-device.aspx
Skurat Harris, Heidi Ann (Ed.). (2017). Bedford Bibliography of Research in Online Writing Instruction. Macmillan Community. Bedford/St. Martin’s/Macmillan.
Spanjers, Ingrid A. E., Könings, Karen D., Leppink, Jimmie, Verstegen, Daniëlle M. L., de Jong, Nynke, Czabanowska, Katarzyna, & van Merriënboer, Jeroen J. G. (2015). The promised land of blended learning: Quizzes as a moderator. Educational Research Review, 15, 59-74.
Taylor, J. M., Dunn, M., & Winn, S. K. (2015). Innovative orientation leads to improved student success in online courses. Online Learning, 19(4). Retrieved from https://files.eric.ed.gov/fulltext/EJ1079576.pdf
U.S. Department of Education, National Center for Education Statistics. (2018). Table 311.15. Retrieved from https://nces.ed.gov/programs/digest/d17/tables/dt17_311.15.asp?current=yes
U.S. Department of Education, National Center for Education Statistics. (2018). Fast facts. Retrieved from https://nces.ed.gov/fastfacts/display.asp?id=98
U.S. Department of Education, National Center for Education Statistics. National Postsecondary Student Aid Study: 1986-87 (NPSAS:87), 1989-90 (NPSAS:90), 1992-93 (NPSAS:93), Data Analysis Systems. Retrieved from https://nces.ed.gov/pubs/web/97578e.asp
Vazquez-Cano, Esteban. (2014). Mobile distance learning with smartphones and apps in higher education. Educational Sciences: Theory & Practice, 14(4), 1505-1520.
Warnock, Scott. (2009). Teaching writing online: How and why. Urbana, IL: NCTE.
Warnock, Scott, & Gasiewski, Diane. (2018). Writing together: Ten weeks teaching and studenting in an online writing course. Urbana, IL: NCTE.
Williams, Sheri S., Jaramillo, Amy, & Pesko, John Carl. (2015). Improving depth of thinking in online discussion boards. The Quarterly Review of Distance Education, 16(3), 45-66.
Wilson, Michael John, Diao, Ming Ming, & Huang, Leon. (2015). ‘I’m not here to learn how to mark someone else’s stuff’: An investigation of an online peer-to-peer review workshop tool. Assessment & Evaluation in Higher Education, 40(1), 15-32.