
Exploring Student Success

Modality Implications on Accessibility, Student Support, & Class Community

by Cat Mahaffey



Publication Details

 OLOR Series:  Research in Online Literacy Education
 Author(s):  Cat Mahaffey
 Original Publication Date:  19 December 2025
 Permalink:

 <gsole.org/olor/role/vol4.iss2.e>

Abstract

This study explores the impact of course modality on student success, accessibility, community, and support in first–year writing courses at a large urban public university. Two instructors taught identical courses in asynchronous online, synchronous online, and face–to–face formats, implementing best practices for asynchronous design across all modalities. Data was collected through final exam responses and institutional research, analyzing student perceptions and performance across the three delivery modes. Key findings include the following: synchronous online students achieved the highest grades, while asynchronous students had higher withdrawal and failure rates; all modalities were perceived as highly accessible, with face–to–face courses rated most accessible; sense of community was comparable across modalities, with asynchronous students providing more detailed feedback; and student perceptions of instructor support were consistent across all modalities. The results suggest that implementing asynchronous design principles can enhance accessibility and community across all modalities. The research highlights the importance of active instructor presence, multiple communication pathways, and the need for better guidance in course selection to match student strengths with appropriate modalities.

Keywords: Accessibility, support, community, modalities, participatory design

Resource Contents

1. Introduction

[1] The rapid shift to online and remote teaching during the pandemic of 2020 highlighted multiple gaps in course design and delivery. Many of us in the Online Literacy Instruction (OLI) community were not surprised at the lack of preparedness that both students and instructors had in navigating online and distance learning environments. Of note is that many instructors were encouraged to move to synchronous online teaching (i.e., Zoom) because it was believed that transitioning to this mode would be more straightforward than, say, trying to build an asynchronous learning environment. For me, this raised several questions I had not fully considered before the pandemic: Is synchronous online course delivery better than asynchronous? Do synchronous courses successfully mimic face–to–face courses? Which mode of course delivery is most appropriate and effective for the diverse populations of undergraduate students?

[2] Thus, this project was born and implemented during the fall semester of 2022, the first semester my university resumed a somewhat normal, onsite schedule. I say somewhat because what my co–researcher and I did not know at the time was that the incoming freshman class of 2022 was anything but typical. It’s a testament to our curiosity and motivation that we were willing to make major pivots and amendments to our research protocol when we encountered unforeseen hardships stemming from student struggles with mental health and lack of focus and engagement.

[3] In an effort to explore modalities and student success, two instructors (myself and my co–researcher) taught the same course at a large public urban university in three different delivery modes: asynchronous online, synchronous online, and face–to–face. While my co–researcher is not co–authoring this article, some steps in the project were completed together, and others I completed on my own. I will use “we” to denote shared processes and “I” to denote my own individual work and assertions.

[4] Both instructors designed their Canvas courses identically for all three modalities and made every attempt to align instructions, assignment descriptions, feedback, and interactions (both instructor–student and student–student) as much as possible. Students in all modalities completed assignments and activities on the same timelines. To ensure that all sections were pedagogically sound, the instructors leaned toward course design that implemented asynchronous online best practices, including features like video walkthroughs for all assignments and module–based navigation. Our goal was to leverage participatory design principles, involving student voices to ensure their needs and perspectives were central to our findings. To that end, we sought to capture student perceptions of three different course elements: accessibility, community, and support, and to compare and contrast perceptions of these elements across modes. In addition, final grade data was collected from my school’s Institutional Research Department to compare student success across modes, or at least as much as success can be determined by such data.

2. Brief Literature Review

[5] Effective course design can be daunting, as many educators suddenly realized in 2020. As mentioned above, this project focuses on three main elements of course design: accessibility, community, and support. While I explore each element separately below, the overlaps and gaps in understanding and implementing them in effective ways cannot be overstated.

[6] Before moving forward, I want to clarify that my understanding of accessibility, community, and support is informed by ongoing conversations among scholars about how best to measure student success. I don’t view grades as the major indicator of success. Rather, I have argued that determining student success involves some combination of grades, student engagement, and the ability to navigate and utilize technology efficiently (Mahaffey & Walden, 2019). For further reading on this topic, I recommend Burgstahler & Cory (2008), who argue that success should be measured not just by grades but also by the ability to apply knowledge in real–world settings, or Crow (2008), who asks instructors to consider students’ ability to overcome their own challenges and their individual progress toward content mastery. Overall, scholars who focus on accessibility, community, and student support suggest a holistic approach, one that considers students’ individual challenges, diverse learning styles, and the ability to apply knowledge in practical contexts.

2.1. Accessibility

[7] Scholars have long noted the importance of attention to accessibility for students, with heightened attention to courses delivered through online/digital interfaces (Burgstahler & Cory, 2008; Crow, 2008; CAST, 2011; Mahaffey & Walden, 2019). While there are multiple definitions and lenses through which to describe and measure accessibility, this project includes participants in the general population of first–year writing courses at a large urban public university; therefore, it draws upon notions of access espoused by scholars who are, for the most part, outside Disability Studies, in particular scholars who explore topics like assignment design and navigation at all levels of a course. For example, Bourelle et al. (2013) focus on learner–centered pedagogy and multimodal instruction, which involves careful navigation and assignment design, and Cargile Cook & Grant–Davie (2005) outline a five–step process for developing pedagogy–driven online courses, including defining course goals, activities, and assessment strategies, all of which are crucial for effective navigation and assignment design.

2.2. Community

[8] Like accessibility, the importance of developing a sense of community within a classroom is well documented in the scholarship (Bender, 2003; Carter & Rickly, 2005; Blair & Hoy, 2006; Warnock, 2009; Cunningham, 2015; Borgman & Dockter, 2016; Hilliard & Stewart, 2019). Students need to feel a sense of belonging and shared purpose in order to learn and be motivated to work through challenging concepts. While there are myriad ways of fostering community in the classroom, this concept essentially comes down to interactions, building in moments throughout a course where students interact with each other and their instructor. Some of the most frequently noted elements that build community are discussions (both synchronous and asynchronous), group projects, and peer workshops (see Bender, 2003; Blair & Hoy, 2006; Boyd, 2008; Comer et al., 2014; Cunningham, 2015).

2.3. Support

[9] Scholars note the importance of student support across a broad range from technical to academic (Palloff & Pratt, 2001; de Montes et al., 2002; Mick & Middlebrook, 2015; Stella & Corry, 2013; Snart, 2017). In many ways, support is often discussed as intertwined with accessibility and community. For example, Brickman (2003) relates clear instructions and expectations to both support and accessibility. In a similar way, Cox et al. (2015) and Blair & Hoy (2006) note that timely and substantive feedback enhances instructor presence, which contributes to student perceptions of both community and support. Indeed, providing support for student learning is complex and perhaps overwhelming at times, but it’s clearly a vital component of course design.

3. Methodology

3.1. Participatory Design

[10] Participatory design, in a nutshell, invites users into the design process beginning in the early stages and values input from users/consumers/students of products, technologies, and content. Such an approach complicates our notion of what positive outcomes and engagement can look like (Lazar, 2007; Oswal, 2014; GSOLE, 2019; Still & Crane, 2017). Scholars of online literacy instruction (OLI) have long espoused the importance of course design that is attentive to student input and perception of accessibility, community, and student support; however, it’s also clear that such methods are labor–intensive and require additional professional development (Cargile Cook & Grant–Davie, 2005; Oswal & Meloncon, 2017; Hitt, 2018; Borgman & Dockter, 2016; Mahaffey & Walden, 2019). Published in 2019, the Global Society of Online Literacy Educators’ (GSOLE’s) Principles and Tenets recommends “regular, iterative” professional development in OLI Principle 3: Tenets of OLI Design and Pedagogy. Indeed, the dynamic nature of technology–enhanced teaching environments requires that teacher–scholars continually evaluate the effectiveness of their teaching across all modes. Participatory design is perhaps the most effective way to engage in such ongoing revision of course design and delivery because it is informed by students.

[11] As an embodiment of participatory design practices, this study embraces a broad definition of accessibility, focusing on assignment design and navigation at all levels of a course rather than merely considering ADA compliant practices like captioned videos, accessible PDFs, or alt text for images. This expanded notion of accessibility aligns with participatory design principles, providing student agency without the risk of self–disclosing a disability, ensuring inclusivity and comfort for all students.

[12] To incorporate participatory design principles, the original design aimed to survey first–year writing students at three different points in the semester so that we could have them focus on each concept separately (accessibility, community, and support). We planned to follow up on these surveys with a focus group of students from across the modalities. This plan shifted in the last full month of the semester because we were unable to get a representative number of students to complete the surveys and had no volunteers for our focus groups. We attribute this to the many post–COVID issues with student engagement and mental health, as well as our own unexpected troubles transitioning back to onsite teaching in 2022. This breakdown in our original study design forced us to grapple with whether or not to abandon the project; however, since we were unsure if we would have another opportunity to teach across modes, we instead switched gears and designed a reflective final exam to capture student perceptions of the courses, coupled with demographic and final grades data we could collect from our school’s institutional research office.

3.2. Final Exam Data

[13] The final exam became our main data set. It included three question sets, one for each class component. Each question set included a brief definition followed by open–ended questions:

Question Set 1: Accessibility

Definition: For the purposes of this question, accessibility refers to your ability to locate, understand, and/or use elements and features of the course.

What worked well in terms of accessibility in this course? Give explicit examples from the course: ex. assignments, resources, calendar events, to–do list items, etc.

What are some areas for improvement?

Question Set 2: Classroom Community

Definition: For the purposes of this question, classroom community refers to a sense of connection with classmates and your instructor, and the feeling that you are sharing the classroom space and the course journey with them.

What worked well in terms of classroom community in this course? Give explicit examples from the course: ex. responding to group work, research teams, peer workshops, informal lines of group communication, etc.

What are some areas for improvement?

Question Set 3: Student Support

Definition: For the purposes of this question, student support refers to help, guidance, feedback, and communication between you and your instructor.

What worked well in terms of student support in this course? Give explicit examples from the course: ex. responding to emails, office hours, teacher feedback, discussion forums, etc.

What are some areas for improvement?

This produced a robust qualitative data set. The exams were sorted based on which students had consented to the use of their exam submissions for research purposes and then coded using NVIVO. The coding process took several rounds. I first did some preliminary discovery coding to understand what I had to work with. Then, I discussed the discovery coding results at length with my co–researcher, who helped me develop the codes and descriptors using the samples (see Table 1).

Table 1. NVIVO codes with descriptions of each code. We used a combination of numbers and letters to designate the three components. For example, the code A1 stands for Accessibility Level 1, meaning the student expressed they “cannot locate” course materials.

Code  Description

A1 Cannot locate
A2 Some difficulty locating
A3 Neutral perception of accessibility
A4 Relatively easy to locate
A5 Very easy to locate
Accessibility – Suggestions for improvement Expressions of desire for different/better course elements and/or instructor practices
Accessibility – Helped Course elements and/or instructor practices identified as enhancing accessibility
Accessibility – Hindered Course elements and/or instructor practices identified as hindering accessibility
C1 Not a sense of community
C2 Little sense of community
C3 Neutral
C4 Measurable sense of community
C5 High sense of community
Community – Helped Course elements and/or instructor practices identified as enhancing community
Community – Hindered Course elements and/or instructor practices identified as hindering community
Community – Suggestions for Improvement Expressions of desire for different/better course elements and/or instructor practices
S1 Not effective
S2 Somewhat effective
S3 Neutral effectiveness
S4 Effective
S5 Highly effective
Support – Helped Course elements and/or instructor practices identified as enhancing support
Support – Hindered Course elements and/or instructor practices identified as hindering support
Support – Suggestions for Improvement Expressions of desire for different/better course elements and/or instructor practices
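
To illustrate how this coding scheme translates into the counts reported in Figures 1–3, the short sketch below tallies coded responses by modality. It is a minimal example under stated assumptions: it presumes the coded exam responses have been exported to a CSV with columns named response_id, modality, and code, which are hypothetical labels rather than the study's actual NVIVO export format.

```python
# Hypothetical sketch: tallying coded final-exam responses by modality.
# Assumes a CSV export with columns response_id, modality, and code
# (e.g., "A5" or "Community – Helped"); these column names are
# illustrative, not the study's actual NVIVO output.
import csv
from collections import Counter, defaultdict

def tally_codes(path):
    """Count how often each code appears within each modality."""
    counts = defaultdict(Counter)
    with open(path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            counts[row["modality"]][row["code"]] += 1
    return counts

if __name__ == "__main__":
    tallies = tally_codes("coded_exam_responses.csv")  # hypothetical file name
    for modality, codes in tallies.items():
        # Report only the accessibility-scale codes (A1-A5) per modality,
        # mirroring the comparison summarized in Figure 1.
        scale = {c: n for c, n in sorted(codes.items())
                 if c.startswith("A") and len(c) == 2}
        print(modality, scale)
```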

[14] We also requested a report from our school’s institutional research office that included general demographic data and final course grades for all students enrolled in the study courses. This spreadsheet included the following data:

  • Course section
  • Final grade (including withdrawal numbers)
  • Class level for each student (non–degree, freshman, sophomore, junior, or senior)
  • Gender
  • Race
  • Student type (new freshman, continuing student, or transfer student)

Note that we requested that this information be sent to us de–identified. Neither of us knew which student was represented by any data point on the spreadsheet.

3.3. Culmination of Data

[15] The NVIVO coding from the final exam responses produced both quantitative and qualitative data points, and the spreadsheet from the institutional research office provided multiple data points that could be compared and contrasted. The culmination of this data informed our perspectives on the four main research questions for this project:

  1. How does modality impact student success based on the following data points?
    1. DFW (D, F, and withdrawal) rates
    2. Final course grades comparison
  2. How does modality impact student perceptions of accessibility?
  3. How does modality impact student perceptions of the classroom community?
  4. How does modality impact student perceptions of instructor support and availability?

4. Results

4.1. NVIVO Coding

[16] The coding process produced several lenses, and NVIVO affords many ways to manipulate and sort data. Figures 1, 2, and 3 below show the quantifiable results from this work. Note that the n value varies across figures. This is due to the nature of the coding process: I used NVIVO’s automated search tool rather than reading each student’s response and deciding where it fell on the coding chart, and some student responses were too vague for a code to be applied.

[17] Figure 1 compares the results of the NVIVO coding for accessibility across the three modes. All three modes were perceived as highly accessible, with the majority of students finding course elements “very easy to locate.” This underscores the effectiveness of designing courses with clear navigation and comprehensive resources, which benefit all students regardless of the delivery mode.

Figure 1: Chart of NVIVO coded data for question set 1 of the final exam discussing accessibility. n=101


[18] Figure 2 compares the NVIVO coding for community across the three modes. Students in asynchronous courses provided more detailed responses, indicating strong feelings about their sense of community. This suggests that asynchronous courses, when designed with interactive elements, can indeed successfully create a sense of belonging and engagement on par with synchronous and face–to–face courses.

Figure 2: Chart of NVIVO coded data for question set 2 of the final exam discussing community. n=20.


[19] Figure 3 shows the NVIVO coding for support across the three modes. These results show consistency in perceptions of support across the three modes, highlighting the importance of active instructor presence and multiple communication pathways, which suggests that the instructors successfully implemented essentially the same support for all students, regardless of the delivery method. It’s important to note here that the instructors designed their courses to be identical, leaning toward the more complicated delivery method of asynchronous.

Figure 3: Chart of NVIVO coded data for question set 3 of the final exam discussing support. n=95.


4.2. Institutional Research Data Results

[20] Table 2 below shows the breakdown of students by mode along with their final course grades. It is important to note that the largest share of students in the cohort were in asynchronous sections (43%). In addition, the synchronous cohort earned the highest overall final course grades, followed by face–to–face, and then asynchronous, and a higher proportion of students in asynchronous sections failed or withdrew from the course (21%). However, the similarities across the modes are more significant than the differences, reinforcing the idea that well–designed courses can support student success in any delivery mode.

Table 2. Final grades by mode. n=153

Mode            # of Students   A           B          C         D   F          W
Asynchronous    66 (43%)        40 (61%)    7 (11%)    5 (8%)    0   8 (12%)    6 (9%)
Synchronous     44 (29%)        38 (86%)    3 (7%)     1 (2%)    0   2 (5%)     0
Face-to-Face    43 (28%)        34 (79%)    5 (12%)    1 (2%)    0   1 (2%)     2 (5%)
TOTAL           153 (100%)      112 (73%)   15 (10%)   7 (5%)    0   11 (7%)    8 (5%)
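
Since Research Question 1 compares DFW rates across modes, a quick arithmetic check against Table 2 may be useful: for the asynchronous sections, D + F + W = 0 + 8 + 6 = 14 of 66 students, or roughly 21%, which matches the rate reported above. The minimal sketch below reproduces that calculation for all three modes using only the counts in Table 2.

```python
# Minimal sketch: DFW (D, F, and withdrawal) rates computed from Table 2.
# The counts are copied from the table above; nothing else is assumed.
grades = {
    "Asynchronous": {"total": 66, "D": 0, "F": 8, "W": 6},
    "Synchronous":  {"total": 44, "D": 0, "F": 2, "W": 0},
    "Face-to-Face": {"total": 43, "D": 0, "F": 1, "W": 2},
}

for mode, g in grades.items():
    dfw = g["D"] + g["F"] + g["W"]
    print(f"{mode}: DFW = {dfw}/{g['total']} = {dfw / g['total']:.0%}")
# Prints roughly: Asynchronous 21%, Synchronous 5%, Face-to-Face 7%
```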

[21] Overall, the results indicate a notable parity across different delivery modes, challenging the continued skepticism about the effectiveness of online teaching and learning. This study demonstrates that with thoughtful design, asynchronous, synchronous, and face–to–face modalities can equally support student success, accessibility, community, and support.

[22] Also of note is that the participatory design approach significantly impacted the results of this study. Student voices and perspectives collected through final exams were instrumental in identifying course design strengths and weaknesses. For instance, students suggested incorporating more video walkthroughs for assignments, which would enhance accessibility across all delivery modes. Additionally, students noted how much peer review contributed to their sense of community.

5. Limitations

[23] Most of the limitations in this project stem from the shift in protocol during the semester. It was designed to be much more rigorous, with anonymous surveys and a robust focus group, but such protocols were not possible during the first semester that students and faculty returned to campus after being remote during COVID. This created several unfortunate limitations:

  • This was not a typical semester. Unforeseen mental health and engagement issues are well noted, and we certainly saw them across all three modes of delivery. In addition, both instructors adopted a more generous grading and support policy, a phenomenon that occurred across education at that point in the pandemic.
  • Data was not gathered anonymously. Using final exams means that student responses may have been less genuine without the protection of anonymity during a graded assignment.
  • Qualitative coding is fraught with subjectivity issues. While we had intended to code some qualitative data from the focus groups, the original design aimed for a robust quantitative data set from the surveys for an overall richer perspective of student perceptions.

[24] In addition, the results are highly contextual to the semester/instructor/school, especially since the courses in all three modes were essentially designed according to asynchronous best practices. We chose to do this because we wanted the courses to be as identical as possible, so we designed the course for the most complex delivery mode (asynchronous). In some ways, I see this as even more evidence that all three modes are comparable. Consider that course design for asynchronous teaching includes maximum access, greater attention toward community, and often extra support built into the course shell. My point is that the synchronous and face–to–face sections in this project were greatly enhanced, with multiple pathways to navigate the course and extra support. Had the courses not been designed with such enhancements, the asynchronous sections may have been ranked much higher by students for all three course elements.

6. Discussion

Research Q1: How does modality impact student success?

[25] Measuring student success is anything but straightforward. This study certainly complicates that notion. If we only consider final grades and DFW rates, then we might conclude that modality does in fact impact student success, with synchronous students being most successful, and asynchronous students being less so. However, another factor to consider is the conditioning and shaping of student expectations post–COVID. Students who registered for an asynchronous college course in 2022 had high school teachers who were instructed to simplify their courses and be overly generous with grading. They likely expected the asynchronous sections in this study to be structured in a similar way. This could account for the higher withdrawal rate and failures in those sections.

[26] This impact on success via course delivery mode raises larger questions about whether or not students are adequately informed during the registration process. My school does not have a system that helps students self–identify their strengths and weaknesses in ways that would point them toward the delivery mode in which they are most likely to succeed. I suspect that many schools have similar shortcomings in this area. As an experienced teacher–scholar of online asynchronous course design and delivery, I applaud the growing prevalence of remote learning opportunities so that all students have access to higher education, but these offerings become a disservice if students fail courses and give up on their degrees without realizing they might have been better off in a different delivery environment.

Research Q2: How does modality impact student perceptions of accessibility?

[27] For the purposes of this study, I don’t see a major difference in perceptions of accessibility across the three modes. The higher number of negative perceptions from asynchronous students comes from the largest cohort in the study (43%), so it likely reflects a broader range of perspectives than the other two cohorts. Having said that, it’s also likely that the higher perceptions of accessibility in the synchronous and face–to–face cohorts are a result of the project’s design, wherein the instructors incorporated higher levels of accessibility than most courses offer.

[28] This study reveals that course design aiming toward full access within the learning management system, that is, design for asynchronous learning environments (video walkthroughs, multiple pathways of navigation, etc.), promotes accessibility for any and all delivery modes. Consider that creating a walkthrough video for an assignment allows students to watch and rewatch explanations and guidelines. In a typical synchronous setting (including face–to–face), instructors go over assignments verbally, so students must either take copious notes or hope they remember what was explained. This enhanced accessibility doesn’t diminish synchronous lessons and discussions. I do admit that attendance was an issue for all student cohorts; however, because attendance problems were also widespread when students and instructors returned to campus, it’s difficult to say with any certainty whether having asynchronous access to the course encouraged students to miss more classes.

[29] Perhaps the most important takeaway is that designing all courses toward asynchronous access carves a clearer pathway for students with disabilities, learning differences, and/or who are neurodivergent. My own daughter suffers from migraines, and even with accommodations during her college experience, she struggled to keep up when she missed classes because there were no alternative access points provided. This meant that she could not use her accommodations properly and often attended class when she should have been practicing self–care.

Research Q3: How does modality impact student perceptions of the classroom community?

[30] This is perhaps the most exciting data point from this study: students expressed comparable perceptions of community across all modalities. Even with the lower n values in the coded responses, the overall impression is stronger for asynchronous course delivery than I expected. Since community is perceived by many as difficult to establish in asynchronous course delivery, this study challenges that perception. With the right setup and course design, students can indeed feel a shared sense of belonging and purpose even though they may never interact with each other synchronously. Students mentioned things like group assignments, peer/instructor feedback, and discussion forums as contributing to their perceptions of community.

[31] It’s important to note that both instructors are experienced in online asynchronous instruction, so the results here would likely be quite different if an instructor who lacked such experience repeated this study.

Research Q4: How does modality impact student perceptions of instructor support and availability?

[32] The consistency in perceptions of support across the three cohorts of students is probably the least surprising finding. One frequently cited feature that both instructors include in their courses is what we call an “ask the professor” forum. Students also noted receiving quick responses from instructors via email and timely feedback on their submissions.

[33] The two larger takeaways from this data point are (1) the need for active instructor presence in an asynchronous course and (2) the importance of communication pathways for all course delivery modalities. All students in this project could contact the instructor for support through multiple pathways (Q & A forums, email, in person, via Zoom). These pathways provided both synchronous and asynchronous opportunities to interact with the instructor, including synchronous Zoom and in–person meeting options for asynchronous students. This level of support is not standard for all contexts, but this study reinforces the need for more attention to this feature.

7. Conclusion

[34] This project underscores the value of participatory design in course development. By involving students in the design process, we can create a more inclusive and responsive learning environment. Feedback loops can not only improve course accessibility and community but also empower students by giving them a voice in their education. Having said that, a participatory design approach also presents challenges, such as the need for additional time and resources. Future research should explore strategies to streamline this process and further investigate its long–term impact on student success.

[35] This project also demonstrates that with thoughtful design and active instructor presence, courses delivered across modalities can achieve comparable outcomes in terms of accessibility, community, and support. In addition, there are key takeaways about specific strategies that students felt enhanced their learning environments:

  1. Enhanced Accessibility: Designing all courses using features typically reserved for asynchronous contexts, like video walkthroughs and multiple navigation pathways, enhances accessibility for all students. This is particularly beneficial for students with disabilities or learning differences.
  2. Community Building: Despite common perceptions, asynchronous courses can foster strong community through interactive elements like group assignments and discussion forums.
  3. Instructor Support: Multiple access points for support, like Q&A forums, optional synchronous meetings, and timely email responses, were the features most frequently requested by students across all delivery modes.

The larger point that this parity across modalities reveals is that instructors who are trained and willing to engage in online teaching best practices can be counted on to design and deliver highly effective online courses without compromising on quality or student success.

8. References

Bender, T. (2003). Discussion–based online teaching to enhance student learning: Theory, practice and assessment (2nd ed.). Stylus Publishing.

Blair, K., & Hoy, C. (2006). Paying attention to adult learners online: The pedagogy and politics of community. Computers and Composition, 23(1), 3–48. https://doi.org/10.1016/j.compcom.2005.12.006

Borgman, J. & Dockter, J. (2016). Minimizing the distance in online writing courses through student engagement. Teaching English in the Two–Year College, 44(2), 213–22.

Bourelle, T., Rankins–Robertson, S., Bourelle, A., & Roen, D. (2013). Assessing learning in redesigned online first–year composition courses. In H. McKee & D. DeVoss (Eds.), Digital Writing Assessment and Evaluation. Computers and Composition Digital Press/Utah State Press. http://ccdigitalpress.org/dwae/12_bourelle.html

Boyd, P. W. (2008). Analyzing students’ perceptions of their learning in online and hybrid first–year composition courses. Computers and Composition, 25(2), 224–243. https://doi.org/10.1016/j.compcom.2008.01.002

Brickman, B. (2003). Designing and teaching online composition. Teaching English in the Two–Year College, 30(4), 358–.

Burgstahler, S., & Cory, R. (2008). Universal Design in Higher Education: From Principles to Practice. Harvard Education Press.

Cargile Cook, K., & Grant–Davie, K. (Eds.). (2005). Online education: Global questions, local answers. Routledge. https://doi.org/10.4324/9781315223971

Carter, J. L., & Rickly, R. (2005). Mind the gap(s): Modeling space in online education. In K. Cargile Cook & K. Grant–Davie (Eds.), Online Education: Global Questions, Local Answers (pp. 123–139). Baywood Publishing.

CAST. (2008). Universal Design for Learning Guidelines Version 1.0. National Center on Universal Design for Learning.

Comer, D. K., Clark, C. R., & Canelas, D. A. (2014). Writing to Learn and Learning to Write across the Disciplines: Peer–to–Peer Writing in Introductory–Level MOOCs. International Review of Research in Open and Distance Learning, 15(5), 26–82. https://doi.org/10.19173/irrodl.v15i5.1850

Cox, S., Black, J., Heney, J., & Keith, M. (2015). Promoting teacher presence: Strategies for effective and efficient feedback to student writing online. Teaching English in the Two-Year College, 42(4), 376-391.

Cunningham, J. M. (2015). Mechanizing people and pedagogy: Establishing social presence in the online classroom. Online Learning, 19(3), 34–47.

Crow, K. L. (2008). Four Types of Disabilities: Their Impact on Online Learning. TechTrends, 52(1), 51–55.

de Montes, L. E. S., Oran, S. M., & Willis, E. M. (2002). Power, language, and identity: Voices from an online course. Computers and Composition, 19(3), 251–271. https://doi.org/10.1016/S8755-4615(02)00127-5

GSOLE. (2019). Online Literacy Instruction Principles and Tenets. GSOLE.org. Retrieved from https://gsole.org/oliresources/oliprinciples

Hilliard, L. P., & Stewart, M. K. (2019). Time well spent: Creating a community of inquiry in blended first–year writing courses. The Internet and Higher Education, 41, 11–24. https://doi.org/10.1016/j.iheduc.2018.11.002

Hitt, A. (2018). Foregrounding Accessibility Through (Inclusive) Universal Design in Professional Communication Curricula. Business and Professional Communication Quarterly, 81(1), 52–65. https://doi.org/10.1177/2329490617739884

Lazar, J. (2007). Universal usability: Designing computer interfaces for diverse user populations. John Wiley & Sons.

Mahaffey, C., & Walden, A. (2019). #teachingbydesign: Complicating Accessibility in the Tech–Mediated Classroom. In K. Becnel (Ed.), Emerging Technologies in Virtual Learning Environments.

Mick, C. S., & Middlebrook, G. (2015). Asynchronous and synchronous modalities. In B. L. Hewett & K. E. DePew (Eds.), Foundational practices of online writing instruction (pp. 135–154). The WAC Clearinghouse. https://doi.org/10.37514/PER-B.2015.0650.2.03

Oswal, S. K. (2014). Participatory design: Barriers and possibilities. Communication Design Quarterly Review, 2(3), 14–19. https://doi.org/10.1145/2644448.2644452

Oswal, S. K., & Meloncon, L. (2017). Saying No to the Checklist: Shifting from an Ideology of Normalcy to an Ideology of Inclusion in Online Writing Instruction. WPA: Writing Program Administration, 40(3), 61–.

Palloff, R. M., & Pratt, K. (2001). Lessons from the cyberspace classroom: The realities of online teaching. Jossey–Bass.

Snart, J. A. (2017). Making Hybrids Work: An Institutional Framework for Blending Online and Face–to–Face Instruction in Higher Education. Beaverton: Copyright Clearance Center. Retrieved from https://www.proquest.com/other-sources/making-hybrids-work-institutional-framework/docview/1892824752/se-2

Stella, J., & Corry, M. (2013). Teaching writing in online distance education: Supporting student success. Online Journal of Distance Learning Administration, 16(2). www.westga.edu/~distance/ojdla/summer162/stella_corry162.html

Still, B., & Crane, K. (2017). Fundamentals of User–Centered Design: A Practical Approach (1st ed.). CRC Press. https://doi.org/10.4324/9781315200927

Warnock, S. (2009). Teaching Writing Online: How and Why. National Council of Teachers of English.
