CoI QUESTIONNAIRE: 2.0?
D. Randy Garrison
July 13, 2021

My goal in this post is to draw attention to an important study that used advanced statistical techniques to analyze the CoI questionnaire (Abbitt & Boone, 2021). While exploring statistical anomalies may not be front of mind for most practitioners, or even for those doing research with the CoI framework, this study has significant conceptual and analytical implications for the CoI questionnaire. By providing a brief overview of the study, I hope to attract interest in pursuing research into the refinement and further validation of the CoI quantitative questionnaire.

However, before addressing suggestions for improvement, it is important to highlight and emphasize the findings of this research regarding the existing strengths of the CoI questionnaire. In this regard, Abbitt and Boone (2021) state that the CoI framework has demonstrated its value in providing insight into online and blended learning environments and the CoI “instrument exhibits strong measurement properties as evaluation of item reliability and person reliability suggested strong reliability” (p. 389).

Recognizing the strengths of the CoI questionnaire does not contradict suggesting areas for possible improvement. First, Abbitt and Boone (2021) provide important insights concerning a problem with a specific cognitive presence (exploration) item: “Online discussions were valuable in helping me appreciate different perspectives.” Using Rasch measurement techniques, they found a possible misfit between this item and the cognitive presence scale. That is, the item may not be consistent with the other items on the cognitive presence subscale. This is notable because another study found this particular cognitive presence item loaded on the social presence factor (Kovanović et al., 2018). Similarly, there was a possible misfit for a teaching presence (direct instruction) item: “The instructor provided feedback in a timely fashion.” As a result, the authors suggested that consideration be given to revising or removing these items from the CoI questionnaire. My preference would be to test rewordings of these particular items, which would contribute to future research on the CoI framework.

Another issue revealed by this study relates to item difficulty, that is, how easy or hard it is for respondents to agree with an item. For example, examining the ordering and spacing of the items on the teaching presence scale shows that the items easiest for respondents to agree with related to communication, while the most difficult related to instructor feedback. Similarly, the easiest items on the social presence scale related to open communication, while the most difficult related to affective expression. Finally, regarding cognitive presence, items concerning the application and relevance of course material were the hardest for respondents to agree with. The implications are:

When viewed through the lens of the CoI framework, it is reasonable to examine and comment upon the item difficulty and spacing of items … for SP, we see that items relating to the dimensions (Affective Expression, Group Cohesion, and Open Communication) fall along distinct and different portions of the scale. For the CP and TP scales, however, this pattern is less distinct. These unique characteristics of item difficulty as well as spacing … provide insights into aspects of construct validity and also the online courses on which future continuous improvement efforts can focus. (Abbitt & Boone, 2021, p. 390)
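The notion of item difficulty and spacing can be illustrated with a small sketch. Assuming, purely for illustration, simplified dichotomous (agree/disagree) responses and made-up agreement rates (the actual study modelled the full Likert-scale data with Rasch techniques, not this shortcut), an item's difficulty on the logit scale can be approximated from its overall agreement rate:

```python
import math

# Hypothetical agreement rates for illustrative CoI-style items.
# These numbers are invented for the sketch, not taken from the study.
agreement_rates = {
    "TP: clear communication of course topics": 0.92,
    "TP: timely instructor feedback": 0.61,
    "SP: comfortable conversing online": 0.88,
    "SP: affective expression / belonging": 0.58,
}

def logit_difficulty(p):
    """Rough logit-scale difficulty from a proportion of agreement.

    Higher values mean the item is harder for respondents to endorse;
    an item endorsed by exactly half the respondents sits at 0.
    """
    return -math.log(p / (1 - p))

difficulties = {item: logit_difficulty(p) for item, p in agreement_rates.items()}

# Ordering items from easiest to hardest to agree with mirrors the
# item "spacing" a Wright map would display along the logit scale.
for item, d in sorted(difficulties.items(), key=lambda kv: kv[1]):
    print(f"{d:+6.2f}  {item}")
```

On this toy scale, the communication items cluster at the easy (negative) end and the feedback and affective items at the harder end, which is the kind of ordering and spacing pattern the quoted passage describes.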

Without getting into the details, this study identifies areas for productive research with the potential to improve understanding of communities of inquiry and provide opportunities to explore the development of a second generation of the CoI questionnaire. At the same time, it should be kept in mind that this research also makes clear that the CoI questionnaire has proven to be an essential tool to study communities of inquiry and that it offers “strong measurement properties.” Further refining the conceptual integrity of the questionnaire does not discount its current strengths.

To reiterate, the purpose of this post is to draw attention to this research and encourage work on developing a second generation of the CoI survey questionnaire. Given the complexity of this challenge, it may be most effectively addressed through a collaborative approach, much like the one that led to the creation of the original CoI questionnaire (Arbaugh et al., 2008).



REFERENCES

Abbitt, J. T., & Boone, W. J. (2021). Gaining insight from survey data: An analysis of the Community of Inquiry survey using Rasch measurement techniques. Journal of Computing in Higher Education, 33, 367–397.

Arbaugh, J. B., Cleveland-Innes, M., Diaz, S., Garrison, D. R., Ice, P., Richardson, J., Shea, P., & Swan, K. (2008). Developing a community of inquiry instrument: Testing a measure of the Community of Inquiry framework using a multi-institutional sample. Internet and Higher Education, 11, 133–136.

Kovanović, V., Joksimović, S., Poquet, O., Hennis, T., Čukić, I., de Vries, P., et al. (2018). Exploring communities of inquiry in massive open online courses. Computers & Education, 119, 44–58.






ABOUT THE AUTHOR

D. Randy Garrison
Professor Emeritus, University of Calgary
D. Randy Garrison is professor emeritus at the University of Calgary. Dr. Garrison has published extensively on teaching and learning in adult, higher, and distance education contexts. He has authored, co-authored, or edited twelve books and well over 100 refereed articles/chapters. His recent books are Thinking Collaboratively: Learning in a Community of Inquiry (2016) and E-Learning in the 21st Century: A Community of Inquiry Framework for Research and Practice (3rd edition, 2017), for which he won second place for the Association for Educational Communications and Technology, Division of Distance Learning Book Award, 2017.

