Friday, April 1, 2016

An Account of Online Learning Experiences with First-Year Doctoral Students

by Sheena Ghanbari, Program Promotion Specialist at UC San Diego; Nahid Nariman, Director of Research at Transformative Inquiry Design for Schools and Systems; and Karina M. Viaud, Director of Parent and Family Programs at UC San Diego


Higher education faces multiple challenges, including tuition inflation, student retention, campus climate, and time to degree. Graduate programs in particular, namely Ed.D. and Ph.D. programs, continue to struggle with student retention: approximately 50% of doctoral students leave their programs before attaining the degree (Cassuto, 2013; Lovitts & Nelson, 2000; Reid, 2012), and most discontinue at the ABD (All But Dissertation) stage, in which coursework is complete but the dissertation has not been successfully defended (Barnett, 2008). Before students reach the dissertation phase, many Ed.D. programs institute an end-of-first-year assignment: students must qualify for the subsequent year by writing a Qualifying Paper. This stage of the program is important in two ways. First, it marks a critical juncture at which the student demonstrates the ability to write a coherent literature review. Second, it signals how well the Qualifying Paper Course prepared and supported the student to move forward in the doctoral program.

Researchers have begun analyzing data trails as a means to gain insight into academic individualization, prediction, intervention, and adaptation. Analytics has captured meaningful attention in institutional research and business (MacNeill, 2012) and is a relatively new area of research in higher education (Barneveld, Arnold, & Campbell, 2012; Bichsel, 2012). It can also support innovative and meaningful ways of improving students’ performance and success (MacNeill, 2012). Learning analytics enables us to trace specific learning processes and generate images of students’ performance in a way that allows both to be compared with the overall performance of a course (Gaviria, Glahn, Drachsler, Specht, & Gesa, 2011). Early investigations of academic analytics predicted students’ academic difficulty in order to help faculty members generate individualized instruction tailored to each student’s learning needs (Campbell, DeBlois, & Oblinger, 2007). Broadly speaking, schools and other programs that use analytics evaluate data to make decisions, determine the best course of action for improving student learning (Hawkins, 2008; Long & Siemens, 2011; Norris, Baer, & Offerman, 2009), and improve teaching and learning (Campbell, 2007; Baepler & Murdoch, 2010).

In an effort to gain a different perspective on writing a Qualifying Paper within the student’s doctoral journey, we proposed that one class session of the Qualifying Paper Preparation Course be conducted online so we could see how students engaged in dialogue about becoming critical consumers of research. We analyzed data from this online activity to garner insight into information-giving as individual and collective learning developed among three groups of four to five doctoral students. We saw data related to students’ early understanding of the assignment develop in real time. This digital footprint compiled evidence for considering creative courses of action in teaching and in students’ learning development in the Qualifying Paper Preparation Course. We were interested in how online analytics can unveil the beginning stages of student learning and affect teaching strategies. We used Google Drive as the platform for the online shared activity because it is a familiar technology. We collected data during a three-hour class that took place in the third week of the quarter. Participants were randomly assigned to three groups of four or five students, and each group navigated through four modules that explored different approaches to becoming a critical consumer of research.
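For illustration only, a minimal sketch of this kind of reproducible random assignment into three groups of four or five appears below; the roster size, placeholder names, seed, and group sizes are assumptions made for the example, not our actual class list.

```python
import random

# Hypothetical roster of 13 first-year doctoral students (placeholder names).
students = [f"Student-{i:02d}" for i in range(1, 14)]

random.seed(2016)        # fixed seed so the assignment can be reproduced
random.shuffle(students)

# Split the shuffled roster into three groups of five, four, and four.
group_sizes = [5, 4, 4]
groups, start = {}, 0
for n, size in enumerate(group_sizes, start=1):
    groups[f"Group-{n}"] = students[start:start + size]
    start += size

for name, members in groups.items():
    print(name, members)
```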

The following is a snapshot of the participants’ reflections on their experiences. The first module asked participants to think back to the assigned course readings and share what it meant to be a critical consumer of research. Building on this premise, the second module began with a video that outlined the process of writing a literature review, followed by a prompt asking participants to describe three strategies from the video that they would incorporate into their upcoming literature review for the Qualifying Paper milestone in the Ed.D. program. The third module asked whether any new knowledge had been gained during the activity and whether the participant’s initial definition of a critical consumer of research had changed. After each module, participants also wrote a brief reflection. Finally, the fourth module was an overall reflection and an opportunity to share general feedback about the online experience.

At the outset of examining our data, we identified and used key verbs linked to individual knowledge, such as learn, explore, understand, find, and realize. We also coded instances of collective learning and interactive conversations that showed collaboration. Responses that suggested individual or collective learning were closely reviewed for evidence of learning during the online activity. The first module focused on defining what it meant to be a critical consumer of research. Group-One had noticeably less interaction than the other groups. This group consistently shared resources and information but gave less insight into where they were in their learning as a group. There was evidence of some contribution to learning through words like “additionally” and “also” after acknowledging an agreement. Group-Two illustrated greater collective understanding of shared learning experiences and challenges by providing interpretations of each other’s responses and using words like “agree” and “exactly.” Overall, there was a great deal of agreement, and key words similar to “learning” were used sparingly in this module. Group-Three showed evidence of interaction and illustrated an awareness of changing current practice as they explored the definition of critical research consumption. This group’s conversation was deeper and showed a balance of individual and collective learning. For example, one of the exchanges in Group-Three exemplified a thoughtful interaction that included further questioning. Student-One described her interpretation of being a critical consumer of research, and Student-Two responded, “I would agree with everything [name of student] said about critical consumption of research ...I would also add that just because it [article] does pertain, does not mean it should be included.” This point was affirmed by Student-One, who then followed with, “I would think that identifying why it [article] doesn’t fit can begin to expose the gaps. Is that your thinking?” The exchange continued and showed a confident display of understanding and a deepening of thinking around becoming a critical consumer of research.
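As a minimal sketch of how a first pass of this keyword flagging could be automated (the marker lists and the sample excerpt below are illustrative assumptions rather than our actual coding scheme or student data), one might flag each response as follows:

```python
import re

# Illustrative marker lists: the individual-learning verbs mirror those named above,
# while the collective-learning markers are assumed for this sketch.
INDIVIDUAL_MARKERS = {"learn", "explore", "understand", "find", "realize"}
COLLECTIVE_MARKERS = {"agree", "exactly", "additionally", "also"}

def code_response(text: str) -> dict:
    """Flag one student response for individual- and collective-learning keywords."""
    words = set(re.findall(r"[a-z']+", text.lower()))
    return {
        "individual": sorted(words & INDIVIDUAL_MARKERS),
        "collective": sorted(words & COLLECTIVE_MARKERS),
    }

# A hypothetical excerpt in the spirit of the exchange quoted above.
sample = "I would agree with everything said; I also realize the article may not fit."
print(code_response(sample))
# -> {'individual': ['realize'], 'collective': ['agree', 'also']}
```

Exact-word matching of this kind misses variants such as “learning” or “understood,” so flags like these could only point toward responses worth a closer read; judgments about individual versus collective learning came from reviewing the exchanges themselves.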

The second module shared a video synopsis of how to construct a sound literature review. Group-One and Group-Two shared a level of discomfort with the scholarly process and with moving forward with research. As one student in Group-One stated, “I am still questioning everything I write in my paper.” Other students agreed; one added, “I have moments of uncertainty...it is helpful and a bit of validation that this is not an easy process.” Group-Two demonstrated a similar tone of empathy and appreciation for one another, but more acceptance of the general challenges and ambiguities of the research process. Student-One in Group-Two wrote, “Be comfortable in not being comfortable right now. I need to trust the process and realize that things will start to come together and make sense.” Group-Three showed a high level of interaction as well, but focused more on answering the prompt and sharing strategies with one another. For example, when one student shared a chart on how to organize literature, she received praise and appreciation from her peers.

The third module asked participants to share any shifts in thinking from their initial definition of a critical consumer of research. Overall, Group-One’s definition of a critical consumer broadened or changed. As one student stated, “My new knowledge has come from the video and the dialogue between my team members.” Statements like this were heard repeatedly in Group-One, with a particular emphasis on knowledge gained from the video. Group-Two and Group-Three, by and large, expressed that their definitions of a critical consumer of research did not change. One Group-Three participant stated, “The discussions tonight helped solidify my existing understanding and allowed me the time to reflect on my own practices as a critical consumer of research.” Statements like this captured students’ experience in Group-Three. While Group-Two’s and Group-Three’s original definitions remained unchanged, Group-Two also shared a preference for in-person (classroom) learning. This type of reflection did not surface in Group-Three, where insight and comfort were the more prominent themes.

The final module was a reflection on the entire online activity and an opportunity to provide feedback on the experience. There was a high level of interaction among all three groups in this module, with some variation in where they focused their reflections. Group-One delved deeply into the practices, strengths, weaknesses, and purposes of online platforms. Group-Two emphasized their appreciation of being able to work remotely, with constant reference to the value of time. Group-Three did not have many criticisms of the online platform; they seemed to have navigated it with a level of ease not as apparent in Group-One and Group-Two. Group-Three saw value in this online activity earlier in their research journey.

Overall, the three groups demonstrated different characteristics that helped us assess their understanding of and progress with the course assignment. Group-One constantly shared information and had moments of individual learning, but the collective interaction and knowledge-gaining attributes were less visible. There was a sense of community, but threads of dialogue were superficial at times. Group-Two had a higher level of comfort with their own skill sets and showed an understanding of what it meant to be a critical consumer of research. Group-Three demonstrated a high level of competency in learning together and, much like Group-Two, their conversations pushed one another’s thinking. Group-Three also showed a healthy balance of individual and collective learning throughout the different modules. All of the groups conveyed instances of individual and collective learning, and their unique approaches were enlightening as we learned about their needs as graduate students and emerging researchers.

In brief, this online activity was significant in that it made clear how the students defined their learning progress related to becoming critical consumers of research, and it allowed them to affirm and support each other’s successes and struggles with the task of writing the Qualifying Paper. Many of the students felt overwhelmed and lost by the task of writing a 20-25 page Qualifying Paper, which also meant critically examining existing research, an intimidating academic undertaking. These uncertainties were masked before their participation in the online activity, and we believe they resembled symptoms of the impostor syndrome. The impostor syndrome has been defined and redefined in several ways since it was first presented in 1978 by psychologists Pauline Rose Clance and Suzanne Imes; it includes feelings of fraudulence, fear of discovery, and difficulty internalizing actual successes (Craddock, Birnbaum, Rodriguez, Cobb, & Zeeh, 2011; Gravois, 2007). Students who may have felt like they were “faking their way through the course” no longer felt alone once they allowed themselves to be vulnerable in the online open forum. In voicing their vulnerability, they validated one another and offered words of support to help overcome fears and struggles.

Simultaneously, the online interaction deepened our understanding of the students’ learning and progress with the Qualifying Paper. We gained insight into (1) areas in which to spend class time reviewing how to develop a literature review, (2) students who had unique struggles in developing the literature review, and (3) students who were doing well with the assignment. The online activity therefore served as an intervention tool for learning about students’ progress in the Qualifying Paper Preparation Course. In other words, the evidence from the online interaction required us to be flexible and adaptive toward the different levels of learning occurring among the students and to respond to their unique needs.

The differences in individual and group learning made it clear that students do not learn in the same way or at the same rate, even when they are provided the same information in the same classroom. Although the online learning analytics were ephemeral, the data clearly provided information for discussions about ways to be creative with learning, to improve learning, and possibly to approach the entire course differently for subsequent cohorts.

References

Baepler, P., & Murdoch, C. J. (2010). Academic analytics and data mining in higher education. International Journal for the Scholarship of Teaching and Learning, 4(2), Article 17. DOI= http://digitalcommons.georgiasouthern.edu/ij-sotl/vol4/iss2/17.
Barneveld, A. V., Arnold, K. L., & Campbell, J. P. (2012). Analytics in Higher Education: DOI= http://net.educause.edu/ir/library/pdf/MWR07085.pdf.
Barnett, D. L. (2008). Experiences influencing degree completion articulated by doctoral students in education administration (Doctoral dissertation).  Retrieved from http://ezproxy.csusm.edu/
Bichsel, J. (2012). Analytics in Higher Education: Benefits, Barriers, Progress, and Recommendations. Research Report. Louisville, CO: EDUCAUSE Center for Applied Research. DOI=  http://www.educause.edu/ecar.
Campbell, J. P. (2007). The Grand Challenge: Using Analytics to Predict Student Success. Presentation at the 2007 EDUCAUSE Midwest Regional Conference (Oct. 2007). DOI= http://net.educause.edu/ir/library/pdf/MWR07085.pdf.
Campbell, J. P., DeBlois, P. B., & Oblinger, D. G. (2007). Academic analytics: A new tool for a new era. EDUCAUSE Review (July/August 2007), 41–57. DOI=  http://net.educause.edu/ir/library/pdf/erm0742.pdf.
Cassuto, L. (2013). Ph.D. Attrition: How Much Is Too Much? Chronicle of Higher Education. DOI= http://chronicle.com/article/PhD-Attrition-How-Much-Is/140045.
Craddock, S., Birnbaum, M., Rodriguez, K., Cobb, C., & Zeeh, S. (2011). Doctoral students and the impostor phenomenon: Am I smart enough to be here? Journal of Student Affairs Research & Practice, 48(4), 429-442.
Gaviria, F., Glahn, C., Drachsler, H., Specht, M., & Gesa, R. F. (2011). Activity-based learner-models for learner monitoring and recommendations in Moodle. In C. Delgado Kloos, D. Gillet, R. M. Crespo García, F. Wild, & M. Wolpers (Eds.), Proceedings of the 6th European Conference on Technology-Enhanced Learning (EC-TEL 2011, Palermo, Italy, September 20-23, 2011). Heidelberg, Berlin: Springer-Verlag.
Gravois, J. (2007). You’re not fooling anyone. Chronicle of Higher Education, 54(11), A1.
Long, P., & Siemens, G. (2011). Penetrating the fog: Analytics in learning and education. EDUCAUSE Review, 46(5), 30-40.
Lovitts, B. E., & Nelson, C. (2000). The hidden crisis in graduate school: Attrition from Ph.D. programs. Academe, 86(6), 44-50.
MacNeill, S. (2012). What is Changing and Why Does it Matter? A Briefing Paper. CETIS Analytics Series, 1(1). DOI= http://publications.cetis.org.uk/wp-content/uploads/2012/11/Analytics-Vol1-No1-Briefing-Paper-online.pdf.
Norris, D., Baer, L., & Offerman, M. (2009). A National Agenda for Action Analytics. White Paper. Presented at the National Symposium on Action Analytics (St. Paul, Minnesota, September 21-23, 2009). DOI= http://lindabaer.efoliomn.com/uploads/settinganationalagendaforactionanalytics101509.pdf.
Reid, C. D. (2012). Joyful obligation: Listening to Black doctoral students in the academy (Doctoral dissertation). New York University. UMI Order No. 3493868.