Manageable Steps to Implementing Data-Informed Advising
Authored by: Stephanie Kraft-Terry and Cheri Kau
Academic analytics, also known as hypothesis-driven data mining, is a fast-growing field that developed out of a desire to implement data-driven decision making to support student success (Baepler, 2010; Campbell, 2007). Data mining requires an understanding of the approaches and tools necessary to analyze large data sets (Kumar, 2011). Without the appropriate background or resources (software included), the idea of undertaking academic analytics to support student success within academic advising can be daunting. This article addresses simple approaches advising units can employ to implement data-informed decision making to improve or enhance advising and student success without a large investment in additional resources.
In recent years, advising units have implemented early warning systems that utilize analytics to identify at-risk students and potential intervention points to improve student success (Aguilar, 2014). However, many units do not have the means to purchase or utilize these sophisticated systems. Meanwhile, advisors are relied upon to make recommendations to students, often based on their professional knowledge and the collective experiences of their students.
In the context of advising, recommendations are most beneficial to students when based on data rather than anecdotal advice (Cuseo, 2008). Such data should be analyzed as part of regular program review for continuous improvement in advising units (Troxel, 2008). In addition to ensuring appropriate curricular recommendations, data is a valuable tool for identifying retention and time-to-degree hurdles students often experience. Advisors interact with students regularly, which puts them in a unique position to gain insight into course availability, student performance in particular courses, and even rumors students share with one another about courses to avoid. Through simple analysis of readily available data (e.g., course enrollment, grades), advising units can easily validate or invalidate many of these concerns and then create policies or distribute information that keeps students on optimal paths to success, without the need for expensive analytical tools. Specifically, a solid understanding of the data specific to each advising unit's population can help advisors make accurate recommendations to students, as well as to academic units, based on objective evidence that supports the knowledge gained through working with students.
Types of Data
When considering the variety of data available, it is important to select the data most appropriate for evaluating the proposed change or area of interest, and to remain focused on that specific change or concern throughout the analysis to avoid straying from the original purpose. Carefully outlining the data that would shed light on the success or failure of changes within an advising unit is essential. The three most common types of data are institutional data, survey data, and student learning assessment data.
Institutional data provides objective, quantitative indicators that, when analyzed, can support or challenge changes in advising practices. Data can usually be obtained from an institution's Office of the Registrar, Institutional Research Office, or a student management system already used for advising. Institutional data includes enrollment data (such as the number of credits and courses attempted), retention from semester to semester (overall and for specific student groups), grade point average, cost of attendance, major(s), graduation rates, and more. While large amounts of data may be available, the focus of analysis should remain on the most pressing concerns related to the indicators that can inform advising practices. It is best to outline a clear, concise question at the outset to prevent being overwhelmed by data. Remember to stay focused.
While institutional data provides concrete, objective data surrounding success indicators important to institutions (i.e., retention and graduation rates), it is not the only valuable data source in the decision-making processes. Survey data, such as student satisfaction evaluations, exit or graduation surveys, and surveys specifically designed for research studies, may prove to be insightful in making improvements or modifications to advising practices.
A third form of data to improve advising curricula and practices comes from the internal assessment of student learning in relation to academic advising. Continuous assessment of advising through the evaluation of direct and indirect evidence is an important component of a successful advising unit (Hurt, 2007). In some analyses, survey data useful for data-informed decision-making may overlap with learning outcomes assessment, while institutional data is rarely used as an indicator of student learning within an advising setting.
Consider the types of data necessary to draw a meaningful conclusion before implementing changes. This ensures the appropriate information is collected throughout the process for evaluating success and identifying areas for improvement. For example, if the goal is to determine whether a new intervention targeted at retaining students is successful, examining enrollment in the subsequent semester would provide the evidence needed to judge the initiative. While other information may be available, such as student satisfaction with an advising appointment, that information does not directly inform decision-making in relation to the specific goals of the retention initiative. Similarly, if the purpose is to determine whether a student learned which campus resources could augment their academic performance, survey data could be employed to examine whether students understand where to seek help. If final grades are the only evidence examined, information about student performance is presented, but advisors gain no insight into which resources discussed in the appointment benefited the student.
The following example walks through five important steps to consider when implementing data-informed decision-making.
- Identify your area of interest or proposed change: Identify assumptions that currently influence advising decisions within an advising unit. Alternatively, identify new policies or procedures being implemented within an institution that would require monitoring to ensure there are no deleterious effects on students during and/or after implementation.
Enrolling in mathematics early in a student's college career correlates with positive student retention (Moore, 2009), yet many students delay mathematics in favor of a less rigorous schedule in their first semester. Could a simple analysis of the semester in which students enroll in mathematics, and the corresponding retention rates, support a policy change requiring students to enroll in mathematics in their first semester?
- Identify the type of evidence (data) to examine: What type of data is required to address your proposed change? Also consider whether such data is accessible and already exists, or whether new data specific to the initiative must be collected.
Collecting data on which semester students enroll in mathematics and whether they return to the institution in the subsequent semester would provide enough information to draw an informed conclusion.
- Identify who, where, and how: Special permission or access may be required for certain types of data. If the data is not in aggregate form, the Family Educational Rights and Privacy Act (FERPA) applies to any student information prior to deidentification; therefore, all individuals who work with such data may be required to undergo appropriate training to maintain strict confidentiality. Refer to your institution's FERPA training requirements to ensure you are complying with its policies.
Course enrollment data should be available through most institutional research offices. Staff or faculty with appropriate FERPA training can remove all identifying information so that student assistants or other employees can help aggregate it. Given the simplicity of this question, the data can be displayed as simple bar charts depicting which math course, if any, each student enrolled in during the previous semester and whether they returned for the current semester.
- When to collect and analyze evidence: Are there important institutional deadlines or events that could be influenced by the data analysis? If so, consider timing your data review to align with such events.
At the end of the add/drop period for the subsequent semester, collect course enrollment data for the previous semester and university enrollment for the current semester. For longitudinal studies, it is important to collect data at consistent time intervals.
- Draw conclusions and implement change: In some circumstances, the data will support existing practices; in others, it may reveal a need to change advising practices in an effort to improve student success.
Once it is determined whether a significant difference in retention exists based on the semester in which a student attempts mathematics, one could dig deeper to see whether retention further correlates with the specific math course a student enrolls in and the course outcome. The literature suggests mathematics placement and performance are the largest predictors of science, technology, engineering, and mathematics (STEM) student success (Chen, 2013). Sharing the results of these analyses with both advisors and students, and working to incorporate them into advising recommendations and curricula, can be a valuable tool in promoting student success through data-informed advising.
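The five steps above can be sketched end to end with a short script, without any specialized analytics software. This is a minimal illustration only, not a prescribed method: the record layout (one deidentified tuple per student), the two-group comparison (mathematics attempted in the first semester versus delayed), and the use of a chi-square test of independence are all assumptions made for the example.

```python
from collections import Counter

# Hypothetical deidentified records: one tuple per student,
# (enrolled_in_math_first_semester, returned_next_semester).
# In practice these would come from the institutional data extract.
records = [
    (True, True), (True, True), (True, False), (True, True),
    (False, True), (False, False), (False, False), (False, True),
]

def retention_table(records):
    """Tally (returned, did-not-return) counts for each math-timing group."""
    counts = Counter(records)
    table = {}
    for math_first in (True, False):
        returned = counts[(math_first, True)]
        left = counts[(math_first, False)]
        table[math_first] = (returned, left)
    return table

def chi_square_2x2(table):
    """Pearson chi-square statistic for a 2x2 table of observed counts."""
    (a, b), (c, d) = table[True], table[False]
    n = a + b + c + d
    col_totals = (a + c, b + d)
    stat = 0.0
    for row in ((a, b), (c, d)):
        row_total = sum(row)
        for obs, col_total in zip(row, col_totals):
            # Expected count under independence: row_total * col_total / n
            exp = row_total * col_total / n
            stat += (obs - exp) ** 2 / exp
    return stat

table = retention_table(records)
for math_first, (ret, left) in table.items():
    label = "math in first semester" if math_first else "math delayed"
    print(f"{label}: {ret / (ret + left):.0%} retained ({ret}/{ret + left})")

# df = 1 for a 2x2 table; 3.841 is the 0.05 critical value
stat = chi_square_2x2(table)
print(f"chi-square = {stat:.2f}; significant at p < .05: {stat > 3.841}")
```

The per-group counts feed directly into the bar charts described in step three, and the chi-square statistic addresses the significance question in step five; a statistics package would report an exact p-value, but the fixed 3.841 cutoff suffices for a quick screen of a 2x2 table.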
Advising and Assessment Coordinator
Department of Biology
University of Hawai‘i at Mānoa
Department of Biology
University of Hawai‘i at Mānoa
Aguilar, S., Lonn, S., & Teasley, S. D. (2014, March). Perceptions and use of an early warning system during a higher education transition program. In Proceedings of the Fourth International Conference on Learning Analytics and Knowledge (pp. 113-117). New York, NY: ACM.
Baepler, P., & Murdoch, C. J. (2010). Academic analytics and data mining in higher education. International Journal for the Scholarship of Teaching and Learning, 4(2). Retrieved from http://dx.doi.org/10.20429/ijsotl.2010.040217
Campbell, J. P., DeBlois, P. B., & Oblinger, D. G. (2007). Academic analytics: A new tool for a new era. EDUCAUSE Review, 42(4), 40-57. Retrieved from http://er.educause.edu/articles/2007/7/academic-analytics-a-new-tool-for-a-new-era
Chen, X., & Soldner, M. (2013). STEM attrition: College students’ paths into and out of STEM fields [NCES 2014–001]. Washington, DC: National Center for Education Statistics. Retrieved from: http://nces.ed.gov/pubs2014/2014001rev.pdf
Cuseo, J. (2008). Assessing advisor effectiveness. In V. N. Gordon, W. R. Habley, T. J. Grites, et al. (Eds.), Academic advising: A comprehensive handbook (2nd ed., pp. 369-385). San Francisco, CA: Jossey-Bass.
Hurt, R. L. (2007). Advising as teaching: Establishing outcomes, developing tools, and assessing student learning. NACADA Journal, 27(2), 36-40.
Kumar, V., & Chadha, A. (2011). An empirical study of the applications of data mining techniques in higher education. International Journal of Advanced Computer Science and Applications, 2(3), 80-84. Retrieved from http://citeseerx.ist.psu.edu/viewdoc/download;jsessionid=DBD3F08BF3DACBB2151DB5CF1769F39C?doi=10.1.1.631.6325&rep=rep1&type=pdf
Moore, C., & Shulock, N. (2009). Student progress toward degree completion: Lessons from the research literature. Sacramento, CA: California State University, Sacramento, Institute for Higher Education Leadership & Policy. Retrieved from https://moodle.elac.edu/pluginfile.php/62351/mod_resource/content/0/R_Student_Progress_Toward_Degree_Completion.pdf
Troxel, W. G. (2008). Assessing the effectiveness of the advising program. In V. N. Gordon, W. R. Habley, T. J. Grites, et al. (Eds.), Academic advising: A comprehensive handbook (2nd ed., pp. 386-395). San Francisco, CA: Jossey-Bass.
Cite this using APA style as:
Kraft-Terry, S. & Kau, C. (2016). Manageable steps to implementing data-informed advising. Retrieved from the NACADA Clearinghouse of Academic Advising Resources Website http://www.nacada.ksu.edu/Resources/Clearinghouse/View-Articles/Manageable-Steps-to-Implementing-Data-Informed-Advising.aspx