Session Abstracts


Plenary Sessions

Plenary 1: The Assessment Process in Academic Advising: An Overview

Understanding the foundation of the assessment process is essential not only to your involvement in this Institute, but also to your leading the development of an assessment plan for your institution. This session will provide an overview of the Institute and introduce the framework of assessment and the process to follow. It will also provide instruction for the development of mission, vision, and programmatic goals.

Plenary 2: Developing and Reflecting on Student Learning and Process/Delivery Outcomes - Getting to the Vital Few

Student learning outcomes (SLOs) and process/delivery outcomes (PDOs) are significant aspects of the assessment process. This topical session will present the processes involved in generating effective student learning and process/delivery outcomes in the assessment of academic advising. Through facilitated discussion and activities, participants will connect SLOs and PDOs to establish the foundation of the assessment plan.

Plenary 3: Identifying Opportunities for Learning: Mapping the Experience

The assessment cycle includes the development of a vision, mission, goals, and process/delivery (PDO) and learning (SLO) outcomes through an academic advising lens. The next phase of the cycle requires identifying what opportunities are available for learning, how students and advisors are exposed to those opportunities, and when learning occurs through the mapping process. Beyond matching PDOs and SLOs with experiences and opportunities to clarify expectations, mapping recognizes milestones at which to gather evidence. In this plenary session, presenters will provide an overview of the mapping process by charting what occurs between developing outcomes and measurement, during which participants will create a map of a PDO and an SLO.

Plenary 4: Identifying and Using Multiple Measures

The use of multiple measures to gather evidence about your desired outcomes is paramount. A single measure – whether a survey, focus group, observable behavior, etc. – may or may not measure what you expect it to measure (i.e., be a valid measure), and there is no way to understand its validity without comparison data. Therefore, for any given desired outcome, multiple measures are necessary. This presentation offers examples of types of measures and includes participants’ identification of multiple measures for their developed process/delivery outcomes and student learning outcomes. To demonstrate the concept of multiple measurement, a problem set will be introduced along with “dummy data” to engage participants in discussion.

Plenary 5: Interpreting, Sharing, and Acting Upon Outcome Data

So what do I do now that I have outcome data? This session will draw from Plenary 4, and continue with a discussion of what outcome data mean and how to adapt data presentation to various parties. This session will also include a discussion of how to implement change based on the evidence gathered. Institute participants who have completed an assessment cycle and acted upon data on their campus will share their insights as well. The session ends with a brief discussion of assessment as scholarly research.


Special Topic Sessions

Using a Change Model to Create a Winning Assessment Team

Assessment activities lead to changes that enhance academic advising services. Change isn’t always easy or well received, and you’ll need the right team to help your campus transition and to keep it motivated to continue making changes based on assessment. This session will use the change model introduced by John Kotter to develop an advising assessment team, examining common challenges in creating your team, staying focused on the goal, and keeping your campus motivated in implementing your assessment plan.

Focus Groups and Academic Advising

As part of the assessment process, multiple measures are critical to gathering rich data. Utilizing focus groups provides an opportunity to “drill down” into your data and gather more specific, qualitative information, as well as to validate your quantitative results. This session will provide a step-by-step approach to facilitating focus groups and gathering data through them.

Advisor Evaluation: Beyond the Student Satisfaction Survey

Evaluation of academic advisors can serve various positive functions, yet most institutions that conduct advisor evaluation rely solely on student satisfaction surveys. This presentation will introduce participants to the importance and role of advisor evaluation, review the differences between evaluation and assessment of advising, discuss different aspects and uses of advisor evaluation, and provide suggestions on how best to conduct advisor evaluation and collect and utilize evaluative data. Participants are encouraged to share their successes and challenges in conducting advisor evaluations at their respective institutions.

What Do the Data Mean?

This Special Topics presentation is designed for Institute participants who want to go beyond the identification of specific outcome measures to use in their assessment processes to actual interpretation of outcome data. Two specific examples of outcome data will be presented, discussed, and interpreted by those attending this Special Topics session. In addition, there will be time allowed for group discussion and analysis of outcome data from at least one participant who brings this information to the session, so bring your outcome data for a chance to have it reviewed by Institute participants from around the world!

Creating an Advising Syllabus

Developing an advising syllabus can enhance the concept of advising as teaching for academic advisors and students. This session will discuss the purpose and advantages of utilizing an advising syllabus, how to create one, and examples of existing syllabi at various institutions.

Developing a Student Satisfaction Survey

This Special Topics presentation considers the basic steps in developing that always-popular evaluation method, the student satisfaction survey. It outlines important aspects to consider when developing a survey as an evaluation tool, along with some basic strategies for evaluating advising services. Participants will become familiar with the basic processes involved in developing and administering a student satisfaction survey as one of multiple measures used in the overall assessment of their respective academic advising programs.

Developing a Rubric as One Measurement Tool in the Assessment Process for Academic Advising

Rubrics are descriptive scoring tools used to communicate the level to which a desired concept or learning outcome has been achieved. Simply put, rubrics are scales, using numeric values (1, 2, 3, etc.) or descriptive terms (e.g., developing, demonstrating some understanding, achieved competency) to describe the levels of achievement along a continuum of performance or quality. This session will describe the components of a rubric, provide an overview of its uses, and provide practice in developing rubrics for use in academic advising assessment.

Using Technology for Assessment

What is technology’s role in assessment? How can we apply assessment principles when assessing technology tools? Participants will be introduced to a technology model for advising that supports evaluating and assessing the use, implementation, and integration of technology into daily practice. Participants will learn about the role technology can play in advancing effective academic advising by contributing to data-driven decision-making; the use of technologies designed for teaching and learning to advance advising practice; and the use of technology for student evaluation, program assessment, and the delivery of advising. If you attended the Technology Seminar, you will receive similar information in this session.

Key questions:

  • Do your goals for advising align well with the technology you use to deliver advising?
  • Does your use of technology realize its full potential for advising?
  • If advising is teaching, how can you use technologies designed for teaching and learning to advance your advising practice?