
Student Affairs Assessment and Resources


Assessment Mission

Our mission is to foster a growth-oriented, feedback-driven culture that enhances student success, engagement, support, and development. The committee is dedicated to the collaborative implementation of qualitative and quantitative assessment practices. We draw on the standards of professional organizations to develop, assess, and improve the quality of the student experience.

Assessment Goals

  • To produce relevant and reliable data regarding our student population to inform, enhance, and transform our programs and services to help students achieve success holistically in various domains.
  • To provide tools to staff in the division to gain knowledge, skills, and strategies about the various ways to conduct assessment and analyze outcomes unique to their areas.
  • To promote and share stories about our students, campus community and programs.
What is Student Success?
A successful A&M-SA student experience will maximize a student's personal, academic, and professional potential through impactful and holistic programs, academic development, and personalized services that provide a foundation for persistence through degree attainment and success after graduation.
Assessment Cycle
At the core of the assessment cycle sit the missions, visions, and values of A&M-SA and the Division of Student Affairs. It is imperative that our work from planning to implementation, from data collection to reporting, all reflect the institution and the division.
Learning Outcomes
What is a Learning Outcome? Formally, learning outcomes articulate and define the transformational goals that the learner will achieve as a result of participating in the experience. In other words, what do we want learners to take away from their participation?

Assessment Terminology

Each entry below gives the term, its definition, and the source of the definition.
Action Plan The actions the program/department is taking in response to the assessment in an attempt to make improvements. Typical action plans include goals, assigned tasks, and deadlines.
Assessment

“Assessment is an ongoing process aimed at understanding and improving student learning. It involves making our expectations explicit and public; setting appropriate criteria and standards for learning quality; systematically gathering, analyzing, and interpreting evidence to determine how well performance matches those expectations and standards, and using the resulting information to document, explain and improve performance.” (Angelo, 1995)

“Assessment is any effort to gather, analyze, and interpret evidence which describes institutional, departmental, divisional, or agency effectiveness.” (Upcraft & Schuh, 2001)

“Assessment is the process of providing credible evidence of resources, implementation actions, and outcomes undertaken for the purpose of improving the effectiveness of instruction, programs, and services.” (Banta & Palomba, 2015)

Cited in definition
Assessment Plan A tool used to document or record the process of assessment as well as the results or product of the assessment.
Benchmark An internal or external standard used to compare assessment findings. The measurement of individual or group performance against an established standard. University of Texas – El Paso
Closing the Loop Using assessment results to inform program changes or improvements is the final step in the assessment cycle, commonly referred to as “closing the loop.” California Polytechnic State University
Cohort A group whose progress is followed by means of measurements at different points in time. University of Wisconsin – Stevens Point
Culture of Assessment A culture of assessment in student affairs is defined as a set of shared values and beliefs that inspire an ongoing, embedded practice of data collection and analysis that informs decision-making for the purpose of continuously improving programs and services at all levels of the organization (Leary, 2018). Cited in definition
Data Analysis The process of systematically applying statistical and/or logical techniques to describe, condense, summarize, and evaluate data. Northern Illinois University
Data Collection The process of gathering and measuring information on variables of interest in an established systematic fashion that enables one to answer research questions, test hypotheses, and evaluate outcomes. Northern Illinois University
Direct Method Direct methods require students to demonstrate their knowledge and skills. Examples include quizzes, rubric-based evaluation, document analysis, observation, portfolios, visual methods, one-minute assessments, and case studies. Applying & Leading Assessment
Evaluation “Evaluation refers to the process of determining the merit, worth, or value of something, or the product of that process.” (Scriven, 1999). Any effort to use assessment evidence to improve institutional, departmental, or divisional effectiveness (Upcraft & Schuh, 1996). Cited in definition
Focus Groups A collective group of participants assembled to provide perspectives or feedback toward a specific subject or product. Focus groups are carefully planned discussions conducted by trained moderators and are useful for identifying themes or group consensus. University of Nebraska Kearney; University of Texas – El Paso
Formative Assessment Assessment conducted while a program or activity is occurring. Feedback allows practitioners to make adjustments in real time.

Pro: Allows programs to adapt to student needs during development.
Con: Large-scale programs may not be easily modified quickly enough to apply changes.
Applying & Leading Assessment
Goals Broad, general statements describing what a program or service intends to accomplish. Goals should align with the mission of the program and the institution. Clemson University
Indirect Method Indirect methods require participants to reflect learning rather than demonstrate it directly. These commonly include surveys or feedback from faculty and staff. Applying & Leading Assessment
Interview A data collection procedure in which one person asks questions of another. Interviews may be assessed through rubrics, checklists, or thematic analysis. Weber State University
Journaling / Student Reflections Reflective writing where students record thoughts and feelings. Themes can be identified and evaluated using rubrics. Weber State University; Clemson University
Key Performance Indicator (KPI) Metrics used to track performance of tasks critical to achieving objectives. Example: tracking the number of students successfully registering for courses after orientation. University of Texas – El Paso
Method of Assessment Identifies how learning outcomes will be measured, such as surveys, interviews, or focus groups. Multiple methods typically produce richer data. Clemson University
Mission Statement A formal summary of the aims and values of an organization. It defines purpose, direction, and the population served. University of Texas – El Paso
Mixed Methods Procedures that combine both qualitative and quantitative data collection methods in a single study. Clemson University
Objectives Specific, concrete ways that goals are achieved through program activities and student learning. Clemson University
Observations Assessment method that involves directly watching participants demonstrate learning outcomes. Observations may be measured using rubrics or checklists. Weber State University
Outcomes Statements describing the intended results of a program or activity. Learning outcomes focus on changes in student knowledge, skills, attitudes, or behaviors. Clemson University
Performance-Based Assessment Direct observation and evaluation of performance in a real-world task, often assessed using a rubric or scoring guide. University of Wisconsin – Milwaukee
Population The entire group from which a sample is drawn or about which conclusions are made. Clemson University
Portfolio Assessment A direct measure where collections of student work are reviewed over time to evaluate learning outcomes. University of Wisconsin – Stevens Point
Pretest Assessment conducted before a learning experience to establish baseline knowledge or skills. Clemson University
Posttest Assessment conducted after a learning experience to measure changes or improvements. Clemson University
Qualitative Method Assessment methods focused on descriptive data such as interviews, focus groups, observations, reflections, and open-ended questions. Clemson University
Quantitative Method Assessment methods primarily based on numerical data such as surveys and pre/posttests. Clemson University
Random Sampling Sampling method where each participant has an equal chance of being selected. Clemson University
Reliability The consistency of an assessment instrument. Reliable tools produce similar results over time and across populations. University of Wisconsin – Stevens Point
Research Differs from assessment in that it contributes to theory development and broader knowledge beyond a single institution.
Response Rate The percentage of participants who completed a study compared with the number originally sampled. Clemson University
Rubric A scoring guide with criteria used to evaluate student work aligned with learning objectives. Clemson University
Sampling Selecting a subset of a population for participation in an assessment activity. Clemson University
Summative Assessment Assessment conducted after a program concludes to measure outcomes and inform future practice. Applying & Leading Assessment
Survey / Questionnaire A set of structured questions used to gather information from participants for analysis. University of Wisconsin – Stevens Point; Clemson University
Triangulation Using multiple assessment methods to determine whether results consistently support the same conclusions. University of Wisconsin – Milwaukee
Validity Indicates whether an assessment accurately measures the intended learning outcomes. University of Wisconsin – Stevens Point
Vision Statement Describes what the institution should look like and how it should behave as it fulfills its mission. University of Wisconsin – Stevens Point
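Several of the quantitative terms defined above (random sampling, sample size, and response rate) reduce to simple computations. The short Python sketch below illustrates them; the population size and completion count are hypothetical numbers chosen only for illustration.

```python
import random

def response_rate(completed, sampled):
    """Percentage of sampled participants who completed the assessment."""
    return 100.0 * completed / sampled

# Simple random sampling: every student has an equal chance of selection.
population = [f"student_{i}" for i in range(1, 501)]  # hypothetical population of 500
rng = random.Random(42)                               # seeded for reproducibility
sample = rng.sample(population, 50)                   # draw a sample of 50, without replacement

# Suppose 32 of the 50 sampled students completed the survey.
rate = response_rate(32, len(sample))
print(f"Sample size: {len(sample)}; response rate: {rate:.0f}%")
# prints "Sample size: 50; response rate: 64%"
```

In practice, divisions typically report the response rate alongside findings so readers can judge how well the sample represents the population.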

Citations

Schuh, J. H., Biddix, J. P., Dean, L. A., & Kinzie, J. (2016). Assessment in student affairs. San Francisco, CA: Jossey-Bass.

Upcraft, M. L., & Schuh, J. H. (2001). Assessment practice in student affairs: An applications manual. San Francisco, CA: Jossey-Bass.

Angelo, T. A. (1995). Reassessing and defining assessment. American Association for Higher Education Bulletin, 48, 7-9.

Banta, T. W., Palomba, C. A., & Kinzie, J. (2015). Assessment essentials: Planning, implementing, and improving assessment in higher education (2nd ed.). San Francisco, CA: Jossey-Bass.

Leary, M. (2018). Tracing the change process: Fostering and sustaining student affairs cultures of assessment.

Scriven, M. (1999). The fine line between evaluation and explanation. Research on Social Work Practice.

Additional resources referenced from:

  • California Polytechnic State University Academic Programs and Planning

  • Clemson University Student Affairs Assessment Glossary

  • Northern Illinois University Faculty Development and Instructional Design Center

  • University of Nebraska Kearney Division of Academic and Student Affairs

  • University of Texas at El Paso Student Affairs Assessment

  • University of Wisconsin–Milwaukee Division of Student Affairs

  • University of Wisconsin–Stevens Point Division of Student Affairs

  • Weber State University Student Affairs Assessment Resources