
“Education is not a preparation for life; education is life itself.” — John Dewey

What is Program Assessment?

For every course and academic program, faculty set learning goals for their students. Progress toward these goals is evaluated when grades are assigned. However, grading uses student data to make judgments about an individual student’s performance; it tells us little about learning across a program. If we want to consider how the quality of learning might be improved in an academic program, we need a way to measure, and think about, learning overall.

Program-level assessment of student learning is conducted by faculty who wish to improve the quality of student learning in their programs and provide justification for the resources needed to maintain them. These assessments involve the systematic collection of information about student learning across courses in an academic program to (a) determine how well students are meeting program goals and (b) inform decisions about program content, delivery and pedagogy. In this way, assessment empowers faculty to make evidence-based decisions about program design and resource allocation.

Note: Academic departments conduct two types of program assessment projects: (1) curricular and learning assessment, and (2) a comprehensive self-study conducted once every five to seven years. The latter, known as Academic Program Review (APR), aims to identify a program’s challenges and opportunities in areas of quality, service, efficiency and resources. Assessment of student learning is just one component of APR. This page focuses on learning assessment, not on the APR process. To learn about APR, see the Provost’s website.

How to Assess Learning within a Program

When designing a learning assessment activity, it is important to consider the following:

  • Assessment is an opportunity to address faculty concerns related to student learning, and it should start with genuine questions.
  • Articulate the learning goals of the program and the ways in which progress on these goals can be meaningfully measured.
  • Good assessment is manageable and sustainable. It is neither prudent nor productive to measure everything, and it is unlikely that you will be able to assess progress on all of a program’s learning goals in a single year. Remember that learning assessment is an ongoing process, and create a long-term strategy for the assessment of your program’s learning goals.

“Assessment must not be treated as an end in itself, but rather as one (albeit important) means for educational improvement. Educational values should guide not only what instructors choose to assess but also how they do so.” (APA on Outcomes Assessment, 2008)

There are two fundamentally different types of program-level assessment activities: (1) activities that explore the ways in which the program is meeting its mission and goals, identifying both strengths and opportunities for improvement, and (2) activities that determine the effects of an intervention carried out to strengthen the program in some way, such as a change in curriculum or instructional methodology. Naturally, the second type can be evaluated meaningfully only if it is preceded by an activity of the first type.

1. Define the mission and goals of the program.

If a department has not done so recently, it should review and possibly update its mission statement. Does the mission identify the student population the program intends to serve? Does it say what students should achieve by taking part in the program? Mission statements should also be consistent with the Queens College mission statement and the University’s.

An effective mission statement states what the program does, for whom it does this, and what makes the program distinctive. It may also describe the vision or values of a department. For example:

The mission of (program name) is to (primary purpose) by (distinctive functions/activities) for (students served).

If the department has not yet identified its program learning goals, define the learning goals for students in the program along with the corresponding desired, observable outcomes, i.e., student learning objectives (SLOs), which describe how students will demonstrate that a learning goal has been met. These objectives should be clearly stated on program materials (e.g., the program website, student handbook, etc.). Students need to understand what their learning objectives are so that they can plan how to achieve them and stay cognizant of how well they are achieving them. Use of action verbs from Bloom’s Taxonomy can help to ensure that your program learning objectives are measurable. For example:

Students in the program will: 

  • identify…
  • explain…
  • differentiate…
  • integrate…
  • apply…
  • challenge…
  • discover…
  • develop…
  • compose…

2. Align program goals to the curriculum.

A “Curriculum Map” identifies how the courses in a program contribute to the program’s learning goals, and can address questions such as:

  • Is the curriculum well-sequenced for student learning?
  • What does the curriculum say about our student learning priorities?
  • How do the required learning experiences of our curriculum shape a graduate of our program?

Mapping your program goals to the curriculum shows the extent to which the learning goals are addressed in the program and facilitates understanding of how courses are sequenced and “fit” together. This can help you to structure program content in ways that support students in meeting program goals, and it also shifts instructors’ focus from “my course” to “our program.” Hence, mapping supports faculty collaboration and curriculum revision, as the activity often reveals curricular strengths, gaps, necessary and unnecessary overlaps, and needs (e.g., Do students have enough practice to achieve a goal?).

When program curriculum maps are made available to students, they help students understand (1) how each course is intended to contribute to specific knowledge or skills of the program, and (2) how elective courses could be strategically selected to further strengthen knowledge or skills or to explore new areas of interest.

A common method for developing a basic curriculum map is to:

  • Identify the courses you will be mapping to your program learning goals
  • List the courses in a row across the top of an Excel Sheet/chalkboard/sheet of paper
  • Place program learning goals in a column down the right side
  • Convene the faculty teaching within your degree program for 1 to 2 hours
  • Have faculty identify the program learning goals they are covering in their courses
  • Transfer the finished map into Excel
  • ALWAYS follow up the creation of a map by examining it

Carnegie Mellon’s Eberly Center offers a great Blank Mapping Tool (xls).
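
If your department prefers a script to a spreadsheet, the same matrix can be assembled programmatically. Below is a minimal sketch in Python (assuming the pandas library is available); the courses, goals, and the “I/R/M” (Introduce/Reinforce/Master) notation are illustrative assumptions, not a required format.

import pandas as pd

# Hypothetical program learning goals (rows of the map).
goals = [
    "Identify key disciplinary concepts",
    "Apply appropriate research methods",
    "Compose clear written arguments",
]

# Faculty-reported coverage per course: I = Introduced, R = Reinforced,
# M = Mastered. An empty string means the course does not address that goal.
coverage = {
    "101":            ["I", "I", ""],
    "201":            ["R", "R", "I"],
    "301 (capstone)": ["M", "M", "M"],
}

curriculum_map = pd.DataFrame(coverage, index=goals)
print(curriculum_map)

# The follow-up examination: flag goals that are never introduced,
# or that are never reinforced or mastered later in the sequence.
for goal, row in curriculum_map.iterrows():
    marks = set(row) - {""}
    if "I" not in marks:
        print(f"Gap: '{goal}' is never introduced.")
    if not marks & {"R", "M"}:
        print(f"Gap: '{goal}' is never reinforced or mastered.")

# Save for sharing with the department (requires the openpyxl package).
curriculum_map.to_excel("curriculum_map.xlsx")

Reviewing the printed gaps together as a department is one way to satisfy the “always examine the map” step above.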

3. Establish how well the curriculum accomplishes its goals.

Design an assessment activity:

(a) Articulate your assessment questions: It’s best to structure an activity around one or two main research questions:

  • How well are students meeting program goals?
    • Are students meeting some program goals better than others?
    • What does the evidence tell us about our program’s strengths and challenges?
  • How does the curriculum support and reinforce what we want students to learn?
    • How well are new majors prepared for mid-level courses?
    • How well are majors prepared for capstone courses?
    • How well are graduates prepared for life after college?

(b) Choose assessment methods. Program assessments may include:

    • Curriculum coverage (but recall Barr and Tagg’s caution that covering content is not the same as producing learning)
    • Retention rates, graduation rates, and within-group gaps (see the sketch below)
    • Attainment of learning goals* (DFW rates; student surveys; etc.)
    • Comparisons to other departments, national norms, etc.
    • Placement success: jobs/salaries; graduate schools
    • Alignment with industry/professional needs

See the section on Assessment Methods below for more example measures of student learning.
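
For the quantitative measures above, the arithmetic is straightforward. Here is a minimal sketch in Python that computes a one-year retention rate and a within-group gap from a hypothetical cohort roster; the field names and data are illustrative assumptions, not an institutional schema.

# Hypothetical cohort roster.
cohort = [
    {"id": 1, "group": "transfer", "retained": True},
    {"id": 2, "group": "transfer", "retained": False},
    {"id": 3, "group": "freshman", "retained": True},
    {"id": 4, "group": "freshman", "retained": True},
    {"id": 5, "group": "freshman", "retained": False},
]

def retention_rate(students):
    """Share of students retained into the following year."""
    return sum(s["retained"] for s in students) / len(students)

overall = retention_rate(cohort)
by_group = {
    g: retention_rate([s for s in cohort if s["group"] == g])
    for g in {s["group"] for s in cohort}
}
gap = max(by_group.values()) - min(by_group.values())

print(f"Overall retention: {overall:.0%}")  # 60%
print(f"By group: {by_group}")
print(f"Within-group gap: {gap:.0%}")       # 67% (freshman) vs 50% (transfer) -> 17%

The same pattern extends to graduation or DFW rates; only the definition of the True/False outcome changes.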

Important considerations: When designing an assessment activity, consider both the logistics of the activity and the validity of the assessment. Often, assessing progress on program learning goals involves a series of decisions dependent on your sample size, resources, needs, and other project variables. To conduct meaningful and worthwhile assessment activities, consider the following questions:

    • Is the activity clearly related to the mission and goals?
    • Will it provide meaningful, actionable results?
    • Is the program prepared to carry out the activity?
    • Is the scope of the activity suitable for completion in the timeframe planned?
    • For assessments that span multiple years, how well does the activity integrate with the overall project?

(c) Explain and document the procedures used to determine whether the curriculum is effective in achieving the goals sought by the department. Copies of surveys or other instruments used should be included in the appendices of the assessment report.

NOTE: The OIE supports assessment activities with institutional data and survey projects. Learn more here: OIE Assessment Support

4. “Close the Loop.”

Identify where the program can improve and make suggestions for the future: An essential part of assessment is to use the information obtained either to publicize the program’s strengths or to identify opportunities for improvement and to act on them. If the activity identified a noteworthy strength, how will this result be disseminated? If the activity reveals an area of concern, what concrete actions will the department take to address it? Use assessment results to spark meaningful conversation and collaboration among faculty as to what steps should be taken to enhance student learning (content, structure, alignment, delivery, pedagogy, etc.), and how implemented changes might be assessed in the future. In this way, assessment can foster a culture of inquiry and innovation.

*Remember, no individual faculty member has sole responsibility for ensuring that students achieve any one of the program’s learning goals. Student attainment of learning goals should result from the collective learning experiences students engage in during their time at the college. Therefore, learning assessment must not be used to evaluate any individual faculty member; it measures the performance of the instructional program as a whole.

When an assessment activity is complete, address the following questions in the report:

  • In hindsight, was this a useful activity? Can the usefulness of the activity be improved?
  • Does the report indicate clearly what actions the academic unit will be taking, based on the results of this activity?
  • Are there suggestions for future annual activities based on the results of this one?

Program-Level Assessment Methods

When properly aligned to learning goals, assessment methods help us determine whether our students are learning what we think they’re learning, as well as how well program content, delivery and pedagogy support students’ opportunities to achieve program learning goals. Methods will vary depending on the learning goal(s), and progress should be measured using several methods. Aim to measure both directly and indirectly: direct measures capture how well students demonstrate that they have achieved a learning goal, while indirect measures capture students’ (or others’) perceptions of how well they have achieved it.

Examples of Direct Measures
  • Capstone Courses: Senior seminars or designated assessment courses where program learning goals are integrated into assignments.
  • Collective Portfolios: Faculty assemble samples of de-identified student work from various courses and use this “collective portfolio” to assess specific program learning goals. Portfolios should be assessed using normed scoring rubrics, and expectations should be clarified before portfolios are examined.
  • Content Analysis: A procedure that analyzes the content of written documents. The analysis begins with identifying the unit of observation, such as a word, phrase, or concept, and then creating meaningful categories to which each item can be assigned. For example, a student’s statement that “I learned that I could be comfortable with someone from another culture” could be assigned to the category of “Positive Statements about Diversity.” The number of times this type of response occurs can then be quantified and compared with neutral or negative responses addressing the same category (see the tallying sketch after this list).
  • Course Assessment: Data collected from de-identified course assessments can be analyzed to assess program learning goals if the assessments are aligned to those goals.
  • Embedded Questions in Assignments: Questions related to program learning goals are embedded within course exams. For example, all sections of “research methods” could include a question or set of questions relating to your program learning outcomes. Faculty score and grade the exams as usual and then copy exam questions that are linked to the program learning outcomes for analysis. The findings are reported in the aggregate.
  • Locally developed essay questions: Faculty develop essay questions that align with program learning goals. Performance expectations should be made explicit prior to obtaining results.
  • Locally developed exams: Faculty create an exam that is aligned with program learning goals. Performance expectations should be made explicit prior to obtaining results.
  • Normed Scoring Rubrics: When developed and normed by faculty, these rubrics can be used to holistically score any product or performance, such as essays, portfolios, recitals, oral exams, research reports, etc. A detailed scoring rubric that delineates the criteria used to discriminate among levels is developed and used for scoring. Generally, two raters review each product, and a third rater is employed to resolve discrepancies (see the scoring sketch after this list).
  • Observations: Observations can be made of any social phenomenon, such as student presentations, students working in the library, or interactions at student help desks. They can be recorded as a narrative or in a highly structured format, such as a checklist, and they should be focused on specific program objectives.
  • Primary Trait Analysis: A process of scoring student assignments by defining the primary traits that will be assessed and then applying a scoring rubric for each trait.
  • Standardized Achievement and Self-Report Tests: Select standardized tests that are aligned to your specific program learning goals. Score, compile, and analyze data. Develop local norms to track achievement across time and use national norms to see how your students compare to those on other campuses.

Source: Allen, M., Noel, R. C., Rienzi, B. M., & McMillin, D. J. (2002). Outcomes Assessment Handbook. Long Beach, CA: California State University, Institute for Teaching and Learning.
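
The tallying step of Content Analysis (above) lends itself to a short script. Here is a minimal sketch in Python, assuming statements have already been coded into categories by faculty readers; the categories and statements are hypothetical illustrations.

from collections import Counter

# Each student statement has been assigned to a category by a reader.
coded_statements = [
    ("I learned that I could be comfortable with someone from another culture",
     "Positive statement about diversity"),
    ("The readings changed how I think about other cultures",
     "Positive statement about diversity"),
    ("I'm not sure the intercultural project taught me anything",
     "Neutral statement about diversity"),
    ("Working across cultures was frustrating at times",
     "Negative statement about diversity"),
]

counts = Counter(category for _, category in coded_statements)
total = len(coded_statements)

# Compare the incidence of positive, neutral, and negative responses.
for category, n in counts.most_common():
    print(f"{category}: {n} ({n / total:.0%})")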
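
Likewise, the two-rater procedure described under Normed Scoring Rubrics can be made explicit. Below is a minimal sketch assuming a hypothetical 1-4 rubric and one common adjudication rule (scores within one point are averaged; larger discrepancies go to a third rater, and the two closest scores are then averaged); a department’s norming session would set the actual rule.

def resolve_score(rater1, rater2, third_rater=None):
    """Resolve two rubric scores on a hypothetical 1-4 scale."""
    if abs(rater1 - rater2) <= 1:
        return (rater1 + rater2) / 2
    if third_rater is None:
        raise ValueError("Discrepancy: a third rater must score this product.")
    # Assumed convention: average the two closest of the three scores.
    s = sorted([rater1, rater2, third_rater])
    pair = (s[0], s[1]) if (s[1] - s[0]) <= (s[2] - s[1]) else (s[1], s[2])
    return sum(pair) / 2

print(resolve_score(3, 4))                 # 3.5 (raters agree within a point)
print(resolve_score(1, 4, third_rater=3))  # 3.5 (third rater adjudicates)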

Examples of Indirect Measures
  • Curriculum Maps: Curriculum Maps (or Degree Maps) are used to summarize the relationship between program goals and courses, course assignments, or course syllabus objectives to examine congruence and to ensure that all objectives have been sufficiently structured into the curriculum.
  • Focus Groups: A series of planned discussions among homogeneous groups of 6-10 students who are asked a carefully constructed series of open-ended questions about their beliefs, attitudes, and experiences. The session is typically recorded and later transcribed for analysis. The data is studied for major issues and recurring themes along with representative comments.
  • Interviews: Conversations or direct questioning with a student or group of students, conducted in person or by telephone. Interviewers should be trained to follow agreed-upon procedures (protocols).
  • Surveys: can be used to (a) assess learning, (b) assess student needs, (c) obtain student feedback, and (d) find out what happens to students after graduation. A program assessment survey should cover a number of topics, including: program goals, courses, instructors, educational supports, DEI, and career prep/professional development. Please request assistance with survey design and/or administration from OIE. If designing your own instrument, please consult Best Practices in Survey Research, as good data only comes from good methodology. See our Program Assessment Survey Template or our Alumni Outcomes Survey Template for examples of appropriate question items and answer options.
  • Transcript Analysis: Transcripts are examined to see whether students followed expected enrollment patterns or to examine specific research questions, such as exploring differences between transfer students and students who enrolled as freshmen.

Source: Allen, M., Noel, R. C., Rienzi, B. M., & McMillin, D. J. (2002). Outcomes Assessment Handbook. Long Beach, CA: California State University, Institute for Teaching and Learning.

To learn more about program-level learning assessment methods, browse our Assessment Resources or peruse guides to program-level assessment at other colleges, such as Clark College, Carnegie Mellon, Cornell, or Rochester Institute of Technology. The Office of Institutional Effectiveness can support your assessment projects with institutional data, surveys, and applied research expertise.

Assessing Student Learning in Our General Education Program

A general education at Queens College is an education in the liberal arts and sciences – courses that introduce students to the perspectives and knowledge of many disciplines. Our goals for providing students an education in the liberal arts and sciences have endured since the college was founded in 1937, even as the courses and requirements have changed over the years. Since 2013, entering freshmen and transfer students have followed a liberal arts curriculum that fits the framework of the CUNY Pathways Initiative. Pathways course proposals are reviewed by several committees to ensure that they achieve the required set of Pathways Student Learning Outcomes, which represent the broad skills and knowledge that all undergraduate students are expected to attain as a result of their educational experiences at the College.

Because the general education program is the core of the undergraduate student experience, its assessment is of utmost importance. A recent Pathways Syllabi Project and an assessment of English 110 (First Year Writing at QC) indicate that we need to rethink the general education program at QC. To this end, the Curriculum Strategic Planning Committee, a faculty- and staff-led working group, is tasked with determining how our General Education curriculum might be improved.