TAMU System Student Learning Outcome: Discipline-Specific Knowledge
The Texas A&M University System embraces a common view of important student outcomes and is accountable for sustained measurement of those outcomes.
For all TAMU System universities, the rationale for assessing student learning outcomes (SLOs) originates primarily from efforts to maintain institutional effectiveness, defined as the process of identifying expected outcomes, assessing the extent to which they are achieved, and providing evidence of improvement based on analysis of the results.
Upon graduation, students will demonstrate mastery of the depth of knowledge required for their respective degrees.
Performance on this outcome is evaluated against the following scale:
• All criteria met and results exceed expectations with little room for improvement.
• Most criteria met and results indicate mastery of the objective with some room for improvement.
• Acceptable number of criteria met and results meet expectations with room for improvement.
• Some criteria met and results indicate need for improvement.
• Few criteria met; results indicate need for significant improvement, or no or insufficient results reported to measure performance of the objective.
The assessment of student learning by an academic program begins in the curriculum development process, in which every new program must develop an assessment plan that includes clearly defined expected outcomes. Once a new program is established, the program joins the other academic units at A&M-Central Texas in annually assessing student learning through the university’s Continuous Improvement Process.
The Continuous Improvement Process requires each academic unit to submit annual operational plans, called Continuous Improvement Plans, for its areas. The plans contain the unit’s mission, goals, and student learning outcomes; assessment measures; and annual performance targets. The process also requires each area to report annual findings and create action plans based on those findings.
The primary responsibility for data collection and analysis associated with the assessment of student learning rests with the faculty.
Emerging. A review of programs and processes revealed inconsistent results; examples are provided in the analysis below.
A&M-Central Texas continually uses assessment data to improve student learning. The list below provides examples of assessment activities from academic programs within each of the university’s three schools, the analysis of results, and recommendations for improving student learning.
• One of the expected outcomes in the Bachelor of Science in Social Work program is for students to demonstrate skill in evaluating research and utilizing findings to enhance practice competencies. To assess this expected outcome, program faculty utilized multiple assessment methods, including the evaluation of students by their external agency field supervisors, student artifacts from the Methods of Social Work Research course, and the research proposals developed in that same course. Although students met faculty expectations for the student artifacts, they were rated below faculty expectations on the field instructor evaluation.
To help improve the program, faculty began using new teaching strategies in the Methods of Social Work Research course. The faculty members are also considering dividing the course into a two-part social work class in order to introduce the concepts at a slower pace and to cover statistics (currently an elective) in the research classes. Finally, because the research course requires substantial writing, program faculty began advising students to take their policy course (a writing-intensive social work class) before taking the research course.
• Program faculty in the Bachelor of Science in Mathematics program assessed their students’ ability to understand abstract mathematical ideas. To assess the program outcomes, faculty utilized course-embedded exam data from Math 409 and Math 432. Although students met the faculty-established criteria for success, program faculty decided to enhance student learning by incorporating more technology into instruction. In particular, program faculty began using the TI-89 graphing calculator to explain abstract concepts. Furthermore, $1,000 was spent adding MathLab to campus labs and installing Geometer’s Sketchpad.
• Students in the Bachelor of Science in Psychology program are expected to use critical thinking; skeptical inquiry; and, when possible, the scientific approach to solving problems related to psychology. To measure the attainment of this outcome, a comprehensive examination is administered in each section of PSY 335. Faculty have determined that the benchmark for success for this measure requires that 75% of students score 70% or higher on the exam items related to the theory and content of psychology. Only 25% of students earned a 70% or higher on the objective component of the exam in Spring 2011. As an alternative measure, data were also collected in PSY 420 in the same semester, and 13% of the students met the anticipated benchmark.
While the results may appear discouraging, faculty continue to ask relevant questions to gain insight into them. In addition, program faculty decided to begin using the ETS Major Field Test (MFT) for psychology as an external exam to measure the effectiveness of the psychology program. The first administration of this test is scheduled for Spring 2012. After reviewing the initial MFT results in Fall 2012, program faculty began considering significant changes to enhance the program’s curriculum. To aid this process, the department formed a task force to study the potential program changes.
The institution conducted a global analysis of its assessment process and developed several actions, which appear in the institution’s new academic master plan. Those actions are:
1) Implement a Center for Innovative Teaching by Fall 2016. This center will provide professional development for faculty as they assess both their courses and programs.
2) Include program-specific peer and aspirational programs and key performance indicators in the assessment process.
3) Ensure that 100% of programs with discipline-specific certification and accreditation pursue or utilize certification and/or accreditation standards by Fall 2020.
4) Use learning analytics to improve academic programs by Fall 2016.
As indicated by the institution’s new academic master plan, A&M-Central Texas is restructuring its assessment process. The new process will encourage the increased use of discipline-specific assessment and evaluation standards. This will also facilitate peer and aspirational benchmarking.