
3.3.1.1 Educational Programs

The institution identifies expected outcomes, assesses whether it achieves these outcomes, and provides evidence of improvement based on analysis of the results in each of the following areas: educational programs, including student learning.

Judgment of Compliance


Narrative of Compliance

Prairie View A&M University identifies expected outcomes related to student learning, assesses those outcomes in its educational programs, and provides evidence of improvement based on assessment results. Rigorous assessment occurs in selected programs through professional accreditation and in all academic units through the institution's own policies and procedures.

Assessment through Professional Accreditation
Receiving accreditation and/or program approval requires detailed self-analysis and proof of best practices in the discipline. Often, improvements must be demonstrated before accreditation is awarded, so the documents produced by Prairie View's educational units show evidence of expected outcomes, assessment measures, and related curricular changes. The self-study reports submitted to NCATE for teacher education, NAAB for Architecture, NLNAC for nursing, AACSB for business, and CSWE for social work are discussed here as examples.

The National Council for the Accreditation of Teacher Education (NCATE) 
The College of Education has been accredited by NCATE since 1958. In its most recent re-accreditation report for the Board of Examiners, the College defined twelve core dispositions for teaching candidates and its E-FOLD-P (Educator as Facilitator of Learning for Diverse Populations) conceptual framework [1]. Specifically, applicant institutions are asked to respond to the question, "How is the unit assessment system evaluated and continuously improved? Who is involved and how?" NCATE asks for details about initial teacher preparation, advanced teacher preparation, content knowledge, pedagogy, understanding of technology, and licensure exam pass rates, among other measures. The College of Education explained the use of students' Professional Electronic Portfolios (PEP), which are reviewed both at entry to and exit from clinical practice and include field journals, work samples, and evidence of professional development; results of the state Professional Development and Appraisal System (PDAS) instrument, which has 51 criteria across eight domains; and employer surveys.

The National Architectural Accrediting Board (NAAB)
The excerpts from the School of Architecture's self-study for the NAAB detail the program mission; overall program assessment of multiple categories by strengths, weaknesses, and plans of action; and results of an extensive graduate exit survey with 37 items [2]. One of the School's major outcomes is increased enrollment, and therefore the faculty has focused on visiting high schools and community colleges. As a result, between 2001 and 2004, enrollment more than doubled, from 163 to 341, and the Hispanic student population increased from 2 to 27 during the same time period.
           
Based on comments received after a July 2000 site visit examined student work, multiple curricular changes were made, particularly to address perceived weaknesses in architectural history, regional traditions, comprehension of building codes, sustainable design, and non-Western design. The School also hired licensed architects to fill faculty positions, started an internship program in 2004-05 to help students professionalize, and brought in Houston-area architects to serve on design studio juries [3].

The National League for Nursing Accrediting Commission (NLNAC)
In 2008, NLNAC returned to Prairie View to conduct a five-year review of its graduate post-master's certificate in nursing, first accredited in 2003. The panel praised the "state-of-the-art simulation and standardized patient clinic for teaching and competency evaluation" [4]. Standard IV of the evaluation report, Curriculum and Instruction, explains the mission and program objectives for each of the three M.S.N. degree concentrations in the College, as well as alignment of course content, learning assessments, and program competencies [5]. Another part of the self-study for this accrediting body requires demonstration of educational effectiveness through "an identified plan for systematic evaluation including assessment of student academic achievement." The College of Nursing detailed its application of the Stake Model to decide whether poor outcomes indicate a need to revise current methodologies or to develop new ones altogether and its use of graduation rates, licensure exam pass rates, job placement rates, and exit surveys as measurements of success.

The Association to Advance Collegiate Schools of Business International (AACSB)
In April 2006, the College of Business received AACSB accreditation. The rigorous self-study report documented the degree to which courses cover three specified business contexts: 1) ethical and global issues; 2) the influence of political, social, legal and regulatory, environmental, and technological issues; and 3) the impact of demographic diversity on organizations [6]. The College also had to demonstrate coverage of accounting, behavioral science, economics, mathematics, statistics, and oral and written communication, and it highlighted a new ethics course launched in Fall 2004 to prepare students to handle moral dilemmas in the workplace. Furthermore, AACSB requires that an accredited program "should be systematically monitored to assess its effectiveness and should be revised to reflect new objectives and to incorporate improvements based on contemporary theory and practice." Based on results of the Major Field Test (MFT) administered to students each year, the College raised its admissions standards and established the Center for Business Communications to improve critical thinking skills.

The Council on Social Work Education (CSWE)
In 2003, the Commission on Accreditation for the Council on Social Work Education granted a full eight-year reaccreditation of the Bachelor of Social Work at Prairie View A&M. In Evaluative Standard One, the Commission looks for a mission statement, program goals, objectives derived from mission and goals, systematic measurement of objectives, and the use of measurement results in curriculum planning and design [7]. Prairie View A&M's Social Work faculty outlined five program goals and eleven aligned objectives. For Evaluative Standard Six: Curriculum, the self-study included 85 pages detailing vertical and horizontal integration of courses, means on outcomes from multiple social work classes, and assessment measures like field supervisor evaluations [8]. This section also indicated changes like targeted courses for majors to take for the University core curriculum (ethics, biology, and anatomy and physiology) and a proposed new foundation course focusing on ethics and diversity. This class, SOWK 3213: Human and Cultural Diversity Social Work, is now required of all social work majors and emphasizes "advocacy for social and economic justice specific to race, ethnicity, gender, age, religion, disability, social class, nationality, and sexual orientation."

While accreditation demonstrates solid, outcomes-based curriculum processes, not all programs are accredited. Moreover, Prairie View A&M University has not been content to rely on external agencies to conduct all of its assessment, and it has therefore placed increasing emphasis on developing in-house evaluation practices.

Framework and Policies for Internal Assessment
Evaluation of educational programs has a long history at Prairie View A&M, especially in pursuit of accreditation, but until 2000 it did not occur in a systematic manner. That changed with the near-simultaneous release of three major planning initiatives: the TAMUS Integrative Plan, with its six azimuths of achievement [9]; the Office for Civil Rights Priority Plan, with five key measures; and Closing the Gaps from the Texas Higher Education Coordinating Board [10], which included an accountability system for outcome expectations [11], such as state licensure pass rates for engineering, nursing, and education graduates.

Given the demands of the three new plans, the first steps to standardize assessment efforts at the University began promptly in 2001 with the establishment of the University Assessment Council (UAC) [12]. In 2003, Prairie View A&M began work on strategic plans for 2004-2008 (called the Quality Enhancement Plan, or QEP, at the time). Based on the azimuths in the TAMUS Integrative Plan, with attention to concerns raised by Closing the Gaps and the Priority Plan, each QEP included six parts: 1) updated vision and mission statements; 2) assessment of progress; 3) alignment with the six TAMUS azimuths by identifying the unit's strengths, weaknesses, and action plans to remedy any deficiencies; 4) priority goals and objectives for 2004 to 2008; 5) new funding needs; and 6) conditions that could change priorities in the future. These four-year plans were completed for all educational units, from academic colleges like the College of Business [13] and College of Engineering [14] to departments like Mathematics [15] and Languages and Communications [16], by their respective faculty, who have primary responsibility for the content, quality, and effectiveness of coursework as stated in Comprehensive Standard 3.4.10, Responsibility for Curriculum.

In the spring of 2004, the Provost and Vice President of Academic Affairs created the Assessment Coordinator position to work in tandem with the UAC. The Coordinator began the implementation of the first formal institutional assessment cycle in Fall 2004 by requesting that academic programs submit their assessment plans for review and feedback, including 1) unit mission statements aligned with the University mission; 2) unit core values aligned with University core values; 3) goals, objectives and outcomes; 4) detailed assessment cycles; and 5) results explaining how data would be used for improvement. By Fall 2005, units completed their first assessment cycle under the new program. Results data forms from the Colleges of Business and Engineering and the Departments of Chemistry, Mathematics, and Psychology show how outcomes were analyzed and utilized in terms of future improvements [17]. For example, the College of Business included customer service as one of its outcomes, and based on two different survey instruments, it hired a new staff member to focus on academic advising. In Engineering, when results showed deficiencies in problem-solving ability, the College developed a seven-step methodology for students to follow when tackling engineering problems. At the departmental level, the faculty in chemistry proposed several strategies for improving its majors' grasp of scientific research, including more organized departmental seminars to expose students to different fields of chemistry and more written reports in courses.

As noted in Core Requirement 2.5, for 2005-2006 the UAC, renamed the Institutional Effectiveness Council (IEC), disseminated general reviews of each unit's mission and objectives, assessment timelines, and assessment cycles. More importantly, each unit was asked to complete a new planning worksheet listing measures, targeted populations, administration period and frequency, purpose, and outcome assessed [18]. Compiled Assessment Cycle by Unit worksheets from academic units are available in the supporting documentation [19] and indicate that educational programs selected multiple, appropriate measures and goals for students. The Department of Music and Drama lists music juries, senior recitals, and proficiency exams as its measures for students studying performance. The College of Education uses student teaching portfolios, pre- and post-tests, surveys, and certification exams to gauge the proficiency and content knowledge of its undergraduate teaching candidates.

Collection and analysis of academic program assessment data were aided during this year by the introduction of TrueOutcomes, a web-based outcomes management system. It allows units to input target outcomes and supporting courses, create standard rubrics, align courses with core curriculum outcome expectations, assign and assess comprehensive student portfolio assignments, and suggest strategies to improve results in the future [20]. In the past three years, use of TrueOutcomes has gradually expanded to all educational programs and has been especially convenient in assessing distance education courses, where materials are already in electronic format; students simply upload their assignments to the eCourses class management system as well as to TrueOutcomes.

Guided by the Institutional Effectiveness & Assessment Monograph: A Practical Guide to Assessment Planning for 2007-2008 [21], an Institutional Assessment Measures Matrix [22], and the University's "closing the loop" continuous improvement model [23], by 2007, academic units completed two additional tasks: 1) aligning outcomes for each course in the undergraduate and graduate catalogs with overall program outcomes, if applicable, and 2) indicating whether courses focus on teaching, reinforcing, or integrating various competencies. This helped ensure consistency of vision and mission, as well as assessment in the appropriate courses for each outcome. Examples from developmental education [24], the School of Architecture [25], the College of Arts and Sciences [26], the College of Business [27], the College of Education [28], the College of Engineering [29], and the College of Nursing [30] are presented here. These matrices include learning outcomes for general education courses in the core curriculum, which are assessed by the programs that offer them. Comprehensive Standard 3.5.1 discusses the core curriculum and "true core" in more detail.

Current Reports of Outcome Assessment and Evidence of Improvement
To standardize the reporting of outcomes, measures, results, and improvement on each program outcome from cycle to cycle, an assessment report (known internally as Form A-1) was designed for use by educational programs. Information from those forms is distilled into a two-question Results Data document that summarizes what the data reveal and what actions will be taken based on the data to improve student learning. Noteworthy findings in multiple educational programs are highlighted below, organized by college/school.

College of Agriculture and Human Sciences
Compiled assessment reports from the College of Agriculture and Human Sciences [31] show strong understanding of assessment in areas like the accredited dietetics internship program. To measure the degree to which students "demonstrate the ability to communicate effectively with clients and peers," faculty set the goal that at least 90% of preceptors would rate interns' communication skills as satisfactory or better. During the 2003-2005 assessment cycle, 100% of the 72 student intern projects received were so rated, and based on these excellent results, the instructors targeted four courses in which to emphasize presentations and communications skills to continue the trend.

In Human Nutrition and Food, accredited by the American Dietetic Association, faculty determined that the student project they were assessing for outcomes related to learning the effects of social behavior needed to be modified to include more specific activities on the causes and effects of various behaviors related to food intake. Similarly, for the program outcome on "translating health care system concepts," several projects and assignments are being revised to address policies and reimbursement issues. For another outcome, even though students met program goals, faculty plan to revise the curriculum to include the 2008 updates to the Nutrition Care Process (NCP) created by the American Dietetic Association, so students are current with research and best practices in their desired career field [32].

The Results Data documents from Family and Community Services indicate that for program outcome 3, "students will understand human growth and development, parent/guardian/educator role and responsibilities, and career opportunities in human development, education, and services," faculty will create an online tutorial to improve on the 79.2% of students who reached the benchmark score [32].

School of Architecture
Reports from the School of Architecture show implementation of data-driven program changes in some areas [33]. For the Construction Science program, one outcome includes student completion of the Construction Manager Certification Exam and/or the Leadership in Energy and Environmental Design (LEED) exam. To reach this goal, freshmen are introduced to the exams and their value in 1000-level courses, practice exams are provided in targeted senior seminars at the 4000 level, and graduates are surveyed 3 to 5 years later to determine their performance on the exams. Because the outcome is relatively new, no data are available yet.

In the Community Development graduate program, to determine whether "students demonstrate the knowledge and skills required to identify, evaluate and assess the growth and development of urban and rural communities, design and planning trends within the four fields of community development," embedded exam questions and papers are assessed. Eighty percent of written presentations met learning objectives, and faculty are designing a more specific rubric for future assessment cycles. When students showed weak performance on a case study designed to assess the outcome "students will demonstrate knowledge of the broad principles of design (Land Use/Planning) development," the faculty redesigned the course to include a practicum on development solutions.

College of Arts and Sciences
In the College of Arts and Sciences, evidence of multiple assessment cycles and improved student learning can be found in several units [34] [35]. Four examples of instructional and/or curricular changes based on evaluation of data are summarized briefly below and presented in the order in which they appear in the lengthy supporting documentation:

Army ROTC, which has increased the number of cadets at Prairie View A&M from 34 in 2003 to 90 in Fall 2009, has seen its students' success in the Leader Development and Assessment Course (LDAC) increase steadily from 2007 to 2009. Every summer after their junior year, cadets from across the country attend the LDAC to be assessed on their potential to serve as commissioned officers in the U.S. Army. In summer 2007, 8 of 10 (80%) graduated, in summer 2008, 11 of 12 (92%) graduated, and in summer 2009, 16 of 16 graduated (100%) [35]. Because LDAC serves as the culmination of three years of ROTC training, both in the classroom and during lab exercises, successful graduation from the course is evidence of understanding and mastery of the student learning objectives in the program, particularly program outcomes #2, "hands-on experience in managing physical, financial, and human resource" and #3, "ability to function on multi-disciplinary teams" [34].

To improve performance on the Department of Chemistry's third undergraduate program outcome, "graduates are able to use modern instrumentation and classical techniques to design experiments and properly record the results of their experiments," instrumentation use was expanded in sophomore organic chemistry classes, and hands-on training was provided in four senior courses. In Spring 2007, only 55% of students performed competently on three embedded final exam questions about instrumentation and experiment design, short of the goal of 65%. Therefore, instructors are increasing coverage of these topics, and the department acquired two new pieces of equipment to enhance student training [34]. To assess its core courses, 1033 and 1043, the Department of Chemistry uses an embedded question on the final exam. In 2007-2008, 68% of students supplied the correct answer, and in 2008-2009, 71% of examinees replied correctly. The Department credits the new tutorial service, funded by a Department of Education grant, with the increase in performance. Faculty also are investigating the use of online homework systems like OWL or Sapling Learning in multiple chemistry courses to enhance student learning [35].

In the Department of Languages and Communications, English faculty use selected performance measures from senior portfolio assessments to determine success on the outcome "students will write informed, organized essays that demonstrate appropriate engagement with primary and secondary sources." Mean scores rose on all four indicators between 2006 and 2008 but fell in 2009. To address the results, faculty who have taught the course met to standardize the assignments so they more clearly incorporate crucial course outcomes. After a concerted effort to broaden students' exposure to different literary periods and genres, particularly through the addition of film thanks to a National Endowment for the Humanities grant, English faculty saw a steady increase in students' "demonstrate[d] knowledge of major historical periods and literary movements in culturally diverse literature," program outcome #3. Mean scores on this measure rose from 2.7 on a 4.0 scale in 2006 to 3.4 in 2007 to 3.5 in 2008 for the assessment of a paper in the core literary theory class. These results correlate with student perceptions indicated on the alumni survey, where students strongly agree that their English courses "covered an extensive body of literature and literary genres and that these courses gave them the opportunity to read diverse types of literature." Average scores on a 5.0 scale were 4.8 in 2009, up from 4.75 in 2007 and 4.5 in 2006 [35].

To determine achievement in their second program outcome, "History graduates should demonstrate significant knowledge of major events and trends in their area of concentration," History faculty designed pre- and post-tests in four introductory courses in U.S. history and world civilization. In Fall 2007, scores improved 14.56% in HIST 1313, a core class in American history, while in Spring 2008 only a 9.2% increase was seen. By examining performance on specific questions, instructors identified weaknesses in students' grasp of the colonial and revolutionary periods and therefore placed more emphasis on these areas in future offerings of the course [34]. The program also started administering the ETS Major Field Test in 2008, with results showing low performance compared to examinees nationwide. Based on these data and the absence of European history courses in the curriculum, the faculty is submitting two new classes in this area for approval in Fall 2009. The history department also is designing an in-house practice test to help pinpoint deficiencies before students take the field test [35].

College of Business
Due to a 2007 change in emphasis in the AACSB accrediting guidelines, the College of Business (COB) adjusted its assessment strategy in Spring 2008. The current focus is on course-embedded assessment to gather artifacts and information that are then used to determine overall program effectiveness. Assessment teams evaluate homework assignments, embedded exam questions, presentations, and papers according to rubrics developed by the College faculty. For example, to assess outcome 5b, "students will demonstrate an ability to deliver a professional quality presentation accompanied by appropriate technology," students are videotaped giving PowerPoint presentations, and a panel of faculty judges scores their work based on software skills, handouts and visual aids, vocal and bodily delivery, and principles of public speaking. In 2008, 33% of student presentations were rated excellent, with another 60% rated acceptable [36].

The COB also uses the ETS Major Field Test, administered annually from 2002 to 2005 and semiannually since 2006, as one of its direct measures of student learning. Between Spring 2005 and 2007, for example, mean scores in 5 of the 9 business areas—economics, management, finance, marketing, and international issues—increased by as much as 17%. The new area of information systems had a 12% improvement just between Fall 2006 and Spring 2007 [37].

As a result of the College's assessments, changes have been made to course content and pedagogy. For example, the COB strengthened its free tutorial services in quantitative areas like statistics and mathematics, launched online tutorials in response to scores on the 2008 MFT, and added Skill Assessment Management (SAM) software to all sections of introductory Management Information Systems courses to augment classroom instruction.

College of Education
In addition to maintaining accreditation standards for NCATE as described above, the College of Education collects data on learning outcomes and improvements for Prairie View's assessment framework. The College looks primarily at licensure exam pass rates and responds to declines in student performance. In the Alternative Teacher Certification Program (ATCP), for example, 73 of the 97 candidates (75%) passed both parts of the TExES state licensure exam in 2004-05. Pass rates dropped substantially the following year to just 68% and continued to decline in 2006-07 (67%) and 2007-08 (66%). To reverse the trend, the program implemented mentor training to better address the needs of resident-year teachers and to support colleagues in exploring new and varied strategies, and it aligned curricula more closely with TExES standards [38].

In the doctoral program in Educational Leadership, outcome 2 is "students will be able to analyze, influence, and practice public policy decisions at the local, state, regional and national levels that impact education and communities," and to assess mastery in this area, students in EDUL 7223: Educational Governance wrote an analysis of a national education agenda. In Spring 2009, 4 out of 5 students (80%) met the acceptable or target level for performance on the rubric. While pleased with the results, faculty still decided to change future iterations of the class by having students complete more practice activities identifying research components of policy plans.

College of Engineering
The College of Engineering engages in broad outcomes-based assessment of its courses, coupled with indirect student survey measures. This is one of the reasons why seven of its eight degree programs are accredited by ABET; the eighth, computer engineering, was established in 2003 and is targeted for accreditation review in 2010. Because assessment data for Outcome A, "an ability to apply knowledge of mathematics, science, and engineering," indicate student achievement consistently below the target level in multiple engineering programs, it is highlighted as the primary example below to show how assessment data are analyzed and new approaches are implemented to improve learning.

To gauge Outcome A in chemical engineering, assignments from 10 different sophomore, junior, and senior courses are analyzed. When results fell below the target performance level of 75% in Fall 2005, the College recommended a first-semester freshman introductory course covering study skills, use of math in engineering problems, and practice with engineering software and techniques. By Fall 2006, CHEG 1011: Introduction to Engineering and CHEG 1021: Introduction to Chemical Engineering Laboratory were approved and offered to incoming students [39]. Even with the new courses, students have continued to perform below benchmarks over the last three assessment cycles (averages of 59 in 2006-07, 66 in 2007-08, and 59 in 2008-09).

Other departments found the same trends when they assessed Outcome A. In Fall 2006 four electrical engineering classes were identified for assessment, and results in 3 out of 4 classes failed to reach the 75% benchmark, with averages of 61.8%, 55%, and 65.6%. Low performance was seen in two of the four classes selected in civil engineering (56.4% and 57%) and in 2007-08 in two out of three courses identified for assessment in computer science (70% and 47%). The Department of Electrical Engineering examined ten different courses in 2006-2007, and while 4 out of 5 classes at the 3000 level failed to meet benchmarks, only 1 of the 5 courses at the 4000 level fell short of a 70% target.

Because foundational mathematics seemed to be the common denominator in low performance for Outcome A, three new engineering applications labs introduced in 2008 now must be taken as co-requisites for algebra and trigonometry, Calculus I, and Calculus II. Concepts from students' mathematics classes are reviewed and reinforced from an engineering perspective. The College theorizes that stronger mathematics skills will help students improve their problem solving and application of concepts. To this end, instructors also are being asked in 2009-10 to include more math practice problems in homework and quiz assignments, provide students with reading and homework assignments on simple engineering systems requiring use of mathematics, and offer tutorials on Engineering Equation Solver (EES) software and numerical analysis to students [40].

During the Fall 2009 semester in the Department of Mechanical Engineering, Classroom Performance System (CPS) response "clickers" and wireless receivers are being piloted in selected courses most directly related to Outcome A. Students can register their answers to the instructor's questions in class, and the faculty member immediately can assess comprehension of materials based on the number of correct responses. The department also is creating study clubs for mathematics, thermal science, and mechanical design to encourage studying and peer tutoring [40].

College of Juvenile Justice and Psychology
Every year since 2004, the College of Juvenile Justice and Psychology has administered the ETS Major Field Test (MFT) to graduating seniors because its areas of focus correlate closely with program learning outcomes.

In Criminal Justice, student mean scores on these comprehensive exams increased steadily from 2004 to 2006 but dropped in 2007, with a partial rebound in 2008. In the theory section, for example, scores rose from 25.8 in 2004 to 38.4 in 2006, then fell to 30 in 2007 and 35 in 2008. On questions related to police practice, mean scores improved from 37.8 in 2004 to 52.8 in 2006 but were 45 in both 2007 and 2008. To address this trend, particularly low results in theory and in research methodology and statistics, faculty adjusted course content and added a required course in criminology theory, and Research Methods II is now offered more frequently with a variety of instructors. Between 2004 and 2009, the College received approval to offer new courses in human trafficking, women in corrections, law and society, geographical information systems, women in criminal justice, and computer applications in the field to enhance student achievement in the program's seven learning outcomes [41]. Criminal Justice also uses exit surveys as an indirect measure; since Spring 2005, when 84% of respondents agreed the program prepared them for employment in the field, student satisfaction rose to 99% in Summer 2007 [42].

In Psychology, results of the six-area MFT have improved most in sensory and physiological psychology (mean score of 19.9 in 2004 rose to 28 in 2007), clinical and abnormal psychology (mean score of 31.5 in 2004 rose to 48 in 2007), and social psychology (mean score of 31 in 2004 rose to 40 in 2007). Results in measurement and methodology have remained relatively stagnant, with mean scores between 37 and 38. Faculty added course electives in research (PSYC 4823) and cognitive psychology (PSYC 4513) in 2008 to improve student learning in program outcome 1, "demonstrate knowledge of developmental theories including cognitive and social development," and outcome 3, "demonstrate knowledge of theories related to memory types and processes." The department also introduced a new senior paper core requirement (PSYC 4843) to address deeper understanding of measurement and methodology and increased use of SPSS statistical software across the curriculum [41].

College of Nursing
With programs accredited by the National League for Nursing Accrediting Commission (NLNAC) and the Commission on Collegiate Nursing Education (CCNE) and approval from the Texas Board of Nursing (BON) for one degree, the College of Nursing is well versed in assessment of learning outcomes and student success [43]. To assess its outcomes, the faculty expect their students' overall mean percentile ranks on relevant components of the ATI Standardized Examination to fall above the national mean. For 2006-2007, Prairie View A&M's nursing students scored at the 62.6th percentile, just above the national mean of 62.42. In 2008-2009, scores in each of the fourteen ATI subject areas were available for comparison, and they showed that students in the College of Nursing tend to perform above national means in child health (ranking improved from 37 in 2006-07 to 64 in Spring 2009), childbearing, and mental health (ranking improved from 70.2 in 2006-07 to 76.7 in 2009) [44] [45]. Notably, the Spring 2009 cohort performed above the national mean on the comprehensive predictor exam, administered at the end of nursing coursework to determine comprehensive knowledge base. Also, while this group scored just 60.40% on the critical thinking entrance portion, they ranked in the 76th percentile on the exit exam [44].

Student surveys also are conducted each semester to determine satisfaction with the nursing program. In 2006-2007, 60.7% of students were satisfied with their academic development [44], compared to 73.18% in the 2008-2009 survey [44].

Student Evaluations of Courses
Each semester, the Student Opinion Survey (SOS) is administered in all courses at Prairie View A&M University. Currently, the surveys are disseminated through the eCourses class management system, which allows students to complete the evaluations at their leisure and readily includes distance education students in the process. Several items are related to course outcomes: the instructor stimulated intellectual curiosity; the instructor motivated students to do their best work; this course challenged students intellectually; and the instructor accomplished what he or she set out to do [46]. Deans and department heads receive SOS results, which are then shared with individual faculty members for use in their annual performance reviews. Such assessments allow programs to improve student achievement at the most fundamental level, in the classroom.


© 2009 Prairie View A&M University