

Standard 2: Assessment System and Unit Evaluation

The unit has an assessment system that collects and analyzes data on applicant qualifications, candidate and graduate performance, and unit operations to evaluate and improve the performance of candidates, the unit, and its programs.


2.1.a Assessment System Summarize content, construct, process, and evaluation of the unit assessment system, its key assessments in relation to professional, state, and institutional standards, and its use in monitoring candidate performance, program quality, and unit operations.

Our unit assessment system provides regular and comprehensive data on program quality, unit operations, and candidate performance throughout each stage of its programs. Unit assessments share a common focus on cultivating highly skilled educational professionals based on professional, state and institutional standards as guided by our conceptual framework. Our assessment structure requires that initial and advanced candidates demonstrate key proficiencies through a variety of formative and summative assessments at multiple junctures in all programs, as scored by program faculty, clinical faculty and school partners. As a part of the continuous improvement process, data are compiled and stored through Tk20, analyzed by faculty and assessment staff and shared with relevant stakeholders, and then used to make program innovations that prepare better teachers, leaders and other educational professionals.

Transition Points:

Both initial and advanced programs are structured by four transition points. Data are collected at each transition point.

Initial Teacher Education Preparation Transition Points

  1. Transition Point 1: Admission to Teacher Education
    • Declared major
    • Completion of 60 hours
    • Completion of EDCI 1301 Introduction to the Teaching Profession & EDFR 2301 Sociocultural Context of Schooling
    • Writing Skills Test
    • Cumulative & Major GPA of 2.5
    • No grade lower than a C in coursework
    • Completion of Recognition of Professional Disposition Form
  2. Transition Point 2: Required Teacher Education Coursework and Program Participation
    • Required Teacher Preparation Coursework
    • EDCI 3330 Designing & Assessing Instruction to Promote Student Learning
    • EDCI 3314 Methods of Teaching Math & Science
    • EPSY 4322 Human Development & Student Learning
    • EDCI 4327 Methods of Teaching Elementary Social Studies & English Language Arts
    • EDCI 4328 Implementing & Assessing Effective Secondary Content Pedagogy (Sectioned by Program Content Area)
    • *Individual programs have additional course requirements
    • Professional Disposition Survey Assessment (4)
    • Completion of Abbreviated TWS (EDCI 4322, EDCI 4327 & EDCI 4328)
  3. Transition Point 3: Admission to Student Teaching
    • Completion of prerequisites & field experience hours
    • Senior standing (90 semester hours)
    • Passing score on TExES Content and PPR Exams
    • Proficient Professional Disposition Assessments
    • Cumulative GPA of 2.5
    • No grade lower than a C in Teacher Preparation coursework
  4. Transition Point 4: Graduation & Recommendation for Certification
    • Successful Student Teaching Evaluations (6)
    • Successful Completion of TWS
    • Proficient Exit Professional Disposition Assessment
    • Completion of Student Teaching Hours
    • Competent Student Teaching Evaluations
    • Passing Score on TWS
    • State Exit Survey
    • Employer Surveys

Assessments in teacher education coursework require candidates to demonstrate content knowledge, general pedagogical knowledge and skills, knowledge and skills related to content-specific pedagogy, and professional dispositions, all aligned with our conceptual framework. Upper-level field experiences assess candidates' capacity to put this knowledge into practice in actual classrooms and, through the Abbreviated Teacher Work Sample, to measure and reflect upon their impact on student learning. To ensure that candidates are prepared to have a positive impact on student learning during student teaching, candidates are required to pass the TExES Content and PPR exams and to demonstrate proficiency in each professional disposition assessed by unit faculty through the Professional Disposition Survey.

During student teaching, candidates demonstrate their ability to apply content knowledge and pedagogical knowledge to positively influence student learning through six Student Teaching Evaluations, modeled after the Texas Professional Development and Appraisal System (PDAS) and conducted by field supervisors and cooperating teachers. The full TWS completed during student teaching provides a valid and reliable measure of candidate proficiency in all of the preceding areas as well as of candidates' impact on student learning. To monitor the effectiveness of our program and our graduates, the unit draws from Candidate Exit Surveys, Principal Surveys administered through ASEP, and other follow-up studies that provide data regarding candidate success in the classroom and other relevant professional education settings (please see Exhibits 1.3.1 and 1.3.j).

Key unit transition points at the advanced level are as follows:

  1. Transition Point 1: Admission to Advanced Programs
    • Master’s:
      • Undergraduate GPA of 3.0, or over 3.0 in the last 60 hours of Undergraduate study
      • Applicants whose undergraduate GPA in the last 60 credit hours is less than 3.0 must submit official Graduate Record Examination (GRE) scores above 150 Verbal, 141 Quantitative, and 4.0 Analytical
      • Curriculum Vita or Resume
    • Ed.D.:
      • Graduate GPA of 3.25
      • GRE within last 5 years
      • Five years of experience in education or related fields
      • Verification of 3 years of teaching experience at an accredited institution
      • Passing TOEFL score of 600 (paper-based test) or 100 (Internet-based test) for international applicants from non-English-speaking countries
      • Professional Statement
      • Résumé or curriculum vita
      • 3 letters of recommendation
      • Completion of Recognition of Professional Disposition Form
      • *Additional Requirements vary by program.
  2. Transition Point 2: Program Coursework
    • Master’s:
      • EDFR 6300 Foundations of Research in Education
      • EPSY 6304 Human Development and Student Learning
      • EDFR 6388 Sociocultural Foundations of Education / Couns. 6364 Multicultural Counseling
    • Ed.D.
      • EPSY 8318 Advanced Applications of Human Development and Cognition
      • EDFR 8322 Advanced Sociocultural Foundations for Education
      • EDFR 8300 Research Methods in Education
    • Professional Disposition Survey (4)
  3. Transition Point 3: Comprehensive Exam and/or Transition into Internship
    • Completion of prerequisite coursework
    • Field experience hours where applicable.
    • Proficient Level Professional Disposition Assessment
    • Program Specific Internship Requirement
    • Ed.D.: Pass Comprehensive Exam
  4. Transition Point 4: Completion/Graduation
    • Meet all degree requirements
    • Master's: Pass Comprehensive Exam
    • Ed.D: Successfully defend dissertation
    • Employer Surveys
    • Completer Surveys

Key unit assessments at the advanced level require that candidates demonstrate an in-depth understanding of knowledge in their fields as delineated in professional, state, and institutional standards. Additionally, all advanced candidates are required to demonstrate their belief that all students can learn, as well as a commitment to fairness and other dispositions aligned with relevant professional standards, through the Professional Disposition Survey, which is administered at multiple junctures throughout programs by multiple faculty members.

Assessments conducted in the common core courses required of all master's and doctoral candidates, which are devoted to diversity, student learning and cognition, and research, measure candidates' capacity to analyze data related to their work, reflect on practice, and use research and technology to support and improve student learning and other professional outcomes, as aligned with state and professional standards and guided by the CoE conceptual framework.

Comprehensive exams at the master's and doctoral levels require students to demonstrate these proficiencies as well as program-specific content through prompts that demand critical analysis and synthesis. Assessments related to advanced field and clinical placements measure candidates' ability to apply these proficiencies in relevant professional settings as well as their ability to bring research to bear on such work (please see Exhibit 3.3.f). The Doctoral Dissertation assesses advanced candidates' breadth of knowledge concerning the field of Curriculum & Instruction, in-depth knowledge of their specialization area, and proficiencies and skills related to conducting original research aimed at educational improvement and/or innovation. Completer exit surveys and employer surveys are also conducted.

Additionally, all initial programs, and all advanced programs for which SPA standards exist, conduct additional assessments aligned with relevant SPA standards. Data from unit and program assessments are regularly and systematically compiled, aggregated, summarized, analyzed and shared publicly on our website, through our advisory groups, and through semesterly data summits. These data are then used to make improvements in candidate performance, program quality and unit operations. Our multi-tiered assessment committee structure reviews and refines unit assessments to establish their fairness, accuracy and consistency and to combat bias.

 

2.1.b Data Collection, Analysis, and Evaluation Summarize processes, timelines, and outcomes of data collection, analysis, and evaluation of candidate performance, program quality, and unit operations.

Our current assessment system operates across three interrelated levels: assessment of candidates, assessment of programs and unit operations assessment. In accordance with the policies, procedures and schedule described in Exhibit 2.3.d., these data are regularly and systematically compiled, aggregated, summarized, analyzed and shared with the public with the aim of boosting candidate performance and improving program quality and unit operations.

Candidate Performance Data

Our unit regularly compiles, aggregates, summarizes and analyzes data concerning candidate progress through a variety of key unit assessments that measure candidate proficiencies aligned with our conceptual framework and state and professional standards. Our assessment system also includes course-level assessments such as work samples, micro-teaching, research papers, case study analyses, performance-based projects, examinations and reflective writing, which are used to assess candidate progress between transition points and beyond key assessments. This alignment is reflected in the syllabi in Exhibit 1.5.b, which show how professional education courses and their assessments align with our conceptual framework and relevant professional standards.

Candidate performance data are collected, stored and summarized primarily through Tk20. This data-management system also integrates data such as enrollment and GPA from institutional databases such as DATATEL with course-related and faculty-performance data from Blackboard Outcomes. Tk20 is coordinated by the CoE Office of Institutional Effectiveness and Development (OIED), which oversees and monitors the collection, compilation, aggregation and disaggregation of unit and program-assessment data. OIED also works with the Office of Teacher Preparation and Accountability to facilitate state reporting (such as Title II reports) regarding candidate performance. Data from our assessment system are shared with CoE faculty and relevant CLA and CSMT faculty at data summits held at the start of each semester. Data are also shared with our school partners and with the broader community through the Teacher Education Council, the Lower Rio Grande Valley Teacher Education Advisory Council, student and community advisory committees and our website. Each of these venues provides academic and administrative units with feedback on academic programs, activities and other key issues related to the effectiveness of our unit.

Program Data

Program-level data are regularly compiled, summarized, aggregated and analyzed, and are used to make program modifications primarily through the process of external (SPA) and internal program review, as well as state review where relevant. Our unit offers nineteen initial programs leading to teacher certification. Of these programs, four are nationally accredited by NASM and thirteen are recognized with conditions (RWC) by their respective specialized professional associations (SPAs) and are resubmitting for full recognition.

At the advanced level, our master's-level counseling programs are nationally accredited by CACREP. The Master of Educational Technology program is fully recognized by AECT. The Master of Education in Educational Leadership is fully recognized by ELCC at both the district and building levels. Our three master's-level specializations in Special Education are RWC by CEC. The master's Bilingual Education program, the master's C&I program, and the Ed.D. in C&I do not have SPA standards. These programs adhere to the same policies, practices and schedule of rigorous program review, thus ensuring that credible data are consistently compiled, analyzed, shared and used to improve programs across our unit.

Unit Operation Data

The CoE OIED and the dean's leadership team work together to coordinate data collection related to unit operations. Together they collect, organize, maintain and analyze institutional and other data used to support college strategic planning, decision making, management and institutional evaluation. Currently our unit evaluation plan is built on assessment of administrator and faculty effectiveness and productivity, departmental assessments, program-level assessment, and summative assessment regarding candidate proficiencies. Data gathered toward this end include regional accreditation data, demographic analyses, general retention data, credit-hour production data, external fund reports, resource-allocation data, annual faculty reviews, dean and department chair evaluations, faculty workload data and program-improvement plans. The unit then uses these assessment data to make improvements in mission-critical areas, especially teaching and learning, as well as in institutional improvement, faculty enhancement, accreditation and accountability.

A Culture of Assessment

Faculty play a central role in tending to the unit's assessment system and fostering the unit's emerging culture of assessment. In addition to designing, refining and conducting many of the assessments upon which the unit assessment system relies, faculty provide leadership in guiding unit assessment and in the data-driven decisions that result. The unit has established three faculty-led assessment committees. The Unit Assessment Committee (UAC), established in fall 2013, meets twice a semester and is composed of faculty representatives from the CoE, the College of Liberal Arts (CLA) and the College of Science, Mathematics and Technology (CSMT). It is responsible for overseeing, coordinating and evaluating unit assessment policies and procedures, including reviewing procedures and practices for managing student complaints. Our unit has two complaint policies. With each, complaints are addressed first by either the departmental chairs or the CoE associate dean, and then by the CoE dean, who maintains confidential files concerning complaints and their resolution. The CoE has a general appeal policy and one specifically for dispositions; the latter also involves an ad hoc committee convened specifically for dispositional concerns. These work in tandem with UTB's general appeals policy. The policies are available in Exhibit 2.3.e.

The CoE Assessment Committee is led by and composed of CoE faculty (who also serve as departmental assessment committee chairs) and relevant resource people. This committee meets at least twice monthly to review and analyze unit, program and candidate assessments, and to strategize ways to improve CoE assessment practices. It also plays a pivotal role in planning semesterly data summits devoted to sharing and further analyzing data from unit and program assessments and then considering program changes in response to those data. The four departmental assessment committees meet as needed to monitor data collection and provide ongoing technical assistance regarding assessment to faculty in their relevant departments.

These committees collaborate with faculty and the professional community to regularly evaluate the capacity and effectiveness of the assessment system.

Much of our work to ensure fairness and reduce bias is done through building a culture of assessment that meets student, program and unit needs, guided by the American Association for Higher Education-sponsored publication Principles of Good Practice for Assessing Student Learning (1991). Toward that end, the unit works to make course expectations clear through syllabi that state outcomes and include rubrics to measure them. Upon entrance into a program, students are introduced to program expectations and our unit dispositions, and are assessed multiple times by multiple raters over the course of the program. Unit expectations and the processes for academic and non-academic appeals are available in the CoE Student Handbook, Teacher Candidate Handbook, CoE Doctoral Handbook, Principalship Handbook and Counseling Handbook. Additionally, the CoE has student grievance procedures specific to teacher education and dispositional concerns (please see Exhibit 2.3.e).

Each assessment committee, along with OIED, plays a pivotal role in monitoring the quality of assessments in terms of fairness, accuracy, consistency and bias. The UAC is charged with regularly reviewing assessment policies and practices for fairness, accuracy and bias, as well as the impact of our unit assessments on our diverse pool of candidates for teaching and other educational professions. The CoE assessment committee provides ongoing examination and feedback regarding the accuracy of rubrics and the fairness of raters, along with the data generated through these instruments and evaluations. The committee reviews proposed assessment changes and offers recommendations for further modifications that might improve fairness and eliminate bias, while also evaluating instruments in terms of their accuracy, validity and utility.

Beyond these committees, it is a growing unit-wide expectation that the CoE leadership team and program faculty meet regularly to discuss key assessments, evaluate that work, and develop and eventually conduct research about the fairness, validity and reliability of program assessments.

2.1.c Use of Data for Program Improvement Summarize processes, timelines, activities, and outcomes derived from use of data for program improvement of candidate performance, program quality, and unit operations.

Data-Driven Processes and Activities Aimed at Continuous Improvement

Our unit regularly and systematically uses data to improve the effectiveness of its programs and unit operations and to ensure that we are generating highly skilled professionals poised to help all students learn. The figure below provides a summary of our unit’s continuous improvement process through which—guided by our conceptual framework—we assess, evaluate and improve programs and unit operations as well as the unit assessment system itself.

Our process for continuous improvement is guided by our conceptual framework, which in turn is reciprocally influenced by the process of continuous improvement itself. Through the process of program review guided by SPAs, we currently use results from candidate assessments to evaluate and make program and unit improvements on a semesterly basis, while simultaneously using such results to evaluate and modify individual assessments at the unit and program level and evaluate and adjust the assessment system as a whole.

Our unit assessment system has grown significantly more systematic with the adoption of Tk20 in fall 2011. Housed in OIED, this electronic data-management system now includes a broad array of formative and summative program assessments linked with professional, state and unit standards. A unit priority is to increase the functionality of Tk20 and build the capacity of students, faculty, staff and the professional community to use it to reflect on performance, evaluate programs, make improvements and generate studies that explore the effects of this change. Tk20 training for faculty and staff is offered each semester, while ongoing technical assistance is offered to faculty, staff, students and relevant professional community partners by OIED staff as well as through Departmental Assessment Committees.

To begin better utilizing Tk20 for shared, data-guided decision making and to provide opportunities for sharing data across programs and departments, we began holding data summits at the start of each semester. In fall 2012, our first data summit, guided by external consultant Dean Charles Love, focused on dispositional data and resulted in the development and implementation of our Professional Dispositions Survey. Subsequent summits held in spring 2013 and fall 2013 focused on external assessments such as TExES exam scores and on internal assessments regarding field and clinical experiences. These required data summits draw on internally generated Tk20 reports along with additional external data, and provide all faculty regular opportunities to review data and develop data-based plans for improvement. Summit participants are encouraged to complete Use of Assessment Data Forms in Tk20 and to schedule additional program-level meetings (two per semester) to continue their search for stronger relationships between assessment data and performance and to refine improvement plans based on these data.

Examples of Data-driven Improvements

As our assessment system becomes more comprehensive, improvements and outcomes derived from the use of data to improve candidate performance, program quality and unit operations are becoming more evident. Specific examples of program changes made in response to data are available in the SPA reports in AIMS and in Exhibit 2.3.g, which documents program-level changes drawn from SPA reports and from internal reviews of programs without SPA standards. Analysis of external and internal data has yielded other important unit improvements as well.

TExES Pass-Rate Increase

In spring 2011, trend data concerning completer pass rates of TExES content and EC-12 PPR exams, combined with changes in state reporting processes, did not bode well for our completer pass rate. During the summer of 2011, the Office of Teacher Preparation and Accountability worked with OIED to develop a new curricular trajectory that would require initial candidates to pass exams before student teaching, and ensure that candidates were fully prepared to have a positive effect on student learning during their clinical teaching experience. During fall 2011 this new trajectory was presented to faculty and advisory panels for discussion. It was implemented in fall 2012. Our pass rate is steadily on the rise as a result. Please see Exhibit 2.3.g for a full diagram of this change as well as a description of other data-driven program changes.

Content Pedagogy Knowledge and Skills

Although our pass rate is steadily rising, TExES content tests still pose a real challenge for many of our students. In response, we drew from research on best practices in teacher education that demonstrates a positive relationship between instruction in content-specific pedagogy and candidates' level of content knowledge as well as their ability to integrate this knowledge into effective instruction. We created two new field-based content pedagogy courses taught by qualified faculty and instructors. Now all initial candidates are required to take field-based courses in content pedagogy as well as general pedagogy courses. Elementary candidates take content-specific pedagogy courses in Math and Science (EDCI 3314) and Social Studies and Language Arts (EDCI 4327). Secondary candidates take a field-based secondary content pedagogy course (EDCI 4328), which is sectioned to allow students to focus on implementing and assessing content pedagogy in their area of certification. This change also included creating a position for, and recruiting and hiring, a new tenure-track assistant professor of C&I specializing in Social Studies Education.

Impact on Student Learning

Student-teaching evaluation data have consistently demonstrated a high level of candidate proficiency in knowledge, skills and dispositions across the domains of planning and preparation; nurturing a positive classroom environment; designing, implementing and assessing instruction; and professional responsibilities. Yet these evaluations do not provide an adequate assessment of our candidates' impact on student learning during clinical teaching. In response to this dearth of data, we piloted the Teacher Work Sample (TWS) in fall 2011 with two cadres of student teachers. The TWS is a performance-based assessment through which teacher candidates demonstrate proficiencies related to planning and implementing standards-based instruction while assessing and reflecting on their impact on student learning during clinical teaching.

Completion of the TWS was made a required part of clinical teaching in fall 2012. Students receive TWS training as part of their student-teaching orientation session. Additional training, coordinated by the Office of Field and Clinical Experience, coaches students through specific aspects of the TWS and occurs throughout the semester of clinical teaching. University supervisors also provide ongoing TWS assistance. CoE faculty, university supervisors, school-based mentors and other professional partners serve as raters. All raters receive general TWS training and specialized rater calibration aimed at bias amelioration prior to each scoring session, a process coordinated by the CoE assessment committee.

Results from the TWS administered in 2012 indicated consistent weaknesses related to assessment of student learning and data analysis, as well as inconsistent performance in the contextual factors, design for instruction and reflection sections. As a result, an abbreviated TWS (ATWS) assignment was first integrated into a required field-based course. Starting in fall 2013, the ATWS is required as part of the content pedagogy courses. The ATWS also serves as a formative, unit-based assessment used to evaluate candidate knowledge and skills related to planning, implementing and assessing effective instruction prior to clinical teaching. This new unit-wide emphasis on a consistent approach to designing and differentiating instruction, as measured in the ATWS, better prepares candidates to have a positive impact on student learning during their clinical teaching and beyond.

Interrelatedness

One of the four guiding principles of our conceptual framework is interrelatedness. As we began the process of external program review, it became clear that while our program assessments were strong, our curriculum needed connective tissue across programs and across the undergraduate, master's and doctoral levels in order to sculpt a set of unit-wide assessments that provide evidence of common knowledge, skills and dispositions among programs as well as within them. In response to these results, and guided by our conceptual framework, in spring 2012 a faculty-led curriculum committee developed a set of foundational courses at the initial and advanced levels, focusing on student learning and cognition, intercultural foundations and inquiry.

Core Courses

  • Initial Level:
    • Sociocultural Foundations: EDFR 2301 Intercultural Contexts of Schooling
    • Student Learning: EPSY 4322 Human Development and Student Learning
    • Research: Clinical Teaching
  • Master's:
    • Sociocultural Foundations: EDFR 6388 Sociocultural Foundations of Education / Couns. 6364 Multicultural Counseling
    • Student Learning: EPSY 6304 Human Development and Student Learning
    • Research: EDFR 6300 Foundations of Research in Education
  • Doctoral:
    • Sociocultural Foundations: EDFR 8322 Advanced Sociocultural Foundations for Education
    • Student Learning: EPSY 8318 Advanced Applications of Human Development and Cognition
    • Research: EDFR 8300 Research Methods in Education

These courses were added to the programs of study in fall 2012. This set of courses serves as a sort of curricular commons where CoE students from across programs interact and exchange disciplinary expertise while gaining knowledge, skills and dispositions in key foundational areas central to being highly skilled professionals in the field of education. Faculty teaching each of these courses are working with the CoE assessment committee to develop and/or modify common assessments across sections that might provide unit-level assessment data linked across programs and levels.

2.2 Moving Toward Target or Continuous Improvement

2.2.b Continuous Improvement Discuss plans for sustaining and enhancing performance through continuous improvement as articulated in this standard.

At the heart of our assessment system is rigorous program review. All of our programs that prepare teachers and other educational professionals, and nearly every faculty member who is part of them, have been directly engaged in a rigorous, data-driven study of their strengths and challenges related to candidate performance. Assessment is now part of the rhythm of professional life for our unit. As the capacity and effectiveness of our assessment system mature, we have identified several key areas for sustaining continuous improvement and enhancing performance.

Sustaining and Expanding Tk20

Tk20 has been central to our increasingly systematic collection, compilation, aggregation and summarization of data, and has greatly improved the unit's ability to analyze and use data to improve candidate performance, program quality and unit operations. So far we have mostly utilized Tk20 to manage data related to the assessments and scoring guides that form the foundation of NCATE's process of program review. Additionally, we will be seeking ways to connect data from our assessment system with state data and/or our school partners' performance data in order to better evaluate the effectiveness of the highly skilled teachers and other professionals we generate and thus improve outcomes. Finally, Tk20 has functionalities that pertain to advisement and to field and clinical experiences. Our Office of Field and Clinical Experience and student-teaching staff, as well as some advisers, have begun training in these functionalities. By expanding our use of Tk20 to include its broad range of functionalities, we increase the coherence and utility of our assessment system as a whole while increasing the accessibility of relevant data for students, school partners and the broader professional community with whom we collaborate.

Expanding Procedures to Ensure Fairness, Consistency, Accuracy and Avoidance of Bias

One of the keys to strengthening the relationship between performance assessments and candidate success in our programs, and later in professional contexts, is the regular evaluation and modification of our key assessments. Our assessment committee structure (unit, CoE, departmental) lends itself to such evaluation. Since all but three of our programs have been through SPA review, all but two have been fully recognized or recognized with conditions, and we adhere to basic principles of quality assessment, our assessment committees have proceeded with measured confidence in the relationship between our assessments and performance. Going forward, however, these committees are continually searching for stronger relationships as we evaluate our assessments and the system as a whole. Most immediately, we are expanding efforts to ensure the fairness and accuracy of our capstone assessments. In spring 2014 the CoE assessment committee plans to launch inter-rater reliability studies of TWS data and of data related to comprehensive exams and portfolios at the advanced level. Additional ways of continuously improving our assessment system that we are exploring include:

  • Developing guidelines for constructing rubrics and other new, unit-wide tools for use in the evaluation of key assessments.
  • Collaborating with partner schools and other professional communities to develop a series of mutual professional development opportunities designed to improve assessment reliability, validity, fairness and utility.
  • Exploring the feasibility of conducting collaborative research with peer institutions to evaluate and improve assessment practices and the strength of their relationship to outcomes.

Utilizing Data to Improve and Innovate

Program review data from all of our programs demonstrate that we regularly assess candidates' content knowledge, planning, skills demonstrated during clinical practice and their impact on student learning. Important outcomes of the clinical practice of other school professionals are also regularly assessed. Similarly, program reviews provide data demonstrating that all of our programs have used data to inform decisions regarding curriculum, instruction, faculty assignments and candidate performance. Less prevalent are data regarding the way our new disposition assessment has led to program changes; this is an area all assessment committee members agree needs further study. Likewise, while we have used data from program assessments measuring what students know and can do to make program decisions, we have not yet had the opportunity to take a comprehensive look at the impact of these changes and identify unintended consequences. This process is central to our continuous improvement. As our assessment system matures, studying unintended as well as intended consequences will become a twin feature of program modification and innovation.

Standard 2 Committee: Assessment
Council Member & Chair:  Dr. James Telese
Co-Chair:  Dr. Alma Rodriguez 
Dr. Laura Jewett
Dr. Zelma Mata
Dr. Christopher Ledingham
Dr. Rene Corbeil
Dr. Carmen Garcia-Caceres
Dr. Lionel “Javier” Cavazos
Ms. Patricia Ramirez
Mr. Luis Machuca
Mr. Hector Castillo

