Institutional Report Addendum

Standard 1 Responses to Areas of Concern and Validation Questions

Area of Concern 1: There is limited evidence that candidate professional dispositions are assessed and data presented at transition points in the initial and advanced programs.

Rationale: Dispositions data presented are limited to the perceptions of candidates with a lack of assessment of candidate dispositions related to their work with students, families, colleagues, and communities.

The unit collects disposition data on initial and advanced candidates related to their work with students, families, colleagues, and communities throughout their matriculation in their respective programs.

Initial Programs
While the unit requires candidates to assess their dispositions related to interacting and working with diverse groups of people, as evidenced by disposition survey data (Exhibit 1.1 Disposition Survey Data) and teacher education interview data (Exhibit 1.2 Teacher Education Interview Data), the unit also collects data that extend beyond candidate perceptions of their abilities. Once initial candidates enter the teacher education program, faculty rely on P-12 cooperating teachers’ and university supervisors’ ratings of candidates’ abilities to interact with diverse P-12 students and to work collaboratively with families during field experiences (Exhibit 1.3 Cooperating Teacher and University Supervisor Field Experience Evaluation). The unit also utilizes P-12 cooperating teachers’ and university supervisors’ ratings of candidates’ abilities to interact with diverse groups during the student teaching experience (Exhibit 1.4 Clinical Practice Evaluations). Cooperating teachers and university supervisors rate candidates’ abilities in these areas three times during the student teaching experience to assess growth. Faculty assess initial candidates’ abilities to integrate various points of view through the instructional design and implementation process by examining candidates’ performance on Electronic Evidence 3 Content Area Instructional Unit Plan (Exhibit 1.5 Electronic Evidence 3 Scores) and Electronic Evidence 5 Differentiated Instruction Teacher Work Sample (Exhibit 1.6 Electronic Evidence 5 Scores). Faculty compare these data with candidates’ responses on an exit disposition survey (Exhibit 1.7 Exit Disposition Survey Data). In addition, candidates learn to work cooperatively with specialists and other school personnel; for example, case studies are used to explore topics such as “navigating the difficult colleague” and “defusing the irate parent.” Seminar topics vary based upon data analyzed from employers, cooperating teachers, and university supervisors.

Advanced Programs
The unit assesses dispositions related to diverse students, families, and communities of candidates in the Master of Arts in Teaching (MAT) programs similarly to candidates in the initial undergraduate program. Faculty utilize P-12 cooperating teachers’ and university supervisors’ ratings of candidates’ abilities to interact with diverse P-12 students and to work collaboratively with families during their early field experience (Exhibit 1.8 MAT Field Experience Evaluation) and clinical practice (Exhibit 1.9 MAT Clinical Practice Evaluation). Cooperating teachers and university supervisors rate candidates’ abilities in these areas three times during the student teaching experience to assess growth. Faculty assess these candidates’ abilities to integrate elements of their students’ diversity within their instructional design and implementation by examining candidates’ performance on Electronic Evidence 3 Content Area Instructional Unit Plan (Exhibit 1.10 MAT Electronic Evidence 3 Scores) and Electronic Evidence 5 Differentiated Instruction Teacher Work Sample (Exhibit 1.11 MAT Electronic Evidence 5 Differentiated Instruction and Accommodations).

Faculty assess advanced-level candidate dispositions from program admission to program completion. At the admission point, faculty assess advanced candidates by evaluating their reference letters and by measuring candidate dispositions during oral interviews via indicators that focus on their beliefs about student learning, about assessing student learning using multiple measures, and about interacting with parents and communities. Once admitted, faculty in each advanced program assess candidate dispositions using multiple indicators. The Joint Master of Social Work (JMSW) program utilizes candidate field ratings relative to interactions with diverse students, families, and communities (Exhibit 1.12 JMSW Candidate Disposition Field Ratings). These data provide faculty with multiple data points to evaluate candidate effectiveness with diverse groups of people. In Agricultural Education, faculty assess advanced candidate dispositions related to diverse students, families, colleagues, and communities through their Electronic Evidences (Exhibit 1.13 Promoting Affective Educational Environments, Exhibit 1.14 Professional Communication and Collaboration, Exhibit 1.15 Professional Learning Community, Exhibit 1.16 AGED Diversity Management Plan Families and Communities). In the Masters in School Administration (MSA) program, candidate dispositions are assessed via electronic evidences and during the internship, which includes student internship work plans as well as feedback and evaluations from mentoring principals. A Direct Response Folio (DRF) in Taskstream assesses candidate dispositions at Transition Points 1, 2, and 4 (Exhibit 1.17 Sample MSA DRF Screenshot). Faculty assess candidates’ dispositions at the end of each semester, and feedback from this instrument is provided to candidates (Exhibit 1.18 Disposition Instrument and Sample Feedback).

Area of Concern 2: Limited data related to P-12 student learning for candidates and other school professionals.

Rationale: Unit data provided related to P-12 student learning are based on ratings of performance indicators with a lack of evidence provided that candidates assess and analyze student learning and make data-driven decisions about strategies for teaching and learning so that all students learn.

Electronic Evidences 3 and 5 demonstrate how candidates utilize their student data to inform and drive their instruction. Electronic Evidence 3 Content Area Instructional Unit Work Sample requires the candidate to implement a unit plan and then gather data on student success. Electronic Evidence 5 Differentiated Instruction Teacher Work Sample requires the candidate to analyze the data from Evidence 3 and modify the instructional plan based upon these data. Finally, the candidate administers the post-test as a means of evaluating impact on student learning (Exhibit 1.19 MAT Technology Education EE 5 Differentiated Work Sample Scores, Exhibit 1.20 BS Elementary Education EE 5 Differentiated Work Sample Scores). Candidate evidences for advanced programs are also available, as presented in MAED Reading Electronic Evidence 1 and Electronic Evidence 2 (Exhibit 1.21 MAED Reading Electronic Evidence 1 Classroom Based Entry Document and Rubric).
 
Question 1. Are data from employer and alumni surveys utilized to assess that candidates possess the knowledge, skills, and dispositions to help all students learn, and where are these data aggregated, disaggregated, and explained?
Data from employer and alumni surveys are utilized to assess that candidates possess the knowledge, skills, and dispositions to help all students learn. Aggregate data are compiled and produced in annual reports (Exhibit 1.22 IHE Reports). Aggregate graduate performance data from the North Carolina Department of Public Instruction and UNC General Administration are shared with Deans and during statewide meetings (Exhibit 1.23 UNC Deans’ Council Agenda). This information is shared and discussed at the campus level with the unit administrative team during unit faculty meetings (Exhibit 1.24 Faculty Meeting Minutes) and Teacher Education Council meetings. Disaggregated data are shared with the Teacher Education Council during annual Data Institutes (Exhibit 1.25 Data Institute Minutes 08.20.13) and regularly scheduled Council meetings (Exhibit 1.26 TEC Meeting Minutes 02.19.13 and 10.15.13). Alumni and employer survey data are also regularly discussed and explained during unit assessment committee meetings (Exhibit 1.27 SOE Assessment Committee Meeting Minutes). Faculty use these data to alter curricula, course content, student teaching seminar topics, and program requirements.

Question 2. Does the state require qualifying scores for both the Praxis II content exam and Principles of Learning and Teaching? If not, how are candidates assessed on pedagogy and professional knowledge and skills?
As of fall 2014, the state of North Carolina requires qualifying scores for both the Praxis II content examination and the Principles of Learning and Teaching. Between 2010 and 2014, the state did not require qualifying scores on these examinations for any content area except Elementary Education and Special Education. Although these examinations were not required during this time period, the unit maintained the Praxis II content knowledge exam as one measure of candidate content knowledge and professional knowledge (Exhibit 1.28 Required North Carolina Licensure Examinations). Candidates were not required to attain a qualifying score to receive a recommendation for licensure; only Elementary Education and Special Education candidates were required to achieve qualifying scores for licensure recommendations. These data were also integrated into our assessment system. The unit juxtaposed these data with candidate performance on Electronic Evidence 2 Content Knowledge, Electronic Evidence 3 Content Area Instructional Unit Work Sample (Exhibit 1.29 Electronic Evidence 3 Directions), Electronic Evidence 5 Differentiated Instruction Teacher Work Sample (Exhibit 1.30 Electronic Evidence 5 Directions), and Clinical Practice Observations (Exhibit 1.4 Clinical Practice Performance Form Evaluation) to assess content knowledge, content pedagogical knowledge, and professional knowledge.

Question 3. Exhibits 1.4.d.26, 1.4.d.29, 1.4.f.2, 1.4.f.3, 2.4.b.1, 2.4.b.2, 2.4.b.5 could not be located.
1.4.d.26 located – Mean scores of advanced program observation rubric data indicators
1.4.d.29 located – TEC Minutes from April 2014 meeting with vote to change EE 5
1.4.f.2 located – Mean Philosophy of Education Score by Initial Program (linked on website)
1.4.f.3 located – MAT Interview Dispositions at Admissions and Graduation
2.4.b.1 located – Mean Candidate Cumulative Benchmark GPA by Advanced Program
2.4.b.2 located – Mean Candidate GRE Scores by Advanced Program
2.4.b.5 – Evidence wrongly identified; this evidence is actually 2.4.b 2013-14 Interview Admission Scores

Question 4. How do program specific comprehensive exams at the advanced level assess candidates' pedagogical skills?
The specialty area advanced program comprehensive examination is one metric the unit uses to assess advanced candidates’ content pedagogical skills. Not all advanced programs utilize a comprehensive examination (Exhibit 1.31 Advanced Programs Comprehensive Exam Matrix); however, those programs that do assess candidate pedagogical skills by requiring candidates to respond to questions that address instructional design, classroom-based scenarios, and program planning and evaluation (Exhibit 1.32 Sample Advanced Program Comprehensive Examinations).

Question 5. How are candidate assessment data regularly and systematically collected, compiled, aggregated, summarized, and analyzed specifically related to those programs identified by the unit as other school professionals?
Candidate assessment data are regularly and systematically collected, compiled, aggregated, summarized, and analyzed for all degree programs, including those that serve other school professionals. At the university level, the Office of Strategic Planning and Institutional Effectiveness requires all degree programs to submit annual Institutional Effectiveness reports with data related to critical thinking, communication skills, content knowledge, professional skills, and diversity (Exhibit 1.33 Institutional Effectiveness Reports for Other School Professional Programs). The unit utilizes these data to examine issues related to continuous improvement, where “closing the loop” is a critical focus. Program coordinators for other school professional programs are formally appointed to the Teacher Education Council (TEC). The unit also relies on its assessment system to collect, compile, aggregate, summarize, and analyze data for all degree programs, including those that serve other school professionals. These data are reported in the unit’s annual reports (Exhibit 1.34 SOE Annual Reports) and Title II reports (Exhibit 1.22 IHE Reports). These data are shared and discussed regularly during Teacher Education Council meetings, School of Education faculty meetings, and the Teacher Education Council Data Institute (Exhibit 1.25 Data Institute Minutes 08.20.13). Each of these programs is included in the unit’s new tracking system.

Question 6. How are data from field experiences prior to use of the Clinical Practice Performance Form collected, analyzed, and utilized to assess candidates' knowledge, skills, and dispositions?
Prior to the clinical practice experience, the unit collects and analyzes data on candidate knowledge, skills, and dispositions during field experiences using the Intern Performance Evaluation in Taskstream. Cooperating teachers and university supervisors use this evaluation document to assess candidate performance throughout their field experience activities. Field experiences are connected to the professional core and specialty area courses and are categorized by Professional Learning Communities (PLCs) I, II, III, and IV (Exhibit 1.35 Field Experience Sequence Chart). Field experiences in PLCs I and II are connected to the freshman and sophomore professional education courses and are considered Emerging-Developing Phase courses. Field experiences in PLCs III and IV, which are junior and senior level professional and specialty area courses, move the candidate to Proficient-Accomplished Phase skills.

The unit also uses Taskstream to collect data for clinical experiences. Currently, candidates complete the early field experience application, analyses and reflections, the cooperating teacher information form, and candidate responses to the cooperating teacher(s) evaluation in Taskstream (Exhibit 1.36 Field Experience Evaluation).

Feedback from the cooperating teacher is provided to the course instructor and the candidate, and the candidate is required to reflect upon the cooperating teacher’s evaluation. Lastly, the course instructor and candidate reflect upon the evaluation and develop, as needed, intervention plans to ensure future professional growth. Department chairpersons, program coordinators, academic advisors, and course instructors can view candidates’ performance in Taskstream to make decisions regarding advisement, program recommendations, and curriculum. Early field experience performance data are reviewed and shared by the director of Field Based and Clinical Experiences at Teacher Education Council meetings and the annual Teacher Education Data Institute.

Question 7. Are the Praxis II exam scores in exhibit 1.4.d.2 from a Content Area Exam or the Principles of Learning and Teaching Exam?
The scores in Exhibit 1.4.d.2 are from neither examination; they are Praxis I scores of candidates prior to their admission into the teacher education program. This table indicates that candidates in their respective academic majors exceeded, on average, the required cumulative score of 522 established by the State Board of Education. Thus, the unit admits students into its program who have the requisite basic knowledge and skills in mathematics, reading, and writing as assessed by this examination. When these scores are juxtaposed with the average cumulative grade point average at the point of admission for candidates in the initial program, faculty and administrators are confident that high-quality students are admitted into the program.
