
The Department of Education and Teacher Preparation
Quality Assurance System for Continuous Improvement Initiatives

The College of Coastal Georgia (CCGA) Department of Education and Teacher Preparation employs an evidence-based quality assurance system to evaluate program effectiveness, with emphasis on the impact of candidates on P-12 student learning. Through systematic data collection and ongoing program review, the Department of Education and Teacher Preparation (DETP) provides the context for continuous improvement to occur. The quality assurance system is rooted in a growth mindset philosophy: while faculty recognize that candidates bring assets to the teacher education program, candidates are expected to demonstrate growth across their respective programs. Candidates are evaluated using multiple measures with consideration of departmental teacher education goals and Interstate Teacher Assessment and Support Consortium (InTASC) standards. The revised quality assurance system was developed through a collaborative initiative involving input from CCGA Arts and Sciences faculty and other relevant P-12 partners and stakeholders.

Beginning in the 2017-2018 academic year, DETP faculty began incorporating revised standards for both InTASC and the Council for the Accreditation of Educator Preparation (CAEP) across all initial teacher preparation programs. The newly revised departmental vision emphasizes the fundamental notion that meaningful learning for teacher candidates occurs only in the context of authentic classroom settings with diverse P-12 learners. DETP faculty support candidate development related to A) the learner and learning, B) content, C) instructional practice, and D) professional responsibility. As an initial step in this revision process, DETP faculty conducted a collaborative thematic analysis exercise using accrediting agency expectations and standards, resulting in the development of key departmental goals that will serve as the driving force for future continuous improvement initiatives. The revised departmental goals are included in Table 1.1 below.

Table 1.1
1. Teacher candidates will demonstrate acceptable levels of content knowledge expertise.
2. Teacher candidates will respond to diversity as it relates to student learning and the decision-making process for effective teaching.
3. Teacher candidates will apply relevant research/theory and developmentally appropriate practice for planning, instruction, and assessment of P-12 learners.
4. Teacher candidates will employ evidence-based practices with increasing levels of proficiency.
5. Teacher candidates will invest in opportunities to develop individual beliefs and values that will positively impact P-12 student learning.
6. Teacher candidates will analyze their own teaching practices to monitor their professional growth.

The DETP evidence-based quality assurance system is grounded in the fundamental concepts of the learner and learning, content, instructional practice, and professional responsibility and is responsive to appropriate state and national standards. As a part of the continuous improvement initiatives of the department, the overall vision will be regularly reviewed and revised (as needed) to ensure that it is current, relevant, and clearly aligned to present-day standards driving the field of teacher education. This review and revision process will occur within the context of the data analysis meetings described below. With an emphasis on the four fundamental concepts previously described, the DETP employs a comprehensive and integrated assessment system. The purpose of this system is twofold: a) to document and monitor candidate proficiency development at key transition points in their respective programs and b) to conduct systematic and purposeful continuous improvement initiatives that are sustained and evidence-based.

Continuous Review Schedule for Faculty and Relevant Stakeholders

A systematic review of relevant data, across multiple measures, is conducted annually with both program faculty and community and P-12 stakeholders. For DETP faculty, this review occurs in October during the monthly formal faculty meeting. For relevant stakeholders, this review of data, with opportunities for stakeholder input to be contributed, occurs in the context of the annual Teacher Education Advisory Board (TEAB) meeting. Membership on the TEAB is determined based on eligibility as defined by local school system administrators. Representation on the board may include, but is not limited to, administrators, teachers, instructional coaches, other support staff, and community stakeholders who express an interest in the DETP. For TEAB input opportunities, one meeting is held during the spring semester. Ongoing informal feedback opportunities are provided throughout the year through scheduled meetings with partnering administrators and mentor teachers. These meetings are documented on the shared DETP Partnership Meetings document. For DETP faculty review meetings, program coordinators provide a summary of data collected across key assessments within the program. For TEAB meetings, the department chair is responsible for presenting relevant program data (both aggregated and disaggregated, as needed) in a way that provides ready access to external reviewers within the community to create optimal opportunities for input. As the revised quality assurance system is piloted during the 2017-2018 academic year, with the goal of full implementation in 2018-2019, participants will dialogue, using the framework below as a guide, to address the major questions that need to be asked annually as indicated in the CAEP standards.

Quality Assurance and Continuous Improvement Framework for Analysis

1. How are we recruiting and supporting completion of high-quality candidates from a broad range of backgrounds and diverse populations? (CAEP Standard 3)

2. How are we addressing community, state, national, regional, or local needs for hard-to-staff schools and shortage fields? (CAEP Standard 3)

3. How are we effectively monitoring candidate progress from admissions through completion across programs? (CAEP Standard 3 and 5)

4. How do we ensure that our programs are effectively preparing candidates with the knowledge, skills, and dispositions necessary to address the learner and learning, content, instructional practice, and professional responsibility? (CAEP Standard 1 and 5)

5. How confident are we that our key assessments across programs are valid and reliable measures? (CAEP Standard 4 and 5)

6. How satisfied are employers of program completers that graduates are prepared for their assigned responsibilities in working with P-12 students? (CAEP Standard 4)

7. How well are we preparing candidates to have a positive impact on P-12 student learning? (CAEP Standard 2 and 4)

8. How are we ensuring that clinical partnerships, field experiences, and candidate expectations are co-constructed with and mutually beneficial to our P-12 school and community stakeholders? (CAEP Standard 2)

9. How well are we preparing, evaluating, supporting, and retaining high-quality clinical educators in the P-12 school setting and within the institution? (CAEP Standard 2)

10. What program and/or unit revisions are needed or being implemented to ensure that evidence-based continuous improvement is ongoing? (CAEP Standard 4 and 5)

As a mechanism of the quality assurance system, DETP faculty integrate the disaggregated data collected, using the previously described data analysis framework, within the context of the CCGA annual academic program review report for institutional effectiveness (see Appendix J for template). This mechanism provides the opportunity for faculty to systematically reflect on how data inform the decision-making process through responses addressing the following: A) mission statement, B) program-specific course outcomes, C) methods for measurement, D) success criteria, E) discussion of findings, F) analysis/evaluation of findings, and G) implications and use of data for program improvement. As a mechanism for EPP-level review, DETP faculty examine aggregate data related to candidates, programs, and other unit operations via an annual unit report required by CCGA through the institutional effectiveness office. Within this process, DETP faculty identify expected outcomes consistent with the mission of the unit and aligned with CCGA’s strategic plan. Looking at the goals from the previous year and the strategies used for their attainment, DETP faculty focus on particular assessments to determine whether goals were achieved and/or led to improvement, the impact of the improvement activity on the intended outcome, and future implications based on the results of the improvement activity. Using these mechanisms, data are collected and monitored annually at the program/department and EPP levels, using VIA Livetext, Microsoft Excel, and Microsoft Word to store, manage, and generate reports for the purpose of analysis. DETP faculty, in conjunction with community and P-12 stakeholders, conduct annual program and unit assessment reviews each October and November, using data reports generated in VIA Livetext and the completed academic program and annual unit report templates. CCGA administrators, faculty, and stakeholders review the finalized reports.

The DETP faculty use candidate and program data to measure the progress of individual candidates throughout the program and then use aggregated candidate data to determine the effectiveness of the program and EPP. Within the unit assessment system, six key assessments are embedded across every program. Assessments used to evaluate candidates across their respective programs include: 1) Candidate Assessment on Performance Standards (CAPS) (see Appendix A), 2) Educator Disposition Assessment (see Appendix B), 3) Impact on Student Learning Rubric (see Appendix C), 4) Georgia Assessments for the Certification of Educators (GACE), 5) edTPA, and 6) Intern KEYS (see Appendix D). Assessments conducted using a rubric are entered into the VIA Livetext platform and reviewed by both candidates at an individual level and program faculty at a collective level. Reports generated from VIA Livetext focus on specific rubric elements tied to InTASC standards, overall rubric scores, and program-specific data for candidates. Using reporting features that allow for dynamic and highly specific representations of data, DETP faculty are able to analyze data from multiple perspectives. As previously described, when these data are collected and shared with faculty, a systematic program review is conducted during the annual review meeting to highlight areas of excellence and to indicate areas to adjust with the goal of continuous improvement.

Table 2.1 documents the internal and external EPP-level and program-level assessment mechanisms that are currently being used by the DETP. Assessments indicated as internal measures are assessed by CCGA faculty, supervisors, and candidates. Assessments indicated as external measures include assessments conducted at the state and national level as well as assessments completed by P-12 stakeholders, CCGA graduates, and employers.

Table 2.1

Internal Measures (assessments completed by CCGA faculty, supervisors, and candidates)

Program-Level Assessments:
Program-specific requirements at program admission and program completion transition points
Impact on Student Learning Rubric: evaluation of content-specific edTPA portfolios, following official submission to Pearson, to document impact on P-12 student learning
Educator Disposition Assessment: disaggregated data within programs

EPP-Level Assessments:
Common requirements at transition points of program admission, in-progress during the program, program completion, and follow-up to graduation
CAPS Observation Instrument: completed by supervisor and candidate for self-assessment purposes
DETP End of Program Survey: completed by candidates as a part of program completion
Educator Disposition Assessment: aggregate data across programs
Intern KEYS: used during the summative conference for evaluation of evidence presented by the candidate in the conference and via the developed professional growth plan

External Measures (assessments conducted at the state and national level; assessments completed by P-12 stakeholders, CCGA graduates, and employers)

Program-Level Assessments:
GACE Basic Skills
GACE Content Assessments
edTPA
CAPS Observation Instrument: completed by P-12 clinical partners
Educator Disposition Assessment: completed by P-12 clinical partners

EPP-Level Assessments:
State-Provided Impact Data: collected from a sampling of graduates at the close of each academic year via TAPS and TKES
DETP Graduate Follow-Up Survey: collected via a survey sent to graduates one year post-graduation to document the extent to which candidates felt prepared to confront responsibilities of the teaching profession
Employer Satisfaction Survey: collected via a survey sent to employers of graduates one year post-graduation to document the extent to which employers felt that graduates were prepared to confront responsibilities of the teaching profession

Decision-Making at Program Transition Points

Key assessments are integrated within and across programs to monitor candidate progress. DETP faculty identified four transition points that serve as milestones for candidate progress, remediation, or exit. Candidates admitted to the early childhood and special education program, the middle grades education program, or the secondary education program will be assessed at four transition points: (1) program admission, (2) in progress during the program, (3) program completion, and (4) as a follow-up to graduation.

For each initial program, candidates will be vetted using specific admission and exit criteria to document competencies prior to program entry as well as prior to recommendation for graduation and certification. In addition, candidates must demonstrate growth throughout their respective programs to continue each semester. Candidates receive feedback related to five domains developed to align with the Teacher Performance Standards used by the Georgia Department of Education within the Teacher Keys Effectiveness System. Faculty have identified power domains specific to transition points in the program. These power domains require candidates to demonstrate a minimum level of proficiency in order to advance in the program. Through integration of identified power domains, the increasing expectations and levels of complexity across the program are more visible to candidates. Candidates receive formative feedback through multiple assessments including, but not limited to, disposition evaluations, work samples, and the Candidate Assessment on Performance Standards (CAPS). Using input from all of these formative assessments, candidates receive a summative evaluation within the context of a candidate-led conference at the close of each semester using the Intern KEYS.

Program Admission: For admission to any teacher education program, candidates will be required to provide evidence of: (1) a cumulative GPA of 2.5 or higher on all course work, (2) successful completion of all three basic tests on GACE Basic Skills or the exemption equivalent on SAT/ACT, (3) successful completion of the Georgia Educator Ethics Assessment, (4) a grade of C or better in all Area F courses, (5) completion of Areas A-F in the core curriculum, (6) proof of insurance for tort liability, (7) a successful criminal background check (conducted by the GaPSC), and (8) a GaPSC pre-service certificate application and lawful presence form.

In Progress During the Program: Upon entry into the program, candidates are expected to maintain a cumulative GPA of 2.5 across all academic work to meet requirements for graduation. In addition, all course work must be completed with a grade of C or higher. As candidates progress in the program, they are expected to demonstrate growth across the three fundamental areas of knowledge, skills, and dispositions within course work as well as within the context of increasingly complex field experiences. Candidates engage in a cyclical feedback process that occurs within the context of a candidate-led summative conference framework (see Appendix E). During this conference, candidates provide documentation that they have met the performance standards incorporated within the CAPS instrument using evidence from the field, course work, and disposition evaluations through the lens of three categories: A) The Learner and Learning, B) Instructional Practice, and C) Professional Responsibility. Following this presentation by the candidate, faculty provide oral and written feedback that the candidate is expected to incorporate within the written submission of a professional growth plan (see Appendix F). Following submission of the professional growth plan, and with consideration of the summative conference presentation, faculty provide a summative evaluation to the candidate via the Intern KEYS, citing evidence from the candidate-led summative conference and the professional growth plan to indicate whether the candidate falls at the emerging, developing, practicing, or leading level across three elements: A) The Learner and Learning, B) Instructional Practice, and C) Professional Responsibility. Candidates are expected to revisit their professional growth plan as they prepare for their summative conference the subsequent semester to address feedback provided in the Intern KEYS summative evaluation and to self-monitor progress toward the goal set for professional development within the results section of the professional growth plan.

The CAPS instrument is aligned to both InTASC standards and the knowledge, skills, and dispositions specified in Standard 1 of the 2013 CAEP standards. To document growth across field experiences and to hold candidates accountable to increasingly complex and spiraling expectations, college supervisors evaluate candidates using the CAPS instrument. Faculty have identified power domains for each semester that indicate the minimum expectations candidates must meet to proceed in their teacher education program. These power domain criteria are included on the CAPS instrument as aligned to specific programs. For candidates who score below the minimum requirement on any particular power domain, a mechanism is in place to provide remediation and support via the Professional Improvement Plan (PIP) process (see Appendix G). The PIP process clearly indicates how progress monitoring will occur so that a decision can be made for the candidate to continue in the program or to exit the program in the event that the candidate is unable to meet minimum expectations at a given point in the program. Additionally, candidates are expected to receive a grade of “C” or higher within professional coursework to advance in the program.

Previously, the Teacher Candidate Disposition Evaluation (TCDE) instrument was used to document dispositional evidence and growth across the program for candidates. The DETP faculty voted to adopt the Educator Disposition Assessment (EDA), beginning in fall 2018, to ensure that the instrument used to evaluate dispositions is both valid and reliable. As recommended in the technical guide accompanying this instrument, the assessment will be administered at strategic points throughout the program to document growth and determine eligibility for continuation in the program. The EDA will be administered once each semester at the close of field-based experiences, that is, at the close of each practicum experience and at the close of the clinical practice experience. The assessment will be administered by the college supervisor, the mentor teacher, and the practicum instructor. Additionally, the candidate will use the assessment to self-evaluate. The purpose of the EDA is to determine the extent to which candidates hold beliefs and values that influence them to behave in ways that are supportive of student learning in the P-12 classroom setting. Candidates are expected to score at a minimum of Level 1 – Developing across all dispositions at all points throughout the program. In the event that a candidate is at risk of receiving a Level 0 – Needs Improvement, the departmental faculty member responsible for assigning this rating must initiate a departmental alert to notify the candidate of concerns related to the disposition(s). Mentor teachers are responsible for notifying the college supervisor in the event that a candidate is at risk of receiving a Level 0 – Needs Improvement so as to provide the opportunity for the supervisor to initiate a departmental alert on behalf of the mentor teacher. If a candidate receives a Level 0 – Needs Improvement on any disposition in the program, he or she will be placed on a Professional Improvement Plan (PIP) in which specific goals tied to the disposition in question will be established. Following the departmental PIP protocol, candidates will engage in progress monitoring following the initiation of the PIP at a date to be determined by the faculty member initiating the PIP. If goals are unmet at the progress monitoring meeting, the departmental faculty member has the option of granting an extension on the PIP for completion or recommending dismissal from the program. If dismissal from the program is recommended, this recommendation is voted on by all departmental faculty members. Candidates must receive a rating of at least Level 1 – Developing in order to advance in the program. In order to graduate, candidates must score at a Level 2 – Meets Expectations across all dispositions.

The Impact on Student Learning rubric is administered in a designated course taken by all candidates just prior to the start of clinical practice. This assessment is again administered at the close of clinical practice and is applied to the submitted edTPA portfolio. This assessment is administered once per academic semester during the senior year of study for candidates. This assessment is administered by the instructor of program-specific courses for the associated programs at the appropriate point of progression in the program. As an integral component of the teacher education program at CCGA, candidates must be able to develop and effectively implement a learning segment to support P-12 student learning of a clearly defined central focus (i.e., learning goal). A critical part of preparation as an educator is the ability to demonstrate impact on student learning. As a part of the program of study, candidates will be expected to provide evidence of their ability to plan, implement, and evaluate implementation of a learning segment. The Impact on Student Learning rubric provides a mechanism to evaluate these proficiencies. In order to advance to clinical practice and be eligible to take the associated edTPA, candidates must score at a minimum of “Meets Expectation” across the three criteria of: 1) Design of Instruction and Assessment, 2) Analysis of Student Learning, and 3) Reflective Practice included on the Impact on Student Learning rubric. Instructors will indicate the rating of: 1) Does Not Meet Expectation, 2) Meets Expectation, or 3) Exceeds Expectation across all three criteria based upon where the majority of indicators listed under the rating are assigned. If a candidate fails to meet the required criterion prior to the close of the fall semester of senior year, the candidate will be placed on a professional improvement plan (PIP) at the discretion of the course instructor. In order to be eligible for completion of edTPA, the candidate must complete the goals indicated in the PIP. Submission of the edTPA portfolio is a program completion requirement. This assessment mechanism serves as a gatekeeper to ensure readiness of candidates for completion of this summative assessment.

As a summative evaluation, faculty use the Intern KEYS to document the proficiency level of candidates at the close of each semester in the teacher education program across the three critical areas of A) The Learner and Learning, B) Instructional Practice, and C) Professional Responsibility. Candidates are expected to demonstrate growth across semesters with the understanding that all areas will be at the practicing level by graduation. At any point in the program, candidates identified as being in need of remediation or support through any of the mechanisms described above may be placed on a PIP by faculty. This process results in a decision being made by faculty that is in the best interest of the candidate with regard to continuation in the program or exit.

Program Completion: Candidates who meet the minimum expectations set forth at the in-progress transition points described above will complete clinical practice in their final semester of the program. The clinical practice experience consists of a placement in a P-12 public school setting under the mentorship of a highly qualified mentor teacher. Currently, the DETP uses the criteria set forth by the GaPSC to identify highly qualified mentor teachers. This evidence is documented in the VIA Livetext platform. During the twelve-week clinical practice experience, candidates are expected to follow a phase-in/phase-out model whereby they take over particular aspects of instruction, resulting in a six-week period of full-time teaching. Additionally, to complete the program and be recommended for certification, candidates must maintain a minimum cumulative GPA of 2.5, complete all coursework with a grade of C or better, earn an induction-level or higher passing score on the appropriate GACE content assessment, earn a passing score on the Georgia Educator Ethics assessment for program exit, earn a passing score on edTPA, meet the minimum requirements for power domains on the CAPS observation instrument, and earn a rating of at least practicing across all categories on the Intern KEYS summative evaluation. Candidates who satisfy these criteria will graduate and be recommended for induction-level certification to the GaPSC.

Follow-Up to Graduation: Beginning with spring 2018 graduates, faculty will use three mechanisms to assess candidate readiness for the teaching profession. Faculty will use e-mail and the digital VIA Livetext platform to request, manage, and analyze state-provided TAPS and TKES impact data from a sampling of graduates at the close of each academic year. As a part of the DETP End of Program Survey, candidates will be prompted to indicate their willingness to provide this data to the EPP. In the event that a candidate is willing to participate in this data collection process, contact information will be requested so that the EPP may follow up with this data collection via e-mail or telephone. The DETP Graduate Follow-Up Survey will also be sent to CCGA DETP graduates via e-mail one year post-graduation to gather data concerning the extent to which candidates felt prepared to confront responsibilities of the teaching profession. Finally, an Employer Satisfaction Survey will be sent to employers of graduates via e-mail one year post-graduation to document the extent to which employers felt that graduates were prepared to confront responsibilities of the teaching profession. These data will be analyzed extensively by CCGA DETP faculty, using the Quality Assurance and Continuous Improvement Framework for Analysis, during the annual data review meeting for program-specific and EPP-level continuous improvement decision-making.

College of Coastal Georgia
Department of Education and Teacher Preparation
Overview of Assessment System

The following table indicates the assessments used to measure departmental goals for teacher candidates across programs.

CAPS  EDA  Impact on Student Learning Rubric  Intern KEYS  GACE  edTPA

Teacher candidates will demonstrate acceptable levels of content knowledge expertise.
X (Domain 1), X, X

Teacher candidates will respond to diversity as it relates to student learning and the decision-making process for effective teaching.
X (Domain 4), X, X, X

Teacher candidates will apply relevant research/theory and developmentally appropriate practice for planning, instruction, and assessment of P-12 learners.
X (Domain 3), X, X, X

Teacher candidates will employ evidence-based practices with increasing levels of proficiency.
X (Domain 2), X, X

Teacher candidates will invest in opportunities to develop individual beliefs and values that will positively impact P-12 student learning.
X, X, X

Teacher candidates will analyze their own teaching practices to monitor their professional growth.
X (Domain 5), X, X, X, X

1. Candidate Assessment on Performance Standards (CAPS)* Included in Appendix A

What are the points in the program when the assessment is administered? This assessment is administered early (Practicum 1-2), mid (Practicum 3), and late (Clinical Practice) in the program.

How often is it administered?
Practicums 1/2/3: Supervisors evaluate candidates using this observation instrument at least twice per semester.
Clinical Practice: Supervisors evaluate candidates using this observation instrument at least three times per semester including one unannounced observation.

Who administers it at each different point?
Supervisor

What is the purpose of this assessment?
This observation instrument is used to provide feedback to candidates across five domains following a formal lesson observation: 1) Planning, 2) Instructional Delivery, 3) Assessment of and for Learning, 4) Learning Environment, and 5) Professionalism and Communication. For each of the domains, the evaluator will determine the extent to which performance indicators aligned to the domain are met using the rubric included on the instrument. Candidates scoring at a Level I – Below Target meet 20% or less of performance indicators. Candidates scoring at a Level II – Approaches Target meet 21% - 50% of performance indicators. Candidates scoring at a Level III – Meets Target meet 51% - 90% of performance indicators. Candidates scoring at a Level IV – Exceeds Target meet 91% or more of performance indicators.
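To make the rating arithmetic concrete, the following minimal sketch (Python, for illustration only; it is not part of the CAPS instrument) maps the share of performance indicators met in a domain to the rating labels defined above. The handling of fractional percentages at the boundaries and the function name are assumptions introduced for the example.

    # Illustrative sketch only: maps the percentage of performance indicators met
    # to the CAPS rating labels described above. Boundary handling for fractional
    # percentages is an assumption.
    def caps_rating(indicators_met: int, indicators_total: int) -> str:
        pct = 100 * indicators_met / indicators_total
        if pct <= 20:
            return "Level I - Below Target"
        if pct <= 50:
            return "Level II - Approaches Target"
        if pct <= 90:
            return "Level III - Meets Target"
        return "Level IV - Exceeds Target"

    # Example: 7 of 10 indicators met (70%) falls in the 51%-90% band.
    print(caps_rating(7, 10))  # Level III - Meets Target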

How is this assessment used to make decisions about candidates’ progress through the program? What score/rating must a candidate reach in order to pass the assessment? What happens if a candidate does not meet this required criterion?
Criteria for what constitutes satisfactory performance at an indicated point of progression in a program have been determined by faculty across teacher education programs. Candidates must score a minimum of Level II – Approaches Target for each of the domains indicated in the table below in order to advance in the respective program. Note that the domains are spiraling. This means that in each subsequent semester, the candidate must score a minimum of Level II – Approaches Target on the indicated domains for that semester and the domains indicated in previous semesters in order to receive a grade of satisfactory in the associated practicum course and advance in the program. In the unusual circumstance that a candidate scores a Level I – Below Target on only one required domain, a professional improvement plan (PIP) will be initiated at the discretion of the Director of Field Experience, Certification, and Outreach, the practicum supervisor, and the practicum instructor. If a candidate fails to complete the required professional development activities indicated in the PIP, the candidate will be subject to receiving a failing grade in the associated practicum course. In order to receive a grade of satisfactory in Practicum 4 (student teaching) and be able to graduate, candidates must provide evidence that they received ratings of Level III – Meets Target across a minimum of four domains.

Early Childhood and Special Education
Practicum 1: Professionalism and Communication; Learning Environment
Practicum 2: Professionalism and Communication; Learning Environment; Planning; Instructional Delivery
Practicum 3: Level II – Approaches Target required across all domains
Student Teaching: Level III – Meets Target required across a minimum of four domains

Middle Grades Education
Practicum 1/2: Professionalism and Communication; Learning Environment; Planning; Instructional Delivery
Practicum 3: Level II – Approaches Target required across all domains
Student Teaching: Level III – Meets Target required across a minimum of four domains

Secondary Education
Practicum 1/2: Level II – Approaches Target required across all domains
Student Teaching: Level III – Meets Target required across a minimum of four domains
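For illustration only, the short Python sketch below (not a departmental tool) walks through the spiraling power-domain check described above: the domains required in the current semester are combined with those from previous semesters, and each must be rated Level II – Approaches Target or higher. The numeric encoding of rating levels, the program key, and the sequence lists are assumptions introduced for the example.

    # Illustrative sketch of the spiraling power-domain check. Assumptions:
    # ratings are encoded numerically (Level I = 1 ... Level IV = 4), and the
    # program/semester keys below are shorthand for the table above.
    REQUIRED_DOMAINS = {
        ("ECSE", "Practicum 1"): ["Professionalism and Communication", "Learning Environment"],
        ("ECSE", "Practicum 2"): ["Planning", "Instructional Delivery"],
    }

    def cumulative_requirements(program: str, sequence: list[str], current: str) -> list[str]:
        # Collect domains required for the current semester plus all prior semesters.
        required: list[str] = []
        for semester in sequence:
            required += REQUIRED_DOMAINS.get((program, semester), [])
            if semester == current:
                break
        return required

    def meets_power_domains(ratings: dict[str, int], program: str,
                            sequence: list[str], current: str) -> bool:
        # Every cumulatively required domain must be rated Level II (2) or higher.
        return all(ratings.get(d, 0) >= 2
                   for d in cumulative_requirements(program, sequence, current))

    # Example: an ECSE candidate in Practicum 2 must still meet the Practicum 1 domains.
    ratings = {"Professionalism and Communication": 3, "Learning Environment": 2,
               "Planning": 2, "Instructional Delivery": 1}
    print(meets_power_domains(ratings, "ECSE", ["Practicum 1", "Practicum 2"], "Practicum 2"))  # False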

2. Educator Disposition Assessment* Included in Appendix B

What are the points in the program when the assessment is administered?
The Department of Education and Teacher Preparation faculty voted to adopt the Educator Disposition Assessment beginning in fall 2018. As recommended in the technical guide accompanying this instrument, the assessment will be administered at strategic points throughout the program to document growth and determine eligibility for continuation in the program. The EDA will be administered at the close of each practicum experience and at the close of the clinical practice experience.

How often is it administered?
The EDA will be administered once each semester at the close of field-based experiences (practicum and clinical practice).

Who administers it at each different point?
The assessment will be administered by the college supervisor, the mentor teacher, and the practicum instructor. Additionally, the candidate will use the assessment to self-evaluate.

What is the purpose of this assessment?

The purpose of the EDA is to determine the extent to which candidates hold beliefs and values that influence them to behave in ways that are supportive of student learning in the PK-12 classroom setting.

How is this assessment used to make decisions about candidates’ progress through the program? What score/rating must a candidate reach in order to pass this assessment? What happens if a candidate does not meet this required criterion?
Candidates are expected to score at a minimum of Level 1 – Developing across all dispositions at all points throughout the program. In the event that a candidate is at risk of receiving a Level 0 – Needs Improvement, the departmental faculty member responsible for assigning this rating must initiate a departmental alert to notify the candidate of concerns related to the disposition(s). Mentor teachers are responsible for notifying the college supervisor in the event that a candidate is at risk of receiving a Level 0 – Needs Improvement so as to provide the opportunity for the supervisor to initiate a departmental alert on behalf of the mentor teacher. If a candidate receives a Level 0 – Needs Improvement on any disposition in the program, he or she will be placed on a Professional Improvement Plan (PIP) in which specific goals tied to the disposition in question will be established. Following the departmental PIP protocol, candidates will engage in progress monitoring following the initiation of the PIP at a date to be determined by the faculty member initiating the PIP. If goals are unmet at the progress monitoring meeting, the departmental faculty member has the option of granting an extension on the PIP for completion or recommending dismissal from the program. If dismissal from the program is recommended, this recommendation is voted on by all departmental faculty members. Candidates must receive a rating of at least Level 1 – Developing in order to advance in the program. In order to graduate, candidates must score at a Level 2 – Meets Expectations across all dispositions.

3. Impact on Student Learning Rubric* Included in Appendix C

What are the points in the program when the assessment is administered?
The Impact on Student Learning rubric is administered in a designated course taken by all candidates just prior to the start of clinical practice. This assessment is again administered at the close of clinical practice and is applied to the submitted edTPA portfolio.

How often is it administered?
This assessment is administered once per academic semester during the senior year of study for candidates.

Who administers it at each different point?
This assessment is administered by the instructor of the courses indicated below for the associated programs at the appropriate point of progression in the program.

Early Childhood and Special Education: LITR 4010 (Fall – Senior Year); ECSP 4120 (Spring – Senior Year)
Middle Grades and Secondary Education: MSED 3020 (Fall – Senior Year); MSED 4001 (Spring – Senior Year)

What is the purpose of this assessment?
As an integral component of the teacher education program at CCGA, candidates must be able to develop and effectively implement a learning segment to support P-12 student learning of a clearly defined central focus (i.e., learning goal). A critical part of preparation as an educator is the ability to demonstrate impact on student learning. As a part of the program of study, candidates will be expected to provide evidence of their ability to plan, implement, and evaluate implementation of a learning segment. The Impact on Student Learning rubric provides a mechanism to evaluate these proficiencies.

How is this assessment used to make decisions about candidates’ progress through the program? What score/rating must a candidate reach in order to pass this assessment? What happens if a candidate does not meet this required criterion?
In order to advance to clinical practice and be eligible to take the associated edTPA, candidates must score at a minimum of “Meets Expectation” across the three criteria of: 1) Design of Instruction and Assessment, 2) Analysis of Student Learning, and 3) Reflective Practice included on the Impact on Student Learning rubric. Instructors will indicate the rating of: 1) Does Not Meet Expectation, 2) Meets Expectation, or 3) Exceeds Expectation across all three criteria based upon where the majority of indicators listed under the rating are assigned. If a candidate fails to meet the required criterion prior to the close of the fall semester of senior year, the candidate will be placed on a professional improvement plan (PIP) at the discretion of the course instructor. In order to be eligible for completion of edTPA, the candidate must complete the goals indicated in the PIP. Submission of the edTPA portfolio is a program completion requirement. This assessment mechanism serves as a gatekeeper to ensure readiness of candidates for completion of this summative assessment.

4. Intern KEYS* Included in Appendix D

What are the points in the program when the assessment is administered?
The Intern KEYS assessment is administered at the end of each practicum placement for candidates in the program. This assessment is also administered at the end of clinical practice. The assessment is always administered at the close of the candidate-led summative conference.

How often is it administered?
The Intern KEYS is administered once per semester when candidates are in a practicum placement or clinical practice. For Early Childhood and Special Education candidates, this assessment is administered a total of four times during the program. For Middle Grades and Secondary Education candidates, this assessment is administered a total of three times during the program.

Who administers it at each different point?
All departmental faculty are responsible for administering this assessment for candidates whom they teach or supervise. During the candidate-led summative conference, candidates meet with the college supervisor and at least one faculty member to receive initial informal feedback related to the Intern KEYS and to discuss and document a professional growth/induction plan. At the close of the summative conference and following submission of the professional growth/induction plan, the supervisor provides a suggested rating for each criterion included on the Intern KEYS. Prior to the date/time grades are due, full-time departmental faculty meet to discuss and vote to finalize ratings on the Intern KEYS. Candidates receive their ratings in VIA Livetext. When ratings have been assigned in VIA Livetext, the Director of Field Experiences, Certification, and Outreach e-mails all candidates indicating that ratings are available.

What is the purpose of this assessment?
The Intern KEYS serves as a summative assessment to document candidate ability to respond appropriately to the learner and learning, engage in research-based instructional practice, and demonstrate professional responsibility. Candidates are responsible for compiling evidence from coursework and the field to substantiate a rating they have assigned themselves based on self-assessment. Candidates may receive a rating ranging from Emerging to Leading. Each of these ratings includes a level 1 or 2 to indicate whether candidates have evidence for some of the indicators listed for the rating or all of the indicators listed for the rating.

How is this assessment used to make decisions about candidates’ progress through the program?
During practicum semesters, this instrument is used to provide formative feedback to candidates concerning their overall performance. During the final semester of clinical practice, this instrument is used to provide summative feedback.

What score/rating must a candidate reach in order to pass this assessment?
Candidates are expected to score at the practicing level across all categories by the end of student teaching in order to graduate.

What happens if a candidate does not meet this required criterion?
If candidates do not meet the required criterion, they will be placed on a professional improvement plan (PIP) where specific goals are defined to support the candidate in achieving departmental expectations. If the candidate fails to meet the goals specified in the PIP, the candidate is ineligible to graduate and will be required to repeat the clinical practice experience.

5. Georgia Assessments for the Certification of Educators (GACE)

What are the points in the program when the assessment is administered? How often is it administered?
The Georgia Assessments for the Certification of Educators (GACE) are administered during the senior year of study. Candidates are responsible for registering to take the appropriate assessment.

Who administers it at each different point?
GACE is administered by the Educational Testing Service (ETS).

What is the purpose of this assessment?
The purpose of the GACE assessments is to help the Georgia Professional Standards Commission (GaPSC) ensure that candidates have the knowledge and skills needed to perform the job of an educator in Georgia’s public schools. The assessment measures competency in the knowledge and skills needed to teach in Georgia’s P-12 classrooms.

How is this assessment used to make decisions about candidates’ progress through the program?
In order to graduate, candidates must attempt their subject-specific GACE. For certification, candidates are required to pass each subject-specific GACE tied to their area.

What score/rating must a candidate reach in order to pass this assessment?
All GACE test results tied to the Department of Education and Teacher Preparation programs are reported as scaled scores on a scale of 100 to 300. Minimum passing scores for all tests were established by the GaPSC with input from committees of Georgia educators. For content assessments tied to our programs, candidates may fall into one of two passing categories. If they earn a score of at least 220, they pass at the induction level. If they earn a score of at least 250, they pass at the professional level. The passing score for induction-level certification is also known as the “passing standard.” Passing at either of these levels meets the Georgia Special Requirement to pass the content knowledge assessment(s) appropriate to the field of certification. If candidates take a combined test and pass one subtest at the induction level and the other at the professional level, the entire assessment will only be considered passed at the induction level. Candidates must retake the subtest that was passed at the induction level if they wish to attempt to earn a higher score and pass the entire assessment at the professional level. As previously stated, in order to graduate, candidates must attempt their subject-specific GACE. For certification, candidates are required to pass each subject-specific GACE tied to their area. (www.gace.ets.org)
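For illustration only, the Python sketch below (not an official GACE or GaPSC tool) encodes the 220/250 thresholds and the combined-test rule described above. The assumption that the same thresholds apply to each individual subtest follows the narrative; the function names are introduced for the example.

    # Illustrative sketch of the GACE score-to-level logic described above.
    def gace_pass_level(score: int) -> str:
        # Scaled scores range from 100 to 300; 220 = induction, 250 = professional.
        if score >= 250:
            return "professional"
        if score >= 220:
            return "induction"
        return "not passed"

    def combined_pass_level(subtest_scores: list[int]) -> str:
        # A combined test is only passed at the lowest level earned across subtests.
        levels = [gace_pass_level(s) for s in subtest_scores]
        if "not passed" in levels:
            return "not passed"
        if "induction" in levels:
            return "induction"
        return "professional"

    # Example: one subtest at 255 (professional) and one at 230 (induction)
    # yields an overall induction-level pass, as described above.
    print(combined_pass_level([255, 230]))  # induction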

What happens if a candidate does not meet this required criterion?
The Director of Field Experiences, Certification, and Outreach (FECO) is responsible for monitoring GACE attempts and pass rates. In the event that a candidate fails a GACE assessment, the Director of FECO alerts the designated program coordinator who will contact candidates to determine need/desire for remediation opportunities. Candidates are responsible for pursuing remediation by responding to departmental outreach.

6. edTPA

What are the points in the program when the assessment is administered?
Candidates are eligible to register for their content-specific edTPA at the start of clinical practice.

How often is it administered?
edTPA is administered during designated windows for the duration of the academic year.

Who administers it at each different point?
Candidates register for this assessment via www.edtpa.com to be externally evaluated by scorers of Pearson Education, Inc.

What is the purpose of this assessment?
edTPA is a performance-based, subject-specific assessment and support system used by teacher preparation programs throughout the United States to emphasize, measure and support the skills and knowledge that all teachers need from Day 1 in the classroom. (www.edtpa.com)

How is this assessment used to make decisions about candidates’ progress through the program?
In order to graduate, candidates must attempt their subject-specific edTPA and receive a score with no condition codes. For certification, candidates are required to pass their subject-specific edTPA.

What score/rating must a candidate reach in order to pass this assessment?
For candidates enrolled in the early childhood and special education program, the passing score across 18 rubrics is 45. For candidates enrolled in the middle grades or secondary education programs, the passing score across 15 rubrics is 38.
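As an illustrative sketch only (Python; not a Pearson or DETP tool), and assuming the pass decision compares the sum of the rubric scores to the program cut score, the following encodes the thresholds stated above. The program keys and function name are hypothetical.

    # Illustrative sketch of the edTPA cut scores stated above, assuming the
    # decision sums the rubric scores and compares the total to the cut score.
    EDTPA_CUT_SCORES = {
        "early_childhood_special_education": (18, 45),  # (number of rubrics, passing total)
        "middle_grades": (15, 38),
        "secondary": (15, 38),
    }

    def edtpa_passed(program: str, rubric_scores: list[int]) -> bool:
        expected_rubrics, cut_score = EDTPA_CUT_SCORES[program]
        if len(rubric_scores) != expected_rubrics:
            raise ValueError("unexpected number of rubric scores for this program")
        return sum(rubric_scores) >= cut_score

    # Example: a middle grades portfolio totaling 39 across 15 rubrics passes (39 >= 38).
    print(edtpa_passed("middle_grades", [3, 2, 3, 3, 2, 3, 2, 3, 3, 2, 3, 2, 3, 2, 3]))  # True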

What happens if a candidate does not meet this required criterion?
If candidates do not meet the required criterion for passing edTPA, the instructor for the seminar course taken concurrently with clinical practice provides a remediation session at the request of the candidate.

Appendix A

College of Coastal Georgia
Department of Education and Teacher Preparation

Candidate Assessment on Performance Standards (CAPS)

The following observation instrument was developed using the expectations set forth within the Georgia Department of Education Teacher Keys Effectiveness System (TKES) performance standards (http://www.gadoe.org/School-Improvement/Teacher-and-Leader-Effectiveness/Pages/Teacher-Keys-Effectiveness-System.aspx) in order to align pre-service teacher expectations to those set forth for in-service teachers in the state of Georgia.

Instructions for Use During Lesson Observation

This observation instrument will be used to provide feedback to teacher candidates across five domains following a formal lesson observation: 1) Planning, 2) Instructional Delivery, 3) Assessment of and for Learning, 4) Learning Environment, and 5) Professionalism and Communication. For each of the domains, the evaluator will determine the extent to which performance indicators aligned to the domain are met using the rubric below.

Level I – Below Target: 20% or less of performance indicators met
Level II – Approaches Target: 21%-50% of performance indicators met
Level III – Meets Target: 51%-90% of performance indicators met
Level IV – Exceeds Target: 91% or more of performance indicators met

For the professionalism and communication domain, the evaluator will base the rating on the discussion that occurs with the candidate post-observation. During this post-observation conference, the evaluator will specifically cite instances where the candidate reflected on specific strengths and areas to improve to indicate engagement in reflective practice as part of the evidence. For each of the domains, the evaluator will indicate a rating. Specific evidence using the performance indicators (and additional qualitative feedback, as desired) must be included to substantiate the rating.

Domain Requirements and Criteria

The observation instrument provides a unique opportunity for candidates to document growth as they advance in their respective programs. Candidates will be evaluated using the same level of expectation regardless of the point of progression in any given program. Criteria for what constitutes satisfactory performance at an indicated point of progression in a program have been determined by faculty across teacher education programs. Candidates must score a minimum of Level II – Approaches Target for each of the domains indicated in the table below in order to advance in the respective program. Note that the domains are spiraling. This means that in each subsequent semester, the candidate must score a minimum of Level II – Approaches Target on the indicated domains for that semester and the domains indicated in previous semesters in order to receive a grade of satisfactory in the associated practicum course and advance in the program. In the unusual circumstance that a candidate scores a Level I – Below Target on only one required domain, a professional improvement plan (PIP) will be initiated at the discretion of the Director of Field Experience, Certification, and Outreach, the practicum supervisor, and the practicum instructor. If a candidate fails to complete the required professional development activities indicated in the PIP, the candidate will be subject to receiving a failing grade in the associated practicum course. In order to receive a grade of satisfactory in Practicum 4 (student teaching) and be able to graduate, candidates must provide evidence that they received ratings of Level III – Meets Target across a minimum of four domains.

Early Childhood and Special Education
Practicum 1: Professionalism and Communication; Learning Environment
Practicum 2: Professionalism and Communication; Learning Environment; Planning; Instructional Delivery
Practicum 3: Level II – Approaches Target required across all domains
Student Teaching: Level III – Meets Target required across a minimum of four domains

Middle Grades Education
Practicum 1/2: Professionalism and Communication; Learning Environment; Planning; Instructional Delivery
Practicum 3: Level II – Approaches Target required across all domains
Student Teaching: Level III – Meets Target required across a minimum of four domains

Secondary Education
Practicum 1/2: Level II – Approaches Target required across all domains
Student Teaching: Level III – Meets Target required across a minimum of four domains

Instructions Following Lesson Observation
Following the observation, teacher candidates will be expected to reflect on their practice documenting identified strengths and areas for growth using departmental reflection prompts. This informal reflection must occur prior to the post-observation conference with the evaluator and must be completed within three days of the lesson plan implementation. When the informal reflection is completed, teacher candidates are responsible for scheduling a debriefing conference with the evaluator. During this debriefing conference, the evaluator will provide detailed feedback across all domains citing specific evidence to substantiate each rating provided. Following the debriefing conference, the evaluator and the teacher candidate will sign and date the document indicating that feedback was provided to the teacher candidate.

Determination of Successful Observation
Following the debriefing conference, the teacher candidate will refer to the domain requirements and criteria for the appropriate point of progression within the teacher education program. In the event that a candidate receives a Level I – Below Target rating on any domain, the candidate will determine if he/she wishes to: a) schedule another observation with the same evaluator to incorporate feedback provided and demonstrate improvement or b) request an observation from another faculty member within the Department of Education and Teacher Preparation as agreed upon by the evaluator. The domains are used to determine if a candidate will receive a passing grade in an associated practicum course. Any candidate who fails to meet the domain requirements and criteria for the indicated point of progression in a teacher education program is subject to receiving a failing grade for the associated practicum course.

Teacher Candidate:
School:
Program/Course:
Grade Level:
Date/Time of Observation:
Content Area:
Name of Evaluator:
Mentor Teacher:

DOMAIN 1: PLANNING

Performance Standard 1: Professional Knowledge
The teacher candidate demonstrates an understanding of the curriculum, subject content, pedagogical knowledge, and the needs of students by providing relevant learning experiences.

Performance Standard 2: Instructional Planning
The teacher candidate plans using state and local school district curricula and standards, effective strategies, resources, and data to address the differentiated needs of all the students.

Performance Indicators Observed
o aligns to grade-level standards
o identifies and aligns to observable and measurable objectives
o demonstrates accurate, deep, and current content knowledge
o links present content with past and future learning
o incorporates evidence-based practices identified in relevant/current research and theory
o incorporates developmental and age-related needs
o develops a plan that is clear, logical, sequential, and integrated across the curriculum, as appropriate
o plans instruction effectively for content mastery, pacing, and transitions
o plans to meet the diverse needs of learners including readiness, interest, and/or learning preferences
o plans to use a wide variety of resources to support whole group, small group, and individual learning
o _________________________________________________
o _________________________________________________

Rating: _____ Level I-Below Target _____ Level II-Approaches Target _____ Level III-Meets Target _____ Level IV-Exceeds Target

Additional Qualitative Evidence

DOMAIN 2: INSTRUCTIONAL DELIVERY
Performance Standard 3: Instructional Strategies
The teacher candidate systematically gathers, analyzes, and uses relevant data to measure student progress, to inform instructional content and delivery methods, and to provide timely and constructive feedback to both students and parents.

Performance Standard 4: Differentiated Instruction
The teacher candidate challenges and supports each student’s learning by providing appropriate content and developing skills which address individual learning differences.

Performance Indicators Observed
The teacher candidate…
o builds upon existing knowledge, skills, and experiences of students
o explains directions, concepts, and lesson content to students in a logical, sequential, and age-appropriate manner
o reinforces learning objectives consistently throughout the lesson
o uses a variety of instructional strategies and resources (including technology, as appropriate) to enhance student learning
o uses multiple levels of questioning to stimulate and monitor student learning
o communicates material clearly and checks for student understanding
o provides opportunities for remediation, enrichment, and/or acceleration as appropriate to the learner(s)
o adapts instruction “just in time” based on formative feedback
o _________________________________________________

The P-12 students…
o demonstrate active engagement as evidenced by body language, classroom talk, noise level, and/or student work samples
o act as a learning community with active participation in building, discussing, and sharing ideas with peers and teachers
o work in a variety of instructional arrangements (i.e., individual, small group, whole group)
o make thinking and learning visible through specific tasks initiated by the teacher candidate and/or peers
o self-monitor progress toward an identified learning objective(s)
o ________________________________________________

Rating: _____ Level I-Below Target _____ Level II-Approaches Target _____ Level III-Meets Target _____ Level IV-Exceeds Target

Additional Qualitative Evidence

DOMAIN 3: ASSESSMENT OF AND FOR LEARNING
Performance Standard 5: Assessment Strategies
The teacher candidate systematically chooses a variety of diagnostic, formative, and summative assessment strategies and instruments that are valid and appropriate for the content and student population.

Performance Standard 6: Assessment Uses
The teacher candidate systematically gathers, analyzes, and uses relevant data to measure student progress, to inform instructional content and delivery methods, and to provide timely and constructive feedback to both students and parents.

Performance Indicators Observed
o aligns student assessment to the established learning objectives and curriculum
o uses a range of formal and informal assessments for diagnostic, formative, and/or summative purposes
o varies/modifies assessments as appropriate to student needs
o analyzes/uses data to gain insights into or measure individual and collective student learning progress
o gives clear, timely, and informative oral and/or written feedback to support students in identifying strengths and strategies to use to improve learning
o uses assessment tools to inform, guide, and adapt long- and short-term instructional decisions
o _________________________________________________
o _________________________________________________

Rating: _____ Level I-Below Target _____ Level II-Approaches Target _____ Level III-Meets Target _____ Level IV-Exceeds Target

Additional Qualitative Evidence

DOMAIN 4: LEARNING ENVIRONMENT
Performance Standard 7: Positive Learning Environment
The teacher candidate provides a well-managed, safe, and orderly environment that is conducive to learning and encourages respect for all.

Performance Standard 8: Academically Challenging Environment
The teacher candidate creates a student-centered, academic environment in which teaching and learning occur at high levels and students are self-directed learners.

Performance Indicators Observed
o establishes clear expectations for classroom norms, routines, and procedures and enforces them consistently/appropriately
o sets high expectations reflecting on relevant student learning data
o supports all students in reaching set expectations
o involves students in thinking about their own learning progress as related to areas of strength/areas for growth and/or strategies to propel learning
o models caring, fairness, respect, and enthusiasm for learning
o manages proactively rather than reactively while also responding to unexpected events in a timely, appropriate manner
o responds to the socioemotional needs of learners creating an accepting and warm classroom culture where each student is valued and comfortable taking risks in learning
o celebrates the growth, talents, and efforts exerted by individual students
o maximizes instructional time
o promotes higher-order thinking through use of effective questioning, tasks, and resources/materials
o supports authentic learning opportunities for students
o encourages exploration, problem solving, collaboration, and/or student self-directed learning
o _________________________________________________
o _________________________________________________

Rating: _____ Level I-Below Target _____ Level II-Approaches Target _____ Level III-Meets Target _____ Level IV-Exceeds Target

Additional Qualitative Evidence

DOMAIN 5: PROFESSIONALISM AND COMMUNICATION
Performance Standard 9: Professionalism
The teacher candidate exhibits a commitment to professional ethics and the school’s mission, participates in professional growth opportunities to support student learning, and contributes to the profession.

Performance Standard 10: Communication
The teacher candidate communicates effectively with students, parents or guardians, district and school personnel, and other stakeholders in ways that enhance student learning.

Performance Indicators Observed

o carries out duties in accordance with federal/state laws, Code of Ethics, and established state and local school board policies, regulations, and practices

o maintains professional demeanor and behavior (i.e., confidentiality, punctuality, language, and attendance)

o collaborates with mentors, colleagues, faculty, and other relevant stakeholders to reflect on and improve teaching and learning for students

o engages in reflective practice to identify strengths and areas for growth following lesson implementation

o demonstrates flexibility in adapting to school change

o uses verbal and non-verbal communication techniques to foster positive interactions and promote learning in the classroom/school environment

o uses precise language, correct vocabulary/grammar, and appropriate forms of oral and written communication

o listens and responds with cultural awareness, empathy, and understanding to the voice and opinions of diverse students

o uses modes of communication that are appropriate for a given situation

o _________________________________________________
o _________________________________________________

Rating: _____ Level I-Below Target _____ Level II-Approaches Target _____ Level III-Meets Target _____ Level IV-Exceeds Target

Additional Qualitative Evidence

Evaluator Signature: ____________________________________________________ Date: ____________

Teacher Candidate Signature: _____________________________________________ Date: ____________

Appendix B

Educator Disposition Assessment

Name:___________________________________________________________________Date:_________________________ Evaluator:______________________________________________________________________________________________

Directions: Please rate the individual on each disposition using the scale below, marking the corresponding number in the cell. Please note that italicized constructs are further explained in the technical manual. Indicators for each disposition are found in the cells. Scores for each of the nine dispositions will be averaged to calculate an overall composite score. Lastly, please add comments to support ratings as needed.

0 - Needs Improvement: minimal evidence of understanding and commitment to the disposition
1 - Developing: some evidence of understanding and commitment to the disposition
2 - Meets Expectations: considerable evidence of understanding and commitment to the disposition
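As a simple illustration of the scoring directions above, the sketch below averages the nine disposition ratings into the overall composite score; the function name and data structure are hypothetical and not part of the instrument itself.

```python
# Hypothetical sketch (not part of the instrument): each of the nine
# dispositions receives a rating of 0, 1, or 2, and the overall composite
# score is the average of those nine ratings.

def composite_score(disposition_ratings):
    """Average the nine disposition ratings (each 0, 1, or 2)."""
    if len(disposition_ratings) != 9:
        raise ValueError("Expected ratings for all nine dispositions.")
    return round(sum(disposition_ratings) / 9, 2)

# Example: a candidate rated 2 on six dispositions and 1 on three.
print(composite_score([2, 2, 2, 2, 2, 2, 1, 1, 1]))  # 1.67
```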

Disposition Associated Indicators 1. Demonstrates Effective Oral Communication Skills

Needs Improvement (0)
Developing (1)
Meets Expectations (2)

□ Does not consistently model Standard English as evidenced by making major errors

□ Does not vary oral communication to motivate students as evidenced by monotone voice with visible lack of student participation

□ Choice of vocabulary is either too difficult or too simplistic

□ Models Standard English and makes common and noticeable errors

□ Strives to vary oral communication as evidenced by some students demonstrating a lack of participation

□ Occasionally uses vocabulary that is either too difficult or too simplistic

□ Models Standard English with a high level of competence as evidenced by no errors

□ Varies oral communication as evidenced by encouraging participatory behaviors

□ Communicates at an age appropriate level as evidenced by explaining content specific vocabulary

Disposition Associated Indicators 2. Demonstrates Effective Written Communication Skills

Needs Improvement (0)
Developing (1)
Meets Expectations (2)

□ Communicates in tones that are harsh or negative as evidenced by fostering negative responses

□ Demonstrates major spelling and grammar errors or demonstrates frequent common mistakes

□ Communicates respectfully and positively but with some detectable negative undertones, evidenced by unproductive responses

□ Demonstrates common errors in spelling and grammar

□ Communicates respectfully and positively with all stakeholders as evidenced by fostering cordial responses

□ Demonstrates precise spelling and grammar

Disposition Associated Indicators 3. Demonstrates Professionalism (Danielson: 4f; InTASC: 9(o))

Needs Improvement (0)
Developing (1)
Meets Expectations (2)

□ Does not respond to communications and does not submit all assignments

□ Fails to exhibit punctuality and/or attendance

□ Crosses major boundaries of ethical standards of practice

□ Divulges inappropriate personal life issues in the classroom/workplace as evidenced by uncomfortable responses from others

□ Functions as a group member with no participation

□ Delayed response to communications and late submission of assignments

□ Not consistently punctual and/or has absences

□ Crosses minor boundaries of ethical standards of practice

□ Occasionally divulges inappropriate personal life issues into the classroom/workplace, but this is kept to a minimum

□ Functions as a collaborative group member as evidenced by minimal levels of participation towards productive outcomes or monopolizes conversation

□ Responds promptly to communications and submits all assignments

□ Consistently exhibits punctuality and attendance

□ Maintains professional boundaries of ethical standards of practice

□ Keeps inappropriate personal life issues out of classroom/workplace

□ Functions as a collaborative group member as evidenced by high levels of participation towards productive outcomes

Disposition Associated Indicators 4. Demonstrates a positive and enthusiastic attitude (Marzano: 29)

Needs Improvement (0)
□ Often complains when encountering problems and rarely offers solutions
□ Resists change and appears offended when suggestions are made to try new ideas/activities
□ Demonstrates a flattened affect as evidenced by lack of expressive gestures and vocal expressions

Developing (1)
□ Seeks solutions to problems with prompting
□ May tentatively try new ideas/activities that are suggested yet is often unsure of how to proceed
□ Overlooks opportunities to demonstrate positive affect

Meets Expectations (2)
□ Actively seeks solutions to problems without prompting or complaining
□ Tries new ideas/activities that are suggested
□ Demonstrates an appropriately positive affect with students as evidenced by verbal and non-verbal cues

Disposition Associated Indicators 5. Demonstrates preparedness in teaching and learning (Danielson: 1e, 3e, 4a; InTASC: 3(p))

Needs Improvement (0)
Developing (1)
Meets Expectations (2)

□ Rejects constructive feedback as evidenced by no implementation of feedback

□ Possesses an inaccurate perception of teaching/learning effectiveness as evidenced by limited concept of how to improve

□ Comes to class unplanned and without needed materials

□ Does not have awareness to alter lessons in progress as evidenced by activating no changes when needed

□ Somewhat resistant to constructive feedback as evidenced by a lack of follow through on some suggestions

□ Reflection contains inaccuracies as evidenced by needing assistance for corrective measures of improvement

□ Comes to class with some plans and most needed materials

□ Aware that lesson is not working but does not know how to alter plans to adjust

□ Accepts constructive feedback as evidenced by implementation of feedback as needed

□ Learns and adjusts from experience and reflection as evidenced by improvements in performance

□ Comes to class planned and with all needed materials

□ Alters lessons in progress when needed as evidenced by ability to change plan mid-lesson to overcome the deficits

Disposition Associated Indicators 6. Exhibits an appreciation of and value for cultural and academic diversity (Danielson: 1b, 2a, 2b; Marzano: 36, 39; InTASC: 2(m), 2(n), 2(o), 3(o), 9(m), 10(q))

Needs Improvement (0)
□ Demonstrates inequitable embracement of all diversities
□ Is challenged to create a safe classroom as evidenced by ignoring negative behaviors by students

Developing (1)
□ Goes through the expected and superficial motions to embrace all diversities
□ Strives to build a safe classroom with zero tolerance of negative behaviors towards others but needs further development in accomplishing this task

Meets Expectations (2)
□ Embraces all diversities as evidenced by implementing inclusive activities and behaviors with goals of transcendence
□ Creates a safe classroom with zero tolerance of negativity to others as evidenced by correcting negative student behaviors

Disposition Associated Indicators 7. Collaborates effectively with stakeholders (Danielson: 4c, 4d; Marzano: 55, 56; InTASC: 1(k), 3(n), 3(q), 7(o))

Needs Improvement (0)
Developing (1)
Meets Expectations (2)

□ Is inflexible, as evidenced by inability to work well with others and does not accept majority consensus

□ Tone exhibits a general lack of respect for others as evidenced by interruptions and talking over others

□ Rarely collaborates or shares strategies and ideas even when prompted

□ Demonstrates some flexibility

□ Maintains a respectful tone in most circumstances but is not consistent

□ Shares teaching strategies as evidenced by some effort towards collaboration

□ Demonstrates flexibility as evidenced by providing considered responses and accepts majority consensus

□ Maintains a respectful tone at all times, even during dissent as evidenced by not interrupting or talking over others

□ Proactively shares teaching strategies as evidenced by productive collaboration

Disposition Associated Indicators 8. Demonstrates self-regulated learner behaviors/takes initiative (Danielson: 4e; Marzano: 57; InTASC: 9(l), 9(n), 10(r), 10(t))

Needs Improvement (0)
Developing (1)
Meets Expectations (2)

□ Is unable to self-correct own weaknesses as evidenced by not asking for support or overuse of requests for support

□ Does not conduct appropriate research to guide the implementation of effective teaching as evidenced by a lack of citations in work

□ Is beginning to recognize own weaknesses and asks for support making some effort to become involved in professional growth

□ Level of research needs further development to acquire fully and integrate resources leading to implementing different and effective teaching styles

□ Recognizes own weaknesses as evidenced by seeking solutions before asking for support

□ Researches and implements most effective teaching styles as evidenced by citing works submitted

Disposition Associated Indicators 9. Exhibits the social and emotional intelligence to promote personal and educational goals/stability (Marzano: 37, 38)

Needs Improvement (0)
□ Demonstrates immaturity and lack of self-regulation as evidenced by overreacting to sensitive issues
□ Does not demonstrate perseverance and resilience (grit) as evidenced by giving up easily
□ Demonstrates insensitivity to feelings of others as evidenced by a lack of compassion and empathetic social awareness

Developing (1)
□ Demonstrates level of maturity to self-regulate after initial response is one of overreaction to sensitive issues
□ Demonstrates perseverance and resilience (grit) most of the time
□ Demonstrates sensitivity to feelings of others most of the time

Meets Expectations (2)
□ Demonstrates appropriate maturity and self-regulation as evidenced by remaining calm when discussing sensitive issues
□ Demonstrates perseverance and resilience (grit) as evidenced by tenacious and determined ability to persist through tough situations
□ Demonstrates sensitivity to feelings of others as evidenced by compassionate and empathetic social awareness

AVERAGE COMPOSITE SCORE ACROSS NINE DISPOSITIONS:

COMMENTS:

Appendix C

College of Coastal Georgia

Department of Education and Teacher Preparation

Impact on Student Learning Project

As an integral component of the teacher education program at the College of Coastal Georgia (CCGA), candidates must be able to develop and effectively implement a learning segment to support P-12 student learning of a clearly defined central focus (i.e., learning goal). A critical part of preparation as an educator is the ability to demonstrate impact on student learning. As a part of the course indicated in the respective program of study just prior to clinical practice, candidates will be expected to plan, implement, and evaluate a learning segment. The courses in which this project will be housed are indicated in the table below, aligned to the respective program.

Early Childhood and Special Education Candidates: LITR 4010: Literacy for 21st Century Learning
Middle Grades and Secondary Education Candidates: MSED 3020: Assessment and Differentiation

In order to advance to clinical practice and be eligible to complete the associated edTPA, candidates must score at a minimum of “Meets Expectation” on each of three criteria: 1) Design of Instruction and Assessment, 2) Analysis of Student Learning, and 3) Reflective Practice. For each criterion, instructors will assign a rating of 1) Does Not Meet Expectation, 2) Meets Expectation, or 3) Exceeds Expectation based upon the rating under which the majority of listed indicators fall. If a candidate fails to meet the required criteria prior to the close of the fall semester of senior year, the candidate will be placed on a professional improvement plan (PIP) at the discretion of the course instructor. In order to be eligible for completion of edTPA, the candidate must complete the goals indicated in the PIP. Submission of the edTPA portfolio is a program completion requirement. This assessment mechanism serves as a gatekeeper to ensure candidates are ready for this summative assessment.
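As a minimal illustration of the gatekeeping logic described above, the sketch below assigns each criterion the rating under which the majority of its indicators fall and checks eligibility to advance; the function names and data structures are hypothetical and used only for illustration.

```python
from collections import Counter

# Hypothetical sketch of the gatekeeping check described above: each criterion
# is scored by the rating under which the majority of its indicators fall, and
# advancement requires at least "Meets Expectation" on all three criteria.

RATINGS = ["Does Not Meet Expectation", "Meets Expectation", "Exceeds Expectation"]

def criterion_rating(indicator_ratings):
    """Return the rating under which the majority of indicators fall."""
    return Counter(indicator_ratings).most_common(1)[0][0]

def eligible_for_clinical_practice(project_scores):
    """project_scores maps each criterion name to a list of indicator ratings."""
    return all(
        RATINGS.index(criterion_rating(indicators)) >= RATINGS.index("Meets Expectation")
        for indicators in project_scores.values()
    )

example = {
    "Design of Instruction and Assessment": ["Meets Expectation"] * 4 + ["Exceeds Expectation"] * 2,
    "Analysis of Student Learning": ["Meets Expectation"] * 3 + ["Does Not Meet Expectation"],
    "Reflective Practice": ["Exceeds Expectation"] * 3,
}
print(eligible_for_clinical_practice(example))  # True
```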

In the semester prior to clinical practice, the Impact on Student Learning Project will include the following:

Overview of Learning Segment Candidates will select a standard(s) and describe the central focus for a planned learning segment. Given the central focus, candidates will describe how standards and learning objectives within the learning segment are connected and sequenced to support student learning in meaningful contexts. Additionally, candidates will describe academic language of the learning segment and indicate how planned instructional supports will be used to help students understand, develop, and use the academic language.

Knowledge of Students Candidates will describe how prior academic learning/prerequisite skills and personal, cultural, and community assets of learners related to the central focus informed instructional planning decisions.

Assessment Plan Candidates will describe formal and informal (i.e., formative and summative) assessments that will be used to provide direct evidence of student learning as defined in the central focus and lesson objectives. Additionally, candidates will explain any assessment adaptations that will be incorporated to support diverse students.

Lesson Plans Candidates will design 3-5 sequential or connected lesson plans using the edTPA lesson plan framework.

Analysis of Student Learning Following learning segment implementation, candidates will be expected to analyze assessment data by summarizing student learning for all evaluation criteria. In this analysis, candidates will cite specific evidence from assessments to analyze patterns of learning for the whole class, small groups, and individuals. Candidates will also indicate next steps for instruction as a result of assessment analysis.

Learning Segment Reflection Following learning segment implementation, candidates will describe how they promoted a positive learning environment, engaged students in active learning, and deepened student learning during instruction. Candidates will also reflect on strengths and areas to improve with regard to learning segment delivery.

CCGA DETP Impact on Student Learning Rubric

Design of Instruction and Assessment

Does Not Meet Expectation:
Fails to incorporate assets of learners
Fails to incorporate prior academic knowledge
Fails to incorporate needs of diverse students
Fails to align standards, objectives, and central focus
Fails to align assessment measures and stated objectives/central focus
Incorporates minimal informal/formal (i.e., formative/summative) assessments to drive instruction

Meets Expectation:
Incorporates assets of learners
Incorporates prior academic knowledge
Incorporates needs of diverse students
Aligns standards, objectives, and central focus
Aligns assessment measures and stated objectives/central focus
Incorporates appropriate informal/formal (i.e., formative/summative) assessments to drive instruction

Exceeds Expectation:
Incorporates a wide range of assets of learners
Incorporates varied levels of prior academic knowledge
Incorporates a wide range of diverse learner needs
Aligns standards, objectives, central focus, and assessment measures with consideration to unique needs of learners
Uses a wide variety of instructional approaches

Analysis of Student Learning

Does Not Meet Expectation:
Uses technology to discuss implications of data analysis but only as related to the whole class

Meets Expectation:
Uses technology to summarize data for whole class and subgroups in graphs/tables or via a narrative
Analyzes/Interprets data by citing specific evidence to document impact on student learning across groups, subgroups, and/or individuals

Exceeds Expectation:
Uses technology to summarize data for whole class, subgroups, and individuals in graphs/tables or via a narrative
Analyzes/Interprets multiple forms of data, in meaningful ways (i.e., correlation), to document impact on student learning across groups, subgroups, and individuals

Reflective Practice

Does Not Meet Expectation:
Discusses implications of data analysis but only as related to the whole class
Fails to identify changes in practice to improve student learning
Fails to reflect on strengths and areas to improve in reference to teaching

Meets Expectation:
Discusses the implications of data analysis for whole class and subgroups or individuals at varied levels of performance
Identifies changes in practice that would improve student learning based on analysis of data
Reflects on strengths and areas to improve in reference to teaching

Exceeds Expectation:
Discusses the implications of data analysis, in meaningful ways (i.e., correlation), for whole class, subgroups, and individuals at varied levels of performance
Explains how adjustments to teaching would improve student learning based on data analysis
Self-evaluates teaching by identifying strengths and areas to improve with reference to strategies to be used for professional growth

Appendix D

College of Coastal Georgia

Department of Education and Teacher Preparation

Intern KEYS

Candidates are eligible for level 1 and level 2 ratings within classifications ranging from emerging candidate to leading candidate. Within these classifications, if candidates provide some evidence that they are performing within the criteria of a designated classification, they will receive a level 1 rating. In order to receive a level 2 rating within a classification, candidates must provide evidence that most criteria are met. Candidates are responsible for building a case that they meet the criteria of specified classifications. Evidence must include artifacts from the field experience and course work. In order to graduate, candidates are expected to score at the practicing level across all categories prior to the end of student teaching.
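As a minimal illustration of how these level assignments and the graduation requirement might be checked, the sketch below is hypothetical; the function names, the more-than-half threshold used for "most criteria," and the data structures are assumptions rather than departmental rules.

```python
# Hypothetical sketch of the level assignment and graduation check described
# above. The more-than-half threshold for "most criteria," the function names,
# and the data structures are assumptions, not departmental rules.

def classification_level(criteria_met):
    """criteria_met is a list of booleans, one per criterion in the classification."""
    met = sum(criteria_met)
    if met == 0:
        return None  # no evidence provided within this classification
    if met > len(criteria_met) / 2:
        return 2     # evidence that most criteria are met
    return 1         # some evidence provided

def eligible_to_graduate(category_classifications):
    """Candidates must reach the practicing level or higher in every category."""
    order = ["Emerging", "Developing", "Practicing", "Leading"]
    return all(order.index(c) >= order.index("Practicing")
               for c in category_classifications.values())

print(classification_level([True, True, False, True, True]))  # 2
print(eligible_to_graduate({"The Learner and Learning": "Practicing",
                            "Instructional Practice": "Leading",
                            "Professional Responsibility": "Developing"}))  # False
```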

Category One: The Learner and Learning: How do candidates develop learning experiences appropriate for the learner?

Emerging (E-1, E-2)   Developing (D-1, D-2)   Practicing (P-1, P-2)   Leading (L-1, L-2)

The emerging candidate provides evidence of developing learning experiences that:
align to grade-level standards
represent accurate content knowledge of the discipline
build on students’ prior academic learning
take into consideration the developmental and age-related needs of learners
include supports that address requirements from IEP and 504 plans

In addition to meeting emerging expectations, the developing candidate provides evidence of developing learning experiences that:
align to a central focus and cohesive set of learning objectives (what students will know and be able to do)
are informed by some formative or summative assessment data
build on students’ personal, cultural, and community assets
incorporate a range of opportunities and ways for students to engage with, participate in, and represent or express subject matter learning
incorporate multiple assessments that provide evidence of student learning and progress

In addition to meeting developing expectations, the practicing candidate provides evidence of developing learning experiences that:
are logically sequenced over time
are informed by patterns of strengths and needs in both individual and collective student assessment data
encourage exploration, problem-solving, and/or collaboration
incorporate multiple ways of eliciting and making visible student ideas and thinking as (a) a resource for student learning, and (b) to monitor both individual and collective student learning progress
includes learning supports that help both individual and groups of students

In addition to meeting practicing expectations, the leading candidate provides wide-ranging evidence of developing learning experiences that:
use systematic and ongoing reflection of assessment data to provide customized learning experiences
make explicit and relevant interdisciplinary connections
incorporate innovative resources and strategies resulting in high levels of student engagement
provide learners opportunities to self-assess and use metacognitive strategies to support lifelong learning

include learning supports that address the needs of the class
respond to the social emotional needs of the learner
reach high standards of learning
provide academic language support

Category Two: Instructional Practice: How do candidates implement research-based practices?

Emerging (E-1, E-2)   Developing (D-1, D-2)   Practicing (P-1, P-2)   Leading (L-1, L-2)

The emerging candidate provides evidence of implementation demonstrating:
Standard American English in written and spoken communication
rapport with students and respect for students
ability to manage whole and small group tasks

In addition to meeting emerging expectations, the developing candidate provides evidence of implementation demonstrating:
a positive low-risk learning environment
explicit modeling of skills (i.e., analyze, summarize) and thought processes necessary for student learning
ability to connect new content to prior academic learning
active engagement of students throughout the entirety of the lesson
ability to elicit and build on student responses to support content and process development of learners
skill in providing feedback to support student learning

In addition to meeting developing expectations, the practicing candidate provides consistent evidence of implementation demonstrating:
student support to learn, practice, and apply skills in an authentic context
varied levels of student participation and engagement as appropriate to the learner
ability to link new content to personal, cultural, and community assets
a challenging learning environment that provides opportunities to express varied perspectives and promotes mutual respect among students
incorporation of active and visible learning strategies
responsive teaching with adjustments made “just in time” in response to student dialogue or evidence of learning
seamless incorporation of varied resources including technology
ability to provide useable feedback that addresses both strengths and needs of individual students

In addition to meeting practicing expectations, the leading candidate provides evidence of implementation demonstrating:
student-led inquiry based learning opportunities
involvement of students in self-monitoring progress and setting learning goals

Category Three: Professional Responsibility: How does the candidate invest in opportunities to grow as a professional?

Emerging (E-1, E-2)   Developing (D-1, D-2)   Practicing (P-1, P-2)   Leading (L-1, L-2)

The emerging candidate provides evidence of:
meeting established deadlines and following relevant policies/procedures
reflective practice with incorporation of mentor teacher, instructor, and supervisor feedback
reflective practice through identification of at least 3 strengths in teaching
reflective practice through identification of at least 3 areas to improve in teaching
a strategic plan with observable actions to support continuous growth
adherence to federal/state laws, established state/local school board policies, regulations, and practices, and the GaPSC Code of Ethics

In addition to meeting emerging expectations, the developing candidate provides evidence of:
reflective practice with incorporation of peer feedback
reflection on assessment data for future instructional planning
participation in professional organizations and/or community involvement opportunities
collaboration with colleagues and other stakeholders to reach educational decisions that enhance and promote student learning
uses modes of communication that are appropriate for a given situation

In addition to meeting developing expectations, the practicing candidate provides evidence of:
reflective practice with incorporation of scholarly literature
reflection on assessment data for future instructional planning substantiated by research and theory
engagement in activities outside the classroom intended for school/student enhancement at the practicum site and/or professional growth
listens and responds with cultural awareness, empathy, and understanding to the voice and opinions of diverse students and stakeholders

In addition to meeting practicing expectations, the leading candidate provides evidence of:
leadership at the local, state, or national level in a professional capacity

Appendix E

SUMMATIVE CONFERENCE - SPRING 2018

Professional Growth Plan Follow-Up

Attach a copy of your professional growth plan from the previous semester below.

Did you meet the professional learning goal that you set last semester? Describe the steps you took this semester to strive toward meeting the goal that you set. Specifically, indicate how you measured your success or need to continue working toward the indicated goal.

Strengths of the Candidate

In reflection on the criteria included on the Intern KEYS rubric (see attached below), provide evidence of your strengths across the key elements of: A) The Learner and Learning, B) Instructional Practices, and C) Professional Responsibility. For each area, you will indicate the level you believe you achieved for this semester. 

ATTACHMENTS

CCGA - Intern KEYS- Revised 12-12-17.docx (37 Kb)

The Learner and Learning

Indicate the level you believe you achieved this semester ranging from Emerging (E-1) to Leading (L-2).

In the box below, provide evidence to document how you determined your rating for this category. Reference the artifacts that you are attaching and explain how these artifacts serve as evidence that you are achieving at the level identified. 

Instructional Practices

Indicate the level you believe you achieved this semester ranging from Emerging (E-1) to Leading (L-2).

In the box below, provide evidence to document how you determined your rating for this category. Reference the artifacts that you are attaching and explain how these artifacts serve as evidence that you are achieving at the level identified. 


Professional Responsibility

Indicate the level you believe you achieved this semester ranging from Emerging (E-1) to Leading (L-2).

In the box below, provide evidence to document how you determined your rating for this category. Reference the artifacts that you are attaching and explain how these artifacts serve as evidence that you are achieving at the level identified. 

Areas for Growth

In reflection on the criteria included on the Intern KEYS rubric, identify areas for growth across the elements of: A) The Learner and Learning, B) Instructional Practices, and C) Professional Responsibility. Use the language of the rubric to indicate how you could increase your level of achievement. Discuss the data you reviewed as you identify your areas for growth. This could include, but is not limited to, class assignment feedback, student assessment data, CAPS Observation data, and informal mentor teacher feedback.

Professional Growth Plan or Induction Plan

Attach your professional growth plan or induction plan following completion of feedback received from the summative conference. Then, submit the entire activity for review.

ATTACHMENTS

CCGA - Professional Growth Plan Template.docx (42 Kb)


Appendix F

Professional Growth Plan and Professional Growth Plan for Induction

The purpose of writing a professional growth plan is to provide documentation of reflection on practice with the goal of demonstrating growth across performance standards related to the learner and learning, instructional practice, and professional responsibility. This professional growth plan template provides a framework for use as you engage in a systematic process of reflection. Each plan will be unique to the individual teacher candidate, and each plan is based on candidate self-assessment through careful reflection on feedback provided from CCGA instructors, supervisors, and mentor teachers. The intent of this document is to guide and support the reflective practice required to generate a meaningful professional growth plan through identification of the goals, strategies, resources and support, and intended results that are of value to candidates seeking to grow as professionals.

Domain Requirements and Criteria

Candidates must score a minimum of Level II – Approaches Target for each of the domains on the CAPS evaluation in order to advance in the program. Note that the domains are spiraling. This means that in each subsequent semester, the candidate must score a minimum of Level II – Approaches Target on the indicated domains for that semester and the domains indicated in previous semesters. In the unusual circumstance that a candidate scores less than a Level II – Approaches Target on a single domain, a professional improvement plan (PIP) will be initiated at the discretion of the Director of Field Experience, Certification, and Outreach, supervisor, and practicum instructor. If a candidate fails to complete the required professional development activities indicated in the PIP, the candidate will be subject to receiving a failing grade in the associated practicum course. In order to receive a grade of satisfactory in practicum 4 (student teaching) and be able to graduate, candidates must provide evidence that they received ratings of Level III – Meets Target across a minimum of four domains.
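As a minimal illustration of the spiraling requirement described above, the sketch below checks a semester's CAPS ratings against the cumulative power domains for each practicum and against the student-teaching requirement; the function names and data structures are hypothetical, while the domain groupings follow the tables that follow.

```python
# Hypothetical sketch of the spiraling domain check described above. The
# function names and data structures are assumptions; the cumulative power
# domains per practicum follow the tables below.

LEVELS = {"Below Target": 1, "Approaches Target": 2, "Meets Target": 3, "Exceeds Target": 4}

REQUIRED_DOMAINS = {
    "Practicum I": ["Learning Environment", "Professionalism and Communication"],
    "Practicum II": ["Learning Environment", "Professionalism and Communication",
                     "Planning", "Instructional Delivery"],
    "Practicum III": ["Learning Environment", "Professionalism and Communication",
                      "Planning", "Instructional Delivery", "Assessment of and for Learning"],
}

def meets_practicum_requirement(practicum, ratings):
    """ratings maps each CAPS domain name to its level name for the semester."""
    return all(LEVELS[ratings[domain]] >= LEVELS["Approaches Target"]
               for domain in REQUIRED_DOMAINS[practicum])

def meets_student_teaching_requirement(ratings):
    """Student teaching requires Meets Target or higher on at least four domains."""
    return sum(LEVELS[level] >= LEVELS["Meets Target"] for level in ratings.values()) >= 4

example = {
    "Planning": "Approaches Target",
    "Instructional Delivery": "Meets Target",
    "Assessment of and for Learning": "Approaches Target",
    "Learning Environment": "Meets Target",
    "Professionalism and Communication": "Meets Target",
}
print(meets_practicum_requirement("Practicum III", example))  # True
print(meets_student_teaching_requirement(example))            # False (three domains at Meets Target)
```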

Practicum I

Power Domains:
Domain 4: Learning Environment
Domain 5: Professionalism and Communication

Domain 4: Performance Standard 7 – Positive Learning Environment
The teacher candidate provides a well-managed, safe, and orderly environment that is conducive to learning and encourages respect for all.

Domain 4: Performance Standard 8 – Academically Challenging Environment
The teacher candidate creates a student-centered, academic environment in which teaching and learning occur at high levels and students are self-directed learners.

Domain 5: Performance Standard 9 – Professionalism
The teacher candidate exhibits a commitment to professional ethics and the school’s mission, participates in professional growth opportunities to support student learning, and contributes to the profession.

Domain 5: Performance Standard 10 – Communication
The teacher candidate communicates effectively with students, parents or guardians, district and school personnel, and other stakeholders in ways that enhance student learning.

Practicum I/II

Power Domains:
Domain 1: Planning
Domain 2: Instructional Delivery

Domain 1: Performance Standard 1 – Professional Knowledge
The teacher candidate demonstrates an understanding of the curriculum, subject content, pedagogical knowledge, and the needs of students by providing relevant learning experiences.

Domain 1: Performance Standard 2 – Instructional Planning
The teacher candidate plans using state and local school district curricula and standards, effective strategies, resources, and data to address the differentiated needs of all the students.

Domain 2: Performance Standard 3 – Instructional Strategies
The teacher candidate systematically gathers, analyzes, and uses relevant data to measure student progress, to inform instructional content and delivery methods, and to provide timely and constructive feedback to both students and parents.

Domain 2: Performance Standard 4 – Differentiated Instruction
The teacher candidate challenges and supports each student’s learning by providing appropriate content and developing skills which address individual learning differences.

Practicum I/II/III

In order to successfully complete practicum III, candidates must score at a Level II – Approaches Target in all domains.

Power Domains:
Domain 3: Assessment of and for Learning

Domain 3: Performance Standard 5 – Assessment Strategies

The teacher candidate systematically chooses a variety of diagnostic, formative, and summative assessment strategies and instruments that are valid and appropriate for the content and student population.

Domain 3: Performance Standard 6 – Assessment Uses

The teacher candidate systematically gathers, analyzes, and uses relevant data to measure student progress, to inform instructional content and delivery methods, and to provide timely and constructive feedback to both students and parents.

Practicum IV – Incorporation of all Domains

In order to successfully complete student teaching, candidates must provide evidence that they received ratings of Level III – Meets Target across a minimum of four domains.

Use the template below to document a professional growth plan for induction.


Professional Growth Plan

Teacher Candidate: _________________________________ Date: ________________

Semester ____________________

Identify the standard of teaching and learning that you see as an area of growth.

Why did you select this standard as your goal?

My Goal: Based on self-reflection, evidence from observations, and conversations with my instructors, this is the FOCUS of my growth plan.

Strategy: These are the steps I will take to address my goal statement. These steps include my specific activities, my timeline, and the measures of success that will determine whether my goal is attained.

Resources & Support: These are the resources and support I will need to help me achieve my goal.

To be revisited at the close of the subsequent semester and incorporated in the corresponding summative conference – Results: Here is the outcome of my strategy with specific focus on the attainment of the stated measures of success.


Professional Growth Plan Team Members Present at Summative Conference

Name Role


Appendix G

Department of Education and Teacher Preparation

Professional Improvement Plan (PIP) Procedures

In the Department of Education and Teacher Preparation, faculty work to create the conditions for all candidates to succeed. As a part of a program of study, candidates may experience unique challenges and need additional support. The purpose of a professional improvement plan is to develop clear and concise guidelines for reengagement for an identified candidate within a teacher education program of study. The following procedures will be used to initiate the professional improvement plan process:

1. The College of Coastal Georgia (CCGA) faculty member will schedule a meeting with the candidate to discuss an identified area for improvement. This meeting signals to the candidate that a departmental alert is being initiated.

2. Following this meeting, the CCGA faculty member will document the discussion, the student plan for remediation, and any additional decisions or plans for follow-up using the departmental alert form. Both the candidate and faculty member will sign the departmental alert form to document shared understanding.

3. The CCGA faculty member will keep a copy of the form, provide a copy of the form to the candidate, and place the original document in the candidate’s folder.

4. In the event that a candidate fails to follow through on the agreements indicated in the departmental alert form, the CCGA faculty member will initiate the PIP process by scheduling a meeting with relevant parties including, but not limited to, the candidate, the department chair, and the Director of Field Experiences, Certification, and Outreach (for field related issues only).

5. The PIP form will be completed during the meeting with all relevant parties present. Following the meeting, all parties will sign as indicated on the PIP form. The original PIP form will be placed in the candidate’s teacher education folder. A copy of the PIP form will be provided to the candidate.

6. It is the responsibility of the candidate to keep the faculty member initiating the PIP informed of progress related to expectations set forth in the PIP. It is the responsibility of the faculty member initiating the PIP to schedule a follow-up meeting at an appropriate time to formally evaluate the progress of the PIP. The faculty member will e-mail all relevant parties, including, but not limited to, the candidate, the department chair, and the Director of Field Experiences, Certification, and Outreach (for field related issues only) to schedule the follow-up meeting.

7. The PIP progress monitoring form will be completed during the follow-up meeting with all relevant parties present. Following the meeting, faculty will make a recommendation and sign as indicated on the PIP progress monitoring form. In the event that program dismissal is a consideration, all departmental faculty will convene to vote on this decision. The candidate will be notified of this decision following the departmental vote. The original PIP progress monitoring form will be placed in the candidate’s teacher education folder. A copy of the PIP progress monitoring form will be provided to the candidate. If an extension of the PIP is indicated, steps 6 and 7 will be repeated, as needed.


College of Coastal GeorgiaDepartment of Education and Teacher Preparation

Departmental Alert Form

Teacher Candidate: Date:

CCGA ID: Course Prefix/Number (as applicable):

Participants:

Why is a departmental alert being initiated?

o Attendance

o Punctuality

o Preparedness

o Participation

o Collegiality

o Disposition

o Written Communication
o Oral Communication
o Professional Dress
o Ethical Conduct
o Subject Matter Knowledge
o Other ____________________

Conference Summary – Attach relevant documentation, as applicable.

The summary will describe the area for improvement and any plans for follow-up.

Faculty Signature/Date Candidate Signature/Date


College of Coastal GeorgiaDepartment of Education and Teacher Preparation

Professional Improvement Plan (PIP)

Teacher Candidate: Date:

CCGA ID: Course Prefix/Number (as applicable):

Participants:

Part A: Prior Experiences
The faculty member will describe the prior experiences related to the indicated area for improvement. Any completed departmental alert forms and/or relevant documentation must be attached.

Part B: Candidate Response

The faculty member will document the candidate response to part A.


Part C: Initiation of PIP

Describe the strategic plan that will be implemented to accelerate the candidate’s progress.

Goals    Strategies to Accomplish Goals    Dates for Completion

Date/Time for Follow-Up Meeting: ___________________________________________________________

Part D: Collaborative Agreement

I/We participated in the development of this PIP, and I/We understand that a follow-up meeting will be held to indicate the status of the PIP to the candidate. The candidate will either successfully complete the PIP, receive an extension of the PIP (in part or in whole), or be dismissed from the program.

__________________________________________

Faculty Signature/Date

__________________________________________

Faculty Signature/Date

__________________________________________

Department Chair Signature/Date

I participated in the development of this PIP, and I understand that failure to meet the agreed upon goals will result in an extension of the PIP (in part or in whole) or dismissal from the program.

__________________________________________


Candidate Signature/Date

College of Coastal GeorgiaDepartment of Education and Teacher Preparation

Professional Improvement Plan (PIP)Progress Monitoring

Part A: Candidate Update on Progress
The faculty member will document the information shared during each follow-up conference using the form below. The candidate will provide an update on progress toward each goal included in the PIP with reference to the corresponding strategies related to the goal. As appropriate, the candidate will indicate when goals have been completed.

Follow-Up Conference Date:

Participants:

Candidate Update on Progress:

Part B: Faculty Recommendation

I/We participated in the progress monitoring of this PIP, and I/We recommend the following:

No further action due to successful completion of PIP by the candidate
Extension of the PIP in-part as outlined below to be completed by ____________________
Extension of the PIP in-whole to be completed by ____________________
Dismissal from the program effective ___________________


In-Part Extension of PIP Requirements:

_______________________________________
Faculty Signature/Date

_________________________________________
Faculty Signature/Date

________________________________________
Department Chair Signature/Date

________________________________________
Dean Signature/Date (*Program Dismissal Only)

Data Collection and Analysis Policies and Procedures

As a part of the continuous improvement initiative within the DETP at CCGA, faculty have identified a need to develop policies and procedures for program-level and EPP-level data collection and analysis. As curricular revisions are finalized, signature assessments will be developed across programs. Policies and procedures related to data analysis will be critical to ensuring fidelity across programs’ assessment mechanisms. Collection, aggregation, analysis, and interpretation of assessment data will involve a multi-step process. The initial planning of these policies and procedures is indicated below:

1. When signature assessments are developed, DETP faculty will review relevant accreditation standards and departmental program outcomes to carefully select or create an instrument (i.e., rubric or examination) with clear alignment.

2. DETP faculty will establish a process for evaluation of assessment measures to help ensure validity, reliability, and fairness in the assessment process.

3. Program coordinators will serve as data managers within the Via Livetext digital assessment platform. Program coordinators will be responsible for inputting assessments that have been selected or developed and ensuring that instructors collect data using these signature assessments in the courses in which the assessments will be housed, as agreed upon by DETP faculty, at the start of each semester.

4. The Director of Field Experiences, Certification, and Outreach will be responsible for inputting, managing, and analyzing data related to field experiences, clinical practice internships, and certification requirements of CCGA candidates.

5. CCGA candidates will be required to purchase and create a Via Livetext account as a part of program entry requirements. This entry requirement will be incorporated within the curriculum revisions to be submitted to the CCGA curriculum committee in December of 2017. As a part of the account setup, candidates will be prompted to input demographic data that will be used for data aggregation comparisons by faculty during continuous improvement initiatives.

6. Prior to the close of each semester, the department chair will be responsible for meeting with program coordinators and the Director of Field Experiences, Certification, and Outreach to monitor data collection and analysis procedures conducted during that semester. The department chair will immediately report any missing candidate information or data entry to the individual responsible in order to ensure that data are consistently collected for analysis during each semester.

7. Prior to the close of each semester, the department chair will meet with program faculty to review data summaries for accuracy. In September of each year, program coordinators will review aggregate data for completeness and accuracy in preparation for the October annual data analysis meeting.

8. Prior to the annual data analysis meeting with DETP faculty, program coordinators and the Director of Field Experiences, Certification, and Outreach will be responsible for making aggregate and disaggregate data reports available to DETP faculty to use in conjunction with the Quality Assurance and Continuous Improvement Framework for Analysis. Reports should be aggregated for assessment criteria relevant to those questions being asked in the analysis framework.

9. During the October annual data analysis meeting, DETP faculty will collaborate to review relevant data and discuss/develop annual program reports.
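As a minimal sketch of the aggregate and disaggregate reporting described in steps 8 and 9, the example below summarizes signature assessment ratings by program and criterion, assuming the ratings have been exported to a CSV file; the file name and column names are hypothetical and do not reflect the platform's actual export format.

```python
import pandas as pd

# Hypothetical sketch: aggregate signature assessment ratings exported from the
# assessment platform into program-level summaries for the annual data review.
# The file name and column names (program, assessment, criterion, rating,
# demographic_group) are assumptions, not the platform's actual export format.

def build_reports(csv_path="signature_assessment_export.csv"):
    scores = pd.read_csv(csv_path)

    # Aggregate report: mean rating and count per program, assessment, and criterion.
    aggregate = (scores
                 .groupby(["program", "assessment", "criterion"])["rating"]
                 .agg(["mean", "count"])
                 .round(2))

    # Disaggregate report: the same summary broken out by candidate demographic group.
    disaggregate = (scores
                    .groupby(["program", "assessment", "criterion", "demographic_group"])["rating"]
                    .agg(["mean", "count"])
                    .round(2))

    return aggregate, disaggregate

if __name__ == "__main__":
    aggregate, disaggregate = build_reports()
    print(aggregate.head())
    print(disaggregate.head())
```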

The mission of the DETP at CCGA is to prepare teachers with the knowledge, skills, and dispositions necessary to positively impact P-12 student learning. The DETP provides programs, embedded with authentic field-based application opportunities, that prepare teacher candidates for the realities of the 21st century classroom. Through a blend of traditional and innovative approaches to teacher preparation, with a focus on the learner and learning, instructional practice, and professional responsibility, graduates will enter the teaching profession poised to meet the challenges of contemporary classrooms. Candidates complete teacher education coursework and content-specific coursework in those content areas specific to the program/area of certification while engaging in structured and supervised field experiences in the P-12 school setting. In the spirit of continuous improvement, the DETP has indicated goals related to the continued development and refinement of the quality assurance system through the unit assessment plan (see appendix L) to ensure provider quality assurance and continuous improvement.
