

Institutional Report Addendum 2013


School of Education Unit

Response to BOE Offsite Report

Continuous Improvement Pathway

 

Nomsa Geleta, Ed.D.

Dean and Unit Head

ngeleta@edinboro.edu

 

Gwyneth Price, Ph.D.

Unit Accreditation Coordinator

gprice@edinboro.edu

  

 School of Education

310 Butterfield Hall

Scotland Road

Edinboro, PA 16444

http://www.edinboro.edu/departments/education/dean_of_ed_index.dot


 

Edinboro University appreciates the Feedback Report of the BOE Offsite Team detailing the areas of concern and requests for evidence. The following response provides the additional information requested in the Feedback Report. For each standard, this addendum includes responses to comments made within the Preliminary Findings, where applicable; responses to the areas of concern related to continuing to meet the standard; and the additional explanation and evidence needed to address the items listed under “Evidence for the BOE Team to validate during the onsite visit.”

Standard 1

Preliminary Findings

BOE Comment:  “Programs recognized with conditions, Science and Mathematics have made significant changes to assessments and curriculum to address the recommendations made by SPA reviewers. These programs have resubmitted reports to additional review. Health and Physical Education (HPE) are currently not recognized and have created a new assessment process with improved assessment instruments.”

Response: Mathematics submitted a Response to Conditions report in March and was Fully Recognized in August 2013. Science was unable to meet all the conditions set forth by the SPA before the last submission under the old standards; Science will submit in September using Option A under the 2012 standards. HPE has made significant changes to its assessment process and has collected a significant amount of data. These changes include the addition of the Physical Education and Health Profile, encompassing the Test of Gross Motor Development and a game performance assessment. This assessment, given twice during the program, will ensure candidates have the physical abilities necessary to be successful in the classroom. Additionally, the SPA report now integrates the Instructional Assessment Plan as a major assessment of the ability to plan and of effect on student learning. This comprehensive unit plan, completed during the capstone experience, includes a pre/post-test analysis of student learning and meets all InTASC standards as well as many of the NASPE pedagogical standards. HPE will submit using Option A in September.

BOE Comment: Beginning in summer 2012, alumni and employer satisfaction surveys were developed and administered. The raw data are presented for the various assessments, but an analysis of these data, and how these reflect on candidate preparation was not presented.

Response: An analysis of alumni, employer, and clinical faculty survey data has been completed and will be available for BOE on-site review; it can also be found in Exhibit 1.1. Data from these surveys will be shared with the Unit on September 27, 2013, as well as with our Educational Partners Advisory Council (EPAC) on September 20, 2013. The surveys will again be distributed in Fall 2013.

 

Areas of Concern related to continuing to meet the standard

No areas of concern.

 

Evidence for the BOE Team to validate during the onsite visit

1.      How are dispositions reviewed in online programs? Are these assessed and the data used?

a.     Students enrolled in advanced programs are evaluated on dispositions throughout their graduate experience. Students are informed of these expectations through a number of venues, including course syllabi, advisement, the Graduate Catalogue, the Candidacy Application, course assignments, and clinical experiences. Online dispositions are assessed concurrently with the Conceptual Framework, at the beginning and at the conclusion of each program.

Conceptually, assessing student dispositions in online courses is similar to the assessment process used in face-to-face courses. All department and program disposition policy statements are available in course syllabi. Initial assessment occurs when students introduce themselves, while formative assessment takes the form of interacting, sharing, and helping throughout the entire course. For example, students are informed that an emphasis is placed on social justice through the expectation that all voices will be honored by listening to and respecting differences. Web discussions simulate face-to-face classroom conversations, during which comments are monitored to ensure that no group or individual is excluded. Students are reminded to acknowledge cultural backgrounds and socio-economic factors within local as well as larger contexts. In research courses, for instance, generalizations are minimized by requiring that all research statements be supported with accurate and reviewed data.

Data specific to online students do not exist separately, since online student information is combined with data for all students. Available data on dispositions can be found within each SPA report, as well as in the Disposition Survey and the Conceptual Framework Reflection assessment. All programs have been challenged to use the data on dispositions more meaningfully and proactively.

b.   One example of collecting specific data on dispositions comes from the School Psychology program and can be seen in Exhibit 1.2. Further evidence provided by the Reading Program (all online except the capstone experience) can be seen in the highlighted areas of the READ 712 Rubric (Exhibit 1.3). Since most online candidates are enrolled in programs that were in the former Professional Studies Department, that department’s Disposition Policy has been provided in Exhibit 1.4. Though the department has been reorganized, the policy has followed the programs to their new departments.

2.      Will three years of aggregated and summarized data be available for all programs? A data table disaggregated by program and level would be helpful.

a.    Three years of aggregated and summarized data will be available for BOE review. All data currently reside in our online platform, Desire2Learn (D2L), and are kept as current as possible by the Office of Accreditation. Data for the Unit, as well as all available program-level data, are kept on the Accreditation page in D2L and are open to all Unit members.

b.      A data table outlining all the assessments currently used by the Unit and by each program has been created for BOE review and placed in Exhibit 1.5.

3.      What is the status of the Foreign Language program? Was it SPA reviewed and if not, are there data on the program?

a.     Both the Spanish and German Foreign Language programs have been officially placed in moratorium. Candidates currently in the programs are being supported through completion, but no further candidates are being admitted. For a variety of reasons, including faculty reductions and candidates’ inability to complete the programs in a timely fashion, these programs are no longer available (Exhibit 1.6).

b.    Due to the historically low number of candidates in these programs, the foreign language programs were never SPA reviewed; therefore, no program-specific data are available. Any candidate completing these programs participated in all Unit assessments as well as any assessment specific to the education courses in which they were enrolled.

4.   At the advanced level, how are internships assessed? Is assessment data available?

a.     All internships are assessed, and data are available for BOE on-site review. The following table indicates each type of internship and the corresponding assessment(s).

 

Program | Internship | Assessment(s)
Educational Leadership | SCHA 792 Supervisory Internship or SCHA 799 Superintendent’s Internship | Internship Final Assessment and Focus Project on Student Achievement
School Psychology | APSY 795 Internship in School Psychology | Internship Evaluation
Reading | READ 712 Reading Clinic | Clinic Performance Portfolio
Special Education | SPED 695 Graduate Internship in Special Education | Graduate Project

 

 5.     Follow up on programs that have not received national recognition, including health and physical education program. What changes have been made and what is their review status?

a.    As mentioned earlier, both Science and HPE have developed new assessment plans based on the newest national standards.  Both programs have begun the data collection process and are currently writing Initial Recognition (Option A) SPA reports to be submitted in September 2013.  Official notice of national recognition is expected in March 2014 for both programs.  The new assessment plans for each can be seen below and the full SPA reports will be available for BOE On-site review.


EU HPE NCATE/NASPE Assessment Plan

Standard # | NASPE/NCATE Type of Assessment | Name of Assessment | Form of Assessment | Administration of Assessment
1 | #1 – Licensure or other content-based assessment | Praxis II | Licensure test | Year 4 – prior to student teaching
2 | #2 – Content-based assessment | Course grades | Course grades | Years 2–4 (HPE215, HPE220, HPE222, HPE225, HPE230, HPE278, HPE300, HPE301, HPE314, HPE350, HPE360)
3 | #3 – Assessment of candidate ability to plan/implement instruction | Instructional Assessment Plan | Project – HPE 495 | Year 4
4 | #4 – Assessment of internship or clinical experience | Teacher Candidate Performance Profile (TCPP) | Supervisor and clinical faculty evaluation of candidate | Year 4 – twice during student teaching
5 | #3 – Assessment of candidate effect on student learning | Instructional Assessment Plan | Project – HPE 495 | Year 4
6 | #5 – Additional assessment that addresses AAHPERD/NASPE standards (required) | Field Experience Final Evaluation | Supervisor evaluation form – HPE 405 | Year 3 or 4 – twice (once in each placement)
2 | #6 – Additional assessment that addresses AAHPERD/NASPE standards (optional) | Physical Education and Health Profile (Test of Gross Motor Development; game performance assessment) | Instructor-completed scoring scale | Year 1, 2, 3, and/or 4 – pre-candidacy and pre-student teaching
3 | #7 – Additional assessment (optional) | Adapted Swim Lesson Plan | Project – HPE 360 | Year 2 or 3 – once in HPE 360

 

EU Science Programs NCATE/NSTA Assessment Plan

Standard # | NSTA/NCATE Type of Assessment | Name of Assessment | Form of Assessment | Administration of Assessment
1 | #1 – Licensure or other content-based assessment | Praxis II | Licensure test | Year 4 – prior to student teaching
1 | #2 – Content-based assessment | Content Analysis Forms* | NSTA course alignment forms | Programs continuously aligned with standards
2 | #3 – Content Pedagogy | Unit Plan | Unit Plan | Year 3 or 4 – during SEDU 461 or 462
3 | #4 – Learning Environments | Junior Field Interview and Showcase | Portfolio project and interview | Year 3 or 4 – end of pre-student teaching
4 | #5 – Establishing a Safe Learning Environment | Report of Supervision Form | Final supervisor evaluation form | Year 4 – end of student teaching
5 | #5 – Impact on student learning | Instructional Assessment Plan (Unit Plan) | Assessment project | Year 4 – once during student teaching
6 | #6 – Professional Knowledge and Skills | PDE 430 A | Checklist of activities | Year 4 – once during student teaching
  | #7 – Additional assessment meeting the NSTA standards | PDE 430 | State evaluation form | Year 4 – end of student teaching

*Post-baccalaureate programs to use transcript analysis.

6.      Summaries of data were available, but is there documentation of how data are analyzed by committees, programs and/or the unit?

a.    Data are available to all Unit faculty members through the Desire2Learn Accreditation page. Faculty members are invited to review data as necessary to inform instructional or programmatic changes. Official discussions related to analyzing data occur at the program, department, and Unit levels. Evidence of these discussions can be seen in the minutes of such meetings (Exhibit 1.8), as well as in the annual PAR reports (Exhibit 1.9) and the submitted SPA reports (AIMS). Though NCATE is a standing item on most department meeting agendas, department meetings are often burdened with informational agenda items and committee reports; thus, actual data discussions often occur at meetings specifically called for that purpose (Exhibit 1.7). As an example, the Early Childhood and Special Education programs participated in a “Faculty Retreat” specifically designed for the analysis of data.

Graduate programs in these areas both analyzed data prior to completing the PAR reports and then shared this information with relevant parties. In particular, during the 2011-2012 academic year, all major Reading program assignments and rubrics were revised on a course-by-course basis after reviewing data for the PAR/SLOA report. These changes were also based on anecdotal evidence and trends in the discipline. The School Psychology program is particularly effective in its use of data for program improvement. Examples include its use of internship site supervisor feedback to inform the inclusion of a Clinical Practicum in School Psychology, and its use of disposition assessment data as the impetus for a once-per-semester meeting between candidates and program faculty for early detection of issues. School Psychology’s use of data analysis for program improvement is further discussed in Exhibit 1.2.


 

Standard 2

Preliminary Findings

BOE Comment: The IR states that the assessment systems for both the initial and the advanced programs use five transition points at which data are collected from various assessments and decisions are made about candidate performance. However, the matrix provided by the unit, which describes the assessment system for the unit’s initial programs, identifies six transition points. These include: admission to the university, candidacy, clinical experience, pre-student teaching, student teaching/graduation, and first-year professional.

Response: Though “pre-student teaching” is listed in the matrix provided, it is not an official transition point in the assessment system. It is used to help those in the School of Education determine when certain assessments should occur, but “pre-student teaching” is actually a sub-category of “clinical experiences.” For clarity, the matrix has been edited (Exhibit 2.1).

 

2.3 Feedback on correcting previous areas for improvement

 The previous AFI has been removed as course assignments and rubrics are now consistently aligned to national standards.

 

2.4 Areas of Concern related to continuing to meet the standard

 1.      The assessment system is not regularly evaluated by its professional community. 

The membership of key decision-making groups does not include stakeholders from outside the unit.

 a.   Though possibly not evident through the discussions in the IR, input on pieces of the assessment system has been gleaned from both internal and external sources. Several of the CI committees include faculty members from programs outside the unit. In particular the Dispositions CIC and Diversity CIC have members from other programs on campus and these members have been integral in key discussions such as developing the SOE dispositions policy and developing diversity proficiencies.

Input from clinical faculty has been solicited for the development of individual assessments such as the Teacher Candidate Performance Profile and Instructional Assessment Plan as well as for programmatic assessments, particularly in the Early Childhood program.

In the past two years, improvements have been made that will continue to make evaluation of the assessment system by stakeholders outside the unit a more regular occurrence. The creation of the Educational Partners Advisory Council (EPAC) in fall 2011 has allowed for regular and productive conversations between the unit and key stakeholders. Placing the assessment system as a standing item on the agenda for these meetings beginning fall 2013 will allow for more targeted discussions. There is evidence, however, that such discussions have already taken place. The productive and informative meeting with EPAC members on the proposed Special Education 7-12 program led to specific inclusions in the program and will affect the assessments administered in those courses (Exhibit 2.2). Perhaps even more clearly related, EPAC members had direct input on the development and administration of the Employer Satisfaction Survey (Exhibit 2.3). This survey, a key assessment for transition point #5 based on the Conceptual Framework and program standards, will also serve as a source of input from key stakeholders on the quality of our graduates’ knowledge and skills, and can be used in conjunction with other assessment system data for program and unit improvement. Further input will continue to be obtained through the Clinical Faculty Survey, Faculty Satisfaction Survey, and Alumni Satisfaction Survey.

 2.      It is not clear how programs and the unit as a whole use data to systematically initiate and monitor changes.

  Limited evidence was provided to show that identified changes in the unit and programs were driven by analysis of data.

a.    The most convincing evidence of data analysis driving programmatic decisions can be found in SPA reports, Program Analysis Reports (PAR), and Student Learning Outcome Assessment (SLOA) reports (Exhibit 1.9). Those programs recognized through the SPA process regularly review data, with a greater emphasis placed on this analysis at the time of SPA review. Two other required processes augment this and ensure that non-SPA programs also regularly analyze data. First, within the past two years, as part of the Middle States self-study process, the University has placed a stronger emphasis on data related to Student Learning Outcomes. All programs on campus now complete SLOA reports annually for review by the University SLO Coordinator and Advisory Council. This information is then available for, and feeds directly into, the five-year review cycle for all programs in the PASSHE system. Second, the Program Analysis Reports were developed specifically to help programs in the SOE bridge the gap between the NCATE assessment process and the SLOA process. These reports require an annual review of program data based on the goals/objectives adopted by the program. All programs are asked to analyze the available data and suggest any changes or revisions warranted by that analysis. Implementation dates for such revisions are required. As programs complete this annual review, a review of implementation progress and the results of changes is also expected.

 Unit meetings held in the fall of each academic year now include a review of unit data and time for discussion of that data.  Suggestions from those meetings are then taken to the appropriate CIC meeting for review and implementation.

At a recent meeting of the ACC, it was decided to direct each CIC to hold one meeting per semester focused on data-informed decision making (Exhibit 2.4). Any data not available through the D2L Accreditation page that CICs wish to review should be requested through the Accreditation Office at least one month in advance of the meeting date. A biannual report will then be filed with the Accreditation Office delineating the findings of the meeting. A template is to be developed to aid in the completion of these reports. Results of this analysis and any suggestions for change can then be presented at the next Unit meeting for discussion if necessary.

 

2.5 Evidence for the BOE Team to validate during the onsite visit.

 1.      How are the members of the professional community involved in the development and review of the unit assessment system?

a.    Please see the response to Area of Concern #1 above. In brief, input on the assessment system has been gathered from both internal and external sources, including the Dispositions and Diversity CICs, clinical faculty, and the Educational Partners Advisory Council (EPAC); supporting documentation appears in Exhibits 2.2 and 2.3.


 2.      Documentation of the data analysis that resulted in the continuous improvement changes identified by the unit.

a.    Documentation of data analysis can be seen in the PARs and SPA reports.  The most current PARs can be found in Exhibit 1.9.  Other documentation can be seen in the minutes of Unit and Program meetings (Exhibits 1.7 and 1.8).

3.      Documentation of candidate complaints and their resolution. Is there a formal complaint process for advanced candidates as well as initial?

a.    The student concern resolution process is posted on the School of Education website and applies to both advanced and initial candidates. The process, developed to serve all candidates, has to date resolved the concerns of two (2) initial candidates and four (4) advanced candidates. Both the policy and the current spreadsheet documenting the resolutions can be seen in Exhibit 2.5.

 4.      If candidates are not successful in transitioning throughout the program, how are they advised? What is the process for working with candidates who are not successful?

a.    The Academic Success Center (ASC) was developed by the University so that early interventions could occur with all EU students struggling academically. Professors in all programs, including those in the School of Education, can use the online referral system to report any concerns and alert the ASC to students who may need help (Exhibit 2.6).

The First Year Experience (FYE) program was designed by the University to improve retention rates by aiding freshmen in their transition to the college level. All freshmen are required to attend an FYE seminar, and seminars are assigned by major; thus, there is an FYE seminar for Early Childhood and Special Education, one for Middle & Secondary Education, and one for Health & Physical Education. These sessions are used not only to introduce freshmen to others in their majors but also to pass along important information necessary for success in the program and to discuss topics such as study skills and time management.

Candidates are required to meet with their advisors each semester to discuss their progress and future scheduling.  For candidates that are not successfully completing their program of study, advisors discuss options for retaking classes, changing concentrations, or changing majors.

Resources are available for candidates struggling with the standardized exams required for candidacy or certification. A tutor is provided through the SOE specifically for the PAPA and Praxis II, and “test prep seminars” are held by both the Middle & Secondary Education and Early Childhood Departments. In addition, a PAPA preparatory course, taught by EU faculty, is offered through EU’s partnership with Butler Community College (Exhibit 2.7).

Chairs are made aware of candidates struggling academically.  Chairs can then make advisors aware of these candidates so that contact can be made. Remediation plans are often developed to aid in furthering the growth of the candidate.  These plans may include additional field experiences, extended field experiences, small group or individual meetings about course content, or additional readings or research projects. If the struggle persists or a remediation plan is not completed successfully, the advisor or possibly the Chair will counsel the candidate out of the program. Many candidates are referred to career counseling for help with these decisions. Any candidate being counseled out of the SOE meets with the Dean’s Office to review all the concerns and options.

 5.      What is the status of the Assessment CIC with department chairs regarding fairness and accuracy?

a.    Preliminary discussions have taken place within the Assessment CIC about the “elimination of bias” for all assessments.  Ideas for implementing the process will be developed late in fall 2013 after the Assessment CIC has met and the Unit meeting has occurred.  The Chair of the Assessment CIC is charged with researching and bringing samples of past, successful reliability and validity studies that would be appropriate to adopt.  One unit assessment will be chosen for discussion at the fall Unit meeting and this discussion will include the analysis of data as well as the review of the appropriateness of the instructions and rubric. The process of aligning all rubrics and assessments to the national and state standards will continue to ensure fairness and accuracy.

 

Standard 3

Preliminary Findings

 BOE Comment: The unit created two formal collaborative advisory groups that have provided much needed insight and feedback. The Educational Partners Advisory Council (EPAC) is made up of superintendents and administrators from area school districts. Feedback from this group has led to changes in the offerings of the unit. The ECSD Academic Advisory Group consists of four higher education institutions working together to increase opportunities for P-12 candidates.

Response:  Though it is true that EU created EPAC to better serve and gain insight from our P-12 school partners, the ECSD Academic Advisory Group was created by ECSD itself.  EU gladly participates in this collaboration, working together with other institutions to provide opportunities for candidates and P-12 students.

BOE Comment: Programs are responsible for coordinating their own early stage field experiences for their initial candidates. It is unclear if these experiences are tracked only by the individual programs or if the OCST keeps track of them as well.

Response: Though all programs have been responsible for tracking early stage experiences, we are currently in the middle of a transition that will move this responsibility to the Office of Certification and Student Teaching (OCST). OCST is currently making and tracking field placements for the Special Education program and will add the Early Childhood program in fall 2013. By spring 2014, OCST will be responsible for Middle & Secondary Education field placements and will meet with Health and Physical Education to determine their needs. Discussions with the Art and Music Education programs will also occur in 2014. To this point, the department Assistant Chair or designated faculty have kept records of placements (Exhibit 3.1).

In addition, Stage 1 and 2 field experiences (observations and exploratory experiences) are now tracked through successful completion of SPED 210 and SPED 370. Each course, required for all candidates, has an integrated assignment in which candidates log the school location, cooperating teacher, assignments completed, and hours of observation or small-group teaching. These assignments are loaded into Livetext for each course (Exhibit 3.2) and are designed as a running log for candidates’ entire time at EU in their given program. Further, when it is time to make student teaching placements, candidates are required to state on the application where their Stage 3 (pre-student teaching) placements occurred (Exhibit 3.10). The Director then makes student teaching placements based on this information to ensure diversity of placements.

BOE Comment: Field and capstone experiences for each advanced program are embedded to be applicable to their fields of study. They are different durations and have different requirements based on individual SPAs. The field experiences emphasize hands-on real world involvement, many times in their home school, working with practicing professionals. It is unclear how these experiences are assigned and/or tracked for consistency or to satisfy the diversity of placements requirement. Because these field experiences for advanced teachers are often completed in the candidate’s own classroom, a diverse placement can be difficult.

Response: Field experiences that are incorporated into course work are approved and tracked by the faculty member teaching the course. Internships are approved and tracked by the program head. For more on this topic, please see #3 below.

 BOE Comment: The training and evaluation of clinical faculty is not as systematic as the unit would like it to be. They have begun to address these concerns and will be developing a plan.

Response: This is a topic of conversation that will be on the agenda for the Clinical Experiences CIC this academic year. The OCST continues to provide training for clinical faculty on campus (Exhibit 3.3), which includes a discussion of NCATE and PDE standards/requirements as well as the assessments administered through Livetext. Evaluation of clinical faculty is complicated by the collective bargaining agreements (CBA) under which such faculty are hired; evaluation is completed per the CBA and is the responsibility of the P-12 school administration. General information may be extracted from the Clinical Faculty Survey (Exhibit 3.4) as well as the Student Teacher Survey now administered at the final practicum (Exhibit 3.5).

 

Areas of Concern related to continuing to meet the standard

No areas of concern.

 

Evidence for the BOE Team to validate during the onsite visit

1.      Documentation of placements with the DOD partnership and with the partnership created with schools in China. To what extent are these programs used? What is the success rate for these programs? How is it known whether the training and exit criteria are met?

a.      EU partners with the DOD to place candidates in international locations. Due to DOD procedures, only two candidates requesting a DOD placement in the past five years have been successful in securing a spot. A candidate will not be placed in a DOD school until housing and other considerations have been secured, and this requirement seems to preclude many of the opportunities.

b.   EU’s partnership with China lasted only two years. With changes in administration and financial status, the partnership was not able to continue. During those two years, the program was run as an exchange: EU sent eight candidates to schools in China, while our partners sent approximately the same number of students to EU as visiting scholars. Candidates enrolled in the China experience completed the first half of their student teaching at a local school district; completion of the student teaching experience in China was dependent on successful completion of the first placement (Exhibit 3.6).

c.     In both cases described above, candidates must complete the same requirements as candidates completing their student teaching locally. All unit and program assessments must be completed by the candidates. Supervisors complete all observations via Skype or video, and the cooperating teacher completes all of the evaluations as well.

d.     EU has initiated a new partnership to provide candidates with additional opportunities to gain international experience (Exhibit 3.7). In collaboration with Slippery Rock University, a sister PASSHE institution, an international student teaching experience will be offered to candidates as an option for half of their capstone course. Through this program, EU candidates will complete all necessary assessments and certification requirements in a PA placement and then travel to either Ireland (fall semester) or Mexico (spring semester) to teach in an international setting. All meetings and supervision will be completed via video. This collaboration began in the summer of 2013, and the intention is to have the first candidates participate in spring 2014.

2.     Stage 1-3 field experience tracking. Is this only kept track of by the individual programs? Is this information shared with the OCST prior to their making the Stage 4 student teaching placements? How is the diversity of placements maintained?

 

a.     As delineated in the second response above, field experiences incorporated into course work are tracked by the faculty member teaching the course. In addition, Stage 1 and 2 field experiences (observations and exploratory experiences) are now tracked through successful completion of SPED 210 and SPED 370. Each course, required for all initial program candidates, has an integrated assignment in which candidates log their school location, cooperating teacher, assignments completed, and hours of observation or small-group teaching. These assignments are loaded into Livetext for each course (Exhibit 3.2) and are designed as a running log of each candidate’s field experiences for their entire time at EU. Further, when it is time to make student teaching placements, candidates are required to state the locations of their Stage 3 (pre-student teaching) experiences on the application. The Director then makes student teaching placements based on this information to ensure diversity of placements.

3.      Placement of advanced candidates in settings with P-12 students from diverse populations. What are the requirements at the advanced level for working in settings with diverse students? What process is used to track these placements? Do all advanced candidates have these experiences?

a.      All advanced programs strive to incorporate diverse opportunities into courses and field experiences as much as logistics allow. Though not all experiences are ethnically/racially diverse, there are always differences among the population that are thought-provoking, spur meaningful discussion, and necessitate thoughtful decision making. Several of the advanced programs, including the Masters in Early Childhood, Masters in Special Education, and Masters in Middle & Secondary Education, require that the candidate already be certified to teach. Thus, most of these candidates have already participated in diverse experiences during initial certification. That said, when the placement is controlled by the program, an effort is made to find a diverse placement.

 

Two programs, in particular, have focused on the question of ensuring diverse experiences for their advanced candidates even though all courses are delivered online and candidates enroll from around the country. Due to this logistical restriction, controlling the placement of the candidates for all field experiences is not possible.  These programs have addressed this question in the following ways:

 

b.      Educational Leadership: Experiences provided for candidates within the program include working with diverse populations within the school district. Candidates for Principal K-12 certification must spend a minimum of 180 hours during a culminating internship in a K-6 elementary building and a separate internship in a 7-12 secondary building. While many candidates conduct one of their internships in their assigned building, most conduct the second internship in an unfamiliar building. Candidates for Superintendent and/or Supervisor of Special Education certification must conduct their internships with a focus on a K-12 district perspective, which takes candidates into buildings throughout the entire school district.

 

The Principal-Supervisor Final Assessment (as seen in Exhibit 3.8) displays instances where diversity is assessed. For example, Standard 4.2: Respond to Community Interests and Needs and Standard 6.1: Understand the Larger Context both incorporate and assess diversity.

 

Numerous activities woven into course work require students to assess, interview, and summarize their findings. For example, in SCHA 731, School and Community Relations, Activity 4B (External Publics) asks candidates to define "external publics" and indicate why it is important to develop good "external community relations." They are instructed to “Comment on how you as an educational leader would promote good communication with and among each of the following external publics: 1. Parents, 2. Older Adults, 3. General Community Groups, 4. Diverse Cultures, 5. Critics”. Likewise, students are asked to define "internal publics" and indicate why it is important to develop good "internal community relations." They are instructed to “Comment on how you as an educational leader would promote good communication with and among each of the following internal publics: 1. School Board, 2. Administration, 3. Teachers, 4. Non-Instructional Personnel, 5. Pupils”. (See Exhibit 3.9 – Representative Samples of External and Internal Publics.)

 

c.        Reading: All advanced candidates in the Graduate Reading Program have a diverse placement through their participation in READ 712 Reading Clinic, a required course for both the masters and certification programs. Clinics are organized by the Program faculty, and all candidates work in designated clinic sites with participating P-12 students.

 

In 2013, designated clinic sites were as follows:

Migrant Education Program at the Bayfront Center (all participating students were ELLs)

Cambridge Springs Elementary School (designated as “Rural Distant” by CCD)

Saegertown Elementary School (designated as “Rural Fringe” by CCD)

Wattsburg Area Elementary Center (designated as “Rural Fringe” by CCD)

James W. Parker Middle School (designated as “Rural Fringe” by CCD)

 4.      Clarification of the OCST director and the director of field experiences and student teaching. Are these the same people? What are their roles?

a.   The OCST is the Office of Certification and Student Teaching. Though the name has changed several times, the responsibilities have always included the placement and oversight of all student teachers, university supervisors, and cooperating teachers. The OCST is also responsible for all certification processing, though the Dean of the School of Education is the Certification Officer. As discussed previously, the OCST will now also be responsible for the placement and tracking of field experiences.

b.     The director of the OCST has the title Director of Field Experiences and Student Teaching.

 

 5.      What did the unit learn from the evaluation of PDSs?

a.     The analysis for the 2011-2012 school year from Roosevelt indicated that the site scored either beginning or developing in each category of the PDS rubric. In discussions, it was suggested that faculty/administrative turnover in the school, as well as turnover in PDS personnel from the university, resulted in the inability to score “at standard” and presents ongoing challenges for sustainability. The limited time and financial support available from both sides of the partnership hinder the full work of the PDS as compared to past years. Though a comparison study was not completed in 2012-2013, the work this past year at the Roosevelt site was aimed at increasing experiences and practice that are inquiry-based and focused on learning. A course focusing on middle level instruction was offered, and a year-long professional development grant focused on improving student writing was completed.

b.    Evaluations of all sites were not completed for the 2012-2013 school year. However, conversations between the district and Unit leadership have begun to plan the next stages of the PDS relationship.

c.     Even though financial challenges exist, plans for extending and improving PDS partnerships and coordination have been solicited from those involved.

 

6.      Will the agreement with the Perseus House continue?

a.    Though Perseus House is not a PDS, it has been a wonderful partnership exposing our candidates to a unique, diverse educational environment. To this point, grant funding has allowed EU faculty and PHCS faculty to interact consistently and participate in professional development opportunities. Though the funding has now expired, it is anticipated that the partnership forged will stay strong, allowing for many field and student teaching placements and continued professional development opportunities. As a result of the grant, SEDU 102, a college-level study skills course, has been taught on site for PHCS students. This offering is scheduled again for spring 2014.

 

7.      According to the IR, the Steering Committee for the PDS’s was supposed to begin meeting again in the spring 2013. Did this take place? What was the assessment of the current structure and has a plan been put into place for the future?

a.     Though site meetings continued, Steering Committee meetings did not occur due to time constraints. The co-directors met with the Associate Dean in spring 2013, and it was decided to reflect upon the potential for sustainability in conversations with partner leaders. One co-director provided the Dean’s office with a plan involving additional release time and support in order to maintain current activities and add other necessary components, such as Steering Committee meetings.

b.    In spring 2013, the Director of Field and Student Teaching and one PDS co-director met with the Executive Director of Human Resources for the City of Erie School District. The discussion focused on meeting with the new administration, given the large number of changes within the district, and on bringing one of the high schools on as a PDS site. Due to the expanding ELL population at East High School, it was suggested that this may be a good site with opportunities for candidates and faculty alike, including a potential source of candidates interested in EU’s new ESL Master’s program. Arrangements are still being made to continue these conversations.


 

Standard 4

Preliminary Findings

BOE Comment: In summer 2012, an Erie Urban Seminar was designed to replicate the Philadelphia Urban experience and to make a positive connection with a local urban district. This is the first step in developing a program that will introduce students to an urban setting early in their college experience, with the hope that they will be better prepared for field experiences and student teaching. Plans also include an urban track which will include an increased number of urban field experiences as well as specialized curriculum to address the needs of learners in an urban setting.

 

Response: Though the Erie Urban Seminar curriculum was fully developed, the course was not offered due to low enrollment. Discussions on how to make the program more attractive to candidates are currently underway. The “urban track” has not yet been developed; however, interested faculty persist in their efforts to eventually provide this area of concentration.

 

Areas of Concern related to continuing to meet the standard

 1.      The opportunity for candidates to work and interact with a diverse faculty and diverse candidates.

Rationale: Diversity demographics indicate limited numbers of diverse faculty and candidates.

The University as a whole continues to focus on increasing the diversity of its faculty and candidate population. As noted in the President’s letter to faculty (Exhibit 4.1), the 2013-2018 Strategic Goals and Objectives (Exhibit 4.2) list “recruit and support a more diverse faculty and staff,” with particular attention to evaluating the faculty and staff recruitment process, assessing the applicant evaluation process, and establishing systems to support a more diverse community. Of note, positions were offered to two diverse candidates; however, both declined the offers.

Recent discussions between the Associate VP for Enrollment Management and Student Success and the Dean of Education have focused on sharing information about program quality with prospective candidates. In addition, the Coordinator of Multicultural Programs works toward recruiting diverse students for the University.

Additionally, the CUE Equity Scorecard created by the Center for Urban Education supports the university system in its efforts to close equity gaps in access and success for underrepresented minorities (URM) and Pell grant recipients. Because four members of the Evidence Team are also on the SOE Diversity CIT, these members have posed inquiry questions to the team related to the under-representation of diverse candidates in the teacher preparation and related professions programs. These four members have been trained in the inquiry process to ask questions from an equity-minded perspective and to critically examine institutional barriers to the recruitment and retention of URM candidates, in particular African American and Latino/Hispanic candidates. We have discussed the value of a diverse teacher workforce in education and the related professions and have begun to formulate a specific plan to address our institutional gap. The percentage of URM candidates in the NCATE unit is under three percent. Our team includes the Assistant Director of Admissions and Minority Recruitment as well as the Director of the Office for Multicultural Affairs. While the plan is in the early stages of discussion, the Diversity CIT is committed to exploring unique ways of diversifying our candidates. Additionally, our Associate Dean has worked closely with the CIT members from the Admissions Office and the Coordinator for Multicultural Programs to strengthen support for diverse candidates, including securing tutoring and mentors, providing personal contact and advising, facilitating conversations between instructors and candidates, and encouraging attendance at events sponsored by the Office of Multicultural Affairs.

 

Evidence for the BOE Team to validate during the onsite visit

1.    Assessment data (CDAI and MAKSS) related to candidates’ proficiencies for helping all students learn. What systematic data are being collected regarding candidates’ diversity proficiencies?

a.     The Diversity CIT established four diversity proficiencies in the fall 2012 semester based upon a review of the Conceptual Framework, the NCATE definition of diversity, and the EU and SOE mission statements. The proficiencies were presented to the Unit at the fall 2012 Unit meeting. At that time, the co-chairs requested that programs ensure that candidates and faculty members are working toward meaningful integration of the proficiencies into coursework and assessments. During the spring 2013 Unit meeting, the co-chairs shared the concerns of the Diversity CIT and offered that the committee would be happy to work with individual programs and/or departments to support this process as the School of Education reorganizes. The proficiencies are aligned with the MAKSS and CDAI as well; thus, key questions will be used for analysis beginning with the fall 2013 semester.

Per notes from the February 14, 2013 CIC Diversity meeting: 

“…some type of sample assessment with respect to diversity proficiencies would be helpful to share at the Unit meeting so as to enable faculty to conceptualize what assessment might “look” like. For example, programs may choose to require a capstone assessment, such as a “Diversity Portfolio,” that would require candidates to reflect on their experiences and demonstrate proficiencies related to diversity.

The form for the Report of Supervision Teacher Candidate Lesson Evaluation was shared and reviewed. SOE members who use the form observed that although some items on the form could be interpreted as addressing diversity proficiencies, the element of diversity was not explicitly stated on the form. As a result, SOE members believed that the form, in its current state, cannot be consistently used to evaluate knowledge, skills and dispositions related to diversity. The Committee recommended that this form be revised so as to more explicitly address diversity proficiencies. It was determined that a working group comprised of SOE members should be formed to recommend specific revisions and develop a sample diversity assessment plan”.

 

b.      The CDAI, MAKSS, and MAKSS-T are aligned with the diversity proficiencies. This alignment can be seen in exhibit 4.3.

 

2.      The status of Special Education dual certification program in secondary education?

a.      All paperwork for the Special Education dual certification program in secondary education has been submitted to the University Wide Curriculum Committee for review (Exhibit 4.4). This review should be completed at the first full meeting of the UWCC program sub-committee in fall 2013. If passed, the program should begin implementation in spring 2014. No PDE approval is necessary prior to implementation; however, the program will be part of the PDE Major Program Review in fall 2014.

3.      Status of the proposed recruitment and retention of candidates of color to the unit teacher education preparation programs.

a.      At the February 2013 CIC meeting, the Diversity CIC made recommendations to be shared with the University Diversity Council on this matter. Specifically, it was recommended that EU place a greater emphasis on hiring diverse faculty by:

i.      Training search committees

ii.      Ensuring diverse representation on search committees

iii.      Providing prospective diverse faculty members with the names and contact information of EU faculty who could comment authentically on issues from a perspective similar to that of the candidate.

iv.      Inviting members of the UDC to participate in searches.

v.      Networking (i.e., at conferences) and building and maintaining informal, personal relationships with potential faculty members (even when an active search is not underway)

vi.      Considering reviving the Frederick Douglass initiative at EU

4.      The university has also become a lead school in Pennsylvania’s State System of Higher Education (PASSHE) Access to Success initiative and the implications of the program to the teacher education unit. Meet with leader(s) of project to discuss status.

a.    A “save the date” email has already been sent to all members of the EU Access to Success team in order to ensure that the BOE Team will be able to meet with as many members as possible during the onsite visit.

5.      How many candidates participate in Philadelphia and Erie Urban seminars?

a.     The Philadelphia Urban Seminar is a course (SEDU 300) that allows candidates to experience a diverse field placement in the schools of Philadelphia. The cost to candidates includes tuition for a 3-credit course as well as additional funds for housing, food, and transportation. Prior to summer 2012, sufficient grant money was available to greatly alleviate candidate costs. Over time, this grant money decreased to the point where, essentially, the full cost was paid by the candidate. In summer 2013, the Dean of Education at EU procured money to lower the cost to candidates. The results of these financial changes can be seen in the varying enrollment.

Candidates participating:

2009 – 43

2010 – 33

2011 – 22

2012 – (course not offered due to low enrollment)

2013 – 14

6.      Data that show how the unit ensures that all candidates have the opportunity to work with P-12 students from diverse backgrounds.

a.     Data are kept by the Office of Certification and Student Teaching to track the diversity of candidate placements. For the semesters in question, between 72% and 95% of candidates were ensured a diverse placement (see Exhibit 4.5).

 


Standard 5

Preliminary Findings

No comments requiring a response.

 

Areas of Concern related to continuing to meet the standard

No areas of concern.

 

Evidence for the BOE Team to validate during the onsite visit

 1.      What has been learned from the Clinical Faculty Survey and Results?

a.     The results of the Clinical Faculty Survey make it apparent that clinical faculty members believe EU candidates, overall, are prepared to be successful in the classroom (Exhibit 5.1). Since all questions were based on the belief statements of the Conceptual Framework, the positive responses reflect that the coursework and experiences in our programs have led to completers who demonstrate the knowledge, skills, and professional dispositions of Effective Facilitators of Learning.

According to the survey results, clinical faculty believe EU candidates are strongest in the areas of Content Knowledge & Content Pedagogy, Diversity, and Technology with 90% of clinical faculty scoring EU candidates as either Acceptable or Target in those areas.

 85% of clinical faculty rated EU candidates as Acceptable or Target in the areas of Use of Community Resources, Reflection & Self-Improvement, Standards-based Planning & Assessment, Classroom Environment, and Success of All Students.

The area requiring the most improvement, according to clinical faculty, is “giving back to the community through civic action.” Only 62% of clinical faculty rated EU candidates as Acceptable or Target in this area. It is interesting to note that this was the only question for which a significant number of survey takers chose the “N/A” option (17%). This may indicate that more opportunities to observe this behavior are necessary.

 Additionally, self-report results demonstrate that 70% of clinical faculty hold a Master’s degree and over 70% have been teaching for over 10 years. In addition to these qualifications, a large percentage (41%) are involved in leadership roles in the profession. (Exhibit 5.2)

 

2.      Assessment of faculty and assessment of candidate performance of faculty through student evaluations – how is this being done?

a.      Faculty performance is assessed according to the guidelines of the CBA (Exhibit 5.3).

b.      Student evaluations are conducted according to the CBA for all faculty members under review or those requesting evaluations.  The Student Evaluation instrument can be seen in Exhibit 5.4.

 

3.      How successful is the mentoring program working? What are the results? Are mentors evaluated by junior faculty?

a.    The official SOE mentoring policy was adopted in spring 2013; however, an SOE program for new faculty began in fall 2011. New faculty members have responded favorably to these efforts (see Exhibit 5.6 for survey results). The policy does not specify that junior faculty evaluate their mentors; however, it does allow for reassignment of the mentor or mentee at any time. As noted in the President’s letter to faculty, the new Strategic Plan for the University also includes the creation of a faculty mentoring program to allow us to “better support faculty through their first years…” (Exhibits 4.1 and 4.2)

 

4.      How are part-time/ adjuncts mentored and evaluated?

a.       All part-time faculty and adjuncts are evaluated according to the guidelines set forth in the CBA. (Exhibit 5.3)

 

5.      Clarify the number and category of faculty re: temporary, adjunct, part-time, etc.

a.       Please see Exhibit 5.5.

 


Standard 6

Preliminary Findings

No comments requiring a response.

 

Areas of Concern related to continuing to meet the standard

No areas of concern.
 

Evidence for the BOE Team to validate during the onsite visit

1.     The IR indicates that increased numbers of part-time faculty have been needed within the unit due to budgetary constraints. The exact numbers were not available as an exhibit. How many have been used by each program, and how has this impacted the faculty, curriculum, and candidates?

a.   Please see Exhibit 5.5 for clarification of the number of faculty in each hiring category.

b.   Please see Exhibit 6.1 for the percentage of faculty in each hiring category for the School of Education.

c.    As the data indicate, the School of Education has about 25% of its faculty in the part-time category and 14% in the full-time temporary category. Though full-time temporary faculty participate greatly in the activities of the SOE, due to the nature of their contracts, temporary full-time and part-time faculty members are prohibited from serving on key university and departmental committees, such as evaluation, tenure, and promotion committees. Moreover, the limited availability of part-time faculty hinders their full participation in service activities, such as serving on program revision and curriculum development committees and advising students. Their inability to fully engage in these activities creates a burden on tenure-track and full-time temporary faculty. The Unit has, nevertheless, made a concerted effort to integrate part-time faculty into the Unit through planned professional development activities in areas such as effective use of Livetext, accreditation standards and the continuous improvement model, and state guidelines and initiatives.

 

2.      Meet with task force that is discussing future reorganization within the unit.

a.    The School of Education Reorganization Task Force met during the spring 2013 semester and completed its work in May 2013.  The newly formed departments were announced to the University on June 3, 2013 and took effect immediately.  The new departments now include the following:

i.      Counseling, School Psychology, and Special Education

ii.     Early Childhood and Reading

iii.     Middle & Secondary Education and Educational Leadership

iv.      Health and Physical Education

b.      Please see Exhibit 6.2 for a list of faculty involved in the Reorganization effort as well as the models considered and issues raised.
