In fall 2011, the South Carolina Campaign to Prevent Teen Pregnancy (SC Campaign), with funding from the Office of Adolescent Health, began replicating an evidence-based curriculum, It's Your Game, Keep It Real, in 12 middle schools across South Carolina. Fidelity of the curriculum was monitored through lesson fidelity logs completed by curriculum facilitators and lesson observation logs submitted by independent classroom observers. These data were monitored weekly to identify possible threats to fidelity. The SC Campaign developed an innovative model, Fidelity Through Informed Technical Assistance and Training, to react to possible fidelity threats in real time through a variety of technical assistance modalities. Fidelity Through Informed Technical Assistance and Training guided the 55 hours of technical assistance delivered by the SC Campaign during the first year of It's Your Game, Keep It Real implementation to 18 facilitators across 12 SC middle schools, and achieved 98.4% curriculum adherence and high quality-of-implementation scores.
The South Carolina Campaign to Prevent Teen Pregnancy (SC Campaign) created a real-time response strategy to correct threats to fidelity during program implementation. Use of this model resulted in high rates of implementation fidelity, a factor that has been linked to intervention success.
Implementation fidelity is often used synonymously with program integrity and has been described as the degree to which a program is delivered as designed [
]. To appropriately determine the internal validity of an intervention, it is essential to carefully study the process of how the intervention was implemented, so that outcomes may be attributed to the intervention and not to extraneous variables [
]. Multiple studies have identified specific elements associated with implementation fidelity, including adherence to an intervention, exposure or dose, quality of delivery, participant responsiveness, and program differentiation [
]. Others have described fidelity as the adherence, compliance, integrity, and faithful replication of an intervention; additional elements such as dosage, quality, participant responsiveness, reach, monitoring of control conditions, and program adaptation are described as components of the broader term of implementation [
]. The Centers for Disease Control and Prevention defines fidelity as the “faithfulness with which a curriculum or program is implemented; that is, how well the program is implemented without compromising its core components, which are essential for the program's effectiveness” [
]. Under its current funding initiative, the Office of Adolescent Health (OAH) defines “fidelity” as “maintaining the core components of the original program model.” Core components include those characteristics determined to be key ingredients related to achieving outcomes associated with the program, including what is being taught (curriculum adherence) and how well the program is being taught (quality) [
]. Furthermore, “continuous quality improvement and rapid cycle problem solving” is essential to prevent threats to fidelity from “re-emerging and reoccurring.”
In 2010, OAH released a funding opportunity to better understand how, why, and under what conditions evidence-based teen pregnancy prevention programs work. The SC Campaign received a 5-year award to replicate It's Your Game, Keep It Real (IYG), a 2-year, comprehensive, evidence-based middle school curriculum shown to delay the initiation of sex and increase positive beliefs about abstinence [
]. Replication studies are important to the field of adolescent sexual health because these studies help policy makers, funders, and program developers understand with greater accuracy what works in teen pregnancy prevention.
The IYG curriculum consists of 12 seventh-grade lessons and 12 eighth-grade lessons. Lessons in each grade include facilitator-led sessions with skills practice through role-plays, as well as individual interactive computer lessons addressing potentially more sensitive topics such as puberty, condoms, and contraceptive methods.
The IYG curriculum developers identified the following core content components, core pedagogical components, and core implementation components as characteristics that must be kept intact when the intervention is being replicated for it to produce program outcomes similar to those demonstrated in the original study [
Markham C, Peskin M, Shegog R, Tortolero S. It's Your Game, Keep It Real: An HIV, STI, and pregnancy prevention curriculum for middle schools. Field trainer's manual. Houston, TX: University of Texas Prevention Research Center; 2012.
]. The core content components of the IYG curriculum relate to what is being taught: setting personal limits, skills practice related to refusal skills, knowledge and skills building related to healthy relationships, and risk reduction practices. The primary message of the program is for students to wait until they are older to have sex and, for those students who are sexually active, to use risk reduction strategies. The core pedagogical components relate to how the content is taught: create and maintain a positive learning environment by always using the ground rules for every lesson; follow rules for parental consent set forth by the school; give clear directions for activities and model activities; and repeat messages at the beginning and end of each lesson to reinforce learning. The core implementation components relate to the logistics responsible for a conducive learning environment: all 24 lessons should be taught; lessons should be taught in the order outlined in the curriculum; lessons can be delivered according to any schedule that works best for the school (e.g., twice a week, once a week) within a 4-month period; facilitators must have completed a training of facilitators in the IYG program; activities should not be added to the IYG lessons; and computer lessons should be completed individually, not in a group format.
The SC Campaign used these core components to measure curriculum adherence and quality of implementation of IYG when replicated in 12 middle schools. When clarity was needed, the SC Campaign staff contacted the curriculum developers to ensure that any adaptations made to the curriculum would not jeopardize the core components of the program. Facilitators were asked to submit any proposed adaptations for approval before implementation. Approved adaptations included separating students in a class by gender and changing names for role-plays. Requests that were not approved included deleting the lesson on condoms and contraception and implementing the curriculum in a 2-week period (vs. up to 4 months). All other adaptations occurred during implementation without requesting the SC Campaign's approval in advance. Most of these adaptations involved skipping activities the facilitator deemed unnecessary, such as a “getting to know each other” activity, or activities the facilitator did not have time to complete, such as recapping the day's lesson before dismissal.
ETR Associates (ETR) was contracted as an independent, external evaluator for the project and assumed responsibility for the collection of performance measures, including process and outcome data. The SC Campaign project staff worked with ETR to develop strategies to measure OAH-required performance measures, including fidelity monitoring through program implementation logs and observations. Congruent with OAH's goal, the SC Campaign was interested in using implementation fidelity data to identify possible threats to implementation fidelity and respond with real-time, immediate technical assistance (TA) while implementation of IYG was ongoing, with the aim of minimizing repeated implementation errors and improving overall implementation fidelity.
Literature has shown that training should be supplemented with site-specific, customized TA [
]. However, the authors found little in the literature to illuminate what effective TA looks like. It became apparent that there was a need to describe how implementation data could be used to translate information into actions that minimize threats to fidelity. The purpose of this article is to describe a model showing how training and TA were operationalized during a replication study to increase adherence and quality of implementation. The model design was practice-informed and illustrates how to use real-time implementation data to correct potential threats to implementation fidelity through ongoing monitoring and steady communication with schools. Fidelity results from the first year of seventh-grade implementation are also presented and discussed.
Fidelity Through Informed Technical Assistance and Training model development
The SC Campaign relied on best practices and past experiences to build a multi-method process evaluation strategy to monitor and improve implementation fidelity of a teen pregnancy prevention program in 12 middle schools. Previous and current projects at the SC Campaign used Getting to Outcomes [
] to help build the capacities of organizations (including schools) to implement evidence-based programs with fidelity. Although both models stress the importance of technical assistance, they stop short of describing how and when effective TA is provided. The Fidelity Through Informed Technical Assistance and Training (FITT) model includes both non-training activities (e.g., site visits, e-mails) and training activities (e.g., webinars, trainings) as appropriate TA responses. However, for the purpose of this article, only the TA effort in non-training activities is included in the results.
The SC Campaign recognized that the provision of real-time TA required access to real-time implementation data. The tools used to collect the data were developed in collaboration with ETR, the SC Campaign's external evaluator, according to OAH guidelines. From previous projects, ETR already possessed an online database to store implementation data entered from facilitators and observers. ETR added to this database to meet the needs of the IYG project.
Although timely implementation data presented new opportunities, the availability of these data posed new challenges and required considerable planning in advance of implementation. Because the SC Campaign had not had access to timely fidelity data in previous projects, the organization had to plan how to manage, review, and use the data. Rather than respond immediately to all threats to fidelity, large and small, the SC Campaign developed the FITT model to create a flexible and responsive approach to potential threats to fidelity. Although the FITT model evolved during the first year of implementation, the early framework of the model emerged before implementation to manage the flow of implementation data and make timely TA responses possible.
Before implementation, the SC Campaign brainstormed scenarios to develop a common understanding of which threats to fidelity required an immediate response, and which could be handled at a future date. The magnitude of the threat dictated the nature of the TA response.
For example, if a facilitator indicated he or she had a challenge with a particular lesson that resulted in skipping or modifying activities, and the facilitator was planning to implement the same lesson in the near future, the SC Campaign would attempt to contact the facilitator as soon as possible to intervene before he or she implemented the lesson again. However, if a facilitator indicated minor issues, such as with classroom management, but it did not seem to affect fidelity, the SC Campaign might not have reached out immediately unless specific assistance was requested by the facilitator. Although completion of fidelity logs was required for facilitators, the SC Campaign recognized that responding immediately to all threats could be burdensome for facilitators and result in resentment or lack of candor in future reporting.
The SC Campaign dedicated time at least weekly and often twice weekly to review the implementation and observation logs. After review of the implementation data, the project coordinator would track whether the facilitator had entered the adherence data within 2 days of conducting the lesson, notify TA specialists of any threats identified, and enter the TA issues into a shared internal database built in FileMaker Pro (Filemaker Inc., Santa Clara, CA) to maintain a record of issues and their resolution. This review process required several hours a week.
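The 2-day entry-window check described above can be sketched in code. This is a minimal illustration, not the project's actual FileMaker Pro tracking system; the record fields (`facilitator`, `lesson`, `lesson_date`, `entered_on`) and the sample data are invented for the example.

```python
from datetime import date, timedelta

# Hypothetical weekly-review sketch: flag implementation logs that were
# not entered within 2 days of the lesson, mirroring the tracking rule
# described in the text. Field names and sample records are assumptions.
ENTRY_WINDOW = timedelta(days=2)

def flag_late_logs(logs):
    """Return (facilitator, lesson) pairs whose logs missed the 2-day window."""
    late = []
    for log in logs:
        if log["entered_on"] - log["lesson_date"] > ENTRY_WINDOW:
            late.append((log["facilitator"], log["lesson"]))
    return late

logs = [
    {"facilitator": "A", "lesson": 3,
     "lesson_date": date(2011, 10, 3), "entered_on": date(2011, 10, 4)},
    {"facilitator": "B", "lesson": 3,
     "lesson_date": date(2011, 10, 3), "entered_on": date(2011, 10, 7)},
]
print(flag_late_logs(logs))  # [('B', 3)]
```

Facilitators flagged this way would then be referred to a TA specialist, per the review process described above.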
Also before implementation, the SC Campaign determined that each implementation site would receive a set number of planning or continuous quality improvement visits each year, usually two or three per year. With prescheduled meetings in place, TA specialists could wait until these meetings took place to handle minor threats to fidelity (Figure 1).
Methods for Monitoring Implementation and Observation Data
A total of 18 facilitators across 12 middle schools participated in the first year of seventh-grade IYG implementation. Findings from a survey administered to each IYG facilitator after the first year of implementing the curriculum showed that 36% of facilitators reported no prior experience teaching sex education, 18% had 2–3 years of experience teaching sex education, and 46% reported ≥4 years of experience teaching in the field.
Before initiating implementation, all 18 facilitators participated in a 2-day training of facilitators on the IYG seventh-grade curriculum, conducted by the curriculum developers. The training included a review of the curriculum's logic model, core components, and theoretical foundation. To support fidelity, a lesson-by-lesson review of the curriculum was also conducted, and teach-back sessions were used to build the implementation skills of the facilitators.
To ensure that the online implementation database was used properly, ETR trained SC Campaign staff, site observers, and facilitators. Facilitators provided most of the adherence data used to inform TA, and special efforts were made to ensure that facilitators provided timely, complete, and accurate data. The SC Campaign framed the fidelity logs positively as a learning tool, rather than a means to micromanage implementation. During the training of facilitators, ETR provided a 45-minute training session and a teacher's manual on how to access the online database, enter adherence data, and submit the data online. The SC Campaign provided monetary incentives to facilitators to encourage submitting the logs within 2 days of implementing the lesson. If a facilitator seemed to fall behind on the logs, the SC Campaign would follow up with facilitators, and in the rare instance when necessary, would follow up with school principals to ensure curriculum adherence data were entered.
Considerable resources were invested in ensuring that the site observers were able to provide fidelity data on both curriculum adherence and quality of implementation. Observers were trained in the IYG curriculum and coached on how to complete the observation tool accurately and enter the data online. Each time new site observers were hired by ETR, SC Campaign staff would accompany them to observe a lesson, complete the observation tool, and discuss ratings. Only after there was a high level of agreement between observers and the SC Campaign were observers able to conduct observations independently.
Two instruments were developed to monitor curriculum adherence and quality of implementation: implementation logs and observation logs. Both instruments and their administration procedures were reviewed and approved by ETR Associates' Institutional Review Board Committee before their use in fall 2011.
Implementation logs from the initial IYG evaluation [
] were adapted to capture every activity within each lesson and provide space to note any problems or concerns. These IYG-specific logs were used to measure adherence. Adherence was calculated as the ratio of the number of activities completed per lesson to the total number of activities in that lesson. In addition, each log asked facilitators to (1) identify classroom management issues; (2) rate their perceived level of student engagement and lesson effectiveness; (3) identify any TA needs; and (4) provide any additional comments, especially tips for handling challenging situations (results not included here).
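The adherence ratio just described is straightforward; a minimal sketch follows. The function name and the example counts are illustrative, not taken from the study's data (the study used SPSS for its analyses).

```python
# Illustrative sketch of the adherence calculation described above:
# adherence = activities implemented with adherence (as written, or with
# an approved adaptation) / total activities in the lesson, as a percent.
def lesson_adherence(completed, total):
    """Percentage of a lesson's activities implemented with adherence."""
    return 100.0 * completed / total

# e.g., a hypothetical lesson with 8 activities, 7 implemented with adherence
print(lesson_adherence(7, 8))  # 87.5
```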
A required OAH observation tool was used to assess both curriculum adherence and the quality of implementation. The instrument for measuring curriculum adherence was the same as that used by the facilitators (described earlier). Quality of implementation was assessed using an 11-item tool that asked observers to rate the quality of implementation on 5-point Likert-type scales. Constructs measuring quality included clarity in explaining an activity (1 = not clear; 5 = very clear), timing of activities (1 = not on time; 5 = well on time), timing of presentation materials (1 = very rushed; 5 = not rushed), participant engagement (1 = little participation; 5 = active participation), and understanding of the material (1 = little understanding; 5 = good understanding). In addition, several facilitator characteristics were also assessed (1 = poor; 5 = excellent): (1) knowledge of material; (2) enthusiasm; (3) poise; (4) level of comfort with material; (5) rapport and communication with students; and (6) effectiveness in addressing questions or concerns posed by students.
Data collection and analyses
Facilitators were trained on how to successfully complete the online implementation logs in a timely manner, to measure adherence. Incentives for timely submission included $50 per facilitator for turning in 100% (12 of 12 lessons per implementation cycle) of the implementation logs within 2 days of implementation and $25 per facilitator for turning in 83% (10 of 12 lessons per implementation cycle) of the implementation logs within 2 days.
To measure the level of curriculum adherence, the mean percentage of activities completed per lesson and per lesson type (i.e., computer, skills practice, and other facilitator-led lessons) was computed for implementation logs completed by both facilitators and observers. Data were analyzed using SPSS statistical software v18 and v20 (IBM, Armonk, NY). Activities were considered adherent if implemented completely as written, or if the facilitator completed the activity with an adaptation that did not affect the program's core components and was approved by the SC Campaign, the program developer, and OAH.
Similar to the implementation log data, the observers submitted observation logs via an online database. The observation form was pilot-tested during the planning year to help facilitate inter-rater reliability. Facilitators were informed of observations in advance and worked with the observers to schedule a time that worked best. Observers conducted five joint observations to determine inter-rater reliability; the average percent agreement per lesson was 93% (range, 80%–100%). To obtain a representative sample with an emphasis on facilitator-led lessons, staff strived to observe each facilitator at least once, each computer lesson at least twice, each skills practice lesson (e.g., role-play) six times, and all other interactive lessons four times. Computer lessons were observed less often because of the assumption that they would be the easiest to facilitate.
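Percent agreement of the kind reported for the joint observations can be computed as the share of items two observers rated identically. The sketch below is a generic illustration with invented activity-level ratings, not the study's actual observation data.

```python
# Minimal percent-agreement sketch for joint observations: two observers
# each rate every activity (here, 1 = adherent, 0 = not adherent), and
# agreement is the percentage of activities rated identically.
# The rating vectors below are invented for illustration.
def percent_agreement(rater_a, rater_b):
    """Percentage of items on which two raters gave the same rating."""
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return 100.0 * matches / len(rater_a)

obs_a = [1, 1, 0, 1, 1]  # observer A's activity-level ratings for one lesson
obs_b = [1, 1, 1, 1, 1]  # observer B's ratings for the same lesson
print(percent_agreement(obs_a, obs_b))  # 80.0
```

Averaging this figure over the five joint observations would yield the per-lesson average agreement reported above. Note that simple percent agreement does not correct for chance agreement, as statistics such as Cohen's kappa do.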
For measures of implementation quality, the mean rating per lesson and lesson type were computed for each item on the observation tool. Data were analyzed using SPSS statistical software v18 and v20.
Time and effort
From weekly monitoring of implementation logs and observation logs, TA contact was made 177 times during the first year of implementation to address possible threats to curriculum adherence or quality of implementation. The project coordinator reviewed implementation and observation logs weekly and notified TA specialists of potential threats to fidelity. Examples of threats to curriculum adherence included skipped activities owing to a lack of time, low perceived importance, or classroom management problems; technology issues during computer lessons, such as an internet connection too slow to download videos; and computer lessons viewed as a class rather than individually, as designed. Examples of threats to the quality of implementation included facilitators' difficulty in answering sensitive questions or sufficiently engaging students.
Technical assistance was delivered by SC Campaign TA staff via e-mail (84%), in person (13%), or by telephone (3%) (Table 1). The 177 contacts resulted in 55 hours of TA (from August 1, 2011 through June 1, 2012). Most hours (n = 40.7 hours; 74%) were provided during prearranged meetings. The remaining TA time (n = 14.3 hours; 26%) was provided in reaction to a fidelity alert or in response to a TA request. In addition to TA specialist-driven TA, the SC Campaign created a Tips and Tricks (TaT) e-newsletter, based on a review of implementation logs, that summarized common challenges and strategies to address them and highlighted successes identified by the facilitators. Tips and Tricks created a non-threatening way to address potential threats and allowed facilitators to learn from each other. Technical assistance staff e-mailed TaT to facilitators one to two times a month, along with a personal note to discuss potential threats to fidelity if needed. Issues were considered resolved when a facilitator confirmed understanding of the problem and agreed to take corrective action; future fidelity logs were reviewed to identify whether the issue recurred.
Table 1Technical assistance effort, by method and type of technical assistance (August 1, 2011 to June 1, 2012)
The IYG curriculum was implemented 92 times during the first year. Facilitators submitted implementation logs for 1,100 of the 1,104 lessons (99.6%); these lessons included a total of 6,505 activities. Facilitators reported implementing an average of 98.4% of activities with adherence to the curriculum: 95% of activities were completed exactly as written, including any pre-approved adaptations (i.e., separating gender groups for lessons); 3% of activities were completed with modifications (i.e., splitting a lesson between two classes because of time); and 2% of activities were not completed.
Table 2 provides a breakdown of lesson-by-lesson adherence rates, first by implementation logs submitted by facilitators and second by observation logs submitted by observers. Facilitators and observers reported a high level of adherence per lesson. Among facilitators, the mean adherence rate (percentage of activities completed per lesson) was 98.4% across all lessons; among observers, it was 96.7%. Although all facilitators reported a mean adherence rate above 90% for all lessons, the adherence rate within individual lessons ranged from a low of 40% to a high of 100% of activities completed. Among observers, adherence rates ranged from a low of 66.7% to a high of 100% of activities completed. This wide range may indicate that, in specific lessons, some facilitators made multiple modifications to activities or skipped activities within the lesson. Two computer lessons (Lessons 3 and 10) had minimum adherence rates of 50% and 40%, respectively.
Table 2Mean percentage of activities implemented per lesson, as reported by facilitators and observers
Activities per Lesson, n
Mean % completed, facilitator (range)
Mean % completed, observer (range)
Lesson 1: It's Your Game … Pre-Game Show
Lesson 2: Keep It Real … Among Friends
Lesson 3: Keep It Real … Among Friends (computer)
Lesson 4: It's Your Game … Playing by Your Rules
Lesson 5: It's Your Game … Playing by Your Rules (computer)
Lesson 6: Protecting Your Rules … A Clear No (role-play)
Lesson 7: Protecting Your Rules … Alternative Actions (role-play)
Lesson 8: Know Your Body (computer)
Lesson 9: Keeping it Real … For Yourself
Lesson 10: Playing By Your Rules … Regarding Sex (computer)
Lesson 11: Protecting Your Rules … Regarding Sex (role-play)
Lesson 12: It's Your Game … Post-Game Show
Mean across all lessons with completed activity log
Beyond assessing adherence to the IYG curriculum, observation data were used to assess the quality of implementation. Of the 1,104 lessons implemented, 4.5% (n = 50) were observed by an independent observer, and all 18 facilitators (100%) were observed at least once. Table 3 shows the differences between mean scores on implementation quality across IYG lessons, categorized as computer lessons (Lessons 3, 5, 8, and 10), role-playing skills practice lessons (Lessons 6, 7, and 11), and other interactive lessons (e.g., games, journaling activities; Lessons 1, 2, 4, 9, and 12). Computer lessons were generally scored lower than the other types of lessons; however, their mean scores were still above 4.0 out of 5, with 1 indicating poor quality and 5 indicating excellent quality. Overall, implementation quality ratings suggest high-quality implementation of IYG, with an average rating of ≥4.0 on the 5-point Likert scale across all observed lessons.
Table 3Mean quality of implementation ratings by observers
Computer lessons, n (mean [range])
Other interactive lessons, n (mean [range])
Role-Play/Skills practice lessons, n (mean [range])
In general, how clear were the program implementer's explanations of activities?
1 (not clear) 5 (very clear)
13 (4.62 [3–5])
18 (4.83 [4–5])
15 (4.67 [3–5])
To what extent did the implementer keep track of time during the session and activities?
1 (not on time) 5 (well on time)
13 (4.23 [3–5])
19 (4.79 [4–5])
17 (4.65 [3–5])
To what extent did the presentation of materials seem rushed or hurried?
1 (very rushed) 5 (not rushed)
12 (4.25 [2–5])
19 (4.84 [3–5])
17 (4.59 [3–5])
To what extent did the participants appear to understand the material?
1 (little understanding) 5 (good understanding)
12 (4.58 [3–5])
19 (4.89 [4–5])
17 (4.76 [3–5])
How actively did the group members participate in discussion and activities?
1 (little participation) 5 (active participation)
12 (4.33 [1–5])
19 (4.74 [3–5])
17 (4.65 [3–5])
On the following scale, rate the implementer on the following qualities:
Knowledge of the program
1 (poor) 5 (excellent)
12 (4.42 [3–5])
18 (4.67 [3–5])
15 (4.67 [3–5])
Level of enthusiasm
1 (poor) 5 (excellent)
12 (4.25 [3–5])
19 (4.79 [3–5])
16 (4.56 [2–5])
Poise and confidence
1 (poor) 5 (excellent)
12 (4.67 [4–5])
18 (4.67 [3–5])
17 (4.59 [2–5])
Comfort level discussing related topics (e.g., reproductive anatomy, sex, condoms, contraception, teen pregnancy, sexually transmitted infections, etc.)
Although many studies note that it is difficult, and in fact unrealistic, to expect complete fidelity when implementing new innovations or programs, other research suggests that it is possible to achieve a high degree of implementation fidelity (85%) when early monitoring and timely feedback are provided [
]. To achieve high rates of fidelity, this study planned for and provided these services, as described in the FITT model, during the replication of an evidence-based middle school curriculum to prevent teen pregnancy. The FITT model described here bridges a gap in the current literature by providing strategies that can help move innovative programs from the realm of research into everyday practice.
Curriculum selection is an essential element in achieving high rates of fidelity, and the use of FITT assumes good program selection. Fidelity is influenced by how well a selected program matches the needs of youth it is intended to serve and the confines of the implementation setting [
]. The selection process drew on a list of evidence-based teen pregnancy prevention program models, an extensive needs assessment of the potential school partners, and an assessment to ensure the curriculum was consistent with the South Carolina Comprehensive Health Education Act [
] for middle schools. The selection of IYG was well received by students and facilitators, which may have supported high fidelity implementation.
Understanding the concept of fidelity and identifying the adaptive and core components of a curriculum are key to the successful use of the FITT model. Adaptive components are considered optional, such as modifying role-plays to more accurately reflect the local context, and appear to have no effect on program outcomes. Core components, as discussed earlier, are identified by the program developers and are considered critical to program success. Explicit and well-defined core components make identifying threats to fidelity easier than it is for programs in which core components are not well defined [
]. Understanding the theoretical framework of a program can also shed light on core components; where possible, program developers can be contacted and queried about the essential elements of the program.
During the first year of implementation, curriculum adherence of 98.4% was achieved as reported by the curriculum facilitators, and 96.7% as reported by the observers. Adherence within individual lessons ranged from 40% to 100% according to the facilitators. In addition, the average quality rating across all types of lessons was above 4.0 out of a maximum rating of 5.0. Although it is not possible to assert a direct relationship between the FITT model and these high levels of fidelity, defined as both adherence to activities and quality of implementation, the data created the opportunity to provide more informed and specific TA than would otherwise have been possible.
Other research argues that some adaptation is not only good, but necessary. However, in a replication study, high levels of fidelity are essential. As with any experiment, repeatability of findings should be achieved before a program or innovation can be considered effective under real-world conditions and in a variety of settings [
The FITT model was developed as a way to use real-time implementation data to inform TA and ultimately increase fidelity of an evidence-based teen pregnancy prevention curriculum. The model illustrates a variety of TA strategies that are either immediate or longer term, depending on the nature of the threat to fidelity. Using the FITT model can present challenges because three key organizational abilities need to be in place: online data monitoring to enable access to timely implementation data; highly skilled staff able to manage the flexibility of the FITT model and effectively provide TA; and sufficient resources to plan for accurate collection, implementation, and review of implementation data.
Timely and accurate implementation data are the foundation of the FITT model because they make it possible to provide customized TA in real time. Curriculum adherence data needed to be entered in a timely fashion into a system that could be easily accessed and reviewed by the project coordinator. Observation data also needed to be accessible in a relatively short turnaround time. Fortunately, ETR had the capacity and resources to build a database that allowed for online tracking of implementation adherence. Without this online database, it would not have been possible to access implementation data in a timely way. However, using an online database may not always be an option, depending on the resources available.
Adhering to the FITT model required a significant commitment of resources in both the planning and operational stages of the IYG project. To have high-quality implementation data to inform TA, the SC Campaign invested resources in planning how these data would be collected. The planning phase included training for facilitators completing the implementation logs and for observers completing the observation logs. In addition, TA specialists met with facilitators before starting the curriculum to develop an implementation plan, which included the dates each lesson would be implemented and any resources needed for each lesson. During implementation, the SC Campaign committed staff time to compare actual implementation data with planned implementation. If a facilitator fell behind the planned implementation schedule, he or she was contacted by TA staff to determine whether the lapse was a reporting problem or an implementation problem and whether any remedial action was required.
Successfully carrying out the FITT model requires having highly trained TA specialists with the skill and time to address minor and major threats effectively and sensitively. The SC Campaign has been providing capacity-building services (training and TA) for 20 years and maintains a highly skilled staff of master trainers and TA specialists. Technical assistance specialists were trained facilitators in the IYG curriculum, which may have enhanced fidelity in addition to the FITT model. The FITT model is flexible and offers multiple TA strategies ranging from immediate on-site visits to developing a training session, depending on the nature of the threat. This flexibility requires highly skilled staff, because they must balance several priorities when deciding how and when to respond to identified threats: the unique personality of the facilitator in question and his or her preferred method of contact; when the next scheduled contact with the facilitator will be; where the facilitator is in the implementation process and whether the issue will arise again in subsequent lessons; and, most important, the most effective way to address the threat. Some TA issues are best handled by offering a training opportunity because they affect many facilitators; others may require building the capacity of a specific facilitator through individual coaching or online lessons. Hiring and retaining highly skilled TA specialists also requires resources, such as competitive salaries and benefits.
The FITT model may represent a promising practice for using implementation data in real time to support fidelity, but several important limitations should be considered. The curriculum adherence data were reported by the teachers, and it is not possible to independently verify the high levels of adherence reported. Because of the project's emphasis on adherence, it is possible that some teachers were not completely candid when completing implementation logs; yet the adherence data reported in the observer logs corroborated the data reported by the teachers, which suggests high levels of adherence. However, it is possible that the presence of the observer made teachers more careful to implement all activities. Finally, it is not possible to assert a direct relationship between using the FITT model and the high levels of fidelity achieved in this project. As noted, many other factors are associated with fidelity. Future study would be needed to determine whether the FITT model has a direct and quantifiable effect on fidelity.
Planning and providing resources for the provision of TA not only can help schools and organizations reach high levels of implementation fidelity, it may also improve program success. In a meta-analysis of 52 mentoring programs, researchers found that programs with active implementation monitoring obtained effect sizes three times larger than programs with no active monitoring [
]. Further work in teen pregnancy prevention is needed to determine whether similar results can be obtained when monitoring implementation of evidence-based teen pregnancy prevention programs. For the IYG project, the evaluation cohort completed the second year of the curriculum during spring 2013. It will be assessed again in 2014 and outcome results will be available in 2015. At that point, it may be possible to better understand the relationship between high degrees of fidelity and program effect on behavior changes among students.
Past research has identified the need for a systematic monitoring and feedback system to ensure that implementation is tracked over time [
]. The findings from this study provide an example of a monitoring and feedback system, the FITT model, which shows promise for monitoring and using implementation data and observation data to achieve high rates of curriculum adherence and quality of implementation.
The authors thank Chris Rollison from the SC Campaign to Prevent Teen Pregnancy, and Karin Coyle from ETR Associates for their thoughtful review and feedback during the development of the manuscript.
This publication was made possible by Grant TP1AH000026 from the Office of Adolescent Health.
Program integrity in primary and early secondary prevention: Are implementation effects out of control?
Markham C, Peskin M, Shegog R, Tortolero S. It's Your Game, Keep It Real: An HIV, STI, and pregnancy prevention curriculum for middle schools. Field trainer's manual. Houston, TX: University of Texas Prevention Research Center; 2012.
Conflicts of Interest: The authors declare no conflicts of interest.
Disclaimer: Publication of this article was supported by the Office of Adolescent Health, U.S. Department of Health and Human Services. The opinions or views expressed in this paper are those of the authors and do not necessarily represent the official position of the Office of Adolescent Health, U.S. Department of Health and Human Services.