
Measuring compliance with the Baby-Friendly Hospital Initiative

Published online by Cambridge University Press:  30 September 2011

Laura N Haiek*
Affiliation:
Ministère de la Santé et des Services sociaux du Québec, Direction générale de la santé publique, 201 Crémazie, Montréal, Québec H2M 1L2, Canada
*Corresponding author: Email [email protected]

Abstract

Objective

The WHO/UNICEF Baby-Friendly Hospital Initiative (BFHI) is an effective strategy to increase breast-feeding exclusivity and duration but many countries have been slow to implement it. The present paper describes the development of a computer-based instrument that measures policies and practices outlined in the BFHI.

Design

The tool uses clinical staff/managers’ and pregnant women/mothers’ opinions as well as maternity unit observations to assess compliance with the BFHI's Ten Steps to Successful Breastfeeding (Ten Steps) and the International Code of Marketing of Breastmilk Substitutes (Code) by measuring the extent of implementation of two to fourteen indicators for each step and the Code. Composite scores are used to summarize results.

Setting

Examples of results from a 2007 assessment performed in nine hospitals in the province of Québec are presented to illustrate the type of information returned to individual hospitals and health authorities.

Subjects

Participants included nine to fifteen staff/managers per hospital, randomly selected among those present during the interviewer-observer's 12 h hospital visit, and nine to forty-five breast-feeding mothers per hospital, telephoned at home after being randomly selected from birth certificates.

Results

The Ten Steps Global Compliance Score for the nine hospitals varied between 2·87 and 6·51 (range 0–10, mean 5·06) whereas the Code Global Compliance Score varied between 0·58 and 1 (range 0–1, mean 0·83). Instrument development, examples of assessment results and potential applications are discussed.

Conclusions

A methodology to measure BFHI compliance may help support the implementation of this effective intervention and contribute to improved maternal and child health.

Type
Research paper
Copyright
Copyright © The Author 2011

Exclusive breast-feeding for the first 6 months of life is a powerful health-promoting behaviour not consistently adopted(1,2). Moreover, there is growing evidence that exclusive and prolonged breast-feeding improves maternal/infant health in both developing and developed countries(3–6) and promoting it may be cost-effective(7). The WHO/UNICEF Baby-Friendly Hospital Initiative (BFHI) is an effective strategy to increase breast-feeding exclusivity and duration(5) but many countries have been slow to implement it(8).

In fact, compliance with the Initiative's policies and practices, outlined in the Ten Steps to Successful Breastfeeding (Ten Steps, see Table 1) and the International Code of Marketing of Breastmilk Substitutes (Code)(8,9), requires formulating adequate policy as well as a detailed revision of pre-, peri- and postnatal services to support a change in paradigm in which the mother/baby dyad is the centre of the process of care. Once Baby-Friendly status is achieved, recertification – or monitoring compliance – after initial designation poses another challenge. For example, although breast-feeding duration may decrease with deteriorating compliance in designated facilities(10), only Switzerland reports systematically monitoring Baby-Friendly practices once a hospital has been certified for the initial period(11); other countries rely, if anything, only on recertification procedures(12). Since the introduction of more rigorous revised BFHI standards in 2006, countries also face challenges in implementing them, particularly in regard to skin-to-skin contact (immediately after birth for at least an hour, unless medically justified), rooming-in (no mother/baby separation allowed, unless justified) and in applying these standards to caesarean deliveries(8).

Table 1 The Ten Steps to successful breast-feeding

Source: WHO/UNICEF (2009) Baby-Friendly Hospital Initiative. Revised, Updated and Expanded for Integrated Care. Section 1: Background and Implementation (8).

Monitoring BFHI compliance

Several publications have reported diverse methods to measure compliance with standards promoted by the BFHI(10,11,13–30). For example, whereas most rely on surveys, Swiss authors report continuously monitoring targeted BFHI hospital practices(10,11). Also, most of these studies are designed to measure associations between BFHI exposure and breast-feeding behaviours, and to measure BFHI compliance they generally resort to only one of the information sources proposed for the official designation(8): hospital staff and/or managers(14,15,19,21–23,26,27,29) or pregnant women/mothers(16–18,25,28) cared for in the facility; only two studies used both sources(24,30) and two older studies also included observations of maternity units(13,20). No recent publication has triangulated these three data sources to analyse the biases contributed by each source. Likewise, with the exception of reports on Nicaraguan(31), Swiss(11) and CDC monitoring initiatives(15,25,32), little has been published about effective strategies or tools to convey to authorities and health-care professionals results about BFHI compliance that may help them improve practice(33).

Furthermore, no computerized assessment method or tool is available to measure – and disseminate – compliance with the updated BFHI using three information sources. WHO/UNICEF have developed a computerized tool for use in external hospital assessments for BFHI designation, and therefore with restricted access(8). It is intended to collect information on compliance with standards and does not require systematic recording of (i) information on non-compliant answers/observations or (ii) qualitative data provided by participants or noted in observations. Also, the completed tool is not returned to evaluated hospitals. For monitoring purposes, WHO/UNICEF suggest different strategies including use of a paper tool(34), but there have been no publications reporting its use or accuracy. Although an evaluation method developed in Brazil has been tested and published(35), no computerized tools are available to assess/monitor compliance with a BFHI expansion to community health centres (CHC).

The present paper describes the development of a comprehensive, bilingual, computer-based tool to collect, summarize and disseminate information on policies and practices outlined in the BFHI, intended for policy makers, public health authorities, hospital managers, physicians, nurses and other health-care professionals caring for families. The tool supports both normative and formative assessments because it not only measures compliance with evidence-based international standards but can also contribute to the planning process by giving facilities detailed feedback on which improvements are needed(34).

Tool development

In response to successive provincial policy statements prioritizing the BFHI, the public health authority of the Montérégie (the second largest socio-sanitary region in the province of Québec, Canada) developed an assessment tool in 2001 to monitor hospital compliance with BFHI indicators. In 2007, the tool was revised and renamed the BFHI-40 Assessment Tool(36). This version of the tool was used in a large study assessing BFHI compliance in sixty birthing facilities across the province of Québec, including nine hospitals in the Montérégie(37). Hence, the Montérégie has benefited from a baseline assessment in 2001 for its nine hospitals(38) (performing over 12 000 deliveries annually) and follow-up assessments in 2004 and 2007(37). Assessments were approved by the Ethics Committee of Charles LeMoyne Hospital (a university-affiliated hospital with regional mandates).

While describing the development of the tool(36), results from the 2007 assessment for one of the nine hospitals (Hospital F, 1700 annual deliveries) and for the region (mean of all Montérégie hospitals) are presented to illustrate the type of information returned to individual hospitals and regional authorities. Participants from the nine hospitals included nine to fifteen staff/managers (ten for Hospital F, a total of ninety-four) and nine to forty-five breast-feeding mothers (thirty-five for Hospital F, a total of 176). Participating staff were randomly selected among those present during the interviewer-observer's 12 h hospital visit (91 % response rate for Hospital F and 92 % for the Montérégie). Mothers were selected randomly from birth certificates and answered telephone interviews (88 % response rate for both Hospital F and the Montérégie) when their babies were on average 2 months old (73 d for Hospital F and 72 d for the Montérégie).* Six per cent of Hospital F and 15 % of Montérégie mothers delivered by caesarean section under epidural.† Lastly, observations targeted documentation and educational/promotional materials; in this particular assessment, observations of postpartum mother/baby dyads were optional.

The tool was developed through the following steps.

1. Defining indicators for the Ten Steps and the Code

Using the BFHI as a framework, one to seven ‘common’ indicators (see explanation below) were formulated to measure each step and the Code, totalling forty (referred to in the tool's name). Most of these indicators were originally defined according to the BFHI's 1992 Global Criteria(39) and were revised to comply with the 2006 criteria(8). ‘Common’ indicators are measured using one, two or three perspectives or sources of information: (i) maternity unit staff (including physicians) and managers (usually head nurses); (ii) pregnant women and mothers; and (iii) external observers. For example, as shown in Table 2, Step 4 has four ‘common’ indicators, each measured using staff/managers and mothers, resulting in eight indicators for the step (four for staff/managers and four for mothers).*

Table 2 Four ‘common’ indicators and corresponding eight indicators for Step 4 of the Baby-Friendly Hospital Initiative

Each indicator is measured with one or two questions (mostly open-ended) or observations. Questions were originally(38) developed by integrating the 1992 criteria(39), previously tested questionnaires(24,40,41) and multidisciplinary experts’ opinion. They were revised twice(36,42) to update recommendations(8) and improve content validity.

To enhance comparability, formulation of questions measuring a given policy or practice (and the corresponding indicator) is similar or identical for each perspective. Each question and observation is followed by colour-coded compliant (green) and non-compliant (yellow) categories (Fig. 1) where the interviewer summarizes answers and observations; a line intended for comments allows the interviewer/observer to record answers not listed as well as qualitative information offered by respondents.
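As a purely illustrative aid, the following minimal Python sketch shows one way such an answer record could be represented; the class, field names and example values are hypothetical and are not part of the actual Excel-based tool.

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical record mirroring one row of a data collection sheet: the
# interviewer ticks a compliant (green) or non-compliant (yellow) category
# and may add free-text comments or answers not listed.
@dataclass
class AnswerRecord:
    step: int                  # 1-10 for the Ten Steps, 0 for the Code
    indicator: str             # e.g. "timing of mother/baby contact"
    perspective: str           # "staff/manager", "mother" or "observer"
    compliant: Optional[bool]  # True = green category, False = yellow, None = unanswered
    comment: str = ""          # qualitative information offered by the respondent

# Example: a mother reports skin-to-skin contact immediately after birth.
record = AnswerRecord(step=4, indicator="timing of mother/baby contact",
                      perspective="mother", compliant=True,
                      comment="baby placed on chest right after delivery")
```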

Fig. 1 Example of a question used to measure a Step 4 indicator (mothers’ perspective)

The tool comprises questions to measure indicators not specifically addressed by the Global Criteria but that yield useful information. For example, although vaginal and caesarean deliveries have been subject to the same standards since 2006, questions for Step 4 are asked separately for each type of delivery. The tool also includes information on potential institutional- and individual-level variables.

2. Measuring indicators’ extent of implementation

The extent of implementation of a given indicator is obtained by calculating the percentage of compliant or ‘correct’ answers/observations. Figure 2 illustrates results for Step 4 indicators for the ‘example’ Montérégie hospital (Hospital F) and the regional mean.

Fig. 2 Compliance with Step 4 as measured by the extent of implementation of the step's indicators in Montérégie Hospital F and the Montérégie region, 2007

For example, three out of ten staff/managers in Hospital F report placing mother and baby in contact immediately after a delivery – vaginal or caesarean under epidural – or within the first 5 min (extent of implementation of 30 %), whereas thirty-three of thirty-five mothers delivering vaginally or by caesarean in this hospital report having benefited, unless medically justified, from this early contact with their babies (extent of implementation of 94 %). The precision of the calculated percentages varies according to the extent of implementation of the indicator and the sample sizes. Using the same example, the proportion (%) and 95 % CI of the indicator measuring timing of contact are 30 % (2 %, 58 %) for staff/managers and 94 % (86 %, 100 %) for mothers. The extent of implementation of the other indicators can be interpreted in the same manner.
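The exact interval method used by the tool is not specified here; the following sketch, assuming a simple Wald approximation, comes very close to the intervals reported above (it gives 2–58 % for staff/managers and 87–100 % for mothers, against the published 2–58 % and 86–100 %).

```python
from math import sqrt

def extent_with_ci(compliant: int, n: int, z: float = 1.96):
    """Extent of implementation (% of compliant answers) with an
    approximate 95 % Wald confidence interval, capped at 0-100 %."""
    p = compliant / n
    half_width = z * sqrt(p * (1 - p) / n)
    lower = max(0.0, p - half_width)
    upper = min(1.0, p + half_width)
    return round(100 * p), round(100 * lower), round(100 * upper)

print(extent_with_ci(3, 10))   # staff/managers: (30, 2, 58)
print(extent_with_ci(33, 35))  # mothers: (94, 87, 100); the paper reports (86 %, 100 %)
```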

Results analysed by type of delivery show staff/managers and mothers are more likely to report early and prolonged ‘true’ skin-to-skin contact for vaginal deliveries than for caesarean section (C-section; Fig. 3). They also show a consistently low dissimilarity index* (i.e. 15 % or under) between staff/managers and mothers for all indicators related to vaginal deliveries. Conversely, the dissimilarity between sources is high (i.e. greater than 15 %) for the indicators measuring quality of skin-to-skin contact for caesarean deliveries.
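The dissimilarity index is defined in the footnote as the share of cases that would have to be reallocated to make two percentage distributions equal; for a binary (compliant/non-compliant) indicator this reduces to the absolute gap between the two sources' compliance percentages. A minimal sketch, using the staff/managers versus mothers figures from the combined analysis above as an example:

```python
def dissimilarity_index(dist_a, dist_b):
    """Index of dissimilarity between two percentage distributions
    (each summing to 100): half the sum of absolute differences."""
    return 0.5 * sum(abs(a - b) for a, b in zip(dist_a, dist_b))

# Compliant/non-compliant split for the timing-of-contact indicator,
# staff/managers (30 % compliant) vs mothers (94 % compliant):
staff = [30, 70]
mothers = [94, 6]
print(dissimilarity_index(staff, mothers))  # 64.0, i.e. 'high' (> 15 %)
```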

Fig. 3 Compliance with Step 4 as measured by the extent of implementation of the step's indicators in Montérégie Hospital F and the Montérégie region, by type of delivery, 2007

This example illustrates how collecting separate data may help to interpret, in this case, the updated skin-to-skin standards. Thus, the higher compliance with Indicator 1 reported by mothers in the combined analysis (Fig. 2) can be explained by the fact that 85 % of them delivered vaginally and report their favourable experience with this type of delivery (the majority of vaginal birth experiences in the sample drives overall compliance up), whereas staff had to report compliant practices for both vaginal and caesarean deliveries (their report on caesareans drives overall compliance down).
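A small worked example may make the weighting explicit. The 85 %/15 % delivery mix is taken from the Hospital F mothers' sample described above, but the stratum-specific compliance values below are hypothetical, chosen only to illustrate how a favourable vaginal-delivery experience dominates the combined figure.

```python
# Hypothetical stratum-specific compliance for mothers (assumed values);
# only the 85 % / 15 % split between delivery types comes from the sample.
share_vaginal, share_caesarean = 0.85, 0.15
compliance_vaginal, compliance_caesarean = 0.97, 0.60

combined = (share_vaginal * compliance_vaginal
            + share_caesarean * compliance_caesarean)
print(f"combined compliance reported by mothers: {combined:.0%}")  # 91%
```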

3. Establishing compliance

To consider an indicator completely implemented, its extent of implementation has to attain a pre-established threshold. The tool is prepared to assess compliance with an 80 % threshold (for example, at least 80 % of mothers report they were put in skin-to-skin contact with their baby according to a compliant definition).* To summarize compliance, composite scores were constructed for each step (partial scores) and for all Ten Steps and the Code (global scores).

Step Partial Compliance Score

To build the Partial Compliance Score for each of the Ten Steps, a value of 0 or 1 point is attributed to each indicator based on its extent of implementation. Hence, a value of 0 denotes that the indicator is not completely implemented (i.e. its extent of implementation does not reach the threshold) and a value of 1 denotes that the indicator is completely implemented (i.e. its extent of implementation reaches the threshold). The score is then calculated by adding the points attributed to the indicators of the particular step and dividing by the maximum number of points that would be accumulated if all the step's indicators were completely implemented, resulting in a score that varies between 0 and 1. For example, a Step 4 Partial Compliance Score of 0 indicates that none of the eight indicators measuring step compliance are completely implemented; a score of 0·25, that two of the eight indicators are completely implemented; and a score of 1, that all eight indicators are completely implemented. Figure 4 shows Partial Compliance Scores calculated with an 80 % threshold for each step for Hospital F and the Montérégie.
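As an illustration of the scoring rule, the following sketch computes a Step Partial Compliance Score from a list of indicator extents of implementation; the eight extents used here are hypothetical, not Hospital F's actual values.

```python
def partial_compliance_score(extents, threshold=80):
    """Step Partial Compliance Score: the share of the step's indicators
    whose extent of implementation (in %) reaches the threshold (0-1)."""
    points = [1 if extent >= threshold else 0 for extent in extents]
    return sum(points) / len(points)

# Hypothetical Step 4 example: eight indicators (four measured from
# staff/managers and four from mothers).
step4_extents = [30, 94, 55, 85, 70, 92, 40, 88]
print(partial_compliance_score(step4_extents))  # 0.5 -> four of eight completely implemented
```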

Fig. 4 Partial Compliance Scores for each of the Ten Steps for Hospital F and the Montérégie region, 2007

Ten Steps and Code Global Compliance Scores

The Ten Steps Global Compliance Score is obtained by adding the individual steps’ Partial Compliance Scores and, therefore, varies between 0 and 10. The Code Global Compliance Score is calculated in the same way as a Step Partial Compliance Score and also ranges between 0 and 1.
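Continuing the sketch above with hypothetical values, the global scores follow directly:

```python
# Hypothetical Partial Compliance Scores for the Ten Steps (each 0-1);
# the Ten Steps Global Compliance Score is simply their sum (0-10).
partial_scores = [1.0, 0.75, 0.5, 0.25, 0.6, 0.4, 1.0, 0.5, 0.33, 0.2]
ten_steps_global = sum(partial_scores)

# The Code Global Compliance Score is built like a partial score, from the
# extents of implementation (%) of the Code indicators (hypothetical here).
code_extents = [90, 85, 100, 70]
code_global = sum(e >= 80 for e in code_extents) / len(code_extents)

print(round(ten_steps_global, 2), code_global)  # 5.53 0.75
```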

Figure 5 shows 2007 global scores for the Montérégie hospitals. The Ten Steps Global Compliance Score varied among hospitals between 2·87 and 6·51 (regional mean of 5·06) whereas the Code Global Compliance Score varied between 0·58 and 1 (regional mean of 0·83). Because of the way the score is constructed, its validity in measuring a hospital's true BFHI compliance depends on the precision of each indicator's extent of implementation. As explained above, this precision varies with sample size and with the variability with which the measured policy or practice is implemented or reported.

Fig. 5 Global Compliance Scores for the Ten Steps and the Code, Montérégie region hospitals, 2007

4. Monitoring compliance

Figure 6 shows the evolution of the Global Compliance Score for the Montérégie hospitals. To assure comparability between assessments, 2007 scores were recalculated using the same methodology as in 2001 and 2004 (based on the 1992 Global Criteria and a slightly different attribution of points). This results in an increase of the Ten Steps Global Compliance Score from 6·18 to 7·33 for Hospital F and from 5·06 to 5·79 for the regional mean. It can be noted that eight of nine hospitals increased both the Ten Steps and Code Global Compliance Scores over time, frequently dramatically (as documented for Hospital F). Examples of actions undertaken at the regional level to address gaps in compliance with Step 4 and other BFHI practices include adapting training materials and using a regional collaborative approach to discuss challenges identified in the assessments, share strategies and invite champions to present creative solutions. In the case of Hospital F, concrete actions in regard to Step 4 were taken only in 2004, after managers and clinicians forming a breast-feeding committee used monitoring results from the first two assessments to identify areas needing improvement. Changes were introduced gradually, involving all maternity unit staff and aimed at improving nursing competency. These efforts resulted in a doubling of the Step 4 Partial Compliance Score between 2001–2004 and 2007 and were instrumental in obtaining BFHI certification before 2007 (personal communication from Hospital F's head nurse and breast-feeding coordinator).

Fig. 6 Global Compliance Scores for the Ten Steps and the Code based on the 1992 BFHI Global Criteria, Montérégie region hospitals, evolution between 2001 and 2007

5. Computerizing assessment tools

In order to avoid the time-consuming tasks involved in analysing and reporting assessment results, a computerized tool combining questionnaires and an observation grid for data collection, computations for data analysis and dissemination tables was developed in 2004(42). The tool was adapted in 2005 to measure Baby-Friendly compliance in the region's nineteen CHC offering pre- and postnatal services. The latest adaptations, for a Québec-wide measurement in sixty birthing facilities and 147 CHC, are the tool's two most recent bilingual versions(36,43). Free copies of the tools are available from the author and may be used under certain copyright and copyleft(44) conditions.

All versions of the tools are available as an Excel file. For example, the BFHI-40 Assessment Tool has fifteen spreadsheets: three introductory sheets, three data collection sheets (one per perspective) and nine others summarizing results. The data collection sheets (rendered fail-safe with several program features) have assigned cells to enter answers/observations, ideally completed by an interviewer using a computer. This procedure results in prompt data computations and graphic representations, performed as data are entered.
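In the tool itself these computations are carried out by built-in Excel formulas as data are entered. Purely for illustration, an equivalent post-hoc tabulation could be scripted as below; the file name, sheet names and two-column layout (indicator label, 1/0 compliance flag) are all assumptions, not the BFHI-40 workbook's actual structure.

```python
from collections import defaultdict
from openpyxl import load_workbook  # third-party package: openpyxl

wb = load_workbook("bfhi40_assessment.xlsx")  # hypothetical file name
counts = defaultdict(lambda: [0, 0])          # indicator -> [compliant, total]

# Hypothetical layout: one answer per row, column A = indicator label,
# column B = 1 (compliant) or 0 (non-compliant), first row = headers.
for sheet_name in ("StaffManagers", "Mothers", "Observations"):
    for indicator, compliant in wb[sheet_name].iter_rows(
            min_row=2, max_col=2, values_only=True):
        if compliant is None:
            continue  # skip unanswered rows
        counts[indicator][0] += int(compliant)
        counts[indicator][1] += 1

for indicator, (ok, total) in counts.items():
    print(f"{indicator}: extent of implementation = {100 * ok / total:.0f} %")
```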

Strengths and limitations of the methodology

The methodology's main strength is that it measures compliance based on different sources of information, thus allowing an analysis by triangulation. This is relevant because results are likely to be biased when relying on only one source. Obtaining reports from multiple professionals at each facility and comparing them with maternal answers to similar questions(26) and with observations results in a more valid depiction of BFHI compliance(18,26). Simple descriptive statistics such as a correlation analysis or dissimilarity index can be used to explore how the sources differ. For example, in the 2007 assessment the largest dissimilarities were between staff/managers and mothers in reports of pre- and postnatal counselling frequency and quality (Steps 3, 5 and 8): compared with mothers, staff consistently overestimate compliance, although mothers’ reports may themselves be subject to recall bias. In contrast, the lowest dissimilarity between sources lies with indicators assessing policies (Step 1 and the Code) and postpartum follow-up (Step 10), which use observers as one of the sources. In fact, in this particular assessment, observations consistently confirmed staff/managers’ reports. As illustrated above for skin-to-skin contact, reports on hospital routines (Steps 4, 6, 7 and 9) need particular interpretation depending on the samples used (e.g. percentage of caesarean deliveries). Ultimately, for an individual hospital, discrepancies between sources need to be interpreted taking into consideration their particular context, sampling procedures, potential biases and, if available, reference data (such as means for a whole country, region or state/province or for BFHI-designated facilities).

Since there seems to be a relationship between the number of steps implemented in a facility and breast-feeding exclusivity(10,16,30) and duration(10,15,17,18,25,26), the tool's in-depth assessment of all proposed policies and practices(18) may help improve the effectiveness of the BFHI intervention by promoting compliance with all Ten Steps and the Code. Information about other potential institutional- or individual-level confounders(26) may be beneficial when analysing and interpreting results. The tool also provides a rigorous methodology that allows comparability among facilities and reproducibility over time.

Although there is no established gold standard against which to determine the accuracy of the methodology, it is noteworthy that by 2007 the three hospitals with the highest global scores (A, C and F) were those that, at the time of the assessment, had either already obtained or formally applied for BFHI certification (based on the 1992 Global Criteria; those from 2006 had not yet been incorporated into the evaluation process).

In turn, these strengths are related to the methodology's main limitations. The open-ended questions used to collect in-depth information require interviewers skilled in breast-feeding and the BFHI. Collecting data from different sources is time-consuming, making it difficult for a given facility to perform detailed observations or obtain large sample sizes, even if sufficient staff and mothers are available to participate. The resulting small sample sizes may hamper representativeness(35) or the precision of the estimates, especially when measuring policies and practices with large variation in compliance (i.e. closer to 50 %). For example, Montérégie mothers tend to show more conflicting reports about counselling frequency and quality than about hospital routines, suggesting the need for a larger sample size to assess the former. Furthermore, the fact that mothers convey their personal experience, which is likely more variable than the generalization managers/staff are asked to report, constitutes another argument to aim for larger sample sizes for mothers. Conversely, when assessments are done on a whole region, province or country, summary or aggregated analyses will improve validity and serve as reference values for individual facilities.
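To illustrate why indicators with compliance near 50 % demand larger samples, a standard sample-size approximation (normal approximation, no finite-population correction) can be sketched as follows; the ±15 percentage-point margin is an arbitrary example, not a recommendation from the tool.

```python
from math import ceil

def sample_size(expected_extent: float, margin: float, z: float = 1.96) -> int:
    """Approximate n needed to estimate an indicator's extent of
    implementation within +/- margin (both given as proportions)."""
    p = expected_extent
    return ceil(z ** 2 * p * (1 - p) / margin ** 2)

# For the same +/- 15 percentage-point precision, an indicator near 50 %
# implementation needs a larger sample than one near 80 %.
print(sample_size(0.5, 0.15))  # 43 respondents
print(sample_size(0.8, 0.15))  # 28 respondents
```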

Applications of the measurement methodology

Use of the developed tool is flexible. It can be used to collect data from only one information source or to measure specific steps requiring closer attention. Although developed for planning and monitoring, the methodology may also prove useful for research on BFHI determinants or effectiveness, for quality improvement exercises, or as a ‘mock’ practice (or pre-evaluation) in the final stages towards officially becoming Baby-Friendly. In addition, if a country decides to rely fully on a system of internal monitoring, without scheduling external reassessments(10,34), this type of tool could be used to carry out periodic monitoring. The cost of performing a measurement will obviously vary with its use, but with the more widespread availability of personal computers, the methodology is presumably accessible to low-income/low-resource settings.

In fact, because of these user-friendly properties, the interviewer needs only basic computer skills. Whether the assessment is performed locally or at the regional/provincial level, the main requirement is that interviewers, such as lactation consultants, have adequate knowledge of breast-feeding and the BFHI.

Approximate interviewing times needed to complete a hospital assessment are 20 and 30 min for each staff/manager and mother, respectively. The minimal amount of observation is 1 h but should be increased if observation of mothers/babies is included. To improve the validity of the assessment, efforts should be made to avoid sources of selection bias (e.g. announcing the day of the assessment visit or selecting mothers from a list prepared by the hospital) and recall bias. Inevitably, sample size will be influenced by monitoring goals (hospitals alone or CHC also), feasibility and cost. For example, the cost to apply the tool in a small or medium-sized hospital (less than 2500 births annually) is a 12 h visit (to interview staff from all shifts). The cost of interviewing mothers will depend on whether interviews are done while visiting the facility or by telephone. Other costs to be added are those of training interviewers (1 d suffices), organizing the visits, travelling time and unused time between interviews.
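As a rough, illustrative workload calculation using these per-interview times and a sample the size of Hospital F's (ten staff/managers, thirty-five mothers):

```python
# Interviewing time only; training, travel and idle time between
# interviews (mentioned above) are not included in this estimate.
staff, mothers = 10, 35          # Hospital F sample sizes
minutes = staff * 20 + mothers * 30
print(f"{minutes} min, i.e. about {minutes / 60:.0f} h of interviewing")  # 1250 min, about 21 h
```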

Furthermore, based on our experience, disseminating regional assessments to participating facilities at the local level (via their completed instruments and personalized presentations) not only provides concrete data on achievements and challenges, but also clarifies and demystifies the BFHI recommendations, contributing to the adoption of a ‘regional/local’ common vision. It also seems to spur a mobilization of key players, contributing to the organizational changes required to progress towards achieving or maintaining the standards required for Baby-Friendly designation. In fact, all eight hospitals and nineteen CHC in the Montérégie have stated in legally mandated local action plans that they will seek Baby-Friendly designation or recertification* by 2012(45).

Conclusions

It is well recognized that the BFHI is an effective intervention to improve breast-feeding exclusivity and duration. Since its inception in 1991, it has been prioritized in international and national infant feeding policies and recommendations. Still, it remains a challenge to transfer what is already known into action, that is, to deliver the intervention to mothers, children and families(1). The current paper presents a process for making policies and recommendations targeting the BFHI operational. At a local, regional, provincial, national or international level, measuring BFHI compliance with a computerized tool allows authorities and multidisciplinary clinical teams to set realistic objectives and select appropriate activities to implement the proposed policies and best practices, while also providing valuable baseline or progress information for programme monitoring and evaluation at all levels. Moreover, personalized and timely dissemination of results may help health-care facilities achieve or maintain the international standards required for Baby-Friendly designation.

Acknowledgements

The development of the tool and the 2007 assessment were supported financially by the Agence de la santé et des services sociaux de la Montérégie, where the author worked at the time, and the Ministère de la Santé et des Services sociaux du Québec. There are no competing interests to declare. L.N.H. conceived the different versions of the assessment tools and is first author on all of them. Besides holding the moral rights, L.N.H. also shares the tools’ licence together with the Agence de la santé et des services sociaux de la Montérégie. L.N.H. directed the four studies testing the different versions of the assessment tools, and also wrote the present manuscript. I acknowledge and thank the other authors of the different versions of the assessment tools, Dominique Brosseau, Dany Gauthier and Lydia Rocheleau. I am also grateful to Eric Belzile, Ann Brownlee, Manon Des Côtes, Janie Houle, Ginette Lafontaine, Linda Langlais, Nathalie Lévesque, Monique Michaud, Isabelle Ouellet, François Pilote, Ghislaine Reid and Yue You for their contribution at different stages of this project. Lastly, I would like to thank Jane McCusker, Anne Merewood and Sonia Semenic for revising different versions of the manuscript.

Footnotes

* Interviews with mothers were scheduled at this time because the assessment also measured Baby-Friendly compliance in CHC. This required that the interview be sufficiently spaced from the time of hospital discharge in order to allow delivery of the assessed postnatal services while at the same time trying to minimize the risk of recall bias.

† Mothers of babies weighing less than 2000 g or having delivered under general anaesthesia were excluded.

* In the official designation process, observations of vaginal deliveries are contemplated to confirm, if necessary, adherence to this step(8). The tool does not require these observations, to avoid intruding on care (if the mother's perspective is measured while in hospital) or to allow telephone interviews (if the mother's perspective is measured after discharge).

* A statistic used to measure the overall difference between two percentage distributions (range 0–100). It indicates the proportion of cases that would need to be reallocated in order to make the two distributions equal.

* Supplementary spreadsheets (Excel; Microsoft® Corporation, Redmond, WA, USA) analyse results using a 60 % threshold to identify indicators closer to being completely implemented (extent of implementation between 60 % and 79 %) in contrast to those less well implemented (extent of implementation less than 60 %). In fact, thresholds can be easily modified to suit particular needs.

* One university hospital in the region is not required to formulate an action plan. Eight facilities are presently designated Baby-Friendly.

References

1. Jones G, Steketee RW, Black RE et al. (2003) How many child deaths can we prevent this year? Lancet 362, 65–71.
2. Bryce J, el Arifeen S, Pariyo G et al. (2003) Reducing child mortality: can public health deliver? Lancet 362, 159–164.
3. Horta B, Bahl R, Martinés J et al. (2007) Evidence on the Long-term Effects of Breastfeeding. Systematic Reviews and Meta-analysis. Geneva: WHO; available at http://whqlibdoc.who.int/publications/2007/9789241595230_eng.pdf
4. Ip S, Chung M, Raman G et al. (2007) Breastfeeding and Maternal and Infant Health Outcomes in Developed Countries. Evidence Report/Technology Assessment no. 153. AHRQ Publication no. 07-E007. Rockville, MD: Agency for Healthcare Research and Quality; available at http://www.ahrq.gov/downloads/pub/evidence/pdf/brfout/brfout.pdf
5. Kramer MS, Chalmers B, Hodnett ED et al. (2001) Promotion of Breastfeeding Intervention Trial (PROBIT): a randomized trial in the Republic of Belarus. JAMA 285, 413–420.
6. Leon-Cava N, Lutter C, Ross J et al. (2002) Quantifying the Benefits of Breastfeeding: A Summary of the Evidence. Washington, DC: Pan American Health Organization; available at http://www.linkagesproject.org/media/publications/Technical%20Reports/BOB.pdf
7. Bartick M & Reinhold A (2010) The burden of suboptimal breastfeeding in the United States: a pediatric cost analysis. Pediatrics 125, e1048–e1056.
8. World Health Organization/UNICEF (2009) Baby-Friendly Hospital Initiative. Revised, Updated and Expanded for Integrated Care. Section 1: Background and Implementation. Geneva: WHO/UNICEF; available at http://www.who.int/nutrition/publications/infantfeeding/9789241594967_s1.pdf
9. World Health Organization (1981) International Code of Marketing of Breast-milk Substitutes. Geneva: WHO; available at http://whqlibdoc.who.int/publications/9241541601.pdf
10. Merten S, Dratva J & Ackermann-Liebrich U (2005) Do baby-friendly hospitals influence breastfeeding duration on a national level? Pediatrics 116, e702–e708.
11. Basel Institute for Social and Preventive Medicine (2008) Monitoring de la promotion de l'allaitement maternel dans les maternités certifiées favorables à l'allaitement maternel et les cliniques et hôpitaux qui visent la certification (Initiative Hôpitaux Amis des Bébés). Basel: Basel Institute for Social and Preventive Medicine.
12. Moura de Araujo Mde F & Soares Schmitz Bde A (2007) Reassessment of baby-friendly hospitals in Brazil. J Hum Lact 23, 246–252.
13. Campbell H, Gorman D & Wigglesworth A (1995) Audit of the support for breastfeeding mothers in Fife maternity hospitals using adapted ‘Baby Friendly Hospital’ materials. J Public Health Med 17, 450–454.
14. Cattaneo A & Buzzetti R (2001) Effect on rates of breast feeding of training for the baby friendly hospital initiative. BMJ 323, 1358–1362.
15. Centers for Disease Control and Prevention (2008) Breastfeeding-related maternity practices at hospitals and birth centers – United States, 2007. MMWR Morb Mortal Wkly Rep 57, 621–625.
16. Declercq E, Labbok MH, Sakala C et al. (2009) Hospital practices and women's likelihood of fulfilling their intention to exclusively breastfeed. Am J Public Health 99, 929–935.
17. DiGirolamo AM, Grummer-Strawn LM & Fein S (2001) Maternity care practices: implications for breastfeeding. Birth 28, 94–100.
18. DiGirolamo AM, Grummer-Strawn LM & Fein SB (2008) Effect of maternity-care practices on breastfeeding. Pediatrics 122, Suppl. 2, S43–S49.
19. Dodgson JE, Allard-Hale CJ, Bramscher A et al. (1999) Adherence to the ten steps of the Baby-Friendly Hospital Initiative in Minnesota hospitals. Birth 26, 239–247.
20. Gokcay G, Uzel N, Kayaturk F et al. (1997) Ten steps for successful breast-feeding: assessment of hospital performance, its determinants and planning for improvement. Child Care Health Dev 23, 187–200.
21. Kovach AC (1997) Hospital breastfeeding policies in the Philadelphia area: a comparison with the ten steps to successful breastfeeding. Birth 24, 41–48.
22. Kovach AC (2002) A 5-year follow-up study of hospital breastfeeding policies in the Philadelphia area: a comparison with the ten steps. J Hum Lact 18, 144–154.
23. Levitt CA, Kaczorowski J, Hanvey L et al. (1996) Breast-feeding policies and practices in Canadian hospitals providing maternity care. CMAJ 155, 181–188.
24. Martens PJ, Phillips SJ, Cheang MS et al. (2000) How Baby-Friendly are Manitoba hospitals? The Provincial Infant Feeding Study. Breastfeeding Promotion Steering Committee of Manitoba. Can J Public Health 91, 51–57.
25. Murray E (2006) Hospital practices that increase breastfeeding duration: results from a population-based study. Birth 34, 202–210.
26. Rosenberg KD, Stull JD, Adler MR et al. (2008) Impact of hospital policies on breastfeeding outcomes. Breastfeed Med 3, 110–116.
27. Syler GP, Sarvela P, Welshimer K et al. (1997) A descriptive study of breastfeeding practices and policies in Missouri hospitals. J Hum Lact 13, 103–107.
28. Chalmers B, Levitt C, Heaman M et al. (2009) Breastfeeding rates and hospital breastfeeding practices in Canada: a national survey of women. Birth 36, 122–132.
29. Grizzard TA, Bartick M, Nikolov M et al. (2006) Policies and practices related to breastfeeding in Massachusetts: hospital implementation of the ten steps to successful breastfeeding. Matern Child Health J 10, 247–263.
30. Toronto Public Health (2010) Breastfeeding in Toronto: Promoting Supportive Environments. Toronto: Toronto Public Health; available at http://www.toronto.ca/health/breastfeeding/environments_report/pdf/technical_report.pdf
31. UNICEF Nicaragua (2006) The Nicaragua Mother and Baby Friendly Health Units Initiative. Factors Influencing its Success and Sustainability. Managua: UNICEF Nicaragua; available at http://www.hciproject.org/sites/default/files/NicMotherBabyFriend.pdf
32. Centers for Disease Control and Prevention (2010) Breastfeeding Report Card – United States. Atlanta, GA: CDC; available at http://www.cdc.gov/breastfeeding/data/reportcard.htm
33. Jamtvedt G, Young JM, Kristoffersen DT et al. (2006) Audit and feedback: effects on professional practice and health care outcomes. Cochrane Database Syst Rev issue 2, CD000259.
34. World Health Organization/UNICEF (2009) Baby-Friendly Hospital Initiative. Revised, Updated and Expanded for Integrated Care. Section 4: Hospital Self-Appraisal and Monitoring. Geneva: WHO; available at http://www.who.int/nutrition/publications/infantfeeding/9789241594998_s4.pdf
35. de Oliveira MI, Camacho LA & Tedstone AE (2003) A method for the evaluation of primary health care units’ practice in the promotion, protection, and support of breastfeeding: results from the state of Rio de Janeiro, Brazil. J Hum Lact 19, 365–373.
36. Haiek LN & Gauthier DL (2007) Instrument de mesure IHAB-40. Une méthodologie pour évaluer le niveau d'implantation de l'Initiative des hôpitaux amis des bébés (BFHI-40 Assessment Tool. A Methodology to Measure Compliance with the Baby-Friendly Hospital Initiative). Longueuil: Direction de santé publique, Agence de la santé et des services sociaux de la Montérégie.
37. Haiek LN (2011) Niveau d'implantation de l'Initiative des amis des bébés dans les établissements offrant des services de périnatalité au Québec. Québec: Ministère de la Santé et des Services sociaux; available at http://publications.msss.gouv.qc.ca/acrobat/f/documentation/2010/10-815-01.pdf
38. Haiek LN, Brosseau D, Gauthier DL et al. (2003) Initiative des hôpitaux amis des bébés. Étude sur le niveau d'implantation en Montérégie. Longueuil: Direction de santé publique, Régie régionale de la santé et des services sociaux de la Montérégie; available at http://extranet.santemonteregie.qc.ca/Menu_Gauche/4-Publications/3-Monographies_Orientations_Rapports/Prevention_et_promotion_de_la_sante/dsp_pub_allaitement_implantation.pdf
39. World Health Organization/UNICEF (1992) Baby-Friendly Hospital Initiative: 1. The Global Criteria for the WHO/UNICEF Baby-Friendly Hospital Initiative. Geneva: WHO/UNICEF.
40. Lepage MC & Moisan J (1998) Étude sur l'alimentation du nourrisson chez des femmes primipares du Québec. Beauport: Régie régionale de la santé et des services sociaux de Québec, Direction de santé publique.
41. Kovach AC (1996) An assessment tool for evaluating hospital breastfeeding policies and practices. J Hum Lact 12, 41–45.
42. Haiek LN, Gauthier DL, Brosseau D et al. (2004) BFHI-100 Assessment Tool. A Methodology to Measure Compliance with the Baby-Friendly Hospital Initiative (Instrument de mesure IHAB-100. Une méthodologie pour évaluer le niveau d'implantation de l'Initiative des hôpitaux amis des bébés). Longueuil: Direction de santé publique, Agence de la santé et des services sociaux de la Montérégie.
43. Haiek LN & Gauthier DL (2007) Instrument de mesure IAB-37. Une méthodologie pour évaluer le niveau d'implantation de l'Initiative des amis des bébés dans les centres de soins de santé primaires (BFI-37 Assessment Tool. A Methodology to Measure Compliance with the Baby-Friendly Initiative in Community Health Centers). Longueuil: Direction de santé publique, Agence de la santé et des services sociaux de la Montérégie.
44. Creative Commons (2010) About Licenses. http://creativecommons.org/about/licenses/ (accessed July 2011).
45. Direction de santé publique de la Montérégie (2009) Plan d'action régional 2009–2012. Destination prévention. Longueuil: Agence de la santé et des services sociaux de la Montérégie; available at http://extranet.santemonteregie.qc.ca/Menu_Gauche/4-Publications/6-Dépliants_Guides_Outils_Information/Santé_Publique/PAR%202009-2012.pdf