
The impact of surprise billing laws on hospital-based physician prices and network participation

Published online by Cambridge University Press:  11 December 2024

Christopher Garmon*
Affiliation:
Henry W. Bloch School of Management, University of Missouri Kansas City, Kansas City, MO, USA
Yiting Li
Affiliation:
College of Public Health, Nationwide Children's Hospital, Columbus, OH, USA
Sheldon M. Retchin
Affiliation:
College of Public Health, Ohio State University, Columbus, OH, USA
Wendy Yi Xu
Affiliation:
College of Public Health, Ohio State University, Columbus, OH, USA
Corresponding author: Christopher Garmon; Email: [email protected]

Abstract

Prior to the No Surprises Act (NSA), numerous states passed laws protecting patients from surprise medical bills from out-of-network (OON) hospital-based physicians supporting elective treatment in in-network hospitals. Even in non-emergency situations, patients have little ability to choose physicians such as anaesthesiologists, pathologists or radiologists. Using a comprehensive, multi-payer claims database, we estimated the effect of these laws on hospital-based physician reimbursement, charges, network participation and potential surprise billing episodes. Overall, the state laws were associated with a reduction in anaesthesiology prices and charges, but an increase in pathology and radiology prices. The price effects for each state exhibit substantial heterogeneity. California and New Jersey experienced increases in network participation by anaesthesiologists and pathologists and reductions in potential surprise billing episodes, but, overall, we find little evidence of changes in network participation across all of the states implementing surprise billing laws. Our results suggest that the effects of the NSA may vary across states.

Type
Article
Creative Commons
This is an Open Access article, distributed under the terms of the Creative Commons Attribution licence (http://creativecommons.org/licenses/by/4.0/), which permits unrestricted re-use, distribution and reproduction, provided the original article is properly cited.
Copyright
Copyright © The Author(s), 2024. Published by Cambridge University Press

1. Introduction

Although they are rarely the primary physician treating a patient, hospital-based ancillary physicians (e.g. anaesthesiologists, pathologists, radiologists) provide vital services that aid diagnosis and treatment.Footnote 1 Patients seeking elective care at in-network hospitals often assume these physicians are employed by the hospital – and, thus, also in-network – but they can be independent. Even patients who understand that ancillary physicians may not be employed by the hospital are largely powerless to choose their ancillary physician, whether the needed treatment is emergent or elective. This can lead to surprise out-of-network (OON) bills from the ancillary physician to the patient for the balance of the physician's charges not covered by insurance, bills that patients have no ability to avoid.Footnote 2 Past research has documented a non-negligible share of ancillary physicians that are OON. Nikpay et al. (2022) found that almost 15 per cent of anaesthesiologists do not participate in any commercial networks. Among inpatient admissions, Sun et al. (2019) found that 19.3 per cent had an OON anaesthesiology claim, 22.2 per cent had an OON pathology claim, and 22.6 per cent had an OON radiology claim. Using similar data from a large insurer, Cooper et al. (2020) found that, at in-network hospitals, 11.8 per cent of anaesthesiology claims, 12.3 per cent of pathology claims and 5.6 per cent of radiology claims were OON. Looking specifically at elective treatment at in-network hospitals with in-network primary physicians, Chhabra et al. (2020) found that 20.5 per cent of these cases contained an OON claim.

In recent years, numerous states implemented laws to protect patients from surprise OON bills in elective situations. However, these state laws only apply to fully insured health plans, in which insurers bear the risk. Most people with private employer-sponsored health insurance are in self-funded health plans, in which the employer bears the health spending risk. These latter plans are instead regulated by the federal government as an employment benefit through the Employee Retirement Income Security Act.

To fill this gap, Congress passed the No Surprises Act (NSA), which protects fully insured and self-funded patients from OON balance bills. In elective situations, OON providers must notify patients and receive written consent 72 h or more before treatment in order to balance bill patients. However, hospital-based specialists, such as anaesthesiologists, pathologists and radiologists, are not allowed to balance bill patients regardless of prior notice or consent. The lawmakers who drafted the NSA recognised that, with ancillary physicians, patients are unable to shop for and select a provider, even with elective procedures, and, thus, notice and consent are unlikely to be meaningful.

To determine payment for OON hospital-based ancillary physicians, the NSA uses a final-offer independent dispute resolution (IDR) process, which is similar to the arbitration systems previously established by some states to regulate surprise OON billing by ancillary physicians. Given their similarity with the NSA, the prior state laws may provide evidence of the impact of the NSA on reimbursements to ancillary physicians and the incentives of ancillary physicians to join managed care networks.

The existing literature on the impact of state balance billing laws on ancillary physician reimbursement and network participation is limited and focuses primarily on California's law and anaesthesiology. La Forgia et al. (2021) estimated the effect of surprise billing laws in California, Florida and New York on in-network and OON anaesthesiology prices and found that the laws led to in-network and OON price decreases in all three states. Dixit et al. (2023) looked at the effect of California's law on in-network and OON anaesthesiology prices and network participation. They found a reduction in OON prices, an increase in in-network prices and no evidence of a change in network participation. Adler et al. (2019a) looked at changes in network participation for anaesthesiology, diagnostic radiology, pathology, assistant surgeons and neonatal–perinatal medicine in California after it implemented its law protecting patients from surprise OON bills. They found that network participation increased from 79.1 to 82.6 per cent after the law. Gordon et al. (2022) estimated the effect of California's and New York's surprise billing laws on total charges accrued during non-emergency potential surprise billing episodes (i.e. the hospital and primary physician are in-network, but another physician involved in treatment is OON). They found an increase in charges relative to controls in New York, but a decrease in California.
A related literature studied arbitration awards across various physician specialties in certain states that implemented IDR systems as part of their surprise billing laws, with mixed results (Adler, 2019; Chartock et al., 2021; and Duffy et al., 2022, for New York, New Jersey and Texas, respectively). There are no studies of the effects of other elective balance billing laws implemented in the 2010s, such as the laws passed in Arizona, Connecticut, Maine, Minnesota, New Hampshire and Oregon. These state regulations differed substantially from the earlier laws in California, Florida and New York in their regulation of OON reimbursements. Furthermore, there are no studies that explore the effects of the state laws on pathology and radiology prices. Finally, while Adler et al. (2019a) and Dixit et al. (2023) investigate the effect of California's law on network participation, no studies have estimated the effect of surprise billing laws on the frequency of situations in which patients are vulnerable to surprise bills from ancillary physicians. This is a significant gap in the literature because these laws did not protect patients in self-funded health plans from surprise bills.

We estimate the effects of all of the elective balance billing laws, collectively and individually, that were implemented between 2013 and 2018: Arizona, California, Connecticut, Florida, Maine, Minnesota, New Jersey, New Hampshire, New York and Oregon. We estimate the effects of the laws on physician in-network and OON reimbursements and network participation for anaesthesiologists, pathologists and radiologists. For anaesthesiologists and pathologists, we also estimate the effect of the laws on charges (i.e. list prices).Footnote 3 Finally, we investigate the effects of each law on potential elective surprise billing situations involving each of these specialties (i.e. the hospital is in-network, but the ancillary physician is OON). For all outcomes, we measure effects separately for fully insured and self-funded health plans. Although the state laws do not apply to self-funded plans, there may be spillover effects if insurers use common networks for fully insured and self-funded plans in their portfolio.

Overall, we find that the state laws led to reductions in anaesthesiology prices and charges for fully insured health plans, but with substantial heterogeneity across states. We also find some evidence of spillovers in anaesthesiology prices for self-funded plans. The effects of the laws on pathology and radiology prices seem to be inflationary overall. However, we do not find any statistically significant results for any state in particular and find no evidence of changes in pathology charges associated with the laws. We find increases in fully insured network participation in California, New Jersey and Oregon for anaesthesiologists and reductions in potential surprise billing episodes in these states. For New Jersey, these effects also occurred in self-funded plans, likely because self-funded plans can opt into the protections in New Jersey. Reductions in anaesthesiology network participation were observed in multiple states.

In the following section, we describe the state laws in detail. Section 3 describes our data and estimation methods. Section 4 describes our results and section 5 concludes with a discussion of the implications of our findings.

2. Background

Ten states passed laws protecting patients from OON balance bills in non-emergency, elective situations between 2013 and 2018: Arizona, California, Connecticut, Florida, Maine, Minnesota, New Hampshire, New Jersey, New York and Oregon. All of these states protected patients from balance billing by OON ancillary physicians, using a variety of methods to regulate OON reimbursement. The laws either set a specific regulated payment or employed an IDR process to determine the insurer's reimbursement to the OON provider. The regulated-payment states employed a variety of payment rules, while the IDR states used a variety of arbitration mechanisms to select cases eligible for arbitration and to guide arbiters in their decision-making. Table 1 summarises the ten surprise billing laws.

Table 1. State surprise billing laws

Five of the ten state laws specified a regulated amount for OON reimbursement of physicians. California's law, effective 1 July 2017, required that health plans pay OON physicians working at in-network hospitals the greater of the average contracted rate or 125 per cent of the amount Medicare reimburses on a fee-for-service basis for the same or similar services in the general geographic region in which the services were rendered.Footnote 4 Connecticut's law, effective 1 July 2016, specified that OON physicians providing non-emergency services at in-network hospitals be reimbursed at the health plan's in-network rate if the health plan and OON provider do not agree on a reimbursement amount.Footnote 5 Maine's law, effective 1 January 2018, uses the same approach as Connecticut for reimbursement of OON providers working in in-network hospitals.Footnote 6 Florida's law, effective 1 July 2016, requires that the health plan reimburse the OON provider the least of (1) the provider's charges, (2) the usual and customary provider charges for similar services in the community where the services were provided or (3) the charge mutually agreed to by the insurer and the provider within 60 days after submittal of the claim.Footnote 7 Oregon's law, effective 1 March 2018, required OON reimbursement to equal the median in-network rate in 2015 (adjusted for subsequent inflation).Footnote 8

The remaining five states relied on an IDR process for reimbursement of OON physicians practising at in-network hospitals. Arizona's law became effective in January 2019 and used an IDR process, but the patient must initiate it and a bill qualifies for arbitration only if the balance bill and cost-sharing exceed $1,000.Footnote 9 Minnesota's law became effective in January 2018 and implements an IDR process if the health plan and physician cannot agree on reimbursement. If arbitration is necessary, the arbiter is directed to consider the health plan's other OON payments for similar services, the complexity of the treatment and the usual and customary payment calculated from an independent claims database (e.g. FAIR Health).Footnote 10 New York's law – which was signed by the governor in October 2014, but became effective in April 2015 – employed final-offer arbitration for OON reimbursement disputes in which the arbiter is shown the 80th percentile of charges as calculated by FAIR Health (Adler, 2019). New Hampshire's law became effective on 1 July 2018 and also employs an IDR system, but it relies on the state insurance commissioner's determination of the reasonableness of the insurer's payment to qualify for arbitration.Footnote 11 New Jersey's law protecting patients from balance bills for OON care occurring in in-network hospitals became effective on 30 August 2018 and uses an IDR system based on charge percentiles (Chartock et al., 2021). However, the difference between the insurer's offer and the provider's offer must be greater than $1,000 (at the episode level) to qualify for the IDR system. Prior to this law, New Jersey also banned balance billing by OON ancillary physicians, but required that insurers pay the OON physician's full charge. Some argued that this led to reduced levels of network participation for physicians (Mattke et al., 2017). New Jersey's new law allows self-funded health plans to opt into the protections of the law and its IDR system.

3. Data and methods

We use the Health Care Cost Institute's (HCCI) 2.0 Commercial Claims Research Dataset for years 2012 through 2019. The data capture the medical claims for over 55 million covered lives in employer-sponsored health plans and represent multiple insurers, including Aetna, Blue Cross and Blue Shield, and Humana.

We analyse prices, charges and network participation for three hospital-based physician specialties: anaesthesiologists, pathologists and radiologists. For anaesthesiologists,Footnote 12 we isolate the physician claims with Current Procedural Terminology (CPT) codes 00100 through 01999. For pathologists, we isolate the physician claims with CPT codes 88300 through 88399. For radiologists, we isolate the physician claims with CPT codes 70010 through 79999, excluding CPTs 77261 through 77799. For all three specialties, we restrict the claims to those with place of service codes indicating a hospital (either outpatient or inpatient) and also exclude claims that may not represent commercially insured patients with managed care plans or that have other coding problems.Footnote 13
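As an illustration, the CPT-range filters above can be sketched as a small classifier. This is a simplified sketch: the actual extraction also conditions on place-of-service codes and excludes problematic claims, as described above, and the function name is purely illustrative.

```python
def classify_specialty(cpt):
    """Classify a physician claim into an ancillary specialty by CPT code,
    following the numeric ranges described in the text. Returns None for
    claims outside the three specialties' ranges."""
    code = int(cpt)  # anaesthesia codes carry leading zeros, e.g. "00140"
    if 100 <= code <= 1999:                      # CPT 00100-01999
        return "anaesthesiology"
    if 88300 <= code <= 88399:                   # surgical pathology
        return "pathology"
    if 70010 <= code <= 79999:                   # radiology...
        if 77261 <= code <= 77799:               # ...excluding radiation oncology
            return None
        return "radiology"
    return None
```

A claim like `classify_specialty("00140")` falls in the anaesthesia range, while `classify_specialty("77300")` is excluded by the 77261–77799 carve-out.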

The HCCI claims data do not include the provider's charge. For anaesthesiologists and pathologists, the HCCI claims data are merged with charge information from Medicare's Provider Utilization and Payment (MPUP) data for 2013 through 2019.Footnote 14 The MPUP data include the provider's average charge for Medicare patients for each CPT and year when the place of service is an inpatient hospital. The MPUP data are merged with the anaesthesiology and pathology physician claims by the provider's masked NPI, CPT and year. (2012 MPUP data are not available. Analysing charges for radiologists is not feasible because MPUP data are available for too few radiologists.)

Payments for anaesthesiologists and nurse anaesthetists are based on a different system than payments for other physicians.Footnote 15 Anaesthesiology payments are calculated using base units, time units, anaesthesiology-specific conversion factors that differ by region, and other adjustments depending on whether services were provided by an anaesthesiologist or by a nurse anaesthetist under the supervision of an anaesthesiologist. To case-mix-adjust prices and charges, we use base unit and anaesthesia conversion factor files for 2012 through 2019 from the Centers for Medicare and Medicaid Services (CMS).Footnote 16 The base unit files are merged with the anaesthesia claims by CPT and the anaesthesia conversion factors are merged with the anaesthesia claims by zip code.

For pathologists and radiologists, we use Medicare physician fee schedule files for 2012 through 2019 from CMS to case-mix-adjust prices and charges.Footnote 17 Using the Medicare physician fee schedule files, we calculate the total geographically adjusted relative value units (RVUs) for each five-digit zip code and CPT when the service is performed in a facility setting. We calculate the work RVUs multiplied by the work geographic practice cost index (GPCI) plus the facility practice expense RVUs multiplied by the practice expense GPCI plus the malpractice expense RVUs multiplied by the malpractice expense GPCI. This total geographically adjusted RVU amount is merged onto the claims data by zip code and CPT.
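The total geographically adjusted RVU calculation just described can be sketched as follows (parameter names are illustrative; the RVU components and GPCIs come from the CMS fee schedule files):

```python
def total_adjusted_rvu(work_rvu, pe_facility_rvu, mp_rvu,
                       work_gpci, pe_gpci, mp_gpci):
    """Total geographically adjusted RVUs for a service performed in a
    facility setting: each RVU component (work, facility practice expense,
    malpractice) is scaled by its geographic practice cost index (GPCI)."""
    return (work_rvu * work_gpci
            + pe_facility_rvu * pe_gpci
            + mp_rvu * mp_gpci)
```

This per-zip-code, per-CPT total is what gets merged onto the claims data.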

For pathology and radiology claims, we convert prices (and charges for pathology) to amounts that are comparable to Medicare's standard physician conversion factor (which is used to determine Medicare payment) by dividing the allowed amount (and, separately, the average charge for pathology) by the total geographically adjusted RVUs for each claim. This converts prices and charges to amounts per RVU. Because the geographic adjustment for anaesthesiology payments is included in the anaesthesiology conversion factors, we cannot use this same approach for anaesthesiology prices and charges. Instead, we calculate how much Medicare would have paid for each anaesthesiology claim by applying the base units, time units and region-specific conversion factors using Medicare's payment formula. We then divide the claim's allowed amount and (separately) the average charge by the hypothetical Medicare payment to convert anaesthesiology prices and charges into multiples of Medicare's payment. Whether prices and charges are amounts per geographically adjusted RVU for pathology and radiology or multiples of the payment that would apply if the patient had been a Medicare patient for anaesthesiology, our prices and charges are adjusted to reflect geographic and treatment intensity differences using Medicare's payment adjustments for these factors.
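These conversions can be sketched as follows. This is a simplified illustration of the formulas described above: Medicare's actual anaesthesia payment also involves modifier-based adjustments (e.g. for nurse anaesthetist supervision), which are omitted here.

```python
def medicare_anaesthesia_payment(base_units, time_units, conversion_factor):
    """Hypothetical Medicare anaesthesia payment: (base units + time units)
    times the locality-specific anaesthesia conversion factor."""
    return (base_units + time_units) * conversion_factor

def price_as_medicare_multiple(allowed_amount, base_units, time_units, cf):
    """Anaesthesiology price expressed as a multiple of the hypothetical
    Medicare payment for the same claim."""
    return allowed_amount / medicare_anaesthesia_payment(base_units, time_units, cf)

def price_per_rvu(allowed_amount, total_adjusted_rvu):
    """Pathology/radiology price per geographically adjusted RVU,
    comparable to Medicare's standard physician conversion factor."""
    return allowed_amount / total_adjusted_rvu
```

For example, a claim with 5 base units, 8 time units and a $22 conversion factor implies a hypothetical Medicare payment of $286, so a $572 allowed amount is a multiple of 2.0.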

To measure network participation, we calculate the percentage of claims that are in-network. We separately measure outcomes for fully insured and self-funded claims. Summary statistics for price, charge and network participation outcomes are presented in Tables 2–4 for anaesthesiology, pathology and radiology, respectively.

Table 2. Anaesthesiology summary statistics for price (allowed/Medicare), charge/Medicare and network participation

Table 3. Pathology summary statistics for price (per RVU), charge (per RVU) and network participation

Table 4. Radiology summary statistics for price (per RVU) and network participation

Potential surprise billing episodes are defined as inpatient treatment episodes in which the facility is in-network, but at least one non-facility claim is OON. We identify patient treatment episodes as groups of inpatient facility and physician claims with the same patient ID and overlapping treatment dates. To focus on potential surprise billing episodes in elective situations, we exclude episodes that include claims for emergency services. We analyse potential elective surprise billing episodes regardless of physician specialty (i.e. the facility is in-network, but at least one physician, regardless of specialty, is OON), but we also analyse elective surprise billing episodes that are attributable to an OON anaesthesiologist, an OON pathologist and an OON radiologist, respectively. As with prices, charges and network participation, we separately estimate changes in potential surprise billing episodes for fully insured and self-funded health plans. Summary statistics for the prevalence of potential surprise billing episodes are presented in Table 5. For all analyses, we excluded 2020 due to the COVID pandemic.
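The episode construction described above — grouping a patient's claims by overlapping treatment dates, then flagging episodes with an in-network facility but at least one OON non-facility claim — can be sketched as below. The claim representation is illustrative, and the emergency-service exclusion is omitted.

```python
def build_episodes(claims):
    """Group one patient's claims into treatment episodes: claims whose
    service-date ranges overlap are merged into one episode.
    Each claim is (start_day, end_day, is_facility, in_network)."""
    episodes = []
    for claim in sorted(claims):
        if episodes and claim[0] <= episodes[-1]["end"]:
            ep = episodes[-1]                    # overlaps: extend episode
            ep["end"] = max(ep["end"], claim[1])
            ep["claims"].append(claim)
        else:                                    # no overlap: new episode
            episodes.append({"end": claim[1], "claims": [claim]})
    return episodes

def is_potential_surprise(episode):
    """Facility claim(s) in-network but at least one non-facility
    (physician) claim out-of-network."""
    facility_in = all(c[3] for c in episode["claims"] if c[2])
    any_oon_physician = any(not c[3] for c in episode["claims"] if not c[2])
    return facility_in and any_oon_physician
```

An admission with an in-network facility claim and an OON anaesthesiology claim on an overlapping date would be flagged; one where every physician claim is in-network would not.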

Table 5. Elective potential surprise medical bill prevalence by type

To estimate the overall average effect of the state laws in each quarter after implementation, we use the event study method of Sun and Abraham (2021). Estimating dynamic (i.e. by-period) average treatment effects using the standard two-way fixed-effects (TWFE) regression with treatment indicators for each period from treatment can lead to biased estimates if the treatments (e.g. state laws) are implemented at different times and the effects of treatment are heterogeneous. The method of Sun and Abraham (2021) estimates dynamic average treatment effects accounting for treatment effect heterogeneity and staggered adoption. Specifically, we first estimate a TWFE model with dynamic time-from-treatment indicators interacted with treatment-time cohort indicators:

(1)$$Y_{ijt} = \alpha_i + \lambda_t + \sum_{b \in B} \sum_{l \in L} \mu_{lb} D_{ib} D_{it}^l + \epsilon_{ijt}$$

where $Y_{ijt}$ is the outcome of interest for provider j in state i in quarter t, $\alpha_i$ is a state-specific effect, $\lambda_t$ is a quarter-specific effect, $D_{ib}$ is an indicator that equals one for each state in treatment-time cohort b (taken from the set B of treatment-time cohorts), $D_{it}^l$ is an indicator that equals one when state i is l quarters from treatment, and $\epsilon_{ijt}$ is the idiosyncratic error. (For the analysis of surprise-billing episodes, j indexes episodes instead of providers and the estimation is based on a random sample of one million episodes.) From the estimation of equation (1), we recover $\mu_{lb}$, the average treatment effects for cohorts $b \in B$ and quarters $l \in L$ from treatment. Next, we calculate weights as the share of cohort b in quarter l from treatment. Finally, we calculate a weighted average of $\mu_{lb}$ across cohorts as the average treatment effect in quarter l from treatment, using the cohort weights calculated in the previous step.Footnote 18 Following convention, we exclude indicators for $l = -1$ and set the quarter before treatment as the reference quarter. We restrict L such that $l \ge -12$. This is the same approach for estimating dynamic treatment effects as in Garmon et al. (2024).
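The final aggregation step — a weighted average of the cohort-by-period estimates $\mu_{lb}$ using cohort shares as weights — can be sketched as follows (the inputs are assumed to be the estimated interaction coefficients and the observed cohort shares):

```python
def aggregate_event_study(mu, shares):
    """Aggregate cohort-by-period effects into one dynamic average
    treatment effect per event time l, weighting each cohort b by its
    share of treated observations at that event time.
    mu:     {(l, b): estimated effect for cohort b, l quarters from treatment}
    shares: {(l, b): cohort b's share of treated observations at l}"""
    event_times = {l for (l, _) in mu}
    att = {}
    for l in sorted(event_times):
        cohorts = [b for (ll, b) in mu if ll == l]
        att[l] = sum(mu[(l, b)] * shares[(l, b)] for b in cohorts)
    return att
```

For instance, if two cohorts each contribute half of the observations at event time 0, the aggregated effect at 0 is the simple average of the two cohort effects.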

The control group we use for the estimation of equation (1) is the set of states that passed laws in 2020 or 2021 that protect patients from surprise OON bills in elective situations. Chhabra et al. (2020) and Nikpay et al. (2022) highlight the significant geographic heterogeneity in surprise billing from hospital-based physicians, but offer no explanation for it. The differences in surprise billing prevalence across states may be due to factors that are difficult to observe and quantify. States with high rates of surprise billing (e.g. Florida, New York, New Jersey) are more likely to have passed a law to protect patients in response to the elevated risk. Thus, states that passed a surprise billing law may differ from states that never passed a surprise billing law and the differences may not be measurable. For this reason, we restrict our control group to states that passed an elective surprise billing law after our observation period (i.e. after 2019). These states are Georgia, Michigan, New Mexico, Ohio, Virginia and Washington.

The dynamic quarterly estimates of the overall average treatment effect may obscure differences across states in the impact of the laws. Heterogeneous effects across states that passed elective surprise billing laws are likely, given the various methods used to regulate OON reimbursements. To explore this potential heterogeneity, we estimate equation (1) separately by the type of state law: a regulated OON reimbursement vs an IDR system. We also estimate the average effect of each state law relative to controls. To estimate the effect of each state law, we use the method of synthetic controls (Abadie, 2021). Specifically, we estimate:

(2)$$Y_{it} = \alpha_i + \lambda_t + \mu D_{it}^{l \ge 0} + \epsilon_{it}$$

where $Y_{it}$ is the state-quarter mean, $\alpha_i$ is a state-specific effect, $\lambda_t$ is a quarter-specific effect, $D_{it}^{l \ge 0}$ is an indicator that equals one for all observations in the treated state and all quarters after treatment, and $\epsilon_{it}$ is the idiosyncratic error. The treated state is stacked with its synthetic control, a weighted average of the outcomes across the pool of control states, where the weights are selected to match the treated state with its synthetic control in the pre-treatment period, based on the outcome and predictors of the outcome. We only use synthetic controls for the price and charge outcomes and use the Herfindahl–Hirschman Index (HHI) of the state's insurance market and the state's health expenditures per capita as predictors of price.Footnote 19 To increase the likelihood of a close pre-treatment match between the treated state and its synthetic control, we include all states that did not have an elective surprise billing law during 2012–2019 in the pool of potential synthetic controls, including the states that passed a law in 2020–21.Footnote 20

For inference, we follow the recommendation of Abadie (2021) and create the permutation distribution of placebo effects, i.e. the distribution of effects if each state in the control pool is considered a treated state. P values are calculated using the distribution of placebo ratios of post-to-pre root mean squared errors (RMSE), where the RMSE captures the difference between the ‘treated’ state and its synthetic control. The p value is the proportion of placebo post-to-pre RMSE ratios that exceed the post-to-pre RMSE ratio of the actual treated state. In this way, the p value captures both the size of the treatment effect relative to the distribution of placebo effects and the precision of the pre-treatment match between the treated state and its synthetic control. Following the recommendation of Abadie (2021), we also report (in Appendix Tables A7–A10) the range of estimates when leaving each control state out of the pool of potential controls. As a robustness check on the state-level results, we also estimate the standard TWFE model (using the control group for equation (1)); these results are reported in Appendix Tables A11–A14.Footnote 21 The price and charge dependent variables in equation (2) are logged, and we report the exponentiated coefficient minus one.
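The permutation p value described above can be sketched as follows. Here each gap series is assumed to be the quarter-by-quarter difference between a state's outcome and its synthetic control.

```python
def rmse(gaps):
    """Root mean squared error of a state-minus-synthetic-control gap series."""
    return (sum(g * g for g in gaps) / len(gaps)) ** 0.5

def placebo_p_value(treated_pre, treated_post, placebo_gaps):
    """Permutation inference for synthetic controls: the p value is the
    share of placebo post-to-pre RMSE ratios at least as large as the
    treated state's ratio.
    treated_pre/treated_post: treated state's gaps before/after treatment.
    placebo_gaps: list of (pre_gaps, post_gaps), one per control state."""
    treated_ratio = rmse(treated_post) / rmse(treated_pre)
    placebo_ratios = [rmse(post) / rmse(pre) for pre, post in placebo_gaps]
    exceed = sum(1 for r in placebo_ratios if r >= treated_ratio)
    return exceed / len(placebo_ratios)
```

Dividing the post-treatment RMSE by the pre-treatment RMSE penalises placebo states whose synthetic control fit poorly even before treatment, which is the double role the text describes.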

By prohibiting physicians from balance billing patients and regulating OON physician reimbursement, surprise billing laws may change physicians' incentives to join managed care networks or remain OON. If a surprise billing law changes the network participation of hospital-based physicians, mean in-network or OON reimbursements may change even if the underlying reimbursements have not changed. For example, if a law made it less lucrative for a physician to be OON (e.g. because the physician could no longer balance bill patients), physicians with smaller OON reimbursements may sign contracts and join networks after the law. In this case, the mean OON reimbursement would increase even if all existing OON reimbursement levels stayed the same because of the elimination of low-reimbursement OON physicians from the OON reimbursement distribution. We address this composition issue by analysing network participation (i.e. the proportion of claims that are in-network) and combined prices and charges, regardless of whether the claim is in-network or OON.

For network participation, the outcome is an indicator for whether the claim was in-network. We only estimate average effects for each state, and do so using the standard TWFE model:

(3)$$Y_{jkt} = \alpha_j + \lambda_t + \mu D_{jt}^{l \ge 0} + \epsilon_{jkt}$$

where $Y_{jkt}$ is the in-network indicator for claim k in state j in quarter t, $\alpha_j$ is a state-specific effect, $\lambda_t$ is a quarter-specific effect, $D_{jt}^{l \ge 0}$ is an indicator that equals one for all providers in the treated state and all quarters after treatment, and $\epsilon_{jkt}$ is the idiosyncratic error.

For potential surprise billing episodes, in addition to the Sun and Abraham (2021) method described above, we also estimate the effect of each state law with the following model:

(4)$$Y_{ijkt} = \alpha_j + \lambda_t + \rho_k + \mu D_{jt}^{l \ge 0} + \epsilon_{ijkt}$$

where $Y_{ijkt}$ is the potential surprise billing indicator for episode i, $\alpha_j$ is a state-specific effect, $\lambda_t$ is a quarter-specific effect, $\rho_k$ is a plan-specific effect (e.g. a fixed effect for PPO, HMO, EPO, etc.), $D_{jt}^{l \ge 0}$ is an indicator that equals one for all episodes in the treated state and all quarters after treatment, and $\epsilon_{ijkt}$ is the idiosyncratic error. As in equation (1), the control group used in the estimation of equations (3) and (4) is the group of states that passed an elective surprise billing law in 2020 or 2021. For both network participation and potential surprise billing episodes, we calculate p values using the randomisation inference procedure described in MacKinnon and Webb (2020), in which placebo effects are estimated for each state in the control group (as if it were the treated state) and the p value is calculated as the proportion of placebo effects that exceed the treated state's effect in absolute value.
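The randomisation inference p value described in the preceding sentence can be sketched as follows (the placebo effects are assumed to have been estimated beforehand by re-running the model with each control state treated as if it had passed a law):

```python
def randomisation_p_value(treated_effect, placebo_effects):
    """Randomisation inference: the p value is the share of placebo
    effects at least as large in absolute value as the actual treated
    state's estimated effect."""
    exceed = sum(1 for e in placebo_effects if abs(e) >= abs(treated_effect))
    return exceed / len(placebo_effects)
```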

Although the data provide detailed information about each patient, provider and treatment episode, some information is not included. For instance, we cannot observe and control for the employment of hospital-based physicians by physician staffing companies. In addition, each claim includes only the insurer's allowed amount (i.e. the insurer's payment plus the patient's plan-based cost-sharing). We cannot observe the claim's specific charge, whether the OON physician sent a balance bill to the patient or, if so, how much was paid by the patient to the OON provider. Although we cannot observe the claim's specific charge, we can observe the provider's average charge for Medicare patients for each CPT and year, and we use this as a proxy for the provider's average charge to privately insured patients. The same approach has been used in prior papers that analysed charges for privately insured patients (Adler et al., 2019b).

4. Results

Figure 1 plots the average treatment effect for fully insured health plan anaesthesiology reimbursements by quarter using the Sun and Abraham (2021) estimation method. Roughly two years after implementation, in-network anaesthesiology prices fall by about 30 per cent on average in states that passed a surprise billing law relative to controls. The reduction is present in combined in-network and OON prices as well, so it is unlikely to be caused by a shift in the network composition of claims. A similar, but slightly smaller, reduction occurs for self-funded anaesthesiology prices, as seen in Figure 2 (footnote 22). Table 6 lists the state-level treatment effects for fully insured plans, as estimated by equation (2). The state-level anaesthesiology effects reveal substantial heterogeneity in reimbursement changes across states. For Maine, Minnesota and New York, there are large reductions in anaesthesiology prices relative to controls, although only the Maine estimate is statistically significant. Apart from Maine, there is little change in anaesthesiology prices relative to controls in the states using a regulated OON reimbursement. Overall, there is no change relative to controls for regulated payment states, as seen in Figure 3. Most of the reduction in anaesthesiology prices occurs in states using an IDR system (Figure 4). Table 7 lists the state-level anaesthesiology estimates for self-funded plans. The estimated effects are similar to the fully insured estimates, but smaller, and few are statistically significant.

Figure 1. Anaesthesiology price event study coefficients, fully insured plans.

Figure 2. Anaesthesiology price event study coefficients, self-funded plans.

Table 6. Fully insured price changes relative to controls

Synthetic controls.

Figure 3. Anaesthesiology price event study coefficients, regulated payment states.

Figure 4. Anaesthesiology price event study coefficients, IDR states.

Table 7. Self-funded price changes relative to controls

Synthetic controls.

For pathology and radiology prices, we find smaller effects than for anaesthesiology and, unlike anaesthesiology, the effect of the laws appears to be inflationary. As seen in Figure 5, fully insured OON pathology prices in treated states gradually increase by roughly 20 percentage points more than controls after implementation of the balance billing law, but this may reflect increased in-network participation by low-reimbursement OON physicians, because combined in-network and OON prices do not change much relative to controls. There is no relative change overall in pathology prices for self-funded health plans (Figure 6). For fully insured health plans, the pathology price changes are similar in regulated payment and IDR states (Figures 7 and 8). For radiology (see Figures 9 and 10), there is a roughly 10 percentage point increase in fully insured prices relative to controls three to four years after implementation, with a smaller increase for self-funded prices. As seen in Figures 11 and 12, this increase in fully insured radiology prices occurred in both regulated payment and IDR states. However, as seen in Tables 6 and 7, we generally do not find statistically significant changes at the state level for pathology or radiology.

Figure 5. Pathology price event study coefficients, fully insured plans.

Figure 6. Pathology price event study coefficients, self-funded plans.

Figure 7. Pathology price event study coefficients, regulated payment states.

Figure 8. Pathology price event study coefficients, IDR states.

Figure 9. Radiology price event study coefficients, fully insured plans.

Figure 10. Radiology price event study coefficients, self-funded plans.

Figure 11. Radiology price event study coefficients, regulated payment states.

Figure 12. Radiology price event study coefficients, IDR states.

Figures 13 and 14 plot the quarterly average treatment effects for anaesthesiology charges for fully insured and self-funded plans, respectively. We estimate a substantial reduction in anaesthesiology charges three to four years after law implementation for fully insured health plans, and a smaller, but still significant, reduction for self-funded plans. For pathology (see Figures 15 and 16), we do not observe any relative change in charges in surprise billing states overall, except for fully insured OON charges at the end of the estimation period; this exception is likely due to a change in the composition of OON claims, because there is no similar change in combined charges. State-level results for fully insured and self-funded plans are presented in Tables 8 and 9, respectively. As with the price estimates, there is substantial heterogeneity across states for fully insured plans, with some states (Arizona and Connecticut) seeing increases in anaesthesiology charges and others seeing decreases. Consistent with the overall results, the state-level anaesthesiology charge changes are smaller for self-funded plans.

Figure 13. Anaesthesia charge event study coefficients, fully insured plans.

Figure 14. Anaesthesia charge event study coefficients, self-funded plans.

Figure 15. Pathology charge event study coefficients, fully insured plans.

Figure 16. Pathology charge event study coefficients, self-funded plans.

Table 8. Fully insured charge changes relative to controls

Table 9. Self-funded charge changes relative to controls

As seen in Table 10, changes in network participation relative to controls were generally small and variable, with some exceptions. This likely reflects the fact that the vast majority of anaesthesiology, pathology and radiology claims are in-network (as seen in Tables 2–4), so there is little room for improvement. However, California experienced a relatively large increase in network participation for anaesthesiologists in fully insured plans, and New Jersey saw an increase for both fully insured and self-funded plans. The latter increase likely results from the ability of self-funded plans to opt in to the protections in New Jersey. California also saw an increase in pathology network participation across both types of health plans. Multiple states experienced reductions in anaesthesiology network participation for fully insured plans (and Florida experienced a 12 per cent reduction in network participation in self-funded plans).

Table 10. Change in network participation relative to controls

Regarding potential surprise billing episodes in elective situations, the state-level results largely mirror the changes in network participation, as seen in Table 11. California and New Jersey saw reductions in potential surprise billing episodes relative to controls for fully insured plans, with smaller, but still statistically significant, reductions for self-funded plans. The only sizeable, statistically significant increase in potential surprise billing episodes occurred in Florida for self-funded plans. Overall, across all states implementing a surprise billing law in this period, there was little change in elective potential surprise billing episodes, as seen in Figures 17 and 18, which plot the average quarterly change relative to controls in episodes that leave patients vulnerable to surprise bills for fully insured and self-funded plans, respectively. The reductions in potential surprise billing episodes three and a half years after implementation may reflect New York, the only state in this period to implement an elective surprise billing law prior to 2016. However, using randomisation inference, we do not find the reduction in surprise billing episodes in New York to be statistically significant.

Table 11. Change in surprise billing episodes relative to controls

Figure 17. Surprise medical bill episodes event study coefficients, fully insured plans.

Figure 18. Surprise medical bill episodes event study coefficients, self-funded plans.

5. Discussion

While patients rarely choose their anaesthesiologist, pathologist or radiologist, some of these ancillary physicians are OON and bill patients for the portion of their charges not paid by insurance, bills that patients do not expect and cannot avoid. Recognising the financial burden of surprise bills from OON ancillary physicians, many states passed laws outlawing the practice. We investigated the effects of these laws on physician reimbursement, network participation and potential surprise billing episodes using data from multiple health plans, including plans regulated by the state (i.e. fully insured plans) and those exempt from state regulation (i.e. self-funded plans).

Although we study more state surprise billing laws and estimate their effects on more outcomes than the prior literature, our results are largely consistent with it. Like La Forgia et al. (2021), we find that surprise billing laws led to reduced anaesthesiology reimbursements on average across all states that passed elective surprise billing laws in the 2010s. However, unlike La Forgia et al. (2021), we do not find statistically significant reductions in relative prices in California, New York or Florida. For New York, we observe a reduction in OON (and combined) anaesthesiology prices relative to controls, but these changes are not statistically significant. For New York at least, the difference between our findings and those of La Forgia et al. (2021) may be due to their use of TWFE with clustered standard errors and our use of synthetic controls with randomisation inference (following Abadie, 2021), which is a more conservative approach to inference (Kaestner, 2016; MacKinnon and Webb, 2020).

We also find an overall reduction in anaesthesiology charges after the implementation of the surprise billing laws, with substantial heterogeneity across states. Our results are not directly comparable to Gordon et al. (2022) due to differences in the data samples. However, unlike Gordon et al. (2022), we find no evidence of increases in charges in New York after its law. Our results are consistent with Adler et al. (2019a), as we also find an increase in network participation in California after the implementation of its surprise billing law protecting patients in elective situations.

Unlike the previous literature, we estimate the overall average effect of the laws on reimbursements and charges, accounting for staggered adoption and controlling for unobservable characteristics with a control group of states that passed laws after the end of our data sample. Whereas the overall effect of these laws on anaesthesiology prices tends to be deflationary, we find that the laws were generally inflationary for the other specialties, particularly radiology, leading to increased prices relative to controls.

We also estimate effects separately for fully insured and self-funded health plans. Unsurprisingly given that state laws only apply to fully insured health plans, we find larger effects for fully insured than self-funded health plans. However, we find evidence of spillover effects to self-funded health plans for prices, network participation and potential surprise billing episodes. This is noteworthy given the lack of protections for self-funded patients prior to the NSA and the widespread use of self-funded plans by large employers.

The heterogeneity of effects across states and specialties is puzzling. Anaesthesiology prices fell relative to controls, while radiology prices increased. Two factors may explain this discrepancy. First, relative to Medicare reimbursement, anaesthesiology prices are higher at baseline than pathology or radiology prices (Cooper et al., 2020). Second, Certified Registered Nurse Anesthetists have a large and growing presence in the United States (Wilson et al., 2021). We observe larger anaesthesiology price decreases in IDR states than in regulated payment states. Thus, it is possible that higher relative baseline prices and the large and growing presence of lower-cost anaesthesia providers have led arbiters in IDR states to view anaesthesiologists' bids less favourably than those of other specialties, producing a larger impact on OON reimbursements for anaesthesiologists than for other specialties as payers adjust to these rulings. This is consistent with the newly released NSA arbitration data, which show a relative disadvantage for anaesthesiologists compared with other specialties. Across all specialties in the NSA arbitration data, arbiters pick the provider's offer roughly 75 per cent of the time; anaesthesiologists, however, win 70 per cent of hearings, while radiologists prevail in 80 per cent of hearings (footnote 23).

The increase in network participation in California may be due to the cap on OON reimbursement at the average in-network price or 125 per cent of Medicare, whichever is greater. However, the increase in network participation in New Jersey was likely due to the elimination of the highly generous prior system that awarded OON providers their full charges. Overall, the state-level heterogeneity highlights that the effects of surprise billing laws are likely a function of not only their OON reimbursement structure, but prior protections, implementation and enforcement within the state.

Regarding the NSA, our results imply that its effects may take time to develop and will vary by state. Particularly in states using an IDR system like that of the NSA, we observed gradual changes in price relative to controls, suggesting that it took time for arbiters, clinicians and health plans to adjust to the IDR system. States with protections that predate the NSA can retain their laws (including their OON reimbursement systems) if they are at least as strong as the federal protections. Furthermore, enforcement of the NSA will be a mix of federal and state efforts (footnote 24). Thus, the NSA may have little impact in the states that retain their protections and unknown impacts in other states, depending on enforcement.

Acknowledgements

We thank Annetta Zhou and anonymous reviewers for their helpful suggestions.

Financial support

This research was funded by a grant from the Robert Wood Johnson Foundation.

Competing interests

Because the paper uses the Health Care Cost Institute's data, the paper was reviewed by the Health Care Cost Institute's Data Integrity Committee to check for material misrepresentations of the data. Christopher Garmon has received consulting fees from Compass Lexecon. Yiting Li and Wendy Yi Xu have no personal disclosures to report. Dr. Sheldon M. Retchin is a Director on the Board of Aveanna Healthcare, a for-profit public company that provides home and hospice care to adults and children in 33 states. As a director, Dr. Retchin receives fees and stock as compensation.

Data availability statement

If published, we will provide code and instructions for replication of all results. However, our results are estimated using restricted-access data from the Health Care Cost Institute that cannot be made publicly available. We will also provide instructions for applying for data access and assistance for any replication requests.

Appendix A

Table A1. Anaesthesiology claim processing counts, fully insured

Table A2. Anaesthesiology claim processing counts, self-funded

Table A3. Pathology claim processing counts, fully insured

Table A4. Pathology claim processing counts, self-funded

Table A5. Radiology claim processing counts, fully insured

Table A6. Radiology claim processing counts, self-funded

Table A7. Fully insured synthetic control leave-one-out min and max price changes

Table A8. Self-funded synthetic control leave-one-out min and max price changes

Table A9. Fully insured synthetic control leave-one-out min and max charge changes

Table A10. Self-funded synthetic control leave-one-out min and max charge changes

Table A11. Fully insured TWFE price changes relative to controls

Table A12. Self-funded TWFE price changes relative to controls

Table A13. Fully insured TWFE charge changes relative to controls

Table A14. Self-funded TWFE charge changes relative to controls

Figure A1. Anaesthesiology price event study coefficients, fully insured plans (robustness check, footnote 22).

Figure A2. Anaesthesiology price event study coefficients, self-funded plans (robustness check, footnote 22).

Footnotes

1 We use ancillary to distinguish hospital-based physicians who provide supportive services from hospital-based physicians who assume responsibility for direct care of the patient (e.g. emergency physicians, hospitalists). The use of the term ancillary is not meant to imply that these services are less important than other health care services.

2 Rosenthal E., After Surgery, Surprise $117,000 Medical Bill from Doctor He Didn't Know, New York Times, 21 September 2014, https://www.nytimes.com/2014/09/21/us/drive-by-doctoring-surprise-medical-bills.html

3 Analysing charges for radiologists is not feasible because charge data are available for too few radiologists.

4 AB-72: Health Care Coverage: Out-of-Network Coverage, accessed on 21 June 2023.

5 Map: No Surprises Act Enforcement, Commonwealth Fund, accessed on 21 June 2023.

6 An Act To Protect Maine Consumers from Unexpected Medical Bills, accessed on 21 June 2023.

7 Florida House of Representatives Final Bill Analysis for HB 221, pages 7–8, accessed on 6 April 2023.

8 ORS 743B.287 – Balance Billing Prohibited for Health Care Facility Services, accessed on 6 April 2023.

9 Giancola P, Arizona Enacts Surprise Balance Billing Law, Snell and Wilmer Blog, 8 June 2017, accessed on 21 June 2023.

11 2018 New Hampshire Revised Statutes, Title XXX – Occupations and Professions, Chapter 329 – Physicians and Surgeons, Section 329:31-b – Prohibition on Balance Billing; Payment for Reasonable Value of Services, accessed on 6 April 2023.

12 We analyse claims for physicians and non-physician professionals (e.g. nurse anaesthetists). We exclude claims from hospital facility services.

13 We exclude indemnity claims, claims for patients age 65 and above and secondary claims. For all three specialties, we also exclude claims with multiple claim lines, claims with allowed amounts that are negative or zero, claims that are missing the provider's masked NPI or network indicator and claims from non-US zip codes. For radiologists, we also exclude claims with multiple claim units, which are rare and likely represent miscodes. It is relatively common for pathologists to analyse multiple samples within a claim, but claim units rarely exceed 10, so we exclude pathology claims with units above 10 as likely miscoded claims. For pathologists and radiologists, we exclude claims that cannot be assigned relative value units. Anaesthesia payments are typically based on Medicare anaesthesia base units and conversion factors that are specific to anaesthesia services, so we exclude anaesthesiologist claims that cannot be assigned anaesthesia base units or conversion factors. Furthermore, for anaesthesiologists, we exclude supervision claims, claims where the units are less than one or greater than 100, claims in which the allowed amount is less than a dollar and claims below the 5th percentile or above the 95th percentile of allowed amount per unit. Tables A1 through A6 in the appendix list the claim counts after each stage of data processing for each specialty, separately for fully insured and self-funded claims.

15 See the American Society of Anesthesiologists' Payment Basics Series.

18 The estimation is implemented using the eventstudyinteract package in Stata.

19 All state-level predictors are taken from Kaiser State Health Facts. For self-funded outcomes, we use the HHI of the large group market. For fully insured health plans, we use the HHI of the small group market. To avoid over-fitting outcomes, we only use the pre-treatment outcomes from every other year in the matching algorithm. We use the synth package in Stata to calculate the synthetic control weights.

20 The states that never passed balance billing protections are Alabama, Alaska, Arkansas, the District of Columbia, Hawaii, Idaho, Kansas, Kentucky, Louisiana, Montana, North Dakota, Oklahoma, South Carolina, South Dakota, Tennessee, Utah, Wisconsin and Wyoming. Nevada (in 2020) and Nebraska (in 2021) passed surprise billing laws, but the laws only applied to emergency services. Thus, Nevada and Nebraska are also included in the pool of potential synthetic controls, but not in the controls for the estimation of equation (1).

21 P values for the TWFE model are calculated using the randomisation inference procedure described in MacKinnon and Webb (Reference MacKinnon and Webb2020).

22 As a robustness check, we estimated the quarterly average treatment effects after excluding anaesthesiology claims in which the ratio of the allowed amount to the Medicare price is less than one; these estimates are displayed in appendix Figures A1 and A2.

23 NSA Independent Dispute Resolution Reports, https://www.cms.gov/nosurprises/policies-and-resources/Reports?s=09 (accessed on 28 May 2024).

24 O'Brien, Madeline, ‘Map: No Surprise Act Enforcement’, The Commonwealth Fund, February 2022, https://www.commonwealthfund.org/publications/maps-and-interactives/2022/feb/map-no-surprises-act

References

Abadie, A (2021) Using synthetic controls: feasibility, data requirements, and methodological aspects. Journal of Economic Literature 59, 391–425. doi: 10.1257/jel.20191450.
Adler, L (2019) Experience with New York's arbitration process for surprise out-of-network bills.
Adler, L, Duffy, E, Ly, B and Trish, E (2019a) California saw reduction in out-of-network care from affected specialties after 2017 surprise billing law.
Adler, L, Lee, S, Hannick, K and Duffy, E (2019b) Provider charges relative to Medicare rates, 2012–2017. https://www.brookings.edu/articles/provider-charges-relative-to-medicare-rates-2012-2017/
Chartock, BL, Adler, L, Ly, B, Duffy, E and Trish, E (2021) Arbitration over out-of-network medical bills: evidence from New Jersey payment disputes. Health Affairs 40, 130–137. doi: 10.1377/hlthaff.2020.00217.
Chhabra, KR, Sheetz, K, Nuliyalu, U, Dekhne, M, Ryan, A and Dimick, J (2020) Out-of-network bills for privately insured patients undergoing elective surgery with in-network primary surgeons and facilities. JAMA 323, 538–547. doi: 10.1001/jama.2019.21463.
Cooper, Z, Nguyen, H, Shekita, N and Scott-Morton, F (2020) Out-of-network billing and negotiated payments for hospital-based physicians. Health Affairs 39, 24–32. doi: 10.1377/hlthaff.2019.00507.
Dixit, AA, Heavner, D, Baker, L and Sun, E (2023) Association between ‘balance billing’ legislation and anesthesia payments in California: a retrospective analysis. Anesthesiology 139, 580–590. doi: 10.1097/ALN.0000000000004675.
Duffy, EL, Adler, L, Chartock, B and Trish, E (2022) Dispute resolution outcomes for surprise bills in Texas. JAMA 327, 2350–2351. doi: 10.1001/jama.2022.5791.
Garmon, C, Li, Y, Retchin, S and Xu, W (2024) The impact of surprise billing laws on emergency services. Health Economics 33(11), 2450–2462. doi: 10.1002/hec.4874.
Gordon, AS, Liu, Y, Chartock, B and Chi, W (2022) Provider charges and state surprise billing laws: evidence from New York and California. Health Affairs 41, 1316–1323. doi: 10.1377/hlthaff.2021.01332.
Kaestner, R (2016) Did Massachusetts health care reform lower mortality? No according to randomization inference. Statistics and Public Policy 3, 1–6. doi: 10.1080/2330443X.2015.1102667.
La Forgia, A, Bond, A, Braun, R, Kjaer, K, Zhang, M and Casalino, L (2021) Association of surprise-billing legislation with prices paid to in-network and out-of-network anesthesiologists in California, Florida, and New York: an economic analysis. JAMA Internal Medicine 181, 1324–1331. doi: 10.1001/jamainternmed.2021.4564.
MacKinnon, JG and Webb, MD (2020) Randomization inference for difference-in-differences with few treated clusters. Journal of Econometrics 218, 435–450. doi: 10.1016/j.jeconom.2020.04.024.
Mattke, S, White, C, Hanson, M and Kotzias, V (2017) Evaluating the impact of policies to regulate involuntary out-of-network charges on New Jersey hospitals. Tech. rep. https://www.rand.org/pubs/periodicals/health-quarterly/issues/v6/n4/07.html
Nikpay, S, Nshuti, L, Richards, M, Buntin, M, Polsky, D and Graves, J (2022) Geographic variation in hospital-based physician participation in insurance networks. JAMA Network Open 5, e2215414. doi: 10.1001/jamanetworkopen.2022.15414.
Sun, L and Abraham, S (2021) Estimating dynamic treatment effects in event studies with heterogeneous treatment effects. Journal of Econometrics 225, 175–199. doi: 10.1016/j.jeconom.2020.09.006.
Sun, EC, Mello, M, Moshfegh, J and Baker, L (2019) Assessment of out-of-network billing for privately insured patients receiving care in in-network hospitals. JAMA Internal Medicine 179, 1543–1550. doi: 10.1001/jamainternmed.2019.3451.
Wilson, LA, Poeran, J, Liu, J, Zhong, H and Memtsoudis, S (2021) State of the anaesthesia workforce in the United States: trends and geographic variation in nurse anaesthetist to physician anaesthesiologist ratios. British Journal of Anaesthesia 126, e19–e21. doi: 10.1016/j.bja.2020.10.001.