A new study published online ahead of print in the journal Tobacco Control purports to demonstrate that a smoke-free bar and restaurant law implemented in São Paulo, Brazil, in August 2009 resulted in an 11.9% decline in the heart attack death rate for the first 17 months after the law was in effect (through December 2010).
The paper used a time-series analysis to compare the monthly rate of heart attack deaths prior to the smoking ban to the rate after the ban was implemented. The baseline period was January 2005 through July 2009. The implementation period was August 2009 through December 2010. Thus, the researchers had data for approximately 5 1/2 years before the ban and for 17 months after the ban.
The paper concluded: "In this study, a monthly decrease of almost 12% was observed in mortality rate for myocardial infarction in the first 17 months after the enactment of the comprehensive smoking ban law in São Paulo city."
The methods used in the study are quite complex and are summarized as follows: "We performed a time-series study of monthly rates of mortality and hospital admissions for acute myocardial infarction from January 2005 to December 2010. The data were derived from DATASUS, the primary public health information in Brazil and from Mortality Information System (SIM). Adjustments and analyses were performed using the Autoregressive Integrated Moving Average with exogenous variables (ARIMAX) method modelled by environmental variables and atmospheric pollutants to evaluate the effect of smoking ban law in mortality and hospital admission rate. We also used Interrupted Time Series Analysis (ITSA) to make a comparison between the period pre and post smoking ban law."
In simple terms, the investigators compared the trend in heart attack deaths in São Paulo before the smoking ban to the trend in heart attack deaths in São Paulo after the smoking ban. They concluded that there was an 11.9% drop in the heart attack death rate in the 17 months following the implementation of the smoking ban.
The investigators attributed the observed decline in heart attack deaths to a reduction in secondhand smoke exposure, citing evidence that just 30 minutes of exposure to secondhand smoke can cause a heart attack.
The Rest of the Story
To demonstrate the blatant bias in the reporting of the study results, simply take a look yourself at the actual data from the study. Below, I have plotted the data from Table 2 (Monthly number of deaths for myocardial infarction, city of São Paulo, Brazil, January 2005 to December 2010), but I have summed the monthly numbers to yield annual figures, which smooths the data, making it much easier to inspect visually.
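The aggregation step is trivial and anyone can reproduce it from the paper's Table 2. Here is a minimal sketch using invented monthly counts (the real numbers are in the paper):

```python
# Minimal sketch of the monthly-to-annual aggregation described above.
# The monthly counts here are HYPOTHETICAL placeholders; the real
# figures come from Table 2 of the paper.
from collections import defaultdict

# (year, month) -> number of heart attack deaths that month
monthly = {(2009, m): 500 + m for m in range(1, 13)}
monthly.update({(2010, m): 520 + m for m in range(1, 13)})

annual = defaultdict(int)
for (year, _month), deaths in monthly.items():
    annual[year] += deaths

# Each annual figure is the sum of that year's 12 monthly counts,
# which smooths month-to-month noise before plotting.
```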
The smoking ban went into effect in August 2009. You can easily see from the figure that in 2010, there was a striking increase in the number of heart attack deaths, which reached an all-time high for the study period.
Somehow, it appears that all this fancy modeling yielded a completely spurious result. This is why I teach my students to always start out by looking at the actual data. When you put the data into a fancy statistical model, strange things can happen. You always need to make sure that the results of a statistical model are consistent with what you are observing visually when you look at the data. If there is a major inconsistency, as in this case, then you must suspect that something is wrong: namely, that the statistical technique is for some reason not modeling the data correctly. It is also possible that the data are wrong. But clearly, something is wrong here.
Here, an examination of the actual data reveals that there is absolutely no basis to conclude that the smoking ban resulted in an 11.9% decline in heart attack deaths.
But why did nobody see this? It's difficult to believe that the authors didn't see it, the reviewers didn't see it, and the journal editorial team didn't see it. This should in fact be the first thing that everybody looks at. Even if you just look at the data in Table 2 without plotting it out, it is immediately apparent that there was a striking increase in heart attack deaths in 2010, wiping out the possibility that the smoking ban led to a large and sustained decline in heart attack deaths through 2010.
It appears that either nobody looked at the actual data or that they looked but ignored it. Either way, this demonstrates a severe bias on the part of the investigators, reviewers, and editorial team. Had the study found no effect of a smoking ban, you can rest assured that everyone would have pored over the paper for hours, trying to find some explanation for why the results came out "wrong." But here, since the results were "right" (that is, favorable), it appears that there was no desire to sincerely "review" the paper.
Finally, it is critical to mention that even if the paper had found a decline in heart attack deaths in 2010, this would not justify the conclusion that the smoking ban caused a decrease in heart attacks. The critical and fatal methodological flaw of this paper is that there is no comparison group. It is very possible that heart attack death rates were declining during the study period anyway, even in the absence of smoking bans. We actually know this to be the case from abundant international data. To conclude that the smoking ban had an effect on heart attacks, one would need to first control for secular trends in heart attack mortality that were occurring anyway, independent of the smoking ban. The paper could easily have done this by including some comparison group -- such as a nearby city, the county, the state, or the country. But there needs to be some control for secular trends.
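The comparison-group adjustment described above amounts to a difference-in-differences calculation. The sketch below uses invented rates for São Paulo and a hypothetical comparison city to show how a naive before/after drop can overstate the effect when a secular decline is underway everywhere:

```python
# Hedged sketch of the comparison-group adjustment suggested above:
# a simple difference-in-differences on INVENTED pre/post rates.
def diff_in_diff(treated_pre, treated_post, control_pre, control_post):
    """Change in the treated city minus the secular change in the
    comparison city. Under the parallel-trends assumption, the
    result is the portion of the change attributable to the ban."""
    return (treated_post - treated_pre) - (control_post - control_pre)

# Hypothetical rates per 100,000: the comparison city declines by 0.8
# with no ban at all, so the naive 1.0 drop in the treated city
# overstates the ban's effect.
effect = diff_in_diff(treated_pre=10.0, treated_post=9.0,
                      control_pre=10.4, control_post=9.6)
# Only -0.2 of the naive -1.0 change survives the adjustment.
```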
Thus, even if this paper had convincingly demonstrated that there was a decline in heart attacks in São Paulo after the smoking ban, it would not have been valid to conclude that this was a causal effect. Without a comparison group, this study is as good as worthless.
When the tobacco industry used to put out studies like this to show that smoking bans cause massive losses of revenue for restaurants, we attacked them for conducting time series analyses without using an appropriate control group. Now it appears that we are doing the same thing ourselves. This certainly has the appearance of a severe bias: results that are "favorable" are correct and the methodology is automatically valid, while results that are "unfavorable" are incorrect and the methodology must be attacked.
Perhaps one of my great frustrations of 2016 is the way in which science has largely disappeared from the public policy agenda. Decisions are being made almost completely on political grounds. It is a bad enough state of affairs that we in public health don't need to contribute to it. Even though we are working for worthwhile causes, we cannot loosen our insistence on rigorous science. Once we do that, then we're really sinking to the level of our opponents.
Finally, I should make it clear that any failure of this study to detect an immediate decline in heart attack deaths does not affect my support for smoke-free bar and restaurant laws. It's just that in promoting such laws, I believe we need to rely upon solid scientific data, not hocus pocus that comes out of some complex statistical model that no one really understands and which ends up completely misrepresenting the actual data -- data that one can see with one's own eyes.