
Misrepresenting Colorado Marijuana

(PYMCA/UIG via Getty Images)

* There is no Colorado “survey,” and no capacity to “represent” Colorado youth.
* The sample represents no more than the kids who participated.
* Media reported youth use “flat,” but steep increases were nonetheless widespread.
* __Colorado youth marijuana use cannot be “below the national average”; Colorado youth have the highest rate of marijuana use in the nation.__
* __The survey response rate, only 46 percent, was inadequate and, crucially, below the threshold set by the Centers for Disease Control.__
* The only “lesson” about legalization is a warning sign.

What is wrong with the marijuana legalization debate, and who is responsible for its sorry state? No better example of misdirection can be offered than the results of a recent Colorado poll. Because of the public health stakes for the nation’s youth, getting this right is essential.

The 2015 iteration of the Healthy Kids Colorado Survey (HKCS), which polled both middle school and high school students, garnered a tremendous amount of news coverage. Since it is the first state-level estimate taken after marijuana legalization (accelerated by the start of retail sales in January 2014), much of the reporting concerned the possible impact, compared with the previous HKCS taken in 2013.

While not without flaws, the study is an interesting snapshot of youth health concerns, and it legitimately alerts us to some genuine problem areas for marijuana use. But the media reporting was appalling, and the HKCS did little to prevent misunderstanding.

Media Advocacy

Uniformly, the media described good news for marijuana legalization advocates. Most coverage (in the Washington Post, the Denver Post, and on Fox News) reported that, compared to two years prior, marijuana use was “flat” because the change was not “statistically significant.”

Yet even a flat outcome is surprising, not least because other national surveys have disclosed alarming increases in Colorado marijuana use among adults and young adults, with rates rising some 99 percent over ten years (from 7.5 percent to 14.93 percent).

Moreover, the regional breakdowns showed major increases since 2013 in past-month use among juniors and seniors, with the increase in some regions running between 50 and 90 percent.

But the survey combined those results with younger grades to produce an overall mean (an ill-advised methodology without weightings), so officials were reported to have declared no statistically significant change in marijuana prevalence.
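To see how unweighted pooling can mislead, consider a minimal sketch with invented numbers (not actual HKCS figures): suppose high school use rises steeply while middle school use stays flat, but the 2015 sample happens to contain a larger share of middle schoolers. The unweighted pooled mean can then appear flat, or even fall.

```python
# Minimal sketch, invented numbers only (not HKCS data): an unweighted
# pooled mean can hide a steep rise concentrated in the upper grades.

rates = {  # assumed true past-month use by school level
    2013: {"middle": 0.04, "high": 0.20},
    2015: {"middle": 0.04, "high": 0.26},  # +30 percent among high schoolers
}
sample_mix = {  # assumed share of respondents by level (shifts across years)
    2013: {"middle": 0.40, "high": 0.60},
    2015: {"middle": 0.60, "high": 0.40},
}
population_mix = {"middle": 0.45, "high": 0.55}  # assumed stable population

for year in (2013, 2015):
    raw = sum(rates[year][g] * sample_mix[year][g] for g in rates[year])
    weighted = sum(rates[year][g] * population_mix[g] for g in rates[year])
    print(f"{year}: unweighted mean {raw:.1%}, population-weighted {weighted:.1%}")

# 2013: unweighted mean 13.6%, population-weighted 12.8%
# 2015: unweighted mean 12.8%, population-weighted 16.1%
# The unweighted mean falls even though high school use rose 30 percent.
```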

Predictably, the results were treated as a report card on legalization, and the media seized on the purported lesson: no rising rates, hence no worries.

There was even more confusion (or worse) in some media. Both Time magazine and Scientific American ran articles claiming that the survey had indeed found change after all. But remarkably, they reported that the new results showed marijuana use had “dipped” from 2013. Of course, in the absence of statistical significance, this claim is simply wrong. If the non-significant outcome cannot be up, it certainly cannot be down.

There is worse in store. The media took “not significant” to mean that the changes were not sufficiently large. But as we shall see, what actually happened is that the survey did not, methodologically, produce an outcome capable of being statistically significant, as a true weighted probability survey would. That is a very different matter.

What they seem to have produced is a (partial) census of students, among whom marijuana use is variously steeply up or, on occasion, down, depending on grade and geography. But such a survey does not fulfill the necessary criteria for a probability sample.

What National Average?

A further question is, to what are the results being compared? Both the HKCS report itself, and the media, compared the outcome of HKCS 2015 not only to HKCS 2013, but to what they termed the “national average” of youth marijuana use.

According to the reports, 21.2 percent of Colorado teens were past-month marijuana users in 2015. The “national rate” of pot use by youth was reported as higher, at 21.7 percent.

What is the possible source for that “national average”? There is one genuinely national survey of youth drug use covering all states: the National Survey on Drug Use and Health (NSDUH). But it cannot be the basis for the claim. In its latest estimates, for 2014, the NSDUH reported that 7.2 percent of adolescents aged 12 to 17 across the nation used marijuana in the past month – that figure, not 21.7 percent, would be the youth “national average.”

Moreover, the NSDUH specifically declared that Colorado had the nation’s highest rates. Adolescent marijuana use ranged from 4.98 percent in Alabama to 12.56 percent in Colorado.

Worse, the NSDUH showed a stunning 27 percent rise in youth use from 2009, when medical marijuana took off in Colorado, through 2014 (from 9.91 percent to 12.56 percent). So Colorado youth use rates in the NSDUH are not only higher than the national average but, after freer access to marijuana, have been climbing steeply.
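For readers who want to check the arithmetic, both relative changes cited in this piece follow from the standard percent-change calculation; a quick verification:

```python
# Verifying the percent changes cited in the text.
def pct_change(old: float, new: float) -> float:
    """Relative change from old to new, as a percentage."""
    return (new - old) / old * 100

print(f"{pct_change(7.50, 14.93):.0f}%")   # adults/young adults: 99%
print(f"{pct_change(9.91, 12.56):.0f}%")   # Colorado youth, 2009-2014: 27%
```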

There is also Monitoring the Future (MTF), a school survey (8th, 10th, and 12th grades) produced by the National Institute on Drug Abuse. In 2014 the MTF showed 21.2 percent of students reporting past-month marijuana use. But that rate applied only to seniors in the survey, while the HKCS results were supposed to represent all grades. (The rate for 8th graders in the MTF stood at only 6.5 percent.)

Apparently worried about such data contradictions, the Washington Post sought to mitigate concern by pointing out that the HKCS had a very large in-state sample of 17,000 kids. Says the Post: “That much larger sample could produce a more accurate estimate than the smaller numbers in the federal drug survey.”

But the Post's maneuver only magnifies the problems. The HKCS is modeled on the Youth Risk Behavior Survey (YRBS), conducted every two years by the Centers for Disease Control, and the YRBS (which covers most, but not every, state) also has a sample that estimates a national average.

Critically, however, the YRBS has nowhere near 17,000 youth per state in its national sample. Its national estimate is based on a total of 13,600 responses for the entire nation; that would average just a few hundred kids per state if evenly distributed. Moreover, the YRBS surveys youth in the 9th through 12th grades only. Hence, the YRBS is not really comparable to what happened in Colorado.

Likely Missed Most At-Risk Youth

The fact that the HKCS and the YRBS are both school-based surveys (that is, not taken in households, as is the NSDUH) may account for some of the difference in their respective magnitudes – school surveys traditionally produce higher figures. But this fact raises questions about the validity of any school-based survey.

School-based surveys cannot capture youth who are not in school. Given that marijuana use itself is strongly associated with dropping out (as well as with high rates of absenteeism), the population of interest may not have been included in a representative fashion. Further, the rising number of young homeless marijuana users flooding into Colorado shows another side of the problem; they are not captured in school surveys either.

That is, the HKCS survey might have systematically missed exactly the kids most at risk of using marijuana. Of course, if true, the same effect might have touched the HKCS 2013 results that formed the contrast. However, if marijuana use had increased substantially since legalization, there could have been a differential impact on student attrition by 2015, as the situation worsened.

Response Rates Undermined Validity

The 2013 survey’s “demographics” breakdown shows that the earlier iteration had an even larger sample than the 2015 run. The 2013 sample comprised 40,000 youth, who answered at a response rate of 58 percent. But the 2015 sample of 17,000 registered a truly dismal 46 percent response rate. This is a real problem.

First, when fewer than half of the sample responds, there is a risk that those who did answer are not representative of the actual youth total. Second, that drop in response could signal that there were more youth at risk in the 2015 sample, and hence not present in the classroom during the second round.

Third, these data also show the non-comparability of the two surveys. The 58 percent response in the 2013 survey, with a much larger sample, stands in genuine contrast with the lower 46 percent response rate and smaller initial sample for 2015. For comparison, the CDC YRBS has an 88 percent student response rate, while the NSDUH stands at 71.2 percent.
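How much uncertainty does a 46 percent response rate introduce? A hypothetical sensitivity check (the rates assumed for the nonrespondents are invented for illustration) shows how wide the range of plausible true figures becomes:

```python
# Hypothetical sensitivity check; the nonrespondent rates are invented.
response_rate = 0.46   # the 2015 HKCS student response rate
observed_rate = 0.212  # reported past-month use among respondents

# The true overall rate blends respondents with the unseen nonrespondents.
for nonresp_rate in (0.15, 0.212, 0.30, 0.40):
    true_rate = (response_rate * observed_rate
                 + (1 - response_rate) * nonresp_rate)
    print(f"nonrespondents at {nonresp_rate:.1%} -> overall {true_rate:.1%}")

# Output runs from roughly 17.9% to 31.4%. If the absentees and dropouts
# the survey missed use marijuana at higher rates, the published figure
# understates the truth; the survey alone cannot tell us which scenario holds.
```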

Here the point made by the Post about larger samples can be turned back on it: surely the smaller 2015 HKCS would be less accurate than the previous iteration, by the Post's own logic. Moreover, the surveys are non-comparable not only because of sample size and response rates, but also because a different set of schools was included in the 2015 iteration, omitting, for instance, some large school districts near urban areas, where rates of use are often higher.

Given such differences, the two sets of results cannot meaningfully be put side by side. And as we have seen, because the HKCS study was unique to Colorado, there is no methodologically comparable “national average” against which comparisons could be made. Hence, there is no lesson to be derived regarding the impact of legalization; certainly not one sufficiently robust to counter the worrisome NSDUH data to the contrary.

But here’s the most devastating problem of all. The official YRBS, run every two years, requires a 60 percent “participation” rate in order to generate valid weightings for the results. When a state survey (such as the HKCS) cannot be weighted, it cannot yield an estimate that is representative of the state population. As the CDC participation map shows, Colorado did not receive a proper weighted sample.

This means that the HKCS, by the criteria offered by the CDC, cannot be used to represent all students in Colorado. __According to the CDC criteria, these results cannot be extrapolated beyond the participants themselves, and therefore “stand for” no one but the kids who participated.__ As such, there can be no statistically significant comparison between these results and previous years, nor with what was termed a “national sample.”
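How strict is that gate? A minimal sketch: the 60 percent threshold is the CDC’s documented requirement, and combining the school and student response rates by multiplication is my reading of the YRBS convention for the “overall” rate.

```python
# Sketch of the CDC weighting gate. The 60 percent threshold is documented;
# multiplying school and student response rates to get the overall rate
# follows the YRBS convention as I understand it.
THRESHOLD = 0.60

def can_weight(school_rate: float, student_rate: float) -> bool:
    """True if the overall response rate permits weighted, representative results."""
    return school_rate * student_rate >= THRESHOLD

# Even a perfect school response rate cannot rescue a 46 percent
# student response rate.
print(can_weight(school_rate=1.00, student_rate=0.46))  # False (overall 0.46)
print(can_weight(school_rate=0.80, student_rate=0.88))  # True  (overall 0.704)
```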

Correction Needed

And yet the media were allowed (where they were not encouraged) to run the “top line” results as though they stood for Colorado youth. And to declare the results “better” following legalization. And to declare use in Colorado post-legalization to be below the national average. As we have seen, none of these statements is warranted.

In fact, we now learn of other states that did not participate in the latest 2015 YRBS round. They include Minnesota but, more importantly, both Oregon and Washington, states that have recently legalized marijuana – that is, states that could have provided a report card on legalization, but about which we will learn even less than we learned about Colorado. (The next NSDUH state-level estimates won’t come until 2017, post-election.)

This apparent coincidence does little to allay concerns that we are witnessing the effects of a pro-marijuana agenda, perhaps advanced by well-meaning state boosters and, more surely, by their enablers in the media. Given that other states are looking to Colorado to understand their own legalization risks, it is important that the record be corrected.