After Florida Politics published the Saltshaker story on the recent FAU/Mainstreet poll, representatives from Mainstreet Research contacted us and asked us to retract the whole story.
No, we're not doing that.
Instead, we'll address some of their concerns and offer them as corrections to the original.
Here are Mainstreet's stated concerns, which they had the good grace to send in writing:
"Pollsters 'reviews' of other polls aren't allowed in the industry for obvious reasons"
We honestly don't know what that means. "Aren't allowed" by whom? By the American Association of Public Opinion Research (AAPOR)?
We checked with an attorney, who knows of no such legal prohibition.
And what are those "obvious reasons"? That we don't want other professionals critiquing our work?
In our opinion, pollsters should critique each other; we should rely on each other to ensure we get it right.
The general public often fails to grasp the nuances of polling, such as parametric statistics, stratified sampling, and the importance of methodologies to a poll's validity and reliability.
With that in mind, we do believe peers should review peers, and that we should police ourselves and hold each other accountable.
That is our opinion, and we stand by that regardless of what AAPOR says.
"The registered voter file in Florida is available to pollsters, that was the frame that was used" and "Even though we use the registered voter file, we still screen for registered voters."
Good. Correction noted (sort of).
Mainstreet's actual methodology states, "among a sample of 878 adults, 18 years of age or older."
Those are the numbers we relied on.
Nowhere in the public document posted on the FAU website does it say that Mainstreet only interviewed registered voters taken from the Florida voter file. That was important information we were relying on.
In response to the column, they are now telling us they did rely only on registered voters. We will take their word for it and note that this is a great improvement.
Kudos!
This is also a good time to reference the AAPOR standards:
"Accordingly good professional practice imposes the obligation upon all public opinion and survey researchers to disclose sufficient information about how the research was conducted to allow for independent review and verification of research claims … "
We agree and believe that kind of critical information should have been shared in the methodology, and we suggest, as AAPOR does, that they include this vital information in future releases.
We (wrongly) assumed that, given the lack of clarity in the methodology and the fact that this poll was conducted using both robocalls (IVR) and an online panel, they talked to "adults," as they said in their supporting documents.
Why is it so important?
Online panels too often contain avatars, and it is difficult to ensure that an actual voter is the respondent. Saying you polled "adults" versus saying you polled verified voters is a very important distinction.
With that, we stand corrected on our observation that they polled "adults" and not just registered voters.
However, we are still unsure why most questions did not have 878 respondents, why some had more than 100 fewer (771), and why many had 883 (a number larger than the total interviewed; we have no idea how that could even happen).
The presumption we operated under, which we believe is fair, is that the smaller tallies were of registered voters. But in reaching out to us, the author says those are frequent voters, which is why some questions have a lower number. Fair enough, but nowhere is this noted in the methodology, and the report sometimes tallies all respondents and other times the 771 without explanation.
We trust the author's assertion that it's "likely voting adults" rather than the overall sample of "registered voters."
Not sure what can (or should) be "corrected," so let's just say it's duly noted.
"The poll is weighted by past vote, not party ID" and "The party ID is NOT registered for, it's what party do you identify with."
This is where we stand our ground — and won't budge.
First, it's a mistake to ask about party this way ("Which party do you identify with?") rather than relying on actual voter registration.
Why do we feel this way? One simple example comes to mind. Back in the mid-2000s, tens of thousands of registered Democrats in the Panhandle voted with, and identified with, Republicans. We called them "Reagan Democrats" or "Dixiecrats." With closed Primaries and other factors, knowing someone's actual voter registration, not just how they "self-identify," is vital to getting a properly balanced sample.
Another example: a large share of Florida voters (nearly 30%) choose not to belong to either major party, so asking them which party they identify with can also lead to problems. Case in point: in this poll, only 17.7% of the likely voting respondents "identify" as "Independent."
We believe there is almost no chance that the 2024 electorate will be less than 20% NPA and minor-party voters; projected turnout suggests that number will be at least 22-24%.
Second, we also stand by our analysis that an overall sample with more Democrats than Republicans is flawed and would lead to skewed results.
Further, weighting to a +1 GOP model (based on what they have told us about the likely voter subsample: 771 respondents, including 314 Democrats and 321 Republicans) moves the ball slightly in the right direction, but it is our strong belief that, given the dramatic shift in voter registration combined with 2022 turnout numbers, these results still incorrectly favor the Democratic candidates.
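To spell out where "+1 GOP" comes from (a quick back-of-the-envelope check using only the subsample figures above):

$$\frac{321}{771} - \frac{314}{771} = \frac{7}{771} \approx 0.9\ \text{points} \quad\Rightarrow\quad \text{roughly a "+1 GOP" sample}$$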
Since the 2022 election, Republicans have expanded their net registration lead by over 550,000 and it continues to grow. And in looking at the 2022 actual turnout*, it was a +12 GOP one! A far – FAR – cry from the +1 GOP model used in this poll.
*In 2022, 2,577,635 Democrats and 3,514,830 Republicans voted. Therefore, the electorate was composed of 33.3% Democrats and 45.5% Republicans. That 12-point gap is how we come up with a +12 GOP model. How anyone, let alone a professional pollster, could see those publicly available reports and say Democrats will dramatically reverse course (even as the voter registration gap widens) and drop back to a +1 GOP model is not only unsupported by the data but beyond our level of understanding.
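For readers who want to check that math, here is the sketch; the total-turnout denominator (roughly 7.73 million ballots) is our back-calculation from the two percentages, not a figure taken from the poll report:

$$\text{Dem share} \approx \frac{2{,}577{,}635}{7{,}730{,}000} \approx 33.3\%, \qquad \text{GOP share} \approx \frac{3{,}514{,}830}{7{,}730{,}000} \approx 45.5\%$$

$$45.5\% - 33.3\% \approx 12\ \text{points} \quad\Rightarrow\quad \text{a +12 GOP electorate}$$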
Mainstreet's final concern is that "Allen Grayson [sic] withdrew from the Primary after the field dates of the poll."
Yes, he did. And the column clearly acknowledged that.
This is how it was worded in the story: "The poll also surveys respondents on U.S. Sen. Rick Scott's re-election prospects, and the Democratic Primary to challenge him" (though it's Alan Grayson in the poll and, as we now know, he's not running in that race anymore).
We are simply not sure what the concern is, as Grayson's exit is noted (see above) in the story.
With that in mind, if we get something wrong, we are more than willing to correct our errors.
That said, we stand by our opinions and still suggest readers treat this poll with, well, you know ...