Member Research & Reports
CUNY Faculty Studies Demographic and Sexual Behavioral Factors by Recruitment Modality

Dr. Christian Grov, professor at the CUNY Graduate School of Public Health and Health Policy, and colleagues compared self-reported demographic and sexual behavioral factors among men who have sex with men recruited through Amazon Mechanical Turk (MTurk), Qualtrics, and an HIV/sexually transmitted infections (STI) clinic-based sample, and discussed the implications for researchers and providers. The findings were published in the journal Archives of Sexual Behavior.

[Photo: Dr. Christian Grov]

Recruitment for HIV research among gay, bisexual, and other men who have sex with men has increasingly moved to the online sphere. However, there are limited data comparing the characteristics of clinic-based respondents versus those recruited via online survey platforms.

For this study, men who have sex with men were recruited from three sampling sites (an STI clinic, MTurk, and Qualtrics) to complete a survey between March 2015 and April 2016. Respondents from the three sampling sites were compared on demographics, sexual history, substance use, and attention filter passage.

Attention filter passage was high for the online sampling sites (MTurk = 93 percent; Qualtrics = 86 percent) but significantly lower for the clinic-based sampling site (72 percent). Clinic-based respondents were significantly more racially/ethnically diverse, reported lower income, and reported more unemployment than online respondents. Clinic-based respondents also reported significantly more male sexual partners in the previous three months (clinic-based = 6; MTurk = 3.6; Qualtrics = 4.5), a higher proportion of gonorrhea, chlamydia, and/or syphilis diagnoses in the past year, and greater proportions of methamphetamine use (clinic-based = 21 percent; MTurk = 5 percent) and inhaled nitrite use (clinic-based = 41 percent; MTurk = 11 percent). Compared with the online samples, the clinic-based sample demonstrated more demographic diversity and a greater proportion of HIV risk behaviors, but also a relatively low attention filter passage rate.

The research team recommended using attention filters across all recruitment modalities to assess response validity, and urged caution when relying on online survey platforms, as those samples may differ demographically and behaviorally from clinic-based respondents.