

Search results

9 items matching your search terms.
Haltiwanger and Abraham comment on missing data
Surveys could be asking the wrong questions
Located in News
The Women's Empowerment: Data for Gender Equality (WEDGE) project underway
The WEDGE advisory board meeting discussed generating cross-culturally comparable data
Located in News
Normative Beliefs Regarding the Employment of Parents of Pre-School Children: A National Survey Experiment
Jerry A. Jacobs, Professor, Department of Sociology, University of Pennsylvania
Located in Coming Up
Using propensity scores for causal inference with covariate measurement error
Faculty Associate Frauke Kreuter's project, an R01 funded by the National Institute of Mental Health, seeks to develop and assess new statistical methods
Located in Research / Selected Research
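
For readers unfamiliar with the baseline technique, a minimal propensity-score weighting estimate, ignoring the covariate measurement error that is this project's actual focus, might look like the following sketch. The simulated data and variable names are illustrative and are not drawn from the project.

    import numpy as np
    from sklearn.linear_model import LogisticRegression

    rng = np.random.default_rng(0)

    # Simulated illustrative data: X are covariates, t is treatment, y is outcome.
    n = 5000
    X = rng.normal(size=(n, 3))
    t = rng.binomial(1, 1 / (1 + np.exp(-X[:, 0])))    # treatment depends on X
    y = 2.0 * t + X.sum(axis=1) + rng.normal(size=n)   # true effect = 2.0

    # Step 1: estimate propensity scores e(X) = P(t = 1 | X).
    ps = LogisticRegression().fit(X, t).predict_proba(X)[:, 1]

    # Step 2: inverse-probability-weighted (IPW) estimate of the average treatment effect.
    ate = np.mean(t * y / ps) - np.mean((1 - t) * y / (1 - ps))
    print(f"IPW ATE estimate: {ate:.2f}")  # should land near 2.0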
How does interview methodology affect interviewer variance?
Frauke Kreuter compares the effectiveness of commonly used face-to-face interview methods
Located in Research / Selected Research
Article Reference: Tree-based Machine Learning Methods for Survey Research
Predictive modeling methods from the field of machine learning have become a popular tool across various disciplines for exploring and analyzing diverse data. These methods often do not require specific prior knowledge about the functional form of the relationship under study and are able to adapt to complex non-linear and non-additive interrelations between the outcome and its predictors while focusing specifically on prediction performance. This modeling perspective is beginning to be adopted by survey researchers in order to adjust or improve various aspects of data collection and/or survey management. To facilitate this strand of research, this paper (1) provides an introduction to prominent tree-based machine learning methods, (2) reviews and discusses previous and (potential) prospective applications of tree-based supervised learning in survey research, and (3) exemplifies the usage of these techniques in the context of modeling and predicting nonresponse in panel surveys.
Located in MPRC People / Frauke Kreuter, Ph.D. / Frauke Kreuter Publications
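
As an illustration of the class of techniques the abstract surveys, a minimal nonresponse-prediction sketch with scikit-learn follows; the paradata features and the simulated outcome are invented for this example and are not the authors' data or code.

    import numpy as np
    import pandas as pd
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(42)

    # Hypothetical panel paradata from a prior wave, plus a nonresponse indicator.
    n = 2000
    df = pd.DataFrame({
        "num_contact_attempts": rng.poisson(3, n),
        "prior_item_missing_rate": rng.beta(2, 8, n),
        "years_in_panel": rng.integers(1, 10, n),
    })
    # Invented outcome, loosely related to the predictors.
    logit = (0.4 * df["num_contact_attempts"]
             + 5 * df["prior_item_missing_rate"]
             - 0.2 * df["years_in_panel"])
    y = rng.binomial(1, 1 / (1 + np.exp(-(logit - logit.mean()))))

    X_train, X_test, y_train, y_test = train_test_split(df, y, test_size=0.3, random_state=0)

    # A random forest adapts to non-linearities and interactions without a
    # pre-specified functional form, the property the abstract emphasizes.
    clf = RandomForestClassifier(n_estimators=500, random_state=0).fit(X_train, y_train)
    print("AUC:", roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1]))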
Article Reference: Does Benefit Framing Improve Record Linkage Consent Rates? A Survey Experiment
Survey researchers are increasingly seeking opportunities to link interview data with administrative records. However, obtaining consent from all survey respondents (or certain subgroups) remains a barrier to performing record linkage in many studies. We experimentally investigated whether emphasizing different benefits of record linkage to respondents in a telephone survey of employee working conditions improves respondents’ willingness to consent to linkage of employment administrative records relative to a neutral consent request. We found that emphasizing linkage benefits related to “time savings” yielded a small, albeit statistically significant, improvement in the overall linkage consent rate (86.0 percent) relative to the neutral consent request (83.8 percent). The time savings argument was particularly effective among “busy” respondents. A second benefit argument related to “improved study value” did not yield a statistically significant improvement in the linkage consent rate (84.4 percent) relative to the neutral request. This benefit argument was also ineffective among the subgroup of respondents considered to be most likely to have a self-interest in the study outcomes. The article concludes with a brief discussion of the practical implications of these findings and offers suggestions for possible research extensions.
Located in MPRC People / Frauke Kreuter, Ph.D. / Frauke Kreuter Publications
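
Whether the 2.2-point gap between the “time savings” rate (86.0 percent) and the neutral rate (83.8 percent) clears a significance threshold depends on the group sizes, which the abstract does not report. A minimal two-proportion z-test sketch with hypothetical group sizes of 1,000 per arm:

    from statsmodels.stats.proportion import proportions_ztest

    # Consent rates from the abstract; group sizes are hypothetical placeholders.
    n_time_savings, n_neutral = 1000, 1000
    consents = [round(0.860 * n_time_savings), round(0.838 * n_neutral)]

    stat, pval = proportions_ztest(consents, [n_time_savings, n_neutral])
    print(f"z = {stat:.2f}, p = {pval:.3f}")  # significance hinges on the real sample sizes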
Article Reference: Factors Affecting Completion Times: A Comparative Analysis of Smartphone and PC Web Surveys
This article compares the factors affecting completion times (CTs) to web survey questions when they are answered using two different devices: personal computers (PCs) and smartphones. Several studies have reported longer CTs when respondents use smartphones than PCs. This is a concern to survey researchers because longer CTs may increase respondent burden and the risk of breakoff. However, few studies have analyzed the specific reasons for the time difference. We analyzed timing data from 836 respondents who completed the same web survey twice, once using a smartphone and once using a PC, as part of a randomized crossover experiment in the Longitudinal Internet Studies for the Social Sciences panel. The survey contained a mix of questions (single choice, numeric entry, and text entry) that were displayed on separate pages. We included both page-level and respondent-level factors that may have contributed to the time difference between devices in cross-classified multilevel models. We found that respondents took about 1.4 times longer when using smartphones than PCs. This difference was larger when a page had more than one question or required text entry. The difference was also larger among respondents who had relatively low levels of familiarity and experience using smartphones. Respondent multitasking was associated with slower CTs, regardless of the device used. Practical implications and avenues for future research are discussed.
Located in MPRC People / Christopher Antoun, Ph.D. / Christopher Antoun Publications
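
A cross-classified multilevel model treats respondents and pages as non-nested random effects. One way to approximate that in Python is via statsmodels’ variance components within a single constant group; everything below (variable names, simulated effects, per-row device assignment) is an illustrative sketch, not the authors' specification.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    rng = np.random.default_rng(1)

    # Simulated page-level log completion times: respondents crossed with pages.
    n_resp, n_page = 50, 20
    grid = pd.MultiIndex.from_product([range(n_resp), range(n_page)],
                                      names=["respondent", "page"]).to_frame(index=False)
    grid["smartphone"] = rng.integers(0, 2, len(grid))
    resp_eff = rng.normal(0, 0.3, n_resp)[grid["respondent"]]
    page_eff = rng.normal(0, 0.2, n_page)[grid["page"]]
    grid["log_ct"] = (2.0 + 0.34 * grid["smartphone"]   # log(1.4) ~ 0.34, i.e. ~1.4x slower
                      + resp_eff + page_eff + rng.normal(0, 0.5, len(grid)))

    # Crossed random intercepts via variance components on one constant group.
    grid["grp"] = 1
    model = smf.mixedlm("log_ct ~ smartphone", grid, groups="grp",
                        vc_formula={"respondent": "0 + C(respondent)",
                                    "page": "0 + C(page)"})
    print(model.fit().summary())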
Article Reference: Willingness to participate in passive mobile data collection
The rising penetration of smartphones now gives researchers the chance to collect data from smartphone users through passive mobile data collection via apps. Examples of passively collected data include geolocation, physical movements, online behavior and browser history, and app usage. However, to passively collect data from smartphones, participants need to agree to download a research app to their smartphone. This leads to concerns about nonconsent and nonparticipation. In the current study, we assess the circumstances under which smartphone users are willing to participate in passive mobile data collection. We surveyed 1,947 smartphone-owning members of a German nonprobability online panel, using vignettes that described hypothetical studies in which data are automatically collected by a research app on a participant’s smartphone. The vignettes varied the levels of several dimensions of the hypothetical study, and respondents were asked to rate their willingness to participate in such a study. Willingness to participate in passive mobile data collection is strongly influenced by the incentive promised for study participation, but also by other study characteristics (sponsor, duration of data collection period, option to switch off the app) as well as respondent characteristics (privacy and security concerns, smartphone experience).
Located in MPRC People / Christopher Antoun, Ph.D. / Christopher Antoun Publications
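
To make the factorial vignette design concrete, the sketch below generates a full factorial over hypothetical study dimensions; the dimension names echo the abstract, but the levels are invented for illustration.

    from itertools import product
    import random

    # Dimensions mentioned in the abstract; levels are invented placeholders.
    dimensions = {
        "incentive": ["none", "10 euros", "30 euros"],
        "sponsor": ["university", "market research firm"],
        "duration": ["1 month", "6 months"],
        "switch_off": ["app can be switched off", "app always on"],
    }

    # Full factorial: every combination of levels is one vignette.
    vignettes = [dict(zip(dimensions, combo)) for combo in product(*dimensions.values())]
    print(len(vignettes), "vignettes")  # 3 * 2 * 2 * 2 = 24

    # Each respondent rates a random subset, as is common in factorial survey designs.
    random.seed(0)
    for v in random.sample(vignettes, k=4):
        print(v)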