Search results

5 items matching your search terms.
Article Reference: Factors Affecting Completion Times: A Comparative Analysis of Smartphone and PC Web Surveys
This article compares the factors affecting completion times (CTs) for web survey questions answered on two different devices: personal computers (PCs) and smartphones. Several studies have reported longer CTs when respondents use smartphones rather than PCs. This is a concern for survey researchers because longer CTs may increase respondent burden and the risk of breakoff. However, few studies have analyzed the specific reasons for the time difference. We analyzed timing data from 836 respondents who completed the same web survey twice, once using a smartphone and once using a PC, as part of a randomized crossover experiment in the Longitudinal Internet Studies for the Social Sciences panel. The survey contained a mix of questions (single choice, numeric entry, and text entry) that were displayed on separate pages. We included both page-level and respondent-level factors that may have contributed to the time difference between devices in cross-classified multilevel models. We found that respondents took about 1.4 times longer when using smartphones than PCs. This difference was larger when a page had more than one question or required text entry. It was also larger among respondents with relatively little familiarity and experience with smartphones. Respondent multitasking was associated with slower CTs regardless of the device used. Practical implications and avenues for future research are discussed.
Located in MPRC People / Christopher Antoun, Ph.D. / Christopher Antoun Publications
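
A minimal sketch of the kind of cross-classified multilevel model described above, written in Python with statsmodels. The data are simulated and the column names (log_ct, smartphone, text_entry) are invented for illustration; crossed random intercepts for respondents and pages are expressed through the variance-components interface, which is one common way to fit such models and not necessarily the authors' exact specification.

    import numpy as np
    import pandas as pd
    import statsmodels.formula.api as smf

    # Hypothetical long-format timing data: one row per respondent x page.
    rng = np.random.default_rng(0)
    n_resp, n_page = 60, 12
    df = pd.DataFrame({
        "respondent": np.repeat(np.arange(n_resp), n_page),
        "page": np.tile(np.arange(n_page), n_resp),
    })
    df["smartphone"] = rng.integers(0, 2, len(df))        # 1 = smartphone, 0 = PC
    df["text_entry"] = (df["page"] % 4 == 0).astype(int)  # page requires text entry
    df["log_ct"] = (
        2.0
        + 0.34 * df["smartphone"]           # exp(0.34) ~ 1.4x longer on smartphones
        + 0.20 * df["smartphone"] * df["text_entry"]
        + rng.normal(0, 0.3, len(df))
    )

    # Crossed random intercepts for respondent and page, expressed as
    # variance components within a single group spanning the whole dataset.
    df["one_group"] = 1
    model = smf.mixedlm(
        "log_ct ~ smartphone * text_entry",
        data=df,
        groups="one_group",
        re_formula="0",
        vc_formula={"respondent": "0 + C(respondent)", "page": "0 + C(page)"},
    )
    print(model.fit().summary())

On the log scale, the smartphone coefficient translates into a multiplicative effect on completion time, which is how a ratio such as "about 1.4 times longer" can be read off such a model.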
Frauke Kreuter featured in The Baltimore Sun on New Data Collection on COVID-19 with Facebook
Faculty at the University of Maryland have been working with Facebook to design a worldwide survey aimed at collecting coronavirus data during the global pandemic.
Located in News
Article Reference: Tree-based Machine Learning Methods for Survey Research
Predictive modeling methods from the field of machine learning have become a popular tool across various disciplines for exploring and analyzing diverse data. These methods often require no specific prior knowledge about the functional form of the relationship under study and can adapt to complex non-linear and non-additive interrelations between the outcome and its predictors, while focusing specifically on prediction performance. This modeling perspective is beginning to be adopted by survey researchers in order to adjust or improve various aspects of data collection and survey management. To facilitate this strand of research, this paper (1) provides an introduction to prominent tree-based machine learning methods, (2) reviews and discusses existing and potential applications of tree-based supervised learning in survey research, and (3) exemplifies the use of these techniques in the context of modeling and predicting nonresponse in panel surveys.
Located in MPRC People / Frauke Kreuter, Ph.D. / Frauke Kreuter Publications
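
As a hedged illustration of point (3), here is a short Python sketch that trains a tree ensemble to predict panel nonresponse. Everything below is simulated: the features (age, prior breakoff, completed waves, survey attitude) are invented stand-ins, and a random forest is just one of the tree-based methods the paper surveys.

    import numpy as np
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    # Simulated respondent-level features from earlier panel waves.
    rng = np.random.default_rng(1)
    n = 2000
    X = np.column_stack([
        rng.integers(18, 90, n),   # age
        rng.integers(0, 2, n),     # prior breakoff (0/1)
        rng.poisson(3, n),         # number of completed prior waves
        rng.normal(0, 1, n),       # survey attitude score
    ])
    # Simulated nonresponse: more likely after a prior breakoff, less
    # likely for respondents with more completed waves.
    logit = -1.0 + 1.2 * X[:, 1] - 0.3 * X[:, 2]
    y = rng.binomial(1, 1 / (1 + np.exp(-logit)))

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    clf = RandomForestClassifier(n_estimators=300, min_samples_leaf=20,
                                 random_state=0)
    clf.fit(X_train, y_train)
    print("Test AUC:", roc_auc_score(y_test, clf.predict_proba(X_test)[:, 1]))

Predicted nonresponse probabilities from such a model could then feed into survey management decisions, for example targeting likely nonrespondents with extra contact attempts.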
Article Reference: Trust and cooperative behavior: Evidence from the realm of data-sharing
Trust is praised by many social scientists as the foundation of functioning social systems, owing to its assumed connection to cooperative behavior. The existence of such a link is still subject to debate. In the present study, we first highlight important conceptual issues within this debate. Second, we examine previous evidence, highlighting several issues. Third, we present findings from an original experiment in which we tried to identify a “real” situation that allowed us to measure both trust and cooperation. People’s expectations and behavior when they decide whether to share their data represent such a situation, and we make use of corresponding data. We found no relationship between trust and cooperation. This non-relationship may be rationalized in different ways, which in turn provides important lessons for the study of the trust-behavior nexus beyond the particular situation we study empirically.
Located in MPRC People / Frauke Kreuter, Ph.D. / Frauke Kreuter Publications
Article Reference: Willingness to participate in passive mobile data collection
The rising penetration of smartphones now gives researchers the chance to collect data from smartphone users through passive mobile data collection via apps. Examples of passively collected data include geolocation, physical movements, online behavior and browser history, and app usage. However, to passively collect data from smartphones, participants need to agree to download a research app to their smartphone, which raises concerns about nonconsent and nonparticipation. In the current study, we assess the circumstances under which smartphone users are willing to participate in passive mobile data collection. We surveyed 1,947 smartphone-owning members of a German nonprobability online panel using vignettes that described hypothetical studies in which data are automatically collected by a research app on a participant’s smartphone. The vignettes varied the levels of several dimensions of the hypothetical study, and respondents rated their willingness to participate in such a study. Willingness to participate is influenced most strongly by the incentive promised for participation, but also by other study characteristics (sponsor, duration of the data collection period, option to switch off the app) and by respondent characteristics (privacy and security concerns, smartphone experience).
Located in MPRC People / Christopher Antoun, Ph.D. / Christopher Antoun Publications
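
The factorial vignette design sketched in this abstract is straightforward to generate programmatically. In the short Python example below, the dimension names and levels are loose paraphrases of the abstract, not the study's actual design:

    import itertools
    import random

    # Hypothetical study dimensions and levels (illustrative only).
    dimensions = {
        "sponsor": ["university", "market research firm"],
        "incentive_eur": [0, 10, 30],
        "duration_months": [1, 6],
        "switch_off_app": ["yes", "no"],
    }

    # Full factorial: every combination of levels is one vignette.
    vignettes = [dict(zip(dimensions, combo))
                 for combo in itertools.product(*dimensions.values())]
    print(len(vignettes), "vignettes in the full factorial")

    # Each respondent might rate a random subset of vignettes.
    random.seed(0)
    for v in random.sample(vignettes, 3):
        print(v)

Each respondent's willingness ratings across vignettes can then be modeled as a function of the dimension levels to estimate effects such as the incentive effect reported above.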