Search results

16 items matching your search terms.
Large Scale Infrastructure for Social Data Science
Webinar - will be recorded
Located in Coming Up
Frauke Kreuter featured in The Baltimore Sun on New Data Collection on COVID-19 with Facebook
Faculty at the University of Maryland have been working with Facebook to design a worldwide survey aimed at collecting coronavirus data during the global pandemic.
Located in News
Article Reference: Coverage Error in Data Collection Combining Mobile Surveys With Passive Measurement Using Apps: Data From a German National Survey
Researchers are increasingly combining self-reports from mobile surveys with passive data collection using sensors and apps on smartphones. While smartphones are commonly used in some groups of individuals, smartphone penetration is significantly lower in other groups. In addition, different operating systems (OSs) limit how mobile data can be collected passively. These limitations cause concern about coverage error in studies targeting the general population. Based on data from the Panel Study Labour Market and Social Security (PASS), an annual probability-based mixed-mode survey on the labor market and poverty in Germany, we find that smartphone ownership and ownership of smartphones with specific OSs are correlated with a number of sociodemographic and substantive variables.
Located in MPRC People / Frauke Kreuter, Ph.D. / Frauke Kreuter Publications
Article Reference: The Relationship between Interviewer-Respondent Rapport and Data Quality
Interviewer-respondent rapport is generally considered to be beneficial for the quality of the data collected in survey interviews; however, the relationship between rapport and data quality has rarely been directly investigated. We conducted a laboratory experiment in which eight professional interviewers interviewed 125 respondents to see how the rapport between interviewers and respondents is associated with the quality of data, primarily disclosure of sensitive information, collected in these interviews. It is possible that increased rapport between interviewers and respondents might motivate respondents to be more conscientious, increasing disclosure; alternatively, increased rapport might inhibit disclosure because presenting oneself unfavorably is more aversive if respondents have a positive relationship with the interviewer. More specifically, we examined three issues: (1) what the relationship is between rapport and the disclosure of information of varying levels of sensitivity, (2) how rapport is associated with item nonresponse, and (3) whether rapport can be similarly established in video-mediated and computer-assisted personal interviews (CAPIs). We found that (1) an increased sense of rapport among respondents increased disclosure for highly sensitive questions compared with questions on moderately sensitive topics; (2) an increased sense of rapport was not associated with a higher level of item nonresponse; and (3) there was no significant difference in respondents' rapport ratings between video-mediated interviews and CAPI, suggesting that rapport is established just as well in video-mediated interviews as it is in CAPI.
Located in MPRC People / Frauke Kreuter, Ph.D. / Frauke Kreuter Publications
Article Reference: New Data Sources in Social Science Research: Things to Know Before Working With Reddit Data
Social media are becoming more popular as a source of data for social science researchers. These data are plentiful and offer the potential to answer new research questions at smaller geographies and for rarer subpopulations. When deciding whether to use data from social media, it is useful to learn as much as possible about the data and their source. Social media data have properties quite different from those with which many social scientists are used to working, so the assumptions often used to plan and manage a project may no longer hold. For example, social media data are often too large to be processed on a single machine; they come in file formats with which many researchers are unfamiliar; and they require a level of data transformation and processing that is rarely needed with more traditional data sources (e.g., survey data). Unfortunately, this type of information is often not obvious ahead of time, as much of this knowledge is gained through word of mouth and experience. In this article, we attempt to document several challenges and opportunities encountered when working with Reddit, the self-proclaimed “front page of the Internet” and popular social media site. Specifically, we provide descriptive information about the Reddit site and its users, tips for using organic data from Reddit for social science research, some ideas for conducting a survey on Reddit, and lessons learned in merging survey responses with Reddit posts. While this article is specific to Reddit, researchers may also view it as a list of the type of information one may seek to acquire prior to conducting a project that uses any type of social media data.
Located in MPRC People / Frauke Kreuter, Ph.D. / Frauke Kreuter Publications
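The abstract above notes that Reddit data are often too large to load into memory and arrive in file formats unfamiliar to many social scientists. The following is a minimal sketch, not taken from the article, of how one might stream a compressed newline-delimited JSON dump of Reddit comments (the layout used by public Pushshift-style archives) and count comments per subreddit without reading the whole file at once; the file name and field names are assumptions.

# Illustrative sketch only: stream a bz2-compressed, newline-delimited JSON dump of
# Reddit comments and tally comments per subreddit without loading the full file.
# The dump file name and the "subreddit" field are assumptions, not from the article.
import bz2
import json
from collections import Counter

counts = Counter()
with bz2.open("RC_2019-01.bz2", "rt", encoding="utf-8") as dump:  # hypothetical dump file
    for line in dump:
        try:
            comment = json.loads(line)
        except json.JSONDecodeError:
            continue  # skip malformed lines rather than failing the whole run
        counts[comment.get("subreddit", "unknown")] += 1

for subreddit, n in counts.most_common(10):
    print(subreddit, n)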
Article Reference: Does Benefit Framing Improve Record Linkage Consent Rates? A Survey Experiment
Survey researchers are increasingly seeking opportunities to link interview data with administrative records. However, obtaining consent from all survey respondents (or certain subgroups) remains a barrier to performing record linkage in many studies. We experimentally investigated whether emphasizing different benefits of record linkage to respondents in a telephone survey of employee working conditions improves respondents’ willingness to consent to linkage of employment administrative records relative to a neutral consent request. We found that emphasizing linkage benefits related to “time savings” yielded a small, albeit statistically significant, improvement in the overall linkage consent rate (86.0 percent) relative to the neutral consent request (83.8 percent). The time savings argument was particularly effective among “busy” respondents. A second benefit argument related to “improved study value” did not yield a statistically significant improvement in the linkage consent rate (84.4 percent) relative to the neutral request. This benefit argument was also ineffective among the subgroup of respondents considered most likely to have a self-interest in the study outcomes. The article concludes with a brief discussion of the practical implications of these findings and offers suggestions for possible research extensions.
Located in MPRC People / Frauke Kreuter, Ph.D. / Frauke Kreuter Publications
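The abstract above compares consent rates of 86.0 percent (time-savings framing) and 83.8 percent (neutral request). As an illustration only, and not the study's own analysis, the sketch below runs a standard two-proportion z-test on rates like these; the group sizes are hypothetical, and whether a difference of this size is statistically significant depends entirely on the actual sample sizes, which are not reported here.

# Illustrative only: two-proportion z-test comparing consent rates across conditions.
from statsmodels.stats.proportion import proportions_ztest

consented = [860, 838]       # hypothetical counts implied by the reported rates
interviewed = [1000, 1000]   # hypothetical respondents per condition, for illustration only
stat, pvalue = proportions_ztest(consented, interviewed)
print(f"z = {stat:.2f}, p = {pvalue:.3f}")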
Article Reference: Change Through Data: A Data Analytics Training Program for Government Employees
From education to health to criminal justice, government regulation and policy decisions have important effects on social and individual experiences. New data science tools applied to data created by government agencies have the potential to enhance these meaningful decisions. However, certain institutional barriers limit the realization of this potential. First, we need to provide systematic training of government employees in data analytics. Second, we need a careful rethinking of the rules and technical systems that protect data in order to expand access to linked individual-level data across agencies and jurisdictions, while maintaining privacy. Here, we describe a program that has been run for the last three years by the University of Maryland, New York University, and the University of Chicago, with partners such as Ohio State University, Indiana University-Purdue University Indianapolis, and the University of Missouri. The program, which trains government employees to perform applied data analysis with confidential individual-level data generated through administrative processes and involves extensive project-focused work, provides both online and onsite training components. Training takes place in a secure environment. The aim is to help agencies tackle important policy problems by using modern computational and data analysis methods and tools. We have found that this program accelerates the technical and analytical development of public sector employees. As such, it demonstrates the potential value of working with individual-level data across agency and jurisdictional lines. We plan to build on this initial success by creating a larger community of academic institutions, government agencies, and foundations that can work together to increase the capacity of governments to make more efficient and effective decisions.
Located in MPRC People / Frauke Kreuter, Ph.D. / Frauke Kreuter Publications
Article Reference: Predicting Voting Behavior Using Digital Trace Data
A major concern arising from ubiquitous tracking of individuals’ online activity is that algorithms may be trained to predict sensitive personal information, even for users who do not wish to reveal such information. Although previous research has shown that digital trace data can accurately predict sociodemographic characteristics, little is known about the potential of such data to predict sensitive outcomes. Against this background, we investigate in this article whether we can accurately predict voting behavior, which is considered sensitive personal information in Germany and is subject to strict privacy regulations. Using records of web browsing and mobile device usage of about 2,000 online users eligible to vote in the 2017 German federal election, combined with survey data from the same individuals, we find that online activities do not predict (self-reported) voting well in this population. These findings add to the debate about users’ limited control over (inaccurate) personal information flows.
Located in MPRC People / Frauke Kreuter, Ph.D. / Frauke Kreuter Publications
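The abstract above describes predicting self-reported voting from web browsing and mobile device usage records linked to survey data. The sketch below is a generic illustration of that kind of analysis, not the authors' pipeline: it joins hypothetical per-user trace features to a survey-reported voting indicator and evaluates a simple classifier with cross-validation. All file names, column names, and features are assumptions.

# Illustrative sketch, not the study's actual code: predict a survey-reported
# voting indicator from aggregated web-tracking features for the same respondents.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score

traces = pd.read_csv("trace_features.csv")  # hypothetical per-user trace features
survey = pd.read_csv("survey.csv")          # hypothetical survey file with user_id and voted

data = traces.merge(survey, on="user_id")
X = data[["news_visits", "social_visits", "mobile_minutes"]]  # made-up feature names
y = data["voted"]

# Cross-validated AUC near 0.5 would indicate that the traces carry little
# predictive signal for the sensitive outcome, as the article reports.
model = LogisticRegression(max_iter=1000)
scores = cross_val_score(model, X, y, cv=5, scoring="roc_auc")
print("Mean AUC:", scores.mean())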
Article Reference: Trust and cooperative behavior: Evidence from the realm of data-sharing
Trust is praised by many social scientists as the foundation of functioning social systems owing to its assumed connection to cooperative behavior. The existence of such a link is still subject to debate. In the present study, we first highlight important conceptual issues within this debate. Second, we examine previous evidence and point out several of its limitations. Third, we present findings from an original experiment in which we tried to identify a “real” situation that allowed us to measure both trust and cooperation. People’s expectations and behavior when they decide whether to share their data represent such a situation, and we make use of corresponding data. We found that there is no relationship between trust and cooperation. This non-relationship can be rationalized in different ways, which, in turn, provides important lessons for the study of the trust-behavior nexus beyond the particular situation we study empirically.
Located in MPRC People / Frauke Kreuter, Ph.D. / Frauke Kreuter Publications
Article Reference: The effect of framing and placement on linkage consent
Numerous surveys link interview data to administrative records, conditional on respondent consent, in order to explore new and innovative research questions. Optimizing the linkage consent rate is a critical step toward realizing the scientific advantages of record linkage and minimizing the risk of linkage consent bias. Linkage consent rates have been shown to be particularly sensitive to certain design features, such as where the consent question is placed in the questionnaire and how the question is framed. However, the interaction of these design features and their relative contributions to the linkage consent rate have never been jointly studied, raising the practical question of which design feature (or combination of features) should be prioritized from a consent rate perspective. We address this knowledge gap by reporting the results of a placement and framing experiment embedded within separate telephone and Web surveys. We find a significant interaction between placement and framing of the linkage consent question on the consent rate. The effect of placement was larger than the effect of framing in both surveys, and the effect of framing was only evident in the Web survey when the consent question was placed at the end of the questionnaire. Both design features had negligible impact on linkage consent bias for a series of administrative variables available for consenters and non-consenters. We conclude this research note with guidance on the optimal administration of the linkage consent question.
Located in MPRC People / Frauke Kreuter, Ph.D. / Frauke Kreuter Publications
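The abstract above reports a significant interaction between the placement and the framing of the linkage consent question. One standard way to test such an interaction with respondent-level data, shown here purely as a hedged sketch and not as the study's own code, is a logistic regression of the consent indicator on placement, framing, and their interaction; all variable and file names are hypothetical.

# Illustrative sketch only: logistic regression with a placement-by-framing interaction.
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical respondent-level data: a 0/1 consent indicator plus categorical
# placement (e.g., beginning vs. end of questionnaire) and framing conditions.
df = pd.read_csv("consent_experiment.csv")

# The placement:framing term captures whether the framing effect depends on
# where the consent question is placed in the questionnaire.
model = smf.logit("consent ~ C(placement) * C(framing)", data=df).fit()
print(model.summary())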