Centre for Research on Discretion and Paternalism Bergen

Literature update #2 2019

LITERATURE OVERVIEW: See our list of recently published articles of interest.

How articles are selected

  • A full list of articles is collected based on TOC alerts from journals by the publishers Taylor & Francis, Sage Publications, Cambridge University Press, and Oxford Academic.
  • The short list is selected based on an assessment of the articles' theoretical, methodological, and/or empirical relevance to the projects at the Centre.
  • Please note that the list of articles is not based on a qualitative assessment of the articles' scientific contributions or level.
  • Questions: Barbara Ruiken


The effects of family structure and race on decision-making in child welfare
(Yelick & Thyer, 2019)

ABSTRACT: Decision-making among child welfare professionals is complex, influencing children and their families. The primary aim of this study was to examine how removal decisions of case managers are affected by family structure, given the intersectionality of family structure and race using an experimental case vignette design. The non-probability convenience sample included 54 case managers working throughout the state of Florida. According to the results, family structure has an influence on removal decisions, particularly when the safety decision is used to modify the relationship. While often overlooked, family structure is a relevant factor related to decision-making among child welfare professionals.

Yelick, A., & Thyer, B. (2019). The effects of family structure and race on decision-making in child welfare. Journal of Public Child Welfare, 0(0), 1–21. https://doi.org/10.1080/15548732.2019.1616651

Demand Effects in Survey Experiments: An Empirical Assessment
(Mummolo & Peterson, 2019)

ABSTRACT: Survey experiments are ubiquitous in social science. A frequent critique is that positive results in these studies stem from experimenter demand effects (EDEs)—bias that occurs when participants infer the purpose of an experiment and respond so as to help confirm a researcher’s hypothesis. We argue that online survey experiments have several features that make them robust to EDEs, and test for their presence in studies that involve over 12,000 participants and replicate five experimental designs touching on all empirical political science subfields. We randomly assign participants information about experimenter intent and show that providing this information does not alter the treatment effects in these experiments. Even financial incentives to respond in line with researcher expectations fail to consistently induce demand effects. Research participants exhibit a limited ability to adjust their behavior to align with researcher expectations, a finding with important implications for the design and interpretation of survey experiments.

Mummolo, J., & Peterson, E. (2019). Demand Effects in Survey Experiments: An Empirical Assessment. American Political Science Review, 113(2), 517–529. https://doi.org/10.1017/S0003055418000837
