Online labor markets provide new opportunities for behavioral research, but conducting economic experiments online raises important methodological challenges. This particularly holds for interactive designs. In this paper, we provide a methodological discussion of the similarities and differences between interactive experiments conducted in the laboratory and online. To this end, we conduct a repeated public goods experiment with and without punishment using samples from the laboratory and the online platform Amazon Mechanical Turk. We chose to replicate this experiment because it is long and logistically complex. It therefore provides a good case study for discussing the methodological and practical challenges of online interactive experimentation. We find that basic behavioral patterns of cooperation and punishment in the laboratory are replicable online. The most important challenge of online interactive experiments is participant dropout. We discuss measures for reducing dropout and show that, for our case study, dropouts are exogenous to the experiment. We conclude that data quality for interactive experiments via the Internet is adequate and reliable, making online interactive experimentation a potentially valuable complement to laboratory studies.
The University of São Paulo Twin Panel (Painel USP de Gêmeos), based at the Institute of Psychology of the University of São Paulo, started formally in 2017. Our registry is new, but in only two years of formal existence, it comprises a volunteer sample of 4826 registered individuals (98% twins and 2% higher-order multiples), recruited at the University of São Paulo and by social media campaigns. Our main aim is to conduct and promote research with twins on psychological processes and behavior. The University of São Paulo is the largest higher education and research institution in South America, and the Painel USP de Gêmeos has great potential for fostering research on twin-related issues from a psychological perspective in Brazil and South America.
The Best Practices in Social and Behavioral Research Course was developed to provide instruction on good clinical practice for social and behavioral trials. This study evaluated the new course.
Methods
Participants across 4 universities took the course (n = 294) and were surveyed immediately after course completion and again 2 months later. Outcomes included perceived relevance, how engaging the course was, and whether participants were working differently because of the course. Open-ended questions were posed to understand how their work was affected.
Results
Participants rated the course as relevant and engaging (6.4 and 5.8 out of 7 points, respectively) and reported working differently (4.7 out of 7 points). Participants with less experience in social and behavioral trials were the most likely to report working differently 2 months later.
Discussion
The course was perceived as relevant and engaging. Participants described actions taken to improve rigor in implementing trials. Future studies with a larger sample and additional participating sites are recommended.
This article discusses the process of defining competencies and developing a best practices training course for investigators and clinical research coordinators who conduct social and behavioral research.
Methods
The first project phase established recommendations for training in Good Clinical Practice (GCP) and was done in conjunction with representatives from 62 Clinical and Translational Science Award (CTSA) hubs. Diversity in behavioral clinical trials and differences in regulation of behavioral trials compared with clinical trials involving drugs, devices, or biologics necessitated a separate Social and Behavioral Work Group. This group worked with CTSA representatives to tailor competencies and fundamental GCP principles into best practices for social and behavioral research.
Results
Although concepts underlying GCP were deemed similar across all clinical trials, not all areas were equally applicable, and the ways in which GCP would be enacted differed for behavioral trials. It was determined that suitable training in best practices for social and behavioral research was lacking.
Discussion
Based on this training need, an e-learning course on best practices is available to all CTSA sites. Each institution is able to track outcomes for its employees to help achieve standardized, competency-based best practices for social and behavioral investigators and staff.
The Office of Behavioral and Social Sciences Research (OBSSR) at the National Institutes of Health opened in 1995 to facilitate the advancement of research on social and behavioral influences on health. The establishment of the OBSSR coincided with the ascendancy of molecular biology, with its emphasis on more reductionistic influences on health. This greater emphasis on genetic aspects of health has the potential to produce a widening chasm between biomedical research and social, behavioral, and psychological research. We discuss the chasm between sociobehavioral and biomedical research during what might be considered the era of molecular biology and propose the concept of levels of analysis as a unifying framework for research in the health sciences, using research on hypertension in African Americans as a representative example. We also argue for the primacy of psychophysiological research in bridging the chasm and furthering a multilevel perspective, and summarize some of the activities of the OBSSR that are relevant to this perspective.
This article seeks to stimulate thought on the philosophy behind agricultural research. Pragmatism is identified as a philosophical basis for studying environmental issues that focus on human behavior. The ways in which this approach is applicable to the study of alternative agriculture are illuminated. “Behavioral pragmatists” differ from “behavioral positivists” in their aim, focus, process, and approach to research. I describe the main goals of the pragmatic behavioral approach: accepting a systems approach to study the interrelationships between humans and the environment; gaining understanding through human experiences; viewing problems as whole complex “problematic situations”; and promoting social activism and appropriate policy formulation. Combining qualitative and quantitative methods is often most effective. Pragmatism allows for holistic analysis that incorporates numerous factors that influence human uses of the environment. A specific example shows how behavioral pragmatism is effective in research on alternative agriculture.