Keith Smolkowski
Correction of Misinformation & Persuasion
Oregon Research Institute
The great enemy of truth is very often not the lie—deliberate, contrived, and dishonest—but the myth—persistent, persuasive, and unrealistic. . . . We subject all facts to a prefabricated set of interpretations. We enjoy the comfort of opinion without the discomfort of thought. |
— John F. Kennedy |
See also Negotiation, Behavioral Selection & Decision Making
Achenbach, J. (2015, March). Why do many reasonable people doubt science? National Geographic. ♦
Aguirre, J., Herbel-Eisenmann, B., Celedón-Pattichis, S., Civil, M., Wilkerson, T., Stephan, M., Pape, S., & Clements, D. H. (2017). Equity within mathematics education research as a political act: Moving from choice to intentional collective professional responsibility. Journal for Research in Mathematics Education, 48(2), 124-147. ♦
Argyris, C. (1966). Interpersonal barriers to decision making. Harvard Business Review, 44(2), 84-97. ♦
Argyris discusses the differences between executives' words and actions, creating barriers to trust and openness and damaging the decision-making process, especially when decisions are most important. The author explores these issues and then offers ideas about how to improve the process.
Also: Argyris, C. (2001). Interpersonal barriers to decision making. In Harvard Business Review on Decision Making (pp. 1-20). Boston, MA: Harvard Business School Publishing. (Reprinted from the Harvard Business Review, 1966, 44(2), 84-97) ◊ [Cite as Argyris (1966/2001)]
Aronson, E. (1980). Persuasion via self-justification: Large commitments for small rewards. In L. Festinger (Ed.), Retrospection on social psychology (pp. 3-21). New York: Oxford University Press.
Bishop, D. V. M. (2018). Fallibility in science: Responding to errors in the work of oneself and others. Advances in Methods and Practices in Psychological Science, 1(3), 432-438. https://doi.org/10.1177/2515245918776632 ♦
Bishop, D. V. M., Whitehouse, A. O., Watt, H. J., & Line, E. A. (2008). Autism and diagnostic substitution: Evidence from a study of adults with a history of developmental language disorder. Developmental Medicine and Child Neurology, 50(5), 341-345. https://doi.org/10.1111/j.1469-8749.2008.02057.x ♦
An example that shows how changes in diagnostic criteria can perpetuate misinformation (see also Shattuck, 2006).
Bursztyn, L., Rao, A., Roth, C., & Yanagizawa-Drott, D. (2020). Misinformation during a pandemic (Working Paper No. 2020-44). Becker Friedman Institute for Economics at the University of Chicago. ♦
A newer version may be available.
Cialdini, R. B. (1993). Influence: Science and Practice (3rd ed.). New York: HarperCollins College Publishers. ◊
Cialdini, R. B. (2009). Influence: Science and Practice (5th ed.). New York: Pearson/Allyn and Bacon. ◊
Cook, J., Ellerton, P., & Kinkead, D. (2018). Deconstructing climate misinformation to identify reasoning errors. Environmental Research Letters, 13(024018), 1-7. https://doi.org/10.1088/1748-9326/aaa49f ♦
Cook, J., & Lewandowsky, S. (2011). The debunking handbook. St. Lucia, Australia: University of Queensland. http://sks.to/debunk (ISBN 978-0-646-56812-6) ♦
See Lewandowsky, Ecker, Seifert, Schwarz, and Cook (2012) for a more thorough literature review and detailed presentation of the content.
Cook, J., Lewandowsky, S., & Ecker, U. (2017). Neutralizing misinformation through inoculation: Exposing misleading argumentation techniques reduces their influence. PLOS ONE, 12(5, E0175799), 1-21. ♦
Dalton, C. (2016). Bullshit for you; transcendence for me. A commentary on 'On the reception and detection of pseudo-profound bullshit.' Judgment and Decision Making, 11(1), 121-122. ♦
A commentary on Pennycook, Cheyne, Barr, Koehler, and Fugelsang's (2015) 2016 Ig Nobel Peace Prize winning article.
Dickerson, C., Thibodeau, R., Aronson, E., & Miller, D. (1992). Using cognitive dissonance to encourage water conservation. Journal of Applied Social Psychology, 22, 841-854.
Eisenhardt, K., & Sull, D. (2001). Strategy as simple rules. Harvard Business Review, 79(1), 106. ♦
Eriksson, K. (2012). The nonsense math effect. Judgment and Decision Making, 7(6), 746-749. ♦
Eriksson (2012) asked participants to review two abstracts. For each participant, one had been manipulated to include a nonsense formula: "an extra sentence taken from a completely unrelated paper and presenting an equation that made no sense in the context" (p. 746). "Participants judged the quality of research [described by the abstracts] as higher when the content included unintelligible elements, which arguably ought to detract from the quality" (p. 748). Only participants with degrees in science, math, or technology rated the abstract with the nonsense equation as lower in quality.
Evans, A., Sleegers, W., & Mlakar, Ž. (2020). Individual differences in receptivity to scientific bullshit. Judgment and Decision Making, 15(3), 401-412. ♦
A commentary on Pennycook, Cheyne, Barr, Koehler, and Fugelsang's (2015) 2016 Ig Nobel Peace Prize winning article.
Evans, I., Thornton, H., Chalmers, I., & Glasziou, P. (2011). Testing treatments: Better research for better healthcare. London: Pinter and Martin. ♦
Evans and colleagues (2011) present an accessible review of research methods that improve healthcare as well as the competing approaches and assumptions that may lead to worse outcomes. In doing so, this "excellent book . . . gives a series of examples where treatments, thought to be beneficial on the basis of observational data, have been shown, in fact, to harm patients" (Campbell & Walters, 2014, p. 1).
Festinger, L. (1957). A theory of cognitive dissonance. Palo Alto, CA: Stanford University Press.
Fisher, R., & Brown, S. (1988). Getting together: Building relationships as we negotiate. New York: Penguin Books. ◊
Fisher, R., Kopelman, E., & Kupfer-Schneider, A. (1996). Beyond Machiavelli: Tools for coping with conflict. New York: Penguin Books.
Fisher, R., Ury, W., & Patton, B. (1991). Getting to yes (2nd ed.). New York: Penguin Books. ◊ ♦
See also a summary at WikiSummaries or notes by Doug Marshall.
Freedman, D. H. (2010, November). Lies, damned lies, and medical science. The Atlantic. http://www.theatlantic.com/magazine/toc/2010/11/ ♦
Garvin, D. A., & Roberto, M. A. (2005). Change through persuasion. Harvard Business Review, 83(2), 104-112. ♦
Gilbert, M. A. (1996). How to win an argument: Surefire strategies for getting your point across (2nd ed.). New York: John Wiley & Sons. ◊
Glaser, S. R., & Glaser, P. A. (2006). Be quiet, be heard: The paradox of persuasion. Eugene, OR: Communication Solutions Publishing.
Goldstein, N. J., Martin, S. J., & Cialdini, R. B. (2008). Yes! 50 scientifically proven ways to be persuasive. New York: Free Press.
Gorski, P. (2019). Avoiding racial equity detours. Educational Leadership, 76(7), 56-61. ♦
Gregory, R. (2000, June). Using stakeholder values to make smarter environmental decisions. Environment, 42(5), 34-44. ♦
Gregory, R., & Keeney, R. L. (1994). Creating policy alternatives using stakeholder values. Management Science, 40(8), 1035-1048. ♦
Gretton, J. D., Meyers, E. A., Walker, A. C., Fugelsang, J. A., & Koehler, D. J. (2021). A brief forewarning intervention overcomes negative effects of salient changes in COVID-19 guidance. Judgment and Decision Making, 16(6), 1549-1574. ♦
Gutiérrez, R. (2013). The sociopolitical turn in mathematics education. Journal for Research in Mathematics Education, 44(1), 37-68. ♦
Gwiazda, J., Ong, E., Held, R., & Thorn, F. (2000). Myopia and ambient night-time lighting. Nature, 404(6774), 144. ♦
See Quinn, Shin, Maguire, and Stone (1999).
Hammond, J. S., Keeney, R. L., & Raiffa, H. (2006). The hidden traps in decision making. Harvard Business Review, 84(1), 118-126. ♦
This article nicely summarizes a number of important heuristics (traps) that can lead to poor decisions, those that fail to satisfy the objectives of the decision. The traps include anchoring, status quo, sunk costs, confirming evidence, framing, and several traps that come into play when estimating or forecasting.
Also: Hammond, J. S., Keeney, R. L., & Raiffa, H. (2001). The hidden traps in decision making. In Harvard Business Review on Decision Making (pp. 143-168). Boston, MA: Harvard Business School Publishing. (Reprinted from the Harvard Business Review, 1998, 76(5), 47-58) ◊ [Cite as Hammond, Keeney, and Raiffa (1998/2001)]
Original: Hammond, J. S., Keeney, R. L., & Raiffa, H. (1998). The hidden traps in decision making. Harvard Business Review, 76(5), 47-58. ♦
Hastie, R., & Kameda, T. (2005). The robust beauty of majority rules in group decisions. Psychological Review, 112(2), 494-508. https://doi.org/10.1037/0033-295X.112.2.494 ♦
Hayashi, A. M. (2001). When to trust your gut. Harvard Business Review, 79(2), 59-65. ♦
An interesting overview of how people make "gut" decisions, including the pattern-matching rules that decision makers rely on and how those rules can also lead to poor choices. Hayashi shows that expertise, above all, requires considerable depth and breadth of knowledge.
Also: Hayashi, A. M. (2001). When to trust your gut. In Harvard Business Review on Decision Making (pp. 1-20). Boston, MA: Harvard Business School Publishing. (Reprinted from the Harvard Business Review, 2001, 79(2), 59-65) ◊ [Cite as Hayashi (2001/2001)]
Harrison, G. P. (2012). 50 popular beliefs that people think are true. New York: Prometheus.
Huff, D. (1993). How to lie with statistics. New York: Norton.
Ito, T. A., Larsen, J. T., Smith, N. K., & Cacioppo, J. T. (1998). Negative information weighs more heavily on the brain: The negativity bias in evaluative categorizations. Journal of Personality and Social Psychology, 75(4), 887-900. https://doi.org/10.1037/0022-3514.75.4.887 ♦
Johnson, E. J., & Goldstein, D. G. (2003). Do defaults save lives? Science, 302(5649), 1338-1339. ♦
Kahneman, D. (2011). Thinking, fast and slow. New York: Farrar, Straus & Giroux.
Keeney, R. L. (2004). Making better decision makers. Decision Analysis, 1(4), 193-204. ♦
Kendeou, P., & van den Broek, P. (2007). The effects of prior knowledge and text structure on comprehension processes during reading of scientific texts. Memory & Cognition, 35(7), 1567-1577. ♦
Kendeou and van den Broek (2007) focused on the association between the accuracy of knowledge and reading comprehension, finding that the way students read can depend on their misconceptions.
Klein, J. (2005). The contribution of a decision support system to complex educational decisions. Educational Research and Evaluation, 11(3), 221-234. ♦
Kleinmuntz, B. (1990). Why we still use our heads instead of formulas: Toward an integrative approach. Psychological Bulletin, 107(3), 296-310. https://doi.org/10.1037/0033-2909.107.3.296 ♦
Kruger, J., & Dunning, D. (1999). Unskilled and unaware of it: How difficulties in recognizing one's own incompetence lead to inflated self-assessments. Journal of Personality and Social Psychology, 77(6), 1121-1134. https://doi.org/10.1037/0022-3514.77.6.1121 ♦
Larrick, R. P., & Soll, J. B. (2008). The MPG illusion. Science, 320(5883), 1593-1594. ♦
Lehrer, J. (2010, December 13). The truth wears off: Is there something wrong with the scientific method? The New Yorker, 86(40), 52-57. http://www.newyorker.com/reporting/2010/12/13/101213fa_fact_lehrer ♦
See related papers by Fitzmaurice (2002), Ioannidis (2005), Kraemer et al. (2006), and Sterne and Smith (2001).
Leippe, M. R. (1994). Generalization of dissonance reduction: Decreasing prejudice through induced compliance. Journal of Personality and Social Psychology, 67, 395-413.
Lewandowsky, S., Ballard, T., Oberauer, K., & Benestad, R. E. (2016). A blind expert test of contrarian claims about climate data. Global Environmental Change, 39, 91-97. https://doi.org/10.1016/j.gloenvcha.2016.04.013
For a brief popular-press discussion of this paper, see Climate Denial Arguments Fail a Blind Test (The Guardian, 23 May 2016).
Lewandowsky, S., Cook, J., Oberauer, K., Brophy, S., Lloyd, E., & Marriott, M. (2015). Recurrent fury: Conspiratorial discourse in the blogosphere triggered by research on the role of conspiracist ideation in climate denial. Journal of Social and Political Psychology, 3(1), 142-178. ♦
Lewandowsky et al. (2015) examine the "important, but not always constructive, role of the blogosphere in public and scientific discourse" (abstract). See Lewandowsky's blog post and FAQ about how this paper was first accepted and then retracted by Frontiers in Psychology (see also APA PsycNET and PMC) and then published in the Journal of Social and Political Psychology.
Lewandowsky, S., Ecker, U. K. H., Seifert, C. M., Schwarz, N., & Cook, J. (2012). Misinformation and its correction: Continued influence and successful debiasing. Psychological Science in the Public Interest, 13(3), 106-131. https://doi.org/10.1177/1529100612451018 ♦
"Evidence shows that vaccines do not cause autism, that global warming is actually occurring, and that President Obama was indeed born in the United States. Why then do people still—often passionately—believe the opposite to be true? In this report, Lewandowsky and colleagues review research detailing the real-world impact of misinformation on our ability to make decisions. They examine common sources of misinformation, processes for evaluating the validity of new information, and reasons why misinformation is so persistent. The authors conclude by providing practical tips for combating misinformation, showing that debiasing strategies can be effective when based on strong psychological science" (Summary from an announcement by the Association for Psychological Science). See Cook and Lewandowsky (2011) for a brief summary of the same content.
Lewandowsky, S., Gignac, G. E., & Oberauer, K. (2013). The role of conspiracist ideation and worldviews in predicting rejection of science. PLOS ONE, 8(10), e75637. https://doi.org/10.1371/journal.pone.0075637 ♦
Lewandowsky, S., Oberauer, K., & Gignac, G. E. (2013). NASA faked the moon landing—therefore, (climate) science is a hoax: An anatomy of the motivated rejection of science. Psychological Science. Advance online publication. ♦
Lilienfeld, S. O. (2012). Public skepticism of psychology: Why many people perceive the study of human behavior as unscientific. American Psychologist, 67(2), 111-129. https://doi.org/10.1037/a0023963 ♦
Lilienfeld, S. O., Lynn, S. J., Ruscio, J., & Beyerstein, B. L. (2010). 50 great myths of popular psychology: Shattering widespread misconceptions about human behavior. Chichester, England: Wiley-Blackwell. ◊
Lilienfeld, S. O., Ritschel, L. A., Lynn, S. J., Cautin, R. L., & Latzman, R. D. (2014). Why ineffective psychotherapies appear to work: A taxonomy of causes of spurious therapeutic effectiveness. Perspectives on Psychological Science, 9(4), 355-387. https://doi.org/10.1177/1745691614535216 ♦
Lilienfeld, Ritschel, Lynn, Cautin, and Latzman (2014) outline the potential causes of spurious treatment effects for psychological interventions that explain why interventions may appear to work when they, in fact, do not. They discuss the causes in terms of the perceptions of interventionists, their treatment recipients, and potentially their associates (e.g., family and friends). The authors locate each cause of spurious effects within four broad cognitive barriers: naïve realism, confirmation bias, illusory causation, and illusion of control. Many of the 26 potential causes of spurious effects have parallels for educational, social-behavioral, or other interventions, curricula, policies, and prevention programs.
Lilienfeld, S. O., Sauvigné, K. C., Lynn, S. J., Cautin, R. L., Latzman, R. D., & Waldman, I. D. (2015). Fifty psychological and psychiatric terms to avoid: A list of inaccurate, misleading, misused, ambiguous, and logically confused words and phrases. Frontiers in Psychology, 6, 1100-1100. https://doi.org/10.3389/fpsyg.2015.01100 ♦
Lovallo, D., & Kahneman, D. (2003). Delusions of success: How optimism undermines executives' decisions. Harvard Business Review, 81(7), 56-63. (HBR reprint R0307D) ♦
Mandel, D. R., & Irwin, D. (2021). Facilitating sender-receiver agreement in communicated probabilities: Is it best to use words, numbers or both? Judgment and Decision Making, 16(2), 363-393. ♦
Mandel and Irwin (2021) compare the use of numbers (e.g., 10% to 40%), verbal statements (e.g., unlikely), or both to convey probabilities. "To sum up, in terms of vagueness, numeric probabilities pale in comparison to verbal probabilities" (p. 386), even when ranges have been defined (e.g., unlikely means 10% to 40%). Presenting information both ways did not improve over numbers alone.
Marcatto, F., Rolison, J. J., & Ferrante, D. (2013). Communicating clinical trial outcomes: Effects of presentation method on physicians' evaluations of new treatments. Judgment and Decision Making, 8(1), 29-33. ♦
McMullen, F., & Madelaine, A. (2015). Why is there so much resistance to Direct Instruction? Australian Journal of Learning Difficulties, 19(2), 137-151. https://doi.org/10.1080/19404158.2014.962065
Mercer, J. (2010). Child development: Myths and misunderstandings. Los Angeles, CA: Sage. ◊
Meyers, E. A., Turpin, M. H., Białek, M., Fugelsang, J. A., & Koehler, D. J. (2020). Inducing feelings of ignorance makes people more receptive to expert (economist) opinion. Judgment and Decision Making, 15(6), 909-925. ♦
Moyer, M. W. (2019, March). People drawn to conspiracy theories share a cluster of psychological features. Scientific American, 320(3), 58-63. ♦
Naqvi, N., Shiv, B., & Bechara, A. (2006). The role of emotion in decision making: A cognitive neuroscience perspective. Current Directions in Psychological Science, 15(5), 260-264. https://doi.org/10.1111/j.1467-8721.2006.00448.x ♦
O'Brien, T. C., Palmer, R., & Albarracin, D. (2021). Misplaced trust: When trust in science fosters belief in pseudoscience and the benefits of critical evaluation. Journal of Experimental Social Psychology, 96, 104184. https://doi.org/10.1016/j.jesp.2021.104184 ♦
Osborne, J., & Pimentel, D. (2022). Science, misinformation, and the role of education. Science, 378(6617), 246-248. https://doi.org/10.1126/science.abq8093 ♦
Patihis, L., Ho, L. Y., Tingen, I. W., Lilienfeld, S. O., & Loftus, E. F. (2013). Are the "memory wars" over? A scientist-practitioner gap in beliefs about repressed memory. Psychological Science. Advance online publication. https://doi.org/10.1177/0956797613510718 ♦
Patrick, V., & Hagtvedt, H. (2012). "I don't" versus "I can't": When empowered refusal motivates goal-directed behavior. Journal of Consumer Research, 39(2), 371-381. https://doi.org/10.1086/663212 ♦
Pennycook, G., Cheyne, J. A., Barr, N., Koehler, D. J., & Fugelsang, J. A. (2015). On the reception and detection of pseudo-profound bullshit. Judgment and Decision Making, 10(6), 549-563. ♦
Pennycook, Cheyne, Barr, Koehler, and Fugelsang (2015) won the 2016 Ig Nobel Peace Prize. See Dalton (2016) for a commentary and Eriksson (2012) for a similar issue: researchers who judge research quality more highly from abstracts that contain bullshit equations.
Pew Research Center. (2014, October). Political polarization and media habits. Washington, DC: Pew. www.journalism.org ♦
Pew Research Center. (2015, January). Public and scientists' views on science and society. Washington, DC: Pew. http://www.pewresearch.org/science2015 ♦
Powell, D., Keil, M., Brenner, D., Lim, L., & Markman, E. M. (2018). Misleading health consumers through violations of communicative norms: A case study of online diabetes education. Psychological Science. Advance online publication. https://doi.org/10.1177/0956797617753393 ♦
Quinn, G. E., Shin, C. H., Maguire, M. G., & Stone, R. A. (1999). Myopia and ambient lighting at night. Nature, 399(6732), 113-114. https://doi.org/10.1038/20094 ♦
Based solely on correlational evidence, this paper set off a scare about children sleeping with a light on. See Zadnik and Jones (2000) and Gwiazda, Ong, Held, and Thorn (2000) for an alternative explanation: myopic parents prefer more lighting and tend to have myopic children. The reply by Stone, Maguire, and Quinn (2000) discounts this alternative and seemingly more likely explanation but offers no test that rules it out.
Ross, R. M., Rand, D. G., & Pennycook, G. (2021). Beyond "fake news": Analytic thinking and the detection of false and hyperpartisan news headlines. Judgment and Decision Making, 16(2), 484-504. ♦
Schumacher, J., & Slep, A. (2004). Attitudes and dating aggression: A cognitive dissonance approach. Prevention Science, 5, 231-243.
Scientific American. (2019). The science behind the debates. Author. https://www.scientificamerican.com/store/books/the-science-behind-the-debates/ ♦
Shattuck, P. T. (2006). The contribution of diagnostic substitution to the growing administrative prevalence of autism in US special education. Pediatrics, 117(4), 1028-1037. ♦
An example that shows how changes in diagnostic criteria can perpetuate misinformation (see also Bishop, Whitehouse, Watt, & Line, 2008).
Shermer, M. (2010, December). The conspiracy theory detector: How to tell the difference between true and false conspiracy theories. Scientific American, 303(6), 102. https://doi.org/10.1038/scientificamerican1210-102 ♦
Shermer, M. (2014, December). Conspiracy central: Who believes in conspiracy theories—and why. Scientific American, 311(6), 94. https://doi.org/10.1038/scientificamerican1214-94 ♦
Shermer, M., Hall, H., Pierrehumbert, R., Offit, P., & Shostak, S. (2016, November). 5 things we know to be true. Scientific American, 315(5), 46-53. ♦
This article contains five short essays: "Evolution Is the Only Reasonable Explanation for the Diversity of Life on Earth" by Michael Shermer, "Homeopathy Has No Basis in Science" by Harriet Hall, "Climate Change Conspiracy Theories Are Ludicrous" by Ray Pierrehumbert, "Vaccines Do Not Cause Autism" by Paul Offit, and "No Credible Evidence of Alien Visitations Exists" by Seth Shostak.
Sippitt, A. (2019). The backfire effect: Does it exist? And does it matter for factcheckers? London: Full Fact. The Full Fact blog: https://fullfact.org/blog/2019/mar/does-backfire-effect-exist/ ♦
Slovic, P. (1986). Informing and educating the public about risk. Risk Analysis, 6(4), 403-415. ♦
Slovic, P., Finucane, M. L., Peters, E., & MacGregor, D. G. (2002). Rational actors or rational fools? Implications of the affect heuristic for Behavioral Economics. Second Annual Symposium on the Foundation of the Behavioral Sciences, Great Barrington, Massachusetts. ♦
From the manuscript: "This paper is a revised version of a chapter titled 'The Affect Heuristic,'" by the same authors and published in Gilovich et al., Heuristics and Biases, and an earlier version was published in the Journal of Socio-Economics (see below).
Slovic, P., Peters, E., Finucane, M. L., & MacGregor, D. G. (2002). Rational actors or rational fools? Implications of the affect heuristic for Behavioral Economics. Journal of Socio-Economics, 31(4), 329-342.
Slovic, P., Finucane, M. L., Peters, E., & MacGregor, D. G. (2002). The affect heuristic. In T. Gilovich, D. Griffin, & D. Kahneman (Eds.), Heuristics and biases: The psychology of intuitive judgment (pp. 397-420). New York: Cambridge University Press.
See "Rational actors or rational fools?" (above) by the same authors.
Slovic, P., Peters, E., Finucane, M. L., & MacGregor, D. G. (2005). Affect, risk, and decision making. Health Psychology, 24(4), S35-S40. ♦
Stephens-Davidowitz, S. (2017). Everybody lies: Big data, new data, and what the Internet reveals about who we really are. Dey St. ♦
Stice, E., Shaw, H., Becker, C. B., & Rohde, P. (2008). Dissonance-based interventions for the prevention of eating disorders: Using persuasion principles to promote health. Prevention Science, 9, 114-128. ♦
Stone, J., Aronson, E., Crain, A. L., Winslow, M. P., & Fried, C. B. (1994). Inducing hypocrisy as a means of encouraging young adults to use condoms. Personality and Social Psychology Bulletin, 20, 116-128.
Stone, R. A., Maguire, M. G., & Quinn, G. E. (2000). Myopia and ambient night-time lighting. Nature, 404(6774), 143-144. ♦
See Quinn, Shin, Maguire, and Stone (1999).
Thaler, R. H., & Benartzi, S. (2004). Save More Tomorrow™: Using behavioral economics to increase employee saving. Journal of Political Economy, 112(S1), S164-S187. ♦
Tyner, A. (2021, August). How to sell SEL: Parents and the politics of social-emotional learning. Washington, DC: Thomas B. Fordham Institute. https://sel.fordhaminstitute.org/ ♦
Tyner (2021) reports that parents prefer the labels Life Skills, followed by Social-Emotional and Academic Learning, both considerably more than Social-Emotional Learning, which ranked 11th on a list of 12 labels. The report also describes parents' perceptions of 32 subjects, skills, or values in order of importance (Figure 5, p. 9). Parents rated reasoning and problem solving (score: 676) as the most important skill by a fair margin, followed by these ten: mathematics (431); career, technical, and vocational education (361); English and reading (337); responsibility for actions (301); communication and interpersonal skills (254); self-confidence (230); science (210); computer science or IT (199); self-motivation (161); and integrity (152).
Vosoughi, S., Roy, D., & Aral, S. (2018). The spread of true and false news online. Science, 359(6380), 1146-1151. ♦
Zadnik, K., & Jones, L. A. (2000). Myopia and ambient night-time lighting. Nature, 404(6774), 143-144. ♦
See Quinn, Shin, Maguire, and Stone (1999).