Weird Science: Ten Years of Informal Science Workshops

Robert E. Pyatt,
Ohio State University

Introduction

As educators, we are frequently challenged to develop interesting and educationally robust methods for promoting critical thinking in our classrooms. Once our students have graduated, their opportunities to further develop those critical thinking skills are greatly diminished. For the last ten years I have conducted informal science outreach workshops outside of the classroom setting, which I call “Weird Science.” In the discussion that follows, I’ll introduce the concepts behind these workshops and the strategies I have used to promote science and critical thinking skills among diverse audiences. I’ll conclude with some challenges I have encountered and provide anecdotal feedback from attendees on the significance of these events.

Weird Science

Weird Science workshops are part journal club, part citizen science project, and part stand-up comedy. Having previously written for the Annals of Improbable Research, I have adopted their slogan of making “people laugh and then think.” Through Weird Science I have appeared before diverse audiences including lunch clubs, summer school programs, book clubs, science fiction conventions, and MENSA chapters, in informal learning environments such as public libraries, hotel ballrooms, gymnasiums, waterparks, bars, restaurants, and churches. Each session typically lasts from sixty to ninety minutes and includes a review of three to four science articles and participation in a hands-on experiment. Both parts are designed to be interactive, fostering maximum audience participation through group discussion of the data review and analysis and through the hands-on activity itself. The content is tailored for either adult or family audiences.

The educational framework of Weird Science is based on training I received in the philosophical, pedagogical, and scientific aspects of education through the Fellowships in Research and Science Teaching (FIRST) program, organized cooperatively by Emory University, Clark Atlanta University, Spelman College, Morehouse College, and Morehouse School of Medicine (Holtzclaw et al. 2005). This fantastic program combines a traditional post-doctoral research experience with formal instruction in teaching and learning methods and a mentored teaching experience at one of the minority-serving institutions in the Atlanta area. Specifically, I have covered topics drawn from Barbara Davis’s book Tools for Teaching, which was used as a text for this program (Davis 1993): encouraging student participation in discussions, tactics for effective questioning, fielding student questions, and alternatives to lecturing. Although the book focuses on formal classroom techniques, I have found many of its principles to be applicable to informal teaching as well.

Figure 1. The author presenting a Weird Science workshop in late 2014. The caption on the image behind the author reads “Because Chocolate Can’t Get You Pregnant.”

Weird Science contains many of the strands recently outlined by the National Research Council for learning in informal spaces. These include reflecting on science as a process; participating in science activities involving scientific language and tools; manipulating, testing, and exploring the natural and physical world; and experiencing excitement and motivation to learn about our world (Bell et al. 2009). My goal is to make each session funny, educational, and informative for everyone, regardless of age or science background.

Part Journal Club

The majority of a Weird Science workshop is composed of audience analysis and discussion of scientific articles, as in a typical science journal club. The articles I draw from include primary, peer-reviewed literature as well as reports from the mass media. In many cases, this is the first time audience members have ever been exposed to a peer-reviewed publication, and I find demystifying the scientific literature to be an important goal. While the prospect of fostering a discussion of primary scientific articles among individuals with diverse science backgrounds may seem daunting, selecting appropriate papers has been the key to success. The most suitable publications typically involve a topic that requires minimal background information to understand the hypothesis, a simple experimental design used to address that question, and, most importantly, a subject that can quickly grab attention and stoke curiosity. For example, little background knowledge is needed to understand the importance of identifying methods to safely transplant animals to new habitats, such as those discussed in “Transplanting Beavers by Airplane and Parachute” (Heter 1950). Participants can easily understand the experimental design in “Testing the Validity of the Danish Urban Myth That Alcohol Can Be Absorbed through Feet: Open Labelled Self Experimental Study” (Hansen et al. 2010), in which subjects immersed their feet in vodka for three hours and then monitored their blood alcohol levels. Finally, the papers already mentioned and many others, including “My Baby Doesn’t Smell as Bad as Yours: The Plasticity of Disgust” (Case et al. 2006), “Robot Vacuum Cleaner Personality and Behavior” (Hendriks et al. 2011), and “Do Women Spend More Time in the Bathroom Than Men?” (Baille et al. 2009), illustrate how a great subject can quickly pique interest.

By using these examples, and many others over the last ten years, I have been able to guide participants with little to no formal training in science through a critical review of the scientific methodology, data analysis, and conclusions presented in these publications. For example, when asked to design their own method to test the myth of alcohol absorption through feet, many audiences initiated spirited discussions concerning what type of alcohol to use (percentage alcohol content) and what controls would be appropriate for such a study. Participants then contrasted their experimental designs with the one used in the published report, which opted for vodka (37.5 percent alcohol by volume) but included no real controls (Hansen et al. 2010). For the study “Robot Vacuum Cleaner Personality and Behavior” (Hendriks et al. 2011), which surveyed a population of six individuals as part of its methodology, participants correctly recognized that such a small sample size does not provide statistically reliable support for the conclusions drawn by the authors. The differences between hypothesis-driven research and observational science can be illustrated through case studies such as “Pharyngeal Irritation after Eating Cooked Tarantula” (Traub et al. 2001). Mass media articles like “Swedish Cows Make Lousy Earthquake Detectors” (The Local 2009) can be used to explain what peer review is and to prompt a discussion of the differences between peer-reviewed scientific literature and reports from mass media sources. The history of science can be explored through publications such as “The Behavior of Young Children under Conditions Simulating Entrapment in Refrigerators” (Bain et al. 1958). In the end, science articles like these are ideal for stimulating discussions about the scientific method and data analysis, regardless of participants’ formal scientific training.
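To make the sample-size point concrete, the short sketch below (my own illustration for this article, not material from the workshops or from Hendriks et al.) computes a 95% Wilson confidence interval around a survey proportion. The two-thirds preference and the sample sizes are invented placeholders; the point is simply that six respondents leave the plausible range enormous, while hundreds of respondents pin it down.

```python
# A minimal sketch, assuming nothing beyond the Python standard library, of why
# a six-person survey cannot reliably support a conclusion: the 95% Wilson
# confidence interval around an observed proportion stays very wide at n = 6.
import math

def wilson_interval(successes, n, z=1.96):
    """Return the 95% Wilson score interval (low, high) for a proportion."""
    p_hat = successes / n
    denom = 1 + z**2 / n
    center = (p_hat + z**2 / (2 * n)) / denom
    half_width = z * math.sqrt(p_hat * (1 - p_hat) / n + z**2 / (4 * n**2)) / denom
    return center - half_width, center + half_width

for n in (6, 60, 600):
    # Pretend two-thirds of respondents favored one option (a placeholder value,
    # not a figure from any cited study).
    low, high = wilson_interval(round(2 * n / 3), n)
    print(f"n = {n:3d}: true preference could plausibly lie anywhere from {low:.0%} to {high:.0%}")
```

Run as written, the n = 6 line spans roughly 30 to 90 percent, which is exactly the kind of uncertainty participants intuit when they question conclusions drawn from six people.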

While finding appropriate journal articles with these characteristics within the vast body of published literature may seem overwhelming, there are actually many resources that one can mine. Both the Annals of Improbable Research and the Journal of Irreproducible Results feature odd science topics in every issue. There is also a wealth of blogs, including Scicurious (https://www.sciencenews.org/blog/scicurious) and Seriously, Science? at Discover Magazine (http://blogs.discovermagazine.com/seriouslyscience/), that highlight strange science publications. Additionally, many end-of-year “best of” lists now include odd science discoveries among their categories. Fortunately, I have always held some form of academic position with access to nearly all of these publications through the fantastic library resources found at colleges and universities across the United States. With the gradual adoption of open access policies, many of these articles are now freely accessible to participants after the workshop.

Part Citizen Science Project

The last third of a Weird Science session involves audience participation in examining a scientific question. It has been suggested that involving the public in citizen science projects can deepen their understanding of both science content and the process of science (Cohn 2008). While most citizen science projects are long-term studies in which participants play a minor role, these exercises are smaller in scale and are selected so that participants can be actively involved in both data collection and interpretation. I again draw directly from the primary literature for inspiration; previous topics have included stall preference in public bathrooms (Christenfeld 1995), left/right-side preference for tasks such as holding a small dog (Abel 2010), and whether Dippin’ Dots (tiny frozen spheres of ice cream) can cause ice cream headache (Kaczorowski and Kaczorowski 2002).

While the exact series of steps differs depending on the topic of investigation, this section typically includes a brief discussion of the background knowledge behind a specific scientific question and an experiment, in the form of a hands-on activity or survey, to test the discussed hypothesis. For example, Chittaranjan and Srihari published a report in the Journal of Clinical Psychiatry examining nose-picking behavior in two hundred school-age children in Bangalore City (Chittaranjan and Srihari 2001). As the instrument used in that study is included in the article, I would hand out that short survey and ask any interested individual to anonymously answer the questions on their nose-picking behavior. Once these responses are collected, I would introduce the publication and discuss any limitations in its methodology, in this case issues such as the honesty of respondents and response selection bias when using surveys. The group then discusses the results from the paper, allowing attendees to compare their own answers to questions like “Do you believe that nose picking is a bad habit?” and “Do you occasionally eat the nasal matter that you have picked?” with the complete data set from the article (Chittaranjan and Srihari 2001).

While I vary the articles I cover for every Weird Science workshop, I conduct the same scientific experiment at all presentations during a program year running from July to June. This allows me to amass a large data set examining a specific hypothesis and to compare results from the Weird Science experiments with those reported in the original manuscript. Most venues invite me back annually, which means I can present the cumulative data set from the complete year upon my return visit and allow the audience to draw parallels and conclusions from our data in relation to the original published study. Most importantly, we discuss how no scientific study is perfect and identify the limitations of our own study methods, which affect how we can analyze the data and draw conclusions from it.
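For readers curious about the bookkeeping behind that cumulative data set, the sketch below shows one minimal way the pooling could be done. The survey question, response categories, example records, and “published” percentages are all hypothetical placeholders of my own, not values from any workshop or cited article.

```python
# A minimal sketch of pooling one program year of workshop survey responses and
# setting the pooled proportions beside a published result. All data below are
# hypothetical placeholders.
from collections import Counter

workshop_responses = [
    # one record per returned response card, accumulated across venues
    {"venue": "library", "answer": "yes"},
    {"venue": "bar", "answer": "no"},
    {"venue": "convention", "answer": "yes"},
]

counts = Counter(record["answer"] for record in workshop_responses)
total = sum(counts.values())
pooled = {answer: count / total for answer, count in counts.items()}

# Placeholder figures standing in for the original study's reported proportions.
published = {"yes": 0.90, "no": 0.10}

for answer, published_fraction in published.items():
    workshop_fraction = pooled.get(answer, 0.0)
    print(f"{answer!r}: workshops {workshop_fraction:.0%} vs. published {published_fraction:.0%}")
```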

Part Stand-Up Comedy

In the last few years, publications have appeared examining the use of humor in science communication, with both positive (Roth et al. 2011; Pinto et al. 2013) and negative conclusions (Lei et al. 2010). While acknowledging that humor can have positive effects in education, Lei et al. also comment that some types of humor can be viewed as offensive and therefore unfit for a classroom setting. Additionally, humor that is excessive or forced may be viewed negatively and can undermine the credibility of the educator (Lei et al. 2010). Through an analysis of videotape recordings of first-year teachers, Roth et al. describe multiple types of humor in the classroom and identify laughter as “a collective interactive achievement of the classroom participants that offsets the seriousness of science as a discipline” (Roth et al. 2011).

Figure 2. Clay creations made by attendees in 2013, testing whether working with modeling clay can alleviate chocolate cravings.

I rely heavily on humor as an instructional and entertainment tool, and it takes three general forms. First, many of the articles themselves contain classic bits of humor I can draw from directly. For example, in the study “Observing a Fictitious Stressful Event: Haematological Changes, Including Circulating Leukocyte Activation,” the authors determine whether immune cells are activated when participants view a fictitious stressful event by having them watch “The Texas Chainsaw Massacre” (Mian et al. 2003). In commenting on the study’s conclusions disproving the Danish myth of absorbing alcohol through the feet, the authors write, “Driving or leading a vessel with boots full of vodka seems to be safe” (Hansen et al. 2010). Second, as I typically use PowerPoint to deliver figures and images from these publications, I can draw on the internet’s extensive collection of clip art to enhance my presentations graphically. Finally, the responses from participants themselves during the experimental portion are often excellent sources of humor. When reviewing the results of our test of whether a modeling clay activity can alleviate chocolate cravings, I show pictures of some of the clay creations made during that activity. While I encourage everyone to treat the experiments with an appropriately “serious” attitude, I see a wide range of interpretations. In response to a question concerning their favorite ice cream flavor, participant answers included “blue,” “orange sherbet,” and “Ben and Jerry’s Vanilla Nut Cream of the shimmering hills crowded among the snowy valley.” As part of a study on body hair patterns, participants responded to a question on unusual body hair locations with answers including “I have it on the tops of my feet but no, I am not Frodo Baggins” and “Only when I am around my cat.” While not necessarily fulfilling the intent of the questions asked, these responses are funny in a good-natured way and provide a great teachable moment to illustrate some of the challenges of using surveys as a research instrument.

It has been suggested that humor may not be an appropriate tool for science communication because audiences lack the background knowledge to get the jokes (Marsh 2013), because speakers present themselves as elite individuals (science experts) elevated above the audience (Marsh 2013), or because humor can be derived only when the audience asserts its superiority over the shortcomings of a particular situation (Billig 2005). I would instead argue that humor is a powerful tool in any educational setting, and that these pitfalls are avoided by the organization and delivery of Weird Science. The audience members themselves serve as the scientists as they work through the various analysis and experimentation exercises. Consequently, I serve more as a “guide on the side” than as an all-knowing “sage on the stage.” My selection of articles specifically ensures that extensive background information is not needed to get any particular joke, and the workshops show that critical review is an integral part of the scientific process, one which need not include an air of superiority. Finally, humor is essential to making these sessions entertaining and promoting a general feeling that the audience’s time has been well spent.

Putting It All Together

To demonstrate how all of these parts come together to form a complete program, I’ll describe a recent workshop I presented at the Multiple Alternative Realities Convention (MarCon) in Columbus, Ohio. The workshop lasted approximately seventy-five minutes and began with a discussion of “Do Bees Like Van Gogh’s Sunflowers?” (Chittka and Walker 2006). I used this paper to foster a discussion of the study’s methods, which measured bees’ preference for pictures with and without flowers but used different media for the images, including posters with reprints of original works, oil on canvas, and an acrylic-on-canvas-board reproduction of Van Gogh’s painting by another artist. The audience noted that the inconsistent use of media complicated the interpretation of the bees’ preferences for the images. Next we reviewed the previous year’s citizen science project, based on “Use of a Clay Modeling Task to Reduce Chocolate Craving” (Andrade et al. 2012). After reviewing the results from that study, the audience contrasted the published methods with the study they had participated in and noted that while the original had selected individuals who self-described as “chocolate lovers,” our population was not pre-screened in such a way. This difference may have contributed to our failure to reproduce the study’s findings.

Next I presented the paper “Skipping and Hopping of Undergraduates: Recollections of When and Why” (Burton et al. 1999). The authors highlight that one percent of undergraduates surveyed report never having skipped or hopped, which the audience noted may reflect more on the selective memories of the respondents and the limitations of surveys as experimental instruments than on actual events. The case report “The Case of the Haunted Scrotum” (Harding 1996) was used to illustrate the difference between hypothesis-driven research and observational science. Finally, the audience was challenged to design an experiment to test whether watching different types of television programs affects the amount of food consumed during snacking, as studied in the paper “Watch What You Eat: Action-Related Television Content Increases Food Intake” (Tal et al. 2014). We closed the workshop with a new citizen science project examining the types of rubber glove creations attendees would make to calm an upset child in the setting of a pediatric doctor’s office. Once I had recorded the types of creations made, the audience compared their creations to the child preferences reported in the study “The ‘Jedward’ versus the ‘Mohawk’: A Prospective Study on a Paediatric Distraction Technique” (Fogarty et al. 2014).

Challenges

While I have loved presenting these workshops, they have not been without their challenges. Because of the diversity of scientific backgrounds in audience members, I have seen participants with more science experience unintentionally dominate discussions. The job of moderator is an important one and requires a sensitive touch in these informal settings to maintain a balance between a lively group discussion and basic crowd control. Additionally, while I have often found myself presenting in bars, I have luckily never found the inclusion of alcohol to be a negative factor. However, its presence can change the discussion dynamics, and I am always on guard in such situations for alcohol-related complications such as heckling.

I find identifying appropriate articles to be relatively easy, but designing the hands-on component has proven more complicated. The diversity of locations where I present limits the types of hands-on experiments that can practically be done. Surveys have become an easy solution to these logistical issues, but I try to use them sparingly, only when I cannot identify another subject that allows more active experimentation. Because the majority of these workshops are free, the cost of any reagents (ice cream, chocolate, rubber gloves, etc.) comes directly out of my own pocket, and a lack of external funding further limits experimental complexity.

Occasionally, I have perceived a slight air of disappointment from participants when our attempts to replicate a published scientific study fail, as in the clay modeling activity to alleviate chocolate cravings. While situations such as this provide excellent educational opportunities to discuss how the process of science is full of errors and failed experiments (for whatever reason), a lack of exciting results does work against the entertainment goal of the workshops. I have tried to redirect negative feelings through analogies to the TV show Mythbusters by discussing how replication is the foundation of science and how our negative results may have disproved a questionable hypothesis (with caveats regarding differences between our experimental method and the published study).

Anecdotal Feedback

I have honestly been thrilled with the level of success I have experienced with Weird Science. I have never made a formal attempt to evaluate the effectiveness of these sessions or to track attendance, but written responses to the experimentation portion over the last four years can be used to at least estimate the number of attendees participating annually. For each year from 2011 through 2014, between 192 and 207 people participated, with ages ranging from 17 to 79 years and approximately equal numbers of male and female respondents. I would estimate that at any one workshop, between one half and two thirds of attendees participate in the science experiment.

Finally, the success of these sessions has led me to create a Facebook group called “Weird Science with Rob Pyatt” to continue similar scientific discussions outside of the workshops through social media. In preparation for this paper, I asked group members who had previously attended a workshop a few questions regarding their views on and experiences with Weird Science sessions. While this is far from a scientific evaluation, I think these anecdotal responses begin to illustrate the value of this unique informal education format. When asked if something surprised them about a Weird Science workshop, two individuals responded “The amount of time devoted to discussing data collection and study. I learned more about how science works than any actual science itself,” and “Science can be fun.” When asked why they took the time to attend a Weird Science workshop, answers included “Because you don’t just lecture, you involve everyone in the process so that they understand how a scientific study should work,” and “Learning and entertainment!” One final comment from a participant on why they have attended sessions in the past: “You engagingly discuss science in a way that I who has a minimal science background and my fiancé who has a degree in chemistry can both enjoy.” I’ll close with an unsolicited comment I received in 2013 from a mother who had attended a session with her daughter; I hope it serves to illustrate the impact these workshops can have. She posted “Just wanted to let you know that you are an influence on young minds. My mom was talking about some ‘study’ she saw on TV (with a test group of one) and my daughter immediately started countering with all the reasons this was NOT a scientifically valid study. So proud!”

About the Author

Robert E. Pyatt is an Associate Director of the Cytogenetics and Molecular Genetics Laboratories at Nationwide Children’s Hospital and an Assistant Professor-Clinical in the Department of Pathology at Ohio State University. He received his M.S. from Purdue University and Ph.D. from Ohio State University. Rob is also the chair of the JW Family Science Extravaganza, a satellite event of the USA Science and Engineering Festival held annually in Hilliard, Ohio.

References

Abel, E.L. 2010. “Human Left-Sided Cradling Preferences for Dogs.” Psychological Reports 107 (1): 336–338.

Andrade, J., S. Pears, J. May, and D.J. Kavanagh. 2012. “Use of a Clay Modeling Task to Reduce Chocolate Craving.” Appetite 58: 955–963.

Baille, M.A., S. Fraser, and M.J. Brown. 2009. “Do Women Spend More Time in the Bathroom Than Men?” Psychological Reports 105:789–790.

Bain, K., M.L. Faegre, and R.S. Wyly. 1958. “The Behavior of Young Children under Conditions Simulating Entrapment in Refrigerators.” Pediatrics 22: 628–647.

Bell, P., B. Lewenstein, A. Shouse, and M. Feder. 2009. Learning Science in Informal Environments: People, Places, and Pursuits. Washington, DC: National Academies Press.

Billig, M. 2005. Laughter and Ridicule: Towards a Social Critique of Humor. London: SAGE.

Burton, A.W., L. Garcia, and C. Garcia. 1999. “Skipping and Hopping of Undergraduates: Recollections of When and Why.” Perceptual and Motor Skills 88: 401–406.

Case, T.I., B.M. Repacholi, and R.J. Stevenson. 2006. “My Baby Doesn’t Smell as Bad as Yours: The Plasticity of Disgust.” Evolution and Human Behavior 27 (5): 357–365.

Chittaranjan, C., and B.S. Srihari. 2001. “A Preliminary Survey of Rhinotillexomania in an Adolescent Sample.” Journal of Clinical Psychiatry 62 (6): 426–431.

Chittka, L., and J. Walker. 2006. “Do Bees Like Van Gogh’s Sunflowers?” Optics and Laser Technology 38: 323–328.

Christenfeld, N. 1995. “Choices from Identical Options.” Psychological Science 6 (1): 50–55.

Cohn, J.P. 2008. “Citizen Science: Can Volunteers Do Real Research?” BioScience 58: 192–197.

Davis, B.G. 1993. Tools for Teaching. San Francisco: Jossey-Bass.

Fogarty, E., E. Dunning, K. Stanley, T. Bolger, and C. Martin. 2014. “The ‘Jedward’ Versus the ‘Mohawk’: A Prospective Study on a Paediatric Distraction Technique.” Emergency Medicine Journal 31: 327–328.

Hansen, C.S., L.H. Faerch, and P.L. Kristensen. 2010. “Testing the Validity of the Danish Urban Myth That Alcohol Can Be Absorbed through Feet: Open Labelled Self Experimental Study.” BMJ 341: 1–3.

Harding, J.R. 1996. “The Case of the Haunted Scrotum.” Journal of the Royal Society of Medicine 89 (10): 600.

Hendriks, B., B. Meerbeek, S. Boess, S. Pauws, and M. Sonneveld. 2011. “Robot Vacuum Cleaner Personality and Behavior.” International Journal of Social Robotics 3: 187–195.

Heter, E. 1950. “Transplanting Beavers by Airplane and Parachute.” The Journal of Wildlife Management 14 (2): 143–147.

Holtzclaw, J.D., L.G. Morris, R. Pyatt, C.S. Giver, J. Hoey, J.K. Haynes, R.B. Gunn, D. Eaton, and A. Eisen. 2005. “FIRST: A Model for Developing New Science Faculty.” Journal of College Science Teaching 34: 24–29.

Kaczorowski, M., and J. Kaczorowski. 2002. “Ice Cream Evoked Headaches (ICE-H) Study: Randomized Trial of Accelerated Versus Cautious Ice Cream Eating Regimen.” BMJ 325: 21–28.

Lei, S.A., J.L. Cohen, and K.M. Russler. 2010. “Humor on Learning in the College Classroom: Evaluating Benefits and Drawbacks from Instructors’ Perspectives.” Journal of Instructional Psychology 37 (4): 326–331.

The Local. 2009. “Swedish Cows Make Lousy Earthquake Detectors: Study.” January 13, 2009. http://www.thelocal.se/20090113/16876 (accessed May 27, 2015).

Marsh, O. 2013. “A Funny Thing Happened on the Way to the Laboratory: Science and Standup Comedy.” http://blogs.lse.ac.uk/impactofsocialsciences/2013/07/12/a-funny-thing-happened-on-the-way-to-the-laboratory/ (accessed May 27, 2015).

Mian, R., G. Shelton-Rayner, B. Harkin, and P. Williams. 2003. “Observing a Fictitious Stressful Event: Haematological Changes, Including Circulating Leukocyte Activation.” Stress 6 (1): 41–47.

Pinto, B., D. Marcal, and S.G. Vaz. 2013. “Communicating through Humor: A Project of Stand-up Comedy about Science.” Public Understanding of Science. Epub December 9, 2013.

Roth, W.M., S.M. Ritchie, P. Hudson, and V. Mergard. 2011. “A Study of Laughter in Science Lessons.” Journal of Research in Science Teaching 48 (5): 437–458.

Tal, A., S. Zuckerman, and B. Wansink. 2014. “Watch What You Eat: Action-Related Television Content Increases Food Intake.” JAMA Internal Medicine 174 (11): 1842–1843.

Traub, S.J., R.S. Hoffman, and L.S. Nelson. 2001. “Pharyngeal Irritation after Eating Cooked Tarantula.” International Journal of Medical Toxicology 4 (5): 40.

