There is a wealth of literature across many domains that attempts to define what critical thinking is and how it can be implemented in classrooms and lecture theatres, both at undergraduate level and in post-16 education. This essay will attempt to define critical thinking in the context of post-16 psychology delivery, outline several specific interventions and scaffolds that can be implemented in a psychology classroom, and consider how one could measure the effectiveness of any critical thinking strategies. Finally, the nature of critical thinking is discussed and the more appropriate term ‘rational thinking’ is put forward as a label for what is desired from post-16 students.
Defining Critical Thinking
“It is not what the man of science believes that distinguishes him, but how and why he believes it. His beliefs are tentative, not dogmatic; they are based on evidence, not on authority or intuition.”
Russell, 1945, p.527.
When considering strategies developed to improve the critical thinking skills of students in post-16 psychology one should first consider the definition of ‘critical thinking’ (CT). Even within a specific domain such as psychology there is ambiguity in the way that CT is implemented, discussed and assessed. CT scholarship is in a mystified state and no single definition of it is widely accepted (Halonen, 1995). In 2002 the APA task force on learning goals and objectives for psychology listed CT as one of 10 major goals for American undergraduate students (Halonen et al., 2002). Even though there has been much recent research into its implementation and effectiveness in many contexts (Bensley et al., 2010), CT remains controversial as a construct (e.g., Bensley, 2009; Yanchar, Slife & Warne, 2008).
A possible definition suggests that CT is the reflective thinking involved in the evaluation of evidence relevant to a claim so that a sound or good conclusion can be drawn from the evidence (Bensley, 1998). However, Halpern (1998) describes a skills-based approach to teaching CT to students and emphasises the importance of using real-world contexts to build these skills to allow students to apply their CT abilities in other domains (Stark, 2012).
Broadly, there are three approaches to defining CT, stemming from the philosophical, cognitive psychological and educational domains (Lewis & Smith, 1993; Sternberg, 1986). Although each of these areas focuses on different interpretations of the term and puts the emphasis on different components, there is a foundation of similarity running through all three. The philosophical approach focuses on the hypothetical critical thinker, enumerating the qualities and characteristics of this person rather than the behaviours or actions the critical thinker can perform (Lai, 2011; Lewis & Smith, 1993). Alternatively, the cognitive approach to CT focuses more on quantifiable characteristics, consequently defining CT as a list of skills or tasks performed by an individual (Lewis & Smith, 1993). Finally, the educational approach centres on Bloom’s (1956) work and his hierarchical taxonomy of skills, suggesting that the three highest levels (analysis, synthesis and evaluation) represent CT.
Even though there are differing interpretations of CT there is consensus on some components. These four areas of agreement suggest that CT includes:
- Analysing arguments, claims, or evidence (Ennis, 1985; Facione, 1990; Halpern, 1998; Paul, 1992);
- Making inferences using inductive or deductive reasoning (Ennis, 1985; Facione, 1990; Paul, 1992; Willingham, 2007);
- Judging or evaluating (Case, 2005; Ennis, 1985; Facione, 1990; Lipman, 1988); and
- Making decisions or solving problems (Ennis, 1985; Halpern, 1998; Willingham, 2007). (Lai, 2011)
Further to this, the importance of background knowledge in a subject is central to providing a student with the awareness necessary to be an effective critical thinker. Ennis (1989) identifies a range of assumptions regarding domain specificity held by various theorists. Proponents of domain specificity include Willingham (2007), who argues that it is easier to learn to think critically within a given domain than it is to learn to think critically in a generic sense. Similarly, Bailin (2002) argues that domain-specific knowledge is necessary for CT because what constitutes valid evidence, arguments, and standards tends to vary across domains. Facione (2000) designed the California Critical Thinking Skills Test as a general test of critical thinking rather than one embedded within the context of a specific domain. Yet Facione (1990) also notes the importance of domain-specific knowledge in any application of critical thinking skills and abilities. Thus, Facione also falls into the category of researchers who acknowledge both general and domain-specific elements of CT.
Critical Thinking in Post-16 Psychology
Psychology students should be able to think critically, or evaluate claims, in a way that explicitly incorporates basic principles of psychological science; that is, they should have psychological critical thinking (Lawson, 1999). Popper (1963) suggested that all science should start with myths and the techniques to test these myths. One should capitalise on the intrinsic engagement a learner gets from the ‘treasure hunt’ of considering a myth. CT could be embedded into post-16 psychology in many ways, from questioning techniques used by teachers (Yang, 2005) to problem-based learning (PBL) and psychological applied learning scenarios (PALS) (Norton, 2004).
Within psychology, being able to ‘think critically’ is intrinsically linked to a solid understanding of research methods, as many (if not all) issues raised when evaluating others’ work will, at least in some way, fall back on the methods employed, the design of the research, or the participants tested in drawing conclusions about behaviour. It is therefore necessary to ensure that students have this foundation of knowledge of psychological methods so that they have the toolbox of issues that can then be used to critically analyse the arguments, theories and research presented to them.
This raises an issue in the current educational climate of narrow psychology specifications where, at A Level, teachers of research methods are often faced with teaching content to heterogeneous groups of students who have a wide variety of academic backgrounds and knowledge (Porter et al., 2006). As a result it is tempting for teachers to deliver strategic lessons and assessments that ‘teach to the test’ (Halonen et al., 2003), missing the opportunity to extend students’ knowledge and skill set and to develop the higher-level CT skills outlined by Hayes (1996). Consequently, the underlying foundation of knowledge about research methods and the scientific approach is lacking, which can only have a negative impact on a student’s ability to apply their CT skills.
The term ‘psychological literacy’ was first used by Boneau (1990) and subsequently McGovern et al. (2010) used the term ‘psychologically literate citizens’ to refer to students becoming “critically scientific thinkers and ethical and socially responsible participants in their communities” (p. 10). Until students are equipped with the knowledge that allows them to understand and consider alternatives to what is presented to them, their CT abilities will quickly reach a glass ceiling. Many students, when presented with Freud’s work, will be able to criticise it at a surface level using anecdotal points, but unless first equipped with knowledge they would not be able to move to the next level and infer such conclusions as that psychoanalysis contains theories and hypotheses but lacks a method of empirical observation (Engel, 2008). Skilling students in CT must be embedded into the curriculum rather than taught as a standalone module, and must work within the scaffold of psychological knowledge (Case, 2005). Without a foundation of psychological knowledge, CT skills will be limited in all learners.
The following strategies therefore attempt to embed the development of CT skills within knowledge-based sessions on psychological concepts, allowing the learner to develop both in parallel.
Strategies to Improve Critical Thinking in Post-16 Psychology
As addressed in the first section, defining CT is not a simple task, and it would be naive to progress into sessions building students’ CT skills without first introducing them to the concept of CT, what it involves and the teacher’s expectations as far as application to psychology is concerned. To introduce students to the concept of CT and provide a foundation for the following strategies, an initial session (appendix 3) was developed to frame students’ awareness of what CT is and to get them ‘thinking outside of the box’, initially on non-psychological items.
One approach to increasing students’ CT skills is to get them considering methodological issues outside of the narrow framework of each subject specification (Kaminski et al., 2008) and to bring these issues to life (Blair-Broeker, 2003). Activities such as ‘More cat owners have degrees’ (appendix 5), demonstrating the dangers of misinterpreting correlational research and the possible bias caused by funding, and ‘The dangers of bread’ (appendix 5), again illustrating the problem of inferring causation from correlation, act as excellent starting points for discussion about causation and correlation. Articles such as these teach students to be ‘savvy consumers and producers of research’ and develop the abilities needed to analyse, synthesise and apply learned information (Sternberg, 1999).
Once these more abstract activities have been introduced, more specific examples can be used, such as Parkes et al.’s (2013) research on the effects of the media on childhood development, which has been interpreted in vastly differing ways by the media. The article is open access, giving students an awareness of how research is written up and how this can be (mis)represented by the media. By reading the articles from the Independent (Connor, 2013) and the Mail Online (Hope, 2013) in parallel (see figure 1), students identify conclusions in the media coverage that are not explicit in the original article. Finally, the NHS Direct article (NHS Direct, 2013) summarising the study is given as a more reliable comparison.
Students who participate in collaborative learning perform better on critical-thinking tests than students who study individually (Gokhale, 1995). Vygotsky (1978) suggests that students are capable of performing at higher intellectual levels when asked to work in collaborative situations than when asked to work individually. Group diversity in terms of knowledge and experience contributes positively to the learning process (Gokhale, 1995). Further to this, Bruner (1985) provides support for the thesis that cooperative learning methods improve problem-solving strategies because the students are confronted with different interpretations of the given situation. In this vein of research, an approach to evaluation and analysis was developed to provide learners with a collaborative learning experience that scaffolds their evaluative skills. Using a simple Venn diagram (see figure 2) of strengths and weaknesses, each pair of learners is given an evaluation issue (see appendix 11 for examples) with questions to frame their evaluation of a study that has been delivered. As a collaborative full-class activity these are then placed within the Venn diagram, with discussion fed in from all learners, allowing debate on the issue and on how something that initially could be seen as an advantage of a study can also be a disadvantage. If student discussion is not flowing it is supported by Socratic questioning (see appendix 10 for the framework used) to provoke avenues of thought, a technique which has been shown to improve CT skills (Yang, 2005).
As discussed in the first two sections, a key element of CT is not taking results and conclusions at face value and questioning the methods that were used and any biases that these could have introduced when making inferences from results. Several activities have been designed to make learners aware of the acceptance of conclusions and the fallibility of doing this without question. The first activity is based on hindsight bias, or the “I knew that all along” attitude (Roese and Vohs, 2012), helping students become aware of the fact that anything can seem commonplace once explained if you are not aware of the underlying methodology.
This was the rationale for a session (appendices 1, 2 & 4) that starts with the class being divided into two groups, with each half receiving conclusions from a study (adapted from Lazarsfeld, 1949). However, unaware of this, the two groups received opposite findings. For example, group one would receive:
“Better educated soldiers suffered more adjustment problems than less educated soldiers.”
Whereas the second group would have:
“Better educated soldiers suffered fewer adjustment problems than less educated soldiers.”
Each group would have to make inferences about why the conclusions might be true. Following on from the task, students were asked “Did the findings make sense?” and to feed back their reasons. Only at this point is the class made aware that they had opposite findings, and how easy it is to justify a finding after the fact. A discussion about the fallibility of the “I knew that already” attitude follows, in relation to the studies that the students have covered. This allows the learner to review conclusions made and consider alternative arguments, confounding variables and biases in the generalisations made.
To then scaffold students’ analysis and evaluation skills, a set of critical thinking questions to frame the evaluation of research was adapted (Bensley, 2010; Lawson, 1999) (see appendices 1 & 2). These CT questions provide students with important questions that they can use to establish the credibility of a research method. The approach also allows differentiation across learners, providing the opportunity for less able students to give limited responses and for the more able to expand and demonstrate their synoptic awareness of research methods and the surrounding issues and concepts.
Following on from this, and continuing the consideration of methodological issues and inferred conclusions, a final session is delivered on ethnocentrism. Following their initial research, Henrich et al. (2010) argued that there were many ‘certainties’ social science took for granted when explaining behaviour. They suggested that members of WEIRD (Western, Educated, Industrialised, Rich and Democratic) societies, including young children, are among the least representative populations one could find for generalising about humans, and that we need to be less cavalier in addressing questions of human nature on the basis of data drawn from this particularly thin, and rather unusual, slice of humanity (Henrich et al., 2010). To further develop students’ CT skills and make learners aware of some of the wider debates in psychology (ethnocentrism), the ‘We are(n’t) the world’ session (appendix 9) was designed to allow reflection on the participant samples within research and the biases we bring to our interpretation of findings. The session considers the interpretation of research in different areas of the world, the reasons for this, and measurements of behaviour (Deregowski, 1972; Gould, 1982), along with the issues of taking an ethnocentric view of research. For example, students are asked to define ‘intelligence’ and a measurement of it. This is followed by a discussion of ethnocentric measures (Gould, 1982), using examples such as Yerkes’ Army Beta intelligence test (from Gould, 1982) and a ‘student-centric’ test (appendix 8) that uses local colloquialisms to show simply how measures can be biased towards a specific group. The infographic (appendix 6) and Foot and Stanford’s (2004) article ‘The use and abuse of student participants’ feed discussion of participant samples, which is then applied to the research that the students are studying.
Measures of Effectiveness of Critical Thinking Strategies
Problems and disagreements in conceptualising CT have posed serious challenges to its assessment (Bensley and Murtagh, 2012). CT is a multidimensional construct, and assessments of it should include measures of skills, dispositions, and metacognition. One could use specific measures of CT to assess the effectiveness of any intervention. Two of the most commonly used standardised CT tests, the Watson-Glaser Critical Thinking Appraisal and the Cornell Critical Thinking Test, have both been shown to have good reliability and validity (Follman, 2003; Gadzella et al., 2006).
Further to this, there is an association between student engagement and critical thinking skills: student engagement is linked positively to desirable learning outcomes such as critical thinking (Carini et al., 2006). Actively engaging students in lessons by encouraging CT fosters an active learning scenario in which learners are ‘thinking’ about and considering the information in front of them rather than passively assimilating it. The measurement of student engagement, and its consequent impact on learning, is an issue of much research and debate (Fredericks et al., 2005; Lippman and Rivers, 2008; Kuh, 2003; Scholman, 2002), and several student engagement measures have been created, such as the School Engagement Scale (Fredericks et al., 2005), which considers engagement on three levels: behavioural, emotional and cognitive. The impact of the strategies above will be evident in both formative and summative assessments. Increased critical thinking skills would manifest themselves in the AO2/AO3 bands, and would also produce learners who are more aware of the deeper psychological issues surrounding research development.
Action research could be run over the course of a year investigating the effectiveness of each intervention on different levels of learners. Research conducted at undergraduate level has found that groups receiving explicit critical thinking skills instruction showed significantly greater gains in their argument analysis skills than groups receiving no explicit critical thinking instruction (Bensley et al., 2010). These results support the effectiveness of explicitly teaching critical thinking skills infused directly into regular course instruction.
The transition from post-16 education to HE and university careers is the subject of continued debate, with universities making increasing reference to students who are unprepared for degree-level study (BPS Psychology Education Board, 2013; Jarrett, 2010). Banyard stated that “the ideal psychology undergraduate is someone who engages, who has an interest, has their own ideas, and the ability to look beyond the question…” (Banyard, in Jarrett, 2010, p. 715); all of these are skills that fall under the umbrella of CT. Therefore, a further measure of effectiveness is the ease of transition from post-16 education into higher education and self-reported preparedness for study at that level.
Shepard (1983) echoes the idea that one of the most valuable measures of CT will not necessarily be seen in academic writing or exam performance, but instead in a life skill set that the learner will take with them.
“Although most undergraduate psychology students may not go on to scientific careers, one hopes that they acquire some facility for the critical evaluation of the incomplete, naive, confused, or exaggerated reports of social science ‘findings’ to which they will continue to be exposed by the popular media. Widespread notions that human behaviour and mental phenomena can be adequately understood through unaided common sense or, worse, by reference to nonempirical pseudosciences, such as astrology, present us with a continuing challenge” (Shepard, 1983, p.855).
As teachers we are often teaching to the test, preparing learners with the knowledge that is explicitly dictated to us by awarding bodies’ specifications; however, it is important for us to consider that years after the content is gone, CT skills remain (Stanovich, 2009). By investing lesson time in fostering CT skills one benefits the learner on two dimensions: first, by empowering them to evaluate the research, theories and concepts that they are required to learn at A Level with greater skill; and second, by equipping them with a skill base that will aid the transition into higher education, or simply prepare them for reading a copy of The Daily Mail (Goldacre, 2009).
Although CT is not directly examined on any of the current A Level specifications, learners who have sound skills in this area will demonstrate increased evaluative and analytical abilities and increased performance on those questions that require the candidate to apply their knowledge to a novel situation. Teachers have to ensure that they are equipping their learners with the necessary skills as well as the knowledge that the specifications outline, to allow them to become better consumers of psychological research (Sternberg, 1999). Teachers need to find opportunities to infuse CT in ways that fit the content and skill requirements of their course. Further, as with other skills that we foster in our learners, there must be a realisation that to improve CT skills students must be given opportunities to practise them (Bensley, 2010) and be allowed to work collaboratively to evaluate and analyse issues (Gokhale, 1995). The classroom is a natural laboratory with research participants readily accessible for instructors to do assessment research, affording the opportunity to test the effectiveness of instructional strategies and programs hypothesised to improve student learning and thinking (Bensley & Murtagh, 2012).
Learning about the quality of evidence and drawing appropriate conclusions from scientific research are central to teaching CT in psychology (Bensley, 2010). This is not a new concept: Newman (1852), discussing what education should be about and what a teacher should aim to instil in their learners, stated:
“It is the education which gives us a clear and conscious view of our own opinions and judgements, a truth in developing them, an eloquence in expressing them, and a force in urging them. It teaches us to see things as they are, to go right to the point, to disentangle a skein of thought, to detect what is sophisticated, and to discard what is irrelevant.” (Newman, 1852)
For students to be able to do this they need to be equipped with the knowledge and concepts of research methods that will allow them to address issues in the research presented to them. If teachers can provide their learners with this skill it will prove invaluable in future study. Education should take as a basic aim the fostering of rationality; and rationality, or its educational cognate, critical thinking, should be taken to be a fundamental educational idea (Siegel, 1989).
Rationality is a matter of reasons, and to take it as a fundamental educational idea is to make as pervasive as possible the free and critical quest for reasons, in all realms of study (Scheffler, 1973). There is a deep conceptual connection between critical thinking and rationality: education aimed at the propagation of CT is nothing less than education aimed at the fostering of rationality (Siegel, 1989). Although this may seem merely an argument over semantics, with many seeing rational thinking simply as a cognate of critical thinking, it provides the learner (and perhaps the teacher) with a more concrete idea of what is expected.
Science’s rationality, and consequently psychology’s rationality, comes from a commitment to evidence (Siegel, 1989). Psychology’s rationality is therefore a function of its method, and this is where the teaching of rational/critical thinking should focus. This would allow students not only to appreciate the philosophy-of-science debate at work but also to gain a solid understanding of research methods as an evaluation issue, improving their abilities to discuss and debate at a higher level, which will be evident in both the AO2 and AO3 bands.
CT is more than just providing learners with the tools to distinguish between the true science within psychology and approaches that are closer to pseudopsychology. It is fostering an inquisitive and questioning mind in our students: helping them find the ‘right’ questions to ask when presented with research and giving them the confidence to move away from what the textbooks are telling them to their own rational and critical thought. These attitudes need to be embedded into the curriculum and teaching from the first session to engage their critical minds. Psychology is not about answering the question of why we think and behave the way that we do; this is far too simplistic a definition and can lead students to believe that we are on a misguided search for a ‘magic bullet’ when it comes to explanations of human behaviour. Psychology progresses by investigating solvable empirical problems. This progress is uneven because psychology is composed of many different subareas, and the problems in some areas are more difficult than in others (Stanovich, 2013).
Critical thinking requires confidence in one’s own awareness and knowledge; this is the keystone to freeing a learner to approach research with a critical and questioning mind. Overcoming the initial leap of faith required for the learner to move away from rote-learned evaluation and analysis from textbooks is a challenge that teachers will have to address through the development of activities that allow learners to make this transition and gain confidence in their own questions. Intrinsically these activities will engage the learner, taking them back to their childhood days when it was okay to ask ‘but why?’ It seems that as we age we lose this ability to question things that we do not understand. Through teaching CT skills one can hope to recapture some of this creative thought and embed it within a psychology curriculum.
Bailin, S. (2002). Critical thinking and science education. Science & Education, 11(4), 361–375.
Bensley, D.A. (2008). Can you learn to think more like a psychologist? The Psychologist, 21, 128–129.
Bensley, D.A. (2009). Thinking critically about critical thinking approaches. Review of General Psychology, 13(3), 275-277.
Bensley, D.A. (2010). A Brief Guide for Teaching and Assessing Critical Thinking in Psychology. Observer, 23(10).
Bensley, D.A., Crowe, D.S., Bernhardt, P., Buckner, C., & Allman, A.L. (2010). Teaching and Assessing Critical Thinking Skills for Argument Analysis in Psychology. Teaching of Psychology, 37, 91-96.
Bensley, D.A., & Murtagh, M.P. (2012). Guidelines for a Scientific Approach to Critical Thinking Assessment. Teaching of Psychology, 39(1), 5-16.
Blair-Broeker, C. (2003). Bringing psychology to life. In Buskist, W., Hevern, V., & Hill, G.W. (Eds.), Essays from e-xcellence in teaching. Retrieved from http://teachpsych.lemoyne.edu/teachpsych/eit/index.html.
Bloom, B.S. (1956) Taxonomy of Educational Objectives, Handbook I: The Cognitive Domain. New York: David McKay Co Inc.
Boneau, C.A. (1990). Psychological literacy: A first approximation. American Psychologist, 45, 891-900. Retrieved from http://people.auc.ca/brodbeck/4007/article12.pdf .
Bruner, J. (1985). Vygotsky: An historical and conceptual perspective. Culture, communication, and cognition: Vygotskian perspectives, 21-34. London: Cambridge University Press.
Case, R. (2005). Moving critical thinking to the main stage. Education Canada, 45(2), 45–49.
Carini, R.M., Kuh, G.D., & Klein, S.P. (2006). Student Engagement and Student Learning: Testing the Linkages. Research in Higher Education, 47(1), 1-32.
Connor, S. (2013, March, 26). Watching TV for three hours a day will not harm your children, parents told. The Independent. Retrieved from http://www.independent.co.uk/arts-entertainment/tv/news/watching-tv-for-three-hours-a-day-will-not-harm-your-children-parents-told-8549131.html
Deregowski, J.B. (1972). Pictorial perception and culture. Scientific American, 227, 82-88.
Engel, J. (2008). American therapy. New York: Gotham Books.
Ennis, R.H. (1985). A logical basis for measuring critical thinking skills. Educational Leadership, 43(2), 44–48.
Facione, P. A. (1990). Critical thinking: A statement of expert consensus for purposes of educational assessment and instruction (Executive Summary). Millbrae, CA: The California Academic Press.
Facione, P. A. (2000). The disposition toward critical thinking: Its character, measurement, and relation to critical thinking skill. Informal Logic, 20(1), 61–84.
Follman, J. (2003). Reliability estimates of contemporary critical thinking instruments. Korean Journal of Thinking & Problem Solving, 13, 73–81.
Foot, H., and Stanford, A. (2004). The use and abuse of student participants. The Psychologist, 17(5), 256-259.
Fredericks, J.A., Blumenfeld, P., Friedel, J., & Paris, A. (2005). School engagement. In K.A. Moore & L. Lippman (Eds.) What do children need to flourish?: Conceptualizing and measuring indicators of positive development. New York, NY: Springer Science and Business Media
Gadzella, B. M., Hogan, L., Masten, W., Stacks, J., Stephens, R., & Zascavage, V. (2006). Reliability and validity of the Watson-Glaser critical thinking forms for different academic groups. Journal of Instructional Psychology, 33, 141–143.
Gokhale, A.A. (1995). Collaborative Learning Enhances Critical Thinking. Journal of Technology Education, 7(1). Retrieved from http://scholar.lib.vt.edu/ejournals/JTE/v7n1/gokhale.jte-v7n1.html?ref=Sawos.Org
Goldacre, B. (2009). Bad Science. Harper Perennial: London.
Gould, S.J. (1982) A nation of morons. New Scientist, 349-52.
Halonen, J. S. (1995). Demystifying critical thinking. Teaching of Psychology, 22(1), 75–81.
Halonen, J.S., Appleby, D.C., Brewer, C.L., Buskist, W., Gillem, A. R., Halpern, D. F., et al. (APA Task Force on Undergraduate Major Competencies). (2002) Undergraduate psychology major learning goals and outcomes: A report. Washington, DC: American Psychological Association. Retrieved August 27, 2008, from http://www.apa.org/ed/pcue/reports.html .
Halonen, J. S., Bosack, T., Clay, S., McCarthy, M., Dunn, D. S., Hill, G. W., et al. (2003) A rubric for learning, teaching and assessing scientific inquiry in psychology. Teaching of Psychology, 30(3), 196-208.
Halpern, D.F. (1998). Teaching critical thinking for transfer across domains: Dispositions, skills, structure training, and metacognitive monitoring. American Psychologist, 53(4), 449–455.
Hayes, N. (1996). What makes a psychology graduate distinctive? European Psychologist. 1(2), 130-34.
Henrich, J., Heine, S.J., & Norenzayan, A. (2010). The weirdest people in the world? Behavioural and Brain Sciences, 33, 61-135.
Hope, J. (2013, March, 26). More than three hours of TV ‘makes youngsters naughtier by the age of seven’. The Mail Online. Retrieved from http://www.dailymail.co.uk/health/article-2299153/More-hours-TV-makes-youngsters-naughtier-age-seven.html
Jarrett, C. (2010). The journey to undergraduate psychology. The Psychologist, 23 (9), 714-717.
Kaminski, J.A., Sloutsky, V.M., and Heckler, A.F. (2008). The advantage of abstract examples in learning math. Science, 320, 454-455.
Kuh, G.D. (2003). What we’re learning about student engagement from NSSE: Benchmarks for effective educational practices. Change: The Magazine of Higher Learning, 35, 2, 24-32.
Lai, E.R. (2011). Critical Thinking: A Literature Review. Pearson Assessments. Retrieved Aug, 2, 2013 from http://www.pearsonassessments.com/hai/images/tmrs/criticalthinkingreviewfinal.pdf.
Lawson, T.J. (1999). Assessing Psychological Critical Thinking As A Learning Outcome for Psychology Majors. Teaching of Psychology, 26(3), 207-209.
Lazarsfeld, P.F. (1949). The American Soldier – An Expository Review. Public Opinion Quarterly, 13(3), 377-404.
Lewis, A., & Smith, D. (1993). Defining higher order thinking. Theory into Practice, 32(3), 131–137.
Lipman, M. (1988). Critical thinking—What can it be? Educational Leadership, 46(1), 38–43.
Lippman, L., & Rivers, A. (2008). Assessing school engagement: A guide for out-of-school time programme practitioners. Child Trends. Retrieved from http://www.childtrends.org/files/child_trends-2008_10_29_rb_schoolengage.pdf
McGovern, T. V., Corey, L. A., Cranney, J., Dixon, Jr., W. E., Holmes, J. D., Kuebli, J. E., Ritchey, K., Smith, R. A., & Walker, S. (2010). Psychologically literate citizens. In D. Halpern (Ed.). Undergraduate education in psychology: Blueprint for the discipline’s future (pp. 9-27). Washington, D.C.: American Psychological Association.
Newman, J. H. (1852). The idea of a university. New York: Longmans Green.
NHS Direct (2013, March 26). Do TV and video games really make kids naughty? Health News via NHS Direct. Retrieved from http://www.nhs.uk/news/2013/03March/Pages/Do-TV-and-video-games-really-make-kids-naughty.aspx
Norton, L. (2004). Psychology Applied Learning Scenarios (PALS): A practical introduction to problem-based learning using vignettes for psychology lecturers. LTSN.
Parkes, A., Sweeting, H., Wight, D., & Henderson, M. (2013). Do television and electronic games predict children’s psychosocial adjustment? Longitudinal research using the UK Millennium Cohort Study. Archives of Disease in Childhood. Retrieved from http://adc.bmj.com/content/early/2013/02/21/archdischild-2011-301508.full
Paul, R. W. (1992). Critical thinking: What, why, and how? New Directions for Community Colleges, 77, 3–24, cited in Lai, E.R. (2011). Critical thinking: A literature review. Pearson Assessments. Retrieved August 2, 2013, from http://www.pearsonassessments.com/hai/images/tmrs/criticalthinkingreviewfinal.pdf
Popper, K. (1963). Conjectures and refutations: The growth of scientific knowledge. New York: Routledge.
Porter, A., Cartwright, T., & Snelgar, R. (2006). Teaching statistics and research methods to heterogeneous groups: The Westminster experience. In Proceedings of the Seventh International Conference on Teaching Statistics, Salvador, Brazil. Voorburg, The Netherlands: International Statistical Institute.
Roese, N.J., & Vohs, K.D. (2012). Hindsight bias. Perspectives on Psychological Science, 7(5), 411-426.
Russell, B. (1945). A history of Western philosophy. New York: Simon and Schuster.
Scheffler, I. (1973). Philosophical models of teaching. In Reason and teaching. London: Routledge & Kegan Paul. Originally published in Harvard Educational Review 35 (1965): 131-143.
Shepard, R. (1983). “Idealized” figures in textbooks versus psychology as an empirical science. American Psychologist, 38, 855.
Shulman, L.S. (2002). Making Differences: A table of learning. Change: The Magazine of Higher Learning, 34(6), 36-44.
Siegel, H. (1989). The Rationality of Science, Critical Thinking and Science Education. Synthese, 80(1), 9-41.
Stanovich, K.E. (2009, Nov/Dec). The thinking that IQ tests miss. Scientific American Mind, 20(6), 34-39.
Stanovich, K. E., West, R. F., & Toplak, M. E. (2013). Myside bias, rational thinking, and intelligence. Current Directions in Psychological Science, 22, 259-264.
Stark, E. (2012). Enhancing and Assessing Critical Thinking in a Psychological Research Methods Course. Teaching of Psychology, 39(2), 107-112.
Sternberg, R.J. (1986). Critical thinking: Its nature, measurement, and improvement. National Institute of Education. Retrieved August 2, 2013, from http://files.eric.ed.gov/fulltext/ED272882.pdf
Sternberg, R.J. (1999). Teaching psychology students to be savvy consumers and producers of research questions. Teaching of Psychology, 26(3), 211-213.
The British Psychological Society: Psychology Education Board (2013). Briefing paper: The future of A-level psychology. BPS. Retrieved August 1, 2013, from http://www.bps.org.uk/system/files/Public%20files/inf209_a_level_web_final.pdf
Vygotsky, L. (1978). Mind in society: The development of higher psychological processes. Cambridge: Harvard University Press.
Willingham, D.T. (2007). Critical thinking: Why is it so hard to teach? American Educator, 8–19.
Yang, Y.C., Newby, T.J., & Bill, R.L. (2005). Using Socratic Questioning to Promote Critical Thinking Skills Through Asynchronous Discussion Forums in Distance Learning Environments. American Journal of Distance Education, 19(3), 163-181.
Yanchar, S.C., Slife, B.D., & Warne, R. (2009). Advancing disciplinary practice through critical thinking: A rejoinder to Bensley. Review of General Psychology, 13(3), 278-280.