Supervisory Skills

You have just taken over as the head of Toyota's Quality Control Department. You are a Vice President (VP) and report directly to the CEO. You have seven directors who report to you. Each director manages one of the following divisions:

1. Drive Systems (including brakes, axles, and wheels)
2. Electrical Systems
3. Safety Systems
4. Body and Chassis (including suspension)
5. Materials (quality of all materials that go into making a car or truck)
6. Manufacturing Equipment (all systems that go into making a car, i.e., the factory line)
7. Facilities (all the supporting facilities, paperwork, etc.)

Each director has a group of inspectors and managers for their division.

The previous VP has been let go for cause. The department's morale is very low, and there is a great deal of pressure on each division to perform.

You are to develop a paper to present to the CEO that outlines how you are going to turn around the department (your department; hint: you are NOT in charge of production). You should include and address the following areas:

1. Short-term (1 year) and long-term (2-10 year) goals. Make sure they are SMART (specific, measurable, achievable, relevant, and time-bound).
2. Any reorganization that you plan to do within your department.
3. Any assistance you will need from other departments.
4. Any training you will do with your department, and with the company as a whole. (This ties into number three, since you will need to work with other departments.)
5. An analysis of what challenges you will face. Think of this as a way of analyzing your potential pitfalls as your group undergoes change.
6. What leadership traits do you want your seven directors to have? Pick at least three. Why did you pick these traits? How will you ensure that the directors have or gain the required skills?

Your paper should be 3-5 pages (not including reference and title pages).

You should use proper APA sixth edition formatting.

Your paper should contain no more than 20% direct citations or quotations. That means at least 80% should be your own analysis, thoughts, and ideas.

PSYC101 Journal Article Review Assignment  – Part 2
Article Analysis, Connection, and Reflection
Due: 11:55 p.m. ET Sunday at the end of Week 6 of the 8-week course term
You may work on this assignment ahead of its deadline but may not submit it prior to Week 6.
Points Possible: 140
OVERVIEW: For Part 1 of this two-part assignment you identified and summarized elements of a published scholarly article selected from the classroom Resources Folder. For Part 2, which is due by the end of Week 6, you will analyze, connect, and reflect on aspects of your selected article. Note that the words "succinct" and "thorough" repeat regularly in the instructions below. They serve as reminders that this is a formal assignment: sentence fragments, bulleted lists, and conversational or other types of casual language cannot be used.
In completing Part 2 you will:

– Describe a research method alternative to the one used by your selected article's author(s) to study the same phenomena;
– Evaluate the potential impact on the "real world" of your selected article's research;
– Apply three concepts from required textbook readings; and
– Identify an aspect of the article's topic focus about which you would like to know more.
In composing your work, use complete and clearly articulated sentences in one or more paragraphs, as assigned below, of a minimum of 250 words each, citing sources in the body of your writing and in a References list attached to the end of it. Proofread your work carefully, as proper spelling, grammar, and writing structure are required. All answers must be your original words or paraphrases of material in your selected article or the course textbook. No other sources may be used. Copying from published material is a violation of the University policy on academic integrity and will void all points for this assignment with no option for revision and resubmission.
**You may work on this assignment ahead of its deadline but may not submit it prior to Week 6 of the 8-week course term.
Begin your Part 2 work here.
Enter your name and student ID here: ___________________
Then respond to the following succinctly and thoroughly in the spaces below.
NOTE: You must complete your work in this document, then save it and attach it to the assignment tab. While you may want to paste content into the assignment page Student Comments space as a back-up, such content cannot be accepted as a substitute for an on-time attachment submission and cannot be graded. Inserting your answers here will change the number of document pages and the location of particular items at the top, middle or bottom of pages.

While you may not remove or reorder items, change font size, or alter other elements of this document, and need to place your responses directly under each item, an increase in page numbers or an item moving from the top, middle or bottom of a page as a result of your entering your responses is to be expected and is not of concern.

====================================================================================

ANALYSIS – 30 points possible (15 points each)
The authors of the article you have selected for this assignment used a particular research design, group of participants, and set of study methods to investigate a research question (sometimes referred to in scholarly articles as the “hypothesis tested”).  As you know from early assigned readings in the course textbook, research questions can be investigated using a variety of methods.
1. Write a succinct and thorough paragraph in the space below either justifying or challenging the use of your selected article's research methods to examine its research question. NOTE: You must base your writing on the course textbook information about which research methods are best suited for which types of studies, not on personal opinion or preference, and choose just one position here. A paragraph that both justifies and challenges the use of an article's research methods, or that states why one position cannot be chosen, cannot be assigned points. Include in the body of your writing appropriately placed and formatted source citations for both the article and the course textbook.

2. Write a succinct and thorough paragraph in the space below describing a research method other than the one noted in your selected article that the article author(s) could have used to conduct the same study, and explain why it would be suitable as an alternative method. NOTE: The alternative research method you select must be suited for the article's study, so you will want to review the assigned course readings on the various types of methods. Include in the body of your writing appropriately placed and formatted source citations for both the article and the course textbook.

CONNECTION – 50 points possible (25 points each)
1. In the space below, write a succinct and thorough paragraph (250 words minimum) describing three concepts, theories or principles from the course textbook that can be related to the focus of your selected article. NOTE: Research methods and statistical analyses have already been addressed earlier in this assignment and cannot be used as textbook concepts here. Include in the body of your writing appropriately placed and formatted source citations for both the article and the course textbook.

2. One of the most interesting aspects of the field of psychology is the application of its concepts, theories, and principles to everyday life. In the space below, write a succinct and thorough paragraph (250 words minimum) describing at least three different ways that the research conducted by your selected article's author(s) can impact the "real world". Include in the body of your writing appropriately placed and formatted source citations for the article and the course textbook (if you use the latter in the construction of this paragraph).

REFLECTION – 50 points possible (25 points each)

1. Although the author(s) of your selected article addressed many aspects of its focus of study, it is inevitable that some components that might also be interesting were not discussed. In the space below, write a succinct and thorough paragraph (250 words minimum) explaining one aspect of your selected article's focus that the researchers did not mention that you would like to know more about. Include in the body of your writing appropriately placed and formatted source citations for the article and the course textbook (if you use the latter in the construction of this paragraph).

2. Combined, the Week 3 and Week 6 portions of this assignment provided several opportunities: first, to identify the key components of a published scholarly journal article and to demonstrate knowledge of the research methods used in studying psychological phenomena (Week 3); and second, to develop and hone article summary and analysis skills (Week 6). With the second opportunity in mind, in the space below, write a succinct and thorough paragraph (250 words minimum) describing three aspects of summarizing and reviewing a published scholarly journal article that you now understand but did not know about before starting the assignment. NOTE: You are reflecting on your learning experience here, not summarizing or evaluating the selected article. This is the only part of the assignment to be written in a personal reflection style. It is expected that you will have few or perhaps no source citations here, using them only for portions of your reflective work that are directly based on either your selected article or the course textbook.

=================================================================================

SPELLING, GRAMMAR, AND CITATIONS – 10 points possible
(Nothing to type here. This is an alert to go back and proofread your writing and make any needed corrections before submitting this assignment, to avoid the loss of these 10 important points. Tip: Look for basic grammar errors [e.g., using "their" when you are talking about one person], misspellings and typos, correctly spelled but incorrectly used words that SpellCheck won't catch, sentence fragments that don't state complete thoughts, run-on sentences that should be split into smaller ones, or sentences that read awkwardly or don't make sense when you read them aloud, etc.)

==================================================================================

After you have completed your work, save and attach this document, with your name as part of the document file name, to the Journal Article Review Assignment Part 2 Assignment page.

South African Journal of Psychology, 40(3), 2010, pp. 272-281

Effect of a course in research methods on scientific thinking among psychology students
Ashraf Kagee, Stellenbosch University, South Africa
Saalih Allie, University of Cape Town
Anthea Lesch, Stellenbosch University

This study followed a quasi-experimental design to determine the effect of a course in research methods on undergraduate students' ability to reason scientifically. Two classes of students in their first and second year of study were asked to participate in the study. The second year class (n = 171) was taught a course in research methods, while the first year class (n = 201) was taught a course in developmental psychology. An instrument consisting of a series of vignettes was administered to all students at the beginning and at the end of the quarter in which these courses were taught. Total scores on the instrument were used to determine the extent of scientific thinking. Analysis of variance showed a non-significant difference between the groups at pretest and a significant difference (p < 0.05) at posttest. These results were interpreted to mean that the research methods course was responsible for increasing students' level of scientific thinking.

Keywords: causality; psychology students; research methods; scientific thinking

The broad purposes of teaching any science-based discipline comprise two distinct aspects that need to be addressed. On the one hand there is a body of broadly accepted knowledge, while on the other hand there are the processes that lead to the creation and legitimatization of knowledge within the discipline. A perusal of popular first year textbooks ranging from physics (Halliday, Resnick, & Walker, 2007) to psychology (Swartz, De la Rey, & Duncan, 2004) shows that the emphasis is on the first aspect, namely content knowledge, while the processes of science are usually only mentioned briefly. One of the effects of this approach, coupled with traditional assessment practices, is that scientific knowledge appears authoritative and unchallengeable. To this extent, following completion of a first year course, students may embrace epistemologies that are weaker than those they accepted prior to the course (Redish, Saul, & Steinberg, 1998). Following such curricula can have the adverse effect of weakening student epistemology, as has been shown, for example, in studies on first year physics students using the Maryland Physics Expectations (MPEX) instrument, which was administered before and after their first year physics courses (Redish, Saul, & Steinberg, 1998). Given the need for students to become familiar with a discipline, the amount of material covered in most undergraduate courses is usually considerable. However, without an understanding and appreciation of the nature of scientific processes and the way scientific knowledge is constructed, there is little on the surface to distinguish between accepted discipline-based knowledge and knowledge claims based on folklore or whimsy. Graduates who claim some level of certification in a science-related discipline but who have not developed the tools to enter these debates are more likely to undermine the scientific enterprise than to promote scientific thinking amongst society at large.


Scientific thinking in psychology
The nature of psychology as a discipline is such that it is more difficult to separate out the role of personal experience from disciplinary content than in, say, physics. It is, therefore, critical that part of the training of psychology students explicitly addresses aspects of scientific thinking, with the aim that they are able to critique knowledge claims and arguments about human behaviour from a scientific perspective. Legitimate forms of reasoning within the scientific paradigm are often loosely referred to as scientific thinking. Examples of cognitive processes involved in scientific thinking include induction, deduction, analogy, problem-solving and causal reasoning (Dunbar & Fugelsang, 2005). The issue of scientific thinking has not received widespread attention in the literature on teaching in psychology, though the related notions of promoting critical thinking and stimulating the use of higher order cognitive processes in students have received a fair amount of attention. Current definitions of critical thinking describe it as a higher order thinking process in which individuals apply information to analyse, make inferences about and evaluate knowledge claims, and recognize and solve problems (Angelo, 1995; Beyer, 1985; Lewis & Smith, 2001). This body of research highlights the crucial importance of developing these cognitive processes in students, whilst simultaneously acknowledging the difficulties in achieving this goal through teaching. According to Lewis and Smith (2001) all disciplines need both lower and higher order thinking in order to generate knowledge. These authors point out that psychologists tend to see higher order thinking as problem-solving due to the discipline's roots in experimentation and research. In psychology, therefore, developing students' ability to apply higher order thinking skills is viewed as challenging them to interpret, analyse and manipulate information, as distinct from lower order thinking, which involves the routine, mechanical application of previously acquired information (Newman, 1990). The development of higher order thinking skills is often perceived as being achieved via courses in research methods within the psychology curriculum. A typical research methods course in psychology tends to be centred around the technical aspects of scientific investigation, such as research design, experimentation, and quasi-experimentation. Thus, there is a tacit assumption that engaging successfully with these technical aspects will also lead to a deeper understanding of some of the key themes associated with scientific reasoning. However, it is not easy to test routinely for thinking skills. The degree to which the course may have had an impact on scientific thinking is usually inferred from the results of formal testing that emphasizes the technical issues of design rather than deeper conceptual understanding. Hence, it would be assumed (to some extent at least) that a student who is deemed successful, based on the results of traditional formal assessment, would possess a sound understanding of the way in which scientific knowledge is constructed and the tentative nature of scientific tenets. Such an expectation is reasonable, as a research methods course highlights many ideas that are associated with scientific thinking.
Concepts typically emphasized in research methods courses include the role of empirical evidence, the nature of conclusions based on probabilistic thinking, the uses of falsification (Popper, 1963) and the differences in weight attached to different types of confirmatory evidence (Stanovich, 2004). In this study, we aimed to facilitate among students an awareness and understanding of the possible errors in reasoning that may occur when drawing conclusions based on observation. These errors include the following: failing to seek alternative explanations or causes for observed phenomena, focusing disproportionately on information that appears most vivid to the exclusion of other less salient data, using the notion of "post hoc ergo propter hoc", assuming that correlation equals causation, relying on testimonial and anecdotal evidence, regarding intuition as a valid source of evidence, confirmation bias, placing the burden of proof on the skeptic rather than the claimant, and failing to detect confounds in attributing causality. These elements, described in detail in Table 1, were used as markers of scientific thinking in this study. We report on an investigation into the effect of a course in research methods on various facets of scientific thinking among psychology students.


METHOD

Participants
First and second year psychology students at a large residential South African university were invited to participate in the study. They were informed about the study in class by a researcher and asked to respond to a questionnaire that assessed the aspects of scientific thinking discussed above. Both groups of students (first and second year) were asked to complete the pretest and the posttest in class, administered before and after the research methods course, respectively. As an expression of appreciation for their participation they were given a R30 lunch voucher on completion of the posttest. The study was approved by the university ethics committee. Students were informed that they could choose not to participate if they did not wish to do so, without the decision affecting their course results.

Procedures
The study followed a quasi-experimental two group pretest-posttest research design. Second year psychology students constituted the experimental group, as they completed the research methods course. The first year students, who took a course in developmental psychology but not in research methods, were the comparison group. Both groups were assessed at the beginning and at the end of the final quarter of 2010, which was when the research methods course was presented. The actual content learning of each course was assessed by a combination of a class assignment (25%), a class test (25%) and a final examination (50%).

Description of the intervention
The research methods course was delivered in a large auditorium attended by about 250 students. The lecturer made liberal use of teaching aids such as PowerPoint slides, a chalkboard and an overhead projector. Modes of delivery included traditional lecturing and extensive use of lecturer-student interaction. In-class discussion included eliciting students' insights into problems of constructing new knowledge through research. The syllabus included a discussion of the epistemological assumptions of science, components of scientific theory, hypothesis testing, sampling theory, various research designs, internal and external validity including threats to validity, and psychometric theory. In terms of scientific reasoning, students' attention was called to the pitfalls associated with making causal attributions and the stringent criteria that need to be met when making claims involving causal reasoning. Table 1 summarizes the key concepts that were explicitly addressed in the course. The developmental psychology course addressed traditional concepts in developmental psychology and did not specifically include aspects of research design.

Instrument
We developed an 11 item instrument that required participants to read a vignette and respond to specific questions that reflected the aspects of scientific thinking of interest, as articulated in Table 1. The response options were binary, in the direction of either a scientific or non-scientific response; responses in the scientific direction were coded 2 and responses in the non-scientific direction were coded 1. Item codes were summed, giving a minimum possible score of 11 and a maximum possible score of 22. Appendix 1 contains one of the 11 vignettes which were presented as part of the assessment instrument. In this particular example the idea was to test whether or not students were aware of the possibility of alternative explanations. Thus the sceptical stance of Walker is more compatible with accepted scientific reasoning than the conclusion of Rider. We constructed each item, including the vignette, on the basis of the epistemological principles covered in the course, for example, the hypothetical counterfactual condition, temporal order between causes and effects, and the use of correlational data in making causal claims.


Thus the scale was not used as a psychometric instrument possessing internal consistency, but as a summation of individual items.

Table 1. Key underlying concepts used as markers of scientific thinking

Hypothetical counterfactual: When presented with an apparent cause and effect relationship, scientific thinking requires that one imagine whether the effect would also be observed in the absence of the apparent cause. This envisaged hypothetical counterfactual set of circumstances permits the conclusion that the stated or apparent cause is indeed the cause of the observed effect. In experimental terms such a set of circumstances is given expression in the form of a control or comparison group. In the absence of a control or comparison condition it is often considered scientifically incorrect to make a causal attribution (Campbell & Stanley, 1963), even though change in the apparent cause is observed in tandem with change in the apparent effect.

The availability heuristic: The availability heuristic is a rule of thumb or cognitive shortcut where one bases a prediction of an outcome on the vividness and emotional impact of an event rather than on actual probability (Ruscio, 2000). People generally make a judgment based on what they remember, rather than complete data. The availability heuristic is particularly used for judging the frequency or likelihood of events. Thus people often remember information about a few cases and assume that this is representative of a population.

Reversed burden of proof: The burden of proof in science rests on the person who makes the scientific claim, not on the sceptic or critic (Shermer, 1997). It is therefore inappropriate to expect that the sceptic should demonstrate that a claim is false (e.g. the effectiveness of a new technique). Instead, the proponent of the claim must show that the claim is likely to be true. Thus, if the evidence in favour of the effectiveness of a certain psychological procedure is not forthcoming, a reasonable response is one of skepticism rather than a retort that there is no evidence against the procedure and therefore the procedure is valid. The assumption that a claim is likely to be correct because there is no compelling evidence against it has been termed the ad ignorantium fallacy (Walton, 1998).

Reliance on intuition: There is an assumption among many psychology students that clinical training will develop in them an intuitive sense about the clients with whom they work. Psychologists are typically called upon to make assessments, diagnoses, and predictions about events pertaining to clients such as future violence, recidivism, hospitalisation, diagnosis, prognosis, and suicide attempts. However, in most studies comparing actuarial and clinical methods of prediction, actuarial methods significantly outperformed clinical methods (Aegisdottir et al., 2006). Invariably, objective data such as test results yield outcomes that are superior to what is known as "clinical intuition" (Meehl, 1954). Yet, many psychologists in training are schooled into believing that their task is to develop a special sense of their patients that may be obtained by verbal interaction. A more detailed discussion of the clinical and actuarial methods of prediction is available elsewhere (Kagee, 2006).

Post hoc ergo propter hoc: Post hoc ergo propter hoc is a Latin term that translates as "after this, therefore because of this". This proposition is based on the mistaken notion that simply because one event happens after another, the first event was a cause of the second event (Pinto, 1995). While there are many sequences of events that may be both temporally and causally related, temporality is only one of several conditions that must hold for two events to be causally connected. In and of itself, temporality does not equal causation. A simple example of this is engaging in a ritual, such as hand-clapping or finger snapping before an examination, believing that this behaviour will cause high performance in the examination.
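To make the summation scoring described in the Instrument section concrete, the following is a minimal sketch in Python. The responses shown are invented for illustration only; they are not the authors' materials or data.

```python
# Minimal sketch of the summation scoring described in the Instrument section.
# The responses below are hypothetical; they are not the study's data.

def total_score(item_codes):
    """Sum 11 binary item codes (1 = non-scientific, 2 = scientific response).

    The possible total therefore ranges from 11 (all non-scientific)
    to 22 (all scientific).
    """
    if len(item_codes) != 11:
        raise ValueError("Expected codes for all 11 vignette items")
    if any(code not in (1, 2) for code in item_codes):
        raise ValueError("Each item must be coded 1 or 2")
    return sum(item_codes)

# Example: a respondent who answered 7 of the 11 items in the scientific direction.
example_responses = [2, 2, 1, 2, 1, 2, 2, 1, 2, 1, 2]
print(total_score(example_responses))  # 18
```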


Table 1. Continued

Overreliance on testimony: Without dispute, anecdotes are good educational devices. However, they are not generally useful as a basis for generalisation or as evidence, as they are typically not representative of the population of cases from which the anecdotes are drawn (Casscells, Schoenberger, & Graboys, 1978). For any empirically demonstrated relationship there may be outliers that run foul of the apparent relationship. This does not mean that the relationship is invalid but merely that exceptions to the rule do occur. For example, in general, men are taller than women. However, in some instances women may be taller than men. Similarly, there is an undisputed relationship between smoking and lung cancer. Yet, everyone knows an elderly person who has smoked heavily for decades, but whose health is excellent. An individual case does not invalidate the empirically demonstrated relationship between smoking and cancer. When discussing empirically demonstrated relationships between variables, students may sometimes cite individual cases that go contrary to the data, mistakenly believing that individual cases invalidate such relationships.

Correlation equals causation: If two variables are shown to correlate with each other, a naive explanation would be to say that one causes the other. Thus if the correlation coefficient between shoe size and reading ability among children is found to be high, it would be erroneous to conclude that having large feet causes children to read better. An alternative explanation is that reading well causes feet to grow, but a more likely explanation is that age, associated with cognitive development, results in physical growth as well as an increase in reading ability. Age is therefore a third variable in the equation that is the causal agent. The conclusion therefore is that correlation by itself does not equal causation and is only one of several conditions to be satisfied for a causal relationship to be determined (Kerlinger & Lee, 2000). (A brief simulation sketch after this table illustrates the third-variable point.)

Confirmation bias: Confirmation bias is selective thinking whereby one tends to notice and look for events that confirm pre-existing beliefs, and to ignore or undervalue the relevance of those that contradict those beliefs (Stanovich, 2004). Confirmation bias occurs when a hypothesis is generated and evidence is sought that supports its tenability, to the exclusion of evidence that refutes it. Confirmation bias is thus an error of inference toward confirmation of the hypothesis that is being tested. Thus a determined advocate of a belief can find at least some supportive evidence for virtually any claim (Lilienfeld, Lynn, & Lohr, 2004).

Hindsight bias: Hindsight bias is the tendency to state after an event occurred that the event was predictable (Hawkins & Hastie, 1990). It is an inclination to see past events as being predictable and reasonable to expect after the fact, rather than before they have occurred.
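The third-variable explanation in the "Correlation equals causation" entry above can be illustrated with a short simulation. The numbers and the simple age-growth relationships below are invented for illustration only and are not drawn from any study.

```python
# Tiny simulation (invented numbers) of the third-variable point above: age drives
# both foot size and reading ability, producing a correlation between the two
# even though neither causes the other.
import numpy as np

rng = np.random.default_rng(42)
age = rng.uniform(6, 12, size=500)                     # children aged 6-12 (hypothetical)
shoe_size = 0.8 * age + rng.normal(0, 0.5, size=500)   # grows with age
reading = 5.0 * age + rng.normal(0, 3.0, size=500)     # improves with age

raw_corr = np.corrcoef(shoe_size, reading)[0, 1]
print(f"Correlation between shoe size and reading ability: {raw_corr:.2f}")

# Partial out age: correlate the residuals after regressing each variable on age.
def residuals(y, x):
    slope, intercept = np.polyfit(x, y, 1)
    return y - (slope * x + intercept)

partial = np.corrcoef(residuals(shoe_size, age), residuals(reading, age))[0, 1]
print(f"Same correlation after controlling for age: {partial:.2f}")  # near zero
```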

RESULTS

Description of the sample
A total of 201 first year and 171 second year students agreed to participate in the study at the beginning of the fourth quarter of the 2008 academic year. Only those students who had completed the pretest were allowed to complete the posttest. Thus at the second assessment point, close to the end of the course, 78 first year and 118 second year students had been retained in the study. The decrease between pretest and posttest may be attributed to the fact that many students who completed the questionnaire at pretest did not attend class when the posttest questionnaire was administered. The final sample consisted of 24 (12.2%) males and 172 (87.8%) females. The faculty breakdown of the students was as follows: Arts and social sciences: 106 (54.4%); Science: 22 (11.3%); Health science: 48 (24.6%); Theology: 1 (0.5%); Other: 18 (9.2%). The mean age of the sample was 20 (SD = 2).

Study results
The mean score for both the first and second year groups at pretest was 17.3, indicating that the groups were equivalent at pretest in terms of the variable of interest, that is, the metric of scientific thinking. The mean scores for the first and second year groups at posttest were 16.7 and 18.2, respectively. An analysis of variance, as presented in Table 2, indicated a significant difference between the two groups at posttest. Follow up t-tests revealed significant overall differences between the pretest and posttest scores for each group. We also conducted significance testing by means of a series of z tests for differences between proportions to determine whether, at posttest, second year students endorsed individual items in the scientifically compatible direction at a greater rate compared with first year students. Figure 1 presents these results. As can be seen, of 11 comparisons 7 were significant (items 1, 2, 5, 7, 8, 10, and 11). These items measured awareness of alternative explanations for phenomena (Item 1); the absence of a counterfactual condition (Item 2); the concept of post hoc ergo propter hoc (Item 5); the limited utility of testimonial and anecdotal evidence in making causal attributions (Item 7); the difference between intuition and empirical evidence in decision-making (Item 10); limitations imposed by sampling bias (Item 8); and the role of confounding variables in limiting causal claims (Item 11). The results indicate that on these questions a greater proportion of second year students endorsed the scientifically compatible response.
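For readers who want to see the shape of the analyses reported above, the following is a minimal sketch of a one-way ANOVA on posttest totals and a z-test for a difference between two proportions. It is run on made-up scores rather than the study's data and assumes SciPy and statsmodels are available; the group sizes echo the retained samples (78 and 118), but the values themselves are random.

```python
# Minimal sketch of the two analyses described above, run on made-up numbers
# (NOT the study's data): a one-way ANOVA comparing posttest totals between
# the two groups, and a z-test for a difference between two proportions.
import numpy as np
from scipy.stats import f_oneway
from statsmodels.stats.proportion import proportions_ztest

rng = np.random.default_rng(0)

# Hypothetical posttest totals (range 11-22) for the comparison and course groups.
first_year = rng.integers(11, 23, size=78)    # comparison group (developmental psychology)
second_year = rng.integers(13, 23, size=118)  # research methods group

f_stat, p_value = f_oneway(first_year, second_year)
print(f"ANOVA: F = {f_stat:.2f}, p = {p_value:.3f}")

# Hypothetical counts of "scientifically compatible" endorsements on one item.
endorsed = np.array([40, 90])   # first year, second year
n = np.array([78, 118])
z_stat, z_p = proportions_ztest(endorsed, n)
print(f"Two-proportion z-test: z = {z_stat:.2f}, p = {z_p:.3f}")
```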

Figure 1. Percentages of students who selected the "scientifically compatible" options at posttest

DISCUSSION
With regard to the group who took the research methods course, there is evidence that there was an increase in the rate of students' overall endorsement of scientifically compatible responses at posttest compared to pretest. The inclusion of a comparison group that was not exposed to the course and whose scores did not increase permits such a conclusion. Also, on the majority of items, the students who attended the course endorsed scientifically compatible options more frequently than the first year developmental psychology students. These two observations, taken together, may be interpreted to mean that the course was responsible for increasing scientific reasoning ability among students. It therefore appears that the process of teaching students the technical aspects of research methods in psychology may help in shifting their underlying epistemological reasoning, resulting in an increased ability to reason scientifically.

Table 2. Results of Analysis of Variance
T1  Between groups: SS = 2.89,   df = 1,   MS = 2.89,   F = 0.91,  sig. = .34
    Within groups:  SS = 529.81, df = 167, MS = 3.17
    Total:          SS = 532.70, df = 168
T2  Between groups: SS = 108.87, df = 1,   MS = 108.87, F = 27.99, sig. = .00
    Within groups:  SS = 688.53, df = 177, MS = 3.89
    Total:          SS = 797.40, df = 178

With regard to the control group, who followed a traditional first year teaching sequence, it was interesting to observe a significant reduction of scores. This effect is similar to that found in traditional content-based first year physics courses, for example, where a general deterioration of expectations, attitudes and epistemology has been measured over the period of instruction (Redish, Saul & Steinberg, 1998). This study brings into focus the question of mainstreaming the teaching of scientific reasoning skills within the general psychology curriculum. Our data suggest that such mainstreaming is entirely possible within the context of a research methods course. We argue that integrating the teaching of scientific reasoning skills within other courses such as abnormal, developmental, and social psychology in the undergraduate curriculum warrants consideration. Courses of this nature may provide the contextual space within which scientific reasoning skills may be facilitated while students are simultaneously exposed to their content. While this idea lies outside of the scope of the present study, we raise it in the hope that it may be considered in the future. However, the results from the control group are somewhat concerning and have serious implications for the way in which science is taught at first year level. The epistemological beliefs of students, including their views about the nature and construction of knowledge, deeply affect how students approach learning (Hammer, 1994; Elby, 1999). Thus, in the context of teaching physics, studies have been carried out in which epistemological issues have been explicitly introduced into the curriculum. Elby (1999) has reported that the inclusion of such a strand has assisted substantially in changing students' beliefs about the nature of knowledge.

Implications for future research
The present study was based on a pilot intervention aimed at increasing scientific reasoning ability among students. Such a process requires further refinement and evaluation so as to create visible and more clearly articulated links between the technical aspects of research design and the underlying scientific reasoning on which they rest. The specific didactic methods to accomplish this require development, refinement, and evaluation. The data did not permit us to examine the specific cognitive processes that informed the students' responses to the vignettes that formed part of the assessment instrument. The next step in this line of research is to explore qualitatively the reasoning processes that led students to endorse the items they did. Such processes may best be uncovered by asking students to explicate their thinking processes when responding to questions that assess scientific reasoning and then analysing these reflections.


The instrument used in this study requires further refinement and validation. It is not known whether the individual items cohere sufficiently with one another so as to permit the underlying construct of scientific reasoning to be identified. Reliability analyses are required to determine the internal consistency of the scale as a whole, and item analysis is required to determine the performance of individual items. Such data will inform further refinement of the instrument so as to yield an optimal assessment of the construct of scientific reasoning.

Concluding remarks
While at one level the technical aspects of research methods form an integral component of psychology curricula in most psychology departments around the world, it is at the deeper level of scientific reasoning that an impact can be made on how people conceptualize the world they inhabit. Technical knowledge is seldom retained if not used frequently, but changes brought on by understanding aspects of scientific reasoning may persist longer and thus influence the way psychology graduates appraise knowledge claims. However, there is evidence that leaving such endeavours for research methods courses taught after the first year of study is not optimal, given our findings that content driven courses may have a negative effect on the scientific mindset of students. This is in keeping with the finding of Schommer (1990), for example, that "epistemological beliefs appear to affect the critical interpretation of knowledge" (p. 501); that is, it was a question not of students' being able to recall prominent information in the passages but rather of what they concluded from the information. When one encounters content material that is tentative, strong beliefs in the certainty of knowledge lead to the distortion of information in order to be consistent with this belief. The recognition of this problem in the context of physics has brought about attempts to incorporate epistemological themes and the explicit development of scientific abilities into first year teaching (Etkina et al., 2006; Etkina et al., 2008). In the context of the prominence of popular and folk psychology in many societies around the world, the results of this study are potentially important. Popular psychology is evident in the form of television programmes that show distressed persons receiving psychological help, various self-help books, and long-cherished folk wisdom about human nature. In many instances these sources of knowledge have questionable scientific bases, even though they appeal to the popular imagination and appear to make intuitive sense. However, intuition and even untrained observation can yield inaccurate conclusions (Myers, 2002). For psychology to be able to make superior knowledge claims about human nature, as opposed to lay, popular, and folk understandings, it is essential that psychology curricula incorporate explicit strands that address the way in which scientific knowledge about human behaviour is constructed.

ACKNOWLEDGEMENTS
We thank the following people for their insightful comments while we were undertaking the study and during the drafting of the paper: Dedra Demaree, Eugenia Etkina, Dylan Fincham and Brenda Liebowitz. We also acknowledge the Fund for Innovation and Research into Learning and Teaching (FIRLT), Stellenbosch University, and the National Research Foundation, South Africa, for financial support for this project.
REFERENCES

Ægisdóttir, S., White, M.J., Spengler, P.M., Maugherman, A.S., Anderson, L.A., Cook, R.S., Nichols, C.N., Lampropoulos, G.K., Cohen, G., & Rush, J. (2006). The Meta-Analysis of Clinical Judgment Project: Fifty-six years of accumulated research on clinical versus statistical prediction. The Counseling Psychologist, 34, 341-382.
Angelo, T.A. (1995). Classroom assessment for critical thinking. Teaching of Psychology, 22, 6-7.
Beyer, B.K. (1985). Critical thinking: What is it? Social Education, 49, 270-276.
Campbell, D.T., & Stanley, J.C. (1963). Experimental and quasi-experimental designs for research. Chicago: Rand McNally.
Casscells, W., Schoenberger, A., & Graboys, T. (1978). Interpretation by physicians of clinical laboratory results. New England Journal of Medicine, 299, 999-1001.
Dunbar, K., & Fugelsang, J. (2005). Scientific thinking and reasoning. In K. Holyoak & R. Morrison (Eds.), Cambridge handbook of thinking and reasoning (pp. 705-725). New York, NY: Cambridge University Press.
Elby, A. (1999). Helping students learn how to learn. American Journal of Physics (Physics Education Research Supplement), 69, S54-S64.
Etkina, E., Van Heuvelen, A., White-Brahmia, S., Brookes, D., Gentile, M., Rosengrant, D., & Warren, A. (2006). Scientific abilities and their assessment. Physical Review Special Topics - Physics Education Research, 2, 020103-1 – 020103-15.
Etkina, E., Karelina, A., & Ruibal-Villasenor, M. (2008). How long does it take? A study of student acquisition of scientific abilities. Physical Review Special Topics - Physics Education Research, 4, 020108-1 – 020108-15.
Halliday, D., Resnick, R., & Walker, J. (2007). Fundamentals of physics extended (8th ed.). New York: Wiley.
Hammer, D. (1994). Epistemological beliefs in introductory physics. Cognition and Instruction, 12, 151-183.
Hawkins, S.A., & Hastie, R. (1990). Hindsight: Biased judgments of past events after the outcomes are known. Psychological Bulletin, 107, 311-327.
Kagee, A. (2006). Where is the evidence in South African clinical psychology? South African Journal of Psychology, 36, 233-248.
Kerlinger, F., & Lee, H. (2000). Foundations of behavioral research. Orlando, FL: Harcourt College Publishers.
Lewis, A., & Smith, D. (2001). Defining higher order thinking. Theory Into Practice, 32, 131-137.
Lilienfeld, S.O., Lynn, S.J., & Lohr, J.M. (2004). Science and pseudoscience in clinical psychology: Initial thoughts, reflections, and considerations. In S.O. Lilienfeld, S.J. Lynn, & J.M. Lohr (Eds.), Science and pseudoscience in clinical psychology (pp. 1-16). New York: Guilford.
Meehl, P.E. (1954). Clinical versus statistical prediction: A theoretical analysis and a review of the evidence. Minneapolis: University of Minnesota Press.
Myers, D.G. (2002). Intuition: Its powers and perils. New Haven, CT: Yale University Press.
Newman, F.M. (1990). Higher order thinking in teaching social studies: A rationale for the assessment of classroom thoughtfulness. Journal of Curriculum Studies, 22, 41-56.
Pinto, R.C. (1995). Post hoc ergo propter hoc. In H.H. Hansen & R.C. Pinto (Eds.), Fallacies: Classical and contemporary readings. University Park, PA: Penn State Press.
Popper, K.R. (1963). Conjectures and refutations: The growth of scientific knowledge. London: Routledge & Kegan Paul.
Redish, E.F., Saul, J.M., & Steinberg, R.N. (1998). Student expectations in introductory physics. American Journal of Physics, 66, 212-224.
Ruscio, J. (2000). Risky business: Vividness, availability and the media paradox. Skeptical Inquirer, 24, 22-26.
Schommer, M. (1990). The effect of beliefs about the nature of knowledge on comprehension. Journal of Educational Psychology, 82, 498-504.
Shermer, M. (1997). Why people believe weird things: Pseudoscience, superstition and other confusions of our time. New York: Freeman Press.
Stanovich, K.E. (2004). How to think straight about psychology. Boston: Allyn and Bacon.
Swartz, L., De la Rey, C., & Duncan, N. (Eds.). (2004). Introduction to psychology. Cape Town: Oxford University Press.
Walton, D.N. (1998). A pragmatic theory of fallacy. Argumentation, 12, 115-123.


APPENDIX 1. Example of a vignette used in the assessment instrument

Professor Rider announces to a group of workers that he wants to study the effect of playing music on their productivity. After a week of playing music to them he finds that productivity has indeed increased. He then turns up the volume and after another week finds that productivity has increased further. He thus concludes that music causes an increase in productivity. His colleague, Professor Walker, however, says this is not a correct conclusion. With whom do you most strongly agree, Professor Rider or Professor Walker?

[ ] Professor Rider
[ ] Professor Walker

