Details: In this course, we will move toward the concept of the literature review, an important element of the dissertation process. As you have learned, synthesis is the process of creating a new idea by analyzing multiple disparate concepts or notions to discern the common thematic or connecting principles among them. Synthesis of research is a process learned through time and practice. In this assignment, you will continue to analyze research. Instead of approaching the research articles as individual entities, you should compare, analyze, evaluate, and synthesize the information you are reading.

General Requirements: Use the following information to ensure successful completion of the assignment:
• Review the journal articles assigned in the first three topics of this course.
• Doctoral learners are required to use APA style for their writing assignments. The APA Style Guide is located in the Student Success Center.
• Refer to the resource "Preparing Annotated Bibliographies," located in the Student Success Center, for additional guidance on completing this assignment in the appropriate style.
• Refer to the sample annotated bibliography attached to this assignment.
• You are required to submit this assignment to Turnitin. Refer to the directions in the Student Success Center.

Directions: Provide an evaluation (750-1,000 words total) of at least five journal articles from the first three topics of this course. Include the following for each article:
• The article citation, which is not included in the total word count.
• A written summary of the key concept(s) of the article. Consider the following questions: Why were the studies done? What were the populations studied? What did the researcher(s) conclude? How are the research questions or findings similar/different? What other information about these studies do you believe is unique or important to recall? Are there specific statements made by the authors that you wish to retain? How might these articles compare with others you read? What new ideas can come out of the articles you are evaluating?
psy_830_rs_sample_annotated_bibliography.docx

article_1.pdf

article_2.docx

article_3.pdf

article_4.pdf

Unformatted Attachment Preview

Annotated Bibliography Examples
Kenobi, O.W. (1977). Mos Eisley spaceport: A wretched hive of scum and villainy. Journal of
Intergalactic Spaceports, 7, 42-50. doi: 4815162342
Kenobi presents a solid argument for the wretchedness of the Mos Eisley spaceport. His
research is thorough and current, and his claims are well-supported. He examines the
denizens of the spaceport, thoroughly documenting the caliber of their occupations and
characters, setting up a firm argument for their inadequacy as galactic citizens. Based on
a thorough review of the literature, an exhaustive survey of his sample population, and an
analysis of the data using SPSS, he concludes that there is no spaceport more wretched
than Mos Eisley. Based on other literature in the field, and the ample support provided by
Kenobi in this article, the conclusions drawn here seem valid. Kenobi is a prolific
researcher in this field, with 85 publications in peer-reviewed journals, and 5 texts
published with well-regarded academic publishers. This article is published in the leading
journal of spaceport research, indicating credibility for the article and an intended
audience of other spaceport experts.
Skywalker, L. (1981). Dagobah: Swamp planet or treasure trove of secret knowledge? Journal of Jedi Studies, 77, 293-309. Retrieved from http://www.journalofjedistudies.org
Keep your annotation to about 150-200 words. Note the purpose of the article, the
participants/subject of the study, the conclusions drawn by the author(s), and the validity
of the conclusions. Evaluate the article: is it a credible source? Describe the credibility of
the author – are there any biases? How well did the author support his or her assertions?
Did the author provide an adequate literature review? Were there any limitations?
Solo, H., & Organa, L. (1983). I am not a committee: Building a relationship during a galactic
civil war. Journal of Interpersonal Attraction, 4, 77-90. doi: 934.1701.007
Annotated bibliographies are helpful tools for organizing and preparing for a research
paper or project. Instead of reading articles and forgetting what you’ve read, you can
have a convenient document full of helpful information about the articles you’ve read. In
addition to helping you remember what you have read, an annotated bibliography can
help you see the bigger picture of the literature you are reading. It can help you visualize
the overall status of the topic, as well as where your unique question might fit in.
Checking in With the Scientist–Practitioner Model:
How Are We Doing?
Deborah E. Rupp
University of Illinois at Urbana-Champaign
Daniel Beal
Rice University
The science and practice of industrial-organizational psychology is being
influenced by a number of factors. For example, within academics, we see an
increase in the number of faculty members taking positions in business
schools.[1] Within practice, we see an increase in the number of mergers and
acquisitions among I-O psychology consulting firms. I-O psychology research
is becoming increasingly multilevel and multidisciplinary. I-O practitioners
face increased challenges surrounding what they can share with the profession
at large. As we face these and other changes, it becomes important not only to reflect on the implications of these trends for science and practice but also to consider whether the scientist–practitioner model, as it is currently articulated, represents the way we are or should be conducting ourselves as a profession.
To respond to this need, the SIOP Strategic Program Committee hosted a
special invited panel discussion at the 2007 annual conference charged with
the mission to “check in” with the science–practice model and discuss how
contemporary issues facing the field might affect the viability and interpretation of this model. The panel consisted of a sample of SIOP’s leading scientists, practitioners, scientist–practitioners, and practitioner–scientists: Rosemary Hays-Thomas, Leaetta Hough, Daniel Ilgen, Gary Latham, Ed
Locke, Kevin Murphy, Nancy Tippins, and Howard Weiss.
The resulting discussion, debate, and dialogue brought many important
issues to the surface. We summarize these issues below and hope that they
will serve not as a prescription for any one perspective (as we learned, all
such prescriptions are quite debatable) but rather as a catalyst for dialogue
surrounding our current and future identity as a field.
Origins of the Model
The science–practice model has its origins in clinical psychology. It was
conceived in 1949 at the Boulder conference as a model for graduate student
training (Benjamin & Baker, 2000; Hays-Thomas, 2002). According to the
model, psychologists are to be trained in a way that integrates science and
practice such that activities in one domain would inform activities in the other
domain. Graduate students are to learn about research and practice, and carry
out research and practice under the supervision of faculty and professionals
with expertise in both areas. Graduate programs are to house both research and clinical facilities, and curricula are to be structured to integrate these two activities whenever possible.[2]

[Footnote 1: As of this writing, SIOP data show that 50% of the 41% of SIOP members working for academic institutions are working in business schools.]
In I-O psychology, science–practice has been adopted more as a model for
the field than a model for graduate training (and the panel noted the great variance in I-O graduate programs with regard to practitioner training; Hays-Thomas, Hough). Although there is not an officially mandated definition of
the science–practice model, most descriptions point to a reciprocal relationship between the two: Practitioners should look to the scientific literature for
guidance on setting up effective workplace systems; scientists should take
their cues from practitioners in identifying issues relevant to employee well-being and organizational effectiveness. Although seemingly straightforward,
there are a number of issues embedded within this idea that are hotly debated.
To What Extent Should Practice Influence Science?
Although everyone on the panel agreed that science should exert a strong
influence on practice, there were varied opinions about whether and how practice
should have an influence on science in I-O psychology. On the one hand, Locke
made an argument for inductive theory building (Locke, in press) where we accumulate a large body of findings from both laboratory and field settings and then
integrate these in order to develop a theory. This is in contrast to the hypothetico-deductive method where we develop a theory, make deductions, and then test
them. A similar perspective mentioned by several panel members was that,
although research need not necessarily be governed by our observations of
applied problems, these problems often do (and should) help determine the focus
of our research interests. As an example of this perspective, Latham described
how several fruitful research efforts began by pursuing a practical problem of an
organization (Latham, 2001). After consulting the available scientific evidence
related to the problem (e.g., goal-setting research), novel solutions were made
apparent, such as when to set a learning versus a performance goal.
On the other hand, some panelists (most strongly argued by Weiss) felt that
there is a justifiable role in I-O psychology for research that is not at all guided by applied problems. Instead, research can be stimulated by the simple
desire to understand the psychology of people at work (see also Hulin, 2001).
In so doing, applications of our research will arise naturally, and it is precisely
the role of practitioners to determine and implement these applications. Weiss’s
work on affective events is an example of such an approach (Weiss & Cropanzano, 1996). Whereas we have learned a great deal through this research about
the link among employee emotion, attitudes, and behaviors, and this research
has implications for a number of organizational strategies that could serve to increase both the well-being and performance of employees, the research was motivated not by a goal of informing practice but rather by a quest to more deeply understand the emotional experiences of people while working.

[Footnote 2: It should be noted that the actual implementation of this model has been questioned within clinical psychology, and other models (e.g., the scholar–practitioner perspective, the local clinical scientist model; Belar & Perry, 1992; Korman, 1974; Stricker & Trierweiler, 1995; Wright, 1983) have been suggested. The appropriateness of this model for I-O psychology was also called into question by Murphy, Weiss, and Hays-Thomas.]
This perspective raises other important points. The first point, made by
Ilgen and others, was that if our research is of high quality, if we follow
the scientific method (which Hough suggested should be the true basis of our
graduate training, a point which received enthusiastic agreement from all
panelists), and if we study topics that are relevant, then our research will
automatically have important implications for practice. In the words of
Locke, “any theory which is true and nontrivial has potentially useful applications.” As pointed out by Ilgen, this is, by definition, the nature of I-O psychology. The fact that it is psychology implies that it is scientific and follows
the scientific method. The fact that it is industrial-organizational implies that
it has implications for practice. Thus, whether we take an inductive or deductive approach, whether we are motivated by a pursuit of psychological understanding or practical relevance, whether our careers involve research, practice, or both, “we are all scientists–practitioners” (McHenry, 2007).
Of course, this statement is only generalizable to all of us if we allow for
some flexibility in the interpretation of the scientist–practitioner model. As
both our panelists and the literature point out, even though science should inform practice, and a scientist can't conduct applied research without some understanding of work contexts, it is unreasonable to think that all SIOP
members should be both conducting research and practicing I-O psychology
(Brooks, Grauer, Thornbury, & Highhouse, 2003; Hays-Thomas, 2002; Kanfer, 2001; Murphy & Saal, 1990). Indeed, carrying out these two activities
often requires very different skills and personality traits. In addition, the
incentive systems within academic and practitioner jobs so often stifle the
researcher’s practice and the practitioner’s research. Professors (especially
those of a junior nature) are often under tremendous pressure to publish a
great deal of research in the field’s leading journals. This doesn’t leave much
time for consulting, which is often not valued by academic departments and
also frowned upon by large research institutions. This trickles down to graduate training as well, where practice components of curricula are more often
included in master’s programs than in doctoral programs (Hays-Thomas).
But the pendulum swings the other way as well. There are a number of contextual factors that not only impede practitioners from conducting research but
also make it difficult for them to carry out evidence-based practice altogether.
In our discussion, Tippins pointed out that practitioners are often forced to
work in a world where science is completely undervalued. Sometimes research
and evaluation work is only agreed to because legal departments require it. In
fact, there are often career penalties for espousing academic ideals, in that
doing research, reading journals, and going to conferences to potentially (in the
eyes of organizations) divulge intellectual property and proprietary client
information is neither furthering the goals of the employer nor a billable activity, and can be perceived as doing more harm than good. Moreover, many contracts prohibit the disclosure of a client's name, and some client arrangements impose hefty fines each time a consultant divulges a client's name. In addition, many corporate attorneys are quite concerned
about protecting their company’s interest and frown upon disclosure of sensitive data. All of these factors can make it quite difficult for practitioners to conduct research—even when they have the data and motivation to do so.
Thus, a second point about the current implementation of the science–practice model is that the statement "we are all scientist–practitioners" can only realistically hold true if by "we" we mean the field as a collective.
Given differing motives (knowledge generation, practical solutions), work
preferences (research, consulting), and contextual barriers (tenure, billable
hours), we will only truly be able to simultaneously enhance science and
practice if we communicate effectively with one another.
Model or Mindset?
The myriad of issues raised thus far led the panel to suggest that
science–practice may not be as much a model as it is a value system (Weiss),
mindset (Latham, Ilgen), or career metaphor (Ilgen, on Latham), and some
members of the panel suggested that this mindset need not be appropriate for all
members of SIOP nor be fundamental to what it means to be an I-O psychologist (Weiss, Ilgen). What was agreed upon by all panelists is that I-O psychologists need to be trained to consume, critique, and carry out science (Hough). This
is crucial for both doing good research and conducting evidence-based practice.
It is also essential that these individuals are trained (either through their graduate programs or early career experiences) to interface and communicate with
other individuals at various hierarchical levels and with various amounts of power
and influence. These skills are needed to teach. These skills are needed to persuade organizations that research, consulting, or evidence-based HR systems are
needed. These skills are needed to advocate on behalf of SIOP to inform the public about the purpose and importance of our field.
In his 2007 presidential address, Jeff McHenry argued for a three-pronged
approach to the science and practice of I-O psychology:
• Work with issues that are important
• Measure outcomes that are important (at multiple levels of analysis)
• Share knowledge effectively
There is something both parsimonious and universal about these three
goals, in that none of them are in conflict with the (divergent) views of our
panelists. That is, importance can be determined by the actor whether the actor
is a scientist purely focused on the accumulation of knowledge, whether the
actor is a scientist–practitioner inductively building theory based on organizational information, whether the actor is a practitioner focused solely on meeting client needs (but doing so in a scientifically informed way), or whether the actor is the field as a whole, committed to enhancing "human well-being and
performance in organizational and work settings by promoting the science,
practice, and teaching of industrial-organizational psychology” (SIOP, 2007).
Although this model provides us with ideals to strive for, we will continue to
face derailers. For example, Weiss and Murphy pointed out that as many
researchers migrate to business schools the nature of our science will undoubtedly shift. Whereas this shift may expose us to multidisciplinary and multilevel
research, we may be drifting further and further from core psychological
research. McHenry also questioned what implications this might have for our reputation with psychology departments, APA, APS, and other groups affiliated with
the broader field of psychology with which we have historically been connected.
We feel the answer lies in McHenry’s third recommendation: “Share
knowledge effectively.” Our panelists offered a number of ideas for carrying
this out. For example, Locke presented some creative ideas surrounding a science–practice networking Web site, where researchers can learn about issues
practitioners are observing in the field and find sites for conducting field
experiments, and practitioners can read summaries and abstracts of the current
research being published in the journals. If science–practice is indeed a field-level value system, it is only through such information sharing that we will be
able to live up to our mission as a discipline. Whether it be a model, a mindset, or a value system, science–practice is a highly relevant concept that has
shaped our history as a field. We hope that our investigation has served to
remind us of some old issues and to catalyze dialogue about several new ones.
Looking Forward
We also wanted to note here that our panel was one of many sessions at
the 2007 annual conference that discussed the current state and future of our
field. For example, Deidra Schleicher and Michelle Marks chaired a session
entitled “Is the Future of I-O Psychology at Risk?” with panelists Michael
Campion, José Cortina, Angelo DeNisi, Katherine Klein, Richard
Klimoski, Frank Landy, Kevin Murphy, and Victor Vroom. Several issues
emerged in this session that resonate with those described above. That is, panelists showed a lack of consensus regarding the state of the field, expressed
concern about the effect of the migration to business schools on our identity, and
conveyed worry that we may be losing credibility/respect within the broader
field of psychology. Also emphasized in this session was a need for more
definitive data on the nature of the challenges/issues facing I-O psychology.
In addition, Jerald Greenberg chaired a session entitled "To Prosper, Organizational Psychology Should…" with panelists Wayne Cascio, Jeffrey
Edwards, Michele Gelfand, Richard Klimoski, Joel Lefkowitz, and Lyman
Porter. This session underscored many of the issues that were discussed in the
scientist–practitioner panel, such as bridging application and scholarship (Cascio), improving education for future scientist–practitioners (Klimoski), and
changing our value system to explicitly adopt the beliefs held implicitly by scientist–practitioners (Lefkowitz). The session also covered a wide range of issues
important to our field, including the development of more rigorous process-oriented theories (Greenberg), increasing the methodological sophistication of
empirical research (Edwards), and adopting a more global perspective (Gelfand).
If one thing is clear, it is that we are certainly not at a loss for things to talk
about. Indeed, it is an exciting time to be in I-O psychology!
References
Belar, C. D., & Perry, N. W. (1992). National conference on scientist–practitioner education and training for the professional practice of psychology. American Psychologist, 47, 71–75.
Benjamin, L. T., Jr., & Baker, D. B. (Eds.). (2000). Boulder at 50. American Psychologist, 55, 233–254.
Brooks, M. E., Grauer, E., Thornbury, E. E., & Highhouse, S. (2003). Value differences between scientists and practitioners: A survey of SIOP members. The Industrial-Organizational Psychologist, 40(4), 17–23.
Greenberg, J. (2007, April). In order to prosper, organizational psychology should… Symposium presented at the 22nd Annual Conference of the Society for Industrial and Organizational Psychology, New York, NY.
Hays-Thomas, R. (2002). Perspectives on the teaching of applied psychology. In D. C. Solly & R. Hays-Thomas (Eds.), Mastering the future: Proceedings of the third national conference on master's psychology. Pensacola, FL, and Richmond, KY: CAMPP.
Hulin, C. (2001). Applied psychology and science: Differences between research and practice. Applied Psy …