Method

To illustrate that knowledge management will benefit military public affairs practitioners and the Department of Defense public affairs organization as a whole, we created a model of a knowledge-based website containing the Department of Defense Joint Course in Communication Capstone Projects from 1998 through 2002.  The model is based upon Myers and Swanborg's (1999) steps of knowledge packaging.  Recall from the literature review that the six steps of packaging are to (1) identify the knowledge, (2) segment the audience, (3) customize the content, (4) choose the appropriate format, (5) organize the content, and (6) market-test the format and content (pp. 202-203).

Survey

We designed a pretest-posttest survey for this study primarily to test our hypothesis.  In such a test, the researcher “measures the dependent variable before the treatment group is exposed to the stimulus.  After the stimulus is given, the dependent variable is measured once again in exactly the same way with the same participants” (Keyton, 2001, p. 152).  In this study, the independent variable is the knowledge management tool created by the team.  The dependent variables are the degree to which public affairs professionals will use the tool and the degree to which the tool benefits them.

Our survey technique differs from the traditional pretest-posttest format in that we will not conduct the posttest on the same sample that underwent the pretest.  Because the survey questions that identify what content participants would use on a knowledge-based website are themselves being used to build the knowledge model, administering the posttest to the same participants would raise questions of generalizability.
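Because the posttest would be administered to a different sample than the pretest, any comparison between the two would treat them as independent groups rather than repeated measures.  The sketch below illustrates such a comparison in Python; the ratings, variable names, and 1-5 scale are hypothetical examples, not data from this study.

    from scipy import stats

    # Hypothetical usefulness ratings on a 1-5 scale from the pretest
    # sample and from a separate posttest sample; real values would
    # come from the survey responses described above.
    pretest_scores = [3, 4, 2, 5, 3, 4, 3]
    posttest_scores = [4, 5, 4, 5, 3, 4, 5]

    # Independent-samples t-test, since the two samples contain
    # different participants rather than repeated measures.
    t_stat, p_value = stats.ttest_ind(pretest_scores, posttest_scores)
    print(f"t = {t_stat:.2f}, p = {p_value:.3f}")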

Survey Participants

The pretest survey was administered to 39 military public affairs professionals who are senior enough to make decisions in their respective offices.  Because of the short timeline of this study, we used a combination of techniques: primarily convenience sampling, supplemented by volunteer sampling and snowball sampling when possible.  The survey population includes officers, enlisted service members, and civilian employees from across the Department of Defense.  Twenty-two of the participants represent the Army, eight the Air Force, and nine the Marine Corps.  There is no representation from either the Navy or the Coast Guard, which is part of the Department of Transportation but practices public affairs under the same general guidelines as Department of Defense organizations.  Because of the emphasis we placed on the potential for decision-making, none of the respondents are in the pay grades E-1 to E-4.  Fourteen of the participants are enlisted members, E-5 or above; 14 are officers in the grades O-1 to O-4; three are officers O-5 or above; and eight are civilian government employees.  These individuals serve in various capacities throughout the world.  Their billets include public affairs officer, public affairs chief, media operations officer, Defense Information School instructor, broadcast manager, and broadcaster, to name just a few.  Because the posttest requires exposure to the stimulus over a long period of time, it fell outside the timeline of this study and was not conducted.

Survey Design

Since we could find no previous survey or study upon which we could build, our team designed its own, which appears in Appendix A.  The survey consists of 14 questions that provide demographic information, determine familiarity with and use of Capstone Projects, and garner opinions on what information or knowledge would be beneficial to the respondent.  All responses were self-reported.  The survey was distributed to the respondents via e-mail as a Microsoft Excel file.  Respondents typically took 5-10 minutes to answer the survey and returned it within one week.

Content Analysis

Another step in this study was to conduct a content analysis of the Capstone Projects in order to categorize them and identify key words.  Keyton (2001) states that content analysis “integrates both data collection method and analytical technique to measure the occurrence of some identifiable element in a complete text or set of messages...content analysis helps researchers make inferences by identifying specific characteristics of messages” (p. 251).

Our content analysis differs from the traditional sense of the term in that it relies on a more qualitative approach.  Traditional content analysis uses established categories whose characteristics are mutually exclusive, but those notions do not fully apply to the circumstances of this study.  Instead, each member of the team was given a portion of the 45 Capstone Projects accessible on the web and identified key words and themes.  Together the team tracked natural patterns in the data and formed categories based upon those patterns.  Because the categories are not mutually exclusive, Capstone Projects were placed into as many categories as they naturally fit.
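To make this procedure concrete, the sketch below shows one way key-word occurrences could be tallied for a project and the project assigned to every category whose key words recur.  The key-word list, the threshold, and the function names are our own illustrations and were not part of the team's actual analysis.

    import re
    from collections import Counter

    # Hypothetical candidate key words; the real list emerged from the
    # team's reading of the 45 Capstone Projects.
    KEY_WORDS = {"media", "crisis", "internet", "training", "community"}

    def keyword_counts(text):
        """Count occurrences of each candidate key word in one project."""
        words = re.findall(r"[a-z]+", text.lower())
        return Counter(w for w in words if w in KEY_WORDS)

    def assign_categories(counts, threshold=2):
        """A project joins every category whose key word recurs, so
        membership is a set rather than a single exclusive label."""
        return {word for word, n in counts.items() if n >= threshold}

    sample = "Media training and crisis media planning for the internet era."
    print(assign_categories(keyword_counts(sample)))  # -> {'media'}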

Materials

The materials for this study included Microsoft Excel, used to design the survey and capture the survey data; Microsoft FrontPage, used to design the knowledge-based website; and Microsoft Word, used to compile the key words and transfer them to the web.  A simple search engine must still be obtained and employed to allow the user to search for key words on the website.
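Although the search engine itself would be obtained off the shelf, its expected behavior can be sketched.  The example below shows key-word lookup over a simple inverted index; the index contents, project names, and function name are hypothetical.

    # Hypothetical index mapping key words to the Capstone Projects that
    # contain them; a real site would build this from the key-word lists
    # compiled in Microsoft Word.
    INDEX = {
        "media":    {"Project A", "Project C"},
        "internet": {"Project B"},
        "training": {"Project A", "Project B"},
    }

    def search(query):
        """Return the projects that match every key word in the query."""
        terms = query.lower().split()
        results = [INDEX.get(t, set()) for t in terms]
        return set.intersection(*results) if results else set()

    print(search("media training"))  # -> {'Project A'}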

Website Model 

We created a model of a knowledge-based website for the Capstone Projects that allows the user to browse the projects either by class and year or by various categories, conduct a key-word search, or discuss public affairs issues on a discussion bulletin board.  In addition, links to other public affairs information, such as public affairs plans, doctrine, and orders, have been added based upon needs identified by the pretest survey respondents.  Appendix E provides an example of the website model.
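One way to picture the model's underlying organization: each project record carries its class year and any number of category tags, so the same data supports browsing by class and year, browsing by category, and key-word search.  The records and field names below are hypothetical.

    # Hypothetical project records; the fields mirror the browse options
    # the website model offers.
    projects = [
        {"title": "Project A", "year": 1998,
         "categories": {"media relations"}},
        {"title": "Project B", "year": 2002,
         "categories": {"internal information", "media relations"}},
    ]

    def browse_by_year(year):
        return [p["title"] for p in projects if p["year"] == year]

    def browse_by_category(category):
        return [p["title"] for p in projects if category in p["categories"]]

    print(browse_by_year(2002))                   # -> ['Project B']
    print(browse_by_category("media relations"))  # -> ['Project A', 'Project B']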