Sky-high grade inflation is marking 90% of naval officers as top 10% performers—in this artificially bright galaxy, the real stars fade from promotion boards' view. In the first of a three-part series, four authors train their lenses on Navy fitness reports—and find they need tough counseling.
You’ve done it again! In a world that seldom seems to appreciate your unique genius, your commanding officer once again has ranked you in the top 1% of your grade. This gives you a wonderful feeling until you note that of the three officers in your grade, two of you were placed in the top 1% and the third was in the top 5% column. How extraordinarily fortunate your command is to have received the cream of an exceptional crop. The odds against having received such top-rated officers at random are preposterous, about 200,000:1. Of course, the chances are considerably increased by the current practice of rating nearly all officers in the top two columns. The officer is then forced to wonder: “Where do I really stand?” Few of us can honestly claim that out of a group of 100 officers the other 99 rank below us.
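For the curious, the 200,000:1 figure is consistent with simple multiplication of the column probabilities. A minimal sketch, assuming (my simplification) that each officer's true standing is an independent random draw:

```python
# Check the arithmetic behind the 200,000:1 odds quoted above.
# Assumption: each officer's true standing is an independent draw.
p_top1 = 0.01  # chance a random officer truly belongs in the top 1% column
p_top5 = 0.05  # chance he truly belongs in the top 5% column

# Probability that two particular officers land in the top 1% column
# and the third lands in the top 5% column:
p = p_top1 * p_top1 * p_top5
print(f"odds against: about {1 / p:,.0f} to 1")  # about 200,000 to 1
```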
The system was intended to keep the officer apprised of his progress and to aid selection boards in determining which officers are promotable and which have reached the limit of their potential. Both objectives are best served by frank appraisal and ranking. If a selection board aims to promote 70% of the officers in the selection zone, it is a mathematical given that a substantial portion will in fact come from officers who are below average in performance, but presumably still acceptable. The current habit of inflating grades does not alter the fact that, in this example, 30% of the officers reviewed will fail at selection. It merely makes it more difficult to identify the most qualified candidates correctly. With everyone in the top brackets, the selection boards are far more likely to overlook an outstanding candidate while favoring a mediocre candidate with equally outstanding marks and a more eloquent reporting senior writing the verbal portion. One must pity the rare candidate who has the truth written about him, for even the best of us could not survive a frank appraisal of our performance in a system in which a good officer receives nearly perfect marks and is usually described in glowing terms.
This tendency to inflate performance evaluations is self-perpetuating. A commanding officer now knows that he cannot give a frank appraisal without destroying a young officer’s career. Currently, there is no way to point out an officer’s weaker traits (present even in the best of us) without causing unintended and unwarranted professional damage. There have been attempts to deflate the grading by changing the format of the report. When I was first commissioned, shortly after the last major change, the average officer on my ship was rated in the top 10% or 30%. Now he finds himself in the top 5% or 1%. There is a natural tendency to give an officer as good a grade as possible. It is a pleasure to hand an officer a good report. But it is disconcerting to damage a career that may yet develop potential; it is much easier to let the grade slide to the left. This is similar to complaints made by college professors: they cannot give a student a B without damaging his chances for graduate school, because promising students normally receive As.
The tendency to inflate grades was pointedly illustrated while I was an ensign. My division chief did not display the leadership qualities that one would expect of a senior chief. He was a very productive administrator, but he was not able or willing to make the unpopular decisions necessary to accomplish the division's assigned tasks. Because he was also the senior chief of the command, he often was the subject of discussion among the officers of the ship. The universally expressed opinion was that his performance fell far short of what would be expected of a senior chief. Though I had worked only with one other chief, that experience and those that followed confirmed this opinion. His previous
evaluations did not. When it came time for me to write the division officer’s input to this chief’s evaluation, I spelled out his strengths and weaknesses as I saw them and as I had discussed with him previously. As an inexperienced officer, I felt awkward judging a man 25 years my senior in age and experience, so I took the matter to my department head for advice. I found that he agreed with my opinion, so I submitted the input as I had written it. Several days later, the captain casually dropped by to discuss the matter. He said that if he based the evaluation on what I had written, “the man would never make master chief.” The captain did not take issue with my opinion that the man probably should not be a master chief. Rather, one simply did not fail to recommend a senior chief for promotion, nor did one write about his failings in all but the most extreme cases. The captain did not insist that I rewrite the input nor did I volunteer to do so. However, the final evaluation was entirely rewritten. The man was subsequently transferred to another ship in the same squadron. Later I met his new division officer, who wondered how on earth the man had ever been selected for master chief. I have sometimes wondered which senior chief was passed over because of this man’s promotion. I cannot help but feel that the wrong selection was made. While the chief’s evaluation form is different, the principle applies equally to the current officer fitness report (FitRep) system.
The FitRep form has ten columns for assigning an officer an overall grade indicating his standing among his peers. For the system to work properly, 80% of the officers should fall in the middle four columns—the top 30% through the bottom 30%. To do this, the top three columns must be cleared of all but a handful—the top 10%. To place someone in the top 10%, the commanding officer (CO) should believe that he truly has someone special, not just another competent officer. The majority of the officers whom I have met are solid and competent. This is average, and must be so evaluated. In such a select group as the officer corps, to be average is not to be looked down upon.
The top three columns should be reserved for the truly outstanding performers. If a top 1% evaluation were required to be accompanied by a recommendation for a Navy Commendation Medal, and the top 5% and 10% columns were accompanied by recommendations for Navy Achievement Medals, commanding officers would be far more reluctant to use these columns except for the most deserving officers—the best one out of ten. The Navy could issue medals to 10% of its officers and still fall short of the medals awarded in 1984 to one in three members of the U. S. Army, as reported in the Army Times. Further, just below the top 10% column, a column could be reserved as a place to grade the officers who were recommended for, but failed to be awarded, the Navy Achievement Medal. This would give four columns to separate out the truly top performers. Those not recommended would rank in the top 30% column and below, where a full 90% of us actually belong. This requirement would make a commanding officer think long and hard about whether an individual’s performance truly warrants the special attention that designation to the top 10% should bring. Because a CO would be reluctant to recommend more than a couple of officers for the top columns, the middle columns would contain the majority of the officers and the stragglers would fall into the lower columns. This spread would give the officer a clearer idea of where he stands in relation to his peers. Further, the more junior awards would be given out in a more systematic manner than is now the case. I served three-and-a-half years on one ship with not a single award given, yet during a two-year period at another command, I saw six Navy Achievement Medals awarded.
Forcing COs to single out the poorer performers is a more difficult task. Currently, this seems to be done (except in the most extreme cases) by “damning with faint praise.” The officer might be rated only in the top 5% or 10%, and the compliments might not be as glowing as those given to his shipmates, but he really has not received a poor report (it certainly does not read like one), and it leaves selection board members wondering if they are reviewing a substandard officer, or a report by an extremely frank, or less eloquent, CO. The selection board does not have the opportunity to review the FitReps of other officers from the same ship side-by-side unless they are both up for selection. The officer and the selection board would be better served by a poor report that reads like one. The selection board can and does have the chance to compare officers from large commands. They also review many reports on the same officer, but usually written by a variety of different authors. They are extremely flexible and can identify and make allowances for COs who consistently grade their officers higher or lower than normal.1 Then they attempt to magnify the column distribution and determine trends. Consider a hypothetical command with ten lieutenants spread 3:5:2 in the top three columns. If a subsequent report shows a spread of 4:5:1, the board can make additional judgments about the relative standings of the officers. This becomes more difficult; as in the example cited in the opening of this discussion, it is very difficult to make the relative judgments as to the member’s overall standing in the Navy meaningful as the size of the group diminishes.

Perhaps a block should be included to detail the officer’s two poorest traits (we all have them). This would be a required entry for all officers except those recommended for awards, as noted previously. All of us overlook our own weaknesses; we would benefit from knowing about them. This is addressed to some degree by the individual trait grades, where a B or C may indicate a severe weakness or may just serve as a contrast for traits that the CO believes should stand out as exceptional strengths.

Next we must address the impact on FitRep writing caused by the knowledge that the writer will have to confront those whom he has written about. “Commanding Officers are reluctant to face the confrontation of personal justification in showing and discussing reports with the junior. . . . [He] recommends that the requirement to show the report to the officer concerned be eliminated and in fact constrained. The officer must always have official access to his record but not prior to its [the FitRep’s]
submission. There is no constraint upon a Commanding Officer’s ability and in fact, persuasion to counsel his officers on their performance on a continuing basis but the fitness report should not be the vehicle for this.”2 This would allow the CO to address problems more freely. An officer with a minimum of initiative would later review the report in his record and enter justifiable statements of rebuttal. He could also seek out his immediate superior or CO for counseling.
All FitReps submitted by a command should be compared to a statistical norm. In one year, even a relatively small command will submit 10-15 FitReps. Unless the ship has been outperforming all others—receiving efficiency “Es” in all categories and doing superbly in inspections—it can be assumed that these reports should fall in a normal distribution, with about half of them falling in the lower half. If the actual distribution falls outside of statistical norms by a previously determined amount, the reports should be returned to the command for resubmission. They would either be changed to reflect a more reasonable distribution or a special report would be required to justify the wardroom’s statistical superiority. Writing FitReps is a chore that few of us enjoy. The added burden that would fall on those officers as a result of submitting inflated reports would rapidly improve the accuracy of the reports. COs who continue to overestimate the abilities of their subordinates should have this noted in their own reports.
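Such a screen could be automated. A minimal sketch, assuming a 10% top-bracket norm and a 1-in-1,000 flagging threshold (both numbers are my illustrative choices, not the article's):

```python
# Hypothetical screen: flag a command's batch of FitReps if the number of
# top-bracket marks would be extremely unlikely under an assumed norm.
from math import comb

def tail_prob(n_reports: int, n_top: int, p_top: float = 0.10) -> float:
    """Probability of n_top or more top-bracket marks out of n_reports,
    if each report independently had a p_top chance of a top mark."""
    return sum(comb(n_reports, k) * p_top**k * (1 - p_top)**(n_reports - k)
               for k in range(n_top, n_reports + 1))

def should_return_for_resubmission(n_reports: int, n_top: int,
                                   threshold: float = 0.001) -> bool:
    # Return the reports if this concentration of top marks is implausible.
    return tail_prob(n_reports, n_top) < threshold

# A 15-report wardroom marking 13 officers in the top bracket:
print(should_return_for_resubmission(15, 13))
```

The same logic extends to flagging chronically harsh graders by testing the lower tail instead.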
The standard solution to the problem of inflated FitReps has been to revise the reporting form. This has always been, at best, a temporary solution. “No basic solution is going to work that does not punish the reporting senior in some form or another for overinflating his reports.”3 This “punishment,” as stated previously, primarily should take the form of added effort required to rate an officer highly. The added effort and risk to credibility involved in the suggested required recommendations for citations would be a small price to pay (occasionally) to reward the truly outstanding officer, but should be sufficient to deter routine inflation of the average officer’s report.
This problem is not unique to the Navy; it is common to all large organizations. To a large degree, the selection boards have managed to read between the lines to make meaningful comparisons, generally producing accurate performance assessments despite the lack of cooperation from the rest of the fleet. “The basic Navy system has been admired and emulated by the other services and by major industries. They all face the same escalation problem and manage to make it work.”4 Incorporating these suggestions would spread out the distribution of the grades and thus improve the objectivity and accuracy of the selection board’s relative ranking of the individual.
1Correspondence and subsequent discussions with Rear Admiral P. P. Cole, U. S. Navy (Retired), 10 May 1986.
2Correspondence with Cole, 10 May 1986.
3Ibid.
4Ibid.
Commander Swensen received a B.S. degree in engineering from Cornell University in 1976. He served in the Mauna Kea (AE-22) and was designated a Surface Warfare Officer in 1978. He transferred to the Naval Reserve in 1980 and served in the Pyro (AE-24) and in Special Boat Unit XI. He is a California-registered civil engineer and employed by VHS Associates.
Going Purple
By Commander John L. Sams, U. S. Navy
No issue is more important to the Navy than the quality of its officer corps. Every other issue raised in Proceedings will be addressed (or ignored) according to the priorities set by those who lead the Navy. For this reason, it is vital that the Report on the Fitness of Officers (NAVPERS 1611/1) validly and unambiguously compare our officers so that only the best will be selected to lead us. To those who say, “Don’t fix what ain’t broken,” I submit that any system that grades 90% of its officers top 5% or higher needs fixing.
In the two years I served as operations officer for a joint service command in Europe, I became familiar with the various services’ officer fitness reports (FitReps). Some features of the Army’s, Air Force’s, and Marine Corps’s FitRep formats could improve the Navy FitRep. (My comments are based on personal observation as a report writer; I am not a personnel expert, nor is this an exhaustive study, and my opinion may or may not reflect official views.)
The Army DA-67-8 Officer Evaluation Report (OER) (see Figure 1) is similar to the Navy’s FitRep in that it contains a duties assigned block, a performance scores section, and a narrative section for specifics of performance and potential. The Army devotes space to professional ethics (dedication, loyalty, discipline, etc.), which are nowhere to be found on the Navy FitRep. Also, whereas the Navy assigns one block (29) to goal setting/achievement, the Army devotes a separate form, DA-67-8-1 (Figure 2). Each officer, with his rater, fills out the front of this form within 30 days of the start of a rating period. At the end of the period, block IVc becomes the ratee’s brag sheet, in which he catalogs how well he accomplished the goals set in block IVb. Though at first glance this seems like paperwork overkill, I have come to appreciate the virtue of this procedure, in which individual goals merge with command goals. I required the DA-67-8-1 to be filled out by all officers who worked for me, regardless of service. In addition to the rater’s space, the Army OER provides comment blocks for the rater’s two immediate seniors in the chain of command; both have their chance to help or hurt you. No skipping up the reporting chain is allowed and rating chains are published for all to see. The bottom line on Army OERs is the block filled with little soldiers. The left side of this block equates to the Navy’s overall evaluation (block 51) section; however, the Army’s version of the Navy’s block 52 summary is not done at the unit level, but computed by the Department of the Army for every senior rater when the number of evaluations he has written exceeds ten. If one tends to be an easy marker, the Army will notify both the officer and selection boards of this deficiency. This has the effect of holding down the number of officers marked in the top block.
The front of the Air Force OER, Form 707 (Figure 3), is similar to the Army OER, except that a large amount of space is allowed for elaborating on specific examples of job performance. Potential is marked in block V. In my experience, most Air Force officers are marked in the highest block here as well as in block III categories. There is no distribution summary block. In both the Army and Air Force, annual rating periods are based on rotation dates of rater and ratee, and one is seldom rated simultaneously with his peers. This contributes to inflated marks. Like the Army, the Air Force allows three seniors to enter comments, but unlike the Army, the Air Force permits the endorser to be from any level in the chain of command. If this concept were applied in the Navy, a division officer on a frigate could have his OER endorsed by Commander-in-Chief Atlantic Fleet! Air Force logic appears to be that a “water walker’s” seniors will take effort to push his OER to a level that distinguishes him from the pack (who, of course, are all marked in the top block). It is not unreasonable to assume that this policy encourages officers to seek duty on senior staffs rather than at operational squadrons, hoping to maximize the rank of endorsers.
The Marine Corps provides only marking blocks in its performance and professional qualities section of the NAVMC 10835 (Figure 4), and allows only a small space for appraisal elaboration. (This may be continued on another sheet.) The Marine Corps’s scale uses the following marks: not observed, unsatisfactory, below average, average, above average, excellent, and outstanding. (Note: three categories are higher than average in item 15a, “General Value to the Service”; of the four other marks provided, three are also higher than average.) An interesting feature of block 15b is that the rater is required to tabulate the distribution of all subordinates of the ratee’s grade
whether or not they are being simultaneously rated. (In other words, every subordinate is rated overall any time one is fully reported on.) To further ensure against hanky-panky, the rater is required to list on page two the names of all ratees under his authority, and if any received an outstanding overall, certify his standing (i.e., one of three) among all marked outstanding. Page two also provides room for a reviewing officer’s comments.
The Navy FitRep system may seem fair enough (especially to those who have been rewarded by it), but we should earnestly ask whether it is really the best we can design to discriminate among officers, not only for selection boards, but for counseling purposes.
Figure 4 Marine Corps Fitness Report
Figure 5 A Proposed Fitness Report
Force include judgment under performance; the Marines put
under professional qualities; and the Navy lists it under Pets°^c traits). My preferences are indicated. These boxes should^
FitRep includes sections for rater comments and CO evalu~~ , Rater comments should document how well the ratee achic the objectives set in the “individual goals” form at the beginn -
purposes. My proposal is a redesigned FitRep system—based on 67-8-1 (Figure 2). It should be jointly filled out by ratee and supervisor, and reviewed by those who make inputs into the final FitRep, including the commanding officer (CO). (There is no prohibition against the CO adding goals.) This form can ensure that individual goals tie in to command goals, and that all concerned understand exactly what direction juniors are expected to move.
Figure 5 is my proposal for an improved FitRep. It should be revised to incorporate a performance factors section similar to the Air Force’s, minus the marking boxes. (Marks here are invariably inflated.) I recommend that the Navy closely reexan the performance factors included (for instance, the Army and ^ filled in by the ratee. This is his chance to articulate sPeC'|1jS accomplishments that illustrate his performance, not only f°7g seniors, but for the benefit of selection boards. The back ot
aluati°n:
19*7
l!^_^cademy Way
poi tefratln® l,er'od' Th's section provides a concrete starting Weas r°|m Wb'cb counseling may proceed, with performance with Ure 3®a'nSt cstal,lished goals. If the rater should disagree here ^ ratee"Prov‘ded achievements, he may officially do so
re 1 atK-0fCasl1 ^e rater documents performance, the CO provides a Harks6 ran*'n8 °f the ratee’s performance and potential. He also (other fr rS°na' tra'ts> which are limited to three marking blocks Pack ” . an(aot observed): superior to contemporaries, “in the tive if > 1c^'nc^ contemporaries. It would be highly informa- traits °ne S Peers and subordinates could also mark these same he c S°mewhere °n the FitRep. Although such a system would scoretbere’s no h°ubt that the best of the best would Navv ' y W't*1 sen’ors> juniors, and peers. (I am sure that the pers 'S not ready to take this step!) It is unnecessary to subdivide tential 3 traits or lhc bottom line marks on performance and po- divide ' fr* ITIOre t^lan three categories. It is impossible to fairly to (I,,, °. 1Cers. 'nt0 ten categories (as the Marine Corps attempts (perha - low *S §enerahy apparent which officers are: superior dedica?S| f *n ’he Pack” (50-75%—those hard working, c°ntemC °”1Cers wh° make up the majority), or behind one’s he markkaaneS ^dication of relative ranking should continue to Cer ma l i'n t*1C distribution summary block, and for each offi- re<luired t ^LIPer'or in overall performance the CO should be c°uld b ° *Urt^cr indicate each one’s relative ranking. This c accomplished with a mandatory entry in his comments % UeUtenam Jeffrey A. Tomeo, U. S. Navy
block, such as: “I rank Lieutenant Jones number two out of the three lieutenants marked superior overall.” The Navy should monitor the bottom line summaries and advise the commanding officer, his boss, and selection boards should a CO’s distribution of subordinates fall outside of the norms cited above, or Navy-determined percentages.

The candor required for such a revised FitRep, though seemingly brutal compared to the current system, would unambiguously let officers know where they stand—something we should demand from a performance reporting system. I believe that such a revised FitRep format would:

- Improve command management/goal setting
- Provide better counseling and feedback
- Provide a mechanism for the ratee to communicate specific accomplishments to selection boards
- Be easier for raters/commanding officers to fill out
- Be more useful to selection boards

Commander Sams was operations officer of the U. S. European Command Defense Analysis Center from 1985 to May 1987, when he reported to the Carl Vinson (CVN-70) as intelligence officer. He graduated from the Naval Academy in 1969 and served in the Benjamin Stoddert (DLG-22), Tucumcari (PGH-2), PTF-23, and Kinkaid (DD-965) before changing his designator from 1110 (surface warfare) to 1630 (intelligence). He then served as intelligence officer in the Nassau (LHA-4) and as production department head of the Fleet Intelligence Center, Europe and Atlantic.

By Lieutenant Jeffrey A. Tomeo, U. S. Navy

When was the last time you were marked lower than the top 10%, or received a letter grade lower than an A on your fitness report (FitRep)? No doubt, your initial reaction was: “Where did I go wrong?” You probably had a sinking feeling that your career was over.

Any system that evaluates so many of its members in the top 10% needs an overhaul. Young junior officers learn the system early in their careers from their commanding officers (COs), who learned it from their COs. The Navy is living in the past!

Today’s officers are well-educated and extremely competitive. However, they are not all superstars. As an old salt explained to me when I first joined a squadron out of flight school, there are basically three types of people. In keeping with the best tradition of abbreviating long titles, these categories are: SHWOW, KMAIM, and IDGADAYCMM.

SHWOW means “super hot, walks on water.” These are the true top 10% of the officer corps. A member of this category loves the Navy and his job more than anything else. He will spend 20 hours a day working and eats his meals at his desk. He knows every publication inside-out and has even rewritten a few. He is the guy you love to fly with or stand officer of the deck with because you know he can handle any situation. There is no questioning that this guy can handle jobs at least a rank or two senior to his own. He is definite command material.

KMAIM, or “kick me and I’ll move,” is the largest category, by definition. Anyone below the top 10% and above the bottom 10% is a member of this group. An officer in this group can be the quiet, unassuming type. He always gets the job done, but you had better be watching him. Deadlines sometimes sneak by without the job being done. One day he will dazzle you with a stellar performance, but the next day he shows up for work at noon without a good reason, or he misses a training session or meeting because “I forgot.” At the other end of the scale, he
could also be the type who is always skirting trouble. You have the gut feeling that the person is wasting the taxpayer’s money, but you can never pin him down. He is frequently late with reports and is generally sloppy about his job, but he still gets things done. He would not get his warfare specialty pin without someone else setting deadlines and pressuring him.
IDGADAYCMM, or “I don’t give a damn and you can’t make me” is the bottom 10%. His whole attitude is in the sewer. You can always find him in his stateroom reading nonprofessional magazines, or sleeping. His reports are always late or full of inaccuracies. He will sneak ashore on his duty day to avoid work.
Certainly, these examples are somewhat exaggerated, for there is usually a fine line dividing the categories. Evaluations would not be so difficult if performance indicators were so neatly packaged and obvious. The point is that the Navy, like all professions, has officers who perform across the spectrum. It is time that the Navy’s evaluation system reflected this.
The U. S. Naval Academy recognized the problem, and this past year met it head on. Guidance is explicitly laid out in an instruction titled: “Operation and Administration of the Military Performance System.” As with any innovation, the guidelines set down in this instruction met with initial resistance. After reading the wording closely and consciously following their intent for the first semester of the 1986-87 academic year, users are heralding the changes as revolutionary.
The purpose of any evaluation system is to provide feedback. The best results are achieved when that feedback is honest. When officers and midshipmen are constantly evaluated as top 10%, they may begin to believe it. Then the evaluation system becomes a failure. Rather than stress areas that need improvement and thus provide direction to help people become more productive and efficient, the current system stresses achievement. While that in itself is not bad, it promotes complacency and self-satisfaction. Without direction, the system allows the individual to develop on his own.

…constantly hear negative things about his performance. This is at variance with the evaluation form and the scale defined by the performance… Yet this is realistic. Most people have grown accustomed to the A-B-C-D-F grading scale; in learning institutions, a C is average… If a consecutive D is awarded, the midshipman must appear before the Military Performance Review Board. At that time, the midshipman… The new evaluation standards are working. Midshipmen… The time for change is now! Though there is no perfect evaluation system… 1981. He did his squadron tour with Air Antisubmarine Warfare Squadron 30…
The current midshipman evaluation form is a slight variant of the fleet FitRep. There are not nearly as many grading levels, and the scale used at the Academy more closely resembles classroom grading. A letter grade of A denotes outstanding in all respects. That is the highest grade given and is reserved for the true top 10%. Above-average performance receives a B. This group is limited to the top 30%. An average midshipman receives a C, and below average, a D. Those processed for separation from the Academy for insufficient aptitude and poor performance receive an F.

The fleet should do as the Naval Academy does in its new appraisal system: Assume Mother Nature dispenses her talents evenly. Graders may give “As” to only the top 20-25%; “Bs” in the 20-50% range; “Cs” in the 50-90% range; and “Ds” to the bottom 3-10%.
The heart of the new instruction is captured in the following statement: “Talent within the Brigade is generally distributed uniformly. To ensure consistency of Military Performance grade assignments, the distribution plan below will be followed.” Figure 1 illustrates the grade distribution plan.
The grade distribution chart is realistic and gives the company officers great latitude in assigning performance grades. Responsible for about 125 midshipmen of fairly equal mix among the four-year groups, the company officer can designate the top 20-25% as outstanding, or A performers. Those in the 20-50% range are considered above average and would receive a grade of B. The average performer is evaluated as just that—average. Anyone falling in the 50-90% range is the pack player and receives a C. The bottom 3-10% of each company is given a below-average grade of D.
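For readers who want to see the distribution plan mechanized, it reduces to a simple mapping from class standing to letter grade. The sketch below is illustrative only; the exact cutoffs of 25%, 50%, and 90% are assumptions chosen from within the published bands (A = top 20-25%, B = to 50%, C = to 90%, D = the bottom).

```python
def performance_grade(rank, company_size, a_cut=0.25, b_cut=0.50, c_cut=0.90):
    """Map a midshipman's standing (1 = best) to a letter grade.

    Cutoffs are assumptions drawn from within the instruction's
    published bands, not the instruction's own wording.
    """
    percentile = rank / company_size  # fraction of the company at or above this standing
    if percentile <= a_cut:
        return "A"  # Outstanding
    if percentile <= b_cut:
        return "B"  # Above average
    if percentile <= c_cut:
        return "C"  # Average
    return "D"      # Below average

# A company of about 125 midshipmen, per the article's example:
grades = [performance_grade(r, 125) for r in range(1, 126)]
```

Note that whole-number rounding makes the resulting D group run slightly over the 3-10% band for some company sizes; the company officer's latitude absorbs that.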
At first glance, this system may appear improperly focused. It seems contradictory, for example, to label someone in the bottom 15% of the class as average. However, a closer look reveals that this makes good sense. There is nothing terribly wrong with being average. Many people do not have the drive or dedication needed to be a front-runner. It would be idealistic to think that every officer in the Navy is a workaholic. As long as there are enough competent officers to handle the workload, there is no need for everyone to be a top 10% performer. Every command needs a few, to generate fresh ideas and maintain vitality.
Another good reason for the large grouping of average scores is that it would degrade morale to rank large numbers of people below average. Evaluations are not meant to belittle someone or to provide a list of all his mistakes. Rather, an evaluation should be a combination of strengths and weaknesses. It should provide specific feedback on what the individual has done well, along with denoting areas that need more emphasis. Nobody likes to constantly hear negative things about his performance. This is where tact and succinct, professional writing become important.

Figure 1 USNA Grade Distribution*
Outstanding (A): 20-25%
Above Average (B): 25-30%
Average (C): 40-45%
Below Average (D): 3-10%
*This distribution applies to a company, not a class within a company.

There is a slight inconsistency in the grading scales used in the instruction. The evaluation form defines an A as the top 10%, yet the instruction allows 20-25% of a company to be awarded an A, a conflict between the evaluation form and the scale defined by the performance instruction. Yet this is realistic. Most people have grown accustomed to the A-B-C-D-F grading scale; in learning institutions, a C is average. Because we cannot control those outside the military, it makes perfect sense to adapt our grading scale to society’s norms. It would be more advantageous to change the evaluation form to conform with the performance instruction’s definition than vice versa.

The Naval Academy also allows for more movement up and down the scale. Receiving a grade of D does not mean automatic expulsion. Rather, it is a warning flag to help the midshipman realize there are serious problems. If the problems are resolved, the system worked. If the problems persist and a second consecutive D is awarded, the midshipman must appear before the Military Performance Review Board. At that time, the midshipman’s entire performance record is reviewed and a decision is reached on whether to retain or to separate. In this manner, nonperformers cannot drift along with the current for four years and receive a commission without having earned it.

The new evaluation standards are working. Midshipmen understand the grading scale. They realize that good performance grades are no longer handed out like candy—they must be earned. As a result, initiative and participation are at a new high. With a clearly defined standard and forced compliance with the grade distribution chart, everyone realizes he is on an equal footing with his classmates.

The time for change is now! Though there is no perfect evaluation system, the Naval Academy is taking positive steps to show that the old, tired system can be improved. The fleet needs such a system. We are only hampering our own progress and efficiency by encouraging deadwood to stay in the Navy because their evaluations lead them to believe that they are doing a top-notch job. Time will soften initial opposition. Once the old system has been cleared out of everyone’s mind, it will be obvious that an evaluation system like the Academy’s will increase individual productivity and unit efficiency, and will give individuals a better perspective on how they stack up against the fleet ideal.
Lieutenant Tomeo is a 1980 Naval Academy graduate. He reported directly to Pensacola for flight training and was designated a Naval Flight Officer in 1981. He did his squadron tour with Air Antisubmarine Warfare Squadron 30 and deployed to the Mediterranean Sea on board the Forrestal (CV-59) and Saratoga (CV-60). He is currently the 29th Company officer at the Naval Academy.
A CO’s Gouge
By Captain Thomas P. Scott, U. S. Navy (Retired)
FitReps! I hate ’em! If I just had a gouge that made some sense, one that would accommodate my style and the idiosyncrasies of the officers I deal with every day. One that’s fairly objective.

The fitness report instruction is a necessary starting place for every commanding officer (CO), but other problems, such as back-up data to preclude inconsistencies between evaluation marks and write-ups, require local solutions without the extra burden of another cumbersome support system.

Here’s a FitRep gouge that can work for you. It has been developed and refined over several command tours to provide greater flexibility in grading and to accommodate variations in command styles that the current system ignores.

Lots of smoke has been generated over the years by disgruntled officers’ complaints. Reports are late, incomplete, inaccurate, and inconsistent. Worse, officers receive little or no counseling on performance and progress from the CO. The problem stems from the fact that the COs lack the time, documentation, or intestinal fortitude required to perform this crucial function. To say that a less-than-adequate evaluation is “the breaks of Naval Air” won’t hack it when the crying need in most instances is a hard-nosed, one-on-one counseling session with documented reference notes to provide substance for the CO’s comments. A FitRep gouge can give you the means to do a better counseling job.

The officer corps includes all kinds of people. As the CO, one of your most important jobs is to select the few most responsible, balanced, committed, courageous, enthusiastic, self-disciplined, and persevering individuals, who you believe represent the best. They can’t all be number one. If you pretend that they are, you hurt the best, and fail to help the ones who are overrated. This is a disservice to everyone.

No FitRep system is objective; they’re not designed to be. However, to the officer being rated, objectivity (or his perception of it) is essential to the system’s overall effectiveness. A proper command climate (fairness) and an effective counseling program will clear the way for those officers who will make the effort to improve future performance.

Here’s how it works: Low man wins. Gather the inputs; add ’em up; put in the weight factors; stack the rankings by the numbers; throw in the prose; and start the debriefings. You can even use the point spread in the standings to establish the officer breakouts for each grouping, such as RAPs (recommended for accelerated promotion), 5%ers and above, regular promotion recommendations, and command screen nominations. The weight factors you apply to the various categories are the CO’s choice. You can prescribe any number of subcategories to fit the style of your command and use any terms you like to describe a performance trait not contained in the FitRep worksheet (such as “screw-up risk quotient” or “opportunity for limelight,” which are different ways of saying the same thing). Among contemporaries, a boss or an assistant can be differentiated by assigning the weight factors to each. The same is true for officers who stand the tough watches or take the tough hops. (Subjectively, you selected him for these duties; objectively, how did he measure up?) Because you chose the officer for the higher risk function, it’s up to you to decide whether and how it should be weighted. On rare occasions, some officers have opted out of the tough assignments, which also tells you something about those individuals. Did they get a shot at some of the traditionally tougher people-sensitive assignments, or did you limit them to the less-visible ones for some reason? Why? They’ll need to know when you debrief them.
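The drill the captain describes (gather the marks, apply the weight factors, add ’em up, stack the rankings by the numbers) can be expressed in a few lines. The sketch below is purely illustrative: the category names, weights, and marks are hypothetical and not taken from any worksheet, though the officer names follow the article’s samples.

```python
# Hypothetical gouge: low total wins. Marks accumulate "weakness points"
# per category; the CO's chosen weight factors emphasize what matters most.
weights = {"leadership": 3, "operations": 3, "administration": 2, "intangibles": 1}

# Illustrative category subtotals for four officers (hypothetical values).
marks = {
    "Brown": {"leadership": 4, "operations": 5, "administration": 6, "intangibles": 3},
    "Green": {"leadership": 5, "operations": 6, "administration": 5, "intangibles": 4},
    "Jones": {"leadership": 10, "operations": 8, "administration": 7, "intangibles": 6},
    "Smith": {"leadership": 11, "operations": 9, "administration": 9, "intangibles": 7},
}

def total_points(officer):
    # Apply the weight factor to each category subtotal and add 'em up.
    return sum(weights[cat] * score for cat, score in marks[officer].items())

# Stack the rankings by the numbers: lowest total points ranks first.
standings = sorted(marks, key=total_points)
```

The point spread between totals, not just the order, is what the CO would use to draw breakout lines such as RAP, 5%er, and regular-promotion groupings.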
The CO can also generate some special projects as topics of interest for each junior officer (JO). This provides a conduit for periodic one-on-one contact with that individual without violating the sanctity of the chain of command. In most cases, you won’t have any problems with this arrangement, but the JO might, and that tells you something about him. Without some personal link, “Ensign Benson,” with limited opportunities to engage you during the daily routine, could get lost in the shuffle if you don’t make an effort to maintain some contact with him.
Whatever form it takes, a FitRep gouge should provide the following:
- A grading category selection expanded beyond the current worksheet listing, and grouped to accommodate the current FitRep format
- A quality-point system that gives added emphasis to those performance factors that you believe are most important
- Implied objectivity, whether or not it can ever be achieved, which creates a sense of fairness
- A means for factoring other inputs into the rating cycle (executive officer [XO] and department head recommendations)
When all the paper-shuffling and number-running is done, an effective evaluation comes down to this—hard-nosed counseling, in which the CO helps a junior colleague to be as good an officer as he can be.
Figure 1 Expanded Worksheet Grading List (Sample)
- Leadership
Effective (Visible)
Show-Me (Hesitant)
Follow-Me (Out Front)
Wishy-Washy
Decisive
Theory-X
Mature
- Presence
Confident
Abrasive
Overbearing
Submissive
Enthusiastic
Compatible
Listens & Learns (Normal)
Won’t Listen
Doesn’t Learn
Can’t Learn
- Administration
Organization
Response (on time)
Complete
Accurate
Sensible
Planning (plan of action and milestones)
- Intangibles
Spirit
“Moxie”
“Pizzazz”
“Cool”
“Guts”
Luck
- Potential
- Operations
Warfare Specialty (progress)
Weapon Systems Knowledge
Weapon Systems Tactical Employment
Crew Training
Teamwork
Tactical Planning (schedules)
Achievement (Competitive Evaluation Exercise)
- Asset Management
Care (looks, works)
Use (‘ups’ fly, machinery schedules)
Planning/Employment (maintenance, availability)
Trouble Shooting (recommendations, solutions)
- Fiber
Conviction
Courage
Concern
Sobriety
Sincerity
Cooperation
- Personnel (Management)
Training
Morale/Attitude
Squared Away
Well-informed (rules, tasks, goals, “benies”)
Efficiently Used (jobs, details, watches)
Courteous
Re-Up (trends)
Advancement (statistics)
Human Relations (opportunities, success)
Figure 2 Sample Worksheet—Leadership
| Category: Leadership | Effective | Show-Me | Follow-Me | Wishy-Washy | Decisive | Theory-X | Mature | Subtotal | Weight Factor (×) | Total Points | Relative Standing | Remarks |
| Brown | 1 | 0 | 1 | 0 | 1 | 0 | 1 | 4 | 3 | 12 | 1 | Brown leading the pack in leadership—press on. |
| Green | 2 | 0 | 1 | 0 | 1 | 0 | 1 | 5 | 3 | 15 | 2 | Green strong in leadership. No significant weaknesses. Press on. |
| Jones | 3¹ | 0 | 2 | 1 | 2¹ | 0 | 2 | 10 | 3 | 30 | 3 | ¹Jones not doing badly; just doesn’t show any excitement, which affects the reaction of men to his leadership efforts. Get excited. Let men see and feel it. |
| Smith | 3¹ | 2 | 0 | 0 | 2 | 1 | 3¹ | 11 | 3 | 33 | 4 | ¹Smith can’t get people to do what he wants without confrontation. Department head usually has to get involved. Needs to quit playing dictator and spend some time listening in his division. Take chief petty officer recommendation—see what happens! |
Notes:
- Low-point man wins.
- Weight factor varies—CO’s choice.
- Subcategories determined by specific strengths and weaknesses displayed by the officers being rated. Obviously, each subcategory can’t apply to every officer in each grouping. (One can’t be wishy-washy and decisive at the same time.)
- Marks denoting weaknesses in subcategories should be accompanied by written comments that can be used later during individual counseling sessions.
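As a cross-check, the arithmetic in Figure 2 can be reproduced directly from the worksheet marks; the subtotals, weighted totals, and relative standings come out exactly as printed. A short Python sketch:

```python
# Subcategory marks from Figure 2, in worksheet column order:
# Effective, Show-Me, Follow-Me, Wishy-Washy, Decisive, Theory-X, Mature.
leadership_marks = {
    "Brown": [1, 0, 1, 0, 1, 0, 1],
    "Green": [2, 0, 1, 0, 1, 0, 1],
    "Jones": [3, 0, 2, 1, 2, 0, 2],
    "Smith": [3, 2, 0, 0, 2, 1, 3],
}
WEIGHT_FACTOR = 3  # the CO's choice for the Leadership category in Figure 2

# Subtotal x weight factor gives each officer's total points.
totals = {name: sum(m) * WEIGHT_FACTOR for name, m in leadership_marks.items()}

# Low-point man wins: Brown (12) stands first, Smith (33) last.
standing = sorted(totals, key=totals.get)
```

Running the same computation over every category worksheet, then consolidating, yields the relative standings carried forward to Figure 3.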
A FitRep gouge should also provide the following:
- A method that encourages written notations explaining the grade comparisons within the various officer groupings, thereby reducing the need to wing it when the counseling sessions begin
- A system that is simple enough to be used regularly, and can be explained to the officers participating in the rating process
As simple or complex as your system may be, remember that the officers in a command already know who the strokers and non-strokers are. Your task as CO is to provide this information to those outside the command who must make important personnel decisions about assignments, promotions, and command selection—largely on the basis of the information contained in the officers’ FitReps.
Let’s run through a drill to demonstrate how a system can be used to construct the essential elements of a FitRep (the grades, rankings, and breakout). The first step is to expand the standard worksheet grading categories to fit the officers in your command and to accommodate your style. All of the officers in the command are probably not water walkers, but they do represent certain strengths and weaknesses that can be identified, evaluated, and compared. Integrating the standard worksheet grading categories into the expanded list, where feasible, will facilitate the transfer of your assigned grades to the regular FitRep format. For example, the category Subordinate Management and Development from the standard worksheet parallels the Personnel category of the expanded worksheet list illustrated in Figure 1. Some qualities, such as Command and Leadership, pervade all elements of any attribute list. Your task in these areas is to ensure that the strongest leaders with the greatest command potential receive the highest marks and the strongest write-ups. Figure 1 illustrates the first step in the process.
Figure 3 Relative Rankings by Category

Notes:
1. Record comments on individual performance for each category for use in counseling, as appropriate.
2. Relative ranking in each major category above was transcribed from the individual category worksheets (Figures 2 and 3).
Figure 4 Sample Breakout Chart
| Name | Relative Ranking | Total Ranking Points | Breakout |
| Brown | 1 | 10 | 1%, RAP, Command |
| Green | 2 | 13 | 1%, RAP (2 of 2) or Regular, Possible Command |
| Jones | 3 | 19 | 5%, Regular, No Command Recommended |
| Smith | 4 | 25 | 10-30%, Possible Regular/No Promotion, No Command |

Notes:
Brown and Green, as numbers 1 and 2, rank well ahead of Jones, number 3, in total ranking points. Brown is also sufficiently ahead of Green in the sample to warrant a separate, individual recommendation for accelerated promotion (RAP) and command selection. Green may, or may not, receive a recommendation for accelerated promotion or command screen support based on his position in the point standings.
Following the same analysis, Jones and Smith would be evaluated for promotion and command recommendations based on their relative standing and total ranking points.
Once you decide the grading categories, the next step is to group the officers in the standard competitive categories (rank, designator/specialty), and list them on a homemade worksheet that also contains the expanded grading categories (Figure 2). Now you’re set up to evaluate and compare the performance of each officer in the individual groupings and develop a comparative standing in each major category of the expanded grade list. Enter specific comments about each officer in each grading subcategory as you evaluate that aspect of his performance. (Figure 2 provides an illustration of a completed worksheet in the Leadership categories. Worksheet construction is the same for all categories and officer groupings.) The written comments will provide the information you need for individual debriefs and counseling later in the process.
As stated earlier, the point spread or high man/low man out is the CO’s choice, just so you maintain consistency throughout the process. The number range you select can be used to spread out or bunch up the rankings in each category. A grade-point spread from 1 to 3, for example, keeps everyone relatively close together; whereas a spread from 1 to 10 in the scoring will scatter the standings considerably more.
To reduce the number crunching after the grading by categories is finished, you can construct an abbreviated worksheet containing only the major grading categories for each officer group and record their relative standings by category. (See Figure 3.) The written comments about specific performance traits for each officer are then consolidated and transferred to the summary sheets. These notes are useful in checking performance progress for individual officers during subsequent evaluation periods. They also provide good counseling records for use in follow-up sessions with those same individuals.
After the groups have been rated comparatively by major category, you can now perform a breakout by the numbers. Applying some standard promotion and command screen percentage factors for each group of officers, the rankings can be refined further. (See Figure 4.) Continuing the example, let’s assume that the rating list contains four lieutenant commander line officers. Statistically, about 80% (three of these officers) will be promoted to commander. Statistics might also indicate a 50% command screen probability for this group (two officers). In the absence of other, more convincing evidence to the contrary, you now have a promotion and command screen recommendation baseline for the four lieutenant commanders in your command. Even if you reject these results and alter the rankings arbitrarily, your reasons for making the changes can be bounced against the information you developed while doing the worksheets. If you note your reasons for the deviations, you now have additional information to support your decisions and enforce your counseling remarks. And, by doing the drill, you must evaluate the performance factors you decided were important in the first place.
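The promotion and command screen baseline described above is plain arithmetic: apply the historical selection rates to the size of each competitive group and round to whole officers. A minimal sketch, with the rounding convention an assumption:

```python
def breakout_slots(group_size, promotion_rate=0.80, screen_rate=0.50):
    """Baseline recommendation counts for one competitive group.

    Rates follow the article's example (80% promotion to commander,
    50% command screen); rounding to the nearest whole officer is
    an assumed convention, not part of the article's method.
    """
    return round(group_size * promotion_rate), round(group_size * screen_rate)

# Four lieutenant commanders, per the example in the text:
promote, screen = breakout_slots(4)
```

The result (three promotion recommendations, two command screen nominations for a group of four) is only a starting point; the CO can deviate from it, but the worksheet notes then document why.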
Weight factors applied to the relative standings in each grading category can alter the competitive lineup significantly. If, for example, an officer is an outstanding, operations-oriented leader, but is less capable than another in Administration and Asset Management, and you personally prefer warriors to paper pushers, you can apply a weight factor that enhances warrior qualities in the overall standings.
It should be obvious by now that I’m not selling snake oil or a cure-all for the FitRep process. My purpose is to convince you— the harried CO—that you need to spend some time thinking about the FitRep business (other than your own) as it applies to your command structure. If you do, chances are the officers in your command will derive even greater benefit from the evaluation process than they do now.
If you decide that the benefits derived from the drill do not justify the effort—no problem. In the years I’ve used the system, the gut rankings have nearly always mirrored the scientifically extrapolated data, with the usual exception of one or two middle-pack players. But it did provide me ample reference material, or gouge notes, if you prefer. And for me, in the long run, that extra help justified the effort. The bottom line is still the same: The rating system is only as good as you—the CO—make it.
Captain Scott held four command assignments before retiring in 1985: Attack Squadron 82 (1971-72), Carrier Air Wing 17 (1974-75), the Coronado (LPD-11) (1977-79), and the Peleliu (LHA-5) (1980-82). He also served in attack squadrons 12, 113, 44, and 174. He received an M.A. degree in international affairs from American University in 1963, and was awarded the Distinguished Flying Cross, Bronze Star, Air Medal, and Navy Commendation Medal.