United States Civil Service Commission
Bureau of Policies and Standards
Technical Memorandum 76-14

AN OVERVIEW OF THE FEDERAL EXECUTIVE DEVELOPMENT PROGRAM II ASSESSMENT CENTER

Hardy L. Hall and Dale R. Baker
Applied Psychology Section
Personnel Research and Development Center
U.S. Civil Service Commission
Washington, D.C.
August 1976

ABSTRACT

This report provides an overview of the assessment center used in the second Federal Executive Development Program (FEDP II), under the operational control of the U.S. Civil Service Commission. This was a developmental program to prepare employees at the GS-15 level for higher level executive positions in the Federal government. The assessment center was used as one of the evaluation devices for selecting 27 employees from a group of 72 finalist candidates to participate in the program. The final selection board found the assessment center results to be extremely valuable in making the selections for the program. Results of post-assessment surveys administered to assessors and candidates were highly favorable with respect to the total assessment center method.

CONTENTS

Job Analysis
Evaluative Instruments
Assessor Selection
Assessor Training
Selection Process
Assessment Center Results
Final Selection of Candidates
Feedback and Counseling
Questionnaire Findings
Recommendations

Appendices
A. Skill List Definitions
B. Supervisory Appraisal Form
C. Candidate Questionnaire
D. Assessor Opinion Survey
E. Candidate Questionnaire Results
F. Assessor Opinion Survey Results
G. Tables of Questionnaire Findings
   A. Mean Item Ratings and Standard Deviations for Part I of the Candidate Questionnaire
   B. Mean Item Ratings and Standard Deviations for Part II of the Candidate Questionnaire
   C. Mean Item Ratings and Standard Deviations for Part I of the Assessor Opinion Survey
   D. Mean Item Ratings and Standard Deviations for Part II of the Assessor Opinion Survey

Tables
1. FEDP II Final Results: Score Distribution of Candidates
2. FEDP II Final Results: Score Distribution of Selected Candidates

AN OVERVIEW OF THE FEDERAL EXECUTIVE DEVELOPMENT PROGRAM II ASSESSMENT CENTER

The U.S. Civil Service Commission announced the second Federal Executive Development Program (FEDP II) in the autumn of 1974. This is a government-wide, one-year developmental program which provides an opportunity for selected career managers at the GS-15 level to prepare for higher level executive responsibilities in the Federal government through a combination of formalized training and developmental work assignments. The previous program, FEDP I, was jointly sponsored by the Commission and OMB and was favorably received, which helped to set the stage for FEDP II.
Although some adjustments were made in the selection process for FEDP II, one of the evaluation portions which was retained was the requirement for the final 72 candidates to be evaluated by the assessment center method. The assessment reports provided valuable information for use by a final selection board, along with other background material, to select the 27 participants for the program.

The FEDP II assessment center activities were under the operational control of the Commission, with Dale Baker and Hardy Hall, Personnel Research and Development Center, Bureau of Policies and Standards, serving as Directors of the assessment center. Considerable data were generated and collected from the assessment center, including post-assessment center questionnaires which were administered to the assessors and candidates. This report also summarizes some of the findings from these data.

Job Analysis

The first step in developing the assessment center was to review the job analysis data that were collected and used for selecting the skills considered essential for the FEDP I program. The target position for this program is ultimately a generalist executive in the Federal service. In addition, a group of top level government executives from several Federal agencies was convened to provide additional data with respect to the skills which were essential for success in their jobs.

Next, a meeting was held with the Director and members of the staff at the Federal Executive Institute to discuss the skills which they find are essential for an executive.

Members of the Personnel Research and Development Center then met with the contractor for this program, Dr. Cabot Jaffee, to jointly determine the final list of skills to be used for the FEDP II assessment center. The list is contained in Appendix A of this report.

Evaluative Instruments

The contractor worked closely with members of the Personnel Research and Development Center in designing and tailoring six measurement exercises for the Federal executive; these were work samples designed to elicit the skills necessary for success in the job. The following exercises were used in the FEDP II assessment model:

Administrative Exercise (Transportation Control Agency)

This exercise requires the candidate to go through a large amount of in-basket material, including letters, memos, and messages for a Federal executive. The candidate assumes the role of a Regional Director for the Inner City Transportation Control Agency, an agency concerned with all aspects of transportation into major cities in the United States. The exercise requires two hours to complete and is followed up with a 30-minute interview.

Leadership Exercise (Trauma Center)

This exercise requires the candidate to direct the activities of two staff assistants to help him/her prepare a position paper to go before a congressional committee to justify an appropriation of $300 million. The candidate assumes the role of a newly appointed special assistant to the administrator of this new agency, which has responsibilities for developing a Medical Trauma Center System in the United States. This exercise requires approximately one hour of candidate time.

Leaderless Group Discussion Exercises

There were two group exercises. One, titled "Supervisor of the Year," requires the group of six candidates collectively to rank six individuals who have been nominated by the agency for the Outstanding Supervisor of the Year award. Each candidate assumes the role of one of the members of the Executive Personnel Board of a government agency. The other leaderless group exercise is the "Superport Problem," which requires the group of six candidates individually to support proposed superport sites and collectively to decide the top three choices in rank order.
Each candidate assumes the role of a member of the committee representing the southeastern United States, which later reports to a joint congressional committee for licensing of off-shore land for superport construction. Each of these exercises requires 30 minutes of preparation time followed by one hour of group discussion.

Problem Analysis Exercise (National Agency for Information Distribution)

The first part of this two-part exercise requires the candidate to analyze information and data and make a written report defending changes in a plan for allocating $2 million of supplemental funds for his/her agency, to be spent in two of six possible areas. The candidate assumes the role of the agency administrator for the National Agency for Information Distribution, which has responsibilities for encouraging the free flow of ideas, fostering good will and mutual understanding between the U.S. and foreign countries, presenting an accurate picture of the U.S., and aiding education through cooperation with foreign countries. This part of the exercise requires two hours. The second part requires the candidate to make a brief oral defense before a "Senate Subcommittee." This portion of the exercise requires approximately 30 minutes.

Assessor Selection

Letters were sent by the Commission to thirteen Federal agencies requesting their cooperation in providing two executive managers, preferably at the GS-16 level or above, to serve as assessors for the FEDP II assessment center. The letters outlined the requirement that each assessor participate in a five-day resident training session and serve as an assessor for a period of two weeks at a non-residential assessment center site. The letters also outlined the criteria for agency officials to use in selecting the assessors. The criteria were:

1. Should be a GS 16-18 program director or manager. Since the candidates to be assessed are at the GS-15 level, it is desirable, although not mandatory, that assessors be at the supergrade level. In any event, it is preferable that assessors be managers, rather than individual workers, research specialists, or others who have limited program management responsibilities.

2. Should be considered a top performer. Assessors should be persons whose own high-quality performance enhances their ability to judge the performance of FEDP finalists.

3. Should be a good communicator, orally and in writing. Since assessors' observations must be shared among other members of the assessment team, the assessors must have facility for effective oral and written expression.

4. Should be perceptive and analytical. It is not necessary that assessors be personnel management specialists, psychologists, or members of any other specific discipline. However, it is important that they be able to perceive fine distinctions in the quality of performance of individual members of a group of GS-15 managers, all of whom are considered to be superior.

5. Should be interested in serving as an assessor.

Twenty assessors, representing most of the major Federal agencies, were selected to participate in the program.

Assessor Training

The assessor training was a five-day residential course for the twenty assessors, three administrators, and other Commission personnel. The training was conducted at the Annapolis Hilton Hotel in Annapolis, Maryland, under the direction of Dr. Cabot Jaffee, with whom the Commission contracted to provide technical leadership for this portion of the assessment center process.

The curriculum for the training program placed emphasis on the following:

1. Understanding of the skills to be measured in the assessment center model.

2. Participation by the assessors in each of the six assessment center exercises, and discussion of the range of behaviors that could be observed in each.

3. Evaluation by the assessors of a group of mock candidates for practice in the total assessment center process.

4. Reliability checks to assure that each assessor was rating the skills observed from the mock candidates in approximately the same way (a simple form of such a check is sketched below).
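The report does not describe how these reliability checks were actually computed. As a rough illustration only, the following sketch flags skills on which assessors' practice ratings of a mock candidate diverge by more than a chosen tolerance and reports the average pairwise disagreement; the data layout, the one-point tolerance, and the function names are assumptions made for the example, not the 1976 procedure.

```python
# Illustrative sketch of a rating-consistency check for assessor training.
# The ratings, the one-point tolerance, and the statistics below are
# assumptions for the example; the report does not specify the method used.
from itertools import combinations

# ratings[assessor][(mock_candidate, skill)] = numeric rating given in practice
ratings = {
    "Assessor A": {("Mock 1", "Skill 1"): 4, ("Mock 1", "Skill 3"): 3},
    "Assessor B": {("Mock 1", "Skill 1"): 4, ("Mock 1", "Skill 3"): 5},
    "Assessor C": {("Mock 1", "Skill 1"): 3, ("Mock 1", "Skill 3"): 4},
}

def flag_divergent_ratings(ratings, tolerance=1):
    """Return (candidate, skill) keys where assessors' ratings spread more
    than `tolerance` points, i.e. where further discussion is needed."""
    flagged = []
    keys = {k for person in ratings.values() for k in person}
    for key in sorted(keys):
        values = [person[key] for person in ratings.values() if key in person]
        if max(values) - min(values) > tolerance:
            flagged.append((key, values))
    return flagged

def mean_pairwise_difference(ratings):
    """Average absolute disagreement between every pair of assessors,
    taken over the items they both rated."""
    diffs = []
    for (_, a), (_, b) in combinations(ratings.items(), 2):
        shared = set(a) & set(b)
        diffs.extend(abs(a[k] - b[k]) for k in shared)
    return sum(diffs) / len(diffs) if diffs else 0.0

print(flag_divergent_ratings(ratings))            # [(('Mock 1', 'Skill 3'), [3, 5, 4])]
print(round(mean_pairwise_difference(ratings), 2))  # 1.0
```

A skill flagged by a check of this kind would presumably be discussed by the training group until the assessors converged on a common rating standard, which is the purpose the curriculum item describes.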
Selection Process

The selection process commenced by making the FEDP II program open for self-nomination by people who were full-time managers at the GS-15 level or its equivalent in other salary systems in the Federal government. Those interested were requested to apply through their respective agencies.

The first screening took place at the agency, where officials reviewed the qualifications of applicants and made nominations to the Civil Service Commission. Each agency was allotted a numerical quota of nominees based on a uniform percentage of its GS-15 population; the quota ensured that each agency was permitted to submit at least one nomination. This agency screening produced a total of 122 nominees.

The second screening took place at the Commission, where a high level panel of government executives convened to review background materials, supervisory appraisals (see Appendix B), and candidates' interest in participating in the program. The panel selected 72 nominees from the group.

These 72 finalists were evaluated by the Commission's assessment center. The assessment center, a two-day program for each candidate, operated at the Naval Aviation Executive Institute in the Jefferson Plaza, Crystal City complex in Alexandria, Virginia. It was designed to evaluate groups of six candidates with three assessors for each group. The weekly schedules were set up to evaluate 18 candidates (3 groups) with 9 assessors; the evaluation of 72 candidates therefore required 4 weeks of assessment center operation. A comprehensive report was generated on each candidate as a result of the assessment center process.

The third and final screening occurred at the Commission when the final selection board, composed of high level executives both in and out of the government, convened, reviewed all the background material on each candidate, including the assessment center report, and selected the 27 FEDP II participants. This selection panel consisted of the following officials: Edward Preston, Assistant Director of Executive Development and Labor Relations, Office of Management and Budget; Hugh McKenna, Director, Bureau of Retirement and Survivors Insurance, Social Security Administration; Anita Alpern, Director, Planning and Analysis Division, Internal Revenue Service; William Heffelfinger, Assistant Secretary for Administration, Department of Transportation; and Philip Rutledge, Director, Office of Policy Analysis, National League of Cities/U.S. Conference of Mayors.

Assessment Center Results

The assessment center reports provided to the selection board included an overall assessment center score and scores for each of the twelve skills listed in Appendix A, with supporting documentation. The overall score scale used for this program was as follows:

7 - Outstanding potential for executive management
6 - High potential for executive management
5 - Very good potential for executive management
4 - Good potential for executive management
3 - Low potential for executive management
2 - Potential for executive management is marginal
1 - Lacks potential for executive management

The overall score reflects the level of the candidates' skills at the time they were evaluated at the assessment center. Table 1 shows the dispersion of the overall assessment center scores of the 72 candidates.

TABLE 1
FEDP II Final Results: Score Distribution of Candidates (n = 72)

                 Overall Assessment    Number of
                 Center Score          Candidates    Percent
  High                  7                   4            6
                        6                  15           21
  Satisfactory          5                  24           33
                        4                  16           22
  Low                   3                  12           17
                        2                   1            1
                        1                   0            0

These data were very helpful to the selection board in making the final selection of the FEDP II participants. Approximately 27% of the total group scored high in the assessment center, and 18% scored low.
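As a quick check on the figures just quoted, the short sketch below reproduces the percentage arithmetic from the Table 1 counts; the grouping of scores 6-7 as "high" and 1-3 as "low" follows the table brackets, and the underlying counts are 19 of 72 and 13 of 72.

```python
# A small sketch of the arithmetic behind Table 1 (n = 72).  Scores 6-7 are
# bracketed "High" in the table and scores 1-3 "Low".
table1 = {7: 4, 6: 15, 5: 24, 4: 16, 3: 12, 2: 1, 1: 0}
n = sum(table1.values())                      # 72 finalists

for score, count in sorted(table1.items(), reverse=True):
    print(f"score {score}: {count:2d} candidates, {100 * count / n:5.1f}%")

high = table1[7] + table1[6]                  # 19 candidates
low = table1[3] + table1[2] + table1[1]       # 13 candidates
# 19/72 = 26.4% and 13/72 = 18.1%; the report's rounded "approximately 27%"
# and "18%" match the sums of the rounded column percentages (6 + 21, 17 + 1 + 0).
print(f"high (scores 6-7): {100 * high / n:.1f}%   low (scores 1-3): {100 * low / n:.1f}%")
```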
Final Selection of Candidates

Table 2 presents the assessment center score distribution of the 27 candidates who were selected for FEDP II. The data indicate that the final selection board placed a great amount of emphasis on the assessment center results. It should be noted that the board did not select any candidates who received a low overall assessment center score.

TABLE 2
FEDP II Final Results: Score Distribution of Selected Candidates (n = 27)

                 Overall Assessment    Number of     Percent of Total
                 Center Score          Candidates    Number of Candidates
  High                  7                   4               100
                        6                  13                87
  Satisfactory          5                  10                47
                        4                   0                 0
  Low                   3                   0                 0
                        2                   0                 0
                        1                   0                 0

Feedback and Counseling

As soon as the 27 FEDP II participants had been officially notified of their selection to the program, an assessment center feedback and career counseling session was scheduled and conducted for each participant. The feedback and counseling team consisted of Dale Baker, who directed the assessment center and provided the assessment center feedback, and Jack Conyers, who directed the FEDP II program and provided executive development opportunities and career counseling to the participants. Each session lasted an average of 1½ hours and was well received by each of the participants.

All candidates who were not selected for the program were given an opportunity for feedback and career counseling within two months following the assessment center activities. Thirty-three of them requested and received feedback and counseling sessions similar to those conducted for the selected participants. In these sessions, however, more emphasis was placed on assisting the candidates in skill areas which needed further development, and suggestions were made of possible types of work assignments which would be developmental in their Federal careers. Almost every candidate was in agreement with the assessment center results and was appreciative of having the feedback and counseling session. Candidates also considered the assessment center process to be objective, job related, and beneficial to their career development.

Questionnaire Findings

Obtaining information and feedback from FEDP II participants was of utmost importance in evaluating the assessment process: How did assessors and candidates react to the use of the assessment center? What are the implications for its use for higher management? What areas were weak? What suggestions could be made for improvement? Two instruments were designed to elicit information in these and other related areas.
The Candidate Questionnaire was developed to elicit candidates' attitudes toward the assessment center and toward the individual feedback which they received on their own assessment center performance. A second questionnaire, the Assessor Opinion Survey, was concerned with assessors' attitudes and opinions about the operational assessment center and about assessor training. In order to obtain candid responses, respondent anonymity was maintained on both instruments.

Candidate Questionnaire Analysis

There were two parts to the Candidate Questionnaire. Part I, which was administered to each candidate after he/she completed the assessment center, was concerned with the candidate's opinions about the operational assessment center. Of the 23 items on this part of the questionnaire, the first 15 were rated on a 5-point Likert-type scale. The response options were "Strongly Disagree," "Disagree," "Neither Agree Nor Disagree," "Agree," and "Strongly Agree," and responses were scored +1 to +5 depending upon how positive they were for the assessment center. Items 16 through 23 were open-ended questions which provided candidates with an opportunity to elaborate on their responses to the Likert-type items and to comment on other aspects of the center.

Part II dealt with candidates' attitudes toward the individual feedback they received regarding their own assessment center performance. It was mailed out to candidates who had elected to receive feedback. Part II contained five Likert-type items (1-5), which were rated as described above, and four open-ended items (6-9). Likert-type items in Parts I and II were counterbalanced to correct for response set. A copy of the Candidate Questionnaire, Parts I and II, with item statements, appears in Appendix C.

The investigators interpreted a response to an item as agreement with that item if the mean rating was greater than 3.5 and as disagreement if it was less than 2.5; otherwise the response was placed in the "Neither Agree Nor Disagree" category. The open-ended questions (16-23 and 6-9) for Parts I and II, candidate responses and comments, and the percentage of candidates who indicated each response appear in Appendix E. Percentages may not necessarily sum to 100%, since a respondent may have indicated more than one response.
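To make the scoring and interpretation rules concrete, the sketch below shows one way such item statistics could be computed: responses to negatively worded (counterbalanced) items are reverse-scored so that 5 is always the response most favorable to the assessment center, and each item mean is then classified with the 3.5/2.5 thresholds described above. The sample responses and function names are illustrative assumptions, not the Center's actual data or computations.

```python
# Illustrative only: one way to score counterbalanced Likert items and apply
# the report's interpretation rule (mean > 3.5 = agreement, < 2.5 = disagreement,
# otherwise "Neither Agree Nor Disagree").  The responses below are made up.
from statistics import mean, stdev

SCALE = {"Strongly Disagree": 1, "Disagree": 2,
         "Neither Agree Nor Disagree": 3, "Agree": 4, "Strongly Agree": 5}

def item_score(response, negatively_worded=False):
    """Score a response 1-5; reverse negatively worded items so that a
    higher score is always more favorable to the assessment center."""
    raw = SCALE[response]
    return 6 - raw if negatively_worded else raw

def classify(mean_rating):
    """Apply the report's interpretation thresholds to an item mean."""
    if mean_rating > 3.5:
        return "agreement"
    if mean_rating < 2.5:
        return "disagreement"
    return "neither agree nor disagree"

# Hypothetical responses to item 4 ("Assessment center facilities were less
# than adequate"), a negatively worded item that is therefore reverse-scored.
responses = ["Agree", "Neither Agree Nor Disagree", "Disagree",
             "Strongly Disagree", "Agree"]
scores = [item_score(r, negatively_worded=True) for r in responses]

m, sd = mean(scores), stdev(scores)
print(f"mean = {m:.2f}, sd = {sd:.2f} -> {classify(m)}")   # mean = 3.20, sd = 1.30 -> neither
```

Reverse-scoring before averaging is what makes the counterbalanced wording comparable across items, so the thresholds read directly as agreement or disagreement regardless of an item's polarity.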
Candidate Questionnaire, Part I. Mean ratings and standard deviations for questions 1 through 15 appear in Table A, Appendix G. The standard deviation is an indication of how much variability there was in the ratings: the smaller the standard deviation, the more the ratings tended to cluster around the mean rating. The results of this analysis show that candidates were, in general, in agreement with the following statements:

- After the assessment center orientation, I felt that I was well informed about what to expect.
- I looked forward to participating in the assessment center process.
- I felt that the exercises were not too stressful.
- I found that directions for the exercises were not difficult to follow.
- The individuals who evaluated me in the assessment center were very capable.
- Conditions created in the assessment center were realistic.
- The assessment center approach is superior to other methods for selecting individuals with executive management potential.
- My experience in the assessment center was challenging, revealing, and rewarding.
- There was no problem with the scheduling of exercises.

Candidates neither agreed nor disagreed with the following questionnaire items:

- Assessment facilities were less than adequate.
- I was fairly evaluated in the assessment center.¹
- There are some managerial skills which should have been measured in the assessment center but were not.
- My performance in the assessment center accurately reflects the way I would perform in a real life situation.
- Higher management should place a great deal of weight on assessment center results.
- An individual's strengths and weaknesses were not accurately assessed in the assessment center.

None of the means for these items was below 3.0, the mid-point of the scale, indicating more of a tendency for candidates to agree than to disagree with an item. In addition, candidates may not have known how to respond to item 8 ("I was fairly evaluated in the assessment center"), as indicated by the mean rating near the midpoint and the relatively small standard deviation. They may not have been able to respond accurately to item 14 ("An individual's strengths and weaknesses were not accurately assessed in the assessment center") because they probably were not aware of how they were evaluated prior to feedback.

In summary, responses to the open-ended questions (16-23) were as follows: About 56% of the candidates felt that there were no reasons why they could not perform to the best of their ability in the assessment center. Some problems which were mentioned were time constraints (15%), unrealistic situations (8%), and conditions which were too stressful (7%). Approximately 44% of the candidates felt that there was nothing wrong with the assessment center facilities, while others (24%) thought that the rooms were too far from one another and still others (14%) felt that there was not enough working room and no place to relax. Roughly 61% of the candidates believed that the orientation given to them prior to the assessment center was sufficient; some felt that more information about the measurement devices and the skills which were assessed should be provided (11%), that more detailed information about the assessment center should be provided in the orientation and in the mail-outs which candidates received (6%), and that candidates should be informed in advance as to the effect the selection panel's decisions would have on them (4%).

A majority of the candidates (61%) stated that none of the exercises was too stressful; however, about one-third (36%) felt that the Leadership Exercise was overly stressful. Approximately 72% of the candidates thought that the exercise instructions were easy to follow, while some stated that the instructions for the Problem Analysis Exercise (14%) and the Leadership Exercise (6%) were difficult to follow. Most of the candidates agreed that the skills measured in the assessment center were appropriate.

In response to the question of whether or not candidates felt they were fairly evaluated, about 86% stated that they did not feel unfairly evaluated and about 13% stated that they did not know how they were evaluated. This finding seems to contradict the earlier one showing that candidates felt neither fairly nor unfairly evaluated.

¹ The intent of this question was to determine whether candidates felt they were objectively evaluated.
While approximately one-third of the candidates had no suggestions for improving the assessment center, some of the comments were: improve the facilities (11%), make the exercises more realistic (10%), have more timely feedback (6%), have less "down time" (6%), allow candidates time to get to know each other (6%), improve the scheduling of the exercises (6%), and reduce the time constraints (4%).

Candidate Questionnaire, Part II. Fifty-nine candidates elected to receive feedback, and this portion of the questionnaire was mailed to all 59 after they had received their feedback. Thirty-nine questionnaires were returned. Since the questionnaires were anonymous, a determination of their true representativeness could not be made. Mean ratings and standard deviations for questions 1 through 5 appear in Table B, Appendix G. The results of the analysis of candidates' responses to the feedback interview showed that candidates generally felt that:

- The feedback I received provided me with all the information I would have liked to receive.
- My performance in the assessment center was accurately reflected in the feedback interview.
- The feedback was effective in providing information that would aid me in self-development.
- The manner in which the feedback team gave feedback was excellent.
- Examples of areas in which my strengths and weaknesses lay were provided.

In summary, responses to the open-ended questions (6-9) were as follows: Approximately 36% of the candidates believed that no additional information needed to have been covered in the feedback session. About 18% would have liked to receive more specific information on ways in which they could improve themselves (specific courses, etc.). Others (15%) thought that more details on the ultimate selection criteria used by the final selection panel should have been provided to them, and roughly 13% wanted more detailed information on areas in which they could improve themselves.

A majority (62%) of questionnaire respondents stated that the feedback team's handling of the interview was fine or had no suggestions for how the team could have better handled the interview. Some comments were: the team should have been more specific on candidates' strengths and weaknesses (8%), assessors should have been present to clarify their reports (8%), candidates should have been allowed to see their reports (8%), and the team should have been less defensive on behalf of the assessment center (5%).

While approximately 72% of the candidates stated that their performance in the assessment center was accurately reflected in the feedback, some candidates (8%) felt that it was not. Some (8%) felt that certain skill ratings were based on very limited data, and others (5%) felt that the assessment center was too much of a game. Comments and suggestions on how feedback could be improved were: place more emphasis on recommendations for development (10%), have assessors present at the feedback session to critique candidates (10%), make the feedback more timely (8%), and provide videotaped feedback (5%). About 41% had no comments or suggestions.

Assessor Opinion Survey Analysis

Like the Candidate Questionnaire, this instrument had two parts. It was administered after each assessor completed his/her phase of the assessment center, and assessors were provided with envelopes in which to mail their completed questionnaires. Of the 20 assessors, 12 responded. Part I of the instrument dealt with assessors' opinions about the operational assessment center.
There were 14 Likert-type items and 9 open-ended questions (15-23) in this part. Part II of the questionnaire was designed to elicit assessors' opinions about assessor training and consisted of 6 Likert-type items and 5 open-ended questions (7-11). The scoring and interpretation of the Likert-type items were the same as those described for the Candidate Questionnaire, and these items were also counterbalanced. Parts I and II of the Assessor Opinion Survey, with item statements, appear in Appendix D. The open-ended questions (15-23 and 7-11) for both parts, assessor responses and comments, and the percentage of assessors who indicated each response appear in Appendix F. Percentages do not necessarily sum to 100% for each question.

Assessor Opinion Survey, Part I. Mean ratings and standard deviations for questions 1 through 14 appear in Table C, Appendix G. Assessors generally agreed with these statements:

- The assessment center process is effective in assessing a participant's strengths and weaknesses.
- A candidate's potential for executive management is objectively measured by the assessment center.
- Assessment center facilities were less than adequate.
- Instructions for the various exercises were easy to follow.
- The assessment center process is effective in aiding management to recognize an individual's strengths and weaknesses.
- Higher management should place more weight on assessment center results.
- The scheduling of assessors for the operational assessment center could not have been handled better.
- I was confident in the accuracy of my overall judgment of a candidate's potential.
- I was confident in the accuracy of my assessment of individual skill ratings.
- A participant should place much reliance on assessment center results.
- I would not like to serve as an assessor again.
- A participant's supervisor should place much reliance on assessment center results.

On the average, assessors neither agreed nor disagreed with the statement that "higher management does not place much weight on assessment center results"; some assessors may have felt that they were not aware of how much weight actually is placed on the results. The relatively small standard deviations for a number of items (e.g., .51 to .94) show that assessors were largely in agreement with one another on these items.

In summary, assessors' responses to the open-ended questions (15-23) were as follows: Assessors cited numerous advantages to serving as an assessor, the most prevalent being that it allows one to sharpen his/her skills in evaluating the performance of subordinate managers (75%). Several disadvantages were also noted: it required too much time away from the office (50%), too much night work (17%), and a significant investment on the part of assessors (17%).

Assessors felt that the following skills were most accurately measured in the assessment center: Skill 1 (67%), Skill 3 (50%), Skill 5 (33%), Skill 7 (33%), Skill 8 (33%), Skill 9 (33%), Skill 6 (25%), Skill 2 (17%), Skill 4 (17%), and Skill 10 (17%). Several reasons were cited as to why these skills were thought to be the most accurately measured: numerous opportunities were provided for candidates to demonstrate some of these skills, there was little opportunity to "fake it," some skills cannot be exhibited or measured well in the normal setting, most of the definitions were clear and there were enough data to permit easy determinations, the exercises brought out certain skills more distinctly, certain skills were easily observable, and there was tangible evidence of a skill in the products prepared by the candidates.
The skills that assessors felt were most important in terms of their relative contribution to the overall assessment center rating were, in rank order: Skills 3; 1; 6; 8; 7; 2 and 11 (tied); 10; and 4, 5, 9, and 12 (tied).

Two-thirds of the assessors thought that the Administrative Exercise contributed the most to the assessment center. They stated that it was more representative of management experiences, could be more easily evaluated, offered excellent opportunities to observe a broad range of skills, forced candidates to try if they were to compete, assessed skills that were clearly observable, had breadth and similarity to real world problems, and was standardized for all candidates. Two-thirds of the assessors stated that the Leadership Exercise contributed the least to the assessment center. They stated that it was unrealistic, that comparisons among candidates were difficult to make due to the unstandardized performance of the role players, that the objective of the exercise was not clear and there was no viable solution to the problem, and that the breadth of observable skills was comparatively limited.

Assessors felt that certain skills should be combined because they were so similar and that some should be better defined. One assessor felt that "motivation" should be measured, and another that Skill 11 should be excluded because there was not enough time in which a candidate could demonstrate it well.

Some suggestions for improving the assessment center were: provide better physical facilities, improve the rating forms, improve or provide better exercises designed to elicit Skills 2, 3, 7, and 12, improve the directions for the exercises, decrease the emphasis on Skill 2, provide guidelines on what to observe in some of the exercises, clarify the goal of the Leadership Exercise and make the role players more standardized, have assessors discuss each exercise prior to making individual write-ups, and use a residential site. The area where the most improvement was needed, as indicated by the percentage of assessor comments, was the physical facilities: the rooms were too crowded and uncomfortable, and there was not enough room for writing up reports or privacy for conducting interviews.

Assessor Opinion Survey, Part II. Mean ratings and standard deviations for questions 1 through 6 appear in Table D, Appendix G. Generally, assessors felt that:

- The quality of assessor training was good.
- Assessor training facilities were more than acceptable.
- The location of the assessor training site created no undue problems.
- Assessors were adequately trained on each of the exercises.
- Assessor training was not too long.
- There were enough opportunities to ask questions during training.

The following summarizes the responses to the open-ended questions (7-11): Assessors felt that they could use some additional training in these skill areas: Skill 2 (17%), Skill 3 (17%), Skill 7 (17%), Skill 8 (17%), Skill 9 (17%), Skill 10 (17%), Skill 1 (8%), Skill 4 (8%), and Skill 5 (8%).
Although a number of assessors (4%) indicated that no additional practice was required, some said they would like more practice in certain skill areas (Skills 1, 8, 9, and 10), in writing up the final reports, in the actual assessment center process, and in all areas in general. One assessor felt that the introduction to assessor training should be shortened, while the remaining assessors suggested that no portions of the training be either excluded or shortened. Half the assessors indicated that training should not be increased and that nothing should be added to it. Other assessors believed that the following should be included in assessor training: each assessor should go through the assessment center, more time should be spent defining and illustrating the various skills, an opportunity should be provided to observe the exercises with the developer of the exercise or a trained assessor, more time should be spent discussing the exercises and the ratings, more practice should be given in observing and recording behavior, and more training should be done with the team to which an assessor would be assigned for the assessment center.

Some comments and suggestions assessors made for improving assessor training were: a social hour should be held to encourage mingling and discussion, "the training was adequate -- it was just fine now," a larger training room should be obtained, the role players should be properly trained, more structure should be provided in setting up assessor teams for training ("don't keep the same teams too long"), better examples of final reports should be made available for training purposes, and preparation material should be provided prior to training.

Summary of Questionnaire Findings

Candidate responses. The Candidate Questionnaire analysis showed that candidates generally were in favor of the assessment center process. According to the results of the analysis, the majority of the candidates felt that after the assessment center orientation they were well informed about what to expect, they looked forward to participating in the assessment center process, they found the directions for the exercises easy to follow, the individuals who evaluated them in the assessment center were very capable, they felt that they were fairly evaluated, the skills measured were appropriate, conditions created in the assessment center were realistic, the assessment center approach is superior to other methods for selecting individuals with executive management potential, their experience in the assessment center was challenging, revealing, and rewarding, and there was no problem with the scheduling of exercises.

While the majority of the candidates felt that there were no reasons why they could not perform to the best of their ability in the assessment center, some felt that they were hindered by time constraints and by situations which were unrealistic and too stressful. Most candidates did not find the exercises overly stressful; however, some felt that the Leadership Exercise was too stressful. There seemed to be some disenchantment with the physical facilities at the assessment center site. This problem probably could not have been avoided, because government facilities had to be used and this was the only site which even came close to meeting the space requirements.
In terms of the feedback they received, the majority of candidates thought that their performance was accurately mirrored in the interview, the feedback provided information that was useful for self-development purposes, the manner in which the feedback team gave feedback was excellent, and examples of areas in which a candidate's strengths and weaknesses lay were provided. A few candidates would have preferred more specific suggestions for improvement and development.

Assessor responses. The Assessor Opinion Survey analysis revealed that assessors also favored the assessment center process. The results of this analysis disclosed that the majority of assessors felt that the assessment center is effective in assessing a candidate's strengths and weaknesses, a candidate's potential for executive management is objectively measured by the assessment center, assessment center facilities were less than adequate, instructions for the various exercises were easy to follow, the process is effective in aiding management to recognize an individual's strengths and weaknesses, higher management should place more weight on assessment center results, the scheduling of assessors was handled well, they were confident in the accuracy of their individual and overall ratings, a participant and his/her supervisor should place much reliance on assessment center results, and they would not like to serve as assessors again. Most of the explanations assessors gave for not wanting to serve again were that the work was too hard and too much time was involved.

Assessors also noted that there were numerous advantages and disadvantages to serving as an assessor, that some skills were more accurately measured than others (Skills 1 and 3), and that some contributed more to the overall rating of candidate ability than others (Skills 1, 3, 6, 7, and 8). The Administrative Exercise apparently contributed the most to the assessment center and the Leadership Exercise the least. Assessors seemed to agree that the skills measured were appropriate, but that some should be collapsed or better defined. To improve the assessment center, assessors suggested that better physical facilities be provided, the rating forms be improved, the exercises and the directions be improved or clarified, guidelines be provided on what to observe in the exercises, the goal of the Leadership Exercise be clarified, and the "assistants" be trained to be more standardized in their behavior.

Assessors felt that the quality of assessor training was good, the facilities were more than acceptable, the training site location created no undue problems, the training was not too long, and there were enough opportunities to ask questions during training. There were some skill areas in which they indicated they needed more training and practice. Practically all the assessors stated that nothing should be deleted from training. Suggestions for improving assessor training included spending more time defining and illustrating the various skills, discussing the exercises and the ratings, and having more practice in observing and recording behavior.

Recommendations

On the basis of the questionnaire analyses, the following changes in the assessment center are suggested:

1. Reevaluate the use of the Leadership Exercise. If the Leadership Exercise is to be used in the future, make the "assistants'" roles more standardized across candidates.

2. Obtain better physical facilities for the operational center.
3. Define the skills more carefully and provide more examples of behavior for each one.

4. Spend more time in assessor training discussing the exercises and the ratings, and provide more practice in observing and recording behavior.

5. Continue the assessment center feedback process, but with greater emphasis on agency support in assisting in a career development program for each of the candidates.

APPENDIX A

Skill List Definitions

SKILL 1: Ability to make a persuasive, clear presentation of ideas or facts. Is effective in individual or group situations.

SKILL 2: Ability to perceive the point of view and sensitivities of others and to respond with understanding.

SKILL 3: Ability to lead a group to accomplish a task and to get ideas accepted. Gets subordinates to want to do their best.

SKILL 4: Ability to modify behavioral style and approach to reach a goal. Adjusts quickly to changes and meets varying organizational demands and pressures.

SKILL 5: Self-starter to initiate action. Makes active efforts to influence events.

SKILL 6: Effective in seeking out pertinent data and determining the source of a problem. Takes all relevant considerations into account.

SKILL 7: Ability to make quality decisions in a reasonable period of time.

SKILL 8: Ability to plan and organize own activities effectively and to establish well defined work objectives and priorities for accomplishing them.

SKILL 9: Ability to use subordinates effectively and to understand where a decision can best be made.

SKILL 10: Appreciation of needs for controls over processes.

SKILL 11: Ability to express ideas clearly in writing in good grammatical form.

SKILL 12: Ability to work effectively within a group and hold group goals above individual needs.

APPENDIX B

August 1974

FEDERAL EXECUTIVE DEVELOPMENT PROGRAM (FEDP II)
SUPERVISORY APPRAISAL OF POTENTIAL FOR TOP-LEVEL MANAGEMENT

The objective of the Federal Executive Development Program (FEDP) is to provide an opportunity for selected managers at grade GS-15 to prepare for executive responsibilities in the Federal service through a program of training and interagency work experience. The eventual executive responsibilities may include leadership in program areas or staff organizations. Through individually planned work assignments and formal training designed to sharpen managerial skills, the one-year program will improve the capability of individuals to assume important leadership roles in the future. Planned exposure to new roles in different organizations will provide program experiences to the participants and opportunities to evaluate their capacity to operate as executives. In 1974-75, thirty-five outstanding managers with demonstrated executive potential will be selected for participation in this program.

The purpose of this evaluation is to assess the potential of applicants to be executive managers in the Federal system. Your evaluation, as a first- or second-line supervisor of an applicant, will be used along with other information to select the final group of participants.

INSTRUCTIONS

Evaluate the applicant on each of the listed characteristics and then give us your best overall assessment of his* potential for executive management responsibilities. Also give your estimate of the value the FEDP experience might have for the applicant. Your evaluations should reflect only your personal judgment, so please do not discuss your ratings with others.

* Masculine pronouns are used in this form simply to avoid repetition of awkward and space-consuming locutions.
They in no way connote a judgment that prospective FEDP candidates should be male.

(1) APPLICANT'S NAME (LAST, FIRST, MI)
(2) APPLICANT'S SSN
(3) APPLICANT'S POSITION TITLE AND SERIES
(4) AGENCY, BUREAU, DIVISION
(5) RATER'S NAME (LAST, FIRST, MI)
(6) RATER'S POSITION TITLE
(7) NO. MONTHS SUPERVISED APPLICANT
(8) DATE OF APPRAISAL

CHECK ONE:
[ ] First-level Supervisor of Applicant
[ ] Second-level Supervisor of Applicant

Following is a list of characteristics which are related to success in some or all managerial positions. From your observation of the applicant's behavior, decide how well each statement describes him and place a check in the appropriate column. Then, in the column at the extreme right, place an X after any characteristic which you believe could be augmented by the applicant's participation in the FEDP. If you have not had an opportunity to observe the applicant on a particular characteristic, check the "Not Observed" column.

Rating columns (for each characteristic): Highly Descriptive / Quite Descriptive / Somewhat Descriptive / Only Slightly Descriptive / Not Descriptive / Not Observed / FEDP

1. Has in-depth understanding of the social, political, and economic forces that affect his program.
2. Has an excellent knowledge of the way Government works -- of people, policies, organizations, and missions.
3. Has a good understanding of the Federal personnel, budget, and contract and procurement systems.
4. Is thoroughly informed about agency goals and operations, and is knowledgeable about the internal agency organization.
5. Understands how the work of other parts of the agency impinges on the work of his particular unit and knows when and with whom to coordinate.
6. Is fully abreast of developments in labor-management relations, and understands the role of unions and other employee organizations in his agency.
7. Is both realistic and innovative in making long-range program plans.
8. Recognizes when reorganization is desirable and is able to devise organizational structures which meet current needs.
9. Establishes and uses controls to make certain work is proceeding properly.
10. Adjusts readily to changes in mission, policy, organization, and personnel.
11. Works effectively with peers in task forces or in other group settings.
12. Neither too pliant nor too rigid.
13. Makes quality decisions.
14. Makes decisions readily but not without making sure of the facts.
15. Foresees problems and takes necessary action to prevent their becoming critical.
16. Broad-gauged in his approach to problems.
17. Can handle a lot of different problems at the same time.
18. Perceives the point of view and sensitivities of others and responds with understanding.
19. Recognizes the importance of staff development and sees that his subordinates have appropriate developmental opportunities.
20. Delegates appropriately -- does not try to keep total control over everything himself, but at the same time insures that important matters are in competent hands.
21. Is knowledgeable about modern developments in data processing to the extent of being able to see applications of data processing in his work.
22. Understands the importance of good relations with the press, the public, and the Congress.
23. Plans effectively for the accomplishment of large-volume work projects.
24. Follows through to make sure that work is on schedule.
25. Sets priorities effectively.
26. Is effective in thinking of new ideas and solutions.
27. Is reliable -- you can depend on what he says.
28. Faces up to unpleasant problems and situations.
29. Gets along well with people from a variety of backgrounds.
30. Doesn't let personal antagonisms or "turf" conflicts get in the way of program accomplishment.
31. Is good in "selling" his ideas and persuading people.
32. Is open-minded -- listens to the ideas of others and is willing to change his views.
33. Good at recognizing the key parts of complex problems -- doesn't get lost on minor points or overlook important considerations.
34. When things go wrong, he works to fix them instead of making excuses or trying to shift the blame.
35. Motivates people who work for him to want to do their best.
36. Has an impressive presence and manner -- commands respect as a representative of the agency.
37. Communicates very effectively in informal situations.
38. Is an effective speaker -- both in formal addresses and in question-and-answer sessions.
39. His writing is clear, correct, and well organized.
40. In adversary situations, is able to negotiate effectively to reach an acceptable solution.
41. Has a high level of energy.
42. Has a realistic perception of his own strengths and weaknesses.
43. Self-starting to influence events.
44. Performs well under stress and pressure.
45. Knows when to touch base with his superiors before acting and keeps them informed.
46. Cooperative in situations when a decision has been reached with which he did not agree.

I. Overall, how would you evaluate the applicant's potential for assuming top-level managerial responsibilities (i.e., at the GS-16 level or above) in the reasonably near future?

[ ] Has outstanding potential for executive management.
[ ] Has very good potential for executive management.
[ ] Has good potential for executive management.
[ ] Potential for executive management is marginal.
[ ] Not a potential executive manager.

II. To what extent do you believe the applicant would benefit from participation in the FEDP?

[ ] He is so effective now that I can't see what FEDP would do for him except provide a showcase.
[ ] FEDP would provide the finishing touches to prepare him for broader responsibilities.
[ ] FEDP would be very useful to him, but he will need further managerial experience beyond FEDP before he is ready for top-level responsibilities.
[ ] FEDP would benefit him, but I do not see him ever becoming a first-rate manager at the GS-16 level or above.
[ ] FEDP would be of no real help to him.

Date ______________________    Rater's Signature ______________________

APPENDIX C

CANDIDATE QUESTIONNAIRE, PART I (Form Q3)

Please answer the following questions about the Federal Executive Development Program Assessment Center by placing an "X" in the column corresponding to your rating for a particular item. This questionnaire will aid us in evaluating the assessment center process, and the results will be kept strictly confidential. Your cooperation is greatly appreciated. Your name is not required on this form. Please return this form before you leave.

Response columns (for each item): Strongly Disagree / Disagree / Neither Agree Nor Disagree / Agree / Strongly Agree

1. After the assessment center orientation, I felt that I was well informed about what to expect.
2. I looked forward to participating in the assessment center process.
3. I felt that some exercises were too stressful.
4. Assessment center facilities were less than adequate.
5. I found that directions for the exercises were difficult to follow.
6. The individuals who evaluated me in the assessment center were very capable.
7. Conditions created in the assessment center were too unrealistic.
8. I was fairly evaluated in the assessment center.
9. The assessment center approach is superior to other methods for selecting individuals with executive management potential.
10. There are some managerial skills which should have been measured in the assessment center but were not.
11. My performance in the assessment center accurately reflects the way I would perform in a real life situation.
12. My experience in the assessment center was challenging, revealing, and rewarding.
13. Higher management should place a great deal of weight on assessment center results.
14. An individual's strengths and weaknesses were not accurately assessed in the assessment center.
15. The scheduling of exercises left something to be desired.
16. Were there any reasons why you felt you could not perform to the best of your ability? ________
17. What, if anything, was wrong with assessment center facilities? ________
18. Should anything else have been covered in the orientation? ________
19. Which, if any, exercises were too stressful? ________
20. Which, if any, exercise instructions were difficult to follow? ________
21. What skills, if any, should have been measured? ________
22. If you felt you were not fairly evaluated, why do you feel this way? ________
23. How could the assessment center be improved? ________

CANDIDATE QUESTIONNAIRE, PART II (Form Q3)

Please answer the following questions concerning the feedback from the Federal Executive Development Program Assessment Center by placing an "X" in the appropriate column corresponding to your rating for a particular item. The questionnaire results will be kept strictly confidential and will be used to help us evaluate the feedback process. Your name is not required. Please return the completed questionnaire to:

Dale R. Baker
Room 3223
Personnel Research and Development Center
U.S. Civil Service Commission
1900 E Street, N.W.
Washington, D.C. 20415

Your cooperation is greatly appreciated.

Response columns (for each item): Strongly Disagree / Disagree / Neither Agree Nor Disagree / Agree / Strongly Agree

1. The feedback I received did not provide me with all the information I would have liked to have had.
2. My performance in the assessment center was accurately reflected in the feedback interview.
3. The feedback was effective in providing information that would aid me in self-development.
4. The manner in which the feedback team gave feedback was excellent.
5. Examples of areas in which my strengths and weaknesses lay were not provided.
6. What additional information should have been covered in the feedback session? ________
7. How could the feedback team have better handled the interview? ________
8. In what ways, if any, do you feel your performance was not accurately reflected in the feedback? ________
9. In what other ways could feedback be improved? ________

APPENDIX D

ASSESSOR OPINION SURVEY, PART I (Form Q2)

Please answer the following questions about the FEDP II assessment center by placing an "X" in the column corresponding to your rating. This questionnaire will be useful in helping us to evaluate the assessment center process. Questionnaire responses will be kept strictly confidential. It is not necessary to sign your name to this form.

Response columns (for each item): Strongly Disagree / Disagree / Neither Agree Nor Disagree / Agree / Strongly Agree

1. The assessment center process is effective in assessing a participant's strengths and weaknesses.
2. A candidate's potential for executive management is not objectively measured by the assessment center.
3. Assessment center facilities were less than adequate.
4. Instructions for the various exercises were easy to follow.
5. The rating procedures were difficult to comprehend.
6. The assessment center process is effective in aiding management to recognize an individual's strengths and weaknesses.
7. Higher management does not place much weight on assessment center results.
8. Higher management should place more weight on assessment center results.
9. The scheduling of assessors for the operational assessment center could have been handled better.
10. I was confident in the accuracy of my overall judgment of a candidate's potential.
11. I was not confident in the accuracy of my assessment of individual skill ratings.
12. A participant should not place too much reliance on assessment center results.
13. I would like to serve as an assessor again.
14. A participant's supervisor should not place too much reliance on assessment center results.
15. What advantages are there to serving as an assessor? ________
16. Disadvantages? ________
17. What skills do you feel were most accurately measured in the assessment center? WHY? ________
18. Rank the skills you feel are most important in terms of their relative contributions to the overall assessment center rating. ________
19. What exercises contributed most to the assessment center? WHY? ________
20. The least? WHY? ________
21. Should any skills be included or excluded in the assessment center? WHY? ________
22. What should be done to improve the assessment center? ________
23. Comments: ________

ASSESSOR OPINION SURVEY, PART II (Form Q2)

Response columns (for each item): Strongly Disagree / Disagree / Neither Agree Nor Disagree / Agree / Strongly Agree

1. The quality of assessor training was good.
2. Assessor training facilities were not acceptable.
3. The location of the assessor training site created undue problems.
4. I was adequately trained on each of the exercises.
5. Assessor training was too long.
6. There were enough opportunities to ask questions during training.
7. In which of the twelve assessment center skill areas do you feel you could use some more training? In what other areas? ________
8. In what areas do you feel you need more practice? ________
9. What portions of assessor training should be excluded or shortened? ________
10. Added or increased? ________
11. How do you think assessor training could be improved?

APPENDIX E

Candidate Questionnaire, Part I

The open-ended questions (16-23), candidate responses and comments, and the percentage of those candidates who indicated each response appear below. (N = 72.)

16. Were there any reasons why you felt you could not perform to the best of your ability?
    55.56  Time constraints
    15.28  No
     8.33  Situations too unrealistic
     6.94  Conditions too stressful
     4.17  Some instructions weren't clear
     9.72  Other (one exercise assumed accounting/budget experience, jet lag led to fatigue on the first day, etc.)

17. What, if anything, was wrong with assessment center facilities?
    44.44  Nothing
    23.61  Rooms were too scattered and elevators were too slow
    13.89  Not enough working room and no place to relax
    18.06  Other (rooms were too stuffy, there was too much down time, too many distractions, etc.)

18. Should anything else have been covered in the orientation?
    61.11  No
    11.11  More information about the measurement devices and the skills assessed
     5.56  Provide more detailed information in the orientation and in the mail-out
     4.17  What will happen to those who make the program and to those who don't?
    18.06  Other (background of assessors and candidates, how the assessment center results are used later, more information on the facilities, etc.)

19. Which, if any, exercises were too stressful?
    61.11  None
    36.11  Leadership Exercise
     2.78  Other (Administrative and Problem Analysis Exercises)

20. Which, if any, exercise instructions were difficult to follow?
    72.22  None
    13.89  Problem Analysis Exercise
     5.56  Leadership Exercise
     8.33  Other (Administrative and Group Discussion Exercises, research collection forms, etc.)

21. What skills, if any, should have been measured?
    47.22  No others
     9.72  Oral and written communication
     9.72  Decision making
     5.56  Dealing with people, group interaction
     5.56  Planning and organizing (coordinating, directing, etc.)
     5.56  Critical managerial skills rather than energy level
     4.17  Motivation and development of subordinates
    12.50  Other (reaction to public and political pressure, formal presentations, etc.)

22. If you felt you were not fairly evaluated, why do you feel this way?
    86.11  Did not feel unfairly evaluated
    12.50  Did not know how I was evaluated
     1.39  Other

23. How could the assessment center be improved?
    33.33  No suggestions
    11.11  Improve the facilities (have a central location, have a sign in the lobby, clocks in the rooms, more privacy for individual problems)
     9.72  The situations need to be more realistic and relevant to the organization
     5.56  Feedback should occur the week after assessment
     5.56  Have less down time
     5.56  Allow candidates to get to know each other
     5.56  Improve the scheduling of exercises (alternate group and individual exercises, begin with a group exercise, provide breaks between exercises)
     4.17  Reduce time constraints
    19.44  Other (it was good for a two day center, let assessors get to know candidates, provide more of an orientation on the concepts involved and a list of skills to be measured, make the exercises less stressful, notify candidates of time changes on the exercises, etc.)

Candidate Questionnaire, Part II

Open-ended questions (6-9), candidate responses and comments, and the percentage of those candidates who indicated each response appear below. (N = 39.)

6.  What additional information should have been covered in the feedback session?
    35.90  Nothing else
    17.95  Methods or ways in which I can improve myself
    15.38  More details on the ultimate selection criteria used by the selection panel
    12.82  Areas in which there was room for improvement
     7.69  My position relative to other candidates
     5.13  There was a tendency to gloss over a person's shortcomings -- more candidness is needed
     5.13  I was satisfied with everything
     7.69  Other (I should have been able to review the report on me and make comments, weak points could have been provided in writing, etc.)

7.  How could the feedback team have better handled the interview?
    38.46  No suggestions
    23.08  The team's handling was fine
     7.69  They should have been more specific on strengths and weaknesses
     7.69  Assessors should have been present to clarify candidate reports
     7.69  Allow the candidates to see their reports -- the information covered was too voluminous to take notes on
     5.13  Less defensiveness on the part of the assessment center
    10.26  Other (tape the session for later review, all the members of the review team should have been present, etc.)

8.  In what ways, if any, do you feel your performance was not accurately reflected in the feedback?
    71.79  My performance was accurately reflected
     7.69  My performance was not accurately reflected (my background and experience did not appear to be adequately considered, I felt I was misjudged by assessors when I took a stand I believed to be correct, etc.)
     7.69  Some of the skill ratings were based on very limited data
     5.13  The assessment center was too much of a game -- it was pointless and superficial
     7.69  Other (feedback should have been more in depth on a person's performance and in writing, etc.)

9.  In what other ways could feedback be improved?
    41.03  No suggestions
    10.26  Place more emphasis on recommendations for development -- specific courses
    10.26  Assessors should be present at feedback session and critique candidate
    10.26  The feedback was the most satisfactory part of the whole process
     7.69  Feedback should have been more timely
     5.13  Video taped feedback
    15.38  Other (inform participants of their ratings in relation to others, the feedback session deserved more attention than it got, etc.)

APPENDIX F

Assessor Opinion Survey, Part I

Open-ended questions (15-23), assessor comments and responses, and the percentage of those assessors who indicated each response appear below. (N = 12.)

15. What advantages are there to serving as an assessor?
    75.00  It allows one to sharpen skills in evaluating the performance of subordinate managers
    25.00  Other (it's good training and experience, it changes the way I will appraise others)

16. What disadvantages are there to serving as an assessor?
    50.00  It requires too much time away from the office
    16.67  None
    16.67  Too much night work
    16.67  It requires a significant investment
    16.67  Other (hard work, disrupts normal work and personal life)

17a. What skills do you feel are most accurately measured in the assessment center?
    66.67  Skill 1
    50.00  Skill 3
    33.33  Skill 5
    33.33  Skill 7
    33.33  Skill 8
    33.33  Skill 9
    25.00  Skill 6
    16.67  Skill 2
    16.67  Skill 4
    16.67  Skill 10
     8.33  Other

17b. Why?
    Skills 1, 2, 3, 6 and 7 -- the exercises were designed to allow sufficient opportunities to demonstrate these skills.
    Skill 1 -- exercises forced candidates to formulate thought sequences, organize these and express them, with little opportunity to "fake it"; it was easily observable.
    Skills 3, 6, 7, and 12 -- I would not select anyone who was low in these areas.
    Skills 1, 3, and 7 -- cannot be exhibited or measured well in the normal setting; most of the skills had quite clear definitions and there was enough data to permit easy determinations.
    Skill 8 -- there was more tangible evidence in products prepared by the candidates.
    Skills 2, 5, and 6 -- the exercises brought out these skills more distinctly.
    Skill 4 -- both time and situations combined to produce tension.

18. Rank the skills you feel are most important in terms of their relative contribution to the overall assessment center rating. (Median ranks were calculated.)
    1.  Skill 3
    2.  Skill 1
    3.  Skill 6
    4.  Skill 8
    5.  Skill 7
    6.  Skill 2
    7.  Skill 11
    8.  Skill 10
    9.  Skill 4
    10. Skill 5
    11. Skill 9
    12. Skill 12

19a. What exercises contributed most to the assessment center?
    66.67  Administrative Exercise
    50.00  Leaderless Group Discussion Exercises
    25.00  Problem Analysis Exercise
    16.67  Leadership Exercise

19b. Why?
    Administrative Exercise, Problem Analysis, and Leaderless Group Discussion -- these are more representative of management experiences and can be more easily evaluated; offered good opportunities to observe a broad range of skills.
    Administrative Exercise and Leaderless Group Discussion (assigned role) -- forced candidates to try if they were to compete; the skills being assessed were clearly observable.
    Administrative Exercise -- contributed most because of its breadth and similarity to real world problems; was standardized for all candidates and provided many opportunities for skills to be demonstrated.
    Leaderless Group Discussions -- surfaced the leaders, showed interactions in two different group situations.
    Leadership Exercise -- "separated the sheep from the goats"; exposed candidate's ability to function with uncooperative assistance -- very stressful.

20a. What exercises contributed least to the assessment center?
    66.67  Leadership Exercise
    16.67  Problem Analysis Exercise
    25.00  Other (irrelevant responses)

20b. Why?
    Leadership Exercise -- it is unreal, the assistants should have been trained to be more consistent in their roles; not a reliable indicator of behavior; not sure of objective, structured to confuse candidate, very unreal; needs restructuring; a bit contrived; relative comparisons among candidates were difficult to make with inconsistent role-playing; no viable solution was possible; valuable exercise but breadth of observable skills is the most limited in this exercise.
    Problem Analysis -- rather fuzzy, interview process created problems; instructions confusing; directions poor, almost consistently poor performance, and it did not reveal any significant skill.

21a. Should any skills be included or excluded in the assessment center?
    16.67  None should be excluded, no others included
    The following skills should be combined:
    16.67  Skills 3 and 9
     8.33  Skills 2 and 12
     8.33  Skills 5 and 12
     8.33  Skills 9 and 10
    The following should be defined better:
     8.33  Skill 5
     8.33  Skill 4
     8.33  Skill 7
     8.33  Skill 12
     8.33  Include motivation
     8.33  Exclude Skill 11
     8.33  Decision making should be more readily observable
    16.67  Other

21b. Why?
    Skill 11 -- should be excluded because there was not enough time for candidates to demonstrate it well.
    Motivation -- should be included because effectiveness on the job is a function of ability times motivation.
    Skills 2 and 12 -- should be combined because they were closely related.
    Skill 7 -- should be made more readily observable as it can only be assessed after the results of decisions are observed; more opportunity should be given to make decisions.
22. What should be done to improve the assessment center?
    16.67  Provide better physical facilities (too crowded, uncomfortable, not enough places to write up reports, more privacy, improve cleanliness and attractiveness, use FEI, etc.)
     8.33  Improve rating forms (list only skills appropriate under each exercise, etc.)
     8.33  Improve exercises which are designed to measure Skill 4
     8.33  Decrease emphasis on Skill 2
     8.33  Improve instructions
     8.33  Provide better exercises to elicit Skills 3, 7, and 12
     8.33  Provide guidelines on what to observe in some of the exercises
     8.33  Clarify the goal of the Leadership Exercise
     8.33  Provide secretarial assistance to take dictation
     8.33  Assessors should discuss each exercise to insure common understanding before making individual write-ups
     8.33  Make role players in Leadership Exercise more consistent
     8.33  Have a residential setting
     8.33  In the Problem Analysis Exercise, the assessor who writes the report should not be the role player
     8.33  Nothing

23. Comments:
    41.67  None
    16.67  A difficult job was handled very professionally
    16.67  The reliability and validity of the Leadership Exercise is questionable
     8.33  Any exercise needs careful testing prior to use
     8.33  There is some difficulty in defining Skills 4 and 7
     8.33  Decrease the work level
     8.33  The candidate ranking exercise (for research purposes only) took too much time away from the actual evaluation process and the results would be questionable as we were not observing all six candidates at once
     8.33  The assessment center is the finest thing I've seen to accomplish its purpose, it does its job extremely well, but it is not cost effective
     8.33  How can we establish mutually exclusive skills?
     8.33  Are the exercises valid tests of managerial skills -- would candidates perform differently in the real world?

Assessor Opinion Survey, Part II

Open-ended questions (7-11), assessor responses and remarks, and the percentage of assessors who indicated each response appear below.

7a. In which of the twelve assessment center skill areas do you feel you could use some more training?
    16.67  Skill 2
    16.67  Skill 3
    16.67  Skill 7
    16.67  Skill 8
    16.67  Skill 9
    16.67  Skill 10
     8.33  Skill 1
     8.33  Skill 4
     8.33  Skill 5
    50.00  None

7b. In what other areas?
   100.00  None

8.  In what areas do you feel you need more practice?
    16.67  Skill 8
    16.67  Skill 10
     8.33  Skill 1
     8.33  Skill 9
     8.33  All areas
     8.33  Writing up individual reports -- how to combine them into the final report
     8.33  The actual assessment center process
    41.67  None

9.  What portions of assessor training should be excluded or shortened?
     8.33  The introduction
    91.67  None

10. What portions of assessor training should be added or increased?
     8.33  Each assessor should go through the assessment center
     8.33  Defining and illustrating skills
     8.33  The opportunity to observe exercises with the developer of the exercise or a trained assessor
     8.33  How to write up reports
     8.33  More time should be spent discussing the exercises
     8.33  Spend more time in discussing ratings -- what is the standard for a "4"
     8.33  More practice in observing and recording behavior
     8.33  More training should be done with the team an assessor would be assigned to for the assessment of actual candidates
    50.00  None

11. How do you think assessor training could be improved?
    16.67  Provide a social hour to encourage mingling and discussion
    16.67  The training was adequate; it is just fine right now
     8.33  The training room was too small
     8.33  The role players should be properly trained
     8.33  Provide more structure in setting up assessor teams for training -- don't keep the same teams too long
     8.33  Provide better examples of final reports for training purposes
     8.33  Provide preparation material prior to training
     8.33  Require each assessor to go through the process
     8.33  Provide an opportunity to observe exercises with the developer of the exercises
     8.33  Provide more practice in observing and recording behavior

APPENDIX G

TABLE A
Mean Item Ratings and Standard Deviations for Part I of the Candidate Questionnaire

    Item Number    Mean Rating    Standard Deviation
         1            3.60              1.00
         2            4.18               .88
         3            3.78              1.05
         4            3.36               .97
         5            4.03               .79
         6            3.93               .79
         7            3.56               .99
         8            3.33               .58
         9            3.56               .85
        10            3.18               .84
        11            3.40              1.01
        12            4.21               .69
        13            3.49               .79
        14            3.35               .81
        15            3.51               .89

    Note: N = 72 for each item.

TABLE B
Mean Item Ratings and Standard Deviations for Part II of the Candidate Questionnaire

    Item Number    Mean Rating    Standard Deviation
         1            4.05              1.62
         2            3.79               .95
         3            3.59               .68
         4            3.74               .97
         5            3.69               .92

    Note: N = 39 for each item.

TABLE C
Mean Item Ratings and Standard Deviations for Part I of the Assessor Opinion Survey

    Item Number    Mean Rating    Standard Deviation
         1            4.58               .51
         2            4.17               .94
         3            2.15              1.16
         4            4.25               .45
         5            3.83              1.19
         6            4.50               .52
         7            3.17               .94
         8            3.92               .67
         9            4.08               .51
        10            4.50               .52
        11            4.08               .79
        12            4.08               .90
        13            2.50              1.24
        14            3.83              1.03

    Note: N = 12 for each item.

TABLE D
Mean Item Ratings and Standard Deviations for Part II of the Assessor Opinion Survey

    Item Number    Mean Rating    Standard Deviation
         1            4.17               .94
         2            4.33               .79
         3            4.08              1.08
         4            3.83              1.03
         5            4.08               .51
         6            4.00              1.04

    Note: N = 12 for each item.
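Note on computation: Tables A through D report, for each questionnaire item, the mean rating and standard deviation across respondents. The report does not state the numeric coding of the response categories or the exact formulas used, so the sketch below is only an illustration under two assumptions: responses are coded 1 (Strongly Disagree) through 5 (Strongly Agree), and the sample standard deviation is computed. The response values shown are hypothetical, not data from the FEDP II surveys.

    # Illustrative sketch only: how mean item ratings and standard deviations
    # of the kind shown in Tables A-D could be computed from raw ratings.
    # Assumptions: 1-5 coding from Strongly Disagree to Strongly Agree and the
    # sample standard deviation (n - 1 denominator); neither is stated in the report.
    from statistics import mean, stdev

    # Hypothetical responses for two items, one rating per respondent.
    item_responses = {
        1: [4, 4, 3, 5, 4, 2, 4, 3, 4, 4, 5, 3],
        2: [5, 4, 4, 5, 3, 4, 5, 4, 4, 5, 4, 5],
    }

    for item_number, ratings in item_responses.items():
        m = mean(ratings)              # mean item rating
        sd = stdev(ratings)            # sample standard deviation
        print(f"Item {item_number}: mean = {m:.2f}, SD = {sd:.2f}, n = {len(ratings)}")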