Appendix B

Performance Measures
for
Fermi National Accelerator Laboratory

FY2003


Section I: PREAMBLE

 

This Appendix sets forth the procedure to be used in the evaluation of Fermi National Accelerator Laboratory performance as required by Part I, Section H, Clause H.14 - Use of Objective Standards of Performance, Self Assessment and Performance Evaluation, and as referenced in Part II, Section I, Clause I.81A - Total Available Fee:  Base Fee Amount and Performance Fee Amount, of the Contract.  The procedure described in this Appendix utilizes, to the extent possible, a set of "objectives", "indicators", "measures", and "metrics" against which the Department of Energy (DOE) will assess Fermi National Accelerator Laboratory's performance for each area identified herein.

 

Section II of this Appendix, Performance Based Management Guidelines, sets forth guidelines on the use of the performance objectives, indicators, measures, and metrics.

 

For the period October 1, 2002, through September 30, 2003, the Parties have agreed to evaluate the Laboratory activities identified in Section III of this Appendix, Performance Areas.  Section III reflects that DOE will evaluate the Contractor in two broad areas ("Performance Areas"), namely Critical Outcomes (Performance Area 1) and Self-Assessment (Performance Area 2).  The Critical Outcomes consist of incentivized (fee bearing) Performance Measures.  The Self-Assessment consists of non-fee bearing Performance Measures.

 

DOE will rate the Critical Outcomes Performance Areas of Science Programs (A) and Operations Management (B-F) separately for the purposes of determining the Contractor's performance and the fee earned by the Contractor.  DOE will use the ratings received for Critical Outcomes to determine the Contractor's fee earned in a given performance period.  However, DOE reserves its rights specified elsewhere in this Contract, including those in Part I, Section H, Clause H.14 - Use of Objective Standards of Performance, Self Assessment and Performance Evaluation, and those in Part II, Section I, Clause I.81A - Total Available Fee:  Base Fee Amount and Performance Fee Amount, and Clause I.83 - Conditional Payment of Fee, Profit, or Incentives.

 

Section IV lists the performance objectives, indicators, measures, and metrics for Performance Area 1, Critical Outcomes and for Performance Area 2, Self-Assessment.

 

Attachment 1 provides the schedule for performing the evaluation of the Laboratory.  The parties intend to adhere to this schedule although either party may request to alter the proposed schedule.

 

Attachments 2 and 2a establish the maximum performance fee earnable by the Contractor, as well as the potential reductions to the performance fee, based on the individual ratings in the Performance Area of Critical Outcomes.

 

The Parties agree to work together to clarify and improve, when necessary, the process to be used to measure and validate the level of performance attained.  In particular, the Parties agree to:

 

a.      check the validity of each respective performance objective, indicator, measure, and metric as an accurate and meaningful reflection of performance and replace them with more appropriate performance objectives, indicators, measures, and metrics, if necessary;

 

b.      consider adding to or subtracting from the complement of performance objectives, indicators, measures, and metrics in order to track performance objectives more meaningfully and accurately; and

 

c.      consider adding or subtracting performance measures as appropriate in response to the evolving requirements of DOE; in particular, the Parties undertake to replace requirements contained in DOE Directives with performance measures whenever feasible.

 

The Parties acknowledge that continued changes in the Department’s Directives system are occurring, and that implementation of this performance-based contract may require changes to:   1) refine selected performance objectives, indicators, measures, and metrics; 2) implement data collection and reporting mechanisms; and 3) establish benchmarks against which to set targets for performance improvement and/or measurement.

 

The Parties will use the evaluation period to assure the implementation, testing, and refining of systems and processes.  The DOE will use the results of these performance measures, the contractor's self-assessment of overall performance and other inputs, such as DOE's day-to-day operational awareness, General Accounting Office or Inspector General reviews, or for-cause reviews, as appropriate to evaluate the Contractor's performance for each performance period.

 

Attachments:

Attachment 1:   Typical Evaluation Schedule
Attachment 2:   Performance Fee
Attachment 2a:  Fermilab Critical Outcomes Fee Distribution
Attachment 3:   Self-Assessment Forms


 

 

Section II: PERFORMANCE-BASED MANAGEMENT GUIDELINES

 

 

1.   The purpose of these Guidelines is to institutionalize a performance-based management system that encourages and rewards excellence, continuous improvement, cooperation and timely communication.

 

2.   In keeping with the objectives set forth above, any performance-based management contract must begin with the establishment of contract performance objectives, indicators, measures, and metrics, which may be linked to pre-established performance incentives that, if achieved, will:

 

a.      enhance the Laboratory's ability to accomplish its mission for the Department;

 

b.      drive cost-effective performance improvements, focusing on efficient system performance while maintaining appropriate internal controls;

 

c.      allow for meaningful trend and rate of change analysis, when possible; and

 

d.      encourage benchmarking initiatives as a means of incorporating industry business standards, and "best practices" that are meaningful, appropriate, and consistent with Departmental requirements and deemed to reflect overall successful operations.  "Best practices" should include cost/risk/benefit analysis.

 

3.         Performance-Based Contract Measures (PBCMs) which include Critical Outcome and Self-Assessment measures should be constructed to: 

 

a.      drive improvements;

 

b.      focus on effectiveness of systems; and

 

c.      maintain an appropriate level of internal controls. 

 

PBCMs should incorporate "best practices" and reflect the DOE's and the Contractor's judgment as to the key performance elements which will enhance the fulfillment of the Department's mission objectives.  Each Division and Section of Fermilab shall participate in the development of performance measures.  The Performance Measures for the Performance Areas of Critical Outcomes and the Self-Assessment are incorporated into Section IV of this Appendix B.   Performance Measures for Critical Outcomes are tied to performance fee.  Self-Assessment Measures will be considered with the overall evaluation of the contractor’s performance.

 

4.         PBCMs are composed of five tiers for Critical Outcome measures and four tiers for Self-Assessment measures:

 

a.            Critical Outcome:  A long-term or constant area or activity that is important to both DOE and Fermilab and has high priority.

 

b.            Objective:  Statements of desired outcomes for an organization or activity.

 

c.            Indicator:  Areas of performance to be measured.

 

d.            Measure:  A quantitative or qualitative characterization of performance.

 

e.            Metric:  A result, output, or characteristic of the activity to be evaluated.

 

5.     Adjectival Ratings are as follows:

 

a.      Outstanding:  Significantly exceeds the standards of performance; achieves noteworthy results.

 

b.      Excellent:  Exceeds the standard of performance; although there may be room for improvement in some elements, better performance in all other elements more than offsets this.

 

c.      Good:  Meets the standard of performance; deficiencies do not substantively affect performance.

 

d.     Marginal:  Below the standard of performance; deficiencies are serious and may affect overall results;  management attention and corrective action are required.

 

e.     Unsatisfactory:  Significantly below the standard of performance; deficiencies are serious, may affect overall results, and urgently require senior management attention.

 

6.     PBCMs should reference industry standards, best practices, or other standards that are meaningful, appropriate, and consistent with DOE requirements, rather than trying to develop standards arbitrarily.  To this end, benchmarking initiatives are strongly encouraged.  When establishing benchmarks and setting targets, the parties should consider the return on the cost required to make further improvements.

 

7.     The methodology for measuring performance will be established by mutual agreement of the parties (except as may be otherwise specified in this contract) prior to the start of the performance period.

 

8.         The Parties acknowledge that the performance levels achieved against the specific performance objectives, indicators, measures, and metrics, which are established in the contract for the Critical Outcomes and directly linked to contract fee, are the primary but not the sole criteria for determining the Contractor's final performance ratings and fee earned in any given performance period. 

 

When determining the Contractor's final performance ratings and fee earned in any given performance period for the Performance Measures in the Critical Outcomes, the Contracting Officer also will consider:  1) Laboratory performance in the Self-Assessment Measures; and 2) any other relevant information directly related to the Performance Measures in the Critical Outcomes that is deemed to have had an impact (either positive or negative) on the Contractor's performance.  The Contracting Officer also will consider relevant information available from other sources, including but not limited to, the Contractor's self-assessment, DOE's day-to-day operational awareness, annual business reviews (if applicable), Inspector General reviews, General Accounting Office (GAO) audits, for-cause reviews, etc.  The Contracting Officer also will consider the Contractor's cooperation, interaction, and responsiveness to DOE throughout the performance period.  This evaluation process does not impact DOE's rights under Part II, Section I, Clause I.83 - Conditional Payment of Fee, Profit, or Incentives.

 

Should the Contracting Officer contemplate considering other relevant information in establishing the final performance rating for either the Critical Outcomes or Self-Assessment for the performance period, the Contracting Officer will give the Contractor written notice at an appropriate and reasonable time specifying such information, the reasons for considering it relevant and significant, and the intended effect on the performance rating for the year.  The Contractor will be given the opportunity to respond to the Contracting Officer's intended action in writing and, if the Contractor requests, in a meeting.

 

9.     The Contracting Officer will review, approve and periodically verify how the Contractor collects, compiles and scores its performance against the measures established annually and incorporated into the contract as Section IV of this Appendix.

 

10.   PBCMs are to be developed in a team approach involving appropriate representatives from the Fermi Area Office, Chicago Operations Office, HQ, Universities Research Association, Inc. and Fermi National Accelerator Laboratory.

 

11.   Failure to include a specific objective and/or measure in the contract in Section IV does not eliminate the need for the Contractor to comply with any contractual requirements, and failure to comply may result in the Contracting Officer modifying the performance rating achieved.

 

12. The Director of the Office of Science (SC-1) has the primary responsibility for evaluating Science Programs performance, but input also will be sought from cognizant DOE Assistant Secretaries, Office Directors and Program Managers.  The Contracting Officer has the primary responsibility for evaluating the Operations Management performance in accordance with the objectives, indicators, measures, and metrics of Performance Area 1, Operations Management, B through F, and Performance Area 2, Self-Assessment.  However, the Contracting Officer shall inform SC-1 of any issues or concerns that should be considered when evaluating the Contractor's performance in Science Programs.  This is especially important in those areas where operational performance could have a significant impact on the Contractor's ability to conduct successful research for the Department.  The Contractor has primary responsibility to compile the data necessary to document its performance against all measures.

 

13. If, for reasons beyond the Contractor's control, certain data input is not available to meet the appraisal schedules outlined in Attachment 1 to this Appendix, the evaluation shall proceed according to schedule for measures which have complete data.  Final ratings shall not be determined until ratings for all measures are completed.  A final assessment report with final adjectival ratings will only be issued when sufficient data is available to evaluate the Contractor's performance against all measures.

 

14. The Contractor and DOE have agreed to specific weights for the Performance Areas of Science Programs and Operations Management (70% and 30%, respectively).  In addition, within each of these areas, individual measures will have metrics established to gauge Laboratory performance.  If the Parties cannot reach agreement on either the individual metrics or the specific weights for the evaluation criteria, the Contracting Officer shall have the right to establish such weights and/or metrics.

 

15. In the event the Contracting Officer determines it necessary to exercise the right set forth in 14 above, the Contracting Officer will notify the Contractor in writing of the intended decision.  The final weightings and/or metrics will be issued to the Contractor within 10 working days of the aforementioned written notice.

 

16. The Contractor shall have the ability to earn an annual performance fee as described in Attachments 2 and 2a of this Appendix.


 

 

Section III:      PERFORMANCE AREAS

 

 


 

 

PERFORMANCE AREA 1:  CRITICAL OUTCOMES

 

 

I.  Science Programs                                                        Weight

A     Science                                                                 70%
      A.1  Quality of Research                                                30%
      A.2  Success in Constructing and Operating Research Facilities          25%
      A.3  Effectiveness and Efficiency of Research Program Management        15%
      A.4  Relevance to DOE Missions and National Needs                  Pass/Fail

II.  Operations Management                                                    30%

B     Leadership                                                               6%
C     Environment, Safety & Health                                             9%
D     Mission Support                                                          9%
E     Self-Assessment                                                          3%
F     Stakeholder Relations                                                    3%

                                                          Total              100%

 


 



 

PERFORMANCE AREA 2:  SELF-ASSESSMENT

Division/Section    Category                                   Assessment Planning
                                                               Responsibility Section

DIR                 1/3 of Documented Processes                DIRECTORATE
                    A  Science Program                         DIRECTORATE
                    B  Leadership                              URA
                    E  Self-Assessment                         DIRECTORATE
                    F  Community Involvement                   DIRECTORATE
                    G  Intellectual Property                   DIRECTORATE
                    H  Financial Management                    DIRECTORATE
                    I  Counterintelligence                     DIRECTORATE

BSS                 1/3 of Documented Processes
                    D  Mission Support                         BSS
                    J  Property*                               BSS
                    K  Procurement*                            BSS
                    L  Legal Management                        BSS

ES&H                1/3 of Documented Processes
                    C  Environment, Safety & Health            ES&H
                    M  Waste Reduction                         ES&H
                    N  Environmental Management Systems        ES&H
                    O  Safeguards and Security                 ES&H

FESS                1/3 of Documented Processes
                    D  Mission Support                         FESS
                    P  Real Estate Management                  FESS
                    Q  Facility Maintenance and Engineering    FESS

LSS                 1/3 of Documented Processes
                    R  Human Resources*                        LSS
                    S  Training                                LSS
                    T  Diversity                               LSS
                    U  Science and Technology Information      LSS

BEAMS               1/3 of Documented Processes                BEAMS

CD                  1/3 of Documented Processes
                    V  Cyber Security                          CD

PPD                 1/3 of Documented Processes                PPD

TD                  1/3 of Documented Processes                TD

            *Balanced Scorecards for these functions are stand-alone measures that are
              submitted in the BSC format in addition to all other assessments.


 

 

Section IV:  PERFORMANCE MEASURES

 

 

Performance Area 1:  Critical Outcomes

I.  Science Programs – 70%

 

                                A. Science

 

Critical Outcome:  Advance the understanding of the fundamental nature of matter and energy by conducting research at the frontier of high-energy physics and related disciplines.

 

Objective A.1

Quality of Research - Advancement in the understanding of the fundamental nature of matter and energy.

 

Indicator

A.1.1

Success in producing original, creative scientific output that advances fundamental science and opens important new areas of inquiry.

 

Measure A.1.1.1

 

Results of program peer reviews reveal sustained progress.

 

Metric A.1.1.1.1

 

Office of Science evaluation with input from the URA Visiting Committee(s).

 

Indicator A.1.2

 

Success in achieving sustained progress and impact on the field.

Measure A.1.2.1

The results of reviews and evaluations indicate the laboratory programs have a sustained impact on the scientific field.

 

Metric A.1.2.1.1

 

Office of Science evaluation with input from the URA Visiting Committee(s).

 

Indicator A.1.3

 

Recognition from the scientific community, including awards, peer-reviewed publications, citations, and invited talks.

 

Measure A.1.3.1

 

Research output produced is recognized by the scientific community.

Metric A.1.3.1.1

Office of Science Evaluation.

 

 

Weight 30%

 

 

 

Objective A.2

 

Successfully construct and operate research facilities.

 

Indicator A.2.1

 

Construction and commissioning of new facilities on time and within budget; achievement of facility performance specifications and objectives.

Measure A.2.1.1

 

DOE approved project baselines are met.

 

Metric A.2.1.1.1

 

Performance against the DOE-approved NuMI construction project baselines, as evaluated by the DOE Project Manager using input from reviews and assessments.

 

Metric A.2.1.1.2

Performance against the DOE-approved U.S. LHC Accelerator construction project baselines, as evaluated by the DOE Project Manager using input from reviews and assessments.

 

Metric A.2.1.1.3

Performance against the DOE-approved U.S. CMS Detector construction project baselines, as evaluated by the DOE Project Manager using input from reviews and assessments.

 

Metric A.2.1.1.4

Performance against the DOE-approved project baselines for the Run IIb CDF and DZero Detector projects, as evaluated by the DOE Project Manager using input from reviews and assessments.

 

Indicator A.2.2

Reliability of operations and adherence to planned schedules for accelerator run hours and delivered integrated and peak luminosity.

 

Measure A.2.2.1

 

DOE approved accelerator and experimental facilities operations goals.

 

Metric A.2.2.1.1

Number of Tevatron store hours during the fiscal year.

 

Outstanding    Excellent    Good      Marginal
> 2800         > 2400       ≥ 2000    < 2000

 

Metric A.2.2.1.2

Average of CDF and DZero delivered integrated luminosity as reported by the Beams Division during the fiscal year.

 

 

Outstanding    Excellent     Good          Marginal     Unsatisfactory
> 225 pb⁻¹     > 180 pb⁻¹    > 135 pb⁻¹    ≥ 90 pb⁻¹    < 90 pb⁻¹

 

Metric A.2.2.1.3

Peak weekly integrated luminosity during the fiscal year.

 

 

Outstanding       Excellent        Good             Marginal         Unsatisfactory
> 11 pb⁻¹ wk⁻¹    > 9 pb⁻¹ wk⁻¹    > 7 pb⁻¹ wk⁻¹    ≥ 5 pb⁻¹ wk⁻¹    < 5 pb⁻¹ wk⁻¹

 

 

 

Weight 25%

 

 

 

Objective A.3

Provide for effective and efficient program management for a world class research program.

 

Indicator A.3.1

 

The extent to which effective management programs support the research program.

 

Measure A.3.1.1

 

Optimal use of personnel, facilities and equipment.

 

Metric A.3.1.1.1

 

Office of Science evaluation.

Measure A.3.1.2

Effectiveness of communicating technical results to maximize the value of research results and gain appropriate recognition for DOE and the Laboratory.

 

Metric A.3.1.2.1

 

Office of Science evaluation.

Measure A.3.1.3

 

Success in meeting budget projections and milestones.

Metric A.3.1.3.1

 

Office of Science evaluation.

Measure A.3.1.4

 

Planning for future physics program.

Metric A.3.1.4.1

Office of Science evaluation.

 

                                                Weight   15%

 

 

 

Objective A.4

 

Establish relevance to DOE missions and national needs.

Indicator A.4.1

 

The laboratory successfully contributes to DOE missions and programs of national importance.

Measure A.4.1.1

 

Contributions to the goals and objectives of the strategic plans of DOE and other national programs.

Metric A.4.1.1.1

 

Office of Science evaluation.

Measure A.4.1.2

 

Productive interaction with other scientific programs.

Metric A.4.1.2.1

 

Office of Science evaluation.

Measure A.4.1.3

 

Effective use of research facilities that serve the needs of a wide variety of users.

Metric A.4.1.3.1

 

Office of Science evaluation with input from the URA Visiting Committee(s).

 

Weight  0% (Pass/Fail)

 

 

Weightings for Science Review

Measure                                                             Weight, %
A.1   Quality of Research                                             30
A.2   Success in Constructing and Operating Research Facilities       25
A.3   Effectiveness and Efficiency of Research Program Management     15
A.4   Relevance to DOE Missions and National Needs                     0 (P/F)
                                                          Total       70

 


II.  Operations Management – 30%

           

                             B. Leadership                         

 

Critical Outcome:   Provide the leadership to ensure operational excellence and foster responsible stewardship of DOE resources.

 

Objective B.1

 

URA provides the integrated management and leadership necessary to enhance the operations and management processes that ensure execution of the Fermilab mission in a safe, effective, and efficient manner.

 

Indicator B.1.1

 

URA directs management reviews that result in an overall assessment of key Fermilab operations functions and management systems.

 

Measure B.1.1.1

 

Reviews of management systems and processes, including each operational area, are accomplished at least once every three years.

 

Metric B.1.1.1.1

 

DOE evaluation with input from reviews done within the performance period, such as peer reviews for operations, projects, and programs.

 

Measure B.1.1.2

 

Management effectively resolves important issues arising during the year.

 

Metric B.1.1.2.1

 

DOE evaluation, with input from reviews done within the performance period and from operational awareness activities.

Indicator B.1.2

 

URA management promotes operational and management system excellence.

 

Measure B.1.2.1

 

Management proactively identifies and addresses opportunities for improvement.

 

Metric B.1.2.1.1

 

DOE evaluation, with input from reviews done within the performance period and from operational awareness activities.

Measure B.1.2.2

 

Management responds appropriately to recommendations made by review teams.

 

Metric B.1.2.2.1

DOE evaluation, with input from reviews done within the performance period and from operational awareness activities.

Weighting for Leadership

Objective    Weight, %
B.1              6
Total            6

 

 

 

 

Weight  6%

 

 


 

 

                C. Environment, Safety & Health

 

Critical Outcome:   Protect the safety and health of the Fermilab workforce, subcontractors, the community, and the environment in all SC program activities.

 

Objective C.1

 

Identify and implement enhanced Integrated Safety Management Systems (ISMS) through the Tripartite Assessment process.

 

Indicator C.1.1

 

Completion of Phases II and III of the Beams Division ISMS Assessment and preparation of any associated corrective action plan and schedule for implementation by September 30, 2003.

 

Indicator C.1.2

 

Completion of the Facilities Engineering Services Section (FESS) ISMS Assessment and preparation of any associated corrective action plan and schedule for implementation by September 30, 2003.

 

Indicator C.1.3

 

Completion of the Laboratory Services Section (LSS) ISMS Assessment and preparation of any associated corrective action plan and schedule for implementation by September 30, 2003.

 

Indicator C.1.4

Completion of the Environment, Safety and Health Section (ESHS) ISMS Assessment and preparation of any associated corrective action plan and schedule for implementation by September 30, 2003.

 

Measure C.1.4.1

 

Number of indicators achieved.

Metric C.1.4.1.1

Outstanding    Excellent    Good    Unsatisfactory
4              3            2       1

 

Objective C.2

 

Sustain excellence in safety, health and environmental protection.

 

Indicator C.2.1

 

Performance against agreed upon metrics, as set forth below.

 

Measure C.2.1.1

 

Injury cost index for Fermilab employees for the performance period (October 1, 2002 – September 30, 2003).

 

Metric C.2.1.1.1

Outstanding    Excellent      Good           Marginal       Unsatisfactory
< 11.0         11.0 - 16.0    16.1 - 21.9    22.0 - 28.4    > 28.4

Cost index = 100 (1,000,000 D + 500,000 T + 2,000 LWC + 1,000 WDL + 400 WDLR + 2,000 NFC) divided by total work-hours.  Where:

 

D is the number of fatalities;

T is the number of permanent transfers or terminations due to occupational illness or injury;

LWC is the number of lost workday cases;

WDL is the number of days away from work;

WDLR is the number of restricted duty days;

NFC is the number of non-fatal cases without days away from work or restricted workdays.
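Read as a computation, the cost index above is straightforward to apply; the following sketch is illustrative only (the function name and the sample case counts are invented for the example, not taken from the contract):

```python
def injury_cost_index(total_work_hours, D=0, T=0, LWC=0, WDL=0, WDLR=0, NFC=0):
    """Injury cost index per Metric C.2.1.1.1.

    D    -- fatalities
    T    -- permanent transfers/terminations due to occupational illness or injury
    LWC  -- lost workday cases
    WDL  -- days away from work
    WDLR -- restricted duty days
    NFC  -- non-fatal cases without days away or restricted workdays
    """
    weighted_cost = (1_000_000 * D + 500_000 * T + 2_000 * LWC
                     + 1_000 * WDL + 400 * WDLR + 2_000 * NFC)
    return 100 * weighted_cost / total_work_hours

# Hypothetical year: 4 lost workday cases, 60 days away from work,
# 30 restricted duty days, 6 other recordable cases, 4,000,000 work-hours.
print(injury_cost_index(4_000_000, LWC=4, WDL=60, WDLR=30, NFC=6))  # 2.3
```

An index of 2.3 would fall in the Outstanding band (< 11.0) of Metric C.2.1.1.1.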

 

Measure C.2.1.2

Injury cost index for Fermilab subcontractors' employees for the performance period (October 1, 2002 – September 30, 2003).

 

Metric C.2.1.2.1

 

Outstanding    Excellent      Good           Marginal       Unsatisfactory
< 21.0         21.0 - 28.7    28.8 - 35.2    35.3 - 42.3    > 42.3

Measure C.2.1.3

 

Lost Workday case rate for Fermilab employees for the performance period (October 1, 2002 – September 30, 2003).

 

Metric C.2.1.3.1

 

Lost Workday Case Rate (number of lost workday cases per 200,000 worker hours) during fiscal year. [1]

 

Outstanding    Excellent    Good         Marginal     Unsatisfactory
< 1.0          1.0 - 1.4    1.5 - 2.0    2.1 - 2.5    > 2.5
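The 200,000-worker-hour base in the rate above corresponds roughly to 100 full-time employees working for a year; a minimal sketch of the computation (the function name and figures are hypothetical):

```python
def lost_workday_case_rate(lost_workday_cases, worker_hours):
    """Lost workday cases per 200,000 worker hours (Metric C.2.1.3.1)."""
    return lost_workday_cases * 200_000 / worker_hours

# Hypothetical: 20 lost workday cases over 4,000,000 worker hours.
print(lost_workday_case_rate(20, 4_000_000))  # 1.0
```

A rate of 1.0 would fall in the Excellent band (1.0 – 1.4).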

 

Measure C.2.1.4

Lost Workday case rate for Fermilab subcontractors' employees for the performance period (October 1, 2002 – September 30, 2003).

 

Metric C.2.1.4.1

 

Outstanding    Excellent    Good         Marginal     Unsatisfactory
< 3.2          3.2 - 3.7    3.8 - 4.4    4.5 - 5.0    > 5.0


Measure C.2.1.5

 

Occupational Radiation Protection Program Performance.

 

Metric C.2.1.5.1

Total Effective Dose Equivalent (TEDE) in person-rem.

 

The TEDE is the sum of deep dose equivalent received by individuals monitored at Fermilab and has units of person-rem.  The TEDE applies to all individuals who have been issued personal radiation monitoring devices at Fermilab.  These include Lab employees, subcontractor employees, experimenters, and tourists.  Due to the time required for processing the doses, this measure will cover the period from July 1, 2002 through June 30, 2003.

 

 

Outstanding    Excellent      Good           Marginal       Unsatisfactory
< 15.0         15.1 - 17.0    17.1 - 20.0    20.1 - 22.0    > 22.0

 

Metric C.2.1.5.2

Unplanned radiation exposure (based upon discovery date within FY 2003). 

 

 

Outstanding    Excellent       Good     Marginal    Unsatisfactory
0              1 - 2 events    3 - 4    5           > 5

 

 

Any work evolution that results in confirmed occupational whole body exposure(s) exceeding the expected exposure(s) by > 75 mrem for a non-emergency work activity and that is not controlled by a written job-specific Radiological Work Permit, a specific sealed source procedure, Radiological Control Organization supervision, or a documented planned special exposure.

 

Metric C.2.1.5.3

Loss of control of radioactive material/spread of radioactive contamination (based upon a discovery date within FY 2003). 

 

Number of events:

 

Outstanding    Excellent    Good     Marginal    Unsatisfactory
0              1 - 2        3 - 4    5           > 5

 

 

As required by the Fermilab Radiological Control Manual, radioactive materials are to be located in the posted areas corresponding to emanating exposure rates.  This metric applies to the discovery of unlabeled radioactive material(s)/contamination in excess of thresholds specified in DOE Occurrence Reporting Program outside of controlled, radioactive material, or radiological areas.  The metric also applies to the discovery of inappropriately labeled radioactive materials found outside of the aforementioned areas.

 

 

Weightings for ES&H

Objective    Measure    Metric       Weight, %
C.1          C.1.4.1                    4
C.2          C.2.1.1                    1
             C.2.1.2                    1
             C.2.1.3                    1
             C.2.1.4                    1
             C.2.1.5                    1
                        C.2.1.5.1       .50
                        C.2.1.5.2       .25
                        C.2.1.5.3       .25
Total                                   9

 

 

Weight     9%


 

                         D. Mission Support

 

Critical Outcome:   Manage and enhance business and management systems, work processes, and facility support to provide an effective and efficient work environment that enables the execution of Fermilab’s mission.

 

Objective D.1

 

Establish and maintain a dependable facilities base from which particle physics and other Fermilab programs can be safely accomplished without interruption.

 

Indicator D.1.1

 

The infrastructure is maintained to support operations in a safe, environmentally responsible, and cost-effective manner.

 

Measure D.1.1.1

 

Maintenance is performed as scheduled.

Metric D.1.1.1.1

Scheduled hours vs. total hours, measured as a percentage.

 

Outstanding    Excellent    Good        Marginal    Unsatisfactory
> 80%          80 - 70%     69 - 60%    59 - 55%    < 55%

 

Indicator D.1.2

 

Infrastructure project milestones, as approved in Construction Directives, are completed for the following projects:

 

Measure D.1.2.1

 

Small projects:

·         General Plan Projects (GPP);

·         In-House Energy Management (IHEM);

·         Accelerator Improvement Projects (AIP).

 

Metric D.1.2.1.1

 

For the performance period (October 1, 2002 through September 30, 2003) the percentage of projects completed (number of projects completed/number of projects planned), as documented in construction directives.

 

 

Outstanding    Excellent    Good        Marginal    Unsatisfactory
> 90%          90 - 81%     80 - 71%    70 - 60%    < 60%

 

Indicator D.1.3

 

Energy Management initiatives are managed consistently with a comprehensive energy management plan that includes the minimum requirements of DOE Order 430.2A.

 

Measure D.1.3.1

 

Energy requirements accomplished/requirements scheduled to be accomplished during the performance period (October 1, 2002 through September 30, 2003) in accordance with the plan.

 

Metric D.1.3.1.1

Percentage of energy requirements accomplished:

 

 

Outstanding    Excellent    Good        Marginal
> 95%          95 - 85%     84 - 75%    < 75%

 

Objective D.2

 

Effective and efficient delivery of the best value products and services by Fermilab subcontractors.

 

Indicator D.2.1

Top-quality subcontractors are selected, and effective management and contract administration are provided.

 

Measure D.2.1.1

 

Evaluation of construction subcontractor performance in accordance with DOE-approved established criteria for subcontracts greater than $100K.  Collected data will form a baseline for subsequent evaluation periods.

 

Metric D.2.1.1.1

Percent of subcontracts greater than $100K that have been evaluated.

 

 

Outstanding     Excellent     Good      Marginal
100%            97%           95%       < 95%

 

 

 

Weightings for Mission Support

Objective     Weight, %
D.1           6
D.2           3
Total         9

 

 

Weight     9%


 

                         E. Self-Assessment

 

Critical Outcome:   The self-assessment process will evaluate Fermilab’s ability to meet critical outcomes, performance objectives, measures, and expectations, and to control its processes.

 

Objective E.1

 

A self-assessment is undertaken for all organizational elements of the laboratory.

 

Indicator E.1.1

 

Peer reviews are used as a tool to evaluate the program elements, Divisions and Sections.

Measure E.1.1.1

 

Number of peer reviews accomplished in each major category.

 

Metric E.1.1.1.1

Percent of peer reviews accomplished for the following projects and programs:

 

Reviews of Projects and Programs at Fermilab[2]

 

Project                                     Type     Date
Neutrinos at the Main Injector              Fermi    November
Run IIb Detector Upgrade                    Fermi    April
CKM                                         Fermi    May
US Compact Muon Solenoid                    Fermi    March
Run IIb Accelerator Improvement Projects    Fermi    June
BTeV                                        Fermi    October

Physics Programs:
     Visiting Committee                     URA      April
     Physics Advisory Committee             URA      April
     Physics Advisory Committee             URA      June

 

 

Outstanding     Excellent     Good      Marginal     Unsatisfactory
> 90%           > 80%         > 75%     > 70%        ≤ 70%

 

Indicator E.1.2

A self-assessment is completed by the ES&H Section for Integrated Safeguards and Security Management Systems.

 

Indicator E.1.3

A self-assessment is developed for the systems and activities of all Divisions and Sections.

 

Measure E.1.3.1

 

33% of the systems and activities are assessed each reporting period.[3]

Metric E.1.3.1.1

 

All assessment reports for the reporting period completed by 11/15/2003.

 

Measure E.1.3.2

 

Effectiveness of the annual self-assessment.

 

Metric E.1.3.2.1

 

DOE approval of the self-assessment plan.

 

Metric E.1.3.2.2

 

DOE evaluation of the annual self-assessment.

 

Metric E.1.3.2.3

 

Percentage of issues or improvements from previous assessments that are resolved during the current reporting period.

 

 

Outstanding     Excellent     Good           Marginal
> 95%           95 - 90%      < 90 - 85%     < 85%

 

Metric E.1.3.2.4

 

The Administrative and Operations Peer Review Committee will review and provide an evaluation of the quality, effectiveness, and completeness of the self-assessment process and the resulting self-assessment report.

 

Pass                 Fail
Review completed     Review not completed

 

 

 

Weightings for Self-Assessment

Objective     Indicator     Weight, %
E.1           E.1.1         1.5
E.1           E.1.2-1.3     1.5
Total                       3

 

 

Weight     3%

 


 

                      F. Stakeholder Relations

 

COMMUNITY INVOLVEMENT

 

Critical Outcome:   The laboratory is regarded as a good corporate citizen and conducts its affairs in a manner that leads to public confidence in the laboratory.

 

Objective F.1

 

Work with customers, stakeholders, and neighbors in an open, frank and constructive manner.

 

Indicator F.1.1

 

Development and implementation of an effective communications and community involvement plan. 

 

Measure F.1.1.1

 

Achievement of significant goals and/or milestones as identified in the DOE approved communications and community involvement plan for the performance period.

 

Metric F.1.1.1.1

Percent of the milestones achieved on time.

 

 

Outstanding     Excellent     Good         Marginal     Unsatisfactory
> 95%           95 - 90%      89 - 80%     79 - 70%     < 70%

 

Indicator F.1.2

 

Design two new publications.

Measure F.1.2.1

 

A new external publication is designed using an Advisory Board consisting of policy makers, senior staff from Fermilab and other labs, and members of the scientific community.

 

Metric F.1.2.1.1

 

Design complete by 9/30/2003.

 

Measure F.1.2.2

 

A new internal publication is designed using an Advisory Board consisting of senior staff, focus groups from Fermilab and the results of employee surveys.

 

Metric F.1.2.2.1

Design complete by 9/30/2003.

 

 

Weightings for Stakeholder Relations

Objective     Indicator     Weight, %
F.1           F.1.1         1.5
F.1           F.1.2         1.5
Total                       3

 

 

Weight     3%


Self-Assessment

Self-Assessment Criteria

 

Preamble

 

An effective Performance-Based Management system should be established which:  1) institutionalizes an internal Self-Assessment Program; 2) fosters assessment of existing internal systems, policies, and procedures; and 3) encourages continuous improvement.  This Self-Assessment is in addition to the development of specific contract Performance Measures directly tied to incentives.  The Laboratory will accomplish a complete assessment of all of its work processes every three years.  Each Division/Section is responsible for assessing one third of its processes each fiscal year.  These assessments must be fully documented and the results included in the Self-Assessment Report at the end of the reporting period.  In the case of Divisions/Sections that also have contractual metrics related to the Self-Assessment (i.e., those included in the Performance Area 2, Self-Assessment), processes that generate the results for these metrics must be included as part of the one third of the total processes for the assessment period.  The processes to be assessed for the reporting period will be identified at the beginning of the reporting period and the resultant list submitted to DOE by letter.

 

Each Division and Section of the Laboratory will monitor continuously the agreed upon performance measures and produce both a mid-year progress report with narrative comment on each measure by the responsible party and a year-end self-assessment of performance with detailed narrative on each measure and on each process assessed by the responsible party.  The Contractor's Self-Assessment Program shall be developed in formal agreement with the Contracting Officer and provide for the following:

 

a.         an assessment of performance against objectives, indicators, measures and metrics which have been identified under the category of "Critical Outcomes";

 

b.         an assessment of performance against objectives, indicators, measures and metrics which have been identified by mutual agreement of the parties as being measures of system performance.  These Self-Assessment Measures are not linked directly to any contract performance incentive.  Instead, they are additional to the contract Performance Measures identified in the Science and Operations Management Critical Outcomes of this Appendix B;

 

c.         an assessment of overall operations for:

 

(1)        compliance with the prime contract, law, or other DOE, Federal, and State requirements (such as regulations, directives, etc.) as may be applicable pursuant to the terms of the prime contract;

 

(2)        the adequacy and the degree to which internal policies, procedures, and controls are implemented and are being met;

 

d.         identification of improvement opportunities and improvement plans.  As a result of the year-end assessment, a list of opportunities for improvement will be identified and actions for implementation scheduled for the next reporting period.  Achievement of all implementation actions will automatically become a metric for the subsequent reporting period;

 

e.         assessment of the overall program once per year using the Administrative Peer Review process.  The process shall consist of the following elements:

 

1)      The Sections will present their year-end Self-Assessment to the Administrative Peer Review Committee;

 

2)      The Committee will evaluate the presentations and provide feedback to the Sections;

 

3)      The Committee will split into sub-committees to investigate areas they select based on the Sections’ presentations;

 

4)      The goal of the sub-committees will be to evaluate one third of each Section’s processes each year;

 

5)      The Administrative Peer Review Committee will include its report in the year-end self-assessment report to DOE;

 

6)      The Divisions will continue to utilize their long-standing system of peer review for all significant projects and operations;

 

7)      Divisions and Sections will perform the self-assessments using forms and methodology contained in Attachment 3 of this Appendix;

 

8)      Each Division/Section will select one third of its processes from its list in the Self-Assessment Plan and perform a self-assessment of these processes in FY2003.  Those selected may not be the same as those assessed in FY2002;

 

9)      Divisions and Sections will self-assess Indicators and metrics listed in the categories “Contractual Performance Indicators to be Assessed” and “Additional Metrics to be Assessed” in addition to those selected in accordance with paragraph 8) above.

 

 


 

Self-Assessment Plan

 

Directorate

 

List of Work Processes Identified by the Directorate

 

1.      Foreign travel approval

2.      ORTA (technology transfer)

3.      Project review process

4.      Purchase requisition approval

5.      Personnel requisition approval

6.      Housing allocation for users

7.      Customer communication and feedback

8.      Budget process

9.      Policy formulation and publication

 

Contractual Performance Indicators to be Assessed:

 

SCIENCE

 

Critical Outcome:  Advance the understanding of the fundamental nature of matter and energy by conducting research at the frontier of high-energy physics and related disciplines.

 

Objective A.1

Quality of Research - Advancement in the understanding of the fundamental nature of matter and energy.

 

Indicator A.1.1

Success in producing original, creative scientific output that advances fundamental science and opens important new areas of inquiry.

 

Indicator A.1.2

Success in achieving sustained progress and impact on the field.

 

Indicator A.1.3

Recognition from the scientific community, including awards, peer-reviewed publications, citations, and invited talks.

 

Objective A.2

 

Successfully construct and operate research facilities.

 

Indicator A.2.1

Construction and commissioning of new facilities on time and within budget; achievement of facility performance specifications and objectives.

 

Indicator A.2.2

 

Reliability of operations and adherence to planned schedules for accelerator run hours and delivered integrated and peak luminosity.

Objective A.3

Provide for effective and efficient program management for a world class research program.

 

Indicator A.3.1

The extent to which effective management programs support the research program.

 

Objective A.4

 

Relevance to DOE missions and national needs.

 

Indicator A.4.1

The laboratory successfully contributes to DOE missions and programs of national importance.

 

 

LEADERSHIP

 

Critical Outcome:   Provide the leadership to ensure operational excellence and foster responsible stewardship of the DOE resource.

 

Objective B.1

 

URA provides the integrated management and leadership necessary to enhance the operations and management processes that are necessary to ensure execution of the Fermilab mission in a safe, effective, and efficient manner.

 

Indicator B.1.1

 

URA directs management reviews that result in an overall assessment of key Fermilab operations functions and management systems.

 

Indicator B.1.2

URA management promotes operational and management system excellence.

 

 

SELF-ASSESSMENT

 

Critical Outcome:   The self-assessment process will evaluate Fermilab’s ability to meet critical outcomes and meet performance objectives, measures and expectations, and to control its processes.

 

Objective E.1

 

A self-assessment is undertaken for all organizational elements of the laboratory.

 

Indicator E.1.1

 

Peer reviews are used as a tool to evaluate the program elements, Divisions and Sections.

 

Indicator E.1.2

A self-assessment is developed for the systems and activities of all Divisions and Sections.

 

 

COMMUNITY INVOLVEMENT

 

Critical Outcome:   The laboratory is regarded as a good corporate citizen and conducts its affairs in a manner that leads to public confidence in the laboratory.

 

Objective F.1

 

Work with customers, stakeholders, and neighbors in an open, frank and constructive manner.

Indicator F.1.1

 

Development and implementation of an effective communications and community involvement plan.

Indicator F.1.2

Design two new publications.

 

 

Additional metrics to be assessed:

 

Intellectual Property:

 

Objective G.1

Fermilab promotes utilization and development of inventions and discoveries in support of the Laboratory’s science and technology transfer missions.

 

Measure G.1.1

 

Timeliness of invention administration, including:  submission of invention disclosures, election of title, filing and confirmatory licenses.

 

Metric G.1.1.1

 

Percent of invention disclosures filed on time.

 

Outstanding     Excellent      Good           Marginal
≥ 97%           < 97 - 94%     < 94 - 88%     < 88%

 

Measure G.1.2

 

The Laboratory will review subcontract actions in which Intellectual Property (IP) may be developed or utilized and, based upon the status of the subcontractor, will assign the appropriate IP provision to ensure that the Government’s rights in any IP are protected.

 

Metric G.1.2.1

Percent of subcontracts reviewed for IP considerations.

 

 

Outstanding     Excellent      Good           Marginal
≥ 97%           < 97 - 94%     < 94 - 88%     < 88%

 

 

Financial Management

 

Objective H.1

 

A financial system that is sound and responsive, has economical financial management programs to safeguard DOE financial assets, and supports an aggressive laboratory-wide overhead management program.

 

Indicator H.1.1

 

Uncosted balances are maintained at levels consistent with responsible financial management.

 

Measure H.1.1.1

 

Percentage of uncosted balances to funds received in financial plan (for subsequent funds placed in the financial plan, contractor and DOE will mutually agree on appropriate balances applicable under this measure).

 

Metric H.1.1.1.1

 

A.      Operating

B.     Capital Equipment and Construction

 

        Outstanding     Excellent     Good         Marginal     Unsatisfactory
A       0 - 6%          7 - 8%        9 - 11%      12 - 14%     > 14%
B       0 - 29%         30 - 40%      41 - 46%     47 - 50%     > 50%
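For illustration only (not part of the contractual metric), Metric H.1.1.1.1 can be sketched as a band lookup; the upper band edges come from the table above, while the handling of fractional percentages that fall between integer bands is an assumption:

```python
# Upper edge of each band, per the Metric H.1.1.1.1 table (row A and row B).
OPERATING_BANDS = [(6, "Outstanding"), (8, "Excellent"), (11, "Good"), (14, "Marginal")]
CAPITAL_BANDS = [(29, "Outstanding"), (40, "Excellent"), (46, "Good"), (50, "Marginal")]

def uncosted_rating(uncosted_balance: float, funds_received: float, bands) -> str:
    """Rate the percentage of uncosted balances to funds received.

    Illustrative sketch; boundary handling between bands is an assumption.
    """
    pct = 100.0 * uncosted_balance / funds_received
    for upper, rating in bands:
        if pct <= upper:
            return rating
    return "Unsatisfactory"
```

For example, $5M of uncosted operating balances against $100M received (5%) would rate Outstanding.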

 

Measure H.1.1.2

 

The amount of delinquent receivables over 90 and 180 days.

 

Metric H.1.1.2.1

 

95% of receivable dollars at the end of any reporting period are not more than 90 days delinquent, and the remaining 5% of receivable dollars are not more than 180 days delinquent.
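The 95%/5% test above amounts to two dollar-weighted checks, sketched below for illustration only. The data shape (a mapping from days delinquent to dollar amounts) is an assumption, not a contractual format:

```python
def receivables_metric_met(dollars_by_days_delinquent: dict) -> bool:
    """Sketch of Metric H.1.1.2.1: at least 95% of receivable dollars are no
    more than 90 days delinquent, and none are more than 180 days delinquent.

    Keys are days delinquent; values are dollar amounts (assumed shape).
    """
    total = sum(dollars_by_days_delinquent.values())
    if total == 0:
        return True  # no receivables outstanding
    over_90 = sum(v for d, v in dollars_by_days_delinquent.items() if d > 90)
    over_180 = sum(v for d, v in dollars_by_days_delinquent.items() if d > 180)
    return over_180 == 0 and over_90 <= 0.05 * total
```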

 

 

COUNTERINTELLIGENCE

 

Objective I.1

 

 

A counterintelligence program in accordance with applicable DOE orders, policies, and the CH & Fermilab Counterintelligence Agreement.

 

Indicator I.1.1

 

The Laboratory shall meet the notification requirements for contractor employees who travel officially to sensitive countries.  The Laboratory will coordinate with the Chicago Operations Office of Counterintelligence to furnish those travelers with counterintelligence briefings and/or debriefings.

 

Measure I.1.1.1

 

Notify the DOE Office of Counterintelligence at least 30 days before travel to sensitive countries and within 30 days after completing such travel.  Prior notification will occur via the Foreign Travel Management System.  Post-travel notification will occur in conjunction with submission of required trip reports.

 

Metric I.1.1.1.1

 

The Laboratory will meet this requirement at least 90% of the time.

 

Measure I.1.1.2

 

Ensure that all employees receive an annual counterintelligence briefing.

 

Metric I.1.1.2.1

 

The Laboratory shall meet this requirement 100% of the time.

 

 


 

Business Services Section

 

List of Work Processes Identified by the Business Services Section

 

1.      Small procurements
2.      Pro card procurements
3.      Short order procurements
4.      Integrated work orders
5.      Scrap sales agreements
6.      Excess property purchases
7.      “Other” commercial procurements
8.      Commercial services
9.      Modified contracts
10.  Construction procurement
11.  T&M procurement
12.  Non-commercial services
13.  A&E procurement
14.  Source evaluation board
15.  Subcontractor evaluation
16.  Procurement Balanced Scorecard process
17.  Legal support in procurement and subcontract actions
18.  Legal advice and counsel
19.  Oversight of outside retained counsel
20.  Specialized legal support in technical areas
21.  Litigation management
22.  Dispute resolution
23.  Drafting/reviewing Laboratory documents
24.  Provision of DOE-requested legal work products
25.  Receiving
26.  Distribution
27.  Shipping/Traffic
28.  Property Office
29.  Supply
30.  Mailroom
31.  Vehicle Maintenance
32.  Property Balanced Scorecard process
33.  Production system services
34.  Project services
35.  Technical services
36.  Training and administrative services
37.  Voice services
38.  Wireless services
39.  Radio services
40.  Payroll
41.  Accounts payable
42.  Financial reporting
43.  Travel settlement
44.  Accounts receivable
45.  Asset accounting
46.  Treasury operations
47.  Record retention and disposition

 

 

 

Contractual Performance Indicators to be Assessed

 

MISSION SUPPORT

 

Critical Outcome:  Manage and enhance business and management systems, work processes, and facility support to provide an effective and efficient work environment that enables the execution of Fermilab’s mission.

 

Objective D.2

 

Effective and efficient delivery of the best value products and services by Fermilab subcontractors.

 

Indicator D.2.1

 

Top quality subcontractors are selected and effective management and contract administration is provided.

 

 

Additional Metrics to be Assessed

 

Property:

 

Objective J.1

 

Deliver Laboratory support functions in a manner consistent with applicable laws, regulations, and contract terms and conditions.

 

Indicator J.1.1

 

Fermilab’s performance as indicated by the Balanced Scorecard Performance Measurement and Performance Management Program.

 

Measure J.1.1.1

 

Balanced Scorecard Performance Measurement.

 

Metric J.1.1.1.1

 

The Laboratory will develop and submit a FY2003 Balanced Scorecard (BSC) plan, which will be approved by DOE.  The Laboratory will analyze the BSC Plan in order to show progress toward meeting the targets in the four perspectives of the BSC.

 

Metric J.1.1.1.2

 

Percent of improvements implemented that were identified in the opportunities for improvement from the prior performance period.

 

 

Outstanding     Excellent     Good           Marginal
> 95%           95 - 90%      < 90 - 85%     < 85%

 

Indicator J.1.2

 

Fermilab maintains a certified Property Management System.

 

Measure J.1.2.1

 

Fermilab shall assess one third of the processes that are related to maintaining a certified property management system for this performance period.  All related processes shall be assessed by the end of the performance period for FY2004.

 

Metric J.1.2.1.1

 

Pass/Fail.

 

 

Procurement:

 

Objective K.1

 

Deliver best value products and services to Fermilab Procurement customers in a manner consistent with applicable laws, regulations, and contract terms and conditions.

 

Indicator K.1.1

 

Fermilab’s performance as indicated by the Balanced Scorecard Performance Measurement and Performance Management Program.

 

Measure K.1.1.1

 

Balanced Scorecard Performance Measurement.

 

Implementation of the Balanced Scorecard Performance Measurement and Performance Management Program for Federal Procurement and Contractor Purchasing Systems.

 

Metric K.1.1.1.1

 

The Laboratory will develop and submit a FY2003 Balanced Scorecard (BSC) plan, which will be approved by DOE.  The Laboratory will analyze the BSC Plan in order to show progress toward meeting the targets in the four perspectives of the BSC.

 

Metric K.1.1.1.2

 

Percent of improvements implemented that were identified in the opportunities for improvement from the prior performance period.

 

Outstanding     Excellent     Good           Marginal
> 95%           95 - 90%      < 90 - 85%     < 85%

 

Indicator K.1.2

 

Fermilab maintains a certified procurement system.

Measure K.1.2.1

 

Fermilab shall assess one third of the processes that are related to maintaining a certified procurement system for this performance period.  All related processes shall be assessed by the end of the performance period for FY2004.

 

Metric K.1.2.1.1

 

Pass/Fail.

 

 

LEGAL MANAGEMENT

           

Objective L.1

 

Ensure quality, timely, and cost effective legal services; promote the protection and utilization of inventions and Laboratory-generated data in support of the Research and Development (R&D) mission.

 

Indicator L.1.1

 

Management of legal services in an efficient and cost-effective manner that protects the interests of the Laboratory and the Government.

 

Measure L.1.1.1

 

Number of major non-compliances with Contractor’s DOE-approved Legal Management Plan.

 

This measure will be evaluated using the following rating ranges; however, DOE reserves the discretion to factor in an excessive number of minor non-compliances if such non-compliances bring into question the validity of the system.

 

 

Outstanding     Excellent     Good     Marginal
0 - 1           2             3        > 3

 

Measure L.1.1.2

 

Number of work products submitted by the Contractor for DOE approval or use that are not supported by timely, sound, and thoroughly researched legal advice.

 

Metric L.1.1.2.1

Outstanding     Excellent     Good     Marginal
0 - 1           2             3        > 3

 

Environment, Safety & Health

 

List of Work Processes Identified by the Environment, Safety, and Health Section

 

1.      Waste management

2.      Environmental assurance

3.      RP program implementation

4.      Instrumentation and sealed sources

5.      Dosimetry, analytical, and emergency response

6.      Radiation field characterization

7.      ES&H support

8.      ES&H oversight

9.      Medical surveillance programs

10.  Health promotion

11.  Medical case management

12.  Drug free workplace

13.  Routine operations (Fire)

14.  Emergency responses (Fire)

15.  Training (Fire)

16.  General switchboard services

17.  Emergency phone/radio services

18.  Issuance and control of access control devices

19.  Contract security services

 

Contractual Performance Indicators to be Assessed

 

ENVIRONMENT, SAFETY & HEALTH

 

Critical Outcome:   Protect the safety and health of the Fermilab workforce, subcontractors, the community, and the environment in all Office of Science (SC) program activities.

 

Objective C.1

 

Identify and implement enhanced ISMS management systems through the Tripartite Assessment process.

 

Indicator C.1.1

 

Completion of Phases II and III of the Beams Division ISMS Assessment and preparation of any associated corrective action plan and schedule for implementation by 9/30/2003.

 

Indicator C.1.2

 

Completion of the FESS ISMS Assessment and preparation of any associated corrective action plan and schedule for implementation by 9/30/2003.

 

Indicator C.1.3

 

Completion of Phase I of the LSS ISMS Assessment and preparation of any associated corrective action plan and schedule for implementation by 9/30/2003.

 

Indicator C.1.4

Completion of the ESHS ISMS Assessment and preparation of any associated corrective action plan and schedule for implementation by September 30, 2003.

 

Objective C.2

 

Sustain excellence in safety, health and environmental protection.

Indicator C.2.1

Performance against agreed upon metrics (see Section C, ES&H).

 

 

Additional Metrics to be Assessed

 

 

WASTE REDUCTION

 

Objective M.1

 

Minimize waste and promote recycling.

 

Indicator M.1.1

 

Pollution Prevention/Waste Minimization (P2/Wmin) is incorporated into work planning and experimental review.

Indicator M.1.2

 

Employees, line management, and experimenters are involved in identifying and proposing viable P2/Wmin opportunities for projects, experiments, and routine operations.

 

Indicator M.1.3

 

Each Division/Section demonstrates participation in P2/Wmin efforts annually in a fashion that befits its organizational mission.

 

Measure M.1.3.1

 

DOE evaluation of the Fermilab self-assessment for P2/Wmin program implementation.

 

Metric M.1.3.1.1

 

Pass/Fail.

 

ENVIRONMENTAL MANAGEMENT SYSTEMS

 

Objective N.1

 

Demonstrate environmental management and leadership through development and implementation of environmental management systems (EMSs) that will strengthen Integrated Safety Management at Fermilab.

 

Indicator N.1.1

 

By September 30, 2003, the ES&H Section will review guidance on implementation of Executive Order 13148 (see also Contractor Requirements Document for DOE N 450.4), identify the essential elements of EMSs, conduct a self assessment against those elements, develop a gap analysis, and provide the Fermi Area Office a report, a plan of action, and schedule to meet the EMS implementation date of December 31, 2005.

 

Measure N.1.1.1

 

DOE evaluation of the Fermilab EMSs gap analysis, plan of action, and schedule for implementation.

 

Metric N.1.1.1.1

 

Pass/Fail.

 

 

SAFEGUARDS AND SECURITY

 

Objective O.1

 

 

Implementation of an integrated safeguards and security management (ISSM) program to ensure internal monitoring of compliance and performance with safeguards and security requirements.

 

Indicator O.1.1

 

The Laboratory Self-Assessment Program adequately addresses all applicable topical areas of the ISSM program, as determined by DOE evaluation of Fermilab’s self-assessment of the adequacy of the Safeguards and Security Program.

 

Measure O.1.1.1

 

Self-assessment documentation reflects how ISSM program elements were evaluated and the resultant evaluation of the elements.

 

Measure O.1.1.2

 

Corrective actions or compensatory measures for deficiencies that involve nuclear materials or security interests are implemented immediately.

 

Measure O.1.1.3

 

Corrective actions are monitored until resolved.

Metric O.1.1.3.1

 

Pass/Fail.  

 


 

Facilities Engineering Services Section

 

List of Work Processes Identified by the Facilities Engineering Services Section

 

1.                  Monthly Group Performance Review

2.                  Monthly Financial Performance Review

3.                  New Employee Training

4.                  Design through Construct

5.                  Shop Drawing Approval

6.                  Construction Coordination/Task Management

7.                  UIP Project Work Flow

8.                  Energy Management

9.                  Building Efficiency Tracking

10.              Condition Assessment Program

11.              T&M Work Request to Finished Job

12.              Work Order Flow

13.              Maintenance Reporting and Analysis

14.              Preventive Maintenance

15.              Journeymen Hiring Selection

16.              Fire Alarm Disablement

17.              Stores Inventory Tracking

18.              Extraordinary Maintenance Tracking

19.              Computer Hardware Inventory Tracking

20.              Computer Support Work Order Tracking

21.              Performance Contracting (Janitorial)

22.              Land Management

23.              Pesticide Application

24.              Safety (ISM)

 

Contractual Performance Indicators to be Assessed

 

MISSION SUPPORT

 

Critical Outcome:   Manage and enhance business and management systems, work processes, and facility support to provide an effective and efficient work environment that enables the execution of Fermilab’s mission.

 

Objective D.1

 

Establish and maintain a dependable facilities base from which particle physics and other Fermilab programs can be safely accomplished without interruption.

 

Indicator D.1.1

 

The infrastructure is maintained to support operations in a safe, environmentally responsible, and cost-effective manner.

 

Indicator D.1.2

 

Infrastructure project milestones, as approved in Construction Directives, are completed.

 

Indicator D.1.3

 

Energy Management initiatives are managed consistently with a comprehensive energy management plan that includes the minimum requirements of DOE Order 430.2A.

 

 

Additional Metrics to be Assessed

 

REAL ESTATE MANAGEMENT

 

Objective P.1

 

Effective and efficient real estate management.

 

Indicator P.1.1

 

Management systems accurately reflect the classification and square footage of DOE facilities.

 

Measure P.1.1.1

 

Square footage reported in the Energy Management System 4 (EMS4) database reconciles with square footage reported in FIMS.

 

Metric P.1.1.1.1

 

 

Metric                                     Outstanding    Excellent     Good          Marginal      Unsatisfactory
1: EMS4 equal to or less than FIMS        100 - 96%      95 - 91%      90 - 86%      85 - 81%      < 81%
2: EMS4 equal to or greater than FIMS     100 - 104%     105 - 109%    110 - 114%    115 - 119%    > 119%
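Because the rating depends on whether EMS4 square footage is below or above the FIMS figure, the two metric rows above form a single two-sided lookup. The sketch below is illustrative only (not part of the contract), and the handling of values exactly on a band edge is an assumption:

```python
def ems4_fims_rating(ems4_sqft: float, fims_sqft: float) -> str:
    """Rate EMS4-vs-FIMS square-footage agreement per Metric P.1.1.1.1.

    Illustrative sketch; boundary handling between bands is an assumption.
    """
    pct = 100.0 * ems4_sqft / fims_sqft
    if pct <= 100:  # Metric row 1: EMS4 equal to or less than FIMS
        for lower, rating in [(96, "Outstanding"), (91, "Excellent"),
                              (86, "Good"), (81, "Marginal")]:
            if pct >= lower:
                return rating
        return "Unsatisfactory"
    # Metric row 2: EMS4 greater than FIMS
    for upper, rating in [(104, "Outstanding"), (109, "Excellent"),
                          (114, "Good"), (119, "Marginal")]:
        if pct <= upper:
            return rating
    return "Unsatisfactory"
```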

 

 

FACILITY MAINTENANCE AND ENGINEERING

 

Objective Q.1

 

Efficient and effective facility management.

 

 

Indicator Q.1.1

 

Maintenance Investment Index.

 

Measure Q.1.1.1

 

The Maintenance Investment Index (MII) is calculated by dividing the total fiscal year contractor-funded maintenance for active conventional facilities by the same fiscal year replacement plant value (from FIMS) of those facilities, and multiplying by 100 to express the index as a percentage.  Conventional facilities are defined as buildings and utilities as identified in FIMS, with a Replacement Plant Value (RPV) of $460M in FY02 (this value will change as RPVs are escalated, new assets are capitalized, and old assets are excessed).  Determination of RPV will be shown with each MII calculation.  Contractor maintenance is defined as all funding from laboratory overhead used to sustain property and equipment in a condition suitable for its designated purpose.
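The MII arithmetic above can be stated directly; the sketch below is illustrative only, and the $6.44M maintenance figure in the example is hypothetical (the $460M RPV is the FY02 value cited above):

```python
def maintenance_investment_index(contractor_maintenance: float,
                                 replacement_plant_value: float) -> float:
    """MII = (FY contractor-funded maintenance / FY replacement plant value) x 100."""
    return 100.0 * contractor_maintenance / replacement_plant_value

# Hypothetical example using the FY02 RPV of $460M:
mii = maintenance_investment_index(6_440_000, 460_000_000)  # 1.4%
```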

 

Metric Q.1.1.1.1

 

 

Outstanding     Excellent       Good           Marginal       Unsatisfactory
> 1.4%          1.39 - 1.3%     1.29 - 1.2%    1.19 - 1.1%    < 1.1%

 

 

Definitions:

 

 

Conventional Facilities - all facilities for which maintenance is the responsibility of the laboratory's site-wide facility maintenance organization and is funded from overhead or space charges.  FIMS will be modified to allow these facilities to be identified.  This includes the conventional portion of programmatic facilities where the maintenance responsibility has been assigned to the site-wide facility maintenance organization.  Note that items classified as personal property or items inventoried as "capital equipment" are not part of real property assets.

 

 

Active Conventional Facilities - the subset of conventional facilities not designated excess in the FIMS database.  Excess facilities include those currently designated excess and those to be excessed within the next three years.

 

 

Programmatic (Non-conventional) Facilities - those facilities for which the maintenance responsibility is assigned to a research division or department.  

 

 

Contractor Funded Maintenance - maintenance funding from laboratory overhead or space charges. 

 

 

Capital Funded Maintenance (or "GPP/GPE/Line Item Funded Maintenance") - that portion of a GPP/GPE or line item project that accomplishes maintenance, repair, and system/component replacement needs; these are usually previously identified as deferred maintenance activities.

 

 

Rehab and Improvement Costs (RIC) - the total of all rehab and improvement costs, including needed function or capacity upgrades and the costs to bring the facility into compliance with all applicable building codes, ADA/UFAS, Life Safety requirements, etc., as well as the costs to make facilities suitable for planned mission needs.  RIC do not include deferred maintenance costs.  FIMS will be modified to capture this data.  

 

 

Maintenance - the upkeep of property and equipment; work necessary to realize the originally anticipated useful life of a fixed asset.  Maintenance includes periodic or occasional inspection; adjustment, lubrication, and cleaning (non-janitorial) of equipment; replacement of parts; painting; resurfacing; and other actions to assure continuing service and to prevent breakdown.  Maintenance does not prolong the design service life of the property or equipment, nor does it add to the asset's value.  However, lack of maintenance can reduce an asset's value by leading to equipment breakdown, premature failure of a building's subsystems, and shortening of the asset's useful service lifetime.  Repair is work to restore damaged or worn-out property to a normal operating condition.  Repairs are curative, while maintenance is preventative.  (Generally expense funded)

 

 

Replacement - of an item that is part of the permanent investment of plant and equipment is an exchange or substitution of one fixed asset for another having the capacity to perform the same function.  Replacement may arise from obsolescence, cumulative effect of wear and tear throughout the anticipated service lifetime, premature service failure, or destruction through exposure to fire or other hazard.  In contrast to repair, replacement generally involves a complete identifiable item of investment (i.e., a major building component or subsystem).  When major building subsystems fail, a building owner may sometimes have a choice of repair or replacement of that subsystem.  Replacement is typically funded in maintenance and repair budgets.  (Generally capital funded)   Note: Does not include total renovations or new buildings to replace old buildings.

 

Indicator  Q.1.2

 

Assess the effectiveness of construction safety programs for work done in Divisions/Sections by Time and Materials and Fixed-Price contractors.

 

Measure Q.1.2.1

 

DOE evaluation of Fermilab’s self-assessment of the effectiveness of this specific element of the construction safety program.

 

Metric Q.1.2.1.1

 

Pass/Fail.

 


Laboratory Services Section

 

List of Work Processes Identified by the Laboratory Services Section

 

1.      Food services – Fermilab generated Monthly Cost Center report

2.      Food services – Providers Monthly Operating Statements

3.      Maintenance of food service equipment

4.      Food service quality

5.      Housing reservations

6.      Housing financial reporting

7.      Housing billing and payment collection

8.      Cleaning and maintaining housing

9.      Benefits reporting requirements

10.  Monitoring benefit costs

11.  Communication of benefit information to employees

12.  Health plan cost containment

13.  FMLA

14.  COBRA

15.  Employee compensation

16.  Visa administration

17.  HRIS and Employee records

18.  Labor Relations

19.  Performance Management

20.  Employee Discipline

21.  Policy development and communication

22.  Service and recognition awards

23.  Employment termination

24.  Orientation process

25.  Recruitment and hiring

26.  Promote a diverse workforce

 

27.  Internal and external complaint resolution

28.  Diversity training

29.  Assessing need for ILG

30.  Designing ILG program

31.  Applying for ILG program accreditation

32.  Offering ILG program

33.  Evaluating ILG program

34.  ILG reporting requirements

35.  Technical publication process

36.  Materials acquisition process

37.  Serials management process

38.  Interlibrary loan process

39.  Web development process

40.  Bibliographic classification process

41.  On-site class administration

42.  Tuition reimbursement process

43.  Assessment of Fermilab training needs

44.  Tracking training

45.  Training cost and charge backs

46.  Training evaluation

47.  Photography and digital imaging output services

48.  Video tape duplication services

49.  VMS production services

50.  Duplication services

51.  Travel estimates

52.  Travel reservations

53.  Travel ticketing

54.  Balanced Scorecard process

 

 

 

 

 

 

 

 

 

 

 

 

Additional Metrics to be Assessed

 

HUMAN RESOURCES

 

Objective R.1

 

Fermilab implements a Human Resources (HR) performance system which contains goals tied to the organizational mission and which provides feedback on the impact of, and value added by, HR.

 

Indicator R.1.1

 

Fermilab’s performance as indicated by the Balanced Scorecard Performance Measurement and Performance Management Program.

 

Measure R.1.1.1

 

FY 2003 Balanced Scorecard (BSC) Plan for Human Resources.

 

 

The Laboratory will analyze the FY2003 Balanced Scorecard (BSC) plan in order to show progress toward meeting the goals listed below: 

 

 

DOE PERFORMANCE MEASURE                                              GOAL

Financial
  Recruiting cost                                                    ≤ $2,600
  Market ratio of salaries                                           ± 5%

Customer
  Rewrite the Long-Term Disability, Travel, Severance, Medical,
    Dental Summary Plan Descriptions                                 By 1/15/03
  Revise the Personnel Policy Guide                                  By 8/31/03
  Review and decide whether we are able to implement education
    savings program for children of employees for college tuition   By 4/30/03
  Hold lunchtime brown bag HR Q&A sessions                           Twice per FY03

Internal Business
  Time to hire from requisition posting date to acceptance date     ≤ 150 days
  Acceptance rate for hiring                                         ≥ 93%
  Test and implement PeopleSoft 8                                    By 5/30/03
  Explore self-serve personnel information change system with
    implementation of PeopleSoft 8                                   By 9/30/03
  Automate HMO Illinois eligibility process                          By 6/30/03
  Review of job descriptions for current accuracy, market
    competitiveness and internal equity                              ≥ 20%

Learning and Growth
  Develop and implement informational sessions regarding visa
    administration for supervisors and primary users                 By 4/30/03
  Implement interviewing training                                    First class by 11/30/02

 

 

Metric R.1.1.1.1

Number of goals met:

 

Outstanding: > 12
Excellent: 11 - 12
Good: 10
Marginal: 9
Unsatisfactory: < 9

 

 

 

TRAINING:

 

Objective S.1

 

Employees receive appropriate training.

 

Indicator S.1.1

 

Evaluate ES&H training needs and deliver training to employees.

 

Measure S.1.1.1

 

Supervisors will complete an ES&H-related training needs assessment, and identify required training for each employee.

 

Metric S.1.1.1.1

 

For the reporting period (October 1, 2002 - September 30, 2003), the following performance levels will be applied to the percentage of training plans developed.

 

 

Outstanding: > 95%
Excellent: 95% to > 90%
Good: 90% to 85%
Marginal: < 85%

 

Measure S1.1.2

 

The contractor will verify ES&H training completion to ensure contract work is performed safely and effectively.

 

Metric S.1.1.2.1

 

For the reporting period (October 1, 2002-September 30, 2003), the following performance levels will be applied to the percentage of required training completed.

 

 

Outstanding: > 95%
Excellent: 95% to > 90%
Good: 90% to 85%
Marginal: < 85%

 

Diversity:

 

Objective T.1

 

A diverse professional workforce.

 

Indicator T.1.1

 

Diversity in the workforce is increased or maintained as compared to the prior measurement period.

Measure T.1.1.1

 

Offers in the Professional Job Group made to women and underrepresented minorities, using FY 2001 as a baseline.

Metric T.1.1.1.1

Percent increase in offers to women in Professional Job Groups.

 

Outstanding: > 14.9
Excellent: 14.9 - 14.5
Good: < 14.5 - 14.1
Marginal: < 14.1

 

Metric T.1.1.1.2

Percent increase in offers to minorities in the Professional Job Groups.

 

Outstanding: > 4.9
Excellent: 4.9 - 4.5
Good: < 4.5 - 4.1
Marginal: < 4.1

 

 

Science and Technology Information

 

Objective U.1

 

Increase the number of electronic deliverables submitted to the Office of Scientific and Technical Information (OSTI).

 

Measure U.1.1

 

Percentage of Scientific and Technical Information deliverables sent to OSTI electronically.

 

Metric U.1.1.1

Pass/Fail:  Pass > 95%.

 


 

Beams Division

 

List of Work Processes Identified by the Beams Division

 

1.      Travel authorization, arrangements, and closeout

2.      Personnel Requisition Processing

3.      Personnel Promotion

4.      Personnel transfers and temporary assignments

5.      Merit adjustments

6.      Disciplinary actions

7.      Vendor selection

8.      P.O.  Specifications

9.      P.O. authorization (signature chain)

10.  Receiving

11.  Quality Assurance.

12.  Budgeting

13.  Program manpower requirements

14.  Accounting

15.  Policy and Procedure Dissemination and Monitoring

16.  ITNA (Individual Training Needs Assessment)

17.  Training modules

18.  Qualification database (verification)

19.  Environmental monitoring

20.  Permits

21.  Waste minimization (radioactive waste, hazardous waste, and industrial waste)

22.  Operations training

23.  Operations policies

24.  Operations staffing levels

25.  Run strategies

26.  Run scheduling

27.  Run studies (planning and results)

28.  Run improvements

29.  Run reviews (external and internal)

30.  Off hours

31.  Unusual occurrence

32.  Off normal occurrence

33.  Performance statistics

34.  Downtime logs.

35.  Run Permits

36.  Beam permits

37.  Cool down permits.

38.  AIP process

39.  Project Reviews

40.  Engineering Design Reviews

41.  Facility Maintenance Planning

 

 

 

Computing Division

 

List of Work Processes Identified by the Computing Division

 

1.      Hiring - recruit and hire trained personnel required to perform division functions

2.      Personnel evaluation - provide evaluation, salary review, promotion and professional development opportunities to division personnel

3.      Budget (planning) - provide input to lab management of yearly budgetary needs and provide budget guidance to division administrative units

4.      Budget (execution) - track expenditures during the course of the budget year and adjust plans based on available resources and developing needs

5.      Long range planning (strategy sessions) - forecast needs and plan future projects in collaboration with various experiments

6.      Procurement - obtain goods and services for division projects in an optimized manner

7.      Building and property management - manage the division real estate, in  particular the Feynman Computing Center, and provide office space and  furniture for division employees

8.      Administrative computing support (MISCOMP) - providing hardware and software systems to manage division administrative information systems including procurement, budgeting, and various databases

9.      Administrative support (travel) - providing administrative and clerical support for division personnel

10.  Safety and health - carry out all division business consistent with safety, carry out lab safety policies, respond to employees safety concerns

11.  Project management of large projects (external reviews) - provide necessary oversight and external and internal review of large multiyear projects

12.  Email - provide centralized email services to lab employees and users, including managing mail servers, virus checking, and managing mailing lists

13.  Web - providing central web servers to present lab information as well as guidance for individuals running their own web servers

14.  Data center operations - operation of the central data processing facilities in the Feynman Computing Center, detecting and reporting problem conditions

15.  Help desk - providing facilities to respond to user questions and problem reports and refer issues to the appropriately qualified experts

16.  User training and providing information - provide external training opportunities, conduct training classes and brown bag seminars, and make technical information widely available through the web and other documentation tools

17.  Software product support - provide mechanisms for managing and distributing homemade, commercial, and open source software, and provide different defined levels of software support for the most widely used products.  In particular this includes C++ compiler support and consulting

18.  Physics computing support - provide support for a variety of physics software products, from commercial, homemade and open software sources, including such areas as simulation tools, analysis tools, and graphics tools

19.  Mass storage - provide hardware and software systems for mass storage and retrieval of large amounts (hundreds of Terabytes) of experimental data, including robotic storage and provision for exporting and importing data

20.  Data access - providing the needed hardware and software for online disk storage of experimental data, including backup facilities

21.  Offline batch processing - providing necessary central computing and dedicated computing for specific experiments for offline data analysis

22.  Interactive processing - providing necessary central computing and dedicated computing for specific experiments for interactive use, software development and testing

23.  Farm processing - providing necessary central computing and dedicated computing for specific experiments for large scale scheduled data reconstruction on farms of commercial commodity computers

24.  Computer security - develop and enforce policies to protect lab computing resources against attacks, scan the site for computer security vulnerabilities, and operate a response team for computer security emergencies

25.  Desktop support/PC - provide support for PC desktops, including operation of the main Windows Domain and provide contracted support for various lab entities

26.  Desktop support/Unix - provide support for Unix desktops, especially Linux, including maintaining standard Fermi versions of operating system releases for installation by local system administrators and providing general system management

27.  Networking/internal - managing internal Fermilab networks, including running centralized domain name servers and other central facilities, and providing DHCP and wireless services.  This includes the design of network topology, granting of IP addresses, and operation of routers, bridges and site border controls

28.  Networking/external - maintaining all Fermilab network connections with the outside world, monitoring the performance of these networks, planning for upgrades, and representing the lab with external bodies (such as ESNET) providing connectivity

29.  CDF support (including task force) - provide dedicated technical support for the design, procurement, operation and support of computing facilities for the acquisition and analysis of data for the CDF experiment, including  support for dedicated desktop clusters, batch and interactive computer servers, networking, and data access and storage facilities

30.  D0 support (including task force) - provide dedicated technical support for the design, procurement, operation and support of computing facilities for  the acquisition and analysis of data for the D0 experiment, including support for dedicated desktop clusters, batch and interactive computer servers, networking, and data access and storage facilities

31.  BTeV support - provide dedicated technical support for the design, procurement, operation and support of computing facilities for the acquisition and analysis of data for the BTeV experiment, including support for dedicated desktop clusters, batch and interactive computer servers, networking, and data access and storage facilities

32.  SDSS support - provide dedicated technical support for  the design, procurement, operation and support of computing facilities for  the acquisition and analysis of data for the Sloan Digital Sky Survey and  other experimental astrophysics initiatives, including computing for data  acquisition and monitoring at the experimental site, data processing  facilities, and facilities to make data available to a wider community.

33.  CMS support - provide dedicated technical support for the design, procurement, operation and support of computing facilities for the acquisition and analysis of data for the CMS experiment, including operation of a Tier 1 Regional Center as part of the worldwide distributed CMS computing facility

34.  Electronic equipment support and maintenance - maintain a pool of electronics modules for use by experiments, and provide instrument repair services for both commercial and home-made electronic products.  Hardware and software maintenance contracts - negotiate and administer various contracts with external vendors to provide hardware and software maintenance and software licensing for the laboratory community, including managing a charge-back mechanism to recover costs from other lab administrative entities

35.  Electronic engineering support of experiments - provide the expertise to design, construct, operate and maintain a variety of electronics modules for use in physics experiments

36.  Online computing support - provide tools and support for data acquisition and online computing for a variety of experiments, including the development of standard and customized products

37.  Database support - provide common database solutions for scientific, technical and administrative databases, including negotiating contracts with vendors and aiding users in developing custom solutions

38.  Technology assessment - provide up to date projections for future technology in the computing arena, giving guidance to the laboratory community on directions to pursue

 

Additional metrics to be assessed

 

Cyber Security

 

Indicator V.1.2

 

Successful completion during FY 2003 of a Self-Assessment and Peer Review of Fermilab's Computer Security Program.  Fermilab will address implementation of all recommendations resulting from the Review in a response provided to the Fermilab Area Office Manager.

 

Measure V.1.2.1

 

Complete said Review.

Metric V.1.2.1.1

Pass/Fail.

 

Pass: Complete the Review.
Fail: Failure to complete the Review.

 

Measure V.1.2.2

 

Address all recommendations in a response to the Fermilab Area Office manager after said Review.

Metric V.1.2.2.1

Pass/Fail.

 

Pass: Address all Review recommendations.
Fail: Failure to address all Review recommendations.

 

Indicator V.1.3

 

Perform an independent cyber security review of the Business Services Critical System.

 

Measure V.1.3.1

 

Perform an independent cyber security review of the Business Services Critical System in CY 2003.

 

Metric V.1.3.1.1

Pass/Fail.

 

Pass: Perform an independent cyber security review of the Business Services Critical System.
Fail: Failure to perform an independent cyber security review of the Business Services Critical System.

 

 

 


Particle Physics Division

 

List of Work Processes Identified by the Particle Physics Division

 

1.      Review of the PPD Operating Manual  (Division Office)

2.      ProCard Use in PPD (Division Financial Support Group)

3.      PPD Effort-Reporting (Division Financial Support Group)

4.      PPD Financial Reports (Division Financial Support Group)

5.      Foreign Travel (Division Office)

6.      Review the Mechanical Department’s CDF Operations Group log (Mechanical Dept)

7.      Review the Mechanical Department’s DZero Operations Group log  (Mechanical Dept)                                                                             

8.      Mechanical Department Co-op Program (Mechanical Dept)

9.      Design / Drafting Group work process (Mechanical Dept)

10.  Uptime on the Computer Aided Design Server (Mechanical Dept)

11.  Log of Lifting Fixtures (Mechanical Dept)

12.  Log of Pressure Vessels (Mechanical Dept)

13.  Review the Electrical Engineering Department’s CDF Support Group log (Elect Eng Dept).

14.  Review the Electrical Engineering Department’s DZero Support Group log  (Elect Eng Dept).

15.  Electrical Engineering Department’s Experiment Assembly Group Work Request Records (Elect Eng Dept)

16.  Printed Circuit Board Drafting Group (Elect Eng Dept)

17.  Electrical Engineering  (Elect Eng Dept).

18.  Electrical Engineering ASIC design and testing schedules (Elect Eng Dept)

19.  Alignment Group Work Log (Technical Centers Department)

20.  Silicon Detector Facility Infrastructure (Technical Centers Department).                   

21.  Scintillating Plastic Extrusions (Technical Centers Department).                  

22.  Integrated Safety Management (ISM) (ES&H / Building Management Services Dept)  

23.  ESHTRK Database findings (ES&H / Building Management Services Dept)

24.  Operational Readiness Clearances (ORC) (ES&H / Bldg Manage Services Dept., Division Office)

25.  Recordable Injury Case Actions (ES&H / Building Manage Services Dept., Division Office)                           

26.  ITNA status (ES&H / Building Manage Services Dept)

27.  ES&H Training (ES&H / Building Manage Services Dept)

28.  Building maintenance (ES&H/Building Management Services Dept)

29.  Site Support of PPD areas (Site Department)

30.  Rehabilitation & Reuse, Demolition & Disposal, PPD Clean-up Activity (Site Dept, ES&H/Building Management Services Dept, Division Office)

31.  Storage Space Management (Site Dept, ES&H/Building Manage Services Dept, Division Office)

32.  Public Outreach with Experimental Apparatus Displays (Site Dept, ES&H/Building Manage Services Dept, Division Office)

33.  Domestic Travel (Support Services Department)                                      

34.  Conference Support (Support Services Department).

35.  US Particle Accelerator School (Support Services Department)

36.  Desktop Computing Operations (Support Services Department)

37.  PPD Guest and Visitor Program for Experimentalists (Experimental Physics Projects Dept)

38.  Experimentalist Research Associate Hiring (Exp Physics Projects Dept)

39.  Educational Outreach (Experimental Physics Projects Dept)

40.  Seminars and Lectures (Experimental Physics Projects Dept)

41.  CDMS Construction Project and Operations (Experimental Physics Projects Dept)                                   

42.  MiniBooNE Operations (Experimental Physics Projects Dept).                                   

43.  Experimental Scientific Output (Experimental Physics Projects Dept)

44.  CDF Detector “Uptime” (CDF Operations Department)

45.  CDF Detector Procedures (CDF Operations Department)

46.  CDF Run IIb Upgrade (CDF Run IIb Project Department).                 

47.  Fermilab Participation in the CDF Collaboration (CDF Department).

48.  DZero Detector “Uptime” (DZero Department)

49.  DZero Detector Procedures (DZero Department)

50.  DZero Run IIb Upgrade (DZero Department).                 

51.  Fermilab Participation in the DZero Collaboration (DZero Department)

52.  CMS Construction Project (CMS Department)

53.  CMS Maintenance and Operations (CMS Department)

54.  CMS and Fermilab’s Host Laboratory Role (CMS Department and the Division Office)

55.  Fermilab Participation in the CMS Collaboration (CMS Department).

56.  MINOS portion of the NuMI Construction Project (MINOS Dept)

57.  Operating Support of the Soudan Underground Laboratory (MINOS Department)

58.  Fermilab Participation in the MINOS Collaboration (MINOS Department).

59.  Theory Research Associate Hiring (Theoretical Physics)

60.  Theory Seminars and Lectures (Theoretical Physics)

61.  Theoretical Physics Scientific Output (Theoretical Physics)

62.  Astrophysics Research Associate Hiring (Theoretical Astrophysics)

63.  Theoretical Astrophysics Seminars and Lectures (Theoretical Astrophysics)

64.  Theoretical Astrophysics Scientific Output (Theoretical Astrophysics)


 

 

Technical Division

 

List of Work Processes Identified by the Technical Division

 

1.      Office Management

2.      Financial Management

3.      Personnel Management

4.      Project Management

5.      Accelerator Component Management

6.      Records Management

7.      Magnet Test Facility (MTF) Operation

8.      MTF Measurement Operation

9.      Procurement of Accelerator and Detectors Components

10.  Quality Control Verification of Procured Components to Specifications

11.  Accelerator Component/Parts Distribution

12.  General Materials & Service Procurement

13.  Accelerator Component & Tooling Storage Management

14.  Pro Card Procurement System

15.  Accelerator Component Production Tooling Mgmt

16.  Accelerator Component Production QA Management & Documentation

17.  Operational Readiness Clearance Review & Approval

18.  Machine Shop Operation & Support

19.  Machine Repairs Management

20.  Machine Auto CAD

21.  Computer Information Systems & Technical Support

22.  Computer Security Audit

23.  Research & Development Program

24.  Design, Create, Review, and Document

25.  Linear Collider R&D

26.  High Field Magnet Program R&D

27.  QA Guidance and Oversight

28.  Annual QA Internal Self Assessment of TD Project/ Department Activity

29.  Internal Self Assessment Program

30.  Tripartite Assessment Program

31.  OSHA Type Inspection

32.  ES&H Quarterly Walkthroughs

33.  Safety Audits

34.  Employees Grass Roots Safety Committees

35.  ES&H Training

36.  Construction Safety/Oversight

37.  Radiation Oversight

38.  Environmental Oversight

39.  Infrastructure Scheduled Maintenance & Improvements

40.  Construction Oversight & Management

41.  Physical Plant Property Management

42.  Plant Security


 

Attachment 1

______________________________________________________

 

TYPICAL EVALUATION SCHEDULE

_______________________________________________________

 

7/1/FY-1[4]        Functional area experts from both DOE and Fermilab develop proposed version of PBMMs.

 

9/1/FY-1         Proposed PBMMs due to the Fermi Area Office Manager.

 

10/1/FY[5]         DOE transmits final PBMMs to Fermilab and evaluation period starts.

 

5/15/FY           Fermilab reports to DOE on mid-year status.

 

9/30/FY           Evaluation period ends.

 

10/1/FY+1[6]     Fermilab initiates tabulation process.

 

11/15/FY+1    Fermilab submits to DOE its self-assessment based on the PBMMs.

 

12/15/FY+1    DOE develops draft report and transmits to the Contractor.

 

1/15/FY+1      Contractor submits comments on draft report.

 

1/31/FY+1      DOE transmits final report to the Contractor.


Attachment 2

 

Performance Fee

 

 

 

 

 

 

Rating        Science (70%)   Critical Operations* (30%)   Total Available Fee

FY 2002 (1/1/02 - 9/30/02)
Outstanding   $746,900        $320,100                     $1,067,000
Excellent     $560,175        $240,075
Good          $373,450        $160,050
Marginal      0               0

FY 2003
Outstanding   $949,200        $406,800                     $1,356,000
Excellent     $711,900        $305,100
Good          $474,600        $203,400
Marginal      0               0

FY 2004
Outstanding   $949,200        $406,800                     $1,356,000
Excellent     $711,900        $305,100
Good          $474,600        $203,400
Marginal      0               0

FY 2005
Outstanding   $949,200        $406,800                     $1,356,000
Excellent     $711,900        $305,100
Good          $474,600        $203,400
Marginal      0               0

FY 2006
Outstanding   $949,200        $406,800                     $1,356,000
Excellent     $711,900        $305,100
Good          $474,600        $203,400
Marginal      0               0

FY 2007 (10/1/06 - 12/31/06)
Outstanding   $237,300        $101,700                     $339,000
Excellent     $177,975        $76,275
Good          $118,650        $50,850
Marginal      0               0

* Critical Outcomes Fee distribution shall be developed by the Contracting Officer on an annual basis for each Performance Period and will be identified in Attachment 2a.

 

 

Attachment 2a

 

FERMILAB

Critical Outcomes Fee Distribution

(FY2003 - October 1, 2002 - September 30, 2003)

 

 

 

 

 

 

 

 

              Science      Critical Operations Management (30%) - *$406,800
Rating        (70%)        Leadership   E,S,&H      Mission      Self-        Stakeholder
                           (20%)        (30%)       Support      Assessment   Relations
                                                    (30%)        (10%)        (10%)

Outstanding   $949,200     $81,360      $122,040    $122,040     $40,680      $40,680
Excellent     $711,900     $61,020      $91,530     $91,530      $30,510      $30,510
Good          $474,600     $40,680      $61,020     $61,020      $20,340      $20,340
Marginal      0            0            0           0            0            0

        *  Total Critical Operations Fee attainable at an "Outstanding" level of performance.
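The dollar amounts in Attachments 2 and 2a follow directly from the percentage splits.  The sketch below reproduces the FY2003 Outstanding-level arithmetic; the variable names and rounding behavior are illustrative assumptions, not contract terms:

```python
# FY2003 total available fee and shares, per Attachments 2 and 2a.
TOTAL_FEE_FY2003 = 1_356_000
SCIENCE_SHARE = 0.70
OPERATIONS_SHARE = 0.30

# Critical Operations Management sub-allocation percentages (Attachment 2a).
OPERATIONS_SPLIT = {
    "Leadership": 0.20,
    "E,S,&H": 0.30,
    "Mission Support": 0.30,
    "Self-Assessment": 0.10,
    "Stakeholder Relations": 0.10,
}

science_fee = round(TOTAL_FEE_FY2003 * SCIENCE_SHARE)        # $949,200
operations_fee = round(TOTAL_FEE_FY2003 * OPERATIONS_SHARE)  # $406,800

# Reproduce the Outstanding row of the Attachment 2a table.
for area, share in OPERATIONS_SPLIT.items():
    print(f"{area}: ${round(operations_fee * share):,}")
```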

 

 


 

Attachment 3

 

Assessment Forms

 

Fermilab

FY2003 Self-Assessment

Process Assessment Report

for

 

Division/Section______________________

 

Date____________

 

Division/Section performing assessment

 

Self-explanatory

 

 

Name of organization that owns assessed process

 

Department, Group, etc.

 

 

 

Organization Strategy

 

How does the assessed process contribute to the accomplishment of the owning organization’s mission?

 

 

Names of Personnel on Assessment team

 

Self-explanatory

 

 

Name of process assessed

 

                                              Self-explanatory      

 

 

Brief description of process to be assessed

 

Self-explanatory

 

 


 

1.      Are metrics associated with this process?  If so, what are they?

 

List contractual metrics if applicable.  Internal metrics if any.

 

2.      What are the names of the procedures associated with this process?

 

List all procedure names that describe or document this process.

 

3.      Are these procedures being followed? Are they current?

 

 

 

4.      Describe the methodology used to assess this process.

 

 

 

5.      Results of the assessment:

 

 

Answer the following questions in the RESULTS write-up

 

a.       Are the existing process controls adequate?

 

b.      Have any notable practices been identified?

 

c.       Have any major deficiencies been identified?

 

d.       Is the process working effectively?  What improvements can be made?

 

e.       How does current performance compare to last assessment, other similar labs, industry?

 

f.        What are the results for the metrics?

 

g.       Adjectival grade achieved

 

 

 

 

Identified opportunities for improvement

 

Improvement opportunities from the current assessment, or a statement of optimal performance for the process.  Lack of budgeted funds to accomplish improvement actions must be documented.  Projected savings and benefits from improvement actions must be documented.  Also, some improvement actions may not be undertaken due to lack of economic benefit.

 

 


 

Schedule for implementation of improvements

 

Self-explanatory

 

 

Status of improvements from previous assessment

 

Self-explanatory

 

 

Attachments (supporting data, worksheets, reports, etc.)

 

Self-explanatory.  Attachments should be integrated into this document, not submitted as separate files.

 

 

 

 



[1] The Lost Workday Case Rate metrics (C.2.1.3.1 and C.2.1.4.1) take into account the impact of the revised OSHA recordkeeping rules that became effective January 1, 2002.  The metrics have been adjusted in order to normalize the data and continue having meaningful goals.  The net effect of the Laboratory achieving a grade of Outstanding for these metrics will result in the Laboratory attaining a combined rate of approximately 1.2, which is less than the current (February 2003) Office of Science average of 1.5.

[2] The number of project and program reviews may increase as the fiscal year progresses.

[3] The current reporting period is October 1, 2002 through September 30, 2003.

[4] “FY-1” refers to the fiscal year prior to the performance period.

[5] “FY” refers to the performance period fiscal year.

[6] “FY+1” refers to the fiscal year following the performance period.