A Survey of Current Best Practices and Utilization of Standards
In the Public and Private Sectors
Authors: George Brotbeck
Tom Miller
Dr. Joyce Statz
TeraQuest Metrics, Inc.
11/23/99
Survey of Current Best Practices and Utilization Standards
Page 2 Version 0.9
Table of Contents
TABLE OF CONTENTS .......... 2
EXECUTIVE SUMMARY .......... 3
1 SURVEY PURPOSE AND CONTEXT .......... 3
2 HOW THE SURVEY WAS PERFORMED .......... 4
3 KEY FINDINGS .......... 4
    CONSTRUCTION INTEGRITY .......... 4
    PRODUCT STABILITY AND INTEGRITY .......... 4
4 EMERGING TRENDS IN STANDARDS AND BEST PRACTICES .......... 4
APPENDIX A. FEDERAL AND STATE GOVERNMENT RESOURCES .......... 4
Executive Summary
This document presents the results of an extensive survey of the current use of standards and best practices to assure that Information Resource (IR) projects are successfully completed on time, within budget, and with the intended benefits. Federal government, state government, and private sector organizations were included in the scope of this survey. Key standards and critical success factors are identified, with excellent correlation of these to the internal QA guidelines and model procedures being developed for Texas agencies. We conclude with a discussion of trends in the standards and best practices, as well as how state governments are using them. State governments are seen to be following the pattern set in private sector organizations not too long ago, and continuing today.
1 Survey Purpose and Context
Why This Survey Was Done
What is the reason this survey was done?
How should the results be used?
The Legislative Mandate
The Texas Government Code, Chapter 2054, Subchapter G, Sections 2054.151-2054.157 (Information Resources Management Act, IRMA) requires that each state agency “develop and implement its own internal quality assurance (QA) procedures.”1 It has been determined that these should “make use of widely adopted, non-proprietary standards, guides, and templates wherever possible.”2 Accordingly, this survey has been performed to determine what is available in the public and private sectors that could be “useful and adaptable for Texas.”3
What This Document Does
This document provides an overview of the current best practices and standards assuring that IR projects are successfully completed on time, within budget, and with the intended benefits. It summarizes the critical project success factors found in Federal and state agencies, as well as in private sector organizations. These factors set the objectives or desired results that assure a quality project result.
Relationship of this Document to the Internal Quality Assurance Model
The processes, procedures, checklists, guidelines, templates, tools, and other descriptive materials found by the survey have been reviewed, evaluated, and included as input to the core content of the internal QA guidelines and model procedures, published separately.
1 Texas Government Code, Chapter 2054, Subchapter G, Section 2054.151(b)
2 Department of Information Resources Invitation to Negotiate, August 10, 1999, Section 2
3 Ibid.
Guidelines
The appendix shows the results of the public sector search, listing the web site or other source of each artifact we reviewed. Private sector artifacts are not generally available outside the developing organization for competitive reasons. However, TeraQuest has drawn on its experience with many clients in the private sector to incorporate their best practices into the internal QA guidelines and model procedures.
2 How the Survey Was Performed
Where the Information Came From
How was the information collected?
Looking at State and Federal Government Practices
Federal and state government resources were identified using the internet (Appendix A). The web sites of all fifty states were reviewed. The web site of the National Association of State Information Resources Executives (NASIRE) has links to some state sites. However, not all states have web sites listed with NASIRE. In other cases, following related links found additional Information Resources (IR) web sites that proved to be more informative. The nomenclature for naming the agency with IR responsibility varies among states. In some states, it is the Chief Information Officer’s (CIO) site. In others, a technology office exists, while others have something analogous to DIR. Only those states having material that is useful to developing the DIR internal QA guidelines and model procedures are listed in Appendix A. In Section 3, we will discuss the wide disparity that exists among states with respect to software QA practices.
Federal Government sites were usually discovered by following links from state sites. Many Federal sites also have links to other related sites. In Appendix A, the wealth of available material can be seen by observing that some of the listings are described as “collections” of material, rather than lists of individual items, thereby keeping the appendix to a manageable size.
Examining the Private Sector
Private sector artifacts come from three sources:
1. Web sites of professional and non-profit organizations, such as the Project Management Institute and the Information Technology Resources Board
2. The library of materials collected by TeraQuest from its clients over years of doing business in the private sector. This material
was generally collected under non-disclosure agreements that preclude identification of the source.
3. Artifacts developed by TeraQuest that reside in the TeraQuest Process Asset Library. These have been developed using industry standards and best practices in support of training classes and client consulting engagements over several years.
What Is Available vs. What’s Not Available
The amount of information available from the above-mentioned sources is indeed impressive. However, a number of state web sites referenced intranet sites that are not publicly accessible. This suggests that there is potentially much more information available than we were able to consider in this survey. Access to this information can probably best be arranged directly between peer groups within state agencies. States for which this could prove beneficial include
• Florida
• Georgia
• Maryland
• Missouri
• Washington
3 Key Findings
What State Governments and the Private Sector are Doing
How do successful organizations, both public and private, ensure that they build quality information systems that delight the users, on time and within budget?
What standards does the industry provide?
How are other states using these?
What is the private sector doing?
Commonly Used Standards and Benchmarks
In both government and private sectors, there are a small number of standards being used to guide the development of information resource projects:
• the Software Engineering Institute’s Capability Maturity Model®4 for Software
• the Project Management Institute’s Project Management Body of Knowledge (PMBOK)™5
• the Institute of Electrical and Electronics Engineers’ Software
4 Capability Maturity Model is registered in the U.S. Patent and Trademark Office.
5 “PMBOK” is a trademark of the Project Management Institute, Inc., which is registered in the United States and other nations.
Engineering Standards
• the International Organization for Standardization (ISO) 9000 Quality Management and Quality Assurance Standards.
These are briefly described below, followed by a discussion of the current usage of these standards by other states and the private sector.
In 1995, a significant international standard was issued as ISO/IEC 12207, Standard for Information Technology – Software Life Cycle Processes. This standard is gradually being adopted and tailored to local use by various national groups. In the U.S., the standard was issued in three parts in 1996 and 1997, and is now incorporated in the 1999 IEEE Software Engineering Standards.
Software Engineering Institute, Capability Maturity Model for Software
The Capability Maturity Model for Software is one of several Capability Maturity Models (CMMs) developed at the Software Engineering Institute at Carnegie Mellon University since the mid-1980s. In addition to the model for software, there are also CMMs for managing people, for software acquisition, for personal software process, and for systems engineering (done in collaboration with several other industry groups). These models share some features, while content and intended audiences vary.
Each model provides a structured view of its area of focus, generally in a five-layer model of increasingly sophisticated practices for those working in the area. With the exception of the personal software process, each is intended to be used by an organization to improve its overall capability in an incremental way. Each layer of the model provides a basis for continuous improvement in the practices already established, as well as the basis for the next layer of practices.
Project Management Institute (PMI®6), Project Management Body of Knowledge (PMBOK)
The Project Management Institute is a thirty-year-old international organization for project management professionals. It has been instrumental in codifying project management practices, known as the Project Management Body of Knowledge, or PMBOK. The PMBOK “is an inclusive term that describes the sum of knowledge within the profession of project management.”7 International standards groups, such as ANSI and IEEE, are increasingly
6 PMI is a trade and service mark of the Project Management Institute, Inc., which is registered in the United States and other nations.
7 “A Guide to the Project Management Body of Knowledge”, Project Management Institute, Standards Committee, © 1996, page 3.
recognizing this common body of knowledge, comprising both traditional and innovative practices, as a standard for sound project management.
Additionally, the Project Management Institute certifies individuals as Project Management Professionals (PMP®8) through a rigorous program consisting of both experience evaluation and knowledge examination. To date, PMI has certified more than 15,000 individuals worldwide.
Institute of Electrical and Electronics Engineers (IEEE), Software Engineering Standards
The IEEE standards are widely used as models for generating a variety of project artifacts, such as requirements documents, design specifications, test documentation, and project plans. Standards covering software acquisition and development processes, and project measurements, are also included in the collection.
International Organization for Standardization (ISO), ISO 9000, Quality Management and Quality Assurance Standards
The ISO 9000 collection is a suite of standards and guidelines that help organizations implement effective quality systems for the type of work they do. Two items in the collection are most useful to organizations that design and build software:
• ISO 9001 – Quality Systems Model for Quality Assurance in Design, Development, Production, Installation and Servicing
• ISO 9000-3 – Guidelines for the Application of ISO 9001 to the Development, Supply, and Maintenance of Software
ISO 9001 covers the requirements for a quality system that supports the full product life cycle, from initial agreement on a deliverable, through design, development, and support of the product. ISO 9000-3 provides specific advice for how to interpret the standard for developing a quality system of an organization whose product is primarily software. This guideline has been very useful to software organizations, since the original focus of ISO 9000 was for managing manufacturing and process control types of activities, and interpreting the standards for software was sometimes difficult.
State Usage of Standards
Figure 1 shows the results of our investigation of current state use of the standards described above. It was developed by searching each state’s web site for references to the four standards cited here. Because not all information is available on the web, there is some likelihood that usage may be more prevalent than shown in this table.
8 “PMP” and the PMP logo are certification marks of the Project Management Institute, which are registered in the United States and other nations.
Figure 1. – States (Other Than Texas) Using Standards and Industry Best Practices
| Standard/Industry Best Practice | States Using9 |
| --- | --- |
| Software Engineering Institute, Software Capability Maturity Model | Kansas, Michigan, Minnesota, North Carolina, Ohio, Tennessee, Washington |
| Project Management Institute, “A Guide to the Project Management Body of Knowledge”, and/or Project Management Professional certification | California, Missouri, North Carolina, North Dakota, Oregon, Tennessee |
| The Institute of Electrical and Electronics Engineers, software standards | California, Michigan, North Carolina, Tennessee, Washington |
| International Organization for Standardization, ISO 9000-3:1997 “Quality Management and Quality Assurance Standards – Part 3: Guidelines for the Application of ISO 9001:1994 to the Development, Supply, Installation and Maintenance of Computer Software” | California, Washington |
Information Resources Projects – Critical Success Factors
To understand why the standards described earlier are so important to Information Resources (IR) project success, it is necessary to relate their use to commonly observed critical success factors for IR projects. Below are representative studies of best practices and critical success factors in IR projects.
Center for Technology in Government (CTG)
In its report “Tying a Sensible Knot: A Practical Guide to State-Local Information Systems”, CTG identifies nineteen best practices that “should go into the design, development, and operation of any state-local information system.”10 They are:11
1. “Define purpose and scope (of the project)
2. Choose a well-skilled and respected project leader
3. Recruit the right project team
9 States Using means the state has
• publicly identified the best practice or standard as the basis for one or more state practices, or
• artifacts describing practices that contain wording from which compliance with the best practice or standard can be strongly inferred.
10 “Tying a Sensible Knot: A Practical Guide to State-Local Information Systems”, Center for Technology in Government, University at Albany, SUNY, June 1997 (available at www.ctg.albany.edu).
11 Parenthetical comments added by authors.
4. Sell the project to decision makers (based on project benefits)
5. Communicate often and clearly with stakeholders
6. Finance creatively (multiple funding sources)
7. Adopt tools and techniques that can manage complexity (sound management practices)
8. Look for existing models (in both public and private sectors)
9. Understand and improve (business) processes before you apply technology
10. Match the technology to the job
11. Use industry standard technology
12. Adopt and abide by data standards
13. Integrate with related processes and practices
14. Use prototypes to ensure understanding and agreement about design
15. Choose a capable pilot site
16. Make the best use of vendors
17. Train (users) thoroughly
18. Support users
19. Review and evaluate performance (compare system’s actual operational performance to expected benefits.)”
These practices map very closely into the areas covered by the Texas Internal Quality Assurance (QA) Guidelines for
• Creating a Project Plan,
• Monitoring and Controlling a Project,
• Developing a Project’s Expected Benefits and Budget,
• Analyzing and Managing Project Risk,
• Establishing Project Effectiveness/Efficiency Measurements, and
• Evaluating Project Results
Successful deployment and use of these internal QA guidelines should significantly increase the success rate of IR projects, based on the CTG findings.
Ten Factor Model of Critical Success Factors
Pinto and Millet, in their book “Successful Information Systems Implementation”, offer an empirically based model of ten critical success factors for IR projects:12
1. “Project Mission – Initial clearly defined goals and general directions;
12 “Successful Information System Implementation: The Human Side”, Jeffrey K. Pinto and Ido Millet, Project Management Institute, 1999.
2. Top Management Support – Willingness of top management to provide the necessary resources and authority/power for implementation success;
3. Schedule/Plans – Detailed specification of the individual action steps for system implementation;
4. Client Consultation – Communication, consultation, and active listening to all parties impacted by the proposed information system;
5. Personnel – Recruitment, selection, and training of the necessary personnel for the implementation project team;
6. Technical Tasks – Availability of the required technology and expertise to accomplish the specific technical action steps to bring the information system online;
7. Client Acceptance – Act of selling13 the final product to its ultimate intended users;
8. Monitoring and Feedback – Timely provision of comprehensive control information at each stage in the implementation process;
9. Communication – Provision of an appropriate network and necessary data to all key actors in the information system implementation process;
10. Troubleshooting – Ability to handle unexpected crises and deviations from plan.”
Note the close correlation of the ten factors above with the CTG findings. Once again, implementation and use of the internal QA guidelines should significantly enhance the successful completion of agency IR projects.
Software Program Managers Network (SPMN), “16 Critical Software Practices™ for Performance-Based Management”
One of the best sources for lessons learned and best practices is the SPMN web site (Appendix A). The paper named at left “outlines the 16 Critical Software Practices™ that serve as the basis for implementing effective performance-based management of software-intensive projects. They are intended to be used by programs desiring to implement effective high-leverage practices to improve their bottom line measures – time to fielding, quality, cost, predictability, and customer satisfaction – and are for CIOs, PMs, sponsoring agencies, software project managers, and others involved in software engineering.” The sixteen practices are
Project Integrity
1. Adopt Continuous Program Risk Management
13 Italics appear in book.
2. Estimate Cost and Schedule Empirically
3. Use Metrics to Manage
4. Track Earned Value
5. Track Defects Against Quality Targets
6. Treat People as the Most Important Resource

Construction Integrity
7. Adopt Life Cycle Configuration Management
8. Manage and Trace Requirements
9. Use System-Based Software Design
10. Ensure Data and Database Interoperability
11. Define and Control Interfaces
12. Design Twice, Code Once
13. Assess Reuse Risks and Costs

Product Stability and Integrity
14. Inspect Requirements and Design
15. Manage Testing as a Continuous Process
16. Compile and Smoke Test Frequently
For an in-depth discussion of each practice, and its implementation, please see the paper at the web site. However, looking at the first six items, a strong correlation is evident between those items and the internal QA guidelines.
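Among these practices, Track Earned Value is one whose underlying arithmetic is standard enough to illustrate briefly. As a minimal sketch (the function name and sample figures below are ours, not drawn from the SPMN paper), the usual earned-value quantities are planned value (PV, the budgeted cost of work scheduled), earned value (EV, the budgeted cost of work actually performed), and actual cost (AC), which combine into variances and performance indices:

```python
# Illustrative earned-value calculation; names and figures are hypothetical,
# not taken from the SPMN paper.

def earned_value_metrics(pv: float, ev: float, ac: float) -> dict:
    """Return standard earned-value variances and indices.

    pv: planned value (budgeted cost of work scheduled)
    ev: earned value (budgeted cost of work performed)
    ac: actual cost of work performed
    """
    return {
        "cost_variance": ev - ac,      # CV > 0 means under budget
        "schedule_variance": ev - pv,  # SV > 0 means ahead of schedule
        "cpi": ev / ac,                # cost performance index
        "spi": ev / pv,                # schedule performance index
    }

# Example: a project planned to complete $100K of work by now,
# has completed $80K worth of work, and has spent $90K doing it.
m = earned_value_metrics(pv=100_000, ev=80_000, ac=90_000)
print(m)  # CPI below 1 (over budget) and SPI below 1 (behind schedule)
```

An index below 1.0 on either dimension is the kind of early, quantitative warning signal this practice is meant to surface, long before a milestone is visibly missed.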
4 Emerging Trends in Standards and Best Practices
Where Use of Best Practices and Standards is Going
How are standards and best practices evolving?
How is state use of standards and best practices changing?
Evolution of Process Maturity Standards and Frameworks
The SEI’s Capability Maturity Model for Software is currently evolving toward CMM Integration℠ (CMMI℠).14 The integration referred to in the model’s name is the combination of the software and systems engineering CMMs into one model. The CMMI project “is a collaborative effort sponsored by the Office of the Secretary of Defense/Acquisition and Technology (OSD/A&T) with participation by government, industry, and the Software Engineering Institute (SEI). The project's objective is to develop a product suite that provides industry and government with a set of integrated products to support process and product improvement.”15 CMMI is scheduled
14 CMM Integration and CMMI are service marks of Carnegie Mellon University.
15 SEI web site
for release in the mid-2000 time frame. Information is available at the SEI’s web site.
The next efforts to evolve CMMI are expected to focus on software acquisition practices, extending the relatively new and undeveloped Software Acquisition Capability Maturity Model to the realm of systems projects and incorporating it into the integrated model.
Internationally, we expect to see, in the near term, a common framework for performing software process capability assessments. This will be independent of the process improvement framework being followed (of which CMMI is an instance). This standard, ISO/IEC 15504, will provide, if approved, a way of evaluating assessment approaches, assuring that a given approach is a valid measure of software process capability. The standard is currently undergoing field trials.
Evolution of Project Management and Commercial Software Development Standards
The Project Management Institute Standards Committee is continuing to evolve the PMBOK, with extensions specific to information systems currently being drafted and evaluated. Other related elements under development include an Organization Project Management Maturity Model, Project Manager Competencies, a Project Taxonomy, and a Work Breakdown Structure (WBS) Practice Standard. PMI’s web site has a Standards section containing further information.
The IEEE standards collection has been growing rapidly, while integrating other standards into its overall structure. Notable examples of this are the inclusion of PMI’s PMBOK Guide and ISO/IEC 12207, “Standard for Information Technology – Software Life Cycle Processes.” In addition, all of the software standards in the 1999 IEEE collection related to the life cycle processes and work products have been harmonized with the 12207 standard. Each of the relevant standards has either been rewritten to comprehend the processes as defined in 12207, or an appendix has been added to the standard showing how its elements map to corresponding elements of ISO/IEC 12207.
Evolution of Military Software Standards
Most of the software industry standards trace their ancestry to the Department of Defense (DoD) military standards. There has been a lot of change in military standards over the last several years, culminating in the current recommendation that military software development be done with commercial standards, specifically 12207-based standards.
• For many years, the defense industry and other government organizations used Military Standard 2167A, Defense System Software Development, last updated in 1988.
• In the late 1980s, the DoD decided to consolidate 2167A and their information systems standard into a single standard known as Military Standard 498, Software Development and Documentation. As this standard was being completed in the mid-1990s, DoD began working under new defense acquisition regulations that removed acquirer-mandated standards, allowing the supplier to follow commercial standards. 498 was recommended as an interim standard, while an appropriate commercial standard was created.
• In 1995, a joint effort of the IEEE and the Electronics Industries Association (EIA) produced an interim standard for both military and commercial use, known as Joint Standard 16 (J-016, Standard for Information Technology Software Life Cycle Processes, Software Development), based on technical content in ISO/IEC 12207. This functioned as an appropriate standard for military acquisitions for several years, until the 3-volume IEEE/EIA 12207 standard was issued in 1996 and 1997 as the U.S. version of the international standard. At this point, the U.S. Department of Defense recommends use of IEEE/EIA 12207 for work done by defense agencies and as the default for suppliers seeking advice.
Trends in State Government Use of Best Practices and Standards
The table shown in Figure 1 is, perhaps, a little surprising. It indicates a rapidly expanding effort in state governments to establish and maintain control of IR projects. In most cases this has been mandated by legislation. In some, it is being driven by Y2K project requirements. In sifting through the documentation and web site content we have reviewed, several key trends emerge:
1. The SEI’s CMM for Software is an increasingly popular framework for establishing sound IR project processes. Even without formal assessments, state agencies are finding value in the structured approach to establishing good engineering and management practices;
2. The PMBOK is being similarly used to develop internal project management skills and capabilities. While PMP certification is not customarily mandated, there is a lot of emphasis being placed on project management training that is PMBOK-compliant;
3. Both the CMM for Software and the PMBOK make frequent reference to key documentation elements as the basis for establishing and maintaining project control. The IEEE software standards provide readily available models for such documentation. Such models are easily tailored to fit specific organization and/or project needs;
4. Usually, ISO 9000-3 is imperative only for those organizations wishing to do business outside the United States. However, as an external benchmark for quality information systems products and development processes, it offers valuable guidance in establishing appropriate processes and documentation.
The trends mentioned above are very much parallel to what has been going on in the private sector, at a somewhat faster pace of adoption. Driven by unacceptably large numbers of failed information systems projects, and the harsh realities of Y2K compliance, there has been a dramatic increase in the number of private sector organizations that are aggressively pursuing the use of the standards and best practices mentioned here as a core survival strategy.
Appendix A. Federal and State Government Resources
State orAgency URL
DocumentTitle
DocumentType
DocumentDescription
1. Arizona www.gita.state.az.us/ Information TechnologyProject and InvestmentMonitoring
PR Descriptions of how major projects arecentrally monitored and reported on.
2. Arizona www.gita.state.az.us/ Monthly status report BP A monthly report form for monitored projects3. Arizona www.gita.state.az.us/ Policies, Standards, and
ProceduresPY Ten categories of items; none are SW
acquisition or development-related guidance4. Arizona www.gita.state.az.us/ Project Justification and
OversightBP Descriptions of how projects are launched and
reviewed by state oversight team (much likeTexas QAT)
5. Arizona www.gita.state.az.us/ Web Practices Guideline GL under development – set of practices andguidelines for putting material on the Web
6. Arkansas www.dis.state.ar.us/workinggroups/SP_WG/ITPLAN.htm
Strategic InformationTechnology Plan
OT Outlines the goals and objectives of IT in AR
7. Arkansas www.dis.state.ar.us/workinggroups/SP_WG/PS.htm
Technology Policies andStandards
PY Policies about security, privacy,interoperability of systems, technologypurchases, etc. Not software developmentrelated
8. California www.doit.ca.gov/Reports/DecemberReport.asp
Information Technology:Project Initiation andApproval Report
BP A narrative summary of the uniform processfor initiating, approving, and changing state ITprojects
9. California www.doit.ca.gov/SIM Overview of Project PY High level description of what’s expected of
![Page 16: SurveyofBP](https://reader030.vdocuments.mx/reader030/viewer/2022032506/55cd0f20bb61eb39288b46e6/html5/thumbnails/16.jpg)
Survey of Current Best Practices and Utilization Standards
Page 16 Version 0.9
State orAgency URL
DocumentTitle
DocumentType
DocumentDescription
M/ProjectManagement/ProjManPolicies.htm
Management andOversight Policies
projects in the areas generally found in CMML2; SAM is missing
10. **California**, www.doit.ca.gov/SIMM/default.asp: *Statewide Information Management Manual (SIMM)* (PY, PR). A manual of policies and procedures, including a Risk Assessment model, a Project Management Methodology, Y2K materials, policies, and guidelines (also includes a Quality Planning procedure).
11. **Center for Technology in Government**, www.ctg.albany.edu: *A Survey of System Development Process Models* (BP). A survey of widely used software development processes, useful for project planning purposes.
12. **Center for Technology in Government**, www.ctg.albany.edu: *Tying a Sensible Knot: A Practical Guide to State-Local Information Systems* (BP). A collection of best practices drawn from a study of successful state-local government partnership IT projects.
13. **Colorado**, www.state.co.us/gov_dir/gss/imc: *Systems Development Methodology Policy* (PY). Policy governing the creation of an appropriate methodology covering new application development, application maintenance or modifications/enhancements, and/or new software acquisitions.
14. **Colorado**, www.state.co.us/gov_dir/gss/imc: *Systems Documentation Policy* (PY). Policy governing the creation of appropriate documentation for new application development, application maintenance or modifications/enhancements, and/or new software acquisitions.
15. **DoD, Office of the Under Secretary of Defense for Acquisition and Technology – Systems Acquisition**, http://www.acq.osd.mil/sa/se/index.htm: *Pointers to a number of documents on various systems engineering topics* (BP). Covers, among other things, policies and guidelines for software engineering, configuration management, quality assurance, and risk management. Also has a link to the Defense Acquisition Handbook, an exhaustive reference for systems acquisition activities.
16. **DoD, Data & Analysis Center for Software**, www.dacs.dtic.mil/databases/url/key.hts?keycode=14:124:170&islowerlevel=1: *Pointers to a number of documents used by DoD/NASA for inspections* (BP). Includes the Software Formal Inspections Guidebook, the Software Formal Inspections Standard, and the Software Technology Reference Guide – Software Inspections.
17. **Florida**, http://mail.irm.state.fl.us/: *Controls & The Information Technology Project* (GL). Describes recommended controls for both the IT project and the system being built.
18. **Florida**, http://mail.irm.state.fl.us/: *Cost/Benefit Analysis Worksheet* (TL). A set of Excel spreadsheets for estimating the costs and benefits of proposed IT projects.
19. **Florida**, http://mail.irm.state.fl.us/: *Feasibility Study Guidelines* (GL). A set of guidelines for conducting feasibility studies associated with new IT projects.
20. **Florida**, http://mail.irm.state.fl.us/: *Information Systems Development Methodology Policy* (PY). Policy mandating that each agency create an Information Systems Development Methodology (ISDM). The ISDM includes, minimally, strategic planning, project management, and quality assurance.
21. **Information Systems Audit and Control Association**, http://www.isaca.org/cobit.htm/: *COBIT Control Objectives* (PS). The Information Systems Audit and Control Foundation's attempt to define a generally applicable and acceptable standard for good IT security and control practices. This is an expanded and more detailed treatment of the Framework.
22. **Information Systems Audit and Control Association**, http://www.isaca.org/cobit.htm/: *COBIT Executive Summary* (PS). The Information Systems Audit and Control Foundation's attempt to define a generally applicable and acceptable standard for good IT security and control practices. This document is an overview of the entire set of practices.
23. **Information Systems Audit and Control Association**, http://www.isaca.org/cobit.htm/: *COBIT Framework* (PS). The Information Systems Audit and Control Foundation's attempt to define a generally applicable and acceptable standard for good IT security and control practices. The Framework is a set of high-level control objectives and an associated classification structure.
24. **Information Technology Resources Board**, www.itrb.gov: *Assessing the Risks of Commercial-Off-The-Shelf Applications* (BP). A "tool to assist Federal organizations in clarifying the myriad risks their organization will encounter when facing a COTS implementation."
25. **Information Technology Resources Board**, www.itrb.gov: *Lessons Learned* (BP). A collection of best practices reflecting "each ITRB member's own experience, and the Board's unique perspective based upon assessments of Federal information systems projects."
26. **Information Technology Resources Board**, www.itrb.gov: *Managing Information Systems: A Practical Assessment Tool* (BP/CK). "Contains a broad array of questions in nine areas from which to evaluate information technology systems: mission and vision, customers, business focus, executive direction, capital planning, project management, performance management, acquisition, and architecture."
27. **Information Technology Resources Board**, www.itrb.gov: *Project Management Handbook* (BP). A "handbook derived from reviews of mission critical Federal information systems projects. Describes a concise, high-level framework for project management. Provides practical suggestions for Federal executives involved in management of mission critical information systems."
28. **Kansas**, www.ink.org/public/kirc: *Policies* (PY). A collection of policies, including project planning and monitoring.
29. **Massachusetts**, www.state.ma.us/osc: *A Guide to System Implementation* (BP). Provides "an outline for system implementation and is based on the experiences of the implementation teams who have worked to successfully roll-out new or enhanced systems to users throughout state government."
30. **Massachusetts**, www.state.ma.us/osc: *Information Technology – Project Management Resource Book* (BP). A resource book developed for OSC project managers, or Project Drivers, to give them specific guidelines and explain adopted standards for IT project management at the Comptroller's Office.
31. **Michigan**, www.state.mi.us/dmb/oas/itsd: *Year 2000 Project Office Quality Assurance Review Procedures* (PR). Describes the procedures used by the State of Michigan Year 2000 Project Office to conduct agency Year 2000 project quality assurance reviews.
32. **Michigan**, www.state.mi.us/dmb/oas/itsd: *Year 2000 Software Quality Assurance Program Manual* (BP). "Provides insights and guidance to agency Year 2000 Quality Assurance Analysts with respect to ensuring that critical software applications are remediated and operate in the Year 2000 and beyond. It outlines the components and salient issues relevant to designing and implementing an effective quality assurance program."
33. **Minnesota**, www.ot.state.mn.us: *Information Resource Development: Management Framework* (GL). "Explains how the information resource development (IRD) management framework for Minnesota government can be used to control IRD. State agencies can use this guideline to improve the quality of their information resources and manage the processes used to develop those information resources."
34. **Minnesota**, www.ot.state.mn.us: *Project Management Budget Request Guideline FY 1998-1999* (GL). "Provides a recommended organizational guide for project management and describes the OT (formerly IPO) information resource budget request requirements for project management."
35. **Missouri**, www.oit.state.mo.us: *Configuration Management Guidelines and Best Practices* (GL). Guidelines and best practices for implementing configuration management in IT projects.
36. **Missouri**, www.oit.state.mo.us: *Decision Item Requirements* (GL). Requirements for projects needing formal approval to proceed.
37. **Missouri**, www.oit.state.mo.us: *Project Management – 1998 Project Charter* (OT). The charter of the project responsible for building project management guidance for state agencies.
38. **Missouri**, www.oit.state.mo.us: *Project Planning Guidelines and Best Practices* (GL). Guidelines and best practices for implementing project planning in IT projects.
39. **Missouri**, www.oit.state.mo.us: *Project Tracking Guidelines and Best Practices* (GL). Guidelines and best practices for implementing project tracking in IT projects.
40. **Missouri**, www.oit.state.mo.us: *Requirements Management Guidelines and Best Practices* (GL). Guidelines and best practices for implementing requirements management in IT projects.
41. **Missouri**, www.oit.state.mo.us: *Risk Management Guidelines and Best Practices* (GL). Guidelines and best practices for implementing risk management in IT projects.
42. **NASA**, www.ivv.nasa.gov/SWG/resources/: *NASA Guidebooks and Standards* (GL). A collection of guidelines for building large projects.
43. **NASA**, satc.gsfc.nasa.gov/crm/: *NASA Publications – Software Assurance Technology Center* (TL). NASA-developed publications, papers, and reports pertaining to risk management and risk assessment. (This site also has a link to IV&V.)
44. **New Mexico**, www.cio.state.nm.us: *Project Management Policy* (PY). Policy for use of formal project management; includes project classification based on risk.
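Risk-based classification of the kind New Mexico's policy describes (entry 44) typically scores a project on a few risk drivers and maps the score to an oversight tier. The policy's actual scheme is not reproduced in this survey, so the sketch below is purely illustrative: the factors, weights, and thresholds are assumptions, not the state's rules.

```python
# Hypothetical risk-based project classifier, in the spirit of entry 44.
# All factors, weights, and thresholds below are illustrative assumptions.

def risk_score(budget_musd, duration_months, agencies_involved, new_technology):
    """Return a 0-100 risk score from four common risk drivers."""
    score = 0.0
    score += min(budget_musd, 10) * 4             # cost exposure, up to 40 points
    score += min(duration_months, 36) / 36 * 30   # schedule exposure, up to 30
    score += min(agencies_involved - 1, 3) * 5    # coordination risk, up to 15
    score += 15 if new_technology else 0          # unproven technology, 15 points
    return score

def classify(score):
    """Map a risk score to an oversight tier."""
    if score >= 60:
        return "high: independent QA and executive oversight"
    if score >= 30:
        return "medium: formal project plan and periodic review"
    return "low: standard agency tracking"

print(classify(risk_score(8.0, 24, 3, True)))    # large multi-agency project
print(classify(risk_score(0.5, 6, 1, False)))    # small single-agency effort
```

The useful property of any such scheme is that the classification is mechanical and auditable, so agencies and the oversight body reach the same tier from the same facts.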
45. **North Carolina**, www.state.nc.us/irm: *Example of Composite Project Status Report* (BP). An overview report of major agency IT projects showing not only project status but also major QA activities planned and completed.
46. **North Carolina**, www.state.nc.us/irm: *Individual Project Status Report* (BP). A template for use by an agency to report the status of a project.
47. **North Carolina**, www.state.nc.us/irm: *Project Proposal Checklist* (CK). A checklist for assuring that "essential activities have been planned or performed", used as part of independent QA reviews during the project.
48. **North Dakota**, www.state.nd.us/isd: *Guidelines for Developing a Project Business Case* (GL). Guidelines to "help agencies document the business case for large projects."
49. **Software Program Managers Network**, www.spmn.com/critical_software_practices.html: *16 Critical Software Practices™ for Performance-based Management* (BP). The "16-Point Plan™ and Templates for Critical Software Practices™" contain the 16 practices (9 best and 7 sustaining) that are key to avoiding significant problems on software development projects. These practices have been gathered from the crucible of real-world, large-scale software development and maintenance projects.
50. **Tennessee**, www.state.tn.us/finance/oir: *IT Methodology Project Intranet Page* (BP). Defines the technical approach for the IT Methodology Project.
51. **Tennessee**, www.state.tn.us/finance/oir: *IT Methodology Project – Project Plan* (BP). A project plan for implementing formal project management processes in state agencies.
52. **U.S. Dept. of Energy**, http://cio.doe.gov/sqse/ [link updated 08/22/01]: *Analysis of Benefits and Costs (ABC's) Guideline, Volumes 1-3* (GL). These guidelines "explain the usefulness of ABC's in making choices among competing alternatives concerning Information Resources Management (IRM)."
53. **U.S. Dept. of Energy**, http://cio.doe.gov/sqse/ [link updated 08/22/01]: *Automated Office Systems Support Quality Assurance Plan* (BP). "Describes the standards, processes and procedures used to support the consistent delivery of high-quality, professional products and services."
54. **U.S. Dept. of Energy**, www.orau.gov/pbm/documents/documents.html [link updated 08/22/01]: *How to Measure Performance: A Handbook of Techniques and Tools* (GL). A guide to "assist in the development, utilization, evaluation, and interpretation of performance measurement techniques and tools".
55. **U.S. Dept. of Energy**, http://cio.doe.gov/sqse/ [link updated 08/22/01]: *In-stage Assessment Process Guide* (GL). Defines the process for planning and conducting independent reviews of system development and maintenance projects.
56. **U.S. Dept. of Energy**, http://cio.doe.gov/sqse/ [link updated 08/22/01]: *Project Planning Questionnaire* (TL). A tool to "enable project teams (immediate and extended) to be cognizant of the disparate planning activities which can affect project outcome. Provide early notification to the stakeholders that a new project may involve their area, and information to help plan resource estimates and identify risks."
57. **U.S. Dept. of Energy**, http://cio.doe.gov/sqse/ [link updated 08/22/01]: *Software Project Planning Checklist* (CK). Intended to "provide system owners, project managers, and other information system development and maintenance professionals with guidance in identifying and planning software project planning activities. The checklist reflects recognized software project planning activities to be performed throughout the information system (IS) life cycle."
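One widely used family of project performance measurement techniques, of the sort collected in handbooks like entry 54, is earned value. The cost and schedule performance indices are standard formulas (CPI = EV/AC, SPI = EV/PV); the figures below are illustrative only.

```python
# Standard earned-value indices; the sample values are illustrative.

def earned_value_indices(pv, ev, ac):
    """pv: planned value, ev: earned value, ac: actual cost (same currency).
    Returns (CPI, SPI); a value below 1.0 signals trouble."""
    cpi = ev / ac   # cost efficiency: value earned per dollar spent
    spi = ev / pv   # schedule efficiency: value earned vs. value planned
    return cpi, spi

# A project planned to have completed $500K of work, has completed
# $400K of it, and has spent $450K so far:
cpi, spi = earned_value_indices(pv=500_000, ev=400_000, ac=450_000)
print(f"CPI={cpi:.2f}  SPI={spi:.2f}")  # both below 1: over budget, behind schedule
```

Tracking these two ratios over time is what turns a monthly status report (such as the Arizona and North Carolina templates above) from narrative into measurement.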
58. **U.S. Dept. of Energy**, http://cio.doe.gov/sqse/ [link updated 08/22/01]: *Software Project Tracking Checklist* (CK). Intended to "provide system owners, project managers, and other information system development and maintenance professionals with guidance in identifying and planning software project tracking activities. The checklist reflects recognized project tracking activities to be performed throughout the information system (IS) life cycle."
59. **U.S. Dept. of Energy**, http://cio.doe.gov/sqse/ [link updated 08/22/01]: *Software Quality Assurance Checklist* (CK). Intended to "provide system owners, project managers and other information system development and maintenance professionals with guidance in identifying and planning software quality assurance (SQA) activities. The checklist reflects recognized SQA activities to be performed throughout the information system (IS) life cycle."
60. **U.S. General Accounting Office**, www.gao.gov: *Assessing Risks and Returns: A Guide for Evaluating Federal Agencies' IT Investment Decision-making* (GL). Provides a structure for evaluating and assessing how well an agency selects and manages IT resources.
61. **U.S. General Accounting Office**, www.gao.gov: *Executive Guide: Measuring Performance and Demonstrating Results of Information Technology Investments* (GL). A suggested framework for developing and implementing IT performance management.
62. **U.S. General Accounting Office**, www.gao.gov: *Information Technology: An Audit Guide for Assessing Acquisition Risks* (GL). Provides a "logical framework for evaluating" IT acquisitions, focusing on risk assessment.
63. **Virginia**, www.cim.state.va.us: *Mission Focused Information Management* (BP). A collection of best practices for developing information systems for state government agencies.
64. **Virginia**, www.cim.state.va.us: *Model Standard for Large-Scope Projects* (GL). Provides a "model structured approach for defining, developing and implementing large-scope information systems projects in state agencies."
65. **Virginia**, www.cim.state.va.us: *Model Standard for Maintenance and Enhancement Projects* (GL). Provides a "model structured approach for managing maintenance and enhancement projects for existing information systems."
66. **Virginia**, www.cim.state.va.us: *Model Standard for Small-Scope Projects* (GL). Provides a "model structured approach for defining, developing and implementing small-scope information systems projects in state agencies."
67. **Washington**, www.wa.gov/dis: *Cost Benefit Analysis Worksheet* (TL). An MS Excel spreadsheet for performing cost-benefit analysis of state IT projects.
68. **Washington**, www.wa.gov/dis: *Feasibility Study Guidelines* (GL). Provides guidance in performing feasibility studies for IT projects.
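The cost/benefit worksheets in this list (entries 18 and 67) are Excel-based; the core arithmetic such worksheets typically automate is standard discounted cash flow. A minimal sketch, with illustrative figures and an assumed 7% discount rate:

```python
# Discounted cash-flow arithmetic of the kind cost/benefit worksheets
# (entries 18 and 67) typically automate. All figures are illustrative.

def npv(rate, cash_flows):
    """Net present value of yearly cash flows (year 0 first)."""
    return sum(cf / (1 + rate) ** year for year, cf in enumerate(cash_flows))

def benefit_cost_ratio(rate, benefits, costs):
    """Ratio of discounted benefits to discounted costs; > 1 favors the project."""
    return npv(rate, benefits) / npv(rate, costs)

# A project costing $300K up front and $50K/year to operate,
# returning $150K/year in benefits over four years:
benefits = [0, 150_000, 150_000, 150_000, 150_000]
costs = [300_000, 50_000, 50_000, 50_000, 50_000]
net = [b - c for b, c in zip(benefits, costs)]
print(f"NPV of net flows: {npv(0.07, net):,.0f}")
print(f"Benefit/cost ratio: {benefit_cost_ratio(0.07, benefits, costs):.2f}")
```

A positive NPV and a benefit/cost ratio above 1 are the usual quantitative thresholds a worksheet of this kind applies before a project proposal goes forward for approval.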
69. **Washington**, www.wa.gov/dis: *Information Technology Portfolio Structure and Content Standard* (BP). Describes the details of portfolio management for state agencies.
70. **Washington**, www.wa.gov/dis: *Portfolio-based Information Technology (IT) Management and Oversight* (BP). A description of Washington's portfolio-based approach to IT project selection and management.
71. **Washington**, www.wa.gov/dis: *Portfolio Management Training* (TL). An MS PowerPoint management briefing on portfolio management.
72. **Washington**, www.wa.gov/dis: *Project Management Guideline* (GL). Guidance for state agencies on performing project management.
73. **Washington**, www.wa.gov/dis: *Quality Improvement Plan* (BP). The overall plan for improving the quality of information systems, including a discussion of the rationale for using portfolio management.
74. **Washington**, www.wa.gov/dis: *Responsibilities and Obligations for Quality Assurance* (GL). "This document is intended to serve as a model for procurement and contract language, facilitating a consistent approach across state government. The goal is to establish common expectations among state agencies, QA vendors, the Department of Information Services (DIS) and the Information Services Board (ISB), about the role of QA under Portfolio-based Information Technology (IT) Management and Oversight."
75. **Washington**, www.wa.gov/dis: *Software Life Cycle Management Guideline* (GL). Describes "the software life cycle management concept, the need for change to the current software development model, current initiatives, benefits, applicability to agencies, and critical success factors." It "sets forth a description of essential steps to be taken to implement a life cycle process," and includes a "brief description of establishing a software performance improvement measurement and change management process, and the need for teamwork."