Invention evaluation services: A review of the state of the art

Gerald G. Udell


In 1974 two events occurred which led to public policy recognition of technical and commercial evaluation as an important part of the industrial innovation process. The first of these was the establishment of a technical evaluation program under the Non-nuclear Energy Act of 1974 [9]. The second was the development of a systematic commercial innovation evaluation system as part of the National Science Foundation's Innovation Centers Experiment [2].

This article discusses these events, examines the evaluation programs generated by them and evaluates the current state of the art as practiced by evaluation programs throughout the United States. Since commercial evaluation services have become more widespread than technical evaluation services, greater emphasis is placed on the evolution of commercial evaluation.

It is hoped this article will provide useful information to researchers, public policy makers and public and private sector practitioners responsible for the design and delivery of innovation evaluation programs and other incentives to independent inventors.

Congress, which had previously formalized the federal interest in evaluation in the Stevenson-Wydler Technology Innovation Act of 1980 [10] and the Small Business Improvement Act of 1980 [11], extended that interest in this early stage of the industrial innovation process. The latest congressional mandate, the Technology Competitiveness Act of 1988 [12], establishes a national technical evaluation service at the National Bureau of Standards.

The motivating force is the recognition that evaluation of technical merit and of commercial merit during early stages of the innovation or new product development process is essential to efficient further investment. As a result, evaluation [...] and services into the marketplace. [The logic] is simple: identification and subsequent elimination of nonfeasible ideas and inventions [...]. Technical evaluation asks: Will it work, and work efficiently? Commercial evaluation typically [takes a] value approach, that is: Can [it be sold profitably], and at what risk?

Corporations have long recognized the need for separating the wheat from the chaff. Indeed,

in 1978 Kenneth Baker [1] found over 200 corporate-oriented new product commercialization evaluation models. While useful in corporate settings, Baker found that most of these models were inappropriate for use in evaluating independent or noncorporate inventions. That is, they tended to be corporate-specific, were costly to utilize, or required data commonly not readily available, particularly at the early stages of the innovation process.

As noted above, it was not until 1974 that any significant interest was shown in the evaluation needs of noncorporate inventors. In that year Congress established an energy-related technical evaluation program at the National Bureau of Standards [9]. Similarly, the National Science Foundation directed its then recently established Innovation Center at the University of Oregon to develop a commercial evaluation system oriented toward noncorporate innovation evaluation [5]. Both efforts have left their marks.

The Office of Energy-Related Inventions (OERI) at the National Bureau of Standards has held a congressional mandate to [evaluate] the [...] of the tech[nical merit of energy-related inventions]. In the case of [inventions that] have been favorably [evaluated, a recommendation is fo]rwarded to the [Department of Energy (DOE)]. [Given th]e volume expected by [...], OERI is already laying plans [...] strategies for the states [...]. As a result, techno[logies are rec]ommended to, and some[times supported by,] DOE. As a result of this [...], [evaluations reflect both] commercial and technical considerations.

Commercial Innovation Evaluation

Prior to 1974, inventors who wanted an opinion of the commercial potential of their idea had two basic alternatives: they could send their idea to an idea broker or to a corporation. Obtaining data in a systematic manner about idea brokers has proven to be impossible. Attempts by the author over the years to obtain data from such private sector firms have consistently met with failure. However, information supplied by over 200 independent inventors yields a consistent picture despite the possibility of perceptual biases and the nonrandom nature of the data gathering.

Virtually every inventor interviewed or supplying information reported receiving very similar "We like your idea . . ." type letters, regardless of technical or commercial merit. Without apparent exception, inventors were then invited to engage the soliciting firm to assist them in the development and commercialization of their idea. Not one instance in which an idea or invention was rejected has been reported in over ten years of selective monitoring of private sector activity in this area.

This picture is consistent with the one developed by the Federal Trade Commission in its prosecution of two [invention marketing] firms found guilty of violating Section Five of the FTC Act. Evaluations proved to be meaningless or nonexistent, and the firms' marketing services ineffective. Only about [...] inventors doing business with these firms made a profit of $1 or more in doing so [4]. [S]tates, such as North Dakota, now regulate the invention promotion industry [7].

Inventors pursuing the second alternative are far less likely to be defrauded. However, as noted earlier, corporations typically were, and perhaps are, hostile to outside new product ideas.

Inventors frequently face restrictive submission agreements which are heavily weighted in favor of the company. More often than not, corporate policies seem to be oriented toward protecting the company rather than soliciting new product ideas from outside sources. Furthermore, corporate evaluation procedures are typically unstructured, and evaluation results or feedback explaining the strengths or weaknesses of the idea or invention are rarely disclosed to the inventor.

However, in December 1974 the first structured preliminary innovation evaluation system designed to meet the evaluation needs of independent and other noncorporate inventors was implemented at the Oregon Innovation Center. An [analysis] of the literature generated a list of well over 1[00] product-related factors which contribute to the success or failure of a new product. These factors were then collapsed into the original 29 evaluation criteria used by the Center. During the next five years the Preliminary Innovation Evaluation System was used to evaluate over 5000 ideas and inventions submitted by inventors.

Funded by the National Science Foundation, the Innovation Center was established in part to provide assistance to independent inventors. The basic objective of the Innovation Center's experiment was to develop and test various incentives for stimulating technological innovation [2]. Hence, Oregon can be viewed as an experiment utilizing the observation method in which various techniques were implemented and tested.

Throughout the experiment the National Science Foundation gave the Center complete freedom to fail. That is, the objective of the National Science Foundation was, as noted above, to test different techniques for increasing the rate of innovation. While ultimate success and replication of the results was hoped for, the National Science Foundation made it quite clear that the experiment was foremost. Thus, the Center did not function to prove a point, but rather to find out what did work and what did not work and, in both cases, why. Data gleaned from Center failures were deemed to be as significant as data generated by its successes. The Oregon experiment went through five basic stages, as [described below].

[The first stage of th]e experiment was [informal, relying on the] opinions of volunteer business[people from the] local community [...] few dozen to several [hundred ...].

[Th]e evaluations had [...]. [One] Center client [put it this way]: "[I] can get a 'no' from any[one. I came h]ere to find out just [what is wrong with my i]dea. I want to know why." [...] and the Center [...] not cost effective, as it [required addit]ional hours for each re[view], and it was extremely difficult to [articulate] the reasons for the evaluators' [judgments]. [... evalu]ation results, high costs [...].

[Any new system] had to be cost effective, capable of working with very little [information, and able to p]rovide its users with sufficient feedback that the inventor could understand [the strengths] and weaknesses of [the idea or invention. Th]e evaluation results [h]ad to be capable of [...].

[This rul]ed out another informal system and the adoption of one of the 200 or so corporate screening models then available. Early in the effort it was determined that the Center would

need to develop its own evaluation system. This marked the beginning of stage three of the experiment.

After an intensive research effort to identify the factors which contribute to new product/technology failures, the Center introduced its first formal and systematic evaluation program [15]. Consisting of 29 factors gleaned from the personal industrial experience of the researchers and the literature, the new format provided clients with a computer printout which outlined the evaluators' judgments on each point. Actually, a much larger number of criteria had been identified, but an informal factor analysis had reduced the number to 29 [16, p. II-41]. A later formal factor analysis indicated that the number of criteria could be reduced to about eight, but in the interest of providing as much feedback as possible, it was decided not to reduce the number.
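The article does not give the formula by which the evaluators' judgments on the individual criteria were combined into the Success Likelihood Rating reported to clients. Purely as an illustration of the kind of aggregation such a computer-generated report implies, the following sketch (in Python; the criterion subset, the 1-to-5 rating scale, and the simple averaging are assumptions for exposition, not the actual PIES method) turns per-criterion judgments into a printout-style summary.

    # Illustrative sketch only: the criteria subset, the 1-5 rating scale and the
    # simple averaging are assumptions for exposition, not the actual PIES formula.

    RATING_SCALE = {1: "poor", 2: "fair", 3: "adequate", 4: "good", 5: "excellent"}

    CRITERIA = [
        "need", "market acceptance", "durability", "price",
        "existing competition", "protection",
    ]  # a hypothetical subset of the 29 (later 39) criteria

    def success_likelihood(ratings):
        """Map per-criterion ratings (1-5) onto a 0-100 score by simple
        averaging -- a stand-in for whatever weighting the system actually used."""
        average = sum(ratings.values()) / len(ratings)
        return round((average - 1) / 4 * 100, 1)

    def evaluation_report(ratings):
        """Produce a printout-style summary: one line per criterion, then the score."""
        lines = [f"{name:<22} {RATING_SCALE[score]}" for name, score in ratings.items()]
        lines.append(f"Success Likelihood Rating: {success_likelihood(ratings)}%")
        return "\n".join(lines)

    if __name__ == "__main__":
        example = {name: 3 for name in CRITERIA}  # a uniformly 'adequate' submission
        example["protection"] = 1                 # e.g., the idea is unpatentable
        print(evaluation_report(example))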

Since that time several new criteria have been added as need dictated, thereby increasing the criteria currently used to the 39 listed in Exhibit 1.

With the implementation of this system, the Center's client dissatisfaction dropped considerably. However, the residue of dissatisfaction proved to be too high. Follow-up interviews with clients revealed that inadequate communication was the major remaining problem. Unskilled in the affairs and language of business, many inventor clients did not understand the terminology and implications of their evaluation reports.

Exhibit 1. Evaluation Criteria

[...]

Market acceptance: Compatibility, Learning, Need, Dependence, Visibility, Promotion, Distribution, Service

Competitive: Appearance, Durability, Function, Price, Existing competition, New competition, Protection

Experience/strategy: Technology transfer, New venture, Marketing experience, Technical experience, Financial resources, Management experience

Furthermore, the data clearly indicate [that many clients, in]cluding those with ad[vanced training, had difficulty with some] of the evaluation cri[teria]. With the use of t[he ...], [remaining complaints could generally be traced to fai]lure to read the Eval[uation ...], [evaluator] error or incomplete [...]. [Furth]er researc[h ...].

The Oregon Innovation Center, like many early stage experimental projects, could not survive the hostile environment of its host university [14]. However, the evaluation concept it spawned has become accepted practice throughout the United States and Canada. In addition, it is used in several other countries.

Currently there are 15 or more universities and others using the PIES format, that is, the criteria and its associated computer program.¹ [At these] institutions the current [state of practice app]ears to be similar to that reac[hed at Oregon in 1975. The] major shortcomings a[re outlined below].

¹ In addition to the Innovation Institute, current users of the PIES format include the universities of North Dakota, Manitoba, Quebec, Waterloo (Ontario), Wisconsin at Whitewater, Baylor (Texas), Washington State, Drake (Iowa), Southwest Missouri State and others, as well as Rural Enterprises in Oklahoma and the Department of Science and Technology in Saskatchewan.

Feedback is limited to a computer-printed evaluation report similar to the format implemented at Oregon in 1975 during the third stage of the experiment. No additional feedback in the form of an evaluation manual or other document that explains the evaluation report is provided to the client. This leaves such programs open to the same problems and criticisms experienced by the Oregon Center in 1975.

Inadequate Feedback

Some centers are using counselors at Small Business Development Centers to interpret evaluation results and provide other feedback to inventors. Although the information about the relative merits of this approach is limited, it would appear that the previously discussed Oregon experience with the use of business generalists would apply.

A limited experiment by the author would indicate that it does. In this experiment 32 inventors in 12 states were referred to their local Small Business Development Center office for further feedback or assistance. Follow-up interviews revealed a level of client satisfaction of slightly under 25%. The complaints included "the counselors were not knowledgeable about inventors or innovation" and "they tried to treat inventors as small business persons." For example, several inventors reported being urged to prepare business plans despite negative evaluation reports and declared desires to license rather than attempt to start new ventures. On the positive side, when specific "next step" projects were suggested to the inventor prior to referral, the rate of satisfaction was considerably higher. This suggests that when knowledgeable guidance is given, general business counselors make a useful contribution.

Some centers have made changes in the system which are counter to the research conducted at Oregon. In one such instance, a lengthy evaluation report of some 60 pages has replaced the summary report. As noted above, this approach was rejected at Oregon after considerable research. A brief investigation of this change revealed the same criticism as generated earlier.

Undetected Errors

Errors have crept into the computer systems utilized by some centers, thereby resulting in confusion. It is not known if these errors are affecting evaluation results. However, program errors do affect program credibility.

Perhaps more significant are the errors in the advice offered to inventors by some centers. For example, the literature distributed by several centers contains the following advice:

1. File a disclosure with the U.S. Patent Office. This is their invention disclosure program which allows you to disclose your idea up to one year before applying for a patent . . .

2. When you complete the . . . Registration and Disclosure Form, run a copy of everything you mail to . . . University and mail it to yourself.

The first bit of advice seems to confuse the Invention Disclosure Program with the one-year statutory limitation on patenting an invention after public disclosure or first sale.

The Disclosure Program provides inventors with a credible means of establishing evidence of conception of an invention. The one-year limitation provides basically that one year after first sale or public disclosure, an invention falls into the public domain and cannot be patented. The latter advice is obviously intended to help inventors establish date of conception. However, this practice is totally ineffective in the sense that it has never been used successfully in court. While it is frequently recommended by the uninformed, it has not been recommended by knowledgeable people for many years.

Both of these errors are very elementary mistakes and cast considerable doubt upon the center's experience with the innovation process as it impacts on independent inventors. These errors appear to have been copied from an earlier adopter of the PIES format which had obtained a copy of the Oregon computer program and related program materials. This university did not avail itself of the experiences of the Oregon staff or, apparently, the research data compiled by the Center. It in turn disseminated the PIES and its [...] cut off from the original body of research. As a result, the original errors have been retained and others added through apparent ignorance of the extent of the research available in this area.

Untrained Evaluators

Some centers apparently use untrained or inexperienced evaluators. In at least one center undergraduate students are used, and in several others evaluators are not given any training. As a result, their evaluation results are frequently subject to question.

It was the Oregon experience that untrained commercial evaluators tend to be overly optimistic about the commercial viability of ideas and inventions. This has not changed. For example, in one such instance a center which uses untrained volunteer evaluators gave a 97% chance of commercial success to an unpatentable automotive idea submitted by a person with no business experience or financial resources of note. By way of comparison, the idea was rejected by the Innovation Institute with a Success Likelihood Rating of under 25%.

In this instance, it is clear that the evaluators did not understand either the evaluation system or the realities of the innovation process. The idea was a mere incremental improvement which was likely in the public domain and would have been obvious to anyone even minimally skilled in the art related to the idea. The probability of licensing such an idea is very small. Similarly, the prospect of raising sufficient capital to manufacture and market such an idea is likewise minimal.

Unfortunately, this is not an isolated example. The problem here is that unrealistic evaluations provide inventors with a false sense of optimism, which encourages them to make further investments in an attempt to develop and commercialize their inventions. The financial losses involved can easily exceed the losses an inventor might experience at the hands of an idea broker.

In this instance the guilty party is well meaning, but the guilt is just as certain. In light of the apparent fact that inventors tend to take preliminary evaluations seriously, it seems reasonable to expect those who provide them to do likewise and to ensure that evaluators are trained and have more than a passing acquaintance with the industrial innovation process.

The Oregon Innovation Center performed over 5[000 evaluations]. [...] of these received [...] relatively modest. This was in part due to the lack of follow-up on the part of the Center because of its closing. The larger factor is that the Center lacked the resources to provide follow-up assistance to promising projects. While there are some data which suggest a reasonable rate of commercialization among this high scoring group, there is also ample evidence which suggests that others failed to be commercialized because the inventor lacked the necessary resources or inclination to become an entrepreneur. Approximately 90% of Oregon clients reported little interest in personally commercializing their inventions. These clients preferred to license their devices to an existing firm. They cited lack of resources and/or business experience as their major reasons for their choice.

In other words, a positive evaluation report has value to entrepreneurs and those inventors with business acumen. However, to the majority of independent inventors a positive evaluation report has little impact unless it is coupled with some additional appropriate assistance. In either case, innovation evaluation is most effective in terms of its impact when it is coupled with other innovation-related services. In the case of the inventrepreneur, or experienced inventor, he or she can make good use of an evaluation report. For the majority it does little more than add to the inventor's memorabilia.

Inflated Evaluation Scores

According to the literature of some evaluation programs, a Success Likelihood Rating (the likelihood of an idea or invention being successful in the marketplace) of 75% is required before further development is recommended. This has resulted in inflated evaluation scores of as high as 97%. In contrast, both the Oregon Center and the Innovation Institute consider a Success Likelihood Rating of 40% as "passing." No invention evaluated by either program was ever deemed to have a 70% chance of success. Ninety-seven percent of inventions fell below 50%.

The major problem here is that such high Success Likelihood Rating scores give inventors an unrealistic view of their chances of success. Such high scores encourage inventors to make unwarranted investments in their ideas and destroy the credibility of the evaluation with knowledgeable people. In reality, the typical inventor's lack of business experience and the difficulties of licensing drive the odds of success down to less than 2 out of 100.

In some instances the fees charged or the costs incurred are too high. Furthermore, there appears to be little correlation between price and the quality of the evaluation service. Some programs are supported only by user fees varying from $100 to $150. Others charge similar fees, but make in-kind contributions or receive public funds to support their service. For example, one recently established university-based program estimates that in addition to a $100 user fee, it will require $244,000 to support an evaluation program and related services. This is based on an estimated 100 evaluations. In contrast, a state-of-the-art private sector service has functioned at break-even at about the same level for under 4% of the cost of this program. Some of the costs being incurred by the university are for additional services. However, since the Oregon data indicate these services are not necessary, the net effect is the same.
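Read literally, and assuming the $244,000 estimate is meant to cover all 100 planned evaluations, the comparison above works out roughly as

\[
\frac{\$244{,}000}{100\ \text{evaluations}} \approx \$2{,}440\ \text{per evaluation},
\qquad
0.04 \times \$2{,}440 \approx \$100\ \text{per evaluation},
\]

which puts the private sector figure on the order of the $100 to $150 user fees cited above.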

Purpose of Evaluation

It is not the function of evaluation to identify ideas or inventions which will become innovations. Rather, the purpose of evaluation is to identify those ideas or inventions with serious technical or commercial flaws. In order to be beneficial, evaluation should occur well before sufficient data are available to prove either technical or commercial success. Thus, projecting success in either case can be extremely unreliable. This is particularly true in the case of commercial evaluation, but may also be true of

technological challenges to the existing state of the art.

While evaluation does serve a screening function in that it should identify those ideas and inventions with obvious commercial or technical flaws, it also serves the larger function of providing guidance for the development and commercialization of those with both technical and commercial merit. Sufficient information should be generated by the evaluation process to help determine the next step and to identify potential problem areas.

Ideas, Inventions and Innovation

While every innovation starts with an idea, the gap between an idea and an innovation is often long, hazardous, laborious and expensive. Given the apparent odds and difficulty of the process, the miracle of innovation is that it happens at all.

Starting with a creative burst of intellectual inspiration which may either suggest a new way of accomplishing something or recognize an alternative application for an existing or recent discovery, an idea must be refined and perhaps given tangible form before it becomes an invention. Vagueness must be reduced to the point where the idea is reduced to practice. The concerns here are purely technical. That is, the idea must work or be shown to be capable of working before an invention can be said to have occurred. By this definition, a perpetual motion machine would be deemed by most to be an idea reduced to physical form, but not an invention in any meaningful sense. Note that commercial feasibility is not a relevant prerequisite to invention. That is, inventions are often made and patented which have little or no commercial potential. This gives rise to the better mousetrap theory often accepted by inventors and technological policymakers alike.

An innovation, however, does not occur until an invention is refined, produced, marketed and accepted by the marketplace. Essential to innovation are both technical and commercial feasibility. The most technically elegant technological discovery which lacks a market will fall short of commercial success. Conversely, the largest commercial potential will not guarantee commercial success for the technologically inferior.

Conclusions and Recommendations

There are some substantive differences between technical and commercial evaluations. There are also similarities. In both instances, there appears to be a predictable natural progression of mistakes that neophytes tend to make on entering the field. Universities and government agencies are no exception to this rule. Jumping into the arena just because there is need or there are program dollars available does not serve the interests of inventors and innovators.

Given the interest in industrial innovation and the growth of public incentives to stimulate the commercialization of technology by new and smaller enterprises, the need for effective and efficient evaluation services is likely to increase.

The OERI is wise to encourage interested states to become well versed in the complexities of technical evaluation before they go out on their own. This is a step in the right direction. The OERI has a broad base of experience on which others can build. While OERI has amassed considerable experience and data relative to the technical invention process, researchers and others have not made good use of this resource. As a result, the literature pertaining to technical evaluation remains sparse.

This is in contrast to commercial evaluation, which has received extensive coverage in the literature. Although commercial evaluation has become more systematic, it does not necessarily follow that commercial evaluation procedures are less complex. In both instances evaluators would do well to build on the research and experiences of those who have preceded them.

Innovation is rarely the result of independent action. That is, most innovations are based on some previous work. Innovation evaluation is no exception. Those currently offering or planning to offer invention evaluation services should review the literature. The work on invention evaluation is by no means finished. However, such a review would help those with fresh ideas to bring improvements to the table rather than spending their time reinventing the wheel and replicating old mistakes.

This apparent lack of interest in research does not appear to be confined to recent adopters of the PIES format. There is evidence that business assistance programs in general may lack a research orientation. For example, despite an initial charge from the National Science Foundation to study the innovation and entrepreneurial processes, the Foundation's innovation centers did not put a great deal of emphasis into research about these processes. For the most part, they did not view publishing articles about their work as a vehicle for reaching others who might be interested [14].

In order for an evaluation to have any value for its user beyond that of a screening device, an evaluation must be objective, realistic and provide sufficient feedback written in a style that is understandable to the person or persons submitting the invention. Properly done, an innovation evaluation report should provide sufficient information so as to

• clearly outline the evaluation process and criteria used by evaluators;

• permit others to make their own decisions about commercial and technical feasibility and wisdom of further development;

• provide guidance in correcting deficiencies and otherwise strengthening technical/commercial merit;

• suggest strategies for future development and commercialization of the invention or new product idea; and

• provide a basis for judging the objectivity of the evaluators.

Furthermore, it should accomplish these objectives at a reasonable cost. Given the prior probabilities of success at the early stages of the innovation process, initial evaluation costs should be kept at a minimum, thereby conserving resources for technology transfer and venture development activities.

Although these observations are based on the needs of independent inventors, they appear to apply to corporate and institutional inventors as well. Corporations and institutions do not invent; people do. Unless adequate consideration is given to the emotional and intellectual needs of inventors, dissatisfaction and a subsequent decrease in productivity are the natural consequences even in corporate and institutional settings. Thus, corporations and institutions should review their evaluation procedures to ensure that their inventors are adequately recognized and informed. More than one significant opportunity has been lost because an inventor has taken his or her idea outside the corporation or institution, or worse yet, abandoned the project altogether because of a perceived, and perhaps real, lack of interest on the part of his or her employer.

For the most part present evaluation services fall below an acceptable level. There is some danger that standardization or accreditation may limit experimentation and innovation. However, the lack thereof has spawned a current level of service well below the state of the art.

The problems delineated above can be fixed. Perhaps the first step for those with, or contemplating establishing, an evaluation service is to review the existing literature. Such a review would eliminate many of the basic errors currently occurring.

A second step is to recognize that inventors are not small business people. Accordingly, their needs are different and existing programs need to be adjusted to reflect those needs. Treating inventors as entrepreneurs wastes considerable effort and misses the mark. That is, the bulk of inventors have very little interest in starting a new venture. The bulk of them are far more interested in licensing their devices to existing businesses.

A third step would be to employ persons with more innovation-related expertise. Too frequently evaluators and those who supervise them have little innovation-related experience. This has led to poor evaluations and poor advice.

Finally, more research, involving a greater number of evaluators, could help improve the state of the art. Accordingly, further research into innovation evaluation procedures should be encouraged, if not sponsored, by the government. In addition, an in-depth review of the various public and private sector evaluation services should be undertaken. Also, criteria relative to what constitutes appropriate evaluation procedures should be agreed on and used to establish a quality control or accreditation mechanism. The establishment of such criteria would make it easier to spot bogus or inadequate evaluation services. It would also establish a common basis for judging the relative merits of technologies and raise the credibility of services meeting these criteria among inventors, investors, corporations and others, thereby facilitating the development and flow of technology into the marketplace.

References

1. Baker, Kenneth G. A comparative analysis of models for use in new product screening decisions. In: The Oregon Innovation Center Experiment: 1973-1978. Volume III: Readings in Innovation, G. Udell (ed.). Washington, DC: National Science Foundation, 1980, p. III-85.

2. Colton, Robert M. The NSF innovation center experiment. Mechanical Engineering 102(8) (August 1978).

3. Hawkins, Del H. and Udell, Gerald G. Corporate caution and unsolicited new product ideas. Journal of the Patent Office Society 375-388 (June 1976).

4. In the matter of the Raymond Lee Organization, Inc., et al. FTC Docket 9045, complaint July 15; Final Order November 1, 1978.

5. Incubators for Entrepreneurs. National Science Foundation MOSAIC 9(4): 14 (July/August 1978).

6. Invention Review Guide for OERI Consultants. Washington, DC: Office of Energy-Related Inventions, National Bureau of Standards, U.S. Department of Commerce, February 1986, LCI 134.

7. North Dakota Century Code 9-1401.

8. Office of Energy-Related Inventions, telephone interview, July 1988.

9. Public Law 93-577.

10. Public Law 96-480.

11. Public Law 98-395.

12. Public Law 100-418.

13. Quigg, Donald J. Remarks before the National Inventors Conference, Arlington, VA, February 1987.

14. Shierer, Mary Ann, et al. Innovation and Enterprise: A Study of NSF's Innovation Centers Program. Rockville, MD: Westat, 1985, p. I-85.

15. Udell, Gerald G. The essential nature of the idea brokerage function. Journal of the Patent Office Society 442 (October 1975).

16. Udell, Gerald G. and Baker, Kenneth G. Evolution of the Preliminary Innovation Evaluation System (PIES). In: The Oregon Innovation Evaluation Experiment: 1973-1978. Volume II: Innovation Evaluation System, G. Udell and M. Venkatesan (eds.). Washington, DC: National Science Foundation, 1980, p. II-37.

17. Udell, Gerald and Venkatesan, M. (eds.). The Oregon Innovation Center Experiment: 1973-1978. Volume I: Summary of Results. Washington, DC: National Science Foundation, 1980, pp. I-25-29.