Conference report
7th International Symposium on Computer Hardware Description Languages
Tokyo, Japan, 1985

The influences of both computer science and of VLSI could clearly be perceived throughout the three-day programme of CHDL 85. The symposium, at which some 35 papers were presented, contained three main centres of interest: the first two, hardware design languages and simulation, have long been the traditional concern of the CHDL symposium. The third, hardware verification, is a comparative newcomer.
There were two main trends evident in simulation; both are clearly conditioned by the challenges posed by VLSI design. The first is a trend towards multi-level simulation, where different parts of the same circuit are simulated at different levels of abstraction. The second is in the use of special-purpose hardware to speed up, by some orders of magnitude, the simulation process.
In hardware design languages (hdls), there were also two trends evident. One was in the use of abstract data types to facilitate mixed high- and low-level descriptions of hardware and of behaviours. The other was towards providing integrated design environments in which all aspects of the design process could be effectively carried out.
It was, however, in the field of hardware verification that the greatest changes could be seen. This topic rated two full sessions (some seven papers in all) and it may be said that the potential role that verification can play in the design process has now become well understood, even if the software support and the technical training needed for its successful exploitation are still at an early stage of development.
It is always invidious to select from a whole conference just a few papers worthy of special mention, and the papers so selected invariably reflect the reviewer's own interests and prejudices. Nevertheless, the following papers seemed especially noteworthy.
A keynote address by Darringer (IBM) provided an exceptionally perceptive overview of the various levels and kinds of activity that typically occur in the design of a large digital system. Taking as an example the IBM 3090 (a powerful, four-processor, 18.5 ns cycle time, ECL-based computer), he described the three design activities that run concurrently, namely: logic design, microcode design, and power and mechanical design. At an early stage in the overall design process, each of these three activities is pursued by a small group of 'gifted, clever, experienced designers'. Their tentative, very high-level designs, after being subjected to statistical simulation, are then communicated (generally, informally) to the much larger hierarchy of designers who handle the more detailed aspects of the design. Many of the lower-level aspects are, of course, partially or fully automated. It was interesting to note how logic optimisation was handled by pattern matching (on the circuit structure) and local transformation, rather than by classical Boolean minimisation. One of the most pressing needs, in Darringer's view, was for the introduction of hardware description languages that were more suitable for use during the active process of design, rather than merely being suitable for the documentation of finalised designs.
Barbacci (Carnegie-Mellon University) presented the initial results of an investigation into using the programming language Ada as an hdl. He began by observing that many of the features required in the next generation of hdls, such as abstraction, modularity, strong typing, etc., were already embodied in Ada. This, together with the (likely) widespread availability of Ada compilers (and, more generally, Ada support environments) and the development of a community of experienced Ada programmers, evidently provides a strong incentive to exploit Ada, as it stands, as an hdl. The approach his group has explored involves representing a hardware system as a collection of data objects, where each object is an instance of an appropriate abstract data type (itself realised by an encapsulated Ada package). Given such a set of types, together with a suitable convention for handling timing relations, it is then easy to realise, within the same framework, associated software tools (for example, simulators, timing analysers, etc.) as Ada modules.
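The flavour of this scheme can be suggested in miniature. The sketch below is not Barbacci's Ada code (the paper's types are Ada packages); it is an invented Python analogue, with made-up component names, showing hardware objects as instances of abstract data types and a simulator realised within the same framework.

```python
# Invented illustration only: Python classes stand in for the Ada packages
# (abstract data types) described in the paper.

class Wire:
    """A signal carrying an integer value."""
    def __init__(self, value=0):
        self.value = value

class Register:
    """A clocked storage element: an instance of an abstract 'register' type."""
    def __init__(self, d, q):
        self.d, self.q = d, q
    def clock(self):
        self.q.value = self.d.value

class Adder:
    """A combinational component, described behaviourally."""
    def __init__(self, a, b, out):
        self.a, self.b, self.out = a, b, out
    def evaluate(self):
        self.out.value = self.a.value + self.b.value

def simulate(combinational, sequential, cycles):
    """A software tool (here a simulator) realised in the same framework."""
    for _ in range(cycles):
        for c in combinational:
            c.evaluate()
        for s in sequential:
            s.clock()

# An accumulator: acc <- acc + 1 on each clock.
one, acc_q, acc_d = Wire(1), Wire(0), Wire(0)
add = Adder(one, acc_q, acc_d)
reg = Register(acc_d, acc_q)
simulate([add], [reg], cycles=5)
print(acc_q.value)  # 5
```

The point of the approach is visible even at this scale: once components are values of abstract types, the simulator is just another module over those types.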
By contrast, Thorpe (Royal Signals and Radar Establishment) described ELLA, an hdl which set out to satisfy a similar set of aims. It provides extensive facilities for the introduction of user-defined data abstractions, and for the hierarchical description of structures, and it has a comprehensive set of delay primitives built in for describing timing at the gate level. The overall ELLA system, which by virtue of being written in Algol 68 is portable, incorporates an efficient simulator and integrated database facilities. The tradeoff between the two approaches (i.e., using a standard language as an hdl, against implementing an hdl in a standard language) is an interesting one, and it must remain an open question as to whether the advantages offered by tailor-made hdls and support environments offset the advantages of using, albeit with many notational and operational overheads, a powerful and potentially widely available language.
There were several interesting papers presented describing aspects of hardware simulation, but perhaps the most impressive was one by Sasahi and colleagues (NEC, Japan). This described a special-purpose multi-processor machine with a capability for mixed-level simulation of very large circuits. Its performance was quoted as being 5 ms per clock step for a 1 million gate circuit! The input to the system is a functional description (that describes how the circuit is intended to perform) and a structural description of an intended realisation. Given this data, the system automatically generates a set of test patterns, simulates the behaviour from the structural description, and compares this output with that given by the behavioural description.
The hardware consists of some 30 processors, linked together with a high-speed router network. Each processor is responsible for simulating up to a thousand 'blocks' of logic, with each block comprising some hundreds of gates. It was reported that the system has been successfully used during the design of several NEC computers, and it was suggested that simulators with this level of performance were an essential prerequisite for successful VLSI design.
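The flow described above (generate test patterns, simulate the structural description, compare against the intended behaviour) can be illustrated on a toy scale. The half-adder and all names below are invented, and the sketch bears no relation to the NEC machine's actual implementation; it shows only the shape of the comparison.

```python
import random

# Invented miniature of the flow: a functional (intended-behaviour)
# description and a structural (gate-level) description of a 1-bit
# half-adder, exercised on generated patterns and compared.

def functional(a, b):
    """Intended behaviour: sum and carry of two bits."""
    return (a + b) % 2, (a + b) // 2

def structural(a, b):
    """Gate-level realisation: XOR for the sum, AND for the carry."""
    return a ^ b, a & b

def compare(n_patterns=100, seed=0):
    """Simulate both descriptions on random patterns; report agreement."""
    rng = random.Random(seed)
    for _ in range(n_patterns):
        a, b = rng.randint(0, 1), rng.randint(0, 1)
        if functional(a, b) != structural(a, b):
            return False
    return True

print(compare())  # True
```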
IEE PROCEEDINGS, Vol. 133, Pt. E, No. 6, NOVEMBER 1986 349

An interesting paper by Bruck (Dortmund University) described how asynchronous controllers could be described in a notation based upon Petri nets, and then be automatically synthesised in terms of self-timed CMOS architectures. Significantly, extra circuitry can be automatically introduced to help overcome one of the most difficult problems in such realisations, that of ensuring the testability of the final realisation.
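For readers unfamiliar with the notation, the firing rule at the heart of a Petri net is simple enough to sketch. The interpreter and the two-place handshake net below are invented illustrations of the notation itself, not of Bruck's synthesis method.

```python
# Invented sketch of Petri-net semantics: a transition is a pair of
# (input places, output places); a marking maps places to token counts.

def enabled(marking, transition):
    """A transition is enabled when every input place holds a token."""
    inputs, _ = transition
    return all(marking[p] > 0 for p in inputs)

def fire(marking, transition):
    """Firing consumes a token from each input place and adds one to each output."""
    inputs, outputs = transition
    m = dict(marking)
    for p in inputs:
        m[p] -= 1
    for p in outputs:
        m[p] += 1
    return m

# A request/acknowledge handshake: t_req moves the token idle -> waiting,
# t_ack moves it back.
t_req = (["idle"], ["waiting"])
t_ack = (["waiting"], ["idle"])
m = {"idle": 1, "waiting": 0}
m = fire(m, t_req)            # request issued
assert not enabled(m, t_req)  # cannot re-request before the acknowledge
m = fire(m, t_ack)            # acknowledged
print(m)  # {'idle': 1, 'waiting': 0}
```

The appeal for asynchronous control is that mutual exclusion and sequencing constraints of this kind fall directly out of the token rules.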
Turning now to the third main strand of the symposium, that of verification, there were several interesting papers presented. However, it is worth noting that the term 'verification' is turning out to be a rather ill-defined word. Depending on exactly what assumptions are made, what degrees of generality are claimed, and what properties are 'verified', so the difficulty, and the utility, of verification techniques vary. There is thus a spectrum: at one end, systems consisting of hundreds, or even thousands, of gates are verified entirely automatically; at the other end, the verification of small collections of gates can require significant intellectual effort. Perhaps a taxonomy of the field is needed, together with the introduction of a more precise terminology.
Clarke, of Carnegie-Mellon University, described how useful properties (for example, liveness, termination, fairness, etc.) of a digital system could be automatically verified. The techniques were nicely illustrated by two examples: a self-timed CMOS queue element (under the 'unit-delay' assumption) and a simple traffic light controller. The technique essentially involves two steps: first the system's behaviour is described as a labelled state graph, and then, using an efficient algorithm, the truth value of the desired property (expressed using temporal logic) is evaluated with respect to the graph. The technique appears to be straightforward in use and robust enough to be of immediate practical utility.
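The two steps can be illustrated with an invented traffic-light state graph. The check below is a plain reachability computation of the kind such a model checker performs, here for the CTL property AG EF green ('from every reachable state, a green state remains reachable'); it is a sketch, not Clarke's algorithm.

```python
# Invented example: a labelled state graph for a one-way traffic light,
# and a liveness-style check (AG EF 'go') evaluated over the graph.

graph = {                      # state -> successor states
    "red":   ["green"],
    "green": ["amber"],
    "amber": ["red"],
}
labels = {"red": {"stop"}, "green": {"go"}, "amber": {"stop"}}

def reachable(graph, start):
    """All states reachable from start (including start itself)."""
    seen, stack = set(), [start]
    while stack:
        s = stack.pop()
        if s not in seen:
            seen.add(s)
            stack.extend(graph[s])
    return seen

def ag_ef(graph, labels, prop, start):
    """AG EF prop: from every reachable state, a prop-labelled state is reachable."""
    return all(
        any(prop in labels[t] for t in reachable(graph, s))
        for s in reachable(graph, start)
    )

print(ag_ef(graph, labels, "go", "red"))  # True
```

A real checker evaluates arbitrary temporal-logic formulae by fixpoint computations over the graph, but the shape of the evaluation is as above: graph first, formula second.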
Near the other end of the spectrum, Herbert (Cambridge University) gave an account of the specification and formal verification of a 'real' integrated circuit, an ECL-technology chip that had been designed to provide an interface to a fast (100 MHz) local network. What was particularly interesting was his comparison of the use of two styles of specification language. One, a specially devised notation, essentially models devices as sequential machines; the other uses the language of higher-order predicate logic. The comparison he gave came down in favour of the latter, on the grounds of its expressiveness and generality. The paper also contained a description of the process of 'guiding' an automated inferencing system to produce a formal proof asserting the correctness of the ECL chip. It was interesting to hear that not only were minor errors discovered in the intended chip design, but also that the proposed specification itself needed amendment.
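The higher-order-logic style can at least be hinted at: a device is specified as a predicate on its port signals (functions of time), and structural composition is conjunction, with internal wires hidden by quantification. The Python sketch below, with invented gates and waveforms, merely executes such predicates over a finite window; the actual work used a mechanised proof system, not execution.

```python
# Invented illustration of the higher-order-logic specification style:
# signals are functions of time, devices are predicates on signals,
# checked here over a finite window of T time steps.

def NOT(inp, out, T):
    """out is the pointwise negation of inp at every sampled time."""
    return all(out(t) == (not inp(t)) for t in range(T))

def NAND(a, b, out, T):
    return all(out(t) == (not (a(t) and b(t))) for t in range(T))

def AND_impl(a, b, out, mid, T):
    """Structural AND: a NAND feeding a NOT, joined on internal wire mid."""
    return NAND(a, b, mid, T) and NOT(mid, out, T)

def AND_spec(a, b, out, T):
    """Behavioural specification of AND."""
    return all(out(t) == (a(t) and b(t)) for t in range(T))

# Check that the implementation meets the specification on concrete waveforms,
# supplying a witness for the hidden internal wire.
a   = lambda t: t % 2 == 0
b   = lambda t: t % 3 == 0
mid = lambda t: not (a(t) and b(t))
out = lambda t: a(t) and b(t)
assert AND_impl(a, b, out, mid, T=12)
assert AND_spec(a, b, out, T=12)
```

In the logic itself the internal wire is existentially quantified rather than supplied by hand, and the implication impl ⇒ spec is proved for all signals and all time, not tested on a window.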
Overall, the symposium was a considerable success, and its organisers, IFIP Technical Committee 10 and the Information Processing Society of Japan, deserve congratulation. The symposium proceedings (which run to nearly 500 pages) will be published by North Holland.
F.K. HANNA