5 March 2002 - DCS Final Design Review: RPC detector
The DCS system of the Atlas RPC detector
V. Bocci, G. Chiodi, E. Petrolo, R. Vari, S. Veneziano (INFN Roma)
Atlas RPC muon system location
• Follow the ELMB tests done by the DCS group
PAD boards
There are about 900 PAD boards, each with an ELMB CAN node.
Location of the CAN nodes in the LVL1 muon trigger
• 900 CAN nodes, divided into chains of about 16 nodes:
– about 64 branches,
– or 32 branches if nodes are grouped with a maximum of 32 per branch.
[Figure: CAN branches in the Atlas muon detector, showing CAN nodes grouped into CAN branches]
Branches are also foreseen for the HV and LV power supplies; that design is not yet final.
CANbus on-detector connectivity
[Figure: CANbus connectivity within a trigger and readout logic sector]
PAD board layout
[Figure: PAD board with TTCrx, ELMB, XCV200 FPGA and optical link; CAN daisy chain and remote I2C connections shown]
PAD ELMB signals
• Optical link / FPGA
• SPI interface (SPI flash)
• Temperature I2C
• FPGA power down
• System reset
• CM reset
• Configuration I2C / long-distance I2C
• FPGA configuration mode select
• PROM JTAG / JTAG chain
• TTC controls
CAN node
[Figure: CAN node block diagram — transceiver logic, CAN controller chip and 8/16-bit microcontroller (node controller), with I2C, JTAG, SPI and I/O interfaces to the PAD logic (FPGA flash ROM, CM/PAD logic, chips with JTAG scan)]
PAD controls
[Figure: CAN node connections to the PAD logic — temperatures (5x LM75, AD7417 ADC on I2C), connectivity scan/board test (JTAG, SPI flash PROM), optical link, 7x PCF8575 16-bit I/O, 4x PRODE delay chips, TTC and CM ASICs, remote I2C splitter (4x)]
I2C device network ID assignment

ELMB master I2C network n.2 (ext.):
  Device          A6 A5 A4 A3 A2 A1 A0   Address
  LM75 Splitter    1  0  0  1  0  1  0   4Ah

ELMB master I2C network n.1:
  Device          A6 A5 A4 A3 A2 A1 A0   Address
  LM75 CME1        1  0  0  1  1  1  1   4Fh
  LM75 CME0        1  0  0  1  1  1  0   4Eh
  LM75 CMF1        1  0  0  1  1  0  1   4Dh
  LM75 CMF0        1  0  0  1  1  0  0   4Ch
  LM75 PAD         1  0  0  1  0  1  1   4Bh
  LM77 on Link     1  0  0  1  0  0  0   48h
  AD7417 P.D.      0  1  0  1  0  0  0   28h
  PCF8575 CME1     0  1  0  0  1  1  1   27h
  PCF8575 CME0     0  1  0  0  1  1  0   26h
  PCF8575 CMF1     0  1  0  0  1  0  1   25h
  PCF8575 CMF0     0  1  0  0  1  0  0   24h
  PCF8575 ID       0  1  0  0  0  1  0   22h
  PCF8575 P.D.     0  1  0  0  0  0  1   21h
  PCF8575 Link     0  1  0  0  0  0  0   20h

ELMB master I2C network n.0:
  Device          A6 A5 A4 A3 A2 A1 A0   Address
  CME1             1  0  0  0  1  1  1   47h
  CME0             1  0  0  0  1  1  0   46h
  CMF1             1  0  0  0  1  0  1   45h
  CMF0             1  0  0  0  1  0  0   44h
  PAD              1  0  0  0  0  0  0   40h
  TTC              0  0  1  0  1  0  X   15h ÷ 14h
  PRODE3           0  0  1  0  0  X  X   13h ÷ 10h
  PRODE2           0  0  0  1  1  X  X   0Fh ÷ 0Ch
  PRODE1           0  0  0  1  0  X  X   0Bh ÷ 08h
  PRODE0           0  0  0  0  1  X  X   07h ÷ 04h
JTAG daisy chain device assignment

ELMB master JTAG ctrl network: CME0 (7), TTC (6), CMF0 (5), CMF1 (4), LINK (3), CME1 (2), PAD (1)
ELMB master JTAG prom network: EEPROM (1)
[Diagram annotations: PAD ID number linked to the CME0, CME1, CMF0 and CMF1 IDs; phase detector (AD7416-equivalent); reset = 0]
PAD devices controlled by ELMB
• Distributed on the motherboard and on remote or piggy-back CM, link and TTC boards:
– I2C temperature sensors
– TTC
– Delay chips
– FPGA
– FPGA flash PROM
– SPI flash PROM
– I2C I/O registers
– Coincidence Matrix ASIC (about 200 I2C registers)
– Optical link controls
Hardware and software tools used
• ELMB board
• IXXAT 165 PCI CAN bus board
• IXXAT CAN analyzer
• IXXAT CANopen client
• IXXAT tinCAN PCMCIA interface
• Virtual CAN Interface library (VCI)
• CANopen Master API
Software development:
• We integrated a single I2C bus in the CAN node. A guideline has been the document from Henk, but (we use SDO and):
– we need multiple I2C buses (three); only one is implemented so far;
– we need different 'flavors' of I2C;
– (SDO block transfer could be used for large I2C registers).
• Software for FPGA readback over JTAG/SPI has been written for the AVR evaluation board.
– Readback and diff against the SPI flash memory data takes 15 s.
• We wrote new ccan.c and ccan.h to interface to the IXXAT board.
Software requirements
• We need a lot of functionality in our ELMB firmware; only some of it is covered so far by the CERN/NIKHEF-supported software:
– efficient 'low-level' I2C/JTAG/SPI instructions, as we have developed so far;
– 'high-level' instructions, like:
• load the FPGA with the SPI flash data
• check the FPGA firmware
• initialize the ASIC (200 I2C registers)
• read all temperatures
• measure clock phases from the delay chip outputs
• monitor remote splitter board voltages and temperatures
Software
• Currently we have an MS Windows PC, an IXXAT board and one ELMB. The current aim is:
– 'high-level' commands developed on the PC (Microsoft Visual C++), low-level commands sent via CAN;
– the ELMB interprets only 'atomic' instructions;
– then we will 'move' the high-level commands from the remote application into the ELMB firmware.
• We still need to:
– define the full set of CAN commands;
– understand the final size of this firmware;
– integrate with PVSS/OPC.
• The current distribution of GNU gcc now supports the AVR. Should we move from ICAV to gcc (from Windows to a generic Windows/Linux platform)?
PVSS II
• We are waiting for the collaboration's decision about the final PCI boards.
• PVSS II and OPC are now working on PCs and NI boards, in a test-case configuration different from what we need.
• Is an OPC server capable of handling our requests (real time)?
• Once our control software is implemented on the PC/ELMB, is it portable to PVSS/OPC? (Are there rules/regulations to follow?)
• We may want to start with PVSS II now, to understand these issues (e.g. database access).
ELMB in the final system
• We need about 1000 boards, including spares, for the final system.
• We have 10 boards for our prototype work.
• It is time to start thinking about the structure of the final system:
– How many CAN nodes per PC?
– How many PCs?
• A check of initialization time versus the number of CAN nodes per PC has to be done.
• (Ethernet-CAN boxes configured as OPC servers also exist; they are small.)
– Do we have an answer on space needs? We need to answer on rack allocation.