Controls and Monitoring Implementation Plan J. Leaver 03/06/2009



  • Controls and Monitoring Implementation Plan (J. Leaver, 03/06/2009)

    *Imperial College*

  • Implementation Issues
    - Organisation & responsibilities
    - General EPICS infrastructure
    - EPICS server / client organisation
    - Unification of control systems
    - Remote access
    - Monitoring
    - Controls
    - Configuration database
    - Schedule


  • Organisation of Control Systems
    - Original plan was for Daresbury Lab (DL) to provide all controls for the experiment
    - DL is responsible for many existing C&M systems (excellent quality)
    - Unfortunately, recent funding issues have limited the collaboration's ability to pay DL for new work
      - DL to continue with current projects (where possible)
      - MICE community to take responsibility for additional C&M systems


  • Organisation of Control Systems
    - MICE Online Group (MOG) created in January
      - Aim: organise data acquisition, C&M & online reconstruction
    - Controls & Monitoring Leader (JL):
      - Identify control requirements for each section of MICE
      - Decide on the most appropriate solution
      - Coordinate the effort of those involved in implementing the agreed solution


  • Organisation of Control Systems
    - MOG directly responsible for C&M infrastructure:
      - Network/hardware organisation
      - Integration of control systems (with each other & the rest of MICE)
      - User experience (i.e. how operators interact with the global C&M system)
    - For individual projects, each group within MICE should be responsible for its own system(s)
      - Contributing either EPICS development effort or funds for a 3rd party (e.g. DL) to complete the required work
    - Where necessary, MOG contributes developer effort
      - However, very limited resources are available (~1.5 man-years per year)
      - Currently seeking additional support within the community


  • EPICS Client / Server Overview


  • EPICS Server / Client Organisation
    - Wide variety of EPICS server applications permitted
      - Typically connect to physical hardware
      - Impossible to enforce a common interface/processor/OS specification
      - Each server maintained by the owner of the respective control system
      - Strict central administration unnecessary: the end user is only concerned with the availability of PVs on the network
    - EPICS clients also varied, but must be uniformly accessible
      - Users should not have difficulty finding/launching clients
      - Applications should be consistently organised/updated
      - MOG responsibility


  • EPICS Client Organisation
    - All client-side applications run on miceecserv
      - Central installation repository greatly simplifies configuration/maintenance/backup
      - MOG collates individual applications & applies updates when available from control system owners
    [Diagram: EPICS IOCs & Portable CA Servers (EPICS server applications) on the Controls Network, connected to EPICS client applications on miceecserv, miceopi1 & miceopi2]


  • EPICS Client Organisation
    - Client control/monitoring GUIs viewed directly on miceecserv, or on one of 2 Operator Interface PCs
    - OPI PCs act as dumb terminals, running displays from miceecserv via SSH
    [Diagram: EPICS IOCs & Portable CA Servers (EPICS server applications) on the Controls Network, connected to EPICS client applications on miceecserv, miceopi1 & miceopi2]


  • Unification of Control Systems
    - At user level: simple wrapper GUI provides a menu for launching individual client applications
    - At system level: employ 2 standard EPICS tools (running as background services on miceecserv):
      - Alarm Handler: monitors all servers & warns operators of abnormal/dangerous conditions
      - Channel Archiver: automatically records PV parameters to disk & provides several visualisation options (see P. Hanlet's talk)
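The Alarm Handler's basic job, flagging PVs that drift outside safe limits, can be illustrated with a minimal sketch. This is plain Python, not the actual EPICS ALH tool, and the PV names and limits are hypothetical:

```python
# Minimal sketch of the alarm-checking idea behind the EPICS Alarm Handler.
# Illustrative only; the real ALH reads a hierarchical alarm configuration.
# PV names and limits below are made-up examples.

# Hypothetical alarm limits for a few PVs: (low, high) bounds.
ALARM_LIMITS = {
    "MICE:MAG:CURRENT": (0.0, 250.0),
    "MICE:TARGET:TEMP": (10.0, 40.0),
}

def check_alarms(pv_values):
    """Return a list of (pv, value, severity) for out-of-range PVs."""
    alarms = []
    for pv, value in pv_values.items():
        low, high = ALARM_LIMITS.get(pv, (float("-inf"), float("inf")))
        if value < low or value > high:
            alarms.append((pv, value, "MAJOR"))
    return alarms

readings = {"MICE:MAG:CURRENT": 260.0, "MICE:TARGET:TEMP": 25.0}
print(check_alarms(readings))  # the magnet current exceeds its high limit
```

In the real system the Alarm Handler subscribes to PV monitors rather than polling, and alarm severities come from the IOC records themselves.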


  • User Interface


  • User Interface
    - Large wall-mounted display:
      - Alarm Handler
      - Message log
      - Any important parameters for the current run


  • User Interface
    - Client application launcher
    - Standard desktop monitor
    - Client GUI


  • User Interface
    - Connected to miceecserv


  • User Interface
    - Connected to miceopi1
    - Connected to miceopi2


  • Remote Monitoring: General Principles
    - Remote users should have a simple, easily accessible interface for routine monitoring
    - Expert remote users should have access to monitoring displays which match those in the MLCR
    - No machine on the Controls Network should be directly accessible over the internet
    - System load generated by remote monitoring should have minimal impact on control & monitoring services


  • Remote Monitoring: Web Server
    [Diagram: the Channel Archiver on miceecserv records PVs from the Controls Network IOCs/Portable CA Servers to a PV Archive; the archive is NFS-mounted on the RAL Gateway, whose Web Server (Data Server & CGI Export) serves the Java Archive Viewer & web browsers on the PPD Network/Internet]


  • Remote Monitoring: Direct PV Access
    - Could recreate the normal client displays using a web interface, but this would involve impractical development overheads
    - Instead, provide direct read-only access to PVs so the actual client GUIs may be run remotely
    [Diagram: read-only CA Gateways on the RAL Gateway expose PVs from the Controls Network IOCs/Portable CA Servers to a standard client GUI running on a remote PC (read only)]


  • Remote Monitoring: Direct PV Access
    - CA Gateway makes PVs available across subnets (with full access control), while minimising load on the underlying servers
    - To simplify end-user support, a virtual machine disk image containing EPICS + all client applications will be made available
    [Diagram: read-only CA Gateways on the RAL Gateway expose Controls Network PVs to a standard client GUI running on a remote PC (read only)]
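The access-control decision a read-only gateway makes can be sketched as follows. This is illustrative Python, not the real CA Gateway (which is configured via its own pvlist/access-security files); the PV name patterns are hypothetical:

```python
# Sketch of a read-only access decision, in the spirit of the CA Gateway's
# access control. Illustrative only; the real gateway is configured with
# pvlist/access-security files, and the patterns here are made up.
import re

# Hypothetical rule set: remote clients may read PVs matching these
# patterns, but every write request is denied.
READABLE = [re.compile(r"^MICE:.*")]

def allow(pv_name, op):
    """op is 'read' or 'write'; only reads on permitted PVs are allowed."""
    if op == "write":
        return False  # read-only gateway: never forward writes
    return any(p.match(pv_name) for p in READABLE)

print(allow("MICE:MAG:CURRENT", "read"))   # True
print(allow("MICE:MAG:CURRENT", "write"))  # False
```

This captures the key property stated above: remote clients see live PVs, but no operation they perform can change the state of the Controls Network.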


  • Remote Control
    - Where possible, operations affecting the state of any MICE system should only be performed within the MLCR
    - Remote users accessing controls can lead to unknown/unexpected running conditions, so remote control should be discouraged
    - If necessary, off-site experts will be permitted to run control client applications on miceecserv, via SSH through the RAL Gateway
    - Each expert will have an account on miceecserv which only contains the client applications for their designated system


  • Configuration Database
    - Necessary to integrate control systems with the central MICE Configuration Database:
      1. Read set point values from the database
      2. Upload PV values to the EPICS servers
      3. Modify PVs with client GUIs
      4. Download PV values from the EPICS servers
      5. Write new set point values to the database
    - For (2) & (4), could use the standard EPICS Backup & Restore Tool (BURT)
      - Backs up/restores PV values to/from snapshot files
      - However, interfacing snapshot files with the database introduces significant overheads
    - Propose creation of a custom backup/restore client


  • Configuration Database
    - Simple client application:
      - Read/write PV values via the MICE C++ wrapper for the CA C-bindings
      - XML configuration file specifies PV names & the correct sequence for write operations
      - Import/export sets of PV values from/to an XML string
      - Read/write the XML string from/to the database via the Configuration Database API
    - Manual backup/restore
      - State tagged with time, user-generated identification string, etc.
    - Monitoring of DATE DAQ state
      - Automatic backup at the start of each run
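The XML import/export step of the proposed backup/restore client can be sketched as a simple round trip. This uses Python's standard library rather than the MICE C++ CA wrapper, and the PV names and XML layout are assumptions for illustration, not the actual Configuration Database schema:

```python
# Sketch of the proposed backup/restore client's XML import/export step.
# Stand-in for the MICE C++ wrapper; PV names and the <setpoints> layout
# are assumptions, not the real Configuration Database schema.
import xml.etree.ElementTree as ET

def export_snapshot(pv_values):
    """Serialise a dict of PV name -> value into an XML string."""
    root = ET.Element("setpoints")
    for name, value in pv_values.items():
        ET.SubElement(root, "pv", name=name).text = str(value)
    return ET.tostring(root, encoding="unicode")

def import_snapshot(xml_string):
    """Parse an XML string back into a dict of PV name -> value."""
    root = ET.fromstring(xml_string)
    return {pv.get("name"): pv.text for pv in root.findall("pv")}

snap = export_snapshot({"MICE:MAG:CURRENT": 210.5})
print(snap)                    # <setpoints><pv name="MICE:MAG:CURRENT">210.5</pv></setpoints>
print(import_snapshot(snap))   # {'MICE:MAG:CURRENT': '210.5'}
```

In the real client the exported string would be pushed to (or pulled from) the database via the Configuration Database API, and restores would replay the values in the write sequence given by the XML configuration file.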


  • Configuration Database
    - Additional requirements:
      - Throughout each DAQ run, all set point values should be held in the state defined by the last Configuration Database snapshot
      - If values change, the system is in an unknown state & automated analysis of the run data cannot be performed
    - While the DAQ is in the run state, the client monitors all set point values
    - If any parameters are modified:
      - Set a PV to indicate an invalid run state (read into the DAQ stream)
      - Set a warning on the Alarm Handler display
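The run-state check described above, comparing live set points against the last snapshot and flagging any drift, can be sketched as below. This is illustrative Python; the PV names and the invalid-run-state PV are hypothetical:

```python
# Sketch of the run-state set point monitor: compare live PV values
# against the last Configuration Database snapshot and flag any drift.
# Illustrative only; PV names and the invalid-run-state PV are made up.

def find_modified(snapshot, live_values, tolerance=1e-9):
    """Return the PV names whose live value differs from the snapshot."""
    modified = []
    for pv, expected in snapshot.items():
        actual = live_values.get(pv)
        if actual is None or abs(actual - expected) > tolerance:
            modified.append(pv)
    return modified

snapshot = {"MICE:MAG:CURRENT": 210.5, "MICE:TARGET:TEMP": 25.0}
live = {"MICE:MAG:CURRENT": 215.0, "MICE:TARGET:TEMP": 25.0}

changed = find_modified(snapshot, live)
if changed:
    # In the real client this would write the invalid-run-state PV
    # (read into the DAQ stream) and raise an Alarm Handler warning.
    print("Invalid run state; modified PVs:", changed)
```

A small tolerance avoids false positives from floating-point readback noise; the appropriate threshold per PV would be a tuning decision for the real client.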


  • Configuration Database
    - Configuration Database interface still in early design stages; work not yet commenced
    - J. Leaver/P. Hanlet to develop the EPICS client
    - D. Forrest to implement database API functions for parsing/formatting EPICS set point XML strings
    - Details of run state PV monitoring to be confirmed


  • Infrastructure Schedule