TRANSCRIPT
© IBM Corporation, 2010. IBM Advanced Technical Skills. WP101783 at ibm.com/support/techdocs
An illustration: IBM WebSphere Compute Grid for z/OS integrated with an enterprise scheduler such as IBM Tivoli Workload Scheduler
See a narrated video of this on YouTube … search on ATSDemos
Preview of Technical Message
IBM WebSphere Compute Grid z/OS has an MDB interface intended to work with enterprise schedulers.
The WSGRID utility program is what connects the enterprise scheduler to Compute Grid.
WSGRID forms up a job submission message and places it on a queue. The MDB picks it up and the job is submitted inside Compute Grid.
WSGRID stays active while the job executes in Compute Grid; it feeds output to the JES spool and alerts the enterprise scheduler of the Java batch job's status.
This design allows Compute Grid Java batch to be integrated with traditional batch in a broader batch process.
Please Note …
IBM WebSphere Compute Grid is supported on all platforms supported by WebSphere Application Server. The focus of this presentation will be Compute Grid for z/OS.
Our focus will be on integration with Tivoli Workload Scheduler, but this integration design works with any scheduler capable of submitting JCL to JES.
Our focus will also be on using WebSphere MQ as the JMS provider, but there is also a solution involving the internal messaging provider of WebSphere Application Server.
[Slide diagram: System z and z/OS, with z/OS facilities and functions (WLM, RRS, SAF, RMF, Parallel Sysplex, etc.); WebSphere Application Server z/OS hosting the Compute Grid Scheduler and two Compute Grid End Points with batch applications; data systems (DB2, CICS, IMS, MQ, VSAM, etc.); job submission and dispatching]
The WebSphere Compute Grid scheduler function has a browser interface: the Compute Grid Job Console.
In addition to a browser interface, Compute Grid also provides:
• Command line interface
• Web Services interface
• RMI Client interface
• MDB interface (the interface of particular interest for integration with enterprise schedulers)
[Slide diagram: Tivoli Workload Scheduler, JES, and spool]
The question is this: what ties TWS + JES to Compute Grid?
[Slide diagram: Tivoli Workload Scheduler, JES, and spool; WebSphere Application Server z/OS on System z and z/OS with the Compute Grid Scheduler, two Compute Grid End Points hosting batch applications, data systems (DB2, CICS, IMS, MQ, VSAM, etc.), and z/OS facilities and functions (WLM, RRS, SAF, RMF, Parallel Sysplex, etc.); a JOB with PGM=WSGRID]
The answer: WSGRID, a utility program supplied with Compute Grid.
[Slide diagram: MQ input and output queues between the WSGRID job and an MDB in the Compute Grid Scheduler]
A native-code utility using MQ in BINDINGS mode means this is very fast.
Two versions of WSGRID are provided: a C/C++ native implementation on z/OS, and one implemented in Java. The native WSGRID utility interacts with Compute Grid using MQ in BINDINGS mode.
[Slide diagram: Tivoli Workload Scheduler, JES, and spool; WebSphere Application Server z/OS on System z and z/OS with the Compute Grid Scheduler, End Points, batch applications, data systems, and z/OS facilities; a JOB with PGM=WSGRID; MQ input and output queues]
Let’s take a high-level look at how this works, then we’ll dig into some of the details
1. TWS submits WSGRID JCL to JES (details on JCL coming)
2. The JCL names PGM=WSGRID, which results in the program being launched
3. WSGRID forms up the message (details coming) and places it on the input queue
4. The MDB in the scheduler fires and pulls the message off the input queue
5. The job is dispatched, executes, and completes
6. The scheduler feeds output back to MQ in a series of messages
7. WSGRID pulls the messages off the output queue and writes them to JES
8. WSGRID ends and JES alerts TWS of the job return code
9. If desired, normal JES spool archiving may take place
[Slide diagram: Tivoli Workload Scheduler; MQ input and output queues]
• Standard JOB card; EXEC PGM=WSGRID
• SYSPRINT DD to JES
• Name the QMGR and the input / output queues
• Specify the input xJCL path and file name
• Provide any substitution properties you wish to pass into the xJCL
• STEPLIB to the WSGRID module and the MQ SCSQLOAD and SCSQAUTH libraries
Let’s see what the JCL for WSGRID looks like, and start to demystify how this works.
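As a rough sketch only, a minimal WSGRID job following the elements listed above might look like this. The dataset names, queue names, paths, and property keys are illustrative assumptions, not values from this presentation.

```jcl
//WSGRID1  JOB (ACCT),'RUN WSGRID',CLASS=A,MSGCLASS=H
//* Standard JOB card; EXEC PGM=WSGRID
//RUNGRID  EXEC PGM=WSGRID
//* STEPLIB to the WSGRID module and the MQ libraries
//STEPLIB  DD DISP=SHR,DSN=WSGRID.LOADLIB
//         DD DISP=SHR,DSN=MQ.SCSQLOAD
//         DD DISP=SHR,DSN=MQ.SCSQAUTH
//* SYSPRINT DD to JES
//SYSPRINT DD SYSOUT=*
//* Name the QMGR and queues, point at the xJCL, and pass any
//* substitution properties (the property keys below are assumed)
//SYSIN    DD *
queue-manager-name=MQW1
scheduler-input-queue=WASIQ
scheduler-output-queue=WASOQ
xJCL=/u/user1/jobs/myjob.xml
substitution-prop.checkpoint.interval=15
/*
```

The substitution properties supplied here are what get merged into the xJCL at submission time, which is how one piece of xJCL can serve many job instances.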
[Slide diagram: Tivoli Workload Scheduler; MQ input and output queues; the xJCL; the PGM=WSGRID JOB; WebSphere Application Server z/OS on System z and z/OS with the Compute Grid Scheduler, End Points, batch applications, and data systems]
The output ends up in the JES spool, and is viewable like any other JES spool output.
The first part of the job output shows the xJCL and the substitution variables.
The second part of the job output shows the return codes for each step as well as the overall job return code.
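For orientation, xJCL is an XML job definition. A minimal fragment with one substitution variable might look like the following sketch; the element names reflect the Compute Grid xJCL schema as commonly documented, while the job, bean, class, and property names are made up for illustration.

```xml
<?xml version="1.0" encoding="UTF-8"?>
<!-- Minimal illustrative xJCL; all names here are hypothetical -->
<job name="SimpleEcho" default-application-name="SimpleEcho">
  <jndi-name>ejb/com/example/batch/SimpleEchoBean</jndi-name>
  <substitution-props>
    <!-- Default value; can be overridden by a property passed in via WSGRID -->
    <prop name="wsbatch.count" value="5"/>
  </substitution-props>
  <job-step name="Step1">
    <classname>com.example.batch.SimpleEchoStep</classname>
    <props>
      <prop name="count" value="${wsbatch.count}"/>
    </props>
  </job-step>
</job>
```

In the job output, the resolved values of substitution variables such as `${wsbatch.count}` appear alongside the xJCL, which is what the "first part" of the spool output shows.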
[Slide diagram: Tivoli Workload Scheduler, JES, and spool; MQ input and output queues; the xJCL; the PGM=WSGRID JOB; WebSphere Application Server z/OS on System z and z/OS with the Compute Grid Scheduler, End Points, batch applications, and data systems]
Are jobs submitted through WSGRID controllable from the Job Management Console? Yes! Jobs submitted through WSGRID are controllable through the Job Management Console (JMC).
And actions in the JMC (for example, Cancel Job) are fed back to JES and TWS through WSGRID.
[Slide diagram: Tivoli Workload Scheduler, JES, and spool; WebSphere Application Server z/OS on System z and z/OS with the Compute Grid Scheduler, End Points, batch applications, and data systems]
What if you don’t have MQ as a part of your enterprise messaging infrastructure?
[Slide diagram: a JCL JOB using BPXBATCH or JZOS to launch WSGrid.sh; a JMS destination on the SIBus; the MDB]
Then use the Java client with the built-in WAS messaging. Run the supplied wsadmin script, wsgridConfig.py, to create the messaging components inside the Compute Grid Scheduler.
TWS integration uses JCL just as before. The difference is that the job now launches a Java client rather than the native MQ client.
The WSGrid Java client forms the message and places it on the JMS destination. The MDB fires, pulls the message, and submits the job.
If using JZOS, output can be directed back to the JES spool; if using BPXBATCH, output goes to the file system.
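A sketch of what the BPXBATCH variant of that JCL might look like, with all paths and file names being illustrative assumptions:

```jcl
//WSGRIDJ  JOB (ACCT),'JAVA WSGRID',CLASS=A,MSGCLASS=H
//* BPXBATCH launches the WSGrid.sh script in the z/OS UNIX file
//* system; with BPXBATCH the output goes to the file system
//STEP1    EXEC PGM=BPXBATCH
//STDPARM  DD *
SH /u/user1/WSGrid.sh /u/user1/wsgrid.props
/*
//STDOUT   DD PATH='/u/user1/wsgrid.out',
//            PATHOPTS=(OWRONLY,OCREAT,OTRUNC),PATHMODE=SIRWXU
//STDERR   DD PATH='/u/user1/wsgrid.err',
//            PATHOPTS=(OWRONLY,OCREAT,OTRUNC),PATHMODE=SIRWXU
```

With the JZOS launcher in place of BPXBATCH, the script's output can instead be routed to SYSOUT DDs, which is how the job output gets back to the JES spool.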
Integration with Traditional Batch
We know that Tivoli Workload Scheduler (TWS) is a powerful enterprise scheduler
We’ve seen how it integrates with WebSphere Compute Grid
Now let’s see how we can use the power of TWS to integrate Compute Grid and traditional batch into a larger batch process
Finally, we’ll simplify the pictures a bit to reduce clutter and focus on the key points
[Slide diagram: System z and z/OS with JES and spool; Tivoli Workload Scheduler; the WSGRID JOB with MQ input and output queues; traditional batch jobs COBOL Batch 1, Assembler Batch 2, and C/C++ Batch 3; the WCG Scheduler and two WCG Endpoints hosting Batch Appl A, B, and C]
Imagine you have a mixed-batch environment, with Compute Grid and traditional batch
You have Tivoli Workload Scheduler and other z/OS functions (JES)
You have a series of traditional batch jobs
You have WebSphere Compute Grid in place with several batch applications deployed to the batch endpoints
You plan to integrate TWS with WebSphere Compute Grid, so you have the WSGRID program ready and the MQ input / output queues defined
Imagine further that you have a TWS batch workflow defined with mixed Java and native batch
[Slide diagram: your TWS batch workflow, mixing Java jobs A, B, and C with traditional jobs 1 (COBOL), 2 (Assembler), and 3 (C/C++)]
JCL Job Library:
• COBOL Batch 1 JCL
• Assembler Batch 2 JCL
• C/C++ Batch 3 JCL
• WSGRID JCL for Java A
• WSGRID JCL for Java B
• WSGRID JCL for Java C
You assemble the JCL for your traditional native batch jobs so TWS has access to submit them to JES, and you assemble the JCL to invoke an instance of WSGRID for each Java batch job in WCG.
Let’s now walk through an illustration of how TWS would integrate traditional and Java batch …
1. TWS process initiated
2. WSGRID job initiated
3. Message formed based on properties inline with the JCL or in a named properties file
4. Job dispatched to the end point where the application is deployed
5. Job completes
6. Job output goes to spool
7. WSGRID job spun down
8. Tivoli Workload Scheduler readies itself to proceed in the workflow … COBOL Job 1 … that's next
TWS moves on to the next job in its process – a traditional COBOL batch job
1. JES initiates the batch job
2. COBOL Batch 1 job executes
3. COBOL Batch 1 job completes
4. Job output goes to spool
5. Tivoli Workload Scheduler readies itself to proceed in the workflow … simultaneous submission
A TWS process may consist of multiple jobs run simultaneously. It makes no difference to TWS if the jobs are mixed Java and native.
[Slide diagram animation: Tivoli Workload Scheduler simultaneously submits the Assembler Batch 2 job to JES and a WSGRID JOB; the WSGRID message flows through the MDB to Batch Appl B; job output goes to spool]
Note: We're going to speed this up quite a bit.
The processing of the final two jobs in this batch flow unfolds just like the first two did … the C/C++ Batch 3 job runs through JES while a WSGRID JOB drives Java job C through MQ and the MDB to Batch Appl C. Job output goes to spool, and the Tivoli process completes with all jobs ending RC=0.
Summary of this Show …
• Integration with enterprise schedulers is provided by the WSGRID function
• WSGRID is a module that's easily submitted with batch JCL
• One option is a thin MQ client that puts a message on an MQ queue; the Compute Grid MDB picks it up and submits the job
• WSGRID feeds Compute Grid job output back to the JES spool, and informs the enterprise scheduler of the return code
• Because of this model, Compute Grid may be integrated with traditional batch using enterprise scheduler process flows
• There is a Java-based client that does not require MQ; it's not as fast as the native MQ client, however
End