TurboTuneSQL Data Gathering Instructions

Below are the instructions for a simple collection of the statistical data required to perform an initial analysis of your Db2 SQL and programs within the data center.

These instructions are also available for download as a PDF.

INSTRUCTIONS TO COLLECT DATA FOR Db2 APPLICATION PERFORMANCE ANALYSIS WITH TURBOTUNESQL®

PLEASE NOTE THERE ARE TWO PHASES FOR COMPLETION OF DATA COLLECTION WHEN UTILIZING TTSQL. PHASE II INSTRUCTIONS WILL FOLLOW SUBMISSION OF PHASE I.

The collection is performed in two phases. Please complete Phase I and upload the data to your company account as noted below for review. Once the initial assessment is complete, CPSI will project Db2 savings. Once the engagement is initiated, Phase II collection instructions will be forwarded.

Basic prerequisites:

1) Db2 version 11 or above
2) User must have SYSADM authority in Db2 to process both Phase I and Phase II (a sample authority check is sketched after the note below)
3) User must have authorization to run ADRDSSU to dump the collection datasets
4) User must have authorization to run SYSPROC.ADMIN_INFO_SYSPARM stored procedure in Db2
5) User must have authorization to upload and/or transmit data to CPSI

NOTE: No APF authorization is required.
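
For prerequisite 2, one common way to confirm which authorization IDs currently hold SYSADM is to query the Db2 catalog. This is only a sketch; IDs holding install SYSADM come from system parameters rather than grants and will not appear in this query.

    -- Authorization IDs holding SYSADM by grant ('Y' or 'G'); blank means not held.
    SELECT GRANTEE, SYSADMAUTH
      FROM SYSIBM.SYSUSERAUTH
     WHERE SYSADMAUTH <> ' ';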

PHASE I:

STEPS:

1. In TSO option 3.2, allocate 2 datasets, HLQ.PHASE1.REXX and HLQ.PHASE1.INST.

DCB and SPACE for the 2 datasets: LRECL=80, BLKSIZE=3120, RECFM=FB and SPACE=(CYL, (5,5))
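
If you prefer a command line instead of option 3.2, the same attributes can be produced with TSO ALLOCATE commands along these lines (each entered as a single command; HLQ is the placeholder high-level qualifier used above):

    ALLOCATE DATASET('HLQ.PHASE1.REXX') NEW CATALOG DSORG(PS) RECFM(F B) LRECL(80) BLKSIZE(3120) SPACE(5,5) CYLINDERS
    ALLOCATE DATASET('HLQ.PHASE1.INST') NEW CATALOG DSORG(PS) RECFM(F B) LRECL(80) BLKSIZE(3120) SPACE(5,5) CYLINDERS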

2. Upload file sqlBASE_phase1_rexx to HLQ.PHASE1.REXX in binary mode

Upload file sqlBASE_phase1_inst to HLQ.PHASE1.INST in binary mode
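
Any transfer method that preserves binary data can be used for the uploads. As one hedged example, a z/OS FTP session might look like the following; the host name is a placeholder, and because the datasets were pre-allocated in step 1 no SITE allocation parameters are needed:

    ftp yourhost.example.com
    binary
    put sqlBASE_phase1_rexx 'HLQ.PHASE1.REXX'
    put sqlBASE_phase1_inst 'HLQ.PHASE1.INST'
    quit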

3. In TSO option 6, execute: EXEC 'HLQ.PHASE1.REXX' 'HLQ.PHASE1.INST'. This runs a REXX routine that prepares the collection jobs.

3.1. In the first panel, specify:

Db2 SYSTEM – The Db2 SSID from which to collect data; this is the Db2 subsystem to be analyzed.

Number of Collections – This is the number of times to read Db2 memory to collect information about programs/statements currently being executed.

(DEFAULT 4 TIMES)

Delay Time in Minutes – This is the time between collections.

(DEFAULT 15 MINUTES)

NOTE: Db2 TRACES – These traces are mandatory in order to collect all CPU and WAIT times and other information about programs, statements, correlations, objects, etc. They add a maximum of 1.5% overhead while they are on and have no other impact on the system. The traces will be started and stopped automatically via batch job (PH1JOB1, explained further in this procedure); an illustrative sketch of the kind of commands involved appears after this panel's field descriptions. A pop-up may be displayed showing the Db2 traces that will be turned on.

With the default configuration, the collection will last approximately 50 minutes.

OUTPUT DATA SET – This dataset will contain all the information collected for the first phase. This is the dataset to be downloaded in binary mode at the end of the collection.
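
As noted above, the traces are started and stopped by PH1JOB1, so no manual commands are required. Purely as an illustration of the kind of Db2 commands involved (the actual trace types, classes, and IFCIDs are selected by the generated job; DSN1 is a placeholder SSID), they resemble:

    -DSN1 START TRACE(MON) CLASS(1,2,3) DEST(SMF)
    -DSN1 STOP TRACE(MON) CLASS(1,2,3)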

3.2. In the second panel, specify:

CCSID UNICODE: 367 (use 00367 as the default). This field must match USCCSID in the DSNHDECP module (a hedged example of how to verify both CCSID values is sketched after this panel's field descriptions).

CCSID EBCDIC: 1047 (use 1047 as the default). This field must match SCCSID in the DSNHDECP module.

PDS: The SAMPLIB dataset containing the Db2 DSNTIJUZ system parameters member.

MEMBER: The Db2 system parameters member (DSNTIJUZ).
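
If you are unsure of the USCCSID and SCCSID values, they can be confirmed from the Db2 system parameters. One hedged approach uses the SYSPROC.ADMIN_INFO_SYSPARM stored procedure already listed in the prerequisites, assuming its documented three-parameter form (Db2 member name in, return code and message out, plus a result set); 'DSN1' is a placeholder member name, and NULL can be used for a non-data-sharing subsystem. Look for USCCSID and SCCSID under macro DSNHDECP in the result set.

    -- Returns a result set of Db2 system parameters, including DSNHDECP values.
    CALL SYSPROC.ADMIN_INFO_SYSPARM('DSN1', ?, ?);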

3.3. In the third panel, specify:

Db2 SDSNEXIT DATASET for this Db2

Db2 SDSNLOAD DATASET for this Db2

JOB CARD to run jobs PH1JOB1 and PH1JOB2 (a sample job card is sketched after this panel's field descriptions).

Working Datasets HLQ: The high-level qualifier for the datasets used during collection.

Space Allocations: The primary and secondary space for the working datasets. (The values are in cylinders.)
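
A typical job card for the JOB CARD field might look like the following; the job name, accounting information, classes, and other keywords are placeholders that should follow your site's standards:

    //TTSQLPH1 JOB (ACCT),'TTSQL PHASE1',CLASS=A,MSGCLASS=X,
    //         NOTIFY=&SYSUID,REGION=0M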

3.4. In the fourth panel, press <enter> to edit jobs PH1JOB1 and PH1JOB2, or <PF3> to exit. The PDS containing the jobs will be named <userid>.SQLBASE.PH1.JCL.
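
After reviewing and editing the jobs, they are run as ordinary batch jobs with the job card supplied above. One way, assuming the default PDS name, is to submit them from TSO, running PH1JOB1 first and PH1JOB2 only after the collection has completed:

    SUBMIT '<userid>.SQLBASE.PH1.JCL(PH1JOB1)'
    SUBMIT '<userid>.SQLBASE.PH1.JCL(PH1JOB2)'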

3.4.1. Information about job PH1JOB1:

STEP1: Delete temporary datasets

STEP2: Start necessary Db2 Traces

STEP3: Collect information about the programs/statements currently being executed. Note: This step will take approximately 50 minutes in the default configuration.

STEP4: Stop Db2 Traces. This will stop only the traces started in STEP2.
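
If you wish to confirm that the traces started by PH1JOB1 are no longer active after STEP4, the standard Db2 display command can be used (DSN1 is a placeholder SSID):

    -DSN1 DISPLAY TRACE(*)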

3.4.2. Information about job PH1JOB2:

STEP1: Delete temporary datasets

STEP2: Dump all information collected.

Include the SMF dump datasets in <inform SMF1>, <inform SMF2>… accordingly.

*** The date/time period of the SMF records must match the date/time period
specified in DDNAME SMFINFO in PH1JOB1/STEP3. (A sketch of a typical SMF dump
step appears after this job's step list.)

STEP3: Create the dataset to be uploaded to Critical Path Software, Inc.
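
The SMF dump datasets referenced in STEP2 are normally produced by your site's standard SMF dump procedures. Purely as a hedged illustration of what such a step can look like (dataset names, dates, and times are placeholders to be adapted; Db2 trace data is written to SMF record types 100–102), an IFASMFDP step might resemble:

    //SMFDUMP  EXEC PGM=IFASMFDP
    //SYSPRINT DD SYSOUT=*
    //DUMPIN   DD DISP=SHR,DSN=SYS1.SMF.DAILY
    //DUMPOUT  DD DSN=HLQ.SMF.EXTRACT,DISP=(NEW,CATLG),
    //            UNIT=SYSDA,SPACE=(CYL,(50,50),RLSE)
    //SYSIN    DD *
      INDD(DUMPIN,OPTIONS(DUMP))
      OUTDD(DUMPOUT,TYPE(100:102))
      DATE(2024001,2024001)
      START(0900)
      END(1100)
    /*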

3.5. Download the result (the OUTPUT DATA SET named in the first panel) in binary mode. Please rename the file to "companyname"_sqldata_Phase_I.xmt. A secure upload link will be provided by Critical Path Software for transfer of the file.
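
As with the uploads in step 2, any binary-capable transfer can be used for the download. A hedged FTP example follows; the host name is a placeholder, 'HLQ.PHASE1.OUTPUT' stands for whatever you entered as the OUTPUT DATA SET, and companyname should be replaced with your company name:

    ftp yourhost.example.com
    binary
    get 'HLQ.PHASE1.OUTPUT' companyname_sqldata_Phase_I.xmt
    quit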

Critical Path Software, Inc. will analyze the data. A report projecting savings will be provided and instructions for PHASE II will be sent.

END PHASE I.