How to copy the statistics for all the objects in a database from one subsystem to another subsystem.
Release: R20
Component: RCM
For performance testing on a development subsystem it can be useful to migrate just the catalog statistics from production and then rebind the development packages to see which access paths are chosen. This shows how the packages will behave in production and can reveal SQL performance issues that should be dealt with before going to production.
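For example, once the statistics have been copied and a development package has been rebound with EXPLAIN(YES), the chosen access paths can be reviewed with a query against the PLAN_TABLE. This is only a sketch; the package name MYPACK and the PLAN_TABLE qualifier authid1 are illustrative placeholders:
-- Review access paths after REBIND PACKAGE ... EXPLAIN(YES)
-- MYPACK and the authid1 qualifier are illustrative placeholders
SELECT QUERYNO, QBLOCKNO, PLANNO, TNAME,
       ACCESSTYPE, MATCHCOLS, ACCESSNAME, INDEXONLY, PREFETCH
FROM   authid1.PLAN_TABLE
WHERE  PROGNAME = 'MYPACK'
ORDER BY QUERYNO, QBLOCKNO, PLANNO
;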
When the statistics are to be moved across LPARs, before beginning create a partitioned dataset in which to save the analysis output for transport to the remote (cross-LPAR) subsystem. The dataset must be moved over to the other LPAR and the analysis output executed there. The partitioned dataset must have an LRECL of 80 and a RECFM of FB.
If the statistics migration is on the same LPAR, you may use either a partitioned dataset or managed output to hold the analysis output.
At the RC/Migrator Main Menu create a new migration strategy as normal.
T S SRC +---- LAST UPDATE ----
O STRATEGY DESCRIPTION CREATOR P O SSID USER DATE TIME
c MIGSTAT stats only migration on DB userid1 M Y ssid <== STRATEGY CREATION
Press ENTER
This takes you to the primary object selection screen.
When migrating at database level, it is best to explode down the object dependencies.
So, place an 'e' next to Database and, in the criteria section, enter the database name and creator. Using 'e' means you will not have to make subsequent selections for dependent objects.
---------------- RC/M Create Migration Strategy --------------- yy/mm/dd hh:mm
COMMAND ===>
Strategy ===> MIGSTAT Description ===> STATS ONLY MIGRATION ON DB
Creator ===> userid1 Share Option ===> U (U,Y,N,X,L) SRC SSID ===> ssid
----------------------------------------------------------------------------
PRIMARY OBJECT TYPE SPECIFICATION. (Select one type with 'A', 'S', or 'E')
_ Storage Group
e Database
_ Tablespace
_ Table
_ Index
_ View
_ Synonym/Alias
_ Trigger
_ Routine
PRIMARY OBJECT SELECTION SPECIFICATION.
Name ==> PTDB > Creator ==> authid1 > Where ==> N
After pressing Enter, you are returned to the RC/M Strategy Services screen.
The next step is to analyze the strategy using the "a" line command.
RMS1 20.0 -------------- RC/M Strategy Services -------------- yy/mm/dd hh:mm
COMMAND ===> SCROLL ===> CSR
DB2 SSID ===> ssid
STRATEGY ===> MIGSTAT CREATOR ===> authid1 TYPE ===> M SRC SSID ===> *
--------------------------------------------------------------------- authid1
T S SRC +---- LAST UPDATE ----+
O STRATEGY DESCRIPTION CREATOR P O SSID USER DATE TIME
_ ________ _________________________ authid1 _ N ____ <== STRATEGY CREATION
a MIGSTAT STATS ONLY MIGRATION ON D authid1 M U ssid authid1 yy/mm/dd hh:mm
Press enter.
20.0 ----------- RC/M Migration Strategy Analysis ----------- yy/mm/dd hh:mm
COMMAND ===>
STRATEGY ===> MIGSTAT DESCRIPTION ===> STATS ONLY MIGRATION ON D
CREATOR ===> authid1 SHAREOPTION ===> U
--------------------------------------------------------------------- authid
EXECUTION SPECIFICATIONS ACCESS SPECIFICATIONS
EXECUTION MODE ===> ( O - Online, B - Batch ) ACM ANALYSIS ===> N
OVERRIDE ===> N PDS ANALYSIS ===> N
RECOVERY ===> N
DB2 SSID SPECIFICATIONS
OPTION SPECIFICATIONS SOURCE SSID ===> ssid
UPDATE OPTIONS ===> N TARGET SSID ===> ssid
PROFILE NAME ===>
PROFILE CREATOR ===>
GLOBAL CHANGES ===> N
SET NAME ===>
SET CREATOR ===>
OUTPUT DATASET SPECIFICATIONS
EDIT DATASET ===> N
DATASET NAME ===> 'authid1.ANALYSIS.BPCODE(MIGSTAT)'
VOLUME SERIAL ===> ( If not cataloged )
Note: under OUTPUT DATASET SPECIFICATIONS, enter the partitioned dataset name in quotes with a member name that matches the strategy name. When a partitioned dataset is used, the strategy name is appended to the string because it is used as the member name.
Enter a "Y" in UPDATE OPTIONS and press enter.
This displays the Analysis Options Screen.
20.0 RC/M Migrate Strategy Analysis Options
COMMAND ===>
Enter SAVE to save settings. END to return. "?" in field for field level help.
Exclusive Options Utility Options Output Options
NO .AUTHS ==> Y STATS (Y,N,S,A)==> S ANALYSIS HEADER ==> Y
BND/DAT/STA/SQL/GRNT/RI ==> R CHECK DATA ==> Y IMPACT ANALYSIS ==> Y
(B,D,R,S,G,I/N) COPY (Y,P,N) ==> N RPT IN DDLFILE ==> Y
COMMIT ASAP (SQL ONLY) ==> N (B)IND/(R)EBND ==> C
Control Options REBUILD INDEX ==> Y
VIEW (IMP/EXP) ==> N Model Options IDCAMS Options
AUX IMPLODE ==> N MOD ID ==> modelid VSAM DEFINES ==> N
MQT IMPLODE ==> N MOD CREATOR ==> authid1 VSAM DELETES ==> N
RI (foreign keys) ==> L UPDATE MOD ==> N
SECURITY ==> N
DROP OBJECTS ==> N Data Unload Options Dataset Del Options
DEFAULT SQLID ==> authid1 ALL ROWS ==> Y UNLOAD ==> Y
ALIAS SQLID ==> NUM ROWS ==> TEMPORARY ==> Y
LOB OBJECTS ==> Y DATA STATS ==> Y
TABLE OBID ==> N TRUNCATE ==> N
SEQ IMPLODE ==> (T,R,A,blank,S)
RTN IMPLODE ==> (T,V,A,blank)
RTN INTERACTION ==> N (Y,N,A,O,E,M,S)
MQT REFRESH ==> N (Y,N,B,U)
SET TERMINATOR ==> (blank defaults to ';')
Set STATS to "S" so that no control cards are generated.
------Utility Options----
STATS (Y,N,S,A) ==> S
Under the Output Options, set the following fields to Y.
----Output Options----
ANALYSIS HEADER ==> Y
IMPACT ANALYSIS ==> Y
RPT IN DDLFILE ==> Y
This provides information on what has occurred in the strategy.
As this analysis is for STATS only, change the values below.
Exclusive Options
NO .AUTHS ==> Y
BND/DAT/STA/SQL/GRNT/RI ==> R
(B,D,R,S,G,I/N)
COMMIT ASAP (SQL ONLY) ==> N
Set BND/DAT/STA/SQL/GRNT/RI ==> R for statistics only, and set NO .AUTHS ==> Y so that no .AUTH statements are generated.
Set the @DEFAULT model that will be used.
-------Model Options------
MODEL ID ==> modelid
MODEL CREATOR ==> authid
UPDATE MODEL ==> N
Type SAVE on the command line of this screen to save the currently selected values, then press PF3 to return to the RC/M Migration Strategy Analysis screen.
Make sure an SSID is entered in both the SOURCE SSID and TARGET SSID fields.
DB2 SSID SPECIFICATIONS
SOURCE SSID ===> ssid1
TARGET SSID ===> ssid2
NOTE: If the analysis dataset will be moved to another LPAR, specify as the Target SSID the subsystem on which the analysis output will be executed. This way the correct .CONNECT ssid statements will be generated in the code, for example:
.SYSTEM SQLDDL
.CONNECT ssid2
Now submit the analysis, either online or in batch mode. Running online will tie up your session until it completes. The analysis generates the SQL statements needed to update the catalog statistics.
Once the analysis completes, the dataset contains the information needed to migrate the statistics.
Check the generated analysis report for error and warning messages about what the analysis found in the source catalog, such as warnings about missing catalog statistics for tables and indexes like this:
-- RMA252I NO STATISTICS UPDATE STATEMENTS GENERATED FOR TABLE
-- PTI.ACCUM_STRATEGY - RUNSTATS NOT PRESENT
and
-- RMA252I NO STATISTICS UPDATE STATEMENTS GENERATED FOR INDEX
-- PTI.PTSE_IX_PREFX_NFM - RUNSTATS NOT PRESENT
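These warnings mean that RUNSTATS has never been run against those objects on the source subsystem, so there are no statistics to copy for them. If desired, a query such as the following (a sketch using the PTDB database name from this example) can be run on the source subsystem beforehand to list tables that have no statistics collected:
-- Tables in the source database with no RUNSTATS collected
-- (STATSTIME still at its default value; CARDF will be -1)
SELECT CREATOR, NAME, STATSTIME, CARDF
FROM   SYSIBM.SYSTABLES
WHERE  DBNAME = 'PTDB'
AND    STATSTIME = '0001-01-01-00.00.00.000000'
;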
SQL UPDATE and INSERT statements appear in the analysis output, depending on what statistics are found on the source subsystem, for example:
UPDATE SYSIBM.SYSTABLESPACE
SET
NACTIVE = 0
,STATSTIME = '0001-01-01-00.00.00.000000'
,NACTIVEF = -0.1E+01
WHERE
DBNAME = 'PTDB' AND
NAME = 'ALOGFILE'
;
and
INSERT INTO SYSIBM.SYSCOLDIST
( FREQUENCY, STATSTIME, IBMREQD, TBOWNER, TBNAME, NAME, COLVALUE,
TYPE, CARDF, COLGROUPCOLNO, NUMCOLUMNS, FREQUENCYF,
QUANTILENO, LOWVALUE, HIGHVALUE )
VALUES(
1
,'2019-06-25-09.48.32.890255'
,'N'
,'PTI'
,'PTALT_SYSTBL_0160'
,'USERID'
, ''
,'C'
, +0.123E+03
, X'00010002'
, 2
, -0.1E+01
, -1
, ''
, ''
)
;
So, what you have is essentially a set of SQL statements that update the target catalog, and in this format the SQL can be used more than once and on different subsystems if required.
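After the analysis output has been executed on the target, a quick way to confirm that the statistics arrived is to compare a few key catalog values between the two subsystems. A sketch, again using the PTDB database name from this example:
-- Spot-check migrated table statistics on the target subsystem
-- and compare the values with the same query run on the source
SELECT CREATOR, NAME, CARDF, NPAGESF, STATSTIME
FROM   SYSIBM.SYSTABLES
WHERE  DBNAME = 'PTDB'
ORDER BY CREATOR, NAME
;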
Move the dataset to the target LPAR, go to the Database Management for DB2 for z/OS main menu, and type BP on the command line.
This opens the Batch Processor Interface screen, where the previously moved analysis output dataset and member can be entered in the DATA SET NAME and MEMBER fields and executed.
PTBP 20.0 ---------- Batch Processor Interface --------- yyyy/mm/dd hh:mm:ss
COMMAND ===>
EXECUTION SPECIFICATIONS:
EXECUTION MODE ===> ( B - Batch mode, O - Online mode )
DB2 SSID ===> ssid2
BATCH PROCESSOR INPUT DATA SET:
DATA SET NAME ===> 'authid1.ANALYSIS.BPCODE'
MEMBER ===> MIGSTAT ( Blank or pattern for member selection list )
VOLUME SERIAL ===> ( If not cataloged )
EDIT DATA SET ===> N ( Y - Yes, N - No )
PROCESSING OPTIONS:
UNLOAD MODE ===> A CONTINUE IF:
RESTART ===> N -WARNING ===> Y
TERM UTILITY ===> N -SQL ERROR ===> N
RETRY LIMIT ===> 10 -BIND ERROR ===> N
WRAP/TRUNCATE ===> T -LOAD DISCARDS ===> N
RE-EXEC CHECK ===> N
AUDIT OPTIONS
AUDIT List Dest ===> P ( P - print; D - data set; N - No audit; U - Update)
Print Setting - Class X Dest A
If the analysis will be executed on the same LPAR, use the 'S' line command on the analysis output entry; this opens the Batch Processor Interface screen, where the analysis can then be submitted online or in batch as above.
RMS1 20.0 -------------- RC/M Strategy Services -------------- yy/mm/dd hh:mm
COMMAND ===> SCROLL ===> CSR
DB2 SSID ===> ssid
STRATEGY ===> MIGSTAT CREATOR ===> authid1 TYPE ===> M SRC SSID ===> *
--------------------------------------------------------------------- authid1
T S SRC +---- LAST UPDATE ----+
O STRATEGY DESCRIPTION CREATOR P O SSID USER DATE TIME
_ ________ _________________________ authid1 _ N ____ <== STRATEGY CREATION
_ MIGSTAT STATS ONLY MIGRATION ON D authid1 M U ssid authid1 yy/mm/dd hh:mm
s authid1.ANALYSIS.BPCODE(MIGSTAT) authid1 yy/mm/dd hh:mm
So, while running this analysis output, the only errors you should expect are SQL errors against the target catalog. The analysis output contains .SYNC statements between the SQL statements, which provide a checkpoint restart capability in case any SQL errors are found and must be manually corrected.
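For illustration only, the body of the generated member has roughly the following shape. The second tablespace name and the checkpoint numbers on the .SYNC statements are illustrative; the actual values are assigned by the analysis:
.SYSTEM SQLDDL
.CONNECT ssid2
UPDATE SYSIBM.SYSTABLESPACE
SET NACTIVEF = -0.1E+01
WHERE DBNAME = 'PTDB' AND NAME = 'ALOGFILE'
;
.SYNC 0010
UPDATE SYSIBM.SYSTABLESPACE
SET NACTIVEF = -0.1E+01
WHERE DBNAME = 'PTDB' AND NAME = 'BLOGFILE'
;
.SYNC 0020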