Channel: SAP NetWeaver Administrator

Using VB code in BEx Analyzer


VB coding in BEx Analyzer.

In some situations, MS Visual Basic code can be used to transform the data displayed in BEx Analyzer.
Let's suppose we have to create a report that presents service costs with respect to the country of the vendor and the code of the service.

The problem is that two queries are needed in this case: a first query showing the country associated with the vendor (0VENDOR__0COUNTRY), and a second query using 0PCOMPANY__0COUNTRY (because some records have no data in 0VENDOR).

Now we have two tables that we have to combine into one to see the data properly.
First, based on 0VENDOR__0COUNTRY:

SERVICE (code and text)   0VENDOR__0COUNTRY (code)   AMOUNT (rounded)   AMOUNT
232 service group 1       DE                         100                100,01
232 service group 1       GB                         200                200,01
236 service group 1       DE                         150                150,02

 

Second, based on 0PCOMPANY__0COUNTRY:

SERVICE (code and text)   0PCOMPANY__0COUNTRY (code)   AMOUNT (rounded)   AMOUNT
232 service group 1       DE                           300                300,01
232 service group 1       GB                           100                100,01
236 service group 1       DE                           50                 50,02

 

We need to combine the data from both tables into:

SERVICE (code and text)   COUNTRY (code)   AMOUNT (rounded)   AMOUNT
232 service group 1       DE               400                400,02
232 service group 1       GB               300                300,02
236 service group 1       DE               200                200,04

 

This is the moment where we can use VB code.

What should be done in this case:
1. Data from the first table is copied to another location.
2. Data from the second table is copied to the same location, just below the last row of the first table
(now we have one table containing the data from both tables).
3. Data in the new table is sorted by service and country.
4. Data is aggregated by service and country.
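For anyone who wants to prototype the combine/sort/aggregate logic outside Excel first, the four steps above can be sketched in Python. The rows below simply mirror the example tables; this is an illustration, not part of the original workbook:

```python
from collections import defaultdict

# Rows from the two source tables: (service, text, country, rounded, exact).
table1 = [
    ("232", "service group 1", "DE", 100, 100.01),
    ("232", "service group 1", "GB", 200, 200.01),
    ("236", "service group 1", "DE", 150, 150.02),
]
table2 = [
    ("232", "service group 1", "DE", 300, 300.01),
    ("232", "service group 1", "GB", 100, 100.01),
    ("236", "service group 1", "DE", 50, 50.02),
]

# Steps 1-2: stack both tables; step 3: sort by service code and country.
combined = sorted(table1 + table2, key=lambda r: (r[0], r[2]))

# Step 4: aggregate both amount columns per (service, country) pair.
totals = defaultdict(lambda: [None, 0, 0.0])
for service, text, country, rounded, exact in combined:
    entry = totals[(service, country)]
    entry[0] = text
    entry[1] += rounded
    entry[2] += exact

for (service, country), (text, rounded, exact) in sorted(totals.items()):
    print(service, text, country, rounded, round(exact, 2))
```

Running this prints the same three aggregated rows as the combined table above.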

Copying the data from the first table can be done with code like this:

    ' Find the first empty row in column A of the source sheet
    Sheets("0VENDOR SHEET").Select
    r = 2
    t = Cells(r, 1).Value
    While t <> ""
        r = r + 1
        t = Cells(r, 1).Value
    Wend
    RwMax = r
    ' Copy the data block (columns A to G) and paste it into RESULT
    Range(Cells(3, 1), Cells(RwMax - 2, 7)).Select
    Selection.Copy
    Sheets("RESULT").Select
    Cells(2, 1).Select
    ActiveSheet.Paste
 
Copying the data from the second table is slightly more difficult, because its first row has to be pasted just below the last row of the first table.

    ' Find the first empty row in column A of the second source sheet
    Sheets("0PCOMPANY SHEET").Select
    r = 2
    t = Cells(r, 1).Value
    While t <> ""
        r = r + 1
        t = Cells(r, 1).Value
    Wend
    RwMax2 = r
    Range(Cells(3, 1), Cells(RwMax2 - 2, 7)).Select
    Selection.Copy

    ' Paste directly below the rows already copied from the first table
    Sheets("RESULT").Select
    last_row = Selection.SpecialCells(xlCellTypeLastCell).Row
    Cells(last_row + 1, 1).Select
    ActiveSheet.Paste
    Application.CutCopyMode = False

 

To sort the data we use code like this (sorting the RESULT sheet by service in column A and country in column C):

    ActiveWorkbook.Worksheets("RESULT").Sort.SortFields.Clear
    ActiveWorkbook.Worksheets("RESULT").Sort.SortFields.Add Key:=Range( _
        "A2:A300"), SortOn:=xlSortOnValues, Order:=xlAscending, DataOption:= _
        xlSortTextAsNumbers
    ActiveWorkbook.Worksheets("RESULT").Sort.SortFields.Add Key:=Range( _
        "C2:C300"), SortOn:=xlSortOnValues, Order:=xlAscending, DataOption:= _
        xlSortNormal
    With ActiveWorkbook.Worksheets("RESULT").Sort
        ' The range covers the copied columns A to G
        .SetRange Range("A1:G300")
        .Header = xlYes
        .MatchCase = False
        .Orientation = xlTopToBottom
        .SortMethod = xlPinYin
        .Apply
    End With

 

Aggregation of the records is not very complicated:

    serviceCode = ""
    serviceText = ""
    vCountry = ""
    roundedValue = 0
    myValue = 0

    ' Aggregated rows are written a few rows below the combined table
    sumRow = last_row + 7
    ' Read the first record
    cRow = 2
    serviceCode = Cells(cRow, 1).Value
    serviceText = Cells(cRow, 2).Value
    vCountry = Cells(cRow, 3).Value
    roundedValue = Cells(cRow, 4).Value
    myValue = Cells(cRow, 5).Value

    For cRow = 3 To last_row + 1
        If Cells(cRow, 1).Value <> serviceCode Or Cells(cRow, 3).Value <> vCountry Then
            ' New service/country combination: write out the totals collected so far
            sumRow = sumRow + 1
            Cells(sumRow, 1).Value = serviceCode
            Cells(sumRow, 2).Value = serviceText
            Cells(sumRow, 3).Value = vCountry
            Cells(sumRow, 4).Value = roundedValue
            Cells(sumRow, 5).Value = myValue
            ' Start collecting the next combination
            serviceCode = Cells(cRow, 1).Value
            serviceText = Cells(cRow, 2).Value
            vCountry = Cells(cRow, 3).Value
            roundedValue = Cells(cRow, 4).Value
            myValue = Cells(cRow, 5).Value
        Else
            ' Same combination: accumulate both amount columns
            roundedValue = roundedValue + Cells(cRow, 4).Value
            myValue = myValue + Cells(cRow, 5).Value
        End If
    Next
    ' Write out the totals for the last combination
    sumRow = sumRow + 1
    Cells(sumRow, 1).Value = serviceCode
    Cells(sumRow, 2).Value = serviceText
    Cells(sumRow, 3).Value = vCountry
    Cells(sumRow, 4).Value = roundedValue
    Cells(sumRow, 5).Value = myValue

 

Now we have aggregated data that can be used in a nice report.

The program can of course be attached to a button that we create in one of the sheets.

This is only an example of the method; the program can be made much more elaborate. With VB we can do almost anything we need.

I hope this encourages you to use VB code in BEx Analyzer.

Regards, Leszek


Installing SAP NetWeaver AS ABAP 7.03 SP04 Trial in Amazon Public Cloud together with SAP PO 7.31 SP 11


This blog is about installing the SAP NetWeaver Application Server ABAP 7.03 64-bit Trial alongside the SAP Process Orchestration 7.31 SP 11 Trial in the Amazon Web Services (AWS) public cloud (described in the ‘Try SAP NetWeaver Process Orchestration in Public Cloud!’ blog - http://scn.sap.com/docs/DOC-33765). The target is to have two working systems in one cloud – SAP NetWeaver ABAP and SAP PO.


1. Make sure that your AWS cloud instance with the SAP Process Orchestration 7.31 SP 11 Trial is installed and running (as described in ‘Try SAP NetWeaver Process Orchestration in Public Cloud!’ - http://scn.sap.com/docs/DOC-33765).


2. Using Remote Desktop, log in to your virtual operating system.


3. Go to https://store.sap.com/ and search for the SAP NetWeaver 7.03 64-bit trial.


4. Click on ‘Trial Version’

          1.png

5. Fill in the form and click ‘Submit’.


6. Go to the email account that you used in the form in point 5. There should be a link to the SAP NetWeaver Trial. Choose ABAP_7_03_SP04_64_bit_Trial_3_9_2_GB and download it. This package also contains SAP GUI.


7. To unpack the SAP NetWeaver Trial, download and install a program that unpacks archives, for example 7-Zip (http://www.7-zip.org/). Then unpack the SAP NetWeaver Trial.


8. If your hostname is longer than 13 characters, you need to change it to a shorter one. To do that, go to system properties (right-click on Computer -> Properties), click ‘Change settings’ and then the ‘Change…’ button. Change the computer name and save it. The computer needs to restart, which may take a few minutes.


9. After the hostname change, your PI license might no longer be valid. To check, go to http://localhost:50000/nwa -> Configuration -> Infrastructure -> Licenses. If it is not valid, you need to install a new license on your PI system. To do that, go to https://websmp130.sap-ag.de/sap(bD1lbiZjPTAwMQ==)/bc/bsp/spn/minisap/minisap.htm, fill in the form and click ‘Submit’, then go to the email account you used in the form and download the new license key. To install it, go again to http://localhost:50000/nwa -> Configuration -> Infrastructure -> Licenses, click ‘Install from File’, select the file and add it. You should now see a license with a valid status. You may delete the old, invalid license by selecting it and clicking ‘Delete License’.


10. Install a loopback adapter. Press ‘Start’ -> cmd and run hdwwiz.exe. Click ‘Next’. Select ‘Install the hardware that I manually select from a list (Advanced)’ and click ‘Next’. Choose ‘Network adapters’ and click ‘Next’. Then set as below:

          2.png

Click twice ‘Next’ and then ‘Finish’.


Go to network properties and click ‘Change adapter settings’. Right-click the loopback adapter that you just added and click ‘Properties’.

Go to the IPv4 properties:

          3.png

Set the IP address and subnet mask and save them. You will use this IP address in point 16.


Go to C:\Windows\System32\drivers\etc and edit the file ‘hosts’.

Add the following line (substituting the values you defined in points 8 and 10 for the placeholders) at the end of the file:


<ip_address_that_you_added_to_loobpack_adapter> <hostname>


For example: 10.10.10.10 Host1

 

and save it.


11. Now you can start the installation. Go to the folder:

NWABAPTRIAL70304_64\SAP_NetWeaver_703_Installation_Master\DATA_UNITS\BS2011_IM_WINDOWS_X86_64

and run sapinst.exe.


12. Choose Central System as below and click ‘Next’.

          4.png


13. Accept the license, enter the master password (only the password ‘Abcd1234’ worked for me; other passwords I tried caused errors in later steps of the installation) and set your installation drive to C.

          5.png

If the prerequisites checker reports the ‘Swap Size’ condition as not met, change your swap space to the recommended size and start the installation again from point 11. To change the swap size, go to system properties and click ‘Advanced system settings’.

  7.png

Press ‘Settings...’

   8.png

Go to the ‘Advanced’ tab and press ‘Change…’. Change the size to the recommended one and save it.


14. In the Parameter Summary there might be one thing that needs to be changed. If any of your installation locations is set to the Temporary Storage drive (in my case it was drive Z), you have to change it to a non-temporary drive such as C.

          9.png

In the screenshot above, the data volume is set to Z. If this is not changed, then after the installation ends and the system is restarted, it might not start because the data volume is missing (it was installed on temporary storage). That is why you need to select ‘MaxDB Data Volumes’, click the ‘Revise’ button and change all locations to another drive, for example C:\sapdb\NSP\sapdata. Remember the SAP System ID and Central Instance Number because they will be needed in point 16.


 

15. If you don’t want to change any other parameters, continue. The installation should begin and should finish after a few hours. After that, your ABAP server should appear in green in the SAP Management Console, which means the installation was successful. If it is grey, right-click it and press ‘Start’. Log in if needed, and after a moment the server should be working (green). Both the PI system and the ABAP server should be green to be sure they are both working well. To verify, you may restart the computer and start them again.

 

16. To connect to the server you need to install SAP GUI. It is in the package that you unpacked in point 7. Install it, run the ‘SAP Logon’ icon from the desktop and create a new entry (right-click -> Add New Entry). In the Application Server field use the IP address from point 10, and in the Instance Number and System ID fields use the values from point 14. Then click ‘Finish’. Now you should be able to connect to the ABAP server and log in there.

 

 


    

Resolving CMS database inconsistency in SAP Business Objects using query builder


The CMS database in Business Objects can quite often become inconsistent, resulting in issues while logging in. While the repository diagnostic tool can point out the inconsistencies, some of them need to be resolved with the query builder tool. One such issue and its resolution are described below: exceptions related to SI_APPLIED_PROPERTIES at login time.

 

We encountered the error below while trying to log in to BO BI 4.0. The issue occurred during login to the CMC as well as the launchpad. Please find below a screenshot of the error:

Untitled.png

The CMS database had somehow become inconsistent. Fire the following query in the Business Objects Query Builder:

 

select * from CI_INFOOBJECTS,CI_APPOBJECTS,CI_SYSTEMOBJECTS where
SI_APPLIED_PROPERTIES is NULL and SI_KIND in
('ClientAction','ClientActionSet')


This returned the following three infoobjects:

Untitled.png

Untitled.png


Untitled.png

Make a note of the SI_CUID value for each of the three returned infoobjects. Then use these SI_CUID values to delete the entries from the database:

 

delete from cms_infoobjects7 where si_cuid in
('AfOoXl4g25xBqIjaVIDrsKU'
,'AZwPQf9Z4gRMr13bWVMspNY','AYybOm8Lqk5OoJ1kOHNLOwA');


The stale entries in the CMS database pointing to the DFO files are now removed, and you can successfully log in to the CMC and launchpad.


Please Note :

1. The URL for the query builder is http://[ServerIP]:[Port]/AdminTools/querybuilder/ie.jsp. You can use the host name instead of the server IP if you are using the local terminal server. You can also check which files are present in the Tomcat directory structure on the host where the Tomcat web application server is installed for BO.

2. This issue is not release specific as CMS database can become inconsistent irrespective of product releases.

3. SAP was supposed to release a KBA for this issue, though I didn't find one. A somewhat similar issue has been captured in SAP KBA 1563662.

4. While firing the delete query you might need the owner of the table cms_infoobjects7. Use the query: select owner from dba_tables where table_name = 'CMS_INFOOBJECTS7';

Once the owner is retrieved, you can put the CMS schema name in front of the table name to fire the delete query in Oracle.

Finally, COMMIT is a strong word – don't forget to issue one so that Oracle makes the deletion permanent.

 

Cheers,

Raj

SAP Migration using Oracle Golden Gate - High Level Steps


Hi Techies,

I have performed a migration using Oracle GoldenGate. Please find the high-level steps below.

I will be sharing the detailed documentation separately.

The advantage of Oracle GoldenGate is that there is no downtime for the source system; the only downtime is during the switchover.

 

 

  • Initialize the Oracle Advanced Customer Service (ACS) for SAP O2O (Oracle to Oracle) package, which includes all functionality to create a complete set of migration scripts, including a new source user creation, target database and related tablespace creation, and the OGG source and target configurations.
  • Install the Oracle GoldenGate software on the source and target.
  • Run the O2O package on the source system.
  • Start the GoldenGate change data extract and pump processes on the source system, which will begin capturing and queuing change data and DDL (as needed).

Note: All tables in the SAPR3 schema will be included and will contain an SCN/CSN token so that later each table can be filtered by SCN/CSN. A code snippet to add such a token is:

Table SAPR3.*, Tokens (tk-csn = @GETENV ("TRANSACTION", "CSN"));

  • Start the initial database load to copy the database objects from the source to the target database.
  • Validate the initial database load.
  • Start the GoldenGate replicat apply processes on the target system, which will apply the captured change data and DDL (as needed). Note: While the replicat process(es) will be started using the AFTERCSN option, tables within a single replicat will be configured to skip transactions based on the table grouping of the initial loads. This is automatically configured by the O2O scripts. The transaction filter per table will be done with something similar to the following code snippet:

Map SAPR3.MYTABLE, Target SAPR3.MYTABLE, Filter ( @NUMSTR (@TOKEN ("TK-CSN")) > 1234567);
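Conceptually, this filter replays only the changes committed after the CSN recorded when that table's initial load was taken. A minimal Python sketch of the idea – the table name and CSN values are made up for illustration and have nothing to do with the real OGG runtime:

```python
# Hypothetical CSN recorded at the start of each table's initial load.
initial_load_csn = {"SAPR3.MYTABLE": 1234567}

# Queued change records carrying the tk-csn token added by the extract.
changes = [
    {"table": "SAPR3.MYTABLE", "csn": 1234500, "op": "UPDATE"},  # already in initial load
    {"table": "SAPR3.MYTABLE", "csn": 1234600, "op": "INSERT"},  # committed after the load
]

# Equivalent of: Filter ( @NUMSTR (@TOKEN ("TK-CSN")) > 1234567 )
to_apply = [c for c in changes if c["csn"] > initial_load_csn[c["table"]]]
print([c["csn"] for c in to_apply])  # → [1234600]
```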

  • Switch over. Once both systems are in sync, the source SAP system may be stopped. After the last change is applied to the target, the SAP system can be started on the target system. The source SAP shutdown and target SAP startup will typically take approximately 10 minutes, regardless of system size.

 

--

Sandeep Singh

SUM and SUM with DMO


Motivation:

I started the upgrade of a 7.3 portal system, and SUM expects the <sid>adm password; due to audit restrictions we were not able to use <sid>adm in SUM for the upgrade. I checked a couple of documents about SUM and didn't see any workaround.

I then found a document about DMO for HANA and noticed an option to skip the <sid>adm user/password (by changing the sumstart.conf file). I tried that option and we moved forward, but unfortunately several issues started with the schema user, along with other problems (even though I had given the user full permissions and groups), which ultimately resulted in restarting the upgrade with JSPM with the help of note 1563660.

I checked with SAP, and they said this option is not meant for a normal upgrade; it is only meant for DMO and has not been tested for other scenarios.

Goal:

I spent several hours investigating the workaround, and picking the wrong tool wasted several more working hours solving the actual issue. I checked several blogs on SDN, and here I would like to summarize a few points that give an idea of when to use SUM and when to use SUM with DMO. This is not a blog with entirely new content, but I would like to differentiate the two tools.



Software Update Manager:

    • Release upgrade (major release change)
    • System update (EHP installation)
    • Applying Support Packages (SPs) / Support Package Stacks
    • Applying Java patches
    • Correction of installed software information

SUM is used for all SAP NetWeaver based systems – systems based on AS ABAP, on AS Java, or on a dual stack.

SUM is started with:

./STARTUP

and its UI is accessed at:

http://<host name>:4239


SUM with DMO

The database migration option (DMO) is an option in SUM – DMO is not a separate tool, and this build uses only the ABAP part of SUM, called SAPup.

   1. Although DMO is based on the "standard" SUM, a new user interface (UI) is used – but only for the DMO procedure. The UI is based on SAPUI5, so it runs in a browser and offers some comfortable features, like checking the log files without having to log on to the OS of the application server.

   2. Different start procedure: SUM is started in a different way – a browser request to the SAPHostAgent is used to start SUM, instead of starting SUM from the prompt/console. This allows you to start the SAPup process without keeping an additional server window open.

   3. Different UI: SUM shows a new user interface based on SAPUI5, which is currently available exclusively for DMO.

   4. The new UI is currently only available for the DMO procedure, but in the long run it may become available for other use cases as well.


Where it can be used

Scenario:

  • You want to migrate your existing SAP ABAP system to the SAP HANA database.
  • Your SAP release needs to be upgraded prior to migration.
  • DMO is for SAP NetWeaver BW and for SAP Business Suite systems.
  • DMO is available with Software Update Manager 1.0 SP09 and can be used for systems based on AS ABAP. It can be used for SAP NetWeaver BW systems from 7.0 SP17 (and higher) to migrate to 7.31 (and higher). And it can be used for systems that are part of SAP Business Suite 7.0 (and higher) to migrate to a level corresponding to SAP BASIS 7.40 (for example "SAP enhancement package 7 for SAP ERP 6.0").


How to start SUM with DMO option


Run the command /usr/sap/<SID>/SUM/abap/SUMSTART confighostagent to create the file sumstart.conf.

Start the DMO frontend with one of the following URLs (HTTP or HTTPS):


http://<server>.<domain>.<ext>:1128/lmsl/upgrade/<SID>/doc/gui

https://<server>.<domain>.<ext>:1129/lmsl/upgrade/<SID>/doc/gui


Using the proper tool, as suggested by SAP, will save you time. It is always recommended to use SAP's preferred tools and to follow the release notes and guides before planning any upgrade.


References:


http://scn.sap.com/community/it-management/alm/software-logistics/blog/2014/03/10/dmo-technical-procedure

http://scn.sap.com/docs/DOC-49580

https://scn.sap.com/docs/DOC-46824

1600846 - JSPM/SUM Calls sapcontrol without user credentials

1563660 - sapcontrol, <sid>adm authorization issues (SUM)

 

SAP First Guidance - migrate BW on HANA with the database migration option (DMO)

VB coding in BEx Analyzer – writing first custom functions


A large variety of custom VB functions can be created and used in BEx Analyzer. This method can be useful especially for those who don't know ABAP.

 

As an example, creating a VB function that converts a 0FISCPER value into a full date is described here.

 

First, we have to be sure that macros can be saved in the workbook we use. The standard .xlsx workbook format can't store macros. In such cases I save the workbook in the Excel 97 - Excel 2003 file format.

 

After doing this, you have to save the workbook on the server again.

saveworkbook.png

 

Now we can add macros to the workbook. We can do this in the VB editor (Alt+F11).

If we want to store our macros locally (as in this example), we should add a new module

 

vb creating module.png

and write down the code for the macro

vb writting a code.png

If we have the code stored in a .bas file, it can be imported here (useful when we want to use the same code in many workbooks).

vb importing module.png

After writing or importing the code, the workbook should be saved on the server with the “Save workbook” button.

saveworkbook.png

 

Now the function we stored in the workbook can be used like this:

vb function usage.png

Please find attached the file containing the code that was used here (the extension of the file has been changed to .txt so that the file could be stored on the SCN server).
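The attached .bas file is not reproduced here, but the shape of such a conversion can be sketched outside VB as well. Below is a minimal Python version, assuming a fiscal year variant in which posting periods coincide with calendar months (other variants need their own mapping; the function name and format handling are my own illustration):

```python
from datetime import date

def fiscper_to_date(fiscper: str) -> date:
    """Convert a 0FISCPER value 'YYYYPPP' (e.g. '2014001') to the
    first day of the posting period.

    Assumes a fiscal year variant where periods match calendar months.
    """
    year = int(fiscper[:4])
    period = int(fiscper[4:])
    if not 1 <= period <= 12:
        raise ValueError(f"period {period} has no calendar-month equivalent")
    return date(year, period, 1)

print(fiscper_to_date("2014001"))  # → 2014-01-01
```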

 

Regards, Leszek

Report to alert you about the latest canceled jobs


Hello,

 

I needed a fast way to check all the canceled jobs in the environment that we administer. So I developed this report, which checks which jobs were canceled in the last 24 hours and, if there are any, sends me an email. It fulfills my need.

 

You can schedule it to run once a day, or customize as you like.

 

I would like to share this report in case someone else needs something similar.

 

 

*&---------------------------------------------------------------------*
*& Report  ZMAILJOBSCANCEL
*&---------------------------------------------------------------------*

REPORT zmailjobscancel.

* Job data
DATA: BEGIN OF wa,
        job_name TYPE tbtco-jobname,
        run_date TYPE tbtco-strtdate,
        run_time TYPE tbtco-strttime,
        job_stat TYPE tbtco-status,
      END OF wa.

DATA: itab LIKE TABLE OF wa.

TYPES: BEGIN OF gty_s_tbtco,
         job_name TYPE tbtco-jobname,
         run_date TYPE tbtco-strtdate,
         run_time TYPE tbtco-strttime,
         job_stat TYPE tbtco-status,
       END OF gty_s_tbtco.

DATA: gs_tbtco TYPE gty_s_tbtco.

DATA: new_date TYPE sy-datum,
      l_str    TYPE string.

* Jobs started since yesterday
new_date = sy-datum - 1.

DATA: gv_sender        TYPE string,          " Sender email address
      gv_recipient     TYPE string,          " Recipient email address
      gv_title         TYPE string,          " Email subject
      gv_soli_tab      TYPE soli_tab,        " Email body
      gv_soli_tab_line TYPE LINE OF soli_tab.

gv_title     = 'TITLE'.
gv_sender    = 'sender@mail.com.br'.
gv_recipient = 'recipient@mail.com.br'.

* Select canceled jobs ('A' = aborted) from the last 24 hours
SELECT jobname strtdate strttime status FROM tbtco INTO TABLE itab
  WHERE strtdate >= new_date AND status IN ('A').

IF sy-subrc <> 0.
  gv_soli_tab_line = 'None'.
  APPEND gv_soli_tab_line TO gv_soli_tab.
ENDIF.

* One body line per canceled job: job name and start date (YYYY.MM.DD)
LOOP AT itab INTO gs_tbtco.
  CONCATENATE gs_tbtco-run_date(4) gs_tbtco-run_date+4(2)
              gs_tbtco-run_date+6(2) INTO l_str SEPARATED BY '.'.
  CONCATENATE gs_tbtco-job_name ' - ' l_str INTO gv_soli_tab_line.
  APPEND gv_soli_tab_line TO gv_soli_tab.
ENDLOOP.

* Send the mail
CALL FUNCTION 'EFG_GEN_SEND_EMAIL'
  EXPORTING
    i_title                = gv_title
    i_sender               = gv_sender
    i_recipient            = gv_recipient
    i_flg_send_immediately = 'X'
  TABLES
    i_tab_lines            = gv_soli_tab
  EXCEPTIONS
    not_qualified          = 1
    failed                 = 2
    OTHERS                 = 3.



Regards,

Richard W. L. Brehmer

Performing SAP Kernel Upgrade on IBM i using APYSIDKRN command


The following describes the procedure for performing a kernel upgrade on IBM i (i-Series)/DB2, since the AS/400-specific procedure is different from that of other OS platforms. From version 7.20 and above, the kernel upgrade can be performed successfully using the APYSIDKRN command, which takes care of all the manual actions – checks, backups, permissions, etc. In the past, various commands were used to perform this single activity; now SAP has simplified the procedure with the introduction of the single command APYSIDKRN. I have documented this for beginners who are new to i-Series; any input or feedback is highly appreciated. Various projects might follow different approaches, but we have been practicing the following method without any issues.

 

Reference SAP Notes: Note 1632755 for more information on the APYSIDKRN command, and Note 1432807 for restoring the old kernel again.


Technical Steps


Step 1: Downloading the Kernel Copying Media to your IBM i

Step 2: Stop SAP System

Step 3: Upgrade kernel using APYSIDKRN

Step 4: Start SAP System and validate


1) Download the OS/400 kernel files from the SAP Service Marketplace and copy them to your IBM i.

 

1.JPG

Copying the files manually to IBM i


The required installation media has to be copied from your Windows PC to your IBM i. To copy the media, you have to use the binary share ROOTBIN. This guarantees that the content of the media is copied correctly from the Windows PC to your IBM i: no copied content is corrupted, and no copied files with longer file names are shortened by a converting share. The binary share ROOTBIN is created during an initial installation; if it is missing, please refer to the i-Series installation guide on creating a ROOTBIN directory.

 

2.jpg

Create a folder, for example kernel_741U_43, under ROOTBIN and place the SAR files in that directory (you don't have to extract them – the tool takes care of unpacking the archives).

3.jpg

4.jpg

2) Stopping the SAP System

Log on to the VM for i-Series and connect to the server using the emulator (all i-Series tools are installed under IBM i Access for Windows).

6.jpg

7.jpg

Soon after we log in, the system prompts for the specific operation we want to perform.

2.jpg

Stop the SAP system

3.jpg

Step 3: Upgrade kernel using APYSIDKRN


Check the existing kernel version of the system

 

Run disp+work -v from kernel directory (/sapmnt/SID/exe/uc/as400_pase_64)

 

(To run familiar UNIX-style commands as on other operating systems, we can use QSHELL: simply enter the command CALL QP2TERM at the initial screen, which takes you to a terminal where general commands such as cd, ls and cp can be run.)

4.jpg

 

7.jpg

Run the APYSIDKRN program from the command line.

 

8.jpg

9.jpg

Just give the path of the kernel archives from step 1 in the field ‘Archives to be applied’: /kernel_741U_43/*

The program will automatically look for the files under ROOTBIN in the directory we specified.

11.jpg

APYSIDKRN automatically prepares the environment, backs up the existing kernel, and checks the archive SAR files and kernel compatibility. It also offers a Mode of Operation with the options ADD, TEST and FULLY; the FULLY mode can also upgrade kernel versions directly (e.g. from 720 to 721).

ADD overwrites the existing files with the new ones.
TEST simulates the operation: no changes are applied, but we can identify and analyze any problems to be expected when applying the patches.
FULLY empties the kernel directory completely before it is filled with the new kernel files.

12.jpg

The old kernel is automatically backed up to /sapmnt/SID/patches/saved/xxx.SAR.

14.jpg

Once the kernel upgrade is finished, the program exits automatically.
Check that the patch level was upgraded, in the same way as above (disp+work -v).

 

Step 4: Start SAP System and validate

 

16.jpg

Enter

19.jpg

The complete procedure, including bringing the PI 7.4 system back up, took 15 minutes.


Equation of Throughput for GC on a server node - Abstract

DISCLAIMER: Please do not take 'Throughput' in its usual sense; for me, and here, it is simply a measure of performance.
Everything below is an idea that I wrote down on paper and thought of sharing.
I will try to make this idea more viable and realistic.
Readers and moderators – please forgive anything you don't like about this effort.

 

Abstract:

This weekend, I was thinking of some way to calculate the 'Throughput' in terms of garbage collection for a server node.

So I started relating Java to basic mathematics, and came to derive an equation for 'Throughput'.

 

Here we have two variables – Non GC Time and GC Time – with the help of which I have written the first equation:

Non GC Time + GC Time = Total Time (start to current uptime)

or, equivalently,

Non GC Time = Total Time - GC Time

Now, if I want the throughput, I compute:

Throughput = (Non GC Time / Total Time) * 100 -- this should be as high as possible

And hence my motive is to increase the throughput:

Throughput = ((Total Time - GC Time) / Total Time) * 100 -- this should be as high as possible


To calculate the GC Time, I introduced a new expression for it:

GC Time = N * avgGC Time, where avgGC Time is calculated from a few random GC runs and N is the number of GC runs, which you can get from the std_server logs.


So now I have a new modified equation:

Throughput(N) = ((Total Time - N * avgGC Time) / Total Time) * 100

Now, if we include the number of cores per CPU:

Throughput(N, P) = [((Total Time - N * avgGC Time) / Total Time) * 100] / P

where P stands for the number of processor cores on a multiprocessor. Since avgGC Time is essentially an average pause time, this can also be written as:

Throughput(N, P) = [((Total Time - N * {Pause Time}) / Total Time) * 100] / P


I then thought of delay, so I included that as well:

Throughput(N, P) = [((Total Time - N * {avgGC Time + X(delay)}) / Total Time) * 100] / P

This delay term can also be used to include CMS mark times.

Note that X is directly proportional to P – one example is the delay between polls in a multiprocessing architecture.


So, finally, I derived an equation to measure GC performance, which I call the Throughput for GC for a server node:

  • Throughput(N, P) = [((Total Time - N * {avgGC Time + X(delay)}) / Total Time) * 100] / P
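As a sanity check, the final equation can be evaluated with sample numbers. Everything below is made up purely for illustration:

```python
def gc_throughput(total_time, n_runs, avg_gc_time, delay=0.0, cores=1):
    """Throughput(N, P) = [((Total - N * (avgGC + delay)) / Total) * 100] / P."""
    gc_time = n_runs * (avg_gc_time + delay)
    return (total_time - gc_time) / total_time * 100.0 / cores

# One hour of uptime, 120 GC runs averaging 0.5 s, no extra delay, one core:
# 60 s of the 3600 s went to GC, so the measure comes out near 98.33.
print(gc_throughput(total_time=3600, n_runs=120, avg_gc_time=0.5))
```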


Any suggestions for improving this novice effort are always welcome.

I am trying to automate this equation (and future improved versions) on a test system, and if it works I will share it with you all – my own Wily.


Regards,

Divyanshu

Monitoring MSSQL from AIX - My Experience


Hi All,

 

A few days back, I was in a strange situation where availability checks of MSSQL databases had to be set up from a Solution Manager running on AIX. The request was to monitor availability even when the SAP applications are down.

 

I tried some basic setups, but all failed at the SAP level; querying the MSSQL database directly was not working in my case. I went through some KBAs and notes, like 1601608 and 1458291.

 

So the issue was:

Whenever we access a remote database from an SAP system – typically a central system like SOLMAN – we have to provide the Database Shared Library (DBSL) and the Microsoft SQL Server SNAC client library on that SAP system. The DBSL is what connects SAP processes to their database counterparts to query and manipulate data residing in databases. MS SQL Server is a Microsoft product, and its supported DBSLs are also provided by Microsoft – but only for Linux x86_64 and Microsoft's own Windows operating systems. SNAC, Microsoft's SQL Server Native Client, is a stand-alone data access application programming interface (API) used for both OLE DB and ODBC, and is a mandatory requirement to take advantage of the Solution Manager DBA Cockpit and its counterparts.

 

I raised this with SAP, but nothing worked.

 

Because there was no supported DBSL, every attempt to connect would fail:

the connection failed *** ERROR => DlLoadLib()==DLENOACCESS - dlopen("/usr/sap/<SID>/DVEBMGSXX/exe/dbmssslib.o") FAILED

 

Finally, one workaround to monitor MS SQL availability was to probe port 1433, which MSSQL opens whenever it goes live.

So we prepared a script, tested it on a test system, and it worked.

We also ensured that ports 1433 and 1434 were open bi-directionally between the MS SQL hosts and our Solution Manager system.

 

But now, the protocol/program used to do this test was telnet.

Whenever we started a telnet test from the script, a connection was established and could be seen on the MSSQL host.

Also, on our AIX, a new process was created for every test, and it remained even after the script terminated.

The connection stayed active on the MSSQL host unless the telnet process was removed at OS level.

 

So we took things further and prepared two scripts: a 1st script for doing the telnet, and a 2nd script for killing the processes.

Now, the question was how the script would know when to kill a process and, more importantly, which processes were to be killed.

 

What we did: the output of every telnet connection was redirected to a log file.

The 2nd script reads the current system processes, filters all telnet processes for the MSSQL hosts and port 1433, and prefixes each OS process ID with 'kill -9'; the result is written to a third, dynamically generated temporary script.

We then added one more test system to observe the behavior. Now, when the 1st script was executed, we made it write its output to a DB-SID-specific log file, but the telnet for the new system was never executed, because the connection for the 1st system was still established and did not exit.

So we ran the 2nd script in a FOR loop over the DB-SIDs: it filtered the processes into the 3rd script as described above, executed it, and then read the DB-SID-specific telnet log file. If the word "connected" was found in the log file for a system, the DB was available, and an email was sent from OS level to the required users and distribution list.
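The mechanism described above can be sketched roughly like this (host name, port, and paths are illustrative; a real version would loop over the DB-SID flat file and add the mail step). Where bash is available, its /dev/tcp pseudo-device gives the same port probe as telnet without leaving a process behind for the 2nd script to kill:

```shell
#!/bin/bash
# check_db HOST PORT: probe the MSSQL port, log the result, report availability.
check_db() {
    host=$1; port=$2
    log="/tmp/dbcheck_${host}_${port}.log"        # DB-SID specific log file
    if (exec 3<>"/dev/tcp/${host}/${port}") 2>/dev/null; then
        echo "connected to ${host} ${port}" > "$log"
    else
        echo "no connection to ${host} ${port}" > "$log"
    fi
    # 2nd-script logic: the word "connected" in the log means the DB is up
    if grep -q "connected" "$log"; then
        echo "DB on ${host}:${port} is AVAILABLE"      # mail step would go here
    else
        echo "DB on ${host}:${port} is NOT AVAILABLE"
    fi
}

check_db mssqlhost 1433    # illustrative host name, one per DB-SID in real life
```

The telnet-plus-kill approach from the article works the same way; /dev/tcp simply removes the need for the cleanup script.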

 

All of this was tested and worked well for us; no pending telnet connections were left on the remote databases.

 

Taking this further, we rolled it out to around 20 MSSQL systems and modified the script to read the DB information from a flat file and trigger an email when a DB was not available. We scheduled the scripts in cron every 10 minutes, with the 2nd script using a SLEEP call in its loop to give the 1st script a few seconds to test the connection.

 

I cannot share the script itself, but the algorithm is essentially described here.

I hope this helps others who face a similar situation.

 

Regards,

Divyanshu

Awstats for SAP systems


I have always missed the feature of, for example, Apache, where the web server generates an access log that I can then feed into an awstats server.

 

But one day I decided to dig into the logging feature and realized that it might be possible to have the ICM generate an access log.

As I see it, importing the log is not the problem; what is cumbersome is generating the log

so that all relevant fields are included in each log entry.

 

Prerequisites:

Authorization to change system parameters

Web server that contains awstats (http://www.awstats.org/)

 

Step 1:

Analyse logging parameters.

http://help.sap.com/saphelp_nwpi711/helpdata/en/48/442541e0804bb8e10000000a42189b/content.htm

 

Quite straight forward.

Compare the SAP standard with the Apache standard log settings.

 

SAP

No real default, but CLF and CLFMOD are proposed:

CLF - %h %l %u %t "%r" %s %b

CLFMOD - %h %l %u %t "%r2" %s %b

 

Apache

 

LogFormat "%h %l %u %t \"%r\" %>s %b \"%{Referer}i\" \"%{User-Agent}i\"" combined

 

And AWSTATS.

LogFormat = "%host %other %logname %time1 %methodurl %code %bytesd %refererquot %uaquot"

 

Step 2:

Decide on parameters to include.

I am very interested in the referrer and user agent, so those must be included.

 

That means enhancing the SAP string with more parameters.

Comparing the names with the wanted features gave me the following log format:

LOGFORMAT=%h %l %u %t "%r" %s %b "%{referer}i" "%{user-agent}i"

 

Step 3:

Activate Logging

 

Logging is activated by adding the logging parameter to the instance profile via RZ10.

This is my setting:

sapawstats1.png

 

I also want the log file to change its name every day.
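Put together, an instance profile entry along these lines should do it (the parameter index and file name pattern are assumptions for illustration; SWITCHTF=day gives the daily switch of the log file name):

```
icm/HTTP/logging_0 = PREFIX=/, LOGFILE=access-%y%m%d.log, LOGFORMAT=%h %l %u %t "%r" %s %b "%{referer}i" "%{user-agent}i", SWITCHTF=day
```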

Now the server needs to be restarted (possibly just restarting the ICM is enough).

 

And the log file is generated.

sapawstats2.png

 

Step 4:

Generate awstats

With the logs present we can now generate our awstats page.

And since the log replicates one I already had for another server, I could easily clone the config and automate the generation.
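For the automated generation, a crontab entry of this shape is enough (the awstats path and config name are assumptions; the corresponding config file points LogFile at the ICM access log):

```
# Update the awstats database for the SAP config every night at 02:00
0 2 * * * /usr/local/awstats/wwwroot/cgi-bin/awstats.pl -config=sap -update
```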

 

sapawstats3.png

I have attached an example HTML page, which unfortunately has broken image and other links.

But be encouraged: it is possible to generate awstats from an SAP system. Do return with your comments and improvements.

Both the ABAP system and the Java system have this logging feature; so far I have only tested the ABAP server.

 

Good luck

/Fredrik

rfc communication error with system/destination - Solution


Hello;

 

You may get the error "rfc communication error with system/destination" while importing one of your transport requests from one SAP system to another, as the image below shows:

 

Capture.PNG

 

Solution:

To fix this error, log on to client 000 --> run transaction code STMS --> Overview --> Systems

11.png

 

Extras --> reset user TMSADM

11.png

 

Extras --> Generate RFC destinations

11.png

 

This will fix the error.

 

Hope this was helpful.

 

Best Regards

~Amal Aloun

How to transfer data from BEx Workbook to other system using XML data format


How to transfer data from BEx Workbook to other system using XML data format.

 

Before the system described here was created, a report (BEx Workbook) was used to print a paper
version of the tax reports. The goal was to enable an electronic flow of the data.

Bez tytułu - schemat ogolny.png

Note! In the original version the Polish tax system e-deklaracje was used as the target system; however, any other system that accepts an XML file can be used.

 

 

Prerequisites:

 

  1. A target system (here: e-deklaracje) that will be supplied with the data in XML format,
  2. A BW query that contains all the data that we need (here: Q001),
  3. An XML file (here: Data.xml) that will be used to transfer data between the source system and the target system.

 

In the case of the e-deklaracje system, the needed XML file can be created using its "export data" functionality.

 

It is also possible to create such a file by converting an XML schema (.xsd) file into XML format; such converters can be found on the internet.

 

 

Things to be done:

 

In the process we create a BEx workbook with two worksheets, "Q001" and "Data XML". In the "Data XML" worksheet we store the file "Data.xml"; in the "Q001"
worksheet the data source for the query "Q001" will be created. Excel formulas will then be created to fill "Data XML" with the data displayed in the "Q001" worksheet. Once the data is displayed in "Data XML", that worksheet will be saved as "XML Data" and, in the final step, imported into the e-deklaracje system.

 

 

Step by step procedure:

 

BEx workbook creation:

 

  1. Run BEx Analyzer (log on to the system where the workbook will be created) and open the XML file using the standard Excel function File / Open / As XML table (information about creating the file schema is displayed).
  2. Now you can see the XML file fields in one row:

     

Bez tytułu - import xml.png

 

3. Change the name of the worksheet where the XML file is stored to "Data XML".

 

4. Create a new worksheet in the workbook and give it a name, e.g. "Query1"

     (it should look like this)

Bez tytułu - zapisz zakładki.png

  

  5. Create a data source to display data from the query "Q001" in the "Query1" worksheet,

 

  6. Save the file as SAP BW Workbook using BEx Analyzer Save Workbook function.

 

  7. Now we should see the table with the refreshed data in the "Q001" worksheet. If not, refresh the query (e.g. by reloading the workbook).

 

  8. Create formulas in worksheet “Data XML” to display data from the query in the XML sheet.

 

  Bez tytułu - wpisanie formul.png 

 

 

   9. After the formulas have been created, the workbook can be saved and closed.

 

 

Refreshing data in workbook and creating the xml file:

 

 

10. Open the workbook and refresh the query to see the data that we need to transfer to the target system,

 

11. Activate the worksheet "Data XML" and save it as an XML file using the standard File / Save As / XML Data file type.

 

12. Import the file to e-deklaracje or any other target system.

 

 

Now the imported data can be checked in the target system.

 

 

Regards,
Leszek

How to Import SAP Transport Requests manually at the OS Level


Hello All;

 

We can import SAP transport requests manually by copying the transport files at the OS level.


In the normal scenario you release your request on the development system and then import it into the quality and production systems. The import procedure happens via SAP STMS, but in a few cases you may need to do this manually at the OS level.


1) On the system where your transport request exists, go to the directories /usr/sap/trans/data and /usr/sap/trans/cofiles

1.PNG

2.PNG



2) Copy the files relevant to your transport request from both directories. In this example the transport request number is 905209.


3) On the quality system, or whichever system you need to import the transport request into, paste the files into /usr/sap/trans/data and /usr/sap/trans/cofiles respectively.
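Steps 1 to 3 can be sketched as follows. The example is simulated under /tmp so it can be tried anywhere; in real life, replace the /tmp directories with the actual /usr/sap/trans directories and the plain cp with scp/ftp between the two hosts. The standard file naming (data file R&lt;number&gt;.&lt;SID&gt;, cofile K&lt;number&gt;.&lt;SID&gt;) is assumed:

```shell
# Transport 905209 from source system DEV (names are illustrative).
SRC=/tmp/trans_dev; DST=/tmp/trans_qas
TR=905209; SID=DEV

mkdir -p "$SRC/data" "$SRC/cofiles" "$DST/data" "$DST/cofiles"
touch "$SRC/data/R$TR.$SID" "$SRC/cofiles/K$TR.$SID"   # stand-ins for the real files

cp "$SRC/data/R$TR.$SID"    "$DST/data/"       # the data file
cp "$SRC/cofiles/K$TR.$SID" "$DST/cofiles/"    # the cofile
ls -l "$DST/data" "$DST/cofiles"
```

As an alternative to adding the request via the STMS GUI in step 4, the tp command line (tp addtobuffer with the transport profile of your domain) can achieve the same; treat the exact invocation as system-specific.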


4) Go to STMS, open the import queue of your quality system (or of the system into which you want to import the transport request), and on the menu bar choose Extras --> Other Requests --> Add

1.PNG


5) You will be asked to add the transport request

Capture.PNG

6) You will find your transport request; choose it and click OK


7) Now your transport request will be in the import queue. Choose it and click Import

1.PNG

 

Hope this was helpful.

 

Best Regards

~Amal Aloun

Configuring and troubleshooting enterprise portal url iviews for BI reports


These days, end users from business houses prefer to log in to the single enterprise portal in their SAP landscape to view and analyse BO BI reports, rather than logging in separately to the BO BI Launch Pad. This blog post discusses the configuration of, and issues related to, such usage.

 

Pre-requisites

 

1. SSO must be configured across your BW ABAP system, AS Java enterprise portal, and BO BI system. Certificates are exchanged between the portal and BW systems; you configure SSO by creating system objects in the enterprise portal and then test it using the portal's test and configuration tools. Similarly, you need to take the certificate generated at the OS level of the BO system, import it into your BW ABAP system, activate the relevant parameters in the properties files of the BOBJ system (such as the global.properties and OpenDocument.properties files), and import the keystore into your BO CMC. Remember that, other than setting up these parameters, NO certificates are exchanged between the portal and BO systems.

 

2. Set up entitlement systems under SAP authentication in your BO CMC by entering the relevant information of your BW ABAP system. Import the BW roles that should have access to the BI reports from the BW ABAP system into your BO system, and give those roles adequate security at the BO folder level (View, View on Demand, or advanced rights, according to the client's requirements). Ensure that the same SAP users have the proper enterprise portal roles assigned so that they can see the BI tab created for them inside the portal, which houses all the BI report links (URL iViews). Finally, the same users need sufficient authorization in the backend BW system to fetch data from the OLAP cubes and InfoProviders as applicable. Having the right security at all three levels, particularly on the BO and BW side, resolves a lot of report-related issues.

 

3. Ensure that the Business Objects template is available in the portal when you intend to create a system object for BO inside System Administration -> System Landscape. If it is not available, try to bring it in via an .epa file transport.

 

You can create a separate folder to group the system objects of your organisation inside the portal content tree. Here we simply go ahead and create the BO system object within System Administration -> System Landscape.

 

Untitled.jpg

 

Select the appropriate Business Objects template from the list,

Untitled.jpg

 

And then proceed with keying in the following properties,

 

1st being connector :-

 

Group - The logon group which has been created in SMLG of BW ABAP system which you intend to use for logon whenever a particular BI report is viewed via portal

Logical System Name - The one for your BW ABAP system, e.g. SIDCLNT640.

Message Server - One that hosts the message server of your BW system; ideally the ASCS instance fully qualified domain name.

SAP Client - The client number where all the BW cubes and infoproviders exist

SAP system ID - SID of BW system

Server Port - The port number of your message server, as defined in SMMS of the BW system; you can also find it in the ASCS instance profile. Typically it is 36<nn>, where <nn> is the instance number of the ASCS.

Untitled.jpg

 

In the next step you need to key in the OpenDocument URL of your BOBJ system, which is http://<servername>:<port>/OpenDocument/opendoc/openDocument.jsp

 

The port number will be the Tomcat port of your BOBJ installation, typically 8080 (on rare occasions it can be that of WACS, which is 6405), and the server name will be the application server where your BOBJ enterprise and web application services are installed. For a production system this host name and port number are normally replaced by an alias.

Untitled (2).jpg

 

Add the alias. In our case we have kept the system object name and the alias identical, namely SAP_BO_TOMCAT; they can differ.

 

Untitled.jpg

 

To troubleshoot errors related to the usage of URL iViews you need to read, as is always the case, the log files. You can view them by logging on to your portal's NetWeaver Administrator and navigating as below:

Untitled.jpg

Selecting Log Viewer brings you to the page below; if you click the small arrowhead on the View button you can list the errors and their causes. Setting the severity to DEBUG lets you follow a particular user's entire course of action, from logging in to the portal until the issue occurs while viewing the report. These log files are simply the default traces of your AS Java system, which you can also view by logging on at the OS level of your portal application server and navigating to the appropriate server<n> directory.

 

Untitled.jpg

 

Untitled.jpg

 

You can always view the ID and CUID of a particular BI report by logging on to the BO CMC and navigating through the folders (your appropriate LOB folder structure where you have deployed the report), just in case you need to verify that this is the report you are using in the URL iView.
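The resulting iView target then follows the usual OpenDocument pattern, for example (server, port, and CUID are placeholders):

```
http://<servername>:8080/OpenDocument/opendoc/openDocument.jsp?sIDType=CUID&iDocID=<CUID_of_the_report>
```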

 

In the enterprise portal you can view the iview in Content Administration -> Portal Content Management and inside the portal content tree

Untitled.jpg

Right-click any iView and open the properties section. Select the category Navigation; there you will find an option called Quick Link, in which you can enter the name that becomes your actual iView URL.

 

Untitled.jpg

On selecting the category SAP Business Objects you see the document ID and the OpenDocument URL mentioned above. There is also System; remember that this is the alias name and NOT the system object name defined when the system object was created.

Untitled.jpg

 

SAP has a wonderful logging mechanism, and you need to know how to look into the logs to find the causes. Trust SAP. It won't hurt you back!

 

Stay intrigued,

Raj


SAP mapping network drive without downtime. Wrong order of calls


During a Windows patch day we lost some of our mapped network drives. All permissions were set correctly as per note 117395.

The access on operating system level was working perfectly fine.

 

I know that the SAP system works with the SAPService<SID> user, so I logged on to the server as that user and checked the mapped network drive:

Bild1.jpg

 

The access to the folder was possible.

I checked (and recreated) the mapped network drive in the SAP system:

 

Bild2.jpg

 

The error "wrong order of calls <- CALL opendir" was normally resolved by a system restart; in this special case I had no such luck.

I also implemented a small application server and mapped the network drive to the new server.

 

The access was still not possible, although the UNC connection worked!

During my research I came across report RSBDCOS0. This report allows you to access the operating system level directly.

From here I could also check which settings are active in the SAP system.

Bild3_1.jpg

Access to drive I: was not possible. (I also had one application server where the status was "ok" but I still could not access drive I: via transaction AL11.)

 

After deleting the mapped drive both in the SAP system and at OS level with the command:

net use I: /delete

 

I created the mapping again with report RSBDCOS0:

Bild3_2.jpg

The access was possible inside the report.

Double-checking in transaction AL11 also worked:

Bild4.jpg

 

I never found out why the restart did not help, but in the end I found a nice way to map those network drives without any downtime.
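So the zero-downtime recipe boils down to the following two commands, executed from report RSBDCOS0 inside the SAP system (the drive letter and UNC path are examples; the permissions follow note 117395):

```
net use I: /delete
net use I: \\fileserver\sapshare
```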

 

 

Rainer

send output of a process which we run in background mode not as background job as email


To email the output of a process that we run in background mode (not as a background job), we need to set up the printer described below and use it.

 

I want to receive the results of transactions executed in the background in my e-mail inbox, or rather in the inbox of the user who executed them.


We have to create a printer in SAP with access method M and device type ZPDF, which lets us receive the files as PDF.


We have to provide an e-mail ID and select the "use this e-mail ID" option; alternatively, different users can be set up to receive the e-mail.

 

 

pdf.jpg

pdf2.jpg

Application and system - a worthless debate?


Recently one of my friends, who is an ABAP developer, corrected me, saying that SAP BI is a 'system', not an 'application'. We debated for about two minutes and finally I gave up, conceding to his demand, since there wasn't any point in winning the argument. His demand was that I agree it is a system, not an application. That got me thinking about why we have different views on it. I am a SAP NetWeaver administrator and consultant. Both of us have engineering backgrounds, mine a degree in IT and my friend's in electronics. In the Unix world, system software and application software are different things: system software comprises the essential programs needed to run the system, e.g. the kernel, while application software runs on top of the system, e.g. Firefox. But I am aware that the sense of 'system' is different in our day-to-day life.

 

I think SAP BI is an application, not a system. When I install it on a computer and make it ready for business use, it becomes a "system". My friend thinks SAP BI is a system because it is not an application server like JBoss or SAP NetWeaver Web AS. SAP BI contains SAP Web AS as well, but SAP BI is more than SAP NW AS. In my view, it is like this:

 

Question: Is SAP BI a system?
Answer: No, if it is not installed on a computer (server); yes, if it is installed on one.

Question: Is SAP BI an application?
Answer: Yes.

Question: Is SAP BI an application server?
Answer: No, but the SAP NetWeaver Web APPLICATION Server is a part of SAP BI.

 

Let me know your views.

NWA acquires 47.29% shares of Configtool


I would like to share a few small tips; not a new discovery, but not widely used!

I think these features were introduced in NWA approximately at the time of NW 7.3, or a little earlier.

 

There are certain activities that we are accustomed to doing with the help of configtool, like memory parameter changes. The issues we mainly face here are:

  • getting OS access
  • setting the display (I am not going to discuss display issues here)
  • sometimes not having proper display tools to get the work done (like Hummingbird, etc.)
  • and, most important, feeling too lazy to start configtool

 

Still, there are situations where there is no option except configtool, for example when Java is down. You may also have noticed that configtool does not offer the option to change certain values, for example for SAP* activation or UME modification (com.sap.security.core.ume.service). Everywhere the gurus have said:

1) Under Global Server Configuration, choose Services.   2) Find com.sap.security.core.ume.service and click it. 3) Under Global Properties, find ume.superadmin.activated

 

Gone are those days! As you can see below, it is now disabled for change even at the global level.

<Please click below image to view in actual size>

ConfigEditor01.png

In order to do that we need to click on switch to Configuration Editor Mode

cluster_config -> system -> custom_global -> cfg -> services -> com.sap.security.core.ume.service -> Propertysheet properties

<Please click below image to view in actual size>

ConfigEditor02.png

You need to scroll to the bottom to find com.sap.security.core.ume.service; the entries are not sorted alphabetically.

Double click on Propertysheet properties.

<Please click below image to view in actual size>

ConfigEditor03.png

Once done, it will not ask you to confirm the save, so being careful is always advisable!

 

Back to original topic....

 

Let's use NWA for some of the purposes for which we usually go to configtool.

I am going to show a comparison of NWA and configtool for accomplishing memory and other parameter changes.

<Please click below image to view in actual size>

configtool.png

In this version of configtool we know that parameters are distributed among 4 tabs.

Servers, VM Environment, VM Parameters (which again has 3 tabs) and Instance Profile

<Please click below image to view in actual size>

configtool02.png

configtool03.png

That being said, let's go to the system information page and find these 3 new links. You need to click on each instance to view them:

http://<hostname>:5<nn>00/nwa/sysinfo

  1. VM Memory Parameters
  2. VM System Parameters
  3. VM Additional Parameters

<Please click below image to view in actual size>

nwa01.png

Or, even better, go to NWA -> Configuration -> Infrastructure -> Java System Properties

 

(Did you notice System Information link is also present)

<Please click below image to view in actual size>

nwa02.png

You will get this page and you can find different tabs for

Kernel, services, Application, VM Environment, VM Memory Parameters, VM System Parameters, VM Additional Parameters

 

If you select a node and modify the desired value, the change applies only to that instance/node.

After changing the value, when you hit the Save button, it will prompt you to restart the instance.

 

(I have tried restarting the instance from here; it successfully restarted the application and brought it back, so even for a restart you don't have to go to OS level.)

<Please click below image to view in actual size>

nwa03.png

 

Apart from memory and other Java parameters, we also need to work with various services, which can be found on the Services tab.

If you select a certain service, you will find its properties at the bottom; some of them are changeable and some are not.

<Please click below image to view in actual size>

nwa04.png

You can even restart services from here.

<Please click below image to view in actual size>

nwa05.png

 

There is a lot to explore in this regard, keep on looking.

 

 

LineSeparator.jpg

My other Blogs, if you have time...

 

NWDS step by step (In the loving memory of SDM)

What's new in SAP NetWeaver 7.3 - A Basis perspective Part-I

What's new in SAP NetWeaver 7.3 - A Basis perspective Part-II

Bye bye STRUSTSSO2: New Central Certificate Administration NW7.3

Escaping tough moments of SPAM or SAINT

SAP Software Provisioning Manager : with screenshots

Multiple/Bulk transports with tp script for Unix (AIX, Solaris, HP-UX, Linux)

Script for deleting files within a directory structure with different retention days

Automate the configuration of the SSH for <sid>adm user on DB and Appl. Server

Holistic Basis View: BusinessObjects BI 4.0 SP 2 Installation & Configuration

How to Rename the Oracle Listener & Change Listener port for SAP

OSS1 & RFC connections SAPOSS, SAPNET_RFC, SDCC_OSS

Start/Stop SAP along with your Unix Server Start/Stop

Interrelation: SAP work process, OPS$ mechanism, oracle client & oracle shadow process

Install and configure NetWeaver PI 7.3 Decentralize Adapter part-1

Install and configure NetWeaver PI 7.3 Decentralize Adapter part-2

List of Newly added/converted Dynamic parameter in NetWeaver 7.3

Sunset for ops$ mechanism: No more supported by Oracle & Not Used by SAP


 

Essential Basis for SAP (ABAP, BW, Functional) Consultants Part-I

Essential Basis for SAP (ABAP, BW, Functional) Consultants Part-II

Essential Basis for SAP (ABAP, BW, Functional) Consultants Part-III

Understand ST14 - use/repair/customize your analysis


Hi NW admins,

Maybe you already know the functions of ST14, which is delivered with ST-A/PI, which also includes:


  • Transaction ST14 (Application monitor)
  • Report RTCCTOOL (Servicetools Update) 
  • Report RSECNOTE (SAP Security Notes Check)
  • Transaction ST12 (ABAP trace for EarlyWatch/GoingLive)
  • Transaction ST13 (Launchpad for further analysis tools)
  • new SDCC datacollectors for BW APO CRM and databases
  • collectors for E2E change management

 

ST14 is good for getting an overview of which areas are wasting space and where there is potential for optimization.


The collection of this data in ST14 depends on the results of the database collectors, which you can find in DB02 => Space => Additional Functions => Collector Logs.


Here you can see the last analysis and whether all modules are in an active state; this is a prerequisite for using ST14.


Error analysis

For error analysis you can use table DB02_COLL_LOG
Details in 1482296 - DVM Service: analysed data volume incorrect (Oracle)
Report RSORACUP (connection DEFAULT delete/create) to activate all collectors (oracle).
Details in 1002840 - No data or obsolete data in DB Space Statistic monitor

You also have to check whether all entries in table TCOLL are correct. Normally this is checked by report RTCCTOOL, but you can double-check it with note 12103.

If you have one of the latest versions of ST-A/PI (01R / 01Q SP2) you have to implement some corrections:
2049530 - Data Collectors corrections ST-A/PI 01R*
1936913 - Data Collectors corrections ST-A/PI 01Q SP02
1843959 - ST14 does not show data for BW analyses

After implementing a new support package for ST-PI or ST-A/PI, don't forget to uncomment/recomment the analysis coding! For this, use RTCCTOOL => Addon & Upgrade assistant => Procedure after addon implementation.
You can also execute the report /SSF/SAO_UTILS via transaction SA38 and flag the first option, 'Uncomment/Recomment analysis coding for additional components'.


Customizing

If your ST14 is working and you have collected some results, you may prefer another format, or you may not want so much information; in that case you can customize it.

Check the Questions&Answers section in note 69455 - Servicetools for Applications ST-A/PI (ST14, RTCCTOOL, ST12)
Q8: deactivate single ST14 analysis subroutine ?
A8: An ST14 analysis job calls a sequence of analysis subroutines. If one subroutine fails, proceed ad follows: from the short dump get the subroutine name (e.g. AFI_FI_SCAN_T685). Run report /SSA/AXM from SE38. On the selection screen, enter the first 3 letters (e.g. ABW/ABO; check this in your report, the headline in a tree entrie includes the letters) of the subroutine name into parameter P_APPL and execute. In the left section 'ST14 customizing' choose 'Mapping: Assign analy. subroutines to GRPIDs' and press the button 'Change'. In the following tablecontrol deselect the 'Active' checkbox for analysis subroutine that should be skipped and press 'Save'.
Then schedule a new analysis from ST14.


Quick overview

If you want to have quick overview without scheduling ST14, you can use the DB02 / dbacockpit => space => additional functions => BW Analysis:

With a double click on a row you go into detail view.

ST14 is pretty good for analyzing BW or SCM systems, giving a quick overview of where your space is wasted and what could be optimized.

For me it is the starting point of a deeper analysis, because it is platform and DB independent.


If you have any further questions, don't hesitate to comment on the blog or contact me or one of my colleagues at Q-Partners ( info_at_qpcm_dot_de ).

 

Best Regards,

Jens Gleichmann

Technology Consultant at Q-Partners (www.qpcm.eu)
