
Introduction to SAP Data Archiving

In my experience with systems implementation, support, and administration, one term whose meaning varies depending on the context in which it is used is archiving, or data archiving. IT professionals often confuse the term with concepts such as data reorganization, fragmentation, document imaging, and backup and restore, among others. Even in the SAP world, some technical professionals take data archiving to be synonymous with terms such as SAP ArchiveLink and DART.
While I do not intend to compare and contrast these concepts in this post, I think it is worth clearly defining what data archiving is in the context of SAP. I will also give an overview of how data is archived and the benefits of data archiving in SAP.

What is Data Archiving?


Data archiving is a decongestion process that deletes large volumes of data that are no longer needed from a database and stores that data outside the database in a format that allows retrieval and analysis when required. The emphasis here is on both deleting and storing. It is common knowledge that if a database is left to grow unmaintained, performance bottlenecks and high database maintenance costs are likely. Hence, one way to maintain the database is to delete records that can be termed obsolete (a relative term). For the archiving process to be complete, the data has to be stored using a defined method.

How is data archived?


The data archiving run follows a sequence of steps. A brief overview is provided below.
1. Creation of the archive file: During an archive run, the write program first creates archive files, which initiates the reading process (from the database) and the subsequent writing process (to the archive file).
2. Storage of the archive file: After a successful run of step 1, the created archive files are stored. A number of methods can be leveraged: archive files can be stored hierarchically, optically or manually. It is important to state that SAP does not recommend manual storage of archive files, as a result of standardization issues.
3. Deletion of data: This step completes the archive run. Before data is deleted from the database, the program first reads the content of the archive file. Only after this task does the program delete the corresponding entries from the database.
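The three steps above can be sketched in a few lines of Python. This is an illustrative model only, not SAP's actual write/store/delete programs: the record layout, the JSON-lines file format, and the `is_obsolete` callback are all invented for the example.

```python
import json

def archive_run(records, archive_path, is_obsolete):
    """Sketch of an archive run: write eligible records to an archive
    file, read the file back, then delete only the verified records."""
    # Step 1: write -- copy obsolete records into the archive file.
    eligible = [r for r in records if is_obsolete(r)]
    with open(archive_path, "w") as f:
        for r in eligible:
            f.write(json.dumps(r) + "\n")

    # Step 2: store -- hand the file to a storage system (stubbed here;
    # hierarchical or optical storage is typical, manual is discouraged).
    # store(archive_path)

    # Step 3: delete -- re-read the archive file first, and remove from
    # the "database" only records whose archived copy could be read back.
    archived_keys = set()
    with open(archive_path) as f:
        for line in f:
            archived_keys.add(json.loads(line)["key"])
    return [r for r in records if r["key"] not in archived_keys]

# usage: records older than a cutoff year count as "obsolete"
records = [{"key": 1, "year": 2005}, {"key": 2, "year": 2013}]
remaining = archive_run(records, "archive_001.jsonl",
                        lambda r: r["year"] < 2010)
print(remaining)  # only the 2013 record stays in the "database"
```

The point of the read-back in step 3 mirrors the text above: deletion is driven by what is provably in the archive file, not by the original selection.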

Benefits of Data Archiving


1. Reduced Backup and Restore time
2. Reduced access time for tables
3. Reduced database administration cost
4. Reduced downtime for system upgrade
5. Reusability of data

SAP Data Archiving Changes in ERP 6.0 EhP 6


I am currently testing SAP data archiving in ERP EhP 6 SP 3 and have found some issues as well as noticed several changes (from
ERP 6.0 EhP 3) that I want to share with the SAP Community. This information is grouped into Configuration Issues, New
Functionality for Current Archive Objects, New Archive Objects, and Updated Documentation.

Configuration Issues

In past tests (upgrades, previous enhancement packs, OSS Notes, etc.) I have found configuration-related issues with archive objects
that have been touched by IS-Oil. Multiple changes/fixes were needed in order to even begin testing data archiving. For example,
archive job variants were missing for MM_EBAN, MM_EKKO and MM_EINA. This was easily resolved by recreating them via
SARA/Customizing Settings. The archive write job/program failed for SD_VBAK, MM_EKKO and MM_EBAN. This was resolved by
following the instructions in OSS Note 945459 (the note states that it is for SRM, but SAP verified that it could also be followed to fix
this ERP issue). The archive write job/program also failed for MM_EINA with "Archiving object or class MM_EINA does not contain
structure OICQ4". I compared the MM_EINA archive object structure definition in transaction AOBJ in the EhP 6 system with
one in a system that had not been upgraded yet, and found that three segments were missing in the EhP 6 system. I added those segments
and the archive write job then completed successfully.

New Functionality for Current Archive Objects

MM_EKKO:

I noticed that this archive object has new write, delete and preprocessing programs:

I did have to manually change the Preprocessing program to reflect the new program name per OSS Note 1646578.

The new version of the preprocessing and write job programs provide an additional option of "Residence Time Check Creation Date".

If this new option is not selected, the residence time check is carried out against the last change date of the purchase order instead.
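The logic of this option can be sketched as follows. This is a hedged illustration of the residence-time check only; the dictionary field names and the day-based granularity are assumptions, not the actual MM_EKKO program code.

```python
from datetime import date

def passes_residence_check(po, residence_days, use_creation_date, today):
    """Illustrative residence-time check: measure against the creation
    date when the new option is set, otherwise against the last change
    date (field names here are invented for the sketch)."""
    reference = po["created_on"] if use_creation_date else po["last_changed_on"]
    return (today - reference).days >= residence_days

po = {"created_on": date(2011, 1, 10), "last_changed_on": date(2011, 12, 1)}
today = date(2012, 3, 1)
# With the option set, age is measured from creation (passes a 365-day check);
# without it, from the last change (fails the same check).
print(passes_residence_check(po, 365, True, today))   # True
print(passes_residence_check(po, 365, False, today))  # False
```

The example shows why the option matters: a frequently touched purchase order may never age out under the last-change rule, while the creation-date rule lets it become archivable.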

SD_VBAK:

The archive preprocessing and write jobs have a new option of "Check Valid-To Date".

This option only applies to sales documents with a valid-to date (like a quote, scheduling agreement, contract, etc.) and the program
assumes the end of validity has been maintained.

WORKITEM:

The archive write job has a new option of "Delete Unnecessary Log Entries".

I have not been able to determine exactly what this means yet as there isn't any SAP Help for this option.

PM_ORDER:

The preprocessing job has a new field of "Revision".

The write job has the new Revision field as well as an additional section for "PS: Project".

BC_DBLOGS:

The archive write job now provides the ability to specify, by table name, which change logs get archived. Prior to this, ALL
customizing tables that had logging turned on were archived.
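As a rough model of this change (illustrative only; the entry layout and function name are invented, not the BC_DBLOGS program):

```python
def select_log_entries(log_entries, tables=None):
    """BC_DBLOGS-style selection sketch: with no table list, every logged
    customizing table is archived (the old behavior); with a list, only
    entries for the named tables are selected (the new option)."""
    if not tables:
        return list(log_entries)
    return [e for e in log_entries if e["tabname"] in tables]

logs = [{"tabname": "T001", "key": "A"}, {"tabname": "T005", "key": "B"}]
print(len(select_log_entries(logs)))            # 2 -- everything, as before
print(len(select_log_entries(logs, {"T001"})))  # 1 -- only table T001
```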

PM_QMEL:

The preprocessing program has added the fields "Planning plant (IWERK)" and "Revision (S_REVNR)", as well as the capability to
capture detailed logging information (saved in the application log) to help determine why a notification was not eligible for
archiving.

The write job also includes the same new functionality as the preprocessing program.

The detailed log and log output functionality has also been added to the preprocessing program and write program for SM_QMEL.

New Archive Objects with EhP 6

I found that there are almost 200 new archive objects delivered in EhP 6. I will not be going over all of them in this blog; I
will be picking a few of them to highlight.

Virsa Firefighter Logs:

Depending on how you use firefighter IDs, you may or may not need to control the growth of the log tables. OSS Note 1041912
provides some Firefighter best-practice archiving strategy information.

If you do use firefighter IDs extensively, you can use data archiving for these tables:

/VIRSA/ZFFTNSLOG - Firefighter Transaction Log


/VIRSA/ZFFCDHDR - Firefighter Change Document
/VIRSA/ZVIRFFLOG - Firefighter Action Log

To archive, use transaction /VIRSA/FFARCHIVE (not through SARA). Before you can run this transaction, you will need
to follow the instructions in OSS Note 1228205 to maintain the path where the archive file will be written, as indicated by
the "Application Server File" parameter:

Additional information on this can be found in the Firefighter User Guide available on the SAP Service Marketplace.

BC_E071K:

Starting in SAP_BASIS Release 7.x, you can now archive transport information. Archive object BC_E071K is standard in
SAP_BASIS Release 731. For releases 700 to 720, you will need to be at the relevant support package level as indicated in
OSS Note 1340166.

Note that only the entries from table E071K will actually be archived out of the system. The related entries from tables
E070 and E071 will only be written out to the archive file, but not deleted.

CA_SE16NCD:

Per OSS Note 1360465: If you use transaction SE16N to make changes to tables, they are updated in separate change
document tables. Depending on the number of changes, the change document tables can be very large.

The tables in this archive object are:

SE16N_CD_KEY - Table Display: Change Documents Header
SE16N_CD_DATA - Table Display: Change Documents Data

It is recommended to archive this data using date intervals.

The archived data can then be displayed/analyzed with report RKSE16N_CD_DISPLAY.

Archiving in GRC Access Control 10.0:

There are several new archive objects related to archiving GRC Access Control data. They are:

GRFNMSMP - Archiving for GRC AC 2010 Requests
SPM_AU_LOG - SPM Audit Log Archive
SPM_CH_LOG - Change Log Archive
SPM_LOG - Archiving for SPM Log Reporting
SPM_OC_LOG - SPM OS Command Log Archiving
SPM_SY_LOG - SPM System Log Archival

Updated Documentation

The Data Management Guide has been updated as of December 2011. If you have not downloaded this from the Service
Marketplace recently, you should check it out (logon required).

To find out what has been added or updated, go to Chapter 2 "Which Tables are Examined".

Here you can quickly find out what is new in this version of the document by checking the "Last Changed in Version"
column.

There are a lot of changes related to SAP Data Archiving in ERP EhP 6. This blog just highlights a few of them. I hope
you find this information useful.
You can go through the links below on archiving; they cover the transactions used for archiving objects and how to do it.

http://help.sap.com/saphelp_srm30/helpdata/en/15/c9df3b6ac34b44e10000000a114084/content.htm

http://help.sap.com/SAPHELP_NW04S/helpdata/EN/43/f0ed9f81917063e10000000a1553f6/content.htm

Basically, the data archiving process comprises three major phases:

1. Creating an archive file: Archive files are created in the SAP database by Archive Management, which reads the data from the
database and writes it to the archive files in the background. If an archive file exceeds the maximum specified size, or the number of
data objects exceeds the limit stipulated in the system, the system automatically creates a new archive file.

Once the data has been saved to archive files, the ADK triggers the system event SAP_ARCHIVING_WRITE_FINISHED, which
signals the system to start the next phase of the archiving process.

2. Removing the archived data from the database:


While Archive Management writes data to the archive files, another program deletes it from the database permanently. The
program first checks that the data has been transferred to the archive; this is quite important, as it is the last check performed by the
system before data is permanently deleted from the database. Several deletion programs run simultaneously, because the archiving
(write) program is much faster than the deletion programs. This is important as it increases the efficiency of the archiving process.
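The write-outruns-delete pattern above can be sketched with a pool of parallel delete workers, one per archive file. This is an illustrative model only, not the actual SAP deletion programs; the dict-based "database" and archive-file layout are invented for the example.

```python
from concurrent.futures import ThreadPoolExecutor

def delete_from_db(archive_file, database):
    """One delete job: read the archive content first, then remove each
    archived entry from the (dict-based) database."""
    deleted = 0
    for key in archive_file["keys"]:  # last check before deletion
        if key in database:
            del database[key]
            deleted += 1
    return deleted

# One write session produced several archive files; run one delete job
# per file in parallel, mirroring the simultaneous deletion programs.
database = {k: f"doc-{k}" for k in range(6)}
archive_files = [{"keys": [0, 1]}, {"keys": [2, 3]}, {"keys": [4, 5]}]
with ThreadPoolExecutor(max_workers=3) as pool:
    counts = list(pool.map(lambda f: delete_from_db(f, database), archive_files))
print(sum(counts), len(database))  # 6 0
```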

3. Transferring the archived files to a location outside the SAP database


Once Archive Management has finished archiving the data, the next step is to save the archive files at a location other
than the SAP database. This can be accomplished by an automated process in the system or by a manual process. This step is
optional, since some enterprises may wish to keep the archive files where they are; however, large enterprises transfer
their data periodically as part of their data archiving processes.

Hope this helps!!

Thanks and Kind Regards


Esha

Data Archiving
The SAP application generates a significant amount of data, and this increases with every new release. Growth
of the SAP database is generally reflected in reduced system performance, and it also places additional strain
on system management people and processes. A common scenario is for a production SAP database to be
copied to create testing and training systems, so a 100 GB increase in production data can easily lead to
400-500 GB of additional landscape storage once DR and backup servers are taken into account.
Currently only about 30% of SAP customers worldwide keep their database growth under control by archiving
data, and in turn keep their SAP systems at peak performance. Many of the remaining 70% have no idea how
big their database is or what its growth rate is. They may first look at it only when batch or interactive
performance deteriorates, or when their database management system (such as Oracle) warns them of low
remaining free space in their disk storage. Even among customers that do archive, a number do not look at the
additional archiving that could be performed to further reduce the size of their disk storage.
Deleting data outright is not a viable option, since there are any number of legal and regulatory requirements on the
retention of data. However, data from closed business processes can be archived to a "content repository" server,
freeing up space in the SAP database while still providing seamless access to the archived data via the SAP GUI.
Holding the data in a content repository allows the use of more appropriate storage media and avoids data
replication across the landscape.

Proceed apply a detailed Methodology to all our projects, to ensure that our years of experience help you
obtain the most benefit and lowest risk for your projects. This methodology includes detailed documentation
deliverables so that your archiving can carry on seamlessly once the initial project has been completed.

Whilst our methodology covers the processes of a successful archiving project, in parallel we have developed
a suite of analysis and implementation tools that are loaded onto your SAP systems to further streamline the
project and reduce risk and cost. Known as the Archiving Workbench, it performs tasks such as table where-used
analysis and historical re-printing of SD/MM documents. The latest addition to the Workbench is a
suite of data cleansing tools that identify and correct data errors in your system (common during data loads at
go-live) that prevent the full archiving potential from being realised. For OpenText customers there are also
integrated OpenText Archive Server monitoring and troubleshooting utilities.
Proceed have available a number of free examples of the tools and processes we use during a typical data
archiving project. They can be viewed in our Library.

SAP ILM
SAP ILM (Information Lifecycle Management) is the new end-to-end data management process from SAP.
Available from ECC6 EHP4, a successful ILM solution will be the combination of the new archiving applications
from SAP, combined with ILM aware hardware and an experienced implementation partner such as Proceed.
Long-term storage of data through archiving has been a cornerstone of SAP and its partners' ArchiveLink-certified
solutions. Advances in technology and the emergence of Content Addressed Storage (CAS)
technology from EMC, HP and IBM ensure that data cannot be destroyed or modified before its end of life.
With ILM, SAP can now apply data retention policies to the content within the archive files, meaning that
control of data at end of life can now be managed easily and in a compliant manner with regulations.

ILM is built upon three main pillars of functionality that bridge the gap between SAP and your storage solution:

Information Retention Manager (IRM)


Archive Development Kit (ADK)
ILM WebDAV Interface
The IRM provides the policies and procedures for managing the data retention policies. The ADK binds the
retention policies from IRM to the traditional archive files created during data archiving, and the WebDAV
interface moves the archive files and its retention into the ILM aware storage solution.
ILM goes further than just applying retention policies to traditionally archived SAP data. By exporting data from
legacy applications using SAP BusinessObjects and parsing the data through the ILM ADK Converter, you
also have the option to apply data retention policies to your legacy data prior to application decommissioning.
This data can then be surfaced through the ILM Retention Warehouse for viewing and reporting, in much
the same way as you would use your SAP BI system for your current SAP data.
Call us today, and ask how Proceed can help ensure your ILM project is a success.

Seven tips for simplifying SAP data archiving administration


SAP data archiving is a complicated, labor-intensive process. But there are best practices for SAP data archiving to make the
administrative task simpler.

SAP data archiving cannot be implemented without significant risk. That is the message of Jim Malfetti,
president of Glen Mills, Pa.-based Brandywine Data Management Group, an SAP professional services firm.
But there are ways to simplify the administration of SAP data archiving, said Malfetti, who has been involved in
consulting with SAP customers since 1996 and claims to have dedicated much of his career to archiving best
practices. Malfetti offers seven tips for simplifying the administration of SAP data archiving.
Tip #1. Don't worry if your SAP data archiving project goes into start/stop mode. Once the archiving
maintenance phase is reached, going into start/stop is perfectly normal. This is because of the 12-month time
lapse between the end of the archiving execution phase and the beginning of the archiving maintenance phase.
During this time, details get lost -- who's responsible, which new procedures and policies need to be created,
which new security authorizations are required. A reassessment at this time of whether the right archiving
objects are being used is also often required.
Tip #2. There's a law of diminishing returns with selecting new archiving objects. In SAP data archiving,
the archiving object is a critical component. The archiving object specifies which data is archived and how;
describes which database objects must be handled together as a single business object; and interprets the data
regardless of the technical specifications at the time of archiving, such as release and hardware.
There are more than 600 possible SAP archiving objects, Malfetti said. The most he ever witnessed any
customer using was 100, and he called that customer's practice "insane."
The law of diminishing returns kicks in after 20 to 25 objects, since 10 to 20 objects address most of the
database. He said, however, that new archiving objects can significantly add to your overall SAP database
storage savings.
Tip # 3. SAP provides tools to improve archiving effectiveness, but consider adding third-party utilities.
Although many SAP archive administrators currently use the SAP Archive Administration (SARA) tool, which
provides the overall administration of archiving schedules and managing the archiving sessions, Malfetti
recommended a new archiving automation aid from TJC Software Solutions Inc. called Archiving Session
Cockpit (ASC). He said that both tools provide the same functions, but "what takes the administrator 100 steps
with SARA takes only a few steps with ASC."
Tip # 4. Perform a comprehensive database analysis every six to 12 months. Introducing new archiving
objects can yield significant database savings. This is another area where third-party tools are worth evaluating,
he said.
Using Transaction TAANA (which stands for Table Analysis: Administration) helps identify the distribution of
data within a table. TAANA can also identify the volume of archivable data and any archive file routing
requirements.
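TAANA's distribution output is conceptually a group-by count over the chosen analysis fields. A minimal sketch follows; the field names (GJAHR for fiscal year, BLART for document type) and the in-memory rows are illustrative, not a real TAANA run.

```python
from collections import Counter

def analyze_table(rows, fields):
    """TAANA-style distribution: count rows per combination of the
    chosen analysis fields (e.g. fiscal year and document type)."""
    return Counter(tuple(row[f] for f in fields) for row in rows)

rows = [
    {"gjahr": "2008", "blart": "RE"},
    {"gjahr": "2008", "blart": "RE"},
    {"gjahr": "2013", "blart": "SA"},
]
dist = analyze_table(rows, ["gjahr", "blart"])
print(dist[("2008", "RE")])  # 2 -- the old fiscal year dominates
```

A distribution like this is exactly what tells you how much of a table is old enough to be archivable.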
Malfetti recommends that SAP administrators look into the PBS DB Analyzer Plus tool to help them analyze
the capacity commitment of their archiving objects in the database in order to determine their current memory
requirements. The tool helps administrators determine how much their database is growing, for which objects
archiving makes the most sense, and which module requires most of the disk space.

Tip # 5. When it comes to archiving, consider your entire SAP landscape. BI data can be archived and
nearlined. CRM and industry-specific solutions all have archiving functional standards. QA and DEV systems
could benefit from PRD archiving after a refresh; they can also be reduced using third-party solutions.
Malfetti noted that any space savings may be mirrored three or four times, so that a reduction of a terabyte in
your original database can actually represent a total saving of 4 TB.
Tip #6. Archiving is a prerequisite for deleting expired data. In order to dispose of end-of-life data, "expired
data must first be archived," Malfetti said. "There is no magic program to dispose of, for example, everything
older than seven years in sales. You need to go through all the steps for archiving" in order to delete the data after the stipulated seven-year retention period.
Malfetti also noted that each archiving object may have multiple variants for a given time frame. So, while
the mandate for keeping U.S. data is seven years, in Italy the retention period is 10 years. This means, he said,
that you need two separate files for the data. "Otherwise, if it's in one file, you'll be keeping the U.S. data for 10
years."
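The two-files requirement follows because an archive file can only expire as a unit. Splitting documents into per-retention batches can be sketched as below; the country-to-years mapping and document layout are illustrative assumptions.

```python
def split_by_retention(documents, retention_years):
    """Group documents into separate archive batches by retention
    policy, so a 7-year (US) file never carries 10-year (Italy) data."""
    batches = {}
    for doc in documents:
        years = retention_years[doc["country"]]
        batches.setdefault(years, []).append(doc)
    return batches

docs = [{"id": 1, "country": "US"}, {"id": 2, "country": "IT"},
        {"id": 3, "country": "US"}]
batches = split_by_retention(docs, {"US": 7, "IT": 10})
print(sorted(batches))  # [7, 10] -- one batch (and archive file) per policy
print(len(batches[7]))  # 2
```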
Tip #7. To keep an SAP data archiving initiative on schedule requires project management expertise.
Malfetti divides the SAP archiving cycle into four phases: archiving development, archiving scheduling,
archiving execution, and then ongoing maintenance, and each phase involves a large and diverse team of
players. While the primary roles in the first three phases belong to the archive administrator and the storage
administrator, significant parts are also played by the project manager, functional analysts, key business users
and the Basis administrator. The fact that so many different titles play key parts requires significant
coordination to keep the project on schedule.

SAP archiving process-Simple steps


Application:

Archiving can be performed for objects in different areas: PP, MM, FI, CO, PM, etc.

The example here uses an MM object, but the procedure applies equally to any PP object (e.g. PR_ORDER, process orders).

This document demonstrates the process of archiving. Though the example uses a single object, this is the general
way of archiving for any object. Archiving is required to keep the system fast and clean, and it is still possible to
retrieve the data once it is archived.

The archiving process generally has four main steps:

Step 1 Pre processing


Deletion indicators will be set in this step
Applicable for some objects

Step 2 Write
Records chosen for Archiving will be written in Archiving file system in Unix server (Path:
/Volumes/Archiving/<Object name>/)
Applicable for all objects

Step 3 Delete

The written records are deleted


Applicable for all objects
To ensure that the archiving file system backup has been taken for the written records, a gap of at least 2 days should be
maintained before performing the Delete step.
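The two-day cushion can be enforced with a simple age check on the archive file before scheduling the Delete step. This is a sketch of the site convention described above, not an SAP feature; the helper name and file handling are invented.

```python
import os
import tempfile
import time

def safe_to_delete(archive_path, min_age_days=2, now=None):
    """Only run the Delete step once the archive file is old enough to
    have been caught by the file-system backup (the 2-day cushion is a
    site convention, not an SAP requirement)."""
    now = now if now is not None else time.time()
    age_days = (now - os.path.getmtime(archive_path)) / 86400
    return age_days >= min_age_days

# usage: a file written 3 days ago may be deleted; a fresh one may not
fd, old_file = tempfile.mkstemp()
os.close(fd)
os.utime(old_file, (time.time() - 3 * 86400,) * 2)  # backdate mtime
fd, new_file = tempfile.mkstemp()
os.close(fd)
print(safe_to_delete(old_file), safe_to_delete(new_file))  # True False
```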

Step 4 Post processing


For deleting secondary indexes in FI_DOCUMNT and movement data in QM_CONTROL.

Useful Transaction codes:

SARA- Archive Administration

SARI- Archive Information System

SARE- Explorer

ALO1- Search for documents with relationships (Incl. Archive)

The archiving process:

Object: MM_EKKO (Purchase order)

1. Preprocessing Step: Create Variant (KT2NDLINE):

Enter the New Variant name and click on Maintain button. Maintain Purchasing Documents information and
Restrictions as required.

Save Variant.
Execute the variant. Check the jobs.

Open the spool.

Pre-processing step successful for the selection.

2. WRITE Step: Variant KT2NDLINE-WR

Create and save variant for the selection (should be same as maintained in Pre processing step).
Execute the variant. Check the Jobs.

Open the Spool.

Write step successfully completed.

3. DELETE Step:

Click on Archive Selection. Select the files to be deleted.

Execute the selection.

Open the Spool

File is successfully deleted.

4. Checking that the Archive Information System info structure is updated.

We can see a Green signal light for the last archiving session. This means that the Info structure is updated successfully.

If the signal is Yellow or Red, run the fill step again; it will fill the info structure once more.

This should be checked after every archiving run. Otherwise, reading data from the archived files will not work for users,
which leads to tickets being raised.

5. STATISTICS:

This option shows the details of archiving sessions, including:

Number of objects (header records) written and deleted

Number of delete jobs

Write and delete job durations

Etc.

6. LOGS:

If Job log is clicked for the Write step job, the following log is displayed.

In the log we can see the program name, the variant used, the user ID, the archiving session number, the number of documents
processed, the path of the system where the written file is stored, and the name of the archive file created in the file system.

7. READ:


We can see the archived purchasing documents. For further details, refer to the user document for this object
(SPEED_MM_Display_archived_Purchase_Order_v1.pdf).

8. MANAGEMENT:

It shows the status of each archiving session (Complete or Incomplete).

Double clicking on the session will lead to the following screen where the archived file related details can be seen.

28 Comments

Aditya S Dec 31, 2013 8:52 AM


Hello Krishnedu,
Thanks for sharing this. But can you please confirm whether we need to wait for 2 days really before performing Step 3
mentioned in the Archive process.

We have used SARA for my client but we never waited for 2 days to perform the Step 3 after Step 2 mentioned above.
Please clarify and let me know whether my understanding is correct or wrong. Thanks.

Krishnendu Das Dec 31, 2013 10:19 AM (in response to Aditya S)


Hi Aditya,

Not necessary, actually. We have a schedule for data backups, so we allowed a cushion time. The Write and
Delete jobs are dependent jobs, so the 2-day wait is just to confirm that the data backup has been taken.

It is not always required in the normal process.

Krishnendu.


Aditya S Dec 31, 2013 11:19 AM (in response to Krishnendu Das)


Hi Krishnendu,

Thanks for clarification.



Sesha S B Jan 1, 2014 6:28 AM


Hello Krishnedu,

Very usefull info...

Regards
Sesha...


Krishnendu Das Jan 1, 2014 6:35 AM (in response to Sesha S B)


Thanks Sesha.

Regards,
Krish.

Jürgen L Jan 2, 2014 10:34 PM


Keeping archives in the file system is actually not a safe place, as data is not unchangeable in a file system.
Hence you often have another step, called storing, to move the archives from the file system to a content server.

What always makes me a bit angry is that all these archiving documents completely ignore doing at least the test
runs with a detailed log. Nobody talks about checking job logs and detailed lists, even though this is actually 95% of the
whole archiving activity, and probably 95% of the questions asked in the forums arise because people do not
know about the logs.
If I comment that a document is just basic, then I often get the answer that it is for beginners.
But especially beginners should then be educated about the difficulties and the most important settings. This is the
advantage of documenting and blogging in SCN: being different from the standard help documentation. This is just an
illustrated best-case walkthrough.
If you have archived, then you could explain the hundred different error cases that you have seen.

Krishnendu Das Jan 3, 2014 6:31 AM (in response to Jürgen L)


Thanks a lot Jurgen. I will try to add some more

Actually we take a backup to tape every day at a specific time. That's the reason I mentioned a gap
of 2 days between the Write and Delete processes.

Regards,
Krish.

Manjunath Ravi Jan 3, 2014 5:40 AM


helpful document for beginners , keep up the good work..!!!

Vamshi Krishna Menaka Mar 4, 2014 3:31 PM


Hello Everybody..

I have just gone through the document, as we got a requirement for SAP archiving. I hope it's quite clear for
beginners...

Thanks alot..:-)
Naresh Gollapelly

Krishnendu Das Mar 10, 2014 4:45 PM (in response to Vamshi Krishna Menaka)
Hi Vamshi,

Thanks a lot.

Regards,
Krishnendu.

h wd Mar 14, 2014 7:24 AM


Dear Krishnendu,

Really a Good Document,

...

I'm having a doubt here about "Post processing". Please let me know what exactly Post processing stands for?
I'm asking because in "SARA" there is no Post-processing tab for "MM_EKKO" like there is for
"FI_DOCUMNT".


Roby Vivs Mar 14, 2014 1:15 PM (in response to h wd)


Hi,

Post processing for FI_DOCUMNT is used to delete (not to archive) the data entries from the secondary index
tables like BSAS, BSIS, BSAD, BSAK, etc.
These table entries are just duplicates of the BKPF and BSEG table entries, stored in the secondary
index tables for fast retrieval. So post-processing is used to delete these entries once the archiving WRITE,
STORE and DELETE jobs are completed for the FI_DOCUMNT object.

Krishnendu Das Mar 17, 2014 7:22 AM (in response to h wd)


Hi Bhagya,

Thanks. Good reply by Roby. It means system sync.

Regards,
Krishnendu.

Dibyendu Patra Mar 18, 2014 6:29 AM


Good information..
So clear and informative...

Krishnendu Das Mar 19, 2014 5:59 PM (in response to Dibyendu Patra)
Hi Dev,

Thanks.

Regards,
Krishnendu.

Gajendra N Koka Apr 29, 2014 11:33 AM


Hi Krish,

Thank you for a simple, concise and comprehensive document. Very helpful for first timers.

Thank You,
Gajendra.

Krishnendu Das Apr 29, 2014 2:41 PM (in response to Gajendra N Koka)
Thanks Gajendra.

Regards,
Krishnendu.

Samuel Friedman Jul 23, 2014 10:22 AM


Hello Krish,

Indeed a nice concise document which explains all the important steps in archiving.

We run all of our archiving automatically using an external job controller. That way, immediately following the
completion of the write phase, we perform a backup on the files on the unix server followed by the sap program for
deleting "RSARCHD" to control the deletion phase.

Following that we also run a reorg of the indexes of the tables that were archived.

Regards,
Sam

raj reddy Aug 8, 2014 11:31 AM


Hi every body,

I need some help from the experts: if an archiving object does not have a reload program in SARA, how can we reload
the archived data from the other system back into the ECC system? Please help me.

The object is BC_SBAL; it contains only write and delete jobs and does not have a reload program. Please help with
this.

Karin Tillotson Aug 8, 2014 3:15 PM (in response to raj reddy)


Hi Raj,

If the reload program is not available for an archive object, that means it is not supported for reloading. Why would you
need to reload BC_SBAL? The archived data is accessible via SLG1.

Best Regards,
Karin Tillotson

raj reddy Aug 12, 2014 7:45 AM

Hi Karin,

1) The purpose of the reload is this: we ran the write and delete programs for this object with a 2014 date instead of 2013 (we
needed to archive only data before 2013), so I need to get the 2014 data back into ECC. How can I reload the deleted data?

2) Can you help me understand what configuration is required to store the deleted data as a file in a table (if we want to read
archived data, we read it from that stored table, right)?

please help me on above points , it could be more helpful to me

Thanks
Rajashekar

Karin Tillotson Aug 13, 2014 9:47 PM (in response to raj reddy)
Hi Raj,

Like I stated before, you cannot reload the archived BC_SBAL data, but, you can read it if you activate field catalog
SAP_SBAL_002 and Infostructure SAP_BC_SBAL01 in SARI and fill the archive information structure with the relevant
BC_SBAL archive files. You can then read this data via transaction SLG1 by selecting the Format Completely from Archive
radio button.

Hope this helps.

Best Regards,
Karin Tillotson

Douglas Domenici de Lara Aug 28, 2014 10:34 PM

Very good article! Congrats!



Krishnendu Das Sep 1, 2014 11:20 AM (in response to Douglas Domenici de Lara)
Thanks Douglas.

Regards,
Krishnendu.

Carlos Trujillo Sep 24, 2014 9:21 PM


Hi Krishnendu

The customer wants to remove the Central Instance from the logon group used for archiving. Do you know how to do it?

Thanks in Advance
Best Regards.
Krishnendu Das Sep 26, 2014 8:35 AM (in response to Carlos Trujillo)
No
Regards,
Krishnendu.

Santhanakrishnan Arumugam Dec 8, 2014 6:16 PM


Hi,

Very good article! Congrats

Keep going

Tnx
Krishnan

Santosh V Dec 10, 2014 7:57 PM


Very detailed blog...

SAP has Fiori applications which could be interesting if you are involved in archiving - Understanding SAP ILM Archiving
Fiori applications in 5 questions . The applications provide options to monitor jobs, logs and also take corrective action
(not covered in this blog)
