
Data pruning best practices

Overview

This document discusses the configuration, processes, and best practices for pruning old data from your IdentityIQ environment in order to maintain performance and manage database disk space.

Optimizing Data Retention

IdentityIQ should be configured to store the appropriate amount of historical data to optimize database size and overall system performance based on your business requirements for retaining identity and certification data.

The most important fact to understand is that optimizing data retention is orders of magnitude easier when done before the data has been created.  If a data cleanup activity must occur because the database has run out of space, specialized procedures may be needed to clean up the old data.  Configuring IdentityIQ before that point will prevent future problems.

 

The following objects have the most effect on database size and system performance:

Object Type

Description

Impact

Identity Snapshots

Stores a copy of the governance Identity data for historical purposes. These snapshots are created during certification generation, and by using the option “Maintain identity histories” in the Identity Refresh task. The number of Identity Snapshots can affect the time to generate new certifications, the time to render Identities in the UI, and the size of the database. It is recommended that you define an expiration policy for identity snapshots that matches your certification archive policy.

Performance: High

Table Space: High

Certifications

As certifications are generated, large amounts of historical data are captured and stored in the certification tables along with identity snapshot, history, and entitlement snapshot data.  After the certification cycle is complete and audit reports are generated, the certification data becomes stale.  It is recommended that you keep at least one previous cycle of live certification data in the system, because it is used for comparison purposes during the next certification generation.  Older certifications can be archived and deleted to improve certification performance and to reduce disk space.

Performance: Medium

Table Space: High

Task Results

Any time a task is executed, a task result is saved. Task results include report results, which in many cases consume a large portion of the utilized database storage space. Depending on the TaskDefinition configuration, the previous result may be renamed or deleted.  If previous results are renamed to preserve history, a large amount of task result data can accumulate, especially for tasks scheduled daily or hourly.  It may be important to keep this task data for some period of time for debugging and monitoring purposes. Depending on your business and operational requirements, task results should be cleaned up as often as necessary to reduce disk space.

Performance: Low

Table Space: Medium

Identities

Identities accumulate over time as new systems are aggregated into IdentityIQ and users complete their life cycle at a company.  As the number of identities increases, so does the time needed to search, refresh, and maintain them.  This document does not discuss strategies for optimizing the execution of these daily tasks; it focuses instead on pruning the dead weight from your identity warehouse.

Performance: Medium

Table Space: Low

Requests

Each time an email is sent, a new Request object is created.  For large certification cycles or daily automated tasks, the request queue can become very large.  Request objects are also created for future workflow events and other object expiration events. Although completed requests have no effect on system performance, they do consume disk space.

Performance: Low

Table Space: Low

Syslog Events

Each time a log event reaches the Level defined for “Level at which syslog events are stored” an entry is created in the spt_syslog_event table.  This table can grow to the point where the UI cannot cope with the number of events and the Advanced Analytics searching of syslog events fails.  In addition, because many syslog events can be created by a single failure, a large amount of database space may be consumed by retaining extraneous syslog events.

Performance: Low

Table Space: Medium

 

 

Data Retention Configuration

For each of these object types, it is important to define the number of days before objects are deleted or archived, in order to optimize the amount of data retained, maintain system performance, and control the size of the data footprint. The system can also be configured to capture more or less granular data based on your business requirements. Figure 1 and Figure 2 below show the settings available from the System Setup > IdentityIQ Configuration section. Refer to the product documentation for specific explanations.

  • Each setting is applied when the task listed is executed
  • A value of 0 means objects are never removed
  • Deleted objects cannot be recovered

WARNING:  Enabling these options can result in deleted data

Setting and Type

Recommended Retention Policies

Task Configuration

Days before snapshot deletion

At a minimum this should be set to match your shortest certification cycle and at a maximum it should match your certification archive days.

Perform Maintenance Option: Prune Identity Snapshots

Days before task result deletion

This option should be set to match the longest schedule of the tasks whose previous results you need to see.  For example, if the longest task schedule of interest is every 30 days, setting this to 60 days retains at least one previous result. This option applies to both Task Results and Report Results.

Perform Maintenance Option: Prune Task Results

Days before certifications are archived

When this option is enabled (set to a value other than 0), completed certifications are serialized into a single Character Large Object (CLOB) and persisted into the spt_certification_archive table. The certification objects are then deleted from the live tables and no longer appear in the UI or are available for reporting.  Treat archived certifications as permanently removed from the application, even though the underlying data has not yet been deleted.  A recommended practice is to archive certifications that are older than the last two cycles. Certification data should be extracted into a data warehouse or report before archiving.

Perform Maintenance Option: Archive and Prune Certifications

Days before certification archive deletion

To complete the cleanup of old certifications, this option causes the archive object to be completely deleted.  Use this option with caution, as the data is not recoverable.

Perform Maintenance Option: Archive and Prune Certifications

Snapshot Frequency in Days

In general, this option should be set to the highest acceptable value to reduce the amount of snapshot data being stored.  This option is used only if the “Maintain identity histories” option is enabled in the Identity Refresh task.

Refresh Identity Cube: Maintain identity histories

Index History Granularity

This option determines the increments at which history is stored. For example, if the granularity is set to ‘Week’, snapshots are preserved on a weekly basis.

This option should be set to the largest granularity that is acceptable to the business, to reduce the number of updates to the snapshot data.

Refresh Identity Cube: Maintain identity histories

Days before syslog event deletion

This option determines how long syslog events are stored within the system.  Setting this too high may impact searching of syslog events within Advanced Analytics and may lead to a large spt_syslog_event table.

This option should be set to match the operational requirements for maintaining syslog events.

Perform Maintenance Option: Prune syslog events

Days before provisioning transaction event deletion

This option determines how long provisioning transaction events are maintained by IdentityIQ.

This option should be set in accordance with any audit/compliance requirements for tracking completion of provisioning actions.

Perform Maintenance Option: Prune provisioning transactions

 

Figure 1: Left Column: Days before snapshot deletion, days before certifications are archived, and days before certification archive deletion. Right column: Days before syslog event deletion and days before provisioning transaction event deletion

 


 

 

Figure 2: Snapshot frequency in days and index history granularity


Perform Maintenance Task Configuration

The Perform Maintenance task controls which objects are pruned and when.  It is recommended that you configure a dedicated task that executes on a separate cycle to prune and archive objects.  For example, you may choose to have a separate task for each object type.

WARNING:  Enabling the prune options will result in deleted data

If you are using IdentityIQ 8.2 or 8.3, follow step 1 below; if you are using IdentityIQ 8.4 or later, follow step 2.

  1. Create a dedicated “Object Pruning and Archive” task definition.  Creating a separate task allows you to separate the processing required for data pruning from the processing required for normal LCM operations.
    1. Decide if you need one task for each type of pruning operation or if they can be grouped into a single task.
  2. IdentityIQ 8.4+ contains a new Task Definition template for pruning data named “System Maintenance Object Pruner”.  Create a task using this template and select the desired options.
  3. Schedule the task to run frequently enough to keep pace with data creation.
    1. Snapshots and Task Results: weekly or daily
    2. Certifications: monthly or longer
  4. The first execution may take many days to run, depending on the amount of existing historical data.
  5. Prune Requests should be run separately on a set schedule, such as every 60 or 90 days.
    1. This option deletes all completed requests regardless of their age.

 

Figure 3: Setup > Tasks > Perform Maintenance


 

Identity Refresh Task Configuration

The Identity Refresh task has options that control the amount of historical data retained for identities. For example, you can configure the Identity Refresh task to maintain identity histories; this creates a snapshot of each identity that is refreshed when the task runs.

You can control the amount of data retained by this task in several ways:

  • Only enable the "Maintain identity histories" option if your business requires it.
  • If you want to maintain histories but do not want to create a snapshot every time an identity is refreshed (for example, if you want to capture history information less frequently), set up a dedicated Identity Refresh task that is specifically for capturing snapshots and is separate from your main Identity Refresh task.
  • You can configure filters to only snapshot specific Identities, using the "Optional filter string..." and "Optional list or group of populations..." options in the task configuration, to select a specific identity or set of identities to snapshot.
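For example (a sketch only; the attribute name is hypothetical and must exist on your Identity objects), a filter string restricting snapshots to a single department might look like:

```
department == "Finance"
```

See the product documentation for the full filter string grammar supported by these options.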


Data Archive Strategies

There are several strategies for preserving the certification, identity, and snapshot data for historical purposes. Choose a strategy that best fits your operating and business needs.

  1. Certification Archive Reports
    • Generate certification archive reports and save these exported data sets outside of IdentityIQ for long term retention
  2. Certification Data Warehouse
    • Export all certification data into a Data Warehouse. The Data Export task is a good example of a custom database process to sync the data.
  3. Offline IdentityIQ table archives
    • Database sync scripts to an offline archive of the tables
  4. Full Database Backup
    • Database backups are used to restore old data in extreme cases

 

Large Data Cleanup Examples

Unfortunately, many clients realize that data pruning configurations are necessary only when they start seeing performance or storage space issues.

The following SQL and HQL (querying only) examples can be used to remove large data sets directly at the database layer, reducing the overall time needed to prune old data.  Note that these are samples and may need adjusting based on your database type and IdentityIQ version. Any reference to time fields must be made in milliseconds since the epoch; these values can be generated by appending three digits for milliseconds to the results of an epoch time converter, such as http://www.epochconverter.com/
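As an alternative to an online converter, the cutoff value can be computed directly. A minimal sketch in plain Java (the class and method names are illustrative, not part of IdentityIQ):

```java
import java.time.LocalDate;
import java.time.ZoneOffset;

public class CutoffMillis {

    // Returns milliseconds since the epoch for midnight UTC on the given
    // date, suitable for comparison against the spt_* "created" columns.
    static long cutoffMillis(int year, int month, int day) {
        return LocalDate.of(year, month, day)
                .atStartOfDay(ZoneOffset.UTC)
                .toInstant()
                .toEpochMilli();
    }

    public static void main(String[] args) {
        // Example: prune rows created before 2024-01-01 UTC.
        System.out.println(cutoffMillis(2024, 1, 1)); // prints 1704067200000
    }
}
```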

Be aware that some of these queries may take long periods of time to run and will consume processing capacity on the database while running.

An alternative option is to use the Terminator to delete specific objects via a custom IdentityIQ rule.  While a custom rule will not be as efficient, it reduces the performance impact because, like any other IdentityIQ thread, it can be monitored and controlled with the out-of-the-box tools in the Admin Console.
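That approach can be sketched as a BeanShell rule body. This is an illustrative, untested sketch that requires the IdentityIQ libraries: it assumes the standard `context` rule variable, and the object class and cutoff are placeholders you should adjust to your environment.

```java
// Illustrative sketch only: prune IdentityArchive objects older than a cutoff.
import java.util.Date;
import java.util.Iterator;
import sailpoint.api.Terminator;
import sailpoint.object.Filter;
import sailpoint.object.IdentityArchive;
import sailpoint.object.QueryOptions;

// Hypothetical cutoff: roughly 180 days ago.
Date cutoff = new Date(System.currentTimeMillis() - 180L * 24 * 60 * 60 * 1000);

QueryOptions qo = new QueryOptions();
qo.addFilter(Filter.lt("created", cutoff));

// Terminator also cleans up references to each object, which is what
// makes this safer (but slower) than deleting rows with raw SQL.
Terminator terminator = new Terminator(context);
Iterator it = context.search(IdentityArchive.class, qo);
while (it.hasNext()) {
    terminator.deleteObject((sailpoint.object.SailPointObject) it.next());
}
context.commitTransaction();
```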

The console can be used to run SQL or HQL queries.

SQL queries are run using:

  • sql “select count(*) from spt_identity”
  • sql “select max(created) from spt_identity_history”

HQL queries are run using the hql command. Note that returned time fields are translated to readable dates, but in query criteria, epoch time must be used. For the property names of the various objects, refer to the .hbm.xml mapping files within the identityiq.jar file:

  • hql “select count(*) from Identity”
  • hql “select max(created) from IdentityHistory”

For SQL, you must know which tables are involved (such as spt_identity for Identity). For HQL, all you need to know is the name of the type of object (such as Identity). Here is a list of the table names involved with the largest datasets:

  • spt_identity_archive
  • spt_identity_snapshot
  • spt_certification_archive
  • spt_identity_history
  • spt_identity_history_item
  • spt_syslog_event
  • spt_provisioning_transaction

SQL Queries (note that timestamps are in epoch milliseconds and must be converted accordingly):

  • delete from spt_identity_history_item where created < <predetermined max age of retention>
  • delete from spt_identity_history where id not in (select identity_history_id from spt_identity_history_item where identity_history_id is not null)
  • delete from spt_identity_snapshot where id not in (select snapshot_id from spt_certification_entity where snapshot_id is not null) and created < <predetermined max age of retention>
  • delete from spt_identity_archive where created < <predetermined max age of retention>
  • delete from spt_syslog_event where created < <predetermined max age of retention>
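As a worked example (the cutoff date is hypothetical), deleting syslog events created before January 1, 2024 UTC, whose epoch-millisecond value is 1704067200000, might look like:

```sql
-- Cutoff: 2024-01-01 00:00:00 UTC = 1704067200000 ms since the epoch.
-- Check the row count first before deleting anything:
select count(*) from spt_syslog_event where created < 1704067200000;

delete from spt_syslog_event where created < 1704067200000;
```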

Additional areas to consider

  • Audit Data – Audit data accumulates in the system indefinitely; there is no task or option to prune it. Depending on your data retention requirements, you can create a custom task to prune the data, or manually prune the audit table using a simple SQL query to archive and delete old records every X days.
  • Identity Request – In the Perform Identity Request Maintenance task, the Max age (in days) for Identity Request objects setting controls how long request objects are kept in the database. Leaving this set to 0 tells IdentityIQ to keep these objects indefinitely. You can set this to a different value according to your business needs.

  

Additional resources

Perform Maintenance (Housekeeper) Task explained

Audit Configuration in IdentityIQ

 

 

 

 

 


Version history: revision 19 of 19. Last update: Mar 05, 2025, 01:33 AM.