Provisioning to a JDBC Source via an external jar

In an effort to better enable both partners and customers, we have outlined a process that allows technically savvy customers to modify their JDBC provisioning rule without directly engaging IdentityNow's Expert Services team. We do not recommend this approach unless you have the right technical resources on hand, with at least intermediate-level knowledge of Java.


Prerequisites

  1. A functional JDBC source that is aggregating accounts into IdentityNow successfully.
  2. A JDBC Driver jar attached to the Source Config. (see Required JDBC Driver JAR Files)
  3. Eclipse or another IDE able to import Maven projects.
  4. The 'identityiq.jar' to import into your IDE. (Attached at the bottom).
  5. Intermediate knowledge of Java.
  6. Understanding of Account and Attribute requests within an IdentityNow provisioning plan.

Elements of the project

  1. JDBC Rule Adapter- This is the rule that still needs to be uploaded to your org by Expert Services or Professional Services. Essentially, this rule calls a 'provision' method in an attached jar, where all the logic is built. Passed into the method are the application, the connection to the database, the plan, and an *optional* log file. An example is attached below.
  2. Primary Java class- This is the main class used in our example. It is essentially the tunneling and logic portion of the code: it is the home of the 'provision' method, receives the account and attribute requests, and directs each request to the appropriate calls in the auxiliary class. (Its name is shown in the attached project.)
  3. Auxiliary Java class- This is the class that executes stored procedures or (in this project's case) prepared statements against the target source, and contains the methods called from the Primary Java class. (Its name is shown in the attached project.)
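As a rough illustration of the rule adapter described in item 1, a minimal rule might look like the following. This is a hedged sketch, not the attached example: the class name com.example.JDBCProvisioner and the log name customJDBCLog are placeholders, and the rule is assumed to simply delegate to the jar and return whatever the provision method returns.

```xml
<Rule name="JDBC Provisioning Rule Adapter" type="JDBCProvision">
  <Source><![CDATA[
    import org.apache.commons.logging.Log;
    import org.apache.commons.logging.LogFactory;
    // Hypothetical class packaged inside the external jar; use your
    // own project's class name here.
    import com.example.JDBCProvisioner;

    Log _log = LogFactory.getLog("customJDBCLog");

    // Delegate all provisioning logic to the jar. The connector makes
    // 'application', 'connection', and 'plan' available in the rule
    // context; the Log parameter's type must match the signature your
    // jar's provision method declares.
    return JDBCProvisioner.provision(application, connection, plan, _log);
  ]]></Source>
</Rule>
```

Note that, as discussed in the comments below, a type mismatch between the rule's log object and the jar's declared Log parameter will cause a "Static method provision(...) not found" error at runtime.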

Building a project structure

Within Eclipse, or any other IDE, your file structure should be as follows:

Project Structure

Make sure that you have imported the 'identityiq.jar' and you can see it in your Maven dependencies.
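One common way to get 'identityiq.jar' into your Maven dependencies is to install it into your local repository and reference it from the pom. The coordinates below are placeholders chosen for illustration, not official ones:

```xml
<!-- Hypothetical coordinates: install the jar locally first, e.g.
     mvn install:install-file -Dfile=identityiq.jar \
       -DgroupId=sailpoint -DartifactId=identityiq \
       -Dversion=1.0 -Dpackaging=jar -->
<dependency>
  <groupId>sailpoint</groupId>
  <artifactId>identityiq</artifactId>
  <version>1.0</version>
  <!-- 'provided' keeps the SailPoint classes out of your built jar,
       since they already exist at runtime -->
  <scope>provided</scope>
</dependency>
```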

Elements of the Primary Java class

This class is the foundation of the project: every request arrives here first and is then routed to other methods.

  1. Provision Method- This breaks down the request and calls the appropriate method for the operation:
    Provision Method
  2. Operation methods- These exist for every operation (Create, Modify, Enable, Disable).
    Create Method
  3. Now that a request has come in and we have determined its type, we need to call the Auxiliary Java class in our project.
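The dispatch pattern above can be sketched as follows. This is illustrative only: the enum and return values are stand-ins, since in the real project the operation comes from the AccountRequest objects in the sailpoint.object.ProvisioningPlan (which requires identityiq.jar to compile).

```java
public class ProvisionDispatcher {

    // Stand-in for AccountRequest.Operation from identityiq.jar.
    public enum Operation { Create, Modify, Enable, Disable }

    // Mirrors the provision method: inspect the account request's
    // operation and route it to the matching operation method.
    public static String provision(Operation op) {
        switch (op) {
            case Create:  return doCreate();
            case Modify:  return doModify();
            case Enable:  return doEnable();
            case Disable: return doDisable();
            default:      throw new IllegalArgumentException("Unsupported: " + op);
        }
    }

    // Each operation method would build the call into the auxiliary
    // class; here they just report which branch was taken.
    private static String doCreate()  { return "create";  }
    private static String doModify()  { return "modify";  }
    private static String doEnable()  { return "enable";  }
    private static String doDisable() { return "disable"; }
}
```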

Elements of the Auxiliary Java class

  1. Static final strings- We create both public and private static final strings to streamline interaction with the database. These may include the query strings and/or constants used in the project.

  2. Working methods- These are the working methods in the class. We have received the request, determined where it should go, and now need to execute it. Notice that each is simply a prepared statement that uses one of the final query strings.

    Create Worker Method
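A worker method along these lines might look like the sketch below. The table and column names (users, username, email, status) are placeholders for your target schema, not the names from the attached project; the Connection is the one the connector passes into the rule.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.SQLException;

public class JdbcWorker {

    // Static final query string, as described above. Parameter markers
    // (?) are bound at execution time by the prepared statement.
    public static final String CREATE_USER_SQL =
        "INSERT INTO users (username, email, status) VALUES (?, ?, ?)";

    // Create worker: bind the values pulled from the account request
    // and execute the insert against the source connection.
    public static int createUser(Connection conn, String username, String email)
            throws SQLException {
        try (PreparedStatement ps = conn.prepareStatement(CREATE_USER_SQL)) {
            ps.setString(1, username);
            ps.setString(2, email);
            ps.setString(3, "active");
            return ps.executeUpdate();
        }
    }
}
```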

Clean Compile Package

The next step is to build your Maven package. Upload the jar that is produced to the source config.

Upload screen

Final Notes:

JDBC Rules can be very complicated depending on the source you are trying to connect to. We STRONGLY recommend that if you have any questions, bring them up to the Expert Services team for assistance.


What's best practice for handling operations that aren't supported? For example, what should we return for a JDBC Source that doesn't support Disable/Enable Operations?  

The sample code here and in the other doc seems to imply you can just leave the ProvisioningResult at whatever the constructor defaults to. Is that correct?

What is the right build path for this project?

I'm getting this error: "The project was not built since its build path is incomplete. Cannot find the class file for org.aspectj.lang.JoinPoint$StaticPart. Fix the build path then try building this project." (jdbc-template, Unknown Java Problem)

Currently I'm using the Eclipse IDE to build this project, with the embedded Maven version 3.3.9.

Do you think there is an issue with Maven version?




Try adding the aspectjrt jar file to your project's referenced libraries.

You can find it in the WEB-INF/lib folder of identityiq.

I'm getting this error for the external jar: org.apache.bsf.BSFException: BeanShell script error: bsh.EvalError: Sourced file: inline evaluation of: `` import; import org.apache.commons.loggin . . . '' : Error in method invocation: Static method provision(sailpoint.object.Application, org.apache.commons.dbcp2.PoolingDataSource$PoolGuardConnectionWrapper, sailpoint.object.ProvisioningPlan, org.apache.logging.log4j.jcl.Log4jLog) not found in class''

What would be the possible root cause of this issue?

Is there any issue with the import statement in "JDBC Rule Adapter.xml"?

I think the issue may be with the import statement - import

Instead of importing the specific class, importing the whole package might help.

Please give a pointer on this if anyone has faced a similar issue before.


@rahul-bhosale Not sure if you ever got this figured out, but if so, maybe help for future cases: I had the same error and I realized the rule attached to this document is creating a _log variable using Log _log = LogFactory.getLog("customJDBCLog") and then sending that as the log argument in the provision method. However, in the attached jar file, the provision method is expecting the Log parameter to be of the SailPoint Log interface type. I changed my jar file to include org.apache.commons.logging and updated that parameter to an apache log and my rule was then able to call the method.

For general review: is it possible to get documentation on that log and how to access logs created from the jar file? I used System.out.println, which writes to the ccg, but would like to know if/how we can access a log specific to the jar.

@cassidiopia Yes, I have figured out the same. I made slightly different changes than you did; I used:

import openconnector.Log;
Log _log = LogFactory.getLog("customJDBCLog");  

And imported the same in the jar. It lets me print logs to the ccg.log file.

I am able to print logs in ccg using log.debug, etc.

Hope this helps.

@cassidiopia  @rahul-bhosale  @wimvandijck 


Can we use a similar adapter rule (similar to the one attached in the main post) for provisioning to multiple JDBC sources? I mean the same rule attached to multiple sources, with the Java class doing the logic to provision to the respective JDBC source tables?



Can we use stored procedures instead of queries?


you could make that work, yes. I think it would require more complexity within the jar file. As one example, the class that your rule imports could simply contain the provision method that is called by the rule. Then you could have additional methods for account request operations and additional classes for separate sources. Depending on the variance of your source accounts, the attributes, and the entitlement names, the complexity may grow but is certainly manageable that way.
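The multi-source routing described above could be sketched like this. The application names and handler behavior are hypothetical; in practice the rule would pass the SailPoint Application object, and each handler would run that source's SQL against its connection.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

public class MultiSourceRouter {

    // One handler per JDBC source, keyed by application name.
    // The handlers here just tag the username to show the routing.
    private static final Map<String, Function<String, String>> HANDLERS = new HashMap<>();
    static {
        HANDLERS.put("HR Database",      user -> "hr:" + user);
        HANDLERS.put("Finance Database", user -> "finance:" + user);
    }

    // Shared provision entry point: pick the per-source handler by
    // the name of the application that triggered the rule.
    public static String provision(String applicationName, String user) {
        Function<String, String> handler = HANDLERS.get(applicationName);
        if (handler == null) {
            throw new IllegalArgumentException("No handler for " + applicationName);
        }
        return handler.apply(user);
    }
}
```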


yes, you can. You can use a CallableStatement in your jar file to achieve that. I have tested it and it works fine. I had some issues with the ResultSet but that might have been our DBAs and the statements still worked so it wasn’t a big deal.
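A CallableStatement sketch along those lines is shown below. The procedure name create_user and its parameters are hypothetical, standing in for whatever your DBAs expose:

```java
import java.sql.CallableStatement;
import java.sql.Connection;
import java.sql.SQLException;

public class StoredProcWorker {

    // JDBC escape syntax for invoking a stored procedure.
    public static final String CREATE_USER_CALL = "{call create_user(?, ?)}";

    // Bind the account attributes and execute the procedure against
    // the source connection instead of a prepared statement.
    public static void createUser(Connection conn, String username, String email)
            throws SQLException {
        try (CallableStatement cs = conn.prepareCall(CREATE_USER_CALL)) {
            cs.setString(1, username);
            cs.setString(2, email);
            cs.execute();
        }
    }
}
```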


Could you please share an example Java class file with stored procedures and CallableStatements?

Hello to all

Where can I put the jar?

The next step is to compile your Maven package.
Upload the jar that is created to the source config.

You can upload your jar to the source through the upload option.

I am getting below error. Any idea what might be the issue?

{"exception":{"stacktrace":" org.apache.bsf.BSFException: The application script threw an exception: java.lang.NullPointerException BSF info: JDBC Provisioning Rule Adapter at line: 0 column: columnNo\n\tat Method)\n\tat org.apache.bsf.BSFManager.eval(\n\tat sailpoint.server.BSFRuleRunner.eval(\n\tat sailpoint.server.BSFRuleRunner.runRule(\n\tat sailpoint.server.InternalContext.runRule(\n\tat sailpoint.server.InternalContext.runRule(\n\tat sailpoint.connector.DefaultConnectorServices.runRule(\n\tat sailpoint.connector.DefaultConnectorServices.runRule(\n\tat sailpoint.connector.CollectorServices.runRule(\n\tat sailpoint.connector.JDBCConnector.handleJDBCOperations(\n\tat sailpoint.connector.JDBCConnector.provision(\n\tat sailpoint.connector.ConnectorProxy.provision(\n\tat\n\tat\n\tat com.sailpoint.ccg.handler.ProvisionHandler.invoke(\n\tat sailpoint.gateway.accessiq.CcgPipelineMessageHandler.handleMessage(\n\tat com.sailpoint.pipeline.server.PipelineServer$InboundQueueListener$\n\tat java.util.concurrent.Executors$\n\tat\n\tat java.util.concurrent.ThreadPoolExecutor.runWorker(\n\tat java.util.concurrent.ThreadPoolExecutor$\n\tat\nCaused by: org.apache.bsf.BSFException: The application script threw an exception: java.lang.NullPointerException BSF info: JDBC Provisioning Rule Adapter at line: 0 column: columnNo\n\tat bsh.util.BeanShellBSFEngine.eval(\n\tat org.apache.bsf.BSFManager$\n\t... 22 more\n"}

Is it possible to share the rule code.

Not much sure about this, but:

a) If possible, can you try putting System.out.println at each line to check where the flow is reaching?

b) Did you try running the rule from the iiq console? For example:

rule "JDBC Provisioning Rule Adapter"

@manoj_caisucar I was able to figure out the issue after using Syso. The issue was in the code and I have corrected it now. Strangely, log.debug/ was not printing in ccg.log. Any idea how to configure the custom logger in IdentityNow?

It's good to know the root cause.

Did you check the response from rahul in the earlier thread to see if it helps?

import openconnector.Log;
Log _log = LogFactory.getLog("customJDBCLog");  

And imported the same in the jar. It lets me print logs to the ccg.log file.

I am able to print logs in ccg using log.debug, etc.

@manoj_caisucar I had used the apache logger in the rule as well as the custom code, and the rule is already uploaded to our tenant.

import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
Log _log = LogFactory.getLog("customJDBCLog");



You can use System.out.println in the jar file and it will write your output to the CCG logs.

Thanks @cassidiopia @manoj_caisucar.

I am now seeing a weird error: the user account's status is getting disabled in the DB via the custom code that I wrote as part of the JDBC provisioning rule.

However, the account status in IdentityNow is not changing to Disabled. In the event log search, I am getting the below error:

"Invalid object name 'account'."

Any idea what might be the cause? This was working fine until sometime back.

How are you handling the IIQDisabled attribute as part of aggregation?

Also, can you check whether the schema type matches the one you are handling in the code?

Hi @mohanas ,

Were you able to figure out the issue behind the "invalid object name" error? I am facing the same.

I was getting this error because of a code issue. I had to put System.out.println("") statements in to identify the root cause.

You could try the same to see if it is a code issue. Hope this helps.

@mohanas + @nehab 

Hey, I am having the same error, but it's only occurring in our production environment, and I do not know where to start troubleshooting (other than comparing environments). Where in your code did you find the problem?

@cassidiopia @manoj_caisucar 

I have used this statement as well in my code and I do not see any logs in my ccg.log. What kind of log is customJDBCLog referring to?

import org.apache.commons.logging.Log;
import org.apache.commons.logging.LogFactory;
Log _log = LogFactory.getLog("customJDBCLog");

@Amber_Campbell I was trying to get the Identity object using the getIdentity method, which was throwing a null pointer exception. In your case it might be something else. Try adding System.out.println statements to see where the code is erroring out.

Hi SailPoint Team:

The Prerequisites list (bullet 2) mentions that we need a JDBC driver JAR, and the image in the "Clean Compile Package" section shows only one ojdbc6.jar file.

In our case, our project requires 3 related libraries to connect to MongoDB databases:

  • bson-4.7.2.jar
  • mongodb-driver-core-4.7.2.jar
  • mongodb-driver-sync-4.7.2.jar.

Is it possible to upload our main JAR file and these other three libraries, or is the system limited to only one JDBC driver JAR file per source? I ask this because the "Required JDBC Driver JAR Files" document mentions drivers with a single JAR file for each database type.

Thank you for your help.

Recently we ran into something strange: we tried to modify the jar by adding some logging to the Java file, but no related logs were printed. Even after we changed the name of the deployed jar, modified some descriptions in the source, and restarted the CCG, it still failed.

It seems that SailPoint is always picking up the original jar we deployed. It looks like a related bug.

Is there any advice on this? @cassidiopia

@Michael_Tai I am facing a similar issue; were you able to get it working?

Hello @leenaraj. The issue is not fixed. I raised a ticket with the support team, but there has been no response from their side. A little disappointing.

Case - CS0211476 - Customer Support (

BTW: for option 1, it worked on my side now. However, it is hard to develop/test/debug, so I would like to try this second solution.

Hi @Michael_Tai. Until yesterday I was facing the issue where it was always picking up the original jar, not the newly uploaded one. However, I retried today and it is working totally fine; I can see it's picking up the new jar.

Can you also retry it once and confirm whether it's working for you now?

@leenaraj Thanks for sharing this.

On my side, even after we tried changing the jar name and updating some basic source information, it does not work...

Can we connect sometime today or tomorrow? Not sure how we can connect in this community.

BTW: What kinds of changes did you make recently?

There are some recommendations from the SailPoint trainers: maybe we can raise support tickets to have SailPoint DevOps remove the jars from the backend, then retry. I have just done this; let us see. @leenaraj

We have been facing a similar issue since last week; is there any solution suggested by SailPoint? @Michael_Tai @leenaraj

@Anshu_Kunal There is no solution at the moment; we are trying, with SailPoint DevOps' help, to remove the jar from the backend to see whether the issue is fixed.

Hi @Michael_Tai @Anshu_Kunal, I had opened a support ticket for the same and it's fixed now. Please retry by removing the old jar, saving, and adding the new jar. Also, you can run the command 'sudo docker exec ccg ls -l /opt/sailpoint/ccg/lib/custom' to check whether the new jar was added.

Thanks for sharing @leenaraj. We tried and found the original jar, which means the updated jar was not uploaded to the path you mentioned. Currently we do not have any permission to delete it, since it is in the root folder. Which options did you use, and may I know what kind of support case you raised and what actions the SailPoint support team took?

sudo docker exec ccg ls -l /opt/sailpoint/ccg/lib/custom

@leenaraj could you share your support tickets as reference?

@Michael_Tai CS0212325 is the support ticket number. The issue is fixed now, try re-uploading the jar and it should work. 

Thanks a lot @leenaraj, however I could not access and review that case due to permissions.
Is it possible for you to share the support content with me? My personal email :

Version history: Revision 1 of 1. Last updated Sep 24, 2018 03:24 PM.