Create a load table Example (WAIT)
This example creates a load table called "Account_BulkAPIv2_Wait_Harddelete". The Id and Error columns are mandatory. Generally, a hard delete payload is built up by querying or otherwise identifying existing rows for the affected Object. All load operations require the Error column; it records the outcome, success or failure, to guide and inform you.
exec ss_Delta 'DEMO', 'Account'
drop table if exists Account_BulkAPIv2_Wait_Harddelete
select top 100
convert(nchar(18),Id) as Id
,convert(nvarchar(255),null) as Error
,Name as Name_Info
into Account_BulkAPIv2_Wait_Harddelete
from Account
order by createddate desc
Running the example
exec ss_Loader 'BulkAPIv2Harddelete', 'DEMO', 'Account_BulkAPIv2_Wait_Harddelete','WAIT'
SQL-SALES BulkAPIv2Harddelete run date: 2023-12-09 --------------
21:49:27: Using Env|Schema: DEMO|dbo
21:49:27: Starting Loader for Account batchsize 10000
21:49:27: SSId added to Account_BulkAPIv2_Wait_Harddelete
21:49:30: Connection method BULK & BULK API
21:49:30: Bulk API method WAIT
21:49:30: Columns checked against Salesforce metadata
21:49:31: Starting load for Account_BulkAPIv2_Wait_Harddelete
21:49:32: Failed to create job. Status code: 400, which in V2 of the Bulk API can mean your User does not have Hard delete permissions
-----------------------------------------------------------------
Note: by default, even a System Administrator user does not have permission to run Harddelete. You will need a special profile, cloned from System Administrator, with Hard delete enabled on it. Without this initial preparation you will encounter the error message above.
exec ss_Loader 'BulkAPIv2Harddelete', 'HARD', 'Account_BulkAPIv2_Wait_Harddelete','WAIT'
A special Environment has been set up in this demo, called "HARD", which uses a different username with a Harddelete-enabled profile.
SQL-SALES BulkAPIv2Harddelete run date: 2023-12-09 --------------
21:50:36: Using Env|Schema: HARD|dbo
21:50:36: Starting Loader for Account batchsize 10000
21:50:36: SSId added to Account_BulkAPIv2_Wait_Harddelete
21:50:39: Connection method BULK & BULK API
21:50:39: Bulk API method WAIT
21:50:39: Columns checked against Salesforce metadata
21:50:39: Starting load for Account_BulkAPIv2_Wait_Harddelete
21:50:54: JobId: 7508d00000TtU9dAAF
21:50:55: Load complete: Success:100 Failure:0
-----------------------------------------------------------------
Note: the Job Id for your submission is returned in the output for your reference; see also the log table ss_BulkAPILog.
ss_BulkAPILog table
The Job Id is also preserved in the ss_BulkAPILog table, written on each submission.
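As a quick sketch, a past submission can be looked up by its Job Id (taken here from the output above); column names other than JobId are not documented in this section, so inspect the table definition in your install:

```sql
-- Look up the log entry for a known submission.
-- JobId is assumed to be the column holding the Salesforce Job Id.
select *
from ss_BulkAPILog
where JobId = '7508d00000TtU9dAAF'
```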
Checking the load table
The success or failure outcome for each row is automatically written back to the Error column.
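Since every processed row receives a value in the Error column, a simple aggregate gives an at-a-glance summary of the run (a sketch; the exact text written for successful rows may vary by install):

```sql
-- Summarise the outcomes written back to the load table.
select Error, count(*) as Rows
from Account_BulkAPIv2_Wait_Harddelete
group by Error
```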
Create a load table Example (BACK)
This example creates a load table called "Account_BulkAPIv2_BACK_Harddelete". The Id and Error columns are mandatory. All load operations require the Error column; it records the outcome, success or failure, to guide and inform you.
exec ss_Delta 'DEMO', 'Account'
drop table if exists Account_BulkAPIv2_BACK_Harddelete
select top 100
convert(nchar(18),Id) as Id
,convert(nvarchar(255),null) as Error
,Name as Name_Info
into Account_BulkAPIv2_BACK_Harddelete
from Account
order by createddate desc
Running the example (Step 1)
exec ss_Loader 'BulkAPIv2Harddelete', 'HARD', 'Account_BulkAPIv2_BACK_Harddelete','BACK'
SQL-SALES BulkAPIv2Harddelete run date: 2023-12-09 --------------
21:53:22: Using Env|Schema: HARD|dbo
21:53:22: Starting Loader for Account batchsize 10000
21:53:22: SSId added to Account_BulkAPIv2_BACK_Harddelete
21:53:25: Connection method BULK & BULK API
21:53:25: Bulk API method BACK
21:53:25: Columns checked against Salesforce metadata
21:53:25: Starting load for Account_BulkAPIv2_BACK_Harddelete
21:53:29: JobId: 7508d00000TtU3tAAF
21:53:29: BulkAPIv2Harddelete BACKGROUND completed successfully
-----------------------------------------------------------------
Note: the Job Id for your submission is returned in the output for your reference; see also the log table ss_BulkAPILog.
Note that, unlike the WAIT method, simply running BACK will not populate the _Batch table nor write Error data back to your load table; see the next set of instructions for how to do this. The Batch Id(s) are, however, included in the output dump for information purposes.
ss_BulkAPILog table
The Job Id is also preserved in the ss_BulkAPILog table, written on each submission.
Running the example (Step 2 Option 1)
At any time after you have run the initial BACK, you can retrieve the processed rows to your load table by reverting to the WAIT method (in either SERIAL or PARALLEL mode). This is achieved by passing the known Job Id in the @Special2 input parameter.
Note: with "Option 1", if you attempt to retrieve processed rows too soon and some rows in a given Batch have not yet been processed by Salesforce, SQL Sales will not be able to return an Error value for them. Exercise caution and check your load table.
Alternatively, you can use "Option 2" to check the status of your Job by submitting a follow-up BACK request with your known Job Id. This instructs SQL Sales to check all related Batches and return the status of the Job. When the Job is Closed, no further processing will occur in Salesforce, and you can then run with WAIT to return all Id and Error values.
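Following the caution above, a quick way to spot rows that have not yet received an outcome is to count those still without an Error value (a sketch, assuming unprocessed rows retain the null Error they were created with):

```sql
-- Rows with no Error value yet are likely still being processed by Salesforce.
select count(*) as PendingRows
from Account_BulkAPIv2_BACK_Harddelete
where Error is null
```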
exec ss_Loader 'BulkAPIv2Harddelete', 'HARD', 'Account_BulkAPIv2_BACK_Harddelete','JOB:WAIT','7508d00000TtU3tAAF'
SQL-SALES BulkAPIv2Harddelete run date: 2023-12-09 --------------
21:54:40: Using Env|Schema: HARD|dbo
21:54:40: Starting Loader for Account batchsize 10000
21:54:40: SSId added to Account_BulkAPIv2_BACK_Harddelete
21:54:43: Connection method BULK & BULK API
21:54:43: Bulk API method JOB:WAIT Job = 7508d00000TtU3tAAF
21:54:43: Columns checked against Salesforce metadata
21:54:43: Starting load for Account_BulkAPIv2_BACK_Harddelete
21:54:47: JobId: 7508d00000TtU3tAAF, Job Complete
21:54:47: Load complete: Success:100 Failure:0
-----------------------------------------------------------------
Checking the load table
The success or failure outcome for each row is automatically written back to the Error column.
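As before, individual outcomes can be inspected directly. For example (the marker written for successful rows is illustrative; check what your install writes):

```sql
-- List rows whose Error value indicates something other than success.
select Id, Error, Name_Info
from Account_BulkAPIv2_BACK_Harddelete
where Error is not null
  and Error <> 'Success'  -- illustrative success marker; verify in your install
```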
Running the example (Step 2 Option 2)
You can keep submitting with BACK and the known Job Id until the Status shows that the Job is Closed. Once the Job is Closed, proceed as in Option 1.
exec ss_Loader 'BulkAPIv2Harddelete', 'HARD', 'Account_BulkAPIv2_BACK_Harddelete','JOB:BACK','7508d00000TtU3tAAF'
SQL-SALES BulkAPIv2Harddelete run date: 2023-12-09 --------------
21:56:34: Using Env|Schema: HARD|dbo
21:56:34: Starting Loader for Account batchsize 10000
21:56:34: SSId added to Account_BulkAPIv2_BACK_Harddelete
21:56:36: Connection method BULK & BULK API
21:56:36: Bulk API method JOB:BACK Job = 7508d00000TtU3tAAF
21:56:36: Columns checked against Salesforce metadata
21:56:37: Starting load for Account_BulkAPIv2_BACK_Harddelete
21:56:40: JobId: 7508d00000TtU3tAAF, Job Complete
21:56:40: BulkAPIv2Harddelete BACKGROUND completed successfully
-----------------------------------------------------------------