Create a load table Example (WAIT)
(Refer to previous sections for details on SERIAL and PARALLEL options as well as WAIT and BACK methods).
exec ss_Delta 'DEMO', 'Account'
drop table if exists Account_BulkAPIv1_Wait_Serial_Harddelete
select top 100
convert(nchar(18),Id) as Id
,convert(nvarchar(255),null) as Error
,Name as Name_Info
into Account_BulkAPIv1_Wait_Serial_Harddelete
from Account
order by createddate desc
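Optionally, you can sanity-check the new load table before running the loader (a quick verification query, not part of the documented steps):
-- confirm the load table exists and contains the expected 100 rows
select top 5 * from Account_BulkAPIv1_Wait_Serial_Harddelete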
Running the example
exec ss_Loader 'BulkAPIv1Harddelete', 'DEMO', 'Account_BulkAPIv1_Wait_Serial_Harddelete','WAIT:SERIAL'
SQL-SALES BulkAPIv1Harddelete run date: 2023-12-09 --------------
20:00:23: Using Env|Schema: DEMO|dbo
20:00:23: Starting Loader for Account batchsize 10000
20:00:23: SSId added to Account_BulkAPIv1_Wait_Serial_Harddelete
20:00:26: Connection method BULK & BULK API
20:00:26: Bulk API method WAIT:SERIAL
20:00:26: Columns checked against Salesforce metadata
20:00:26: Starting load for Account_BulkAPIv1_Wait_Serial_Harddelete
20:00:28: Failed to create job. Response: {'exceptionCode': 'FeatureNotEnabled', 'exceptionMessage': 'hardDelete operation requires special user profile permission, please contact your system administrator'}
-----------------------------------------------------------------
Note: by default, a system administrator user will not have permission to run Harddelete. You will need a special profile created (in practice, cloned) from the System Administrator profile with Hard delete enabled on it; without this initial preparation you will encounter the above error message.
A special environment called “HARD” has been set up in this demo; it uses a different username whose profile has Harddelete enabled.
exec ss_Loader 'BulkAPIv1Harddelete', 'HARD', 'Account_BulkAPIv1_Wait_Serial_Harddelete','WAIT:SERIAL'
SQL-SALES BulkAPIv1Harddelete run date: 2023-12-09 --------------
20:09:50: Using Env|Schema: HARD|dbo
20:09:50: Starting Loader for Account batchsize 10000
20:09:50: SSId added to Account_BulkAPIv1_Wait_Serial_Harddelete
20:09:53: Connection method BULK & BULK API
20:09:53: Bulk API method WAIT:SERIAL
20:09:53: Columns checked against Salesforce metadata
20:09:53: Starting load for Account_BulkAPIv1_Wait_Serial_Harddelete
20:10:07: JobId: 7508d00000TtTqeAAF
20:10:08: Load complete: Success:100 Failure:0
-----------------------------------------------------------------
Note that the Job Id for your submission is returned in the output for your reference; see also the log table ss_BulkAPILog.
_Return helper table
The batch to which a given row has been allocated is logged in a helper table created during the run, named as your load table + “_Return”. Using the above example it is:
Account_BulkAPIv1_Wait_Serial_Harddelete_Return
The Error column is provided, paired with the originally submitted SSId.
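For example, you can join the helper table back to the load table on SSId to review each row's outcome (a minimal sketch; only SSId and Error are documented here, any further columns simply appear via r.*):
-- join the _Return helper table back to the load table on SSId
select r.*, t.Id, t.Name_Info
from Account_BulkAPIv1_Wait_Serial_Harddelete_Return r
join Account_BulkAPIv1_Wait_Serial_Harddelete t on t.SSId = r.SSId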
ss_BulkAPILog table
The Job Id is also preserved in the ss_BulkAPILog table, written on each submission.
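To review recent submissions, you can query the log table directly (a minimal sketch; the table's exact column layout depends on your installation):
-- recent Bulk API submissions (column names/order are installation-specific)
select top 10 * from ss_BulkAPILog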
Checking the load table
The success or failure outcome for each row is automatically written back to the Error column.
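For example, after the run you might scan the load table for rows that did not succeed (a sketch; the exact text written to Error on success depends on the operation):
-- rows whose Error value does not indicate success
-- (the '%Success%' pattern is an assumption; adjust to the text your run writes back)
select SSId, Id, Error, Name_Info
from Account_BulkAPIv1_Wait_Serial_Harddelete
where Error not like '%Success%'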
Create a load table Example (BACK)
(Refer to previous sections for details on SERIAL and PARALLEL options)
This example creates a load table called "Account_BulkAPIv1_BACK_Serial_Harddelete". The Id and Error columns are mandatory; all load operations require the Error column so that the success or failure outcome of each row can be reported back to you.
exec ss_Delta 'DEMO', 'Account'
drop table if exists Account_BulkAPIv1_BACK_Serial_Harddelete
select top 100
convert(nchar(18),Id) as Id
,convert(nvarchar(255),null) as Error
,Name as Name_Info
into Account_BulkAPIv1_BACK_Serial_Harddelete
from Account
order by createddate desc
Running the example (Step 1)
exec ss_Loader 'BulkAPIv1Harddelete', 'HARD', 'Account_BulkAPIv1_BACK_Serial_Harddelete','BACK:SERIAL'
SQL-SALES BulkAPIv1Harddelete run date: 2023-12-09 --------------
20:21:44: Using Env|Schema: HARD|dbo
20:21:44: Starting Loader for Account batchsize 10000
20:21:44: SSId added to Account_BulkAPIv1_BACK_Serial_Harddelete
20:21:47: Connection method BULK & BULK API
20:21:47: Bulk API method BACK:SERIAL
20:21:47: Columns checked against Salesforce metadata
20:21:47: Starting load for Account_BulkAPIv1_BACK_Serial_Harddelete
20:21:59: JobId: 7508d00000TtTt4AAF
20:21:59: BatchId: 7518d00000dC1eHAAS CreatedDate: 2023-12-09 20:21:49
20:21:59: BulkAPIv1Harddelete BACKGROUND completed successfully
-----------------------------------------------------------------
Note that the Job Id for your submission is returned in the output for your reference; see also the log table ss_BulkAPILog.
Note that, unlike the WAIT method, simply running BACK will not populate the _Return table or write Error data back to your load table; see the next set of instructions for how to do this. The Batch Id(s) are, however, included in the output for information purposes.
ss_BulkAPILog table
The Job Id is also preserved in the ss_BulkAPILog table, written on each submission.
Running the example (Step 2 Option 1)
At any time after you have run the initial BACK, you can retrieve the processed rows into your load table by switching back to the WAIT method (in either SERIAL or PARALLEL mode). This is achieved by passing the known Job Id into the @Special2 input parameter.
Note that with “Option 1”, if you attempt to return processed rows too soon and some rows for a given Batch (or Batches) have not yet been processed by Salesforce, SQL Sales will not be able to return an Error value for them; exercise caution and check your load table, as shown below.
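A quick way to spot such rows is to count those still without an Error value (a minimal sketch, assuming Error remains NULL until SQL Sales writes an outcome back):
-- rows not yet updated with an outcome (assumes Error is still NULL for unprocessed rows)
select count(*) as PendingRows
from Account_BulkAPIv1_BACK_Serial_Harddelete
where Error is null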
Alternatively, you can use “Option 2” to check the status of your Job by submitting a follow-up BACK request alongside your known Job Id. This instructs SQL Sales to check all related Batches and return the status of the Job. Once the Job is Closed, Salesforce will carry out no further processing and you can run with WAIT to return all Id and Error values.
exec ss_Loader 'BulkAPIv1Harddelete', 'HARD', 'Account_BulkAPIv1_BACK_Serial_Harddelete','JOB:WAIT:SERIAL','7508d00000TtTt4AAF'
SQL-SALES BulkAPIv1Harddelete run date: 2023-12-09 --------------
20:26:05: Using Env|Schema: HARD|dbo
20:26:05: Starting Loader for Account batchsize 10000
20:26:05: SSId added to Account_BulkAPIv1_BACK_Serial_Harddelete
20:26:08: Connection method BULK & BULK API
20:26:08: Bulk API method JOB:WAIT:SERIAL Job = 7508d00000TtTt4AAF
20:26:08: Columns checked against Salesforce metadata
20:26:08: Starting load for Account_BulkAPIv1_BACK_Serial_Harddelete
20:26:12: JobId: 7508d00000TtTt4AAF, Job Closed
20:26:12: Load complete: Success:100 Failure:0
-----------------------------------------------------------------
_Return helper table
The batch to which a given row has been allocated is logged in a helper table created during the run, named as your load table + “_Return”. Using the above example it is:
Account_BulkAPIv1_BACK_Serial_Harddelete_Return
As with the load table itself (and as for the Update and Delete operations), the Error column is provided, paired with the originally submitted SSId.
Checking the load table
The success or failure outcome for each row is automatically written back to the Error column.
Running the example (Step 2 Option 2)
You can keep submitting with BACK and the known Job Id until the status shows that the Job is Closed (this works with either SERIAL or PARALLEL). Once the Job is Closed you can run as with Option 1; a sketch of such a follow-up call is shown below.
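The exact parameter string for this follow-up status check is not shown above; by analogy with the Option 1 call it might look like the following (a hypothetical sketch only, assuming the BACK method accepts the same JOB: prefix and @Special2 Job Id as WAIT does):
-- hypothetical follow-up status check: the 'JOB:BACK:SERIAL' method string is an assumption by analogy with 'JOB:WAIT:SERIAL'
exec ss_Loader 'BulkAPIv1Harddelete', 'HARD', 'Account_BulkAPIv1_BACK_Serial_Harddelete','JOB:BACK:SERIAL','7508d00000TtTt4AAF'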