BulkAPIv1Insert
Updated over 3 months ago

Create a load table (WAIT)

This example creates a load table called "Account_BulkAPIv1_Wait_Serial_Insert". The Id and Error columns are mandatory. Because this is an Insert there is nothing to put into Id, but the column must still be provided: the Id of each newly created record is written back to it. All load operations require the Error column, which reports the success or failure of each row.

WAIT means the SQL process remains live while it waits for the Bulk API to return all responses. Because batches are created predictably in v1, the responses can be tied back to the original Insert payload, so the operation behaves much like the SOAP API: the success/failure outcome for each row is posted back to the load table’s Error column.

SERIAL is how the Bulk API internally processes the created batches: one at a time, which is the least stressful way to submit them. This should not normally make a difference, but in some edge cases, particularly on complex objects with extensive configuration and code, submitting in parallel mode can cause threshold/governor limit failures.

PARALLEL means SQL Sales will submit as many batches to Salesforce as the specified batchsize requires, and Salesforce may process them all in parallel. In practice they are processed as quickly as possible, likely at least partially in parallel, subject to the governor limits defined in your Salesforce instance.

drop table if exists Account_BulkAPIv1_Wait_Serial_Insert
select top 100
convert(nchar(18),null) as Id
,convert(nvarchar(255),null) as Error
,'DEMO__PREFIX' + Name as Name
-----
,Name as Name_Info
,AccountNumber as AccountNumber_Info
into Account_BulkAPIv1_Wait_Serial_Insert
from Account

Running the example

(Pass WAIT:PARALLEL for Parallel requirements)

exec ss_Loader 'BulkAPIv1Insert', 'DEMO', 'Account_BulkAPIv1_Wait_Serial_Insert','WAIT:SERIAL'

SQL-SALES BulkAPIv1Insert run date: 2023-12-08 ------------------

20:30:00: Using Env|Schema: DEMO|dbo

20:30:00: Starting Loader for Account batchsize 10000

20:30:00: SSId added to Account_BulkAPIv1_Wait_Serial_Insert

20:30:02: Connection method BULK & BULK API

20:30:02: Bulk API method WAIT:SERIAL

20:30:02: Columns checked against Salesforce metadata

20:30:03: Starting load for Account_BulkAPIv1_Wait_Serial_Insert

20:30:16: JobId: 7508d00000TtMlOAAV

20:30:17: Excluded: AccountNumber_Info is not available on object Account

20:30:17: Excluded: Name_Info is not available on object Account

20:30:17: Load complete: Success:100 Failure:0

-----------------------------------------------------------------

Note the Job Id for your submission is returned in the output for your reference; see also the log table ss_BulkAPILog.
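To look up a previous submission, you can query the log table directly. A minimal sketch; only the presence of the Job Id in ss_BulkAPILog is stated above, so check the table's actual columns in your instance before filtering on them:

```sql
-- Inspect recent Bulk API submissions recorded by SQL Sales
select top 10 *
from ss_BulkAPILog
```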

_Return helper table

The batch to which a given row has been allocated is logged in a helper table created during the run, named after your load table with a “_Return” suffix. In the above example this is:

Account_BulkAPIv1_Wait_Serial_Insert_Return

As with the Insert load table itself, the newly created Id (from the BulkAPIv1Insert operation) is provided along with the Error column, paired with the originally submitted SSId.
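Because the run stamps an SSId onto the load table (as shown in the run output above), each _Return row can be paired back to its source row. A minimal sketch, assuming the SSId column is named exactly as in the log output:

```sql
-- Join the _Return helper table back to the load table on SSId
select l.SSId, l.Name, r.Id, r.Error
from Account_BulkAPIv1_Wait_Serial_Insert l
join Account_BulkAPIv1_Wait_Serial_Insert_Return r
  on r.SSId = l.SSId
```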

ss_BulkAPILog table

The Job Id is also preserved in the ss_BulkAPILog table, written on each submission. Batch detail is not maintained on WAIT methods.

Checking the load table

The success or failure outcome for each row is automatically written back to the Error column, and as with the SOAP API Insert operation, BulkAPIv1Insert allows SQL Sales to pass back the newly created Id, paired with the submitted SSId.
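A quick way to isolate failed rows after the run is to look for rows where no created Id was written back. A minimal sketch; the exact text written to the Error column is instance-specific, so filtering on Id is the safer check:

```sql
-- Rows that did not receive a created record Id
select SSId, Name, Error
from Account_BulkAPIv1_Wait_Serial_Insert
where Id is null
```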

Create a load table Example (BACK)

(Refer to previous sections for details on SERIAL and PARALLEL options)

This example creates a load table called "Account_BulkAPIv1_BACK_Serial_Insert". The Id and Error columns are mandatory, as with all Loader tables, although with a BACK method they are not written back following the Insert.

BACK (BACKGROUND running) means the SQL process submits the load data in the batches specified by batchsize, but unlike the WAIT method, the ss_Loader run ends once submission is complete. The use case here is that no further action is required on the part of the user; submitting the data to the Salesforce Bulk API is sufficient for the job in hand. If you need to examine the success or failure of the submitted rows, there is a follow-up script you can run to have the response data returned to SQL Server and your load table (instructions follow in this section).

drop table if exists Account_BulkAPIv1_BACK_Serial_Insert
select top 100
convert(nchar(18),null) as Id
,convert(nvarchar(255),null) as Error
,'DEMO__PREFIX' + Name as Name
-----
,Name as Name_Info
,AccountNumber as AccountNumber_Info
into Account_BulkAPIv1_BACK_Serial_Insert
from Account

Running the example (Step 1)

exec ss_Loader 'BulkAPIv1Insert(50)', 'DEMO', 'Account_BulkAPIv1_BACK_Serial_Insert','BACK:SERIAL'

Note: for the purposes of this test example, a batchsize of (50) has been defined. Given the payload is 100 rows, this forces the creation of two batches, illustrating the handling of multiple batches.
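The batch count is simply the row count divided by the batchsize, rounded up; for example:

```sql
-- 100 rows at batchsize 50 gives two batches
select ceiling(100.0 / 50) as BatchCount
```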

SQL-SALES BulkAPIv1Insert(50) run date: 2023-12-09 --------------

07:47:44: Using Env|Schema: DEMO|dbo

07:47:44: Starting Loader for Account batchsize 50

07:47:44: SSId added to Account_BulkAPIv1_BACK_Serial_Insert

07:47:47: Connection method BULK & BULK API

07:47:47: Bulk API method BACK:SERIAL

07:47:47: Columns checked against Salesforce metadata

07:47:47: Starting load for Account_BulkAPIv1_BACK_Serial_Insert

07:48:01: JobId: 7508d00000TtQjVAAV

07:48:01: BatchId: 7518d00000dBwxRAAS CreatedDate: 2023-12-09 07:47:48

07:48:01: BatchId: 7518d00000dBwe6AAC CreatedDate: 2023-12-09 07:47:49

07:48:01: BulkAPIv1Insert BACKGROUND completed successfully

-----------------------------------------------------------------

Note the Job Id for your submission is returned in the output for your reference; see also the log table ss_BulkAPILog.

Note that unlike the WAIT method, simply running BACK will not populate the _Return table, nor write back Id or Error data to your load table; see the next set of instructions for how to do this. The Batch Id(s) are, however, included in the output dump for information purposes.

ss_BulkAPILog table

The Job Id is also preserved in the ss_BulkAPILog table, written on each submission. For BACK methods, Batch information (Id and CreatedDate) is also written.

Running the example (Step 2 Option 1)

At any time after you have run the initial BACK, you can retrieve the processed rows by re-running with the WAIT method (in either SERIAL or PARALLEL mode), passing the known Job Id in the @Special2 input parameter.

Note: with Option 1, if you attempt to return processed rows too soon and some rows in a given batch have not yet been processed by Salesforce, SQL Sales will not be able to return an Id and Error value for them. Exercise caution and check your load table.

Alternatively, you can use Option 2 to check the status of your Job by submitting a follow-up BACK request alongside your known Job Id. This instructs SQL Sales to check all related batches and return the status of the Job. Once the Job is Closed, no further processing will occur on the Salesforce side, and you can then run with WAIT to return all Id and Error values.

Note: running WAIT after the initial BACK for Update, Delete, and Harddelete operations returns load status values to the load table's Error column. For Insert and Upsert this is not possible; however, for all operations the _Return table is written back to, including the new Ids and Error values in the case of Insert or Upsert. The key difference is that the newly created or upserted Ids are not tied back to the original load table rows.

exec ss_Loader 'BulkAPIv1Insert', 'DEMO', 'Account_BulkAPIv1_BACK_Serial_Insert','JOB:WAIT:SERIAL','7508d00000TtQjVAAV'

SQL-SALES BulkAPIv1Insert run date: 2023-12-09 ------------------

08:05:12: Using Env|Schema: DEMO|dbo

08:05:12: Starting Loader for Account batchsize 10000

08:05:12: SSId added to Account_BulkAPIv1_BACK_Serial_Insert

08:05:15: Connection method BULK & BULK API

08:05:15: Bulk API method JOB:WAIT:SERIAL Job = 7508d00000TtQjVAAV

08:05:15: Columns checked against Salesforce metadata

08:05:15: Starting load for Account_BulkAPIv1_BACK_Serial_Insert

08:05:19: JobId: 7508d00000TtQjVAAV

08:05:19: Excluded: AccountNumber_Info is not available on object Account

08:05:19: Excluded: Name_Info is not available on object Account

08:05:19: Load complete: Success:100 Failure:0

-----------------------------------------------------------------

_Return helper table

The batch to which a given Id has been allocated is logged in a helper table created during the run, named after your load table with a “_Return” suffix. In the above example this is:

Account_BulkAPIv1_BACK_Serial_Insert_Return

As with the Insert load table itself, the newly created Id (from the BulkAPIv1Insert operation) is provided along with the Error column, although for Insert and Upsert operations these rows do not correlate to your load table rows; if you require this, run with the WAIT method throughout. For Update, Delete, and Harddelete operations, the _Return rows correspond directly to your load table rows.

Checking the load table (Update, Delete, Harddelete)

For Update, Delete, and Harddelete operations, the success or failure result for each row is automatically written back to the Error column of your load table, paired with the submitted SSId. As noted above, for Insert and Upsert the newly created Ids cannot be tied back to the original load table rows.

Running the example (Step 2 Option 2)

You can keep submitting with BACK and the known Job Id until the Status shows that the Job has Closed (this works with either SERIAL or PARALLEL). Once the Job is Closed, you can proceed as in Option 1.

exec ss_Loader 'BulkAPIv1Insert(50)', 'DEMO', 'Account_BulkAPIv1_BACK_Serial_Insert','JOB:BACK:SERIAL','7508d00000TtQjVAAV'

SQL-SALES BulkAPIv1Insert(50) run date: 2023-12-09 --------------

08:41:26: Using Env|Schema: DEMO|dbo

08:41:26: Starting Loader for Account batchsize 50

08:41:26: SSId added to Account_BulkAPIv1_BACK_Serial_Insert

08:41:28: Connection method BULK & BULK API

08:41:28: Bulk API method JOB:BACK:SERIAL Job = 7508d00000TtQjVAAV

08:41:28: Columns checked against Salesforce metadata

08:41:29: Starting load for Account_BulkAPIv1_BACK_Serial_Insert

08:41:32: JobId: 7508d00000TtQjVAAV, Job Closed

08:41:32: BulkAPIv1Insert BACKGROUND completed successfully

-----------------------------------------------------------------
