Category Archives: Integration Corner

Wedid: Marketo and Infor (formerly known as Saleslogix) Integration

Challenges

The customer uses Marketo to track incoming leads, which need to be converted into accounts and contacts in their Infor CRM system. The customer would also like all field changes and activities that occur in Marketo to be tracked in Infor as contact-related activities, along with other custom records currently tracked in Marketo.

Solution:

Purpose To keep Marketo leads and their related records in sync with the corresponding contacts in the Infor system.
Applications Marketo
Infor
Tool Boomi
Information Marketo Lead > Infor Account/Contact
Infor Account/Contact > Marketo Lead
Marketo Activities > Infor Activities (Custom)
Infor Notes Histories (Custom) > Marketo Notes History (Custom)
Data Formats JSON
Volumes ~100 – 300/day
Process When a new lead is registered in Marketo, the lead record is synced into Infor as a contact, with the lead’s company becoming the Infor account record.
Conversely, when a new contact is created or an existing contact is updated in Infor, the information is synced back to Marketo.
Every change to a Marketo lead (activities) also needs to be synced into Infor, and some additional information tracked in Infor (custom object records) needs to be created and tracked in Marketo as well.
Schedule ~ 5 minutes
Complexity Medium

Wedid: ActionHRM and Kallidus Integration

Challenges

The customer tracks employee information in ActionHRM (a human resource management system). These employees also need to be tracked as student records in their Kallidus system (a student and learning management system).

Solution:

Purpose To keep student records in the Kallidus system up to date with the ActionHRM employee records
Applications Sharefile (FTP)
ActionHRM
Kallidus
Versions Sharefile (FTP)
ActionHRM
Kallidus
Tool Boomi
Information Sharefile (CSV) > (Query) ActionHRM Employees > (CSV) Kallidus
Data Formats CSV and JSON
Volumes ~10/day
Process A set of qualified employee IDs is listed in CSV format and uploaded to an FTP server. The integration process then queries for the employee details using the ActionHRM API and produces student information in CSV format for Kallidus to process.
Schedule Daily
Complexity Medium

Boomi: Debugging low latency mode processes

You may have processes (especially web service listener processes) configured with low latency mode plus the option “Only Generate Process Log on Error” enabled. This can make troubleshooting difficult, because an execution will not show up in the process log unless it is an error process.

One way to debug or troubleshoot your process is to add the parameter _boomi_debug=true to the endpoint URL when executing the specific process; the request will then be printed in the Boomi logs.

For example:

https://test.connect.boomi.com/ws/simple/executeMyAPI?_boomi_debug=true
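If you invoke the listener from client code, the flag can be appended programmatically. A minimal sketch (the helper name `withDebug` is ours, and it simply builds the URL; only the `_boomi_debug` parameter itself comes from Boomi):

```java
public class BoomiDebugUrl {

    // Append the _boomi_debug flag, respecting any existing query string.
    static String withDebug(String endpoint) {
        String separator = endpoint.contains("?") ? "&" : "?";
        return endpoint + separator + "_boomi_debug=true";
    }

    public static void main(String[] args) {
        System.out.println(withDebug("https://test.connect.boomi.com/ws/simple/executeMyAPI"));
        // → https://test.connect.boomi.com/ws/simple/executeMyAPI?_boomi_debug=true
    }
}
```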

For further details, please see: http://help.boomi.com/atomsphere/GUID-9DBB4927-E4FA-40F7-9A3C-077063E1AE7F.html

Boomi: Handling Additional Parameters When Extending OAuth 2 Details

Boomi allows extending OAuth 2 details, for example in an HTTP Client connector.

[Image: http_extension]

When defining the connection details in development mode, you are allowed to build the Authorization and Access Token URLs with additional parameters via the “Add Authorization Parameter” and “Add Access Token Parameter” sections:

[Image: additional_params]

As of today, there is no way to extend the additional parameters portion.

An alternative approach is to build the parameters directly into the “Authorization Token URL” and “Access Token URL” instead.

For example: ${the_authorization_url}?response_type=code&resource=……

In this case, you can then override the parameters through the “OAuth2 Authorization Token URL” and “OAuth2 Access Token URL” fields in the extension.
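As a rough sketch, the extended authorization URL might be assembled like this (the host and resource values below are placeholders, not from any real provider):

```java
import java.net.URLEncoder;
import java.nio.charset.StandardCharsets;

public class OAuthUrlSketch {

    public static void main(String[] args) {
        // Placeholder endpoint and parameter values for illustration only.
        String authorizationUrl = "https://login.example.com/oauth2/authorize";
        String resource = "https://api.example.com/";

        // Bake the additional parameters into the URL itself, so the whole
        // value can later be overwritten via the environment extension.
        String extended = authorizationUrl
                + "?response_type=code"
                + "&resource=" + URLEncoder.encode(resource, StandardCharsets.UTF_8);
        System.out.println(extended);
    }
}
```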

This is something to consider when you are planning to extend the OAuth 2 details for a connector.

Wedid: Boomi Atom Installation and Processes Transfer

Challenges

The customer has been using the Boomi Atom Cloud to run their integration processes and would like to shift to an on-premise atom for better control over the atom configuration.

The customer would also have difficulty transferring all the processes into the new production environment once the on-premise atom is available.

Solution:

  • Installed a 64-bit on-premise atom hosted in Linux server
  • Assisted in transferring existing processes into the new production environment
  • Updated the environment extensions and redefined the integration schedules
Purpose Install an on-premise atom and transfer the existing integration processes to ensure all processes utilise the on-premise atom instead
Tool Boomi
Complexity Low

 

Boomi: SAP Integration Tips

This guide is not intended to help you build a brilliant integration, but it gives you an important head start if you are integrating with SAP using the Boomi SAP Connector.

Installation

  1. Before the installation, please make sure that you have installed the Java JDK (not the JRE) on the server. The Boomi Atom installer will download a Java JRE into the Atom installation directory (e.g., C:/Boomi Atomsphere/Atom – WDCi Atom/jre) if you do not have one, but the JRE alone is not sufficient to run the SAP connector.
  2. Create a JAVA_HOME environment variable pointing to the Java JDK directory, and also include JAVA_HOME/bin in the global PATH environment variable.
  3. After the installation, download the additional SAP Java Connector library from the official site (service.sap.com/connectors > SAP Java Connector > Tools & Services) and extract the contents into the <Boomi atom installation folder>/userlib/sapjco folder (e.g., C:/Boomi Atomsphere/Atom – WDCi Atom/userlib/sapjco).
  4. Once the above is done, restart the Atom service for the changes to take effect.

Note: If you would like to use the SAP IDoc Listener (for real-time integration), the Boomi SAP connector requires a database (for tracking purposes); you will need to download the required JDBC driver and place it in the userlib/ folder.
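To sanity-check step 2, you can verify that JAVA_HOME actually points at a JDK rather than a JRE: a JDK ships a compiler under bin/, a bare JRE does not. A small illustrative sketch (the class and helper names are ours):

```java
import java.io.File;

public class CheckJavaHome {

    // A JDK ships bin/javac (bin/javac.exe on Windows); a bare JRE does not.
    static boolean looksLikeJdk(String javaHome) {
        if (javaHome == null) {
            return false;
        }
        return new File(javaHome, "bin/javac").exists()
                || new File(javaHome, "bin/javac.exe").exists();
    }

    public static void main(String[] args) {
        String javaHome = System.getenv("JAVA_HOME");
        System.out.println(looksLikeJdk(javaHome)
                ? "JAVA_HOME points at a JDK"
                : "JAVA_HOME is missing or points at a JRE");
    }
}
```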

Operation Profile

After importing the operation, please make sure that date data type fields have the correct date format in both the Request and Response profiles. The formats used by the connector are:

  • Date = yyyy-MM-dd
  • Time = HH:mm:ss

Failing to configure the date format in the Request profile will cause the connector to always return all data from SAP.

Get Operation

Most often, you will query data using the last updated date/time. The date/time formats for the query operation are:

  • Date = yyyy-MM-dd (this is different from the format being used in SAP BAPI Tester, dd.MM.yyyy)
  • Time = HH:mm:ss
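For reference, producing these exact patterns in Java looks like this (the timestamp below is arbitrary, just to make the output concrete):

```java
import java.text.SimpleDateFormat;
import java.util.Date;
import java.util.TimeZone;

public class SapQueryDateFormat {

    public static void main(String[] args) {
        // The patterns the Boomi SAP connector expects in query filters.
        SimpleDateFormat dateFormat = new SimpleDateFormat("yyyy-MM-dd");
        SimpleDateFormat timeFormat = new SimpleDateFormat("HH:mm:ss");

        // Pin the time zone so the sample output is deterministic.
        dateFormat.setTimeZone(TimeZone.getTimeZone("UTC"));
        timeFormat.setTimeZone(TimeZone.getTimeZone("UTC"));

        Date lastUpdated = new Date(0L); // 1970-01-01T00:00:00Z
        System.out.println(dateFormat.format(lastUpdated)); // 1970-01-01
        System.out.println(timeFormat.format(lastUpdated)); // 00:00:00
    }
}
```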

Send Operation

If you are sending information to SAP via a BAPI operation, it is best to double-check with the SAP developer whether they require the Boomi SAP Connector to send “Commit Transaction” as part of the BAPI call. If so, please make sure the “Commit Transaction” option is checked in the SAP operation.

Salesforce Tips: Filter Opportunity Product Dynamically During Selection

We came across a requirement to dynamically display only a certain list of products for the user to select when adding a new opportunity line item.

Our first thought was to use a different price book; however, this is not possible because the customer has an integration that syncs all product prices into the Standard Price Book.

The workaround is to create a new list button on the Opportunity Product object that redirects the user to the standard Salesforce “add opportunity product” screen with some predefined filter values. For example:

/p/opp/SelectSearch?addTo={!Opportunity.Id}&retURL=%2F{!Opportunity.Id}&PricebookEntrycol0=PRODUCT2.FAMILY_ENUM&PricebookEntryoper0=e&PricebookEntryfval0={!Opportunity.Type}

The new button then needs to be added to the Opportunity Product related list on the Opportunity page layout.

[Image: opp_product_auto]

Boomi Connector: Using JSON Schema

In the latest Boomi connector SDK, you can now write a custom connector that works with a JSON profile instead of the conventional XML profile. So how do you do this?

Let’s take the following JSON schema as an example:

{  
   "$schema":"http://json-schema.org/draft-04/schema#",
   "title":"Product set",
   "type":"array",
   "items":{  
      "title":"Product",
      "type":"object",
      "properties":{  
         "id":{  
            "description":"The unique identifier for a product",
            "type":"number"
         },
         "name":{  
            "type":"string"
         },
         "price":{  
            "type":"number",
            "minimum":0,
            "exclusiveMinimum":true
         }
      }
   }
}

The main object element in the JSON schema is “items”. In the connector browser, we just need to declare “items” as the element name (referred to as a pointer in the connector) and specify the input and output types as below:

import com.boomi.connector.api.ContentType;
import com.boomi.connector.api.ObjectDefinition;

// Declare the JSON pointer as the element name and mark both the
// input and output as JSON instead of the default XML.
ObjectDefinition objDef = new ObjectDefinition();
objDef.setElementName("/" + elementName);
objDef.setInputType(ContentType.JSON);
objDef.setOutputType(ContentType.JSON);

// Attach the schema string so Boomi can render the JSON profile.
objDef.setJsonSchema(jsonStr);

That’s all! Happy coding 😀

Boomi Tips: Retrying Error Documents for Sub-Processes

In Boomi, we often design the integration to run in serial mode by having a parent process that invokes sub-processes.

[Image: Parent Process]

This is really helpful when we have a business flow where the parent record (e.g., Customer) must be synced into the target system before the related record (e.g., Sales Order) is synced. However, this design is less handy when it comes to retrying error documents. As you may know, Boomi allows you to retry an error document in the Process Monitoring console, but this only works for documents belonging to a Boomi process that is attached directly to an environment and deployed.

As a workaround, attach and deploy the related sub-process to an environment and simply let it sit there without any schedule.