Here are several important considerations when designing Integration Projects.
- Data Refresh or Cross-Reference Tables
- Using Reference Tables
- Code Tables
- Using Cloud Adapters and Events
- Cloud Data Routing
- Cloud Connection to Customer Database
- Throughput Analysis
- API Consumers and Providers
- Using Long-running or Asynchronous Business Processes with a Web Service Provider
- Pre-Process Business Processes
- Process Schedulers
- Before Deploying
- 997 Enablement and Schema
- Miscellaneous
Data Refresh or Cross-Reference Tables
If your Business Process (BP) needs a cross-reference table during CIC processing, but this information resides in an on-premise ERP environment, you'll need to work with the CIC Cloud support team to create the cross-reference tables in CIC and set up the automated process for data refresh.
Using Reference Tables
A Reference Table is a database table you can use for various purposes. Typically, Reference Tables hold dynamic data, such as XREF values used in transformations and Business Processes. Reference Tables can also store the kind of data normally kept in Code Tables; with Reference Tables, however, the user gets CRUD functionality without deploying.
- You should not create columns that are not required.
- If possible, add indexing/Primary Key constraints on data in Reference Tables.
- Depending on business scenarios, you can develop a Business Process to either remove all the data from a Reference Table and insert new data or just insert new data.
- If you want a Reference Table to be purged on a regular basis (for example, purging entries older than 30 days), add a date column, and create SQL and a Business Process to delete the old entries from the table (see the sketch after this list).
- Examples of some implementation scenarios:
- Customer Master Data
- SKU – UPC relationship table
- Trading Partner Contacts table for direct reporting
- Product Attributes Table
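As a rough, non-CIC-specific illustration of the 30-day purge pattern mentioned above, the following Python sketch runs a parameterized DELETE against a date column. The table and column names (TP_CONTACTS, CREATED_DATE) are hypothetical, and the exact SQL depends on the database behind your Reference Table.

```python
# Minimal sketch of the "purge entries older than N days" pattern:
# a date column plus a scheduled delete. Names are hypothetical examples.
import sqlite3
from datetime import datetime, timedelta

def purge_old_entries(conn, table="TP_CONTACTS", date_column="CREATED_DATE", days=30):
    """Delete rows whose date column is older than the given number of days."""
    cutoff = (datetime.utcnow() - timedelta(days=days)).strftime("%Y-%m-%d")
    conn.execute(f"DELETE FROM {table} WHERE {date_column} < ?", (cutoff,))
    conn.commit()

if __name__ == "__main__":
    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE TP_CONTACTS (NAME TEXT, CREATED_DATE TEXT)")
    conn.execute("INSERT INTO TP_CONTACTS VALUES ('old', '2020-01-01'), ('new', '2099-01-01')")
    purge_old_entries(conn)
    print(conn.execute("SELECT * FROM TP_CONTACTS").fetchall())  # only the recent row remains
```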
Code Tables
Code Tables are preferred over Reference Tables in scenarios where the data is static and not going to change frequently; Zip code and City/State relationships are typical examples. Reference Tables use SQL commands and database connections, so if you must make many iterations to fetch data, it is recommended to use a Code Table instead.
Using Cloud Adapters and Events
Cloud Adapters must point to relevant Endpoints. Correct Events must be configured from within the CIC Cockpit when creating a Data Flow using a transformation Endpoint.
In cases where you have a single Filesystem Endpoint sending multiple documents, you'll need to include a launcher BP to segregate the various transaction types (outbound 856, 810, 846, etc.).
Even if there is only a single environment (Production), you should still create separate Cloud Adapters for the Stage and Production environments; for example, Stage Adapters connect to Stage Endpoints, and Production Adapters connect to Production Endpoints.
Note: Customers having single Integration Engine environments must have different EDI IDs between Stage and Production.
The following recommendations relate to the use of Cloud Adapters in your integration.
- Use two separate Cloud Adapters: one for inbound files used to send transformed files to the backend systems, and one for outbound files used to send to Trading Partners.
- Try to use the fewest Cloud Adapters possible, and use variables to route files to the correct Endpoint and filename. The filename can also be built as required and passed to a Cloud Adapter from a Business Process.
- Try including a date-time stamp in the filename to make the name unique; this helps avoid files being overwritten (see the sketch below).
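A minimal, non-CIC-specific sketch of the date-time-stamp idea above; the prefix and extension here are hypothetical.

```python
# Build a unique outbound filename by appending a date-time stamp,
# so earlier files with the same prefix are not overwritten.
from datetime import datetime

def unique_filename(prefix, extension="edi"):
    """Return e.g. 'ORDERS_20240115_093012_123456.edi' using the current UTC time."""
    stamp = datetime.utcnow().strftime("%Y%m%d_%H%M%S_%f")
    return f"{prefix}_{stamp}.{extension}"

print(unique_filename("ORDERS"))
```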
Cloud Data Routing
For inbound data, data transformation must use Business Processes to handle different data formats (like EDI, XML, CSV) received from the trading partner Endpoint. The Integration Engine must differentiate between the data formats based on either content-based routing (in the Integration Engine) or file name.
For outbound data, it is preferred to have {data_format} and {doc_type} subdirectories to route different documents within the Integration Engine. When the customer cannot create the subdirectories, however, the transformation needs to handle the different data formats (XML, CSV, Flat file) and doc types (810, 855, 856) based either on file name or on content-based routing within Business Processes.
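A generic sketch of the filename/content-based routing idea described above (not CIC-specific; the extensions, the ISA check, and the handler BP names are hypothetical examples):

```python
# Route a received file by filename first, then by content.
def detect_format(filename, payload):
    """Guess the data format from the file name, falling back to the content."""
    name = filename.lower()
    if name.endswith(".xml"):
        return "XML"
    if name.endswith(".csv"):
        return "CSV"
    if payload.lstrip().startswith("ISA"):   # X12 interchanges begin with an ISA segment
        return "EDI"
    return "UNKNOWN"

def route(filename, payload):
    """Return the name of the Business Process that should handle this file."""
    handlers = {"EDI": "inboundEdiBP", "XML": "inboundXmlBP", "CSV": "inboundCsvBP"}
    return handlers.get(detect_format(filename, payload), "unroutedFileAlertBP")

print(route("orders_20240101.txt", "ISA*00*..."))  # -> inboundEdiBP
```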
Cloud Connection to Customer Database
Use a proxy to connect to your database. See Using a proxy to connect to a database from CIC in Database Integration. Do not use the JDBC SSL connection in this case. If working with a Managed Services customer, use a proxy to connect to a database, as noted above. However, if the database proxy method does not work, try a JDBC SSL connection.
Throughput Analysis
Throughput analysis of the interfaces within CIC needs to be performed to plan for processing large files or large file volumes. The following details are important to know before starting an implementation.
- File size
- Number of expected files during peak hours
- The peak window in which maximum file flow is expected.
Use the Initial Planning Questionnaire to gather the required information upfront.
Please work with Cloud Support or your Solution Architect to resolve any issues.
API Consumers and Providers
Recommendations and best practices when working with API Consumers:
- If we have multiple interfaces on the same server, it is recommended to create one API Consumer and use Property Variables to construct the URL. For example, if we have abc.cleointegration.io/order for Orders and abc.cleointegration.io/shipment for Shipments, we can have one API Consumer with the Base URL abc.cleointegration.io/{URL}, where "URL" is the variable that can be sent from the Business Process.
- Know the throttling limits (i.e., the number of API calls that can be made to a server in a given interval of time) before implementation. Ensure that the implementation is well within the limits and can handle all throttling requirements.
- If a token-based authentication is required, then:
- For tokens that expire after short intervals (15 minutes, for example), make an API call for the token and use that token in the Data API call.
- For tokens that expire after long intervals (24 hours, or never), it is recommended to save the token in a database table or Global Variable and reuse it wherever required, rather than making a token API call every time we make a Data API call.
- Proper error handling should be done based on the reply from the API server.
- If we need to prevent some standard (400/500 series) HTTP replies from creating unnecessary tickets, then the below approach can be followed.
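As a generic sketch of the long-lived-token caching suggested earlier in this list (not the HTTP-reply filtering just mentioned, and not CIC-specific): in CIC the cache would be a Global Variable or a database table, and fetch_new_token() below is a hypothetical stand-in for the real token API call.

```python
# Cache a long-lived API token instead of requesting a new one for every data call.
import time

_token_cache = {"value": None, "expires_at": 0.0}

def fetch_new_token():
    """Placeholder for the real token API call; returns (token, lifetime_seconds)."""
    return "example-token", 24 * 3600

def get_token():
    """Return the cached token, refreshing it only when it has expired."""
    now = time.time()
    if _token_cache["value"] is None or now >= _token_cache["expires_at"]:
        token, lifetime = fetch_new_token()
        _token_cache["value"] = token
        _token_cache["expires_at"] = now + lifetime - 60  # refresh a minute early
        print("Fetched a new token")
    return _token_cache["value"]

get_token()   # fetches a new token
get_token()   # reuses the cached token
```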
Recommendations and best practices when working with API Providers:
- Substantial work on data, such as transformations, DB inserts, etc., should not be performed in the API Provider Launcher Business Process. Instead, call other Events using Send Event Notification Business Process tasks.
- If a file needs to be validated before sending the reply back, try to keep the Business Process steps as light as possible.
Using Long-running or Asynchronous Business Processes with a Web Service Provider
There are several use cases where you might want to use a Web Service Provider object with Business Processes that are either long-running or asynchronous. For example:
- A Business Process you want to be invoked asynchronously by Trading Partners.
- A scenario where you use a webhook to invoke a Web Service Provider API or to receive notification from a SaaS application to run a background process.
- A long-running Business Process with an execution time of more than 30 seconds invoked using a Web Service Provider.
The following best practices can help you successfully implement the use cases noted above.
- Create a Web Service Provider, which internally creates and attaches a Business Process (Parent BP).
- Add the required tasks to the Parent BP, keeping them short-running.
- Add a Send REST Web Services Reply task to send a response back to the client. This is typically an acknowledgment response.
- Create a new long-running Business Process and add the required tasks.
- Create an Event object and configure the long-running BP details in the Event properties.
- Add a Send Event Notification task after the Send REST Web Services Reply task in the Parent BP, and provide the created Event details in its properties to launch the long-running BP as an asynchronous task.
- Add Set Exit Status as the last task in the Parent BP.
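The acknowledge-then-process pattern these steps describe can be sketched generically as follows (not CIC-specific; the functions below are hypothetical stand-ins for the Parent BP, the Send REST Web Services Reply task, and the Event-launched long-running BP):

```python
# Reply to the caller right away, then hand the long-running work to a background
# task, the way the Parent BP replies and then raises an Event for the long BP.
import threading, time

def long_running_bp(payload):
    """Stand-in for the Event-launched long-running Business Process."""
    time.sleep(2)                       # pretend this takes longer than 30 seconds
    print(f"Finished processing: {payload}")

def parent_bp(payload):
    """Stand-in for the Parent BP attached to the Web Service Provider."""
    # Equivalent of Send Event Notification: start the long work asynchronously.
    threading.Thread(target=long_running_bp, args=(payload,)).start()
    # Equivalent of Send REST Web Services Reply: acknowledge immediately.
    return {"status": 202, "body": "Accepted: processing has been started"}

print(parent_bp("inbound 856 document"))   # returns at once; work continues in background
```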
Pre-Process Business Processes
Pre-Process Business Processes should be implemented when we need to manipulate data or perform content- or filename-based routing.
Example scenario: assume all the outbound files are received via the same dataflow through the same Event; in this case, logic within the pre-process BP can segregate them and then call their respective Launcher BPs.
Another scenario could be replacing special characters in the file or routing files based on the filename.
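A generic sketch of such a pre-process step (not CIC-specific; the character set being stripped, the doc-type hints, and the Launcher BP names are hypothetical examples):

```python
# Strip problem characters from a file's content and pick a Launcher BP from the filename.
import re

def clean_content(payload):
    """Remove characters outside printable ASCII plus newline/carriage return/tab."""
    return re.sub(r"[^\x20-\x7E\r\n\t]", "", payload)

def pick_launcher(filename):
    """Choose a Launcher BP based on a doc-type hint in the filename."""
    name = filename.lower()
    if "810" in name:
        return "launcher810BP"
    if "856" in name:
        return "launcher856BP"
    return "unroutedFileAlertBP"

print(pick_launcher("acme_856_20240101.edi"), clean_content("PO1*1*5\u00ae*EA"))
```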
Process Schedulers
To implement a scenario where a Scheduled Business Process should run only between certain hours (for example, every hour from 9:00 AM to 5:00 PM): create a scheduled BP that runs every hour and fetches the current date and time; if the current time is greater than or equal to 9:00 AM and less than or equal to 5:00 PM, trigger the actual BP; otherwise, exit.
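A minimal, non-CIC-specific sketch of that time-window check; run_actual_bp() is a hypothetical stand-in for the real Business Process.

```python
# The scheduled job runs every hour, checks the current time, and only
# triggers the real work inside the 9:00-17:00 window.
from datetime import datetime, time as dtime

def run_actual_bp():
    """Stand-in for the Business Process that should only run during business hours."""
    print("Running the actual BP")

def scheduled_bp(now=None):
    """Entry point invoked by the hourly scheduler."""
    now = now or datetime.now()
    if dtime(9, 0) <= now.time() <= dtime(17, 0):
        run_actual_bp()
    else:
        print("Outside the 9:00-17:00 window; exiting")

scheduled_bp(datetime(2024, 1, 15, 10, 0))   # inside the window -> runs
scheduled_bp(datetime(2024, 1, 15, 20, 0))   # outside the window -> exits
```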
Before Deploying
Before you perform any deployment:
- Inform the Customer in advance.
- (Internal only) Use the #cic-deployments Slack channel to inform the applicable internal team prior to deployment.
997 Enablement and Schema
Recommendations:
- Have one Schema per document type, and use this Schema across multiple Rulesets.
- Trading Partners may have user-defined qualifiers that are not part of standard X12; in this case we would send rejected 997s. We recommend that you discuss this with the Trading Partner to see if they can use the standard qualifier; if not, we can manually add theirs to the respective Schema.
- If you need to modify an EDI Schema, it is better to create a new Schema rather than edit an existing one; this generally helps prevent other transactions from being affected.
Miscellaneous
- Use the loginfo Ruleset task before the Force Transformation task to pass meaningful information, so that the error is easier to debug and understand.
- Use good filtering in your BP so that other integrators or Support can differentiate generic failures from force transformation failures; this prevents an email ticket from being sent to Support when an email to the Customer is enough. If a ticket is required, then use the following BPs: inboundForcedErrorCockpitBPS or outboundForcedErrorCockpitBPS.
- When designing a BP to process files based on filename, always be sure to address the negative scenario (i.e., when the filename does not match any criteria) by generating an e-mail that assists the Support team and provides enough detail that a full Project review is not required.
- Map the Message Id to User reference 1.
- When possible, use process environment variables instead of User Reference values.
- When using context point user reference values, make sure that no more than 64 characters are used.
- Be sure to document unique implementation details for future maintenance and understanding.