Just read the table and create a CSV for import. Really?

[POSTED October 3, 2018] Just read the table and create a CSV for import. Really? Because we’re trapped in 1985 here?

Times are changing. When facing an integration problem, the old answer was usually something along the lines of “read a table and create a delimited file for import.” That answer is quickly becoming infeasible. So many systems now make themselves available only online. Microsoft d365, Salesforce, NetSuite, and many others no longer allow you to read the database directly. To get data in and out of today’s systems you need a formal integration solution capable of targeting APIs. Hacking something together in Excel is too weak, and creating your own custom application is not feasible.
Some products come with capabilities for importing or exporting data, but that does not get data into your other systems; it puts the burden on you to figure out your own integration solution.

You need a system capable of targeting the APIs for all of the calls you need to make, across all of the systems you need to integrate. You need a system that is not limited to just one system or a few API calls. You need a system that you can buy once and leverage to integrate all of your systems, old and new. Tired of thinking through your own cobbled-together integration solution every time you need to exchange data or automate transactions? The time has come for a formal integration solution that can handle your current and future needs, integrate with your current and future systems, and free up your time to focus on your business.

The importance of automation

[POSTED September 11th, 2018] At 2:45 pm on May 6, 2010, Wall Street essentially had a heart attack. In just minutes, the stock market plunged 1,000 points, for reasons traders, analysts, and business media could not explain. The “flash crash” wiped out $1.1 trillion in investor dollars, and even though most of that was quickly regained, it left the market badly shaken. What happened? It appears a single keystroke error was to blame: the letter “B” was entered in a sell order instead of the letter “M”. Billion was input where million should have been, and it triggered a ripple effect through the automated financial markets.
Is that an extreme example? Yes. However, millions of dollars are lost through integrations that are not automated and not in real time. To run lean and mean, things like orders, inventory, and sales need to be updated now. In real time. And why have and maintain one integration to payroll, another to the ERP, and yet another to the MES? The Link® can sit in the middle and transfer information to and from all of those places. Buy a new system tomorrow? No problem; The Link can integrate with that too.
If you are updating or buying software for your manufacturing ecosystem, don’t just ask IF it can integrate; be educated enough to find out HOW it can integrate. We are here to help, and this sixty-second video explains a lot.

The importance of distributed integration

[POSTED September 5th, 2018]

Integrating your enterprise is no longer as simple as installing an integration server and running everything through one server at one location. Even with a cloud-based integration, integrating multiple sites can get bogged down in time lags and latencies. A better and more timely solution is to use distributed integration services. Deploy the map that reads data close to the system it reads from. Deploy the map that writes data close to the system it writes to. Use asynchronous message queues to transfer data between integration maps so that the integration is robust and delays are less relevant.
Maintaining a real-time integration between geographically distributed systems is a challenge, and you need an integration solution with the right strategy to meet it. You need a solution whose distributed integration services synchronize with each other in the background, without introducing delays that affect your systems or users.
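The reader-map/writer-map pattern with an asynchronous queue in the middle can be sketched in a few lines. This is a minimal illustration only, not The Link’s actual API; all names here (`reader_map`, `writer_map`, the in-process queue) are hypothetical stand-ins for services that would run near each system.

```python
import queue
import threading

# An asynchronous queue decouples the reader from the writer, so a slow
# or distant target system never stalls the source-side read.
transaction_queue = queue.Queue()

def reader_map(source_rows):
    """Runs close to the source system: reads new rows and enqueues them."""
    for row in source_rows:
        transaction_queue.put(row)   # non-blocking hand-off to the queue
    transaction_queue.put(None)      # sentinel: no more data

def writer_map(target):
    """Runs close to the target system: drains the queue at its own pace."""
    while True:
        row = transaction_queue.get()
        if row is None:
            break
        target.append(row)           # stand-in for writing to the target

source = [{"order": 1}, {"order": 2}, {"order": 3}]
target = []

writer = threading.Thread(target=writer_map, args=(target,))
writer.start()
reader_map(source)
writer.join()
```

In a real deployment the queue would be a durable, networked message broker rather than an in-process `queue.Queue`, which is what makes the pattern robust to latency between sites.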

Is your integration up to speed?

[POSTED July 11th, 2018]

Everyone wants real-time integration. Who wants to wait for their data to replicate? Yet some companies continue to rely on batch integration out of a perceived need for speed. The LINK® integration services run as background Windows services with incredibly low overhead. Unburdened services run at around 400 transactions per second on a typical server, putting thousands or even tens of thousands of transactions per hour well within reach.

More …
Batch integrations are outdated and archaic.
Why are you waiting for your data?

Get your data in real-time with The LINK.

eCommerce Automation

[POSTED June 5th, 2018]
    Is your business automated? Do you fulfill orders manually, or is everything automated? Do your online sales require manual follow-up to fulfill the orders? Are depleted and received inventory updated automatically?
    You can improve customer satisfaction by improving your speed to fulfillment and inventory accuracy, and by eliminating manual errors in the order fulfillment process. Automated integration with your ERP system means that your online presence is always up to date and orders are accepted electronically, immediately.
    Whether you are focused on B2B or B2C, a robust integration can translate into more sales and more efficiency for your business.


      Upgrade to an integration suite that integrates and automates your business

More …

Tired of point-to-point integration and automation offerings that only target two or three systems? The Link® offers system-agnostic integration that connects all of your disparate systems. With built-in error trapping, logging, reporting, monitoring, review, and transaction retry capabilities, all you have to worry about is your business rules and nothing else, which makes integration a snap. You are up and running with just your business rules in place and no other code development effort required.

Batch versus Real-Time Differences

Batch processes integrate all of the data matching a given criterion into the target system in one large run. Real-time processes monitor the source system for new transactions and respond immediately, integrating each new transaction with the target system transaction by transaction, in real time. Batch processes take a great deal of processing power and a long time to complete, since they must process every record that meets the integration criteria at once. Speed ratings (transactions per second, for instance) are critical for batch processes because they can run for a long time; you may need to ensure that the integration completes in an acceptable amount of time on each run. Real-time processes monitor and integrate data one transaction at a time, as transactions appear. This means real-time integration processes are always ‘caught up’, so transactions-per-second capabilities are of much less concern for these processes.
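The contrast above can be made concrete with a small sketch. The function names and the `criteria` predicate are illustrative assumptions, not anything from The Link itself: one function processes every matching record in a single pass, while the other is called once per transaction as it appears, so its target is always caught up.

```python
# Batch: process all records matching the criteria in one large pass.
def batch_integrate(source, target, criteria):
    matched = [rec for rec in source if criteria(rec)]
    target.extend(matched)
    return len(matched)          # how many records this run processed

# Real-time: invoked once per new transaction, transaction by transaction.
def on_new_transaction(rec, target, criteria):
    if criteria(rec):
        target.append(rec)       # the target never falls behind

source = [{"id": 1, "sync": True},
          {"id": 2, "sync": False},
          {"id": 3, "sync": True}]
is_syncable = lambda rec: rec["sync"]

batch_target = []
processed = batch_integrate(source, batch_target, is_syncable)

realtime_target = []
for rec in source:               # simulate transactions arriving one at a time
    on_new_transaction(rec, realtime_target, is_syncable)
```

Both approaches end up with the same data in the target; the difference is that the batch run pays its whole processing cost at once, while the real-time handler spreads it over each arriving transaction.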

More …

Batch versus Real-Time – What is the best choice?

There is no methodology that is better in all cases. Sometimes it makes sense to automate integration via batch processes, and in other cases real-time integration makes more sense. A mature and capable integration solution will allow each integration map to be deployed as you see fit. A typical implementation will see a mixture of maps automated as batch processes and as real-time processes.

Batch is best when …

Batch should be considered when:

The data being integrated is not time-sensitive

If the data you are integrating is barely affected by being hours or days old, then batch integration could be a good choice.

You have limited integration between systems and need to be sure that all changes from many areas are kept in sync with as few processes as possible

If you are unable or unwilling to create data maps for all of the individual elements requiring integration, a batch process that integrates many fields can save time and get the integration up and running sooner.

You have to integrate a large number of data elements, but not very many transactions (records)

One of the pitfalls of batch integration is the amount of time it takes to process large source record sets. If the number of transactions occurring is minimal, then batch integration could be a good choice.

The data being integrated is closely coupled with other data that is updated on a batch basis

If you are integrating data that is closely coupled to data updated via a batch process, it only makes sense that the closely coupled data be updated via a batch process as well. For instance, if you are using an ERP system and integrating forecast orders (which are updated via a batch process) to another system, there may be no advantage to monitoring the forecast orders in real time, since they only change with each ERP batch run. In this case, it might be more efficient to set up a batch integration triggered just after the forecast orders are updated.

You don’t mind a little manual intervention

Batch processes can be automated just like real-time processes, but they come with a catch. If a special situation arises that requires a refresh of the data outside the normal batch processing times, then the batch synchronization must be triggered manually. This can be a big deal if the person with all of the system smarts is not available, or if the batch takes an excessively long time to complete. The situation doesn’t come up every day, but it always comes up eventually with batch processes.

You have the necessary power to run large batch processes

Batch processes can take a long time to run and consume a lot of resources. Since these processes usually have a time window in which they must complete, adding more processes can create a time crunch or a processing-power crunch. Before choosing to integrate systems using batch processes, be sure you have the excess capacity and/or time to let the integration complete in an acceptable amount of time.

Real-Time is best when …

Real-time integration should be considered when:

The data being integrated is time-sensitive

If the data you are integrating suffers from being even hours old, real-time integration ensures that the data is always as current as possible.

You have the development expertise and time to create integration maps that describe exactly what needs to be integrated, without resorting to integrating large collections of data when only specific data within those sets is required

It is ideal to integrate only the data that actually needs to be integrated. Sometimes a shortage of development time or expertise leads to more data being integrated than necessary. If you have the time and skill to create exacting integration maps, you can create a greater number of real-time integration processes rather than a smaller number of batch processes.

You have to integrate many transactions (records)

If you are integrating systems with high transactional throughput, real-time integration can be the better choice because it is always caught up with processing. This averts the problem of excessively long batch runs that hog machine resources and interfere with other processes.

The data being integrated is loosely coupled with other data that is updated on a batch basis

Most environments have a mix of data updated in real time and data updated via batch processes. If the data you are integrating is not tied to a batch process, you are free to choose real-time integration and integrate it with other systems as efficiently as possible.

You want to avoid manual intervention

Real-time processes are always caught up, so there is rarely a need for manual intervention to get them synchronized with other activities. Real-time integrations tend to “just work” and require the least manual attention.

You do not have the necessary power to run large batch processes

Batch processes can take a long time to run and consume a lot of resources. Since they usually have a time window in which they must complete, adding more processes can create a time crunch or a processing-power crunch. If you do not have the excess capacity and/or time to let a batch integration complete in an acceptable amount of time, choose real-time integration. Real-time integration is much less processor-intensive and does not require a completion window.

A Note on THE LINK®

THE LINK allows every map to be deployed as either a batch process or a real-time process. There is no change to the map itself; it is simply a deploy-time decision. Batch maps can be scheduled. Real-time maps can be associated with up-time and down-time schedules so that they do not run at inappropriate times.
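The deploy-time decision can be sketched as follows. This is a hypothetical illustration, not THE LINK’s actual deployment mechanism: the same `map_fn` is left untouched, and only the wrapper around it changes between a scheduled batch run and a real-time handler with an up-time window.

```python
from datetime import time

# The map itself never changes between deployment styles.
def map_fn(record):
    return {"target_field": record["source_field"]}

def deploy_batch(records):
    """Batch deployment: apply the map to everything in one scheduled run."""
    return [map_fn(r) for r in records]

def deploy_realtime(record, now, up_start=time(6, 0), up_end=time(22, 0)):
    """Real-time deployment: map one record, but only inside the up-time window."""
    if up_start <= now <= up_end:
        return map_fn(record)
    return None  # outside the schedule: the service declines to run

rows = [{"source_field": "A"}, {"source_field": "B"}]
batch_result = deploy_batch(rows)
daytime = deploy_realtime(rows[0], time(12, 0))   # inside the up-time window
night = deploy_realtime(rows[0], time(23, 30))    # outside the window
```

The point of the sketch is that the mapping logic is written once; scheduling and up-time windows are concerns of the deployment wrapper, not of the map.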

Efficient Cloud Integration

[POSTED April 16, 2018]

Cloud integration is changing

Cloud-based systems such as d365 and others demand a new paradigm for how to make them work best with your other enterprise software. Reading tables and writing files to move your information isn’t good enough anymore. New systems demand new strategies that leverage local reading and callback servers. Local reading ensures that the system is read without consuming network bandwidth or polling, and proper use of callback servers means that data is transmitted across the network efficiently, without polling or unnecessary traffic.
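The polling-versus-callback difference can be shown with a toy model. Everything here is a hypothetical stand-in (real cloud systems expose callbacks as vendor-specific webhooks): the polling client generates a network request on every poll whether or not anything changed, while the callback server receives traffic only when data actually changes.

```python
class PollingClient:
    """Repeatedly asks the cloud system for changes, even when there are none."""
    def __init__(self, cloud):
        self.cloud = cloud
        self.requests_made = 0

    def poll(self, times):
        changes = []
        for _ in range(times):
            self.requests_made += 1          # network traffic on every poll
            changes.extend(self.cloud.pop_changes())
        return changes

class CallbackServer:
    """Receives a push only when data actually changes: zero idle traffic."""
    def __init__(self):
        self.received = []

    def on_change(self, change):             # invoked by the cloud system
        self.received.append(change)

class FakeCloud:
    """Toy cloud system supporting both delivery models."""
    def __init__(self):
        self._pending = []

    def push(self, change, callback=None):
        if callback:
            callback(change)                 # push model: deliver immediately
        else:
            self._pending.append(change)     # pull model: wait to be polled

    def pop_changes(self):
        pending, self._pending = self._pending, []
        return pending

cloud = FakeCloud()
poller = PollingClient(cloud)
cloud.push({"id": 1})                        # one change, no callback registered
polled = poller.poll(10)                     # ten requests to pick up one change

server = CallbackServer()
cloud.push({"id": 2}, callback=server.on_change)  # delivered with one push
```

Ten polls to retrieve a single change versus one push per change is the whole argument for callback servers in a nutshell.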

Are you still using file-based methodologies to move data in and out of the cloud?  Are you polling or flooding your network with unnecessary traffic?

THE LINK® with cloud-based monitoring and local callback server technology may be your answer to efficient cloud integration.


More …


Integration should be a snap, not a development process.

[POSTED March 12, 2018]
    How much time does it take you to build a simple integration map? If the answer is more than an hour, then perhaps it is time to upgrade your integration software. Building a map should be as simple as mapping fields or invoking an API, nothing more. If you are building forms, controls, error trapping, logging, reporting, and the like, it’s no wonder your “simple” integration takes so much effort.
    Upgrade to an integration suite that magnifies your productivity and keeps you focused on business rules, not custom development.

    More …

    The Link® lets you target complex APIs of any type by letting you control the API construction and the population of required fields. Then you can test your link and deploy it, all without ever writing a user interface or user controls. With built-in error trapping, logging, reporting, monitoring, review, and transaction retry capabilities, all you have to worry about is your business rules and nothing else, which makes integration a snap. You are up and running with just your business rules in place and no other code development effort required.

ERP Integration with batch-based processes is dying!

[POSTED February 27, 2018]
    The old way of importing manually or on a schedule is being replaced by automated transactional integration, which synchronizes and replicates data in real time, enabling state-of-the-art planning, scheduling, capacity planning, MES, and supply chain systems to work dynamically and always be up to date. Production activities are processed in real time, resulting in more current, better-informed reporting and decision making.
    Stuck in the 1980s? Embrace the best practice of real-time transactional integration today.

    More …
    (The old way: timers, batches, and lots of waiting)

    (The new way: real-time transactional synchronization. All systems are up to date, always)

Server Based Integrations are dying!

[POSTED February 20, 2018]
    Using centralized applications for integration is rapidly becoming outdated, unable to stand up to the features, speed, and flexibility of distributed microservices. It used to be that integrating data between systems was accomplished by integration applications running on a centralized server, requiring that all data, all rules, and all horsepower be dedicated to that server. This centralized (or monolithic) integration methodology is being supplanted by small, nimble, distributed microservices.

    More …

    Unlike web services, distributed microservices do not need to be called upon to perform work. They work silently and automatically in the background at all times. They wait for their integration trigger to occur (a new or changed file, database record, website, API call, etc.), and as soon as the trigger occurs they apply the integration map (to transform and map the data) in real time. These real-time service maps are deployed as discrete maps, each concerned only with mapping a single endpoint collection to another endpoint collection (think file to table, table to API method, or API method to XML). As new integration maps are needed, they are created in a design tool and deployed as individual Windows services. The advantages of using microservices for integration over a traditional monolithic integration server are many, including:

    1. Services are tiny in size and tiny in memory use, requiring few resources. No need for a massive integration server with a ton of memory and expensive processors.
    2. With no need to centralize data, services can be deployed anywhere. Deploy them close to the data they are monitoring, close to the target system, or use two services (one to monitor and one to write) to keep latency as low as possible.
    3. Change control becomes more isolated and more flexible since changes can be made to individual integration maps without affecting other deployed maps.
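    The trigger-then-map loop described above can be sketched in miniature. All names here are illustrative assumptions, not The Link’s real classes: each microservice owns exactly one map from one source endpoint to one target endpoint, and applies it the moment its trigger fires.

```python
class MicroserviceMap:
    """One discrete map: a single source endpoint to a single target endpoint."""
    def __init__(self, map_fn, target):
        self.map_fn = map_fn     # the single transformation this service owns
        self.target = target
        self.seen = 0            # high-water mark of records already handled

    def check_trigger(self, source):
        """Fires when new records appear; here we simply diff the source list."""
        new_records = source[self.seen:]
        self.seen = len(source)
        for record in new_records:            # one transaction at a time
            self.target.append(self.map_fn(record))

source, target = [], []
svc = MicroserviceMap(lambda r: {"mapped": r.upper()}, target)

source.append("order-1")
svc.check_trigger(source)        # trigger fires: the map is applied immediately
source.append("order-2")
svc.check_trigger(source)
```

    Because each service carries only one map and its own tiny state, deploying, replacing, or relocating one map never touches any other, which is the change-control advantage the list above describes.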

    An enterprise-class microservices integration suite such as The Link® also includes all of the goodies you expect from any integration solution:

    1. Centralized management (reporting, monitoring, error tracking, failed transaction retry, scheduling, etc.)
    2. Support for centralized or distributed development and deployment of service maps.
    3. Automated transaction logging, process logging, error logging, etc.
    4. Tight integration with Visual Studio lets you work in your IDE of choice, with support for your existing classes, DLLs, APIs, etc. Leverage your existing knowledge and existing code.

    Missing Link Technology developed The Link®, a powerful and flexible integration suite. The Link® allows add-on applications to communicate with an unlimited number of other systems, such as ERP, time and attendance, shop floor systems, and more. Using open standards, The Link® supports reading and writing tables (any ODBC-compliant database), views, delimited files, XML files, web services, web APIs, .NET APIs, and more. Endlessly extensible through tight integration with Visual Studio, The Link® has the power and flexibility to integrate with all your systems.


    The end result is more sales! The value proposition of The Link® is a real-time integration solution that facilitates integration between many systems, including well-established ERP systems, and that any customer or partner can take advantage of. Customers will also see dramatically improved speed to solution with built-in error trapping, error reporting, transaction logging, monitoring, management, and reporting tools. The Link® lets you provide world-class real-time integration solutions for your customers without specific knowledge of each ERP or other enterprise system.


    (The old way: everything in one single, monolithic server. All eggs in one basket)

    (The new way: small, agile, individual Windows services, each running a single integration map)