[Sponsored Post] Magento Community Driven Project – Asynchronous API

– Guest article by comwrap GmbH, Silver sponsor of Meet Magento DE 2018

One of our customers had a specific requirement in Magento:

The customer wanted to import about 2,000 products into Magento using the standard Magento API. For this task, they used a self-developed ESB solution that pushed all products to Magento via the REST API. However, the customer's ERP could only send API requests asynchronously, so it pushed all products into Magento simultaneously, without waiting for a status or response from Magento.
This led to several problems we had to solve for the customer. In this article, you'll read how we solved them by developing a robust way to handle large product imports.
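To make the setup concrete, here is a minimal sketch of the kind of request the ERP fired at the shop, one per product, with no throttling. The store URL and token are placeholders, and the payload shows only a few of the fields Magento's `POST /rest/V1/products` endpoint accepts:

```python
import json
from urllib import request

MAGENTO_BASE = "https://shop.example.com"  # placeholder store URL
TOKEN = "integration-token-here"           # placeholder access token


def build_product_payload(sku, name, price):
    """Minimal product payload for Magento's POST /rest/V1/products."""
    return {
        "product": {
            "sku": sku,
            "name": name,
            "price": price,
            "attribute_set_id": 4,  # Magento's default attribute set
            "type_id": "simple",
        }
    }


def push_product(payload):
    """Send one synchronous create/update request to the Magento REST API."""
    req = request.Request(
        MAGENTO_BASE + "/rest/V1/products",
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": "Bearer " + TOKEN,
        },
        method="POST",
    )
    with request.urlopen(req) as resp:
        return json.load(resp)
```

The ERP effectively called the equivalent of `push_product` roughly 2,000 times in parallel, which is what triggered the problems described below.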

The problems we faced

During the implementation, we realized that the default Magento REST API did not work correctly in our case.

Problem 1: Database table deadlocks

This problem first appeared when we tried to asynchronously import large amounts of data into Magento. During the import process, we continuously received deadlocks on table updates, involving the "url_rewrite" and "media_*" tables:

Example:

Serialization failure: 1213 Deadlock found when trying to get lock; try restarting transaction
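A common first-aid measure for MySQL error 1213 is to retry the failing transaction with backoff, as the error message itself suggests. The sketch below illustrates that pattern (the `DeadlockError` class is a stand-in for whatever driver-specific exception carries error code 1213); note that retries only paper over the contention, which is why serializing the requests, as described later, was the real fix:

```python
import random
import time


class DeadlockError(Exception):
    """Stand-in for the driver-specific MySQL error 1213 (deadlock)."""


def with_deadlock_retry(operation, attempts=5):
    """Run `operation`, retrying with jittered exponential backoff on deadlocks.

    This mitigates sporadic deadlocks but does not remove the underlying
    contention caused by many concurrent writers.
    """
    for attempt in range(attempts):
        try:
            return operation()
        except DeadlockError:
            if attempt == attempts - 1:
                raise  # give up after the last attempt
            time.sleep((2 ** attempt) * 0.05 + random.random() * 0.05)
```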

Problem 2: No ERP error messages

The customer's ERP system cannot save or track error messages. It only sends requests and checks whether each request was executed successfully. As a result, if an error occurred, we could not trace how it happened or how to fix it.

Problem 3: Lack of performance

Sending large numbers of requests to Magento at the same time is a serious problem for the database. Hundreds of requests creating or changing products result in massive database load; on a live database, this degrades performance and hurts conversions and the user experience in the shop.

Problem 4: ERP dependency

For massive imports, the ERP system has to stay online until the process is finished so it can track responses and push the products one by one.

Our interim solution

Since Magento didn't meet our requirements at this point, we came up with another solution:
We developed a middleware, placed between Magento and the customer's ERP system, that catches all API requests intended for the Magento API.

In the next step, the middleware forwards the requests to RabbitMQ, which takes them from a queue and executes them one by one. This approach greatly reduced the system load, prevented any deadlocks in the tables, and made it possible to log all requests so we could react faster whenever an error occurred.
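The core idea, one consumer draining a queue sequentially and logging each outcome, can be sketched as follows. For a self-contained illustration, an in-process `queue.Queue` stands in for RabbitMQ; in the real middleware, a RabbitMQ consumer plays the role of the worker thread:

```python
import queue
import threading


def start_worker(job_queue, execute, log):
    """Consume queued API requests one at a time, logging each outcome.

    `execute` performs a single Magento API call; `log` collects
    (status, job, detail) tuples so failed requests can be traced later.
    """
    def run():
        while True:
            job = job_queue.get()
            if job is None:  # shutdown sentinel
                break
            try:
                log.append(("ok", job, execute(job)))
            except Exception as exc:
                # Keep the error with its request, the ERP never sees it
                log.append(("error", job, str(exc)))
            finally:
                job_queue.task_done()

    worker = threading.Thread(target=run, daemon=True)
    worker.start()
    return worker
```

Because a single worker pulls jobs off the queue, the database only ever sees one product write at a time, regardless of how fast the ERP pushes requests in, which is what eliminates both the deadlocks and the load spikes.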



Merging the solution with Magento

Since our solution worked as planned and fixed the problems for our customer, we proposed implementing the same middleware functionality directly in the Magento API.
The advantage: large amounts of data could be imported without a separate middleware solution, reducing time and cost for new implementations.
We then discussed this approach with the Magento team, and we all agreed that it would be a great feature for the Magento Commerce system and could be implemented as part of the Magento Contribution program.

API Changes

Within the scope of the implementation, we worked together with Balance Internet (https://www.balanceinternet.com.au/) on the Bulk API project. Comwrap took on the asynchronous and Bulk API implementations.
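In the feature that eventually shipped with Magento 2.3, the asynchronous endpoints reuse the synchronous REST routes behind an `async` prefix; instead of the final result, the call returns HTTP 202 with a bulk UUID that can be used to poll the operation status. The following sketch illustrates that route convention (the exact URL shapes should be checked against the official Magento REST documentation):

```python
def to_async_route(sync_route):
    """Map a synchronous Magento REST route to its asynchronous counterpart.

    Example: "/V1/products" becomes "/async/V1/products". The async call
    is accepted with HTTP 202 and answered with a bulk UUID rather than
    the created entity itself.
    """
    if not sync_route.startswith("/V1/"):
        raise ValueError("expected a /V1/... REST route")
    return "/async" + sync_route
```

With this in place, an ERP can fire all its requests at the prefixed routes and go offline; Magento queues and executes them itself, exactly as our middleware did.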

The main tasks: