React - Controlling Ajax Calls Made To The Server
Solution 1:
This is an interesting problem to solve. One approach I can think of is to make the 6th ajax call as soon as any one in the first batch finishes. This way there will be 5 ajax requests in flight at any moment. My solution below does not make actual ajax calls, but I am guessing you can change the process function to make the ajax call and return the promise.
/**
 * Processes the jobs in batches; once a job in the batch finishes,
 * the next job is processed. This can be used to make network calls.
 */
function processBatch(queue, batchSize, processJob) {
  // remove a batch of items from the queue
  const items = queue.splice(0, batchSize);
  let count = items.length;

  // no more items?
  if (!count) {
    return Promise.resolve();
  }

  return new Promise((resolve, reject) => {
    items.forEach((item) => {
      processJob(item).then(result => {
        return processBatch(queue, 1, processJob)
          .then(_ => --count || resolve());
      });
    });
  });
}
// create the queue
var queue = [];
for (var i = 1; i <= 20; i++) {
  queue.push(i);
}
// a per-item action
function process(item) {
  console.log('starting ' + item + '...');
  return new Promise((resolve, reject) => {
    // (simulating ajax)
    setTimeout(function() {
      console.log('completed ' + item + '!');
      resolve();
    }, Math.random() * 1000);
  });
}
// start processing the queue!
processBatch(queue, 5, process)
  .then(result => console.log("All jobs processed"));
I just tried to implement a generic function using promises. You can try running the same with ajax calls; I would be interested to know how this solution works for you.
As you can see, I am recursively calling processBatch after the successful execution of each job, with the successive batchSize hard-coded to 1, but that can be changed and parameterized. Also, this function only covers the happy path, as it does not take rejected promises into consideration.
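To cover rejections without stalling the batch, one option is to wrap the per-item job so a failure settles into an error result instead of breaking the chain. A minimal sketch (the toSettled name and the { item, error } result shape are my own, not part of the solution above):

```javascript
// Wrap a job so a rejected promise resolves to an { item, error } marker
// instead of propagating; the batch keeps draining, and failures can be
// inspected afterwards. (toSettled is a hypothetical helper name.)
function toSettled(processJob) {
  return function (item) {
    return Promise.resolve()
      .then(() => processJob(item))
      .catch((error) => ({ item, error }));
  };
}

// usage: processBatch(queue, 5, toSettled(process));
```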
Solution 2:
Interesting question. I am going to propose a different solution than the one you proposed; this one makes sure that there are at most 5 requests being processed at any time.
function makeBatchCalls(arrayIds, length) {
  // how many queries are in flight at any given time
  let queued = 0;
  // the index of the next query to send
  let currentIndex = 0;
  // recursive function that launches queries while respecting the limit
  let launchNext = function() {
    // stop when the limit is reached or the ids are exhausted
    if (queued === length || currentIndex >= arrayIds.length) {
      return;
    }
    fetch(`https://jsonplaceholder.typicode.com/posts/${arrayIds[currentIndex]}`).then((results) => {
      queued--;
      launchNext();
      // do something with your results here...
    });
    queued++;
    currentIndex++;
  };
  // launch the first `length` queries, as the queue is empty at first
  for (let i = 0; i < length; i++) {
    launchNext();
  }
}
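The sliding-window idea can also be observed in isolation by swapping fetch for simulated tasks. This sketch (runPool is a made-up name, not from the solution above) additionally resolves when everything is done and reports the peak concurrency reached:

```javascript
// Run `worker(item)` over all items, never exceeding `limit` tasks in flight.
// Resolves with the peak concurrency observed, purely for demonstration.
function runPool(items, limit, worker) {
  return new Promise((resolve) => {
    let inFlight = 0, index = 0, done = 0, peak = 0;
    const launchNext = () => {
      if (done === items.length) { resolve(peak); return; }
      if (index >= items.length || inFlight >= limit) return;
      const item = items[index++];
      inFlight++;
      peak = Math.max(peak, inFlight);
      worker(item).then(() => {
        inFlight--;
        done++;
        launchNext();
      });
    };
    if (items.length === 0) resolve(0);
    // fill the window up to the limit
    for (let i = 0; i < limit; i++) launchNext();
  });
}
```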
Hope this helps.
Solution 3:
You can use the Async library for your use case. There is a queue function that does exactly this. It maintains a queue of tasks to be executed and executes them maintaining desired concurrency at all times.
Here is how your function can be changed to use an Async queue.
function makeBatchCalls(arrayIds, length) {
  // create a queue object with the desired concurrency (the batch length, 5 in your case)
  var q = async.queue(function(task, callback) {
    // call your api for this task here
    fetch(`https://jsonplaceholder.typicode.com/posts/${task}`)
      .then(function (response) {
        // Once the task executes successfully, call the queue's callback so the
        // next pending task can be picked up. This keeps all tasks running one
        // after the other with the desired concurrency.
        callback();
      })
      .catch(function (err) {
        // in case of failure, either swallow the error here or pass it to the
        // callback and move on to the next task
        callback(err);
      });
  }, length);

  // called when all the tasks have completed
  q.drain = function() {
    console.log('all items have been processed');
  };

  // push all items to the queue
  for (var i = 0; i < arrayIds.length; i++) {
    q.push(arrayIds[i]);
  }
}
makeBatchCalls([1,2,3,4,5,6,7,8,9,10,12,12,13,14,15,16,17,18,19,20],5)
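For intuition about what async.queue is doing under the hood, here is a minimal self-contained queue with the same worker/concurrency/drain shape. This is an illustrative sketch of the mechanism, not the library's actual implementation:

```javascript
// A toy async.queue lookalike: tasks wait in an array, at most `concurrency`
// workers run at once, and `drain` fires when the last task completes.
function makeQueue(worker, concurrency) {
  const tasks = [];
  let running = 0;
  const q = {
    drain: null,
    push(task) {
      tasks.push(task);
      runNext();
    },
  };
  function runNext() {
    // respect the concurrency limit; stop if nothing is pending
    if (running >= concurrency || tasks.length === 0) return;
    running++;
    worker(tasks.shift(), function () {
      running--;
      if (tasks.length === 0 && running === 0 && q.drain) q.drain();
      else runNext();
    });
  }
  return q;
}
```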