How to limit and queue up processes in Python

Let's say I have 10000 tasks at hand. How can I process them in parallel, running precisely 8 processes at any time? The moment a task is finished, the next task should be fetched for execution immediately.



from multiprocessing import Process

for e in arr:
    pr = Process(target=execute, args=(q, e))
    pr.start()
    pr.join()  # joining inside the loop means only one process runs at a time


I want to do this because my CPU has only 8 hardware threads. Swarming it with 10000 tasks at once will slow down the overall computation due to the switching overhead. My memory is also limited.



(Edit: This is not a duplicate of this question as I am not asking how to fork a process.)


python python-3.x process parallel-processing python-3.6

asked Nov 12 '18 at 5:08 by Chong Lip Phang (edited Nov 12 '18 at 5:31)

  • Look at the Pool classes in the docs
    – James K Polk, Nov 12 '18 at 5:11

  • Possible duplicate of How to process a list in parallel in Python?
    – U9-Forward, Nov 12 '18 at 5:13

  • I don't think it is a duplicate of that question, as I am not asking how to fork a process. Anyway, Pool is probably the solution to my problem. Thanks, James!
    – Chong Lip Phang, Nov 12 '18 at 5:17

2 Answers

I think your problem might be solved if you split the "for" loop so that the join calls come after all of the processes have been started. Right now you start one forked process, wait for its result to come back, and only then fork the next one, so no two tasks ever run in parallel.



procs = []
for e in arr:
    pr = Process(target=execute, args=(q, e))
    pr.start()
    procs.append(pr)

# join only after every process has been started
for pr in procs:
    pr.join()


Or just go with Pool and its map functions.
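A minimal sketch of that Pool-plus-map idea, assuming a hypothetical single-argument variant of execute that returns its result instead of writing to the shared queue q from the question; with eight workers, at most eight tasks run at once and the next item is handed out as soon as a worker frees up:

from multiprocessing import Pool

def execute_one(e):
    # hypothetical single-argument stand-in for execute(q, e):
    # it returns its result instead of putting it on a queue
    return e * e

if __name__ == "__main__":
    arr = range(10000)
    # Pool(8) keeps at most 8 worker processes alive; map() dispatches the
    # next item to whichever worker becomes free
    with Pool(8) as pool:
        results = pool.map(execute_one, arr)
    print(len(results))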






– ytamer, answered Nov 12 '18 at 5:22

  • I am going for Pool.
    – Chong Lip Phang, Nov 12 '18 at 5:27

For Pool to work here, I need to call get() on each result too.



from multiprocessing import Pool

pl = []
pool = Pool(8)
for e in arr:
    pl.append(pool.apply_async(execute, (e,)))
for res in pl:
    res.get()  # blocks until that task has finished
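If the original execute(q, e) signature is kept, note that a plain multiprocessing.Queue cannot be passed to pool workers, but a Manager queue can. A rough sketch of that variant, using a placeholder task body rather than the real execute:

from multiprocessing import Pool, Manager

def execute(q, e):
    # placeholder body standing in for the real task from the question
    q.put(e * e)

if __name__ == "__main__":
    arr = range(10000)
    with Manager() as manager:
        q = manager.Queue()  # Manager queues can be shared with pool workers
        with Pool(8) as pool:  # never more than 8 processes at a time
            handles = [pool.apply_async(execute, (q, e)) for e in arr]
            for h in handles:
                h.get()  # wait for each task; re-raises any worker exception
        results = [q.get() for _ in range(len(handles))]
        print(len(results))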





– Chong Lip Phang, answered Nov 12 '18 at 6:02
