@chrisle
Last active August 29, 2015 13:57
Is this what you mean?
You have ...
[1,2,3,4,5,6,7,8,9]
[1,2,3,4,5,6,7]
[1,2,3]
You want in chunks of N (say 3)
[
[1,2,3],
[4,5,6],
[7,8,9],
[1,2,3],
[4,5,6],
[7,1,2],
[3]
]
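For the straight re-chunking described above, a minimal sketch using PHP's built-in array_merge and array_chunk (the input arrays are the example ones):

```php
<?php

// Flatten the example arrays, then re-split into chunks of N = 3.
$input = array(
  array(1, 2, 3, 4, 5, 6, 7, 8, 9),
  array(1, 2, 3, 4, 5, 6, 7),
  array(1, 2, 3),
);
$flat   = call_user_func_array('array_merge', $input);
$chunks = array_chunk($flat, 3);
print_r($chunks);
// 19 elements -> six chunks of 3 and a final array(3), as shown above.
```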
@eppiev

eppiev commented Mar 11, 2014

Close -- I'm trying to make batch scraping more efficient. I actually don't want to split any array (URLs for a single domain) into smaller groups, since splitting could have me inadvertently taking down a server. I just want to combine small ones into bigger groups.

Assume my target size is 5 and I have:

[1,2,3,4,5,6,7,8,9]
[1,2,3,4,5,6,7]
[1,2,3,4]
[1,2,3]
[1,2]
[1]

Anything size 5 and above gets ignored for these purposes -- it's already at optimal size. Small groups are not desirable because they create inefficiency in scaling up workers.

So, I'd want to combine them together.

Starting from the bottom and adding up would give me:
[
[1],
[1,2]
],
[
[1,2,3]
],
[
[1,2,3,4]
]
plus the groups of size 5 and above that we ignored.

That's better than what we started with, but it's not optimal. Ideally, I'd get:
[
[1],
[1,2,3,4]
],
[
[1,2],
[1,2,3]
]

Now I've got two groups that both have 5 total elements, plus the size-5-and-above groups we ignored to start.
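The grouping described above can be sketched as first-fit decreasing over the group *sizes*: pass through anything at or above the target, sort the rest largest-first, and drop each into the first bin that still has room. This is only a heuristic sketch (the function name combine_groups is made up), not necessarily how the thread's final code does it:

```php
<?php

// First-fit decreasing on group sizes, capacity $target.
// Groups at or above the target pass through as their own batch.
function combine_groups(array $groups, $target) {
  $ready = array();
  $small = array();
  foreach ($groups as $g) {
    if (count($g) >= $target) {
      $ready[] = array($g);   // already big enough; leave it alone
    } else {
      $small[] = $g;
    }
  }
  // Largest-first tends to pack tighter than arrival order.
  usort($small, function ($a, $b) { return count($b) - count($a); });
  $bins = array();            // each bin: 'free' capacity + its 'groups'
  foreach ($small as $g) {
    $placed = false;
    foreach ($bins as &$bin) {
      if (count($g) <= $bin['free']) {
        $bin['free']    -= count($g);
        $bin['groups'][] = $g;
        $placed = true;
        break;
      }
    }
    unset($bin);
    if (!$placed) {
      $bins[] = array('free' => $target - count($g), 'groups' => array($g));
    }
  }
  foreach ($bins as $bin) {
    $ready[] = $bin['groups'];
  }
  return $ready;
}

$example = array(
  array(1, 2, 3, 4, 5, 6, 7, 8, 9),
  array(1, 2, 3, 4, 5, 6, 7),
  array(1, 2, 3, 4),
  array(1, 2, 3),
  array(1, 2),
  array(1),
);
$batches = combine_groups($example, 5);
print_r($batches);
```

On the sizes above (9, 7, 4, 3, 2, 1) this yields the two untouched big groups plus two combined batches of exactly 5 elements each: {4, 1} and {3, 2}.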

@chrisle

chrisle commented Mar 11, 2014

This is pretty much the bin packing problem.
Let me see...

@chrisle

chrisle commented Mar 11, 2014

<?php

/**
 * Make some fake data: 10 arrays, each filled with a random
 * number of URLs (between 1 and 10).
 */
$urls_to_split_up = array();
for ($i = 0; $i < 10; $i++) {
  $random_number = rand(1, 10);
  $urls = array();
  for ($n = 0; $n < $random_number; $n++) {
    array_push($urls, "http://www." . rand(1, time()) . ".com");
  }
  array_push($urls_to_split_up, $urls);
}

/**
 * Based on http://bastian.rieck.ru/uni/bin_packing/bin_packing.pdf
 * (bin-packing heuristics with a remaining-capacity lookup table).
 *
 * The scan below puts each group into the first bin that still has room
 * (first-fit). Sorting the groups largest-first packs tighter, but the
 * sort costs O(n log n) up front; this one-pass version keeps the groups
 * in arrival order.
 */

// Set your target bin size here. Despite the variable name, this is the
// maximum capacity per bin; anything bigger than this is not processed.
$MINIMUM_BATCH_SIZE = 5;

// 1. Initialize a lookup table and storage
$bin_lookup = array_fill(0, count($urls_to_split_up), $MINIMUM_BATCH_SIZE);
$bin_store  = array_fill(0, count($urls_to_split_up), array());

// 2. For each object ...
foreach ($urls_to_split_up as $urls) {

  // 3. Let W be the current object size
  $size_of_array = count($urls);

  // 4. Search for a suitable bin with remaining capacity (first-fit)
  if ($size_of_array <= $MINIMUM_BATCH_SIZE) {
    for ($i = 0; $i < count($bin_lookup); $i++) {
      if ($size_of_array <= $bin_lookup[$i]) { break; }
    }

    // Actually store the URLs and subtract remaining capacity
    $bin_lookup[$i] = $bin_lookup[$i] - $size_of_array;
    array_push($bin_store[$i], $urls);

  }

  /**
   * Uncomment these lines if you want to also process URL arrays bigger
   * than your target size
   */

  /*
  else {
    for ($i = 0; $i < count($bin_lookup); $i++) {
      if ($bin_lookup[$i] == $MINIMUM_BATCH_SIZE) { break; }
    }
    $bin_lookup[$i] = $bin_lookup[$i] - $size_of_array;
    array_push($bin_store[$i], $urls);
  }
  */
}

// Remove unused bins
$bin_store = array_values(array_filter($bin_store));

// Display results.
print_r($bin_store);
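One quick sanity check on output like $bin_store: no bin's total URL count should ever exceed the target. A small sketch (the helper name bins_within_capacity is made up):

```php
<?php

// Hypothetical helper: true if every bin's total element count
// is at most $capacity.
function bins_within_capacity(array $bins, $capacity) {
  foreach ($bins as $bin) {
    $total = 0;
    foreach ($bin as $group) {
      $total += count($group);
    }
    if ($total > $capacity) {
      return false;
    }
  }
  return true;
}

// e.g. bins_within_capacity($bin_store, $MINIMUM_BATCH_SIZE)
$demo = array(
  array(array('a', 'b', 'c', 'd'), array('e')),   // 4 + 1 = 5
  array(array('f', 'g'), array('h', 'i', 'j')),   // 2 + 3 = 5
);
var_dump(bins_within_capacity($demo, 5));  // bool(true)
```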

Example Results

Array
(
    [0] => Array
        (
            [0] => Array
                (
                    [0] => http://www.696358599.com
                    [1] => http://www.1101237996.com
                    [2] => http://www.39990498.com
                    [3] => http://www.808152997.com
                )

            [1] => Array
                (
                    [0] => http://www.104638105.com
                )

        )

    [1] => Array
        (
            [0] => Array
                (
                    [0] => http://www.659977343.com
                    [1] => http://www.528424096.com
                )

            [1] => Array
                (
                    [0] => http://www.1018981549.com
                    [1] => http://www.343493509.com
                    [2] => http://www.1037396543.com
                )

        )

    [2] => Array
        (
            [0] => Array
                (
                    [0] => http://www.306146048.com
                    [1] => http://www.1275939888.com
                    [2] => http://www.1141004911.com
                    [3] => http://www.1389901513.com
                    [4] => http://www.46297440.com
                )

        )

    [3] => Array
        (
            [0] => Array
                (
                    [0] => http://www.904192341.com
                    [1] => http://www.275948608.com
                    [2] => http://www.1146600584.com
                    [3] => http://www.610914222.com
                )

        )

    [4] => Array
        (
            [0] => Array
                (
                    [0] => http://www.560237467.com
                    [1] => http://www.878781689.com
                    [2] => http://www.975916448.com
                )

        )

)
