I need to run the Batch API over a huge dataset. Specifically, I have a custom table with roughly 5 million rows that I need to iterate over. The batch process works fine with small numbers of rows, but with the full set I run out of memory before the batch even starts, because I am loading a very large array of IDs (think a 5-million-item array) to pass into the batch operations.

What is a good way to handle this? Break the set into smaller chunks somehow before passing it to the Batch API? Increase the memory limit to some ungodly amount?
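For reference, here is a simplified sketch of what I'm doing now. The module, table, and function names are placeholders, but the shape is the same: every ID is queried up front and one batch operation is queued per row, which is where the memory blows up.

```php
<?php

/**
 * Builds and starts the batch. This is where the memory problem occurs:
 * every ID in the table is loaded into one giant array before the batch
 * even begins.
 */
function mymodule_start_batch() {
  // Loads ~5 million IDs into memory at once.
  $ids = db_query('SELECT id FROM {mymodule_data}')->fetchCol();

  // One operation per row, so $operations ends up with ~5 million entries.
  $operations = array();
  foreach ($ids as $id) {
    $operations[] = array('mymodule_process_row', array($id));
  }

  $batch = array(
    'title' => t('Processing rows'),
    'operations' => $operations,
    'finished' => 'mymodule_batch_finished',
  );
  batch_set($batch);
}

/**
 * Batch operation callback: processes a single row.
 */
function mymodule_process_row($id, &$context) {
  // ... do the actual work on this row ...
  $context['results'][] = $id;
}
```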