[wp-hackers] get_users() exhausts memory with 'all_with_meta' argument

Jeremy Clarke jer at simianuprising.com
Fri Jan 3 18:15:17 UTC 2014

FWIW the line where you increase memory limits isn't a good solution, as
each affected user will need to get in touch with you after your plugin
breaks their site, and the RAM usage will hurt everyone on their server if
it's shared hosting (assuming their host even tolerates increasing the
memory usage beyond a certain point).

The question is interesting because it really applies to any plugin that
might want to fetch all users with meta. Unless we know 100% that there
will never be more than $x users on the site a plugin runs on, we should
probably all have some kind of inherent limiting built into "all users"
queries.

I'd approach this in one of two ways:

*If one actually only needs to operate on some of the users:*

Ideally you should find a way to limit the list of users you are fetching
BEFORE you retrieve them with their full meta contents. Maybe you can add
other query parameters to get_users that will reduce the total number of
users returned, which would mean less bloat from filling up the meta cache.

The big concern with that solution is that it's easy to end up swapping a
memory-consuming meta cache for a time-consuming SQL query (I bet user
queries based on meta fields are as slow as post queries based on meta
fields).
Another way of doing this would be to get all users without meta, loop
through them and test something non-meta-related to filter out the ones
that aren't necessary, then pass the remaining user ids as the 'include'
parameter of a second get_users call that fetches all meta. Queries with an
'include' parameter full of IDs are usually so fast they're basically
irrelevant, so the SQL penalty of doing two queries shouldn't be a concern.
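That two-pass approach might look something like this (a sketch that
assumes a WordPress context; the email-domain test is just a placeholder
for whatever non-meta condition you actually care about):

    // Pass 1: fetch user objects without priming the meta cache.
    $all_users = get_users( array( 'fields' => 'all' ) );

    // Filter on something that doesn't require meta.
    $ids = array();
    foreach ( $all_users as $user ) {
        if ( false !== strpos( $user->user_email, '@example.com' ) ) {
            $ids[] = $user->ID;
        }
    }

    // Pass 2: fetch only the survivors, this time with their meta.
    $targets = get_users( array(
        'include' => $ids,
        'fields'  => 'all_with_meta',
    ) );

One caveat: an empty 'include' is treated as "no restriction", so guard
against $ids being empty before making the second call, or you'll fetch
everyone with meta after all.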

Of course all that might not be an option for your plugin, but generally
speaking it's probably a best practice when dealing with "all users" on a
WP site.

*If one truly needs to do an operation on the meta of all users:*

Do as Nikola indicated and make this a batch process that handles ~500
users at a time: run a loop that limits each query to 500 users and uses
an offset that increases by 500 each round.
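In code that might look something like the following (again assuming a
WordPress context; the batch size and the process_user() callback are
placeholders for your own values and logic):

    $batch_size = 500;
    $offset     = 0;

    do {
        $batch = get_users( array(
            'number' => $batch_size,
            'offset' => $offset,
            'fields' => 'all_with_meta',
        ) );

        foreach ( $batch as $user ) {
            process_user( $user ); // your per-user operation
        }

        $offset += $batch_size;
    } while ( count( $batch ) === $batch_size );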

Note that because the data is being saved to the WP object cache these
loops will add up over the course of the pageload, so you'll probably want
to use AJAX requests or something to handle the batches as separate
requests (i.e. so that after each loop the object cache gets cleared and
you don't end up with one process that holds the full objects of all users
in memory).
Another option is to investigate the possibility of clearing each user from
the object cache after they've been "processed" by your script. Not sure
how you'd do that or if it's possible/feasible, but if you could do that
then you could skip the AJAX step, as each round of the 500-user-loop would
free up memory for the next loop.

A quick look at the siblings of wp_cache_get (/wp-includes/cache.php)
implies that wp_cache_delete() might do the trick.
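A sketch of that cleanup inside the batch loop (the cache group names
'users' and 'user_meta' match what core uses at the time of writing, but
double-check them against your WP version; core's clean_user_cache() may
also do the job):

    foreach ( $batch as $user ) {
        process_user( $user ); // your per-user operation

        // Evict this user from the object cache so the loop doesn't
        // keep every processed user's data in memory.
        wp_cache_delete( $user->ID, 'users' );
        wp_cache_delete( $user->ID, 'user_meta' );
    }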

Good luck :)

Jeremy Clarke • jeremyclarke.org
Code and Design • globalvoicesonline.org
