Performance issues on saving – repeater

  • Hi,

    I have serious performance issues with a repeater field in wp-admin. The frontend runs fine. The repeater field contains 7 sub-fields (all text). With more than 100 entries, saving becomes really slow (over 30 seconds).
    Do you have any idea how to speed up the saving procedure?

    Thanks!

  • I have been puzzling this out myself for a bit. I have a repeater field and each row of the repeater can have about 50 fields, many of them WYSIWYG fields. Some of the posts have as many as 20 rows. This monster can actually time out when you save a post. If I'd known this would happen I would have designed it differently.

    The problem is not with ACF per se. The problem is that WP can only insert or update a single custom field at a time. When you have a couple of thousand fields that need to be updated, it becomes ridiculously slow. I've been digging around in WP looking for a solution. What I'd like to do is store all the updates and inserts and do them all at once, but $wpdb does not seem to have any way of running multiple queries in one request.
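    Here is a rough sketch of what I mean by doing it all at once, assuming the key/value pairs for one post have already been collected. my_bulk_update_post_meta() is a hypothetical helper and it deliberately bypasses update_post_meta(), so no per-field hooks fire and no revisions are touched; it's illustrative only, not something ACF could just drop in:

        // Hypothetical helper: write many meta values for one post with two
        // queries instead of thousands. Bypasses update_post_meta(), so any
        // hooks attached to meta updates will not fire.
        function my_bulk_update_post_meta( $post_id, array $pairs ) {
            global $wpdb;

            $keys = array_keys( $pairs );

            // Remove the old rows for these keys in a single DELETE.
            $in = implode( ',', array_fill( 0, count( $keys ), '%s' ) );
            $wpdb->query( $wpdb->prepare(
                "DELETE FROM {$wpdb->postmeta} WHERE post_id = %d AND meta_key IN ($in)",
                array_merge( array( $post_id ), $keys )
            ) );

            // Re-insert everything with one multi-row INSERT.
            $rows   = array();
            $values = array();
            foreach ( $pairs as $key => $value ) {
                $rows[] = '(%d, %s, %s)';
                array_push( $values, $post_id, $key, maybe_serialize( $value ) );
            }
            $wpdb->query( $wpdb->prepare(
                "INSERT INTO {$wpdb->postmeta} (post_id, meta_key, meta_value) VALUES " . implode( ',', $rows ),
                $values
            ) );

            // The meta cache no longer matches the table, so flush it for this post.
            wp_cache_delete( $post_id, 'post_meta' );
        }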

  • Hi folks,

    have you had any success solving this issue, or does anyone have any workarounds? We're having this same problem on two different sites. Both sites are hosted on Heroku, where there's a hard-coded timeout of 30 seconds. If the request takes longer than that, the client receives an application error. Although the request continues to do its job, this makes page editing very frustrating for our editors.

    I profiled the code a bit with a dump from our production database, and the culprit seems to be the generation of a new revision of the page when the existing page is saved. We could disable revisioning, but that's a feature we rely on (there's a sketch of a per-post-type compromise at the end of this post).

    On my local environment the issue isn't so bad. One page that times out on production takes about 8–9 seconds to save on my machine. So issuing thousands of SQL queries over a not-so-good database connection isn't a great combination.

    We’re using ACF PRO (versions 5.3.6.1 and 5.3.7).

    Thanks!
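
    If it turns out that the revision copy really is the expensive part, one option we're considering short of disabling revisions everywhere is WordPress's wp_revisions_to_keep filter, which can turn revisions off only for the post type whose saves are slow. A minimal sketch, with 'heavy_page' standing in for whatever post type is affected; this only makes sense if you can live without revisions on that particular type:

        // Keep revisions everywhere except the post type whose ACF fields make
        // saves slow. 'heavy_page' is a placeholder for your own post type.
        add_filter( 'wp_revisions_to_keep', function ( $num, $post ) {
            if ( 'heavy_page' === $post->post_type ) {
                return 0; // no revision post, so no duplicated ACF meta either
            }
            return $num;
        }, 10, 2 );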

  • The only workaround I found for this wasn't really a workaround: I explained that it's a limitation and got my client to be patient. Like you said, the update process continues after the timeout. The client then clicks the back button in the browser after the timeout and refreshes the page. Until something changes in WP so that meta values can be updated all at once instead of one at a time, I don't know that there will be a solution for pages with large numbers of fields, other than keeping this in mind and figuring out ways to build sites without hundreds of fields on a single page.

  • Thanks John,

    that’s probably what we’re going to do until we figure out a “real” solution.

  • Hey, we found no solution or workaround.
    We split the longest page into multiple WordPress pages and aggregated the data on the frontend. Only saving changes is very slow; reading from the DB is not. 🙂

  • As a follow-up on this (1.5 years later), I too am experiencing slow saves on a repeater with roughly 150 rows and about 5 fields in each. I've read that some plugins out there save all custom field data to a single post meta field as a serialized array, but I can't find anything explaining whether that's even possible with ACF PRO. I assume no one has figured out a better way to handle this yet?

  • Some other plugins might save all values as a single value in a serialized array.

    This would make it impossible to query posts based on these meta values. By saving them as individual fields, ACF avoids this issue for most field types. If everything were stored in one meta field, it would be impossible for ACF to offer any functionality to replace this, and it would require much more work on the part of developers to make a field searchable. If ACF switched to this model it would be for "display data" only, instead of something that can be used to build applications. You can't search and filter by values in "the content", and one big serialized field would have the same limitation.

    Also, it would be impossible to get at the values cleanly if the plugin were deactivated. While it's not highlighted in the documentation, it is possible to use built-in WP functions to get the unformatted field values, so you can build themes that will continue to work should ACF be deactivated (there's a sketch at the end of this post). Most people don't do this and use the ACF functions to get values, but I think this was a consideration.

    From what I know, the developer's goal is to extend WP in a way that uses and builds on built-in WP functions. ACF does not do anything that we could not do ourselves by following the WP documentation and using standard WP functions, filters and actions.
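
    To make both points concrete: because each value sits in its own row of wp_postmeta, a normal meta_query works against ACF fields (here 'price' is a hypothetical field name on a 'product' post type):

        // Only possible because ACF stores each field as its own postmeta row.
        $query = new WP_Query( array(
            'post_type'  => 'product',
            'meta_query' => array(
                array(
                    'key'     => 'price',
                    'value'   => 100,
                    'compare' => '<=',
                    'type'    => 'NUMERIC',
                ),
            ),
        ) );

    And a repeater saved by ACF can be read back with nothing but get_post_meta(), so the data survives the plugin being deactivated. A rough sketch, assuming a repeater named 'scores' with 'player' and 'points' sub-fields (hypothetical names); ACF PRO stores the row count under the repeater's own meta key and each sub-field under "{repeater}_{row}_{subfield}":

        // Read an ACF repeater with plain WP functions.
        function my_get_scores( $post_id ) {
            $rows  = array();
            $count = (int) get_post_meta( $post_id, 'scores', true );

            for ( $i = 0; $i < $count; $i++ ) {
                $rows[] = array(
                    'player' => get_post_meta( $post_id, "scores_{$i}_player", true ),
                    'points' => get_post_meta( $post_id, "scores_{$i}_points", true ),
                );
            }
            return $rows;
        }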

  • John,

    Thanks very much for the reply. Your explanation of why the backend fields cannot be saved as a serialized array makes much more sense to me now, so I appreciate you taking the time to explain. I am familiar with calling the post meta fields without the ACF functions on the front end, and I tend to follow that pattern on most custom sites I build, in case a client ever accidentally or unintentionally disables or removes ACF.

  • I thought that might help others as well. I know that this problem can be a source of frustration when it is encountered for the first time. I've been there. I've looked at many possible solutions. Unfortunately, the solutions are complex, especially when dealing with repeaters and flexible content fields, which are the source of the problem to begin with.

    On a positive note, with the coming of Gutenberg, there are going to be some changes to how WP handles meta values. While in general I'm not overly excited about the new editor, because I cannot use it for the clients I work for, from my reading it seems that this issue may be alleviated a bit by the changes that will be made to tie meta fields into the new editor. On the other hand, if you've read anything about this, the exact solution is far from clear, and there is nothing yet stating how the new system will work.

  • I think part of the issue is that it saves (updates) all rows every time the post is saved; it doesn't simply update the ones that actually need it (new and modified rows).
    If you could somehow hack the save function so it would only update what's needed instead of every unmodified row, it wouldn't take that long to save the post. Is this possible or am I daydreaming?
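    One thing that might be worth experimenting with is the acf/pre_update_value filter, which can short-circuit an individual field's save. A rough sketch of the "only write what changed" idea; it assumes regular posts (not options pages), relies on that filter behaving as in current ACF PRO releases, and would need careful testing with repeater sub-fields before trusting it:

        // Skip the database write when the submitted value is identical to what
        // is already stored. Returning a non-null value from acf/pre_update_value
        // tells ACF the update has been handled. Sketch only: options pages and
        // some field types may need extra care.
        add_filter( 'acf/pre_update_value', function ( $check, $value, $post_id, $field ) {
            $existing = get_post_meta( $post_id, $field['name'], true );

            if ( $existing !== '' && $existing == $value ) {
                return true; // unchanged: nothing to write
            }
            return $check; // null: let ACF save as usual
        }, 10, 4 );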

  • I know this topic is years old but I was just wondering whether anyone had found any solutions to this?

    I’m currently using WordPress/ACF for a digital treasure trail PWA/webapp. I have a global ‘Scores’ repeater on an options page that adds a new row (with 16 basic number or text fields) each time a player completes a trail.

    At present the Options page/repeater has about 450 rows and it’s becoming really slow when executing save_post.

  • For my own part, I went ahead and switched the way I was saving/loading the data. Would it make sense to instead save each player's completion to a post in a custom post type? That way, each save only writes those 16 fields, and each load only has to loop through the posts of that type.
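    Something like this is what I mean; 'trail_score' and the meta keys are hypothetical names. Each completed trail becomes its own post, so a save touches ~16 meta rows instead of rewriting a 450-row repeater:

        // One custom post per completed trail instead of one repeater row.
        add_action( 'init', function () {
            register_post_type( 'trail_score', array(
                'label'    => 'Scores',
                'public'   => false,
                'show_ui'  => true,
                'supports' => array( 'title', 'custom-fields' ),
            ) );
        } );

        // Called when a player finishes a trail; $fields holds the ~16 values.
        function my_record_completion( $player_name, array $fields ) {
            $post_id = wp_insert_post( array(
                'post_type'   => 'trail_score',
                'post_status' => 'publish',
                'post_title'  => $player_name,
            ) );

            foreach ( $fields as $key => $value ) {
                update_post_meta( $post_id, $key, $value );
            }
            return $post_id;
        }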

  • @superpotato Actually, due to the way update_post_meta() works, only fields that have new values are updated, but WP still does a query for every field to check whether it needs to be updated. This is the main issue: WP may actually be performing multiple queries for every field it saves.


    @peterhintondesign
    I would probably have used a CPT for this rather than a repeater if I had known ahead of time that the rows would grow to this level.

    There is no quick fix. The only real solution is to understand the problem and design around it so it does not cause issues. For older sites that have this problem I have created a workaround for the timeout issue, which is to detect potential timeouts, intercept them, let the save complete and then send the user on their way after that: https://github.com/Hube2/acf-prevent-timeouts/
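
    Separately from that plugin, a crude mitigation for a PHP-side timeout (it won't help with a proxy limit like Heroku's 30-second cut-off) is to let the save run past max_execution_time so it at least finishes even if the browser gives up. A sketch, assuming your host honours set_time_limit(); hooking acf/save_post at an early priority runs before ACF writes the field values:

        // Give the save as much time as it needs and keep going even if the
        // client disconnects. Mitigates the symptom only, not the slowness.
        add_action( 'acf/save_post', function ( $post_id ) {
            ignore_user_abort( true );
            set_time_limit( 0 );
        }, 1 );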

  • Thanks for the quick replies Chris & John!

    I’ll have a think about the problem from another angle, specifically looking at CPTs.

    Fortunately the project is still relatively young and I’m in a position where I can change approach relatively easily.

    I was hoping that, while there will likely be a large number of rows, the data would be basic enough for this not to be an issue overall when using save_post.


The topic ‘Performance issues on saving – repeater’ is closed to new replies.