
Solved

Is there no way to uniquely identify a repeater field entry?

  • I have a weird use case that goes as follows.

    I’m using the Google Map field to generate lat/lng values. These values are used in two ways. First, they’re used in the regular way: they’re returned in relation to their parent post and injected into a front-end Google Maps component for display. That bit is easy.

    But I’m also hooking into acf/save_post to take the lat/lng values and copy them into a secondary table, where they can be transformed into POINT spatial values so I can do geometric calculations on them. In a perfect world I would just have three columns: a primary key index, the parent post ID, and the coords encoded as a POINT() value, which I could query with spatial tests to get a list of post IDs to load.

    But this gets much more complicated when it comes to having multiple locations attached to one post, because there doesn’t appear to be any way of uniquely identifying which rows are already there, only how many there are.

    My first thought was to add a second column as a sort of “subID”, populated with the array index of the field as it was returned, and then apply a MySQL unique constraint across the Post_ID and subID columns. But that doesn’t cover the case where a user deletes a repeater entry: let’s say there are 5 entries in the repeater and the user deletes entry 3. Entry 4 becomes entry 3, entry 5 becomes entry 4, and I’d have to explicitly delete the highest subID for that post ID in a separate query. This is getting very complicated for what should be a simple action.

    My burned earth fallback: whenever the acf/save_post hook is called with the repeater field present, just delete all rows with that post ID and then put them all back in again, which will ensure absolute parity between the fields in the wp_postmeta table and my custom table. But there has to be a better way.

    Here is my current code, which works fine with only one location per post:

    add_action('acf/save_post', 'geo_write_GIS', 11);

    function geo_write_GIS($post_id)
    {
        // Fetch all ACF fields for this post once.
        $fields = get_fields($post_id);

        if (isset($fields['location'])) {
            global $wpdb;

            $lat = $fields['location']['lat'];
            $lng = $fields['location']['lng'];
            $wkt = "POINT ($lat $lng)";

            $table = $wpdb->prefix . '_geo';

            // Insert the point, or update it if a row for this post already exists.
            $wpdb->query(
                $wpdb->prepare(
                    "INSERT INTO $table (ID, pg)
                     VALUES (%d, ST_PointFromText(%s, 4326))
                     ON DUPLICATE KEY UPDATE pg = ST_PointFromText(%s, 4326)",
                    array($post_id, $wkt, $wkt)
                )
            );
        }
    }
    
  • Hi @lfleming

    What about this: when initially adding the data, you store a Unix timestamp, and when acf/save_post runs, this is copied across to your other table.

    You might then be able to use that as the unique identifier rather than a row ID.

    My burned earth fallback: whenever the acf/save_post hook is called with the repeater field present, just delete all rows with that post ID and then put them all back in again, which will ensure absolute parity between the fields in the wp_postmeta table and my custom table. But there has to be a better way.

    When dealing with copying repeaters your “burned earth” method is usually what I do.

    A unique ID field for each repeater row, like @jarvis suggested, could be used. Create another field in the repeater to contain the unique ID, then use an acf/prepare_field filter to make the field read only and to generate a value for it if it does not already have one. But this won’t solve the problem of figuring out what you need to delete from the other table.
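    A minimal sketch of that filter, assuming a repeater sub field named row_uid (that name and the ID format are placeholders, not anything from this thread):

    ```php
    <?php
    // Pure helper: generate a unique ID for a new repeater row.
    // The 'loc_' prefix is arbitrary.
    function geo_generate_row_uid(): string
    {
        return uniqid('loc_', true);
    }

    // Guarded so the sketch can be read outside WordPress too.
    if (function_exists('add_filter')) {
        // acf/prepare_field runs when the field is rendered in the editor.
        add_filter('acf/prepare_field/name=row_uid', function ($field) {
            if (empty($field['value'])) {
                $field['value'] = geo_generate_row_uid(); // generate once
            }
            $field['readonly'] = true; // visible but not editable; readonly inputs still submit
            return $field;
        });
    }
    ```

    A readonly (rather than disabled) input is the important detail: the browser still submits it, so the generated ID is saved with the row.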

    To avoid the burned earth method you need to use a priority < 10 on the acf/save_post action, then compare what is in $_POST['acf'] (the new values) with what is returned by get_field()/get_sub_field(), one row at a time, to see what has changed. This is why, unless there is some pressing reason not to, I use the burned earth method.
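    A sketch of that comparison, assuming each row carries a unique ID as described above. The repeater name 'locations' and the field keys are placeholders; in $_POST['acf'], repeater rows are keyed by ACF field keys, not field names.

    ```php
    <?php
    // Pure helper: given the row IDs stored before the save and the IDs in the
    // incoming submission, return the IDs whose rows the user deleted.
    function geo_removed_uids(array $old_uids, array $new_uids): array
    {
        return array_values(array_diff($old_uids, $new_uids));
    }

    if (function_exists('add_action')) {
        add_action('acf/save_post', function ($post_id) {
            // Rows still in the database (ACF has not written the new values yet).
            $old_rows = get_field('locations', $post_id) ?: [];
            $old_uids = array_column($old_rows, 'row_uid');

            // Incoming rows; 'field_xxx' stands in for the real field keys.
            $new_rows = $_POST['acf']['field_xxx_locations'] ?? [];
            $new_uids = array_column($new_rows, 'field_xxx_row_uid');

            foreach (geo_removed_uids($old_uids, $new_uids) as $uid) {
                // Delete the matching row from the secondary table here.
            }
        }, 5); // priority < 10: runs before ACF saves the submitted values
    }
    ```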

  • Brute force clearing by Post ID it is then. Thanks.
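    For the record, a sketch of what that delete-then-reinsert version might look like. The repeater name 'locations', the post_id column, and the table layout are assumptions extrapolated from the single-location code earlier in the thread.

    ```php
    <?php
    // Pure helper: encode a lat/lng pair as WKT. %F forces a locale-independent
    // decimal point, which matters inside SQL.
    function geo_point_wkt(float $lat, float $lng): string
    {
        return sprintf('POINT(%F %F)', $lat, $lng);
    }

    if (function_exists('add_action')) {
        add_action('acf/save_post', function ($post_id) {
            global $wpdb;

            $rows = get_field('locations', $post_id);
            if (!is_array($rows)) {
                return;
            }

            $table = $wpdb->prefix . 'geo';

            // "Burned earth": wipe everything for this post, then reinsert,
            // so the custom table always mirrors wp_postmeta exactly.
            $wpdb->query($wpdb->prepare("DELETE FROM $table WHERE post_id = %d", $post_id));

            foreach ($rows as $row) {
                $wpdb->query($wpdb->prepare(
                    "INSERT INTO $table (post_id, pg) VALUES (%d, ST_PointFromText(%s, 4326))",
                    $post_id,
                    geo_point_wkt((float) $row['lat'], (float) $row['lng'])
                ));
            }
        }, 11); // after ACF has written the new repeater values
    }
    ```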
