Forum Discussion

Wendy_S's avatar
9 years ago

Any way to change URLs in posts to a new URL from the backend/database via an SQL statement or similar?

Hi All.

 

Is there any way to bulk-change URLs through a backend process?
For example, if posts contain links and those links have changed due to a tool change.

What needs to happen is to take the docid from the old URL and append it to a new URL.

This could be done manually, of course, but if you're looking at a volume of 50,000 one has to wonder whether there is a way to do this on the backend. Today I can search for the old URL string and find the results in the posts, so I am sure there must be a search/replace statement that can be run on the backend?

Any Database/table guru out there?

Thanks for brainstorming with me!
Wendy

 

  • Hi Wendy,

     

Lovely task you are outlining there. Theoretically SQL has a REPLACE() function to replace substrings in one call, but unfortunately that never made it into LiQL. Arguably it's not the most common activity on a community dataset :)

     

    You would need to approach this in a script with three steps:

    1. Pull all matching messages and their IDs via a SELECT id, body FROM messages WHERE body MATCHES 'yourstring' call.
    2. For each message retrieved in the first step, perform a substring replace.
    3. Write the updated message back via an update call.
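    The three steps above could be sketched roughly like this in Python. The community URL, session key, endpoint paths, and both base URLs are assumptions for illustration; check the API v2 documentation for the exact calls and authentication your community uses:

    ```python
    import json
    import re
    import urllib.parse
    import urllib.request

    # Hypothetical values -- replace with your own community URL and credentials.
    COMMUNITY = "https://community.example.com"
    SESSION_KEY = "your-session-key"

    OLD_BASE = "https://oldtool.example.com/doc?docid="
    NEW_BASE = "https://newtool.example.com/view/"

    def rewrite_body(body: str) -> str:
        """Step 2: replace each old-tool link, keeping the docid that follows it."""
        pattern = re.escape(OLD_BASE) + r"(\w+)"
        return re.sub(pattern, lambda m: NEW_BASE + m.group(1), body)

    def find_messages(search: str):
        """Step 1: pull id and body of matching messages via a LiQL search call."""
        liql = f"SELECT id, body FROM messages WHERE body MATCHES '{search}'"
        url = (f"{COMMUNITY}/api/2.0/search?q=" + urllib.parse.quote(liql)
               + "&restapi.session_key=" + SESSION_KEY)
        with urllib.request.urlopen(url) as resp:
            return json.load(resp)["data"]["items"]

    def update_message(msg_id: str, new_body: str) -> None:
        """Step 3: write the updated body back to the message resource."""
        req = urllib.request.Request(
            f"{COMMUNITY}/api/2.0/messages/{msg_id}"
            + "?restapi.session_key=" + SESSION_KEY,
            data=json.dumps({"data": {"type": "message",
                                      "body": new_body}}).encode(),
            headers={"Content-Type": "application/json"},
            method="PUT",
        )
        urllib.request.urlopen(req).close()

    if __name__ == "__main__":
        for item in find_messages(OLD_BASE):
            new_body = rewrite_body(item["body"])
            if new_body != item["body"]:
                update_message(item["id"], new_body)
    ```

    I'd test the rewrite function on a handful of message bodies first, and run the whole thing against a stage environment before touching production.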

     

    All the needed LiQL functions and parameters are described in the API v2 documentation for the message resource. This can be done as a tool in almost any scripting language, or realized on-platform via a Lithium custom component.

     

    While you are at it, you might consider future-proofing this approach for the next URL update in 3 years ;) For example, you could move the URLs into a new custom data field that would be easier to update. This would then require updating your message UI to show the link in a separate UI element.
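    To illustrate the future-proofing idea: if the custom field stored only the docid rather than the full link, the base URL would live in exactly one place, and the next tool change would be a one-line edit instead of another bulk rewrite. A hypothetical sketch (names are made up):

    ```python
    # Hypothetical sketch: store only the docid in the custom data field,
    # and build the full link at render time from a single base URL.
    DOC_BASE_URL = "https://newtool.example.com/view/"  # the only line a
                                                        # future tool change
                                                        # would need to touch

    def doc_link(docid: str) -> str:
        """Build the full document URL from a stored docid."""
        return DOC_BASE_URL + docid
    ```

    The message UI component would then call something like doc_link() with the field value instead of rendering a stored URL verbatim.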


    • Wendy_S's avatar
      Wendy_S
      Boss

      Hi Claudius. My favorite Rockstar :-)

       

      Thanks for your thoughts. I have to admit I need to call on help from our developer fuenteso, as I am not an expert on scripting.

      If we develop the script, would we run it ourselves on the backend? How? Or do we require help from Lithium support?

       

      Good tip on the custom data field! We'll keep that one in mind :-) Luckily our links do not (usually) change too often :-)

       

      Have you had to do this yourself sometimes?

      Thanks!
      Wendy

      • Claudius's avatar
        Claudius
        Boss

        Hi Wendy,
        my top fellow European all-star :catvery-happy:

         

        No need to run it on a backend, even. If it's a one-time script, your dev could run it from their computer. If you are using the Lithium API, there is no need to involve Lithium support.

         

        We never had to do it ourselves, as we apply ongoing archiving of old content with cut-offs at 6 months and 2 years.