API Content and User Permissions
Hello,
I am working on a solution to build an external search index, with Khoros being one source of many. I have been trying, unsuccessfully, to use the API to extract all content and users within our community and map their permissions so I can pass that into our index.
Specific Calls:
- authenticating with admin user
- get all nodes
- https://community.anaplan.com/api/2.0/search
- [{ nodes: { fields: ["id", "parent", "depth", "descendants", "root_category", "node_type", "view_href", "roles", "messages", "title", "description"], limit: 1000 } }]
- for each node get roles
- for each node get messages
- get all users
- https://community.anaplan.com/api/2.0/search
- [{ users: { fields: ["id", "first_name", "last_name", "nodes", "email", "roles", "view_href", "messages", "web_page_url", "biography"], limit: 1000 } }]
- for each user get roles
note: for each call I am recursively paging through it until all results are brought back.
note: I have also tried the same approach with the v1 API, with no luck there either.
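For reference, the recursive paging I mention looks roughly like this. This is only a sketch of what I'm doing: it assumes LiQL accepts LIMIT/OFFSET for paging and that auth is passed via a li-api-session-key header (the header name and helper names here are from my own setup, not official documentation):

```python
# Sketch of the paginated LiQL fetch. Assumptions (mine, not confirmed):
# LiQL supports LIMIT/OFFSET paging, and auth goes in a
# "li-api-session-key" header.
import json
import urllib.parse
import urllib.request

BASE = "https://community.anaplan.com/api/2.0/search"

def build_liql(collection, fields, limit=1000, offset=0):
    # e.g. "SELECT id, title FROM nodes LIMIT 1000 OFFSET 0"
    return "SELECT {} FROM {} LIMIT {} OFFSET {}".format(
        ", ".join(fields), collection, limit, offset)

def fetch_all(collection, fields, session_key, limit=1000):
    """Page through a collection until a short page signals the last batch."""
    items, offset = [], 0
    while True:
        url = BASE + "?" + urllib.parse.urlencode(
            {"q": build_liql(collection, fields, limit, offset)})
        req = urllib.request.Request(
            url, headers={"li-api-session-key": session_key})
        with urllib.request.urlopen(req) as resp:
            page = json.load(resp)["data"]["items"]
        items.extend(page)
        if len(page) < limit:
            return items
        offset += limit
```

So fetch_all("nodes", [...]) is one call per 1000 rows, and then each node/user needs its own follow-up calls for roles and messages, which is where the call count explodes.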
Below is a list of issues and questions I have run into; I hope someone here might be able to help me with them.
- Even though I am authenticated as an admin user, I get many 403 Forbidden errors when trying to get roles for both users and nodes.
- From what I can tell, I would need to use the FreeMarker restadmin function. Is there really no way to get that data through the actual API?
- This ends up being many thousands of API calls and taking an extremely long time.
- Sometimes the API times out.
- Even if it didn't, this realistically takes too long to be of much use.
- This leads me to believe there is a better way to leverage LiQL to reduce the number of API calls and the overall time.
- Roles appear to be inherited from the parent node.
- Assuming this is the case, I would need another loop through each node's parents in order to build a realistic map of permissions.
- Based on my second point above, this seems unrealistic to achieve right now.
- I am wondering if there is a way to force LiQL to send back the actual array of data instead of a query object that requires another API call.
- Finally, since I am still learning all of this, I'm not even 100% sure that this approach would actually pull all of the content in my community. Am I missing some other object type that I should also be querying?
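To make the inheritance point concrete, my rough plan for resolving effective roles would be to walk each node's ancestor chain, something like the sketch below. It assumes I have already pulled each node's parent id and directly assigned roles into local dicts (the dict names are mine), and that the hierarchy has no cycles:

```python
# Sketch: resolve a node's effective roles by walking up the parent chain.
# Assumptions (mine): "parents" maps node id -> parent node id (None at the
# root) and "direct_roles" maps node id -> set of roles assigned directly
# on that node; both built from the node queries described above.

def effective_roles(node_id, parents, direct_roles):
    roles = set()
    current = node_id
    while current is not None:
        roles |= direct_roles.get(current, set())
        current = parents.get(current)
    return roles
```

For example, if "board1" sits under "cat1", the board would pick up both its own roles and the category's. But this only works if I can actually read the roles per node, which is exactly where the 403s stop me.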
I would like to reiterate that I am trying to achieve this through the API only, and I would also appreciate a blunt "this is not possible" if that is the case.
Thank you so much for taking the time to even read this.
Hi Kevin,
I don't actually remember that being an issue for us, and I can't find any reference to it in emails or documentation, so you might be surprised.
Having said that, it was fairly early on in our community and we may have made the decision not to worry about historical data, but I can't remember for sure... I have a feeling that Inbenta was able to do a spider crawl of the existing community for historical data.
What I would have a look at, though, is Khoros' Bulk Data API tool - https://developer.khoros.com/khoroscommunitydevdocs/reference/bulk-data-api - which provides an interface for exporting large amounts of data from the system.
It does, though, limit each export to 7 days of data, so I'd imagine you'd want to look at writing a script to automate that, depending on how much historical data you have... I'm not sure if it's the right solution for you, but it may help.
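Given the 7-day cap, the script would mostly just be walking the date range in windows. A sketch of the windowing part below; the export call itself is left as a commented placeholder, since the real endpoint, auth, and date-parameter format are in the Bulk Data API docs linked above, not something I'm reproducing from memory:

```python
# Sketch: split a historical range into <=7-day windows for repeated
# Bulk Data API exports. The actual export call is a placeholder -- see
# the Bulk Data API docs for the real endpoint and parameters.
from datetime import date, timedelta

def seven_day_windows(start, end):
    """Return (from_date, to_date) pairs covering [start, end] inclusively,
    each spanning at most 7 days."""
    windows = []
    cur = start
    while cur <= end:
        to = min(cur + timedelta(days=6), end)
        windows.append((cur, to))
        cur = to + timedelta(days=1)
    return windows

# for frm, to in seven_day_windows(date(2020, 1, 1), date.today()):
#     export_window(frm, to)  # hypothetical helper wrapping the export call
```

Depending on how the API treats the window boundaries (inclusive vs. exclusive), you may need to tweak the day arithmetic.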
Cheers
Nathan