Re: Un/read status for boards

PerBonomi Thanks for the clarification. Yes, the UI does allow bold and unbold at the board level for unread boards, and I can see how that would be a useful API for your superusers. Unfortunately, an API for this is not currently on our near-term roadmap. I would suggest you post this in the Ideas board; if there is a lot of demand for this API, that will help push it up our backlog.

Re: How are Total Page Views calculated in the Event report?

Brandon1115 Can you clarify what the Event Report is? Regarding data from the Bulk Data API, your analyst is querying the data correctly for all page views. Filtering by action.key=view and counting each match should give you page views for the date range of files you've collected.

Re: Bulk data API - discern & filter company by email domain?

Unfortunately, you are correct. The Bulk Data API does not contain email addresses, so there is no way to run a domain filter. This was a deliberate security decision: we do not store or expose PII in bulk. The only way to do this is to make a REST API call to the community to fetch the email address for each user ID, apply the filter separately, and then merge the results back into the bulk data. I realize that for large communities this is less than ideal.

Re: Bulk Data API - Too Many Fields Error

tmarshall The cap has been increased to 80.

Re: Bulk Data - Looping Delay Needed?

Hi tmarshall Unfortunately, it's hard to guess how long a 24-hour file transmission will take to complete, as there are multiple factors involved: file size, network latency, cluster response time, etc. The only way to prevent this is to run a staggered query that waits until the last transmission finishes before firing the next 24-hour window. If you prefer the time-based approach, I unfortunately can't give you a delay that will definitely work, so you'll have to expand it and experiment until the errors stop.

regards, Naoki

Re: Bulk Data API - Too Many Fields Error

tmarshall Great to hear.
I think that's a better way forward. I think you need to add -H again to append to the header. Try this:

curl "https://api.lithium.com/lsi-data/v1/data/export/community/mycompany.prod?fromDate=201603150000&toDate=201603160000" -H "client-id: cooluniquekeyforustogetdata=" -H "Accept: application/json" -u "secondcooluniquekey:" >> C:\Users\tmarshall\bulk_test\prod_20160315-test1.txt

Out of curiosity, what BI tool are you planning to use? I'm guessing it has its own JSON connector, which should make this easier.

Re: Bulk Data API - Too Many Fields Error

tmarshall

1) While we do not often add columns or change their order, it will happen on occasion. We disclose these changes in the Lithium release notes, but I also try to post them on the Bulk Data API TKB a few weeks before they happen. We strongly recommend that you not rely on field order to map your data; instead, do the mapping by header name. The header names have never changed. So ingest the full file, preferably as JSON by adding "Accept: application/json", then parse it on your side for the fields you need. Or, if you're using some sort of BI tool, there should be a function similar to HLOOKUP in Excel.

2) That's correct, you are capped at 20 fields, and the field filter is inclusive only, not exclusive. We can expand the number of fields if that helps, but the exclusive feature will probably not come until later this year.

3) If you have to do the mappings across three exports, then document.id is the common key you can join on across all the data. Note that document.id will only be in files from March 4, 2016 onward.

Re: Bulk Data API Error

Hi Tim Any chance this error happened yesterday? We had a brief outage on the API proxy which would have blocked access for an hour, but the service is back up now. If this issue is still occurring, please file a Lithium support ticket so we can appropriately track the investigation. And sorry this happened while you were doing your historic data extracts.
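The name-based field mapping recommended above (ingest the JSON export, then pick fields by header name rather than column position) can be sketched roughly as follows. The sample records and the select_fields helper are hypothetical; only the field names action.key and document.id come from this thread, and real exports carry many more fields.

```python
import json

# Hypothetical miniature of a Bulk Data API JSON-lines export.
sample_export = """\
{"action.key": "view", "document.id": "doc-1"}
{"action.key": "view", "document.id": "doc-2"}
{"action.key": "kudos.give", "document.id": "doc-1"}
"""

def select_fields(jsonl_text, wanted):
    """Pick fields from each record by *name*, never by position,
    so added or reordered columns cannot silently shift your data."""
    rows = []
    for line in jsonl_text.splitlines():
        record = json.loads(line)
        # A missing field becomes None instead of misaligning columns.
        rows.append({name: record.get(name) for name in wanted})
    return rows

rows = select_fields(sample_export, ["action.key", "document.id"])
# Counting page views as described earlier: filter on action.key=view.
page_views = sum(1 for r in rows if r["action.key"] == "view")
print(page_views)
```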
regards, Naoki

Re: Bulk Data API - Initial Load, verify daily pull

1. You will have to write a script that pulls 24 hours' worth of data at a time for the historic data. Lithium won't be able to give you a one-time historic dump.

2. For verification, unfortunately you won't be able to verify using an API. At this time there is no aggregated API on LSI data to get daily, weekly, or monthly counts programmatically. It's on our roadmap, but for later in the year. In the meantime, you'll have to compare periodically by hand, in the UI or via CSV export.

Re: API call to retrieve previous version of an article

VarunGrazitti is correct. "Uncommitted Candidate" means the request may make sense, but it is not prioritized highly enough that it is likely to be done in a near-term timeframe for which we could give you an ETA.
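The historic-load script described in point 1, combined with the staggered (one-request-at-a-time) approach from the looping-delay reply, could be sketched like this. The window generator is illustrative; the endpoint shape mirrors the curl example earlier in the thread, and the community ID and credentials are placeholders, not real values.

```python
from datetime import datetime, timedelta

def day_windows(start, end):
    """Yield (fromDate, toDate) pairs as yyyymmddHHMM strings,
    one 24-hour window at a time, until the end date is reached."""
    fmt = "%Y%m%d%H%M"
    cur = start
    while cur < end:
        nxt = min(cur + timedelta(days=1), end)
        yield cur.strftime(fmt), nxt.strftime(fmt)
        cur = nxt

# Staggered pull: each request is issued only after the previous
# transmission has fully completed, so windows never overlap.
for from_date, to_date in day_windows(datetime(2016, 3, 15),
                                      datetime(2016, 3, 18)):
    url = ("https://api.lithium.com/lsi-data/v1/data/export/community/"
           f"mycompany.prod?fromDate={from_date}&toDate={to_date}")
    # Here you would run the blocking download for this window, e.g.
    # subprocess.run(["curl", url, "-H", "client-id: ...", ...])
    print(from_date, to_date)
```

Because each download call blocks until it finishes, no fixed sleep delay has to be guessed, which is the staggered approach suggested above.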