Recently I was helping a customer with a use case that called for data transformation. The requirement was to pull each customer's accounts receivable data daily, aggregate it by country, and then snapshot it.
If you are someone who can code and knows how to use RESTful web services, you’re probably thinking “no problem.” Otherwise, if you are someone who knows Quickbase (a real creative problem solver), you’re probably thinking “I know I could figure out a way.”
The challenge for the developer type: where are you going to host the script? You’ll need to consider chunking the data because of the volume, and since you’re chunking, you’ll need to consider throttling, among other things. For the Quickbase creative problem solver, the volume of data would have you pulling your hair out trying to get something working, and you would likely end up with a number of daily manual steps. For me, I haven’t been a true developer for over 10 years, so everything that goes into the script - error handling, chunking, throttling, etc. - not to mention standing up an environment to host it, feels like a big effort.
This is where Quickbase Pipelines and the new Quickbase API come into play. Using these two new Quickbase features, I was able to quickly develop a strategy and work with the customer to implement it in a couple of days. For the trending analysis, the data needed to be aggregated by country, which brought the record count down to roughly 60 per day. Once aggregated, the daily data could be wiped out the next day. So how did we aggregate and snapshot the daily data:
- Create a summary report grouped by Country
- Use the new API within Pipelines to execute the report and return each summarized group as a single record.
- Write them to a snapshot table.
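To make the steps above concrete, here is a short sketch of what the strategy does, assuming hypothetical realm, token, table, and report IDs (the Pipeline itself does this without any code):

```python
import json
import urllib.request

# All IDs, the realm, and the token below are made-up placeholders.
REALM = "mycompany.quickbase.com"   # QB-Realm-Hostname header value
TOKEN = "QB-USER-TOKEN b1234_xxxx"  # Authorization header value
REPORT_ID = "1"                     # the summary report grouped by Country
SOURCE_TABLE = "bqsrctbl1"          # table the report runs against
SNAPSHOT_TABLE = "bqsnptbl2"        # table the snapshot records land in

def run_report(report_id, table_id):
    """Execute a saved report (POST /v1/reports/{id}/run) and return its JSON."""
    req = urllib.request.Request(
        f"https://api.quickbase.com/v1/reports/{report_id}/run?tableId={table_id}",
        method="POST",
        headers={
            "QB-Realm-Hostname": REALM,
            "Authorization": TOKEN,
            "Content-Type": "application/json",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)

def rows_to_snapshot_payload(report_json, field_map, table_id):
    """Turn the report's summarized rows into a POST /v1/records payload.

    field_map maps a field id in the report output to the field id it
    should populate in the snapshot table.
    """
    data = [
        {str(dst): row[str(src)] for src, dst in field_map.items()}
        for row in report_json["data"]
    ]
    return {"to": table_id, "data": data}
```

In the actual solution there is no script at all - the Pipeline iterates the summarized rows and a Quickbase channel step creates the snapshot records - but the shape of the work is the same.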
With the new API, you can execute reports - more importantly summary reports - and get the entire data set back in JSON form. This is one of the things I really like about the new API. I’ve used this strategy numerous times since the new API was released. Then, with Pipelines, I can easily iterate over the summarized record set with the Pipelines JSON handler and create the snapshot records. With the XML-based API, you were not able to return summarized data with API_DoQuery, and with API_GenResultsTable the summarized columns were not returned as part of the record set.
Additionally, with the new API, calls are very much in line with other modern REST APIs.
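For illustration only (the realm, token, and IDs below are made-up placeholders), executing a report is a single HTTPS POST with JSON headers, and the skip/top query parameters handle paging if you ever do need to chunk:

```python
from urllib.parse import urlencode

def report_run_request(realm, token, table_id, report_id, skip=None, top=None):
    """Build the URL and headers for POST /v1/reports/{id}/run.

    skip/top are the paging parameters; realm, token, and IDs here
    are placeholders, not real credentials.
    """
    params = {"tableId": table_id}
    if skip is not None:
        params["skip"] = skip
    if top is not None:
        params["top"] = top
    url = f"https://api.quickbase.com/v1/reports/{report_id}/run?{urlencode(params)}"
    headers = {
        "QB-Realm-Hostname": realm,                 # which realm to hit
        "Authorization": f"QB-USER-TOKEN {token}",  # token-based auth
        "Content-Type": "application/json",
    }
    return url, headers
```

Compare that one request to the older XML API, where the call shape and the response both required custom parsing.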
With Pipelines, I don’t need to worry about setting up infrastructure; I just create a new Pipeline. I don’t need to worry about coding syntax either: the Pipeline Channels and Actions hide that aspect of coding, so all I need to worry about is the logic - even throttling and retries are built in.
One additional plug here for the new Quickbase API: the developer portal. If you frequently work with the XML Quickbase API, I highly recommend checking it out. It provides a playground where you can quickly test each of the new API calls. In fact, before I built the Pipeline discussed in this post, I used the developer portal to test the calls and verify the results I expected.