A very popular option for sites that need serious reporting is to download daily analytics data via API. This allows us to create a repository with the data already modeled as we need it (with the segments already applied) and to query the MySQL tables directly through the Looker Studio connection.
The problem is creating the script that downloads the data.
This is usually more the job of BI people than of the analyst, but we can find several solutions on the Internet. At IKAUE we have our own library for this type of download, developed in PHP, which is capable of multiplying the queries by creating collections of custom segments.
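As a minimal sketch (not our actual PHP library), a daily download in Python against the official GA4 Data API client could look like the following; the property ID, the credentials and the analytics_daily table are placeholders:

```python
# Sketch: daily GA4 export into a modeled MySQL table (placeholder names throughout).
from datetime import date, timedelta

import pymysql
from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import (
    DateRange, Dimension, Metric, RunReportRequest,
)

PROPERTY_ID = "123456789"  # placeholder GA4 property
yesterday = (date.today() - timedelta(days=1)).isoformat()

# 1. Pull yesterday's data, already broken down the way we want to store it.
client = BetaAnalyticsDataClient()  # uses GOOGLE_APPLICATION_CREDENTIALS
report = client.run_report(
    RunReportRequest(
        property=f"properties/{PROPERTY_ID}",
        date_ranges=[DateRange(start_date=yesterday, end_date=yesterday)],
        dimensions=[Dimension(name="date"),
                    Dimension(name="sessionDefaultChannelGroup")],
        metrics=[Metric(name="sessions"), Metric(name="transactions")],
    )
)

rows = [
    (r.dimension_values[0].value,    # date
     r.dimension_values[1].value,    # channel
     int(r.metric_values[0].value),  # sessions
     int(r.metric_values[1].value))  # transactions
    for r in report.rows
]

# 2. Store it in the modeled MySQL table that Looker Studio will query.
conn = pymysql.connect(host="localhost", user="etl", password="***",
                       database="analytics")
with conn.cursor() as cur:
    cur.executemany(
        "INSERT INTO analytics_daily (day, channel, sessions, transactions) "
        "VALUES (%s, %s, %s, %s)",
        rows,
    )
conn.commit()
conn.close()
```

Scheduled once a day (cron, Cloud Scheduler, etc.), a script like this keeps the repository up to date; custom segments would simply mean running several variants of the same report and storing each one in its own table or column.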
Another problem with this system is that anything you have not thought of as necessary data for these exports will not be available, so it is important to plan data downloads well.
Note: If there is a lot of data, we may also want to download it not only to MySQL but also to BigQuery. The result would actually be the same, but faster to process.
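For illustration, pushing the same kind of daily rows to BigQuery can be a simple streaming insert; the project, dataset and sample row below are placeholders (in practice the rows would come from the same API download as above):

```python
# Sketch: streaming the daily rows into BigQuery instead of (or besides) MySQL.
from google.cloud import bigquery

bq = bigquery.Client()  # uses GOOGLE_APPLICATION_CREDENTIALS
table_id = "my-project.analytics.analytics_daily"  # placeholder dataset/table

# Hypothetical row, just to keep the snippet self-contained.
json_rows = [
    {"day": "2024-01-15", "channel": "Organic Search",
     "sessions": 1200, "transactions": 35},
]

errors = bq.insert_rows_json(table_id, json_rows)
if errors:
    raise RuntimeError(f"BigQuery insert reported errors: {errors}")
```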
Option 3: Google Spreadsheets as an intermediate connection to real APIs
I’m going to focus on Google Analytics, but this system actually allows us to bring into Looker Studio not only the actual GA API but any API we can reach from Google Spreadsheets.
What we are going to do is create a data repository sheet.
This sheet must be able to automate its data loads. Then, from Looker Studio, we will not ask for access to the APIs but to the Google Spreadsheet holding that repository. It is something very similar to the downloads of modeled data to MySQL that we mentioned before, but more accessible to most people (although also more limited).
Let’s look at a small example to illustrate how to import information about session-scoped channels and page paths, as well as session metrics, sessions with interaction, and transactions, into our dashboard using Google Sheets.
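One way to automate the repository loads is a small script that queries the GA4 Data API and overwrites a tab in the sheet, which Looker Studio then reads through the Sheets connector. The sketch below assumes the gspread library, a service account with edit access to the spreadsheet, and placeholder names (SPREADSHEET_KEY, ga4_repository); dimension and metric names follow the GA4 API schema, where engagedSessions corresponds to sessions with interaction:

```python
# Sketch: filling the repository sheet with the GA4 breakdown described above.
import gspread
from google.analytics.data_v1beta import BetaAnalyticsDataClient
from google.analytics.data_v1beta.types import (
    DateRange, Dimension, Metric, RunReportRequest,
)

PROPERTY_ID = "123456789"    # placeholder GA4 property
SPREADSHEET_KEY = "1abc..."  # placeholder spreadsheet ID

# 1. Ask the GA4 Data API for session channel, page path and the three metrics.
ga = BetaAnalyticsDataClient()  # uses GOOGLE_APPLICATION_CREDENTIALS
report = ga.run_report(
    RunReportRequest(
        property=f"properties/{PROPERTY_ID}",
        date_ranges=[DateRange(start_date="30daysAgo", end_date="yesterday")],
        dimensions=[Dimension(name="sessionDefaultChannelGroup"),
                    Dimension(name="pagePath")],
        metrics=[Metric(name="sessions"),
                 Metric(name="engagedSessions"),  # sessions with interaction
                 Metric(name="transactions")],
    )
)

header = ["channel", "pagePath", "sessions", "engagedSessions", "transactions"]
values = [header] + [
    [r.dimension_values[0].value, r.dimension_values[1].value,
     int(r.metric_values[0].value), int(r.metric_values[1].value),
     int(r.metric_values[2].value)]
    for r in report.rows
]

# 2. Overwrite the repository tab that Looker Studio reads as a data source.
gc = gspread.service_account(filename="service-account.json")
ws = gc.open_by_key(SPREADSHEET_KEY).worksheet("ga4_repository")
ws.clear()
ws.update(range_name="A1", values=values)
```

Run on a schedule, this keeps the sheet fresh; the same pattern works for any other API we can call from a script, which is exactly what makes the spreadsheet-as-intermediary approach so flexible.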