Getting Started With Analytics 2.0
About Procore Analytics 2.0
Tip
If you need further assistance with implementing Procore Analytics 2.0, please reach out to your Procore point of contact for additional implementation services.
Procore Analytics 2.0 is built on three guiding pillars, now and in the future: a user-friendly product, access to all of your data, and business intelligence included. Ultimately, this allows you to quickly access any Procore data you need and gain value from it to make informed, data-driven decisions.
Key benefits include:
Verify Permissions
Note
- You must have Procore Analytics enabled.
- Anyone with 'Admin' level access to Analytics can grant additional users access to Analytics.
- Users must have 'Admin' level access to Analytics to generate a data token.
- Any changes to a user's permissions in the Directory for Procore Analytics can take up to 24 hours to take effect.
You must make sure the appropriate permissions are assigned to generate an access token so you can begin connecting your Procore data to your BI solution. Access to Procore Analytics is linked to your Procore login credentials, which allows you to generate a single access token. The access token is a string of digits you will enter in your BI system to access data.
Typically, users who need access tokens are data engineers or Power BI developers. If you have access to Procore Analytics in several companies, your token will allow you to pull data from all of them. The token is tied to you, not to a specific company, so it remains the same across all companies you have access to.
Company and Project Admins will be granted an Admin role by default. The following user access levels are permitted for the Procore Analytics tool:
- None: No access to Procore Analytics data.
- Admin: Has full access permissions to data for all tools and projects (except certain data marked as private such as correspondence data).
There are two ways to assign permissions to individual users:
Revoking Access
Access to specific tool and project data in the Procore Analytics tool will be revoked when the corresponding tool and project permissions are removed from the user. When a user’s contact record becomes inactive, the user will lose access to Procore Analytics data.
Generate Access Token
To start accessing your Procore data, you must generate an access token. The access token is a string of digits you will enter in your BI system to access data.
Considerations
- You must have the Procore Analytics tool enabled.
- By default, all Company Admins have 'Admin' level access to Analytics in the Directory.
- Anyone with 'Admin' level access to Analytics can grant additional users access to the Analytics tool.
- Users must have 'Admin' level access to the Analytics tool to generate a data token.
Steps
- Log in to Procore.
- Click the Account & Profile icon in the top-right area of the navigation bar.
- Click My Profile Settings.
- Under Choose Your Connection with Procore Analytics, go to Generate personal access token to get started with Procore Analytics.
- Choose an expiration date.
- Click Generate Tokens.
Note
- The token will disappear after one hour, or if you navigate away from the page. To generate a new token, return to Step 1.
- It may take up to 24 hours for the data to become visible.
- Please do not regenerate your token during this processing time, as doing so may cause issues with it.
Upload Reports to Power BI
- Navigate to Procore Analytics from your Company Tools menu.
- Go to the Getting Started section.
- Under Power BI Files, select and download the available Power BI reports.
- Log in to the Power BI service using your Power BI login credentials.
- Create a workspace where you want to store your company's Procore Analytics reports. See Microsoft's Power BI support documentation for more information.
Note: Licensing requirements may apply.
- In the workspace, click Upload.
- Now click Browse.
- Select the report file from its location on your computer and click Open.
- After uploading the file, click Filter and select Semantic Model.
- Hover your cursor over the row with the report's name and click the vertical ellipsis icon.
- Click Settings.
- On the settings page, click Data source credentials and then click Edit Credentials.
- In the 'Configure [Report Name]' window that appears, complete the following:
- Authentication Method: Select 'Key'.
- Account Key: Enter the token you received from the token generation page in Procore.
- Privacy level setting for this data source: Select the privacy level. We recommend selecting 'Private' or 'Organizational'. See Microsoft's Power BI support documentation for more information about the privacy levels.
- Click Sign in.
- Click Refresh and do the following:
- Time zone: Select the time zone you want to use for scheduled data refreshes.
- Under Configure a refresh schedule, turn the toggle to the ON position.
- Refresh frequency: Select 'Daily'.
- Time: Click Add another time and select 7:00 a.m.
Note: You may add up to 8 refresh times.
- Optional:
- Mark the 'Send refresh failure notifications to the dataset owner' checkbox to send refresh failure notifications.
- Enter the email addresses of any other colleagues you want the system to send refresh failure notifications to.
- Click Apply.
- To verify that the settings were configured correctly and that the report's data will refresh properly, return to the page where you filtered by Semantic Model and complete the following steps:
- Hover your cursor over the row with the report's name and click the circular arrow icon to refresh the data manually.
- Check the 'Refreshed' column to see if there is a warning icon.
- If no warning icon displays, the report's data is successfully refreshed.
- If a warning icon displays, an error has occurred. Click the warning icon to see more information about the error.
- To delete the blank dashboard the Power BI service created automatically, complete the following steps:
- Hover your cursor over the row with the dashboard's name. Click the ellipsis icon and click Delete.
- To verify that the report renders properly, navigate to the 'All' or 'Content' page and click on the report's name to view the report in the Power BI service.
Tip
Reference the 'Type' column to ensure you click on the report instead of a different asset.
- Repeat the steps above within Power BI for each Procore Analytics report file.
Connecting to Your Power BI Desktop Option
Note
This method of connection is typically used by data professionals.
- Open your Power BI Desktop.
- From the Home page, click New to expand the section.
- Click Report.
- Click Get data from another source.
- In the search bar, type 'Delta Sharing'.
- Select Delta Sharing, then click Connect.
- Type or paste the Delta Sharing Server URL you received from Procore.
- If this is the first time you are connecting to this source, you will be prompted to provide your Delta Sharing Bearer Token.
- After authentication, select the Procore Analytics tables you want to bring into your Power BI report.
- Select Load to view your report or select Transform Data to make more transformations in Power Query.
Connecting to Your Analytics Models Option
Note
This method of connection is typically used by data professionals.
Create Credentials File
You must first generate a data token within the Procore web application. See Generate Access Token.
- Create a file called config.share.
- Add the fields below:
{
"shareCredentialsVersion": 1,
"bearerToken": "",
"endpoint": "",
"expirationTime": ""
}
- Add the Bearer Token, Endpoint, Share Credentials Version, and Expiration Time values received from Procore to the config.share file.
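Before pointing any tooling at the file, you can sanity-check it. The following is a minimal sketch (not a Procore-provided script) that uses only the Python standard library to confirm config.share contains the expected fields and that the token has not expired; it assumes expirationTime is an ISO 8601 timestamp.
# check_config_share.py - minimal sketch to validate a config.share file (not Procore-provided)
import json
from datetime import datetime, timezone

REQUIRED_FIELDS = {"shareCredentialsVersion", "bearerToken", "endpoint", "expirationTime"}

with open("config.share") as f:
    profile = json.load(f)

missing = REQUIRED_FIELDS - profile.keys()
if missing:
    raise SystemExit(f"config.share is missing fields: {sorted(missing)}")

# Assumes expirationTime is an ISO 8601 timestamp (e.g. 2025-01-31T00:00:00.000Z).
expires = datetime.fromisoformat(profile["expirationTime"].replace("Z", "+00:00"))
if expires < datetime.now(timezone.utc):
    print("Warning: the bearer token has expired; generate a new one in Procore.")
else:
    print(f"config.share looks complete; token valid until {expires.isoformat()}")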
Run user_exp.py script
You can use the following templates to create a config.yaml file with the necessary configurations.
- For Azure Storage:
cron_job: #true/false
run_as: #pyspark/python
source_config:
  config_path: #path to the config.share file
  tables:
    - '' # table name if you want to download a specific table. Leave it empty if you want to download all tables
source_type: delta_share
target_config:
  auth_type: service_principal
  client_id: #client_id
  secret_id: #secret_id
  storage_account: #storage-account name
  storage_path: #<container>@<storage-account>.dfs.core.windows.net/<directory>
  tenant_id: #tenant_id
target_type: azure_storage
- For MSSQL DB:
cron_job: #true/false
run_as: #pyspark/python
source_config:
  config_path: #path to the config.share file
  tables:
    - '' # table name if you want to download a specific table. Leave it empty if you want to download all tables
source_type: delta_share
target_config:
  database: #target database
  host: #target hostname:port
  password: #password
  schema: #target schema (default to procore_analytics)
  username: #username
target_type: sql_server
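As a quick check before running the script, the sketch below (assuming PyYAML is installed via pip install pyyaml) loads config.yaml and prints the source and target settings it found; the field names are taken from the templates above.
# validate_config.py - minimal sketch to inspect config.yaml before running user_exp.py
import yaml

with open("config.yaml") as f:
    cfg = yaml.safe_load(f)

for section in ("source_config", "target_config"):
    if not isinstance(cfg.get(section), dict):
        raise SystemExit(f"config.yaml is missing the '{section}' section")

print("source type:", cfg.get("source_type"))
print("config.share path:", cfg["source_config"].get("config_path"))
print("tables:", cfg["source_config"].get("tables") or "all tables")
print("target type:", cfg.get("target_type"))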
Run as PySpark
If your environment is already set up with Spark, choose the 'pyspark' option when prompted. Alternatively, once the config.yaml file has been generated, you can run the following commands to download the reports to the data directory.
- For Writing to ADLS Gen2 Storage:
spark-submit --packages io.delta:delta-sharing-spark_2.12:3.1.0,org.apache.hadoop:hadoop-azure:3.4.0,com.microsoft.azure:azure-storage:8.6.6,org.apache.hadoop:hadoop-common:3.4.0 --exclude-packages com.sun.xml.bind:jaxb-impl delta_share_to_sql_spark.py
- For Writing to MSSQL DB:
spark-submit --packages io.delta:delta-sharing-spark_2.12:3.1.0 --jars <Location of mssql-jdbc jar> delta_share_to_sql_spark.py
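Once the packages above are available, you can also read a shared table directly in PySpark. The following is a minimal sketch, not a Procore-provided script; the share, schema, and table names are placeholders, and config.share is assumed to be in the working directory.
# read_shared_table.py - minimal PySpark sketch with placeholder names
from pyspark.sql import SparkSession

spark = SparkSession.builder.appName("procore-delta-sharing").getOrCreate()

# <share>, <schema> and <table> are placeholders; list the tables exposed by
# your token to find the real names.
table_url = "config.share#<share>.<schema>.<table>"
df = spark.read.format("deltaSharing").load(table_url)
df.printSchema()
df.show(10, truncate=False)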
Run as Python
- From the command line, navigate to the folder by entering the “cd <path to the folder>” command.
- Install required packages using “pip install -r requirements.txt” or “python -m pip install -r requirements.txt”.
- Execute the command python delta_share_to_azure_panda.py.
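For a quick test without Spark, the open-source delta-sharing Python client (pip install delta-sharing) can load a single table straight into pandas. The sketch below is illustrative only; the share, schema, and table names are placeholders.
# load_table_pandas.py - minimal sketch using the delta-sharing Python client
import delta_sharing

profile = "config.share"
table_url = f"{profile}#<share>.<schema>.<table>"  # placeholders

df = delta_sharing.load_as_pandas(table_url)
print(df.shape)
df.to_csv("table_snapshot.csv", index=False)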
Using SSIS
- Open SSIS and create a new project.
- From the SSIS Toolbox, drag and drop Execute Process Task.
- Double-click Execute Process Task.
- Go to the Process tab.
- Next to Executable, enter the path to python.exe in the Python installation folder.
- In WorkingDirectory, enter the path to the folder containing the script you want to execute (without the script file name).
- In Arguments, enter the name of the script you want to execute, including the .py extension (for example, delta_share_to_azure_panda.py), and click Save.
- Click Start in the top ribbon menu.
- During the execution of the task, the output of the Python console is displayed in the external console window.
- Once the task is done, it will display a checkmark.
Choose Your Own Method
Delta Sharing is an open protocol for secure data sharing. You can find the public GitHub repository for Delta Sharing at https://github.com/delta-io/delta-sharing. The repository includes examples and documentation for accessing shared data using various languages such as Python and Spark Connector (SQL, Python, Scala, Java, R).
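For example, the Python client from that repository can list every share, schema, and table your token exposes, which is a convenient way to confirm connectivity before building anything on top of it. The following is a minimal sketch, assuming pip install delta-sharing and a config.share file in the working directory.
# list_tables.py - minimal sketch using the delta-sharing Python client
import delta_sharing

client = delta_sharing.SharingClient("config.share")
for table in client.list_all_tables():
    print(f"{table.share}.{table.schema}.{table.name}")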
Note
Make sure you have appropriate permissions and access rights to download the required files and run Docker containers on your system. Always follow security best practices and guidelines provided by Procore when handling sensitive data and credentials.
Connecting to Your Databricks Option
Note
This method of connection is typically used by data professionals.
- Log in to your Databricks environment.
- Navigate to the Catalog section.
- Select Delta Sharing from the top menu.
- Copy the Sharing Identifier provided for you on the right side.
- Next, log in to the Procore web application.
- Click the Account & Profile icon in the top-right area of the navigation bar.
- Click My Profile Settings.
- Under Choose Your Connection with Procore Analytics, enter your 'Databricks Sharing Identifier.'
- Click Connect.
- Once the sharing identifier is added to Procore's system, the Procore Databricks connection will appear under the Share with me tab in your Databricks environment.
Note: It may take up to 24 hours to see the data.
- When your Procore Databricks connection becomes visible in the Share with me tab, click Create Catalog.
- Enter your preferred name for the shared catalog and click Create.
- Your shared catalog and tables will now show under the provided name in the Catalog Explorer.
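From there you can query the shared tables like any other catalog. The following is a minimal Databricks notebook sketch; the catalog, schema, and table names are placeholders for the name you chose and whatever appears in Catalog Explorer.
# Run in a Databricks notebook, where 'spark' and 'display' are predefined.
# Replace <catalog>, <schema> and <table> with the names shown in Catalog Explorer.
df = spark.table("<catalog>.<schema>.<table>")
display(df.limit(10))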