Cloud Connector is how Procore shares Data, Intelligence, and Analytics with our customers. Data can be shared directly with reporting tools like Power BI or Tableau, or delivered to customers' data warehouses, stores, lakes, or other applications. Customers can even build programmatic access to their data using Cloud Connector for true automation. Cloud Connector is built on the Delta Sharing open protocol.
Delta Sharing is the industry's first open protocol for secure data sharing, making it simple to share data with other organizations regardless of which computing platforms they use. Many applications can access data via Delta Sharing. To further enhance the customer experience, however, Procore has added connectors, prebuilt code, and guides for the following platforms, reducing setup time and complexity to enable a seamless, out-of-the-box connection.
More data connectors coming soon!
Comprehensive documentation and code examples are available directly in the Procore Analytics product within the Procore web application, accessible to your Procore admins. These resources provide step-by-step instructions, code snippets, and best practices to help you set up and manage your data integration effectively.
Continue to the next section of this guide to begin the setup process.
For additional inquiries or assistance, please contact your account manager or our support team.
You must make sure the appropriate permissions are assigned to generate an access token so you can begin connecting your Procore data to your BI solution. Access to Procore Analytics is linked to your Procore login credentials, which allows you to generate a single access token. The access token is a string of characters you will enter in your BI system to access data.
Typically, users who need access tokens are data engineers or Power BI developers. If you have access to Procore Analytics in several companies, your token will allow you to pull data from all of them. The token is tied to you, not to a specific company, so it remains the same across all companies you have access to.
Company and Project Admins will be granted an Admin role by default. The following user access levels are permitted for the Procore Analytics tool:
Access to specific tool and project data in the Procore Analytics tool will be revoked when the corresponding tool and project permissions are removed from the user. When a user’s contact record becomes inactive, the user will lose access to Procore Analytics data.
To start accessing your Procore data, there are two options for generating your data access credentials: the Databricks direct connection method or the Delta Share token method. The access token is a string of characters you will enter in your applicable data connector to access data.
The Procore Analytics Cloud Connect Access tool is a command-line interface (CLI) that helps you configure and manage data transfers from Procore to MS SQL Server. It consists of two main components:
This will help you set up the following:
After configuration, you have two options to run the data sync:
```json
{
  "shareCredentialsVersion": 1,
  "bearerToken": "xxxxxxxxxxxxx",
  "endpoint": "https://nvirginia.cloud.databricks.c...astores/xxxxxx"
}
```
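As a rough sketch of those two options (an immediate run, or the scheduled run created during configuration), assuming this package follows the same layout as the Snowflake and S3 packages shown later in this guide; the ds_to_mssql.py name is hypothetical, so use the sync script shipped in your package:

```bash
# one-time interactive configuration
python user_exp.py

# Option 1: immediate, on-demand sync (hypothetical script name)
python ds_to_mssql.py

# Option 2: rely on the cron job / scheduled task created during configuration
```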
You'll need to provide the following MS SQL Server details:
This guide provides detailed instructions for setting up and using the Delta Sharing integration package on a Windows operating system to seamlessly integrate data into your workflows with Procore Analytics. The package supports multiple execution options, allowing you to choose your desired configuration and integration method.
Ensure you have the following before proceeding:
```json
{
  "shareCredentialsVersion": 1,
  "bearerToken": "xxxxxxxxxxxxx",
  "endpoint": "https://nvirginia.cloud.databricks.c...astores/xxxxxx"
}
```
You can also verify the schedule by running the following command in a terminal.
For Linux and macOS:
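For example, listing your cron entries shows the job the tool created (look for the procore-data-import comment used in the scheduling examples later in this guide):

```bash
crontab -l
```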
To edit or delete the schedule, edit the scheduling cron entry with:
```bash
EDITOR=nano crontab -e
```
For Windows:
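You can inspect the scheduled task with schtasks; the task name below matches the one this tool uses elsewhere in this guide:

```powershell
schtasks /query /tn "ProcoreDeltaShareScheduling" /fo LIST /v
```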
Immediate execution:
Common Issues and Solutions
Support
For additional help:
Notes
Integrating Delta Sharing with Microsoft Fabric Data Factory enables seamless access and processing of shared Delta tables for your analytics workflows with Procore Analytics 2.0. Delta Sharing, an open protocol for secure data collaboration, ensures organizations can share data without duplication.
After configuring the dataflow, you can now apply transformations to the shared Delta data. Choose your Delta Sharing Data option from the list below:
Test your data pipelines and flows to ensure smooth execution. Use the monitoring tools within Data Factory to track progress and logs for each activity.
Using Data Factory in Microsoft Fabric with Delta Sharing enables seamless integration and processing of shared Delta tables as part of your analytics workflows with Procore Analytics 2.0. Delta Sharing is an open protocol for secure data sharing, allowing collaboration across organizations without duplicating data.
This guide walks you through the steps to set up and use Data Factory in Fabric with Delta Sharing, utilizing Notebooks for processing and exporting data to a Lakehouse.
3. Copy the code from ds_to_lakehouse.py and paste it into the notebook window (PySpark Python):
The next step is to upload your own config.yaml and config.share into the Resources folder of the Lakehouse. You can create your own directory or use the builtin directory (already created by the Lakehouse for resources):
The example below shows a standard builtin directory for a config.yaml file.
Note: Make sure you upload both files at the same level, and that the config_path property points to them:
4. Check the code of the notebook, lines 170-175.
The example below shows the necessary line change:

```python
# before
config_path = "./env/config.yaml"
# after
config_path = "./builtin/config.yaml"
```
Since the files are in the builtin folder rather than a custom env folder, keep track of your own file structure. You can upload the files into different folders, but in that case, update the notebook code so it can locate the config.yaml file properly.
5. Click Run cell:
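For orientation, the logic the notebook implements follows roughly the pattern below. This is an illustrative sketch, not the actual ds_to_lakehouse.py code, and it assumes the delta-sharing package is installed in the notebook environment:

```python
import delta_sharing

# profile files uploaded to the notebook's builtin resources folder
profile = "./builtin/config.share"
client = delta_sharing.SharingClient(profile)

# copy each shared table into the Lakehouse
for table in client.list_all_tables():
    url = f"{profile}#{table.share}.{table.schema}.{table.name}"
    pdf = delta_sharing.load_as_pandas(url)   # read the shared Delta table
    sdf = spark.createDataFrame(pdf)          # 'spark' is predefined in Fabric notebooks
    sdf.write.mode("overwrite").saveAsTable(table.name)
```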
This guide walks you through setting up and deploying an Azure Function for integrating Delta Sharing data with Procore Analytics. The Azure Function enables efficient data processing and sharing workflows with Delta Sharing profiles.
Go to Microsoft Learn for instructions on installing Azure Functions Core Tools.
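Once Core Tools are installed, a typical create-and-deploy flow looks like the following sketch; the project, function, and app names are placeholders, not part of the Procore package:

```bash
# create a Python function project and a timer-triggered function
func init delta-sharing-func --python
cd delta-sharing-func
func new --name SyncDeltaShare --template "Timer trigger"

# run locally for testing
func start

# publish to an existing Azure Function App
func azure functionapp publish <your-function-app-name>
```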
This document provides step-by-step instructions for setting up a data pipeline in Microsoft Fabric to transfer data from Delta Share to a SQL warehouse. This configuration enables seamless data integration between Delta Lake sources and SQL destinations.
Common issues and solutions:
Please reach out to Procore Support if you have any questions or need assistance.
The Procore Analytics Cloud Connect Access tool is a command-line interface (CLI) that helps you configure and manage data transfers from Procore to Snowflake.
It consists of two main components:
Run the configuration utility using python user_exp.py.
After configuration, you have two options to run the data sync:
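The first option is an immediate, on-demand run of the sync script from the package layout shown below; the second is the scheduled run configured next. A minimal sketch of the immediate option:

```bash
# one-time interactive configuration
python user_exp.py

# immediate, on-demand sync to Snowflake
python ds_to_snowflake.py
```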
For Linux and macOS:
```bash
EDITOR=nano crontab -e
```
```bash
2 * * * * /Users/your_user/snowflake/venv/bin/python /Users/your_user/snowflake/sql_server_python/connection_config.py 2>&1 | while read line; do echo "$(date) - $line"; done >> /Users/your_user/snowflake/sql_server_python/procore_scheduling.log # procore-data-import
```
For Windows:
```powershell
schtasks /query /tn "ProcoreDeltaShareScheduling" /fo LIST /v
```
```json
{
  "shareCredentialsVersion": 1,
  "bearerToken": "xxxxxxxxxxxxx",
  "endpoint": "https://nvirginia.cloud.databricks.c...astores/xxxxxx"
}
```
You'll need to provide the following Snowflake details:
The tool offers the ability to schedule automatic data synchronization.
```
├── requirements.txt       # Dependencies
├── user_exp.py            # Configuration utility
├── ds_to_snowflake.py     # Data sync script
├── config.yaml            # Generated configuration
├── config.share           # Delta Share config file
└── procore_scheduling.log # Log of scheduling runs
```
Note: Remember to always back up your configuration before making changes, and test new configurations in a non-production environment first.
The Procore Analytics Cloud Connect Access tool is a command-line interface (CLI) that helps you configure and manage data transfers from Procore to Amazon S3 with Procore Analytics 2.0.
It consists of two main components:
Run the configuration utility using python user_exp.py.
This will help you set up the following:
```json
{
  "shareCredentialsVersion": 1,
  "bearerToken": "xxxxxxxxxxxxx",
  "endpoint": "xxxxxx"
}
```
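As with the other connectors, you can run the sync on demand once configuration is complete. A minimal sketch using the scripts from the package layout shown below:

```bash
# one-time interactive configuration
python user_exp.py

# immediate, on-demand sync to S3
python delta_share_to_s3.py
```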
You'll need to provide the following S3 details:
The tool offers the ability to schedule automatic data synchronization.
You can also verify the schedule by running the following command in a terminal.
For Linux and macOS:
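For example, listing your cron entries shows the job the tool created:

```bash
crontab -l
```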
To edit or delete the schedule, edit the scheduling cron entry with:
```bash
EDITOR=nano crontab -e
```
For Windows:
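On Windows, the scheduled task can be inspected with schtasks, using the same task name shown earlier in this guide:

```powershell
schtasks /query /tn "ProcoreDeltaShareScheduling" /fo LIST /v
```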
```
├── requirements.txt       # Dependencies
├── user_exp.py            # Configuration utility
├── delta_share_to_s3.py   # Data sync script
├── config.yaml            # Generated configuration
├── config.share           # Delta Share config file
└── procore_scheduling.log # Log of scheduling runs
```
Common issues and solutions:
Notes:
Delta Sharing is an open protocol for secure real-time data sharing, allowing organizations to share data across different computing platforms. This guide will walk you through the process of connecting to and accessing data through Delta Sharing.
The Delta Sharing Python Connector is a Python library that implements the Delta Sharing Protocol to read tables from a Delta Sharing server. You can load shared tables as a pandas DataFrame, or as an Apache Spark DataFrame if running in PySpark with the Apache Spark Connector installed.
```bash
pip3 install delta-sharing
```
The connector accesses shared tables based on profile files, which are JSON files containing a user's credentials to access a Delta Sharing server. We have several ways to get started:
After you save the profile file, you can use it in the connector to access shared tables.
```python
import delta_sharing
```
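A minimal end-to-end sketch, assuming the profile file was saved locally as config.share; the share, schema, and table coordinates are placeholders:

```python
import delta_sharing

profile = "config.share"

# discover what the share exposes
client = delta_sharing.SharingClient(profile)
for table in client.list_all_tables():
    print(table.share, table.schema, table.name)

# load one shared table as a pandas DataFrame
url = profile + "#<share>.<schema>.<table>"
df = delta_sharing.load_as_pandas(url)
print(df.head())
```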
The Apache Spark Connector implements the Delta Sharing Protocol to read shared tables from a Delta Sharing Server. It can be used in SQL, Python, Java, Scala and R.
The connector loads user credentials from profile files.
You can set up Apache Spark to load the Delta Sharing connector in the following two ways:
If you are using Databricks Runtime, you can skip this section and follow the Databricks Libraries documentation to install the connector on your clusters.
To use the Delta Sharing connector interactively within Spark's Scala or Python shell, you can launch the shells as follows.
PySpark Shell

```bash
pyspark --packages io.delta:delta-sharing-spark_2.12:3.1.0
```
Scala Shell

```bash
bin/spark-shell --packages io.delta:delta-sharing-spark_2.12:3.1.0
```
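Once the shell (or cluster) has the connector loaded, a shared table can be read through the deltaSharing data source; the profile path and table coordinates below are placeholders:

```python
df = spark.read.format("deltaSharing") \
    .load("/path/to/config.share#<share>.<schema>.<table>")
df.show()
```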
If you want to build a Java/Scala project using the Delta Sharing connector from the Maven Central Repository, you can use the following Maven coordinates.
You include the Delta Sharing connector in your Maven project by adding it as a dependency in your POM file. The Delta Sharing connector is compiled with Scala 2.12.
```xml
<dependency>
    <groupId>io.delta</groupId>
    <artifactId>delta-sharing-spark_2.12</artifactId>
    <version>3.1.0</version>
</dependency>
```