Troubleshoot transfer configurations
This document is intended to help you troubleshoot the most common issues encountered when setting up a BigQuery Data Transfer Service transfer. This document does not encompass all possible error messages or issues.
If you are experiencing issues that are not covered in this document, you can request support.
Before contacting Cloud Customer Care, capture transfer configuration and transfer run details. For information on how to get these details, see Get transfer details and View transfer run details and log messages.
Examine errors
If your initial transfer run fails, you can examine the details in the run history. Errors listed in the run history can help you identify an appropriate resolution using this document.
You can also view error messages for a specific transfer job using the Logs Explorer. The following Logs Explorer filter returns information about a specific transfer configuration job, along with any error messages:
resource.type="bigquery_dts_config"
labels.run_id="RUN_ID"
resource.labels.config_id="CONFIG_ID"
Replace the following:
- RUN_ID: the ID number of a specific transfer run
- CONFIG_ID: the ID number of a transfer configuration job
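If you prefer the command line, the Google Cloud CLI can run the same query. The following is a minimal sketch; RUN_ID, CONFIG_ID, and PROJECT_ID are placeholders for your own values:
# Read Data Transfer Service log entries for one transfer run
gcloud logging read 'resource.type="bigquery_dts_config" AND labels.run_id="RUN_ID" AND resource.labels.config_id="CONFIG_ID"' \
  --project=PROJECT_ID --limit=50 --format=json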
Before contacting Customer Care, capture any relevant information from the run history or Logs Explorer including any error messages.
If you use event-driven transfers, the event-driven transfer configuration might fail to trigger a transfer run. You can view error messages at the top of the run history page or configuration page.
General issues
When diagnosing general transfer issues, verify the following:
- You have completed all the steps in the "Before You Begin" section of the documentation page for your transfer type.
- The transfer configuration properties are correct (one way to review them from the command line is shown after this list).
- The user account used to create the transfer has access to the underlying resources.
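To review a transfer configuration's properties from the command line, you can use the bq tool. This is a sketch; replace the resource name placeholders with your own project ID, location, and configuration ID:
# Show the full transfer configuration, including data source parameters and schedule
bq show --format=prettyjson --transfer_config \
  projects/PROJECT_ID/locations/LOCATION/transferConfigs/CONFIG_ID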
If your transfer configuration is correct, and the appropriate permissions are granted, refer to the following for solutions to commonly encountered issues.
- Error: An unexpected issue was encountered. If this issue persists, please contact customer support.
- Resolution: This error typically indicates a temporary outage or an issue within BigQuery. Wait approximately 2 hours for the issue to be resolved. If the problem persists, request support.
- Error: Quota Exceeded.
- Resolution: Transfers are subject to BigQuery quotas on load jobs. If you need to increase your quota, contact your Cloud de Confiance by S3NS sales representative. For more information, see Quotas and limits.
If you are loading Cloud Billing exports to BigQuery, you can encounter the Quota Exceeded error. Both the Cloud Billing export tables and the destination BigQuery tables created by the BigQuery Data Transfer Service are partitioned, so choosing the overwrite option when setting up such transfer jobs can cause DML quota errors, depending on how much data is exported and how old the billing accounts are. For information about troubleshooting quotas, see Troubleshoot quota and limit errors.
- Error: The caller does not have permission.
- Resolution: Confirm that the account signed in to the Cloud de Confiance console is the same account that you select for the BigQuery Data Transfer Service when creating the transfer.
Compare the account signed in to the Cloud de Confiance console with the account shown in the "Choose an account to continue to BigQuery Data Transfer Service" prompt.
- Error: Access Denied: ... Permission bigquery.tables.get denied on table ...
- Resolution: Confirm that the BigQuery Data Transfer Service service agent is granted the bigquery.dataEditor role on the target dataset. This grant is automatically applied when the transfer is created or updated, but it's possible that the access policy was modified manually afterwards. To regrant the permission, see Grant access to a dataset.
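One way to check the destination dataset's current access policy is with the bq tool; PROJECT_ID and DATASET_NAME are placeholders:
# Print the dataset metadata; the "access" list should contain an entry for the
# service-PROJECT_NUMBER@gcp-sa-bigquerydatatransfer.s3ns-system.iam.gserviceaccount.com service agent
bq show --format=prettyjson PROJECT_ID:DATASET_NAME
If the service agent is missing from the access list, regrant access as described in Grant access to a dataset.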
- Error: region violates constraint constraints/gcp.resourceLocations on the resource projects/project_id
- Resolution: This error occurs when a user tries to create a transfer configuration in a restricted location, as specified in the location restriction organization policy. You can resolve this issue by changing the organization policy to allow the region, or by changing the transfer configuration to use a destination dataset located in a region that the organization policy does not restrict.
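To confirm which locations the policy allows, you can describe the effective policy with the Google Cloud CLI; this is a sketch, and PROJECT_ID is a placeholder:
# Show the effective gcp.resourceLocations policy inherited by the project
gcloud resource-manager org-policies describe gcp.resourceLocations \
  --project=PROJECT_ID --effective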
- Error: Please look into the errors[] collection for more details.
- Resolution: This error can occur when a data transfer fails. For more information about why the data transfer failed, you can use Cloud Logging to view your logs. You can find logs for a specific run by searching using the transfer run_id.
- Error: Network Attachment with connected endpoints cannot be deleted.
- Resolution: This error can occur when a user tries to delete their network attachments soon after deleting their transfer. It can take several days after a transfer is deleted before the BigQuery Data Transfer Service fully removes all resources associated with the transfer, which can prevent the network attachments from being deleted. To resolve this error, wait several days before trying to delete the network attachments. If you want the network attachments deleted sooner, contact support.
Authorization and permission issues
The following are some common permission errors that you can encounter when you transfer data from different data sources:
- Error: BigQuery Data Transfer Service is not enabled for <project_id>
- Error: BigQuery Data Transfer Service has not been used in project <project_id> before or it is disabled ...
- Resolution: Verify that the service agent role is granted with the following steps:
In the Cloud de Confiance console, go to the IAM & Admin page.
Select the Include S3NS-provided role grants checkbox.
Verify that the service account named service-<project_number>@gcp-sa-bigquerydatatransfer.s3ns-system.iam.gserviceaccount.com is shown and that it has been granted the BigQuery Data Transfer Service Agent role.
If the service account is not shown, or it does not have the BigQuery Data Transfer Service service agent role granted, grant the predefined role in the Cloud de Confiance console or by running the following Google Cloud CLI command:
gcloud projects add-iam-policy-binding PROJECT_NUMBER \
  --member serviceAccount:service-PROJECT_NUMBER@gcp-sa-bigquerydatatransfer.s3ns-system.iam.gserviceaccount.com \
  --role roles/bigquerydatatransfer.serviceAgent
Replace PROJECT_NUMBER with the project number associated with this service account.
- Error: There was an error loading this table. Check that the table exists and that you have the correct permissions.
- Resolution:
In the Cloud de Confiance console, go to the BigQuery page.
Click the destination dataset used in the transfer.
Click the Sharing menu, and then click Permissions.
Expand the BigQuery Data Editor role.
Verify that the BigQuery Data Transfer Service service agent is added to this role. If not, grant the BigQuery Data Editor (roles/bigquery.dataEditor) role to the BigQuery Data Transfer Service service agent.
- Error: A permission denied error was encountered: PERMISSION_DENIED. Please ensure that the user account setting up the transfer config has the necessary permissions, and that the configuration settings are correct
- Resolution:
In the Cloud de Confiance console, go to the Data Transfers page.
Click the failed transfer, then select the Configuration tab.
Verify that the transfer owner listed in the User field has all the required permissions for the data source.
If the transfer owner does not have all the required permissions, grant the required permissions by updating their credentials. You can also change the transfer owner to another user with the required permissions.
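If you manage transfers with the bq tool, updating an existing configuration's credentials to a service account might look like the following sketch; the service account email and the transfer configuration resource name are placeholders:
# Switch the transfer configuration to run with the given service account's credentials
bq update --transfer_config \
  --update_credentials \
  --service_account_name=SERVICE_ACCOUNT_EMAIL \
  projects/PROJECT_ID/locations/LOCATION/transferConfigs/CONFIG_ID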
- Error: Authentication failure: User Id not found. Error code: INVALID_USERID
- Resolution: The transfer owner has an invalid user ID. Change the transfer owner to a different user by updating the transfer credentials. If you are using a service account, also verify that the accounts running the data transfer have all the permissions required to use a service account.
- Error: The user does not have permission
- Resolution: Verify that the transfer owner is a service account and that the service account has all the required permissions. Another possibility is that the service account used was created under a different project than the project used to create this transfer. To resolve cross-project permission issues, see the following resources:
- Enable service accounts to be attached across projects
- Cross-project Service Account Authorization (for granting the necessary permissions)
- Error: googleapiclient.errors.HttpError: <HttpError 403 when requesting returned "The caller does not have permission". Details: "The caller does not have permission">
This error might appear when you attempt to set up a scheduled query with a service account.
Resolution: Ensure that the service account has all the permissions required to schedule or modify a scheduled query, and ensure that the user setting up the scheduled query has access to the service account.
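For example, one way to let a user set up scheduled queries with a service account is to grant that user the Service Account User role on the service account itself; this is a sketch, the account emails are placeholders, and your organization might require a different role:
# Allow USER_EMAIL to act as the service account used by the scheduled query
gcloud iam service-accounts add-iam-policy-binding SERVICE_ACCOUNT_EMAIL \
  --member="user:USER_EMAIL" \
  --role="roles/iam.serviceAccountUser"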
If the correct permissions are all assigned but you still encounter the error, check whether the Disable Cross-Project Service Account Usage policy is enforced on the project (this policy is enforced by default). You can check for the policy in the Cloud de Confiance console by navigating to IAM & Admin > Organization Policies and searching for the policy.

If the Disable Cross-Project Service Account Usage policy is enforced, you can disable the policy by doing the following:
- Identify the service accounts associated with the project using the Cloud de Confiance console, by navigating to IAM & Admin > Service Accounts. This view displays all service accounts for the current project.
- Disable the policy in the project where the service accounts are located using the following command. To disable this policy, the user must be an Organization Policy Administrator. Only the Organization Administrator can grant a user this role.
gcloud resource-manager org-policies disable-enforce iam.disableCrossProjectServiceAccountUsage --project=[PROJECT-ID]
Event-driven transfer configuration issues
The following are common issues you might encounter when creating an event-driven transfer.
- Error: Data Transfer Service is not authorized to pull message from the provided Pub/Sub subscription.
- Resolution: Verify that the BigQuery Data Transfer Service service agent is granted the pubsub.subscriber role:
In the Cloud de Confiance console, go to the Pub/Sub page.
Select the Pub/Sub subscription that you used in the event-driven transfer.
If the info panel is hidden, click Show info panel in the upper right corner.
In the Permissions tab, verify that the BigQuery Data Transfer Service service agent has the pubsub.subscriber role.
If the service agent doesn't have the pubsub.subscriber role, click Add principal to grant the pubsub.subscriber role to service-PROJECT_NUMBER@gcp-sa-bigquerydatatransfer.s3ns-system.iam.gserviceaccount.com.
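You can also grant the role from the command line; SUBSCRIPTION_ID and PROJECT_NUMBER are placeholders:
# Grant the BigQuery Data Transfer Service service agent the Pub/Sub Subscriber role on the subscription
gcloud pubsub subscriptions add-iam-policy-binding SUBSCRIPTION_ID \
  --member="serviceAccount:service-PROJECT_NUMBER@gcp-sa-bigquerydatatransfer.s3ns-system.iam.gserviceaccount.com" \
  --role="roles/pubsub.subscriber"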
- Error: Cloud Pub/Sub API has not been used in project PROJECT_NUMBER before or it is disabled.
- Resolution: Verify that the Cloud Pub/Sub API is enabled for your project:
In the Cloud de Confiance console, go to the APIs & Services page.
Click Enable APIs and services.
Search for Cloud Pub/Sub API, select the first result, and click Enable.
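You can also enable the API with the Google Cloud CLI, assuming the standard pubsub.googleapis.com service name; PROJECT_ID is a placeholder:
# Enable the Cloud Pub/Sub API for the project
gcloud services enable pubsub.googleapis.com --project=PROJECT_ID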
- Error: Data Transfer Service does not have required permission to use project quota of project PROJECT_NUMBER to access Pub/Sub.
- Resolution: Verify that the BigQuery Data Transfer Service service agent is granted the serviceusage.serviceUsageConsumer role:
In the Cloud de Confiance console, go to the IAM & Admin page.
Select the Include S3NS-provided role grants checkbox.
Verify that the service account named service-<project_number>@gcp-sa-bigquerydatatransfer.s3ns-system.iam.gserviceaccount.com is shown and that it has been granted the Service Usage Consumer role.
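If the role is missing, you can grant it following the same pattern as the earlier service agent grant; PROJECT_NUMBER is a placeholder:
# Grant the Service Usage Consumer role to the BigQuery Data Transfer Service service agent
gcloud projects add-iam-policy-binding PROJECT_NUMBER \
  --member="serviceAccount:service-PROJECT_NUMBER@gcp-sa-bigquerydatatransfer.s3ns-system.iam.gserviceaccount.com" \
  --role="roles/serviceusage.serviceUsageConsumer"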
- Issue: When you use a Cloud Storage event-driven transfer, no transfer run is triggered after you upload or update files in the Cloud Storage bucket.
Transfer runs are not triggered immediately after an event is received; it might take several minutes to trigger a transfer run. To check the status of the next transfer run, check the Target date for next run field in the run history. This field displays the scheduled time for the next run, or displays waiting for events to schedule next run if no events have been received. If you have uploaded or updated files in your Cloud Storage bucket but the Target date for next run has not been updated and no runs are triggered for 10-20 minutes, see the following resolution.
Resolution: Verify that the Pub/Sub subscription specified in the transfer configuration can receive the messages published by Cloud Storage events:
In the Cloud de Confiance console, go to the Pub/Sub page.
Select the Pub/Sub subscription that you used in the event-driven transfer.
In the Metrics tab, check the "Oldest unacked message age" graph and see if there are any messages.

If no messages are published, check whether the Pub/Sub notification is correctly configured for Cloud Storage. You can use the following Google Cloud CLI command to list the notification configurations associated with your bucket:
gcloud storage buckets notifications list gs://BUCKET_NAME
Replace BUCKET_NAME with the name of the bucket that you use for notifications. For information on configuring a Pub/Sub notification for Cloud Storage, see Configure Pub/Sub notification for Cloud Storage.
If there are messages, check whether the same Pub/Sub subscription is used in another event-driven transfer configuration. The same Pub/Sub subscription cannot be reused by multiple event-driven transfer configurations. For more information about event-driven transfers, see Event-driven transfers.
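If no notification configuration is listed, you can create one that publishes to the topic backing your subscription; this is a sketch, BUCKET_NAME and TOPIC_NAME are placeholders, and the details are covered in Configure Pub/Sub notification for Cloud Storage:
# Publish Cloud Storage object change events from the bucket to the Pub/Sub topic
gcloud storage buckets notifications create gs://BUCKET_NAME --topic=TOPIC_NAME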
Quota issues
- Error: Quota exceeded: Your project exceeded quota for imports per project.
- Resolution: Verify that you have not scheduled too many transfers in your project. For information on calculating the number of load jobs initiated by a transfer, see Quotas and limits.
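One way to review how many transfer configurations are scheduled in a project is to list them with the bq tool; LOCATION and PROJECT_ID are placeholders:
# List all transfer configurations in the given location for the project
bq ls --transfer_config --transfer_location=LOCATION --project_id=PROJECT_ID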