LoadJob
LoadJob(job_id, source_uris, destination, client, job_config=None)
Asynchronous job for loading data into a table.
Can load from Google Cloud Storage URIs or from a file.
Parameters
| Name | Description |
| job_id | str. The job's ID. |
| source_uris | Optional[Sequence[str]]. URIs of one or more data files to be loaded. See https://cloud.google.com/bigquery/docs/reference/rest/v2/Job#JobConfigurationLoad.FIELDS.source_uris for supported URI formats. Pass None for jobs that load from a file. |
| destination | google.cloud.bigquery.table.TableReference. Reference to the table into which data is to be loaded. |
| client | google.cloud.bigquery.client.Client. A client which holds credentials and project configuration for the dataset (which requires a project). |
Inheritance
builtins.object > google.api_core.future.base.Future > google.api_core.future.polling.PollingFuture > google.cloud.bigquery.job.base._AsyncJob > LoadJob
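Example: a minimal sketch of the usual way to obtain a LoadJob, via Client.load_table_from_uri rather than the constructor directly. It assumes application-default credentials; the project, dataset, table, and bucket names are placeholders.

```python
from google.cloud import bigquery

client = bigquery.Client()  # uses application-default credentials

# Placeholder identifiers; substitute your own.
table_id = "my-project.my_dataset.my_table"
uri = "gs://my-bucket/data.csv"

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    skip_leading_rows=1,  # skip the header row
    autodetect=True,      # infer the schema from the file
)

# Returns a LoadJob; the load runs asynchronously on the server.
load_job = client.load_table_from_uri(uri, table_id, job_config=job_config)
load_job.result()  # block until the job completes
print(f"Loaded {load_job.output_rows} rows into {table_id}")
```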
Properties
allow_jagged_rows
See google.cloud.bigquery.job.LoadJobConfig.allow_jagged_rows.
allow_quoted_newlines
See google.cloud.bigquery.job.LoadJobConfig.allow_quoted_newlines.
autodetect
See google.cloud.bigquery.job.LoadJobConfig.autodetect.
clustering_fields
See google.cloud.bigquery.job.LoadJobConfig.clustering_fields.
create_disposition
See google.cloud.bigquery.job.LoadJobConfig.create_disposition.
created
Datetime at which the job was created.
| Type | Description | 
| Optional[datetime.datetime] | the creation time (None until set from the server). | 
destination
google.cloud.bigquery.table.TableReference: table where loaded rows are written
destination_encryption_configuration
google.cloud.bigquery.encryption_configuration.EncryptionConfiguration: Custom encryption configuration for the destination table (e.g., Cloud KMS keys), or None if using default encryption.
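Example: a hedged sketch of supplying a custom Cloud KMS key via the job configuration (the key resource name is a placeholder).

```python
from google.cloud import bigquery

job_config = bigquery.LoadJobConfig(
    destination_encryption_configuration=bigquery.EncryptionConfiguration(
        # Placeholder key path; use your own KMS key resource name.
        kms_key_name="projects/my-project/locations/us/keyRings/my-ring/cryptoKeys/my-key",
    ),
)
```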
destination_table_description
Optional[str] description given to the destination table.
destination_table_friendly_name
Optional[str] friendly name given to the destination table.
encoding
See google.cloud.bigquery.job.LoadJobConfig.encoding.
ended
Datetime at which the job finished.
| Type | Description | 
| Optional[datetime.datetime] | the end time (None until set from the server). | 
error_result
Error information about the job as a whole.
| Type | Description | 
| Optional[Mapping] | the error information (None until set from the server). | 
errors
Information about individual errors generated by the job.
| Type | Description | 
| Optional[List[Mapping]] | the error information (None until set from the server). | 
etag
ETag for the job resource.
| Type | Description | 
| Optional[str] | the ETag (None until set from the server). | 
field_delimiter
See google.cloud.bigquery.job.LoadJobConfig.field_delimiter.
ignore_unknown_values
See google.cloud.bigquery.job.LoadJobConfig.ignore_unknown_values.
input_file_bytes
Count of bytes loaded from source files.
Exceptions
| Type | Description |
| ValueError | for invalid value types. |
Returns
| Type | Description |
| Optional[int] | the count (None until set from the server). |
input_files
Count of source files.
| Type | Description | 
| Optional[int] | the count (None until set from the server). | 
job_id
str: ID of the job.
job_type
Type of job.
| Type | Description | 
| str | one of 'load', 'copy', 'extract', 'query'. | 
labels
Dict[str, str]: Labels for the job.
location
str: Location where the job runs.
max_bad_records
See google.cloud.bigquery.job.LoadJobConfig.max_bad_records.
null_marker
See google.cloud.bigquery.job.LoadJobConfig.null_marker.
num_child_jobs
The number of child jobs executed.
See: https://cloud.google.com/bigquery/docs/reference/rest/v2/Job#JobStatistics.FIELDS.num_child_jobs
output_bytes
Count of bytes saved to destination table.
| Type | Description | 
| Optional[int] | the count (None until set from the server). | 
output_rows
Count of rows saved to destination table.
| Type | Description | 
| Optional[int] | the count (None until set from the server). | 
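Example: these statistics remain None until the server reports them, so check for completion first. A short sketch, reusing the hypothetical load_job from the constructor example above.

```python
if load_job.done():
    print("input files: ", load_job.input_files)
    print("input bytes: ", load_job.input_file_bytes)
    print("output rows: ", load_job.output_rows)
    print("output bytes:", load_job.output_bytes)
```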
parent_job_id
Return the ID of the parent job.
See: https://cloud.google.com/bigquery/docs/reference/rest/v2/Job#JobStatistics.FIELDS.parent_job_id
| Type | Description | 
| Optional[str] | parent job id. | 
path
URL path for the job's APIs.
| Type | Description | 
| str | the path based on project and job ID. | 
project
Project bound to the job.
| Type | Description | 
| str | the project (derived from the client). | 
quote_character
See google.cloud.bigquery.job.LoadJobConfig.quote_character.
range_partitioning
See google.cloud.bigquery.job.LoadJobConfig.range_partitioning.
reservation_usage
Job resource usage breakdown by reservation.
| Type | Description | 
| List[google.cloud.bigquery.job.ReservationUsage] | Reservation usage stats. Can be empty if not set from the server. | 
schema
See google.cloud.bigquery.job.LoadJobConfig.schema.
schema_update_options
See google.cloud.bigquery.job.LoadJobConfig.schema_update_options.
self_link
URL for the job resource.
| Type | Description | 
| Optional[str] | the URL (None until set from the server). | 
skip_leading_rows
See google.cloud.bigquery.job.LoadJobConfig.skip_leading_rows.
source_format
See google.cloud.bigquery.job.LoadJobConfig.source_format.
source_uris
Optional[Sequence[str]]: URIs of data files to be loaded. See https://cloud.google.com/bigquery/docs/reference/rest/v2/Job#JobConfigurationLoad.FIELDS.source_uris for supported URI formats. None for jobs that load from a file.
started
Datetime at which the job was started.
| Type | Description | 
| Optional[datetime.datetime] | the start time (None until set from the server). | 
state
Status of the job.
| Type | Description | 
| Optional[str] | the state (None until set from the server). | 
time_partitioning
See google.cloud.bigquery.job.LoadJobConfig.time_partitioning.
transaction_info
Information of the multi-statement transaction if this job is part of one.
Added in version 2.24.0.
use_avro_logical_types
See google.cloud.bigquery.job.LoadJobConfig.use_avro_logical_types.
user_email
E-mail address of the user who submitted the job.
| Type | Description |
| Optional[str] | the e-mail address (None until set from the server). |
write_disposition
See google.cloud.bigquery.job.LoadJobConfig.write_disposition.
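The configuration properties above are read-only views of the LoadJobConfig supplied at job creation; they are set on the config, not on the job. A hedged sketch:

```python
from google.cloud import bigquery

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.CSV,
    field_delimiter=",",
    quote_character='"',
    allow_jagged_rows=True,  # tolerate missing trailing optional columns
    max_bad_records=10,      # allow up to 10 bad rows before failing the job
    write_disposition=bigquery.WriteDisposition.WRITE_TRUNCATE,
)
# Once a job is created with this config, the same values are readable on the
# LoadJob itself, e.g. job.write_disposition and job.max_bad_records.
```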
script_statistics
Statistics for a child job of a script.
Methods
add_done_callback
add_done_callback(fn)
Add a callback to be executed when the operation is complete.
If the operation is not already complete, this will start a helper thread to poll for the status of the operation in the background.
| Name | Description |
| fn | Callable[Future]. The callback to execute when the operation is complete. |
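Example: a sketch of a completion callback; the callback receives the job itself once it finishes (load_job as in the earlier example).

```python
def on_done(job):
    # The job is complete here, so exception() returns without blocking.
    if job.exception() is None:
        print("load finished:", job.output_rows, "rows")
    else:
        print("load failed:", job.exception())

load_job.add_done_callback(on_done)
```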
cancel
cancel(client=None, retry: retries.Retry = <google.api_core.retry.Retry object>, timeout: float = None)
API call: cancel job via a POST request.
See https://cloud.google.com/bigquery/docs/reference/rest/v2/jobs/cancel
| Name | Description |
| client | Optional[google.cloud.bigquery.client.Client]. The client to use. If not passed, falls back to the client stored on the current dataset. |
| retry | Optional[google.api_core.retry.Retry]. How to retry the RPC. |
| timeout | Optional[float]. The number of seconds to wait for the underlying HTTP transport before using retry. |
| Type | Description |
| bool | Boolean indicating that the cancel request was sent. |
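Example: a sketch of requesting cancellation. The return value only confirms that the request was sent, not that the job actually stopped.

```python
if load_job.cancel():
    load_job.reload()  # refresh local properties to observe the new state
    print("state after cancel request:", load_job.state)
```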
cancelled
cancelled()
Check if the job has been cancelled.
This always returns False. It is not possible to check if a job was cancelled in the API; this method exists to satisfy the interface for google.api_core.future.Future.
| Type | Description | 
| bool | False | 
done
done(retry: retries.Retry = <google.api_core.retry.Retry object>, timeout: float = None, reload: bool = True)
Checks if the job is complete.
| Name | Description |
| retry | Optional[google.api_core.retry.Retry]. How to retry the RPC. If the job state is DONE, retrying is aborted early, as the job will not change anymore. |
| timeout | Optional[float]. The number of seconds to wait for the underlying HTTP transport before using retry. |
| reload | Optional[bool]. If True, make an API call to refresh the job state of unfinished jobs before checking. Default True. |
| Type | Description | 
| bool | True if the job is complete, False otherwise. | 
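Example: a sketch of polling manually instead of blocking in result(). Each done() call with reload=True may make an API request.

```python
import time

while not load_job.done():
    time.sleep(5)  # poll periodically to avoid hammering the API
print("final state:", load_job.state)  # e.g. "DONE"
```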
exception
exception(timeout=None)
Get the exception from the operation, blocking if necessary.
| Name | Description |
| timeout | int. How long to wait for the operation to complete. If None, wait indefinitely. |
| Type | Description | 
| Optional[google.api_core.GoogleAPICallError] | The operation's error. | 
exists
exists(client=None, retry: retries.Retry = <google.api_core.retry.Retry object>, timeout: float = None)
API call: test for the existence of the job via a GET request.
See https://cloud.google.com/bigquery/docs/reference/rest/v2/jobs/get
| Name | Description |
| client | Optional[google.cloud.bigquery.client.Client]. The client to use. If not passed, falls back to the client stored on the current dataset. |
| retry | Optional[google.api_core.retry.Retry]. How to retry the RPC. |
| timeout | Optional[float]. The number of seconds to wait for the underlying HTTP transport before using retry. |
| Type | Description | 
| bool | Boolean indicating existence of the job. | 
from_api_repr
from_api_repr(resource: dict, client)
Factory: construct a job given its API representation.
| Name | Description |
| resource | Dict. Job representation returned from the API. |
| client | google.cloud.bigquery.client.Client. Client which holds credentials and project configuration for the dataset. |
| Type | Description | 
| google.cloud.bigquery.job.LoadJob | Job parsed from ``resource``. | 
reload
reload(client=None, retry: retries.Retry = <google.api_core.retry.Retry object>, timeout: float = None)
API call: refresh job properties via a GET request.
See https://cloud.google.com/bigquery/docs/reference/rest/v2/jobs/get
| Name | Description |
| client | Optional[google.cloud.bigquery.client.Client]. The client to use. If not passed, falls back to the client stored on the current dataset. |
| retry | Optional[google.api_core.retry.Retry]. How to retry the RPC. |
| timeout | Optional[float]. The number of seconds to wait for the underlying HTTP transport before using retry. |
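Example: a sketch combining exists and reload to refresh local job state without waiting for completion.

```python
if load_job.exists():
    load_job.reload()  # pull the latest job metadata from the server
    print("state:", load_job.state, "ended:", load_job.ended)
```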
result
result(retry: retries.Retry = <google.api_core.retry.Retry object>, timeout: float = None)
Start the job and wait for it to complete and get the result.
| Name | Description |
| retry | Optional[google.api_core.retry.Retry]. How to retry the RPC. If the job state is DONE, retrying is aborted early, as the job will not change anymore. |
| timeout | Optional[float]. The number of seconds to wait for the underlying HTTP transport before using retry. |
Exceptions
| Type | Description |
| google.cloud.exceptions.GoogleAPICallError | if the job failed. |
| concurrent.futures.TimeoutError | if the job did not complete in the given timeout. |
Returns
| Type | Description |
| _AsyncJob | This instance. |
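Example: a sketch of waiting with a timeout and handling failure; error_result and errors carry the server-side details.

```python
import concurrent.futures

from google.api_core import exceptions

try:
    load_job.result(timeout=300)  # wait up to five minutes
except concurrent.futures.TimeoutError:
    print("job did not complete within the timeout")
except exceptions.GoogleAPICallError:
    print("load failed:", load_job.error_result)
    for err in load_job.errors or []:
        print("  -", err)
```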
running
running()
True if the operation is currently running.
set_exception
set_exception(exception)
Set the Future's exception.
set_result
set_result(result)
Set the Future's result.
to_api_repr
to_api_repr()
Generate a resource for _begin.
__init__
__init__(job_id, source_uris, destination, client, job_config=None)
Initialize self. See help(type(self)) for accurate signature.