---
# ----------------------------------------------------------------------------
#
#     ***     AUTO GENERATED CODE    ***    Type: MMv1     ***
#
# ----------------------------------------------------------------------------
#
#     This file is automatically generated by Magic Modules and manual
#     changes will be clobbered when the file is regenerated.
#
#     Please read more about how to change this file in
#     .github/CONTRIBUTING.md.
#
# ----------------------------------------------------------------------------
subcategory: "Biglake"
description: |-
  Represents a table.
---

# google\_biglake\_table

Represents a table.


To get more information about Table, see:

* [API documentation](https://cloud.google.com/bigquery/docs/reference/biglake/rest/v1/projects.locations.catalogs.databases.tables)
* How-to Guides
    * [Manage open source metadata with BigLake Metastore](https://cloud.google.com/bigquery/docs/manage-open-source-metadata#create_tables)

<div class = "oics-button" style="float: right; margin: 0 0 -15px">
  <a href="https://console.cloud.google.com/cloudshell/open?cloudshell_git_repo=https%3A%2F%2Fgithub.com%2Fterraform-google-modules%2Fdocs-examples.git&cloudshell_working_dir=biglake_table&cloudshell_image=gcr.io%2Fcloudshell-images%2Fcloudshell%3Alatest&open_in_editor=main.tf&cloudshell_print=.%2Fmotd&cloudshell_tutorial=.%2Ftutorial.md" target="_blank">
    <img alt="Open in Cloud Shell" src="//gstatic.com/cloudssh/images/open-btn.svg" style="max-height: 44px; margin: 32px auto; max-width: 100%;">
  </a>
</div>
## Example Usage - Biglake Table


```hcl
resource "google_biglake_catalog" "catalog" {
  name     = "my_catalog"
  location = "US"
}

resource "google_storage_bucket" "bucket" {
  name                        = "my_bucket"
  location                    = "US"
  force_destroy               = true
  uniform_bucket_level_access = true
}

resource "google_storage_bucket_object" "metadata_folder" {
  name    = "metadata/"
  content = " "
  bucket  = google_storage_bucket.bucket.name
}

resource "google_storage_bucket_object" "data_folder" {
  name    = "data/"
  content = " "
  bucket  = google_storage_bucket.bucket.name
}

resource "google_biglake_database" "database" {
  name    = "my_database"
  catalog = google_biglake_catalog.catalog.id
  type    = "HIVE"
  hive_options {
    location_uri = "gs://${google_storage_bucket.bucket.name}/${google_storage_bucket_object.metadata_folder.name}"
    parameters = {
      "owner" = "Alex"
    }
  }
}

resource "google_biglake_table" "table" {
  name     = "my_table"
  database = google_biglake_database.database.id
  type     = "HIVE"
  hive_options {
    table_type = "MANAGED_TABLE"
    storage_descriptor {
      location_uri  = "gs://${google_storage_bucket.bucket.name}/${google_storage_bucket_object.data_folder.name}"
      input_format  = "org.apache.hadoop.mapred.SequenceFileInputFormat"
      output_format = "org.apache.hadoop.hive.ql.io.HiveSequenceFileOutputFormat"
    }
    # Some Example Parameters.
    parameters = {
      "spark.sql.create.version"          = "3.1.3"
      "spark.sql.sources.schema.numParts" = "1"
      "transient_lastDdlTime"             = "1680894197"
      "spark.sql.partitionProvider"       = "catalog"
      "owner"                             = "John Doe"
      "spark.sql.sources.schema.part.0"   = "{\"type\":\"struct\",\"fields\":[{\"name\":\"id\",\"type\":\"integer\",\"nullable\":true,\"metadata\":{}},{\"name\":\"name\",\"type\":\"string\",\"nullable\":true,\"metadata\":{}},{\"name\":\"age\",\"type\":\"integer\",\"nullable\":true,\"metadata\":{}}]}"
      "spark.sql.sources.provider"        = "iceberg"
      "provider"                          = "iceberg"
    }
  }
}
```

## Argument Reference

The following arguments are supported:


* `name` -
  (Required)
  The name of the table. The fully qualified resource name, as returned by
  the server, has the format:
  `projects/{project_id_or_number}/locations/{locationId}/catalogs/{catalogId}/databases/{databaseId}/tables/{tableId}`


- - -


* `type` -
  (Optional)
  The table type.
  Possible values are: `HIVE`.

* `hive_options` -
  (Optional)
  Options of a Hive table.
  Structure is [documented below](#nested_hive_options).

* `database` -
  (Optional)
  The id of the parent database.


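For orientation, a minimal configuration needs little more than the arguments above. A minimal sketch, assuming a `google_biglake_database.database` resource is already defined (as in the example earlier on this page):

```hcl
# Minimal sketch: a bare HIVE table under an existing BigLake database.
resource "google_biglake_table" "minimal" {
  name     = "my_minimal_table"
  database = google_biglake_database.database.id
  type     = "HIVE"
}
```
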
<a name="nested_hive_options"></a>The `hive_options` block supports:

* `parameters` -
  (Optional)
  Stores user-supplied Hive table parameters, as a map of "key": value
  string pairs.
  Example: `{ "name": "wrench", "mass": "1.3kg", "count": "3" }`.

* `table_type` -
  (Optional)
  Hive table type. For example, `MANAGED_TABLE` or `EXTERNAL_TABLE`.

* `storage_descriptor` -
  (Optional)
  Stores physical storage information on the data.
  Structure is [documented below](#nested_storage_descriptor).


<a name="nested_storage_descriptor"></a>The `storage_descriptor` block supports:

* `location_uri` -
  (Optional)
  Cloud Storage folder URI where the table data is stored, starting with `gs://`.

* `input_format` -
  (Optional)
  The fully qualified Java class name of the input format.

* `output_format` -
  (Optional)
  The fully qualified Java class name of the output format.

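Putting the nested blocks together: below is a hedged sketch of an `EXTERNAL_TABLE` whose data already lives in a bucket managed outside of Terraform. The bucket path and the input/output format class names are illustrative assumptions, not values mandated by the API:

```hcl
# Sketch only: "my-existing-bucket" and the Hadoop/Hive format classes are
# placeholder assumptions; substitute your own storage layout.
resource "google_biglake_table" "external" {
  name     = "my_external_table"
  database = google_biglake_database.database.id
  type     = "HIVE"
  hive_options {
    table_type = "EXTERNAL_TABLE"
    storage_descriptor {
      location_uri  = "gs://my-existing-bucket/tables/my_external_table"
      input_format  = "org.apache.hadoop.mapred.TextInputFormat"
      output_format = "org.apache.hadoop.hive.ql.io.HiveIgnoreKeyTextOutputFormat"
    }
    parameters = {
      "owner" = "data-platform"
    }
  }
}
```
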
## Attributes Reference

In addition to the arguments listed above, the following computed attributes are exported:

* `id` - an identifier for the resource with format `{{database}}/tables/{{name}}`

* `create_time` -
  Output only. The creation time of the table. A timestamp in RFC3339 UTC
  "Zulu" format, with nanosecond resolution and up to nine fractional
  digits. Examples: "2014-10-02T15:01:23Z" and
  "2014-10-02T15:01:23.045123456Z".

* `update_time` -
  Output only. The last modification time of the table. A timestamp in
  RFC3339 UTC "Zulu" format, with nanosecond resolution and up to nine
  fractional digits. Examples: "2014-10-02T15:01:23Z" and
  "2014-10-02T15:01:23.045123456Z".

* `delete_time` -
  Output only. The deletion time of the table. Only set after the
  table is deleted. A timestamp in RFC3339 UTC "Zulu" format, with
  nanosecond resolution and up to nine fractional digits. Examples:
  "2014-10-02T15:01:23Z" and "2014-10-02T15:01:23.045123456Z".

* `expire_time` -
  Output only. The time when this table is considered expired. Only set
  after the table is deleted. A timestamp in RFC3339 UTC "Zulu" format,
  with nanosecond resolution and up to nine fractional digits. Examples:
  "2014-10-02T15:01:23Z" and "2014-10-02T15:01:23.045123456Z".

* `etag` -
  The checksum of a table object computed by the server based on the value
  of other fields. It may be sent on update requests to ensure the client
  has an up-to-date value before proceeding. It is only checked for update
  table operations.


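These computed attributes can be referenced like any other. For example, to surface the server-side timestamps and checksum of the `google_biglake_table.table` resource from the example above:

```hcl
# Expose computed attributes of the example table as Terraform outputs.
output "table_create_time" {
  value = google_biglake_table.table.create_time
}

output "table_etag" {
  value = google_biglake_table.table.etag
}
```
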
## Timeouts

This resource provides the following
[Timeouts](https://developer.hashicorp.com/terraform/plugin/sdkv2/resources/retries-and-customizable-timeouts) configuration options:

- `create` - Default is 20 minutes.
- `update` - Default is 20 minutes.
- `delete` - Default is 20 minutes.

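To override a default, set a `timeouts` block on the resource. A short sketch, reusing the table from the example above and allowing extra time for create and delete:

```hcl
resource "google_biglake_table" "table" {
  name     = "my_table"
  database = google_biglake_database.database.id
  type     = "HIVE"

  # Update keeps its 20 minute default.
  timeouts {
    create = "30m"
    delete = "30m"
  }
}
```
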
## Import


Table can be imported using any of these accepted formats:

* `{{database}}/tables/{{name}}`


In Terraform v1.5.0 and later, use an [`import` block](https://developer.hashicorp.com/terraform/language/import) to import Table using one of the formats above. For example:

```tf
import {
  id = "{{database}}/tables/{{name}}"
  to = google_biglake_table.default
}
```

When using the [`terraform import` command](https://developer.hashicorp.com/terraform/cli/commands/import), Table can be imported using one of the formats above. For example:

```
$ terraform import google_biglake_table.default {{database}}/tables/{{name}}
```