Description

When creating a duplicate Spark operator stack, the creation fails with an error on the IAM policy for Spark:
│ Error: creating IAM Policy (spark-operator-doeks-spark-irsa): operation error IAM: CreatePolicy, https response error StatusCode: 409, ..., EntityAlreadyExists: A policy called spark-operator-doeks-spark-irsa already exists. Duplicate names are not allowed.
│
│   with aws_iam_policy.spark,
│   on spark-team.tf line 66, in resource "aws_iam_policy" "spark":
│   66: resource "aws_iam_policy" "spark" {
│
Versions

Module version [Required]: v1.0.3 (latest)
Terraform version: v1.9.5
Reproduction Code [Required]

Steps to reproduce the behavior:

Create two copies of the Spark operator (v4) stack in the same account (I was using a different region for each).

Expected behavior

We should use a unique name for that policy to avoid conflicts.
I think it's this resource giving me a headache:
data-on-eks/analytics/terraform/spark-k8s-operator/spark-team.tf, line 66 in 473189d
We can probably use name_prefix instead of name to ensure the policy name is unique, even when someone uses the same name for the stack: https://registry.terraform.io/providers/hashicorp/aws/latest/docs/resources/iam_policy#argument-reference
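A minimal sketch of that change, assuming the resource looks roughly like the one in spark-team.tf (the local value and policy document names here are illustrative, not the blueprint's actual identifiers):

```hcl
# With name_prefix, the AWS provider appends a unique suffix to the policy
# name, so two copies of the stack in the same account no longer collide.
resource "aws_iam_policy" "spark" {
  # was: name = "${local.name}-spark-irsa"
  name_prefix = "${local.name}-spark-irsa-"
  description = "IAM policy for Spark job execution"
  policy      = data.aws_iam_policy_document.spark_operator.json # hypothetical document
}
```

Note that name and name_prefix are mutually exclusive on aws_iam_policy, so the existing name argument has to be removed; the downside is that the generated policy name is no longer fully predictable.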
Solved in #672 for the Spark operator blueprint; we may need to check the others.