Warn about input errors with object inputs, remove DNS section (#606)
Nuru authored Mar 14, 2024
1 parent f6b441e commit 095d66b
Showing 12 changed files with 144 additions and 64 deletions.
5 changes: 5 additions & 0 deletions .editorconfig
@@ -11,3 +11,8 @@ indent_size = 4

[*.sh]
indent_style = tab

[*.py]
indent_style = space
indent_size = 4

2 changes: 1 addition & 1 deletion content/docs/fundamentals/introduction.md
@@ -34,7 +34,7 @@ How does it differentiate from these solutions?

1. It's 100% Open Source: SweetOps [is on GitHub](https://github.com/cloudposse) and is free to use with no strings attached under Apache 2.0.
1. It's comprehensive: SweetOps is not only about Terraform. It provides patterns and conventions for building cloud native platforms that are security focused, Kubernetes-based, and driven by continuous delivery.
1. It's community focused: SweetOps has [over 3400 users in Slack](https://sweetops.com/slack/), well-attended weekly office hours, and a [budding community forum](https://ask.sweetops.com/).
1. It's community focused: SweetOps has [over 9000 users in Slack](https://sweetops.com/slack/), well-attended weekly office hours, and a [budding community forum](https://ask.sweetops.com/).


## How is this documentation structured?
4 changes: 2 additions & 2 deletions content/docs/intro.md
@@ -12,7 +12,7 @@ Start with getting familiar with the [geodesic](/reference/tools.mdx#geodesic).

Get intimately familiar with docker inheritance and [multi-stage docker builds](/reference/best-practices/docker-best-practices.md#multi-stage-builds). We use this pattern extensively.

Check out our [terraform-aws-components](https://github.com/cloudposse/terraform-aws-components) for reference architectures to easily provision infrastructure
Check out our [terraform-aws-components](https://github.com/cloudposse/terraform-aws-components) for reference architectures to easily provision infrastructure.

## Tools

@@ -70,7 +70,7 @@ Review our [glossary](/category/glossary/) if there are any terms that are confu

File issues anywhere you find the documentation lacking by going to our [docs repo](https://github.com/cloudposse/docs).

Join our [Slack Community](https://cloudposse.com/slack/) and speak directly with the maintainers
Join our [Slack Community](https://cloudposse.com/slack/) and speak directly with the maintainers.

We provide "white glove" DevOps support. [Get in touch](/contact-us.md) with us today!

@@ -15,10 +15,10 @@ Try to leverage the same base image in as many of your images as possible for fa

## Multi-stage Builds

There are two ways to leverage multi-stage builds.
There are two ways to leverage multi-stage builds:

1. *Build-time Environments* The most common application of multi-stage builds is for using a build-time environment for compiling apps, and then a minimal image (E.g. `alpine` or `scratch`) for distributing the resultant artifacts (e.g. statically-linked go binaries).
2. *Multiple-Inheritance* We like to think of "multi-stage builds" as a mechanism for "multiple inheritance" as it relates to docker images. While not technically the same thing, using mult-stage images, it's possible `COPY --from=other-image` to keep things very DRY.
1. *Build-time Environments* The most common application of multi-stage builds is for using a build-time environment for compiling apps, and then a minimal image (E.g. `alpine` or `scratch`) for distributing the resultant artifacts (e.g. statically-linked `go` binaries).
2. *Multiple-Inheritance* We like to think of "multi-stage builds" as a mechanism for "multiple inheritance" as it relates to docker images. While not technically the same thing, using multi-stage images makes it possible to `COPY --from=other-image` to keep things very DRY.

:::info
- <https://docs.docker.com/develop/develop-images/multistage-build/>
69 changes: 47 additions & 22 deletions content/docs/reference/best-practices/terraform-best-practices.md
@@ -100,6 +100,23 @@ conversion errors, and allows for future expansion without breaking changes.
Make as many fields as possible optional, provide defaults at every level of
nesting, and use `nullable = false` if possible.

:::caution Extra (or Misspelled) Fields in Object Inputs Will Be Silently Ignored

If you use an object with defaults as an input, Terraform will not give any
indication if the user provides extra fields in the input object. This is
particularly a problem if they misspelled an optional field name, because
the misspelled field will be silently ignored, and the default value the
user intended to override will silently be used. This is
[a limitation of Terraform](https://github.com/hashicorp/terraform/issues/29204#issuecomment-1989579801).
Furthermore, there is no way to add any checks for this situation, because
the input will have already been transformed (unexpected fields removed) by
the time any validation code runs. This makes using an object a trade-off
versus using separate inputs, which do not have this problem, or `type = any`
which allows you to write validation code to catch this problem and
additional code to supply defaults for missing fields.

:::
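
To illustrate the pitfall (a hypothetical sketch; the variable and field names are invented for this example):

```hcl
variable "node_group" {
  type = object({
    instance_type = optional(string, "t3.medium")
    min_size      = optional(number, 1)
  })
  default  = {}
  nullable = false
}

# A caller who misspells the optional field gets no warning:
#
#   module "example" {
#     source     = "./modules/node-group"
#     node_group = {
#       instance_type = "t3.large"
#       minsize       = 3   # silently dropped; min_size remains 1
#     }
#   }
```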

Reserve `type = any` for exceptional cases where the input is highly
variable and/or complex, and the module is designed to handle it. For
example, the configuration of a [Datadog synthetic test](https://registry.terraform.io/providers/DataDog/datadog/latest/docs/resources/synthetics_test)
@@ -117,8 +134,9 @@ often use a large number of input variables of simple types. This is because
in the early development of Terraform, there was no good way to define
complex objects with defaults. However, now that Terraform supports complex
objects with field-level defaults, we recommend using a single object input
variable with such defaults to group related configuration. This makes the
interface easier to understand and use.
variable with such defaults to group related configuration, taking into consideration
the trade-offs listed in the [above caution](#use-objects-with-optional-fields-for-complex-inputs).
This makes the interface easier to understand and use.

For example, prefer:

@@ -151,6 +169,27 @@ variable "eip_delete_timeout" {
}
```

However, using an object with defaults versus multiple simple inputs is not
without trade-offs, as explained in the [above caution](#use-objects-with-optional-fields-for-complex-inputs).


There are a few ways to mitigate this problem besides using separate inputs:

- If all the defaults are null or empty, you can use a `map(string)` input
variable and use the `keys` function to check for unexpected fields. This
catches errors, but has the drawback that it does not provide
documentation of what fields are expected.
- You can use `type = any` for inputs, but then you have to write the extra
code to validate the input and supply defaults for missing fields. You
should also document the expected fields in the input description.
- If all you are worried about is misspelled field names, you can make the
correctly spelled field names required, ensuring they are supplied.
Alternatively, if the misspelling is predictable, such as when you have a field
named `minsize` but people are likely to try to supply `min_size`, you can
make the misspelled field name optional with a sentinel value and then
check for that value in the validation code.
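
A sketch of the first mitigation (the variable and key names are invented for illustration), rejecting unexpected keys with `setsubtract` and supplying defaults with `merge`:

```hcl
variable "timeouts" {
  type     = map(string)
  default  = {}
  nullable = false

  validation {
    # Any key outside the documented set is an error.
    condition     = length(setsubtract(keys(var.timeouts), ["create", "update", "delete"])) == 0
    error_message = "Allowed keys are: create, update, delete."
  }
}

locals {
  # Supply defaults for any keys the caller omitted.
  timeouts = merge(
    { create = "10m", update = "10m", delete = "10m" },
    var.timeouts
  )
}
```

Here a misspelled key such as `craete` fails at plan time instead of being silently dropped, at the cost of losing the self-documenting object type.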


### Use custom validators to enforce custom constraints

Use the `validation` block to enforce custom constraints on input variables.
@@ -277,7 +316,12 @@ configuration to a separate template file.

Linting helps to ensure a consistent code formatting, improves code quality and catches common errors with syntax.

Run `terraform fmt` before committing all code. Use a `pre-commit` hook to do this automatically. See [Terraform Tips & Tricks](/reference/best-practices/terraform-tips-tricks.md)
Run `terraform fmt` before committing all code. Use a `pre-commit` hook to
do this automatically. See [Terraform Tips &
Tricks](/reference/best-practices/terraform-tips-tricks.md)

Consider using [`tflint`](https://github.com/terraform-linters/tflint) with
the `aws` plugin.

### Use CIDR math interpolation functions for network calculations
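
As a sketch of the kind of calculation this section is about (the CIDR values are illustrative):

```hcl
locals {
  vpc_cidr = "10.0.0.0/16"

  # Derive four /24 subnets from the VPC CIDR instead of hard-coding them:
  # cidrsubnet("10.0.0.0/16", 8, i) adds 8 bits to the prefix (making a /24)
  # and selects network number i within it.
  subnet_cidrs = [for i in range(4) : cidrsubnet(local.vpc_cidr, 8, i)]
  # => ["10.0.0.0/24", "10.0.1.0/24", "10.0.2.0/24", "10.0.3.0/24"]
}
```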

@@ -412,25 +456,6 @@ To enforce consistency, we require that all modules use the [`terraform-null-lab
With this module, users have the ability to change the way resource names are generated such as by changing the order of parameters or the delimiter.
While the module is opinionated on the parameters, it's proved invaluable as a mechanism for generating consistent resource names.

## DNS Infrastructure

### Use lots of DNS zones

Never mingle DNS from different stages or environments in the same zone.

### Delegate DNS zones across account boundaries

Delegate each AWS account a DNS zone for which it is authoritative.

### Distinguish between branded domains and service discovery domains

Service discovery domains are what services use to discover each other. These are seldom if ever used by end-users. There should only
be one service discovery domain, but there may be many zones delegated from that domain.

Branded domains are the domains that users use to access the services. These are determined by products, marketing, and business use-cases.
There may be many branded domains pointing to a single service discovery domain. The architecture of the branded domains won't mirror the
service discovery domains.

## Module Design

### Small Opinionated Modules
@@ -303,7 +303,7 @@ If you are unlucky (or if you run `terraform apply` 3 times), the change
will go through, and user "Dick" will be renamed user "Tom", meaning that
whatever access Dick had, Tom now gets. Likewise, user Dick is renamed Harry,
getting Harry's access, and Harry gets the newly created user. For example, Tom
can now log in with user name Tom using Dick's password, while Harry will be
can now log in with user name "Tom" using Dick's password, while Harry will be
locked out as a new user. This nightmare scenario has a lot to do with
peculiarities of the implementation of IAM principals, but gives you an idea
of what can happen when you use `count` with a list of resource configurations.
@@ -322,6 +322,8 @@ affected. The answer to this is `for_each`, but that is not without its own
limitations.
:::
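
A minimal sketch of the scenario described above (the resource and variable are illustrative):

```hcl
variable "user_names" {
  type    = list(string)
  default = ["Tom", "Dick", "Harry"]
}

resource "aws_iam_user" "this" {
  count = length(var.user_names)
  name  = var.user_names[count.index]
}

# Removing "Tom" from the list shifts every remaining index, so Terraform
# wants to rename this[0] and this[1] in place and destroy this[2],
# rather than simply destroying the one user that was removed.
```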

### For Each is Stable, But Not Always Feasible to Use

#### The Stability of `for_each`

In large part to address the instability of `count`, Terraform introduced
@@ -399,6 +401,16 @@ this has all the same problems as `count`, in which case using `count` is
better because it is simpler and all of the issues with `count` are already
understood.

:::note
Another limitation, though not frequently encountered, is that "sensitive"
values, such as sensitive input variables, sensitive outputs, or sensitive
resource attributes, cannot be used as arguments to `for_each`. As stated
previously, the value supplied to `for_each` is used as part of the resource
address, and as such, it will always be disclosed in UI output, which is why
sensitive values are not allowed. Attempts to use sensitive values as
`for_each` arguments will result in an error.
:::
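
For example (a hypothetical sketch; `nonsensitive()` strips the sensitive marking and should only be used when you know the keys themselves are not secret):

```hcl
variable "secrets" {
  type      = map(string)
  sensitive = true
}

# This would be an error, because the for_each keys become part of the
# resource address and would disclose the sensitive value:
#
#   resource "aws_ssm_parameter" "this" {
#     for_each = var.secrets
#     ...
#   }

# One workaround: expose only the keys, which here we assume are safe.
resource "aws_ssm_parameter" "this" {
  for_each = nonsensitive(toset(keys(var.secrets)))

  name  = each.key
  type  = "SecureString"
  value = var.secrets[each.key]
}
```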

Ideally, as we saw with IAM users in the examples above, the user would
supply static keys in the initial configuration, and then they would always
be known and usable in `for_each`, while allowing the user to add or remove
8 changes: 4 additions & 4 deletions content/docs/tutorials/geodesic-getting-started.md
@@ -27,9 +27,9 @@ Before we jump in, it's important to note that Geodesic is built around some adv

Let's talk about a few of the ways that one can run Geodesic. Our toolbox has been built to satisfy many use-cases, and each results in a different pattern of invocation:

1. You can **run standalone** Geodesic as a standard docker container using `docker run`. This enables you to get started quickly, to avoid fiddling with configuration or run one-off commands using some of the built-in tools.
1. You can **run standalone** Geodesic as a standard docker container using `docker run`. This enables you to quickly use most of the built-in tools. (Some tools require installing the wrapper script first, as explained in the next step.)
1. Example: `docker run -it --rm --volume $HOME:/localhost cloudposse/geodesic:latest-debian --login` opens a bash login shell (`--login` is our Docker `CMD` here; it's actually just [the arguments passed to the `bash` shell](https://www.gnu.org/software/bash/manual/html_node/Bash-Startup-Files.html) which is our `ENTRYPOINT`) in our Geodesic container.
1. Example: `docker run -it --rm --volume $HOME:/localhost cloudposse/geodesic:latest-debian -c "terraform version"` executes the `terraform version` command as a one off and outputs the result.
1. Example: `docker run --rm cloudposse/geodesic:latest-debian -c "terraform version"` executes the `terraform version` command as a one-off and outputs the result.
1. You can **install** Geodesic onto your local machine using what we call the docker-bash pattern (e.g. `docker run ... | bash`). Similar to above, this enables a quickstart process but supports longer lived usage as it creates a callable script on your machine that enables reuse any time you want to start a shell.
1. Example: `docker run --rm cloudposse/geodesic:latest-debian init | bash -s latest-debian` installs `/usr/local/bin/geodesic` on your local machine which you can execute repeatedly via simply typing `geodesic`. In this example, we're pinning the script to use the `cloudposse/geodesic:latest-debian` docker image, but we could also pin to our own image or to a specific version.
1. Lastly, you can **build your own toolbox** on top of Geodesic. This is what SweetOps generally recommends to practitioners. We do this when we want to provide additional packages or customization to our team while building on the foundation that geodesic provides. This is simple to do by using Geodesic as your base image (e.g. `FROM cloudposse/geodesic:latest-debian`) in your own `Dockerfile`, adding your own Docker `RUN` commands or overriding environment variables, and then using `docker build` to create a new image that you distribute to your team. This is more advanced usage and we'll cover how to do this in a future how-to article.
@@ -90,9 +90,9 @@ terraform init
terraform apply -auto-approve
```

Sweet, you should see a successful `terraform apply` with some detailed `output` info on the original star wars hero! 😎
Sweet, you should see a successful `terraform apply` with some detailed `output` data on the original star wars hero! 😎

Just to show some simple usage of another tool in the toolbox, how about we pull apart that info and get that hero's name?
Just to show some simple usage of another tool in the toolbox, how about we parse that data and get that hero's name?

### 4. Read some data from our Outputs

4 changes: 2 additions & 2 deletions scripts/docs-collator/AbstractFetcher.py
@@ -19,8 +19,8 @@ def __init__(self, github_provider, download_dir):
def _fetch_readme_yaml(self, repo, module_download_dir):
self.github_provider.fetch_file(repo, README_YAML, module_download_dir)

def _fetch_docs(self, repo, module_download_dir):
remote_files = self.github_provider.list_repo_dir(repo, DOCS_DIR)
def _fetch_docs(self, repo, module_download_dir, submodule_dir=""):
remote_files = self.github_provider.list_repo_dir(repo, os.path.join(submodule_dir, DOCS_DIR))

for remote_file in remote_files:
if os.path.basename(remote_file) == TARGETS_MD: # skip targets.md
17 changes: 10 additions & 7 deletions scripts/docs-collator/AbstractRenderer.py
@@ -15,20 +15,23 @@ def __init__(self, message):


class AbstractRenderer:
def _pre_rendering_fixes(self, repo, module_download_dir):
readme_yaml_file = os.path.join(module_download_dir, README_YAML)
def _pre_rendering_fixes(self, repo, module_download_dir, submodule_dir=""):
readme_yaml_file = os.path.join(module_download_dir, submodule_dir, README_YAML)
content = io.read_file_to_string(readme_yaml_file)
content = rendering.remove_targets_md(content)
content = rendering.rename_name(repo, content)
if submodule_dir == "":
content = rendering.rename_name(repo.name, content)
else:
content = rendering.rename_name("pre-fix-" + os.path.basename(submodule_dir), content)
io.save_string_to_file(readme_yaml_file, content)

def _post_rendering_fixes(self, repo, readme_md_file):
def _post_rendering_fixes(self, repo, readme_md_file, submodule_dir=""):
content = io.read_file_to_string(readme_md_file)
content = rendering.fix_self_non_closing_br_tags(content)
content = rendering.fix_custom_non_self_closing_tags_in_pre(content)
content = rendering.fix_github_edit_url(content, repo)
content = rendering.fix_sidebar_label(content, repo)
content = rendering.replace_relative_links_with_github_links(repo, content)
content = rendering.fix_github_edit_url(content, repo, submodule_dir)
content = rendering.fix_sidebar_label(content, repo, os.path.basename(submodule_dir))
content = rendering.replace_relative_links_with_github_links(repo, content, submodule_dir)
io.save_string_to_file(readme_md_file, content)

def _copy_extra_resources_for_docs(self, module_download_dir, module_docs_dir):
17 changes: 13 additions & 4 deletions scripts/docs-collator/ModuleFetcher.py
@@ -40,9 +40,18 @@ def __fetch_images(self, repo, module_download_dir):

def __fetch_submodules(self, repo, module_download_dir):
remote_files = self.github_provider.list_repo_dir(repo, SUBMODULES_DIR)
readme_files = {}

for remote_file in remote_files:
if os.path.basename(remote_file) != README_MD:
continue

self.github_provider.fetch_file(repo, remote_file, module_download_dir)
base_name = os.path.basename(remote_file)
dir_name = os.path.dirname(remote_file)

if base_name == README_YAML:
readme_files[dir_name] = remote_file
elif base_name == README_MD and dir_name not in readme_files:
readme_files[dir_name] = remote_file

for readme_file in readme_files.values():
self.github_provider.fetch_file(repo, readme_file, module_download_dir)
if os.path.basename(readme_file) == README_YAML:
self._fetch_docs(repo, module_download_dir, submodule_dir=os.path.dirname(readme_file))