
Nomad unable to parse template client parameters when using json config format #24001

Closed
ncode opened this issue Sep 19, 2024 · 2 comments · Fixed by #24007
Labels
good first issue stage/accepted Confirmed, and intend to work on. No timeline commitment though. theme/cli theme/config type/bug

Comments

@ncode
Contributor

ncode commented Sep 19, 2024

Nomad version

Output from nomad version

./nomad-mac version
Nomad v1.8.4-dev

Operating system and Environment details

macOS and Linux

Issue

The template client parameters (https://developer.hashicorp.com/nomad/docs/configuration/client#template) do not work when the config is in JSON format.

Reproduction steps

cat ./debug/config.hcl
datacenter = "dc1"
name       = "nomad-client-001"
bind_addr  = "0.0.0.0"
data_dir   = "/opt/nomad/data"

vault {
  enabled = true
  address = "https://vault.services.internal"
}

client {
  enabled = true
  min_dynamic_port = 20000
  max_dynamic_port = 20010
  servers = ["nomad-server-001.internal"]
  cni_path = "/usr/libexec/cni"
  cni_config_dir = "/etc/nomad/cni.d"
  template {
    wait {
      min = "5s"
      max = "15s"
    }
  }
}

consul {
  address = "consul.internal:8500"
}

telemetry {
  collection_interval = "1s"
  disable_hostname = true
  prometheus_metrics = true
  publish_allocation_metrics = true
  publish_node_metrics = true
}

cat ./debug/config.json
{
  "bind_addr": "0.0.0.0",
  "client": {
    "cni_config_dir": "/etc/nomad/cni.d",
    "cni_path": "/usr/libexec/cni",
    "enabled": true,
    "max_dynamic_port": 20010,
    "min_dynamic_port": 20000,
    "servers": [
      "nomad-server-001.internal"
    ],
    "template": {
      "wait": {
        "max": "15s",
        "min": "5s"
      }
    }
  },
  "consul": {
    "address": "consul.internal:8500"
  },
  "data_dir": "/opt/nomad/data",
  "datacenter": "dc1",
  "name": "nomad-client-001",
  "telemetry": {
    "collection_interval": "1s",
    "disable_hostname": true,
    "prometheus_metrics": true,
    "publish_allocation_metrics": true,
    "publish_node_metrics": true
  },
  "vault": {
    "address": "https://vault.services.internal",
    "enabled": true
  }
}
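As a sanity check (not part of the original report), the JSON config above parses cleanly with a generic JSON parser and the nested wait keys are present, so the failure is specific to Nomad's config loader rather than to the JSON syntax itself. A minimal sketch:

```python
import json

# The client/template/wait portion of the JSON config from the report above.
config_json = """
{
  "client": {
    "enabled": true,
    "template": {
      "wait": {
        "max": "15s",
        "min": "5s"
      }
    }
  }
}
"""

# json.loads would raise ValueError if the document were malformed.
config = json.loads(config_json)
wait = config["client"]["template"]["wait"]
print(wait)  # {'max': '15s', 'min': '5s'}
```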
./nomad config validate ./debug/config.hcl
WARNING: mTLS is not configured - Nomad is not secure without mTLS!
Configuration is valid!

./nomad config validate ./debug/config.json
1 error occurred:
        * Error loading configuration from ./debug/config.json: Error loading debug/config.json: client unexpected keys wait

Expected Result

Template config parameters to be parsed correctly from JSON, matching the HCL behavior.

Actual Result

Nomad fails to parse the JSON config with the error "client unexpected keys wait".

@jrasell
Member

jrasell commented Sep 19, 2024

Hi @ncode and thanks for raising this issue. I have been able to reproduce this locally using main at 4d6856a30619572ef43d0e6ffb803ddb459c4856. I'll move this onto our backlog for roadmapping.

@jrasell jrasell added theme/cli theme/config good first issue stage/accepted Confirmed, and intend to work on. No timeline commitment though. labels Sep 19, 2024
@jrasell jrasell moved this from Needs Triage to Needs Roadmapping in Nomad - Community Issues Triage Sep 19, 2024
@ncode
Contributor Author

ncode commented Sep 19, 2024

Hi @jrasell, I've added a PR for this issue. But looking at the code, it seems the artifact block will also need the same fix later.
