Modified custom.yaml is not working so unable to upgrade helm
Article ID: 232022


Products

CA Application Test

Issue/Introduction

The following errors appear in the helm upgrade debug output:

$ helm upgrade --debug <HELMVERSION> ./devtest --install --force --values ./custom-values.yaml --namespace <NAMESPACE>
WARNING: Kubernetes configuration file is group-readable. This is insecure. Location: /Users/<user>/.kube/config
WARNING: Kubernetes configuration file is world-readable. This is insecure. Location: /Users/<user>/.kube/config
history.go:56: [debug] getting history for release <HELMVERSION>
Release "<HELMVERSION>" does not exist. Installing it now.
install.go:173: [debug] Original chart version: ""
install.go:190: [debug] CHART PATH: /Users/<user>/Documents/Lisa/LisaLinuxInstaller/aws/10.6/Lisa_10.6_PatchV.1_Test_SSO/devtest

Error: unable to build kubernetes objects from release manifest: error validating "": error validating data: ValidationError(StatefulSet.spec): unknown field "minReadySeconds" in io.k8s.api.apps.v1.StatefulSetSpec
helm.go:81: [debug] error validating "": error validating data: ValidationError(StatefulSet.spec): unknown field "minReadySeconds" in io.k8s.api.apps.v1.StatefulSetSpec
helm.sh/helm/v3/pkg/kube.scrubValidationError
 /private/tmp/helm-20210414-93729-197z3ms/pkg/kube/client.go:594
helm.sh/helm/v3/pkg/kube.(*Client).Build
 /private/tmp/helm-20210414-93729-197z3ms/pkg/kube/client.go:187
helm.sh/helm/v3/pkg/action.(*Install).Run
 /private/tmp/helm-20210414-93729-197z3ms/pkg/action/install.go:256
main.runInstall
 /private/tmp/helm-20210414-93729-197z3ms/cmd/helm/install.go:242
main.newUpgradeCmd.func2
 /private/tmp/helm-20210414-93729-197z3ms/cmd/helm/upgrade.go:115
github.com/spf13/cobra.(*Command).execute
 /Users/brew/Library/Caches/Homebrew/go_mod_cache/pkg/mod/github.com/spf13/[email protected]/command.go:850
github.com/spf13/cobra.(*Command).ExecuteC
 /Users/brew/Library/Caches/Homebrew/go_mod_cache/pkg/mod/github.com/spf13/[email protected]/command.go:958
github.com/spf13/cobra.(*Command).Execute
 /Users/brew/Library/Caches/Homebrew/go_mod_cache/pkg/mod/github.com/spf13/[email protected]/command.go:895
main.main
 /private/tmp/helm-20210414-93729-197z3ms/cmd/helm/helm.go:80
runtime.main
 /usr/local/Cellar/go/1.16.3/libexec/src/runtime/proc.go:225
runtime.goexit
 /usr/local/Cellar/go/1.16.3/libexec/src/runtime/asm_amd64.s:1371
unable to build kubernetes objects from release manifest
helm.sh/helm/v3/pkg/action.(*Install).Run
 /private/tmp/helm-20210414-93729-197z3ms/pkg/action/install.go:258
main.runInstall
 /private/tmp/helm-20210414-93729-197z3ms/cmd/helm/install.go:242
main.newUpgradeCmd.func2
 /private/tmp/helm-20210414-93729-197z3ms/cmd/helm/upgrade.go:115
github.com/spf13/cobra.(*Command).execute
 /Users/brew/Library/Caches/Homebrew/go_mod_cache/pkg/mod/github.com/spf13/[email protected]/command.go:850
github.com/spf13/cobra.(*Command).ExecuteC
 /Users/brew/Library/Caches/Homebrew/go_mod_cache/pkg/mod/github.com/spf13/[email protected]/command.go:958
github.com/spf13/cobra.(*Command).Execute
 /Users/brew/Library/Caches/Homebrew/go_mod_cache/pkg/mod/github.com/spf13/[email protected]/command.go:895
main.main
 /private/tmp/helm-20210414-93729-197z3ms/cmd/helm/helm.go:80
runtime.main
 /usr/local/Cellar/go/1.16.3/libexec/src/runtime/proc.go:225
runtime.goexit
 /usr/local/Cellar/go/1.16.3/libexec/src/runtime/asm_amd64.s:1371


The following error may also appear:
Error: rendered manifests contain a resource that already exists.
Unable to continue with install: ServiceAccount "DevTest" in namespace "XXXXXXXXX" exists and cannot be imported into the current release: invalid ownership metadata;
label validation error: missing key "app.kubernetes.io/managed-by": must be set to "Helm";
annotation validation error: missing key "meta.helm.sh/release-name"
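The ownership check behind this error can be illustrated with a minimal sketch (this is an illustration, not Helm's actual code): Helm 3 will only adopt an existing resource if it carries the `app.kubernetes.io/managed-by` label set to "Helm" and a `meta.helm.sh/release-name` annotation.

```shell
# Sketch of Helm 3's ownership validation for an existing resource.
helm3_can_adopt() {
  managed_by="$1"     # value of label app.kubernetes.io/managed-by
  release_name="$2"   # value of annotation meta.helm.sh/release-name
  [ "$managed_by" = "Helm" ] && [ -n "$release_name" ]
}
# The real values on the conflicting ServiceAccount can be inspected with, e.g.:
#   kubectl get serviceaccount DevTest -n <NAMESPACE> -o jsonpath='{.metadata.labels}'
```

A resource created by Helm 2 carries neither the label nor the annotation, which is why Helm 3 refuses to import it.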

Environment

All supported DevTest releases and platforms.

Cause

DevTest on Kubernetes requires Helm 2.17.
The errors above are caused by Helm 3.x being installed and used instead.
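A quick pre-flight check (a sketch, not part of the product) is to classify the string printed by `helm version --short` before deploying:

```shell
# Classify the output of `helm version --short`.
check_helm_version() {
  case "$1" in
    *v2.17*) echo "ok" ;;                # version DevTest requires
    *v2.*)   echo "move-to-2.17" ;;      # other Helm 2 release
    *v3.*)   echo "downgrade-to-2.17" ;; # Helm 3 causes the errors above
    *)       echo "unknown" ;;
  esac
}
# In practice: check_helm_version "$(helm version --short 2>/dev/null)"
```

Helm 3 prints a single version such as "v3.5.4+g1b5edb6", while Helm 2 prints separate client/server lines such as "Client: v2.17.0+ga690bad".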

Resolution

Downgrade Helm to version 2.17 and deploy again.
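A hedged sketch of the downgrade follows. The Helm 2.17.0 client is available from the official release host; the darwin-amd64 platform string is an assumption based on the macOS paths in the log above (use linux-amd64 etc. as appropriate):

```shell
# Build the download URL for the Helm 2.17.0 release tarball.
HELM_VER="v2.17.0"
PLATFORM="darwin-amd64"   # assumption: adjust for your OS/architecture
URL="https://get.helm.sh/helm-${HELM_VER}-${PLATFORM}.tar.gz"
echo "$URL"
# Roughly:
#   curl -fsSL "$URL" | tar -xz
#   sudo mv "${PLATFORM}/helm" /usr/local/bin/helm
#   helm version        # should now report v2.17.0
#   helm upgrade <HELMVERSION> ./devtest --install --force \
#     --values ./custom-values.yaml --namespace <NAMESPACE>
```

Once `helm version` reports 2.17, re-run the same upgrade command from the Issue section.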