Looking at your cluster, the node groups look to be in bad shape. Did you recently change something on the AWS side? Changed your AWS access key, or reset the Qovery user access key? Here is what I see:
Looking at your EKS nodegroup, I can see this from your AWS account:
"health": {
  "issues": [
    {
      "code": "AccessDenied",
      "message": "The aws-auth ConfigMap in your cluster is invalid.",
      "resourceIds": [
        "qovery-eks-workers-z16cd1bde"
      ]
    }
  ]
},
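For reference, this kind of health report can be pulled directly with the AWS CLI; the cluster name below is a placeholder you would replace with your own:

```shell
# Inspect the nodegroup's health field directly from the EKS API;
# <cluster-name> is a placeholder for your actual EKS cluster name
aws eks describe-nodegroup \
  --cluster-name <cluster-name> \
  --nodegroup-name qovery-eks-workers-z16cd1bde \
  --query 'nodegroup.health'
```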
Also, I see 3 people added to the Admins group. Is it possible that one of them has their own account disabled, or something like that? Or did you update the permissions of the Qovery account?
Note: the aws-auth ConfigMap is regularly regenerated by Qovery, and at first glance it looks correct. So I'm guessing something changed recently (permissions or users).
First of all - I have reverted my changes and I think that has fixed the issue.
Back to my question - I want to be able to give other team members access to view the EKS configuration in the AWS console.
The way I have done this before is to add the users' role ARNs to the ConfigMap with kubectl - but for some reason, each time I add the roles, my changes to the YAML file don't stick.
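For context, a manual edit of this kind would typically add a mapRoles entry like the sketch below (the account ID, role name, username, and group are all placeholders, not values from this cluster) - but since Qovery regenerates the ConfigMap, such an edit gets overwritten:

```yaml
# Sketch of the aws-auth ConfigMap in the kube-system namespace,
# as edited via `kubectl edit configmap aws-auth -n kube-system`.
# All identifiers here are illustrative placeholders.
apiVersion: v1
kind: ConfigMap
metadata:
  name: aws-auth
  namespace: kube-system
data:
  mapRoles: |
    - rolearn: arn:aws:iam::111122223333:role/TeamViewerRole
      username: team-viewer
      groups:
        - system:masters
```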
Ahah yes, I see. As said above, we have a tool that regularly updates this config, so you shouldn't have to update the file manually - otherwise it will conflict.
My suggestion for this is to use the Admins group. Add everyone you want to give access to into this group, wait 5 minutes, and it should be OK.
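Assuming the IAM group is literally named Admins, adding a member can also be done from the CLI (the user name here is a placeholder):

```shell
# Add an existing IAM user to the Admins group; Qovery's tool should
# pick the change up on its next aws-auth refresh (within ~5 minutes)
aws iam add-user-to-group --group-name Admins --user-name jane.doe
```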
I have followed this documentation and it seems to have done the trick.
So basically, from my understanding, Qovery does not want you to modify the kube-system ConfigMaps using kubectl. Qovery instead expects you to create IAM groups and policies to implement this capability.
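For completeness, seeing cluster details in the AWS console also requires EKS read permissions on the IAM side. A minimal, illustrative policy (not taken from this thread) might look like:

```json
{
  "Version": "2012-10-17",
  "Statement": [
    {
      "Effect": "Allow",
      "Action": [
        "eks:DescribeCluster",
        "eks:ListClusters",
        "eks:DescribeNodegroup",
        "eks:ListNodegroups"
      ],
      "Resource": "*"
    }
  ]
}
```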