published by nick
Tue 08 May 2018

While working on stack-docker I was revamping the project to no longer pass passwords as environment variables. There have been too many incidents lately of big companies accidentally leaking passwords in log files, tracebacks, etc. This got me thinking: we should stop running containers with passwords in environment variables by default.

We should stop promoting this as a way of deploying Elasticsearch.

docker run -e ELASTIC_PASSWORD=changeme -p 9200:9200 docker.elastic.co/elasticsearch/elasticsearch:6.2.4 
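A better pattern, and the one this post builds toward, is to put the password in a keystore and hand that file to the container instead. A minimal sketch, assuming you have already built an elasticsearch.keystore on the host and that the official image keeps its config in /usr/share/elasticsearch/config:

# mount a keystore built on the host instead of passing a password in the environment
docker run \
  -v $(pwd)/elasticsearch.keystore:/usr/share/elasticsearch/config/elasticsearch.keystore \
  -p 9200:9200 \
  docker.elastic.co/elasticsearch/elasticsearch:6.2.4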

While working on this project I learned to love the keystore. All products in the Elastic Stack support the keystore. The keystore is a way to store secure settings without exposing them in plain text at any point in your deployment (aside from the moment you store the password in the keystore).

Elasticsearch, Logstash, Kibana, Beats, and APM all have their own implementation of the keystore tool.

Elasticsearch

The elasticsearch-keystore tool is packaged with every build of Elasticsearch and is located in the $ES_HOME/bin directory.

To create the Elasticsearch keystore just run the command:

bin/elasticsearch-keystore create

This will create an elasticsearch.keystore file in the same location as your elasticsearch.yml config file.

In this keystore file we can store sensitive information such as our bootstrap.password. When Elasticsearch 6.x came out the default password of changeme was removed and the bootstrap.password setting was introduced. This was done because of the massive number of unprotected Elastic clusters in the wild still using the default password.

To bootstrap our cluster's default password with the keystore we need to add bootstrap.password:

bin/elasticsearch-keystore add bootstrap.password

This will prompt us to type in a value for the password.
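If you are scripting this (say, in a container entrypoint), the value can be piped in rather than typed; here's a minimal sketch, assuming your version of elasticsearch-keystore supports the --stdin flag:

# non-interactive: read the value from stdin instead of prompting for it
echo "my_super_secret_password" | bin/elasticsearch-keystore add --stdin bootstrap.password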

After the bootstrap password has been set we can start Elasticsearch:

bin/elasticsearch

And navigating to localhost:9200 will prompt us for a username and password.
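We can do the same check from the command line; a quick sketch with curl, assuming a local node without TLS and where the password is whatever value we stored as bootstrap.password:

# basic auth check against the local node as the elastic user
curl -u elastic:my_super_secret_password http://localhost:9200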

We can now use this password to programmatically set the passwords for our other built-in users, kibana and logstash_system.

ES_URL=https://elastic:elastic_password@elasticsearch:9200

# Set password for kibana user
curl -k -s -H 'Content-Type:application/json' -XPUT $ES_URL/_xpack/security/user/kibana/_password -d "{\"password\": \"my_super_secret_password\"}"

# Set password for logstash_system user
curl -k -s -H 'Content-Type:application/json' -XPUT $ES_URL/_xpack/security/user/logstash_system/_password -d "{\"password\": \"my_super_secret_password\"}"

Kibana

Now that we have Elasticsearch running with a secure password, we need to make sure Kibana can connect to Elasticsearch. I often see people put the password for the kibana user directly into the kibana.yml file so Kibana can connect to ES. This leads to a couple of problems:

  1. The password is stored in plain text, so anyone who has access to the kibana.yml file can read it.
  2. The password can be mistyped, which can lead to headaches trying to figure out why Kibana can no longer log in to Elasticsearch.

bin/kibana-keystore to the rescue! Just like Elasticsearch, Kibana can use a keystore to securely store sensitive information.

To create the keystore file just run:

bin/kibana-keystore create

Next we need to add the elasticsearch.password setting.

bin/kibana-keystore add elasticsearch.password

And just like with the Elasticsearch keystore we will be prompted to enter the password for our kibana user.

The trick with the Kibana keystore is that Kibana reads settings from both kibana.yml and kibana.keystore.

So we can configure our kibana.yml like this:

elasticsearch.url: https://elasticsearch:9200
elasticsearch.username: kibana

Notice how we did not add elasticsearch.password. That is because we've specified the password in our keystore file. Now Kibana can connect to Elasticsearch and we haven't exposed our password in plain text anywhere! yay \o/
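If you ever want to double-check what made it into the keystore (just the setting names, not the values), the tool can list its entries:

bin/kibana-keystore list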

Logstash

Similar to our Kibana exercise, Logstash has a keystore as well.

bin/logstash-keystore create

This will create our keystore file for Logstash.
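One thing worth knowing: the Logstash keystore itself can be protected with its own password by exporting LOGSTASH_KEYSTORE_PASS before the keystore is created (and before Logstash starts). A sketch, assuming your Logstash version supports it:

# set before running the keystore commands and before starting Logstash
export LOGSTASH_KEYSTORE_PASS=my_keystore_password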

To seed our keystore with the password for our logstash_system user we add an ES_PASSWORD entry:

bin/logstash-keystore add ES_PASSWORD

Again this will prompt us to type the password for the logstash_system user.

Logstash can read from its keystore in both logstash.yml and the various pipeline .conf files, so access to the keystore in Logstash is a bit different.

Let's first configure our logstash.yml file to set up X-Pack monitoring.

# read password from logstash.keystore
xpack.monitoring.elasticsearch.password: ${ES_PASSWORD}
xpack.monitoring.elasticsearch.url: https://elasticsearch:9200
xpack.monitoring.elasticsearch.username: logstash_system

Now let's create a sample Logstash pipeline which outputs to Elasticsearch.

input {
  heartbeat {
    interval => 5
    message  => 'Hello from Logstash 💓'
  }
}

output {
  elasticsearch {
    hosts    => [ 'elasticsearch' ]
    user     => 'elastic'
    password => "${ES_PASSWORD}"  # read password from logstash.keystore
  }
}
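To try it out we can point Logstash at this pipeline file; the filename below is just an example:

# run Logstash with the sample pipeline (heartbeat.conf is an assumed filename)
bin/logstash -f heartbeat.conf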

Beats and APM

Beats and the APM server are all written in Go, and their keystores work the same way.

To create the keystore, run:

<beat_name/apm-server> keystore create

where beat_name could be packetbeat, filebeat, etc., and apm-server is the binary that runs the APM server.

Now to add values, such as our ES_PASSWORD, to the keystore we'd run:

<beat_name/apm-server> keystore add ES_PASSWORD

And just like in the Logstash example, we can reference ES_PASSWORD in our config files:

output.elasticsearch:
  hosts: ['elasticsearch:9200']
  username: elastic
  password: "${ES_PASSWORD}"

We've now successfully set up our Elastic Stack using keystores for Elasticsearch, Kibana, Logstash, Beats, and APM, and we've secured our passwords so they aren't stored in plain text anywhere in our deployment.
