Preface
This post assumes a basic understanding of Docker/Podman, Docker Compose, and the key components of the Docker ecosystem. To get up to speed, see the Prepare Your Container Environment with Docker or Podman section of the Docker docs.
- Install Docker or Podman
- Install Docker Compose
Deploy Elasticsearch Single Node with Docker Compose
Create the docker-compose.yml with the following:
```yaml
# docker-compose.yml
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch-wolfi:8.17.4
    container_name: elasticsearch
    environment:
      - cluster.name=oio-es-cluster
      - node.name=es-node
      - discovery.type=single-node
      - bootstrap.memory_lock=true
      - network.host=0.0.0.0
      - xpack.license.self_generated.type=trial
      - xpack.security.enabled=false
      - "ES_JAVA_OPTS=-Xms1024m -Xmx1024m"
    ports: ['9200:9200']
    volumes:
      - 'es_data:/usr/share/elasticsearch/data'
    healthcheck:
      test: curl -s http://localhost:9200 >/dev/null || exit 1
      interval: 30s
      timeout: 10s
      retries: 5

volumes:
  es_data:
```
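One caveat worth knowing: `bootstrap.memory_lock=true` asks Elasticsearch to lock its heap in RAM, and if the container is not allowed to lock memory the node may fail its memory-lock bootstrap check and exit. Should you hit that, a common fix (a sketch; adjust to your environment) is to raise the `memlock` ulimit on the service:

```yaml
# Add under the elasticsearch service, at the same level as environment:
    ulimits:
      memlock:
        soft: -1   # no soft limit on locked memory
        hard: -1   # no hard limit on locked memory
```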
Run

```shell
docker-compose up -d
```
Container Status

```shell
docker-compose ps -a
docker container ls
docker ps -a
```
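Because the healthcheck runs every 30 seconds, the container can sit in `starting` for a short while. A small helper can block until Docker reports it healthy; this is a sketch, with the `wait_healthy`/`health_of` names mine, and the docker call factored into its own function so it can be stubbed:

```shell
#!/bin/sh
# health_of CONTAINER: print Docker's healthcheck status for the container.
health_of() {
  docker inspect -f '{{.State.Health.Status}}' "$1"
}

# wait_healthy CONTAINER: poll until the healthcheck reports "healthy".
wait_healthy() {
  until [ "$(health_of "$1")" = "healthy" ]; do
    sleep 2
  done
  echo "$1 is healthy"
}

# Usage (assumes the compose stack above is running):
# wait_healthy elasticsearch
```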
Cluster Health & Stats APIs
Once the container is up, explore the cluster:
| API | Command |
|---|---|
| Node list | `curl -s localhost:9200/_cat/nodes?pretty` |
| Cluster health | `curl -s localhost:9200/_cat/health?pretty` |
| Cluster stats | `curl -s 'localhost:9200/_cluster/stats?human&pretty'` |
| Node stats | `curl -s localhost:9200/_nodes/stats?pretty` |
| Specific node | `curl -s localhost:9200/_nodes/es-node/stats?pretty` |
| List indices | `curl -s localhost:9200/_cat/indices?pretty` |
| All indices | `curl -s 'localhost:9200/_cat/indices?expand_wildcards=all&pretty'` |
| Index-level stats | `curl -s localhost:9200/_nodes/stats/indices?pretty` |
| Plugins | `curl -s localhost:9200/_nodes/plugins` |

Note that URLs with more than one query parameter are single-quoted so the shell does not treat `&` as a background operator.
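If `jq` is not installed, a quick `sed` filter can still pull a single field out of a JSON response, for example the cluster health status. A sketch (the `health_status` helper name is mine):

```shell
# health_status: read cluster-health JSON on stdin, print the "status" value.
health_status() {
  sed -n 's/.*"status" *: *"\([a-z]*\)".*/\1/p'
}

# Usage against a live node (assumes the compose stack above is running):
# curl -s localhost:9200/_cluster/health | health_status
```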
CRUD Operations
Create Index
Without mappings: Elasticsearch auto-generates field types from the first document (dynamic mapping).
With mappings: You define field types up front for better control (e.g., text vs keyword).

```shell
curl -X PUT http://localhost:9200/ramayana_characters \
  -H "Content-Type: application/json" -d '
{
  "mappings": {
    "properties": {
      "name": { "type": "text" },
      "description": { "type": "text" }
    }
  }
}'
```
```json
{"acknowledged":true,"shards_acknowledged":true,"index":"ramayana_characters"}
```

Insert Document
```shell
curl -X POST http://localhost:9200/ramayana_characters/_doc \
  -H "Content-Type: application/json" -d '
{
  "name": "Rama",
  "description": "Hero of the Ramayana, seventh avatar of Vishnu."
}'
```
```json
{"_index":"ramayana_characters","_id":"_g1qMpYBhtVSA9-yauE0","_version":1,"result":"created", ...}
```

Copy the `_id` from the output; you'll need it for the next queries.
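Rather than copying the `_id` by hand, you can extract it from the response with `sed`. A sketch (the `doc_id` helper name is mine, and it assumes the id contains no quote characters):

```shell
# doc_id: read an index-response JSON on stdin, print the "_id" value.
doc_id() {
  sed -n 's/.*"_id" *: *"\([^"]*\)".*/\1/p'
}

# Usage (assumes the stack above is running):
# DOC_ID=$(curl -s -X POST http://localhost:9200/ramayana_characters/_doc \
#   -H "Content-Type: application/json" -d '{"name":"Rama"}' | doc_id)
```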
Select All

```shell
curl -X GET 'http://localhost:9200/ramayana_characters/_search?pretty'
```

```json
{ "hits": { "total": { "value": 1 }, "hits": [{ "_id": "_g1qMpYBhtVSA9-yauE0", "_source": { "name": "Rama", "description": "Hero of the Ramayana, seventh avatar of Vishnu." } }] } }
```

Select by ID
```shell
export DOC_ID=_g1qMpYBhtVSA9-yauE0
curl -X GET "http://localhost:9200/ramayana_characters/_doc/${DOC_ID}?pretty"
```
Update Document

```shell
curl -X POST "http://localhost:9200/ramayana_characters/_update/${DOC_ID}?pretty" \
  -H "Content-Type: application/json" -d '
{
  "doc": {
    "name": "Raama"
  }
}'
```

```json
{"_index":"ramayana_characters","_id":"...","_version":2,"result":"updated"}
```

Try the GET call again to see the updated value.
Delete Document

```shell
curl -X DELETE "http://localhost:9200/ramayana_characters/_doc/${DOC_ID}?pretty"
```

```json
{"_index":"ramayana_characters","_id":"...","_version":3,"result":"deleted"}
```

Homework
| # | Task | Goal |
|---|---|---|
| 1 | Deploy & Explore | Run Docker Compose, use curl to hit every health API |
| 2 | Create & Query | Create an index, insert 5 documents, run _search and get-by-ID |
| 3 | Understand Mappings | Compare behaviour with and without explicit mappings |
| 4 | Yellow Cluster? | Investigate why health turns yellow after creating an index (hint: replica shards) |
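A pointer for task 4: every new index gets one replica shard by default, and Elasticsearch never assigns a replica to the same node as its primary, so on a single node the replicas stay unassigned and health reports yellow. In a one-node dev setup you can drop replicas to zero, for example by PUTting a settings body like this (a sketch) to `/ramayana_characters/_settings`:

```json
{
  "index": {
    "number_of_replicas": 0
  }
}
```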
Conclusion
With this setup, you can quickly spin up a single-node Elasticsearch stack for development or learning purposes, and start exploring powerful search and analytics capabilities using RESTful APIs.