Setting Up Logstash in Docker Compose for Bulk Ingest of CSV Files on a Local Machine
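A minimal sketch of the setup in the title, assuming the official Elastic images and hypothetical local paths (`./pipeline` for the Logstash pipeline config, `./data` for the CSV files):

```yaml
# docker-compose.yml — single-node ES plus Logstash, security off for local use only
services:
  elasticsearch:
    image: docker.elastic.co/elasticsearch/elasticsearch:8.11.0
    environment:
      - discovery.type=single-node
      - xpack.security.enabled=false
    ports:
      - "9200:9200"

  logstash:
    image: docker.elastic.co/logstash/logstash:8.11.0
    depends_on:
      - elasticsearch
    volumes:
      - ./pipeline:/usr/share/logstash/pipeline:ro   # pipeline .conf files
      - ./data:/data:ro                              # CSVs to ingest
```

Logstash picks up any `.conf` files mounted into `/usr/share/logstash/pipeline` automatically.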

I just read up on mappings in Mapping | Elasticsearch Guide [8.11] | Elastic.

It doesn't look that straightforward to edit the fields directly — existing field mappings can't be changed in place, so the usual route is to create a new index with the right mapping and reindex into it.

I realised quite a lot of my fields have been indexed as text rather than numbers.
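For reference, an explicit mapping can be set when creating a fresh index (index and field names here are hypothetical), e.g. in Kibana Dev Tools:

```
PUT /csv-data-v2
{
  "mappings": {
    "properties": {
      "name":     { "type": "keyword" },
      "price":    { "type": "float" },
      "quantity": { "type": "integer" }
    }
  }
}
```

With this in place, the old index can be reindexed into `csv-data-v2` and the number-shaped strings get proper numeric types.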

Could fixing this help me reduce the index size on disk?
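Possibly — dynamic mapping stores string fields as `text` plus a `keyword` sub-field, which is generally bulkier than a plain numeric field. Rather than remapping after the fact, the types can be fixed at ingest time with the csv filter's `convert` option (column names here are hypothetical):

```
input {
  file {
    path => "/data/*.csv"
    start_position => "beginning"
    sincedb_path => "/dev/null"   # re-read files every run; fine for one-off bulk loads
  }
}

filter {
  csv {
    separator => ","
    columns   => ["name", "price", "quantity"]
    convert   => { "price" => "float" "quantity" => "integer" }
  }
}

output {
  elasticsearch {
    hosts => ["http://elasticsearch:9200"]
    index => "csv-data"
  }
}
```

Fields arriving as actual numbers get dynamically mapped as numeric types instead of text.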

Meanwhile, a related thread on disk usage: Docker Hard Disk Image File Being Too Large - Elastic Stack / Logstash - Discuss the Elastic Stack