
Dynamically Loading YAML Files in Spring Boot with Java

In modern applications, configuration plays a crucial role in maintaining flexibility and modularity. Often, we need to load properties from multiple YAML files, organized across nested directories, to ensure scalability and modular configuration. In this blog, we’ll explore how to dynamically load all YAML files from a specific directory structure and merge their properties into a single configuration object.


Use Case

Imagine we have a configuration directory like this:

```
src/main/resources
├── application.yml
├── configs
│   ├── environment
│   │   ├── dev.yml
│   │   ├── qa.yml
│   ├── features
│   │   ├── feature1.yml
│   │   ├── feature2.yml
```

We want to load all YAML files under the configs directory (including subdirectories), merge them, and make the properties available for use in our Spring Boot application.


Implementation

1. Dynamic YAML Loader

We will create a utility class to dynamically load and merge all YAML files into a single Properties object.

```java
package com.example.config;

import org.springframework.beans.factory.config.YamlPropertiesFactoryBean;
import org.springframework.core.io.Resource;
import org.springframework.core.io.support.PathMatchingResourcePatternResolver;

import java.io.IOException;
import java.util.Properties;

public class DynamicYamlLoader {

    public static Properties loadYamlFiles(String basePath) throws IOException {
        Properties combinedProperties = new Properties();
        PathMatchingResourcePatternResolver resolver = new PathMatchingResourcePatternResolver();

        // Use '**' to include all nested directories and '.yml' files
        Resource[] resources = resolver.getResources(basePath + "/**/*.yml");

        for (Resource resource : resources) {
            if (resource.exists() && resource.isReadable()) {
                YamlPropertiesFactoryBean yamlFactory = new YamlPropertiesFactoryBean();
                yamlFactory.setResources(resource);
                Properties yamlProperties = yamlFactory.getObject();
                if (yamlProperties != null) {
                    combinedProperties.putAll(yamlProperties);
                }
            }
        }

        return combinedProperties;
    }
}
```

2. Spring Configuration Loader

To integrate this loader into Spring, we’ll create a configuration class that loads the properties and makes them available in the application context.


```java
package com.example.config;

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

import java.io.IOException;
import java.util.Properties;

@Configuration
public class AppConfig {

    @Bean
    public Properties dynamicProperties() throws IOException {
        // Specify the base directory
        String basePath = "classpath:/configs";
        return DynamicYamlLoader.loadYamlFiles(basePath);
    }
}
```


3. Example YAML Files

Here are some sample YAML files to simulate a realistic use case:

configs/environment/dev.yml:

```yaml
app:
  name: DemoApp
  environment: development
logging:
  level: DEBUG
```

configs/features/feature1.yml:

```yaml
feature1:
  enabled: true
  maxRetries: 3
```

configs/features/feature2.yml:

```yaml
feature2:
  enabled: false
  timeout: 5000
```

4. Accessing Properties

You can now access the combined properties in your application. For example:

```java
package com.example.service;

import org.springframework.stereotype.Service;

import java.util.Properties;

@Service
public class PropertyService {

    private final Properties properties;

    // The merged Properties bean declared in AppConfig is injected here
    public PropertyService(Properties properties) {
        this.properties = properties;
    }

    public void printProperties() {
        System.out.println("App Name: " + properties.getProperty("app.name"));
        System.out.println("Feature1 Enabled: " + properties.getProperty("feature1.enabled"));
        System.out.println("Feature2 Timeout: " + properties.getProperty("feature2.timeout"));
    }
}
```


When running the application, the output will be:

```
App Name: DemoApp
Feature1 Enabled: true
Feature2 Timeout: 5000
```


Explanation

  1. Recursive File Loading: the PathMatchingResourcePatternResolver discovers every YAML file under the specified path, including nested subdirectories, thanks to the `**` wildcard in the resource pattern.
  2. Merging Properties: Properties.putAll() combines the entries from all YAML files into a single object. If duplicate keys exist, the last-loaded file overwrites the earlier values, so be mindful of load order when files share keys.
  3. Spring Integration: declaring the merged Properties as a @Bean makes it available for dependency injection anywhere in the application context.
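The overwrite behavior described in point 2 comes straight from java.util.Properties, independent of Spring. A minimal JDK-only sketch (the key names here are illustrative, echoing the sample YAML files above):

```java
import java.util.Properties;

public class MergeOrderDemo {
    public static void main(String[] args) {
        // Simulate two YAML files that both define the same key
        Properties first = new Properties();   // e.g. loaded from dev.yml
        first.setProperty("logging.level", "DEBUG");

        Properties second = new Properties();  // e.g. loaded from qa.yml
        second.setProperty("logging.level", "INFO");

        Properties combined = new Properties();
        combined.putAll(first);
        combined.putAll(second); // for duplicate keys, the last putAll wins

        System.out.println(combined.getProperty("logging.level")); // prints INFO
    }
}
```

If deterministic precedence matters (for example, environment files should always beat feature files), sort or group the resolved resources before merging rather than relying on the order the resolver returns them in.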
