Merged
@@ -136,8 +136,8 @@ private void addPropertySourcesFromPaths(Environment environment, CompositePrope
filename, composite);
}
else if (filename.endsWith(".yml") || filename.endsWith(".yaml")) {
-addPropertySourceIfNeeded(c -> PROPERTIES_TO_MAP.apply(yamlParserGenerator(environment).apply(c)),
-		content, filename, composite);
+addPropertySourceIfNeeded(c -> yamlParserGenerator(environment).apply(c), content, filename,
+		composite);
}
}
catch (IOException e) {
@@ -0,0 +1,270 @@
/*
* Copyright 2013-2025 the original author or authors.
*
* Licensed under the Apache License, Version 2.0 (the "License");
* you may not use this file except in compliance with the License.
* You may obtain a copy of the License at
*
* https://www.apache.org/licenses/LICENSE-2.0
*
* Unless required by applicable law or agreed to in writing, software
* distributed under the License is distributed on an "AS IS" BASIS,
* WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
* See the License for the specific language governing permissions and
* limitations under the License.
*/

package org.springframework.cloud.kubernetes.commons.config;

import java.io.IOException;
import java.io.Reader;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.Properties;
import java.util.stream.Collectors;

import org.apache.commons.logging.LogFactory;
import org.yaml.snakeyaml.Yaml;
import org.yaml.snakeyaml.reader.UnicodeReader;

import org.springframework.core.CollectionFactory;
import org.springframework.core.io.ByteArrayResource;
import org.springframework.core.io.Resource;
import org.springframework.core.log.LogAccessor;
import org.springframework.lang.Nullable;
import org.springframework.util.StringUtils;

import static org.springframework.beans.factory.config.YamlProcessor.DocumentMatcher;
import static org.springframework.beans.factory.config.YamlProcessor.MatchStatus;
import static org.springframework.cloud.kubernetes.commons.config.Constants.SPRING_CONFIG_ACTIVATE_ON_PROFILE;
import static org.springframework.cloud.kubernetes.commons.config.Constants.SPRING_PROFILES;

/**
* A class based on
* {@link org.springframework.beans.factory.config.YamlPropertiesFactoryBean} that takes
* care to override profile-based collections and maps.
*
* We can't use the same yaml-loading functionality that spring-boot does:
* {@link org.springframework.boot.env.YamlPropertySourceLoader} and thus OriginTrackedYamlLoader,
* because spring-boot loads every single yaml document (all in one file) into a separate PropertySource.
* We, on the other hand, have to load all yaml documents
Contributor:
This is where I am hung up. Why do we need to read everything into a single property source? The following sentence says it is because we need to read configmaps and secrets, but what about configmaps and secrets is different from reading a file?

Contributor Author (@wind57, Aug 3, 2025):

This is where I am hung up. Why do we need to read everything into a single property source?

Because this is the way our code was designed from the beginning. We have this, for example:

class ConfigMapPropertySourceLocator implements PropertySourceLocator {

	public PropertySource<?> locate(Environment environment) { ... }

}

and this method returns a CompositePropertySource, created as new CompositePropertySource("composite-configmap").

So when we read a configmap, we assume we compute a single property source for each source. I mean:

Set<NormalizedSource> sources = new LinkedHashSet<>(this.properties.determineSources(environment));
LOG.debug("Config Map normalized sources : " + sources);
sources.forEach(s -> {
	MapPropertySource propertySource = getMapPropertySource(s, env);
	if ("true".equals(propertySource.getProperty(Constants.ERROR_PROPERTY))) {
		LOG.warn("Failed to load source: " + s);
	}
	else {
		LOG.debug("Adding config map property source " + propertySource.getName());
		composite.addFirstPropertySource(propertySource);
	}
});

so:

  • find the sources
  • for each of them, create a single PropertySource.

But in the case of a multi-document yaml, things are not trivial to do. spring-boot does not impose this limitation, but we had it from the very beginning.


This is why I said I totally agree with the comment from the spring-boot maintainer, when he says: "create a single PropertySource just the way we do for each individual yaml document". I agree, but our code is far too involved at this point in time. We would need to revise the entire code and the way we create property sources.

Contributor:

So basically for multi-document yaml files (configmaps/secrets) we have always created a single property source per file instead of separating each document into its own property source?

And when we use Boot's YamlPropertiesFactoryBean it creates multiple property sources and this somehow results in the weird overriding of properties?

Contributor Author:

So basically for multi-document yaml files (configmaps/secrets) we have always created a single property source per file instead of separating each document into their own property source?

exactly.

And when we use Boots YamlPropertiesFactoryBean it creates multiple properties sources and this somehow results in the weird overriding of properties?

The idea is correct, the details are not. We were already using YamlPropertiesFactoryBean and this one is overriding properties in a "bad way".

We can't use Boot's way, because it creates one PropertySource per yaml document.

Essentially, I think you are spot on with the idea here, yes.

Contributor:

The idea is correct, the details are not. We were already using YamlPropertiesFactoryBean and this one is overriding properties in a "bad way".

Is it because the multiple property sources from the multiple documents result in incorrect ordering of the property sources? Do you have an example yaml document, the property sources Boot would create and the order they would be in, as well as what we would expect? This should help me understand the crux of the issue.

Contributor Author (@wind57, Aug 3, 2025):

yes, an example would make things simpler I guess. I will use what the OP presented to us. I will strip it down to the bare minimum. Suppose you have this yaml:

my-application:
  items:
    - "Item 1"
    - "Item 2"
    - "Item 3"
    - "Item 4"
    - "Item 5"
---
spring:
  config:
    activate:
      on-profile: dev

my-application:
  items:
    - "Item 1"
    - "Item 6"
    - "Item 7"

spring-boot, using YamlPropertySourceLoader, will create two property sources:

  • name of the first property source: Config resource 'class path resource [application.yaml]' via location 'optional:classpath:/' (document #0)

  • values of the first property source: my-application.items[0] = Item 1, my-application.items[1] = Item 2 ....

  • name of the second property source: Config resource 'class path resource [application.yaml]' via location 'optional:classpath:/' (document #1)

  • values of the second property source: my-application.items[0] = Item 1, my-application.items[1] = Item 6, my-application.items[2] = Item 7 ....

Internally, later, when you need to wire things, IndexedElementsBinder will correctly choose the profile-based properties (the second yaml document from the example).


In our case, we were using YamlPropertiesFactoryBean. So when that one was reading the yaml file, it would do:

    protected Properties createProperties() {
        Properties result = CollectionFactory.createStringAdaptingProperties();
        this.process((properties, map) -> result.putAll(properties));
        return result;
    }

In simpler words, it will "merge" properties from all yaml documents into a single Map, using result.putAll(properties).

So, it will read the first yaml document, create the properties:

my-application.items[0] = Item 1
my-application.items[1] = Item 2
my-application.items[2] = Item 3
my-application.items[3] = Item 4
my-application.items[4] = Item 5

Then, it will read the second yaml document, create the properties:

my-application.items[0] = Item 1
my-application.items[1] = Item 6
my-application.items[2] = Item 7

then, again using result.putAll(properties), merge these, obviously resulting in:

my-application.items[0] = Item 1
my-application.items[1] = Item 6
my-application.items[2] = Item 7
my-application.items[3] = Item 4
my-application.items[4] = Item 5

which is not correct.


As we discussed above, if we had the option to create separate PropertySources from these two documents, we would, and spring-boot's injection mechanism would have taken care of the rest. But we are in a situation where this is not an option, and this is why I created the workaround.
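
A minimal, self-contained sketch of that merge (illustrative only; the class name PutAllMergeDemo is made up and this is not code from the PR):

import java.util.LinkedHashMap;
import java.util.Map;

public class PutAllMergeDemo {

	public static void main(String[] args) {
		// flattened view of the first (non-profile) yaml document
		Map<String, Object> first = new LinkedHashMap<>();
		first.put("my-application.items[0]", "Item 1");
		first.put("my-application.items[1]", "Item 2");
		first.put("my-application.items[2]", "Item 3");
		first.put("my-application.items[3]", "Item 4");
		first.put("my-application.items[4]", "Item 5");

		// flattened view of the second (dev-profile) yaml document
		Map<String, Object> second = new LinkedHashMap<>();
		second.put("my-application.items[0]", "Item 1");
		second.put("my-application.items[1]", "Item 6");
		second.put("my-application.items[2]", "Item 7");

		// what the putAll-based merge effectively does: override key by key,
		// so indexes [3] and [4] leak through from the first document
		Map<String, Object> merged = new LinkedHashMap<>(first);
		merged.putAll(second);

		// prints Item 1, Item 6, Item 7, Item 4, Item 5 for indexes 0..4
		merged.forEach((key, value) -> System.out.println(key + " = " + value));
	}

}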

Contributor:

Why is this incorrect? The dev profile is active so it is overriding the items in the array from the default profile.

Contributor Author:

it was supposed to override the entire array, exactly like spring-boot does, not individual items. no?

Contributor:

Ah yes, OK, that's where I was all caught up. You are right, it should override the property my-application.items. Thanks for being so patient!

Contributor Author:

If you wire properties using spring-boot only, no kubernetes involved, you will end up with:

List<String> items; // [Item 1, Item 6, Item 7]

but if users use a configmap to read properties and we have the same yaml file, they will end up with:

List<String> items; // [Item 1, Item 6, Item 7, Item 4, Item 5]
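
A hedged sketch of the kind of @ConfigurationProperties binding those two snippets refer to (the class name is illustrative; registration via @EnableConfigurationProperties or @ConfigurationPropertiesScan is assumed):

import java.util.List;

import org.springframework.boot.context.properties.ConfigurationProperties;

// binds the "my-application" prefix from the yaml discussed above;
// with the dev profile active, items is expected to be exactly [Item 1, Item 6, Item 7]
@ConfigurationProperties(prefix = "my-application")
public class MyApplicationProperties {

	private List<String> items;

	public List<String> getItems() {
		return items;
	}

	public void setItems(List<String> items) {
		this.items = items;
	}

}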

* into a single Properties instance, which ends up being a single PropertySource.
* This happens because we first have to read configmaps / secrets
* and only at that point do we know if a yaml contains more than one document.
*
* As such, we mimic what spring-boot achieves by creating our own yaml reader, which is heavily based
* on the YamlPropertiesFactoryBean.
*
* This is how it does things:
*
* <ul>
* <li>read all the documents in a yaml file</li>
* <li>flatten all properties besides collections and maps;
* YamlPropertiesFactoryBean does not do that and flattens everything right away</li>
* <li>take only those that match the document matchers</li>
* <li>split them in two : those that have profile activation and those that don't</li>
* <li>override properties in the non-profile-based yamls with the ones from profile-based ones.
* This achieves the same result as a plain spring-boot app, where profile-based properties have a higher
* precedence.</li>
* <li>once the overriding has happened, we do another flattening, this time including collections and maps</li>
* </ul>
*
* @author wind57
*/
final class ProfileActivationAwareYamlPropertiesFactoryBean {

	private static final LogAccessor LOG = new LogAccessor(LogFactory.getLog(ProfileActivationAwareYamlPropertiesFactoryBean.class));

	private List<DocumentMatcher> documentMatchers = Collections.emptyList();

	Map<String, Object> createProperties(String source) {
		LinkedHashMap<String, Object> finalMap = new LinkedHashMap<>();
		Yaml yaml = new Yaml();

		Resource resource = new ByteArrayResource(source.getBytes(StandardCharsets.UTF_8));
		try (Reader reader = new UnicodeReader(resource.getInputStream())) {
			Iterable<Object> iterable = yaml.loadAll(reader);

			List<LinkedHashMap<String, Object>> allYamlDocuments = new ArrayList<>();

			// 1. read all the documents that are contained in the yaml (might be more
			// than one).
			// We flatten all properties besides collection and maps. This is needed to
			// be able to properly override them
			for (Object singleYamlDocument : iterable) {
				if (singleYamlDocument != null) {
					LinkedHashMap<String, Object> flattenedMap = new LinkedHashMap<>();
					Map<String, Object> originalSource = asMap(singleYamlDocument);
					buildFlattenedMapWithoutComplexObjects(flattenedMap, originalSource, null);
					allYamlDocuments.add(flattenedMap);
				}
			}

			// 2. take only those that match document matchers
			List<LinkedHashMap<String, Object>> yamlDocumentsMatchedAgainstDocumentMatchers = filterBasedOnDocumentMatchers(
					allYamlDocuments);

			// 3. split them in two: ones that do not have any profile activation
			// and ones that do have profile activation.
			Map<Boolean, List<LinkedHashMap<String, Object>>> partitioned = yamlDocumentsMatchedAgainstDocumentMatchers
				.stream()
				.collect(Collectors.partitioningBy(
						x -> !x.containsKey(SPRING_CONFIG_ACTIVATE_ON_PROFILE) && !x.containsKey(SPRING_PROFILES)));

			LOG.debug(() -> "non-profile source : " + partitioned.get(Boolean.TRUE).toString());
			LOG.debug(() -> "profile source : " + partitioned.get(Boolean.FALSE));

			// 4. once they are split, iterate and compute a single properties map
			// (with collections and maps unflattened yet), but correctly overridden.
			// Meaning non-profile-based sources come first.

			LinkedHashMap<String, Object> flattenedWithoutComplexObjects = new LinkedHashMap<>();
			partitioned.get(Boolean.TRUE).forEach(flattenedWithoutComplexObjects::putAll);
			partitioned.get(Boolean.FALSE).forEach(flattenedWithoutComplexObjects::putAll);

			// 5. we now know the correct order, let's do the final flattening
			buildFlattenedMap(finalMap, flattenedWithoutComplexObjects, null);

			LOG.debug(() -> "final source : " + finalMap);

		}
		catch (IOException e) {
			throw new RuntimeException(e);
		}

		return finalMap;
	}

	void setDocumentMatchers(DocumentMatcher... matchers) {
		this.documentMatchers = List.of(matchers);
	}

	private List<LinkedHashMap<String, Object>> filterBasedOnDocumentMatchers(
			List<LinkedHashMap<String, Object>> allDocuments) {
		return allDocuments.stream().filter(x -> {
			Properties properties = CollectionFactory.createStringAdaptingProperties();
			properties.putAll(x);
			MatchStatus matchStatus = MatchStatus.ABSTAIN;
			for (DocumentMatcher matcher : this.documentMatchers) {
				MatchStatus match = matcher.matches(properties);
				matchStatus = MatchStatus.getMostSpecific(match, matchStatus);
				if (match == MatchStatus.FOUND) {
					LOG.debug(() -> "Matched document with document matcher: " + properties);
					return true;
				}

				if (matchStatus == MatchStatus.ABSTAIN) {
					LOG.debug(() -> "Matched document with default matcher: " + properties);
					return true;
				}
			}
			return false;
		}).toList();
	}

	/**
	 * builds the flattened properties.
	 */
	@SuppressWarnings({ "rawtypes", "unchecked" })
	private static void buildFlattenedMap(Map<String, Object> result, Map<String, Object> source,
			@Nullable String path) {
		source.forEach((key, value) -> {
			if (StringUtils.hasText(path)) {
				if (key.startsWith("[")) {
					key = path + key;
				}
				else {
					key = path + '.' + key;
				}
			}
			if (value instanceof String) {
				result.put(key, value);
			}
			else if (value instanceof Map map) {
				// Need a compound key
				buildFlattenedMap(result, map, key);
			}
			else if (value instanceof Collection collection) {
				// Need a compound key
				if (collection.isEmpty()) {
					result.put(key, "");
				}
				else {
					int count = 0;
					for (Object object : collection) {
						buildFlattenedMap(result, Collections.singletonMap("[" + (count++) + "]", object), key);
					}
				}
			}
			else {
				result.put(key, (value != null ? value : ""));
			}
		});
	}

	/**
	 * flatten properties, but without collections or maps. So it looks like this, for
	 * example: <pre>
	 * bean.test=[{name=Alice, role=admin}, {name=ER, role=user}]}, {bean.items=[Item 10]}]
	 * </pre>
	 */
	@SuppressWarnings({ "rawtypes", "unchecked" })
	private static void buildFlattenedMapWithoutComplexObjects(Map<String, Object> result, Map<String, Object> source,
			@Nullable String path) {
		source.forEach((key, value) -> {
			if (StringUtils.hasText(path)) {
				if (key.startsWith("[")) {
					key = path + key;
				}
				else {
					key = path + '.' + key;
				}
			}
			if (value instanceof String) {
				result.put(key, value);
			}
			else if (value instanceof Map map) {
				// Need a compound key
				buildFlattenedMapWithoutComplexObjects(result, map, key);
			}
			else if (value instanceof Collection collection) {
				if (collection.isEmpty()) {
					result.put(key, "");
				}
				else {
					result.put(key, collection);
				}
			}
			else {
				result.put(key, (value != null ? value : ""));
			}
		});
	}

	@SuppressWarnings({ "rawtypes", "unchecked" })
	private static Map<String, Object> asMap(Object object) {
		Map<String, Object> result = new LinkedHashMap<>();
		if (object instanceof Map map) {
			map.forEach((key, value) -> {
				if (value instanceof Map) {
					value = asMap(value);
				}
				if (key instanceof CharSequence) {
					result.put(key.toString(), value);
				}
				else {
					// It has to be a map key in this case
					result.put("[" + key.toString() + "]", value);
				}
			});
		}

		return result;
	}

}
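
A short usage sketch of the class above (assumptions: same-package access to the package-private API, Java text blocks, and an accept-all document matcher instead of the profile-aware one that PropertySourceUtils installs below):

ProfileActivationAwareYamlPropertiesFactoryBean factory = new ProfileActivationAwareYamlPropertiesFactoryBean();
// accept every document here; real callers install a profile-aware matcher
factory.setDocumentMatchers(properties -> MatchStatus.ABSTAIN);

String yaml = """
		my-application:
		  items:
		    - "Item 1"
		    - "Item 2"
		---
		spring:
		  config:
		    activate:
		      on-profile: dev
		my-application:
		  items:
		    - "Item 6"
		""";

// the profile-based document overrides the whole list, so the flattened result
// contains my-application.items[0] = Item 6 (plus the on-profile marker entry)
Map<String, Object> result = factory.createProperties(yaml);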
@@ -19,17 +19,14 @@
import java.io.ByteArrayInputStream;
import java.io.IOException;
import java.io.UncheckedIOException;
-import java.nio.charset.StandardCharsets;
import java.util.Map;
import java.util.Properties;
import java.util.function.BinaryOperator;
import java.util.function.Function;
import java.util.stream.Collectors;

-import org.springframework.beans.factory.config.YamlPropertiesFactoryBean;
import org.springframework.core.env.Environment;
import org.springframework.core.env.Profiles;
-import org.springframework.core.io.ByteArrayResource;
import org.springframework.util.StringUtils;

import static org.springframework.beans.factory.config.YamlProcessor.MatchStatus.ABSTAIN;
@@ -76,9 +73,9 @@ private PropertySourceUtils() {
* @param environment Environment.
* @return properties.
*/
-public static Function<String, Properties> yamlParserGenerator(Environment environment) {
-return s -> {
-YamlPropertiesFactoryBean yamlFactory = new YamlPropertiesFactoryBean();
+public static Function<String, Map<String, Object>> yamlParserGenerator(Environment environment) {
+return source -> {
+ProfileActivationAwareYamlPropertiesFactoryBean yamlFactory = new ProfileActivationAwareYamlPropertiesFactoryBean();
yamlFactory.setDocumentMatchers(properties -> {
if (environment != null) {
String profiles = null;
@@ -98,8 +95,7 @@ else if (springProfiles != null) {
}
return ABSTAIN;
});
-yamlFactory.setResources(new ByteArrayResource(s.getBytes(StandardCharsets.UTF_8)));
-return yamlFactory.getObject();
+return yamlFactory.createProperties(source);
};
}

@@ -109,6 +105,7 @@ else if (springProfiles != null) {
* @param <T> type of the argument
* @return a {@link BinaryOperator}
*/
+@Deprecated(forRemoval = true)
public static <T> BinaryOperator<T> throwingMerger() {
return (left, right) -> {
throw new IllegalStateException("Duplicate key " + left);
@@ -71,7 +71,7 @@ public static Map<String, Object> processAllEntries(Map<String, String> input, E
String propertyValue = singleEntry.getValue();
if (propertyName.endsWith(".yml") || propertyName.endsWith(".yaml")) {
LOG.debug("The single property with name: [" + propertyName + "] will be treated as a yaml file");
-return yamlParserGenerator(environment).andThen(PROPERTIES_TO_MAP).apply(propertyValue);
+return yamlParserGenerator(environment).apply(propertyValue);
}
else if (propertyName.endsWith(".properties")) {
LOG.debug("The single property with name: [" + propertyName + "] will be treated as a properties file");
@@ -176,15 +176,15 @@ private static Map<String, Object> defaultProcessAllEntries(Map<String, String>

private static Map<String, Object> extractProperties(String resourceName, String content, Environment environment) {

-if (resourceName.endsWith(".yml") || resourceName.endsWith(".yaml") || resourceName.endsWith(".properties")) {
+if (ENDS_IN_EXTENSION.test(resourceName)) {

if (resourceName.endsWith(".properties")) {
LOG.debug("entry : " + resourceName + " will be treated as a single properties file");
return KEY_VALUE_TO_PROPERTIES.andThen(PROPERTIES_TO_MAP).apply(content);
}
else {
LOG.debug("entry : " + resourceName + " will be treated as a single yml/yaml file");
-return yamlParserGenerator(environment).andThen(PROPERTIES_TO_MAP).apply(content);
+return yamlParserGenerator(environment).apply(content);
Contributor Author:

yamlParserGenerator now returns a Map<String, Object> instead of Properties, so we can get rid of one step
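
A before/after view of that call site (illustrative; content stands for any yaml payload):

// before: parse to Properties, then convert
// Map<String, Object> result = yamlParserGenerator(environment).andThen(PROPERTIES_TO_MAP).apply(content);

// after: the parser already returns a flattened Map<String, Object>
Map<String, Object> result = yamlParserGenerator(environment).apply(content);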

}
}
