Fix 1935 #1956 (merged)

Changes from all commits (7 commits):
3a47c5f (wind57) started work
053f459 (wind57) fix-1935: first test
3f81769 (wind57) fix-1935: more tests
99352de (wind57) minor simplification
4da318c (wind57) Merge branch '3.2.x' into fix-1935
459d14d (wind57) added documentation
d7062a6 (wind57) review comments
270 additions, 0 deletions: ...work/cloud/kubernetes/commons/config/ProfileActivationAwareYamlPropertiesFactoryBean.java
```java
/*
 * Copyright 2013-2025 the original author or authors.
 *
 * Licensed under the Apache License, Version 2.0 (the "License");
 * you may not use this file except in compliance with the License.
 * You may obtain a copy of the License at
 *
 *      https://www.apache.org/licenses/LICENSE-2.0
 *
 * Unless required by applicable law or agreed to in writing, software
 * distributed under the License is distributed on an "AS IS" BASIS,
 * WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
 * See the License for the specific language governing permissions and
 * limitations under the License.
 */

package org.springframework.cloud.kubernetes.commons.config;

import java.io.IOException;
import java.io.Reader;
import java.nio.charset.StandardCharsets;
import java.util.ArrayList;
import java.util.Collection;
import java.util.Collections;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.Properties;
import java.util.stream.Collectors;

import org.apache.commons.logging.LogFactory;
import org.yaml.snakeyaml.Yaml;
import org.yaml.snakeyaml.reader.UnicodeReader;

import org.springframework.core.CollectionFactory;
import org.springframework.core.io.ByteArrayResource;
import org.springframework.core.io.Resource;
import org.springframework.core.log.LogAccessor;
import org.springframework.lang.Nullable;
import org.springframework.util.StringUtils;

import static org.springframework.beans.factory.config.YamlProcessor.DocumentMatcher;
import static org.springframework.beans.factory.config.YamlProcessor.MatchStatus;
import static org.springframework.cloud.kubernetes.commons.config.Constants.SPRING_CONFIG_ACTIVATE_ON_PROFILE;
import static org.springframework.cloud.kubernetes.commons.config.Constants.SPRING_PROFILES;

/**
 * A class based on
 * {@link org.springframework.beans.factory.config.YamlPropertiesFactoryBean} that takes
 * care to override profile-based collections and maps.
 *
 * We can't reuse the way spring-boot loads yaml files
 * ({@link org.springframework.boot.env.YamlPropertySourceLoader} and thus
 * OriginTrackedYamlLoader), because spring-boot loads every single yaml document in a
 * file into a separate PropertySource. We, on the other hand, have to load all yaml
 * documents into a single Properties instance that ends up as a single PropertySource.
 * This happens because we first have to read configmaps / secrets, and only at that
 * point do we know whether a yaml contains more than one document.
 *
 * As such, we mimic what spring-boot achieves by creating our own yaml reader, heavily
 * based on YamlPropertiesFactoryBean.
 *
 * This is how it works:
 *
 * <ul>
 * <li>read all the documents in a yaml file</li>
 * <li>flatten all properties besides collections and maps
 * (YamlPropertiesFactoryBean does not do that and flattens everything immediately)</li>
 * <li>take only those documents that match the document matchers</li>
 * <li>split them in two: those that have profile activation and those that don't</li>
 * <li>override properties in the non-profile based yamls with the ones from profile
 * based ones. This achieves the same result as a plain spring-boot app, where profile
 * based properties have a higher precedence.</li>
 * <li>once the overriding has happened, do another flattening, this time including
 * collections and maps</li>
 * </ul>
 *
 * @author wind57
 */
final class ProfileActivationAwareYamlPropertiesFactoryBean {

	private static final LogAccessor LOG = new LogAccessor(
			LogFactory.getLog(ProfileActivationAwareYamlPropertiesFactoryBean.class));

	private List<DocumentMatcher> documentMatchers = Collections.emptyList();

	Map<String, Object> createProperties(String source) {
		LinkedHashMap<String, Object> finalMap = new LinkedHashMap<>();
		Yaml yaml = new Yaml();

		Resource resource = new ByteArrayResource(source.getBytes(StandardCharsets.UTF_8));
		try (Reader reader = new UnicodeReader(resource.getInputStream())) {
			Iterable<Object> iterable = yaml.loadAll(reader);

			List<LinkedHashMap<String, Object>> allYamlDocuments = new ArrayList<>();

			// 1. read all the documents that are contained in the yaml (there might be
			// more than one). We flatten all properties besides collections and maps.
			// This is needed to be able to properly override them.
			for (Object singleYamlDocument : iterable) {
				if (singleYamlDocument != null) {
					LinkedHashMap<String, Object> flattenedMap = new LinkedHashMap<>();
					Map<String, Object> originalSource = asMap(singleYamlDocument);
					buildFlattenedMapWithoutComplexObjects(flattenedMap, originalSource, null);
					allYamlDocuments.add(flattenedMap);
				}
			}

			// 2. take only those that match the document matchers
			List<LinkedHashMap<String, Object>> yamlDocumentsMatchedAgainstDocumentMatchers = filterBasedOnDocumentMatchers(
					allYamlDocuments);

			// 3. split them in two: ones that do not have any profile activation
			// and ones that do.
			Map<Boolean, List<LinkedHashMap<String, Object>>> partitioned = yamlDocumentsMatchedAgainstDocumentMatchers
				.stream()
				.collect(Collectors.partitioningBy(
						x -> !x.containsKey(SPRING_CONFIG_ACTIVATE_ON_PROFILE) && !x.containsKey(SPRING_PROFILES)));

			LOG.debug(() -> "non-profile source : " + partitioned.get(Boolean.TRUE));
			LOG.debug(() -> "profile source : " + partitioned.get(Boolean.FALSE));

			// 4. once they are split, iterate and compute a single properties map
			// (with collections and maps not yet flattened), but correctly overridden:
			// non-profile-based sources come first.
			LinkedHashMap<String, Object> flattenedWithoutComplexObjects = new LinkedHashMap<>();
			partitioned.get(Boolean.TRUE).forEach(flattenedWithoutComplexObjects::putAll);
			partitioned.get(Boolean.FALSE).forEach(flattenedWithoutComplexObjects::putAll);

			// 5. we now know the correct order, do the final flattening
			buildFlattenedMap(finalMap, flattenedWithoutComplexObjects, null);

			LOG.debug(() -> "final source : " + finalMap);
		}
		catch (IOException e) {
			throw new RuntimeException(e);
		}

		return finalMap;
	}

	void setDocumentMatchers(DocumentMatcher... matchers) {
		this.documentMatchers = List.of(matchers);
	}

	private List<LinkedHashMap<String, Object>> filterBasedOnDocumentMatchers(
			List<LinkedHashMap<String, Object>> allDocuments) {
		return allDocuments.stream().filter(x -> {
			Properties properties = CollectionFactory.createStringAdaptingProperties();
			properties.putAll(x);
			MatchStatus matchStatus = MatchStatus.ABSTAIN;
			for (DocumentMatcher matcher : this.documentMatchers) {
				MatchStatus match = matcher.matches(properties);
				matchStatus = MatchStatus.getMostSpecific(match, matchStatus);
				if (match == MatchStatus.FOUND) {
					LOG.debug(() -> "Matched document with document matcher: " + properties);
					return true;
				}

				if (matchStatus == MatchStatus.ABSTAIN) {
					LOG.debug(() -> "Matched document with default matcher: " + properties);
					return true;
				}
			}
			return false;
		}).toList();
	}

	/**
	 * builds the flattened properties.
	 */
	@SuppressWarnings({ "rawtypes", "unchecked" })
	private static void buildFlattenedMap(Map<String, Object> result, Map<String, Object> source,
			@Nullable String path) {
		source.forEach((key, value) -> {
			if (StringUtils.hasText(path)) {
				if (key.startsWith("[")) {
					key = path + key;
				}
				else {
					key = path + '.' + key;
				}
			}
			if (value instanceof String) {
				result.put(key, value);
			}
			else if (value instanceof Map map) {
				// Need a compound key
				buildFlattenedMap(result, map, key);
			}
			else if (value instanceof Collection collection) {
				// Need a compound key
				if (collection.isEmpty()) {
					result.put(key, "");
				}
				else {
					int count = 0;
					for (Object object : collection) {
						buildFlattenedMap(result, Collections.singletonMap("[" + (count++) + "]", object), key);
					}
				}
			}
			else {
				result.put(key, (value != null ? value : ""));
			}
		});
	}

	/**
	 * flattens properties, but keeps collections and maps whole. So the result looks,
	 * for example, like: <pre>
	 * bean.test=[{name=Alice, role=admin}, {name=ER, role=user}], bean.items=[Item 10]
	 * </pre>
	 */
	@SuppressWarnings({ "rawtypes", "unchecked" })
	private static void buildFlattenedMapWithoutComplexObjects(Map<String, Object> result, Map<String, Object> source,
			@Nullable String path) {
		source.forEach((key, value) -> {
			if (StringUtils.hasText(path)) {
				if (key.startsWith("[")) {
					key = path + key;
				}
				else {
					key = path + '.' + key;
				}
			}
			if (value instanceof String) {
				result.put(key, value);
			}
			else if (value instanceof Map map) {
				// Need a compound key
				buildFlattenedMapWithoutComplexObjects(result, map, key);
			}
			else if (value instanceof Collection collection) {
				if (collection.isEmpty()) {
					result.put(key, "");
				}
				else {
					result.put(key, collection);
				}
			}
			else {
				result.put(key, (value != null ? value : ""));
			}
		});
	}

	@SuppressWarnings({ "rawtypes", "unchecked" })
	private static Map<String, Object> asMap(Object object) {
		Map<String, Object> result = new LinkedHashMap<>();
		if (object instanceof Map map) {
			map.forEach((key, value) -> {
				if (value instanceof Map) {
					value = asMap(value);
				}
				if (key instanceof CharSequence) {
					result.put(key.toString(), value);
				}
				else {
					// It has to be a map key in this case
					result.put("[" + key.toString() + "]", value);
				}
			});
		}

		return result;
	}

}
```
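The key behavior of step 4 in the Javadoc above (non-profile documents first, then profile documents, merged with `putAll` while lists are still kept whole) can be sketched in isolation. This is a minimal illustration with hypothetical values, not code from the PR:

```java
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class OverrideSketch {

	public static void main(String[] args) {
		// non-profile yaml document, flattened but with the list kept whole
		Map<String, Object> nonProfile = new LinkedHashMap<>();
		nonProfile.put("my-application.items", List.of("Item 1", "Item 2", "Item 3"));

		// profile-activated yaml document, also with the list kept whole
		Map<String, Object> profile = new LinkedHashMap<>();
		profile.put("my-application.items", List.of("Item 1", "Item 6"));

		// step 4: non-profile sources first, profile sources second, so a
		// profile document replaces the whole value under the same key
		Map<String, Object> merged = new LinkedHashMap<>();
		merged.putAll(nonProfile);
		merged.putAll(profile);

		System.out.println(merged.get("my-application.items")); // [Item 1, Item 6]
	}

}
```

Because the list was never exploded into `items[0]`, `items[1]`, ... before the merge, the profile document overrides the entire collection, matching plain spring-boot precedence.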
```diff
@@ -71,7 +71,7 @@ public static Map<String, Object> processAllEntries(Map<String, String> input, E
 	String propertyValue = singleEntry.getValue();
 	if (propertyName.endsWith(".yml") || propertyName.endsWith(".yaml")) {
 		LOG.debug("The single property with name: [" + propertyName + "] will be treated as a yaml file");
-		return yamlParserGenerator(environment).andThen(PROPERTIES_TO_MAP).apply(propertyValue);
+		return yamlParserGenerator(environment).apply(propertyValue);
 	}
 	else if (propertyName.endsWith(".properties")) {
 		LOG.debug("The single property with name: [" + propertyName + "] will be treated as a properties file");
@@ -176,15 +176,15 @@ private static Map<String, Object> defaultProcessAllEntries(Map<String, String>
 
 	private static Map<String, Object> extractProperties(String resourceName, String content, Environment environment) {
 
-		if (resourceName.endsWith(".yml") || resourceName.endsWith(".yaml") || resourceName.endsWith(".properties")) {
+		if (ENDS_IN_EXTENSION.test(resourceName)) {
 
 			if (resourceName.endsWith(".properties")) {
 				LOG.debug("entry : " + resourceName + " will be treated as a single properties file");
 				return KEY_VALUE_TO_PROPERTIES.andThen(PROPERTIES_TO_MAP).apply(content);
 			}
 			else {
 				LOG.debug("entry : " + resourceName + " will be treated as a single yml/yaml file");
-				return yamlParserGenerator(environment).andThen(PROPERTIES_TO_MAP).apply(content);
+				return yamlParserGenerator(environment).apply(content);
 			}
 		}
```
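The second hunk swaps the inline extension checks for an `ENDS_IN_EXTENSION` predicate. Its definition is not shown in this diff; from the condition it replaces, it is presumably equivalent to the following sketch (the field name comes from the diff, the body is inferred):

```java
import java.util.function.Predicate;

public class ExtensionPredicateSketch {

	// inferred equivalent of the ENDS_IN_EXTENSION predicate referenced in the diff
	static final Predicate<String> ENDS_IN_EXTENSION = resourceName -> resourceName.endsWith(".yml")
			|| resourceName.endsWith(".yaml") || resourceName.endsWith(".properties");

	public static void main(String[] args) {
		System.out.println(ENDS_IN_EXTENSION.test("application.yaml")); // true
		System.out.println(ENDS_IN_EXTENSION.test("logo.png")); // false
	}

}
```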
This is where I am hung up. Why do we need to read everything into a single property source? The following sentence says it is because we need to read configmaps and secrets, but what about configmaps and secrets is different from reading a file?
Because this is the way our code was designed from the beginning. We have this, for example:

and this method returns a `CompositePropertySource composite = new CompositePropertySource("composite-configmap");`. So when we read a configmap, we assume we will compute a single property source for each source. I mean:

so:

But in the case of a multi-document yaml, things are not trivial to do. spring-boot does not impose this limitation, but we had it from the very beginning.

This is why I said I totally agree with the comment from the spring-boot maintainer, when he says: "create a single PropertySource just the way we do for each individual yaml document". I agree, but our code is far too involved at this point in time. We would need to revise the entire code and the way we create property sources.
So basically, for multi-document yaml files (configmaps/secrets), we have always created a single property source per file instead of separating each document into its own property source?

And when we use Boot's `YamlPropertiesFactoryBean`, it creates multiple property sources, and this somehow results in the weird overriding of properties?
Exactly.

The idea is correct, the details are not. We were already using `YamlPropertiesFactoryBean`, and this one is overriding properties in a "bad way". We can't use the Boot way, because it creates one PropertySource per yaml document.

Essentially, I think you are spot on with the idea here, yes.
Is it because the multiple property sources from the multiple documents results in incorrect ordering of the property sources? Do you have an example yaml document, the property sources Boot would create and the order they would be in as well as what we would expect? This should help me understand the crux of the issue.
Yes, an example would make things simpler, I guess. I will use what the OP presented to us, stripped down to the bare minimum. Suppose you have this yaml:

spring-boot, using `YamlPropertySourceLoader`, will create two property sources.

Name of the first property source: `Config resource 'class path resource [application.yaml]' via location 'optional:classpath:/' (document #0)`
Values of the first property source: `my-application.items[0] = Item 1, my-application.items[1] = Item 2 ....`

Name of the second property source: `Config resource 'class path resource [application.yaml]' via location 'optional:classpath:/' (document #1)`
Values of the second property source: `my-application.items[0] = Item 1, my-application.items[1] = Item 6, my-application.items[2] = Item 7 ....`

Internally, later, when you need to wire things, `IndexedElementsBinder` will correctly choose the profile based properties (the second yaml document from the example).

In our case, we were using `YamlPropertiesFactoryBean`. So when that one was reading the yaml file, it would do:

In simpler words, it will "merge" properties from yaml documents into a single `Map`, using `result.putAll(properties)`.

So, it will read the first yaml document and create the properties:

Then, it will read the second yaml document and create the properties:

Then, again using `result.putAll(properties)`, merge these. Resulting, obviously, in:

which is not correct.

As we talked about above, if we had the option to create separate PropertySources from these two documents, we would, and the spring-boot injection mechanism would have taken care of the rest. But we are in a situation where this is not an option. And this is why I created the workaround.
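The per-index merge described above can be reproduced with plain maps. A minimal sketch with hypothetical values, assuming the non-profile list is longer than the profile one, which is where the stale entries show up:

```java
import java.util.LinkedHashMap;
import java.util.Map;

public class StaleIndexSketch {

	public static void main(String[] args) {
		// first yaml document, already flattened per index
		// (this is what YamlPropertiesFactoryBean produces)
		Map<String, Object> result = new LinkedHashMap<>();
		result.put("my-application.items[0]", "Item 1");
		result.put("my-application.items[1]", "Item 2");
		result.put("my-application.items[2]", "Item 3");

		// second (profile) yaml document, with a shorter list
		Map<String, Object> profileDocument = new LinkedHashMap<>();
		profileDocument.put("my-application.items[0]", "Item 1");
		profileDocument.put("my-application.items[1]", "Item 6");

		// result.putAll(properties) merges per flattened key, so only
		// items[0] and items[1] are overridden...
		result.putAll(profileDocument);

		// ...and items[2] survives from the first document: a stale entry
		// that a plain spring-boot app would not have
		System.out.println(result.get("my-application.items[2]")); // Item 3
	}

}
```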
Why is this incorrect? The dev profile is active so it is overriding the items in the array from the default profile.
It was supposed to override the entire array, exactly like spring-boot does, not individual items. No?
Ah yes, OK, that's where I was all caught up. You are right, it should override the property `my-application.items`. Thanks for being so patient!
If you wire properties using spring-boot only, no kubernetes involved, you will end up with:

but if users use a configmap to read properties, with the same yaml file, they will end up with: