Set compress_js and compress_css system settings to false by default. #12538
Conversation

+1

Do we need this with the option added to the setup per #12486?

Yes. It should be false by default in any case.
I'm not sure I agree (though I definitely appreciate that it's the cause of many forum topics). There are a lot of files that aren't minified or concatenated, and despite the problems, compression does help bring down the download size. When we can pre-process at the very least the core js files on every page to ship as a single minified .js file, I'm happy to take out the entire /manager/min/ directory. Until then, as compression can already be disabled in the setup for those who don't want it, I'm not so sure.

Those are only backend scripts. If they are loaded once, they are cached, so there should be almost no delay on the second backend load.

I know there's semver and all, but I'd advocate removing them entirely. If we removed them nothing would break, so it seems OK to target removing them for 2.x? 💣 Not only do the compression settings break default MODX installs in certain environments, they can break plugins as well. If performance is the justification, there is no performance gain from doing this. At all. Really, there isn't. I hate to break it to you, but after the first time they land on the Dashboard practically everything is cached. @Mark-H, you bring up download size but you still don't seem to consider:
Also, if we look toward the future of HTTP/2, there is no need to shave off every possible HTTP request anyway. There are plenty of issues related to these settings. Some of them may cause people to give up on MODX before they even get to use it. Check out this critical issue: #12609
I don't understand why a mythical performance gain is worth all this madness. If we can't remove them entirely, I think all the methods should have some sort of
That's not exactly true, because if

That would certainly be agreeable to me!

Another thing I didn't really mention is that different server environments will be running different versions of PHP and related plugins, which is out of our control, so there is no way for us to reliably know whether a given server will properly compress the files. What if it corrupts them with something as simple as an extra semicolon? What kind of code coverage does that give us? Is the risk worth the reward here?

You're preaching to the choir here, I think.
I know, but the problem is I still don't think I've convinced @Mark-H. I recall he once said that if I proved it wasn't performant he would remove them himself. Apparently I've failed to do so, even after having this debate going on for well over a year now. The problem is you have to keep in mind that performance is relative to the CPU load on the server, AND that the process itself keeps the server from doing other things. There may be test cases where the initial doc-ready load time is quicker for a given MODX Dashboard with a cleared cache and compress settings on. I'll give you that. There also may be cases where it isn't. There definitely are cases where it breaks the manager entirely. I guess my question is specifically what else @Mark-H needs to be convinced these settings should go. @Mark-H, you ask for numbers, but what you don't seem to acknowledge is that this isn't just about the numbers. Any performance that could be measured is relative to:

Design isn't about pulling best practices or performance tips out of a hat and applying them blindly. We're designing for a Manager interface that is entirely dependent on JavaScript. We have numerous critical bugs related to these settings. MODX is all about having control over every character of your code, yet if I draft some amazing JavaScript how can you assure me

I'm flattered you believe I am the person that needs to be convinced in

I do think taking out the compression (or disabling by default) has big

That said, you don't seem to realize there are on my last count 5 other

There's one concern in terms of backwards compatibility and that's custom

Mark, you offered to author the PR yourself if convinced. That's the only reason I felt you needed to be convinced specifically. That's when I wrote that forum post. You should have known I'd much rather attempt to convince you than author the PR ;) Also, I can't think of anyone who has advocated their preservation more than you. We should be less worried about the number of bytes and HTTP requests we ship and more concerned with practical application, server CPU, and white screens of death. — On Tue, Aug 25, 2015 at 5:05 PM, Mark Hamstra notifications@github.com

One last thing JP, I'm just as tired of dealing with compress_js issues as I've got other things that I prefer putting energy into, so please stop

Guess I'd better stop offering to fix issues that aren't in my personal top

This one did: #12486

Fair enough. Do we differ on the definition of performance?

When compress_js breaks the entire MODX Manager, or plugins, it is by definition not performing. Breakages aside, take this scenario. Bob loves MODX. He logs into 20+ MODX sites a day. If common assets were loaded from a CDN first (h5bp style, with local fallbacks), the common assets for all 20 MODX sites Bob is going to log into today would be cached once he logs into the first one. His cache is already prepared for the MODX sites he's yet to even log into. That is an example of optimization by breaking "best practice" rules of thumb, and it is also considered the best way to gracefully share assets across web components. To accomplish this level of optimization you have to breathe deep and accept the number of HTTP requests, which will be commonly cached after the first login. Yet under the current architecture with the compression settings, any time the URL for the minified file(s) changes, everything has to get re-downloaded with fingers crossed. You say we should do this for performance, but you are dead wrong. Consider this:

Thoughts? 😯 I'm curious: are we only concerned with how MODX performs in a 1:1 ratio with environments, or should we also be concerned with how it performs as a world-wide-web-enabled tool? I'm also curious whether any of the other integrators besides @Mark-H are in support of compression. To date, Mark, you are the only person I know of advocating their persistence, and the only person that has posted to that forum thread in support of keeping them citing "performance".
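As an editorial aside, the h5bp-style "CDN first, local fallback" pattern described above can be sketched roughly as follows. This is purely illustrative: the CDN host, paths, and function names are hypothetical, not MODX code or API.

```javascript
// Hypothetical sketch of h5bp-style asset loading: try a shared CDN URL
// first (so the asset is cached once across every site that uses it), and
// fall back to the per-site local copy if the CDN is unreachable.
// The host and paths below are made up for illustration.
function assetSources(asset) {
  return [
    'https://cdn.example.com/modx/' + asset, // shared across all 20 of Bob's sites
    '/manager/assets/' + asset               // local fallback, per site
  ];
}

// A real loader would walk the list in order, moving on when a source
// errors out; here that decision is reduced to a pure function.
function pickSource(sources, cdnAvailable) {
  return cdnAvailable ? sources[0] : sources[1];
}

console.log(pickSource(assetSources('ext.min.js'), false));
// → '/manager/assets/ext.min.js' (local copy when the CDN fails)
```

The design point is that the extra HTTP requests are paid once per asset, after which every site sharing the CDN URL hits the browser cache.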
All I can say is that I've disabled it for every site I work on for some time now, and have never noticed any performance differences. |
I'm advocating for better compression, not to keep /manager/min/ around indefinitely. There's a big difference there that you don't seem to notice. Also, you're the one that keeps mentioning performance, not me. I'd like to see #11682 brought back alive, its issues fixed, tested properly, and implemented in 2.5. There's no reason anyone outside the core developers needs to load X files when they can be combined and compressed during development/release. With that we could then remove /manager/min/ and any trace of it, leaving compress_js and the related settings enabled by default and in charge of loading a single manager.js rather than all the individual files.
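A build-time approach along the lines described (combine and compress during development/release, per #11682) might look roughly like this Gruntfile sketch. The paths and task configuration are illustrative assumptions, not the actual #11682 setup.

```javascript
// Hypothetical Gruntfile sketch (NOT the actual #11682 configuration):
// concatenate and minify the manager scripts once at build/release time,
// producing a single manager.min.js, instead of compressing on every request.
module.exports = function (grunt) {
  grunt.initConfig({
    uglify: {
      manager: {
        files: {
          // Illustrative paths; the real file layout may differ.
          'manager/assets/modext/manager.min.js': ['manager/assets/modext/**/*.js']
        }
      }
    }
  });
  grunt.loadNpmTasks('grunt-contrib-uglify');
  grunt.registerTask('build', ['uglify']);
};
```

The trade-off is that minification cost moves from the server at request time to the developer at release time, so a broken environment-specific minifier can no longer white-screen the Manager.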
@Mark-H I'll have another look at #11682 and see what I can do, hopefully this week. @sottwell I'm glad you disable them. In fact, I think most people do. As @Mark-H and I are both well aware, this issue is breaking redactor-2.0.0-rc7, so it is high on my priority list. We've had 103 downloads of our redactor-2.0.0 prerelease, and since rc1 it has been broken when compress_js is enabled. No one has reported the issue except @Mark-H. To me, that is evidence that practically everyone disables these settings. @Mark-H, do you remember writing any of this?

I spilled 'em. Your numbers and use cases are in this PR and that forum post.
Lol, no, you still haven't given any numbers to prove it is harming performance, which was your claim. You can't claim something and then not give any proof for it; it doesn't work that way. You need to back it up with something real. All I've seen is anecdotal "evidence" and highlights of cases where it doesn't work. That is hardly scientific; it sounds more like FUD to me. I don't think Redactor 1.5 had issues with compress_js, so we must have changed something in our code that broke it. The minifier hasn't changed recently, only Redactor has. You also bring up CDNs repeatedly, which may be great, but they need a completely different discussion, as they have their own pros (shared assets cached once) and cons (some CDNs are blocked in certain countries, and when they go down you can't manage any of your sites). They are also not at all related to compression, as you can have compressed and uncompressed files on a CDN. Again though, I'll just keep on repeating this until it sticks: you don't have to convince me. Just bring a proven better alternative to the table and get one of the 6 integrators to hit the big green button, and we can all live happily ever after.
@Mark-H 103 downloads of redactor-2.0.0-rc1-7 and no one but you has reported the issue. How is that not a case study with numbers? This issue is literally holding up the next major release of your best-selling premium modmore extra and you still don't understand how it is affecting performance. I feel like I'm talking to a wall. I never needed numbers to be convinced that

I'm currently working on course material for Thinkful's Front End Web Development courses and work closely on Slack with hundreds of mentors who specialize in Front End Development. We discuss things like this on a daily basis. Would you like me to run this discussion by them? Should we do some sort of a survey? Why are you so obsessed with numbers? Will you ever trust me? How many sites have you developed with a 99+% YSlow rating? I think if you listened to the contributors that are trying to help you, you might learn something. Instead you tell them to go away and come back with pie charts. If you are so worried about whitespace and extra bytes regardless of Grunt, I have one word for you: GZIP. Why should I do more tests and studies and provide you with more numbers or "benchmarks" when you just retracted your offer to author the PR? In the same comment you ask for numbers and tell me to author the PR, but now I need to divert my efforts into authoring rather than advocating. Regarding CDNs, forget about them as they relate to this issue; I was just making yet another point. What I am still trying to convince any of the 6 integrators of is that doing nothing but deprecating the settings (not even using Grunt to automate) is far more performant, and that, bearing in mind how browsers actually cache files, any performance benchmarks you measure only pertain to the first time the user hits a Manager page, so it's a fairly pointless thing to be concerned about anyway.
There's a real-world analogy that I think is applicable here. Think about the difference between when you go out for a nice multi-course meal and when you are on the go in the morning, needing to just get some nutrition quickly.

You go to the nice restaurant and order some nice courses. You might wait a little for them to come out separately, but it's worth it. They ask if you want the salad first. You want the salad first. Once the courses arrive they are separate and you can experience them individually. They may even come out at totally different times from one another. One course's foul taste doesn't affect another. The only thing the courses have in common is that you are consuming them. Your server may even remember your order and more quickly be able to prepare it on your next visit.

Now, when you are on the go in the morning and barely have time to get out the door, maybe you make a quick breakfast shake. You put a bunch of veggies and fruit in a blender, blend it all up into one thing, slam it down, and get out the door. In the process of making your shake, if you forget to put the lid on, you end up with breakfast dripping from the ceiling.

The MODX Manager experience is a multi-course dining experience. Designing for a public-facing front-end web page is often the exact opposite; it's the shake, and you are simply trying to get it all down so you can get out of there in 30 seconds. To use compression settings in the MODX Manager would be for your multi-course server to bring all your food out at once in a big nasty blender. It just doesn't make sense.
@sottwell @opengeek Assuming #12611 is merged with the pending 2.5 prerelease, I think we should let these settings:
Now I'm confused. Will compress_js and compress_css be true by default upon installation or not? |
@jpdevries I thought you were advocating that we rely on the preprocessing for asset optimization; shouldn't this become false? Once the preprocessor handles the compression and minification, is there any need for MODX to keep the compress options; could they be removed entirely?
I am. And with #12611 they do. In other words, with #12611, even with
Thanks for the clarification. Does
It doesn't really "run" anything at all, at least not anymore. The

So from a performance perspective there is no more waiting for the server to minify files. They are already there. One thing that isn't happening yet is the
Closing, as #12611 made js compression safer; it should now only be disabled for development.



Set compress_css and compress_js to false by default in System Settings on installation.
_build/data/transport.core.system_settings.php, lines 318 and 327