Replies: 4 comments
-
What are you using to draw these graphs, as an unrelated question?
I do agree that in some instances there might be a need to remove any preprocessing of the data; this can be done upstream if needed, unless it's an inherent part of the PELT algorithm.
-
It's not inherent to the PELT algorithm, I think? Unless there is some hidden preprocessing going on. I would like to know whether I should do my own normalization up front, and how it might affect certain cost functions in the PELT algorithm (L1, L2, ...). The plotting is just matplotlib + seaborn!
-
(comment from @deepcharles did not load)
-
Thanks @deepcharles, that makes sense.
"No normalization" above shows the raw signal. I guess fine-tuning the penalty will do the trick. Or is there anything I am missing (a better cost function for my signal, for example)? Thanks for your help!
-
Hi, this is a question, not an issue.
I have a bunch of features that I track over time and feed into PELT; the signal here is (for example) 500x16 (timepoints x features). The features themselves live on pretty different scales, so I thought that some kind of scaling / normalization (for example via https://scikit-learn.org/stable/modules/generated/sklearn.preprocessing.scale.html#sklearn.preprocessing.scale) could make sense. Now I wonder, though, how different costs would be affected by that. In the example attached below you can see the normalized signal for the L1 and L2 norms; change points are depicted with dashed lines. You can see that there are some obvious misses (calibrating the penalty helps sometimes, but is a finicky process). Should normalization be skipped altogether, or is there a better alternative cost for this kind of signal?
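To illustrate why scale matters for the L2 cost: a segment's cost is the sum of squared deviations from the segment mean, so a feature with a much larger scale dominates the total and change points in small-scale features become invisible. A minimal sketch with synthetic data (the 500x2 shape and the standard deviations are made up for illustration), using the `sklearn.preprocessing.scale` function linked above:

```python
import numpy as np
from sklearn.preprocessing import scale

rng = np.random.default_rng(0)

# Synthetic 500x2 signal: feature 0 on a small scale, feature 1 on a large one.
small = rng.normal(0.0, 1.0, size=500)    # std ~ 1
large = rng.normal(0.0, 100.0, size=500)  # std ~ 100
X = np.column_stack([small, large])

def l2_segment_cost(seg):
    """Per-feature sum of squared deviations from the segment mean
    (the quantity the least-squares / L2 cost accumulates)."""
    return ((seg - seg.mean(axis=0)) ** 2).sum(axis=0)

# Without scaling, feature 1 contributes vastly more cost than feature 0.
raw_cost = l2_segment_cost(X)
print(raw_cost[1] / raw_cost[0])  # roughly 1e4

# After z-scoring each column, both features contribute comparably.
X_scaled = scale(X)  # zero mean, unit variance per column
scaled_cost = l2_segment_cost(X_scaled)
print(scaled_cost[1] / scaled_cost[0])  # roughly 1
```

One caveat with z-scoring the whole series: the global standard deviation is itself inflated by the change points, so a feature with large shifts gets shrunk the most. That is usually still preferable to letting raw scales decide, but it is worth keeping in mind when comparing cost functions.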