Hi, I'm getting nonsensical output on just the simplest example.
I'm on a machine running macOS Sonoma 14.7.6.
I use a fresh Conda environment with the following environment.yml:
name: ripser-test
channels:
- conda-forge
dependencies:
- python=3.13
- numpy
- ripser
This installs the latest version of ripser, 0.6.12.
Then I try to run the simple example from the documentation:
test.py
import numpy as np
from ripser import Rips
np.random.seed(1)
rips = Rips()
data = np.random.random((100,2))
diagrams = rips.fit_transform(data)
rips.plot(diagrams, show=True)
I create the environment and run the script like this:
conda env create -n ripser-test -f environment.yml
conda activate ripser-test
python test.py
The outputs for seeds 1, 2, 3, and 8 look like this:
Importantly, there is never a death time at infinity; instead, the death time is zero in cases where it shouldn't be.
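To pin the problem down numerically rather than reading it off the plots, one can check the H0 diagram for the expected infinite bar directly. Below is a minimal sketch; the good_h0/bad_h0 arrays are hypothetical stand-ins for entries of diagrams = rips.fit_transform(data) (a correct H0 diagram should contain exactly one point whose death time is inf):

```python
import numpy as np

def has_infinite_bar(diagram):
    """Return True if any point in a persistence diagram has death == inf."""
    return bool(np.isinf(diagram[:, 1]).any())

# Hypothetical stand-ins for diagrams[0] (H0), for illustration only:
good_h0 = np.array([[0.0, 0.05], [0.0, 0.12], [0.0, np.inf]])  # expected shape of output
bad_h0 = np.array([[0.0, 0.05], [0.0, 0.12], [0.0, 0.0]])      # what I actually observe

print(has_infinite_bar(good_h0))
print(has_infinite_bar(bad_h0))
```

With my setup, running this check on the real fit_transform output reports no infinite bar for any of the seeds above.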