Better documentation of chopping options #131
Hi @tk231 Really happy to hear that you find this package useful, and thanks for opening this issue. Feedback from users is very valuable for making software that works in scenarios I might not have tested myself.

When it comes to how the beats are chopped, I think the best resource is the API documentation: here for chopping data with pacing info (i.e. you use the pacing protocol to determine when to chop), or the more sophisticated method without this information about pacing (which is probably the one you are referring to). One thing that might mess this up is the

However, there might be cases that I haven't thought about, so feel free to drop some sample data here, and then I can take a look.
Hi @finsberg Thank you for the prompt reply, and for pointing me to the API documentation! I have had a look at the more sophisticated method without pacing information (IMO this suits my use case better), although I had missed the documentation of the function with pacing info. I think I know the problem now: the program looks for instances "where the pacing amplitude goes from zero to something positive". In my optical mapping data, our intensities are always positive, which is why I would definitely need to perform some kind of baseline correction for this to work. I'll try the baseline correction algorithm.

Here's a sample of my optical map recording, just for your amusement, if you're interested in trying it out. We had a sampling rate of 478 Hz, if that information helps.

P.S.: I have not seen a publication for this library; is there a way to cite it?
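The "zero to something positive" rule quoted above can be illustrated with a few lines of numpy. This is only a sketch of that idea, not `ap_features`' actual implementation, and the pacing trace here is made up:

```python
import numpy as np

# Hypothetical pacing trace: amplitude is zero between stimuli and
# positive while a stimulus is applied.
pacing = np.array([0, 0, 1, 1, 0, 0, 0, 2, 2, 0])

# A pacing onset is a sample where the previous value is zero and the
# current value is positive.
onsets = np.where((pacing[1:] > 0) & (pacing[:-1] == 0))[0] + 1
print(onsets)  # indices of the zero-to-positive transitions: [2 7]
```

This also shows why an always-positive fluorescence signal never triggers the rule: without a return to zero there is no zero-to-positive transition to detect.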
Thanks for sharing the dataset. I tried to chop the dataset into beats, and it seems to work if you select a low value of `min_window`:

```python
import numpy as np
import matplotlib.pyplot as plt
import ap_features as apf

data = np.load("Basler_acA720-520um__40190569__20230919_114100243_vol_denoised.npy")
t = np.arange(0, len(data))

y = apf.Beats(
    y=data,
    t=t,
    background_correction_method="full",
    chopping_options={"min_window": 5},
)
beats = y.beats

fig, ax = plt.subplots()
for beat in beats[:20]:
    ax.plot(beat.t, beat.y)
fig.savefig("beats.png")
```

This produces the following plot. Not sure if this is what you expect? Also thanks for mentioning it. For citation, I have now added a citation file which you can use for now.
Hi @finsberg, thanks for getting back! I have not seen mention of the

The data I sent was the raw data that one would expect to obtain from optical maps. Usually, processing the data would entail several preprocessing steps.

From that plot, I notice two things: (1) the signals have not been inverted, and (2) the splitting of the signals seems to be based on the point of upstroke (downstroke if the signal were inverted). I have used your code and modified it. This is what it looks like:

This is the resulting plot (cropped to the first 5 seconds). It looks like the first few AP beats were not correctly cropped, but the others looked okay, with some looking pretty good while others have some kind of overlap. Would you happen to have an explanation for that?

P.S.: Thanks for the citation file! I will definitely use that whenever I use it.
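On point (1), a minimal way to invert a fluorescence trace while keeping all values non-negative, before handing it to `apf.Beats`, could look like the following. The variable names and toy data here are hypothetical:

```python
import numpy as np

# Toy raw optical-mapping trace: fluorescence dips during each action
# potential, so upstrokes point downwards in the raw data.
data = np.array([10.0, 9.0, 6.0, 7.0, 9.5])

# Flip the trace so upstrokes point upwards; subtracting from the
# maximum keeps every value non-negative.
inverted_data = data.max() - data
print(inverted_data)
```

`inverted_data` would then be passed as `y` to `apf.Beats` in place of the raw trace.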
OK, I think one of the main issues here (which should have been better documented) is that the values of

```python
y = apf.Beats(
    y=inverted_data,
    t=time,
    background_correction_method="full",
    chopping_options={
        "extend_front": 0.05,
        "min_window": 0.05,
        "max_window": 0.2,
    },
)
```

I renamed this issue to indicate that the problem is really with documenting the chopping options.
Description
Hi all,
First off, this project is really cool, and I'm really glad that a colleague I met during a conference introduced me to your program! It seems to be able to do everything I need, i.e. analysis of optical mapping data.
However, I've been trying to get the code to work on my data, and it seems unable to process the beats within my optical map recordings. I don't think I understand how the chopping of beats works: it seems to work simply by treating every occasion where the signal passes through a threshold as the start/end of a beat, but it does not work on my optically mapped action potentials. My entire recording is basically mapped into a single beat, which is definitely not correct. I have not tried correcting the baseline of my recording, and there is actually a significant drift in the normalised signal; is this what I should do when I initialise my Beats? Unfortunately, there are no mentions of your baseline correction methods in the tutorial, but they seem to be pretty well-documented in the package.
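The threshold-crossing scheme described above can be sketched in a few lines of numpy. This illustrates that mental model only, not `ap_features`' actual algorithm, and the signal is made up:

```python
import numpy as np

# Toy signal with two "beats" that rise above a threshold.
y = np.array([0.0, 0.2, 0.8, 0.9, 0.3, 0.1, 0.7, 0.95, 0.4, 0.05])
threshold = 0.5

above = y > threshold
# Upward crossings mark beat starts, downward crossings mark beat ends.
starts = np.where(~above[:-1] & above[1:])[0] + 1
ends = np.where(above[:-1] & ~above[1:])[0] + 1
print(starts, ends)
```

With a strong baseline drift, a normalised signal may cross such a threshold only once over the whole recording, which would produce exactly the single-beat behaviour described.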
Would you know what could be the issue? I could provide a numpy file with my signals if it would make troubleshooting easier.
Thanks in advance!