
Recording starts before resolving promise #1

Closed
limoragni opened this issue Jun 26, 2017 · 6 comments · Fixed by #12 or #14
Comments

@limoragni

I'm recording the screen because I need to show some visualizations over a video I'm playing back. I need to know when the recording starts so I can start playing the video at the same time. Right now the video starts playing almost a second after the recording has supposedly started. I'm still trying to figure out whether a problem in my own code is causing this, but I see one of the comments in the code says // `R` is printed by Swift when the recording **actually** starts, as if the "actually" has air quotes or something.

So, is there a known lapse of time between when the recording starts and when the promise resolves? Can I do something about this?

Thanks!

@sindresorhus
Contributor

We're aware of the problem. See wulkano/Kap#3.

@skllcrn
Member

skllcrn commented Jun 26, 2017

It's upstream; see wulkano/Kap#3. We can't know the exact time, but we're looking at a couple of Kap-specific fixes.

@skllcrn
Member

skllcrn commented Jun 26, 2017

@sindresorhus "First!”

@limoragni
Author

I see the solution you're proposing.

"Save the time when .startRecording() is called. Have Aperture send the time back in the resolved promise when the recording actually started. Use ffmpeg to losslessly trim that time away from the start of the movie."

This would work for me too, since I also have ffmpeg as part of my app. How would I go about passing the time when Aperture starts the recording? Are you planning on adding it soon?

Would it be a change on this part of the code?

        if (data.trim() === 'R') {
          // `R` is printed by Swift when the recording **actually** starts
          clearTimeout(timeout);
          resolve(this.tmpPath);
        }

Or do you think it needs to be done on the Swift side? (I imagine this is the case.)

@sindresorhus
Contributor

Are you planning on adding it soon?

I won't have time for a while, but we're always happy to review pull requests.

Would it be a change on this part of the code?

Probably the Swift code.


That being said, I'm not super happy about this solution. I'm still hoping someone will come up with a better solution. There might be a way to fix this without introducing so much complexity.
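For what it's worth, a minimal sketch of what the Swift-side change could look like (the names here are illustrative, not Aperture's actual implementation) would be to report a timestamp alongside the `R` marker from the delegate callback that fires when recording actually begins, so the Node side can compute the trim offset for ffmpeg:

```swift
import AVFoundation

// Sketch only: assumes an AVCaptureMovieFileOutput-style recorder where the
// actual start time is learned from the didStartRecordingTo delegate callback.
final class Recorder: NSObject, AVCaptureFileOutputRecordingDelegate {
    func fileOutput(_ output: AVCaptureFileOutput,
                    didStartRecordingTo fileURL: URL,
                    from connections: [AVCaptureConnection]) {
        // Print the moment recording actually began along with the `R` marker,
        // so the JS side can trim the difference away with ffmpeg.
        print("R \(Date().timeIntervalSince1970)")
    }

    func fileOutput(_ output: AVCaptureFileOutput,
                    didFinishRecordingTo outputFileURL: URL,
                    from connections: [AVCaptureConnection],
                    error: Error?) {
        // Recording finished; clean up here.
    }
}
```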

@sindresorhus
Contributor

sindresorhus commented Sep 14, 2017

I have thought of two other possible solutions, but both will require #47 to be implemented first.

Solution 1

According to the .startRunning() documentation, it's the slow blocking call that causes this issue.

The startRunning() method is a blocking call which can take some time, therefore you should perform session setup on a serial queue so that the main queue isn't blocked (which keeps the UI responsive)

If we instead spawned the binary when the Aperture class is initialized, so it's ready when the .start() method is called, I think it would solve our timing issues. This will require us to be able to send messages to the Swift binary after it's been spawned, hence #47. We also need to run the session in the Swift code on a separate queue.

Example of this here: https://developer.apple.com/library/content/samplecode/AVCam/Introduction/Intro.html#//apple_ref/doc/uid/DTS40010112
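A rough sketch of that idea (class and queue names are illustrative, not Aperture's actual code): do the session setup and the blocking startRunning() call on a serial queue as soon as the process is spawned, so the session is already warm when .start() arrives.

```swift
import AVFoundation

final class Capturer {
    let session = AVCaptureSession()
    // Serial queue so the blocking startRunning() call never stalls the main queue.
    private let sessionQueue = DispatchQueue(label: "aperture.session")

    init() {
        sessionQueue.async {
            // Configure inputs/outputs here, then warm up the session
            // right when the binary is spawned, before .start() is called.
            self.session.startRunning()
        }
    }

    func start(completion: @escaping () -> Void) {
        // By the time this runs, the session is (usually) already running,
        // so starting the actual recording is near-instant.
        sessionQueue.async {
            // Begin writing output here…
            completion()
        }
    }
}
```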

Solution 2

I'm pretty sure solution 1 will solve our problems, but if not:

Add a .prepare() method that resolves a promise when the session is ready and the didOutputSampleBuffer delegate callback is receiving frames, so capture is running but nothing is being written. From that point on, you should be able to call .record() immediately, since the capture is already running; it only needs to start writing.

Example of this here: https://github.com/gardner-lab/video-capture/blob/414e9f7f30640b90e06ba54eb7b5bc0a07768180/VideoCapture/CaptureControl.swift#L58-L71
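The capture-without-writing part of that could be sketched like this (a simplification with hypothetical names; a real implementation would append buffers to an AVAssetWriterInput):

```swift
import AVFoundation

final class Capturer: NSObject, AVCaptureVideoDataOutputSampleBufferDelegate {
    private var isWriting = false

    // Called for every captured frame once the session is running,
    // i.e. after .prepare() has resolved.
    func captureOutput(_ output: AVCaptureOutput,
                       didOutput sampleBuffer: CMSampleBuffer,
                       from connection: AVCaptureConnection) {
        guard isWriting else { return }  // capture is running, writing is not
        // Append sampleBuffer to an AVAssetWriterInput here…
    }

    func record() {
        // Frames are already flowing, so this takes effect on the very next buffer.
        isWriting = true
    }
}
```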


I think both of these solutions are better than recording the timestamp and later trimming the video.


I don't have the time to work on this at the moment though, so pull requests are welcome.
