Local LL-HLS Files #7
@PanicKk Generating the files locally can be implemented. I tried running it in a container; it works and no "Broken pipe" error occurred.

```dockerfile
FROM alpine:latest
RUN apk --update add ffmpeg python3 py3-pip && rm -rf /var/cache/apk/*
COPY requirements.txt requirements.txt
RUN pip3 install -r requirements.txt
COPY biim biim
COPY fmp4.py fmp4.py
EXPOSE 8080
CMD ["sh", "-c", "ffmpeg -re -f lavfi -i testsrc=700x180:r=30000/1001 -f lavfi -i sine=frequency=1000:r=48000 -c:v libx264 -tune zerolatency -preset ultrafast -r 30 -g 15 -pix_fmt yuv420p -c:a aac -ac 2 -ar 48000 -f mpegts - | ./fmp4.py"]
```
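For completeness, building and running that image would look something like the following (the image name `biim-demo` is an arbitrary choice, not from the repository):

```shell
# Build the image from the repository root, where the Dockerfile above lives.
docker build -t biim-demo .

# Run it, mapping the exposed container port 8080 to the host.
docker run --rm -p 8080:8080 biim-demo
```

The stream should then be reachable on the host at `localhost:8080`.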
Thank you, @monyone! I was trying with an Ubuntu image: I cloned the repo and started the push, but I also missed the port-exposing part. I tried the provided Dockerfile and it works fine, thank you. However, generating the files locally is a better approach for my purposes, since I need to upload the files to a CDN. Could you please instruct me on how to generate the files locally?
I have another concern: is the role of playlist.m3u8 that of a master file? Also, if the ffmpeg command includes multiple output resolutions, will those be included in the playlist?
biim can serve as an LL-HLS origin (HTTP/1.1).
Multiple resolutions (a multi-variant playlist) are currently not supported.
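For reference, HLS exposes multiple resolutions through a separate multivariant (master) playlist that lists one media playlist per rendition; the URIs, bandwidths, and codec strings below are illustrative only, not something biim produces:

```
#EXTM3U
#EXT-X-STREAM-INF:BANDWIDTH=6000000,RESOLUTION=1920x1080,CODECS="avc1.640028,mp4a.40.2"
1080p.m3u8
#EXT-X-STREAM-INF:BANDWIDTH=3000000,RESOLUTION=1280x720,CODECS="avc1.64001f,mp4a.40.2"
720p.m3u8
```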
It would be amazing if you added support for multiple resolutions! Maybe something like localhost:8080/1080p.m3u8, localhost:8080/720p.m3u8, ... based on the given ffmpeg command, so the user can decide the encoding configuration for each resolution. Also, what do you mean by "Can I upload them either way"?
A CDN (almost always) pulls files from an existing origin server.
Hey @monyone! I apologize for the mention and for disturbing you, but I was wondering if you have any news on supporting multiple resolution outputs. Could you add that, or could you please instruct me on how to achieve it? Thank you!
Any updates, @monyone?
Is there a way to generate the files locally instead of pushing them to localhost:8080?
I thought of this because when I run the script inside a Docker container, I get a "Broken pipe" error and I couldn't find a solution for it.
So, is there a fix for that, or can I generate the files locally (meaning inside the container)?
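As a possible workaround for writing files to disk without going through an HTTP origin, ffmpeg's built-in `hls` muxer can segment to local files on its own. Note this is a sketch, not biim's method: it produces standard HLS, not LL-HLS with partial segments, and the output path and segment duration are arbitrary choices:

```shell
ffmpeg -re -f lavfi -i testsrc=700x180:r=30000/1001 \
  -f lavfi -i sine=frequency=1000:r=48000 \
  -c:v libx264 -tune zerolatency -preset ultrafast -r 30 -g 15 -pix_fmt yuv420p \
  -c:a aac -ac 2 -ar 48000 \
  -f hls -hls_time 2 -hls_list_size 0 \
  -hls_segment_type fmp4 \
  /output/playlist.m3u8
```

The resulting playlist, init segment, and media segments in `/output` could then be uploaded to a CDN as plain files.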