
Error when processing new files #236

Closed
r3dsouza opened this issue Jun 16, 2020 · 7 comments · Fixed by #237
Assignees
Labels
bug Something isn't working

Comments

r3dsouza commented Jun 16, 2020

Hi,

Firstly, thank you for this project. It is really nice to have a docker version of the AI Tool!!

Issue Description
The issue I'm facing is that when new files are saved to the aiinput folder, they are correctly recognised and the analysis is initiated, but I immediately get an "invalid image" error. Interestingly, if an image file is already in the aiinput folder and I simply rename it, everything works perfectly: the DeepStack server responds and the web request and MQTT triggers fire. I get the same issue whether the images are FTP'd by the camera or I manually copy files to the folder via Samba from my laptop. I'm not sure what else to look for. Any help would be greatly appreciated!

Log details
```
2020-06-16T01:43:10+01:00 [Trigger Person detector] /aiinput/2020/06/16/Camera1_20200615_125329.0.64.jpg: Analyzing
2020-06-16T01:43:10+01:00 [DeepStack] Failed to call DeepStack at http://192.168.1.147:5000/: {"success":false,"error":"invalid image"}
2020-06-16T01:43:10+01:00 [Trigger Person detector] /aiinput/2020/06/16/Camera1_20200615_125329.0.64.jpg: Analysis failed
```

Installation details
I'm running the latest version of the Docker image on a QNAP TS-451+ NAS, following the instructions provided at Deploying to Synology or Unraid. The aiinput folder of the container is mapped to a share on the NAS. I have Reolink 410 (5MP) IP cameras, which are configured to save images to this NAS share via FTP when they detect motion.

DeepStack AI runs as a Docker container on a separate server running Proxmox VE. Note that I have tried the Windows AI Tool pointing at the same DeepStack AI Docker instance, and it works perfectly!

Docker Compose
```yaml
version: "3"

networks:
  qnet-dhcp-eth0-adadab:
    external: true

services:
  deepstackai-trigger:
    container_name: deepstackai-trigger
    image: danecreekphotography/node-deepstackai-trigger:latest
    hostname: deepstackai-trigger
    networks:
      - qnet-dhcp-eth0-adadab
    ports:
      - 4242:4242
    environment:
      - TZ=Europe/London
      - DEEPSTACK_URI=http://192.168.1.147:5000/
      - VERBOSE=true
    volumes:
      - /share/Cameras/AI_Input:/aiinput
      - /share/Volumes/DeepStackAI_Trigger/config:/config
      - /share/Volumes/DeepStackAI_Trigger/node-deepstackai-trigger:/node-deepstackai-trigger
    restart: always
```

@r3dsouza r3dsouza added the bug Something isn't working label Jun 16, 2020
neilenns commented Jun 16, 2020

Thanks for the detailed report! My guess is that the combination of the network share and Docker causes the file to get read and sent to DeepStack before it has been fully written. When you rename a file, it is already fully present, which is why that works.

Can you try setting the following environment variable in your Docker config for the deepstackai-trigger container:

- CHOKIDAR_USEPOLLING=false

This will probably wind up being even worse and won't detect any files at all, but it's worth a try!
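For reference, that variable would go in the `environment` section of the compose file above, alongside the existing entries (a fragment, values copied from the compose file in the report):

```yaml
services:
  deepstackai-trigger:
    environment:
      - TZ=Europe/London
      - DEEPSTACK_URI=http://192.168.1.147:5000/
      - VERBOSE=true
      - CHOKIDAR_USEPOLLING=false
```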

@neilenns neilenns self-assigned this Jun 16, 2020
neilenns added a commit that referenced this issue Jun 16, 2020
@neilenns

Actually scratch that. I'll have something else for you to try as soon as Docker Hub finishes building. Docker Hub is sooooo slow...

neilenns commented Jun 16, 2020

Ok @r3dsouza, update your docker compose to use this image tag instead:

danecreekphotography/node-deepstackai-trigger:issue236

That should be the only change you need to make, leave everything else as it was when you saw the problem. Start the containers again and... hopefully... it will work!

Edit: Also set this environment variable on the container:

- CHOKIDAR_AWAITWRITEFINISH=true

@r3dsouza

That did the trick! Using the docker image issue236 and adding the environment variable, it works perfectly!

Thank you!!

@neilenns

Awesome! I'll spend a bit of time testing the change, then move it to the main branch. You can keep using that image tag for now, and I'll let you know when it's moved to the "latest" tag so you can update your docker compose file.

neilenns added a commit that referenced this issue Jun 16, 2020
neilenns added a commit that referenced this issue Jun 16, 2020
* Error when processing new files
Fixes #236

* Move awaitWriteFinish setting to environment variable

@neilenns

This fix is now in the main release @r3dsouza. Please update your Docker tag to point to latest, thanks!

@r3dsouza

Thanks so much! Really appreciate it.
