The analysis fails to start and ends immediately. [Extraction container could not be reached.] #1277

Open
oguzhanozgur opened this issue Oct 1, 2024 · 32 comments

@oguzhanozgur

FACT version

4.3-dev

Environment

Distribution : Ubuntu 22.04.5 LTS
powered by FACT 4.3-dev
© Fraunhofer FKIE 2015-2024

Steps to reproduce

  1. Start the tool with the following command:
    ~/FACT_core/start_all_installed_fact_components
  2. Fill in the required fields for the file (version, name, etc.).
  3. Start the analysis.

Observed Behavior

The analysis ends immediately and nothing shows up on the result screen.

Expected Behavior

The analysis should start normally and the tool should perform the analyses I selected.

Installation logs

No response

Backend logs

[2024-10-01 13:29:01][connectionpool][WARNING]: Retrying (Retry(total=2, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7dbd805f7c70>: Failed to establish a new connection: [Errno 111] Connection refused')': /status
[2024-10-01 13:29:01][connectionpool][WARNING]: Retrying (Retry(total=1, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7dbd805f7ee0>: Failed to establish a new connection: [Errno 111] Connection refused')': /status
[2024-10-01 13:29:02][connectionpool][WARNING]: Retrying (Retry(total=0, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7dbd805f71c0>: Failed to establish a new connection: [Errno 111] Connection refused')': /status
[2024-10-01 13:29:02][unpacking_scheduler][ERROR]: Could not fetch unpacking container logs
[2024-10-01 13:29:02][unpacking_scheduler][ERROR]: Exception happened during extraction of 2bd7fcbb382db9223414bde8aefd4f7eab3299bc0084e43356e6c1ac26af3baf_4535.: Extraction container could not be reached.
Traceback (most recent call last):
File "/home/oguzhanozgur/ozzy/lib/python3.10/site-packages/urllib3/connection.py", line 199, in _new_conn
sock = connection.create_connection(
File "/home/oguzhanozgur/ozzy/lib/python3.10/site-packages/urllib3/util/connection.py", line 85, in create_connection
raise err
File "/home/oguzhanozgur/ozzy/lib/python3.10/site-packages/urllib3/util/connection.py", line 73, in create_connection
sock.connect(sa)
ConnectionRefusedError: [Errno 111] Connection refused

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/home/oguzhanozgur/ozzy/lib/python3.10/site-packages/urllib3/connectionpool.py", line 789, in urlopen
response = self._make_request(
File "/home/oguzhanozgur/ozzy/lib/python3.10/site-packages/urllib3/connectionpool.py", line 495, in _make_request
conn.request(
File "/home/oguzhanozgur/ozzy/lib/python3.10/site-packages/urllib3/connection.py", line 441, in request
self.endheaders()
File "/usr/lib/python3.10/http/client.py", line 1278, in endheaders
self._send_output(message_body, encode_chunked=encode_chunked)
File "/usr/lib/python3.10/http/client.py", line 1038, in _send_output
self.send(msg)
File "/usr/lib/python3.10/http/client.py", line 976, in send
self.connect()
File "/home/oguzhanozgur/ozzy/lib/python3.10/site-packages/urllib3/connection.py", line 279, in connect
self.sock = self._new_conn()
File "/home/oguzhanozgur/ozzy/lib/python3.10/site-packages/urllib3/connection.py", line 214, in _new_conn
raise NewConnectionError(
urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPConnection object at 0x7dbd80424160>: Failed to establish a new connection: [Errno 111] Connection refused

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/home/oguzhanozgur/ozzy/lib/python3.10/site-packages/requests/adapters.py", line 589, in send
resp = conn.urlopen(
File "/home/oguzhanozgur/ozzy/lib/python3.10/site-packages/urllib3/connectionpool.py", line 873, in urlopen
return self.urlopen(
File "/home/oguzhanozgur/ozzy/lib/python3.10/site-packages/urllib3/connectionpool.py", line 873, in urlopen
return self.urlopen(
File "/home/oguzhanozgur/ozzy/lib/python3.10/site-packages/urllib3/connectionpool.py", line 873, in urlopen
return self.urlopen(
File "/home/oguzhanozgur/ozzy/lib/python3.10/site-packages/urllib3/connectionpool.py", line 843, in urlopen
retries = retries.increment(
File "/home/oguzhanozgur/ozzy/lib/python3.10/site-packages/urllib3/util/retry.py", line 519, in increment
raise MaxRetryError(_pool, url, reason) from reason # type: ignore[arg-type]
urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=9900): Max retries exceeded with url: /status (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7dbd80424160>: Failed to establish a new connection: [Errno 111] Connection refused'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
File "/home/oguzhanozgur/FACT_core/src/unpacker/unpack_base.py", line 55, in _extract_with_worker
response = container.start_unpacking(tmp_dir, timeout=WORKER_TIMEOUT)
File "/home/oguzhanozgur/FACT_core/src/unpacker/extraction_container.py", line 118, in start_unpacking
response = self._check_connection()
File "/home/oguzhanozgur/FACT_core/src/unpacker/extraction_container.py", line 133, in _check_connection
return session.get(url, timeout=5)
File "/home/oguzhanozgur/ozzy/lib/python3.10/site-packages/requests/sessions.py", line 602, in get
return self.request("GET", url, **kwargs)
File "/home/oguzhanozgur/ozzy/lib/python3.10/site-packages/requests/sessions.py", line 589, in request
resp = self.send(prep, **send_kwargs)
File "/home/oguzhanozgur/ozzy/lib/python3.10/site-packages/requests/sessions.py", line 703, in send
r = adapter.send(request, **kwargs)
File "/home/oguzhanozgur/ozzy/lib/python3.10/site-packages/requests/adapters.py", line 622, in send
raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=9900): Max retries exceeded with url: /status (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7dbd80424160>: Failed to establish a new connection: [Errno 111] Connection refused'))

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
File "/home/oguzhanozgur/FACT_core/src/scheduler/unpacking_scheduler.py", line 199, in work_thread
extracted_objects = self.unpacker.unpack(task, tmp_dir, container)
File "/home/oguzhanozgur/FACT_core/src/unpacker/unpack.py", line 42, in unpack
extracted_files = self.extract_files_from_file(current_fo.file_path, tmp_dir, container)
File "/home/oguzhanozgur/FACT_core/src/unpacker/unpack_base.py", line 41, in extract_files_from_file
self._extract_with_worker(file_path, container, tmp_dir)
File "/home/oguzhanozgur/FACT_core/src/unpacker/unpack_base.py", line 59, in _extract_with_worker
raise ExtractionError('Extraction container could not be reached.') from error
unpacker.unpack_base.ExtractionError: Extraction container could not be reached.
[2024-10-01 13:29:02][unpacking_scheduler][INFO]: Unpacking completed: 2bd7fcbb382db9223414bde8aefd4f7eab3299bc0084e43356e6c1ac26af3baf_4535 (extracted files: 0)
[2024-10-01 13:29:02][unpacking_scheduler][INFO]: Unpacking of firmware 2bd7fcbb382db9223414bde8aefd4f7eab3299bc0084e43356e6c1ac26af3baf_4535 completed.
/home/oguzhanozgur/FACT_core/src/bin/internal_symlink_magic, 7: Warning: using regular magic file `/home/oguzhanozgur/FACT_core/src/bin/firmware'
/home/oguzhanozgur/FACT_core/src/bin/internal_symlink_magic, 7: Warning: using regular magic file `/home/oguzhanozgur/FACT_core/src/bin/firmware'
Process ExceptionSafeProcess-109:
Traceback (most recent call last):
File "/usr/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
self.run()
File "/home/oguzhanozgur/FACT_core/src/helperFunctions/process.py", line 93, in run
raise exception
File "/home/oguzhanozgur/FACT_core/src/helperFunctions/process.py", line 87, in run
Process.run(self)
File "/usr/lib/python3.10/multiprocessing/process.py", line 108, in run
self._target(*self._args, **self._kwargs)
File "/home/oguzhanozgur/FACT_core/src/scheduler/unpacking_scheduler.py", line 149, in extraction_loop
self.check_pending()
File "/home/oguzhanozgur/FACT_core/src/scheduler/unpacking_scheduler.py", line 173, in check_pending
container.restart()
File "/home/oguzhanozgur/FACT_core/src/unpacker/extraction_container.py", line 91, in restart
self.stop()
File "/home/oguzhanozgur/FACT_core/src/unpacker/extraction_container.py", line 67, in stop
raise RuntimeError('Container is not running.')
RuntimeError: Container is not running.
[2024-10-01 13:29:03][scheduler][INFO]: Analysis Completed: 2bd7fcbb382db9223414bde8aefd4f7eab3299bc0084e43356e6c1ac26af3baf_4535
[2024-10-01 13:29:03][analysis_status][INFO]: Analysis of firmware 2bd7fcbb382db9223414bde8aefd4f7eab3299bc0084e43356e6c1ac26af3baf_4535 completed

Frontend logs

No response

Other information

No response

@jstucke
Collaborator

jstucke commented Oct 1, 2024

urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=9900): Max retries exceeded with url: /status (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7dbd80424160>: Failed to establish a new connection: [Errno 111] Connection refused'))

This means that the extractor container could not be reached. The extractor is responsible for unpacking the firmware image and runs inside a docker container. Could you make sure the docker image is there? docker image ls | grep -i fact_extractor should give at least one result with tag "latest". Also could you make sure there are no errors when you start the backend? It should look something like this when the unpacking scheduler starts without errors:

[2024-10-01 14:43:46][extraction_container][INFO]: Started unpack worker 0
[2024-10-01 14:43:47][extraction_container][INFO]: Started unpack worker 1
[2024-10-01 14:43:47][extraction_container][INFO]: Started unpack worker 2
[2024-10-01 14:43:47][extraction_container][INFO]: Started unpack worker 3
[2024-10-01 14:43:47][unpacking_scheduler][INFO]: Unpacking scheduler online

It could also be the case that extractor containers are still running from an earlier run where FACT was not shut down cleanly. In that case you can run docker ps | grep fact_extractor | cut --delimiter " " --fields 1 | xargs docker stop to stop the containers before starting FACT again.
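A note on that command: if grep finds no matching containers, xargs calls docker stop with no arguments and you get a "requires at least 1 argument" error. A variant that avoids this (just a sketch, assuming the containers are based on the fkiecad/fact_extractor image) would be:

docker ps --filter "ancestor=fkiecad/fact_extractor" --quiet | xargs --no-run-if-empty docker stop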

@oguzhanozgur
Author

oguzhanozgur commented Oct 1, 2024

When I start the tool, I can see these logs (and I guess there is no error here):

oguzhanozgur@oguzhanozgur:~/FACT_core$ ./start_all_installed_fact_components
[2024-10-01 15:56:46][start_all_installed_fact_components][INFO]: starting database
[2024-10-01 15:56:46][start_all_installed_fact_components][INFO]: starting frontend
[2024-10-01 15:56:46][start_all_installed_fact_components][INFO]: starting backend
[2024-10-01 15:56:46][fact_base][INFO]: Starting FACT Frontend @ 4.3-dev (d96db1d, Python 3.10.12)
[2024-10-01 15:56:46][init][INFO]: Alembic DB revision: head: 05d8effce8b3, current: 05d8effce8b3
[2024-10-01 15:56:46][fact_base][INFO]: Starting FACT DB-Service @ 4.3-dev (d96db1d, Python 3.10.12)
[2024-10-01 15:56:46][init][INFO]: Alembic DB revision: head: 05d8effce8b3, current: 05d8effce8b3
[2024-10-01 15:56:46][fact_base][INFO]: Starting FACT Backend @ 4.3-dev (d96db1d, Python 3.10.12)
[2024-10-01 15:56:46][fact_base][INFO]: Successfully started FACT DB-Service
[2024-10-01 15:56:46][init][INFO]: Alembic DB revision: head: 05d8effce8b3, current: 05d8effce8b3
[2024-10-01 15:56:47][fact_base][INFO]: Successfully started FACT Frontend
[uWSGI] getting INI configuration from /home/oguzhanozgur/FACT_core/src/config/uwsgi_config.ini
*** Starting uWSGI 2.0.25.1 (64bit) on [Tue Oct 1 15:56:47 2024] ***
compiled with version: 11.4.0 on 01 October 2024 07:03:53
os: Linux-6.5.0-1027-oem #28-Ubuntu SMP PREEMPT_DYNAMIC Thu Jul 25 13:32:46 UTC 2024
nodename: oguzhanozgur
machine: x86_64
clock source: unix
detected number of CPU cores: 12
current working directory: /home/oguzhanozgur/FACT_core/src
detected binary path: /home/oguzhanozgur/ozzy/bin/uwsgi
!!! no internal routing support, rebuild with pcre support !!!
your processes number limit is 125501
your memory page size is 4096 bytes
detected max file descriptor number: 1024
lock engine: pthread robust mutexes
thunder lock: enabled
uwsgi socket 0 bound to TCP address 127.0.0.1:5000 fd 3
Python version: 3.10.12 (main, Sep 11 2024, 15:47:36) [GCC 11.4.0]
Python main interpreter initialized at 0x58d1df7f1fc0
python threads support enabled
your server socket listen backlog is limited to 100 connections
your mercy for graceful operations on workers is 60 seconds
mapped 500064 bytes (488 KB) for 10 cores
*** Operational MODE: preforking+threaded ***
[2024-10-01 15:56:47][ip_and_uri_finder_analysis][INFO]: ip signature path: /home/oguzhanozgur/ozzy/lib/python3.10/site-packages/common_analysis_ip_and_uri_finder/yara_rules/ip_rules.yara
[2024-10-01 15:56:47][ip_and_uri_finder_analysis][INFO]: ip signature path: /home/oguzhanozgur/ozzy/lib/python3.10/site-packages/common_analysis_ip_and_uri_finder/yara_rules/uri_rules.yara
[2024-10-01 15:56:47][frontend_main][INFO]: Web front end online
WSGI app 0 (mountpoint='') ready in 0 seconds on interpreter 0x58d1df7f1fc0 pid: 54743 (default app)
*** uWSGI is running in multiple interpreter mode ***
spawned uWSGI master process (pid: 54743)
spawned uWSGI worker 1 (pid: 54962, cores: 2)
spawned uWSGI worker 2 (pid: 54964, cores: 2)
spawned uWSGI worker 3 (pid: 54966, cores: 2)
spawned uWSGI worker 4 (pid: 54968, cores: 2)
spawned uWSGI worker 5 (pid: 54970, cores: 2)
*** Stats server enabled on 127.0.0.1:9191 fd: 19 ***
[2024-10-01 15:56:47][compare][INFO]: Comparison plugins available: Software, File_Coverage, File_Header
[2024-10-01 15:56:48][scheduler][INFO]: Analysis scheduler online
[2024-10-01 15:56:48][scheduler][INFO]: Analysis plugins available: binwalk 1.0.0, cpu_architecture 0.4.0, crypto_hints 0.2.1, crypto_material 0.5.2, cve_lookup 0.1.0, cwe_checker 0.5.4, device_tree 2.0.0, elf_analysis 0.3.4, exploit_mitigations 0.2.0, file_hashes 1.2, file_system_metadata 1.0.0, file_type 1.0.0, hardware_analysis 0.2, hashlookup 0.1.4, information_leaks 0.2.0, init_systems 0.4.2, input_vectors 0.1.2, interesting_uris 0.1, ip_and_uri_finder 0.4.2, ipc_analyzer 0.1.1, kernel_config 0.3.1, known_vulnerabilities 0.2.1, printable_strings 0.3.4, qemu_exec 0.5.2, software_components 0.4.2, source_code_analysis 0.7.1, string_evaluator 0.2.1, tlsh 0.2, users_and_passwords 0.5.4
[2024-10-01 15:56:48][unpacking_scheduler][INFO]: Unpacking scheduler online
[2024-10-01 15:56:48][unpacking_scheduler][INFO]: Queue Length (Analysis/Unpack): 0 / 0
[2024-10-01 15:56:48][comparison_scheduler][INFO]: Comparison scheduler online
[2024-10-01 15:56:48][back_end_binding][INFO]: Intercom online
[2024-10-01 15:56:48][fact_base][INFO]: Successfully started FACT Backend
[2024-10-01 15:56:51][fact_base][INFO]: System memory usage: 21.3%; open file count: 6
[2024-10-01 15:56:52][fact_base][INFO]: System memory usage: 21.3%; open file count: 7
[2024-10-01 15:56:53][fact_base][INFO]: System memory usage: 21.3%; open file count: 542

And here is the docker output you asked for:
oguzhanozgur@oguzhanozgur:~$ docker image ls | grep -i fact_extractor
fkiecad/fact_extractor latest d128d1a4c51c 12 days ago 2.26GB

Also, there is an error with the command you provided:
oguzhanozgur@oguzhanozgur:~/FACT_core$ docker ps | grep fact_extractor | cut --delimiter " " --fields 1 | xargs docker stop
"docker stop" requires at least 1 argument.
See 'docker stop --help'.

Usage: docker stop [OPTIONS] CONTAINER [CONTAINER...]

Stop one or more running containers

@jstucke
Collaborator

jstucke commented Oct 1, 2024

There is no log entry that suggests an extractor container was started. Did you change anything in the configuration file (src/config/fact-core-config.toml by default)? What happens if you try to start the extractor container manually with docker run -it --rm --entrypoint bash fkiecad/fact_extractor:latest (it should normally give you a shell, which you can exit with Ctrl+D)?

Also, there is an error about the command you provided.
oguzhanozgur@oguzhanozgur:~/FACT_core$ docker ps | grep fact_extractor | cut --delimiter " " --fields 1 | xargs docker stop
"docker stop" requires at least 1 argument.

That is actually good, since it means there are no orphaned extractor containers running. It does not explain what the problem is, though.

@oguzhanozgur
Author

I didn't change anything in the config files. When I try to execute your command:
[screenshot of the docker run output]
What should I do now?

@jstucke
Collaborator

jstucke commented Oct 1, 2024

I'm still not sure what the underlying problem is. Everything looks fine apart from the extraction containers not starting in the unpacking scheduler. Could you try running the scheduler tests?

pytest src/test/integration/scheduler

Also, could you try starting only the backend with ./start_fact_backend.py from the src directory and, after it has started, check the output of docker ps?

Normally the extractor containers should show up in the output, e.g.

CONTAINER ID   IMAGE                                   COMMAND                   CREATED         STATUS                 PORTS                                                 NAMES
734b1efc61aa   fkiecad/fact_extractor                  "gunicorn --timeout …"    2 seconds ago   Up 2 seconds           0.0.0.0:9903->5000/tcp, [::]:9903->5000/tcp           relaxed_clarke
c3d3017ec49a   fkiecad/fact_extractor                  "gunicorn --timeout …"    3 seconds ago   Up 2 seconds           0.0.0.0:9902->5000/tcp, [::]:9902->5000/tcp           nice_leakey
fcf139c31a01   fkiecad/fact_extractor                  "gunicorn --timeout …"    3 seconds ago   Up 3 seconds           0.0.0.0:9901->5000/tcp, [::]:9901->5000/tcp           goofy_chaplygin
74760d59043d   fkiecad/fact_extractor                  "gunicorn --timeout …"    4 seconds ago   Up 3 seconds           0.0.0.0:9900->5000/tcp, [::]:9900->5000/tcp           adoring_turing
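If the containers do show up but unpacking still fails, you could also check whether the extractor service answers on one of the published ports (a quick manual check, assuming the first worker is published on port 9900 as in your backend logs):

curl http://localhost:9900/status

That is the same /status endpoint FACT polls in the traceback you posted, so a "connection refused" here would point at the container/port mapping rather than FACT itself.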

@oguzhanozgur
Author

oguzhanozgur commented Oct 1, 2024

Here are all the results:

oguzhanozgur@oguzhanozgur:~/FACT_core/src$ ls
alembic.ini          compile_yara_signatures.py  flask_app_wrapper.py  intercom                     plugins                 start_fact_frontend.py  unpacker
analysis             config                      helperFunctions       manage_users.py              __pycache__             start_fact.py           update_statistic.py
bin                  config.py                   init_postgres.py      migrate_database.py          scheduler               statistic               version.py
check_signatures.py  conftest.py                 install               migrate_db_to_postgresql.py  start_fact_backend.py   storage                 web_interface
compare              fact_base.py                install.py            objects                      start_fact_database.py  test
oguzhanozgur@oguzhanozgur:~/FACT_core/src$ ./start_fact_backend.py 
Traceback (most recent call last):
  File "/home/oguzhanozgur/FACT_core/src/./start_fact_backend.py", line 35, in <module>
    from intercom.back_end_binding import InterComBackEndBinding
  File "/home/oguzhanozgur/FACT_core/src/intercom/back_end_binding.py", line 15, in <module>
    from storage.binary_service import BinaryService
  File "/home/oguzhanozgur/FACT_core/src/storage/binary_service.py", line 11, in <module>
    from unpacker.tar_repack import TarRepack
  File "/home/oguzhanozgur/FACT_core/src/unpacker/tar_repack.py", line 11, in <module>
    from unpacker.unpack_base import UnpackBase
  File "/home/oguzhanozgur/FACT_core/src/unpacker/unpack_base.py", line 11, in <module>
    from docker.types import Mount
ModuleNotFoundError: No module named 'docker'
oguzhanozgur@oguzhanozgur:~/FACT_core/src$ docker ps
CONTAINER ID   IMAGE     COMMAND   CREATED   STATUS    PORTS     NAMES
oguzhanozgur@oguzhanozgur:~/FACT_core/src$ pytest src/test/integration/scheduler
/usr/local/lib/python3.10/dist-packages/_pytest/config/__init__.py:329: PluggyTeardownRaisedWarning: A plugin raised an exception during an old-style hookwrapper teardown.
Plugin: helpconfig, Hook: pytest_cmdline_parse
ConftestImportFailure: ModuleNotFoundError: No module named 'semver' (from /home/oguzhanozgur/FACT_core/src/conftest.py)
For more information see https://pluggy.readthedocs.io/en/stable/api_reference.html#pluggy.PluggyTeardownRaisedWarning
  config = pluginmanager.hook.pytest_cmdline_parse(
ImportError while loading conftest '/home/oguzhanozgur/FACT_core/src/conftest.py'.
conftest.py:15: in <module>
    from analysis.plugin import AnalysisPluginV0
analysis/plugin/__init__.py:1: in <module>
    from .plugin import AnalysisPluginV0, Tag  # noqa: F401
analysis/plugin/plugin.py:8: in <module>
    import semver
E   ModuleNotFoundError: No module named 'semver'
oguzhanozgur@oguzhanozgur:~/FACT_core/src$ 

@jstucke
Collaborator

jstucke commented Oct 1, 2024

Did you maybe forget to activate your virtualenv? There are some import errors:

ModuleNotFoundError: No module named 'docker'

ModuleNotFoundError: No module named 'semver'
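A quick way to verify the environment (a minimal check, assuming your virtualenv is the one at ~/ozzy that shows up in the tracebacks):

source ~/ozzy/bin/activate
python3 -c "import docker, semver; print('ok')"

If that prints "ok", the backend start and the tests should get past these import errors.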

@oguzhanozgur
Author

oguzhanozgur commented Oct 1, 2024

Yes, I forgot :(
I tried again and here are the results:

  1. When I try to start the backend only:
(ozzy) oguzhanozgur@oguzhanozgur:~/FACT_core/src$ ./start_fact_backend.py
[2024-10-01 17:46:31][fact_base][INFO]: Starting FACT Backend @ 4.3-dev (d96db1d1, Python 3.10.12)
[2024-10-01 17:46:31][__init__][INFO]: Alembic DB revision:  head: 05d8effce8b3, current: 05d8effce8b3
[2024-10-01 17:46:32][ip_and_uri_finder_analysis][INFO]: ip signature path: /home/oguzhanozgur/ozzy/lib/python3.10/site-packages/common_analysis_ip_and_uri_finder/yara_rules/ip_rules.yara
[2024-10-01 17:46:32][ip_and_uri_finder_analysis][INFO]: ip signature path: /home/oguzhanozgur/ozzy/lib/python3.10/site-packages/common_analysis_ip_and_uri_finder/yara_rules/uri_rules.yara
[2024-10-01 17:46:32][compare][INFO]: Comparison plugins available: Software, File_Coverage, File_Header
[2024-10-01 17:46:32][scheduler][INFO]: Analysis scheduler online
[2024-10-01 17:46:32][scheduler][INFO]: Analysis plugins available: binwalk 1.0.0, cpu_architecture 0.4.0, crypto_hints 0.2.1, crypto_material 0.5.2, cve_lookup 0.1.0, cwe_checker 0.5.4, device_tree 2.0.0, elf_analysis 0.3.4, exploit_mitigations 0.2.0, file_hashes 1.2, file_system_metadata 1.0.0, file_type 1.0.0, hardware_analysis 0.2, hashlookup 0.1.4, information_leaks 0.2.0, init_systems 0.4.2, input_vectors 0.1.2, interesting_uris 0.1, ip_and_uri_finder 0.4.2, ipc_analyzer 0.1.1, kernel_config 0.3.1, known_vulnerabilities 0.2.1, printable_strings 0.3.4, qemu_exec 0.5.2, software_components 0.4.2, source_code_analysis 0.7.1, string_evaluator 0.2.1, tlsh 0.2, users_and_passwords 0.5.4
[2024-10-01 17:46:32][unpacking_scheduler][INFO]: Unpacking scheduler online
[2024-10-01 17:46:32][unpacking_scheduler][INFO]: Queue Length (Analysis/Unpack): 0 / 0
[2024-10-01 17:46:32][comparison_scheduler][INFO]: Comparison scheduler online
[2024-10-01 17:46:32][back_end_binding][INFO]: Intercom online
[2024-10-01 17:46:32][fact_base][INFO]: Successfully started FACT Backend
[2024-10-01 17:46:37][fact_base][INFO]: System memory usage: 19.2%; open file count: 542
[2024-10-01 17:47:24][unpacking_scheduler][INFO]: Queue Length (Analysis/Unpack): 0 / 0
  2. This is the docker ps result:
(ozzy) oguzhanozgur@oguzhanozgur:~$ docker ps
CONTAINER ID   IMAGE     COMMAND   CREATED   STATUS    PORTS     NAMES
  3. And lastly, this is the pytest result:
(ozzy) oguzhanozgur@oguzhanozgur:~/FACT_core$ pytest src/test/integration/scheduler
============================= test session starts ==============================
platform linux -- Python 3.10.12, pytest-7.2.0, pluggy-1.5.0 -- /home/oguzhanozgur/ozzy/bin/python3
cachedir: .pytest_cache
rootdir: /home/oguzhanozgur/FACT_core, configfile: pyproject.toml
plugins: cov-4.0.0, timeout-2.1.0, flaky-3.7.0
collected 6 items                                                              

src/test/integration/scheduler/test_cycle_with_tags.py::TestTagPropagation::test_run_analysis_with_tag FAILED [ 16%]
src/test/integration/scheduler/test_regression_virtual_file_path.py::test_check_collision FAILED [ 33%]
src/test/integration/scheduler/test_regression_virtual_file_path.py::test_unpacking_skip FAILED [ 50%]
src/test/integration/scheduler/test_unpack_analyse_and_compare.py::TestFileAddition::test_unpack_analyse_and_compare FAILED [ 66%]
src/test/integration/scheduler/test_unpack_and_analyse.py::TestFileAddition::test_unpack_and_analyse ERROR [ 83%]
src/test/integration/scheduler/test_unpack_and_analyse.py::TestFileAddition::test_unpack_and_analyse ERROR [ 83%]
src/test/integration/scheduler/test_unpack_only.py::TestFileAddition::test_unpack_only ERROR [100%]

==================================== ERRORS ====================================
__________ ERROR at setup of TestFileAddition.test_unpack_and_analyse __________

request = <SubRequest 'analysis_scheduler' for <Function test_unpack_and_analyse>>
pre_analysis_queue = <multiprocessing.queues.Queue object at 0x7b4e1fbae6b0>
post_analysis_queue = <multiprocessing.queues.Queue object at 0x7b4e1fbae980>
analysis_finished_event = <multiprocessing.synchronize.Event object at 0x7b4e1fbae7d0>
analysis_finished_counter = <Synchronized wrapper for c_int(0)>
_unpacking_lock_manager = <storage.unpacking_locks.UnpackingLockManager object at 0x7b4e1fbaecb0>
test_config = SchedulerTestConfig(backend_db_class=<class 'test.integration.common.MockDbInterface'>, comparison_db_class=<class 'st...orage.db_interface_view_sync.ViewUpdater'>, pipeline=True, start_processes=True, items_to_analyze=0, items_to_unpack=0)
monkeypatch = <_pytest.monkeypatch.MonkeyPatch object at 0x7b4e22318af0>

>   ???

src/test/conftest.py:325: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
src/scheduler/analysis/scheduler.py:111: in __init__
    self._load_plugins()
src/scheduler/analysis/scheduler.py:250: in _load_plugins
    runner = PluginRunner(plugin, runner_config, schemata)
src/scheduler/analysis/plugin.py:66: in __init__
    ???
/usr/lib/python3.10/multiprocessing/context.py:103: in Queue
    return Queue(maxsize, ctx=self.get_context())
/usr/lib/python3.10/multiprocessing/queues.py:42: in __init__
    self._reader, self._writer = connection.Pipe(duplex=False)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

duplex = False

    def Pipe(duplex=True):
        '''
        Returns pair of connection objects at either end of a pipe
        '''
        if duplex:
            s1, s2 = socket.socketpair()
            s1.setblocking(True)
            s2.setblocking(True)
            c1 = Connection(s1.detach())
            c2 = Connection(s2.detach())
        else:
>           fd1, fd2 = os.pipe()
E           OSError: [Errno 24] Too many open files

/usr/lib/python3.10/multiprocessing/connection.py:527: OSError
------------------------------ Captured log setup ------------------------------
ERROR    root:scheduler.py:235 Could not import analysis plugin elf_analysis
Traceback (most recent call last):
  File "/home/oguzhanozgur/FACT_core/src/scheduler/analysis/scheduler.py", line 232, in _load_plugins
    self.analysis_plugins[PluginClass.NAME] = PluginClass()
  File "/home/oguzhanozgur/FACT_core/src/analysis/PluginBase.py", line 90, in __init__
    self.manager = Manager()
  File "/usr/lib/python3.10/multiprocessing/context.py", line 57, in Manager
    m.start()
  File "/usr/lib/python3.10/multiprocessing/managers.py", line 552, in start
    reader, writer = connection.Pipe(duplex=False)
  File "/usr/lib/python3.10/multiprocessing/connection.py", line 527, in Pipe
    fd1, fd2 = os.pipe()
OSError: [Errno 24] Too many open files
ERROR    root:scheduler.py:235 Could not import analysis plugin file_hashes
Traceback (most recent call last):
  File "/home/oguzhanozgur/FACT_core/src/scheduler/analysis/scheduler.py", line 232, in _load_plugins
    self.analysis_plugins[PluginClass.NAME] = PluginClass()
  File "/home/oguzhanozgur/FACT_core/src/analysis/PluginBase.py", line 84, in __init__
    self.in_queue = Queue()
  File "/usr/lib/python3.10/multiprocessing/context.py", line 103, in Queue
    return Queue(maxsize, ctx=self.get_context())
  File "/usr/lib/python3.10/multiprocessing/queues.py", line 42, in __init__
    self._reader, self._writer = connection.Pipe(duplex=False)
  File "/usr/lib/python3.10/multiprocessing/connection.py", line 527, in Pipe
    fd1, fd2 = os.pipe()
OSError: [Errno 24] Too many open files
________ ERROR at teardown of TestFileAddition.test_unpack_and_analyse _________

    @pytest.fixture
    def _firmware_file_storage_directory() -> str:  # noqa: PT005
>       with TemporaryDirectory(prefix='fact-firmware-file-storage-directory') as tmp_dir:

src/conftest.py:33: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
/usr/lib/python3.10/tempfile.py:1017: in __exit__
    self.cleanup()
/usr/lib/python3.10/tempfile.py:1021: in cleanup
    self._rmtree(self.name, ignore_errors=self._ignore_cleanup_errors)
/usr/lib/python3.10/tempfile.py:1003: in _rmtree
    _rmtree(name, onerror=onerror)
/usr/lib/python3.10/shutil.py:725: in rmtree
    _rmtree_safe_fd(fd, path, onerror)
/usr/lib/python3.10/shutil.py:633: in _rmtree_safe_fd
    onerror(os.scandir, path, sys.exc_info())
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

topfd = 1023, path = '/tmp/fact-firmware-file-storage-directoryuh0vvob5'
onerror = <function TemporaryDirectory._rmtree.<locals>.onerror at 0x7b4e2231f640>

    def _rmtree_safe_fd(topfd, path, onerror):
        try:
>           with os.scandir(topfd) as scandir_it:
E           OSError: [Errno 24] Too many open files: '/tmp/fact-firmware-file-storage-directoryuh0vvob5'

/usr/lib/python3.10/shutil.py:629: OSError
------------------------------ Captured log setup ------------------------------
ERROR    root:scheduler.py:235 Could not import analysis plugin elf_analysis
Traceback (most recent call last):
  File "/home/oguzhanozgur/FACT_core/src/scheduler/analysis/scheduler.py", line 232, in _load_plugins
    self.analysis_plugins[PluginClass.NAME] = PluginClass()
  File "/home/oguzhanozgur/FACT_core/src/analysis/PluginBase.py", line 90, in __init__
    self.manager = Manager()
  File "/usr/lib/python3.10/multiprocessing/context.py", line 57, in Manager
    m.start()
  File "/usr/lib/python3.10/multiprocessing/managers.py", line 552, in start
    reader, writer = connection.Pipe(duplex=False)
  File "/usr/lib/python3.10/multiprocessing/connection.py", line 527, in Pipe
    fd1, fd2 = os.pipe()
OSError: [Errno 24] Too many open files
ERROR    root:scheduler.py:235 Could not import analysis plugin file_hashes
Traceback (most recent call last):
  File "/home/oguzhanozgur/FACT_core/src/scheduler/analysis/scheduler.py", line 232, in _load_plugins
    self.analysis_plugins[PluginClass.NAME] = PluginClass()
  File "/home/oguzhanozgur/FACT_core/src/analysis/PluginBase.py", line 84, in __init__
    self.in_queue = Queue()
  File "/usr/lib/python3.10/multiprocessing/context.py", line 103, in Queue
    return Queue(maxsize, ctx=self.get_context())
  File "/usr/lib/python3.10/multiprocessing/queues.py", line 42, in __init__
    self._reader, self._writer = connection.Pipe(duplex=False)
  File "/usr/lib/python3.10/multiprocessing/connection.py", line 527, in Pipe
    fd1, fd2 = os.pipe()
OSError: [Errno 24] Too many open files
_____________ ERROR at setup of TestFileAddition.test_unpack_only ______________

    @pytest.fixture
    def _unpacking_lock_manager() -> UnpackingLockManager:  # noqa: PT005
>       _manager = UnpackingLockManager()

src/test/conftest.py:289: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 
src/storage/unpacking_locks.py:8: in __init__
    self.manager = Manager()
/usr/lib/python3.10/multiprocessing/context.py:57: in Manager
    m.start()
/usr/lib/python3.10/multiprocessing/managers.py:562: in start
    self._process.start()
/usr/lib/python3.10/multiprocessing/process.py:121: in start
    self._popen = self._Popen(self)
/usr/lib/python3.10/multiprocessing/context.py:281: in _Popen
    return Popen(process_obj)
/usr/lib/python3.10/multiprocessing/popen_fork.py:19: in __init__
    self._launch(process_obj)
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <multiprocessing.popen_fork.Popen object at 0x7b4e1f70df90>
process_obj = <ForkProcess name='SyncManager-189' parent=64725 initial>

    def _launch(self, process_obj):
        code = 1
>       parent_r, child_w = os.pipe()
E       OSError: [Errno 24] Too many open files

/usr/lib/python3.10/multiprocessing/popen_fork.py:64: OSError
=================================== FAILURES ===================================
________________ TestTagPropagation.test_run_analysis_with_tag _________________

self = <test.integration.scheduler.test_cycle_with_tags.TestTagPropagation object at 0x7b4e05ffbd90>
analysis_finished_event = <multiprocessing.synchronize.Event object at 0x7b4e05958160>
unpacking_scheduler = <scheduler.unpacking_scheduler.UnpackingScheduler object at 0x7b4e22510670>
backend_db = <storage.db_interface_backend.BackendDbInterface object at 0x7b4e05e4bfd0>
analysis_scheduler = <scheduler.analysis.scheduler.AnalysisScheduler object at 0x7b4e059be680>

    @pytest.mark.SchedulerTestConfig(
        # 5 objects * 3 analyses = 15 calls
        items_to_analyze=15,
        pipeline=True,
    )
    def test_run_analysis_with_tag(self, analysis_finished_event, unpacking_scheduler, backend_db, analysis_scheduler):
        test_fw = Firmware(file_path=f'{get_test_data_dir()}/container/with_key.7z')
        test_fw.version, test_fw.vendor, test_fw.device_name, test_fw.device_class = ['foo'] * 4
        test_fw.release_date = '2017-01-01'
        test_fw.scheduled_analysis = ['crypto_material']
    
        unpacking_scheduler.add_task(test_fw)
    
>       assert analysis_finished_event.wait(timeout=20)
E       assert False
E        +  where False = <bound method Event.wait of <multiprocessing.synchronize.Event object at 0x7b4e05958160>>(timeout=20)
E        +    where <bound method Event.wait of <multiprocessing.synchronize.Event object at 0x7b4e05958160>> = <multiprocessing.synchronize.Event object at 0x7b4e05958160>.wait

src/test/integration/scheduler/test_cycle_with_tags.py:23: AssertionError
----------------------------- Captured stderr call -----------------------------
/home/oguzhanozgur/FACT_core/src/bin/internal_symlink_magic, 7: Warning: using regular magic file `/home/oguzhanozgur/FACT_core/src/bin/firmware'
/home/oguzhanozgur/FACT_core/src/bin/internal_symlink_magic, 7: Warning: using regular magic file `/home/oguzhanozgur/FACT_core/src/bin/firmware'
Process ExceptionSafeProcess-70:
Traceback (most recent call last):
  File "/usr/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
    self.run()
  File "/home/oguzhanozgur/FACT_core/src/helperFunctions/process.py", line 93, in run
    raise exception
  File "/home/oguzhanozgur/FACT_core/src/helperFunctions/process.py", line 87, in run
    Process.run(self)
  File "/usr/lib/python3.10/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/home/oguzhanozgur/FACT_core/src/scheduler/unpacking_scheduler.py", line 149, in extraction_loop
    self.check_pending()
  File "/home/oguzhanozgur/FACT_core/src/scheduler/unpacking_scheduler.py", line 173, in check_pending
    container.restart()
  File "/home/oguzhanozgur/FACT_core/src/unpacker/extraction_container.py", line 91, in restart
    self.stop()
  File "/home/oguzhanozgur/FACT_core/src/unpacker/extraction_container.py", line 67, in stop
    raise RuntimeError('Container is not running.')
RuntimeError: Container is not running.
_____________________________ test_check_collision _____________________________

frontend_db = <storage.db_interface_frontend.FrontEndDbInterface object at 0x7b4e05e4bca0>
unpacking_scheduler = <scheduler.unpacking_scheduler.UnpackingScheduler object at 0x7b4e225132b0>
unpacking_finished_counter = <Synchronized wrapper for c_int(1)>
unpacking_finished_event = <multiprocessing.synchronize.Event object at 0x7b4e22513f70>

    @flaky(max_runs=3, min_passes=1)  # test may fail when the CI is very busy
    @pytest.mark.SchedulerTestConfig(items_to_unpack=4)
    def test_check_collision(
        frontend_db,
        unpacking_scheduler,
        unpacking_finished_counter,
        unpacking_finished_event,
    ):
        add_test_file(unpacking_scheduler, 'regression_one')
>       assert unpacking_finished_event.wait(timeout=25)
E       assert False
E        +  where False = <bound method Event.wait of <multiprocessing.synchronize.Event object at 0x7b4e22513f70>>(timeout=25)
E        +    where <bound method Event.wait of <multiprocessing.synchronize.Event object at 0x7b4e22513f70>> = <multiprocessing.synchronize.Event object at 0x7b4e22513f70>.wait

src/test/integration/scheduler/test_regression_virtual_file_path.py:41: AssertionError
_____________________________ test_unpacking_skip ______________________________

frontend_db = <storage.db_interface_frontend.FrontEndDbInterface object at 0x7b4e05e4bca0>
unpacking_scheduler = <scheduler.unpacking_scheduler.UnpackingScheduler object at 0x7b4e2235ebf0>
unpacking_finished_event = <multiprocessing.synchronize.Event object at 0x7b4e2235f2e0>
post_unpack_queue = <multiprocessing.queues.Queue object at 0x7b4e2235f640>

    @flaky(max_runs=3, min_passes=1)  # test may fail when the CI is very busy
    @pytest.mark.SchedulerTestConfig(items_to_unpack=4)
    def test_unpacking_skip(
        frontend_db,
        unpacking_scheduler,
        unpacking_finished_event,
        post_unpack_queue,
    ):
        add_test_file(unpacking_scheduler, 'vfp_test.zip')
    
>       assert unpacking_finished_event.wait(timeout=25)
E       assert False
E        +  where False = <bound method Event.wait of <multiprocessing.synchronize.Event object at 0x7b4e2235f2e0>>(timeout=25)
E        +    where <bound method Event.wait of <multiprocessing.synchronize.Event object at 0x7b4e2235f2e0>> = <multiprocessing.synchronize.Event object at 0x7b4e2235f2e0>.wait

src/test/integration/scheduler/test_regression_virtual_file_path.py:66: AssertionError
----------------------------- Captured stderr call -----------------------------
Process ExceptionSafeProcess-86:
Traceback (most recent call last):
  File "/usr/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
    self.run()
  File "/home/oguzhanozgur/FACT_core/src/helperFunctions/process.py", line 93, in run
    raise exception
  File "/home/oguzhanozgur/FACT_core/src/helperFunctions/process.py", line 87, in run
    Process.run(self)
  File "/usr/lib/python3.10/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/home/oguzhanozgur/FACT_core/src/scheduler/unpacking_scheduler.py", line 149, in extraction_loop
    self.check_pending()
  File "/home/oguzhanozgur/FACT_core/src/scheduler/unpacking_scheduler.py", line 173, in check_pending
    container.restart()
  File "/home/oguzhanozgur/FACT_core/src/unpacker/extraction_container.py", line 91, in restart
    self.stop()
  File "/home/oguzhanozgur/FACT_core/src/unpacker/extraction_container.py", line 67, in stop
    raise RuntimeError('Container is not running.')
RuntimeError: Container is not running.
----------------------------- Captured stderr call -----------------------------
Process ExceptionSafeProcess-90:
Traceback (most recent call last):
  File "/usr/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
    self.run()
  File "/home/oguzhanozgur/FACT_core/src/helperFunctions/process.py", line 93, in run
    raise exception
  File "/home/oguzhanozgur/FACT_core/src/helperFunctions/process.py", line 87, in run
    Process.run(self)
  File "/usr/lib/python3.10/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/home/oguzhanozgur/FACT_core/src/scheduler/unpacking_scheduler.py", line 149, in extraction_loop
    self.check_pending()
  File "/home/oguzhanozgur/FACT_core/src/scheduler/unpacking_scheduler.py", line 173, in check_pending
    container.restart()
  File "/home/oguzhanozgur/FACT_core/src/unpacker/extraction_container.py", line 91, in restart
    self.stop()
  File "/home/oguzhanozgur/FACT_core/src/unpacker/extraction_container.py", line 67, in stop
    raise RuntimeError('Container is not running.')
RuntimeError: Container is not running.
----------------------------- Captured stderr call -----------------------------
Process ExceptionSafeProcess-94:
Traceback (most recent call last):
  File "/usr/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
    self.run()
  File "/home/oguzhanozgur/FACT_core/src/helperFunctions/process.py", line 93, in run
    raise exception
  File "/home/oguzhanozgur/FACT_core/src/helperFunctions/process.py", line 87, in run
    Process.run(self)
  File "/usr/lib/python3.10/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/home/oguzhanozgur/FACT_core/src/scheduler/unpacking_scheduler.py", line 149, in extraction_loop
    self.check_pending()
  File "/home/oguzhanozgur/FACT_core/src/scheduler/unpacking_scheduler.py", line 173, in check_pending
    container.restart()
  File "/home/oguzhanozgur/FACT_core/src/unpacker/extraction_container.py", line 91, in restart
    self.stop()
  File "/home/oguzhanozgur/FACT_core/src/unpacker/extraction_container.py", line 67, in stop
    raise RuntimeError('Container is not running.')
RuntimeError: Container is not running.
_______________ TestFileAddition.test_unpack_analyse_and_compare _______________

self = <test.integration.scheduler.test_unpack_analyse_and_compare.TestFileAddition object at 0x7b4e05e60370>
backend_db = <storage.db_interface_backend.BackendDbInterface object at 0x7b4e05e4bfd0>
unpacking_scheduler = <scheduler.unpacking_scheduler.UnpackingScheduler object at 0x7b4e200ab820>
analysis_scheduler = <scheduler.analysis.scheduler.AnalysisScheduler object at 0x7b4e22338dc0>
comparison_scheduler = <scheduler.comparison_scheduler.ComparisonScheduler object at 0x7b4e200b8370>
comparison_db = <storage.db_interface_comparison.ComparisonDbInterface object at 0x7b4e05e49090>
analysis_finished_event = <multiprocessing.synchronize.Event object at 0x7b4e22339db0>
comparison_finished_event = <multiprocessing.synchronize.Event object at 0x7b4e200b88b0>
post_analysis_queue = <multiprocessing.queues.Queue object at 0x7b4e22338730>

    @pytest.mark.SchedulerTestConfig(
        # 2 container with 3 files each and 2 plugins
        items_to_analyze=4 * 2 * 2,
        start_processes=True,
        pipeline=True,
        backend_db_class=BackendDbInterface,
    )
    def test_unpack_analyse_and_compare(  # noqa: PLR0913
        self,
        backend_db,
        unpacking_scheduler,
        analysis_scheduler,
        comparison_scheduler,
        comparison_db,
        analysis_finished_event,
        comparison_finished_event,
        post_analysis_queue,
    ):
        test_fw_1 = Firmware(file_path=f'{get_test_data_dir()}/container/test.zip')
        test_fw_2 = Firmware(file_path=f'{get_test_data_dir()}/regression_one')
    
        for fw in [test_fw_1, test_fw_2]:
            fw.version, fw.vendor, fw.device_name, fw.device_class = ['foo'] * 4
            fw.release_date = '2017-01-01'
            unpacking_scheduler.unpacker.file_storage_system.store_file(fw)
            unpacking_scheduler.add_task(fw)
    
>       assert analysis_finished_event.wait(timeout=20)
E       assert False
E        +  where False = <bound method Event.wait of <multiprocessing.synchronize.Event object at 0x7b4e22339db0>>(timeout=20)
E        +    where <bound method Event.wait of <multiprocessing.synchronize.Event object at 0x7b4e22339db0>> = <multiprocessing.synchronize.Event object at 0x7b4e22339db0>.wait

src/test/integration/scheduler/test_unpack_analyse_and_compare.py:42: AssertionError
----------------------------- Captured stderr call -----------------------------
/home/oguzhanozgur/FACT_core/src/bin/internal_symlink_magic, 7: Warning: using regular magic file `/home/oguzhanozgur/FACT_core/src/bin/firmware'
/home/oguzhanozgur/FACT_core/src/bin/internal_symlink_magic, 7: Warning: using regular magic file `/home/oguzhanozgur/FACT_core/src/bin/firmware'
/home/oguzhanozgur/FACT_core/src/bin/internal_symlink_magic, 7: Warning: using regular magic file `/home/oguzhanozgur/FACT_core/src/bin/firmware'
/home/oguzhanozgur/FACT_core/src/bin/internal_symlink_magic, 7: Warning: using regular magic file `/home/oguzhanozgur/FACT_core/src/bin/firmware'
===Flaky Test Report===

test_check_collision failed (2 runs remaining out of 3).
	<class 'AssertionError'>
	assert False
 +  where False = <bound method Event.wait of <multiprocessing.synchronize.Event object at 0x7b4e222ec070>>(timeout=25)
 +    where <bound method Event.wait of <multiprocessing.synchronize.Event object at 0x7b4e222ec070>> = <multiprocessing.synchronize.Event object at 0x7b4e222ec070>.wait
	[<TracebackEntry /home/oguzhanozgur/FACT_core/src/test/integration/scheduler/test_regression_virtual_file_path.py:41>]
test_check_collision failed (1 runs remaining out of 3).
	<class 'AssertionError'>
	assert False
 +  where False = <bound method Event.wait of <multiprocessing.synchronize.Event object at 0x7b4e2231acb0>>(timeout=25)
 +    where <bound method Event.wait of <multiprocessing.synchronize.Event object at 0x7b4e2231acb0>> = <multiprocessing.synchronize.Event object at 0x7b4e2231acb0>.wait
	[<TracebackEntry /home/oguzhanozgur/FACT_core/src/test/integration/scheduler/test_regression_virtual_file_path.py:41>]
test_check_collision failed; it passed 0 out of the required 1 times.
	<class 'AssertionError'>
	assert False
 +  where False = <bound method Event.wait of <multiprocessing.synchronize.Event object at 0x7b4e22513f70>>(timeout=25)
 +    where <bound method Event.wait of <multiprocessing.synchronize.Event object at 0x7b4e22513f70>> = <multiprocessing.synchronize.Event object at 0x7b4e22513f70>.wait
	[<TracebackEntry /home/oguzhanozgur/FACT_core/src/test/integration/scheduler/test_regression_virtual_file_path.py:41>]
test_unpacking_skip failed (2 runs remaining out of 3).
	<class 'AssertionError'>
	assert False
 +  where False = <bound method Event.wait of <multiprocessing.synchronize.Event object at 0x7b4e2233b2e0>>(timeout=25)
 +    where <bound method Event.wait of <multiprocessing.synchronize.Event object at 0x7b4e2233b2e0>> = <multiprocessing.synchronize.Event object at 0x7b4e2233b2e0>.wait
	[<TracebackEntry /home/oguzhanozgur/FACT_core/src/test/integration/scheduler/test_regression_virtual_file_path.py:66>]
test_unpacking_skip failed (1 runs remaining out of 3).
	<class 'AssertionError'>
	assert False
 +  where False = <bound method Event.wait of <multiprocessing.synchronize.Event object at 0x7b4e2231a170>>(timeout=25)
 +    where <bound method Event.wait of <multiprocessing.synchronize.Event object at 0x7b4e2231a170>> = <multiprocessing.synchronize.Event object at 0x7b4e2231a170>.wait
	[<TracebackEntry /home/oguzhanozgur/FACT_core/src/test/integration/scheduler/test_regression_virtual_file_path.py:66>]
test_unpacking_skip failed; it passed 0 out of the required 1 times.
	<class 'AssertionError'>
	assert False
 +  where False = <bound method Event.wait of <multiprocessing.synchronize.Event object at 0x7b4e2235f2e0>>(timeout=25)
 +    where <bound method Event.wait of <multiprocessing.synchronize.Event object at 0x7b4e2235f2e0>> = <multiprocessing.synchronize.Event object at 0x7b4e2235f2e0>.wait
	[<TracebackEntry /home/oguzhanozgur/FACT_core/src/test/integration/scheduler/test_regression_virtual_file_path.py:66>]

===End Flaky Test Report===
=========================== short test summary info ============================
FAILED src/test/integration/scheduler/test_cycle_with_tags.py::TestTagPropagation::test_run_analysis_with_tag - assert False
FAILED src/test/integration/scheduler/test_regression_virtual_file_path.py::test_check_collision - assert False
FAILED src/test/integration/scheduler/test_regression_virtual_file_path.py::test_unpacking_skip - assert False
FAILED src/test/integration/scheduler/test_unpack_analyse_and_compare.py::TestFileAddition::test_unpack_analyse_and_compare - assert False
ERROR src/test/integration/scheduler/test_unpack_and_analyse.py::TestFileAddition::test_unpack_and_analyse - OSError: [Errno 24] Too many open files
ERROR src/test/integration/scheduler/test_unpack_and_analyse.py::TestFileAddition::test_unpack_and_analyse - OSError: [Errno 24] Too many open files: '/tmp/fact-firmware-file-storage-d...
ERROR src/test/integration/scheduler/test_unpack_only.py::TestFileAddition::test_unpack_only - OSError: [Errno 24] Too many open files
=================== 4 failed, 3 errors in 209.56s (0:03:29) ====================
(ozzy) oguzhanozgur@oguzhanozgur:~/FACT_core$ 

@jstucke
Collaborator

jstucke commented Oct 1, 2024

OSError: [Errno 24] Too many open files

This could be the root cause. FACT needs a lot of open files at once. What do you get when you run ulimit -n and ulimit -n -H? This gives you the allowed number of open files (soft limit and hard limit, respectively). If this limit is exceeded, you will get errors. The soft limit should be at least 600. You can raise the soft limit by running e.g. ulimit -n 9999. Raising the hard limit is a lot trickier (see https://askubuntu.com/questions/162229/how-do-i-increase-the-open-files-limit-for-a-non-root-user). The limits apply only to your current shell, so you may need to set them for each shell/tab independently (you can also add the command to your .bashrc, for example). Could you try raising the limit and running the tests again?
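For example, something along these lines in the shell you start FACT from (the .bashrc line is optional and only there so new shells pick the limit up automatically):

ulimit -n            # show the current soft limit
ulimit -n -H         # show the hard limit
ulimit -n 9999       # raise the soft limit for this shell
echo "ulimit -n 9999" >> ~/.bashrc   # optional: apply it to future shells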

@oguzhanozgur
Author

oguzhanozgur commented Oct 2, 2024

Hi again,
Here is the new result:

(ozzy) oguzhanozgur@oguzhanozgur:~/FACT_core$ ulimit -n
1024
(ozzy) oguzhanozgur@oguzhanozgur:~/FACT_core$ ulimit -n -H
1048576
(ozzy) oguzhanozgur@oguzhanozgur:~/FACT_core$ ulimit -n 9999
(ozzy) oguzhanozgur@oguzhanozgur:~/FACT_core$ pytest src/test/integration/scheduler/
================================================================================= test session starts =================================================================================
platform linux -- Python 3.10.12, pytest-7.2.0, pluggy-1.5.0 -- /home/oguzhanozgur/ozzy/bin/python3
cachedir: .pytest_cache
rootdir: /home/oguzhanozgur/FACT_core, configfile: pyproject.toml
plugins: cov-4.0.0, timeout-2.1.0, flaky-3.7.0
collected 6 items                                                                                                                                                                     

src/test/integration/scheduler/test_cycle_with_tags.py::TestTagPropagation::test_run_analysis_with_tag FAILED                                                                   [ 16%]
src/test/integration/scheduler/test_regression_virtual_file_path.py::test_check_collision FAILED                                                                                [ 33%]
src/test/integration/scheduler/test_regression_virtual_file_path.py::test_unpacking_skip FAILED                                                                                 [ 50%]
src/test/integration/scheduler/test_unpack_analyse_and_compare.py::TestFileAddition::test_unpack_analyse_and_compare FAILED                                                     [ 66%]
src/test/integration/scheduler/test_unpack_and_analyse.py::TestFileAddition::test_unpack_and_analyse FAILED                                                                     [ 83%]
src/test/integration/scheduler/test_unpack_only.py::TestFileAddition::test_unpack_only FAILED                                                                                   [100%]

====================================================================================== FAILURES =======================================================================================
____________________________________________________________________ TestTagPropagation.test_run_analysis_with_tag ____________________________________________________________________

self = <test.integration.scheduler.test_cycle_with_tags.TestTagPropagation object at 0x7846b42f7b80>
analysis_finished_event = <multiprocessing.synchronize.Event object at 0x7846b3b4c040>
unpacking_scheduler = <scheduler.unpacking_scheduler.UnpackingScheduler object at 0x7846d06dc3d0>
backend_db = <storage.db_interface_backend.BackendDbInterface object at 0x7846b41c7640>, analysis_scheduler = <scheduler.analysis.scheduler.AnalysisScheduler object at 0x7846b3bb63e0>

    @pytest.mark.SchedulerTestConfig(
        # 5 objects * 3 analyses = 15 calls
        items_to_analyze=15,
        pipeline=True,
    )
    def test_run_analysis_with_tag(self, analysis_finished_event, unpacking_scheduler, backend_db, analysis_scheduler):
        test_fw = Firmware(file_path=f'{get_test_data_dir()}/container/with_key.7z')
        test_fw.version, test_fw.vendor, test_fw.device_name, test_fw.device_class = ['foo'] * 4
        test_fw.release_date = '2017-01-01'
        test_fw.scheduled_analysis = ['crypto_material']
    
        unpacking_scheduler.add_task(test_fw)
    
>       assert analysis_finished_event.wait(timeout=20)
E       assert False
E        +  where False = <bound method Event.wait of <multiprocessing.synchronize.Event object at 0x7846b3b4c040>>(timeout=20)
E        +    where <bound method Event.wait of <multiprocessing.synchronize.Event object at 0x7846b3b4c040>> = <multiprocessing.synchronize.Event object at 0x7846b3b4c040>.wait

src/test/integration/scheduler/test_cycle_with_tags.py:23: AssertionError
-------------------------------------------------------------------------------- Captured stderr call ---------------------------------------------------------------------------------
/home/oguzhanozgur/FACT_core/src/bin/internal_symlink_magic, 7: Warning: using regular magic file `/home/oguzhanozgur/FACT_core/src/bin/firmware'
/home/oguzhanozgur/FACT_core/src/bin/internal_symlink_magic, 7: Warning: using regular magic file `/home/oguzhanozgur/FACT_core/src/bin/firmware'
Process ExceptionSafeProcess-70:
Traceback (most recent call last):
  File "/usr/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
    self.run()
  File "/home/oguzhanozgur/FACT_core/src/helperFunctions/process.py", line 93, in run
    raise exception
  File "/home/oguzhanozgur/FACT_core/src/helperFunctions/process.py", line 87, in run
    Process.run(self)
  File "/usr/lib/python3.10/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/home/oguzhanozgur/FACT_core/src/scheduler/unpacking_scheduler.py", line 149, in extraction_loop
    self.check_pending()
  File "/home/oguzhanozgur/FACT_core/src/scheduler/unpacking_scheduler.py", line 173, in check_pending
    container.restart()
  File "/home/oguzhanozgur/FACT_core/src/unpacker/extraction_container.py", line 91, in restart
    self.stop()
  File "/home/oguzhanozgur/FACT_core/src/unpacker/extraction_container.py", line 67, in stop
    raise RuntimeError('Container is not running.')
RuntimeError: Container is not running.
________________________________________________________________________________ test_check_collision _________________________________________________________________________________

frontend_db = <storage.db_interface_frontend.FrontEndDbInterface object at 0x7846b41c76a0>
unpacking_scheduler = <scheduler.unpacking_scheduler.UnpackingScheduler object at 0x7846d04cdab0>, unpacking_finished_counter = <Synchronized wrapper for c_int(1)>
unpacking_finished_event = <multiprocessing.synchronize.Event object at 0x7846d04cc2e0>

    @flaky(max_runs=3, min_passes=1)  # test may fail when the CI is very busy
    @pytest.mark.SchedulerTestConfig(items_to_unpack=4)
    def test_check_collision(
        frontend_db,
        unpacking_scheduler,
        unpacking_finished_counter,
        unpacking_finished_event,
    ):
        add_test_file(unpacking_scheduler, 'regression_one')
>       assert unpacking_finished_event.wait(timeout=25)
E       assert False
E        +  where False = <bound method Event.wait of <multiprocessing.synchronize.Event object at 0x7846d04cc2e0>>(timeout=25)
E        +    where <bound method Event.wait of <multiprocessing.synchronize.Event object at 0x7846d04cc2e0>> = <multiprocessing.synchronize.Event object at 0x7846d04cc2e0>.wait

src/test/integration/scheduler/test_regression_virtual_file_path.py:41: AssertionError
_________________________________________________________________________________ test_unpacking_skip _________________________________________________________________________________

frontend_db = <storage.db_interface_frontend.FrontEndDbInterface object at 0x7846b41c76a0>
unpacking_scheduler = <scheduler.unpacking_scheduler.UnpackingScheduler object at 0x7846d052eda0>
unpacking_finished_event = <multiprocessing.synchronize.Event object at 0x7846d052f190>, post_unpack_queue = <multiprocessing.queues.Queue object at 0x7846d052f550>

    @flaky(max_runs=3, min_passes=1)  # test may fail when the CI is very busy
    @pytest.mark.SchedulerTestConfig(items_to_unpack=4)
    def test_unpacking_skip(
        frontend_db,
        unpacking_scheduler,
        unpacking_finished_event,
        post_unpack_queue,
    ):
        add_test_file(unpacking_scheduler, 'vfp_test.zip')
    
>       assert unpacking_finished_event.wait(timeout=25)
E       assert False
E        +  where False = <bound method Event.wait of <multiprocessing.synchronize.Event object at 0x7846d052f190>>(timeout=25)
E        +    where <bound method Event.wait of <multiprocessing.synchronize.Event object at 0x7846d052f190>> = <multiprocessing.synchronize.Event object at 0x7846d052f190>.wait

src/test/integration/scheduler/test_regression_virtual_file_path.py:66: AssertionError
-------------------------------------------------------------------------------- Captured stderr call ---------------------------------------------------------------------------------
Process ExceptionSafeProcess-86:
Traceback (most recent call last):
  File "/usr/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
    self.run()
  File "/home/oguzhanozgur/FACT_core/src/helperFunctions/process.py", line 93, in run
    raise exception
  File "/home/oguzhanozgur/FACT_core/src/helperFunctions/process.py", line 87, in run
    Process.run(self)
  File "/usr/lib/python3.10/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/home/oguzhanozgur/FACT_core/src/scheduler/unpacking_scheduler.py", line 149, in extraction_loop
    self.check_pending()
  File "/home/oguzhanozgur/FACT_core/src/scheduler/unpacking_scheduler.py", line 173, in check_pending
    container.restart()
  File "/home/oguzhanozgur/FACT_core/src/unpacker/extraction_container.py", line 91, in restart
    self.stop()
  File "/home/oguzhanozgur/FACT_core/src/unpacker/extraction_container.py", line 67, in stop
    raise RuntimeError('Container is not running.')
RuntimeError: Container is not running.
-------------------------------------------------------------------------------- Captured stderr call ---------------------------------------------------------------------------------
Process ExceptionSafeProcess-90:
Traceback (most recent call last):
  File "/usr/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
    self.run()
  File "/home/oguzhanozgur/FACT_core/src/helperFunctions/process.py", line 93, in run
    raise exception
  File "/home/oguzhanozgur/FACT_core/src/helperFunctions/process.py", line 87, in run
    Process.run(self)
  File "/usr/lib/python3.10/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/home/oguzhanozgur/FACT_core/src/scheduler/unpacking_scheduler.py", line 149, in extraction_loop
    self.check_pending()
  File "/home/oguzhanozgur/FACT_core/src/scheduler/unpacking_scheduler.py", line 173, in check_pending
    container.restart()
  File "/home/oguzhanozgur/FACT_core/src/unpacker/extraction_container.py", line 91, in restart
    self.stop()
  File "/home/oguzhanozgur/FACT_core/src/unpacker/extraction_container.py", line 67, in stop
    raise RuntimeError('Container is not running.')
RuntimeError: Container is not running.
-------------------------------------------------------------------------------- Captured stderr call ---------------------------------------------------------------------------------
Process ExceptionSafeProcess-94:
Traceback (most recent call last):
  File "/usr/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
    self.run()
  File "/home/oguzhanozgur/FACT_core/src/helperFunctions/process.py", line 93, in run
    raise exception
  File "/home/oguzhanozgur/FACT_core/src/helperFunctions/process.py", line 87, in run
    Process.run(self)
  File "/usr/lib/python3.10/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/home/oguzhanozgur/FACT_core/src/scheduler/unpacking_scheduler.py", line 149, in extraction_loop
    self.check_pending()
  File "/home/oguzhanozgur/FACT_core/src/scheduler/unpacking_scheduler.py", line 173, in check_pending
    container.restart()
  File "/home/oguzhanozgur/FACT_core/src/unpacker/extraction_container.py", line 91, in restart
    self.stop()
  File "/home/oguzhanozgur/FACT_core/src/unpacker/extraction_container.py", line 67, in stop
    raise RuntimeError('Container is not running.')
RuntimeError: Container is not running.
__________________________________________________________________ TestFileAddition.test_unpack_analyse_and_compare ___________________________________________________________________

self = <test.integration.scheduler.test_unpack_analyse_and_compare.TestFileAddition object at 0x7846b42f75b0>
backend_db = <storage.db_interface_backend.BackendDbInterface object at 0x7846b41c7640>
unpacking_scheduler = <scheduler.unpacking_scheduler.UnpackingScheduler object at 0x7846cffc4970>
analysis_scheduler = <scheduler.analysis.scheduler.AnalysisScheduler object at 0x7846d050bd90>
comparison_scheduler = <scheduler.comparison_scheduler.ComparisonScheduler object at 0x7846cffc5480>
comparison_db = <storage.db_interface_comparison.ComparisonDbInterface object at 0x7846b41c7760>
analysis_finished_event = <multiprocessing.synchronize.Event object at 0x7846d0508670>, comparison_finished_event = <multiprocessing.synchronize.Event object at 0x7846cffc59c0>
post_analysis_queue = <multiprocessing.queues.Queue object at 0x7846d0508400>

    @pytest.mark.SchedulerTestConfig(
        # 2 container with 3 files each and 2 plugins
        items_to_analyze=4 * 2 * 2,
        start_processes=True,
        pipeline=True,
        backend_db_class=BackendDbInterface,
    )
    def test_unpack_analyse_and_compare(  # noqa: PLR0913
        self,
        backend_db,
        unpacking_scheduler,
        analysis_scheduler,
        comparison_scheduler,
        comparison_db,
        analysis_finished_event,
        comparison_finished_event,
        post_analysis_queue,
    ):
        test_fw_1 = Firmware(file_path=f'{get_test_data_dir()}/container/test.zip')
        test_fw_2 = Firmware(file_path=f'{get_test_data_dir()}/regression_one')
    
        for fw in [test_fw_1, test_fw_2]:
            fw.version, fw.vendor, fw.device_name, fw.device_class = ['foo'] * 4
            fw.release_date = '2017-01-01'
            unpacking_scheduler.unpacker.file_storage_system.store_file(fw)
            unpacking_scheduler.add_task(fw)
    
>       assert analysis_finished_event.wait(timeout=20)
E       assert False
E        +  where False = <bound method Event.wait of <multiprocessing.synchronize.Event object at 0x7846d0508670>>(timeout=20)
E        +    where <bound method Event.wait of <multiprocessing.synchronize.Event object at 0x7846d0508670>> = <multiprocessing.synchronize.Event object at 0x7846d0508670>.wait

src/test/integration/scheduler/test_unpack_analyse_and_compare.py:42: AssertionError
-------------------------------------------------------------------------------- Captured stderr call ---------------------------------------------------------------------------------
/home/oguzhanozgur/FACT_core/src/bin/internal_symlink_magic, 7: Warning: using regular magic file `/home/oguzhanozgur/FACT_core/src/bin/firmware'
/home/oguzhanozgur/FACT_core/src/bin/internal_symlink_magic, 7: Warning: using regular magic file `/home/oguzhanozgur/FACT_core/src/bin/firmware'
/home/oguzhanozgur/FACT_core/src/bin/internal_symlink_magic, 7: Warning: using regular magic file `/home/oguzhanozgur/FACT_core/src/bin/firmware'
/home/oguzhanozgur/FACT_core/src/bin/internal_symlink_magic, 7: Warning: using regular magic file `/home/oguzhanozgur/FACT_core/src/bin/firmware'
______________________________________________________________________ TestFileAddition.test_unpack_and_analyse _______________________________________________________________________

self = <test.integration.scheduler.test_unpack_and_analyse.TestFileAddition object at 0x7846b4141570>
analysis_scheduler = <scheduler.analysis.scheduler.AnalysisScheduler object at 0x7846cfd81a20>
unpacking_scheduler = <scheduler.unpacking_scheduler.UnpackingScheduler object at 0x7846cd1d6bc0>, post_analysis_queue = <multiprocessing.queues.Queue object at 0x7846cfd81ed0>

    @pytest.mark.SchedulerTestConfig(
        pipeline=True,
        backend_db_class=MockDbInterface,
    )
    def test_unpack_and_analyse(self, analysis_scheduler, unpacking_scheduler, post_analysis_queue):
        test_fw = Firmware(file_path=f'{get_test_data_dir()}/container/test.zip')
        test_fw.release_date = '1970-01-01'
    
        unpacking_scheduler.add_task(test_fw)
    
        processed_container = {}
        for _ in range(4 * 2):  # container with 3 included files times 2 mandatory plugins run
>           uid, plugin, analysis_result = post_analysis_queue.get(timeout=3)

src/test/integration/scheduler/test_unpack_and_analyse.py:21: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _

self = <multiprocessing.queues.Queue object at 0x7846cfd81ed0>, block = True, timeout = 2.9999988390009094

    def get(self, block=True, timeout=None):
        if self._closed:
            raise ValueError(f"Queue {self!r} is closed")
        if block and timeout is None:
            with self._rlock:
                res = self._recv_bytes()
            self._sem.release()
        else:
            if block:
                deadline = time.monotonic() + timeout
            if not self._rlock.acquire(block, timeout):
                raise Empty
            try:
                if block:
                    timeout = deadline - time.monotonic()
                    if not self._poll(timeout):
>                       raise Empty
E                       _queue.Empty

/usr/lib/python3.10/multiprocessing/queues.py:114: Empty
-------------------------------------------------------------------------------- Captured stderr call ---------------------------------------------------------------------------------
/home/oguzhanozgur/FACT_core/src/bin/internal_symlink_magic, 7: Warning: using regular magic file `/home/oguzhanozgur/FACT_core/src/bin/firmware'
/home/oguzhanozgur/FACT_core/src/bin/internal_symlink_magic, 7: Warning: using regular magic file `/home/oguzhanozgur/FACT_core/src/bin/firmware'
Process ExceptionSafeProcess-235:
Traceback (most recent call last):
  File "/usr/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
    self.run()
  File "/home/oguzhanozgur/FACT_core/src/helperFunctions/process.py", line 93, in run
    raise exception
  File "/home/oguzhanozgur/FACT_core/src/helperFunctions/process.py", line 87, in run
    Process.run(self)
  File "/usr/lib/python3.10/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/home/oguzhanozgur/FACT_core/src/scheduler/unpacking_scheduler.py", line 149, in extraction_loop
    self.check_pending()
  File "/home/oguzhanozgur/FACT_core/src/scheduler/unpacking_scheduler.py", line 173, in check_pending
    container.restart()
  File "/home/oguzhanozgur/FACT_core/src/unpacker/extraction_container.py", line 91, in restart
    self.stop()
  File "/home/oguzhanozgur/FACT_core/src/unpacker/extraction_container.py", line 67, in stop
    raise RuntimeError('Container is not running.')
RuntimeError: Container is not running.
__________________________________________________________________________ TestFileAddition.test_unpack_only __________________________________________________________________________

self = <test.integration.scheduler.test_unpack_only.TestFileAddition object at 0x7846b4141d50>
unpacking_scheduler = <scheduler.unpacking_scheduler.UnpackingScheduler object at 0x7846cd201b10>, post_unpack_queue = <multiprocessing.queues.Queue object at 0x7846cd17ffd0>

    def test_unpack_only(self, unpacking_scheduler, post_unpack_queue):
        test_fw = Firmware(file_path=f'{get_test_data_dir()}/container/test.zip')
    
        unpacking_scheduler.add_task(test_fw)
    
        processed_container = post_unpack_queue.get(timeout=5)
    
>       assert len(processed_container.files_included) == 3, 'not all included files found'  # noqa: PLR2004
E       AssertionError: not all included files found
E       assert 0 == 3
E        +  where 0 = len(set())
E        +    where set() = None None v. None\nProcessed Analysis: ['unpacker']\nScheduled Analysis: None.files_included

src/test/integration/scheduler/test_unpack_only.py:17: AssertionError
===Flaky Test Report===

test_check_collision failed (2 runs remaining out of 3).
	<class 'AssertionError'>
	assert False
 +  where False = <bound method Event.wait of <multiprocessing.synchronize.Event object at 0x7846d06deb30>>(timeout=25)
 +    where <bound method Event.wait of <multiprocessing.synchronize.Event object at 0x7846d06deb30>> = <multiprocessing.synchronize.Event object at 0x7846d06deb30>.wait
	[<TracebackEntry /home/oguzhanozgur/FACT_core/src/test/integration/scheduler/test_regression_virtual_file_path.py:41>]
test_check_collision failed (1 runs remaining out of 3).
	<class 'AssertionError'>
	assert False
 +  where False = <bound method Event.wait of <multiprocessing.synchronize.Event object at 0x7846d04f9ed0>>(timeout=25)
 +    where <bound method Event.wait of <multiprocessing.synchronize.Event object at 0x7846d04f9ed0>> = <multiprocessing.synchronize.Event object at 0x7846d04f9ed0>.wait
	[<TracebackEntry /home/oguzhanozgur/FACT_core/src/test/integration/scheduler/test_regression_virtual_file_path.py:41>]
test_check_collision failed; it passed 0 out of the required 1 times.
	<class 'AssertionError'>
	assert False
 +  where False = <bound method Event.wait of <multiprocessing.synchronize.Event object at 0x7846d04cc2e0>>(timeout=25)
 +    where <bound method Event.wait of <multiprocessing.synchronize.Event object at 0x7846d04cc2e0>> = <multiprocessing.synchronize.Event object at 0x7846d04cc2e0>.wait
	[<TracebackEntry /home/oguzhanozgur/FACT_core/src/test/integration/scheduler/test_regression_virtual_file_path.py:41>]
test_unpacking_skip failed (2 runs remaining out of 3).
	<class 'AssertionError'>
	assert False
 +  where False = <bound method Event.wait of <multiprocessing.synchronize.Event object at 0x7846d050add0>>(timeout=25)
 +    where <bound method Event.wait of <multiprocessing.synchronize.Event object at 0x7846d050add0>> = <multiprocessing.synchronize.Event object at 0x7846d050add0>.wait
	[<TracebackEntry /home/oguzhanozgur/FACT_core/src/test/integration/scheduler/test_regression_virtual_file_path.py:66>]
test_unpacking_skip failed (1 runs remaining out of 3).
	<class 'AssertionError'>
	assert False
 +  where False = <bound method Event.wait of <multiprocessing.synchronize.Event object at 0x7846d04cd060>>(timeout=25)
 +    where <bound method Event.wait of <multiprocessing.synchronize.Event object at 0x7846d04cd060>> = <multiprocessing.synchronize.Event object at 0x7846d04cd060>.wait
	[<TracebackEntry /home/oguzhanozgur/FACT_core/src/test/integration/scheduler/test_regression_virtual_file_path.py:66>]
test_unpacking_skip failed; it passed 0 out of the required 1 times.
	<class 'AssertionError'>
	assert False
 +  where False = <bound method Event.wait of <multiprocessing.synchronize.Event object at 0x7846d052f190>>(timeout=25)
 +    where <bound method Event.wait of <multiprocessing.synchronize.Event object at 0x7846d052f190>> = <multiprocessing.synchronize.Event object at 0x7846d052f190>.wait
	[<TracebackEntry /home/oguzhanozgur/FACT_core/src/test/integration/scheduler/test_regression_virtual_file_path.py:66>]

===End Flaky Test Report===
=============================================================================== short test summary info ===============================================================================
FAILED src/test/integration/scheduler/test_cycle_with_tags.py::TestTagPropagation::test_run_analysis_with_tag - assert False
FAILED src/test/integration/scheduler/test_regression_virtual_file_path.py::test_check_collision - assert False
FAILED src/test/integration/scheduler/test_regression_virtual_file_path.py::test_unpacking_skip - assert False
FAILED src/test/integration/scheduler/test_unpack_analyse_and_compare.py::TestFileAddition::test_unpack_analyse_and_compare - assert False
FAILED src/test/integration/scheduler/test_unpack_and_analyse.py::TestFileAddition::test_unpack_and_analyse - _queue.Empty
FAILED src/test/integration/scheduler/test_unpack_only.py::TestFileAddition::test_unpack_only - AssertionError: not all included files found
============================================================================ 6 failed in 217.06s (0:03:37) ============================================================================
(ozzy) oguzhanozgur@oguzhanozgur:~/FACT_core$ 

@jstucke
Collaborator

jstucke commented Oct 2, 2024

It seems an exception occurred in the extraction container, but we need to enable additional logging to see what it was. Could you try running this command?

pytest -vvv --log-cli-level=DEBUG -s src/test/integration/scheduler/test_unpack_and_analyse.py

@oguzhanozgur
Author

oguzhanozgur commented Oct 2, 2024

(ozzy) oguzhanozgur@oguzhanozgur:~/FACT_core$ pytest -vvv --log-cli-level=DEBUG -s src/test/integration/scheduler/test_unpack_and_analyse.py
==================================================================================== test session starts =====================================================================================
platform linux -- Python 3.10.12, pytest-7.2.0, pluggy-1.5.0 -- /home/oguzhanozgur/ozzy/bin/python3
cachedir: .pytest_cache
rootdir: /home/oguzhanozgur/FACT_core, configfile: pyproject.toml
plugins: cov-4.0.0, timeout-2.1.0, flaky-3.7.0
collected 1 item                                                                                                                                                                             

src/test/integration/scheduler/test_unpack_and_analyse.py::TestFileAddition::test_unpack_and_analyse 
--------------------------------------------------------------------------------------- live log setup ---------------------------------------------------------------------------------------
DEBUG    root:unpacking_locks.py:10 Started unpacking locks manager 
DEBUG    root:db_interface_view_sync.py:18 view updated: source_code_analysis
DEBUG    root:db_interface_view_sync.py:18 view updated: crypto_hints
DEBUG    root:db_interface_view_sync.py:18 view updated: exploit_mitigations
DEBUG    root:db_interface_view_sync.py:18 view updated: file_system_metadata
DEBUG    root:scheduler.py:653 ExamplePlugin: No view available! Generic view will be used.
DEBUG    root:db_interface_view_sync.py:18 view updated: cpu_architecture
DEBUG    root:db_interface_view_sync.py:18 view updated: file_type
DEBUG    root:db_interface_view_sync.py:18 view updated: cve_lookup
DEBUG    root:db_interface_view_sync.py:18 view updated: information_leaks
DEBUG    root:base.py:29 hashlookup: No view available! Generic view will be used.
DEBUG    root:db_interface_view_sync.py:18 view updated: crypto_material
DEBUG    root:db_interface_view_sync.py:18 view updated: ipc_analyzer
DEBUG    root:db_interface_view_sync.py:18 view updated: tlsh
DEBUG    root:db_interface_view_sync.py:18 view updated: known_vulnerabilities
DEBUG    root:db_interface_view_sync.py:18 view updated: ip_and_uri_finder
INFO     root:ip_and_uri_finder_analysis.py:154 ip signature path: /home/oguzhanozgur/ozzy/lib/python3.10/site-packages/common_analysis_ip_and_uri_finder/yara_rules/ip_rules.yara
INFO     root:ip_and_uri_finder_analysis.py:158 ip signature path: /home/oguzhanozgur/ozzy/lib/python3.10/site-packages/common_analysis_ip_and_uri_finder/yara_rules/uri_rules.yara
DEBUG    root:base.py:29 interesting_uris: No view available! Generic view will be used.
DEBUG    root:base.py:29 dummy_plugin_for_testing_only: No view available! Generic view will be used.
DEBUG    root:db_interface_view_sync.py:18 view updated: kernel_config
DEBUG    root:db_interface_view_sync.py:18 view updated: cwe_checker
DEBUG    docker.utils.config:config.py:21 Trying paths: ['/home/oguzhanozgur/.docker/config.json', '/home/oguzhanozgur/.dockercfg']
DEBUG    docker.utils.config:config.py:28 No config file found
DEBUG    docker.utils.config:config.py:21 Trying paths: ['/home/oguzhanozgur/.docker/config.json', '/home/oguzhanozgur/.dockercfg']
DEBUG    docker.utils.config:config.py:28 No config file found
DEBUG    urllib3.connectionpool:connectionpool.py:546 http://localhost:None "GET /version HTTP/11" 200 778
DEBUG    urllib3.connectionpool:connectionpool.py:546 http://localhost:None "POST /v1.43/containers/create HTTP/11" 201 88
DEBUG    urllib3.connectionpool:connectionpool.py:546 http://localhost:None "GET /v1.43/containers/e9ebd6a407f81788d3174bf03880b498c682d018b3a5086cab970ba889671561/json HTTP/11" 200 None
DEBUG    urllib3.connectionpool:connectionpool.py:546 http://localhost:None "POST /v1.43/containers/e9ebd6a407f81788d3174bf03880b498c682d018b3a5086cab970ba889671561/start HTTP/11" 204 0
DEBUG    urllib3.connectionpool:connectionpool.py:546 http://localhost:None "POST /v1.43/containers/e9ebd6a407f81788d3174bf03880b498c682d018b3a5086cab970ba889671561/wait HTTP/11" 200 None
DEBUG    urllib3.connectionpool:connectionpool.py:546 http://localhost:None "GET /v1.43/containers/e9ebd6a407f81788d3174bf03880b498c682d018b3a5086cab970ba889671561/logs?stderr=1&stdout=1&timestamps=0&follow=0&tail=all HTTP/11" 200 None
DEBUG    urllib3.connectionpool:connectionpool.py:546 http://localhost:None "GET /v1.43/containers/e9ebd6a407f81788d3174bf03880b498c682d018b3a5086cab970ba889671561/json HTTP/11" 200 None
DEBUG    urllib3.connectionpool:connectionpool.py:546 http://localhost:None "POST /v1.43/containers/e9ebd6a407f81788d3174bf03880b498c682d018b3a5086cab970ba889671561/stop HTTP/11" 304 0
DEBUG    urllib3.connectionpool:connectionpool.py:546 http://localhost:None "DELETE /v1.43/containers/e9ebd6a407f81788d3174bf03880b498c682d018b3a5086cab970ba889671561?v=False&link=False&force=False HTTP/11" 204 0
DEBUG    root:cwe_checker.py:61 Version is cwe_checker 0.9.0

DEBUG    root:db_interface_view_sync.py:18 view updated: binwalk
DEBUG    root:db_interface_view_sync.py:18 view updated: qemu_exec
DEBUG    root:db_interface_view_sync.py:18 view updated: users_and_passwords
DEBUG    root:db_interface_view_sync.py:18 view updated: input_vectors
DEBUG    root:db_interface_view_sync.py:18 view updated: software_components
DEBUG    root:db_interface_view_sync.py:18 view updated: printable_strings
DEBUG    root:db_interface_view_sync.py:18 view updated: string_evaluator
DEBUG    root:db_interface_view_sync.py:18 view updated: init_systems
DEBUG    root:db_interface_view_sync.py:18 view updated: hardware_analysis
DEBUG    root:db_interface_view_sync.py:18 view updated: elf_analysis
DEBUG    root:base.py:29 file_hashes: No view available! Generic view will be used.
DEBUG    root:db_interface_view_sync.py:18 view updated: device_tree
DEBUG    root:analysis_status.py:108 starting analysis status worker (pid: 77452)
DEBUG    root:scheduler.py:326 Started analysis scheduling worker 0 (pid=77454)
DEBUG    root:scheduler.py:326 Started analysis scheduling worker 1 (pid=77455)
DEBUG    root:scheduler.py:326 Started analysis scheduling worker 2 (pid=77458)
DEBUG    root:scheduler.py:326 Started analysis scheduling worker 3 (pid=77460)
DEBUG    root:scheduler.py:518 Started analysis result collector worker 0 (pid=77462)
DEBUG    root:PluginBase.py:112 exploit_mitigations: 1 worker threads started
DEBUG    root:scheduler.py:518 Started analysis result collector worker 1 (pid=77464)
DEBUG    root:PluginBase.py:251 started exploit_mitigations worker 0 (pid=77466)
DEBUG    root:PluginBase.py:251 started cpu_architecture worker 0 (pid=77468)
DEBUG    root:PluginBase.py:251 started cpu_architecture worker 1 (pid=77470)
DEBUG    root:PluginBase.py:251 started cpu_architecture worker 2 (pid=77472)
DEBUG    root:PluginBase.py:112 cpu_architecture: 4 worker threads started
DEBUG    root:PluginBase.py:251 started cpu_architecture worker 3 (pid=77474)
DEBUG    root:PluginBase.py:251 started cve_lookup worker 0 (pid=77476)
DEBUG    root:PluginBase.py:112 cve_lookup: 2 worker threads started
DEBUG    root:PluginBase.py:251 started cve_lookup worker 1 (pid=77478)
DEBUG    root:PluginBase.py:112 information_leaks: 1 worker threads started
DEBUG    root:PluginBase.py:251 started information_leaks worker 0 (pid=77480)
DEBUG    root:PluginBase.py:112 hashlookup: 1 worker threads started
DEBUG    root:PluginBase.py:251 started hashlookup worker 0 (pid=77482)
DEBUG    root:PluginBase.py:112 crypto_material: 1 worker threads started
DEBUG    root:PluginBase.py:251 started crypto_material worker 0 (pid=77484)
DEBUG    root:PluginBase.py:112 ipc_analyzer: 1 worker threads started
DEBUG    root:PluginBase.py:251 started ipc_analyzer worker 0 (pid=77486)
DEBUG    root:PluginBase.py:112 tlsh: 1 worker threads started
DEBUG    root:PluginBase.py:251 started tlsh worker 0 (pid=77488)
DEBUG    root:PluginBase.py:112 known_vulnerabilities: 1 worker threads started
DEBUG    root:PluginBase.py:251 started known_vulnerabilities worker 0 (pid=77490)
DEBUG    root:PluginBase.py:112 ip_and_uri_finder: 1 worker threads started
DEBUG    root:PluginBase.py:251 started ip_and_uri_finder worker 0 (pid=77492)
DEBUG    root:PluginBase.py:112 interesting_uris: 1 worker threads started
DEBUG    root:PluginBase.py:251 started interesting_uris worker 0 (pid=77494)
DEBUG    root:PluginBase.py:112 dummy_plugin_for_testing_only: 1 worker threads started
DEBUG    root:PluginBase.py:251 started dummy_plugin_for_testing_only worker 0 (pid=77496)
DEBUG    root:PluginBase.py:112 kernel_config: 1 worker threads started
DEBUG    root:PluginBase.py:251 started kernel_config worker 0 (pid=77498)
DEBUG    root:PluginBase.py:112 cwe_checker: 1 worker threads started
DEBUG    root:PluginBase.py:251 started cwe_checker worker 0 (pid=77500)
DEBUG    root:PluginBase.py:112 qemu_exec: 1 worker threads started
DEBUG    root:PluginBase.py:251 started qemu_exec worker 0 (pid=77502)
DEBUG    root:PluginBase.py:112 users_and_passwords: 1 worker threads started
DEBUG    root:PluginBase.py:251 started users_and_passwords worker 0 (pid=77504)
DEBUG    root:PluginBase.py:112 input_vectors: 1 worker threads started
DEBUG    root:PluginBase.py:251 started input_vectors worker 0 (pid=77506)
DEBUG    root:PluginBase.py:112 software_components: 1 worker threads started
DEBUG    root:PluginBase.py:251 started software_components worker 0 (pid=77508)
DEBUG    root:PluginBase.py:112 printable_strings: 1 worker threads started
DEBUG    root:PluginBase.py:251 started printable_strings worker 0 (pid=77510)
DEBUG    root:PluginBase.py:112 string_evaluator: 1 worker threads started
DEBUG    root:PluginBase.py:251 started string_evaluator worker 0 (pid=77512)
DEBUG    root:PluginBase.py:112 init_systems: 1 worker threads started
DEBUG    root:PluginBase.py:251 started init_systems worker 0 (pid=77514)
DEBUG    root:PluginBase.py:112 hardware_analysis: 1 worker threads started
DEBUG    root:PluginBase.py:251 started hardware_analysis worker 0 (pid=77516)
DEBUG    root:PluginBase.py:112 elf_analysis: 1 worker threads started
DEBUG    root:PluginBase.py:251 started elf_analysis worker 0 (pid=77518)
DEBUG    root:PluginBase.py:112 file_hashes: 1 worker threads started
DEBUG    root:PluginBase.py:251 started file_hashes worker 0 (pid=77520)
INFO     root:scheduler.py:131 Analysis scheduler online
INFO     root:scheduler.py:132 Analysis plugins available: binwalk 1.0.0, cpu_architecture 0.4.0, crypto_hints 0.2.1, crypto_material 0.5.2, cve_lookup 0.1.0, cwe_checker 0.5.4, device_tree 2.0.0, dummy_plugin_for_testing_only 0.0, elf_analysis 0.3.4, ExamplePlugin 0.0.0, exploit_mitigations 0.2.0, file_hashes 1.2, file_system_metadata 1.0.0, file_type 1.0.0, hardware_analysis 0.2, hashlookup 0.1.4, information_leaks 0.2.0, init_systems 0.4.2, input_vectors 0.1.2, interesting_uris 0.1, ip_and_uri_finder 0.4.2, ipc_analyzer 0.1.1, kernel_config 0.3.1, known_vulnerabilities 0.2.1, printable_strings 0.3.4, qemu_exec 0.5.2, software_components 0.4.2, source_code_analysis 0.7.1, string_evaluator 0.2.1, tlsh 0.2, users_and_passwords 0.5.4
DEBUG    urllib3.connectionpool:connectionpool.py:546 http://localhost:None "POST /v1.43/containers/create HTTP/11" 400 139
DEBUG    urllib3.connectionpool:connectionpool.py:546 http://localhost:None "POST /v1.43/containers/create HTTP/11" 400 139
INFO     root:unpacking_scheduler.py:90 Unpacking scheduler online
DEBUG    root:unpacking_scheduler.py:294 Started unpacking work load monitor (pid=77557)
INFO     root:unpacking_scheduler.py:314 Queue Length (Analysis/Unpack): 0 / 0
DEBUG    root:unpacking_scheduler.py:147 Starting unpacking scheduler loop (pid=77561)
DEBUG    root:unpacking_scheduler.py:158 Started Worker on 418a54d78550e8584291c96e5d6168133621f352bfc1d43cf84e81187fef4962_787 (<TemporaryDirectory '/tmp/fact-docker-mount-base-dir8iz8swbh/tmpau2ihkim'>)
DEBUG    urllib3.connectionpool:connectionpool.py:243 Starting new HTTP connection (1): localhost:9900
DEBUG    urllib3.util.retry:retry.py:521 Incremented Retry for (url='/status'): Retry(total=2, connect=None, read=None, redirect=None, status=None)
WARNING  urllib3.connectionpool:connectionpool.py:870 Retrying (Retry(total=2, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7456c0128310>: Failed to establish a new connection: [Errno 111] Connection refused')': /status
DEBUG    urllib3.connectionpool:connectionpool.py:243 Starting new HTTP connection (2): localhost:9900
DEBUG    urllib3.util.retry:retry.py:521 Incremented Retry for (url='/status'): Retry(total=1, connect=None, read=None, redirect=None, status=None)
WARNING  urllib3.connectionpool:connectionpool.py:870 Retrying (Retry(total=1, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7456c0128610>: Failed to establish a new connection: [Errno 111] Connection refused')': /status
DEBUG    urllib3.connectionpool:connectionpool.py:243 Starting new HTTP connection (3): localhost:9900
DEBUG    urllib3.util.retry:retry.py:521 Incremented Retry for (url='/status'): Retry(total=0, connect=None, read=None, redirect=None, status=None)
WARNING  urllib3.connectionpool:connectionpool.py:870 Retrying (Retry(total=0, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7456c0128880>: Failed to establish a new connection: [Errno 111] Connection refused')': /status
DEBUG    urllib3.connectionpool:connectionpool.py:243 Starting new HTTP connection (4): localhost:9900
ERROR    root:unpacking_scheduler.py:273 Could not fetch unpacking container logs
ERROR    root:unpacking_scheduler.py:202 Exception happened during extraction of 418a54d78550e8584291c96e5d6168133621f352bfc1d43cf84e81187fef4962_787.: Extraction container could not be reached.
Traceback (most recent call last):
  File "/home/oguzhanozgur/ozzy/lib/python3.10/site-packages/urllib3/connection.py", line 199, in _new_conn
    sock = connection.create_connection(
  File "/home/oguzhanozgur/ozzy/lib/python3.10/site-packages/urllib3/util/connection.py", line 85, in create_connection
    raise err
  File "/home/oguzhanozgur/ozzy/lib/python3.10/site-packages/urllib3/util/connection.py", line 73, in create_connection
    sock.connect(sa)
ConnectionRefusedError: [Errno 111] Connection refused

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/oguzhanozgur/ozzy/lib/python3.10/site-packages/urllib3/connectionpool.py", line 789, in urlopen
    response = self._make_request(
  File "/home/oguzhanozgur/ozzy/lib/python3.10/site-packages/urllib3/connectionpool.py", line 495, in _make_request
    conn.request(
  File "/home/oguzhanozgur/ozzy/lib/python3.10/site-packages/urllib3/connection.py", line 441, in request
    self.endheaders()
  File "/usr/lib/python3.10/http/client.py", line 1278, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.10/http/client.py", line 1038, in _send_output
    self.send(msg)
  File "/usr/lib/python3.10/http/client.py", line 976, in send
    self.connect()
  File "/home/oguzhanozgur/ozzy/lib/python3.10/site-packages/urllib3/connection.py", line 279, in connect
    self.sock = self._new_conn()
  File "/home/oguzhanozgur/ozzy/lib/python3.10/site-packages/urllib3/connection.py", line 214, in _new_conn
    raise NewConnectionError(
urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPConnection object at 0x7456c0128af0>: Failed to establish a new connection: [Errno 111] Connection refused

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/oguzhanozgur/ozzy/lib/python3.10/site-packages/requests/adapters.py", line 589, in send
    resp = conn.urlopen(
  File "/home/oguzhanozgur/ozzy/lib/python3.10/site-packages/urllib3/connectionpool.py", line 873, in urlopen
    return self.urlopen(
  File "/home/oguzhanozgur/ozzy/lib/python3.10/site-packages/urllib3/connectionpool.py", line 873, in urlopen
    return self.urlopen(
  File "/home/oguzhanozgur/ozzy/lib/python3.10/site-packages/urllib3/connectionpool.py", line 873, in urlopen
    return self.urlopen(
  File "/home/oguzhanozgur/ozzy/lib/python3.10/site-packages/urllib3/connectionpool.py", line 843, in urlopen
    retries = retries.increment(
  File "/home/oguzhanozgur/ozzy/lib/python3.10/site-packages/urllib3/util/retry.py", line 519, in increment
    raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=9900): Max retries exceeded with url: /status (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7456c0128af0>: Failed to establish a new connection: [Errno 111] Connection refused'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/oguzhanozgur/FACT_core/src/unpacker/unpack_base.py", line 55, in _extract_with_worker
    response = container.start_unpacking(tmp_dir, timeout=WORKER_TIMEOUT)
  File "/home/oguzhanozgur/FACT_core/src/unpacker/extraction_container.py", line 118, in start_unpacking
    response = self._check_connection()
  File "/home/oguzhanozgur/FACT_core/src/unpacker/extraction_container.py", line 133, in _check_connection
    return session.get(url, timeout=5)
  File "/home/oguzhanozgur/ozzy/lib/python3.10/site-packages/requests/sessions.py", line 602, in get
    return self.request("GET", url, **kwargs)
  File "/home/oguzhanozgur/ozzy/lib/python3.10/site-packages/requests/sessions.py", line 589, in request
    resp = self.send(prep, **send_kwargs)
  File "/home/oguzhanozgur/ozzy/lib/python3.10/site-packages/requests/sessions.py", line 703, in send
    r = adapter.send(request, **kwargs)
  File "/home/oguzhanozgur/ozzy/lib/python3.10/site-packages/requests/adapters.py", line 622, in send
    raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=9900): Max retries exceeded with url: /status (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x7456c0128af0>: Failed to establish a new connection: [Errno 111] Connection refused'))

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/oguzhanozgur/FACT_core/src/scheduler/unpacking_scheduler.py", line 199, in work_thread
    extracted_objects = self.unpacker.unpack(task, tmp_dir, container)
  File "/home/oguzhanozgur/FACT_core/src/unpacker/unpack.py", line 42, in unpack
    extracted_files = self.extract_files_from_file(current_fo.file_path, tmp_dir, container)
  File "/home/oguzhanozgur/FACT_core/src/unpacker/unpack_base.py", line 41, in extract_files_from_file
    self._extract_with_worker(file_path, container, tmp_dir)
  File "/home/oguzhanozgur/FACT_core/src/unpacker/unpack_base.py", line 59, in _extract_with_worker
    raise ExtractionError('Extraction container could not be reached.') from error
unpacker.unpack_base.ExtractionError: Extraction container could not be reached.
INFO     root:unpacking_scheduler.py:208 Unpacking completed: 418a54d78550e8584291c96e5d6168133621f352bfc1d43cf84e81187fef4962_787 (extracted files: 0)
DEBUG    root:analysis_status.py:116 updating status (queue: 0)
INFO     root:unpacking_scheduler.py:241 Unpacking of firmware 418a54d78550e8584291c96e5d6168133621f352bfc1d43cf84e81187fef4962_787 completed.
DEBUG    root:plugin.py:120 Queueing analysis for 418a54d78550e8584291c96e5d6168133621f352bfc1d43cf84e81187fef4962_787
DEBUG    root:plugin.py:218 <Worker name='file_type worker' parent=77194 started>: Beginning file_type analysis on 418a54d78550e8584291c96e5d6168133621f352bfc1d43cf84e81187fef4962_787
/home/oguzhanozgur/FACT_core/src/bin/internal_symlink_magic, 7: Warning: using regular magic file `/home/oguzhanozgur/FACT_core/src/bin/firmware'
/home/oguzhanozgur/FACT_core/src/bin/internal_symlink_magic, 7: Warning: using regular magic file `/home/oguzhanozgur/FACT_core/src/bin/firmware'
DEBUG    root:plugin.py:238 <Worker name='file_type worker' parent=77194 started>: Finished file_type analysis on 418a54d78550e8584291c96e5d6168133621f352bfc1d43cf84e81187fef4962_787
DEBUG    root:PluginBase.py:255 Worker 0: Begin file_hashes analysis on 418a54d78550e8584291c96e5d6168133621f352bfc1d43cf84e81187fef4962_787
DEBUG    root:PluginBase.py:224 Worker 0: Finished file_hashes analysis on 418a54d78550e8584291c96e5d6168133621f352bfc1d43cf84e81187fef4962_787
INFO     root:scheduler.py:549 Analysis Completed: 418a54d78550e8584291c96e5d6168133621f352bfc1d43cf84e81187fef4962_787
INFO     root:analysis_status.py:193 Analysis of firmware 418a54d78550e8584291c96e5d6168133621f352bfc1d43cf84e81187fef4962_787 completed
Process ExceptionSafeProcess-70:
Traceback (most recent call last):
  File "/usr/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
    self.run()
  File "/home/oguzhanozgur/FACT_core/src/helperFunctions/process.py", line 93, in run
    raise exception
  File "/home/oguzhanozgur/FACT_core/src/helperFunctions/process.py", line 87, in run
    Process.run(self)
  File "/usr/lib/python3.10/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/home/oguzhanozgur/FACT_core/src/scheduler/unpacking_scheduler.py", line 149, in extraction_loop
    self.check_pending()
  File "/home/oguzhanozgur/FACT_core/src/scheduler/unpacking_scheduler.py", line 173, in check_pending
    container.restart()
  File "/home/oguzhanozgur/FACT_core/src/unpacker/extraction_container.py", line 91, in restart
    self.stop()
  File "/home/oguzhanozgur/FACT_core/src/unpacker/extraction_container.py", line 67, in stop
    raise RuntimeError('Container is not running.')
RuntimeError: Container is not running.
DEBUG    root:unpacking_scheduler.py:314 Queue Length (Analysis/Unpack): 0 / 0
FAILED
------------------------------------------------------------------------------------- live log teardown --------------------------------------------------------------------------------------
DEBUG    root:unpacking_scheduler.py:101 Shutting down unpacking scheduler
DEBUG    root:unpacking_scheduler.py:316 Stopped unpacking work load monitor
INFO     root:unpacking_scheduler.py:112 Unpacking scheduler offline
DEBUG    root:scheduler.py:139 Shutting down analysis scheduler
DEBUG    root:scheduler.py:334 Stopped analysis scheduling worker 0
DEBUG    root:scheduler.py:334 Stopped analysis scheduling worker 1
DEBUG    root:scheduler.py:334 Stopped analysis scheduling worker 3
DEBUG    root:scheduler.py:334 Stopped analysis scheduling worker 2
INFO     root:plugin.py:188 <Worker name='file_type worker' parent=77194 started> received SIGTERM. Shutting down.
INFO     root:plugin.py:188 <Worker name='crypto_hints worker' parent=77194 started> received SIGTERM. Shutting down.
INFO     root:plugin.py:188 <Worker name='device_tree worker' parent=77194 started> received SIGTERM. Shutting down.
INFO     root:plugin.py:188 <Worker name='binwalk worker' parent=77194 started> received SIGTERM. Shutting down.
INFO     root:plugin.py:188 <Worker name='source_code_analysis worker' parent=77194 started> received SIGTERM. Shutting down.
INFO     root:plugin.py:188 <Worker name='ExamplePlugin worker' parent=77194 started> received SIGTERM. Shutting down.
INFO     root:plugin.py:188 <Worker name='file_system_metadata worker' parent=77194 started> received SIGTERM. Shutting down.
DEBUG    root:scheduler.py:537 Stopped analysis result collector worker 1
DEBUG    root:scheduler.py:537 Stopped analysis result collector worker 0
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:263 worker 0 stopped
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:263 worker 1 stopped
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:263 worker 2 stopped
DEBUG    root:PluginBase.py:263 worker 3 stopped
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:263 worker 0 stopped
DEBUG    root:PluginBase.py:263 worker 0 stopped
DEBUG    root:PluginBase.py:263 worker 1 stopped
DEBUG    root:PluginBase.py:263 worker 0 stopped
DEBUG    root:PluginBase.py:263 worker 0 stopped
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:263 worker 0 stopped
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:263 worker 0 stopped
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:263 worker 0 stopped
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:263 worker 0 stopped
DEBUG    root:PluginBase.py:263 worker 0 stopped
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:263 worker 0 stopped
DEBUG    root:PluginBase.py:263 worker 0 stopped
DEBUG    root:PluginBase.py:263 worker 0 stopped
DEBUG    root:PluginBase.py:263 worker 0 stopped
DEBUG    root:PluginBase.py:263 worker 0 stopped
DEBUG    root:PluginBase.py:263 worker 0 stopped
DEBUG    root:PluginBase.py:263 worker 0 stopped
DEBUG    root:PluginBase.py:263 worker 0 stopped
DEBUG    root:PluginBase.py:263 worker 0 stopped
DEBUG    root:PluginBase.py:263 worker 0 stopped
DEBUG    root:PluginBase.py:263 worker 0 stopped
DEBUG    root:PluginBase.py:263 worker 0 stopped
DEBUG    root:PluginBase.py:263 worker 0 stopped
DEBUG    root:PluginBase.py:263 worker 0 stopped
DEBUG    root:analysis_status.py:116 updating status (queue: 0)
DEBUG    root:analysis_status.py:120 stopped analysis status worker
INFO     root:scheduler.py:162 Analysis scheduler offline


========================================================================================== FAILURES ==========================================================================================
__________________________________________________________________________ TestFileAddition.test_unpack_and_analyse __________________________________________________________________________

self = <test.integration.scheduler.test_unpack_and_analyse.TestFileAddition object at 0x7456a3cef340>
analysis_scheduler = <scheduler.analysis.scheduler.AnalysisScheduler object at 0x7456a3a03550>
unpacking_scheduler = <scheduler.unpacking_scheduler.UnpackingScheduler object at 0x7456c00fa350>, post_analysis_queue = <multiprocessing.queues.Queue object at 0x7456a3999840>

    @pytest.mark.SchedulerTestConfig(
        pipeline=True,
        backend_db_class=MockDbInterface,
    )
    def test_unpack_and_analyse(self, analysis_scheduler, unpacking_scheduler, post_analysis_queue):
        test_fw = Firmware(file_path=f'{get_test_data_dir()}/container/test.zip')
        test_fw.release_date = '1970-01-01'
    
        unpacking_scheduler.add_task(test_fw)
    
        processed_container = {}
        for _ in range(4 * 2):  # container with 3 included files times 2 mandatory plugins run
>           uid, plugin, analysis_result = post_analysis_queue.get(timeout=3)

src/test/integration/scheduler/test_unpack_and_analyse.py:21: 
_ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ _ 

self = <multiprocessing.queues.Queue object at 0x7456a3999840>, block = True, timeout = 2.999998934999894

    def get(self, block=True, timeout=None):
        if self._closed:
            raise ValueError(f"Queue {self!r} is closed")
        if block and timeout is None:
            with self._rlock:
                res = self._recv_bytes()
            self._sem.release()
        else:
            if block:
                deadline = time.monotonic() + timeout
            if not self._rlock.acquire(block, timeout):
                raise Empty
            try:
                if block:
                    timeout = deadline - time.monotonic()
                    if not self._poll(timeout):
>                       raise Empty
E                       _queue.Empty

/usr/lib/python3.10/multiprocessing/queues.py:114: Empty
------------------------------------------------------------------------------------- Captured log setup -------------------------------------------------------------------------------------
DEBUG    root:unpacking_locks.py:10 Started unpacking locks manager 
DEBUG    root:db_interface_view_sync.py:18 view updated: source_code_analysis
DEBUG    root:db_interface_view_sync.py:18 view updated: crypto_hints
DEBUG    root:db_interface_view_sync.py:18 view updated: exploit_mitigations
DEBUG    root:db_interface_view_sync.py:18 view updated: file_system_metadata
DEBUG    root:scheduler.py:653 ExamplePlugin: No view available! Generic view will be used.
DEBUG    root:db_interface_view_sync.py:18 view updated: cpu_architecture
DEBUG    root:db_interface_view_sync.py:18 view updated: file_type
DEBUG    root:db_interface_view_sync.py:18 view updated: cve_lookup
DEBUG    root:db_interface_view_sync.py:18 view updated: information_leaks
DEBUG    root:base.py:29 hashlookup: No view available! Generic view will be used.
DEBUG    root:db_interface_view_sync.py:18 view updated: crypto_material
DEBUG    root:db_interface_view_sync.py:18 view updated: ipc_analyzer
DEBUG    root:db_interface_view_sync.py:18 view updated: tlsh
DEBUG    root:db_interface_view_sync.py:18 view updated: known_vulnerabilities
DEBUG    root:db_interface_view_sync.py:18 view updated: ip_and_uri_finder
INFO     root:ip_and_uri_finder_analysis.py:154 ip signature path: /home/oguzhanozgur/ozzy/lib/python3.10/site-packages/common_analysis_ip_and_uri_finder/yara_rules/ip_rules.yara
INFO     root:ip_and_uri_finder_analysis.py:158 ip signature path: /home/oguzhanozgur/ozzy/lib/python3.10/site-packages/common_analysis_ip_and_uri_finder/yara_rules/uri_rules.yara
DEBUG    root:base.py:29 interesting_uris: No view available! Generic view will be used.
DEBUG    root:base.py:29 dummy_plugin_for_testing_only: No view available! Generic view will be used.
DEBUG    root:db_interface_view_sync.py:18 view updated: kernel_config
DEBUG    root:db_interface_view_sync.py:18 view updated: cwe_checker
DEBUG    docker.utils.config:config.py:21 Trying paths: ['/home/oguzhanozgur/.docker/config.json', '/home/oguzhanozgur/.dockercfg']
DEBUG    docker.utils.config:config.py:28 No config file found
DEBUG    docker.utils.config:config.py:21 Trying paths: ['/home/oguzhanozgur/.docker/config.json', '/home/oguzhanozgur/.dockercfg']
DEBUG    docker.utils.config:config.py:28 No config file found
DEBUG    urllib3.connectionpool:connectionpool.py:546 http://localhost:None "GET /version HTTP/11" 200 778
DEBUG    urllib3.connectionpool:connectionpool.py:546 http://localhost:None "POST /v1.43/containers/create HTTP/11" 201 88
DEBUG    urllib3.connectionpool:connectionpool.py:546 http://localhost:None "GET /v1.43/containers/e9ebd6a407f81788d3174bf03880b498c682d018b3a5086cab970ba889671561/json HTTP/11" 200 None
DEBUG    urllib3.connectionpool:connectionpool.py:546 http://localhost:None "POST /v1.43/containers/e9ebd6a407f81788d3174bf03880b498c682d018b3a5086cab970ba889671561/start HTTP/11" 204 0
DEBUG    urllib3.connectionpool:connectionpool.py:546 http://localhost:None "POST /v1.43/containers/e9ebd6a407f81788d3174bf03880b498c682d018b3a5086cab970ba889671561/wait HTTP/11" 200 None
DEBUG    urllib3.connectionpool:connectionpool.py:546 http://localhost:None "GET /v1.43/containers/e9ebd6a407f81788d3174bf03880b498c682d018b3a5086cab970ba889671561/logs?stderr=1&stdout=1&timestamps=0&follow=0&tail=all HTTP/11" 200 None
DEBUG    urllib3.connectionpool:connectionpool.py:546 http://localhost:None "GET /v1.43/containers/e9ebd6a407f81788d3174bf03880b498c682d018b3a5086cab970ba889671561/json HTTP/11" 200 None
DEBUG    urllib3.connectionpool:connectionpool.py:546 http://localhost:None "POST /v1.43/containers/e9ebd6a407f81788d3174bf03880b498c682d018b3a5086cab970ba889671561/stop HTTP/11" 304 0
DEBUG    urllib3.connectionpool:connectionpool.py:546 http://localhost:None "DELETE /v1.43/containers/e9ebd6a407f81788d3174bf03880b498c682d018b3a5086cab970ba889671561?v=False&link=False&force=False HTTP/11" 204 0
DEBUG    root:cwe_checker.py:61 Version is cwe_checker 0.9.0

DEBUG    root:db_interface_view_sync.py:18 view updated: binwalk
DEBUG    root:db_interface_view_sync.py:18 view updated: qemu_exec
DEBUG    root:db_interface_view_sync.py:18 view updated: users_and_passwords
DEBUG    root:db_interface_view_sync.py:18 view updated: input_vectors
DEBUG    root:db_interface_view_sync.py:18 view updated: software_components
DEBUG    root:db_interface_view_sync.py:18 view updated: printable_strings
DEBUG    root:db_interface_view_sync.py:18 view updated: string_evaluator
DEBUG    root:db_interface_view_sync.py:18 view updated: init_systems
DEBUG    root:db_interface_view_sync.py:18 view updated: hardware_analysis
DEBUG    root:db_interface_view_sync.py:18 view updated: elf_analysis
DEBUG    root:base.py:29 file_hashes: No view available! Generic view will be used.
DEBUG    root:db_interface_view_sync.py:18 view updated: device_tree
DEBUG    root:PluginBase.py:112 exploit_mitigations: 1 worker threads started
DEBUG    root:PluginBase.py:112 cpu_architecture: 4 worker threads started
DEBUG    root:PluginBase.py:112 cve_lookup: 2 worker threads started
DEBUG    root:PluginBase.py:112 information_leaks: 1 worker threads started
DEBUG    root:PluginBase.py:112 hashlookup: 1 worker threads started
DEBUG    root:PluginBase.py:112 crypto_material: 1 worker threads started
DEBUG    root:PluginBase.py:112 ipc_analyzer: 1 worker threads started
DEBUG    root:PluginBase.py:112 tlsh: 1 worker threads started
DEBUG    root:PluginBase.py:112 known_vulnerabilities: 1 worker threads started
DEBUG    root:PluginBase.py:112 ip_and_uri_finder: 1 worker threads started
DEBUG    root:PluginBase.py:112 interesting_uris: 1 worker threads started
DEBUG    root:PluginBase.py:112 dummy_plugin_for_testing_only: 1 worker threads started
DEBUG    root:PluginBase.py:112 kernel_config: 1 worker threads started
DEBUG    root:PluginBase.py:112 cwe_checker: 1 worker threads started
DEBUG    root:PluginBase.py:112 qemu_exec: 1 worker threads started
DEBUG    root:PluginBase.py:112 users_and_passwords: 1 worker threads started
DEBUG    root:PluginBase.py:112 input_vectors: 1 worker threads started
DEBUG    root:PluginBase.py:112 software_components: 1 worker threads started
DEBUG    root:PluginBase.py:112 printable_strings: 1 worker threads started
DEBUG    root:PluginBase.py:112 string_evaluator: 1 worker threads started
DEBUG    root:PluginBase.py:112 init_systems: 1 worker threads started
DEBUG    root:PluginBase.py:112 hardware_analysis: 1 worker threads started
DEBUG    root:PluginBase.py:112 elf_analysis: 1 worker threads started
DEBUG    root:PluginBase.py:112 file_hashes: 1 worker threads started
INFO     root:scheduler.py:131 Analysis scheduler online
INFO     root:scheduler.py:132 Analysis plugins available: binwalk 1.0.0, cpu_architecture 0.4.0, crypto_hints 0.2.1, crypto_material 0.5.2, cve_lookup 0.1.0, cwe_checker 0.5.4, device_tree 2.0.0, dummy_plugin_for_testing_only 0.0, elf_analysis 0.3.4, ExamplePlugin 0.0.0, exploit_mitigations 0.2.0, file_hashes 1.2, file_system_metadata 1.0.0, file_type 1.0.0, hardware_analysis 0.2, hashlookup 0.1.4, information_leaks 0.2.0, init_systems 0.4.2, input_vectors 0.1.2, interesting_uris 0.1, ip_and_uri_finder 0.4.2, ipc_analyzer 0.1.1, kernel_config 0.3.1, known_vulnerabilities 0.2.1, printable_strings 0.3.4, qemu_exec 0.5.2, software_components 0.4.2, source_code_analysis 0.7.1, string_evaluator 0.2.1, tlsh 0.2, users_and_passwords 0.5.4
DEBUG    urllib3.connectionpool:connectionpool.py:546 http://localhost:None "POST /v1.43/containers/create HTTP/11" 400 139
DEBUG    urllib3.connectionpool:connectionpool.py:546 http://localhost:None "POST /v1.43/containers/create HTTP/11" 400 139
INFO     root:unpacking_scheduler.py:90 Unpacking scheduler online
----------------------------------------------------------------------------------- Captured log teardown ------------------------------------------------------------------------------------
DEBUG    root:unpacking_scheduler.py:101 Shutting down unpacking scheduler
INFO     root:unpacking_scheduler.py:112 Unpacking scheduler offline
DEBUG    root:scheduler.py:139 Shutting down analysis scheduler
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:118 Shutting down...
DEBUG    root:PluginBase.py:118 Shutting down...
INFO     root:scheduler.py:162 Analysis scheduler offline
================================================================================== short test summary info ===================================================================================
FAILED src/test/integration/scheduler/test_unpack_and_analyse.py::TestFileAddition::test_unpack_and_analyse - _queue.Empty
===================================================================================== 1 failed in 6.58s ======================================================================================
(ozzy) oguzhanozgur@oguzhanozgur:~/FACT_core$ 

@jstucke
Collaborator

jstucke commented Oct 2, 2024

ERROR    root:unpacking_scheduler.py:273 Could not fetch unpacking container logs
ERROR    root:unpacking_scheduler.py:202 Exception happened during extraction of 418a54d78550e8584291c96e5d6168133621f352bfc1d43cf84e81187fef4962_787.: Extraction container could not be reached. 

Sadly, this was not really helpful: It seems the container did not start at all and therefore did not produce any error log. This is really puzzling. What happens if you try to start the container manually with the parameters used in FACT:

docker run --rm -it -p "9990:5000/tcp" --entrypoint gunicorn fkiecad/fact_extractor:latest --timeout 600 -w 1 -b 0.0.0.0:5000 server:app

Does it start? Are you able to run curl "http://localhost:9990/status" in another shell while it is running?
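For reference, the two checks could look roughly like this in two separate shells (9990 is just the example host port from the command above; if the container is up, curl should return an HTTP response instead of "connection refused"):

# shell 1: start the extractor container manually with the parameters FACT uses
docker run --rm -it -p "9990:5000/tcp" --entrypoint gunicorn fkiecad/fact_extractor:latest --timeout 600 -w 1 -b 0.0.0.0:5000 server:app

# shell 2: query the status endpoint of the running container
curl "http://localhost:9990/status"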

@oguzhanozgur
Author

[screenshot: the container starts successfully when run manually]

@jstucke
Collaborator

jstucke commented Oct 2, 2024

So the container starts fine; it just does not seem to start when launched from FACT via the Python Docker API.

What is the version of the Python docker package? (You can get it by running pip show docker | grep -i version with the venv activated.)

Also, it does not look like you are really using a venv, judging by the package path /home/oguzhanozgur/ozzy/lib/python3.10/site-packages/urllib3/connection.py. Could you try creating a venv in your src folder by running

python3 -m venv venv
. venv/bin/activate

and then in that shell reinstall all python packages with

pip install -r install/requirements_pre_install.txt
python3 install.py

and then try starting FACT again (also from this shell)?
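Putting the steps together, the whole sequence (run from the FACT_core/src directory, just like in the shell output further below) would roughly look like this:

cd ~/FACT_core/src
python3 -m venv venv                                     # create the venv next to the FACT sources
. venv/bin/activate                                      # activate it for the current shell
pip install -r install/requirements_pre_install.txt      # reinstall the pre-install requirements
python3 install.py                                       # re-run the installer inside the venv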

@oguzhanozgur
Author

oguzhanozgur commented Oct 2, 2024

I think there is a problem with docker. Here are the new logs. (By the way, I could start FACT successfully, but the same issue occurred again.)

oguzhanozgur@oguzhanozgur:~$ pip show docker | grep -i version
WARNING: Package(s) not found: docker
oguzhanozgur@oguzhanozgur:~$ . venv/bin/activate
bash: venv/bin/activate: No such file or directory
oguzhanozgur@oguzhanozgur:~$ cd FACT_core/src
oguzhanozgur@oguzhanozgur:~/FACT_core/src$ . venv/bin/activate
(venv) oguzhanozgur@oguzhanozgur:~/FACT_core/src$ pip show docker | grep -i version
WARNING: Package(s) not found: docker
(venv) oguzhanozgur@oguzhanozgur:~/FACT_core/src$ pip install -r install/requirements_pre_install.txt 
Collecting git+https://github.com/fkie-cad/common_helper_files.git (from -r install/requirements_pre_install.txt (line 12))
  Cloning https://github.com/fkie-cad/common_helper_files.git to /tmp/pip-req-build-_wsabl92
  Running command git clone --filter=blob:none --quiet https://github.com/fkie-cad/common_helper_files.git /tmp/pip-req-build-_wsabl92
  Resolved https://github.com/fkie-cad/common_helper_files.git to commit 3f9a641d56216e6a74774737b310a5d4b598e017
  Preparing metadata (setup.py) ... done
Collecting virtualenv
  Downloading virtualenv-20.26.6-py3-none-any.whl (6.0 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 6.0/6.0 MB 5.6 MB/s eta 0:00:00
Collecting distro==1.8.0
  Downloading distro-1.8.0-py3-none-any.whl (20 kB)
Collecting python-magic==0.4.27
  Downloading python_magic-0.4.27-py2.py3-none-any.whl (13 kB)
Collecting requests==2.32.2
  Downloading requests-2.32.2-py3-none-any.whl (63 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 63.9/63.9 KB 8.1 MB/s eta 0:00:00
Collecting pydantic==2.4.0
  Downloading pydantic-2.4.0-py3-none-any.whl (395 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 395.4/395.4 KB 9.9 MB/s eta 0:00:00
Collecting werkzeug~=3.0.3
  Downloading werkzeug-3.0.4-py3-none-any.whl (227 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 227.6/227.6 KB 9.6 MB/s eta 0:00:00
Collecting toml==0.10.2
  Downloading toml-0.10.2-py2.py3-none-any.whl (16 kB)
Collecting charset-normalizer<4,>=2
  Using cached charset_normalizer-3.3.2-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (142 kB)
Collecting urllib3<3,>=1.21.1
  Downloading urllib3-2.2.3-py3-none-any.whl (126 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 126.3/126.3 KB 8.9 MB/s eta 0:00:00
Collecting certifi>=2017.4.17
  Downloading certifi-2024.8.30-py3-none-any.whl (167 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 167.3/167.3 KB 10.0 MB/s eta 0:00:00
Collecting idna<4,>=2.5
  Downloading idna-3.10-py3-none-any.whl (70 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 70.4/70.4 KB 6.9 MB/s eta 0:00:00
Collecting annotated-types>=0.4.0
  Downloading annotated_types-0.7.0-py3-none-any.whl (13 kB)
Collecting pydantic-core==2.10.0
  Downloading pydantic_core-2.10.0-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (2.0 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.0/2.0 MB 8.7 MB/s eta 0:00:00
Collecting typing-extensions>=4.6.1
  Using cached typing_extensions-4.12.2-py3-none-any.whl (37 kB)
Collecting platformdirs<5,>=3.9.1
  Downloading platformdirs-4.3.6-py3-none-any.whl (18 kB)
Collecting distlib<1,>=0.3.7
  Downloading distlib-0.3.8-py2.py3-none-any.whl (468 kB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 468.9/468.9 KB 10.5 MB/s eta 0:00:00
Collecting filelock<4,>=3.12.2
  Downloading filelock-3.16.1-py3-none-any.whl (16 kB)
Collecting MarkupSafe>=2.1.1
  Downloading MarkupSafe-2.1.5-cp310-cp310-manylinux_2_17_x86_64.manylinux2014_x86_64.whl (25 kB)
Collecting bitmath
  Using cached bitmath-1.3.3.1-py3-none-any.whl
Using legacy 'setup.py install' for common-helper-files, since package 'wheel' is not installed.
Installing collected packages: distlib, bitmath, urllib3, typing-extensions, toml, python-magic, platformdirs, MarkupSafe, idna, filelock, distro, common-helper-files, charset-normalizer, certifi, annotated-types, werkzeug, virtualenv, requests, pydantic-core, pydantic
  Running setup.py install for common-helper-files ... done
Successfully installed MarkupSafe-2.1.5 annotated-types-0.7.0 bitmath-1.3.3.1 certifi-2024.8.30 charset-normalizer-3.3.2 common-helper-files-0.3.1 distlib-0.3.8 distro-1.8.0 filelock-3.16.1 idna-3.10 platformdirs-4.3.6 pydantic-2.4.0 pydantic-core-2.10.0 python-magic-0.4.27 requests-2.32.2 toml-0.10.2 typing-extensions-4.12.2 urllib3-2.2.3 virtualenv-20.26.6 werkzeug-3.0.4
(venv) oguzhanozgur@oguzhanozgur:~/FACT_core/src$ python3 install.py
[2024-10-02 12:33:36][install][INFO]: FACT Installer 1.2
[2024-10-02 12:33:36][common][INFO]: Updating system
[sudo] password for oguzhanozgur: 
[2024-10-02 12:33:42][install][INFO]: Installing apt-transport-https autoconf automake build-essential git gnupg2 libtool python3 python3-dev unzip wget libfuzzy-dev libmagic-dev
[2024-10-02 12:34:05][install][INFO]: Installing nodejs papirus-icon-theme
[2024-10-02 12:34:21][frontend][INFO]: Creating directory for authentication
[2024-10-02 12:34:21][frontend][INFO]: Initializing docker container for radare
[2024-10-02 12:34:24][frontend][INFO]: Pulling pdf report container
psql (PostgreSQL) 14.13 (Ubuntu 14.13-1.pgdg22.04+1)
[2024-10-02 12:34:26][db][INFO]: Skipping PostgreSQL installation. Reason: Already installed.
[2024-10-02 12:34:26][db][INFO]: Initializing PostgreSQL database
[2024-10-02 12:34:26][install][INFO]: Installing libjpeg-dev libssl-dev redis binutils file openssl bison flex pkg-config
[2024-10-02 12:34:34][backend][INFO]: Skipping yara installation: Already installed and up to date
[2024-10-02 12:34:34][backend][INFO]: Installing checksec.sh
[2024-10-02 12:34:34][backend][INFO]: Pulling fact extraction container
[2024-10-02 12:34:36][backend][INFO]: Installing linter plugin.
[2024-10-02 12:34:52][backend][INFO]: Finished installing linter plugin.

[2024-10-02 12:34:52][backend][INFO]: Installing crypto_hints plugin.
[2024-10-02 12:34:53][backend][INFO]: Finished installing crypto_hints plugin.

[2024-10-02 12:34:53][backend][INFO]: Installing file_system_metadata plugin.
[2024-10-02 12:34:53][backend][INFO]: Finished installing file_system_metadata plugin.

[2024-10-02 12:34:53][backend][INFO]: Installing architecture_detection plugin.
[2024-10-02 12:34:55][backend][INFO]: Finished installing architecture_detection plugin.

[2024-10-02 12:34:55][backend][INFO]: Installing cve_lookup plugin.
[2024-10-02 12:36:15][backend][INFO]: Finished installing cve_lookup plugin.

[2024-10-02 12:36:15][backend][INFO]: Installing ipc plugin.
[2024-10-02 12:36:16][backend][INFO]: Finished installing ipc plugin.

[2024-10-02 12:36:16][backend][INFO]: Installing known_vulnerabilities plugin.
[2024-10-02 12:36:17][backend][INFO]: Finished installing known_vulnerabilities plugin.

[2024-10-02 12:36:17][backend][INFO]: Installing ip_and_uri_finder plugin.
[2024-10-02 12:36:24][backend][INFO]: Finished installing ip_and_uri_finder plugin.

[2024-10-02 12:36:24][backend][INFO]: Installing kernel_config plugin.
[2024-10-02 12:36:28][backend][INFO]: Finished installing kernel_config plugin.

[2024-10-02 12:36:28][backend][INFO]: Installing cwe_checker plugin.
[2024-10-02 12:36:30][backend][INFO]: Finished installing cwe_checker plugin.

[2024-10-02 12:36:30][backend][INFO]: Installing binwalk plugin.
[2024-10-02 12:36:59][backend][INFO]: Finished installing binwalk plugin.

[2024-10-02 12:36:59][backend][INFO]: Installing qemu_exec plugin.
[2024-10-02 12:37:02][backend][INFO]: Finished installing qemu_exec plugin.

[2024-10-02 12:37:02][backend][INFO]: Installing users_and_passwords plugin.
[2024-10-02 12:37:05][backend][INFO]: Finished installing users_and_passwords plugin.

[2024-10-02 12:37:05][backend][INFO]: Installing input_vectors plugin.
[2024-10-02 12:37:06][backend][INFO]: Finished installing input_vectors plugin.

[2024-10-02 12:37:06][backend][INFO]: Installing software_components plugin.
[2024-10-02 12:37:07][backend][INFO]: Finished installing software_components plugin.

[2024-10-02 12:37:07][backend][INFO]: Installing device_tree plugin.
[2024-10-02 12:37:07][backend][INFO]: Finished installing device_tree plugin.

[2024-10-02 12:37:07][backend][INFO]: Creating firmware directory
Create signature directory /home/oguzhanozgur/FACT_core/src/analysis/signatures
Compile signatures in /home/oguzhanozgur/FACT_core/src/plugins/analysis/crypto_hints/signatures
Compile signatures in /home/oguzhanozgur/FACT_core/src/plugins/analysis/crypto_material/signatures
Compile signatures in /home/oguzhanozgur/FACT_core/src/plugins/analysis/known_vulnerabilities/signatures
Compile signatures in /home/oguzhanozgur/FACT_core/src/plugins/analysis/software_components/signatures
[2024-10-02 12:37:08][install][INFO]: installation complete
(venv) oguzhanozgur@oguzhanozgur:~/FACT_core/src$ ls
alembic.ini                 init_postgres.py             start_fact_backend.py
analysis                    install                      start_fact_database.py
bin                         install.log                  start_fact_frontend.py
check_signatures.py         install.py                   start_fact.py
compare                     intercom                     statistic
compile_yara_signatures.py  manage_users.py              storage
config                      migrate_database.py          test
config.py                   migrate_db_to_postgresql.py  unpacker
conftest.py                 objects                      update_statistic.py
fact_base.py                plugins                      venv
flask_app_wrapper.py        __pycache__                  version.py
helperFunctions             scheduler                    web_interface
(venv) oguzhanozgur@oguzhanozgur:~/FACT_core/src$ cd ..
(venv) oguzhanozgur@oguzhanozgur:~/FACT_core$ ls
CODE_OF_CONDUCT.md  INSTALL.md          src
CONTRIBUTING.md     INSTALL.vagrant.md  start_all_installed_fact_components
docs                LICENSE             start_fact_backend
docsrc              pyproject.toml      start_fact_database
install.log         README.md           start_fact_frontend
(venv) oguzhanozgur@oguzhanozgur:~/FACT_core$ ./start_all_installed_fact_components 
[2024-10-02 12:38:04][start_all_installed_fact_components][INFO]: starting database
[2024-10-02 12:38:04][start_all_installed_fact_components][INFO]: starting frontend
[2024-10-02 12:38:04][start_all_installed_fact_components][INFO]: starting backend
[2024-10-02 12:38:04][fact_base][INFO]: Starting FACT Frontend @ 4.3-dev (d96db1d1, Python 3.10.12)
[2024-10-02 12:38:04][fact_base][INFO]: Starting FACT DB-Service @ 4.3-dev (d96db1d1, Python 3.10.12)
[2024-10-02 12:38:04][__init__][INFO]: Alembic DB revision:  head: 05d8effce8b3, current: 05d8effce8b3
[2024-10-02 12:38:04][__init__][INFO]: Alembic DB revision:  head: 05d8effce8b3, current: 05d8effce8b3
[2024-10-02 12:38:04][fact_base][INFO]: Successfully started FACT DB-Service
[2024-10-02 12:38:04][fact_base][INFO]: Starting FACT Backend @ 4.3-dev (d96db1d1, Python 3.10.12)
[2024-10-02 12:38:04][__init__][INFO]: Alembic DB revision:  head: 05d8effce8b3, current: 05d8effce8b3
[2024-10-02 12:38:04][fact_base][INFO]: Successfully started FACT Frontend
[uWSGI] getting INI configuration from /home/oguzhanozgur/FACT_core/src/config/uwsgi_config.ini
*** Starting uWSGI 2.0.25.1 (64bit) on [Wed Oct  2 12:38:04 2024] ***
compiled with version: 11.4.0 on 01 October 2024 07:03:53
os: Linux-6.5.0-1027-oem #28-Ubuntu SMP PREEMPT_DYNAMIC Thu Jul 25 13:32:46 UTC 2024
nodename: oguzhanozgur
machine: x86_64
clock source: unix
detected number of CPU cores: 12
current working directory: /home/oguzhanozgur/FACT_core/src
detected binary path: /home/oguzhanozgur/FACT_core/src/venv/bin/uwsgi
!!! no internal routing support, rebuild with pcre support !!!
your processes number limit is 125501
your memory page size is 4096 bytes
detected max file descriptor number: 1024
lock engine: pthread robust mutexes
thunder lock: enabled
uwsgi socket 0 bound to TCP address 127.0.0.1:5000 fd 3
Python version: 3.10.12 (main, Sep 11 2024, 15:47:36) [GCC 11.4.0]
Python main interpreter initialized at 0x652a5c91f0e0
python threads support enabled
your server socket listen backlog is limited to 100 connections
your mercy for graceful operations on workers is 60 seconds
mapped 500064 bytes (488 KB) for 10 cores
*** Operational MODE: preforking+threaded ***
[2024-10-02 12:38:04][ip_and_uri_finder_analysis][INFO]: ip signature path: /home/oguzhanozgur/FACT_core/src/venv/lib/python3.10/site-packages/common_analysis_ip_and_uri_finder/yara_rules/ip_rules.yara
[2024-10-02 12:38:04][ip_and_uri_finder_analysis][INFO]: ip signature path: /home/oguzhanozgur/FACT_core/src/venv/lib/python3.10/site-packages/common_analysis_ip_and_uri_finder/yara_rules/uri_rules.yara
[2024-10-02 12:38:05][frontend_main][INFO]: Web front end online
WSGI app 0 (mountpoint='') ready in 1 seconds on interpreter 0x652a5c91f0e0 pid: 84798 (default app)
*** uWSGI is running in multiple interpreter mode ***
spawned uWSGI master process (pid: 84798)
spawned uWSGI worker 1 (pid: 84987, cores: 2)
spawned uWSGI worker 2 (pid: 84989, cores: 2)
spawned uWSGI worker 3 (pid: 84991, cores: 2)
spawned uWSGI worker 4 (pid: 84993, cores: 2)
spawned uWSGI worker 5 (pid: 84995, cores: 2)
*** Stats server enabled on 127.0.0.1:9191 fd: 19 ***
[2024-10-02 12:38:05][compare][INFO]: Comparison plugins available: Software, File_Coverage, File_Header
[2024-10-02 12:38:05][scheduler][INFO]: Analysis scheduler online
[2024-10-02 12:38:05][scheduler][INFO]: Analysis plugins available: binwalk 1.0.0, cpu_architecture 0.4.0, crypto_hints 0.2.1, crypto_material 0.5.2, cve_lookup 0.1.0, cwe_checker 0.5.4, device_tree 2.0.0, elf_analysis 0.3.4, exploit_mitigations 0.2.0, file_hashes 1.2, file_system_metadata 1.0.0, file_type 1.0.0, hardware_analysis 0.2, hashlookup 0.1.4, information_leaks 0.2.0, init_systems 0.4.2, input_vectors 0.1.2, interesting_uris 0.1, ip_and_uri_finder 0.4.2, ipc_analyzer 0.1.1, kernel_config 0.3.1, known_vulnerabilities 0.2.1, printable_strings 0.3.4, qemu_exec 0.5.2, software_components 0.4.2, source_code_analysis 0.7.1, string_evaluator 0.2.1, tlsh 0.2, users_and_passwords 0.5.4
[2024-10-02 12:38:05][unpacking_scheduler][INFO]: Unpacking scheduler online
[2024-10-02 12:38:05][unpacking_scheduler][INFO]: Queue Length (Analysis/Unpack): 0 / 0
[2024-10-02 12:38:05][comparison_scheduler][INFO]: Comparison scheduler online
[2024-10-02 12:38:05][back_end_binding][INFO]: Intercom online
[2024-10-02 12:38:05][fact_base][INFO]: Successfully started FACT Backend
[2024-10-02 12:38:09][fact_base][INFO]: System memory usage: 19.9%; open file count: 6
[2024-10-02 12:38:09][fact_base][INFO]: System memory usage: 19.9%; open file count: 7
[2024-10-02 12:38:11][fact_base][INFO]: System memory usage: 19.8%; open file count: 542
[2024-10-02 12:38:12][update][INFO]: time to create stats: 0s
[pid: 84987|app: 0|req: 1/1] 127.0.0.1 () {46 vars in 903 bytes} [Wed Oct  2 12:38:12 2024] GET / => generated 18170 bytes in 71 msecs (HTTP/1.1 200) 3 headers in 96 bytes (1 switches on core 0)
[pid: 84987|app: 0|req: 2/2] 127.0.0.1 () {44 vars in 879 bytes} [Wed Oct  2 12:38:12 2024] GET /static/node_modules/bootstrap/dist/css/bootstrap.min.css => generated 162264 bytes in 3 msecs via sendfile() (HTTP/1.1 200) 8 headers in 305 bytes (0 switches on core 1)
[pid: 84993|app: 0|req: 2/3] 127.0.0.1 () {44 vars in 805 bytes} [Wed Oct  2 12:38:12 2024] GET /static/css/home.css => generated 428 bytes in 8 msecs via sendfile() (HTTP/1.1 200) 8 headers in 292 bytes (0 switches on core 1)
[pid: 84989|app: 0|req: 2/4] 127.0.0.1 () {44 vars in 937 bytes} [Wed Oct  2 12:38:12 2024] GET /static/node_modules/bootstrap-datepicker/dist/css/bootstrap-datepicker.standalone.css => generated 18058 bytes in 11 msecs via sendfile() (HTTP/1.1 200) 8 headers in 322 bytes (0 switches on core 1)
[pid: 84989|app: 0|req: 2/5] 127.0.0.1 () {44 vars in 889 bytes} [Wed Oct  2 12:38:12 2024] GET /static/node_modules/@fortawesome/fontawesome-free/css/all.css => generated 73577 bytes in 11 msecs via sendfile() (HTTP/1.1 200) 8 headers in 295 bytes (0 switches on core 0)
[pid: 84993|app: 0|req: 2/6] 127.0.0.1 () {44 vars in 861 bytes} [Wed Oct  2 12:38:12 2024] GET /static/node_modules/bootstrap/dist/js/bootstrap.min.js => generated 62563 bytes in 11 msecs via sendfile() (HTTP/1.1 200) 8 headers in 310 bytes (0 switches on core 0)
[pid: 84991|app: 0|req: 2/7] 127.0.0.1 () {44 vars in 843 bytes} [Wed Oct  2 12:38:12 2024] GET /static/node_modules/jquery/dist/jquery.min.js => generated 87462 bytes in 11 msecs via sendfile() (HTTP/1.1 200) 8 headers in 308 bytes (0 switches on core 0)
[pid: 84991|app: 0|req: 2/8] 127.0.0.1 () {44 vars in 857 bytes} [Wed Oct  2 12:38:12 2024] GET /static/node_modules/popper.js/dist/umd/popper.min.js => generated 21233 bytes in 11 msecs via sendfile() (HTTP/1.1 200) 8 headers in 308 bytes (0 switches on core 1)
[pid: 84987|app: 0|req: 3/9] 127.0.0.1 () {44 vars in 803 bytes} [Wed Oct  2 12:38:12 2024] GET /static/js/quick_search.js => generated 3568 bytes in 1 msecs via sendfile() (HTTP/1.1 200) 8 headers in 308 bytes (0 switches on core 0)
[pid: 84987|app: 0|req: 4/10] 127.0.0.1 () {44 vars in 861 bytes} [Wed Oct  2 12:38:12 2024] GET /static/fact_logo_inv.svg => generated 32224 bytes in 0 msecs via sendfile() (HTTP/1.1 200) 8 headers in 310 bytes (0 switches on core 1)
[pid: 84993|app: 0|req: 3/11] 127.0.0.1 () {44 vars in 853 bytes} [Wed Oct  2 12:38:12 2024] GET /static/fact_logo.png => generated 70887 bytes in 0 msecs via sendfile() (HTTP/1.1 200) 8 headers in 287 bytes (0 switches on core 1)
[pid: 84995|app: 0|req: 2/12] 127.0.0.1 () {44 vars in 897 bytes} [Wed Oct  2 12:38:12 2024] GET /static/node_modules/bootstrap-datepicker/dist/js/bootstrap-datepicker.js => generated 58136 bytes in 8 msecs via sendfile() (HTTP/1.1 200) 8 headers in 318 bytes (0 switches on core 1)
[pid: 84995|app: 0|req: 2/13] 127.0.0.1 () {44 vars in 835 bytes} [Wed Oct  2 12:38:12 2024] GET /static/node_modules/moment/dist/moment.js => generated 154671 bytes in 12 msecs via sendfile() (HTTP/1.1 200) 8 headers in 306 bytes (0 switches on core 0)
[pid: 84989|app: 0|req: 4/14] 127.0.0.1 () {46 vars in 999 bytes} [Wed Oct  2 12:38:12 2024] GET /static/node_modules/@fortawesome/fontawesome-free/webfonts/fa-solid-900.woff2 => generated 78268 bytes in 3 msecs via sendfile() (HTTP/1.1 200) 8 headers in 293 bytes (0 switches on core 0)
[pid: 84993|app: 0|req: 4/15] 127.0.0.1 () {44 vars in 853 bytes} [Wed Oct  2 12:38:12 2024] GET /static/fact_icon.ico => generated 5216 bytes in 0 msecs via sendfile() (HTTP/1.1 200) 8 headers in 300 bytes (0 switches on core 0)
[pid: 84989|app: 0|req: 4/16] 127.0.0.1 () {46 vars in 865 bytes} [Wed Oct  2 12:38:12 2024] GET /ajax/stats/system => generated 65 bytes in 39 msecs (HTTP/1.1 200) 3 headers in 85 bytes (1 switches on core 1)
[pid: 84991|app: 0|req: 3/17] 127.0.0.1 () {48 vars in 960 bytes} [Wed Oct  2 12:38:15 2024] GET /upload => generated 46073 bytes in 49 msecs (HTTP/1.1 200) 3 headers in 96 bytes (1 switches on core 0)
[pid: 84991|app: 0|req: 4/18] 127.0.0.1 () {48 vars in 998 bytes} [Wed Oct  2 12:38:15 2024] GET /static/node_modules/bootstrap/dist/css/bootstrap.min.css => generated 0 bytes in 1 msecs (HTTP/1.1 304) 5 headers in 206 bytes (0 switches on core 1)
[pid: 84987|app: 0|req: 6/19] 127.0.0.1 () {48 vars in 963 bytes} [Wed Oct  2 12:38:15 2024] GET /static/node_modules/jquery/dist/jquery.min.js => generated 0 bytes in 1 msecs (HTTP/1.1 304) 5 headers in 203 bytes (0 switches on core 1)
[pid: 84993|app: 0|req: 5/20] 127.0.0.1 () {48 vars in 977 bytes} [Wed Oct  2 12:38:15 2024] GET /static/node_modules/popper.js/dist/umd/popper.min.js => generated 0 bytes in 1 msecs (HTTP/1.1 304) 5 headers in 203 bytes (0 switches on core 1)
[pid: 84987|app: 0|req: 6/21] 127.0.0.1 () {48 vars in 1009 bytes} [Wed Oct  2 12:38:15 2024] GET /static/node_modules/@fortawesome/fontawesome-free/css/all.css => generated 0 bytes in 1 msecs (HTTP/1.1 304) 5 headers in 197 bytes (0 switches on core 0)
[pid: 84995|app: 0|req: 4/23] 127.0.0.1 () {44 vars in 797 bytes} [Wed Oct  2 12:38:15 2024] GET /static/js/upload.js => generated 2229 bytes in 1 msecs via sendfile() (HTTP/1.1 200) 8 headers in 302 bytes (0 switches on core 0)
[pid: 84995|app: 0|req: 4/23] 127.0.0.1 () {48 vars in 980 bytes} [Wed Oct  2 12:38:15 2024] GET /static/node_modules/bootstrap/dist/js/bootstrap.min.js => generated 0 bytes in 1 msecs (HTTP/1.1 304) 5 headers in 205 bytes (0 switches on core 1)
[pid: 84987|app: 0|req: 7/25] 127.0.0.1 () {48 vars in 922 bytes} [Wed Oct  2 12:38:15 2024] GET /static/js/quick_search.js => generated 0 bytes in 0 msecs (HTTP/1.1 304) 5 headers in 204 bytes (0 switches on core 1)
[pid: 84993|app: 0|req: 6/25] 127.0.0.1 () {48 vars in 1056 bytes} [Wed Oct  2 12:38:15 2024] GET /static/node_modules/bootstrap-datepicker/dist/css/bootstrap-datepicker.standalone.css => generated 0 bytes in 0 msecs (HTTP/1.1 304) 5 headers in 224 bytes (0 switches on core 0)
[pid: 84991|app: 0|req: 6/26] 127.0.0.1 () {48 vars in 956 bytes} [Wed Oct  2 12:38:15 2024] GET /static/node_modules/moment/dist/moment.js => generated 0 bytes in 1 msecs (HTTP/1.1 304) 5 headers in 200 bytes (0 switches on core 0)
[pid: 84989|app: 0|req: 6/27] 127.0.0.1 () {44 vars in 815 bytes} [Wed Oct  2 12:38:15 2024] GET /static/css/upload.css => generated 94 bytes in 1 msecs via sendfile() (HTTP/1.1 200) 8 headers in 292 bytes (0 switches on core 0)
[pid: 84989|app: 0|req: 6/28] 127.0.0.1 () {44 vars in 799 bytes} [Wed Oct  2 12:38:15 2024] GET /static/js/loading.js => generated 190 bytes in 1 msecs via sendfile() (HTTP/1.1 200) 8 headers in 301 bytes (0 switches on core 1)
[pid: 84991|app: 0|req: 6/29] 127.0.0.1 () {48 vars in 1017 bytes} [Wed Oct  2 12:38:15 2024] GET /static/node_modules/bootstrap-datepicker/dist/js/bootstrap-datepicker.js => generated 0 bytes in 1 msecs (HTTP/1.1 304) 5 headers in 213 bytes (0 switches on core 1)
[pid: 84987|app: 0|req: 8/30] 127.0.0.1 () {48 vars in 981 bytes} [Wed Oct  2 12:38:15 2024] GET /static/fact_logo_inv.svg => generated 0 bytes in 0 msecs (HTTP/1.1 304) 5 headers in 207 bytes (0 switches on core 0)
[pid: 84993|app: 0|req: 7/31] 127.0.0.1 () {44 vars in 853 bytes} [Wed Oct  2 12:38:15 2024] GET /static/Pacman.gif => generated 71915 bytes in 0 msecs via sendfile() (HTTP/1.1 200) 8 headers in 283 bytes (0 switches on core 1)
[pid: 84995|app: 0|req: 5/32] 127.0.0.1 () {56 vars in 1143 bytes} [Wed Oct  2 12:38:42 2024] POST /upload => generated 11313 bytes in 37 msecs (HTTP/1.1 200) 3 headers in 96 bytes (1 switches on core 0)
[pid: 84995|app: 0|req: 6/33] 127.0.0.1 () {48 vars in 998 bytes} [Wed Oct  2 12:38:42 2024] GET /static/node_modules/bootstrap/dist/css/bootstrap.min.css => generated 0 bytes in 1 msecs (HTTP/1.1 304) 5 headers in 206 bytes (0 switches on core 1)
[pid: 84993|app: 0|req: 8/35] 127.0.0.1 () {48 vars in 1056 bytes} [Wed Oct  2 12:38:42 2024] GET /static/node_modules/bootstrap-datepicker/dist/css/bootstrap-datepicker.standalone.css => generated 0 bytes in 0 msecs (HTTP/1.1 304) 5 headers in 224 bytes (0 switches on core 0)
[pid: 84987|app: 0|req: 9/35] 127.0.0.1 () {48 vars in 1009 bytes} [Wed Oct  2 12:38:42 2024] GET /static/node_modules/@fortawesome/fontawesome-free/css/all.css => generated 0 bytes in 1 msecs (HTTP/1.1 304) 5 headers in 197 bytes (0 switches on core 1)
[pid: 84989|app: 0|req: 8/36] 127.0.0.1 () {48 vars in 977 bytes} [Wed Oct  2 12:38:42 2024] GET /static/node_modules/popper.js/dist/umd/popper.min.js => generated 0 bytes in 1 msecs (HTTP/1.1 304) 5 headers in 203 bytes (0 switches on core 0)
[pid: 84991|app: 0|req: 8/37] 127.0.0.1 () {48 vars in 956 bytes} [Wed Oct  2 12:38:42 2024] GET /static/node_modules/moment/dist/moment.js => generated 0 bytes in 1 msecs (HTTP/1.1 304) 5 headers in 200 bytes (0 switches on core 1)
[pid: 84989|app: 0|req: 8/38] 127.0.0.1 () {48 vars in 980 bytes} [Wed Oct  2 12:38:42 2024] GET /static/node_modules/bootstrap/dist/js/bootstrap.min.js => generated 0 bytes in 1 msecs (HTTP/1.1 304) 5 headers in 205 bytes (0 switches on core 1)
[pid: 84991|app: 0|req: 8/39] 127.0.0.1 () {48 vars in 963 bytes} [Wed Oct  2 12:38:42 2024] GET /static/node_modules/jquery/dist/jquery.min.js => generated 0 bytes in 1 msecs (HTTP/1.1 304) 5 headers in 203 bytes (0 switches on core 0)
[pid: 84987|app: 0|req: 10/40] 127.0.0.1 () {48 vars in 1017 bytes} [Wed Oct  2 12:38:42 2024] GET /static/node_modules/bootstrap-datepicker/dist/js/bootstrap-datepicker.js => generated 0 bytes in 1 msecs (HTTP/1.1 304) 5 headers in 213 bytes (0 switches on core 0)
[pid: 84993|app: 0|req: 9/41] 127.0.0.1 () {48 vars in 922 bytes} [Wed Oct  2 12:38:42 2024] GET /static/js/quick_search.js => generated 0 bytes in 0 msecs (HTTP/1.1 304) 5 headers in 204 bytes (0 switches on core 1)
[pid: 84995|app: 0|req: 7/42] 127.0.0.1 () {48 vars in 981 bytes} [Wed Oct  2 12:38:42 2024] GET /static/fact_logo_inv.svg => generated 0 bytes in 1 msecs (HTTP/1.1 304) 5 headers in 207 bytes (0 switches on core 0)
[2024-10-02 12:38:43][connectionpool][WARNING]: Retrying (Retry(total=2, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0x784464617af0>: Failed to establish a new connection: [Errno 111] Connection refused')': /status
[2024-10-02 12:38:43][connectionpool][WARNING]: Retrying (Retry(total=1, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0x784464617d30>: Failed to establish a new connection: [Errno 111] Connection refused')': /status
[2024-10-02 12:38:43][connectionpool][WARNING]: Retrying (Retry(total=0, connect=None, read=None, redirect=None, status=None)) after connection broken by 'NewConnectionError('<urllib3.connection.HTTPConnection object at 0x784464617f10>: Failed to establish a new connection: [Errno 111] Connection refused')': /status
[2024-10-02 12:38:43][unpacking_scheduler][ERROR]: Could not fetch unpacking container logs
[2024-10-02 12:38:43][unpacking_scheduler][ERROR]: Exception happened during extraction of 325cb643b5f6849f9351fe8a21014e6a79687251603337ce0df02136799b7564_3889376.: Extraction container could not be reached.
Traceback (most recent call last):
  File "/home/oguzhanozgur/FACT_core/src/venv/lib/python3.10/site-packages/urllib3/connection.py", line 199, in _new_conn
    sock = connection.create_connection(
  File "/home/oguzhanozgur/FACT_core/src/venv/lib/python3.10/site-packages/urllib3/util/connection.py", line 85, in create_connection
    raise err
  File "/home/oguzhanozgur/FACT_core/src/venv/lib/python3.10/site-packages/urllib3/util/connection.py", line 73, in create_connection
    sock.connect(sa)
ConnectionRefusedError: [Errno 111] Connection refused

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/oguzhanozgur/FACT_core/src/venv/lib/python3.10/site-packages/urllib3/connectionpool.py", line 789, in urlopen
    response = self._make_request(
  File "/home/oguzhanozgur/FACT_core/src/venv/lib/python3.10/site-packages/urllib3/connectionpool.py", line 495, in _make_request
    conn.request(
  File "/home/oguzhanozgur/FACT_core/src/venv/lib/python3.10/site-packages/urllib3/connection.py", line 441, in request
    self.endheaders()
  File "/usr/lib/python3.10/http/client.py", line 1278, in endheaders
    self._send_output(message_body, encode_chunked=encode_chunked)
  File "/usr/lib/python3.10/http/client.py", line 1038, in _send_output
    self.send(msg)
  File "/usr/lib/python3.10/http/client.py", line 976, in send
    self.connect()
  File "/home/oguzhanozgur/FACT_core/src/venv/lib/python3.10/site-packages/urllib3/connection.py", line 279, in connect
    self.sock = self._new_conn()
  File "/home/oguzhanozgur/FACT_core/src/venv/lib/python3.10/site-packages/urllib3/connection.py", line 214, in _new_conn
    raise NewConnectionError(
urllib3.exceptions.NewConnectionError: <urllib3.connection.HTTPConnection object at 0x784464644130>: Failed to establish a new connection: [Errno 111] Connection refused

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/oguzhanozgur/FACT_core/src/venv/lib/python3.10/site-packages/requests/adapters.py", line 589, in send
    resp = conn.urlopen(
  File "/home/oguzhanozgur/FACT_core/src/venv/lib/python3.10/site-packages/urllib3/connectionpool.py", line 873, in urlopen
    return self.urlopen(
  File "/home/oguzhanozgur/FACT_core/src/venv/lib/python3.10/site-packages/urllib3/connectionpool.py", line 873, in urlopen
    return self.urlopen(
  File "/home/oguzhanozgur/FACT_core/src/venv/lib/python3.10/site-packages/urllib3/connectionpool.py", line 873, in urlopen
    return self.urlopen(
  File "/home/oguzhanozgur/FACT_core/src/venv/lib/python3.10/site-packages/urllib3/connectionpool.py", line 843, in urlopen
    retries = retries.increment(
  File "/home/oguzhanozgur/FACT_core/src/venv/lib/python3.10/site-packages/urllib3/util/retry.py", line 519, in increment
    raise MaxRetryError(_pool, url, reason) from reason  # type: ignore[arg-type]
urllib3.exceptions.MaxRetryError: HTTPConnectionPool(host='localhost', port=9900): Max retries exceeded with url: /status (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x784464644130>: Failed to establish a new connection: [Errno 111] Connection refused'))

During handling of the above exception, another exception occurred:

Traceback (most recent call last):
  File "/home/oguzhanozgur/FACT_core/src/unpacker/unpack_base.py", line 55, in _extract_with_worker
    response = container.start_unpacking(tmp_dir, timeout=WORKER_TIMEOUT)
  File "/home/oguzhanozgur/FACT_core/src/unpacker/extraction_container.py", line 118, in start_unpacking
    response = self._check_connection()
  File "/home/oguzhanozgur/FACT_core/src/unpacker/extraction_container.py", line 133, in _check_connection
    return session.get(url, timeout=5)
  File "/home/oguzhanozgur/FACT_core/src/venv/lib/python3.10/site-packages/requests/sessions.py", line 602, in get
    return self.request("GET", url, **kwargs)
  File "/home/oguzhanozgur/FACT_core/src/venv/lib/python3.10/site-packages/requests/sessions.py", line 589, in request
    resp = self.send(prep, **send_kwargs)
  File "/home/oguzhanozgur/FACT_core/src/venv/lib/python3.10/site-packages/requests/sessions.py", line 703, in send
    r = adapter.send(request, **kwargs)
  File "/home/oguzhanozgur/FACT_core/src/venv/lib/python3.10/site-packages/requests/adapters.py", line 622, in send
    raise ConnectionError(e, request=request)
requests.exceptions.ConnectionError: HTTPConnectionPool(host='localhost', port=9900): Max retries exceeded with url: /status (Caused by NewConnectionError('<urllib3.connection.HTTPConnection object at 0x784464644130>: Failed to establish a new connection: [Errno 111] Connection refused'))

The above exception was the direct cause of the following exception:

Traceback (most recent call last):
  File "/home/oguzhanozgur/FACT_core/src/scheduler/unpacking_scheduler.py", line 199, in work_thread
    extracted_objects = self.unpacker.unpack(task, tmp_dir, container)
  File "/home/oguzhanozgur/FACT_core/src/unpacker/unpack.py", line 42, in unpack
    extracted_files = self.extract_files_from_file(current_fo.file_path, tmp_dir, container)
  File "/home/oguzhanozgur/FACT_core/src/unpacker/unpack_base.py", line 41, in extract_files_from_file
    self._extract_with_worker(file_path, container, tmp_dir)
  File "/home/oguzhanozgur/FACT_core/src/unpacker/unpack_base.py", line 59, in _extract_with_worker
    raise ExtractionError('Extraction container could not be reached.') from error
unpacker.unpack_base.ExtractionError: Extraction container could not be reached.
[2024-10-02 12:38:43][unpacking_scheduler][INFO]: Unpacking completed: 325cb643b5f6849f9351fe8a21014e6a79687251603337ce0df02136799b7564_3889376 (extracted files: 0)
[2024-10-02 12:38:43][unpacking_scheduler][INFO]: Unpacking of firmware 325cb643b5f6849f9351fe8a21014e6a79687251603337ce0df02136799b7564_3889376 completed.
/home/oguzhanozgur/FACT_core/src/bin/internal_symlink_magic, 7: Warning: using regular magic file `/home/oguzhanozgur/FACT_core/src/bin/firmware'
/home/oguzhanozgur/FACT_core/src/bin/internal_symlink_magic, 7: Warning: using regular magic file `/home/oguzhanozgur/FACT_core/src/bin/firmware'
Process ExceptionSafeProcess-109:
Traceback (most recent call last):
  File "/usr/lib/python3.10/multiprocessing/process.py", line 314, in _bootstrap
    self.run()
  File "/home/oguzhanozgur/FACT_core/src/helperFunctions/process.py", line 93, in run
    raise exception
  File "/home/oguzhanozgur/FACT_core/src/helperFunctions/process.py", line 87, in run
    Process.run(self)
  File "/usr/lib/python3.10/multiprocessing/process.py", line 108, in run
    self._target(*self._args, **self._kwargs)
  File "/home/oguzhanozgur/FACT_core/src/scheduler/unpacking_scheduler.py", line 149, in extraction_loop
    self.check_pending()
  File "/home/oguzhanozgur/FACT_core/src/scheduler/unpacking_scheduler.py", line 173, in check_pending
    container.restart()
  File "/home/oguzhanozgur/FACT_core/src/unpacker/extraction_container.py", line 91, in restart
    self.stop()
  File "/home/oguzhanozgur/FACT_core/src/unpacker/extraction_container.py", line 67, in stop
    raise RuntimeError('Container is not running.')
RuntimeError: Container is not running.
[2024-10-02 12:38:44][scheduler][INFO]: Analysis Completed: 325cb643b5f6849f9351fe8a21014e6a79687251603337ce0df02136799b7564_3889376
[2024-10-02 12:38:44][analysis_status][INFO]: Analysis of firmware 325cb643b5f6849f9351fe8a21014e6a79687251603337ce0df02136799b7564_3889376 completed
[2024-10-02 12:38:57][unpacking_scheduler][INFO]: Queue Length (Analysis/Unpack): 0 / 0
^C[2024-10-02 12:39:05][fact_base][INFO]: Received signal 10. Shutting down FACT Backend...
[2024-10-02 12:39:06][fact_base][INFO]: Shutting down components of FACT Backend
[2024-10-02 12:39:07][back_end_binding][INFO]: Intercom offline
[2024-10-02 12:39:07][comparison_scheduler][INFO]: Comparison scheduler offline
[2024-10-02 12:39:08][unpacking_scheduler][INFO]: Unpacking scheduler offline
[2024-10-02 12:39:08][plugin][INFO]: <Worker name='source_code_analysis worker' parent=84398 started> received SIGTERM. Shutting down.
[2024-10-02 12:39:08][plugin][INFO]: <Worker name='file_type worker' parent=84398 started> received SIGTERM. Shutting down.
[2024-10-02 12:39:08][plugin][INFO]: <Worker name='file_system_metadata worker' parent=84398 started> received SIGTERM. Shutting down.
[2024-10-02 12:39:08][plugin][INFO]: <Worker name='crypto_hints worker' parent=84398 started> received SIGTERM. Shutting down.
[2024-10-02 12:39:08][plugin][INFO]: <Worker name='binwalk worker' parent=84398 started> received SIGTERM. Shutting down.
[2024-10-02 12:39:08][plugin][INFO]: <Worker name='device_tree worker' parent=84398 started> received SIGTERM. Shutting down.
[2024-10-02 12:39:08][plugin][INFO]: <Worker name='source_code_analysis worker' parent=84398 started> received SIGTERM. Shutting down.
[2024-10-02 12:39:08][plugin][INFO]: <Worker name='file_type worker' parent=84398 started> received SIGTERM. Shutting down.
[2024-10-02 12:39:08][plugin][INFO]: <Worker name='binwalk worker' parent=84398 started> received SIGTERM. Shutting down.
[2024-10-02 12:39:08][plugin][INFO]: <Worker name='crypto_hints worker' parent=84398 started> received SIGTERM. Shutting down.
[2024-10-02 12:39:08][plugin][INFO]: <Worker name='device_tree worker' parent=84398 started> received SIGTERM. Shutting down.
[2024-10-02 12:39:08][plugin][INFO]: <Worker name='file_system_metadata worker' parent=84398 started> received SIGTERM. Shutting down.
[2024-10-02 12:39:09][fact_base][INFO]: System memory usage: 17.1%; open file count: 6
[2024-10-02 12:39:10][fact_base][INFO]: System memory usage: 17.1%; open file count: 7
[2024-10-02 12:39:11][scheduler][INFO]: Analysis scheduler offline
[2024-10-02 12:39:11][process][CRITICAL]: SHUTTING DOWN SYSTEM
[2024-10-02 12:39:11][fact_base][INFO]: Received signal 10. Shutting down FACT Frontend...
[2024-10-02 12:39:15][fact_base][INFO]: Shutting down components of FACT Frontend
SIGINT/SIGTERM received...killing workers...
[deadlock-detector] a process holding a robust mutex died. recovering...
worker 1 buried after 1 seconds
worker 2 buried after 1 seconds
worker 3 buried after 1 seconds
worker 4 buried after 1 seconds
worker 5 buried after 1 seconds
goodbye to uWSGI.

[2024-10-02 12:39:26][fact_base][INFO]: Received signal 10. Shutting down FACT DB-Service...

[2024-10-02 12:39:29][fact_base][INFO]: Shutting down components of FACT DB-Service
(venv) oguzhanozgur@oguzhanozgur:~/FACT_core$ 
(venv) oguzhanozgur@oguzhanozgur:~/FACT_core$ 
(venv) oguzhanozgur@oguzhanozgur:~/FACT_core$ 

@jstucke
Collaborator

jstucke commented Oct 2, 2024

I think there is a problem with docker.

That's the only remaining source for this problem I can think of at this point. There are some problems that can occur if you install docker and then try to use it without restarting. Have you restarted the system since you installed docker (or FACT)? If not, that could also be worth a try.
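As a quick sanity check for Docker permissions (just an assumption on my side, this has not been confirmed as the cause in this thread), you could verify that the daemon is reachable without sudo and that your user is in the docker group:

docker info > /dev/null && echo "Docker daemon reachable"                   # must work without sudo for FACT
id -nG "$USER" | grep -qw docker || echo "user is NOT in the docker group"  # group changes only take effect after a re-login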

And if that wasn't the issue, could you try running this script, which starts the container manually using the Python Docker API?

from time import sleep

import docker

DOCKER_CLIENT = docker.from_env()

# start the extraction container detached via the Python Docker API (host port 9999 -> container port 5000)
container = DOCKER_CLIENT.containers.run(
    image='fkiecad/fact_extractor',
    ports={'5000/tcp': 9999},
    volumes={'/dev': {'bind': '/dev', 'mode': 'rw'}},
    privileged=True,
    detach=True,
    remove=True,
    entrypoint='gunicorn --timeout 600 -w 1 -b 0.0.0.0:5000 server:app',
)
# give gunicorn a moment to come up, then print the container logs and stop the container
sleep(3)
print(container.logs().decode())
container.stop()

@oguzhanozgur
Author

Where should I write this script? I didn't quite understand.

@jstucke
Collaborator

jstucke commented Oct 2, 2024

You could either write it to a file and run that file with python3 <file>, or simply start a Python shell by running python3 in the terminal and paste the contents into the shell (with Ctrl+Shift+V).
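A minimal example of the first option (the file name is arbitrary):

cat > /tmp/test_extractor.py     # paste the script, then press Ctrl+D to save
python3 /tmp/test_extractor.py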

@oguzhanozgur
Author

Here is the script result:

[screenshot: output of the script, which looks normal]

@jstucke
Collaborator

jstucke commented Oct 2, 2024

That also looks normal. I'm really not sure what to make of this: Starting the containers doesn't work when you start FACT, but it works when you do it manually. There could in theory be a problem with permissions on the mounted folders. Could you try it like this (so that it works exactly like it is called in FACT):

from multiprocessing import Manager
from pathlib import Path
from tempfile import TemporaryDirectory
from time import sleep

from config import load, backend
from unpacker.extraction_container import ExtractionContainer

# load the FACT configuration and make sure the docker mount base directory exists
load()
Path(backend.docker_mount_base_dir).mkdir(exist_ok=True)
tmp_dir = TemporaryDirectory(dir=backend.docker_mount_base_dir)
try:
    with Manager() as manager:
        # start a single extraction container the same way the unpacking scheduler does
        ec = ExtractionContainer(id_=1, tmp_dir=tmp_dir, value=manager.Value('i', 0))
        ec.start()
        sleep(3)
        # print the container logs, then shut the container down again
        container = ec._get_container()
        print(container.logs())
        ec.stop()
finally:
    tmp_dir.cleanup()

There is also one thing I did not ask about: How much RAM do you have in the system where FACT is running?

@oguzhanozgur
Author

I think there is a problem here:

[screenshot: running the script fails with an error]

@jstucke
Collaborator

jstucke commented Oct 7, 2024

I think there is a problem here:

This script must be executed from the src directory for the import paths to line up. Sorry for the confusion.

@oguzhanozgur
Author

Hi again,
Sorry for the late response.

[screenshot: the script fails because the directory does not exist yet]

@jstucke
Collaborator

jstucke commented Oct 9, 2024

Sorry, that was my fault: the directory does not exist yet, which causes the error. I updated the script. Also, you didn't answer this question:

How much RAM do you have in the system where FACT is running?

This can be a problem, because FACT is rather memory hungry. Normally it shouldn't be a problem during startup, though.

@oguzhanozgur
Author

Sorry, I didn't see that one.
I have 32 GB of RAM in total.

@oguzhanozgur
Author

oguzhanozgur commented Oct 9, 2024

By the way, I don't know why this happened, but I cannot start FACT anymore.
Here are the logs:

oguzhanozgur@oguzhanozgur:~$ source .venv/bin/activate
(.venv) oguzhanozgur@oguzhanozgur:~$ cd FACT_core/
(.venv) oguzhanozgur@oguzhanozgur:~/FACT_core$ ./start_all_installed_fact_components 
[2024-10-09 17:01:06][start_all_installed_fact_components][INFO]: starting database
[2024-10-09 17:01:06][start_all_installed_fact_components][INFO]: starting frontend
[2024-10-09 17:01:06][start_all_installed_fact_components][INFO]: starting backend
[2024-10-09 17:01:06][fact_base][INFO]: Starting FACT DB-Service @ 4.3-dev (adfbfe8f, Python 3.12.3)
[2024-10-09 17:01:06][fact_base][INFO]: Starting FACT Frontend @ 4.3-dev (adfbfe8f, Python 3.12.3)
[2024-10-09 17:01:06][__init__][INFO]: Alembic DB revision:  head: 05d8effce8b3, current: 05d8effce8b3
[2024-10-09 17:01:06][__init__][INFO]: Alembic DB revision:  head: 05d8effce8b3, current: 05d8effce8b3
[2024-10-09 17:01:06][fact_base][INFO]: Successfully started FACT DB-Service
[2024-10-09 17:01:06][install][ERROR]: Failed to run docker compose -f /home/oguzhanozgur/FACT_core/src/install/radare/docker-compose.yml up -d:
time="2024-10-09T17:01:06+03:00" level=warning msg="/home/oguzhanozgur/FACT_core/src/install/radare/docker-compose.yml: the attribute `version` is obsolete, it will be ignored, please remove it to avoid potential confusion"
 Network radare_default  Creating
 Network radare_default  Error
failed to create network radare_default: Error response from daemon: could not find an available, non-overlapping IPv4 address pool among the defaults to assign to the network

Traceback (most recent call last):
  File "/home/oguzhanozgur/FACT_core/src/../start_fact_frontend", line 92, in <module>
    FactFrontend().main()
    ^^^^^^^^^^^^^^
  File "/home/oguzhanozgur/FACT_core/src/../start_fact_frontend", line 72, in __init__
    run_cmd_with_logging(f'docker compose -f {COMPOSE_YAML} up -d')
  File "/home/oguzhanozgur/FACT_core/src/helperFunctions/install.py", line 221, in run_cmd_with_logging
    raise err
  File "/home/oguzhanozgur/FACT_core/src/helperFunctions/install.py", line 216, in run_cmd_with_logging
    subprocess.run(cmd_, stdout=PIPE, stderr=STDOUT, encoding='UTF-8', shell=shell, check=True, **kwargs)
  File "/usr/lib/python3.12/subprocess.py", line 571, in run
    raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['docker', 'compose', '-f', '/home/oguzhanozgur/FACT_core/src/install/radare/docker-compose.yml', 'up', '-d']' returned non-zero exit status 1.
[2024-10-09 17:01:06][fact_base][INFO]: Starting FACT Backend @ 4.3-dev (adfbfe8f, Python 3.12.3)
[2024-10-09 17:01:07][__init__][INFO]: Alembic DB revision:  head: 05d8effce8b3, current: 05d8effce8b3
[2024-10-09 17:01:07][ip_and_uri_finder_analysis][INFO]: ip signature path: /home/oguzhanozgur/.venv/lib/python3.12/site-packages/common_analysis_ip_and_uri_finder/yara_rules/ip_rules.yara
[2024-10-09 17:01:07][ip_and_uri_finder_analysis][INFO]: ip signature path: /home/oguzhanozgur/.venv/lib/python3.12/site-packages/common_analysis_ip_and_uri_finder/yara_rules/uri_rules.yara
[2024-10-09 17:01:08][compare][INFO]: Comparison plugins available: Software, File_Coverage, File_Header
[2024-10-09 17:01:08][scheduler][INFO]: Analysis scheduler online
[2024-10-09 17:01:08][scheduler][INFO]: Analysis plugins available: binwalk 1.0.0, cpu_architecture 0.4.0, crypto_hints 0.2.1, crypto_material 0.5.2, cve_lookup 0.1.0, cwe_checker 0.5.4, device_tree 2.0.0, elf_analysis 0.3.4, exploit_mitigations 0.2.0, file_hashes 1.2, file_system_metadata 1.0.0, file_type 1.0.0, hardware_analysis 0.2, hashlookup 0.1.4, information_leaks 0.2.0, init_systems 0.4.2, input_vectors 0.1.2, interesting_uris 0.1, ip_and_uri_finder 0.4.2, ipc_analyzer 0.1.1, kernel_config 0.3.1, known_vulnerabilities 0.3.0, printable_strings 0.3.4, qemu_exec 0.5.2, software_components 0.5.0, source_code_analysis 0.7.1, string_evaluator 0.2.1, tlsh 0.2, users_and_passwords 0.5.4
[2024-10-09 17:01:08][unpacking_scheduler][INFO]: Unpacking scheduler online
[2024-10-09 17:01:08][unpacking_scheduler][INFO]: Queue Length (Analysis/Unpack): 0 / 0
[2024-10-09 17:01:08][comparison_scheduler][INFO]: Comparison scheduler online
[2024-10-09 17:01:08][back_end_binding][INFO]: Intercom online
[2024-10-09 17:01:08][fact_base][INFO]: Successfully started FACT Backend
[2024-10-09 17:01:11][fact_base][INFO]: System memory usage: 23.0%; open file count: 7
[2024-10-09 17:01:13][fact_base][INFO]: System memory usage: 23.0%; open file count: 542

@jstucke
Collaborator

jstucke commented Oct 9, 2024

I have 32 GB of RAM in total.

That should be plenty.

failed to create network radare_default: Error response from daemon: could not find an available, non-overlapping IPv4 address pool among the defaults to assign to the network

This is not a problem I have encountered before. Maybe it is also related to the problems with starting the extractor containers. According to https://stackoverflow.com/questions/43720339/docker-error-could-not-find-an-available-non-overlapping-ipv4-address-pool-am, this may have something to do with a VPN running in the background. Could this be the problem in your case?
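If disconnecting from the VPN is not an option, a common workaround (not verified in this thread; the subnet below is only an example and must not collide with your VPN or LAN) is to give the Docker daemon a different default address pool in /etc/docker/daemon.json and restart it. If a daemon.json already exists, merge the key into it instead of overwriting the file:

sudo tee /etc/docker/daemon.json > /dev/null <<'EOF'
{
  "default-address-pools": [
    { "base": "10.210.0.0/16", "size": 24 }
  ]
}
EOF
sudo systemctl restart docker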

@oguzhanozgur
Author

Yes, you are right. I disconnected from the VPN and the tool worked again, but our problem is still continuing.

@jstucke
Collaborator

jstucke commented Oct 14, 2024

our problem is still continuing.

I'm sorry that the problem still persists. I'm still puzzled as to what the underlying issue could be here.
What you could also do is run FACT in a VM. There are also pre-built Vagrant VirtualBox images that you can download here: https://portal.cloud.hashicorp.com/vagrant/discover/fact-cad/FACT-master (you may need to add port forwarding to access the web interface from your host system, as the VM uses NAT by default).

@oguzhanozgur
Author

Hi again,

Vagrant also has the same issue. Also, I want to install the tool the normal way. I tried re-installation several times, but nothing changed. What should I do?

@jstucke
Collaborator

jstucke commented Nov 11, 2024

Vagrant also has the same issue

Are you sure that it is the same issue? Since it runs in a VM, this would not make a lot of sense. Maybe it has something to do with your hardware or your network.
