chore: Upgrade BE deps (SCE-18) (#1092)
kaloster authored Sep 6, 2024
1 parent 5607bf3 commit 284dd6c
Showing 10 changed files with 68 additions and 59 deletions.
12 changes: 6 additions & 6 deletions .github/workflows/push_tests.yml
@@ -19,10 +19,10 @@ jobs:
- uses: actions/checkout@v3
- run: |
git fetch --depth=1 origin +${{github.base_ref}}
- name: Set up Python 3.8
- name: Set up Python 3.11
uses: actions/setup-python@v1
with:
python-version: 3.8
python-version: 3.11
- name: Node cache
uses: actions/cache@v1
with:
@@ -57,10 +57,10 @@ jobs:
timeout-minutes: 10
steps:
- uses: actions/checkout@v2
- name: Set up Python 3.8
- name: Set up Python 3.11
uses: actions/setup-python@v1
with:
python-version: 3.8
python-version: 3.11
- name: Python cache
uses: actions/cache@v1
with:
@@ -90,10 +90,10 @@ jobs:
with:
# Chromatic needs full Git history graph
fetch-depth: 0
- name: Set up Python 3.8
- name: Set up Python 3.11
uses: actions/setup-python@v1
with:
python-version: 3.8
python-version: 3.11
- name: Python cache
uses: actions/cache@v1
with:
6 changes: 2 additions & 4 deletions .github/workflows/scale-test.yml
@@ -9,10 +9,10 @@ jobs:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- name: Set up Python 3.8
- name: Set up Python 3.11
uses: actions/setup-python@v1
with:
python-version: 3.8
python-version: 3.11
- name: Install dependencies
run: |
pip install -r server/tests/locust/requirements-locust.txt
@@ -26,5 +26,3 @@ jobs:
DEV_STATS=$(tail -n 15 locust_dev_stats.txt)
DEV_MSG="\`\`\`CELLXGENE EXPLORER DEV SCALE TEST RESULTS: ${DEV_STATS}\`\`\`"
curl -X POST -H 'Content-type: application/json' --data "{'text':'${DEV_MSG}'}" $SLACK_WEBHOOK
2 changes: 1 addition & 1 deletion .infra/rdev/values.yaml
@@ -2,7 +2,7 @@ stack:
services:
explorer:
image:
tag: sha-d6c2ebd2
tag: sha-03aea9e9
replicaCount: 1
env:
# env vars common to all deployment stages
50 changes: 28 additions & 22 deletions dev_docs/developer_scripts.md
@@ -11,20 +11,22 @@ $PROJECT_ROOT`.
### Build

**Usage:** from the `$PROJECT_ROOT` directory run:
* `make build` builds whole app client and server
* `make build-client` runs webpack build
* `make build-for-server-dev` builds client and copies output directly into

- `make build` builds whole app client and server
- `make build-client` runs webpack build
- `make build-for-server-dev` builds client and copies output directly into
source tree (only for server development)

### Clean

Deletes generated files.

**Usage:** from the `$PROJECT_ROOT` directory run:
* `make clean` cleans everything including node modules (means build will take

- `make clean` cleans everything including node modules (means build will take
a while
* `make clean-lite` cleans built directories
* `make clean-server` cleans source tree
- `make clean-lite` cleans built directories
- `make clean-server` cleans source tree

### Release

@@ -35,7 +37,8 @@ See `release_process.md`.
Installs requirements files.

**Usage:** from the `$PROJECT_ROOT` directory run:
* `make dev-env` installs requirements and requirements-dev (for building code)

- `make dev-env` installs requirements and requirements-dev (for building code)

## Client-level scripts

@@ -46,13 +49,14 @@ Installs requirements files.
**About** Serve the current client javascript independently from the `server` code.

**Requires**
* The server to be running. Best way to do this is with [backend_dev](#backend_dev).
* `make ci` to install the necessary node modules

- The server to be running. Best way to do this is with [backend_dev](#backend_dev).
- `make ci` to install the necessary node modules

**Usage:** from the `$PROJECT_ROOT/client` directory run `make start-frontend`

NB: the frontend server reads in the desired base_url and dataset name to form the complete url base for API calls. *In
order to use an arbitrary dataset successfully, the frontend server must be started **after** the backend server*, which
NB: the frontend server reads in the desired base_url and dataset name to form the complete url base for API calls. _In
order to use an arbitrary dataset successfully, the frontend server must be started **after** the backend server_, which
writes out the given base_url and dataset anew each time.

#### backend_dev
@@ -68,33 +72,35 @@ environment and installs explorer requirements from the current branch.
**Usage:** from the `$PROJECT_ROOT` directory run `./scripts/backend_dev`

**Options:**
* In parallel, you can then launch the node development server to serve the

- In parallel, you can then launch the node development server to serve the
current state of the FE with [`start-frontend`](#start-frontend), usually in
a different terminal tab.
* You can use a specific dataroot using `./launch_dev_server.sh <custom_dataroot>`.
* You can also pass (current/desktop/legacy) cli options to the `./launch_dev_server.sh` command.
- You can use a specific dataroot using `./launch_dev_server.sh <custom_dataroot>`.
- You can also pass (current/desktop/legacy) cli options to the `./launch_dev_server.sh` command.

**Breakdown**

| command | purpose |
| ---------------------------------------- | ----------------------------------------------------------- |
| python3.6 -m venv explorer | creates explorer virtual environment |
| source explorer/bin/activate | activates virtual environment |
| ./launch_dev_server.sh [cli options] | launches api server (can supply arbitrary config) |
| command | purpose |
| ------------------------------------ | ------------------------------------------------- |
| python3.11 -m venv explorer | creates explorer virtual environment |
| source explorer/bin/activate | activates virtual environment |
| ./launch_dev_server.sh [cli options] | launches api server (can supply arbitrary config) |

### Client test scripts

Methods used to test the client javascript code

**Usage:** from the `$PROJECT_ROOT/client` directory run:
* `make unit-test` Runs all unit tests. It excludes any tests in the e2e

- `make unit-test` Runs all unit tests. It excludes any tests in the e2e
folder. This is used by travis to run unit tests.
* `make smoke-test` Starts backend development server and runs end to end
- `make smoke-test` Starts backend development server and runs end to end
tests. This is what travis runs. It depends on the `e2e` and the
`backend-dev` targets. One starts the server, the other runs the tests. If
developing a front-end feature and just checking if tests pass, this is
probably the one you want to run.
* `npm run e2e` Runs backend tests without starting the server. You will need to
- `npm run e2e` Runs backend tests without starting the server. You will need to
start the rest api separately with the pbmc3k.cxg file. Note you can use
the `JEST_ENV` environment variable to change how JEST runs in the browser.
The test runs against `localhost:3000` by default. You can use the
9 changes: 8 additions & 1 deletion server/common/fbs/matrix.py
@@ -104,7 +104,14 @@ def encode_matrix_fbs(matrix, row_idx=None, col_idx=None, num_bins=None):
matrix = serialize_matrix(builder, n_rows, n_cols, matrix_column_vec, cidx)

builder.Finish(matrix)
return builder.Output()

output = builder.Output()

# Werkzeug has strict requirements that the response data must be in bytes
if isinstance(output, bytearray):
output = bytes(output)

return output


def decode_matrix_fbs(fbs):
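
Note on the hunk above: flatbuffers' `Builder.Output()` returns a `bytearray`, and the comment added in the diff says Werkzeug is strict about response bodies being `bytes`. A minimal sketch of the resulting pattern (the route and placeholder payload below are illustrative, not taken from the repo):

```python
from flask import Flask, make_response

app = Flask(__name__)

@app.route("/fbs-demo")
def fbs_demo():
    # Placeholder standing in for builder.Output(), which flatbuffers
    # returns as a bytearray rather than bytes.
    payload = bytearray(b"\x00\x01\x02\x03")
    # Convert to bytes before handing the body to Werkzeug, mirroring the
    # change in encode_matrix_fbs above.
    response = make_response(bytes(payload))
    response.headers["Content-Type"] = "application/octet-stream"
    return response
```
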
2 changes: 1 addition & 1 deletion server/common/rest.py
@@ -6,10 +6,10 @@
import sys
import zlib
from http import HTTPStatus
from urllib.parse import unquote as url_unquote

import requests
from flask import abort, current_app, jsonify, make_response, redirect
from werkzeug.urls import url_unquote

from server.app.api.util import get_dataset_artifact_s3_uri
from server.common.config.client_config import get_client_config
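
Note on the import swap above: `werkzeug.urls.url_unquote` was removed in Werkzeug 3.x (after deprecation in 2.3), so the standard-library `urllib.parse.unquote` takes its place. A quick sketch of the equivalent call (the example value is illustrative):

```python
from urllib.parse import unquote as url_unquote

# Behaves like the removed werkzeug.urls.url_unquote for percent-encoded input.
print(url_unquote("pbmc3k%2Ecxg"))  # -> pbmc3k.cxg
```
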
6 changes: 3 additions & 3 deletions server/dataset/dataset.py
@@ -180,7 +180,7 @@ def update_parameters(self, parameters):
parameters.update(self.parameters)

def _index_filter_to_mask(self, filter, count):
mask = np.zeros((count,), dtype=np.bool)
mask = np.zeros((count,), dtype=np.bool_)
for i in filter:
if isinstance(i, list):
mask[i[0] : i[1]] = True
@@ -189,7 +189,7 @@ def _index_filter_to_mask(self, filter, count):
return mask

def _axis_filter_to_mask(self, axis, filter, count):
mask = np.ones((count,), dtype=np.bool)
mask = np.ones((count,), dtype=np.bool_)
if "index" in filter:
mask = np.logical_and(mask, self._index_filter_to_mask(filter["index"], count))
if "annotation_value" in filter:
@@ -198,7 +198,7 @@ def _axis_filter_to_mask(self, axis, filter, count):
return mask

def _annotation_filter_to_mask(self, axis, filter, count):
mask = np.ones((count,), dtype=np.bool)
mask = np.ones((count,), dtype=np.bool_)
for v in filter:
name = v["name"]
if axis == Axis.VAR:
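
Note on the `np.bool` to `np.bool_` change above: the `np.bool` alias was deprecated in NumPy 1.20 and removed in 1.24, which the `numpy>=1.24.0` pin below requires. A tiny sketch of the mask allocation (shape and indices are illustrative):

```python
import numpy as np

# np.bool_ is the NumPy boolean scalar type; plain `bool` would work equally well.
mask = np.zeros((5,), dtype=np.bool_)
mask[1:3] = True
print(mask)  # [False  True  True False False]
```
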
23 changes: 11 additions & 12 deletions server/requirements.txt
@@ -1,26 +1,25 @@
anndata==0.8.0
anndata==0.10.9
bitarray==2.7.3
boto3==1.26.94
click==8.1.3
envyaml==1.10.211231
flask==2.2.3
flask==2.2.5
flask-compress==1.13
flask-cors==3.0.10
flask-cors==5.0.0
flask-restful==0.3.9
flask-talisman==1.0.0
flatbuffers>=1.11.0,<2.0.0 # cellxgene is not compatible with 2.0.0. Requires migration
flatten_dict==0.4.2
fsspec>=0.4.4,<0.8.0
gunicorn[gevent]==20.0.4
numba==0.56.4
numpy==1.23.5
pandas==1.5.3
pillow==9.5.0
pydantic==1.10.6
requests==2.28.2
scipy==1.10.1
numba==0.60.0
numpy>=1.24.0,<2.1.0
pandas>=2.2.2
pydantic==1.10.13
requests==2.32.3
scipy==1.11.1
flask-server-timing==0.1.2
s3fs==0.4.2
tiledb==0.25.0 # Explorer's major/minor tiledb version should always be the >= Portal's tiledb major/minor version (for read/write compatibility)
Werkzeug==2.2.3 # 2.3.0 breaks the binary endpoints
tiledb # unpinned to match Portal's requirements
Werkzeug==3.0.4
python-json-logger==2.0.7
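
A quick way to confirm the upgraded pins are what actually landed in the virtualenv (a sketch, not part of the commit; the package list below is a sample, adjust as needed):

```python
from importlib.metadata import version

# Spot-check a few of the packages pinned in server/requirements.txt.
for pkg in ["anndata", "flask", "numpy", "pandas", "werkzeug", "tiledb"]:
    print(f"{pkg}=={version(pkg)}")
```
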
4 changes: 2 additions & 2 deletions server/tests/unit/common/apis/test_api_v3.py
@@ -453,7 +453,7 @@ def test_fbs_default(self):
url = f"{url_base}{endpoint}"
headers = {"Accept": "application/octet-stream"}
result = self.client.put(url, headers=headers)
self.assertEqual(result.status_code, HTTPStatus.BAD_REQUEST)
self.assertEqual(result.status_code, HTTPStatus.UNSUPPORTED_MEDIA_TYPE)

filter = {"filter": {"var": {"index": [0, 1, 4]}}}
result = self.client.put(url, headers=headers, json=filter)
@@ -466,7 +466,7 @@ def test_data_put_fbs(self):
url = f"{url_base}{endpoint}"
header = {"Accept": "application/octet-stream"}
result = self.client.put(url, headers=header)
self.assertEqual(result.status_code, HTTPStatus.BAD_REQUEST)
self.assertEqual(result.status_code, HTTPStatus.UNSUPPORTED_MEDIA_TYPE)

def test_data_get_fbs(self):
endpoint = "data/var"
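
Note on the expected status codes above: the Werkzeug 2.3/3.x line this commit moves to raises `415 Unsupported Media Type` from `request.get_json()` when the request lacks an `application/json` Content-Type, where the previously pinned versions surfaced `400 Bad Request`. A minimal sketch of that behaviour (the route name and test-client calls are illustrative, assuming the Flask/Werkzeug versions from `server/requirements.txt`):

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

@app.route("/demo", methods=["PUT"])
def demo():
    # On Werkzeug 2.3+/3.x this raises UnsupportedMediaType (415) when no
    # application/json Content-Type is sent; older releases raised 400.
    return jsonify(request.get_json())

if __name__ == "__main__":
    with app.test_client() as client:
        print(client.put("/demo").status_code)                     # expected: 415
        print(client.put("/demo", json={"ok": True}).status_code)  # expected: 200
```
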
13 changes: 6 additions & 7 deletions server/tests/unit/common/utils/test_type_conversion_utils.py
@@ -42,7 +42,7 @@ def test__get_schema_type_hint_from_dtype(self):
with self.assertRaises(TypeError):
get_schema_type_hint_from_dtype(np.dtype(dtype))

for dtype in [np.float16, np.float32, np.float64]:
for dtype in [np.float32, np.float64]:
self.assertEqual(get_schema_type_hint_from_dtype(np.dtype(dtype)), {"type": "float32"})

for dtype in [np.dtype(object), np.dtype(str)]:
@@ -126,9 +126,9 @@ def __exit__(self, exc_type, exc_val, exc_tb):
"data": data,
"expected_encoding_dtype": np.float32,
"expected_schema_hint": {"type": "float32"},
"logs": None if data.dtype != np.float64 else {"level": logging.WARNING, "output": "may lose precision"},
"logs": None if dtype == np.float32 else {"level": logging.WARNING, "output": "may lose precision"},
}
for dtype in [np.float16, np.float32, np.float64]
for dtype in [np.float32, np.float64]
for data in [
np.arange(-128, 1000, dtype=dtype),
pd.Series(np.arange(-128, 1000, dtype=dtype)),
@@ -201,9 +201,9 @@ def __exit__(self, exc_type, exc_val, exc_tb):
"data": data,
"expected_encoding_dtype": np.float32,
"expected_schema_hint": {"type": "categorical"},
"logs": {"level": logging.WARNING, "output": "may lose precision"},
"logs": None if dtype == np.float32 else {"level": logging.WARNING, "output": "may lose precision"},
}
for dtype in [np.float16, np.float32, np.float64]
for dtype in [np.float32, np.float64]
for data in [
pd.Series(np.array([0, 1, 2], dtype=dtype), dtype="category"),
pd.Series(np.array([0, 1, 2], dtype=dtype), dtype="category").cat.remove_categories([1]),
@@ -216,7 +216,7 @@ def __exit__(self, exc_type, exc_val, exc_tb):
"data": data,
"expected_encoding_dtype": np.float32,
"expected_schema_hint": {"type": "categorical"},
"logs": {"level": logging.WARNING, "output": "may lose precision"},
"logs": None if dtype == np.float32 else {"level": logging.WARNING, "output": "may lose precision"},
}
for dtype in [
np.int8,
Expand All @@ -227,7 +227,6 @@ def __exit__(self, exc_type, exc_val, exc_tb):
np.uint32,
np.int64,
np.uint64,
np.float16,
np.float32,
np.float64,
]
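
Note on the test changes above: `np.float16` is dropped from the parameterized dtypes, and the "may lose precision" warning expectation now keys off the input dtype, firing for `float64` but not `float32`. A small sketch of the precision point, independent of the helper under test:

```python
import numpy as np

# Downcasting float64 to float32 for encoding can change the value...
x64 = np.float64(1.0) / np.float64(3.0)
print(np.float32(x64) == x64)   # False: hence the "may lose precision" warning

# ...while float32 input is already at the target precision.
x32 = np.float32(1.0) / np.float32(3.0)
print(np.float32(x32) == x32)   # True: no warning expected
```
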
