Merge branch 'master' into full-multisolve
g-poveda authored Jan 11, 2024
2 parents 4ee1b8f + b10e1ec commit 27b0df5
Showing 16 changed files with 528 additions and 495 deletions.
68 changes: 49 additions & 19 deletions .github/workflows/build.yml
@@ -201,6 +201,7 @@ jobs:
if: needs.setup.outputs.do_macos == 'true'
strategy:
matrix:
arch: ["arm64", "x86_64"] # NB: only x86_64 wheel will be tested as no macosx_arm64 github runner available
os: ${{ fromJSON(needs.setup.outputs.build).macos }}
python-version: ${{ fromJSON(needs.setup.outputs.python_version_per_os).macos }}
fail-fast: false
@@ -258,9 +259,6 @@ jobs:
if: steps.cache-build-dependencies.outputs.cache-hit != 'true'
run: echo "SKDECIDE_SKIP_DEPS=0" >> $GITHUB_ENV

- name: Install omp
run: brew install libomp

- name: Install and restore ccache
uses: hendrikmuhs/[email protected]
with:
@@ -273,30 +271,62 @@
echo "CMAKE_C_COMPILER_LAUNCHER=ccache" >> ${GITHUB_ENV}
- name: Build wheel
env:
ARCH: ${{ matrix.arch }}
PYTHON_VERSION: ${{ matrix.python-version }}
run: |
export "Boost_ROOT=$PWD/$BOOST_DIR"
export "OpenMP_ROOT=$(brew --prefix)/opt/libomp"
python -m pip install --upgrade pip
pip install build poetry-dynamic-versioning
# cross-compile for macosx-10.15
export MACOSX_DEPLOYMENT_TARGET=10.15
python -m build --sdist --wheel "--config-setting=--plat-name=macosx_10_15_x86_64"
# hack wheel name to be recognized by macos 10.15
wheel_name=$(ls dist/*.whl)
new_wheel_name=$(echo $wheel_name | sed -e 's/macosx_.*_x86_64.whl/macosx_10_15_x86_64.whl/')
echo "mv $wheel_name $new_wheel_name"
mv $wheel_name $new_wheel_name
if [[ "$ARCH" == arm64 ]]; then
# SciPy requires 12.0 on arm to prevent kernel panics
# https://github.com/scipy/scipy/issues/14688
# We use the same deployment target to match SciPy.
export MACOSX_DEPLOYMENT_TARGET=12.0
OPENMP_URL="https://anaconda.org/conda-forge/llvm-openmp/11.1.0/download/osx-arm64/llvm-openmp-11.1.0-hf3c4609_1.tar.bz2"
else
export MACOSX_DEPLOYMENT_TARGET=10.15
OPENMP_URL="https://anaconda.org/conda-forge/llvm-openmp/11.1.0/download/osx-64/llvm-openmp-11.1.0-hda6cdc1_1.tar.bz2"
fi
PYTHON_VERSION_WO_DOT=$(echo ${PYTHON_VERSION} | sed -e 's/\.//g') # remove "."
MACOSX_DEPLOYMENT_TARGET_WO_DOT=$(echo ${MACOSX_DEPLOYMENT_TARGET} | sed -e 's/\./_/g') # replace "." by "_"
# install appropriate version of openmp
sudo conda create -n build $OPENMP_URL
# make openmp and boost available
export Boost_ROOT=$PWD/$BOOST_DIR
export OpenMP_ROOT=$CONDA/envs/build
export CPPFLAGS="$CPPFLAGS -Xpreprocessor -fopenmp"
export CFLAGS="$CFLAGS -I$OpenMP_ROOT/include"
export CXXFLAGS="$CXXFLAGS -I$OpenMP_ROOT/include"
export LDFLAGS="$LDFLAGS -Wl,-rpath,$OpenMP_ROOT/lib -L$OpenMP_ROOT/lib -lomp"
# cmake flag to cross-compile the c++
export CMAKE_OSX_ARCHITECTURES=${ARCH}
python -m pip install cibuildwheel
# cibuildwheel flags
export CIBW_BUILD_FRONTEND="build"
export CIBW_ARCHS=${ARCH}
export CIBW_BUILD="cp${PYTHON_VERSION_WO_DOT}-macosx_${ARCH}"
# build wheel
python -m cibuildwheel --output-dir wheelhouse
# set the proper platform tag
# - with poetry build + cross-compilation for arm64, the tag could still be x86_64 (https://cibuildwheel.readthedocs.io/en/stable/faq/#how-to-cross-compile)
# - we downgrade the displayed macosx version to ensure compatibility with older macosx versions than the one used on this runner
pip install "wheel>=0.40"
wheel tags --platform-tag macosx_${MACOSX_DEPLOYMENT_TARGET_WO_DOT}_${ARCH} --remove wheelhouse/*.whl
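A minimal Python sketch (illustration only, not part of the workflow file) of the strings that the sed substitutions and exports above compute — the cibuildwheel build identifier and the platform tag later applied with `wheel tags`; the version, arch, and deployment-target values below are example inputs:

    # Example values only: PYTHON_VERSION and ARCH come from the job matrix,
    # MACOSX_DEPLOYMENT_TARGET from the arm64/x86_64 branch above.
    python_version = "3.10"
    arch = "arm64"
    deployment_target = "12.0"

    python_version_wo_dot = python_version.replace(".", "")           # "310"
    deployment_target_wo_dot = deployment_target.replace(".", "_")    # "12_0"

    cibw_build = f"cp{python_version_wo_dot}-macosx_{arch}"           # "cp310-macosx_arm64"
    platform_tag = f"macosx_{deployment_target_wo_dot}_{arch}"        # "macosx_12_0_arm64"
    print(cibw_build, platform_tag)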
- name: Update build cache from wheels
if: steps.cache-build-dependencies.outputs.cache-hit != 'true'
run: 7z x dist/*.whl -y
run: 7z x wheelhouse/*.whl -y

- name: Upload as build artifacts
uses: actions/upload-artifact@v3
with:
name: wheels
path: dist/*.whl
path: wheelhouse/*.whl

build-ubuntu:
needs: [setup]
@@ -553,7 +583,7 @@ jobs:
- name: Install scikit-decide and test dependencies
run: |
python_version=${{ matrix.python-version }}
wheelfile=$(ls ./wheels/scikit_decide*-cp${python_version/\./}-*macos*.whl)
wheelfile=$(ls ./wheels/scikit_decide*-cp${python_version/\./}-*macos*x86_64.whl)
pip install ${wheelfile}[all] pytest gymnasium[classic-control]
- name: Test with pytest
@@ -669,7 +699,7 @@ jobs:
strategy:
matrix:
os: [ubuntu-latest]
python-version: ["3.8"]
python-version: ["3.10"]
fail-fast: false
runs-on: ${{ matrix.os }}

62 changes: 50 additions & 12 deletions .github/workflows/release.yml
@@ -145,6 +145,7 @@ jobs:
needs: [setup]
strategy:
matrix:
arch: [ "arm64", "x86_64" ] # NB: only x86_64 wheel will be tested as no macosx_arm64 github runner available
os: ${{ fromJSON(needs.setup.outputs.build).macos }}
python-version: ${{ fromJSON(needs.setup.outputs.python_version_per_os).macos }}
fail-fast: false
@@ -202,26 +203,63 @@
if: steps.cache-build-dependencies.outputs.cache-hit != 'true'
run: echo "SKDECIDE_SKIP_DEPS=0" >> $GITHUB_ENV

- name: Install omp
run: brew install libomp

- name: Build wheel
env:
ARCH: ${{ matrix.arch }}
PYTHON_VERSION: ${{ matrix.python-version }}
run: |
export "Boost_ROOT=$PWD/$BOOST_DIR"
export "OpenMP_ROOT=$(brew --prefix)/opt/libomp"
python -m pip install --upgrade pip
pip install build poetry-dynamic-versioning
python -m build --sdist --wheel
if [[ "$ARCH" == arm64 ]]; then
# SciPy requires 12.0 on arm to prevent kernel panics
# https://github.com/scipy/scipy/issues/14688
# We use the same deployment target to match SciPy.
export MACOSX_DEPLOYMENT_TARGET=12.0
OPENMP_URL="https://anaconda.org/conda-forge/llvm-openmp/11.1.0/download/osx-arm64/llvm-openmp-11.1.0-hf3c4609_1.tar.bz2"
else
export MACOSX_DEPLOYMENT_TARGET=10.15
OPENMP_URL="https://anaconda.org/conda-forge/llvm-openmp/11.1.0/download/osx-64/llvm-openmp-11.1.0-hda6cdc1_1.tar.bz2"
fi
PYTHON_VERSION_WO_DOT=$(echo ${PYTHON_VERSION} | sed -e 's/\.//g') # remove "."
MACOSX_DEPLOYMENT_TARGET_WO_DOT=$(echo ${MACOSX_DEPLOYMENT_TARGET} | sed -e 's/\./_/g') # replace "." by "_"
# install appropriate version of openmp
sudo conda create -n build $OPENMP_URL
# make openmp and boost available
export Boost_ROOT=$PWD/$BOOST_DIR
export OpenMP_ROOT=$CONDA/envs/build
export CPPFLAGS="$CPPFLAGS -Xpreprocessor -fopenmp"
export CFLAGS="$CFLAGS -I$OpenMP_ROOT/include"
export CXXFLAGS="$CXXFLAGS -I$OpenMP_ROOT/include"
export LDFLAGS="$LDFLAGS -Wl,-rpath,$OpenMP_ROOT/lib -L$OpenMP_ROOT/lib -lomp"
# cmake flag to cross-compile the c++
export CMAKE_OSX_ARCHITECTURES=${ARCH}
python -m pip install cibuildwheel
# cibuildwheel flags
export CIBW_BUILD_FRONTEND="build"
export CIBW_ARCHS=${ARCH}
export CIBW_BUILD="cp${PYTHON_VERSION_WO_DOT}-macosx_${ARCH}"
# build wheel
python -m cibuildwheel --output-dir wheelhouse
# set the proper platform tag
# - with poetry build + cross-compilation for arm64, the tag could still be x86_64 (https://cibuildwheel.readthedocs.io/en/stable/faq/#how-to-cross-compile)
# - we downgrade the displayed macosx version to ensure compatibility with older macosx versions than the one used on this runner
pip install "wheel>=0.40"
wheel tags --platform-tag macosx_${MACOSX_DEPLOYMENT_TARGET_WO_DOT}_${ARCH} --remove wheelhouse/*.whl
- name: Update build cache from wheels
if: steps.cache-build-dependencies.outputs.cache-hit != 'true'
run: 7z x dist/*.whl -y
run: 7z x wheelhouse/*.whl -y

- name: Upload as build artifacts
uses: actions/upload-artifact@v3
with:
name: wheels
path: dist/*.whl
path: wheelhouse/*.whl

build-ubuntu:
needs: [setup]
@@ -465,7 +503,7 @@ jobs:
- name: Install scikit-decide and test dependencies
run: |
python_version=${{ matrix.python-version }}
wheelfile=$(ls ./wheels/scikit_decide*-cp${python_version/\./}-*macos*.whl)
wheelfile=$(ls ./wheels/scikit_decide*-cp${python_version/\./}-*macos*x86_64.whl)
pip install ${wheelfile}[all] pytest gymnasium[classic-control]
- name: Test with pytest
@@ -628,7 +666,7 @@ jobs:
runs-on: ubuntu-latest
env:
DOCS_VERSION_PATH: /
python_version: "3.8"
python_version: "3.10"

steps:
- name: Get scikit-decide release version and update online docs path
1 change: 0 additions & 1 deletion examples/full_multisolve.py
@@ -14,7 +14,6 @@
In doing so (`pip install gymnasium[accept-rom-license]`), you agree to own a license to these Atari 2600 ROMs
and agree not to distribute these ROMs.
If you still do not have the ROMs after that and get the following error:
> gymnasium.error.Error: We're Unable to find the game "MsPacman". Note: Gymnasium no longer distributes ROMs.
3 changes: 2 additions & 1 deletion notebooks/11_maze_tuto.ipynb
@@ -17,7 +17,8 @@
"Notes:\n",
"- In order to focus on scikit-decide use, we put some code not directly related to the library in a [separate module](./maze_utils.py) (like maze generation and display).\n",
"- A similar maze domain is already defined in [scikit-decide hub](https://github.com/airbus/scikit-decide/blob/master/skdecide/hub/domain/maze/maze.py) but we do not use it for the sake of this tutorial.\n",
"\n"
"- **Special notice for binder + sb3:**\n",
"it seems that [stable-baselines3](https://stable-baselines3.readthedocs.io/en/master/) algorithms are extremely slow on [binder](https://mybinder.org/). We could not find a proper explanation about it. We strongly advise you to either launch the notebook locally or on colab, or to skip the cells that are using sb3 algorithms (here PPO solver).\n"
]
},
{
5 changes: 4 additions & 1 deletion notebooks/12_gym_tuto.ipynb
@@ -28,7 +28,10 @@
" - Wrap a gymnasium environment in a scikit-decide domain;\n",
" - Use a classical RL algorithm like PPO to solve our problem;\n",
" - Give CGP (Cartesian Genetic Programming) a try on the same problem;\n",
" - Finally use IW (Iterated Width) coming from the planning community on the same problem."
" - Finally use IW (Iterated Width) coming from the planning community on the same problem.\n",
"\n",
"**Special notice for binder + sb3:**\n",
"it seems that [stable-baselines3](https://stable-baselines3.readthedocs.io/en/master/) algorithms are extremely slow on [binder](https://mybinder.org/). We could not find a proper explanation about it. We strongly advise you to either launch the notebook locally or on colab, or to skip the cells that are using sb3 algorithms (here PPO solver)."
]
},
{
66 changes: 29 additions & 37 deletions notebooks/15_flightplanning_tuto.ipynb
@@ -7,7 +7,7 @@
"source": [
"# Flight Planning Domain \n",
"\n",
"This notebook aims to make a short and interactive example of the Flight Planning Domain. You can find more information about it in the README file."
"This notebook aims to make a short and interactive example of the Flight Planning Domain. See the [online documentation](https://airbus.github.io/scikit-decide/reference/_skdecide.hub.domain.flight_planning.domain.html#flightplanningdomain) for more information."
]
},
{
@@ -87,21 +87,18 @@
"metadata": {},
"outputs": [],
"source": [
"import datetime\n",
"\n",
"from skdecide.hub.domain.flight_planning.domain import FlightPlanningDomain, WeatherDate\n",
"from skdecide.hub.domain.flight_planning.weather_interpolator.weather_tools.get_weather_noaa import (\n",
" get_weather_matrix,\n",
")\n",
"from skdecide.hub.domain.flight_planning.weather_interpolator.weather_tools.interpolator.GenericInterpolator import (\n",
" GenericWindInterpolator,\n",
")"
"from skdecide.hub.solver.astar import Astar"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"### Definition of the problem\n",
"## Definition of the problem\n",
"\n",
"Here we will make a short haul flight from Paris Charles de Gaulle airport (ICAO : LFPG) to Toulouse-Blagnac airport (ICAO: LFBO), using an airbus A320 aircraft."
]
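The code cell that sets the problem inputs is not shown in this diff; a hypothetical sketch consistent with the paragraph above (the names `origin`, `destination`, and `aircraft` mirror the domain factory arguments shown further down; the values come from the text):

    # Hypothetical cell contents (the actual notebook cell is elided from this diff):
    origin = "LFPG"       # Paris Charles de Gaulle (ICAO)
    destination = "LFBO"  # Toulouse-Blagnac (ICAO)
    aircraft = "A320"     # aircraft type passed to the domain factory below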
@@ -122,7 +119,7 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"Now, we are going to define a date, in the past 6 months, to get access to weather interpolation."
"Now, we are going to define a date that will be used for weather interpolation. If the data has not already been downloaded on your computer, be careful to choose a date within the past 6 months, so that the data is available on https://www.ncei.noaa.gov."
]
},
{
@@ -131,29 +128,27 @@
"metadata": {},
"outputs": [],
"source": [
"weather_date = WeatherDate(day=21, month=4, year=2023)\n",
"w_dict = weather_date.to_dict()\n",
"mat = get_weather_matrix(\n",
" year=w_dict[\"year\"],\n",
" month=w_dict[\"month\"],\n",
" day=w_dict[\"day\"],\n",
" forecast=w_dict[\"forecast\"],\n",
" delete_npz_from_local=False,\n",
" delete_grib_from_local=False,\n",
")\n",
"wind_interpolator = GenericWindInterpolator(file_npz=mat)"
"# we set a date valid for 4 months to avoid downloading weather data at each daily run.\n",
"today = datetime.date.today()\n",
"month = ((today.month) - 1) // 4 * 4 + 1 # will result in january, may, or september\n",
"year = today.year\n",
"day = 1\n",
"\n",
"weather_date = WeatherDate(day=day, month=month, year=year)\n",
"print(weather_date)"
]
},
{
"attachments": {},
"cell_type": "markdown",
"metadata": {},
"source": [
"We can now define heuristic & cost function, to feed the A* solver. This aims to guide him along the airways graph to solve the problem, and get as close as possible to the optimal solution. \n",
"We can now define heuristic and cost function, to feed the A* solver. This aims to guide him along the airways graph to solve the problem, and get as close as possible to the optimal solution. \n",
"\n",
"The heuristic function can be either fuel, time, distance, lazy_fuel, lazy_time and None. If None, the A* will behave as a Dijkstra like search algorithm, as we give a 0 value to the A* algorithm. \n",
"The heuristic function can be either `\"fuel\"`, `\"time\"`, `\"distance\"`, `\"lazy_fuel\"`, `\"lazy_time\"`, or `None`. If `None`, the A* will behave as a Dijkstra-like search algorithm, as we give a 0 value to the A* algorithm. \n",
"\n",
"The cost function can be either fuel, time and distance. It will define the cost of the flight plan, computed during the state to state flight simulation. "
"The cost function can be either `\"fuel\"`, `\"time\"`, or `\"distance\"`. \n",
"It will define the cost of the flight plan, computed during the state-to-state flight simulation. "
]
},
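The notebook cell that chooses these two options is likewise elided from this diff; a hypothetical example (the names `heuristic` and `cost_function` match the `heuristic_name` and `objective` arguments passed to the domain factory below; the chosen values are just one possibility):

    # Hypothetical choice of solver options:
    heuristic = "lazy_fuel"   # any of the values listed above; None would give Dijkstra-like behaviour
    cost_function = "fuel"    # objective accumulated during the state-to-state flight simulation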
{
@@ -171,11 +166,9 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"### Definition of the Domain and Optional features \n",
"## Definition of the corresponding domain\n",
"\n",
"There is many other optionnal features, as described in the README. We are now going to define the domain. \n",
"\n",
"It will initialize the Flight Planning Domain with the given features, and can take some time, especially if it needs to download the differents weather files, and if you ask for a fuel loop. \n"
"We are now going to define the domain. It can take some time, especially if it needs to download some weather files, or if you ask for a fuel loop. \n"
]
},
{
@@ -189,7 +182,6 @@
" destination,\n",
" aircraft,\n",
" weather_date=weather_date,\n",
" wind_interpolator=wind_interpolator,\n",
" heuristic_name=heuristic,\n",
" objective=cost_function,\n",
" fuel_loop=False,\n",
@@ -203,7 +195,10 @@
"cell_type": "markdown",
"metadata": {},
"source": [
"Then we can solve the problem : (can take time depending on )"
"## Solving and rendering out the flight planning\n",
"\n",
"We use here an A* solver as mentioned before. \n",
"We also use the custom rollout proposed to have some visualization during the flight planning generation."
]
},
{
@@ -212,15 +207,12 @@
"metadata": {},
"outputs": [],
"source": [
"domain.solve(domain_factory, make_img=True)"
"with Astar(\n",
" heuristic=lambda d, s: d.heuristic(s), domain_factory=domain_factory\n",
") as solver:\n",
" domain.solve_with(solver=solver, domain_factory=domain_factory)\n",
" domain.custom_rollout(solver=solver)"
]
},
{
"cell_type": "code",
"execution_count": null,
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {