Use gRPC gateway to create openapi.yaml from bmi.proto
sverhoeven committed Nov 13, 2023
1 parent 4671c61 commit 66449be
Showing 4 changed files with 1,165 additions and 0 deletions.
1 change: 1 addition & 0 deletions docs/index.rst
@@ -18,6 +18,7 @@ Welcome to grpc4bmi's documentation!
container/building
container/usage
cli
openapi
python_api

Indices and tables
79 changes: 79 additions & 0 deletions docs/openapi.rst
@@ -0,0 +1,79 @@
OpenAPI
=======

Your model might be written in a language that lacks gRPC support.
In that case, you can use the OpenAPI specification to wrap your model in a JSON web service.
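The idea can be illustrated with a minimal, self-contained sketch that exposes a toy model as a JSON web service using only the Python standard library. The `ToyModel` class and the URL paths here are hypothetical placeholders for illustration, not part of grpc4bmi or the generated BMI OpenAPI spec:

```python
import json
import threading
from http.server import BaseHTTPRequestHandler, HTTPServer

class ToyModel:
    """Stand-in for a real BMI model (hypothetical, for illustration only)."""
    def __init__(self):
        self.time = 0.0

    def update(self):
        self.time += 1.0

    def get_current_time(self):
        return self.time

model = ToyModel()

class BmiHandler(BaseHTTPRequestHandler):
    def _reply(self, payload):
        # Serialize the payload and send it back as a JSON response.
        body = json.dumps(payload).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def do_POST(self):
        # Map a URL path to the corresponding BMI call.
        if self.path == "/update":
            model.update()
            self._reply({})
        else:
            self.send_error(404)

    def do_GET(self):
        if self.path == "/get_current_time":
            self._reply({"time": model.get_current_time()})
        else:
            self.send_error(404)

    def log_message(self, *args):
        # Keep the example quiet; remove this to see request logs.
        pass

def serve(port=8080):
    """Start the JSON web service in a background thread and return the server."""
    server = HTTPServer(("127.0.0.1", port), BmiHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server
```

A real service would instead implement the paths and schemas from the generated `bmi.swagger.yaml`, but the shape is the same: each route forwards to one BMI method.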

Generate spec
-------------

The OpenAPI spec can be generated from the gRPC spec using the `protoc-gen-openapiv2` plugin
from the grpc-gateway project at https://github.com/grpc-ecosystem/grpc-gateway/

```bash
protoc -I . --openapiv2_out . \
    --openapiv2_opt=output_format=yaml \
    --openapiv2_opt=generate_unbound_methods=true \
    ./proto/grpc4bmi/bmi.proto
```

Generate Python client
----------------------

```bash
npx --package @openapitools/openapi-generator-cli openapi-generator-cli generate -i proto/grpc4bmi/bmi.swagger.yaml -g python -o openapi/python-client
```

Consuming
---------

To consume a web service that implements the BMI OpenAPI specification, use the Python client:

```python
from grpc4bmi.bmi_openapi_client import BmiOpenApiClient

model = BmiOpenApiClient(host='localhost', port=50051, timeout=10)
model.initialize(config_file)  # config_file: path to your model's configuration file
model.update()
```

To spin up a web service inside a container together with a client in one go, use:

```python
from grpc4bmi.bmi_openapi_client import BmiOpenApiApptainerClient, BmiOpenApiDockerClient

model = BmiOpenApiApptainerClient(
    image='wflowjl.sif', work_dir='/tmp/workdir', input_dirs=[]
)
model = BmiOpenApiDockerClient(
    image='ghcr.io/eWatercycle/wflowjl', work_dir='/tmp/workdir', input_dirs=[]
)
```

Providing
---------

To provide a web service that implements the BMI OpenAPI specification, you will need to implement the web service in the language of your model.

Python
~~~~~~

Generate the server stubs with:

```shell
npx --package @openapitools/openapi-generator-cli openapi-generator-cli generate -i proto/grpc4bmi/bmi.swagger.yaml -g python-fastapi -o openapi/python-server
```

Inside each generated stub, call the corresponding BMI method of your model.

Since gRPC supports Python directly, you should normally not need the OpenAPI route for a Python model.
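The delegation pattern is the same in every language: each stub body is a single call into your model. A minimal Python sketch follows; the class and handler names are illustrative, not the names openapi-generator actually produces:

```python
class MyBmiModel:
    """Stand-in for your real model (illustrative only)."""
    def __init__(self):
        self.initialized_with = None
        self.steps = 0

    def initialize(self, config_file):
        self.initialized_with = config_file

    def update(self):
        self.steps += 1

model = MyBmiModel()

def initialize_handler(config_file: str) -> None:
    # Body of the generated `initialize` stub: forward to the model.
    model.initialize(config_file)

def update_handler() -> None:
    # Body of the generated `update` stub: forward to the model.
    model.update()
```

The same one-line-per-stub delegation applies to the Julia server stubs below.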

Julia
~~~~~

Generate the server stubs with:

```shell
npx --package @openapitools/openapi-generator-cli openapi-generator-cli generate -i proto/grpc4bmi/bmi.swagger.yaml -g julia-server -o openapi/julia-server
```

Inside each generated stub, call the corresponding BMI method of your model.
1 change: 1 addition & 0 deletions proto/grpc4bmi/bmi.proto
@@ -1,6 +1,7 @@
syntax = "proto3";

package bmi;
option go_package = "github.com/eWatercycle/grpc4bmi";

message Empty{}

