Code examples in the documentation should be formatted with Black #1718
Tried this:
Got this:

diff --git a/docs/authentication.rst b/docs/authentication.rst
index 0d98cf8..8008023 100644
--- a/docs/authentication.rst
+++ b/docs/authentication.rst
@@ -381,11 +381,7 @@ Authentication plugins can set signed ``ds_actor`` cookies themselves like so:
.. code-block:: python
response = Response.redirect("/")
- response.set_cookie("ds_actor", datasette.sign({
- "a": {
- "id": "cleopaws"
- }
- }, "actor"))
+ response.set_cookie("ds_actor", datasette.sign({"a": {"id": "cleopaws"}}, "actor"))
Note that you need to pass ``"actor"`` as the namespace to :ref:`datasette_sign`.
@@ -412,12 +408,16 @@ To include an expiry, add a ``"e"`` key to the cookie value containing a `base62
expires_at = int(time.time()) + (24 * 60 * 60)
response = Response.redirect("/")
- response.set_cookie("ds_actor", datasette.sign({
- "a": {
- "id": "cleopaws"
- },
- "e": baseconv.base62.encode(expires_at),
- }, "actor"))
+ response.set_cookie(
+ "ds_actor",
+ datasette.sign(
+ {
+ "a": {"id": "cleopaws"},
+ "e": baseconv.base62.encode(expires_at),
+ },
+ "actor",
+ ),
+ )
The resulting cookie will encode data that looks something like this:
diff --git a/docs/spatialite.rst b/docs/spatialite.rst
index d1b300b..556bad8 100644
--- a/docs/spatialite.rst
+++ b/docs/spatialite.rst
@@ -58,19 +58,22 @@ Here's a recipe for taking a table with existing latitude and longitude columns,
.. code-block:: python
import sqlite3
- conn = sqlite3.connect('museums.db')
+
+ conn = sqlite3.connect("museums.db")
# Load the spatialite extension:
conn.enable_load_extension(True)
- conn.load_extension('/usr/local/lib/mod_spatialite.dylib')
+ conn.load_extension("/usr/local/lib/mod_spatialite.dylib")
# Initialize spatial metadata for this database:
- conn.execute('select InitSpatialMetadata(1)')
+ conn.execute("select InitSpatialMetadata(1)")
# Add a geometry column called point_geom to our museums table:
conn.execute("SELECT AddGeometryColumn('museums', 'point_geom', 4326, 'POINT', 2);")
# Now update that geometry column with the lat/lon points
- conn.execute('''
+ conn.execute(
+ """
UPDATE museums SET
point_geom = GeomFromText('POINT('||"longitude"||' '||"latitude"||')',4326);
- ''')
+ """
+ )
# Now add a spatial index to that column
conn.execute('select CreateSpatialIndex("museums", "point_geom");')
# If you don't commit your changes will not be persisted:
@@ -186,13 +189,14 @@ Here's Python code to create a SQLite database, enable SpatiaLite, create a plac
.. code-block:: python
import sqlite3
- conn = sqlite3.connect('places.db')
+
+ conn = sqlite3.connect("places.db")
# Enable SpatiaLite extension
conn.enable_load_extension(True)
- conn.load_extension('/usr/local/lib/mod_spatialite.dylib')
+ conn.load_extension("/usr/local/lib/mod_spatialite.dylib")
# Create the basic countries table
- conn.execute('select InitSpatialMetadata(1)')
- conn.execute('create table places (id integer primary key, name text);')
+ conn.execute("select InitSpatialMetadata(1)")
+ conn.execute("create table places (id integer primary key, name text);")
# Add a MULTIPOLYGON Geometry column
conn.execute("SELECT AddGeometryColumn('places', 'geom', 4326, 'MULTIPOLYGON', 2);")
# Add a spatial index against the new column
@@ -201,13 +205,17 @@ Here's Python code to create a SQLite database, enable SpatiaLite, create a plac
from shapely.geometry.multipolygon import MultiPolygon
from shapely.geometry import shape
import requests
- geojson = requests.get('https://data.whosonfirst.org/404/227/475/404227475.geojson').json()
+
+ geojson = requests.get(
+ "https://data.whosonfirst.org/404/227/475/404227475.geojson"
+ ).json()
# Convert to "Well Known Text" format
- wkt = shape(geojson['geometry']).wkt
+ wkt = shape(geojson["geometry"]).wkt
# Insert and commit the record
- conn.execute("INSERT INTO places (id, name, geom) VALUES(null, ?, GeomFromText(?, 4326))", (
- "Wales", wkt
- ))
+ conn.execute(
+ "INSERT INTO places (id, name, geom) VALUES(null, ?, GeomFromText(?, 4326))",
+ ("Wales", wkt),
+ )
conn.commit()
Querying polygons using within()
diff --git a/docs/writing_plugins.rst b/docs/writing_plugins.rst
index bd60a4b..5af01f6 100644
--- a/docs/writing_plugins.rst
+++ b/docs/writing_plugins.rst
@@ -18,9 +18,10 @@ The quickest way to start writing a plugin is to create a ``my_plugin.py`` file
from datasette import hookimpl
+
@hookimpl
def prepare_connection(conn):
- conn.create_function('hello_world', 0, lambda: 'Hello world!')
+ conn.create_function("hello_world", 0, lambda: "Hello world!")
If you save this in ``plugins/my_plugin.py`` you can then start Datasette like this::
@@ -60,22 +61,18 @@ The example consists of two files: a ``setup.py`` file that defines the plugin:
from setuptools import setup
- VERSION = '0.1'
+ VERSION = "0.1"
setup(
- name='datasette-plugin-demos',
- description='Examples of plugins for Datasette',
- author='Simon Willison',
- url='https://github.com/simonw/datasette-plugin-demos',
- license='Apache License, Version 2.0',
+ name="datasette-plugin-demos",
+ description="Examples of plugins for Datasette",
+ author="Simon Willison",
+ url="https://github.com/simonw/datasette-plugin-demos",
+ license="Apache License, Version 2.0",
version=VERSION,
- py_modules=['datasette_plugin_demos'],
- entry_points={
- 'datasette': [
- 'plugin_demos = datasette_plugin_demos'
- ]
- },
- install_requires=['datasette']
+ py_modules=["datasette_plugin_demos"],
+ entry_points={"datasette": ["plugin_demos = datasette_plugin_demos"]},
+ install_requires=["datasette"],
)
And a Python module file, ``datasette_plugin_demos.py``, that implements the plugin:
@@ -88,12 +85,12 @@ And a Python module file, ``datasette_plugin_demos.py``, that implements the plu
@hookimpl
def prepare_jinja2_environment(env):
- env.filters['uppercase'] = lambda u: u.upper()
+ env.filters["uppercase"] = lambda u: u.upper()
@hookimpl
def prepare_connection(conn):
- conn.create_function('random_integer', 2, random.randint)
+ conn.create_function("random_integer", 2, random.randint)
Having built a plugin in this way you can turn it into an installable package using the following command::
@@ -123,11 +120,13 @@ To bundle the static assets for a plugin in the package that you publish to PyPI
.. code-block:: python
- package_data={
- 'datasette_plugin_name': [
- 'static/plugin.js',
- ],
- },
+ package_data = (
+ {
+ "datasette_plugin_name": [
+ "static/plugin.js",
+ ],
+ },
+ )
Where ``datasette_plugin_name`` is the name of the plugin package (note that it uses underscores, not hyphens) and ``static/plugin.js`` is the path within that package to the static file.
@@ -152,11 +151,13 @@ Templates should be bundled for distribution using the same ``package_data`` mec
.. code-block:: python
- package_data={
- 'datasette_plugin_name': [
- 'templates/my_template.html',
- ],
- },
+ package_data = (
+ {
+ "datasette_plugin_name": [
+ "templates/my_template.html",
+ ],
+ },
+ )
You can also use wildcards here such as ``templates/*.html``. See `datasette-edit-schema <https://github.com/simonw/datasette-edit-schema>`__ for an example of this pattern.
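The ``prepare_connection`` example in the ``writing_plugins.rst`` diff above can be exercised with plain ``sqlite3``, independent of Datasette. A quick sketch (the in-memory connection and query here are illustrative, not from the docs):

```python
import sqlite3

# Register the same zero-argument SQL function the plugin example creates,
# then call it from a query to see what a Datasette user would get:
conn = sqlite3.connect(":memory:")
conn.create_function("hello_world", 0, lambda: "Hello world!")

result = conn.execute("select hello_world()").fetchone()[0]
print(result)  # Hello world!
```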
On the one hand, I'm not crazy about some of the indentation decisions Black made here - in particular this one, which I had indented deliberately for readability:

diff --git a/docs/authentication.rst b/docs/authentication.rst
index 0d98cf8..8008023 100644
--- a/docs/authentication.rst
+++ b/docs/authentication.rst
@@ -381,11 +381,7 @@ Authentication plugins can set signed ``ds_actor`` cookies themselves like so:
.. code-block:: python
response = Response.redirect("/")
- response.set_cookie("ds_actor", datasette.sign({
- "a": {
- "id": "cleopaws"
- }
- }, "actor"))
+ response.set_cookie("ds_actor", datasette.sign({"a": {"id": "cleopaws"}}, "actor"))

But... consistency is a virtue. Maybe I'm OK with just this one disagreement?

Also: I've been mentally trying to keep the line lengths a bit shorter to help them be more readable on mobile devices. I'll try a different line length using `-l 60`.

I like this more - here's the result for that example:

diff --git a/docs/authentication.rst b/docs/authentication.rst
index 0d98cf8..2496073 100644
--- a/docs/authentication.rst
+++ b/docs/authentication.rst
@@ -381,11 +381,10 @@ Authentication plugins can set signed ``ds_actor`` cookies themselves like so:
.. code-block:: python
response = Response.redirect("/")
- response.set_cookie("ds_actor", datasette.sign({
- "a": {
- "id": "cleopaws"
- }
- }, "actor"))
+ response.set_cookie(
+ "ds_actor",
+ datasette.sign({"a": {"id": "cleopaws"}}, "actor"),
+ ) |
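The expiry variant shown earlier in the thread encodes a timestamp with `baseconv.base62`. Here's a minimal stand-in for that encoder to show what the `"e"` key carries (illustrative only - the digit alphabet may not match Django's `baseconv` ordering):

```python
import time

ALPHABET = "0123456789abcdefghijklmnopqrstuvwxyzABCDEFGHIJKLMNOPQRSTUVWXYZ"


def base62_encode(number: int) -> str:
    """Encode a non-negative integer using 62 URL-safe characters."""
    if number == 0:
        return "0"
    digits = []
    while number:
        number, remainder = divmod(number, 62)
        digits.append(ALPHABET[remainder])
    return "".join(reversed(digits))


# A cookie payload shaped like the docs example, expiring 24 hours from now:
expires_at = int(time.time()) + 24 * 60 * 60
cookie_value = {"a": {"id": "cleopaws"}, "e": base62_encode(expires_at)}
```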
There's no `--check` mode for blacken-docs.
In the absence of such an option I can use `git diff-index` instead:

% git diff-index --quiet HEAD --
% echo $?
0
% blacken-docs -l 60 docs/*.rst
docs/authentication.rst: Rewriting...
...
% git diff-index --quiet HEAD --
% echo $?
1
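That `git diff-index` trick boils down to "run the formatter, then fail if anything on disk changed". The same logic, sketched in plain Python without git (function names are hypothetical, not part of any tool mentioned here):

```python
import hashlib
from pathlib import Path


def fingerprint(paths):
    """Hash every file's contents so we can tell if a formatter rewrote any of them."""
    digest = hashlib.sha256()
    for path in sorted(paths):
        digest.update(Path(path).read_bytes())
    return digest.hexdigest()


def changed_by(formatter, paths):
    """Run formatter(paths) and report whether it modified any file on disk."""
    before = fingerprint(paths)
    formatter(paths)
    return fingerprint(paths) != before
```

In CI you would exit non-zero when `changed_by(...)` returns `True` - equivalent to the 0/1 exit codes from `git diff-index` shown above.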
I'm going to push the first commit with deliberately broken formatting to check that the tests fail.
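A CI step matching that plan might look like this (a hypothetical GitHub Actions fragment - the step name, install method, and paths are illustrative, not the repo's actual workflow):

```yaml
- name: Check docs code examples are Black-formatted
  run: |
    pip install blacken-docs
    blacken-docs -l 60 docs/*.rst || true
    # Fail the build if blacken-docs rewrote anything:
    git diff-index --quiet HEAD --
```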
Uses blacken-docs. This has a deliberate error which I hope will fail CI.
The tests failed there because of what I thought were warnings but turned out to be treated as errors:
Looking at that first error, it appears to be a place where I had deliberately omitted the body of the function (lines 196 to 211 in 3657363).
I can use `...` as the function body there. Fixing those warnings actually helped me spot a couple of bugs, so I'm glad this happened.
OK, I'm expecting this one to fail at the `git diff-index` step.
One more attempt at testing the failure case.
Turns out I didn't need that. Submitted a documentation PR to that project instead:
Wrote up what I learned in a TIL: https://til.simonwillison.net/sphinx/blacken-docs
For example on this page: https://docs.datasette.io/en/stable/writing_plugins.html#packaging-a-plugin
I wonder if there's an easy way for me to enforce this for Sphinx documentation?